Surya B

Surya B Email and Phone Number

Rowlett, TX, US
Surya B's Location
Rowlett, Texas, United States
About Surya B

  • 13+ years of IT industry experience in analyzing, designing, testing, and maintaining database, ETL, Big Data, cloud, and data warehouse applications.
  • Experience designing, developing, executing, and maintaining data extraction, transformation, and loading for multiple corporate Operational Data Store, data warehouse, and data mart systems.
  • Extensive experience in Python programming using the NumPy and pandas libraries.
  • Strong experience in business and data analysis, data profiling, data migration, data integration, data governance and metadata management, master data management, and configuration management.
  • Strong in ETL, data warehousing, Operational Data Store concepts, data marts, and OLAP technologies.
  • Experience with AWS services including serverless Glue, Lambda, DMS, CloudWatch, S3, RDS PostgreSQL, IAM, EC2, Step Functions, EBS, and VPC, as well as administering AWS resources using the Console and CLI.
  • Hands-on experience building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using NoSQL and SQL on AWS.
  • Experience in data modeling, with expertise in creating Data Vault 2.0 models through Hubs and Satellites using the Erwin tool.
  • Experience building and optimizing big data pipelines, architectures, and data sets (S3, Redshift, Snowflake).
  • Strong understanding of AWS (Amazon Web Services) processes and concepts, including S3, Amazon RDS, and Apache Spark RDDs; develops logical data architecture with adherence to enterprise architecture.
  • Involved in all phases of the software development life cycle under Agile, Scrum, and Waterfall management processes.
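The "standardizing disparate sources into a unified format" point above can be sketched with the Python standard library. This is a minimal illustration, not the actual pipeline: the field names and sample records are hypothetical, and real pipelines would also cover XLSX and Parquet inputs.

```python
import csv
import io
import json

def load_csv(text):
    """Parse CSV text into a list of dicts, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

def load_json(text):
    """Parse a JSON array of objects into a list of dicts."""
    return json.loads(text)

def unify(records):
    """Normalize records from any source: lowercase keys, stripped string values."""
    return [{k.strip().lower(): str(v).strip() for k, v in rec.items()}
            for rec in records]

# Two sources, same logical data, inconsistent shape and whitespace
csv_rows = load_csv("Customer_ID,Amount\n 001 , 10.5\n002,20\n")
json_rows = load_json('[{"customer_id": "003", "amount": "30.25"}]')
unified = unify(csv_rows) + unify(json_rows)  # one consistent record shape
```

Once every source lands in the same shape, downstream SQL and analytics only have to deal with one schema.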

Surya B's Current Company Details
Pacific Gas and Electric Company (PG&E)

Data Architect
Rowlett, TX, US
Surya B Work Experience Details
  • Pacific Gas and Electric Company (PG&E)
    Data Architect
    Pacific Gas and Electric Company (PG&E)
    Rowlett, TX, US
  • Pacific Gas and Electric Company (PG&E)
    AWS Lead Data Engineer
    Pacific Gas and Electric Company (PG&E) Jan 2022 - Apr 2024
    Texas, United States
    • Technical lead for end-to-end data strategy, including data architecture, data engineering, data quality, and governance.
    • Developed data integration platform components and processes using the Informatica cloud platform.
    • Designed, developed, and implemented ETL processes using IICS data integration.
    • Designed and built a multi-terabyte, end-to-end data lake from the ground up on Snowflake, handling millions of records (approximately 350+ GB of data) every day.
    • Developed and executed Python-based stored procedures in Snowflake for ingesting Excel data from AWS S3 into Snowflake.
    • Performed loads into a Snowflake instance using the Snowflake connector in IICS for a separate project to support data analytics.
    • Ingested data from Palantir Foundry data systems into Snowflake through AWS S3.
    • Migrated data from an on-premises server (SQL Server) to the Snowflake data warehouse.
    • Built ETL data pipelines in Python/PySpark and built Python DAGs in Apache Airflow.
    • Worked extensively with Python and PySpark to create delta datasets from snapshot datasets, and created summary files as part of a metadata store for Lambda triggers.
    • Worked on Step Functions for ETL pipeline orchestration.
    • Used Spark SQL to load JSON data, create schema RDDs, and load them into Hive tables, and handled structured data using Spark SQL.
    • Participated in business requirements and design calls with the client and worked with the architecture team to shape the design of ELT task lists.
    • Standardized data into a unified format for efficient querying and analysis, since data from disparate sources was largely inconsistent: different file formats (XLSX, JSON, CSV, Parquet) and different data formats (structured and semi-structured).
    • Maintained and developed complex SQL queries, views, functions, and stored procedures that met customer requirements on Snowflake.
    • Worked on building EC2 instances, creating IAM user groups, and defining policies.
  • Signet Jewelers
    Data Engineer
    Signet Jewelers Jan 2021 - Dec 2021
    • Worked on the CDP (Customer Data Platform) application project, covering application development and maintenance; responsible for all data development, data analysis, data reporting, data quality, and data maintenance efforts of the application.
    • Responsible for cloud architectural design solutions on AWS.
    • Created IICS connections using various cloud connectors in the IICS administrator.
    • Built end-to-end ETL pipelines from AWS S3 (Landing, Curated, Presentation) to the Redshift data warehouse for analytical queries, specifically for cloud data.
    • Used Redshift and S3 within AWS, along with Informatica Cloud Services, to load data into S3 buckets.
    • Developed Lambda functions in Python and PySpark to connect to multiple source systems as part of ETL pipeline builds.
    • Managed metadata alongside the data for visibility into where data came from and its lineage, so that data could be found quickly and efficiently for customers using AWS Data Lake services such as AWS Lambda and AWS Glue.
    • Worked on ETL data pipelines to migrate from Hadoop/Oracle to Redshift, using AWS DMS replication instances between databases for historical and delta changes.
    • Documented all changes implemented across systems and components using Confluence and Atlassian Jira, covering technical changes, infrastructure changes, and business process changes; post-release documentation also included known issues from production implementation and deferred defects.
    • Supported the DevOps team in building CI/CD pipelines to deploy code to different environments.
    • Performed regular code reviews of complex modules to ensure optimal and secure coding practices were followed.
    • Worked in a production environment building CI/CD pipelines in Jenkins, with stages running from code checkout from GitHub through deploying code to a specific environment.
  • Goldman Sachs
    Data Engineer
    Goldman Sachs Sep 2015 - Dec 2020
    Bengaluru, Karnataka, India
    • Developed pipelines using Hive (HQL) to retrieve data from the Hadoop cluster and SQL to retrieve data from the Oracle database, and used ETL for data transformation.
    • Analyzed and gathered business requirements from clients, conceptualized solutions with technical architects, verified the approach with appropriate stakeholders, and developed end-to-end scenarios for building the application.
    • Derived data from relational databases to perform complex data manipulations and conducted extensive data checks to ensure data quality; performed data wrangling to clean, transform, and reshape the data using the NumPy and pandas libraries.
    • Worked with datasets of varying size and complexity, both structured and unstructured, and participated in all phases of data mining: data cleaning, data collection, variable selection, feature engineering, model development, validation, and visualization; performed gap analysis.
    • Optimized many SQL statements and PL/SQL blocks by analyzing execution plans, and created and modified triggers, SQL queries, and stored procedures for performance improvement.
    • Developed data integration pipelines across various layers and developed code components to process and transform the data.
    • Created a linked service to land data from an SFTP location to S3.
    • Used Sqoop to move data from the Oracle database into Hive by creating delimiter-separated files, using them at an external location as an external Hive table, and then moving the data into refined tables in Parquet format using Hive queries.
    • Managed the workload and utilization of the team; coordinated resources and processes to achieve project implementation plans.
  • Weatherford
    Technology Lead
    Weatherford May 2011 - Aug 2015
    Hyderabad, Telangana, India
    • Worked on LOWIS, a web-based suite of software applications developed by Weatherford, a leading oilfield services company, and used by oil and gas producers to enhance the efficiency and effectiveness of well optimization processes.
    • Involved in requirements gathering and analysis, and in translating business requirements into technical specifications.
    • Developed application software using the programming languages C++, VC++, JavaScript, and HTML, and frameworks such as Microsoft Visual Studio.
    • Played a vital role in the design, implementation, testing, and support of the LOWIS application software product.
    • Wrote and executed stored procedures using the database query languages SQL and PL/SQL.
    • Provided technical assistance to the team and handled team management.
    • Used software versioning tools such as Perforce, SVN, and Microsoft Team Foundation Server extensively for code check-ins.
    • Served as Agile Scrum Master at offshore.
    • Used Microsoft Excel to present project status and progress during periodic status meetings with stakeholders.
    • Analyzed sprint work items in the Agile methodology and distributed tasks to the team.
    • Involved in managing project management software systems, and in code reviews, code integrations, and integration testing.
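The delta-from-snapshot step described in the PG&E role above (comparing today's snapshot against yesterday's and emitting only new or changed rows) can be sketched in plain Python. The keys and row shapes here are illustrative, not an actual schema; in PySpark the same idea is usually expressed as a join or subtract between snapshot DataFrames.

```python
def compute_delta(prev_snapshot, curr_snapshot):
    """Return rows in curr_snapshot that are new or changed versus prev_snapshot.

    Both snapshots are dicts keyed by a business key, with row dicts as values.
    """
    delta = {}
    for key, row in curr_snapshot.items():
        if prev_snapshot.get(key) != row:  # new key, or same key with changed values
            delta[key] = row
    return delta

# Hypothetical meter readings: m1 unchanged, m2 changed, m3 newly added
prev = {"m1": {"reading": 100}, "m2": {"reading": 205}}
curr = {"m1": {"reading": 100}, "m2": {"reading": 210}, "m3": {"reading": 55}}
delta = compute_delta(prev, curr)
```

Loading only the delta keeps daily volumes far below the full multi-hundred-GB snapshot size.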
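The S3-triggered Lambda pattern mentioned in the Signet Jewelers role above can be sketched as a minimal Python handler that parses the standard S3 event-notification payload. The bucket and key names are made up, and the downstream load step is omitted.

```python
def lambda_handler(event, context):
    """Extract (bucket, key) pairs from a standard S3 event notification."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append((s3["bucket"]["name"], s3["object"]["key"]))
    # A real handler would hand these off to a Glue job or Redshift load; omitted here.
    return {"objects": objects}

# Simulated S3 put-event payload (shape follows the standard S3 notification format)
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "cdp-landing"},
                "object": {"key": "orders/2021/01/file.csv"}}}
    ]
}
result = lambda_handler(sample_event, None)
```

Testing the handler against a simulated payload like this, with no AWS calls, is a common way to unit-test Lambda code locally.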
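The pandas-based clean/transform/reshape wrangling mentioned in the Goldman Sachs role above can be sketched as follows. The account data is hypothetical; the three steps (type coercion, group-wise imputation, pivot) are the general pattern, not the actual production logic.

```python
import pandas as pd

# Hypothetical account balances: string-typed numbers and one missing value
df = pd.DataFrame({
    "account": ["A", "A", "B", "B"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "balance": ["100", "110", "200", None],
})

# Clean: coerce string balances to floats (None becomes NaN)
df["balance"] = pd.to_numeric(df["balance"])

# Transform: fill each account's gaps with that account's mean balance
df["balance"] = df.groupby("account")["balance"].transform(
    lambda s: s.fillna(s.mean())
)

# Reshape: pivot to one row per account, one column per month
wide = df.pivot(index="account", columns="month", values="balance")
```

The same wrangling could be done row by row in plain Python, but vectorized pandas operations keep it short and fast on larger datasets.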

Surya B Education Details

Frequently Asked Questions about Surya B

What company does Surya B work for?

Surya B works for Pacific Gas and Electric Company (PG&E).

What is Surya B's role at the current company?

Surya B's current role is Data Architect.

What schools did Surya B attend?

Surya B attended Birla Institute of Technology and Science, Pilani.

