Pavan Kumar


AWS Data Engineer | GCP Data Engineer | Snowflake Engineer | Azure Cloud | SQL | DWH | Informatica PowerCenter | Teradata | Apache Spark | PySpark | Azure Data Factory | Azure Databricks | Snowflake @ Wells Fargo
Pavan Kumar's Location
United States
About Pavan Kumar

Pavan Kumar is an AWS Data Engineer | GCP Data Engineer | Snowflake Engineer | Azure Cloud | SQL | DWH | Informatica PowerCenter | Teradata | Apache Spark | PySpark | Azure Data Factory | Azure Databricks | Snowflake at Wells Fargo.

Pavan Kumar's Current Company Details
Wells Fargo
AWS Data Engineer | GCP Data Engineer | Snowflake Engineer | Azure Cloud | SQL | DWH | Informatica PowerCenter | Teradata | Apache Spark | PySpark | Azure Data Factory | Azure Databricks | Snowflake
United States
Website:
wellsfargo.com
Employees:
205,138
Pavan Kumar Work Experience Details
  • Wells Fargo
    AWS Data Engineer
    Wells Fargo Feb 2022 - Present
    San Francisco, California, US
    • Developed robust data ingestion pipelines using Azure Data Factory and Python to extract patient data from various sources such as Electronic Health Records (EHR) systems, medical devices, IoT sensors, and NoSQL databases into Azure Data Lake Storage.
    • Developed and maintained custom connectors in Azure Data Factory to integrate with various third-party APIs, NoSQL databases, and data sources, extending the ETL capabilities and enabling seamless data ingestion.
    • Integrated SQL Auto Loader in data ingestion processes to streamline and automate the detection and loading of new files, enhancing the efficiency of data pipeline workflows and reducing manual intervention.
    • Integrated Azure Logic Apps into data workflows to automate and orchestrate data processing tasks across Azure services, improving workflow efficiency and reducing manual intervention.
    • Integrated Azure Data Factory with Azure Key Vault to securely manage and access sensitive configuration settings and credentials, enhancing the security of ETL processes.
    • Conducted performance tuning and optimization of SQL queries, indexes, and database configurations to enhance the efficiency and speed of database operations.
    • Implemented data processing workflows utilizing Azure Databricks and Python to clean, transform, and enrich raw patient data, ensuring data quality and consistency for downstream analytics and reporting.
    • Managed and maintained the Unity Catalog to ensure secure and organized data governance.
    • Implemented role-based access control (RBAC) for data access management using Unity Catalog.
    • Developed scalable data processing workflows in Azure Databricks to handle petabyte-scale datasets, ensuring efficient resource utilization and minimal processing time.
    Environment: Azure Data Lake Storage Gen 2, Azure Data Factory, Azure Databricks, Azure Logic Apps, Azure Function Apps, Ataccama, Apache Spark, Python, Scala, SQL, Oracle, Hive, Jira, Power BI.
  • Express Scripts Pharmaceuticals
    AWS Data Engineer
    Express Scripts Pharmaceuticals Oct 2020 - Jan 2022
    • Developed and optimized SQL queries and stored procedures within Azure Data Factory and SSIS to extract, transform, and load data, ensuring efficient processing and adherence to data warehousing best practices.
    • Created T-SQL scripts for automating routine database maintenance tasks such as index rebuilding, statistics updates, and backup operations, ensuring database health and performance.
    • Utilized T-SQL's advanced analytical functions such as window functions, common table expressions (CTEs), and pivoting to perform complex data analyses and generate insightful reports.
    • Implemented Azure Logic Apps to automate workflows and orchestrate data integration tasks between disparate systems, enhancing operational efficiency and reducing manual effort.
    • Performed performance tuning of the Snowflake data warehouse using Query Profiler, caching mechanisms, and virtual warehouse scaling to optimize query execution and enhance overall system performance.
    • Developed data quality dashboards and reports to provide insights into data quality trends and issues, supporting data governance and stewardship efforts.
    • Implemented data governance frameworks and policies to ensure data quality and compliance, collaborating with stakeholders to establish data standards and best practices.
    • Developed and deployed Python-based Azure Functions for data preprocessing, enrichment, and validation tasks within the data pipelines, enhancing data quality and integrity.
    Environment: Azure Databricks, Data Factory, Logic Apps, Function Apps, Snowflake, MS SQL, Oracle, HDFS, Kafka, Spark, Hive, SQL, Python, Scala, PySpark, shell scripting, Git, Jira, Power BI
  • State Of Arizona
    Big Data Engineer
    State Of Arizona May 2017 - Sep 2020
    Phoenix, AZ, US
    • Prepared an ETL framework using Sqoop, Pig, and Hive to bring in data from various sources and make it available for consumption.
    • Processed HDFS data and created external tables using Hive, and developed scripts for table ingestion and repair for reuse across the project.
    • Developed and maintained integration solutions to import and export data between SQL Server and external systems using APIs, Linked Servers, and other connectivity options.
    • Created, optimized, and maintained complex stored procedures and user-defined functions to encapsulate business logic and improve application performance.
    • Established comprehensive audit and logging mechanisms to track data changes, user activities, and system performance for security and compliance purposes.
    • Developed ETL jobs using Spark and Scala to migrate data from Oracle to new MySQL tables.
    • Analyzed source data, handled data type modifications efficiently, and used Excel sheets, flat files, and CSV files to generate Power BI ad-hoc reports.
    • Optimized SQL queries extensively for improved performance.
    Environment: Hadoop, Hive, Spark, PySpark, Sqoop, Spark SQL, shell scripting, Cassandra, YAML, ETL.
  • Global Payments Inc.
    Big Data Engineer
    Global Payments Inc. Apr 2016 - Mar 2017
    Atlanta, Georgia, US
    • Developed complex stored procedures, efficient triggers, and required functions, and created indexes and indexed views for performance.
    • Monitored and tuned SQL Server performance.
    • Designed ETL data flows using SSIS, creating mappings/workflows to extract data from SQL Server, and performed data migration and transformation from Access/Excel sheets using SSIS.
    • Performed dimensional data modeling for Data Mart design, identifying facts and dimensions and developing fact tables and dimension tables using Slowly Changing Dimensions (SCD).
    • Built cubes and dimensions with different architectures and data sources for business intelligence, and wrote MDX scripts.
    • Applied thorough knowledge of features, structure, attributes, hierarchies, and star and snowflake schemas of data marts.
    • Developed data marts for specific business units, enhancing data accessibility and analysis.
    • Created ad-hoc reports and reports with complex formulas for data marts using SSRS.
    • Developed parameterized, chart, graph, linked, dashboard, and scorecard reports on SSAS cubes using drill-down, drill-through, and cascading reports in SSRS.
    • Flexible, enthusiastic, and project-oriented team player with excellent written and verbal communication and leadership skills, developing creative solutions for challenging client needs.
    Environment: MS SQL Server 2016, Visual Studio 2017/2019, SSIS, SharePoint, MS Access, Team Foundation Server, Git
  • Costco Wholesale
    Data Warehouse Developer
    Costco Wholesale Feb 2013 - Mar 2016
    Seattle, WA, US
    • Used BULK SQL and BULK BINDING to minimize the execution time of PL/SQL code.
    • Created SQL*Loader control files for moving data from flat files (fixed record length) to staging-area tables.
    • Analyzed the existing stored procedures, functions, triggers, and cursors for performance issues.
    • Managed tables, indexes, constraints, views, sequences, and stored program units.
    • Used the Oracle Import/Export utilities to back up particular partition data.
    • Gathered statistics and analyzed tables and indexes for performance tuning.
    • Used DB links for accessing data across different databases.
    • Resolved various optimization problems to maintain a high level of customer satisfaction.
    • Created user privileges and managed user roles and grants.
    • Worked with DBMS_SCHEDULER to automate jobs.
    • Created stored procedures, functions, and triggers to maintain various business rules.
    • Worked extensively on optimizing queries to enhance performance.
    • Analyzed indexes on a regular basis.
    • Loaded data into the database using SQL*Loader.
    • Performed debugging of PL/SQL code using DBMS_OUTPUT.
    Environment: Oracle 10g (10.2), HP-UX, Oracle Enterprise Manager, VSS, SQL*Loader, SQL*Plus, SQL Developer 1.5.4, VB.NET, Toad Data Modeler.
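The record-level cleaning and validation described in the Wells Fargo role (Azure Databricks and Python transformations over raw patient data) can be sketched in plain Python. This is an illustrative example only; the field names (`patient_id`, `admitted`, `source`) are hypothetical and not taken from the actual pipelines.

```python
from datetime import datetime
from typing import Optional

def clean_patient_record(record: dict) -> Optional[dict]:
    """Normalize a raw patient record; return None if it fails validation.

    A minimal sketch of the kind of per-record logic a cleaning step
    might apply before loading data for downstream analytics.
    """
    # Reject records without a usable identifier
    patient_id = str(record.get("patient_id", "")).strip()
    if not patient_id:
        return None
    # Reject records whose admission date is missing or unparseable
    try:
        admitted = datetime.strptime(record["admitted"], "%Y-%m-%d").date()
    except (KeyError, ValueError):
        return None
    return {
        "patient_id": patient_id,
        "admitted": admitted.isoformat(),
        # Normalize free-text source-system labels to a consistent form
        "source": record.get("source", "unknown").strip().lower(),
    }
```

In a real pipeline this logic would typically run inside a Spark transformation rather than over single dicts; the point here is only the shape of the validation rules.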
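The window-function and CTE pattern mentioned in the Express Scripts role can be illustrated with SQLite via Python's standard library (SQLite supports window functions from version 3.25). The `claims` table and its columns are invented for the example; the query shape ("latest row per key" via `ROW_NUMBER()`) is the same pattern T-SQL would use.

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id TEXT, claim_date TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('M1', '2021-01-05', 120.0),
        ('M1', '2021-06-10', 340.0),
        ('M2', '2021-02-14',  75.5);
""")

# CTE + ROW_NUMBER() window function: latest claim per member
latest = conn.execute("""
    WITH ranked AS (
        SELECT member_id, claim_date, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY member_id ORDER BY claim_date DESC
               ) AS rn
        FROM claims
    )
    SELECT member_id, claim_date, amount FROM ranked WHERE rn = 1
    ORDER BY member_id
""").fetchall()
```

Here `latest` contains one row per member, the most recent claim for each.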
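The Slowly Changing Dimension (SCD) work in the Global Payments role follows a standard Type 2 pattern: when a tracked attribute changes, the current dimension row is closed and a new versioned row is opened. A minimal sketch, with an invented row layout (`key`, `attrs`, `start`, `end`) rather than the actual warehouse schema:

```python
def apply_scd2(dim_rows, key, new_attrs, as_of):
    """Apply an SCD Type 2 update to an in-memory dimension.

    dim_rows: list of dicts with 'key', 'attrs', 'start', 'end'
    (end is None for the current version of a key).
    """
    current = next(
        (r for r in dim_rows if r["key"] == key and r["end"] is None), None
    )
    # No attribute change: keep the current version open
    if current is not None and current["attrs"] == new_attrs:
        return dim_rows
    # Expire the old version as of the change date
    if current is not None:
        current["end"] = as_of
    # Open a new version carrying the changed attributes
    dim_rows.append({"key": key, "attrs": new_attrs, "start": as_of, "end": None})
    return dim_rows
```

In practice this logic runs as a MERGE against the dimension table; the in-memory version just shows the expire-and-insert rule.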
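The BULK BINDING technique in the Costco role (PL/SQL BULK COLLECT / FORALL) is about sending rows to the database in batches rather than one round trip per row. The same idea can be sketched in Python with `executemany` on a SQLite connection; the `staging` table and batch size are illustrative assumptions.

```python
import sqlite3

def bulk_load(conn, rows, batch_size=1000):
    """Insert (id, val) tuples in batches, analogous to FORALL bulk binds."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER, val TEXT)")
    for i in range(0, len(rows), batch_size):
        # executemany binds a whole batch in one call instead of row-by-row
        conn.executemany(
            "INSERT INTO staging (id, val) VALUES (?, ?)",
            rows[i : i + batch_size],
        )
    conn.commit()
```

The performance benefit in Oracle comes from reducing context switches between the PL/SQL and SQL engines; the batching structure is what this sketch preserves.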

Pavan Kumar Education Details

  • Hindustan University
    Computer Science

Frequently Asked Questions about Pavan Kumar

What company does Pavan Kumar work for?

Pavan Kumar works for Wells Fargo.

What is Pavan Kumar's role at the current company?

Pavan Kumar's current role is AWS Data Engineer | GCP Data Engineer | Snowflake Engineer | Azure Cloud | SQL | DWH | Informatica PowerCenter | Teradata | Apache Spark | PySpark | Azure Data Factory | Azure Databricks | Snowflake.

What schools did Pavan Kumar attend?

Pavan Kumar attended Hindustan University.

Who are Pavan Kumar's colleagues?

Pavan Kumar's colleagues are Bill Rose, Joanna Torres, John Louie Padilla, Angel Carlin, Alexander Chan (CAMS, CSM), Heidi Van Beek, and Eddy H. Rojas.
