Pavan K

Pavan K Email and Phone Number

Data Engineer @ Centene Corporation
Saint Louis, Missouri, United States
Pavan K's Location
Revere, Massachusetts, United States
About Pavan K

10+ years of IT experience as a Data Engineer, with expertise in cloud platforms such as AWS, Azure, and GCP for designing, analyzing, and developing ETL data pipelines.

  • Experienced in developing both batch and real-time streaming data pipelines using cloud services, Python, and PySpark scripts.
  • Experienced in configuring and administering Hadoop clusters using major Hadoop distributions such as Apache Hadoop and Cloudera.
  • Hands-on expertise importing and exporting data between relational databases and HDFS, Hive, and HBase using Sqoop.
  • Proficient in transporting and processing real-time event streams using Kafka and Spark Streaming.
  • Excellent at high-level design of ETL and SSIS packages that integrate data over OLE DB connections from heterogeneous sources (Oracle, Excel, CSV, flat file, and text-format data) using SSIS transformations such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Derived Column.
  • Expertise in designing and implementing data solutions using Azure Data Factory.
  • Expertise in data masking, data subsetting, synthetic test data generation, and data archiving using the Informatica TDM/ILM suite.
  • Expertise in using Databricks for data engineering, big data processing, and machine learning.
  • Experience with Hadoop ecosystem components: HDFS (storage), Spark and MapReduce (processing), Hive, Pig, Sqoop, YARN, and AWS.
  • Experience moving data between GCP and Azure using Azure Data Factory.
  • Experience with HBase, Cassandra, and MongoDB NoSQL databases, and in creating Sqoop scripts to transfer data from Teradata and Oracle into big data environments.
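The batch ETL pattern described above can be sketched in plain Python. This is an illustrative sketch only: the actual pipelines would use PySpark and cloud services, and all field names here (`patient_id`, `name`) are hypothetical.

```python
# Hypothetical batch ETL step: extract raw records, transform them,
# and load the results into a sink. In production this logic would
# typically run in PySpark on a cloud service; plain Python is used
# here only to illustrate the extract-transform-load pattern.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize field names, trim whitespace, cast types."""
    return {
        "patient_id": int(record["id"]),
        "name": record["name"].strip().title(),
    }

def load(records, sink):
    """Load: append transformed records to a sink (here, a list)."""
    for rec in records:
        sink.append(rec)
    return sink

def run_pipeline(source):
    sink = []
    return load((transform(r) for r in extract(source)), sink)

raw = [{"id": "1", "name": "  alice smith "}, {"id": "2", "name": "BOB JONES"}]
result = run_pipeline(raw)
print(result[0])  # normalized record for the first row
```

The generator-based transform keeps memory usage flat regardless of input size, which is the same property a distributed engine like Spark provides at scale.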

Pavan K's Current Company Details
Centene Corporation
Data Engineer
Saint Louis, Missouri, United States
Website:
centene.com
Employees:
17105
Pavan K Work Experience Details
  • Centene Corporation
    Sr. Data Engineer
    Centene Corporation Sep 2021 - Present
    Tempe, Arizona, United States
    Designed and implemented scalable data pipelines using Apache Spark and AWS Glue, resulting in an improvement in data processing times. Collaborated with data scientists and healthcare analysts to develop ETL processes that ensured the accurate integration of patient data from various sources. Developed and maintained data warehouses on Amazon Redshift, facilitating efficient storage and retrieval of large datasets. Implemented data quality checks and validation procedures, leading to a 25% reduction in data errors. Ensured compliance with HIPAA regulations by securing sensitive healthcare data through encryption and access controls.
  • Metlife
    Sr. Data Engineer
    Metlife Nov 2020 - Aug 2021
    Los Angeles, California, United States
    Utilized Terraform and CloudFormation to configure EC2 instances with high availability, and developed additional plugins for Terraform to enable support for newly required features. Responsible for designing and implementing data warehousing solutions using Snowflake on AWS to improve data processing speed. Worked closely with business analysts and other stakeholders to understand their requirements and translate them into Snowflake data models and ETL pipelines. Developed and maintained robust data and machine learning (ML) pipelines, leveraging industry best practices and tools to ensure reliable data processing and analysis.
  • United Airlines
    Sr. Data Engineer
    United Airlines Jul 2018 - Oct 2020
    New Jersey, United States
    Installed the Oozie workflow engine to run a number of distinct Hive and Pig jobs based on time and data availability. Imported data into HDFS for the Big Data Lake, using Storm for streaming systems and Sqoop for SQL databases and files. Used AWS Lambda, AWS's serverless compute service, to run functions in the cloud. Delivered static and dynamic files quickly around the world using a Content Delivery Network (CDN). Involved in the Teradata to Snowflake object migration. Knowledge of BigQuery, Cloud Functions, and GCP Dataproc. Used HDFS, MapReduce, Kafka, Spark, HBase, Hive, and Hive UDFs to analyze massive and important datasets, and used the Scala Kafka Consumer API to get data from Kafka topics.
  • Wells Fargo
    Data Engineer
    Wells Fargo Sep 2016 - Jun 2018
    New York, United States
    Worked on Amazon Redshift to consolidate all data warehouses into one, and developed a good understanding of Cassandra architecture, replication strategy, gossip, and snitches. Responsible for managing and processing large volumes of data on the AWS cloud platform. Designed and developed Spark applications using Python to handle data from various RDBMS and streaming sources like Kafka.
  • HCSC Inc
    Data Engineer
    HCSC Inc Oct 2014 - Jun 2016
    Bengaluru, Karnataka, India
    Used Selenium RC for testing across different browsers, Selenium Grid for running tests on multiple server instances, and Selenium IDE for record and playback. Developed Java test scripts in Selenium for the Page Object framework. Good experience with Selenium IDE and creating scripts in Selenium RC using Java. Experience in testing EDI transactions for HIPAA compliance.
  • Indian Eagle Pvt Ltd
    Data Engineer
    Indian Eagle Pvt Ltd Jul 2013 - Sep 2014
    Hyderabad, Telangana, India
    Performed extensive data mining and exploration on virtual and physical server performance log data, then applied anomaly detection algorithms using machine learning tools like R, aimed at proactively identifying performance issues and reducing downtime. Involved in administering the data warehouse using the Warehouse Administrator functionality of SAS. Designed and implemented real-time data integration solutions using Azure Data Factory and Azure Event Hubs. Designed and implemented time series forecasting models for estimating server provisioning in the cloud environment, which resulted in accurate demand forecasting for cost optimization and budgeting.
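The anomaly detection applied to server performance logs in the Indian Eagle role (done there with R) can be sketched with a simple z-score rule in Python. The threshold and the sample metric below are illustrative assumptions, not the production method:

```python
import statistics

def detect_anomalies(values, threshold=3.0):
    """Return the indices of points whose z-score exceeds the threshold.

    A simple stand-in for anomaly detection on server performance
    metrics; real pipelines would use more robust methods (e.g.
    rolling windows or model-based detectors).
    """
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical CPU-load samples with one obvious spike at index 5.
cpu_load = [0.31, 0.29, 0.33, 0.30, 0.32, 0.95, 0.28]
print(detect_anomalies(cpu_load, threshold=2.0))  # → [5]
```

Flagged indices would then feed an alerting step, which is the "proactively identifying performance issues" part of the bullet above.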

Pavan K Education Details

Jntuh College Of Engineering Hyderabad

Frequently Asked Questions about Pavan K

What company does Pavan K work for?

Pavan K works for Centene Corporation

What is Pavan K's role at the current company?

Pavan K's current role is Data Engineer.

What schools did Pavan K attend?

Pavan K attended Jntuh College Of Engineering Hyderabad.

Who are Pavan K's colleagues?

Pavan K's colleagues are Kenneth K., Andy Kavalos, MBA, SHRM-CP, Kenneth Fasola, Michele Kenworthy, Frank Barnes, Anthony Monteforte, Robin Cole.

