Nikhil P

Nikhil P Email and Phone Number

Senior Data Engineer at AbbVie | Actively Seeking New Opportunities | Data Engineer | AWS EC2 | AWS Lambda | AWS Glue | Python | SQL | PostgreSQL | PL/SQL | Azure | Hadoop | PySpark | ETL | Shell Script | Tableau | Python @ AbbVie
North Chicago, Illinois, United States
Nikhil P's Location
Denton, Texas, United States
About Nikhil P

Over 8 years of real-time software development and analysis experience in Big Data and Data Engineering, with strong expertise in Spark (Scala), Hive, Kafka streaming, and PySpark. Good experience with data warehousing, Snowflake, and star schemas. Strong experience with AWS, Dataproc, Dataflow, and BigQuery, and with Python/Scala, including scripts, arrays, lists, and functions. Proficient at writing SQL queries with GROUP BY and at handling complex scenarios. Excited about new opportunities and looking forward to contributing to a new team.

Nikhil P's Current Company Details
AbbVie

North Chicago, Illinois, United States
Website:
abbvie.com
Employees:
46,234
Nikhil P Work Experience Details
  • AbbVie
    Senior Data Engineer
    AbbVie May 2022 - Present
    Vernon Hills, Illinois, United States
    • Designed and developed a security framework providing fine-grained access to objects in AWS S3 using AWS Lambda and DynamoDB.
    • Extracted data and loaded it into HDFS using Sqoop commands, scheduling Map/Reduce jobs on Hadoop.
    • Implemented Lambda to configure the DynamoDB Auto Scaling feature and implemented a data access layer for AWS DynamoDB data.
    • Loaded data into S3 buckets using AWS Glue and PySpark; filtered data stored in S3 buckets using Elasticsearch and loaded it into Hive external tables.
    • Migrated existing databases from on-premises to AWS Redshift using various AWS services.
    • Developed PySpark code for AWS Glue jobs and EMR.
    • Loaded data from source into raw tables from file formats such as CSV, JSON, flat files, and Parquet; implemented triggers to schedule pipelines.
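The fine-grained S3 access control described above could take many forms; the following is a minimal, hypothetical sketch of the core check, assuming a prefix-based grant model. In the real framework the grants would live in DynamoDB and the check would run inside an AWS Lambda authorizer; here the table is an in-memory dict so the logic is self-contained.

```python
# Hypothetical sketch of a prefix-based S3 access check.
# `grants` stands in for a DynamoDB table mapping each user to the
# S3 key prefixes that user is allowed to read.

def is_allowed(user: str, s3_key: str, grants: dict) -> bool:
    """Return True if `user` holds a grant whose prefix covers `s3_key`."""
    return any(s3_key.startswith(prefix) for prefix in grants.get(user, []))

grants = {"analyst": ["raw/sales/", "curated/"]}  # stand-in for DynamoDB items
print(is_allowed("analyst", "raw/sales/2022/05.parquet", grants))  # True
print(is_allowed("analyst", "raw/hr/salaries.csv", grants))        # False
```

A real deployment would fetch the grant list with a DynamoDB `get_item` call and return an IAM-style allow/deny decision from the Lambda handler.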
  • Homesite Insurance
    Data Engineer
    Homesite Insurance Nov 2019 - Apr 2022
    Boston, Massachusetts, United States
    • Validated target data in the data warehouse after it was transformed and loaded using Hadoop big data tooling.
    • Wrote AWS Lambda functions in Python that invoke scripts to perform various transformations and analytics on large data sets in EMR clusters.
    • Designed the ETL process from various sources into Hadoop/HDFS for analysis and further processing of data modules.
    • Worked with Azure Blob and Data Lake storage, loading data into Azure Synapse Analytics (SQL DW).
    • Used AWS to integrate EMR with Spark, S3 storage, and Snowflake.
    • Performed various Spark transformations and actions, saving result data back to HDFS and from there to the target Snowflake database.
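Target-data validation of the kind mentioned above usually reduces to reconciling a source result set against the warehouse. This is an illustrative sketch only, with in-memory rows standing in for query results pulled from the source system and from Snowflake; the function and field names are assumptions, not the actual project code.

```python
# Illustrative source-vs-target reconciliation for a warehouse load.
# Each argument is a list of row dicts, as a DB cursor might return.

def reconcile(source_rows, target_rows, key="id"):
    """Compare two result sets by key; report missing/extra rows and counts."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src - tgt),
        "extra_in_target": sorted(tgt - src),
        "counts_match": len(src) == len(tgt),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
print(reconcile(source, target))
# {'missing_in_target': [2], 'extra_in_target': [], 'counts_match': False}
```

In practice the same comparison is often done with row counts and column checksums pushed down as SQL on both sides, rather than pulling full rows.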
  • Fifth Third Bank
    Python Developer
    Fifth Third Bank Jul 2017 - Oct 2019
    Evansville, Indiana, United States
    • Built models and tools to extract value from the large volumes of data in the environment.
    • Created and launched EC2 instances from Linux, Ubuntu, and Windows AMIs, and wrote shell scripts to bootstrap instances.
    • Architected, designed, implemented, and supported cloud-based infrastructure and its solutions.
    • Set up databases in AWS using RDS and configured backups to an S3 bucket.
    • Analyzed cloud infrastructure and recommended improvements for performance gains and cost efficiency.
    • Performed data collection, data extraction, data pre-processing, feature engineering, dimensionality reduction, algorithm implementation, back-testing, and validation.
    • Imported data from Amazon S3 buckets using the boto3 library.
  • Couth Infotech Pvt Ltd
    Python Developer
    Couth Infotech Pvt Ltd Jan 2016 - Apr 2017
    Hyderabad, Telangana, India
    • Wrote and executed various MySQL database queries from Python using the MySQL Connector/Python and MySQLdb packages.
    • Implemented build automation with tools such as Jenkins.
    • Developed the company's internal CI system, providing a comprehensive API for CI/CD.
    • Participated in various phases of the Software Development Life Cycle (SDLC): requirements gathering, design, analysis, and code development.
    • Worked with a team of developers on Python applications for risk management.
    • Generated Python Django forms to record data from online users; used Python and Django for graphics, XML processing, data exchange, and business logic implementation.
    • Designed and developed the website UI using HTML, XHTML, AJAX, CSS, and JavaScript; developed and tested many dashboard features using Python, Java, Bootstrap, CSS, JavaScript, and jQuery.
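Running database queries from Python, as in the first bullet above, follows the standard DB-API pattern. A minimal sketch, with sqlite3 standing in for MySQL Connector/Python so it runs self-contained; the cursor interface is the same, though MySQL drivers use `%s` placeholders where sqlite3 uses `?`. Table and column names are illustrative.

```python
import sqlite3

# DB-API pattern: connect, get a cursor, run parameterized queries.
# sqlite3 is a stand-in here; mysql.connector.connect(...) would be
# used against a real MySQL server, with %s-style placeholders.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO users (name) VALUES (?)", ("ada",))  # parameterized
conn.commit()
cur.execute("SELECT name FROM users WHERE id = ?", (1,))
row = cur.fetchone()
print(row[0])  # ada
conn.close()
```

Parameterized placeholders (rather than string formatting) are what keep such queries safe from SQL injection.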
  • Brio Technologies Private Limited
    Data Analyst
    Brio Technologies Private Limited Jul 2014 - Dec 2015
    Hyderabad, Telangana, India
    • Client context: Wells Fargo, one of the top five US banks, earns more on money it lends out than it pays on money it borrows, and operates four business segments: Consumer Banking and Lending, Commercial Banking, Corporate and Investment Banking, and Wealth & Investment Management.
    • Produced various reports with Tableau Desktop in response to business requests and ad hoc requirements.
    • Provided financial deliverables for funding agencies, including budgeting reports, proof-of-concept documents, and solution prototypes.
    • Created various dashboards and visualizations for analysis, monitoring, management, and a better understanding of business performance measures.

Frequently Asked Questions about Nikhil P

What company does Nikhil P work for?

Nikhil P works for AbbVie.

What is Nikhil P's role at the current company?

Nikhil P's current role is Senior Data Engineer at AbbVie.

Who are Nikhil P's colleagues?

Nikhil P's colleagues are Michael Richardson, CPA; Steve Marks; George Cunha; Priscilla Galhardi; Olga Khmylev; Nicholas Ergastolo; and Brandon L. Wood.
