Nikhil P
Over 8 years of real-time software development and analysis experience in Big Data and Data Engineering, with strong expertise in Spark, Scala, Hive, Kafka Streaming, and PySpark. Good experience with data warehousing, Snowflake, and star schemas. Strong experience with AWS and with GCP services (Dataproc, Dataflow, BigQuery), plus Python/Scala, including scripting, arrays, lists, and functions. Proficient in writing SQL queries with GROUP BY, including complex scenarios. Excited about new opportunities and looking forward to contributing this knowledge to a new team.
Abbvie
- Website: abbvie.com
- Employees: 46,234
Senior Data Engineer, AbbVie | May 2022 - Present | Vernon Hills, Illinois, United States
• Designed and developed a security framework providing fine-grained access to objects in AWS S3 using AWS Lambda and DynamoDB.
• Extracted data and loaded it into HDFS using Sqoop commands, scheduling Map/Reduce jobs on Hadoop.
• Implemented Lambda to configure the DynamoDB autoscaling feature and built a data access layer for AWS DynamoDB data.
• Loaded data into S3 buckets using AWS Glue and PySpark; filtered data stored in S3 buckets using Elasticsearch and loaded it into Hive external tables.
• Migrated existing databases from on-premises to AWS Redshift using various AWS services.
• Developed PySpark code for AWS Glue jobs and EMR.
• Worked with file formats such as CSV, JSON, flat files, and Parquet to load data from source to raw tables; implemented triggers to schedule pipelines.
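The fine-grained S3 access pattern in the first bullet above can be sketched roughly as follows. This is a hypothetical illustration, not the actual framework: the in-memory `ACL_TABLE` stands in for a DynamoDB lookup, and the role and prefix names are made up.

```python
# Hypothetical sketch of a fine-grained S3 access check, as a Lambda
# handler might implement it. ACL_TABLE stands in for a DynamoDB table
# (role -> allowed key prefixes); all names here are illustrative.

ACL_TABLE = {
    "analyst": ["curated/", "reports/"],
    "engineer": ["raw/", "curated/"],
}

def is_allowed(role: str, s3_key: str) -> bool:
    """Return True if the role may read the given S3 object key."""
    prefixes = ACL_TABLE.get(role, [])
    return any(s3_key.startswith(p) for p in prefixes)

def lambda_handler(event, context=None):
    """Entry point: deny by default, allow only on an ACL prefix match."""
    role, key = event["role"], event["key"]
    if not is_allowed(role, key):
        return {"statusCode": 403, "body": "access denied"}
    # A real deployment would now call, e.g.,
    # boto3.client("s3").generate_presigned_url(...) for the object.
    return {"statusCode": 200, "body": f"access granted to {key}"}
```

The deny-by-default shape is the important part: any key that matches no configured prefix is rejected before S3 is ever touched.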
Data Engineer, Homesite Insurance | Nov 2019 - Apr 2022 | Boston, Massachusetts, United States
• Responsible for validating target data in the data warehouse after transformation and loading with Hadoop big data tooling.
• Wrote AWS Lambda functions in Python that invoke scripts to perform transformations and analytics on large data sets in EMR clusters.
• Designed the ETL process from various sources into Hadoop/HDFS for analysis and further processing of data modules.
• Worked with Azure Blob and Data Lake Storage, loading data into Azure Synapse Analytics (SQL DW).
• Worked on AWS to integrate EMR with Spark, S3 storage, and Snowflake.
• Performed various transformations and actions in Spark; saved result data back to HDFS and from there to the target Snowflake database.
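The target-data validation mentioned in the first bullet above might look like the sketch below: compare row counts and an order-insensitive per-column checksum between the source extract and the warehouse load. This is a minimal stand-in, assuming plain Python lists of dicts; in practice such checks would run as Spark jobs over HDFS or Snowflake tables.

```python
# Minimal, hypothetical source-vs-target load validation: row counts
# plus an order-insensitive checksum per column. Plain lists of dicts
# stand in for the actual source and warehouse tables.
from hashlib import md5

def column_checksum(rows, column):
    """Order-insensitive checksum of one column across all rows."""
    digest = md5()
    for value in sorted(str(r[column]) for r in rows):
        digest.update(value.encode())
    return digest.hexdigest()

def validate_load(source_rows, target_rows, columns):
    """Return a list of discrepancies; an empty list means the load matches."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count: {len(source_rows)} vs {len(target_rows)}")
    for col in columns:
        if column_checksum(source_rows, col) != column_checksum(target_rows, col):
            issues.append(f"checksum mismatch in column {col!r}")
    return issues
```

Sorting before hashing makes the checksum independent of row order, which matters because a distributed load rarely preserves the source ordering.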
Python Developer, Fifth Third Bank | Jul 2017 - Oct 2019 | Evansville, Indiana, United States
• Built models and tools to extract value from the large volumes of data in the environment.
• Created and launched EC2 instances from Linux, Ubuntu, and Windows AMIs and wrote shell scripts to bootstrap instances.
• Architected, designed, implemented, and supported cloud-based infrastructure and its solutions.
• Set up databases in AWS using RDS and configured backups to an S3 bucket.
• Analyzed cloud infrastructure and recommended improvements for performance gains and cost efficiency.
• Involved in data collection, data extraction, data pre-processing, feature engineering, dimensionality reduction, algorithm implementation, back-testing, and validation.
• Imported data from Amazon S3 buckets using the boto3 library.
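Importing data from S3 with boto3, as in the last bullet above, typically uses the `list_objects_v2` paginator. The sketch below is illustrative (bucket and prefix names are made up); the page-flattening helper is split out so the logic can be exercised without AWS credentials.

```python
# Hedged sketch of listing S3 objects with boto3. The paginator pattern
# is standard boto3; parse_listing is separated out so it can be tested
# offline against fake pages.

def parse_listing(pages):
    """Flatten list_objects_v2 response pages into a list of object keys."""
    keys = []
    for page in pages:
        for obj in page.get("Contents", []):
            keys.append(obj["Key"])
    return keys

def list_s3_keys(bucket, prefix=""):
    """List every key under a prefix (requires AWS credentials to run)."""
    import boto3  # deferred so the helper above stays importable offline
    paginator = boto3.client("s3").get_paginator("list_objects_v2")
    return parse_listing(paginator.paginate(Bucket=bucket, Prefix=prefix))
```

Pagination matters here because a single `list_objects_v2` call returns at most 1,000 keys; the paginator transparently follows continuation tokens.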
Python Developer, Couth Infotech Pvt Ltd | Jan 2016 - Apr 2017 | Hyderabad, Telangana, India
• Wrote and executed MySQL database queries from Python using the MySQL Connector/Python and MySQLdb packages.
• Implemented build automation with Jenkins.
• Developed the company's internal CI system, providing a comprehensive API for CI/CD.
• Participated in all phases of the Software Development Life Cycle (SDLC): requirements gathering, design, analysis, and code development.
• Worked with a team of developers on Python applications for risk management.
• Generated Python Django forms to record data of online users; used Python and Django for graphics, XML processing, data exchange, and business-logic implementation.
• Designed and developed the website UI using HTML, XHTML, AJAX, CSS, and JavaScript; developed and tested dashboard features using Python, Java, Bootstrap, CSS, JavaScript, and jQuery.
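Running parameterized queries from Python, as in the first bullet above, follows the DB-API cursor pattern. The sketch below uses the stdlib `sqlite3` module as a stand-in for MySQL so it runs anywhere; with MySQL Connector/Python the `connect()` call differs and placeholders are `%s` rather than `?`, but the cursor/execute flow is the same. Table and column names are illustrative.

```python
# DB-API sketch of parameterized queries from Python. sqlite3 (stdlib)
# stands in for MySQL here; MySQL Connector/Python uses %s placeholders
# instead of ?, but the cursor/execute/fetchall pattern is identical.
import sqlite3

def fetch_users_by_city(conn, city):
    """Run a parameterized SELECT; placeholders prevent SQL injection."""
    cur = conn.cursor()
    cur.execute("SELECT name FROM users WHERE city = ?", (city,))
    return [row[0] for row in cur.fetchall()]

# Illustrative in-memory database with sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, city TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("Asha", "Hyderabad"), ("Ravi", "Pune")])
```

Passing values through the second argument of `execute()`, rather than interpolating them into the SQL string, is what keeps the query safe against injection.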
Data Analyst, Brio Technologies Private Limited | Jul 2014 - Dec 2015 | Hyderabad, Telangana, India
• Supported Wells Fargo, one of the top five banks in the US, which earns more on the money it lends out than it pays on what it borrows.
• The client operates four business segments: Consumer Banking and Lending; Commercial Banking; Corporate and Investment Banking; and Wealth & Investment Management.
• Produced various reports in Tableau Desktop in response to business requests and ad hoc requirements.
• Provided financial deliverables for funding agencies, including budgeting reports, proof-of-concept documents, and solution prototypes.
• Created various dashboards and visualizations for analysis, monitoring, management, and a better understanding of business performance measures.
Frequently Asked Questions about Nikhil P
What company does Nikhil P work for?
Nikhil P works for AbbVie.
What is Nikhil P's role at the current company?
Nikhil P's current role is Senior Data Engineer at AbbVie.
Who are Nikhil P's colleagues?
Nikhil P's colleagues are Michael Richardson, CPA, Steve Marks, George Cunha, Priscilla Galhardi, Olga Khmylev, Nicholas Ergastolo, and Brandon L. Wood.