- Over 8 years of experience in research, design, development, and management of web applications across the Insurance, E-commerce, and Healthcare domains.
- Designed and implemented end-to-end data pipelines to process and analyze large volumes of data.
- Expertise with relational databases including Microsoft SQL Server, Oracle, MySQL, and PostgreSQL; good working knowledge of the NoSQL database MongoDB.
- Worked with standard Python packages such as boto3, Pandas, NumPy, Pickle, PyTest, PySide, SciPy, and PyTables.
- Hands-on experience with Spark batch processing and a proof of concept (POC) on Kafka-Spark streaming.
- Good exposure to Linux/Unix shell scripting.
- Proficient in data mining tools: R, Python, SQL, Excel, and the Big Data Hadoop ecosystem.
- Developed web applications using Python 3, AWS, Django, and PostgreSQL with front-end technologies HTML5, CSS, Bootstrap 3, and JavaScript.
- Designed and developed data ingestion and processing flows in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL.
- Expertise in Snowflake data modeling, ELT using Snowpipe, implementing stored procedures, and standard DWH and ETL concepts.
- Optimized performance of Kafka-based systems by tuning Kafka configuration settings and optimizing data processing algorithms.
- Good knowledge of AWS cloud technology and of data modeling with Star and Snowflake schemas.
- Transferred and migrated data in various formats from AWS S3, relational databases, and flat files into common staging tables and then into meaningful data in Snowflake.
- Strong experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
- Good knowledge of ETL concepts and hands-on ETL experience; capable of processing large (gigabyte-scale) sets of structured, semi-structured, or unstructured data.
- Experience working with different file formats such as ORC, Avro, Parquet, CSV, JSON, and XML.
- Worked with Sqoop, Oozie, and Flume for data ingestion into Hive, HBase, and HDFS.
- Experience in dimensional data modeling, Star/Snowflake schema modeling, fact and dimension tables, physical and logical data modeling, Oracle Designer, and Data Integrator.
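As an illustration of the S3-to-Snowflake staging loads described above, a minimal Python sketch that composes a Snowflake COPY INTO statement for files in an external stage (the table, stage, file-format, and pattern names are hypothetical, not taken from this profile):

```python
# Sketch: build a Snowflake COPY INTO statement for loading staged S3 files
# into a staging table. All object names below are hypothetical examples.

def build_copy_into(table: str, stage: str, file_format: str, pattern: str) -> str:
    """Return a COPY INTO statement that loads staged files into a table."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"PATTERN = '{pattern}';"
    )

sql = build_copy_into(
    table="staging.orders",
    stage="s3_orders_stage",
    file_format="csv_format",
    pattern=".*orders_.*[.]csv",
)
print(sql)
```

In practice a statement like this would be executed through the Snowflake Python connector or attached to a Snowpipe definition for continuous ingestion.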
Franklin Templeton
- Website: franklinresources.com
- Employees: 12,060
- Snowflake/ETL Developer, Franklin Templeton, Oct 2022 - Present, California, United States (Remote)
- Snowflake Developer, AIG, Jun 2021 - Oct 2022, Houston, Texas, United States
- Sr. Data Engineer / Snowflake Developer, Quadrant Resources Corp., Jun 2017 - Feb 2020, India
- Business Analyst, Accenture, Apr 2015 - May 2017, India
Frequently Asked Questions about Sharon P
What company does Sharon P work for?
Sharon P works for Franklin Templeton.
What is Sharon P's role at the current company?
Sharon P's current role is Snowflake and ETL developer.
Who are Sharon P's colleagues?
Sharon P's colleagues are Mruthyum Thatipamala, Miriam Fleming, Sravya Kandukuri, Ramesh Persaud, Torrey Ward, Monet Noe, Krishna Priyanka.