Daideep Patel is actively looking for Data Engineer/Data Analyst roles and is currently at Amplify Loyalty Solutions.
Amplify Loyalty Solutions
-
Data Engineer | Amplify Loyalty Solutions | Jul 2022 - Present | United States
• Work independently on the development, testing, implementation, and maintenance of systems of moderate-to-large size and complexity.
• Developed and implemented data pipelines using AWS services such as S3, Glue, and Athena to process data.
• Created data pipelines using AWS Glue to load data into various layers in S3, with reporting through AWS Athena.
• Created Athena data sources on S3 buckets for querying data.
• Developed basic-to-complex SQL queries to research, analyze, and troubleshoot data.
• Proficient in ETL workflows with programming/scripting languages such as Python.
• Designed and assisted in developing complex data migration processes for the data warehouse.
• Created automated data pipelines via SQL- and Python-based ETL frameworks.
• Participated in the reference-architecture build-out of data lake and data warehouse solutions.
• Performed data analysis using SQL and was involved in critical problem-solving situations.
• Good working exposure to AWS cloud technologies: EC2, S3, Lambda, Glue, EMR, Athena.
• Performed data cleansing, data imputation, and data preparation using Pandas and NumPy.
• Experience building end-to-end data pipelines for data transfers using Python and AWS, including Boto3.
• Prototyped pipelines using Databricks notebooks, Snowflake, and PySpark.
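The cleansing and imputation step described above can be sketched in a few lines; the resume cites Pandas and NumPy, but this stdlib-only sketch with a hypothetical "amount" column shows the same mean-imputation idea:

```python
from statistics import mean

def impute_mean(rows, column):
    """Fill missing (None) values in `column` with the mean of the present values."""
    present = [r[column] for r in rows if r[column] is not None]
    fill = mean(present)
    return [{**r, column: fill if r[column] is None else r[column]} for r in rows]

# Hypothetical sample records with one missing "amount".
records = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 20.0},
]
cleaned = impute_mean(records, "amount")
```

In a Pandas pipeline the equivalent would be a `fillna` on the column mean; the dict-based version above just makes the mechanics explicit.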
-
Data Engineer | Blue Kc-Comp | Feb 2022 - Jun 2022 | New York, United States
• Analyzed, designed, and built modern data solutions using Azure services to support data visualization.
• Extracted, transformed, and loaded data from different Azure resources into Snowflake using Azure Data Factory.
• Working experience utilizing Azure DevOps.
• Created pipelines in ADF using Linked Services and Datasets to extract, transform, and load data from sources such as ADO, Blob Storage, and Azure Data Lake.
• Built data pipelines in Airflow on GCP for ETL jobs using different Airflow operators.
• Experience with public-cloud managed services for data warehousing/analytics in Microsoft Azure.
• Designed and implemented database solutions in Azure SQL and Snowflake.
• Used the Cloud Shell SDK in GCP to configure Dataproc, Storage, and BigQuery.
• Wrote complex SQL queries for data analysis to meet business requirements.
• Experience working with project managers to understand and design workflow architectures per requirements, and with data scientists to assist with feature engineering.
• Created Source-Target Mappings (STMs) for the required tables by understanding the business requirements for the reports.
• Hands-on experience with Big Data application phases such as data ingestion, data analytics, and data visualization.
• Developed dashboards for visualization using the cloud dashboarding tools Looker and AWS QuickSight for continuous monitoring of real-time data.
• Built a POC to explore AWS Glue capabilities for data cataloging and data integration.
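The Source-Target Mapping work above can be sketched as a small Python routine; the column names and transforms here are hypothetical illustrations, not from any actual STM document:

```python
# Each entry maps a source column to a (target column, optional transform) pair.
STM = {
    "cust_nm": ("customer_name", str.strip),
    "ord_dt":  ("order_date", None),
    "amt_usd": ("amount", float),
}

def apply_stm(source_row, stm):
    """Project a source record onto the target schema defined by the mapping."""
    target = {}
    for src_col, (tgt_col, transform) in stm.items():
        value = source_row[src_col]
        target[tgt_col] = transform(value) if transform else value
    return target

row = {"cust_nm": " Ada ", "ord_dt": "2022-03-01", "amt_usd": "19.99"}
mapped = apply_stm(row, STM)
```

In ADF or a SQL-based framework the same mapping would live in pipeline metadata rather than code, but the shape (source column, target column, transform) is the same.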
-
Data Analytics Engineer | Capital One | Jul 2021 - Jan 2022 | Malvern, Pennsylvania, United States
• Work independently on the development, testing, implementation, and maintenance of systems of moderate-to-large size and complexity.
• Migrated applications from an internal data center to the AWS cloud platform.
• Used and developed Python scripts to migrate data to AWS.
• Worked with standard Python packages such as boto and boto3 for AWS.
• Handled business logic via backend Python programming to achieve optimal results.
• Proficient in wrangling data and creating data pipelines using fast, efficient Python code.
• Familiar with DBMS table design, loading, and data modeling, with experience in SQL.
• Good knowledge of the various stages of the SDLC (Software Development Life Cycle).
• Gained knowledge of writing CloudFormation templates and deploying AWS resources.
• Worked extensively with AWS services, with a wide and in-depth understanding of each.
• Created and executed HQL scripts that create external tables in a raw-layer database in Hive.
Data Engineer | Vanguard | Feb 2020 - May 2021 | Malvern, Pennsylvania, United States
• Work independently on the development, testing, implementation, and maintenance of systems of moderate-to-large size and complexity.
• Migrated applications from an internal data center to the AWS cloud platform.
• Used and developed Python scripts to migrate data to AWS.
• Worked with standard Python packages such as boto and boto3 for AWS.
• Handled business logic via backend Python programming to achieve optimal results.
• Proficient in wrangling data and creating data pipelines using fast, efficient Python code.
• Familiar with DBMS table design, loading, and data modeling, with experience in SQL.
• Good knowledge of the various stages of the SDLC (Software Development Life Cycle).
• Gained knowledge of writing CloudFormation templates and deploying AWS resources.
• Worked extensively with AWS services, with a wide and in-depth understanding of each.
• Created and executed HQL scripts that create external tables in a raw-layer database in Hive.
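The data-center-to-AWS migration loop mentioned in the last two roles can be sketched as below. This is an in-memory stand-in with hypothetical object keys: the dict assignment stands where a real script would call boto3's `upload_file` or `copy_object` per key.

```python
def migrate_objects(source_bucket, dest_bucket, prefix=""):
    """Copy objects under `prefix` from one bucket (modeled as a dict) to another.

    Stand-in only: in the real migration the inner assignment would be a
    boto3 s3.upload_file / copy_object call against the target bucket.
    """
    moved = []
    for key, body in sorted(source_bucket.items()):
        if key.startswith(prefix):
            dest_bucket[key] = body
            moved.append(key)
    return moved

# Hypothetical legacy store: only the raw/ prefix should be migrated.
legacy = {"raw/2021/a.csv": b"a,b\n1,2", "raw/2021/b.csv": b"c\n3", "tmp/x": b""}
cloud = {}
migrated = migrate_objects(legacy, cloud, prefix="raw/")
```

A production version would also need retry handling and checksum verification, which the sketch omits.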
Data Analyst | Health Savings Administrators | Jul 2019 - Dec 2019 | United States
• Extracted and mined data for analysis to aid in solving business problems.
• Experience supporting a cloud-based data warehouse environment such as Snowflake.
• Demonstrated a full understanding of the fact/dimension data warehouse design model, including star and snowflake design methods.
• Worked on a data migration project from Teradata to Snowflake.
• Formulated SQL queries, aggregate functions, and database schemas to automate information retrieval.
• Experience with SQL and relational and multi-dimensional databases.
• Used Databricks as a platform for high-scale analytics using Spark.
• Involved in converting SQL queries into Spark transformations using PySpark concepts.
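The fact/dimension star design mentioned above boils down to facts carrying foreign keys into dimension tables. A toy sketch with hypothetical sales and product tables:

```python
# Dimension table keyed by surrogate key; fact table references it by product_id.
dim_product = {
    1: {"product_name": "Widget", "category": "Hardware"},
    2: {"product_name": "Gadget", "category": "Hardware"},
}
fact_sales = [
    {"product_id": 1, "qty": 3, "revenue": 30.0},
    {"product_id": 2, "qty": 1, "revenue": 25.0},
    {"product_id": 1, "qty": 2, "revenue": 20.0},
]

def revenue_by_category(facts, dim):
    """Join each fact row to its dimension row, then aggregate revenue."""
    totals = {}
    for row in facts:
        category = dim[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["revenue"]
    return totals

report = revenue_by_category(fact_sales, dim_product)
```

In SQL or PySpark this is the usual fact-to-dimension join followed by a GROUP BY; a snowflake design would further normalize the dimension into sub-tables.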
-
Data Analyst | Capital One Bank | Feb 2019 - May 2019 | Richmond, Virginia, United States
• Extracted and mined data for analysis to aid in solving business problems.
• Experience supporting a cloud-based data warehouse environment such as Snowflake.
• Demonstrated a full understanding of the fact/dimension data warehouse design model, including star and snowflake design methods.
• Worked on a data migration project from Teradata to Snowflake.
• Formulated SQL queries, aggregate functions, and database schemas to automate information retrieval.
• Experience with SQL and relational and multi-dimensional databases.
• Used Databricks as a platform for high-scale analytics using Spark.
• Involved in converting SQL queries into Spark transformations using PySpark concepts.
-
Data Developer | Shivaasha Technologies And Services Private Limited | Dec 2013 - May 2016 | Gujarat, India
• Designed and developed web applications using Python, SQL, PHP, and CMS systems.
• Used Python scripts to update content in the database and manipulate files.
• Familiar with DBMS table design, loading, and data modeling, with experience in SQL.
• Analyzed data from multiple sources and developed a process to integrate the data into a single, consistent view; designed and developed a data management system using MySQL.
• Performed troubleshooting and deployed many Python bug fixes for the two main applications that were a primary data source for both customers and the internal customer-service team.
• Developed tools using Python, shell scripting, and XML to automate menial tasks.
• Carried out various mathematical operations for calculation purposes using Python libraries.
• Implemented code in Python to retrieve and manipulate data.
• Wrote multiple MapReduce programs to extract, transform, and aggregate data from more than 20 sources in multiple file formats, including XML, JSON, CSV, and other compressed formats.
• Implemented Spark Core in Scala to process data in memory.
• Performed job functions using Spark APIs in Scala for real-time analysis and fast querying.
• Involved in creating Spark applications in Scala using cache, map, reduceByKey, and similar functions to process data.
• Created Oozie workflows for Hadoop-based jobs, including Sqoop, Hive, and Pig.
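The `reduceByKey` pattern cited above (the actual work was Spark in Scala) can be mirrored locally in a few lines of Python, shown here with a hypothetical word-count input:

```python
from collections import defaultdict
from functools import reduce

def reduce_by_key(pairs, fn):
    """Local analogue of Spark's reduceByKey: fold all values sharing a key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: reduce(fn, values) for key, values in grouped.items()}

# Word-count style usage.
counts = reduce_by_key([("a", 1), ("b", 1), ("a", 1)], lambda x, y: x + y)
```

Spark distributes the same fold across partitions (so `fn` must be associative); the local dict version captures only the per-key semantics.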
Data Analyst/Data Developer | Ilensys Technologies India | Jun 2011 - Oct 2013 | India
• Implemented the Slowly Changing Dimension scheme (Type I and Type II) for most of the dimensions.
• Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
• Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
• Worked in Mercury Quality Center to track defects logged against the logical and physical models.
• Participated in requirement-gathering sessions with business users and sponsors to understand and document the business requirements and the goals of the project.
• Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
• Analyzed the source system (JD Edwards) to understand the source data and JDE table structure, along with a deeper understanding of business rules and data-integration checks.
• Identified the facts and dimensions from the source system and business requirements to be used for the data warehouse.
• Worked as an onsite project coordinator once the database design was finalized, implementing the data warehouse according to the implementation standards.
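The Type II SCD scheme above keeps history by closing out the current dimension row and appending a new version when an attribute changes. A minimal sketch with a hypothetical customer dimension (in practice this runs as ETL logic against warehouse tables):

```python
from datetime import date

def scd2_upsert(dimension, key, attrs, today):
    """Apply a Type II slowly-changing-dimension update to a list of rows.

    Rows are dicts with 'key', attribute fields, 'valid_from', and
    'valid_to' (None marks the current version). On a change, the current
    row is closed and a new versioned row is appended; no change is a no-op.
    """
    current = next(
        (r for r in dimension if r["key"] == key and r["valid_to"] is None), None
    )
    if current and all(current[k] == v for k, v in attrs.items()):
        return dimension  # attributes unchanged: keep current version
    if current:
        current["valid_to"] = today  # close out the old version
    dimension.append({"key": key, **attrs, "valid_from": today, "valid_to": None})
    return dimension

dim = [{"key": "C1", "city": "Indore", "valid_from": date(2011, 6, 1), "valid_to": None}]
scd2_upsert(dim, "C1", {"city": "Atlanta"}, date(2012, 1, 15))
```

A Type I update would instead overwrite `city` in place, discarding history; Type II preserves both versions with their validity windows.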