Prem Sagar M
• Well versed in all phases of the data engineering project life cycle, including data extraction, data cleaning, statistical modeling, and data visualization with large sets of structured and unstructured data.
• Extensive knowledge of data warehouse concepts, with expertise in data modeling for OLAP and OLTP systems from design and analysis through implementation, covering conceptual, logical, and physical data models.
• In-depth knowledge of large database design techniques; experienced in data analysis, data cleansing, and data transformation and migration using ETL tools.
• Expertise in extracting, transforming, and loading data into databases per client requirements, applying database development best practices to address performance, scalability, and efficiency issues in data warehouse databases.
• Proven ability to plan and implement an overall analytics and business intelligence strategy, and to design and develop analytical projects that explain the key business behaviors driving customer acquisition, retention, and engagement.
• Systematic and organized, adapting quickly to new technologies and processes with a logical, analytical, and scientific approach.
Quick to learn new concepts and techniques within a short span of time, and flexible to work on any technology.

Areas of Expertise
• Extensive experience in Ab Initio, Informatica, Netezza, Teradata, Unix shell scripting, Snowflake, AWS, and MicroStrategy.
• Instrumental in migrating the Enterprise Data Warehouse from an on-prem Netezza server to a cloud-based Snowflake data warehouse.
• Played a key role in the end-to-end implementation of an Enterprise Operational Data Store as an OLTP system, with analytical reporting, for a retail client.
• Experienced in integrating transactional data from various OLTP source systems into an Enterprise Data Warehouse (EDW/OLAP) for data analytics.
• Highly experienced in data migration activities, integrating data from different databases and legacy systems into the data warehouse.
• Adhered to the SDLC process, including business requirements, functional specification, design, development, implementation, and quality assurance.
• Delivered projects through bi-weekly sprints within an agile framework.
• Worked on integration and data migration projects combining data from heterogeneous systems to enhance application functionality and MicroStrategy reporting.
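The OLTP-to-EDW integration described above typically hinges on incremental (watermark-based) extraction. A minimal sketch of that pattern, with hypothetical table and column names rather than any actual client schema:

```python
# Illustrative watermark-based incremental extract: pull only rows that
# changed since the last successful load, then advance the watermark.
# "orders" and "updated_at" are made-up names for illustration.

def build_incremental_query(table, watermark_col, last_watermark):
    """Return a SQL string selecting rows modified after the watermark."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > '{last_watermark}' "
        f"ORDER BY {watermark_col}"
    )

def advance_watermark(rows, watermark_col, last_watermark):
    """New watermark = the max change timestamp seen in this batch."""
    if not rows:
        return last_watermark  # no new data: keep the old watermark
    return max(row[watermark_col] for row in rows)
```

Each load reads only the delta since the previous run, which keeps OLTP source impact low and makes reloads idempotent as long as the watermark column is monotonic.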
Data Engineer
Petco Animal Supplies Stores, Inc. | San Diego, CA, US
Senior Data Engineer
Petco Animal Supplies, Inc. | Jun 2022 - Present | California, United States
• Ingested data from sources such as Oracle BICC and Amazon S3 into Azure Data Lake Storage Gen2.
• Ensured data quality and integrity through data validation, cleansing, and transformation operations using Azure Data Factory and Databricks.
• Designed streaming data pipelines using Apache Spark Structured Streaming, Kafka, and Azure Event Hubs, ensuring data consistency in near-real-time scenarios with Delta Lake.
• Designed high-performance distributed computing tasks using Azure, Spark, SQL, T-SQL, and Python.
• Worked with Azure services including Azure Data Lake Analytics, Azure Data Lake Storage, Azure SQL Database, and Azure SQL Data Warehouse to build analytics solutions.
• Developed architecture for data services spanning relational, NoSQL, and big data technologies.
• Created a Data Lake Analytics account and Data Lake Analytics jobs in the Azure portal using SQL scripts.
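The validation and cleansing step above can be sketched in plain Python. This is a minimal illustration of the kind of rules such a pipeline applies; the field names and checks are invented for the example, not Petco's actual ADF/Databricks logic:

```python
# Hypothetical record schema: every row needs these fields to load.
REQUIRED = ("order_id", "sku", "quantity")

def clean_record(rec):
    """Return a cleansed copy of rec, or None if it fails validation."""
    if any(rec.get(k) in (None, "") for k in REQUIRED):
        return None                                  # reject rows missing key fields
    out = dict(rec)
    out["sku"] = str(out["sku"]).strip().upper()     # normalize identifiers
    try:
        out["quantity"] = int(out["quantity"])       # enforce types
    except (TypeError, ValueError):
        return None
    if out["quantity"] < 0:
        return None                                  # basic range check
    return out

def clean_batch(records):
    """Apply clean_record to a batch, dropping rejected rows."""
    cleaned = [clean_record(r) for r in records]
    return [r for r in cleaned if r is not None]
```

In a real Databricks job the same rules would typically be expressed as DataFrame filters and column expressions so they run distributed rather than row by row.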
Sr. Data Management Engineer
Charter Communications | Jan 2020 - May 2022 | United States
● Migrated data from an RDBMS to a NoSQL database and developed schemas for data deployed across varied data systems.
● Developed Spark applications in Python and implemented an Apache Spark data processing project handling data from various RDBMS and streaming sources.
● Maintained the Hadoop cluster on GCP using Google Cloud Storage, BigQuery, and Dataproc.
● Led the migration/replication from SQL Server to BigQuery using Datastream, GCS buckets, Pub/Sub, and Dataflow.
● Used Apache Airflow in a GCP Cloud Composer environment to build data pipelines, employing operators such as BashOperator, Hadoop operators, Python callables, and branching operators.
● Implemented both batch and real-time ingestion workflows in Apache Druid, enabling ingestion of historical and streaming data for comprehensive analytics.
● Extracted and transformed data from structured and semi-structured formats such as JSON, CSV, and Parquet.
● Used the Cloud Shell SDK in GCP to configure the Dataproc, Cloud Storage, and BigQuery services.
● Designed GCP Cloud Composer DAGs to load data from on-prem CSV files into BigQuery tables, scheduled to run in incremental mode.
● Used Cloud Functions with Python to load data into BigQuery on arrival of CSV files in a GCS bucket.
● Implemented Azure security features such as Azure Key Vault, Azure Active Directory integration, and data encryption to ensure data protection and compliance.
● Integrated Informatica Cloud (IICS) with PagerDuty to monitor scheduled jobs.
● Used AWS Lambda functions and AWS SSM to trigger Drools and pass arguments, with AWS Step Functions for orchestration.
● Transformed data and developed metrics using Spark SQL for display on dashboards.
● Staged API and Kafka data (in JSON format) into a Snowflake database, flattening it for different functional services.
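The last bullet's flattening of nested JSON before staging can be sketched client-side in a few lines (in Snowflake itself this is usually done server-side with the FLATTEN table function over a VARIANT column). The payload shape here is illustrative:

```python
# Recursively flatten nested dicts/lists into a single-level dict with
# dotted keys, the shape a staging table with one column per path expects.

def flatten_json(obj, parent_key="", sep="."):
    """Flatten nested dicts and lists; scalars become leaf values."""
    items = {}
    if isinstance(obj, dict):
        for k, v in obj.items():
            key = f"{parent_key}{sep}{k}" if parent_key else k
            items.update(flatten_json(v, key, sep))
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            items.update(flatten_json(v, f"{parent_key}{sep}{i}", sep))
    else:
        items[parent_key] = obj
    return items
```

For example, `{"user": {"tags": ["x"]}}` flattens to `{"user.tags.0": "x"}`, so downstream services can address each field by path.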
Sr. Data Management Engineer
Quick Play Media | Jun 2018 - May 2019 | United States
• Prepared ADF pipelines to extract data from various source systems into the UDH.
• Developed, implemented, documented, monitored, and maintained data warehouse extracts, transformations, loads, and the overall ETL process.
• Created Databricks notebooks to import files, transform the data, and output it to a SQL Server database and Azure Blob Storage.
• Migrated SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse; controlled and granted database access; and migrated on-premises databases to Azure Data Lake Storage using Azure Data Factory.
• Implemented code and scripts for acquisition and transformation of data from different sources.
• Implemented Python and Scala notebooks in Databricks, utilizing DataFrames and the Spark SQL API for faster data processing.
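The import-and-transform step those notebooks perform can be illustrated with stdlib Python (in Databricks the equivalent would use Spark DataFrames and write to SQL Server or Blob Storage). Column names are made up for the sketch:

```python
import csv
import io

def transform_csv(raw_text):
    """Parse CSV text, derive a 'total' column, and return rows as dicts.

    A stand-in for the notebook's transform stage: read an imported file,
    enrich it, and hand the rows to an output writer.
    """
    rows = list(csv.DictReader(io.StringIO(raw_text)))
    for row in rows:
        row["total"] = float(row["price"]) * int(row["qty"])  # derived column
    return rows
```

Usage: feed the function the raw file contents; the returned dicts are ready to be bulk-inserted into the target table.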
Data Engineer/Analyst
Ericsson | Aug 2013 - Dec 2016 | India
● Led the development of prediction models for churn prediction, customer retention, acquisition, and loyalty, using techniques including logistic regression, decision trees, and random forests.
● Built multiple market mix models (MMM) for various sets of SKUs.
● Built prediction models to identify optimal monthly targets and to predict churn promoters.
● Assisted developers in creating and maintaining web-based applications using HTML, CSS, JavaScript, C#, ASP.NET, SQL Server, and Visual Studio.
● Involved in new product development and deployment for insurance clients, while in parallel handling high-priority production issues and tickets.
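Of the techniques listed for churn prediction, logistic regression is the simplest to sketch. A toy pure-Python version trained with gradient descent, purely illustrative of the technique rather than any production model:

```python
import math

def sigmoid(z):
    """Map a real-valued score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    """Fit weights/bias on feature vectors xs and 0/1 churn labels ys."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y                                  # gradient of log loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict_churn(w, b, x):
    """Return the predicted churn probability for one customer."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
```

In practice this would be an sklearn or Spark MLlib model with regularization and proper train/test splits; the loop above just shows the underlying mechanics.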
Prem Sagar M Education Details
Jawaharlal Nehru Technological University Hyderabad (JNTUH)