Lalit Dath Email and Phone Number
Lalit Dath is a Data Engineer at Giant Eagle, Inc.
Giant Eagle, Inc.
- Website: gianteagle.com
- Employees: 4,380
Senior Data Engineer, Giant Eagle, Inc. | Nov 2023 - Present | Pittsburgh, Pennsylvania, United States
• Built and implemented relational servers and databases on the Azure Cloud.
• Delivered data migration pipelines to advance current solutions from on-premises systems and apps to the Azure cloud.
• Implemented Azure Data Factory, Spark SQL, and U-SQL analytics with Azure Data Lake Storage Gen2 (ADLS) to extract, transform, and load data from source systems into Azure data storage services.
• Integrated data from on-premises SQL Servers into cloud databases (Azure Synapse Analytics and Azure SQL DB).
• Created a Spark job in Azure Databricks to replace the existing ETL/SSIS solution, and delivered data warehouse solutions on Azure Synapse using PolyBase/external tables.
• Performed data analysis using HiveQL, HBase, and custom MapReduce programs; experienced in importing and exporting data between RDBMS, HDFS, and Hive.
• Performed database transformations to cleanse the data and ensure its quality; provided production support and monitored ETL jobs to ensure they ran smoothly.
• Used the cloud-based Matillion ETL tool to create fact and dimensional models in MS SQL Server and the Snowflake database.
• Applied data warehousing principles, including data normalization, OLTP and OLAP systems, physical and logical data models, and star and snowflake schemas.
• Worked in UNIX environments, including cron, FTP, and UNIX shell scripting.
• Built streaming applications in Azure Notebooks with Spark and Kafka, using Spark Streaming and Kafka for cluster management and data ingestion.
• Automated the entire data pipeline with a scheduler by creating Oozie workflows, and put automation scripts in place for Hadoop ETL tasks.
• Implemented the on-premises migration: used ARM templates, Azure DevOps, Azure CLI, and App Services to connect .NET apps, the DevOps platform, and Azure CI/CD processes.
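The fact and dimensional modeling mentioned above hinges on one core move: assigning surrogate keys to a dimension while loading the fact table. A minimal in-memory sketch of that idea follows; the table and field names (product, qty) are illustrative assumptions, not taken from the actual Giant Eagle pipelines:

```python
# Minimal star-schema load sketch: build a product dimension with
# surrogate keys, then record facts that reference those keys.
# All names here (product, qty, product_sk) are hypothetical.

def load_star_schema(rows):
    """rows: iterable of dicts like {"product": str, "qty": int}."""
    dim_product = {}   # natural key -> surrogate key
    fact_sales = []    # fact rows referencing the dimension by surrogate key
    for row in rows:
        key = row["product"]
        if key not in dim_product:
            dim_product[key] = len(dim_product) + 1  # next surrogate key
        fact_sales.append({"product_sk": dim_product[key], "qty": row["qty"]})
    return dim_product, fact_sales

dim, facts = load_star_schema([
    {"product": "milk", "qty": 2},
    {"product": "eggs", "qty": 12},
    {"product": "milk", "qty": 1},
])
# dim maps each product to a stable surrogate key; facts carry only
# the surrogate key, keeping the fact table narrow and join-friendly.
```

In a real warehouse the dimension lookup would be a PolyBase external table or a Snowflake merge rather than a Python dict, but the key assignment logic is the same.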
Big Data Engineer, Thomson Reuters | Jun 2022 - Oct 2023 | Eagan, Minnesota, United States
• Handled the usage, installation, and configuration of the various components of the Hadoop ecosystem.
• Used Google Cloud Platform's BigQuery and Cloud Dataproc services to build and develop production data engineering solutions that deliver our pipeline patterns.
• Worked with Hadoop ecosystem tools including HDFS, MapReduce, Hive, Pig, YARN, Spark, HBase, Oozie, Sqoop, and ZooKeeper.
• Normalized databases such as Oracle, SQL, PostgreSQL, and Cassandra in line with design principles and best practices.
• Deployed Talend Data Integration and Informatica PowerCenter ETL technologies for data warehousing, BI, analytics, ETL procedures, data mining, data mapping, data conversion, and data migration.
• Built Power BI reports and dashboards to understand the data and inform business decisions while working as a downstream data analyst.
• Developed Hadoop MapReduce jobs, Spark applications, and Kafka producers and consumers.
• Orchestrated the full process with Oozie, from gathering data from MySQL, to pushing it to the Hadoop Distributed File System, to running Pig/Hive and MapReduce jobs.
• Developed and deployed custom Hadoop applications in an AWS environment.
• Applied structural changes with MapReduce and Hive, and analyzed data using data visualization and reporting tools.
• Created a data pipeline using Flume, Pig, and Sqoop to ingest customer histories and cargo data into HDFS for analysis.
• Configured the Hive metastore with MySQL, which holds the metadata for Hive tables, and moved data between MySQL and HDFS in both directions using Sqoop.
• Authored the data refresh strategy document and the capacity planning papers needed for project development and support; developed Oozie processes and mainframe task scheduling.
• Developed and built a complete end-to-end data warehouse infrastructure on Confidential Redshift.
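The MapReduce jobs described above all follow the same map → shuffle → reduce contract. A toy in-process sketch of that programming model (the classic word count, not code from the actual Thomson Reuters pipelines) looks like this:

```python
from collections import defaultdict

# Toy illustration of the three MapReduce phases over lines of text.

def map_phase(lines):
    """Mapper: emit (key, value) pairs -- here, (word, 1)."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: aggregate each key's values -- here, a sum."""
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["Spark and Kafka", "Kafka and Hive"])))
# counts -> {"spark": 1, "and": 2, "kafka": 2, "hive": 1}
```

On a real cluster, Hadoop distributes the map and reduce phases across nodes and performs the shuffle over the network; the per-record logic a developer writes is exactly the mapper and reducer above.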
Data Analyst, Abbvie | Aug 2020 - May 2022 | Vernon Hills, Illinois, United States
• Examined information at the cluster level across several databases during data transformation and data loading.
• Built many SQL scripts to catch data discrepancies as part of data migration, and migrated history data from Teradata SQL to Snowflake.
• Used SQL Server Integration Services (SSIS) to build automated data pipelines that transport data more effectively, and SQL Server Management Studio (SSMS) to set up database connections and write scripts to extract and convert the data.
• Defined and controlled policies for the S3 buckets AWS used for backup and storage.
• Integrated with Tibco and Spotfire, and performed calculations and predictive modeling using Python scripts.
• Evaluated promotional activities from a number of angles to maximize clients' return on investment, including click-through rates, conversion rates (CON), seasonal trends, search queries, quality scores (QS), competitors, and distribution channels.
• Monitored click events on a blended home screen, such as click-through rate, exchange rate, bounce-back rate, and list views, providing useful data for finding relevant optimizations.
• Understood client requirements and developed a testing approach based on firm principles.
• Built a defect management plan in JIRA to track problems and assign tasks.
• Created process flows to illustrate and clarify project requirements.
• Coordinated across teams such as QA, development, and automation, and frequently used Quality Center to design and run test cases.
• Performed performance testing, integration testing, unit testing, and validation; tested Hadoop MapReduce written in Python, Pig, and Hive.
• Used Spark SQL to make data testing and processing faster.
• Wrote custom SQL queries to check parameters for the daily, weekly, and monthly jobs.
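The promotional-activity analysis above revolves around a few standard ratios. A minimal sketch of how click-through and conversion rates might be computed from raw event counts (the function and field names are assumptions, not Abbvie's actual schema):

```python
# Compute click-through rate (CTR) and conversion rate from raw event
# counts, guarding against divide-by-zero for campaigns with no traffic.

def campaign_metrics(impressions, clicks, conversions):
    ctr = clicks / impressions if impressions else 0.0
    conversion_rate = conversions / clicks if clicks else 0.0
    return {"ctr": ctr, "conversion_rate": conversion_rate}

m = campaign_metrics(impressions=10_000, clicks=250, conversions=25)
# 250 clicks on 10,000 impressions is a 2.5% CTR;
# 25 conversions from 250 clicks is a 10% conversion rate.
```

At scale, the same ratios would be computed per campaign in SQL or Spark SQL over the click-event tables rather than on scalar counts.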
Frequently Asked Questions about Lalit Dath
What company does Lalit Dath work for?
Lalit Dath works for Giant Eagle, Inc.
What is Lalit Dath's role at the current company?
Lalit Dath's current role is Data Engineer @ Giant Eagle.
Who are Lalit Dath's colleagues?
Lalit Dath's colleagues are Kayla Lute, Jillian Banky, Debra Novak, Rick Barnes, Erin Isaac, Kearstyn Bizzarro, Caden Mcginnis.