Lalit Sai Email and Phone Number
Lalit Sai is a Senior Data Engineer - SQL | Python | PySpark | ADF | Snowflake | Databricks | Airflow | Hadoop | AWS | Azure at Western Union.
Western Union
- Website: westernunion.com
- Employees: 12,747
Senior Data Engineer
Western Union | Aug 2020 - Present
- Spearheaded the migration of an extensive Alteryx workflow into Snowflake, transforming complex data-processing logic into optimized SQL scripts to enhance query performance and system efficiency.
- Collaborated with the Alteryx team to document and map existing workflows, ensuring accurate replication and integration within the Snowflake environment.
- Led the integration of Talend to manage data flow between Snowflake and Power BI, establishing a robust pipeline for real-time data analytics, and developed and optimized interactive dashboards in Power BI.
- Led the strategic planning and execution of a comprehensive cloud migration, integrating Snowflake data warehousing with AWS cloud services to enhance scalability and data-analytics capabilities.
- Orchestrated a detailed initial assessment and migration strategy that minimized downtime and prioritized data integrity, leading to a seamless transition of legacy systems to a modern cloud-based infrastructure.
- Spearheaded the design and development of a scalable real-time data-processing platform that handles high-volume data streams with low latency, enabling faster decision-making and enhanced customer experiences.
- Implemented CI/CD pipelines for code deployment and testing, optimizing Spark streaming jobs and partitioning strategies to handle high throughput efficiently and improve operational efficiency.
- Spearheaded the creation of a centralized data lake, enhancing data accessibility and analysis capabilities across the organization, leveraging HDFS, Sqoop, HBase, and AWS services.
- Integrated multiple data sources into a unified BI platform, ensuring data consistency and integrity; used Sqoop and custom Python scripts for efficient data ingestion across a diverse range of data formats and RDBMS systems.
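The partitioning strategy behind the streaming work above can be illustrated with a minimal, self-contained sketch. Plain Python stands in for Spark's hash partitioner here, and all names and data are illustrative, not taken from the actual pipeline:

```python
from collections import defaultdict

def hash_partition(records, key_fn, num_partitions):
    """Assign each record to a partition by hashing its key --
    the same idea Spark uses to spread a high-volume stream
    evenly across executors."""
    partitions = defaultdict(list)
    for record in records:
        partitions[hash(key_fn(record)) % num_partitions].append(record)
    return dict(partitions)

# Example: spread payment events across 4 workers by customer id.
events = [{"customer": f"c{i}", "amount": i * 10} for i in range(100)]
parts = hash_partition(events, lambda e: e["customer"], 4)
assert sum(len(v) for v in parts.values()) == 100  # no records lost
```

Partitioning on a high-cardinality key such as customer id keeps any single partition from becoming a throughput bottleneck.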
Big Data Engineer
Palni Inc | Dec 2018 - Jul 2020 | McKinney, Texas, United States
- Migrated sensitive patient data from legacy SQL Server databases to the Hadoop Distributed File System (HDFS), ensuring data integrity and compliance with healthcare regulations.
- Designed and optimized Spark RDD transformations in Scala to process large datasets, enabling deep insights into patient data and healthcare trends.
- Designed and implemented secure, RESTful APIs using the Play Framework to facilitate seamless data exchange across healthcare applications, ensuring compatibility and ease of integration.
- Spearheaded the integration and consolidation of disparate data sources into a unified analytics platform using Databricks, Delta Lake, Sqoop, and Apache Kafka, enabling a leading healthcare-analytics firm to enhance its data management and analytical capabilities.
- Automated data import/export processes, integrating large volumes of data from various healthcare systems, which reduced data-processing time by 50% and supported the firm's real-time analytics and predictive-modeling needs.
- Configured and optimized Databricks workspaces and notebooks, employing Delta Lake for efficient data storage, which improved data query performance by 40% and facilitated advanced analytics operations.
- Developed custom scripts and configurations to enhance data quality and compliance within the Delta Lake framework, leveraging its version-control capabilities to maintain historical data accuracy and integrity.
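A migration that must guarantee data integrity, as described above, typically validates per-row fingerprints on both sides of the move. A hedged, pure-Python sketch of that check (the actual SQL Server-to-HDFS pipeline is not shown; rows and helpers here are hypothetical):

```python
import hashlib

def row_checksum(row):
    """Stable fingerprint of one record, used to confirm a row landed
    in the target system byte-for-byte identical to its source."""
    payload = "|".join(str(v) for v in row).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def verify_migration(source_rows, target_rows):
    """Return the symmetric difference of per-row checksums;
    an empty set means source and target match exactly."""
    src = {row_checksum(r) for r in source_rows}
    tgt = {row_checksum(r) for r in target_rows}
    return src ^ tgt

source = [("p001", "2019-01-02", 120.5), ("p002", "2019-01-03", 98.0)]
target = [("p001", "2019-01-02", 120.5), ("p002", "2019-01-03", 98.0)]
assert verify_migration(source, target) == set()  # exact match
```

Checksumming sets rather than ordered lists tolerates the row reordering that distributed copies commonly introduce.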
Big Data Engineer
State of North Carolina | Jul 2015 - Nov 2018 | North Carolina, United States
- Managed the integration of multi-source data systems into a unified data lake, improving data accessibility and integrity for real-time analytics across various departments.
- Developed and implemented data-processing workflows using Apache Spark and Apache Airflow, processing over 1 TB of data monthly and enabling timely insights and reporting.
- Developed secure data-ingestion pipelines with Apache NiFi, standardizing data flows from diverse sources into a centralized data lake.
- Configured and maintained AWS S3 and Azure Data Lake storage solutions, optimizing for cost efficiency and data retrievability.
- Crafted dynamic data-visualization tools and dashboards using Power BI and Apache Zeppelin, directly supporting strategic decision-making processes.
- Implemented comprehensive data-security measures, integrating AWS IAM and Azure Active Directory to safeguard sensitive information in compliance with state and federal regulations.
- Orchestrated data-workflow automation using Apache Airflow, enhancing data-pipeline efficiency and reducing manual processing time.
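Airflow automates pipelines like those above by topologically ordering a DAG of tasks so each runs only after its dependencies. A minimal sketch of that ordering idea using the standard library (task names are illustrative, not from the real deployment):

```python
from graphlib import TopologicalSorter

# Toy dependency graph mirroring an ingest -> land -> transform ->
# publish pipeline shape; each task lists its upstream dependencies.
dag = {
    "ingest_nifi": set(),
    "land_s3": {"ingest_nifi"},
    "spark_transform": {"land_s3"},
    "publish_powerbi": {"spark_transform"},
}

order = list(TopologicalSorter(dag).static_order())

# Verify every task is scheduled after all of its upstream tasks.
positions = {task: i for i, task in enumerate(order)}
for task, deps in dag.items():
    assert all(positions[d] < positions[task] for d in deps)
```

Real Airflow DAGs express the same dependencies with operators and `>>` chaining; the scheduler enforces exactly this ordering constraint.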
Data Engineer
Octal Technologies | Sep 2013 - Jan 2015 | Chennai, Tamil Nadu, India
- Contributed to the transition of Hive/SQL and Teradata workloads to a Spark-based platform, reducing query-processing time by over 40% and enhancing real-time analytics capabilities.
- Established a rigorous benchmarking protocol that demonstrated Spark's performance advantage, achieving a 50% increase in processing efficiency over the legacy system.
- Converted complex Hive/SQL queries into Spark transformations with zero loss in data fidelity, significantly enhancing the scalability and maintainability of analytics operations.
- Developed multiple proofs of concept (POCs) on a YARN cluster, leading to optimized Spark configurations that handled increasing data volumes without performance degradation and prepared the platform for future data growth.
- Engineered and implemented robust data-ingestion solutions that increased data throughput by 40%, ensuring efficient handling of diverse, high-volume data sources.
- Integrated advanced analytics tools (Tableau, QlikView, SAS) with Hadoop, enabling real-time data analysis and supporting dynamic BI-team decision-making.
- Executed comprehensive system optimizations, including performance tuning and security enhancements, which reduced system downtime by 25% and ensured high reliability.
- Developed and delivered in-depth training sessions for the BI team and produced comprehensive documentation, facilitating a smooth system transition and long-term operational sustainability.
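Converting a SQL query to a transformation "with zero loss in data fidelity" implies running both versions over the same data and comparing results before cutover. A toy stand-in for that check, using sqlite3 as the legacy SQL engine and a plain-Python aggregation as the migrated version (the real work used Hive and Spark; this data is hypothetical):

```python
import sqlite3

rows = [("NC", 10), ("NC", 15), ("TX", 7)]

# Legacy path: the original SQL aggregation.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (state TEXT, amount INT)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
legacy = dict(
    con.execute("SELECT state, SUM(amount) FROM sales GROUP BY state")
)

# Migrated path: the same aggregation rewritten as a transformation.
migrated = {}
for state, amount in rows:
    migrated[state] = migrated.get(state, 0) + amount

# Fidelity check: both engines must agree exactly before cutover.
assert legacy == migrated
```

Comparing full result sets key by key, rather than spot-checking row counts, is what makes the zero-loss claim verifiable.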
Frequently Asked Questions about Lalit Sai
What company does Lalit Sai work for?
Lalit Sai works for Western Union.
What is Lalit Sai's role at the current company?
Lalit Sai's current role is Senior Data Engineer - SQL | Python | PySpark | ADF | Snowflake | Databricks | Airflow | Hadoop | AWS | Azure.
Who are Lalit Sai's colleagues?
Lalit Sai's colleagues are Jamie Fetterley, Staci Shantoria, Anastasiia Khudozhnyk, Javier Hickman, Tom Dillinger, Shane Washington, Cfcs, Raghu Hari.