Hari S


Senior Data Engineer | Big Data | Hadoop | Spark | Python | AWS | Azure | Scala | Informatica | SQL | NoSQL | Data Analyst. Actively looking for new opportunities. @ U.S. Bank
Minneapolis, Minnesota, United States
Hari S's Location
Dallas, Texas, United States
About Hari S

Experienced Senior Data Engineer with 9+ years in the financial services industry. Skilled in the Hadoop ecosystem, Scala, Python, Apache Spark, AWS, Azure Data Factory, HBase, Big Data, and Elasticsearch.

Hari S's Current Company Details
U.S. Bank

Website:
usbank.com
Employees:
59,540
Hari S Work Experience Details
  • U.S. Bank
    Senior Data Engineer
    U.S. Bank Oct 2021 - Present
    Irving, Texas, United States
  • Citi
    Senior Data Engineer
    Citi Jan 2020 - Sep 2021
    Irving, Texas, United States
    The purpose of the project was to send offer, cash-back, and congratulatory SMS messages to eligible customers based on criteria such as late-fee/annual-fee waiver reversals, loan installment completion, credit card transactions at eligible merchants, recurring payments, and cash-back offers on purchases. Offer recommendations were built from logic provided by the country business team.
    Key contributions and responsibilities:
    • Worked with Spark to improve performance and optimize existing Hadoop algorithms using SparkContext, Spark SQL, DataFrames, RDDs, and Spark on YARN.
    • Designed, developed, and implemented ETL objects, extracting data with Sqoop from source systems into the Hadoop Distributed File System (HDFS).
    • Developed Hive scripts to transform data and load it into target systems for reporting.
    • Collected and aggregated large volumes of log data using Flume, tagging the data in HDFS for further analysis.
    • Worked extensively on importing metadata into Hive using Python; migrated existing tables and applications to the AWS cloud (S3).
    • Served as direct point of contact for business stakeholders, the offshore team, and the production support team.
    • Owned the overall backend design supporting the application-level interface and data modeling.
    • Gathered functional requirements from the business and prepared plans for the design solution, coding logic, automation, implementation, and production-support handover.
    • Reviewed code quality (Hive, Sqoop, and PySpark) and provided technical and functional solutions.
    • Implemented data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, and Azure Databricks, alongside traditional data warehouse tools.
    • Analyzed source data received in the HDFS directory, created file patterns for data ingestion, and validated the ingested data by count and date.
  • Anthem, Inc.
    Data Engineer
    Anthem, Inc. Sep 2017 - Dec 2019
    Atlanta, Georgia, United States
    • Migrated and transformed large sets of structured, semi-structured, and unstructured raw data from HBase through Sqoop into HDFS for further processing.
    • Wrote multiple Spark programs for data extraction, transformation, and aggregation across file formats including XML, JSON, CSV, and other codec file formats.
    • Designed and developed data validation, load processes, and test cases using PL/SQL stored procedures, functions, and triggers.
    • Analyzed SQL scripts and designed solutions for implementation in PySpark.
    • Optimized Hive/Spark scripts and the configurations used to trigger them.
    • Used Spark SQL with Python to create DataFrames and applied transformations such as casting and joins before storing them.
    • Ran custom Spark jobs on AWS EMR using S3, SNS, and Lambda functions.
  • Radian
    Big Data Engineer
    Radian Jun 2016 - Aug 2017
    Philadelphia, Pennsylvania, United States
    • Member of the core Hadoop architecture and implementation team, participating in the design and implementation of a multi-node cluster for the client.
    • Identified data content and supported data cleansing, extraction, transformation, data quality, data loading, data verification, and conversion/interface requirements.
    • Used joins and sub-queries extensively for complex queries spanning multiple tables across databases; performance-tuned SQL queries and stored procedures.
    • Developed a data pipeline using Flume, Sqoop, Pig, and MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.
    • Developed JSON scripts for deploying pipelines in Azure Data Factory (ADF) that process data using the SQL activity.
    • Broad design, development, and testing experience with Talend Integration Suite, including performance tuning of mappings.
    • Implemented Microsoft BI/Azure BI solutions such as Azure Data Factory, Power BI, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, SQL Server Reporting Services, and Tableau.
    • Developed Power BI reports and dashboards from multiple data sources using data blending.
    • Created Power BI reports in both the desktop client and the online service, and scheduled refreshes.
  • Air India Limited
    Hadoop Developer
    Air India Limited Apr 2015 - May 2016
    Mumbai, Maharashtra, India
    • Hadoop development using HDFS, MR2, YARN, Hive, Sqoop, Flume, Oozie, HBase, Pig, and ZooKeeper.
    • Provided in-depth technical and business knowledge to ensure efficient design, programming, implementation, and ongoing support for the application.
    • Hands-on experience writing T-SQL queries, procedures, views, and indexes; worked on SQL query optimization and tuning.
    • Responsible for complete SDLC management using Agile methodology.
    • Wrote job workflows per requirements and their dependencies.
    • Developed scripts and UDFs in Scala using both Spark SQL and Spark Core for data aggregation and queries, and verified their performance against MapReduce jobs.
    • Responsible for building scalable, distributed data solutions using Hadoop components.
    • Handled importing data from various sources, performed transformations using Hive and Spark, and loaded the data into HDFS.
  • HP
    Software Developer
    HP May 2014 - Mar 2015
    Hyderabad, Telangana, India
    • Gathered and documented requirements for development and maintenance work.
    • Coordinated and prepared estimates with the offshore/onsite development team and project management for assigned projects.
    • Developed, coded, and maintained application code.
    • Ran user acceptance testing with business users.
    • Reported status to client stakeholders.
    • Participated in business and functional requirement calls to finalize requirements.
    • Provided estimates for received functional and business requirements.
    • Created technical specifications from functional and business requirements.
    • Performed impact analysis of proposed solutions.
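As a rough illustration of the ADF pipeline JSON mentioned under the Radian role, a minimal pipeline definition invoking a SQL stored procedure; the pipeline, procedure, and linked-service names are placeholders, not the actual deployment artifacts:

```json
{
  "name": "ProcessDailyLoadPipeline",
  "properties": {
    "activities": [
      {
        "name": "RunTransformProc",
        "type": "SqlServerStoredProcedure",
        "typeProperties": {
          "storedProcedureName": "dbo.usp_TransformDailyLoad"
        },
        "linkedServiceName": {
          "referenceName": "AzureSqlLinkedService",
          "type": "LinkedServiceReference"
        }
      }
    ]
  }
}
```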
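The offer-recommendation logic described under the Citi role can be sketched in plain Python as a stand-in for the Spark pipeline; the criteria names, offer codes, and merchant list below are illustrative assumptions, not the actual business rules:

```python
# Illustrative sketch of the offer-eligibility logic (plain Python in
# place of Spark); criteria and offer names are assumed, not the
# bank's actual business rules.

ELIGIBLE_MERCHANTS = {"MERCHANT_A", "MERCHANT_B"}  # hypothetical merchant codes

def recommend_offers(customer: dict) -> list[str]:
    """Return the list of offer SMS codes a customer qualifies for."""
    offers = []
    if customer.get("late_fee_reversed"):
        offers.append("FEE_WAIVER_SMS")
    if customer.get("loan_installments_remaining") == 0:
        offers.append("LOAN_COMPLETION_SMS")
    if any(m in ELIGIBLE_MERCHANTS for m in customer.get("recent_merchants", [])):
        offers.append("CASHBACK_OFFER")
    return offers

customer = {
    "late_fee_reversed": True,
    "loan_installments_remaining": 0,
    "recent_merchants": ["MERCHANT_A"],
}
print(recommend_offers(customer))
# → ['FEE_WAIVER_SMS', 'LOAN_COMPLETION_SMS', 'CASHBACK_OFFER']
```

In the pipeline described, rules like these would run per-partition over Spark DataFrames rather than per-record in Python.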
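The multi-format extraction work described under the Anthem role (Spark programs reading XML, JSON, and CSV) can be illustrated with a stdlib-only sketch; the id/amount schema is a hypothetical example, and at scale this would use Spark's built-in readers:

```python
# Stdlib-only sketch of multi-format record extraction; in the role
# described above this would be done with Spark readers at scale.
# The id/amount schema is a hypothetical example.
import csv
import io
import json
import xml.etree.ElementTree as ET

def extract_records(payload: str, fmt: str) -> list[dict]:
    """Normalize JSON, CSV, or XML input into a list of dicts."""
    if fmt == "json":
        return json.loads(payload)
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "xml":
        root = ET.fromstring(payload)
        return [dict(rec.attrib) for rec in root.findall("record")]
    raise ValueError(f"unsupported format: {fmt}")

# All three inputs normalize to the same records:
print(extract_records('[{"id": "1", "amount": "10"}]', "json"))
print(extract_records("id,amount\n1,10\n", "csv"))
print(extract_records('<rows><record id="1" amount="10"/></rows>', "xml"))
```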
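The Spark SQL UDF-plus-aggregation pattern mentioned under the Air India role (there implemented in Scala) can be mimicked with stdlib sqlite3; the fare-bucket UDF and the ticket schema are hypothetical examples:

```python
# Stdlib sqlite3 stand-in for the Spark SQL UDF + aggregation pattern;
# the fare-bucket UDF and the ticket schema are hypothetical examples.
import sqlite3

def fare_bucket(fare: float) -> str:
    """Hypothetical UDF: bucket ticket fares into bands."""
    return "high" if fare >= 500 else "low"

conn = sqlite3.connect(":memory:")
# Register the Python function as a SQL UDF (analogous to spark.udf.register).
conn.create_function("fare_bucket", 1, fare_bucket)
conn.executescript("""
    CREATE TABLE tickets (route TEXT, fare REAL);
    INSERT INTO tickets VALUES ('BOM-DEL', 620.0), ('BOM-DEL', 450.0), ('BOM-GOI', 300.0);
""")
rows = conn.execute(
    "SELECT fare_bucket(fare) AS band, COUNT(*) FROM tickets GROUP BY band ORDER BY band"
).fetchall()
print(rows)  # → [('high', 1), ('low', 2)]
```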

Frequently Asked Questions about Hari S

What company does Hari S work for?

Hari S works for U.S. Bank.

What is Hari S's role at the current company?

Hari S's current role is Senior Data Engineer.

Who are Hari S's colleagues?

Hari S's colleagues are Jim Rice, Wendy Y, Cheryl Davis, Karey Thomas (Cox), Susie Dziepak Kuhn, Nathan Hagen, Brandon Davison.
