Sushma Reddy


Data Engineer | Specializing in Cloud Data Engineering, ETL Engineering, and Real-Time Data Processing @ Northern Trust
Chicago, Illinois, United States
Sushma Reddy's Location
Chicago, Illinois, United States
About Sushma Reddy

I am a Data Engineer with 7+ years of experience designing, developing, and implementing software applications and data warehousing solutions. My expertise lies in building robust, scalable data pipelines and leveraging cloud technologies to solve complex business challenges.

I have a deep understanding of data processing, with hands-on experience extracting, processing, and storing data using databases and tools such as Snowflake, Postgres, DynamoDB, and Kafka. My skill set includes Python for building automation scripts and scalable solutions, along with expertise in distributed computing tools such as Hadoop, Spark, and PySpark.

I am proficient in cloud technologies, particularly AWS and Azure (with minimal Google Cloud Platform exposure), and have successfully implemented cloud migration projects. My background includes developing both batch and streaming pipelines using frameworks such as PySpark, Pandas, and SQL. Additionally, I have experience with DevOps practices, CI/CD pipelines, and tools like Jenkins, Git, and Ansible.

With a strong foundation in data modeling, performance optimization, and advanced analytics, I am passionate about delivering high-impact solutions that drive efficiency and business value. I am also skilled in creating insightful dashboards in Power BI and Tableau to support data-driven decision-making.

Key Skills: AWS (EC2, S3, RDS, Redshift, Lambda, Glue, AppSync, MSK (Kafka), EMR), Microsoft Azure, Google Cloud Platform (minimal), HDFS, MapReduce, Oozie, Hive, Pig, Sqoop, Flume, Zookeeper, HBase, CAWA, Spark, Spark SQL, Impala, MapR-DB, VOCI, Oracle Big Data Discovery, Kafka, NiFi, Kibana.

Sushma Reddy's Current Company Details
Northern Trust

Data Engineer | Specializing in Cloud Data Engineering, ETL Engineering, and Real-Time Data Processing
Chicago, Illinois, United States
Employees:
21127
Sushma Reddy Work Experience Details
  • Northern Trust
    Data Engineer
    Northern Trust Apr 2022 - Present
    Chicago, Illinois, United States
    • Designed, developed, and maintained ETL pipelines using AWS (Lambda, Glue, S3), SQL, Apache Airflow, Python, PySpark, and Redshift, significantly improving data processing efficiency and scalability.
    • Refactored legacy batch processes into Apache Airflow DAGs, enhancing the scheduling, monitoring, and manageability of data pipelines and streamlining data orchestration.
    • Built real-time ETL workflows using AWS Lambda and S3, significantly reducing data ingestion time and improving data freshness for downstream analytics.
    • Developed optimized data models and managed data warehousing solutions in Amazon Redshift, improving query performance, reducing data latency, and enhancing analytical capabilities.
    • Implemented robust monitoring and alerting using AWS CloudWatch, ensuring high reliability and availability of data pipelines through early detection and mitigation of issues.
    • Collaborated with cross-functional teams, including business stakeholders, data scientists, and analysts, to translate complex business requirements into efficient technical solutions that drive key business decisions.
    • Designed and developed data models, creating ETL jobs to acquire, manipulate, and transform data from various sources into structured, analytics-ready formats.
    • Optimized database queries during migration projects, improving query performance and reducing data retrieval times to keep data systems highly responsive.
    • Integrated Slack alerts for application monitoring using Datadog APM.
    • Proven ability to work under stringent deadlines, collaborating effectively in teams or independently, consistently delivering high-quality solutions in dynamic, fast-paced settings.
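The S3-triggered Lambda ingestion pattern described above can be sketched as follows. This is a minimal illustration, not the actual Northern Trust implementation: the bucket name, key layout, and sample event are assumptions, though the event shape follows the standard S3 put-notification format that Lambda receives.

```python
import json
import urllib.parse

def handler(event, context=None):
    """Sketch of an S3-triggered Lambda: collect the objects that just
    landed so they can be processed downstream (e.g. loaded to Redshift)."""
    keys = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        # Object keys arrive URL-encoded in S3 notifications.
        key = urllib.parse.unquote_plus(s3.get("object", {}).get("key", ""))
        if bucket and key:
            keys.append((bucket, key))
    return keys

# Illustrative event in the shape S3 delivers on object creation:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-landing"},
                "object": {"key": "trades/2024/01/file+1.json"}}}
    ]
}
```

Because S3 invokes the function per object-created notification, each new file is picked up within seconds of landing, which is what drives the reduced ingestion time mentioned above.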
  • Capital One
    Data Engineer
    Capital One Apr 2020 - Mar 2022
    Dallas, Texas, United States
    • As part of a migration project, helped develop a comprehensive migration plan, including the approach, timelines, and risk mitigation strategies, for moving the company's workloads from Azure to AWS.
    • Conducted a thorough assessment of the existing Azure infrastructure, identifying the dependencies, workloads, and applications to be migrated.
    • Worked with the architecture team to design an AWS architecture following best practices, with a focus on high availability, fault tolerance, scalability, and security.
    • Ensured the architecture met compliance requirements specific to the banking sector (e.g., PCI DSS, GDPR, CCPA).
    • Planned and executed secure data migration strategies for banking data, ensuring data integrity, encryption, and compliance with financial regulations.
    • Developed and optimized ETL workflows using Apache Airflow for orchestration and AWS Glue to integrate disparate data sources (SQL databases, NoSQL databases, and REST APIs).
    • Implemented ETL (Extract, Transform, Load) processes for data migration where necessary.
    • Migrated batch processes to real-time event streaming architectures, improving data freshness and processing efficiency for real-time insights.
    • Used Kinesis Data Streams and Amazon Managed Streaming for Apache Kafka (MSK) for data ingestion and AWS Lambda for near-real-time processing, with S3 as the landing zone and Redshift as the target layer for analytical data.
    • Worked closely with SRE to implement AWS security services such as IAM, VPC, encryption (KMS), and monitoring tools to safeguard sensitive data and financial transactions.
    • Refactored applications where necessary to optimize performance and take advantage of AWS-native services such as EMR, RDS, Lambda, and S3.
    • Conducted extensive post-migration testing, including functional, performance, and security testing, to ensure the migrated infrastructure met business requirements.
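The Kinesis-to-Lambda step of the streaming architecture above can be sketched like this. The record fields (`txn_id`, `amount`) and the sample event are illustrative assumptions; the base64 encoding of the `data` field, however, is how Lambda actually receives Kinesis records.

```python
import base64
import json

def decode_kinesis_records(event):
    """Sketch of the near-real-time processing step: Kinesis delivers
    records to Lambda base64-encoded, so each payload is decoded and
    parsed before being written to the S3 landing zone / Redshift."""
    out = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        out.append(json.loads(raw))
    return out

# Illustrative event in the shape Lambda receives from a Kinesis stream:
payload = base64.b64encode(
    json.dumps({"txn_id": 42, "amount": 9.99}).encode()
).decode()
sample_event = {"Records": [{"kinesis": {"data": payload}}]}
```

The same decode-then-parse shape applies whether the source is Kinesis Data Streams or MSK; only the event wrapper differs.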
  • National Payments Corporation Of India (Npci)
    Bigdata Engineer
    National Payments Corporation Of India (Npci) Aug 2018 - Jan 2020
    Hyderabad, Telangana, India
    • Imported data from various sources, performed transformations using Hive, MapReduce, and Pig, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
    • Developed MapReduce jobs in Java for data cleaning and preprocessing.
    • Imported and exported data between HDFS and Hive using Sqoop.
    • Used Bash shell scripting, Sqoop, Avro, Hive, HDP, Pig, Java, and MapReduce daily to develop ETL, batch processing, and data storage functionality.
    • Developed a data pipeline using Flume and Sqoop to extract data from weblogs and store it in HDFS.
    • Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
    • Loaded all tables from the reference source database schema using Sqoop.
    • Implemented a reusable framework in Spark, Java, Python, and Sqoop to handle dynamic metadata changes and load data into HDFS.
    • Worked on cluster coordination services through Zookeeper.
    • Used Oozie and Zookeeper for workflow scheduling and monitoring.
    • Developed Oozie workflows to automate loading data into HDFS and preprocessing it with Pig and HiveQL.
    • Experienced in managing and reviewing Hadoop log files.
    • Worked on data visualization and created interactive dashboards using BI tools such as Power BI.
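The "reusable framework to handle dynamic metadata changes" mentioned above typically works by driving column mapping and casting from a metadata table rather than hard-coded pipeline logic. A minimal plain-Python sketch of that idea (the column names, types, and metadata layout here are hypothetical, not the actual NPCI framework):

```python
def apply_metadata(rows, metadata):
    """Metadata-driven mapping step: `metadata` maps each source column
    to a (target_name, type) pair, so a schema change in the source is
    handled by editing metadata instead of rewriting pipeline code."""
    casters = {"int": int, "float": float, "str": str}
    out = []
    for row in rows:
        mapped = {}
        for src, (tgt, typ) in metadata.items():
            if src in row:  # columns absent from metadata are ignored
                mapped[tgt] = casters[typ](row[src])
        out.append(mapped)
    return out

# A new source column is picked up by adding a metadata entry, not code:
metadata = {"cust_id": ("customer_id", "int"), "amt": ("amount", "float")}
rows = [{"cust_id": "7", "amt": "12.50", "extra": "ignored"}]
```

In the real framework the same pattern would run as Spark transformations with the metadata read from Hive or a control table, but the control flow is the same.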
  • 3I Infotech Ltd.
    Software Engineer
    3I Infotech Ltd. Jun 2017 - Aug 2018
    Hyderabad, Telangana, India
    • Designed and developed web services using Java/J2EE in a WebLogic environment; developed web pages using Java Servlets, JSP, CSS, JavaScript, DHTML, and HTML5, with extensive Struts validation.
    • Involved in the analysis, design, development, and testing of business requirements.
    • Developed Struts Action classes and Form Bean classes to handle requests from front-end JSP pages.
    • Worked with the on-site team on a migration project moving all SAS ETL jobs to Cloudera (Hadoop).
    • Performed gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
    • Performance-tuned long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
    • Participated in ETL/ELT code reviews and designed reusable frameworks.
    • Worked with the Hadoop admin, ETL, and SAS admin teams on code deployments and health checks.
    • Performed extensive end-to-end unit testing and prepared detailed test scripts.
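The partition-based tuning mentioned above rests on a simple idea: grouping rows by a key (usually a date) so each partition can be loaded independently and queries can prune the partitions they don't need. A minimal sketch of the grouping step, with hypothetical field names:

```python
from collections import defaultdict

def partition_by_date(rows, date_field="load_date"):
    """Group rows by a date key so each partition can be written and
    queried independently (the mechanism behind Hive partition pruning)."""
    parts = defaultdict(list)
    for row in rows:
        parts[row[date_field]].append(row)
    return dict(parts)

rows = [
    {"load_date": "2018-07-01", "id": 1},
    {"load_date": "2018-07-02", "id": 2},
    {"load_date": "2018-07-01", "id": 3},
]
```

In Hive the same effect comes from `PARTITIONED BY` on the table definition; a filter on the partition column then scans only the matching directories instead of the full dataset.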

Sushma Reddy Education Details
  • Geethanjali College Of Engineering And Technology

Frequently Asked Questions about Sushma Reddy

What company does Sushma Reddy work for?

Sushma Reddy works for Northern Trust.

What is Sushma Reddy's role at the current company?

Sushma Reddy's current role is Data Engineer | Specializing in Cloud Data Engineering, ETL Engineering, and Real-Time Data Processing.

What schools did Sushma Reddy attend?

Sushma Reddy attended Geethanjali College Of Engineering And Technology.

Who are Sushma Reddy's colleagues?

Sushma Reddy's colleagues are Sarita Saini, Chris Avzangelis, Kiran Sonawane, Neha Shrivastava, Aishwarya Anekar, Ushasree Karamala, Raksha G.

