Akash Reddy Email and Phone Number
With over 10 years of professional experience, my mission as a Senior AWS Data Engineer at U.S. Bank is to leverage advanced AWS technologies to drive efficiency and innovation. My expertise in building and optimizing data pipelines aligns with the bank's commitment to technological advancement and customer satisfaction. I am dedicated to enhancing performance and reducing costs, bringing a unique perspective and diverse experience that fosters growth and efficiency in our team's endeavors.

At U.S. Bank, my focus has been on implementing AWS data pipelines using Glue, S3, and Lambda, achieving a significant reduction in processing time and infrastructure costs. My role has involved streamlining batch data workflows and developing ETL solutions with Spark SQL in Databricks, leading to improved data processing efficiency and accuracy. These efforts are emblematic of my dedication to automating data ingestion and management, ensuring our systems scale effectively while maintaining robust security standards.
AWS Data Engineer
Ameriprise Financial Services, LLC | United States
Website: ameriprise.com
AWS Data Engineer
U.S. Bank | Nov 2021 - Aug 2024
• Implemented a metadata-driven AWS data pipeline using AWS Glue, AWS S3, and AWS Lambda, reducing processing time by 40%.
• Streamlined batch data workflows for efficient processing of large volumes; implemented data partitioning to enhance performance and reduce processing times by 30%.
• Implemented a Proof of Concept (POC) using AWS Glue, AWS Lambda, AWS S3, and serverless architecture for ETL processes, demonstrating a 50% reduction in data processing time and a 30% decrease in infrastructure costs.
• Developed ETL solutions using Spark SQL in Databricks for data extraction, transformation, and aggregation from various sources, utilizing multiple file formats.
• Automated data ingestion processes leveraging AWS Data Pipeline, Apache NiFi, and custom Python automation scripts, reducing manual intervention by 70% and improving operational efficiency.
• Designed and developed robust Python scripts for data ETL processes, ensuring seamless integration of diverse data sources using Pandas, NumPy, and PySpark.
• Created and maintained Python-based applications and tools for automating repetitive data engineering tasks using Flask, Django, and REST APIs.
• Automated deployment processes using Jenkins, Docker, Kubernetes, CloudFormation, and Ansible, ensuring efficient and reproducible infrastructure.
• Remediated over 300 application vulnerabilities identified through SAST tools like Veracode and Checkmarx, integrating fixes via CI/CD pipelines.
• Automated data governance workflows with Collibra and custom scripts, reducing manual effort by 50%, saving 20 hours weekly, and ensuring 90% accuracy in tracking data dependencies, leading to a 60% efficiency boost.
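The metadata-driven pipeline described above can be illustrated with a small helper that turns one metadata record into Glue job arguments; the bucket, prefix, and field names here are hypothetical, not taken from the resume:

```python
def build_glue_args(meta: dict) -> dict:
    """Translate one metadata record into Glue-style '--key' job arguments.

    A driver Lambda would read such records from a metadata table and pass
    the resulting dict to a Glue job run, so adding a new feed is a data
    change rather than a code change.
    """
    args = {
        "--source_path": f"s3://{meta['source_bucket']}/{meta['source_prefix']}",
        "--target_path": f"s3://{meta['target_bucket']}/{meta['target_prefix']}",
        "--file_format": meta.get("format", "parquet"),
    }
    if meta.get("partition_keys"):
        # Partitioning output by these keys is what reduces downstream scan time.
        args["--partition_keys"] = ",".join(meta["partition_keys"])
    return args
```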
AWS Data Engineer
H-E-B | Sep 2019 - Oct 2021
• Deployed, automated, and managed AWS cloud-based production systems, achieving a 30% increase in scalability while maintaining security standards.
• Executed a successful migration from on-premises SQL Server to AWS Cloud (Redshift).
• Designed and optimized ETL workflows using Python scripts, Alteryx, AWS Glue 2.0, AWS Lambda, Amazon S3, and Redshift, integrating diverse data sources seamlessly; achieved a 50% reduction in data processing time and improved data accuracy by 25%.
• Automated deployment processes using Jenkins, Docker, Kubernetes, CloudFormation, and Ansible, ensuring efficient and reproducible infrastructure.
• Orchestrated ETL workflows using AWS Glue, automating the movement and transformation of data.
• Developed and maintained Python applications for automating various data engineering tasks, using libraries like Pandas, NumPy, and PySpark.
• Integrated Python automation solutions with AWS Lambda and AWS Step Functions to streamline data workflows within cloud environments.
• Worked with SQL and PL/SQL procedures, functions, stored procedures, and packages within mappings.
• Managed and optimized relational databases, including AWS RDS and Oracle, as well as NoSQL databases like MongoDB.
• Designed and implemented data storage and retrieval solutions using Amazon DynamoDB for real-time data access and optimized handling of large-scale data.
• Utilized DynamoDB Streams to capture data changes and process them in near real-time with AWS Lambda.
• Developed Tableau dashboards for quality-check processes, automating quality assurance workflows and saving 15 hours per week in operational tasks.
• Optimized SQL queries, achieving a 40% reduction in query execution time and significantly improving overall system efficiency.
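The DynamoDB Streams pattern mentioned above can be sketched as the core of a Lambda handler: pulling the new item image out of each stream record so downstream logic sees plain values. This is a minimal illustration of the technique, not the actual H-E-B code:

```python
def extract_new_images(event: dict) -> list:
    """Return the NewImage of every INSERT/MODIFY record in a Streams event.

    DynamoDB Streams delivers attribute values in typed form, e.g.
    {"S": "abc"} or {"N": "10"}; this flattens each image to plain values
    so a Lambda can process changes in near real time.
    """
    images = []
    for record in event.get("Records", []):
        if record.get("eventName") in ("INSERT", "MODIFY"):
            image = record["dynamodb"].get("NewImage", {})
            # Unwrap the single-key type descriptor around each attribute.
            images.append({k: next(iter(v.values())) for k, v in image.items()})
    return images
```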
Data Engineer
GE Healthcare | May 2017 - Aug 2019 | Chicago, Illinois, United States
• Migrated SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse.
• Managed database access and control during the migration process using Azure Data Factory.
• Developed Spark applications using PySpark and Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, uncovering insights into customer behavior.
• Applied dimensional modeling (star and snowflake schemas), transactional modeling, and Slowly Changing Dimensions (SCD) to forecasting with large-scale datasets.
• Built a claims simulation app using R Shiny to estimate total loss amounts for claims settlement, leveraging multiple frequency and severity distributions.
• Launched R Shiny apps integrating machine learning algorithms via RStudio Connect with Azure and Docker for scalable deployment.
• Developed scripts to transfer data from FTP servers to the ingestion layer using Azure CLI commands.
• Automated Azure HDInsight cluster creation using PowerShell scripts.
• Utilized Azure Data Lake Storage Gen2 for storing Excel and Parquet files, retrieving user data via the Blob API.
• Worked with Azure Databricks, PySpark, Spark SQL, Azure SQL Data Warehouse, and Hive for data loading and transformation.
• Designed Azure cloud architecture and implementation plans for hosting complex application workloads on Microsoft Azure.
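The Slowly Changing Dimensions technique mentioned above (SCD Type 2, the common variant) can be sketched in plain Python: when a tracked attribute changes, the current dimension row is closed out and a new versioned row is appended. The record layout here is illustrative, not from the original warehouse:

```python
from datetime import date


def apply_scd2(dim_rows: list, incoming: dict, today: date) -> list:
    """Apply one incoming record to a Type 2 dimension.

    If the business key exists with different attributes, expire the
    current row (set end_date) and append a new current row; unchanged
    records leave the dimension as-is, preserving full history.
    """
    key, out = incoming["key"], []
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None and row["attrs"] != incoming["attrs"]:
            row = {**row, "end_date": today}  # expire the superseded version
        out.append(row)
    if not any(r["key"] == key and r["end_date"] is None for r in out):
        out.append({"key": key, "attrs": incoming["attrs"],
                    "start_date": today, "end_date": None})
    return out
```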
Python Developer
US Cellular - Answer Wireless | Jun 2015 - Mar 2017 | Chicago, Illinois, United States
• Orchestrated comprehensive deployment of web applications on AWS, optimizing efficiency by leveraging S3 buckets.
• Implemented AWS CLI Auto Scaling and CloudWatch monitoring to enhance system performance.
• Automated continuous integration using Git, Jenkins, and custom Python and Bash tools, reducing pipeline failure remediation effort by 70%.
• Developed server-side modules deployed on AWS compute instances, utilizing PHP, Node.js, and Python.
• Utilized AWS Lambda for DynamoDB auto scaling and implemented a robust data access layer.
• Automated nightly builds with Python, significantly reducing pipeline management effort.
• Employed AWS SNS for automated email notifications and messages after nightly runs.
• Provisioned AWS Lambda functions and EC2 instances, managed security groups, and administered Amazon VPCs.
• Implemented Jenkins pipelines for microservice builds, Docker registry management, and Kubernetes deployment.
• Created Python-based AWS Lambda functions for faster, asynchronous processing.
• Implemented CloudTrail to capture AWS infrastructure-related events.
• Monitored AWS EC2 containers using the Datadog API, optimizing data processing efficiency.
• Employed Python scripts for data chunking, enhancing data processing speed.
• Followed a test-driven approach to application development using the Python unittest framework.
• Migrated Django databases from SQLite to MySQL and PostgreSQL, ensuring data integrity.
• Utilized SQL Server Reporting Services (SSRS), SSIS, and SSAS for report generation.
• Developed Python-based RESTful APIs using SQLAlchemy and PostgreSQL.
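Data chunking of the kind mentioned above can be sketched as a small generator that splits a batch into fixed-size pieces so each piece can be processed (or uploaded) independently. A minimal illustration, not the original script:

```python
def chunked(items: list, size: int):
    """Yield successive fixed-size chunks of a list; the last may be shorter.

    Processing data in bounded chunks keeps memory flat and lets each
    chunk be handled or retried independently.
    """
    if size <= 0:
        raise ValueError("chunk size must be positive")
    for i in range(0, len(items), size):
        yield items[i:i + size]
```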
Python Developer
SAP | May 2014 - May 2015 | Reston, Virginia, United States
• Developed entire frontend and backend modules using Python on the Django web framework.
• Used core Python packages and modules such as NumPy to boost data processing and analysis efficiency.
• Developed REST APIs for web applications to improve web system interoperability.
• Developed complex SQL queries and PL/SQL procedures.
• Designed and developed a horizontally scalable API using Python Flask.
• Wrote Python scripts to parse XML documents and load the data into the database.
• Built server-side-rendered web applications with Django/Python and HTML/CSS.
• Created Apache Spark tasks in Python in the testing environment, using Spark SQL for querying.
• Created Python notebook scripts to read tables as PySpark data frames for analysis.
• Worked with Jenkins for project deployment.
• Used Git as the version control system for the development team to organize and track changes.
• Designed Linux services to run REST web services via shell scripts.
• Created an RPM package for the product that supports feature upgrades.
• Participated in unit testing with the PyUnit framework, constructing test cases.
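The XML-parsing work mentioned above can be sketched with Python's standard-library `xml.etree.ElementTree`: extract the fields to be loaded into the database as plain tuples. The `<users>` document shape and field names are hypothetical:

```python
import xml.etree.ElementTree as ET


def parse_users(xml_text: str) -> list:
    """Extract (id, name) pairs from a <users> document for DB loading.

    Each tuple could then be bound as parameters of an INSERT statement.
    """
    root = ET.fromstring(xml_text)
    return [(u.get("id"), u.findtext("name")) for u in root.findall("user")]
```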
Akash Reddy Education Details
University Of Missouri-Kansas City | Computer Science
Frequently Asked Questions about Akash Reddy
What company does Akash Reddy work for?
Akash Reddy works for Ameriprise Financial Services, LLC.
What is Akash Reddy's role at the current company?
Akash Reddy's current role is AWS Data Engineer.
What schools did Akash Reddy attend?
Akash Reddy attended University Of Missouri-Kansas City.
Who are Akash Reddy's colleagues?
Akash Reddy's colleagues are Taylor Sheets, Jennifer Rozmenoski, Nancy Ledonne, Jason Bartylla, David Drysdale, David Hoppes, Carrie Plack.