Ramesh S


Data Engineer | Specializing in Cloud Data Solutions, ETL Development, and Data Warehousing | AWS, Python, SQL Expert @ Parexel Canada
Ramesh S's Location
Cambridge, Ontario, Canada
About Ramesh S

"As a Data Engineer with over 6 years of experience, I specialize in designing and managing scalable data solutions in cloud environments. My expertise spans AWS services, Python, SQL, and ETL processes, allowing me to build efficient and secure data pipelines for diverse industries, including Banking, Retail, and Healthcare.

I have extensive experience with cloud platforms, particularly AWS, where I have implemented data migration projects, developed ETL workflows, and optimized data pipelines for large-scale data processing. My skills in data modeling, SQL, and data warehousing technologies enable me to transform complex data sets into actionable insights.

I thrive in fast-paced, dynamic environments and excel at collaborating with cross-functional teams to deliver high-quality data solutions that meet business objectives. I am passionate about leveraging my technical skills to solve challenging problems and drive data-driven decision-making."

Ramesh S's Current Company Details
Parexel Canada

Data Engineer | Specializing in Cloud Data Solutions, ETL Development, and Data Warehousing | AWS, Python, SQL Expert
Ramesh S Work Experience Details
  • Parexel Canada
    Sr Data Engineer
    Sep 2023 - Present
    Architected scalable ETL solutions using AWS Glue and EMR to process large datasets and integrate with Redshift.
    Implemented data migration projects from on-premise databases to AWS, ensuring data integrity and optimizing performance.
    Created AWS Lambda functions in Python for deployment management, and designed and implemented public-facing websites on AWS integrated with the surrounding application infrastructure.
    Designed and implemented ETL workflows using PySpark and AWS Glue to efficiently process and transform large datasets, reducing data processing times.
    Built AWS Lambda functions and API Gateway endpoints, so that data submitted via API Gateway is processed by a Lambda function.
    Used Python libraries such as Boto3 and NumPy for AWS work.
    Developed serverless data processing workflows using AWS Lambda and S3, reducing operational overhead and improving efficiency.
    Developed custom PySpark jobs for batch processing, enabling the processing of millions of records daily and supporting large-scale data analytics.
    Created external tables with partitions using Hive, AWS Athena, and Redshift.
    Worked closely with stakeholders to define data requirements, design architecture, and deliver solutions that met business objectives.
    Used PySpark to process and analyze large datasets in a distributed computing environment, significantly reducing the time required for data transformation tasks.
    Created PySpark applications for real-time data processing and analytics, integrating with AWS services such as S3 and Redshift to store and query processed data.
    Automated monitoring and logging with CloudWatch and integrated alerting mechanisms to maintain high availability of data services.
    Used Python collections to manipulate and iterate over user-defined objects.
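The Lambda-behind-API-Gateway pattern described above can be sketched roughly as follows. This is a minimal illustration, not code from the actual project: the event shape follows the standard API Gateway proxy integration, and the transformation step is a hypothetical placeholder.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    API Gateway delivers the HTTP request as a dict; a POST body arrives
    as a JSON string under event["body"].
    """
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    # Illustrative transformation step; a real pipeline would likely
    # write the record on to S3 or Redshift via Boto3.
    record = {key.lower(): value for key, value in payload.items()}

    return {"statusCode": 200, "body": json.dumps({"received": record})}
```

Returning a dict with `statusCode` and a JSON-string `body` is what the proxy integration expects, which is why the handler serializes its own response.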
  • Nike
    Data Engineer
    Jan 2021 - Jan 2023
    Led the end-to-end data migration from legacy systems to AWS Redshift, optimizing data extraction, transformation, and loading processes.
    Designed and developed ETL pipelines using Python and AWS services, supporting analytics and reporting requirements.
    Tuned AWS environments for performance, security, and cost-efficiency, reducing costs by 20%.
    Provided technical leadership in cloud data engineering, advising on best practices and strategies.
    Built and optimized SQL scripts and Python code for data processing and analytics operations on AWS.
  • Nike
    Sr AWS Data Engineer
    Jan 2021 - Jan 2023
    Managed end-to-end data migration from legacy systems to AWS Redshift, including data extraction, transformation, and loading.
    Designed and implemented ETL pipelines using Python and AWS services to support data analytics and reporting needs.
    Worked with cross-functional teams to optimize AWS environments for performance, security, and cost-efficiency.
    Developed and maintained SQL scripts and Python code to support data processing and analytics operations on AWS.
    Provided technical leadership in data architecture, cloud strategy, and best practices for cloud data engineering.
    Assisted the development team in sending the correct data via query strings, using PostgreSQL as the back-end data store.
    Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.
    Built SQL queries for the CRUD operations: create, read, update, and delete.
    Placed data into JSON files using Python to test Django websites, and used Python scripts to update database content and manipulate files.
    Generated Django forms to record data of online users, and used pytest for writing test cases.
    Made extensive use of version control systems, including Git and SVN.
    Designed and developed a data management system using MySQL, building the application logic with Python 2.7 and Django.
    Rewrote existing Python/Django modules to deliver data in specific formats.
    Automated repetitive work through shell scripts and Python, analyzing requirements and managing resources efficiently.
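The CRUD queries mentioned above can be sketched in a few lines. An in-memory SQLite database stands in here for the PostgreSQL/MySQL back-ends the role actually used, and the table and column names are illustrative only:

```python
import sqlite3

# In-memory SQLite as a stand-in for PostgreSQL/MySQL; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

# Create: parameterized insert (placeholders avoid SQL injection)
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
             ("Ada", "ada@example.com"))

# Read: fetch the row back
row = conn.execute("SELECT name, email FROM users WHERE id = 1").fetchone()

# Update: change a column in place
conn.execute("UPDATE users SET email = ? WHERE id = 1", ("ada@new.example.com",))

# Delete: remove the row
conn.execute("DELETE FROM users WHERE id = 1")
conn.commit()
```

The parameterized `?` placeholders are the key habit: the same pattern carries over to PostgreSQL (`%s` with psycopg) and to Django's ORM, which generates equivalent SQL.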
  • General Electricals
    Cloud Data Engineer
    Oct 2017 - Dec 2020
    Designed and implemented data pipelines using AWS Glue and AWS Lambda to process and transform data from on-premise databases into AWS.
    Migrated legacy data systems to AWS Redshift, improving query performance and enabling advanced data analytics.
    Developed Python scripts for data extraction, transformation, and loading (ETL) to automate data workflows and reduce manual intervention.
    Collaborated with cross-functional teams to gather requirements, design solutions, and deploy data infrastructure on AWS.
    Developed Spark scripts with custom RDD transformations and actions written in Scala.
    Optimized SQL queries and data models in Redshift to improve data retrieval times and reduce costs.
    Performed advanced procedures such as text analytics and processing using Spark's in-memory computing capabilities in Scala.
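The RDD-style transformation chains mentioned above (written in Scala on Spark in the actual role) follow a map/filter/reduce shape that can be approximated in plain Python, with no Spark cluster assumed; the input records here are invented for illustration:

```python
from functools import reduce

# Illustrative raw input: numeric strings with one malformed record.
raw = ["12.5", "3.0", "bad", "7.25"]

def parse(record):
    """Parse a record to float, returning None for malformed input."""
    try:
        return float(record)
    except ValueError:
        return None

values = map(parse, raw)                          # like RDD.map (lazy)
clean = filter(lambda v: v is not None, values)   # like RDD.filter (lazy)
total = reduce(lambda a, b: a + b, clean, 0.0)    # like RDD.reduce (an action)
```

As with Spark RDDs, `map` and `filter` here are lazy, and nothing is computed until the terminal `reduce` consumes the chain; the difference is that Spark partitions the data and runs each stage across the cluster.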

Ramesh S Education Details
  • Conestoga College
  • J.N.T. University
  • Jawaharlal Nehru Technological University, Kakinada

Frequently Asked Questions about Ramesh S

What company does Ramesh S work for?

Ramesh S works for Parexel Canada.

What is Ramesh S's role at the current company?

Ramesh S's current role is Data Engineer | Specializing in Cloud Data Solutions, ETL Development, and Data Warehousing | AWS, Python, SQL Expert.

What schools did Ramesh S attend?

Ramesh S attended Conestoga College, J.N.T. University, and Jawaharlal Nehru Technological University, Kakinada.
