John Kuloth Petersan S

John Kuloth Petersan S Email and Phone Number

Data Engineer @ Stealth AI Startup
Paris, FR
John Kuloth Petersan S's Location
Garges-lès-Gonesse, Île-de-France, France
About John Kuloth Petersan S

Data engineering professional with 5 years of experience using Python and SQL across the entire data engineering workflow. Over the years, I have accrued expertise in problem-solving, communication, and technical and business skills. I am open to learning new things and to getting involved with new projects that help me learn or apply what I have learned throughout my career.

My technical skills:
• Python
• Pandas
• PostgreSQL
• Snowflake
• Apache Airflow
• Apache NiFi
• dbt
• ETL

John Kuloth Petersan S's Current Company Details
Stealth AI Startup

Data Engineer
Paris, FR
Employees:
4857
John Kuloth Petersan S Work Experience Details
  • Stealth AI Startup
    Data Engineer
    Stealth AI Startup
    Paris, FR
  • Freelance
    Data Engineer
    Freelance Dec 2023 - Present
  • Shippeo
    Data Engineer
    Shippeo Mar 2023 - Sep 2023
    Paris, Île-de-France, France
    • Developed automation tasks using a Python script for NiFi Registry, ensuring seamless recovery of all tasks during the migration of NiFi into Kubernetes.
    • Designed and implemented an Airflow DAG using KubernetesPodOperator, improving data pipeline efficiency by 25%.
    • Contributed to Backstage by enhancing the microservice ecosystem with comprehensive documentation for 10+ microservices, increasing developer onboarding speed by 30%.
    • Implemented CI/CD processes using GitHub Actions with ArgoCD and Helm charts, facilitating seamless deployment of 15+ microservices into GCP Kubernetes.
    • Worked with dbt (data build tool) for data preparation, transforming raw data into organized, meaningful formats and reducing data processing time by 50%.
    Carbon Emission Insights:
    • Developed an end-to-end Carbon Emission Insights dashboard using Python for data processing, Apache Airflow for job orchestration, dbt for data transformation, and Tableau for visualization.
    • Quantified CO2e emissions across various shipment methods and used Tableau to create interactive dashboards with comparative charts and geographical maps.
    • Facilitated informed decision-making for optimizing logistics strategies toward sustainability and reduced environmental impact.
    Enterprise Data Integration:
    • Designed and implemented a data pipeline to extract, transform, and load shipment data from GCP to Snowflake, ensuring efficient data processing and integration.
    • Integrated with the SeaRoutes Shipment API to enrich shipment data with carbon emission metrics, managing API call concurrency and handling errors effectively.
    • Automated data loading into Snowflake using Snowpipe, with robust error handling and reprocessing of failed shipments to maintain data integrity and accuracy.
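The "API call concurrency and error handling" bullet above could look roughly like the following minimal Python sketch. The function names, worker and retry limits, and the `co2e_kg` field are hypothetical illustrations, not the actual Shippeo/SeaRoutes code; the real API client is stood in by a `call_api` callable so the sketch stays dependency-free.

```python
from concurrent.futures import ThreadPoolExecutor
import time

MAX_WORKERS = 8    # cap on concurrent API calls (hypothetical limit)
MAX_RETRIES = 3    # attempts per shipment before marking it failed

def fetch_emissions(shipment, call_api):
    """Enrich one shipment dict via the API, retrying transient failures."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            shipment["co2e_kg"] = call_api(shipment["id"])
            return shipment
        except Exception:
            if attempt == MAX_RETRIES:
                shipment["co2e_kg"] = None  # mark for later reprocessing
                return shipment
            time.sleep(0.1 * attempt)       # simple linear backoff

def enrich_shipments(shipments, call_api):
    """Run enrichment with bounded concurrency; failed rows keep co2e_kg=None."""
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        futures = [pool.submit(fetch_emissions, s, call_api) for s in shipments]
        return [f.result() for f in futures]  # preserves input order
```

Collecting results from the submitted futures in order (rather than via `as_completed`) keeps the output aligned with the input rows, which simplifies the downstream reprocessing of failed shipments.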
  • Tata Consultancy Services
    Software Engineer
    Tata Consultancy Services Mar 2018 - Apr 2022
    Chennai, Tamil Nadu, India
    Client: Humana
    • Designed and developed ETL (Extract, Transform, and Load) jobs using SSIS to integrate data from Archer (a business application) into MSSQL, processing 10GB+ of data daily.
    • Created and managed SSIS packages to handle data extraction, cleansing, transformation, and loading tasks efficiently and accurately.
    • Designed tables and optimized complex stored procedures to facilitate data retrieval, manipulation, and reporting, contributing to a 35% improvement in database performance and response times.
    • Optimized SSIS packages for performance and scalability, ensuring efficient data processing and reducing ETL job run times by 40%.
    • Created monitors, alarms, notifications, and logs for Lambda functions, Glue jobs, and EC2 hosts using CloudWatch, and used AWS Glue for data transformation, validation, and cleansing.
    • Worked with S3, EC2, and related AWS services, extracting data from different sources and the Redshift database, creating Glue jobs in AWS, and loading data into S3 staging and persistence areas.
    • Developed data pipelines using AWS services (EMR, EC2, S3, RDS, Athena, Lambda) and moved large-scale pipeline applications from on-premise clusters to the AWS cloud.
    • Created event-driven ETL pipelines using AWS Glue that run whenever new data appears in AWS S3, and catalogued data assets.
    • Used AWS Athena to perform analytics on data stored in S3 buckets.
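The event-driven Glue bullet above typically means a Lambda function that fires on S3 object-created notifications and starts a Glue job for each new file. A minimal, dependency-free sketch follows; the job name, argument key, and the `start_job_run` stand-in (which in a real deployment would be `boto3.client("glue").start_job_run`) are all hypothetical.

```python
import urllib.parse

GLUE_JOB_NAME = "shipment-etl"  # hypothetical Glue job name

def s3_records(event):
    """Extract (bucket, key) pairs from an S3 event-notification payload."""
    pairs = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in notifications; '+' means a space
        key = urllib.parse.unquote_plus(rec["s3"]["object"]["key"])
        pairs.append((bucket, key))
    return pairs

def handler(event, context=None, start_job_run=None):
    """Lambda entry point: start one Glue run per newly arrived S3 object.

    `start_job_run` stands in for the boto3 Glue client call so the
    sketch can be exercised without AWS credentials.
    """
    runs = []
    for bucket, key in s3_records(event):
        args = {"--source_path": f"s3://{bucket}/{key}"}
        if start_job_run is not None:
            runs.append(start_job_run(JobName=GLUE_JOB_NAME, Arguments=args))
        else:
            runs.append(args)  # dry run: return the arguments that would be sent
    return runs
```

Keeping the Glue client call injectable makes the handler unit-testable with a stubbed `start_job_run`, a common pattern for Lambda code.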

John Kuloth Petersan S Education Details

Frequently Asked Questions about John Kuloth Petersan S

What company does John Kuloth Petersan S work for?

John Kuloth Petersan S works for Stealth AI Startup.

What is John Kuloth Petersan S's role at the current company?

John Kuloth Petersan S's current role is Data Engineer.

What schools did John Kuloth Petersan S attend?

John Kuloth Petersan S attended EPITA: École d'Ingénieurs en Informatique and Pondicherry Engineering College.
