Satheesh Babu
• 19 years of experience in the Information Technology industry, with expertise in developing ETL data warehouse applications.
• Experience in design and coding with ETL and data platform tools such as Teradata, Snowflake, Python, Informatica, and Ab Initio.
• Experience in developing Python extracts with cloud services such as Terraform, APIs, AWS ECS, Lambda, Glue, and SNS to extract data into an SP data lake in AWS S3, with automation and scheduling through Jenkins and Step Functions.
• Expert in creating solutions such as trending analysis and dashboard reports in Tableau and Cognos.
• Proficient and experienced in cloud technologies such as AWS, and in Big Data.
• Proficient in Teradata/Snowflake architecture: design and development of database objects and scripts, unit testing, QA deployment and testing, performance tuning, enhancements for production issues, and application support.
• Experienced in interacting with business users to analyze business processes and requirements, transforming requirements into data models, designing mappings/graphs, documenting, and rolling out deliverables.
• Extensively used Teradata/Snowflake/Informatica/Ab Initio built-in transformations and created reusable components through parameterized graphs/mapplets/worklets.
• Expertise in administering and developing Teradata/Snowflake/Informatica/Ab Initio mappings, sessions, folders, and repositories, and in tuning their performance.
• Worked in the Consumer Goods and Banking and Financial Services domains, performing technical and team lead roles in Teradata/Snowflake/Ab Initio/Informatica technologies.
• Performed the role of onsite technical lead developer, interacting with business users from requirements gathering through design, development, testing, deployment to higher environments, enhancements, UAT support, and implementation.
• Experienced in ETL testing, SQL performance tuning, and SAP APO functional testing.
• Documented modifications and enhancements made to the applications, systems, and databases required by the project, and maintained version control using GitHub.
Teradata, Informatica, AWS Lead Engineer | Wipro | Portland, OR, US
Computer Programmer | Laksan Technologies | Jul 2022 - Present | Metuchen, New Jersey, US
• Analyze data in a Snowflake data lake environment using AWS analytics tools such as S3 and Athena.
• In response to COVID-19 marketplace impacts, play a critical role in executing a cross-functional plan to analyze large-scale changes to the demand plan/supply plan for a major buy, sending signals to factories.
• Drive meetings with architects and product owners on requirements from planners, and work with the product team to create the design, data model, and data integration and to implement the solution.
• Create Python extracts with cloud services such as APIs, AWS ECS, Lambda, Glue, and SNS to extract data into the SP data lake in AWS S3, automating and scheduling through Jenkins and Step Functions.
• Deploy changes through CI/CD pipelines and maintain version control using GitHub; provide solutions such as trending analysis and dashboard reports in Tableau and Cognos.
• Create technical design documents with HLDs, LLDs, and DLDs, algorithms, shell scripts, and Visio/Lucid diagrams.
• Create data quality checks to monitor planning results and a proactive catch-and-trap mechanism, adding fail-safe capability to the planning cycle.
• Develop and implement complex APIs in Teradata, Snowflake, and Unix using stored procedures, functions, and views; optimize system databases using EXPLAIN plan analysis, statistics collection, index definition, and partition creation.
• Implement extraction of data from SAP APO liveCache (LCEX) into Teradata, replacing the Business Application Programming Interface (BAPI) extracts.
• Experienced in ETL testing and SQL performance tuning, and aid in SAP APO testing.
• Perform cutover activities, maintain application modules and servers, and handle technical upgrades of the application using Oracle, Informatica, and Teradata.
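The data quality checks with fail-safe capability described above can be sketched in plain Python. This is a minimal illustration only; the check names, field names, and thresholds are hypothetical, not taken from the actual pipeline:

```python
# Minimal sketch of a data-quality gate for planning results.
# All names (check functions, "item_id", "qty", thresholds) are
# illustrative assumptions, not the real pipeline's schema.

def check_not_empty(rows):
    """Fail-safe: an empty extract should never reach the planning cycle."""
    return len(rows) > 0

def check_no_null_keys(rows, key="item_id"):
    """Every planning record must carry its key column."""
    return all(r.get(key) is not None for r in rows)

def check_quantity_range(rows, lo=0, hi=1_000_000):
    """Trap obviously bad demand/supply quantities."""
    return all(lo <= r["qty"] <= hi for r in rows)

def run_quality_gate(rows):
    """Run all checks; return (passed, list of failed check names)."""
    checks = {
        "not_empty": check_not_empty(rows),
        "no_null_keys": check_no_null_keys(rows),
        "quantity_range": check_quantity_range(rows),
    }
    failed = [name for name, ok in checks.items() if not ok]
    return (not failed, failed)

rows = [{"item_id": "A1", "qty": 500}, {"item_id": "A2", "qty": 120}]
passed, failed = run_quality_gate(rows)
```

In a real setup the gate would run after each extract and before the planning cycle, halting the batch (or raising an alert through SNS) when any check fails.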
Teradata, Informatica, AWS Lead Engineer | Wipro | Jan 2015 - Jul 2022 | Bangalore, Karnataka, India
• Analyzed data in a Snowflake data lake environment using AWS analytics tools such as S3 and Athena.
• In response to COVID-19 marketplace impacts, played a critical role in executing a cross-functional plan to analyze large-scale changes to the demand plan/supply plan for a major buy, sending signals to factories.
• Drove meetings with architects and product owners on requirements from planners, and worked with the product team to create the design, data model, and data integration and to implement the solution.
• Created Python extracts with cloud services such as APIs, AWS ECS, Lambda, Glue, and SNS to extract data into the SP data lake in AWS S3, automating and scheduling through Jenkins and Step Functions.
• Deployed changes through CI/CD pipelines and maintained version control using GitHub; provided solutions such as trending analysis and dashboard reports in Tableau and Cognos.
• Created technical design documents with HLDs, LLDs, and DLDs, algorithms, shell scripts, and Visio/Lucid diagrams.
• Created data quality checks to monitor planning results and a proactive catch-and-trap mechanism, adding fail-safe capability to the planning cycle.
• Developed and implemented complex APIs in Teradata, Snowflake, and Unix using stored procedures, functions, and views; optimized system databases using EXPLAIN plan analysis, statistics collection, index definition, and partition creation.
• Implemented extraction of data from SAP APO liveCache (LCEX) into Teradata, replacing the Business Application Programming Interface (BAPI) extracts.
• Experienced in ETL testing and SQL performance tuning, and aided in SAP APO testing.
• Performed cutover activities, maintained application modules and servers, and handled technical upgrades of the application using Oracle, Informatica, and Teradata.
Assistant Consultant | Tata Consultancy Services | Jan 2011 - Jan 2015 | Mumbai, Maharashtra, India
1. Analyzed and created data maps and source-target mappings, high-level design, and technical design documents per requirements; designed the data model and reviewed with architects the tables created or modified in the MDM database.
2. Analyzed and designed source-target mappings and developed Informatica PowerCenter mappings and workflows to integrate with TIBCO MDM for loading customer/account/relationship data into TIBCO.
3. Worked on Informatica ETL testing, unit testing, system integration testing, and user acceptance testing; created Informatica mappings for data quality metrics collection.
4. Analyzed requirements for integrating customer data and billing data from different systems; developed Ab Initio graphs, migrated Informatica mappings to Ab Initio graphs, and performed end-to-end ETL testing.
5. Improved the performance of Informatica mappings, database objects, and Ab Initio graphs by tuning cache properties, collecting statistics, and adjusting indexes and parameters.
Associate | Cognizant | Jun 2005 - Nov 2010 | Teaneck, New Jersey, US
• Analyzed requirements, designed and developed Informatica mappings and scripts, and performed unit testing.
• Developed and maintained Informatica mappings and stored procedures.
• Tuned long-running jobs, monitored and supported batches, and communicated status to the business.
• Analyzed requirements and designed and developed Ab Initio and Teradata objects for loading into the warehouse, along with UNIX scripts and reusable graphs.
• Migrated Informatica mappings to Ab Initio graphs.
• Designed the data model, developed Informatica mappings and objects for loading data from the Operational Data Store (ODS) to Teradata, performance-tuned them, and troubleshot QA issues.
• Developed Informatica mappings, migrated mappings from Informatica 6.2 to 7.1, and performance-tuned and tested them.
• Extracted operational data from various source systems/files across different sites, transformed it, and loaded it to the staging area.
• Developed Ab Initio graphs using components such as Compress, Join, Filter by Expression, Partition and Departition, Reformat, and Rollup, along with multifiles.
• Extensively worked on UNIX shell scripting to create wrapper scripts for executing Ab Initio graphs.
• Developed and supported common graphs for Unload/Export, Load, MultiLoad, FastLoad, and data denormalization for the entire operational data warehouse team.
• Developed Teradata BTEQ scripts and executed them through Ab Initio to load daily incremental data.
• Migrated existing Informatica jobs to Ab Initio, and scheduled and maintained Ab Initio batch jobs using the UC4 scheduling tool.
• Improved Ab Initio graph performance using techniques such as multifiles, partitioning components, and multithreading through configuration parameters.
• Collected statistics and used dynamic SQL while loading data for better performance.
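The daily incremental loads above (BTEQ/MultiLoad into the warehouse) follow a common delta-merge pattern: rows arriving in the daily extract overwrite matching warehouse rows by key, and previously unseen keys are inserted. A minimal Python sketch, with hypothetical field names not taken from the actual warehouse:

```python
# Sketch of a daily incremental (delta) merge, i.e. an upsert by key.
# "cust_id" and "balance" are illustrative column names only.

def merge_incremental(warehouse, delta, key="cust_id"):
    """Return the warehouse rows after applying the daily delta."""
    merged = {row[key]: row for row in warehouse}   # current state, by key
    for row in delta:                               # upsert each delta row
        merged[row[key]] = row                      # update or insert
    return list(merged.values())

warehouse = [{"cust_id": 1, "balance": 100}, {"cust_id": 2, "balance": 50}]
delta = [{"cust_id": 2, "balance": 75}, {"cust_id": 3, "balance": 10}]
result = merge_incremental(warehouse, delta)
```

In the actual warehouse the same effect is achieved set-wise in SQL (e.g. an update-else-insert against the target table) rather than row by row in application code.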