Karthick R.
● Over 15 years of IT experience specializing in the analysis, design, and development of ETL and data ingestion/wrangling processes across all phases of the data warehousing life cycle, with expertise in Business Intelligence, Data Warehouse, and OLAP technology.
● In-depth experience designing and developing Extraction, Transformation, and Loading (ETL) and data ingestion/wrangling processes from various sources into data warehouses/data marts using IBM InfoSphere Information Server V11.7/V11.5, IBM DataStage V8.5/V8.7, and Ascential DataStage V7.5 (Designer, Director, Manager, and Administrator).
● Good exposure to the Azure cloud with Azure Data Factory (ADF) and Databricks.
● Worked with the Snowflake data warehousing tool in an Azure cloud environment.
● Implemented DevOps automation in both on-premises and cloud environments with Continuous Integration (CI) and Continuous Deployment (CD) pipelines.
● IBM Certified Solution Developer - InfoSphere DataStage v8.5.
● Teradata 12 Certified Professional.
● IBM Certified Database Associate - DB2 v10.5 Fundamentals for LUW.
● Extensively worked with database and file sources including Teradata, UDB DB2, Oracle, PostgreSQL, XML, WSDL (web services), sequential files (.csv, .txt, and .gz), SAP BW, SAP HANA, and flat files.
● Automated reconciliation of systems and databases; implemented controls and alerts.
● Worked with SQL, stored procedures, and PL/SQL in multiple projects.
● Implemented seven large projects for various clients, involved in the complete project life cycle: requirement gathering, analysis and design, unit testing, System Integration Testing (SIT), User Acceptance Testing (UAT), implementation, and post-production support.
● Solved challenging performance problems by identifying bottlenecks and tuning queries at the database, DataStage, and administrator levels.
● Managed workload to meet deadlines ahead of SLAs, offering exceptional troubleshooting skills and a talent for developing innovative solutions to unusual and difficult problems.
● Data warehouse experience in the design and implementation of Star Schema and Snowflake Schema.
● Experienced in UNIX shell scripting (Bourne shell (sh), Korn shell (ksh), bash, and crontab) for triggering DataStage jobs, automation scripts, file manipulation, count matching, scheduling, and text processing.
● Excellent communication, interpersonal, and analytical skills, with a strong ability to perform as part of a team.
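The count-matching and reconciliation checks described above could be sketched as a small Bourne shell script along these lines (the file, counts, and messages are illustrative assumptions, not taken from any actual project):

```shell
#!/bin/sh
# Illustrative sketch: compare the record count of a source extract file
# with the count reported by the load process, and flag any mismatch.

src_file=$(mktemp)
printf 'row1\nrow2\nrow3\n' > "$src_file"     # stand-in for an extract file

# tr strips the leading spaces some wc implementations emit
src_count=$(wc -l < "$src_file" | tr -d ' ')
loaded_count=3                                 # stand-in for a DB-reported count

if [ "$src_count" -eq "$loaded_count" ]; then
  echo "RECON OK: $src_count records"
else
  echo "RECON MISMATCH: source=$src_count loaded=$loaded_count" >&2
fi
rm -f "$src_file"
```

In a real pipeline the script would typically run from crontab after the DataStage load finishes, with the mismatch branch raising an alert instead of only writing to stderr.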
Senior Developer, Freddie Mac
Sep 2020 - Present, McLean, VA, US
• Participate in all phases of the software development lifecycle, from inception through post-implementation support, for all types of applications, consistent with established specifications and business requirements, to deliver business value.
• Understand customer requirements and translate them into appropriate technical solutions.
• Serve as a data expert by translating business problems into designs and creating the required aggregations and transformations.
• Collaborate with business partners to identify and scope new opportunities for analytics applications that evaluate business performance and support business decisions.
• Work with a team of developers responsible for integrating data into the enterprise data warehouse by leveraging the Extract-Transform-Load (ETL) tool and business intelligence framework.
• Design technical deliverable artifacts, such as source-to-target mappings and implementation run books, needed for project implementation.
• Guide the team in preparing the low-level design, coding, and unit testing of the data warehouse application based on the high-level design solution.
• Develop code that meets technical specifications and business requirements according to the established designs and framework.
• Perform coding and development of analytical models to support business partner objectives, business needs, and implementation.
• Troubleshoot development and operational problems and provide timely solutions.
Sr. Developer, OppenheimerFunds
Jun 2018 - Sep 2020, Atlanta, GA, US
• Participated in all phases of the software development lifecycle, from inception through post-implementation support, for all types of applications, consistent with established specifications and business requirements, to deliver business value.
• Understood customer requirements and translated them into appropriate technical solutions.
• Applied data warehousing, reporting, and data integration design concepts to contribute to the design, development, testing, and implementation of end-to-end integrated systems, and supported production issues after deployment.
• Served as a data expert by translating business problems into designs and creating the required aggregations and transformations.
• Architected solutions that drive enterprise business intelligence projects from design, development, and demonstration through full implementation, using a combination of ETL and business intelligence tools.
• Collaborated with business partners to identify and scope new opportunities for analytics applications that evaluate business performance and support business decisions.
• Worked with a team of developers responsible for integrating data into the enterprise data warehouse by leveraging the Extract-Transform-Load (ETL) tool and business intelligence framework.
• Designed technical deliverable artifacts, such as source-to-target mappings and implementation run books, needed for project implementation.
• Guided the team in preparing the low-level design, coding, and unit testing of the data warehouse application based on the high-level design solution.
• Developed code that meets technical specifications and business requirements according to the established designs and framework.
• Performed coding and development of analytical models to support business partner objectives, business needs, and implementation.
• Troubleshot development and operational problems and provided timely solutions.
Technical Lead, Credit Suisse
Jul 2017 - Jan 2018, Zurich, CH
• Participated in all phases of the software development lifecycle, with a DevOps automation process in a cloud environment, from inception through post-implementation support, for all types of applications, consistent with established specifications and business requirements, to deliver business value.
• Understood customer requirements and translated them into appropriate technical solutions.
• Applied data modeling and diagramming techniques, creating process flow diagrams and working with entity-relationship models.
• Applied data warehousing, reporting, and data integration design concepts to contribute to the design, development, testing, and implementation of end-to-end integrated systems, and supported production issues after deployment.
• Served as a data expert by translating business problems into designs and creating the required aggregations and transformations.
• Architected solutions that drive enterprise business intelligence projects from design, development, and demonstration through full implementation, using a combination of ETL and business intelligence tools.
Senior Developer, BB&T
Sep 2014 - Jul 2017, Charlotte, NC, US
• Hands-on experience developing and designing code within an ETL and business intelligence framework.
• Gathered requirements for BI projects by interacting with business users and upstream data producers.
• Analyzed project requirements.
• Prepared the data warehouse strategy and development design approach using InfoSphere DataStage 11.5 / DB2 for various data integration requirements.
• Created source-to-target mapping sheets to migrate Medicaid eligibility and enrollment data to a new system.
• Used Secure File Transfer Protocol (SFTP) to connect to external servers to receive files or send files to Master Data Management (MDM).
• Worked on high-level architectural design and reviewed the team's technical deliverables.
• Performed problem analysis, identified root causes, and outlined resolution options.
• Profiled source data using Information Analyzer (IA).
• Built Data Quality (DQ) jobs in both Information Analyzer and QualityStage by creating data rule definitions, data rules, and data rule set definitions.
• Automated entry of business glossary terms into InfoSphere Information Governance Catalog (IGC).
• Performed metadata migration using InfoSphere Metadata Asset Manager (IMAM).
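An SFTP push of an extract file to an external server, as mentioned above, is commonly scripted with a batch file. A minimal sketch follows; the host, user, and paths are hypothetical placeholders, and the actual transfer is commented out because it needs connectivity and keys:

```shell
#!/bin/sh
# Illustrative sketch: build an sftp batch file that uploads an extract
# to an external MDM server. All names below are made up for the example.

batch=$(mktemp)
cat > "$batch" <<'EOF'
cd /inbound/mdm
put /data/extracts/customer_extract.dat
bye
EOF

# Report how many batch commands were generated (tr strips wc padding).
echo "batch commands: $(wc -l < "$batch" | tr -d ' ')"

# Real transfer (requires network access and configured SSH keys):
# sftp -b "$batch" etluser@mdm.example.com

rm -f "$batch"
```

Driving `sftp -b` from a script like this keeps the transfer non-interactive, so it can run unattended from crontab or be triggered after a DataStage job completes.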
System Engineer, KeyBank
Jun 2008 - Aug 2014, Cleveland, OH, US
• Partnered with business users to understand requirements and convert them into project-level functional specifications.
• Worked with business analysts to identify the appropriate sources and data elements.
• Conducted requirements gathering with various business teams to understand the data needs for the new chip cards.
• Coordinated with different source system teams to understand the process changes required for processing chip cards.
• Analyzed existing data in the data marts and EDW to understand the impact of process changes on the existing system.
• Profiled the new chip information for data modeling and data quality monitors in the EDW.
• Provided high-level and low-level design solutions using DataStage and shell scripts.
• Designed a new process for data reprocessing and reconciliation to ensure no data loss in the ETL layer.
• Designed the data model for the new chip data.
• Designed, developed, and tested ETL applications consistent with application architecture guidelines.
• Prepared test data to ensure coverage of all test case scenarios.
• Worked with business teams to obtain sign-off that production changes went well.
Senior Software Engineer, Ameriprise Financial Services, LLC
Jan 2007 - May 2008, Minneapolis, MN, US
• Developed code that meets technical specifications and business requirements according to the established designs and framework.
• Automated a reconciliation process to identify missing organization information between systems for CSR agents.
• Designed mappings from sources to operational staging targets using a Star Schema, and implemented Slowly Changing Dimension (SCD) logic.
• Downloaded XML from the payment website once payment authorization completed, using the curl command in a UNIX script, and processed the XML file into BI tables.
• Created ETL jobs that invoke SOAP web services via WSDL to change the order status once a payment is approved or rejected, based on the XML file data.
• Used DataStage/QualityStage Designer to import/export jobs, table definitions, custom routines, and custom transformations.
• Extensively worked with Join, Lookup (normal and sparse), and Merge stages to improve performance based on the volume of source and reference links.
• Reduced jobs that ran for several hours down to a few minutes of overall runtime using different partitioning techniques.
• Followed the Agile methodology, which greatly increased the likelihood of implementing the data warehouse successfully, on time, and within budget.
• Involved in testing, debugging, bug fixing, and documentation of the system.
• Analyzed issues, traced the problem area, suggested solutions, and discussed them with the business.
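The payment-XML step above (fetch with curl, then extract the authorization status) could be sketched roughly as follows. The URL and XML layout are invented for illustration, and the curl call is commented out since it needs network access; a production job would use a proper XML stage or parser rather than sed:

```shell
#!/bin/sh
# Illustrative sketch of the payment-XML flow. The endpoint below is a
# made-up placeholder, so the download step is shown but not executed:
# curl -s -o payment.xml "https://payments.example.com/status?order=1001"

xml_file=$(mktemp)
cat > "$xml_file" <<'EOF'
<payment><order id="1001"><status>APPROVED</status></order></payment>
EOF

# Pull the order status out of the XML with a simple sed capture.
status=$(sed -n 's/.*<status>\(.*\)<\/status>.*/\1/p' "$xml_file")
echo "order status: $status"
rm -f "$xml_file"
```

The extracted status would then drive the downstream step, e.g. deciding whether the SOAP web service is invoked to mark the order approved or rejected.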
Senior Software Engineer, American Express
Jan 2005 - Dec 2006, New York, NY, US
• Partnered with business users to identify, prioritize, and resolve numerous data issues; created ETL project plans and designed and developed tables.
• Extracted data from heterogeneous sources such as SAP BW (via Open Hub connection), mainframe tables, flat files, and IBM DB2.
• Designed and developed parallel jobs for slowly changing dimensions (SCD Type 1, Type 2, and Type 3) and fact table loading, using Type 2 to handle historical data.
• Designed DataStage parallel jobs to process more than 40 million records on a weekly basis.
• Monitored history data migration in the general ledger.
• Monitored workflows and optimized load times.
• Designed, developed, and tested the fact table partitioning strategy to minimize the performance impact of additional history data in the fact tables.
• Performed bulk data migration from DB2 to Teradata using Teradata utilities such as FastLoad, MultiLoad, TPump, and FastExport.
• Analyzed data discrepancies through error files and log files for further data processing and cleansing.
• Used DataStage Manager to import metadata into the repository, create new job categories, and export and import jobs between the production and development servers.
• Analyzed daily production support defects and identified root causes.
• Involved in testing, debugging, bug fixing, and documentation of the system.
Education
Madras University, Information Technology