Kiran D Email and Phone Number
• Information Technology professional with 14 years of data engineering, data integration, and ETL/ELT experience developing Enterprise Data Warehouses (EDW), data marts, data lakes, and data analytics solutions, both on-prem and in the cloud.
• Create and maintain data platforms and very large-scale data analytics solutions based on the Snowflake Cloud Data Warehouse.
• Design, develop, and maintain scalable, robust data pipelines and ETL/ELT processes to collect, process, and store large volumes of structured and unstructured data using Snowflake, ADF, Informatica Cloud (IICS/IDMC), and Informatica PowerCenter.
• Good exposure to Snowflake cloud architecture, SnowSQL, and SnowPipe for continuous data ingestion and for bulk loading and unloading of data into Snowflake tables.
• Expertise in Snowflake concepts such as RBAC controls, virtual warehouses, Zero-Copy Clone, Time Travel, stages, file formats, SnowPipe, SnowSQL, tasks, and streams.
• Experience migrating data from RDBMS sources to the Snowflake cloud data warehouse using Python and Snowflake SQL.
• Experience loading and unloading data to/from Snowflake in file formats such as Parquet, Avro, CSV, JSON, and XML.
• Implement CDC in Snowflake using streams and automate incremental loads using Snowflake tasks (see the sketch after this summary).
• Write complex SQL scripts using statistical aggregate and analytical functions to support ETL in the Snowflake cloud data warehouse.
• Design, develop, and maintain Informatica Intelligent Cloud Services (IICS) assets using best-practice data load techniques.
• Extract, transform, and load data from various sources, including databases, Blob Storage locations, and flat files, using Azure Data Factory (ADF).
• Proficient in developing data integration mappings, transformations, and reusable components in both Informatica IICS and PowerCenter.
• Develop SQL scripts and PL/SQL procedures, functions, packages, triggers, and views that store, retrieve, and manipulate data for analytics.
• Write complex SQL queries using analytical/window functions to simplify queries and improve performance.
• Develop Python and Linux scripts to automate the execution of ETL batch jobs using schedulers.
• Execute, schedule, and automate deployments using Azure DevOps (ADO) or DevOps tools such as Git, Bitbucket, IBM uDeploy, Artifactory, and Jenkins.
• Develop ETL processes covering data sourcing, mapping, transformation, conversion, and loading; implement slowly changing dimension (SCD) techniques (Type 1, Type 2, and Type 3).
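The Streams-and-Tasks CDC pattern mentioned in the summary can be illustrated with a short Snowflake SQL sketch. All object names here (orders_raw, orders_dim, orders_stream, etl_wh) are hypothetical, and the MERGE logic is simplified (deletes are ignored for brevity); this is a minimal sketch of the technique, not code from any project below.

    -- A stream records inserts/updates/deletes made to the raw table.
    CREATE OR REPLACE STREAM orders_stream ON TABLE orders_raw;

    -- A task wakes on a cron schedule and runs only when the stream
    -- actually holds new change rows.
    CREATE OR REPLACE TASK orders_incremental_load
      WAREHOUSE = etl_wh
      SCHEDULE  = 'USING CRON 0 2 * * * UTC'   -- daily at 02:00 UTC
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO orders_dim t
      USING orders_stream s
        ON t.order_id = s.order_id
      WHEN MATCHED AND s.METADATA$ACTION = 'INSERT' THEN UPDATE
        SET t.status = s.status, t.updated_at = CURRENT_TIMESTAMP()
      WHEN NOT MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
        INSERT (order_id, status, updated_at)
        VALUES (s.order_id, s.status, CURRENT_TIMESTAMP());

    -- Tasks are created suspended; RESUME activates the schedule.
    ALTER TASK orders_incremental_load RESUME;

Consuming the stream inside the MERGE advances its offset, so each change row is processed exactly once per run.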
Senior Data Engineer | Prime Video & Amazon MGM Studios | Pittsburgh, PA, US
Senior Data Engineer (Cloud) | Hyatt Hotels Corporation | Dec 2023 - Present | Pittsburgh, Pennsylvania, United States
Senior Data Engineer | AT&T | Jun 2022 - Nov 2023 | Dallas-Fort Worth Metroplex
• Design, implement, and optimize a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
• Develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake SnowSQL.
• Build, deploy, and maintain data loads from ADLS/Azure Blob Storage to Snowflake using SnowSQL.
• Build Snowflake databases, schemas, tables, tasks, stages, and stored procedures, and migrate applications from on-premises to the Snowflake platform.
• Create external stages for Microsoft Azure, AWS, or GCP to load data into Snowflake; also create internal stages within Snowflake to upload data from large zip files (greater than 50 MB).
• Create JSON, Parquet, and CSV file formats in Snowflake for loading data from external/internal stages into Snowflake.
• Develop SnowPipe data pipelines that ingest data from external stages (Azure Blob Storage, AWS, or GCP) in JSON, Parquet, and CSV file formats and load it into Snowflake tables (see the sketch after this section).
• Create streams in Snowflake to detect changes using metadata columns and implement Change Data Capture for daily loads.
• Create tasks in Snowflake to automate the data ingestion process for SnowPipes and streams using a cron scheduler.
• Create Python scripts to connect to RDBMS sources, fetch data, and export it to Azure VM or Snowflake stage locations.
• Build shell scripts to automate file extraction and movement for seamless Snowflake task execution.
• Develop SQL and PL/SQL blocks in Oracle and SQL Server to fetch, load, and modify data for ingestion.
• Write CTEs and window functions in Snowflake to improve query performance by eliminating subqueries and multiple table scans.
• Conduct code reviews and lead development of standards, guidelines, and best practices for Snowflake ingestion techniques.
• Deploy shell scripts and database and ETL components through CI/CD pipelines with Jenkins, Git, and Bitbucket.
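A hedged sketch of the stage / file format / SnowPipe ingestion flow described above: the container URL, integration name, stage, and table are hypothetical placeholders, and the Azure Event Grid notification integration is assumed to exist already.

    -- File format for newline-delimited or array-wrapped JSON files.
    CREATE OR REPLACE FILE FORMAT json_ff
      TYPE = 'JSON'
      STRIP_OUTER_ARRAY = TRUE;

    -- External stage pointing at an Azure Blob Storage container.
    CREATE OR REPLACE STAGE azure_events_stage
      URL = 'azure://myaccount.blob.core.windows.net/events'
      CREDENTIALS = (AZURE_SAS_TOKEN = '...')   -- token elided
      FILE_FORMAT = json_ff;

    -- SnowPipe with AUTO_INGEST loads each new file as notifications
    -- arrive, rather than on a fixed schedule.
    CREATE OR REPLACE PIPE events_pipe
      AUTO_INGEST = TRUE
      INTEGRATION = 'EVENTS_NOTIF_INT'   -- assumed notification integration
    AS
      COPY INTO raw_events (payload)    -- payload: VARIANT column (assumed)
      FROM (SELECT $1 FROM @azure_events_stage);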
Data Engineer | PNC | May 2017 - May 2022 | Pittsburgh, Pennsylvania, United States
• Develop ETL scripts to migrate data from Oracle into a Snowflake data lake and support a variety of structured, semi-structured, and unstructured data sources.
• Create SnowPipes to load files from external stages (Azure) into Snowflake tables and track changes using Snowflake streams; automate the daily Snowflake loads with Snowflake tasks (see the sketch after this section).
• Develop IICS taskflows, mapping tasks, and mappings to load a data warehouse/data lake from various applications on Teradata, SQL Server, cloud sources, and files using the Cloud Data Integration (CDI) service.
• Develop data synchronization and data replication tasks in IICS to move data from multiple sources.
• Create data tasks, file ingestion tasks, assignment tasks, command tasks, and notification tasks, and integrate them into Informatica taskflows using the Cloud Data Integration (CDI) service.
• Build linear and sequential taskflows in IICS to execute data tasks in a predefined or sequential order.
• Build mass ingestion pipelines in IICS to move very large files from one remote server to another over SFTP.
• Design and develop IICS taskflows with condition-based decisions and parallel paths to execute data tasks.
• Develop Informatica PowerCenter mappings, sessions, and workflows to integrate data from multiple on-prem sources.
• Develop RDBMS scripts using advanced SQL and PL/SQL blocks against Oracle, Teradata, and SQL Server databases to store, retrieve, and manipulate data for analytics.
• Conform to Informatica (IICS/IDMC/PowerCenter) standards so that development is done in a consistent manner.
• Deploy ETL Informatica PowerCenter components, relational database objects, shell scripts, and files using DevOps/CI-CD tools: GitHub, Bitbucket, uDeploy, Artifactory, and Jenkins.
• Develop Linux/shell scripts to automate the execution of ETL Informatica PowerCenter workflows.
• Debug complex ETL processes; troubleshoot and resolve complex data load and data retrieval failures.
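The scheduled daily load described above can be illustrated with a minimal Snowflake sketch; the stage, table, and warehouse names are hypothetical, and the IICS taskflow side of the work is configured in the Informatica UI, so only the Snowflake half is shown.

    -- Daily task that bulk-loads whatever files have landed in the stage.
    CREATE OR REPLACE TASK daily_policy_load
      WAREHOUSE = load_wh                      -- hypothetical warehouse
      SCHEDULE  = 'USING CRON 30 1 * * * UTC'
    AS
      COPY INTO policy_raw
      FROM @azure_policy_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1
                     FIELD_OPTIONALLY_ENCLOSED_BY = '"')
      ON_ERROR = 'SKIP_FILE';                  -- skip malformed files

    ALTER TASK daily_policy_load RESUME;

COPY INTO tracks load history on the table, so rerunning the task does not reload files it has already consumed.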
Senior ETL Developer | Nationwide | Aug 2015 - Apr 2017 | Columbus, Ohio Metropolitan Area
• Evaluated functional requirements, converted business specifications into ETL technical artifacts, and determined the most efficient design solution across multiple integration projects.
• Designed the dimensional model and refined source-to-target mappings and ETL technical specification documents.
• Developed ETL Informatica PowerCenter mappings, sessions, and workflows to ingest data from disparate sources, including Oracle, SQL Server, Teradata, Netezza, and flat files.
• Created reusable transformations/mapplets in Informatica PowerCenter for data cleansing and standardization during initial and incremental data loads.
• Implemented SCD Type 1 and Type 2 dimension techniques using Change Data Capture (CDC) to maintain transaction history with Informatica PowerCenter tools (see the sketch after this section).
• Built SQL scripts, PL/SQL procedures, functions, packages, and triggers that store, retrieve, and manipulate data for data analytics.
• Enforced Informatica and database best practices throughout the ETL development lifecycle.
• Deployed Informatica PowerCenter components (workflows, sessions, mappings), database components, and shell/Linux scripts using CI/CD pipelines and tools such as Jenkins and Bitbucket.
• Accelerated ETL Informatica PowerCenter workflow and database performance with Pushdown Optimization (PDO), caching, partitioning, index tuning, window functions, and CTEs.
• Programmed shell scripts to schedule jobs for various data cleansing and batch loading processes.
• Created and scheduled Autosys jobs using JCL and batch scripts to invoke Informatica workflows and procs.
• Ingested Property and Casualty (P&C) insurance data from Informatica PowerCenter into Guidewire PolicyCenter, giving Nationwide insurance agents up-to-date policy lifecycle information.
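The SCD Type 2 work here was built in Informatica PowerCenter mappings; purely as an illustration of the pattern, the equivalent two-step logic is sketched below in SQL, with hypothetical customer_dim / customer_stg tables and tracked columns.

    -- Step 1: close out the current row when a tracked attribute changed.
    UPDATE customer_dim
    SET    effective_to = CURRENT_DATE(), is_current = FALSE
    FROM   customer_stg s
    WHERE  customer_dim.customer_id = s.customer_id
      AND  customer_dim.is_current = TRUE
      AND  (customer_dim.address <> s.address
            OR customer_dim.segment <> s.segment);

    -- Step 2: insert a fresh current row for new and just-expired keys.
    INSERT INTO customer_dim
      (customer_id, address, segment, effective_from, effective_to, is_current)
    SELECT s.customer_id, s.address, s.segment, CURRENT_DATE(), NULL, TRUE
    FROM   customer_stg s
    LEFT JOIN customer_dim d
      ON  d.customer_id = s.customer_id
      AND d.is_current  = TRUE
    WHERE  d.customer_id IS NULL;

Step 1 must run first: expiring the changed rows leaves those keys with no current version, so step 2 picks up both brand-new and changed customers.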
ETL Informatica Developer | The Hartford | Nov 2009 - Jul 2015 | Greater Hartford
• Performed data analysis and refined source-to-target data mappings, ETL designs, and technical specifications.
• Implemented Change Data Capture (CDC) (SCD Type 1 and Type 2) techniques for incremental/delta loads using star schema and snowflake schema methodologies.
• Developed ETL Informatica PowerCenter mappings using transformations such as Aggregator, Expression, Lookup, Router, Filter, Joiner, Union, Sequence Generator, Normalizer, and Update Strategy.
• Created new Informatica PowerCenter mappings, transitioned PL/SQL ETLs to new mappings, and modified existing mappings.
• Built Oracle SQL and PL/SQL procedures, functions, packages, and views that are consistent and integrated with existing ETL processes.
• Created SQL and PL/SQL blocks covering performance tuning, packages, stored procedures, and functions.
• Built complex SQL queries on Oracle and optimized performance for large data volumes; debugged and tuned SQL queries and ETL jobs to shorten execution windows and reduce resource utilization (see the sketch after this section).
• Developed reusable data integration and Informatica PowerCenter components such as mapplets and worklets.
• Programmed shell scripts and JCL scripts to automate scheduled execution of ETL Informatica PowerCenter workflows with the Autosys scheduler.
• Constructed CI/CD pipelines using DevOps tools (Jenkins, Git, and Bitbucket) and deployed database objects, shell scripts, and ETL Informatica PowerCenter components.
• Troubleshot and fixed data and performance issues in Informatica PowerCenter workflows; optimized mappings and sessions for better performance and efficiency.
• Demonstrated Property & Casualty (P&C) insurance skills in implementing the complete policy lifecycle.
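As a small illustration of the window-function tuning mentioned above (table and column names hypothetical), a ROW_NUMBER() CTE returns the latest row per key in a single scan, replacing a correlated subquery that would rescan the table for every policy.

    -- Latest transaction per policy in one pass over the table.
    WITH ranked AS (
      SELECT policy_id,
             txn_date,
             premium,
             ROW_NUMBER() OVER (PARTITION BY policy_id
                                ORDER BY txn_date DESC) AS rn
      FROM   policy_txn
    )
    SELECT policy_id, txn_date, premium
    FROM   ranked
    WHERE  rn = 1;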
Kiran D Education Details
Frequently Asked Questions about Kiran D
What company does Kiran D work for?
Kiran D works for Prime Video & Amazon MGM Studios.
What is Kiran D's role at the current company?
Kiran D's current role is Senior Data Engineer.
What schools did Kiran D attend?
Kiran D attended Hyderabad Alumni Association of JNTU College of Engineering, Kakinada.
Who are Kiran D's colleagues?
Kiran D's colleagues are 熊万勇, Victor Coles, Malichi Anamm, 程先功, Erika Freites, 何志翔, Sagar Sahoo.