Maddy C

Maddy C Email and Phone Number

Data Architect @ GreenSky®
Texas, United States
Maddy C's Location
Plano, Texas, United States
About Maddy C

Over 18 years of IT professional experience with diverse expertise in cloud technologies, data warehousing, and business intelligence across industries including finance, insurance, investment banking, retail, and media & entertainment. My specialized skills are data warehouse design and architecture, integration, data marts, ETL architecture, evaluation of solution alternatives, frameworks, data quality, data profiling, data governance, audit, and SOX compliance. I also have hands-on experience designing conceptual, logical, and physical data models. A highly analytical, dedicated, results-driven IT professional with strong presentation, demonstration, and customer-relationship skills, I am looking for a challenging position that enables me to utilize my knowledge and skills, grow with the company, and help take it to the next level.

Work experience:
• Experience designing and migrating data from legacy platforms into the cloud with Informatica, Talend, PySpark, Python, AWS, and Snowflake.
• Proficient AWS engineer in data warehouse and data lake implementation and support, in the cloud and on-prem.
• Designed dimensional models, data lake architecture, and Data Vault 2.0 on Snowflake, and used Snowflake's logical data warehouse for compute.
• Experience with Snowflake cloud data warehousing in a shared technology environment, providing stable infrastructure, architecture, best practices, a secured environment, and reusable generic frameworks.
• Well versed in Snowflake features such as clustering, time travel, cloning, logical data warehouses, streams, tasks, and caching.
• Experience in data modeling, data warehousing, and ETL design and development using the Ralph Kimball methodology with star/snowflake schema designs, spanning analysis and definition, database design, testing, and implementation.
• Data analysis, database programming (stored procedures, triggers, views), table partitioning, and performance tuning.
• Strong knowledge of the SDLC (Waterfall, Agile, Scrum) and PMLC.
• Researches new technologies and keeps up to date with developments in ETL, cloud solutions, big data, analytics, and relational/NoSQL databases.
• Understands the business implications of technical solutions and assists in defining technology solutions to support business requirements and future needs.
• Managed multiple projects across all phases of solution design, modeling, development, and implementation.

Maddy C's Current Company Details
GreenSky®

Data Architect
Texas, United States
Maddy C Work Experience Details
  • Greensky®
    Data Architect
    Greensky®
    Texas, United States
  • Greensky®
    Principal Data Engineer
    Greensky® Apr 2024 - Present
    Texas, United States
    ✓ Working as a Data Architect/Lead to provide data-integration solutions for business requirements.
    ✓ Re-engineered the existing data model for near-real-time data requirements and implemented one-stop data access on the revised models, adding new controls and process improvements to support more efficient data delivery.
    ✓ Working closely with enterprise data architects on table design, and involved in modifying technical specifications.
    ✓ Developed key data pipelines to process large volumes of data by consolidating multiple disparate sources into a single destination, enabling quick data analysis for reliable business insights.
    ✓ Gather requirements and analyze ETL specification and design documents; participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
    ✓ Working on data analysis and data profiling based on business requirements, and recommending technical solutions.
    ✓ Working on strategies to move on-prem ETL jobs to cloud-based solutions using an ELT approach.
    ✓ Well versed in Snowflake features such as clustering, time travel, cloning, logical data warehouses, streams, tasks, and caching.
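The consolidation pattern described above — merging records from disparate sources into one destination schema — can be sketched in plain Python. This is a minimal illustration only; the source names, field mappings, and `load_date` column are hypothetical, not taken from the résumé:

```python
from datetime import date

def consolidate(sources):
    """Merge records from disparate source systems into one destination schema.

    `sources` maps a source-system name to a list of records whose field
    names differ per system; each record is normalized into a common shape
    and stamped with its origin and load date.
    """
    unified = []
    for system, records in sources.items():
        for rec in records:
            unified.append({
                # normalize differing field names into one schema
                "customer_id": rec.get("cust_id") or rec.get("customer_id"),
                "amount": float(rec.get("amt", rec.get("amount", 0))),
                "source_system": system,
                "load_date": date.today().isoformat(),
            })
    return unified

rows = consolidate({
    "billing": [{"cust_id": 1, "amt": "25.50"}],
    "crm": [{"customer_id": 2, "amount": 40}],
})
```

In a real pipeline the normalization map would be driven by per-source metadata rather than hard-coded `get` chains, but the shape — iterate sources, normalize, stamp lineage columns — is the same.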
  • Goldman Sachs
    Vice President
    Goldman Sachs Sep 2021 - Mar 2024
    United States
    ✓ Hands-on experience on multiple integration projects between SQL Server and Snowflake environments, including automation of several manual processes to avoid manual intervention.
    ✓ Experience with Snowflake cloud data warehousing in a shared technology environment, providing stable infrastructure, architecture, best practices, a secured environment, and reusable generic frameworks.
    ✓ Re-engineered the existing data model for near-real-time data requirements and implemented one-stop data access on revised customer/person data models, with process improvements and new controls to support more efficient data delivery.
    ✓ Well versed in Snowflake features such as clustering, time travel, cloning, logical data warehouses, streams, tasks, and caching.
    ✓ Developed key data pipelines to process large volumes of data by consolidating multiple disparate sources into a single destination, enabling quick data analysis for reliable business insights.
    ✓ Automated multiple end-to-end manual processes as part of the data migration from SQL Server to Snowflake.
    ✓ Developed pipelines to pull data from Snowflake tables and send it to downstream systems through S3/ingestion tables and SFTP.
    ✓ Designed multiple ETL processes using Talend to load from sources to targets through data transformations.
    ✓ Managed multiple projects across all phases of solution design, modeling, development, and implementation.
    ✓ Provided innovative ideas to overcome project hurdles and challenges during implementation.
    ✓ Migrated data from the on-prem data warehouse to the Snowflake cloud data warehouse environment.
    ✓ Applied process-improvement, performance-tuning, and re-engineering methodologies to ensure industry standards and end-user requirements were fulfilled.
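One common way to automate the manual reconciliation step in a SQL Server-to-Snowflake migration is to compare row counts and order-independent checksums on both sides. The sketch below illustrates that idea over in-memory rows; the table contents and column names are hypothetical, and in practice each side's fingerprint would be computed by a query in its own database:

```python
import hashlib

def table_fingerprint(rows, columns):
    """Order-independent fingerprint of a table: the row count plus an
    XOR of per-row MD5 checksums over the selected columns."""
    digest = 0
    for row in rows:
        payload = "|".join(str(row[c]) for c in columns)
        digest ^= int(hashlib.md5(payload.encode()).hexdigest(), 16)
    return len(rows), digest

def reconcile(source_rows, target_rows, columns):
    """True when both sides have identical counts and checksums."""
    return (table_fingerprint(source_rows, columns)
            == table_fingerprint(target_rows, columns))

legacy = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
migrated = [{"id": 2, "name": "b"}, {"id": 1, "name": "a"}]  # same data, any order
ok = reconcile(legacy, migrated, ["id", "name"])
```

XOR-ing per-row hashes makes the fingerprint insensitive to row order, which matters because the two engines return rows in different orders by default.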
  • Toyota North America
    Senior Solutions Architect
    Toyota North America Jul 2017 - Sep 2021
    Plano, Texas, United States
    • Worked in an industrial Agile software development process: daily stand-ups, pre-iteration meetings, iteration planning, backlog refinement, demo calls, and retrospectives, tracked in JIRA stories.
    • Worked on the Snowflake shared technology environment, providing stable infrastructure, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, best practices, and automated SCBD (secured database connections, code review, build process, deployment process) utilities.
    • Designed ETL processes using Talend to load from sources to targets through data transformations.
    • Worked on snowflake schemas and data warehousing.
    • Developed Talend big data jobs to load heavy volumes of data into the S3 data lake and then into the Redshift data warehouse.
    • Migrated data from the on-prem data warehouse to the Snowflake cloud data warehouse environment.
    • Built dimensional models and data vault architecture on Snowflake.
    • Developed Talend jobs to populate claims data into the data warehouse: star schema, snowflake schema, and hybrid schema.
    • Performed data modelling for the document database and collection design using Visual Studio.
    • Created shell script and Python code to support API calls using REST APIs.
    • Analyzed Test Track tickets and created JIRA stories.
    • Supported user and production queries.
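The star-schema load mentioned above follows a standard pattern: dimension rows get surrogate keys, and fact rows reference those keys rather than the natural business keys. A toy Python sketch of that pattern, with a hypothetical customer dimension and claim facts (not the actual Toyota schema):

```python
def load_star(events):
    """Toy star-schema load: assign surrogate keys to a customer
    dimension, then emit fact rows that reference those keys."""
    dim_customer = {}   # natural key -> surrogate key
    fact_claims = []
    for ev in events:
        nk = ev["customer"]
        if nk not in dim_customer:
            dim_customer[nk] = len(dim_customer) + 1  # next surrogate key
        fact_claims.append({
            "customer_sk": dim_customer[nk],   # FK to the dimension
            "claim_amount": ev["amount"],
        })
    return dim_customer, fact_claims

dims, facts = load_star([
    {"customer": "acme", "amount": 100},
    {"customer": "zen", "amount": 50},
    {"customer": "acme", "amount": 75},
])
```

Keeping only surrogate keys in the fact table is what lets the dimension evolve (e.g. via SCD Type 2 versioning) without rewriting historical facts.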
  • Toyota North America
    Technology Architect
    Toyota North America Nov 2014 - Jun 2017
    Torrance, California, United States
    • Interacted with business users to identify process metrics and the key dimensions and measures; involved in the complete project life cycle.
    • Developed the FRD (functional requirements document) and the data architecture document and communicated them to the concerned stakeholders; conducted impact and feasibility analysis.
    • Involved in identifying sources, validating source data, developing the required logic and transformations, creating mappings, and loading the data.
    • Designed ETL processes using Informatica to load data from flat files, XML files, and Excel files, and used PowerExchange for mainframe sources into the Oracle staging area.
    • Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Repository Manager.
    • Extensively tuned the performance of Informatica PowerCenter mappings as well as sessions.
    • Created detailed design documents and technical design specifications for the ETL process based on the business requirements.
    • Extensively used transformations such as Aggregator, Joiner, Expression, Filter, Update Strategy, Lookup, Sorter, and Stored Procedure.
    • Improved session performance with pipeline partitioning and by increasing the block size, data cache size, sequence buffer length, and target-based commit interval.
    • Worked extensively in Informatica to extract data from flat files and Oracle and load it into the target database.
    • Extensively worked with Informatica client tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
    • Worked closely with the business QA team during the testing phase and fixed reported bugs.
    • Implemented performance-tuning logic wherever possible for maximum efficiency and performance.
    • Provided innovative ideas to overcome project hurdles and challenges during implementation.
  • Blue Cross And Blue Shield Of Minnesota
    Integration Team Lead
    Blue Cross And Blue Shield Of Minnesota Dec 2011 - Oct 2014
    Minnesota, United States
    • Attended multiple requirement-gathering sessions with source-system teams and business users.
    • Designed data structures for storage and programmed the logic for data flow between the various stages of toll transaction processing.
    • Maintained and tracked a weekly status call with the team and finished deliveries within the given timelines.
    • Experience working with slowly changing dimensions and setting up change data capture (CDC) mechanisms.
    • Extensively implemented SCD Type 2 mappings for CDC in the EDW.
    • Developed mappings to extract data from SQL Server, Oracle, flat files, and XML files and load it into the data warehouse using the Mapping Designer.
    • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map sources to targets.
    • Created and modified COBOL copybooks to connect to source data using the PowerExchange Navigator.
    • Used the PowerExchange Change Data Capture (CDC) option to capture modified records.
    • Used Informatica Designer to create complex mappings with transformations such as Filter, Router, connected and unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data warehouse.
    • Developed slowly changing dimensions for SCD Type 1 and Type 2.
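The SCD Type 2 pattern implemented above can be illustrated in a few lines of Python: when an incoming record differs from the current dimension row for the same key, the old row is end-dated and a new current version is appended, preserving history. The column names (`key`, `value`, `start_date`, `end_date`, `is_current`) are illustrative, not from the actual EDW:

```python
def apply_scd2(dimension, incoming, effective_date):
    """Apply SCD Type 2 changes: expire the current row for each changed
    key and append a new current version; unchanged keys are left alone."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is None or cur["value"] != rec["value"]:
            if cur is not None:
                cur["is_current"] = False
                cur["end_date"] = effective_date   # close out the old version
            dimension.append({
                "key": rec["key"], "value": rec["value"],
                "start_date": effective_date, "end_date": None,
                "is_current": True,
            })
    return dimension

dim = [{"key": "c1", "value": "TX", "start_date": "2024-01-01",
        "end_date": None, "is_current": True}]
dim = apply_scd2(dim, [{"key": "c1", "value": "CA"}], "2024-06-01")
```

An ETL tool's SCD Type 2 mapping (e.g. an Update Strategy plus Lookup in Informatica) performs the same expire-and-insert logic against the dimension table instead of an in-memory list.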
  • Cognizant
    Associate
    Cognizant Aug 2010 - Nov 2011
    Chennai, Tamil Nadu, India
    • Designed, developed, and delivered data warehouse solutions per business requirements.
    • Coordinated with business users to gather requirements.
    • Parsed high-level design specifications into simple ETL coding and mapping standards.
    • Performed estimation, planning, and tracking of ongoing ETL projects.
    • Evaluated ETL processes to identify areas for improvement; tuned Informatica ETL components for optimal load performance.
    • Documented the source-to-target transformation rules for the various data entities by studying the existing batch programs and interviewing the system owners.
    • Created data definition documentation for the enterprise, including metadata, data flow diagrams, system data flowcharts, and source-to-target mappings.
    • Designed and developed ETL processes based on business rules, with a job control mechanism, using Informatica PowerCenter.
    • Provided technical and analytical support for process-improvement initiatives.
    • Served as the single point of contact for users' data questions and for new enhancements to current DWH processes.
  • Cognizant
    Data Analyst
    Cognizant Jan 2010 - Jul 2010
    Chennai, Tamil Nadu, India
    • Involved in gathering business requirements and molding them into the technical specifications required by the conversions team.
    • Designed the target data warehouse using a star schema; involved in the extraction, transformation, and loading (ETL) process.
    • Extensively worked with the online tracking system for work requests, database change requests, automated UNIX script modifications, and various scheduler job issues.
    • Collected requirements from business users and analyzed them.
    • Designed and developed complex Informatica mappings using Expression, Aggregator, Filter, Lookup, and Stored Procedure transformations to move data between the various applications.
    • Mappings involved transformations such as Expression, Sequence Generator, Joiner, and Update Strategy.
    • Created sessions, extensively using ETL methodology for complete data extraction, transformation, and loading with Informatica.
    • Managed multi-source data extraction using Informatica.
    • Involved in training and knowledge transfer for assigned project members.
    • Developed PL/SQL procedures for processing business logic in the database and used them in Stored Procedure transformations.
    • Extensively worked on performance tuning of programs, ETL procedures, and processes; performed error checking and testing of ETL procedures and programs using the Informatica session log.
    • Provided end-user training and production system support; worked with the architecture team to review the Informatica standards documents.
    • Worked extensively with Oracle Database 10g and flat files as data sources.
  • Igate
    Senior Software Engineer
    Igate Sep 2006 - Dec 2009
    Chennai, Tamil Nadu, India
    • Designed ETL mappings based on existing ETL logic.
    • Worked with transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
    • Involved in developing Informatica mappings and tuning them for better performance.
    • Extensively used ETL to load data from flat files into Oracle.
    • Prepared mapping specification documents giving the data flow and transformation logic for populating each column in the data warehouse tables.
    • Designed and developed the fact/dimension entities.
    • Tuned the ETL SQL.
    • Compared actual results to expected results and suggested changes to mappings owned by others.
    • Worked with the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
    • Extracted data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.
    • Used session parameters, mapping variables/parameters, and parameter files for flexible workflow runs based on changing variable values.
    • Worked as a fully contributing team member with independent planning and execution responsibilities.

Maddy C Education Details

  • Adhiyamaan College Of Engineering
    Information Technology

Frequently Asked Questions about Maddy C

What company does Maddy C work for?

Maddy C works for GreenSky®.

What is Maddy C's role at the current company?

Maddy C's current role is Data Architect.

What schools did Maddy C attend?

Maddy C attended Adhiyamaan College Of Engineering.
