Parminder S

Senior Data Architect @ Delphine Diagnostics Inc
Parminder S's Location
Orangeville, Ontario, Canada
About Parminder S

Accomplished data management professional with 17 years of IT experience across the Banking, Insurance, Healthcare, Retail/e-Commerce, and Life Sciences domains, specializing in Data Modeling, Data Architecture, Data Warehousing, Data Analysis, Data Migration, Data Governance, Master Data Management, Reference Data Management, ETL/ELT, Business Intelligence, Big Data, AWS Cloud, Snowflake, Vertica, and IBM Mainframes, spanning all phases of the Software Development Life Cycle.

Parminder S's Current Company Details
Delphine Diagnostics Inc

Senior Data Architect
Parminder S Work Experience Details
  • Delphine Diagnostics Inc
    Lead Data Modeler / Architect
    Delphine Diagnostics Inc Oct 2023 - Present
    Newark, New Jersey, United States
    • Create the Conceptual, Logical and Physical enterprise data models in ERwin for a machine learning model ingesting claims, pharmacy and other provider data into the integrated reporting layer for dashboards
    • Perform data analysis of the existing sources and their data against the new data model deployed on RDBMS applications
    • Create source-to-target data mapping documents and data flow diagrams, and manage metadata using Collibra
  • Texas Capital
    Lead Data Architect
    Texas Capital May 2023 - Sep 2023
    Richardson, Texas, United States
    • Create the Conceptual, Logical and Physical enterprise data models using Data Vault 2.0 in ERwin for the Data Warehouse and Information Delivery/Semantic layers deployed on RDBMS applications
    • Perform data profiling of the source-of-truth applications for Deposits and Lending and create metadata in the Enterprise Repository
    • Prepare Master Data Management rules for the Party and Product domains for the Conformed Vault, including survivorship rules for the Presentation layer design
    • Implement Data Mesh and data domains for Deposits & Lending in Snowflake with Coalesce/Python as ELT tools
  • Citi
    Senior Data Modeler
    Citi Oct 2022 - May 2023
    Irving, Texas, United States
    • Create the Conceptual, Logical and Physical enterprise data models in MagicDraw/ERwin for the existing PBWMT applications and data
    • Perform gap analysis of the existing applications and their data against the new data model deployed on RDBMS applications
    • Provide recommendations and meet with stakeholders for requirements gathering
  • Thermo Fisher Scientific Inc
    Senior Data Architect
    Thermo Fisher Scientific Inc Feb 2022 - Apr 2023
    Carlsbad, California, United States
    • Design the data architecture of the Data Mart application to implement a Delta Lake using Databricks and AWS S3
    • Create and implement the dimensional data model in PowerDesigner for the eBusiness Analytics team to consume online web-click activity data in the Enterprise Data Lakehouse deployed on an RDBMS application
    • Collaborate with the technology leadership team for successful delivery of the program according to the enterprise vision
  • Mass Mutual Life Insurance Company
    Senior Data Modeler
    Mass Mutual Life Insurance Company Apr 2021 - Feb 2022
    Springfield, Massachusetts, United States
    • Interact with business SMEs to gather reporting requirements and translate them into data models in ER/Studio
    • Develop Conceptual, Logical and Physical data models for the data mart implemented in a Vertica Eon database on AWS Cloud
    • Create DDL scripts and physicalize the table and view structures in the database
    • Work closely with ETL developers by developing source-to-target data mapping documents
    • Create a Star-schema dimensional data model for MicroStrategy to connect to and extract reporting data
    • Work closely with Enterprise Solution Architects and external team stakeholders to elicit data modeling requirements
  • MUFG
    Data Quality Lead
    MUFG Mar 2020 - Oct 2021
    California, United States
    • Implement the Data Management Integration Control Framework (DMICF) for all golden sources feeding the Enterprise Data Lake on Cloud
    • Interact with business SMEs for the bank's golden sources to gather and document data quality requirements
    • Review and coordinate the implementation of data quality rules for the various controls of each regulatory report using the data governance tool Collibra
    • Apply DQ rules on the data lake built on Hive tables in a Big Data configuration
    • Understand the data standards of each of the bank's upstream sources and design data quality rules working with the data stewards
  • Highmark Blue Cross Blue Shield Of Western New York
    Senior Data Architect
    Highmark Blue Cross Blue Shield Of Western New York Aug 2019 - Mar 2020
    Buffalo, New York, United States
    • Created the database design on Snowflake for migration of the existing data warehouse
    • Architected the proposal for migrating tape datasets in a Mainframe legacy application to AWS Cloud using the DMS, S3 and Athena services
    • Managed the MDM Provider project requirements elicitation, analysis, modeling and documentation process to capture business and data needs
    • Created and maintained reference data for the EDW Provider data mart
    • Engaged with Provider project teams to communicate the design and develop the Logical and Physical data models
    • Promoted usability data standards and developed high-quality designs using PowerDesigner
    • Performed comparison and gap analysis while sourcing the Pharmacy feed into the warehouse, adhering to HL7 standards
    • Created a data dictionary for reference by the other stakeholders in the project
    • Performed data profiling to measure and analyze data integrity and quality, providing solutions to meet the data integrity and integration requirements
  • Allianz Global Corporate & Specialty (AGCS)
    Reporting Data Architect
    Allianz Global Corporate & Specialty (AGCS) Sep 2018 - Jul 2019
    New York, New York, United States
    • Lead the data analysis work for understanding the PEGA system's underlying data structures in the operational and transactional data marts
    • Coordinate the data extraction from different legacy applications (such as Global Genius) to gather the required data for ISO Reporting delivery
    • Work closely with reporting business users to obtain the functional requirements for capturing the required ISO statistical plans and coverages
    • Ensure optimum storage of front-end application (Pega) data in the operational data store for reporting needs, and coordinate alignment of attributes using XML tag names to the database tables/columns
    • Validate and provide sign-off on the Premium and Losses data feed from AGCS to Verisk for ISO Reporting submission
    • Update the Business Analysts, Cognos Developers and Architects on any changes to the data lineage and sources for tracking purposes
    • Engage with the third-party vendor Verisk for delivery of the extract containing premium and losses data
  • M&T Bank
    Data Warehouse Architect
    M&T Bank Jul 2015 - Sep 2018
    Buffalo, New York, United States
    • Led a team of onshore and offshore Data Analysts working on the CCAR, FHLB, Profitability, LCR and AML programs
    • Worked extensively on documenting Data Analysis requirements and Gap Analysis outcomes in the form of functional specifications and data mapping documents
    • Interacted closely with business counterparts to transform requirements into data structures that can efficiently store, manipulate and retrieve information
    • Carried out data validation and loading of data into testing regions using combinations of Informatica PowerCenter, SQL and Teradata stored procedures
    • Maintained and updated the Enterprise Data Warehouse data model using ER/Studio
    • Worked on data analysis for sourcing new data into the EDW on a Hadoop distribution using the Hue, Hive and Impala tools
  • First Horizon Bank
    Senior Data Modeler
    First Horizon Bank Jul 2013 - Jun 2015
    Memphis, Tennessee, United States
    • Developed Conceptual and normalized Logical models in ERwin and ER/Studio, based on requirements analysis, for an Operational Data Store (ODS) following the Data Vault methodology
    • Created a data dictionary for reference by the other stakeholders in the project
    • Accountable for the design, planning and execution of ETL processes for migrating data from the Mainframe-based Interpose application to the new operational system OTX (Online Transaction Xchange) using IBM DataStage
    • Performed source-system data analysis, including gap analysis, to manage source-to-target data mapping, data profiling and data extraction activities
    • Created the new application database design on the Data Vault approach, using Hubs, Satellites and Links for primary and ancillary information
    • Created and maintained reference data for the new OTX system
  • Green Gold Technologies
    Data Warehouse Lead
    Green Gold Technologies Oct 2012 - Jun 2013
    Melbourne, Victoria, Australia
    • Collected and analyzed data on various BI and ETL products, comparing their strengths and weaknesses
    • Designed and implemented the Logical and Physical data models for the new database based on the ER relationships using ERwin
    • Designed data models for the sales database using the ER/Studio data modeling tool
    • Worked on the OBIEE reporting tool to produce reports from the new database per business requirements
  • ANZ
    Senior Data Warehouse Analyst
    ANZ Nov 2008 - Sep 2012
    Melbourne, Victoria, Australia
    • Interacted with the end-user community to understand business requirements and identify data sources, developing data mappings with transformations to push data to the Enterprise Data Warehouse built on the IBM Banking Data Warehouse (BDW) logical schema
    • Accountable for deliveries across various technology projects such as TRIAD, FICO, CBS, Risk and Basel-related implementations
    • Worked as a Data Modeler to create and maintain warehouse database tables following enterprise modeling patterns and domains using tools like ERwin and IDA
    • Designed and developed ETL processes using IBM Mainframes technologies to load high volumes of historical one-time data from sources such as Oracle, MS SQL, flat files (fixed width) and XML files into the staging database, and from staging into the target Data Warehouse database
  • Ahold Delhaize USA
    Technical Project Lead
    Ahold Delhaize USA Sep 2006 - Nov 2008
    Atlanta, Georgia, United States
    • Extensive hands-on experience designing, developing, testing and delivering software programs in the IBM Mainframes environment
    • Provided production support, including emergency production fixes at Level 1 and 24-hour on-call support
    • Led a small team of mainframe developers to analyze production incidents and coordinated with business users on ticket closures

Parminder S Education Details

Frequently Asked Questions about Parminder S

What company does Parminder S work for?

Parminder S works for Delphine Diagnostics Inc

What is Parminder S's role at the current company?

Parminder S's current role is Senior Data Architect.

What schools did Parminder S attend?

Parminder S attended Harvard Business School Online, Visvesvaraya Technological University.
