Ken Quigley

Ken Quigley Email and Phone Number

Data Architect / Data Modeler @ Macro Consulting
Southlake, TX, US
Ken Quigley's Location
Southlake, Texas, United States

About Ken Quigley

I am a Data Architect with 15 years of experience in data warehousing and in data warehouse architecture, design, and deployment. I enjoy exposure to new technologies and have modeling experience in most database management systems.

Ken Quigley's Current Company Details
Macro Consulting

Data Architect / Data Modeler @ Macro Consulting
Southlake, TX, US
Ken Quigley Work Experience Details
  • Macro Consulting
    Data Architect / Data Modeler
    Macro Consulting Apr 2023 - Present
    Porto, Porto, PT
    • Used Agile processes to design and deploy an Operational Data Store (ODS)
    • Document and translate business requirements into data models and deploy them to the data warehouse
    • Create and update data flow diagrams, requirements, and data models as needed
    • Responsible for ensuring data modeling documentation conforms to modeling and governance standards
  • VetsEZ
    Data Governance Architect
    VetsEZ Sep 2021 - Mar 2023
    San Diego, CA, US
    Data Architect supporting the Department of Veterans Affairs (VA) enterprise-level Community Care DevSecOps project. Security Clearance: Public Trust Tier 4/High Risk BI, adjudicated by the VA (Background Investigation)
    • Analyzed the current-state architecture covering all 31 systems in the Community Care product line, including cloud-based systems and sub-product lines, as well as their components/subsystems
    • Analyzed system interfaces and High-Level Integration Diagrams (HLIDs)
    • Created data flow diagrams based on the HLIDs while communicating gap analysis findings to stakeholders
    • Created Data Management artifacts such as Data Lifecycle Diagrams, Data Migration Diagrams, and Data Flow Diagrams while working on the Data Governance Framework
    • Investigated, designed, and created documents including the Operating Framework, Data Governance (DG) Communications Plan, DG Readiness Plan, Data Quality Policy, Data Security Policy, Disaster Recovery Policy, and Data Retention Policy, and created a Master Data Glossary
    • Ensured that policies and the framework were aligned with VA, VHA, federal, and state mandates and best practices
    • Created a gap analysis of systems that were missing information
    • Created a metadata analysis to show which systems were missing a metadata/data dictionary
    • Socialized the Data Governance framework and policies to the business
  • Macro Consulting
    Data Architect / Data Modeler
    Macro Consulting Mar 2020 - Jul 2021
    Porto, Porto, PT
    • Used an Agile process to deploy three websites
    • Designed the OLTP database for each website
    • Created complete production-support documentation for each website
    • Coordinated design requirements with developers
    • Provided traceability from requirements through implementation
    • Utilized a homegrown ticketing system to repair defects found in each website
    • Design tasks included data modeling and DDL generation (Erwin) for MySQL
    • Some PHP development to let members of the websites view each other’s profile data
    • Search Engine Optimization (SEO), including keyword identification and creating backlinks to the websites
  • BNSF Railway
    Data Architect / Data Modeler
    BNSF Railway Sep 2018 - Mar 2020
    Fort Worth, TX, US
    • Used the Agile process to deploy Program Management Application systems
    • Met with development teams to understand and document requirements
    • Utilized Hackolade v2.4.4, an agile query-driven NoSQL database modeling tool, to generate schemas for Avro, JSON, Cassandra, and MarkLogic
    • Translated business requirements into technical design specifications
    • Provided traceability from user requirement to deployment task
    • Utilized the Remedy ticket system for database trouble ticketing
    • Development tasks included data modeling and DDL generation for DB2 mid-tier, DB2 mainframe, Teradata, and SQL Server databases
    • Modeled XML events and Kafka topics and events, and generated .XSD event files
    • Utilized and modeled JSON Schema draft-04 NoSQL schemas and events
    • Responsible for OLTP modeling of rail yard entry/exit NoSQL events, including switching events, derail events, route configuration, signaling, and device management
    • Responsible for data governance, including naming standards, name definitions, and domain categorization
  • Texas Health Resources
    Data Warehouse Architect / Data Modeler
    Texas Health Resources May 2016 - Sep 2018
    Arlington, TX, US
    • Used an Agile process to deploy the Program Management Portal reporting system and releases
    • Met with business stakeholders to understand and document requirements
    • Translated business requirements into technical design specifications
    • Provided traceability from user requirement to deployment task
    • Evaluated, debugged, and optimized Cognos SQL for developers
    • Project scoping for multiple projects
    • Responsible for the design and delivery of: Reliable Care Blueprinting; a Nursing Dashboard bringing together several data marts, including survey data, industry comparative data, and reliability learning data (i.e., an injury and accident tracking system); the Press Ganey Survey data mart; the Hospital Operations Innovation data mart; and the PeopleSoft Employee data mart
    • Development tasks included the data model, design, and DDL table definitions
    • Developed DML to support a variety of maintenance tables
    • Data migration tasks and script development for table migration through all environments
    • Technologies: Aqua Data Studio; IBM Data Architect; IBM Designer Client (DataStage); IBM Director Client (DataStage); IBM FastTrack (source-to-target data mapping for ETL development); IBM Business Glossary; Netezza, SQL Server, and Oracle (Clarity); HP Quality Center; Cognos 11
    • Methodologies: IBM industry-standard Healthcare Model (third normal form); Inmon and Kimball hybrid design
    • Experience in requirements gathering, data mining, profiling, data modeling, business analysis, technical design, software development, testing components, implementation activities, and support activities
    • Led multiple projects through all phases of database and data warehouse development and BI reporting: gathering business requirements, data model design and implementation, application testing and support, and enterprise reporting solutions
    • Performed a POC to replace InfoSphere Data Architect and provide a more strategic direction for THR
  • CoreLogic
    Data Architect / Data Modeler
    CoreLogic Jul 2007 - Mar 2016
    Irvine, CA, US
    Architecture design and maintenance of CoreLogic’s Enterprise Data Warehouse, a complex multi-tiered implementation encompassing 20 data sources across 6 different DBMSs. The design included 4 architectural layers and a multi-stage data transformation architecture, including the final data marts used to support the MicroStrategy BI and ad hoc query delivery platforms. Designed Extract, Transform, and Load (ETL) processes and pseudocode utilizing 4GL ETL products as well as 3GL languages, including C, C++, and COBOL programs and batch programs (SQL, PL/SQL, stored procedures, and triggers). Designed and delivered enterprise naming standards to both production and data warehouse delivery teams. Designed and delivered a non-proprietary, usable data dictionary solution for management and the business community. Re-engineered production modeling standards and data archiving strategies. Coordinated architectural delivery during two warehouse migrations to three different hosting platforms (IBM Mainframe, OS/400, SUSE Linux). Reverse engineered Cisco’s Unified Contact Manager and both the Chase and Prime Lending loan origination databases. CoreLogic’s Data Warehouse consists of over 5 terabytes of data and 520+ tables, of which 157 are dimensions and 47 are fact tables. Responsible for the design and delivery of an OLTP single-source-of-truth order management system used in the calculation of tax-inclusive closing costs. Technical Environment: hybrid Kimball/Inmon and hub-and-spoke methodologies, Erwin Data Modeler, QMF, MicroStrategy 9.3, Informatica, Oracle Financials, Cisco Contact Manager, DB2, Oracle, ADABAS, Teradata 14, MySQL, SQL Server 2008, Informatica Metadata Manager MDM.
  • Citigroup Assurance Services (CAS), Ft. Worth, Texas
    Data Architect - Contract Position
    Citigroup Assurance Services (CAS), Ft. Worth, Texas Aug 2006 - Aug 2007
    Responsible for architecture definition, implementation, maintenance, and resource planning for the initial Data Warehouse implementation used by this division of Citigroup. In addition, I was responsible for defining naming standards, data quality standards, and modeling standards, as well as the data ownership methodology, metadata maintenance, and reporting standards. This project utilized the Kimball methodology of star-schema dimensional modeling to represent credit card and financial insurance transactions.
  • 7-Eleven
    Data Architect - Contract Position
    7-Eleven May 2005 - Aug 2006
    Irving, TX, US
    Responsible for maintenance of the corporate “Information Systems Application Summary,” a metadata repository for all 7-Eleven production systems. Re-engineered 7-Eleven’s enterprise information flow by system into a more organized and understandable representation of the company. Designed and implemented a Master Data Management (MDM) system for enterprise vendor suppliers. The MDM project was based on integrating corporate vendor data into an industry-standard retail data model known as ARTS (Association for Retail Technology Standards).
  • UICI Student Insurance Division
    Data Architect - Contract Position
    UICI Student Insurance Division Sep 2002 - Jul 2004
    Responsible for the design and implementation of a premium and claims processing Data Warehouse. The architecture incorporated 4 operational data sources. Data modeling included conceptual, logical, and physical design. The design incorporated an efficient, HIPAA-compliant surrogate key cross-reference schema supporting slowly changing Type 3 dimensions. Responsibilities also included developing pseudocode for all ETL processing, creating multiple OLAP cubes for the various claims and premium processing departments, and delivering the bid subject area through which new business was won.
  • Sandstream Communications
    Director Of Management Information Systems
    Sandstream Communications Mar 2000 - May 2002
    Responsible for creating and implementing business and data models for all functional areas utilizing ERwin, UML, and a hybrid Kimball/Inmon methodology. Responsible for six operational data sources. Models included conceptual, logical, and physical data architecture; customer event correlation and data capture; identification of cross-functional dependencies in billing, CRM, and order management; middleware publish/subscribe data identification; reporting; and data warehousing.

Ken Quigley Skills

Data Modeling, Data Warehousing, ETL, Oracle, Databases, Architecture, Integration, Business Intelligence, Microsoft SQL Server, ERwin, SDLC, Dimensional Modeling, PL/SQL, DB2, CRM, Architectures, Business Analysis, Enterprise Software, Requirements Analysis, Enterprise Architecture, Master Data Management, Teradata, Data Integration, Database Design, Data Quality, Requirements Gathering, ERP, Solution Architecture, Agile Methodologies, Informatica, Data Architecture, Cross-Functional Team Leadership, SSIS, Data Governance, Data Migration, SSRS, Stored Procedures, SOA, Data Warehouse Architecture, MicroStrategy, Visio, SQL, OLAP, Management, Business Requirements

Ken Quigley Education Details

  • University of North Texas
    Computer Science and Business Administration
  • College of DuPage
    Music Theory and Composition

Frequently Asked Questions about Ken Quigley

What company does Ken Quigley work for?

Ken Quigley works for Macro Consulting

What is Ken Quigley's role at the current company?

Ken Quigley's current role is Data Architect / Data Modeler at Macro Consulting.

What is Ken Quigley's email address?

Ken Quigley's email address is ke****@****nsf.com

What schools did Ken Quigley attend?

Ken Quigley attended the University of North Texas and the College of DuPage.

What skills is Ken Quigley known for?

Ken Quigley has skills like Data Modeling, Data Warehousing, ETL, Oracle, Databases, Architecture, Integration, Business Intelligence, Microsoft SQL Server, ERwin, SDLC, and Dimensional Modeling.
