Minh Ha began his practice in data warehousing, data analysis, data analytics, business intelligence, web-related technology, and Big Data with Hewlett Packard in Cupertino, California, in June 1999. He has been accountable for many M&A initiatives through engagements with Accenture Consulting Services: Washington Mutual with JPMorgan Chase, BNP Paribas with Bank of the West in California, BMO with Harris Bank in Chicago via IBM Global Business Services, and P&C insurance work. He has done extensive consulting engagements in banking, retail, insurance, and telecommunications, building on-premise, cloud, and hybrid architectures.
Judicial Council Of California
- Website: courts.ca.gov
- Employees: 896

Senior Big Data Integration and Analytics, Judicial Council Of California, United States
Senior Data Warehouse Architect, Cigna Healthcare
Jun 2023 - Present | Bloomfield, CT, US
Collaborate closely with data scientists, data stewards, and product owners in collecting requirements and identifying sources for data acquisition and enrichment business processes, delivering user stories every sprint in an Agile Scrum environment. Expand and optimize data and data pipeline architecture, and consolidate data flow and collection for cross-functional teams in compliance with the architectural governance and best-practice standards of the CIGNA Enterprise Data Hub. Perform data model design, set database development standards, and implement and manage data warehouses and data analytics systems in accordance with lines of business. Maintain architecture governance, technology standards processes, and reference architecture development.
Senior Big Data Integration and Analytics, Judicial Council Of California
Jun 2022 - Jun 2023 | San Francisco, California, US
Build the Judicial Council of California data platform for onboarding the ePortal across 58 counties statewide. The JCC's data platform enables e-filing of complaints and petitions for all litigation types: family, criminal, civil, and others. The judicial branch analyzes, uses, and shares data to inform decision-making and to enhance and expand vital, accessible services for all the people of the state of California. Analyze technical and business problems and improvement opportunities, and implement techniques, processes, and tools as solutions. Execute tasks from schedules, producing deliverables, meeting milestones, and following timelines for assigned projects. Review court data validation reports and manage issue resolution. Lead coordination with counties, JCC business partners, and vendors on data validation matters, and address and resolve business questions or issues related to testing and quality management. Revise and document artifacts such as flow charts, process flow diagrams, and data flow diagrams via Jira and Confluence, using Microsoft 365, Visio, and DBeaver.
Senior Data Architect, Evernorth Health Services
Nov 2021 - Jun 2022 | St. Louis, US
Collaborate closely with data scientists, data stewards, and product owners in collecting requirements and identifying sources for data acquisition and enrichment business processes, delivering user stories every sprint in an Agile Scrum environment. Expand and optimize data and data pipeline architecture, and consolidate data flow and collection for cross-functional teams in compliance with the architectural governance and best-practice standards of the CIGNA Enterprise Data Hub. Perform data model design, set database development standards, and implement and manage data warehouses and data analytics systems in accordance with lines of business. Maintain architecture governance, technology standards processes, and reference architecture development.
Big Data Solution Architect, Intact
Nov 2020 - Oct 2021 | Toronto, Ontario, CA
Review and assess existing business processes in order to migrate them onto a distributed enterprise data platform using Cloudera. This strategic initiative encompasses moving existing business solutions to event streaming, data lineage and usage, data storage, modeling, ETL, batch file processing and transfer, data ingestion, and visualization, reporting, and analytics. The enterprise data platform meets IFRS-17 financial regulation, the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and FHI, among others. Lead Intact Financial Corporation in migrating its legacy AS400 data to Guidewire Policy/Claims/Billing Center on its enterprise data platform. Engage with product owners, architects, report developers, business analysts, and business partners to understand business requirements and develop effective database solutions that ensure reliability, quality, security, and scalability, and that meet performance metrics and SLA/SLO requirements.
Big Data Solution Architect, TD Securities
Jan 2019 - Nov 2020 | Toronto, Ontario, CA
Lead the data foundation and migration of Guidewire Policy, Claim, and Billing from legacy IBM AS400 to Microsoft Azure using Databricks and Data Factory, replicating storage, processes, and more. Perform migration of legacy data to Guidewire via cloud Snowflake with the Talend data platform. Address production incidents related to monthly, quarterly, and annual regulatory reports. Develop new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications. Implement consolidated business processes on an enterprise data platform enabling 360-degree visibility of clientele after large-scale merger and acquisition activity involving overseas entities in Far East Asia and the Middle East. Perform data governance and stewardship, in alignment with TD Enterprise Architecture, within the Hadoop data foundation across business layers: a consumption layer for reporting, dashboards, and visualizations; descriptive analytics that divides customers into segments and profiles profitable customers; predictive analytics on future customer profitability, scoring models, and outcomes such as churn and fraud; and prescriptive analytics covering optimization problems and what-if scenarios.
Big Data Consultant, BNP Paribas Asset Management
Apr 2016 - Dec 2018 | Nanterre, Île-de-France, FR
Worked to close the mandate gaps between FINRA, which regulates brokerage firms and stockbrokers, and SEC regulation, which focuses on individual investors, in the areas of data governance and stewardship. Inventoried file systems, archiving metadata across several layers of sources in accordance with Federal Reserve Bank financial regulations, periodic reporting, SOX, AML, and the like. Designed ETL data pipelines for transactional and non-transactional data through five stages of analysis and insight: acquire, prepare, analyze, validate, and operationalize. Discovered and cataloged metadata about enterprise data stores in a central catalog that processes semi-structured data such as clickstream or process logs. Populated the AWS Glue Data Catalog with table definitions from scheduled crawler programs that infer the schema, format, and data types of the data; this metadata is stored as tables in the AWS Glue Data Catalog and used when authoring ETL jobs. Generated ETL scripts to transform, flatten, and enrich enterprise data sets from source to target.
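The five-stage pipeline named above (acquire, prepare, analyze, validate, operationalize) can be sketched as a chain of plain functions. This is a minimal illustration only; the record fields, the negative-amount validation rule, and the output format are hypothetical, not BNP Paribas specifics:

```python
# Minimal sketch of the five-stage ETL pipeline:
# acquire -> prepare -> analyze -> validate -> operationalize.
# Field names and the validation rule are illustrative assumptions.

def acquire():
    # Stand-in for pulling raw transactional records from a source system.
    return [{"id": "t1", "amount": "100.50"}, {"id": "t2", "amount": "-3.00"}]

def prepare(rows):
    # Normalize types so downstream stages see consistent data.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def validate(rows):
    # Reject records with negative amounts (hypothetical business rule).
    return [r for r in rows if r["amount"] >= 0]

def analyze(rows):
    # Derive a simple batch-level insight: record count and total.
    return {"count": len(rows), "total": sum(r["amount"] for r in rows)}

def operationalize(summary):
    # Stand-in for publishing the result to a target such as a data mart.
    return f"loaded batch: {summary['count']} rows, total {summary['total']}"

clean = validate(prepare(acquire()))
print(operationalize(analyze(clean)))  # one valid record survives validation
```

Each stage consumes the previous stage's output, so any stage can be swapped out (for example, replacing `acquire` with a database reader) without touching the rest of the chain.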
Big Data Consultant, IBM iX
Jan 2014 - Mar 2016 | Armonk, NY, US
Implement and evolve existing models to address new business needs, including the data dictionary for mission-critical enterprise work: FATCA, AML, CM4, online credit transactions, and other regulations of the same nature. Inventory file systems, archiving metadata across several layers of sources in accordance with Federal Reserve Bank financial regulations, periodic reporting, SOX, AML, and the like. Supported BMO Harris Bank on construction of the enterprise data warehouse IDP (Integrated Data Platform), which ingests a variety of sources and data formats, then transforms and loads terabytes of legacy COBOL files through ETL/ELT processes using Hadoop ecosystems. Lead the full development cycle of Big Data solutions, including architecting data acquisition (into a data lake or data platform), standardization, validation, and visualization, using Hadoop ecosystem tools such as HiveQL and Spark via Scala and Python, running in batch mode and in real time.
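Loading legacy COBOL files, as described above, typically starts with parsing fixed-width records before they enter the ETL/ELT flow. A minimal sketch follows; the three-field layout and the implied-decimal convention are hypothetical examples, since the real copybooks are not given in the source:

```python
# Parse a fixed-width record of the kind exported from legacy COBOL
# systems. The layout tuples (name, start, length) are a hypothetical
# example; real layouts come from the COBOL copybook.
LAYOUT = [("account_id", 0, 8), ("currency", 8, 3), ("balance", 11, 10)]

def parse_record(line):
    row = {}
    for name, start, length in LAYOUT:
        row[name] = line[start:start + length].strip()
    # Numeric fields arrive as zero-padded text with two implied
    # decimal places (a common COBOL convention); convert explicitly.
    row["balance"] = int(row["balance"]) / 100
    return row

record = "AC100234USD0000012345"
print(parse_record(record))
```

In a real pipeline this per-record function would be applied in parallel (for example, as a Spark map over the raw file) rather than line by line in a driver loop.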
Data Warehouse/ETL Architect, Reitmans Canada Ltée/Ltd
Apr 2011 - Dec 2013 | Montreal, Quebec, CA
Develop and implement ETL processes for Reitmans' RETEK v4.0 real-time database, the corporate-wide supply chain management system covering online ordering, online order fulfillment, merchandise manufacturing in Far East Asia, and the final phase of distribution to nationwide stores across Canada. This initiative enabled, first, decommissioning the obsolete RETEK v3.0; second, reducing order-related errors; and third, zero inventory through successful just-in-time inventory. Those achievements were possible because of improvements in three (3) critical areas: 1) eliminating the redundant RETEK v3.0; 2) enabling real-time supply chain management that permits on-time corrective measures; 3) streamlining the RETEK v4.0 EDW, which interacts closely with the third-party supply chain vendors Manhattan Associates and Channel Intelligence via JMS queues and XML files.
Data Architect, National Bank Of Canada Financial Markets - London
Apr 2010 - Apr 2011
Apply the three pillars of Basel II: 1) minimum capital requirements on credit, operational, and market risk; 2) capital adequacy assessments on systemic, pension, concentration, strategic, liquidity, and legal risks; 3) market discipline that facilitates dissemination of information to investors, analysts, customers, other banks, and rating agencies, defining good corporate governance. Developed and implemented business processes to automatically reconcile OSFI reports before and after applying the Basel banking regulatory measures. This initiative reduced credit risk configuration across various sources from days to hours, which 1) enabled several iterations of end-to-end testing and data reconciliation; 2) eliminated errors arising from manual visual inspection; 3) delivered accurate OSFI reports on time.
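Automated before/after reconciliation of the kind described above amounts to a field-by-field comparison with a tolerance. A minimal sketch, in which the report fields and the tolerance value are illustrative assumptions rather than actual OSFI report contents:

```python
# Reconcile two versions of a report (before and after applying
# regulatory measures) and flag any field whose change exceeds a
# tolerance. Field names and tolerance are illustrative assumptions.

def reconcile(before, after, tolerance=0.01):
    discrepancies = {}
    for field in before:
        delta = abs(before[field] - after.get(field, 0.0))
        if delta > tolerance:
            discrepancies[field] = delta
    return discrepancies

before = {"credit_risk": 1200.00, "market_risk": 310.50}
after = {"credit_risk": 1200.00, "market_risk": 315.00}
print(reconcile(before, after))  # only market_risk moved beyond tolerance
```

Running such a check after every configuration change is what makes several end-to-end iterations per day feasible, replacing manual visual inspection of the reports.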
Data Warehouse/ETL Architect, La Capitale Insurance And Financial Services
Sep 2008 - Mar 2010 | Québec, Québec, CA
Develop and implement ETL processes for La Capitale's CCIC (Centre de Compétence d'Intelligence d'Affaires), which merged two (2) large, highly data-distributed organizations with frequent or large-scale merger and acquisition activity. This initiative reduced IT resource costs by twenty percent (20%) and provided fifteen percent (15%) more business value to the Underwriting department, because accurate and reliable data was made available to the actuarial group for designing insurance premiums that fit existing and newly enlisted customers' profiles. Those achievements were possible because of improvements in three (3) critical areas: 1) eliminating redundant Oracle data-loading layers, enabling a direct load into the Netezza server; 2) providing faster query response times to support the business; 3) providing online insurance premium quotes on home, car, and motorcycle coverage, plus life/travel.
Data Warehouse Architect, Chase (Formerly Washington Mutual)
Apr 2001 - Jun 2008
Implement an Agile framework for the WaMu data warehouse to address business issues related to loan compensation and initiatives for the Home Loan Center and its affiliates, working through major and independent mortgage brokers. Apply Basel I minimum capital requirements on credit to assess Washington Mutual's stress resistance to its billions of dollars of sub-prime exposure, and implement corrective measures to gradually mitigate its overexposure to a toxic mortgage portfolio. Collaborate with FINRA in providing data references and data feeds that confirm in real time whether a person or firm is registered, as required by law, to sell securities (stocks, bonds, mutual funds, and more), offer investment advice, or both. Support projects throughout the entire life cycle, from initial ROM estimates and impact analysis through ongoing support to successful, on-time completion. Deploy data marts compliant with SOX and other financial regulatory standards.
Senior Software Engineer, Hewlett Packard Enterprise
Jun 1999 - Mar 2001 | Houston, Texas, US
Develop and implement business solution processes that maximize customer satisfaction with on-site field engineer visits for resolving software and hardware issues. The Global Call Center Support Center possesses a knowledge base with pattern recognition and advanced predictive algorithms that can give field engineers a list of possible fixes based on incident logs and how incidents were previously addressed. This critical HP enterprise CRM application improved HP on-site services by thirty percent (30%) through ongoing improvements in three (3) critical areas: 1) providing adaptive training to call center personnel so that they can capture the real root of a problem when logging an incident; 2) generating key metrics for grading field engineers, determining which additional training they need and at which assignments they would excel; 3) delivering key metrics to the Sales and CRM departments for devising appropriate contractual maintenance services after sales and/or on contract renewal.
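Matching incident logs against a knowledge base of prior fixes, as the paragraph above describes, can be sketched as simple keyword overlap between an incident and each catalogued fix. The knowledge-base entries and scoring rule below are toy assumptions, not HP's actual pattern-recognition system:

```python
import re

# Toy knowledge base mapping a known fix to the keywords that tend to
# appear in incidents it resolves. Entries are illustrative assumptions.
KNOWN_FIXES = {
    "reseat memory module": {"boot", "beep", "memory"},
    "replace power supply": {"power", "dead", "fan"},
}

def suggest_fixes(incident_text):
    # Tokenize the incident log, ignoring punctuation and case.
    words = set(re.findall(r"[a-z]+", incident_text.lower()))
    # Rank fixes by how many keywords they share with the incident.
    scored = [(len(words & kw), fix) for fix, kw in KNOWN_FIXES.items()]
    return [fix for score, fix in sorted(scored, reverse=True) if score > 0]

print(suggest_fixes("server dead, no power, fan not spinning"))
```

A production system would weight keywords and learn from resolution outcomes, but the ranking-by-overlap shape is the same.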
Minh Ha Education Details
- McGill University: International Finance
- UQAM | Université du Québec à Montréal: Securities Law
Frequently Asked Questions about Minh Ha
What company does Minh Ha work for?
Minh Ha works for the Judicial Council Of California.
What is Minh Ha's role at the current company?
Minh Ha's current role is Senior Big Data Integration and Analytics.
What is Minh Ha's email address?
Minh Ha's email address is minh.ha@ca.gov
What schools did Minh Ha attend?
Minh Ha attended McGill University and UQAM | Université du Québec à Montréal.