Norbert Kremer

Norbert Kremer Email and Phone Number

Google Cloud Solution Architect and AI Engineer @ Turnberry Solutions
Melrose, MA, US
Norbert Kremer's Location
Melrose, Massachusetts, United States
Norbert Kremer's Contact Details

Norbert Kremer personal email

n/a
About Norbert Kremer

REQUEST A FINOPS ASSESSMENT (See Link Above)

It is quite common for enterprises to spend more in the cloud than they had budgeted. The reasons for this are varied, and the onset may be gradual or sudden. The goal of cloud FinOps is to align cloud spend with business value, using a collaborative approach.

FinOps adoption may take many forms, depending on the degree of cultural change required, the size of the enterprise, and the amount of cloud spend. Smaller enterprises may not need a dedicated FinOps team; the FinOps function can be divided across teams. Larger enterprises will have a separate FinOps team, perhaps as part of a Cloud Center of Excellence.

An assessment by trained professionals can help to develop the best plan for adopting FinOps. Typically, immediate savings can be achieved within weeks of starting. While these tactical cost reductions are being achieved, a plan for more strategic efficiency engineering may be developed. This might include improving resource tagging, starting a showback or chargeback program, and re-architecting applications to use cloud-native design patterns.

Analytics By Design consultants have decades of experience in data engineering and data warehouse design and build. Our consultants have been working in the cloud for 7 years and hold FOCE and FOCP certifications from the FinOps Foundation as well as multiple Google Cloud certifications.
We can help enterprises align their cloud spend with business value, especially in the area of cloud data platforms, cloud data lakes, and data engineering for analytics, BI, ML, and AI.

CONSULTING SERVICES
- FinOps Maturity Assessment
- FinOps team building
- Getting started with FinOps - building tools
- FinOps tool selection, tagging strategy, PoC
- FinOps for Cloud Data Platform (DataFinOps, Data FinOps, ML FinOps, AI FinOps)
- Build Cloud Center of Excellence
- Strategic planning of application and database migration to cloud
- Cloud data platform selection
- Multi-cloud and hybrid architectures
- Training and coaching on Google Cloud and FinOps

EXPERTISE
- FOCE FinOps Certified Engineer
- FOCP FinOps Certified Practitioner
- Google Champion Innovator (Data Analytics)
- Google Certified Professional Database Engineer
- Certified Professional in Health Information & Management Systems (CPHIMS)
- OMOP Common Data Model; OHDSI tools to generate evidence (RWE) from real-world data (RWD)
- Cost optimization for analytic data platforms: Snowflake, Redshift, BigQuery, Databricks, Teradata
- Cloud Data Warehouse, Data Lake, Data Vault, Data Mesh
- Efficiency engineering, value engineering
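As a concrete illustration of the showback idea mentioned above, spend can be grouped by a team label from billing-export line items. This is a minimal, hypothetical sketch: the record layout is a simplified stand-in for a real cloud billing export, not the actual schema.

```python
from collections import defaultdict

def showback(records):
    """Aggregate cost per team label; untagged spend is surfaced separately.

    Each record is a dict with 'cost' (float) and 'labels' (a dict of
    key -> value) -- a simplified stand-in for a billing-export schema.
    """
    totals = defaultdict(float)
    for rec in records:
        team = rec.get("labels", {}).get("team", "UNTAGGED")
        totals[team] += rec["cost"]
    return dict(totals)

billing = [
    {"cost": 120.0, "labels": {"team": "data-eng"}},
    {"cost": 35.5,  "labels": {"team": "ml"}},
    {"cost": 14.5,  "labels": {}},              # untagged: a tagging-strategy gap
    {"cost": 80.0,  "labels": {"team": "data-eng"}},
]
print(showback(billing))  # {'data-eng': 200.0, 'ml': 35.5, 'UNTAGGED': 14.5}
```

The untagged bucket is the useful part in practice: its size is a direct measure of how far a tagging strategy still has to go.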

Norbert Kremer's Current Company Details
Turnberry Solutions

Google Cloud Solution Architect and AI Engineer
Melrose, MA, US
Norbert Kremer Work Experience Details
  • Turnberry Solutions
    Google Cloud Solution Architect and AI Engineer
    Turnberry Solutions
    Melrose, MA, US
  • Analytics By Design
    FinOps Consultant & Cloud Solution Architect
    Analytics By Design Apr 2013 - Present
    Roles: Independent Consultant; FinOps Consultant; FinOps Team Lead; Google Cloud Champion Innovator (Data Analytics); Data Engineer and Database Engineer (performance and scalability of large data warehouses and data lakes); Cloud Solution Architect; Freelance Consultant.
    Focus areas: FinOps Assessment and FinOps Maturity Assessment; cloud economics, cloud optimization, and cloud spend analytics; Data FinOps (DataFinOps), AI FinOps (AIFinOps), and ML FinOps (MLFinOps).
    Analytics By Design is a consulting company dedicated to helping clients build cost-effective analytics solutions in the cloud. Consulting services include strategic advising, coaching, training, architectural planning, solution design, and hands-on coding. The importance of aligning analytics solutions to business value is observed at all stages of the life cycle. Special emphasis is placed on FinOps. Close links to the FinOps community of practitioners allow an understanding of best practices. These are applied to the unique circumstances of each enterprise, depending on its current state of FinOps maturity. FinOps practice requires a collaborative mindset, with continuous improvement achieved through an iterative approach.
    Google Cloud expertise: Cloud Solution Architect; BigQuery, BigLake, Bigtable, TensorFlow, Dataproc, Dataflow, Datalab, Data Studio, Dataprep, Spark, IAM, Looker, Vertex AI, and ML; FHIR, DICOM, and HL7 data with the Google Healthcare API; Data Mesh; BigQuery data lakes.
    Cloud cost monitoring and optimization (CCMO) tools: Ternary, Finout, Harness, Hystax OptScale, Gathr, Flexera, Apptio Cloudability, Turbonomic, CloudHealth.
    Data FinOps for BigQuery, Snowflake, Redshift, and Databricks.
  • TDWI
    Faculty Member
    TDWI Nov 2020 - Present
    Renton, WA, US
    - Instructor at TDWI Transform 24 conferences.
    - Co-author, with Richard Winter, of "Modern Data Warehouse Platforms and Architectures for the Cloud" and other presentations at TDWI conferences in Orlando, Chicago, San Diego, and Las Vegas. Topics covered: cloud data platform architecture and performance characteristics; strategy for selecting the most appropriate data platform; multi-cloud architecture; cloud cost management (FinOps).
    - Author of "Efficiency Engineering for Cloud Analytics and AI: FinOps and More".
    - Presenter at Modern Data Leader Summits.
  • FinOps Foundation
    Community Member, FOCE & FOCP
    FinOps Foundation Oct 2022 - Present
    San Francisco, CA, US
    FinOps Certified Engineer (FOCE); FinOps Certified Practitioner (FOCP); FinOps Foundation community member; member of the Cloud FinOps New England Group (meetup).
  • WinterCorp
    Director, WinterCorp Cloud Data Warehouse Lab
    WinterCorp Jul 2019 - Present
    Tyngsboro, Massachusetts, US
    The WinterCorp Cloud Data Warehouse Lab conducts proofs-of-concept and benchmarking tests of cloud data warehouses. These are driven by our own R&D as well as engagements on behalf of enterprise clients. Many of the benchmarking tests that we have run in the past related to scalability issues. We often get called in to client engagements when cloud data platforms run into limitations that are inherent in the system architecture. Increasingly, cost per query is becoming more of a concern than absolute scalability limits, so we are becoming FinOps consultants in addition to cloud data platform experts.
  • Medical Device Manufacturer
    FinOps Consultant & Google Cloud Architect
    Medical Device Manufacturer Oct 2023 - Jan 2024
    A medical device manufacturer had transitioned its IT infrastructure to Google Cloud. An assessment of cloud spending patterns revealed significant opportunities for optimization. Our team made an independent evaluation, using a top-down, enterprise-wide approach to identify and quantify the areas where the greatest savings could be achieved. We used the billing data and the recommendations data in BigQuery for reporting, and developed custom tools, using Python and Terraform, to deactivate idle resources and right-size virtual machine instances. In the next phase of the project, we will help the client with FinOps tool selection and help educate the Engineering, Finance, and Business teams on the value of the collaborative approach recommended by the FinOps Foundation.
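The idle-resource cleanup described in this engagement can be sketched in miniature. The field names and thresholds below are hypothetical; a real tool would read utilization metrics from the provider's monitoring API before deactivating anything.

```python
def find_idle(instances, cpu_threshold=0.05, min_days=14):
    """Flag VM instances whose average CPU stays under a threshold for a
    sustained window -- candidates for deactivation.

    `instances` is a list of dicts with hypothetical fields 'name',
    'avg_cpu' (0..1), and 'idle_days'; real data would come from the
    cloud provider's monitoring metrics.
    """
    return [
        inst["name"]
        for inst in instances
        if inst["avg_cpu"] < cpu_threshold and inst["idle_days"] >= min_days
    ]

fleet = [
    {"name": "batch-worker-1", "avg_cpu": 0.02, "idle_days": 30},
    {"name": "web-frontend",   "avg_cpu": 0.40, "idle_days": 0},
    {"name": "old-poc-vm",     "avg_cpu": 0.01, "idle_days": 90},
]
print(find_idle(fleet))  # ['batch-worker-1', 'old-poc-vm']
```

A downstream Terraform step would then stop or delete the flagged instances; keeping detection and deactivation separate makes the tool safe to run in report-only mode first.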
  • Arrakis Therapeutics
    Google Cloud Solution Architect
    Arrakis Therapeutics Mar 2023 - Jul 2023
    Waltham, MA, US
    Arrakis Therapeutics is a biopharmaceutical company developing drugs for neurological disorders and other diseases. Arrakis uses a systematic approach to the purposeful discovery of RNA-targeting small-molecule drugs. The company uses a hybrid multi-cloud architecture that allows selection of the best tools from each cloud. We were engaged to implement the "Google Cloud setup checklist." A thorough assessment of the Google Cloud compute resources, networking, and security was made. We worked with Arrakis to group projects into folders, create IAM groups, and add shared VPC networks to meet the needs of the computational biologists. Using a FinOps approach to align cloud costs with business value resulted in a substantial reduction of cloud spend.
  • Wayfair
    Google Cloud Solution Architect & FinOps Consultant
    Wayfair Jan 2021 - Feb 2023
    Boston, MA, US
    Wayfair is an e-commerce company that sells furniture and home goods. More than 30 million customers in the USA and Europe choose from 22 million items provided by 16,000 global suppliers. The culture at Wayfair emphasizes engineering excellence. The atmosphere is very collaborative, with engineering teams working closely with product managers in an Agile methodology. Wayfair began to move e-commerce and analytics workloads to Google Cloud Platform in 2019. GCP provides elasticity to handle peak periods such as Black Friday and Way Day. Data processing flexibility is provided by Pub/Sub, Dataproc, Dataflow, Airflow, and BigQuery. Data engineering teams are organized as high-velocity atomic teams that manage their own data models and code. Major achievements of our team include: 1) migration of a large data warehouse from an MPP database system to GCP's serverless BigQuery; 2) development of a curated data layer using nested data in BigQuery to support new Looker reports; 3) extension of AtScale cubes for use with Excel pivot tables.
    I recently joined the FinTech team, where I've been working on a SOX-compliant method to deploy database objects to BigQuery. This solution uses Terraform, Jenkins, Git, Buildkite, HashiCorp Vault, and many YAML files, tied together with Bash scripts. I'm also a member of the FinTech Infrastructure group, where we discuss solution architecture related to the upcoming SAP implementation, as well as cloud cost management approaches including FinOps.
  • Troposphere Technologies
    Data Engineer & Solution Architect on Google Cloud
    Troposphere Technologies May 2019 - Jun 2022
    Dover, Delaware, US
    I am a partner at Troposphere Technologies, LLC, a Google Cloud Partner with deep expertise in:
    - Cloud data warehouses: design and build a new DW, or migrate an existing DW to Google BigQuery.
    - Relational and NoSQL database migration to GCP; we work with all GCP database and storage products.
    - Batch and stream data processing pipelines: design and build new pipelines using Dataflow (Apache Beam) or Dataproc (Hadoop or Spark), or migrate existing pipelines to GCP.
    - Multi-cloud architecture with Anthos and Istio service mesh; workload containerization and orchestration with Kubernetes.
  • ROI Training
    Google Authorized Trainer
    ROI Training Mar 2019 - Dec 2021
    I am a contract trainer for ROI Training. I deliver training in the following areas: Data Engineering on Google Cloud Platform, Architecting with Google Cloud Platform, and Google Cloud Certification Workshops.
  • Data Carpentry
    Workshop Instructor
    Data Carpentry Dec 2017 - Dec 2020
    Training researchers in core data skills for efficient, shareable, and reproducible research practices.
  • Healthix
    Healthcare Analytics Solution Architect
    Healthix Nov 2019 - Sep 2020
    St. James, New York, US
    Healthix is the largest public Health Information Exchange (HIE) in the USA, serving New York City and Long Island. Healthix collects data from more than 8,000 healthcare facilities for over 20 million patients.
    - Design and build a data warehouse on MS SQL Server to integrate data from 15 source database instances.
    - Develop Python 3 scripts to extract billions of patient records from InterSystems HealthShare Caché (IRIS) databases. Evaluate ODBC, JDBC, and InterSystems Python-binding database drivers, and use parallel extraction techniques to improve extract performance by 80 times.
    - Work with the data science team to find and remediate data quality issues.
    - Discover COVID-19 patient records using LOINC codes for lab results and clinical notes.
    - Develop processes, per HIPAA Safe Harbor rules, to de-identify patient records (remove PHI) for incorporation into reports.
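The parallel-extraction technique mentioned above (splitting a large table's key space into ranges and fetching the ranges concurrently) can be sketched with the standard library. The `extract_range` worker here is a stand-in for a per-range database query; everything else about the sketch is generic.

```python
from concurrent.futures import ThreadPoolExecutor

def extract_range(lo, hi):
    """Stand-in for one worker's ODBC/JDBC query over an ID range,
    e.g. SELECT ... WHERE patient_id >= lo AND patient_id < hi."""
    return list(range(lo, hi))  # a real worker would return fetched rows

def parallel_extract(total_rows, workers=8):
    """Split the key space into equal ranges and extract them concurrently.
    The speedup described above came from running many such range queries
    in parallel rather than one serial full scan."""
    step = -(-total_rows // workers)  # ceiling division
    ranges = [(lo, min(lo + step, total_rows)) for lo in range(0, total_rows, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = pool.map(lambda r: extract_range(*r), ranges)
    return [row for chunk in chunks for row in chunk]

rows = parallel_extract(1000, workers=8)
print(len(rows))  # 1000
```

`pool.map` returns chunks in range order, so the concatenated result preserves key order; with a real driver, each worker would hold its own connection, since most drivers are not safe to share across threads.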
  • Vertex Pharmaceuticals
    Cloud Data Engineer, Real World Evidence (Consultant via Bprescient)
    Vertex Pharmaceuticals May 2019 - Oct 2019
    Boston, MA, US
    Vertex Pharmaceuticals is a highly innovative company and a leader in cystic fibrosis therapies. The company is actively developing drugs for the treatment of pain and is partnering with CRISPR Therapeutics to develop a treatment for beta-thalassemia that will use gene editing.
    - Design and develop an AWS Redshift database to manage Real World Evidence data.
    - Build a reproducible loading process using Python 3 to ingest multi-terabyte claims data sets from IQVIA (PharMetrics) and Premier.
    - Work with the Vertex infrastructure team to develop AWS cloud best practices around data management at scale.
    - Work with HEOR experts and data scientists to reduce SQL query times from hours to minutes.
    - Work with vendors to load claims data sets in OMOP Common Data Model format.
    - Install OHDSI Atlas tools; provide training to team members on the use of SQL, R, Spotfire, and OHDSI.
  • DAMA New England
    VP Programs
    DAMA New England Jul 2017 - Dec 2018
    We seek presenters for DAMA events around New England. Various presentation styles are encouraged from traditional lecture to interactive workshop. We are also building a mentorship program.
  • Pepkor IT
    Google Cloud Platform Data Engineer
    Pepkor IT Jul 2018 - Sep 2018
    Bellville, Western Cape, ZA
    Design and build a data lake on Google Cloud Platform. External streaming data is buffered with Cloud Pub/Sub. Incoming batch files are landed in Google Cloud Storage. Batch and streaming data sources are processed with a single code base in Cloud Dataflow using the Apache Beam unified programming model. After validation and cleansing, data flows into BigQuery. Use of Cloud Functions to trigger Cloud Composer jobs was evaluated. Stackdriver is used throughout for logging and monitoring.
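The "single code base for batch and streaming" pattern that Apache Beam provides can be illustrated in plain Python: write transforms against iterables and feed them either a bounded batch or an unbounded stream. This sketch only mimics the idea; Beam's PCollections, windowing, and runners are not shown, and the record fields are invented.

```python
def validate(records):
    """Drop records with a missing amount; shared by both paths."""
    for rec in records:
        if rec.get("amount") is not None:
            yield rec

def to_rows(records):
    """Shape records for the warehouse table; shared by both paths."""
    for rec in records:
        yield {"id": rec["id"], "amount": float(rec["amount"])}

def pipeline(source):
    # The same transform chain runs over a bounded batch file
    # or an unbounded Pub/Sub-style message feed.
    return to_rows(validate(source))

batch = [{"id": 1, "amount": "9.5"}, {"id": 2, "amount": None}]
stream = iter([{"id": 3, "amount": "1.0"}])  # stand-in for a message feed
print(list(pipeline(batch)))   # [{'id': 1, 'amount': 9.5}]
print(next(pipeline(stream)))  # {'id': 3, 'amount': 1.0}
```

Because the transforms are lazy generators, the same code drains a finite list to completion or consumes an endless feed one message at a time, which is the essence of Beam's unified model.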
  • OM1, Inc.
    Data Warehouse Architect
    OM1, Inc. Jan 2017 - Mar 2017
    OM1 researchers required a Cohort Selection data mart to discover patient populations fitting diagnosis, treatment, and demographic criteria.
    • Designed a star schema with multi-attribute dimensions and bridge tables, using the Aqua Data Studio modeler
    • Profiled and cleansed source data for consistent results; populated the star schema using SQL and Python
    • Tuned Redshift database structures to achieve a 10-second query performance goal (250 million patients)
    • Created Tableau Desktop dashboards to allow selection of patient cohorts
  • State Street Global Advisors
    Data Warehouse Solution Architect (Consultant via Advanti Solutions)
    State Street Global Advisors Nov 2013 - Sep 2016
    Boston, Massachusetts, US
    State Street Global Advisors ($2.4 trillion under management) uses both passive and active fund management strategies. A new Equity History Research Database system was required to support portfolio allocation decisions based on equity risk factors, equity fundamentals, and asset classes. The IBM PureData System for Analytics (Netezza) MPP appliance was chosen for its scalability and for its ability to run R code in-database.
    A team led by Advanti Solutions designed and built a multi-terabyte warehouse using a bitemporal "insert-only" data model. Code generators were developed to build hundreds of data loaders and data validation routines. Extensive ELT processing was done to re-organize data received from external vendors into forms useful for statistical analysis. The nzr interface was used to run sophisticated models developed in R in-database and in parallel.
  • Centerlight Health System
    Data Warehouse Analyst (Consultant via Caserta)
    Centerlight Health System Apr 2013 - May 2013
    Bronx, New York, US
    CenterLight Healthcare is undergoing rapid growth in its health plan offerings and will offer new services under a FIDA plan. New informatics requirements are being met by a new suite of operational systems. Reporting and analytics will be provided by a new data warehouse.
    A team led by Caserta Concepts provided a Data Warehouse Strategic Assessment. The assessment consisted of a plan for a dimensional data warehouse, including an Enterprise Bus Matrix with conformed dimensions, and a Quick Start recommendation for initiating iterative implementation cycles. In addition, a business requirements document, a sunsetting strategy recommendation, technology recommendations, and source system analysis were provided.
    During this tactical engagement, the team members rapidly assimilated the current and future states of a complex healthcare environment undergoing rapid change. The team brought together a wide array of technical skills, business analysis skills, and domain expertise. The Strategic Assessment provides an illuminated pathway to build a robust and extensible data warehouse delivering both business performance and clinical quality metrics that are aligned with the company's strategic goals.
  • Best Buy
    Data Warehouse Solution Architect
    Best Buy Apr 2012 - Mar 2013
    Richfield, Minnesota, US
    Best Buy operates some 1,150 stores that generate nearly $50 billion in revenue annually. A multi-year strategy was developed to build a Customer Analytic Master Data Warehouse (CAMDW). Key technologies employed in building the CAMDW include IBM PureData System for Analytics (Netezza), Aginity Customer Intelligence Appliance software, Informatica Data Quality, and the SAS statistical package. An integrated view of each of Best Buy's 120 million customers was built using match-merge techniques. Customer data was brought into Netezza from a large Oracle database, and sales transaction data were sourced from Teradata. As a Solution Architect on this project, I had wide-ranging responsibility to design data flows into and out of the Netezza system that fit within Best Buy's Enterprise Architecture framework. Large-volume historical data feeds were delivered using flat files, nzload, and Informatica ETL. Daily delta data feeds were delivered using Informatica. A major responsibility was to prepare Solution Architecture Blueprints and to present and defend these before the Architectural Review Board.
    With design work completed, development on Netezza became the main activity. Major challenges here were deploying separate Development, Test, and Production environments in two data centers, and developing a strategy to encrypt all PII in Netezza and in the flat files used for loading data.
    Finally, I worked with data architects and analytics end users in their efforts to design analytic data marts for use with the SAS statistics package. Education was key here, following guidance on best practices from IBM and SAS.
  • Fidelity Investments
    Senior Netezza Developer - FCAP ART Project
    Fidelity Investments Sep 2011 - Mar 2012
    Boston, MA, US
    Fidelity Investments uses an accounting method called Activity-Based Costing to attribute the costs of every business function to the fund level and customer level. A new data warehouse was required to process the high volumes of detailed cost data collected and to allocate these costs according to complex econometric models. The Netezza analytic appliance was selected for its scalable architecture and speed. A rules engine was built inside Netezza: metadata stored in the modeling database was read by stored procedures that built dynamic SQL to process the large data volumes stored in dimension and fact tables. The development effort involved stored procedure development in the NZPLSQL language, extensive error handling, integration with shell scripts, and performance measurement. The use of Agile methods allowed a high degree of interaction among developers, econometric analysts, and business users of the system.
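A metadata-driven rules engine of the kind described (stored procedures reading metadata and emitting dynamic SQL) can be sketched as follows. This is a Python illustration of the idea, not the NZPLSQL implementation; the table and column names in the rule are hypothetical.

```python
def build_allocation_sql(rule):
    """Generate the allocation statement for one metadata rule.

    The production rules engine did this inside Netezza with NZPLSQL
    stored procedures; this sketch shows the same metadata-to-SQL idea
    with invented field names.
    """
    return (
        f"INSERT INTO {rule['target_table']} (fund_id, cost)\n"
        f"SELECT {rule['driver_column']}, SUM(cost) * {rule['rate']}\n"
        f"FROM {rule['source_table']}\n"
        f"GROUP BY {rule['driver_column']};"
    )

rule = {
    "source_table": "fact_activity_cost",
    "target_table": "fact_fund_cost",
    "driver_column": "fund_id",
    "rate": 0.25,
}
print(build_allocation_sql(rule))
```

Keeping the allocation logic in metadata rows rather than hand-written SQL is what lets analysts change econometric models without redeploying code: the engine simply regenerates and re-runs the statements.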
  • IQ Associates
    Senior Data Warehouse Architect
    IQ Associates May 2010 - Mar 2012
    Arlington, Massachusetts, US
    IQ Associates is a consulting company providing services in Business Intelligence and Analytics. Worked on client projects at Fresenius Medical Care and Fidelity Investments.
    Areas of expertise:
    - Business Intelligence strategy and roadmaps
    - Data warehouse architecture and implementation
    - Highly scalable, massively parallel data environments, especially using Netezza and DataStage
    - Predictive analytics and statistical modeling, especially in-database analytics on Netezza
    - Complex data integration, ETL, and ELT
  • Fresenius Medical Care
    Data Warehouse Architect
    Fresenius Medical Care May 2010 - Aug 2011
    Bad Homburg, Frankfurt, DE
    Fresenius Medical Care provides dialysis services to some 170,000 patients at 2,100 clinics in the U.S. Recent deployments of the Soarian Financials and Soarian Clinicals EHR systems allow efficient operations within each clinic. A new data warehouse, the Knowledge Center Next Generation (KCNG), was envisioned to provide a robust and scalable platform for reporting and analytics. Key KCNG features include:
    1. An Enterprise Content Store to provide a fully integrated view of clinical data (diagnosis, treatment, and assessments at patient visit-level detail) and financial data (detailed charge data and payment bundling data) in one system.
    2. Data marts for reporting and analytics on combined financial and clinical data.
    3. An IBM Netezza data warehouse appliance with AMPP architecture to provide rapid delivery of reports and highly scalable analytics. In-database analytics (IBM Netezza Analytics, INZA) was integrated with the SAS and R statistics packages to provide high performance to end users.
    My roles on this project were Data Architect (conceptual, logical, and physical models of the Enterprise Content Store and data marts) and lead Netezza Developer (ELT jobs orchestrated with Python and Tidal Scheduler).
  • Humedica (Now Optum)
    Data Analyst
    Humedica (Now Optum) Jun 2009 - May 2010
    Boston, MA, US
    Humedica's MinedShare product integrates multiple data streams from healthcare providers to achieve deep clinical insights into patient populations. Humedica ingests provider data from ambulatory and inpatient EHRs, lab results, and claims data. Data is extensively analyzed for data quality and semantic content and identification of code sets, then normalized and validated. Processed data becomes available to the analytic algorithms built into the browser-based MinedShare application.
    I designed and built several applications to support data ingestion and the data processing pipeline. One application monitored the arrival times, file sizes, and other attributes of tens of thousands of XML files from a major business partner. A test for XML well-formedness avoided problems with downstream processes.
    Another effort was directed at processing tens of millions of HL7 messages. Rapid analysis of the messages was done by parsing the header of each message and putting the values into an Oracle database. Once in Oracle, the messages could be searched, filtered, categorized, and counted using SQL. Python scripts were developed to send each message to the Intel SOA Expressway, which completely processed each message into Oracle database tables. Most of the design and coding effort here was in error handling, logging, and restartability of interrupted jobs.
    A custom profiler was designed and written, using Agile methods, to the specifications of the Clinical Informatics team. The profiler scanned an entire database to give a more complete summary than the previous manual methods allowed. Manual effort was greatly reduced, and discovery of semantic content in data was enhanced.
    Python, SQL, and Oracle were the workhorse tools in all projects. The R statistics programming language, LaTeX, and Sweave were used to prepare reports for in-house use and presentation to clients.
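Header-only triage of HL7 v2 messages, as described above, relies on the pipe-delimited MSH segment that begins every message. A small sketch (the sample message and field choices are illustrative):

```python
def parse_msh(message):
    """Pull routing fields from the MSH header of an HL7 v2 message.

    Only the header is parsed -- the same cheap triage the pipeline
    above used before handing full messages to the SOA processor.
    """
    header = message.split("\r")[0]   # MSH is always the first segment
    fields = header.split("|")
    if fields[0] != "MSH":
        raise ValueError("not an HL7 v2 message")
    return {
        "sending_app": fields[2],    # MSH-3
        "timestamp": fields[6],      # MSH-7
        "message_type": fields[8],   # MSH-9, e.g. ADT^A01
        "control_id": fields[9],     # MSH-10
    }

msg = ("MSH|^~\\&|EHR|ClinicA|Hub|Humedica|20100115103000||ADT^A01|MSG0001|P|2.3\r"
       "PID|1||12345^^^ClinicA||Doe^Jane||19700101|F")
print(parse_msh(msg)["message_type"])  # ADT^A01
```

Note the HL7 indexing quirk: the field separator itself counts as MSH-1, so after splitting on `|`, MSH-3 sits at list index 2. Parsing only this one segment is what made triaging tens of millions of messages fast enough to load into Oracle for SQL-based filtering.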
  • Fresenius Medical Care
    Business Systems Analyst
    Fresenius Medical Care Oct 2007 - Mar 2009
    Bad Homburg, Frankfurt, DE
    Fresenius Medical Care provides dialysis services to 120,000 patients and has annual revenue of more than $10B. The company has embarked on a large project to upgrade financial and medical systems to the Siemens Soarian Financials and Soarian Clinicals applications, respectively. Two existing Master Patient Indexes were synchronized via batch ETL and are kept in sync via HL7 messaging. Patient, provider, insurance plan, and facility data is migrated to the Soarian systems in preparation for conversion. Each of the some 2,000 clinics is then switched from the legacy system to the Soarian systems via an automated process. In this environment, my efforts focused on documenting requirements in functional specifications for the migration of patient data; loading physician data via Soarian's Master File Central interface; and extensive testing of data extracts for the MedAssets contract modeling data mart. I was employed by The Systems Group for this contract work at FMCNA.
  • Harvard Pilgrim Healthcare
    Logical Data Modeler
    Harvard Pilgrim Healthcare Feb 2007 - Jul 2007
    Canton, Massachusetts, US
    Harvard Pilgrim Health Care (1 million members, $2.5B/yr revenue) has embarked on the Core Administrative System Replacement (CASR) project, in which claims processing will be performed by business partners. In preparation for this project, the existing business was documented in an enterprise logical data model (ELDM). In further analysis, data elements in extract files from the claims processing partner were mapped to the ELDM. This source-to-target mapping, with business requirements documented in the ELDM, was used to identify gaps in the extract files from the partner. The assigned mappings and analyst notes were managed in a proprietary database. Eventually, the mapping documents will be used to create physical database structures and specifications for ETL. The final steps in documentation bring data definitions and data lineage into a metadata repository for use by developers and analysts. I was employed by Gardner Resources Consulting for this contract work at HPHC.
  • Massachusetts General Hospital
    Database Architect
    Massachusetts General Hospital 2001 - 2004
    Boston, MA, US
    MGH was awarded a "Glue Grant" from NIGMS in 2001 for the study of trauma and burn patient outcomes. The informatics team was charged with rapid development of sample management (LIMS), clinical data capture, and reporting applications. Additional complexity was introduced by the large scale and distributed nature of the project (some 200 trauma and burn specialists, genomics researchers, laboratory personnel, and biostatisticians at 20 medical centers in the U.S.). To best meet the requirements of the research protocols, staged deployment of the LIMS and clinical databases was followed by a data warehouse to support reporting and statistical analysis. Clinical data capture was supported by customizing the Trial/DB open-source software developed at Yale University's YCMI. The Trauma-Related Database (TRDB) obtained refreshed data from the LIMS and from Trial/DB using Perl-based ETL.
  • Greystone Solutions, Inc
    Principal Consultant
    Greystone Solutions, Inc 1997 - 2001
    Principal Consultant at a Microsoft Gold Partner consultancy that develops database-backed custom web applications for clients in New England. Participated in 14 projects in the financial, telecom, environmental, and healthcare industries. Delivered proof-of-concept technology demonstrations to prospective clients. Engaged in all phases of the Software Development Life Cycle, from requirements definition and logical data modeling to design and implementation of databases.
  • Veterans Administration Medical Center
    Instructor of Medicine
    Veterans Administration Medical Center 1991 - 1994
    Washington, DC, US
    Molecular biology research on role of growth factors in pituitary tumors
  • SUNY Stony Brook
    Research Biologist
    SUNY Stony Brook 1988 - 1991
    Stony Brook, NY, US
    Role of proto-oncogenes in signal transduction of nerve growth factor

Norbert Kremer Skills

Data Modeling, Business Intelligence, Database Design, Data Integration, Analytics, Business Analysis, Data Management, Data Mining, Agile Methodologies, R, Enterprise Architecture, Healthcare Information Technology, Informatics, Data Warehousing, Netezza Architect, Data Warehouse Architecture, Netezza Developer, Requirements Analysis, Netezza, Big Data, SQL, Master Data Management, Machine Learning, Amazon Web Services, Enterprise Software, Amazon Redshift, Apache Spark, Translational Medicine, Hadoop, Vertica, Scikit-Learn, Data Science, HL7, Scala, Artificial Neural Networks, ETL, Oracle, Big Data Analytics, Healthcare, Spark, PrestoDB, Enterprise Integration Architect, Open Source Software

Norbert Kremer Education Details

  • Wesleyan University
    Wesleyan University
    Physics
  • Purdue University
    Purdue University
    Neurobiology
  • John Burroughs School
    John Burroughs School

Frequently Asked Questions about Norbert Kremer

What company does Norbert Kremer work for?

Norbert Kremer works for Turnberry Solutions

What is Norbert Kremer's role at the current company?

Norbert Kremer's current role is Google Cloud Solution Architect and AI Engineer.

What is Norbert Kremer's email address?

Norbert Kremer's email address is nk****@****hix.org

What schools did Norbert Kremer attend?

Norbert Kremer attended Wesleyan University, Purdue University, John Burroughs School.

What skills is Norbert Kremer known for?

Norbert Kremer has skills like Data Modeling, Business Intelligence, Database Design, Data Integration, Analytics, Business Analysis, Data Management, Data Mining, Agile Methodologies, R, Enterprise Architecture, Healthcare Information Technology.
