Eric Thornton, M.A.
• Cloud solutions architect, data scientist and teacher with 35 years of experience leading teams in designing, developing and deploying scalable cloud architectures.
• Hands-on leader who is also a Python and mathematics expert.
• 30 years of experience, and a master's degree in education, authoring and delivering technical training as print, TV broadcasts, YouTube videos and instructor-led classes.
AWS Architect, Data Scientist and Full-Stack Engineer, South Carolina Crypto (Startup), Feb 2023 - Present, New York, NY, US
● Led the project which designed, developed and deployed the firm's first automated, real-time evaluation (RTE) and trading platform. The new serverless solution, combined with real-time market data pipelines, made asset prescreening, strategy design, backtesting and trading activities far more efficient, reducing the time traders spent prescreening for target assets by 50%.
● Deployed a set of real-time APIs and WebSocket connections to global exchanges, alternative data providers and other sources, reducing annual spend on subscriptions and off-the-shelf software by more than $1M.
● The solution, built atop AWS SageMaker with AWS Step Functions state machines coordinating a series of serverless Lambda functions, reduced deployment times for new trading strategies by 90%.
● Designed the training curriculum, delivered courses and mentored traders, the compliance team and researchers to ease migration to the new system.
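The strategy design and backtesting work described above can be illustrated with a toy signal check. This is a minimal sketch of the kind of logic a strategy-backtest Lambda might run; the function names and the toy price series are hypothetical illustrations, not the platform's actual code.

```python
# Hypothetical sketch: generate moving-average crossover signals for a
# price series, the sort of check a backtesting Lambda would perform.

def moving_average(prices, window):
    """Trailing simple moving average; None until enough data points exist."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]

def crossover_signals(prices, fast=3, slow=5):
    """Emit 'buy' when the fast MA crosses above the slow MA, 'sell' on the reverse."""
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    signals, prev = [], None
    for f, s in zip(fast_ma, slow_ma):
        if f is None or s is None:
            signals.append(None)
            continue
        state = "above" if f > s else "below"
        if prev == "below" and state == "above":
            signals.append("buy")
        elif prev == "above" and state == "below":
            signals.append("sell")
        else:
            signals.append(None)
        prev = state
    return signals

prices = [10, 11, 12, 13, 12, 11, 10, 11, 13, 15]
signals = crossover_signals(prices)
```

In a serverless deployment, a Step Functions state machine would fan such checks out across candidate assets, with each Lambda invocation scoring one strategy/asset pair.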
YouTube Content Creator, Chameleon Metadata, Jan 2024 - Mar 2024
• Created, edited and published YouTube content focused on AWS architecture, AWS administration, data science, SageMaker and metadata management.
Data Scientist / Snowflake Architect (Contracted by IBM), GEICO, Jan 2022 - Dec 2022, Chevy Chase, MD, US
● Led the IBM team which migrated the legacy back-end systems for the GEICO DriveEasy app from a Storage Area Network (SAN) architecture onto a new, scalable Snowflake/Azure cloud architecture.
● The system onboarded, corrected, enhanced and managed the metadata from the several thousand public, government and purchased datasets which arrived each day, including structured, unstructured and geospatial data.
● The conceptual, logical and physical data models for the data lake were aligned to Data Vault 2.0 and created using ERwin Data Modeler, resulting in a 30% drop in the number of datasets stored for more than 48 hours.
● Geospatial technical stack: Esri ArcGIS, GeoPandas, Folium, GDAL/OGR.
AWS Architect & Data Scientist, Merck Research Laboratories, Feb 2020 - Nov 2021
● Led the second project to design and deploy a new, scalable AWS architecture at this large pharmaceutical company after the first project with the same goal was scrubbed in 2019. Signoff was obtained from executive leadership and the research scientist community after we delivered 60% faster and 90% cheaper than the first attempt.
● Migrated the research work of 50+ research scientists, mostly Python, MATLAB and R running on local PCs, onto AWS SageMaker. Used AWS DataSync to move each scientist's files, scripts and images into S3 buckets which exactly matched the files and directories from their workstation. The scientists loved that everything in S3 exactly matched the file layout from their old PC, and that so many of their legacy models were easy to deploy using SageMaker Endpoints.
● Designed a curriculum and delivered the courses, in live and recorded versions, to executives, scientists and lab technicians to ease the transition to AWS SageMaker and to storing all files in S3 rather than on desktop PCs.
● Developed all the natural language processing (NLP) code used to parse over 2 billion pages of research notes, compliance and GxP documents, governmental agency documents, etc., which resulted in the retirement of several costly SharePoint and Confluence document management systems.
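The migration above hinged on S3 keys mirroring each scientist's original directory layout. A minimal sketch of that path-mapping rule, assuming one prefix per user and forward-slash keys; the bucket layout, user name and paths are hypothetical, and the actual copying was performed by AWS DataSync.

```python
# Hypothetical sketch: map a Windows workstation path to an S3 object key
# so the bucket structure mirrors the scientist's local directories.
from pathlib import PurePosixPath, PureWindowsPath

def s3_key_for(local_path, user, root):
    """Return an S3 key under the user's prefix, preserving the
    directory structure relative to `root`."""
    rel = PureWindowsPath(local_path).relative_to(PureWindowsPath(root))
    # S3 keys always use forward slashes, regardless of the source OS.
    return str(PurePosixPath(user, *rel.parts))

key = s3_key_for(r"C:\Users\ascientist\models\v2\fit.m",
                 "ascientist",
                 r"C:\Users\ascientist")
```

Keeping the relative structure intact is what made the scientists' legacy scripts and models portable: any hard-coded relative path kept working once the prefix was mounted or synced.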
Data Scientist / Data Architect, Accelerated Enrollment Solutions, May 2019 - Jan 2020, Horsham, PA, US
● Led a team of 7 people distributed across the U.S., U.K. and India who delivered a Real-World Evidence Platform which integrated healthcare messages (HL7, EMRs) with data collected during clinical study enrollment and management.
● Delivered a business glossary, data catalog and metadata management system built atop my own Chameleon Metadata™.
● Created a Neo4j graph populated with every study present at ClinicalTrials.gov and the EU Clinical Trials Register, linking everything to AES's internal systems by sponsor, indication, protocol, etc.
● Technology stack: AWS, Snowflake, S3, Kinesis Firehose, Glue, SageMaker, Spark, Python, Neo4j and Elasticsearch.
Cloud Architect, Save-On-Foods, Jan 2019 - May 2019, Langley, BC, Canada
● Led a team of 11 people to deliver a master data management (MDM) solution built with Python machine learning microservices which predicted seasonalized inventory levels, optimized product pricing by season, improved data quality and provided trusted farm-to-table lineage for every product and service, resulting in optimized inventory levels, pricing flexibility and reduced time to market.
● Designed a Platform as a Service (PaaS) architecture which integrated their legacy systems' data with Oracle's cloud-based point of sale (POS) software and AWS, using Python, APIs and WebSocket connections.
Speaker, AI & Big Data for Banking Summit (part of DATAx NYC 2018), Dec 2018
Topic: "Building a Data Science Information Architecture." Please see https://tinyurl.com/wu68hxh
My presentation and all the Python scripts used in the examples are found here: https://github.com/chameleonmetadata/DATAx_NYC_2018
Solution Architect & Artificial Intelligence Expert, Confidential Startup, Aug 2018 - Dec 2018, Rio de Janeiro, Brazil
● As an individual contributor, delivered a scalable, low-cost, cloud-based solution to this real estate investment firm which integrated Hadoop, Spark, Kafka, Elasticsearch and Neo4j, with the goal of dynamically calculating the optimal offer price for a potential target property or the optimal sales price for properties they own.
● The solution was deployed as a fully functional proof of concept (POC) which leveraged an array of machine learning microservices to demonstrate the concept to prospective investors during 'Seed Round' presentations. The product captured detailed metadata for each business service or event, along with any data it accessed or created during each execution, whenever it was subject to the ISO 38500 standard for corporate governance.
● Machine learning and natural language processing (NLP) work used Python's SciPy and spaCy. Visualizations were done with open-source software, Matplotlib or Neo4j Bloom, depending on the use case.
Data Science & Machine Learning Expert, Citi, Feb 2018 - Jul 2018, New York, NY, US
Led a team of 9 people tasked with designing a set of machine learning models to replace human review for parts of the Commercial Loan Application (CLA) process, which involved reviewing huge numbers of supporting documents in many formats. The project demonstrated that most components of the existing manual CLA approval workflow could be automated using natural language processing (NLP) and machine learning. Proof-of-concept components, aligned to specific CLA tasks, were reviewed by the subject matter expert (SME) performing each task, and the ML models were adjusted based on SME feedback. The proof of concept also showed that robotic process automation (RPA) could auto-onboard SEC filings.
Solutions Architect and Data Scientist, Startup (PermitSearchers), Apr 2016 - Feb 2018
● As an individual contributor, delivered an architecture and data science infrastructure which registered and ingested their proprietary database of digital property records, at the town and county level of detail, via Kafka/Oozie. The solution also parsed, classified and indexed unstructured content from public government records using Python, BeautifulSoup, the Natural Language Toolkit (NLTK), Neo4j, MySQL and Logstash into Elasticsearch/Kibana.
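The parse-and-index step above can be sketched with a toy record extractor: pull structured fields out of a line of unstructured permit text before handing it to the indexer. The record format and field names here are hypothetical illustrations, not PermitSearchers' actual schema; the real pipeline used BeautifulSoup and NLTK against scraped government pages.

```python
# Hypothetical sketch: extract structured fields from one line of
# free-text permit data prior to classification and indexing.
import re

RECORD = re.compile(
    r"Permit\s+#(?P<permit_id>\d+)\s+issued\s+(?P<date>\d{4}-\d{2}-\d{2})"
    r"\s+to\s+(?P<owner>.+?)\s+at\s+(?P<address>.+)$"
)

def parse_record(line):
    """Return a dict of named fields, or None if the line doesn't match."""
    m = RECORD.match(line.strip())
    return m.groupdict() if m else None

rec = parse_record("Permit #10442 issued 2017-03-09 to J. Smith at 12 Elm St, Springfield")
```

In the real system, each parsed dict would be written to Elasticsearch (via Logstash) for search, with entity relationships (owner, parcel, municipality) loaded into Neo4j.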
Data Analytics & Machine Learning Solutions Architect, KPMG US, Aug 2015 - Apr 2016, New York, NY, US
● Designed and deployed a new data science infrastructure and led an India-based development team. The new system delivered natural language processing (NLP), supervised learning and unsupervised learning capabilities for ingesting, parsing, classifying and indexing unstructured content from legal documents.
Data Science Solutions Architect, CDM Smith, Feb 2015 - Jul 2015, Boston, MA, US
● Delivered a Neo4j recommendation engine which predicted the best mix of resumes for a future proposal to a known past client, using Python and RStudio for machine learning and natural language processing. A Data Vault ETL staging area ensured complete auditability and Change Data Capture (CDC).
Lead Integration Architect (IBM MDM Server, Informatica MDM, IDQ and PowerCenter), Manulife (International) Limited, Feb 2014 - Dec 2014
● Led a team of 27 people geographically distributed across Canada, Malaysia, India and the United States which delivered an information integration architecture to integrate two sets of legacy mainframe systems after Manulife (of Canada) purchased John Hancock (of Boston), as Phase I of their initial Master Data Management effort for Party data across several lines of business on IBM's MDM Server (MDMS) V11.
Analytics Specialist, Janssen Pharmaceutical Companies of Johnson & Johnson, Dec 2013 - Jan 2014, Raritan, NJ, US
Provided design enhancement recommendations for an analytics environment staging via Data Vault stores (i.e. Hubs, Links and Satellites) atop Teradata.
Data Specialist, Wolters Kluwer Financial Services, Nov 2013 - Dec 2013, Minneapolis, MN, US
Performed a data quality and analytics assessment focused on: Financial and Compliance Services; Risk Management and Compliance; Product Development; and Global Account Management. The technical landscape included SAP (ECC, SD, CRM, BP, BI BOBJ and BW), MicroStrategy reporting and Microsoft TFS for service design and delivery management.
Hadoop Big Data Architect, Citi, Jul 2013 - Oct 2013, New York, NY, US
• Captured existing business processes, lineage and metadata as RDF Subject, Verb, Object (SVO) triples for their new Hadoop, HDFS, Hive and HBase data stores.
• Designed object-oriented RDF using CmapTools 5.0.03 to segment incoming source data, and a UDEF-based RDF Schema aligned to W3C XSD 1.1 Part 2 datatypes for organizing captured domains and ranges and linking to source systems via D2R Server. ETL of structured data was accomplished using a Data Vault data model design.
• Architecture components included: Cloudera; Talend Big Data (5.3.1); IHMC CmapTools; Datameer; Tableau; Python 2.7.3; and R for polynomial equations.
Enterprise Information Architect, Interactive Data, Oct 2011 - Jun 2013
• Designed the Exchange-to-Hadoop/HDFS business, information and technical architectures for the entire data lifecycle.
• Designed and executed a migration of 18 legacy systems and their existing business processes to a new architecture and data lifecycle based on Data Vault data models for ETL, feeding downstream traditional databases, NoSQL data stores and a graph database (Neo4j) used for discovering relationships between data, vendors and consumers. Neo4j, Python 2.6.2 and R were used to predict compliance issues for different candidate approaches.
• Standardized the valid Linked Data value pairs and their relationship(s) to the GDSN GS1 Global Product Classification (GPC) ontology and, where possible, linked ISO 10383 MIC identifiers for source data vendors.
• Created a corporate product information ontology using the Florida Institute for Human & Machine Cognition (IHMC) CmapTools Knowledge Modeling Kit.
• Created TOGAF-based workflow management, role-based product entitlements and task-level audit metrics to capture product information management (PIM) knowledge.
• The data governance solution used my own Chameleon Metadata key-pairs data quality management system.
Enterprise Data Architect, AstraZeneca, Jul 2011 - Oct 2011, Cambridge, Cambridgeshire, UK
Reviewed their current data architecture and designed solutions for the throughput challenges preventing nightly Siperian workflows from completing within the allotted time window.
Enterprise Data Architect, Bridgewater Associates, Feb 2011 - Jun 2011, Westport, CT, US
Designed and delivered a fully functioning, multi-vendor, proof-of-concept EIM system running atop Informatica's 9.x PowerCenter, DX, DQ and MDM/Siperian in just 14 weeks, to support continuous/incremental improvements to their data's quality.
Management Consultant, Bristol-Myers Squibb (Princeton, NJ), Oct 2010 - Jan 2011, Lawrence Township, NJ, US
Designed the business and information architecture which ensured auditable PPSA compliance (i.e. Aggregate Spend) in an SAP ECC environment. The delivered assessment created a framework within which BMS could capture and audit all money spent, per physician, on gifts, honoraria, consulting fees, food, research, continuing medical education and clinical investigator payments, which will be made public. As this new federal law does not preempt existing state disclosure laws, the framework needed to be flexible enough to also maintain compliance with any additional reporting mandated by individual states.
Management Consultant, INC Research, Sep 2010 - Oct 2010, Raleigh, NC, US
Performed a detailed Enterprise Information Management assessment of, and suggested improvements to, their proposed business and information architectures.
Enterprise Information Management (Multi-Channel Marketing), Pfizer, May 2010 - Sep 2010, New York, NY, US
Led an assessment and road-mapping effort in support of a new US/EU customer relationship management multichannel marketing effort. Collaboratively designed an MDM-centric framework with Pfizer's Marketing, Technical Infrastructure and BI/Analytics organizations. Sitting atop an Informatica/Siperian stack, this framework consisted of: a conceptual data model integrating multichannel marketing processes, event-based measurement processes tied to specific marketing campaigns, and closed-loop feedback of campaign tactic effectiveness into an iterative campaign refinement cycle. Designed a Master Data Management (MDM) hub capable of handling real-time events interactively, third-party data integration and an "Agile-like" process improvement lifecycle; an SOA-based Customer Interaction Repository (CIR); and a new set of enterprise-wide knowledge projections via their existing business intelligence tool set.
Master Data Management & Business Process Improvement Consultant, Medco, Oct 2009 - May 2010
Designed the master data management (MDM) and metadata management approach to be supported by IBM's InfoSphere Master Data Management Server. Designed and staffed Medco's new MDM center of excellence, charged with integrating multiple business units' data across technical, political and geographic boundaries, often using disparate currencies. Deployed a proof-of-concept IBM MDM Server hub for PARTY assuming PATIENT and PHYSICIAN roles. Designed the first strategic roadmap for this newly created MDM center of excellence (COE), as well as the logical, physical, canonical, information flow and process models. Negotiated and obtained executive sign-offs for the following processes: ITIL-based service management; metadata management; data governance; and MDM-to-source-system publish/subscribe (pub/sub) functions. Designed automated business rules to integrate with either PegaRULES or Corticon's rules engine. Deployed a proof-of-concept data governance and metadata portal on Linux/Apache/MySQL/PHP (LAMP).
Data Architect (Master Data Management), LexisNexis, 2009 - Oct 2009, New York City, NY, US
This readiness assessment delivered a detailed product master capabilities model and an overall Enterprise Product Information Management (EPIM) functional architecture model on top of Oracle's AIA PIM Hub framework for LexisNexis. Delivered a roadmap which included enterprise product definitions and a taxonomy representing their product segments, families, lines, bills of materials and multi-product assemblies. Designed conceptual and logical product master data models, and a PIM Hub classification and attribution model for faceted classifications and subtype inheritance. Provided guidance on end-state business processes and governance rules for managing the proposed EPIM data, and identified current-state/end-state gaps related to technology, tools, data, processes and rules.
Master Data Management Consultant, Discovery Communications, Feb 2008 - Dec 2008, New York, US
Led a programme to deploy "Location/Site" master data from sources distributed across 41 countries. Deployed proof-of-concept Siperian MDM systems (profiling with DataLever, ETL with IBM DataStage, and metadata/data governance/stewardship with a custom MySQL/PHP metadata repository and front end). Designed Siperian MDM models for the MDM metadata, profiling and data governance needed to create Golden Records and align data across seven sources to SAP (FI, MM and IS-R modules). Created the MDM staffing plan, scope documents, project plans, test plans and MDM data model. Managed an RFP for MDM vendors (IBM, Siperian, Oracle) bidding to sell $2M-$9M tool suites. Designed an in-house process, using the DataLever tool suite, that delivered for less than $1M the same functionality as vendor products priced in the range of $2M-$9M.
MDM Programme Lead / Chief Solutions Architect, Quintiles, Jan 2007 - Dec 2007, Durham, NC, US
Successfully deployed a custom, in-house-developed master data management (MDM) infrastructure. Designed a new ITIL-compliant, DataFlux-based approach to audit, report and correct data across seven lines of business in accordance with the new MDM standards. Deployed proof-of-concept MDM systems (profiling with DataFlux, ETL with Informatica, and metadata/data governance/stewardship with a custom MySQL/PHP metadata repository and front end). Mastered two clinical trials subject areas across four business units, which were distributed across 50 countries. Installed and configured DataFlux Integration Server on the Red Hat Linux platform. Provided Quintiles with new, previously unavailable capabilities to ensure universal compliance with FDA 21 CFR Part 11 (US) and MHRA Annex 11 (UK), including new consolidated reports and metrics on clinical trial projects which crossed lines of business for pharmaceutical clients.
MDM Programme Lead / Chief Solutions Architect, Bacou-Dalloz SA, Jan 2006 - Nov 2006, Roissy CDG Cedex, France
Provided executive and technical leadership for the design and deployment of a Master Data Management (MDM) solution, supported by the IBM WebSphere MDM suite, conforming several subject areas of their globally distributed SAP ERP, SAP R/3 and JD Edwards (sales force) applications. Bacou-Dalloz is a world leader in the manufacture of personal protective gear. Delivered a Product Information Management (PIM) roadmap which included the metadata, taxonomies and metadata/governance repository, as well as the conceptual, logical and physical data models needed to complete detailed project plans. Designed the MDM model using PowerDesigner 12.1 to conform divergent Customer, Order, Employee and Product subject areas' data across native systems deployed in 41 countries. Collected and improved globally distributed information using a mix of IBM's Customer Data Integration (CDI) and Product Information Management (PIM) WebSphere and DataStage tools.
Business Continuity Solutions Architect, UPS, Jan 2005 - Dec 2005, Atlanta, GA, US
Designed, planned, deployed and tested a comprehensive management process which identified potential risks to the organization, the impact those risks could have on business operations, and a disaster recovery process to restore the organization's technology infrastructure after a disaster or failure. The process ultimately deployed accounted for employees' safety, security restrictions on resources (i.e. people/hardware/software), restoration of normal operations, and the financial aspects of the proactive and reactive options available. Delivered risk and business impact analyses to identify the assets, threats, vulnerabilities and countermeasures for each service. Developed a holistic recovery plan which ensured that IT services as a whole, rather than simply individual technologies, could be recovered to normal business-as-usual (BAU) operation.
MDM Solutions Architect (IBM Team Lead), Welch Allyn, Feb 2004 - Dec 2004, Skaneateles Falls, NY, US
Provided executive and technical leadership for a customer data integration (CDI) feasibility study at Welch Allyn, a medical device manufacturer based in Skaneateles, New York. Delivered current-state assessments of their SAP CRM, SAP R/3 and Salesforce applications, as well as the conceptual, logical and technical roadmaps for implementing CDI. Delivered a Customer Data Integration (CDI) roadmap which included "householding" taxonomies as well as the conceptual, logical and physical data models for a to-be CDI architecture. Created and delivered an executive education campaign on using ITIL, CMM and/or COBIT to improve the efficiency of the strategic IT outsourcing role IBM had recently assumed at Welch Allyn.
Data Integration Architect, Dun & Bradstreet, Jan 2003 - Dec 2003, Jacksonville, FL, US
Dun & Bradstreet entered into a joint venture with American Express (AMEX) providing online credit, marketing and collection tools for AMEX's OPEN: Small Business Network (OSBN) clients. The new website (open.americanexpress.com) provided real-time, online access to D&B information. Designed enterprise, common information model (CIM) and canonical data and process models to integrate AMEX business-account and consumer-account data around new Customer, Small Business and Vendor master business concepts. Designed taxonomies, hierarchies and a 360-degree view of Customer, Small Business and Vendor via SOA services. Designed and executed the planning, budgeting, control and communication procedures for this $8 million programme, aligned to the ITIL framework.
Enterprise Solutions Architect, Prudential Financial, Apr 2001 - Dec 2003, Newark, NJ, US
Designed and deployed a metadata-driven information architecture, along with the automated data integration lifecycle required for the conversion of the Agencies Database (ADB) system from IDMS to DB2. Gathered business requirements, conducted stakeholder interviews, and reverse-engineered the existing ADB system running atop IDMS. Documented the legacy and proposed systems using the five views of the Unified Modeling Language (UML): Use Case, Design, Process, Implementation and Deployment. The UML Process view was augmented using the Object Management Group's Model Driven Architecture (MDA) and Semantics of Business Vocabulary and Business Rules (SBVR) standards. Designed the canonical model and the logical and physical data models (as a multidimensional snowflake schema), and documented it all using ERwin Data Modeler. Target system tuning was done using DB2 trace IFCID capture and reported using SAS. Development was coded in India, tested in Ireland and deployed to the Roseland, NJ and Jacksonville, FL data centers. All work was formally audited to ensure alignment with the Software Engineering Institute (SEI) Capability Maturity Model (CMM) framework.
Database System Architect, Lewco Securities Corp, Mar 2000 - Dec 2000
Retained by this brokerage firm (jointly owned by Wertheim Schroder & Co., Inc. and Hambrecht & Quist) to migrate Internet application back-end components from a combination of Oracle and SQL Server back ends to MVS DB2 via DB2 Connect. Reduced "T+4" settlements to "T+1" as per Securities Industry Association (SIA) recommendations. Installed IBM UDB with DB2 Connect and configured it to communicate with the MVS V6 DB2 database.
Systems Architect / SEPG CMM Administrator, Prudential Financial, May 1997 - Dec 2000, Newark, NJ, US
Led multi-phased transition and capacity plans for the migration to a CORBA UDB Version 6 MPP environment. Requirements included client access to MVS/DB2 data via DB2 Connect and MQSeries, which increased throughput for the nightly batch run process by 300%. Obtained Prudential's first Capability Maturity Model (CMM) Level 2 certification whilst concurrently migrating the system to the new platform.
Data Architect, AT&T, 1997, Dallas, TX, US
Retained by IBM to design a Customer Data Integration client/server application. The application exploited XML-type messages, stored in VSAM files, resolving message meaning via DTD for multi-platform MQSeries data messages.
Data Architect, MBNA America Bank, N.A., 1996
Retained by Price Waterhouse to design a Customer Data Integration solution shifting the "atomic business focus" from credit card account number to individual cardholder (Party), using the ADW design workstation and ERwin.
Database Administrator, Financial Technologies International (FTI), Jan 1995 - Dec 1995
Created the in-house performance and tuning group for FTI's DB2/CICS software. FTI's products included multi-currency global custody, trust, mutual fund, brokerage and securities-position management software.
Database System Architect, Educational Testing Service (ETS), May 1993 - Dec 1995, Princeton, NJ, US
Retained by IBM to remedy performance issues with the leading test administration and scoring system (PRAXIS). Solely responsible for the direction of the DB2 database strategies used in the first application this organization implemented using ADW, XDB, CICS and DB2 for design, coding and testing. Reduced the response time of all transactions from 120 minutes to below five elapsed seconds, with all mission-critical, high-volume transactions completing within 500 milliseconds.
Certified IBM Instructor, IBM, Mar 1991 - Dec 1993, Armonk, NY, US
Delivered the entire range of project management, DB2 design, administration, performance tuning and coding courses offered by IBM.
Database Administrator, Aetna, Jan 1990 - Dec 1990, Hartford, CT, US
Production support DB2 DBA providing 24x7 support for a financial application called "FISC."
Application Programmer, U.S. Sprint, Apr 1989 - Dec 1990
COBOL/CICS/DB2 application programmer on a telephone invoicing application.
Application Programmer, American International Group, Aug 1988 - Mar 1989, New York, NY, US
COBOL/CICS/DB2 programmer on a claims processing application.
IDMS Database Administrator, Prudential Insurance, Mar 1987 - Aug 1988, Newark, NJ, US
Trained at "The Cullinet Institute" as an IDMS database administrator supporting Cullinet's General Ledger application at Prudential.
Skills
IT Strategy, Data Integration, ETL, Business Process Design, Data Warehousing, Requirements Analysis, SDLC, ITIL, Data Management, Data Analysis, Data Governance, Business Analysis
Education
Centenary University: Instructional Leadership
Temple University: Computer Science
Temple University: Double Major in Computer Science & Marketing Research