Joseph Bernardo Email and Phone Number
An IT and software engineering expert with 35 years in the field, including over 25 years of consulting experience in Data Integration and Business Intelligence pre-sales, solutioning, staffing and delivery. Client-solutions focused and results-driven; skilled at presentations, RFP responses, project management and planning, and liaison with client business and technical teams.

A full SDLC lifecycle expert, from requirements definition through development, testing, performance tuning, deployment, production implementation and production support.

A software engineer skilled in technologies ranging from Hadoop (on-premise and in the cloud: AWS EMR, Azure HDI, Google Cloud), Spark SQL, Python and bash scripts to Vertica 6.1.x and 7.0 (projection design, training, best-practice development, performance tuning), Oracle and PL/SQL programming, data modeling (operational data stores, data warehouses, data marts), LDM, PDM, performance tuning, C, Unix, SQL*Forms 3.0 and Forms 4.5.

A data architect/modeler SME able to produce logical and physical data models ranging from operational data stores and integrated data stores for OLTP systems to logical dimensional data models for traditional star schemas used in data warehouses and data marts for OLAP systems.

A professional program/project manager with both onshore and offshore management skills; successfully managed over 30 full project lifecycles with budgets of $500K to $50M.

Consulting experience spans Finance, Banking, Brokerage, Retail, Manufacturing, Logistics, Telecommunications, Marketing, Health & Life Sciences, Pension, Benefits, Health Care Administration and Government.
Senior Solution Architect - Financial Services, Databricks, New York, NY, US
VP, Solution Principal - Data and Analytics, USEReady, Jan 2023 - Present, New York, US
USEReady is a Data and Analytics firm that provides the strategies, tools, capability and capacity that businesses need to turn their data into a competitive advantage. The company is built on the following core values: Customer Centricity, Community, Continuous Improvement, Integrity and Humility. I am responsible for ensuring that our customers choose the right technologies and solutions to be successful. Our client solutions focus on Migration to the Cloud, Optimization of Value, and Modernization of Capabilities (MOM) of Data & Analytics technology initiatives, with the goal of helping our clients achieve their targeted business outcomes. In this role, I am also responsible for building out USEReady's Starburst Data practice, where I work with clients to reimagine their existing data architecture and, in an effort to become more data driven, evaluate a potential shift to a data mesh architecture.
Global Director, Professional Services - Boomi Data Catalog & Prep, Boomi, Jan 2020 - Dec 2022, Conshohocken, Pennsylvania, US
Responsible for adoption and implementation of the Boomi Data Catalog & Preparation (DCP) product. Managed a team of Technical Architects/Consultants supporting pre-sales (RFPs and POCs), post-sales consulting implementations, internal training/adoption, partner enablement, support and client-facing training. Startup-focused, hands-on technologist who leads and coaches by example, with real-world experience in EDWs, data preparation, analytics, Big Data and Hadoop. Installed and implemented Boomi DCP with Hadoop on-premise and in the cloud (AWS EMR, Azure HDI and Google Dataproc). Comfortable in Linux and building the occasional Python script. Strong presentation skills for both business and technical audiences. Communicated with product management and development to help improve the platform. Proven collaborator and liaison with all levels and departments within Boomi, including Product, Engineering, Sales, Marketing and the Partner Channel. Created the DCP Partner Accelerator program to help partners build a data practice around implementing Boomi DCP. Co-authored the eBook "5 Reasons Snowflake Customers Need Boomi Data Catalog and Preparation (DCP)".
Director of Professional Services, Unifi Software, Mar 2018 - Jan 2020, US
At Unifi, our Professional Services team functions as the premier product implementer within the organization. Led and recruited this team, which includes pre-sales, training and partner resources. The team is tasked with educating prospects and customers, demonstrating and proving out our solution, closing deals, delivering on SOWs, and making our customers happy and successful. Unifi is a young, fast-growing company.
Senior Sales Engineer, Unifi Software, Mar 2016 - Mar 2018, US
UNIFi is a solution for big data acquisition and integration, intended for use by business intelligence analysts in enterprises engaged in data warehousing and data mining. UNIFi addresses several barriers around data access and consumption:
• the ability to parse unstructured and semi-structured data
• integration of diverse data sources
• a reduced need to involve IT personnel and programmers
UNIFi is a data integration platform built natively on top of the Hadoop ecosystem. It uses the Hadoop Distributed File System (HDFS) as its data store, and for data transformation it works with different Hadoop execution components, including Hive and MapReduce. UNIFi helps business analysts, and the IT departments that support them, to more quickly and accurately acquire data, then transform it so that it can be combined with existing data and analyzed using business intelligence and visualization tools to extract business insights. UNIFi provides a browser-based data integration solution, and users interact with it through a self-service graphical user interface. With UNIFi, business analysts can select and integrate their data sets without having to write code or involve IT personnel, allowing an analyst to pursue "what if" scenarios with the data and develop business insights much more quickly than with traditional hand-coded programming. The product has been designed to address the needs of both business users and IT administrators as enterprises look to balance the benefits of self-service with those of good data management and governance.
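For illustration only, here is a minimal PySpark sketch of the kind of Hadoop-native transformation described above: parse semi-structured data from HDFS, blend it with curated data via Spark SQL, and publish a result for BI tools. This is not UNIFi's API; the paths, tables and column names are hypothetical.

```python
# A minimal sketch (not UNIFi's API) of a Hadoop-native transformation:
# read semi-structured JSON from HDFS, join it to an existing curated table,
# and write the blended result back for downstream BI/visualization tools.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hdfs-transform-sketch").getOrCreate()

# Parse semi-structured data straight from HDFS (schema inferred).
events = spark.read.json("hdfs:///data/raw/web_events/")          # hypothetical path
accounts = spark.read.parquet("hdfs:///data/curated/accounts/")   # hypothetical path

# Blend the new source with existing curated data using Spark SQL.
events.createOrReplaceTempView("events")
accounts.createOrReplaceTempView("accounts")
result = spark.sql("""
    SELECT a.account_id, a.segment, COUNT(*) AS event_count
    FROM events e
    JOIN accounts a ON e.account_id = a.account_id
    GROUP BY a.account_id, a.segment
""")

# Publish the result to HDFS for BI tools to consume.
result.write.mode("overwrite").parquet("hdfs:///data/marts/account_activity/")
```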
Executive Director, Morgan Stanley, Sep 2014 - Mar 2016, New York, NY, US
Managed the Data Architecture, ETL & Reporting team within Wealth Management Core Processing. Led and contributed to all aspects of application development, including design, development and support for the data and ETL needs of various Core Processing applications. The Core Processing legacy platform is a batch, mainframe-based platform heavily dependent on VSAM and other disparate data sources; the team was formed to support the Core Processing transformation strategy. The team also provides ad hoc query support and self-service reporting to Operations and the Business by integrating and sourcing data across various data stacks, including Teradata, DB2, SQL Server, Greenplum and HDFS.
Responsibilities included:
• Strategic data platform renovation
• Reference data management
• Logical and physical data modeling in support of the platform renovation and of data marts for reporting
• ETL/ELT development
• Database development: tuning and stored procedure development
• Report development using tools such as Tableau, QlikView or BusinessObjects (BO)
Technologies used: Modeling: Erwin, PowerDesigner. ETL: Informatica. Database: Teradata, DB2 Mainframe, Linux, SQL Server. Data virtualization/federation: Composite. Reporting/visualization tools: BO, QlikView and Tableau.
Senior Practice Principal, Information Management & Analytics, Enterprise Services, HP, Aug 2012 - Sep 2014, Houston, Texas, US
Vertica: Retail - CRH (Customer Relationship Hub) project. The CDM and C&RA applications are increasingly used by business processes for credit decisioning, customer segmentation and profiling, risk scoring, bankruptcy automation, etc. The Teradata platform hosting the applications is costly in terms of both space and processing capacity. Moreover, the applications (CDM and C&RA) depend on the W as a data source, leading to longer downtimes in case of a disaster, and the lack of DR capability in the Teradata infrastructure puts these applications at risk and prevents meeting the 24-hour SLA for recovery. HP worked with the bank to consolidate the two consumer databases, CDM and C&RA, into a single Vertica instance: the Customer Relationship Hub.
Services provided: project management, architecture, data modeling, Vertica best practices, performance tuning, SIT and UAT testing, go-live cutover.
Accomplishments:
- Led the pre-sales POC effort, solutioning and staffing for the project
- Managed the client relationship and, on average, 4 HP resources, with average yearly revenue of $1MM at a 38% margin
Senior Practice Principal, Information Management & Analytics, Enterprise Services, HP, Feb 2012 - Sep 2014, Houston, Texas, US
Vertica project: BACARDI is an enterprise-class platform based on DB2 Data Warehouse Edition that carries a mixed workload of deep analytics and operational processing. The operational processing component contains approximately 80 terabytes of raw data. It is sourced from BAC core cards processing, which is mainframe based, and supports scoring and other high-SLA reporting functions that are not consistent with the operating characteristics of an analytical data warehouse. It also ingests scoring data from upstream analytical systems that is then forwarded back to the core cards systems. To meet escalating data growth, new project demands and the associated SLAs, BAC decided to move the operational component of BACARDI to a new Vertica platform. HP worked with the bank on the Vertica data migration effort by providing the following services:
• Project management
• Vertica training
• Vertica database architecture
• Vertica technical consulting
• Vertica ETL architecture
• Performance tuning
• Testing
• Production deployment and support
Accomplishments:
- Led the pre-sales POC effort, solutioning and staffing for the project
- Managed the client relationship and, on average, 5 HP resources, with average yearly revenue of $1.5MM at a 36% margin
- Ensured Vertica best practices were followed and deliverables were met
Senior Practice Principal, Information Management & Analytics, Enterprise Services, HP, May 2010 - Sep 2014, Houston, Texas, US
Engagement Manager: CRC (Client Reporting Center), BOA/Merrill Lynch. The Client Reporting Center (CRC) is a central data warehouse designed, built and deployed by Hewlett-Packard. The CRC project provides integrated, scalable, cross-entity, cross-product, books-and-records-based Prime Brokerage reporting for GMF&S clients. CRC is a central data warehouse collecting full volumes of core books-and-records data (trades, non-trades, positions, balances, prices, etc.) and a variety of reference data. The data is grouped into data marts providing specialized services per business unit or functional need. CRC collects over 30 different feeds each day, both real-time and batch; on average, it loads over 4 million trades and 6.6 million positions a day and collects many other subject areas such as account/party, product and rates. Services include data movement across the full latency spectrum, data modeling, reporting, reference data management, orchestration and data quality functions.
• CRC team leader responsible for overall project management with full client interface
• Oversaw all aspects of the engagement, including P&L management on an annual business volume of ~$5M
• Onshore and offshore resource allocation and expense management with over 25 HP direct reports
• Streamlined expenses to obtain a profit margin of 40%
• Logical and physical data modeling for new CRC projects
• Implemented a standard database deployment process for QA, UAT, PP and production deployments
• Led database migration efforts from Oracle 10g to Exadata 11gR2
• Analyzed and implemented RDW partitioning, compression and purging activities (see the sketch below)
• Worked with the Cognos reporting team on performance tuning within Cognos and the database layer
Tools: Oracle Database 10g migrated to Exadata; Informatica (ETL); Cognos (reporting); Ab Initio (orchestration, event management, scheduling)
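As a hedged illustration of the partition-maintenance work mentioned above, the sketch below drops aged monthly range partitions from an Oracle fact table to enforce a retention policy, driven from Python. It is not the CRC codebase; the table name, partition naming convention and connection details are assumptions.

```python
# A hypothetical sketch of Oracle partition purging: find monthly range
# partitions older than the retention window and drop them. The table name,
# the TRADES_P<YYYY>_<MM> partition naming convention and the connection
# details are all assumptions, not the actual CRC implementation.
from datetime import date, datetime
import cx_Oracle  # classic Oracle driver; python-oracledb is the newer equivalent

RETENTION_MONTHS = 24

def partition_month(partition_name: str) -> date:
    """Parse a partition name like TRADES_P2012_03 into the month it covers."""
    return datetime.strptime(partition_name.split("_P", 1)[1], "%Y_%m").date()

def months_old(d: date, today: date) -> int:
    """Whole months between the partition's month and today."""
    return (today.year - d.year) * 12 + (today.month - d.month)

conn = cx_Oracle.connect("rdw_user", "password", "db-host/ORCLPDB1")
cur = conn.cursor()

# List the fact table's partitions from the data dictionary.
cur.execute("""
    SELECT partition_name
    FROM user_tab_partitions
    WHERE table_name = 'TRADES_FACT'
    ORDER BY partition_position
""")

today = date.today()
for (partition_name,) in cur.fetchall():
    if months_old(partition_month(partition_name), today) > RETENTION_MONTHS:
        # DDL cannot use bind variables; names come straight from the catalog.
        cur.execute(f"ALTER TABLE trades_fact DROP PARTITION {partition_name} "
                    f"UPDATE GLOBAL INDEXES")
```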
Associate Director, Data Integration Services, Vitech Systems Group, Inc., Apr 2003 - May 2010
• Consulted with multiple clients on data conversion requirements.
• Participated in RFPs, sales presentations and client kickoff meetings.
• Successfully managed, developed and executed data conversions with import and export requirements for high-profile projects with revenues exceeding $50M. Clients included:
  o Ohio Police and Fire Healthcare
  o Central Valley Trust Teachers' Union
  o United Food and Commercial Workers of Northern California
  o Pennsylvania Public School Employees' Retirement System
  o Hawaii Retirement System (ERS)
  o State Teachers Retirement System of Ohio
  o Iowa Public Employees Retirement System
  o Directors Guild (DGA)
  o Writers Guild of America (WGA)
  o New York City Police Pension Fund
  o Hawaii State Employees Health Fund
• Standardized data conversion practices, bridging the gap between solution analysts and database developers.
• Formed, directed and managed a data team of 12 data architects.
• Outlined technical specifications and managed overall team efforts.
• Developed project implementation plans and timelines for data conversions.
• Analyzed data structures, data dictionaries, record layouts and operating environments for all client legacy system migrations.
• Established methods for extracting legacy data and loading the extracted data into Oracle staging tables.
• Partnered with technical architects, account teams, project managers, clients and third-party vendors on mapping client legacy systems into Vitech's system based on business requirements.
• Developed, configured, tested and maintained database programs.
• Documented results, detailed code comments and steps for dry-run and production cutover activities.
• Performed quality control of outputs and ensured test plans and all client certifications were in place for data conversion.
Senior Data Architect (Consultant), Citigroup, Inc., Aug 1999 - Apr 2003
• Developed data models within the guidelines of Citibank's Corporate Bank Data Model for the CitiDirect project.
• Developed logical entity-relationship models documenting the relationships, entities and attributes from user requirements, and forward-engineered them to the physical database model within HPS.
• Generated the Data Access Layer (DAL) used by the Java front end to access the database.
• Utilized star schemas to develop a data mart for direct debit reporting and inquiries.
• Performance-tuned DALs using Oracle's EXPLAIN PLAN combined with the DALENT log files, flagging any slow-performing DALs (see the sketch below).
• Generated DDL, foreign key constraints and indexes from CASE tools.
• Utilized ClearCase for version control of DDL and data migration scripts.
• Implemented DDLs and DALs in Dev, QC, UAT and production environments.
• Responsible for data migration and schema differences from DEV to QC to UAT to production environments.
• Collected business static data from various regions for population into the CitiDirect Oracle database using SQL*Loader and PL/SQL.
• Wrote and maintained stored procedures for loading banking data into CitiDirect.
• Analyzed application defects through TeamTrack.
• Created and executed various database scripts to resolve application defects.
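A small illustrative sketch of the tuning loop described above: run EXPLAIN PLAN for a query flagged as slow and print the optimizer's execution plan. This is not the original DAL tooling; the query text, table and connection details are placeholders.

```python
# Illustrative only: explain a slow query into PLAN_TABLE and render the plan
# with DBMS_XPLAN so full-table scans, join order and missing-index symptoms
# are visible. Query, table and connection details are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("app_user", "password", "db-host/ORCLPDB1")
cur = conn.cursor()

# A hypothetical DAL query flagged as slow in the logs.
slow_sql = "SELECT * FROM direct_debit_fact WHERE account_id = 12345"

# Ask the optimizer to explain the statement (it is parsed, not executed)...
cur.execute("EXPLAIN PLAN FOR " + slow_sql)

# ...then fetch the formatted plan from DBMS_XPLAN.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)
```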
Senior Analyst/Project Leader (Consultant), BASF Corporation, Mar 1991 - Aug 1999
• Analyzed, designed and implemented various logistical system projects.
• Created data models from user requirements using Oracle's CASE Designer and Dictionary.
• Forward-engineered DDL to the database and performance-tuned where applicable.
• Used ER diagrams to communicate and validate information requirements with users.
• Tested and ensured systems operation.
• Technical environment included HP/UNIX, Oracle, client/server, Novell LANs, SQL*Net, C and TCP/IP on various hardware platforms within an Ethernet network.
Import System Phase I
  o The system tracked import orders received from Germany in an Oracle database on an HP 9000 Series 800 machine running UNIX. Business units on separate LANs, made up of DOS PCs running Oracle tools, access this data. Communication is accomplished by running SQL*Net on top of TCP/IP on both client and server platforms. SQL*Forms is the primary client tool, with reporting from SQL*Forms screens utilizing SQL*Plus. The server side includes UNIX shell scripts, SQL scripts and Pro*C programs to maintain the database. Import order information from Germany is transferred to the HP from an IBM mainframe using file transfer over TCP/IP.
Import System Phase II
  o Migrated the existing Import Order Entry System from the mainframe to the platform used for Import System Phase I tracking, consolidating various business divisions into one BASF North America Import System. The new system provided additional interfaces with US Customs, customs brokers, inbound transporters and worldwide freight forwarders, increasing productivity and efficiency across project teams. The ER diagram consisted of over 450 entities with over 6,500 attributes. Oracle's CASE*Method tool emphasized entity relationship diagrams, function hierarchy, dataflow diagrams and module generation.
  o SAP R/2 interface with NAIS.
  o EDI with customs brokers: CUSDECs (shipping info, commercial invoice) to Fritz, and 856 (tracking information) and 810 (entry summary, duty and broker charges) from Fritz.
Senior Programmer Analyst (Consultant), AT&T, Apr 1990 - Mar 1991, Dallas, TX, US
• Designed, coded and tested programs, tables and SQL procedures for the Competitive Intelligence Database (CID) and other market research projects.
• Installed the Oracle database on AT&T 3B2s utilizing both raw devices and the UNIX file system.
• Set up the Oracle environment (tablespaces, rollback segments, user accounts and CRT definitions).
• Performed DBA performance tuning using the SQL*DBA monitor, TKPROF and EXPLAIN facilities.
• Designed and coded using SQL*Forms, SQL*Menu, SQL*Plus, Pro*C and UNIX shell scripts.
• Developed an ad hoc query program using dynamic SQL in Pro*C to allow end users to query the database (see the sketch below).
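A minimal Python analogue of the dynamic-SQL idea in the last bullet: build an ad hoc query at run time from user-selected columns and filters, while keeping bind variables for the filter values. The original program was written in Pro*C; the table, columns and connection details here are hypothetical.

```python
# A minimal analogue (not the original Pro*C program) of a dynamic ad hoc
# query tool: assemble SQL at run time from whitelisted columns and filters,
# binding the filter values. Table, columns and connection details are
# hypothetical.
import cx_Oracle

ALLOWED_COLUMNS = {"competitor", "region", "product", "price"}  # whitelist guards the dynamic SQL

def build_query(columns, filters):
    """Return a SELECT over the chosen columns with named bind placeholders."""
    cols = [c for c in columns if c in ALLOWED_COLUMNS]
    where = " AND ".join(f"{c} = :{c}" for c in filters if c in ALLOWED_COLUMNS)
    sql = f"SELECT {', '.join(cols)} FROM cid_facts"
    if where:
        sql += f" WHERE {where}"
    return sql

conn = cx_Oracle.connect("cid_user", "password", "db-host/ORCLPDB1")
cur = conn.cursor()

# The end user picks columns and filter values at run time.
filters = {"region": "SOUTHWEST"}
sql = build_query(["competitor", "price"], filters)
cur.execute(sql, filters)
for row in cur:
    print(row)
```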
Programmer Analyst, Thomas J. Lipton Co., Mar 1988 - Apr 1990
• Analyzed, designed and implemented various manufacturing-based systems on VAX/VMS using C with Oracle, Pro*C, SQL*Plus and SQL*Forms.
Automated Warehouse Project:
• Involved in all phases of this joint-venture project designed to increase productivity in the warehouse, using LXE terminals and AGVs.
Tea Production Monitoring System:
• Created data models from user requirements.
• Handled user training on the interactive system.
• Developed screens through SQL*Forms.
• Wrote programs in Pro*C to load data into Oracle from RMS files.
• Transferred data files via KERMIT.
Programmer Analyst, Donaldson, Lufkin & Jenrette (DLJ), May 1987 - Mar 1988
• Assisted the team with development of brokerage applications using Oracle, C, SQL, SQL*Forms, SQL*Loader and FORTRAN on a VAX cluster consisting of two 785s and a 750.
• Member of the team that developed a trading and sales system for operations and floor trades.
• Performed Oracle DBA functions: data modeling, performance tuning and database security.
• Used SQL*Plus to create Oracle tables, indexes, grants, synonyms and accounts.
• Designed a portfolio system consisting of portfolio and security look-up screens using SQL*Forms.
• Developed reports with SQL*Plus.
• Loaded tapes into Oracle tables using ODL and SQL*Loader.
Joseph Bernardo Skills
Joseph Bernardo Education Details
Fairleigh Dickinson University - Management Information Systems
Ramapo College of New Jersey - Computer Science
Frequently Asked Questions about Joseph Bernardo
What company does Joseph Bernardo work for?
Joseph Bernardo works for Databricks.
What is Joseph Bernardo's role at the current company?
Joseph Bernardo's current role is Senior Solution Architect - Financial Services.
What is Joseph Bernardo's email address?
Joseph Bernardo's email address is jo****@****ail.com
What is Joseph Bernardo's direct phone number?
Joseph Bernardo's direct phone number is +197853*****
What schools did Joseph Bernardo attend?
Joseph Bernardo attended Fairleigh Dickinson University and Ramapo College of New Jersey.
What skills is Joseph Bernardo known for?
Joseph Bernardo is known for skills including Data Warehousing, Business Intelligence, Requirements Analysis, SDLC, SQL, Project Management, Business Analysis, Enterprise Architecture, Integration, IT Strategy, Program Management and Solution Architecture.