A certified, seasoned architecture leader with a passion for creating business- and product-oriented, secure, cloud-native solutions. I have spearheaded initiatives delivering scalable, efficient solutions that harness high-volume, high-velocity, high-variety data for actionable insights, complex reporting and data management, in both streaming and batch architecture patterns.
Principal Solution Architect
Equifax | Feb 2020 - Oct 2024 | Atlanta, GA, US
- Cloud-native transformation and migration of Equifax Consumer Credit mainframe batch and online applications to the Equifax Data Fabric.
- Architecture and design of an innovative, cloud-native solution allowing Equifax customers on Google Cloud, Snowflake and AWS to perform identity resolution without sharing their PII data.
- Strategy, design and reference implementation for schema management and versioning of Equifax's enterprise data assets, allowing schema evolution without impact to applications.
- Architecture and design of the metadata-driven Equifax Global Model Execution Platform, hosting and executing ~900 polyglot score models and feature modules, including ML, Python, C++, Java and Scala models.
- Performance and cost optimization of the Equifax global batch platform, reducing job execution times and cost per run by 4x-10x and storage cost by 30%.
- Architecture and design of a regulated, petabyte-scale audit data lake and platform, capturing data movement and transformation across ingestion pipelines and fulfillment APIs for regulatory audits and business verification.
- Architecture and design of the Equifax Data Accountability platform to discern accountability for supplier data received and processed across systems and platforms, generating ~90,000 reports/month as part of CFPB regulatory requirements.
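The schema management and versioning work above hinges on evolving schemas without breaking existing readers. A minimal sketch of an Avro-style backward-compatibility check (field names and the simplified rules are illustrative assumptions, not Equifax's actual implementation):

```python
# Simplified, Avro-style backward-compatibility check: can a reader using
# new_schema decode records written with old_schema? Illustrative only.

def is_backward_compatible(old_schema, new_schema):
    """Schemas are dicts mapping field name -> {"type": ..., "default": ...}."""
    for name, field in new_schema.items():
        if name in old_schema:
            # A changed type breaks decoding under this simplified rule.
            if old_schema[name]["type"] != field["type"]:
                return False
        else:
            # A field the old writer never produced must carry a default.
            if "default" not in field:
                return False
    # Fields the reader dropped are simply ignored, so they never conflict.
    return True

v1 = {"ssn_hash": {"type": "string"}, "score": {"type": "int"}}
v2 = {"ssn_hash": {"type": "string"},
      "score": {"type": "int"},
      "segment": {"type": "string", "default": "unknown"}}

print(is_backward_compatible(v1, v2))  # True: adding a defaulted field is safe
```

Under rules like these, adding defaulted fields and dropping unused ones are non-breaking, which is what allows applications to keep running across schema versions.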
Solutions Architect - Next-Gen Data Analytics
HCL Technologies | Jul 2018 - Feb 2020 | Noida, Uttar Pradesh, India
- Provided the solution for a real-time, cloud-native, next-gen data analytics platform for analysis of organizational data (financial, operational, customer, supply chain, risk, fraud & UW, marketing, etc.)
- Performed due diligence across the organization to identify analytics gaps, pain points and current data maturity, and provided a roadmap along with a PoC implementation
- Proposed a cloud-native solution architecture for the analytics platform along with a PoC implementation
- TCO comparison of different cloud platforms (AWS, GCP, Azure) for different configurations of the proposed architecture
- Design and development of configuration-based, reusable ingestion and ETL pipelines
- Design and development of an ABC (Audit, Balance & Control) framework
- Setup and configuration of the cloud platform on Amazon Web Services (AWS), including:
  - VPC, subnets (private & public), route tables, security groups
  - NAT Gateway and Internet Gateway
  - EC2 instances with EBS volumes
  - Big data clusters: Zookeeper, Kafka, Kafka Connect, Schema Registry, Elastic Stack, Spark (EMR), Hive, Presto, Dremio, DOMO
Technology Stack: Amazon Web Services, Confluent OSS Platform 5.1 (Kafka Connect, Kafka 2.x, Schema Registry), Zookeeper 3.4.12, Spark 2.3.x/2.4.x, Elastic Stack 6.5.4, Linux (CentOS 7), S3/HDFS, Hadoop 2.9.x, Scala 2.11.11, JDK 8u191, Hive 2.3.4, Presto 0.215, Dremio Community Edition 3.0.6
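Configuration-based pipelines like the ones above are typically assembled from a declarative spec so a new pipeline needs only new config, not new code. A minimal sketch (stage names and the config layout are hypothetical, not the HCL framework's actual API):

```python
# Minimal sketch of a config-driven ETL pipeline: each stage is looked up
# by name in a registry and chained over the record batch.

STAGES = {
    "trim":      lambda rows: [r.strip() for r in rows],
    "dedupe":    lambda rows: list(dict.fromkeys(rows)),  # keeps first occurrence
    "uppercase": lambda rows: [r.upper() for r in rows],
}

def run_pipeline(config, rows):
    """config: {"stages": ["trim", "dedupe", ...]} -- hypothetical layout."""
    for stage_name in config["stages"]:
        rows = STAGES[stage_name](rows)
    return rows

config = {"stages": ["trim", "dedupe", "uppercase"]}
print(run_pipeline(config, [" a ", "a", "b "]))  # ['A', 'B']
```

An ABC (Audit, Balance & Control) layer would typically wrap each stage in a setup like this, recording input/output counts so record loss is detected per stage.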
Solutions Architect - Network Security Advanced Analytics
HCL Technologies | Oct 2016 - Jan 2019 | Noida, Uttar Pradesh, India
The Network Security Advanced Analytics project was conceived with the primary objective of using data science and machine learning to identify information security threats. The project's primary goals can be summarized as:
- Analyze and visualize the enormous amount of network security data available within THD infrastructure from disparate sources
- Reduce the time to identify potential security threats, with minimal false positives
- Enable SOC analysts to take timely and necessary corrective measures to mitigate security threats
Key Responsibilities:
- Architected a real-time, cloud-native (on-premise compatible) platform for monitoring and analysis of network security data, supporting real-time monitoring, anomaly detection, threat identification and mitigation by SOC analysts and managers
- Data volume/velocity: ~10-15 TB/day and ~500,000 events per second
- Parsing and processing of events such as IPFIX v10, NetFlow 9, Juniper/Palo Alto/Fortinet syslogs, McAfee Web Gateway, CEF logs and API Gateway events
- Supported protocols: UDP, TCP, HTTP(S)
- Designed and developed real-time, scalable, resilient, configuration-based:
  - data ingestion & forwarding framework
  - event & syslog parsing framework with cleansing, deduplication, transformation and aggregation capabilities
- Designed and developed a common data processing framework and ad-hoc utilities with capabilities including:
  - variable-rate streaming
  - event playback
  - exactly-once event processing
  - auto-offset management
  - back-pressure handling
  - variable-rate storage
  - configurable event sinks
  - retry & dead letter queues
Technology Stack: Google Cloud Platform, Nginx 11, Akka, Zookeeper 3.4.6, Kafka 0.8.2.2, Spark 2.0.2/2.2.x, ELK Stack 5.x/6.x, Linux (Debian 9), BigQuery, GCS/HDFS, Hadoop 2.7.3, Scala 2.11.8, JDK 8u161, Bash 4
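Of the event formats listed above, CEF has a simple pipe-delimited header followed by key=value extensions. A minimal sketch of a CEF header parser (escaping rules omitted; sample values are made up, and this is not the production parser):

```python
# Minimal CEF (Common Event Format) header parser -- illustrative only.
# A CEF line looks like:
#   CEF:0|vendor|product|version|signature_id|name|severity|key1=v1 key2=v2

def parse_cef(line):
    if not line.startswith("CEF:"):
        raise ValueError("not a CEF event")
    # Split into the 7 header fields plus the trailing extension blob.
    parts = line.split("|", 7)
    header_keys = ["version", "vendor", "product", "device_version",
                   "signature_id", "name", "severity"]
    event = dict(zip(header_keys, parts[:7]))
    event["version"] = event["version"][len("CEF:"):]
    # Extensions are space-separated key=value pairs (value escaping ignored).
    event["extensions"] = dict(
        kv.split("=", 1) for kv in parts[7].split() if "=" in kv
    ) if len(parts) > 7 else {}
    return event

evt = parse_cef("CEF:0|PaloAlto|PAN-OS|9.1|THREAT|spyware|5|src=10.0.0.5 dst=10.0.0.9")
print(evt["vendor"], evt["extensions"]["src"])  # PaloAlto 10.0.0.5
```

At ~500,000 events/second, parsers like this sit behind the ingestion framework and feed cleansing, deduplication and aggregation stages downstream.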
Solutions Architect - Integrated Data Warehouse (IDW)
HCL Technologies | Jan 2016 - Oct 2016 | Noida, Uttar Pradesh, India
Integrated Data Warehouse (IDW) was a scalable BI platform that could adapt to the speed of the business by providing relevant, accessible, timely, connected and accurate data. IDW was a hybrid data warehouse built on a Hadoop data lake and Teradata, providing services to reporting tools to generate the reports used by business users.
Key Responsibilities:
- Requirement analysis, source-to-target mappings and identification of transformation rules
- Implementation of ETL jobs for source data cleansing (pre-prep), transformation (prep) and storage in the data lake and Teradata (dispatch), per the solution architecture
- Refactoring of transformation job code and scripts to reduce development and testing effort and time
- Optimization and refactoring of Pig scripts to reduce job runtime by 3-4x
- Worked with the CI/CD team to implement automated deployment of job artifacts
- Review of code and design artifacts for completeness, correctness and quality
- Unit testing, QA testing and UAT support
- Created the production deployment plan and automated deployment scripts, and supported the admin team with job deployment in test and production environments
Technology Stack: Java 7, Hadoop 2.7.3, Hive 0.14, Pig 0.14.0, HBase 0.98.x, Teradata, Bash 4, Maven 3.3.9, Jenkins, GitHub, Rally, Linux (CentOS 7)
Advisory System Analyst - UPS | Freight Shipping System
IBM | Jun 2015 - Jan 2016 | Armonk, NY, US
The UPS Freight Shipping system encompasses two web applications, UPS Internet Freight Shipping (UIFS) and Freight Calculate Time and Cost (FCTC), and a backend component, the Freight Access Component (FAC). Collectively, the system is responsible for air and ground freight shipping, rating, commodities, documents, pickup requests and much more.
Key Responsibilities:
- Project Role: Project Lead
- Managed and led a team of 4
- Requirement and architecture analysis and gap identification for initiatives impacting freight shipping
- Collaboration with upstream/downstream applications, BAs, architects and business users across portfolios
- Review of design and development artifacts for completeness, correctness and quality
- Project planning, scheduling, tracking and issue & risk management
- QA testing and UAT support
Advisory System Analyst - UPS | Global Locator Redesign
IBM | Jul 2014 - May 2015 | Armonk, NY, US
The UPS Global Locator Redesign initiative sought to redesign the Global Locator application for UPS.com and its APIs to better serve customer needs and to promote locations and services that maximize UPS profitability, brand recognition and strategic relationships.
Key Responsibilities:
- Project Role: Application Architect, Project Lead
- Managed and led a team of 15
- Estimation, project proposal, project plan and schedule, and solution architecture
- Design and implementation of the client and server architecture
- Designed and implemented enhancements to POX and Web Services APIs for location services
- Design and development of new REST API consumption for enhanced features
- Designed and implemented dynamic page refresh using jQuery and Ajax for map drag capabilities
- Instituted coding standards and code reviews to enforce them
- Performance test planning and review, performance improvements and code optimizations
- Collaboration with data architects and DBAs on data modeling
Advisory System Analyst - UPS | SCL Modernization
IBM | Aug 2013 - Jul 2014 | Armonk, NY, US
The primary objective was to migrate the existing UPS location database from mainframe DB2 to SQL Server 2012, including:
- Design and development of new SOAP Web Services and POX APIs (wrapper services)
- Design and development of new web applications for batch processing of location data & notifications
Key Responsibilities:
- Project Role: Application Architect & Project Lead
- Managed, directed and led a team of 20 as Application Architect and Project Lead
- Collaborated with client management, BAs and architects to assess, define and finalize the business and technical scope of the project
- Estimation, project proposal, solution architecture and high/low-level design
- Project planning and scheduling, phase definitions and delivery
- Collaborated with data architects, security architects, application & infrastructure architects and DBAs to review the solution architecture
- Designed and implemented web applications, Web Services (SOAP, POX), batch processing and notifications
- Instituted coding standards and code reviews to enforce them
- Performance test planning and review, performance improvements and code optimizations
- Project management: resource and task management, demand-capacity, issue and risk management
Senior System Engineer - UPS | Global Locator Family Apps & Open Account
IBM | Feb 2012 - Dec 2013 | Armonk, NY, US
The UPS.com Global Locator Family consists of five applications, namely SCL Services, GL Proxy, HPDOL, ASO and RDO, of which only SCL was a backend application. Open Account is a flagship application that allows UPS.com customers to open a shipping account for shipping/pickups through UPS.com. It also allows customers to manage their account preferences and ownership, and to avail discounted rates and promotions.
Key Responsibilities:
- Project Role: Project Lead & SME
- Successfully led and managed 4 enterprise releases and 9 maintenance releases
- Designed, developed and enhanced location APIs with multiple interface types, including SOAP, POX and RMI
- Project management, client PMO management, demand-capacity, issue and risk management, vendor management
- Requirement analysis, architectural solutions, data modeling
- Production support management for an entire portfolio of ~35 applications
- Designed and developed multiple proofs of concept supporting application design
- Designed and optimized the account injection process to achieve 500x faster performance
- Acted as a consultant to client needs, providing complex and/or critical custom solutions within the existing architecture and its limitations
- Automated the Open Account process for the APAC region by implementing legal-document upload functionality
- Designed the process and architecture to integrate with a third-party vendor for document processing and viewing by region representatives
- Designed and implemented a critical maintenance release for legal requirements of pharmaceutical shipments
- Provided a centralized solution for all UPS.com applications to display promotional banners on UPS.com and integrate with the promotions application
- Designed and developed the integration of the United Airlines promotion program with the Open Account application, allowing United Airlines reward members to open a UPS account
Senior System Engineer - UPS | Message Consolidator
IBM | Aug 2011 - Dec 2012 | Armonk, NY, US
Message Consolidator was developed as a new backend application focused on centralized consolidation of event-based UPS.com emails. The application was also responsible for defining and maintaining UPS.com email templates.
Key Responsibilities:
- Successfully led and managed 3 enterprise releases and 1 maintenance release as Technical Lead
- Business requirements analysis, sizing, estimation and planning
- Developed the new application architecture and underlying framework for the email template engine using XML, XSL, XSD and XSLT
- Designed and developed a proof-of-concept application supporting the design
- Developed the complete first release of the application end-to-end for all emails
- Data modeling of the database for email consolidation and retry logic (fail-safe)
- Developed a scheduler-threads framework for consolidating emails from databases across data centers
- Implemented XSL transformation using more than 2 XML files; created and implemented translation bundles in 150 locales
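Serving templates in 150 locales typically means resolving each key through a fallback chain (full locale, then language, then default). A minimal sketch of that lookup (bundle contents and names are made up; the real engine worked over XML/XSLT templates):

```python
# Minimal locale-bundle lookup with fallback -- hypothetical bundles only.

BUNDLES = {
    "en":    {"greeting": "Hello"},
    "fr":    {"greeting": "Bonjour"},
    "fr_CA": {},  # empty bundle: falls back to plain "fr"
}

def translate(key, locale, default_locale="en"):
    # Try the exact locale, then its language part, then the default bundle.
    for candidate in (locale, locale.split("_")[0], default_locale):
        bundle = BUNDLES.get(candidate, {})
        if key in bundle:
            return bundle[key]
    raise KeyError(key)

print(translate("greeting", "fr_CA"))  # Bonjour (falls back from fr_CA to fr)
```

The fallback chain keeps the per-locale bundles sparse: a locale only overrides the strings that actually differ from its base language.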
Senior System Engineer - UPS | SMS Data Manager
IBM | Aug 2010 - Jul 2011 | Armonk, NY, US
SMS Data Manager (SDM) was developed as a backend application providing a centralized solution for all UPS applications' SMS alerts, notifications and marketing information to end customers of UPS. SDM was also responsible for interactive message communication from end customers, such as opt-in and opt-out from their mobile devices.
Key Responsibilities:
- Successfully led and deployed 2 enterprise releases and 1 maintenance release as Team Lead
- Business requirements analysis, sizing, estimation and planning
- Developed the new application architecture and underlying framework
- Designed and developed major modules of the application, such as opt-in, opt-out, keyword and supported countries, with service and RMI interfaces
- Developed the integration with the vendor to send outgoing SMS messages and receive incoming messages for processing
- Designed and developed a manual testing tool for 1st- and 2nd-level support to handle customer calls
Senior System Engineer - UPS | Global Locator Family Applications
IBM | May 2009 - Jul 2010 | Armonk, NY, US
The UPS.com Global Locator Family consists of five applications, namely SCL Services, GL Proxy, HPDOL, ASO and RDO.
Key Responsibilities:
- Project Role: Team Lead
- Successfully deployed 3 enterprise releases and 2 maintenance releases as Technical Team Lead and onshore coordinator
- Took knowledge transfer from the UPS SME and became productive quickly while building UPS business & application knowledge
- Proactively created the project plan for the July '10 enterprise release and maintenance releases
- Coordinated with a third-party vendor as well as other UPS development teams for development and maintenance of the Global Locator application
- Created the new application architecture and ported the existing application to it
- Coordinated with all 17 downstream client applications as well as SCL's 2 backend third-party vendors
- Created application support documents (AID, ASCP and support handover documents) for all 5 applications
System Engineer - Contracts OnLine (Releases 4, 3, 2)
IBM | Feb 2007 - May 2009 | Armonk, NY, US
Contracts OnLine (COL) is considered a strategic asset to IBM Corporation. COL is an external IBM web application that provides electronic support for the contracting process, including contract presentation, customer review workflow, negotiation & acceptance, and archiving, replacing e-mail, fax and post for distributing and signing paper contracts.
COL Release 4.0 re-architected the complete COL application onto an MVC-II framework and exposed contract lifecycle management capabilities (workflows, signatures, etc.) as Web Services. COL Releases 3.0 and 2.0 focused on providing enhanced functionality and broadening the user base.
Key Responsibilities:
- Successfully led Release 4.0 as Technical Team Lead; worked as Application Developer in Releases 2.0 and 3.0
- Led a team of ~30 developing application modules per the new architecture
- Assisted the application architect in creating the new application architecture
- Development of use cases and high- and low-level design documents (EDD/IDD)
- Designed and developed the core application framework and functional modules of the application
- Designed the schema for exposing application capabilities as Web Services and integrated the services with the core business logic in Release 4.0
- Handled the additional roles of configuration manager and SME for function point analysis in Releases 3.0 & 4.0
- Designed the architecture of a module for i18n implementation; received the prestigious IBM Bravo award in Release 3.0
- Prepared database scripts, data files, SQL files and deployment instruction documents for production and fix-pack releases
- Enabled 4 new leads, ~25 other team members (developers) and 6 QA through knowledge transfer for the application, code, database and scheduled jobs
System Engineer - GART 2.0
IBM | Aug 2006 - Feb 2007 | Armonk, NY, US
Global Analysis Reporting & Tools 2.0 (GART) is an information delivery system based on then-new SAS COTS technology that replaced many of GM's legacy systems. The purpose of the GART 2.0 project was to provide a common global analysis and reporting system for warranty claims, vehicles, dealers and other early-warning quality data.
Key Responsibilities:
- Development of functional GUI pilot wireframes
- Development of client-side scripting functionality with full cross-browser testing
- Development and coding of application components in Struts
Vinay Vyas

Education
Indian Institute of Technology, Kharagpur
Agricultural Engineering [Major: Farm Machinery and Power]

Maharana Pratap University of Agriculture and Technology
Agricultural Engineering