With around 15 years of experience in computer science (banking retail/wholesale, e-commerce, ad tech, digital finance, travel, distributed systems, data discovery and data science) and in managing multiple engineering teams and systems, I love enabling engineers and building simple tech solutions for complex problems. Scale (traffic and data volume), search/discovery, and data management and modelling have been my key strength areas. Engineering humbleness is what drives me to connect and engage. I have an Electronics Engineering degree from BITS-Pilani, and I still love to code!
Harsh Tripathi Work Experience

Chief Technology Officer
Oolka
Bengaluru, Karnataka, India
CTO, Maya Bank / Head of Engineering, Maya
PayMaya Philippines
Mar 2021 - Present
Mandaluyong, Metro Manila, Philippines
Vice President of Engineering
Paytm
Nov 2017 - Mar 2021
Noida, Uttar Pradesh, India
Engineering leadership at User Platform (consumer onboarding, accounts, KYC), Wallet, and Risk Profiling. Managed the core platform to enable scale (50K+ TPS), stability (99.99%), and security for the overall financial ecosystem at Paytm.
Director of Engineering
GoFro.com
Aug 2015 - Nov 2017
Delhi, India
Joined as a founding member, built a team of passionate engineers, and set up engineering processes. Played a crucial role in building the holidays platform on highly scalable, distributed systems. Stayed highly hands-on for fast delivery and product shaping, responsible for the following business verticals:
* Holiday Search: discovery of hotels, flights, transport, cities and fixed packages, plus enablement of DIY holiday packaging
* Advisor Platform: enabled fast growth from 15 internal holiday advisors to 150+ external advisors
* Data Analytics Platform: org-wide internal discovery for all functions (sales/ops/marketing/finance); a key system for corporate efficiency
* Order Post-Processing Systems: for post-booking of holiday packages, aimed at an informative pre-travel and in-travel experience
* Companion App: a polished native experience for customers; managed the outsourced build and later brought ownership to the internal team
* Tooling for ops efficiency: heavy use of ML/NLP to cut 90% of manual hotel/room mapping tasks, and an intelligent ML-driven crawler for price-competitiveness analysis
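The hotel/room mapping task mentioned above typically reduces to fuzzy matching of property names across supplier feeds. As a minimal sketch of that core idea (class and method names are illustrative, not GoFro's actual pipeline, which also used ML/NLP models), token-set Jaccard similarity catches most word-order and punctuation variants:

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch: score how likely two hotel names from different
// supplier feeds refer to the same property, using token-set Jaccard
// similarity. A real pipeline would add normalization, geo checks and
// an ML ranker on top; this shows only the string-matching core.
public class HotelNameMatcher {

    // Lowercase and split on non-alphanumerics to get a token set.
    static Set<String> tokens(String name) {
        Set<String> out = new HashSet<>();
        for (String t : name.toLowerCase().split("[^a-z0-9]+")) {
            if (!t.isEmpty()) out.add(t);
        }
        return out;
    }

    // Jaccard similarity: |intersection| / |union| of the token sets.
    public static double similarity(String a, String b) {
        Set<String> ta = tokens(a), tb = tokens(b);
        if (ta.isEmpty() || tb.isEmpty()) return 0.0;
        Set<String> inter = new HashSet<>(ta);
        inter.retainAll(tb);
        Set<String> union = new HashSet<>(ta);
        union.addAll(tb);
        return (double) inter.size() / union.size();
    }

    public static void main(String[] args) {
        System.out.println(similarity("The Taj Palace, Mumbai", "Taj Palace Mumbai"));
    }
}
```

A mapping job would compare each incoming name against candidate properties and auto-accept pairs above a tuned threshold, leaving only low-confidence pairs for manual review.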
Senior Engineering Manager
Snapdeal
Oct 2013 - Aug 2015
Gurugram (Gurgaon), Haryana, India
Led multiple marketplace systems, building teams and products from scratch to support massive seller growth; the Snapdeal marketplace peaked at 200,000+ sellers and millions of products. Key systems and teams managed:

Seller Search
* Designed and built the search and indexing system for product/SKU searches by suppliers
* Designed the system to guarantee ordering of document updates without compromising indexing throughput
* Set up the right search and index analyzer chains for the most relevant results, and optimized indexing performance
* Supported ingestion of analytical compute metrics as a feedback loop for business-optimal relevance
* Ran on a massive production infrastructure: a large Apache Solr node cluster, clustered MongoDB, and a custom topology of RabbitMQ exchanges/queues for pipelining

Seller Recommendations
* A computed-data serving system handling massive volume, with pluggable data ingestion and discovery support
* Provided sellers with specific recommendations about their own performance, peer performance, and which new products to offer

Shield (Product Fraud Analysis)
* Protected both sellers and Snapdeal from erroneous activity, for example selling banned drugs in the furniture category
* A highly configurable rule-based workflow system that scanned seller profiles, 100+ product categories, and millions of products to discover anomalies

Seller Self-Serve
* Built to create transparency and efficiency between sellers and Snapdeal by giving sellers live answers instead of routing them through manual support
* A highly performant aggregator providing online insights to sellers on onboarding, product listing, shipment, and payment disputes
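One common way to guarantee ordering of document updates without serializing the indexing pipeline is to version each update and let the index drop anything stale, so parallel workers can race safely. A minimal sketch of that idea (class and method names are illustrative, not Snapdeal's actual implementation, which pipelined through RabbitMQ into Solr):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of version-gated indexing: every update carries a monotonically
// increasing version, and the index keeps only the newest version per
// document id. Concurrent workers can apply updates in any order; the
// final state is as if updates had been applied in version order.
public class VersionedIndex {

    public static final class Doc {
        final String body;
        final long version;
        Doc(String body, long version) { this.body = body; this.version = version; }
    }

    private final Map<String, Doc> index = new ConcurrentHashMap<>();

    // Returns true if the update was applied, false if it was stale.
    public boolean upsert(String id, String body, long version) {
        Doc updated = new Doc(body, version);
        // merge() is atomic per key, so racing workers cannot interleave
        // the compare-and-replace for the same document id.
        Doc winner = index.merge(id, updated,
                (current, incoming) -> incoming.version > current.version ? incoming : current);
        return winner == updated;
    }

    public String get(String id) {
        Doc d = index.get(id);
        return d == null ? null : d.body;
    }

    public static void main(String[] args) {
        VersionedIndex idx = new VersionedIndex();
        idx.upsert("sku-1", "first body", 1L);
        idx.upsert("sku-1", "second body", 2L);
        System.out.println(idx.get("sku-1"));
    }
}
```

Because stale updates are simply dropped rather than queued behind a lock, indexing throughput scales with worker count while the per-document ordering guarantee still holds.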
Tech Lead
Tribal Fusion
Feb 2011 - Oct 2013
United States
Core developer for tens of Expo9 RMI-based application servers. Exponential has provided ad-serving capabilities to a huge consumer base since 1999, serving millions of ads in various formats across the world, and is among the top three advertising providers with 500+ million unique users. Expo9 is the ad-server backend that fuels the ad server with high-end advertising configuration, monitoring, and reporting; it consists of many Java RMI servers and a web server, each connected via a common RMI server dispatcher.
Challenges:
* Highly distributed system consisting of tens of server instances
* Very large advertising dataset and complex advertising workflows
* High concurrent usage of a single database instance shared by multiple servers
Involvement:
* Design/implementation for advertising business needs (user targeting, mobile partner integration, multiple ad sizes per campaign)
* Design/implementation of a MailService framework aimed at high configurability and generic use
* Design/implementation of a REST-based framework to expose existing operations as RESTful endpoints
* Design/implementation of a generic test framework with in-container and mock testing approaches, augmented with XML-based input and output configuration
* Design/implementation of a team workflow application based on OSWorkflow (as used by JIRA)
* Objectivity-integrated and standalone tool framework setup
* Implementation of a query server based on the connect protocol
* Implementation of a message sync server using JMS and Hibernate
Technologies: Solaris 10, Windows 2007, Java, Jersey (REST), Spring, Hibernate, J2EE components, Jackson, JSON, Oracle 10g, Objectivity/DB (OODBMS)
Lead Product Engineer, Search System
Iron Mountain Digital
May 2009 - Jan 2011
Lead developer for the Search Server, which hosts the search/indexing application on a customized Tomcat instance. Implemented index and configuration management.
Challenges:
* Search performance on tens of millions of indexed documents
* Efficient management of indexes/catalogs holding millions of records
Involvement:
* Migrated the Search Server from a 32-bit to a 64-bit Windows server
* Designed and implemented a generic event-transition framework to capture operation (search/indexing) state transitions and related attributes
* Designed and prototyped a distributed message-logging mechanism best suited for event logging
* Customized SQLite nmake files to build 64-bit-compatible SQLite binaries and a JDBC driver; the commonly used driver wasn't available in 64-bit mode
* Prototyped web-server distribution techniques using open-source frameworks like Apache Tribes, ZooKeeper, and JGroups; implemented a custom multicast messaging framework for connecting multiple JVMs
* Implemented caching of internal index-ids to improve search performance
* Designed/implemented a JMX management module to expose live attributes during search/indexing operations
* Implemented throttling and quotas on documents sent for indexing jobs
* Implemented search-term tokenization in Java as a replacement for native tokenization
* Designed/implemented a search-federation feature in the Search Server client
* Implemented T-SQL procedures for client-side indexing/search request routing management and status updates for documents/operations
Technologies: Windows 2003, Windows 2008 64-bit, Java, JMX, JNI, dtSearch search engine libraries (32/64-bit), Spring, Apache Tribes, Axis2, JSP, Servlet, Struts, JDBC, Tomcat 6.18, SQL Server 2005/2008, CLR assemblies (C#), SQLite, T-SQL
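Caching internal index-ids, as mentioned above, is usually done with a small bounded LRU cache so hot documents resolve in memory instead of hitting the catalog. A minimal sketch (capacity and names are illustrative, not the actual Search Server code) using `LinkedHashMap` in access-order mode:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an LRU cache for internal index-ids. LinkedHashMap with
// accessOrder=true moves an entry to the tail on every get/put, so the
// head is always the least recently used entry; overriding
// removeEldestEntry evicts it once capacity is exceeded.
public class IndexIdCache extends LinkedHashMap<String, Long> {
    private final int capacity;

    public IndexIdCache(int capacity) {
        super(16, 0.75f, true); // true = access-order iteration
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, Long> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        IndexIdCache cache = new IndexIdCache(2);
        cache.put("docA", 101L);
        cache.put("docB", 102L);
        cache.get("docA");       // touch A so B becomes eldest
        cache.put("docC", 103L); // evicts B
        System.out.println(cache.keySet());
    }
}
```

A lookup path would consult this cache before the on-disk catalog, turning repeated index-id resolutions during a search into O(1) map hits.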
Senior Software Engineer, eDiscovery Product
Iron Mountain Digital
Dec 2006 - Apr 2009
Core developer for SPC (Stratify Processing Console); implemented key workflows with extensive multithreading for data/document processing, plus various generic frameworks for parallelism, error handling, and integration modules.
Challenges:
* Managing large volumes (tens of millions of records/documents, tens of TBs) for processing
* Highly complex document-processing workflows with massive asynchronous activity
Involvement:
* Key developer for SLDS 7.x, 8.x, and 9.x versions of the product
* Designed/implemented the Import-Manager module, which extracts data into SQL Server; a highly database-centric module with complex logic implemented in T-SQL procedures/functions
* Designed/implemented the Extraction-Manager module, which manages multiple extraction jobs in parallel according to the workflow and integrates the Extraction and Galileo servers
* Designed/implemented the Controller module for sequence and dependency management of processing steps, enforcing workflow order
* Upgraded app servers from Tomcat 5.5 to Tomcat 6.18, including impact analysis, third-party dependencies, and changes to existing Tomcat customizations
* Designed/implemented a StrutsTestCase/Cactus/JUnit-based testing framework to perform in-container testing of workflows and achieve higher coverage
* Customized the Cactus source code to support JUnit 4 tests
* Implemented custom log4j components to split logs by HTTP request parameter in a web application, enabling separate log files per web module
* Implemented T-SQL procedures for transactional database logic, and responsive UI development using Prototype/AJAX/JavaScript
* Refactored JICE to support complex value objects
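Running multiple extraction jobs in parallel, as the Extraction-Manager bullet describes, maps naturally onto a bounded thread pool: independent jobs run concurrently while the caller blocks until all complete. A minimal sketch (class and method names are illustrative, not the actual SPC module):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of parallel job execution: a fixed-size pool runs independent
// document-extraction jobs concurrently; invokeAll blocks until every
// job finishes and returns futures in submission order, so results line
// up with the input job list.
public class ExtractionRunner {

    public static List<String> runAll(List<Callable<String>> jobs, int workers) {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        try {
            List<String> results = new ArrayList<>();
            for (Future<String> f : pool.invokeAll(jobs)) {
                results.add(f.get()); // rethrows a job's failure, if any
            }
            return results;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException("extraction job failed", e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        List<Callable<String>> jobs = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            final int n = i;
            jobs.add(() -> "extracted doc-" + n);
        }
        System.out.println(runAll(jobs, 2));
    }
}
```

Bounding the pool size is the key design choice: it caps concurrent load on downstream servers (the workflow dependency the Controller module enforces) while still keeping all workers busy.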
Associate System Engineer
Tata Consultancy Services
Oct 2005 - Dec 2006
Mumbai, Maharashtra, India
Paragon Credit Risk Management System: key contributor on the Deutsche Bank credit risk management system team. Paragon was a comprehensive risk and limit-monitoring system providing consolidated exposure information on trades. Other functionality was being merged into Paragon, including Knowledge Stream, a portal web application used for consolidated data publishing and workflow management.
Involved in:
* Portal administration for Knowledge Stream
* Code development enabling application-specific logging in Paragon using log4j
* Code development and JMS setup for synchronizing communication between multiple managed servers and a CORBA server
* Design and testing of application-repository-wide search capabilities as a proof of concept using Lucene
* Enabling clustering in the Paragon application to improve scalability, load balancing, and failover
* Automation with Ant of various manual tasks previously performed to set up different testing environments
* Application setup on a WebLogic 8.1 SP3 server hosted on Unix and Windows machines
Technologies: UNIX, JDBC, Java, JSP, Servlet, JMS, WebLogic 8.x, Oracle 8i, PL/SQL
Associate System Engineer
Tata Consultancy Services
Sep 2004 - Sep 2005
Mumbai, Maharashtra, India
Online banking capabilities for wholesale and retail customers: developed a generic Java/XML-based framework for faster end-to-end development of J2EE applications. The framework wires the various J2EE stack artifacts together, letting developers become productive very fast; wiring definitions connect front-end (JSP/HTML), middle-tier (XSQL, XSL, DTD, validator XML), and back-end (views, triggers, packages, procedures, functions) components. The framework was a success and was adopted by various other teams working on banking requirements.
Designed and developed a reporting framework for the Cash, Payment, and Collection modules, including an architecture providing integrated XML-based report generation: from XML input, reports can be generated in HTML, PDF, and CSV formats.
Performed system testing, documented system-testing specs for specific modules, and followed up with error correction.
Technologies: UNIX, JDBC, Java, JSP, Servlet, JMS, WebSphere, XSLT, XSL-FO, Oracle 8i, PL/SQL
Harsh Tripathi Skills
Java, Java Enterprise Edition, Spring, Spring Framework, Hibernate, XSLT, Algorithms, Agile Methodologies, Object-Oriented Design, Linux, Web Development, Software Development
Harsh Tripathi Education Details
- Birla Institute of Technology and Science, Pilani: Electrical and Electronics
Frequently Asked Questions about Harsh Tripathi
What company does Harsh Tripathi work for?
Harsh Tripathi works for Oolka.
What is Harsh Tripathi's role at the current company?
Harsh Tripathi's current role is Chief Technology Officer.
What is Harsh Tripathi's email address?
Harsh Tripathi's email address is ha****@****ail.com
What schools did Harsh Tripathi attend?
Harsh Tripathi attended Birla Institute Of Technology And Science, Pilani.
What skills is Harsh Tripathi known for?
Harsh Tripathi has skills like Java Enterprise Edition, Algorithms, Agile Methodologies, Spring Framework, Hibernate, XSLT, Object-Oriented Design, Linux, Web Development, Spring, Java, and Software Development.