Pradeep Tiptur Nagaraja

AI Data Engineer and Modeler @ TEAM LEASE SERVICES PRIVATE LIMITED
Bengaluru, KA, IN
Pradeep Tiptur Nagaraja's Location
San Francisco Bay Area, United States
About Pradeep Tiptur Nagaraja

• Passionate about Data & Analytics; quick to adapt to emerging data management technologies.
• Involved in all phases of data lifecycle management, including data creation/acquisition, storage, usage, MDM, metadata, governance, quality, auditing, and analytics of enterprise data solutions.
• Skilled in data architecting, data engineering, data modeling, data warehousing, and data visualization.
• Data Architect / Modeler: Seasoned data professional with 10+ years architecting and re-architecting scalable, efficient solutions and developing integration and migration strategies while ensuring security and compliance.
• Experience in data architecture, data modeling, metadata & master data management (MDM) for OLTP, OLAP, business E-R modeling, supply chain management, and business intelligence solutions.
• Experience in data warehouse design, dimensional and data vault modeling, data profiling, data scaling, data governance, and ETL features such as change data capture and slowly changing dimensions.
• Design data structures to support advanced analytics, integrating reporting tools like Tableau.
• Data Engineer: 5+ years building data pipelines and BI applications for data analytics & visualization.
• Experience in AWS using Elastic MapReduce (EMR), EC2, Redshift, and Lambda for data processing.
• Experience designing and implementing large-scale data pipelines and real-time data processing solutions using Apache Spark, Kafka, and Flink, with a good understanding of Apache Iceberg features.
• Proficient in Python with hands-on experience in Spark SQL, Kafka Streams, and the Flink Table API.
• Understanding of DBT for data transformation tasks and of setting up workflows (DAGs) in Airflow.
• Worked on functions, triggers, stored procedures, complex queries, CTEs, views, and materialized views.
• Data Visualization: Experience developing dashboards in Tableau for Finance, Sales, Marketing, and Customer Service, creating line & scatter plots, histograms, bar, pie, and dot charts, box plots, time series, multiple axes, subplots, etc.
• Performance Optimization: Data model & query optimization, benchmarking, partitioning & bucketing, caching strategies, concurrency management, batch-size tuning, and monitoring & profiling.
• Collaboration: Work with stakeholders, business teams, subject matter experts, and internal IT teams.
• HIPAA: Knowledge of Health Insurance Portability & Accountability standards and compliance issues.
• Domains: Banking, financial, pension, healthcare, and manufacturing, working in Agile methodology.
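The partitioning & bucketing strategy mentioned above spreads rows across a fixed number of buckets by hashing a key, so rows with the same key always land together and co-bucketed tables can be joined bucket-by-bucket. A minimal pure-Python illustration of the idea (the bucket count, field names, and sample rows are invented for illustration; real workloads would use the engine's built-in bucketing):

```python
def bucket_of(key, num_buckets):
    """Assign a string key to a bucket deterministically (a stable
    stand-in for hash() so runs are reproducible)."""
    return sum(key.encode()) % num_buckets

def bucketize(rows, key_field, num_buckets):
    """Split rows into num_buckets lists by hashing the bucketing key."""
    buckets = [[] for _ in range(num_buckets)]
    for row in rows:
        buckets[bucket_of(row[key_field], num_buckets)].append(row)
    return buckets

orders = [{"cust": "alice", "amt": 10}, {"cust": "bob", "amt": 20},
          {"cust": "alice", "amt": 5}]
buckets = bucketize(orders, "cust", 4)
# all "alice" rows land in the same bucket, so a join on cust
# only needs to compare matching bucket pairs
```

Because the bucket assignment depends only on the key, two tables bucketed the same way on the same key can be joined without a full shuffle.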

Pradeep Tiptur Nagaraja's Current Company Details
TEAM LEASE SERVICES PRIVATE LIMITED

Team Lease Services Private Limited

AI Data Engineer and Modeler
Bengaluru, KA, IN
Website:
teamlease.com
Pradeep Tiptur Nagaraja Work Experience Details
  • Team Lease Services Private Limited
    AI Data Engineer and Modeler
    Team Lease Services Private Limited
    Bengaluru, KA, IN
  • Lyft
    Data Engineer / Data Modeler
    Lyft Jul 2020 - Present
    San Francisco, CA, US
    Environment: AWS, S3, Redshift, DBT, Tableau, Python, Airflow, Erwin, Apache Kafka, Flink, Iceberg
    • Collaborate with business stakeholders to gather data requirements; identify the types of data, relationships, and constraints; and create conceptual, logical, and physical data models using Erwin.
    • Ensure data models are optimized for performance, scalability, and flexibility.
    • Worked on data ingestion pipelines from multiple sources to feed Tableau dashboards.
    • Implemented slowly changing dimensions (SCDs) and fact table designs to support historical analysis.
    • Implement and manage data models in DBT for accurate data transformation per business needs.
    • Worked on a master data management (MDM) solution to ensure enterprise-wide data consistency.
    • Designed and implemented high-throughput, low-latency data pipelines using Apache Kafka and Flink.
    • Optimized Kafka cluster configurations for improved performance and fault tolerance.
    • Built stateful stream processing applications using Flink's DataStream and Table APIs.
    • Developed fault-tolerant, scalable Flink jobs with checkpointing & savepoints for production systems.
    • Implemented complex event processing (CEP) patterns for detecting fraud in financial transactions.
    • Experience designing DAGs and creating, debugging, scheduling, and monitoring Airflow jobs in Python.
    • Designed & developed Tableau dashboards in line with Finance, Sales, and Marketing KPIs.
    • Work with cross-functional teams, stakeholders, and business analysts to qualify and accomplish scope.
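The Type 2 slowly changing dimension pattern mentioned above keeps history by closing out the old row and appending a new version whenever a tracked attribute changes. A minimal pure-Python sketch of that merge logic (the real implementation lived in DBT/Redshift; the `id`/`name` fields and sample rows here are illustrative):

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Apply a Type 2 SCD merge: expire changed rows, append new versions.

    dimension: list of dicts with keys id, name, valid_from, valid_to, is_current
    incoming:  list of dicts with keys id, name (today's snapshot)
    """
    current = {r["id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        old = current.get(row["id"])
        if old is None:
            # brand-new entity: insert as the current version
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif old["name"] != row["name"]:
            # tracked attribute changed: close the old row, open a new one
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # unchanged rows are left untouched
    return dimension

dim = [{"id": 1, "name": "Acme", "valid_from": date(2020, 1, 1),
        "valid_to": None, "is_current": True}]
scd2_merge(dim, [{"id": 1, "name": "Acme Corp"}, {"id": 2, "name": "Globex"}],
           date(2021, 6, 1))
# dim now holds three rows: the expired "Acme" version plus two current rows
```

Fact tables then join to the dimension on the key plus a date falling inside the [valid_from, valid_to) window, which is what enables the historical analysis.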
  • Visa
    Data Architect / Modeler / Engineer
    Visa Nov 2019 - Jun 2020
    Foster City, CA, US
    Environment: SQL Server, SSIS, Tableau, Python, Airflow, Erwin, DBT, Apache Spark
    • For enterprise audit reports, evaluated the existing data architecture, including the data model, storage, and flow.
    • Re-architected legacy data pipelines in Data Vault 2.0 for better scalability and flexibility in handling large, diverse data volumes, leveraging distributed computing and parallel processing while meeting data governance and compliance requirements.
    • Developed and optimized ETL processes for data ingestion and transformation of the legacy system.
    • Implemented fault-tolerant, scalable data pipelines and complex data transformations using Spark's resilient distributed datasets (RDD APIs), DataFrames, and Spark SQL to load data into data warehouses.
    • Optimized Spark SQL queries for data transformations by tuning memory, partitioning, and caching.
    • Developed reusable Spark components for common ETL patterns and transformations.
    • Orchestrated multi-stage Spark pipelines using workflow management tools like Apache Airflow.
    • Developed SQL queries in DBT to enhance data transformation processes and improve overall performance.
    • Created stored procedures, functions, complex queries, and SSIS packages to develop ETL data pipelines.
    • Interacted with SMEs, stakeholders, and analysts to assist in their transition from the legacy to the newer version.
    • Production support of multiple SSIS applications with csv, JSON, and flat-file data sources.
    • Designed & developed business, operational, analytical, and executive KPI dashboards with year-over-year, quarter-over-quarter, YTD, QTD, and MTD types of analysis.
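The YTD/QTD/MTD analyses mentioned above all reduce to computing a period start for a given reporting date and summing facts inside that window. A small pure-Python sketch of the window logic (the dates and amounts are made up; the actual dashboards computed this in SQL/Tableau):

```python
from datetime import date

def period_start(d, period):
    """Return the first day of the year/quarter/month containing d."""
    if period == "YTD":
        return date(d.year, 1, 1)
    if period == "QTD":
        quarter_first_month = 3 * ((d.month - 1) // 3) + 1
        return date(d.year, quarter_first_month, 1)
    if period == "MTD":
        return date(d.year, d.month, 1)
    raise ValueError(period)

def to_date_total(facts, as_of, period):
    """Sum (date, amount) facts falling in [period start, as_of]."""
    start = period_start(as_of, period)
    return sum(amt for d, amt in facts if start <= d <= as_of)

facts = [(date(2020, 1, 15), 100), (date(2020, 4, 2), 50),
         (date(2020, 5, 20), 25), (date(2020, 6, 30), 10)]
as_of = date(2020, 5, 31)
# YTD: 175 (Jan 15 + Apr 2 + May 20); QTD: 75 (Q2 starts Apr 1); MTD: 25
```

Year-over-year and quarter-over-quarter comparisons then just evaluate the same window against a shifted `as_of` date.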
  • Stanford Health Care
    Data Architect / Modeler / ETL Developer
    Stanford Health Care Jul 2019 - Oct 2019
    Palo Alto, CA, US
    Environment: Azure SQL Server, SSIS, SAP BODS, Python, Tableau, Erwin
    • Analyzed existing membership and claims processing from the provider/payer side, with the corresponding EDI transactions (834, 835, 837) and an ETL application built on SAP BODS with heterogeneous data sources.
    • Designed the target cloud data architecture and the ETL processes for moving the on-prem data warehouse to a cloud-based one.
    • Redesigned data models, adapted schemas for cloud optimization & storage, and planned for data partitioning.
    • Developed a phased migration approach, planned data validation & reconciliation, and implemented data migration solutions for csv, flat-file, and JSON source file formats.
    • Designed & developed Tableau visualization solutions for financial & operational analysis.
  • Rodan + Fields
    ETL/Tableau Developer
    Rodan + Fields Nov 2018 - Jun 2019
    San Francisco, CA, US
    Environment: SQL Server 2017, SSIS, SQL Profiler, MySQL, Tableau
    • Created SSIS packages for data extraction from multiple sources into a SQL Server data warehouse.
    • Designed transformations and created SSIS packages to extract data from csv files, flat files, and SQL Server, using Pivot, Fuzzy Lookup, and Aggregate transformations to generate source data for MIS & BI reports.
    • Created Tableau dashboards with interactive views & visualizations of various industry-specific KPIs.
    • Created filters, table calculations, parameters, formulas, and calculated sets for preparing dashboards and worksheets in Tableau; restricted data for particular users using row-level security & user filters.
    • Built Tableau dashboards for measures with forecast and trend/reference lines, with user-level security.
  • GlobalFoundries
    MTS Software Engineering
    GlobalFoundries Jun 2015 - Oct 2018
    Malta, NY, US
    Environment: Oracle 12c/11g, PHP, Perl, Django Python Framework, DB2, VersionOne, ERwin
    • Implemented conceptual, logical, and physical data models in 3NF, star, and snowflake schemas using ERwin.
    • Application development and production support of a PHP/Perl-Oracle web-based application.
    • Collated enhancements/bugs from various user teams, translated them into technical requirements, coded enhancements, and delivered solutions using PHP, Perl, and JavaScript with Oracle.
    • Used Oracle Data Integrator (ODI) Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the primary application database; provided continuous enhancement and review of MDM matching rules, metadata, data quality, and validation processes.
    • Developed data mapping, data governance, transformation, and cleansing rules for the MDM.
    • Utilized data from an external provider to properly classify MDM data components (customer category, sub-category); facilitated requests for adds/changes to MDM subject areas (new fields/lists of values).
    • Developed packages, collections, cursors, ref cursors, procedures, triggers, views, and functions.
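MDM matching rules like those reviewed above typically normalize candidate fields before comparing them, so superficially different records resolve to one master. A toy pure-Python sketch of an exact-match-after-normalization rule (the real rules lived in ODI and were far richer; the field names and sample records are invented):

```python
import re

def normalize(name):
    """Canonicalize a customer name: lowercase, strip punctuation and
    common legal suffixes, collapse whitespace."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    name = re.sub(r"\b(inc|corp|llc|ltd)\b", "", name)
    return " ".join(name.split())

def match_groups(records):
    """Group records whose normalized names collide (a simple match rule)."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec["name"]), []).append(rec["id"])
    return groups

records = [{"id": 1, "name": "Acme, Inc."},
           {"id": 2, "name": "ACME Inc"},
           {"id": 3, "name": "Globex Corp."}]
groups = match_groups(records)
# ids 1 and 2 collide on the key "acme" and are flagged as one master record
```

Production match rules usually layer fuzzy scoring on top of this kind of canonicalization rather than relying on exact key collisions alone.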
  • Byer California
    Oracle Applications Developer
    Byer California Jan 2012 - May 2015
    San Francisco, CA, US
    Environment: Oracle Applications 11i, Oracle APEX 4.2, Forms, Reports, XML Publisher, BI Publisher
    • Worked as a techno-functional engineer to develop, implement, and support various customizations of the Manufacturing, Financial & Distribution modules.
    • Worked with SMEs to prepare business rules & values for each material MDM field, plus data cleansing and standardization for data extraction from external provider source systems.
    • Worked on Procure-to-Pay & Order-to-Cash processes and Customer Data Management modules.
    • Implemented module integrations with AR, GL, AP, OM, Bills of Material, Service Contracts, Drop Shipping, PO, WMS, and INV modules within the EBS system.
    • Formulated SQL queries, functions, and processes per customization requirements for Oracle APEX.
    • Good understanding of the EDI-to-OM dataflow (Transaction Set #850, Purchase Order).
    • Developed new reports and customizations of standard reports using Oracle Reports, APEX & XML Publisher in the Distribution, Finance, Manufacturing & CRM (Services, Contracts, Field Service) modules.
  • NTT Data Americas
    Data Modeler / ETL Developer / Project Lead [Client - Wells Fargo Bank]
    NTT Data Americas Dec 2009 - Jan 2012
    Plano, Texas, US
    • Collaborated with data architects and data & business analysts to support design and data analysis needs.
    • Designed a conceptual data model using ERwin to design normalized logical & physical databases.
    • Generated PL/SQL scripts for schemas; created users, roles, collections, functions, stored procedures, and triggers; and managed the data dictionary with Oracle & SQL Server databases.
    • Involved in the design, development, and testing of ETL processes, including analysis of source system data, design of high-level process flows, creation of staging schemas, and mapping of source to target data.
    • Designed ETL processes using Informatica PowerCenter to load data from multiple sources to targets through data transformations, mappings, mapplets, sessions, and workflows.
    • Defined, configured, and optimized various MDM processes, including landing, staging, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups, and packages.
    • Developed Python scripts to extract, transform, and load data from csv and xml files into a MySQL database.
    • Collaborated with client management, subject matter experts & business analysts.
    • Managed a project team of designers, developers, and testers; participated in team meetings with client managers to report work progress and plan work allocation strategies.
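A csv-to-database load like the Python scripting described above follows the usual extract/transform/load shape. A minimal stdlib-only sketch (sqlite3 stands in for the MySQL target here, and the order columns are invented for illustration):

```python
import csv
import io
import sqlite3

# extract: parse csv rows (an in-memory file stands in for a source csv)
raw = io.StringIO("order_id,amount,region\n1,100.50,west\n2,80.25,east\n")
rows = list(csv.DictReader(raw))

# transform: cast types and normalize the region code
records = [(int(r["order_id"]), float(r["amount"]), r["region"].upper())
           for r in rows]

# load: bulk-insert into the target table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
# total == 180.75
```

Swapping the connection for a MySQL driver (e.g. a DB-API-compatible connector) leaves the extract and transform steps unchanged, which is what makes the pattern reusable across targets.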
  • NTT Data Americas
    Data Modeler / ETL Developer [Client - Standard & Poor's]
    NTT Data Americas Jun 2009 - Nov 2009
    Plano, Texas, US
    • Worked on requirements gathering, data modeling (logical/physical data modeling using the ERwin tool), database design, and development for a global re-tooling initiative to modernize and streamline back-office content capture, authoring, and management processes.
    • Involved in Oracle database design using database normalization principles and generation of E-R diagrams using ERwin.
    • Created packages, procedures, functions, views, and triggers with Oracle XML DB for data management; coordinated with front-end program developers; managed the data dictionary.
    • Designed ETL processes using Informatica, similar to Oracle Data Integrator, to load from sources to targets through data transformations, mappings, mapplets, sessions, and workflows.
    • Developed Unix shell scripts to process data loads and automate jobs.
  • NTT Data Americas
    Oracle Developer [Client - Public Sector Enterprise Group]
    NTT Data Americas Jan 2009 - Jun 2009
    Plano, Texas, US
    • Analyzed the existing database design and redesigned it per new enhancements.
    • Created or modified stored packages, procedures, triggers, synonyms, and views across multiple databases for the required functionality.
    • Tuned SQL queries, procedures, functions, and packages using EXPLAIN PLAN and TKPROF.
    • Created Perl scripts for scheduling and managing jobs.
    • Worked with DBAs to set up databases for the QA & UAT teams; created users, roles & privileges for database developers, application, and QA teams.
    • Participated in implementation status meetings with business, testers, and users to resolve functional and UAT issues.
  • NTT Data Americas
    Oracle Developer / Project Lead [Client - Royal London Group, UK]
    NTT Data Americas Mar 2008 - Dec 2008
    Plano, Texas, US
    Environment: Oracle 10g, ASP, ADP, ERwin. Role: Oracle Consultant, Project Lead
    Royal London Group is the UK's largest mutual life, pensions, and investment company.
    • Worked on custom applications supporting their pension funds & retirement planning tools, managing enhancements in discussion with subject matter experts.
    • Translated data requirements into technical specifications; developed data & dimensional models to validate that application data meets business requirements.
    • Responsible for review of database requirements; mentored PL/SQL developers; assisted with SQL tuning; created views, tables, triggers, procedures, and snapshots; and reviewed code.
    • Coordinated a project team of 12 members consisting of application developers, database developers, and QA members.
    • Participated in weekly status meetings with client managers to report work progress and plan work allocations.
  • NTT Data Americas
    ETL Developer [Client - Wells Fargo Bank]
    NTT Data Americas Nov 2007 - Mar 2008
    Plano, Texas, US
    • Gathered requirements from end users and translated them into successful technical designs.
    • Designed and developed PL/SQL solutions to migrate data from the existing MySQL database.
    • Efficiently implemented change data capture (CDC) to extract information from numerous MySQL data sources.
    • Extracted and loaded data stored in a multi-level hierarchy using PL/SQL.
    • Loaded dimension, fact & exception tables and automated email generation on exceptions.
    • Improved performance using Oracle partitions, indexes & other performance tuning techniques.
    • Created Autosys JIL scripts, control files, parameter files, and Unix environment files.
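Where log-based capture isn't available, change data capture can be done by diffing consecutive snapshots keyed on the primary key and emitting insert/update/delete events. A pure-Python sketch of that comparison (the row shapes are illustrative; the original work was PL/SQL against MySQL sources):

```python
def capture_changes(previous, current):
    """Diff two snapshots (dicts keyed by primary key) into CDC events."""
    changes = []
    for key, row in current.items():
        if key not in previous:
            changes.append(("INSERT", key, row))       # new key appeared
        elif previous[key] != row:
            changes.append(("UPDATE", key, row))       # existing row changed
    for key in previous:
        if key not in current:
            changes.append(("DELETE", key, previous[key]))  # key vanished
    return changes

prev = {1: {"balance": 100}, 2: {"balance": 200}}
curr = {1: {"balance": 100}, 2: {"balance": 250}, 3: {"balance": 10}}
events = capture_changes(prev, curr)
# events: an UPDATE for key 2 and an INSERT for key 3; key 1 is unchanged
```

The resulting event list is exactly what downstream dimension and fact loads consume, which is why snapshot-diff CDC pairs naturally with the SCD loads above.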
  • NTT Data Americas
    Oracle Developer / Tech Lead [Client - McKesson Corporation]
    NTT Data Americas Oct 2005 - Oct 2007
    Plano, Texas, US
    McKesson Corporation is a leading healthcare provider in pharmaceutical distribution, healthcare services, and medical supplies & equipment.
    • Worked on custom applications managing their pharmaceutical & medical equipment supply chain.
    • Collaborated with clients and end users to develop a requirements-gathering template that reduced the effort for that phase by 50 percent and ensured exact requirement capture.
    • Involved in reviewing requirements and database designs, creating stored procedures, functions, and triggers, and managing the data dictionary.
    • Generated graphical data, including histograms, using Python's NumPy and matplotlib modules.
    • Developed Python scripts to extract, transform, and load data from csv and xml files into a MySQL database.
    • Responsible for analysis and implementation of enhancements to existing applications.
    • Worked closely with the client to research and resolve UAT issues.
    • As technical lead, was instrumental in initiating and developing best practices for software quality enhancement and development, thereby reducing the number of recurring bugs.
  • NTT Data Americas
    Oracle Developer [Client - Agile Business Suite, AUS]
    NTT Data Americas Dec 2004 - Sep 2005
    Plano, Texas, US
    Unisys Agile Business Suite (AB Suite®) is a model-driven development environment for mission-critical solutions. It supports design, development, generation, and ongoing maintenance of high-performance, highly available, enterprise-class applications.
    • Developed procedures, functions, and packages to implement business logic using PL/SQL.
    • Created database triggers and managed the subprograms and triggers.
    • Analyzed, identified, and implemented user roles and assigned the relevant database access privileges.
    • Performed optimization and performance tuning to improve the product's efficiency.
  • NTT Data Americas
    Oracle Developer [Client - Profund Systems, UK]
    NTT Data Americas Mar 2004 - Nov 2004
    Plano, Texas, US
    Profund Systems is a pension fund administration software provider.
    • Worked on the implementation of actuarial calculations and associated annuities based on different pension plans.
    • Participated in discussions with subject matter experts and created technical design documents and specifications for implementation reference.
    • Developed PL/SQL packages, procedures, and triggers for data processing per business requirements.
    • Created scripts for tables, views, synonyms, sequences, and user roles & privileges.
  • Quasar Innovations Pvt. Ltd.
    Software Developer
    Quasar Innovations Pvt. Ltd. Jun 2003 - Feb 2004
    Bangalore, Karnataka, IN
    Environment: Oracle 8i, Visual Basic 6.0, Crystal Reports
    • Involved in system study, analysis, design, development, and implementation initiatives.
    • Worked with subject matter experts to obtain requirements and business rules and document them.
    • Created packages, tables, functions, procedures, triggers, and views per business rules.
    • Used localization techniques in designing data structures so they could be adapted to various languages and regions without engineering changes.
    • Developed UI screens in Visual Basic and generated reports using Crystal Reports.
    • Responsible for setting up databases for the development, QA & UAT teams; created users, roles & privileges for database developers, application, and QA teams.
    • Involved in performance tuning of database objects using Explain Plan and TKProf.
  • Vesta Software Technologies Pvt. Ltd.
    Data Modeler / Software Developer
    Vesta Software Technologies Pvt. Ltd. Nov 2000 - May 2003
    Environment: Oracle 8i, Visual Basic 6.0, Crystal Reports
    • Involved in analysis, data modeling, solution strategy development, and implementation of the project in accordance with business requirements.
    • As technical lead, responsible for design and development of the Production & Export modules and MIS reports in Crystal Reports, and for coordinating development activities with the team.
    • Designed and implemented operational databases, data marts, and a data warehouse; generated scripts to create schemas, users, roles, and tables; and maintained the data dictionary.
    • Created tables, views, cursors, procedures & triggers to support business requirements.
    • Worked closely with subject matter experts to identify and specify business requirements and processes; researched and evaluated alternative solutions and made recommendations.
    • Assisted the project leader in establishing the project's software development process based on company standards and project-specific requirements; instrumental in initiating and developing best practices for software quality enhancement, thereby reducing recurring bugs.
    • Coordinated and trained end users and worked on go-live issues.
    • Assisted client senior management with their reporting requirements for business analysis.
  • Cogn Software Systems Pvt. Ltd
    Software Developer
    Cogn Software Systems Pvt. Ltd Jun 2000 - Oct 2000
    Environment: Oracle 8i, Visual Basic 6.0, Crystal Reports
    • Application development support for a medical warehouse requirement.
    • Developed tables, synonyms, sequences, views, cursors, stored procedures, and triggers.
    • Designed & developed UI screens using VB 6.0 and generated reports using Crystal Reports.

Pradeep Tiptur Nagaraja Skills

Oracle Applications, Oracle PL/SQL Development, Data Migration, Informatica, Visual Basic, Crystal Reports, Oracle E-Business Suite, SQL*Plus, Database Design, Unix, Project Management, Requirements Gathering, SDLC, Data Analysis, Erwin, ETL, Data Modeling, PL/SQL

Pradeep Tiptur Nagaraja Education Details

  • B. M. S. College Of Engineering
    Mechanical Engineering
  • Vijaya High School

Frequently Asked Questions about Pradeep Tiptur Nagaraja

What company does Pradeep Tiptur Nagaraja work for?

Pradeep Tiptur Nagaraja works for Team Lease Services Private Limited.

What is Pradeep Tiptur Nagaraja's role at the current company?

Pradeep Tiptur Nagaraja's current role is AI Data Engineer and Modeler.

What is Pradeep Tiptur Nagaraja's email address?

Pradeep Tiptur Nagaraja's email address is pr****@****ies.com

What schools did Pradeep Tiptur Nagaraja attend?

Pradeep Tiptur Nagaraja attended B. M. S. College Of Engineering, Vijaya High School.

What skills is Pradeep Tiptur Nagaraja known for?

Pradeep Tiptur Nagaraja has skills like Oracle Applications, Oracle PL/SQL Development, Data Migration, Informatica, Visual Basic, Crystal Reports, Oracle E-Business Suite, SQL*Plus, Database Design, Unix, Project Management, and Requirements Gathering.

Who are Pradeep Tiptur Nagaraja's colleagues?

Pradeep Tiptur Nagaraja's colleagues are Ranjith Kumar Ms, Raj Kumar, Nathaniel Thompson, Sheikh Imran, K Venkat, Meenakshi Gupta, Gaurav Rajput.
