Rudolph Daniel

Rudolph Daniel Email and Phone Number

Data Engineer @ UPS
Ground Floor, Abhilasha building, Behind NABARD Office 6, Royd Street, Kolkata, West Bengal 700016, India
Rudolph Daniel's Location
United States
Rudolph Daniel's Contact Details
About Rudolph Daniel

15+ years of experience in software development, specializing in Big Data and Analytics. Extensive hands-on experience developing enterprise-level big data applications involving ingestion, curation, anonymization, extraction, and publishing using Hive, Pig, Spark (Scala), and Sqoop for processing structured and semi-structured data on the Hortonworks distribution of Hadoop. Worked as a Solution Designer building a Big Data Enterprise Viewership Platform in a data lake for a large telecom provider. Designed and developed Pig solutions to handle dynamic event ingestion into the core lake, and Sqoop jobs to ingest metadata into HDFS for a data integration project. Designed and developed Hive and Spark (Scala) ingestion/curation/extraction/publish jobs to send and receive data from HDFS tables and provide business-ready extracts as Parquet files to Amazon S3.

Was involved in architecting an implementation using microservices, Kafka, and MongoDB for satellite TV provisioning, and in a real-time Kafka/Spark Streaming solution for app health data using the Data Router pub/sub framework hosted on a Kafka cluster. Worked on data mining and real-time analytics on clickstream data for a cable and satellite TV provisioning data integration project. Wide experience in data mining, real-time analytics, and business intelligence. Extensive hands-on experience in ETL, data warehousing, and data delivery using Informatica, Trillium, and Unix from data sources including DB2, Netezza, Mainframe, Informix, SQL Server, and Oracle. Extensive experience designing and programming for relational databases, including DB2, Oracle, and Netezza. Experienced in developing campaigns and reports using Epiphany, MicroStrategy 9.x, Crystal Reports, and Business Objects, and Answers reporting in OBIEE. Trained in AI and machine learning as well as R programming and PySpark.

Rudolph Daniel's Current Company Details
UPS

Data Engineer
Website:
ups.com
Company email:
sef@ups.com
Rudolph Daniel Work Experience Details
  • UPS
    Senior Data Engineer
    UPS Aug 2018 - Present
    Atlanta, GA, US
    Implementing an enterprise data lake in the Azure cloud with Databricks (Spark); part of the big data team consolidating data from an on-prem HDInsight cluster to Azure ADLS.
    • Design and develop jobs using Azure, Databricks, HDInsight, Python, Scala, Hive, Spark, and Sqoop to pull logistics and campaign data into the on-premise data lake (HDFS) as well as the Azure cloud data lake (ADLS) and DBFS (Databricks File System), and provide business users with analytical and aggregate data for KPI reporting.
    • Responsible for implementing an Azure cloud big data orchestration solution with Data Factory jobs to perform data collection and ingestion into Azure Data Lake and Blob storage from heterogeneous source systems.
    • Responsible for creating Spark jobs in Azure Databricks notebooks (Python/Scala) to move raw data files from the raw layer to the analytical, processed, and aggregate layers as ORC, Parquet, and Avro files within the Azure data lake ecosystem, building an enterprise customer data platform.
    • Designed and developed an on-premise HDInsight solution to handle a short-term metric requirement for the CDP platform, using Python scripts to pull data from AWS S3, APIs, and FTP servers.
    • Extensive experience working with Spark via Azure Databricks notebooks in both Python and Scala.
    • Designed and developed Spark jobs using Azure Data Factory and Scala notebooks to curate complex, high-volume hourly Adobe Analytics Manager data from AWS S3.
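The raw-to-analytical-to-processed-to-aggregate movement described in this role can be sketched as a small path-promotion helper. This is a pure-Python illustration only: the layer names, the storage account name, and the ADLS path convention are assumptions, not the actual implementation, and the real jobs run as Databricks notebooks.

```python
# Hypothetical sketch of a layered data-lake layout. Layer names and the
# "examplelake" ADLS account are illustrative assumptions.
LAYERS = ["raw", "analytical", "processed", "aggregate"]

def layer_path(container: str, layer: str, dataset: str, fmt: str = "parquet") -> str:
    """Build an ADLS-style path for a dataset in a given lake layer."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return f"abfss://{container}@examplelake.dfs.core.windows.net/{layer}/{dataset}.{fmt}"

def next_layer(layer: str) -> str:
    """Return the layer a dataset is promoted into (raw -> analytical -> ...)."""
    i = LAYERS.index(layer)
    if i == len(LAYERS) - 1:
        raise ValueError("aggregate is the final layer")
    return LAYERS[i + 1]
```

In a real notebook the same convention would feed `spark.read`/`DataFrame.write` calls; the helper only captures the promotion order between layers.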
  • AT&T
    Big Data Developer
    AT&T Jul 2016 - Jul 2018
    Dallas, TX, US
  • AT&T
    Senior Big Data Application Designer
    AT&T Jun 2016 - Jul 2018
    Dallas, TX, US
    • Design and develop jobs using Unix, Informatica, Hive, Spark, Pig, Sqoop, and TWS to pull viewership data into HDFS ecosystems and provide business-ready extracts to downstream users.
    • Responsible for creating Informatica/big data jobs, mappings, and workflows implementing ingestion/curation/extraction/publish frameworks for collecting and moving viewership metadata and activity data from Oracle to and within the data lake ecosystem, creating an enterprise viewership platform.
    • Designed and developed Hive and Spark jobs to curate complex, high-volume hourly STB data.
    • Designed and developed FPE (format-preserving encryption) jobs to enable SPI data compliance on the datasets.
    • Developed a framework to publish incremental data from HDFS to an AWS S3 bucket.
    • Responsible for building an enterprise platform warehousing business-ready data and metrics for TV viewership from multiple device platforms, such as IPTV boxes, satellite STBs, and online streaming (mobile apps, smart TVs, browsers).
    • As part of this project, associates from Cognizant owned the platform architecture, data-flow design, development, unit testing, and system testing across multiple Agile teams.
    • Gathered understanding of the various viewership source systems used by AT&T and performed source data analysis to provide feasibility studies and estimates for ETL/big data feeds for the use case under consideration.
    • Responsible for implementing the models along with the architects so that information represented in heterogeneous source systems is available for analytical reporting by the big data COE teams.
    • Provided post-production support, understanding user issues and providing solutions in the form of enhancements, defect fixes, and explanations for applications deployed to production during the warranty period.
    • Successful deployment of releases into production and job monitoring with the TWS scheduler to ensure schedules run as expected and clean data is loaded for downstream delivery.
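The incremental-publish framework mentioned in this role (HDFS to S3) typically works off a watermark: each run selects only records newer than the last published timestamp and then advances the watermark. A minimal sketch, with illustrative field names and in-memory records standing in for HDFS tables:

```python
# Hypothetical sketch of watermark-based incremental publishing.
# "load_ts" and the in-memory record list are illustrative assumptions.

def publish_incremental(records, last_watermark):
    """Return (records newer than last_watermark, new watermark)."""
    fresh = [r for r in records if r["load_ts"] > last_watermark]
    # If nothing new arrived, the watermark stays where it was.
    new_watermark = max((r["load_ts"] for r in fresh), default=last_watermark)
    return fresh, new_watermark
```

In production the watermark would be persisted (e.g. in a control table) between runs so a failed publish can safely be retried without duplicating data downstream.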
  • Cox Communications
    ETL Tech Lead
    Cox Communications Apr 2012 - Jun 2016
    Atlanta, GA, US
    Responsibilities:
    • Represented the EBI technical team during source system analysis and business requirement gathering with users.
    • Gathered the reporting/ETL requirements along with the business analyst, completing end-to-end upstream and downstream analysis to ensure the required data was not already available in the existing data warehouse.
    • Created the data model and ensured that it conforms to dimensional-modeling reporting needs.
    • Responsible for end-to-end verification of gathered requirements and functional specifications, producing the technical design document and source-to-target mapping documents.
    • Development, review, and testing of jobs extracting data from Oracle, SQL Server, Informix, and MySQL sources into the EDW on an Oracle Exadata database.
    • Responsible for analyzing and fixing issues in the production environment on a priority basis.
    • Worked closely with the enterprise business intelligence group to provide design suggestions and review designs of projects in the same subject area.
    • Worked extensively on an enterprise metadata management solution built from scratch.
    • Extensively worked on PowerCenter client tools.
    • Identified mapping bottlenecks in sources, targets, and mappings to improve performance.
    • Worked with source system teams to resolve data quality issues raised by end users.
    • Involved in reviewing technical documents, unit test plans, and test cases for QA and system testing.
  • JCPenney
    ETL Technology Lead
    JCPenney Jun 2010 - Apr 2012
    Plano, Texas, US
    Responsibilities:
    • Active participation in requirements gathering and analysis of source data with the data analyst team.
    • Responsible for verifying functional specifications and source-to-target mapping documents created by the data analysis team.
    • Worked closely with business end users and business analysts to gather and document requirements, translating them into functional specifications and source-to-target documents with the help of the data analysis team.
    • Involved in creating logical and physical designs to develop database schemas and tables in dimensional modeling.
    • Created technical specifications for ETL mappings using the FSD and S2T requirements.
    • Developed mappings extracting data from flat files, mainframe files, DB2, Oracle, SQL Server, and Informix sources into a DB2 database.
    • Executed the role of onsite coordinator: gathering requirements, developing high-level designs for the mappings, and executing development with an offshore team.
    • Provided estimates for ETL deliverables and oversaw progress toward quality ETL deliverables.
    • Led a team of five offshore resources.
    • Extensively worked on PowerCenter client tools.
    • Automated the failover strategy in workflow scheduling using a locking process that was reused across the enterprise.
    • Developed tools to capture end-to-end metadata and to validate daily data load metrics.
    • Created PowerExchange jobs for accessing mainframe data from files by creating data maps.
    • Took on deployment anchor responsibility for all modules across projects.
    • Developed Unix shell scripts for scheduling workflows using the third-party tool ESP.
    • Involved in documenting technical documents and unit test plans, and wrote test cases for unit and system testing.
    • Extensively worked on performance tuning of the ETL processes.
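The reusable locking process for workflow failover mentioned in this role is commonly built on atomic lock-file creation: a run only starts if it can create the lock file, so a concurrent or duplicated run sees the existing file and exits. A minimal sketch under that assumption (the original was a Unix shell/scheduler mechanism; the lock path here is illustrative):

```python
import os

# Hypothetical sketch of a lock-file guard for workflow runs.
# O_CREAT | O_EXCL makes the create-if-absent check atomic on the filesystem.

def acquire_lock(path: str) -> bool:
    """Atomically create a lock file; return False if another run holds it."""
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

def release_lock(path: str) -> None:
    """Remove the lock file so the next scheduled run can proceed."""
    os.remove(path)
```

A production version would also write the holder's PID into the file and handle stale locks left behind by crashed runs.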
  • Carphone Warehouse
    ETL Tech Lead
    Carphone Warehouse Jul 2008 - May 2010
    • Served as a system analyst on the client side, understanding the business logic and making additions to the existing HLD to capture all requirements in the BRD.
    • Responsible for designing fact tables and dimension tables, handling slowly changing dimensions, data mining, and data validation; prepared approach documents and impact analyses for various system enhancements.
    • Involved in data cleansing before loading into the data warehouse.
    • Extensively used mapplets and reusable transformations to prevent redundant transformation usage and improve maintainability.
    • Used Workflow Manager for session management, database connection management, and job scheduling.
    • Used Repository Manager to create new users and grant permissions.
    • Responsible for creating design and implementation documents, effort estimation, planning for coding and implementation, writing mappings, Unix scripts, and stored procedures, and performance tuning them to load data into the staging area and then into dimensions and facts; tuned mappings using pushdown optimization.
    • Led a five-member team working across multiple technologies: Informatica, Unix, Oracle, and Netezza on the back end, and Crystal Reports and Business Objects on the front end.
    Environment: Informatica, Unix, Oracle, Netezza, Crystal Reports, Business Objects.
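The slowly-changing-dimension handling mentioned in this role is most often Type 2: when a tracked attribute changes, the current dimension row is closed out and a new current row is appended, preserving history. A minimal in-memory sketch with illustrative column names (the original work was done in Informatica mappings, not Python):

```python
# Hypothetical sketch of SCD Type 2 processing. Column names
# (key, attr, valid_from, valid_to, current) are illustrative.

def apply_scd2(dim_rows, incoming, today):
    """Close out changed rows and append new current versions."""
    out = list(dim_rows)
    for rec in incoming:
        cur = next((r for r in out if r["key"] == rec["key"] and r["current"]), None)
        if cur is None or cur["attr"] != rec["attr"]:
            if cur is not None:                # close out the old version
                cur["valid_to"] = today
                cur["current"] = False
            out.append({"key": rec["key"], "attr": rec["attr"],
                        "valid_from": today, "valid_to": None, "current": True})
    return out
```

Unchanged records fall through without touching the dimension, so a full re-send of the source is safe to replay.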
  • Nordstrom
    ETL Developer
    Nordstrom May 2008 - Jun 2008
    Seattle, Washington, US
    • Worked as the developer for coding and implementation, unit testing, writing stored procedures, and performance tuning them to load data into the schema containing all inventory data for British Telecom, in coordination with the team at Tech Mahindra.
    • Responsible for the overall functional testing of the reports.
    • Worked as a developer on the Nordstrom offshore BI/reporting project in MicroStrategy.
  • Indian Air Force
    Flying Officer (Cadet)
    Indian Air Force Dec 2007 - May 2008
    IN
    Was part of the 1 SSC M F(P) course, to be inducted into the IAF as a flying officer. Attended training at AFA Dundigal, Hyderabad.
  • BT
    Programmer Analyst
    BT Jul 2006 - Dec 2007
    London, GB
    • Played the role of a developer, involved in the development, testing, and deployment of the DCMO code, which included creating a database containing trouble-ticketing data.
    • Responsibilities included writing PL/SQL stored procedures to populate the schemas, automating that process, and writing mapping queries to populate the different schemas.

Rudolph Daniel Skills

Informatica, ETL, Requirements Analysis, Teradata, Data Warehousing, Data Modeling, PL/SQL, Oracle, DB2, Unix Shell Scripting, SDLC, Data Integration, MicroStrategy, Business Objects, SQL, Data Warehouse Architecture, ERwin, Dimensional Modeling, Requirements Gathering, Unix, Enterprise Architecture, Agile Methodologies, Business Intelligence, Databases, Cognos, Software Development Life Cycle, OBIEE, Mainframe, Netezza, Database Design, Data Quality, Visio, Microsoft SQL Server, Data Marts, Database Administration, Software Project Management, Solution Architecture, Java Enterprise Edition, Global Delivery, SQL Tuning, Toad, DataStage, Data Migration, Performance Tuning, Master Data Management, OLAP, Stored Procedures

Rudolph Daniel Education Details

  • Middle Georgia State University
    Middle Georgia State University
    Information Technology
  • College Of Engineering Trivandrum
    College Of Engineering Trivandrum
    Engineering

Frequently Asked Questions about Rudolph Daniel

What company does Rudolph Daniel work for?

Rudolph Daniel works for UPS.

What is Rudolph Daniel's role at the current company?

Rudolph Daniel's current role is Data Engineer.

What is Rudolph Daniel's email address?

Rudolph Daniel's email address is ru****@****nds.com

What is Rudolph Daniel's direct phone number?

Rudolph Daniel's direct phone number is +197299*****

What schools did Rudolph Daniel attend?

Rudolph Daniel attended Middle Georgia State University, College Of Engineering Trivandrum.

What skills is Rudolph Daniel known for?

Rudolph Daniel has skills like Informatica, ETL, Requirements Analysis, Teradata, Data Warehousing, Data Modeling, PL/SQL, Oracle, DB2, Unix Shell Scripting, SDLC, and Data Integration.

Who are Rudolph Daniel's colleagues?

Rudolph Daniel's colleagues are Thomas Rittler, Mia Calderari, Lavanya Muppa, Andrew Glass, George Hudson, Gordon Sharon, Mark Norman.
