R V


Senior Data Engineer at TransLink
New Westminster, British Columbia, Canada
R V's Location
Boca Raton, Florida, United States
About R V

R V is a Senior Data Engineer at TransLink.

R V's Current Company Details
TransLink

Senior Data Engineer at TransLink
New Westminster, British Columbia, Canada
Employees: 994
R V Work Experience Details
  • TransLink
    Data Engineer
    TransLink Apr 2023 - Present
    Boca Raton, Florida, United States
    Responsibilities:
    • Developed and maintained data architecture and data models; controlled data model changes through active involvement throughout the project lifecycle, and reviewed and approved data designs.
    • Moved Oracle source data to Azure Cloud in Parquet format using ADF.
    • Fetched data from Kafka/Event Hub, stored it in the data lake for use in further layers, and processed XML files to move data into the cloud.
    • Built a Delta Lakehouse with distinct bronze, silver, and gold zones.
    • Designed where and how data is stored in the cloud, e.g. folder hierarchies with Year, Month, and Day folders.
    • Created SCD Type I and Type II jobs using Databricks and moved the data into the cloud.
    • Used Databricks SQL to connect the lakehouse data to Power BI.
    • Used CI/CD pipelines for code check-in and check-out.
    • Converted stored procedures into runnable PySpark Databricks code.
    • Improved query performance through proper Delta Lake partitioning and Z-ordering.
    • Created Python scripts to scale Analysis Services up and down.
    • Delivered the project in two-week sprints.
  • Capgemini
    Senior Data Engineer
    Capgemini Feb 2022 - Jan 2023
    Surrey, British Columbia, Canada
    Responsibilities:
    • Developed and maintained data architecture and data models; controlled data model changes through active involvement throughout the project lifecycle, and reviewed and approved data designs.
    • Moved Oracle source data to Azure Cloud in Parquet format using ADF.
    • Fetched data from Kafka/Event Hub, stored it in the data lake for use in further layers, and processed XML files to move data into the cloud.
    • Built a Delta Lakehouse with distinct bronze, silver, and gold zones.
    • Designed where and how data is stored in the cloud, e.g. folder hierarchies with Year, Month, and Day folders.
    • Created SCD Type I and Type II jobs using Databricks and moved the data into the cloud.
    • Used Databricks SQL to connect the lakehouse data to Power BI.
    • Used CI/CD pipelines for code check-in and check-out.
    • Converted stored procedures into runnable PySpark Databricks code.
    • Improved query performance through proper Delta Lake partitioning and Z-ordering.
    • Created Python scripts to scale Analysis Services up and down.
    • Delivered the project in two-week sprints.
  • Save-On-Foods
    Data Engineer
    Save-On-Foods Sep 2020 - Nov 2021
    Langley, British Columbia, Canada
    Responsibilities:
    • Developed and maintained data architecture and data models; controlled data model changes through active involvement throughout the project lifecycle, and reviewed and approved data designs.
    • Moved Oracle source data to Azure Cloud in Parquet format using ADF.
    • Fetched data from Kafka, stored it in the data lake, and used it in further layers.
    • Built a Delta Lakehouse with distinct bronze, silver, and gold zones.
    • Designed where and how data is stored in the cloud, e.g. folder hierarchies with Year, Month, and Day folders.
    • Created SCD Type I and Type II jobs using Databricks and moved the data into the cloud.
    • Used Databricks SQL to connect the lakehouse data to Power BI.
    • Used CI/CD pipelines for code check-in and check-out.
    • Converted stored procedures into runnable PySpark Databricks code.
    • Improved query performance through proper Delta Lake partitioning and Z-ordering.
    • Created Python scripts to scale Analysis Services up and down.
    • Delivered the project in two-week sprints.
  • Sapient
    Sr Software Engineer
    Sapient Feb 2014 - Aug 2020
    Boca Raton, Florida, United States
    • Analyzed business requirements and their functional and technical implementation, helping stakeholders make logical, efficient decisions when implementing business intelligence (BI) solutions for sales, product, and customer KPIs across the company.
    • Developed and maintained data architecture and data models; controlled data model changes through active involvement throughout the project lifecycle, and reviewed and approved data designs.
    • Moved Oracle source data to Azure Cloud in Parquet format using ADF.
    • Created SCD mappings in Databricks using Delta Lake and moved the data into the cloud.
    • Created Synapse tables and connected them to Power BI.
    • Supported existing application processes developed on Informatica PowerCenter.
    • Created alerts with email notifications in ADF.
    • Delivered the project in two-week sprints.
    • Developed mapping-level design documents and obtained the client's sign-off.
    • Identified sources mapping to the business requirements.
    • Coordinated with team members to resolve design/development issues.
    • Developed BI processes using Informatica to load data into the data mart, and created SCD Type I and SCD Type II mappings.
    • Created transformations and mappings using Informatica designer tools, and scheduled workflows using Control-M or Tidal.
    • Performance-tuned Informatica mappings to reduce runtime.
    • Tuned Oracle queries.
    • Created documents such as a Code Review Checklist (CRC) to ensure good practices are followed during ETL development.
  • Galaxe.Solutions
    Senior Software Engineer
    Galaxe.Solutions Nov 2011 - Feb 2014
    Noida, Uttar Pradesh, India
    ETL developer: Informatica, Oracle, Unix.
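Several of the roles above mention building SCD Type I and Type II jobs in Databricks with Delta Lake. As a minimal sketch of the Type II row-versioning rule only (plain Python on lists of dicts, not the actual Databricks implementation; the `cust_id`/`city` columns are hypothetical):

```python
from copy import deepcopy

def scd2_merge(dim, updates, key, today):
    """Apply SCD Type II logic: expire changed rows, insert new versions.

    dim:     dimension rows, each with 'start_date', 'end_date', 'is_current'
    updates: incoming rows carrying the business key plus attribute columns
    """
    dim = deepcopy(dim)
    for upd in updates:
        current = next((r for r in dim
                        if r[key] == upd[key] and r["is_current"]), None)
        changed = current is None or any(
            current[c] != v for c, v in upd.items() if c != key)
        if current is not None and changed:
            # Expire the old version instead of overwriting it
            current["end_date"] = today
            current["is_current"] = False
        if changed:
            # Insert the new attribute values as the current version
            dim.append({**upd, "start_date": today,
                        "end_date": None, "is_current": True})
    return dim
```

In Databricks this logic would typically be expressed as a single `MERGE INTO` statement against a Delta table; the sketch only shows the versioning rule those jobs implement.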
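The bullets also cite improving query performance by Z-ordering Delta Lake tables. Z-ordering (Delta's `OPTIMIZE ... ZORDER BY`) sorts rows along a Morton space-filling curve so that records close in several columns at once land in the same files, letting the engine skip files when filtering on any of those columns. A sketch of the curve itself, assuming two integer columns (the bit interleaving is illustrative; Delta's implementation differs in detail):

```python
def z_order_key(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y to form a Morton (Z-order) key.

    Sorting rows by this key clusters records that are close in *both*
    dimensions, which is what enables file skipping on either column.
    """
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)       # even bits come from x
        key |= ((y >> i) & 1) << (2 * i + 1)   # odd bits come from y
    return key
```

On Databricks the equivalent operation is `OPTIMIZE my_table ZORDER BY (col_a, col_b)`; the table and column names there are placeholders.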

Frequently Asked Questions about R V

What company does R V work for?

R V works for TransLink.

What is R V's role at the current company?

R V's current role is Senior Data Engineer at TransLink.

