Venkat Rajesh Email and Phone Number
As a Data Engineer, I create and maintain database objects on Snowflake, design and develop data pipelines using PySpark and Spark SQL, and optimize query performance and ETL operations. I have extensive experience in this role, working with source teams, stakeholders, and production support to deliver fast, effective, quality data to users. I have gained extensive exposure to the software development life cycle, data ingestion and transformation, performance tuning, and data migration from legacy systems to cloud platforms. I hold a Bachelor of Engineering in Computer Science from Visvesvaraya Technological University and am proficient in AWS, Azure Databricks, Python, PySpark, and Snowflake. I am always looking for opportunities to leverage PaaS, IaaS, SaaS, and serverless technologies.
Future Focus Infotech Private Limited
- Website: focusinfotech.com
- Employees: 1298
Data Engineer, Future Focus Infotech Private Limited
Oct 2022 - Present, Bengaluru, Karnataka, India
- Creating database objects on Snowflake, including tables, views, and schemas, to support business requirements.
- Prepared source mapping documents by interacting with source teams and stakeholders to capture the information required from the source input data.
- Designing and developing data pipelines for data ingestion and transformation using PySpark/Spark SQL.
- Developed data pipelines using Databricks to load data from multiple sources into Snowflake.
- Performance tuning: monitoring and optimizing query performance in Snowflake to ensure efficient data retrieval.
- Created jobs for production deployment and monitored ETL jobs to ensure they run successfully and meet SLAs during post-production support.
- Documented processes and jobs, creating Confluence pages for reference.
- Provided support in resolving post-migration issues.
- Built CI/CD pipelines using GitLab, ensuring smooth and automated deployment of ETL jobs.
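Pipelines like the Databricks-to-Snowflake loads described above typically finish with a bulk load of staged files. A minimal sketch of generating such a load statement follows; the table, stage, and file pattern are illustrative placeholders, not values from any real project.

```python
# Hypothetical sketch: assembling a Snowflake COPY INTO statement to load
# staged Parquet files into a target table. All object names are made up.

def build_copy_into(table: str, stage: str, file_pattern: str) -> str:
    """Return a COPY INTO statement that loads staged Parquet files
    into a Snowflake table, matching columns by name."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"PATTERN = '{file_pattern}' "
        "FILE_FORMAT = (TYPE = PARQUET) "
        "MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )

sql = build_copy_into("SALES.RAW.ORDERS", "raw_stage", ".*orders.*[.]parquet")
print(sql)
```

Generating the statement in Python keeps object names parameterized, so the same job can be pointed at different schemas per environment.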
Snowflake Developer, Dataeconomy
Jun 2021 - Sep 2022, Hyderabad, Telangana, India
- Created the primary objects (tables, views, etc.) required for the application on Snowflake.
- Developed Spark code using PySpark and Spark SQL.
- Loaded source data into the S3 raw zone using PySpark jobs.
- Transformed and loaded cleansed data into Snowflake tables.
- Carried out performance tuning of jobs, stages, sources, and targets.
- Supported unit testing and worked with QA/UAT teams to resolve defects.
- Wrote and reviewed detail design documents with mapping specifications as well as unit and integration test plans/test cases.
- Created and scheduled data pipeline jobs through AutoSys, guaranteeing smooth and timely ETL operations.
Teradata ETL Developer, Wipro
Aug 2019 - Jun 2021, Singapore
- Created DB objects, decided primary/secondary indexes, created join indexes and partitioned primary indexes, and compared load utilities to determine the best load scenario.
- Wrote scripts to import data from source files such as flat files using the Teradata load utilities.
- Worked on data extraction, transformation, and loading from source to target systems using BTEQ.
- Expertise in selecting the primary index (UPI, NUPI) for a table for uniform data distribution across the AMPs, and in compressing columns for efficient utilization of available disk space.
- Very strong on Teradata SQL query optimization and tuning techniques, including the EXPLAIN feature, the COLLECT STATISTICS option, secondary indexes (USI, NUSI), partitioned primary indexes (PPI), and volatile and global temporary tables.
- Extensively used Teradata load and unload utilities such as OLE Load, MultiLoad, FastExport, and Bulk Load stages in jobs for loading/extracting huge data volumes.
- Held weekly meetings with technical collaborators and actively participated in code review sessions with the team.
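The UPI/NUPI selection mentioned above comes down to how evenly a candidate primary-index column hashes rows across AMPs. A rough simulation, with Python's built-in hash standing in for Teradata's row-hash function and an assumed AMP count:

```python
from collections import Counter

def amp_distribution(values, n_amps=8):
    """Simulate Teradata row distribution: hash each candidate
    primary-index value to one of n_amps AMPs and count rows per AMP.
    (Python's hash() stands in for Teradata's row-hash function.)"""
    counts = Counter(hash(v) % n_amps for v in values)
    return [counts.get(amp, 0) for amp in range(n_amps)]

def skew_factor(dist):
    """Max rows on any AMP divided by the per-AMP average;
    a value near 1.0 means uniform distribution."""
    avg = sum(dist) / len(dist)
    return max(dist) / avg if avg else 0.0

# A near-unique column (a good UPI candidate) spreads rows evenly,
# while a low-cardinality column (a poor NUPI choice) piles rows
# onto a handful of AMPs.
unique_ids = range(10_000)
statuses = ["OPEN"] * 9_000 + ["CLOSED"] * 1_000

print(skew_factor(amp_distribution(unique_ids)))
print(skew_factor(amp_distribution(statuses)))
```

A skewed primary index leaves most AMPs idle while one does nearly all the work, which is why uniform distribution is the first criterion in PI selection.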
Hadoop ETL Developer, Optimum Solutions Pte Ltd
Dec 2018 - Aug 2019, Singapore
- Developed ingestion pipelines to import data into HDFS from a Teradata database using Sqoop.
- Built jobs to process data on HDFS and load it into Hive tables using Hive SQL.
- Created views on lake tables for the Tableau reporting team to build reports/dashboards.
- Loaded data into HBase for mart-specific users.
- Worked with SVN for version control and Oozie for workflow management.
- Performance-tuned HQL jobs using various optimization techniques.
- Responsible for overseeing the quality procedures related to the project.
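A Sqoop ingestion job like the Teradata-to-HDFS imports described above is ultimately a command-line invocation. A sketch of assembling one in Python follows; the JDBC URL, credentials path, and table names are placeholders, not values from any real environment.

```python
# Illustrative sketch: building the argument list for a Sqoop import from
# Teradata into HDFS. Connection details and object names are made up.

def sqoop_import_cmd(jdbc_url, username, table, target_dir, mappers=4):
    """Return a sqoop import command as an argument list, reading the
    password from a protected file rather than the command line."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--password-file", "/user/etl/.teradata.password",
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),   # parallel map tasks for the import
        "--as-parquetfile",              # store imported rows as Parquet
    ]

cmd = sqoop_import_cmd(
    "jdbc:teradata://td-host/DATABASE=SALES",
    "etl_user", "ORDERS", "/data/raw/orders",
)
print(" ".join(cmd))
```

Keeping the command as a list (rather than a joined string) makes it straightforward to hand to a scheduler or subprocess call without shell-quoting issues.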
ETL Developer, Lead IT Corporation
Jan 2016 - Dec 2018, United States
- Worked on ingestion of huge file data sets using Talend.
- Developed jobs for processing data and loading it into Teradata tables.
- Worked with GitHub and AutoSys for deployments, orchestration, and scheduling of jobs.
- Performed rigorous data analysis to identify inconsistencies and data quality issues, reporting them to architects and leads.
Venkat Rajesh Education Details
- Bachelor of Engineering, Computer Science, Visvesvaraya Technological University
Frequently Asked Questions about Venkat Rajesh
What company does Venkat Rajesh work for?
Venkat Rajesh works for Future Focus Infotech Private Limited
What is Venkat Rajesh's role at the current company?
Venkat Rajesh's current role is Data Engineer | Databricks, PySpark, Snowflake, Python, AWS.
What schools did Venkat Rajesh attend?
Venkat Rajesh attended Visvesvaraya Technological University.
Who are Venkat Rajesh's colleagues?
Venkat Rajesh's colleagues are Shahabaj Patel, Rajkumar Ravi, Milind Kalgaonkar, Manickam Pandiaraj, Aparna Keni, Rakesh Kayastha, Balram Patel.