Rakesh S

Senior Software Developer @ Sträva Technology Group
Irving, TX, US
About Rakesh S

Rakesh S is a Senior Software Developer at Sträva Technology Group.

Rakesh S's Current Company Details
Sträva Technology Group
Senior Software Developer
Irving, TX, US
Rakesh S Work Experience Details
  • Sträva Technology Group
    Senior Software Developer
    Irving, TX, US
  • Charter Communications
    Snowflake Admin, Oct 2022 - Present
    Stamford, Connecticut, US
    - Developed ETL pipelines in and out of the data warehouse using Python and Snowflake's SnowSQL for writing SQL queries against Snowflake.
    - Created Snowpipe for continuous data loads and used COPY for bulk loads.
    - Involved in building Snowpipe, data sharing, databases, schemas, and table structures.
    - Processed location and segment data from S3 into Snowflake using Tasks, Streams, Pipes, and stored procedures.
    - Designed and implemented efficient data pipelines (ETLs) to integrate data from a variety of sources into the data warehouse.
    - Implemented performance-tuning logic on targets, sources, mappings, and sessions for maximum efficiency.
    - Responsible for requirement gathering and user meetings; discussed issues to be resolved and translated user inputs into ETL design documents.
    - Integrated data from multiple sources using AWS Glue.
    - Worked on Snowflake modeling; highly proficient in data-warehousing techniques for data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture.
    - Worked on advanced Snowflake concepts: Resource Monitors, role-based access control, data sharing, cross-platform database replication, virtual warehouse sizing, query performance tuning, Snowpipe, Tasks, Streams, and zero-copy cloning.
    - Worked extensively on database performance tuning to ensure an optimal reporting user experience.
    - Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to data marts.
    - Designed ETL processes using Talend to load data from sources to targets through transformations.
    - Developed Talend Big Data jobs to load high volumes of data into an S3 data lake and then into the Snowflake data warehouse.
    - Created ETL mapping documents and ETL design templates for the development team.
  • Autodesk
    Senior Snowflake Developer, Sep 2019 - May 2022
    San Francisco, CA, US
    - Created tables and views in Snowflake per business needs.
    - Used TabJolt to run load tests against the views in Tableau.
    - Used Matillion's blob storage component to load tables into the Snowflake stage layer.
    - Created reports in Metabase to track Tableau's cost impact on Snowflake.
    - Participated in sprint planning meetings; worked closely with the manager on gathering requirements.
    - Strong skills in resolving Teradata issues and providing workarounds; knowledge of diagnostics, database tuning, SQL tuning, and performance monitoring.
    - Designed and implemented efficient data pipelines (ETLs) to integrate data from a variety of sources into the data warehouse.
    - Developed ETL pipelines in and out of the data warehouse using Snowflake's SnowSQL, writing SQL queries against Snowflake.
    - Performed data-quality issue analysis using SnowSQL by building analytical warehouses on Snowflake.
    - Implemented data intelligence solutions around the Snowflake data warehouse.
    - Performed various QA tasks as necessary.
    - Worked on migrating several modules from Teradata to Snowflake.
    - Experienced in troubleshooting Teradata scripts, fixing bugs, addressing production issues, and performance tuning.
    - Designed and created Hive external tables using a shared metastore (instead of Derby) with partitioning, dynamic partitioning, and buckets.
  • Bluebeam, Inc.
    Snowflake Developer, Dec 2016 - Sep 2019
    Pasadena, California, US
    - Configured Spark Streaming to consume ongoing data from Kafka and store the stream in HDFS.
    - Imported data from sources such as HDFS/HBase into Spark RDDs and performed computations using PySpark to generate output responses.
    - Worked on developing ETL processes (DataStage Open Studio) to load data from multiple sources into HDFS using Sqoop.
    - Developed data pipelines using Sqoop, Pig, and Hive to ingest customer data into HDFS for data analytics.
    - Developed Spark scripts and UDFs using both the Spark DSL and Spark SQL for data aggregation and querying, writing data back into an RDBMS through Sqoop.
    - Designed and developed a security framework to provide fine-grained access to objects in AWS S3 using AWS Lambda and DynamoDB.
    - Developed a detailed project plan and helped manage the data-conversion migration from the legacy system to the target Snowflake database.
    - Familiar with data architecture, including data-ingestion pipeline design, data modeling, and data mining.
    - Developed predictive analytics using the Apache Spark Scala APIs.
    - Used the DataStax Spark connector to store data in and retrieve data from Cassandra.
    - Utilized Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
    - Involved in loading and transforming large sets of structured data from router locations into the EDW using an Apache NiFi data pipeline.
    - Performed end-to-end architecture and implementation assessments of AWS services such as Amazon EMR, Redshift, IAM, RDS, CloudWatch, and Athena.
    - Performed statistical data analysis and data visualization using Python.
    - Worked on MongoDB for distributed storage and processing.
  • Dhruvsoft Services Private Limited
    ETL Developer, Jun 2014 - Nov 2016
    Hyderabad, Telangana, IN
    - Interacted with business analysts to understand the process flow and the business.
    - Analyzed source-to-target mapping Excel documents (BRS).
    - Created data flow mappings to extract data from source systems and load it into targets.
    - Developed the strategy for incremental loads.
    - Worked on various tasks such as Session, E-Mail, and Command tasks.
    - Converted regular mappings to handle Slowly Changing Dimensions.
    - Created and used various reusable tasks, workflows, worklets, mapplets, and reusable transformations.
    - Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, and Joiner.
    - Standardized parameter files to define session parameters such as database connections for sources and targets, last-updated dates for incremental loads, and default values for fact tables.
    - Tuned Informatica mappings for optimum performance.
    - Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.
    - Used Unix scripting extensively; scheduled pmcmd and pmrep to interact with the Informatica server from command mode.
    - Created users, groups, and roles; granted privileges to users; created folders and relational/application connections; configured ODBC connectivity.
    - Involved in the upgrade of Informatica PowerCenter v9.1 to v9.6.1.
    - Worked with the admin team on the migration strategy to the upgraded environments.
    - Created web service jobs by configuring WSDL in Designer and used the Informatica Web Services Hub to start Informatica tasks.
    - Used Teradata utilities (MultiLoad, TPump, FastLoad) to load data into the Teradata data warehouse from Oracle and DB2 databases.
    - Performed functionality, back-end, and regression testing during various phases of the application, and data-integrity/back-end testing by executing SQL statements.
    - Involved in production support to research and resolve daily load issues.
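Slowly Changing Dimension handling with surrogate key assignment comes up in several of the roles above (Charter Communications, Dhruvsoft). As a rough illustration of the technique only, here is a minimal Type 2 dimension update sketched in plain Python; the row layout and column names are hypothetical and not taken from any project described here:

```python
from datetime import date

def scd2_apply(dimension, incoming, today=None):
    """Apply SCD Type 2 changes: expire changed rows, insert new versions.

    dimension: list of row dicts with keys surrogate_key, natural_key,
               attrs, effective_from, effective_to, current
    incoming:  dict natural_key -> attrs (latest source snapshot)
    """
    today = today or date.today()
    # next surrogate key: one past the current maximum (0 if empty)
    next_sk = max((r["surrogate_key"] for r in dimension), default=0) + 1
    for nk, attrs in incoming.items():
        current = next((r for r in dimension
                        if r["natural_key"] == nk and r["current"]), None)
        if current and current["attrs"] == attrs:
            continue  # no change detected: keep the current version
        if current:
            # change data capture: expire the old version
            current["effective_to"] = today
            current["current"] = False
        # insert a new current version under a fresh surrogate key
        dimension.append({
            "surrogate_key": next_sk,
            "natural_key": nk,
            "attrs": attrs,
            "effective_from": today,
            "effective_to": None,
            "current": True,
        })
        next_sk += 1
    return dimension
```

In a real warehouse this logic would typically run as a MERGE against the dimension table rather than in application code; the sketch only shows the expire-and-insert pattern.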
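The incremental-load strategy mentioned in the Dhruvsoft role (session parameters carrying last-updated dates) is usually a watermark pattern: each run extracts only rows changed since the previous run's high-water mark. A minimal sketch in plain Python, with a hypothetical `updated_at` column standing in for whatever change-tracking column the source actually has:

```python
def incremental_extract(source_rows, last_loaded):
    """Select only rows changed since the previous run (watermark pattern).

    source_rows: iterable of row dicts with an 'updated_at' timestamp.
    last_loaded: watermark from the previous run (None means full load).
    Returns (rows_to_load, new_watermark).
    """
    if last_loaded is None:
        batch = list(source_rows)  # first run: take everything
    else:
        batch = [r for r in source_rows if r["updated_at"] > last_loaded]
    # advance the watermark only if something was extracted
    new_watermark = max((r["updated_at"] for r in batch), default=last_loaded)
    return batch, new_watermark
```

The watermark would normally be persisted between runs (in a parameter file or control table, as the bullets above describe) so that a failed run can be retried without skipping rows.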

Rakesh S Education Details

  • University Of Sunderland
  • Humphreys University
    Information Technology

Frequently Asked Questions about Rakesh S

What company does Rakesh S work for?

Rakesh S works for Sträva Technology Group.

What is Rakesh S's role at the current company?

Rakesh S's current role is Senior Software Developer.

What schools did Rakesh S attend?

Rakesh S attended University Of Sunderland and Humphreys University.
