Arjun V

Data Architect | Data Engineer | Sr. ETL Developer | Cloud Engineer @ Elevance Health
Arjun V's Location
Suwanee, Georgia, United States
About Arjun V

Experienced ETL developer/designer with strong technical expertise in building data warehouses, data marts, data integration, operational data stores, and ETL processes for clients in the Telecom, Satellite and Communications, Insurance, Finance, and Oil and Gas industries.

Skillset: Informatica, Informatica Cloud, Teradata, UNIX/Linux scripting, Hadoop, AWS Redshift, S3, EMR, PostgreSQL, Oracle, SQL Server, DB2, Azure SQL Server, SAP BODS, Business Objects, Azure DevOps, Jira, ServiceNow, and GitHub.

Arjun V's Current Company Details
Elevance Health
Data Architect | Data Engineer | Sr. ETL Developer | Cloud Engineer
Arjun V Work Experience Details
  • Elevance Health
    Sr. ETL Developer / Data Engineer / Architect
    Aug 2023 - Present
    Indianapolis, Indiana, US
    Hardware/Software: Teradata v16.20, Informatica PowerCenter 10.4 & 10.5, Teradata SQL Assistant, Oracle, SQL Server, UNIX, Salesforce.
  • CVS Health
    Senior ETL Developer / Data Engineer / Architect
    May 2021 - May 2023
    Woonsocket, RI, US
    Hardware/Software: Teradata v16.20, Informatica PowerCenter 10.2 & 10.4, Teradata SQL Assistant, Oracle, DB2, UNIX, Tivoli, GitLab, Jenkins, UNIX shell scripting, Python, GCP, Snowflake.
  • Wells Fargo
    Senior ETL Developer
    Jan 2021 - May 2021
    San Francisco, California, US
    Hardware/Software: Teradata v16.20, Informatica PowerCenter 10.2 & 10.4, Teradata SQL Assistant, MySQL Server, UNIX, Autosys, Jira, PAC 2000, DCM & SSRS.
  • ConocoPhillips
    Sr. ETL Developer
    Feb 2020 - Aug 2020
    Houston, Texas, US
    Hardware/Software: Informatica PowerCenter 10.2, Informatica B2B Data Exchange, Cleo Harmony, Teradata v16.20, Teradata Viewpoint, Teradata Studio, MySQL Server, Oracle, flat files, UNIX, Linux, Azure DevOps, ServiceNow, Wiki, Azure SQL Server Database, SQL Developer, MicroStrategy, Spotfire reporting, CTRM.
    Worked with business users and data owners to gather requirements for new or changing ETL solutions. Designed and developed ETL solutions using Informatica PowerCenter following in-house ETL development standards; managed program coding to extract data from all existing source systems, transform it to industry standards, and load it into the Integrated Data Warehouse. Developed Teradata stored procedures to query Teradata partitioned tables efficiently. Developed complex Informatica mappings using SCD Type 1, Type 2, and PDO, and performed data loading using TPT loaders and relational connections. Tuned long-running workflows/sessions and SQL queries for better performance in production. Supported Informatica PowerCenter ETL (Designer, Repository Manager, Workflow Manager, Workflow Monitor), the Teradata data warehouse, and ETL activities. Troubleshot technical and data problems and provided fixes and solutions. Created and altered databases, tables, views, and stored procedures, and granted user roles. Performed production support and on-call duties and managed all production failures.
  • Asurion
    Sr. ETL Informatica Developer
    Aug 2017 - Dec 2019
    Nashville, Tennessee, US
    Hardware/Software: Informatica PowerCenter 10.1.1, PowerExchange, Informatica Cloud, Amazon AWS S3, Redshift, EMR Hive, MySQL, PostgreSQL, Oracle, flat files, Presto, UNIX, Jira, ServiceNow, Windows 2008, SQL Developer, OBIEE, MicroStrategy, Spotfire reporting, ActiveBatch.
    Responsible for developing and managing the EU Claims data mart. Developed 300+ Informatica mappings for ingestion of claims and enrollment data, and automated the ETL process using UNIX shell scripts. Resolved production issues during the warranty period and ensured deliverables aligned with business expectations. Managed and supported CDC using PowerExchange in conjunction with PowerCenter to replicate change data from an Oracle RDBMS for fraud management. Developed Informatica mappings to process data from sources such as Redshift, Oracle, SQL Server, and flat files, loading it into Redshift, S3, and Hive EMR clusters for reporting. Implemented ETL solutions to generate various outbound reports for different clients. Worked on data encryption and decryption using Voltage in Informatica. Built and managed new enterprise database objects (schemas, tables, functions, stored procedures, and views) in AWS Redshift, SQL Server, and Hive. Created Informatica Cloud mappings, mapping configurations, data synchronization, data replication, and tasks to replicate data from source systems to other databases.
  • PayPal
    Teradata / Hadoop Developer
    Sep 2016 - Apr 2017
    San Jose, CA, US
    Hardware/Software: Teradata, Teradata Utilities, Hadoop, Kafka, Informatica PowerCenter 10.1.1, Oracle, flat files, UNIX, Jira, ServiceNow, GitHub, UC4, crontab, Cognos.
    Coordinated with different teams to perform a UC4 align-and-untangle of Teradata batch jobs running on the EDW, which reduced complexity, identified the business units/owners, and simplified production maintenance and support. Created containers, job plans, jobs, and event waits in the UC4 scheduler and worked on scheduling/unscheduling the job plans in the containers. Worked on Teradata and UNIX scripts to develop ETL logic for new development, enhancements, and bug fixes of a large Teradata Finance data mart calculating financial cost and negative balance. Analyzed and reprocessed source data to fix financial data issues. Coordinated with the Ops team for production releases and created and managed all Jira tickets for release management. Developed scripts to load data from sources such as Oracle, Teradata, and Hadoop using BTEQ, FastLoad, MultiLoad, and FastExport. Involved in TFC/TLD migration activities from Teradata to Hadoop. Improved space and resource utilization by identifying and deprecating financial tables and Teradata batch jobs from Teradata and UC4 as part of financial table cleanup. Worked in Agile Scrum and participated in scrum meetings. Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
  • Frontier Communications
    Data Architect
    Jul 2015 - Apr 2016
    Dallas, Texas, US
    Hardware/Software: Informatica PowerCenter 9.6, Oracle, flat files, DB2, SharePoint, Teradata v14.
    Primary owner and data mapper for the BL Arbor source for the CC&B Accounts Mapping team. Met with business and end users to understand business processes and gather requirements. Performed source data analysis and provided various business reports to validate the business requirements. Developed functional data-mapping specifications to map Verizon Arbor data to the target (DPI Subscriber Information fields). Developed SQL queries and Informatica transformation logic to convert Verizon VADI Arbor wholesale ISP data from Oracle to Hestia (Frontier DPI ordering, provisioning, and billing systems). Audited and certified the mock data from Verizon. Provided clarification and solutions to Informatica ETL developers during development. Performed unit testing to validate the ETL development against the business requirements and data-mapping specification. Coordinated with the UAT and testing teams to reduce the number of defects raised.
  • DIRECTV
    Sr. ETL Developer
    Mar 2014 - May 2015
    El Segundo, CA, US
    Hardware/Software: Teradata V14/V15, Teradata Utilities, Cloudera Hadoop, Hive, HDFS, Sqoop, Oozie, Informatica PowerCenter 9.6, Informatica Cloud, Oracle, SQL Server, Salesforce, flat files, legacy systems, GoldenGate, UNIX, Autosys, crontab, Cognos.
    Defined the schema, staging tables, and landing-zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views. Performed gap and impact analysis to identify all mappings, sessions, MultiLoad scripts, and BTEQ scripts missing Load_Date and Last_upd_Date, and added both columns to almost 100 tables per the new standards as part of the Teradata Phase 3 Unity project. Primary developer and SME for the Commercial Extended Customer Offer, Social Sign-On, and My DIRECTV Appointment Status Tracking projects. Worked on various other projects, including NFL West File Feed, Genesys Calls, AT&T Tax Acquisition, and AT&T Integration Compensation. Primarily involved in ETL development and production support of the Enterprise Data Warehouse, including production job errors. Handled on-demand user requests to implement new business rules. Extracted and transformed data from SFDC using Informatica Cloud for commercial projects and loaded it into the Enterprise Data Warehouse. Extensively used Informatica, Teradata utilities, and Hadoop to extract data from source systems such as Oracle, Teradata, Hadoop HDFS, and flat files, and from third-party systems like West, Gigya, and Omniture. Automated the campaign-management file-feed process with a UNIX shell script that creates dummy files and previous-day files if the daily file has not arrived on time. Extracted and transformed data from HDFS using Hive scripts and exported the result set to Teradata using Sqoop export. Wrote several Hive internal- and external-table scripts and scheduled the jobs in the Oozie workflow.
  • AT&T
    Sr. Teradata / Data Analyst
    Sep 2013 - Dec 2013
    Dallas, TX, US
    Hardware/Software: Teradata V14, Teradata Utilities, Oracle, SQL Server 2008 R2, stored procedures, flat files, UNIX, Windows.
    Involved in discussions with the Teradata and client teams to understand and validate all core metric applications and tool interfaces for the Optima-to-Pride (Oracle-to-Teradata) migration. Devised a plan by consolidating the list by interface group, identified all objects (stored procedures, tables, SQL queries) required for the migration, and presented it to the clients. Analyzed and converted 300+ SQL queries, procedures, and tables from their existing format to Teradata SQL. Installed, configured, and documented Teradata Tools and Utilities and Teradata SQL Assistant on the Core Metrics SQL Server 2008 R2. Analyzed the DCA ETL data mart and proposed schema changes per the reporting requirements. Collaborated with analysts to migrate all analytical reports from the CPME database to the Metrics database and validated the reports in the report portal. Worked extensively on complex SQL queries and stored procedures.
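The SCD Type 2 loads mentioned in the ConocoPhillips entry follow a standard pattern: when a tracked attribute changes, expire the current dimension row and insert a new current row. A minimal sketch in Python with sqlite3 (the actual work used Informatica mappings against Teradata; the table and column names here are purely illustrative):

```python
import sqlite3

# Illustrative SCD Type 2 dimension table: each customer keeps a history of
# versioned rows, with exactly one row flagged current at any time.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id TEXT, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER)""")

def scd2_upsert(cur, customer_id, city, load_date):
    """Close out the current row if the tracked attribute changed,
    then insert a new current row (SCD Type 2)."""
    cur.execute("SELECT city FROM dim_customer "
                "WHERE customer_id=? AND current_flag=1", (customer_id,))
    row = cur.fetchone()
    if row is not None and row[0] == city:
        return  # no change; keep the existing current row
    if row is not None:
        # expire the old version as of the load date
        cur.execute("""UPDATE dim_customer SET end_date=?, current_flag=0
                       WHERE customer_id=? AND current_flag=1""",
                    (load_date, customer_id))
    # insert the new current version with an open-ended end date
    cur.execute("INSERT INTO dim_customer VALUES (?,?,?,'9999-12-31',1)",
                (customer_id, city, load_date))

scd2_upsert(cur, "C1", "Houston", "2020-02-01")
scd2_upsert(cur, "C1", "Dallas", "2020-03-01")   # city changed -> new version
rows = cur.execute("SELECT city, current_flag FROM dim_customer "
                   "ORDER BY eff_date").fetchall()
print(rows)  # [('Houston', 0), ('Dallas', 1)]
```

In a production warehouse the same close-and-insert logic is typically expressed as an Informatica mapping or a MERGE statement, with a surrogate key on each version row.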
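The file-feed safeguard described in the DIRECTV entry (creating a dummy file when the daily feed is late so downstream jobs do not fail) can be sketched as follows. The original was a UNIX shell script; this Python version uses an assumed directory layout and file-naming pattern:

```python
import os
import tempfile
from datetime import date

def ensure_daily_feed(landing_dir, prefix="campaign_feed", day=None):
    """If today's feed file is missing, drop a zero-byte dummy so the
    scheduled batch can proceed. Returns (path, created_dummy)."""
    day = day or date.today()
    path = os.path.join(landing_dir, f"{prefix}_{day:%Y%m%d}.dat")
    if not os.path.exists(path):
        open(path, "w").close()    # zero-byte dummy keeps the batch moving
        return path, True          # True -> dummy was created
    return path, False             # real file already arrived

# Demo against a throwaway directory (stands in for the landing zone).
landing = tempfile.mkdtemp()
path, created = ensure_daily_feed(landing)
print(created)  # True: directory was empty, so a dummy was created
```

A real deployment would also log the substitution and stage the previous day's file where the downstream job expects a non-empty input.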

Arjun V Education Details

  • Kennesaw State University
    Computer Science

