Sai C

Sai C Email and Phone Number

Principal Solutions Architect @ Caterpillar Inc.
Tempe, AZ, US
Sai C's Location
Tempe, Arizona, United States

About Sai C

Experienced in designing, building, integrating, and maintaining scalable, fault-tolerant multi-platform data pipelines, both on-premises and in data lakes. Architects and delivers end-to-end, customer-focused solutions. Avid team player who has led teams and interfaced with various cross-functional teams. Leveraged downstream and upstream applications to build custom dashboards for business stakeholders. Instrumental in industry domains such as finance, entertainment, media, and cable. Successfully implemented SDLC methodologies such as Agile, Kanban, and Scrum. Technology agnostic, constantly evolving, with a penchant for learning new technologies, methods, and best practices.

Sai C's Current Company Details
Caterpillar Inc.
Principal Solutions Architect
Tempe, AZ, US
Sai C Work Experience Details
  • Caterpillar Inc.
    Principal Solutions Architect
    Tempe, AZ, US
  • Discount Tire
    Principal Solutions Architect | Segment Strategist
    Discount Tire Sep 2021 - Present
    Scottsdale, AZ, US
    • Drove Discount Tire's enterprise analytics architecture to support enterprise insights and decision-making, leveraging established architecture disciplines.
    • Leveraged knowledge of the organization's information, application, and infrastructure environment, as well as the current technology landscape, to design a holistic and optimized analytics platform.
    • Defined strategies to operationalize analytics-supporting capabilities in discovery, analysis, and reporting across business segments.
    • Designed and implemented Microsoft Azure and AWS solutions for product teams across various segments, ensuring scalability, security, and performance.
    • Performed hands-on solution design, solution architectures, architecture roadmaps, prototyping, proofs of concept, and development tasks as required in support of projects and products.
    • Developed SQL queries extensively to research, understand, and relate distributed data repositories across the enterprise.
    • Developed and documented data models, ensuring that data is defined and searchable; collaborated with users across the organization to capture output/process changes and revise data models.
    • Supported the Data Science team in implementing advanced analytical models and supporting data pipelines, and developed strategies and approaches for Analytics as a Service.
    • Partnered effectively with MDM (Master Data Management), data SMEs, and other stakeholders (sponsors, product solutions architects, etc.) to design solutions, including identifying and filling critical gaps.
    • Developed and maintained data artifacts and acted as a liaison between the ETL, integration, Data Science, and BI teams to support the development of data solutions.
    • Worked on PoCs building pipelines from dbt Cloud to Snowflake.
    • Helped the Data Science team install Databricks for AI/ML use cases.
    • Designed a schema registry using Confluent Kafka and S3 connectors.
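The schema-registry work above hinges on compatibility checking between schema versions. As a minimal sketch of the idea (backward compatibility in the Confluent sense), using simplified field dictionaries rather than real Avro schemas — names here are illustrative, not from the source:

```python
# Sketch of the backward-compatibility rule a schema registry applies
# before accepting a new schema version. Schemas are simplified here to
# {field_name: has_default} dicts instead of real Avro definitions.

def is_backward_compatible(new_schema, old_schema):
    """A new (reader) schema is backward compatible if it can read data
    written with the old schema: every field the new schema declares must
    either exist in the old schema or carry a default value."""
    for field, has_default in new_schema.items():
        if field not in old_schema and not has_default:
            return False
    return True

old = {"order_id": False, "amount": False}
ok  = {"order_id": False, "amount": False, "currency": True}   # new field has a default
bad = {"order_id": False, "amount": False, "currency": False}  # new field, no default

print(is_backward_compatible(ok, old))   # True
print(is_backward_compatible(bad, old))  # False
```

A real registry also supports forward and full compatibility modes; this sketch covers only the backward check.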
  • Amazon Web Services (AWS)
    Data Lake Architect
    Amazon Web Services (AWS) Mar 2020 - Aug 2021
    Seattle, WA, US
    • Helped customers build a roadmap toward cloud adoption leveraging AWS services, primarily data analytics services for the AWS data lake such as AWS Lambda, Amazon Redshift, Amazon DynamoDB, AWS Database Migration Service (DMS), AWS Glue, Amazon EMR, Amazon Elastic Compute Cloud (EC2), Amazon S3, Amazon Relational Database Service (RDS)/Aurora, Amazon Kinesis, Amazon ECS, Amazon EKS, and AWS CloudFormation.
    • Involved in migration projects moving customers' existing on-premises data warehouses to an AWS data lake.
    • Led a team of cloud and data architects in designing and implementing AWS-based solutions for enterprise clients.
    • Successfully managed migrations from legacy systems to AWS, resulting in a 20% reduction in operational costs for several clients.
    • Built data pipelines in Python and PySpark on AWS Glue and Amazon EMR.
    • Responsible for building and delivering POCs, end-to-end architectural design, scoping projects, and gathering requirements from clients for their cloud adoption.
    • Invented and built big data and analytics solutions that solve complex problems, scale globally, guarantee performance, and enable breakthrough innovations.
    • Worked with systems engineers, consultants, and data scientists to design and build data analytics platforms and data lakes supporting compute-heavy data science, dashboarding, and web-facing production tooling.
    • Built ETL to consolidate and relate petabytes of data owned by disparate teams.
    • Worked with customers and partners, guiding them through planning, prioritization, and delivery of complex transformation initiatives while collaborating with relevant AWS sales and service teams.
    • (Re)designed solutions to use modern technologies and software development practices.
    • Created and applied frameworks, methods, best practices, and artifacts to guide customers; published and presented them in large forums and across various media platforms.
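The Glue/EMR pipeline work above typically centers on dedup-and-enrich logic. A minimal stand-in for that PySpark flow, written against plain Python lists so it runs anywhere — record fields (id, ts, amt, region) are hypothetical, not from the source:

```python
# Sketch of a dedup-and-enrich step from a Glue/EMR-style pipeline, using
# plain Python lists of dicts as a stand-in for PySpark DataFrames.

def dedupe_latest(records, key="id", ts="ts"):
    """Keep only the most recent record per key (like dropDuplicates
    after ordering by a timestamp column in PySpark)."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())

def enrich(records, master, key="id"):
    """Left-join transactional records against a master-data lookup."""
    return [{**rec, **master.get(rec[key], {})} for rec in records]

txns = [
    {"id": 1, "ts": 2, "amt": 10},
    {"id": 1, "ts": 5, "amt": 12},   # newer duplicate wins
    {"id": 2, "ts": 1, "amt": 7},
]
master = {1: {"region": "west"}, 2: {"region": "east"}}

clean = enrich(dedupe_latest(txns), master)
```

In actual PySpark the same shape is usually a window function partitioned by the key plus a broadcast join against the master table.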
  • nClouds
    Data Engineer
    nClouds Jun 2019 - Feb 2020
    San Francisco, CA, US
    • Worked with AWS Glue to move data ingested into the InfluxDB time-series database using PySpark API calls and DataFrames.
    • Created metadata tables in the AWS Glue Data Catalog, with crawlers picking up newly appended data in S3 storage.
    • Cataloged metadata in S3 as Hive partitions using crawlers, making it query-efficient for ad-hoc analysis in Athena.
    • Worked on deduping and cleansing data, and enriching it against master data storage, using PySpark DataFrames.
    • Enriched master data from the MySQL RDBMS into S3 by merging transactional data ingested via InfluxDB.
    • Involved in the end-to-end workflow of AWS Glue ETL architecture and data modeling.
    • Worked on data curation and improved the performance of PySpark code as an enhancement.
    • Managed end-to-end cloud projects, from planning to implementation.
    • Defined project scope, objectives, and deliverables.
    • Developed test cases from an ad-hoc user perspective and created test data from the QA UI.
    • Involved in regression and functional testing.
    • Worked with stakeholders to scope project requirements and determine work deadlines in an agile environment.
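The Hive-partition cataloging described above comes down to how objects are keyed in S3: Glue crawlers register `key=value` path segments as partitions, and Athena prunes scans to matching prefixes. A small sketch of building such keys — the table and file names are hypothetical:

```python
from datetime import datetime

# Sketch of Hive-style partition keys in S3, the layout that lets a Glue
# crawler register partitions and Athena prune scans to matching prefixes.

def partition_key(table, event_time, filename):
    """Build an S3 key like table/year=2020/month=03/day=15/file.parquet."""
    return (f"{table}/year={event_time:%Y}/month={event_time:%m}/"
            f"day={event_time:%d}/{filename}")

key = partition_key("sensor_readings", datetime(2020, 3, 15), "part-0000.parquet")
# An Athena query filtering on the partition columns then reads only
# the matching prefix, e.g.:
#   SELECT * FROM sensor_readings WHERE year='2020' AND month='03'
```

Without this layout, Athena must scan every object under the table prefix, which is where most of the ad-hoc query cost and latency comes from.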
  • Onit
    Data Integration Developer
    Onit Nov 2018 - May 2019
    Houston, TX, US
    • Configured and developed the required conversion routines, scripts, and processes to load the staging schema using Pentaho; supported end clients on the production servers.
    • Provided analytics on dirty data based on conversion runs, including those based on test data.
    • Developed and documented conversion output, including record counts, failures, and anomalies.
  • Disney Parks & Resorts
    Data Integration Developer
    Disney Parks & Resorts May 2017 - Oct 2018
    • Responsible for orchestrating end-to-end ETL workflows using PDI and PostgreSQL.
    • Helped downstream teams optimize the data used for their applications.
    • Hands-on experience moving data from client libraries and relational databases to HDFS using HDFS commands and Sqoop.
    • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, and tLogCatcher.
    • Analyzed large data sets (structured and unstructured) using Hive queries, Pig scripts, and PySpark.
    • Analyzed multiple sources of structured and unstructured data to propose and design data architecture solutions for scalability, high availability, fault tolerance, and elasticity.
  • Great West Financials
    Sr. Software Developer
    Great West Financials Sep 2015 - May 2017
    • Worked on cash modules, which included development and support in Pro*C, PL/SQL, UNIX, and Perl.
    • Developed pre-processor scripts in Perl as part of the ETL process to load the work_participant and work_klink_cash tables for further processing.
    • Coordinated the analysis and resolution of computer outages in an expedient manner.
    • Monitored and analyzed computer performance using the system console and software performance tools.
  • Comcast Wholesale
    Senior ETL Developer
    Comcast Wholesale Aug 2014 - Sep 2015
    Centennial, CO, US
    • Designed and implemented Change Data Capture (CDC) processes for fact and dimension tables through a combination of timestamps, staging (before/after images), and bridge tables.
    • Identified performance issues in existing sources, targets, and mappings by analyzing the data flow, evaluating transformations, and tuning accordingly for better performance.
    • Designed and implemented a business intelligence platform from scratch; integrated it with upstream systems using Hadoop, Pig, Hive, and other big data components for various functionalities. Made the platform more resilient and simplified its configuration so that clients can be onboarded with minimal settings.
  • Edward Jones
    Database Developer
    Edward Jones Mar 2013 - Aug 2014
    St. Louis, MO, US
    • Moved legacy data into new tables by writing new procedures.
    • Created customized procedures, cursors, triggers, and functions, and worked with Java developers to populate that data on the Edward Jones websites.
    • Built ETL for the data mart and pre-defined reports using the Pentaho BI suite, and deployed them on different servers.
    • Used JNDI to connect Pentaho reports to the database and variables to make reporting dynamic.
    • Worked with subject matter experts to create complex Oracle SQL queries for reports.
    • Involved in testing the scripts through unit testing, system integration testing, and regression testing, and helped deploy them from the AutoSys box.
  • VMware
    Oracle Application Developer
    VMware Sep 2012 - Feb 2013
    Palo Alto, CA, US
    • Responsible for implementation and customization of the OM, INV, AR, PO, PA, GL, and Purchasing modules.
    • Performance tuning of custom packages and standard APIs.
  • Conqsys I.T. (P) Ltd.
    SQL Developer
    Conqsys I.T. (P) Ltd. Aug 2007 - Jul 2010
    • Modified tables, synonyms, sequences, views, stored procedures, and triggers.
    • Generated SQL and PL/SQL scripts to create and drop database objects.
    • Enforced constraints for data validation.
    • Fixed performance issues and bugs in packages and stored procedures using EXPLAIN PLAN.
    • Created B-tree and bitmap indexes.
    • Loaded flat files into the database using SQL*Loader.
    • Used TOAD and SQL Developer for debugging applications.
    • Created UNIX shell scripts and scheduled cron jobs.
    • Wrote cursors and REF cursors for transaction processing.
    • Implemented database triggers based on the business rules and requirements.
    • Assisted with the Functional Requirement Specification and use case diagrams to streamline the business flow.

Sai C Skills

Agile Methodologies, SQL, Java, PL/SQL, Software Development, Project Management, Oracle SQL Developer, XML, Management, Integration, Software Project Management, Web Services, Java Enterprise Edition, Oracle PL/SQL, Unix, JavaScript, TortoiseSVN, TOAD, Tableau, Scrum, Oracle Forms, Oracle Reports, MongoDB, Microsoft SQL Server, SQL Tuning, T-SQL, AutoSys, Informatica, GitHub, Oracle E-Business Suite, Oracle Application Express, XML Publisher, Oracle Discoverer, HTML, Scripting, Pentaho

Sai C Education Details

  • Oklahoma City University
    Computer Software Engineering
  • International Technological University (ITU)
    Software Engineering
  • The ICFAI University, Dehradun
    Computer Science

Frequently Asked Questions about Sai C

What company does Sai C work for?

Sai C works for Caterpillar Inc.

What is Sai C's role at the current company?

Sai C's current role is Principal Solutions Architect.

What is Sai C's email address?

Sai C's email address is ch****@****ail.com

What is Sai C's direct phone number?

Sai C's direct phone number is 1 (800) 328*****

What schools did Sai C attend?

Sai C attended Oklahoma City University, International Technological University (ITU), and The ICFAI University, Dehradun.

What skills is Sai C known for?

Sai C has skills like Agile Methodologies, SQL, Java, PL/SQL, Software Development, Project Management, Oracle SQL Developer, XML, Management, Integration, Software Project Management, and Web Services.
