Experienced in designing, building, integrating, and maintaining scalable, fault-tolerant, multi-platform data pipelines on-premises and in data lakes, and in architecting end-to-end, customer-focused solutions. Avid team player who has led teams and interacted and interfaced with various cross-functional teams. Leveraged downstream and upstream applications to build custom dashboards for business stakeholders. Instrumental in industry domains such as finance, entertainment, media, and cable. Successfully implemented SDLC methodologies such as Agile, Kanban, and Scrum. Technology agnostic, always evolving, with a penchant for learning new technologies, methods, and best practices.
Principal Solutions Architect
Caterpillar Inc. | Tempe, AZ, US
Principal Solutions Architect | Segment Strategist
Discount Tire | Sep 2021 - Present | Scottsdale, AZ, US
• Drove Discount Tire's enterprise analytics architecture to support enterprise insights and decision-making, leveraging established architecture disciplines.
• Leveraged knowledge of the organization's information, application, and infrastructure environment, as well as the current technology landscape, to design a holistic and optimized analytics platform.
• Defined strategies to operationalize analytics-supporting capabilities in discovery, analysis, and reporting across business segments.
• Designed and implemented Microsoft Azure and AWS solutions for product teams across various segments, ensuring scalability, security, and performance.
• Performed hands-on solution design, solution architectures, architecture roadmaps, prototyping, proofs of concept, and development tasks as required in support of projects and products.
• Developed SQL queries extensively to research, understand, and relate distributed data repositories across the enterprise.
• Developed and documented data models, ensuring that data is defined and searchable; collaborated with users across the organization to capture output/process changes and revise data models.
• Supported the Data Science team in implementing advanced analytical models and supporting data pipelines, and developed strategies and approaches for Analytics as a Service.
• Partnered effectively with MDM (Master Data Management), data SMEs, and other stakeholders (sponsors, product solutions architects, etc.) to design solutions, including identifying and filling critical gaps.
• Developed and maintained data artifacts and acted as a liaison between the ETL, integration, Data Science, and BI teams to support the development of data solutions.
• Worked on PoCs building pipelines from dbt Cloud to Snowflake.
• Helped the Data Science team install Databricks for AI/ML use cases.
• Designed a schema registry using Confluent Kafka and S3 connectors (see the sketch after this entry).
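A minimal sketch of what registering an event schema with Confluent Schema Registry can look like from Python; the registry URL, subject name, and the TireOrder fields are illustrative assumptions, and the S3 sink connector side would be configured separately in Kafka Connect.

```python
# Minimal sketch: registering an Avro schema with Confluent Schema Registry.
# The registry URL, subject name, and record fields are illustrative.
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

ORDER_SCHEMA = """
{
  "type": "record",
  "name": "TireOrder",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "store_id", "type": "int"},
    {"name": "amount", "type": "double"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "https://schema-registry.example.com"})
schema_id = registry.register_schema(
    subject_name="tire-orders-value",
    schema=Schema(ORDER_SCHEMA, schema_type="AVRO"),
)
print(f"Registered schema id: {schema_id}")
```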
Data Lake Architect
Amazon Web Services (AWS) | Mar 2020 - Aug 2021 | Seattle, WA, US
• Helped customers build a roadmap toward cloud adoption leveraging AWS services, chiefly the data analytics services for an AWS data lake: AWS Lambda, Amazon Redshift, Amazon DynamoDB, AWS Database Migration Service (DMS), AWS Glue, Amazon EMR, Amazon EC2, Amazon S3, Amazon RDS/Aurora, Amazon Kinesis, Amazon ECS, Amazon EKS, and AWS CloudFormation.
• Involved in migration projects moving customers' existing on-premises data warehouses to an AWS data lake.
• Led a team of cloud and data architects in designing and implementing AWS-based solutions for enterprise clients.
• Successfully managed migrations from legacy systems to AWS, resulting in a 20% reduction in operational costs for several clients.
• Built data pipelines in Python and PySpark on AWS Glue and Amazon EMR (a representative Glue job is sketched after this entry).
• Responsible for building and delivering POCs, end-to-end architectural design, scoping projects, and gathering requirements from clients for their cloud adoption.
• Invented and built big data and analytics solutions that solve complex problems, scale globally, guarantee performance, and enable breakthrough innovations.
• Worked with systems engineers, consultants, and data scientists to design and build data analytics platforms and data lakes supporting compute-heavy data science, dashboarding, and web-facing production tooling.
• Built ETL to consolidate and relate petabytes of data owned by disparate teams.
• Worked with customers and partners, guiding them through planning, prioritization, and delivery of complex transformation initiatives while collaborating with the relevant AWS sales and service teams.
• Redesigned solutions to use modern technologies and software development practices.
• Created and applied frameworks, methods, best practices, and artifacts that guide customers; published and presented them in large forums and across various media platforms.
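The Glue-based pipelines described above could follow a pattern along these lines; this is a hedged sketch rather than an actual production job, and the database, table, and bucket names are placeholders.

```python
# Minimal sketch of an AWS Glue PySpark job: read a catalog table, drop exact
# duplicates, and write partitioned Parquet back to S3. Database, table, and
# bucket names are illustrative.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (populated by a crawler).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Deduplicate using plain Spark DataFrame operations.
cleaned = source.toDF().dropDuplicates(["order_id"])

# Write curated data back to S3 as Parquet, partitioned by order date.
(cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/"))

job.commit()
```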
Data Engineer
nClouds | Jun 2019 - Feb 2020 | San Francisco, CA, US
• Worked on AWS Glue to move data ingested into the InfluxDB time-series database, using PySpark API calls and DataFrames.
• Created metadata tables in the AWS Glue Data Catalog that crawl the S3 data store as new data is appended.
• Cataloged metadata in S3 as Hive partitions using crawlers and made it query-efficient for Athena ad hoc analysis.
• Worked on deduping and cleansing data, and enriched it against the master data store using PySpark DataFrames (see the sketch after this entry).
• Enriched master data from the MySQL RDBMS into S3 by merging in transactional data ingested via InfluxDB.
• Involved in the end-to-end workflow of the AWS Glue ETL architecture and data modeling.
• Worked on data curation and improved the performance of PySpark code as an enhancement.
• Managed end-to-end cloud projects, from planning to implementation.
• Defined project scope, objectives, and deliverables.
• Developed test cases from an ad hoc user perspective and created test data from the QA UI.
• Involved in regression and functional testing.
• Worked with stakeholders to scope project requirements and determine work deadlines in an agile environment.
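The dedupe-and-enrich step might look roughly like this in PySpark; the MySQL connection details, table names, and columns are illustrative assumptions.

```python
# Minimal sketch of the cleanse-and-enrich step: transactional records are
# deduplicated, normalized, and joined to master data read from MySQL over JDBC.
# Table names, columns, and connection details are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("enrich-transactions").getOrCreate()

# Transactional data previously landed in S3.
txns = spark.read.parquet("s3://example-raw-bucket/transactions/")

# Master data pulled from the MySQL RDBMS.
master = (spark.read.format("jdbc")
    .option("url", "jdbc:mysql://mysql.example.com:3306/masterdb")
    .option("dbtable", "customers")
    .option("user", "etl_user")
    .option("password", "***")
    .load())

# Cleanse: drop duplicates and normalize the join key.
cleaned = (txns.dropDuplicates(["txn_id"])
    .withColumn("customer_id", F.trim(F.col("customer_id"))))

# Enrich transactions with customer attributes from master data.
enriched = cleaned.join(master, on="customer_id", how="left")

enriched.write.mode("overwrite").parquet("s3://example-curated-bucket/transactions_enriched/")
```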
Data Integration Developer
Onit | Nov 2018 - May 2019 | Houston, TX, US
• Configured and developed the required conversion routines, scripts, and processes to load the staging schema using Pentaho; supported end clients on the production servers.
• Provided analytics of dirty data based on conversion runs, including runs based on test data.
• Developed and documented conversion output, including record counts, failures, and anomalies.
Data Integration Developer
Disney Parks & Resorts | May 2017 - Oct 2018
• Responsible for orchestrating end-to-end ETL workflows using PDI and PostgreSQL.
• Helped downstream teams optimize the data used for their applications.
• Hands-on experience moving data from client libraries and relational databases to HDFS using HDFS commands and Sqoop.
• Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, and tLogCatcher.
• Analyzed large data sets (structured and unstructured) using Hive queries, Pig scripts, and PySpark (see the sketch after this entry).
• Analyzed multiple sources of structured and unstructured data to propose and design data architecture solutions for scalability, high availability, fault tolerance, and elasticity.
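A small example of the kind of Hive-backed analysis described above, run from PySpark; the parks_db.turnstile_events table and its columns are hypothetical.

```python
# Minimal sketch of analyzing a Hive-managed table from PySpark with Spark SQL.
# The database, table, and column names are illustrative.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
    .appName("park-attendance-analysis")
    .enableHiveSupport()  # lets Spark read tables registered in the Hive metastore
    .getOrCreate())

# Aggregate structured data already cataloged in Hive.
daily_attendance = spark.sql("""
    SELECT park_id, visit_date, COUNT(*) AS visits
    FROM parks_db.turnstile_events
    GROUP BY park_id, visit_date
""")

daily_attendance.show(20)
```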
Sr. Software Developer
Great West Financials | Sep 2015 - May 2017
• Worked on cash modules, which included development and support in Pro*C, PL/SQL, UNIX, and Perl.
• Developed pre-processor scripts in Perl as part of the ETL process to load the work_participant and work_klink_cash tables for further processing.
• Coordinated the analysis and resolution of computer outages in an expedient manner.
• Monitored and analyzed computer performance using the system console and software performance tools.
Senior ETL Developer
Comcast Wholesale | Aug 2014 - Sep 2015 | Centennial, CO, US
• Designed and implemented Change Data Capture (CDC) processes for fact and dimension tables through a combination of timestamps, staging (before/after images), and bridge tables (a timestamp-based sketch follows this entry).
• Identified performance issues in existing sources, targets, and mappings by analyzing the data flow, evaluating transformations, and tuning accordingly for better performance.
• Designed and implemented a Business Intelligence platform from scratch; integrated it with upstream systems using Hadoop, Pig, Hive, and other big data components for various functionalities. Made the platform more resilient and reduced the configuration required, so that clients can be onboarded with minimal settings.
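A timestamp-based CDC pass can be sketched as below; the HDFS paths, the last_modified_ts column, and the watermark handling are illustrative assumptions rather than the actual implementation.

```python
# Minimal sketch of timestamp-based CDC for an incremental fact load:
# only rows modified since the last successful run are pulled and appended.
# Paths, the watermark value, and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdc-incremental-load").getOrCreate()

# Watermark from the previous run (in practice, read from a control table).
last_watermark = "2015-06-30 23:59:59"

source = spark.read.parquet("hdfs:///staging/orders_before_after/")

# Capture only rows changed since the last load.
changes = source.filter(
    F.col("last_modified_ts") > F.to_timestamp(F.lit(last_watermark))
)

# Append the delta to the fact table and record the new watermark.
changes.write.mode("append").parquet("hdfs:///warehouse/fact_orders/")
new_watermark = changes.agg(F.max("last_modified_ts")).first()[0]
print(f"New watermark: {new_watermark}")
```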
Database Developer
Edward Jones | Mar 2013 - Aug 2014 | St. Louis, MO, US
• Moved legacy data into new tables by writing new procedures.
• Created customized procedures, cursors, triggers, and functions, and worked with Java developers to populate that data on Edward Jones websites.
• Built ETL for the data mart and pre-defined reports using the Pentaho BI suite, and deployed them on different servers.
• Used JNDI to connect Pentaho reports to the database, and variables to make reporting dynamic.
• Worked with subject matter experts in creating complex Oracle SQL queries for reports.
• Involved in testing the scripts through unit testing, system integration testing, and regression testing, and helped deploy them from the AutoSys box.
Oracle Application Developer
VMware | Sep 2012 - Feb 2013 | Palo Alto, CA, US
• Responsible for implementation and customization of the OM, INV, AR, PO, PA, GL, and Purchasing modules.
• Performance tuning of custom packages and standard APIs.
SQL Developer
Conqsys I.T. (P) Ltd. | Aug 2007 - Jul 2010
• Modified tables, synonyms, sequences, views, stored procedures, and triggers.
• Generated SQL and PL/SQL scripts to create and drop database objects.
• Enforced constraints for data validation.
• Fixed performance issues and bugs in packages and stored procedures using explain plans.
• Created B-tree and bitmap indexes.
• Loaded flat files into the database using SQL*Loader.
• Used TOAD and SQL Developer for debugging applications.
• Created UNIX shell scripts and scheduled cron jobs.
• Wrote cursors and REF cursors for transaction processing.
• Implemented database triggers based on business rules and requirements.
• Assisted with functional requirement specifications and use case diagrams to streamline the business flow.
Sai C Skills
Sai C Education Details
Oklahoma City University | Computer Software Engineering
International Technological University (ITU) | Software Engineering
The ICFAI University, Dehradun | Computer Science
Frequently Asked Questions about Sai C
What company does Sai C work for?
Sai C works for Caterpillar Inc.
What is Sai C's role at the current company?
Sai C's current role is Principal Solutions Architect.
What is Sai C's email address?
Sai C's email address is ch****@****ail.com
What is Sai C's direct phone number?
Sai C's direct phone number is 1 (800) 328*****
What schools did Sai C attend?
Sai C attended Oklahoma City University, International Technological University (ITU), and The ICFAI University, Dehradun.
What skills is Sai C known for?
Sai C has skills like Agile Methodologies, SQL, Java, PL/SQL, Software Development, Project Management, Oracle SQL Developer, XML, Management, Integration, Software Project Management, and Web Services.