• Over 8 years of professional IT experience: expertise in designing and developing applications using Python, Java, SQL, and NoSQL databases.
• AWS Cloud Resource Development: Proficient in AWS services including MSK (Kafka), EventBridge, Lambda, EC2, EMR, Glue, S3, Redshift, VPC, Route 53, CloudWatch, CloudTrail, CloudFormation, SES, SNS, SQS, and API Gateway. AWS Certified.
• Container Orchestration and Microservices: Extensive experience with Kubernetes (EKS) for deploying, managing, and scaling microservices. Proficient in Terraform and Helm for infrastructure as code and configuration management.
• Authentication and Authorization: Strong knowledge of implementing secure authentication and authorization mechanisms, ensuring robust security in application development.
• Web and Microservice Development: Specialized in developing scalable web applications and microservices, with hands-on experience building reference architectures and using modern web frameworks.
• Data Engineering and Integration: Built and maintained large-scale data ingestion pipelines from Kafka to AWS S3. Experienced in data modeling, ETL processes, and AWS Glue for schema management. Proficient in real-time and batch data processing using Spark, Kafka, and Hadoop ecosystem technologies.
• Continuous Integration and Continuous Deployment (CI/CD): Expertise in deploying Python-based applications using Looper and Concourse CI/CD pipelines, achieving significant improvements in deployment times and reliability.
• Database Management and Performance Optimization: Extensive experience in database design, data integration (ETL), metadata management, and performance optimization. Skilled in SQL Server, PostgreSQL, and Snowflake for a range of data management tasks.
• Big Data Solutions: Implemented Big Data solutions in cloud environments using Hadoop, Spark, HBase, and MongoDB. Proficient in AWS Redshift, S3, Redshift Spectrum, and Athena for querying large datasets.
• Database Replication: Worked with HVR real-time database replication software; able to create and load tables using HVR and configure it for integration.
• Automation and Monitoring: Developed comprehensive automation frameworks for application deployment and monitoring, enhancing system robustness and operational efficiency. Experienced with Jenkins, Artifactory, SonarQube, and Selenium for CI/CD and automated testing.
• Project Leadership and Innovation: Led Spark programs for data accuracy and researched innovative techniques for Salesforce and cloud integration.
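The Kafka-to-S3 ingestion work above typically lands records under date-partitioned object keys so that Glue crawlers and Athena can prune by date. A minimal sketch of that key layout (function and prefix names are illustrative, not the production code):

```python
from datetime import datetime, timezone

def s3_object_key(topic: str, ts: datetime, offset: int) -> str:
    """Build a date-partitioned S3 object key for a Kafka record batch.

    Hive-style year=/month=/day= partitions let Glue crawlers and
    Athena / Redshift Spectrum prune scans by event date.
    """
    return (
        f"raw/{topic}/"
        f"year={ts.year:04d}/month={ts.month:02d}/day={ts.day:02d}/"
        f"{offset}.json"
    )

key = s3_object_key("orders", datetime(2024, 1, 15, tzinfo=timezone.utc), 42)
# 'raw/orders/year=2024/month=01/day=15/42.json'
```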
Senior Data Network Engineer
AT&T Labs, Inc., Houston, TX, US
Python Data Engineer
Duke Energy Corporation, Charlotte, North Carolina, US | Feb 2023 - Jul 2024
• Architected and implemented a comprehensive AWS-based solution for monitoring and notifying on schema changes across RDS and the AWS Glue Catalog, enhancing data governance and schema consistency.
• Developed and maintained SQL Server stored procedures, functions, triggers, and indexes, significantly improving database performance and reliability for critical energy operations.
• Used SSIS to create and manage ETL processes, connecting to multiple data sources and ensuring seamless data integration and processing.
• Designed and implemented Concourse CI/CD pipelines for continuous integration and deployment of AWS Lambda functions, increasing automation and reducing manual intervention in pipeline execution.
• Developed Power BI and SSRS reports, providing advanced data visualization and reporting capabilities to support strategic decision-making.
• Implemented and maintained a robust data ingestion pipeline from Kafka to AWS S3, using AWS Lambda for efficient data processing and transfer and ensuring real-time data availability and integrity.
• Configured AWS Glue Crawlers to update the Glue Catalog, automatically detecting and documenting schema changes from incoming data streams.
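At its core, a schema-change monitor like the one described reduces to diffing two column snapshots (e.g. successive Glue Catalog table versions) and alerting on the difference. A simplified, self-contained sketch (names and shapes are hypothetical, not the production code):

```python
def diff_schemas(old: dict, new: dict) -> dict:
    """Compare two {column: type} schema snapshots and report changes.

    Returns columns that were added, removed, or re-typed between the
    old and new snapshots; an empty report means no schema drift.
    """
    added = {c: t for c, t in new.items() if c not in old}
    removed = {c: t for c, t in old.items() if c not in new}
    retyped = {
        c: (old[c], new[c])
        for c in old.keys() & new.keys()
        if old[c] != new[c]
    }
    return {"added": added, "removed": removed, "retyped": retyped}

changes = diff_schemas(
    {"id": "bigint", "name": "string"},
    {"id": "bigint", "name": "varchar(64)", "created_at": "timestamp"},
)
```

In the AWS setup described, the snapshots would come from the Glue `GetTable` API and a non-empty report would trigger an SNS notification; that wiring is omitted here.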
Python Data Engineer
Verizon, Basking Ridge, NJ, US | Jan 2022 - Jan 2023
• Leveraged Google Cloud Platform (GCP) and BigQuery to enhance data analytics capabilities, significantly improving data processing speed and accuracy for client projects.
• Developed and optimized SQL queries and stored procedures in BigQuery, leading to a 20% increase in query performance and supporting large-scale data analysis.
• Designed robust data pipelines using Google Cloud Dataflow, ensuring efficient data transformation and loading that supported real-time business intelligence.
• Implemented data models and schemas in BigQuery, which facilitated complex data analysis and supported strategic decision-making for top management.
• Automated data ingestion and ETL processes using Google Cloud Composer, enhancing operational efficiency and reliability in data handling.
• Conducted detailed data quality assessments and validation, ensuring the integrity and accuracy of data in BigQuery for critical business operations.
• Collaborated with cross-functional teams to integrate GCP solutions, enhancing overall system architecture and achieving seamless data flow across platforms.
• Provided expert guidance on GCP best practices, optimizing resource utilization and cost management for cloud services.
• Managed and optimized PowerShell and VBA scripts that interact with Power BI tenants and Excel sheets, improving script efficiency and performance.
• Mentored a team of junior developers in GCP and BigQuery, raising their productivity and technical skills by 30%.
• Designed custom dashboards using Google Data Studio, improving data visualization and accessibility for business stakeholders by 40%.
• Led the deployment of machine learning models on GCP, using BigQuery ML to predict market trends and inform business strategy.
• Ensured compliance with data security and privacy standards in cloud deployments, maintaining high levels of data protection and trust.
• Documented technical processes and trained end users on new GCP features and BigQuery functionalities.
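The data-quality assessment work mentioned above usually boils down to row-level checks before data is trusted for downstream analysis. A minimal sketch of such a check, assuming a simple list-of-dicts row format (the real checks ran against BigQuery tables, which is not reproduced here):

```python
def validate_rows(rows, required, not_null):
    """Row-level data-quality check: every required column must be
    present, and designated columns must be non-null.  Returns the
    rows that passed plus a structured error report for the rest."""
    good, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required if c not in row]
        nulls = [c for c in not_null if row.get(c) is None]
        if missing or nulls:
            errors.append({"row": i, "missing": missing, "null": nulls})
        else:
            good.append(row)
    return good, errors

good, errors = validate_rows(
    [
        {"id": 1, "region": "NE"},
        {"id": 2, "region": None},
        {"region": "SW"},
    ],
    required=["id", "region"],
    not_null=["region"],
)
```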
Python Developer
USAA, San Antonio, Texas, US | Oct 2019 - Dec 2021
• Containerized the application into a Docker image to run it universally and hosted it on Docker Hub.
• Deployed the model on AWS SageMaker with Docker, including security configurations for the Docker image on CentOS.
• Built the frontend of the application using MLflow and Flask to compare model runs and evaluate the model's KPIs.
• Deployed the vanilla version of the code to AWS using a Jenkins CI/CD pipeline and configured all requirements per the architecture design.
• Improved algorithm accuracy from 86% to 94% by implementing wildcard-based regex features for the existing NLP algorithms.
• Built PySpark algorithms for different data aggregations based on the specifications.
• Involved in performance and process enhancement of the PySpark framework.
• Developed integration checks around the PySpark framework for processing large datasets.
• Migrated the PySpark framework into AWS Glue for enhanced processing.
• Wrote various automation scripts to automate data processing on AWS Glue.
• Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
• Built an integration testing system that exercised the entire framework on each code check-in.
• Used the SAS Output Delivery System (ODS) to format reports in HTML, PDF, CSV, and RTF.
• Created databases using MySQL and wrote several queries to extract data from the database.
• Communicated effectively with external vendors to resolve queries.
Environment: Python, PySpark, Java, wxPython, Apache, AWS Glue, AWS Athena, AWS Step Functions, CloudFormation, pytest, Bootstrap, Flask, Oracle, PL/SQL, MySQL, MS-SQL, REST, PyCharm, Windows, Linux.
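The wildcard-based regex feature mentioned above can be illustrated by the common pattern of translating user-facing wildcards into anchored regular expressions. A hedged sketch of that translation (the actual NLP feature engineering is not shown):

```python
import re

def wildcard_to_regex(pattern: str) -> re.Pattern:
    """Translate a simple wildcard pattern ('*' = any run of characters,
    '?' = exactly one character) into a compiled, fully anchored regex.
    All other characters are escaped so they match literally."""
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "?":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return re.compile("^" + "".join(parts) + "$")

rx = wildcard_to_regex("acct-????-*")
```

Such compiled patterns can then feed binary match features into an existing NLP classifier; how they were weighted in the 86%-to-94% improvement is specific to the original system.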
Python Developer
Oracle, Austin, Texas, US | Sep 2017 - Jun 2019
• Worked on predictive analytics use cases using the R language.
• Used Python unit and functional testing modules such as unittest, unittest2, mock, and custom frameworks in line with Agile software development methodologies.
• Performed several roles in the application development lifecycle serving enterprise clients in tech and fintech. Tech stack included Java, Python, HTML, CSS, SQL, NoSQL, and Oracle Cloud applications.
• Interacted with clients' executives throughout the entire project lifecycle, from requirements gathering to maintenance.
• Used Agile processes during the implementation phase; improved deliverables productivity by 20% and completed deliverables two months faster.
• Developed test scenarios and test cases that achieved a 95% success rate.
• Built, enhanced, debugged, and troubleshot scalable enterprise software programs with 50,000+ lines of code.
• Improved data mining processes, resulting in a 20% decrease in the time needed to infer insights from customer data used to develop marketing strategies.
• Used predictive analytics such as machine learning and data mining techniques to forecast company sales of new products with a 95% accuracy rate.
• Increased data security by updating companywide encryption, steganography, IP security, and secure wireless transmission practices.
• Automated the Performance Improvement Plan process for 13 business units, saving 600+ monthly work hours.
• Developed an ERP analytics dashboard delivering a real-time overview of the client CEO's 13 business units with quarterly predictions using analytics and machine learning; boosted quarterly client business activity by 20%.
• Integrated external modules into Oracle Cloud HCM and developed custom reports at 2x speed.
• Performed an in-depth study of Elasticsearch, Kibana, Logstash, Kafka, and Spark to establish
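The unittest/mock testing approach mentioned in this role follows the standard dependency-injection pattern: the code under test takes its collaborators as arguments so tests can substitute `unittest.mock.Mock` objects. A small sketch under those assumptions (function and endpoint names are hypothetical):

```python
from unittest import mock

def fetch_price(client, sku):
    """Thin wrapper around a remote pricing API; the client object is
    injected so tests can substitute a Mock for the real network call."""
    return client.get(f"/prices/{sku}")["amount"]

# In a unit test, the real HTTP client is replaced with a Mock:
client = mock.Mock()
client.get.return_value = {"amount": 19.99}

price = fetch_price(client, "SKU-1")
client.get.assert_called_once_with("/prices/SKU-1")
```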
Education
JNTU Anantapur, Electronics and Computer Science