• Around 10 years of experience in the complete Software Development Life Cycle (SDLC), spanning analysis, specification, design, implementation, and testing phases.
• Applied agile methodology to ensure adaptability to changing project requirements, fostering iterative and collaborative development.
• Extensive experience in containerizing and deploying ETL and REST services on AWS ECS through CI/CD Jenkins pipelines, optimizing deployment processes. Successfully migrated on-premises applications to AWS, utilizing services like EC2, S3, EMR, and Glue.
• Designed, developed, and implemented performant ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR. Utilized AWS Glue for managing ETL services and data categorization with Athena integration.
• Developed Python-based RESTful web services for managing campaign rules using Django and PostgreSQL, deployed on AWS EC2.
• Applied unsupervised machine learning algorithms, specifically the K-means algorithm using PySpark, for public segmentation. Implemented machine learning schemes using the Python libraries scikit-learn and SciPy.
• Designed and developed UI for websites using HTML, XHTML, AJAX, CSS, and JavaScript. Implemented Bootstrap for effective HTML page layout management.
• Executed MySQL database queries from Python, imported data into Power BI from various sources, and worked on SQL Server DB and Azure SQL DB.
• Implemented IAM in AWS for user access levels and security. Integrated single sign-on authentication for applications using Python SDKs, SAML, and OpenID Connect.
• Utilized Jenkins for continuous integration, configured servers, and created scripts for project monitoring. Designed and implemented CI systems with Jenkins, TFS, and scripts using Perl and Python.
• Worked with Kubernetes for deploying, scaling, and load balancing, utilizing Docker Engine, Docker Hub, and Docker Compose for image management.
• Architected a Data Quality Framework for schema validation and data profiling on Spark using PySpark.
• Developed AWS Lambda functions in Python for automating workflows, managing S3 objects, and automating file copying.
• Worked extensively on various Google Cloud components, migrated applications to Google Cloud, and leveraged services like Dataproc, BigQuery, and Cloud Storage.
• Utilized Informatica Designer for creating complex mappings with various transformations, facilitating data movement to a Data Warehouse.
• Developed microservices using Flask, Django, and Docker. Implemented RESTful microservices and deployed them on AWS servers.
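The K-means segmentation work listed above can be sketched with Spark MLlib. This is a minimal illustration only: the input path, the `customer_id` column, and the choice of k are hypothetical, not taken from the resume.

```python
# Sketch of K-means public segmentation with PySpark (hypothetical data layout).

def feature_columns(schema_fields, id_col="customer_id"):
    """Pure helper: treat every column except the identifier as a feature."""
    return [f for f in schema_fields if f != id_col]

def segment_public(input_path, k=5):
    """Cluster records into k segments with Spark MLlib KMeans.

    PySpark imports are deferred so the sketch reads without Spark installed.
    Assumes the parquet input holds numeric feature columns plus customer_id.
    """
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.clustering import KMeans

    spark = SparkSession.builder.appName("public-segmentation").getOrCreate()
    df = spark.read.parquet(input_path)

    assembler = VectorAssembler(
        inputCols=feature_columns(df.columns), outputCol="features"
    )
    features = assembler.transform(df)

    model = KMeans(k=k, seed=42, featuresCol="features").fit(features)
    # 'prediction' holds the assigned segment id for each row
    return model.transform(features).select("customer_id", "prediction")
```

On EMR this kind of job would typically be submitted with `spark-submit`; the helper keeps feature selection testable outside Spark.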
PySpark/Python Developer, Cotiviti, United States (Jul 2022 - Present)
• Engaged in the complete Software Development Life Cycle (SDLC), including analysis, specification, design, implementation, and testing phases.
• Applied agile methodology to develop applications, ensuring adaptability to changing project requirements.
• Participated in requirement gathering and analysis, conducting workshops and meetings with business users to document business requirements.
• Collaborated with cross-functional teams to ensure alignment between technical solutions and business needs.
• Containerized and deployed ETL and REST services on AWS ECS through CI/CD Jenkins pipelines, optimizing deployment processes.
• Migrated an existing on-premises application to AWS, using services like EC2 and S3 for small-data-set processing and storage; experienced in maintaining the Hadoop cluster on AWS EMR.
• Designed, developed, and implemented performant ETL pipelines using the Python API (PySpark) of Apache Spark and AWS Glue on AWS EMR.
• Created Athena queries and integrated them with AWS Glue as a fully managed ETL service that can categorize the data.
• Developed a Python-based RESTful web service to insert, delete, and update campaign rules in the Job Config DB using Django and PostgreSQL. Deployed all the APIs on AWS EC2.
• Implemented memory management strategies for ETL services, enhancing overall performance.
• Applied unsupervised machine learning algorithms, specifically the K-means algorithm using PySpark, for public segmentation.
• Utilized PySpark SQL for efficient analysis of SQL scripts, improving performance.
• Developed Python Django forms for recording online user data and used pytest for writing test cases.
• Utilized Python OpenStack APIs and scripts for database updates and file manipulation.
• Designed and deployed machine learning solutions in Python for classifying Twitter users.
• Developed Spark applications in both Scala and Python (PySpark) for efficient data processing.
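A PySpark ETL pipeline of the kind described in this role can be sketched as follows. The source/target paths, `record_id` and `status` columns, and partition key are invented for the example; the pure helper mirrors the Spark expression so the transformation rule is testable without Spark.

```python
# Hedged sketch of a PySpark ETL job (dedupe, normalize, filter, write parquet).

def normalize_status(raw):
    """Pure per-record rule mirrored by the Spark expression below."""
    return (raw or "").strip().upper()

def run_etl(source_path, target_path):
    """Read JSON records, clean them, and write partitioned parquet.

    PySpark imports are deferred; column names are illustrative only.
    """
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("campaign-etl").getOrCreate()

    df = spark.read.json(source_path)
    cleaned = (
        df.dropDuplicates(["record_id"])
          .withColumn("status", F.upper(F.trim(F.col("status"))))
          .filter(F.col("status") != "VOID")
    )
    cleaned.write.mode("overwrite").partitionBy("load_date").parquet(target_path)
```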
PySpark/Python Developer, Express Scripts by Evernorth, United States (Nov 2019 - Jun 2022)
• Imported data from multiple sources, including cloud services and local file systems, into Spark RDDs for scalable data processing.
• Leveraged Amazon Web Services (AWS) technologies like EMR, S3, EC2, and Lambda for efficient data storage and processing.
• Developed ETL jobs using AWS Glue to update data in staging databases (e.g., Postgres) from diverse sources and REST APIs.
• Utilized Apache Spark and Spark on YARN to optimize algorithms and improve performance within Hadoop clusters.
• Developed automation scripts (Python, shell scripting) for task automation, deployment, and orchestration of cloud services.
• Implemented continuous testing practices, including unit testing with pytest, to ensure code quality and reliability.
• Utilized Jenkins for continuous integration, building code, and deploying applications to remote servers.
• Created automated pipelines in AWS CodePipeline for deploying Docker containers on AWS ECS using CloudFormation and S3.
• Developed and scheduled Hive queries using the Oozie coordinator for daily data processing tasks.
• Used Hive tables, Impala tables, partitions, and buckets extensively for analyzing large volumes of data.
• Developed Docker-based microservices and deployment modules with Jenkins, leveraging Kubernetes for scalable deployments.
• Designed and developed RESTful microservices using Flask and Django, deployed on AWS EC2 and Elastic Beanstalk.
• Integrated the Flask framework into web applications for controlling application logic and managing backend services.
• Updated IAM roles on AWS to ensure proper access controls for users and applications.
• Implemented detokenization techniques for sensitive data before file transfer to enhance security measures.
• Utilized SQL and NoSQL databases (e.g., PostgreSQL, Cosmos DB) for data storage and retrieval.
• Developed and maintained web applications using frameworks like Flask and Django, along with OpenStack APIs.
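The Lambda-based S3 file automation mentioned in this role could look roughly like the sketch below. The target bucket name and `processed/` prefix are hypothetical; the pure helper computes the destination key so that logic is testable without AWS.

```python
# Sketch of an S3-triggered copy Lambda (bucket names and prefixes invented).

def destination_key(source_key, prefix="processed/"):
    """Pure helper: place the copied object's basename under a target prefix."""
    return prefix + source_key.rsplit("/", 1)[-1]

def lambda_handler(event, context):
    """Copy each object referenced by an S3 event notification.

    boto3 import is deferred so the sketch reads without the AWS SDK installed.
    """
    import boto3

    s3 = boto3.client("s3")
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        s3.copy_object(
            Bucket="archive-bucket",  # hypothetical target bucket
            Key=destination_key(key),
            CopySource={"Bucket": bucket, "Key": key},
        )
    return {"status": "ok"}
```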
Python Developer, Charles Schwab (Nov 2017 - Oct 2019)
• Developed customized APIs with token-based authentication for Django using the REST Framework, SSRS, and PySpark.
• Automated infrastructure activities like continuous deployment, application server setup, and stack monitoring using Ansible playbooks.
• Worked on Amazon Web Services (AWS Kinesis, EC2) infrastructure with automation and configuration management tools.
• Designed and developed ETL integration patterns using PySpark and AWS Glue.
• Developed a notification service by posting JSON requests to AWS API Gateway, validating the response in Lambda by fetching data from the database, and sending notifications through AWS SNS and SQS.
• Developed tools using Python and XML to automate some of the manual tasks.
• Integrated Azure cloud services with Python to store data securely in the cloud.
• Implemented server-side technologies with RESTful APIs and MVC design patterns using NodeJS and the Django framework.
• Developed and maintained infrastructure for a new project using Ansible, Redis, Postgres, and Celery.
• Developed a microservices architecture using Python and Docker on an Ubuntu Linux platform, using HTTP/REST interfaces with deployment into a multi-node Kubernetes environment.
• Created an ETL pipeline using Spark and Hive for ingesting data from multiple sources.
• Worked with Presto, Hive, Spark SQL, and BigQuery, leveraging Python client libraries to build efficient and interoperable analytics programs.
• Employed version control tools like GitHub and Bitbucket for code versioning and maintenance.
• Developed and executed reports using Tableau for effective data visualization.
• Developed dynamic web pages using the Python Django framework, created various templates for different customers, and saved their user preferences.
• Monitored database performance with the help of AWS CloudWatch and communicated with users to consume resources optimally.
• Developed back-end components using Python 3.7 on the Django 1.7 framework and front-end using HTML5, CSS/CSS3, and JavaScript.
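The API Gateway → Lambda → SNS notification flow described in this role can be sketched as below. The topic ARN, account id, and payload fields are hypothetical; the pure helper shapes the SNS message so the payload format is testable without AWS.

```python
# Sketch of a notification Lambda publishing to SNS (topic ARN and fields invented).
import json

def build_message(user_id, text):
    """Pure helper: shape the notification payload posted to SNS."""
    return json.dumps({"user_id": user_id, "message": text})

def lambda_handler(event, context):
    """Handle a JSON request proxied from API Gateway and publish to SNS."""
    import boto3

    body = json.loads(event["body"])  # JSON request forwarded by API Gateway
    sns = boto3.client("sns")
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:notifications",  # hypothetical
        Message=build_message(body["user_id"], body["text"]),
    )
    return {"statusCode": 200, "body": json.dumps({"sent": True})}
```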
Python Developer, The Hackett Group Inc. (May 2014 - Jul 2017)
• Followed agile methodologies to manage the full life-cycle development of projects.
• Ensured efficient delivery of code based on Test-Driven Development (TDD) principles and continuous integration, aligning with agile software methodology.
• Developed views and templates using Python and Django view controllers and the templating language for creating user-friendly website interfaces.
• Worked on large-scale, high-traffic Python web applications to define and implement new features, enhance core functionality, and integrate with other platforms and services.
• Installed, configured, upgraded, migrated, and patched DB2, Oracle, and SQL Server databases.
• Established database connections for Python by configuring packages like MySQL-Python.
• Handled potential points of failure through robust error-handling strategies and ensured clear communication of failures to relevant stakeholders.
• Deployed projects to Heroku using the Git version control system and a microservices architecture.
• Developed various Python scripts for vulnerability assessments with SQL queries, including SQL injection, permission checks, and performance analysis.
• Utilized standard Python modules like csv, robotparser, itertools, pickle, xml, and Jinja2 for development.
• Implemented MVC architecture in developing web applications with the Django framework.
• Worked with MVW frameworks and libraries like Django, AngularJS, HTML, CSS, XML, JavaScript, jQuery, Bootstrap, and JSON.
• Designed, developed, and implemented ETL pipelines using the Python API (PySpark) on AWS EMR.
• Developed ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake.
• Used the DataStax Spark connector to store data into or retrieve data from Cassandra databases.
• Utilized SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task for data integration.
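The standard-library scripting mentioned in this role (csv, itertools, and friends) can be illustrated with a small, self-contained example; the sample data and the `region` grouping key are invented for the sketch.

```python
# Small illustration of csv + itertools usage (sample data invented).
import csv
import io
from itertools import groupby

def rows_by_region(csv_text):
    """Parse CSV text and group row dicts by their 'region' column.

    groupby needs its input sorted on the grouping key, hence the sorted() call.
    """
    rows = sorted(csv.DictReader(io.StringIO(csv_text)), key=lambda r: r["region"])
    return {region: list(group) for region, group in groupby(rows, key=lambda r: r["region"])}

sample = "region,city\neast,Boston\nwest,Denver\neast,Miami\n"
grouped = rows_by_region(sample)
# grouped["east"] now holds the two eastern rows, grouped["west"] the one western row
```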
Ashok Kurapati Education Details
Swarnandhra Institute Of Engineering & Technology, Seetharampuram, Narsapur, Pin-534280