Lakshmi S

DevOps Cloud Engineer at Cardinal Health
Dublin, Ohio, United States

About Lakshmi S

• Around 8 years of IT experience as a Cloud/DevOps Engineer with a major focus on Microsoft Azure, AWS, Google Cloud Platform (GCP), OpenStack, continuous integration, continuous deployment, configuration management, and build/release packaging with quality control, including troubleshooting and performance issues.
• Expertise in Amazon Web Services (AWS) cloud services: VPC, IAM, EC2, S3, Route 53, ELB, CodeCommit, CodeBuild, CodeDeploy, CloudFormation, EBS, and RDS.
• Experience in pair programming, test-driven development, the Waterfall model, and Agile methodologies such as Scrum and Kanban.
• Experience maintaining AWS EC2 (Elastic Compute Cloud) resources such as instances, EBS volumes, snapshots, ELBs (Elastic Load Balancers), AMIs, security groups, Elastic IPs, key pairs, and Amazon CloudWatch across different zones in development.
• Experience with the SaaS, PaaS, and IaaS concepts of cloud computing, implemented on Azure and Google Cloud Platform, with an understanding of software configuration management (SCM) principles in Agile, Scrum, and Waterfall methodologies.
• Experience building Docker images with the GitLab CI build automation runner, and configuring JUnit coverage reports and integration test cases as part of the GitLab Runner build.
• Experience in AWS scripting and operations: EC2, Confidential, S3, Glacier, VPC, and Route 53.
• Experience working with web applications deployed on AWS and configuring services such as Route 53, CloudFront, Elastic Load Balancer, EC2, RDS, and CloudWatch.
• Expertise in writing Bash scripts, Perl scripts (hashes and arrays), and Python programs for deploying Java applications on bare servers or middleware tools.
• Experienced with Docker Swarm, implementing a high-level API to provide containers that run processes in isolation; worked on customized Mesos/Marathon Docker container images and pushed the images to Docker Hub.
• Experienced in deploying database changes to Oracle, MS SQL Server, and MySQL databases.
• Experienced in building and deploying Java applications to different environments such as QA, UAT, and Production.
• Hands-on with OpenShift for container orchestration with Kubernetes, container storage, and automation to enhance container-platform multi-tenancy. Experience with OpenShift and Kubernetes architecture and design, troubleshooting issues, and multi-regional deployment models and patterns for large-scale applications.

Lakshmi S's Current Company Details

Cardinal Health
DevOps Cloud Engineer at Cardinal Health
Dublin, Ohio, United States
Employees: 27,278
Lakshmi S Work Experience Details
  • Cardinal Health
    DevOps Cloud Engineer
    Mar 2021 - Present
    United States
    • Developed build and deployment scripts using Ant and Maven as build tools in Jenkins to move artifacts from one environment to another.
    • Designed and configured the AWS Simple Notification Service (SNS) and Simple Email Service (SES) architecture of the solution, working with the client.
    • Administered SonarQube, granting user-level and group-level permissions on projects, quality profiles, and quality gates.
    • Created a shell script to auto-start the wrapper service when the server is restarted or crashes.
    • Created Azure services using ARM templates (JSON), ensuring no changes to the existing infrastructure during incremental deployments.
    • Set up data in AWS using S3 buckets and configured instance backups to S3.
    • Wrote Ansible playbooks and Chef recipes to automate the build/deployment process, improved remaining manual processes, and managed servers on the AWS cloud platform using configuration management tools.
    • Built servers on AWS: imported volumes, launched EC2 and RDS instances, and created security groups, Auto Scaling groups, and Elastic Load Balancers (ELBs) within the defined virtual private cloud using the Amazon CLI.
    • Used CloudFormation templates and launch configurations to automate repeatable provisioning of AWS resources for applications.
    • Created and updated AWS stacks using AWS CloudFormation, writing the template files in JSON.
    • Integrated Amazon CloudWatch with Auto Scaling launch configurations to monitor log files, store them, and track metrics.
    • Automated cloud deployments using Chef, Python, and AWS CloudFormation templates.
    • Performed troubleshooting and monitoring of Linux servers on AWS using Nagios and Splunk.
    • Integrated GitHub, AWS CodePipeline, Jenkins, and AWS Elastic Beanstalk to create a deployment pipeline.
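The CloudFormation template work described above can be illustrated with a minimal sketch. This is not one of the actual templates; the resource names are hypothetical examples of a JSON template that provisions a versioned S3 backup bucket and an HTTPS-only security group.

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Sketch only: versioned S3 backup bucket plus an HTTPS-only security group (hypothetical resource names).",
  "Resources": {
    "BackupBucket": {
      "Type": "AWS::S3::Bucket",
      "Properties": {
        "VersioningConfiguration": { "Status": "Enabled" }
      }
    },
    "WebSecurityGroup": {
      "Type": "AWS::EC2::SecurityGroup",
      "Properties": {
        "GroupDescription": "Allow inbound HTTPS only",
        "SecurityGroupIngress": [
          { "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443, "CidrIp": "0.0.0.0/0" }
        ]
      }
    }
  },
  "Outputs": {
    "BucketName": { "Value": { "Ref": "BackupBucket" } }
  }
}
```

A template like this would be deployed with `aws cloudformation deploy --template-file template.json --stack-name backup-stack`, giving the repeatable provisioning the bullets describe.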
  • The Depository Trust & Clearing Corporation (DTCC)
    DevOps Cloud Engineer
    Apr 2020 - Feb 2021
    Dallas, Texas, United States
    Responsibilities:
    • Created CI/CD Azure DevOps pipelines to build and deploy to Azure services; configured SonarQube, Aqua Scanner, and Maven to surface security issues, code bugs, and image-scan vulnerabilities.
    • Set up Nginx as an HTTP reverse proxy in front of critical applications such as Jira, Jenkins, and IBM WebSphere Application Server.
    • Designed workflows in Jira to handle issues and maintained all user stories for Agile-style tracking.
    • Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver monitoring, and Cloud Deployment Manager.
    • Automated Datadog dashboards for the stack through Terraform scripts.
    • Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of OpenStack nodes, and tested playbooks on AWS instances using Python.
    • Created Ansible playbooks to provision both Linux and Windows VMs and MS SQL servers in the AWS portal, and used them to change configurations.
    • Helped individual teams set up their repositories in Bitbucket, maintain their code, and set up jobs that make use of the CI/CD environment.
    • Developed build and deployment scripts using Maven in Jenkins to migrate from one environment to another.
    • Wrote Groovy scripts for multibranch pipeline projects in Jenkins, configured per client requirements.
    • Administered and built Kubernetes clusters from scratch in AWS, making them highly available across multiple Availability Zones, and encrypted all Kubernetes components and data at rest with TLS encryption.
    • Wrote automation scripts using Bash, PowerShell, Ruby, JSON, and Python to support infrastructure as code and continuous deployment.
    • Designed and implemented Chef, including internal best practices, cookbooks, and an automated cookbook CI/CD system.
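The Groovy multibranch pipeline scripts mentioned above typically take the shape of a declarative Jenkinsfile. This is a minimal sketch, not the actual client configuration; the tool name `maven-3`, the SonarQube server name `MySonarServer`, and `deploy.sh` are hypothetical placeholders.

```groovy
// Sketch of a declarative Jenkinsfile for a multibranch pipeline.
// Tool/server names ('maven-3', 'MySonarServer') and deploy.sh are hypothetical.
pipeline {
    agent any
    tools { maven 'maven-3' }
    stages {
        stage('Build') {
            steps { sh 'mvn -B clean package' }
        }
        stage('SonarQube Analysis') {
            steps {
                // server connection configured in Jenkins global settings
                withSonarQubeEnv('MySonarServer') {
                    sh 'mvn sonar:sonar'
                }
            }
        }
        stage('Deploy') {
            when { branch 'main' }   // multibranch: deploy only from main
            steps { sh './deploy.sh' }
        }
    }
}
```

The `when { branch ... }` guard is what makes the same Jenkinsfile safe across every branch a multibranch project discovers.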
  • Thrivent
    AWS DevOps Engineer
    Sep 2018 - Mar 2020
    Appleton, Wisconsin, United States
    • Installed and configured web application servers such as Apache Tomcat and JBoss for deploying artifacts, and deployed applications on AWS using Elastic Beanstalk. Responsible for Red Hat Linux installation, configuration, security, backup, and upgrades.
    • Installed, configured, and managed monitoring tools such as Splunk and New Relic for resource, network, and log-trace monitoring.
    • Installed the SonarQube plugin in Jenkins and integrated it with the project's Maven script. Experience with the build management tools Ant and Maven, writing build.xml and pom.xml files. Experience automating Java applications with the Serenity tool, plus working knowledge of Cucumber.
    • Performed integrated delivery (CI and CD) using Jenkins, Nexus, and Yum; performed branching, tagging, and release activities on the version control tools SVN and GitHub; created and edited TeamCity projects and build configurations; managed virtual instances and disks using Puppet.
    • Used Ansible to manage web applications, config files, databases, commands, users, mount points, and packages. Implemented Ansible to manage all existing servers and automate the build/configuration of new servers.
    • Monitored system activity such as CPU, memory, disk, and swap space usage to avoid performance issues.
    • Configured Ansible to manage AWS environments and automate the build process for core AMIs (Amazon Machine Images) used by all application deployments, including Auto Scaling and CloudFormation scripts. Implemented multiple parallel builds through a build farm.
    • Wrote Ansible playbooks with Python SSH as a wrapper to manage configurations, and tested the playbooks on AWS instances using Python. Used Ansible automation for deployments on traditional VMware, AWS, and Azure environments.
    • Used Ansible playbooks to automate Kubernetes clusters.
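The kind of Ansible server management described above can be sketched as a minimal playbook. The host group `webservers`, the file paths, and the choice of `httpd` are hypothetical examples, not taken from the actual environment.

```yaml
# Sketch of an Ansible playbook: install a web server, push a config file,
# and restart on change. Host group, paths, and package are hypothetical.
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install Apache httpd
      ansible.builtin.yum:
        name: httpd
        state: present

    - name: Deploy application config
      ansible.builtin.copy:
        src: files/app.conf            # hypothetical local file
        dest: /etc/httpd/conf.d/app.conf
        mode: "0644"
      notify: Restart httpd

  handlers:
    - name: Restart httpd
      ansible.builtin.service:
        name: httpd
        state: restarted
```

Run with `ansible-playbook -i inventory site.yml`; the handler pattern is what keeps repeated runs idempotent, restarting the service only when the config actually changes.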
  • Paycom
    DevOps Administrator
    Jul 2017 - Aug 2018
    Oklahoma, United States
    • Performed day-to-day admin activities such as creating and managing builds and deployment activities, and improving the release process.
    • Responsible for daily builds and deployments from Development through the various testing environments.
    • Migrated the core repository from Subversion (SVN) to Git, individually migrating the entire code base while ensuring the health of the builds was not affected. Performed high-level merges of branches and code bases.
    • Configured and maintained Jenkins to implement the CI process and integrated it with Maven to schedule builds.
    • Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef; designed cloud-hosted solutions with specific AWS product-suite experience.
    • Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
    • Performed automated deployments on AWS by creating IAM roles, and used the CodePipeline plugin to integrate Jenkins with AWS and create the EC2 instances that provide the virtual servers.
    • Worked on many Chef concepts such as roles, environments, data bags, Knife, and Chef server admin/organizations.
    • Wrote Chef recipes to automate the build/deployment process and used data bags in Chef for better environment management.
    • Maintained multiple modules in Puppet by converting production support scripts to Puppet manifests to automate server configurations.
    • Designed and implemented the CI (continuous integration) system: configured Jenkins servers and nodes, created required scripts (Perl and Python), created and configured VMs (Windows/Linux), and wrote Ant and Maven pom.xml build scripts.
    • Experience across the total software development life cycle (SDLC) with development models such as Agile Scrum, prototype demonstration, and Waterfall, tracked in JIRA.
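A Chef recipe with a data bag, as described above, might look like the following minimal sketch. The data bag name, artifact URL key, and `tomcat` package are hypothetical examples rather than the actual cookbooks.

```ruby
# Sketch of a Chef recipe: deploy a WAR to Tomcat, reading artifact details
# from a data bag. The data bag 'deploy/app', paths, and package name are
# hypothetical examples.
app = data_bag_item('deploy', 'app')   # e.g. { "version": "1.2", "artifact_url": "..." }

package 'tomcat'

directory '/opt/app/releases' do
  owner 'tomcat'
  group 'tomcat'
  mode '0755'
  recursive true
end

remote_file "/opt/app/releases/app-#{app['version']}.war" do
  source app['artifact_url']
  notifies :restart, 'service[tomcat]', :delayed
end

service 'tomcat' do
  action [:enable, :start]
end
```

Keeping the version and URL in a data bag is what lets the same recipe promote builds through QA, UAT, and Production without editing cookbook code.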
  • Marsh Insurance
    System Engineer
    Apr 2016 - Jun 2017
    Phoenix, Arizona, United States
    • Responsible for installation, configuration, Linux upgrades, troubleshooting, and maintenance of RHEL 6/7, and systems firmware upgrades on Cisco UCS and HP blade servers.
    • Installed and configured Jumpstart and Kickstart for Solaris and Red Hat Linux servers, respectively. Installation, integration, tuning, and troubleshooting of various application servers: Apache, Tomcat, WebSphere, and WebLogic.
    • Installation, configuration, backup, recovery, maintenance, and support of Red Hat Linux and Solaris. Managed user accounts and used shell scripting (Bash) to automate administration tasks. Responsible for installing, upgrading, and managing packages via RPM and YUM.
    • Logged events from forced crash dumps and troubleshot server issues. Configured a Yum repository server for installing packages from a centralized server. Installed FUSE to mount the keys on every Debian production server for passwordless authentication.
    • Installed and configured a DHCP server to give IP leases to production servers. Applied a clustering topology meeting high-availability and failover requirements for performance and functionality. Configured the Sendmail utility on Linux servers.
    • Installation, configuration, and administration of DNS, LDAP, NFS, NIS, and Sendmail on Red Hat Linux/Debian servers.
    • Troubleshot day-to-day issues with various servers on different platforms. Created Bash shell scripts to automate backups and routine tasks.
    • Monitored systems, tuned kernel parameters, and added/removed users, disks, and hosts on the DNS domain.
    • Fine-tuned servers and configured networks for optimum performance. Assisted the development team and reviewed required maintenance tasks. Responsible for connectivity issues among various servers and software components.
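The backup automation mentioned above can be sketched as a small Bash function. The dated naming scheme and the 14-day retention are arbitrary example choices, not the actual scripts.

```shell
#!/usr/bin/env bash
# Sketch of a backup helper: tar+gzip a source directory into a dated
# archive under a destination directory, then prune archives older than
# 14 days. Naming scheme and retention period are hypothetical examples.
nightly_backup() {
    local src="$1" dest="$2" stamp
    stamp="$(date +%Y%m%d)"
    mkdir -p "$dest"
    # -C makes archive paths relative to the source's parent directory
    tar -czf "$dest/backup-${stamp}.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
    # prune archives older than the retention window
    find "$dest" -name 'backup-*.tar.gz' -mtime +14 -delete
}
```

Dropped into cron (e.g. `0 2 * * * nightly_backup /etc /var/backups`), a function like this covers both the backup and the routine cleanup the bullets describe.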

Lakshmi S Education Details
Jntuh College Of Engineering Hyderabad

Frequently Asked Questions about Lakshmi S

What company does Lakshmi S work for?

Lakshmi S works for Cardinal Health.

What is Lakshmi S's role at the current company?

Lakshmi S's current role is DevOps Cloud Engineer at Cardinal Health.

What schools did Lakshmi S attend?

Lakshmi S attended Jntuh College Of Engineering Hyderabad.

Who are Lakshmi S's colleagues?

Lakshmi S's colleagues are Dane Nasenbenny, Jasmine Paw, Jacqueline Collier, Clay Foody, Hailey Park, Shawna Seaton, Janet Fagan.
