Deepak K


AWS Big Data Administrator at FEPOC
Deepak K's Location
Nashville Metropolitan Area, United States
About Deepak K

Deepak K is an AWS Big Data Administrator at FEPOC.

Deepak K's Current Company Details
FEPOC

AWS Big Data Administrator at FEPOC
Deepak K Work Experience Details
  • FEPOC
    AWS Big Data Administrator
    Jul 2020 - Present
    Virginia, United States
    o Installed and configured Hadoop distributions including CDH 6.3.3 and CDP 7.1.8.
    o Worked on AWS CDP Public Cloud Data Hub clusters, Data Lake, the Management Console, and user access.
    o Deployed infrastructure on AWS using EC2 (Virtual Servers in the Cloud), VPC, managed network and security, IAM, and S3.
    o Administered the Hadoop cluster environment, including scaling servers up and down based on project usage.
    o Worked on CDP Public Cloud migration from on-prem to AWS, covering Hive and Kudu data migrations, application connectivity to HBase and Solr, and NiFi-to-DB2 MQ connectivity.
    o Exposure to Kerberos authentication and authorization leveraging Active Directory, and to the corresponding integration techniques for securing data, users, and processes on Big Data platforms.
    o Set up Ranger policies for HBase, Solr, Kafka, HDFS, and S3.
    o Configured NiFi to connect to DB2 MQ, pull data, and load it into Kafka, HBase, and Solr services based on project needs.
    o Experienced with Solr, NiFi, and HBase service configuration and with data loads that populate Solr collections.
    o Worked with HBase and Solr consumer applications to set up secure connectivity using LDAP and Kerberos.
    o Provided 24x7 support for the production environment and diligently resolved performance and user-access issues.
    o Implemented shell scripts for data cleansing, data loading, service monitoring, updating table statistics, HDFS backups, etc. Also scheduled a backup-reporting model that notifies the team of progress.
    o Implemented automated scripts that boost performance by updating statistics on database tables, rebalancing, etc.
    o Implemented a CyberArk password-rotation script that generates a Kerberos keytab and copies it to remote servers using the CDP CLI.
    o Installed and configured Datadog for application monitoring, covering performance, I/O, etc.
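The keytab-rotation task described above usually follows a fetch/generate/distribute pattern. The sketch below is illustrative only: the principal name, realm, and host list are placeholder assumptions, and the real CyberArk, ktutil, and scp/CDP CLI calls are stubbed with comments and echoes so the script is self-contained.

```shell
#!/bin/sh
# Sketch of a Kerberos keytab rotation flow (assumptions: principal, realm,
# and hosts are placeholders; CyberArk/ktutil/scp steps are stubbed).

# Build a conventional keytab file name for a service principal and realm.
keytab_name() {
    principal="$1"; realm="$2"
    echo "${principal}.${realm}.keytab"
}

rotate_keytab() {
    principal="$1"; realm="$2"; shift 2
    kt=$(keytab_name "$principal" "$realm")
    # A real run would fetch the rotated password from CyberArk, then write
    # the keytab, e.g.:
    #   ktutil: addent -password -p "${principal}@${realm}" -k 1 -e aes256-cts
    # and distribute it to each host over a secure channel.
    for host in "$@"; do
        echo "would copy $kt to $host"
    done
}

rotate_keytab svc-hbase EXAMPLE.COM edge01 edge02
```

In practice the host list would come from inventory and the copy step would use the CDP CLI or scp, but the naming and fan-out logic stays the same shape.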
  • Delta Air Lines
    Big Data DevOps Engineer
    Jul 2018 - Jun 2020
    Atlanta, Georgia, United States
    o Installed and configured Hadoop distributions including CDH 5.16 and 6.1.1.
    o Performed Hadoop cluster tasks such as adding and removing nodes without affecting running jobs or data.
    o Experienced in designing and deploying Hadoop clusters and Big Data analytics tools, including Hive, Oozie, Sqoop, Kafka, Spark, Impala, and NiFi, on the Cloudera distribution.
    o Created Hive databases and granted user permissions via Sentry roles.
    o Created HDFS encryption zones and added encryption keys to KMS ACLs for security.
    o Set up a near-real-time streaming pipeline with Flume, connecting it to DB2 MQ to pull data and load it into a Kafka topic for further processing.
    o Hands-on experience setting up streaming pipelines with Flume, processing the data with Spark, and loading it into Kudu tables.
    o Granted users access to edge nodes, Cloudera Manager, Hive databases, etc. by assigning the appropriate AD groups.
    o Implemented automated shell scripts that boost performance by updating statistics on database tables, rebalancing, etc.
    o Integrated Pega with Big Data servers to read/write messages on Kafka topics.
    o Installed and configured AppDynamics to monitor network interruptions, performance, CPU utilization, and database metrics.
    o Created and launched instances on AWS Elastic Compute Cloud (EC2).
    o Automated NiFi flow deployments with Jenkins across various environments; experienced with Jenkins configuration, profile creation, and environment and workspace code configuration.
    o Worked on a continuous-deployment pipeline with Git, Jenkins, and Ansible across geographically separated hosting zones in AWS.
    o Collected Jenkins data in Splunk and created dashboards for failed deployments, error counts, states, and reasons.
    o Provided configuration-management, build, and deployment support for multiple applications.
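The automated stats-update scripts mentioned above typically just generate and submit one statement per table. A minimal sketch, assuming a hard-coded table list and a hypothetical `sales` database; a real run would discover tables with `SHOW TABLES` and pipe the output into `impala-shell` or `beeline`, which is stubbed here so the script stands alone:

```shell
#!/bin/sh
# Sketch: emit a COMPUTE STATS statement for each table in a database.
# Assumptions: database and table names are illustrative; execution
# against the cluster is left as a comment.

compute_stats_sql() {
    db="$1"; shift
    for tbl in "$@"; do
        echo "COMPUTE STATS ${db}.${tbl};"
    done
}

# A real run would feed the generated SQL to the cluster, e.g.:
#   compute_stats_sql sales orders customers | impala-shell -f -
compute_stats_sql sales orders customers
```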
  • Vanguard
    Big Data Developer
    Jul 2017 - Jan 2018
    Charlotte, North Carolina, United States
    o Designed and implemented Hive queries and functions for evaluating, filtering, loading, and storing data.
    o Created Hive tables and loaded data using Sqoop.
    o Supported QA environment setup and configuration updates for Pig and Sqoop scripts; tuned the performance of Pig queries.
    o Used the Spark API over Hadoop YARN as the execution engine for data analytics with Hive.
    o Exported the analyzed data to relational databases using Sqoop so the BI team could visualize it and generate reports.
    o Developed ETL processes using Spark, Scala, and Hive.
    o Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
    o Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig.
    o Exposure to Big Data exploration, profiling, quality, and transformation; proficient in designing efficient, robust ETL/ELT workflows, schedulers, and event-based triggers.
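A Sqoop export of the kind described above is usually assembled from a JDBC URL, a target table, and an HDFS directory. The sketch below only echoes the command rather than running it, and the connection string, table, and path are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch: assemble a sqoop export command line from its parts.
# Assumptions: the JDBC URL, table, and export dir below are examples;
# the command is printed, not executed.

sqoop_export_cmd() {
    jdbc_url="$1"; table="$2"; export_dir="$3"
    echo "sqoop export --connect $jdbc_url --table $table --export-dir $export_dir --update-mode allowinsert"
}

sqoop_export_cmd jdbc:mysql://db.example.com/reports daily_summary /user/etl/output/daily
```

Wrapping the command in a function like this makes it easy to loop over many tables in a scheduled script, which is how such exports are commonly batched.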
  • Ch Excellency Private Limited
    Linux Software Engineer
    Aug 2014 - Nov 2015
    Hyderabad, Telangana, India
    o Installed servers with the operating system and necessary software packages.
    o Applied patches and updates to the operating system and applications to ensure security and stability.
    o Used the Nagios monitoring tool to track system availability, performance, and resource usage.
    o Addressed identified issues related to CPU, memory, disk I/O, and network usage.
    o Managed user permissions and access controls using SSSD and Active Directory.
    o Provided support to end users for system access and application issues.
    o Experienced in shell scripting to automate repetitive tasks and streamline system administration.
    o Worked with other departments to integrate Linux systems with other technologies and applications.
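A Nagios-style disk check like the monitoring work above often reduces to a threshold comparison on `df` output. The sketch below uses an assumed 90% default threshold and a hard-coded usage value so it runs standalone; the `df`/`awk` line that would supply the real number is left as a comment:

```shell
#!/bin/sh
# Sketch: classify disk usage against a threshold, Nagios-plugin style.
# Assumption: 90% is an example default threshold, not a site standard.

check_disk_usage() {
    used_pct="$1"; threshold="${2:-90}"
    if [ "$used_pct" -ge "$threshold" ]; then
        echo "CRITICAL: disk ${used_pct}% used"
        return 2   # Nagios convention: 2 = critical
    fi
    echo "OK: disk ${used_pct}% used"
}

# A real plugin would read the percentage from df, e.g.:
#   df -P / | awk 'NR==2 {gsub("%","",$5); print $5}'
check_disk_usage 95
check_disk_usage 72
```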

Frequently Asked Questions about Deepak K

What company does Deepak K work for?

Deepak K works for FEPOC.

What is Deepak K's role at the current company?

Deepak K's current role is AWS Big Data Administrator at FEPOC.
