K Nirmala

K Nirmala Email and Phone Number

Big Data Engineer @ Levi Strauss & Co (Levi’s)
K Nirmala's Location
Greater Los Angeles, California, United States
About K Nirmala

K Nirmala is a Big Data Engineer at Levi Strauss & Co (Levi’s).

K Nirmala's Current Company Details
Levi Strauss & Co (Levi’s)

Big Data Engineer
K Nirmala Work Experience Details
  • Levi Strauss & Co (Levi’s)
    Big Data Engineer
    Levi Strauss & Co (Levi’s) Sep 2020 - Present
    o Worked on building a centralized data lake on AWS Cloud using core services such as S3, EMR, Redshift, and Athena.
    o Configured data loads from S3 into Redshift using AWS Data Pipeline.
    o Extracted, transformed, and loaded data across heterogeneous sources and destinations using AWS Redshift.
    o Migrated data from on-premises storage to AWS S3 buckets.
    o Developed a Python script to call REST APIs and extract data to AWS S3.
    o Created tables and stored procedures and extracted data using T-SQL for business users as required.
    o Performed data analysis and design; created and maintained large, complex logical and physical data models and metadata repositories using ERWIN and MB MDR.
    o Selected and generated data into CSV files, stored them in AWS S3 using AWS EC2, then structured and stored the data in AWS Redshift.
    o Utilized the Spark SQL API in PySpark to extract and load data and run SQL queries.
    o Developed a PySpark script that pseudonymizes raw data by applying hashing algorithms to client-specified columns.
    o Used PySpark and Pandas to calculate the moving average and RSI score of stocks and loaded the results into a data warehouse.
    o Explored Spark to improve the performance and optimization of existing Hadoop algorithms using Spark Context, Spark SQL, PostgreSQL, DataFrames, OpenShift, Talend, and pair RDDs.
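The profile mentions a PySpark script that applies hashing algorithms to client-specified columns. A minimal sketch of the same idea, using only the standard library on plain row dicts (the column names `email` and `phone` are hypothetical; the actual client-specified columns are not given):

```python
import hashlib

# Hypothetical columns to pseudonymize; the real list would come from the client.
SENSITIVE_COLUMNS = {"email", "phone"}

def hash_columns(row: dict, columns: set = SENSITIVE_COLUMNS) -> dict:
    """Return a copy of the row with the named columns replaced by their
    SHA-256 hex digests (one-way hashing, not reversible encryption)."""
    out = dict(row)
    for col in columns:
        if col in out and out[col] is not None:
            out[col] = hashlib.sha256(str(out[col]).encode("utf-8")).hexdigest()
    return out

row = {"id": 1, "email": "user@example.com", "phone": "555-0100"}
hashed = hash_columns(row)
```

In PySpark the same transformation would typically be expressed with the built-in `sha2` column function rather than per-row Python, so executors never ship plaintext values back to the driver.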
  • Goma Net Technologies
    Big Data Engineer
    Goma Net Technologies Jun 2018 - Dec 2019
    o Worked with the Azure cloud platform (HDInsight, Databricks, Data Lake, Blob Storage, Data Factory, and Storage Explorer).
    o Analyzed data from Azure data stores using Databricks, deriving insights with Spark cluster capabilities.
    o Developed ETL pipelines in Spark using Python and workflows using Airflow.
    o Involved in development of data ingestion, aggregation, integration, and advanced analytics using Snowflake and Azure Data Factory.
    o Developed several Spark/Scala scripts for data extraction from different sources, providing data insights and reports as needed.
    o Wrote Spark SQL scripts to run over imported data and existing RDDs, and applied Spark practices such as partitioning, caching, and checkpointing.
    o Developed code from scratch in Scala according to the technical requirements.
    o Loaded all data into Hive from source CSV files using Spark.
    o Extracted real-time data using Spark Streaming and Kafka, converted it to RDDs, processed it into DataFrames, and loaded the data into HBase.
    o Used Spark SQL to read Parquet data and loaded Hive tables into Spark using Scala.
    o Developed Spark jobs using Scala and Spark SQL for faster testing and processing of data.
    o Derived structured data from unstructured data using Spark.
    o Worked on data collection, data processing, and exploration in Scala.
    o Converted JSON data into Pandas DataFrames and stored it in Hive tables.
    o Used Azure DevOps for CI/CD and production deployment.
    o Actively monitored and resolved pipeline issues and designed dashboards in Power BI per team requirements.
    Environment: Hadoop, Hive, HBase, Azure, Spark, Python, Airflow, Snowflake, PySpark, SQL, Databricks, ETL.
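Several bullets above describe ingesting JSON records and aggregating them. A minimal standard-library sketch of that shape of pipeline, with a hypothetical newline-delimited JSON feed (field names `store`, `sku`, `qty` are invented for illustration; at scale this would be a Spark or Databricks job):

```python
import json
from collections import defaultdict

# Hypothetical raw feed: newline-delimited JSON, as a streaming source
# (e.g. Kafka) might deliver it.
raw = """
{"store": "LA", "sku": "501", "qty": 3}
{"store": "LA", "sku": "511", "qty": 1}
{"store": "NY", "sku": "501", "qty": 2}
""".strip()

def aggregate_qty_by_store(ndjson: str) -> dict:
    """Parse NDJSON records and sum qty per store -- the same shape of
    aggregation the Spark pipelines above would perform at scale."""
    totals = defaultdict(int)
    for line in ndjson.splitlines():
        rec = json.loads(line)
        totals[rec["store"]] += rec["qty"]
    return dict(totals)

totals = aggregate_qty_by_store(raw)  # {"LA": 4, "NY": 2}
```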
  • General Use Tech Solutions
    Big Data Developer
    General Use Tech Solutions May 2016 - Apr 2018
    o Designed and built end-to-end data warehouse infrastructure from the ground up on Redshift for large-scale data handling.
    o Implemented and managed ETL solutions and automated operational processes.
    o Optimized and tuned the Redshift environment, enabling queries to run up to 100x faster for Tableau and SAS Visual Analytics.
    o Migrated the on-premises database structure to the Redshift data warehouse.
    o Responsible for ETL and data validation using SQL Server Integration Services.
    o Defined and deployed monitoring, metrics, and logging systems on AWS.
    o Implemented Workload Management (WLM) in Redshift to prioritize basic dashboard queries over more complex, longer-running ad hoc queries; this enabled a more reliable and faster reporting interface with sub-second response for basic queries.
    o Published interactive data visualization dashboards, reports, and workbooks on Tableau and SAS Visual Analytics.
    o Responsible for logical and physical data modeling on Redshift.
    o Designed and developed ETL jobs to extract data from a Salesforce replica and load it into a data mart in Redshift.
    Environment: Redshift, Airflow, AWS Data Pipeline, S3, Python, SQL Server Integration Services, SQL Server, AWS Data Migration Services, SAS Visual Analytics, and Tableau.
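One of the bullets above is ETL data validation after loads. A minimal sketch of a post-load check (hypothetical `validate_load` helper and sample rows, not the actual SSIS validation used on the project): it verifies row counts and that no source key is missing from the target.

```python
def validate_load(source_rows: list, target_rows: list, key: str = "id") -> dict:
    """Basic post-load validation: row counts match and every source key
    made it into the target -- a simplified version of the checks an ETL
    run would perform after loading a warehouse table."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_keys": sorted(src_keys - tgt_keys),
    }

# Simulate a load that dropped one row.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
report = validate_load(source, target)  # count_match False, missing_keys [2]
```

In practice the two sides would be `COUNT(*)`/key queries against the source system and the Redshift table rather than in-memory lists.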
  • Tata Consultancy Services
    Systems Engineer/Software Engineer
    Tata Consultancy Services May 2014 - Apr 2016
Mumbai, Maharashtra, India
    o Involved in daily SCRUM meetings and weekly sprint meetings.
    o Implemented and developed UI components using AngularJS features such as dependency injection, models, data binding, and controllers.
    o Developed interactive web pages using HTML5, CSS3, JavaScript, and Bootstrap.
    o Handled Java multithreading in the backend component, with one thread running per user.
    o Database development included creating new tables, PL/SQL stored procedures, views, constraints, and triggers, plus SQL tuning to reduce application response time.
    o Extensively used Hibernate 4.2 concepts such as inheritance, lazy loading, dirty checking, and transactions.
    o Implemented all functionality using Spring IO/Spring Boot, Thymeleaf, and Hibernate ORM.
    o Exposed microservices as RESTful APIs using Spring Boot with Spring MVC.
    o Java developer on the back-end and front-end teams; took part in developing, maintaining, reviewing, and supporting quality code and services.
    o Implemented REST-based web services using JAX-RS annotations and Jersey providers, consumed via HTTP services from AngularJS modules.
    o Developed an application to access JSON and XML from RESTful web services using AngularJS.
    o Integrated automated batch jobs with the Jenkins continuous integration tool.
    o Used SVN for version management and Jira for defect management.
    o Used JUnit for unit testing of the application and Log4j for logging.
    Environment: EJB 3.0, Spring, Jasper Reports, IBM MQ, XML, REST, JDBC, JavaScript, XSLT, UML, HTML, JNDI, Python, SaaS, Apache Camel, Jackson, Spring Boot, JBoss 6.x, Log4j, ANT, Swagger, Mockito.
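The backend above is described as running one thread per user. A minimal sketch of that pattern, shown here in Python's `threading` module rather than the project's Java (the per-user work is a hypothetical stand-in; a shared lock guards the result map, as the real backend would need):

```python
import threading

results: dict = {}
lock = threading.Lock()

def handle_user(user_id: int) -> None:
    """Per-user work unit -- stand-in for the backend component the
    profile describes running with one thread per user."""
    outcome = f"processed-{user_id}"
    with lock:  # protect the shared result map from concurrent writes
        results[user_id] = outcome

# Spawn one thread per user, then wait for all of them to finish.
threads = [threading.Thread(target=handle_user, args=(uid,)) for uid in (1, 2, 3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# results -> {1: "processed-1", 2: "processed-2", 3: "processed-3"}
```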
  • Tata Consultancy Services
    Assistant Systems Engineer
    Tata Consultancy Services Jul 2012 - Apr 2014
Mumbai, Maharashtra, India
    Responsibilities: Development and maintenance of the American Express Servicing Portal for AMEX, with hands-on experience in all phases of the software project life cycle.
    ● Worked on development and deployment of web-based UI and related business logic using .NET Framework 3.5 (C#.NET, ASP.NET), along with code review and defect tracking; used JavaScript and jQuery for the UI.
    ● Implemented MVC best practices with lean controllers, a service layer, and view models.
    ● Worked on relational database management systems, writing complex SQL queries and stored procedures.
    ● Trained associates and conducted/attended knowledge-sharing sessions.
    ● Participated in design discussions and requirement-gathering sessions.
    ● Worked closely with management to deliver requirements without missing functionality or deadlines.
    Environment: .NET C#, VB.NET, ASP.NET, JavaScript, jQuery, SQL, SharePoint, VCS, Git, Docker, Kubernetes.
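The entry above cites MVC best practices with lean controllers and a service layer. A language-neutral sketch of that separation (class and account names are invented for illustration; the original was ASP.NET MVC): the service owns the business logic, while the controller only validates, delegates, and shapes the response.

```python
class AccountService:
    """Service layer: owns the business logic and data access."""
    def __init__(self, accounts: dict):
        self._accounts = accounts

    def balance(self, account_id: str) -> float:
        if account_id not in self._accounts:
            raise KeyError(account_id)
        return self._accounts[account_id]

class AccountController:
    """Lean controller: no business logic -- delegate and shape the response."""
    def __init__(self, service: AccountService):
        self._service = service

    def get_balance(self, account_id: str) -> dict:
        try:
            return {"status": 200, "balance": self._service.balance(account_id)}
        except KeyError:
            return {"status": 404, "error": "account not found"}

controller = AccountController(AccountService({"A-1": 125.50}))
ok = controller.get_balance("A-1")       # {"status": 200, "balance": 125.5}
missing = controller.get_balance("A-2")  # {"status": 404, ...}
```

Keeping the controller thin this way lets the service layer be unit-tested without any web framework in the loop.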

Frequently Asked Questions about K Nirmala

What company does K Nirmala work for?

K Nirmala works for Levi Strauss & Co (Levi’s).

What is K Nirmala's role at the current company?

K Nirmala's current role is Big Data Engineer.
