Durga Devi
• Around 10 years of IT experience on Azure Cloud, administering and designing complex applications with emphasis on design and development of data solutions, business intelligence reporting, ETL development, testing, and documentation; developed data pipelines using PySpark and Databricks.
• Working experience building pipelines in ADF using Linked Services, Datasets, and Pipelines to extract and load data from sources such as Azure SQL, ADLS, Blob Storage, and Azure SQL Data Warehouse.
• Experience migrating on-premises ETL processes to the cloud.
• Working experience with the Databricks environment in Azure.
• Configured daily, weekly, and monthly jobs in Databricks (job orchestration).
• Worked with Azure Data Factory control flow activities such as ForEach, Lookup, Until, Web, Wait, and If Condition.
• Experience writing stored procedures and pulling data through joins in databases.
• Developed ETL pipelines in and out of the data warehouse using a combination of ADF and Snowflake's SnowSQL.
• Good experience in UNIX shell scripting for automating file loading and job scheduling tasks.
• Good knowledge of data marts, OLAP, and dimensional data modeling with the Ralph Kimball methodology (star schema and snowflake modeling for fact and dimension tables) using Analysis Services.
• Experienced with file formats such as Parquet, ORC, CSV, text, fixed-width, XML, JSON, and Avro. Detailed exposure to Azure tools such as Azure Data Lake, Azure Data Factory, HDInsight, Azure SQL Server, and Azure DevOps.
• Expert in data visualization using Power BI to create complex and innovative dashboards.
• Good working knowledge of project management activities in Agile/Rally.
• Expertise in all phases of the project life cycle (design, analysis, implementation, and testing).
• Excellent communication, teamwork, and relationship management skills.
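The Kimball-style star schema mentioned above pairs fact tables with dimension lookups. A toy Python sketch of the surrogate-key lookup step (all table contents and names here are hypothetical, not taken from any actual project):

```python
def resolve_dimension_keys(fact_rows, dim_lookup, natural_key, surrogate_key):
    # Replace each fact row's natural key with the dimension's surrogate key,
    # the standard key-lookup step when loading a star-schema fact table.
    resolved = []
    for row in fact_rows:
        sk = dim_lookup.get(row[natural_key])
        if sk is None:
            continue  # in a real load, unmatched rows go to an error output
        resolved.append({**row, surrogate_key: sk})
    return resolved
```

In a real warehouse load the lookup dictionary would come from the dimension table itself; the point is that facts store surrogate keys, never natural keys.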
-
Data Engineer, Pgbc
Jun 2023 - Present, Washington DC-Baltimore Area
• Support the UI/front end of the application: wrote stored procedures to save files uploaded from the front end into the database, and created PL/SQL stored procedures to return data requested by the UI in JSON format.
• Created SSIS packages to read data from the legacy SQL Server 2013 system into Azure SQL Database, handling transformations to reconcile datatypes between the legacy and Azure databases.
• Worked with JSON data, flat files, and table data to create stored procedures that persist data to the database; saved data into tables using SCD1 and SCD2 logic per requirements.
• Created PL/SQL procedures to handle table truncation, deletion of specific records, and rolling selected data forward to the next year.
• Created indexes on tables, built key relationships between tables, and developed database diagrams using MS Visio.
• Created dynamic procedures used by Azure Functions to trigger ADF pipelines that save data into ADLS.
• Worked as a Power BI developer, building visuals such as line charts, area charts, tables, and matrixes in Power BI Desktop.
• Created SQL views in Azure Databricks that Power BI consumes as source datasets.
• Created measures and derived columns using DAX and M queries so the visuals stay dynamic as the source dataset advances to the next year.
-
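The SCD1/SCD2 loading described in the Pgbc role above can be sketched in plain Python. This is a hedged illustration of the Type-2 versioning rules only, not the actual PL/SQL procedures; all column names and the fixed load date are hypothetical:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key="customer_id", tracked=("city",)):
    # Index the current (open-ended) version of each business key.
    current = {r[key]: r for r in dim_rows if r["end_date"] is None}
    load_date = date(2024, 1, 1)  # hypothetical load date, fixed for clarity
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:
            # New business key: insert as the current version.
            dim_rows.append({**rec, "start_date": load_date, "end_date": None})
        elif any(cur[c] != rec[c] for c in tracked):
            # A tracked attribute changed: expire the old row, add a new version.
            cur["end_date"] = load_date
            dim_rows.append({**rec, "start_date": load_date, "end_date": None})
        # Unchanged rows are left alone; an SCD1 attribute would instead
        # be overwritten in place on the current row.
    return dim_rows
```

History is preserved because the old version keeps its start/end dates while exactly one open-ended row per key stays current.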
Data Engineer, Kasmo
Nov 2022 - May 2023, Plano, Texas, United States
• Attended scrum calls for all project activities and requirement discussions.
• Analyzed and organized raw data from SharePoint into ADLS Gen2.
• Developed, trained, and maintained datasets.
• Designed table scripts for the warehouse using Delta tables in Databricks.
• Built data systems and pipelines using ADF and PySpark.
• Conducted complex data analysis and reported on results.
• Designed Python code to cleanse and standardize data.
• Prepared queries for data transformation using SQL.
• Created jobs for daily and weekly incremental data loads in Databricks and designed Databricks workflows to orchestrate them.
-
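The daily and weekly incremental loads in the Kasmo role above typically filter on a high-water mark. A minimal pure-Python sketch of that pattern, with hypothetical column names (the real jobs ran in Databricks, not plain Python):

```python
def incremental_load(source_rows, target_rows, watermark_col="updated_at"):
    # Find the target's current high-water mark (None if the target is empty).
    high_water = max((r[watermark_col] for r in target_rows), default=None)
    # Keep only source rows strictly newer than the mark.
    fresh = [r for r in source_rows
             if high_water is None or r[watermark_col] > high_water]
    target_rows.extend(fresh)
    return len(fresh)
```

Using a strict greater-than comparison avoids reloading the row that set the previous watermark, at the cost of assuming the watermark column is monotonic.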
Data Engineer, Hastings Mutual Insurance Company
Jan 2021 - Oct 2022, Michigan, United States
• Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
• Understood business use cases and integrations; wrote business and technical requirements documents, logic diagrams, process flow charts, and other application-related documents.
• Built a common SFTP download/upload framework using Azure Data Factory and Snowflake.
• Involved in logical modeling, physical database design, data sourcing and transformation, data loading, SQL, and performance tuning.
• Responsible for monitoring and troubleshooting the daily and weekly Spark Databricks jobs.
• Worked on Azure cloud services and Snowflake development.
• Bulk-loaded data into Snowflake from the external stage (Azure Blob Storage) and the internal stage (via SnowSQL) using the COPY command.
• Created external tables and loaded data into them from an external stage using Snowsight; used the FLATTEN table function to produce lateral views of VARIANT, OBJECT, and ARRAY columns.
• Used COPY, LIST, PUT, and GET commands to validate files in the stage.
• Loaded tables from Azure Data Lake to Azure Blob Storage to push them to Snowflake.
• Developed ETL pipelines in and out of the data warehouse, and built major regulatory and financial reports using advanced SQL queries in Snowflake.
• Built ELT workflows using Python and Snowflake COPY utilities to load data into Snowflake.
• Used import and export between the internal stage (Snowflake) and the external stage (ADLS).
• Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
• Developed data dashboards using Power BI.
• Experienced with Snowflake utilities such as SnowSQL and Snowpipe, and with monitoring the Snowflake computing platform.
-
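The external-stage bulk loads in the Hastings Mutual role above revolve around Snowflake's COPY INTO command. The helper below only assembles such a statement as a string, to show its shape; the stage and table names are hypothetical, and a real job would execute the statement through a Snowflake connector:

```python
def build_copy_statement(table, stage, file_format="PARQUET", pattern=None):
    # COPY INTO <table> FROM @<stage> loads staged files into a table.
    parts = [
        f"COPY INTO {table}",
        f"FROM @{stage}",
        f"FILE_FORMAT = (TYPE = {file_format})",
    ]
    if pattern:
        # Optional regex filter over staged file names.
        parts.append(f"PATTERN = '{pattern}'")
    return "\n".join(parts) + ";"
```

The same statement shape covers both internal stages (files uploaded with PUT) and external stages backed by Azure Blob Storage.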
Data Engineer, Ameren
Nov 2018 - Dec 2020, Missouri, United States
• Migrated on-premises data to Azure Data Lake.
• Created pipelines in ADF using Linked Services, Datasets, and Pipelines to extract, transform, and load data from sources such as Azure SQL, Blob Storage, Azure SQL DW, and Azure Data Lake.
• Orchestrated all data pipelines using Azure Data Factory and built a custom alerts platform for monitoring.
• Implemented Data Factory pipelines for daily full and incremental loads.
• Worked with Azure Data Factory control flow activities such as ForEach, Lookup, Until, Web, Wait, and If Condition.
• Deployed code to multiple environments through a CI/CD process, resolved code defects during SIT and UAT testing, and supported data loads for testing.
• Implemented Stored Procedure activities to update metadata in tables per requirements.
• Configured and implemented Azure Data Factory triggers, scheduled the pipelines, monitored the scheduled pipelines, and configured alerts to notify on pipeline failures.
• Experienced in Azure Data Factory and in preparing CI/CD scripts and DevOps deployments.
• Wrote complex SQL queries, joins, procedures, functions, and packages to validate data and testing processes.
-
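The ADF control-flow pattern used in the Ameren role above (ForEach over sources, If Condition gating, Until-style retry) can be mimicked in plain Python to make the orchestration logic concrete. The activity names are ADF's; everything else is a hypothetical sketch, not an ADF API:

```python
def run_pipeline(sources, copy_activity, max_attempts=3):
    # ForEach: iterate over the configured source datasets.
    results = {}
    for src in sources:
        # If Condition: skip sources flagged as disabled.
        if not src.get("enabled", True):
            results[src["name"]] = "skipped"
            continue
        # Until: retry the copy activity up to max_attempts times.
        attempts = 0
        while attempts < max_attempts:
            attempts += 1
            try:
                copy_activity(src)
                results[src["name"]] = "succeeded"
                break
            except RuntimeError:
                results[src["name"]] = "failed"
    return results
```

In real ADF these branches are JSON-defined activities evaluated by the service, but the control flow reduces to the same loop/condition/retry structure.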
SQL Server/SSIS/SSRS Developer, Kasmo
Jul 2013 - Oct 2018, United States
• Worked as an SSRS developer creating reports for analysis, an SSIS developer building fact and dimension tables for efficient analysis and reporting, and a DBA monitoring clients' databases on the production server.
• Developed and optimized database structures, stored procedures, triggers, views, joins, cursors, and functions for the application.
• Managed index statistics, tables, views, and cursors, and optimized queries using execution plans to tune the database.
• Performance-tuned stored procedures by removing unnecessary cursors.
• Developed SSIS packages to pull data from Access, Excel sheets, and flat files into SQL Server 2005.
• Performed ETL operations using SSIS to validate, extract, transform, and load data into data warehouses.
• Configured Database Mail to send automatic emails when an SSIS package fails or succeeds.
• Experienced in creating various report types (table, chart, list, and matrix); designed and developed drill-down and drill-through reports with drop-down menu options using SSRS.
• Scheduled daily, weekly, and monthly reports on sales and marketing information for various categories and regions based on business needs, using SQL Server Reporting Services (SSRS).
• Extensive experience working with SSIS on SQL Server 2008/R2.
• Exported and imported data between text files, Excel, and SQL Server using BULK INSERT and the BCP utility.
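A validate-and-load step like the SSIS flat-file packages in the role above can be sketched with Python's csv module. The file layout and type rules here are hypothetical; the point is the routing of rows to a success or error output, as an SSIS data flow does:

```python
import csv
import io

def load_flat_file(text, delimiter=","):
    # Parse the flat file and split rows into loadable vs. rejected,
    # the way an SSIS data flow routes failing rows to an error output.
    good, rejected = [], []
    for row in csv.DictReader(io.StringIO(text), delimiter=delimiter):
        try:
            # Transform: enforce a non-empty id and cast types.
            if not row["id"]:
                raise ValueError("missing id")
            good.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (ValueError, KeyError):
            rejected.append(row)
    return good, rejected
```

Keeping rejects instead of aborting the load mirrors SSIS's redirect-row error handling, which the Database Mail alerts described above would then report on.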
Education
-
Godavari Institute Of Engineering & Technology (GIET)
Frequently Asked Questions about Durga Devi
What company does Durga Devi work for?
Durga Devi works for Pgbc.
What is Durga Devi's role at the current company?
Durga Devi's current role is Data Engineer. The profile notes she is actively looking for an opportunity and lists the contact email durgay.de@gmail.com.
What schools did Durga Devi attend?
Durga Devi attended Godavari Institute Of Engineering & Technology (Giet).
Who are Durga Devi's colleagues?
Durga Devi's colleagues are Nick Williams, Harold Herd.