Deepak Ahuja
• Sr Data Architect with 14 years of experience in the IT industry
• Empower business units to achieve self-service capability in BI and analytics
• Experienced in building ETL pipelines in Azure Data Factory
• Experienced in building modern data warehouses in Azure Synapse Analytics
• Proficient in the MSBI suite (SSIS/SSRS/SSAS) using Visual Studio 2019/2017/2015 Data Tools
• Database/data warehouse design and development using SQL Server and SQL/T-SQL
• Implementation of ETL design, development, and deployment using SSIS packages
• Design and development of multidimensional and tabular OLAP SSAS cubes
• Implementation of star and snowflake schemas in data marts and SSAS projects
• Implemented SSIS/SSAS/SSRS best practices and optimization/performance tuning
• Visualization and interactive dashboard reporting for business, connections to various data sources, custom reporting solutions, data modeling, transformations, and DAX queries/calculations using Power BI Desktop and Power BI Service in USDA-RMA and US Forest Service systems with geospatial data
• Enterprise Data Warehouse (EDW) design and development (Data Vault 2.0 and dimensional modeling) of a next-generation enterprise data platform leveraging Azure Data Lake Storage, Azure SQL Data Warehouse, SQL Azure, Azure Analysis Services, and Power BI
• Predictive modeling using machine learning algorithms for predictive analysis projects using Python, statistics, Tableau, and Jupyter Notebook
• Automation of manual processes and modernization of legacy database systems/technologies
• Exposure to Python programming to set up DevOps CI/CD pipelines
• Dimensional modeling of data warehouses and data marts using the SAP PowerDesigner tool
• Generated parameterized, sub-report, and drill-down/drill-through reports using SSRS
• TFS/TFVC and Git in different projects for version control
• Engage business and other stakeholders throughout the solution architecture and implementation life cycle and recommend effective solutions aligned with the organization's long-term goals and objectives
• Implement data lake and ETL solutions using Databricks in Azure
• Enhancements and troubleshooting in web applications using .NET C#, LINQ, ASP.NET, web services, Entity Framework, MVC, JavaScript, etc.
• Active technical lead providing technical solutions, peer reviews, coding standards, best practices, recommendations, business process automation, work estimation, requirements gathering, client interaction, collaboration, and technical training
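The star-schema work called out above follows a standard dimensional-modeling pattern: a central fact table carrying measures, joined to surrounding dimension tables by surrogate keys. A minimal sketch in SQLite illustrates the idea; all table and column names here are hypothetical, not taken from any USDA system.

```python
import sqlite3

# Illustrative star schema: one fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,
        full_date   TEXT,
        fiscal_year INTEGER
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (20240101, 1, 99.50)")

# A typical OLAP query joins the fact table out to its dimensions
# and aggregates the measure.
row = conn.execute("""
    SELECT d.fiscal_year, p.product_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.fiscal_year, p.product_name
""").fetchone()
print(row)  # (2024, 'Widget', 99.5)
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables; the fact table is unchanged.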
Sr Data Architect | Technical Lead
The Cadmus Group | Nov 2024 - Present | Arlington, Virginia, US
Sr Data Architect | Technology Lead (USDA-FPAC NRCS)
Ventera | Apr 2024 - Present | Reston, Virginia, US
Sr Data Architect | Technology Lead (CMS CCSQ ESS)
Ventera | Jul 2023 - Apr 2024 | Reston, Virginia, US
• Data pipelines (ingestion, evaluation) from different third-party sources (New Relic, Splunk, HTTPS APIs, AWS CloudWatch, etc.) using Python SDK models/processes, APIs, and Lambda functions
• Designed data solutions for time-series data ingestion/evaluation and JSON documents in AWS Timestream, DynamoDB, S3, and Kinesis
• DevOps architecture in AWS EKS Fargate and Docker containerization
• Data analytics and visualization dashboards
• Exploratory data analysis and AI/ML pipelines (MLOps, SageMaker)
• Technology lead: work management, planning, collaboration, architectural review, and recommendations
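The Timestream ingestion described in this role centers on shaping metric samples into the `Records` structure the Timestream `write_records` API expects. A hedged sketch of that shaping step follows; the service and metric names are illustrative, not from the actual CMS system, and the real write call (shown commented) would run inside a Lambda handler.

```python
import time

def to_timestream_records(service, samples):
    """Shape (metric_name, value) pairs into the Records list expected by
    the Amazon Timestream write_records API. Dimension and metric names
    here are hypothetical examples."""
    now_ms = str(int(time.time() * 1000))  # Timestream expects epoch time as a string
    return [
        {
            "Dimensions": [{"Name": "service", "Value": service}],
            "MeasureName": name,
            "MeasureValue": str(value),
            "MeasureValueType": "DOUBLE",
            "Time": now_ms,
        }
        for name, value in samples
    ]

records = to_timestream_records(
    "checkout-api", [("latency_ms", 42.0), ("error_rate", 0.01)]
)
print(records[0]["MeasureName"])  # latency_ms

# In a real Lambda handler the batch would then be written with boto3:
# boto3.client("timestream-write").write_records(
#     DatabaseName="metrics", TableName="app", Records=records)
```

Batching samples into one `write_records` call per invocation keeps Lambda cost and Timestream request counts down.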
Enterprise Data Architect (USDA-FPAC)
SAIC | May 2023 - Jul 2023 | Reston, VA, US
Principal Data & BI Engineer & SAIC Research Fellow (USDA-NRCS)
SAIC | Jan 2021 - Apr 2023 | Reston, VA, US
• Worked on design and development of NRCS databases, data warehouse, and ETL for USDA NRCS soil databases
• Modernization of MS Access DBs to SQLite for Soil Survey geospatial databases using Python and SQLite
• Developed a data loader tool using Python 3.10 to load tabular and spatial data from multiple sources into file-based SQLite, GeoPackage, and SpatiaLite databases
• Modernization of ESRI ArcGIS Desktop to ArcGIS Pro
• Applied artificial intelligence to USDA soil data for predictive modeling using machine learning models in Azure Machine Learning Studio
• Legacy process automation, design strategy, architectural review, and peer review
• Created architectural guidelines and design strategies for databases, data warehouses, data pipelines, AI/ML models, code reviews, etc.
• Designed and built a scalable, agile, auditable, integrated Soil Data Warehouse (SDW) and soil data marts (SDM)
• Designed the data model and development of NASIS, the Soil Data Warehouse, and the Soil Data Mart
• Created the ETL pipeline to feed data from the NASIS source to sdwStaging, SDW, and SDM
• Developed BI reports, dashboards, and AI/ML machine learning models to predict dynamic soil properties
• Developed a data loader tool using Python and file-based SQLite DB (GeoPackage/SpatiaLite) templates to load tabular and spatial data from multiple sources, replacing the legacy data loader product based on MS Access and VB macros; it reduced data processing time by 90% compared to the legacy loading process for 5k USDA NRCS users
• Built data pipelines in Azure using Azure Data Factory, Synapse Analytics, Azure SQL Database, and Databricks
Proudly supporting "Helping People Help the Land"
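The core of a tabular data loader like the one described in this role is reading source rows and bulk-inserting them into a file-based SQLite database. A minimal sketch using only the standard library follows; the `component`/`mukey` names echo soil-survey conventions but are used here purely as illustration, and a real loader would add type mapping, validation, and spatial (GeoPackage/SpatiaLite) handling.

```python
import csv
import io
import sqlite3

def load_csv(conn, table, csv_text):
    """Create a table from a CSV header row and bulk-insert the data rows.
    A simplified sketch: columns are untyped and names are not sanitized."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    cols = ", ".join(header)
    conn.execute(f"CREATE TABLE {table} ({cols})")
    placeholders = ", ".join("?" for _ in header)
    # executemany batches the inserts in a single implicit transaction
    conn.executemany(f"INSERT INTO {table} VALUES ({placeholders})", data)

# In-memory DB for the demo; a real loader would open a .sqlite/.gpkg file.
conn = sqlite3.connect(":memory:")
load_csv(conn, "component", "mukey,compname,comppct\n1,Alpha,60\n1,Beta,40\n")
count = conn.execute("SELECT COUNT(*) FROM component").fetchone()[0]
print(count)  # 2
```

Because SQLite, GeoPackage, and SpatiaLite all share the same file format underneath, the same insert path can target any of the three once the spatial metadata tables are in place.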
Principal Data & Business Intelligence Engineer (USDA-RMA)
SAIC | Apr 2019 - Jan 2021 | Reston, VA, US
• Design, development, and deployment of USDA RAS (Reinsurance Accounting System) databases and data warehouse using SQL/T-SQL, ETL processes using SSIS, enterprise data marts, and multidimensional and tabular SSAS cubes for OLAP solutions
• Data warehouse design, development, and enhancements for USDA accounting systems
• Database design, standards, data modeling, performance tuning, and modernization
• SSRS report design, development, scripted deployment, security, and standards
• Python scripting for automation of database, SSIS, and report deployments
• Connectivity, hosting, and other data operations on the Azure cloud
• MDS (Master Data Services) solution design for different source data
• Design and development of MS Power BI reports using Power BI Desktop with multiple data sources
• Implementation of Power BI transformations, data modeling, calculations, and custom reporting solutions
• Visualization and interactive dashboard reporting for business and mobile apps, data modeling and DAX queries/calculations in Power BI Desktop, and deployment to Power BI Service
• Oversaw the data structure design (Data Vault 2.0 and dimensional modeling) of the next-generation enterprise data platform leveraging Azure Data Lake Storage, Azure SQL Data Warehouse, SQL Azure, and Azure Analysis Services
• Machine learning techniques and algorithms; Python scripting, Python libraries, statistical analysis, and predictive analysis/modeling
Senior Data & Business Intelligence Engineer (USDA-RMA)
Clarus Group | Dec 2016 - Apr 2019 | Overland Park, Kansas, US
• Design, development, and deployment of USDA RAS system databases and data warehouse using SQL/T-SQL, ETL processes using SSIS, enterprise data marts, multidimensional and tabular SSAS cubes for OLAP solutions, and SSRS, Power Pivot, and chart reporting from cube sources
• Created database design documentation and implemented best practices for SSAS and SSIS
• Integration of RAS systems with legacy systems using SSIS for ETL operations
• Implementation of staging databases, enterprise data warehouse, and data marts
• Data virtualization and integration with source systems
• Worked on performance tuning and optimization of SSIS and database objects
• Data modeling, design, and development of the data warehouse using agile Data Vault modeling
• RAS application data warehouse and data mart data modeling using the SAP PowerDesigner tool
• Generated parameterized, sub-report, drill-down, and drill-through reports with scheduling and email notifications using SSRS and Crystal Reports
• Implementation of role-based access logic and security on SSAS cubes and reports
• Automated builds and deployment of database objects and BI components using PowerShell scripts
• Built Power Pivot workbooks using tabular-model data to prove the functionality of the database
• SSRS migration from SharePoint integrated mode to native mode
• PowerShell scripting for automation of database, SSIS, and report deployments
• Connectivity, hosting, and other data operations on the Azure cloud
• POC on MDS (Master Data Services)
• Visualization and interactive dashboard reporting for business and mobile apps, data modeling and DAX queries/calculations, Power BI Desktop, and Power BI Service
• Machine learning techniques and algorithms; Python & R scripting, Python libraries, statistical analysis, and predictive analysis/modeling
Senior Professional: Application Designer
DXC Technology | Dec 2013 - Dec 2016 | Ashburn, Virginia, US
Project: SCS (Service Contract System) and TPA interfaces | Client: Zurich NA | Location: Overland Park, KS | Team size: 4
Project description: Service Contract System (SCS) is a vehicle/automobile insurance system developed to handle contracts for vehicle consumers and administer their claims. SCS provides dealerships with the contract products they can sell to individual vehicle buyers.
Contribution:
• DTS 2000 to SSIS 2012 upgrade and SQL Server 2000 to SQL Server 2012 upgrade
• Integration of SCS systems with third-party systems/applications using SSIS for ETL operations; reporting using SSRS and Crystal Reports; database development using SQL/T-SQL; scheduling of Tidal jobs using the Tidal scheduler
• Developed multidimensional objects (cubes, dimensions) using SQL Server Analysis Services (SSAS)
• Designed dimensional models using SSAS for end users; created hierarchies in dimensional models
• Worked on Power BI, Power Pivot, and Power View
• Designed and developed reports according to requirements using Power Map
Application Developer
CSC - Computer Sciences Corporation | Feb 2010 - Dec 2013 | Global, US
Project: ZLPRS (Zurich Lender's Property Reporting System) | Client: Zurich NA
Project description: ZLPRS is a web-based application used to protect the lender's interest in mortgaged and investment properties at the time of loss. ZLPRS is a SOA (service-oriented architecture) application, consuming web services and batch jobs to extract data from ZLPRS application database tables and feed it to CMS, Casper, and Cesar using SSIS packages.
Contribution:
• Upgraded databases from SQL Server 2000 to 2008 and developed SQL 2008 SSIS packages
• Formatted Crystal Reports using the Crystal Report Wizard
• Worked with different control flow elements such as Data Flow Task, Execute SQL Task, Script Task, and Send Mail Task, and migrated data between heterogeneous sources such as flat files and Excel using SSIS
• Worked on creating, deploying, and managing cubes using SSAS
Application Developer
CSC - Computer Sciences Corporation | Jun 2012 - Aug 2013 | Global, US
Project: Zurich Desktop Integrated Workflow Process (zDIWP) | Client: Zurich NA
Project description: Zurich Desktop Integrated Workflow Process is a web application developed for CSC and Zurich eWP teams to track the PC new/rebuild/refresh process, and for Zurich employees to validate functions against the checklist and provide sign-off.
Contribution:
• Designed MS SSIS packages to extract data from various OLTP sources, text files, XML (Extensible Markup Language) files, MS SQL Server 2000, Excel, and MS Access 2000 into OLAP systems such as SQL Server
• Wrote T-SQL queries with the help of stored procedures, functions, triggers, cursors, views, and indexes
• Created various SSIS packages using data transformations such as Merge, Aggregate, Sort, Multicast, Conditional Split, SCD (Slowly Changing Dimension), and Derived Column
• Developed multidimensional objects (cubes, dimensions) using SQL Server Analysis Services (SSAS)
• Designed dimensional models using SSAS for end users; created hierarchies in dimensional models
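The SCD (Slowly Changing Dimension) transformation mentioned above implements a well-known pattern: on an attribute change, the Type 2 variant expires the current dimension row and appends a new current row, preserving history. A minimal in-memory sketch of that logic follows; the row shape and field names are hypothetical, and SSIS performs the same steps against real dimension tables.

```python
from datetime import date

def scd2_apply(dim_rows, key, new_attrs, today):
    """Slowly Changing Dimension Type 2: expire the current row for `key`
    and append a new current row. A row with end_date None is 'current'."""
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == new_attrs:
                return dim_rows          # attributes unchanged: no new version
            row["end_date"] = today      # expire the old version
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "start_date": today, "end_date": None})
    return dim_rows

# A customer moves cities: history is kept, not overwritten.
dim = [{"key": 7, "attrs": {"city": "Reston"},
        "start_date": date(2020, 1, 1), "end_date": None}]
scd2_apply(dim, 7, {"city": "Ashburn"}, date(2024, 6, 1))
print(len(dim), dim[0]["end_date"])  # 2 2024-06-01
```

Type 1, by contrast, would simply overwrite `attrs` in place, which is why Type 2 is preferred whenever reports must reproduce historical states.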
Application Developer
CSC - Computer Sciences Corporation | Dec 2011 - Dec 2012 | Global, US
Project: ZOA and iSolve (SharePoint portal)
Project description: ZOA is a SharePoint portal used for the visa processing of CSC employees supporting the Zurich client. iSolve is an issue-tracking tool used to enter and search issue/incident details.
Contribution:
• Transformed complex business requirements into ETL using a variety of transformations such as Aggregate, Derived Column, Data Conversion, Union All, Merge Join, and Lookup
• Wrote complex stored procedures, code, and triggers to capture updated and deleted data from OLTP systems
• Deployed SSIS packages to MSDB as well as the file system based on requirements
• Created automated processes for activities such as database backups and sequential SSIS/SSRS package runs using SQL Server Agent jobs and Windows Scheduler
• Created Power Pivot models and reports, published reports to SharePoint, and deployed models to a SQL Server SSAS instance
• Built Power Pivot workbooks using tabular-model and cube data models to prove the functionality of the database
Associate Application Developer
CSC - Computer Sciences Corporation | Nov 2009 - Dec 2011 | Global, US
Project: RUBI (FiQuote) | Client: Zurich NA
Project description: RUBI (Rating, Underwriting, Booking, Issuing) is a database-backed application used to rate, book, issue, and store underwriting information for selected fidelity and professional liability products written by financial enterprises. The project mainly involved creating new databases, performance tuning, T-SQL coding, data migration, performing backups, and Extract, Transform, and Load (ETL) of data from flat file and Excel sources using SSIS packages, as well as creating, deploying, and scheduling reports.
Contribution:
• Performed data modeling, organizing the data and giving the database proper structure by implementing constraints on tables
• Involved in debugging and tuning of T-SQL queries, SSIS packages, and SSRS reports
• Used TFS to check in and check out SSIS packages, SSRS reports, and database code (version control)
• Documented all coding standards and procedures
Environment: SQL Server 2008 R2/2012, T-SQL, SSIS, SSRS, SQL Profiler
Deepak Ahuja Skills
Deepak Ahuja Education Details
Guru Gobind Singh Indraprastha University - Computer Science
Georgia Institute of Technology - Artificial Intelligence
Frequently Asked Questions about Deepak Ahuja
What company does Deepak Ahuja work for?
Deepak Ahuja works for The Cadmus Group.
What is Deepak Ahuja's role at the current company?
Deepak Ahuja's current role is Sr Data Architect | Technology Lead.
What is Deepak Ahuja's email address?
Deepak Ahuja's email address is de****@****bal.com
What schools did Deepak Ahuja attend?
Deepak Ahuja attended Guru Gobind Singh Indraprastha University, Georgia Institute Of Technology.
What skills is Deepak Ahuja known for?
Deepak Ahuja has skills like Requirements Analysis, SDLC, Web Services, Agile Methodologies, XML, Oracle, SQL, Spring Framework, PL/SQL, Microsoft SQL Server, SSRS, and C#.