Ramneet Kaur

Ramneet Kaur Email and Phone Number

Sr. Data Integration Specialist @ Co-operators
Ramneet Kaur's Location
Mississauga, Ontario, Canada
Ramneet Kaur's Contact Details

Ramneet Kaur personal email

n/a
About Ramneet Kaur

Results-driven and highly experienced Sr. Data Integration Specialist with over 15 years of expertise in database analysis, design, and development across the banking, insurance, and telecommunications sectors. Proficient in Oracle and Azure databases, with extensive knowledge of data warehousing, ETL processes, and data modeling. Skilled in a variety of tools, including Azure Synapse Studio, TOAD, SQL Developer, and Aginity Workbench. Proven track record of collaborating with cross-functional teams to deliver successful Netezza-based solutions that drive business outcomes.

Dynamic and results-oriented IT professional with experience leading end-to-end migration projects from Netezza to Azure Synapse. Proficient in designing, implementing, and optimizing data warehouse and data modeling solutions to meet business objectives effectively. Excellent communicator and team player with a strong understanding of Agile and Waterfall methodologies.

Ramneet Kaur's Current Company Details
Co-operators

Sr. Data Integration Specialist
Ramneet Kaur Work Experience Details
  • Co-Operators
    Sr Data Integration Specialist
    Co-Operators Mar 2023 - Present
    Guelph, Ontario, Canada
    • Implementing code refactoring and modularization strategies to streamline stored procedure logic and reduce execution overhead.
    • Identifying and resolving performance bottlenecks in SQL queries, improving query execution time.
    • Decoupling procedures and running parallel jobs to finish daily loads earlier.
    • Developing and implementing indexing strategies tailored to specific workload and access patterns within the Azure Synapse database.
    • Applying index maintenance techniques such as index defragmentation, statistics updates, and index reorganization to ensure optimal index performance and data-access efficiency.
    • Successfully migrated large volumes of data while ensuring data integrity and minimal downtime.
    • Utilized Azure Synapse Analytics (DW) and Azure SQL DB for efficient data storage and processing.
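The index-maintenance work described above follows a widely used SQL Server / Azure Synapse rule of thumb: reorganize lightly fragmented indexes, rebuild heavily fragmented ones, and refresh statistics afterwards. The sketch below generates the corresponding T-SQL; the table and index names, and the 10%/30% thresholds, are illustrative, not taken from this profile.

```python
# Sketch of an index-maintenance decision rule: REORGANIZE for light
# fragmentation, REBUILD for heavy fragmentation, then refresh statistics.
# Names and thresholds are illustrative assumptions.

def maintenance_sql(table: str, index: str, frag_pct: float) -> list[str]:
    """Return the T-SQL statements to run for one index."""
    stmts = []
    if frag_pct >= 30:                   # heavy fragmentation: full rebuild
        stmts.append(f"ALTER INDEX {index} ON {table} REBUILD;")
    elif frag_pct >= 10:                 # light fragmentation: in-place reorganize
        stmts.append(f"ALTER INDEX {index} ON {table} REORGANIZE;")
    if stmts:                            # keep the optimizer's estimates fresh
        stmts.append(f"UPDATE STATISTICS {table};")
    return stmts

print(maintenance_sql("dbo.Claims", "IX_Claims_PolicyId", 42.0))
```

In practice the fragmentation percentage would come from a DMV query rather than being passed in by hand.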
  • Co-Operators
    Business Intelligence Consultant
    Co-Operators Jan 2022 - Mar 2023
    Guelph, Ontario, Canada
    • Developed and executed comprehensive migration strategies, including full-load and incremental loading processes.
    • Conducted thorough data profiling and validation to ensure accuracy and completeness of migrated data.
    • Implemented data transformation logic to align with Azure Synapse data modeling techniques.
    • Collaborated with stakeholders, including business analysts, database administrators, and data engineers.
    • Effectively communicated project progress, challenges, and solutions to the project team and management.
    • Streamlined data processing, improving query performance in the Azure Synapse environment.
    • Ensured seamless data availability for business intelligence and reporting purposes.
    • Provided training and support to end users and stakeholders on Azure Synapse features.
    • Designed and implemented efficient ETL pipelines using Azure Synapse and Azure DevOps.
    • Conducted thorough assessments of source Netezza databases, identifying dependencies, data integrity issues, and optimization opportunities to streamline the migration process.
  • Co-Operators
ETL Developer / BI Analyst III
    Co-Operators Sep 2016 - Feb 2022
    Guelph, Ontario
    • Optimized and tuned SQL queries to improve database performance.
    • Migrated data from Netezza to cloud databases: Azure Synapse Analytics (DW) and Azure SQL DB.
    • Converted code from PostgreSQL to T-SQL.
    • Used star schema and normalization data modeling techniques for data warehousing.
    • Worked on highly complex projects across varied technical areas and business segments.
    • Implemented incremental loading of data from the existing full-load process.
    • Created and maintained data mappings and transformation logic (for logical and physical models) to support implementation and ongoing maintenance of operational data stores and warehouses.
    • Validated ETL processes and data in the different layers (staging and target).
    • Supported Netezza ETL and data warehouse production, development, and test environments.
    • Implemented procedures for validation, reconciliation of metadata, and error handling in Netezza ETL processes.
    • Analyzed data quality and applied business rules in all layers of the extract, transform, and load process.
    • Performed dimensional modeling and ER modeling: data marts, star/snowflake schemas, fact and dimension tables.
    • Prepared job notes for scheduling Stonebranch jobs and scheduled jobs in the SAS scheduler.
    • Migrated manual processes to automation using SAS, SAS macros, and SQL queries across all phases (DEV, QA, and PROD).
    • Designed the SAS scheduler in SAS Management Console and monitored flows in Flow Manager.
    • Created SAS calendars to schedule flows using specific patterns and combined calendars.
    • Prepared Unix shell scripts and job notes to schedule automated production jobs in Stonebranch.
    • Performed code reviews to ensure proper design, standards, content, and functionality in the existing data warehouse.
    • Monitored progress and documented compliance with Netezza ETL best-practice standards.
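Incremental loading, as mentioned above, typically works by tracking a high-watermark: only rows changed since the last run are pulled from the source and merged into the target. The in-memory "tables" and column names below are illustrative, not the actual warehouse schema.

```python
# Minimal watermark-based incremental-load sketch. Column names
# (id, modified_at) are illustrative assumptions.

def incremental_load(source_rows, target, watermark):
    """Merge rows with modified_at > watermark into target; return new watermark."""
    new_watermark = watermark
    for row in source_rows:
        if row["modified_at"] > watermark:      # only rows changed since last run
            target[row["id"]] = row             # upsert by business key
            new_watermark = max(new_watermark, row["modified_at"])
    return new_watermark

source = [
    {"id": 1, "modified_at": 5, "val": "a"},
    {"id": 2, "modified_at": 12, "val": "b"},   # changed since last run
]
target = {1: {"id": 1, "modified_at": 5, "val": "a"}}
wm = incremental_load(source, target, watermark=10)
```

In a real pipeline the watermark is persisted between runs and the merge is expressed as SQL (e.g. a MERGE or delete-and-insert) rather than a Python dict update.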
  • CIBC
    ETL Developer / Business Data Analyst
    CIBC Jun 2015 - Sep 2016
    Toronto, Canada Area
    • Wrote ETLs and stored procedures to extract, transform, and load data into the Netezza data warehouse from sources such as Oracle and flat files to generate regulatory reports.
    • Analyzed differences between old and new system change requirements, created logical and physical database designs based on required changes, and designed mapping documents in TFS.
    • Estimated and planned development work using Agile methodology in the BI team.
    • Developed all mappings according to the design document and mapping specs provided, and performed unit testing.
    • Used NZLoad (Netezza Load) to load data from flat files into Netezza tables.
    • Wrote stored procedures to log pre- and post-load data information for target tables while running ETL.
    • Created stored procedures from SAS programs and loaded data into Netezza tables from SAS datasets.
    • Created and maintained database objects such as tables, views, indexes, constraints, SQL code, and stored procedures.
    • Designed Netezza ETL to reflect all transaction-level changes based on all joining tables.
    • Implemented time windowing (time frames) in SQL scripts to load data from source to target tables.
    • Worked with the test team to fix all bugs before code was deployed to UAT.
    • Implemented bi-temporal logic in scripts to track the history of all data transactions.
    • Used Aginity Netezza Workbench to perform various DML and DDL operations.
    • Optimized queries to improve the database performance of the application.
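The bi-temporal history tracking mentioned above rests on a versioning idea: updates never overwrite a row; the current version is end-dated and a new version is appended, so every past state remains queryable. The sketch below is heavily simplified to a single (transaction-time) axis, and the column names are illustrative assumptions.

```python
# Simplified single-axis sketch of temporal versioning: each business key
# keeps every version of its row, stamped with transaction start/end times.
# Column names and the sentinel value are illustrative assumptions.

HIGH_DATE = 9999                                   # "open-ended" sentinel

def upsert(history, key, value, tx_time):
    """Append a new version of `key`, closing the previous open version."""
    for row in history:
        if row["key"] == key and row["tx_end"] == HIGH_DATE:
            row["tx_end"] = tx_time                # close the current version
    history.append({"key": key, "value": value,
                    "tx_start": tx_time, "tx_end": HIGH_DATE})

history = []
upsert(history, "policy-1", "draft", tx_time=1)
upsert(history, "policy-1", "active", tx_time=7)   # both states remain in history
```

Full bi-temporality adds a second, independent valid-time axis (when the fact was true in the business world) alongside the transaction-time axis shown here.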
  • BMO Financial Group
    ETL Developer
    BMO Financial Group Apr 2014 - Jun 2015
    North York
    • Worked closely with system analysts and business analysts to understand and analyze requirements.
    • Loaded data into Netezza from legacy systems and flat files using complex Unix scripts.
    • Built the ELT architecture and designed source-to-target mappings to load data into the data warehouse.
    • Used Aginity Netezza Workbench to perform various DML and DDL operations on the Netezza database.
    • Extensively used the NZSQL and NZLoad utilities to load data into the Netezza database.
    • Optimized queries to improve application performance.
    • Optimized and tuned Netezza SQL to improve database performance.
    • Provided production support to resolve issues, inquiries, and requests.
    • Designed SSIS packages using several transformations to perform data profiling, cleansing, and transformation.
    • Wrote complex SQL queries on Netezza and used them in lookup SQL overrides and source qualifier overrides.
    • Extracted data from sources such as Oracle, Netezza, and flat files, and loaded it into the target Netezza database.
    • Designed data models using PowerDesigner.
    • Developed ELT scripts in Unix/NZSQL for a Netezza data warehouse.
    • Provided production support to resolve issues encountered during data loads.
  • Rogers Communications
DB Developer / DBA
    Rogers Communications Jun 2010 - Mar 2014
    • Supported Microsoft SQL Server and Oracle database environments, including production, development, maintenance-testing, and patch-testing databases used across the division for each application release.
    • Worked closely with the development group on database design, development, and implementation projects.
    • Analyzed, developed, maintained, and implemented databases in multiple environments.
    • Designed and developed Informatica mappings to load data from source systems.
    • Developed and modified PL/SQL packages, procedures, functions, and triggers per project requirements.
    • Built and maintained Microsoft SQL scripts, indexes, reports, and queries for data analysis and extraction.
    • Developed new processes to facilitate import and normalization.
    • Used Microsoft SQL Loader to upload information into database tables.
    • Gathered requirements from business analysts and implemented changes in the existing product database.
    • Developed logical and physical data models capturing current-state and future-state data elements and data flows using Erwin.
    • Interacted with infrastructure, release management, change management, QA, DBA, and application teams.
    • Created DDL scripts and database objects such as tables, views, clustered and non-clustered indexes, synonyms, procedures, and sequences.
    • Used the enterprise data warehouse for reporting, data analysis, and data integration.
    • Performed tuning of complex queries and resolved production problems for the applications.
    • Worked extensively with Analysis Services and Reporting Services, SQL Server clustering, and log shipping.
    • Used Visio as a CASE tool for designing database structures.
    • Created and evaluated stored procedures for potential database performance issues.
    • Prepared ETL design documents covering database structure, change data capture, error handling, and restart and refresh strategies.
    • Worked in both Waterfall and Agile methodologies.
  • Infosys
    Database Developer
    Infosys Jul 2007 - Dec 2009
    • Developed PL/SQL procedures, functions, and packages, and used SQL*Loader to load data into the database.
    • Analyzed and computed statistics to improve query response time.
    • Prepared test cases and data, and executed and verified test results.
    • Developed packages, customized functions, and triggers based on business logic.
    • Developed and maintained an MS Access database.
    • Assisted in installing the application server, report server, and database server.
    • Performed periodic cold backups of the databases.
    • Developed applications in Oracle using PL/SQL; created tables, cursors, and SQL queries.
    • Performed data migrations from production to test and development environments.
    • Created SQL jobs for ETL processes and performed data validation after ETL runs.
    • Developed and maintained existing system code.
    • Used Visio to create data-connected diagrams.
    • Created an MS Access database to collect data from systems and provide lists, status reports, and management overview reports.
    • Performed performance tuning of SQL queries used to extract data from different source systems and to transform data using SQL and PL/SQL code.

Ramneet Kaur Skills

Oracle, SQL, Agile Methodologies, Databases, SDLC, PL/SQL, Unix, Requirements Analysis, Microsoft SQL Server, Data Warehousing, XML, Integration, Web Services, Scrum, Performance Tuning, Telecommunications, Software Development

Ramneet Kaur Education Details

  • PTU, Jalandhar, Punjab
    Computer Engineering

Frequently Asked Questions about Ramneet Kaur

What company does Ramneet Kaur work for?

Ramneet Kaur works for Co-operators.

What is Ramneet Kaur's role at the current company?

Ramneet Kaur's current role is Sr. Data Integration Specialist.

What is Ramneet Kaur's email address?

Ramneet Kaur's email address is kh****@****o.co.in

What schools did Ramneet Kaur attend?

Ramneet Kaur attended PTU, Jalandhar, Punjab.

What skills is Ramneet Kaur known for?

Ramneet Kaur has skills like Oracle, SQL, Agile Methodologies, Databases, SDLC, PL/SQL, Unix, Requirements Analysis, Microsoft SQL Server, Data Warehousing, XML, and Integration.
