Poonam S.
• Highly analytical, process-oriented data analyst with 7+ years of experience in data analysis and data management; proven ability to work efficiently both independently and in team environments.
• Experienced in Requirement Analysis, Test Design, Test Preparation, Test Execution, Defect Management, and Management Reporting.
• Excellent understanding of both traditional statistical modeling and machine learning techniques and algorithms, including regression, clustering, ensembling (random forest, gradient boosting), and deep learning (neural networks).
• Experience in SAS, SQL, SSRS, Tableau, Python, and MS Excel (VLOOKUP, pivot charts, macros).
• Expertise in SAS data manipulation using TABULATE, UNIVARIATE, APPEND, arrays, DO loops, macros, and MERGE procedures.
• Skilled in data parsing, manipulation, and preparation, including describing data contents, computing descriptive statistics, regex, split and combine, remap, merge, subset, reindex, melt, and reshape operations.
• Experience with various R and Python packages, such as ggplot2, caret, dplyr, RWeka, gmodels, twitteR, NLP, reshape2, rjson, pandas, NumPy, Seaborn, SciPy, Matplotlib, scikit-learn, and Beautiful Soup.
• Expertise in linear and logistic regression, classification modeling, decision trees, Principal Component Analysis (PCA), and cluster and segmentation analyses; authored and co-authored several scholarly articles applying these techniques.
• Extensive experience in text analytics, developing statistical machine learning and data mining solutions to various business problems, and generating data visualizations using R, Python, and Tableau.
• Expert in Python libraries such as NumPy and SciPy for mathematical calculations, pandas for data pre-processing and wrangling, Matplotlib and Seaborn for data visualization, scikit-learn for machine learning, Theano, TensorFlow, and Keras for deep learning, and NLTK for NLP.
• Performed data analysis and data validation by writing complex SQL queries against the SQL Server database.
• Strong Teradata skills, including building and maintaining Teradata tables, views, constraints, indexes, SQL and PL/SQL scripts, functions, triggers, and stored procedures.
• Experience creating Teradata objects, including volatile tables, derived tables, and global temporary tables.
Blue Cross & Blue Shield
Data Analyst, Blue Cross & Blue Shield Jun 2023 - Present
• Perform root cause analysis on smaller self-contained data analysis tasks related to assigned data processes.
• Work as IPDS lead, monitoring incoming and outgoing files while ensuring the accuracy of the data.
• Work as SME for IPDS and manage access audits.
• Work in DbVisualizer, Toad, and SQL for daily query runs.
• Actively involved in LCMP updates.
• Work on the Provider Data Repository (PDR).
• Ensure that all files are staged correctly, with complete accuracy and no variances, and are sent by all Plans on schedule.
• Work actively in ServiceNow; responsible for closing enquiries from all Plans and preparing reports.
• Analyze workflow and targets from quality team members, prepare weekly reports for upper management using SAS and Power BI, and create dashboards/scorecards to provide analytical support to the QA department as needed.
• Run bi-weekly production reporting to the vendor through SAS, perform data validation, and update and maintain quality outreach spreadsheets from updated production run data.
• Quarterly mapping updates for audit.
Data Analyst, M&T Bank Jun 2022 - May 2023, Buffalo, New York, United States
• Performed root cause analysis on smaller self-contained data analysis tasks related to assigned data processes.
• Performed reverse engineering to provide data flows to business stakeholders.
• Worked in Informatica to find workflows and identify sources from Unix.
• Provided data lineage and used Teradata stored procedures to trace the flow of data.
• Responsible for validating more than 45 SQL patches per month on tight deadlines.
• Worked to ensure high levels of data consistency between diverse source systems, including flat files, XML, and SQL databases.
• Analyzed workflow and targets from quality team members, prepared weekly reports for upper management using SAS and Power BI, and created dashboards/scorecards to provide analytical support to the QA department as needed.
• Ran bi-weekly production reporting to the vendor through SAS, performed data validation, and updated and maintained quality outreach spreadsheets from updated production run data.
• Quarterly mapping updates for audit.
Data Analyst III, Co-Op Financial Oct 2021 - Apr 2022, California, United States
• Analyzed the Business Requirements Specification documents and source-to-target mapping documents and identified the test requirements.
• Developed SQL and Snowflake queries to pull data from different databases for the specified requirements.
• Worked on Azure Pipelines and on FIS, Fiserv, and Kobie data.
• Worked on daily reports for the UAT and production environments.
• Participated in all phases of data preparation: data collection, data cleaning, validation, and visualization; performed gap analysis.
• Implemented end-to-end systems for data analytics and data automation, integrated with custom visualization tools using Tableau.
• Gathered all required data from multiple data sources and created the datasets used in the analysis.
• Developed and implemented predictive models of user behavior data on websites, URL categorization, social network analysis, social mining, and search content based on large-scale machine learning.
• Developed predictive models on large-scale datasets to address various business problems by leveraging advanced statistical modeling, machine learning, and deep learning.
• Coordinated with business users to understand functional requirements, including creating the ETL specification document and participating in review meetings.
• Tested the format of the reports against the specifications provided and compared report data with the backend data mart using SQL, with Excel for data comparison.
• Created XML schema definitions (XSDs) with the XML Spy tool and converted them into Informatica metadata.
• Manipulated and prepared data, extracting data from the database for business analysts using SAS.
• Involved in end-to-end testing of the entire process flow, from the source database to the target data mart to the reports, considering all possible scenarios.
Sr. Data Analyst, Trinet Jan 2020 - Sep 2021
• Performed root cause analysis on smaller self-contained data analysis tasks related to assigned data processes.
• Worked to ensure high levels of data consistency between diverse source systems, including flat files, XML, and SQL databases.
• Analyzed workflow and targets from quality team members, prepared weekly reports for upper management using SAS and Power BI, and created dashboards/scorecards to provide analytical support to the QA department as needed.
• Ran bi-weekly production reporting to the vendor through SAS, performed data validation, and updated and maintained quality outreach spreadsheets from updated production run data.
• Created reports and dashboards using D3.js and Tableau 9.x to explain and communicate data insights, significant features, model scores, and the performance of new recommendation systems to both technical and business teams.
• Utilized SQL, Excel, and several marketing/web analytics tools (Google Analytics, Bing Ads, AdWords, AdSense, Criteo, Smartly, SurveyMonkey, and Mailchimp) to complete business and marketing analysis and assessment.
• Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems.
Data Governance, University of Pennsylvania Nov 2015 - Dec 2019
• Worked on IPDS (Inter Plan Data Solutions) to keep daily track of incoming and outgoing reports.
• Tracked BOC reports daily to make sure all loaded without error.
• Worked with the Plan in case of any error on the Plan side.
• Worked with the IT department in case of any file-load error due to decryption issues.
• Followed the data lineage when errors were detected and provided upper management with root cause analysis.
• Worked on solving ServiceNow tickets created for complex issues.
• Worked with the PDR tool (Provider Data Repository) on a daily basis to keep tight control over daily files being staged without variance.
• Ran Toad/DbVisualizer queries to find variances.
• Worked on monthly gap analysis.
• Solely responsible for the quarterly IPDS user audit and SOC audit.
Data Analyst, Hudson City Savings Bank Oct 2009 - Oct 2015, 07054
• Designed and created test cases based on the business requirements (also referenced the source-to-target detailed mapping document and transformation rules document).
• Involved in extensive data validation using SQL queries and back-end testing.
• Used SQL for querying the database in a UNIX environment.
• Developed separate test cases for the ETL process (inbound and outbound) and reporting.
• Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
• Involved in manual and automated testing using QTP and Quality Center.
• Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
Poonam S. Education Details
Frequently Asked Questions about Poonam S.
What company does Poonam S. work for?
Poonam S. works for Blue Cross & Blue Shield.
What is Poonam S.'s role at the current company?
Poonam S.'s current role is Data Analyst at Blue Cross & Blue Shield.
What schools did Poonam S. attend?
Poonam S. attended Govt. College Gurdaspur.