Krishna Chaitanya Dhenuva Konda Email and Phone Number
Krishna Chaitanya Dhenuva Konda work email
Krishna Chaitanya Dhenuva Konda personal email
Krishna Chaitanya Dhenuva Konda phone numbers
Results-oriented and highly motivated Solutions Architect / Data Architect with 15+ years of experience designing and implementing cost-effective, efficient cloud solutions using industry best practices. Expert at deriving meaningful insights from data, with rich implementation and oversight experience across Cloud (AWS), Big Data, Data Pipelines, Data Warehousing, Data Modeling, Database Migration, ETL/ELT, Data Governance and Data Virtualization, plus expert-level Oracle Data Modeling, PL/SQL Development and SQL Query Performance Tuning skills.
• AWS Certified Solutions Architect, well-rounded across the entire AWS ecosystem for efficient and cost-effective architectural choices; quick to learn and implement new technologies; specialized in AWS Redshift and Oracle databases
• Hands-on experience with AWS Redshift, RDS, DynamoDB, Glue, Lambda, EC2, Athena, EMR, S3, IAM and API Gateway, plus Tableau, Oracle, Python, Hadoop, Teradata and Informatica, across the Telecom, Supply Chain, Retail and Banking domains
• Expert in designing, implementing, tuning, securing and maintaining cloud-based columnar data warehouses such as AWS Redshift
• Expert-level SQL query performance tuning skills and experience with scripting languages such as Python, SQL, PL/SQL and UNIX shell
• Experience designing and implementing cloud migrations, cloud-based architectural solutions, big data, data lakes, ETL/data pipelines and large data warehouses
• Evaluated and implemented various big data technologies and solutions to process extremely large data sets accurately and on time
• Played a leading role in building systems and complex datasets using data engineering best practices, data management fundamentals, data storage principles, recent advances in distributed systems, and operational excellence best practices to meet business requirements
• Proven track record of driving rapid prototyping and design for new projects and setting up data analytics environments; able to translate broad business initiatives into clear team objectives and concrete individual goals, aligning with other stakeholders for efficient, coordinated action
• Tracked best practices and market trends in Cloud and the overall industry to provide thought leadership (seminars, whitepapers, etc.) and mentored the team to build the necessary competency
Vanguard
- Website: vanguard.com
- Employees: 25515
- Company email: privacy@vanguard.com
Data Architect, Vanguard, San Diego, CA, US
Solution Architect, Vanguard, Sep 2024 - Present, Valley Forge, PA, US
Data Architect, Konami Gaming, Inc., Nov 2022 - Aug 2024, Las Vegas, Nevada, US
• Oversaw the management of database operations for multiple customers, ensuring optimal performance and minimal downtime
• Set up CloudWatch metric streams with filters to Kinesis
• Configured delivery of CloudWatch logs to S3
• Configured CloudWatch Logs Insights for querying capabilities on the logs
• Created IAM permissions and set up the CloudWatch agent on EC2 instances to route logs to CloudWatch
• Created, administered and maintained petabyte-scale enterprise data warehouse clusters built on AWS Redshift
• Improved Redshift query performance using WLM queues, AWS Performance Insights, AWS CloudTrail, and Redshift data distribution styles and sort keys
• Developed and maintained data security and governance for the Redshift EDW using VPC, security groups, IAM users and encryption
• Created tables, views, constraints, indexes, MVs, external tables, ref cursors, collections and triggers
• Performance tuning of Oracle queries and PL/SQL stored procedures, functions, packages, etc.
• Led the design and deployment of a new data warehousing / data lake solution using AWS S3, Lake Formation, Glue, Redshift and Oracle that processed over 10 TB of data daily
• Enhanced system performance by optimizing query response times and doubling the speed of report generation for client-facing applications
• Conducted comprehensive reviews and updates of existing database systems to improve operational efficiency
• Collaborated with cross-functional teams to integrate data handling that improved location-based analytics capabilities
• Established data security measures using KMS, IAM, TLS and VPC that reduced vulnerability exploits within the first year of implementation
• Developed and maintained over 6 data models across multiple projects, ensuring consistency and accessibility
• Facilitated the seamless replication of critical databases across different geographical locations, boosting data redundancy
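Distribution-style and sort-key tuning of the kind described above is expressed in Redshift table DDL. The sketch below is purely illustrative - the table and column names are hypothetical, not taken from the source:

```sql
-- Hypothetical fact table: DISTKEY co-locates rows that join on
-- customer_id on the same slice; SORTKEY lets range filters on
-- event_date skip blocks during scans.
CREATE TABLE fact_events (
    event_id    BIGINT,
    customer_id BIGINT,
    event_date  DATE,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)
SORTKEY (event_date);
```

Choosing the join column as DISTKEY avoids cross-node data shuffles, while a date SORTKEY benefits the time-bounded queries typical of reporting workloads.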
AWS Solutions Architect, Verizon, Jul 2015 - Nov 2022, Basking Ridge, NJ, US
• Designed and implemented cost-effective, efficient and secure cloud architectures on AWS
• Deployed and managed AWS services including, but not limited to, VPC, Route 53, ELB, EBS, EC2, S3, DynamoDB, RDS and Redshift
• Monitored and optimized cloud infrastructure performance using AWS CloudWatch and CloudTrail
• Set up correct IAM policies and the CloudWatch agent on EC2 instances to route logs to CloudWatch
• Set up composite alarms using CloudWatch
• Set up CloudWatch alarms for EC2 instance recovery and alerts to SNS topics
• Developed basic Lambda functions using Python
• Created and maintained AWS Redshift data warehouse clusters
• Designed and implemented disaster recovery plans using AWS Backup, multi-AZ features, Route 53, CloudFormation and best practices
• Analyzed and troubleshot cloud infrastructure issues
• Responsible for ETL design using Glue, data pipelines from multiple sources, a data lake using S3 and Lake Formation, and a data warehouse using Redshift
• Created Oracle PL/SQL stored procedures, functions and packages for moving data from the staging area to the data mart
• Created Oracle indexes on tables for better query performance
• Used Oracle bulk processing for better performance
• Partitioned the Oracle fact tables and MVs to enhance performance
• Loaded data using SQL*Loader and external tables
• Wrote and implemented the IAM policies required by each service
• Designed and implemented cloud-based monitoring and logging solutions
• Created an AWS Data Pipeline to consolidate operational data from multiple data sources into a centralized location feeding multiple dashboards
• Wrote reference architecture, best practices and thought-leadership whitepapers; identified and enforced AWS architectural best practices
• Created POCs for migrating existing on-prem applications to AWS
• Delivered descriptive reports by extracting data from client systems using SQL, and designed user screens for ad-hoc financial reporting
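A basic Lambda function of the kind mentioned in this role might look like the following minimal Python sketch. The S3-notification event shape is AWS's standard one, but the handler's key-counting logic and all names are hypothetical, not from the source:

```python
import json


def lambda_handler(event, context):
    """Minimal sketch of an S3-triggered Lambda handler: collects the
    object keys delivered in the event and returns a small summary."""
    records = event.get("Records", [])
    # Each S3 notification record carries the bucket/object that fired it.
    keys = [r["s3"]["object"]["key"] for r in records if "s3" in r]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(keys), "keys": keys}),
    }
```

Locally the handler can be exercised by passing a hand-built event dict and `None` for the context, which is a common way to unit-test Lambda code before deployment.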
Data Architect & Tech Manager, Epsilon, Nov 2012 - Jul 2015, Irving, Texas, US
• Requirement gathering and analysis
• Created logical and physical data models using ERwin
• Designed dashboards and scorecards showing operational performance across different markets, variance between actuals and plan, and trend analysis to find process-improvement opportunities
• Automated weekly reports used by the product and operations teams to track and monitor performance
• Designed a customized accounting report using complex SQL queries, produced from the system on the 1st and 2nd day after month end for multiple legal entities based on user inputs
• Designed and developed various reports using Oracle PL/SQL stored procedures and UNIX scripts
• Performance tuning of Oracle queries and PL/SQL stored procedures
• Developed a forecast model using SQL and Tableau to forecast completions based on schedules, historical completions and other variables, predicting operational capacity and completions for different vendors
• Unit testing of Oracle stored procedures and UNIX scripts
• Collaborated with the business team on requirement elicitation to design complex reports that extract and validate the client's financial and operational data in SQL, integrating with SAP and Salesforce systems
• Version management of code and documents using GitHub
• Tracked and tuned slow scorecard jobs per their priority using Oracle PL/SQL and Business Objects
• Triggered and monitored test runs using Business Objects Data Integrator
• Reported the status of performance improvements to management on a weekly basis
• Designed and developed ETL jobs using Pentaho/Kettle and Oracle MVs
• Implemented a Master Data Management (MDM) and Data Governance rules engine
• As an Agile team member, followed the Agile methodology: attended daily team meetings, estimated the size of backlog items, evaluated technical feasibility, implemented backlog items, and wrote and verified code adhering to the acceptance criteria
Sr. PL/SQL Developer, Cricket Communications, Apr 2010 - Oct 2012, Atlanta, Georgia, US
• Designed and developed stored procedures using Oracle PL/SQL
• Tuned slow scorecard jobs per their priority using Oracle PL/SQL and Business Objects
• Systems integration
• Triggered and monitored test runs using Business Objects Data Integrator
• Designed and developed ETL jobs using Business Objects Data Integrator
• Unit tested Oracle stored procedures and Business Objects ETL jobs
Sr. Database/ETL Developer, Bank of America, Sep 2009 - Mar 2010, Charlotte, NC, US
• Responsible for end-to-end development (LLD, development, unit testing, maintenance support) of a given BPM function based on the HLD using PL/SQL on Oracle 10g
• Extensive performance tuning using EXPLAIN PLANs, TKPROF and DBMS Profiler to meet the stringent SLAs
• Identified and modified time-consuming PL/SQL procedures to use bulk binding (BULK COLLECT, FORALL, collections) to reduce context switches between the PL/SQL and SQL engines
• Developed materialized views; created indexes; de-normalized and partitioned tables/indexes; rebuilt indexes; collected stats as required to improve query performance
• Simulated EXPLAIN PLANs from one environment to another using SQL Profiles and Outlines
• Responsible for replication of transactional data (from BPM) to the reporting server (BAM) using Oracle Streams
• Data modeling enhancements for Term Life Insurance products
• Generated and monitored AWR reports in the DEV, CIT, SIT and LT environments
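The bulk-binding pattern cited in this and later roles (BULK COLLECT plus FORALL, so the whole batch crosses the PL/SQL-to-SQL boundary once instead of once per row) typically follows the shape below. The table and column names are hypothetical, not from the source:

```sql
DECLARE
  -- Hypothetical staging table; adjust names to the real schema.
  TYPE t_ids IS TABLE OF staging_orders.order_id%TYPE;
  l_ids t_ids;
BEGIN
  -- One fetch for the whole batch instead of a row-by-row cursor loop.
  SELECT order_id
    BULK COLLECT INTO l_ids
    FROM staging_orders
   WHERE status = 'READY';

  -- One context switch for all rows in the collection.
  FORALL i IN 1 .. l_ids.COUNT
    UPDATE orders
       SET status = 'LOADED'
     WHERE order_id = l_ids(i);

  COMMIT;
END;
/
```

For very large batches, BULK COLLECT is usually combined with a LIMIT clause inside a loop to bound PGA memory use.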
Sr. PL/SQL Developer, Cricket Wireless, Feb 2009 - Aug 2009, Atlanta, Georgia, US
• Designed, developed and unit tested PL/SQL procedures, functions and packages for PM using Oracle 10g
• Identified and modified time-consuming PL/SQL procedures to use bulk binding (BULK COLLECT, FORALL, collections) to reduce context switches between the PL/SQL and SQL engines
• Debugged PL/SQL routines using TOAD IDE visual checkpoints
• Developed complex SQL queries for trend-analysis report generation
• Resolved vendor historical feed file issues and loaded the files into PIN Management
• Loaded failed vendor feed files using external tables to identify data issues
• Tuned PL/SQL procedures, functions and packages related to the PM load process using DBMS Profiler
• Tuned SQL queries using EXPLAIN PLAN and TKPROF
• User access creation and management; cloned the production database into the dev and test environments for development and testing
PL/SQL Developer to Oracle DBA, Tata Consultancy Services, Jan 2001 - Nov 2008, Mumbai, Maharashtra, India
Clients: Nortel Networks (Durham, USA), AT&T Wireless (Seattle, USA), Cingular Wireless (Seattle, USA), Oman Telecommunications (Oman & India)
• Data modeling (conceptual, logical and physical) and Oracle physical DB creation for the integrated Case Mart, CURE Web and CODS environments using Oracle 10g
• Designed, developed and tested PL/SQL procedures, functions and packages
• Designed and implemented views to safeguard data, and triggers to implement auditing and replication
• Developed ETL (Extract, Transform and Load) routines using PL/SQL for data extraction, validation, cleansing, transformation and loading
• Modified time-consuming PL/SQL procedures to use bulk binding (BULK COLLECT, FORALL, collections) to reduce context switches
• Tuned SQL queries (using EXPLAIN PLAN, TKPROF) and PL/SQL procedures, functions and packages (using DBMS_PROFILER), and generated/monitored STATSPACK reports
• Loaded data using SQL*Loader and direct-load inserts; migrated database schemas/tables using Import/Export
• Extensive design, development and support of BO XI universes and reports using BO Designer, DeskI/WebI and InfoView
• Performance tuning of BO XI reports at the universe level using derived summary views, hints, etc.
• UNIX shell scripting for file loading, file manipulation and job scheduling through crontab
• Responsible for database backup and recovery using RMAN
• Performed day-to-day database administration tasks: tablespace management, user password management, ASM space/disk addition, privileges and grants, etc.
• Planned Oracle PSU patching and applied patches to prod and non-prod databases
• Performance tuning: SGA, initialization parameters, slow database response, redo log re-sizing, ADDM and AWR reports
Krishna Chaitanya Dhenuva Konda Skills
Krishna Chaitanya Dhenuva Konda Education Details
- Osmania University, Computer Science
- A.P.R.S. Enkoor, S.S.C.
- A.P.R.S
- A.P.R.S Enkoor
Frequently Asked Questions about Krishna Chaitanya Dhenuva Konda
What company does Krishna Chaitanya Dhenuva Konda work for?
Krishna Chaitanya Dhenuva Konda works for Vanguard
What is Krishna Chaitanya Dhenuva Konda's role at the current company?
Krishna Chaitanya Dhenuva Konda's current role is Data Architect.
What is Krishna Chaitanya Dhenuva Konda's email address?
Krishna Chaitanya Dhenuva Konda's email address is kr****@****zon.com
What is Krishna Chaitanya Dhenuva Konda's direct phone number?
Krishna Chaitanya Dhenuva Konda's direct phone number is +142544*****
What schools did Krishna Chaitanya Dhenuva Konda attend?
Krishna Chaitanya Dhenuva Konda attended Osmania University, A.p.r.s.enkoor, A.p.r.s, A.p.r.s Enkoor.
What skills is Krishna Chaitanya Dhenuva Konda known for?
Krishna Chaitanya Dhenuva Konda has skills like Data Warehousing, PL/SQL, Oracle, ETL, SDLC, Data Modeling, Databases, SQL, Business Intelligence, Performance Tuning, Database Design, Data Warehouse Architecture.
Who are Krishna Chaitanya Dhenuva Konda's colleagues?
Krishna Chaitanya Dhenuva Konda's colleagues are Lisa Mariani Riter, MBA, CFP®, Kyle T Mahady, Tevin Edwards, Pooja Malepati, Brian Brickman, Thomas Metz, Pat Green.