Nag Arvind Gudiseva Email and Phone Number
• Over 13 years of experience in the Retail, Insurance and Contact Centre domains, specializing in IT Services Management and in software design and development of application frameworks using web technologies.
• Around 5 years in Big Data technologies and analytics, with a background in Java application programming and software engineering.
• Agile product development for production of Data Science modules.
• Built Data Science solutions following a microservices architecture for (a) stop-words lookup with Redis, Scala, Akka HTTP and a REST API using Apache Spark; (b) FuzzyWuzzy string matching / Levenshtein distance with a Jersey REST API.
• Created a search and page-ranking REST API using the Lucene library for high-performance keyword and key-phrase search.
• Designed and developed a Hive Server 2 query tool for Cisco internal usage and evaluation.
• Prepared a Computer Based Tutorial (CBT) on Scala for Cisco internal trainings.
• Led an organization-wide initiative to convert the Pentaho Data Integration ETL tool to a Scala workflow, saving on licence renewal costs.
• Designed data pipelines and processing workflows with Big Data technology stack recommendations.
• Installation, configuration and implementation of Hadoop ecosystems to process and analyze store returns from various multi-channel retail applications.
• Architectural suggestions with framework design; prepared proofs of concept and applied them in production.
• Managed Scrum teams in cross-functional environments with a successful track record of project deliverables, handling development, testing and support functions.
• Demonstrated knowledge of Agile methodologies by customizing Scrum with Lean and defining stages to limit Work in Progress.
• Proven experience in requirements gathering, overseas client engagement, project transitions, risk, disaster and seasonal planning, service introduction, fit-for-service, service improvements, root cause analysis and post-mortems.
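The FuzzyWuzzy string matching mentioned above is built on Levenshtein edit distance. A minimal sketch of that distance and a simplified similarity ratio (not FuzzyWuzzy's exact SequenceMatcher-based formula) might look like this:

```java
import java.util.stream.IntStream;

public class Levenshtein {
    // Classic dynamic-programming edit distance, two-row variant.
    public static int distance(String a, String b) {
        int[] prev = IntStream.rangeClosed(0, b.length()).toArray();
        for (int i = 1; i <= a.length(); i++) {
            int[] curr = new int[b.length() + 1];
            curr[0] = i;
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                curr[j] = Math.min(Math.min(curr[j - 1] + 1, prev[j] + 1),
                                   prev[j - 1] + cost);
            }
            prev = curr;
        }
        return prev[b.length()];
    }

    // A simplified 0-100 similarity ratio: 100 means identical strings.
    public static int ratio(String a, String b) {
        int maxLen = Math.max(a.length(), b.length());
        if (maxLen == 0) return 100;
        return (int) Math.round(100.0 * (maxLen - distance(a, b)) / maxLen);
    }

    public static void main(String[] args) {
        System.out.println(distance("kitten", "sitting")); // 3
        System.out.println(ratio("apple inc", "apple incorporated"));
    }
}
```

Wrapping `ratio` behind a Jersey REST resource, as described above, would be a thin layer over this core computation.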
Sagarsoft (India) Ltd
- Website: sagarsoft.in
- Employees: 218
Senior Architect, Data Analytics
Sagarsoft (India) Ltd · Aug 2018 - Present · Hyderabad Area, India
Customer: Point72 Asset Management, USA
Role: Technical Architect
● Implemented a security strategy for the ELK Stack. With controlled access to Elasticsearch and Kibana, users are prevented from unauthorized access to data across departments within the same organization.
● FpML trades data (XML format) is consumed from Kafka topics, parsed using XQuery and XPath within Apache NiFi, and loaded into the Apache Phoenix NoSQL database. A few of the trades were transposed using an XSL template.
● Processed large and complex XML files using StAX (Streaming XML) in Pentaho Data Integration, resolving the out-of-memory issue with the Carte server.
● Implemented rolling logs using the Log4j framework for Pentaho application logs and Carte server logs. Logs are zipped, archived and time-stamped, with housekeeping.
● AVRO messages consumed from Kafka topics are deserialized in the Apache NiFi data pipeline with the Hortonworks Schema Registry ID and converted to JSON format.
● Created Apache NiFi custom processors (plugins): (a) to read the Kafka AVRO key with Base64 encoding as an attribute; (b) to read the Kafka timestamp as an attribute; (c) to convert attributes to a FlowFile; (d) to evaluate null JSON paths.
● Data migration from the Apache Phoenix NoSQL database to Microsoft SQL Server with MERGE in real time using Apache NiFi.
● Data validation between the Apache Phoenix NoSQL database and Microsoft SQL Server using a Pentaho transformation.
● Parsed confidential PDF files to CSV using the Apache Tika library in Pentaho and regex in a User Defined Java Class.
● Developed a Pentaho custom plugin for Elasticsearch bulk load over the Elasticsearch REST API, with options to set batch size and timeout. Indexed 6 million documents with a 20x performance improvement.
● Added resilience to the Pentaho REST step plugin with configurable max retries and wait time.
● Privileged security access to MS SQL with Kerberos and to MongoDB with CyberArk.
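StAX, mentioned above for the large-XML fix, pulls events off a stream instead of building a DOM, so memory stays flat regardless of file size. A small sketch (the `tradeId` element name is an assumption for illustration, not the actual FpML schema):

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class StaxTradeReader {
    // Extracts <tradeId> text values without materializing the document.
    public static List<String> tradeIds(String xml) throws Exception {
        XMLStreamReader reader = XMLInputFactory.newFactory()
                .createXMLStreamReader(new StringReader(xml));
        List<String> ids = new ArrayList<>();
        boolean inTradeId = false;
        while (reader.hasNext()) {
            int event = reader.next();
            if (event == XMLStreamConstants.START_ELEMENT
                    && reader.getLocalName().equals("tradeId")) {
                inTradeId = true;
            } else if (event == XMLStreamConstants.CHARACTERS && inTradeId) {
                ids.add(reader.getText().trim());
            } else if (event == XMLStreamConstants.END_ELEMENT) {
                inTradeId = false;
            }
        }
        reader.close();
        return ids;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<trades><trade><tradeId>T-1</tradeId></trade>"
                   + "<trade><tradeId>T-2</tradeId></trade></trades>";
        System.out.println(tradeIds(xml)); // [T-1, T-2]
    }
}
```

In production the `StringReader` would be replaced by a stream over the file, which is what keeps the Carte server's heap bounded.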
Technical Architect
Motorola Solutions · Nov 2017 - Aug 2018 · Bengaluru Area, India
● Designed a data services API for information retrieval of Call Detail Records (CDRs) using Elasticsearch and the Jersey RESTful web services framework, secured with JSON Web Tokens (JWT).
● Extended PostgreSQL with Bi-Directional Replication for multi-master operation, easy failover and high availability of Hive / Impala metadata. Load tested the application by simulating CRUD operations using the Java Concurrency Framework, Hibernate and C3P0 connection pooling.
● Configured near-real-time metrics / KPI monitoring and time series analysis with Graphite and Grafana dashboards.
● Designed a security solution for Grafana dashboards and front-end custom visualizations with Keycloak OIDC and JWT.
● Configured Hadoop high availability for the NameNode and ResourceManager. Enabled HA for Hive with dynamic service discovery. Prepared the data migration strategy for the upgrade.
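The JWT protection described above boils down to an HMAC over `header.payload` that the API recomputes on every request. A minimal HS256 sign/verify sketch using only the JDK (real services would use a vetted library and key management; the claim content here is illustrative):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class JwtHs256 {
    private static final Base64.Encoder B64 = Base64.getUrlEncoder().withoutPadding();

    // Produces header.payload.signature with HMAC-SHA256 (the HS256 alg).
    public static String sign(String payloadJson, byte[] secret) throws Exception {
        String header = B64.encodeToString(
                "{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes(StandardCharsets.UTF_8));
        String payload = B64.encodeToString(payloadJson.getBytes(StandardCharsets.UTF_8));
        String signingInput = header + "." + payload;
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret, "HmacSHA256"));
        String sig = B64.encodeToString(mac.doFinal(signingInput.getBytes(StandardCharsets.UTF_8)));
        return signingInput + "." + sig;
    }

    // Recomputes the signature and compares in constant time.
    public static boolean verify(String token, byte[] secret) throws Exception {
        int lastDot = token.lastIndexOf('.');
        String signingInput = token.substring(0, lastDot);
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret, "HmacSHA256"));
        byte[] expected = mac.doFinal(signingInput.getBytes(StandardCharsets.UTF_8));
        byte[] actual = Base64.getUrlDecoder().decode(token.substring(lastDot + 1));
        return MessageDigest.isEqual(expected, actual);
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "demo-secret".getBytes(StandardCharsets.UTF_8);
        String token = sign("{\"sub\":\"cdr-api\"}", secret);
        System.out.println(verify(token, secret)); // true
    }
}
```

A Jersey filter would run `verify` on the `Authorization: Bearer` header before any CDR query reaches Elasticsearch.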
Technical Specialist
CSS Corp · Jul 2016 - Nov 2017 · Bengaluru Area, India
Track #1: Social Media Analytics
Customer data is extracted from external and internal sources such as blogs, social media platforms, e-commerce websites and Bazaarvoice APIs; usable insights are generated for products.
• Scraped websites using the JSoup API in Scala and performed sentiment analysis.
• Used the Spark SQL DataFrame API for text cleaning in a Natural Language Processing pipeline.
• Extracted YouTube comments in JSON format using the YouTube Data API v3.
• Architected and set up an AWS EMR cluster with Amazon S3 storage for MapReduce jobs and Hive external tables.
Track #2: Virtual Assistant Platform (YODAA)
Engages consumer tech-support customers in conversation to deliver self-help solutions.
• Architected the bot application using the Big Data technology stack and designed the data flow for the information retrieval system.
• Developed an application for page ranking of documents using Bag of Words and TF-IDF.
• Developed a Scala Akka HTTP REST API to identify stop words, which are loaded into in-memory Redis.
• Optimized Data Science modules by creating a Jersey REST API for the FuzzyWuzzy string ratio and Levenshtein distance calculation.
Track #3: Analytics as a Service Platform for ServiceNow
Provides actionable insights to customers by gathering system alerts, events and incidents data.
• Architected the ServiceNow - Active Insights data integration.
• Developed data ingestion with a RESTful API, JAXB, Hibernate and MS SQL Server ETL.
• Designed Tableau dashboards for metrics (MTTR, MTTF, MTBF).
Proof of concept:
• Architected (a) an enterprise application for device log analytics; (b) a desktop application for Home SOC.
• Demonstrated Google Cloud Platform capabilities by developing connectors for (a) migration of on-premises data to Google Cloud Storage; (b) real-time ingestion of live streaming data to Google Bigtable; (c) analysis of large datasets with Google BigQuery.
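The TF-IDF page ranking in Track #2 weights a term by how often it appears in a document, discounted by how many documents contain it. A minimal single-term sketch (the sample documents are invented for illustration):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TfIdf {
    // Scores one term per document: tf(t,d) * log(N / df(t)).
    public static Map<Integer, Double> score(String term, List<List<String>> docs) {
        long df = docs.stream().filter(d -> d.contains(term)).count();
        double idf = df == 0 ? 0.0 : Math.log((double) docs.size() / df);
        Map<Integer, Double> scores = new HashMap<>();
        for (int i = 0; i < docs.size(); i++) {
            List<String> doc = docs.get(i);
            double tf = doc.stream().filter(term::equals).count() / (double) doc.size();
            scores.put(i, tf * idf);
        }
        return scores;
    }

    public static void main(String[] args) {
        List<List<String>> docs = List.of(
            List.of("reset", "router", "manual"),
            List.of("router", "router", "setup"),
            List.of("printer", "driver", "setup"));
        System.out.println(score("router", docs));
    }
}
```

Ranking a query then amounts to summing these scores over the query terms for each document and sorting.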
Hadoop Lead at SummitWorks Technologies Pvt Ltd
Cisco · Jun 2015 - Jul 2016 · Bengaluru, India
• Led the offshore analytics team of Hadoop and ETL developers.
• Designed data pipelines and processing workflows with technology stack recommendations.
• Architectural suggestions with framework design and proofs of concept.
• Active contributor in Product Management Review meetings.
Responsibilities:
• Re-designed the Pentaho Data Integration ETL tool as a Scala (SBT) workflow for automating data processing.
• Hive code optimization with octal delimiters, an intermediate-table approach, bucketing and compression techniques to speed up processing.
• MapReduce 1 (MRv1) to YARN migration for data ingestion of customer device files into Hive.
• HiveServer1 to HiveServer2 migration (with enhanced security using Beeline) for data processing of ingested data and export to an RDBMS.
• Release planning and capture of Agile metrics such as release burn-up, release defect trend and iteration burn-down.
• Static code analysis using Jenkins to handle technical debt.
Proof of concept:
• Apache Drill with Oracle, Hive and HBase to generate JSON using a REST API.
• HBase 2.0 API to perform CRUD operations using the Scala API.
• Added IntelliJ IDEA projects to Subversion version control.
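The octal delimiters mentioned above refer to Hive's default field separator, the non-printing Ctrl-A character (octal `\001`), which avoids collisions with commas or tabs inside field values. Parsing such a row outside Hive is a one-liner; the sample row below is invented:

```java
public class CtrlASplit {
    // Hive's default field delimiter is \001 (Ctrl-A), written \u0001 in Java.
    // The -1 limit keeps trailing empty fields instead of dropping them.
    public static String[] fields(String row) {
        return row.split("\u0001", -1);
    }

    public static void main(String[] args) {
        String row = "C9300\u000147.2\u0001"; // model, value, empty column
        System.out.println(fields(row).length); // 3
    }
}
```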
Scrum Master / Principal Software Engineer
Tesco Bengaluru · Aug 2010 - Jun 2015 · Bengaluru Area, India
• Managed multiple Scrum teams working in virtual and distributed locations.
• Kanban dashboard to track run activities.
• Handled technical escalations for 25 applications.
• Incident and problem management along with root cause analysis, post-mortems and continual service improvement.
• Adopted and adapted ITIL best practices in IT Service Management.
• SME in building service capability for business-critical applications.
Responsibilities:
• MapReduce code to parse the barcode from the product data feed.
• Identified and created relations in Pig for analyzing and transforming staging data.
• Designed the schema and built HiveQL for storing processed store-returns data in external Hive tables.
• Configured a remote metastore for Hive.
• Prepared Sqoop scripts to import / export processed data to an RDBMS.
• Housekeeping of Hadoop logs and temp files.
• Coached the Scrum team and guided it toward the Sprint goal.
• Accountable for the product increment, key milestones and release deliverables.
• Ensured code adherence to SOLID architectural principles.
• Complete object-oriented code with maintainability, readability and re-usability.
• Test Driven Development practicing Agile methodology and Lean principles.
• Continual service improvement with pound savings, business benefits, cost avoidance, and capital and revenue budget reduction.
Proof of concept:
• Real-time stream processing using Apache Storm and MongoDB for sentiment analysis.
• Hadoop 2.0 cluster setup on Ubuntu Linux.
• HBase 2.0 API to perform CRUD operations using a Java program.
• IIS log analysis using Log Parser, ingesting data into a database.
• E2E / performance testing using Apache JMeter to simulate concurrency and load when applying support fixes.
• Set up a test framework using Selenium IDE for automation testing.
• Set up and administered a Tortoise SVN repository for legacy applications.
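The barcode-parsing MapReduce job above reduces, per record, to a regex extraction that the mapper would run on each feed line. A sketch of that extraction step alone (the feed format and the EAN-13-style 13-digit assumption are illustrative, not the actual Tesco feed):

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BarcodeParser {
    // Assumes an EAN-13-style barcode: a standalone run of exactly 13 digits.
    private static final Pattern BARCODE = Pattern.compile("\\b(\\d{13})\\b");

    // A mapper would call this per line and emit (barcode, record) pairs;
    // this is just the extraction step, runnable standalone.
    public static Optional<String> extract(String feedLine) {
        Matcher m = BARCODE.matcher(feedLine);
        return m.find() ? Optional.of(m.group(1)) : Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(extract("SKU-42|Finest Tea|5012345678900|aisle 7"));
    }
}
```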
Technical Lead
Wipro Limited · Jan 2006 - Aug 2010 · Bengaluru Area, India
• Requirements analysis, project estimation, work breakdown structure, self-audit of the project plan.
• Preparation of technical artifacts such as the System Maintenance Technical Document (SMTD), Execution Process Document (EPD), client playback presentation and process adherence checklists.
• Implemented Lean and worked in an Agile methodology.
Responsibilities:
• Coding, unit and regression testing.
• Requirements gathering; project transition from onsite to offshore.
• Preparation of technical documentation.
• Analysis of work requests, work plan preparation, creation of tasks, task tracking and effort estimation.
• Design, development, implementation, testing, support and maintenance.
• Performance testing of remote Java applications in a managed services model.
• Resolved issues and independently handled the module from offshore.
IT Consultant
Logica · Oct 2005 - Jan 2006 · Bengaluru Area, India
Responsibilities:
• Software design and development of an application framework using web technologies.
• Coding and unit testing.
Sales and Service Engineer
Gvs Envicon Tegnologies Pvt. Ltd. · Jun 2002 - Jun 2003 · Chennai Area, India
• Technical sales and customer service in Southern India.
• Planned requirements for importing stock and provided regular reporting to overseas principals, including business forecasts and plans.
Nag Arvind Gudiseva Education Details
- Distinction
- Gitam School Of International Business - Distinction
- Bharatiya Vidya Bhavan's Public School, Hyderabad, India - Computer Science
Frequently Asked Questions about Nag Arvind Gudiseva
What company does Nag Arvind Gudiseva work for?
Nag Arvind Gudiseva works for Sagarsoft (India) Ltd
What is Nag Arvind Gudiseva's role at the current company?
Nag Arvind Gudiseva's current role is Senior Architect, Data Analytics at Sagarsoft (India) Ltd.
What is Nag Arvind Gudiseva's email address?
Nag Arvind Gudiseva's email address is gn****@****ail.com
What is Nag Arvind Gudiseva's direct phone number?
Nag Arvind Gudiseva's direct phone number is +170374*****
What schools did Nag Arvind Gudiseva attend?
Nag Arvind Gudiseva attended University Of Ballarat, Gitam School Of International Business, Shivaji University, Bharatiya Vidya Bhavan's Public School, Hyderabad, India, Kendriya Vidyalaya.
What are some of Nag Arvind Gudiseva's interests?
Nag Arvind Gudiseva has interests in bird watching, social services, adventure sports, education, snorkelling, jungle camping, science and technology, ice skating, and learning Big Data Hadoop technologies.
What skills is Nag Arvind Gudiseva known for?
Nag Arvind Gudiseva has skills like Requirements Analysis, Java Enterprise Edition, XML, Agile Methodologies, Java, SDLC, Oracle, Struts, Agile Project Management, Software Development, Requirements Gathering and Core Java.
Who are Nag Arvind Gudiseva's colleagues?
Nag Arvind Gudiseva's colleagues are Sachin Sadhvik, Manidhar Kommuru, D Swapna, Shiva Icelero, Surya Brahma, Chandra Batchu, Manasa Tangirala.