Nagaraju Varkala
• Certified Bigdata and Python/Java expert with 18 years of experience in Bigdata - Data Lake, Enterprise Data Hub, data aggregation, streaming data analysis, network analysis, Java/J2EE applications, microservices and machine learning
• Hands-on experience with a machine-learning document classification use case: preparing models using LibLinear and SVM algorithms, SVMlight, Tesseract OCR, Python, Spark, Java and Spring Boot microservices on a Bigdata CDH cluster environment
• Hands-on experience in Bigdata analysis using Hadoop, HDFS, MapReduce, Hive, Spark, Neo4j, GraphX, Kafka, HBase, Solr, Scala, Avro, Parquet, Sqoop and Cloudera Manager
• Hands-on experience designing and developing Bigdata Enterprise Data Lake and Enterprise Data Hub platforms and streaming data pipelines using Kafka, Spark, Hive, Solr and microservices
• Designed and developed a Dataflow framework for a data aggregator application with Spark Streaming, Kafka, Solr and microservices
• Designed and developed network analysis using Spark, Neo4j and D3 UI tools
• Experience in Java, J2EE technologies, web services, WebLogic, Oracle, TFS and Linux
• Strong experience creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Spark Streaming and Kafka
• Experience importing and exporting data with Sqoop from HDFS to relational database systems (RDBMS) and from RDBMS to HDFS
• Expertise in writing MapReduce programs and UDFs to incorporate complex business logic into Hive queries when performing high-level data analysis using Hive and Pig
• Experience with Avro, Parquet and NoSQL technologies like HBase and Cassandra
• Experience designing time-driven and data-driven automated Oozie workflows
• Hands-on experience in Spring Boot and microservices development using REST
• Hands-on experience in Spring, Hibernate, MVC architecture and the Struts framework
• Experience in JSP, Servlets, JDBC, SOAP, XSD, XML, AJAX, ANT, JUnit and TestNG
• Experience in web services: RESTful, SOAP, WSDL, Apache Axis2, JAXB and XMLBeans
• Experience in multithreading, socket programming, and Java and J2EE design patterns
• Hands-on experience developing iText reports and JFreeChart charts
• Good experience with relational databases like Oracle, MySQL and SQL Server
• Experience with application servers like WebLogic, WebSphere and JBoss
• Experience in cloud computing: SaaS-based J2EE applications and IaaS
• Worked in a Java COE group developing frameworks and providing solutions to application teams
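The document-classification bullets above describe training LibLinear/SVM models over OCR'd text. As an illustration of the pipeline shape only (tokenize, vectorize, score against per-class references), here is a stdlib-only nearest-centroid sketch; the function names, labels and sample documents are invented, and the real system used SVM models, not this stand-in:

```python
import math
from collections import Counter

def tokenize(text):
    # Lowercase word tokens; a real pipeline would also clean up OCR noise.
    return [w for w in text.lower().split() if w.isalpha()]

def tf_vector(text):
    # Bag-of-words term-frequency vector.
    return Counter(tokenize(text))

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train_centroids(labeled_docs):
    # labeled_docs: list of (label, text); centroid = summed term counts per class.
    centroids = {}
    for label, text in labeled_docs:
        centroids.setdefault(label, Counter()).update(tf_vector(text))
    return centroids

def classify(text, centroids):
    # Assign the label whose centroid is most similar to the document.
    vec = tf_vector(text)
    return max(centroids, key=lambda lbl: cosine(vec, centroids[lbl]))
```

A usage example: `classify("amount due on this invoice", train_centroids([("invoice", "invoice total amount due"), ("contract", "agreement party terms signed")]))` returns `"invoice"`.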
Founder and Technical Director, Skillciti, Hyderabad, India
Founder, Aegletek, Dec 2019 - Present, Hyderabad, Telangana, India
Bigdata and ML Engineer, Intelliswift Software, Aug 2021 - Nov 2023, San Francisco Bay Area
Migrating Bigdata ETL jobs developed on Spark, Hive, Cloudera CDP, Apache Solr and Java Spring Boot microservices applications to the Amazon AWS cloud.
✔ Developing NLP modules to process data for different ML algorithm use cases.
✔ Developed web crawling and web scraping modules for reading and processing data from various portals.
✔ Working on the data ingestion process using Sqoop for Oracle imports and exports, and Kafka for streaming data ingestion.
✔ Developing a Java framework to index and search data from Solr.
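The web-scraping bullet above does not specify an implementation; as a minimal sketch of the extraction step only, here is HTML link-and-text extraction using Python's stdlib `html.parser` (a real crawler would add HTTP fetching, politeness delays and URL deduplication, none of which are shown):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets and visible text from an HTML page.

    A stand-in for the crawling modules described above; class and
    method names are illustrative, not the project's actual code.
    """
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Record every anchor's href attribute.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep non-empty visible text fragments.
        stripped = data.strip()
        if stripped:
            self.text_parts.append(stripped)

def scrape(html):
    """Return (links, visible_text) extracted from an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links, " ".join(parser.text_parts)
```

For example, `scrape('<p>Visit <a href="/jobs">jobs page</a> now</p>')` returns `(["/jobs"], "Visit jobs page now")`.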
Solutions Architect, Soothsayer Analytics, Jul 2020 - Aug 2021, Hyderabad, Telangana, India
✔ Provided overall architecture responsibilities including roadmaps, planning, technical innovation, security and IT governance.
✔ Presented and advocated the design architecture to the various stakeholders (customer, server, network, security and other teams).
✔ Produced architecture and implementation artifacts, including documentation, diagrams and presentations, for technical and business audiences.
✔ Acting as a solutions architect, provided technical and process leadership for projects, defining and documenting information integrations between systems and aligning project goals with the reference architecture.
✔ Led an end-to-end Dataiku and Hadoop implementation in a large enterprise environment, integrating with multiple applications across heterogeneous technologies.
✔ Provided technical leadership and governance of the big data team and of the implementation of the solution architecture across the Hadoop ecosystem (Hadoop (Cloudera), Hive, Impala, HBase, Hue; security: Kerberos, HDFS encryption), Python, PySpark and Dataiku Data Science Studio.
✔ Configured and tuned production and development Dataiku and Hadoop environments with the various intermixing Dataiku and Hadoop components.
Bigdata and ML Engineer, Intelliswift Software, Mar 2017 - Nov 2019, San Francisco Bay Area
✔ Designed and developed Data Lake & Data Hub architecture for fraud analysis, data aggregation and network analysis use cases.
✔ Worked on a document classification machine-learning project: document OCR, model preparation and deployment.
✔ Designed and developed an Enterprise Data Lake and Enterprise Data Hub platform using Bigdata tools: Hadoop, Sqoop, Hive, Spark, Kafka, Impala, HBase and Solr on the Cloudera distribution.
✔ Designed and developed streaming applications using Kafka, Spark Streaming and Solr.
✔ Developed a Dataflow framework for a data aggregator application using Kafka and microservices.
✔ Responsible for building scalable distributed data pipelines using Bigdata tools and technologies like Sqoop, Hive, Impala, HBase, Spark, Kafka and Solr.
✔ Worked on the Neo4j graph database and Spark GraphX to create provider network analysis.
✔ Worked on Spring microservices development to publish Solr data to other applications.
✔ Worked on a REST API using Java/J2EE to search and pull data from Solr indexes.
✔ Worked on the end-to-end data pipeline process of data ingestion, data cleansing and data transformations for three services.
✔ Worked on Bigdata job scheduling using the BMC Control-M job scheduler.
✔ Worked on the data ingestion process using Sqoop for Oracle imports and exports, and Kafka for streaming data ingestion.
✔ Implemented a central Sqoop job that can connect to different RDBMS systems to pull data and load it into the Bigdata cluster's Hadoop file system (HDFS).
✔ Performed Data Integrity (DI) checks on the imported data using MapReduce programs.
✔ Used Avro and Parquet SerDe formats with Snappy compression to store the data.
✔ Involved in parsing JSON files using the Jackson parser library.
✔ Built reusable UDFs for business requirements, enabling developers to use them in data parsing and aggregation.
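The ingestion, cleansing and transformation stages described above ran on Spark, with Jackson for JSON parsing. The same stage shape can be sketched in plain Python with the stdlib `json` module; the field names (`id`, `amount`, `region`) are illustrative assumptions, not the project's actual schema:

```python
import json

def cleanse(records):
    """Data-cleansing stage: drop malformed rows, normalize fields.

    `records` is an iterable of raw JSON strings, as they might arrive
    from an ingestion topic. Malformed payloads are rejected, not fixed.
    """
    clean = []
    for raw in records:
        try:
            rec = json.loads(raw)
        except json.JSONDecodeError:
            continue  # not valid JSON
        if "id" not in rec or "amount" not in rec:
            continue  # missing required fields
        rec["amount"] = float(rec["amount"])  # normalize to a number
        clean.append(rec)
    return clean

def aggregate(records, key="region"):
    """Transformation stage: group-and-sum, the plain-Python analogue
    of a Spark groupBy(key).sum("amount") step."""
    totals = {}
    for rec in records:
        group = rec.get(key, "unknown")
        totals[group] = totals.get(group, 0.0) + rec["amount"]
    return totals
```

On three sample rows, two valid and one garbled, `aggregate(cleanse(rows))` keeps only the valid records and sums their amounts per region.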
Consultant, Dell, Nov 2013 - Jan 2017
• Requirement analysis; prepared context and architecture diagrams for the proposed systems
• Developed a Kafka producer-based REST API to collect order and product payload data and send it to a Spark Streaming app
• Designed and built Spark jobs to consume order & product data from Kafka and read Parquet
• Worked on Spark SQL and the Spark HBase Connector API to read and write HBase data
• Efficiently organized, managed and analysed 100 TB of data and simplified data access to support reporting, predictive analysis and statistical modelling
• Designed and implemented a REST API to invoke the Spark job and the HBase data model
• Loaded structured data into Hive tables and wrote Hive scripts to analyse the data
• Developed Sqoop jobs to transfer data from several data sources such as Oracle to HDFS
• Wrote Hive scripts and UDFs that efficiently performed batch processing on HDFS to analyze and aggregate 100 TB of data
• Created RESTful web services to serve the data processed by Hadoop to external clients
• Configured Oozie workflows to pull log data from FTP servers and transfer it to HDFS for analysis
• Monitored the clusters efficiently using Cloudera Manager and performed testing to verify data transformations
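The Spark Streaming jobs above aggregated order and product events consumed from Kafka. The core windowed-count idea can be reduced to a single-process stdlib sketch; the class, its window semantics and the event keys are illustrative, not the actual Dell implementation:

```python
from collections import deque, Counter

class SlidingWindowCounter:
    """Counts events per key over the last `window_seconds` seconds.

    A toy analogue of a Spark Streaming windowed countByValue over a
    Kafka event stream, kept in one process for illustration.
    """
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key), in arrival order

    def add(self, timestamp, key):
        # Record an event, then drop anything that has fallen out of the window.
        self.events.append((timestamp, key))
        self._evict(timestamp)

    def _evict(self, now):
        # Evict events at or beyond the trailing window edge.
        while self.events and self.events[0][0] <= now - self.window:
            self.events.popleft()

    def counts(self):
        # Per-key counts for events still inside the window.
        return Counter(key for _, key in self.events)
```

With a 60-second window, two "order" events at t=0 and t=30 are both counted at t=30, but have aged out by the time a "product" event arrives at t=90.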
Senior Consultant, CSC, Aug 2011 - Nov 2013
UCCORCH is a cloud-computing-based IaaS (Infrastructure as a Service) product that provides clients with all infrastructure-related services. The product includes many automation tools, developed by the CSC team, that provision new customer organizations for various services, including Bigdata analysis using Hadoop, Exchange, SharePoint, Lync and Mobility Services.
• Developed automation tools.
• Wrote algorithms for Exchange and SharePoint provisioning.
• Developed the client provisioning portal.
• Developed batch jobs.
Sr. Consultant, Ford Motor Company, Aug 2011 - May 2013
Role: Senior Developer
Compliance with Regulations Information System (CRIS) is a Ford application used to view and maintain the corporate database for compliance with government emissions and fuel-economy regulations. It is used to create emission certification and Corporate Average Fuel Economy (CAFE) documents, along with fuel-economy labels for submission to the Environmental Protection Agency (EPA), Department of Transportation (DOT) and Transport Canada (TC).
Working module: Inter-Facility Correlation (IFC). IFC is the application used to correlate test results between testing facilities and the Federal government. This correlation applies statistical methods designed to detect anomalies and trends in the test data, which in turn can help identify problems in testing facilities, equipment or methods. Due to the self-certifying nature of US fuel-economy and emissions regulations, this correlation function is also a regulatory requirement for automotive manufacturers in the United States.
Responsibilities: Design and coding of the module. Technical inspection for team members. Developed charts for various modules. Coordinated with the onsite team. Developed components for the Ford framework. Worked on production support and bug fixing.
Environment: Struts 1.2, Ford Core J2EE Framework, JSP, Servlets, AJAX, XML, JavaScript, Axis2, RSA, WAS, Oracle 10g, TopLink, iText Framework, JFreeChart, Windows XP.
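The IFC module applies statistical methods to detect anomalies in inter-facility test data; the exact methods are not named above, but one common approach to this kind of check is z-score outlier flagging, sketched here with only the stdlib (the threshold and sample data are illustrative):

```python
import math

def z_scores(values):
    """Standard scores for a series of test measurements."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = math.sqrt(var)
    if sd == 0:
        return [0.0] * len(values)  # all measurements identical
    return [(v - mean) / sd for v in values]

def flag_anomalies(values, threshold=2.0):
    """Indices of measurements more than `threshold` standard deviations
    from the mean. A generic outlier check of the kind a correlation
    application might run, not the specific CRIS algorithm."""
    return [i for i, z in enumerate(z_scores(values)) if abs(z) > threshold]
```

For example, in a series of seven readings near 10.0 with one at 50.0, only the last index is flagged; a perfectly flat series flags nothing.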
IT Analyst, TCS, Nov 2008 - Jun 2011
Role: Senior Developer
TCS iON - SMB Solutions is a multi-tenant, cloud-computing SaaS solution: a complete pack of common office and business applications for small and medium business organizations, providing Manufacturing, Payroll, HRMS, Finance and Accounting, CRM, Document Management, Community Management, Messaging applications, etc.
Responsibilities: Code integration with other modules. Creating metadata for modules. Code review of team members. Organization provisioning and configuring user permissions for various modules. Involved in development of the TCS Framework. Code tuning and bug fixing.
Environment: Spring, Hibernate, Struts 1.2, TCS Framework, ORM, JSP, Servlets, AJAX, SOAP, WSDL, Axis, JAX-RPC, JBoss, MySQL, XML, JavaScript, Eclipse, Pentaho Reports, Windows XP.
Software Developer, Market Place Technologies, Jan 2008 - Nov 2008
Role: Java Developer
Description: The OMS is a complete risk management system developed for the National Commodity and Derivatives Exchange. OMS is a web application that performs order validations and displays various details to the exchange user. It communicates with the warehouse and payment gateway to complete order validations, and exchange users can also perform BOD and EOD operations through it. OMS has various modules such as Margins, Funds Transaction, Order Messages, EOD, Warehouse Operations, Funds History, Batch Jobs and Offline Trades.
Responsibilities: As a developer, involved in the design phase and coding of certain modules. Developed web services components. Code review and integration for the team. Application deployment on the client server. Code release of new versions and code moves at the client site. VSS coordination for the team. Fixed bugs.
Environment: Struts 1.2, JSP, Servlets, AJAX, SOAP, WSDL, Axis, JAX-RPC, Tomcat, Oracle 9i, XML, JavaScript, MyEclipse, Log4j, Windows XP.
Software Developer, NCDEX, Jan 2008 - Sep 2008
Role: Java Developer on the same OMS risk management system described in the previous entry.
Java Developer, United Airlines, Jan 2007 - Dec 2007
Role: Java Developer
Perks Plus is a public web site built for United and Lufthansa Airlines. Its purpose is to drive revenue by bringing corporations and travel agents under the same umbrella. Corporations book flights to send their employees to locations across the globe; the application helps them directly update their profiles, redeem points online and view monthly performance reports. The more they fly, the more mileage points they accrue, and the more they can redeem toward benefits and promotions. Travel agents act as a bridge between United and the corporations and earn benefits by delivering business to United.
Responsibilities: Involved in the Account, Redeem Points, Agency and Corporate modules: development of Actions and implementation of DAO classes. Maintained CVS activities for the team. Documentation work for UTPs.
Environment: Struts 1.2, Java Server Pages, WebLogic 9.2, Oracle 9i, XML, JavaScript, MyEclipse, AJAX, Log4j, Windows XP.
Skills
Java Enterprise Edition, Web Services, Software Project Management, Service Oriented Architecture, Software Development Life Cycle, Core Java, Servlets, Spring Boot, Maven, Apache Spark, Apache Kafka, Solution Architecture
Education Details
Osmania University - First Class with Distinction