- Around 9 years of software development experience, with more than 5 years of experience in Big Data technologies and data modeling tools.
- A data enthusiast with broad experience across all stages of a data pipeline, working with a wide range of tools and technologies.
- Developed several creative solutions to business problems using a combination of programming languages and tools.
- Strong believer in a fail-fast approach and the Agile development model to maintain transparency and make business stakeholders part of the development process.
- Successfully architected several solutions, from ingestion through transformation to visualization of data.
- Developed Big Data solutions that enabled business and technology teams to make data-driven decisions on how best to enhance product development and deliver business solutions.
Principal Engineer, Procore Technologies | Sep 2023 - Present | Carpinteria, CA, US
Lead Engineer, Asana | Dec 2021 - Sep 2023 | San Francisco, CA, US
Senior Big Data Engineer, Autodesk | Jan 2014 - Dec 2021 | San Francisco, CA, US
- In-depth knowledge of Apache Hadoop ecosystem components: HDFS, MapReduce, YARN, Pig, Hive, Sqoop, Flume, Oozie.
- Experience working with a variety of NoSQL systems, including Cassandra and HBase.
- Efficient in writing MapReduce programs and using the Apache Hadoop API to analyze structured and unstructured data.
- Efficient in developing Spark/Storm topologies to process large data sets for better decision-making.
- Extensive development experience using custom partitioners, multiple and composite file input formats, Avro serialization, and the distributed cache in MapReduce.
- Experience in Hadoop Streaming and REST APIs.
- Experience configuring custom interceptors and sink processors in Flume to load data from multiple sources directly into HDFS.
- Solution architecture, design, and implementation of distributed computing and data streaming on the AWS platform.
- Designed and developed a data streaming pipeline using AWS Lambda, Kinesis Streams/Firehose, DynamoDB, CloudWatch, and Amazon S3.
- Designed and developed several Hadoop solutions around Amazon Web Services, including Amazon S3, Amazon EMR, Amazon EC2, Amazon RDS, Amazon Redshift, and Amazon CloudWatch.
- Designed and developed a data streaming pipeline using Apache Kafka, Apache ZooKeeper, and Apache Flume, deployed as a Docker image on Apache Mesos/Marathon infrastructure on DC/OS. Data stream load is distributed via Amazon ELB, source API calls are optimized with a caching service, HashiCorp Vault provides secure storage of sensitive data, and encryption secures communication between the AWS services.
- Designed and developed Hadoop workflows using the Apache Oozie workflow management engine.
- Designed and developed a data ingestion framework using Apache Sqoop.
- Used the WebHDFS REST API with SaaS applications such as Marketo and Cvent, issuing HTTP GET, PUT, POST, and DELETE requests to perform analytics on the data lake.
- Developed job flows in Oozie to automate workflows for Hive jobs.
Integration Lead, Autodesk | May 2011 - Dec 2021 | San Francisco, CA, US
- Led the offshore team, guiding team members and assigning tasks.
- Prepared and presented successful software development project plans along with the Project Manager.
- Worked on both extracting and loading data from and to Salesforce (Sales, Service & Support, Analytics, and its collaboration platform).
- Created the solution design defining the business flow, the detailed design document for developers, and the supporting documents for code migration from Dev to QA, QA to UAT, and UAT to Prod.
- Enhanced the dimensional data model (star schema) per business needs, created the Universe, and generated BO reports (Webi and Deski).
- Wrote Unix shell scripts to pick up data files from a file store and load them for further processing.
- Guided the application support team in execution and issue resolution.
- Developed SSIS packages to carry out file and SQL Server operations; these packages are integrated with Informatica workflows.
- Wrote SQL, PL/SQL, stored procedures, triggers, and cursors to implement business rules.
- Involved in creating labels and deployment groups.
- Technology used: Informatica PowerCenter 8.6.1, Business Objects XI 3.1, Quality Center, Erwin 7.1, Salesforce, Siebel, Oracle 10g, SQL Server Management Studio 2008, Business Intelligence Development Studio (BIDS), Visio 2010, MS Office 2010, Windows XP, PuTTY, Unix.
Informatica Lead Developer, The New York Times | May 2010 - May 2011 | New York, NY, US
- Interacted with the client and business analysts to gather data and client requirements for the project.
- Extensively used Informatica Designer to create and manage source and target definitions, mappings, mapplets, transformations, worklets, and reusable transformations.
- Worked on Business Objects XI reporting (Desktop Intelligence and Web Intelligence) based on discussions with users.
- Led the offshore team (Informatica and BO developers).
- Migrated code from DEV to UAT to PROD.
- Set expectations with users and defined timelines.
- Created and loaded data into star and snowflake schemas (fact and dimension tables) using slowly changing dimensions.
- Wrote Unix shell scripts to pick up data files from a file store and load them for further processing.
- Developed SSIS packages to carry out file and SQL Server operations; these packages are integrated with Informatica workflows.
- Involved in creating labels and deployment groups.
Sravan K Skills
Sravan K Education Details
- Jawaharlal Nehru Technological University
Frequently Asked Questions about Sravan K
What company does Sravan K work for?
Sravan K works for Procore Technologies
What is Sravan K's role at the current company?
Sravan K's current role is Principal Data Engineer at Procore.
What is Sravan K's email address?
Sravan K's email address is ku****@****ail.com
What is Sravan K's direct phone number?
Sravan K's direct phone number is +141550*****
What schools did Sravan K attend?
Sravan K attended Jawaharlal Nehru Technological University.
What skills is Sravan K known for?
Sravan K has skills like Spark, Data Science, Business Intelligence, Oracle, SQL, SDLC, Agile Methodologies, Informatica, Microsoft SQL Server, Data Warehousing, Software Project Management, and Amazon S3.