• A strong aptitude to learn and grow
• Adaptability to change
• Responsibility
• Strong teamwork ability
• Analytical and comprehension skills
• Leadership qualities
Senior Data Engineer
Epsilon, Chicago, IL, US

Senior Data Solutions Engineer
Epsilon, Oct 2024 - Present
Chicago, Illinois, United States
Big Data Engineer
PetSmart, Sep 2023 - Oct 2024
Phoenix, Arizona, United States
• Created real-time data ingestion pipelines using Databricks on GCP.
• Designed, developed, implemented, tested, documented, and operated large-scale, high-volume, high-performance big data structures for business intelligence analytics.
• Created Databricks notebooks using SQL and Python, and automated notebooks using jobs.
• Created Spark clusters and configured high-concurrency clusters in Databricks to speed up the preparation of high-quality data.
• Led an initiative to analyze Databricks jobs and helped determine solutions, resulting in a 40% decrease in cluster usage cost.
• Developed a real-time streaming solution for a batch streaming process, resulting in a 50% decrease in the time to deliver reports to the business team.
• Developed ETL pipelines that read data from a large number of JSON files in GCS (Google Cloud Storage) buckets, transform it, load it into multiple Delta tables, and expose it to Snowflake tables for final consumption by MicroStrategy.
• Built complex SnowSQL scripts to perform transformations and validations with multiple Delta tables as sources.
• Refactored code logic from Informatica into SnowSQL.
• Worked on a data migration project from Informatica to GCP.
• Extensive experience working with Snowflake tools to validate and transform data based on business requirements.
• Extensive experience with Databricks time travel, incremental data processing, and maintaining checkpoints.
• Developed shell scripts to run on Dataproc to maintain an archival module of processed and unprocessed source files.
• Developed ETL pipelines to ingest data from API endpoints into Delta tables using PySpark.
• Worked on SFMC and SFCC integration with Databricks.
• Used the Cloud Shell SDK in GCP to configure services: Dataproc, Storage, and BigQuery.
• Built complex SQL queries to perform transformations, calculations, and validations for a retail SLA module.
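The archival module mentioned above can be sketched as a small shell routine that sorts landed source files into processed and unprocessed archive directories. The directory layout and the ".done" marker convention are illustrative assumptions, not the production design:

```shell
#!/usr/bin/env sh
# Minimal sketch of an archival module: a source file with a matching
# ".done" marker is treated as processed; everything else is archived
# as unprocessed. Paths and the marker convention are assumptions.

archive_sources() {
    src_dir="$1"            # landing directory with raw source files
    processed_dir="$2"      # archive for files already ingested
    unprocessed_dir="$3"    # archive for files still pending

    mkdir -p "$processed_dir" "$unprocessed_dir"

    for f in "$src_dir"/*.json; do
        [ -e "$f" ] || continue          # glob matched nothing
        base=$(basename "$f")
        if [ -e "$src_dir/$base.done" ]; then
            mv "$f" "$processed_dir/$base"
            rm -f "$src_dir/$base.done"  # marker no longer needed
        else
            mv "$f" "$unprocessed_dir/$base"
        fi
    done
}
```

On Dataproc this kind of routine would typically run against mounted or staged GCS paths rather than local directories.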
Data Engineer
Prudential Financial, Mar 2022 - Sep 2023
Newark, New Jersey, United States
• Responsible for designing, creating, and maintaining end-to-end ETL data pipelines that make raw data available in a processed, clean format for consumption by end users.
• Designed, built, and maintained efficient, reusable, and reliable architecture and code using PySpark, Python, SQL, Linux, and Scala.
• Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
• Independently performed hands-on development and unit testing of applications built using AWS Glue, AWS Step Functions, AWS Lambda, AWS CodePipeline, and AWS EC2.
• Worked in a team environment with product, frontend design, production operations, QE/QA, and cross-functional teams to deliver projects throughout the software development cycle, tracked in JIRA.
• Developed and deployed Lambda Layers for use in Lambda Functions.
• Translated business specifications into design specifications and code; wrote complex programs, ad hoc SQL queries, and reports in AWS Athena, AWS Redshift, and AWS DynamoDB; ensured all code is well structured, sufficiently documented, and easy to maintain and reuse, using Bitbucket and Sourcetree.
• Helped the team migrate on-premises Hadoop Hive tables to AWS S3 and eventually pushed the processed data into AWS SageMaker.
• Participated in code reviews to make sure standards and best practices were met.
• Troubleshot software and processes for data consistency and integrity.
• Integrated large-scale data from a variety of sources for business partners to generate insights and make decisions.
• Worked with AWS Secrets Manager and AWS CloudWatch.
• Wrote, maintained, and deployed Infrastructure as Code for AWS.
• Helped the foundation team build frameworks and kept up with the new technologies the project demanded.
• Designed, created, and maintained reconciliation layers as part of the data pipeline to ensure the necessary metadata is stored in AWS S3 and on EC2 servers for future validations.
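The reconciliation layer described above can be sketched as persisting lightweight per-run metadata (a row count plus a content checksum) that a later validation pass re-derives and compares. The record layout and key names are illustrative assumptions; in the pipeline described, this metadata would land in S3 rather than an in-memory dict:

```python
import hashlib
import json

def build_recon_record(run_id, rows):
    """Summarize one batch into a reconciliation record (assumed layout)."""
    payload = json.dumps(rows, sort_keys=True).encode("utf-8")
    return {
        "run_id": run_id,
        "row_count": len(rows),
        "checksum": hashlib.sha256(payload).hexdigest(),
    }

def validate_against_recon(record, rows):
    """Re-derive the metadata from the data and compare to the stored record."""
    fresh = build_recon_record(record["run_id"], rows)
    return fresh == record
```

A downstream job can call `validate_against_recon` before consuming a batch to confirm nothing was dropped or mutated in transit.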
Data Analyst
Internet Bull Report Inc., Jun 2021 - Sep 2021
Remote
• Performed data scraping and web scraping to procure data into databases using Python, R, BeautifulSoup, and Selenium.
• Designed, created, and operated a contact-information extraction tool over the procured data using Python, Pandas, NumPy, LSTMs, NLP, and text-analysis tools and packages.
• Performed data analysis and text analysis to produce insights using NLTK, Tweepy, TextBlob, and VaderSentiment.
• Built and enhanced the operation of an email automation tool using Python and HTML.
• Created a database of all procured data in CSV format.
• Worked with web scraping tools to get data from press releases posted on sites such as PR Newswire, GlobeNewswire, Yahoo Finance, News Filecorp, Business Newswire, and Access Newswire.
• Applied sentiment analysis to the PR articles scraped from the sources above.
• Designed, created, and operated a sophisticated rule-based text analysis tool that analyzes PR articles for each company using a financial dictionary.
• Extracted historical tweet data from Twitter, posts from Reddit, and articles from Yahoo Finance.
• Applied data analytics methods to analyze websites and platforms to develop growth strategies.
• Performed data extraction and analysis and prepared presentable insights.
• Designed, developed, and created a sentiment analyzer that considers data scraped from various platforms and analyzes it for stock price prediction.
• Developed time series models to analyze and study financial data.
• Created a stock price prediction model that uses sentiment analysis and historical stock data as features to predict the closing price using LSTMs and deep networks.
• Worked with various stakeholders to understand data requirements and applied deep technical knowledge of database management to solve key business issues.
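The rule-based text analysis described above can be sketched as a scorer driven by a small financial dictionary. The dictionary entries and the scoring rule (positive minus negative hits, normalized by token count) are illustrative assumptions, not the original tool's rules or lexicon:

```python
# Toy financial dictionary mapping terms to polarity; a real one
# (e.g. a published financial sentiment lexicon) is far larger.
FINANCIAL_DICTIONARY = {
    "growth": 1, "profit": 1, "beat": 1, "upgrade": 1,
    "loss": -1, "decline": -1, "lawsuit": -1, "downgrade": -1,
}

def score_article(text):
    """Return a sentiment score in [-1, 1] for one PR article."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    if not tokens:
        return 0.0
    total = sum(FINANCIAL_DICTIONARY.get(t, 0) for t in tokens)
    return total / len(tokens)
```

Scores like these could then feed a downstream model as one feature alongside historical price data.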
Student Assistant
University at Albany, SUNY, Jan 2020 - May 2021
Albany, New York Area
Business Analyst
Accenture, Jul 2018 - Dec 2019
Hyderabad, Telangana, India
• Conducted load testing and provided input into capacity planning efforts.
• Supported the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
• Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.
• Modified automated scripts from time to time to accommodate changes and upgrades in the application interface.
• Tested the database to check field-size validation, check constraints, and stored procedures, cross-verifying the field sizes defined in the application against the metadata.
• Worked directly with the client business intelligence team, building SQL queries to analyze data and solve complex problems.
• Used mathematical modeling to generate weekly and monthly reports on process performance improvement using Excel pivot tables and R.
• Applied various statistical analyses to data from the operations team to improve process and strategy.
• Worked briefly on creating data visualizations using KNIME and data wrangling using Python.
• Created a tool feedback platform for the internal tools used at the client location.
• Worked with the process's business innovation team to analyze and drive continuous improvement and efficiency for the project.
• Served as a subject matter expert analyzing data coming in from smartphones made by Google.
• Involved in bug testing.
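The field-size cross-verification mentioned above can be sketched as comparing the maximum lengths allowed by the application's fields against the column sizes defined in the database metadata and reporting mismatches. The field names and sizes are illustrative assumptions:

```python
def find_size_mismatches(app_fields, db_metadata):
    """Return fields whose app-side max length differs from the DB column size.

    app_fields:  {field_name: max length enforced by the application UI}
    db_metadata: {field_name: column size defined in the database schema}
    """
    mismatches = {}
    for field, app_size in app_fields.items():
        db_size = db_metadata.get(field)
        if db_size is not None and db_size != app_size:
            mismatches[field] = {"app": app_size, "db": db_size}
    return mismatches
```

In practice the `db_metadata` side would be pulled from the database's information schema rather than hand-written.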
Arjun Paul Skills
MATLAB, Python, Microsoft Office, Teamwork, Microsoft Word, R, Microsoft Excel, Data Analytics, Team Spirit, C, Cadence, MySQL
Arjun Paul Education Details
• Information Science/Studies
• Electrical, Electronics and Communications Engineering
Frequently Asked Questions about Arjun Paul
What company does Arjun Paul work for?
Arjun Paul works for Epsilon.
What is Arjun Paul's role at the current company?
Arjun Paul's current role is Senior Data Engineer.
What is Arjun Paul's email address?
Arjun Paul's email address is ap****@****any.edu
What schools did Arjun Paul attend?
Arjun Paul attended University At Albany, Suny, Vellore Institute Of Technology.
What skills is Arjun Paul known for?
Arjun Paul has skills like MATLAB, Python, Microsoft Office, Teamwork, Microsoft Word, R, Microsoft Excel, Data Analytics, Team Spirit, C, Cadence, MySQL.
Who are Arjun Paul's colleagues?
Arjun Paul's colleagues are Vijay Kumar G, Dylan Ekenberg, Robert Underwood, Albert Johnson, Sylvia Padilla, Zohra Shariff, Blog Da Gata Pérola.