Darsh Patel

Darsh Patel Email and Phone Number

Software Engineer at Amazon Web Services (AWS) | Ex-JPMorgan
Darsh Patel's Location
Seattle, Washington, United States
About Darsh Patel

Experienced software engineer with a background in software development, system design, machine learning, and web design. My primary expertise lies in software design and architecture, and in building scalable and maintainable applications. I'm also passionate about exploring the potential of machine learning and AI to solve intricate problems across domains. My portfolio includes several projects that showcase this expertise, including object detection, natural language processing, and image recognition.

Darsh Patel's Current Company Details
Amazon Web Services (AWS)

Software Engineer at Amazon | Ex-JPMorgan
Darsh Patel Work Experience Details
  • Amazon Web Services (AWS)
    Software Development Engineer II
    Amazon Web Services (AWS) Jul 2024 - Present
    Seattle, WA, US
  • JPMorgan Chase & Co.
    Software Engineer
    JPMorgan Chase & Co. Jan 2023 - Jul 2024
    New York, NY, US
    - Implemented Jenkins CI/CD pipelines to automate the build, testing, and deployment of API services, ensuring efficient and consistent software delivery.
    - Utilized DynamoDB for efficient data storage and retrieval in API applications.
    - Integrated Amazon S3 (Simple Storage Service) to securely store and manage large volumes of data within the API ecosystem.
    - Implemented caching mechanisms using Redis to optimize API performance and reduce database load.
    - Leveraged OpenAPI specifications to document and expose APIs, enabling clear communication and easy integration with external systems.
    - Implemented authentication and authorization mechanisms (OAuth 2.0) to secure API endpoints and protect sensitive data.
    - Utilized containerization technologies (Docker) to create lightweight and portable environments for API deployment.
    - Worked with message queues (Apache Kafka) to enable asynchronous communication and decoupling of services in API architectures.
    - Collaborated with other developers to provide comprehensive API documentation and support for downstream consumer integration needs.
    - Monitored API performance using Datadog and Splunk, enabling proactive identification and resolution of issues.
    - Conducted load and performance testing using JMeter to ensure API scalability and stability under high-traffic conditions.
    - Implemented logging and error-tracking mechanisms to capture and analyze API logs, enabling quick troubleshooting and issue resolution.
    - Utilized version control systems (Git, Bitbucket) for efficient code management and collaboration within the development team.
    - Followed agile methodologies (Scrum) to facilitate iterative development and timely delivery of API features and enhancements.
    - Actively participated in peer code reviews and used static code analysis tools (SonarQube, SonarLint) to maintain code quality and adhere to best practices.
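The Redis caching bullet above describes a common read-through pattern: serve from cache when the entry is fresh, otherwise query the backing store (e.g. DynamoDB) and cache the result with a TTL. A minimal sketch of that pattern follows; an in-memory dict stands in for a Redis client, and the `query_db` function and its fields are hypothetical.

```python
import json
import time

class ReadThroughCache:
    """Read-through cache: return a fresh cached value if present,
    otherwise call the backing fetch function and cache the result
    with a TTL. A dict stands in for a Redis client here; redis-py's
    get/setex calls map onto the same shape."""

    def __init__(self, fetch_fn, ttl_seconds=60):
        self._fetch = fetch_fn            # e.g. a DynamoDB query
        self._ttl = ttl_seconds
        self._store = {}                  # key -> (expires_at, json payload)

    def get(self, key):
        entry = self._store.get(key)
        if entry is not None and entry[0] > time.time():
            return json.loads(entry[1])                    # cache hit
        value = self._fetch(key)                           # miss: load
        self._store[key] = (time.time() + self._ttl, json.dumps(value))
        return value

# Hypothetical backing query standing in for a database call;
# `calls` records how many times the "database" was actually hit.
calls = []
def query_db(user_id):
    calls.append(user_id)
    return {"id": user_id, "plan": "pro"}

cache = ReadThroughCache(query_db, ttl_seconds=60)
first = cache.get("u1")    # miss: hits the backing store
second = cache.get("u1")   # hit: served from cache, no second DB call
```

Values are JSON-serialized before caching because Redis stores strings, not Python objects; the TTL bounds staleness, which is the usual trade-off this pattern makes to reduce database load.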
  • JPMorgan Chase & Co.
    Software Engineer
    JPMorgan Chase & Co. Jul 2022 - Jan 2023
    New York, NY, US
    - Created and maintained data pipelines to ingest structured and unstructured data from various sources into the data lake, ensuring data quality and integrity.
    - Worked with the data cataloging tool Apache Atlas to organize and categorize data assets within the data lake, improving discoverability and usability.
    - Developed and optimized data ingestion processes to efficiently extract, transform, and load (ETL) large volumes of data into data processing frameworks (Hadoop, Apache Spark).
    - Implemented data integration solutions (Apache Kafka) for real-time or near-real-time data ingestion and streaming.
    - Designed and implemented data connectors and adapters to integrate with various data sources, including relational databases, APIs, and third-party services.
    - Leveraged distributed processing frameworks (Apache Spark, Apache Flink) for scalable and parallel data processing, enabling faster analytics and insights.
    - Built data pipelines to transform and cleanse raw data into structured and usable formats for downstream analytics and reporting purposes.
    - Implemented batch processing workflows using frameworks (Apache Airflow, BMC Control-M) for scheduling and orchestrating data processing jobs with dependencies.
    - Designed and implemented streaming data processing pipelines using Apache Flink, enabling real-time data analytics and event-driven architectures.
    - Utilized Apache Hive and Apache Impala for querying and analyzing large datasets stored in Hadoop or cloud-based data platforms.
    - Implemented data visualization and reporting solutions using Apache Superset to present insights and analytics derived from big data.
    - Worked with columnar data formats (Parquet) for optimized storage and retrieval of big data, improving query performance and resource utilization.
    - Utilized distributed storage and processing frameworks (Apache Cassandra) for high-performance and scalable data storage and retrieval.
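The extract-transform-load shape those pipeline bullets describe can be sketched in a few lines: parse raw records, apply data-quality rules and type normalization, and append the structured rows to a sink. This is a stdlib-only illustration, not the production pipeline; the field names and the data-quality rule are hypothetical.

```python
import csv
import io

# Hypothetical raw feed: the kind of semi-structured input an ETL
# pipeline cleanses into structured rows for downstream analytics.
RAW = """trade_id,amount,currency
T1, 100.50 ,usd
T2,,usd
T3, 75 ,EUR
"""

def extract(text):
    """Extract: parse CSV rows from the raw source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop incomplete records, normalize types and casing."""
    out = []
    for r in rows:
        amt = r["amount"].strip()
        if not amt:                      # data-quality rule: amount required
            continue
        out.append({"trade_id": r["trade_id"].strip(),
                    "amount": float(amt),
                    "currency": r["currency"].strip().upper()})
    return out

def load(rows, sink):
    """Load: append structured rows to a sink (a list stands in for a
    data-lake table or warehouse write here)."""
    sink.extend(rows)
    return sink

table = load(transform(extract(RAW)), [])
```

Orchestration frameworks like Airflow express exactly this chain as dependent tasks, so a pure-function extract/transform/load split like the one above tends to port cleanly into a DAG.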
  • Sonos, Inc.
    Software Development Engineer
    Sonos, Inc. May 2022 - Jul 2022
    Santa Barbara, CA, US
    - Cloud Content Integration Team at Sonos, specializing in integrating cloud services and content for seamless music playback. ("Hey Sonos... play today's top hits!")
    - Ensured software quality by designing and executing comprehensive test plans, test cases, and test scripts for various features and functionalities of the Cloud Content Integration platform.
    - Collaborated closely with developers and product owners to understand requirements, identify testable scenarios, and provide early feedback on potential quality issues.
    - Developed and maintained automated test suites using JUnit to increase testing efficiency and coverage.
    - Implemented continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, enabling automated builds, testing, and deployment of software applications.
    - Integrated automated tests into the CI/CD pipeline, ensuring that every code change undergoes rigorous automated testing before being deployed to production.
    - Implemented automated contract testing using the PACT framework to validate and ensure compatibility between different microservices and APIs involved in the Cloud Content Integration platform.
    - Collaborated with cross-functional teams, including developers, product managers, and architects, to define contract specifications and implement contract tests to verify interactions and data exchanges between services.
    - Actively participated in code reviews, providing feedback on code quality and testability, and ensuring adherence to testing best practices and quality standards.
    - Conducted exploratory testing to uncover edge cases, usability issues, and potential vulnerabilities in the Cloud Content Integration platform.
    - Collaborated with the development team to investigate and troubleshoot issues, providing detailed bug reports and contributing to the resolution process.
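The core idea behind the contract testing mentioned above is consumer-driven: the consumer pins down the response shape it relies on, and the provider's responses are verified against that contract. Here is a stdlib-only sketch of that idea (not the PACT framework itself); the endpoint, field names, and provider stub are hypothetical.

```python
# Consumer-driven contract check, in the spirit of PACT-style testing:
# the consumer declares the fields and types it needs, and the
# provider's response is verified against that declaration.

CONTRACT = {                 # what a consumer expects of GET /track/{id}
    "id": str,
    "title": str,
    "duration_ms": int,
}

def provider_get_track(track_id):
    """Stand-in for the real provider service's response."""
    return {"id": track_id, "title": "Top Hit", "duration_ms": 187000}

def verify_contract(response, contract):
    """Every contracted field must be present with the agreed type.
    Extra provider fields are allowed: contracts state the consumer's
    needs, not the provider's full schema."""
    problems = []
    for field, expected_type in contract.items():
        if field not in response:
            problems.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

issues = verify_contract(provider_get_track("t-42"), CONTRACT)
```

Running such checks in CI on both sides is what catches a provider renaming or retyping a field before the incompatible change reaches production, which is the failure mode contract testing exists to prevent.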
  • Novant Health
    Machine Learning Engineer
    Novant Health Feb 2021 - May 2022
    Charlotte, NC, US
    - Worked on developing and implementing machine learning models and algorithms to solve healthcare-related challenges and improve patient outcomes.
    - Collaborated with cross-functional teams, including data scientists, clinicians, and software engineers, to understand business requirements and translate them into machine learning solutions.
    - Assisted in data preprocessing, cleaning, and feature engineering to prepare datasets for model training and evaluation.
    - Implemented various machine learning algorithms (decision trees, random forests, neural networks) to solve classification, regression, and clustering problems.
    - Utilized Python and popular libraries/frameworks (scikit-learn, TensorFlow, PyTorch) for model development, training, and evaluation.
    - Conducted experiments and performed hyperparameter tuning to optimize model performance and generalization ability.
    - Developed and deployed predictive models for disease diagnosis, patient risk stratification, and treatment recommendation.
    - Conducted thorough analysis and interpretation of model outputs, providing actionable insights and recommendations to healthcare professionals.
    - Assisted in integrating machine learning models into existing healthcare systems and applications for real-time decision support and automation.
    - Implemented data pipelines and workflows to ensure efficient and scalable data processing, model training, and inference.
    - Collaborated with the data engineering team to design and optimize data storage and retrieval systems for large-scale healthcare datasets.
    - Followed best practices for model versioning, documentation, and reproducibility to ensure transparency and facilitate future enhancements.
    - Worked on evaluating the ethical implications and fairness of machine learning models, addressing issues related to bias, privacy, and security in healthcare data.
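The hyperparameter-tuning bullet above boils down to a simple loop: try each candidate setting, score it on held-out validation data, and keep the best. A stdlib-only sketch of that loop follows, using a tiny k-nearest-neighbors classifier on made-up "risk stratification" data; the data and the 1-D feature are purely illustrative, not anything from the actual healthcare work.

```python
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training
    points (1-D Euclidean distance, for simplicity)."""
    nearest = sorted(train, key=lambda p: abs(p[0] - query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def tune_k(train, val, candidates):
    """Hyperparameter search: pick the k with the best validation
    accuracy (ties resolved in favor of the first candidate)."""
    def accuracy(k):
        hits = sum(knn_predict(train, x, k) == y for x, y in val)
        return hits / len(val)
    return max(candidates, key=accuracy)

# Toy data: a single feature mapped to a low/high risk label.
train = [(1.0, "low"), (1.2, "low"), (1.4, "low"),
         (3.0, "high"), (3.2, "high"), (3.4, "high")]
val = [(1.1, "low"), (3.1, "high"), (1.3, "low"), (3.3, "high")]

best_k = tune_k(train, val, candidates=[1, 3, 5])
```

Library tooling such as scikit-learn's grid search generalizes this same structure with cross-validation instead of a single validation split, but the select-by-held-out-score logic is identical.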
  • Commodity Investing By Students (COINS)
    Chief Financial Officer
    Commodity Investing By Students (COINS) Oct 2019 - Dec 2021
    Blacksburg, VA, US
    - Managed the financial operations as the CFO of a college investment fund specializing in commodity trading through exchange-traded funds (ETFs).
    - Oversaw the fund's financial planning, budgeting, and forecasting processes to optimize investment strategies and maximize returns.
    - Conducted in-depth analysis of commodity markets, including energy, metals, and agriculture, to identify profitable trading opportunities.
    - Developed and implemented risk management strategies to mitigate exposure to price volatility and market fluctuations in commodity trading.
    - Utilized financial modeling techniques and tools to assess investment performance, evaluate portfolio risk, and make informed trading decisions.
    - Collaborated with investment analysts and traders to formulate trading strategies based on market research, historical data, and quantitative analysis.
    - Monitored and analyzed financial market trends, news, and events to anticipate potential impacts on commodity prices and adjust trading strategies accordingly.
    - Utilized financial software and trading platforms (Bloomberg Terminal, Thomson Reuters Eikon) to access real-time market data, track portfolio performance, and execute trades.
    - Conducted thorough due diligence on ETFs and their underlying commodities to ensure compliance with fund objectives, regulatory requirements, and risk management policies.
    - Prepared financial reports, statements, and presentations for stakeholders, including investors, board members, and advisors, providing insights into fund performance and investment strategies.
    - Managed relationships with external financial institutions, brokers, and custodians to optimize trade execution, settlement, and portfolio management processes.
    - Stayed updated on industry trends, regulatory changes, and best practices in commodity trading and investment management to drive continuous improvement and ensure regulatory compliance.
  • Citi
    Sales & Trading
    Citi Jun 2021 - Aug 2021
    New York, NY, US
  • General Motors
    Computer Vision Engineer
    General Motors Jun 2019 - Aug 2020
    Detroit, MI, US

Darsh Patel Education Details

  • Virginia Tech
    Computer Engineering (Machine Learning)
  • WorldQuant University
    Financial Engineering & Machine Learning

Frequently Asked Questions about Darsh Patel

What company does Darsh Patel work for?

Darsh Patel works for Amazon Web Services (AWS).

What is Darsh Patel's role at the current company?

Darsh Patel's current role is Software Engineer at Amazon | Ex-JPMorgan.

What schools did Darsh Patel attend?

Darsh Patel attended Virginia Tech and WorldQuant University.
