Ramya Nagalla

Ramya Nagalla Email and Phone Number

Sr. Dev Engineer @ Technoidentity
Vijayawada, AP, IN
Ramya Nagalla's Location
Washington DC-Baltimore Area, United States
About Ramya Nagalla

I am a dedicated Data Engineer with over 4 years of experience, complemented by a Data Science degree. I have a strong passion for data engineering and data analysis, consistently seeking to leverage my skills to drive impactful solutions.

Technical Proficiencies
  • Cloud Technologies: Experienced with AWS services including S3, Lambda, Redshift, Step Functions, Glue, EventBridge, Athena, EMR, and QuickSight, as well as Azure Data Lake Storage, Azure SQL Database, Azure Databricks, and Azure Data Factory.
  • Infrastructure as Code: Proficient in Terraform and CloudFormation for managing cloud resources efficiently.
  • Programming Languages: Skilled in Python (Pandas, NumPy, Scikit-learn) and R for data manipulation and analysis.
  • Databases: Knowledgeable in Oracle, MySQL, SQL Server, and MongoDB for data storage and management.
  • Big Data Technologies: Experienced with PySpark, Hadoop, Hive, and Pig for processing large datasets.
  • Machine Learning: Familiar with supervised and unsupervised learning algorithms, including regression, SVM, decision trees, and neural networks.
  • Reporting Tools: Proficient in Tableau and Power BI for creating insightful visualizations.

I thrive in dynamic environments and enjoy taking the initiative to tackle challenges and implement innovative solutions. My blend of practical experience and academic knowledge equips me to contribute effectively to any data-driven team. I am eager to connect with fellow professionals and explore new opportunities in data engineering and analysis. Let's collaborate to unlock the potential of data together!

Ramya Nagalla's Current Company Details
Technoidentity
Sr. Dev Engineer
Vijayawada, AP, IN
Employees: 205
Ramya Nagalla Work Experience Details
  • Technoidentity
    Sr. Dev Engineer
    Technoidentity
    Vijayawada, AP, IN
  • Sherwin-Williams
    Senior Data Engineer
    Sherwin-Williams Nov 2023 - Present
    Cleveland Heights, Ohio, United States
    • Developed ETL pipelines on GCP using Apache Beam and Dataflow to process large-scale data in real time, resulting in a 20% improvement in data processing time.
    • Worked with Informatica PowerExchange tools to give business users on-demand access to data.
    • Worked on the installation and configuration of Informatica PowerCenter.
    • Rebuilt a legacy on-premises Oracle-based data warehouse as a data lake on Azure Cloud.
    • Designed and implemented data storage solutions using Azure services such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Storage.
    • Developed and maintained data pipelines using Azure Data Factory and Azure Databricks.
    • Created and managed data processing jobs using Azure HDInsight and Azure Stream Analytics.
    • Used Python, SQL, and Bash scripts to develop custom data transformations and data quality rules, resulting in a 25% reduction in data processing errors.
    • Used Informatica Designer to design mappings that populated the target star schema on an Oracle instance.
    • Installed, configured, and managed NoSQL database systems, ensuring high availability and fault tolerance.
    • Handled data ingestion processes, including data import/export, batch processing, and real-time data streaming into NoSQL databases.
    • Set up data replication and sharding for NoSQL databases to distribute data across clusters and improve system resilience.
    • Developed and maintained ETL processes using SSIS to integrate data from multiple sources, improving data processing efficiency.
    • Worked with Informatica, Oracle Database, PL/SQL, Python, and shell scripts.
    • Advanced proficiency in Power BI Desktop, Power Query, DAX, and Power BI Service.
    • Strong data analysis and visualization skills using Excel, SQL, and other BI tools.
  • Docusign
    Data Engineer
    Docusign Jan 2023 - Oct 2023
    San Francisco, California, United States
    • Implemented indexes on OLTP/OLAP tables to enhance navigation performance within large datasets.
    • Created mappings, sessions, and workflows in Informatica PowerCenter to load data for Ultima tix projects from source to target database.
    • Created Informatica mappings leveraging Aggregators, SQL overrides in Lookups, Source Qualifiers, and Routers to govern data flow into different targets.
    • Created sessions, gathered data from multiple sources, processed it as needed, and loaded it into the data warehouse.
    • Built robust mappings in Informatica PowerCenter Designer using a variety of transformations, including Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, and Aggregator.
    • Used Azure Data Factory to extract raw files from different sources and load them into Azure Data Lake as Parquet files.
    • Pulled the Parquet files from the data lake into Azure Synapse Analytics to create external tables and staging tables; data modelling was done on the staging tables, and the final views were used for visualization reports.
    • Designed and implemented ETL workflows using Alteryx to automate data extraction and transformation and improve efficiency.
    • Developed interactive dashboards and reports in Tableau, enabling stakeholders to visualize key performance metrics and drive data-driven decisions.
    • Conducted data quality assessments and implemented validation checks in Alteryx to reduce data discrepancies.
    • Assisted in the development of ETL workflows using SSIS for data extraction, transformation, and loading.
    • Created and managed SQL Server databases, optimizing query performance and ensuring data integrity.
    • Created materialized views, scheduled jobs, ETL workflows, and reporting that moved data between eight ERP systems.
  • Fannie Mae
    Hadoop Developer
    Fannie Mae Apr 2022 - Nov 2022
    Sandy Springs, Georgia, United States
    • Created and managed ETL (Extract, Transform, Load) processes and data pipelines to move and transform data from source systems into data warehouses or data lakes on AWS.
    • Partitioned and clustered high-volume tables on key fields in BigQuery to make queries more efficient.
    • Implemented scalable infrastructure and a platform for large-scale data ingestion, aggregation, integration, and analytics in Hadoop using Spark and Hive.
    • Tuned NoSQL databases through monitoring tools and configuration, identifying and addressing bottlenecks for optimal performance.
    • Provided ongoing support for ETL jobs, monitored ETL processes, and resolved issues to ensure the stability and reliability of data pipelines on Snowflake.
    • Created comprehensive documentation for ETL processes, data transformations, and Snowflake schemas, facilitating knowledge transfer and onboarding of new team members.
    • Wrote Python DAGs in Airflow to orchestrate end-to-end data pipelines for multiple applications.
    • Imported and exported data between Snowflake, Oracle, and DB2 and HDFS/Hive using Sqoop for analysis, visualization, and report generation.
    • Utilized Tableau to create dynamic visualizations and dashboards that provided insights into sales and marketing performance.
    • Extracted, transformed, and loaded data using SQL and Alteryx, streamlining data preparation and enabling faster analysis.
    • Assisted in training team members on best practices for using Alteryx and Tableau, fostering a data-driven culture.
    • Created Hive tables using HiveQL, loaded data into them, and analyzed the data by developing Hive queries.
    • Used Oozie to automate pipeline workflows and orchestrate the MapReduce jobs, and ZooKeeper to provide coordination services to the cluster.
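The custom data-quality rules mentioned in the Sherwin-Williams role (Python scripts that reduced processing errors) could be sketched in pure Python roughly as follows. The rule names, fields, and thresholds here are illustrative assumptions, not taken from the profile:

```python
# Sketch of row-level data-quality rules applied before loading.
# All field names and thresholds are hypothetical examples.

def non_empty(field):
    """Rule: the field must be present and non-blank."""
    return lambda row: bool(str(row.get(field, "")).strip())

def in_range(field, lo, hi):
    """Rule: the numeric field must fall within [lo, hi]."""
    return lambda row: field in row and lo <= row[field] <= hi

RULES = {
    "customer_id_present": non_empty("customer_id"),
    "amount_in_range": in_range("amount", 0, 1_000_000),
}

def validate(rows):
    """Split rows into clean rows and (row, failed_rule_names) rejects."""
    clean, rejected = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            rejected.append((row, failed))
        else:
            clean.append(row)
    return clean, rejected
```

Rejected rows would typically be routed to a quarantine table with their failed rule names for later inspection, rather than silently dropped.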
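The staging-table-to-reporting-view pattern described in the Docusign role (external/staging tables in Azure Synapse feeding final views for visualization) can be illustrated with SQLite standing in for Synapse. Table, column, and view names are hypothetical:

```python
import sqlite3

# Minimal sketch of the staging -> modelling -> reporting-view flow,
# using an in-memory SQLite database in place of Azure Synapse Analytics.
conn = sqlite3.connect(":memory:")

# Staging table, as if populated from Parquet files in the data lake.
conn.execute("CREATE TABLE stg_orders (order_id TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [("O1", "EAST", 100.0), ("O2", "EAST", 50.0), ("O3", "WEST", 75.0)],
)

# Final view built on the staging table; reports query the view, not staging.
conn.execute(
    "CREATE VIEW vw_sales_by_region AS "
    "SELECT region, SUM(amount) AS total FROM stg_orders GROUP BY region"
)
rows = conn.execute(
    "SELECT region, total FROM vw_sales_by_region ORDER BY region"
).fetchall()
```

Keeping raw data in staging tables and exposing only modelled views gives reporting tools a stable interface even when the upstream load logic changes.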
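The date-based partitioning mentioned in the Fannie Mae role (partitioned BigQuery tables so queries scan less data) can be sketched in pure Python: records are bucketed by their event date, and a lookup touches only one bucket instead of every record. The field names are illustrative assumptions:

```python
from collections import defaultdict
from datetime import date

# Sketch of day-level partitioning: each record lands in the bucket for
# its event date, mimicking a date-partitioned warehouse table.
def partition_by_day(records, date_field="event_date"):
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec[date_field]].append(rec)
    return partitions

def query_partition(partitions, day):
    """Read a single day's partition instead of scanning all records."""
    return partitions.get(day, [])
```

In BigQuery the same idea is expressed declaratively (e.g. partitioning a table by a date column), so the engine prunes partitions automatically when a query filters on that column.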

Ramya Nagalla Education Details
  • Kent State University
  • KL University, Vaddeswaram

Frequently Asked Questions about Ramya Nagalla

What company does Ramya Nagalla work for?

Ramya Nagalla works for Technoidentity.

What is Ramya Nagalla's role at the current company?

Ramya Nagalla's current role is Sr. Dev Engineer.

What schools did Ramya Nagalla attend?

Ramya Nagalla attended Kent State University and KL University, Vaddeswaram.

Who are Ramya Nagalla's colleagues?

Ramya Nagalla's colleagues are Yusuf Jetpurwala, Mary Hall, Pimisa Lincy, Aparna Kasturi, Shwetha Shah, Apoorv Vyas, Aakansha Pachauri.
