Kashif Mohammed

Kashif Mohammed Email and Phone Number

Data Engineer | Master of Science in Business Analytics @ Technology Credit Union (Tech CU)
San Jose, California, United States
Kashif Mohammed's Location
Chicago, Illinois, United States
About Kashif Mohammed

At Tech CU, our team leverages SQL and Power BI to interpret complex data sets, driving strategic decisions informed by transactional and digital usage insights. With a Master's in Business Analytics, I have become adept at optimizing data retrieval and reporting, and at responding to real-time inquiries to ensure accurate delivery. My expertise in relational database management and data warehousing has been crucial in providing scalable data solutions. We have fostered a culture of continuous improvement, enabling the business to thrive on data-driven innovation and robust decision-making frameworks that reflect my commitment to strategic growth.

Kashif Mohammed's Current Company Details
Technology Credit Union (Tech CU)

Data Engineer @ Tech CU | Master of Science in Business Analytics
San Jose, California, United States
Website:
techcu.com
Employees:
229
Kashif Mohammed Work Experience Details
  • Technology Credit Union (Tech CU)
    Data Engineer
    Technology Credit Union (Tech CU) May 2023 - Present
    Chicago, Illinois, United States
    • Leveraged SQL and Power BI to gather, analyze, and interpret complex data sets from various business aspects, including transaction data, banking data, lending data, and digital usage data.
    • Conducted detailed data analysis on data used across business units to evaluate business processes and products, driving data-driven decision-making.
    • Responded to data and product-related inquiries in real time to support both business and technical teams, ensuring accurate and timely data delivery.
    • Provided relational database expertise to construct and execute complex SQL queries for data analysis activities, optimizing data retrieval and reporting.
    • Developed and provided data solutions, tools, and capabilities to enable self-service data frameworks for data consumers using Power BI.
    • Translated business needs into technical designs, developing tools, techniques, metrics, and dashboards for insights and data visualization.
    • Developed and executed tools to monitor and report on data quality, ensuring data integrity and accuracy.
    • Collected, analyzed, and reported data, providing relevant reports to leadership and stakeholders for decision-making, action planning, and continuous improvement.
    • Provided technical assistance and built understanding among partners regarding the effective use of data and Power BI for data-driven decision-making.
    • Developed systems and tools for effective data management, visualization, and interpretation, including system design, dashboards, information graphics, and mapping using Power BI.
    • Supported the implementation of continuous improvement processes by developing and optimizing data management and reporting systems.
    • Maintained contact with internal and external stakeholders, providing courteous assistance and information, and facilitating discussions to obtain and share information effectively.
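The data-quality monitoring described above can be sketched as a simple rule-based check. This is an illustrative sketch only: the record shape, the column names (`account_id`, `amount`), and the checks themselves are assumptions, not details of the actual tooling, which reportedly runs against SQL and Power BI datasets rather than in-memory records.

```python
# Minimal data-quality check sketch: flag missing values and duplicate keys
# in a batch of records. Column names are hypothetical.

def check_quality(records, required=("account_id", "amount"), key="account_id"):
    """Return a small report of missing values and duplicate keys."""
    missing = {col: 0 for col in required}
    seen, duplicates = set(), 0
    for row in records:
        for col in required:
            if row.get(col) is None:
                missing[col] += 1
        k = row.get(key)
        if k is not None:
            if k in seen:
                duplicates += 1
            seen.add(k)
    return {"rows": len(records), "missing": missing, "duplicates": duplicates}

batch = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": "A2", "amount": None},   # missing amount
    {"account_id": "A1", "amount": 75.5},   # duplicate key
]
report = check_quality(batch)
```

A production version would typically emit such a report on a schedule and surface it in a dashboard rather than return a dict.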
  • Tag - The Aspen Group
    Data Engineer
    Tag - The Aspen Group Jan 2022 - Mar 2023
    Chicago, Illinois, United States
    • Developed design patterns for feeding data into the data lake from a variety of sources and standardizing it to enable enterprise-level benchmarking and comparison.
    • Built scalable distributed data solutions using Hadoop.
    • Used SQL queries to create interactive dashboards in Salesforce Einstein Analytics Studio and analyzed CRM data.
    • Ingested data into Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
    • Developed Databricks notebooks for extracting data from source systems such as DB2 and Teradata, performing data cleansing, wrangling, and ETL processing, and loading the results to Azure SQL DB.
    • Carried out ETL operations in Azure Data Factory, connecting to various relational database source systems via JDBC connectors.
    • Configured data pipelines using Azure Data Factory and built a custom alerts platform for monitoring.
    • Built ETL data pipelines to move data from Blob storage to Azure Data Lake Gen2 using Azure Data Factory (ADF).
    • Performed RDD transformations on ingested data for streaming analytics in Databricks with Spark Streaming.
    • Developed Spark applications using Python libraries such as PySpark, NumPy, and pandas for data transformations.
    • Used Spark SQL for data extraction, transformation, and aggregation across multiple file formats.
    • Implemented Spark jobs in Scala, utilizing DataFrames and the Spark SQL API for faster data processing.
    • Built and architected multiple end-to-end ETL and ELT data pipelines for data ingestion and transformation in GCP, coordinating tasks among the team.
    • Developed CI/CD (continuous integration and continuous deployment) pipelines and automation using Jenkins, Git, Docker, and Kubernetes for ML model deployment.
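The first bullet above, standardizing data from a variety of sources into one schema, can be sketched with a per-source field mapping. The source names (`db2`, `teradata`) echo the systems mentioned, but every field name here is invented for illustration; real mappings would come from the actual source schemas.

```python
# Sketch of standardizing heterogeneous source records into a canonical
# schema before landing them in a data lake. Field mappings are hypothetical.

FIELD_MAPS = {
    "db2":      {"CUST_NO": "customer_id", "TX_AMT": "amount"},
    "teradata": {"CustomerId": "customer_id", "TxnAmount": "amount"},
}

def standardize(record, source):
    """Rename source-specific fields to the canonical schema and tag the origin."""
    mapping = FIELD_MAPS[source]
    out = {canonical: record[raw] for raw, canonical in mapping.items() if raw in record}
    out["source"] = source
    return out

rows = [
    standardize({"CUST_NO": 42, "TX_AMT": 9.99}, "db2"),
    standardize({"CustomerId": 7, "TxnAmount": 1.25}, "teradata"),
]
```

Once all sources share a schema, enterprise-level benchmarking reduces to ordinary queries over one unified table.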
  • Amazon
    Data Engineer
    Amazon Sep 2019 - Jul 2021
    Telangana, India
    • Utilized Tableau to connect with Einstein Analytics, enabling in-depth explorations directly on Salesforce CRM data, and incorporated predictive capabilities from Einstein Discovery into Tableau.
    • Designed data preparation workflows in Alteryx to streamline report development and visualization creation.
    • Implemented Spark performance enhancements across data pipelines based on optimization principles.
    • Installed and configured Apache Airflow for the S3 bucket and Snowflake data warehouse, establishing and managing Airflow DAGs.
    • Collaborated with data engineers and operations teams to execute ETL processes, optimizing SQL queries for efficient data extraction to meet analytical needs.
    • Created Hive tables via Hue for data loading and developed Hive queries executed internally via MapReduce.
    • Integrated enterprise data into HDFS using JDBC, loaded it into Hadoop's Big SQL, and performed real-time transformations with the Spark API to construct a standardized learner data model stored in HBase.
    • Managed file formats such as Text, Sequence, ORC, Parquet, and Avro in Hive to analyze and build data models, processing data from HDFS into Parquet files and loading it into HBase tables.
    • Developed DAGs and established production environments in Apache Airflow for scheduling and automating ETL and reporting processes.
    • Prepared, modeled, and visualized data using Power BI, and created reports and dashboards with diverse visualizations in Tableau.
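One common form of the SQL query optimization mentioned above is adding an index on a frequently filtered column. A minimal, self-contained illustration using the standard-library `sqlite3` module follows; the `orders` table and its columns are made up for the example, and the real work described here was against warehouse engines, not SQLite.

```python
import sqlite3

# Sketch of index-based SQL optimization: the same filter query, before and
# after adding an index. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders (region, total) VALUES (?, ?)",
                 [("west", 10.0), ("east", 20.0), ("west", 5.0)])

# Without an index, this filter scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE region = 'west'").fetchall()

conn.execute("CREATE INDEX idx_orders_region ON orders (region)")

# With the index, SQLite can seek directly to the matching rows.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE region = 'west'").fetchall()

totals = [r[0] for r in conn.execute(
    "SELECT total FROM orders WHERE region = 'west' ORDER BY total")]
```

The same scan-versus-seek trade-off drives indexing and clustering decisions in Snowflake, Hive, and other engines, even though the syntax differs.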
  • Nisum
    Data Engineer
    Nisum Jan 2017 - Aug 2019
    Telangana, India
    • Engaged throughout the implementation process, focusing on crafting custom MapReduce and Hive scripts; utilized Hive for data transformations, leveraging partitions and buckets for enhanced performance.
    • Managed components including HDFS, JobTracker, TaskTracker, NameNode, DataNode, YARN, Spark, and MapReduce programming.
    • Developed and managed data pipelines using Alteryx workflows, facilitating data extraction, transformation, and loading into the company's data warehouse; oversaw complete data loads from production to AWS Redshift staging, as well as data loading from SQL to the AWS Redshift data lake.
    • Monitored resource usage across the cluster with Cloudera Manager, Search, and Navigator.
    • Established external Hive tables for data consumption, stored in HDFS using ORC, Parquet, and Avro formats; designed ETL pipelines with Apache PySpark's Spark SQL and DataFrame APIs; analyzed Hadoop clusters and employed Big Data tools such as Pig, Hive, HBase, Spark, and Sqoop.
    • Utilized Sqoop for dynamic data loading from files and relational databases into the cluster; leveraged AWS EC2 instances, EMR, Redshift, and Glue for data processing tasks; managed job scheduling with Airflow scripts in Python, defining task dependencies within DAGs.
    • Developed batch jobs in Scala to process data from files and tables, applying business logic for transformation and delivery; engaged in continuous deployment for table updates across environments, including DDL creation.
    • Wrote Azure PowerShell scripts for data movement between local file systems and HDFS Blob storage.
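The Airflow-style task dependencies within a DAG, mentioned above, ultimately reduce to a topological ordering of tasks. The standard-library `graphlib` module can sketch the idea without an Airflow installation; the task names below are illustrative, not the actual pipeline's.

```python
from graphlib import TopologicalSorter

# Sketch of DAG task dependencies, as an Airflow-style scheduler resolves
# them: each task maps to the set of tasks it depends on. Names are made up.
deps = {
    "extract":   set(),
    "transform": {"extract"},
    "load":      {"transform"},
    "report":    {"load"},
}

# static_order() yields tasks in an order that respects every dependency.
order = list(TopologicalSorter(deps).static_order())
```

Airflow performs the same resolution (plus scheduling, retries, and parallel execution of independent tasks) each time a DAG run starts.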

Kashif Mohammed Education Details
  • Cleveland State University
  • Osmania University

Frequently Asked Questions about Kashif Mohammed

What company does Kashif Mohammed work for?

Kashif Mohammed works for Technology Credit Union (Tech CU).

What is Kashif Mohammed's role at the current company?

Kashif Mohammed's current role is Data Engineer at Technology Credit Union (Tech CU).

What schools did Kashif Mohammed attend?

Kashif Mohammed attended Cleveland State University, Osmania University.

Who are Kashif Mohammed's colleagues?

Kashif Mohammed's colleagues are Theresa Pagonico, Brian Huynh, J Rl, Bob Koury, Kealoha Lei, Erica Garcia, Fiona Taylor.
