Menu M

Senior Python Developer at Hulu
Santa Monica, CA
Menu M's Location
Los Angeles, California, United States
About Menu M

Around 6 years of IT experience as a Python Developer. Proficient coder in multiple languages and environments, including Python, Django, Flask, MongoDB, C, C++, and SQL. Experienced in web-based applications using Python 2.x and 3.x, UI/UX, PHP, C++, XML, CSS3, HTML5, DHTML, JavaScript, JDK 1.7, and jQuery. Experienced with version control systems such as CVS, SVN, and Git/GitHub, and with issue-tracking tools such as Jira and Bugzilla. Experienced in working with various Python IDEs, including PyCharm, Spyder, PyStudio, PyScripter, and PyDev. Experienced in building frameworks and automating complex workflows using Python for test automation. Experienced in handling projects across the entire software life cycle, providing first-, second-, and third-level support. Handled and deployed production hotfixes. Implemented server-side technologies with RESTful APIs and MVC design patterns using Node.js and the Django framework.

Menu M's Current Company Details
Hulu

Senior Python Developer at Hulu
Santa Monica, CA
Website:
hulu.com
Menu M Work Experience Details
  • Hulu
    Senior Python Developer
    Hulu Aug 2020 - Present
    Santa Monica, CA, US
    Developed applications with a RESTful architecture using Python as the backend language, and used NumPy for numerical analysis. Migrated the Django database from SQLite to MySQL and later to PostgreSQL, and designed, developed, and deployed CSV parsing using a big-data approach on AWS EC2. Used jQuery and AJAX calls to transmit JSON data objects between the front end and controllers, and utilized continuous integration and automated deployments with Jenkins, Ansible, and Docker. Utilized the MEAN stack (MongoDB) and JSON for data transfer and server-side scripting. Developed an API to integrate with Confidential EC2 cloud-based architecture in AWS, including creating machine images. Knowledge of the Software Development Life Cycle (SDLC) and of Agile and Waterfall methodologies. Worked on AWS Data Pipeline for data extraction, transformation, and loading from homogeneous or heterogeneous data sources. Implemented and designed AWS virtual servers with Ansible roles to ensure deployment of web applications. Worked with multiple development teams to troubleshoot and resolve issues. Used Git/GitHub for tracking changes across files and coordinating work among multiple people on the project team. Involved in the CI/CD process using Git, Jenkins job creation, and Maven to build the application. Integrated the test suites into Jenkins to execute them automatically after every successful deployment. Involved in code reviews and wrote unit tests in Python. Worked with tuples, dictionaries, and object-oriented concepts such as inheritance when implementing algorithms. Designed and developed a data management system using MySQL. Monitored and troubleshot web applications.
  • AT&T
    Data Engineer
    AT&T May 2018 - Jul 2019
    Dallas, TX, US
    Designed and developed DB2 SQL procedures and UNIX shell scripts for data import/export and conversions. Developed applications in a Linux environment and worked with its command-line tools. Administered continuous integration services (Jenkins and the Nexus artifact repository). Performed dynamic UI design with HTML5, CSS3, LESS, Bootstrap, jQuery, JSON, and AJAX. Worked with Sqoop jobs to import data from RDBMS sources and used various optimization techniques to tune Hive, Pig, and Sqoop. Monitored and troubleshot Hadoop jobs using the YARN Resource Manager, and EMR job logs using Genie and Kibana. Developed a Python script to load CSV files into S3 buckets; created AWS S3 buckets, performed folder management in each bucket, and managed logs and objects within each bucket. Loaded, analyzed, and extracted data to and from Elasticsearch with Python. Designed forms, modules, views, and templates using Django and Python. Worked with Amazon Web Services (AWS), using EC2 for hosting and Elastic MapReduce (EMR) for data processing, with S3 as the storage mechanism. Converted all Hadoop jobs to run on EMR by configuring the cluster according to the data size. Extensively used Stash (Bitbucket) Git for code control. Worked with Avro and Parquet files and converted data between the two formats. Parsed semi-structured JSON data and converted it to Parquet using DataFrames in PySpark. Participated in requirements gathering and worked closely with the architect on design and modeling. Wrote Python scripts to extract data from HTML files. Developed dynamic web pages using the Python Django framework. Generated Django forms to record data from online users. Responsible for setting up a Python REST API framework using Django. Created database tables in MySQL to store user data, including login credentials and contact information.
  • Twitter
    Python Developer
    Twitter Aug 2016 - Jul 2017
    San Francisco, CA, US
    Created Hive fact tables on top of raw data from different retailers, partitioned by time dimension key, retailer name, and data supplier name, which were further processed and pulled by the analytics service engine. Utilized Python libraries such as NumPy and Pandas for processing tabular data. Designed and optimized Spark SQL queries and DataFrames to import data from data sources, perform transformations and read/write operations, and save the results to an output directory in HDFS/AWS S3. Implemented a RESTful web service to interact with the Redis cache framework. Developed endpoints to cache application-specific data in in-memory data clusters like Redis, and exposed them as RESTful endpoints. Designed the front end and back end of the application utilizing Python on the Django web framework. Created PySpark DataFrames to bring data from DB2 to Amazon S3. Optimized the PySpark jobs to run on a Kubernetes cluster for faster data processing. Modeled Hive partitions extensively for data separation and faster processing, and followed Hive best practices for tuning. Managed large-scale, geographically distributed database systems, including relational (Oracle, SQL Server) and NoSQL (MongoDB, Cassandra) systems. Developed Spark applications in PySpark in a distributed environment to load a large number of CSV files with different schemas into Hive ORC tables. Designed, built, and managed the ELK (Elasticsearch, Logstash, Graphite, Kibana) cluster for centralized logging and search functionality for the app.
  • SAP Labs India
    Python Developer
    SAP Labs India May 2015 - Jul 2016
    Walldorf, BW, DE
    Experienced with configuration management tools like Puppet and Ansible. Used jQuery and AJAX calls for more interactive web pages. Ensured quality versioning with Git. Initiated continuous integration and automation using Jenkins, and used Jira for tracking and updating project issues. Implemented Docker containers to create images of the applications and dynamically provision slaves to Jenkins CI/CD pipelines. Knowledge and experience of agile development (e.g., XP, Scrum, Kanban) and continuous integration. Built various graphs using the Matplotlib package, which helped in making business decisions. Applied architecture and design principles, and developed Python mapper and reducer scripts implemented via Hadoop streaming. Created RESTful APIs to integrate and enhance functionality of the application, and also utilized RESTful APIs to communicate with third parties.
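The CSV-to-S3 loading described in the AT&T role can be sketched roughly as follows; the bucket name, key layout, and helper names here are hypothetical illustrations, not details from the actual project:

```python
from pathlib import Path

def s3_key_for(csv_path, prefix="raw/csv"):
    """Build the S3 object key for a local CSV file.

    The prefix-based key layout is an assumption for illustration;
    "folder management" in S3 is really just key-prefix management.
    """
    name = Path(csv_path).name
    if not name.lower().endswith(".csv"):
        raise ValueError(f"not a CSV file: {csv_path}")
    return f"{prefix}/{name}"

def upload_csv(csv_path, bucket="example-bucket"):
    """Upload a single CSV to S3 (bucket name is hypothetical)."""
    import boto3  # deferred import so the key helper stays stdlib-only
    s3 = boto3.client("s3")
    key = s3_key_for(csv_path)
    s3.upload_file(str(csv_path), bucket, key)
    return key
```

Different prefixes per bucket (logs, exports, and so on) would then give the per-bucket folder structure the role description mentions.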
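The partitioned Hive fact tables described in the Twitter role rely on Hive's key=value directory layout; a minimal sketch of building such a partition path, with assumed column names, might look like this:

```python
import re

def _clean(value):
    # Hive partition values are safest as lowercase alphanumerics/underscores
    return re.sub(r"[^A-Za-z0-9_\-]", "_", str(value)).lower()

def partition_path(base, time_key, retailer, supplier):
    """Build a Hive-style partition directory path.

    The partition columns (time_dim_key, retailer, supplier) mirror
    the fact-table design described above, but the exact column
    names are assumptions for illustration.
    """
    return (f"{base}/time_dim_key={_clean(time_key)}"
            f"/retailer={_clean(retailer)}"
            f"/supplier={_clean(supplier)}")
```

Keeping high-cardinality keys like time first and sanitizing values this way lets partition pruning skip whole directories when queries filter on the partition columns.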
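The Hadoop streaming mapper and reducer scripts mentioned in the SAP Labs role communicate over stdin/stdout as tab-separated key/value lines; a self-contained word-count sketch of that pattern (not the actual project scripts) is:

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit one 'word<TAB>1' line per word, the form
    Hadoop streaming expects mappers to write to stdout."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word.lower()}\t1"

def reducer(lines):
    """Reduce step: sum the counts for each word. Assumes input is
    sorted by key, which the Hadoop shuffle/sort phase guarantees."""
    pairs = (line.split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"
```

In a real job the two generators would live in separate scripts wired up with `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...`; locally the pipeline can be simulated by sorting the mapper output and feeding it to the reducer.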

Menu M Skills

API Development, Pandas, Amazon Web Services, Data Engineering, Django, Continuous Integration and Continuous Delivery, Logstash, Jenkins, Elastic Stack, Agile Methodologies, Amazon EC2, Hadoop, Cascading Style Sheets, CVS, HTML, PySpark, Docker, Products, PL/SQL, Kubernetes, Hive, JavaScript, AWS Lambda, NumPy, Python, Application Programming Interfaces, Amazon S3, MySQL, MapReduce, GitHub

Frequently Asked Questions about Menu M

What company does Menu M work for?

Menu M works for Hulu.

What is Menu M's role at the current company?

Menu M's current role is Senior Python Developer at Hulu.

What skills is Menu M known for?

Menu M has skills like API Development, Pandas, Amazon Web Services, Data Engineering, Django, Continuous Integration and Continuous Delivery, Logstash, Jenkins, Elastic Stack, Agile Methodologies, Amazon EC2, and Hadoop.

Who are Menu M's colleagues?

Menu M's colleagues are Guillermo Asturias, Victoria Pruim, Steve Wescott, Jasmin Maleksalehi, Vince Allen, Lorraine Sposto, Mez ..
