David Ellington
I am experienced in designing and implementing complex ETL processes, ensuring data quality, and performing data modeling. Core technologies I work with include Airflow, Snowflake, Python, dbt, and Spark (EMR, Dataproc, and Databricks).
Senior Software Engineer, ZoomInfo (Apr 2024 - Present), Vancouver, WA, US
Senior Data Engineer, Sunstate Equipment Co., LLC (May 2023 - Mar 2024), Phoenix, AZ, US
• Developed ETL pipelines with Fivetran, Airflow, AWS Lambda, AWS Glue, and Snowflake.
• Built and maintained dbt models for streamlined data transformation.
• Introduced Airflow and assisted team members in its use.
• Replaced expensive Fivetran pipelines with PySpark, Lambda, and Airflow solutions, reducing costs.
• Introduced pytest for unit testing Python code.
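The pytest practice mentioned above can be sketched as follows. The transformation function and its test are purely illustrative examples of the kind of ETL helper code that benefits from unit tests; they are not taken from any actual codebase.

```python
# Hypothetical ETL helper: normalize a raw phone string to ten digits,
# stripping punctuation and a leading US country code.
def normalize_phone(raw: str) -> str:
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    return digits

# pytest discovers and runs any function named test_*; plain asserts suffice.
def test_normalize_phone():
    assert normalize_phone("+1 (503) 555-0147") == "5035550147"
    assert normalize_phone("503.555.0147") == "5035550147"
```

Keeping transformations as small pure functions like this, rather than inline script logic, is what makes them testable without standing up the full pipeline.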
Senior Software Engineer, ZoomInfo (Apr 2022 - Apr 2023), Vancouver, WA, US
• Deploying AWS data platform infrastructure using Terraform Cloud.
• Developing Jenkins pipelines to deploy code to the AWS platform.
• Developing Airflow utilities.
• Developing ETLs using Airflow, AWS, and Snowflake.
• Designed and contributed to the implementation of an in-house dbt Core solution running on AWS Batch, orchestrated and triggered by Airflow.
Manager, Enterprise Data Engineering, ZoomInfo (Oct 2019 - Apr 2022), Vancouver, WA, US
• Playing the role of a player-coach engineering manager: managing a data engineering Scrum team and doing engineering work alongside the rest of the team.
• Conducting code reviews and working with my reports on goals, training, and performance.
• Working with the team to continually improve our code, architecture, and processes for better reliability, quality, performance, and cost-efficiency.
• Developing a Scala Spark library that plugs into a PySpark codebase, allowing reading of HBase SequenceFiles and fast deserialization of protobuf-serialized data.
• Working closely with the product owner to coordinate with other technical teams and to understand stakeholder requirements and how best to meet them with a technical solution.
• Developing ETL to load star schemas in Snowflake.
• Adding technical spikes and tech-debt remediation stories to the backlog.
• Writing custom Airflow hooks, operators, and sensors.
Lead Data Engineer, DiscoverOrg (Sep 2018 - Oct 2019), Vancouver, WA, US
• Providing tech leadership on projects to stand up Apache Airflow for ETL pipeline orchestration; move ETL code from Databricks notebooks into Python eggs; move Spark ETL processing from Databricks to AWS EMR and back again to Databricks Light; and apply pytest unit tests to PySpark code.
• Loading high-volume complex JSON from S3 data lake sources into the Snowflake data warehouse via Spark on EMR/Databricks Light, orchestrated by Airflow.
• Making over 6 billion (and counting) JSON records queryable by developing ETL to load them into a partitioned, z-ordered Delta Lake table.
• Using SNS/SQS to manage dependencies between Airflow DAGs, and Slack for Airflow failure notifications.
• Developing an automated, scheduled Python process, driven by YAML templates, to correct Snowflake permissions.
• Working with stakeholders to quickly generate prototype options for ETL targets in Snowflake.
• Evangelizing the Scrum framework and encouraging my team to stick with and improve at it.
• Mentoring engineers/analysts on Spark, Databricks, EMR, Airflow, Python, and SQL.
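The template-driven permissions process described above could look something like the sketch below. The template structure, role names, and schema names are hypothetical (shown as the dict a YAML loader would produce); the actual tool's design is not documented here.

```python
# Hypothetical permissions template, as a YAML loader would parse it:
# each role maps to the database/schema grants it should hold.
TEMPLATE = {
    "ANALYST_ROLE": {
        "database": "ANALYTICS",
        "schemas": {"REPORTING": ["SELECT"], "STAGING": ["SELECT", "INSERT"]},
    },
}

def grant_statements(template: dict) -> list[str]:
    """Emit the Snowflake GRANT statements that enforce the template."""
    stmts = []
    for role, spec in template.items():
        db = spec["database"]
        stmts.append(f"GRANT USAGE ON DATABASE {db} TO ROLE {role};")
        for schema, privs in spec["schemas"].items():
            stmts.append(f"GRANT USAGE ON SCHEMA {db}.{schema} TO ROLE {role};")
            for priv in privs:
                stmts.append(
                    f"GRANT {priv} ON ALL TABLES IN SCHEMA {db}.{schema} TO ROLE {role};"
                )
    return stmts
```

A scheduled job would then execute these statements against Snowflake, so drift in permissions is corrected automatically rather than by hand.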
Senior Data Engineer, Nike (Nov 2017 - Sep 2018)
• Performing as tech lead for a supply and materials planning analytics squad.
• Designing and implementing data pipelines and structures in AWS/Airflow/Snowflake based on requirements from product owners, analysts, viz developers, and business partners within an Agile/Scrum methodology.
• Initiating Spark-based data ingest (on-prem Teradata/Oracle to AWS).
• Writing ETL jobs in both Scala Spark and PySpark.
• Initiating new patterns for PySpark: more modular/testable/reusable code as opposed to scripting; Python 3.4 instead of 2.7; unit testing Spark ETL code.
• Writing Airflow DAGs and configuring Airflow to run jobs that spin up/down EMR clusters and run Spark jobs.
• Designing and implementing an architecture where data sources are snapshotted in S3 Parquet files and ETL can be run/rerun for historical dates, allowing bug fixes and enhancements to ETL code to be applied over entire output datasets.
• Loading detail and aggregate data into the Snowflake cloud data warehouse via Spark ETL for use by Tableau and ad hoc querying.
• Developing Python utilities for applying correct permissions to Snowflake objects and for scripting Snowflake objects to files for source control.
• Mentoring data engineers on Spark, EMR, Airflow, S3, Python, SQL, etc.
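The snapshot/rerun architecture above hinges on deterministic, date-partitioned input paths. A minimal sketch, assuming an illustrative `s3://bucket/snapshots/<source>/ds=YYYY-MM-DD/` layout (the real bucket and prefix conventions are not documented here):

```python
from datetime import date

def snapshot_prefix(bucket: str, source: str, ds: date) -> str:
    """Path where a source's snapshot for a given run date lands."""
    return f"s3://{bucket}/snapshots/{source}/ds={ds.isoformat()}/"

def backfill_prefixes(bucket: str, source: str, dates: list[date]) -> list[str]:
    """Prefixes to reprocess when a bug fix must be reapplied over history."""
    return [snapshot_prefix(bucket, source, d) for d in dates]
```

Because each run reads the exact inputs captured for its date, rerunning a fixed ETL job over historical dates reproduces (or corrects) the full output history instead of only data going forward.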
Tech Lead (Senior Application Engineer HR Title), Nike (Jul 2016 - Nov 2017)
• Performing as tech lead for a supply planning dashboard team.
• Working in an Agile/SAFe (Scaled Agile Framework) development environment.
• Designing star schemas on Teradata supporting Cognos and Tableau reporting.
• Working with large datasets, some exceeding 1 billion rows.
• Designing temporal tables on Teradata for capturing changes in data over time.
• Tuning Teradata SQL queries (Teradata is a massively parallel processing (MPP) RDBMS).
• Mentoring fellow engineers/analysts in SQL and data design.
• Assisting business users in tuning/scheduling Alteryx workflows.
• Consistently looking for ways to improve designs, systems, and processes.
Intermediate Business Intelligence Engineer, Nike (Feb 2015 - Jul 2016)
• Designing star schemas on Teradata supporting Cognos and Tableau reporting.
• Completing a Cognos Framework project.
• Working with large datasets, some exceeding 1 billion rows.
• Serving as a go-to resource for diagnosing and troubleshooting reporting/data issues and suggesting appropriate fixes.
• Writing a white paper on data architecture and database design.
• Participating in a two-day hackathon to grow awareness of potential security vulnerabilities.
• Writing several C# utilities to accelerate Teradata development: auto-generating dependency diagrams showing relationships between tables and views, comparing DDL across Teradata servers, writing DDL from entire Teradata databases to files for source control, and analyzing functional dependencies between keys/attributes within tables.
Database/BI/Software Developer, Menlo Worldwide (May 2011 - Feb 2015), Greenwich, CT, US
• Developing, maintaining, and debugging SQL Server 2005/2012 databases, including stored procedures, functions, triggers, views, SQL Server Agent jobs, and permissions, plus light DBA work.
• Designing, maintaining, and debugging ETL processes using SSIS, including .NET script tasks.
• Designing and maintaining end-user reporting using SSRS (SharePoint) and Excel, as well as writing queries for ad hoc data requests.
• Gathering requirements from internal and external users and creating technical documentation and diagrams.
• Enhancing and supporting existing .NET WinForms applications using both VB.NET and C#.
• Developing ASP.NET MVC web apps using technologies/modules such as Entity Framework Code First Migrations, Castle Windsor IoC, ELMAH, AutoMapper, Bootstrap, jQuery, AJAX, JSON, and toastr.
• Administering web apps and WCF services in IIS.
• Designing complex build configs in TeamCity for building and deploying applications, web apps, and WCF services.
• Unit testing with NUnit and Moq.
• Engaging in team development using SVN source control.
EDI Analyst, Menlo Worldwide (Jun 2009 - May 2011), Greenwich, CT, US
• Developing, testing, and implementing new trading partner setups for both inbound (204, 211, 820) and outbound (210, 214, 990) ANSI X12 EDI transactions, primarily version 4010; working daily with external customers and third parties, as well as with Con-way sales, support, and other Con-way IT groups, to support new and existing EDI customers.
• Creating, updating, and maintaining several Access database tools: an EDI setup work queue, an EDI reject work tool, and customer-specific tools for matching shipments (customer reports to Con-way reports) missing 214 status updates.
• Using and maintaining a SQL Server 2008 Express database with an Access frontend, which I created, to load daily inbound 204/211 data (over 3 million records) for shipment search and statistics/reporting.
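For readers unfamiliar with X12, the transaction sets above (204, 211, 214, etc.) are delimiter-separated text. A toy sketch of splitting an interchange into segments and elements, assuming the common `~` segment and `*` element separators (real interchanges declare their delimiters in the ISA header, which this sketch ignores):

```python
def parse_x12(interchange: str) -> list[list[str]]:
    """Split a raw X12 string into segments, each a list of elements.

    Assumes '~' terminates segments and '*' separates elements; a real
    parser must read the delimiters from the ISA envelope instead.
    """
    segments = [s for s in interchange.strip().split("~") if s]
    return [seg.split("*") for seg in segments]

# Abbreviated, made-up fragment of a 204 (Motor Carrier Load Tender):
sample = "ST*204*0001~B2**SCAC**12345**PP~SE*3*0001~"
parsed = parse_x12(sample)
```

Here `parsed[0]` is the `ST` segment, whose second element (`204`) identifies the transaction set, which is how routing to a partner-specific setup would begin.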
David Ellington Skills
David Ellington Education Details
Drexel University, Library and Information Science
Portland State University, Psychology
Frequently Asked Questions about David Ellington
What company does David Ellington work for?
David Ellington works for ZoomInfo.
What is David Ellington's role at the current company?
David Ellington's current role is Senior Software Engineer.
What is David Ellington's email address?
David Ellington's email address is da****@****ike.com
What is David Ellington's direct phone number?
David Ellington's direct phone number is +150367*****
What schools did David Ellington attend?
David Ellington attended Drexel University and Portland State University.
What skills is David Ellington known for?
David Ellington has skills like Python, SQL, Data, ETL, Business Intelligence, Database Design, Data Analysis, Data Modeling, Data Warehousing, Microsoft SQL Server, SSIS, and SSRS.