Christopher Sparks

Senior Software Engineer with 18+ years of industry experience in Full Stack Development, Solution Architecture, Data Science, and Generative AI.
Christopher Sparks's Location
Lake Alfred, Florida, United States
About Christopher Sparks

With over a decade of experience in software engineering, my core competencies lie in asynchronous programming and system integration, particularly using Python and NodeJS. At Verizon, I was deeply committed to enhancing system maintainability and real-time data processing, aligning my work with the company's innovative edge. My mission has always been to simplify complex systems, ensuring that my contributions bolster the organization's goals and culture by bringing in diverse technical perspectives and robust solutions.

At Verizon, I played a pivotal role in refactoring crucial microservices, eliminating dependencies to streamline deployments. My work ranged from leveraging asynchronous Python for device configurations to enhancing data quality tools for field technicians, all while modernizing CI/CD pipelines. These efforts were complemented by my previous experience at GenesisCare, where I led technical interviews, fostered team cohesion, and bridged knowledge gaps, ensuring seamless integration and collaboration between global teams.

Christopher Sparks Work Experience Details
  • Verizon
    Senior Software Engineer
    Verizon Apr 2023 - Oct 2024
    Basking Ridge, NJ, US
    • Refactored a Python/Django/FastAPI microservice for remote cell site configuration, eliminating RabbitMQ and Celery dependencies to simplify deployments and enhance maintainability
    • Leveraged asynchronous Python network I/O to enable real-time configuration of 15-80 devices simultaneously
    • Enhanced a Python-based data quality tool for field technicians using XLSXWriter and Pydantic, generating self-validating MS Excel workbooks to ensure schema compliance
    • Developed a NodeJS adapter microservice with Express and a React front end, integrating vendor platforms to eliminate a 6-8 month, 4-developer project
    • Modernized CI/CD pipelines by migrating to GitLab CI and integrating Jenkins with Blue Ocean, automating deployments and improving delivery workflows
    • Established Infrastructure-as-Code (IaC) pipelines with Docker, Groovy scripts, and Ansible, introducing semantic versioning and reducing deployment risks
    • Assessed and integrated vendor platforms, including the Itential IAP/IAG platform, resolving compatibility issues and streamlining service integrations
    • Authored SDLC guides to align tools like Git, Jira, and Rally with enterprise processes, while incorporating Agile ceremonies (e.g., sprint kick-offs, retrospectives, and stand-ups)
    • Simplified deployments and improved system reliability by eliminating unnecessary dependencies, achieving a 90% reduction in production issues
    • Reduced data rework time by 95%, saving over 6 hours per sprint with self-validating Excel tools for field teams
    • Delivered 6-8 months of development savings by efficiently integrating vendor platforms, enabling faster project completion
    • Increased test coverage by 70% and streamlined delivery workflows through CI/CD modernization and semantic versioning
    • Enhanced team alignment with enterprise standards by integrating the Scaled Agile Framework (SAFe) into workflows and improving collaboration across tools like Jira and Rally
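    The asynchronous fan-out described in this role can be sketched with stdlib Python. This is a minimal illustration, not code from the actual system: the device names, the `push_config` coroutine, and the `asyncio.sleep` stand-in for real network I/O are all hypothetical.

    ```python
    import asyncio

    async def push_config(device: str, sem: asyncio.Semaphore) -> str:
        """Push a configuration to one device, bounded by the semaphore."""
        async with sem:
            # Stand-in for real network I/O (e.g. an HTTP or NETCONF call).
            await asyncio.sleep(0.01)
            return f"{device}: ok"

    async def configure(devices: list[str], limit: int = 80) -> list[str]:
        # Cap concurrent sessions at `limit` (15-80 devices per the bullet above);
        # gather() preserves the input order of results.
        sem = asyncio.Semaphore(limit)
        return await asyncio.gather(*(push_config(d, sem) for d in devices))

    results = asyncio.run(configure([f"cell-site-{i}" for i in range(15)]))
    ```

    The semaphore is what makes the fan-out safe: without it, `gather` would open one session per device at once regardless of fleet size.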
  • Career Break
    Caregiving
    Career Break Sep 2022 - Feb 2023
    As a direct result of Hurricane Ian in Florida, I put my job search on hold to help elderly parents recover from extensive damage to their assets and disruption to both their lives and their finances.
  • GenesisCare
    Technical Engineering Lead - USA
    GenesisCare May 2021 - Jul 2022
    Sydney, NSW, AU
    • Designed and implemented the USA development team interview process, conducting technical interviews and fostering team cohesion through collaborative input
    • Identified knowledge gaps between US and AU teams, facilitating knowledge transfer via lecture series, working sessions, and lunch-and-learns to resolve integration challenges
    • Led the US team as Technical Lead, performing Python codebase/system discovery and collaborating with AU engineers to complete initial tasks and develop discussion frameworks
    • Enhanced a Django-based oncology portal with custom ReactJS components and maintained Cypress unit tests to ensure UI reliability and consistency
    • Maintained and optimized the GraphQL API using the Python Graphene library, representing custom domain objects and improving domain-specific queries
    • Implemented Redis for key-value caching and PubSub functionality, utilizing Change-Data-Capture (CDC) to synchronize the MS SQL-based EMR with PostgreSQL and the Celery Task Runner to publish events to Redis streams
    • Contributed to AWS DevOps workflows by developing locally with Docker Compose, pushing changes to GitHub, configuring GitHub Actions to trigger AWS CloudFormation, and deploying containers to AWS ECS for production environments
    • Standardized the technical hiring process and onboarded the USA development team successfully
    • Enabled smooth integration between US and AU teams, addressing key technical and knowledge gaps
    • Improved system reliability by extending the Django oncology portal and GraphQL API
    • Established real-time database synchronization using Redis and Change-Data-Capture mechanisms
    • Enhanced DevOps workflows, improving deployment consistency and infrastructure scalability in AWS
  • Magic Leap
    Senior Software Engineer
    Magic Leap Jul 2020 - Dec 2020
    Plantation, Florida, US
    • Documented, improved, and maintained a Docker-based architecture for AR/VR testing, managing cluster deployments with Rancher, Kubernetes, and Docker
    • Streamlined deployment and promotion processes using Gerrit, Jenkins, and Artifactory, enhancing system efficiency and reliability
    • Migrated Infrastructure-as-Code (IaC) workflows from Ansible to AWS CloudFormation and GitHub Actions, optimizing Docker images with multi-stage builds for reusable deployments
    • Automated CI/CD pipelines for tools such as Docker, Jenkins, and Rancher, improving deployment consistency and reducing manual effort
    • Developed local development environments using Docker Compose, replacing staging environments and improving developer productivity
    • Conducted peer code reviews with Gerrit, standardizing the codebase by converting Bash scripts to Python for better maintainability and scalability
    • Maintained and extended the ELK Stack (Elasticsearch, Logstash, Kibana), optimizing Logstash ingestion pipelines for Jenkins and REST APIs, and improving Kibana dashboards with role-based access control (RBAC)
    • Reduced deployment times by 50% through automated CI/CD workflows and streamlined promotion processes
    • Eliminated merge conflicts and saved 3+ hours per sprint by introducing local development environments using Docker Compose
    • Improved scalability and reduced build times by 80% with multi-stage Docker builds and IaC optimizations
    • Enhanced monitoring and troubleshooting capabilities by optimizing data ingestion pipelines and improving Kibana dashboards for the ELK Stack
  • John Deere
    Geospatial Analytics Engineer
    John Deere Jun 2019 - Nov 2019
    Moline, IL, US
    • Designed and implemented an interactive WebGL "Agronomic Data Mapping Tool" for visualizing farm metrics over satellite snapshots, integrating sensor data, atmospheric data, and IoT farm data using a modern technology stack (WebGL, React, and NodeJS)
    • Processed datasets larger than 50GB directly in-browser with GPU computation, transitioning from static offline Python visualizations to interactive online WebGL visualizations
    • Conducted market analysis and competitor reviews to guide architecture and developed a JupyterLab extension to export visualizations, integrating Python and Javascript workflows seamlessly
    • Delivered a GPU-optimized field-data visualization web app, reducing processing from ~3 minutes per static visualization to instant, real-time, customizable visualizations
    • Enhanced Data Scientist workflows by developing a Jupyter extension enabling output of React/WebGL interactive widgets alongside traditional matplotlib outputs
    • Drastically enhanced the end-user experience by removing the dependency on internal data scientists, empowering users to derive custom insights from their own data
    • Enabled seamless adoption by partners, bypassing standard Architectural Review Board processes by adopting common web standards, e.g. Typed Arrays, binary data protocols, WebGL, and the Canvas API
    • Presented the tool to key stakeholders
  • Westrock Company
    Senior Data Engineer
    Westrock Company Oct 2018 - Jul 2019
    Atlanta, Georgia, US
    • Developed unit testing and regression testing suites for a big data ingestion and transformation pipeline, improving data quality and speeding up the Change-Test-Verify cycle by over 140%
    • Refactored heavily-used Python/PySpark modules via method decomposition, creating reusable test cases and providing a development blueprint for less experienced engineers
    • Implemented a regression test suite to validate changes to the Python codebase, ensuring consistent results during ETL pipeline refactoring
    • Automated artifact generation for Hadoop, Spark, and AWS Data Pipeline, dynamically creating AWS JSON definitions, Bash-wrapped Apache Sqoop invocations, Hive HQL DDL table definitions, and AWS job configuration files
    • Transformed source data for compatibility with Multidimensional OLAP Cubes using custom Python scripts tailored for the Enterprise Data Warehouse
    • Sped up testing and validation by over 140%, streamlining developer workflows and expediting the release process
    • Enhanced pipeline efficiency and scalability by optimizing ETL workflows and balancing schema-driven automation
    • Reduced development time by automating artifact generation, saving an estimated 10 hours per week
    • Improved compatibility with enterprise data solutions by transforming datasets for OLAP Cube integration
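    A regression suite of the kind described here can be as simple as pinning the legacy function's outputs and asserting the refactored version matches them. This is a generic sketch: `clean_amount` and its legacy twin are hypothetical stand-ins, not code from the actual pipeline.

    ```python
    def clean_amount_legacy(raw: str) -> float:
        # Original behavior being preserved through the refactor.
        return float(raw.replace("$", "").replace(",", "").strip())

    def clean_amount(raw: str) -> float:
        # Refactored, decomposed version under test.
        stripped = raw.strip().lstrip("$")
        return float(stripped.replace(",", ""))

    # Golden cases captured from the legacy code path; any divergence
    # in the refactored function fails fast.
    CASES = ["$1,234.50", " 99 ", "$0.01"]
    for case in CASES:
        assert clean_amount(case) == clean_amount_legacy(case)
    ```

    Capturing golden inputs and outputs before refactoring is what makes large PySpark rewrites safe: the suite encodes "consistent results" as executable checks rather than manual verification.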
  • Red Hat
    Solutions Architect, Data Science
    Red Hat Nov 2017 - Aug 2018
    Raleigh, NC, US
    • Guided machine learning platform selection by creating evaluation rubrics for the Business Intelligence Team and Data Science Working Group, consulting with stakeholders to define data science requirements and scope infrastructure needs, multi-tenancy, and platform capabilities
    • Documented ad hoc infrastructure and business processes, authoring a comprehensive "Discovery Document" to facilitate gap analysis between current and desired capabilities
    • Developed an AWS infrastructure strategy to support enterprise data science practices, ensuring scalability for current and future workflows while designing architectures for the "productionization" of data science models
    • Defined a "Pathway to Production" for data science models and ETL pipelines, integrating AWS infrastructure, automation tools, business processes, and programming best practices
    • Identified and defined proprietary semantic datasets by manually discovering IT artifacts and conducting structured stakeholder interviews to enrich dataset definitions
    • Ensured GDPR compliance for a multi-tenant machine learning platform by developing security architectures, data usage policies, and anonymization strategies for P.I.I. in ETL pipelines
    • Optimized Python packages and jobs for data science workflows, introducing Dask for parallelizing Pandas operations, refactoring long-running jobs for efficiency, and remediating ETL scripts during annual Territory and Quota Management cycles
    • Delivered evaluation frameworks that guided enterprise machine learning platform selection, ensuring alignment with strategic goals
    • Optimized data science models, increasing performance by over 40% with Python refactorings and scalable tools like Dask
    • Achieved GDPR compliance across data management processes, mitigating legal and operational risks
    • Defined semantic datasets critical for enterprise-wide business intelligence and data science initiatives, improving data accessibility and usability
  • Honeywell
    Senior Application Architect
    Honeywell Sep 2015 - Jul 2017
    Charlotte, North Carolina, US
    • Developed an Auditing/Compliance application per NIST 800-37 for DoD and FedRAMP, reviewing over 60 NIST Special Publications and mapping ISO 27001/27002 relationships to FedRAMP, FINRA, and HIPAA to authorize the AWS Government Cloud for government use
    • Designed a data model compliant with ISO 27001/27002, incorporating OMB Circulars, DoD Directives, and Security Control Overlays, ensuring support for HIPAA through extensive research
    • Built a Python ETL pipeline to scrape GSA Per Diem rates, process NIST 800-53 metadata, and integrate DISA manual checklists in XCCDF format to maintain ecosystem data freshness
    • Authored a Development Roadmap and Migration Plan from legacy Python, leveraging Django, PyLint, and custom libraries for static/dynamic analysis, isolating complex logic for iterative refactoring, and creating a Living Style Guide based on Atomic Design Principles for UX
    • Architected a minimalist front-end stack using jQuery, Backbone, Materialize, SASS, and Webpack, providing a comparative analysis of React and Angular alternatives
    • Developed a Python REST API adhering to HTTP standards, enabling ecosystem interoperability through Django middleware for presenting hierarchical data in multiple formats (JSON, XML, and CSV)
    • Managed AWS pilot environments, administering S3, EC2, IAM, RDS, and VPC services to support system engineering and administration tasks
    • Delivered a fully compliant auditing solution enabling FedRAMP authorization of the AWS Government Cloud for governmental use
    • Authored an NSA proposal on a Risk Management Framework extension
    • Improved data freshness and system accuracy by integrating real-time scraping and processing pipelines
    • Standardized user experience across products with a Living Style Guide based on Atomic Design Principles
    • Reduced legacy system complexity through systematic migration planning and refactoring, ensuring ecosystem interoperability
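    Serving the same hierarchical data as JSON, XML, or CSV, as this role describes, boils down to one renderer dispatched on the requested format. The sketch below uses only the Python standard library; the `render` function and the sample control records are illustrative assumptions, not the actual Django middleware.

    ```python
    import csv
    import io
    import json
    import xml.etree.ElementTree as ET

    def render(records: list[dict], fmt: str) -> str:
        """Render the same flat records as JSON, XML, or CSV."""
        if fmt == "json":
            return json.dumps(records)
        if fmt == "xml":
            root = ET.Element("records")
            for rec in records:
                item = ET.SubElement(root, "record")
                for key, value in rec.items():
                    ET.SubElement(item, key).text = str(value)
            return ET.tostring(root, encoding="unicode")
        if fmt == "csv":
            buf = io.StringIO()
            writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
            writer.writeheader()
            writer.writerows(records)
            return buf.getvalue()
        raise ValueError(f"unsupported format: {fmt}")

    # Hypothetical security-control rows standing in for the real data model.
    controls = [{"id": "AC-2", "family": "Access Control"}]
    ```

    In a middleware setting, `fmt` would typically come from the request's `Accept` header or a query parameter, with the view itself staying format-agnostic.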
  • Zencos
    Senior Analytics Platform Engineer
    Zencos Jul 2014 - Jul 2015
    Cary, NC, US
    • Developed a Python "Open-Stack" Analytics Platform for a DoD/DHS big data initiative, enabling early detection of disease outbreaks by leveraging Python 2.7, scikit-learn, NumPy, and Pandas
    • Created a versioned record data store integrated into Talend ETL workflows, supporting syndrome vector analyses with INI-based configuration parsing
    • Customized an IPython/RStudio Server grid roll-out for a legal analytics firm, enabling distributed execution via Platform LSF and Python-based grid resource stress tests using k-means clustering
    • Performed SAS Enterprise Case Management (ECM) platform customizations, delivering Javascript solutions with the Dojo framework and DWR for server communication, and training over 20 employees
    • Designed a stand-alone Python 3.4-based Data Drill-Down component, utilizing a stack including Backbone.js, Bootstrap 3, jQuery, and SQLAlchemy ORM for compatibility across Oracle, PostgreSQL, MySQL, and SQLite
    • Delivered an analytics platform supporting DHS's disease outbreak management initiative, improving resource allocation efficiency
    • Enhanced distributed computing workflows for legal analytics through custom grid integration with IPython Notebook, increasing computational capacity
    • Streamlined SAS ECM adoption by delivering customizations and conducting employee training sessions
    • Developed a modular Data Drill-Down component with scalable architecture and ORM flexibility, enabling seamless database integration
  • Therasim, Inc.
    Analytics Platform Development Lead
    Therasim, Inc. Jan 2013 - Feb 2014
    • Rebuilt the in-house data analytics platform by analyzing legacy Perl code and rewriting it in PHP, resolving inconsistencies through refactoring, dynamic introspection, and regression testing
    • Conducted requirements discovery and vendor evaluations, assessing Pentaho and QlikView for cost-effectiveness, and developed interview questionnaires to gather data usage insights from the Medical Content Team
    • Designed a hybrid PRD/FRD project solution using a custom OLAP Cube and ETL toolset, applying Kimball methodology for dimensional modeling
    • Transitioned graphing capabilities from Raphaël JS to a Highcharts/D3 hybrid, leading reconciliation tasks and cross-referencing metrics with Google Universal Analytics
    • Built a REST API and front-end visualization system using Python, Javascript, HTML5, CSS3, and JSON to expose dimensional data insights
    • Refactored simulation software by converting it from stateful to stateless for dynamic data polling, utilizing PHPUnit for testing
    • Administered AWS infrastructure, managing DNS, Load Balancing, EC2, S3, Route 53, RDS, and CloudWatch to ensure scalability and performance
    • Delivered a fully re-architected analytics platform that surpassed functionality and accuracy expectations
    • Successfully introduced dimensional modeling principles and modernized data visualization tools, improving system usability and scalability
    • Reduced testing time and improved platform reliability by optimizing simulation workflows
    • Enhanced system scalability and performance through effective AWS service integration and migration processes
  • Red Hat
    Enterprise Software Architect
    Red Hat Jun 2012 - Jan 2013
    Raleigh, NC, US
    • Designed and implemented an enterprise-wide Single Sign-On (SSO) solution using SAML 2.0 within a J2EE ecosystem, collaborating with stakeholders to integrate internal and external systems
    • Authored the enterprise data model definition, aligning SSO infrastructure with systems like the intranet, Customer Portal, and Subscription Network
    • Conducted an in-depth analysis of the JBoss AS core codebase, ensuring compatibility with SSO vendor platform architecture and Red Hat middleware (e.g., ESB)
    • Enhanced the Drupal 6 intranet platform by developing wildcard search, hierarchical traversal, and profile extensions for the "Staff Roster" application
    • Designed a web-based real-time chat prototype using Jabber to evaluate enterprise collaboration platforms and technologies
    • Improved security and user experience by centralizing authentication with a robust SSO solution, reducing login errors by 30%
    • Enhanced employee productivity by upgrading the intranet's search and navigation capabilities, reducing time spent locating resources by 20%
    • Improved system interoperability and vendor integration success rates by aligning SSO infrastructure with enterprise data models
    • Reduced project overhead by delivering a functional collaboration prototype, accelerating vendor evaluations by 50%
  • NetApp
    Data Migration Software Engineer
    NetApp Apr 2011 - Apr 2012
    San Jose, California, US
    • Embedded WebKit into a Python-based migration tool, enabling dynamic jQuery-to-PyQt interactions for generating reports and providing interactive recovery and reconciliation steps for field technicians
    • Architected a Python application for enterprise data migration planning, supporting the discovery of NetApp ONTAP filers and identifying migration-prohibitive states such as replication setups and Kerberos-secured NFSv3 exports
    • Developed a Python/PyQt GUI diff tool, streamlining QA processes by efficiently identifying configuration changes and discrepancies across SAN migration tools
    • Created a Python API leveraging the NetApp Autosupport HTTP/REST API, extending filer discovery capabilities and generating detailed infrastructure reports
    • Enhanced field technician efficiency with interactive reconciliation tools, improving troubleshooting speed and reducing errors
    • Streamlined data migration processes with a robust discovery and planning application tailored to NetApp ONTAP environments
    • Increased QA efficiency by automating configuration change detection, reducing manual review effort by 40%
    • Improved infrastructure reporting accuracy and operational planning through extended NetApp Autosupport API capabilities
  • Electronic Arts (EA)
    C++ Software Engineer
    Electronic Arts (EA) Jan 2011 - Apr 2011
    Redwood City, CA, US
    • Extended a distributed online interface for gaming titles on current-generation consoles, contracted by EA Sports, to enhance the C++-based Blaze server for Madden 2011, optimizing Xbox Live interactions through extensive refactoring
    • Authored Lua-to-C++ stress testing scripts to identify performance bottlenecks and ensure server reliability
    • Augmented the C++ HTTP handler for multipart/form-data requests, integrating complex HTTP interaction wrappers using libcurl
    • Improved the scalability and functionality of an online interface, supporting high-demand gaming applications
    • Enhanced server reliability and performance through stress testing and optimizations, ensuring a seamless gaming experience
    • Expanded HTTP handling capabilities, reducing operational delays in server-side request processing
  • Travel TV Network, LLC
    Principal Software Engineer
    Travel TV Network, LLC Jan 2008 - Jan 2010
    US
    • Led a team of three developers to deliver a web-based social media business platform, ensuring on-budget and on-schedule completion
    • Maintained and extended OpenX ad-serving technology, leveraging PHP/MySQL to enable immediate monetization of web product traffic
    • Developed an object-oriented PHP framework with a REST facade, integrating three disparate social media web services into a unified platform
    • Maintained and enhanced a Python-based CMS using Zope/Plone, creating custom platform extensions tailored to business use cases and preserving historical company information
    • Delivered a social media business platform aligned with client goals, managing the development team effectively
    • Monetized web product traffic by extending and optimizing OpenX ad-serving technology
    • Enhanced CMS functionality, tailoring the platform to reflect specific business needs
    • Unified disparate social media services through a scalable, object-oriented PHP framework, increasing platform efficiency
  • Comdoc
    Web Developer
    Comdoc Jan 2006 - Jan 2008
    Ridgeland, SC, US
    • Developed and maintained customer web presences using the LAMP stack (Linux, Apache, MySQL, PHP), implementing tailored web solutions to meet diverse client needs
    • Collaborated directly with clients to elicit, analyze, and refine requirements through frequent in-person meetings, translating their business needs into actionable development strategies
    • Delivered customized web solutions, leveraging the LAMP stack to address unique customer requirements
    • Strengthened client relationships by fostering effective communication and facilitating collaborative development processes

Christopher Sparks Skills

Data Warehousing, Requirements Analysis, Data Science, Business Analysis, Web Development, Agile Methodologies, Enterprise Architecture, Data Modeling, Web Services, Big Data, Full Stack Development, Business Intelligence, Project Management, Software Development Life Cycle, Software Development, Machine Learning, Data Migration, Software Project Management, Extract-Transform-Load (ETL), Solution Architecture

Christopher Sparks Education Details

  • Clemson University
    Applied Mathematics
  • Clemson University
    Computer Science

