John Mcafee

John Mcafee Email and Phone Number

Elasticsearch Engineer at Booz Allen Hamilton
John Mcafee's Location
San Antonio, Texas, United States
About John Mcafee

I have 20+ years of experience supporting the Intelligence Community and Department of Defense and currently hold an active Top Secret clearance. The first 10 years of my career were focused primarily on software engineering. During that time, I created and maintained Java-based web applications (Spring Framework, Apache Struts). I was also responsible for the engineering and administration of the databases (Oracle, Sybase, SQL Server, MySQL) that supported those web applications. I then transitioned from software engineering into providing Business Intelligence capabilities. During these 3 years, I used IBM Cognos to provide reporting, analytics, and scorecarding to clients. Additionally, I provided requirements management using IBM DOORS to help optimize communication and collaboration among teams and stakeholders. I am currently pursuing the Microsoft Certified: Azure AI Fundamentals certification and would prefer a Business Intelligence role with Artificial Intelligence capabilities.

John Mcafee's Current Company Details

Elasticsearch Engineer at Booz Allen Hamilton
John Mcafee Work Experience Details
  • Booz Allen Hamilton
    Elasticsearch Engineer
    Booz Allen Hamilton Nov 2021 - Oct 2024
    Atlanta, Georgia, United States
    Supported Project A by creating robust data ingestion and transformation pipelines, leveraging Elastic's out-of-the-box integrations to connect to data sources such as ServiceNow, Amazon S3, and AWS CloudTrail. I then secured the data by segregating data sources and access using Kibana RBAC, Kibana spaces, and SAML. Additional functionality was provided by leveraging Elastic SIEM and Elastic Security to detect, investigate, and respond to cyber threats. Project B supported a software platform that provides clients a cross-domain solution (CDS). My contribution to the software development of this platform included the creation and maintenance of Python code. The Elastic stack was also an important component of the CDS solution. Using my Elastic experience, I improved security by implementing token-based authentication services. Additionally, I provided data transformations, primarily leveraging Logstash and Beats with some custom code as needed. I supported the transformation and ingestion of several data sources such as logs, XML, and PostgreSQL relational database tables. Lastly, I created default Kibana dashboards and visualizations and assisted users with creating their own. In addition to providing technical leadership and contributions, I also interviewed potential candidates and delivered technical presentations.
  • Millennium Corporation
    Data Engineer IV
    Millennium Corporation Jun 2021 - Nov 2021
    Austin, Texas Metropolitan Area
    Recommended policies and implementation strategies for metadata storage and relationships across multiple systems. Recommended a strategy for tagging data and metadata throughout the data life cycle. Collaborated with business and technology partners to provide customized protection and testing of security controls as needed.
  • Enterprize Software
    Senior Software & Data Engineer
    Enterprize Software Aug 2020 - Jun 2021
    Herndon, Virginia, United States
    This project provided a software platform with Elastic products as its foundation to serve smaller projects that needed search, analysis, and reporting capabilities. My first contribution was improving OCR processing by replacing legacy code with the Apache Tika toolkit to detect and extract metadata and text from several different file types, such as PDFs. I also used Go to support application development of the software platform. My additional duties included the configuration and administration of Elasticsearch, Kibana, Logstash, and Beats.
  • Ennoble First Inc.
    Big Data Engineer
    Ennoble First Inc. Apr 2019 - Apr 2020
    Herndon, Virginia, United States
    The focus of this project was using Elasticsearch to identify and search for anomalies in logs. In addition to searching for anomalies, the team needed tools for analysis and reporting. Using my expertise with the Elastic stack, I began supporting this team by creating Kibana visualizations and dashboards from the existing data to analyze and identify gaps and inconsistencies. Afterwards, I held weekly meetings to discuss building dashboards for reporting and any other Kibana needs. I also configured Elasticsearch ILM policies, improving performance and cluster storage. Created data pipelines to transform complex data sets into client-focused data using Logstash, Beats, Java, and other transformation tools such as CloverDX.
  • Cyberspace Solutions
    Elasticsearch SME
    Cyberspace Solutions Oct 2018 - Apr 2019
    Herndon, Virginia, United States
    The main priority of this project was processing large amounts of data to be usable by analysts. My responsibilities involved creating methods for processing structured, semi-structured, and unstructured data (text files, CSV, JSON, XML, and logs) into a usable format. I accomplished this by leveraging Logstash and some custom code (Java and Python) to ingest and transform the data. To protect the data, I implemented layered access using Kibana RBAC and Kibana spaces. Other supporting activities included the administration, upgrade, and configuration of Elasticsearch and Logstash. Trained all levels of the community on how to create meaningful Kibana dashboards, reports, and visualizations.
  • Aerodyne Industries LLC
    Enterprise Data Architect
    Aerodyne Industries LLC Apr 2018 - Oct 2018
    Colorado Springs, Colorado, United States
    A challenging project that used the Elastic stack to ingest program management, mission, and financial data for search and analysis, and to correlate aspects of each for reporting and other purposes. I began this project by coordinating weekly meetings with management and technical leads to discuss data migration, scheduling, and other topics. Next, I provided oversight for defining data standards and metrics collection. My technical contributions began with developing data pipelines using Logstash, Talend Big Data, and the Elastic Java REST clients to transform and index documents into Elasticsearch. Using Apache Flume and Elastic Beats, data from remote devices was collected and sent to Logstash or Hadoop. To protect the data, I secured communications between Kafka, Hadoop, Logstash, and other components using Elastic security and Kerberos. Created Kibana dashboards and reports for the monitoring team and program management. Used Apache Hive to query and extract data that the analysts used as input into other tools.
  • Plus3 IT Systems
    Cloud Engineer
    Plus3 IT Systems May 2017 - Mar 2018
    Colorado Springs, Colorado, United States
    The purpose of this project was to enforce compliance and regulation of organizations receiving aid funds from the US government. My first objective was migrating the stand-alone Elasticsearch instance to an Azure cloud environment. Next, I created alerts within Elasticsearch to detect conditions and trigger necessary actions. Created data ingestion pipelines using Python and Logstash to support Kibana dashboards, visualizations, and reporting. Collaborated with cross-functional teams to understand data requirements, discuss the latest laws and regulations, and review alerts generated by Elasticsearch. Responsible for the upgrade, configuration, and administration of Elasticsearch, Kibana, and Logstash.
  • Novetta
    Senior Software Engineer
    Novetta Aug 2015 - May 2017
    Tampa, Florida, United States
    An interesting research project that used Elastic products as a possible solution to a variety of technical challenges. Part of my responsibilities was to demonstrate Kibana/Elasticsearch features to other government programs and agencies, internal teams, and occasionally commercial teams. Configured Logstash to ingest scientific periodicals from RSS feeds and other data sources, proving it can be used for more than just log ingestion. Using the Elastic Java client, I maintained a Java application to export and import data indexed in Elasticsearch. This capability increased interest in Elasticsearch from other government agencies by allowing data to be used across tools and platforms. Used Elasticsearch and Kibana to evaluate current payroll systems and identify inefficiencies and redundancies. Demonstrated the ability to understand complex business processes and interpret business requirements into functional/technical specifications.
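
The Kibana RBAC and spaces segregation described in the Booz Allen Hamilton role can be sketched as follows. This builds the kind of JSON body submitted to Kibana's role API to limit a group of users to read-only access on one data source within one space; the space and index names here are hypothetical examples, not the project's actual configuration.

```python
import json

def build_kibana_role(space_id, index_patterns):
    """Build a Kibana role body that limits Elasticsearch access to the
    given index patterns and scopes Kibana features to a single space.
    Index and space names are hypothetical."""
    return {
        "elasticsearch": {
            "indices": [
                {"names": index_patterns,
                 "privileges": ["read", "view_index_metadata"]}
            ]
        },
        "kibana": [
            {"base": ["read"], "spaces": [space_id]}
        ],
    }

# Example: analysts in a "servicenow" space may only read ServiceNow indices.
role = build_kibana_role("servicenow", ["servicenow-*"])
print(json.dumps(role, indent=2))
```

One such role per data source, each tied to its own space, is what keeps the data sets segregated from each other.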
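
The ILM policies mentioned in the Ennoble First role typically pair a hot-phase rollover with eventual deletion, which is how they improve both query performance and cluster storage. A minimal sketch of such a policy body, with illustrative thresholds (the `max_primary_shard_size` rollover condition assumes Elasticsearch 7.13+; older clusters used `max_size`):

```python
import json

def build_ilm_policy(rollover_size="50gb", rollover_age="7d", delete_age="90d"):
    """Build an ILM policy body: roll the write index over when it grows
    too large or too old, and delete indices past a retention age.
    The threshold values are illustrative, not recommendations."""
    return {
        "policy": {
            "phases": {
                "hot": {
                    "actions": {
                        "rollover": {
                            "max_primary_shard_size": rollover_size,
                            "max_age": rollover_age,
                        }
                    }
                },
                "delete": {
                    "min_age": delete_age,
                    "actions": {"delete": {}},
                },
            }
        }
    }

policy = build_ilm_policy()
print(json.dumps(policy, indent=2))
```

In practice this body would be PUT to the cluster's `_ilm/policy/<name>` endpoint and referenced from an index template.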
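
The Cyberspace Solutions work of turning structured and semi-structured inputs into a usable common format can be sketched in pure Python. The field names (`event_id`, `message`) are hypothetical stand-ins for whatever the real schemas mapped to:

```python
import csv
import io
import json

def normalize(record, source):
    """Map a raw record into one common document shape, tagging its origin.
    Field names are hypothetical examples."""
    return {
        "source": source,
        "event_id": record.get("id") or record.get("event_id"),
        "message": record.get("message", ""),
    }

def parse_csv(text):
    """Normalize rows of a CSV file."""
    return [normalize(row, "csv") for row in csv.DictReader(io.StringIO(text))]

def parse_json_lines(text):
    """Normalize newline-delimited JSON records."""
    return [normalize(json.loads(line), "json")
            for line in text.splitlines() if line.strip()]

docs = parse_csv("id,message\n1,login failed\n") + \
       parse_json_lines('{"event_id": "2", "message": "login ok"}')
```

In production the same mapping would mostly live in Logstash filters, with custom code like this reserved for formats Logstash could not handle directly.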
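
Indexing transformed documents, as in the Aerodyne Industries role, ultimately produces the newline-delimited body that Elasticsearch's `_bulk` API expects: an action line followed by a source line for each document. A minimal sketch (the index name and fields are illustrative):

```python
import json

def to_bulk_ndjson(index, docs):
    """Serialize documents into the newline-delimited _bulk request body:
    one action line plus one source line per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    # The _bulk API requires the body to end with a newline.
    return "\n".join(lines) + "\n"

body = to_bulk_ndjson("program-finance", [{"amount": 100}, {"amount": 250}])
```

Client libraries (including the Java REST clients mentioned above) build this format internally, but it is worth knowing when debugging ingestion failures.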
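
The alerting described in the Plus3 IT Systems role boils down to evaluating a condition over recent events and triggering an action when a threshold is met. A simplified pure-Python stand-in for that condition check (the field name and threshold are illustrative; in production the condition ran inside Elasticsearch's alerting features):

```python
from collections import Counter

def check_threshold(events, field, threshold):
    """Return the values of `field` whose event count meets the threshold,
    mirroring a count-based alert condition."""
    counts = Counter(e[field] for e in events if field in e)
    return {value: n for value, n in counts.items() if n >= threshold}

# Example: flag any organization appearing in 3+ compliance events.
events = [{"org": "a"}, {"org": "a"}, {"org": "a"}, {"org": "b"}]
flagged = check_threshold(events, "org", 3)  # {'a': 3}
```

An alert definition adds a schedule and an action (email, webhook) around exactly this kind of condition.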
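
Ingesting RSS feeds of scientific periodicals, as in the Novetta role, reduces to parsing feed XML into indexable documents. A sketch using only Python's standard library (the feed content is a made-up example; in the role itself, Logstash handled the feed ingestion):

```python
import xml.etree.ElementTree as ET

RSS = """<rss version="2.0"><channel><title>Journal Feed</title>
<item><title>Paper One</title><link>http://example.com/1</link></item>
<item><title>Paper Two</title><link>http://example.com/2</link></item>
</channel></rss>"""

def rss_items(xml_text):
    """Extract title/link pairs from an RSS feed as documents ready
    for indexing into Elasticsearch."""
    root = ET.fromstring(xml_text)
    return [{"title": item.findtext("title"), "link": item.findtext("link")}
            for item in root.iter("item")]

docs = rss_items(RSS)
```

Each resulting document can then be indexed and searched like any log event, which is the point the demonstration made.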

John Mcafee Education Details

Frequently Asked Questions about John Mcafee

What is John Mcafee's role at the current company?

John Mcafee's current role is Elasticsearch Engineer at Booz Allen Hamilton.

What schools did John Mcafee attend?

John Mcafee attended University of Maryland Global Campus and Mississippi Valley State University.
