Peter Landis


Principal Engineer | AI | Generative AI | Machine Learning | Data Engineering | Data Lake House | Data Analytics | Data Integration | Software Engineering | AWS | ELT @ Northwestern Mutual
Peter Landis's Location
Brookfield, Wisconsin, United States
About Peter Landis

Peter Landis is a data and software engineer with over 20 years of experience building data platforms and software applications. As a Principal Engineer at Northwestern Mutual, he contributes to the design and development of AI systems, including retrieval-augmented generative models, to improve operational efficiency. Peter has extensive expertise in data engineering, ETL, architecture, AI development using frameworks like LangChain and LlamaIndex, and cloud platforms. He was instrumental in the successful deployment of NM's first production use case of a customer service Retrieval Augmented Generation (RAG) application. He has previously held positions in data engineering, architecture, and software development at Northwestern Mutual, Stratagem Consulting, and GE Medical. Outside of work, Peter is a family man with two kids who enjoys spending time with his family, staying active with tennis, jiu-jitsu, and snowboarding, and taking on programming projects involving websites, Arduino, Raspberry Pi, and home automation.

Peter Landis's Current Company Details
Northwestern Mutual

Principal Engineer | AI | Generative AI | Machine Learning | Data Engineering | Data Lake House | Data Analytics | Data Integration | Software Engineering | AWS | ELT
Peter Landis Work Experience Details
  • Northwestern Mutual
    Principal Engineer
    Northwestern Mutual May 2023 - Present
    Milwaukee, WI, US
    As a Principal Engineer, helped design and build retrieval-augmented generative AI applications. The AI app sourced knowledge articles, training videos, and web pages, improving call-center efficiency by reducing repeated field calls while improving accuracy. Source data was ingested using various chunking strategies, metadata, and tagging, then used to create vector embeddings combined with various ranking techniques to improve search retrieval accuracy. Frameworks like LangChain helped with the initial buildout of the orchestration between vector search, LLMs, prompt chaining, and agents. Expanding those capabilities, designed a Generative AI orchestration service built on Kubernetes that scaled with usage, made it easy to add new LLMs, loosely coupled the vector stores, and provided authorization between applications and the embeddings within the vector store. Leveraged an evaluation framework that improved search retrieval accuracy by adjusting the chunking strategy, number of sources, search algorithm, and feedback annotations, and developed techniques that use LLMs themselves to facilitate the orchestration of search and retrieval.
    * Designed, developed, and maintained generative AI systems for the call center to improve accuracy and efficiency.
    * Data collection (website, video, audio, documents), preprocessing (production-grade pipeline), process flow design, embedding and LLM model (e.g., GPT-3.5/GPT-4) comparison, training and tuning LLMs, and deploying models to production.
    * Contributed to the AI governance council, providing recommendations, principal design decisions, and learnings.
    * Leveraged LangChain for experiments and an evaluation framework comparing GPT versions for response accuracy.
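The ingestion described above starts with a chunking step before embeddings are created. A minimal sketch of fixed-size chunking with overlap, in plain Python (illustrative parameter values, not the actual production strategy):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks so context isn't lost at chunk boundaries."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # each chunk starts `step` characters after the last
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # last chunk already reaches the end of the text
    return chunks
```

In a real pipeline the chunk boundaries would usually respect sentences or document structure, and the chunks would then be embedded and written to a vector store.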
  • Northwestern Mutual
    Lead Data Engineer
    Northwestern Mutual Apr 2022 - Apr 2023
    Milwaukee, WI, US
    As a Lead Data Engineer, helped solution and build data platform capabilities for an enterprise Lakehouse leveraging Databricks, AWS, and Airflow orchestration. Used Autoloader to simplify ingestion into Lakehouse Delta tables and built dynamic, templated data pipelines in Python with parameters passed through workflows, providing common ingestion patterns that supported transaction replay, archiving, and observability. With Databricks capabilities, built out reference patterns using Delta Live Tables for data pipelines that integrated data quality controls like expectations, reusable modules, and workflow dependency/parameterization. Other activities included building Java microservices using the boundary control entity (BCE) design pattern, which provided easy extensibility for future enhancements. Working with various data scientists, helped build Large Language Model applications leveraging frameworks like LangChain to integrate storage (vector databases), agents, chaining, and prompt engineering. As part of LLM development, worked with Azure OpenAI and Cognitive Search to test OpenAI capabilities against various data sets.
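The "dynamic templated pipelines with parameterization" idea above can be illustrated with a tiny sketch: one ingestion template rendered per table from workflow parameters. The statement shape and config keys here are hypothetical stand-ins, not the actual pipeline code:

```python
from string import Template

# Hypothetical ingestion template; the real pipelines used Databricks Autoloader.
INGEST_SQL = Template(
    "COPY INTO $target_table FROM '$source_path' FILEFORMAT = $file_format"
)

def render_pipelines(configs: list[dict]) -> list[str]:
    """Render one ingestion statement per config entry passed in from a workflow."""
    return [INGEST_SQL.substitute(cfg) for cfg in configs]
```

The point of the pattern is that adding a new source becomes a config change rather than new pipeline code.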
  • Northwestern Mutual
    Solutions Architect
    Northwestern Mutual Oct 2020 - Apr 2022
    Milwaukee, WI, US
    As a Solutions Architect, helped design a unified data platform built around the data mesh principles of a self-service, domain-driven data platform. Responsibilities included building ingestion services on the Databricks Spark platform as the core engine: structured streaming via a multiplexer from Kafka, database ingestion via JDBC, and file processing using Autoloader. The transformation layer was built on DBT and Spark SQL orchestrated through Dagster. The consumption layer consisted of a scalable query engine based on Presto running within Kubernetes, providing ad hoc querying and reporting analysis. Solutioned a credential service leveraging HashiCorp Vault so credentials are stored in the enterprise credential service rather than within the platform. The unified data platform was designed around a data contract that provided platform services, ownership, privacy, and security for personas serving data, consuming data products, and operationalizing data science models.
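A data contract like the one described above can be sketched as a small typed object: it records who owns a dataset and what shape it must deliver, and lets the platform check conformance. The field names here are illustrative assumptions, not the platform's actual contract schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataContract:
    """Minimal data-contract sketch: ownership plus the columns a dataset must provide."""
    domain: str
    owner: str
    schema: dict            # contracted column name -> type name
    pii_columns: tuple = () # columns that require masking before serving

    def validate_record(self, record: dict) -> bool:
        """A record conforms if it supplies every contracted column."""
        return set(self.schema) <= set(record)
```

In a data mesh, the contract travels with the data product, so consumers can validate upstream changes before they break downstream pipelines.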
  • Northwestern Mutual
    Lead Data Engineer
    Northwestern Mutual Aug 2019 - Oct 2020
    Milwaukee, WI, US
    As a Lead Data Engineer in the data science and analytics area, helped modernize the department's cloud initiatives, moving from on-prem to AWS and building a platform for data scientists to build models and run analytics. Responsibilities included understanding the current state of the data science environments, designing the future state of the platform and its capabilities, laying out a roadmap to achieve that future state aligned to each project workstream, and building out capabilities while strengthening the team's skills. Compared various Spark architectures and built a Spark cluster running the latest Hadoop on EC2 instances, baking in features such as autoscaling schedules to reduce costs. The cluster was built on immutable infrastructure using Packer, Ansible, and Terraform for repeatable provisioning of data science environments. Alongside the Spark cluster, built out a JupyterHub notebook environment integrated with SSO and AWS role-based security. Also built job scheduling capabilities through Airflow, which was leveraged for a Python config-driven orchestrator that dynamically loads ETL jobs from a config file and uses the Spark cluster to process data into the S3 data lake. Built an event-driven architecture using AWS Lambda to process data in real time from S3 events as objects landed. Lastly, led a team building a solution for productionalizing data science models in Kubernetes to process underwriting in real time.
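The event-driven piece above reacts to S3 object-created events. A minimal handler sketch that extracts the new objects from the standard S3 notification shape, runnable locally without AWS (the downstream processing is omitted):

```python
def lambda_handler(event: dict, context=None) -> list[str]:
    """Extract s3://bucket/key URIs from an S3 event so each landed object can be processed."""
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append(f"s3://{bucket}/{key}")
    return objects
```

In the real architecture, each URI would kick off downstream processing (parsing, validation, loading into the lake) rather than just being collected.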
  • Northwestern Mutual
    Senior Data Engineer
    Northwestern Mutual Sep 2017 - Jul 2019
    Milwaukee, WI, US
    As a Senior Data Engineer, built an enterprise Data Vault 2.0: accountable for database modeling using Erwin; building config-driven ETL in SnapLogic with a worker-queue orchestrator architecture; building alerts and custom Splunk apps; integrating data quality seamlessly into the config-driven design using Quilliup; defining the meta glossary using Alation; and working closely with the business to build out the associated info marts, allowing consumers to easily subscribe and dynamically build info marts based on the data within the Data Vault.
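Data Vault 2.0 modeling, as used above, keys hubs and links on a hash of the business key rather than a sequence. A common sketch of that pattern (MD5 and the `||` delimiter are typical DV2 conventions; the exact normalization rules used here are assumptions):

```python
import hashlib

def hub_hash_key(*business_key_parts: str, delimiter: str = "||") -> str:
    """Data Vault 2.0 style hash key: normalize the parts, join, and hash."""
    # Trimming and upper-casing makes "abc" and " ABC " resolve to the same hub row.
    normalized = delimiter.join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()
```

The delimiter matters: without it, compound keys like ("a", "b") and ("ab",) would collide.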
  • Northwestern Mutual
    Data Engineer
    Northwestern Mutual Jan 2017 - Aug 2017
    Milwaukee, WI, US
    As a Data Engineer, partnered with data scientists to build, develop, test, and maintain big data integration solutions used for shaping various propensity models. The role involved working with Hadoop-based technologies such as Sqoop, Hive, Impala, Flume, Pig, and Scala/Python on Spark, and implementing complex data integration solutions focused on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights using multiple platforms and tools.
  • Northwestern Mutual
    Data Architect
    Northwestern Mutual Apr 2015 - Dec 2016
    Milwaukee, WI, US
    As a Data Architect, responsibilities spanned three roles: Applied Data Architect, Data Integration Strategist, and Technical Architect. As Applied Data Architect, worked in a SAFe portfolio with three-week sprints; activities included high-level design, laying out source and target data flows, working with business clients to gather requirements and define data model requirements, comparing solution options, and coaching/mentoring onshore/offshore development teams in ETL data integration. This included architectural frameworks for executing real-time CDC Informatica components, building a heartbeat solution for validating CDC latency, and designing run-continuous workflow dependencies for real-time processing. As Data Integration Strategist, the focus was on providing a roadmap for the company's enterprise data warehouse reference architecture and design considerations for the enterprise operational data store, warehouse, and analytical data store; communicating architectural direction through numerous forums, including presentations to leadership, business clients, and IT organizations; partnering with different areas to define enterprise data requirements, determine usage patterns and guidelines, and mature documentation standards and processes; and assembling a governance committee that provided oversight of enterprise data needs, data quality, non-functionals, and cross-cutting features across multiple programs. As Technical Architect, the focus was on Informatica: defining the vendor's future capabilities, identifying findings, and providing how-to guides and design considerations for real-time CDC, batch ETL, and PowerExchange, as well as partnering with vendor management and coordinating company meetings with the vendor to understand and align to the future roadmap.
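The CDC heartbeat mentioned above works by writing a timestamp row at the source and measuring how stale it is when it arrives at the target. A sketch of that latency check (the SLA threshold and function names are illustrative, not the production implementation):

```python
from datetime import datetime, timedelta, timezone

def cdc_latency(heartbeat_ts: datetime, now=None) -> timedelta:
    """Replication lag = time elapsed since the last heartbeat row landed at the target."""
    now = now or datetime.now(timezone.utc)
    return now - heartbeat_ts

def is_within_sla(heartbeat_ts: datetime, sla: timedelta, now=None) -> bool:
    """Alert condition: the heartbeat must be no older than the latency SLA."""
    return cdc_latency(heartbeat_ts, now) <= sla
```

A scheduler would run this check on an interval and page the team whenever the lag exceeds the agreed SLA.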
  • Northwestern Mutual
    Enterprise Data Portfolio Tech Lead
    Northwestern Mutual Jul 2014 - May 2015
    Milwaukee, WI, US
    Partnering with IBM consultants as part of the Stampede program, integrated IBM BigInsights and Hadoop technologies to solve two use cases: using Sqoop to load structured data from Netezza into BigSQL tables on HDFS, and using Annotation Query Language (AQL) to build extractors that scan a large volume of unstructured documents for specific searches based on filter criteria. This data was then loaded into BigSQL tables, using BigSheets to correlate the structured Netezza data with the AQL search results and graph them.
    Splunk: used Splunk as a platform to analyze machine data and aggregate log files for quicker mean time to resolution. Built dashboards to visually represent statistical information, real-time reporting, and alerts that fire when success or failure criteria are reached.
    Completed a big data project on the Netezza platform, bringing structured data from many sources into Netezza (extract, land, and transform: ELT rather than ETL) and building out a core data warehouse for analytics. The platform was built using Informatica, organizing the SQL and tables through ETL but using SQL pushdown optimization so SQL executes on Netezza instead of bringing data into Informatica.
  • Northwestern Mutual
    Lead Data Integration ETL
    Northwestern Mutual 2012 - Jul 2014
    Milwaukee, WI, US
    As a developer lead on a big data project utilizing Netezza, responsibilities included high-level design, POCs, developing dims/facts, and loading large data sets using pushdown optimization and CTAS SQL to take advantage of the MPP Netezza architecture. Mentored new team members on the technology and processes, coaching the team on building dimensional structures using CTAS and pushdown optimization in Netezza. Leveraged performance patterns such as two-pass insert/update and deletes, and built new Unix frameworks to handle bulk loads, reprocessing, and parallel processing of jobs. This effort was a great learning experience in Netezza and in developing ETLs on the big data Netezza platform. Partnered with the Unix and Informatica teams to develop procedures for installing Netezza connectivity in Informatica, using simple POC test cases to resolve issues encountered. Collaborated with Information Architecture and built stronger relationships among other teams to get them on board with accessing their production data and landing it in the Netezza environments.
  • Northwestern Mutual
    Data Integration ETL Lead Developer
    Northwestern Mutual May 2008 - 2012
    Milwaukee, WI, US
    As a tech lead working on data warehousing/ETL transformations, responsibilities included working with data specialists to understand the business requirements and physical database models; taking the source and target database tables and putting together a system-level blueprint explaining the detailed components of the ETL extracts; delivering a detailed mapping document for developers to work from; code reviews and test plans; deployment planning and release; and support transitions. Other responsibilities included J2EE and JSE Java design and development, primarily Java listeners and publishers (pub/sub model): a J2EE WAS 7 listener for the CCV application to perform name searches, built with Spring/iBATIS for the data access layer and Castor for parsing the JMS payload within the MDB, and a JSE publisher built on Spring/JMS publishing to MQ subscribers. Also worked on L1-L2 work agreements as roughly 25% overflow for Information Architecture, including putting together solution option comparisons.
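The pub/sub model used throughout these listener/publisher roles decouples the sender from its consumers: publishers send to a topic, and every subscribed listener receives the message. A tiny in-memory sketch of the pattern in Python (the real systems used JMS and IBM MQ; this only illustrates the shape):

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory pub/sub: publishers don't know who is listening."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic: str, callback) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: str) -> int:
        """Deliver the message to every subscriber; return the delivery count."""
        for cb in self._subscribers[topic]:
            cb(message)
        return len(self._subscribers[topic])
```

With a real message broker, delivery is asynchronous and durable; the decoupling benefit is the same.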
  • Stratagem, Inc
    Team Lead / Project Manager
    Stratagem, Inc May 2005 - May 2008
    Mequon, WI, US
    As a team lead on the integration of Siebel CRM applications, worked on a J2SE implementation of a pub/sub architecture, architecting the client listener application that consumed XML publications sent via MQSeries. Led the TNC Siebel orphaned-clients effort rolled out across 128 network offices, architecting the MQ Java publisher/listener solution to handle orphaned clients and mark the associated clients primary in the CRM system. Other activities included building new development features for the Field Reporting and Client Builder Goals system, a J2EE WAS application. As a project manager on small to medium projects, responsibilities included working with business clients to understand requirements, putting together L1-L4 estimates and work agreements for project stakeholders, managing work breakdown structures, laying out weekly project status for the program manager, and managing project risk through change requests.
  • Stratagem, Inc
    Software Consultant
    Stratagem, Inc Jun 2002 - May 2005
    Mequon, WI, US
    As a consultant at Northwestern Mutual, served as a Java lead developer supporting J2EE WebSphere applications, Java client-server applications, and integration of web/client applications with Siebel CRM. Responsibilities included working with project managers to define requirements, developing UML class diagrams and use-case scenarios, implementing new functionality for J2EE web applications, unit testing via JUnit, and supporting and troubleshooting production issues. The role also required a strong understanding of the business rules defined for financial and non-financial products and how they integrate with financial representatives and clients. The web applications used an MVC framework built on IBM's Process Access Bean and Data Access Bean in a J2EE WebSphere environment; the client/server applications used IBM MQSeries via a broker class for retrieving and publishing XML messages to other enterprise applications. Automated unit testing with JUnit, built Java applications with the Ant build utility, streamlined support by increasing Log4j logging during troubleshooting, and automated jobs via AutoSys.
  • Fullhousemedia
    Software / Internet Developer
    Fullhousemedia 2001 - Jun 2002
    Served as a lead Java developer on a team implementing an Internet web application that streamlined both the production line and sales for a company. Initiated Java as the solution for a multi-platform environment. Designed various web applications on BEA's WebLogic Server 6.1 using Enterprise JavaBeans components, with JSP/Servlet pages as the presentation layer. Both J2EE and Swing were used for Java thin-client applications based on the Model-View-Controller design pattern, along with other patterns for J2EE applications and EJB components such as Session Facade, Singleton, and Helper. Focused on database design implemented across different systems. Other responsibilities included migrating traditional web applications, such as classic ASP and COM objects, over to .NET and C#.
  • Marchfirst
    Software / Internet Consultant
    Marchfirst 2000 - 2001
    As a consultant in the Internet technology practice, responsibilities included design and development of client/server e-commerce web sites. Systems developed for web applications included Microsoft Windows 2000 servers, Sun Solaris servers, and HP systems. Also supported the software management process used at Northwestern Mutual Life: set up and maintained build scripts and automated processes that monitored and administered systems for client/server development projects. Worked extensively with project leaders and developers to manage source files and plan configuration builds, developing build management and configuration management plans for each project. Ran, scripted, maintained, and supported daily builds, and improved and optimized the entire build process. Other skills included PVCS version control, strong Unix knowledge, Perl, and Korn shell scripting.
  • Ge Medical Systems
    Software Engineer
    Ge Medical Systems 1998 - 1999
    Started as an intern and then went full time, working with a team on the development of a new configuration toolkit used in the digital X-ray architecture. Worked across all four phases of the software development cycle: analyzed software requirements, wrote and reviewed software requirement documentation, helped design the configuration application using object-oriented techniques and UML modeling (Rational Rose), implemented the software design, and conducted software tests; also responsible for design, test, implementation, and user-manual documentation. Developed the configuration toolkit using C++ and CORBA/RMI to marshal calls between distributed systems; the client-side application was implemented using Java servlets to dynamically generate the configuration data for the interface.

Peter Landis Education Details

  • University Of Wisconsin-Milwaukee

Frequently Asked Questions about Peter Landis

What company does Peter Landis work for?

Peter Landis works for Northwestern Mutual.

What is Peter Landis's role at the current company?

Peter Landis's current role is Principal Engineer | AI | Generative AI | Machine Learning | Data Engineering | Data Lake House | Data Analytics | Data Integration | Software Engineering | AWS | ELT.

What schools did Peter Landis attend?

Peter Landis attended University Of Wisconsin-Milwaukee.
