Miguel Sanchez

Miguel Sanchez Email and Phone Number

SQL | ETL Specialist | Software Development Professional | AWS Certified Cloud Practitioner | Banking, Mortgage Technology | Mentor
Miguel Sanchez's Location
Bonita, California, United States
About Miguel Sanchez

I am a software developer at Union Bank, where I lead and contribute to projects migrating legacy systems to AWS cloud services, especially for CCAR reporting. CCAR is a regulatory framework that requires banks to assess their capital adequacy and risk management under different scenarios. With a strong background in database design, business requirements, and attention to detail, I deliver high-quality, reliable solutions for complex and dynamic data needs.

I use a range of tools and technologies, such as Python, Hive, Redshift, Boto3, SSIS, SSRS, and StreamSets, to design, develop, and maintain data pipelines, workflows, and reports for various data sources and destinations, both on premises and in the cloud. I have also been involved in creating and enhancing the IBM ingestion framework, which automates data loading from on-premises systems to AWS S3, Hive, and Redshift.

I enjoy working with a diverse, collaborative team where I can learn from others and share my expertise and insights. I am passionate about leveraging cloud computing and data analytics to improve business performance and customer experience, and I am always eager to explore new challenges and opportunities to grow my skills and knowledge in this field.
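The ingestion framework described above lands on-premises extracts in S3 for Hive and Redshift to consume. As a minimal sketch of how such a framework might lay out partitioned object keys (the key convention, dataset names, and the `build_s3_key` helper are hypothetical illustrations, not taken from the actual framework):

```python
from datetime import date

def build_s3_key(database: str, table: str, load_date: date, filename: str) -> str:
    """Build a partitioned S3 object key: <db>/<table>/load_date=YYYY-MM-DD/<file>.
    A 'load_date=' path segment is a common layout that lets Hive and AWS Glue
    discover daily partitions automatically."""
    return f"{database}/{table}/load_date={load_date.isoformat()}/{filename}"

# Hypothetical key for a daily extract landing in the lake bucket.
key = build_s3_key("ccar", "fry14m_schedule_a", date(2023, 1, 31), "extract.parquet")
```

With a layout like this, each day's load is isolated in its own prefix, so downstream Hive/Redshift loads can target exactly one partition.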

Miguel Sanchez's Current Company Details

SQL | ETL Specialist | Software Development Professional | AWS Certified Cloud Practitioner | Banking, Mortgage Technology | Mentor
Miguel Sanchez Work Experience Details
  • Assetmark Financial, Inc
    Database/Etl Developer
    Assetmark Financial, Inc Jan 2024 - Aug 2024
    Concord, California, United States
    Developed and managed processes in Azure cloud PostgreSQL. Built and deployed containerized applications using Docker, and implemented CI/CD pipelines for promoting changes across environments with shell scripting, PowerShell, and psql. Conducted thorough testing and validation of tasks managed through Jira, and optimized database performance by refining PostgreSQL procedures, functions, triggers, and views.
  • Union Bank
    Software Developer
    Union Bank Apr 2015 - Nov 2023
    Remote
    Migration from on-premises legacy systems to the AWS platform: lead developer in charge of maintenance and development of new features needed for CCAR reporting in order to remain in good standing with federal requirements. As part of CCAR governance requirements, coordinated with business owners and the Reg Reporting team to change the schedule-generation logic as the Federal Reserve adds questions, lowers failure thresholds for edit checks, etc. Generated monthly CCAR FR Y-14M schedules A, B, and C.
    Services/tools used: AWS S3 | Hive | Redshift | Boto3 (AWS API) | WhereScape RED | Python | Unix shell scripting | SSIS | SSRS | SSMS | VS Code | C# | Black Knight Financial Services interface | AutoSys (scheduling) | StreamSets | Bitbucket | cloud-native databases (Snowflake, Hive, Redshift)
    Cloud ETL and data lake work:
    - Individual contributor to the IBM (Python) ingestion framework from on-prem to AWS S3, Hive, and Redshift; leveraged Boto3.
    - Individual contributor to the development of an entirely event-driven, cloud-based enterprise data warehouse solution in AWS.
    - Proof-of-concept project: built vault objects in Aurora PostgreSQL.
    - Proof-of-concept project: created vault pipelines in StreamSets Data Collector that populate satellites, hubs, and links.
    - Leveraged StreamSets Control Hub.
    - Developed an assortment of operational control framework microservices that auto-start pipelines (Groovy scripting), check the completeness and accuracy of streaming data, and manage downstream dependencies.
    - Implemented monitoring processes that log metadata about pipeline execution and error handling.
    - Built out an S3 data lake that stores files in Parquet format and uses AWS Glue to catalog, format, and prepare data for easy extraction and advanced analytics (Hive to Redshift).
    Infrastructure:
    - Created a home-grown ETL framework running on Linux EC2 instances.
    - Stood up DEV, SIT, UAT, pre-prod, and prod environments.
    - Supported testing/maintenance and production support efforts.
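The Union Bank role above mentions monitoring processes that log metadata about pipeline execution and error handling. A minimal sketch of that idea, using SQLite as a stand-in for the real metadata store (the `pipeline_runs` schema, column names, and pipeline names are illustrative assumptions, not the framework's actual design):

```python
import sqlite3
from datetime import datetime, timezone

def log_pipeline_run(conn, pipeline, status, error=None):
    """Record one pipeline execution in a metadata table so operators can
    query run history and failures (schema is illustrative only)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pipeline_runs "
        "(pipeline TEXT, status TEXT, error TEXT, logged_at TEXT)"
    )
    conn.execute(
        "INSERT INTO pipeline_runs VALUES (?, ?, ?, ?)",
        (pipeline, status, error, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
log_pipeline_run(conn, "vault_satellite_load", "SUCCESS")
log_pipeline_run(conn, "vault_hub_load", "FAILED", "upstream file missing")
failed = conn.execute(
    "SELECT pipeline FROM pipeline_runs WHERE status = 'FAILED'"
).fetchall()
```

A table like this is what downstream dependency management can key off: a consumer pipeline starts only after its upstream shows a SUCCESS row for the current load date.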
  • Axos Bank
    Developer - Etl-Sql
    Axos Bank Jul 2014 - Apr 2015
    San Diego, California, United States
    Built and expanded the existing database and warehouse architecture, including data modeling and architecting an operational data store: a landing-database-to-staging-database process, with transform logic applied between the landing and staging databases. Applied slowly changing dimension merge statements and logic. Laid out the vision for a robust operational data store with a data dictionary and standardized ETL practices. Integrated a Salesforce feed into the existing data mart. Developed and maintained existing applications written in C#, T-SQL, SSIS, SSRS, and SSAS. Participated in daily Scrum (stand-up) meetings. Used TFS code versioning. Monitored and troubleshot issues with automated processes and updated end users on fix ETAs; processes included SSIS, SSRS, T-SQL, C#, VB.NET, and various other applications. Developed new processes for the bank's analysis departments as requested by business users. Designed workflow documentation for new and existing processes. Assessed BI reporting tools, such as Tableau, for company use.
    Third-party software used: Jack Henry Associates data warehouse | Soft-Vu | Velocify | NetSpend | GCC | Quickbridge | ExactTarget | Salesforce & SharePoint API | Cadence | PCLender
    Tools: SSIS, SSRS, Visual Studio 2013, Windows Server 2014, SQL Server 2008/2012/2014, SQL Sentry monitoring tool
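The slowly changing dimension merge logic in the Axos role was implemented as a T-SQL MERGE; as an illustrative sketch only, the type-2 behavior (expire the current row and append a new current row when a tracked attribute changes) can be shown in plain Python. The dimension structure and sample data here are hypothetical:

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Type-2 merge: if the business key exists with different attributes,
    expire the current row (end-date it) and append a new current row."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return  # no change: keep the existing current row
            row["current"] = False
            row["end_date"] = today
            break
    dimension.append(
        {"key": incoming["key"], "attrs": incoming["attrs"],
         "start_date": today, "end_date": None, "current": True}
    )

dim = []
scd2_apply(dim, {"key": 1, "attrs": {"branch": "San Diego"}}, date(2014, 8, 1))
scd2_apply(dim, {"key": 1, "attrs": {"branch": "Concord"}}, date(2014, 9, 1))
```

After the second call, the dimension holds two rows for key 1: the expired San Diego row and a new current Concord row, which is exactly the history-preserving behavior a type-2 dimension exists to provide.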
  • Norima Consulting Inc.
    Developer - Etl-Sql
    Norima Consulting Inc. Sep 2013 - Mar 2014
    San Diego, California, United States
    Lead developer on project development for a financial institution located in Los Angeles. Managed and mentored a team of 4 developers. Sourced data from multiple sources, including Informix, Oracle, and SQL Server; staged data, applied logic, and sent data to RegEd. Used an Oracle database to stage records, applied logic involving broker-dealers, and produced files to be sent per specifications. The files were extracted to the file system (Linux), then logic was applied to save them into a history folder containing a rolling 7-day history. An ActiveBatch job scheduled Mon-Sat called Unix shell scripts to move the files to an outgoing file server, PGP-encrypted them, and sent the encrypted files via SFTP. Additional work added a new feature to the broker-dealer website: a single sign-on (SSO) feature built in ASP.NET with C#, with the SSO metadata stored in an Oracle database. Sourced data from DB2, Oracle, SQL Server, Active Directory, .xlsx files, and flat files. Worked with a virtual team across various U.S. states.
    Software: PL/SQL | T-SQL | Unix shell scripting | C#
    Tools: Informatica (ETL development) | QAComplete (bug tracking) | ActiveBatch (scheduling) | Gemini (service request ticketing)
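The rolling 7-day history folder in the Norima role can be sketched as a small pruning routine. This is a hypothetical stand-alone illustration (the real work used Unix shell scripts driven by ActiveBatch); the demo runs against a throwaway temp directory:

```python
import os
import tempfile
import time
from pathlib import Path

def prune_history(history_dir, keep_days=7):
    """Delete files older than keep_days (by modification time), keeping a
    rolling window like the 7-day history folder described above."""
    cutoff = time.time() - keep_days * 86400
    removed = []
    for f in sorted(history_dir.iterdir()):
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
            removed.append(f)
    return removed

# Demo: one 10-day-old file (mtime backdated) and one fresh file.
history = Path(tempfile.mkdtemp())
old_file, new_file = history / "old.csv", history / "new.csv"
old_file.write_text("old")
new_file.write_text("new")
os.utime(old_file, (time.time() - 10 * 86400,) * 2)
removed = prune_history(history)
```

Only the backdated file falls outside the 7-day window, so it alone is removed; the fresh file survives.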
  • Bank Of America
    Developer
    Bank Of America Mar 2011 - Jul 2013
    Calabasas, California, United States
    Lead developer participating in requirements development, from inception through production releases, for mortgage data warehousing data loads. Successfully completed multiple projects from start to finish. Optimized SQL queries using statistical information to identify table scans and reduce I/O. Used an in-house system to generate the XML contents of SSIS packages, dynamically creating new packages to be included in production and scheduled as needed; the metadata used to create the SSIS packages was table-driven. Developed packages in the dev environment, then promoted them to QA, UAT, and production servers. Used C# to create personal tools for data quality and verification. Worked with a virtual team from abroad and the U.S.
    Technologies: T-SQL | SSIS | C# | PowerShell | ETL development | Pega systems interaction
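The table-driven package generation in the Bank of America role (metadata rows used to emit SSIS package XML) can be sketched with the standard library. The element names and metadata columns below are illustrative placeholders, not the real DTSX schema:

```python
import xml.etree.ElementTree as ET

def build_package_xml(metadata):
    """Generate a simple package definition from metadata rows; each row
    names a source table and a destination table (illustrative schema)."""
    pkg = ET.Element("Package")
    for row in metadata:
        task = ET.SubElement(pkg, "DataFlowTask", name=f"Load_{row['dest']}")
        ET.SubElement(task, "Source", table=row["source"])
        ET.SubElement(task, "Destination", table=row["dest"])
    return ET.tostring(pkg, encoding="unicode")

# Hypothetical metadata rows, standing in for the driving table.
rows = [
    {"source": "stg.Loans", "dest": "dw.FactLoans"},
    {"source": "stg.Borrowers", "dest": "dw.DimBorrower"},
]
xml_text = build_package_xml(rows)
```

The appeal of the approach is that adding a new load is a metadata insert rather than a hand-built package: the generator stamps out one data-flow task per row.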
  • Union Bank
    Developer
    Union Bank Oct 2010 - Jan 2011
    San Diego, California, United States
    Worked with software developers/engineers on large data sets that affected software performance. Used Team Foundation Server to deploy code to test and prod branches.
    - Used C# to automate and standardize repetitive daily processes.
    - Developed ETL processes and proactively identified, troubleshot, and resolved issues with live database systems.
    - Performance-tuned SQL Server VLDBs under high transaction rates with a small maintenance window.
    Technologies used: MS SQL Server 2008, C#/ASP.NET, SSRS, SSIS
    Duties: developer with a SQL Server database backend. Gathered requirements, designed and developed normalized database architecture to ensure data integrity. Identified and created database objects to ensure application scalability. Optimized queries and indexes on tables.
  • Excelsior Media Corp.
    Software Engineer
    Excelsior Media Corp. Apr 2009 - Feb 2010
    San Diego, California, United States
    Developed and maintained web applications using ASP.NET and C# with SQL Server 2000, 2005, and 2008 database backends. Automated processes using C# to load files daily into live SQL Server 2005 databases. Successfully completed a database migration project with minimal downtime for end users. Provided production support for internal staff.
    Duties: web application developer with a SQL Server database backend. Developed and documented guidelines for daily update processes across over 30 servers. Maintained and developed applications using C#, Visual Studio 2008, and VS2010. Optimized queries and indexes on tables containing large numbers of records. Created an application to interface with Netbilling and internal databases. Built an internal DMCA system to track copyright infringements of company assets. Built a dynamic reporting system for company executives. Addressed cross-browser issues in Safari, IE, Chrome, Opera, and Firefox.
  • Lpl Financial
    Senior Etl Developer
    Lpl Financial Dec 2007 - Jan 2009
    San Diego, California, United States
    Developed and maintained Integration Services jobs for nightly stock market transaction processing. Built ETL processes using Informatica, C++, and C# to load over half a billion records daily into SQL Server 2005 databases. Successfully completed a multiyear project with minimal downtime for end users. Provided production support for the six months following the project's deployment to production; was on call 24/7 to address any issues or questions regarding nightly processing.
    Duties: adhered to the software development life cycle (SDLC); developed ETL processes, databases, and database objects; performed fine-tuning. Maintained applications built in C++, C#, Informatica, Transact-SQL stored procedures, table schemas, etc. Performance-tuned very-high-volume OLTP databases. Optimized queries and indexes on tables containing large numbers of records. Created databases with multiple filegroups to store related data in its respective filegroup. Built processes to take advantage of table partitioning in SQL Server 2005. Represented changes at the CAB review board and documented changes for the production support group.
  • Time Warner Cable
    Full-Stack Developer
    Time Warner Cable Mar 2001 - Dec 2007
    San Diego, California, United States
    Built and maintained existing DTS and SSIS processes and C# web apps. Built reports for multiple company departments using Crystal Reports and SQL Server Reporting Services. Used Triple DES encryption to satisfy SOX mandates. Developed a web tool using ASP.NET with a SQL Server 2005 back end to allow field technicians to order parts using their handheld Intermec devices; the warehouse would fill the order request and have parts ready for the technician to pull up and load supplies onto his/her vehicle. Converted paper transactions to web applications and automated processes, saving the company thousands of man-hours.
    Duties: developed web applications using ASP.NET 2.0 with SQL Server 2000 and 2005 database backends.
    - Designed a technical support application to look up customer data and log issues with cable service. Installed, configured, and administered MS SQL Server 2000 and 2005 databases, SSIS Server, and SSRS Server to store and utilize data critical to everyday company operations. Used surface area configuration to administer a transactional web server.
    - Managed servers and databases: configured login security, implemented database permissions, managed logs, and backed up data.
    - Automated tasks interacting with a 3270 emulator (customer service accounts) to process mass changes to customers' accounts, such as refunds and account flags.
    - Developed in-house applications to streamline manual employee processes for multiple departments, including finance, accounting, executive, human resources, warehouse, and I.T.
    - Queried a Sybase telephone database to store data in local SQL Server 2000/2005.
    - Designed multiple ETL processes using DTS, SSIS, and C# applications: extracted data from third-party databases, transformed the data to match requirements, and uploaded it to the data warehouse, FTP sites, and web server databases; encrypted data (PGP) when required.
    - Created and maintained an Oracle 9i database for third-party data access; created database objects and procedures using PL/SQL.
  • Guild Mortgage
    Computer Programmer
    Guild Mortgage Sep 1999 - Mar 2001
    San Diego, California, United States
    Developed applications according to customer requirements and trained end users in the use of new applications. Documented table changes to update I.T. department guidelines. Used SQL and the RPG language to create applications supporting mortgage offices in the western U.S. Followed company guidelines and procedures to ensure quality control. Developed queries and code for mortgage banking applications on an IBM AS/400 midrange computer.
    Duties: developed jobs and ETL applications for all aspects of the home financing process; fine-tuned queries and applications.
  • Kapos Associates Inc.
    Software Engineer
    Kapos Associates Inc. Nov 1998 - Sep 1999
    San Diego, California, United States
    Developed a web-based application to help Navy commanders at sea assess their respective ship's battle readiness by calculating Personnel, Experience, Knowledge, and Aptitude (PEKA) values. The application presented ship readiness in a color-coded report with pertinent information; end users could create "what if" scenarios to see where to improve overall readiness at sea. Designed the database, developed SQL queries, and maintained and developed the website for a U.S. Navy project. Stored ships' personnel information in a database and calculated statistics for the West Coast Navy command. Required a Secret clearance.
    Duties: web application development, ETL processes, and database development; fine-tuned queries and applications.

Miguel Sanchez Education Details

Frequently Asked Questions about Miguel Sanchez

What is Miguel Sanchez's role at the current company?

Miguel Sanchez's current role is SQL | ETL Specialist | Software Development Professional | AWS Certified Cloud Practitioner | Banking, Mortgage Technology | Mentor.

What schools did Miguel Sanchez attend?

Miguel Sanchez attended San Diego State University, Coleman College, San Diego City College.
