Anthony Millard Email and Phone Number
A highly motivated and articulate Systems, Business and Data Analyst with a demonstrable track record of delivering major successful projects that have a positive effect on the business bottom line. 25 years' proven expertise in implementing bespoke applications and managing projects and project staff from initial studies to solution implementation. An innovative thinker with strong influencing skills who is highly experienced in communicating with personnel at all levels. Possesses strong management and mentoring skills with the capability to unify a group of co-professionals in achieving a common goal.

SNAPSHOT:
Jaguar Land Rover »» Aug 2018 – Present
Debenhams »» May 2018 – Jun 2018
Hastings Direct »» Nov 2017 – Apr 2018
AstraZeneca »» Jul 2017 – Nov 2017
Daughter's wedding in Rio de Janeiro and travel in Brazil »» Apr 2017 – May 2017
Wesleyan Assurance Society »» Apr 2016 – Mar 2017
nPower SMART Metering »» Nov 2015 – Jan 2016
Volkswagen Financial Services »» May 2014 – Oct 2015
National Grid »» Jan 2013 – Apr 2014
Child Maintenance Group »» Aug 2012 – Dec 2012
RBS [Amsterdam] »» Jan 2012 – Aug 2012
Talk Talk Technology »» Nov 2011 – Jan 2012
Accent Group »» Apr 2011 – Oct 2011
Essentially Yours »» Oct 2010 – Apr 2011
Lloyds TSB »» Jun 2010 – Sep 2010
Lloyds TSB »» Apr 2010 – Jun 2010
Circle Anglia »» May 2008 – Mar 2010
Lloyds TSB »» Nov 2006 – Mar 2008
Birmingham City Council »» Oct 2004 – Oct 2006
Creating Profitable Partnerships »» Sep 2003 – Oct 2004
Sundial Systems »» Apr 2003 – Sep 2003
Matrix One Software »» Jul 2002 – Mar 2003
Tiscali »» Dec 2000 – Feb 2002
SRG Labstaff »» Sep 1996 – Dec 2000
BCCI Liquidation »» Feb 1996 – Sep 1996
Minet Ltd »» Aug 1995 – Feb 1996
Johnson Fry Ltd »» Jun 1995 – Aug 1995
Lloyds Register of Shipping »» Jun 1994 – Apr 1995
Eurotunnel PLC »» Apr 1993 – Jun 1994
Cellnet PLC »» Feb 1993 – Apr 1993
Charterhouse Systems »» Jan 1992 – Feb 1993
British Gas PLC »» Oct 1990 – Dec 1991
Charterhouse Systems Consultancy Ltd.
- Website: charterhousesystems.com
- Employees: 1
Director [Independent Contractor]
Charterhouse Systems Consultancy Ltd. »» Oct 1990 – Present
Norwich, United Kingdom

ABOUT:
Charterhouse was formed in 1990 to enable several IT analysts to formally operate contracts with major blue-chip companies. Over the last 28 years, large and medium-sized corporate organisations have trusted Charterhouse contractors to draw upon their unique expertise, knowledge and accumulated experience to deliver comprehensive infrastructural changes to their organisations. Our analysts demonstrate the motivation, team skills and experience necessary to ensure uninterrupted efficiency of service delivery. The combination of these qualities has been central to the ongoing successes of this family-run firm.

SOLUTIONS:
Many large companies are constantly managing change within their organisations. The change drivers might be unique to each vertical market or shared across markets, such as the movement of technologies and the perceived need to constantly upgrade, but managing change remains a constant challenge. Despite the constant shifts in functional and technological availability, the data (the lifeblood of the business) remains fairly constant, and the architectural nature of the metadata, which needs to be available to the business, should be well documented and understood. Charterhouse has been able to present comprehensive data solutions to help manage the principal drivers which impel and inform the changes in each organisation.
-
Data Architect
Jaguar Land Rover UK »» Aug 2018 – Present
Whitley

OBJECTIVE:
To replace an existing legacy planning tool [SPRINT] with a strategic SaaS planning platform [ANAPLAN]. Identifying the core Enterprise Master Hierarchical data related to all vehicle Model Years (current and future) for all known valid combinations, and applying this data to the transactional Retail Actuals, Wholesale Actuals and Stock Levels to achieve consistent Just In Time [JIT] planning across the enterprise by ensuring common, maintained, clean data sources. Designing the Product, Demand Geographies and Model Years hierarchies and preparing the Master data expressed to component level, so that the planning tool can roll up and report at any feature level. Defining the data requirements to be available in the Data Lake and preparing the data for migration to the ANAPLAN Data Hub, using BigQuery and Tableau to analyse the Data Lake system tables and export to Excel.

APPROACH:
• To analyse and extract the Source System structures and ensure the correct tables and appropriate data are in the Data Lake, ready to be fed into ANAPLAN at the required volume and frequency.
• To return the ANAPLAN reports and update the Source Systems with the progressed and detailed plans and the planning framework for ongoing plans.
• To prepare a generic and appropriate Master Data meta model which will accommodate effective compositions for future modelling.
• To map the Source System tables to the appropriate ANAPLAN Data Hub as defined by the design architecture, and to determine the level of transformation across the data landscape from extraction to loading.
• To create an onboarding process with practical data and advice for future JLR workstreams to share the same Enterprise Master Hierarchical data and make appropriate changes where necessary.

OUTCOME:
• The successful loading of the ANAPLAN Data Hub with Enterprise Master Hierarchical and Transactional data, ready for the planning model builders to start their crucial planning activities.
-
GDPR Business Analyst [Compliance Audits]
Debenhams »» May 2018 – Jun 2018
London, United Kingdom

OBJECTIVE:
To audit the progress of the GDPR implementation to date and give direction on the next steps. To highlight areas where evidence can be increased, give management a RAG status of the pertinent areas of the compliance shortfall, and give some indication of what priority should be given to the remaining tasks.

APPROACH:
• To analyse the key tenets of the GDPR regulation and estimate the shortfall of work needed to gain compliance.
• To analyse the Debenhams website assets and define the click-through data passed to third-party sites.
• To extract the physical structures from the core systems and create editable data artefacts to identify the personal data held in the source databases.

OUTCOME:
• Gave management a clearer idea of the tasks yet to be completed which would help Debenhams achieve compliance.
-
GDPR Business Analyst
Hastings Direct »» Nov 2017 – Apr 2018
Bexhill-on-Sea

OBJECTIVE:
To analyse from the GDPR legislation the treatment of all aspects of the requirements that needed to be in place by 25th May 2018. Working predominantly on the Governance workstream, all GDPR aspects were analysed to ensure compliance with the requirements. Privacy by Design involved creating DPIA questionnaires and applying them to the 120+ in-flight company projects, across an estimated 10+ data systems, many of which hold Personally Identifiable Information [PII]. Working with a team of analysts, within a wider team of 20 workstream SMEs and advisory business stakeholders.

APPROACH:
• To analyse the AS-IS DSAR, Breach Reporting and Complaints processes and carry out the gap analysis to produce the TO-BE versions of those processes needed to satisfy GDPR compliance.
• To create the Data Privacy Impact Assessment process by linking a DPIA Register to the DPIAs issued to each project and change initiative being carried out by each department in Hastings, using SME workshops to flush out the Epics and User Stories which underpin the requirements necessary to measure the required criteria for the selection of workable solutions.
• To design the proposed solutions to move colleagues' and customers' unstructured documents from the server drives to a secure environment in the Workday workspace.
• To present the proposed solutions to the GDPR Steering Committee for their approval and subsequent embedding into the BAU processes.
• To present the core system structures in such a manner as to identify the Personally Identifiable Information [PII] in the source tables and identify those fields needed to satisfy the Right to be Forgotten, Right of Portability, Right of Rectification and DSAR.

OUTCOME:
• The Minimum Viable Product as defined by the Hastings Direct Steering Committee is being successfully implemented for May 2018.
-
Senior GDPR Business Analyst
AstraZeneca »» Jul 2017 – Nov 2017
Cambridge, United Kingdom

OBJECTIVE:
To analyse from the GDPR legislation the treatment of Non-Production environments, Binding Corporate Rules, International Transfers and Consent. The touch-points of Consent were analysed to ensure compliance with the GDPR requirements and to prepare the project for the scale of discovery the magnitude of the task demanded, across an estimated 70+ data systems, many of which hold Personally Identifiable Information [PII]. Heading a team of analysts, within a wider team of 20 SMEs and advisory business stakeholders.

APPROACH:
• To analyse the COTS options from the leading discovery tool suppliers and identify which has the best synergy with AstraZeneca's architectural model.
• To define the AS-IS functional models for all the GDPR subject areas, and through SME workshops to flush out the Epics and User Stories which underpin the requirements necessary to measure the required criteria for the selection of workable solutions. My primary area of analysis was focused on Consent, identifying the accurate Consent shortfall [the tasks needed to uplift the current state of Consent to the level required for compliance].
• To implement a process for Consent discovery which explores the physical data and gathers shortfall tasks [required to realise compliance].
• To build up the requirements as defined in the GDPR legislation for the Minimum Viable Product, being the minimum requirements which can be in place by May 2018 to comply from a legal perspective.

OUTCOME:
• Work in progress.
-
Data Anonymisation Analyst
Wesleyan Assurance Society »» Apr 2016 – Mar 2017
Birmingham, United Kingdom

OBJECTIVE:
To develop and integrate an approach to the effective anonymisation of the existing test data environments. The precursor to the anonymisation is the reduction of table volumes to approximately 10% of their production volumes.

APPROACH:
• To reduce the volumes of the physical tables while ensuring inter-system referential integrity, making the test environments smaller, more portable and coherent, and preparing the way for the automatic creation of consistent test environments.
• To reverse engineer from the physical tables and allow data profiling of the table-level data to establish the formatting shape needed for transformation.
• To establish from the metadata the candidate data fields that must be changed to achieve anonymisation, with respect to the formatting shapes identified in the data profiling.
• To apply the required transformation to the candidate data fields of the reduced dataset according to the formatting shapes.

OUTCOME:
• Successfully created a genuine framework for putting data architecture at the heart of the business transformation, so that future changes would recognise the importance of data.
• Successfully initiated the beginning of business data ownership by the key business stakeholders within Wesleyan, to define business terms within the corporate data dictionary.
• Set standards in data modelling and the dictionary, and created a path to MDM "golden source" artefacts.
• Project abandoned due to cost cutting.
-
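The "formatting shape" idea above, replacing each value with one of the same shape so masked data still passes format validation and stays referentially consistent, can be sketched as follows. This is a minimal illustrative sketch, not Wesleyan's actual implementation; the function name and seed are assumptions.

```python
import hashlib
import random
import string

def mask_value(value: str, seed: str = "project-secret") -> str:
    """Replace each character with a random one of the same class
    (digits stay digits, letters stay letters, punctuation is kept),
    preserving the value's 'formatting shape'. Seeding the RNG from a
    hash of the input makes masking deterministic, which preserves
    referential integrity across systems."""
    rng = random.Random(hashlib.sha256((seed + value).encode()).hexdigest())
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(rng.choice(string.digits))
        elif ch.isupper():
            out.append(rng.choice(string.ascii_uppercase))
        elif ch.islower():
            out.append(rng.choice(string.ascii_lowercase))
        else:
            out.append(ch)  # keep separators such as spaces, '-' or '@'
    return "".join(out)

# The same input always masks to the same output, so a customer ID
# masked in one system still joins to the same masked ID elsewhere.
masked = mask_value("AB12 3CD")
```

A real masking tool (the Informatica tool mentioned in a later engagement, for example) adds dictionaries and consistency rules, but the shape-preserving principle is the same.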
Data Architect
nPower SMART Metering »» Nov 2015 – Dec 2015
Leeds, United Kingdom

OBJECTIVE:
To develop, integrate and merge the gas and electric logical models with the additional metadata needed for the upcoming SMART metering data requirements due at the end of 2016. These models are highly technical and complex, with historical as well as technical differences.

APPROACH:
• To reverse engineer from the physical tables and form the logical models for each utility type.
• To analyse each logical model [gas and electric SMART metering] at attribute level and establish with the SMEs the precise definition of each attribute, along with the common attribute values.
• To examine the synergy between the relationships and form a merged model.
• To profile the existing data to establish whether the additional attributes would be adversely affected within the new model, making recommendations where the new attributes might have an impact.

OUTCOME:
Once the analysis had been completed, the new models were confirmed with the Project Data Architect and the SMEs, and sign-off was concluded.
-
Data Anonymisation Analyst
Volkswagen Financial Services »» Jun 2014 – Oct 2015
Milton Keynes, United Kingdom

OBJECTIVE:
To develop a corporate-wide process to anonymise the personal data in four test environments across 26 core systems in two countries. This process identifies the key data fields or groups of data elements and maps the interfaces and data flows throughout the core systems. Each environment is treated differently to satisfy its testing needs, balanced against the identified risk of personal data being leaked from the test environments. This includes 9 SAP Landscapes and 17 Oracle systems hosted in Germany, reverse engineered into ER Studio ready for loading into the Informatica Masking Tool.

APPROACH:
• To access all systems and extract the metadata structures from each physical table, along with a sample set of each physical table to profile the data, using the Safyr Extraction Tool and importing into the ER Studio modelling tool.
• To evaluate and compare the market leaders among the key enterprise data architecture tools in a clear and structured manner, relating the key functionality of each tool to the needs of the project. The purchase of the tool would lay down the foundations for a full enterprise business-led data dictionary and data modelling repository.
• To reverse engineer from the physical tables and form the logical models for each system; to compare the structures in the four testing environments for each of the core systems and create a unique depersonalised data set which fulfils the testing required across the four environments.
• To track the data lineage of the physical interfaces at data element level across the systems, from where the data is first born to its final destination or reports, and put the masking tool at the heart of Business Transformation's Service Delivery Framework.

OUTCOME:
• Project abandoned due to cost cutting and lack of technical resources on 29th October 2015. Project assets parked after 60% of the analysis was completed.
-
Senior Data Analyst [Reference Process & Data Analyst]
National Grid (National Gas Transmission Operation) »» Jan 2013 – May 2014
Warwick, Warwickshire

OBJECTIVE:
To assess and verify the AS-IS processes and data structures of Gas Operations within the National Grid high-pressure Transmission Network, with a focus on defining the TO-BE requirements for both processes and data. The main operation was defined within three main workstreams (Gas Operations, Gas Control Room Operations and National Infrastructural Assets), each defining the TO-BE requirements to deliver a replacement system by 2016. Responsible for the data requirements in the Gas Control Room Operations stream, from the real-time monitoring of all inline sensors (gas pressure, temperature, fire, etc.) through to the Control Room operations handling alarms throughout the country, along with the likely response scenarios to be enacted for each alarm type.

APPROACH:
• To elicit, through structured workshops with selected SMEs, the process and data requirements, with the expectation of reducing the time to configure aspects of the system.
• To develop the required data items for the TO-BE data model for each of the defined TO-BE processes as the workshops progressed, whilst mapping those data items' behaviours to the new TO-BE processes.
• To assist in designing the data structure of the TO-BE model, whilst collecting the core configurable items required to reduce the time to configure aspects of the system.

OUTCOME:
• Successfully base-lined the Conceptual Data Model (CDM) in time for the Request for Purchase (RFP), while constructing the Logical Data Model (LDM) from the AS-IS Physical Data Model and structured workshops.
• Completed the Report Catalogue Matrix, which identifies the 1200+ reports across Gas Operations and establishes usage by Level 3 Process.
• Completed the Entity v Process Matrix, which identifies which entities are used across all Level 3 Processes.
-
Reference Process & Data Analyst
Child Maintenance & Enforcement Commission (Central Government) »» Aug 2012 – Dec 2012
Leeds

OBJECTIVE:
To design, present, agree and implement the processes and structures needed to manage the 40,000 configurable reference data items which represent the flexible nature of the Child Maintenance System (CMS), planned to go live later that year. These configurable items, collectively known as the Configurable Items Catalogue and provisionally under the governance of the overarching DWP Change Control Gateway, needed to be subject to a fast-path sub-process allowing them to be changed in a reasonable time frame, to meet the configurability expectations of the business.

APPROACH:
• To interview the key stakeholders with interests in configuring the CMS system and elicit their requirements, with the expectation of reducing the time to configure aspects of the system.
• To initially design processes for the management of change, and for the collection of behavioural data and documents recording the history of changing specific items for specific change scenarios.
• To present these processes to the stakeholders in group workshops, adapting them and agreeing a mutually beneficial process which would achieve the time-related objectives in configuring change in the Catalogue.
• To design the data structure of the Catalogue, its change control governance and its data and document gathering functions.
• To write and perform the UAT function on selected configurable items within the Catalogue.

OUTCOME:
• Built RAD-type models of the Change Control Log, Catalogue and gathering functions to describe the precise specification of the requirements needed to build a coherent and useful data management tool for managing change of configurable items within the CMS system.
-
Business Data Analyst
RBS Markets & International Banking »» Jan 2012 – Oct 2012
Amsterdam Area, Netherlands

OBJECTIVE:
To manage senior stakeholder engagement with Heads of Country Operating Directors, presenting project aims and gaining access to core banking systems to complete data cleansing operations across 32 countries. Producing analysis of accounts and customers and gaining authority to automatically close personal accounts and de-activate customer records considered to be in scope for the project. The closures and de-activations were to save the cost of the customer interactions required for Know Your Customer (KYC) before the change of legal status from BV to PLC.

APPROACH:
• To gain access to the country environments and extract in-scope data for the project.
• To develop structured process tools written in VBA to manipulate the extracted data with the captured messages, enabling the analysis needed to rate the accounts and customers.
• To present that data to the main SCORE system in such a way as to gather the closure/de-activation messages, enabling the relevant analysis to be carried out and presented to the branches to gain their approval for closure/de-activation.
• Once approval had been gained, to repeat the process with the full closure authorisation, which actually closes the accounts and de-activates the customers.
• To reconcile each batch of closures/de-activations and present to the branches the full extent of the data cleansing.

OUTCOME:
• Successfully cleaned up the inactive accounts and customers across 32 countries before the legal movement of RBS from BV status to PLC.
-
Data Analyst
TalkTalk Technology »» Nov 2011 – Jan 2012
Irlam, Manchester

OBJECTIVE:
Formed part of the Fault Team, which fields faults referred by the entire operational service. The Treatment Team's purpose is to remove the persistent, understood faults from the Fault Team and to move that work toward solutions developed through better validation and more precise processes built by rectifying data.

APPROACH:
• To extract and clean data from the Emergency Database (EDB) and the Directory Enquiries Database (DQ), managing the exporting and importing of Customer Line Identifiers (CLIs) when records fall out of the agreed SLAs between Service Providers.
• To extract statistics showing that the current extract-and-clean processes are advancing the general clean-up.
• To present the statistics in graphical form via pivot tables.

OUTCOME:
• Moved the validation processes that improve the quality of created data upstream to the reception processes, ensuring that the failure rate diminishes and relieving the pressure on the Fault Team.
-
Business Migration Analyst
Accent Group Ltd »» Apr 2011 – Oct 2011
Shipley, West Yorkshire

OBJECTIVE:
To extract data from seven source legacy systems dealing with different aspects of the landlord's business, and to clean and transform the data for loading into the single target system.

APPROACH:
• To analyse the source and target systems' data structures, express those systems in Entity Relationship Diagrams (ERDs), and identify the primary and foreign keys.
• To define and map the data tables to enable the merging of the source systems into the fields of the target data models, carrying out gap analysis to identify any issues, such as pre-transformation merges and specific structured clean-up processes.
• To create the definition of the data mapping in data-driven tables, ensuring the transformation can be repeated in end-to-end processes.
• To extract from the source system tables, via ODBC, using pass-through SQL to clean, rationalise, parse and correct data elements. Transformations were carried out using either Talend Open Studio or VFP end-to-end procedures to provide structures sympathetic to the target structures, using target data loaders.
• To load the target system with the cleaned data structures and test the loaded data using metadata metric reports to match the metrics of the source systems.
• To create metadata metrics on the source tables to ensure that all data is extracted and loaded correctly.

OUTCOME:
• Created the mapping and transform process, defining all major and minor processes which make up the detail of the overall migrations.
• Built the transform/mapping tables which define any pre-transformation merges or mapping to the target data models.
• Created documentation, reports and data metrics describing the models and processes, shared with business users to communicate the intentions and assumptions behind the overall migration.
• Created data and workflows, business reports and system metrics to prove successful migrations.
• Presented business users with a functional system ready for use over the migration weekend.
-
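The data-driven mapping tables described above make a migration repeatable: the mapping lives as data, not code, so the same end-to-end run can be replayed after each refinement. A minimal sketch of the idea (the system names, field names and transforms are hypothetical, and the real work used SQL and Talend rather than Python):

```python
# Hypothetical data-driven mapping table: each row maps a source
# system/field to a target field plus a transform. Re-running the
# migration is just re-applying this table to fresh extracts.
MAPPINGS = [
    # (source_system, source_field, target_field, transform)
    ("legacy_rents",   "TENANT_NM", "tenant_name", str.title),
    ("legacy_repairs", "POSTCODE",  "postcode",    lambda s: s.strip().upper()),
    ("legacy_rents",   "RENT_PW",   "weekly_rent", float),
]

def migrate_row(system, row):
    """Apply every mapping defined for one source system to one record,
    returning a record shaped for the target data model."""
    target = {}
    for src_sys, src_field, tgt_field, transform in MAPPINGS:
        if src_sys == system and src_field in row:
            target[tgt_field] = transform(row[src_field])
    return target

migrated = migrate_row("legacy_rents", {"TENANT_NM": "jane doe", "RENT_PW": "92.50"})
```

Row counts and checksums per source table (the "metadata metrics" above) would then be compared before and after the load to prove nothing was lost.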
Systems Designer/Implementer
Essentially Yours »» Oct 2010 – Apr 2011
Estepona, Spain

OBJECTIVE:
To define and develop a SQL-driven database application which creates a product ordering, invoicing and stock system to track orders and purchase invoices received in pounds sterling (£) against the sales invoices issued to trade and retail customers in euros (€). To track the stock, its value and profit, by product, product type or ingredient, and provide instant profit reports for business analysis.

APPROACH:
• To analyse the purchase and sales processes operated by the Spanish organic cosmetics agent, representing the operating model for all agents working in euros.
• To define and develop a database application reflecting the process and data analysis carried out in the initial phase.
• To implement a remote-access update platform to install upgrades of the application to production platforms.
• To retrospectively enter manual sales and purchase orders and invoices to test the database application's accuracy and correctness, using live verifiable data.
• To develop training materials and user manuals explaining operational usage.
• To document the technical processes and data descriptions necessary to define the analysis of the initial phase.

OUTCOME:
• Implemented the developed database application for live operation by Q1 2011.
• Rolled out the database application to other agents operating in euros by Q2 2011.
-
Data Analyst (Retail Banking - IT - Data Warehouse)
Lloyds TSB Bank »» Jun 2010 – Sep 2010
Bristol, United Kingdom

OBJECTIVE:
To match and merge personal customer data between the HBoS and Lloyds TSB retail customer datasets (30 million customers) using deep-level extraction functionality in Teradata SQL, in order to establish the safe merging of Retail Banking accounts and achieve a 100% merge of all mutually inclusive retail product-based account holders.

APPROACH:
• To develop and identify matching scenarios, using Teradata SQL, which safely match customers using personal and product-related data for the final merge, and to use Teradata SQL to identify and test the proposed scenarios.
• To analyse each merge's primary matching patterns to identify further matching scenarios.
• To map the data of the downstream target system to the key upstream source system in data-driven tables to assess the impact on the downstream target system.

OUTCOME:
• Established a near-complete merge of mutually inclusive personal customer data in the Retail Banking datasets.
-
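The layered matching-scenario approach above, trying the strictest rule first and only falling back to looser rules when it is still safe, can be illustrated with a small sketch. The actual work was done in Teradata SQL; the field names and scenario labels here are illustrative assumptions.

```python
def match_scenarios(a, b):
    """Try matching rules from strictest to loosest and report which
    scenario (if any) safely links two customer records."""
    def norm(s):
        return (s or "").strip().lower()

    # Scenario 1 (strictest): exact name + date of birth + postcode.
    if (norm(a["name"]) == norm(b["name"])
            and a["dob"] == b["dob"]
            and norm(a["postcode"]) == norm(b["postcode"])):
        return "S1:name+dob+postcode"
    # Scenario 2: date of birth + a shared product/account reference.
    if a["dob"] == b["dob"] and set(a["accounts"]) & set(b["accounts"]):
        return "S2:dob+shared_account"
    return None  # no safe match: leave the pair for manual review

pair = ({"name": "J Smith", "dob": "1970-01-01", "postcode": "BS1 4TR", "accounts": {"A1"}},
        {"name": "j smith ", "dob": "1970-01-01", "postcode": "bs1 4tr", "accounts": {"A9"}})
```

In SQL each scenario would be a join on the candidate keys, with already-matched customers excluded before the next, looser scenario runs, so a record can only ever be merged by the strictest rule that fits it.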
Data Analyst (Corporate & Commercial Banking - Recoveries)
Lloyds TSB Bank »» Apr 2010 – Jun 2010
Bristol, United Kingdom

OBJECTIVE:
To measure the impact of the migration of commercial systems (120 systems in all) taking place between the core banking systems of Halifax/Bank of Scotland (HBoS) and Lloyds TSB Bank (LTSB), predicting the effect on Wholesale Banking Recoveries and agreeing the changes necessary to the Target Operating Model (TOM) to gain the possible benefits implicit in the digestion of the source bank.

APPROACH:
• Working as part of the BAU Change Team (12 strong) in Wholesale Banking Recovery (WBR), process mapping to identify and compare each current Source Operating Model and define possible Target Operating Models (TOMs) for consideration by key stakeholders and Subject Matter Experts (SMEs).
• To consider the existing defined mapped processes and data with regard to the WBR systems and the proposed end-to-end processes.
• To negotiate agreement between the key stakeholders on the final TOM, with respect to the imperatives defined by other core banking processes.
• To build data models of the upstream systems directly influencing the WBR systems.
• To liaise with external consultants to help validate the overall approach and method of creating the TOM, and achieve the required Advanced Basel II status in the new operating model.
• To assess, confirm and validate the process models identified by stakeholder groups, in order to assess the impact on the WBR systems.

OUTCOME:
• Achieved a coherent picture of the datasets required to attain Advanced Basel II status and the likely impact on the WBR systems.
• Ensured commonly agreed data definitions, which in turn led to the accurate and coherent extraction of MI reports and month-on-month comparability.
• Reached common agreement among stakeholders on the common datasets needed to enhance the next phase of systems integration.
• Enabled the overall migration projects to be approached with confidence, knowing the required datasets had been identified and cleaned and that Advanced status could be achieved as far as WBR is concerned.
-
Senior Data/Systems Analyst & Project Manager
Circle Anglia »» May 2008 – Mar 2010
Norwich, United Kingdom

OBJECTIVE:
To achieve the transfer of business data ownership from IT personnel to business-led personnel within the Group, enabling business-led Business User Group (BUG) teams to review their data coherence and coding systems and make business changes to their central repository, whilst managing the impact of those changes in a structured and considered manner. Leading customer-facing BUGs to decide which tasks should be prioritised and the best approaches to take to achieve the BUGs' stated aims, with respect to the Target Operating Models (TOMs) across the Housing Operation functional models. Migrating new data structures from newly absorbed housing associations into the core group data model.

APPROACH:
• With the aid of well-designed application tools and intranet wikis, key data was extracted from SQL Server data warehouses to create a set of business tools clarifying the complex coding system of 35,000+ codes related to the 57,000+ properties owned across the Group, to facilitate current business.
• To design and implement an Ethnic & Diversity database to show the business the extent of the data diversity with respect to nationally set completion targets.
• To map the new data structures of acquired companies to the core group data model to assist migration.
• To design and implement a property reconciliation process to update the 5 key survey databases in line with the central repository.
• To design and implement extraction processes to interface E & D data with external questionnaire websites, and provide tenant satisfaction call sheets for Repairs, Anti-Social Behaviour, Lettings, Aids & Adaptations, etc.

OUTCOME:
• Provided the business operational personnel with the data tools they required to effectively and confidently define and code the central repository.
• Ensured commonly agreed data definitions, which in turn led to the accurate and coherent extraction of MI reports and month-on-month comparability.
-
Sabbatical
Motorcycle Tour »» Mar 2008 – May 2008
Bali » Cambodia » Laos » Vietnam

A rare but unmissable opportunity to take a motorcycle tour of four countries in South East Asia, travelling through regions of outstanding natural beauty that had not been possible to visit when I lived there 3 decades earlier. Touring each of the above countries on pre-ordered rented motorcycles on pre-planned or bespoke itineraries.
-
Senior Business Analyst
Lloyds TSB Bank »» Nov 2006 – Mar 2008

OBJECTIVE:
To successfully provide coherent, clean, statutory accountability data to the Credit Risk Analysis department in Business Banking. Led customer stakeholder groups to decide priorities by creating a monthly data subset of:
• SME companies (Basel-compliant, < €1 million turnover), presenting editable data to credit risk managers for their assessment and added-value decisions to modify initial credit scores.
• all SME companies, for remote customer access by lending managers through the Credit Risk community within Lloyds TSB throughout the UK, ensuring customer data safety and giving management a balanced view of risk and exposure.

APPROACH:
• Used the corporate data warehousing facility to extract criteria-specific data and present it each month in an editable form for the UK-wide specific manager to update with added-value contact details, customer-agreed plans, adjusted ratings and credit scores for operational and management review.
• Created accurate analysis of corporate data regarding the rating and scoring of customers for compliance with FSA regulations, for larger corporate clients. Led a small team of developers to redesign some existing added-value systems germane to the statutory requirements of the Risk Analysis section of the Bank.
• Used the corporate data warehouse to extract new business and customers moving in and out of the scope of regulation, allowing nationwide loan assessment staff to keep track of their customers without carrying large amounts of client data on their laptops, with a focus on data security and reporting on risk and exposure.

OUTCOME:
• Provided the compliance managers with the method to prove their compliance, satisfying the requirements of the FSA and passing the outstanding anomalies in the current FSA compliance audit.
-
Senior Systems Analyst
Birmingham City Council »» Oct 2004 – Oct 2006
Birmingham, United Kingdom

OBJECTIVE:
• To develop key data acquisition objects within the AQUA class objects to allow data to be accessed from all major RDBMS backends to satisfy user demand.
• To implement one deliverable application to satisfy all 133 clients throughout the AQUA client base.

APPROACH:
• Led a team of developers, trainers and analysts supporting the AQUA LSC funding database used by 133 LEA FE & HE colleges around the country.
• Analysed, designed, developed and implemented a complex but efficient class infrastructure across 350+ screens to ensure object integrity, thus integrating web application systems into existing organisational WAN solutions.

OUTCOME:
• Enabled the AQUA team to deliver one product to all their clients, allowing each client to independently configure their own system to operate with whichever RDBMS they chose to hold their data in.
• Simplified the maintenance, delivery, training and support of the product across the entire client base, making a very real positive contribution to the overall profitability of the AQUA enterprise.
Anthony Millard Skills
Anthony Millard Education Details
- Mathematics
Frequently Asked Questions about Anthony Millard
What company does Anthony Millard work for?
Anthony Millard works for Charterhouse Systems Consultancy Ltd.
What is Anthony Millard's role at the current company?
Anthony Millard's current role is Data Architect at Jaguar Land Rover UK.
What is Anthony Millard's email address?
Anthony Millard's email address is to****@****.uk.com
What schools did Anthony Millard attend?
Anthony Millard attended University Of Portsmouth.
What skills is Anthony Millard known for?
Anthony Millard has skills like Data Warehousing, Databases, SQL, Change Management, Microsoft SQL Server, Software Documentation, Data Mapping, Requirements Analysis, Data Modeling, VBA, Data Migration, Analysis.