Regulatory Compliance: The Critical Importance of Data Quality
Capgemini Financial Services
ACORD IT Club Presentation
29 July 2010
Confidentiality Agreement
Notice to the Recipient of this Document: The information contained herein is considered proprietary and confidential information of Capgemini, and its release would offer substantial benefit to competitors offering similar services. This material includes descriptions of methodologies and concepts derived through substantial research and development efforts undertaken by Capgemini. No part of this document may be reproduced by any means or transmitted without the prior written permission of Capgemini, except with respect to copies made or transmitted internally by you for the purpose of evaluating this document.
Introduction
The rationale for governmental solvency surveillance is to facilitate the development of a single market in insurance services throughout a broad, diverse geographical union while at the same time securing an adequate level of consumer protection. "The aim of the Solvency II regime is to ensure the financial soundness of insurance undertakings, and in particular to ensure that they can survive difficult periods. This is to protect policyholders (consumers, businesses) and the stability of the financial system as a whole." [1]

Solvency II rules stipulate that "[i]nsurance undertakings [must] have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used in the calculation of their technical provisions." [2]

Discussion Points
1. Data become believable for use in capital allocation and performance measurement only when their quality is assured.
2. The data quality requirements of Solvency II can be achieved only when an organisation has a fully operational data quality program in place.

[1] http://ec.europa.eu/internal_market/insurance/docs/solvency/solvency2/faq_en.pdf
[2] Article 82, "Data quality and application of approximations, including case-by-case approaches, for technical provisions"
Accuracy, Completeness and Appropriateness
Insurers must be able to prove that they hold adequate levels of capital to remain a going concern. The risk metrics required by Solvency II are only meaningful when the data used to calculate them are of high quality across the dimensions of accuracy, completeness and appropriateness. These are the dimensions that assure data quality.

Completeness - simply put, something is complete when nothing needs to be added to it. For example, in risk and compliance model algorithms, the notion of completeness refers to the ability of the algorithm to find a solution if one exists and, if not, to report that no solution is possible.

Accuracy - the closeness to the true value, seen as the degree of agreement for one entity as observed from two or more independent observations. For example, if both Moody's and S&P provide an investment grade rating for an organisation, then there is a high probability that a rating of investment grade for that organisation is accurate.

Appropriateness - the quality of having the properties that are right for a specific purpose. For example, to calculate a person's age at policy inception, the person's date of birth and the policy's date of inception are appropriate, while the person's date of marriage or the policy's expiration date are not.
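To make the appropriateness example concrete, here is a minimal sketch in Python (the function and field names are illustrative, not drawn from any Capgemini system) of computing age at policy inception from the two appropriate inputs and nothing else:

```python
from datetime import date

def age_at_inception(date_of_birth: date, inception_date: date) -> int:
    """Age in completed years at policy inception.

    Uses only the two appropriate inputs; a marriage date or policy
    expiration date would not be fit for this purpose.
    """
    years = inception_date.year - date_of_birth.year
    # Subtract one year if the birthday has not yet occurred by inception.
    if (inception_date.month, inception_date.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

# Example: born 15 Mar 1960, policy incepts 1 Jan 2010 -> age 49
print(age_at_inception(date(1960, 3, 15), date(2010, 1, 1)))
```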
Impact of Poor Quality on an Organisation

Increased Risk
- Regulatory Risk
- Reputational Risk
- Investment Risk
- Privacy Risk
- Competitive Risk
- Fraud Detection Risk

Increased Costs
- Detection and Correction
- Penalties
- Overpayments
- Conservative Capital Adequacy Reserves
- Increased Workloads
- Processing Delays

Decreased Revenue
- Customer Attrition
- Lost Opportunities
- Delays in Collections

Low Confidence
- Trust Issues
- Faulty Decision-Making
- Inconsistent Management Reporting
- Impaired Forecasting Ability
Ten Best Practices of an Operational Data Quality Program
I. Treat your data as a corporate asset
II. Establish accountabilities and responsibilities
III. Know the meaning of your data
IV. Control your data supply chain
V. Let data consumers define fit-for-purpose
VI. Consider adjustments as data quality rule breaches
VII. Don't incorporate data quality rules into application logic
VIII. Eliminate supply chain complexity wherever possible
IX. Manage and organise your data domain knowledge
X. Monitor data quality across the entire data supply chain
I - Treat your data as a corporate asset
- Focus should be on the business value that data provides rather than the costs associated with its acquisition and storage
- Perception should be that the value extracted from your data provides a significant strategic competitive advantage
- Commitment is required from the highest levels of the organisation on down
- Organisations that manage their data as an asset take a systematic approach to ensuring the quality of their information
- An effective operational data quality program supports the data-as-a-corporate-asset concept
II - Establish accountabilities and responsibilities
A basic data governance framework ensures that everyone within an organisation knows the role(s) they play and who is accountable for providing quality data.

Originators
- Where data ultimately originates
- Can be internal or external
- Internal groups are originators when they derive new data

Owners
- Ultimately accountable for the quality and reliability of data
- Make decisions on the data to address the needs of consumers
- Have the authority to make decisions on data

Stewards
- Know the data and its implications best
- Responsible for upkeep and enforcement of data governance
- Typically involved in business operations

Custodians
- Responsible for the movement and storage of data
- Create and enforce data standards
- Typically part of the IT function

Consumers
- Use the data to make business decisions
- Can also be originators if they use data to derive new data
III - Know the meaning of your data
All stakeholders in the data supply chain need to have a common understanding of the meaning of data, and of each other's expectations in terms of data quality.

[Diagram: Business Terms Glossaries for the Retail Line of Business, the Commercial Line of Business, and Risk & Finance (each with a Master Name Glossary, and for Retail also a Connectors and Qualifiers Glossary) are each promoted to a single Enterprise Business Terms Glossary.]
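As one way to picture the promotion step in the diagram, a minimal Python sketch with hypothetical structures: line-of-business glossary terms are promoted into a single enterprise glossary, and conflicting definitions are surfaced rather than silently overwritten.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Term:
    name: str          # business term, e.g. "Country of Risk"
    definition: str    # agreed business meaning
    source_lob: str    # line of business whose glossary the term came from

enterprise_glossary: dict[str, Term] = {}

def promote_to_enterprise(term: Term) -> None:
    """Promote an LOB glossary term so all stakeholders share one definition."""
    existing = enterprise_glossary.get(term.name)
    if existing is not None and existing.definition != term.definition:
        # Conflicting meanings must be reconciled, not silently overwritten.
        raise ValueError(f"Conflicting definitions for {term.name!r}")
    enterprise_glossary[term.name] = term

promote_to_enterprise(Term("Country of Risk",
                           "Country where the insured exposure is located",
                           "Retail"))
```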
IV - Control your data supply chain
- Senior executives rely on aggregated data to make business decisions
- The origin of data and the transformations applied to it can be extremely complex, especially in organisations that have grown via acquisition
- Manual adjustments performed to resolve consolidation issues further cloud the picture
- A methodical approach to operationalising the data supply chain is a mandatory prerequisite to any regulatory compliance project
V - Let data consumers define fit-for-purpose
Data consumers know what fit-for-purpose should be for the data that they use.
- Rules should be expressed using the terms by which the data consumers refer to their data (e.g. "Country of Risk") rather than the terms by which the data is persisted in a data mart or data warehouse
- Rules should be expressed as assertions, which can be either true or false
- If-then syntax provides the best level of common understanding

Example of a data quality exception check:

if the year of birth of the policyholder is before 1950
and the golden age discount of the policy of the policyholder is 0
then set 'policyholder golden age discount applied' to false;
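A minimal Python sketch of the same check, assuming a simple record layout (the field names are hypothetical); it expresses the rule as an assertion that is either true or false:

```python
def golden_age_discount_breach(policyholder: dict) -> bool:
    """True when the rule is breached: a pre-1950 policyholder has no discount."""
    return (
        policyholder["year_of_birth"] < 1950
        and policyholder["golden_age_discount"] == 0
    )

record = {"year_of_birth": 1948, "golden_age_discount": 0,
          "golden_age_discount_applied": True}
if golden_age_discount_breach(record):
    record["golden_age_discount_applied"] = False  # flag for remediation
```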
VI - Consider adjustments as data quality rule breaches
- When a rule FAILS, the breach condition should be expressed such that remediation can be initiated on a timely basis
- Data quality rules should be applied as early as possible in the data supply chain
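One way to make a breach actionable is to record it with enough context to route remediation promptly. A minimal sketch, with hypothetical fields:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Breach:
    rule_name: str            # e.g. "golden age discount applied"
    record_key: str           # key of the offending record
    supply_chain_stage: str   # where the rule fired, e.g. "source intake"
    detected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

breach = Breach("golden age discount applied", "POL-000123", "source intake")
print(f"[{breach.detected_at:%Y-%m-%d}] {breach.rule_name} breached "
      f"at {breach.supply_chain_stage} for {breach.record_key}")
```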
VII - Don't incorporate data quality rules into application logic
- Data quality rules are business rules
- A business rule management system should be employed to manage the specification, creation, testing, execution and retirement of an organisation's data quality business rules
- Several enterprise-strength business rule management systems are available in the marketplace

[Diagram: Business applications delegate business rule execution to a business rule management system, which in turn feeds breach remediation and quality metrics reporting.]
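A minimal sketch of the separation this practice calls for, in plain Python rather than any particular rule-engine product: rules are registered outside the application code, which only asks which rules failed.

```python
from typing import Callable

RuleFn = Callable[[dict], bool]   # returns True when the record passes
RULES: dict[str, RuleFn] = {}

def rule(name: str):
    """Register a data quality rule under a business-readable name."""
    def register(fn: RuleFn) -> RuleFn:
        RULES[name] = fn
        return fn
    return register

@rule("policyholder year of birth is plausible")
def _year_of_birth(record: dict) -> bool:
    return 1890 <= record.get("year_of_birth", 0) <= 2010

def check(record: dict) -> list[str]:
    """Application calls this; it knows nothing about individual rules."""
    return [name for name, fn in RULES.items() if not fn(record)]

print(check({"year_of_birth": 1850}))
# -> ['policyholder year of birth is plausible']
```

Because the rules live in the registry rather than in application logic, they can be added, tested and retired without touching or redeploying the applications that consume them.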
VIII - Eliminate supply chain complexity wherever possible
- The complexity and opacity of the data supply chain make it difficult to identify the ultimate data owners, who are responsible for data quality
- As a result, consolidation issues are resolved by manually adjusting data at the consumption point

[Diagram: Multiple operational stores feed a warehouse and several data marts along tangled, overlapping paths.]
IX - Manage and organise your data domain knowledge
The Common Information Model describes data throughout the supply chain. It is the cornerstone of the Capgemini Data Quality Management Framework.
- It includes a Business Dictionary of terms used in the supply chain and a Business Object Model showing relationships between these terms
- It fosters a common understanding of the data across all functions and lines of business
- It provides full data traceability and lineage

[Diagram: The Common Information Model spans the entire supply chain, from the operational stores through the warehouse to the data marts.]
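A minimal sketch (hypothetical structures, not the actual Capgemini framework) of a Business Dictionary entry that ties a business term to its physical homes across the supply chain, giving simple lineage:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PhysicalLocation:
    store: str    # e.g. "Operational Store A", "Warehouse", "Risk Mart"
    column: str   # physical column name in that store

@dataclass
class DictionaryEntry:
    term: str
    definition: str
    lineage: list[PhysicalLocation]   # ordered from origin to consumption

entry = DictionaryEntry(
    term="Country of Risk",
    definition="Country where the insured exposure is located",
    lineage=[
        PhysicalLocation("Operational Store A", "CTRY_RISK_CD"),
        PhysicalLocation("Warehouse", "country_of_risk"),
        PhysicalLocation("Risk Mart", "risk_country"),
    ],
)
# Trace where the term is persisted, origin first:
for loc in entry.lineage:
    print(f"{entry.term} -> {loc.store}.{loc.column}")
```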
X - Monitor data quality across the entire supply chain
Data Quality Metrics Dashboards provide the user view into the quality of a data population. Organising the dashboard according to process step and business rule will usually provide the best view for subsequent remediation efforts.
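A minimal sketch of the aggregation underneath such a dashboard: breach counts keyed by process step and business rule, the organisation the slide recommends (the steps and rule names are illustrative):

```python
from collections import Counter

breaches = [
    ("source intake", "country of risk is populated"),
    ("source intake", "country of risk is populated"),
    ("warehouse load", "policy inception precedes expiration"),
]
by_step_and_rule = Counter(breaches)
for (step, rule_name), count in sorted(by_step_and_rule.items()):
    print(f"{step:15s} | {rule_name:40s} | {count}")
```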
Summary: Vision, Strategy, Objectives and Execution

Coordinate and Cooperate: Data Quality is a Program, not a Project. It must be an on-going and sustainable business effort that brings together Business, IT, Finance and Risk. It is a Solvency II program spanning all functional areas.

Normalize: Information must be presented consistently across the companies of a multi-national entity.

Align: Business and IT must be coordinated with respect to the company's information knowledge base and the Solvency II program.

Standardize & Repeat: In a multi-national company, given the multiple operational entities across geographies that may be involved in a Solvency II program, insurance companies must provide a blueprint for execution of their data quality and Solvency II information management programs. Solvency II compliance may be step 1 and the initial priority; step 2 in the roadmap may often include financial reporting on an economic capital basis, capital attribution, and performance metrics such as risk-adjusted return on capital (RORAC).

Enforce & Sustain: Any data quality program must be enforceable and sustainable, and it must deliver trustworthy business information for business decision making. Enforceability often crosses operational boundaries, and this must be taken into account in establishing a Solvency II program.
Contributors

Frank Lemieux
Frank is a senior manager in the Business Information Management practice for Capgemini Financial Services, with over 25 years of project management, business analysis, technical design and implementation experience serving the financial services industry. He has extensive subject matter expertise in insurance processing applications (underwriting, policy processing and claims handling), reference data, market data and risk data, including the associated metadata requirements and data quality issues. His focus includes metadata management, information taxonomy and ontology solutions, strategic blueprints and roadmaps for enterprise data management platforms, workflow management, requirements gathering, database design, ETL, relational data models, prototyping, software development, quality assurance, project management, business analysis, and data warehousing. He currently heads the Global Data Quality and Business Semantics practice within BIM and provides oversight to all engagements involving those disciplines.

Prof Bryan Foss
Bryan uses his extensive international FS, business governance and customer management experience to advise boards and decision makers on successful governance and change strategies. He is currently working with leading firms and the FSA on compliance areas including Solvency II, Board Appointments, RDR, Operational Risk and Complaints Management. Bryan is extensively qualified as a director, and also in marketing, finance and technology.

Peter Dixon
Peter is a highly experienced insurance and IT professional with over 30 years in the insurance industry. Originally trained as an underwriter, he spent several years as a Property Underwriting Manager before moving into IT. Working for various insurance software organisations, he performed business analysis and pre-sales roles before moving into IT management as Head of IT for two Lloyd's syndicates.

Dr Bill Scheel
Bill specializes in modeling, simulation and mathematical optimization for complex financial and actuarial problems. He has developed internal models for Solvency II Life and Non-Life capital charges and has programmed the standard model SCR (Life), SCR (Health) and SCR (Non-Life). Over the last couple of decades, Bill has architected and built actuarial computer systems for actuarial, accounting and commercial software companies.
www.capgemini.com/financialservices