DATA AUDIT: Scope and Content




The schedule below defines the scope of a review that will assist the FSA in its assessment of whether a firm's data management complies with the standards set out in the Solvency II Directive. It is part of the FSA's Internal Model Approval Process (IMAP) and will supplement both the evidence that the firm uses as part of its self-assessment in preparation for the application and any additional work that the FSA may choose to undertake.

The scope of this review is all data (internal and external [1]) that could materially impact the Internal Model. This includes, for example: policy (exposure) data held in administration systems; observational data such as claims data, mortality data, market data and credit data; static or referential data; equity exposure data; and model parameters set by users, such as correlation input data and the number of simulations.

The review should be performed by a suitably qualified person who is independent of model design, build and operation (e.g. Internal Audit). In conducting the review, the reviewer should apply professional judgement in deciding how the controls are assessed (e.g. sample size, depth of document review, interviewees) and how effective they are in addressing the risk. The review is not intended to assess the appropriateness of actuarial Expert Judgements with regard to data used in the Internal Model. The reviewer may make use of previous independent reviews (e.g. SOX compliance assessments, Internal/External Audit work), so long as the data, assumptions, calculation methodology and IT environment reviewed have not changed significantly.

After conducting the review, the firm should report back to the FSA with the following:

- An executive summary that includes: the scope (data, business units, etc.) of the review; any exclusions, with justification; the summary work plan, with the approach taken (i.e. testing performed, documents reviewed, interviews conducted, sampling criteria used) and the period covered; the timescales of the review; the personnel involved in the review (and their background, if not obvious from their job titles); significant/material findings; and the limitations of the review.
- Conclusions on individual controls (Yes/No) using the schedule below.
- If the conclusion is a No: a list of findings; the potential impact (residual risk) on the firm's internal model; a list of actions to address the findings; and the expected completion date. Please indicate whether each action was planned (i.e. already part of the firm's Solvency II programme) or remedial (i.e. not originally in plan). Please also indicate whether the impact is at group level or at a specific legal entity level.
- If a previous independent review was used to arrive at a conclusion: the scope of that review, any exclusions with justification, and the date of the review.

In addition to the above, the FSA may request supporting evidence that formed the basis of the conclusions on individual controls, and may also request meetings with those who conducted the review.

[1] 'External data' in the context of this review means data that is externally sourced by the firm for use either directly or indirectly in the Internal Model. It does not include data that is proprietary to vendor models external to the firm (i.e. not directly within the control of the firm).

Using the schedule

The schedule has five sections, corresponding to the five sub-risks to the overall risk that the data used in the internal model do not meet the Solvency II Directive requirements on data quality (complete, accurate, appropriate, and timely). These are as follows:

1. The approach (i.e. matters of policy) to managing data for use in the internal model does not ensure consistency in quality and application.
2. Inadequate oversight of the development and implementation of the data policy increases the risk of poorly informed decision-making (e.g. risk management, capital allocation) and non-compliance with the required quality and standards.
3. Lack of a clear understanding of the data used in the internal model, and of its impact and vulnerabilities, can create gaps in ownership and control.
4. Errors, omissions, lack of timeliness and inaccuracies in the data can undermine the integrity of the internal model and management decision making.
5. Unreliable IT environment, technology or tools can compromise the quality of the data and its processing within the internal model.

It is the responsibility of the reviewer to construct a suitable approach to assess the controls over these sub-risks. The examples given under the Assessment Approach column in the schedule are not intended to be a prescriptive list. The conclusion for each control can be either a Yes or a No.

Yes = the controls are in place (i.e. written, communicated and understood by relevant stakeholders) and operating effectively, and the residual risk to the firm is not material.

No = the controls are either not in place or not operating effectively, and/or the firm remains exposed to material risks.

The reviewer's assessment of the materiality of residual risk should be in accordance with the definition of materiality in the firm's data policy. This definition should be shared with the FSA prior to conducting the review.

The schedule sets out, for each risk: the control objective(s), the expected control(s), the assessment approach, and a Conclusion (Yes/No) to be completed by the reviewer.

Risk 1: The approach to managing data for use in the internal model does not ensure consistency in quality and application.

Control objective 1.1: To ensure that data quality is maintained throughout the process of the internal model, as required by Solvency II.

Expected control 1.1.1: A data policy has been established and implemented. The policy, its associated procedures, and standards include:
- a definition of the different data sets that are to be covered by the policy;
- a definition of materiality (aligned to the firm's risk appetite where appropriate, e.g. when an expert judgement is made to adjust for insufficient observational data);
- the respective ownership of and responsibility for the data sets, including the system of governance and assurance over data quality;
- a definition of the standards for maintaining and assessing the quality of data, including specific qualitative and quantitative standards for the data sets, based on the criteria of accuracy, completeness and appropriateness;
- the use of assumptions made in the collection, processing and application of data;
- the process for carrying out data updates to the internal model, including the frequency of regular updates and the circumstances that trigger additional updates and recalculations of the probability distribution forecast;
- a high-level description of the risk and impact assessment process, including the frequency with which the assessment is conducted, and;
- the frequency of the review of the data policy, associated procedures, and standards.

Assessment approach 1A: Confirm that the firm has:
i) a written data policy approved by management, with an appropriate degree of challenge and oversight in its development, as evidenced through discussions and debates in minutes and/or equivalent documentation;
ii) written procedures, technical guides and standards for implementing the data policy, and;
iii) implemented the policy across the organisation, as evident from the communication of the policy, associated procedures and standards to the relevant stakeholders and individuals responsible for ownership and management of data used in the internal model, and from the understanding of those stakeholders and individuals.

Risk 2: Inadequate oversight of the development and implementation of the data policy increases the risk of poorly informed decision-making and non-compliance with the required quality and standards.

Control objective 2.1: To set the tone and provide appropriate oversight of the implementation of the data policy necessary for sound decision making.

Control objective 2.2: To ensure appropriate and timely reporting to support the required governance and management decision-making processes and the timely detection of issues.

Expected control 2.1.1: The data governance structures and processes are operating as defined in the data policy and associated procedures, and are effective in:
- providing appropriate oversight in the application of the data policy;
- ensuring that the data policy, associated procedures and standards (including the responsibilities and accountabilities of the various stakeholders across the firm), the quantity and quality of data metrics reported to management, the data directory, and the risk and impact assessment are kept under regular review, and;
- ensuring appropriate assurance is carried out and received for validating the quality of data used in the internal model.

Expected control 2.2.1: Data quality metrics (qualitative and quantitative) defined in the data policy are reported (individually, aggregated or categorised) to appropriate levels of management on a regular basis, to enable them to assess the quality of data and take remedial action when there are material issues. The system of reporting should include a deficiency management process whereby exceptions identified as a result of data quality checks and controls, which could have a material impact on the internal model, are escalated to appropriate levels of management and actions taken to address them on a timely basis.

Assessment approach 2A: Review the firm's data governance arrangements and their fit with the organisation structure to determine:
- completeness of oversight as shown, for example, by the terms of reference and agenda, and;
- key discussions, debates, decisions and approvals, from a review of minutes.

Assessment approach 2B: Review and assess:
- the revision history of, and changes made to, the policy, associated procedures, standards, governance framework, data directory, and risk and impact assessment;
- the nature and timeliness of MI reports received;
- the extent of exception reporting, for appropriateness and effectiveness;
- remedial actions taken to resolve exceptions, and;
- through interviews with key personnel, the level of understanding of their governance responsibilities and MI reports.

Risk 3: Lack of a clear understanding of the data used in the internal model, and of its impact and vulnerabilities, can create gaps in ownership and control.

Control objective 3.1: To ensure that the data used in the internal model, and its impact and vulnerabilities, have been clearly identified and are maintained.

Expected control 3.1.1: A directory of all data used in the internal model has been compiled, specifying source, usage and characteristics, including:
- storage (e.g. location, multiple copies) across the data flow to the internal model, and;
- how data is used in the internal model, including any transformation (e.g. aggregation, enrichment, derivation) processes.

Expected control 3.1.2: For each data set, a risk and impact (sensitivity) assessment has been performed to identify:
- whether the impact of poor quality data (individually or in aggregation) on the internal model is material;
- the points in the data flow from source to internal model where the likelihood of data errors is greatest, and therefore what specific data quality controls are required, and;
- the tolerance threshold beyond which a data error could become material (individually or in aggregation).

Assessment approach 3A: Review the firm's data directory to determine its clarity, completeness and maintainability.

Assessment approach 3B: Review the firm's risk and impact assessment for completeness, including an appropriate consideration of the outcome of the assessment against any issues reported in the data quality metrics.

Assessment approach 3C: Confirm that the tolerance thresholds and materiality used are consistent with the reporting to the relevant management groups or governance oversight bodies.

Risk 4: Errors, omissions and inaccuracies in the data can undermine the integrity of the internal model and management decision making.

Control objective 4.1: To ensure that data quality (complete, accurate, appropriate, and timely/current) is maintained in the internal model.

Expected control 4.1.1: Management and data quality controls (preventative, detective, and corrective), proportional to the probability and materiality of potential data errors, have been identified and implemented effectively. The controls should include, at a minimum:

4.1.1.a individuals with sufficient competence to conduct the manual data checks on accuracy, completeness and appropriateness;

4.1.1.b a well-defined and consistent process for refreshing or updating all data items in line with the data policy (timeliness and currency of data); the process must include appropriate change controls (automated or manual) that take into account any material impact (individually or in aggregation) on the internal model;

4.1.1.c data input validations (automated or manual) that prevent data having an incorrect or inconsistent format or invalid values;

4.1.1.d completeness checks, such as:
- reconciliation of data received against data expected, and;
- a process to assess whether data is available for all relevant model variables and risk modules;

4.1.1.e accuracy checks, such as:
- comparison directly against the source (if available);
- internal consistency and coherence checks of the received/output data against expected properties of the data, such as age range, standard deviation, number of outliers, and mean, and;
- comparison with other data derived from the same source, or from sources which are correlated;

4.1.1.f appropriateness checks, such as:
- consistency and reasonableness checks to identify outliers and gaps through comparison against known trends, historic data and external independent sources;
- a definition and consistent application of the rules that govern the amount and nature of data used in the internal model, and;
- a process to assess the data used in the internal model for any inconsistencies with the assumptions underlying the actuarial and statistical techniques, or made during the collection, processing and application of data.

Assessment approach 4A: Review and evaluate the firm's documented control procedures to assess their completeness and appropriateness in meeting the control objective.

Assessment approach 4B: Assess the adequacy of the training/experience of individuals responsible for critical stages of data checks.

Assessment approach 4C: Assess the adequacy of change controls for a sample of key changes/updates made.

Assessment approach 4D: Walk through the key validations and checks with key personnel to assess the degree of understanding and embedding.

Assessment approach 4E: Assess the operational effectiveness of the key validations and checks.

Risk 5: Unreliable IT environment, technology or tools can compromise the quality and integrity of the data and its processing within the internal model.

Control objective 5.1: To ensure that the quality of data and its processing for use in the internal model is maintained.

Expected control 5.1.1: IT general controls (ITGCs) over the data environment (e.g. mainframes, end user computing applications such as spreadsheets) that may have a material impact on the internal model are established, such as:
- logical access management;
- development and change management (infrastructure, applications, and database);
- security (network and physical);
- business continuity;
- incident management and reporting, and;
- other operational controls that support the collection (including data feeds), storage, analysis and processing of data.

Assessment approach 5A: Assess the design and operational effectiveness of the key ITGCs that relate to the data sets as defined and required by the internal model.

Assessment approach 5B: Review key IT MI reports, such as network and access security breaches, system downtime and coding errors, to determine whether any incidents that impact materially on the internal model have been followed through and resolved appropriately.
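Several of the checks named in the schedule lend themselves to automation: reconciliation of data received against data expected (4.1.1.d), coherence checks of received data against expected statistical properties (4.1.1.e), and the tolerance threshold from the risk and impact assessment (3.1.2). The Python sketch below is illustrative only and is not part of the guidance; the class, function and field names, and the thresholds, are assumptions chosen for the example.

```python
# Illustrative sketch only: a minimal automated data quality check of the
# kind the schedule describes. All names and thresholds are hypothetical.

from dataclasses import dataclass, field


@dataclass
class DataSetAssessment:
    """One hypothetical entry of the risk and impact assessment (3.1.2)."""
    name: str
    tolerance: float              # error rate beyond which an error is material
    findings: list = field(default_factory=list)


def completeness_check(assessment, received_ids, expected_ids):
    """4.1.1.d-style check: reconcile data received against data expected."""
    missing = expected_ids - received_ids
    if missing:
        assessment.findings.append(f"missing records: {sorted(missing)}")
    return not missing


def accuracy_check(assessment, values, expected_mean, expected_sd, n_sd=4.0):
    """4.1.1.e-style check: coherence of received data against expected
    properties (here, values far from the expected mean count as outliers)."""
    outliers = [v for v in values if abs(v - expected_mean) > n_sd * expected_sd]
    error_rate = len(outliers) / len(values)
    if error_rate > assessment.tolerance:
        assessment.findings.append(
            f"outlier rate {error_rate:.1%} exceeds tolerance "
            f"{assessment.tolerance:.1%}")
    return error_rate <= assessment.tolerance


# Usage: a hypothetical claims data set with a 1% materiality tolerance.
claims = DataSetAssessment(name="claims", tolerance=0.01)
ok_complete = completeness_check(claims, received_ids={1, 2, 3},
                                 expected_ids={1, 2, 3, 4})
ok_accurate = accuracy_check(claims, values=[10.0, 11.0, 9.5, 10.5],
                             expected_mean=10.0, expected_sd=1.0)
print(ok_complete, ok_accurate, claims.findings)
```

In practice such checks would be driven by the firm's own data directory and data policy, with exceptions escalated through the deficiency management process described under 2.2.1 rather than simply printed.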