Solvency II Data audit report guidance March 2012

Contents

Introduction
  Purpose of the Data Audit Report
  Report Format and Submission
  Ownership and Independence

Scope and Content
  Scope of the Data Audit Report
  Contents of the Data Audit Report

Appendices
  1. Example template for Data Audit Report
  2. Data Audit Framework


Introduction

As set out in the published 2012 Solvency II timetable, Lloyd's requires all managing agents to submit a Data Audit Report on 15 June 2012, in line with FSA requirements. The following guidance is intended to give agents more detail on what Lloyd's expects to see in agents' Data Audit Reports. Agents should note that any additional guidance provided in this document is intended to supplement the Level 2 measures, not repeat them; agents must therefore ensure that they are familiar with all of the requirements and do not rely solely on the guidance provided here.

Lloyd's issued Detailed Guidance Notes for the Dry Run Process in March 2010, which covered data requirements; these are available on lloyds.com via the link below:

Link to 2010 Guidance on data

Revised draft Level 2 measures were published by EIOPA in November 2011 but have not yet been finalised. As and when further details or changes emerge on Level 2 or Level 3 implementing measures, Lloyd's will issue updates to the published guidance as appropriate. This plan and any further guidance issued are subject to ongoing discussion and change as European Commission (EC), European Insurance and Occupational Pensions Authority (EIOPA) and FSA requirements become clearer.

Purpose of the Data Audit Report

The primary purpose of the Data Audit Report is to demonstrate that an agent's data management policies comply with the tests and standards set out in the Solvency II Directive. The report should focus on the policies and procedures which are in place, rather than on the data itself.

In addition, the Data Audit Report should demonstrate how the agent considers the overall risk that data used in the internal model does not meet the Solvency II requirements on data quality (complete, accurate, appropriate and timely). This overall data risk is split into five sub-risks, which are discussed further in the contents of the report and set out in the table in Appendix 2.

Report format and submission

Lloyd's does not intend to mandate the exact format or content of the Data Audit Report, but at a minimum it should include the areas outlined in the section entitled Contents of the Data Audit Report. Agents should first and foremost produce a Data Audit Report that is appropriate for their business and internal model structure. A suggested outline structure for the report is provided in Appendix 1 of this document, which agents can use to structure their Data Audit Report if they wish. Whilst the format of the report is not mandated, all managing agents should submit the table in Appendix 2 as part of the Conclusions and Recommendations section of their report. Appendix 2 is available for download as a separate Word document via the link below:

Link to Data Audit Framework template

Those managing agents who are also participating in the FSA's Internal Model Approval Process (IMAP) and have prepared a Data Audit Report may submit to Lloyd's the same report they are submitting to the FSA, and will not be required to produce a separate report for Lloyd's. Agents should note, however, that the syndicate model must be fully within the scope of the data audit for this to be the case; otherwise an additional report may be required. Agents should discuss this issue with Lloyd's if they are unclear. The Data Audit Report does not require sign-off by the managing agent's board.

Once the Data Audit Report is submitted on 15 June, Lloyd's will review the reports and follow up with managing agents on an ad hoc basis, should queries arise. Alongside this review process, onsite data audit reviews will be conducted for a sample of managing agents.

Ownership and Independence

The Data Audit Report should be produced as a result of a review conducted by a suitably qualified person, independent from the individuals responsible for the design, build, parameterisation and implementation of the internal model. The author of the Data Audit Report must therefore be independent of the normal operation of the model. For example, Internal Audit may be used, or other internal teams or functions which have relevant experience and skills. Managing agents may also wish to consider using the same personnel that carried out model validation, provided they are suitably qualified and remain independent.

In conducting the review, the reviewer should apply professional judgement in deciding how the controls are assessed (e.g. sample size, depth of document review, interviewees, etc.) and how effective they are in addressing the risk. Any data, internal or external (e.g. claims history, bond price movements, loss events, etc.), on the basis of which material expert judgements/assumptions and model calibrations are made should be included in scope.

The reviewer may make use of previous independent reviews (e.g. SOX compliance assessments, Internal/External Audit work, etc.), so long as the data, assumptions, calculation methodology and IT environment reviewed have not changed significantly. In line with the approach in the model validation guidance, the reviewer may rely on work performed by other (not necessarily independent) individuals in forming their conclusions. Of course, any work relied upon must have been completed to the same independence standards as the data audit review itself. Where a managing agent makes use of previous reviews for this purpose, the agent should provide some explanation and justification for their use and state why they are considered still relevant. Where conclusions and findings are drawn from previous reports, the reports referred to must cover the same audit scope and level of independent review as this Data Audit Report. Additionally, these previous reports must be submitted as appendices along with the Data Audit Report.
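Where sample-based testing is used, a reproducible selection method helps evidence how the sample was drawn. The sketch below is a minimal illustration only, assuming a hypothetical list of claims record identifiers and a fixed random seed; sample size and any stratification remain matters of professional judgement for the reviewer.

```python
import random

def draw_audit_sample(items, sample_size, seed=2012):
    """Draw a reproducible random sample of data items for control testing.

    A fixed seed lets the reviewer (or a later reader of the report)
    regenerate exactly the same sample, evidencing the selection method.
    """
    rng = random.Random(seed)
    if sample_size >= len(items):
        return list(items)
    return rng.sample(items, sample_size)

# Hypothetical usage: select 25 claims records for manual review.
claims_records = [f"CLM-{i:05d}" for i in range(1, 1201)]
sample = draw_audit_sample(claims_records, 25)
```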

Scope and content

Scope of the Review and Data Audit Report

The scope of this review is all data (internal and external¹) that could materially impact the internal model. This would include, for example, policy (exposure) data held in administration systems, observational data such as claims data, mortality data, market data, credit data, static or referential data, equity exposure data, and model parameters set by users such as correlation input data, number of simulations, etc.

¹ 'External data' in the context of this review means data that is externally sourced by the agent for use either directly or indirectly in the internal model. This does not include data that is proprietary to vendor models external to the agent (i.e. not directly within the control of the agent).

The minimum expectation is that the data review covers all data feeding into the internal model. The scope should cover any data items which could materially impact the internal model, even if they are deemed outside the scope of the model (e.g. business planning and reserving processes). Once data is within the scope of the internal model, any judgements or amendments to that data are out of scope for this report, as they fall within the model validation process.

All complex transformations of data outside the calculation kernel of the internal model which could have a material impact on the model output potentially fall within the scope of the data audit. Where the transformation is functional and its design involves expert judgement (e.g. proxy modelling, cash flow bucketing, sensitivity calculation, etc.), the design, methodology or functional specification of the transformation will not fall under the scope, but will instead be covered elsewhere in the Lloyd's review process. However, elementary data quality checks (e.g. consistency, reasonableness, completeness, etc.) to ensure that the output of the transformation reflects the input data, as well as testing to ensure that the technical implementation (including implementation of any manual processes) complies with its functional specification, do fall under the scope of the audit.

Any expert judgement applied to data is out of scope for this report, but is covered in the model validation report. Expert judgement here is defined as it is for the internal model; any adjustments, cleansing, etc. therefore remain within the scope of the Data Audit Report. The scope of the review should be consistent with the managing agent's data directory and data policy.
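As an illustration of the elementary checks described above, the following sketch reconciles the output of an aggregation-style transformation back to its input using a control total. The field name and tolerance are hypothetical; real checks would be driven by the agent's own data directory and materiality definitions.

```python
def reconcile_transformation(input_rows, output_rows,
                             amount_field="gross_premium", tolerance=0.01):
    """Elementary consistency check on a data transformation.

    Confirms that a control total of the transformation output reconciles
    back to the input, within a stated tolerance. Returns a list of
    findings; an empty list means the check passed.
    """
    input_total = sum(row[amount_field] for row in input_rows)
    output_total = sum(row[amount_field] for row in output_rows)
    findings = []
    if abs(input_total - output_total) > tolerance:
        findings.append(
            f"Control total mismatch on {amount_field}: "
            f"input {input_total:.2f} vs output {output_total:.2f}"
        )
    return findings
```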

Contents of the Data Audit Report

As advised, Lloyd's does not intend to mandate the exact format or content of the Data Audit Report, but at a minimum it should include the areas outlined below. Where an agent does not follow this format, they should ensure that it is clear where within the report submitted these areas are addressed.

1. Executive Summary

This should provide an overview of the Data Audit process and the results. It should cover assessment against both quantitative and qualitative criteria for the data audit of the whole internal model. The executive summary should cover, as a minimum:

- the personnel involved in the review (and their background, if not obvious from their job titles, including why they are sufficiently independent to conduct the review and prepare the report);
- the scope (internal and external data, business units, etc.) of the review, any exclusions with justification, and significant/material findings;
- the summary workplan with the approach (i.e. testing performed, documents reviewed, interviews conducted, sampling criteria used, etc.), the period covered and the timescales of the review;
- a summary of the Data Audit process, including a summary of the level of independence applied for each Data Audit component;
- a summary of any areas where there have been limitations in the Data Audit review (or the application of the Data Audit policy), together with any limitations of the model identified during the Data Audit, and;
- a summary of the Data Audit tests applied, and their results, with reference to the sub-risks in Appendix 2.

2. Scope of the Data Audit Process

This should describe the areas that have been considered as part of the Data Audit process, including the definition of materiality of residual risk as per the managing agent's data policy.

3. Results of the Data Audit Process

The structure of this section will depend on the structure of the internal model and the corresponding Data Audit process. Typically, the Data Audit process will be broken down into the sub-risks defined in the schedule found in Appendix 2. This section would then cover the following information for each sub-risk:

- the review and tests applied for each data sub-risk as defined in Appendix 2 (this would be expected as a bare minimum of the review scope, but managing agents should feel free to expand the review beyond this framework);
- the results of those tests;
- conclusions from the data audit review, including reference to the criteria for passing and failing, and;
- any limitations identified from the review, or suggestions for further data audit work required. These should be consistent with those identified in the model validation work.

4. Conclusions and recommendations

This section should cover the conclusions of the Data Audit process and support the confirmations made in section 1. It is expected that, as a minimum, the table containing the five sub-risks in the Data Audit Framework (found in Appendix 2) is included in this section. It should also cover:

- limitations of the internal model and corresponding recommendations for improvement;
- limitations of the Data Audit process and corresponding recommendations for improvement, and;
- conclusions on individual controls (Yes or No) using the Data Audit Framework in Appendix 2.

The individual controls relate to the five sub-risks that make up the overall data risk that the data used in the internal model does not meet the Solvency II Directive requirements on data quality (complete, accurate, appropriate, and timely). These are as follows:

1. The approach (i.e. matters of policy) to managing data for use in the internal model does not ensure consistency in quality and application.
2. Inadequate oversight of the development and implementation of the data policy increases the risk of poorly informed decision-making (e.g. risk management, capital allocation) and non-compliance with the required quality and standards.
3. Lack of a clear understanding of the data used in the internal model, and of its impact and vulnerabilities, can create gaps in ownership and control.
4. Errors, omissions, lack of timeliness and inaccuracies in the data can undermine the integrity of the internal model and management decision making.
5. An unreliable IT environment, technology or tools can compromise the quality of the data and its processing within the internal model.

It is the responsibility of the reviewer to construct a suitable approach to assess the controls over these sub-risks. The examples given under the Assessment Approach column (see the framework in Appendix 2) are not intended to be a prescriptive list. The conclusion for each control can be either a Yes or a No:

Yes = the controls are in place (i.e. written, communicated and understood by relevant stakeholders) and operating effectively, and the residual risk to the managing agent is not material.

No = the controls are either not in place or not operating effectively, and/or the managing agent remains exposed to material risks.

Unlike the FSA, Lloyd's does not require managing agents to submit the assessment of the materiality of residual risk to Lloyd's ahead of the data audit review and report submission. However, the definition of materiality should be included in the Data Audit Report itself and should be consistent with that in the managing agent's data policy. If a previous independent review was used to arrive at a conclusion, the scope of that review, any exclusions with justification, and the date of the review should be stated.

5. Management Response to Review Findings and Recommendations

This section should be written by the management team at the managing agent, and not by the reviewer performing the data audit review. If the conclusion for a control is a No, this section should include a list of findings, the potential impact (residual risk) on the managing agent's internal model, a list of actions to address the findings, and the expected completion date. As well as stating a completion date, managing agents should state whether the limitations identified are short term, longer term or permanent, together with the actions being taken to address them. The limitations section in the validation report guidance may help with writing this section. Please also indicate whether each action was planned (i.e. was already part of the managing agent's Solvency II programme) or remedial (i.e. was not originally in plan).

Additionally, if any deficiencies are identified as a result of the data review, the managing agent should inform Lloyd's as to whether these relate to an existing gap or whether a new gap has been identified. Similarly, the managing agent should notify Lloyd's of any self-assessment scoring changes as a result of the review, for example at the next scoring submission or via the regular monthly meetings.

6. Appendix: References to detailed Data Audit results

This appendix should include references to the detailed information that supports the conclusions in the Data Audit Report, enabling the reader to find further information on any aspect of the Data Audit as required.

7. Appendix: Contact Details

This appendix should include contact details for any queries regarding the Data Audit process.

8. Appendices (other information)

Other information, as appropriate, required to support the conclusions.
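Purely as an illustration of how the Yes/No conclusions described in section 4 might be tracked internally while drafting the report (the framework itself must still be submitted using the Appendix 2 template), a minimal sketch with hypothetical findings follows.

```python
from dataclasses import dataclass, field

@dataclass
class SubRiskConclusion:
    """One row of the Data Audit Framework: a sub-risk and its conclusion."""
    number: int
    description: str
    conclusion_yes: bool  # Yes = controls in place, effective, residual risk immaterial
    findings: list = field(default_factory=list)

conclusions = [
    SubRiskConclusion(1, "Data policy approach", True),
    SubRiskConclusion(4, "Errors, omissions and inaccuracies", False,
                      findings=["Hypothetical: claims feed reconciliation gaps"]),
]

# Any 'No' conclusion triggers a management response (section 5 of the report).
needs_management_response = [c for c in conclusions if not c.conclusion_yes]
```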

Appendix 1 - Example Template for Data Audit Report

The following appendix provides a suggested outline structure for the contents of the Data Audit Report. Agents should note that this is not mandatory, and they should feel free to design their own Data Audit Report, as long as it addresses the requirements set out in this document.

1. Executive Summary
2. Scope of the Data Audit Process
3. Results of the Data Audit Process
4. Conclusions and Recommendations (to include the table in Appendix 2)
5. Management Response to Review Findings and Recommendations
6. Appendix: References to detailed Data Audit results
7. Appendix: Contact Details
8. Appendix: Other information (as appropriate to support conclusions)

Appendix 2 - Data Audit Framework

The scope of the data audit should generally be designed by the managing agent to suit the scope of their own internal model. The framework below, containing the five sub-risks, should be the minimum scope of the audit. It is not intended to be exhaustive, and managing agents should therefore feel free to include other risks in the scope of their audit, if deemed material and appropriate. Whilst a Yes or No should be entered against the Conclusion for each risk below, managing agents should outline their conclusions, findings and limitations as commentary in the main body of the report. These should be related back to the results of the data audit process, also outlined in the main body of the report.

The framework is set out for each risk as: Risk; Control Objective; Expected Control; Assessment Approach; Conclusion.

Risk 1: The approach to managing data for use in the internal model does not ensure consistency in quality and application.

Control objective 1.1: To ensure that data quality is maintained throughout the process of the internal model, as required by Solvency II.

Expected control 1.1.1: A data policy has been established and implemented. The policy, its associated procedures and standards include:
- a definition of the different data sets that are to be covered by the policy;
- a definition of materiality (which is aligned to the managing agent's risk appetite where appropriate, e.g. when an expert judgement is made to adjust for insufficient observational data);
- the respective ownership and responsibility for the data sets, including the system of governance and assurance over data quality;
- a definition of the standards for maintaining and assessing quality of data, including specific qualitative and quantitative standards for the data sets, based on the criteria of accuracy, completeness and appropriateness;
- the use of assumptions made in the collection, processing and application of data;
- the process for carrying out data updates to the internal model, including the frequency of regular updates and the circumstances that trigger additional updates and recalculations of the probability distribution forecast;
- a high-level description of the risk and impact assessment process, including the frequency with which the assessment process is conducted, and;
- the frequency of the review of the data policy, associated procedures and standards.

Assessment approach 1A: Confirm that the managing agent has:
i) a written data policy approved by management, with an appropriate degree of challenge and oversight in its development, as evidenced through discussions and debates in minutes and/or equivalent documentation;
ii) written procedures, technical guides and standards for implementing the data policy, and;
iii) implemented the policy across the organisation, as is evident from: communication of the policy, associated procedures and standards to the relevant stakeholders and individuals responsible for ownership and management of data used in the internal model, and; understanding by the relevant stakeholders and individuals responsible for ownership and management of data used in the internal model.

Conclusion: Yes/No

Risk 2: Inadequate oversight of the development and implementation of the data policy increases the risk of poorly informed decision-making and non-compliance with the required quality and standards.

Control objective 2.1: To set the tone and provide appropriate oversight of the implementation of the data policy necessary for sound decision making.

Expected control 2.1.1: The data governance structures and processes are operating as defined in the data policy and associated procedures, and are effective in:
- providing appropriate oversight in the application of the data policy;
- ensuring that the data policy, associated procedures and standards (including the responsibilities and accountabilities of the various stakeholders across the managing agent), the quantity and quality of data metrics reported to management, the data directory, and the risk and impact assessment are kept under regular review, and;
- ensuring appropriate assurance is carried out and received for validating the quality of data used in the internal model.

Control objective 2.2: To ensure appropriate and timely reporting to support the required governance and management decision-making process and timely detection of issues.

Expected control 2.2.1: Data quality metrics (qualitative and quantitative) defined in the data policy are reported (individually, aggregated or categorised) to appropriate levels of management on a regular basis, to enable them to assess the quality of data and take remedial action when there are material issues. The system of reporting should include a deficiency management process whereby exceptions identified as a result of data quality checks and controls, which could have a material impact on the internal model, are escalated to appropriate levels of management and actions taken to address them on a timely basis.

Assessment approach 2A: Review the managing agent's data governance arrangements and their fit with the organisation structure to determine:
- completeness of oversight as shown, for example, by the terms of reference and agenda, and;
- key discussions, debates, decisions and approvals from a review of minutes.

Assessment approach 2B: Review and assess:
- the revision history and changes made to the policy, associated procedures, standards, governance framework, data directory, and risk and impact assessment;
- the nature and timeliness of MI reports received;
- the extent of exception reporting, for appropriateness and effectiveness;
- remedial actions taken to resolve exceptions, and;
- through interviews with key personnel, the level of understanding of their governance responsibilities and MI reports.

Conclusion: Yes/No
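To make the deficiency management process in control 2.2.1 concrete, the sketch below shows one possible shape for an escalation step, in which data quality exceptions above a materiality threshold are flagged for escalation. The threshold, metric names and values are hypothetical stand-ins; the real parameters would come from the agent's own data policy.

```python
def escalate_exceptions(exceptions, materiality_threshold):
    """Split data quality exceptions into those needing escalation and the rest.

    Each exception is a dict with a 'metric' name and an 'impact' estimate;
    anything whose impact exceeds the materiality threshold is flagged for
    escalation to the appropriate level of management.
    """
    to_escalate = [e for e in exceptions if e["impact"] > materiality_threshold]
    to_log = [e for e in exceptions if e["impact"] <= materiality_threshold]
    return to_escalate, to_log

# Hypothetical exceptions raised by routine data quality checks.
exceptions = [
    {"metric": "missing_policy_records", "impact": 250_000.0},
    {"metric": "stale_fx_rates", "impact": 4_000.0},
]
escalate, log_only = escalate_exceptions(exceptions, materiality_threshold=100_000.0)
```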

Risk 3: Lack of a clear understanding of the data used in the internal model, and of its impact and vulnerabilities, can create gaps in ownership and control.

Control objective 3.1: To ensure that the data used in the internal model, and its impact and vulnerabilities, have been clearly identified and are maintained.

Expected control 3.1.1: A directory of all data used in the internal model has been compiled, specifying source, usage and characteristics, including:
- storage (e.g. location, multiple copies) across the data flow to the internal model, and;
- how data is used in the internal model, including any transformation (e.g. aggregation, enrichment, derivation) processes.

Expected control 3.1.2: For each data set, a risk and impact (sensitivity) assessment has been performed to identify:
- whether the impact of poor quality data (individually or in aggregation) on the internal model is material;
- the points in the data flow from source to internal model where the likelihood of data errors is greatest, and therefore what specific data quality controls are required, and;
- the tolerance threshold beyond which a data error could become material (individually or in aggregation).

Assessment approach 3A: Review the managing agent's data directory to determine its clarity, completeness and maintainability.

Assessment approach 3B: Review the managing agent's risk and impact assessment for completeness, including an appropriate consideration of the outcome of the assessment against any issues reported in the data quality metrics.

Assessment approach 3C: Confirm that the tolerance thresholds and materiality used are consistent with the reporting to the relevant management groups or governance oversight bodies.

Conclusion: Yes/No
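A data directory entry of the kind envisaged in control 3.1.1 could be recorded in many forms. Purely as a sketch, with hypothetical fields and values, one entry might capture source, storage, usage, transformations and the tolerance from the risk and impact assessment (3.1.2):

```python
from dataclasses import dataclass

@dataclass
class DataDirectoryEntry:
    """One entry in a data directory: source, usage and characteristics."""
    name: str
    source: str                   # originating system or external provider
    storage: str                  # location(s) across the data flow to the model
    usage: str                    # how the data feeds the internal model
    transformations: str          # e.g. aggregation, enrichment, derivation
    materiality_tolerance: float  # per the risk and impact assessment (3.1.2)

entry = DataDirectoryEntry(
    name="Gross claims triangles",
    source="Claims administration system (hypothetical)",
    storage="Data warehouse, with a working copy in the capital modelling area",
    usage="Reserving risk parameterisation in the internal model",
    transformations="Aggregation by class of business and accident year",
    materiality_tolerance=500_000.0,
)
```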

Risk 4: Errors, omissions and inaccuracies in the data can undermine the integrity of the internal model and management decision making.

Control objective 4.1: To ensure that data quality (complete, accurate, appropriate, and timely/current) is maintained in the internal model.

Expected control 4.1.1: Management and data quality controls (preventative, detective and corrective), proportional to the probability and materiality of potential data errors, have been identified and implemented effectively. The controls should include (at a minimum):

4.1.1.a Individuals with sufficient competence to conduct the manual data checks on accuracy, completeness and appropriateness.

4.1.1.b A well-defined and consistent process for refreshing or updating all data items in line with the data policy (timeliness and currency of data). The process must include appropriate change controls (automated or manual) that take into account any material impact (individually or in aggregation) on the internal model.

4.1.1.c Data input validations (automated or manual) that prevent data having an incorrect or inconsistent format or invalid values.

4.1.1.d Completeness checks such as:
- reconciliation of data received against data expected, and;
- a process to assess whether data is available for all relevant model variables and risk modules.

4.1.1.e Accuracy checks such as:
- comparison directly against the source (if available);
- internal consistency and coherence checks of the received/output data against expected properties of the data, such as age range, standard deviation, number of outliers and mean, and;
- comparison with other data derived from the same source, or from sources which are correlated.

4.1.1.f Appropriateness checks such as:
- consistency and reasonableness checks to identify outliers and gaps, through comparison against known trends, historic data and external independent sources;
- a definition and consistent application of the rules that govern the amount and nature of data used in the internal model, and;
- a process to assess the data used in the internal model for any inconsistencies with the assumptions underlying the actuarial and statistical techniques, or made during the collection, processing and application of data.

Assessment approach 4A: Review and evaluate the managing agent's documented control procedures to assess their completeness and appropriateness in meeting the control objective.

Assessment approach 4B: Assess the adequacy of training/experience of individuals responsible for critical stages of data checks.

Assessment approach 4C: Assess the adequacy of change controls for a sample of key changes/updates made.

Assessment approach 4D: Walk through the key validations and checks with key personnel to assess the degree of understanding and embedding.

Assessment approach 4E: Assess the operational effectiveness of the key validations and checks.

Conclusion: Yes/No
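To illustrate the kinds of automated checks listed under 4.1.1.c to 4.1.1.e, a minimal sketch follows. The field name, expected record count, value range and statistical limit are hypothetical stand-ins for values an agent would take from its own data directory and standards.

```python
import statistics

def run_basic_checks(rows, expected_count, value_field="claim_amount",
                     value_range=(0.0, 50_000_000.0), max_std_dev=5_000_000.0):
    """Elementary completeness, validity and accuracy checks on a data set."""
    findings = []

    # Completeness (4.1.1.d): reconcile data received against data expected.
    if len(rows) != expected_count:
        findings.append(f"Expected {expected_count} records, received {len(rows)}")

    # Input validation (4.1.1.c): reject invalid or out-of-range values.
    lo, hi = value_range
    for i, row in enumerate(rows):
        value = row.get(value_field)
        if not isinstance(value, (int, float)) or not lo <= value <= hi:
            findings.append(f"Record {i}: invalid {value_field!r} value: {value!r}")

    # Accuracy (4.1.1.e): compare against expected statistical properties.
    values = [r[value_field] for r in rows
              if isinstance(r.get(value_field), (int, float))]
    if len(values) > 1 and statistics.stdev(values) > max_std_dev:
        findings.append("Standard deviation exceeds expected property of the data")

    return findings
```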

Risk 5: An unreliable IT environment, technology or tools can compromise the quality and integrity of the data and its processing within the internal model.

Control objective 5.1: To ensure that the quality of data and its processing for use in the internal model is maintained.

Expected control 5.1.1: IT general controls (ITGC) over the data environment (e.g. mainframes, end-user computing applications such as spreadsheets, etc.) that may have a material impact on the internal model are established, such as:
- logical access management;
- development and change management (infrastructure, applications and databases);
- security (network and physical);
- business continuity;
- incident management and reporting, and;
- other operational controls that support the collection (including data feeds), storage, analysis and processing of data.

Assessment approach 5A: Assess the design and operational effectiveness of key ITGCs that relate to the data sets as defined and required by the internal model.

Assessment approach 5B: Review key IT MI reports, such as network and access security breaches, system downtime, coding errors, etc., to determine whether any incidents that impact materially on the internal model have been followed through and resolved appropriately.

Conclusion: Yes/No
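End-user computing applications such as spreadsheets are often the hardest part of the data environment to place under change management. As one possible illustration of a detective control in this area (not a prescribed technique), the sketch below fingerprints a set of files so that uncontrolled changes between model runs can be detected; the file names and the manifest format are hypothetical.

```python
import hashlib
from pathlib import Path

def fingerprint_files(paths):
    """Compute SHA-256 fingerprints for a set of files.

    Comparing fingerprints taken before and after a model run is one simple
    detective control over uncontrolled changes to end-user computing files.
    """
    manifest = {}
    for path in paths:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        manifest[str(path)] = digest
    return manifest

def detect_changes(baseline, current):
    """Report files whose fingerprints differ from the approved baseline."""
    return [name for name, digest in current.items()
            if baseline.get(name) != digest]

# Hypothetical usage: spreadsheets feeding the internal model.
# baseline = fingerprint_files(["reserving_inputs.xlsx", "fx_rates.xlsx"])
```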