Data Quality for BASEL II




Meeting the demand for transparent, correct and repeatable data process controls

Harte-Hanks Trillium Software
www.trilliumsoftware.com
Corporate Headquarters: +1 (978) 436-8900
trlinfo@trilliumsoftware.com

Executive Summary

Trillium Software's BASEL II Solution helps large, complex financial services institutions meet the standards for data quality attestation set by the FFIEC regulators for risk reporting. The BASEL II Solution delivers a data quality process management platform and domain expertise to support the rigorous demands of commercial and retail risk reporting and attestation. The solution leverages pre-built risk data quality best practice templates that bring new levels of transparency to reporting and attestation processes.

Trillium Software has enabled several of the world's largest financial institutions to implement enterprise best practices that are used for attestation in the domains of risk reporting. These same processes have often helped analysts and subject matter experts identify opportunities to improve and reduce capital reserves. Leveraging the solution platform and the risk reporting process templates delivers a combination of fact-based insight, evidence of action and documentation, and business accuracy in support of internal and Federal data audits.

Business Challenges

Transparency, consistency and accuracy

Current regulatory directives demand rigor and a focus on controls for the data used in risk calculation processes. BASEL II specifically stipulates that advanced (AIRB) banks must ensure that all data entering credit risk models is accurate and fit for purpose.[1] Further, these banks must be able to demonstrate a robust system of controls to validate the accuracy and consistency of the risk data elements used in the ongoing measurement and monitoring of capital requirement calculations.

BASEL II mandates:
- Increased transparency of key performance indicators, such as Probability of Default (PD) and Loss Given Default (LGD), to better determine Exposure at Default (EAD); the illustrative calculation below shows how these parameters combine.
- Strict rules on capital risk reserve provisions that penalize institutions highly exposed to risk and those unable to provide provably correct analysis of their risk position.

Audit readiness

As stipulated in the US Final Rule (Risk-Based Capital Standards: Advanced Capital Adequacy Framework, Basel II), once a bank has adopted its implementation plan, it must complete a satisfactory parallel run before it may use the advanced approaches to calculate its risk-based capital requirements. To achieve satisfactory status, a bank must demonstrate that it has put in place a fully transparent audit trail, which it can share as evidence that the proper controls are in place to support the attestation process. Such evidence is demonstrated by comprehensive processes that examine full production volumes of data and produce reports that the bank makes available to its regulator. Regulatory agencies then tend to assess these advanced approach methodologies through discussion, review of data collection and analysis, and examination activities. Therefore, possessing a repeatable process that is understood and accepted by regulators is a best practice that leads to confidence in the controls as banks seek to manage and monitor their capital reserve requirements.

[1] Basel Committee on Banking Supervision, International Convergence of Capital Measurement and Capital Standards: A Revised Framework, June 2004.
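To make the relationship between these risk parameters concrete, the sketch below applies the standard IRB expected-loss relationship, EL = PD x LGD x EAD, to a single exposure. The loan amount and parameter estimates are invented for illustration; they are not drawn from this paper or from any Trillium component.

```python
def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss for a single exposure: EL = PD x LGD x EAD.

    pd_ -- probability of default over the horizon (0..1)
    lgd -- loss given default, as a fraction of exposure (0..1)
    ead -- exposure at default, in currency units
    """
    return pd_ * lgd * ead

# Illustrative figures only: a $25M working capital loan with an
# assumed 2% one-year PD and 45% LGD.
el = expected_loss(pd_=0.02, lgd=0.45, ead=25_000_000)
print(f"Expected loss: ${el:,.0f}")  # Expected loss: $225,000
```

If the data feeding PD, LGD or EAD is incomplete or inconsistent, the expected-loss figure, and every capital number derived from it, is misstated; that is the exposure the data controls described in this paper are meant to prevent.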

[Figure: Data Quality Management Processes - A Regulatory Perspective]

Data Challenges

Current practices: a patchwork approach to understanding the data

Large banks routinely employ large numbers of employees and expensive consultants to investigate data in an attempt to understand where data gaps may exist. Given the size and complexity of the task, and the problems with the data, risk management teams using manual methods can only do so much to ensure data completeness and accuracy. The analysis undertaken is often based on samples and on known problems; these efforts do not examine full data sets at production volumes. This leaves institutions open to the risk of unknown exposures, for example mortgages without collateral values impacting loan-to-value calculations, or cardholder records without FICO scores. Banks need auditable data governance strategies that ensure, and prove, that risk data is fit for purpose.

Manual approaches are labor intensive, expensive, imperfect, slow to deliver, and difficult to maintain for compliance purposes. They can involve teams of many analysts who manually search multiple data stores for information, which they then copy into small data stores and spreadsheets. This approach to resolving data issues is slow and prone to error:
- Data must be manually validated and interpreted.
- Data from multiple sources must be reconciled.
- Identifying issues in the data is complicated.
- SQL and other queries only interrogate source data for the conditions targeted.
- Teams cannot fully understand the condition of data for unanticipated or unknown conditions.
- It is difficult to share results and develop a holistic view of issues.
- The process relies on human understanding and interpretation, which is by definition error prone.

Given the manual processes involved, it often proves impossible to audit the risk data to the level of detail required by current (and more demanding future) risk governance legislation. This means that inherent issues go undiscovered and processes break down, ultimately leading to an unknown, and thus unmanaged, level of business risk exposure. Clearly, what you don't know about your data can hurt you.

Further, manual processes are neither flexible nor easily modified to meet new demands, such as those currently under consideration by regulators. Change is difficult to accommodate.

And many banks will attest to the pains involved in trying to manage manual processes as technology platforms change and new sources of data are introduced, as happens when a bank merges or expands into new lines of business.

Risk Amplification

As demonstrated in the case study below, processes that rely on a combination of human intervention and manual steps suffer from risk amplification as unanticipated risk is introduced into the effort. In these situations, it is difficult to be certain that outcomes will be provably correct. The result can be unknown risk that undermines confidence in capital adequacy calculations and reporting.

Case study: Global US-based financial services institution

The risk management team employs 200 data analysts who gather and verify risk data and then build risk information portfolios. A business governance unit, this team interfaces with IT and various database administrators to create data extraction routines. The process they followed was painful:
1. Create a temporary repository to hold exported data.
2. Analyze data formats.
3. Write transformation routines in an attempt to standardize the data.
4. Run record matching software to create links that enable risk aggregation.

The results were muddled: large volumes of data were being stored by multiple analysts in local databases and spreadsheets in an utterly complex, almost anarchic, system. The institution was unable to prove its risk calculations to the satisfaction of regulators and, consequently, has been required to set aside a much larger capital risk reserve than its Board forecast.

By implementing the Trillium Basel II solution:
- Automated profiling and discovery of data at full production volumes delivers transparency to credit risk data.
- Data standards are enforced without needing to develop additional complex transformation routines.
- Certified data accuracy, reviewed by subject matter experts and with metrics and alerts ordered by priority to the business, is demonstrated as evidence of proper controls to the regulators.

Provably correct

Within their governance, risk and compliance initiatives, banks need to own transparent, provably correct, and easily repeatable validation methods to comply with legislation. Under BASEL II mandates, regulators require that GRC teams demonstrate their risk governance processes under detailed scrutiny. Banks must:
- augment risk calculations with metrics indicating the quality of the underlying data (a minimal sketch of such metrics appears at the end of this section)
- prove that data controls are correct, comprehensive and consistent

In theory this is a reasonable request. In practice, it presents a myriad of challenges around the accuracy of the data volumes that underpin risk calculations, including customer credit quality measurements and the wide variety of market exposures. An added concern is that source data, such as counterparty and credit rating records, is spread across multiple systems and captured differently in different applications. Differing data standards, formats, and naming conventions hamper the reconciliation of the entities that affect risk exposures. These same data issues also affect downstream default probability calculations. Incorporating inaccurate data into the calculation stream introduces unknown risk that resonates throughout risk management processes. All these exposures can add up to an increased economic capital cost.
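As a rough illustration of what "metrics indicating the quality of the underlying data" might look like in practice, the sketch below computes completeness and validity rates for a handful of risk data elements. The field names, validity rules and records are hypothetical and are not taken from the Trillium solution.

```python
from typing import Callable

# Hypothetical risk records; None marks a missing value.
records = [
    {"fico": 712, "ltv": 0.78, "pd": 0.021},
    {"fico": None, "ltv": 0.91, "pd": 0.034},
    {"fico": 655, "ltv": None, "pd": 1.70},  # pd outside 0..1: invalid
]

# Assumed validity rules per field, for the example only.
rules: dict[str, Callable[[object], bool]] = {
    "fico": lambda v: isinstance(v, int) and 300 <= v <= 850,
    "ltv":  lambda v: isinstance(v, float) and 0 < v <= 1.25,
    "pd":   lambda v: isinstance(v, float) and 0 <= v <= 1,
}

def quality_metrics(rows, field_rules):
    """Return completeness and validity rates per field."""
    out = {}
    n = len(rows)
    for field, rule in field_rules.items():
        present = [r[field] for r in rows if r.get(field) is not None]
        valid = [v for v in present if rule(v)]
        out[field] = {"completeness": len(present) / n, "validity": len(valid) / n}
    return out

for field, m in quality_metrics(records, rules).items():
    print(f"{field}: complete {m['completeness']:.0%}, valid {m['validity']:.0%}")
```

Metrics of this kind can be reported alongside the risk figures they support, so that a PD or LGD estimate always carries an indication of how complete and valid its inputs were.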

The Need for Data Quality in BASEL II

Managing data is one of the most significant challenges in BASEL II implementations. As a result of existing and new compliance requirements, enterprise data process controls are highly scrutinized. Regulators counsel that risk models are only as effective as the data that feeds them. Financial institutions are advised that they need sound data policies, procedures and processes to ensure data accuracy. Identifying and implementing a solution able to provide this insight is critical to successful BASEL II compliance.

Data standards vary among systems

Consider how data formats, standards and rules differ among systems. Although systems may share the same field names for data and other key content, such as loan payment and delinquency ratings, dates, client account information, product and reference data, account history and transactions, the way such data is formatted in those fields varies from system to system. One application may store loan provision dates in one format while another stores payment and delinquency dates in a different format, creating problems in measuring account exposures. Further, there are often quality and consistency discrepancies across multiple fields that hamper accurate reporting. (A minimal sketch of normalizing such formats appears at the end of this section.)

Record duplication hampers reconciliation

These types of format, standardization, and coding differences result in duplication, errors in processing and consolidating payments, an inability to reconcile account and payment data to a single entity, and, worse still, failure to calculate accurate delinquency on account, default position, and cash collection.
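To illustrate the date-format problem above, here is a minimal sketch that normalizes dates arriving in different formats from hypothetical source systems into a single representation before exposures are compared. The formats and values are assumptions for the example, not a description of any particular banking system.

```python
from datetime import date, datetime

# Date formats observed in the (hypothetical) source systems.
KNOWN_FORMATS = ["%m/%d/%Y", "%d-%b-%y", "%Y%m%d"]

def normalize_date(raw: str) -> date:
    """Parse a date string in any of the known source formats into a date object."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

# The same delinquency date, captured three different ways.
print(normalize_date("03/15/2011"))  # 2011-03-15
print(normalize_date("15-Mar-11"))   # 2011-03-15
print(normalize_date("20110315"))    # 2011-03-15
```

Once every system's dates resolve to the same representation, delinquency ageing and exposure measurement can be compared on a consistent basis.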

Data quality issues preclude accurate record matching

In addition, there may be data quality problems, such as counterparty name and location variances, spelling differences, missing data and other issues. For example, data from a working capital loan to GrayBlue Metals Inc of Pittsburgh for $25 million resides within one application at ABC Bank, while another system holds a reference to GreyBlue Metals of New York for $15 million and another to GBM Inc of Detroit for $20 million. Without the ability to match these records within a corporate hierarchy, Bank ABC may not represent this counterparty as one entity in its portfolio, and may not understand that the risk exposure is $60 million incurred by the parent company, GrayBlue Metals Inc., headquartered in Pittsburgh. (A simple matching sketch appears at the end of this section.)

Data profiling can resolve these issues

Data profiling is a critical first step in data management. Data profiling automates the identification of problematic data and metadata by examining the data available in all relevant source systems (e.g. databases or files) at full production volumes, collecting statistics and information about that data, and presenting it to users in an actionable, business-accessible format. Data profiling enables financial institutions to identify and correct inconsistencies, redundancies and inaccuracies in the source data used in risk management and capital reserve calculations. From a regulatory perspective, data profiling yields the evidence that builds confidence in the institution's controls over the data populating its credit risk models. Implementing a platform that enables effective controls generates the evidence, and demonstrates the proper corporate actions, that supervisors look for as validation that the bank has appropriate data quality for risk monitoring and reporting.
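As a toy illustration of the matching problem described above (not the Trillium matching engine), the sketch below normalizes counterparty names, links near-duplicates with a simple string-similarity threshold plus an assumed alias, and aggregates exposure to the linked entity. The normalization rules, threshold and alias table are invented for the example; the exposures come from the paper's GrayBlue Metals illustration.

```python
from difflib import SequenceMatcher

# Three views of the same counterparty, as in the example above.
loans = [
    {"name": "GrayBlue Metals Inc", "city": "Pittsburgh", "exposure": 25_000_000},
    {"name": "GreyBlue Metals",     "city": "New York",   "exposure": 15_000_000},
    {"name": "GBM Inc",             "city": "Detroit",    "exposure": 20_000_000},
]

LEGAL_SUFFIXES = {"inc", "corp", "llc", "ltd"}

def normalize(name: str) -> str:
    """Lowercase, drop punctuation and legal suffixes (illustrative rules only)."""
    tokens = [t.strip(".,") for t in name.lower().split()]
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Crude name similarity on the normalized strings."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

# Abbreviations like "GBM" need an alias or corporate hierarchy feed in practice;
# the alias below is assumed for the example.
parent = "GrayBlue Metals Inc"
aliases = {"gbm"}

linked = [l for l in loans
          if similar(l["name"], parent) or normalize(l["name"]) in aliases]
total = sum(l["exposure"] for l in linked)
print(f"Linked records: {len(linked)}, aggregate exposure: ${total:,}")
# Linked records: 3, aggregate exposure: $60,000,000
```

A production matching process would of course use richer reference data (addresses, identifiers, corporate hierarchies), but the principle is the same: until the three records are linked, the $60 million exposure to the parent is invisible.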

Ongoing data monitoring and management

Regulators also emphasize the critical importance of ongoing validation of advanced data quality controls. They look for methodologies that support corporate due diligence both before and after initial qualification decisions. As part of the compliance mandate, ongoing data monitoring and management of these data drivers must be implemented to their satisfaction.

The Trillium Basel II Solution is the enabler of choice for the data monitoring and identification capabilities required as part of a production data quality oversight and remediation process, because it implements a process that monitors full production volumes. The solution's deliverables include critical time-sequence views of data quality metrics that track immense data volumes and aggregated detail. Critical information is presented in a format comprehensible to the business, and the related metrics reflect the issue prioritization and importance that the business demands. Dashboards and reports are designed to help banks demonstrate that effective controls are in place to monitor the impact that changes in data quality have on their models and calculations.
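To give a sense of what a time-sequence data quality metric with a control threshold could look like, here is a minimal sketch. The metric, threshold and daily figures are invented for illustration and do not represent Trillium functionality.

```python
# Daily completeness of a key risk field (e.g. collateral value), as a fraction.
# Figures are invented for the example.
daily_completeness = {
    "2011-10-03": 0.998,
    "2011-10-04": 0.997,
    "2011-10-05": 0.962,  # a feed change silently dropped values
    "2011-10-06": 0.961,
}

THRESHOLD = 0.995  # assumed control limit agreed with the business

def check_metric(series: dict, threshold: float):
    """Yield an alert for every day the metric falls below its control limit."""
    for day, value in sorted(series.items()):
        if value < threshold:
            yield f"ALERT {day}: completeness {value:.1%} below limit {threshold:.1%}"

for alert in check_metric(daily_completeness, THRESHOLD):
    print(alert)
# ALERT 2011-10-05: completeness 96.2% below limit 99.5%
# ALERT 2011-10-06: completeness 96.1% below limit 99.5%
```

Trended over time, a series like this is the evidence trail regulators ask for: it shows both when data quality degraded and when remediation restored it.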

The Solution

Trillium Software's Basel II solution enables banks to accelerate their compliance efforts through a more effective and transparent data quality and governance process. Trillium Software's BASEL II Solution helps large, complex financial services institutions meet the standards for attestation set by the FFIEC regulators for risk reporting. Using a combination of industry-leading expertise in the field of data quality management, data quality best practice templates, predefined business processes, analytical insight, and leading technology, Trillium Software has helped several of the world's largest lending institutions implement corporate best practices for attestation in the data domains of risk reporting and identify areas for capital reserve enhancement and reduction. By leveraging technology and multiple process templates, the Basel II solution delivers a combination of people skills and business-focused reports and dashboards that demonstrate process competency and business accuracy in support of internal and Federal data audits.

Accelerated Time to Value

The Basel II framework for profiling, measuring and monitoring data acts as a controls layer between a bank's operational processes and its financial reporting. These controls help to formalize the fiduciary requirements relating to the management of data assets critical to success and generate actionable output for risk management teams to assess and trend progress (a simple illustration follows this section):
- Alerts: automated alerts advise about issues with other controls and processes.
- Evidence: reports, scorecards, metrics, and trending allow validation and quantification of issues.
- Actions: users can track identified issues to resolution.

By delivering transparent processes for isolating and assessing the impact and cost of poor data quality on risk operations, senior executives can certify that the data feeding their regulatory reports has the proper controls and governance.
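One way to picture the alerts/evidence/actions cycle described above is as a simple issue record that carries a metric breach from detection through to resolution. The fields and status values below are illustrative only; they are not the solution's actual data model.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataQualityIssue:
    """Illustrative record linking an alert to its evidence and remediation."""
    metric: str                    # e.g. "collateral_value completeness"
    detected: date
    owner: str = ""
    evidence: list = field(default_factory=list)  # report / scorecard references
    status: str = "open"           # open -> assigned -> resolved

    def assign(self, owner: str) -> None:
        self.owner = owner
        self.status = "assigned"

    def resolve(self, note: str) -> None:
        self.evidence.append(note)
        self.status = "resolved"

issue = DataQualityIssue("collateral_value completeness", date(2011, 10, 5))
issue.assign("retail credit data steward")
issue.resolve("Upstream feed mapping corrected; completeness back above limit")
print(issue.status, "|", issue.evidence[-1])
```

Kept over time, records like this double as the audit evidence that issues were not only detected but tracked to closure.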

Basel II Solution Benefits

- Rapid Deployment: proven rapid deployment via pilot, with quick visibility of results and evidence for business users.
- Reduced Risk: manual coding and queries are reliant on individuals; the Basel II solution ensures a consistent approach.
- Centralized: a centralized repository of DQ measures, reusable business rules applied to multiple feeds, and a collaborative (multi-user) environment.
- Auditable: a fully transparent audit trail, usable as evidence to the regulator in the attestation process, and a repeatable, automated process.

Conclusion

The increased transparency required of financial services firms translates to new levels of scrutiny of the financial disclosure process and all supporting data. To effectively meet compliance (GRC) requirements and optimize credit risk reserve positions, data controls must be correct, comprehensive and complete, and the bank must demonstrate consistent data management performance over time. Trillium Software's BASEL II solution delivers transparency into the quality of the data that populates risk management calculations and provides effective process controls for monitoring and maintenance. With greater insight into the data, business and GRC teams can:
- Access relevant data swiftly, regardless of platform
- Define and enforce complex risk validation and business rules
- View and validate data that supports risk reporting and economic capital calculations
- Gain visibility into data quality issues and operational processes, supporting long-term governance
- Create data quality indices and coefficients for red-lighting key exposures
- Develop evidence of verifiable processes and reports that the institution can make available for regulatory review
- Demonstrate the confidence and competence in data management that regulatory agencies demand

Armed with a reliable understanding of the quality of the data that fuels risk positions, executive management can approach compliance initiatives with confidence. Trillium Software partners with large financial institutions to deliver the industry-leading platform, domain expertise and data quality management capabilities that provide the proper foundation for compliance initiatives. By implementing a repeatable framework for measuring and monitoring the accuracy of the data supporting risk reporting and RWA calculations, the Basel II solution delivers measurable, sustainable benefit.

All rights reserved.