Finding, Fixing and Preventing Data Quality Issues in Financial Institutions Today
FIS Consulting Services | 800.822.6758
Introduction

Without consistent and reliable data, accurate reporting and sound decision-making become impossible: the value of any analysis depends on the quality of the data that feeds it. Rapid consolidation and the deployment of new technology in the banking industry have left many banks struggling with the integrity of internal customer, loan and operational data. This paper describes the extent of the data quality challenge in the U.S. financial industry and lays out a plan to identify, fix and remediate data quality problems.

The Cost of Bad Data

The cost of bad data is becoming more apparent throughout the U.S. The majority of companies responding to a Forbes Insight survey reported losing more than $5 million annually as a result of bad data, and one-fifth of the companies surveyed estimated losses in excess of $20 million per year. According to the same survey, 95 percent of organizations agree that strong information management is critically important.1

Data quality is an especially acute challenge in the banking industry. Clean and consistent customer information must feed the analytics systems that drive marketing programs. Merger and acquisition activity creates bank consolidation that requires clean data to collapse technology platforms effectively. Regulatory agencies seek greater operational detail about all aspects of a bank, especially its loan portfolio. A case in point: AFS recently conducted a quality review of the banks in its loan pricing benchmark database and found wide disparities in the levels of missing or invalid values for such key fields as borrower risk ratings, collateral types, geography and industry codes.

1 Forbes Insight, Managing Information in the Enterprise: Perspectives for Business Leaders, April 2010
Approach for Achieving Data Quality

Banks have started to invest in data quality staff, upgrade technology and revamp balance sheets in efforts to comply with requirements for clean data. These initiatives have undoubtedly improved banks' ability to capture, organize and report data to some degree, but have they had any appreciable impact on the quality of the underlying data itself? That is where the real and persistent problems lie. To provide a sustainable solution to data quality challenges, we suggest a four-step approach:

1. Inventory and categorize the bank's data quality issues and challenges
2. Prioritize the issues and challenges
3. Implement immediate fixes and fill knowledge gaps
4. Implement strategic business process improvements and controls for sustainable change

Inventory the bank's data quality issues

The first step in resolving data quality issues within a bank is measuring just how big the problem is. A thorough assessment across all functional areas of the financial institution should generate the background needed to categorize and organize data quality issues. This assessment benefits from the eyes and ears of outside experts: in data analysis and inventory efforts, departmental silos must be crossed, and individuals who know how to ask probing questions can help surface any and all issues. The inventory of data quality issues can then be organized into an actionable format. The goal is not to over-complicate, but rather to form an educated view of the data quality challenges the bank faces, the processes that create bad data and the knowledge gaps behind them. The following is an example of the high-level view this type of inventory provides.
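An inventory of this kind can begin with simple field-level profiling: counting missing or invalid values in the key fields the AFS review highlighted. The following is a minimal, illustrative Python sketch; the field names, the 1-9 risk rating scale and the sample records are assumptions for demonstration, not a bank's actual data dictionary.

```python
from collections import Counter

# Hypothetical key fields and validity rules; real ones would come
# from the bank's own data dictionary.
REQUIRED_FIELDS = ["risk_rating", "collateral_type", "industry_code", "state"]
VALID_RISK_RATINGS = {str(n) for n in range(1, 10)}  # assumed 1-9 scale

def profile_loan_records(rows):
    """Count missing or invalid values per field to seed an issue inventory."""
    issues = Counter()
    for row in rows:
        for field in REQUIRED_FIELDS:
            if not (row.get(field) or "").strip():
                issues[(field, "missing")] += 1
        rating = (row.get("risk_rating") or "").strip()
        if rating and rating not in VALID_RISK_RATINGS:
            issues[("risk_rating", "invalid")] += 1
    return issues

# Three fabricated records: one clean, one with blanks, one with a bad rating.
sample = [
    {"risk_rating": "4", "collateral_type": "CRE", "industry_code": "5411", "state": "OH"},
    {"risk_rating": "", "collateral_type": "", "industry_code": "5411", "state": "OH"},
    {"risk_rating": "99", "collateral_type": "C&I", "industry_code": "", "state": ""},
]
for (field, problem), count in sorted(profile_loan_records(sample).items()):
    print(f"{field}: {problem} in {count} record(s)")
```

A profile like this turns a vague sense of "dirty data" into countable line items that can be categorized and, in the next step, prioritized.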
Benefits of a Bank Enterprise Data Management Program

According to CapGemini, an enterprise data management (EDM) program at banking institutions delivers the following benefits:

For operations, a centralized reference data management system offers great advantages in providing accurate, timely and consistent data across systems. This results in a large reduction in reconciliation activities and increases the efficiency and effectiveness of various teams.

For risk management, EDM offers, among other things, the ability to correctly identify counterparty risk. Accurate measurement and management of enterprise-wide risk would be virtually impossible without the accurate, reliable and consistent data an effective EDM provides.

Benefits to finance and accounting are obvious, considering the performance analyses and management reports they produce for external stakeholders (regulators and the market) and internal consumers (the board, senior management and decision-makers). EDM allows these reports to be certified with a greater degree of confidence. Data integrity and consistency, which allow for greater confidence in management reports and decisions, are of great importance from an audit, legal and compliance perspective as well.

Sales and marketing operations benefit immensely from EDM through a single view of the customer that enables effective cross-selling and upselling.
Prioritize the data issues and challenges

Because data quality challenges may overwhelm existing bank resources, it is important for the data quality team to establish clear principles and guidelines for cleaning bank data. A formal ranking system that weights data cleansing priorities on factors such as customer impact, financial risk and operational risk may work best. If your bank has no formal prioritization process, that absence may itself point to gaps in data quality and consistency. Another factor in prioritization is the leverage gained when clean and consistent data meets a critical business need. For example, if compliance requirements drive a need for improvements in customer data, then investments to improve that data may also advance customer data-driven marketing projects.
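A formal ranking system of the kind described above can be as simple as a weighted score. The sketch below is purely illustrative: the weights, the three factors and the 1-5 ratings are hypothetical values a bank's own data quality team would set in its prioritization sessions.

```python
# Hypothetical factor weights (must sum to 1.0); a real council sets these.
WEIGHTS = {"customer_impact": 0.4, "financial_risk": 0.35, "operational_risk": 0.25}

def priority_score(issue):
    """Weighted sum of the issue's 1-5 ratings on each risk factor."""
    return sum(WEIGHTS[factor] * issue[factor] for factor in WEIGHTS)

# Fabricated issues from a hypothetical inventory, each rated 1-5 per factor.
issues = [
    {"name": "Missing borrower risk ratings", "customer_impact": 2,
     "financial_risk": 5, "operational_risk": 4},
    {"name": "Inconsistent customer addresses", "customer_impact": 5,
     "financial_risk": 2, "operational_risk": 3},
    {"name": "Stale collateral codes", "customer_impact": 1,
     "financial_risk": 3, "operational_risk": 2},
]
for issue in sorted(issues, key=priority_score, reverse=True):
    print(f"{priority_score(issue):.2f}  {issue['name']}")
```

The point is not the arithmetic but the discipline: once every issue carries an agreed score, the cleansing backlog orders itself and the debate shifts from "which issue first?" to "are the weights right?", which is a far easier conversation to facilitate.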
Customer segmentation, customer relationship management, direct marketing and product pricing decisions will all benefit from the ability to access and analyze customer data in creative new ways. Many organizations find it difficult to reach agreement on the priorities for data cleansing activities. If that is the case, facilitated sessions can help them reach consensus. By bringing stakeholders together and allowing everyone a voice, an objective third party can guide a team toward decisions that achieve the greatest common good for the bank. This process should be objective and relatively short, and it should let the participants own the decisions they make.

Implement immediate fixes, fill knowledge gaps and validate changes

Making the corrections to bad data can take more time than any of the other three steps. Changes to data fields must be researched thoroughly to ensure each correction affects only the bad data previously identified. What appears on the surface to be a simple fix may have broader complications unless the data adjustment is understood by all affected stakeholders. During these fixes, communication between functional areas within a bank becomes critical; data analysts and others on data quality projects should err on the side of over-communicating, using both formal and informal methods. Skilled trainers should be engaged to develop the curriculum required to bridge any knowledge gaps within the organization. Self-paced and instructor-led classes will help answer (and reinforce the answers to) questions about data use and business processes. Finally, any fix to inaccurate data requires validation before the remediation is formalized. Similar to quality assurance in software development, the individuals responsible for validating data must test and verify each fix to ensure the intended discrepancies are resolved.
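The validation step described above has two halves: confirm that every targeted record now passes its rule, and confirm that nothing outside the fix's scope was touched. A minimal Python sketch of that check follows; the record IDs, the field name and the validity rule are hypothetical examples.

```python
def validate_fix(before, after, targeted_ids, field, is_valid):
    """Return (unresolved, side_effects): targeted records still failing
    the rule, and out-of-scope records that changed at all."""
    unresolved, side_effects = [], []
    for record_id, new in after.items():
        old = before[record_id]
        if record_id in targeted_ids:
            if not is_valid(new[field]):
                unresolved.append(record_id)   # targeted record still bad
        elif new != old:
            side_effects.append(record_id)     # out-of-scope record changed
    return unresolved, side_effects

# Fabricated before/after snapshots: record 101 was flagged for a
# placeholder industry code; record 102 was out of scope.
before = {101: {"industry_code": "0000"}, 102: {"industry_code": "5411"}}
after = {101: {"industry_code": "6022"}, 102: {"industry_code": "5411"}}

unresolved, side_effects = validate_fix(
    before, after, targeted_ids={101}, field="industry_code",
    is_valid=lambda code: code != "0000",
)
print("unresolved:", unresolved, "side effects:", side_effects)
```

Empty lists on both sides are the sign-off condition: the fix did exactly what the research said it would do, and nothing more.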
Implement strategic business process improvements and technology for sustainable change

Once a bank's data is cleansed, the longer-term objective becomes sustaining that quality. The faulty business processes that created the poor-quality data must be fixed, and any other inefficiencies should be removed from business processes and procedures at the same time. Technology plays a vital role in this improvement, as data validation routines and well-designed databases create sources of leverage. Process improvements combined with enabling data capture technology contribute to efficiency gains and make tasks easier for bank employees at all levels. Facilitated sessions can again play a pivotal role in both the development and implementation of new business processes. By gathering collective input from all affected stakeholders, changes stand a better chance of acceptance throughout the bank.
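The data validation routines mentioned above are the sustainability mechanism: bad values are rejected at the point of capture instead of cleansed after the fact. A small illustrative sketch follows; the fields and format rules shown are assumptions, not a prescribed rule set.

```python
import re

# Hypothetical capture-time rules; real rules would live in the
# bank's data dictionary and be maintained by the governance council.
RULES = {
    "tax_id": re.compile(r"\d{2}-\d{7}").fullmatch,       # EIN format NN-NNNNNNN
    "zip_code": re.compile(r"\d{5}(-\d{4})?").fullmatch,  # ZIP or ZIP+4
    "risk_rating": lambda v: v in {"1", "2", "3", "4", "5"},
}

def validate_at_capture(record):
    """Return the fields that fail their rule, so a bad record can be
    rejected before it is ever stored."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

errors = validate_at_capture(
    {"tax_id": "12-3456789", "zip_code": "4432", "risk_rating": "3"}
)
print(errors)  # the four-digit ZIP fails its format check
```

Wiring a check like this into every entry screen and interface is what turns a one-time cleansing project into a durable control.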
Getting Started: Enabling Data Governance

How can a bank mobilize and focus its staff to start attacking data quality issues and challenges? Establishing a Data Governance Council is a good first step. Data governance is an emerging discipline with an evolving definition. It covers the convergence of data quality, data management, data policies, business process management and risk management surrounding the handling of data in an organization, and it encompasses the people, processes and information technology required to create consistent and proper handling of a bank's data across the enterprise. Goals may be defined at all levels of the bank, and doing so may aid acceptance of the resulting processes by those who will use them. The Data Governance Council consists of both data owners and data consumers; it is a cross-functional group with business and technical representation. For the effort to succeed, executive sponsorship is essential. The council should include representation from the following functions:

Information Management Group Leader
Information Technology and Network Support
Information Security
Line of Business functions
Branch Management/Operations
Accounting/Finance
Audit
Compliance and Risk Management
Data Governance Goals

As with any initiative, it is important to set attainable and measurable goals. The goals of a Data Governance Council should support these overarching benefits:

Increase consistency and confidence in decision-making
Decrease the risk of regulatory fines
Improve data security
Maximize the income-generating potential of data
Minimize or eliminate rework
Optimize staff effectiveness

Summary

A sound Data Governance Council with the support of executive management can follow a proven methodology for uncovering and resolving data quality issues. Once these issues are addressed, the council's leaders become the drivers of the business process improvement initiative that ensures sustainable data quality throughout the financial institution.

Contact Us

For more information, contact FIS Consulting Services at 800.822.6758 or visit fisglobal.com.