Corporate Governance and Compliance: Could Data Quality Be Your Downfall?

White Paper

This paper discusses the potential consequences of poor data quality on an organization's attempts to meet regulatory compliance reporting requirements such as Sarbanes-Oxley or Basel II. It also argues that corporations need to define strategic policies and processes for ensuring enterprise data quality.

Prepared by
Email: trlinfo@trilliumsoftware.com
Web: www.trilliumsoftware.com

Copyright 2004 Trillium Software, a division of Harte-Hanks. All rights reserved.
Table of Contents

1. Introduction
2. Can we trust our information?
3. Data fit for Compliance
4. Legacy data quality issues
5. Data Profiling for data quality assessment
6. Time to take Control
7. Corporate Governance Standards
   7.1. Sarbanes-Oxley (USA)
   7.2. OFAC (USA)
   7.3. Patriot Act (USA)
   7.4. Basel II (International)
   7.5. Higgs Report (UK)
   7.6. IAS (UK/Europe)
1. Introduction

Recent corporate scandals and increased state emphasis on controlling the financing of criminal and terrorist activity have brought about greater regulatory and legislative pressure on businesses. Depending on the nature of the business, acts such as Sarbanes-Oxley, the Patriot Act, Higgs and Basel II, and new accounting standards such as IAS, are demanding boardroom attention. Given that, in some areas, the consequences of non-compliance can include heavy personal fines and imprisonment, it is no wonder that boardrooms everywhere are devoting time, energy and resources to meeting the requirements they impose.

In the wake of Enron, WorldCom and others, Sarbanes-Oxley (for companies trading in the US) and Higgs (for those registered in the UK) seek greater transparency and accuracy of financial reporting. Basel II seeks better assessment by banks of loan risks and default provisions, while the Patriot Act requires that companies trading in the US check details of customers and suppliers against a list of criminal suspects.
2. Can we trust our information?

Compliance with all of these regulations is very much about accurate reporting. This requires building processes and procedures for managing information and ensuring its integrity, accuracy, completeness and timeliness; in other words, its overall quality. The bottom line on compliance, and whether a CEO might sleep soundly at night having signed declarations of truth, comes down to two questions: are these procedures working, and can we trust that the information we sign our names to is indeed the truth?

The resources being set aside for corporate governance compliance show that it is being taken seriously. Gartner Group estimates that Fortune 1000 companies have each spent about $2M meeting the requirements of Sarbanes-Oxley.[1] AMR Research says leading companies will spend up to $2.5B in 2003 meeting this act, with 85 percent of companies requiring changes to their IT systems. For Basel II compliance, analyst estimates of the cost to a global bank range from about $30M to $160M.

For a boardroom to determine its weak spots on compliance and where expenditure is needed, it must identify the key processes involved in ensuring information integrity. Much has been written on the topic of producing and retaining the right documents, and many firms have a policy for this or are now getting their houses in order. But of equal importance is the accuracy of the content of reports, and this comes down to the accuracy and overall quality of the source data. Few companies have a policy governing electronic data, and few Corporate Officers know very much about it. Now they must give the area serious attention.

[1] Financial Times, Sep 5th 2003
3. Data fit for Compliance

Public companies have huge volumes of financial, operational and other data. They are swamped with it, and typically collect it across multiple departmental-level systems and databases. Strategic management pays very little attention to what it considers the 0s and 1s of data hidden within the databases deep in the bowels of the business. Rarely do senior officers know what data the business has or what it pertains to. They cannot judge its quality and fitness for purpose beyond, at best, a qualitative judgement from users not accountable to the authorities.

Alongside other corporate drivers for taking control of data, governance is a motivator for consolidating the many systems and data sources, and for creating methods for obtaining easily accessible views of complete and accurate information. Driven by the need for compliance, organizations will need to build data warehouses and financial data marts, and employ business intelligence, business process management and document management systems. They will purchase new financial and accounting systems and attempt to capture, organize and unify data from multiple distributed and legacy systems, from ERP systems to billing systems and accounts ledgers.

Some firms will implement these steps by having their own IT departments work with Finance and other business areas; others will engage systems integrators or compliance software specialists. Some will appoint a Corporate Compliance Officer. But no matter what approach firms employ, they cannot escape the fact that the accuracy of the information their systems deliver is very much dependent upon the quality and integrity of the data upon which that information is built. Garbage In, Garbage Out applies.

Yet few companies treat data as an asset or manage it proactively. Few companies have a central knowledgebase describing what data the organization has, or its true content, quality and structure. In fact, in most cases there is simply no information at all.
4. Legacy data quality issues

Without accurate knowledge of the content, structure and definition of electronic data stored across the organization, data administrators and analysts are unable to give any guarantees about its quality and consistency. Many organizations may have to face the fact that their legacy and mission-critical systems may be riddled with inaccurate, inconsistent data, and that they simply do not fully understand what they have. There are a number of reasons for this:

- Insufficient validation of data quality at the time the data was entered into the system
- Duplication of data from one system to another
- Data updated by different departments, resulting in inconsistencies between systems
- No centralized data quality standards
- No data quality monitoring in place
- Out-of-date documentation

Often, data quality issues were never identified at the time the legacy systems were implemented, or even for many years afterwards. The system worked and performed according to the business requirements, so quality issues were not even an agenda item. While the data may have been of sufficient quality to meet the business requirements of the day, standards for data entry may not have been rigidly applied or enforced, resulting in inconsistencies and errors over time. Furthermore, as new business functions and processes were defined and modifications were made to the original applications and databases, the documentation describing the application and its data was not updated and soon became out-of-date. Compounding this further, isolated development of business applications and databases by different parts of the business resulted in inconsistent naming conventions, standards and
metrics. For instance, one system's definition of sales revenue for a product may have been quite different from another system's definition and formula. In other cases, different data names, such as customer-revenue-month and monthly-sales-revenue, may have been defined in two different systems even though the underlying values and formulae were exactly the same.

Without any centralized data management policies, or clear ownership of and responsibility for data standards, it was inevitable that over time organizations strayed further and further from having a single version of the truth about their data. At the time this may not have been a concern or a priority, but try to integrate data from various sources and use it for another purpose, such as a compliance project, and these sorts of issues present a major hurdle. Since there is always pressure to implement new business initiatives quickly, there is never enough time to properly analyze existing data before attempting to merge it with other data and put it to new uses. And the more data is integrated, the more data quality issues spread across the business like a virus.

Corporate Officers signing declarations of accuracy need to ask themselves whether they have policies governing the integrity of electronic data, and a means by which the success of those policies can be objectively measured. If CxOs cannot access figures for data accuracy, then no matter what other actions they have taken to ensure compliance, can they honestly trust their data sources and be sure the information they declare under corporate governance is the truth?
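The duplicated-definition problem described in this section, where two systems hold the same business fact under different field names, can be surfaced programmatically. The sketch below is a minimal, hypothetical illustration in plain Python (the field names and sample figures are invented for illustration, not drawn from any real system or from Trillium's products): it flags column pairs from two systems whose distinct values overlap strongly and are therefore candidates for being the same measure under different names.

```python
# Hypothetical sketch: flag columns from two systems that may describe
# the same business fact under different names. All names and sample
# values below are invented for illustration.

def value_overlap(a, b):
    """Jaccard overlap between the distinct values of two columns."""
    sa, sb = set(a), set(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def candidate_synonyms(system_a, system_b, threshold=0.8):
    """Return column pairs whose distinct values overlap >= threshold."""
    pairs = []
    for name_a, col_a in system_a.items():
        for name_b, col_b in system_b.items():
            score = value_overlap(col_a, col_b)
            if score >= threshold:
                pairs.append((name_a, name_b, round(score, 2)))
    return pairs

# Two departmental systems holding the same revenue figures under
# different names (the situation described in the text above).
billing = {"customer-revenue-month": [1200, 950, 4300, 880]}
sales   = {"monthly-sales-revenue":  [1200, 950, 4300, 880],
           "region-code":            ["NE", "SW", "SE", "NW"]}

print(candidate_synonyms(billing, sales))
# -> [('customer-revenue-month', 'monthly-sales-revenue', 1.0)]
```

In practice, value overlap is only one signal; a real analysis would also compare data types, formats and dependencies, but even this crude check makes the "single version of the truth" problem measurable rather than anecdotal.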
5. Data Profiling for data quality assessment

It is important for Corporate Officers, since the buck stops with them, to have a quantified statement of data integrity and its fitness for the purpose of compliance. Many will not even think to consider it, and will simply assume accuracy or rely on the subjective views of users, data administrators and IT managers. The importance of thorough analysis and early understanding of data quality cannot be overestimated. However, as is often the case, reliance upon documentation, upon metadata (information describing where data is held, how it is defined, its content and so on), or upon analysis of small data samples has been shown to be a poor approach to understanding your data. On the other hand, to verify the quality and accuracy of data for compliance reporting, organizations may need to examine large volumes of data records stored in multiple systems. Faced with perhaps millions of records and tens, if not hundreds, of data systems, where does one start?

One method for learning the truth about data, its nature, structure and quality, is the process of data profiling. Data profiling solutions like Trillium Software Discovery are suited to very large, enterprise-scale data sources and can be used either as a one-off exercise to prove all is well, or regularly for managing, monitoring and reporting on data integrity. As well as measuring quality, data profiling provides an understanding of exactly what data is held, where, and in what format. It details precisely any issues found, so that their impact can be understood and cost-benefit decisions made about their resolution. Data profiling also helps determine how difficult it might be to integrate different sets of data from different systems whilst maintaining the integrity of the resulting information.
Not only will data profiling indicate the degree of confidence management can place in data sources, but the issues discovered can be considered for redress through targeted data improvement projects, which will also benefit business effectiveness. Information and issues
from initial data analysis will provide reliable, accurate business rules to be fed into the correction, standardization and completion processes. These processes should then continue throughout the data lifecycle to turn raw data from disparate systems into reliable information.

Data profiling is best undertaken early, since the issues uncovered may influence policy decisions and will also require time to resolve. The typical approach is to define all significant data sources, profile them, and assess the value of the data and the issues located. Each useful source is then corrected, standardized and enhanced (using a data quality solution) and profiled again to determine the success of the data improvement process. A software product such as Trillium Software Discovery enables the profiling part of this process to be undertaken in around five percent of the time taken by manual approaches, at a significant reduction in cost, and with 100 percent accuracy in the anomalies located.

Whether an organization intends to install new IT applications to aid compliance reporting or to rely on existing ones, understanding data and its quality remains important. New applications will not produce reliable information if they are fed with dirty data.
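To make the idea of column-level profiling concrete, the sketch below is a toy illustration in plain Python (it is not related to Trillium Software Discovery, and the sample postcode data is invented). It computes three of the basic measures a profiler reports for each column: completeness (how many values are populated), cardinality (how many distinct values exist), and the dominant format patterns, where letters are reduced to "A" and digits to "9" so that structural inconsistencies stand out.

```python
import re
from collections import Counter

def pattern(value):
    """Reduce a value to its shape: letters -> A, digits -> 9."""
    s = re.sub(r"[A-Za-z]", "A", str(value))
    return re.sub(r"[0-9]", "9", s)

def profile_column(values):
    """Basic profile: completeness, distinct count, top format patterns."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    return {
        "completeness": len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "patterns": Counter(pattern(v) for v in non_null).most_common(3),
    }

# Invented sample: a postcode column mixing UK and US formats, with a gap.
postcodes = ["RG7 8UB", "MA 01821", None, "RG7 8UB", "01821"]
report = profile_column(postcodes)
print(report)
# completeness 0.8, 3 distinct values, and three competing format
# patterns ("AA9 9AA", "AA 99999", "99999") signalling inconsistent entry.
```

Run across millions of rows, the pattern histogram alone often reveals the kinds of legacy inconsistencies described in section 4 without anyone having to read the data record by record.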
6. Time to take Control

Compliance with particular legislation may be the point of pain that drives data profiling and quality projects in many large organizations, and that is a reasonable "stay out of jail" approach to take. More strategically minded firms, however, will take compliance as another reason to look at the issue of data quality holistically. If data is grasped as a key information asset, then the formation of enterprise data quality policies and procedures, together with the formal introduction and management support of data profiling and data improvement processes, will lead to a base of high-quality, well-understood data sources that management can trust.

Such data policies would ensure a business could also quickly comply with new regulatory requirements, rather than be driven off course by them, and get more from its data as a well-managed business asset. Data-quality-sensitive initiatives such as Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Business Intelligence, Supply Chain Management, data warehousing and other applications would deliver better results. A strategic policy on data quality would therefore profit the business and make it more accommodating of new needs for accurate information, while reducing the risk that Corporate Officers certify information as truthful only to land in jail for not realizing it was hardly the truth at all.

Perhaps it's time to profile your data and determine whether you need to act, before it's too late.
7. Corporate Governance Standards

There are hundreds of regulations with which modern businesses must comply, some industry-specific and some more generic. Here are some of those having a major international impact that necessitate data-management-related action:

7.1. Sarbanes-Oxley (USA)

Enacted following US corporate scandals, Sarbanes-Oxley is designed to improve the management of public companies and the accuracy of their reporting. It requires CEOs and CFOs to establish proper procedures for reporting and to personally certify that their companies' statements are up-to-date, complete and accurate. It puts pressure on CIOs to ensure systems deliver accurate and complete information.

7.2. OFAC (USA)

The Office of Foreign Assets Control administers economic and trade sanctions. It also targets and blocks the financial transactions and assets of terrorists, narcotics traffickers, and foreign countries posing a threat to the national security and economy of the USA. US citizens and organizations, and firms quoted or trading in the US, should check their customer records against the threats list published by OFAC.

7.3. Patriot Act (USA)

US regulation aimed at detecting and preventing terrorism, money laundering and other criminal activities. Organizations should compare customer records with the criminal suspect list published by government agencies.
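The screening obligation described in 7.2 and 7.3 is, at its simplest, a matching problem: customer records rarely spell a name exactly as the published list does. The sketch below is a deliberately simplified, hypothetical illustration in plain Python (the names are invented; real sanctions screening works against the official OFAC SDN list and uses far more sophisticated fuzzy matching, aliases and transliteration handling). It normalizes names by lowercasing, stripping punctuation and sorting word order before comparing, so that "Doe, John" and "John DOE" are treated as the same person.

```python
import re

def normalize(name):
    """Lowercase, strip punctuation, collapse whitespace, and sort the
    tokens so that 'Doe, John' and 'John DOE' compare as equal."""
    tokens = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
    return " ".join(sorted(tokens))

def screen(customers, watch_list):
    """Return customer names whose normalized form appears on the list."""
    flagged_forms = {normalize(n) for n in watch_list}
    return [c for c in customers if normalize(c) in flagged_forms]

# Invented examples for illustration only.
watch_list = ["Doe, John", "ACME Front Co."]
customers = ["John DOE", "Jane Smith", "Acme Front Co"]

print(screen(customers, watch_list))
# -> ['John DOE', 'Acme Front Co']
```

Even this toy version makes the paper's central point: if the customer records being screened contain inconsistent or inaccurate name data, matches will be missed no matter how good the matching logic is, which is exactly why data quality precedes compliance.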
7.4. Basel II (International)

Aimed at banking institutions, the New Basel Capital Accord, dubbed Basel II and due to come into force in 2006, requires banks to better judge and declare their exposure to the risk of repayment default by corporate customers. Banks that can show an advanced level of risk management should be able to reduce their capital requirements and long-term costs. It will necessitate that IT departments integrate multiple databases and reporting systems and ensure information is accurate.

7.5. Higgs Report (UK)

Produced following UK corporate scandals, the Higgs Report is designed to improve the management of public companies and the accuracy of their reporting. It places an emphasis on internal and external audits to ensure proper reporting and improved risk management, and is likely to place pressure on CIOs to carry out IT audits and ensure the accuracy of the data used in determining corporate performance.

7.6. IAS (UK/Europe)

In 2005, European companies will be required by law to file their accounts in accordance with the new International Accounting Standards. In 2004, dual reporting in both the current and IAS formats will be required. The purpose is to standardize reporting and bring greater transparency to public company reporting. The main impacts on IT will be working with Finance to adapt systems and sourcing the new data required by the new reporting requirements.
Dedicated to increasing the value of information assets across organizations, Trillium Software, a division of Harte-Hanks, is the most trusted provider of technologies for continuous global data analysis, cleansing, enhancement, and monitoring. Many of the world's leading companies use the Trillium Software System and Trillium Software Discovery to help build and augment data-dependent business systems that sustain financial growth in demanding business environments.

Offices of Trillium Software, a division of Harte-Hanks:

Global Headquarters (US)
170 Lexington Rd.
Billerica, MA 01821 US
Main: +1 978 436 8900
Fax: +1 978 670 5793

EMEA Headquarters (UK)
1-2 Fortuna Court, Calleva Park
Aldermaston, Berkshire RG7 8UB UK
Main: +44 (0) 118 940 7600
Sales: +44 (0)118 940 7666
Fax: +44 (0)118 940 7699

Trillium Software and Trillium Software System are registered trademarks of Harte-Hanks, Inc. Other trademarks are the property of the owners of those marks.