
White Paper

Acting today vs. tackling a larger problem tomorrow: Data Quality in Banks

WE PUT THE BANKING INTO BUSINESS INTELLIGENCE
www.icreate.in

The banking industry is no longer manual. All banks use some element of automation. They may not be fully automated, but the basic operations of a bank are, at worst, partially automated, with plans to automate the remaining manual operations. However, automation has rarely been done in a structured manner with a long-term vision. Some of the problems relating to the development of systems in banks are listed below:

- No central architect governs the overall development of technology, leading to a number of small systems on discrete platforms that do not talk to each other easily. The result is multiple systems with multiple owners. It is quite normal to find a medium-sized bank with 40-60 systems performing various activities but not integrating well.
- There has been no consistent development methodology or vendor selection. Development has been done in a decentralized manner, with people assigning development responsibilities to vendors in whom they had confidence.
- Apart from discrete platforms, no data standards have been applied across systems. It is not uncommon to find a bank whose two product-processing systems use different nomenclature for currency codes, or even for the customer number.

The end result of the above factors is best described as "spaghetti systems", where extraction of meaningful data is a horrendous process. Banks have acknowledged this problem and have now set up central technology departments to control all systems, launched projects to consolidate systems, and established data standards boards. While this is a step in the right direction, the technology unit is grappling with immediate user demands, systems built on obsolete technology, and budget constraints.
While the issue of data quality arising from these spaghetti systems has been acknowledged, it has often remained at the discussion stage and not been acted upon, due to the various constraints described above. Banks have thus managed day-to-day requirements by setting up huge Excel "factories" that take inputs from various systems, massage the data, and produce the desired output to meet the need of the hour. While Excel is the most respected and most used software in all banks, it has led to an even greater loss of control over data quality and processes. It is an accepted fact that the older the bank, the greater the data quality problems, as the above issues have been accumulating for longer.

Amid the chaotic scenario painted above, customers have been demanding more and more as the use of internet and mobile banking has become widespread. They want a single picture of their entire relationship with the bank in summary format, with drill-down features to get to the details. Regulators have become stricter, asking banks to automate their returns and provide online connections so that they can inspect areas directly without having to get in touch with the bank. Banks have responded to these needs by using Band-Aid technology to cover their deficiencies and project a glossy picture to the external world while they struggle with their spaghetti.

How does one overcome this rather complex problem of meeting day-to-day needs while also getting rid of this monstrous legacy, so that one does not have to live with it forever? While it is a difficult problem, it can be solved with a focused approach and the commitment and resolve to see it through. The key to solving data quality problems is to monitor the issues consistently, measure them, find the root causes, and fix them. This ensures that data quality improves with every reporting cycle, moving steadily toward zero data defects.
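As a minimal sketch of this monitoring idea (all names here are illustrative, not taken from any real banking system), issues can be logged against the report and data element they affect, then counted per reporting cycle so the trend toward zero defects is measurable:

```python
from collections import defaultdict

class DQIssueLog:
    """Hypothetical data quality issue log: counts defect occurrences
    per (reporting cycle, report, data element)."""

    def __init__(self):
        # (cycle, report, data_element) -> occurrence count
        self.counts = defaultdict(int)

    def record(self, cycle, report, data_element, occurrences=1):
        self.counts[(cycle, report, data_element)] += occurrences

    def defects_in_cycle(self, cycle):
        """Total defect occurrences recorded for one reporting cycle."""
        return sum(n for (c, _, _), n in self.counts.items() if c == cycle)

log = DQIssueLog()
log.record("2013-01", "Regulatory Return", "customer_number", 100)
log.record("2013-01", "Management Dashboard", "exchange_rate", 5)
log.record("2013-02", "Regulatory Return", "customer_number", 60)

print(log.defects_in_cycle("2013-01"))  # 105
print(log.defects_in_cycle("2013-02"))  # 60
```

Comparing the per-cycle totals (105, then 60) is exactly the "improvement with every reporting cycle" measure described above.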
If the number of issues is very high, they can be prioritized and resolved using an 80-20 classification, by doing a Pareto analysis as shown in the table below: take up the top 20 percent of data quality issues, which should give the maximum amount of improvement.

Application of Pareto analysis to causes of data quality issues:

Sl. No.  Cause                      Frequency    %       Cum %
1        Software                   25           52.08   52.08
2        Mapping                    13           27.08   79.17
3        Data Entry Errors          4            8.33    87.50
4        External Data              3            6.25    93.75
5        Operational Workflow       2            4.17    97.92
6        Documentation / Training   1            2.08    100.00
         Total                      48

Data quality monitoring must thus keep track of the issues occurring on a report or data element, the number of occurrences of each defect, and other key measures (such as amounts or balances), by time period. This helps in prioritizing the resolution of issues, rather than trying to clean everything in one go and not getting the desired results. The key causes of data quality problems are also summarized in the table above.

Software defects are data quality issues caused by incorrect programming, a lack of understanding of the requirements, or a design issue. Typically, these result in missing data elements, data elements with incorrect values, or drill-down issues where the details do not add up to the consolidated total. While these are some examples of data quality errors due to software defects, they can manifest in many other ways in an organization.

Mapping errors are very likely to arise when a large number of application systems have been developed with different data standards over a period of time. It is not uncommon to find a bank whose applications use different customer numbers for the same customer, with a relationship management system on top to bring all contracts and accounts under one relationship number. This requires extensive one-time mapping for old customers, and well-controlled processes to minimize or eliminate mapping issues for new customers coming to the bank.
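The Pareto prioritization can be sketched in a few lines; the cause frequencies below come from the table above, and the cutoff of 80% is the usual 80-20 heuristic:

```python
# Pareto (80-20) prioritization of data quality causes.
causes = [
    ("Software", 25),
    ("Mapping", 13),
    ("Data Entry Errors", 4),
    ("External Data", 3),
    ("Operational Workflow", 2),
    ("Documentation / Training", 1),
]

total = sum(freq for _, freq in causes)
ranked = sorted(causes, key=lambda c: c[1], reverse=True)

cum = 0.0
priority = []  # causes taken up until cumulative coverage reaches ~80%
for name, freq in ranked:
    if cum >= 80.0:
        break
    priority.append(name)
    cum += 100.0 * freq / total

print(priority)  # ['Software', 'Mapping', 'Data Entry Errors']
```

With these frequencies, the first two causes alone cover 79.17% of occurrences, so the third is included to pass the 80% mark, covering 87.5% in total.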
There can also be timing issues in mapping errors. A customer may have been classified under an asset class of sovereign and reclassified to corporate in, say, January 2013. If the mapping does not have a time stamp associated with it, all historical report retrievals will classify the customer incorrectly.

Data entry errors occur due to poor data validation at the point of capture. One of the biggest causes of data entry errors is that even the best-automated organization has a number of peripheral activities done manually or in Excel. These are uncontrolled entries into the controlled and trusted data environment, and a perennial source of data quality issues. They need to be reviewed and appropriate controls introduced to prevent data quality issues. For example, the court-case details of defaulted customers are often kept in an Excel sheet and never in the core banking system. However, this data needs to be linked to data from the core systems via some field, say the customer number. An incorrect linkage could lead to incorrect reporting, or to data being dropped from reporting altogether.

External data errors arise where banks use external data for business operations, performance management, or regulatory reporting. Take the case of customer ratings: each rating agency rates customers under its own customer codes, which have to be mapped to the bank's customer codes, creating the potential for mapping errors. Additionally, there could be issues where data is sourced from the wrong site, or file-transfer errors between external sources and internal systems. It is critical that these transfers are done with adequate controls, such as checksums and authentication, to ensure minimal errors on this front.

Operational workflow defects arise because a workflow is not defined properly. As an example, suppose one of the management dashboards uses the regulator-supplied exchange rate for the day. Due to some operational error, the file with today's exchange rate did not arrive, so the system used the previous day's rate and displayed the reports. This could lead to many incorrect management and operational decisions. In this case, the workflow should stop processing if the rates have not been received.

Documentation and training are the cornerstone of data quality. All data manipulation is ultimately based on data entered into some source system by a human being, and that person relies on documentation and training to ensure that correct data is input into the system. Let us assume that a software upgrade has added a new customer-classification field in the release, defaulting to "corporate".
If the documentation has not been updated, or the user has not been trained on the purpose of the change and the consequences of incorrect data entry, every entry will leave the customer classification at the default of corporate, leading to incorrect analysis and decisions.

After prioritizing the issues to be taken up for rectification, one needs to do a root cause analysis and fix the cause. There is no point in applying a temporary Band-Aid fix, as the problem will simply recur. Identifying the root cause is not easy and requires structured analysis. While the cause may seem apparent in some cases, a formal root cause analysis ensures that the problem has been examined from all angles and cannot occur again. Listing all possible causes for the problem and eliminating them one by one is a structured way of arriving at the root cause. For example, a cause-and-effect (fishbone) diagram for a software defect would lay out candidate causes along branches such as:

- Domain knowledge: lack of training; aptitude of the technical person; lack of guidance from a user who lacks domain knowledge
- Software development process: processes not followed; lack of awareness of processes; processes needing improvement
- Testing: test scripts not adequate; test data incomplete; incorrect data inputs
- Technical: coding skills; incorrect design; incorrect specifications

Analyzing these causes and eliminating them one by one leads to the root cause of the defect.
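The timing issue described earlier (a customer reclassified from sovereign to corporate in January 2013) can be avoided by time-stamping the mapping. This is a hypothetical sketch; `EffectiveDatedMap` and its methods are illustrative, not from any banking product:

```python
from bisect import bisect_right
from datetime import date

class EffectiveDatedMap:
    """Mapping where each value carries an effective-from date, so
    historical reports pick the classification in force on the report
    date rather than the latest one."""

    def __init__(self):
        self.dates = []   # effective-from dates (assumed appended in ascending order)
        self.values = []  # classification effective from that date

    def set(self, effective_from, value):
        self.dates.append(effective_from)
        self.values.append(value)

    def lookup(self, as_of):
        """Classification in force on `as_of` (None if before the first entry)."""
        i = bisect_right(self.dates, as_of) - 1
        return self.values[i] if i >= 0 else None

# Customer rated sovereign, reclassified to corporate in January 2013.
asset_class = EffectiveDatedMap()
asset_class.set(date(2010, 1, 1), "sovereign")
asset_class.set(date(2013, 1, 1), "corporate")

print(asset_class.lookup(date(2012, 6, 30)))  # sovereign
print(asset_class.lookup(date(2013, 6, 30)))  # corporate
```

Without the date column, a report rerun for June 2012 would wrongly show the customer as corporate; with it, historical retrievals classify the customer as they stood on the reporting date.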

After identifying the root cause, one has to work out a solution that fixes it. Fixing data quality issues normally involves a number of small projects that do not require the rigor of a full SDLC (Software Development Life Cycle) or formal project management. They normally involve quick fixes, with testing around the impacted areas, and do not require a full system test. A good approach to monitoring the progress of these small projects is a four-quadrant analysis, illustrated here with a "missing customer number" issue in the end-of-day (EOD) runs:

- Process: EOD runs, measured daily from end-of-day to beginning-of-day
- Data quality issue: missing customer number, about 100 occurrences per month, tracked month by month against cause categories (not done, unauthorized, wrong dependency, software problem, others)
- Observations: the problem occurs for new customer numbers only; the treasury system uses a customer number different from the standard customer number used for other products, so the translation table is not updated for new customers; although the treasury system supplies a customer number, it is blanked out because there is no translation for the new number
- Action plan: fix the bug in the UPDATE-TRANSLATION process

The first quadrant (top left) gives a summary of the number of times the data quality problem has occurred over a time horizon. The second quadrant (top right) gives the causes of the problem. It is possible that the problem occurred not because of a single cause but due to multiple issues; this quadrant provides a factor for prioritizing, and shows how much of the problem would be resolved by fixing any one cause.
The third quadrant (bottom left) identifies the processes that need to be fixed and how we can measure whether the problem has been fixed. The fourth quadrant is a list of actions to be taken to resolve the problem. This quadrant can also include a status column, used to monitor the progress of the resolution and remove the bottlenecks that hamper it.

While the above approach helps in resolving existing data quality issues, one also needs to prevent new issues from arising through new systems or enhancements to existing ones. A common technique for preventing such issues is to form a Data Standards Board, which ensures that data standards are followed across the organization. The board builds a list of the data elements used in the organization and publishes their purpose, format, and validations, and all new implementations must adhere to this standard. If a new data element has to be added, it must be run through the Data Standards Board before it can be used in an application. While a Data Standards Board will not eliminate data quality issues, it will minimize their number and, above all, bring an organization-wide awareness of the problems and impact of data quality. This awareness will automatically reduce the proliferation of DQ issues.

In summary: data quality is a problem in most large enterprises. Business management teams have to recognize the importance of data quality and put in place a focused approach to resolving the issues. Delays in addressing them will lead to incorrect analysis and decisions; more importantly, the longer you delay, the bigger the problem becomes.

About the Author: Ravi Raman is Chief Solutions Evangelist at icreate Software.

icreate Software Pvt. Ltd. 41, 6th Block, 17th Main, 100 ft Road, Koramangala, Bangalore 560 095. T: +91 80 405 89 400 E: info@icreate.in W: www.icreate.in