ITAG RESEARCH INSTITUTE

Control and Governance Maturity Survey
Establishing a reference benchmark and a self-assessment tool

Erik Guldentops, Wim Van Grembergen, Steven De Haes

Introduction

The CobiT Framework identifies 34 IT processes within an IT environment. For each process, it provides a high-level control statement and between 3 and 30 detailed control objectives. With CobiT 3rd Edition, a management layer called Management Guidelines was added, providing critical success factors, key performance indicators and maturity models for each of the processes. The maturity levels were defined in a manner similar to SEI's Software Maturity Models (see Table 1):

Table 1
0 - Non-existent: Management processes are not applied at all
1 - Initial: Processes are ad hoc and disorganised
2 - Repeatable: Processes follow a regular pattern
3 - Defined: Processes are documented and communicated
4 - Managed: Processes are monitored and measured
5 - Optimised: Best practices are followed and automated

These Maturity Models provide a method of scoring so that an organisation can grade itself from non-existent to optimised in controlling IT processes. While each process has specific descriptions of maturity, they all follow the generic model provided in the attachment. The basic principle of such a maturity measurement is that one can only move to a higher maturity level when all conditions described at a given level are fulfilled. Management can use this tool to obtain a quick self-assessment or as a reference in conjunction with an independent review. It defines the As-Is position of the enterprise relative to IT control and governance maturity, allows the enterprise to select a To-Be level appropriate for it and, after analysis of the gaps, to develop a strategy for improvement. Many ISACA members and CobiT users have requested information on how organisations are doing relative to these Maturity Models as described in the Management Guidelines. In response, ISACA set up a "Control and Governance Maturity Survey."
The purpose of this survey was to provide a self-assessment tool and to establish an actual reference benchmark on the IT control and governance maturity of enterprises and organisations in the public and not-for-profit sector. The results of the survey have now been collected and analysed. The main conclusion is that, on average, the maturity of enterprises in controlling IT processes hovers between 2 and 2.5, with the financial industry and global companies in the 2.5 to 3.0 bracket. Moreover, further filtering the results by enterprise size, type of industry and geography reveals some interesting specific differences.
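The scoring principle described above, where an organisation only reaches a maturity level once all conditions of that level and every level below it are fulfilled, can be sketched as follows. This is an illustrative sketch, not part of the survey tooling; the process name and the condition data are hypothetical.

```python
def maturity_level(conditions_met: dict[int, bool]) -> int:
    """Return the highest maturity level whose conditions, and those of
    every lower level, are all fulfilled; 0 means non-existent."""
    level = 0
    for lvl in range(1, 6):
        if conditions_met.get(lvl, False):
            level = lvl
        else:
            break  # an unfulfilled level blocks all higher levels
    return level

# Hypothetical self-assessment for one process (say PO10, manage projects):
# levels 1-3 fulfilled, level 4 not. Even though level 5 is ticked,
# the organisation stays at level 3.
assessment = {1: True, 2: True, 3: True, 4: False, 5: True}
as_is = maturity_level(assessment)   # As-Is position
to_be = 4                            # To-Be level chosen by management
gap = to_be - as_is                  # gap to close via an improvement strategy
```

The As-Is / To-Be gap computed at the end mirrors the improvement-strategy step the text describes.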

Methodology

To collect the data, we developed a web-based survey assessing the average maturity of the 15 most important processes of CobiT (see Table 2).

Table 2
PO1 - Define a strategic IT plan
PO3 - Determine technological direction
PO5 - Manage the IT investment
PO9 - Assess risks
PO10 - Manage projects
AI1 - Identify automated solutions
AI2 - Acquire and maintain application software
AI5 - Install and accredit systems
AI6 - Manage changes
DS1 - Define and manage service levels
DS4 - Ensure continuous service
DS5 - Ensure systems security
DS10 - Manage problems and incidents
DS11 - Manage data
M1 - Monitor the processes

The prefix of each identifier refers to the domain in which the process is classified: PO = Planning & Organisation, AI = Acquisition & Implementation, DS = Delivery & Support, M = Monitoring. This selection of 15 out of 34 processes was made a year prior to the survey by interviewing a group of some 20 senior experts from the IT and audit industry. For each of the 15 processes, respondents had to give a maturity score from 0 to 5, each time complying with the principles of a maturity measurement. To assist respondents, the maturity model descriptions could easily be consulted by clicking on a link. We also asked respondents to record driving forces (that push the company to a higher maturity level) and inhibiting forces (that keep the company from reaching a higher maturity level). To facilitate this, a group of 5 industry experts identified some of the most common drivers and inhibitors, but respondents could of course identify others as well. Before mailing the final version of the survey, we first created a web-based pilot. This pilot was posted on the internet in December 2001 and we asked several experts to fill in the survey for a real-life situation (their results were also included in the final survey).
Based on their comments and suggestions, we made the survey considerably more user-friendly and accessible. Together with them, we also finalised the list of possible drivers and inhibitors. The final survey was posted in March 2002 and an invitation to participate was sent to the purchasers of CobiT's second and third editions. The survey was closed in June 2002.

In total, we received 168 valid responses, distributed over different geographies, sizes and sectors (see Figure 1).

Figure 1: distribution of responses by geography (Americas; Europe/Middle East/Africa; Asia/Oceania; global), size (small; medium; large) and sector (finance; public sector; retail & manufacturing; other).

For the overall results of the maturity levels, we calculated and compared the un-weighted averages per process. We then filtered the results by size, sector and geography and again compared the un-weighted averages. This revealed some interesting differences. Finally, we ranked the driving and inhibiting forces in order of importance, based on the number of times respondents selected each force.

Findings

Figure 2 represents the un-weighted averages of the maturity scores for each process. Most of the maturity levels fluctuate between 2 and 2.5, and the variance between these results is very small. There are five processes with a maturity level higher than 2.5:
- DS5: ensure systems security
- AI1: identify automated solutions
- AI2: acquire and maintain application software
- DS10: manage problems and incidents
- PO10: manage projects

After 11 September 2001, security and contingency have certainly received more management attention, which explains the high score for DS5 (ensure systems security). This situation probably also led to more investment in incident response capabilities, which accounts for the high maturity score of DS10 (manage problems and incidents). An explanation for the high maturity levels of AI1 (identify automated solutions), AI2 (acquire and maintain application software) and PO10 (manage projects) can be found in the economic downturn: the IT department often has to cut costs, which can be realised by, among other things, optimising project management and the selection and implementation of (automated) solutions.
The processes with the lowest scores are:
- DS1: define and manage service levels
- M1: monitor the processes
- PO9: assess risks

The low scores for DS1 and M1 indicate that enterprises should be more formal in the management, control and performance measurement of processes and service levels. PO9 is probably only a priority for very risk-aware enterprises.

Figure 2: un-weighted average maturity scores per process.
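The aggregation behind these findings, un-weighted averages per process across all respondents, then the same averages on a filtered subset, can be sketched as follows. The response records below are invented for illustration (the real survey had 168 responses covering all 15 processes).

```python
from statistics import mean

# Hypothetical survey responses: each respondent reports a size category
# and a maturity score (0-5) per process. Only three processes shown.
responses = [
    {"size": "large",  "scores": {"PO10": 3, "DS5": 4, "DS1": 2}},
    {"size": "small",  "scores": {"PO10": 2, "DS5": 3, "DS1": 1}},
    {"size": "medium", "scores": {"PO10": 2, "DS5": 3, "DS1": 2}},
]

def average_per_process(rows):
    """Un-weighted average maturity per process over the given responses."""
    processes = rows[0]["scores"].keys()
    return {p: mean(r["scores"][p] for r in rows) for p in processes}

overall = average_per_process(responses)
# Filtering by an attribute (here size) before averaging reproduces the
# breakouts shown in Figures 3-5.
large_only = average_per_process([r for r in responses if r["size"] == "large"])
```

The same filter-then-average step, applied to sector or geography instead of size, yields the other breakouts discussed below.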

Filtering these results by size generates Figure 3. As could be expected, this figure shows that smaller companies have on average a lower maturity level than larger companies. For large companies, the maturity levels of the processes hover around 3, while for small companies they sit around 2. However, for the small companies, the results reveal a peak for DS5 (ensure systems security) compared to their other processes. As already mentioned, 11 September 2001 may explain this phenomenon. The maturity levels of the medium-sized companies lean more towards the overall average.

Figure 3: average maturity scores per process, filtered by company size (large, medium, small).

We also filtered the results by type of industry (sector), as shown in Figure 4. The finance sector has in general a relatively high maturity level compared to the other sectors. This is certainly true for DS4 and DS5, i.e. ensure continuous service and ensure systems security. These processes are of course extremely important for financial institutions: this sector cannot afford downtime of its systems. In this comparison, the retail and manufacturing sector scores low, with the exception of AI1 and AI2, i.e. identify automated solutions and acquire and maintain application software; automating business processes with software applications is very important in this sector. The graph also illustrates a relatively higher maturity for the public sector in the Planning domain, probably due to the presence of explicit policies and regulations.

Figure 4: average maturity scores per process, filtered by sector (finance, public sector, retail & manufacturing).
Comparing the different continents, as shown in Figure 5, indicates that enterprises in Asia/Oceania and globally operating companies have a relatively high maturity level compared to companies in the Americas and in Europe, the Middle East and Africa (EMEA). This result is quite understandable for globally operating companies; the high maturity levels in Asia/Oceania, however, are more difficult to explain. A reason may be that the measurement of a maturity level depends on the cultural and historical backgrounds of the company and the region in which it operates. Despite the clearly described maturity models, these differences can have an impact on the way one measures and scales maturity.

Figure 5: average maturity scores per process, filtered by geography (global, Asia/Oceania, EMEA, Americas).

Moreover, it should be noted that the results are the maturity levels that respondents have given to their own company, from their personal perspective. It is therefore very important that these results are interpreted with the necessary care (overestimations are possible) and that they are positioned in a world-wide context. It was never the purpose of this survey to validate these results by other means. A closer look at the Americas reveals a peak for DS5 (ensure systems security); again, 11 September can be an explanation. On the other hand, the Americas seem relatively immature on M1 (monitor the processes) and DS1 (define and manage service levels) compared to other processes. The same conclusion can be drawn for DS1 in Europe, the Middle East and Africa; the other results for this region lean towards the overall average. As mentioned in the introduction of this article, we identified possible driving and inhibiting forces for reaching a higher maturity level and asked respondents to indicate those that applied in their situation. Based on the results, we ranked the driving and inhibiting forces in order of importance (starting with the most important), as shown in Tables 3 and 4. Reputation and trust appears to be a very important driving force for moving to a higher maturity level, while budget limitations can inhibit moving up. When an enterprise aims to move a certain process to a higher maturity level, these issues clearly have to be monitored closely. The specific context in which a company operates implies, of course, that driving or inhibiting forces could occur that are not mentioned in this list; the challenge is then to identify them. Some specific drivers that respondents recorded themselves are corporate governance, acquisitions and the size of the organisation. Specific inhibitors were a reactive mindset and, again, the size of the organisation.
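The ranking method just described, ordering forces by how many respondents selected each one, amounts to a frequency count. A minimal sketch with invented respondent selections:

```python
from collections import Counter

# Hypothetical ticked selections from three respondents; the real survey
# offered a pre-compiled list of drivers plus free-text entries.
selections = [
    ["Reputation and trust", "Cost reduction"],
    ["Reputation and trust", "Risk reduction"],
    ["Performance improvement", "Reputation and trust"],
]

# Count every ticked force across all respondents and rank by frequency,
# most frequently selected first.
counts = Counter(force for chosen in selections for force in chosen)
ranking = [force for force, _ in counts.most_common()]
```

With this toy data, "Reputation and trust" leads the ranking, mirroring how the survey's Table 3 was ordered.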
Table 3 - Driving forces (in order of importance)
- Reputation and trust
- Legal, regulatory and contract compliance
- Performance improvement
- Risk reduction
- Cost reduction
- Mission and goals
- Corporate values
- Competitive environment
- External political/economic environment

Table 4 - Inhibiting forces (in order of importance)
- Budget limitations
- Resource priorities
- Resource conflicts
- Availability of skilled staff
- Management awareness
- Management commitment
- No easy solution
- Existing architecture
- Lack of ownership
- External political/economic environment
- Lack of tools

Conclusion

This study provides first and foremost a reference benchmark and a self-assessment tool. The data collected indicates that the 15 most important processes of the CobiT framework have a maturity level between 2 and 2.5. Filtering these results by size, geography and type of industry revealed some interesting differences: for example, large companies, companies in the finance sector and globally operating companies showed a higher maturity level than average (2.5 to 3.0). While being aware of the limitations of self-assessments, the industry, size and geography breakouts of this survey should nevertheless provide many organisations with a benchmark against which to compare their maturity in IT control and governance. There was, however, one test that gave some confidence in the results. Prior to the survey, 5 industry experts estimated where the different industries should reasonably be. They scored generally 0.5 points higher than the respondents, meaning that respondents have most probably been fair in their judgement, and confirming what we all suspect: there is room for improvement. ISACA wants to thank all those who participated in this survey. We feel certain the information gained through the respondents' efforts will be useful and beneficial to all who aspire to more effective IT governance.

Attachment: Generic Maturity Model

0 - Non-existent. Complete lack of any recognisable processes. The organisation has not even recognised that there is an issue to be addressed.

1 - Initial. There is evidence that the organisation has recognised that the issues exist and need to be addressed. There are, however, no standardised processes; instead there are ad hoc approaches that tend to be applied on an individual or case-by-case basis. The overall approach to management is chaotic.

2 - Repeatable. Processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures, and responsibility is left to the individual. There is a high degree of reliance on the knowledge of individuals and therefore errors are likely.

3 - Defined. Procedures have been standardised, documented and communicated through training. It is, however, left to the individual to follow these processes, and any deviations are unlikely to be detected. The procedures themselves are not sophisticated but are the formalisation of existing practices.

4 - Managed. It is possible to monitor and measure compliance with procedures and to take action where processes appear not to be working effectively. Processes are under constant improvement and provide good practice. Automation and tools are used in a limited or fragmented way.

5 - Optimised. Processes have been refined to a level of best practice, based on the results of continuous improvement and maturity modelling with other organisations. IT is used in an integrated way to automate the workflow and provide tools to improve quality and effectiveness.

About UAMS

UAMS (University of Antwerp Management School) has the ambition to be a learning partner in management, offering a broad range of training programmes for future and current managers in the business world, public services and social-profit organisations. Its priorities include optimal quality control, interactive teaching methods, an emphasis on research-based knowledge and best practice, an international orientation and continuous adaptation of its programmes to the needs of the market.

About ITAG

The Information Technology Alignment and Governance (ITAG) Research Institute was established within UAMS to host applied research in the domains of IT governance and business/IT alignment. The research centre is an initiative of Prof. dr. Wim Van Grembergen and dr. Steven De Haes, both of whom have research and practical experience in the IT governance and strategic alignment domains. Recently, the team was reinforced by senior researcher Hilde Van Brempt.

Contact
UAMS - ITAG Research Institute
Sint-Jacobsmarkt 9-13
B-2000 Antwerpen
Belgium

Wim Van Grembergen, Ph.D. is a professor at the Information Systems Management Department of the University of Antwerp and an executive professor at the University of Antwerp Management School. He is academic director of the ITAG Research Institute and has conducted research in the areas of IT governance, value management and performance management. Over the past years, he has been involved in research and development activities for several COBIT products. He can be contacted at Wim.VanGrembergen@ua.ac.be.

Steven De Haes, Ph.D. is responsible for the information systems management executive programmes and research at the University of Antwerp Management School. He is managing director of the ITAG Research Institute and recently finalised a Ph.D. on IT governance and business/IT alignment.
He has been involved in research and development activities for several COBIT products. He can be contacted at Steven.DeHaes@ua.ac.be.

Erik Guldentops is an executive professor at the University of Antwerp Management School (Belgium). He has initiated and provided leadership to the COBIT and Val IT initiatives since their inception.