COBIT5 Assessment Questionnaire. BAI07 Manage Change Acceptance & Testing


Document History
V.    Modification    Drafted (Date / Who)    Reviewed (Date / Who)    Approved (Date / Who)
1.00  Initial

Document Information
Author: Daniel Frutschi
Manager: Martin Andenmatten
Document Name: GLF_COBIT5_eng_T11_Questionnaire_BAI07_r2_v2.0.doc
Title: Questionnaire for the COBIT5 process assessment
Subject:
Category:
Comment:

Summary
1. Individual Data ... 4
2. ... 5
2.1. Base Practices ... 7
2.2. Work Products ... 15
2.3. Maturity Level 1 Assessment ... 17
2.4. Maturity Level 2 Assessment ... 18
2.5. Work Products for Level 2 ... 22
2.6. Maturity Level 3 Assessment ... 23
2.7. Work Products for Level 3 ... 29
2.8. Maturity Level 4 Assessment ... 30
2.9. Work Products for Level 4 ... 34
2.10. Maturity Level 5 Assessment ... 35
2.11. Work Products for Level 5 ... 38

1. Individual Data

Name:
Current Position:
Seniority in the Organization and History of Positions:
Daily Tasks Description:
% of Time Spent on This Process:
Location:
Specific Skills:
Degree of Involvement During Interview:
Comments:

2.

ID: BAI07
Name: Manage Change Acceptance & Testing
Description: Formally accept and make operational new solutions, including implementation planning, system and data conversion, acceptance testing, communication, release preparation, promotion to production of new or changed business processes and IT services, early production support, and a post-implementation review.
Purpose Statement: Implement solutions safely and in line with the agreed-on expectations and outcomes.

Management Practices

BAI07.01 Establish an implementation plan. Establish an implementation plan that covers system and data conversion, acceptance testing criteria, communication, training, release preparation, promotion to production, early production support, a fallback/backout plan, and a post-implementation review. Obtain approval from relevant parties.

BAI07.02 Plan business process, system and data conversion. Prepare for business process, IT service data and infrastructure migration as part of the enterprise's development methods, including audit trails and a recovery plan should the migration fail.

BAI07.03 Plan acceptance tests. Establish a test plan based on enterprisewide standards that define roles, responsibilities, and entry and exit criteria. Ensure that the plan is approved by relevant parties.

BAI07.04 Establish a test environment. Define and establish a secure test environment representative of the planned business process and IT operations environment, performance and capacity, security, internal controls, operational practices, data quality and privacy requirements, and workloads.

BAI07.05 Perform acceptance tests. Test changes independently in accordance with the defined test plan prior to migration to the live operational environment.

BAI07.06 Promote to production and manage releases. Promote the accepted solution to the business and operations. Where appropriate, run the solution as a pilot implementation or in parallel with the old solution for a defined period and compare behaviour and results. If significant problems occur, revert to the original environment based on the fallback/backout plan. Manage releases of solution components.

BAI07.07 Provide early production support. Provide early support to the users and IT operations for an agreed-on period of time to deal with issues and help stabilise the new solution.

BAI07.08 Perform a post-implementation review. Conduct a post-implementation review to confirm outcome and results, identify lessons learned, and develop an action plan. Evaluate and check the actual performance and outcomes of the new or changed service against the predicted performance and outcomes (i.e., the service expected by the user or customer).

2.1. Base Practices

BAI07.BP1: BAI07.01 Establish an implementation plan. Rating: N/P/L/F/N.A.
Establish an implementation plan that covers system and data conversion, acceptance testing criteria, communication, training, release preparation, promotion to production, early production support, a fallback/backout plan, and a post-implementation review. Obtain approval from relevant parties.

Are the following activities planned, done, checked and improved?

1. Create an implementation plan that reflects the broad implementation strategy, the sequence of implementation steps, resource requirements, inter-dependencies, criteria for management acceptance of the production implementation, installation verification requirements, the transition strategy for production support, and updates of BCPs.
2. Confirm that all implementation plans are approved by technical and business stakeholders and reviewed by internal audit, as appropriate.
3. Obtain commitment from external solution providers to their involvement in each step of the implementation.
4. Identify and document the fallback and recovery process.
5. Formally review the technical and business risks associated with implementation, and ensure that the key risks are considered and addressed in the planning process.
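A simple way to make activity 1 above auditable is a completeness check over the plan's required elements before sign-off. The following sketch is illustrative only; the element names are assumptions paraphrased from the activity text, not COBIT terms:

```python
# Required elements of an implementation plan, paraphrased from activity 1
# of BAI07.BP1 (names are illustrative, not normative COBIT identifiers).
REQUIRED_PLAN_ELEMENTS = {
    "implementation strategy", "sequence of steps", "resource requirements",
    "inter-dependencies", "acceptance criteria", "installation verification",
    "production support transition", "bcp update", "fallback and recovery",
}

def missing_elements(plan):
    """Return the required elements a draft implementation plan does not
    yet cover. `plan` maps element name -> content (empty means missing)."""
    covered = {k for k, v in plan.items() if v}
    return sorted(REQUIRED_PLAN_ELEMENTS - covered)
```

A plan would only be submitted for approval (activity 2) once `missing_elements` returns an empty list.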

BAI07.BP2: BAI07.02 Plan business process, system and data conversion. Rating: N/P/L/F/N.A.
Prepare for business process, IT service data and infrastructure migration as part of the enterprise's development methods, including audit trails and a recovery plan should the migration fail.

Are the following activities planned, done, checked and improved?

1. Define a business process, IT service data and infrastructure migration plan. Consider, for example, hardware, networks, operating systems, software, transaction data, master files, backups and archives, interfaces with other systems (both internal and external), possible compliance requirements, business procedures, and system documentation in the development of the plan.
2. Consider all necessary adjustments to procedures, including revised roles and responsibilities and control procedures, in the business process conversion plan.
3. Incorporate in the data conversion plan methods for collecting, converting and verifying the data to be converted, and for identifying and resolving any errors found during conversion. Include comparing the original and converted data for completeness and integrity.
4. Confirm that the data conversion plan does not require changes in data values unless absolutely necessary for business reasons. Document any changes made to data values, and secure approval from the business process data owner.
5. Rehearse and test the conversion before attempting a live conversion.
6. Consider the risk of conversion problems, business continuity planning, and fallback procedures in the business process, data and infrastructure migration plan where there are risk management, business needs or regulatory/compliance requirements.
7. Co-ordinate and verify the timing and completeness of the conversion cutover so there is a smooth, continuous transition with no loss of transaction data. Where necessary, in the absence of any other alternative, freeze live operations.
8. Plan to back up all systems and data at the point prior to conversion. Maintain audit trails to enable the conversion to be retraced, and ensure that there is a recovery plan covering rollback of the migration and fallback to previous processing should the migration fail.
9. Plan retention of backup and archived data to conform to business needs and regulatory or compliance requirements.
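Activity 3 (comparing the original and converted data for completeness and integrity) can be sketched as a reconciliation of per-record digests. This is a minimal illustration, assuming records are dicts with an `id` key; the digest scheme is an assumption, not part of COBIT:

```python
import hashlib

def record_digest(record):
    """Stable digest of a record's fields (order-insensitive over dict keys)."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_conversion(original, converted, key="id"):
    """Reconcile source and migrated data sets for completeness and integrity.

    Returns findings: keys lost in migration, keys that appeared unexpectedly,
    and keys whose content changed during conversion.
    """
    src = {r[key]: record_digest(r) for r in original}
    dst = {r[key]: record_digest(r) for r in converted}
    return {
        "missing": sorted(src.keys() - dst.keys()),     # lost in migration
        "unexpected": sorted(dst.keys() - src.keys()),  # not in the source
        "changed": sorted(k for k in src.keys() & dst.keys() if src[k] != dst[k]),
    }
```

Empty `missing`, `unexpected` and `changed` lists would constitute the completeness and integrity evidence; any non-empty finding feeds the error-resolution step of the same activity.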

BAI07.BP3: BAI07.03 Plan acceptance tests. Rating: N/P/L/F/N.A.
Establish a test plan based on enterprisewide standards that define roles, responsibilities, and entry and exit criteria. Ensure that the plan is approved by relevant parties.

Are the following activities planned, done, checked and improved?

1. Develop and document the test plan, which aligns to the programme and project quality plan and relevant organisational standards. Communicate and consult with appropriate business process owners and IT stakeholders.
2. Ensure that the test plan reflects an assessment of risk from the project and that all functional and technical requirements are tested. Based on the assessment of the risk of system failure and faults on implementation, the plan should include requirements for performance, stress, usability, pilot and security testing.
3. Ensure that the test plan addresses the potential need for internal or external accreditation of outcomes of the test process (e.g., financial regulatory requirements).
4. Ensure that the test plan identifies the necessary resources to execute testing and evaluate the results. Examples of resources include construction of test environments and use of staff time for the test group, including potential temporary replacement of test staff in the production or development environments. Ensure that stakeholders are consulted on the resource implications of the test plan.
5. Ensure that the test plan identifies testing phases appropriate to the operational requirements and environment. Examples of such testing phases include unit test, system test, integration test, user acceptance test, performance test, stress test, data conversion test, security test, operational readiness test, and backup and recovery tests.
6. Confirm that the test plan considers test preparation (including site preparation), training requirements, installation or an update of a defined test environment, planning/performing/documenting/retaining test cases, error and problem handling, correction and escalation, and formal approval.
7. Ensure that the test plan establishes clear criteria for measuring the success of each testing phase. Consult the business process owners and IT stakeholders in defining the success criteria. Determine that the plan establishes remediation procedures when the success criteria are not met (e.g., in a case of significant failures in a testing phase, the plan provides guidance on whether to proceed to the next phase, stop testing or postpone implementation).
8. Confirm that all test plans are approved by stakeholders, including business process owners and IT, as appropriate. Examples of such stakeholders are application development managers, project managers and business process end users.
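The phase structure and exit criteria from activities 5 and 7 can be expressed as data, so that the decision to proceed or stop is explicit rather than ad hoc. A hypothetical sketch; the phase names come from activity 5, but the pass-rate thresholds are invented for illustration:

```python
# Testing phases (from activity 5) with an illustrative exit criterion each:
# the minimum pass rate required before proceeding to the next phase.
TEST_PLAN = [
    ("unit test", 1.00),
    ("system test", 0.98),
    ("integration test", 0.98),
    ("user acceptance test", 0.95),
]

def phase_gate(phase, passed, executed, plan=TEST_PLAN):
    """Return 'proceed' or 'stop' based on the phase's exit criterion,
    mirroring the remediation guidance called for in activity 7."""
    threshold = dict(plan)[phase]
    rate = passed / executed if executed else 0.0
    return "proceed" if rate >= threshold else "stop"
```

In practice the "stop" branch would trigger the plan's remediation procedure (retest, postpone implementation, or escalate), rather than a silent halt.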

BAI07.BP4: BAI07.04 Establish a test environment. Rating: N/P/L/F/N.A.
Define and establish a secure test environment representative of the planned business process and IT operations environment, performance and capacity, security, internal controls, operational practices, data quality and privacy requirements, and workloads.

Are the following activities planned, done, checked and improved?

1. Create a database of test data that are representative of the production environment. Sanitise data used in the test environment from the production environment according to business needs and organisational standards (e.g., consider whether compliance or regulatory requirements oblige the use of sanitised data).
2. Protect sensitive test data and results against disclosure, including access, retention, storage and destruction. Consider the effect of interaction of organisational systems with those of third parties.
3. Put in place a process to enable proper retention or disposal of test results, media and other associated documentation to enable adequate review and subsequent analysis as required by the test plan. Consider the effect of regulatory or compliance requirements.
4. Ensure that the test environment is representative of the future business and operational landscape, including business process procedures and roles, likely workload stress, operating systems, necessary application software, database management systems, and the network and computing infrastructure found in the production environment.
5. Ensure that the test environment is secure and incapable of interacting with production systems.
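The sanitisation step in activity 1 is often implemented by masking sensitive fields with a deterministic one-way token, so that test data stays representative (same joins, same uniqueness) without exposing personal data. A minimal sketch; the field names are assumptions, and real deployments would add salting and a broader field inventory:

```python
import hashlib

# Field names are illustrative assumptions, not a normative list.
SENSITIVE_FIELDS = {"name", "email", "account_no"}

def sanitise(record, fields=SENSITIVE_FIELDS):
    """Replace sensitive values with a one-way token. Deterministic hashing
    preserves referential integrity: the same source value always maps to
    the same token across tables."""
    out = dict(record)
    for f in fields & record.keys():
        token = hashlib.sha256(str(record[f]).encode("utf-8")).hexdigest()[:12]
        out[f] = f"MASKED-{token}"
    return out
```

Non-sensitive fields pass through untouched, which keeps the data set usable for workload and capacity testing (activity 4).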

BAI07.BP5: BAI07.05 Perform acceptance tests. Rating: N/P/L/F/N.A.
Test changes independently in accordance with the defined test plan prior to migration to the live operational environment.

Are the following activities planned, done, checked and improved?

1. Review the categorised log of errors found in the testing process by the development team, verifying that all errors have been remediated or formally accepted.
2. Evaluate the final acceptance against the success criteria and interpret the final acceptance testing results. Present them in a form that is understandable to business process owners and IT so an informed review and evaluation can take place.
3. Approve the acceptance with formal sign-off by the business process owners, third parties (as appropriate) and IT stakeholders prior to promotion to production.
4. Ensure that testing of changes is undertaken in accordance with the testing plan. Ensure that the testing is designed and conducted by a test group independent from the development team. Consider the extent to which business process owners and end users are involved in the test group. Ensure that testing is conducted only within the test environment.
5. Ensure that the tests and anticipated outcomes are in accordance with the defined success criteria set out in the testing plan.
6. Consider using clearly defined test instructions (scripts) to implement the tests. Ensure that the independent test group assesses and approves each test script to confirm that it adequately addresses the test success criteria set out in the test plan. Consider using scripts to verify the extent to which the system meets security requirements.
7. Consider the appropriate balance between automated scripted tests and interactive user testing.
8. Undertake tests of security in accordance with the test plan. Measure the extent of security weaknesses or loopholes. Consider the effect of security incidents since construction of the test plan. Consider the effect on access and boundary controls.
9. Undertake tests of system and application performance in accordance with the test plan. Consider a range of performance metrics (e.g., end-user response times and database management system update performance).
10. When undertaking testing, ensure that the fallback and rollback elements of the test plan have been addressed.
11. Identify, log and classify (e.g., minor, significant, mission-critical) errors during testing. Ensure that an audit trail of test results is available. Communicate results of testing to stakeholders in accordance with the test plan to facilitate bug fixing and further quality enhancement.
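The categorised error log of activities 1 and 11 can be summarised mechanically to support the sign-off decision in activity 3. A sketch under one illustrative rule (no open errors before sign-off); real criteria would come from the test plan:

```python
from collections import Counter

# Severity categories as named in activity 11.
SEVERITIES = ("minor", "significant", "mission-critical")

def summarise_errors(error_log):
    """Tally test errors by severity and flag readiness for sign-off.

    error_log: iterable of dicts like
        {"id": "E1", "severity": "minor", "remediated": True}
    In this sketch, sign-off is blocked while any error is neither
    remediated nor formally accepted.
    """
    counts = Counter(e["severity"] for e in error_log)
    open_errors = [e["id"] for e in error_log if not e["remediated"]]
    return {
        "by_severity": {s: counts.get(s, 0) for s in SEVERITIES},
        "open": open_errors,
        "ready_for_signoff": not open_errors,
    }
```

The per-severity tally doubles as the audit-trail summary communicated to stakeholders under activity 11.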

BAI07.BP6: BAI07.06 Promote to production and manage releases. Rating: N/P/L/F/N.A.
Promote the accepted solution to the business and operations. Where appropriate, run the solution as a pilot implementation or in parallel with the old solution for a defined period and compare behaviour and results. If significant problems occur, revert to the original environment based on the fallback/backout plan. Manage releases of solution components.

Are the following activities planned, done, checked and improved?

1. Prepare for the transfer of business procedures and supporting services, applications and infrastructure from testing to the production environment in accordance with organisational change management standards.
2. Determine the extent of pilot implementation or parallel processing of the old and new systems in line with the implementation plan.
3. Promptly update relevant business process and system documentation, configuration information and contingency plan documents, as appropriate.
4. Ensure that all media libraries are updated promptly with the version of the solution component being transferred from testing to the production environment. Archive the existing version and its supporting documentation. Ensure that promotion to production of systems, application software and infrastructure is under configuration control.
5. Where distribution of solution components is conducted electronically, control automated distribution to ensure that users are notified and distribution occurs only to authorised and correctly identified destinations. Include in the release process backout procedures to enable the distribution of changes to be reviewed in the event of a malfunction or error.
6. Where distribution takes physical form, keep a formal log of what items have been distributed, to whom, where they have been implemented, and when each has been updated.
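The formal distribution log of activity 6 (what was distributed, to whom, where, and when) reduces to a simple append-only record with per-component history. A minimal sketch, assuming an in-memory structure; a real implementation would persist entries and place them under configuration control as activity 4 requires:

```python
from datetime import datetime, timezone

class ReleaseLog:
    """Append-only distribution log: component, version, recipient,
    location, and a UTC timestamp per entry."""

    def __init__(self):
        self._entries = []

    def record(self, component, version, recipient, location):
        entry = {
            "component": component,
            "version": version,
            "recipient": recipient,
            "location": location,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        self._entries.append(entry)
        return entry

    def history(self, component):
        """Audit trail for one solution component, oldest first - this
        is what answers 'when was each item updated, and where'."""
        return [e for e in self._entries if e["component"] == component]
```

The same structure supports the backout procedures of activity 5: the previous entry in a component's history identifies the version to revert to.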

BAI07.BP7: BAI07.07 Provide early production support. Rating: N/P/L/F/N.A.
Provide early support to the users and IT operations for an agreed-on period of time to deal with issues and help stabilise the new solution.

Are the following activities planned, done, checked and improved?

1. Provide additional resources, as required, to end users and support personnel until the release has stabilised.
2. Provide additional IT systems resources, as required, until the release is in a stable operational environment.

BAI07.BP8: BAI07.08 Perform a post-implementation review. Rating: N/P/L/F/N.A.
Conduct a post-implementation review to confirm outcome and results, identify lessons learned, and develop an action plan. Evaluate and check the actual performance and outcomes of the new or changed service against the predicted performance and outcomes (i.e., the service expected by the user or customer).

Are the following activities planned, done, checked and improved?

1. Establish procedures to ensure that post-implementation reviews identify, assess and report on the extent to which:
   - Enterprise requirements have been met.
   - Expected benefits have been realised.
   - The system is considered usable.
   - Internal and external stakeholder expectations are met.
   - Unexpected impacts on the enterprise have occurred.
   - Key risk is mitigated.
   - The change management, installation and accreditation processes were performed effectively and efficiently.
2. Consult business process owners and IT technical management in the choice of metrics for measurement of success and achievement of requirements and benefits.
3. Conduct the post-implementation review in accordance with the organisational change management process. Engage business process owners and third parties, as appropriate.
4. Consider requirements for post-implementation review arising from outside business and IT (e.g., internal audit, ERM, compliance).
5. Agree on and implement an action plan to address issues identified in the post-implementation review. Engage business process owners and IT technical management in the development of the action plan.
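The comparison of actual against predicted performance called for in BAI07.08 can be made concrete as a per-metric check. A sketch only: the metric names and the uniform 10% tolerance are illustrative assumptions; real reviews would use the metrics and thresholds agreed with business process owners (activity 2):

```python
def review_outcomes(predicted, actual, tolerance=0.10):
    """Compare actual service metrics against predicted ones.

    A metric is 'met' if the actual value lies within the given relative
    tolerance of the prediction; metrics with no measurement are flagged
    so they feed the remedial action plan.
    """
    findings = {}
    for metric, expected in predicted.items():
        observed = actual.get(metric)
        if observed is None:
            findings[metric] = "not measured"
        elif abs(observed - expected) <= tolerance * abs(expected):
            findings[metric] = "met"
        else:
            findings[metric] = "deviation"
    return findings
```

Each "deviation" or "not measured" finding would become an item in the action plan of activity 5, with an owner and a due date.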

2.2. Work Products

Input Work Products
(For each input: source ID and name. The "Used (X)", "Name in the Organization" and "Location" columns are completed during the assessment.)

BAI07.01 Establish an implementation plan
  BAI01.09  Quality management plan
  BAI06.01  Change plan and schedule; Approved requests for change

BAI07.02 Plan business process, system and data conversion
  (no inputs listed)

BAI07.03 Plan acceptance tests
  BAI01.09  Requirements for independent verification of deliverables
  BAI03.07  Test procedures; Test plan
  BAI03.08  Test result communications; Test result logs and audit trails

BAI07.04 Establish a test environment
  (no inputs listed)

BAI07.05 Perform acceptance tests
  (no inputs listed)

BAI07.06 Promote to production and manage releases
  (no inputs listed)

BAI07.07 Provide early production support
  APO11.03  Review results of quality of service, including customer feedback
  BAI05.05  Success measures and results

BAI07.08 Perform a post-implementation review
  APO11.04  Results of quality reviews and audits
  APO11.05  Root causes of quality delivery failures; Results of solution and service delivery quality monitoring
  BAI05.05  Success measures and results

Output Work Products
(For each output: name and destination ID(s). The "Used (X)", "Name in the Organization" and "Location" columns are completed during the assessment.)

BAI07.01 Establish an implementation plan
  Approved implementation plan  (Internal)
  Implementation fallback and recovery process  (Internal)

BAI07.02 Plan business process, system and data conversion
  Migration plan  (DSS06.02)

BAI07.03 Plan acceptance tests
  Approved acceptance test plan  (BAI01.04; BAI01.08)

BAI07.04 Establish a test environment
  Test data  (Internal)

BAI07.05 Perform acceptance tests
  Test results log  (Internal)
  Evaluation of acceptance results  (BAI01.06)
  Approved acceptance and release for production  (BAI01.04)

BAI07.06 Promote to production and manage releases
  Release plan  (BAI10.01)
  Release log  (Internal)

BAI07.07 Provide early production support
  Supplemental support plan  (APO08.04; APO08.05; DSS02.04)

BAI07.08 Perform a post-implementation review
  Post-implementation review report  (BAI01.13; BAI01.14)
  Remedial action plan  (BAI01.13; BAI01.14)

2.3. Maturity Level 1 Assessment

Level 1: Performed Process

PA 1.1 Process Performance Attribute
The implemented process achieves its purpose. The Process Performance attribute is a measure of the extent to which the purpose is achieved. As a result of full achievement of this attribute, the process achieves its defined expected results.

GP 111 Achieve the Expected Results
Did this process reach its objectives and its expected results, and show some tangible evidence of process activities (e.g. information or documents)? What could be improved?

2.4. Maturity Level 2 Assessment

Level 2: Managed Process

PA 2.1 Performance Management Attribute
The performed process is now implemented in a managed fashion (planned, monitored, and adjusted) and its work products are appropriately established, controlled, and maintained. The Performance Management attribute is a measure of the extent to which the performance of the process is managed.

GP 211 Identify the Objectives for the Performance of the Process
Are the objectives of the process defined? Are there deadlines, constraints, or target values to reach? Who defines those objectives?

GP 212 Plan and Monitor the Performance of the Process to Fulfill the Identified Objectives
Is there a plan of actions to execute the process? What type of plan is it? Is there any monitoring of process activities against that plan? Are activities appropriately and regularly monitored? Is the time to perform the process estimated and tracked?

GP 213 Adjust the Performance of the Process
Is someone tracking the process activities? Does someone notice and react if they are not performed? Are there corrective actions when process results don't meet objectives (more resources, new planning)?

GP 214 Define Responsibilities and Authorities for Performing the Process
What are the roles and responsibilities that have been identified for this process? Are they clearly defined (even if not formally at this stage)? Are they communicated to everyone? Who is responsible for that?

GP 215 Identify and Make Available Resources to Perform the Process according to Plan
Do you think that existing HR are well trained and efficient? Are they numerous enough and available? Are other resources available when needed (monitoring, tracking, and reporting tools)?

GP 216 Manage the Interfaces Between Involved Parties
Who are the involved parties in the process? Who uses the outputs from this process (reporting, documents or information)? What type of communication is there between the different stakeholders? How is the coordination between the organizational functions involved in the process managed?

PA 2.2 Work Product Management Attribute
The Work Product Management attribute is a measure of the extent to which the work products produced by the process are appropriately managed.

GP 221 Define the Requirements for the Work Products
What are the documents/information used or resulting from the process? Do you have defined requirements and quality criteria for these documents (and information)? Do you know what they should contain? Are the inputs and outputs of the process based on templates (at your level)?

GP 222 Define the Requirements for Documentation and Control of the Work Products
Do you have specific rules to manage the process inputs and outputs? Are there rules for versioning? Document naming? Are there access rules for documents, information, databases?

GP 223 Identify, Document, and Control the Work Products
Are these rules applied in the field? Overall, do you have trouble handling documents? Does it happen that people do not have access to the information when they should, or that you cannot find the latest version of a document?

GP 224 Review and Adjust Work Products to Meet the Defined Requirements
Are there reviews of the documents and other outputs produced by the process? Do you compare the actual documents to their requirements and quality criteria? Is there a frequency for reviewing the documents and other outputs produced by the process? If any problem is detected, are corrective actions taken? How?

2.5. Work Products for Level 2

Work Products
(The "Used (X)", "Name in the Organization" and "Location" columns are completed during the assessment.)

PA 2.1 Performance Management Attribute
  3-00  Plan
  5-00  Record
  6-00  Report

PA 2.2 Work Product Management Attribute
  1-00  Object
  3-00  Plan
  5-00  Record
  8-00  Specification

2.6. Maturity Level 2 Assessment Level 3 Established PA 3.1 Definition Attribute The Managed is now implemented using a standard process definition capable of achieving its process Expected Results. The Definition attribute is a measure of the extent to which a standard process is defined and maintained to support the deployment of the. Is this process described formally? Are all the process activities described in a reference guide, a Quality or Service Management System (SMS or QMS), or in procedures? Do you have templates supporting the process? GP 311 Define the Standard that will support the Deployment of the GP 312 Determine the Sequence and Interaction between es so that they work as an Integrated System of es Are the connections between the process activities and other processes clearly identified and defined? Where? Do you have a diagram describing the relationships? 23

GP 313 Identify the Roles and Competencies for Performing the Standard Process
Is there a clear description of the competencies required for these activities? Are the various roles defined in job descriptions or competence profiles?

GP 314 Identify the Required Infrastructure and Work Environment for Performing the Standard Process
Is the required work environment appropriately identified and described? Are the tools required to support the process performance specified?

GP 315 Determine Suitable Methods to Monitor the Effectiveness and Suitability of the Standard Process
Is a mechanism defined to monitor the efficiency and suitability of the process across the organization? Have you identified the need for internal audits or management reviews? Are these internal audits or management reviews planned? Have you defined how to identify the weaknesses (and strengths) of your process, and how to benefit from them?

GP 316 Define and Agree upon the Content of Service Management Reports [ITIL 2011: Service Reporting]
Have you identified the different audiences for the service management reports? Have you defined the layout, contents, and frequency of service management reports (according to the audience)? Was this agreed with the business?

PA 3.2 Process Deployment Attribute
The Process Deployment attribute is a measure of the extent to which the standard process is effectively deployed to achieve its process Expected Results.

GP 321 Deploy a Standard Process that satisfies the context-specific Requirements from the Standard Process Definition
Does the process as implemented correspond to the one described formally? How do you organize the deployment of the process across the organization?

GP 322 Assign and Communicate Roles, Responsibilities, and Authorities for Performing the Standard Process
Are all the defined roles for this process assigned to someone? Are all the responsibilities identified and communicated? With an appropriate level of authority?

GP 323 Ensure necessary Competencies for Performing the Standard Process
Do you think that the people working on this process have the right level of knowledge or training? How do you ensure that the competence level is adequate? Do you look for particular skills when hiring people? Is there a training plan for the actors of this process? Are training sessions organized to improve the competence level?

GP 324 Provide Resources and Information to Support the Performance of the Standard Process
Are the resources and information required to implement the process available? Do you have enough financial and human resources? Are you missing any other resource?

GP 325 Provide Adequate Infrastructure to Support the Performance of the Standard Process
Does the work environment correspond to what has been formally defined for implementing this process? Do you have enough technical (hardware, software) resources? Do you have the appropriate tools?

GP 326 Collect and Analyze Data about the Performance of the Process to Demonstrate its Suitability and Effectiveness
Do you collect information on the performed activities? Do you collect and analyze data on the process implementation? For what purpose (to ensure that the process activities meet the requirements defined in the standard guide, or to check that the process is suitable and effective)? How do you analyze these data? Can they help determine process efficiency?

GP 327 Produce and Publish Service Management Reports [ITIL 2011: Service Reporting]
Do you produce service management reports according to the service reporting policies and rules? At what frequency? Do you communicate service management reports to interested parties? How?
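GP 316 and GP 327 together ask for reports whose content and frequency are defined per audience and then actually published on that schedule. A minimal sketch of such a report catalogue follows; the audiences, report contents, and intervals are invented for illustration and are not prescribed by ITIL or COBIT.

```python
from dataclasses import dataclass

# Illustrative only: audiences, contents and frequencies are assumptions,
# not values taken from ITIL 2011 or the questionnaire.
@dataclass
class ReportSpec:
    audience: str
    contents: list[str]
    frequency_days: int  # agreed publication interval

CATALOGUE = [
    ReportSpec("Business owners", ["SLA achievement", "major incidents"], 30),
    ReportSpec("Service desk",    ["ticket backlog", "first-call resolution"], 7),
    ReportSpec("IT management",   ["change success rate", "capacity trend"], 30),
]

def due_reports(days_since_last: dict[str, int]) -> list[str]:
    """Return the audiences whose agreed reporting interval has elapsed."""
    return [spec.audience for spec in CATALOGUE
            if days_since_last.get(spec.audience, 0) >= spec.frequency_days]

print(due_reports({"Business owners": 31, "Service desk": 3, "IT management": 30}))
# → ['Business owners', 'IT management']
```

Keeping the agreed layout, contents, and frequency in one place like this makes "was it agreed with the business?" auditable.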

2.7. Work Products for Level 3

ID     Name            Used (X)    Work Product Name in the Organization    Location
PA 3.1 Process Definition Attribute
2-00   Description
3-00   Plan
4-00   Procedure
5-00   Record
8-00   Specification
PA 3.2 Process Deployment Attribute
2-00   Description
3-00   Plan
5-00   Record
6-00   Report
8-00   Specification

2.8. Maturity Level 4 Assessment

Level 4 Predictable
The Established Process now operates within defined limits to achieve its process Expected Results.

PA 4.1 Process Measurement Attribute
The Process Measurement attribute is a measure of the extent to which measurement results are used to ensure that the performance of the process supports the achievement of the relevant process performance objectives in support of defined business goals.

GP 411 Identify Process Information Needs, in Relation with Business Goals
Are the business goals known? What process information is necessary to determine whether the business goals are met?

GP 412 Derive Process Measurement Objectives from Information Needs
What needs to be measured in this process? Are the reports produced compatible with the process objectives and business goals?

GP 413 Establish Quantitative Objectives for the Performance of the Standard Process, according to the Alignment of the Process with the Business Goals
Can you give examples of quantitative process objectives that you follow?

GP 414 Identify Product and Process Measures that support the Achievement of the Quantitative Objectives for Process Performance
What metrics are measured, and at what frequency? With these measures, can you determine whether the process reaches its quantitative objectives?

GP 415 Collect Product and Process Measurement Results through performing the Standard Process
How are these measures collected? Who is responsible for collecting and reporting the measures? Are they measured automatically? How hard is it, and how long does it take, to obtain these measures?
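A common way to make quantitative objectives (GP 413) checkable against collected measures (GP 414/415) is a simple objectives table evaluated against each measurement cycle. The metric names and thresholds below are illustrative assumptions for a change-acceptance process, not values from the questionnaire.

```python
# Hypothetical quantitative objectives; names and thresholds are
# illustrative assumptions only.
OBJECTIVES = {
    "test_pass_rate":      ("min", 0.95),  # at least 95 % of acceptance tests pass
    "rollback_rate":       ("max", 0.05),  # at most 5 % of releases rolled back
    "mean_deploy_minutes": ("max", 60.0),  # average deployment under one hour
}

def check_objectives(measures: dict[str, float]) -> dict[str, bool]:
    """Map each metric to True if its quantitative objective is met."""
    results = {}
    for name, (kind, limit) in OBJECTIVES.items():
        value = measures[name]
        results[name] = value >= limit if kind == "min" else value <= limit
    return results

print(check_objectives({"test_pass_rate": 0.97,
                        "rollback_rate": 0.08,
                        "mean_deploy_minutes": 45.0}))
# → {'test_pass_rate': True, 'rollback_rate': False, 'mean_deploy_minutes': True}
```

An assessor can then ask for exactly this kind of pass/fail view when probing whether the process reaches its quantitative objectives.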

GP 416 Use the Results of the Defined Measurement to Monitor and Verify the Achievement of the Process Performance Objectives
Who analyzes the measures? Are the measurement results used to verify the achievement of the process performance objectives? Is there, somewhere, a summary of the evolution of the process indicators?

PA 4.2 Process Control Attribute
The Process Control attribute is a measure of the extent to which the process is quantitatively managed to produce a process that is stable, capable, and predictable within defined limits.

GP 421 Determine Analysis and Control Techniques, appropriate to control the Process Performance
Are there regular statistical analyses of the process performance? Have you defined techniques to control the process performance?

GP 422 Define Parameters suitable to Control the Process Performance
Are performance limits defined for the process? Where do those limits come from? How were they defined?

GP 423 Analyze Process and Product Measurement Results to Identify Variations in Process Performance
Does the analysis of measurement data enable the identification of problems encountered in the implementation of the process itself? Do you compare the process performance to the defined limits? Do you identify the root causes of process performance variations?

GP 424 Identify and Implement Corrective Actions to Address Assignable Causes
How are special causes of variation treated? How are corrective actions implemented?

GP 425 Re-Establish Control Limits following Corrective Actions
After the implementation of corrective actions, are new limits defined for the process performance?
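Questions about defined limits, variations, and special causes (GP 421–425) typically assume a Shewhart-style control chart. As an illustration only, here is a minimal sketch of 3-sigma limits computed from a stable baseline period, with out-of-limit points flagged as candidates for special-cause analysis; the baseline figures are invented.

```python
from statistics import mean, stdev

def control_limits(history: list[float], sigmas: float = 3.0) -> tuple[float, float]:
    """Shewhart-style limits: mean ± k·sigma of a stable baseline period."""
    centre, spread = mean(history), stdev(history)
    return centre - sigmas * spread, centre + sigmas * spread

def out_of_control(history: list[float], new_points: list[float]) -> list[float]:
    """Flag points outside the limits - candidates for special-cause analysis."""
    lo, hi = control_limits(history)
    return [x for x in new_points if not (lo <= x <= hi)]

# Invented example: lead time (hours) of change deployments during a
# period judged stable, then three new observations.
baseline = [42, 45, 43, 44, 46, 44, 43, 45, 44, 43]
print(out_of_control(baseline, [44, 47, 62]))  # → [62]
```

After a corrective action removes a special cause, GP 425 corresponds to recomputing the limits over a fresh baseline collected under the improved process.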

2.9. Work Products for Level 4

ID     Name            Used (X)    Work Product Name in the Organization    Location
PA 4.1 Process Measurement Attribute
2-00   Description
3-00   Plan
5-00   Record
6-00   Report
8-00   Specification
PA 4.2 Process Control Attribute
2-00   Description
3-00   Plan
5-00   Record
6-00   Report

2.10. Maturity Level 5 Assessment

Level 5 Optimizing
The Predictable Process is continuously improved to meet relevant, current, and projected business goals.

PA 5.1 Process Innovation Attribute
The Process Innovation attribute is a measure of the extent to which changes to the process are identified from the analysis of common causes of variation in performance, and from investigations of innovative approaches to the definition and deployment of the process.

GP 511 Define the Improvement Objectives for the Process that Support the Relevant Business Goals
Do you define improvement objectives for the process based on your business goals?

GP 512 Analyze Measurement Data of the Process to Identify Real and Potential Variations in the Process Performance
Is the process reviewed to identify potential sources of variation? Do you identify improvement opportunities based on the analysis of these variations?

GP 513 Identify Improvement Opportunities of the Process Based on Innovation and Best Practices
Do you keep watch on innovation and the evolution of best practices in the process domain to identify improvement opportunities?

GP 514 Derive Improvement Opportunities of the Process from New Technologies and Process Concepts
Are there reviews of new technologies? Do you define improvement opportunities for the process based on new technologies?

GP 515 Define an Implementation Strategy based on a Long-term Improvement Vision and Objectives
Is there a defined improvement strategy for the process?

PA 5.2 Process Optimization Attribute
The Process Optimization attribute is a measure of the extent to which changes to the definition, management, and performance of the process result in an effective impact that achieves the relevant process improvement objectives.

GP 521 Assess the Impact of Each Proposed Change against the Objectives of the Defined and Standard Process
How do you assess the proposed improvements to the standard process?

GP 522 Manage the Implementation of agreed changes to selected areas of the Defined and Standard Process according to the Implementation Strategy
How are the process improvements implemented? What are the steps? Who implements the process improvement changes? Who will follow up and check the positive or negative effects?

GP 523 Evaluate the Effectiveness of Process Change on the basis of actual Performance against Process Performance and Capability Objectives and Business Goals
When improvement changes are implemented, is the effectiveness of the implementation assessed? Who is in charge (top management, the process owner)?
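Evaluating the effectiveness of an implemented change (GP 523) comes down to comparing actual performance before and after the change against the improvement objective. A naive sketch follows; the metric and figures are invented, and a real assessment would normally apply a proper statistical significance test rather than a bare comparison of means.

```python
from statistics import mean

def change_effect(before: list[float], after: list[float]) -> dict:
    """Naive before/after comparison of a single process metric.
    Assumes lower values are better; a real evaluation would also
    test whether the difference is statistically significant."""
    b, a = mean(before), mean(after)
    return {"before": b, "after": a, "improved": a < b}

# Invented data: deployment failures per month, before and after an
# (assumed) improvement to the test environment.
print(change_effect(before=[8, 7, 9, 8], after=[4, 5, 3, 4]))
# → {'before': 8.0, 'after': 4.0, 'improved': True}
```

Whoever is in charge of the evaluation (top management or the process owner) can use such a comparison as the factual basis for accepting or rolling back the change.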

2.11. Work Products for Level 5

ID     Name            Used (X)    Work Product Name in the Organization    Location
PA 5.1 Process Innovation Attribute
2-00   Description
3-00   Plan
4-00   Procedure
5-00   Record
6-00   Report
8-00   Specification
PA 5.2 Process Optimization Attribute
2-00   Description
3-00   Plan
5-00   Record
6-00   Report
8-00   Specification