A framework for the comparison of Maturity Models for Project-based Management


Utrecht University

A framework for the comparison of Maturity Models for Project-based Management

Tjie-Jau Man, BSc.
Student number: 027972
Thesis number: INF/SCR-07-07

Utrecht University supervisors: Lidwien v/d Wijngaert, Sjaak Brinkkemper
Capgemini supervisors: Chris ten Zweege, Erwin Dunnink

September 2, 2007
Capgemini

Preface

Writing a thesis is unmistakably one of the hardest tasks that a graduating student has to deal with. Despite the enthusiasm at the beginning of the venture, you inevitably end up counting the days before the final deadline (that is, if you have decided upon one). As comical as it might sound, the whole process of doing the thesis assignment can be summarized with three consecutive thoughts (at least in my case). First, you will think that other students are exaggerating, since there is no way you would take that much time to finish your thesis. After all, you are positive about the subject you have chosen. After several months, you will begin to wonder if you are on the right track, because something just doesn't feel right. And then, finally, when you are nearing your final deadline, you will most probably wish you had taken the assignment more seriously from the beginning.

Fortunately, support is provided to alleviate the recurring dips of motivation and self-discipline during this long and cumbersome journey towards graduation. In my case, I wish to thank my supervisors at Utrecht University, Lidwien van de Wijngaert and Sjaak Brinkkemper, for their guidance. I also want to thank the people at Capgemini Netherlands for providing me with a resourceful environment in which to conduct my thesis assignment, especially my supervisors Chris ten Zweege and Erwin Dunnink. Likewise, to all members of the PMI project group: please accept my thanks for your cooperation and helpful feedback during and after the project meetings. And last but certainly not least, I want to thank my parents for supporting me from the very beginning. Without them, I would never have come this far.

Zevenaar, September 2007
Tjie-Jau Man

Abstract

To conduct project-based management (PM) as successfully as possible, it is fundamental for organizations to invest time and effort in constructing the necessary infrastructure, such as organizational structure, policies and competencies of people. Over time, more advanced organizations may wonder where they exactly stand in the whole process and what they should do to make further advancements. Maturity models for PM have been developed to assist organizations with these questions. By comparing their own practices against the best practices described by these models, organizations can find out how mature or professionalized they are in performing project-based management and what they could do to realize desirable improvements. However, with more than 20 maturity models available in the field of PM, organizations have to consider carefully which one to adopt. In order to do this, organizations need to know what aspects of these models are important to consider and how they should evaluate them.

In this thesis, research is done on dimensions relevant to the evaluation of maturity models for PM. This set the stage for the selection of the measures needed to evaluate similarities and differences between maturity models for PM. The research showed that maturity models for PM can be evaluated along three dimensions: structure, applicability and usage. Three measures were selected to operationalize these dimensions, in the same respective order: Process-Data Diagrams, evaluation criteria and user interviews. These measures formed a framework that was applied to several maturity models for PM to determine its quality. The framework and its constituent measures proved useful in shedding light on the relevant similarities and differences between the models. It was able to show the strengths and weaknesses of each evaluated maturity model, which should be considered by organizations planning to adopt them.

Table of contents

1. Introduction
   1.1. Research question
   1.2. Scientific & societal relevance
   1.3. Research approach
2. Maturity models for project-based management
   2.1. Project-based management
   2.2. Maturity models
   2.3. Maturity models for PM
   2.4. Maturity model selection
3. The evaluation framework
   3.1. Structure dimension
      3.1.1. Process-Data Diagram (PDD) modeling
      3.1.2. PDD comparison method
   3.2. Applicability dimension
      3.2.1. Evaluation criteria
      3.2.2. Criteria comparison method
   3.3. Usage dimension
      3.3.1. Interviews
4. Analysis & results
   4.1. Structure analysis
      4.1.1. Maturity reference model structure
      4.1.2. Assessment method structure
      4.1.3. Structure comparison summary
   4.2. Applicability analysis
      4.2.1. Maturity reference model
      4.2.2. Assessment method
   4.3. Usage analysis
5. Discussion & conclusions
   5.1. Framework requirements
   5.2. Suggestions for situational selection
   5.3. Recommendations for further research
References
Appendix
   Appendix A-1: PMI project group members and roles
   Appendix A-2: Consulted experts
   Appendix B: Process-Data Diagram modeling
   Appendix D-1: Assessor & user sample questionnaires
   Appendix D-2: Summary interview findings
   Appendix E: Process-Data Diagrams
   Appendix F: Concept and activity tables
   Appendix G: Criteria results per model

1. Introduction

In general, there are two reasons why it is beneficial for organizations to adopt a maturity model for project-based management, which includes the management of projects, programs and portfolios. Ever since organizations began to adopt the project-based way of conducting business, they have strived to deliver projects successfully. To do this, organizations require the necessary infrastructure, which includes processes (methods and techniques), governance structures, competences of people and tools [1]. Developing such an infrastructure may take several years, and because of this, more advanced organizations may start to wonder after a while where they exactly stand in the whole process and whether they are going the right way. This is when the adoption of a maturity model proves useful. A maturity model is able to assist organizations in verifying what they have achieved by describing activities and best practices and categorizing these descriptions into progressive levels of maturity.

The second benefit of adopting a maturity model becomes apparent when an organization has finished assessing its current practices and aims for advancement to a desired level of maturity [2]. By comparing the results of a maturity assessment with the descriptions in a maturity model, an organization gains insight into its strengths and weaknesses and is able to prioritize its actions to make improvements. In addition to the above arguments, the execution of a maturity assessment in itself raises awareness about what can be improved within an organization. In other words, members of an organization will focus more on the inefficiencies of their ways of working simply because they know they are being assessed.

Besides deciding whether a maturity model should be adopted, an equally important decision concerns the choice between the maturity models on offer for project-based management. Many maturity models have emerged since the mid-1990s [3], and one question that can be asked here is how organizations should evaluate them in order to select an appropriate maturity model. As an attempt to answer this question, a project group was gathered by Project Management Institute Nederland (PMI-NL) and given the assignment to publish a book in which different maturity models for project-based management are compared with each other.

Closely aligned to this assignment, research was done on maturity models for project-based management. A framework was developed to support the evaluation of, and comparison between, such models. This framework was applied to compare three maturity models for project-based management with each other.

1.1. Research question

The purpose of this thesis is to develop a framework to evaluate and compare maturity models for project-based management. The central research question of this exploratory thesis research is formulated as follows:

What measures are needed to evaluate the similarities and differences between maturity models for project-based management?

The following sub-questions set the stage for answering the main research question.

1. What is a maturity model for project-based management? Before analyzing a maturity model for project-based management, it is important to understand the reasons behind its existence and what this concept means.

2. What constitutes a maturity model for project-based management? Examining the components that constitute a maturity model sets the stage for determining important aspects or dimensions along which they can be evaluated.

3. What are relevant dimensions to the evaluation of maturity models for project-based management? These dimensions provide guidance for the selection of the measures that facilitate the comparison between the maturity models. They will ultimately form the evaluation framework mentioned earlier.

4. What are the main similarities and differences between maturity models for project-based management? This sub-question focuses on the main similarities and differences found after evaluating several maturity models for project-based management with the developed framework.

1.2. Scientific & societal relevance

This thesis research explores the aspects of maturity models for project-based management that are relevant to distinguishing them from one another. It is meant to shed light on the strengths and weaknesses of these models and how these affect their applicability in certain situations. However, it is not meant to indicate superiority or inferiority among them. The main reason behind this thesis lies in the assumption that organizations should consider carefully which maturity model to adopt. Maturity models in general are measurement tools used to assess and/or improve an organization's processes. Depending on the match between a maturity model and an organization's situation, the organization may end up assessing different capabilities than initially planned. This could affect the outcomes of the maturity assessment and may cause an organization to overlook important weaknesses in its current processes.

Supporting the above assumption is contingency theory, a theory that takes many forms in the world of research. The earliest and most extensively researched form of contingency theory was introduced by Fiedler in the 1960s; it explains group performance as a result of the interaction of two factors: leadership style and situational favorableness [4][5]. Since then, the contingency approach has been applied and adapted many times [6][7][8][9]. A list of scientific articles that use contingency theory is given in [10]. Similar to what other researchers have done, this thesis applies contingency theory to maturity models for project-based management. As mentioned before, the focus here is on finding measures to elicit similarities and differences among these maturity models. The results of this research could set the stage for further research on the possible contingency, or fit, between the models and organizational situations, relating this to performance variables.

Also, the differences found with the measures may prove useful in future research efforts on the categorization of maturity models for project-based management. And if the framework proves useful for the comparisons between them, it may provide a foundation to evaluate and compare other maturity models of the same discipline.

1.3. Research approach

The thesis research starts with a review of literature and scientific papers to gather information about maturity models for project-based management. The next chapter of this report provides the necessary background information regarding this concept (Chapter 2). Chapter 2 also provides a long-list of existing maturity models for project-based management. From this long-list, several maturity models are selected for the purpose of testing the framework to be developed. The selection process and the selected maturity models are described in the final paragraphs of Chapter 2.

The examination of literature continues, in combination with expert consultations, to find the dimensions in which maturity models may differ from each other. The experts here include all members of the project group and individuals who have had prior experience with maturity models for project-based management. Both the selected dimensions and the evaluation framework that comprises them are explained in Chapter 3. After the description of the framework, it is applied to several selected maturity models for project-based management to test the framework and the dimensions. The analysis and results of this application can be found in Chapter 4.

To determine the quality of the evaluation framework, we have adopted the five requirements from the field of situational method engineering [1]. These requirements are used to assess the quality of a method assembled from method fragments to suit a situation specific to a project. As our framework describes a method to evaluate and compare maturity models for project-based management, these requirements can be used for its evaluation. The five requirements are defined as follows for this thesis:

- Completeness: the framework describes all relevant dimensions for the evaluation of maturity models for project-based management;
- Consistency: all activities and concepts are consistently defined and described throughout the framework;
- Applicability: the researchers are able to execute the method described by the framework;
- Reliability: the method described by the framework is semantically correct and meaningful;
- Efficiency: the method described by the framework can be performed at minimal cost and effort.

This thesis concludes with the elaboration of discussion topics and conclusions based on the analysis results, and a description of the framework's quality based on the above requirements (Chapter 5).

2. Maturity models for project-based management

In order to understand the concept of maturity models for project-based management, it is necessary to explain the two smaller concepts that constitute it: project-based management and maturity model. These two concepts are elaborated in the following subsections.

2.1. Project-based management

As mentioned in the introduction, the term project-based management refers to all three domains of this discipline: project management, program management and portfolio management [2]. From here on, the abbreviation PM will be used when referring to all three domains; in cases where the abbreviation is less appropriate, the term project-based management will be used. To understand the concept of PM, it is necessary to explain the three domains constituting it. This is done in the following paragraphs.

A project is a temporary endeavor with a definite beginning and end. To complete a project successfully, it needs to meet the requirements predefined by all stakeholders and deliver a product or service different in some unique way from all similar products or services [3]. The Project Management Body of Knowledge (PMBOK) of the Project Management Institute (PMI) uses the following broad definition for the management of projects: "the application of knowledge, skills, tools and techniques to activities within a project in order to meet or exceed stakeholders' needs and expectations" [3]. Although this definition is similar to that of process management [4], project management differs in that it is concerned with managing (collections of) temporary undertakings rather than ongoing activities.

Programs differ from projects in that they are carried out to achieve specific strategic business objectives or goals. Or, as formulated in [2], a program's focus is on producing an end state consistent with organizational strategic objectives, in accordance with a vision. An example of an end state is the realization of a 5% cost reduction throughout the entire organization. To achieve this, a program will consist of a number of projects or functional activities, including, for example, the implementation of a new logistics system and the development of a new IT system [2]. In addition, while the aforementioned project is successful when the logistics system is implemented according to its specifications, the completion of the above program depends on the realization of the 5% cost reduction, not on the delivery of a new IT or logistics system. According to Wikipedia [5], the management of programs is "the process of managing multiple ongoing inter-dependent projects. (It) focuses on selecting the best group of programs, defining them in terms of their constituent projects and providing an infrastructure where projects can be run successfully but leaving project management to the project management community."

However, due to physical or financial constraints, organizations cannot undertake all projects or programs on their to-do list at the same time. That is when the creation of a portfolio becomes relevant. A portfolio, in terms of PM, is a collection of projects and/or programs grouped together to meet strategic objectives [2]. Portfolio management is the centralized management of one or more portfolios. This domain is all about the prioritization, facilitation and management of projects and programs based on their alignment to the business strategy of an organization.

According to [6], there has been an increase in the number of organizations employing PM. The growing popularity is also reflected by the large number of publications written about this concept. While some of them focus on defining the success of a project [17][18][19][20], others try to shed light on the various factors that might influence the effectiveness of PM [21][22][23]. The emergence of this literature points to the need to professionalize PM such that organizations can

undertake successful projects continuously. This brings us to the discussion of maturity models, which are developed to facilitate the improvement of how PM is undertaken.

2.2. Maturity models

The literature has paid a considerable amount of attention to the concept of maturity models [2][3][24][25][26][27]. This is because a maturity model allows an organization to assess and compare its own practices against best practices or those employed by competitors, with the intention to map out a structured path to improvement [3]. Basically, a maturity model is a framework describing the ideal progression toward desired improvement using several successive stages or levels. Note that an organization in the context of maturity models for PM does not necessarily refer to an entire company. A maturity model can also be applied to a business unit, functional group or department.

One well-known maturity model is the Capability Maturity Model (CMM), introduced by the Software Engineering Institute (SEI). This model was later replaced by its successor, the Capability Maturity Model Integration (CMMI) [28]. The development of capability maturity models has inspired the emergence of other maturity models in the field of software development. Examples are the Test Process Improvement (TPI) model developed by Sogeti [29] and the Usability Maturity Model [30].

2.3. Maturity models for PM

The existence of CMMI has also led to the development of maturity models for PM. Because of the role that PM plays in the software development process, many of the concepts of maturity incorporated in capability maturity models, such as the CMMI, were adopted by maturity models that emerged in the field of PM [3]. Building on what was explained about maturity models earlier, maturity models for PM are used to measure the degree to which an organization is executing PM by comparing its PM practices against practices in general or best practices. These models describe how mature or professionalized organizations are in conducting PM and what they could do to improve their way of working.
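The staged-levels idea shared by CMM-style models can be sketched in a few lines of Python. This is an illustrative toy, not part of any model discussed in this thesis: the practice names are invented, and real models define far richer criteria. The core rule is that an organization sits at the highest level for which it satisfies all required practices of that level and of every level below it.

```python
# Hypothetical sketch of a staged maturity assessment. The levels and
# practice names below are invented for illustration only.

LEVEL_PRACTICES = {
    2: {"project planning", "requirements management"},
    3: {"organization-wide process definition", "training program"},
    4: {"quantitative process management"},
    5: {"continuous process improvement"},
}

def maturity_level(satisfied: set) -> int:
    """Return the staged maturity level implied by a set of satisfied practices."""
    level = 1  # level 1 ("initial") has no required practices
    for lvl in sorted(LEVEL_PRACTICES):
        if LEVEL_PRACTICES[lvl] <= satisfied:  # all practices of this level met?
            level = lvl
        else:
            break  # staged models do not allow skipping levels
    return level
```

For example, an organization that has institutionalized planning and requirements management but nothing further would be placed at level 2, while one missing any level-2 practice stays at level 1 regardless of which higher-level practices it happens to satisfy.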

According to [3], there is no generally agreed definition of what a mature project-based organization looks like. In spite of this, the current number of maturity models for PM is estimated at 30 [3]. During this research, an attempt was made to construct a long-list of existing maturity models for PM. This list is depicted in Table 1, along with the models' names and owners.

Table 1: Long-list of existing maturity models for PM (Nr. Acronym - Name - Owner)

1. OPM3 - Organizational Project Management Maturity Model - Project Management Institute (PMI)
2. P3M3 - Portfolio, Programme, Project Management Maturity Model - Office of Government Commerce (OGC)
3. P2M - Project & Program Management for Enterprise Innovation (P2M) - Project Management Association of Japan (PMAJ)
4. PMMM - Project Management Maturity Model - PM Solutions
5. PPMMM - Project Portfolio Management Maturity Model - PM Solutions
6. PMMM - Programme Management Maturity Model - Programme Management Group
7. PMMM - Project Management Maturity Model - KLR Consulting
8. (PM)2 - The Berkeley Project Management Process Maturity Model - Department of Civil Engineering, University of California at Berkeley
9. ProMMM - Project Management Maturity Model - Project Management Professional Solutions Limited
10. MINCE2 - Maturity Increments IN Controlled Environments - MINCE2 Foundation
11. PPMM - Project and Portfolio Management Maturity - PriceWaterhouseCoopers (PWC) Belgium
12. CMMI - Capability Maturity Model Integration - Software Engineering Institute (SEI)
13. SPICE - Software Process Improvement and Capability determination - Software Quality Institute, Griffith University, Australia
14. FAA-iCMM - Federal Aviation Administration Integrated Capability Maturity Model - US Federal Aviation Administration
15. Trillium - Trillium - Bell Canada
16. EFQM - EFQM Excellence Model - European Foundation for Quality Management (EFQM)
17. COBIT - Control Objectives for Information and related Technology - Information Systems Audit and Control Association (ISACA)
18. INK - INK Managementmodel - Instituut Nederlandse Kwaliteit (INK)
19. ProjectProof - VA Volwassenheidsmodel - Van Aetsveld
20. PAM - Project Activity Model - Artemis
21. Project Excellence Model - The Project Excellence Model - Berenschot
22. PMMM - Project Management Maturity Model (H. Kerzner) - International Institute for Learning (IIL)

Maturity models differ from one another in the concepts they embody and the suggestions they make as to what the path to maturity looks like [22]. Different maturity models for PM may define maturity differently and measure different things to determine maturity. Because of this, organizations should give careful consideration to the selection of a maturity model.

2.4. Maturity model selection

In order to select maturity models to test the framework to be developed, each member of the project group was asked to rate each maturity model on the long-list against several criteria. Information regarding the members of the project group is included in the Appendix section of this thesis (see Appendix A-1). The maturity models were rated using the following criteria:

- Method independency: the degree to which a maturity model is independent of a particular PM methodology.
- Public domain: the degree to which a maturity model and its maturity assessment can be applied by anyone besides its owners.
- Publication: the degree to which a maturity model is issued in publications.
- Industry independency: the degree to which the application of a maturity model is not limited to particular industry sectors.
- Transparency: the traceability of the calculation of the maturity scores.
- Toolset independency: the degree to which the usage of a maturity model is not bound to a toolset.
- Years of existence: how many years a maturity model has existed.
- Ease of use: the degree to which a maturity model is easy to use in practice.

A candidate maturity model should at least be publicly available (possibly against moderate payment) through publication in a book, electronically or in print. This ensures that the information needed about a maturity model is accessible when it is evaluated using the framework. The table below shows the average scores given to the maturity models.

Table 2: Scoring table of the maturity models for PM. [Average score per maturity model, numbered 1-22 as in Table 1, on each of the eight selection criteria; the individual cell values are illegible in this transcription and are omitted.]

The maturity models were rated with scores ranging from 0 to 10, where 0 indicates "Hardly" and 10 indicates "Completely". Grayed-out maturity models are those excluded from the long-list based on their accessibility. The maturity models MINCE2 and the PM Solutions PMMM had not been examined yet due to the limitations of time.

The short-list that resulted from the selection process consisted of the following maturity models:

- Organizational Project Management Maturity Model (OPM3) [2]
- Capability Maturity Model Integration (CMMI-DEV) [28]
- Kerzner Project Management Maturity Model (PMMM) [25]
- Project, Programme, Portfolio Management Maturity Model (P3M3) [32]
- Maturity Increments IN Controlled Environments (MINCE2) [33]
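The scoring step behind such a table is, at its core, an averaging of per-rater scores into one group score per model and criterion. A minimal sketch of that step, using invented example ratings rather than the project group's actual figures (the function name and data layout are assumptions for illustration):

```python
from statistics import mean

def average_scores(ratings):
    """Average per-rater scores into a group scoring table.

    ratings: one dict per rater, mapping model -> {criterion: score in 0..10}.
    Returns: {model: {criterion: average score}}.
    """
    collected = {}
    for rater in ratings:
        for model, scores in rater.items():
            for criterion, score in scores.items():
                collected.setdefault(model, {}).setdefault(criterion, []).append(score)
    return {model: {crit: mean(vals) for crit, vals in crits.items()}
            for model, crits in collected.items()}
```

With the averages computed, short-listing amounts to comparing each model's row against the accessibility requirement and the group's preferences.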

The first three models of the list, OPM3, CMMI and PMMM, were selected for this thesis. Materials provided by the following institutions are used for the respective maturity models: the Project Management Institute (PMI) for OPM3, the International Institute for Learning (IIL) [34] for PMMM and the Software Engineering Institute (SEI) for CMMI. A brief background of each model is provided in the following sections, including the reasons why it was or was not selected to test the framework.

2.4.1. OPM3

OPM3 is an acronym for Organizational Project Management Maturity Model. It is a standard developed under the stewardship of the Project Management Institute (PMI) and introduced in December 2003. The development of this standard was inspired by the increasing interest in a maturity model that shows a step-by-step method of improving and maintaining an organization's ability to translate organizational strategy into the successful and consistent delivery of projects. In other words, OPM3 is meant to enable organizations to bridge the gap between organizational strategy and successful projects [35]. The purpose of OPM3 is not to prescribe what kind of improvements users should make or how they should make them. Rather, by providing a broad-based set of organizational project management (OPM) best practices, this standard allows an organization to use it as a basis for study and self-examination, and consequently to make its own informed decision regarding potential initiatives for change [2]. The standard comprises three interrelated elements:
- Knowledge. In this element, the user becomes proficient with OPM3: comfortable with the body of best-practices knowledge it contains, with the ideas of OPM and OPM maturity, and with the concepts and methodology of OPM3.
- Assessment. The organization is compared to OPM3 in this element to determine its current location on a continuum of OPM maturity.

- Improvement. Here, organizations can decide to move ahead with change initiatives leading to increased maturity, using the results of the assessment as a basis for planning.
OPM3 is represented by two complementary parts: the Foundation and the ProductSuite. The Foundation was developed by PMI and the ProductSuite by PMI in collaboration with Det Norske Veritas (DNV) [36], an international consulting firm. The Foundation is a description of the OPM3 model itself, readily available to all organizations interested in knowing about the model. The ProductSuite, on the other hand, describes how the model should be applied and what steps are taken during a maturity assessment. Access to the latter is limited to those who enroll for training sessions to become certified assessors, while the Foundation can be purchased in the form of a book. Both sources are consulted during the evaluation of OPM3. In cases where a distinction must be made between the two, the abbreviations OPM3-PS and OPM3-F will indicate information derived from the ProductSuite and the Foundation respectively. OPM3 was selected for the evaluation because of its popularity in the field of PM. Information about this model was readily accessible since the chairman of the project group is a certified OPM3 assessor. The model is closely aligned to the PMBOK [2], which is a well-accepted standard approach for project management. Besides this, additional PMBOK guides have recently been developed to describe approaches to program and portfolio management. These additions are also embedded into OPM3.

2.4.2. CMMI

CMMI stands for Capability Maturity Model Integration. Its first version (1.1) was introduced by the Software Engineering Institute (SEI) in 2002 as the successor of the Capability Maturity Model (CMM), which was developed from 1987 until 1997.
The SEI (2007) defines CMMI as a process improvement approach that helps organizations integrate separate functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes.

The latest version of CMMI (1.2), released in 2006, comprises a framework that allows the generation of multiple models, training courses and appraisal methods supporting specific areas of interest. CMMI for Development is one of those models and provides guidance for managing, measuring, and monitoring software development processes. The development of CMM and CMMI was based on the premise that the quality of a system is highly influenced by the quality of the process used to acquire, develop and maintain it [28]. These models comprise the essential elements of effective processes for one or more disciplines (e.g. software development) and describe an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness. The fundamental idea behind this is that even the finest people within an organization cannot perform their best if the process is not understood or operating at its best [28]. The purpose of the CMMI for Development model is to assist organizations in enhancing their software development processes for both products and services by describing characteristics of best practices. CMMI was included in this thesis because of its rich history and worldwide acceptance. The model's long history sped up the search for information and for experts who were willing to share their knowledge about it. In particular, CMMI for Development was selected because of its availability: the model information was readily downloadable from the World Wide Web. For the sake of brevity, the acronym CMMI will be used to refer to CMMI for Development for the rest of this thesis. During the evaluation, documents of SEI's CMMI and the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) [37] are consulted. It should be noted that SEI is not the only institution that provides CMMI maturity assessments.
While SEI's assessment method is the only one being evaluated for CMMI, it is not the only one that exists. Thus, the findings here do not necessarily account for the procedures employed by other institutions that also provide CMMI assessments.

2.4.3. PMMM

The Project Management Maturity Model (PMMM) was introduced by H. Kerzner in 1998. The first edition of his book describing this model was published in 2001; the second edition followed in 2005. PMMM is a practical, PMBOK-aligned standard [25]. The model sets out various levels or stages of development towards project management maturity, along with assessment instruments to validate how far along the maturity curve an organization has progressed. The original intent of the PMMM is to provide organizations with a framework for creating an organization-specific maturity model. Each organization can have a different approach to maturity, which is why organizations are allowed to adapt the questions and answers of the PMMM questionnaire [25]. The physical part of the model consists of a book and an online assessment tool. Both components serve to provide individual assessment participants and their organizations with:
- a breakdown of how they are doing in the different categories of each maturity level;
- a comparison of overall results against those of other companies and individuals who have taken the assessment; and
- a high-level prescriptive action plan to follow for individual and organizational improvement.
This model was chosen because of its simplicity and availability. It is also interesting to examine the differences between PMMM and OPM3, since both are aligned to the PMBOK.

2.4.4. P3M3

P3M3 stands for Portfolio, Program and Project Management Maturity Model. It was formally published in February 2006 by the Office of Government Commerce (OGC) after some refinements were made [27]. The model describes the portfolio, program

and project-related activities within process areas that contribute to achieving desirable project outcomes. Although P3M3 was eligible to be included in this thesis, access to information about the model was only granted to people at accredited institutions. Several attempts were made to contact these accredited institutions, but most of them could not fulfill the request for information due to license agreements. Some of the contacted persons agreed to answer questions relevant to conducting the evaluation, but because this took place when most of the research was already done, it was no longer possible to include P3M3 in the thesis.

2.4.5. MINCE2

The last candidate maturity model for PM was MINCE2, an acronym for Maturity INcrements IN Controlled Environments 2. The MINCE2 Foundation (established in May 2007) developed this model in order to:
- determine the project maturity level an organization is in;
- report in a standardized way regarding the findings; and
- indicate what to do in order to increase the maturity [33].
MINCE2 was not included in the thesis because, at the moment of the maturity model selection, the owners of the model were still working on its publication. All details about MINCE2 were to be published in August 2007. So although one of the project group members had access to information about the model, it could not be used for the thesis research due to publishing rights. In this chapter, the concept of maturity models for PM was explained. The next chapter elaborates on the development of the framework used to evaluate the selected maturity models.

3. The evaluation framework

The chairman of the PMI-NL Project Group organized monthly meetings to guide the construction of the framework. Whenever prompt feedback was needed, all members could contact each other by email or phone. The description of the project group members and their roles is included in Appendix A-1. After several brainstorming sessions and informal meetings, the project group members agreed upon a staged approach in the evaluation framework. It was decided that the framework should evaluate a maturity model's structure, applicability and usage. The following sections describe these three dimensions. After explaining what a dimension holds and how it can be operationalized, a measure is described to elicit the characteristics of a maturity model on that dimension. The description of each dimension concludes with an elaboration of how the results of the models, found with the selected measure, are compared with each other.

3.1. Structure Dimension

The first dimension along which the framework evaluates a maturity model for PM is structure. It is important to mention here that a maturity model for PM is made up of two parts: a maturity reference model and an assessment method. From an assessor's point of view, the maturity reference model is a measuring stick; it elaborates on what an assessor should assess in order to determine the maturity of an organization. The description of the assessment method, on the other hand, describes how assessors should carry out the assessment to determine maturity. The characteristics of an assessment method are just as important as those of the maturity reference model, because they affect the repeatability of an assessment and therefore also the reliability of the results.
Besides the fact that these two parts have different measurable characteristics, another reason why a distinction should be made is that a maturity model for PM does not necessarily have only one assessment method to apply its reference model. As mentioned earlier, the SEI is the owner of CMMI and has described SCAMPI as its standard assessment method.

The structure of the maturity reference model comprises a collection of concepts and relationships between these concepts. Each concept says something about maturity as defined by the maturity model, and it is the relationships between these concepts that illustrate their importance and role in that definition. Shedding light on the structure of the model concepts makes it easier for organizations to understand the purpose and essence of a maturity model. The assessment method of a maturity model can be broken down into multiple process phases and activities. By knowing the structure of these process phases and activities, organizations know what to anticipate when engaging in an assessment. The products resulting from the assessment activities are also important, especially their relationships with the concepts underlying the maturity reference model. After all, these products contain the data that assessors use to assess maturity, by comparing this data with the measures defined by the maturity reference model. So the relationships between the products resulting from the assessment activities and the concepts of the reference model must not be ignored. To depict the structure of the maturity reference model and assessment method as accurately as possible, a meta-modeling technique was selected as the measure for the structure dimension.

3.1.1. Process-Data Diagram (PDD) Modeling

Objective comparison of the maturity models' structures requires them to be described on a higher abstraction level where it is possible to describe them uniformly. A meta-model is a model constructed on a higher abstraction level, used to describe the features of an underlying model. Meta-models are useful here because maturity models for PM are described in different languages using various terminologies; they may use various descriptions even though they imply the same underlying concepts.
Since the purpose of the framework is not to determine superiority or inferiority between maturity models, it is inappropriate to select one model (and its terminology) as a starting point and compare the other models with it. Another reason why the framework should use a meta-modeling technique as a measure for structure is that it enables a simplified illustration of the maturity

models. One quick glance at a meta-model allows an organization to understand the hierarchy of concepts or activities of a maturity model for PM. In [38], the authors were able to compare different object-oriented analysis and design techniques with each other using a meta-modeling technique. According to the authors, the construction of meta-models is a uniform and formal way to compare methodologies with each other as objectively as possible, provided that the same constructs are used to model them. In this thesis, the framework will employ a type of meta-model called the Process-Data Diagram (PDD), described in [39]. A PDD is made up of two different meta-models, namely a meta-process model and a meta-data model. As mentioned earlier, a maturity model for PM incorporates a maturity reference model and an assessment method; the two types of meta-models are suitable for modeling each of these two parts. More specifically, the meta-process model will be used to depict the process phases and activities of an assessment method, while the meta-data model models the concepts underlying the maturity reference model. After creation, these two meta-models are combined into a PDD in which the relationships between the activities of the assessment process and the concepts of the model are shown. PDDs are able to answer the following three questions for each maturity model for PM:
- What process phases and activities is the assessment method made up of? (using the meta-process model)
- What products do the activities of the assessment method deliver? (meta-data model)
- What concepts underlie a maturity model for PM, and how are they related to each other and to the products of the assessment method? (process-data diagram)
A condensed description of the PDD modeling technique and the notations employed by the framework is provided in the Appendix section (see Appendix B). A more thorough explanation of this method can be found in [39].

3.1.2. PDD comparison method

The comparison between the PDDs follows the approach described in [28]. This approach begins by defining all concepts and describing all activities depicted in the PDD of each maturity model. Each maturity model for PM can use a different terminology, so a thorough understanding of the definitions of the activities and concepts is needed before comparisons can be made with other models. The definitions are shown using activity tables and concept tables. They are derived from publications and literature about the maturity models. People who have experience with a particular maturity model are also consulted to gain more insight and to verify the definitions. After all activities and concepts are defined, they are used to create two additional tables: an activity comparison table and a concept comparison table. The activity comparison table consists of a consolidated reference list of the activities of all maturity models selected to test the framework. This means that overlapping activities of the maturity models are combined and non-overlapping ones are added to the table. The activity comparison table also contains consolidated process phases of the assessment methods. Similarly, the concept comparison table contains a consolidated reference list of concepts of the selected maturity models. These tables are used for the actual comparison of the activities and concepts of the selected maturity models. This is done by filling in the fields in the Maturity model columns using the symbols explained below. For the activity comparison table, filling in:
- = means that an activity of the reference list is present and equivalent to the one in the PDD of a maturity model;
- > or < means that an activity of the reference list is present and comprises respectively less or more than the one in a PDD;
- a (2) after one of the previous symbols means that the activity is described in the second process phase of a PDD.

For the concept comparison table, filling in:
- = means that a concept of the reference list is present and equivalent to the one in the PDD of a maturity model;
- the name of a concept means that the concept in the reference list is present in a PDD but under a different name;
- REPORT means that a concept of the reference list is present but not depicted in the PDD for the sake of space and structure; if the concept of the reference list were depicted, it would be described within the REPORT concept.
Finally, an empty field means that an activity or concept in the reference list is not present in the PDD of the respective maturity model for PM. Examples of comparison tables resulting from the analysis are depicted below:

Table 3: Example activity comparison table.
                       Maturity model 1   Maturity model 2
1. Process phase 1
1.1 Activity 1.1       =                  <
1.2 Activity 1.2       < (2)              >
1.3 Activity 1.3       =
2. Process phase 2
2.1 Activity 2.1       <                  < (1)
2.2 Activity 2.2       >
2.3 Activity 2.3       >                  =

Table 4: Example concept comparison table.
                       Maturity model 1   Maturity model 2
1. Process phase 1
1.1 Concept 1.1        CONCEPT1           =
1.2 Concept 1.2        =                  CONCEPT1
1.3 Concept 1.3        =                  =
2. Process phase 2
2.1 Concept 2.1        CONCEPT3
2.2 Concept 2.2        =
2.3 Concept 2.3        CONCEPT5

A comparison table allows a quick overview of which activities or concepts are present in the PDD of a maturity model for PM and what the main differences with other models are. It should be noted that only the concepts related to the assessment activities are compared using a concept comparison table. The core concepts underlying a maturity model are depicted as gray boxes in the respective PDDs and are compared using narrative text instead. This was decided because a concept comparison table

only compares the presence or absence of concepts in PDDs, whereas whether one maturity model embodies more or fewer concepts than another is not what matters. More important is to look at how the different maturity models are actually built up, which is easier to describe using narrative text than a comparison table.
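The symbol scheme above can be represented directly as data, which makes checks such as "which reference-list activities are absent from a model's PDD" mechanical. The sketch below uses the symbols defined in the text (=, <, >, a phase marker in parentheses, and an empty field for absence); the activity names and cell values are hypothetical:

```python
# Sketch of an activity comparison table using the symbols from the text:
# "=" equivalent, "<"/">" present but comprising more/less, "(2)" located in
# process phase 2 of that model's PDD, "" (empty) absent from the PDD.
# Activity names and cell values are illustrative only.

comparison = {
    "1.1 Define scope":  {"Maturity model 1": "=",     "Maturity model 2": "<"},
    "1.2 Collect data":  {"Maturity model 1": "< (2)", "Maturity model 2": ">"},
    "1.3 Rate maturity": {"Maturity model 1": "=",     "Maturity model 2": ""},
}

def absent_activities(comparison, model):
    """Reference-list activities missing from the given model's PDD."""
    return [activity for activity, cells in comparison.items()
            if cells[model] == ""]

print(absent_activities(comparison, "Maturity model 2"))
```

The same structure works for the concept comparison table; only the symbol vocabulary differs (a renamed concept appears as its alternative name, and REPORT marks concepts folded into the report for space).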

3.2. Applicability Dimension

The second stage of the framework focuses on the applicability dimension of maturity models for PM. This dimension is selected because it sheds light on the properties of a maturity model for PM that affect the context in which the model can be applied. It is complementary to the previous dimension, since it elicits information about a maturity model that cannot be depicted by a PDD. Besides the structure of a maturity model, it is important to know about the properties that determine whether a maturity model for PM is suitable for an organization to adopt. For instance, a maturity model for PM may contain properties that limit its application in certain industries; a PDD is not capable of showing this piece of information. Eventually, the decision was made to use evaluation criteria to measure relevant properties of maturity models for PM. Criteria are appropriate measures for this dimension because of their flexibility. Each criterion can capture one property independently of the others, which is useful especially when the list of criteria is not definitive: it allows the addition or removal of criteria without affecting other criteria on the list.

3.2.1. Evaluation criteria

To decide which criteria were relevant to measure the applicability dimension, several sessions with the project group members were held. During these sessions, the list of criteria was changed and refined numerous times due to differing knowledge levels about maturity models for PM. Because of the limited time, the project group settled for the list of criteria shown in the tables below (see Table 5 and Table 6). The criteria are categorized into two main groups: criteria regarding the maturity reference model and those that focus on the assessment method.

Table 5: Maturity reference model criteria
Nr.   Maturity reference model (MM) criterion
MM1   Openness
MM2   Industry & size
MM3   Scope
MM4   Maturity level description
MM5   Dimensions of maturity
MM6   Process areas
MM7   Process area dependencies

Table 6: Assessment method criteria
Nr.   Assessment method (AM) criterion
AM1   Assessment commitment
AM2   Competence level
AM3   Assessment method description
AM4   Data gathering method
AM5   Length of questionnaire
AM6   Supportive assessment tools
AM7   Benchmarking

Because the purpose of the evaluation framework is descriptive rather than normative, no scores or ranks will result from the evaluation based on the criteria. For this reason, the findings per maturity model are presented in the format shown in Table 7. The Value column indicates whether or not a maturity model meets a criterion, the Reference column contains references to the information sources used, and the Explanation column provides a brief explanation of the findings.

Table 7: Example evaluation results format per maturity model
Criteria      Aspect      Value   Reference      Explanation
Maturity model criteria
Criterion A                       Reference #1   <description of findings about criterion A in reference #1>
Criterion B   Aspect ba           Reference #1   <description of findings about aspect ba in references #1 and #2>
              Aspect bb           Reference #2

Besides books and publications, assessors, experts and accredited associations are consulted to gather and verify information. Experts and assessors are people who have prior experience with a maturity model for PM from maturity assessments. Accreditation associations are organizations qualified to train and dispatch consultants to conduct maturity assessments for other organizations. These associations may also provide training sessions for those who wish to become certified assessors or who want to know more about a particular maturity model for PM. Each criterion is defined and explained below to clarify the property it measures, the reasons why this property is relevant and how it is used to measure this property.
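The descriptive record format of Table 7 can be sketched as a small data structure: one finding per criterion (optionally per sub-aspect), with a value, the references consulted, and an explanation. The criteria names and field values below are hypothetical placeholders:

```python
# Sketch of the Table 7 record format: descriptive findings per criterion,
# with no scores or ranks derived. Field values below are illustrative.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Finding:
    criterion: str
    aspect: Optional[str]                 # optional sub-aspect, e.g. "Aspect ba"
    meets: bool                           # the Value column: met / not met
    references: List[str] = field(default_factory=list)
    explanation: str = ""

findings = [
    Finding("MM1 Openness", None, True, ["Reference #1"],
            "Model book is purchasable; assessment requires certification."),
    Finding("MM2 Industry & size", None, False, ["Reference #2"],
            "No limitations on industry or organization size are stated."),
]

# The framework stays descriptive: findings are listed, never ranked.
met = [f.criterion for f in findings if f.meets]
print(met)
```

Keeping the references alongside each finding mirrors the Reference column's role: every value must be traceable to a consulted source.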

Maturity reference model (MM) criteria

MM1 - Openness
Openness is the degree to which a maturity model for PM is available to the public and whether its usage is limited to certain individuals or organizations. This criterion employs four possible situations that have significant implications for the openness of a maturity model. It measures whether a maturity model for PM and/or its assessment materials:
- can be accessed without payment (free access);
- can be accessed against payment (paid access);
- are meant to be used by certified assessors (certified usage);
- can only be used by specific organizations (proprietary access & usage).
These situations are not independent of each other. A maturity model can, for instance, be made openly available while its assessment can only be done by certified assessors. The resulting schema will indicate whether the four situations hold for a maturity model and its assessment. In the Explanation column, additional information will be provided for the values found.

MM2 - Industry & size
Maturity models for PM can be specifically designed for a certain organizational context (e.g. type, structure, industry sector). Therefore, it is important for a model description to contain information about its applicable areas. This criterion measures whether a maturity model specifies limitations in its application to organizations operating in particular industry sectors or to organizations of specific sizes.

MM3 - Scope
As explained before, the discipline of PM includes the domains of project management, program management and portfolio management. Because of the differences between projects, programs and portfolios, the corresponding best practices and processes differ as well. The scope of a maturity model for PM is the extent to which the maturity model embodies PM. Depending on the domains employed, a maturity model may be structured differently and describe different processes to improve in.
One maturity model for PM may describe all three domains

of PM, while another may focus solely on the management of projects. A model that only describes project management limits its usage to organizations that wish to improve their project management processes. In other words, limits in scope affect the applicability of a maturity model. This is why scope was selected as a relevant property.

MM4 - Maturity level description
In order to determine maturity, an assessor needs to know what exactly he or she should assess within an organization and how the findings relate to the maturity descriptions provided in the maturity reference model. Maturity level description is the degree to which a maturity reference model describes what an assessor should look at within an organization to determine its maturity. In the same way, organizations gain insight into what a maturity reference model focuses on when they are being evaluated. The consultation of literature about the selected maturity models and meetings with different assessors resulted in the following list of elements that maturity level descriptions can contain (see Table 8):

Table 8: Maturity description elements
- Process area: a group of related practices that, when implemented collectively, contribute to making improvements in that area. At the highest abstraction level, assessors determine the maturity level of organizations by assessing the presence or absence of particular process areas. Examples of process areas are risk management, cost management and change management.
- Activity: the description of an activity provides assessors with guidance in evaluating whether an organization carries out certain activities to achieve purposes defined in the maturity reference model. Examples of activities are developing a project plan and involving stakeholders.
- Role: roles describe functions of individuals who should be responsible for executing certain activities. This element can be employed by maturity reference models to see whether activities are executed by people with the appropriate authority.
- Competency: besides the roles of the people executing activities, a maturity model may describe the minimum or required knowledge levels and capabilities of these people. The competencies of the individuals carrying out an activity may affect the outcome of that activity.
- Deliverable: the idea behind this element is that if an organization claims to carry out certain activities, it should be able to deliver the products resulting from these activities. For instance, if a member of an organization claims to develop a project plan, this member should be able to produce a project plan.
- Result: maturity models are meant to help an organization improve. A maturity model may describe activities or practices that should be in place within organizations, but it can also describe outcomes or improvements that an organization may experience after achieving a particular maturity level. In this case, an assessor can determine an organization's maturity by assessing whether the organization is able to achieve the improvements/outcomes described by the maturity reference model.

The level of detail of a maturity reference model is related to the number of elements used to describe the maturity level conditions. The more elements are used, the more accurately assessors can determine the maturity of organizations. As a result, maturity can be rated repeatedly, leaving little room for inconsistencies caused by the maturity reference model itself (reliability).

MM5 - Dimensions of maturity
In [3], two families of maturity models for PM are described:
- models in which the same things are measured at all levels of maturity, where it is simply the results that improve with maturity;
- models stating that more mature organizations measure different things than immature ones, where the increase in maturity is indicated by measures showing improving results.
Maturity models for PM of the first family can describe different dimensions in which organizations can mature or professionalize their practices, but these dimensions are measured at all maturity levels. Take for instance a dimension such as the competences of people. A maturity model adopting this dimension describes the type of competences that members of an organization should possess at each maturity level. The model then rates an organization's maturity on this dimension and, if applicable, on other dimensions as well. The next criterion sheds more light on models of the second family. By shedding light on the maturity dimensions employed by a maturity model, organizations will gain a better understanding of them in order to select a model. The initial intention was to use a pre-defined list of dimensions used in the literature. However, the evaluation of the maturity models for PM with this list proved to be very difficult due to the varying interpretations of the dimensions and underlying concepts. For this reason, it was decided that the framework will not employ a predefined list of maturity dimensions.
Instead, the framework will simply list and describe the dimensions used by a maturity model.

MM6 - Process areas
In a maturity model for PM, a PM domain can be broken down into different but related process areas. In essence, this criterion evaluates the extent to which a maturity model describes a PM domain. Models belonging to the second family mentioned earlier, such as CMMI, evaluate the process areas or processes that an organization should have in place instead of dimensions. These models measure different things at different maturity levels, which in CMMI's case are process areas. Analyzing which process areas are described by a maturity model helps organizations understand the processes the model deems relevant to achieving maturity. As with the previous criterion, there are standard process areas defined in the literature, such as in the PMBOK guide [3]. However, since different maturity models define different process areas, it is difficult to develop a general list of process areas for reference. So, here too, the framework will list and describe the process areas categorized by each maturity model for PM.

MM7 - Process area dependencies
The reason why maturity models of the second family measure different things at different maturity levels is that these models acknowledge strict dependencies between maturity levels [3]. This criterion examines whether maturity models for PM acknowledge such dependencies. Describing these dependencies explicitly makes it easier for organizations to understand the path the model describes towards maturity. Besides their presence or absence, a brief description of how maturity models for PM describe these dependencies is also provided during the evaluation.

Assessment Method (AM) criteria

AM1 - Assessment commitment

One of the most important enablers of the adoption of a maturity model is commitment from higher management. Just like a project, if higher management does not fully support its execution, it is unlikely to produce the desired outcome. In an empirical study, support from higher management was the second most selected factor believed to be most critical to a project's outcome [40]. A maturity model description should therefore contain suggestions or encouragement about gaining commitment from higher management.

AM2 - Competence level

Not every member of a to-be-assessed organization is qualified or capable of being involved in a maturity assessment. A maturity model may specify requirements for those carrying out the assessment (assessors) as well as those participating in it (participants). This is what is meant by competence level. This criterion investigates whether specifications for both roles are provided by a maturity model. By describing such requirements, a maturity model decreases the chance that the outcomes of an assessment are affected by the choice of assessors or participants. This is why this property was included.

AM3 - Assessment method description

While the elaborateness of a maturity model description helps assessors define what they should measure to verify the maturity of an organization, the details of the assessment method describe how they should carry out the assessment process. After studying different maturity assessment methods, the project group decided upon the following description elements that could constitute an assessment method.
Table 9: Assessment method description elements

  Element name   Examples
  Process phase  prepare assessment, conduct assessment
  Activity       determine scope of assessment, deliver results
  Deliverable    assessment plan, team, report
  Role           lead assessor, project manager, process owner
  Dependency     an assessment plan is a prerequisite for the actual assessment

The number of elements used affects the level of detail of a method description. With a tight protocol for conducting an assessment, the outcomes are less likely to be affected by variances in the choices and actions of the assessors. A thorough description of an assessment method can thus ensure its repeatability and the reliability of the assessment outcomes.

AM4 - Data gathering method

A maturity model can prescribe different methods for retrieving information from an organization during a maturity assessment. These methods can involve:
- questionnaires (meant for participants)
- interviews
- group discussions
- document consultations

Each of these methods provides information from a different angle. If multiple methods are used to verify the same piece of information, the reliability of this information increases. This property also shows the degree of involvement required from the members of an organization. An assessment method that requires assessors to conduct interviews needs close cooperation from the members of an organization. On the other hand, if the sponsor of an assessment wants to involve as few members as possible, it can opt for a maturity model with an assessment method that only uses document consultations.

AM5 - Length of questionnaire

There are two types of questionnaires in the context of maturity assessments. The first type involves questionnaires only available to assessors, who use them as a guide to develop interview questions or as score forms to rate an organization's maturity level. The second type regards questionnaires that participants of a maturity assessment have to fill out. In this research, no distinction is made between these two types, because no matter who uses the questionnaire, the number of questions contributes to the level of detail of the information required by a maturity model.
The more questions are asked, the greater the reliability of the retrieved information will be. The type of questionnaire used does not change this. Because the scope of each maturity model

for PM may vary, this criterion only considers the part of the questionnaire focusing on the project management domain. This criterion results in a number indicating the amount of questions in the questionnaire provided by each maturity model for PM.

AM6 - Supportive assessment tools

To ease the application of an assessment, various kinds of support can be provided together with a maturity model for PM. This criterion considers three types of supportive tools:
- (self-)assessment toolsets (e.g. an online maturity assessment mechanism)
- training sessions to increase the understanding of a maturity reference model and the assessment method
- certification possibilities (to become a certified assessor)

The presence or absence of these supportive tools affects the usability of a maturity model for PM. Besides maturity assessments with external assessors, an organization can also opt for unofficial assessments, i.e. assessments done by organizations themselves. If tools are provided that facilitate the understanding and application of the maturity model without the intervention of certified assessors, the application of the model is not limited to organizations desiring official assessments.

AM7 - Benchmarking

Depending on the maturity model, the results of a maturity assessment may be used for benchmarking purposes. This means that anonymous maturity results of assessed organizations can be gathered by assessment institutions and used to compare aggregated scores between, for instance, industry sectors or regions. If an institution allows it, assessed organizations may compare their maturity scores with the aggregated maturity scores of organizations operating in the same industry or region. Before that is possible, however, a standard method for the assessment needs to be in place.
As with the standardization of a questionnaire, the standardization of the assessment method decreases the amount of variance in the results attributable to various factors (e.g. choice of participants or assessors) and increases the attribution

to the real variable in question: maturity. Two questions are considered when evaluating each assessment method description for this criterion:
- whether a standard method is described for a maturity assessment, and
- whether maturity profiles are preserved for benchmarking purposes available to assessed organizations.

3.2.2. Criteria comparison method

After evaluating the maturity models for PM using the evaluation criteria, the findings per model are compared with each other in a schematic way. Elaborate comparisons between the criteria results are provided in the form of tables like Table 10. Each table corresponds to one evaluation criterion; the cells in the upper right contain the similarities and the cells in the lower left contain the differences found between each pair of maturity models for PM.

Table 10: Example schematic comparison per criterion

  Criterion A  Model 1          Model 2          Model 3
  Model 1                       similarity #1,   similarity #1
                                similarity #2
  Model 2      difference #1,                    similarity #1,
               difference #2,                    similarity #2,
               difference #3                     similarity #3
  Model 3      difference #1,   difference #1
               difference #2

After these tables, another table briefly summarizes the findings of each model per criterion. By placing all values in one schema, the differences between the maturity models can be seen at a glance. As some criteria cannot be answered with a simple yes/no answer, some of the fields of the schema will also contain narrative text as values. An example of the resulting schema is shown in Table 11.

Table 11: Example schematic comparison of criteria results

               Model 1      Model 2      Model 3
  Criterion A
  Criterion B  - aspect ba
               - aspect bb
  Criterion C
  Criterion D
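The pairwise scheme described above, with similarities in the upper-right cells and differences in the lower-left cells, can be sketched programmatically. The model names are real, but the findings and the set-based notion of similarity below are illustrative assumptions, not data from the actual evaluation:

```python
from itertools import combinations

# Hypothetical findings per model for one criterion (illustrative only).
findings = {
    "OPM3": {"paid materials", "certified assessors", "no industry restrictions"},
    "CMMI": {"free model documents", "certified assessors", "no industry restrictions"},
    "PMMM": {"paid materials", "online self-assessment", "no industry restrictions"},
}

def compare(models):
    """Build the cells of a schematic comparison table: shared findings
    in the upper-right cells, differing findings in the lower-left ones."""
    table = {}
    for a, b in combinations(models, 2):
        table[(a, b)] = sorted(models[a] & models[b])  # similarities (upper right)
        table[(b, a)] = sorted(models[a] ^ models[b])  # differences (lower left)
    return table

cells = compare(findings)
print(cells[("OPM3", "CMMI")])  # similarities between OPM3 and CMMI
print(cells[("CMMI", "OPM3")])  # differences between CMMI and OPM3
```

In the actual framework the cell contents are narrative statements rather than set intersections, but the triangular layout is the same.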

3.3. Usage Dimension

The last part of the evaluation framework examines the usage dimension of a maturity model for PM. This dimension was chosen because the true value of a maturity model lies in the eyes of those using it. Theory may describe a maturity model as applicable or usable (as elicited by the evaluation criteria), but this has little value if the people using the model think otherwise. There are two ways to measure this dimension: via a survey or by conducting interviews. Because there was too little time to gather enough respondents to make a survey statistically relevant, the project group decided to conduct interviews.

3.3.1. Interviews

This part of the framework was originally meant to elicit user experiences with a maturity model for PM; therefore, the interviews had to involve people with prior experience of the selected maturity models. In the end, only a small number of interviews was conducted: two for CMMI and one for OPM3. The reason is that few people were available who had experience with the selected maturity models, especially OPM3 and PMMM. These two models are relatively less well-known in Europe compared to CMMI, and not many organizations in the Netherlands have adopted them. Ideally, the usage dimension should be measured through interviews with two different user groups per maturity model, namely the assessors and the members of user organizations. Taking only one of the two perspectives would create a biased view of these models. Unlike (certified) assessors, members of user organizations do not have much experience with maturity models, so their knowledge backgrounds differ from those of assessors. And even if members of a user organization attend training sessions to gain knowledge about a maturity model (i.e.
not with the intention to become certified assessors), there will still be a difference in how they look at a maturity model, simply because they have different objectives in using one. Assessors use maturity models to assess user organizations; they interview members of an organization and verify

the findings to determine the degree of maturity of that organization. Conversely, user organizations are those who request the assistance of certified assessors to assess their maturity. They are the ones being interviewed and providing information about their ways of working. Depending on the maturity model, either the assessors or the user organization will use the model to determine improvement trajectories. Ultimately, it is the members of the user organization who initiate and realize the improvement initiatives. For these reasons, eliciting practical experiences from both assessors and members of user organizations helps generate a complete picture of the usage dimension of a maturity model. Two different questionnaires were developed for this dimension: one meant for a user organization and the other for an assessor. These questionnaires are included in the Appendix section in Dutch (see Appendix D-1). Due to the small number of interviews, the project group decided that the retrieved interview data would be used to support the findings of the previous two dimensions. However, information retrieved from both interviews and informal consultations is described separately as the results of the usage dimension where it provides insights into the models overlooked by the previous two dimensions.

4. Analysis & Results

4.1. Structure analysis

The analysis of the results of the structure dimension begins with the maturity reference model concepts depicted in the PDDs as gray boxes. This facilitates the understanding of the comparisons that follow between the remaining activities and concepts in the diagrams. The PDDs of the three PM maturity models are included in the Appendix section, along with the concept tables and activity tables containing the concept definitions and activity descriptions (see Appendix E & F).

4.1.1. Maturity reference model structure

As argued before, it is not important to compare the individual concepts underlying the maturity models for PM. Much more interesting is the hierarchy that holds these concepts together. The following paragraphs describe this structure for each maturity reference model. Maturity reference model concepts are written between quotation marks in the text. Appendix F contains more detailed definitions of these concepts.

OPM3

The OPM3 standard comprises several important components, namely the foundation, a self-assessment tool and three directories. In addition, to register maturity scores of client organizations for benchmarking purposes, the PMI maintains a maturity database to store maturity profiles. While the foundation and self-assessment tool are made available to assist client organizations in understanding the OPM3 standard, the three directories form the core of the assessment method: the best practices directory, the capabilities directory and the improvement planning directory. The best practices directory contains two types of best practices, namely organizational enablers and process best practices. The former are supportive practices that relate to the organizational structures and processes required to facilitate the efficient and effective realization of best practices for projects.
The latter are practices that are currently recognized and applied by industries to achieve a stated goal or objective. These best practices are grouped by process improvement stages

and PM domains. The process improvement stages are the four stages of process maturity: from process Standardization to Measurement to Control and ultimately to Continuous Improvement. In other words, the OPM3 model describes best practices that an organization can learn from to Standardize, Measure, Control or Continuously Improve its PM processes. Organizations can also choose to assess their process maturity within a specific PM domain such as project management, program management or portfolio management. OPM3 describes capabilities, stored in the capabilities directory, which are specific competencies that must exist in an organization in order for it to execute a best practice. To prove the existence of a capability, the existence of one or more corresponding outcomes is examined. Outcomes are the tangible or intangible results of applying a capability. The degree to which an outcome exists is determined by criteria, also known as key performance indicators (KPIs). Furthermore, OPM3 acknowledges dependencies among its underlying concepts, particularly best practices and capabilities. The first type of dependency lies within the series of capabilities leading to a best practice: each capability builds upon preceding capabilities to achieve a single best practice. But a capability under one best practice can also depend on the existence of a capability under a different best practice, in which case one best practice is said to be dependent on the other. This is the second type of dependency described within OPM3. These dependencies are stored in the improvement planning directory.

CMMI

CMMI embodies five maturity levels, each a layer in the foundation for ongoing process improvement. Because each maturity level forms a necessary foundation for the next, an organization cannot achieve, for instance, maturity level 3 if it has not yet achieved level 2. CMMI describes dependencies between pairs of adjacent maturity levels.
In CMMI, each maturity level is represented by several process areas, and for an organization to achieve a particular maturity level, the corresponding process areas have to have achieved that level. The maturity model describes generic goals and specific goals that guide the process of bringing a process area to a higher maturity level. Generic goals can be

understood as objectives to bring a group of process areas to a certain level of maturity, and the model describes generic practices that an organization can apply to achieve these objectives. Specific goals are objectives unique to a specific process area that have to be achieved before the process area can be considered satisfied, i.e. implemented. To implement a process area, the model describes specific practices that can be applied. The philosophy behind this is that an organization can only achieve a level of maturity if it is capable of achieving certain goals. To achieve these goals, there are (groups of) processes that need to be in place. And the presence or absence of these processes is assessed by looking at an organization's practices. Maturity scores of assessed organizations that are willing to register themselves with the SEI are stored in the SEI's maturity profile database.

PMMM

Just like CMMI, the PMMM embodies five maturity levels. In the book by Kerzner [5] where this model is described, each maturity level is equipped with explanations about roadblocks, risks, advancement criteria and an assessment instrument. The first three concepts represent things an organization needs to know before it can achieve that level and advance to the next one. Each level is accompanied by an assessment instrument in the form of a questionnaire that organizations can use to assess the degree to which they have achieved that level. Organizations can do this manually using Kerzner's book, but PMMM also has an online version of the assessment instruments. Those who have used the online assessment tool can also compare their assessment scores with the scores of other organizations, which are stored in IIL's benchmarking database.
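The OPM3 hierarchy described above — best practices composed of capabilities, which are evidenced by outcomes rated against KPIs — can be sketched as a small data model. This is a rough illustration only; the class names, fields and example data are assumptions for exposition and are not taken from the OPM3 standard:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    kpis: list[str] = field(default_factory=list)  # criteria / key performance indicators
    observed: bool = False

@dataclass
class Capability:
    name: str
    outcomes: list[Outcome] = field(default_factory=list)
    depends_on: list[str] = field(default_factory=list)  # prerequisite capabilities

    def exists(self) -> bool:
        # A capability is evidenced by at least one observed outcome.
        return any(o.observed for o in self.outcomes)

@dataclass
class BestPractice:
    name: str
    stage: str    # Standardize, Measure, Control, Continuously Improve
    domain: str   # project, program or portfolio management
    capabilities: list[Capability] = field(default_factory=list)

    def achieved(self) -> bool:
        # A best practice requires all of its capabilities to exist.
        return all(c.exists() for c in self.capabilities)

# Illustrative example: one best practice with a dependency between capabilities.
bp = BestPractice(
    name="Standardize project risk planning",
    stage="Standardize",
    domain="project management",
    capabilities=[
        Capability("Document risk process",
                   outcomes=[Outcome("written procedure", ["procedure exists"], observed=True)]),
        Capability("Train project managers",
                   outcomes=[Outcome("training records", ["% staff trained"], observed=False)],
                   depends_on=["Document risk process"]),
    ],
)
print(bp.achieved())  # False: the second capability has no observed outcome yet
```

The `depends_on` field mirrors the first type of OPM3 dependency (capabilities building on preceding capabilities); dependencies between best practices, stored in the improvement planning directory, would link capabilities across `BestPractice` instances in the same way.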

4.1.2. Assessment method structure

After decomposing the PDDs of all three assessment methods into activities and concepts, members of the project group were consulted to develop the consolidated reference lists. During this development, similar process phases, activities and concepts were combined and included in the reference list under a more general name, while dissimilar ones were added under their original names. This resulted in the two comparison tables below (see Table 12 and Table 13). Each table is followed by descriptions of the similarities and differences found between the models.

Table 12: PDD activity comparison table (columns: OPM3, CMMI, PMMM)

1. Prepare assessment
1.1 Familiarize with maturity model: =
1.2 Perform self-assessment: =
1.3 Determine assessment requirements: = (2), =, >
1.4 Select participants: < (2), <, =
1.5 Develop assessment plan: = (2), =, >
1.6 Select & prepare assessment team: = (2), =
1.7 Gather pre-assessment data: < (2), =
1.8 Prepare for assessment conduct: < (2), =
1.9 Prepare participants: < (2), < (2)

2. Conduct assessment
2.1 Conduct interviews: =, <
2.2 Study records & documents: =, <
2.3 Document gathered information: =
2.4 Verify gathered information: =, =
2.5 Enter values in assessment tool: =, =
2.6 Generate individual assessment results: =
2.7 Generate summarized assessment results: <, =, <
2.8 Validate assessment results: <, =
2.9 Generate benchmark scores: *, *, =
2.10 Create final report: < (2), <

3. Finalize assessment
3.1 Deliver final report: < (2), =, <
3.2 Document assessment: < (2), =

4. Plan for improvement
4.1 Select & prioritize improvement initiatives: =, <
4.2 Develop improvement plan: =, >

*: optional after assessment

In the first process phase, there are several differences between the three maturity models for PM. First of all, during an OPM3 assessment, organizations usually familiarize themselves with the model and conduct a self-assessment to determine the necessity of a rigorous OPM3 assessment involving certified assessors. The assessment methods of CMMI and PMMM do not include these two activities; the SEI, for instance, does not provide any tools for CMMI self-assessments. The remaining activities in the first phase are present in the assessment methods of both OPM3 and CMMI. Both models require assessors to consult documents and conduct interviews to gather information in the second phase, and activities 1.3 to 1.9 are carried out to make the necessary preparations. PMMM, on the other hand, contains only activities regarding the requirements analysis, the selection of participants and the development of an assessment plan. Furthermore, no self-assessment is included in its assessment method, because the assessment method of PMMM only requires organizations to interact with an online tool, which happens during the second phase (2.5). The IIL provides an online as well as an offline version of the same questionnaire for the assessment. Organizations may use the offline version to do a self-assessment but, unlike with OPM3's self-assessment tool, no explanation is provided for the results found. Furthermore, while OPM3 and CMMI assessors are responsible for deciding on the adequate number and the appropriate roles of assessment participants, IIL has no control over who ultimately interacts with the PMMM online assessment tool. In this latter case, it is the responsibility of the client organization to select the right number of people in particular roles to include in the scoped sample. During a CMMI assessment, the gathered data is documented.
This data is then used to manually determine the maturity level of the organization. During an OPM3 assessment, this data is entered into a tool by lead assessors to have the assessment results generated automatically. In both cases, a final report containing the aggregated results and corresponding explanations is then delivered to the assessed organization. The PMMM assessment differs in that it allows assessment participants to retrieve their individual scores and compare them to

temporary aggregate scores during an assessment. After all pre-selected participants have filled out the online questionnaire, the tool generates a summary of the assessment results with brief explanations. If desired, organizations can request a report containing more elaborate explanations of the results (3.1). Among other things, this report contains more specific suggestions for improvements based on the results. While benchmarking possibilities become available after OPM3 and CMMI assessments, PMMM's online tool allows organizations to generate benchmarking scores both during and after the assessment. Finally, while CMMI's assessment method ends after the assessors deliver the final report to the assessed organization, the OPM3 assessment continues until an improvement plan containing prioritized improvement initiatives has been developed. This is because OPM3 assessments are meant to help organizations realize improvements. CMMI assessments do not include these activities because a CMMI assessment is not necessarily conducted with the goal of realizing improvements. Organizations may undergo CMMI assessments just to rank themselves on a standardized model without the intention to improve. This is not possible with OPM3 because it does not define strict maturity levels.

Table 13: PDD concept comparison table (columns: OPM3, CMMI, PMMM)

1. Prepare assessment
1.1 SELF-ASSESSMENT REPORT: =
1.2 ASSESSMENT TRIGGER: =, REQUIREMENT DRIVER
1.3 ASSESSMENT PLAN: APPRAISAL PLAN, =
1.4 COMMUNICATION PLAN: APPRAISAL PLAN
1.5 TEAM & ROLES DESCRIPTION: TEAM
1.6 ORGANIZATION DESCRIPTION: ENGAGEMENT
1.7 REQUIREMENT DESCRIPTION: =, INITIAL DATA
1.8 PRE-ASSESSMENT DATA: OBJECTIVE EVIDENCE, COLLECTION PLAN REVIEW
1.9 DATA COLLECTION PLAN: =, =
1.10 QUESTIONNAIRE: ASSESSMENT PROTOCOL, =
1.11 ASSESSMENT PARTICIPANTS: COMPANY BACKGROUND, SAMPLE

2. Conduct assessment
2.1 ASSESSMENT DATA: OBJECTIVE EVIDENCE, PRELIMINARY ASSESSMENT FINDINGS
2.2 ARTIFACT: =
2.3 AFFIRMATION: =
2.4 DOCUMENT: =
2.5 INTERVIEW: =
2.6 ASSESSMENT RESULTS: APPRAISAL RESULTS, ASSESSMENT RESULTS SUMMARY
2.7 RESULTS ANALYSIS: =, =
2.8 INDIVIDUAL ASSESSMENT RESULTS: =
2.9 INDIVIDUAL SCORE: =
2.10 INDIVIDUAL SCORE ANALYSIS: =
2.11 OPM MATURITY SCORE: ORGANIZATIONAL MATURITY SCORE
2.12 OUTCOME SCORE: =
2.13 CAPABILITY SCORE: MATURITY LEVEL RATING, =
2.14 BEST PRACTICE SCORE: PRACTICE RATING, =
2.15 GOAL RATING: =
2.16 PROCESS AREA RATING: =
2.17 BENCHMARK SCORE: ORGANIZATIONAL SCORE, AGGREGATE SCORE
2.18 INDUSTRY SCORE: =
2.19 SIZE SCORE: =

3. Finalize assessment
3.1 FINAL REPORT: FINAL FINDINGS REPORT, ELABORATE ASSESSMENT REPORT
3.2 ASSESSMENT RECORD: APPRAISAL RECORD
3.3 MATURITY PROFILE: =, ORGANIZATIONAL SCORE
3.4 BENCHMARK COMPARISON: =

4. Plan for improvement
4.1 IMPROVEMENT PLAN: =
4.2 IMPROVEMENT TRIGGER: =
4.3 INITIATIVE: =
4.4 SCHEDULE: =
4.5 PRIORITY: =
4.6 FACTOR: =
4.7 ATTAINABILITY: =
4.8 STRATEGIC PRIORITY: =
4.9 BENEFIT: =
4.10 COST: =, IMPROVEMENT SUGGESTION / SUGGESTED ACTION

Since PMMM's assessment method only gathers information through an online questionnaire, there is no need to develop assessment team descriptions (1.5) or data collection plans (1.9), or to collect pre-assessment data (1.8), during the prepare assessment phase. In the conduct assessment phase, CMMI's method describes more thoroughly the categories and types of data gathered during the assessment (2.2-2.5), while OPM3 summarizes them into one concept (2.1). PMMM does not employ these concepts at all because, unlike OPM3 and CMMI, its assessment does not involve assessors who have to gather the right data from the right sources. The activity comparison table showed that PMMM allows participants to access their individual scores, which explains the corresponding concepts in the second phase of the concept comparison table (2.8-2.10). The differences found in concepts 2.12-2.16 between OPM3 and CMMI can be explained by the differences in the concepts used to describe maturity (see the previous section). Unlike with PMMM, the OPM3 and CMMI benchmarking scores are not provided during the maturity assessment, which explains why the benchmark-related concepts (2.17-2.19) only have corresponding symbols in the PMMM column. Finally, the OPM3 model is the only one describing concepts related to the development of an improvement plan in the fourth phase; the reasons for this were explained earlier.

4.1.3. Structure comparison summary

With fewer activities and concepts than the other two maturity models, the maturity reference model and assessment method of PMMM possess a relatively simple structure. PMMM does not involve assessors and only requires members of an organization to fill out an online questionnaire from which the maturity scores are calculated. This simplicity has several implications. First of all, organizations themselves are fully responsible for selecting participants for the assessment, so an organization can

assess whomever it wants to assess. Considering that participants can look into their individual scores, the model allows for individual assessment and development for all members of a project-based organization. The downside, however, is that unless a large sample is involved, the aggregate scores will be affected by the choice of participants. Furthermore, with an online website and questionnaire as the medium during the assessment, no account is taken of the environment of an organization or other factors when generating the results. Organizations therefore have to contemplate whether the results and improvement suggestions are applicable to their specific situation. During OPM3 and CMMI assessments, the assessors are responsible for selecting, preparing and interviewing the appropriate members based on their experience, besides consulting documents to gather information. Both models determine the maturity level of an organization by the practices that it has implemented. The difference between the two models becomes evident when the gathered data is processed to generate the assessment results. OPM3 assessors use a tool to process the gathered data and automatically generate a report containing the assessment results, while CMMI assessors (following the SCAMPI method) have to do this manually through meetings between lead assessors and assessment team members. All other things being equal, this implies that a CMMI assessment may take more time than an OPM3 assessment. Finally, the structure of OPM3 shows that it is a maturity model revolving around making improvements. What is less evident is that OPM3 does not have a standardized maturity level scale, while CMMI can also be used to determine an organization's maturity level on a standardized scale. PMMM differs from OPM3 in that it defines distinct levels of maturity, but it does not require organizations to follow the path to maturity from level 1 to 5.
PMMM allows organizations to assess their progress at all five maturity levels in a relatively simple way.
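The strict level dependencies that distinguish CMMI in this summary can be illustrated with a short sketch of staged rating logic: an organization is rated at the highest level whose process areas, and those of every level below it, are all satisfied. The level assignments below are illustrative of the mechanism, not a complete rendering of CMMI's catalogue:

```python
# Illustrative mapping of maturity levels to process areas (level 1 has none).
LEVEL_PROCESS_AREAS = {
    2: ["Project Planning", "Requirements Management", "Configuration Management"],
    3: ["Risk Management", "Organizational Process Definition"],
    4: ["Quantitative Project Management"],
    5: ["Causal Analysis and Resolution"],
}

def maturity_level(satisfied: set[str]) -> int:
    """Return the staged maturity level (1 = initial) implied by the
    set of satisfied process areas."""
    level = 1
    for lvl in sorted(LEVEL_PROCESS_AREAS):
        if all(pa in satisfied for pa in LEVEL_PROCESS_AREAS[lvl]):
            level = lvl
        else:
            break  # each level is a prerequisite for the next
    return level

# Satisfying a level-3 area without completing level 2 does not raise the rating.
print(maturity_level({"Project Planning", "Risk Management"}))  # still level 1
print(maturity_level({"Project Planning", "Requirements Management",
                      "Configuration Management"}))             # level 2
```

PMMM's questionnaires, by contrast, score each of the five levels independently, so an organization's per-level results need not follow this strictly cumulative pattern.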

4.2. Applicability analysis

Next is the applicability dimension, where each of the selected maturity models is evaluated using the fourteen criteria defined in Chapter 3. During and after every evaluation, the findings were discussed with assessors to check their correctness and completeness. See Appendix G for the detailed evaluation results per maturity model. The summarized results of the evaluation are described in the next sub-sections, starting with the maturity reference model criteria. For the sake of space, the acronyms of the three models are used in the explanations instead of references to their maturity reference models or assessment methods.

4.2.1. Maturity reference model

The tables below describe the comparisons between each pair of maturity models for PM for each criterion. The upper-right cells specify the similarities and the lower-left cells the differences. Each table is accompanied by a brief explanation of the most important similarities and differences, and the corresponding implications. The input for these cells is derived from the detailed evaluation tables in Appendix G. After all seven maturity reference model criteria have been discussed separately, another table is depicted with the summarized results, along with a brief explanation.

Table 14: Comparison table Openness (MM1)

OPM3 vs CMMI, similarities:
- Organizations have to pay to access assessment materials
- In the case of official assessments, materials of OPM3-PS and CMMI are meant for certified assessors to use; OPM3-F materials can be used unrestrictedly to conduct self-assessments
- Usage of the models is not limited to their owners

OPM3 vs PMMM, similarities:
- Model and assessment materials are only available against payment
- Usage of the models is not limited to certain organizations

CMMI vs PMMM, similarities:
- Organizations have to pay to access assessment materials
- Usage of the models is not limited to certain organizations

OPM3 vs CMMI, differences:
- Client organizations have to pay for all OPM3 materials while CMMI model information is downloadable

OPM3 vs PMMM, differences:
- During maturity assessments, PMMM does not require the intervention of certified assessors while OPM3 does

CMMI vs PMMM, differences:
- Client organizations have to pay for all PMMM materials while some CMMI materials are downloadable
- During maturity assessments, PMMM does not require the intervention of certified assessors while CMMI does

The most important similarity here is that the application of the three models is not limited to the owners of the models. Materials of all three models are available to the public, although access can be bound to payment in the case of PMMM and OPM3; the definition documents of CMMI, on the other hand, are freely downloadable from the Web. Unlike OPM3 and CMMI, PMMM does not require the intervention of certified assessors to conduct the assessment. This difference and its possible implications were already described in section 4.1.2.

Table 15: Comparison table Industry & Size (MM2 - Scope)

OPM3 vs CMMI, similarities:
- No industry-related restrictions are posed on the usage

OPM3 vs PMMM, similarities:
- No industry-related restrictions are posed on the usage

CMMI vs PMMM, similarities:
- No industry-related restrictions are posed on the usage
- No size-related restrictions are mentioned

OPM3 vs CMMI, differences:
- CMMI does not explicitly mention size-related restrictions while OPM3 does

OPM3 vs PMMM, differences:
- PMMM does not explicitly mention size-related restrictions while OPM3 does

Regarding the scope of the maturity models, none of them places industry-related restrictions on the application of the model (see Table 15). As for the size of the client organizations, OPM3 is the only one that explicitly states that the model can be applied to organizations of all sizes. An implication of this is that organizations have to consider for themselves whether a maturity model is applicable to the size of the scope they have in mind.

Table 16: Comparison table Scope (MM3 - PM Domain)

OPM3 vs CMMI, similarities:
- The models describe best practices related to project management

OPM3 vs PMMM, similarities:
- The models describe best practices related to project management

CMMI vs PMMM, similarities:
- The models describe best practices related to project management
- The models do not cover practices related to program and portfolio management

OPM3 vs CMMI, differences:
- PM practices within CMMI are described to achieve goals regarding software development; conversely, OPM3 describes practices to achieve goals regarding PM
- OPM3 describes program and portfolio management practices while CMMI does not

OPM3 vs PMMM, differences:
- PMMM only focuses on the maturity of PM in an organization while OPM3 also focuses on program and portfolio management

CMMI vs PMMM, differences:
- PM practices in CMMI are described to achieve goals of software development; practices in PMMM are described to achieve goals related to PM

All three models appear at first glance to cover the project management domain, but it should be noted that CMMI describes project management practices differently than the other two models.
This is because, as mentioned before, CMMI is developed for software development purposes and not for PM. This model focuses on the management of projects in order to achieve software development goals. PMMM focuses on the project management domain only, while OPM3 covers all three PM domains. An important implication of this is that organizations should consider applying CMMI when the assessment involves improving project management in the software development context. For improving program and portfolio management processes, OPM3 should be selected instead of PMMM.

Table 17: Comparison table Maturity level description (MM4 - Maturity level description)

Similarities:
- OPM3 and CMMI: process areas and activities are described; the models do not describe competences that members of an organization should harbor, or results that could be expected when a maturity level is achieved; both models verify the existence of process areas and activities by the products resulting from them
- OPM3 and PMMM: process areas and activities are described; the models do not describe the results that organizations should be able to achieve at a certain maturity level
- CMMI and PMMM: process areas and activities are described by both models; roles are described as well; the models do not describe the results that client organizations should be able to achieve at a certain maturity level

Differences:
- OPM3 and CMMI: OPM3 only describes activities (i.e. practices) that should be in place in an organization, while CMMI also describes process areas and roles that should be responsible for executing certain activities
- OPM3 and PMMM: OPM3 does not describe what process areas should be in place or what roles should be responsible for executing certain activities, while PMMM mentions roles in the assessment questionnaire; PMMM describes competences of people while OPM3 does not; OPM3 assesses the deliverables that result from activities, but PMMM does not require the assessment of deliverables
- CMMI and PMMM: CMMI does not describe competencies that members of an organization should maintain, but PMMM does; CMMI assesses the deliverables that result from activities, but PMMM does not require the assessment of deliverables

The fourth criterion is the degree to which a PM maturity model describes the conditions for achieving a maturity level (see Table 17). None of the three models appears to describe the project results that should be achieved when a maturity level has been reached. This is not too surprising when taking into account that many factors can influence the outcome of a project. A project can still fail even if PM is successfully carried out within an organization [7], and vice versa. So even if a certain maturity level is achieved, it does not guarantee desired outcomes of projects (or PM). An implication of this is that organizations should judge by themselves whether outcomes have improved after the adoption of a maturity model for PM (the models do not measure the value delivered to business stakeholders). An important difference that is not immediately obvious in the table is the way CMMI and PMMM describe and employ the notion of process areas. CMMI employs process areas to indicate which processes within an organization should receive attention at a particular maturity level in order to achieve certain (software development) goals. How this differs from PMMM's approach becomes evident when examining the dimensions of maturity employed by the model (see the next criterion). It will also become clear why PMMM is the only model that describes the competencies of people within an organization. OPM3 and CMMI both require an organization to show evidence (i.e. deliverables) of the existence of practices, while with PMMM the scores given are based entirely on the answers provided by participants. One important implication of this is the lack of control over the correctness of the answers given during a PMMM assessment. This could be compensated, however, by selecting a large assessment sample.

Table 18: Comparison table Dimensions of maturity (MM5 - Maturity dimension)

Differences:
- OPM3 and CMMI: OPM3 describes process maturity in terms of best practices, while CMMI looks at the capability levels of processes
- OPM3 and PMMM: PMMM assesses different dimensions on each consecutive level; OPM3 does not describe maturity dimensions along which an organization can achieve maturity
- CMMI and PMMM: PMMM assesses different dimensions of PM maturity on each consecutive level; CMMI does not describe maturity dimensions along which an organization can achieve maturity

CMMI describes maturity by the capability levels of processes. OPM3 focuses on the improvement of processes in terms of best practices. PMMM assesses several dimensions, each at a different maturity level. At level 1, the model looks at the knowledge of people within an organization regarding basic project management terminology. At level 2, it measures the degree to which processes are made common throughout an organization. Level 3 is about combining all corporate methodologies into a singular methodology for conducting project management. At level 4, PMMM measures whether an organization is aware of the importance of benchmarking and the degree to which benchmarking is carried out. Finally, level 5 measures the extent to which an organization is improving the singular methodology and business processes using the information obtained through benchmarking. Examining these five maturity levels makes apparent that, unlike OPM3 and CMMI, PMMM is a model developed to help organizations understand the basic requirements for doing project management.

Table 19: Comparison table Process areas (MM6 - Process areas)

Differences:
- OPM3 and CMMI: CMMI defines process areas from a software development perspective; OPM3 does not describe maturity using process areas, but with practices
- OPM3 and PMMM: OPM3 includes process areas of project, program and portfolio management, while PMMM focuses on project management alone; OPM3 uses the process areas defined in the PMBOK to categorize its best practices, while PMMM uses them to categorize knowledge areas on the first maturity level
- CMMI and PMMM: CMMI takes a software development perspective while PMMM takes a PM perspective; CMMI defines process areas for a different purpose than PMMM: process areas in CMMI are groups of processes that can be improved along a maturity dimension, whereas PMMM only uses the PMBOK process areas to indicate different knowledge areas on level one

PMMM uses project management process areas as defined in the PMBOK to categorize knowledge areas at its first maturity level. The model does not make use of process areas besides that. Because OPM3 is aligned to the PMBOK, the process areas are embedded into the practices. But OPM3 does not use process areas the way CMMI or PMMM does. The process areas employed by CMMI are different from those defined in the PMBOK, mainly because CMMI categorizes them from a software development perspective. Organizations should examine these process areas carefully before determining which model to use for an assessment.

Table 20: Comparison table Process area dependencies (MM7 - Process area dependencies)

Similarities:
- OPM3 and CMMI: both models acknowledge dependencies
- CMMI and PMMM: both models acknowledge dependencies between maturity levels

Differences:
- OPM3 and CMMI: CMMI describes relationships among process areas, while OPM3 describes dependencies at a lower level (i.e. among best practices)
- OPM3 and PMMM: OPM3 describes dependencies between best practices and capabilities, while PMMM describes dependencies between maturity levels
- CMMI and PMMM: PMMM only elaborates dependencies between maturity levels, not process areas; CMMI describes dependencies between process areas and maturity levels

The Process area dependencies criterion was developed to examine whether the PM maturity models employ process areas to categorize all PM processes on each maturity level, and whether the realization of one process area depends on the realization of another. Although PMMM, like CMMI, describes dependencies between maturity levels, these dependencies do not involve process areas. CMMI categorizes processes into process areas on all maturity levels and describes relationships between them to indicate their dependencies. Because OPM3 describes dependencies at a lower level, between components such as best practices or capabilities, organizations are provided with more guidance when selecting improvement trajectories. But there is also a downside to this, since more descriptions may restrict an organization's choice of improvement actions. The above results are summarized in the following table (see Table 21), accompanied by a short explanation of the main similarities and differences between the maturity models for PM.

Table 21: Summary results of maturity reference model criteria (columns: OPM3, CMMI, PMMM)

The criteria compared are: Openness (free access, paid access, certified usage, proprietary access and usage); Industry & Size (size of organizations, industry sector); Scope (project management, program management, portfolio management); Maturity level description (process area, activity, role, competency, deliverable, result); Dimensions of maturity; Process areas; Process area dependencies.
- Dimensions of maturity: OPM3: not employed; CMMI: processes; PMMM: varies per maturity level
- Process areas: OPM3: not employed; CMMI: PM-related process areas (project planning, project monitoring and control, supplier agreement management, integrated project management, risk management, quantitative project management); PMMM: scope management, time management, cost management, human resources management, procurement management, quality management, risk management, communications management
- Process area dependencies: described differently by the models

Information about all three models is relatively easy to access, with or without payment. None of them limits its usage to its owners or to organizations operating in specific industries. OPM3 is the only maturity model describing practices of program and portfolio management. CMMI describes project management practices in a software development context, and PMMM describes project management practices applicable in organizations that want to learn the basics of conducting project management. OPM3 only defines and determines maturity in terms of activities (best practices) and deliverables, while CMMI and PMMM also look at roles of people and process areas. PMMM is the only model that looks at competencies of the people in an organization

and does not concern itself with the deliverables. Furthermore, none of the maturity models describes the expected results of having achieved a particular maturity level. Regarding the dimensions of maturity, only PMMM assesses different dimensions of an organization at each maturity level. CMMI only looks at the process dimension in organizations, and OPM3 does not explicitly define stages of maturity or dimensions in which maturity can be achieved. Unlike CMMI, OPM3 does not define process areas or dependencies between them. Instead, it describes dependencies between best practices and the capabilities that constitute them.

4.2.2. Assessment method

As before, the next seven tables describe the differences and similarities found per pair of maturity models for each criterion regarding their assessment methods. After that, another table summarizes the comparison results.

Table 22: Comparison table Assessment commitment (AP5 - Assessment commitment)

Similarities:
- CMMI and PMMM: the necessity of higher management commitment to the initiative is explicitly stated by both models

Differences:
- OPM3 and CMMI: OPM3 does not explicitly state the necessity of higher management commitment
- OPM3 and PMMM: OPM3 does not explicitly state the necessity of higher management commitment

To achieve desired outcomes in any initiative within an organization, support from higher management is one of the most important determinants. Describing the importance of higher management commitment helps make members of an organization aware of this. They will be more likely to see the adoption of a maturity model and process improvement as something important. OPM3 does not describe this, while PMMM and CMMI do (Table 22). This means that organizations adopting OPM3 should still keep the importance of higher management commitment in mind, although it is not suggested explicitly.

Table 23: Comparison table Competence level (AP1 - Competence level)

Similarities:
- OPM3 and CMMI: minimum requirements for selecting a (lead) assessor are described by both models; neither model describes requirements for participants
- OPM3 and PMMM: neither model describes requirements for participants
- CMMI and PMMM: neither requirements nor restrictions are described for the selection of assessment participants; neither model describes requirements for participants

Differences:
- OPM3 and CMMI: while CMMI provides minimum requirements, OPM3 lists specific competences and personal attributes that assessors need to possess
- OPM3 and PMMM: unlike OPM3, PMMM does not describe requirements for assessors or participants
- CMMI and PMMM: CMMI describes requirements for selecting assessors, while PMMM does not provide descriptions or requirements at all

Looking at Table 23, OPM3 and CMMI both provide criteria for selecting (lead) assessors, while PMMM does not provide guidelines at all. These guidelines contribute to the reliability of an assessment, since they help rule out factors that might affect the outcome of the assessment, in this case the competences of the assessors or participants. By not providing guidelines, an organization risks selecting participants who do not represent the scope of the assessment.

Table 24: Comparison table Assessment method description (AP3 - Assessment method description)

Similarities:
- OPM3 and CMMI: the assessment methods of CMMI as well as OPM3 are described in terms of process phases, activities, deliverables, roles and dependencies
- OPM3 and PMMM: the process phases of the assessment method, deliverables and dependencies are described by both models
- CMMI and PMMM: the process phases of the assessment method, deliverables and dependencies are described by both models; the information about the assessment method is available online to the public

Differences:
- OPM3 and CMMI: the assessment method of CMMI is described in an online document, while the assessment method of OPM3 is described in the assessor training manuals
- OPM3 and PMMM: the assessment method of PMMM is described for participants, while OPM3's assessment method as described by OPM3-PS is meant for assessors
- CMMI and PMMM: CMMI's assessment method is meant for assessors to use, whereas PMMM's assessment description is written for participants; partly because of this, the assessment description of PMMM is less detailed than that of CMMI

Both OPM3 and CMMI describe detailed standard procedures that assessors should follow when conducting assessments. PMMM also provides these descriptions, but at a lower level of detail, since they are meant to be used by organizations that want to undergo an assessment. Having a detailed description of the assessment method adds to the repeatability of an assessment, which contributes to the reliability of the outcomes. This is relevant when maturity scores are compared with each other: differences in maturity scores are then less attributable to factors other than the differences between the assessed organizations.

Table 25: Comparison table Data gathering method (AP4 - Data gathering method)

Similarities:
- OPM3 and CMMI: the data gathering methods of both models include conducting interviews and consulting documents; neither model prescribes group discussions as one of the gathering methods
- OPM3 and PMMM: the assessment methods of both models include the usage of questionnaires that organizations can fill out themselves; neither model prescribes group discussions as one of the gathering methods
- CMMI and PMMM: neither model prescribes group discussions as one of the gathering methods

Differences:
- OPM3 and CMMI: CMMI assessment participants do not have to fill out any questionnaires, while OPM3 participants can fill out a questionnaire as part of the assessment preparation
- OPM3 and PMMM: PMMM's questionnaire is used by organizations to conduct a full assessment, while OPM3's questionnaire is used by organizations to conduct a self-assessment; OPM3's assessment involves gathering and verifying assessment data using interviews and documents, whereas PMMM's assessment involves gathering data solely by having participants fill out a questionnaire
- CMMI and PMMM: unlike with PMMM, CMMI assessment participants do not have to fill out any questionnaires; CMMI's assessment involves gathering and verifying assessment data using interviews and documents, whereas PMMM's assessment involves gathering data solely by having participants fill out a questionnaire

Although group discussions can strengthen assessment findings and rule out personal biases, none of the three assessment methods prescribes this data gathering method. Both CMMI and OPM3 prescribe interviews and the consultation of documents to retrieve and verify data (Table 25). The questionnaire provided by OPM3 is only meant as a self-assessment tool. The data resulting from this self-assessment can then be used as input (i.e. a trigger) for the more rigorous assessments with external assessors.
PMMM provides one questionnaire for the assessment and does not describe other methods to retrieve additional information or for verification.

Table 26: Comparison table Length of questionnaire (AP6 - Length of questionnaire)

Differences:
- OPM3 and CMMI: OPM3's questionnaire comprises more questions than CMMI's questionnaire
- OPM3 and PMMM: OPM3's questionnaire comprises more questions than PMMM's questionnaire
- CMMI and PMMM: all questions in PMMM's questionnaire focus on project management across all maturity levels; on the other hand, only a portion of CMMI's questions are focused on project management

The questionnaire of OPM3 contains more questions than those of CMMI and PMMM, because each maturity level is made up of multiple components and subcomponents (best practices and capabilities). Each question in the questionnaire signifies either a best practice or a capability, and since OPM3 embodies about 600 best practices, it is not surprising that the project management domain alone results in more than 800 questions. Questionnaires used by CMMI assessors contain fewer details, but this also means that assessors have to rely on their own judgment to create appropriate lists of questions to ask participants during interviews. The same holds for OPM3 assessors. The difference in the number of questions is also explained by the scope of the questionnaires used for the evaluation. The 800 questions of OPM3 cover the entire project management domain across four process maturity levels. The project management related questions in CMMI, however, do not represent one particular maturity level. The questions of CMMI are grouped according to the 23 process areas employed by this model, and the number of process areas relating to project management is only a subset of the total. The 183 questions of PMMM form the standard questionnaire for project management over all five maturity levels. The more questions a maturity model provides, the more aspects are considered, provided that all questions are relevant to the assessment.
However, too many questions may lengthen an assessment process considerably, since it takes time for assessors to convert hundreds of probe questions into a relevant list of questions for interviews.

Table 27: Comparison table Supportive assessment tools (AP2 - Supportive assessment tools)

Similarities:
- OPM3 and CMMI: certifications and trainings are provided for both models
- OPM3 and PMMM: toolsets to conduct (self-)assessments are provided

Differences:
- OPM3 and CMMI: tools for self-assessments and rigorous assessments (to generate results) are provided for OPM3, whereas no such tools are provided for CMMI
- OPM3 and PMMM: no trainings or certification possibilities are provided for PMMM, while they are provided for OPM3
- CMMI and PMMM: unlike with PMMM, CMMI provides no tools for self-assessments or for the generation of results

Since both CMMI and OPM3 employ assessors to conduct the assessment, it is not surprising that trainings and certifications are provided for these two models. Organizations using PMMM have to consult the IIL or the available literature [25] instead of trainings in order to understand PMMM. No electronic tools are provided for CMMI, while assessors of OPM3 and participants of PMMM can use electronic tools to calculate the maturity score of an organization. In addition, OPM3 (Foundation) provides means to conduct self-assessments. Organizations are able to conduct a quick scan on themselves before deciding whether a full-blown assessment using external assessors is necessary.

Table 28: Comparison table Benchmarking (AP7 - Benchmarking)

Similarities:
- OPM3 and CMMI: both models allow registered organizations to request anonymous maturity scores of other assessed organizations for purposes of benchmarking
- OPM3 and PMMM: both models provide aggregate (anonymous) benchmarking scores at the request of registered organizations
- CMMI and PMMM: both models provide aggregate (anonymous) benchmarking scores at the request of registered organizations

Differences:
- OPM3 and PMMM: with PMMM, organizations automatically receive benchmarking scores along with the results, while with OPM3 the scores can be requested after assessments; OPM3 only provides maturity scores aggregated per industry, whereas PMMM also provides scores categorized according to the size of an organization; unlike OPM3, PMMM also allows individual participants to compare their scores with aggregated scores per industry and organization size during the assessment
- CMMI and PMMM: CMMI-registered organizations have to request benchmarking scores, while with PMMM organizations automatically receive benchmarking scores along with the results; CMMI only provides maturity scores aggregated per industry, whereas PMMM also provides scores categorized according to the size of an organization; unlike CMMI, PMMM also allows individual participants to compare their scores with aggregated scores per industry and organization size during the assessment

During and after a PMMM assessment, organizations can access benchmarking scores by default. Organizations conducting an OPM3 or CMMI assessment can only request them after the assessment has ended. In this latter case, organizations receive anonymous maturity scores aggregated per industry, while PMMM also provides explanations of the benchmark findings in the elaborated results report. Benchmarking is important because it allows organizations to compare their maturity scores with those of other, similar organizations.
There is, however, a downside to this, since multiple factors can influence a maturity score. Not much information is gained when only looking at industry-related differences; organizations have to consider other factors such as size or region. Below is another table summarizing the above criteria results regarding the assessment methods.

Table 29: Summary results of assessment method criteria (columns: OPM3, CMMI, PMMM)

The criteria compared are: Assessment commitment; Competence level (assessor, participant); Assessment method description (process phase, activity, deliverable, role, dependency); Data gathering method (questionnaires, interviews, group discussions, document consultations); Length of questionnaire; Supportive assessment tools ((self-)assessment toolset, training, certification); Benchmarking.
- Length of questionnaire (of the project management domain): OPM3: 800+; CMMI: 55-80; PMMM: 183

The assessment methods of all three models pose no requirements for members of organizations participating in the assessments. However, OPM3 and CMMI do provide guidelines for the assessors, since they are the ones responsible for selecting appropriate participants. As PMMM only employs an online questionnaire to gather information, no requirements are described for assessors. The organizations themselves are responsible for selecting participants during PMMM assessments. Because of this, PMMM is the only model that does not include roles in its assessment method description. Except for the self-assessment questionnaire provided by OPM3, both OPM3 and CMMI use interviews and document consultations to gather data during an assessment. Also, training sessions and certification possibilities are provided by both OPM3 and CMMI to help organizations understand the models better. And finally, maturity results are gathered by all three assessment methods for benchmarking purposes.

4.3. Usage analysis

Three experts from the list in Appendix A-2 were interviewed using the questionnaires developed to assess the usage dimension: an assessor of CMMI, an assessor of OPM3 and a member of an organization with prior experience of CMMI. The remaining experts on the list were consulted by email and in informal face-to-face meetings. The first half of the questionnaires was developed to gain information about the interviewee and his/her experiences with the maturity model. The second part, in which interviewees are asked to score the maturity models, was meant to find out how the selected maturity models for PM are rated by users in terms of usability and satisfaction. These scores are only representative, however, if multiple assessors or members of assessed organizations are interviewed for the same maturity model. Since this was not the case, the retrieved scores were not meaningful and were excluded from the analysis and from this thesis. The relevant findings of the interviews are summarized in Appendix D-2, excluding the information about the interviewees (and their organizations) and the scores given to the maturity models. Although the retrieved data of this dimension were mainly used to support the analysis of the previous findings, some additional insight was gained as well. Especially the informal meetings with experts elicited important additions to what was already found, which are summarized in the next paragraphs.

About CMMI and CMMI-DEV

Since CMMI for Development was evaluated during this thesis, it is not surprising that all findings regarding this model are related and limited to the field of software development. However, the idea behind CMMI (i.e. CMM) is that the model can be used to assess all types of processes. This is possible because CMMI assesses maturity by looking at the capability levels of processes that together contribute to achieving goals within an organization.
CMMI does not concern itself with matters such as knowledge or abilities, unlike, for example, OPM3, which focuses on PM. CMMI allows the usage of other tools (e.g. methodologies or best practices) in order to assess types of processes other than those related to software development. For instance, if an ICT organization wants to assess its governance processes, it can consider using CMMI in combination with a tool such as ITIL. ITIL stands for Information Technology Infrastructure Library and is a collection of best practices to assist ICT organizations in structuring their governance processes [4]. So in the end, CMMI for Development is an offspring of CMMI, standardized specifically to assess processes regarding software development.

A vital difference between OPM3 and CMMI

It was mentioned in section 4.1 that an OPM3 assessment is mainly carried out with the purpose of realizing process improvements within the assessed organization. CMMI, on the other hand, also allows organizations to determine their current maturity and rank themselves on a standard model without the intention to improve. In the first part of the framework, this was explained by the structural differences between the maturity models. The interviews and expert consultations led to a more elaborate explanation, which relates to the method employed by the models to determine the maturity level of an organization. According to CMMI, an organization can only achieve level 2 if all process areas of level 1 have fulfilled the necessary requirements. So even though the organization has fulfilled all requirements to achieve maturity level 2, it will still be rated as level 1 if one or more requirements for level 1 have not been met. The model applies the same logic as working with building blocks: one cannot place a block on top of a lower one that is incomplete. Because of this strict distinction between maturity levels, organizations can use CMMI to rank themselves on a standard continuum of maturity. OPM3 works with a completely different logic. Since it works with best practices, scores are awarded to an organization for the presence of each best practice found.
When determining the organizational maturity at the end of the assessment, no distinction is made between scores found in the three PM domains (project, program and portfolio management) or in the four improvement stages (standardize, measure, control and continuously improve). In other words, all scores are added together. This is analogous to filling a bucket with water, where the resulting water level indicates the organizational maturity. An implication of this is that the organizational maturity score does not provide definite information about what an organization is doing well or poorly. OPM3 does not explicitly define maturity levels; it provides best practices that organizations can learn from or adopt to improve their processes.
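The contrast between the two scoring logics can be illustrated with a short sketch. This is a hypothetical illustration only, not code from either standard: the level numbering, data structures and scores are invented for the example.

```python
# Hypothetical sketch of the two scoring logics described above; the data
# structures and numbers are illustrative and not part of either standard.

def staged_level(requirements_met: dict[int, list[bool]]) -> int:
    """CMMI-style 'building blocks' logic: the rating is the highest level
    whose requirements, and those of every level below it, are all met."""
    rating = 1
    for level in sorted(requirements_met):
        if all(requirements_met[level]):
            rating = level
        else:
            break  # an incomplete level blocks everything above it
    return rating

def additive_score(best_practice_scores: list[int]) -> int:
    """OPM3-style 'bucket' logic: scores for every best practice found are
    simply summed, regardless of PM domain or improvement stage."""
    return sum(best_practice_scores)

# Meeting every level-3 requirement does not help while a level-2
# requirement is still missing: the staged rating stays at level 1.
print(staged_level({2: [True, False], 3: [True, True]}))  # 1

# The additive logic has no such gate; all evidence raises the total.
print(additive_score([1, 0, 1, 1]))  # 3
```

The sketch shows why a staged rating can be used to rank an organization on a fixed continuum, while an additive total indicates how much is in place without revealing where the gaps are.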

5. Discussion & conclusions

The purpose of this thesis was to develop a framework to evaluate and compare different maturity models for PM. It resulted in an evaluation framework that looks at maturity models for PM from three different perspectives. In order to answer the main research question of this thesis, four sub-questions had to be answered. These questions, including their answers, are provided below.

1. What is a maturity model for project-based management?

According to the literature consulted, a maturity model for PM is a framework comprising several successive maturity stages/levels that allows an organization to assess its current ability to conduct PM and to determine its steps towards desired improvements.

2. What constitutes a maturity model for project-based management?

The literature study revealed that a maturity model for PM comprises two components: a maturity reference model and an assessment method. A maturity reference model is a description of the requirements that an organization should meet in order to achieve a desired maturity level embodied by the maturity model. With a maturity reference model, users of a maturity model know what should be assessed to determine the maturity of an organization. Two user groups make use of a maturity model: assessors and participants. Assessors are people equipped with the knowledge to assess an organization in order to determine its maturity level, and participants are members of the organizations being assessed. To provide these users with the knowledge of how the assessment should be conducted, a maturity model also describes an assessment method. This description explains what should be done to assess whether a requirement is fulfilled and whether an organization has achieved a maturity level. There are two reasons why these two components should be distinguished when evaluating and comparing maturity models for PM.
First, the two components contain different characteristics that should be measured differently. At the most basic level, the assessment method describes a process, while the reference model description has a more static nature. Second, a maturity model for PM can have more than one assessment method to apply its reference model. If a distinction is made between the two components when developing measures for the framework, the components can be evaluated separately. This may save the effort of having to evaluate an entire maturity model when it is only the assessment method that differs.

3. What are relevant dimensions for the evaluation of maturity models for project-based management?

Three dimensions were selected for inclusion in the evaluation framework: structure, applicability and usage. These three dimensions were used to evaluate three existing maturity models for PM. The structure dimension was used to reveal the concepts that underlie each maturity model and the activities that comprise the corresponding assessment method. The evaluation showed that each maturity reference model employs a different structure and different concepts. Together, these concepts reflect the approach that a maturity model takes to define and determine maturity. And by evaluating the structure of the assessment method, it became clear what kind of procedure is needed to use a maturity model. Focusing on the structure alone simplified the process of understanding why a maturity model is built up in a certain way and what effects this has on its application. In addition, this dimension was able to show where the differences between the maturity models lie and what implications these differences have. Complementary to the structure dimension, the applicability dimension evaluated specific properties of a maturity model that may affect its capability of being applied in practice. The broad definition of this dimension allowed a flexible approach to selecting relevant properties.
The properties that could be evaluated along this dimension included restrictions posed on their usage, subjects they do or do not describe, requirements they prescribe and the tools they provide to their users. These are all useful pieces of information for organizations that are planning to select a maturity model.

The third and last dimension was meant to evaluate and compare the experiences of the users of the maturity models, but due to a lack of contacts, user experiences could not be retrieved for all three models. This dimension is nevertheless essential because it focuses on how users experience the usage of the models in practice instead of what is described about the models in theory (see the second dimension). Moreover, the retrieved information was useful for the verification and explanation of the findings of the previous two dimensions.

4. What are the main similarities and differences between maturity models for project-based management?

One of the main similarities between maturity models for PM lies in the structure of their assessment methods. Maturity assessments described by these models always start with a preparation phase, continue with a conduct assessment phase, and end with a finalization or improvement phase in which, respectively, a report is delivered or an improvement plan is developed. Most activities in the preparation and finalization phases are similar. The differences between the models are mainly present in the second phase, where the activities regarding data retrieval and results generation are described. A model such as PMMM gathers information about an organization using only one questionnaire, while OPM3 and CMMI require the intervention of external assessors to conduct interviews. Moreover, the results are generated automatically by a tool during an OPM3 or PMMM assessment, while they are generated manually during a CMMI assessment.

Another similarity is that maturity models for PM are all developed for the same goal: to help organizations that are adopting the project-based way of working. It is in the way of achieving this goal that these maturity models can differ from each other. For example, a maturity model may be developed to enable an organization to verify its current position on a standard continuum of maturity (e.g. CMMI), while another maturity model may help organizations by suggesting improvement initiatives (e.g. OPM3). On top of that, a maturity model may exist to do both (e.g. PMMM).
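The shared three-phase assessment structure and the per-model differences in the conduct phase, as described above, can be sketched as plain data. A minimal illustration follows; the dictionary layout is my own, and the entries merely restate the text:

```python
# Sketch of the assessment-method structure shared by the three models:
# all follow prepare -> conduct -> finalize, differing mainly in how the
# conduct phase gathers data and generates results (per the text above).

PHASES = ["prepare", "conduct assessment", "finalize"]

conduct_phase = {
    "OPM3": {"data gathering": "interviews by external assessors",
             "results": "generated by a tool"},
    "CMMI": {"data gathering": "interviews by external assessors",
             "results": "generated manually"},
    "PMMM": {"data gathering": "single online questionnaire",
             "results": "generated by a tool"},
}

def differs_in_conduct(a: str, b: str) -> bool:
    """True if two models differ in the conduct-assessment phase."""
    return conduct_phase[a] != conduct_phase[b]

print(differs_in_conduct("OPM3", "CMMI"))   # True (results generation differs)
```

This mirrors the observation that the models' assessment methods diverge almost entirely inside the second phase.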
Maturity models for PM also differ from each other in the way they define maturity itself. This has implications for a maturity model's structure (maturity reference model), because the definition of maturity determines what aspects of an organization a model considers important when assessing its maturity. For instance, OPM3 defines maturity in terms of best practices, while CMMI defines maturity in terms of processes and whether these processes are effective in helping an organization attain predetermined business goals. Consequently, assessors will determine the existence of best practices during an OPM3 assessment, or verify the effectiveness of processes during a CMMI assessment, to assess the maturity of an organization.

With the above conclusions, the main research question can be answered next.

What measures are needed to evaluate the similarities and differences between maturity models for project-based management?

To measure the maturity models along the structure, applicability and usage dimensions, the techniques used were, respectively, Process-Data Diagram (PDD) modeling, evaluation criteria and interviews.

The construction of PDDs was useful in several ways: it enabled the evaluation of the assessment method and the reference model separately, while still showing the connection between the two components. This measure made it possible to depict the structure of a maturity model in a simple way. The comparison tables that were developed accordingly made it easy to reveal the position and nature of the similarities and differences between the maturity models. For instance, they showed that most of the differences between the assessment methods of OPM3, CMMI and PMMM lie in their descriptions of how the gathered assessment data should be processed to generate the results. The most basic similarity is that all three assessment methods have a preparation phase, a conduct assessment phase and a finalize assessment phase.

The evaluation criteria were configured to evaluate specific properties affecting the applicability of the maturity models. Of the 4 properties, openness appears to be the most important property of a maturity model, because it determines whether information can be retrieved to evaluate the model on the remaining properties of the list. Properties such as industry & size and scope were useful in that they may describe limitations to the context in which a maturity model can be applied.
Criteria such as maturity level description, assessment method description, data gathering method and supportive assessment tools were able to reveal aspects related to the ease of use of the maturity models. Criteria such as maturity dimension and process areas are only useful if the evaluated maturity models employ maturity dimensions and process areas, and if there is enough time to examine the different definitions used by each maturity model to describe them. Nevertheless, these criteria shed light on the aspects for which organizations can achieve maturity according to a maturity model, so they should be included in the framework when possible. Finally, criteria such as the length of the questionnaire and assessment commitment contributed little to distinguishing one maturity model from another; these criteria can be excluded from further usage.

As for the third measure, few interviews were conducted due to the lack of contacts and the unavailability of people from both user groups for each of the three models. However, the retrieved data still provided additional insight into the maturity models that was not found with the previous two measures (section 4.3). Thus, the arguments for the relevance of this dimension and measure remain valid. The real value of a maturity model lies in the eyes of its users, not in the literature or publications. And interviews are more useful than a survey because they enable the collection of more in-depth information, such as user experiences.

In the end, the purpose was to find out whether the above measures were suitable to describe the three dimensions and to show similarities and differences between maturity models for PM. After applying the framework to the three maturity models, modeling PDDs and using evaluation criteria proved to be useful measures for the corresponding dimensions. And regardless of the small number of formal interviews, the useful data gathered by speaking with assessors and a user demonstrates the adequacy of the third measure.
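As an illustration of how such a criteria-based comparison table can be represented, here is a minimal sketch. The criterion names follow the text, but the verdict strings are simplified placeholders for illustration, not the thesis's actual findings:

```python
# Sketch of a criteria comparison table for the applicability dimension.
# Criterion names follow the text; the verdicts are simplified
# placeholders, not the thesis's actual evaluation results.

CRITERIA = ["openness", "industry & size", "scope", "data gathering method"]

evaluations = {
    "OPM3": {"openness": "partly open", "industry & size": "any",
             "scope": "project/program/portfolio",
             "data gathering method": "interviews by assessors"},
    "CMMI": {"openness": "open", "industry & size": "software-oriented",
             "scope": "development processes",
             "data gathering method": "interviews by assessors"},
    "PMMM": {"openness": "partly open", "industry & size": "any",
             "scope": "project management",
             "data gathering method": "online questionnaire"},
}

def comparison_table(evals, criteria):
    """Return rows of (criterion, value per model) for side-by-side reading."""
    models = sorted(evals)
    rows = [["criterion"] + models]
    for c in criteria:
        rows.append([c] + [evals[m].get(c, "n/a") for m in models])
    return rows

for row in comparison_table(evaluations, CRITERIA):
    print(" | ".join(f"{cell:26}" for cell in row))
```

Laying the verdicts out per criterion rather than per model is what makes the position of each difference immediately visible, which is the role the comparison tables play in the framework.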

5.1. Framework requirements

At the beginning of this thesis, five requirements were defined to assess the quality of the evaluation framework as a whole. These requirements are repeated below, together with the assessment of the framework.

Completeness
The framework is comprehensive for the analysis and comparison of maturity models for PM; however, it is not yet complete because of the few user experiences gathered for the third dimension. The framework is considered complete when more interviews are conducted with members of both user groups for each maturity model being evaluated. This may happen when this framework is used by the PMI-NL Project Group to compare maturity models for PM.

Consistency
The framework is consistent, since all activities and concepts are consistently defined and described throughout the framework.

Applicability
This thesis has described the initial application of the framework. The applicability of the framework will be further determined by the PMI-NL Project Group when using the framework to compare other maturity models for PM.

Reliability
The reliability of the framework is determined by the way the framework is constructed and by the sources consulted for information. The framework is not reliable if the findings of the three measures contradict each other. The correctness of the PDDs, criteria results and interview data has been verified by experts, and references are given for every piece of retrieved information. Above all, the findings of the three measures complement each other well, so the framework is reliable.

Efficiency
Evaluating maturity models for PM using the framework is efficient because it focuses on the most important aspects of these models to evaluate and compare. Costs are only incurred when acquiring information about the models. Those wanting to compare maturity models do not have to search for and purchase literature aimlessly, since the framework already describes what information is needed. License costs can be avoided, since the framework does not enforce the evaluation of models that are not open to the public. Modeling PDDs can be done using Microsoft Visio, which is not expensive to purchase. Using the framework does require some effort and time, since evaluating along the applicability dimension requires thorough desk research with the gathered information about the models.

Ultimately, the application of the framework is not limited to maturity models for the PM discipline. The only exceptions are several criteria of the applicability dimension (e.g. scope, process areas). The remaining criteria may be used to evaluate maturity models of other disciplines. PDDs can be used to model maturity reference models and assessment methods regardless of the discipline. The same holds for interviews with user groups for the usage dimension.

5.2. Suggestions for situational selection

This thesis showed that there are significant differences between maturity models for PM. Although the scope of this research does not include examining the possible fit between maturity models for PM and certain types of organizations, some findings are worth mentioning. In particular, these findings concern the implications that certain properties of each maturity model have for its usage. The implications are described per maturity model below as characteristics of organizations that could opt for it. Each implication is accompanied by the property of the model that explains it.

Organizations may opt for OPM3 when they:
- want to find out whether a rigorous assessment is necessary. OPM3 provides facilities to conduct self-assessments prior to a rigorous assessment.
- want to improve their project, program or portfolio processes. OPM3 describes best practices of project, program as well as portfolio management.
- need assistance in developing a plan for improvement initiatives. An OPM3 assessment does not necessarily end after delivering the report containing the assessment results. If needed, OPM3 assessors also provide assistance with the development of an improvement plan afterwards.
- do not need to rank themselves on a standard maturity continuum. OPM3 does not describe strict maturity levels, so organizations cannot state that they have achieved, for example, maturity level 1 or 2 after an assessment has ended.

Organizations may opt for CMMI (for Development) when they:
- want to assess their project management processes in order to improve their software development processes. CMMI for Development is specifically developed to assess software development processes; the project management practices in this model are described for the purpose of improving software development processes.
- are certain that a rigorous assessment is what they seek. SEI does not provide self-assessment tools for CMMI.
- want to rank themselves on a standard maturity continuum. Because CMMI describes a standard path to maturity and defines strict maturity levels, organizations can announce themselves as having achieved a certain CMMI maturity level after an assessment.

Organizations may opt for PMMM when they:
- seek a quick and simple assessment without the intervention of any external assessors. PMMM only uses an online questionnaire to gather information about an organization; no intervention of assessors is needed.
- want to adopt the project-based way of working. PMMM describes an approach to implement project management in five stages (i.e. maturity levels).
- can decide for themselves whether the results are applicable to their own specific situation and have a high tolerance for the reliability of the results. Because PMMM uses only an online questionnaire to gather information, there is no guarantee that the assessment results are aligned with the situation of the organization. The reliability of the results is also questionable, since the organizations themselves are responsible for selecting the respondents who fill out the questionnaire. This is the trade-off for the simplicity and the cost savings provided by this model.
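The selection suggestions above can be condensed into a rule-of-thumb sketch. This is a simplification for illustration, restating the bullets, not a validated selection instrument:

```python
# Rule-of-thumb selection sketch condensing the suggestions above.
# Illustrative only; the real choice involves many more considerations.

def suggest_model(wants_self_assessment_first: bool,
                  software_development_focus: bool,
                  needs_maturity_level_ranking: bool,
                  wants_quick_questionnaire_only: bool) -> str:
    if wants_quick_questionnaire_only:
        return "PMMM"   # online questionnaire, no external assessors needed
    if software_development_focus or needs_maturity_level_ranking:
        return "CMMI"   # development focus, strict standard maturity levels
    if wants_self_assessment_first:
        return "OPM3"   # self-assessment possible before a rigorous one
    return "OPM3"       # default: broad project/program/portfolio scope

print(suggest_model(False, True, True, False))   # CMMI
```

The ordering of the rules is itself a judgment call; an organization matching several bullets would need to weigh them, which this sketch does not attempt.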

5.3. Recommendations for further research

During the research, an attempt was made to compile a list of relevant criteria. More research should be done to validate the current list and to examine the usefulness of other properties that may help distinguish between maturity models. The validity of the current list can, for example, be determined by asking experts to judge the importance of the properties elicited by each criterion [42].

Due to the unavailability of the needed resources, models such as P3M3 and MINCE2 could not be included in the evaluation. While OPM3 is one of the most prominent maturity models in America, based on the PMBOK, P3M3 is a well-known model in Europe, based on PRINCE2. Had P3M3 been included in the research, it would have been interesting to see the differences between these two models. And because MINCE2 is one of the newest and most accessible maturity models developed, it would have been interesting to evaluate as well. It is now the task of the PMI-NL Project Group and future researchers to examine these two maturity models.

Furthermore, additional research could evaluate different assessment methods of the same maturity model to find out which method is the most suitable. As with maturity models themselves, differences between assessment methods may affect their applicability in an organization. Above all, further research should focus on the possible fit between different maturity models for PM and different types of organizations, and the effects this fit might have on the performance of an organization.

References

[1] Ten Zweege, H.C., De Koning, M.C. & Bons, F. (2006). PPI Project Performance Improvement. Succesvolle projecten zijn geen toeval. The Hague, The Netherlands: Sdu Uitgevers.
[2] Kwak, Y.H. & Ibbs, C.W. (2002). Project management process maturity (PM)² model [electronic version]. Journal of Management Engineering, 18(3), 150-155.
[3] Pennypacker, J.S. & Grant, K.P. (2003). Project management maturity: an industry benchmark [electronic version]. Project Management Journal, 34(1), 4-11.
[4] Fiedler, F.E. (1972). The effects of leadership training and experience: a contingency model interpretation [electronic version]. Administrative Science Quarterly, 17(4), 453-470.
[5] Wikipedia.org (2007). Fiedler contingency model. Retrieved August 27, 2007 from Wikipedia.org: http://en.wikipedia.org/wiki/fiedler_contingency_model
[6] Lin, W.T. & Shao, B.B.M. (2000). The relationship between user participation and system success: a simultaneous contingency approach [electronic version]. Information & Management, 37, 283-295.
[7] Beach, L.R. & Mitchell, T.R. (1978). A contingency model for the selection of decision strategies [electronic version]. The Academy of Management Review, 3(3), 439-449.
[8] Shenhar, A.J. (2001). One size does not fit all projects: exploring classical contingency domains [electronic version]. Management Science, 47(3), 394-414.
[9] Ginzberg, M.J. (1980). An organizational contingencies view of accounting and information systems implementation [electronic version]. Accounting, Organizations and Society, 5(4), 369-382.
[10] Association for Information Systems. (2007). Theories used in IS research: contingency theory. Retrieved August 6, 2007, from the World Wide Web: http://www.istheory.yorku.ca/contingencytheory.htm
[11] Brinkkemper, S., Saeki, M. & Harmsen, F. (1999). Meta-modelling based assembly techniques for situational method engineering [electronic version]. Information Systems, 24(3), 209-228.
[12] Project Management Institute [PMI]. (2003). Organizational project management maturity model (OPM3): knowledge foundation. Newtown Square, Pennsylvania, USA: PMI.
[13] Project Management Institute [PMI]. (2004). A guide to the project management body of knowledge (3rd ed.). Newtown Square, Pennsylvania, USA: PMI.
[14] Wikipedia (2007). Process management. Retrieved May 20, 2007, from Wikipedia.org: http://en.wikipedia.org/wiki/process_management
[15] Wikipedia (2007). Program management. Retrieved August 5, 2007, from Wikipedia.org: http://en.wikipedia.org/wiki/project_management
[16] Grant, K.P. & Pennypacker, J.S. (2006). Project management maturity: an assessment of project management capabilities among and between selected industries [electronic version]. IEEE Transactions on Engineering Management, 53(1), 59-68.

[17] De Wit, A. (1988). Measurement of project success [electronic version]. Project Management Journal, 6(3), 164-170.
[18] Dvir, D., Raz, T. & Shenhar, A.J. (2003). An empirical analysis of the relationships between project planning and project success [electronic version]. International Journal of Project Management, 21, 89-95.
[19] Storm, P.M. & Janssen, R.E. (2004). High performance projects: a speculative model for measuring and predicting project success. Conference paper submitted to IRNOP VI Project Research Conference. Retrieved November, 2006 from www.ou.nl: http://www.ou.nl/docs/faculteiten/mw/mw%20working%20papers/gr%2004-04%20storm%20en%20janssen.pdf
[20] Khang, D.B. & Moe, T.L. (AIT-SOM working paper, August 2006). Success criteria and factors for international development projects: a lifecycle-based framework [electronic version].
[21] Hyväri, I. (2006). Project management effectiveness in project-oriented business organizations [electronic version]. International Journal of Project Management, 24, 216-225.
[22] Johnson, J., Boucher, K.D., Connors, K. & Robinson, J. (2001). Project management: the criteria for success [electronic version]. Software Magazine, Feb 2001. http://findarticles.com/p/articles/mi_m0smg/is 2/ai_756248
[23] Atkinson, R. (1999). Project management: cost, time and quality, two best guesses and a phenomenon, it's time to accept other success criteria [electronic version]. International Journal of Project Management, 17(6), 337-342.
[24] Clarke, A. (1999). A practical use of key success factors to improve the effectiveness of project management [electronic version]. International Journal of Project Management, 17(3), 139-145.
[25] Kerzner, H. (2005). Using the project management maturity model: strategic planning for project management (2nd ed.). New Jersey, USA: John Wiley & Sons.
[26] Crawford, J.K. (2002). Project management maturity model: providing a proven path to project management excellence. Basel, Switzerland: Marcel Dekker, Inc.
[27] Murray, A. & Ward, M. (2006). Capability maturity models: using P3M3 to improve performance. Outperform, UK. Retrieved 28 November 2006 from outperform.co.uk: http://www.outperform.co.uk/portals/0/p3m3%20performance%20improvement%20v2-APP.pdf
[28] Software Engineering Institute [SEI] (August, 2006). CMMI for Development, version 1.2: improving processes for better products. Retrieved March 3, 2007 from sei.cmu.edu (searched with "CMMI for development"): http://www.sei.cmu.edu/publications/
[29] Koomen, T. & Pol, M. (1998). Improvement of the test process using TPI. Retrieved May 2, 2007 from sogeti.nl: http://www.sogeti.nl/images/summary%20tpi%20uk%20v%2e2_tcm6-3237.pdf
[30] Earthy, J. (1999). Usability maturity model: processes. Public document. Retrieved May 2, 2007 from the World Wide Web: http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/usability-maturity-Model%5B2%5D.PDF

[31] Cooke-Davies, T.J. (2004). Measurement of organizational maturity: what are the relevant questions about maturity and metrics for a project-based organization to ask, and what do these imply for project management research? In D.P. Slevin, D.I. Cleland & J.K. Pinto (Eds.), Innovations: project management research 2004 (pp. 211-228). Newtown Square, Pennsylvania, USA: PMI.
[32] Office of Government Commerce [OGC]. (2006). Portfolio, programme & project management maturity model (P3M3). Crown copyright. Retrieved 28 November 2006 from the World Wide Web: http://www.ogc.gov.uk/documents/p3m3.pdf
[33] MINCE2 Foundation (2007). Maturity INcrements IN Controlled Environments (MINCE2). Retrieved May 20, 2007 from the World Wide Web: http://www.mince2.org/
[34] International Institute for Learning [IIL]. (2007, June 4). Kerzner project management maturity model online assessment. Retrieved June 4, 2007, from the World Wide Web: http://www.iil.com/pm/kpmmm/
[35] Schlichter, J. (2000). Organizational project management maturity model program plan. Retrieved February 26, 2007, from the World Wide Web: https://committees.standards.org.au/committees/it-030/z0005/it-030-z0005.pdf
[36] DNV (2007). OPM3 standard knowledge course. Retrieved June 6, 2007 from dnv.nl: http://www.dnv.nl/dnvtraining/categorie/project_management/opm3_standard_knowledge_course.asp
[37] Software Engineering Institute [SEI] (August, 2006). Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.2: method definition document. Retrieved March 3, 2007 from sei.cmu.edu (searched with "SCAMPI"): http://www.sei.cmu.edu/publications/
[38] Hong, S., Van Den Goor, G. & Brinkkemper, S. (1993). A formal approach to the comparison of object-oriented analysis and design methodologies [electronic version]. Proceedings of the Twenty-Sixth Hawaii International Conference, 4, 689-698.
[39] Weerd, I. van de & Brinkkemper, S. (2007). Meta-modeling for situational analysis and design methods [electronic version]. To appear in the Handbook of Research on Modern Systems Analysis and Design Technologies and Applications. Hershey, PA, USA: Idea Group Publishing.
[40] White, D. & Fortune, J. (2002). Current practice in project management: an empirical study [electronic version]. International Journal of Project Management, 20, 1-11.
[41] Wikipedia.org (2007). ITIL. Retrieved September 10, 2007 from Wikipedia.org: http://nl.wikipedia.org/wiki/information_technology_infrastructure_library
[42] Dalkey, N. & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts [electronic version]. Management Science, 9(3), 458-467.
[43] Project Management Institute [PMI] (2006). OPM3 ProductSuite Assessor Training. PMI Inc.
[44] Project Management Institute [PMI] (2007). OPM3 Consultant Training Program. PMI Inc.
[45] International Institute for Learning [IIL] (2007). Kerzner project management maturity assessment: level 1-5 assessment report for ABC company [sample report]. Provided June 3, 2007 by C. Damiba, IIL NY.

[46] Project Management Institute [PMI] (2007). Organizational project management maturity model (OPM3). Retrieved April 5, 2007, from the World Wide Web: http://opm3online.pmi.org/default.aspx
[47] Project Management Institute [PMI] (2004). An executive's guide to OPM3. Retrieved June 5, 2007 from pmi.org: http://opm3online.pmi.org/demo/downloads/pp_opm3execguide.pdf
[48] Software Engineering Institute [SEI] (2007). SEI's official CMMI website: publications. Retrieved April 4, 2007 from the World Wide Web: http://www.sei.cmu.edu/publications/
[49] Software Engineering Institute [SEI] (August, 2006). Appraisal requirements for CMMI, Version 1.2. Retrieved March 3, 2007 from sei.cmu.edu (searched with "appraisal requirements"): http://www.sei.cmu.edu/publications/

APPENDIX

Appendix A-1 PMI Project Group members and roles

Name (role within project group) / Organization of employment (role):
- John Verstrepen (Sponsor) / PMI NL (General Manager)
- Chris ten Zweege (Chairman) / Capgemini Netherlands (Cluster Manager PPI Consulting)
- Winnie Weintré (Member) / PRI Management (Project and Interim Manager)
- Remco Meisner (Member) / Andarr (Senior Project, Program and Interim Manager)
- Carl Splinter (Member) / Stork (n/a)
- Jeroen l'Ecluse (Member) / ABN Amro (Vice President Project Management Office)
- Jan van Galen (Member) / Van Aetsveld (Change and Project Manager)
- Tjie-Jau Man (Member) / University Utrecht / Capgemini (Student on work placement)

Appendix A-2 Consulted experts

Expert name (role in relation to maturity model, organization of employment), per maturity model:

OPM3:
- Chris ten Zweege (OPM3 assessor, Capgemini)
- Lex van der Helm (OPM3 assessor, Capgemini)
- Raimond Wets (OPM3 assessor, Capgemini)

CMMI:
- Ben Kooistra (CMMI assessor/IT architect, Capgemini)
- Gerhard Lutjenhuis (CMMI lead assessor/PM auditor, Capgemini)
- Ahmet Ersahin (senior consultant, Capgemini)
- Bert den Ouden (manager change management Schade & Inkomen, ING)

PMMM:
- Christian Damiba (director Methodology and Assessment Solutions, IIL)

Appendix B Process-Data Diagram Modeling (derived from [39])

The meta-process model
The meta-process model is an adapted version of the UML activity diagram, where the activities and the transitions between them are depicted graphically. Activities are shown as rectangles with rounded corners and can contain a collection of (sub-)activities where applicable. The transitions are depicted as arrows that indicate the progress from one activity to another.

Activities
Activities can be subdivided into standard activities and composite activities. The latter can, in turn, be subdivided into open and closed activities. An open activity is a composite activity of which the (sub-)activities are elaborated; a closed activity is one of which the (sub-)activities are not shown. To increase the readability and understandability of the PDDs, the framework only uses the notation of standard activities and open activities when modeling the PM maturity models. Also, although they are all officially called activities, from here on the framework uses the term processes for open activities and the term activities for (standard) sub-activities (see Figure 1 below).

Figure 1: Activity types

Transitions
There are officially four types of transitions: unordered, sequential, simultaneous and conditional. Unordered means that the activities of a process can be carried out in an arbitrarily chosen order, which is illustrated by the absence of transition arrows between the activities. Sequential activities, on the other hand, are obliged to follow a predefined order. The arrows that connect these activities indicate the order in which they should be carried out (see Figure 2).

Figure 2: Transition types

The concurrent property of simultaneous activities is depicted using two thick black lines, respectively called the fork and the join. The fork initiates the simultaneous flow, in which separate activities are carried out at the same time, and the join closes this flow by uniting all separate transitions into one single transition. It is important to note that the following activity can only be initiated once all simultaneous activities between the fork and the join have been completed. Lastly, the conditional flow is shown as a toppled square (a diamond) that splits an incoming transition into two or more conditional transitions. A conditional transition is accompanied by a condition written between brackets (see Figure 2). This condition has to be formulated in such a way that it can be answered with true or false.

The meta-data model
The meta-data model is comparable to the UML class diagram. An obvious difference is that a meta-data model uses concepts instead of classes. Just like activities, concepts can be subdivided into standard and composite concepts, with the latter divisible into open and closed concepts. And just as before, the framework only uses the notation of standard and open concepts when constructing a meta-data model. Concepts are depicted as rectangles and written in capitals, using singular nouns, within the meta-model as well as outside it. In order to distinguish the deliverables of activities in the meta-process model from the concepts underlying a PM maturity model, the latter are depicted using gray boxes (see Figure 3).

Figure 3: Concept types (standard, open, closed and maturity model concepts)

Concepts are connected by relationships. The evaluation framework employs three types of relationships: generalization, association and aggregation.

Relationships
Generalization is a type of relationship between a general concept and a more specific concept. It is illustrated by a directed line with an open arrowhead pointing at the general concept. A generalization is also called an "is-a" relationship. The generalization relationship shown in Figure 4 below can be read as "a survey/interview is a data source". An association is a structural relationship that describes how concepts are connected to each other. It is depicted as a simple undirected line between two or more concepts. The connection represented by an association is expressed as an active verb, together with a black triangle that indicates the direction in which the connection should be read. Thus, the association depicted in Figure 4 can be read as "a maturity score determines a maturity". Lastly, an aggregation denotes a relationship between a concept as a whole and other concepts as parts of it. This relationship is also known as a "has-a" relationship. It is shown as a directed line with an open diamond pointing at the concept that forms the whole, which is also an open concept. In the example below, the aggregation relationship can be read as "a plan has an action and (has a) schedule".

Figure 4: Relationship types
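As a minimal sketch, the three relationship types and their reading examples could be represented as plain data, with a helper that reproduces the readings given in the text (the triple layout is my own, for illustration):

```python
# Sketch of the three relationship types between concepts, using the
# reading examples from the text (Figure 4). Each relationship is a
# (source, kind, target, verb) tuple; only associations carry a verb.

relationships = [
    # generalization: "a SURVEY/INTERVIEW is a DATA SOURCE"
    ("SURVEY/INTERVIEW", "generalization", "DATA SOURCE", None),
    # association: "a MATURITY SCORE determines a MATURITY"
    ("MATURITY SCORE", "association", "MATURITY", "determines"),
    # aggregation: "a PLAN has an ACTION and (has a) SCHEDULE"
    ("ACTION", "aggregation", "PLAN", None),
    ("SCHEDULE", "aggregation", "PLAN", None),
]

def read_aloud(rel):
    """Render a relationship the way the text reads it."""
    src, kind, dst, verb = rel
    if kind == "generalization":
        return f"a {src} is a {dst}"
    if kind == "association":
        return f"a {src} {verb} a {dst}"
    return f"a {dst} has a(n) {src}"    # aggregation: whole has part

for r in relationships:
    print(read_aloud(r))
```

Note that the aggregation arrow points from part to whole in this encoding, mirroring the open diamond pointing at the whole in the diagram notation.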

Multiplicity
Besides a name and a direction, there is another property of an association worth mentioning: multiplicity. This property indicates how many instances of a certain concept have a connection with an instance of another concept. Multiplicity is expressed numerically at each end of an association. The different forms of multiplicity are shown in Table 30.

Table 30: Forms of multiplicity
Form    Meaning
1       Exactly one
0..1    Zero or one
0..*    Zero or more
1..*    One or more
5       Exactly five

In Figure 5, the association example of Figure 4 is repeated, but this time with expressions of multiplicity. According to this example, one or more instances of maturity scores can determine precisely one instance of maturity, provided that one is assuming the assessment of one single organization.

Figure 5: Example association with multiplicity

The process-data diagram
The combination of a meta-process and a meta-data model creates a Process-Data Diagram (PDD), which shows the relationships between the processes and activities of an assessment process and the deliverables resulting from each activity. These deliverables are in turn connected to the concepts underlying a PM maturity model, which are also depicted in the PDD. The connection between an activity and a deliverable is expressed using a directed, dashed line with its arrowhead pointing to the deliverable. An example of a PDD is shown in Figure 6.

Figure 6: Example Process-Data Diagram (the process "Assess maturity" contains the activities "Assess process", "Determine maturity score" and "Determine maturity"; the deliverables MATURITY SCORE and MATURITY are connected by the association "determines")
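A simplified reading of the PDD in Figure 6 can be expressed as data, together with a small multiplicity check matching the forms of Table 30. The dictionary layout is an assumption for illustration, not part of the PDD notation itself:

```python
# Sketch of the example PDD of Figure 6 as data: activities of the
# "Assess maturity" process each produce a deliverable concept, and the
# concepts are linked by the "determines" association, whose
# multiplicity says one or more scores determine exactly one maturity.

def parse_multiplicity(form: str):
    """'1' -> (1, 1); '0..1' -> (0, 1); '1..*' -> (1, None); None = unbounded."""
    if ".." in form:
        lo, hi = form.split("..")
        return int(lo), (None if hi == "*" else int(hi))
    n = int(form)
    return n, n

def allows(form: str, count: int) -> bool:
    """Does a count of instances satisfy a multiplicity form?"""
    lo, hi = parse_multiplicity(form)
    return count >= lo and (hi is None or count <= hi)

pdd = {
    "process": "Assess maturity",
    "activities": [
        {"name": "Determine maturity score", "deliverable": "MATURITY SCORE"},
        {"name": "Determine maturity", "deliverable": "MATURITY"},
    ],
    "association": {"from": "MATURITY SCORE", "verb": "determines",
                    "to": "MATURITY",
                    "multiplicity": {"MATURITY SCORE": "1..*", "MATURITY": "1"}},
}

# Three maturity scores determining one maturity respects the multiplicity:
m = pdd["association"]["multiplicity"]
print(allows(m["MATURITY SCORE"], 3), allows(m["MATURITY"], 1))   # True True
```

Encoding a PDD this way also suggests how the framework's comparison tables can be derived mechanically: two models differ wherever their activity lists or concept associations differ.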

Appendix D-1 Assessor & user sample questionnaires

Questionnaire user

About the user
1. How long have you had experience with the maturity model?

About the maturity model & assessment method
2. When did your organization start applying the model?
3. What was the reason for this?
4. What were the expectations/objectives for applying the model?
5. Can you tell us how the assessment went?
   a. Organizational levels involved
   b. Number of people involved
   c. Points of attention during the assessment
6. What does your organization's management think of the model? Do they firmly stand behind it? Do they actively support the initiative?
7. Does your organization use the assessment to enable structural improvements?
8. What did your organization do with the results after the assessment? What could be done with the obtained results?
9. Has your organization carried out repeated assessments since its first one, in order to track the progress of the improvements?
10. Finally, to summarize, I would like to ask you to score the model on a number of dimensions (grade 1-10).
   d. Subjective usability of the model/assessment
      i. Understandability & learnability
      ii. Time and effort needed to prepare an assessment
      iii. Applicability (time and effort needed to conduct an assessment)
   e. Subjective usability/reliability of the results
      i. Representation of reality
      ii. Understandability
      iii. Applicability
   f. Subjective satisfaction
      i. Satisfaction with the model

Questionnaire: assessor

About the assessor
1. How long have you been working with the maturity model?
2. Since when have you been a certified assessor for the model?
3. How often have you been involved in an assessment since certification?
4. In which industries have you applied the assessment?
5. At what types and sizes of organizations have you applied the assessment?
6. Have you carried out repeated assessments at the same organizations?

About the maturity model & assessment method
7. Is there a standard procedure for an assessment?
8. Can you tell us about your experiences with the model?
   a. What are the strengths of the model?
   b. What are the weaknesses of the model?
   c. What are the limitations of the model compared to other maturity models?
9. Finally, to summarize, I would like you to score the model on a number of dimensions (grade 1-10).
   a. Subjective usability of the model/assessment
      i. Understandability & learnability
      ii. Time and effort to prepare an assessment
      iii. Applicability (time and effort to perform an assessment)
   b. Subjective usefulness/reliability of the results
      i. Representation of reality
      ii. Understandability
      iii. Applicability
   c. Subjective satisfaction
      i. Satisfaction with the model

Appendix D-2 Summary of interview findings

CMMI

What prompted the application of the model, and what were the expectations/objectives?
- to improve the controllability of processes

Can you tell us how the assessment went?
Commitment from all levels of the organization being assessed was important, and everyone had to be motivated. A success factor here was being able to pinpoint pain points with the help of the model and then demonstrate within a short period that a process ran better once it was carried out differently. Another advantage was the flat structure of the organization and its size. In a group of <100 it is easier to find out whether people work according to the same guidelines than in an organization of >500 people spread over different locations inside or outside the same building.

What does management think of the model? Do they firmly support it? Do they actively support the initiative?
Management is convinced of the benefits and supports the initiative by explaining to everyone in the organization during the kick-off what the intention is and what it is good for. You measure management commitment by looking at who shows up at the kick-off, the assessment and the coaching. Consistency is also very important: if you require a process to be present and it is missing, you should not accept that. And once you get results back from the assessment, you must also act on them.

What could be done with the results obtained?
The report contains improvement points, which you can prioritize according to the importance of the recommendations and the input behind each recommendation. We looked at whether we could start quickly and whether we could benefit quickly. Such a report clearly shows which processes still need improvement and which already meet the requirements of the CMMI model.

Is there a standard procedure for an assessment?
Yes.
The SEI has defined a standard method for the appraisal. But that is not the only method for applying CMMI. There are other accredited bodies that can also carry out CMMI appraisals, and if their methods meet the requirements the SEI sets for CMMI appraisals, those methods are also recognized by the SEI. The SEI does not provide questionnaires for self-assessments. In most cases these questionnaires are drawn up by advisory and consultancy firms. Organizations can then use these questionnaires to carry out a self-assessment. An advantage is that this contributes to awareness within the organization, and it lets you see whether the employees have the right attitude and extensive knowledge of the processes they carry out. A pitfall is that internal assessors cannot carry out a proper assessment of their own organization, because they are often unable to look at their own practices and processes from a distance.

What are the strengths/weaknesses of the model?
Completeness. The model offers handles to bring an organization to a certain level. It covers all activities related to software development and project management in the same context. The model teaches you to look at improvement points in processes in an independent and detached way.

A weak point of CMMI is that the model is somewhat too theoretical in describing the activities. Descriptions are sometimes very generic, so that one occasionally wonders what exactly the model means. The practices are sometimes insufficiently concrete. CMMI does say what you must do, but how you must do it is left open. A disadvantage of this is that an organization must spend much time and effort determining the how and acting accordingly. On the other hand, there are also disadvantages as soon as you impose a model on an organization's employees: this creates resistance. The fact that CMMI leaves the how open makes the model harder to apply in a large organization than in a small one. In small organizations all employees can contribute to defining processes in accordance with CMMI; this is feasible. But in a group of a few hundred people it is more difficult. In that case a model describing the what, who, how and why would be useful.

The added value of an assessment and process improvement is that people are pointed to the problem areas. People are made aware of the fact that measurements are being taken, and that motivates them to pay extra attention to bottlenecks. An advantage of an assessment is that you focus all attention on the most important links within a process without having to do very much for it.

OPM3

Is there a standard procedure for an assessment?
During the OPM3 training for assessors, a standard method is described that must be applied during an assessment.

What are the strengths/weaknesses of the model?
The model covers not only project management practices, but also program and portfolio practices. The model provides a tool to generate results, which makes it easy to generate and trace scores. All essential points come forward in the report. Another strong point is that OPM3 looks not only at the activities themselves, but also at the organization. OPM3 distinguishes a list of best practices that primarily assess whether an organization creates enough support to introduce improvements and changes for project-based working.

A weak point of OPM3 is that it is strongly based on the PMBOK, so an organization that works according to another project management method, such as PRINCE2, has more trouble translating the recommendations in the report into something it can use.

Other remarks
OPM3 goes beyond merely delivering a report after an assessment. Employees of an organization who want to be trained as OPM3 assessors can also choose to follow a course to become an OPM3 consultant. Most OPM3 assessors are also trained as OPM3 consultants, so that after delivering the assessment results they can support an organization in drawing up a plan of approach for improvement trajectories.

Appendix E Process-Data Diagrams

OPM3

[Process-Data Diagram of the OPM3 assessment process: (1) Prepare assessment (Familiarize with OPM3, Perform self-assessment); (2) Perform assessment (Determine depth and breadth of rigorous assessment, Acquire & prepare team, Develop assessment plan, Prepare for data collection, Conduct interviews, Study records & documents, Verify assessment findings, Enter findings in rigorous assessment tool, Generate & analyze data, Prepare final report); (3) Plan for improvement (Select & prioritize improvement initiatives, Develop improvement plan). The diagram connects these activities to concepts such as SELF-ASSESSMENT REPORT, ASSESSMENT PLAN, DATA COLLECTION PLAN, BEST PRACTICES DIRECTORY, CAPABILITIES DIRECTORY, IMPROVEMENT PLANNING DIRECTORY, ASSESSMENT RESULTS, FINAL REPORT, IMPROVEMENT PLAN and MATURITY PROFILE DATABASE.]

CMMI

[Process-Data Diagram of the CMMI appraisal process: (1) Plan and prepare for appraisal (Analyze requirements, Develop appraisal plan, Select and prepare team, Obtain & inventory initial objective evidence, Prepare for appraisal conduct); (2) Conduct appraisal (Prepare participants, Examine objective evidence, Document objective evidence, Verify objective evidence, Validate preliminary findings, Generate appraisal results); (3) Report results (Deliver appraisal results, Package and archive appraisal assets). The diagram connects these activities to concepts such as APPRAISAL PLAN, DATA COLLECTION PLAN, OBJECTIVE EVIDENCE, PRACTICE RATING, GOAL RATING, PROCESS AREA RATING, MATURITY LEVEL RATING, FINAL FINDINGS REPORT, APPRAISAL RECORD and MATURITY PROFILE DATABASE.]

PMMM

[Process-Data Diagram of the PMMM assessment process: (1) Initiation (Identify assessment need, Determine assessment scope, Sample assessment participants); (2) Assessment (Fill in online questionnaire, Generate individual assessment scores, Generate benchmark scores, Generate assessment results summary); (3) Assessment report development & delivery (Generate & deliver elaborate assessment report). The diagram connects these activities to concepts such as ASSESSMENT PLAN, ASSESSMENT INSTRUMENT, ONLINE ASSESSMENT TOOL, INDIVIDUAL SCORE, ORGANIZATIONAL SCORE, BENCHMARK SCORE, BENCHMARKING DATABASE and ELABORATE ASSESSMENT REPORT.]

Appendix F Concept and activity tables

OPM3

Note: two of the sources for the definitions of the PDD's activities and concepts comprise official OPM3 training materials. These sources are subdivided into modules (chapters), each with its own separate page numbering. When referring to particular pages, a capital M indicates the module number, followed by the actual page numbers.

Assessment process concepts

ASSESSMENT TRIGGER: The needs of the organization and the reason for the assessment. ([43], M6.p.5)

SELF-ASSESSMENT REPORT: A review of which best practices in OPM3 are and are not currently demonstrated by the organization, identifying the organization's general position on a continuum of organizational project management maturity. ([2], p.9)

ASSESSMENT PLAN: A scheme of action containing information about the assessment to be conducted. The main objective of this document is to make the assessors aware of what they are required to assess and of the time period available in which to perform the assessment. ([43], M6.p.3)

ORGANIZATION DESCRIPTION: Information about the organization being assessed, comprising: name, description, product/service, type and size of projects, organizational unit involved in the assessment, point of contact, and results of previous improvement projects in place. ([43], M6.pp.3-6)

COMMUNICATION PLAN: A scheme of action containing information about, among other things, when to provide feedback to the organization on an ongoing basis and when to gather the assessment team together to review progress. ([43], M6.pp.3-6)

ENGAGEMENT DESCRIPTION: Description of the assessment to be conducted, comprising information about: the purpose and objectives, scope and criteria, data collection process, data analysis process, data validation process, and the schedule of activities. ([43], M6.pp.3-6)

TEAM & ROLES DESCRIPTION: Description of the members of the assessment team and which roles and responsibilities are fulfilled by whom. ([43], M6.pp.3-6)

DATA COLLECTION PLAN: A plan containing information about how to collect the data needed for the assessment; it comprises an interview strategy, an interview sampling strategy and interview lists. ([43], M8.pp.5-)

PROTOCOL: An assessor question set generated from the OPM3 ProductSuite Assessment Tool. This protocol can be the foundation for a question set for the scoped assessment. ([43], M8.p.2)

PRELIMINARY ASSESSMENT FINDINGS: The data collected after conducting interviews and reviewing records and documents, primarily made up of scores given to key performance indicators, which are used to measure the degree of existence of the outcomes defined in OPM3. ([43], M.8.pp.7-9)

RESULTS ANALYSIS: Part of the assessment report generated by the OPM3 ProductSuite Assessment Tool, containing descriptions of the analysis steps that resulted in the achievement scores of the OUTCOMES, CAPABILITIES and BEST PRACTICES within scope. ([30], M9.pp.7-9)

ASSESSMENT RESULTS: Assessment report generated by the OPM3 ProductSuite Assessment Tool and evaluated by the assessors. The results show the achievement of outcomes, capabilities and best practices, and the degree of organizational project management maturity. ([43], M9.pp.7-9)

OPM MATURITY SCORE (attribute): The degree to which an organization practices systematic management of projects, programs and portfolios in alignment with the achievement of strategic goals. This degree is given to an organization based on the findings resulting from an assessment. ([2], p.xiii & p.73)

OUTCOME SCORE: A value given to an outcome indicating the degree to which it is achieved. ([43], M.8.pp.20-2)

CAPABILITY SCORE: A value given to a capability indicating the degree of its implementation within an organization. ([43], M.8.pp.20-2)

BEST PRACTICE SCORE: A value given to a best practice demonstrating the degree of its existence within an organization. ([43], M.8.pp.20-2)

FINAL REPORT: A comprehensive presentation of the assessment results. This report reflects the results of the detailed analysis undertaken by the assessment team and provides information to the organization on: the degree of organizational project management maturity; best practices, outcomes and capabilities, with the associated scores of these three components; and the ProductSuite scores per project management domain and process improvement stage. ([43], M.9.p.22)

MATURITY PROFILE: A profile containing information regarding the current maturity of an organization. A basic maturity profile includes:
- achieved BEST PRACTICES
- achieved CAPABILITIES
- tool-generated SCORES
([44], M.3.p.8)

IMPROVEMENT TRIGGER: The needs of the organization and the reason for choosing to pursue organizational improvements leading to increased maturity. ([2], p.9)

IMPROVEMENT PLAN: A plan providing all of the key information relevant to the purpose and contents of the improvement project. It typically includes descriptions of: the organization, the purpose and objectives of the improvement, the process used to achieve the improvement plan, the implementation strategy, the list of best practices to be improved, the schedule, risks and constraints, and (as a reference) the final report of the assessment. ([43], M.4.p.7)

PRIORITY: A right to precedence given to an improvement initiative. ([2], pp.39-40)

INITIATIVE: A leading action to implement a capability that leads to the realization of improvements and increased organizational project management maturity. ([2], pp.39-40)

SCHEDULE: A scheme stating the time allocated to improvement initiatives, the order in which they should be carried out (priority) and the resources needed. ([2], pp.39-40)

FACTOR: An element that may influence the prioritization of planned improvement initiatives for optimum use of resources. ([2], p.39)

ATTAINABILITY: A factor expressing the degree to which an improvement initiative is achievable. This consideration can help the organization demonstrate early success and gain valuable momentum to sustain the improvement initiative. ([2], p.40)

STRATEGIC PRIORITY: A factor describing how essential an improvement initiative is to an organization's strategy. ([2], p.40)

BENEFIT: A factor expressing the advantage of an improvement initiative. ([2], p.40)

COST: A factor indicating the expenditures connected to an improvement initiative. ([2], p.40)

Maturity model concepts

OPM3: A standard developed under the stewardship of the Project Management Institute. The purpose of this standard is to provide a way for organizations to understand organizational project management and to measure their maturity against a comprehensive and broad-based set of organizational project management best practices. ([2], p.xiii)

FOUNDATION: Narrative text presenting the OPM3 foundational concepts, with various appendices and a glossary. ([2], p.xiv)

SELF-ASSESSMENT TOOL: A tool in support of the self-assessment step outlined in the OPM3 standard. ([2], p.9)

BEST PRACTICES DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. The best practices directory provides the names and brief descriptions of nearly 600 best practices. ([2], pp.3-32)

CAPABILITIES DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. The capabilities directory provides detailed data on all of the capabilities in OPM3, organized according to the best practices with which they are associated. ([2], p.32)

IMPROVEMENT PLANNING DIRECTORY: One of the three directories necessary to assess an organization against OPM3 and evaluate the scope and sequence of possible improvements. This directory shows the dependencies between capabilities, which are essential to the assessment and improvement steps of the OPM3 cycle. Once the organization has identified best practices requiring assessment, this directory indicates the capabilities leading to each of these best practices, along with any additional capabilities on which they may depend. ([2], p.32)

BEST PRACTICE: An optimal way, currently recognized by industry, to achieve a stated goal or objective. For organizational project management, this includes the ability to deliver projects successfully, consistently, and predictably to implement organization strategies. ([2], p.7)

PM DOMAIN: A domain refers to each of the three domains of PM: project management, program management, and portfolio management. ([2], p.72)

PROCESS GROUP: Project management processes can be organized into five process groups of one or more processes each: initiating, planning, executing, controlling and closing processes. Process groups are linked together by the results they produce: the results or outcome of one become an input to another. ([3], pp.27-29)

PROCESS IMPROVEMENT STAGE: One of the dimensions along which OPM3 defines organizational project management maturity in terms of best practices. The stages of process improvement employed by OPM3 are: standardize, measure, control, and continuously improve. The sequence implies a prerequisite relationship between the stages, in that the most advanced stage (continuously improve) is dependent on a state of control, which is, in turn, dependent on measurement, which is dependent on standardization. ([2], p.6 & p.9)

ORGANIZATIONAL ENABLER PROCESS: Organizational enablers are a sub-set of the OPM3 best practices that relate to the organizational structures and processes necessary to support efficient and effective implementation and operation of the best practices for the project, program and portfolio domains. ([43], M.2)

BEST PRACTICE: See the best practice definition above.

CAPABILITY: A specific competency that must exist in an organization in order for it to execute project management processes and deliver project management services and products. Capabilities are incremental steps leading to one or more best practices. ([2], p.7)

OUTCOME: The tangible or intangible result of applying a capability. The degree to which an outcome is achieved is measured by a key performance indicator. ([2], p.7)

KEY PERFORMANCE INDICATOR: A criterion by which an organization can determine, quantitatively or qualitatively, whether an outcome associated with a capability exists, or the degree to which it exists. ([2], p.72)

DEPENDENCY: A relationship in which a desired state is contingent upon the achievement of one or more prerequisites. In OPM3, one type of dependency is represented by the series of capabilities that aggregate to a best practice. Another type occurs when the existence of one best practice depends, in part, on the existence of some other best practice. In this case, at least one of the capabilities within the first best practice depends on the existence of one of the capabilities within the other best practice. ([2], p.72)

MATURITY DATABASE: A repository for maturity profiles of organizations that have conducted rigorous OPM3 assessments. This database is managed by the PMI. (confirmed by certified OPM3 assessor, C. Ten Zweege)

Assessment process activities

Prepare assessment
- Familiarize with OPM3: Before an organization decides to conduct a self- or rigorous OPM3 assessment, it has to understand the contents of OPM3 as thoroughly as possible and become familiar with organizational project management and with the operation of OPM3.
- Perform self-assessment: In preparation for a rigorous assessment, the BEST PRACTICES that are and are not currently demonstrated in the organizational unit are identified. This can be done with OPM3's SELF-ASSESSMENT TOOL, but also with a tool developed by organizations themselves or by consulting companies. This activity results in a SELF-ASSESSMENT REPORT. The analysis of the SELF-ASSESSMENT REPORT will eventually lead to the decision whether or not there is a need for the organization to conduct a rigorous assessment with the assistance of certified OPM3 assessors.

Perform assessment
- Determine depth and breadth of rigorous assessment: The scope and criteria of the assessment are determined; these are among the elements described in the ENGAGEMENT DESCRIPTION.
- Acquire & prepare team: The ROLES and responsibilities of the assessment TEAM are specified and communicated to all TEAM members.
- Develop assessment plan: The ASSESSMENT PLAN is developed. It is made up of the following parts: ORGANIZATION DESCRIPTION, COMMUNICATION PLAN, ENGAGEMENT DESCRIPTION and TEAM & ROLES DESCRIPTION.
- Prepare for data collection: The PROTOCOL for conducting the data collection is generated from the OPM3 ProductSuite Assessment Tool and a DATA COLLECTION PLAN is developed.
- Conduct interviews: Interviews are conducted with members of the organization. This sub-activity results in PRELIMINARY ASSESSMENT FINDINGS comprising KEY PERFORMANCE INDICATORS.

Perform assessment (cont.)
- Study records & documents: Records and documents are examined to collect data. This sub-activity results in PRELIMINARY ASSESSMENT FINDINGS comprising KEY PERFORMANCE INDICATORS.
- Verify assessment findings: The PRELIMINARY ASSESSMENT FINDINGS are verified.
- Enter findings in rigorous assessment tool: The verified PRELIMINARY ASSESSMENT FINDINGS are entered into the OPM3 ProductSuite Assessment Tool.
- Generate & analyze results: The OPM3 ProductSuite Assessment Tool processes the PRELIMINARY ASSESSMENT FINDINGS entered and generates ASSESSMENT RESULTS, which are then analyzed.
- Prepare final report: Using the ASSESSMENT RESULTS, a FINAL REPORT is prepared. Based on the FINAL REPORT, the organization decides whether the findings result in an IMPROVEMENT TRIGGER.

Plan for improvement
- Select & prioritize improvement initiatives: Based on the FINAL REPORT, improvement INITIATIVES are given PRIORITIES in the IMPROVEMENT PLAN.
- Develop improvement plan: An IMPROVEMENT PLAN is developed based on the analysis of the improvement INITIATIVES that can be taken to realize improvements in the organization: in which order and when they should be taken, and how.
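The OPM3 concept tables describe a chain of scores: KPI-measured OUTCOME SCOREs underpin CAPABILITY SCOREs, which underpin BEST PRACTICE SCOREs and the overall OPM maturity score. The actual aggregation algorithm of the OPM3 ProductSuite Assessment Tool is not documented here, so the sketch below assumes simple averaging purely for illustration; all identifiers and data are invented.

```python
def average(scores):
    return sum(scores) / len(scores)

# Invented example data: outcome scores per capability, grouped under
# the best practice each capability belongs to.
best_practices = {
    "BP-1000": {"cap-1": [0.8, 1.0], "cap-2": [0.5]},
    "BP-2000": {"cap-3": [1.0, 1.0, 0.9]},
}

# OUTCOME SCOREs -> CAPABILITY SCORE (assumed: plain average)
capability_scores = {
    cap: average(outcomes)
    for caps in best_practices.values()
    for cap, outcomes in caps.items()
}
# CAPABILITY SCOREs -> BEST PRACTICE SCORE (assumed: plain average)
best_practice_scores = {
    bp: average([capability_scores[c] for c in caps])
    for bp, caps in best_practices.items()
}
# BEST PRACTICE SCOREs -> overall OPM maturity score
opm_maturity_score = average(list(best_practice_scores.values()))

print(round(opm_maturity_score, 3))  # 0.833 for the data above
```

The point of the sketch is only the direction of aggregation (outcome up to maturity), not the weighting, which the ProductSuite tool determines in practice.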

CMMI

Note: the term appraisal is defined by CMMI as an examination of one or more processes by a trained team of professionals using an appraisal reference model as the basis for determining, at a minimum, strengths and weaknesses. In CMMI models this term is employed instead of assessment, which is defined as an appraisal done by an organization internally for the purposes of process improvement. In order to stay true to the SEI's definitions regarding CMMI and SCAMPI, the term appraisal is used here instead of assessment, although both mean the same throughout this report.

Assessment process concepts

REQUIREMENT: A piece of information required to plan an appraisal. Examples of requirements include the objective of the appraisal, constraints, scope and output. ([37], p.i_0)

APPRAISAL PLAN: A guide containing all technical and non-technical details of an appraisal. An appraisal plan includes, among others, descriptions of the method to be applied during an appraisal, the resources needed, and the costs and risks to be taken into account. ([37], pp.ii_8-ii_20)

TEAM: A group of individuals qualified, experienced, trained, available and prepared to execute an appraisal. ([37], pp.ii_32-ii_34)

INITIAL OBJECTIVE EVIDENCE REVIEW: A report constructed in preparation for the actual appraisal, containing information regarding the (un)availability of data, additional information needed, operations and processes within the organizational unit, and an initial set of objective evidence (see definition below). ([37], pp.ii_48-ii_49)

DATA COLLECTION PLAN: A guide detailing the procedures for collecting data during an appraisal. A data collection plan mainly comprises information about the participants to be consulted, the documents to be reviewed, the responsibilities for data collection activities, and the presentations/demonstrations to be provided to the participants. ([37], pp.ii_59-ii_62)

OBJECTIVE EVIDENCE: Documents or interview results used as indicators of the implementation or institutionalization of model practices. Sources of objective evidence can include instruments, presentations, documents, and interviews. ([37], p.iii_53)

INTERVIEW: A meeting of appraisal team members with appraisal participants for the purpose of gathering information relative to the work processes in place. In SCAMPI, this includes face-to-face interaction with those implementing or using the processes within the organizational unit. Interviews are typically held with various groups or individuals, such as project leaders, managers, and practitioners. A combination of formal and informal interviews may be held, and interview scripts or exploratory questions developed to elicit the information needed. ([37], p.iii_52)

DOCUMENT: A collection of data, regardless of the medium on which it is recorded, that generally has permanence and can be read by humans or machines. Documents can be work products reflecting the implementation of one or more model practices. They typically include work products such as organizational policies, procedures, and implementation-level work products, and may be available in hardcopy, softcopy, or via hyperlinks in a Web-based environment. ([37], p.iii_50)

ARTIFACT: A tangible form of objective evidence indicative of work being performed that is a direct or indirect result of implementing a CMMI model practice. ([37], p.iii_49)

AFFIRMATION: An oral or written statement confirming or supporting the implementation (or lack of implementation) of a CMMI model specific practice or generic practice. Affirmations are usually provided by the implementers of the practice and/or internal or external customers, but may also include other stakeholders (e.g., managers, suppliers). ([37], p.iii_47)

APPRAISAL RECORD: An orderly, documented collection of information that is pertinent to the appraisal and adds to the understanding and verification of the appraisal findings and ratings generated ([37], p.iii_48). An appraisal record contains, among others: the appraisal requirements, the appraisal plan, objective evidence, all appraisal ratings and the final findings. ([37], pp.ii_29-pp.ii_3)

FINAL FINDINGS REPORT: The final findings report contains the validated strengths, weaknesses, and ratings (as defined by the appraisal plan), reflecting the organizational maturity level for the process areas within the appraisal scope. ([37], pp.ii_6-ii_2)

APPRAISAL RESULTS: The results of an appraisal comprise the goal satisfaction ratings, the satisfaction ratings of process areas within the appraisal scope, and the derived maturity level rating. ([37], pp.ii_06-ii_08)

RESULTS ANALYSIS: Report describing the analysis steps taken to determine the APPRAISAL RESULTS. ([37], pp.ii_06-ii_08)

IMPROVEMENT SUGGESTION: A section in the FINAL FINDINGS REPORT containing suggestions for improvements based on the APPRAISAL RESULTS found. (CMMI appraisal sample report, 2007)

PRACTICE RATING: A value assigned by an appraisal team indicating the extent to which a practice is implemented throughout the organizational unit. ([37], pp.ii_96-ii_00)

- GOAL RATING: The value assigned by an appraisal team to a CMMI goal. The rating is determined by enacting the defined rating process for the appraisal method being employed. The goal satisfaction rating is based on the extent of practice implementation throughout the organizational unit ([37], p.i_34 & p.iii_48).
- PROCESS AREA RATING: The value assigned by an appraisal team to a process area. The process area satisfaction rating is derived from the set of goal satisfaction judgments ([37], p.ii_2 & p.iii_48).
- MATURITY LEVEL RATING: The value assigned by an appraisal team to the maturity level of an organizational unit. The rating is determined by enacting the defined rating process for the appraisal method being employed. The rating of maturity levels is driven algorithmically by the goal satisfaction ratings ([37], p.i_34 & p.iii_48).
- MATURITY PROFILE: A subset of the contents of the appraisal record, as well as other data, used by the SEI to aggregate and analyze appraisal performance data for reporting to the community and for monitoring the quality of performed appraisals ([37], p.ii_32 & p.iii_49).

CMMI maturity model concepts:

- CMMI: CMMI stands for Capability Maturity Model Integration. It is a process improvement maturity model for the development of products and services. It consists of best practices that address development and maintenance activities covering the product lifecycle from conception through delivery and maintenance ([28], p.i).
- MATURITY LEVEL: Degree of process improvement across a predefined set of process areas in which all goals in the set are attained ([28], p.543).
- DEPENDENCY: A relationship between pairs of MATURITY LEVELS in CMMI, where the lower MATURITY LEVEL has to be achieved before the higher one can be achieved. For example, an organization cannot achieve CMMI level 3 if not all conditions of level 2 are met ([28], p.39).
- PROCESS AREA: A cluster of related practices in an area that, when implemented collectively, satisfy a set of goals considered important for making improvement in that area. All CMMI process areas are common to both continuous and staged representations ([28], p.548).
- GOAL: A required CMMI component that can be either a generic goal or a specific goal ([9], p.54). A required component describes what an organization must achieve to satisfy a process area. This achievement must be visibly implemented in an organization's processes ([28], p.6).
- GENERIC GOAL: A generic goal describes the characteristics that must be present to institutionalize the processes that implement a process area ([28], p.54). An example of a generic goal is "The process is institutionalized as a defined process." ([28], pp.9-20). In this sense a GENERIC GOAL can be understood as equivalent to achieving a maturity level.
- SPECIFIC GOAL: A required model component that describes the unique characteristics that must be present to satisfy the process area ([28], p.555). For example, a specific goal from the Configuration Management process area is "Integrity of baselines is established and maintained." ([28], p.9)
- PRACTICE: An expected CMMI component that can be either a generic practice or a specific practice. An expected component describes what an organization may implement to achieve a required component. Expected components guide those who implement improvements or perform appraisals ([28], p.6). Practices within CMMI are examples of how activities are carried out in practice by organizations in general. CMMI acknowledges that there are different ways to achieve GOALS; therefore it does not force organizations to act strictly according to the PRACTICE descriptions. These PRACTICES are guidelines that can be used to appraise organizations.
- GENERIC PRACTICE: An expected model component that is considered important in achieving the associated generic goal. The generic practices associated with a generic goal describe the activities that are expected to result in achievement of the generic goal ([28], p.54). For example, a generic practice for the generic goal "The process is institutionalized as a managed process" is "Provide adequate resources for performing the process, developing the work products, and providing the services of the process." ([28], p.2)
- SPECIFIC PRACTICE: An expected model component that is considered important in achieving the associated specific goal. The specific practices describe the activities expected to result in achievement of the specific goals of a process area ([28], p.555). For example, a specific practice from the Project Monitoring and Control process area is "Monitor commitments against those identified in the project plan."
- PRACTICE IMPLEMENTATION INDICATOR (PII): An objective attribute or characteristic used as a footprint to verify the conduct of an activity or the implementation of a CMMI model specific or generic practice. Types of practice implementation indicators include artifacts and affirmations ([37], p.iii_54).
- MATURITY PROFILE DATABASE: An electronic storage environment containing aggregated appraisal results. The SEI provides approved information within the bounds of confidentiality to the community, based on results from the appraisal data collected. The SEI establishes the format and mechanisms for the presentation of this information ([37], p.ii_32).
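The staged rating logic described above (goal ratings aggregate into process area ratings, which in turn drive the maturity level rating algorithmically, subject to the MATURITY LEVEL DEPENDENCY) can be sketched as follows. This is an illustrative simplification, not SCAMPI's actual rating procedure; the sample goal data and the level-to-process-area mapping are hypothetical.

```python
# Illustrative sketch of CMMI staged rating: goal ratings -> process area
# ratings -> maturity level. Not the official SCAMPI algorithm; the sample
# data and level mapping below are hypothetical.

def rate_process_area(goal_ratings):
    """A process area is satisfied only if every goal in it is satisfied."""
    return all(goal_ratings.values())

def rate_maturity_level(process_area_ratings, areas_per_level):
    """Highest level whose process areas (and those of all lower levels) are satisfied.

    Mirrors the MATURITY LEVEL DEPENDENCY: a higher level cannot be
    achieved unless every lower level is fully achieved first.
    """
    achieved = 1  # level 1 ("initial") requires nothing
    for level in sorted(areas_per_level):
        if all(process_area_ratings[area] for area in areas_per_level[level]):
            achieved = level
        else:
            break  # a gap at this level blocks all higher levels
    return achieved

# Hypothetical appraisal data: goal satisfaction per process area.
goals = {
    "Configuration Management": {"SG1": True, "SG2": True, "GG2": True},
    "Project Planning": {"SG1": True, "SG2": True, "SG3": False},
    "Organizational Process Focus": {"SG1": True, "GG3": True},
}
pa_ratings = {pa: rate_process_area(g) for pa, g in goals.items()}

# Hypothetical mapping of maturity levels to required process areas.
areas_per_level = {
    2: ["Configuration Management", "Project Planning"],
    3: ["Organizational Process Focus"],
}
print(rate_maturity_level(pa_ratings, areas_per_level))  # 1: Project Planning has an unsatisfied goal
```

Because Project Planning has an unsatisfied specific goal, level 2 is not attained and, per the dependency rule, level 3 cannot be rated either, even though its own process area is satisfied.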

Assessment process activities:

Plan and prepare for appraisal:
- Analyze requirements: Understand the business needs of the organizational unit for which the appraisal is being requested. Information is collected to determine the appraisal REQUIREMENTS.
- Develop appraisal plan: An appraisal method is tailored and the needed resources are identified. The costs and schedule associated with the appraisal are determined. Logistics are planned and managed, and risks are documented and managed. All this information is documented in the APPRAISAL PLAN.
- Select and prepare team: The leader and members of the appraisal TEAM are selected and prepared.
- Obtain & inventory initial objective evidence: Obtain information that facilitates site-specific preparation, as well as data on model practices used. Potential issue areas, gaps, or risks are identified to aid in refining the APPRAISAL PLAN. This results in an INITIAL OBJECTIVE EVIDENCE REVIEW.
- Prepare for appraisal conduct: Specific data-collection STRATEGIES, including sources of data, tools and technologies to be used, and contingencies to manage the risk of insufficient data, are planned and documented in the form of a DATA COLLECTION PLAN.
- Prepare participants: Inform appraisal participants of the purpose of the appraisal and prepare them for participation.

Conduct appraisal:
- Examine objective evidence: Activities in accordance with the DATA COLLECTION PLAN are performed. INTERVIEWS and DOCUMENTS are examined as OBJECTIVE EVIDENCE to collect information about the practices implemented in the organizational unit.
- Document objective evidence: Lasting records of the OBJECTIVE EVIDENCE gathered are created by identifying and then consolidating notes, transforming the data into records that document practice implementation, as well as strengths and weaknesses.
- Verify objective evidence: The implementation of the organizational unit's practices for each instantiation is verified. Each implementation of each practice is verified so it may be compared to the appraisal reference model (i.e., CMMI) practices, and the TEAM characterizes the extent to which the practices in the model are implemented by means of a PRACTICE RATING.
- Validate preliminary findings: Preliminary findings are validated. Gaps in the implementation of model practices are weaknesses, and exemplary implementations of model practices may be highlighted as strengths in the appraisal OUTPUTS. Both strengths and weaknesses are validated with members of the organizational unit.
- Generate appraisal results: APPRAISAL RESULTS are generated based on the validation of preliminary appraisal findings. APPRAISAL RESULTS contain PRACTICE RATINGS, GOAL RATINGS and the MATURITY LEVEL RATING.

Report results:
- Deliver appraisal results: APPRAISAL RESULTS are provided in the form of a FINAL FINDINGS REPORT to the organizational unit to guide subsequent actions.
- Package and archive appraisal assets: Important data and records from the appraisal are preserved in an APPRAISAL RECORD. Part of this APPRAISAL RECORD is provided to the CMMI Steward (i.e., the SEI) in the form of a MATURITY PROFILE. Sensitive materials are disposed of in an appropriate manner.

PMMM assessment process concepts:

- ASSESSMENT PLAN: A plan containing information about the reasons for conducting the maturity assessment (DRIVER), the scope of the assessment (SCOPE) and those who will participate in the assessment (ASSESSMENT SAMPLE). [34]
- DRIVER: Purpose of the maturity assessment. [34]
- SCOPE: Information regarding what part(s) of the organization will be subjected to the maturity assessment. [34]
- ASSESSMENT SAMPLE: Group of employees of the organization selected to fill out the questionnaire in the ONLINE ASSESSMENT TOOL. [34]
- INDIVIDUAL ASSESSMENT RESULTS: Assessment findings based on the INDIVIDUAL SCORES of a participant. [34]
- INDIVIDUAL SCORE: Scores achieved by each participant after filling out the part of the questionnaire belonging to each MATURITY LEVEL. This score indicates the extent to which a participant has achieved a certain MATURITY LEVEL. [34]
- INDIVIDUAL SCORE ANALYSIS: Brief report containing analysis results and improvement possibilities based on the INDIVIDUAL SCORES. [34]
- BENCHMARK SCORE: BENCHMARK SCORES allow comparisons between the INDIVIDUAL SCORE and the scores of other participants of the same organization, organizations of the same industry sector and organizations of the same size. [34]
- ORGANIZATIONAL SCORE: Aggregate score calculated (so far) of the organization within the scope of the assessment. [34] Note that this ORGANIZATIONAL SCORE is complete when presented in the ASSESSMENT RESULTS SUMMARY.
- INDUSTRY SCORE: Aggregate score of organizations in the BENCHMARKING DATABASE operating in the same industry sector as the organization in scope. [34]
- SIZE SCORE: Aggregate score of organizations in the BENCHMARKING DATABASE that are of similar size as the organization in scope. [34]
- ASSESSMENT RESULTS SUMMARY: Assessment findings based on the collective scores of all participants in scope after the questionnaire period. [34]
- ELABORATE ASSESSMENT REPORT: Report in which assessment findings as well as improvement suggestions are elaborated. The findings are more thoroughly explained in this report compared to the ASSESSMENT RESULTS SUMMARY. [34]
- COMPANY BACKGROUND: Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL when requested. It contains information regarding the organization in scope of the assessment [45].
- RESULTS ANALYSIS: Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL when requested. It contains a detailed analysis of the assessment results [45].
- SUGGESTED ACTION: Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL when requested. It contains descriptions of the possible actions that could be taken to realize improvements within the organization [45].
- BENCHMARK COMPARISON: Part of the ELABORATE ASSESSMENT REPORT provided to organizations by the IIL when requested. It contains comparisons between the scores achieved by the organization in scope and organizations of the same size or operating in the same industry [45].

PMMM maturity model concepts:

- PMMM: The Kerzner Project Management Maturity Model enables a diagnosis of the health of project management in your organization. It assists organizations in identifying strategic strengths and weaknesses and, if the ONLINE ASSESSMENT TOOL is used, creates a prescriptive action plan for improving the health of your PM efforts. It allows you to objectively assess your project management capabilities against key knowledge areas of the PMBOK Guide. [34]
- MATURITY LEVEL: Level of the PMMM model, representing a particular degree of maturity in project management. ([25], p.42)
- ROADBLOCK: A barrier (belonging to a certain maturity level) preventing an organization from reaching the next maturity level. ([25], pp.46-47)
- RISK: A risk can be assigned to each level of the PMMM (low, medium, high). The level of risk is most frequently associated with the impact of having to change the corporate culture. (confirmed by Mr. C. Damiba)
- ADVANCEMENT CRITERIA: Condition that has to be met in order to reach the next maturity level. ([25], pp.46-47)
- ASSESSMENT INSTRUMENT: Tool to help an organization determine its degree of maturity at each maturity level embodied by PMMM. Basically, it is a part of the complete assessment questionnaire of PMMM. ([25], pp.46-47)

- ONLINE ASSESSMENT TOOL: The online format of the maturity assessment questionnaire. It is fast, automatic and easy to use, and an executive interface monitors results. (confirmed by Mr. C. Damiba)
- BENCHMARKING DATABASE: Repository that allows comparisons between the maturity levels of organizations of the same industry or size. (confirmed by Mr. C. Damiba)

Assessment process activities:

Initiation:
- Identify assessment need: The DRIVER for carrying out an assessment is identified.
- Determine assessment scope: The SCOPE of the assessment is determined.
- Sample assessment participants: The ASSESSMENT SAMPLE is selected from the organization to which the assessment will be applied. Together with the DRIVER and SCOPE, this information is documented in an ASSESSMENT PLAN.

Assessment:
- Fill in online questionnaire: The participants in the ASSESSMENT SAMPLE are invited to fill in a questionnaire provided by the ONLINE ASSESSMENT TOOL of the PMMM.
- Generate individual assessment scores: After filling out the assessment instrument assigned to a MATURITY LEVEL, each participant can retrieve his or her INDIVIDUAL SCORE and INDIVIDUAL SCORE ANALYSIS.
- Generate benchmark scores: Besides the INDIVIDUAL SCORE and INDIVIDUAL SCORE ANALYSIS, each participant can also retrieve BENCHMARK SCORES to compare their INDIVIDUAL SCORES to the scores of others who have also taken the assessment.

Assessment report development & delivery:
- Generate assessment results summary: After all participants of the ASSESSMENT SAMPLE have provided all information needed to the ONLINE ASSESSMENT TOOL, the assessment sponsor is granted access to an interface where the ASSESSMENT RESULTS SUMMARY is presented.
- Generate & deliver elaborate assessment report: If there is a need for an ELABORATE ASSESSMENT REPORT, the IIL can develop and provide this to the organization where the assessment took place.
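As a rough illustration of how the scores produced by the ONLINE ASSESSMENT TOOL relate to one another, the sketch below aggregates hypothetical INDIVIDUAL SCORES into an ORGANIZATIONAL SCORE and compares it against an INDUSTRY SCORE benchmark. IIL does not publish its actual scoring formulas, so the mean-based aggregation and all sample numbers here are assumptions.

```python
# Illustrative aggregation of PMMM assessment scores. The mean-based
# formulas and all sample data are assumptions; IIL's actual scoring
# method is not published.
from statistics import mean

# Hypothetical INDIVIDUAL SCORES: participant -> score per maturity level.
individual_scores = {
    "alice": {1: 90, 2: 70, 3: 55},
    "bob":   {1: 80, 2: 65, 3: 40},
    "carol": {1: 85, 2: 75, 3: 50},
}

def organizational_score(scores):
    """ORGANIZATIONAL SCORE: per-level mean over all participants in scope."""
    levels = next(iter(scores.values())).keys()
    return {lvl: mean(p[lvl] for p in scores.values()) for lvl in levels}

def benchmark_comparison(org, benchmark):
    """Signed gap to a benchmark (e.g. an INDUSTRY SCORE or SIZE SCORE)."""
    return {lvl: org[lvl] - benchmark[lvl] for lvl in org}

org = organizational_score(individual_scores)
industry = {1: 82, 2: 68, 3: 52}  # hypothetical INDUSTRY SCORE
print(org[1])                                  # 85
print(benchmark_comparison(org, industry)[2])  # 2
```

A positive gap at a level would mean the organization scores above its industry benchmark there; the real tool presents such comparisons per maturity level in the ASSESSMENT RESULTS SUMMARY.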

Appendix G Criteria results per model

OPM3

Maturity reference model criteria:

- Openness (free access / paid access / certified usage / proprietary access & usage): [2] OPM3 purpose and scope (p.3); [46] Pricing & Signup; [47] What does OPM3 look like? Using the purchasable OPM3 Foundation (OPM3-F), any organization can conduct a quick scan on itself. A comprehensive assessment with an acknowledged maturity degree has to be conducted by certified assessors; in this latter case, materials of the OPM3 ProductSuite (OPM3-PS) are used.
- Industry & Size (size of organization / industry sector): [2] OPM3 purpose and scope (p.3). OPM3 does not place applicability constraints regarding organization size or industry.
- Scope (project management / program management / portfolio management): [2] .4 Organizational maturity (pp.5-6). In OPM3, organizational project management maturity is reflected by the combinations of best practices achieved within the project, program and portfolio domains.
- Maturity level description (process area / activity / role / competency / deliverable / result): [2] Ch3. Best practices (pp.3-20); [2] Ch4. The organizational project management processes (pp.2-28); [2] Ch5. The OPM3 directories (pp.3-34); [2] Appendix F: Best practices directory; [2] Appendix G: Capabilities directory (p.23); [2] Appendix H: Improvement planning directory (p.25); [2] Appendix I: Program and portfolio management process models (pp.27-29). Within OPM3, activities are described as practices and deliverables as outcomes. The model contains 600 best practices, all of which are provided openly to users. During an assessment, the assessors determine an organization's maturity by the implementation of practices, and the outcomes (deliverables) are examined to prove that a practice is really implemented.
- Dimensions of maturity; Process areas: Not employed. [2] .4 Organizational maturity (pp.5-6). OPM3 does not employ specific dimensions along which organizations can achieve maturity. In OPM3, best practices are not grouped into process areas. However, the model does use process areas defined in the PMBOK to determine improvement activities.
- Process area dependencies: Described differently. [2] Appendix H: Improvement planning directory. OPM3 acknowledges dependencies between capabilities instead of process areas, and it is the improvement planning directory that contains the dependencies between the capabilities that aggregate to different best practices.

Assessment method criteria:

- Assessment commitment: Not described. [2] Executive summary (p.xvi). OPM3 does not explicitly state the necessity of commitment by higher management. However, it does indicate the importance of communicating frequently with the assessment sponsor and senior management of the organization.
- Competence level (assessor / participant): [43] Module 6. Planning: acquire an assessment team (pp.-2); [43] Module 8. Execution: preparing for the assessment (pp.5-6). Specific personal attributes as well as competences that an OPM3 ProductSuite Assessor needs to have are described in the training manual. Explicit competences or traits of assessment participants are not provided, but principles for sampling interviewees are described.
- Assessment method description (process phase / activity / deliverable / role / dependency): [2] 6.3 Steps of the OPM3 cycle (pp.36-46); [43] Module 1. Introduction: assessment process overview (p.4). The detailed description of the assessment method is only available during official OPM3 training programs. A less elaborate explanation of the assessment cycle is described in the purchasable knowledge foundation book (i.e., [2]).
- Data gathering method (questionnaires / interviews / group discussions / document consultations): [43] Module 8. Execution: perform the OPM3 ProductSuite assessment (pp.4-2). The only questionnaire meant for participants to fill out is the one that organizations can use to execute a quick scan prior to the rigorous assessment. This questionnaire is provided by the OPM3-F. Interviews and document consultations are conducted during the rigorous assessment by certified assessors. During rigorous assessments, no questionnaires are provided to the members of the organization. The quick scan is considered one of the data gathering methods during an OPM3 assessment, because the results of the questionnaire (quick scan) trigger the rigorous assessment.
- Length of questionnaire (project management domain): 800+. Confirmed by Mr. Ten Zweege, certified OPM3 assessor. This number includes all questions of the project management domain across the entire dimension from Standardize to Continuously Improve, as well as questions related to the organizational structures and processes necessary to support efficient and effective implementation and operation of the best practices for the project domain.
- Supportive assessment tools ((self-)assessment toolset / training / certification): [2] Executive summary (p.xiv); [43] Module 1. Introduction (p.3); [36] OPM3 standard knowledge course. OPM3-F provides a self-assessment tool. OPM3-PS provides a professional and extensive tool to facilitate the execution of an assessment and improvement planning activities. OPM3 provides training programs that train assessors how to assess and consultants how to plan, develop and produce improvement project plans. Courses are also available to those wanting to learn more about OPM3.
- Benchmarking: Benchmarking is optional. [43] Module 2. Orientation to OPM3 ProductSuite (p.5); [47] OPM3 benchmarking (p.5). OPM3 permits benchmarking of OPM3 self-assessment data. It allows users to gain insight into peer organizations' maturity continuum scores and best practices, presented through average, mean and median reports. OPM3 benchmarking data will be available to those organizations that participate in the collection and sharing of the data.

CMMI

Additional remarks:
- This evaluation is based entirely on the SEI's definition of, and service offerings regarding, the CMMI for Development. It should be kept in mind that there are other institutions besides the SEI that are eligible to conduct (official) CMMI assessments.
- CMMI for Development is a maturity model constructed for software development. CMMI encompasses PM processes for the sake of software development; in CMMI, PM is an aspect of software development and systems engineering.
- CMMI employs two distinct approaches, also known as representations, for its assessments: the continuous representation and the staged representation. The continuous representation is a maturity model structure wherein capability levels, instead of maturity levels, provide a recommended order for approaching process improvement within each specified process area. With this approach, no maturity ratings are provided afterwards. With the staged representation, on the other hand, a maturity level is established by attaining the goals of a predefined set of process areas. In this representation, each maturity level builds a foundation for subsequent maturity levels. The continuous approach is not included in the scope of this research for the sake of time.

Maturity reference model criteria:

- Openness (free access / paid access / certified usage / proprietary access & usage): [49] Books (SEI CMMI official website); journal publications (Computerworld, IEEE Software, Journal of Systems and Software, Information and Management). The description of the SEI's CMMI for Development model and standard assessment method is downloadable from the SEI's official website. The questionnaires used during assessments can only be accessed via the official SEI website by authorized persons (e.g., certified assessors). These questionnaires are used by assessors to develop interview questions. An official maturity assessment can only be conducted by certified CMMI assessors. Although some materials are readily available online, organizations still have to acquire a license before they can use the SEI's standard questionnaire and conduct CMMI assessments.
- Industry & Size (size of organization / industry sector): [28] Ch. Introduction: The scope of CMMI for Development (p.8). Among others: aerospace, banking, computer hardware, software, defense, automobile manufacturing, and telecommunications. The SEI does not pose any industry sector related restrictions on the application of CMMI. Size related restrictions are not mentioned.
- Scope (project management / program management / portfolio management): [28] Ch. Introduction: The scope of CMMI for Development (p.8). Although CMMI does cover aspects of PM, it should be kept in mind that the model is designed for software development purposes and not PM. PM is only considered an aspect that, if managed well, contributes to achieving software development goals.
- Maturity level description (process area / activity / role / competency / deliverable / result): [28] Ch.3 Tying it all together: Understanding maturity levels (pp.35-46); [28] Ch.2 Process area components (pp.6-28); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-53). To assess the maturity of an organization, assessors study the process areas, activities, roles and deliverables. When assessing process areas, assessors determine what kind of processes are in place within an organization, not how these processes are organized. The purpose of assessors is to examine whether certain processes are in place and, if so, whether they contribute to achieving the goals described in CMMI.

- Dimensions of maturity: Processes. CMMI assesses the maturity of processes, so that is the only dimension employed.
- Process areas: Causal analysis and resolution; Configuration management; Decision analysis and resolution; Integrated project management; Measurement and analysis; Organizational innovation and deployment; Organizational process definition; Organizational process focus; Organizational process performance; Organizational training; Product integration; Project monitoring and control; Project planning; Process and product quality assurance; Quantitative project management; Requirements development; Requirements management; Risk management; Supplier agreement management; Technical solution; Validation; Verification. ([28] Ch.3 Tying it all together: Process Areas (pp.4-44); [28] Ch.4 Relationships among Process Areas: Project Management (pp.55-58); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-53)) There are a total of 22 process areas embodied by CMMI for Development. These process areas are categorized into four related groups that are relevant to the software development process: process management, project management, engineering and support. Within CMMI for Development, processes like human resource management or stakeholder management apply to all process areas, which is why these are not described as separate process areas.
- Process area dependencies: Described. [28] Ch.4 Relationships among Process Areas (pp.5-64); [28] Ch.5 Part Two: Generic Goals and Generic Practices and the Process Areas (pp.73-53). CMMI for Development describes dependencies between the different maturity levels, and because of that, there are also dependencies described between process areas.

Assessment method criteria:

- Assessment commitment: Described. [28] Ch.5 Using CMMI Models: Adopting CMMI (pp.65-66). Building strong organizational support through senior management sponsorship is a crucial step toward process improvement.
- Competence level ((lead) assessor / participant): [37] Executive summary: Time frame and personnel requirements (p.i3); [37] Part II Process definitions: .3 Select and prepare team (pp.ii32-ii39); [49] Ch.4 Requirements for CMMI appraisal methods (pp.7-0). CMMI describes which criteria are required to qualify the assessment team members and leader, but does not elaborate on the contents. There are, however, minimum requirements for each role. It also elaborates the process of selecting the assessment team leader and members.
- Assessment method description (process phase / activity / deliverable / role / dependency): [37] Executive Summary: What is SCAMPI A? (pp.i9-i); [37] SCAMPI A method overview (pp.i5-i38); [37] Part II Process Definitions (pp.ii-ii34). The standard assessment method for CMMI is officially documented by the SEI.
- Data gathering method (questionnaires / interviews / group discussions / document consultations): [37] SCAMPI A method overview: Types of objective evidence (p.i22); [37] SCAMPI A method overview: Instruments and tools (pp.i28-i29).
- Length of questionnaire (project management domain): 155-180. [28] Ch.4 Relationships among Process Areas: Project Management (pp.55-58). The SEI's CMMI for Development definition document shows that the category project management comprises 6 process areas. Each process area is accompanied by a standard questionnaire, so the number of questions per process area was added up to obtain a total; the number of questions adds up to approximately 30 per process area.
- Supportive assessment tools ((self-)assessment toolset / training / certification): [28] Ch.5 Using CMMI Models: Using CMMI Appraisals (pp.68-69); [28] Ch.5 Using CMMI Models: CMMI-Related Training (pp.70-7); [49] Ch.3 Requirements for CMMI appraisal method class structure (pp.5-6). The SEI provides information regarding the available CMMI training facilities. The SEI does not provide self-assessment tools for CMMI v1.2; there are, however, other institutions that do provide them.
- Benchmarking: Benchmarking is optional. [49] Ch.3 Requirements for CMMI appraisal method class structure (pp.5-6). There are possibilities to compare an organization's own maturity score with the average industry sector maturity score, but this is only because the SEI registers (by default) the CMM and CMMI levels of officially assessed organizations. This information is not publicly available; it is only for those who have taken the official assessment and are registered. Even so, a registered organization cannot retrieve information regarding the identity of other assessed organizations.

PMMM

Maturity reference model criteria:

- Openness (free access / paid access / certified usage / proprietary access & usage): [34] Official IIL PMMM webpage. PMMM consists of a purchasable book and an online assessment tool, supported by the International Institute for Learning (IIL), Inc. Access to the online assessment tool has to be purchased before organizations can use it to generate reports and recommendations.
- Industry & Size (size of organization / industry sector): [34] Who should take this assessment? The online assessment tool is applicable in all industries and can be used by a wide range of users. The applicability to particular sizes of organizations is not specified.
- Scope (project management / program management / portfolio management): [25] Ch. An introduction to the PMMM (pp.4-44); [25] Ch5. Level 5: Continuous improvement (pp.-38). Kerzner's PMMM specifically focuses on the 5 levels of project management maturity. Program management and portfolio management are described as areas of development once an organization has reached the level of continuous improvement regarding project management.
- Maturity level description (process area / activity / role / competency / deliverable / result): Varies per maturity level. [25] Ch. 4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.4-43). Project management maturity is described in different terms at each maturity level of the PMMM. At level 1, the model looks at the knowledge of people within an organization regarding basic project management terminology. At level 2, it describes the importance of common processes throughout an organization. Level 3 is about combining all corporate methodologies into a singular methodology. Level 4 acknowledges the importance of conducting benchmarking. And finally, level 5 is about continuously improving the singular methodology and business processes using the information obtained through benchmarking. At each maturity level, the PMMM discusses: organizational characteristics, roadblocks that prevent an organization from attaining the next level, what must be done to reach the next level, and potential risk. However, the descriptions of these aspects are not used to determine the maturity level of an organization.
- Dimensions of maturity; Process areas: Scope management; Time management; Cost management; Human resources; Procurement management; Quality management; Risk management; Communications management. [25] Ch. 4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.4-43). As mentioned earlier, PMMM describes different dimensions of PM maturity at each maturity level. At maturity level 1, where PM knowledge is assessed, the processes are categorized into the process areas as defined in the PMBOK. This categorization, however, is not used in the remaining 4 maturity levels, as the model assesses different aspects at each level.
- Process area dependencies: Not described. [25] Ch. 4-9. An introduction to the Project Management Maturity Model (PMMM) & the 5 maturity levels (pp.4-43). PMMM does not describe dependencies between process areas, but does describe dependencies and overlap between the aspects of PM maturity elaborated at each particular maturity level.

Assessment process criteria:

- Assessment commitment: Described. [25] Ch4. An introduction to the PMMM (p.4). The PMMM assists organizations in achieving strategic planning for project management, and to make this happen, executive-level involvement is necessary to ensure any development and implementation process is driven from the top down.
- Competence level (assessor / participant): Confirmed by Mr. C. Damiba (see Appendix A-2). The PMMM does not place restrictions on the suitability of those participating in an assessment. Also, the model does not require assessors, as the assessments are carried out by the online assessment tool. However, should an organization desire a comprehensive assessment report, it will be constructed by eligible consultants within the IIL.
- Assessment method description (process phase / activity / deliverable / role / dependency): [34] Demo version of PMMM online assessment tool. There is no standard procedure for the assessment conduct; participants of an organization only need to complete the questionnaire provided by the online assessment tool. The standard procedure to be followed by the client organization is explained using a demo version of the online assessment tool on IIL's website. As for dependency, organizations can only retrieve simple or comprehensive assessment reports after working with the online assessment tool.
- Data gathering method (questionnaires / interviews / group discussions / document consultations): Questionnaires. [34] Here's how it works. PMMM employs only one questionnaire for the assessment.
- Length of questionnaire (project management domain): 183. [34] Here's how it works.
- Supportive assessment tools ((self-)assessment toolset / training / certification): [25] Introduction (p.xviii); [34] Here's how it works. To conduct an assessment, PMMM only requires the usage of the online assessment tool accompanying the model.
- Benchmarking: Benchmarking is optional at the end of the assessment of each maturity level. [34] Ready to give it a test run? In the online assessment tool, respondents can compare their individual scores at each level with others who have also taken the assessment, within the same organization as well as in other industries. Ultimately, the higher management can retrieve aggregate scores and compare the total organizational score with peers in the same industry, of the same size, or with those in other industries.