Enhancing RUP for CMMI compliance: A methodological approach




Copyright IBM Corporation 2004. http://www-106.ibm.com/developerworks/rational/library/5318.html

Walcelio Melo, Unisys Corporation
8 Jul 2004

From The Rational Edge: This paper traces the approach Unisys GPS Blueprinting used to evaluate the Rational Unified Process in support of the Unisys Business Blueprint, a business and systems modeling architecture that integrates business vision and IT execution to drive organizational agility.

Contents: The approach; New process elements; Conclusion; References; Appendix A: Related work; Notes; About the author

IBM Rational Unified Process, or RUP, provides an outstanding foundation that allows Unisys to achieve a higher level of process capabilities in many different CMMI process areas. Moreover, RUP allows the selection and customization of its process elements to adhere to the particularities inherent in each Unisys Global Public Sector project, program, or business unit.

However, RUP is weak in some CMMI process areas. In a comprehensive study performed by the Software Engineering Institute, RUP was assessed against the CMMI Continuous Representation. [2] Although RUP performed well in most of the CMMI process areas, the assessment concluded that RUP was weak in the areas of Supplier Agreement Management and Technical Solution process area practices.

In this paper, [1] I describe new process elements that allow RUP to overcome these weaknesses. I discuss some experiences in which RUP has been used to assist IT organizations in achieving a higher level of process capabilities, then present the software process improvement approach that we are using to guide our work. I also describe the new process elements that would be created to overcome the weaknesses identified in the CMMI assessment.

A philosophy of improving process

A "software process" includes all the activities necessary to manage, develop, acquire, and maintain software products or services. The existence of a defined and sustained process offers a foundation for organizational planning and continuous improvement. When a software process is well established in an organization, it serves as a vehicle for learning, reuse, and the promotion of best practices.

To improve software process quality, we must understand the organization that will benefit from the improvement. The best way to gain this understanding is by studying representative sample projects, paying particular attention to their development, management, support, and operational practices. This helps identify points in the process that need to be improved, missing practices that need to be included, and practices used in some projects that would bring business benefits if extended to the entire organization.

To help evaluate process practices and identify improvement opportunities, we can compare the current capabilities of an organization's processes with the best practices recommended by a reference model. This comparison serves as a basis for deriving improvement plans, and specifically allows us to evaluate an organization's capability to produce quality results on time and within the stipulated budget. [3]
Several reference models for process capability determination have been proposed, including CMMI (Capability Maturity Model Integration) and ISO/IEC 15504 (a standard for software process assessment agreed upon by ISO and the International Electrotechnical Commission). They suggest process practices that should be implemented in an organization so that it performs better and successfully achieves its business goals. Although ISO/IEC 15504 is an international standard for determining process capability, in the US the CMMI is unquestionably the most popular model, and CMMI complies with the ISO/IEC 15504 standard.

CMMI is designed to help an organization improve the processes used to manage the development, acquisition, and maintenance of products or services. CMMI is used as a guide for selecting process improvement strategies by facilitating the determination of current process capabilities and the identification of the issues most critical to software quality and process improvement.

The problem

Based on the assumption that there are fundamental practices that should be incorporated in all the processes executed in the organization, a standard process for an organization is typically defined by considering best practices and suggestions from international standards, industrial organizations, governmental guidelines, in-house best practices, and so on (e.g., RUP, ISO/IEC 12207). But for each project, the standard process must be adapted to the particularities of the organization and the project itself.

Following this strategy, Unisys created the Unisys Business Blueprint. This is a business and systems modeling architecture that integrates business vision and IT execution to drive organizational agility. To make organizations successful in meeting the requirements of today's fast-paced information world, Unisys has developed Blueprinting to align business with information technology and provide traceability across the whole enterprise. Blueprinting is now the foundation for most of the Unisys business operations, including Unisys GPS geographies, programs, and projects. All are encouraged to use the Unisys Business Blueprint Methodology for the development, management, and transition of Unisys solutions.

RUP is an important component of the Unisys Business Blueprint. RUP allows us to select and customize its process elements to adhere to the particularities inherent in each Unisys GPS project, program, or business unit. Moreover, RUP provides an outstanding foundation that allows Unisys to achieve a higher level of process capabilities in many different CMMI process areas.

However, RUP is weak in some CMMI process areas. In a comprehensive study performed by the Software Engineering Institute, RUP was assessed against the CMMI Continuous Representation. [4] Although RUP performed well in most of the CMMI process areas, the following weaknesses were identified:

- Supplier Agreement Management is out of the scope of RUP; that is, RUP in its current form does not explicitly deal with managing work from external suppliers to the project. This is an issue for projects in which Unisys needs to employ subcontractors.
- RUP does not explicitly support all of the Technical Solution process area's practices. For example, RUP does not explicitly cover consideration of design alternatives except at the architectural level, and RUP does not explicitly cover the use of selection criteria for product solutions or components.

In a 2003 study, Manzoni and Price evaluated RUP against SW-CMM. [5] For each key practice (KP) identified in each key process area (KPA) of SW-CMM levels 2 and 3, the Rational Unified Process was assessed to determine whether or not it satisfied the practice. The report concluded that an organization using RUP would need to complement it to conform to SW-CMM. According to the study, the SW-CMM key process areas best supported by RUP are requirements management, software project planning, software project tracking and oversight, software configuration management, organization process definition, and software product engineering.
RUP offers good support for both the integrated software management and inter-group coordination KPAs; RUP offers low support for the software quality assurance, organization process focus, and peer review KPAs; and RUP does not support the software subcontract management or training KPAs. Based on these analytical studies, we conclude that an organization would need to enhance its version of RUP to improve its process capabilities.

The approach

In this section, I will describe the approach Unisys GPS Blueprinting is using to enhance the Unisys Blueprint Methodology to increase its level of compliance with CMMI. As indicated in Figure 1, there is a symbiotic relationship between assessment, capability maturity determination, and process improvement. [6] The capability determination of a process (in our case, the Unisys Blueprint methodology, or more precisely the RUP product) is used to identify weaknesses that motivate improvements. Such improvements reflect changes in the process (in our case, changes to the RUP product).

Figure 1: Software process assessment and improvement

Models are used as references for process assessment and for determining improvement opportunities. Nevertheless, continuous software process improvement demands a disciplined approach. In our case, we have adopted an incremental, inductive improvement approach called the Quality Improvement Paradigm (QIP). QIP has been successfully applied for many years at several organizations. HP, Daimler-Benz, NASA, Nokia, and Motorola [7] offer a few examples where higher levels of maturity and return on investment have been obtained via the use of QIP and its supporting methods.

QIP: A two-loop feedback process

The QIP is a two-loop feedback process that includes a project loop and an organization loop. [8] It is a variation of the scientific method consisting of the steps illustrated in Figure 2.

Figure 2: QIP intra- and inter-loop

As shown in Figure 2, the QIP process involves the following steps:

1. Characterize and understand the organization and its process capabilities. To do so, perform capability determination and process assessment of the current projects and their environment with respect to reference models and metrics.

2. Set quantifiable goals for successful project performance and improvement.

3. Choose the appropriate process and supporting techniques, methods, and tools. New processes may need to be created. However, it is much more likely that existing processes will be tailored to fill the gaps identified during the characterization and understanding of the organization.

4. Execute the processes, construct the products, and collect and validate the prescribed data. According to AINSI, [9] after issues are identified and improvement options determined, a pilot project should be executed using the new practices and/or tools. The scope, participants, and project evaluation strategy should be defined before its execution. After the pilot project execution is finished and the results are analyzed, the technology might be transferred to the rest of the organization, or other pilot projects should be executed.

5. Analyze the data to evaluate the current practices, determine problems, record findings, and recommend future project improvements. This step is crucial to the success of this solution. It is here that both project and organization learning is leveraged. The feedback obtained by analyzing the measurable goals and process effectiveness will support the continuous improvement of both our business and our methodology.

6. Package the experience in the form of models and other forms of structured knowledge.

The following sections outline the approach to be used in each of the above steps.

Capability determination and process assessment

In the study performed by the Software Engineering Institute, RUP was assessed against the CMMI Continuous Representation. [10] The weaknesses of RUP regarding this reference model have been identified and suggestions for improvement have been outlined.

Set quantifiable goals

The Goal/Question/Metric (GQM) approach is a method for defining software measurement. [11] GQM allows us to define a measurement model on three levels:

- Conceptual level (goal): A goal is defined for an object, for a variety of reasons, with respect to various models of quality, from various points of view, and relative to a particular environment/project.
- Operational level (question): A set of questions is used to define models of the object of study and then focuses on that object to characterize the assessment or achievement of a specific goal.
- Quantitative level (metric): A set of metrics, based on the models, is associated with every question in order to answer it in a measurable way.

Table 1 provides a sample of the Goal-Questions-Metrics model used in our study.

Table 1: GQM sample

  Goal
    Purpose:      Improve RUP
    Issue:        Process capability level
    Process Area: Technical Solution and Supplier Management
    Viewpoint:    Process Engineer
    Environment:  Solution Development

  Question: What is the current level of compliance between RUP and the CMMI Technical Solution process area?
  Metrics:  # of CMMI Technical Solution process practices supported by RUP
            # of CMMI Technical Solution specific goals supported by RUP

  Question: What is the current level of support RUP provides vis-à-vis the CMMI Supplier Management process area?
  Metrics:  # of CMMI Supplier Management process practices supported by RUP
            # of CMMI Supplier Management specific goals supported by RUP
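To make the three GQM levels more tangible, the following minimal Python sketch models a goal, its questions, and its metrics in the shape of Table 1. It is an illustration only, not tooling described in the paper; the class and variable names are assumptions, and the example rating values merely echo the RUP row of Table 3.

# Illustrative sketch only: a minimal data model for the three GQM levels.
# Names are hypothetical; example ratings are for demonstration, not results.
from dataclasses import dataclass, field


@dataclass
class Metric:
    name: str                      # quantitative level: answers a question measurably
    value: float | None = None


@dataclass
class Question:
    text: str                      # operational level: characterizes goal achievement
    metrics: list[Metric] = field(default_factory=list)


@dataclass
class Goal:                        # conceptual level: purpose, issue, viewpoint, environment
    purpose: str
    issue: str
    process_area: str
    viewpoint: str
    environment: str
    questions: list[Question] = field(default_factory=list)


# Sample goal patterned after Table 1.
goal = Goal(
    purpose="Improve RUP",
    issue="Process capability level",
    process_area="Technical Solution and Supplier Agreement Management",
    viewpoint="Process Engineer",
    environment="Solution Development",
    questions=[
        Question(
            text="What is the current level of compliance between RUP and "
                 "the CMMI Technical Solution process area?",
            metrics=[
                Metric("# of CMMI Technical Solution process practices supported by RUP"),
                Metric("# of CMMI Technical Solution specific goals supported by RUP"),
            ],
        )
    ],
)

# One way to answer the first metric: count practices rated Medium or High in a
# support map (here, only the four weakly supported practices from Table 3).
ts_support = {"TS SP 1.1-1": "M", "TS SP 1.1-2": "L", "TS SP 1.3-1": "M", "TS SP 2.4-3": "L"}
goal.questions[0].metrics[0].value = sum(1 for rating in ts_support.values() if rating in ("M", "H"))
print(goal.questions[0].metrics[0])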

Choose the processes

Based on the characterization of the environment and the defined objectives, we have to choose the appropriate processes for improvement, as well as the process tools and supporting methods, making sure they agree with the objectives. Table 2 presents the specific process practices within the CMMI Technical Solution process area that have low or medium support by RUP according to the SEI assessment.

Table 2: Specific process practices from the CMMI Technical Solution and Supplier Agreement Management process areas weakly supported by RUP

  Technical Solution
    SG 1 Select Product-Component Solutions
      TS SP 1.1-1   Develop Alternative Solutions and Selection Criteria
      TS SP 1.1-2   Develop Detailed Alternative Solutions and Selection Criteria
      TS SP 1.3-1   Select Product-Component Solutions
    SG 2 Develop the Design
      TS SP 2.4-3   Perform Make, Buy, or Reuse Analyses

  Supplier Agreement Management
    SG 1 Establish Supplier Agreements
      SAM SP 1.1-1  Determine Acquisition Type
      SAM SP 1.2-1  Select Suppliers
      SAM SP 1.3-1  Establish Supplier Agreements
    SG 2 Satisfy Supplier Agreements
      SAM SP 2.1-1  Review COTS Products
      SAM SP 2.2-1  Execute the Supplier Agreement
      SAM SP 2.3-1  Accept the Acquired Product
      SAM SP 2.4-1  Transition Products

Table 3 shows the process models we analyzed and their compliance with the CMMI Technical Solution specific process practices. CMMI Supplier Agreement Management specific process practices are not included, since RUP does not provide any specific support for them. The compliance level is indicated by Low, Medium, or High (L, M, H).

Table 3: Process models vs. Technical Solution process area

  Process Analyzed                                                          | SP 1.1-1 | SP 1.1-2 | SP 1.3-1 | SP 2.4-3
  NASA SEL COTS-based process                                               |    -     |    -     |    M     |    H
  OTSO COTS Selection Process                                               |    H     |    H     |    H     |    L
  SPC's Subcontracting Products or Services for Software-Intensive Systems  |    L     |    L     |    L     |    H
  RUP                                                                       |    M     |    L     |    M     |    L

Note that the RUP benchmarking was extracted from Gallagher & Brownsword's 2001 RUP/CMMI tutorial. [12] Our results can be further explained as follows:

Rational Unified Process. We replicated here the results of the assessment conducted by the SEI, in which RUP was benchmarked against the CMMI Continuous Representation. Only the specific practices considered as having either low or medium support by RUP are shown.

NASA SEL COTS-based process. [13] After investigating fifteen projects developed according to the NASA COTS-based software development process, Morisio and colleagues proposed new process elements for COTS (commercial off-the-shelf software) identification, selection, and integration. For instance, the requirements process was enhanced with the following activities: make versus buy decision, COTS identification and selection, and COTS familiarization. Table 4 shows the synergy between this process model and the CMMI Technical Solution process area.

Table 4: Synergy between the NASA SEL COTS-based process (Morisio et al., 2001) and the CMMI Technical Solution process area

  TS SP 1.3-1: Select the product-component solutions that best satisfy the criteria established.
    NASA SEL process elements: Requirements Definition; COTS Identification and Selection.
    Comments: Requirements for the project are sketched out to guide the identification of COTS. In addition, COTS are identified and evaluated using vendor documentation, reviews, peer experiences, and suggestions.

  TS SP 2.4-3: Evaluate whether the product components should be developed, purchased, or reused based on established criteria.
    NASA SEL process elements: Make Versus Buy Decision I; Make Versus Buy Decision II.
    Comments: These two make vs. buy decision activities analyze in detail different tradeoffs among requirements satisfied, risks accepted, and cost.

OTSO COTS Selection Process. Kontio [14] proposes a method for searching, screening, and evaluating COTS. This method was successfully applied in two case studies carried out with Hughes Corporation in the EOS program developed for NASA. The Software Productivity Consortium embraced this method as part of its COTS purchase process (SPC, 1989). Table 5 presents the synergy between OTSO and the CMMI Technical Solution and Supplier Management process areas.

Table 5: Synergy between OTSO and the CMMI Technical Solution and Supplier Management process areas

  TS SP 1.1-1: Develop alternative solutions and selection criteria.
    OTSO process element: Search criteria definition.
    Comments: It produces the criteria necessary to conduct the selection and evaluation of the component alternatives.

  TS SP 1.1-2: Develop detailed alternative solutions and selection criteria.
    OTSO process elements: Detailed evaluation criteria definition; Weighing of criteria.
    Comments: It provides guidance on how to elaborate requirements for the COTS into a well-defined measurement criteria set. These criteria are used to select, screen, and evaluate COTS, thereby creating alternative solutions.

  TS SP 1.3-1: Select the product-component solutions that best satisfy the criteria established.
    OTSO process element: Analysis of results.
    Comments: It is used to perform analysis on the financial and qualitative data resulting from the criteria-based evaluation and to decide which component(s) will be selected for inclusion in the system.

  SAM SP 2.1-1: Review candidate COTS products to ensure they satisfy the specified requirements that are covered under a supplier agreement.
    OTSO process element: The OTSO process as a whole.
    Comments: OTSO provides detailed guidance for: developing criteria for evaluating COTS products; evaluating candidate COTS products against requirements and evaluation criteria; and selecting COTS products to be acquired or reused.

SPC's Subcontracting Products or Services for Software-Intensive Systems Guidebook. This provides a process for managing the subcontracting of products or services. The first activity of this process is to perform the make vs. buy analysis. This activity describes the use of a balanced scorecard to identify and focus on business needs when evaluating make versus buy alternatives (SPC, 2001). The scorecard contains financial performance, customer satisfaction, internal process, learning and innovation, and sourcing liability factors, along with strategies to optimize performance for those factors that relate to business needs. Table 6 presents the synergy between this process, more specifically the activity to perform make vs. buy analysis, and the CMMI Technical Solution process area.

Table 6: Synergy between SPC's guidebook and the CMMI Technical Solution and Supplier Management process areas

  TS SP 2.4-3: Evaluate whether the product components should be developed, purchased, or reused based on established criteria.
    SPC process element: Perform Make vs. Buy Analysis.
    Comments: The option to purchase any of the candidate COTS products is compared to the option to build a similar product in-house, with both options evaluated against previously established decision criteria.

  SAM SP 1.1-1: Determine the type of acquisition for each product or product component to be acquired.
  SAM SP 1.2-1: Select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria.
    SPC process element: Solicit and Select Supplier.
    Comments: Via its sub-activity 'Determine Contract Type,' Solicit and Select Supplier recommends the type of acquisition for each product or product component to be acquired. It treats the 'contract type' specification before beginning the supplier relationship as a key to mitigating risk early on. SPC fulfills the CMMI recommendation to select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria. The CMMI specified practices for 'Select Suppliers' are fulfilled in one or many of the following exit artifacts: supplier evaluation criteria; solicitation materials; solicited supplier list; evaluations of each prospective supplier's ability to perform, with associated risks; and identification of the product or service to be acquired, with preferred supplier and alternates.

  SAM SP 1.3-1: Establish and maintain formal agreements with the supplier.
    SPC process element: Agree to Terms.
    Comments: SPC fulfills the CMMI recommendation to establish and maintain formal agreements with the supplier. The CMMI specified practices for 'Establish Supplier Agreements' are fulfilled in one or many of the following exit artifacts: SOW; supplier agreement (e.g., contract, MOU); update of the project's plans and risk plans; supplied product or service specifications; and SMP (Supplier Management Plan).

  SAM SP 2.1-1: Review candidate COTS products to ensure they satisfy the specified requirements that are covered under a supplier agreement.
    SPC process elements: Perform Make vs. Buy Analysis; Solicit and Select Supplier.
    Comments: During the Perform Make vs. Buy Analysis activity, the evaluation criteria needed to meet the business objectives are documented and trade studies are performed. Solicit and Select Supplier also indirectly addresses this specific practice by leveraging a component evaluation process. As indicated before, OTSO provides detailed guidance for screening, evaluating, and selecting COTS products. Therefore, SPC recommends the use of a component evaluation process when the make or buy decision is oriented towards the acquisition of COTS products.

  SAM SP 2.2-1: Perform activities with the supplier as specified in the supplier agreement.
    SPC process element: Manage the Relationship.
    Comments: SPC fulfills the CMMI recommendation to perform activities with the supplier as specified in the supplier agreement. The CMMI specified practices for 'Execute the Supplier Agreement' are fulfilled in one or many of the following exit artifacts: supplier progress and status reports; audit and review reports; technical and management review reports; action items; and documentation of work product and document deliveries.

  SAM SP 2.3-1: Ensure that the supplier agreement is satisfied before accepting the acquired product.
    SPC process element: Accept the Products and Services.
    Comments: SPC fulfills the CMMI recommendation to ensure that the supplier agreement is satisfied before accepting the acquired product. The CMMI specified practices for 'Accept the Acquired Product' are fulfilled in one or many of the following exit artifacts: reviews or test cases are performed and acceptance criteria are met; required deliverables are placed under CM.

  SAM SP 2.4-1: Transition the acquired products from the supplier to the project.
    SPC process element: Transition the Product or Service.
    Comments: SPC fulfills the CMMI recommendation to transition the acquired products from the supplier to the project. The CMMI specified practices for 'Transition Products' are fulfilled in one or many of the following exit artifacts: delivered product with user and maintenance guides; transition plans; training plans; support and maintenance plans; and transition audit and test results, report, and corrective action plan.

In this section, we presented an overview of the process models we investigated to address the weaknesses of RUP with regard to the CMMI Technical Solution and Supplier Management process areas. Three process models (in addition to RUP) were analyzed, and we indicated the level of synergy between these processes and CMMI. The three cited processes inspired us to propose the creation of new process elements to extend RUP. Later on, we will present how this was done.

Packaging experiences for project and corporate learning

Once the measurable goals are set, the processes are chosen and executed. Then it is time to analyze the results, package the experience and lessons learned, and save these packages in our corporate repository for reuse and continuous improvement. By doing so, our assets can be reused and can evolve. In addition, as prescribed in Figure 2, we should consider them for further iterations in the same project or leverage them across the organization. To package our experience, we have extended RUP. In the next section, these new elements are presented.

New process elements

Figure 3 shows an overview of the new process elements that we have created. In the following paragraphs I provide a high-level description of these new process elements. To create the elements, we have used Spearmint, [15] a graphical modeling tool for describing software development processes. The tool's conceptual schema of process information is a subset of the OMG's Software Process Engineering Metamodel (SPEM) specification. The tool utilizes a graphical notation close to UML. A full description of the new elements can be provided upon request. [16]

Figure 3: New process elements
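For readers unfamiliar with SPEM-style process modeling, the following Python sketch gives a rough, simplified feel for how a process element can be captured as an activity with a responsible role, input and output artifacts, and sub-activities. It is not the Spearmint schema or the authors' actual models; all class names are assumptions, and the example content merely anticipates the "Perform COTS evaluation" activity described below.

# Rough, SPEM-inspired illustration of a process element (not the actual
# Spearmint/SPEM schema). All names below are assumptions for the example.
from dataclasses import dataclass, field


@dataclass
class Artifact:
    name: str


@dataclass
class Role:
    name: str


@dataclass
class Activity:
    name: str
    responsible: Role
    inputs: list[Artifact] = field(default_factory=list)
    outputs: list[Artifact] = field(default_factory=list)
    sub_activities: list["Activity"] = field(default_factory=list)


# Example: the "Perform COTS evaluation" activity described in the next
# section, sketched with its sub-activities and main work products.
perform_cots_evaluation = Activity(
    name="Perform COTS evaluation",
    responsible=Role("Architect"),
    inputs=[Artifact("Software Requirement Specification"), Artifact("Vision")],
    outputs=[Artifact("List of selected components"), Artifact("Change requests")],
    sub_activities=[
        Activity("Search and screen candidates", Role("Architect")),
        Activity("Define evaluation criteria", Role("Architect")),
        Activity("Evaluate component alternatives", Role("Architect")),
        Activity("Analyze evaluation results", Role("Architect")),
    ],
)

for sub in perform_cots_evaluation.sub_activities:
    print(f"{perform_cots_evaluation.name} -> {sub.name}")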

Based on the analyses of the process models cited in the previous section, we proposed that the following activities be added to our version of RUP.

Perform the make vs. buy analysis. The make vs. buy analysis is the decision-making process to determine whether to implement a COTS solution or build a custom solution. The project manager is assigned this activity. As noted by Morisio and his colleagues, the use of COTS in a project is a key decision that impacts all subsequent phases and the success of the project. Therefore, the main project stakeholders should participate in this analysis and final decision.
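The SPC guidebook cited earlier frames this decision as a weighted balanced scorecard across financial performance, customer satisfaction, internal process, learning and innovation, and sourcing liability factors. As a minimal sketch of that idea (not the SPC scorecard itself), a weighted comparison of the two options might look like the following; the weights and scores are invented for illustration.

# Minimal sketch of a weighted make-vs.-buy comparison in the spirit of the
# SPC balanced scorecard. Factor names follow the text; weights and scores
# are invented for illustration and carry no real project data.

FACTORS = {                        # weight of each business factor (sums to 1.0)
    "financial performance": 0.30,
    "customer satisfaction": 0.25,
    "internal process": 0.20,
    "learning and innovation": 0.10,
    "sourcing liability": 0.15,
}


def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-factor scores (0-10) into a single weighted score."""
    return sum(FACTORS[factor] * scores[factor] for factor in FACTORS)


# Hypothetical assessments of the two options against the same factors.
make_option = {"financial performance": 5, "customer satisfaction": 8,
               "internal process": 7, "learning and innovation": 8,
               "sourcing liability": 9}
buy_option = {"financial performance": 8, "customer satisfaction": 7,
              "internal process": 6, "learning and innovation": 5,
              "sourcing liability": 6}

decision = "buy (COTS)" if weighted_score(buy_option) > weighted_score(make_option) else "make (custom build)"
print(f"make={weighted_score(make_option):.2f}  buy={weighted_score(buy_option):.2f}  -> {decision}")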

Define COTS evaluation scope. This planning activity is performed by the project manager to define the scope for the activities involved in the component evaluation process. [17] The organizational characteristics provided in the Software Development Document, the customer needs indicated in the Vision document, and the project specifications provided in the Software Requirement Specification (SRS) and Supplementary Specification are used by the project manager to create the component evaluation plan. This plan would enrich the Software Development Document and define the level of effort for each activity to be performed in the component evaluation process. Each iteration may require that this plan be redefined for one or more of the process activities. Further details about the Software Development Document, SRS, Supplementary Specification, and Vision artifacts can be found in RUP.

Perform COTS evaluation. This activity addresses the problems of evaluating, comparing, and selecting components. [18] This activity would complement the Analysis and Design discipline, more precisely the Perform Architectural Synthesis workflow detail. Although this activity focuses on off-the-shelf components, it applies to all types of components, large or small. This activity is strongly related to architectural synthesis. It deals with evaluating components that already exist within the project or that have been created in previous iterations or system versions, or COTS that have been requested to be part of the system. The architect is primarily responsible for this activity. As recommended by Kontio (1995) and SPC, this activity can be decomposed into the following sub-activities: search and screen candidates, define evaluation criteria, evaluate component alternatives, and analyze evaluation results. Figure 4 shows the product flow of these sub-activities. Based on the results provided by this activity, the architect can generate change requests related to the proposed architecture and/or requirements. These change requests would be managed using the project change management process, as recommended in RUP. The main output of this activity would be a list of selected components indicating the components chosen for inclusion in the system as a result of the evaluation process. As far as RUP artifacts are concerned, this list would be included in the Software Architecture Document.

Figure 4: Component evaluation product flow
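To make the product flow in Figure 4 more concrete, here is a hypothetical sketch of the four sub-activities as a simple pipeline: candidates are screened against mandatory requirements, survivors are scored against weighted criteria, and the results are analyzed to yield the list of selected components. All candidate names, criteria, weights, and scores are invented for illustration; this is not the OTSO or SPC procedure itself.

# Hypothetical sketch of the component evaluation flow in Figure 4:
# search/screen candidates -> define evaluation criteria ->
# evaluate component alternatives -> analyze evaluation results.
# All data below is invented.

# 1. Search and screen: keep only candidates meeting mandatory requirements.
candidates = [
    {"name": "Component A", "meets_mandatory": True},
    {"name": "Component B", "meets_mandatory": False},
    {"name": "Component C", "meets_mandatory": True},
]
screened = [c["name"] for c in candidates if c["meets_mandatory"]]

# 2. Define evaluation criteria and their weights (cf. OTSO's weighing of criteria).
criteria = {"functional fit": 0.4, "vendor support": 0.2, "integration cost": 0.3, "license terms": 0.1}

# 3. Evaluate component alternatives: score each screened candidate per criterion (0-10).
scores = {
    "Component A": {"functional fit": 8, "vendor support": 6, "integration cost": 7, "license terms": 9},
    "Component C": {"functional fit": 7, "vendor support": 9, "integration cost": 5, "license terms": 8},
}


# 4. Analyze evaluation results: rank by weighted score and select the best.
def total(name: str) -> float:
    return sum(weight * scores[name][criterion] for criterion, weight in criteria.items())


ranked = sorted(screened, key=total, reverse=True)
selected_components = ranked[:1]   # this list would feed the Software Architecture Document
print([(name, round(total(name), 2)) for name in ranked], "->", selected_components)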

Purchase COTS product and service agreement. This activity's main goal is to acquire the selected components. It is also recommended to purchase a service agreement that provides support and upgrades to the acquired components. [19]

Transition COTS product. This activity's main goal is to transition the acquired products from the supplier to the project. [20] The project manager must ensure that an adequate infrastructure is in place to receive, store, use, and maintain the acquired components. In addition, appropriate training must be provided for those involved in receiving, storing, using, and maintaining the acquired products. Finally, the project manager should make sure that storing, distributing, and using the acquired products are performed according to the terms and conditions specified in the supplier agreement or license.

Conclusion

RUP is the de facto industry standard for project lifecycle development and management. Unisys integrates RUP into its Business Blueprint methodology to provide a highly mature process across the entire organization. However, RUP presents low process capability in some CMMI process areas and needs improvement. To clearly identify these weaknesses and improve RUP to overcome them, we have used an empirically validated and technically sound software process improvement approach called QIP. As a result, a process model based on Rational Unified Process concepts was proposed to enhance RUP compliance with CMMI. To verify the efficiency and efficacy of the proposed process model, we are currently validating it on pilot projects. To roll it out across the entire organization, we are proposing the integration of this new capability via a CMMI plug-in for RUP. Finally, we believe that other groups can apply the approach we have described in this paper to mitigate risks in other process areas.

References

V. Basili, G. Caldiera, and D. Rombach. "The Goal Question Metric Approach." Encyclopedia of Software Engineering. Wiley, 1994.

V. R. Basili, M. K. Daskalantonakis, R. H. Yacobellis. "Technology Transfer at Motorola," IEEE Software, 11(2): 70-76.

L. Briand, K. El Emam, W. L. Melo. "AINSI: An inductive method for software process improvement: concrete steps and guidelines." In K. El Emam and N. H. Madhavji (eds.), Elements of Software Process Assessment and Improvement. IEEE Press, 1999.

CMMI Product Team. Capability Maturity Model Integration (CMMI-SM), Version 1.1, Continuous Representation, CMU/SEI-2002-TR-011, 2002.

CMMI Product Team. Capability Maturity Model Integration (CMMI-SM), Version 1.1, Staged Representation, CMU/SEI-2002-TR-012, ESC-TR-2002-012, 2002.

K. El Emam, J.-N. Drouin, W. Melo. SPICE: The Theory and Practice of Software Process Improvement and Capability Determination. IEEE Press, 1998.

B. Gallagher & L. Brownsword. "The Rational Unified Process and the Capability Maturity Model Integrated Systems/Software Engineering." RUP/CMMI Tutorial, ESEPG, 2001.

R. Grady. Successful Software Process Improvement. Prentice-Hall, 1997.

T. Kilpi. "Implementing a Software Metrics Program at Nokia," IEEE Software, 18(6):72-77, 2001.

J. Kontio. "A Case Study in Applying a Systematic Method for COTS Selection," Proc. of the 18th Int. Conf. on Software Engineering, IEEE CS Press, March 1996.

F. McGarry et al. "An Overview of the NASA Software Engineering Laboratory." NASA SEL, Technical Reports, SEL-94-005, 1994.

ISO/IEC TR 15504-1. "Information technology - Software process assessment - Part 1: Concepts and introductory guide." 1998.

L. V. Manzoni. "Using a Workflow Management System to Support Software Development Based on Extended Rational Unified Process to Reach Maturity Model Levels 2 and 3," Master's Dissertation, Inst. of Informatics, Federal Univ. of Rio Grande do Sul, Porto Alegre, Brazil, 2001, http://www.inf.ufrgs.br/amadeus/atuais/lisandra.html.
L. V. Manzoni & R. T. Price. "Identifying Extensions Required by RUP to Comply with CMM Levels 2 and 3." IEEE TSE, Vol. 29, No. 2, February 2003.

M. Morisio, C. B. Seaman, V. R. Basili, A. T. Parra, S. E. Kraft, S. E. Condon. "COTS-Based Software Development: Processes and Open Issues." Journal of Software and Systems, 2001.

M. Paulk. Capability Maturity Model for Software, Version 1.1. Addison-Wesley, 1993.

Software Productivity Consortium (SPC). Component Evaluation Process. SPC-98091-CMC, 1999.

Software Productivity Consortium (SPC). "Subcontracting Products or Services for Software-Intensive Systems." SPC-2000039-MC, September 2001.

Rational Software. "Reaching CMM Levels 2 and 3 with the Rational Unified Process." White Paper, 2001.

R. W. Reitzig, Carlo Rodriguez, Gary Holt. "Achieving Capability Maturity Model Level 2 with the Rational Unified Process." Cognence, Inc., Integrated Software Engineering (www.cognence.com). White Paper, 2002.

R. W. Reitzig, John B. Miller, Dave West, Raymond L. Kile. "Achieving Capability Maturity Model Integration Maturity Level 2 Using IBM Rational Software's Solutions." Rational White Paper, 2003.

R. W. Reitzig. "Using Rational Software Solutions to Achieve CMMI Level 2." The Rational Edge, January 2003.

Appendix A: Related work

RUP is one of the pillars of the Unisys Blueprint Methodology. RUP has been claimed to be a "software process improvement tool" that would help an organization or project achieve a higher level of process capabilities. In this section, we highlight some studies that advocate RUP as such a tool. We do not intend to cover all the experiences already reported on this subject, but to show a sample related to our proposal.

Reitzig and his colleagues (2002) identified various RUP roles, disciplines, templates, and activities that would apply in satisfying the various CMM Level 2 key practice areas. To deal with the weaknesses of RUP regarding SW-CMM Levels 2-3, they recommended using other processes in combination with RUP. For instance, regarding the lack of RUP support for the supplier management process, they claimed that an organization could become compliant with this process area by using the IEEE Std 1062 Recommended Practice for Software Acquisition. This standard outlines the recommended steps an organization should follow when undergoing a software acquisition effort. According to Reitzig et al., if the standard were used correctly, many of the supplier management procedures asked for by the SW-CMM would be produced.

In another white paper, Rational Software (2001) describes how the Rational Unified Process can support an organization that is trying to achieve SW-CMM Levels 2-3. High-level recommendations are provided on how to cover RUP weaknesses regarding SW-CMM Levels 2-3 (Rational, 2001).

Reitzig and his colleagues (2003) provide guidelines on how an organization that uses RUP can accelerate the attainment of CMMI maturity level 2 and have a solid foundation for maturity level 3. Since the Technical Solution process area is only required at CMMI Staged Representation Level 3, they did not address the weaknesses identified in RUP regarding compliance with this process area. Regarding the supplier management process area, which is required at CMMI Staged Representation Level 2, they indicated that many of the practices required by the CMMI in the Supplier Agreement Management process area are not specifically addressed by RUP. Reitzig (2003) complements this study by indicating supplementary standards and practices that could be leveraged to overcome RUP weaknesses regarding CMMI Levels 2-3, but he does not explicitly indicate how RUP should be enhanced to incorporate his recommendations.

Manzoni and Price (2003), after identifying the issues of RUP with regard to SW-CMM, proposed creating new process elements to improve RUP.
For instance, they proposed a new discipline for dealing with supplier management and new procedures to estimate and track critical computer resources. Detailed descriptions of these new elements can be found in Manzoni (2001).

These cited studies are very useful for confirming the weaknesses of RUP with regard to CMMI compliance. They also provide overall guidance on how RUP could be enhanced to overcome such issues.

Notes

1. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the author(s) and do not necessarily reflect the views of Unisys Corporation.

2. B. Gallagher & L. Brownsword. "The Rational Unified Process and the Capability Maturity Model Integrated Systems/Software Engineering." RUP/CMMI Tutorial, ESEPG, 2001.

3. M. Paulk. Capability Maturity Model for Software, Version 1.1. Addison-Wesley, 1993.

4. B. Gallagher & L. Brownsword. "The Rational Unified Process and the Capability Maturity Model Integrated Systems/Software Engineering." RUP/CMMI Tutorial, ESEPG, 2001.

5. L. V. Manzoni & R. T. Price, "Identifying Extensions Required by RUP to Comply with CMM Levels 2 and 3," IEEE TSE, Vol. 29, No. 2, February 2003.

6. K. El Emam, J.-N. Drouin, W. Melo. SPICE: The Theory and Practice of Software Process Improvement and Capability Determination. IEEE Press, 1998.

7. The success with QIP is documented for most of these companies in the studies cited here. For HP, see R. Grady, Successful Software Process Improvement, Prentice-Hall, 1997. For NASA, see F. McGarry et al., "An Overview of the NASA Software Engineering Laboratory," NASA SEL, Technical Reports, SEL-94-005, 1994. For Nokia, see T. Kilpi, "Implementing a Software Metrics Program at Nokia," IEEE Software, 18(6):72-77, 2001. For Motorola, see V. R. Basili, M. K. Daskalantonakis, R. H. Yacobellis, "Technology Transfer at Motorola," IEEE Software, 11(2): 70-76.

8. V. Basili, "Software Improvement Feedback Loops: The SEL Experience," 10th Software Development Expo & Conference (SODEC), Tokyo, Japan, June 2001.

9. L. Briand, K. El Emam, W. L. Melo. "AINSI: An inductive method for software process improvement: concrete steps and guidelines," in K. El Emam and N. H. Madhavji (eds.), Elements of Software Process Assessment and Improvement. IEEE Press, 1999.

10. B. Gallagher & L. Brownsword, 2001. Op. cit.

11. V. Basili, G. Caldiera, and D. Rombach, "The Goal Question Metric Approach." Encyclopedia of Software Engineering. Wiley, 1994.

12. B. Gallagher & L. Brownsword, "The Rational Unified Process and the Capability Maturity Model Integrated Systems/Software Engineering," RUP/CMMI Tutorial, ESEPG, 2001.

13. M. Morisio, C. B. Seaman, V. R. Basili, A. T. Parra, S. E. Kraft, S. E. Condon, "COTS-Based Software Development: Processes and Open Issues," Journal of Software and Systems, 2001. It is important to point out that Morisio's COTS-based process provides new elements to an already existing NASA COTS-based software development process, which is essentially different from the Rational Unified Process. It is out of the scope of this paper to compare NASA COTS-based software development to RUP. Our goal here is to indicate that the ideas proposed by Morisio and his colleagues have influenced our decision to create new process elements to enhance our corporate process and increase our compliance with CMMI.

14. J. Kontio. "A Case Study in Applying a Systematic Method for COTS Selection," Proc. of the 18th Int. Conf. on Software Engineering, IEEE CS Press, March 1996.

15. http://www.iese.fhg.de/spearmint_epg/

16. A full description of the new elements can be provided upon request.

17. Software Productivity Consortium (SPC). Component Evaluation Process. SPC-98091-CMC, 1999.

18. Ibid. See also J. Kontio, "A Case Study in Applying a Systematic Method for COTS Selection," Proc. of the 18th Int. Conf. on Software Engineering, IEEE CS Press, March 1996.

19. Software Productivity Consortium, 1999. Op. cit.

20. Ibid.