Information-Centric Assessment of Software Metrics Practices

Helle Damborg Frederiksen and Lars Mathiassen

Abstract: This paper presents an approach to enhance managerial usage of software metrics programs. The approach combines an organizational problem-solving process with a view of metrics programs as information media. A number of information-centric views are developed to engage key stakeholders in debating current metrics practices and identifying possible improvements. We present our experiences and results from using the approach at Software Inc., offer comparisons to related approaches to software metrics, and discuss how to use the information-centric approach for improvement purposes.

Index Terms: Improvement, information medium, soft systems approach, software metrics.

Manuscript received August 13, 2003; revised May 1, 2004, December 1, 2004, and March 1, 2005. Review of this manuscript was arranged by Department Editor R. Sabherwal. This work was supported in part by the Department of Computer Science, Aalborg University, Aalborg, Denmark, and in part by the case company. H. D. Frederiksen is with the Department of Computer Science, Aalborg University, Aalborg, Denmark (e-mail: hdf@kmd.dk). L. Mathiassen is with the Center for Process Innovation, J. Mack Robinson College of Business, Georgia State University, Atlanta, GA, USA (e-mail: lars.mathiassen@eci.gsu.edu).

I. INTRODUCTION

STATE-OF-THE-ART textbooks emphasize metrics programs as a key to developing and maintaining professional software practices. A metrics program [46] builds on measures, which provide quantitative indications of the extent, amount, dimensions, capacity, or size of some attribute of a software product or process. The measures result from collecting one or more data points and aggregating them through the metrics program to obtain performance indicators. The organization can use the measures and indicators to support managerial decision-making and intervention related to software practices [5], [6], [21].

Collecting and using data about the software operation for managerial purposes is by no means a trivial task [11], [13], [19], [28], [44], [47]. Project managers and software engineers are expected to supply measures, but constant deadline pressures and weak incentives to prioritize data collection typically lead to low data quality. Software managers are expected to base their decisions and interventions on metrics program data, but the appropriate data might not be available when it is needed. When data is available, managers might not trust its quality or might react emotionally to negative indicators by questioning the metrics and the data collected.

Concerns like these have led to the development of approaches to assess and improve software metrics programs [5], [6], [32], [37], [38]. This paper contributes to this line of research by presenting a new information-centric assessment approach. The approach seeks to enhance managerial usage of software metrics programs by: 1) viewing software metrics programs as media for collecting, analyzing, and communicating information about the software operation; 2) applying different information-centric views to assess metrics practices; and 3) involving relevant stakeholders in debating as-is and could-be metrics practices. The proposed approach is adaptable to the particular organizational context in which it is used.
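To make the measures-to-indicators aggregation concrete, the following minimal Python sketch shows how raw data points might be rolled up into a performance indicator. All names and values here (Measure, productivity_indicator, the sample project) are hypothetical illustrations, not artifacts from the paper or from Software Inc.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One quantitative data point about a software product or process."""
    project: str
    attribute: str  # e.g., "function_points" or "person_hours"
    value: float

def productivity_indicator(measures: list, project: str) -> float:
    """Aggregate raw measures into a simple indicator: function points per person-hour."""
    fp = sum(m.value for m in measures
             if m.project == project and m.attribute == "function_points")
    hours = sum(m.value for m in measures
                if m.project == project and m.attribute == "person_hours")
    return fp / hours if hours else 0.0

measures = [
    Measure("billing", "function_points", 420.0),
    Measure("billing", "person_hours", 3500.0),
]
print(f"productivity: {productivity_indicator(measures, 'billing'):.3f} FP/hour")
```

The point of the sketch is only that indicators are derived, aggregated quantities; how well they inform decisions depends on the practices discussed in the remainder of the paper.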
The underlying research was conducted as part of a three-year collaborative research effort [35] to improve metrics practices within Software Inc., a large Danish software organization. We first present the theoretical background for our research in Section II, followed by the research design in Section III, which describes the research approach and Software Inc.'s metrics program. Section IV presents our information-centric assessment approach, which consists of five activities: appreciate current situation, create information-centric viewpoints, compare situation with viewpoints, interpret findings, and identify improvements. Section V presents the results of applying the approach at Software Inc., followed by an overview of our experiences as assessment lessons in Section VI. Finally, Section VII reviews our research contribution and related approaches to software metrics.

II. THEORETICAL BACKGROUND

Most software metrics research focuses on defining metrics, but there is also a fair amount of research into implementing metrics programs [15], [16]. It is nonetheless difficult to successfully design and implement a metrics program: by one estimate, up to 78% of metrics programs fail [13]. Recent research has, therefore, focused on critical success factors [5], [13], [20], [21], [23], [24], [27], [28], [30], [45], [48]. Among the issues explored are lack of alignment with business goals, lack of management commitment, and insufficient resources [20], [21]. With this background, we focus on how to assess and improve managerial usage of metrics programs. We first present related work on assessment of software metrics programs (Section II-A). We then present the theoretical foundation for our approach (Section II-B).

A. Assessment of Metrics Programs

The experienced difficulties in developing successful software metrics programs have led to an interest in assessing practices in software organizations that already collect data about their software operation [38]. Mendonça and Basili have developed an approach for improving data collection mechanisms and data usage within a software organization. They combine a top-down approach using the goal-question-metric (GQM) paradigm and a bottom-up approach based on data mining techniques. The top-down perspective creates a structured understanding of the existing set of measurements, and the bottom-up perspective helps reveal new and relevant information that can be generated from data already collected.

Kitchenham et al. [32] argue that measures and indicators should be representative of the attributes they are supposed to reflect. Validation is therefore critical to a metrics program's success. They suggest a framework that can help validate measures and assess their appropriateness in a given situation.

Berry and Jeffery identify variables that can lead to a metrics program's success or failure [5]. The purpose is to evaluate and predict the success of new programs on the basis of experiences with previous programs. They use a structured set of questions to collect data from people implementing and managing metrics programs. Most questions represent advice from experienced practitioners; some are based on theory. The instrument includes questions on the program's status, context, inputs, processes, and products.

Berry and Vandenbroek have developed a complementary approach to help individual software organizations improve metrics programs [6]. They offer a meta-framework to design and deploy assessment instruments that target specific software processes, e.g., project tracking and oversight or configuration management. This approach's critical elements are the ability to build a performance model tailored to a software organization's particular needs and the explicit inclusion of the experiences and attitudes of practitioners involved in metrics and software practices. The model also includes social factors, such as management leadership, fear of measurement, and ethical use of measures, that software engineers are usually poorly equipped to deal with [6]. The authors suggest that an organization can use the method periodically, with or without outside assistance. The approach is complex and the results are comprehensive.

Finally, the ISO/IEC measurement standard defines a measurement process applicable to software-related engineering and management disciplines. The standard defines the measurement process (e.g., activities for specifying information needs) and activities for determining the validity of analysis results. Practical software measurement (PSM) [37] serves as the basis for the ISO standard and provides details on the standard's activities. The ISO/IEC standard offers a normative basis against which existing metrics programs can be assessed, and it was used as a basis for creating the CMMI's measurement and analysis process area [1].

While these approaches provide valuable support for assessing and improving software metrics programs, none focus on how metrics programs are used to support software management. Several studies suggest, however, that a metrics program's success depends intrinsically on the regular use of indicators and measures to inform managerial decision-making and intervention [20], [21], [27], [28], [42]. Our research is, therefore, directed toward filling this gap.
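As a concrete illustration of the top-down GQM structure on which Mendonça and Basili build, the following sketch encodes a goal-question-metric hierarchy as plain data. The goal, questions, and metrics below are invented for illustration and are not taken from the cited work.

```python
# Minimal sketch of a goal-question-metric (GQM) hierarchy.
# Concrete entries are hypothetical examples only.
gqm = {
    "goal": "Improve estimation accuracy of development projects",
    "questions": [
        {
            "question": "How far do actual efforts deviate from estimates?",
            "metrics": ["estimated person-hours", "actual person-hours"],
        },
        {
            "question": "Does deviation depend on project size?",
            "metrics": ["size in function points", "relative effort deviation"],
        },
    ],
}

# Walk the hierarchy from goal to metrics, as a top-down derivation would.
print("Goal:", gqm["goal"])
for q in gqm["questions"]:
    print(" ", q["question"], "->", ", ".join(q["metrics"]))
```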
Metrics programs measure attributes of software processes and products and make data available at different levels of aggregation to stakeholders within a software organization. Given this, a metrics program's primary purpose is to support and strengthen management practices on all organizational levels. Organizations use measures and indicators to estimate projects, analyze defects, identify improvement initiatives, pinpoint best practices, and so on. Grady [22] suggests that software metrics can be used for tactical and strategic purposes. Project managers represent the tactical use in project planning and estimation, progress monitoring, and product evaluation. People involved in software process improvement represent the strategic use, identifying and prioritizing improvement initiatives, engaging in defect analysis, and validating best practices.

B. Information-Centric Assessment

We propose an information-centric approach that combines two streams of theory and applies them to assess software metrics practices. First, it views software metrics programs as media for creating and sharing information about software practices [2], [51]. This perspective focuses on the relations between the measured software processes and products, the program designer's intended meaning for measures and indicators, and the program users' interpreted meaning based on data from the metrics program [43]. Second, it engages key stakeholders in debating metrics practices in the target software organization based on soft systems methodology (SSM) [8], [10], a general approach to organizational problem-solving.

The first theoretical foundation for our approach is to view metrics programs as information media. In order to measure software processes and products, you must observe them; for the resulting data to be useful, you must interpret and act upon them. Weinberg [51] represents these fundamental activities in a four-element model: intake, meaning, significance, and response. In the intake process, people access information about the world. In the meaning process, they assign meaning to their observations. In the significance process, they give priority to these meanings. Finally, in the response process, people act by translating their observations into actions.

Software metrics programs must therefore support observation of, and store data about, relevant software processes and products. In addition, such programs must support data interpretation and information communication between different actors; cf. "publish objectives and data widely" and "facilitate debate" [27]. Finally, to create value for the software organization, such programs must lead to managerial responses; cf. "use the data" [27].

Fig. 1 illustrates this view of software metrics programs as information media. Three types of activities are involved: measure software practices to generate data, analyze data to create useful and relevant information, and intervene in software practices based on information from the program. Metrics programs mediate the interaction between many stakeholders, including data suppliers, software engineers, software managers, improvement agents, and metrics staff.

While information is fundamental to computer and information science, there is, unfortunately, little agreement about the concept [39].

We have adopted a definition that is in line with Weinberg's [51] focus on sense-making. It emphasizes the relationships between the observed object, the intended meaning, and the interpreted meaning as follows [43]. Information is a symbolic expression of an object, i.e., a real-world entity or an abstract concept. Programs, people, and activities are real-world entities; program size and quality, person-hours, and productivity are examples of abstract concepts related to software practices. Program designers create information to communicate an intended meaning about specific objects, e.g., when they define a procedure for assessing a program's function points as a way to understand and measure program size. Program users interpret the information's symbolic representation in a given social context in order to interpret the related object's meaning. When managers read a productivity report based on function points, they might assume that all types of function points are equally difficult to implement, while the intended meaning was to differentiate between more or less complex function point types.

Fig. 1. Software metrics program as a medium.

There are several reasons for viewing software metrics programs as information media. First, two potential risks of metrics programs are that they might fail to make managers use the data and fail to make intentions and interpretations meet [42]. Second, metrics programs are used to support communication and interaction between different actors [27]. Third, a media perspective applies to the complete cycle of storing, accessing, interpreting, and acting upon observations about software processes and products [51]. Finally, information systems have generally become media for communication and collaboration [2].

The second theoretical foundation for our approach is organizational problem-solving based on SSM [8], [10]. SSM provides a general approach for addressing problematic situations in organizational contexts and has been used extensively to address and study information systems issues (e.g., [3], [12], [34], [49]). It has also been sporadically adopted in relation to software metrics [4], [25]. SSM's generic activities involve appreciating the situation (i.e., existing metrics practices); developing idealized viewpoints on the situation; comparing viewpoints of the situation in a debate between relevant stakeholders; and identifying actions that can lead to an improved situation. SSM is a qualitative, interpretive approach to organizational problem-solving. It is based on the assumption that actors have different beliefs and viewpoints, and it engages the involved stakeholders in a debate to learn from their differences and experiences. To do this, and thus identify possible improvements, SSM uses soft systems, i.e., idealized viewpoints on a situation expressed as adaptive systems [8], [10]. Checkland [9] provides a survey of SSM concepts and practices.

We base our assessment approach on SSM for many reasons. First, SSM is particularly well suited to addressing complex organizational practices involving many different stakeholders. Second, SSM has proven useful in addressing issues related to information systems in organizational contexts. Third, while the approach is qualitative and interpretive, it offers rigorous techniques to apply systems thinking and practices to organizational problem-solving. Finally, SSM is adaptable to specific organizational contexts [8], [10].
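The relationship between object, intended meaning, and interpreted meaning can be made concrete in a small sketch. The following hypothetical Python fragment restates the function point example above as data; the class and field names are our illustration, not Pedersen's formalism.

```python
from dataclasses import dataclass

@dataclass
class InformationItem:
    obj: str                  # the measured real-world entity or abstract concept
    intended_meaning: str     # what the program designer meant to convey
    interpreted_meaning: str  # what a user reads into it in their social context

report_item = InformationItem(
    obj="program size in function points",
    intended_meaning="differentiate more and less complex function point types",
    interpreted_meaning="all function points are equally difficult to implement",
)

# A mismatch between the two meanings signals a breakdown in the medium.
if report_item.intended_meaning != report_item.interpreted_meaning:
    print("intended and interpreted meaning diverge for:", report_item.obj)
```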
III. RESEARCH DESIGN

The research we present is part of a collaborative practice study [35] carried out between Software Inc. and Aalborg University, Denmark, from January 2000 to April 2004 [17]. Collaborative practice research targets specific professional practices with the double goal of improving these practices while also contributing to research. The basic approach is action research combined with various forms of practice studies and experiments.

This paper is based on activities that we carried out in the early phase of the collaborative project. Our goal was to identify key problems and possible improvements in software metrics practices at Software Inc. Here, we present the rationale for the information-centric assessment approach together with a case study [18], [53] of its use within Software Inc. The organization had extensive experience implementing and running a metrics program, and there was a growing concern that its benefits were not as great as expected. Furthermore, senior management was willing to fund and support an R&D initiative, and there was a well-established collaboration between software metrics practitioners from Software Inc. and researchers from Aalborg University.

A. Case Study

The business of Software Inc. is to develop, maintain, and run administrative software for local authorities. With more than 2400 employees and almost 700 software developers, Software Inc. is one of Denmark's largest software organizations. Software Inc. is geographically distributed across four sites and organized in a conventional hierarchical divisional structure. The software operation is largely organized in projects and teams according to application type. A unit supporting the software organization with methods, tools, and techniques is responsible for the metrics program and software process improvement.

Software Inc. has a long tradition of collecting simple measures, e.g., the number of faults in an application. In 1996, senior management decided to implement an elaborate metrics program to increase productivity and quality by benchmarking against the industry [7]. An external contractor supplied the metrics program to make benchmarking possible. The research collaboration was initiated to address the growing concern about unsatisfactory return on investment by enhancing managerial usage of the metrics program.

When we began assessing metrics practices at Software Inc. in 2000, we found no approach in the literature for identifying strengths, weaknesses, and opportunities in managerial usage of metrics programs. We, therefore, decided to develop our own approach. One of us executed the assessment in close collaboration with other Software Inc. stakeholders, while the other served as coach and critical outsider during the assessment. We kept a diary of plans, events, results, and experiences throughout the assessment to record the process [29], [40].

Two limitations apply to our research design. First, as with any case study, one should be cautious about generalizing the findings [18], [53]. The advantage of a case study is that it provides in-depth insight into practices within one organization. To transfer the information-centric assessment approach to other software organizations, one must carefully consider the conditions under which it was applied at Software Inc. Second, the results we present here are from the initial assessment of a large improvement initiative. Continued efforts to improve Software Inc.'s metrics practices will provide additional experiences and valuable feedback on the approach [17].

B. Metrics Program

Fig. 2. Software metrics program at Software Inc.

Fig. 2 illustrates Software Inc.'s metrics program. The program's purpose is to monitor and improve software processes and products. Project and application managers supply data (measures) to the metrics program every three months. The data includes time spent on various activities and data on the application errors reported and corrected in the measurement period. The data is supplemented with characteristics of projects or applications, e.g., business area, technology, and size measured in function points. The payroll and human resource departments are among the external units that supply data, e.g., on personnel and on expenses for software and equipment. At each site, a controller supplies data on time spent on activities unrelated to projects and applications, e.g., management, administration, and education. The controller is responsible for getting the measures from managers on time and for helping managers submit measures to the program.

The metrics staff processes, validates, and packages the data and sends them to the contractor. After the contractor processes the data, results are returned, including aggregate indicators on a general level and on a project and application level. The metrics staff interprets and disseminates some results to data suppliers and senior managers. They primarily disseminate results by e-mail, but also use the company's intranet. Although the data itself is different, the presentation is the same for all users.

Each year, the contractor delivers a written report. The report includes an analysis of Software Inc.'s software operation at different levels of aggregation, along with a set of recommendations for further actions, e.g., software process improvements. The contractor and Software Inc.'s software process improvement staff present the report to senior management and facilitate a structured debate. Subsequently, results are made available to the next level of managers. Occasionally, these managers ask the metrics staff to facilitate a debate on a more detailed level. The amount of managerial response varies, however, across the organization.
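A minimal sketch of the quarterly submission-and-validation step described above is given below, assuming an invented record format; Software Inc.'s actual schema and validation rules are not documented in the paper, so every field name and rule here is an assumption for illustration.

```python
# Hypothetical validation of one quarterly measurement submission.
# Field names and rules are invented; they are not Software Inc.'s schema.
def validate_submission(record: dict) -> list:
    """Return a list of data-quality problems found in one submission."""
    problems = []
    for field in ("unit", "hours_spent", "errors_reported", "errors_corrected"):
        if field not in record:
            problems.append(f"missing field: {field}")
    if record.get("hours_spent", 0) < 0:
        problems.append("hours_spent must be non-negative")
    if record.get("errors_corrected", 0) > record.get("errors_reported", 0):
        problems.append("more errors corrected than reported this period")
    return problems

submission = {"unit": "payroll", "hours_spent": 1200,
              "errors_reported": 14, "errors_corrected": 17}
for problem in validate_submission(submission):
    print("validation:", problem)
```

The sketch illustrates the paper's broader point: mechanical checks like these can flag inconsistencies, but data quality ultimately depends on whether suppliers understand and value the measures.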
The project and application managers have encouraged a system in which results are traceable to projects and applications, but not to employees. A specialist in the supporting unit is responsible for setting the measurement strategy and mapping the contractor's model to Software Inc.'s practices. Controllers are responsible for controlling data collection in each unit, supporting project and application managers in supplying data, and presenting results back to the unit. The specialist and controllers constitute the metrics staff. They meet on a regular basis to discuss and fine-tune the program.

IV. ASSESSMENT APPROACH

TABLE I ASSESSMENT APPROACH

The proposed information-centric assessment approach consists of five activities: appreciate current situation, create information-centric viewpoints, compare situation with viewpoints, interpret findings, and identify improvements. We now describe each activity along with details about how we executed them at Software Inc. Table I summarizes the approach. Further guidelines based on our experiences are presented in the assessment lessons (Section VI) and in Table V.

Appreciate current situation: This activity is aimed at getting an overview and appreciation of the metrics program and its context. We used rich pictures (e.g., Fig. 2) to represent different views on the situation and to identify problems, problem owners, and possible idealized viewpoints (soft systems) for detailed analysis [8], [10].

We conducted seven semi-structured interviews with project managers, senior managers, metrics staff, and quality assurance staff. Each interview lasted three hours. We documented the key points and gave these to the interviewee to correct and approve. We also collected plans, decision reports, and correspondence related to metrics, minutes of meetings from the metrics staff, and documents from the organization's process improvement initiative. Finally, at the time of our study, one of the authors had worked as a metrics specialist at Software Inc. for five years and had kept a personal log of experiences with using the metrics program.

We used these data sources to learn about current metrics practices. We drew half-a-dozen rich pictures to express different views of the current situation and to get an initial appreciation of problematic issues. This led us to identify nine information-centric viewpoints, each of which provided a possible starting point for soft systems modeling and debate:

- A system for helping data suppliers ensure data quality.
- A system for satisfying the information needs of users.
- A system for reporting essential problems in the software process.
- A system for demonstrating to data suppliers that the data is actually used, thus increasing their motivation.
- A system for discussing results and assumptions to facilitate open debate.
- A system for acting upon negative trends in indicators.
- A system for interpreting and disseminating results.
- A system for improvements based on the results.
- A system for producing measurements from the ERP system at Software Inc.

Create information-centric viewpoints: This activity's purpose is to define and model selected information-centric viewpoints based on soft systems thinking [8], [10]. Each viewpoint is defined textually as a root definition of the involved Customers, Actors, Transformation, Weltanschauung, Owner, and Environment (the CATWOE of SSM; a code sketch of this structure appears at the end of this section). A conceptual model of the necessary activities is then developed, and a simplified rich picture is drawn as an illustration. We selected viewpoints from the list above based on three criteria: they should address key problems at Software Inc., cover the entire lifecycle from data collection to use, and express key features of good information system design. Three viewpoints proved particularly useful: a system for measuring software practices, a system for disseminating indicators and interpretations, and a system for using metrics to generate managerial response.

Fig. 3. Idealized view of measuring software practices.
Fig. 4. Idealized view of disseminating indicators and interpretations.
Fig. 5. Idealized view of using metrics to generate response.

Compare situation with viewpoints: This activity's purpose is to assess current metrics practices by systematically comparing viewpoints to current practices. The activity is carried out with relevant stakeholders [8], [10]. To begin, we systematically compared data from the appreciate current situation activity with each viewpoint's individual elements (see Figs. 3-5). This produced several findings, which we organized into a coherent assessment based on each particular viewpoint. We validated the assessment by discussing it with the interviewees and with other stakeholders. In general, stakeholders were enthusiastic.
They felt that their experiences with and concerns about the metrics program in Software Inc. were well covered. The discussions demonstrated that the three viewpoints helped address key problems and challenges at Software Inc., and they led to a revised set of findings.

Interpret findings: This activity's purpose is to interpret the findings across individual viewpoints by looking at the metrics program as an information medium. This lens helps clarify the relation between the measured objects, the intended meaning of measures and indicators, and different users' interpreted meaning based on data from the metrics program. The author who led the Software Inc. assessment carried out this activity, assisted by the other author. We used the model (Fig. 1) to systematically interpret the findings, analyze each element (measure, analyze, intervene, and medium) across the findings, and discuss them in the light of Pedersen's notion of information. Key stakeholders in the organization validated the interpretation. We also presented the results to outsiders to test their relevance. We then presented the results to managers at Software Inc.

Identify improvements: This activity's purpose is to identify possible improvements based on the analysis. The author who led the assessment distilled the initial set of improvements that had emerged through the debates of the information-centric views and the subsequent interpretation of the findings. These were carefully checked against current practices to ensure that they would address key issues in a feasible way. We then documented the analysis in a report to a group of senior and middle managers at Software Inc. This report was presented at a meeting, which led to further elaboration of the proposed improvements.
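As promised above, here is a minimal sketch of a CATWOE root definition expressed as a data structure. The entries are paraphrased and partly invented around the measuring-software-practices viewpoint; the authoritative root definitions are those summarized in Tables II-IV.

```python
from dataclasses import dataclass

@dataclass
class RootDefinition:
    customers: str       # C: who benefits from or is affected by the system
    actors: str          # A: who carries out the transformation
    transformation: str  # T: the input-to-output change the system performs
    weltanschauung: str  # W: the worldview that makes the transformation meaningful
    owner: str           # O: who could stop or change the system
    environment: str     # E: constraints the system takes as given

# Illustrative entries only; not a verbatim reproduction of Table II.
measuring = RootDefinition(
    customers="project and application managers",
    actors="data suppliers and metrics staff",
    transformation="software practices -> quality measures of those practices",
    weltanschauung="measurement supports management and improvement",
    owner="senior management",
    environment="deadline pressure; geographically distributed organization",
)
print(measuring.transformation)
```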

V. ASSESSMENT RESULTS

TABLE II CATWOE OF MEASURING SOFTWARE PRACTICES
TABLE III CATWOE OF DISSEMINATING INDICATORS AND INTERPRETATIONS
TABLE IV CATWOE OF USING METRICS TO GENERATE RESPONSE

In the following, we present the results of the Software Inc. assessment. The three viewpoints that emerged as particularly helpful are summarized as root definitions in Tables II-IV. We present each viewpoint as a simplified text and rich picture, followed by the findings that resulted from comparing the viewpoint to Software Inc. practices. Finally, we present the results of the interpretation of the findings together with possible improvements.

A. Measure Software Practices

Ideally, as illustrated in Fig. 3 and summarized in Table II, project and application managers supply measures, such as function points, directly to the metrics program. Relevant indicators, such as project productivity, are then fed back to data suppliers so they can use their own data for project management. The metrics program staff also informs data suppliers about the need for measurements to support management and improvement, such as identifying important improvement areas. Data suppliers' use of indicators, and their appreciation of the need for them, typically motivates them to supply quality data, which in turn reflects the progress and general perception of the project.

Comparing this first, information-centric view of how measurement activities are ideally performed with metrics practices at Software Inc. led to a number of observations. The company organizes software practice as projects and tasks for developing and maintaining applications. Projects and tasks are well defined and have an appointed manager. Generally, routines for collecting measures are well established, and managers submit measures to the metrics program as required. However, managers often do not understand the precise definition of metrics and measures. In particular, it is difficult for them to understand and accept function point measures. As a result, managers often submit measures without really understanding the data or their implications for managerial decision-making and intervention.

Still, the metrics program contains a lot of data that are potentially useful for several purposes, e.g., tracking the resources used to correct errors, assessing a project's productivity, or tracking application quality. The indicators are not typically used to support management of the software operation; they are primarily used for high-level software process improvement and for feeding data into a model for estimating development projects.
Project managers, however, seldom use this estimation model. Relevant indicators are rarely presented to data suppliers, and the metrics staff has not identified when and how data suppliers could make use of indicators. The metrics program does not supply relevant indicators to project managers during the project life cycle; the indicators are only available after the project has ended. Hence, project managers cannot use indicators for controlling their projects. They can only use indicators for postmortem evaluation. In some cases, indicators are not available until two to five months after a project has ended.

Although data suppliers do not themselves use indicators, the experienced needs of other colleagues could be communicated to them, e.g., the use of data for high-level software process improvement. This is, however, not happening.

The need for measures and indicators is poorly communicated to data suppliers, and most software engineers and managers do not understand the metrics program's purpose. There is no explicit standard for data quality, but the metrics staff does perform a comprehensive validation of data in some units. Although it is difficult for data suppliers to participate in this validation, the metrics staff tries hard to define measures and support data suppliers. The metrics program's adjustment in response to data quality problems is nonetheless quite unsystematic.

B. Disseminate Indicators and Interpretations

Ideally, as illustrated in Fig. 4 and summarized in Table III, targeted information is disseminated effectively, taking into account available measures and indicators, user needs, and available media. Hence, the metrics staff interprets measures and indicators and chooses an appropriate medium to communicate targeted information to program users. To generate relevant information, the metrics staff must appreciate the users' need for information.

When comparing this second, information-centric view of ideal dissemination practices with current metrics practices at Software Inc., we made several additional observations. The metrics program has more than 100 measures and indicators. As mentioned earlier, they are not well understood by the users. Hence, the metrics staff makes targeted information available to relevant groups. A proper interpretation requires, however, insight into the metrics program, the submitted measures, and actual software practices. This means that results are often presented to users in quite general terms that require further interpretation. A few routines for disseminating information are rather well established, e.g., the presentation of indicators to senior management. In most cases, low-level managers simply collect the required measures and ignore indicators. They only engage in interpretation when senior management requires them to do so.

The metrics staff does not systematically analyze target groups and information needs. This means that they cannot tailor the presentation of information to specific recipients. Moreover, because most managers do not see any need for the metrics program, it is difficult to get them actively involved in identifying information needs. Most managers believe they can do their job without the metrics program.

State-of-the-art media are available for presenting information in a variety of ways. However, the choice of media is not considered systematically. Nearly all interpretations are presented in written, paper-based reports. Only a few indicators are presented on the company intranet. Software Inc. has decided on a strategy for disseminating information, but the strategy is not applied to the metrics program. Information is supposed to be communicated from the metrics staff to senior management and from senior management to lower level managers. Project and application managers are supposed to receive indicators on their own software practices from the metrics staff. This happens in only a few cases. There are no explicit criteria for the success of information dissemination. Moreover, there is no systematic evaluation and adjustment of dissemination practices.

C. Use Metrics to Generate Response

Ideally, as illustrated in Fig. 5 and summarized in Table IV, the metrics staff facilitates managerial response and organizes structured debates among managers and engineers on the basis of measures and indicators.
The program is subsequently tuned and gradually developed based on the metrics staff's appreciation of information needs and user experiences. When comparing this view of how metrics programs are used to generate managerial response with metrics practices at Software Inc., we arrived at additional observations.

The set of measures and indicators is very complex. To make the information accessible to users, the metrics staff can facilitate structured debates with users. This requires, however, substantial knowledge of the indicators and their interpretation, plus the skills to organize structured debates. There are few people at Software Inc. with those skills. The metrics staff does not consider relevant questions about organizing debates: Why should they be organized? Who should participate? When and where should they take place? Few debates to assign meaning to measures and indicators actually take place, and there are no explicit criteria for evaluating them when they do occur. Furthermore, there is no systematic evaluation and adjustment of how and when debates are organized. This is true despite the fact that senior management holds structured debates each year, and despite a strong tradition of structured debates within the software process improvement organization.

There are few managerial responses based on information from the metrics program. Senior management tries to understand and explain the indicators, but lower level managers usually ignore them. However, many software process improvement initiatives are based on the metrics program, e.g., those related to estimation, software configuration management, and requirements management. Structured debates could facilitate a systematic appreciation of information needs. However, the metrics staff rarely uses such opportunities to reflect on the use and possible improvement of the metrics program. Few systematic efforts are aimed at tuning the metrics program. The metrics staff decides on minor improvements without any real commitment from management.

In summary, this assessment reveals a gap between the three information-centric views of metrics programs and the current practices at Software Inc. The organization has a high commitment to using software metrics to manage and improve the software operation and puts considerable effort into data collection. But when it comes to ensuring data quality and creating information to manage or improve software practices, our analysis reveals a number of weaknesses and dysfunctional practices.

D. Interpret Findings

To get a deeper understanding of why the software metrics program at Software Inc. fails in certain ways, we subsequently viewed the program as an information medium that shapes communication and social interaction between data suppliers, software engineers, software managers, improvement agents, and metrics staff.

The success of any metrics program is highly dependent on data quality; cf. measure in Fig. 1.

Our analysis shows that data suppliers at Software Inc. do not appreciate the need for collecting data, nor do they use measures or indicators themselves. The data suppliers' task is to measure certain objects related to their practices. Using Pedersen's notion of information [43], data providers appear to have little insight into the intended meaning. Their actions are purely symbolic, focusing on objects and producing data about them. Put differently, measuring is detached from their practices, and they have little motivation or knowledge to facilitate high data quality.

Only limited analyses are made of data in the metrics program; cf. analyze in Fig. 1. The analyses that are made are carried out either by the contractor or by the metrics staff. The contractor has little or no insight into the software operation at Software Inc., and its interpretations tend to be straightforward and based on general trends. Because the metrics staff and the contractor share an understanding of the program's intended underlying meaning, the metrics staff can make sense of, modify, and refine the contractor's analyses to make them more useful within Software Inc. The metrics staff also performs certain analyses on their own, but these are on an aggregate level and target senior management or the organization as a whole. No specialized interpretations are provided relating data in the metrics program to particular contexts and needs within the organization. Moreover, there is little explicit knowledge about particular needs for information about the software operation. For these reasons, little is done to convey that the program's intended purpose is related to the information needs of software managers at Software Inc.

Software Inc.'s metrics program is used for some purposes, but not for others; cf. intervene in Fig. 1. Those involved in software process improvement use the program strategically to identify and support new improvement initiatives. In contrast, the metrics program is used very little, if at all, on a tactical level to support software management. Again, we can use Pedersen's notion of information [43] to explain this variation. The improvement people have participated actively in designing and implementing the metrics program. They understand its underlying intention, and it is easy for them to interpret meanings that are relevant and useful in their context. In contrast, other managers have not participated in the program's design and have little knowledge of the metrics program's intended meaning. It is therefore more demanding, and perhaps even impossible, for them to make sense of the data in their contexts.

So far, measures, indicators, and interpretations from Software Inc.'s metrics program have not been successfully disseminated. This is surprising in light of two factors. First, Software Inc.'s culture is open, i.e., people have explicitly requested that metrics information be publicly available. Second, Software Inc. is using contemporary media, e.g., an intranet, to support communication within the organization. Despite these enabling factors, dissemination from the metrics program is still quite limited. The metrics staff provides overall information to top management on a quarterly basis. Once that report is approved, lower levels in the organization can produce more focused reports relevant to their particular situations. This rarely happens.
One possible explanation is that the managerial hierarchy severely limits open forms of dissemination, such as posting information on the intranet.

E. Identify Improvements

Our assessment and subsequent interpretation of the findings indicate areas in which Software Inc. can potentially improve practices. We summarized these as follows:

- Data suppliers' needs should be given more consideration in order to increase the quality of data.
- Targeted interpretations should be made available to support increased usage.
- The intended meaning of measures and indicators should be shared across groups.
- The managerial hierarchy should not restrict information dissemination.
- Contemporary technologies should be used to facilitate dissemination.
- The metrics staff should change its role from that of data supplier to that of information provider.

The three information-centric views of software metrics programs and the insights into current practices gave us more detailed clues as to how the company could establish specific initiatives within these improvement areas. After considering other improvement activities, trends, and Software Inc.'s climate, we formulated four specific initiatives. Initially, senior management decided to launch the following two.

- Data collection should be optimized to motivate data suppliers and save time. Furthermore, results and their use should be visible to data suppliers in order to increase data quality. This initiative should increase data suppliers' appreciation of the intended meaning of measures and indicators.
- Measures and indicators should be made available for project management. This initiative's focus was to provide software quality measures of the number of bugs per function point at different stages of a project.

As these rather limited improvements were being implemented, there was a change of senior management. The new management found the assessment of the existing metrics program interesting and decided on a radical improvement strategy in which a new firm-specific metrics program was designed and the contract with the external supplier was cancelled. The new metrics program and practices were based on our assessment's recommendations and findings. Also, a process of continuous improvement was established to further develop and sustain the metrics program as a useful and integral part of Software Inc.'s management practices [17].

VI. ASSESSMENT LESSONS

TABLE V ASSESSMENT LESSONS

Table V summarizes guidelines for the assessment approach. The guidelines build on our experiences and on the background literature [8], [10], [43], [51]. They provide advice on how to use the information-centric assessment approach for improvement purposes.

Appreciate current situation: Look at processes, structures, climate, and power [8], [10]. Each of these perspectives provides different insights into the current situation. By looking at processes, you can identify inappropriate or ineffective practices. Structures help you identify the conditions under which the metrics program operates.

By looking at climate, you can identify explanations for current weaknesses. If, for example, metrics program data has been used to publicly blame project managers, they will tend to submit data that helps them avoid such blame, rather than data that reflects their projects. It is useful but often difficult to make people talk about the power structure in their organization. Personal observations are, therefore, necessary to qualify interviews on this point. Furthermore, a lot of people are not very conscious of power issues; they concentrate on doing their jobs.

Be creative and open-minded; you can always dispose of viewpoints later [8], [10]. In our case, we drew several rich pictures. From these, we listed nine possible viewpoints, of which three turned out to be particularly valuable. It is important that you do not limit yourself at this stage, as you could miss out on important viewpoints.

Be careful when you identify actors. Some actors are part of the problematic situation or have something at stake [25]. Middle managers might not support the use of data about their applications or projects. Is this because they do not believe in the idea, or because they want to prevent senior managers and others from gaining insight into their areas? In some cases, you have to decide beforehand how radical your assessment should be. Likewise, key actors will sometimes use the assessment as an opportunity to blame other actors.

Consider confidentiality in interviews and published results, at least in the early stages. Metrics programs are monitoring and control mechanisms [31] and involve contradictory interests [25]. If you can guarantee confidentiality in interviews, people will be more open. It is important to illustrate how interviews will be used and how results will be published.

Create information-centric viewpoints: Describe concrete ideas, thoughts, and points of view, not the situation as-is [8], [10]. Use viewpoints to learn about the situation, and create concrete viewpoints so that they are useful in debates. Your chance of getting relevant feedback will decrease if the viewpoints are academic and detached from the organization's reality.

Describe a viewpoint's essence, not the detailed steps or activities [8], [10]. The simplified rich picture of a viewpoint is not supposed to be a data flow diagram. Looking at processes and structures, it is tempting to draw very detailed diagrams, as is often customary in requirements analysis. Instead, draw a simple, overall picture and elaborate from there. Even if the viewpoint is very complex and consists of concurrent elements, consider it to be linear, or break it into several different viewpoints in order to simplify. Trying to cope with all issues at one time will take you nowhere.

Remember: viewpoints are vehicles for learning [8], [10], [33]. The point is to create a foundation for assessing the current situation, not to make a model that will solve current problems. Think of viewpoints as tools for learning how to improve, rather than as solutions to the organization's problems.

Iterate several times: You should do several iterations over root definitions (CATWOE), texts, and simplified rich pictures. Do a thorough check of texts and pictures against root definitions. Likewise, you should iterate and reiterate when you draw rich pictures. Explain the pictures to an outsider to see if they make sense.
The pictures play a vital role in the information-centric approach, and it is important that they are relevant and make sense in your particular context.

Furthermore, you will be explaining the pictures to key actors, and you should know how to present them in a straightforward way.

Compare situation with viewpoints: Aim for a structured debate [8], [10]. Work your way systematically through the debate to ensure that you cover all aspects of each viewpoint. Keeping this in mind, you will see that the information-centric viewpoints are well suited to structuring a debate with many different audiences. We have successfully performed debates with interviewees, other colleagues at the company, metrics staff from other companies, and conference participants.

Look for differences between viewpoints and practices, and identify problems and opportunities related to the metrics program. Remember that at least one actor should promote a problem to ensure you do not end up seeing everything as problematic. Also, try to identify the impact of problems to help prioritize your effort [33]. You will not be able to resolve every issue.

To begin the comparison with key stakeholders, gather material from the appreciate current situation activity (see Table I). Your initial findings will provide guidance for what you are looking for. Although you should debate with other stakeholders to test for relevance and usefulness, start with the interviewees. They know the setup, which gives you a chance to rehearse the process.

Guide participants through the viewpoints and initial findings. The viewpoints are rich in information, and the actors will not be able to read them on their own. A fruitful approach is to talk the actors through the picture as planned in the previous step. Be aware that you should adjust your presentation order in accordance with the participants' questions and responses. Start with the elements that are most familiar to the participants.

Interpret findings: Look across viewpoints and findings. Your focus now shifts from details within viewpoints to viewing the metrics program as an information medium [43], [51]. Use the theoretical framework systematically, and carefully cover all of its elements: measure, analyze, intervene, and medium (see Fig. 1). Let the framework challenge your findings and help identify explanations and root causes. Furthermore, try to move from isolated observations to systemic problems. For example, the decision to share or not share the information's intended meaning explained why data from Software Inc.'s metrics program were used for software process improvement and not for management support. In our case, the theoretical lens challenged the analysis and helped us look beyond conventional beliefs within the organization. Also, it was helpful to have both an outsider and an insider conduct the interpretation and identify and challenge findings.

To validate your interpretations, you should present them to interviewees and other stakeholders in a structured debate. Many aspects of most situations are interrelated, and it is almost impossible to provide feedback without a structured approach to discussion. We successfully performed debates on interpretations with interviewees, other colleagues, and people from other companies.

Identify improvements: Identify improvement areas from viewpoints and findings [8], [10]. The information-centric viewpoints suggest how you can execute metrics practices, while the findings suggest where the problems and opportunities reside. Identify and debate improvement areas with key stakeholders.
The documentation from the earlier steps is well suited to this dialogue. To get the process going, you can present a set of preliminary improvements. These should serve as inspiration, and you must be open to debating them. Again, try to structure the debate with a focus on one or a few improvement areas at a time. Hopefully, the stakeholders' involvement will commit them to the improvements [22], [26], [33].

Make sure to address identified problems and to compare improvements to current practices. As you work your way through the assessment approach's steps, you can get carried away. Given this, it is important that you pause and check your improvement suggestions against the current situation. Will the improvements actually resolve the identified problems?

Test the feasibility of the improvements. Even though improvements are technically possible, you must ensure that you have the right kind of commitment from sponsors and the people who will be affected [22], [26]. Furthermore, you should be aware of general trends and the climate in the organization. Some improvements will be more uphill than others. It might be better to focus on two minor improvement areas that are likely to succeed than on one major improvement that is likely to fail. Likewise, it is important to align with other efforts in the organization in order to be a credible partner. Always remember that a desirable improvement is not necessarily a feasible one [8], [10].

Communicate improvements in specific terms that make sense within the organization. Having worked your way around the rich pictures, the information-centric viewpoints, and the interpretations, you must get back to the real world. Your suggestions for improvements should be presented to relevant stakeholders in language that communicates directly to them.

We have presented the information-centric assessment approach at conferences and to colleagues, and it has been easily understood. However, actually adopting the approach requires that you appreciate the ideas of SSM and have insights into the literature on software metrics. It is important that you use the approach systematically.

VII. DISCUSSION

Our research confirms that it requires a dedicated effort to integrate metrics programs into software practices [5], [13], [19]-[21], [27], [28], [38], [42], [44], [47]. Implementing software metrics programs requires change or, as Weinberg puts it, successfully transforming the old status quo to a new status quo [52]. When software metrics programs are introduced, however, they are often rejected or integrated into the old status quo without achieving a new status quo. The limited success of the metrics program at Software Inc. illustrates this.

The overall purpose of our research was to enhance managerial usage of metrics programs within software organizations. The key contribution is the information-centric approach to metrics assessment, combining an information medium perspective [43], [51] with an organizational problem-solving process based on SSM [8], [10].

In the following, we explicate the contribution by comparing the proposed approach to related approaches to software metrics. First, we compare with other approaches that use soft systems thinking in relation to software metrics. Second, we compare with other approaches to the assessment of metrics programs. The key features of the related approaches are summarized in Table VI.

TABLE VI RELATED APPROACHES TO SOFTWARE METRICS

Similar to our approach, Hughes [25] and Bell et al. [4] have applied soft systems thinking to software metrics. Hughes focuses on the design of metrics programs. For that purpose, he has embedded software measurements into SSM to model software development. Hughes argues that GQM is reputed to be the primary method for defining relevant metrics, but identification of common, acceptable goals can be quite difficult. There might, for instance, be a conflict between software developers who prefer generous effort estimates to relieve pressure on their projects and software managers who prefer restricted estimates to set high productivity rates. Hughes suggests that SSM addresses these issues effectively and that SSM-like approaches should precede or substitute for GQM.

Likewise, Bell et al. [4] have introduced and tested an approach to software process improvement based on SSM. Their methodology combines soft systems thinking and GQM. They argue that GQM does not provide any guidelines on how to identify relevant problems and goals. Their methodology consists of four stages: 1) framing to prepare for enquiry; 2) enquiry to identify the problems as perceived by the involved stakeholders; 3) metrication to resolve the identified problems by enabling metrics to be collected based on GQM; and 4) action to collect and store data in an accessible and transparent manner.

Our motivation for adopting SSM in relation to software metrics resonates with those of Hughes [25] and Bell et al. [4]. In their approaches, however, SSM is used as an alternative or supplement to GQM, and soft systems are used to model and debate software development practices. Our focus is on the assessment of software metrics practices. We apply soft systems thinking to develop information-centric views of metrics practices, and we use SSM's learning cycle to guide metrics assessment and involve key stakeholders in the process.

Our approach complements existing approaches to assessing software metrics programs; cf. Table VI and Section II. Mendonça and Basili [38] focus on the collection and use of data in a software metrics program; their approach combines GQM with data-mining techniques to more effectively utilize available measures and indicators already implemented in the program. Kitchenham et al. [32] are interested in the validation of metrics programs; based on measurement theory, they offer a conceptual framework for assessing how well attributes, measures, and indicators represent the real world. Berry and Jeffery [5] are concerned with predicting metrics program success; they have collected data from several programs and offer a framework for assessing a program's success through structured questions about its status, context, inputs, processes, and products. Berry and Vandenbroek [6] emphasize metrics related to specific practices; they provide a meta-framework to design and deploy assessment instruments for metrics targeting specific software processes.
Finally, ISO/IEC focuses on benchmarking of metrics programs; this approach offers a standard set of measurement activities that can be used as a normative basis for assessment. The information-centric approach adds to this line of research by focusing on how information from metrics programs can enhance managerial decision-making and intervention. The approach combines a perspective on software metrics programs as media [2], [43], [51] with an organizational problem-solving approach that engages key stakeholders in debating metrics practices [8], [10]. In this way, the approach seeks to enhance managerial usage of software metrics programs by: 1) viewing software metrics programs as media for collecting, analyzing, and communicating information about the software operation; 2) applying different information-centric views to assess metrics practices; and 3) involving relevant stakeholders in debating as-is and could-be metrics practices.

We found in the study that the information-centric perspective in many ways contrasted with traditional perceptions of software metrics at Software Inc. The original metrics program focused on data and technology rather than on information and technology usage; management was formally seen as a hierarchy rather than as a network of communicating individuals; and computers were viewed as devices for storing and processing data rather than as media for creating and sharing information. Software Inc.'s initial metrics program was in this way focused on data, programs, computers, and hierarchy. The information-centric assessment suggested that successful metrics practices would require equal emphasis on information, interpretation, and human interaction and communication.
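The contrast can be made concrete in code. The sketch below is a hypothetical illustration of the two perspectives, assuming nothing beyond the paper's vocabulary: a data-centric program stores bare values, while an information-centric program treats each value as a message with a designer's intended meaning, an addressed audience, and room for the receivers' differing interpretations [2], [43], [51]. All class and field names are our own.

```python
from dataclasses import dataclass, field

@dataclass
class DataPoint:
    """Data-centric view: a bare value in a repository."""
    metric: str
    value: float

@dataclass
class Information:
    """Information-centric view: the same value treated as a message in a
    medium, with intended and interpreted meanings kept apart."""
    metric: str
    value: float
    intended_meaning: str   # what the program designer meant the value to say
    audience: str           # who the message is addressed to
    interpretations: list[str] = field(default_factory=list)  # receivers' readings

defect_rate = Information(
    metric="post-release defects per KLOC",
    value=0.8,
    intended_meaning="indicator of delivered product quality",
    audience="software managers",
)
# Different stakeholders may read the same value differently.
defect_rate.interpretations.append("manager: quality target met this quarter")
defect_rate.interpretations.append("developer: test effort was cut, so the number flatters us")
print(defect_rate)
```

The design point is that the intended and interpreted meanings are first-class data, not an afterthought; a data-centric repository would keep only the first two fields.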

The general purpose of assessment is to understand practices and identify possible areas for improvement [33], [36], [50], [52]. Some assessment approaches are based on abstract models, while others emphasize identification of unique problems [41]. The information-centric approach is both problem- and model-driven, cf. Table I. It starts by collecting data about current practices without any specific normative models of metrics practices in mind. Based on that, it creates several rich pictures of current practices and identifies a number of problematic issues. This initial exploration constitutes the problem-driven part of the information-centric approach. The approach then creates a number of idealized views of metrics practices. The systematic comparison between these views and current practices, together with the subsequent interpretation of findings, constitutes the model-driven part of our assessment.
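As a hypothetical rendering of the model-driven step, the sketch below compares an idealized, information-centric view with as-is practice, property by property, and surfaces the gaps that would seed the stakeholder debate. The listed properties are invented examples, not the views actually used at Software Inc.

```python
# Each idealized view implies properties a healthy metrics practice would
# exhibit; assessors record whether the as-is practice exhibits them.
idealized_view = {
    "indicators reach decision-makers in time": True,
    "receivers know the intended meaning of each indicator": True,
    "data suppliers see how their data is used": True,
}
as_is_practice = {
    "indicators reach decision-makers in time": False,
    "receivers know the intended meaning of each indicator": False,
    "data suppliers see how their data is used": True,
}

gaps = [prop for prop, ideal in idealized_view.items()
        if ideal and not as_is_practice.get(prop, False)]
print("Issues to debate with stakeholders:")
for g in gaps:
    print(" -", g)
```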
There are both important limitations and implications of the information-centric approach. The research was carried out at Software Inc., and the results are not necessarily transferable to all software organizations. The context was a large software organization that emphasizes measurement, and the study was conducted by a well-functioning collaborative team of researchers and practitioners. The information-centric approach requires strong modeling and analytic skills as well as knowledge of soft systems thinking and practice. Our previous experience with SSM was instrumental in conducting the assessment. If you are trained in drawing rich pictures and formulating systemic views, the process is rather quick and iterations are easy. It does, however, require effort and practice to acquire these skills. This stresses the more general point that organizational assessment and problem-solving are crucial skills in software improvement efforts [33], [36], [50], [52].

The information-centric approach has implications for both practice and research. Many software organizations fail to get satisfactory benefits from their metrics program investments [13], [20], [21]. Metrics programs often fail to provide useful information, and when they do provide it, it is not necessarily communicated to the right people at the right time. Software managers are therefore advised to prioritize critical assessments of their metrics programs [38]. The information-centric approach complements existing approaches by concentrating on making metrics programs useful for managerial decision-making and intervention. It also supports the active participation of different stakeholders and stresses the relation between measured objects, the program designer's intended meaning, and different users' interpreted meanings. Our findings from Software Inc. suggest that the approach can help increase the value generated by metrics programs. We agree with Niessink and van Vliet [42] that it is important to focus on the factors that enable metrics programs to generate value, not just data. The information-centric approach illustrates how an emphasis on information rather than data, interpretation rather than facts, and human interaction and communication rather than computation and storage of results can contribute to this end. Future research could further explore information-centric approaches by adapting general approaches to assessing information systems quality (for example, see [14]) to the specific area of software metrics programs.

Additional research is also required to complement the assessment focus of this research with information-centric techniques for the other activities in the improvement cycle of the IDEAL model [36]. The experiences reported here cover the cycle's initiating and diagnosing phases and some aspects of the establishing phase. Future research addressing IDEAL's acting and learning phases could provide valuable new knowledge about the information-centric approach as an integral part of comprehensive software metrics improvement strategies.

ACKNOWLEDGMENT

The authors wish to thank their colleagues at Software Inc. for active participation in the assessment, for providing valuable feedback, and for engaging themselves in improving the software metrics program. K. Kautz, J. Nørbjerg, and colleagues at the 24th IRIS Conference 2001 in Norway provided valuable comments on an earlier version of the paper. Finally, the authors thank the editors and reviewers for critical and very constructive comments that have helped improve the manuscript.

REFERENCES

[1] D. M. Ahern, A. Clouse, and R. Turner, CMMI Distilled: A Practical Introduction to Integrated Process Improvement. Reading, MA: Addison-Wesley.
[2] P. B. Andersen, A Theory of Computer Semiotics: Semiotic Approaches to Construction and Assessment of Computer Systems. Cambridge, U.K.: Cambridge Univ. Press.
[3] D. Avison and T. Wood-Harper, Multiview: An Exploration in Information Systems Development. New York: McGraw-Hill.
[4] G. A. Bell, M. A. Cooper, J. O. Jenkins, S. Minocha, and J. Weetman, "SSM+GQM = The Holon methodology: A case study," in Proc. 10th ESCOM, 1999.
[5] M. Berry and R. Jeffery, "An instrument for assessing software measurement programs," Empirical Softw. Eng. Int. J., vol. 5, no. 3.
[6] M. Berry and M. F. Vandenbroek, "A targeted assessment of the software measurement process," in Proc. 7th IEEE Int. Softw. Metrics Symp., London, U.K., 2001.
[7] P. Bøttcher and H. D. Frederiksen, "Experience from implementing software metrics for benchmarking," presented at the 11th Int. Conf. Softw. Qual., Pittsburgh, PA.
[8] P. Checkland, Systems Thinking, Systems Practice. New York: Wiley.
[9] P. Checkland, "Soft systems methodology: A thirty year retrospective," Syst. Res. Behav. Sci., vol. 17.
[10] P. Checkland and J. Scholes, Soft Systems Methodology in Action. New York: Wiley.
[11] M. K. Daskalantonakis, "A practical view of software measurement and implementation experiences within Motorola," IEEE Trans. Softw. Eng., vol. 18, no. 11, Nov.
[12] L. Davies and P. Ledington, Information in Action: Soft Systems Methodology. New York: Macmillan.
[13] C. A. Dekkers, "The secrets of highly successful measurement programs," Cutter IT J., vol. 12, no. 4.
[14] W. H. DeLone and E. R. McLean, "The DeLone and McLean model of information systems success: A ten-year update," J. Manage. Inf. Syst., vol. 19, no. 4, pp. 9–30.
[15] N. E. Fenton and M. Neil, "Software metrics: Successes, failures and new directions," J. Syst. Softw., vol. 47, no. 2–3.
[16] H. D. Frederiksen and J. Iversen, "Implementing software metrics programs: A survey of lessons and approaches," in Proc. IRMA 2003, Philadelphia, PA.
[17] H. D. Frederiksen and L. Mathiassen, "Assessing improvements of software metrics practices," presented at the IFIP 8.6 Working Conf. on IT Innovation for Adaptability and Competitiveness, Dublin, Ireland, May 30–Jun.

[18] R. D. Galliers, "Choosing appropriate information systems research approaches: A revised taxonomy," in Information Systems Research: Contemporary Approaches and Emergent Traditions, H. E. Nissen, H. K. Klein, and R. A. Hirschheim, Eds. Amsterdam, The Netherlands: Elsevier.
[19] W. Goethert and W. Hayes, Experiences in Implementing Measurement Programs. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon Univ., CMU/SEI-2001-TN-026.
[20] D. R. Goldenson, A. Gopal, and T. Mukhopadhyay, "Determinants of success in software measurement programs: Initial results," in Proc. 6th IEEE Int. Symp. Softw. Metrics, Boca Raton, FL, Nov. 4–6, 1999.
[21] A. Gopal, M. S. Krishnan, T. Mukhopadhyay, and D. R. Goldenson, "Measurement programs in software development: Determinants of success," IEEE Trans. Softw. Eng., vol. 28, no. 9.
[22] R. B. Grady, Practical Software Metrics for Project Management and Process Improvement. Englewood Cliffs, NJ: Prentice-Hall.
[23] T. Hall and N. Fenton, "Implementing effective software metrics programs," IEEE Softw., vol. 14, no. 2, Mar./Apr.
[24] J. D. Herbsleb and R. E. Grinter, "Conceptual simplicity meets organizational complexity: Case study of a corporate metrics program," in Proc. Int. Conf. Softw. Eng., Kyoto, Japan, 1998.
[25] R. T. Hughes, "Embedding software measurement in a soft system approach," in Proc. 11th ESCOM, 2000.
[26] W. S. Humphrey, Managing the Software Process. Reading, MA: Addison-Wesley.
[27] J. Iversen and K. Kautz, "Principles of metrics implementation," in Improving Software Organizations: From Principles to Practice, L. Mathiassen, J. Pries-Heje, and O. Ngwenyama, Eds. Reading, MA: Addison-Wesley, 2001.
[28] J. Iversen and L. Mathiassen, "Cultivation and engineering of a software metrics program," Inf. Syst. J., vol. 13, no. 1, pp. 3–20.
[29] L. O. Jepsen, L. Mathiassen, and P. A. Nielsen, "Back to thinking mode: Diaries as a medium for effective management of information systems development," Behav. Inf. Technol., vol. 8, no. 3.
[30] C. Jones, "Software measurement programs and industry leadership," Crosstalk, vol. 14, no. 2, pp. 4–7.
[31] L. J. Kirsch, "The management of complex tasks in organizations: Controlling the systems development process," Org. Sci., vol. 7, no. 1, pp. 1–21.
[32] B. Kitchenham, S. L. Pfleeger, and N. Fenton, "Toward a framework for software measurement validation," IEEE Trans. Softw. Eng., vol. 21, no. 12.
[33] G. F. Lanzara and L. Mathiassen, "Mapping situations within a systems development project," Inf. Manage., vol. 8, no. 1, pp. 3–20.
[34] P. J. Lewis, "Linking soft systems methodology with data-focused information systems development," J. Inf. Syst., vol. 3, no. 3.
[35] L. Mathiassen, "Collaborative practice research," Inf. Technol. People, vol. 15, no. 4.
[36] B. McFeeley, IDEAL: A User's Guide for Software Process Improvement. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon Univ., CMU/SEI-96-HB-001.
[37] J. McGarry, D. Card, C. Jones, B. Layman, E. Clark, J. Dean, and F. Hall, Practical Software Measurement: Objective Information for Decision Makers. Reading, MA: Addison-Wesley.
[38] M. G. Mendonça and V. R. Basili, "Validation of an approach for improving existing measurement frameworks," IEEE Trans. Softw. Eng., vol. 26, no. 6, Jun.
[39] J. C. Mingers, "Information and meaning: Foundations for an intersubjective account," Inf. Syst. J., vol. 5.
[40] P. Naur, "Program development studies based on diaries," in Lecture Notes in Computer Science: Formal Methods and Software Development, T. R. Green et al., Eds. Berlin, Germany: Springer-Verlag.
[41] P. A. Nielsen and J. Pries-Heje, "A framework for selecting an assessment strategy," in Improving Software Organizations: From Principles to Practice, L. Mathiassen, J. Pries-Heje, and O. Ngwenyama, Eds. Reading, MA: Addison-Wesley, 2001.
[42] F. Niessink and H. van Vliet, "Measurement should generate value, rather than data," in Proc. 6th Int. Softw. Metrics Symp., Boca Raton, FL, 1999.
[43] M. K. Pedersen, A Theory of Informations. Copenhagen, Denmark: Samfundslitteratur.
[44] S. L. Pfleeger, "Lessons learned in building a corporate metrics program," IEEE Softw., vol. 10, no. 3.
[45] D. Philips, "Back to basics: Metrics that work for software projects," Cutter IT J., vol. 12, no. 4.
[46] R. S. Pressman, Software Engineering: A Practitioner's Approach, European Adaptation, 5th ed. New York: McGraw-Hill.
[47] S. Rifkin and C. Cox, Measurement in Practice. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon Univ., CMU/SEI-91-TR-016.
[48] E. C. L. Starrett, "Measurement 101," Crosstalk, vol. 11, no. 8.
[49] F. A. Stowell, Ed., Information Systems Provision: The Contribution of Soft Systems Methodology. New York: McGraw-Hill.
[50] G. M. Weinberg, Becoming a Technical Leader: An Organic Problem Solving Approach. New York: Dorset House.
[51] G. M. Weinberg, Quality Software Management, Volume 2: First-Order Measurement. New York: Dorset House.
[52] G. M. Weinberg, Quality Software Management, Volume 4: Anticipating Change. New York: Dorset House.
[53] R. K. Yin, Case Study Research: Design and Methods, 2nd ed., vol. 5, Applied Social Research Methods Series. Newbury Park, CA: Sage, 1994.

Helle Damborg Frederiksen received the M.S. and Ph.D. degrees from Aalborg University, Aalborg, Denmark, in 1992 and 2004, respectively. She has been employed at the Case Company, Aalborg, Denmark, since 1995, and was an industrially based Ph.D. student at Aalborg University from 2000 until completing the Ph.D. degree.

Lars Mathiassen received the M.S. degree in computer science from Aarhus University, Aarhus, Denmark, in 1975, the Ph.D. degree in informatics from Oslo University, Oslo, Norway, in 1981, and the Dr.Techn. degree in software engineering from Aalborg University, Aalborg, Denmark. He is currently a Professor of Computer Information Systems at Georgia State University, Atlanta. He is coauthor of Computers in Context (Oxford, U.K.: Blackwell, 1993), Object Oriented Analysis & Design (Aalborg, Denmark: Marko Publishing, 2000), and Improving Software Organizations (Reading, MA: Addison-Wesley, 2002). His research interests include information systems and software engineering, with a particular emphasis on process innovation. Dr. Mathiassen is a member of the Association for Computing Machinery (ACM) and the AIS.
