BENCHMARK YOUR TEST FUNCTION AGAINST A WORLD-CLASS TESTING ORGANIZATION
PRACTICE OBJECTIVE

This practice will enable you to conduct an assessment of your testing organization using eight criteria that QAI believes are associated with a world-class testing organization. At the conclusion of the assessment, you will develop a Kiviatt Chart that shows your test organization's rating (your baseline) against QAI's world-class testing organization model. You can use the results to improve your test organization.

PRACTICE TUTORIAL

During the past 18 years, the Quality Assurance Institute has been studying what makes software testing organizations successful. As a result, QAI has identified eight criteria that are normally associated with world-class testing organizations. These eight criteria are test planning, training of testers, management support for testing, user satisfaction with testing, use of testing processes, efficient testing practices, use of test tools, and quality control over the testing process. When these eight criteria are in place and working, the result is a world-class testing organization.

The assessment process, developed by QAI, has five areas to address within each of the eight criteria. The more of those areas that are in place and working, the more likely that category will contribute to world-class testing. Figure 1 shows a cause-and-effect diagram indicating the areas to address, called drivers, that result in a world-class testing organization.

Software testing organizations can use the results of this assessment in any one of three ways:

1. Determine your current software testing status versus a world-class testing organization. The responses in the areas to address will indicate your strengths and weaknesses compared to a world-class testing organization.

2. Develop a software testing goal/objective to become a world-class testing organization. QAI's world-class model indicates the profile of a world-class testing organization. Achieving that profile can become a goal/objective for your software testing organization.

3. Develop an improvement plan. By doing the assessment, you will develop a Kiviatt Chart that shows where improvement is needed. The areas to address in which you are deficient become the basis for improving your software testing organization.

Figure 1: Overview of the Assessment Process. [Cause-and-effect diagram: the drivers of world-class testing (Management Support for Testing, Use of Test Processes, Test Planning, Test Tools, Quality Control, Test Efficiency, User Satisfaction with Testing, Training) lead to the desired result, a world-class testing organization.]
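The eight criteria and their five areas to address form a small, regular structure, so some teams find it convenient to capture assessment responses in a machine-readable form before drawing the Kiviatt Chart. The following is a minimal sketch only, assuming Python; the category names come from the tutorial above, while the class and field names are illustrative and not part of QAI's practice.

```python
# Minimal sketch (assumption: Python; class/field names are illustrative, not QAI's).
# Eight categories, each holding five areas to address answered YES or NO.
from dataclasses import dataclass, field
from typing import List

CATEGORIES = [
    "Test Planning",
    "Management Support for Testing",
    "Use of Test Processes",
    "Test Tools",
    "Training",
    "User Satisfaction with Testing",
    "Test Efficiency",
    "Quality Control",
]

@dataclass
class AreaToAddress:
    title: str         # e.g. "Business/Technical Risks"
    response: bool     # True = YES, False = NO
    comment: str = ""  # optional clarification recorded by the assessment team

@dataclass
class Category:
    name: str
    areas: List[AreaToAddress] = field(default_factory=list)  # five per category

    def yes_count(self) -> int:
        # Five YES responses is the world-class profile for a category.
        return sum(1 for area in self.areas if area.response)
```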

PRACTICE WORKBENCH

This workbench (see Figure 2) is designed to lead you through an assessment of your software testing function. The workbench begins with knowledge of your software testing function. A four-step process leads you from building an assessment team to analyzing the results of the assessment. Because it is difficult for any organization to make significant improvements until it knows where it is and where it wants to be, the assessment process becomes a key component of any effective improvement plan.

INPUT PRODUCTS

The only input needed to perform this assessment is knowledge of your organization's software testing activities. That knowledge is usually possessed by one or more of the senior software testers in your organization. Thus, the assessment can be performed by one very knowledgeable software tester, or by several if the knowledge needed is spread among two or more people. In some instances, documented software testing work practices will also be needed.

IMPLEMENTATION PROCEDURES

This practice involves performing four steps, which are explained in the following subsections.

Step 1: Build Assessment Team

The assessment team needs to be composed of one or more people who collectively possess the knowledge of how software testing is performed in your organization. Prior to establishing the team, the areas to address should be reviewed to determine the makeup of the team. It is recommended that a matrix be prepared with the areas to address on one dimension and the proposed assessment team members on the other. The matrix should be completed, indicating which assessment team member is knowledgeable about each of the areas to address. If all areas to address have been associated with an assessment team member, it can be concluded that the assessment team is adequate to perform the assessment.

Figure 2: Software Testing Assessment Workbench. [Workbench diagram: the input (knowledge of software testing activities) feeds a DO/CHECK procedure of four steps (Step 1 - Build Assessment Team, Step 2 - Complete Assessment Questionnaires, Step 3 - Build Kiviatt Chart, Step 4 - Assess Results), checked for "assessment performed correctly," with rework as needed; the deliverable is your status versus a world-class testing organization.]
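Step 1's matrix of areas to address versus assessment team members amounts to a coverage check: the team is adequate only if every area (or, at a minimum, every category) is covered by at least one knowledgeable member. The sketch below is illustrative and assumes Python; the member names are hypothetical, and for brevity it checks coverage at the category level rather than for all 40 areas to address as the practice calls for.

```python
# Sketch of the Step 1 coverage check (assumption: Python; names are hypothetical).
# Map each prospective team member to the categories they know well.
knowledge = {
    "Senior Tester A": {"Test Planning", "Test Tools", "Quality Control"},
    "Test Lead B": {"Management Support for Testing", "Use of Test Processes",
                    "Training"},
    "Business Analyst C": {"User Satisfaction with Testing", "Test Efficiency"},
}

ALL_CATEGORIES = {
    "Test Planning", "Management Support for Testing", "Use of Test Processes",
    "Test Tools", "Training", "User Satisfaction with Testing",
    "Test Efficiency", "Quality Control",
}

covered = set().union(*knowledge.values())
missing = ALL_CATEGORIES - covered

if missing:
    print("Team is not yet adequate; add knowledge of:", sorted(missing))
else:
    print("All categories are covered; the team can perform the assessment.")
```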

Step 2: Complete Assessment Questionnaires

The assessment questionnaire is comprised of eight categories with five areas to address in each category, for a total of 40 areas to address. For each area to address, a YES or NO response should be recorded. The meaning of a YES or NO response follows.

A YES response means all of the following:
- The criteria is formal and in place.
- The criteria is understood by testers.
- The criteria is widely used, where applicable.
- The criteria has produced some positive results.

A NO response means any of the following:
- No formal item is in place.
- The criteria is applied differently for different test situations.
- There is no consistency as to when it is used, or it is used very seldom.
- No tangible results were produced.

The assessment team should read aloud each area to address. The team should then discuss how that area is addressed in their software testing organization. Using the YES/NO response criteria, the assessment team needs to come to a consensus on whether a YES or NO response should be indicated for that area to address. The result of that consensus should be recorded on Questionnaire 1. The assessment team may also wish to record comments that clarify the response and/or provide insight into how that area may be improved.

Step 3: Build Kiviatt Chart

Using Worksheet 1 (Kiviatt Worksheet for Recording Software Testing Assessment Results), transcribe the results of completing Questionnaire 1. For each category, total the number of YES responses and place a dot on the Kiviatt Chart on that category's line at the level representing the number of YES responses. For example, if there were three YES responses for test planning, a dot would be placed on the test planning line at its intersection with the line representing three YES responses. A dot should be placed on each of the eight category lines for its number of YES responses. The dots are then connected by a line, resulting in what is called a footprint of the status of your software testing organization versus a world-class testing organization.
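Steps 2 and 3 reduce to counting the YES responses in each category and plotting the eight totals on the Kiviatt (radar) chart. A minimal sketch follows, assuming Python with matplotlib; the response counts are invented purely to illustrate the footprint.

```python
# Sketch of Step 3 (assumptions: Python + matplotlib; counts are invented).
import math
import matplotlib.pyplot as plt

yes_counts = {                        # YES responses per category (0-5)
    "Test Planning": 3,
    "Management Support for Testing": 2,
    "Use of Test Processes": 4,
    "Test Tools": 1,
    "Training": 2,
    "User Satisfaction with Testing": 4,
    "Test Efficiency": 3,
    "Quality Control": 2,
}

labels = list(yes_counts)
values = list(yes_counts.values())
angles = [2 * math.pi * i / len(labels) for i in range(len(labels))]

# Close the footprint by repeating the first point.
angles.append(angles[0])
values.append(values[0])

ax = plt.subplot(polar=True)
ax.plot(angles, values, marker="o")   # the footprint
ax.fill(angles, values, alpha=0.2)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels, fontsize=8)
ax.set_ylim(0, 5)                     # five YES responses = world-class
ax.set_title("Software Testing Assessment Kiviatt Chart")
plt.show()
```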
Step 4: Assess Results

Two assessments should be made regarding the footprint developed on the Worksheet 1 Kiviatt Chart, as follows:

1. Assess the status of each category versus what that category should be in a world-class testing organization. To do this, compare the number of YES responses recorded for each category against a world-class organization, which would have five YES responses. For example, if you had three YES responses for test planning, that would indicate that improvements could be made in your test planning process; the two areas that received NO responses indicate where improvements are needed to move your test planning activities to a world-class level. (A small sketch of this gap analysis appears after the check procedures below.)

2. Interpret your software testing assessment Kiviatt Chart. The footprint in your Kiviatt Chart provides an overview of the type of software testing your organization is performing. Given the footprint, your assessment team should attempt to draw some conclusions about how software testing is performed in your organization. Three examples are given to help in drawing these conclusions; they are shown in Figures 3, 4, and 5.

CHECK PROCEDURES

The following list of questions, if responded to positively, would indicate that the assessment has been performed correctly:

1. Does the assessment team possess the knowledge needed to answer all of the areas to address within the eight categories?
2. Are the individual assessors free from any bias that would cause them not to provide proper responses to the areas to address?
3. Was there general consensus among the assessment team on the response for each area to address?
4. Are the areas to address appropriate for your testing organization?
5. Have the areas to address been properly totaled and posted to the Kiviatt Chart Worksheet?
6. Does the assessment team believe the Kiviatt Chart footprint is representative of your software testing organization?
7. Does your assessment team believe that if they improve the areas to address that have NO responses, the software testing organization will become more effective?
8. Does your software testing organization believe that the overall assessment made of the Kiviatt footprint is representative of your software testing organization?
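Returning to Step 4's first assessment, the gap between your footprint and the world-class profile is simply the set of NO responses in each category. A small illustrative sketch follows, again assuming Python; the data shown is invented and the area titles are abbreviated.

```python
# Sketch of the Step 4 gap report (assumption: Python; data is illustrative).
# For each category, record the YES/NO response for its five areas to address.
responses = {
    "Test Planning": {
        "Business/Technical Risks": True,
        "Building a Test Plan": True,
        "Involvement in Test Planning": False,
        "Test Plan Is a Working Document": True,
        "Acceptance Criteria Established": False,
    },
    # ...the remaining seven categories follow the same pattern...
}

for category, areas in responses.items():
    yes = sum(areas.values())
    print(f"{category}: {yes}/5 YES (world-class profile is 5/5)")
    for title, answered_yes in areas.items():
        if not answered_yes:
            print(f"  improvement candidate: {title}")
```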

DELIVERABLES

There are two deliverables from this self-assessment. The first is the Kiviatt Chart Worksheet. The second is the analysis of the Kiviatt Chart footprint.

USAGE TIPS

NOTE TO ASSESSMENT TEAM: If your assessment team is having difficulty drawing conclusions, send the results of your assessment to QAI, Managing Director, and QAI will provide you with an interpretation of your Kiviatt Chart.

Questionnaire 1: SOFTWARE TESTING ASSESSMENT QUESTIONNAIRE

For each area to address, record the team's consensus YES or NO response and any clarifying comments.

Category: Test Planning

Criteria 1: Business/Technical Risks - Do the testers identify the business and technical risks associated with implementing and operating the software, and are those risks addressed in the test plan?
Criteria 2: Building a Test Plan - Is a test plan created for each software system? Does the plan identify the test requirements/objectives, and does it establish success criteria for each requirement?
Criteria 3: Involvement in Test Planning - Are the users actively involved in the development of the test plan, and does the plan include user involvement in testing?
Criteria 4: Test Plan Is a Working Document - Are test plans followed? Is approval required to deviate from the plan, and are the plans changed to reflect changing test approaches and changing software requirements?
Criteria 5: Acceptance Criteria Established - Does the test plan contain the criteria that the software must meet in order to be placed into production, and do the users agree with those criteria?

Category: Management Support for Testing

Criteria 1: Availability of Resources - Does management provide the resources necessary (including calendar time) to adequately train, plan, conduct, and evaluate results for software testing assignments?
Criteria 2: Life Cycle Involvement by Testers - Are testers involved from the inception through the termination of software projects to ensure that testing concerns are continuously addressed?
Criteria 3: Equality Between Development and Testing - Does management allocate as many resources to the test processes and tools as it does to the development processes and tools?
Criteria 4: Management's Personal Time Equality Between Development and Testing - Does management spend as much personal time on test planning and test execution as it does on development planning and development execution?
Criteria 5: Knowledge and Skills - Is management knowledgeable and sufficiently trained in test theory, processes, and tools to effectively manage test planning and execution, and to understand and effectively act on test results?

Category: Test Processes

Criteria 1: Testing Performed by Process - Do testers follow processes to plan tests, prepare test data, execute tests, and develop and report test results?
Criteria 2: Processes Are Understandable and Usable - Can documented test processes be correctly interpreted by testers, so that the test procedures can be followed as intended during use?
Criteria 3: Processes Are Complete - Do the processes provided for testing cover all the activities that are needed to perform effective testing?
Criteria 4: Process Maturity Plan - Has a plan been developed and put in place to mature the test processes, so that they become more effective, efficient, and timely?
Criteria 5: Testers Build Processes - Do the owners/users of the test processes (i.e., the testers) build the processes used for testing?

Category: Test Tools

Criteria 1: Capture/Playback Tool Available - Do testers use an automated tool to generate and reuse test data?
Criteria 2: Tool Needs Defined Prior to Acquisition - Are test tools selected in a logical manner, meaning that test needs drive the search for and acquisition of test tools?
Criteria 3: Testers Trained Prior to Using Tools - Do testers use test tools only after they have received adequate training in how to use those tools?
Criteria 4: Mandatory Tool Usage - Is test tool usage specified in the test plan, meaning that usage of the tools is mandatory rather than optional or at the sole discretion of a tester?
Criteria 5: Tool Help Desk Available - Has a process for obtaining assistance in using test tools been established, and does it provide testers with the needed instructional information?

Category: Training

Criteria 1: Training Plan for Testers - Does a career training plan for testers exist, and is it in use to develop a tester from an unskilled state to a master tester state?
Criteria 2: Test Process Training - Are testers adequately trained in test processes before using those processes for testing?
Criteria 3: Test Theory/Risk Training - Are testers trained in the theory of testing, risk analysis, the various approaches to testing, etc., so that they understand why they perform certain test tasks?
Criteria 4: Statistical Training - Are testers trained in statistics, so that they understand the level of confidence they can provide a user with different test approaches and how to interpret test results?
Criteria 5: Process Management and Improvement Training - Are testers trained in how to measure process performance, and do they use the results of that measurement to improve the test processes?

Category: User Satisfaction with Testing

Criteria 1: Test Reports - Do users get the information they need to track test progress and assess results prior to placing software into production?
Criteria 2: User Survey - Are user surveys conducted to determine user satisfaction with test planning, test execution, test results, communications, etc.?
Criteria 3: User Involvement in Acceptance Testing - Do users participate in tests that determine whether or not the software is acceptable for use?
Criteria 4: User Approves Test Plan - Are users presented with a plan for testing, and do they approve (i.e., agree) that if that plan is followed, they will consider testing to be satisfactory?
Criteria 5: User Activities Included in Testing - Are user support activities such as data entry, output usage, terminal usage, manual usage, etc., validated as part of testing?

Category: Test Efficiency

Criteria 1: Testing Focused on Risks - Has the test plan been developed so that test resources will be allocated to validate that the major risks are addressed prior to the minor risks?
Criteria 2: Test Process Measured/Efficient - Has a measurement process been installed to measure the efficiency of the test processes?
Criteria 3: Budget/Schedule Met - Is compliance with the budget and schedule measured, and are variances addressed effectively?
Criteria 4: Tool Efficiency Measured - Is tool usage measured to assess the contribution received from automated testing?
Criteria 5: Defect Removal Efficiency Measured - Is the percentage of defects removed, versus the total defects eventually attributable to a development phase, measured?

Category: Quality Control

Criteria 1: Defects Recorded - Are defects made by testers during testing recorded and effectively addressed?
Criteria 2: Test Plans Reviewed/Inspected - Is the test plan reviewed/inspected during/after completion by peers for adequacy and compliance to test standards?
Criteria 3: Test Plan Includes Quality Control - Does the test plan include the procedures that will be used to verify that testing is executed in accordance with the plan?
Criteria 4: Individual Quality Control Reports Prepared - Are regular reports prepared that show the full status of testing for individual software systems?
Criteria 5: Summarized Quality Control Reports Prepared - Periodically, are the individual quality control reports summarized to show the efficiency and effectiveness of testing across the entire information services organization?
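Because Questionnaire 1 will typically be repeated after improvements are made, it can help to record the consensus responses and comments in a simple, durable form (for example a CSV file) so successive footprints can be compared. This is only a suggestion layered on top of QAI's practice; the sketch assumes Python and a hypothetical file name.

```python
# Sketch: record Questionnaire 1 consensus responses to CSV for later comparison.
# (Assumptions: Python standard library only; file name and rows are hypothetical.)
import csv

rows = [
    # category, criteria number, area to address, response, comment
    ("Test Planning", 1, "Business/Technical Risks", "YES", ""),
    ("Test Planning", 2, "Building a Test Plan", "NO", "No success criteria defined"),
    # ...one row per area to address, 40 rows in total...
]

with open("testing_assessment_baseline.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["category", "criteria", "area_to_address", "response", "comment"])
    writer.writerows(rows)
```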

WORKSHEET 1: Test Assessment Kiviatt Chart

[Blank Kiviatt Chart with eight spokes, each scaled from 1 to 5: Test Planning, Management Support for Testing, Use of Test Processes, Test Tools, Training, User Satisfaction with Testing, Test Efficiency, Quality Control. For each category, place a dot at the number of YES responses and connect the dots to form your footprint.]

Figure 3: Example of a World-Class Software Testing Organization
[Kiviatt Chart footprint across the eight categories.]
INTERPRETATION: Testing is considered a technical issue. Users/management consider testing part of development.

Figure 4: Example of a World-Class Software Testing Organization
[Kiviatt Chart footprint across the eight categories.]
INTERPRETATION: Test processes were developed but are not used as written; the test processes became test guidelines.

Figure 5: Example of a World-Class Software Testing Organization
[Kiviatt Chart footprint across the eight categories.]
INTERPRETATION: Testing is an art/craft focused on doing what the customer wants.