Defining Quality Workbook. <Program/Project/Work Name> Quality Definition


Introduction: Defining Quality

When starting on a piece of work it is important to understand what you are working towards. Scope, time and cost usually have clear values attached to them, while quality remains ambiguous, with multiple competing definitions. This Defining Quality pack has been put together to bring the team to a common understanding of the definition of quality. The pack consists of instructions for team members on how to define quality, followed by a template where users can work through the steps below to create their own quality definition.

1 Quality Advocates: What does quality mean to the different roles in the team?
2 Quality Taxonomy: What are quality attributes?
3 Quality Prioritisation: How do we know which quality attributes to include?
4 Quality Tradeoff Risks: What are the risks of the quality attributes we are trading off?
5 Quality Measurement: How do we test and measure quality?
6 Success Sliders: How does quality relate to the sliders?
7 Quality Definition: What does quality mean?
8 Next Steps: How do we apply quality to our work?

This pack is based on ideas introduced in Software Quality Attributes: Following All The Steps, available at: http://www.clarrus.com/documents/software%20quality%20attributes.pdf

Overview: Quality Definition

Quality is a multi-dimensional measure that describes how well a system or application satisfies its stated or implied requirements. Stated requirements are equivalent to functions or features; implied requirements are equivalent to performance, usability, security, maintainability and other non-functional requirements. For projects, quality must be agreed and continually reviewed to ensure all the requirements of stakeholders (stated or implied) are properly satisfied. Software development projects often focus too heavily on satisfying functional requirements and overlook requirements for maintainability, flexibility and performance.

Everyone in the team must have the same understanding of quality, how it is measured, and the impact that has on their working practices. If this agreement is not reached, then the process for ensuring quality is maintained cannot be agreed upon.

Everyone is responsible for quality! The core and extended quality teams span the Business Analyst, Tester, Iteration Manager, Architect, Software Engineer, Project Manager, Tech Lead, Application Owner, Enterprise Architect, Project Sponsor, Business SME, Infrastructure Lead, Program Manager, Support Analyst and other key stakeholders.

1 Quality Advocates

INSTRUCTIONS: Identify the quality advocates in your team and what quality means to each role / skillset. Ensure you have representation from business, technical, testing and management roles. This works well as a sticky-note session, where everybody writes their definition and shares it with the rest of the team. Delete this instruction and replace the examples above with the definitions from your entire team. Add new pages as required.

Overview: Quality Taxonomy

Correctness: The degree to which the features or functions need to be correct.
Reliability: The probability of the system operating without failure for a certain period of time.
Robustness: The degree to which the system functions correctly when confronted with invalid input data, software or hardware defects, or unexpected operating conditions.
Integrity: The ability to preclude unauthorised access, prevent information loss, protect against virus infection, and protect the privacy of data entered.
Efficiency: How well the system uses available resources: processor capacity, disk space, memory, communication bandwidth.
Portability: The effort required to migrate a piece of software from one operating environment to another.
Reusability: The extent to which a software component can be used in other applications.
Interoperability: The ease with which the system can exchange information with other systems.
Maintainability: How easy it is to correct a defect or make a change in the system.
Flexibility: How much effort is required to add more capabilities to the product.
Testability: The ease with which the components of the system can be tested to find defects.
Installability: The ease of installation of the software.
Availability: The percentage of planned uptime that the system is required to be operational.
Survivability: The amount of time the system is required for use.
Usability: A measure of the effort required to prepare input for, operate, and interpret the output of the product.

Overview: Quality Attribute Tradeoffs

The tradeoff matrix relates the attribute you optimise for ("If you have...") to the effect on every other attribute ("...you get..."), marked positive (+) or negative (-). It covers Reliability, Robustness, Availability, Integrity, Flexibility, Usability, Interoperability, Efficiency, Testability, Maintainability, Reusability and Portability; notably, optimising for Efficiency shows a negative effect on almost every other attribute. [The individual matrix cells are not recoverable from this transcript.]
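
One way to make such a matrix usable in a workshop is to encode it as a lookup. Below is a minimal sketch in Python (not part of the original pack); the cell values shown are illustrative assumptions, since the pack's actual matrix values did not survive transcription:

    # Tradeoff matrix sketch: matrix[optimised][affected] -> "+" or "-".
    # Entries are illustrative assumptions, not the original matrix values.
    matrix = {
        "Efficiency": {"Portability": "-", "Maintainability": "-", "Usability": "-"},
        "Reliability": {"Robustness": "+", "Availability": "+", "Efficiency": "-"},
    }

    def tradeoff(optimised: str, affected: str) -> str:
        """Return the effect ('+', '-', or 'unknown') of optimising one attribute on another."""
        return matrix.get(optimised, {}).get(affected, "unknown")

    print(tradeoff("Efficiency", "Portability"))    # "-": fast code is often platform-specific
    print(tradeoff("Reliability", "Availability"))  # "+": fewer failures means more uptime

Encoding the matrix this way lets a team record each agreed tradeoff during the session and query it later when a prioritisation decision is challenged.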

2 Quality Taxonomy

Priority | Quality Attribute | Meaning
5 | Reliability | The probability of the system operating without failure for a certain period of time
3 | Robustness | The degree to which the system functions correctly when confronted with invalid input data, software or hardware defects, or unexpected operating conditions
2 | Integrity | The ability to preclude unauthorised access, prevent information loss, protect against virus infection, and protect the privacy of data entered
- | Efficiency | How well the system uses available resources: processor capacity, disk space, memory, communication bandwidth
- | Portability | The effort required to migrate a piece of software from one operating environment to another
2 | Reusability | The extent to which a software component can be used in other applications
1 | Interoperability | The ease with which the system can exchange information with other systems
10 | Maintainability | How easy it is to correct a defect or make a change in the system
4 | Flexibility | How much effort is required to add more capabilities to the product
4 | Testability | The ease with which the components of the system can be tested to find defects
- | Installability | The ease of installation of the software
- | Availability | The percentage of planned uptime that the system is required to be operational
- | Survivability | The amount of time the system is required for use
10 | Usability | A measure of the effort required to prepare input for, operate, and interpret the output of the product

INSTRUCTIONS: The idea here is to define the key quality attributes for the team, based on the wisdom of the crowd. Correctness is a given for any piece of work, so don't include this attribute. Prioritise your quality attributes. You could do this via the MoSCoW technique or by getting all of the attendees to nominate their top 3 attributes via voting. Delete this instruction and replace the example above.
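
Where the team votes with sticky notes, the tally above can be produced mechanically. The following is a minimal Python sketch (not part of the original pack; the ballot data is an illustrative placeholder) of nominating top-3 attributes and ranking them by vote count:

    from collections import Counter

    # Each team member nominates their top 3 quality attributes.
    # These ballots are illustrative placeholders.
    ballots = [
        ["Maintainability", "Usability", "Reliability"],
        ["Usability", "Maintainability", "Flexibility"],
        ["Maintainability", "Usability", "Testability"],
    ]

    # Tally one vote per nomination and rank attributes by vote count.
    votes = Counter(attr for ballot in ballots for attr in ballot)
    for attribute, count in votes.most_common():
        print(f"{count} votes: {attribute}")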

3 Quality Prioritisation & Meaning

Top 5 (plus or minus 1) attributes and OUR meaning of them:

Usability: Can people use it; less training; easy to learn; logical, easy to use, intuitive.
Maintainability: Easy to change; speed to market; simple; TCO over the life of the product.
Reliability: Completeness (quote and application); consistent availability; no crashes, crashes gracefully; consistent outcome.
Efficiency: Fast to improve; responsiveness.
Robustness: (fifth attribute, added after the next step).

INSTRUCTIONS: We are now looking for our top 5 quality attributes and OUR meaning of them. Reducing the quality attributes from the quality taxonomy allows the team to focus on what is meaningful and important to their piece of work. First, take the top 4 attributes from step 2 and add them to this table. Then, as a group, agree what these attributes mean to us as a team. We will add the fifth attribute after the next step. Delete this instruction and replace the example above.

Overview: Quality Taxonomy Tradeoff Risks

Quality Attribute | Tradeoff Risk
Maintainability | Unable to easily upgrade the system; wasted time on maintenance
Reliability | Loss of income; system outage
Usability | Unhappy users
Integrity | Security breaches; allowing unauthorised access
Flexibility | Overly complex system
Portability | Unable to use multiple platforms
Interoperability | Unable to interact with other systems
Testability | Insufficient testing capability
Survivability | Only has a short shelf life
Robustness | Error prone
Installability | Unable to install software on local machines

Low quality = technical debt.

4 Quality Prioritisation: Tradeoffs

Priority | Quality Attribute | Risk
6 | Flexibility | Covered in maintainability? Hard to meet time to market, miss some scope; less competitive
4 | Testability | Errors not found, low confidence; reliability impact, cost of change; more effort and time; resource strain; complex dependencies
5 | Robustness | Low customer confidence; increased training and support costs; higher cost of change, manual workarounds; brand impact, complaints

INSTRUCTIONS: We now deal with the remaining quality attributes and decide OUR risk of not completing them. First, add the quality attributes from step 2 that we did not prioritise in step 3. Then, as a group, agree on the risk of not meeting these attributes as a team. Next, prioritise your quality attributes. You could do this via the MoSCoW technique or by getting all of the attendees to nominate their top risk via voting. Take the top-priority risk and add its attribute to the list in step 3. Delete this instruction and replace the example above.

Overview: Quality Measurement

A quality-focused big visual chart (BVC) tracks quality per iteration across panels for project health, testing, maintainability, development, overall progress, user measures, test coverage and performance. Metrics plotted include lines of code, lines of test code, number of features, business value, number of risks and issues, new risks and issues raised, number of defects and number of tests. Template available at: http://www.agileacademy.com.au/agile/sites/default/files/quality%20focused%20project%20bvc.pdf
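
To feed such a BVC, teams often snapshot a few counters at the end of every iteration. A minimal sketch, assuming Python (the metric names mirror the chart legend above; the numbers are placeholders, as real values would come from the build and test tools):

    from dataclasses import dataclass

    @dataclass
    class IterationMetrics:
        """One BVC data point, captured at the end of an iteration."""
        iteration: int
        lines_of_code: int
        lines_of_test_code: int
        defects: int
        tests: int

        @property
        def test_code_ratio(self) -> float:
            """Rough maintainability signal: test code per line of production code."""
            return self.lines_of_test_code / max(self.lines_of_code, 1)

    # Placeholder history for two iterations.
    history = [
        IterationMetrics(1, 5000, 800, 12, 150),
        IterationMetrics(2, 5300, 1100, 9, 210),
    ]
    for m in history:
        print(f"Iteration {m.iteration}: defects={m.defects}, "
              f"tests={m.tests}, test ratio={m.test_code_ratio:.2f}")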

5 Measurement: Test Criteria & SMART Goals

Attribute | Test Criteria | SMART* Goal / Metric
Efficiency | Execution efficiency | X% of the available processor capacity shall be unused at peak load conditions (defined elsewhere)
Reliability | Simplicity | The McCabe complexity of every module shall be below 10; the average across all modules shall be below 5
Availability | Mean time to repair | The average time required to bring the system back online after a failure shall not exceed 10 minutes
Usability | Response times | The average time required to generate and display an online report shall be less than 2 seconds, and no online report shall take more than 5 seconds. Any report taking more than 250 milliseconds shall give graphical feedback to the user that the system is processing data

INSTRUCTIONS: We now want to discuss how we will measure that we are meeting our quality attributes. Adding test criteria to the quality attributes lets the team understand how each quality attribute will be tested and gives it a definition of done. Adding SMART goals sets the parameters around each test criterion and defines how to achieve it. First, transfer the final list of quality attributes from step 3. Then determine the test criteria, and then the SMART goals and specific metrics to measure each attribute. Delete this instruction and replace the example above. THIS STEP CAN BE DEFERRED TO A SEPARATE DISCUSSION IF TIME IS NOT AVAILABLE.

* Specific, Measurable, Achievable, Realistic and Time-Bound
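
SMART goals like the response-time criterion above lend themselves to automated checks. A minimal sketch, assuming Python and a hypothetical generate_report() function standing in for the real report endpoint:

    import time

    def generate_report() -> None:
        """Hypothetical stand-in for the real online report generation."""
        time.sleep(0.05)  # simulate work

    # SMART goal from step 5: average < 2 s, worst case < 5 s.
    samples = []
    for _ in range(20):
        start = time.perf_counter()
        generate_report()
        samples.append(time.perf_counter() - start)

    average = sum(samples) / len(samples)
    worst = max(samples)
    assert average < 2.0, f"average response {average:.2f}s breaches the 2s goal"
    assert worst < 5.0, f"worst response {worst:.2f}s breaches the 5s goal"
    print(f"average={average:.3f}s worst={worst:.3f}s: goal met")

Run in a test suite, a check like this turns the SMART goal into an executable definition of done rather than a statement on a slide.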

Overview: Quality Success Slider Meaning

Level 1: Work your hardest to sign off all of your quality attributes and associated testing criteria. The application is critical, the impact on the business is high, and the media will become involved if issues arise.
Level 2: Look at signing off your top three quality attributes and their associated testing criteria. A core piece of infrastructure; future development will be undertaken, and the need to expand or change quickly is high.
Level 3: Pick one or two of your quality attributes (focussing on the ones that give most value to the work) and aim at completing those. Other pressures mean quality needs to be sacrificed; a high level of technical debt may be incurred.
Level 4: Do your best to undertake some of the testing criteria, focussing on the most critical in the time you have (this doesn't mean doing nothing). You get what you get, which will be basic correctness; a high level of technical debt will be incurred.

A discussion needs to be had and agreement reached on what it means if the quality slider is set at a certain level. If the quality slider is not at number one, then some form of technical debt will be accrued; a risk needs to be raised and a strategy put in place to address it.

6 Success Sliders

Each slider runs from FIXED to FLEXIBLE, and only one slider can sit at each level.

SUCCESS TYPE | SUPPORTING EXPLANATION
SCOPE (meet the project's requirements) | New and existing records converted, all reports available
COST (meet the agreed budget) | Budget = $500K
TIME (deliver the project on time) | Due by June
QUALITY (meet quality requirements) | No high-priority bugs, performance levels met, ability to add new functions easily in the next release

INSTRUCTIONS: Now either define your success sliders OR refer to the success sliders that have already been defined for your piece of work. Only one slider can sit at each level. Set the position of the sliders to reflect your understanding of how the success of this project will be judged when it's finished. Review the position of the quality slider against the meanings on the previous page and discuss among the team whether we are comfortable with this. If you are not, you will need to adjust the sliders. Delete this instruction and replace the example above.
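
The "only one slider per level" rule is easy to check mechanically. A minimal Python sketch (not from the original pack; the positions are placeholders, with 1 = most fixed):

    # Slider positions: 1 = most fixed ... 4 = most flexible.
    # Example placeholder positions; only one slider may sit at each level.
    sliders = {"scope": 2, "cost": 1, "time": 3, "quality": 4}

    levels = list(sliders.values())
    if len(set(levels)) != len(levels):
        raise ValueError("Two sliders share a level; renegotiate the tradeoff.")

    if sliders["quality"] != 1:
        print("Quality is not the top slider: raise a technical-debt risk "
              "and agree a strategy to address it.")

The final check mirrors the rule on the previous page: any quality slider position other than number one implies accrued technical debt and a corresponding risk.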

7 Quality Definition

As a team we define quality as <summary of what will make up your definition of quality, using output from step 3, based on the position of the success slider in step 6>.

We will strive to achieve quality by <summary of the test criteria or SMART goals from step 5, based on the position of the success slider in step 6>.

If <time, cost, scope> permits, we will focus additionally on <the quality attributes listed in step 3 and/or step 4, based on the position of the success slider in step 6 (only add this if quality is not the number one slider)>.

Due to project constraints we believe we will not be able to undertake <the quality attributes that the team believes cannot be completed, using output from step 4, along with a brief summary of the major risks>.

INSTRUCTIONS: The quality definition is our clear statement of what quality means (what we will and will not focus on). This does not necessarily mean we will do nothing about the quality attributes left out, just that they will not be our focus. This statement should be placed in a prominent area with the team. Add a summary based on the inline instructions above. Delete this instruction and replace the example above.

Overview: Next Steps (Quality Practices)

Template available at: http://www.agileacademy.com.au/agile/sites/default/files/quality%20focused%20Project%20BVC.pdf

8 Next Steps

Action items (tick Done as each is completed):
1 Create cards for the quality attributes and associated criteria.
2 Agree on the metrics and measures that will be used to track progress, and make them visible on a big visual chart (BVC).
3 Complete the Agile Quality Practices Assessment to identify quality-practice targets.
4 Include the quality attributes and associated criteria in the definition of quality in the test strategy.
5 Determine the automation strategy.
6 Revisit the quality definition in a retrospective to confirm it is still the same.

INSTRUCTIONS: Agree on any next steps you will take as a team, either from issues raised in this workshop or from a maturity assessment as shown on the previous page. Delete this instruction and replace the example above.