ISO/IEC 9126-1 Software Product Quality Model

Why do current systems fail?
The Standish Group found that 51% of projects failed and 31% were only partially successful. The main causes were poor user requirements:
- 13.1% Incomplete requirements
- 12.4% Lack of user involvement
- 10.6% Inadequate resources
- 9.9% Unrealistic user expectations
- 9.3% Lack of management support
- 8.7% Requirements keep changing
- 8.1% Inadequate planning
- 7.5% System no longer needed

ISO/IEC 9126-1 Software Product Quality Model
Quality in use, underpinned by six quality characteristics and their sub-characteristics:
- Functionality: accuracy, suitability, interoperability, security
- Usability: understandability, learnability, operability, attractiveness
- Reliability: maturity, fault tolerance, recoverability, availability
- Efficiency: time behaviour, resource utilisation
- Maintainability: analysability, changeability, stability, testability
- Portability: adaptability, installability, co-existence, replaceability

Usability Requirements: Quality in use
ISO/IEC TR 9126-1: Quality in use metrics
- User performance: all data entry clerks will be able to complete the task with at least 95% accuracy in under 10 minutes
- User satisfaction: the mean score on the SUMI scale will be greater than 50

Usability Requirements: external usability
ISO/IEC TR 9126-2: External metrics. Requirements (which can be tested by using a prototype):
- Understandability: product description and demonstrations; interface functions (e.g. menus) easy to understand
- Learnability: functions learnt quickly; effective user documentation and help
- Operability: understandable messages, undoability, customisability
- Attractiveness: screen layout and colour
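The quality-in-use targets above (at least 95% accuracy in under 10 minutes, mean SUMI score above 50) can be checked mechanically once test measurements are collected. A minimal sketch: the threshold values come from the requirements above, while the function name and the sample figures are invented for illustration.

```python
# Check measured quality-in-use results against the stated requirements.
# Thresholds (95% accuracy, 10 minutes, SUMI > 50) are from the slide;
# the example measurements are made up.

def meets_quality_in_use(accuracy_pct, task_minutes, mean_sumi):
    """Return True if user performance and satisfaction targets are both met."""
    performance_ok = accuracy_pct >= 95 and task_minutes < 10
    satisfaction_ok = mean_sumi > 50
    return performance_ok and satisfaction_ok

# One data entry clerk's (hypothetical) measured results:
print(meets_quality_in_use(accuracy_pct=97.0, task_minutes=8.5, mean_sumi=54))   # True
print(meets_quality_in_use(accuracy_pct=97.0, task_minutes=12.0, mean_sumi=54))  # False
```

In practice each requirement would be reported separately (as the CIF format later in this lecture does), rather than collapsed to a single pass/fail flag.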

Usability Requirements: internal usability
ISO/IEC TR 9126-3: Internal metrics. Requirements (which can be tested by inspecting the specification):
- Understandability: product description complete; interface functions (e.g. menus) easy to understand
- Learnability: complete user documentation and help
- Operability: consistency, self-explanatory messages, undoability, customisability; GUI style guide
- Attractiveness: screen layout and colour

Types of user requirements
- Context of use
- Quality in use
- Constraints
- Usability (external and internal)
- Functionality
- Efficiency
- Reliability

Types of user requirements
- Context of use
  - User groups and other stakeholders
  - Tasks and scenarios that the system should support
  - Environment: physical and organisational
- Usability: quality in use (summative goals)
  - Task performance scenarios: effectiveness and efficiency
  - Satisfaction
- Usability: detail (summative goals)
  - Interface behaviour: external and internal
- Functions to support usability (formative design)
  - General principles to be followed
  - Specific system features to enhance usability
- Efficiency
- Reliability

ISO/IEC CD 25030: Software quality requirements and evaluation
Quality requirements:
- Define quality requirements
  - Identify stakeholders: end users, developers, producers, trainers, maintainers, disposers, acquirer and supplier organisations, and regulatory bodies
  - Elicit requirements from stakeholders: quality in use, external and internal quality
- Analyse the set of requirements
  - Resolve problems
  - Confirm requirements
  - Record requirements
- Formalise quality requirements
  - Specify target values for measures
  - Demonstrate traceability
- Maintain requirements

Human-centred design process for interactive systems: ISO 13407
1. Plan the human-centred process
2. Specify the context of use
3. Specify user and organisational requirements
4. Produce design solutions
5. Evaluate designs against user requirements (iterate from step 2 until the design meets the requirements)

ISO 13407: Usability requirements
The specification of user and organizational requirements should:
- identify the range of relevant users and other personnel in the design,
- provide a clear statement of the human-centred design goals,
- set appropriate priorities for the different requirements,
- provide measurable criteria against which the emerging design can be tested,
- be confirmed by the users or those representing their interests in the process,
- include any statutory or legislative requirements, and
- be adequately documented.

System lifecycle (TRUMP)
The human-centred activities (plan the process, specify the context of use, specify requirements, design solutions, evaluate against requirements) map onto the system lifecycle stages of feasibility, requirements, design, implementation and release:
1. Stakeholder meeting
2. Context of use
3. Scenarios
4. Usability requirements
5. Evaluate existing system
6. Prototyping
7. Style guide
8. Evaluation
9. Usability testing
10. Collect feedback
www.usability.serco.com/trump/ucdmethods

Context of Use
The usability of a product is affected not only by the features of the product itself but also by its context of use. Context is the characteristics of:
- the users of the product
- the tasks they carry out
- the technical, organisational and physical environment in which the product is used
- the date and time when the product is being used

Usability Context Analysis (UCA)
A structured method for documenting the key aspects of a system which affect usability. It provides support for:
- identifying the (intended) context of use
- specifying the context in which usability should be measured
It also helps with the problem of generalising from findings: "Some laboratory studies have been so remote from conditions of actual system use that the relation of the data to life was at best irrelevant and at worst distorting" (Whiteside et al., 1988).

Who should be involved?
- work analysis
- requirements analysts
- user representatives
- change management
- usability professionals
- training development
- IT & peripherals development
- usability team
- user support
- marketing

Context structure
1 Users
  1.1 User types
  1.2 Skills & knowledge
  1.3 Physical attributes
  1.4 Mental attributes
  1.5 Job characteristics
  1.6 List of tasks
2 Task characteristics
3 Organisational environment
4 Technical environment
5 Physical environment

Example: A bank ATM
Context description:
- the users
- the user characteristics
- the tasks users perform
- the technical environment (hardware and software to support the system)
- the physical environment
- the social or organisational environment

New design features to meet contextual needs
- Recess for wheelchair access
- Speech output for visually impaired users
- Customisation features for rapid access
- Fingerprint identification
- Visor appears during sunny weather
- Buttons light up during darkness
- Alarm button for security alert

Usability measures (quality in use)
The user and system together transform task inputs into task outputs within an environment. The quality-in-use measures are:
- % Effectiveness
- Task Time (minutes)
- % Productive Period (the proportion of time spent on useful work)

Effectiveness and Efficiency
For a user-system combination performing a task, efficiency is effectiveness per unit of task time:

  Efficiency = Effectiveness / Task Time

We use a black-box approach:
- goal-centred: relates to performance objectives
- process-independent: enables comparison of different systems, workflows, task allocations, etc. used to achieve the same goal

Usability requirements
Measure effectiveness, efficiency and satisfaction at each stage, from estimates through to actual use:
- Estimate values from volumetrics and the existing system
- Specify the requirement (effectiveness, efficiency, satisfaction)
- Test against it by business prototyping and usability testing (effectiveness, efficiency, satisfaction)
- Compare with actual use (effectiveness, efficiency, satisfaction)
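The efficiency formula above can be applied directly to compare user-system combinations. A hedged sketch with invented figures (the function name and the numbers are illustrative, not measured data):

```python
# Efficiency = Effectiveness / Task Time (black-box quality-in-use measure).
# The figures below are invented examples.

def efficiency(effectiveness_pct, task_time_minutes):
    """Percentage points of effectiveness achieved per minute of task time."""
    return effectiveness_pct / task_time_minutes

old_system = efficiency(90, 12)   # 90% effective in 12 minutes -> 7.5 %/min
new_system = efficiency(95, 8)    # 95% effective in 8 minutes -> 11.875 %/min
print(new_system > old_system)    # True: the new system is more efficient
```

Because the measure is process-independent, the same calculation compares a manual workflow against a software system, or two different task allocations, so long as they pursue the same goal.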

Effectiveness costs
Measure consequences, not causes, e.g.:
- Minor inconsistencies: 1-10%
- Administrative consequences: 10-50%
- Financial implications: 1-100%
- Negative business consequences: 100%

Effectiveness scores
An acceptable task output is 100% effective. Consider each element in the task output in turn:
- What plausible errors could occur in each element?
- What impact would each type of error have on the stakeholders?
- How much does this reduce the value of the output?
Assign a percentage to each possible error type in each task element. For example:
- the inconvenience associated with an error that would be detected later and corrected might reduce the effectiveness by 20%
- a serious undetected error might reduce the value by 80%
- the inconvenience of a typo might be judged to reduce the value by 5%
- some errors may invalidate the whole output, reducing the value to 0%
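The scoring rule above reduces to simple arithmetic: start at 100%, subtract the impact of each error present in the output, and never go below 0%. A minimal sketch; the example impact values (5% typo, 20% corrected error, 80% serious undetected error) are taken from the slide, the function name is invented:

```python
# Score one task output: 100% minus the impact of each error present,
# floored at 0% (errors can invalidate the whole output).

def effectiveness_score(error_impacts_pct):
    """Effectiveness of a single task output, given its errors' impacts in %."""
    return max(0, 100 - sum(error_impacts_pct))

print(effectiveness_score([]))        # 100: an acceptable output
print(effectiveness_score([5, 20]))   # 75: a typo plus a corrected error
print(effectiveness_score([80, 30]))  # 0: impacts sum past 100%
```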

Specifying effectiveness
- Choose a user/business task: a sequence of activities by an individual that meets a user/business goal. What is the input and what is the output?
- Consider each element of the output:
  - What could go wrong with each element?
  - What output errors occur using existing systems and procedures?
  - What types of mistakes might users make?
- Estimate the user/business impact of each error situation:
  - For each element of the task output that has an error, take the appropriate impact percentage
  - Subtract the percentages from 100 to give the overall effectiveness
  - If the sum of the errors is greater than 100, the effectiveness is 0%
- Calculate the average effectiveness:
  - What is the relative frequency of correct results and of each error scenario?
  - Multiply each effectiveness scenario by its frequency and calculate the average

Effectiveness calculation

Output element    Error scenario  Impact  Business value  Frequency  Value x freq.  Impact x freq.
Acknowledgement   Not sent        10%     90%             1%         90             10
Corrected d.o.b.  Wrong d.o.b.    50%     50%             2%         100            100
Payment made      No payment      100%    0%              1%         0              100
(none)            No errors       0%      100%            96%        9600           0
Total                                                     100%       9790
Mean value: 97.9%
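The calculation in the table is a frequency-weighted average of the business value of each scenario. A short sketch reproducing the table's rows (the variable names are invented; the figures are the slide's):

```python
# Mean effectiveness: weight each scenario's business value (%) by its
# relative frequency (%). Rows reproduce the calculation table above.

scenarios = [
    # (error scenario, business value %, frequency %)
    ("Acknowledgement not sent", 90, 1),
    ("Wrong d.o.b.",             50, 2),
    ("No payment",                0, 1),
    ("No errors",               100, 96),
]

total_frequency = sum(f for _, _, f in scenarios)      # 100
weighted_value = sum(v * f for _, v, f in scenarios)   # 9790
mean_effectiveness = weighted_value / total_frequency  # 97.9
print(f"Mean effectiveness: {mean_effectiveness}%")    # Mean effectiveness: 97.9%
```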

Setting effectiveness and efficiency targets
- What assumptions about task time and accuracy have been made in the volumetrics, performance and staffing models?
- What is known about the time and accuracy of existing similar procedures? The new system should be at least as good as the existing systems.
- Set a target range for effectiveness and efficiency, e.g. on a 0-100 scale, a worst case of 90 (must be at least as good as existing systems) and a target of 95.

User satisfaction: SUMI questionnaire
[Bar chart: SUMI scores (0-60) for the new system versus the old system on the Global, Efficiency, Affect, Helpfulness, Control and Learnability scales]

Detailed usability requirements
- Understandability
  - Interface elements (e.g. menus, controls) should be easy to understand
  - For a walk-up-and-purchase or walk-up-and-use system, the purpose of the system should be easily understandable
- Learnability
  - The user documentation and help should be complete
  - The help should be context-sensitive and explain how to achieve common tasks
  - The system should be easy to learn
- Operability
  - The interface actions and elements should be consistent
  - Error messages should explain how to recover from the error
  - Undo should be available for most actions
  - Actions which cannot be undone should ask for confirmation
  - The system should be customisable to meet specific user needs
  - A style guide should be used
- Attractiveness
  - The system design, screen layout and colour should be appealing

Other usability metrics
- Number of errors: interesting to measure, but what does it mean?
- Number of assists: would there be assists in real life?
- Preference for features: a formative measure

Why is summative usability testing important?
Summative testing is about meeting business and user needs:
- Effectiveness: success in achieving goals
- Efficiency: productivity, staffing, waiting time in line
- Satisfaction: willingness to use the system

Differences between summative and formative testing
Formative: diagnosis
- Identify usability defects
- Understand user problems
- Early in design
- Fast iteration
- Eliminate as many problems as possible
Summative: measurement
- How usable is the product?
- Does it meet the usability requirements?
- Does it need further improvement?

Measurement needs a more rigorous procedure
- Diagnostic: 3-4 users; natural think-aloud; informal; qualitative results
- Measurement: at least 8 users; prompted, not assisted; controlled; quantitative results

The danger of not setting usability requirements
- New software for issuing UK passports was installed in passport-issuing offices. It took operators twice as long, caused delays of up to 3 months in obtaining a passport, and incurred the huge cost of additional clerical staff.
- On e-commerce web sites, user success in purchasing ranges from 25% to 42%.

Summative testing
Summative testing is unfashionable:
- Much early usability work used summative methods (Whiteside et al., 1988)
- It was not always supported by other user-centred design activities
- It gained the reputation of being an expensive way to identify problems when it was too late to fix them!
The emphasis therefore moved to formative evaluation: so-called "discount" usability methods that can be used earlier in development. But without subsequent summative testing, it is difficult to judge the effectiveness of the usability work.

Part 5. Common Industry Format for usability test reports
An initiative of NIST (the National Institute of Standards and Technology): suppliers provide standard usability test reports to purchasers.
- Suppliers include: IBM, Microsoft, HP, Sun, Oracle, Compaq
- Purchasers include: Boeing, Northwest Mutual Life, State Farm Insurance, Fidelity, Kodak
Reports are provided in confidence and could permit comparisons between products.

CIF motivation
Boeing: "We traditionally have had little visibility of how usable a product will be or how much training and support users will need. This has made it difficult to compare products, to plan for support, or estimate total cost of ownership."
US WEST: "US WEST has been actively participating in [the CIF] initiative and will clearly benefit from the results of this effort such as a standard testing process for usability, a standard specification for reporting usability tests, and other techniques to enable us to partner more effectively with our vendors."
State Farm Insurance: "We have found it difficult to identify software products that meet our needs without contributing to excessive overhead, increasing support costs, or negatively impacting employee productivity or morale. If successful the [CIF] initiative should result in the development of better, more usable software for all of industry."

CIF objectives
[Diagram: supplier and consumer each state usability requirements; the supplier reports product usability in the CIF, which informs the consumer's purchase decision]

Usability test report format
1. Title page
2. Executive summary
3. Introduction
  3.1 Product description
  3.2 Test objectives
4. Method
  4.1 Participants
  4.2 Context of use in the test
  4.3 Experimental design
  4.4 Usability metrics
5. Results
6. Appendices

Product usability requirements format
1. Title page
2. Executive summary
3. Product details
  3.1 Product description
  3.2 Product objectives
4. Requirements: users, tasks, scenarios, metrics, computer, display, environment

Usability test requirements format
1. Users
2. Context of use in the test
3. Test procedure
4. Usability metrics
5. Appendices

Usability as part of procurement
- Evaluate the existing system to establish a baseline: completion rate, task time, user satisfaction
- Specify usability requirements: at least as good as the baseline, plus design issues arising from the baseline evaluation [need to use UCD methods in development]
- Evaluate the first working prototype of the new system
- Negotiate a solution if targets are not being met

Benefits of the CIF

Problem                                 Solution
Usability costs money                   Usability saves money
Usability is an interface issue         Usability is a business issue
Usability is not part of the process    ISO 13407 user-centred design process
No usability requirements               Use the Common Industry Format for requirements
Customers don't ask for usability       Provide Common Industry Format usability results
Buyers can't assess usability           Ask for Common Industry Format usability results

www.usabilitynet.org