USABILITY SPECIFICATIONS




USABILITY SPECIFICATIONS

TOPICS (Chapter 8):
* What they are
* Usability specification table
* Benchmark task descriptions
* User errors in usability specifications
* Usability specifications and managing the UE process
* Team exercise on usability specifications

Copyright 2001 H. Rex Hartson and Deborah Hix. All rights reserved. No part of this material may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, microfilming, recording, or otherwise, without prior written permission of its authors.

INTRODUCTION TO USABILITY SPECIFICATIONS

Revisiting the usability engineering life cycle

USABILITY SPECIFICATIONS

Another piece of the puzzle...

Introducing Envision:
* Videoclips to be used as examples of process activities
* Envision: a digital library of computer science literature
* Search results are presented in a graphical scatterplot
* User-controllable visualization along the axes of the scatterplot
* Videoclip: Envision prototype

USABILITY SPECIFICATIONS

Usability is not something warm and fuzzy; it is quantitative:
* Quantitative usability goals against which the user interaction design is measured
* Target levels for usability attributes
* Operationally defined metrics for a usable interaction design

"If you can't measure it, you can't manage it":
* Management control for the usability engineering life cycle
* Indication that the development process is converging toward a successful interaction design
* Establish as early in the process as feasible

USABILITY SPECIFICATIONS

Tie usability specifications to early usability goals:
* E.g., for an early goal of walk-up-and-use usability, base the usability specification on initial task performance time

All project members should agree on usability specification attributes and values. Get customers and users involved in setting levels.

USABILITY SPECIFICATION DATA

Usability specifications are based on:
* Objective, observable user performance
* Subjective user opinion and satisfaction
  - Subjective preferences may reflect users' desire to return to your Web site, but "flash and trash" soon bores and irritates

Objective and subjective usability specifications can both be quantitative.

USABILITY SPECIFICATION TABLE

Credit to: [Whiteside, Bennett, & Holtzblatt, 1988]

Table columns: Usability Attribute | Measuring Instrument | Value to be Measured | Current Level | Target Level | Observed Results

* Usability attribute: what general usability characteristic is to be measured
  - For the Y2K Calendar: initial performance, since we want good walk-up-and-use usability without training or manuals
  - May need separate usability attributes for each user class

Some quantitative usability attributes:
* Objective
  - Initial performance
  - Longitudinal (experienced, steady-state) performance
  - Learnability
  - Retainability

USABILITY SPECIFICATION TABLE

Some other quantitative usability attributes:
* Subjective
  - Initial impression
  - Longitudinal satisfaction

USABILITY SPECIFICATION TABLE

Table so far: Usability Attribute = "Initial performance"; the remaining columns (Measuring Instrument, Value to be Measured, Current, Target, Observed Results) are still to be filled in.

* Measuring instrument: the vehicle by which values are measured for the usability attribute; the thing that generates the data (e.g., a task generates timing data, a questionnaire generates preference data)

Benchmark 1: Schedule a meeting with Dr. Ehrich for four weeks from today at 10 AM in 133 McBryde, about the HCI research project.

What tasks should be included?
* Representative, frequently performed tasks
* Common tasks: the 20% that account for 80% of usage
* Critical business tasks: not frequent, but if you get it wrong, heads can roll

USABILITY SPECIFICATION DATA

* User performance for specific benchmark tasks: objective
* Questionnaire score: subjective (opinion)

Example questions from QUIS, each rated on a 0-10 scale (or NA):
* Overall reaction to user interface: terrible ... wonderful
* frustrating ... satisfying
* uninteresting ... interesting
* dull ... stimulating
* difficult ... easy
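A subjective questionnaire score reduces to a single number just as an objective measure does. A minimal sketch of scoring QUIS-style 0-10 semantic-differential items, skipping NA responses rather than counting them as zero (the item names and ratings below are hypothetical examples, not actual QUIS data):

```python
# Average a set of 0-10 semantic-differential ratings.
# NA responses (None) are excluded from the mean, not counted as 0.
# Item names and values are hypothetical examples.

def questionnaire_score(ratings):
    """Mean of the numeric ratings, ignoring NA (None) responses."""
    answered = [r for r in ratings.values() if r is not None]
    if not answered:
        raise ValueError("no answered items")
    return sum(answered) / len(answered)

ratings = {
    "overall reaction (terrible..wonderful)": 7,
    "frustrating..satisfying": 6,
    "uninteresting..interesting": None,  # participant marked NA
    "dull..stimulating": 8,
    "difficult..easy": 7,
}

print(questionnaire_score(ratings))  # 7.0
```

The resulting mean is what would go in the "observed results" column of a subjective usability specification.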

BENCHMARK TASK DESCRIPTIONS

Developing benchmark task descriptions:
* Representative, frequently performed tasks
* Clear, precise, repeatable instructions
* Say what task to do, not how to do it
* Clear start and end points for timing
  - Not good: "Display last week's appts."
  - E.g., "access a certain kind of info" is better than "submit query"; it is a better match to the work activity
* Adapt scenarios already developed for design
  - Clearly an important task to evaluate
  - Remove information about how to do it

BENCHMARK TASK DESCRIPTIONS

Developing benchmark task descriptions (continued):
* Start with fairly simple tasks, then progressively increase difficulty
  - E.g., add an appt; then add an appt 60 days from now, or add multiple recurring appts
  - E.g., move an appt from one month to another
* Avoid large amounts of typing if typing skill is not being evaluated
* To evaluate error recovery, a benchmark task can begin in an error state

BENCHMARK TASK DESCRIPTIONS

Developing benchmark task descriptions (continued):
* Tasks should include navigation
  - E.g., "add appt one month from today" causes crossing a month boundary independent of the absolute date, BUT...
* Task wording should be unambiguous
  - The example above is confusing; also, the wording of tasks can influence task performance ("4 weeks": users move by week; "1 month": users move by month)
* Don't use words in benchmark tasks that appear specifically in the interaction design
  - E.g., avoid "Find first appt..." when there's a button labeled "Find"; instead, word the task as "search for" or "locate"
* Typically put each benchmark task on a separate sheet of paper
* Typical number of benchmark tasks: enough for reasonable, representative coverage
* Example for the Y2K Calendar: Add an appointment with Dr. Kevorkian for 4 weeks from today at 10 AM concerning your flu shot.

USABILITY SPECIFICATION TABLE

Table so far: Usability Attribute = "Initial performance"; Measuring Instrument = "add appt" task per Benchmark 1.

* Value to be measured: the metric for which usability data values are collected
  - Time to complete task
  - Number of errors
  - Frequency of help and documentation use
  - Time spent in errors and recovery
  - Number of repetitions of failed commands
  - Number of times user expresses frustration or satisfaction
  - Number of commands, mouse clicks, or other user actions to perform task(s)
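Several of these metrics can be derived from one time-stamped log of an evaluation session. A minimal sketch, assuming a hypothetical log format of (event name, seconds from session start); the event names are illustrative, not from any particular logging tool:

```python
# Derive "value to be measured" metrics from a time-stamped session log.
# The log format (event name, seconds from start) is a hypothetical example.

def task_metrics(log):
    """Return time-on-task, error count, and help-use frequency."""
    start = next(t for name, t in log if name == "task_start")
    end = next(t for name, t in log if name == "task_end")
    return {
        "time_to_complete": end - start,
        "num_errors": sum(1 for name, _ in log if name == "error"),
        "help_uses": sum(1 for name, _ in log if name == "help_opened"),
    }

log = [
    ("task_start", 0.0),
    ("error", 4.2),        # e.g., selected the wrong menu item
    ("help_opened", 9.5),
    ("error", 12.1),
    ("task_end", 18.0),
]

print(task_metrics(log))
# {'time_to_complete': 18.0, 'num_errors': 2, 'help_uses': 1}
```

Note that what counts as an "error" event here is a judgment call made by the evaluator, per the error-counting rules later in this chapter.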

USABILITY SPECIFICATION TABLE

Table so far: Usability Attribute = "Initial performance"; Measuring Instrument = "add appt" task per Benchmark 1; Value to be Measured = length of time to add appt on first trial.

* Current level: the present value of the usability attribute, measured with the measuring instrument on the current version of the system (when available)

A baseline to help set the target level can come from:
* A manual system (performing the task without the computer)
* An automated system (existing or previous version)
* A competitor's system
* Developer performance (for expert, longitudinal use)

USABILITY SPECIFICATION TABLE

Table so far: previous columns as above; Current Level = 20 secs. (competitor system).

* Target level: the value indicating unquestioned usability success for the present version; the minimum acceptable level of user performance

Determining target level values:
* Usually an acceptable improvement over the current level

USABILITY SPECIFICATION TABLE

* Observed results: actual values from evaluation with users, filled in after evaluation sessions

Complete example usability specification:
  Usability Attribute: Initial performance
  Measuring Instrument: "add appt" task per Benchmark 1
  Value to be Measured: Length of time to add appt on first trial
  Current: 20 secs. (competitor system)
  Target: 15 secs.
  Observed Results: (filled in after evaluation)
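The table row is small enough to model directly. A sketch, with field names of my own mirroring the table columns, that records an observed result and checks it against the target level (for time and error metrics, lower observed values are better):

```python
# One row of a usability specification table, with a target-level check.
# Field names mirror the table columns; they are this sketch's own naming.
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsabilitySpec:
    attribute: str                    # e.g., "Initial performance"
    instrument: str                   # e.g., a benchmark task
    value_measured: str               # e.g., time on first trial
    current: float                    # baseline (current level)
    target: float                     # minimum acceptable level
    observed: Optional[float] = None  # filled in after evaluation
    lower_is_better: bool = True      # true for times and error counts

    def met(self) -> bool:
        """True if the observed result meets the target level."""
        if self.observed is None:
            raise ValueError("no observed results yet")
        return (self.observed <= self.target if self.lower_is_better
                else self.observed >= self.target)

spec = UsabilitySpec("Initial performance",
                     '"add appt" task per Benchmark 1',
                     "Length of time to add appt on first trial",
                     current=20.0, target=15.0)
spec.observed = 14.0   # hypothetical value filled in after the session
print(spec.met())      # True
```

A subjective specification (e.g., a mean questionnaire score) would set `lower_is_better=False`, since higher scores are better there.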

USER ERRORS IN USABILITY SPECIFICATIONS

Determining errors:
* A deviation from any correct path to accomplish the task (except, for example, going to Help)
* Count only situations that imply usability problems
* Do not count "oops" errors: doing the wrong thing while knowing it was wrong
  - E.g., using the wrong month on the calendar totally by accident

USER ERRORS IN USABILITY SPECIFICATIONS

Examples of errors:
* Selecting the wrong menu, button, icon, etc., when the user thought it was the right one
* Double-clicking when a single click is needed, and vice versa
* Using the wrong accelerator key
* Operating on the wrong interaction object (when the user thought it was the right one)
  - E.g., working on the wrong month of the calendar because the user couldn't readily see the month's name
* Usually not typing errors

CREATING USABILITY SPECIFICATIONS

Usability evaluation design is driven by usability goals:
* First, determine usability goals
  - In terms of user class, task context, special tasks, marketing needs
  - Example: increase, relative to the current version, the number of times a user can perform task X
  - Example: reduce the amount of time for a novice user to perform task X in Version 2.0
  - Be as specific as possible
* Then, quantify each goal as a usability specification
  - Example: currently 35 seconds to perform task X ("current level"); reduce to 25 seconds ("target level")

CREATING USABILITY SPECIFICATIONS

What are the constraints in the user or work context? How can the setting be made more realistic? (Ecological validity)
* A usability lab can be a "sterile work environment"
* Does the task require a telephone, or more than one person or role?
* Does the task involve background noise?
* Does the task involve interference or interruption? (A phone can be used for this, too.)

Experimental design must take into account trade-offs among usability characteristics and user groups:
* Watch out for the potential trade-off between learnability for new users and performance power for experienced users

USABILITY SPECIFICATION TABLE

Some additional example usability specifications:

Spec 1:
  Usability Attribute: Initial performance
  Measuring Instrument: "add appt" task per Benchmark 1
  Value to be Measured: Length of time to add appt on first trial
  Current: 20 secs. (competitor system) | Target: 15 secs. | Observed: (pending)

Spec 2:
  Usability Attribute: Initial performance
  Measuring Instrument: "add appt" task per Benchmark 1
  Value to be Measured: Number of errors
  Current: 2 | Target: 1 | Observed: (pending)

USABILITY SPECIFICATIONS HELP MANAGE THE USABILITY ENGINEERING PROCESS

This is the control of the usability engineering life cycle:
* A quantifiable end to the process
* Accountability
* Stop iterating when target-level usability specifications are met
* It is expected that you will not meet all usability target levels on the first iteration
  - If usability target levels are met on the first iteration, they may have been too lenient
  - The point is to uncover usability problems
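"Stop iterating when target levels are met" amounts to a conjunction over all the rows in the specification table. A sketch, using hypothetical (name, target, observed) triples where lower values are better (times, error counts):

```python
# Decide whether another design/evaluate iteration is needed:
# iterate again unless every spec's observed result meets its target.
# The spec triples below are hypothetical examples, lower-is-better.

def targets_met(specs):
    """True only if every observed value meets its target level."""
    return all(observed <= target for _, target, observed in specs)

iteration_1 = [
    ("add appt: time on first trial (secs)", 15.0, 17.2),
    ("add appt: number of errors", 1, 3),
]
iteration_2 = [
    ("add appt: time on first trial (secs)", 15.0, 14.1),
    ("add appt: number of errors", 1, 1),
]

print(targets_met(iteration_1))  # False -> iterate again
print(targets_met(iteration_2))  # True  -> can stop iterating
```

As the slides note, judgment still applies on top of this check: meeting every target on the first iteration suggests the targets were too lenient, not that the design is finished.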

USABILITY SPECIFICATIONS HELP MANAGE THE USABILITY ENGINEERING PROCESS

Bottom line: good engineering judgment is important:
* For setting levels (especially the target level)
* For knowing whether specifications are "reasonable"

Videoclip: Setting usability specifications

TEAM EXERCISE: USABILITY SPECIFICATIONS

Goal:
* To gain experience in writing precise, measurable usability specifications

Activities:
* Produce three usability specifications: two based on objective measures, and one based on a subjective measure.
* For each specification with an objective measure, write a brief but specific benchmark task description for participants to perform. Write two different benchmark task descriptions, each on a separate sheet. Make them a little complicated, including some navigation.

TEAM EXERCISE: USABILITY SPECIFICATIONS

* Specifications with objective measures should be evaluatable, via the benchmark tasks, in a later class exercise on formative evaluation. Develop tasks that you can "implement" in your next exercise, to build a rapid prototype.
* The specification with the subjective measure should be based on the questionnaire supplied. Select 3 or 4 items from the questionnaire.

Cautions and hints:
* Don't spend any time on design in this exercise; there will be time for detailed design in the next exercise.
* Don't plan to give users any training.

TEAM EXERCISE: USABILITY SPECIFICATIONS

Deliverables:
* Three usability specifications, in the form shown on the transparency
* Questionnaire question numbers included in the subjective specification
* Benchmark task descriptions, each on a separate sheet of paper

Completed by: about 30 minutes; 40 minutes max.