Usability (ISO 9241-11)
"The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction, in a specified context of use."
- Effectiveness
- Efficiency
- Satisfaction
- Context of use: user, task, environment
Nielsen's Model of Usability
Operationalizing Usability
- How do we assess usability criteria?
- What measures? What thresholds?
- What is "usable enough"?
Simple Observation
- The user is given a task, and the evaluator just watches.
- Problem: no insight into the user's decision process or attitude.
Think-aloud Protocol
- Subjects are asked to say what they are thinking and doing:
  - what they believe is happening
  - what they are trying to do
  - why they took an action
- Gives insight into what the user is thinking.
Think-aloud Protocol: Problems
- Awkward/uncomfortable for the subject (thinking aloud is not normal!).
- Thinking about the task may alter how people perform it.
- It is hard to talk while concentrating on a problem.
- Still the most widely used method in industry.
Other Problems with Think-aloud
Retrospective Think-aloud
- Recall the problems with think-aloud: awkward for the subject, thinking aloud may alter how people perform the task, and it is hard to talk while concentrating.
- Solution: videotape the session, then perform a retrospective think-aloud over the recording.
- Has its own problems:
  - awkwardness of watching themselves on video
  - awkwardness of reliving mistakes
  - reflection on the experience, rather than thinking in context
Co-discovery Learning
- Two people work together on a task.
- The normal conversation between the two users is monitored.
- Removes the awkwardness of think-aloud; more natural.
- Provides insight into the thinking processes of both users.
Field Studies
- Observe in the field, i.e., in the natural environment:
  - sit and observe
  - video records
  - join the culture (ethnography)
- Requires that the system be fully deployed.
- Highest degree of realism.
- Can be highly specific to the particular setting.
- Can take a long time.
Recording Observations
- Paper and pencil
  - primitive but cheap
  - evaluators record events, interpretations, and extraneous observations
  - the evaluator can seem disengaged
  - problem: writing is slow; prepared coding schemes can help (just tick off events)
- Audio recording
  - captures discussion (think-aloud, co-discovery)
  - hard to synchronize streams (e.g., with interface actions); (expensive) tools exist to help
  - transcription is slow and difficult; tools exist to help
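A prepared coding scheme amounts to tallying predefined event codes as they occur. A minimal sketch of the idea in Python; the event codes and the logged session are hypothetical:

```python
from collections import Counter

# Hypothetical coding scheme: each observed event is logged during the
# session as a (timestamp_in_seconds, code) pair.
EVENT_CODES = {"ER": "error", "HS": "hesitation",
               "HE": "help lookup", "TC": "task complete"}

log = [(12, "HS"), (31, "ER"), (48, "ER"), (75, "HE"), (112, "TC")]

# Tick off events per code, just as on a paper coding sheet.
tally = Counter(code for _, code in log)
for code, label in EVENT_CODES.items():
    print(f"{label}: {tally.get(code, 0)}")
```

The same tally sheet works on paper; the point of a fixed code set is that counting is fast during observation and trivial to aggregate afterwards.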
Recording Observations (cont.)
- Video recording
  - can see what a user is doing (good to use one camera/scan converter for the screen plus one for the subject)
  - can be intrusive (at least initially)
  - analysis can be challenging; annotation is time consuming and dull
- Companies often build usability labs with one-way mirrors, video cameras, etc.
Analyzing Observation Data
- Qualitative data: interpreted to tell a story, or categorized.
- Quantitative data: presented as values, tables, charts, and graphs; often treated with statistical tests.
- How do you know which analysis is appropriate? It depends on what you are using it for.
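For quantitative observation data such as task-completion times, the usual first step before any statistical test is a descriptive summary. A minimal sketch using only the standard library; the timing data and the two interface versions are hypothetical:

```python
import statistics

# Hypothetical task-completion times (seconds) for two interface versions.
version_a = [41, 55, 38, 62, 47, 51]
version_b = [33, 40, 29, 45, 36, 38]

for name, times in (("A", version_a), ("B", version_b)):
    print(f"Version {name}: mean={statistics.mean(times):.1f}s, "
          f"sd={statistics.stdev(times):.1f}s, n={len(times)}")

# A formal comparison of the two versions would then apply a statistical
# test, e.g. an independent-samples t-test (scipy.stats.ttest_ind).
```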
Querying Users with Interviews
- Conversations with a purpose; excellent for pursuing specific issues.
- More interactive than observation: can address specific issues of interest.
- More flexible than questionnaires: can probe more deeply on interesting issues as they arise.
- Problems:
  - accounts are subjective
  - time consuming (to conduct and to analyze)
  - the evaluator can bias the interview
  - prone to rationalization of events/thoughts by the user
  - the user's reconstruction may be wrong
Planning the Interview
- General questions:
  - What is the purpose of the interview?
  - How many people? (breadth vs. depth)
  - Length of interview and number of sessions?
  - Scheduling of interviews (location, times, people)?
  - Will the interview be recorded (audio, video, transcription)?
- Avoid:
  - asking long questions
  - using compound sentences
  - using jargon
  - asking leading questions
- And in general, be alert to unconscious biases.
Interviews
- Three main types:
  1. open-ended / unstructured
  2. semi-structured
  3. structured (controlled, with pre-determined questions)
- Other categories (which can include the types above):
  4. group
  5. retrospective
Unstructured Interviews
- Most like a conversation; often go into depth.
- Open questions; exploratory.
- The key is to listen rather than talk: practice silence!
- Pros/cons:
  + rich data, including things the interviewer may not have considered
  - easy to go off the rails
  - time-consuming and difficult to analyze
  - impossible to replicate
Structured Interviews
- Predetermined questions (like a questionnaire, often with a flowchart).
- Closed questions; short, clearly worded questions.
- Confirmatory.
- Pros/cons:
  + replicable
  - potentially important detail can be lost
  - might be done better (more cheaply) with a questionnaire?
Semi-structured Interviews
- Between structured and unstructured; uses elements of both.
- In usability studies, unstructured and semi-structured interviews are the most common.
Group Interviews (Focus Groups)
- 3-10 people interviewed at one time.
- Usually has an agenda, but may be structured or unstructured.
- A skilled moderator is critical!
- Usually recorded.
- Pros/cons:
  + can accommodate diverse and sensitive issues
  + opinions are developed within a social context
  + a good way to locate "proto-users": the most articulate, imaginative participants, who can help later with participatory design
  - some interviewees may dominate
  - expensive: you usually pay participants plus a professional moderator
  - people may not know what they think (or may be afraid to express it)!
Retrospective Interview
- A post-test interview to clarify events that occurred during system use: record what happened, replay it, and ask about it.
- Pros/cons:
  + excellent for grounding a post-test interview
  + avoids erroneous reconstruction
  + users often offer concrete suggestions
  - requires a second session
- Example exchanges: "I didn't see it. Why don't you make it look like a button?" "Do you know why you never tried that option?"
Overview of an Exploratory Interview
1. Explain the purpose of the interview: allow time to get acquainted with the interviewee; provide understanding and background.
2. Enumerate activities: find out what the user does.
3. Explain work methods: find out how the user does things (skills and knowledge).
4. Trace interconnections: determine which other people and activities are related.
5. Identify performance issues: explore current problems and impediments to success.
Things You Uncover during Interviews
- Exceptions: lots of what people do is not in the manual; many jobs evolve to fit changing circumstances, and much of this is not documented; often management does not know about it.
- Domain knowledge: most people know a lot about their jobs and the people they work with; terminology, common phrases, specific details.
- Audio recording helps capture this; video recording also captures body language; written notes can provide context, but not always details.
Questionnaires (Surveys)
Querying Users with Questionnaires
- Closed or open questions.
- Gather evidence of wide general opinion, or of experiences after an experiment.
- Pros/cons:
  + preparation is expensive, but administration is cheap
  + can reach a wide subject group (e.g., by mail or email)
  + does not require the presence of the evaluator
  + results can be quantified
  - risk: low response rate and/or low-quality responses
Questionnaires: Designing Questions
- Establish the purpose of the questionnaire:
  - What information is sought?
  - How would you analyze the results?
  - What would you do with your analysis?
- Determine the audience you want to reach.
  - Typical: a random sample of between 50 and 1000 users of the product. Why a random sample?
- Test everything before sending it out: the wording, the timing, the validity, the analysis.
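Why a random sample? Drawing uniformly from the whole user base gives every user an equal chance of selection, avoiding the self-selection bias of surveying only volunteers or power users. A minimal sketch; the user list and sample size are hypothetical:

```python
import random

# Hypothetical user base: every registered user of the product.
all_users = [f"user{i:04d}" for i in range(1, 5001)]

# Uniform random sample without replacement.
random.seed(42)  # fixed seed only so the draw is reproducible
sample = random.sample(all_users, k=200)

print(len(sample), "distinct respondents selected")
```

In practice the sampling frame (who is in `all_users`) matters as much as the draw itself: a random sample of an unrepresentative list is still unrepresentative.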
Administering Questionnaires
- In person: requires time to administer, but gives the highest completion rate.
- Take home (conventional): permits subjects to answer on their own time, and responses may tend to be more free-form, but often subjects don't complete or return the questionnaire.
- Email: attachments may be a problem; response rates depend on trust in the source.
- Web-based forms: standardize formats and responses; scripts can ensure correct/complete answers.
- General issues: payment or incentives, anonymity, self-selection.
Styles of Questions: Open-ended
- Asks for opinions.
- Good for general subjective information, but difficult to analyze rigorously.
- E.g., "Can you suggest any improvements to the interface?"
Styles of Questions: Closed
- Restricts responses by supplying the choices of answers.
- Can be easily analyzed, but can still be hard to interpret if the questions and responses are not well designed!
- Alternative answers should be very specific.
- Vague: "Do you use computers at work: O often O sometimes O rarely"
- Specific: "In your typical work day, do you use computers: O over 4 hrs a day O between 2 and 4 hrs daily O between 1 and 2 hrs daily O less than 1 hr a day"
Styles of Questions: Scalar (Likert Scale)
- Measures opinions, attitudes, and beliefs.
- Asks the user to judge a specific statement on a numeric scale.
- The scale usually corresponds to agreement or disagreement with the statement.
- "Characters on the computer screen are hard to read": strongly agree ... strongly disagree
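Likert responses are ordinal, so the distribution, median, and mode are the safest summaries; reporting a mean of the codes is common but treats ordinal data as interval. A minimal sketch with hypothetical responses, coded 1 = strongly agree through 5 = strongly disagree:

```python
import statistics
from collections import Counter

# Hypothetical responses to "Characters on the screen are hard to read",
# coded 1 = strongly agree ... 5 = strongly disagree.
responses = [2, 1, 2, 3, 2, 4, 1, 2, 5, 2]

dist = Counter(responses)
print("distribution:", dict(sorted(dist.items())))
print("median:", statistics.median(responses))  # safe for ordinal data
print("mode:", statistics.mode(responses))
```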
Styles of Questions: Scalar (Semantic Differential Scale)
- Explores a range of bipolar attitudes about a particular item.
- Each pair of attitudes is represented as a pair of adjectives.
- "WebCT is:" clear ... confusing; attractive ... ugly
Styles of Questions: Multi-Choice
- The respondent is offered a choice of explicit responses.
- "How do you most often get help with the system?" (tick one): O on-line manual O paper manual O ask a colleague
- "Which types of software have you used?" (tick all that apply): O word processor O data base O spreadsheet O compiler
Styles of Questions: Ranked
- The respondent places an ordering on items in a list.
- Useful to indicate a user's preferences; a forced choice.
- "Rank the usefulness of these methods of issuing a command (1 = most useful, 2 = next most useful, ..., 0 if not used):"
  2 command line
  1 menu selection
  3 control-key accelerator
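One simple way to aggregate ranked answers across respondents is the mean rank per item, excluding "0 = not used" answers. A sketch under that assumption; the four respondents' rankings are hypothetical:

```python
import statistics

# Hypothetical rankings from four respondents (1 = most useful, 0 = not used).
rankings = [
    {"command line": 2, "menu selection": 1, "control-key accelerator": 3},
    {"command line": 1, "menu selection": 2, "control-key accelerator": 0},
    {"command line": 3, "menu selection": 1, "control-key accelerator": 2},
    {"command line": 2, "menu selection": 1, "control-key accelerator": 3},
]

# Mean rank per method, ignoring "not used" answers; lower is better.
for method in rankings[0]:
    used = [r[method] for r in rankings if r[method] != 0]
    print(f"{method}: mean rank {statistics.mean(used):.2f} "
          f"(used by {len(used)}/{len(rankings)})")
```

Note the caveat the exclusion introduces: an item that only a few respondents use, but rank highly, can get a better mean rank than a universally used one, so the usage count should always be reported alongside.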
Styles of Questions: Combining Open-ended and Closed
- Gets a specific response, but allows room for the user's opinion.
- "It is easy to recover from mistakes": disagree ... agree; comment: "the undo facility is great!"
Case Study: the Herman Miller Aeron Chair
- Comfort (Likert 1-10, target 7.5): initially scored 4.5; eventually inched up to 8 before release.
- Aesthetics (Likert 1-10): scored 2-3, and never rose above 6!
- There is usually a relationship between these two measures, but not here!
- Focus groups (to check on pricing): architects and designers liked it; facility managers and ergonomists hated it!
- The entire design was *actually* user friendly.
- Where is this chair today?
Considerations: Style of Question vs. Ease of Analysis
- Open-ended: poor
- Closed: depends on the type
  - choose one: easy
  - choose all that apply: somewhat easy*
  - ratings (scales): easy
  - rankings: somewhat easy
*Note: you can't really make a pie chart if the responses don't add up to 100%.
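The pie-chart caveat is easy to see by computing per-option percentages for a "tick all that apply" question: because each respondent can tick several options, the percentages sum past 100%, so a bar chart is the right presentation. A sketch with hypothetical answers:

```python
from collections import Counter

# Hypothetical "tick all that apply" answers from 5 respondents.
answers = [
    {"word processor", "spreadsheet"},
    {"word processor"},
    {"word processor", "data base", "compiler"},
    {"spreadsheet", "word processor"},
    {"compiler"},
]

n = len(answers)
counts = Counter(option for a in answers for option in a)
percentages = {option: 100 * c / n for option, c in counts.items()}

for option, pct in sorted(percentages.items()):
    print(f"{option}: {pct:.0f}%")
print("total:", sum(percentages.values()))  # exceeds 100% -> no pie chart
```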
Be Considerate of Your Respondents
- Not just because it's nice: it works better.
- Questionnaire length (short is good): think in terms of reasonable completion times, and do not ask questions whose answers you will not use!
- Privacy invasions: be careful how, and what, you ask.
- Motivation: why should the respondent bother? You usually need to offer something in return, but be careful about introducing bias.
Deployment Issues
- Online/HTML tools: the U of S survey tool, SurveyMonkey, etc.
- The choice of tool affects ease of analysis: can responses go directly into a database?
Presenting Questionnaire Results
- Example charts for "choose one" and "choose all that apply" questions.
Summary: Questionnaires
1. Establish the purpose.
2. Determine the audience.
3. Choose among the various administration methods (for different audiences).
4. Design the questions: many kinds, depending on what you want to learn.
5. Be considerate of your respondents.
6. Motivate your respondents (without biasing them).