The USE Project: Bridging the Gap between Usability Evaluation and Software Development
Als, B. 1, Frøkjær, E. 2, Hornbæk, K. 2, Høegh, R. 1, Jensen, J. 1, Nørgaard, M. 2, Skov, M. 1, Stage, J. 1 & Uldall-Espersen, T. 2

1 Department of Computer Science, Aalborg University, {als,runethh,jjj,dubois,jans}@cs.aau.dk
2 Department of Computing, University of Copenhagen, {erikf,kash,mien,tobias}@diku.dk

Introduction

We describe the USE project, a research project running from 2005 to 2008 between researchers from Aalborg University and the University of Copenhagen and developers and managers from the software industry. Despite at least 20 years of research into usability engineering, there is still a significant gap between usability evaluation and software design. Currently, an increasing number of development projects employ usability-engineering techniques in an attempt to improve the quality of the software produced. The techniques most commonly taken up in industry are various forms of usability evaluation (Vredenburg et al. 2002), and the cost-benefit arguments for including usability evaluation in software development activities are strong (Bias & Mayhew 1994). While these evaluations often help identify an exhaustive list of usability problems with the software, they typically have very limited impact on the subsequent development activities. This lack of impact has severe consequences, including unsuitable user interface designs, limited support for the core work activities of users, and a lack of productivity gains despite heavy investments in information technology (Gould & Lewis 1985; Brooks 1987; Landauer 1995).

Current research fails to address why usability evaluation techniques do not impact practical software development. We see three main reasons for this. First, most research studies focus on generating reports that list usability problems with a certain piece of software.
Recently, Wixon (2003) argued that feedback through lists of problems has limited relevance in practical software development, because it ignores that problems should be fixed, not just found. Thus, feedback from evaluations is not design-oriented. Second, very few studies of usability evaluation are conducted in real industrial settings. Among the studies listed in recent reviews (Gray & Salzman 1998; Hartson et al. 2001), none were conducted in the context of software systems development. Thus, those studies fail to take into account the complexity of real software development. Third, research is only beginning to address how developers understand and assess usability problems (Dumas et al. 2004; Hornbæk & Frøkjær 2004). Therefore, we do not know in which form feedback should be provided to development projects, nor what kind of feedback to present to developers and managers.

The USE project aims at increasing the impact of usability evaluations on industrial software development, bridging the gap between usability evaluation and software design described above. We do so by (a) exploring tools and techniques for extending feedback from usability evaluations beyond a mere listing of problems and (b) developing and evaluating our ideas for integrating usability evaluation and software design in industrial software development projects.

Current work in USE

To illustrate current work in the USE project, we describe below six subprojects that are either published (see references below) or under review.

Does usability work pay off? (Uldall-Espersen 2005)

The value of work to improve the usability of information systems in industrial settings is rarely accounted for. This subproject concerns an experiment in which an administration and risk management system in a bank was improved through three versions during a period of six months. The system has ten users, and usability data was collected through questionnaires and logging of data from practical use. The experiment shows how it was possible to improve the system over a broad range of measures covering efficiency, effectiveness, and user satisfaction. An analysis of the return on investment (ROI) of this usability work shows a payback time of five years, but in this case the most important improvements were found in issues not accounted for by the ROI.
These issues, such as an expected increase in use of the system and postponement of a replacement of the system, alone justified the usability work. This indicates how ROI analyses are at risk of missing the most important issues.

Are usability reports any good? (Nielsen et al. 2005)

A usability evaluation is a potentially rich basis for understanding and improving the design of the interaction between an interactive system and its users. Realizing this potential requires a well-functioning interplay between interaction design and usability evaluation. This subproject explores how to provide feedback from a user-based usability evaluation to the designers of the user interaction. The subproject involved two experiments: one compared the traditional usability report to direct observation of the evaluation, and the other assessed the impact of the traditional usability report and its individual elements on the developers. The results indicate that usability reports have a strong impact, as they can significantly change developers' opinion about a system, especially about its weaknesses. Direct observation of user-based usability tests by a few developers has a similarly strong impact, but is difficult to disseminate to other developers and does not facilitate systematic work with the identified usability problems.

How do usability evaluators conduct think-aloud tests? (under review)

Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. This subproject is based on a field study of 14 think-aloud sessions, the audio recordings of which were examined in detail. The study shows that analysis of observations made in a think-aloud session is done sporadically, if at all. When running a test, evaluators are already aware of problems that they seek to confirm. During testing, evaluators often ask users about their expectations and about hypothetical situations, rather than about experienced problems. In addition, evaluators only infrequently get input about the utility of the system being tested compared to input about its usability. The study shows how practical realities rarely discussed in the literature on usability evaluation influence sessions, for example in the form of time pressure and evaluators not knowing the system they evaluate.
The subproject raises a number of questions for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.

How should we understand the evaluator effect? (under review)

The evaluator effect means that usability evaluators working under similar conditions identify substantially different sets of usability problems. Yet little is known about the factors involved in the evaluator effect. This subproject consists of a study of 50 novice evaluators' usability tests and subsequent comparisons, in teams and individually, of the resulting usability problems. In addition, 10 HCI researchers independently analyzed the problems. The study shows an agreement between evaluators of about 40%, indicating a substantial evaluator effect. Matching problems in teams improves this agreement, and evaluators express greater satisfaction with the teams' matchings. The matchings of individuals, teams, and independent researchers show an evaluator effect of similar size; yet they fundamentally disagree about which problems are similar. We argue that previous claims in the literature about the evaluator effect are questionable, and propose an alternative view of evaluator agreement.

How to test for children? (Als et al. 2005)

Constructive interaction provides natural thinking aloud as test subjects collaborate to solve tasks. It has been suggested that children may find constructive interaction easier to use than traditional think-aloud. However, the relationship between think-aloud and constructive interaction is still poorly understood. This subproject comprises an experiment that compared think-aloud and constructive interaction in usability testing. The experiment involved 60 children in three setups, in which children applied think-aloud and constructive interaction in acquainted and non-acquainted dyads. Our results showed that the pairing of children had a major impact on the identification of usability problems, as acquainted dyads identified more problems than both non-acquainted dyads and the think-aloud children. Furthermore, the pairing of children strongly influenced the children's thinking aloud, as the acquainted dyads verbalized more than the think-aloud children. Finally, the acquainted pairs reported less frustration during the test despite identifying more problems.

Conclusion

We have described the motivation for the USE project and some of the current work in the project.
The key goals of the project are to deliver:

- A catalogue of techniques and tools for usability evaluation and feedback, with measures of impact on design, that are empirically validated in industrial settings
- Documentation of relations between usability evaluation and feedback in industrial projects
- Integration of usability evaluation in software development in the participating companies
- Courses, training programs and a conference for practitioners to disseminate the practical experience to the broader software industry

The work exemplified above is the first step towards reaching those goals and better integrating usability evaluation and software development.
References

Als, B., Jensen, J., & Skov, M. (2005), "Creating Usability Testing Settings for Children: Exploring Different Setups", Proceedings of Interact 2005, LNCS 3585, Berlin: Springer-Verlag.

Bias, R. G. & Mayhew, D. J. (1994), Cost-Justifying Usability, San Francisco, CA: Morgan Kaufmann.

Brooks, F. P. (1987), "No Silver Bullet: Essence and Accidents of Software Engineering", IEEE Computer, 20, 4.

Dumas, J., Molich, R., & Jeffries, R. (2004), "Describing Usability Problems: Are We Sending the Right Message?", interactions, 11, 4.

Gould, J. D. & Lewis, C. (1985), "Designing for Usability: Key Principles and What Designers Think", Communications of the ACM, 28, 3.

Gray, W. D. & Salzman, M. C. (1998), "Damaged Merchandise? A Review of Experiments That Compare Usability Evaluation Methods", Human-Computer Interaction, 13, 3.

Hartson, H. R., Andre, T. S., & Williges, R. C. (2001), "Criteria for Evaluating Usability Evaluation Methods", International Journal of Human-Computer Interaction, 13, 4.

Hornbæk, K. & Frøkjær, E. (2004), "Usability Inspection by Metaphors of Human Thinking Compared to Heuristic Evaluation", International Journal of Human-Computer Interaction, 17, 3.

Landauer, T. (1995), The Trouble with Computers, Cambridge, MA: MIT Press.

Nielsen, C., Overgaard, M., Pedersen, M., & Stage, J. (2005), "Feedback from Usability Evaluation to User Interface Design: Are Usability Reports Any Good?", Proceedings of Interact 2005, LNCS 3585, Berlin: Springer-Verlag.

Uldall-Espersen, T. (2005), "Benefits of Usability Work: Does It Pay Off?", International COST294 Workshop on User Interface Quality Models, September 2005, Rome, Italy.

Vredenburg, K., Mao, J.-Y., Smith, P. W., & Carey, T. (2002), "A Survey of User-Centered Design Practice", Proceedings of CHI 2002.

Wixon, D. (2003), "Evaluating Usability Methods: Why the Current Literature Fails the Practitioner", interactions, 10, 4.