Quality of Web Usability Evaluation Methods: an empirical study on MiLE+

Quality of Web Usability Evaluation Methods: An Empirical Study on MiLE+
Davide Bolchini, University of Lugano (Switzerland)
Franca Garzotto, Politecnico di Milano (Italy)

Outline
- Motivation
- Goals
- MiLE+
- Empirical study
  - Experiment 1: quick inspection
  - Experiment 2: usability project
- Results and discussion

Motivation
- A proliferation of Usability Evaluation Methods (UEMs)
- Different philosophies (different conceptions of quality, usability, and their interrelationships), and thus different approaches and techniques
- Every method should support the improvement of the usability of the application
- However, there is ongoing discussion (and little agreement) on the salient quality attributes for UEMs

Motivation - 2
Quality attributes concerning the output of the evaluation:
- Effectiveness: number of usability problems discovered
- Thoroughness: found problems vs. existing problems
- Reliability: consistency of results across different inspectors
- Validity: correctly predicting users' behaviour, with no (or minimized) false positives
- Productivity
- Scope
Quality attributes concerning acceptability and adoption:
- Learnability
- Applicability and compatibility with current practice
- Verticalization on domains
- Reusability
- Cost-effectiveness
Supporting arguments with empirical data

Goals
Evaluate MiLE+:
- Under ongoing development since 2000
- Used in consultancy projects and in training professionals, plus ca. 200 students a year
Focus on a few key attributes that we could measure in a realistic setting and that support effective adoption:
- Performance
- Efficiency
- Cost-effectiveness
- Learnability

Goals - 2
Verifying key acceptability requirements for a UEM:
- Becoming able to use the method after a reasonable time (1-3 person-days) of study
- Being able to detect the largest number of usability problems with the minimum effort
- Producing a first set of results in a few hours and a complete analysis in a few days

MiLE+: Milano-Lugano Evaluation
- MiLE+ is a usability inspection method
- Evolution of two previous usability methods: SUE (Systematic Usability Evaluation) and MiLE
- Borrows general concepts from mainstream usability inspection approaches
- Fosters a systematic, structured approach to the analysis, while aiming to be particularly suitable for novice evaluators

MiLE+ Activities Framework
[Diagram: the MiLE+ framework separates application-independent usability from application-dependent usability. On the application-independent side, an expert performs a technical inspection driven by a library of technical heuristics covering navigation, content, technology, semiotics, cognitive aspects, and graphics. On the application-dependent side, an expert performs a user experience inspection driven by a library of scenarios (a tool, not mandatory) and a library of User Experience Indicators (UEIs). User testing is then used to validate or invalidate the inspection results.]

MiLE+ Technical Heuristics (82)
Technical heuristics are coupled with a set of operational guidelines that suggest the inspection tasks to undertake in order to measure the various heuristics. They are organized by design dimension:
- Navigation (36): heuristics addressing the website's information architecture and navigation structure
- Content (8): heuristics addressing the general principles of content quality
- Technology/Performance (7): heuristics addressing technology-dependent features of the application
- Interface Design (31): heuristics addressing the semiotics of the interface, the graphical layout, and some cognitive aspects
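As a quick consistency check, the per-dimension counts add up to the total in the title: 36 + 8 + 7 + 31 = 82 heuristics.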

Operationalized Attributes - 1
Performance indicates the degree to which a method supports the detection of all existing usability problems for an application. It is operationalized as the average number of distinct problems found by an inspector (Pi) under given inspection conditions (e.g., the time at their disposal), divided by the total number of existing problems (Ptot):

Performance = avg(Pi) / Ptot
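To make the definition concrete, here is a minimal Python sketch of the computation; it is ours, not from the slides, and the function name, problem identifiers, and example data are illustrative assumptions.

```python
# Sketch of the MiLE+ performance metric: avg(Pi) / Ptot.
# Each inspector reports a set of distinct problem identifiers;
# known_problems is the reference list compiled by a team of experts.

def performance(inspections: list[set[str]], known_problems: set[str]) -> float:
    """Average fraction of known problems found per inspector."""
    found_counts = [len(found & known_problems) for found in inspections]
    avg_found = sum(found_counts) / len(inspections)
    return avg_found / len(known_problems)

# Hypothetical data: three inspectors, five known problems.
known = {"P1", "P2", "P3", "P4", "P5"}
reports = [{"P1", "P2"}, {"P1", "P3", "P4"}, {"P2"}]
print(performance(reports, known))  # (2 + 3 + 1) / 3 / 5 = 0.4
```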

Operationalized Attributes - 2
Efficiency indicates the degree to which a method supports the fast detection of usability problems. It is operationalized as the number of distinct problems identified by an inspector divided by the time spent, averaged over a set of inspectors:

Efficiency = avg(Pi / ti)

where Pi is the number of problems detected by the i-th inspector in a time period ti.
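A companion sketch for efficiency, under the same illustrative assumptions:

```python
# Sketch of the MiLE+ efficiency metric: avg(Pi / ti).

def efficiency(problem_counts: list[int], hours: list[float]) -> float:
    """Average problems-per-hour rate across inspectors."""
    rates = [p / t for p, t in zip(problem_counts, hours)]
    return sum(rates) / len(rates)

# Hypothetical data: three inspectors, each working for 3 hours.
print(efficiency([12, 18, 15], [3.0, 3.0, 3.0]))  # (4 + 6 + 5) / 3 = 5.0 problems/hour
```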

Operationalized Attributes - 3
Cost-effectiveness denotes the effort, measured in person-hours, needed by an evaluator to:
- carry out a complete evaluation of a significantly complex web application
- produce evaluation documentation that meets professional standards, i.e., a report that can be proficiently used by a (re)design team to address the usability problems

Operationalized Attributes - 4
Learnability denotes the ease of learning a method. We operationalize it by means of the following factors:
- the effort, in person-hours, needed by a novice to become reasonably expert and able to carry out an inspection activity with a reasonable level of performance
- the novice's perceived difficulty of learning, i.e., of moving from knowing nothing to feeling reasonably comfortable with the method and ready to undertake an evaluation
- the novice's perceived difficulty of applying the method in a real case

Empirical Study: General Conditions
The overall study involved 42 participants: students from the HCI course at Politecnico di Milano (Milano and Como campuses), acting as novice inspectors.
Preconditions:
- No previous exposure to usability
- Basic background in web development
- Heterogeneous profiles in terms of age and technical background
Preparatory conditions:
- 5 hours of classroom training on usability and MiLE+
- Assigned learning material to study (MiLE+ overview, technical heuristics library, 2 real-life case studies, excerpts from USABLE, an online course on usability and MiLE+)

Exp. 1: Quick Inspection
Inspectors: 16 graduate students (Como)
Purpose: measure efficiency and performance
- Learnability hypothesis: study effort to become proficient <= 2 days
Assigned inspection goal: inspect a website (Cleveland Museum of Modern Art) with the MiLE+ technical inspection
Setting:
- Synchronous and syntopic individual inspection
- 3-hour time limit
- Limited inspection scope (2 main sections, around 300 page instances)
- One week after the MiLE+ classes
Output produced: inspection notes including, for each usability problem, its name, design dimension, description (max 3 lines), and page URL

Key Results: From Zero to Hero (Experiment 1)
- Average number of problems discovered: 14.8
- Hourly efficiency: avg 4.9 problems per hour
- Existing usability problems (found by a team of experts): 41
- Performance: 36%
After 6 hours of training and a maximum of 15 hours of study of MiLE+, a novice can become able to detect more than one third of the existing usability problems.
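These figures are mutually consistent with the operational definitions above: 14.8 problems found in the 3-hour session gives 14.8 / 3 ≈ 4.9 problems per hour, and 14.8 / 41 ≈ 0.36, i.e., the reported 36% performance.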

Exp. 2: Usability Project
Inspectors: 26 graduate students (Milano)
Purpose: measure the perceived difficulty in learning and using MiLE+, and the effort needed to produce a professional evaluation report
Assigned inspection goal: inspect a website of the team's choice (full application)
Setting:
- Asynchronous team inspection
- Two-month period
Output produced: complete usability evaluation report

Key Results: Learning Effort
Time invested in learning MiLE+ before and during the project (% of students):
- <10 hours: 75%
- 10-13 hours: 8%
- 14-17 hours: 10%
- >18 hours: 7%

Key Results: Learning Difficulty
Perceived difficulty in learning the various MiLE+ tasks (% of students):

                              Very simple  Simple  Difficult  Very difficult
Overall difficulty                25%        73%       1%          1%
Technical Inspection              22%        74%       3%          1%
User Experience Inspection        37%        61%       0%          2%

Key Results: Difficulty in Using the Method
Perceived difficulty in using the various MiLE+ tasks (% of students):

                              Very simple  Simple  Difficult  Very difficult
Overall difficulty                21%        79%       0%          0%
Scenario definition               44%        47%       9%          0%
Technical Inspection              41%        45%      14%          0%
User Experience Inspection        24%        57%      19%          0%

Key Results: Individual Effort
Estimated effort for the entire evaluation process (% of students):
- 20-30 person-hours: 54%
- 30-40 person-hours: 32%
- 40-50 person-hours: 12%
- 50-60 person-hours: 2%

Key Results: Individual Effort per Task
Estimated effort for the entire evaluation process, by task (% of students):

Person-hours  Scenario definition  Technical Inspection  User Experience Inspection  Intra-team agreement  Documentation
3-5                  94%                   0%                      8%                        0%                  0%
5-10                  3%                  69%                     51%                        4%                  1%
10-15                 0%                  31%                     41%                       39%                 23%
15-20                 0%                   0%                      0%                       40%                 34%
20-25                 0%                   0%                      0%                       18%                 42%

Conclusions
- To promote usability evaluation methods for adoption, learnability and cost-effectiveness must be fostered
- We have empirically substantiated the adoption suitability of MiLE+, with encouraging results: the method is performant, efficient, cost-effective, and easy to learn and use
Next steps:
- Comparative evaluation (Nielsen's heuristics, more unstructured approaches)
- Ongoing effort for transferring MiLE+ to industry: courses for professionals (Google - CH)