[Project Name] Usability Test




[Project Name] Usability Test
Final Report
Month/Day/Year
Prepared by: epubs Usability Testing Group, Genuity

Table of Contents

Purpose/Synopsis
Problem Statement & Test Objectives
User Profiles
Test Environment
Test Monitor Role
Method/Methodology
Task List
Testing Results
Details of the Actions of Each Tester
Discussion Section
Appendix A - Orientation Script
Appendix B - General Background Questionnaire
Appendix C - Pre-Test Questionnaire
Appendix D - Sample Task List for Testers
Appendix E - Debriefing Questionnaire

Purpose/Synopsis
Explain the purpose of the test. This should be a high-level overview describing the issues and questions that you wanted to resolve. Provide any background information that is applicable. Give a brief, high-level outline of your findings.

Problem Statement & Test Objectives
This section should include known issues and questions that require a resolution. Problem statements should be concise and unambiguous. This section should be only a few sentences long.

User Profiles
Describe each of your testers. Outline their level of experience with the site being tested, as well as any prior knowledge of the subject matter at hand. Describe the testers' general knowledge of the web. Examples of questionnaires that can be used to gather this information are included in Appendix B and Appendix C.

Test Environment
Describe the environment where the testing took place. Describe how the area was set up, what kind of hardware was used, and any distractions that may have been present.

Test Monitor Role
Explain the role of the test monitor in the testing process. Outline how the monitor interacted with both the tester and test subjects.

Method/Methodology
Provide a detailed description of the testing process that you used. Give an overview of each part of the test, from the time the participants arrive until they leave. The methodology should be detailed enough that someone who reads this portion of the usability test can easily replicate the test. This section should also serve as a good outline for observers to follow during the test.

Task List
Include the list of tasks that you gave to the participants. Define the criteria that you used to determine whether a task was successfully completed. List the maximum amount of time that you allowed the participants to complete each task. A complete sample task list is provided in Appendix D.

MTC = Maximum Time to Completion
SCC = Successful Completion Criteria

Task 1: Define task participant will complete.
MTC: 5 minutes
SCC: Define Successful Completion Criteria

Task 2: Define task participant will complete.
MTC: 5 minutes
SCC: Define Successful Completion Criteria

Task 3: Define task participant will complete.
MTC: 5 minutes
SCC: Define Successful Completion Criteria

Task 4: Define task participant will complete.
MTC: 5 minutes
SCC: Define Successful Completion Criteria

Task 5: Define task participant will complete.
MTC: 5 minutes
SCC: Define Successful Completion Criteria
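As an illustration only (not part of the original template), the task list above could be recorded in a small data structure so that each attempt can later be checked against its MTC. The task entries and times below are hypothetical placeholders:

```python
# Illustrative sketch: recording tasks with their MTC and SCC.
# MTC = Maximum Time to Completion, SCC = Successful Completion Criteria.
MTC_SECONDS = 5 * 60  # 5 minutes per task, as in the template above

tasks = [
    {"number": 1, "description": "Define task participant will complete.",
     "scc": "Define Successful Completion Criteria", "mtc": MTC_SECONDS},
    {"number": 2, "description": "Define task participant will complete.",
     "scc": "Define Successful Completion Criteria", "mtc": MTC_SECONDS},
]

def within_time_limit(task, elapsed_seconds):
    """An attempt can only count as successful if it meets the task's SCC
    and also finishes within the task's MTC."""
    return elapsed_seconds <= task["mtc"]

print(within_time_limit(tasks[0], 250))  # True: 250 s is under the 5-minute MTC
print(within_time_limit(tasks[1], 340))  # False: over the limit
```

A structure like this keeps the pass/fail timing rule in one place, so every tester's results are judged against the same limits.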

Testing Results
List whether or not each participant was able to complete the various tasks and, if so, how long it took them to do so. An example is included below.

Legend
✓  Completed Successfully
F  Failed (Considered Failure)
!  Stopped (Considered Failure)

Task Number 1 (Tester, Result, and Time)
1. ! 0.00 (000)
2. ✓ 0.00 (000)
3. ✓ 0.00 (000)
4. ✓ 0.00 (000)
5. ! 0.00 (000)

Overall Final Results for the Usability Test (000 total tasks, worth 0.0% each):
Task Success Percentage: 00.0%
Task Failure Percentage: 00.0%
Mean: 0.00
Median: 0.00 (Tester X & X)
Range: 0.00 (Tester X & X)
Overall Success Percentage, Usability Test: 00.00%
Overall Failure Percentage, Usability Test: 00.00%

Task Number 2 (Tester, Result, and Time)
1. ! 0.00 (000)
2. ✓ 0.00 (000)
3. ✓ 0.00 (000)
4. ✓ 0.00 (000)
5. ! 0.00 (000)

Overall Final Results for the Usability Test (000 total tasks, worth 0.0% each):
Task Success Percentage: 00.0%
Task Failure Percentage: 00.0%
Mean: 0.00
Median: 0.00 (Tester X & X)
Range: 0.00 (Tester X & X)
Overall Success Percentage, Usability Test: 00.00%
Overall Failure Percentage, Usability Test: 00.00%

Task Number 3 (Tester, Result, and Time)
1. ! 0.00 (000)
2. ✓ 0.00 (000)
3. ✓ 0.00 (000)
4. ✓ 0.00 (000)
5. ! 0.00 (000)

Overall Final Results for the Usability Test (000 total tasks, worth 0.0% each):
Task Success Percentage: 00.0%
Task Failure Percentage: 00.0%
Mean: 0.00
Median: 0.00 (Tester X & X)
Range: 0.00 (Tester X & X)
Overall Success Percentage, Usability Test: 00.00%
Overall Failure Percentage, Usability Test: 00.00%

Task Number 4 (Tester, Result, and Time)
1. ! 0.00 (000)
2. ✓ 0.00 (000)
3. ✓ 0.00 (000)
4. ✓ 0.00 (000)
5. ! 0.00 (000)

Overall Final Results for the Usability Test (000 total tasks, worth 0.0% each):
Task Success Percentage: 00.0%
Task Failure Percentage: 00.0%
Mean: 0.00
Median: 0.00 (Tester X & X)
Range: 0.00 (Tester X & X)
Overall Success Percentage, Usability Test: 00.00%
Overall Failure Percentage, Usability Test: 00.00%

Task Number 5 (Tester, Result, and Time)
1. ! 0.00 (000)
2. ✓ 0.00 (000)
3. ✓ 0.00 (000)
4. ✓ 0.00 (000)
5. ! 0.00 (000)

Overall Final Results for the Usability Test (000 total tasks, worth 0.0% each):
Task Success Percentage: 00.0%
Task Failure Percentage: 00.0%
Mean: 0.00
Median: 0.00 (Tester X & X)
Range: 0.00 (Tester X & X)
Overall Success Percentage, Usability Test: 00.00%
Overall Failure Percentage, Usability Test: 00.00%
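As an illustration only (not part of the original template), the summary statistics used in the results tables — success percentage, failure percentage, mean, median, and range of completion times — could be computed from recorded per-tester results like this. The result codes and timings below are hypothetical:

```python
from statistics import mean, median

# Hypothetical results for one task: (tester, result, seconds).
# "S" = completed successfully, "F" = failed, "!" = stopped early.
results = [
    (1, "S", 95),
    (2, "S", 140),
    (3, "F", 300),
    (4, "S", 120),
    (5, "!", 210),
]

times = [t for _, _, t in results]
successes = sum(1 for _, r, _ in results if r == "S")

success_pct = 100.0 * successes / len(results)
failure_pct = 100.0 - success_pct

print(f"Task Success Percentage: {success_pct:.1f}%")  # 60.0%
print(f"Task Failure Percentage: {failure_pct:.1f}%")  # 40.0%
print(f"Mean: {mean(times):.2f}")                      # 173.00
print(f"Median: {median(times)}")                      # 140
print(f"Range: {max(times) - min(times)}")             # 205
```

Computing the figures mechanically like this avoids arithmetic slips when the same statistics are repeated for every task in the report.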

Details of the Actions of Each Tester
Outline the actions of each participant as they attempted to complete the tasks. An example is given below.

Legend
✓  Completed Successfully
F  Failed (Considered Failure)
!  Stopped (Considered Failure)

Participant #1
General background of participant, and specifically their knowledge of the product (if any).

Overall Tasks

Task 1  STATUS: ✓ ! F  TIME: 0:00
1. Action 1.
2. Action 2.
3. Action 3.
4. Action 4.

Debriefing/Post-Test Chat Session
Include an outline of the details of any conversation that took place after the testing. Include the participant's thoughts about the test, as well as any suggestions they make about improving the quality of the site being tested. A questionnaire that can be used for debriefing is included in Appendix E.

Participant #2
General background of participant, and specifically their knowledge of the product (if any).

Overall Tasks

Task 1  STATUS: ✓ ! F  TIME: 0:00
1. Action 1.
2. Action 2.
3. Action 3.
4. Action 4.

Debriefing/Post-Test Chat Session
Include an outline of the details of any conversation that took place after the testing. Include the participant's thoughts about the test, as well as any suggestions they make about improving the quality of the site being tested. A questionnaire that can be used for debriefing is included in Appendix E.

Participant #3
General background of participant, and specifically their knowledge of the product (if any).

Overall Tasks

Task 1  STATUS: ✓ ! F  TIME: 0:00
1. Action 1.
2. Action 2.
3. Action 3.
4. Action 4.

Debriefing/Post-Test Chat Session
Include an outline of the details of any conversation that took place after the testing. Include the participant's thoughts about the test, as well as any suggestions they make about improving the quality of the site being tested. A questionnaire that can be used for debriefing is included in Appendix E.

Participant #4
General background of participant, and specifically their knowledge of the product (if any).

Overall Tasks

Task 1  STATUS: ✓ ! F  TIME: 0:00
1. Action 1.
2. Action 2.
3. Action 3.
4. Action 4.

Debriefing/Post-Test Chat Session
Include an outline of the details of any conversation that took place after the testing. Include the participant's thoughts about the test, as well as any suggestions they make about improving the quality of the site being tested. A questionnaire that can be used for debriefing is included in Appendix E.

Participant #5
General background of participant, and specifically their knowledge of the product (if any).

Overall Tasks

Task 1  STATUS: ✓ ! F  TIME: 0:00
1. Action 1.
2. Action 2.
3. Action 3.
4. Action 4.

Debriefing/Post-Test Chat Session
Include an outline of the details of any conversation that took place after the testing. Include the participant's thoughts about the test, as well as any suggestions they make about improving the quality of the site being tested. A questionnaire that can be used for debriefing is included in Appendix E.

Discussion Section
Include information about the specific areas of the site that require further investigation and updating. Describe in detail any "hot spots," or areas that consistently caused problems. Describe any trends that you identified during the course of the testing. Provide suggestions for how to alleviate any problems.

Specific Hot Spots
Several glaring hot spots were uncovered during the testing, including the issues discussed below.

1. Difficulty in XXXX. Describe here the issue/hot spot the participants encountered.
Question: Describe the question this hot spot related to. Give success/failure percentages to bolster the issue.

2. Difficulty in XXXX. Describe here the issue/hot spot the participants encountered.
Question: Describe the question this hot spot related to. Give success/failure percentages to bolster the issue.

3. Difficulty in XXXX. Describe here the issue/hot spot the participants encountered.
Question: Describe the question this hot spot related to. Give success/failure percentages to bolster the issue.

Trends
Detail and discuss the trends you saw both in the test and in the way it was administered, and the issues you uncovered from the test. Usually one or two paragraphs.

Suggestions for Change
Repeat the question that you used as a specific "hot spot."
Recommend a solution to the question described above.

Appendix A - Orientation Script

Orientation Script
Hello. My name is [Insert Name] from epubs, and I will be working with you and monitoring you in today's session. [Any other observers] will also be observing your testing today.

Let me explain why we have asked you to come in today. You will be performing some typical tasks participants might do to find specific information about the [specific software/web site]. I ask you to please try to work and find the information in the way that you would normally work. For example, please try to work at the same speed and with the same attention to detail that you normally would. Please do your best, but do not be concerned with the results: this is a test of the format of the [specific software/web site], and not a test of you. We are trying to uncover the portions of the [specific software/web site] that are useful and those that need improvement.

Please feel free to think or talk out loud while performing the various tasks that are part of the test. You may ask questions at any time, but I may not answer them, since this is a study of the product, the documentation, and the documentation format, and part of the test is seeing how the information works with a person such as yourself working independently.

During our session today, I will also be asking you to complete some forms and answer some questions. It is very important that you answer truthfully. My only role here today is to discover both the flaws and advantages of this product from your perspective. Please do not answer questions based on what you think [observers] or I may want to hear. I need to know exactly what you honestly think.

The test today is divided into X sections, with each section having X tasks. The first section focuses on [section 1] and tasks related to that, the second section focuses on [section 2] and tasks related to that, and the third section focuses on [section 3] and tasks related to that.

While you are working, the observers and I will be sitting or standing nearby, taking notes and timing the tasks. Also, we will be recording your voice during the session to keep a record of your thoughts during the testing period.

Do you have any questions? [If there are no questions, proceed to the questionnaires.]

Appendix B - General Background Questionnaire

Question: General Computer Experience (non-Internet)
Possible Answers: 0 to 2 years / 2 to 4 years / 4+ years

Question: General Computer Experience (Internet-specific)
Possible Answers: 0 to 2 years / 2 to 4 years / 4+ years

Question: Computer Interaction Experience
Possible Answers: Windows, GUI, or Web-based / Character-based (keyboard commands, no mouse) / Both

Question: Education Level
Possible Answers: High School / Bachelor's Degree / Graduate Degree

Question: Age
Possible Answers: 20 to 30 / 30 to 40 / 40 to 50

Question: Gender
Possible Answers: Male / Female

Appendix C - Pre-Test Questionnaire

Pre-Test Questionnaire (Tester # )
Software/Web Site Use Questionnaire

1. Did you ever use the [software/web site] for [insert purposes]?
Possible Answers: Yes / No

2. Have you ever used the World Wide Web before to view software documentation?
Possible Answers: Yes / No

3. Do you prefer obtaining information from books or online (World Wide Web or Help Files)?
Possible Answers: Books / Online (World Wide Web or Help Files)

4. Have you ever used the [software/web site] before to obtain information about the [software/web site]?
Possible Answers: Yes / No

5. If you answered yes above, how often do you use the [software/web site] to obtain information about the [software/web site] or particular tasks?
Possible Answers: Once a day / Once a week / Once a month / Never

6. Please rank in importance, from 1 to 6 (1 is most important, and 6 is least important), the sections of the [software/web site] you use most often.
Possible Answers: First portion / Second portion / Third portion / Fourth portion / Fifth portion / Sixth portion / None (I don't use any portion)

7. Do you have an understanding of the basic [software/web site] terminology prior to participating in this test?
Possible Answers: Yes / No

8. Please rate yourself on your knowledge of the [software/web site] application.
Possible Answers: Never Used Before / New User (used between 0 to 3 months) / Proficient User (used between 4 to 6 months) / Expert User (used between 7 to 12 months)

Appendix D - Sample Task List for Testers

ATS Usability Test (Tester # )

This first set of tasks refers to information about the Cambridge Provisioning screen. Record the answer for each task.

1. Determine if the CFA Send to LEC field is derived.
2. Locate the Del (Delete) option in the Control Block. Record the options located before and after the Del (Delete) option.
3. Find the Secondary Fields in the Cambridge Provisioning screen and record the second field listed.
4. According to Keynote, what was the performance percentage rating of total successes for Genuity's Carteret, NJ data center, and when was the information on that data center last updated?
5. Locate the Selection Screen in the Cambridge Provisioning screen and record the exact screen names of the two fields.

The second set of tasks refers to information about Reports. Record the timing (in seconds) for each task.

1. Locate and name the two types of IT&A Checklist Reports.
2. Determine and note whether Metrics - LEC Circuits Past Due is a report that is printed, saved, or both.
3. Locate and record the exact name of the template for the Red Team Report - AOL/GNI.
4. Open the Brio Insight Web site from within this Web site. Write down the URL of the site.
5. Produce and print a Metrics Tier 2 - Avg Time to Issue DLR Report (in Brio).

The third set of tasks refers to information about the ATS User's Guide Web site user interface. Record the answer for each task.

1. Open an Adobe Acrobat (PDF) file from within this Web site.
2. Locate the Del (Delete) option in the Control Block. Record the options located before and after the Del (Delete) option.
3. Return to the beginning of the document, showing the main title page of the document.
4. Go to the index, find the BH Bill Comm Date field definition, and record the field just before this field.
5. Send an email to the Documentation department complaining about the site, with the word "Test" in the subject line.

Appendix E - Debriefing Questionnaire

Thank you for helping us with a test of the [software/web site]. As a final task, we would like you to take several minutes and give us your thoughts on the questions below. We would like to gather information on what you like and do not like about this Web site.

1. Specific question about the software/web site.
2. Specific question about the software/web site.
3. Specific question about the software/web site.
4. Specific question about the software/web site.
5. Please provide any personal comments you would like to add about this test or software/Web site.