
T-404-LOKA, Final Project
Tempo Mobile Usability Testing

Árni Fannar Þráinsson
Gunnar Smári Agnarsson
Sindri Sigurjónsson
Theodór Tómas Theodórsson

Spring 2015
B.Sc. Computer Science
Instructor: Hlynur Sigurþórsson
Examiner: Jökull Jóhannsson

Contents

1 Introduction
2 Usability Testing 1
  2.1 Participants
  2.2 Tasks
    2.2.1 Task 1
    2.2.2 Task 2
    2.2.3 Task 3
    2.2.4 Task 4
    2.2.5 Task 5
    2.2.6 Task 6
    2.2.7 Task 7
    2.2.8 Task 8
    2.2.9 Task 9
    2.2.10 Task 10
  2.3 Results from interviews
  2.4 Results from questionnaires
  2.5 Summary
3 Usability Testing 2
  3.1 Participants
  3.2 Tasks
    3.2.1 Task 1
    3.2.2 Task 2
    3.2.3 Task 3
    3.2.4 Task 4
    3.2.5 Task 5
    3.2.6 Task 6
    3.2.7 Task 7
    3.2.8 Task 8
    3.2.9 Task 9
    3.2.10 Task 10
  3.3 Results from interviews
  3.4 Results from questionnaires
  3.5 Summary
4 Conclusion
  4.1 Assistance Comparison
  4.2 Time Comparison
  4.3 Questionnaire Comparison
  4.4 Summary

1 Introduction

In this report we discuss the usability tests conducted for the final project Tempo-Mobile at Reykjavík University. The tests focused on the tracker interface of the application. In section 2 we discuss usability test 1 and in section 3 we discuss usability test 2. In both sections we go over how the test was conducted, who participated, how each task was outlined and what the results of each task were. We also summarize the results from the questionnaires and interviews. In section 4 we compare the results of the two usability tests and give our conclusion.

2 Usability Testing 1

The goal of the usability test was to learn how Tempo users use the mobile app. The test was conducted on the 27th of February 2015 in a meeting room at the Tempo Software offices, at a predefined time that had been arranged with each user. The users were given a phone to use and each individual session lasted approximately 20 minutes.

During the session, the test administrator explained the test session to the user and asked a few background questions before the first task was given. The participants were then given one task at a time and asked to try to solve it using the app, while another member of the team observed and tracked the time. When the user had completed all the tasks we asked a few questions about the app and tried to get a sense of what the user liked or disliked. The session was then concluded with a short questionnaire regarding the application.

2.1 Participants

All the participants in the usability test are employees at Tempo Software and had experience using the products that Tempo develops. The majority of the participants had a university degree. Tempo Software wanted to keep testing internal to the company so we would not create false hope among Tempo customers that a mobile app was in the pipeline.

Gender   Age   Job title
Male     35    Web Designer
Male     38    Software Developer
Male     25    Customer Advocate
Female   23    Software Engineer
Female   33    Software Tester
Male     48    CTO
Male     46    Senior Software Developer
Male     32    Product Manager

Table 1: Overview of Participants

2.2 Tasks

The users had ten tasks to finish. In the following subsections you can see the given tasks and their results.

2.2.1 Task 1

The first task was to open the app and start a tracker without logging in. The point of emphasis was to check whether the users realised that they could start a tracker without logging in.

Success Criteria: The user does not log in to the app, presses the "Start Tracker" button and pauses the tracker.

2.2.1.1 Results

The results from the task show that it is not clear to the user that he can start a tracker without logging in, as figure 1 shows.

Figure 1: Login problem (37.5 % realised that login was not required; 62.5 % attempted to log in and needed assistance)

Average time to finish task: 20.7 seconds
3 users finished the task.
5 users finished the task with assistance.

2.2.2 Task 2

The next task was to log in to the app and refresh the issue stream. There were three possible ways to get back to the login page and we wanted to see which route the users would take. We also wanted to know whether users were able to figure out how to refresh.

Success Criteria: The user is able to log in to the given account and refresh the stream.

2.2.2.1 Results

All users navigated back to the login page through the side navigation. When it came to refreshing the issue stream the users had problems, and as figure 2 shows most users needed help with that part of the task. The main problem was that there was no indicator telling the users to pull down to refresh the page.

Figure 2: Refresh problem (25 % had no problem refreshing; 75 % had problems refreshing)

Average time to finish task: 62 seconds
2 users finished the task.
6 users finished the task with assistance.
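One common way to make pull-to-refresh more discoverable is to give the refresh control a visible title. The UIKit sketch below only illustrates that idea; it is not taken from the Tempo code base, and fetchIssues is a hypothetical stand-in for the app's networking call. As described in section 3.2.2.1, the team ultimately solved the problem differently, by refreshing the stream automatically.

```swift
import UIKit

final class IssueStreamViewController: UITableViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // A titled refresh control gives the user a visible hint that the
        // stream can be refreshed by pulling down.
        let control = UIRefreshControl()
        control.attributedTitle = NSAttributedString(string: "Pull down to refresh")
        control.addTarget(self, action: #selector(reloadStream), for: .valueChanged)
        refreshControl = control
    }

    @objc private func reloadStream() {
        fetchIssues { [weak self] in
            self?.tableView.reloadData()
            self?.refreshControl?.endRefreshing()
        }
    }

    // Hypothetical placeholder for the app's real request to the Tempo backend.
    private func fetchIssues(completion: @escaping () -> Void) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5, execute: completion)
    }
}
```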

2.2.3 Task 3

The next task was to start another tracker and rename it "user testing". Here we wanted to see if users could rename the tracker without difficulties.

Success Criteria: The user starts a new tracker and is able to name it.

2.2.3.1 Results

iOS had a "DONE" button to close the keyboard of the phone, but the button does not close the modal view and save the changes. Four users pressed the "DONE" button before going back to the keyboard and pressing "RETURN"; only two needed assistance. The "DONE" button can be very confusing, so it has to be removed. The edit field had two functionalities, renaming the tracker and attaching an issue, and four users pointed out that they found it confusing that one input field had two functionalities.

Average time to finish task: 28.5 seconds
6 users finished the task.
2 users finished the task with assistance.
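The confusion around the "DONE" button comes from the return key only dismissing the keyboard. A common UIKit pattern, sketched below under the assumption of a single text field in the rename modal, is to commit the change and close the modal from the text field delegate when the return key is pressed. The class and callback names are hypothetical; this illustrates the pattern, not the project's implementation.

```swift
import UIKit

final class RenameTrackerViewController: UIViewController, UITextFieldDelegate {

    let nameField = UITextField()
    // Hypothetical callback used to hand the new name back to the tracker list.
    var onSave: ((String) -> Void)?

    override func viewDidLoad() {
        super.viewDidLoad()
        nameField.returnKeyType = .done
        nameField.delegate = self
        view.addSubview(nameField)
    }

    // Pressing the keyboard's Done/Return key saves the new name and closes
    // the modal instead of only dismissing the keyboard.
    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        onSave?(textField.text ?? "")
        dismiss(animated: true)
        return true
    }
}
```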

2.2.4 Task 4

The next task was to delete the tracker named "user test". Here we wanted to check whether the trash can that had to be pressed to activate delete mode on the list was obvious.

Success Criteria: The user is able to delete the tracker named "user test".

2.2.4.1 Results

Even though all the users were able to finish the task, we noticed that it was not clear to all of them how they should delete a tracker. Figure 3 shows how the users did in the task. Even though some had difficulties, they all finished the task without help.

Figure 3: Results from deleting a tracker (62.5 % had no problems; 25 % tried swiping; 12.5 % had difficulties finding the trash can)

Average time to finish task: 17.75 seconds
All users finished the task.

2.2.5 Task 5

The next task was to start a tracker and attach issue WIKK-14 to it. There was one path we wanted the user to take: starting a tracker and then attaching an issue through the edit field. There were two other ways to do this, finding the issue in the search bar or in the activity stream and starting the tracker there.

Success Criteria: The user is able to start a tracker and attach the issue key to it.

2.2.5.1 Results

Even though not all the users took the path that we wanted, they all finished the task. Figure 4 shows which paths the users took in the process of finishing the task.

Figure 4: Path taken to finish the task. The users split between the three paths in shares of 50 %, 37.5 % and 12.5 %.

Average time to finish task: 17.75 seconds
All users finished the task.

2.2.6 Task 6

The next task was to log work on a tracker with issue WIKK-14 attached to it: put 2 in "worked" and 3 in "remaining" and type "User testing" as a comment. The point of emphasis was to see if the users could log work without any hindrance, as the user needed to swipe left on the issue to see the "Log Work" button.

Success Criteria: The user is able to log the tracker.

2.2.6.1 Results

The results were clear cut: the users had problems getting to the "Log Work" view, as figure 5 shows. The users tried to tap on the issue rather than swiping it. Another thing that came up in this task was that the users were not sure whether they had actually logged the work, as there was no indication telling them that the logging was successful.

Figure 5: Results from Task 6 (25 % had no problem; 75 % needed assistance)

Average time to finish task: 68.8 seconds
2 users finished the task.
6 users finished the task with assistance.
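The missing feedback after logging work can be addressed with an explicit confirmation once the worklog has been saved. The sketch below shows one simple way to do this with a standard alert; it is only an illustration under the assumption of a UIKit view controller, and postWorklog in the usage comment is a hypothetical call, not the team's actual code.

```swift
import UIKit

extension UIViewController {
    // Call this after the worklog has been saved successfully so the user
    // gets explicit confirmation that the logging worked.
    func confirmWorklogSaved(for issueKey: String) {
        let alert = UIAlertController(
            title: "Work logged",
            message: "Your work on \(issueKey) was saved.",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        present(alert, animated: true)
    }
}

// Example usage inside a hypothetical log work view controller:
// postWorklog(issue: "WIKK-14", worked: 2, remaining: 3, comment: "User testing") {
//     self.confirmWorklogSaved(for: "WIKK-14")
// }
```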

2.2.7 Task 7

The next task was to find issue AKA-15 and start tracking it. There were two ways to find the issue: with the search bar or in the activity stream. The point of emphasis was to see which route the users would take to find the issue.

Success Criteria: The user finds the given issue and is able to start a tracker on it.

2.2.7.1 Results

Figure 6 shows which path the users took. Only one user needed assistance finishing this task; he noticed neither the search bar nor the activity stream.

Figure 6: Results from Task 7 (50 % used the search bar; 50 % used the activity stream)

Average time to finish task: 12.13 seconds
7 users finished the task.
1 user finished the task with assistance.

2.2.8 Task 8

The next task was to rename tracker AKA-15 to "User testing". The point of emphasis was to see if the user could rename a tracker that had an issue attached.

Success Criteria: The user is able to rename the tracker.

2.2.8.1 Results

The only problem that came up in the task was the "DONE" button on the keyboard. The button does not close the modal view and save the changes, but only closes the keyboard.

Average time to finish task: 18.16 seconds
All users finished the task.

2.2.9 Task 9

The next task was to go into the "Log Work" view on issue WDP-7 from the assigned issues list without starting a tracker. The point of emphasis was to see if the user could access the log work view without any hindrance, as the user needed to swipe left on the issue to see the "Log Work" button.

Success Criteria: The user sideswipes and enters the log work view.

2.2.9.1 Results

By this time the users had all learned that a swipe to the left was needed to access the log work view and no user had any problems.

Average time to finish task: 25.63 seconds
All users finished the task.

2.2.10 Task 10

The last task was to go back out of the "Log Work" view and log out of the app. The main emphasis was to see if the users could find the logout button.

Success Criteria: The user is able to log out of the app.

2.2.10.1 Results

No problems came up regarding this task.

Average time to finish task: 5.63 seconds
All users finished the task.
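Tasks 6 and 9 both relied on a trailing swipe to expose the "Log Work" button. In UIKit a row action of this kind is typically provided through the table view delegate, as in the sketch below. This is only an illustration of the interaction; openLogWork(at:) is a hypothetical navigation helper and the real app's code is not part of this report.

```swift
import UIKit

final class AssignedIssuesViewController: UITableViewController {

    // A trailing swipe on an issue row reveals a "Log Work" action,
    // matching the interaction used in Tasks 6 and 9.
    override func tableView(_ tableView: UITableView,
                            editActionsForRowAt indexPath: IndexPath) -> [UITableViewRowAction]? {
        let logWork = UITableViewRowAction(style: .normal, title: "Log Work") { [weak self] _, indexPath in
            self?.openLogWork(at: indexPath)
        }
        return [logWork]
    }

    // Hypothetical helper that opens the Log Work view for the swiped issue.
    private func openLogWork(at indexPath: IndexPath) {
        // Push or present the Log Work screen here.
    }
}
```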

2.3 Results from interviews

We got very good feedback from the users and they were not afraid to speak their mind. They had the app in front of them and could look it over while we asked them questions about what they did not like, what they liked and what they thought was missing.

Log work button hard to find.
Pull to refresh not obvious.
No highlight when an input field was selected.
No highlight on a running tracker.
How the tracker button looked.
Small touch field on the edit button.
Did not like the trash can.
How you deleted trackers.

Table 2: What the users disliked

Easy to use.
Easy to start a tracker.
Easy to delete a tracker.
That you could name the tracker.
That you could see assigned issues.
That you could log work.
That the search refreshes as the input changes.

Table 3: What the users liked

That the user can choose which view is the home view of the app.
To see if the issue is in progress.
How much time is remaining on the issue.
That you can set a default name for a tracker.
A collapsible issue stream.
To see who is assigned to an issue.
To see the whole summary of the issue.
A button to log work instead of the swipe.

Table 4: What the users wanted to see

2.4 Results from questionnaires

Figures 7, 8 and 9 show the results from the questionnaires given to the users. The corresponding question is given under each figure.

Figure 7: Overall, the app was easy to navigate through. (87.5 % Strongly Agree, 12.5 % Agree, 0 % for the other options)

Figure 8: How was your experience using the app? (75 % Good, 25 % Very Good, 0 % for the other options)

Figure 9: How did you like the overall look of the tracker interface? (50 % Good, 25 % Very Good, 25 % Neither Bad nor Good, 0 % for the other options)

2.5 Summary

Overall the usability test did not go very well. Even though the sample group was only eight users, too many of them needed assistance with finishing the tasks. Even though the results were not as planned, we still got very productive feedback from the interviews with the users and used that feedback to improve the app.

3 Usability Testing 2

The goal of this usability test was to learn how Tempo users use the mobile app and to try to improve on the results from the first usability test. The test was conducted on the 26th of March in a meeting room at the Tempo Software offices, at a predefined time that had been arranged with each user. The users were given a phone to use in the test and each individual session lasted approximately 20 minutes.

During the session, the test administrator explained the test session to the user and asked a few background questions before the first task was given. The participants were then given one task at a time and asked to try to solve it using the app, while another member of the team observed and tracked the time. When the user had completed the tasks we asked a few questions about the app and tried to get a sense of what the user liked or disliked. The session was then concluded with a short questionnaire regarding the application.

3.1 Participants

All the participants in the usability test are employees at Tempo Software and had experience using the products that Tempo develops. The majority of the participants had a university degree. As in the first usability test, Tempo Software wanted to keep the testing internal.

Gender   Age   Job title
Male     33    Software Developer
Male     27    Software Developer
Male     23    Software Developer
Male     26    Marketing
Female   43    Agile Coach
Male     24    Software Developer
Female   39    Channels Manager
Female   30    Marketing

Table 5: Overview of Participants

3.2 Tasks

The users had ten tasks to finish. In the following subsections you can see the given tasks and their results.

3.2.1 Task 1

The first task was to start a tracker and pause it without logging in.

Success Criteria: The user does not log in to the app, presses the "Start Tracker" button and pauses the tracker.

3.2.1.1 Results

After the first usability test we changed the home screen of the app to the tracker view, so that users would not be confused into thinking they needed to log in. They are now only prompted to log in if they try to do something that requires authentication. With this change no user had any problems finishing the task.

Average time to finish task: 10 seconds
All users finished the task.
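The behaviour described above, letting the tracker work without an account and only asking for credentials when an action actually needs the backend, can be captured by a small guard around authenticated actions. The sketch below is a simplified illustration of that idea; Session, presentLoginScreen and refreshIssueStream are hypothetical names and do not come from the project's code.

```swift
import Foundation

// Hypothetical session state; the real app's authentication layer is not shown in the report.
final class Session {
    static let shared = Session()
    var isLoggedIn = false
}

/// Runs `action` right away when the user is signed in, otherwise asks the
/// caller to show the login screen first. Starting or pausing a local tracker
/// never goes through this guard, so it works without logging in.
func withLogin(showLogin: () -> Void, perform action: () -> Void) {
    if Session.shared.isLoggedIn {
        action()
    } else {
        showLogin()
    }
}

// Example: refreshing the issue stream needs authentication, so it is wrapped.
// withLogin(showLogin: { presentLoginScreen() },
//           perform: { refreshIssueStream() })
```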

3.2.2 Task 2

The next task was to log in to the app. The point of emphasis was to see if the users had any problems navigating to the login page.

Success Criteria: The user is able to log in to the given account.

3.2.2.1 Results

With the change made after the previous usability test the refresh problem had been eliminated: the app now refreshes without any input from the user, so we removed the refresh part from the task. One user did not understand the task and needed assistance.

Average time to finish task: 22.375 seconds
7 users finished the task.
1 user finished the task with assistance.

3.2.3 Task 3

The next task was to start another tracker and rename it "user testing".

Success Criteria: The user starts a new tracker and is able to name it.

3.2.3.1 Results

No users had any problems with the task.

Average time to finish task: 22.75 seconds
8 users finished the task.

3.2.4 Task 4

The next task was to delete the tracker named "user test". Here we wanted to check if the delete function was easy to find.

Success Criteria: The user is able to delete the tracker named "user test".

3.2.4.1 Results

All users finished the task without any problems.

Average time to finish task: 12.25 seconds
All users finished the task.

3.2.5 Task 5

The next task was to start a tracker and attach issue WIKK-14 to it. There was one path we wanted the user to take: starting a tracker and then attaching an issue through the edit field. There were two other ways to do this, finding the issue in the search bar or in the activity stream and starting the tracker there.

Success Criteria: The user is able to start a tracker and attach the issue key to it.

3.2.5.1 Results

All the users finished the task, but they did not all take the intended path. Figure 10 shows which paths the users took in the process of finishing the task.

Average time to finish task: 17.75 seconds
7 users finished the task.
1 user finished the task with assistance.

Figure 10: Path taken to finish the task. The users split between the three paths in shares of 50 %, 37.5 % and 12.5 %.

3.2.6 Task 6

The next task was to log work on a tracker with issue WIKK-14 attached to it: put 2 in "worked" and 3 in "remaining" and type "User testing" as a comment. The point of emphasis was to see if the users could log work without any hindrance.

Success Criteria: The user is able to log the tracker.

3.2.6.1 Results

All of the users finished the task. One user had questions as he did not understand the given task.

Average time to finish task: 48 seconds
All users finished the task.

3.2.7 Task 7

The next task was to find issue AKA-15 and start tracking it. There were two ways to find the issue: with the search bar or in the activity stream. The point of emphasis was to see which route the users would take to find the issue.

Success Criteria: The user finds the given issue and is able to start a tracker on it.

3.2.7.1 Results

All users finished the task and everyone used the search function to find the issue.

Average time to finish task: 16.875 seconds
All users finished the task.

3.2.8 Task 8

The next task was to rename tracker AKA-15 to "User testing". The point of emphasis was to see if the user could rename a tracker that had an issue attached.

Success Criteria: The user is able to rename the tracker.

3.2.8.1 Results

Two users encountered difficulties; both went into the "Log Work" view first. One of them noticed the error, went back and finished the task without any assistance. The other tried to add a comment in the "Log Work" view and needed assistance to finish the task.

Average time to finish task: 17.375 seconds
7 users finished the task.
1 user finished the task with assistance.

3.2.9 Task 9

The next task was to go into the "Log Work" view on issue WDP-7 from the assigned issues list without starting a tracker.

Success Criteria: The user finds the issue in "Assigned issues", clicks the issue to get the context menu and enters the "Log Work" view.

3.2.9.1 Results

The users were by now very familiar with the app and no problems came up.

Average time to finish task: 22.25 seconds
All users finished the task.

3.2.10 Task 10

The last task was to go back out of the "Log Work" view and log out of the app. The main emphasis was to see if the users could find the logout button.

Success Criteria: The user is able to log out of the app.

3.2.10.1 Results

No problems came up regarding the task.

Average time to finish task: 7.75 seconds
All users finished the task.

3.3 Results from interviews

We got very good feedback from the users and they were not afraid to speak their mind. They had the app in front of them and could look it over while we asked them questions about what they did not like, what they liked and what they thought was missing.

That issues are attached through the edit view.
If you add a comment and then go back to edit it, the latest comment is not shown in the input box.
No descriptions with the icons on buttons.
The icon for logging work is confusing.
The tracker comment is confusing; in JIRA it is called "Log Work Description".
Wanted to see the active tracker also in the list below the active tracker.

Table 6: What the users disliked

That the app is very easy to use.
That the Log Work Description (comment) gets transferred to the log work view when posting a worklog.
Fast.
The tabs for assigned issues, recently viewed and search.
Fast search.

Table 7: What the users liked

Slide to delete a tracker.
More ways to connect an issue to a tracker.
Help when using the app for the first time.
A play button in the search list to create a tracker and start it.

Table 8: What the users wanted to see

3.4 Results from questionnaires

Figures 11, 12 and 13 show the results from the questionnaires given to the users. The given question is under each figure.

Figure 11: Overall, the app was easy to navigate through. (62.5 % Strongly Agree, 37.5 % Agree, 0 % for the other options)

Figure 12: How was your experience using the app? (62.5 % Very Good, 37.5 % Good, 0 % for the other options)

Figure 13: How did you like the overall look of the tracker interface? (62.5 % Very Good, 37.5 % Good, 0 % for the other options)

3.5 Summary

Overall the usability test went well. Even though the sample group was only eight people, far fewer of them needed assistance compared to the first usability test. The users also gave some very good feedback on what was still confusing in the app, and we tried to improve on that. The results from the questionnaire were considerably better than in the first test.

4 Conclusion

In this section we compare usability tests 1 and 2. First, we compare the number of times a user needed help to complete the given tasks. Second, we compare the average time it took the users to finish each task. Third, we compare the results from the three user experience questions. Fourth and last, we summarize our findings and explain how we interpreted the results.

4.1 Assistance Comparison

After the first usability test the goal was to reduce the number of users that needed assistance to finish the tasks in the next usability test. Figure 14 compares usability tests 1 and 2 with respect to how many users needed assistance in each task. In the first usability test the users needed assistance a total of 20 times (5, 6, 2, 6 and 1 times in Tasks 1, 2, 3, 6 and 7). In the second test, after improvements had been made to the application, the users only needed help a total of 3 times (once each in Tasks 2, 5 and 8), which is a major improvement.

Figure 14: Number of users that needed assistance in each task, usability tests 1 and 2

4.2 Time Comparison

Figure 15 compares usability tests 1 and 2 with respect to the time it took users to finish each task. After the improvements we made following the first test, the time it took users to finish tasks dropped in all tasks except Task 7 and Task 10.

Regarding task 7, we made a big change to the way the user can see his assigned issues, recently viewed issues and search for issues. Instead of a divided list below the trackers, the user now has three tabs at the bottom to view each list and the search (sketched below). The extra tap might be the reason why it took the users in test 2 more time to finish the task.

In task 10 the users had to back out of the "Log Work" view and log out of the app. Even though it took the users in usability test 2 longer to log out, we think the time is still acceptable for users using the app for the first time, and no user in either test had any problem with the task.

Figure 15: Average time in seconds to finish each task, usability tests 1 and 2
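The tabbed layout referred to above, with assigned issues, recently viewed issues and search as three tabs below the tracker list, can be built with a standard tab bar controller. The sketch below only illustrates that structure; the placeholder view controllers and titles are hypothetical stand-ins for the app's real screens, which are not part of this report.

```swift
import UIKit

// Builds the three-tab layout for assigned issues, recently viewed issues and search.
// The concrete view controllers are placeholders for the app's real screens.
func makeIssueTabs() -> UITabBarController {
    let assigned = UITableViewController()
    assigned.tabBarItem = UITabBarItem(title: "Assigned", image: nil, tag: 0)

    let recent = UITableViewController()
    recent.tabBarItem = UITabBarItem(title: "Recently Viewed", image: nil, tag: 1)

    let search = UITableViewController()
    search.tabBarItem = UITabBarItem(title: "Search", image: nil, tag: 2)

    let tabs = UITabBarController()
    tabs.viewControllers = [assigned, recent, search]
    return tabs
}
```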

4.3 Questionnaire Comparison

Figures 16, 17 and 18 compare the questionnaire results from the two usability tests. Figure 16 shows the only numbers that did not improve between tests: the share of "Strongly Agree" answers went from 87.5 % to 62.5 % in the later usability test. Nevertheless, figures 17 and 18 show that in the second usability test we got better results from the users regarding both the overall experience and the tracker interface.

Figure 16: Overall, the app was easy to navigate through. (Test 1: 87.5 % Strongly Agree, 12.5 % Agree; test 2: 62.5 % Strongly Agree, 37.5 % Agree)

Figure 17: How was your experience using the app? (Test 1: 75 % Good, 25 % Very Good; test 2: 37.5 % Good, 62.5 % Very Good)

Figure 18: How did you like the overall look of the tracker interface? (Test 1: 25 % Neither Bad nor Good, 50 % Good, 25 % Very Good; test 2: 37.5 % Good, 62.5 % Very Good)

4.4 Summary

Even though the sample groups only consisted of 8 people each, we believe they gave us a small sample of how Tempo users would use the mobile app. When we summarize these results we can see a lot of improvement. Firstly, there is a great reduction in the number of users who needed help between tests, which is exactly what we aimed for. Secondly, there was a time improvement in the completion of almost all tasks in usability test 2 compared to usability test 1, which tells us that we really did something to make the app easier to use. Thirdly, there is an improvement in 2 out of 3 of the user experience questions, and users were quite happy with the overall look of the tracker interface. From the results of usability test 2 we drew the conclusion that we really did improve the product. We believe we analysed the results from usability test 1 correctly and that the time we spent designing the improvements paid off.