An Introduction to Coding and Analyzing Qualitative and Quantitative Data Dr. Allen Brizee, Loyola University Maryland




The purpose of this handout is to introduce you to the basics of coding and analyzing qualitative and quantitative data that you collect from usability testing.[1] By the time you have completed the steps in this handout, you should better understand the data you have collected, be able to use this data in a report, and be able to use this data to help you make changes to a deliverable, such as a website.

Qualitative Data

To code and analyze qualitative data, we will use a simplified process guided by an approach known as grounded theory.[2] Grounded theory, a popular research model in the social sciences, is useful for participatory approaches to usability testing and design because it allows researchers to develop categories of important terms based on participant feedback.

At this point in your research, you should have qualitative data from the open-ended questions on your survey and test. The open-ended questions are the ones that ask participants to provide feedback in their own words rather than, for example, rating the quality of the deliverable on a scale of 1-5. If you have designed your qualitative questions effectively, you can use this data to help you find and fix problems with your deliverable. To code and analyze your qualitative data, follow these steps:

1. Read all of your participants' answers carefully and take notes on terms and trends that you notice occurring frequently. This is known as coding your data. For example, from your open-ended questions about what improvements users suggest for a website, you might notice that participants repeatedly mention the terms "site map" and "search feature." They might also mention that navigation and organization were a problem, even if they do not use the terms "site map" or "search engine" every time.

2. Next, group your coded data into concepts, or areas that are related in some way. For instance, terms related to the concept of taxonomy or organization might be placed together, whereas data coded as having something to do with color might be placed together.

3. Next, develop categories for your concepts. Do this by preparing a data rubric that you can fill in with the terms and trends that you noticed in your participants' responses. Your rubric might look like this:

[1] Please note that this handout is intended as an introductory lesson for undergraduates, not as a process for graduate or formal research studies.
[2] The simplified methodology described here is closer to Strauss and Corbin's approach.

4. Now, fill in your rubric with the terms and trends you recorded, and develop some basic categories that you notice emerging from your data. Record how many times your participants mentioned specific terms or used terms that fall into each overall category. A filled-in rubric might look like this:

Taxonomy:
- "The site needs a search engine"
- "I wish there were a site map"
- "Confusing organization"
- "I felt lost; this site needs a site map"
- "Organization was ok"

Color:
- "Good contrast"
- "Text was easy to read against that white background"
- "Great color scheme"

Page Design:
- "I thought the navigation bar should have been on the right"
- "It was hard for me to find the navigation bar"

Content:
- "The content was helpful"
- "I understood the purpose of the site"

Visuals:
- "The pictures were pixelated and hard to see"
- "The pictures were stretched looking"
- "I didn't understand the connection between the pics and the text"

Now you can begin analyzing your data and developing a theory to explain your findings. From the basic rubric above, we can say that the organization (taxonomy) of the site is probably an issue, because we found 5 comments regarding organization in the qualitative feedback: 4 of them negative and 1 saying the organization is ok. However, we can determine that the color scheme and contrast made a better impression on the participants, because we have 3 instances of positive feedback that fall into that category. Likewise, it seems as if the content of the site was helpful, because we have 2 positive responses from participants in that area. Lastly, however, the visuals probably need some work, because participants responded negatively to them 3 times.
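The tallying described above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the original handout; the category names and sentiment labels mirror the sample rubric, and assigning each comment a category and sentiment is still the researcher's judgment call.

```python
# Tally coded usability comments by category and sentiment.
# Categories and counts mirror the sample rubric in this handout.
from collections import Counter

# Each coded comment is a (category, sentiment) pair assigned during coding.
coded_comments = [
    ("Taxonomy", "negative"), ("Taxonomy", "negative"), ("Taxonomy", "negative"),
    ("Taxonomy", "negative"), ("Taxonomy", "neutral"),
    ("Color", "positive"), ("Color", "positive"), ("Color", "positive"),
    ("Page Design", "negative"), ("Page Design", "negative"),
    ("Content", "positive"), ("Content", "positive"),
    ("Visuals", "negative"), ("Visuals", "negative"), ("Visuals", "negative"),
]

tally = Counter(coded_comments)
for (category, sentiment), count in sorted(tally.items()):
    print(f"{category}: {count} {sentiment}")
```

Running this reproduces the counts used in the analysis above: 4 negative taxonomy comments, 3 positive color comments, and so on.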

You can put your findings to work by making changes based on your data. As noted above, your qualitative data (now quantified through grounded theory) can also help you make positive changes to a deliverable. The number of negative responses tells you that there is a problem in taxonomy, and the content of these responses helps you decide to add a site map. Combined with the feedback regarding page design, you may also move the navigation bar from the right to the left-hand side and add a search engine that sits atop that bar. We received similar feedback when testing the new Civic Engagement area on the Purdue OWL and made these changes there.

Quantitative Data

To code and analyze quantitative data, we will be using six types of data sets: demographics, time-on-task, mouse clicks, task success rates, Likert scale feedback, and ratings. Rather than going through steps to code and analyze each of these, this handout shows you how to handle time-on-task, task success rates, rating scores, and Likert scale feedback. With these four basic explanations, you should be able to develop similar data sets for the other tests.

Before we get started: to help you make sense of your quantitative data, you will want to do some research on optimal scores and feedback before you test. You should be able to answer these types of questions: How many mouse clicks are optimal for a website? How much time will users spend trying to find information on a website before they get frustrated and leave? Once you have a good idea of what others say about these areas, test your site yourself. Though you already know where things are, tracking how many mouse clicks it takes you to get to the information you will have your users try to find will help you set baselines.

From here, you can set some goals for your website. For example, you might say, "I want users to be able to find my journalistic writing samples within four mouse clicks and within 60 seconds of landing on my homepage." You might also say, "I would like very positive feedback from my users, so for my rating scale I would like to see 4s and 5s." Without baselines and other scholarship explaining what is good or bad for quantitative data regarding the usability of a website, it will be difficult to understand your findings.

Time-on-Task

At this point in your research, you should have quantitative data from your test protocols, including time-on-task, task success rates, and Likert scale feedback. Your time-on-task data for participants 1-3 might look like this:

Participant 1: Task 1: 65 seconds; Task 2: 30 seconds; Task 3: 73 seconds
Participant 2: Task 1: 48 seconds; Task 2: 20 seconds; Task 3: 64 seconds
Participant 3: Task 1: 105 seconds; Task 2: 50 seconds; Task 3: 126 seconds

Your goal is to find the mean (average) amount of time it took participants to complete each task. Follow these steps:

1. Add the times for each task across participants and divide the total by 3 (or by however many participant time-on-task data sets you have): 65 + 48 + 105 = 218, and 218 / 3 = 72.67 seconds. So it took participants an average of 72.67 seconds to complete task 1.

2. Repeat these steps for all of your tasks. Task 2: 33.33 seconds. Task 3: 87.67 seconds.

Optimal time-on-task for your participants is going to vary depending on the purpose, size, and complexity of your site, but a good range is between 20 and 60 seconds.

Task Success Rate

For your task success rates, your data for participants 1-3 might look like this (Y = success, N = failure):

Participant 1: Task 1: Y; Task 2: N; Task 3: Y
Participant 2: Task 1: Y; Task 2: Y; Task 3: N
Participant 3: Task 1: N; Task 2: Y; Task 3: N

Your goal is to find the success rate for each task as a percentage and then to average those rates. Task 1: Y-2, N-1. Task 2: Y-2, N-1. Task 3: Y-1, N-2. The success rate for task 1, therefore, is 66.67%. The task 2 success rate is also 66.67%. The task 3 success rate is 33.33%. Averaged together, the overall task completion rate for this site is 55.56%, or just over half.

Website Rating Score

Your rating data (on a 1-5 scale, with 1 the lowest and 5 the highest) might look like this:

Participant 1 website rating: 2
Participant 2 website rating: 3
Participant 3 website rating: 2

Find the mean score by adding the ratings and dividing by 3: 2 + 3 + 2 = 7, and 7 / 3 = 2.33.
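The time-on-task, success-rate, and rating calculations above can be checked with a short script. This is an illustrative sketch, not part of the original handout; it uses the sample numbers from the walkthrough.

```python
# Compute mean time-on-task, task success rates, and mean rating
# from the sample data in this handout.

# Seconds per task, one row per participant (tasks 1-3).
times = [
    [65, 30, 73],    # participant 1
    [48, 20, 64],    # participant 2
    [105, 50, 126],  # participant 3
]

# Task success per participant (True = Y, False = N).
successes = [
    [True, False, True],
    [True, True, False],
    [False, True, False],
]

ratings = [2, 3, 2]  # 1-5 website ratings, one per participant

n = len(times)  # number of participants

# Mean time-on-task for each task (column-wise average).
mean_times = [round(sum(row[t] for row in times) / n, 2) for t in range(3)]

# Success rate per task, as a percentage.
success_rates = [round(100 * sum(row[t] for row in successes) / n, 2)
                 for t in range(3)]

overall_success = round(sum(success_rates) / len(success_rates), 2)
mean_rating = round(sum(ratings) / len(ratings), 2)

print("Mean time-on-task:", mean_times)             # [72.67, 33.33, 87.67]
print("Success rates:", success_rates)              # [66.67, 66.67, 33.33]
print("Overall completion rate:", overall_success)  # 55.56
print("Mean rating:", mean_rating)                  # 2.33
```

With more participants or tasks, only the data lists change; the column-wise averaging stays the same.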

Likert Scale Feedback

Your Likert scale feedback (on a 1-5 scale, with 1 reflecting Strongly Disagree and 5 reflecting Strongly Agree) for one of your questions might look like this:

Participant 1: "I found the content of this site to be informative": 4
Participant 2: "I found the content of this site to be informative": 4
Participant 3: "I found the content of this site to be informative": 2

Find the mean score by adding the scores and dividing by 3: 4 + 4 + 2 = 10, and 10 / 3 = 3.33, a near-neutral score.

Once you have coded and analyzed your data, you should be able to use it to help you make changes to your website. Some questions you will want to answer are: Based on the rhetorical situation of the deliverable (purpose, audience, etc.), what does this data tell us? What does it mean? How can I use this data to help me better meet the needs and expectations of my users?

Please see the Purdue OWL Usability Report and the other examples on the course website for samples of how to write up and discuss your data.
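If you collect Likert responses for several questions, the same mean calculation applies per question. A minimal sketch; the second question label is a hypothetical example, not from the handout:

```python
# Mean Likert score per question (1 = Strongly Disagree, 5 = Strongly Agree).
# The second question label is a hypothetical example.
likert_responses = {
    "I found the content of this site to be informative": [4, 4, 2],
    "I found this site easy to navigate": [2, 3, 2],
}

means = {question: round(sum(scores) / len(scores), 2)
         for question, scores in likert_responses.items()}

for question, mean in means.items():
    print(f"{question}: {mean}")
```

Scores near 3 are roughly neutral; means below 3 flag questions whose area of the site likely needs work.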