Google Analytics as a tool in the development of e-learning artefacts: A case study



Damon Ellis
Massey University

The design, development, and evaluation of e-learning artefacts requires extensive and potentially time-consuming evidence collection in order to verify that the artefact is fulfilling its educational goals. There is a need for inexpensive tools that can facilitate the quantitative portion of this evidence base. This paper explores the use of Google Analytics in this capacity. The needs analysis, design, testing, embedding, and evaluation of APA Interactive, an e-learning artefact targeting students at Massey University, serves as a case study, demonstrating how analytics data can inform all stages in the creation of web-based educational resources.

Keywords: online learning; evidence-based practice; e-learning artefacts

Google Analytics launched in 2005 and rapidly became one of the most prominent on-site web analytics solutions (Clifton, 2012). Analytics can provide website owners with a wealth of data, measuring visitor numbers, the source of those visits, visitor technology, the way in which visitors interact with the site, and more. The data is aggregated in real time but is not uniquely associated with individuals. Although Google Analytics primarily targets e-commerce sites, it has been adopted by educational technology researchers as a way to analyse the use and success of educational websites and online tools. Recent studies have employed analytics to improve website navigation (Fang, 2007), identify user needs through search keyword analysis (Yardy & Date-Huxtable, 2011), and track multimedia usage (Betty, 2009).

This paper discusses the ways in which Google Analytics informed the development of an educational web tool, APA Interactive, at Massey University. Phillips, Kennedy, and McNaught (2012) describe stages in the lifecycle of an e-learning artefact; in this case study, analytics data was a component in five of these stages:

- the original needs analysis
- the design and testing of the APA tool
- embedding in authentic learning environments
- evaluation of its reception and use
- identification of opportunities for future development

The purely quantitative data of analytics is not sufficient on its own to inform these stages, but usage statistics have some advantages over other site evaluation tools because they "monitor how users actually work with a site rather than what they say they would do" (Arendt & Wagner, 2010, p. 38). Analytics provides detailed, granular, and anonymous data that can be compared over years of use, making it a prime candidate to support the evidence base behind e-learning artefacts.

Context

The APA Interactive tool was developed for Massey University's Online Writing and Learning Link (OWLL; http://owll.massey.ac.nz/). OWLL is a public-facing learning portal covering a wide range of tertiary study topics: time management, critical thinking, assignment writing, referencing, exam preparation, and other aspects of academic literacy. The university's Centre for Teaching and Learning operates and maintains the site. OWLL is not designed to be a learning environment in itself; a relatively static website has the potential to fall into the "education as information delivery" trap (Herrington, Reeves, & Oliver, 2005) and ignore the authentic environments necessary for effective teaching. Instead, the website is conceptualised as a resource that can be used within classes and mediated by the lecturer, or accessed independently by students from throughout the university.
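For context, collecting this kind of data requires only a small JavaScript tracking snippet on each page of the site. Below is a minimal sketch using the asynchronous ga.js syntax current at the time of writing; the property ID is a placeholder, not OWLL's actual account:

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder property ID
    _gaq.push(['_trackPageview']);              // record a pageview for the current URL

    // Load ga.js asynchronously so tracking does not delay page rendering.
    (function () {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
                 '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
    })();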
Needs analysis

Analytics data has been collected for OWLL since 2007. The original identification of the gap that APA Interactive was designed to fill relied on three statistics available through Google Analytics: pageviews by section, keywords used in Google searches that led to the site, and keywords used in the in-site search bar.

Pageviews are perhaps the most widely used feature of Google Analytics. They track the number of times a page has been accessed by a web browser, a more useful statistic than server-side data because it excludes the majority of non-browser visits by web crawlers and robots (Clifton, 2012). These pageviews can be categorised by subsection within Google Analytics. Table 1 summarises the most popular sections of OWLL from 2009 to 2011. Referencing was consistently the most popular section by a significant margin.

Table 1: OWLL pageviews by section, 2009-2011 (percentage of total pageviews)

OWLL section          2009     2010     2011
Referencing           29.6%    37.2%    40.9%
Assignment types      18.6%    15.2%    16.1%
Academic writing       9.1%    11.8%    12.5%
Study skills           7.9%     6.5%     5.8%
Tests and exams        5.7%     4.6%     3.9%
All other sections    29.2%    24.7%    20.9%

Analytics data also divides visits based on the method of entry: direct (visitors typing the address into their browser or following a bookmark), referral (links from other websites), and search (visitors coming from Google, Bing, or another search engine). For visits arising from a search engine, analytics lists the keywords or phrases that led to the site, which can provide additional data about the topics that are most important to the site's users. In the case of OWLL, 26.4% of the 2009 searches involved keywords relating to referencing: citation, APA, MLA, footnotes, referencing, and the like. Likewise, searches from the site's own search bar focused on referencing (11.5% of searches in 2009).

The popularity of the referencing section was reinforced through other channels, such as informal student and staff feedback. That feedback also identified the core problem: referencing involves simple rules, but there are hundreds of those rules. The student need was caught between information overload and "I need an answer to my specific (and obscure) question". Few students need to know how to cite a book with six authors until they encounter that specific situation, but once they do, they require an answer immediately.

Teaching the technical aspects of referencing is not a new problem, and many solutions already exist. The two most common approaches are books and websites with long lists of examples and explanations, and bibliographic management software such as EndNote, Mendeley, or Zotero. APA Interactive targeted the middle ground between these two approaches, incorporating the easy access and low learning curve of a website while avoiding an exhaustive, static inventory of referencing rules.

Design and testing

Studies in website viewing behaviour consistently emphasise that users prioritise the parts of the webpage immediately visible when the page is loaded, the so-called "above the fold" principle (Djamasbi, Siegel, & Tullis, 2011). However, it can be difficult to determine just what appears above the fold, given that users have widely varying screen resolutions. Google Analytics records the screen resolution of all visitors to the site. In 2009, more than 50% of users had a resolution of 1024x768 or 1280x800, and this data formed the basis of the APA Interactive tool's design.
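As an illustration of how resolution data can feed a design decision of this kind, the share of visitors whose screens clear a candidate fold height can be tallied directly. The following is a minimal sketch; the visit counts are invented for the example and are not the OWLL figures:

    // Illustrative resolution breakdown (invented visit counts, not OWLL data).
    var resolutions = [
        { width: 1024, height: 768,  visits: 3050 },
        { width: 1280, height: 800,  visits: 2210 },
        { width: 1280, height: 1024, visits: 1440 },
        { width: 800,  height: 600,  visits: 320 }
    ];

    // Fraction of visits whose screen height clears a candidate fold line.
    function shareAboveFold(data, foldHeight) {
        var total = 0, above = 0;
        data.forEach(function (r) {
            total += r.visits;
            if (r.height >= foldHeight) { above += r.visits; }
        });
        return above / total;
    }

    // A layout that fits within ~700 vertical pixels (allowing some room for
    // browser chrome) would be fully visible to about 95% of this sample.
    console.log(shareAboveFold(resolutions, 700)); // -> 0.954...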
Since that time, Google has added a visual breakdown of which sections of a webpage are visible to its users without scrolling (Yahas, 2012). There are many answers to the above-the-fold problem: clickable anchors that direct users to the appropriate part of the webpage, hide-reveal scripts that expand sections of the page, and formatting solutions such as charts and flowcharts. APA Interactive was conceived as a tool that would maximise the amount of information that could be conveyed in a small space, while also preventing information overload by revealing relevant sections only when explicitly required.
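As an illustration of the hide-reveal approach, a few lines of jQuery can collapse every detail panel on load and expose one at a time on demand. This is a generic sketch rather than APA Interactive's actual code; the class names and data attribute are hypothetical:

    // Collapse all detail panels on load; reveal one panel per click.
    $(document).ready(function () {
        $('.detail-panel').hide();
        $('.panel-toggle').on('click', function () {
            // e.g. <a class="panel-toggle" data-target="#book-chapter">Book chapter</a>
            var target = $(this).data('target');
            $('.detail-panel').hide(); // keep only one panel open at a time
            $(target).show();
        });
    });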

The final version incorporates more than 6000 words of text, but the entire artefact remains above the fold for most users (Figure 1). APA Interactive presents a range of basic referencing options (referencing a book, book chapter, journal article, and so forth) and a menu of customisable options (multiple authors, edition numbers, print vs. internet sources). When these are selected, the tool creates a correctly formatted sample of the reference list entry and in-text citation for each option. Several hundred different combinations are possible depending on the options chosen. Clicking on parts of the example opens popups that give further details about that component: where to find the publisher location, for example, or how to capitalise the source's title.

Figure 1: APA Interactive

The tool was coded in JavaScript with the help of the jQuery library.
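To give a flavour of the underlying logic, the sketch below assembles an APA-style reference-list entry for a book from user-selected options. It is a simplified, hypothetical reconstruction of the approach rather than the tool's actual source; the function and field names are invented, and real APA rules cover many more cases:

    // Join authors in APA style: 'Smith, A. B., & Jones, C.'
    function formatAuthors(authors) {
        if (authors.length === 1) { return authors[0]; }
        return authors.slice(0, -1).join(', ') + ', & ' + authors[authors.length - 1];
    }

    // Build a reference-list entry for a book, with an optional edition component.
    function bookReference(opts) {
        var entry = formatAuthors(opts.authors) + ' (' + opts.year + '). ' + opts.title;
        if (opts.edition) { entry += ' (' + opts.edition + ' ed.)'; }
        entry += '. ' + opts.place + ': ' + opts.publisher + '.';
        return entry;
    }

    console.log(bookReference({
        authors: ['Smith, A. B.', 'Jones, C.'],
        year: 2012,
        title: 'An example handbook of referencing',
        edition: '2nd',
        place: 'Wellington, New Zealand',
        publisher: 'Example Press'
    }));
    // -> Smith, A. B., & Jones, C. (2012). An example handbook of referencing
    //    (2nd ed.). Wellington, New Zealand: Example Press.

Each customisable option in the interface corresponds to an optional component of this kind, which is how a compact form can cover several hundred combinations.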

Google Analytics data directed the technical aspects of pre-release testing. If more than 1% of OWLL visitors used a particular browser or operating system, that environment was tested to ensure that APA Interactive displayed and operated correctly. In 2009, four browsers were above that 1% threshold: Internet Explorer (65.3%), Firefox (25.0%), Safari (5.6%), and Google Chrome (2.8%). The operating system breakdown at that time was 91.2% Windows, 8.0% Mac, and 0.6% Linux, so the tool was tested on both Windows and Mac systems. The technical specifications of the target audience were clearly identified in the analytics data. Any future changes in those specifications can also be tracked: in 2012, Google Chrome is used by 28.0% of visitors.

Embedding in learning environments

As mentioned earlier, Google Analytics tracks the source of all visits. This proved helpful in identifying the learning environments in which OWLL was used. In 2009, the largest share of referral links (37.9%) came from within Massey University's Online Learning Environment (OLE), WebCT. The development of APA Interactive coincided with the university's shift to Moodle, so connections to the tool were integrated into the new OLE via custom blocks that lecturers could add to their Moodle environment. Teaching development staff also encouraged lecturers to use the tool in assignment instructions and study materials. A significant minority of referral visits (5.9%) came from links on the Massey Library website, so this site was also identified as a potential student entry point.

Evaluation

Quantitative data is only a part of the proper evaluation of an e-learning artefact: "usage logs simply record users' behaviour in an e-learning environment, but they do not explain why that behaviour occurs" (Phillips et al., 2011, p. 997). The quantitative measures of Google Analytics used to evaluate the success of the tool included pageviews, numbers of returning visitors, and average time spent on the page. By all of those measures, APA Interactive has seen significant uptake by students and staff. Over the first semester of 2012 the page recorded 100,713 views, an average of more than a thousand views per day. Although analytics data on returning vs. new visitors is not a perfect metric (Clifton, 2012), it suggests that only 27% of the pageviews were from first-time users, indicating a high level of repeat usage. In terms of time spent on the page, APA Interactive recorded more than twice the overall site average, suggesting that students are accessing the details provided by the tool.

Future development

Loftus (2012) notes that analytics reports can facilitate iterative improvement. Data can be used to identify and quantify trends in user technology and behaviour, and that in turn can inform future developments. For example, the current movement towards mobile learning and social network integration (Brown & Green, 2012) is reinforced by the data. Google Analytics can identify the proportion of visits from mobile devices: while still small as a proportion of total visits, mobile visits in the first semester of 2012 increased nearly sevenfold over 2011. In terms of social network integration, links on Facebook are now the most common non-Massey source of referral visitors.

Conclusion

Analytics provides data beyond a simple headcount of visitors to a page. The method of entry (search keywords and external links), the technical specifications of users (screen size, browser, operating system, and mobile use), and the details of visits (levels of return visits, time spent on the page) can all inform the design, evaluation, and continued development of an e-learning artefact. This data can complement and verify other sources of information, such as student and staff feedback, to ensure that these artefacts are fulfilling their function and will continue to do so in the future.

References

Arendt, J., & Wagner, C. (2010). Beyond description: Converting web site usage statistics into concrete site improvement ideas. Journal of Web Librarianship, 4(1), 37-54. doi: 10.1080/19322900903547414

Betty, P. (2009). Assessing homegrown library collections: Using Google Analytics to track use of screencasts and Flash-based learning objects. Journal of Electronic Resources Librarianship, 21(1), 75-92. doi: 10.1080/19411260902858631

Brown, A., & Green, T. (2012). Issues and trends in instructional technology: Lean times, shifts in online learning, and increased attention to mobile devices. In M. Orey, S. A. Jones, & R. M. Branch (Eds.), Educational media and technology yearbook: Volume 36 (pp. 67-80). New York, NY: Springer. doi: 10.1007/978-1-4614-1305-9_6

Clifton, B. (2012). Advanced web metrics with Google Analytics (3rd ed.). Indianapolis, IN: John Wiley & Sons.

Djamasbi, S., Siegel, M., & Tullis, T. (2011). Visual hierarchy and viewing behavior: An eye tracking study. In J. Jacko (Ed.), Lecture notes in computer science: Vol. 6761. Human-computer interaction: Design and development approaches (pp. 331-340). doi: 10.1007/978-3-642-21602-2_36

Fang, W. (2007). Using Google Analytics for improving library website content and design: A case study. Library Philosophy and Practice [special issue]. Retrieved from http://www.webpages.uidaho.edu/~mbolin/lppgoogle.htm

Herrington, J., Reeves, T., & Oliver, R. (2005). Online learning as information delivery: Digital myopia. Journal of Interactive Learning Research, 16(4), 353-367.

Loftus, W. (2012). Demonstrating success: Web analytics and continuous improvement. Journal of Web Librarianship, 6(1), 45-55. doi: 10.1080/19322909.2012.651416

Phillips, R., Kennedy, G., & McNaught, C. (2012). The role of theory in learning technology evaluation research. Australasian Journal of Educational Technology, 28(7), 1103-1118. Retrieved from http://www.ascilite.org.au/ajet/ajet28/phillips.html

Phillips, R., Maor, D., Cumming-Potvin, W., Roberts, P., Herrington, J., Preston, G., & Moore, E. (2011). Learning analytics and study behaviour: A pilot study. In Changing demands, changing directions: Proceedings ascilite Hobart 2011 (pp. 997-1007). Retrieved from http://www.ascilite.org.au/conferences/hobart11/downloads/papers/phillips-full.pdf

Yahas, G. (2012). Conduct browser-size analysis within Google Analytics. Retrieved from http://analytics.blogspot.co.nz/2012/06/new-feature-conduct-browser-size.html

Yardy, A., & Date-Huxtable, E. (2011). Recommendations for enhancing the quality of flexible online support for online teachers. In Changing demands, changing directions: Proceedings ascilite Hobart 2011 (pp. 1367-1371). Retrieved from http://www.ascilite.org.au/conferences/hobart11/downloads/papers/yardy-concise.pdf

Author contact details: Damon Ellis, d.k.ellis@massey.ac.nz

Please cite as: Ellis, D. (2012). Google Analytics as a tool in the development of e-learning artefacts: A case study.

Copyright 2012 Damon Ellis.

The author(s) assign to ascilite and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction, provided that the article is used in full and this copyright statement is reproduced. The author(s) also grant a non-exclusive licence to ascilite to publish this document on the ascilite website and in other formats for the Proceedings ascilite 2012. Any other use is prohibited without the express permission of the author(s).