Markers' experiences of providing formative assessment feedback in hardcopy, desktop and tablet

Rebecca Olson, School of Science and Health, University of Western Sydney
Anthony Burton, School of Medicine, University of Western Sydney
Paul Byron, School of Arts and Media, University of New South Wales
Margo Turnbull, Centre for Research in Learning and Change, University of Technology Sydney

Abstract

Student numbers in first year undergraduate courses are growing. Lecturers are under increasing time and budgetary pressure to foster meaningful assessment-driven learning. Online course management, esubmission and emarking have the capacity to streamline the workflows associated with formative assessment marking. It is widely asserted that emarking is faster than paper-based marking, but there is a paucity of research to support these claims. Much of the literature focuses on students' perceptions of emarking; few studies incorporate tablet-based marking. In this exploratory study, we reflect on experienced and inexperienced markers' time-survey responses as they marked formative assessments in hardcopy and in either Turnitin's Grademark for desktops or Branchfire's iAnnotate for iPads. Findings suggest that marking is faster and more flexible in iAnnotate. Tablet-based marking may therefore offer first year unit lecturers a way to reduce the time associated with formative assessment feedback. Contextual factors, however, should be central to decision-making.

Introduction

Formative assessments are important to student learning at every level, but fundamental to student learning during the first year of higher education. They allow students to engage in deep learning through the application of course material and critical analysis (Gikandi, Morrow, & Davis, 2011). When they are part of a scaffolded curriculum, formative assessments also offer students an opportunity to reflect on and learn from their mistakes (Evans, 2013; Wharton, 2013).
Meaningful and timely feedback is essential, however, if students are to apply suggestions received on early work to future assessments, achieving deep learning and understanding across assignments and contexts. Providing meaningful feedback on formative assessments is increasingly challenging in the current university landscape. As Evans (2013, p. 73) explains in her recent review of assessment feedback in higher education, "Enhancing the quality of feedback to students needs to be considered against the backdrop of the massification and consumerization of HE in the 21st century with increasing numbers and a more diverse student body than ever before." Students are beginning higher education with varied academic literacy skills and need support in learning how to learn (Cameron, George, & Henley, 2012). At the same time, there is strong competition for students' time, with many commuting to campuses and combining work and study (Tinto, 2012). Assessment-driven curricula have been posed as an important strategy for encouraging student engagement, retention and learning (Donnison & Penn-Edwards, 2012). Just as assessments are becoming increasingly important to meeting goals around student retention and the development of life-long learners, the time available for assessment feedback is shrinking. Academic workloads are swelling. Faced with growing student numbers, time for assignment administration, marking, and face-to-face teaching and learning is diminishing (Donnison & Penn-Edwards, 2012). In this changing higher education landscape, many lecturers are turning to information and communication technologies for solutions (Espasa & Meneses, 2010). The popularity of online and blended learning is growing fast (Gikandi et al., 2011; Yeh & Lo, 2009); esubmission and emarking may provide a way to reduce the time associated with assignment administration and feedback (Maxwell & Kist, 2011). There is, however, a dearth of research on emarking, leaving lecturers unsupported empirically in their decision-making. Following Buckley and Cowap (2013), we investigated markers' qualitative perceptions of two emarking tools: Grademark (2013) and iAnnotate (version 2.4.5). Following Coniam (2009), we offer an exploratory, comparative study of experienced and inexperienced markers' perceptions of both hardcopy and emarking. Most academics comparing paper-based marking (PBM) and onscreen marking (OSM) have examined only the attitudes of experienced markers as they shift from PBM to OSM (Coniam, 2009); few have examined more than one form of emarking. Our small exploratory study had three dimensions: experienced and inexperienced markers; hardcopy and desktop-based emarking using Turnitin's Grademark; and hardcopy and tablet-based iAnnotate emarking.
To our knowledge, it is the first study to compare perceptions and marking speed across hardcopy, Grademark and tablet-based iAnnotate emarking, offering much needed, though preliminary, insight to lecturers looking to foster supportive first year learning experiences with limited resources. In the sections that follow, we ground our study in the vast literature on the importance of formative assessments and timely feedback, and in the limited but growing literature on esubmission and emarking.

Literature Review

Gikandi et al. (2011, p. 2334) describe assessment as the "heart" of formal higher education. Assessments provide students with opportunities to engage meaningfully with course content, applying it to differing contexts and testing their understanding (Donnison & Penn-Edwards, 2012; Wiliam, 2011). In the current teaching and learning climate, where university staff compete for face-to-face time with students against part-time and sometimes full-time work and long commutes (Tinto, 2012), assessments remain a priority for students (Donnison & Penn-Edwards, 2012). As more and more university learning takes place online, assessments and feedback on assessments are the central platform on which scaffolded and continuous learning takes place (Espasa & Meneses, 2010). There are two main types of assessment: summative and formative. Summative assessment tasks are used for validation and accreditation, testing students' comprehension (Gikandi et al., 2011, p. 2334). They are commonly used in first year and online higher education contexts in the form of, for example, surface-learning quizzes to test recall and maintain engagement (improving retention) among assessment-driven transitioning students (Donnison & Penn-Edwards, 2012; Gikandi et al., 2011). Formative assessments are implemented in curricula to support higher-order or deep learning (Gikandi et al., 2011, p. 2334). They form one part of the learning scaffolding designed to monitor, assess and improve student understanding through ongoing and timely feedback (Gikandi et al., 2011, p. 2337). Meaningful feedback is essential to the efficacy of formative assessments in achieving deep learning outcomes (Wharton, 2013). Some see feedback as an end in itself (Evans, 2013), justifying assigned grades (Wharton, 2013). Within a socio-constructivist view of learning, however, feedback is the interpersonal exchange that allows learners to improve future work (Evans, 2013; Wharton, 2013; Wiliam, 2011). Described as both "feedback" and "feed-forward", it helps students to measure their performance against specific objectives (Espasa & Meneses, 2010; Evans, 2013, p. 71). It is also a central strategy for encouraging independent and life-long learning (Espasa & Meneses, 2010; Evans, 2013; Gikandi et al., 2011; Wharton, 2013). Particularly in online learning settings, feedback is associated with improved student performance and student satisfaction (Espasa & Meneses, 2010). For the feed-forward capacity of feedback to be realised, however, it needs to be timely. The optimum timing of feedback hinges on the type of learning underpinning an assessment (Wiliam, 2011, p. 9). In scaffolded formative assessments, students are asked to reflect on and learn from corrections and advice received on early assignments, allowing them to apply this feedback and make improvements to subsequent assignments. Thus, students should receive feedback on early assignments with plenty of time to digest and apply markers' comments to later assessments. With growing student numbers and ballooning academic work plans, providing meaningful and timely feedback can be problematic.
It can take several days to sort assignments, several more days for tutors to collect them, and then weeks for marking. The digital shift and the introduction of online course management systems may have the capacity to ameliorate some of the time-consuming aspects of assessment marking, helping lecturers to ensure students receive the in-depth feedback on formative assessments required for self-improvement and independent learning (Johnson, Hopkin, Shiell, & Bell, 2012, p. 107). The literature focuses on the timesaving potential of both esubmission and emarking. First, esubmission, "the process of handing in assignments as electronic files only" (Maxwell & Kist, 2011, p. 61), has the potential to reduce assignment turnaround times by streamlining the process (Jones & Jamieson, 1997). Instead of tracking, recording and sorting submissions manually, markers can download assignments securely and directly from an online website (Buckley & Cowap, 2013; Coniam, 2009). This reduces the number of people handling assignments, and subsequent administrative mistakes, as well as the steps and time between the point when a student submits his or her assignment and the point when a tutor can begin marking (Maxwell & Kist, 2011). Second, emarking or e-assessment feedback, "providing feedback through information communication technology" (Evans, 2013, p. 85), could speed up marking times. It has been widely asserted that emarking is faster than paper-based marking (Buckley & Cowap, 2013; Johnson et al., 2012; Jones & Jamieson, 1997; Maxwell & Kist, 2011). However, there is little empirical evidence to support this assertion. Reflecting on their experiences of marking with Grademark, participants in Buckley and Cowap's (2013) focus group described the Quickmark comment feature, which lets users create a bank of comments to be reused in future assignments, as useful: "I found it a lot quicker to make the comments and even the general comments, just to type them is a lot quicker." This reflection, however, is not based on direct comparison and is subject to recall bias. Other assertions about the timesaving capacities of emarking seem to be based primarily on Turnitin's plagiarism detection function, which saves markers from having to check for plagiarism using online search engines (Buckley & Cowap, 2013). There is a need for further pragmatic research into the experiences of those who mark assignments (Coniam, 2009) using a variety of emarking tools (Gikandi et al., 2011). Most of the literature on emarking focuses on students' perceptions (Buckley & Cowap, 2013; Johnson, Nádas, & Bell, 2010) and limits comparison to one form of emarking and paper-based marking. Our research, by taking an exploratory empirical focus, attempts to move the discussion beyond assertion by explicitly examining temporal as well as qualitative dimensions of formative assessment marking across three modes.

Methods

We conducted our exploratory research as academic staff teaching a health science unit for first and second year undergraduate students at the University of Western Sydney (UWS). UWS is a large multi-campus university with a diverse student population and teaching staff committed to opportunity and excellence (Gill, Lombardo, & Short, 2013; Gill et al., 2011). Student data for 2012 reveal the extent of this diversity: 23.7 per cent of students are from low socio-economic backgrounds; 53 per cent are first in family to attend university; 32 per cent are from a non-English speaking background; and the majority of students have work or family responsibilities (Gill et al., 2013). The unit, Public Health, is designed to extend students' understanding of concepts and academic literacy skills introduced in a first year, first semester health science unit. As a blended learning unit, students attend three-hour face-to-face lecture tutorials for half of the semester (weeks 1, 2, 4, 6, 7, 8 and 10) and engage with online lecture materials in the remaining weeks.
There are two formative assessments and a final (summative) exam. In this paper, we reflect on our experiences of marking the first written assessment in the unit: a 1,500 word report. The report, due in week 8, is designed to offer students a lower risk (30%) formative assessment, where feedback allows students to reflect on their writing and research skills before crafting and submitting their final written assessments in week 14. For our study, we systematically examined our experiences as three markers, one experienced and two inexperienced, and one lecturer using emarking in Public Health for the first time. Two modes of emarking were compared against hardcopy marking: Turnitin's Grademark for desktops and Branchfire's iAnnotate for iPads. Grademark offers markers the capacity to access, read, highlight, comment in-text (using both tailored and generic comments), grade (using a rubric created by the lecturer) and comment holistically on a student's assessment in one website. Markers can view Turnitin plagiarism similarity matches as they mark. There is no need to download or upload files, as all assessments are submitted and available through Turnitin, though it can take time to open documents on the website. iAnnotate, in contrast, is a generic .pdf (portable document format) annotation tool that allows users with an iPad to insert marks such as ticks, arrows and circles with their fingers. Users can also insert comment boxes or type directly on the page. These annotations are then saved by the user as another layer of the document. All assignments marked in iAnnotate were downloaded as a zip file from Turnitin before being merged individually with the marking rubric and then emailed to the marker. After marking assignments in iAnnotate, the marker, second author A, emailed the graded assessments back to himself and posted the grades and annotated .pdf files to the online course management system's gradecentre. A similar process was followed for hardcopy feedback. Individually marked reports and rubrics were scanned as .pdf files, and the grades and files were then posted in the online gradecentre. Care was taken to ensure grades and feedback were hidden from students in the gradecentre until all assessments were marked. Each tutor marked assignments in hardcopy and either Grademark or iAnnotate. Assignments were allocated to tutors not randomly but alphabetically, as they appeared in Turnitin. Third author P, an experienced marker with several years' experience in hardcopy marking, accessed 13 assignments online through Grademark and received 13 assignments by post to mark in hardcopy. Fourth author M, an inexperienced marker, accessed 13 assignments online through Grademark and received 10 assignments by post to mark in hardcopy. A, an inexperienced marker but experienced iPad user, received 14 assignments by email to mark using iAnnotate for iPads and received 14 assignments by post to mark in hardcopy. Lecturer and first author R, an experienced marker with 7 years of experience in PBM, coordinated the process. She familiarised herself with both emarking tools, providing samples of marked assignments and detailed instructions, including links to online instructional videos, for markers to follow. She also drafted a three page survey for each marker to use in documenting 1) the time they took to become familiar with the emarking tools; 2) the time taken to download, mark and then upload assignments to the online course management system; and 3) their reflections on the process.

Results

Advanced statistical analysis was not performed, as the study is preliminary and based on only four participants. Instead, average preparation and marking times are presented here along with markers' reflections on their experiences.
Preparation

Before the exploratory study could begin, the lecturer (R) became familiar with the relevant emarking tools and made various arrangements to set up the online system for markers. This included undertaking training in Grademark, adding markers as users to the course management system, experimenting with emarking tools, creating a rubric in Grademark, drafting detailed instructions for markers and working with blended learning staff to overcome challenges with the course management system's Gradecentre and Grademark. The time spent on set-up was not insignificant at 18 hours. Time spent on preparation and familiarisation by markers was considerably less onerous. A spent 15 minutes locating, paying for (approximately $10 AUD) and downloading iAnnotate from the iTunes store to his iPad. There was no set-up time associated with Grademark, as it was available through the course management system as a component of Turnitin's plagiarism detection software. Familiarisation with the university's course management system (vuws) and Gradecentre took markers 20 (A), 40 (P) and 50 (M) minutes. Familiarisation took A considerably less time, as he is also a postgraduate student at the same university and more familiar with its systems. Familiarisation with the emarking tools took slightly more time. A spent 30 minutes reading the supplied instructions, watching the recommended YouTube clips and experimenting with the different iAnnotate tools. Inexperienced marker M required only 20 minutes to become familiar with Grademark using the supplied guidelines, while experienced marker P took 30 minutes to become familiar with the etool and an additional 75 minutes to overcome challenges with the tool through consultation with the lecturer and blended learning support staff.

Marking time

For both experienced marker P and inexperienced marker M, marking in Grademark was more time consuming than hardcopy marking (see Figure 1).

[Figure 1: Average marking time in minutes for each marker (experienced marker P, inexperienced markers M and A) across hardcopy, Grademark and iAnnotate.]

The difference for P was dramatic, with Grademark assessments taking nearly twice as long. The difference for M was less pronounced, but still noteworthy. The amount of time spent marking in Grademark did, however, decrease with experience. Initial assessments took 55 (M) and 60 (P) minutes, while the last assessments marked in Grademark took only 30 minutes. Remarkably, marking in iAnnotate and hardcopy took A the same amount of time on average. Clear differences also emerged between experienced and inexperienced markers. Across the inexperienced markers, hardcopy marking took, on average, 36 minutes. For experienced marker P, hardcopy marking took a mere 23 minutes on average. It is important to note that these averages include the time spent downloading and uploading assignments (see Table 1). While Grademark does not require users to download assessments before they are marked, M regularly experienced a 1-2 minute delay between selecting and being able to view a student's assessment. Time spent downloading assessments and uploading feedback varied substantially across markers depending on the strength and speed of their home internet connections, the amount of traffic on the online course management site and whether the marker had access to a paper-feed scanner.

Table 1: Average time in minutes spent downloading, marking and uploading

Mode        Marker                     Download   Marking   Upload   Total
Hardcopy    Experienced marker: P      0          20.77     2        22.77
Grademark   Experienced marker: P      0          39.62     0        39.62
Hardcopy    Inexperienced marker: M    0          33.5      2        35.5
Grademark   Inexperienced marker: M    1.5        38.85     0        40.35
Hardcopy    Inexperienced marker: A    0          29.64     6        35.64
iAnnotate   Inexperienced marker: A    2          29.64     4        35.64

Reflections

In reflective comments, all three markers noted the easy and prompt access to assessments that emarking offered. This was especially important as all three markers lived more than 30 minutes from the campus where assessments were submitted. However, they lamented the amount of work associated with learning a new piece of technology. A was untroubled by the effort required to learn to use iAnnotate, saying, "it does take a little while to learn the ins and outs, but it's great to learn how to use a new bit of kit!" P, in contrast, was less enthusiastic about Grademark: "it took some getting used to. Some aspects were fairly intuitive, others were not." In his assessment of iAnnotate, A found downloading and uploading assessments to be easier than with hardcopy, but found flipping through pages slightly slower. He enjoyed the freedom of being able to access esubmissions through his iPad: "better than lugging around 3.5 kilos of paper!" However, he expressed concern that extended online marking can be "hard/tiring on one's eyes". Additionally, he found his fingers were too large, making annotation fiddly. Markers' comments about Grademark were largely disparaging. Both M and P experienced problems with Grademark freezing, forcing them to start marking an assessment over again. Experienced marker P, in particular, expressed sustained frustration over the amount of time it took him to become comfortable and efficient with Grademark's Quickmark comment bank: "Familiarising with the selectable comments took time. Sometimes there are similar/overlapping comments to choose from. Sometimes after selecting, I would read the additional comments inbuilt and realise these were too specific or inappropriate, so I'd delete this and add my own comment." He also found the layout cluttered, with lots of scrolling. However, he did appreciate the ability to make, add and edit final comments in Grademark, evaluating his holistic feedback there as "probably a bit more useful and detailed". While M liked Grademark overall, she found it harder to give detailed English language and written expression feedback: "My slow pace was primarily due to the writing quality of the assignments. Giving feedback/examples of alternative sentence structures in Grademark was more difficult. I am not sure that this type of feedback was clear in Grademark."
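Each Total in Table 1 is simply the sum of the per-phase averages (download + marking + upload) for that mode and marker. A minimal sketch of that arithmetic, in Python; the dictionary layout and the total_minutes() helper are our own illustration, with only the per-phase times taken from Table 1:

```python
# Per-phase averages (minutes per assignment), as reported in Table 1.
# The data structure and helper function are illustrative, not part of
# the original study's analysis.
phases = {
    ("Hardcopy",  "P"): {"download": 0.0, "marking": 20.77, "upload": 2.0},
    ("Grademark", "P"): {"download": 0.0, "marking": 39.62, "upload": 0.0},
    ("Hardcopy",  "M"): {"download": 0.0, "marking": 33.50, "upload": 2.0},
    ("Grademark", "M"): {"download": 1.5, "marking": 38.85, "upload": 0.0},
    ("Hardcopy",  "A"): {"download": 0.0, "marking": 29.64, "upload": 6.0},
    ("iAnnotate", "A"): {"download": 2.0, "marking": 29.64, "upload": 4.0},
}

def total_minutes(entry):
    """Total handling time per assignment: download + marking + upload."""
    return round(entry["download"] + entry["marking"] + entry["upload"], 2)

for (mode, marker), entry in sorted(phases.items()):
    print(f"{mode:9s} ({marker}): {total_minutes(entry):6.2f} min total")
```

Summing the phases this way makes clear why the iAnnotate and hardcopy totals for A coincide: the marking time is identical, and the extra download time in iAnnotate is offset by its faster upload.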

Discussion

This small exploratory comparative study of hardcopy and emarking is the first to include tablet-based iAnnotate. The generalisability of the results is limited, as this is a preliminary study based on reflective time surveys conducted with a small number of participants. Nonetheless, the study adds a much-needed empirical and temporal dimension to the growing literature on markers' perceptions of emarking. Findings suggest that emarking is not necessarily faster. Many have asserted that emarking, Grademark in particular (Buckley & Cowap, 2013), is faster than paper-based marking. However, time comparisons in this study suggest that emarking may be slower, in the case of Grademark, or just as fast, in the case of iAnnotate. Comparisons across markers support Coniam's (2009) finding that experienced markers tend to be less favourable in their evaluations of emarking than inexperienced markers. Qualitative comments also lead us to suggest that markers may prefer tablet-based to desktop-based emarking. Scrolling, flipping through pages and providing detailed in-text feedback may be easier with a tablet device such as an iPad. Findings also provide valuable insight for lecturers (including R) struggling to find strategies to decrease the time and resources associated with scaffolding meaningful feedback on formative written assessments into first year units. First, while esubmission does reduce some of the administration associated with formative assessments (Jones & Jamieson, 1997; Maxwell & Kist, 2011), the results are a reminder that there are upfront time costs associated with introducing esubmission and emarking into university courses. Online course management systems must be adjusted, rubrics modified, guides created, apps purchased, and markers, lecturers and coordinators must become familiar and comfortable with the etools and supporting systems. Second, the marking time averages and markers' comments point to the importance of contextual factors in decisions about adopting emarking. These factors include students' writing skills, marker experience, ease of access to hardcopy assessments, time and budgetary constraints associated with training, and access to equipment (i.e., iPads and styluses) and the internet.

Conclusion

In this exploratory study, we compared the experiences of three markers and one lecturer providing, for the first time, formative assessment emarking feedback to first and second year students in one undergraduate health science unit. We offer empirical results and caution to the limited but growing literature on emarking and esubmission in higher education. Findings suggest that emarking is not necessarily faster than hardcopy marking. etools matter, with iAnnotate showing more promise than Grademark for providing in-text feedback to university students still developing academic writing skills. This may be due to the ease of making corrections to grammar and punctuation using the pencil tools in iAnnotate. Other contextual factors, such as markers' comfort with tablets, internet connection speed and marker experience, should also be taken into consideration. In a university landscape where formative assessment feedback is increasingly important, evidence-based strategies for streamlining assessment submissions and reducing the time and administration associated with marking are needed. Future studies should systematically investigate the time demands and user perceptions of emarking, comparing a wider range of emarking tools and engaging a larger sample of markers and lecturers. Comparisons of students' perceptions across emarking tools are also warranted.

References

Buckley, E., & Cowap, L. (2013). An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator's perspective. British Journal of Educational Technology, 44(4), 562-570. doi: 10.1111/bjet.12054

Cameron, C., George, L., & Henley, M. (2012). All hands on deck: A team approach to preparing year one arts students for their first major assignment. A practice report. The International Journal of the First Year in Higher Education, 3(1), 101-108. doi: 10.5204/intjfyhe.v3i1.117

Coniam, D. (2009). A comparison of onscreen and paper-based marking in the Hong Kong public examination system. Educational Research and Evaluation, 15(3), 243-263. doi: 10.1080/13803610902972940

Donnison, S., & Penn-Edwards, S. (2012). Focusing on first year assessment: Surface or deep approaches to learning? The International Journal of the First Year in Higher Education, 3(2), 9-20. doi: 10.5204/intjfyhe.v3i2.127

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59, 277-292. doi: 10.1007/s10734-009-9247-4

Evans, C. (2013). Making sense of assessment feedback in higher education. Review of Educational Research, 83(1), 70-120. doi: 10.3102/0034654312474350

Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57, 2333-2351. doi: 10.1016/j.compedu.2011.06.004

Gill, B., Lombardo, L., & Short, S. (2013). Unscrambling the egg: A muddled path to a holistic, coherent and integrated institution-wide approach to first year student transition. A practice report. The International Journal of the First Year in Higher Education, 4(2), 97-103. doi: 10.5204/intjfyhe.v4i2.175

Gill, B., Ramjan, L., Koch, J., Dlugon, E., Andrew, S., & Salamonson, Y. (2011). A standardised orientation program for first year undergraduate students in the College of Health and Science at UWS. A practice report. The International Journal of the First Year in Higher Education, 2(1), 63-69. doi: 10.5204/intjfyhe.v2i1.48

Johnson, M., Hopkin, R., Shiell, H., & Bell, J. F. (2012). Extended essay marking on screen: Is examiner marking accuracy influenced by marking mode? Educational Research and Evaluation, 18(2), 107-124. doi: 10.1080/13803611.2012.659932

Johnson, M., Nádas, R., & Bell, J. F. (2010). Marking essays on screen: An investigation into the reliability of marking extended subjective texts. British Journal of Educational Technology, 41(5), 814-826. doi: 10.1111/j.1467-8535.2009.00979.x

Jones, D., & Jamieson, B. (1997). Three generations of online assignment management. Paper presented at the Australian Society for Computers in Learning in Tertiary Education (ASCILITE) conference, Curtin University of Technology, Perth.

Maxwell, A., & Kist, A. A. (2011). Review of esubmission and emarking to provide quality feedback and minimise turnaround times for external students. Paper presented at the 1st World Engineering Education Flash Week, Lisbon, Portugal.

Tinto, V. (2012). Enhancing student success: Taking the classroom success seriously. The International Journal of the First Year in Higher Education, 3(1), 1-8. doi: 10.5204/intjfyhe.v2i1.119

Wharton, S. (2013). Written feedback as interaction: Knowledge exchange or activity exchange? The International Journal of the First Year in Higher Education, 4(1), 9-20. doi: 10.5204/intjfyhe.v4i1.133

Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37, 3-14. doi: 10.1016/j.stueduc.2011.03.001

Yeh, S.-W., & Lo, J.-J. (2009). Using online annotations to support error correction and corrective feedback. Computers & Education, 52, 882-892. doi: 10.1016/j.compedu.2008.12.014