
Online Assessment: Quality Production and Delivery for Higher Education

Don Mackenzie
Head of the Centre for Interactive Assessment Development
University of Derby, Kedleston Rd., Derby DE22 1GB
E-mail: D.Mackenzie@derby.ac.uk

This paper focuses on the practical steps that can be taken to maintain the quality of online assessment from design, through production and delivery, to data retrieval, moderation and feedback into course design. A spectrum of assessment types will be demonstrated, from simple formative quizzes to high-stakes summative assessments that may involve more complex question types or sophisticated simulations. The general stages in the design-to-delivery process are outlined and the quality assurance checkpoints are discussed. The relative merits of two models of production are examined. In the first, the Devolved Tutor Development model, the academic tutor is responsible for a large proportion of the process, from design through production to delivery and reporting. In the second, the Integrated Team Development model, the academic tutor works closely with a central support team of assessment developers who provide advice and manage the production, delivery and analytical data reporting. The likely outcome of applying each model is reviewed in terms of quality, diversity and level of assessment. The strategies that can be employed to combat collusion are briefly discussed. The challenge is to develop a method of working that is scalable and economic whilst delivering assessments that have the rigour required by the Higher Education environment.

Presentation with commentary

The benefits of computer-based assessments have been rehearsed many times, often with an over-emphasis on the potential time savings for tutors. However, significant time savings can only be realised in the longer term and are maximised for assessments with a long shelf life and large cohorts. In this talk I wish to focus on the substantial enhancement in quality that can be achieved by the application of 'advanced' computer-based assessment. Here are some examples taken from the support and assessment materials associated with a blended learning course in Geological Map Interpretation. The course is delivered by means of small tutor-led tutorial groups coupled with both paper-based and computer-based exercises for the students. There are two summative assessments, both delivered online. All the examples are delivered using the TRIADS system. An interactive demonstration of some of them is available.

Example 1. Determine the outcrop

The example above requires the student to predict the outcrop of a rock surface across a landscape. The topographic contours are marked in brown and the contours on the rock surface are marked in blue. The rock surface contours are equidistant and parallel since the surface is an evenly dipping plane. The answer is a curved line joining all points where the topographic and rock surface contours intersect at the same height (black line). The computer system can score the positioning of the line very accurately, since the line may be subdivided into a number of segments and the score allocation for each segment may be weighted according to its importance. Areas of the map where a line should never be drawn may be negatively scored if required.
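To make the segment-weighting idea concrete, here is a minimal sketch of how such a scheme might be scored. It is illustrative only, not the actual TRIADS implementation: the function name, segment identifiers and penalty scheme are all hypothetical.

# A minimal sketch of weighted segment scoring for a drawn-line question.
# Hypothetical names and weights; not TRIADS code.

def score_drawn_line(hit_segments, segment_weights, forbidden_hits=0,
                     forbidden_penalty=0.1):
    """Return the fraction of full marks for a drawn line.

    hit_segments      -- set of segment ids the candidate's line passed through
    segment_weights   -- dict of segment id -> weight (importance)
    forbidden_hits    -- points drawn in areas where no line should appear
    forbidden_penalty -- fraction of the total deducted per forbidden hit
    """
    total = sum(segment_weights.values())
    earned = sum(w for seg, w in segment_weights.items() if seg in hit_segments)
    penalty = forbidden_hits * forbidden_penalty * total
    return max(0.0, (earned - penalty) / total)

# Four segments, the two central ones weighted double; the candidate's
# line passes through three of them:
weights = {"s1": 1, "s2": 2, "s3": 2, "s4": 1}
print(score_drawn_line({"s1", "s2", "s3"}, weights))  # 5/6 = 0.833...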

Example 2. Identify major geological features

In many computer-based questions it is commonly the case that the correct answer is to be found somewhere on the screen, and the candidate simply has to eliminate the most unlikely possibilities and then make an informed guess at the answer. In the example above, the candidate has to select a number of major geological features and identify them. All areas of the map are hot-spotted and the computer responds in exactly the same way regardless of whether the selected feature is one of the required features or not. The candidate must judge how many features to select. Partially correct answers are awarded partial credit.
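One plausible partial-credit scheme for this style of question is sketched below. The penalty for selecting a spurious area is an assumption on my part; the point is simply that credit can be awarded per correct selection while over-selection is discouraged.

# A hypothetical partial-credit scheme for the "select the major features"
# hot-spot question (a sketch of the idea, not TRIADS code).

def score_feature_selection(selected, required, distractor_penalty=1.0):
    """selected -- set of map areas the candidate clicked
    required -- set of areas that are genuine major features
    One mark per correct selection; distractor_penalty marks lost per
    spurious selection; the score floors at zero."""
    correct = len(selected & required)
    wrong = len(selected - required)
    return max(0.0, correct - wrong * distractor_penalty) / len(required)

# The candidate picks three of the four real features plus one spurious area:
print(score_feature_selection({"fault", "fold", "unconformity", "lake"},
                              {"fault", "fold", "unconformity", "dyke"}))
# (3 - 1) / 4 = 0.5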

Example 3. Determine the direction of dip of two groups of strata

In the example above the candidate must be able to visualise the geological structure in three dimensions from the pattern of colours on the map, then draw an arrow representing the direction in which each of two layer-cake sequences of strata is dipping. The answer is scored on the basis of the azimuth of the arrows drawn. The nearer the drawn azimuth is to the acceptable range of values, the higher the score gained. No score is gained for azimuths that fall outside error margins set by the tutor.
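The scoring just described amounts to a tolerance band around the target azimuth. Here is a sketch of that behaviour, assuming a linear fall-off between the full-marks band and the tutor-set cut-off; the band widths are illustrative, not values from the actual question.

# A sketch of tolerance-banded azimuth scoring (assumed behaviour based on
# the description above; band widths and fall-off are hypothetical).

def score_azimuth(drawn, target, full_band=10.0, zero_band=45.0):
    """drawn, target -- azimuths in degrees (0-360).
    Full marks within +/- full_band of the target, no marks beyond
    +/- zero_band, linear fall-off in between."""
    diff = abs((drawn - target + 180.0) % 360.0 - 180.0)  # shortest angular distance
    if diff <= full_band:
        return 1.0
    if diff >= zero_band:
        return 0.0
    return (zero_band - diff) / (zero_band - full_band)

print(score_azimuth(172.0, 180.0))  # within tolerance: 1.0
print(score_azimuth(210.0, 180.0))  # 30 degrees off: 0.428...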

This screen illustrates some of the feedback embedded into the formative version of this question.

Example 4. Determine the sequence of strata in a borehole drilled at A

In the question displayed above, the candidate is required to predict the sequence of strata that would be expected in a borehole drilled at the position marked 'A' on the map. This is not an easy question: it requires the candidate to combine 3D visualisation of the geological structure with appropriate use of the contour sets on the rock surfaces, which are accessed via the buttons on the right-hand side of the screen. The question is completed by dragging the colour for each bed of rock from the key into the correct position in the sequence in the column on the right. I have completed a typical answer to this question, as might be given by a student who could partially visualise the structure but did not use the contour sets appropriately. The screen below shows the feedback:

The green stippled bed is an incorrect inclusion in the sequence and carries a negative score, so the candidate achieved 67% rather than the full 100% for an otherwise correct sequence. Sequencing questions are very powerful tools for testing understanding. This instance used a graphical example, but sequencing of processes and actions with text-based items can be especially useful.
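The 67% outcome can be reproduced with a simple marking rule: one mark per required bed in the right order, with one mark deducted for the wrong inclusion. The sketch below assumes that weighting; TRIADS lets the tutor set the credit and penalties differently.

# A sketch reproducing the 67% outcome described above: negative credit
# for a wrong inclusion in an otherwise correct sequence (assumed weights).

def score_sequence(answer, correct, inclusion_penalty=1.0):
    """answer, correct -- lists of bed identifiers, top to bottom.
    One mark per required bed in the right order; inclusion_penalty
    deducted for each bed that should not appear at all."""
    required_in_answer = [b for b in answer if b in correct]
    in_place = sum(1 for a, c in zip(required_in_answer, correct) if a == c)
    spurious = sum(1 for b in answer if b not in correct)
    return max(0.0, in_place - spurious * inclusion_penalty) / len(correct)

# Three required beds in the right order plus one spurious (green) bed:
print(score_sequence(["shale", "green_bed", "sandstone", "limestone"],
                     ["shale", "sandstone", "limestone"]))
# (3 - 1) / 3 = 0.67 to two figures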

Example 5. Match the borehole logs with the boreholes

In the example above the candidate is required to match each borehole log at the bottom of the screen with its correct position on the map (a move-object question type). Many tools, accessed by buttons on the right-hand side of the screen, are provided to help the candidate. However, the candidate could easily be fazed by the amount of information and must remain calm and have a very full understanding of the geological structure in order to complete this exercise successfully. Some of the boreholes may be relatively easily positioned, but as the candidate progresses further into the exercise, the level of difficulty increases. Any incorrect positioning automatically exerts a double penalty by precluding the positioning of the correct log. This is a standard drag-and-drop style of question that could be programmed in a number of assessment systems. However, the TRIADS system differs in the level of sophistication possible in the scoring and in the flexibility of the interaction area, with the facility to add tools etc. All TRIADS move-object question types allow each movable object to be scored differently in each position if required, with the possibility of additional penalties for incorrect positioning. When run in summative tests, this question usually has a discrimination index of around 0.85, and even course tutors have to think really hard to get full marks!
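Per-object, per-position scoring of this kind can be pictured as a lookup matrix. The sketch below is my own illustration of the idea, with hypothetical identifiers, weights and penalty; it is not how TRIADS stores its scoring data.

# A sketch of per-object, per-position scoring for a move-object question.
# Matrix values, identifiers and the penalty are hypothetical.

score_matrix = {
    "log_A": {"site_1": 2.0},                  # hard log, double weight
    "log_B": {"site_2": 1.0},
    "log_C": {"site_3": 1.0, "site_4": 0.5},   # partial credit at a nearby site
}
WRONG_POSITION_PENALTY = 0.5

def score_placements(placements):
    """placements -- dict of log id -> site id chosen by the candidate."""
    total = sum(max(scores.values()) for scores in score_matrix.values())
    earned = sum(score_matrix[log].get(site, -WRONG_POSITION_PENALTY)
                 for log, site in placements.items())
    return max(0.0, earned) / total

# log_B misplaced, log_C at its half-credit site:
print(score_placements({"log_A": "site_1", "log_B": "site_3", "log_C": "site_4"}))
# (2.0 - 0.5 + 0.5) / 4.0 = 0.5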

Standard CBA

The examples given above contrast markedly with standard computer-based assessment, such as the multiple-choice tests typified by VLE quiz questions. Tests involving just such simple items I refer to as standard CBA rather than advanced CBA.

Whilst multiple-choice questions are extensively used in CBA, good questions of this type are more difficult to write than is often appreciated and, even when good questions are used, they give little feedback to the tutor on the candidate's level of understanding. If a 100% score is achieved on an item, the tutor has no way of knowing whether the candidate thought long and hard about the question and worked out the answer from first principles, knew the correct answer, or just guessed it with a 1 in 5 chance of getting it right. In my view, it is doubtful whether higher order skills can be tested by this type of question.
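The 1-in-5 point can be made quantitative: the chance of reaching a given mark by blind guessing on five-option items follows the binomial distribution. The test length and pass mark below are purely illustrative.

# A worked example of the guessing point above. Numbers are illustrative.

from math import comb

def p_pass_by_guessing(n_items, pass_mark, p_correct=0.2):
    """Probability of scoring at least pass_mark by guessing every item."""
    return sum(comb(n_items, k) * p_correct**k * (1 - p_correct)**(n_items - k)
               for k in range(pass_mark, n_items + 1))

print(f"{p_pass_by_guessing(20, 8):.4f}")  # 20 items, pass mark 8: about 0.03

The probability of passing a whole test by luck is small, but on any single item the score alone cannot distinguish understanding from a lucky guess.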

Risks to quality and barriers to take-up

There are wide-ranging threats to quality in the creation process and a number of barriers to the take-up of computer-based assessment. When converting from traditional models of assessment, these need to be appreciated at the outset and steps taken to ameliorate their effects if CBA is to be effective and reliable. Let's examine the typical CBA production and delivery cycle to see where the potential quality assurance nodes might be.

In this cycle, the quality feedback loop into course design/delivery and the design of subsequent assessments is extremely important and is often overlooked in the dash for time savings. There is a common assumption that the conversion from traditional modes of assessment to computer delivery will be accomplished largely by individual academic tutors. However, if we are attempting to use CBA to test higher order skills, it may be necessary to use more sophisticated assessment tools with steeper learning curves in order to gain the level of interactivity required. Should we really be expecting tutors to be software programmers as well as subject material and pedagogic experts? Let's compare two extreme models of CBA development:

- the Devolved Tutor Development model, where the design, programming and delivery of the assessment are entirely handled by individual academic tutors;
- the Integrated Team Development model, where the academic tutor is supported by a team of experts throughout the whole process.

Note: IQC = Independent Quality Check. Red arrows indicate iterative review and modification cycles.

The likely outcomes of the application of each of these two models are given below:

In my view the application of the Integrated Team Development model provides the best opportunities to extend the scope of CBA into new areas of assessment, and will promote a more rapid and informed take-up with higher quality output. This will allow us to undertake assessments that simulate real situations that would be expensive or impractical to assess by traditional means, and to test higher order skills very effectively in some disciplines.

We cannot, of course, assess all aspects of all subjects by computer, and it is essential that students are exposed to a wide variety of assessment scenarios. The more discursive and subjective disciplines are still difficult to assess by computer, and in my view the computer will never be able to reliably assess a final honours essay where a degree of originality is expected in order to obtain a first-class grade. Clearly, the originality cannot be predicted and therefore cannot be automatically scored. However, there are large areas of many disciplines, particularly in science and engineering, that are amenable to this sort of assessment.

Supplementary information

About the author

Prof. Don Mackenzie, BSc, PhD, FGS, ILTM

Don is a graduate of the University of Newcastle-upon-Tyne with first and higher degrees in Earth Science, specialising in the geochemistry of mineral deposits. He has had experience in the refractories industry and 28 years' experience of teaching Earth Sciences to undergraduates at the University of Derby, in a department that gained an Excellent rating in the first round of HEFCE Teaching Quality Assessments. Don has been working in the area of courseware, e-learning and computer-based assessment development since 1989 and is the originator and principal programmer of the highly interactive TRIADS assessment system, which was chosen to underpin the HEFCE-FDTL(1) funded Assessment of Learning Outcomes project (University of Liverpool (lead), Open University and University of Derby), involving promotion and evaluation of CBA in over 40 departments, 27 universities and 16 disciplines. Don is currently the Head of the Centre for Interactive Assessment Development at Derby, which was set up in 1999 to provide a university-wide assessment design, production, delivery and results analysis service for academic tutors. This department currently delivers over 10,000 summative student assessments annually, as well as undertaking commercial contract work for business and examination board clients.

Useful general starting references and web sites:

The Proceedings of the International CAA Conferences can be viewed online at http://www.caaconference.com and provide a useful starting point for general work on online assessment.

The CAA Centre website at http://www.caacentre.ac.uk/resources/ has some useful materials, links and references, even though the project has now finished.

Bobby Elliott's smartgroups site provides some useful general links and a discussion forum on e-assessment, with special reference to Scottish schools: http://www.smartgroups.com/groups/e-assessment

Bull, J. & McKenna, C., 2003, Blueprint for Computer-assisted Assessment. RoutledgeFalmer. ISBN 0415287049.

TRIADS (Tripartite Assessment Delivery System)

TRIADS is a highly flexible system, capable of delivering courseware and assessments employing a wide variety of question styles in a wide variety of modes to facilitate the testing of higher order learning skills. Its potential for in-depth and flexible assessment separates TRIADS from the proprietary assessment systems currently available.

Key elements:

- The system can be used both as a dedicated assessment system and as a courseware/e-learning development shell with menu.
- Semi-open coded in Authorware (Macromedia).
- Especially suitable for highly visual disciplines. All principal graphics formats are supported, together with functionality for incorporating Flash and most digital movie formats.
- Assessments/courseware are standalone/CD/LAN/web deliverable and externally configurable.
- Wide variety of question delivery modes, including sequential, paged and cycling (which allows repeated return to failed questions/topics until a tutor-defined threshold score is achieved).
- Wide variety of question styles (around 30 possible, about 20 distributed), with graph drawing and plotting styles as additional extras.
- Optional randomisation of question selection/order, with selection from the whole bank or from groups of questions within a bank, is available in some modes. Functionality to present benchmark questions to all candidates in an otherwise random selection is possible in all randomisation modes.
- Randomisation of data or selections is possible in some question styles. This, together with the range of randomisation modes possible in question presentation, makes the system suitable for creating assessments to be delivered in open environments.
- The system can evaluate some types of user-input mathematical formulae.
- Can link to external programs at runtime. Data may be returned for scoring in the assessment from external programs via the clipboard or by reading text file output from the external program.
- Extensive feedback options within and after each question, including links to full multimedia courseware, web pages or other external programs.
- Highly flexible scoring, capable of partial credit within questions as well as grading across the whole assessment. Custom scoring and parsing of text/numeric-entry answers is possible.
- Capable of embedding subject-specific simulations and multimedia. Any level of simulation may be incorporated and optionally scored within the system. Sound and video are easily integrated into questions or simulations.
- Multiple data output options formatted for easy item analysis.

References

Various TRIADS-related papers at http://www.caaconference.com

O'Hare, D. & Mackenzie, D.M., 2004, Advances in Computer Aided Assessment. SEDA Paper 116. Staff and Educational Development Association, Birmingham. ISBN 1 902435 24 9. A set of papers summarizing the outcomes of the TRIADS Assessment of Learning Outcomes Project (1996-2000). When reading this, please be aware that some of the applications papers were written four years ago and both the system and the applications have moved on since then. Many of the issues discussed therein have been resolved, but some remain to be solved.

Specialist starting references:

Boyle, A. & O'Hare, D., 2003, Finding appropriate methods to assure quality Computer-Based Assessment development in UK Higher Education. Proceedings of the Seventh International Computer Assisted Assessment Conference, pp. 67-84, University of Loughborough. ISBN 0-9539572-2-5. This paper can be viewed online in the proceedings at http://www.caaconference.com and provides 40 references for those wishing to research in more detail the area of quality in computer-based assessment design.

Ercole, A., Whittlestone, K.D., Melvin, D.G. & Rashbass, J., 2002, Collusion detection in multiple choice examinations. Medical Education, 36, pp. 166-172. Also online at: http://www.caret.cam.ac.uk/pdfs_ppts/collusion%20detection.pdf. An interesting statistical approach to the problem of collusion (one for the mathematicians!).

BS7988 (2002) Code of practice for the use of information technology (IT) for the delivery of assessments. Technical Committee IST/43, British Standards Institute.