Experience with a Multiple-Choice Audience Response System in an Engineering Classroom




David W. Petr
University of Kansas, Electrical Engineering and Computer Science, 1520 W. 15th St., Suite 2001, Lawrence, KS 66045-7621
petr@eecs.ku.edu

Abstract: This paper presents results and observations from using a multiple-choice audience response system (ARS) in an engineering class with enrollments averaging approximately 60. The system consisted of student-registered infra-red transmitters, receivers daisy-chained to an in-class computer, and associated software. The system allows the students to answer multiple-choice questions, with the software processing the students' answers. We discuss this system's use for ungraded in-class questions intended to provide active learning for students and feedback to both students and instructor regarding how well students are assimilating new concepts and techniques. We also describe the system's use for weekly multiple-choice quizzes. These quiz scores are found to be significantly lower than scores on traditional exams with partial-credit grading, and there is wide individual variation of scores in these two assessment formats. We conclude that an ARS can be a very effective teaching/learning tool and that multiple-choice quizzes can complement traditional assessment techniques.

Index Terms: Classroom Technology, Active Learning, Assimilation Feedback, Multiple-Choice Assessment

INTRODUCTION AND BACKGROUND

Technological innovations can assist in the teaching and learning process in a number of ways. Any new tool, however, requires users to learn how to use it skillfully; a new tool also should be evaluated for its effectiveness. This paper reports on these and other issues regarding an Audience Response System (ARS) used in an engineering classroom setting.

An ARS allows students to provide answers to multiple-choice questions in a classroom setting. The system processes responses automatically and immediately, including the capability to record individual student scores or display a summary of the responses to the class, or both. Such a system has the potential to assist both instructors and students in the teaching and learning processes. As an assessment tool, it can significantly reduce the burden of grading papers and (as discussed later) give students very timely feedback on their performance. As a pedagogical tool, it can provide both instructor and students with in-class feedback regarding the assimilation of new concepts.

The author (and his students) gained experience with the use of such a system during the Spring and Fall 2004 semesters in a beginning circuit analysis class (EECS 211: Circuits I). This is a sophomore-level class for Electrical Engineering (EE) and Computer Engineering (CoE) majors, but it is also taken by other engineering majors of various levels. There were approximately 70 students in the class in the Spring, with approximately 60% EE/CoE majors, and approximately 50 students in the Fall, with approximately 80% EE/CoE majors. Both courses were conducted in 80-minute classroom sessions that met in a large (200+ seat) lecture hall with good audio-visual capabilities. In particular, the room had two projectors aimed side-by-side at a large screen positioned above the whiteboard. Each projector could be driven by a different device (e.g., laptop computer and camera display device). This arrangement was nearly ideal for the use of an ARS.

ARS SYSTEM DESCRIPTION

The ARS system was from Hyper-Interactive Teaching Technologies (H-ITT) [1]. This system consisted of a 6-button infra-red (IR) transmitter (resembling a large pen) for each student (see Figure 1), a set of IR receivers daisy-chained to an in-class computer, and software for processing the responses. The transmitters allow students to answer multiple-choice questions with up to five choices (the sixth button is a control button, described later) by pointing the transmitter at a receiver and pushing the appropriate button. H-ITT's second-generation ("high speed") transmitters were used, with a 10 ms signal transmission time for each button push (compared with 100 ms for the first-generation transmitters). The shorter transmission time reduces the probability of signal collision, which is the primary factor in determining how many responses can be collected in a particular amount of time.

Each transmitter has an ID number, which can be associated with a particular student. This registration process was handled manually in this experiment, but automated registration options are available. The registration process also allows an email address to be associated with the transmitter, allowing for automated email notification of individual results.

The receivers collect the student responses and relay them to software in a computer (a Windows laptop PC was used in this case). H-ITT recommends one receiver for every 50 students. In this case, three receivers were used, mounted on a cart that was wheeled to the front of the lecture hall each class period.

FIGURE 1: ARS RECEIVER AND TRANSMITTER (PENCIL SHOWN FOR SCALE)

The software can display questions, collect student responses, grade questions, display results, and send email to students. It consists of two modules: an acquisition program used in the classroom and an analyzer program used outside of class. The acquisition program provides essential feedback to students through a display during answer collection. This is actually a grid of displays, one for each transmitter, which indicates to the associated student that a response has been received (H-ITT now offers 2-way transmitters that provide this feedback through the transmitter itself).

System component costs were (and still are as of this writing) $25 per transmitter and $200 per receiver, plus miscellaneous costs for cables, power supplies, etc. The software can be freely downloaded from the H-ITT website. Typically, students would purchase transmitters from the university bookstore or perhaps a student organization, just as textbooks are purchased. School units would be responsible for receiver and cabling costs. In this case, however, all system components were purchased by the author using teaching-improvement funds, so the transmitters were loaned (free of charge) to the students for the duration of the semester. A few students in the Fall 2004 semester had already purchased transmitters for use in a previous Physics class.

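The effect of the shorter transmission time on collisions can be made concrete with a rough back-of-the-envelope simulation. The sketch below is not part of the H-ITT software; it simply assumes that each student presses a button exactly once, at a uniformly random instant within a fixed response window, that two presses are lost whenever their transmissions overlap, and that no retries occur. The function name and numbers are purely illustrative.

    import random

    def collision_rate(num_students, xmit_time, window, trials=2000):
        """Estimate the fraction of responses lost to IR signal collisions.

        Assumes each student transmits once at a uniformly random time within
        `window` seconds and that two transmissions collide if they overlap in
        time. Real receivers and student retry behavior will differ.
        """
        lost = 0
        for _ in range(trials):
            starts = sorted(random.uniform(0, window) for _ in range(num_students))
            collided = set()
            for i in range(1, len(starts)):
                if starts[i] - starts[i - 1] < xmit_time:  # overlapping transmissions
                    collided.update((i - 1, i))
            lost += len(collided)
        return lost / (trials * num_students)

    # 60 students answering within a 30-second window:
    print(collision_rate(60, 0.010, 30.0))  # roughly 4% lost with 10 ms transmitters
    print(collision_rate(60, 0.100, 30.0))  # roughly 30% lost with 100 ms transmitters

Under these simplifying assumptions, the order-of-magnitude reduction in transmission time cuts the collision loss rate by roughly a factor of eight, which is consistent with the claim that transmission time dominates how quickly a full set of responses can be collected.
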
ASSIMILATION FEEDBACK

The system was used in two different ways, one pedagogical and one for assessment. The first use consisted of non-graded questions during a lecture to check student understanding of concepts and techniques, as described in this section. The second consisted of graded quizzes administered each week, described in the next section.

A common frustration for instructors of medium-to-large classes is obtaining feedback from students regarding their assimilation of the material being presented. We would like to know if the students are "getting it." A couple of common approaches are to observe body language or ask questions during the presentation. However, body language can at best provide feedback on attentiveness, not assimilation. Asking questions during class is often not effective since many students (particularly in a larger class) are reluctant to perform in front of their peers, which means questions are answered by the same handful of students, usually the ones who are assimilating well. This tends to provide an optimistic view of assimilation.

From the student perspective, it is difficult for many to concentrate during continuous lecturing for 50 or 80 minutes (even with a 5-minute break in the middle, which this instructor gives). It is also well established that many students would benefit from some sort of active learning activity during class (see [2] for a recent survey of student learning styles). Yet it is difficult to incorporate active learning exercises into a lecture, especially for medium-to-large classes.

I. ARS-Based Assimilation Feedback

An ARS opens up a new possibility in the form of questions answered by the entire class with minimal personal consequences. The methodology used in this case was as follows. After introducing a concept or technique, the instructor would pose a question or short series of questions to the class with the help of the ARS. The question and answer choices would be asked verbally or written on the board (with sufficient preparation, they could also be projected using a camera-display device). For example, after introducing the notions of series and parallel connections of resistors, the instructor sketched a network of several resistors on the board and provided the following answer choices: A for series, B for parallel, and C for neither series nor parallel. The instructor then called out pairs of resistors ("R1 and R3") and asked the students if these two resistors were in series (A), parallel (B), or neither (C). Students would be asked to answer each question using their transmitters, during which time the instructor would provide the correct answer via an instructor transmitter. Such questions break up the flow (students might use the term "monotony") of the class, providing all students an opportunity to actively participate by checking their own understanding of new material.

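For readers who want to mirror this drill outside of class, the following sketch (mine, not part of the course materials or the ARS software) classifies a pair of resistors in a small, hypothetical network as series, parallel, or neither, using the usual textbook criteria: parallel elements share both nodes, and series elements share exactly one node to which nothing else connects. The netlist and element names are invented.

    # Hypothetical netlist: element name -> (node_a, node_b)
    network = {
        "R1": ("n1", "n2"),
        "R2": ("n2", "n3"),
        "R3": ("n2", "n3"),
        "R4": ("n3", "gnd"),
    }

    def classify(net, a, b):
        """Classify elements a and b as 'series' (A), 'parallel' (B), or 'neither' (C)."""
        nodes_a, nodes_b = set(net[a]), set(net[b])
        if nodes_a == nodes_b:
            return "B: parallel"
        shared = nodes_a & nodes_b
        if len(shared) == 1:
            node = shared.pop()
            # Series only if no other element touches the shared node.
            others = [e for e, nodes in net.items() if e not in (a, b) and node in nodes]
            if not others:
                return "A: series"
        return "C: neither"

    print(classify(network, "R2", "R3"))  # B: parallel (same two nodes)
    print(classify(network, "R1", "R4"))  # C: neither (no shared node)
    print(classify(network, "R3", "R4"))  # C: neither (R2 also connects to n3)

Note the third case: R3 and R4 share a node, but the parallel combination R2||R3, not R3 alone, is in series with R4. Distractors built on exactly this misconception are the kind of "common error" answer choices discussed below.
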

When nearly all students had provided an answer (almost always within one minute, and often within 30 seconds), a histogram summary of the responses was displayed. This type of display preserves student anonymity, which encourages students to participate in the process: nobody can think "I am a dummy," as they might if students had to answer the question aloud. Although student answers to such questions certainly could be scored and recorded individually as an assessment component, this instructor chose to assign no credit or points for answers to such questions.

The results display gives both instructor and students valuable feedback and introduces additional teaching/learning opportunities. For example, if nearly all students provide the correct answer, this is evidence that the concept has been largely assimilated, and the instructor can proceed. Much more commonly, however, many (or even most) students will provide an incorrect answer. Student reactions to the displayed results in such a case can be quite interesting. It seems that students often believe that they understand a concept, and an incorrect answer will shatter this illusion and jolt them back to reality. Observations of their body language indicated that they were surprised and troubled by their lack of understanding and that they were more attentive as a result (i.e., they would frown and then "sit up and take notice").

In order to take full advantage of the teaching opportunity that results from this process, incorrect answers should be chosen to reflect common student misunderstandings or errors (i.e., it is important for the instructor to "think like a student" when selecting incorrect choices). In such a case, the instructor can discuss with the class why a particular choice is incorrect. Depending on the situation, any combination of the following may occur: students may ask clarifying questions, the instructor may choose to go over important points again (perhaps from a different angle), the instructor might ask a similar question to see if the misunderstanding or error has been corrected, or the instructor might ask a more difficult question to probe deeper.

II. Using ARS Effectively

Ideally, assimilation questions should be interspersed throughout each class period. In this initial experience with the methodology, this author typically introduced one or two such feedback opportunities per class period, each of which typically included one to three questions. Of course, each such interchange requires precious class time, so it is important that the process be as time-efficient as possible, which is facilitated by the speed of the ARS in collecting responses and displaying results. Particularly efficient formats include those in which several questions all have the same answer choices (see the previous resistor example) or in which a number of questions are asked about a particular equation (e.g., the first-order differential equation describing the transient response of an RC circuit) or graph (e.g., voltage versus time for a capacitor). This instructor found that a typical interchange would require five minutes or less of class time.

Because care should be exercised in selecting answer choices, the majority of these questions should be prepared in advance. Questions and answer choices can be integrated into lecture notes at appropriate places. This requires some additional preparation, but the increased classroom interaction is well worth the effort. On a few occasions, however, this instructor had a flash of inspiration during a lecture that led to a very effective ARS question/answer interchange with students.

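Both the in-class results display and the five-minute time budget come down to tallying responses quickly. The following stand-in sketch (not the H-ITT acquisition program, and using made-up transmitter IDs and answers) shows the kind of anonymous summary involved.

    from collections import Counter

    def summarize(responses, choices="ABCDE"):
        """Print a text histogram of multiple-choice responses.

        `responses` maps transmitter ID -> answer letter; only counts are
        displayed, so individual students remain anonymous.
        """
        counts = Counter(responses.values())
        total = len(responses) or 1
        for choice in choices:
            n = counts.get(choice, 0)
            bar = "#" * n
            print(f"{choice}: {bar:<20} {n:2d}  ({100 * n / total:.0f}%)")

    # Made-up responses keyed by transmitter ID (not real class data)
    summarize({101: "A", 102: "C", 103: "A", 104: "B", 105: "A", 106: "C"})
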
III. Student Acceptance

Student acceptance of the assimilation feedback was gauged both quantitatively and qualitatively. Acceptance can be measured quantitatively by a record of the number of students who provided answers, especially when we recall that participation in these questions was purely voluntary (i.e., student grades were not affected at all by these questions). Figures 2 and 3 show the number of student responses versus class session for the Spring and Fall semesters, respectively. In blue are the numbers of students responding to assimilation questions, and in red are the numbers of students taking graded quizzes (see next section).

A couple of features of Figures 2 and 3 bear noting. First, assuming that nearly all students took the graded quizzes, note that a relatively large percentage of the students (though not all) participated in the assimilation questions. For class sessions in which only assimilation questions were given (such as classes 9 and 11 of Spring 2004), this implies that most students were attending class. For class sessions in which both assimilation questions and graded quiz questions were administered (such as session 15 in Spring 2004), it is clear that some students chose not to participate in the assimilation exercises. Second, notice that participation seemed to decrease somewhat as the semester progressed (although some of this is attributable to student attrition).

FIGURE 2: STUDENT RESPONSES DURING THE SPRING 2004 SEMESTER (responses versus class session, assimilation questions and graded quizzes)

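The reasoning above, in which the graded-quiz count serves as a proxy for attendance, amounts to a one-line calculation. The numbers and helper below are invented placeholders, not the Figure 2 data.

    def participation_rate(assimilation_responses, quiz_responses):
        """Estimate voluntary participation, treating the quiz count as attendance.

        Assumes nearly every student present takes the graded quiz, so the number
        of quiz responses approximates the number of students attending that day.
        """
        return assimilation_responses / quiz_responses

    # Invented example for one class session (not the paper's data):
    print(f"{participation_rate(48, 57):.0%}")  # about 84% of attendees answered voluntarily
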

FIGURE 3: STUDENT RESPONSES DURING THE FALL 2004 SEMESTER (responses versus class session, assimilation questions and graded quizzes)

Qualitatively, student acceptance of the assimilation questions was evaluated by student comments on the anonymous, end-of-semester course evaluations. For the Spring semester, of the 53 students who wrote comments, 32 mentioned the ARS system, 21 of which contained clearly positive comments and 10 of which specifically cited the assimilation questions as being helpful to them. Corresponding numbers from the Fall semester are 24 students who provided comments, 14 who mentioned the ARS, 10 positive, and 3 who cited the assimilation questions as helpful. Following are a few verbatim comments illustrating that the assimilation feedback questions had their intended effect, at least for these students:

"I liked the informal quizzes. They keep you on top of the little things needed to be successful."

"The remotes were good in that they got us involved in class."

"Ungraded surveys help to see the new material and let me know if I have the right idea. Anonymity during class is nice."

"Truly appreciated the quiz system; instant feedback is very helpful."

GRADED MULTIPLE-CHOICE QUIZZES

Frequent assessment opportunities can be very helpful in keeping students current with course material, especially for courses (such as circuit analysis) in which later material builds on earlier material. The benefit of frequent assessment seems to be especially important for students in their first two years. Weekly quizzes can thus be an excellent teaching/learning tool that is actually appreciated by many students. For a medium-to-large class, however, the costs in both class time (distributing, collecting, and returning quiz papers in addition to the time allowed for the quiz) and instructor effort (grading papers and recording scores) can be significant. While a teaching assistant, grader, or staff assistant could certainly assist with such quizzes, these are not always available to instructors. Using an all-or-nothing quiz format [3] reduces the grading burden, but only moderately so.

An ARS can provide an efficient solution for graded quizzes, in that quizzes can be given every week with minimal time and effort requirements. However, a potential issue is that an ARS constrains the quizzes to multiple-choice format. Multiple-choice quizzes are compared with more traditional exams in the next section. This section describes the quizzing procedure and post-processing used in this case.

I. ARS Quiz Procedure

Quizzes were administered using the ARS every Thursday in both semesters except during exam weeks. These quizzes carried a weight in the course grading scheme of 15% in the Spring 2004 semester and 12% in the Fall. The basic quiz problem(s) would be projected using a camera-display device (these were often hand-drawn circuit diagrams), while the actual quiz questions and answer choices were projected within the ARS acquisition display. See Figures 4 and 5 for an example quiz problem and questions/answers. Note that this procedure totally eliminates distributing and collecting quiz papers.

FIGURE 4: EXAMPLE QUIZ PROBLEM

FIGURE 5: EXAMPLE QUIZ QUESTIONS AND ANSWER CHOICES

Figure 5 illustrates some principles of effective answer choices. First, note that "None of A-D" is a choice for all three questions. In fact, this was included as a choice on every question of every quiz.

Including this choice keeps the students honest, provided that it is sometimes the correct answer (as it is for Problem 1 of Quiz 11). Furthermore, instructors do sometimes make mistakes themselves in calculating correct answers, so this choice ensures that there is always a correct answer for every question. Also, note that the incorrect answer choices represent common mistakes that students might make. In Problem 1, for example, choice A represents the negative of the pole values, choice B represents the zero value, and choices C and D represent possible errors in evaluating the quadratic formula for the denominator roots.

As with Quiz 11, quizzes typically consisted of two or three distinct questions; to allow students to work through the questions at their own pace, a different mode of the ARS was used. In this quiz mode, the sixth transmitter button is used to increment (circularly) the number of the question that a student wishes to answer, and each individual transmitter display includes an indication of which question the student is answering, in addition to providing feedback that an answer had been received.

Training for this more complicated mode was provided in two ways. First, after explaining the quiz mode of operation, a dry-run quiz was administered to make sure students understood how to properly enter answers. Specifically, students were told to enter A for question 1, B for question 2, and C for question 3, and their recorded responses were checked for accuracy. Second, for the first graded quiz, students were instructed to enter answers using the ARS, but they were also instructed to hand in their multiple-choice answers on paper. The paper answers were correlated with the ARS answers, and the very few discrepancies were reported to students with instructions to come for help in using the system. In cases of discrepancy, actual scores for the first quiz were based on the paper answers. For all subsequent quizzes, the ARS was the only allowed means of entering answers.

Quizzes were given during the last 10 to 15 minutes of the class period, so students were free to leave when they had completed the quiz. Typically, however, several students would wait until the end of the quiz period to find out what the correct answers were, thereby giving them near-immediate feedback on their performance.

II. Quiz Post-Processing

The typical procedure for the instructor following a quiz was quite simple. Upon returning to his office, correct answers and point values were entered using the analyzer program. Having done this, all quizzes were instantly graded! Student scores would then be cut and pasted into a course grade spreadsheet. A very nice feature of the analyzer program allowed the instructor, with a single button-click, to send an email message to every student containing his or her answers and score as recorded by the ARS system. So, even if students did not wait in the classroom to get the correct answers, they could get them within a matter of minutes via email. This message not only provided timely performance feedback while the quiz was fresh in their minds, it also allowed them to confirm that the ARS had correctly received their answers. Finally, a solution sheet would be prepared using photocopies of the problem display and the question/answer display. The percentage of correct answers for each question was included as well. This solution sheet was then scanned and posted on the course website. This instructor found that this entire post-processing procedure was almost always accomplished within 30 minutes of the end of class. It is worth noting that this post-processing time is independent of the number of students in the class!

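The H-ITT analyzer program handled this grading and emailing automatically; purely as an illustration, the sketch below walks through the same all-or-nothing workflow with made-up transmitter IDs, answers, and point values (none of these structures are the actual H-ITT data formats).

    # Hypothetical answer key and point values for a three-question quiz;
    # "E" stands in for the "None of A-D" choice.
    answer_key = {1: "E", 2: "C", 3: "B"}
    points = {1: 4, 2: 3, 3: 3}

    # Hypothetical recorded responses: transmitter ID -> {question: answer}
    responses = {
        "TX-0412": {1: "E", 2: "C", 3: "D"},
        "TX-0388": {1: "A", 2: "C", 3: "B"},
    }

    def grade(answers):
        """All-or-nothing scoring: full points for a correct answer, zero otherwise."""
        return sum(points[q] for q, a in answers.items() if answer_key.get(q) == a)

    def email_body(tx_id, answers):
        """Feedback message so a student can confirm what the system recorded."""
        lines = [f"Quiz results for transmitter {tx_id}:"]
        for q in sorted(answer_key):
            lines.append(f"  Q{q}: recorded answer {answers.get(q, '-')}")
        lines.append(f"Score: {grade(answers)} / {sum(points.values())}")
        return "\n".join(lines)

    for tx, ans in responses.items():
        print(email_body(tx, ans), end="\n\n")

Because the per-student work is just a dictionary lookup and a sum, the post-processing time is indeed essentially independent of class size; only the answer key and point values require the instructor's attention.
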
COMPARING MULTIPLE-CHOICE QUIZZES WITH TRADITIONAL EXAMS

The multiple-choice format for quizzes naturally raises several questions. How do students perform on multiple-choice quizzes compared to traditional engineering exams? Would it be advisable to use the multiple-choice format exclusively for assessment? Do multiple-choice questions assess a different type of knowledge than traditional exams? This section will begin to address some of these questions.

First, a traditional engineering exam should be described. As used here, this means a problem-oriented exam in which students work out solutions and provide answers (typically numerical). A traditional engineering exam is also one in which partial credit is awarded for correct problem setup or any number of other reasons, even if the final answer is incorrect. We should also be precise about the multiple-choice format. As used here, this implies that one and only one answer receives credit; i.e., there is no partial credit awarded for incorrect answers. Although the ARS system allows partial credit to be awarded for some answers (e.g., answer A for Problem 1 of Quiz 11 could have been assigned a point value less than the correct answer's value), this was not done in this case. Reasons for this decision are discussed subsequently.

I. Score Comparison

We first present a comparison of multiple-choice quiz scores and traditional exam scores for the two semesters being examined. Each exam score presented here was adjusted by this instructor by adding a certain number of points per exam to every student's score. This was done to account for exams that were too long or too difficult in the instructor's judgment. Quiz scores received no such adjustment.

Table I below compares the average exam and quiz scores for the two semesters (for all students remaining in the class at the end of the semester). There is a significant difference of approximately 9 percentage points between the exam averages and the quiz averages. This is in spite of the instructor's judgment that quiz questions were significantly less difficult than exam questions. Quiz scores are averaged only for those students who took the quizzes; if a 0 score is assigned for every quiz not taken, the quiz averages would drop to 67.6%, 68.1%, and 67.9%, respectively, increasing the difference between exam and quiz scores to approximately 12 points.

TABLE I: COMPARISON OF QUIZ AND EXAM AVERAGES

               Exam 1   Exam 2   Final Exam   Exam Average   Quiz Average
    Spring     87.9%    80.4%    73.8%        80.7%          70.3%
    Fall       82.6%    80.0%    74.2%        78.9%          71.5%
    Average    85.3%    80.2%    74.0%        79.8%          70.9%

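The zero-fill adjustment described above is simple arithmetic, but writing it out makes the widening of the exam-quiz gap explicit. The per-student records below are invented placeholders, not the actual class data.

    # Invented records: each quiz is a percentage, or None if skipped.
    quiz_records = {
        "student_a": [90, 80, None, 70],
        "student_b": [60, 70, 80, 60],
    }
    exam_average = {"student_a": 85.0, "student_b": 72.0}

    def quiz_avg(scores, zero_fill=False):
        """Average quiz score, over quizzes taken or with 0 for missed quizzes."""
        taken = [s for s in scores if s is not None]
        if zero_fill:
            return sum(taken) / len(scores)
        return sum(taken) / len(taken)

    for who, scores in quiz_records.items():
        gap = exam_average[who] - quiz_avg(scores, zero_fill=True)
        print(who, f"exam-quiz gap with zero-fill: {gap:.1f} points")
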

The difference in average scores by itself does not rule out the possibility that students perform relatively the same on quizzes and exams. If there were a high correlation between quiz and exam scores on an individual-student basis, one might argue that the multiple-choice format could be used for exams as well. Anyone who has ever graded traditional exams for a large class will immediately understand the motivation for such a possible switch.

Correlation between exam scores and quiz scores was examined by first normalizing the quiz average to match the exam average, increasing every student's quiz average by the same amount (zeros were included in the quiz average for quizzes not taken). For example, for the Spring semester, every student's quiz average was increased by (80.7 - 67.6) = 13.1 percentage points. Figure 6 shows a scatter plot of the exam and adjusted quiz scores. The wide individual variation between exam and quiz scores is apparent. The quiz-exam difference ranges from +30.8 to -31.1 points on a 100-point basis, with a standard deviation of 12.6. That is, some students scored 30+ points higher (out of 100) on quizzes, while others scored 30+ points lower! Only 39% of the difference magnitudes are 5 points or less, 31% are between 5 and 10, 31% are between 10 and 20, and 17% are larger than 20. Clearly, individual student performance varies considerably between the multiple-choice quiz format and the traditional exam format!

FIGURE 6: CORRELATION BETWEEN QUIZ AND EXAM SCORES (exam average versus adjusted quiz average, with ideal reference line)

What are we to make of this? Faced with this evidence, the multiple-choice format is certainly not a viable substitute for traditional exams. However, this author believes that the difference argues for including both assessment formats within a course, since they may very well be testing different skills. The exam format, with partial credit, tests understanding of broad concepts and procedures, while allowing for small mistakes in the details. On the other hand, the all-or-nothing multiple-choice quiz format emphasizes detail in addition to broad concepts. In fact, the all-or-nothing grading of the quizzes was chosen primarily to impress on students the importance of being careful in their solutions; they were also actively encouraged to check their answers.

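The normalization and difference statistics just described reduce to a few lines of arithmetic. The sketch below uses made-up per-student averages, not the actual class data, simply to show the computation.

    import statistics

    # Made-up per-student averages: exam average and raw quiz average (with zero-fill)
    exam = [88.0, 74.0, 65.0, 92.0, 70.0]
    quiz = [70.0, 75.0, 40.0, 85.0, 62.0]

    # Shift every quiz average by the same offset so the class means match
    offset = statistics.mean(exam) - statistics.mean(quiz)
    adjusted_quiz = [q + offset for q in quiz]

    diffs = [q - e for q, e in zip(adjusted_quiz, exam)]
    print(f"offset applied: {offset:.1f} points")
    print(f"difference range: {min(diffs):+.1f} to {max(diffs):+.1f}")
    print(f"standard deviation of differences: {statistics.pstdev(diffs):.1f}")
    small = sum(abs(d) <= 5 for d in diffs) / len(diffs)
    print(f"fraction of |difference| <= 5 points: {small:.0%}")
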
II. Student Acceptance of Multiple-Choice Quizzes

Given the significantly lower average quiz scores shown in Table I, it is not surprising that students were somewhat disenchanted with the all-or-nothing multiple-choice format. Negative student comments about the ARS system overwhelmingly cited this format: of the 23 students over both semesters who recorded negative comments about the ARS system, 19 specifically mentioned the lack of partial credit on the weekly quizzes. Similar negativism was observed in a previous semester when all-or-nothing grading was used for paper quizzes [3]. Despite the students' hostility toward the grading format, a few students specifically mentioned that the weekly quizzes were helpful to them, as in this example comment:

"It's really annoying to get 66% or 33% on a quiz because of a minor mistake. I do think they do keep the students on their toes, though."

CONCLUSIONS

Two major conclusions can be drawn from this experience. First, an ARS can be a very effective teaching/learning tool for engineering courses, especially for medium-to-large classes. An ARS can provide active learning opportunities and assimilation feedback during lectures. The time savings from automatic grading and recording of scores allow quizzes to be given more frequently than might otherwise be the case. Effective use does require some adjustments on the part of both instructor and students, but the benefits can be well worth the effort.

Second, an all-or-nothing multiple-choice quiz format can be a good complement to traditional exams for course assessment. The wide relative variation in individual student performance (some score significantly better with this format, others significantly worse) points to the possibility that different skills are being assessed by the different formats. In particular, the all-or-nothing grading encourages students to pay particular attention to details and to check their answers. Provided that the overall course weight for such quizzes is limited to about 10-15%, this author plans to continue using ARS-assisted, all-or-nothing, multiple-choice quizzes in future classes, despite students' generally negative view of this format.

REFERENCES

[1] Hyper-Interactive Teaching Technology, http://www.h-itt.com

[2] Felder, R. M., and Brent, R., "Understanding Student Differences," Journal of Engineering Education, Vol. 94, No. 1, January 2005, pp. 57-72.

[3] Petr, D. W., "Cross-Checking and Good Scores Go Together: Students Shrug," Proceedings of the 2001 ASEE/IEEE Frontiers in Education Conference (FIE '01), October 2001, pp. TA3-7 to TA3-12.