Online Course Evaluation Literature Review and Findings
Prepared by: Jessica Wode and Jonathan Keiser
Academic Affairs, Columbia College Chicago
Spring 2011
TABLE OF CONTENTS
Summary of Scholarly Research on Student Course Evaluations ... 2
Recommendations for Improving Response Rates ... 6
Additional Reading on Online Evaluations: Annotated Bibliography ... 7
SUMMARY OF SCHOLARLY RESEARCH ON STUDENT COURSE EVALUATIONS

The validity and reliability of course evaluations

Researchers generally consider student evaluations of instructors to be highly reliable and at least moderately valid.1,2,3 Other methods of evaluation (such as evaluations by colleagues or trained observers) have not been found to be reliable and therefore cannot be considered valid.1 Student ratings of instructors have been found to be related to ratings of the instructor's skills in course organization, rapport with students, and fair grading; variance in organizational skill (having an organized course plan and clearly identifying what students need to do) explained most of the variance in student evaluations.4 Alumni rarely change their opinions of former teachers.1,3

When instructors collect mid-term feedback from students and have an honest discussion about it with someone, it leads to higher evaluations at the end of the semester as well as higher final exam scores, providing evidence that good evaluations can lead to better teaching.5

Although grades do have some effect on how students rate instructors,6 the effect is fairly small7 and can be statistically adjusted for.8 Grades do not have as large an effect as how much students feel they've learned,9 how much they felt stimulated by the class,10 and whether the class was appropriately difficult (courses are rated lower for being too easy or too difficult).11 Contrary to the retaliation theory, students who do poorly in a class are equally or less likely than those who do well to complete course evaluations.12

1 Centra, John. 1993. Reflective Faculty Evaluation: Enhancing Teaching and Determining Faculty Effectiveness (Jossey-Bass Higher and Adult Education Series). San Francisco: Jossey-Bass.
2 Hobson, Suzanne M., and Donna M. Talbot. "Understanding Student Evaluations: What All Faculty Should Know." College Teaching 49, no. 1 (2001): 26-31.
3 Aleamoni, Lawrence M.
"Student Rating Myths Versus Research Facts from 1924 to 1998." Journal of Personnel Evaluation in Education 13, no. 2 (1999): 153-166.
4 Jirovec, Ronald L., Chathapuram S. Ramanathan, and Ann Rosegrant-Alvarez. "Course Evaluations: What Are Social Work Students Telling Us About Teaching Effectiveness?" Journal of Social Work Education 34, no. 2 (1998): 229-236.
5 Overall, J.U., and Herbert W. Marsh. "Midterm Feedback from Students: Its Relationship to Instructional Improvement and Students' Cognitive and Affective Outcomes." Journal of Educational Psychology 71, no. 6 (1979): 856-865.
6 Johnson, Valen E. "Teacher Course Evaluations and Student Grades: An Academic Tango." Chance 15, no. 3 (2002): 9-16.
7 Gigliotti, Richard J., and Foster S. Buchtel. "Attributional Bias and Course Evaluations." Journal of Educational Psychology 82, no. 2 (1990): 341-351.
8 Greenwald, Anthony G., and Gerald M. Gillmore. "Grading Leniency Is a Removable Contaminant of Student Ratings." American Psychologist 52, no. 11 (1997): 1209-1217.
9 Bard, John S. "Perceived Learning in Relation to Student Evaluation of University Instruction." Journal of Educational Psychology 79, no. 1 (1987): 90-91.
10 Remedios, Richard, and David A. Lieberman. "I Liked Your Course Because You Taught Me Well: The Influence of Grades, Workload, Expectations and Goals on Students' Evaluations of Teaching." British Educational Research Journal 34, no. 1 (2008): 91-115.
11 Centra, John A. "Will Teachers Receive Higher Student Evaluations by Giving Higher Grades and Less Course Work?" Research in Higher Education 44, no. 5 (2003): 495-518.
12 Liegle, J.O., and D.S. McDonald. "Lessons Learned From Online vs. Paper-based Computer Information Students' Evaluation System." In The Proceedings of the Information Systems Education Conference 2004, v. 21 (Newport): 2214. ISSN: 1542-7382.
Online vs. paper course evaluations

The one consistent disadvantage of online course evaluations is their low response rate;13,14 using reminder e-mails from instructors and messages posted on online class discussions can significantly increase response rates.15 Evaluation scores do not change when evaluations are completed online rather than on paper.14,16 Students leave more (and often more useful) comments on online evaluations compared to paper evaluations.13,16,17 Students, faculty, and staff generally view online evaluations more positively than paper evaluations.13,16

13 Anderson, Heidi M., Jeff Cain, and Eleanora Bird. "Online Student Course Evaluations: Review of Literature and a Pilot Study." American Journal of Pharmaceutical Education 69, no. 1, article 5 (2005).
14 Avery, Rosemary J., W. Keith Bryant, Alan Mathios, Hyojin Kang, and Duncan Bell. "Electronic Course Evaluations: Does an Online Delivery System Influence Student Evaluations?" Journal of Economic Education 37, no. 1 (2006): 21-37.
15 Norris, John, and Cynthia Conn. "Investigating Strategies for Increasing Student Response Rates to Online-Delivered Course Evaluations." Quarterly Review of Distance Education 6, no. 1 (2005): 13-29.
16 Donovan, Judy, Cynthia E. Mader, and John Shinsky. "Constructive Student Feedback: Online vs. Traditional Course Evaluations." Journal of Interactive Online Learning 5, no. 3 (2006): 283-296.
17 Kasiar, Jennifer B., Sara L. Schroeder, and Sheldon G. Holstad. "Comparison of Traditional and Web-Based Course Evaluation Processes in a Required, Team-Taught Pharmacotherapy Course." American Journal of Pharmaceutical Education 66 (2002): 268-270.
Student perceptions of course evaluations

Students tend to feel that evaluations have no effect on teacher performance, and they don't seem to know whether anyone other than the instructor sees the evaluations.18 Surveys of students typically indicate that students believe faculty and administrators don't take their evaluations seriously.19 This belief may be justified, as some studies have found that instructors do not view student evaluations as valuable for improving instruction20 and very few report making changes to their courses as a result of course evaluations.21 Students are more likely to complete course evaluations if they see value in them (e.g., they understand how the evaluations are used and believe that their opinions have an effect).22

18 Marlin, James W., Jr. "Student Perceptions of End-of-Course Evaluations." The Journal of Higher Education 58, no. 6 (1987): 704-716.
19 Spencer, Karin J., and Liora Pedhazur Schmelkin. "Student Perspectives on Teaching and Its Evaluation." Assessment & Evaluation in Higher Education 27, no. 5 (2002): 397-409.
20 Nasser, Fadia, and Barbara Fresko. "Faculty Views of Student Evaluation of College Teaching." Assessment & Evaluation in Higher Education 27, no. 2 (2002): 187-198.
21 Beran, Tanya N., and Jennifer L. Rokosh. "Instructors' Perspectives on the Utility of Student Ratings of Instruction." Instructional Science 37, no. 2 (2009): 171-184.
22 Gaillard, Franklin D., Sonja P. Mitchell, and Vahwere Kavota. "Students', Faculty, and Administrators' Perception of Students' Evaluations of Faculty in Higher Education Business Schools." Journal of College Teaching & Learning 3, no. 8 (2006): 77-90.
Effects of allowing students access to course evaluation data

Students who do not have access to course evaluation ratings rate course evaluations as more important to making a course selection than those who do have access.23 This may indicate that students think course evaluation data will be more helpful than it actually is. If all else is equal, a student is twice as likely to choose an instructor with excellent ratings over an instructor with good ratings; however, students are willing to select a poorly rated instructor if they believe they will learn a lot from the class.24 Students will choose a highly rated course over less highly rated courses even if the workload is greater for that course than the others.25 Results are mixed on whether receiving evaluation information influences how students subsequently rate the instructor.24 Some studies have indicated that students who receive information that an instructor was rated highly will rate that instructor highly, and vice versa.26,27 Rulings involving the University of Wisconsin and the University of Idaho found that students had a right to view the results of student evaluations of faculty.28

23 Wilhelm, Wendy Bryce, and Charles Comegys. "Course Selection Decisions by Students on Campuses With and Without Published Teaching Evaluations." Practical Assessment, Research & Evaluation 9, no. 16 (2004). http://pareonline.net/getvn.asp?v=9&n=16 (accessed February 15, 2010).
24 Wilhelm, Wendy Bryce. "The Relative Influence of Published Teaching Evaluations and Other Instructor Attributes on Course Choice." Journal of Marketing Education 26, no. 1 (2004): 17-30.
25 Coleman, Jeffrey, and W.J. McKeachie. "Effects of Instructor/Course Evaluations on Student Course Selection." Journal of Educational Psychology 73, no. 2 (1981): 224-226.
26 Perry, Raymond P., R. Ronald Niemi, and Keith Jones. "Effect of Prior Teaching Evaluations and Lecture Presentation on Ratings of Teaching Performance."
Journal of Educational Psychology 66, no. 6 (1974): 851-856.
27 Griffin, B.W. "Instructor Reputation and Student Ratings of Instruction." Contemporary Educational Psychology 26, no. 4 (2001): 534-552.
28 Haskell, R.E. "Administrative Use of Student Evaluation of Faculty." Education Policy Analysis Archives 5, no. 6 (1997). http://epaa.asu.edu/ojs/article/view/622/744 (accessed February 22, 2010).
RECOMMENDATIONS FOR IMPROVING RESPONSE RATES

The literature suggests that there are three primary methods to improve response rates on end-of-course evaluations:
1) Make evaluation a part of the course (most effective)
2) Send reminder notices
3) Offer a small incentive

1. Make Evaluation Part of the Course
The most effective method of maintaining high response rates is to make evaluation part of your course. Simply administering a mid-semester course evaluation, then sharing with the class the results and your plan of action based on their feedback, will dramatically improve response rates at the end of the semester. This is because it addresses students' primary complaint about course evaluation: "No one looks at or even cares about what I have to say about the course." Surveys of students suggest that they have little confidence that faculty or administrators pay attention to the results. If you show students that their feedback is important, studies show that they will provide that feedback to you.

2. Send Reminder Notices
At Columbia, as part of the centrally administered option, three email reminders are sent to students through their university email accounts each week the evaluation is open. There is also a pop-up reminder each time a student logs into OASIS. Instructors are encouraged to remind their own students of the importance of the evaluations and to encourage participation through whatever communication channel they have established for their course.

3. Offer a Small Incentive
The literature indicates that small incentives will boost response rates from students. Examples provided include a one-half of one percent grade enhancement or contests for prizes such as an iPod.
ADDITIONAL READING ON ONLINE EVALUATIONS: ANNOTATED BIBLIOGRAPHY

Note: This literature review is based on the bibliography of Collings, D. & Ballantyne, C.S. (2004, November 24-25). Online student survey comments: A qualitative improvement?

Anderson, H., Cain, J., Bird, E. (2005) "Online Student Course Evaluations: Review of Literature and a Pilot Study." American Journal of Pharmaceutical Education 2005; 69(1) Article 5.

The literature review revealed several studies that found no statistically significant differences between delivery modes. Two also noted that students provided more comments in the online forms. Response rates varied widely. The University of Kentucky College of Pharmacy, driven by the faculty's desire for more timely return of results (3-4 months typically), launched a pilot study of online evaluations in 3 courses. The response rates for the 3 courses were 85%, 89%, and 75%. The 9 courses using the paper forms averaged an 80% response rate (consistent with the 2 previous years, also about 80%). The comments on the online forms were more frequent and longer than on the paper forms. Students liked the online form better than the paper form and thought they could provide more effective and constructive feedback online.

Anderson, J., G. Brown, and S. Spaeth. (2006) "Online Student Evaluations and Response Rates Reconsidered."
Innovate 2(6). http://www.innovateonline.info/index.php?view=article&id=301

Synopsis from Innovate: Many administrators are moving toward using online student evaluations to assess courses and instructors, but critics of the practice fear that the online format will only result in lower levels of student participation. Joan Anderson, Gary Brown, and Stephen Spaeth claim that such a concern often fails to acknowledge how the evaluation process already suffers from a substantial lack of engagement on the part of students as well as instructors; the online format, they assert, merely inherits the fundamental problem of perceived irrelevance in the process itself. After addressing the reasons behind this problem and discussing how well-designed online evaluations can still make a positive difference, the authors describe the development and implementation of a comprehensive, college-wide online evaluation survey at Washington State University's College of Agricultural, Human, and Natural Resources. In reviewing the survey results, they found that class size, academic discipline, and distribution method played a negligible role in student response rates. However, they found that variances in response rate were significantly influenced by the relative level of participation among faculty members and department heads in the original development of the survey. The authors maintain that online surveys can make the process more relevant and meaningful to students, but they conclude that eliciting greater response rates will still require sustained support, involvement, and advocacy by faculty members and administrators.

Ardalan, A., Ardalan, R., Coppage, S., and Crouch, W. (2007) "A comparison of student feedback obtained through paper-based and web-based surveys of faculty teaching." British Journal of Educational Technology. Volume 38, Number 6, 2007.
This paper provides a summary of the current research in online vs. paper evaluations as well as results from a study to compare the feedback results. The same form was given to 46 section pairings (one paper and one online). The online response rate was 31% (392 out of 1276 possible responses) and the paper was 69% (972 out of 1415). No significant difference was found in the quantitative ratings between the two methods. They examined the differences on an "overall effectiveness" question in rating for faculty who were above the college average and then for faculty who were below the college average. Faculty who were above the average were scored slightly lower online, and faculty who were below the college average were scored higher online. There was no significant difference in the number of students giving open-ended feedback online; however, there was a significant increase in the length of open-ended feedback online.
Avery, Rosemary J., Bryant, W.K., Mathios, A., Kang, H., and Bell, D. (2006). "Electronic Course Evaluations: Does an Online Delivery System Influence Student Evaluations?" Journal of Economic Education. Washington: Winter 2006. Vol. 37, Iss. 1, p. 21-38 (ProQuest document ID 973267691).

The Department of Policy Analysis and Management at Cornell University did a study of course evaluation data from 1998-2001. Using the same form, data was analyzed from 29 courses (20 using the paper version, 9 using the online version). The study examined response rates and mean scores between the methods. While specific response rates varied, online was typically lower than the paper form. For example, in fall 2000 paper was 69% compared with 47% online. Using a 5-point scale on their 13 questions, 4 questions had a significant difference in mean scores between methods. This was a greater than 0.10 difference, with the web having the higher mean score. The other 9 questions had a less than 0.10 difference in mean scores, again with web having the higher means.

Ballantyne, C.S. (2003). Online evaluations of teaching: An examination of current practice and considerations for the future. In Sorenson, D.L. & Johnson, T.D. (Eds) Online Student Ratings of Instruction, New Directions for Teaching and Learning, No. 96, Winter 2003, Jossey-Bass.

This article summarizes some of the known issues related to online surveys, using Murdoch University's implementation of an online course evaluation system as a case study. Response rates are often lower than desired but can be increased with strategies such as providing computer access, having faculty support for the system, and letting students know how their feedback is used. Online systems need to be designed to prevent multiple ratings of a course by the same student while still protecting students' anonymity. Quantitative ratings are similar to those completed on paper, while comments are more plentiful and more thoughtful. Online rating systems are significantly less expensive than paper evaluation systems.

Ballantyne, C.S. (2004). Online or on paper: An examination of the differences in response and respondents to a survey administered in two modes. Paper presented to the Australasian Evaluation Society Annual Conference, Adelaide, South Australia, 13-15 October, 2004.
http://www.aes.asn.au/conference2004/index.htm#fri

In 2003 Murdoch University carried out a satisfaction survey of all students. Initial contact was via email asking students to respond online. Follow-ups of nonrespondents used the more traditional mailout/paper format. A response rate of fifty percent was achieved, with sixty-three percent of responses coming via the online mode. Male students, younger students, undergraduates, and full-time students were more likely to respond online. Students responding online were less likely to comment, but online comments were lengthier than paper comments.

Bothell, T.W. & Henderson, T. (2003) "Do online ratings of instruction make $ense?" In Sorenson, D.L. & Johnson, T.D. (Eds) Online Student Ratings of Instruction, New Directions for Teaching and Learning, No. 96, Winter 2003, Jossey-Bass.

Researchers compared the cost of an online evaluation system to that of paper evaluations. They found that when Brigham Young University switched to online evaluations, it saved them $235,000 a year. The estimated cost at BYU for paper evaluations is $1.06 per student rating form, compared with $0.47 per online student rating form. The savings come from a reduction in printing costs, a decreased need for personnel help with collection, processing, and reporting, and less time taken away from instructors in the classroom.
Brigham Young University. Student ratings: Frequently asked questions. Retrieved 15th April 2010 from https://studentratings.byu.edu/info/students/faq.asp

Provides information to students about the BYU online course evaluations. Students cannot rate courses after final examinations begin. Students are assured of anonymity but given the option to provide their name to the instructor if, for example, the instructor offers extra credit for completing evaluations (the name is not associated with results, and is only visible to the instructor if at least 5 students complete evaluations for that class). Students can see the results for only four items, which are associated with student learning, and only if they complete evaluations for all their classes.

Carini, R.M., Hayek, J.C., Kuh, G.D. & Ouimet, J.A. (2003). "College student responses to web and paper surveys: Does mode matter?" Research in Higher Education, 2003, 44(1), p. 1-19. Retrieved 13th September 2004 from http://www.kluweronline.com/issn/0361-0365/contents

We examined the responses of 58,288 college students to 8 scales involving 53 items from the National Survey of Student Engagement (NSSE) to gauge whether individuals respond differently to surveys administered via the Web and paper. Our findings suggest that mode effects for first-year and senior college students generally tend to be small. A notable exception involves items related to computing and information technology, which exhibit more favorable responses when answered via the Web. However, our data do not allow us to discern whether this is a true mode effect or whether those most engaged in computing and information technology are also those who gravitate toward the web-based modes.

Cates, W.M. (1993). A small-scale comparison of the equivalence of paper-and-pencil and computerized versions of student end-of-course evaluations. Computers in Human Behavior, 9, 401-409.
This study compared responses to two versions of an end-of-course evaluation instrument completed by graduate students: the traditional printed form completed using pencil and paper, and a microcomputer-based form that presented equivalent items and accepted student responses. A finding of no significant difference in favorableness of composite ratings between the two versions prompted the researcher to perform item-by-item analyses of the two instruments. These analyses revealed that ratings of the individual items on one instrument were highly correlated with the ratings of their matched corresponding items on the other instrument. The paper-and-pencil and computerized evaluation instruments were found to be of almost identically high reliability.

Chen, Y. & Hoshower, L.B. (2003). "Student evaluation of teaching effectiveness: An assessment of student perception and motivation." Assessment & Evaluation in Higher Education 28(1), 72-88.

Over the past century, student ratings have steadily continued to take precedence in faculty evaluation systems in North America and Australia, are increasingly reported in Asia and Europe, and are attracting considerable attention in the Far East. Since student ratings are the most, if not the only, influential measure of teaching effectiveness, active participation by and meaningful input from students can be critical to the success of such teaching evaluation systems. Nevertheless, very few studies have looked into students' perception of the teaching evaluation systems and their motivation to participate. This study employs expectancy theory to evaluate some key factors that motivate students to participate in the teaching evaluation process. The results show that students generally consider an improvement in teaching to be the most attractive outcome of a teaching evaluation system. The second most attractive outcome was using teaching evaluations to improve course content and format. Using teaching evaluations for a professor's tenure, promotion and salary rise decisions and making the results of evaluations available for students' decisions on course and instructor
selection were less important from the students' standpoint. Students' motivation to participate in teaching evaluations is also impacted significantly by their expectation that they will be able to provide meaningful feedback. Since quality student input is an essential antecedent of meaningful student evaluations of teaching effectiveness, the results of this study should be considered thoughtfully as the evaluation system is designed, implemented, and operated.

Collings, D. & Ballantyne, C.S. (2004, November 24-25). Online student survey comments: A qualitative improvement? Paper presented at the 2004 Evaluation Forum, Melbourne, Victoria. http://www.tlc.murdoch.edu.au/pubs/docs/eval_forum_paper.pdf

Given that online evaluations tend to have a decrease in response rate but an increase in comments compared to paper evaluations, these researchers questioned whether an increased response rate online would lead to less valuable comments. This would be likely if the students most eager to participate were also most likely to comment. However, they found that regardless of when students responded to the survey, the percent commenting and the length of comments were nearly the same, with a slight decrease for those responding near the end of the time period. They suggest that qualitative feedback may be more valuable than quantitative feedback, and increasing response rates isn't necessary for quality feedback.

Cummings, R. and Ballantyne, C. (1999). "Student feedback on teaching: Online! On target?" Paper presented at the Australasian Evaluation Society Annual Conference, October, 1999.
Murdoch University School of Engineering ran a pilot in 1999 of online course evaluations using the same form online as on paper. Students found the online form easier and faster, and felt it offered greater anonymity. The school has a 50% mandate for response rate in course evaluations. Typically paper evaluations had a 65% response rate. The online pilot averaged 31%, with 4 of the 18 courses over the 50% mandate. The response rate range was a wide 3% to 100%. Because the pilot was inadequately promoted, some faculty didn't know they were using online forms and didn't adequately prepare students. Students noted that they felt no pressure to fill out the online evaluations. The investigators concluded that the quality of responses was the same because they received the same amount of comments online, which is what is used most from the evaluation form.

Dommeyer, C.J., Baum, P., Chapman, K.S., and Hanna, R.W. (2003). "An experimental investigation of student response rates to faculty evaluations: The effect of the online method and online treatments." Paper presented at Decision Sciences Institute; Nov. 22-25, 2003; Washington, DC.

The College of Business and Economics at California State University, Northridge did a study with 16 professors to see how the method of evaluation affects response rate and if online treatments (incentives) affect the response rate. Each professor taught 2 sections of the same undergraduate business course. The same form was used in both methods. Instructors were randomly assigned into 1 of 4 groups using different incentives: 0.25% grade incentive for completion of an online evaluation (4 courses), in-class demonstration on how to do the online evaluation (2 courses), early release of final grades if 2/3 of the class submitted online evaluations (2 courses), or a control group (8 courses). The online evaluations averaged a 43% response rate and the paper evaluations averaged 75%. Looking at just the control group, their average response rate was 29%. In the individual cases the incentives had the effect of increasing response rate (grade incentive 87% response rate, demonstration 53%, and early final grade 51%).
Dommeyer, C.J., Baum, P., Hanna, R.W., & Chapman, K.S. (2004). "Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations." Assessment & Evaluation in Higher Education 29(5), 611-623.

This study compares student evaluations of faculty teaching that were completed in class with those collected online. The two methods of evaluation were compared on response rates and on evaluation scores. In addition, this study investigates whether treatments or incentives can affect the response to online evaluations. It was found that the response rate to the online survey was generally lower than that to the in-class survey. Additionally, the study found that online evaluations do not produce significantly different mean evaluation scores than traditional in-class evaluations, even when different incentives are offered to students who are asked to complete online evaluations.

Donovan, J., Mader, C., and Shinsky, J. (2006). "Constructive student feedback: Online vs. traditional course evaluations." Journal of Interactive Online Learning. Volume 5, Number 3, Winter 2006.

Abstract: Substantial efforts have been made recently to compare the effectiveness of traditional course formats to alternative formats (most often, online delivery compared to traditional on-site delivery). This study examines not the delivery format but rather the evaluation format. It compares traditional paper and pencil methods for course evaluation with electronic methods. Eleven instructors took part in the study. Each instructor taught two sections of the same course; at the end, one course received an online course evaluation, the other a traditional pencil and paper evaluation. Enrollment in these 22 sections was 519 students. Researchers analyzed open-ended comments as well as quantitative rankings for the course evaluations. Researchers found no significant differences in numerical rankings between the two evaluation formats.
However, differences were found in number and length of comments, the ratio of positive to negative comments, and the ratio of formative to summative comments. Students completing faculty evaluations online wrote more comments, and the comments were more often formative (defined as a comment that gave specific reasons for a judgment so that the instructor knew what the student was suggesting be kept or changed) in nature.

Emery, L., Head, T., Zeckoski, A., and Yu Borkowski, E. (2008) "Deploying an Open Source, Online Evaluation System: Multiple Experiences." Presentation at Educause 2008, October 31, Orlando, FL.

Four institutions, University of Michigan Ann Arbor, Virginia Tech, University of Cambridge and University of Maryland, collaborated on an open source online evaluation system within Sakai. Response rates in the various pilots ranged from 32% to 79%. They found the key benefits of online evaluations to be security, validity, efficiency, cost savings, rapid results turnaround, and higher quality student comments.

Ernst, D. (2006) "Student Evaluations: A Comparison of Online vs. Paper Data Collection." Presentation at Educause 2006, October 10, Dallas, TX.

The College of Education and Human Development at the University of Minnesota did a study on 314 class pairs (14,154 student evaluations) from fall 2002 to fall 2004. The goals were to see if there is a difference in response rate, a difference in response distributions, or a difference in average ratings between the two methods, and what the common perceptions of each method are. In the study group the online form averaged a 56% response rate whereas the paper version averaged 77%. Slightly more students responded on the high and low ends of the 7-point scale than did in the middle. There was no significant difference in the mean rating on 4 required questions.
eXplorance Inc., "A Fresh Look at Response Rates." White Paper. http://www.explorance.com/education/brochures/a%20fresh%20look%20at%20response%20Rates.pdf

This white paper outlines 9 best practices for moving to online course evaluations. Key benefits of moving online are listed, as well as strategies to build response rates.

Fraze, S., Hardin, K., Brashears, T., Smith, J., Lockaby, J. (2002) "The Effects of Delivery Mode Upon Survey Response Rate and Perceived Attitudes of Texas Agri-Science Teachers." Paper presented at the National Agricultural Education Research Conference, December 11-13, Las Vegas, NV.

Texas Tech University studied 3 modes of surveying a random group of Texas Agri-Science teachers. The 3 modes were e-mail, web, and paper. No significant difference in the reliability of the responses was found. However, the response rates were 60%, 43% and 27% for paper, web and e-mail respectively.

Handwerk, P., Carson, C., and Blackwell, K. (2000). "On-line vs. paper-and-pencil surveying of students: A case study." Paper presented at the 40th Annual Meeting of the Association for Institutional Research, May 2000 (ERIC document ED446512).

The University of North Carolina at Greensboro did a study using an online version of a feedback survey for determining why students selected or did not select Greensboro. They found the online version generated more comments, though it had a lower (26%) response rate than the paper version (33%). No significant difference was found in the response content between the two methods.

Hardy, N. (2003) "Online ratings: Fact and fiction." In Sorenson, D.L. & Johnson, T.D. (Eds) Online Student Ratings of Instruction, New Directions for Teaching and Learning, No. 96, Winter 2003, Jossey-Bass.

This study used data from Northwestern University's implementation of an online evaluation system to refute myths surrounding online course evaluations. Contrary to the fears of some faculty members, online ratings were not more likely than paper evaluations to produce negative ratings or comments, and students wrote substantially more comments on the online evaluations. Additionally, any given class may have a higher, lower, or similar response rate when switching from paper to online.
Hmieleski, K. and Champagne, M. (2000). "Plugging in to course evaluation." Assessment, September/October 2000.

The IDEA Laboratory surveyed the nation's 200 most wired colleges as identified by ZDNet. Surprisingly, 98% of the "most wired" schools use primarily paper-based evaluation forms. Of the schools requiring some form of course or faculty evaluation, all currently administer the evaluation forms solely at the end of the term (the "autopsy approach"). Sixty-seven percent of schools reported return rates of 70% or higher for paper-based evaluation. Schools using or pilot-testing a web-based evaluation system reported return rates ranging from 20% to greater than 90%. Only 28% of respondents rated their faculty as very supportive of their school's current evaluation system. Ninety-five percent of schools reported that their faculty members are involved in the development of course evaluations, typically through participation in the faculty senate or by developing evaluation questions. Thirty-one percent of schools reported that students are involved in the development of their college's course evaluation system, typically through participation in the student senate, and 36% of schools allow their students to view the results of course evaluations, typically via the Internet and student
publications.theysuggesta feedback and refinement bywhichstudentscanprovidefeedbackthroughout thecoursetoallowinstructorstomakerapidchangestothecourse,andfoundthatwhenusingafeedback andrefinementsystem,commentstendtobemoreplentifulandinsightful.additionally,theynotethatwhen responsesarerequired,responseratesapproach100%butvaluablecommentsdropdramatically. Hoffman,K.M.2003 Onlinestudentratings:Willstudentsrespond? InSorenson,D.L& Johnson,T.D(Eds)OnlineStudentRatingsofInstruction,NewDirectionsforTeachingand Learning,No.96,Winter2003,Jossey Bass ThisinvestigationwasintendedasanupdatetoHmielskiandChampagne(2000) sarticleoncollegesusingpaper oronlineevaluationforms.oftheinstitutionssurveyed,90%werestillusingaprimarilypaper based evaluationprocess,and12%wereusingnonscannablepaperforms.however,56%wereusingtheinternetfor theevaluationofonlinecoursesorwereplanningtoimplementanonlineratingssystemforonlinecoursesin 2003.MoreschoolsusedtheInternettoreportevaluationresultstofacultythanusedtheInternettocollect ratingsfromstudents;additionally,12%ofinstitutionsallowedstudentstoviewevaluationresults. Johnson,T.D.2003 Onlinestudentratings:Willstudentsrespond? InSorenson,D.L& Johnson,T.D(Eds)OnlineStudentRatingsofInstruction,NewDirectionsforTeachingand Learning,No.96,Winter2003,Jossey Bass BrighamYoungUniversityexperimentedwithdifferentstrategiesforincreasingresponseratestoonlinecourse evaluations.wheninstructorsassignedstudentstocompletecourseevaluations,whetherornottheygave pointsfortheassignment,therewasalargejumpinresponserates.additionally,whentheevaluationform wasshort,studentstookthetimetowritemoreopen endedcomments.studentswhodidnotrespondmost oftendidnotknowabouttheonlineevaluations.infocusgroups,students topsuggestionforincreasing responserateswastoallowearlyaccesstogradesforthosewhocompletedtheevaluations. 
Kasiar, J. B., Schroeder, S. L., & Holstad, S. G. (2002). Comparison of traditional and web-based course evaluation processes in a required, team-taught pharmacotherapy course. American Journal of Pharmaceutical Education, 66(3), 268-70.
In a team-taught course (enrollment = 169), students were randomly assigned to complete evaluations online (n = 50) or by traditional, paper-based methods (n = 119). Web-based and traditional evaluations were compared for Likert score, quantity and quality of student comments, student satisfaction, and consumption of staff and faculty time. Of 252 questions asked of each student, 72 (29 percent) had a significantly different Likert score. In all but two questions, however, the median and/or range differed by only one point and in most cases did not change the overall meaning of the response (e.g., a median response of "Strongly Agree" rather than "Agree"). The number of comments was significantly higher in the web-based group than in the traditional group. Students, faculty, and staff all rated the web process as more convenient and less time-consuming than the traditional method. A web-based evaluation system using subsets of students to complete each evaluation can be employed to obtain representative feedback. The web-based process yields quantitatively and qualitatively superior student comments, enhanced student satisfaction, and more efficient use of faculty and staff time.
Laubsch, P. (2006). "Online and in-person evaluations: A literature review and exploratory comparison." Journal of Online Learning and Teaching, 2(2). http://jolt.merlot.org/vol2_no2_laubsch.htm
The Master of Administrative Science program at Fairleigh Dickinson University performed a study on courses taught by adjunct faculty. The online evaluations received a 61% response rate and the in-class evaluations received an 82.1% response rate. The online evaluations received twice as many comments (counting total words) as the in-class evaluations. On the question about materials being clearly presented (focused on the faculty member), the difference in mean scores between online and in-class was 0.33 on a 5-point scale, with online having the less positive rating; this difference is statistically significant. Administrators noted that both means were better than "Agree" and were not considered poor ratings.
Layne, B. H., DeCristoforo, J. R., & McGinty, D. (1999). "Electronic versus traditional student ratings of instruction." Research in Higher Education, 40, 221-32.
At a southeastern university, a comparison of response effects between paper-and-pencil and online administration of the same form was conducted in 66 courses comprising 2,453 students. Half completed the form online and half on paper. The online response rate was 47% and the traditional group's was 60%, but 76% of the online forms provided comments compared to 50% of the traditional forms. No significant difference was found between the methods.
Liegle, J. O., and D. S. McDonald. "Lessons Learned From Online vs. Paper-based Computer Information Students' Evaluation System." In The Proceedings of the Information Systems Education Conference 2004, v. 21 (Newport): 2214. ISSN: 1542-7382. (A later version appears in Information Systems Education Journal, 3(37). ISSN: 1545-679X.)
http://proc.isecon.org/2004/2214/index.html
Georgia State University's College of Business ran a voluntary pilot from 2002 to 2003 using an identical online version of its paper course evaluation form in the Department of Computer Information Systems. Faculty feared that an online form would yield lower scores and lower response rates; in particular, the fear was that few students would submit online evaluations, that poor students would "take revenge" on the faculty, and that good students wouldn't bother. The paper form had a 67% response rate and the online form had an 82% response rate, likely because the CIS department had easy access to computer labs where students could take the evaluations online. Using a question on teacher effectiveness, the study found no significant difference between the methods. Good students participated in the same numbers, and weaker students completed fewer online evaluations.
Lovric, M. (2006). "Traditional and web-based course evaluations: Comparison of their response rates and efficiency." Paper presented at the 1st Balkan Summer School on Survey Methodology. http://www.balkanprojectoffice.scb.se/paper%20miodrag%20lovrich_university%20of%20belgrade.pdf
The paper presents a short literature review comparing online evaluations with paper. The Economics department at the University of Belgrade, Serbia conducted a small pilot in a course of 800 students in May 2006. Half the students received paper evaluations in class and half were directed to complete an identical online evaluation. The paper evaluation received a 92.5% response rate; the online evaluation received a 52% response rate after an incentive was introduced. Nearly twice as many students filled out the open-ended question online compared to the paper group. On the instructor-related questions there was a variation of 0.09 to 0.22 on a 10-point scale; no statistical analysis was done for significance.
Matz, C. (1999). "Administration of web versus paper surveys: Mode effects and response rates." Master's Research Paper, University of North Carolina at Chapel Hill. (ERIC document ED439694).
In a survey of academic reference librarians in North Carolina, Matz found no significant difference in response contents between the methods used. The online form had a 33% response rate and the paper form had a 43% response rate.
Monsen, S., Woo, W., Mahan, C., Miller, G., & W. (2005). "Online Course Evaluations: Lessons Learned." Presentation at The CALI Conference for Law School Computing 2005.
Yale Law started online course evaluations in 2001 with a less-than-20% response rate. The current 8-question form is run by student representatives and has a 90% response rate; students cannot see their grades until they fill out the evaluation. Northwestern University School of Law started online course evaluations in 2004. So far they have a 68% response rate, which compares to a 70-80% paper response rate; Northwestern is against using any penalties (withholding information from a student until they fill out an evaluation). The University of Denver Sturm College of Law started online course evaluations in 2002 with a pilot of 10 courses. The pilot had an 83% response rate. Continuing into 2003, the pilot expanded to 80 courses (with an 81% response rate) and then to all of their offerings (with a 64% response rate); currently they maintain a response rate around 70%. Duke Law started online course evaluations in 2003 when their Scantron machine broke and the expense of replacing it was too great. They proposed a goal of a 70% response rate and used the same form online. The first term averaged a 66% response rate (with 29% of the 82 courses reaching the 70% goal). In spring 2004 the average was 60% (with 30% of the 119 courses reaching the goal). In fall 2004 the average was 52% (with 8% of the 93 courses reaching the goal). In spring 2005, after dropping non-law students from the pool, the average was 67% (with 41% of the 117 courses reaching the goal). The school is considering several penalties for failure to fill out an evaluation: withholding registration, withholding grades, or withholding free printing.
Norris, J., & Conn, C. (2005).
"Investigating strategies for increasing student response rates to online-delivered course evaluations." Quarterly Review of Distance Education, 6(1), 13-32. (ProQuest document ID 975834871).
This paper reports the findings of two studies done at Northern Arizona University. The first study examined student responses to online course evaluations in 1,108 course sections, using historic data from 2000-2002; this group had an average response rate of 31%. A follow-up questionnaire was sent to 50 faculty in the group to explore what strategies improved response rates. These results informed the second study, on 39 online course sections and 21 sections of a required freshman face-to-face course. The second study used some basic strategies (no penalty strategies) in the implementation of the online course evaluations: two weeks before the end of the course the URL to the evaluation was posted in the course management system; an announcement containing a statement of the course evaluation's value and its due date was sent in a method appropriate to the class (email, online syllabus, or discussion board); and a reminder email containing the URL and due date was sent one week before the class ended. The 39 online course sections averaged a 74% response rate and the 21 face-to-face courses averaged a 67% response rate. In addition, 11 sections of the face-to-face course used paper evaluations and received an 83% response rate. These suggestions are very similar to the emerging findings from the TLT Group's beta project.
Office of Institutional Research and Assessment, Marquette University. (Spring 2008). "On-line Course Evaluation Pilot Project at Marquette University." http://www.marquette.edu/oira/ceval/
Marquette University moved from a copyrighted instrument, the IAS, to its own instrument, MOCES. Because of the copyright concerns, the new instrument has re-worded items that maintain the intent of the IAS items. In the spring semester of 2008 a pilot was conducted in 124 course sections with 3,837 students, evaluating the effectiveness of an online approach versus paper and pencil as well as the software used to deliver the evaluations. Compared to 3-semester averages for paper-and-pencil forms, online response rates were lower in 3 of the 5 pilot departments, comparable in 1, and higher in 1. A power analysis revealed the response rates were high enough for 95% confidence in the results. There was no significant difference in the mean ratings for the 4 core questions between the old IAS form and the MOCES online form.
Online CTE Project Team. (2005). Online course and teaching evaluation: Report on a trial run with recommendations. Teaching and Learning Center, Lingnan University. http://www.ln.edu.hk/tlc/level2/pdf/online%20cte%20report%20050411.pdf
A pilot of 18 classes used an online course and teaching evaluation (CTE) at Lingnan University. For most classes, a member of the project team went to a scheduled class during the evaluation period and explained to students the nature and purpose of the online CTE trial. The average response rate for the 18 classes was 69.7%. Classes with low response rates corresponded to those that had a large number of undeliverable e-mails or had not received a briefing from a project team member. No significant differences were found in mean scores between the online evaluations and previous paper evaluations for the same instructor and course. Only 3 CTEs recorded more student comments than the previous paper-based CTEs; however, the online CTEs contained more elaborate comments. Student feedback indicated that students generally preferred the online CTE; concerns were primarily about the anonymity of their responses, because they were required to log in to the evaluation system.
Sax, L., Gilmartin, S., Keup, J., Bryant, A., and Plecha, M. (2002). Findings from the 2001 pilot administration of Your First College Year (YFCY): National norms. Higher Education Research Institute, University of California.
The YFCY distributed its survey, which assesses student development during the first year in college, using 3 methods: online only, online or paper, and paper only. In a pool of 57 schools, 16 used the alternative methods of distribution. The study found no significant difference in responses between the methods. The overall response rate was 21%; the online-only method had a 17% response rate and the online-or-paper group had a 24% response rate.
Schawitch, M. (2005). "Online Course Evaluations: One Institute's Success in Transitioning from a Paper Process to a Completely Electronic Process!" Presentation at the Association for Institutional Research Forum, June 2005.
The Rose-Hulman Institute of Technology piloted an online course evaluation in 2002 with a small group of faculty. Over the academic year the pilot had a 70% response rate; 77% of students preferred the online mode, and faculty reacted positively to the pilot. In 2003 the entire campus adopted the online form. Over the 3 terms, the online evaluations had response rates of 86%, 78%, and 67%; in 2004 the 3 terms had 75%, 71%, and 67%. Historically, paper evaluations had an 85-87% response rate. They are investigating various incentive possibilities.
Spencer, K. & Schmelkin, L. P. (2002). "Student Perspectives on Teaching and its Evaluation." Assessment & Evaluation in Higher Education, 27(5), 397-409.
The research on student ratings of instruction, while voluminous, has had minimal focus on the perceptions of the students who do the ratings. This study explored student perspectives on course and teacher ratings as well as some issues related to teaching effectiveness and faculty roles. It found that students are generally willing to do evaluations and to provide feedback, and have no particular fear of repercussions.
Thorpe, S. W. (2002). Online student evaluation of instruction: An investigation of non-response bias. Paper presented at the 42nd Annual Forum of the Association for Institutional Research, Toronto, Canada. Retrieved 9 November 2004, from http://www.airweb.org/forum02/550.pdf
Drexel University studied whether significant differences exist in student responses to course evaluations given on paper and online in 3 courses. Response rates in the 3 classes for paper and online (respectively) were 37% and 45%, 44% and 50%, and 70% and 37%. In comparing students who responded to the evaluations across the 3 courses, the study found that women were more likely than men to respond, that students who earned higher grades were more likely to respond, and that students with a higher overall GPA were more likely to respond. For two of the courses the online evaluations had a slightly higher average item rating. For the other course, 2 significant differences were found: students doing the online evaluation were less likely to participate actively and contribute thoughtfully during class, and less likely to attend class, than the paper evaluation group. Overall, however, the responses were not significantly different.