The Challenges and Potentials of Evaluating Courses Online
February 23, 2009
Mark Troy, Texas A&M University; Hossein Hakimzadeh, Indiana University; Trav D. Johnson, Brigham Young University; Dawn M. Zimmaro, University of Texas at Austin
University Experiences with Online Evaluations: Brigham Young University, Indiana University, University of Texas, Texas A&M University
Agenda: Motivation behind using Electronic Evaluations; Policy and Procedural Differences; Challenges; Opportunities; Improving Teaching; Discussion
Motivation for using electronic evaluations: Save time and cost. Control, privacy, and ownership of data. Accommodate a variety of question and answer types. Evaluation templates that can be applied to hundreds of sections at a time. Ability to interface with existing campus systems (e.g., PeopleSoft). Automatic calculation and aggregation of results. Easy report generation. Growth of distance education.
Policy and Procedural Differences Between Paper and Online Evaluation Systems: Timing of evaluation; Security of evaluations; Faculty procedures; Student procedures; Processing of results; Long-term availability of results
Timing of evaluation: Mostly done during the last two weeks of classes; however, some systems allow the open/close period to be adjusted by departments as needed.
Security and anonymity: The systems employ role-based access control. Some systems restrict access for a specified period of time. Student data is protected (typically not maintained, or maintained in a separate system).
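The role-based access control mentioned above can be sketched as a simple mapping from roles to permissions. This is an illustrative sketch only: the role names, permission names, and functions below are assumptions for the example, not taken from any of the evaluation systems described in this presentation.

```python
# Minimal role-based access control sketch (illustrative; role and
# permission names are invented for this example).

ROLE_PERMISSIONS = {
    "student": {"submit_evaluation"},
    "faculty": {"view_own_reports", "add_supplemental_items"},
    "department_admin": {"view_department_reports", "set_evaluation_window"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

def require(role: str, permission: str) -> None:
    """Raise if the role lacks the permission (e.g., guarding report access)."""
    if not is_allowed(role, permission):
        raise PermissionError(f"role {role!r} may not {permission}")
```

A design like this also makes the "faculty and administrators are not given access to raw data" policy enforceable in one place: raw-data export is simply never added to any role's permission set.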
Faculty procedures: In some systems, faculty are allowed to supplement the campus-level evaluations. In other systems, faculty have no direct interaction with the system (interaction is through the academic departments).
Student procedures: Most of our systems allow students to authenticate using CAS. Some systems provide randomly generated passwords. Some systems allow students to view prior evaluation results (e.g., issues related to learning outcomes).
Processing of results: Timely reports for faculty and administrators.
Long-term availability of results: Data has been stored since inception, but some campuses ask departments to print their reports at the end of each semester. Different campuses have different requirements for how long the data will be maintained. Faculty and administrators are not given access to raw data.
Challenges of Online Evaluations: Response rate (generally lower). Comparability (generally comparable results between paper and electronic). Cost (can be considerable: personnel, maintenance, data storage, etc.). Policy implications (open records, maintenance of data, who gets to see the raw data, etc.). Sustainability issues (has to be viewed as a mission-critical service and supported properly).
Effective methods for meeting the challenges of online evaluations (improving response rates): The single most important factor in getting students to submit an evaluation online is convincing them that you value their opinion. Faculty who do mid-term evaluations increase their response rates on end-of-term evaluations by an average of 10%. Faculty who take part in the evaluation process by selecting their course for evaluation themselves, as opposed to having it selected automatically for them, increase their response rates by an average of 20%. Some institutions use a separate system for mid-term evaluations; mid-term evaluations are not returned to the department. Publicize through flyers, etc. Provide incentives to students. Take students to computer labs to complete the evaluations.
Opportunities of Online Evaluations: Students become more reflective about teaching and learning. Increased quality of responses. Timeliness of feedback. Better and more flexible reporting. Cost savings. Sustainability (less paper). Reduced non-response bias (equal opportunity for all to respond).
Using Student Ratings to Improve Teaching and Learning: Results from learning items available to students. Mid-course evaluation tool. Helping faculty with low student ratings. Using research on student ratings to improve teaching.
Using research on student ratings to improve teaching: Student interpretation of student rating items (relevance, caring). Faculty who increased student ratings by 1.5 points or more and sustained that increase for 3 semesters/terms (active/practical learning, teacher-student interactions).
Conclusions: Lower response rates are inevitable. Increasing response rates requires the active involvement of faculty. Online evaluations appear to be resistant to non-response bias. Features that allow faculty to obtain better information: mid-term evaluations, flexibility in item selection, ease of collecting and quality of student comments. Discussing evaluations with students improves response rate and quality of responses. Rapid turn-around gives faculty time to make necessary changes.
Discussion: Common experience: successful online evaluations involve more than a change in delivery systems.
Questions?
Texas A&M
Improving Response Rate: Notification from the university. Reminders. Faculty interest: request comes from the instructor; instructor discusses the importance of evaluations; instructor discusses the use of evaluations; instructor offers an incentive.
Getting Students to Respond: Inducements to Students
Notifying students: E-mail from TA or instructor. Course syllabus. Course listserv/website. Announcement in class.
Getting Students to Respond: Notification to Students
Why students don't do online evaluations: Forgot. Missed the deadline. Didn't have time. Don't believe anyone cares. Don't see the point of evaluations. Inconvenient to go online. Don't like to do surveys.
Getting Students to Respond: Why Students Fail to Submit Evaluations
Non-response bias: The bias that arises when students who do not respond have different course experiences than those who do respond. ("Only those who really like you and those who really hate you will respond.")
[Figure: College of Engineering, Texas A&M. Distribution of response options (A through E) by semester for class sizes 31-50, comparing paper administrations (6b, 6c, 7a) with PICA online administrations (7b, 7c).]
Incentives: Faculty are uncomfortable bribing students. Students do not like to feel their opinions are bought. A small incentive can reinforce the importance of the evaluation to the professor. A small incentive can reinforce intrinsic motivation.
Examples of incentives: Participation points for each student who submits an evaluation. Participation points for all students if the class response rate is XX% or higher. Early access to grades for students who submit an evaluation. Drawing for an iPod, bookstore credit, etc. Access to evaluation results.
UT Austin
Study goals: (1) review policies and reports on electronic student evaluations from 11 institutions comparable to UT Austin; (2) review response rates and overall ratings from paper and electronic evaluations at five other institutions with published findings; (3) investigate response rates and overall ratings from paper and electronic Course Instructor Surveys at UT Austin.
Peer institutions: Most UT Austin peer institutions have replaced or are beginning to replace their paper systems.
Published research: Initially, response rates are lower for electronic than for paper systems, but rates tend to increase over time as institutions continue to use electronic student evaluations. Online evaluations may be less susceptible to non-response bias than are paper evaluations.
UT Austin comparison: Response rates in the electronic format are lower by about 10%. Overall course and instructor ratings do not differ significantly.
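The comparison above rests on matching each electronically evaluated section to the same course taught by the same instructor in an earlier paper semester, then averaging the differences. A minimal sketch of that kind of matched-section analysis follows; the course IDs and all numbers are invented for illustration and bear no relation to the actual UT Austin data.

```python
# Sketch of a matched-section paper-vs-electronic comparison.
# Each entry: section_id -> (response_rate, overall_rating).
# All values below are hypothetical, for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

paper = {
    "CHEM301": (0.72, 4.1),
    "HIST210": (0.68, 3.9),
}
electronic = {
    "CHEM301": (0.61, 4.0),
    "HIST210": (0.59, 3.9),
}

# Compare only sections present in both formats.
matched = sorted(set(paper) & set(electronic))
rate_diffs = [electronic[s][0] - paper[s][0] for s in matched]
rating_diffs = [electronic[s][1] - paper[s][1] for s in matched]

print(f"matched sections: {len(matched)}")
print(f"mean response-rate change:  {mean(rate_diffs):+.2f}")
print(f"mean overall-rating change: {mean(rating_diffs):+.2f}")
```

With the invented numbers above, the response rate drops by about 10 percentage points while the rating barely moves, mirroring the pattern the study reports.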
BYU
Results from learning items available to students: "I learned a great deal in this course." "Course materials and learning activities were effective in helping students learn." "This course helped me develop intellectual skills (such as critical thinking, analytical reasoning, integration of knowledge)." "The instructor showed genuine interest in students and their learning."
Mid-course evaluation tool: Questionnaires can be sent anytime during the semester. Faculty can choose from two existing questionnaires or construct their own. Questionnaires are automatically sent to all students in a course. Faculty receive reports via email in both Excel and Word formats.
Helping faculty with low student ratings: Teaching and learning consultants work together with deans and department chairs to help faculty with low student ratings. A discussion of low-performing faculty is part of the annual performance review for college deans.
IU South Bend
Audience Survey: Raise your hand if your campus uses paper evaluation forms. Raise your hand if the students' responses are typed before they are made available to faculty. Raise your hand if your school uses electronic evaluation. Raise your hand if you are using a commercial system. Raise your hand if your campus uses a single form. Raise your hand if each department uses a different form.
IU-EVAL
Anonymity (IU-EVAL): 1. Student passwords are randomly generated. 2. Valid for a specific course. 3. Valid for one use only. 4. Valid for a specified period of time.
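The four properties listed above (random generation, course scoping, single use, expiry) can be captured in a few lines. This is a sketch under stated assumptions: IU-EVAL itself is built on PHP/MySQL, and the class name, field layout, and 14-day default window below are invented for illustration.

```python
import secrets
from datetime import datetime, timedelta

class EvalTokens:
    """Sketch of one-time, course-scoped, expiring evaluation passwords
    (hypothetical names; not IU-EVAL's actual implementation)."""

    def __init__(self):
        # password -> [course_id, expires_at, already_used]
        self._tokens = {}

    def issue(self, course_id: str, valid_days: int = 14) -> str:
        """Generate a random password tied to one course (property 1, 2)."""
        password = secrets.token_urlsafe(8)
        expires = datetime.utcnow() + timedelta(days=valid_days)
        self._tokens[password] = [course_id, expires, False]
        return password

    def redeem(self, password: str, course_id: str) -> bool:
        """Accept the password once, for its course, before it expires
        (properties 2, 3, 4)."""
        entry = self._tokens.get(password)
        if entry is None:
            return False
        cid, expires, used = entry
        if used or cid != course_id or datetime.utcnow() > expires:
            return False
        entry[2] = True  # mark consumed so a second use fails
        return True
```

Because the password is the only thing the student presents, the evaluation record never needs to store the student's identity, which is what preserves anonymity.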
Student Survey (IU-EVAL): Preference between electronic and paper course evaluations. [Bar chart: Electronic evaluation 76, Paper evaluation 8, No preference 16.]
Reports (IU-EVAL): Reports are uniform and easy to generate.
Lessons Learned, Technical (IU-EVAL): Use a 3-tiered, web-based approach. Use mature open-source tools (MySQL, PHP, Linux, Apache). Take the time to do extensive code review. Centralized quality control (bug tracking/reporting).
Lessons Learned, Non-Technical (IU-EVAL): Develop privacy and use policies. Develop appropriate training modules before allowing users to use the system. Develop and distribute FAQ sheets to students and faculty.
Lessons Learned, Non-Technical (IU-EVAL): Don't force the departments to use the system. (It has taken about 4 years, but nearly 100% of our units are using the system.) Support the users who make the transition to the electronic system.
Lessons Learned, Non-Technical (IU-EVAL): Be careful about requests to mine the data. Give the users better reports than they are getting now. Employ techniques to increase student participation.
Use of and Research Related to Electronic Course-Instructor Surveys (ecis) at The University of Texas at Austin

Use of ecis System at UT Austin
Fall 2006: 62 departments, 1,318 instructors, and 2,515 unique sections
Spring 2007: 45 departments, 530 instructors, and 874 unique sections
Fall 2007: 56 departments, 722 instructors, and 1,375 unique sections
Spring 2008: 55 departments, 705 instructors, and 1,227 unique sections
Fall 2008: 59 departments, 695 instructors, and 1,253 unique sections

UT Austin ecis Policy
Starting in the spring 2008 semester, preliminary ecis results were released early to those faculty using the ecis system:
o About 2 weeks prior to preliminary paper results in the fall
o About 1 week prior to preliminary paper results in the spring
o At the same time as preliminary paper results in the summer
DIIA is working with the Faculty Council, the UT legal office, and the UT governmental relations office to address issues related to typewritten comments.

UT Austin ecis and Paper CIS Comparison Study
Purpose: To determine whether an ecis system provides information and security comparable to that provided by the paper-based CIS system.
Goals: (1) to review policies and reports on electronic student evaluations from UT Austin's 11 peer institutions; (2) to review response rates and overall ratings from paper and electronic evaluations at five other institutions with published findings; (3) to investigate response rates and overall ratings from paper and electronic Course-Instructor Surveys at the University of Texas at Austin.
Results: Overall ecis results for Fall 2006 and Spring 2007 were compared to paper CIS results for the same courses taught by the same instructor in Fall 2005 and Spring 2006, respectively (total matched course sections = 780).
o Overall response rates for the ecis were lower (between 6 and 10% less) than those for the matched paper CIS course sections from previous semesters.
o Overall instructor and course ratings for the ecis were equal or nearly equal (within 0.1) to those for the paper CIS.
Comparing UT Austin with other U.S. institutions
o Most UT Austin peer institutions have replaced or are beginning to replace their paper systems with electronic systems.
o Similar to UT Austin, overall course and instructor ratings do not significantly differ between paper and electronic forms at other institutions.
o Similar to UT Austin, response rates are somewhat lower for electronic than paper systems at other institutions. Rates tend to increase over time as an institution continues to use electronic student evaluations.
o Email reminders and informing students about the importance of student evaluations are the two most used strategies to improve electronic response rates.
o Online evaluations may be less susceptible to non-response bias than paper evaluations.

Conclusions: Most UT Austin peer institutions have replaced or are beginning to replace their paper systems. Initially, response rates are somewhat lower for electronic than for paper systems at both UT Austin and at other institutions, but rates tend to increase over time as institutions continue to use electronic student evaluations. The two most common strategies for improving electronic response rates are sending e-mail reminders and informing students about the importance of their evaluations. Online evaluations may be less susceptible to non-response bias than are paper evaluations. Overall course and instructor ratings do not differ significantly between paper and electronic forms, either at UT Austin or at other institutions.

Report location: http://www.utexas.edu/academic/mec/publication/pdf/fulltext/ecis.pdf

For more information contact:
Dawn Zimmaro, Ph.D.
Associate Director, Instructional Assessment and Evaluation
Division of Instructional Innovation and Assessment
University of Texas at Austin
512-232-2639
dawn.zimmaro@austin.utexas.edu