The Challenges and Potentials of Evaluating Courses Online



The Challenges and Potentials of Evaluating Courses Online
Mark Troy, Texas A&M University
Hossein Hakimzadeh, Indiana University
Trav D. Johnson, Brigham Young University
Dawn M. Zimmaro, University of Texas at Austin
February 23, 2009

University Experiences with Online Evaluations: Brigham Young University, Indiana University, University of Texas at Austin, Texas A&M University

Agenda: Motivation behind using electronic evaluations; policy and procedural differences; challenges; opportunities; improving teaching; discussion.

Motivation:
- Save time and cost.
- Control, privacy, and ownership of data.
- Accommodate a variety of question and answer types.
- Evaluation templates that can be applied to hundreds of sections at a time.
- Ability to interface with existing campus systems (e.g., PeopleSoft).
- Automatic calculation and aggregation of results.
- Easy report generation.
- Growth of distance education.

Policy and Procedural Differences Between Paper and Online Evaluation Systems:
- Timing of evaluation
- Security of evaluations
- Faculty procedures
- Student procedures
- Processing of results
- Long-term availability of results

Timing of evaluation: Mostly done during the last two weeks of classes; however, some systems allow the open/close period to be adjusted by departments as needed.

Security and anonymity: The systems employ role-based access control. Some systems restrict access for a specified period of time. Student data is protected (typically not maintained, or maintained in a separate system).

Faculty procedures: In some systems, faculty may supplement the campus-level evaluations. In other systems, faculty have no direct interaction with the system (interaction is through the academic departments).

Student procedures: Most of our systems allow students to authenticate using CAS. Some systems provide randomly generated passwords. Some allow students to view prior evaluation results (e.g., items related to learning outcomes).

Processing of results: Timely reports for faculty and administrators.

Long-term availability of results: Data has been stored since inception, but some campuses ask departments to print their reports at the end of each semester. Different campuses have different requirements for how long the data will be maintained. Faculty and administrators are not given access to raw data.

Challenges of Online Evaluations:
- Response rate (generally lower)
- Comparability (generally comparable results between paper and electronic)
- Cost (can be considerable: personnel, maintenance, data storage, etc.)
- Policy implications (open records, maintenance of data, who gets to see the raw data, etc.)
- Sustainability (has to be viewed as a mission-critical service and supported properly)

Effective methods for meeting the challenges of online evaluations. Improving response rates:
- The single most important factor in getting students to submit an evaluation online is convincing them that you value their opinion.
- Faculty who do mid-term evaluations increase their response rates on end-of-term evaluations by an average of 10%.
- Faculty who take part in the evaluation process by selecting their course for evaluation themselves, as opposed to having it selected automatically for them, increase their response rates by an average of 20%.
- Some institutions use a separate system for mid-term evaluations; mid-term evaluations are not returned to the department.
- Publicize through flyers, etc.
- Provide incentives to students.
- Take students to computer labs to complete the evaluations.

Opportunities of Online Evaluations:
- Students become more reflective about teaching and learning
- Increased quality of responses
- Timeliness of feedback
- Better and more flexible reporting
- Cost savings
- Sustainability (less paper)
- Reduced non-response bias (equal opportunity for all to respond)

Using Student Ratings to Improve Teaching and Learning:
- Results from learning items available to students
- Mid-course evaluation tool
- Helping faculty with low student ratings
- Using research on student ratings to improve teaching

Using research on student ratings to improve teaching:
- Student interpretation of student rating items: relevance, caring
- Faculty who increased student ratings by 1.5 points or more and sustained that increase for 3 semesters/terms: active/practical learning, teacher-student interactions

Conclusions:
- Lower response rates are inevitable.
- Increasing response rates requires the active involvement of faculty.
- Online evaluations appear to be resistant to non-response bias.
- Features that allow faculty to obtain better information: mid-term evaluations; flexibility in item selection; ease of collecting, and quality of, student comments.
- Discussing evaluations with students improves response rate and quality of responses.
- Rapid turn-around gives faculty time to make necessary changes.

Discussion. Common experience: Successful online evaluations involve more than a change in delivery systems.

Questions?

Texas A&M

Improving response rate:
- Notification from the university
- Reminders
- Faculty interest: request comes from instructor; instructor discusses the importance of evaluations; instructor discusses the use of evaluations; instructor offers an incentive

Getting Students to Respond: Inducements to Students

Notifying students: e-mail from TA or instructor; course syllabus; course listserv/website; announcement in class.

Getting Students to Respond: Notification to Students

Why students don't do online evaluations: forgot; missed the deadline; didn't have time; don't believe anyone cares; don't see the point of evaluations; inconvenient to go online; don't like to do surveys.

Getting Students to Respond: Why Students Fail to Submit Evaluations

Non-response bias is the bias that arises when students who do not respond have different course experiences than those who do respond ("only those who really like you and those who really hate you will respond").

[Chart: College of Engineering, Texas A&M, class size 31-50. Distribution of response options (A-E) by semester, comparing paper administrations (semesters 6b, 6c, 7a) with PICA online administrations (7b, 7c).]

Incentives: Faculty are uncomfortable bribing students, and students do not like to feel their opinions are bought. But a small incentive can reinforce the importance of the evaluation to the professor, and can reinforce intrinsic motivation.

Examples of incentives: participation points for each student who submits an evaluation; participation points for all students if the class response rate is XX% or higher; early access to grades for students who submit an evaluation; a drawing for an iPod, bookstore credit, etc.; access to evaluation results.

UT Austin

Study goals: (1) review policies and reports on electronic student evaluations from 11 institutions comparable to UT Austin; (2) review response rates and overall ratings from paper and electronic evaluations at five other institutions with published findings; (3) investigate response rates and overall ratings from paper and electronic Course-Instructor Surveys at UT Austin.

Peer institutions: Most UT Austin peer institutions have replaced, or are beginning to replace, their paper systems.

Published research: Initially, response rates are lower for electronic than for paper systems, but rates tend to increase over time as institutions continue to use electronic student evaluations. Online evaluations may be less susceptible to non-response bias than are paper evaluations.

UT Austin comparison: Response rates in electronic format are lower by about 10%. Overall course and instructor ratings do not differ significantly.

BYU

Results from learning items available to students:
- I learned a great deal in this course.
- Course materials and learning activities were effective in helping students learn.
- This course helped me develop intellectual skills (such as critical thinking, analytical reasoning, integration of knowledge).
- The instructor showed genuine interest in students and their learning.

Mid-course evaluation tool:
- Questionnaires can be sent anytime during the semester.
- Faculty can choose from two existing questionnaires or construct their own.
- Questionnaires are automatically sent to all students in a course.
- Faculty receive reports via e-mail in both Excel and Word formats.

Helping faculty with low student ratings: Teaching and learning consultants work together with deans and department chairs to help faculty with low student ratings. A discussion of low-performing faculty is part of the annual performance review for college deans.

IU South Bend

Audience survey:
- Raise your hand if your campus uses paper evaluation forms.
- Raise your hand if students' responses are typed before they are made available to faculty.
- Raise your hand if your school uses electronic evaluation.
- Raise your hand if you are using a commercial system.
- Raise your hand if your campus uses a single form.
- Raise your hand if each department uses a different form.

IU-EVAL

Anonymity:
1. Student passwords are randomly generated.
2. Valid for a specific course.
3. Valid for one use only.
4. Valid for a specified period of time.
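The four properties above can be sketched in code. This is a hypothetical Python illustration, not IU-EVAL's actual implementation (which the presenters note is built on PHP/MySQL); function and field names such as generate_password and redeem are invented for the example:

```python
import secrets
import string
from datetime import datetime, timedelta

ALPHABET = string.ascii_uppercase + string.digits  # easy for students to type

def generate_password(course_id, valid_days=14, length=8):
    """Create a random evaluation password tied to one course.

    The record carries no student identifier, so a submitted
    evaluation cannot be traced back to a student.
    """
    return {
        "password": "".join(secrets.choice(ALPHABET) for _ in range(length)),
        "course_id": course_id,   # property 2: valid for a specific course
        "used": False,            # property 3: valid for one use only
        "expires": datetime.now() + timedelta(days=valid_days),  # property 4
    }

def redeem(record, course_id, attempt):
    """Accept the password once, for the right course, within the window."""
    ok = (not record["used"]
          and record["course_id"] == course_id
          and record["password"] == attempt
          and datetime.now() < record["expires"])
    if ok:
        record["used"] = True  # burn the password so it cannot be reused
    return ok
```

Because the password table is stored separately from student identity data, the system can enforce one response per student per course while keeping responses anonymous, which is the design goal the slide describes.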

Student survey: preference between electronic and paper course evaluations? Electronic evaluation: 76%. Paper evaluation: 8%. No preference: 16%.

Reports: Reports are uniform and easy to generate.

Lessons Learned (Technical):
- Use a 3-tiered, web-based architecture.
- Use mature open-source tools (MySQL, PHP, Linux, Apache).
- Take the time to do extensive code review.
- Centralize quality control (bug tracking/reporting).

Lessons Learned (Non-Technical):
- Develop privacy and use policies.
- Develop appropriate training modules before allowing users to use the system.
- Develop and distribute FAQ sheets to students and faculty.

Lessons Learned (Non-Technical, continued):
- Don't force departments to use the system. (It has taken about 4 years, but nearly 100% of our units are using the system.)
- Support the users who make the transition to the electronic system.

Lessons Learned (Non-Technical, continued):
- Be careful about requests to mine the data.
- Give users better reports than they are getting now.
- Employ techniques to increase student participation.

Use of and Research Related to Electronic Course-Instructor Surveys (eCIS) at The University of Texas at Austin

Use of eCIS System at UT Austin
- Fall 2006: 62 departments, 1,318 instructors, 2,515 unique sections
- Spring 2007: 45 departments, 530 instructors, 874 unique sections
- Fall 2007: 56 departments, 722 instructors, 1,375 unique sections
- Spring 2008: 55 departments, 705 instructors, 1,227 unique sections
- Fall 2008: 59 departments, 695 instructors, 1,253 unique sections

UT Austin eCIS Policy
Starting in the spring 2008 semester, preliminary eCIS results were released early to faculty using the eCIS system:
- About 2 weeks prior to preliminary paper results in the fall
- About 1 week prior to preliminary paper results in the spring
- At the same time as preliminary paper results in the summer
DIIA is working with the Faculty Council, the UT legal office, and the UT governmental relations office to address issues related to typewritten comments.

UT Austin eCIS and Paper CIS Comparison Study
Purpose: To determine whether an eCIS system provides information and security comparable to that provided by the paper-based CIS system.
Goals: (1) review policies and reports on electronic student evaluations from UT Austin's 11 peer institutions; (2) review response rates and overall ratings from paper and electronic evaluations at five other institutions with published findings; (3) investigate response rates and overall ratings from paper and electronic Course-Instructor Surveys at UT Austin.
Results: Overall eCIS results for Fall 2006 and Spring 2007 were compared to paper CIS results for the same courses taught by the same instructor in Fall 2005 and Spring 2006, respectively (total matched course sections = 780).
- Overall response rates for the eCIS were lower (by 6-10%) than those for the matched paper CIS course sections from previous semesters.
- Overall instructor and course ratings for the eCIS were equal or nearly equal (within 0.1) to those for the paper CIS.

Comparing UT Austin with other U.S. institutions
- Most UT Austin peer institutions have replaced, or are beginning to replace, their paper systems with electronic systems.
- As at UT Austin, overall course and instructor ratings do not differ significantly between paper and electronic forms at other institutions.
- As at UT Austin, response rates are somewhat lower for electronic than for paper systems at other institutions; rates tend to increase over time as an institution continues to use electronic student evaluations.
- E-mail reminders and informing students about the importance of student evaluations are the two most used strategies for improving electronic response rates.
- Online evaluations may be less susceptible to non-response bias than paper evaluations.

Conclusions:
- Most UT Austin peer institutions have replaced or are beginning to replace their paper systems.
- Initially, response rates are somewhat lower for electronic than for paper systems, at both UT Austin and other institutions, but rates tend to increase over time as institutions continue to use electronic student evaluations.
- The two most common strategies for improving electronic response rates are sending e-mail reminders and informing students about the importance of their evaluations.
- Online evaluations may be less susceptible to non-response bias than are paper evaluations.
- Overall course and instructor ratings do not differ significantly between paper and electronic forms, either at UT Austin or at other institutions.

Report location: http://www.utexas.edu/academic/mec/publication/pdf/fulltext/ecis.pdf

For more information contact:
Dawn Zimmaro, Ph.D.
Associate Director, Instructional Assessment and Evaluation
Division of Instructional Innovation and Assessment
University of Texas at Austin
512-232-2639
dawn.zimmaro@austin.utexas.edu