Committee Report and Recommendations to Faculty Senate




To: Longwood University Faculty Senate
From: Ad-Hoc Committee on Student Evaluations (online vs. in-class): Chris Bjornsen (Chair), Susan Lynch, Linda Lau, Mark Lenker, Jay Lynn, Chris Shumaker
Date: April 1, 2009
Re: Committee Report and Recommendations to Faculty Senate

Overall purpose and recommendation: The ad-hoc committee was asked to investigate the pros and cons of switching from the current in-class (face-to-face: "F2F") student assessment of instruction system to an electronic, online student assessment system. The committee met several times over the course of the current academic year to discuss the issues and information obtained, and conducted extensive interviews with faculty and administrators at all of Longwood's peer institutions that were either using or studying the use of online assessments. The committee also conducted an online survey of Longwood University faculty regarding the possibility of switching to an online evaluation system. The results of this survey are included in this report.

Possible scenarios:
a. Continue using the current in-class system
b. Search for and adopt/purchase a different in-class system (because WebSAID will not be functional much longer due to programming problems)
c. Purchase an online system that will function independent of Banner
d. Purchase an online system that will function in conjunction with Banner
e. Conduct a pilot study over a number of semesters in which a random selection of faculty would use both in-class and online forms, and compare results to assess validity, etc.
In sum, based on the information below, the committee recommends that:
- a pilot study be conducted to compare the validity and utility of online vs. in-class assessment systems, in which a random sample of instructors would administer both types of assessments over the course of 2-4 semesters
- senators share the information contained in this report with their departmental colleagues, and engage in departmental discussions on this issue over the next academic year
- the Senate discuss and debate this issue over the course of the coming academic year

The pilot study should address the following concerns:
- The online system must be reliable and easy to use.
- To achieve an appropriate response rate, students must have some incentive to use the system. One approach would be to release student grades electronically only after students have completed the evaluations for their courses; students who do not complete the evaluations would have to wait until paper copies of their grades arrive in the mail.
- To gain the trust of students, the measures taken to preserve their anonymity must be effective and easy to explain.
- To gain the trust of faculty, pilot tests must show that results from the online format do not differ significantly from in-class evaluations. In particular, it is critical to be able to detect response bias that arises when students who either love the class/instructor or (worse) despise the class/instructor respond at a very high rate.
- If pilot testing shows that the online system does not satisfy these concerns, we should continue to use paper-and-pencil evaluations during class time.

After the above processes have taken place, the Senate should have sufficient information to reach a consensus regarding the proposal to adopt or not adopt an online system for student assessment of instruction. Faculty, staff, and administrators should also keep abreast of technological developments in online system administration during the pilot study. In particular, the well-documented concern regarding the completion of student assessments outside of class may well be alleviated in the near future through the development of capabilities for students to complete assessments in class electronically by connecting to the online system via cell phones. The Senate would, of course, have the option of considering recommendations and proposals regarding not only the method of SAI data collection, but also the data collected and the guidelines regarding faculty and administrative use of SAI data.
Peer institution contacts and information: As part of our investigation, the committee determined that the following peer institutions were either using or were in the process of investigating the use of online student evaluations:
- Elon University
- Rollins College
- The University of Tampa
- Salisbury University
- The University of Texas of the Permian Basin
- Truman State University
- University of Scranton
- California State University-Bakersfield
- SUNY at Geneseo
- SUNY College at Plattsburgh
- Trinity University
- Westfield State College
- Seattle Pacific University

Each institution was contacted, and the person in charge of the online evaluation process was interviewed regarding the pros and cons of using online forms. In short, there was significant variability in the views regarding the positive and negative issues and outcomes of using, or moving toward the use of, online evaluations. Other than the well-established determinations that online assessments are easier to conduct and more cost-effective in the long run, and that faculty members in general hold negative views of online evaluations, no consensus emerged from our investigation of other institutions' experiences regarding the overall success of online systems. (Senators will note that this view is not in accord with what software vendors communicate about online systems; see the end of this report.) In fact, there were approximately equal numbers of institutions (a) studying the process, (b) conducting a pilot study using online systems, and (c) actually using an online system.

Examples of specific institutional experiences:

SUNY Geneseo: Located in New York State, a union state, the SUNY universities adopted a standard online assessment instrument several years ago. It has been a dramatic failure according to the person interviewed. Faculty demanded the ability to use their own assessment forms within each department, and most do so, so the online instrument serves little purpose for faculty evaluation and teaching enhancement.

Salisbury University: In general, the faculty at Salisbury University did not like the idea of using online evaluation (Hoffman, 2008). Teaching carries the heaviest weight in faculty performance evaluation, but the student evaluation is only one of many components for evaluating faculty teaching; therefore, faculty members were not terribly upset with this new method of student evaluation. The administrators, on the other hand, held positive views, since results from the online evaluation could be collected and accessed instantly by administrators and faculty.
Truman State University: A phased transition from department to department, along with mandatory monthly training for faculty and staff, contributed to the successful implementation of online evaluation at Truman State University. The online evaluation system is a customized, in-house-built Web-based system connected to the Banner system. Starting with one department at a time, it took Truman State's IT division a year and a half to implement online evaluations in 14 of the 28 departments. The online evaluation is administered over a two-week period before final exam week. It eliminates the use of different systems spread across the departments, yet allows each department to develop its own questions.

Results of survey of Longwood faculty (see attached):
- a majority of faculty (58%) did not have previous experience using online evaluations
- approximately equal numbers of faculty believed online assessments would provide data equal to, and different from, in-class assessments
- 46% of faculty believed instructor ratings would change if online assessments were used, as opposed to 31% who believed ratings would not change
- approximately equal numbers of faculty believed online assessments would be valid or not valid
- 74% of faculty believed online evaluations would free up more class time

- 44% of faculty believed online evaluations would allow students to provide more feedback regarding instruction, while 39% did not believe this would be true
- 62% of faculty believed online evaluations would facilitate faster feedback to faculty
- 78% of faculty agreed that an online system would facilitate adding more specific questions to the evaluation
- 84% of faculty agreed that online evaluations may introduce bias in the results, namely that highly motivated and more disgruntled students would complete evaluations
- 91% of faculty agreed that an online system would reduce faculty control over the administration of evaluations
- 42% of faculty were not in favor of switching to online evaluations; 35% were in favor of switching; 22% were unsure

Pros and cons of using online assessments:

Positive:
- More cost-effective
- Reduces staff workload
- Would potentially allow for a more thorough and useful analysis of student opinions (e.g., student assessments could be analyzed in relation to expected grades, actual grades, GPA, year in school, etc.)
- In some ways, easier for faculty to administer (takes almost no class time; email reminders prompt students to complete assessments)
- In some ways, easier for students to complete (in a dorm room, in the library)
- "Go Green": eliminates the paper used for in-class evaluations
- Possibly eliminates time spent in class on evaluations (unless we adopt a system in which students complete evaluations online in class)
- Student feedback can be used to improve instruction and assignments during the semester

Negative:
- Potential for student ratings of faculty to change dramatically
- Potential, as with any electronic/online system, for data loss, data corruption, programming errors, etc., some of which may not be caught by or evident to faculty, administration, or IT staff
- Potential for the online system to be less user-friendly than the in-class system
- Validity: How can we be certain students take assessments seriously and answer questions honestly?
- How can we build student compliance into the system to ensure that all or most students complete assessments?
- How can we be certain that students complete their own evaluations?
- How will students be sure that their opinions will be anonymous?
- Where should students complete assessments? (Classrooms, computer labs, library, at home?)

- When should students complete assessments? (Should assessments be available only between 8 am and 6 pm, Sunday-Thursday?)
- Should completion by students be required or optional?
- Should credit be added to course grades for completion?
- Should students be proctored while completing forms?
- Should the forms include comments / open-ended responses?
- How will results/feedback be given to instructors?
- How are results to be used by the administration?
- Should an online process also include faculty assessment of chairs, deans, and the VPAA?

Annotated Bibliography: Online Evaluation Literature

Benton, T. H. (2008). Do students' online ratings of courses suck (or rock)? Chronicle of Higher Education (online version). A description of the pros and cons experienced by professors at Hope College (Michigan) after switching to online evaluations, and of how to improve the system and process.

Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment & Evaluation in Higher Education, 33(3), 301-314. Reviews the extant literature on online assessments and finds that, while there is substantial variability in the response rates and utility of online assessments, the prevailing position is that response rates are typically much lower for online assessments (average = 33%) than for face-to-face assessments. The article also includes numerous recommendations for improving response rates to online surveys, and provides details on required response rates by class size.

Fidelman, C. G. (2008). Course evaluation surveys: In-class paper surveys versus voluntary online surveys. Dissertation Abstracts International Section A: Humanities and Social Sciences, 69(2-A), 582. Found that interest in the class topic and expected grade predicted student ratings of the instructor on online assessments.

Sorenson, L., & Reiner, C. (2003). Charting the uncharted seas of online student ratings of instruction. New Directions for Teaching & Learning, no. 96, 1-24. Academic Search Complete, EBSCOhost (accessed March 30, 2009). Sorenson and Reiner provide a helpful overview of the important factors to consider when determining the desirability of a transition from pencil-and-paper to online teaching evaluations at the local level. The authors recommend a multi-step procedure for building consensus among faculty, administration, and IT personnel; collaboration with other institutions to minimize time spent reinventing the wheel; and careful consideration of local circumstances before going ahead with implementation.

Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5), 611-623. ERIC, EBSCOhost (accessed March 30, 2009). The authors find that the transition to an online format for evaluations can result in a decreased response rate, but that the rate can be brought up to levels of participation similar to those of in-class evaluation by providing small grade incentives and/or frequent reminders. The authors also recommend that institutions changing to an online format make it clear to students that the online system will preserve their anonymity. The authors did not find a difference in the scores students gave their teachers.

Oliver, R. L., & Sautter, E. P. (2005). Using course management systems to enhance the value of student evaluations of teaching. Journal of Education for Business, 80(4), 231-234. ERIC, EBSCOhost (accessed March 30, 2009). Oliver and Sautter describe a case study in which evaluations collected via WebCT (1) achieved a response rate comparable to in-class evaluations, and (2) assured students of their anonymity by using a module not associated with the class of the professor the students were evaluating.

Appendix: Vendor response to query regarding software flexibility (CoursEval)

The committee asked Jay Lynn to determine whether either of the recommended software systems would include flexibility allowing individual faculty to determine the dates and times that online evaluations would be available for their own classes (thus more closely approximating the current in-class system used at Longwood). The following response was sent by one software vendor:

This is the first time the issue has been raised by prospective or existing customers, and none of the support staff has had any reports of late-night behavior being a problem with current customers. CoursEval surveys cannot be turned on and off each day. The behavior most reported with students is that all students need the reminder message to respond. Whenever the reminder message is sent, it is followed quickly by submitted surveys. If the reminder email is sent in the afternoon or early evening, there should be few responders very late at night (you can confirm this with the reports and log of participants), and the behaviors of concern will be less likely. Another consideration is that far more students (than my tired body can imagine) are up very late and submitting serious surveys. My experience at my campus was that malicious or poorly behaved students couldn't be bothered to submit a serious survey, day or night. We do not see the bad behavior evident on "RateMyProfessor.com" with CoursEval surveys. There is an option to examine this more closely in CoursEval that wasn't demonstrated for the campus: the campus manager and others with authorized access can glean limited additional information from the written comments with a feature that groups a single student's comments, to see whether a few students 'went negative' on everything and to account for problematic responses. Another option allows the manager to add comments to those few student comments that 'require a response.'
An easier reporting option is to provide response frequencies on the faculty report; this will quickly indicate whether there is bad behavior among the students. Lastly, our customers report that serious surveys elicit serious responses and that faculty engagement elicits student engagement. Messages, reminders, surveys, and campus support all help with this. I'm confident that faculty will quickly realize that bad student behavior is not a problem and that surveys are much more likely to be submitted by the better students (as our customers have shown).