Online Course Evaluations: Collecting, Consolidating, and Leveraging Data
Jackie Charonis, Associate University Registrar
Stanford University, Office of the University Registrar
Stanford University
Private research university founded in 1891
Our Student Body: 6,689 undergraduates; 8,201 graduate students
Our Faculty: 1,807 tenure-line faculty
65 academic departments and interdisciplinary programs
Over 3,000 classes scheduled each quarter
Student Systems at Stanford
PeopleSoft SA 8.0: student records database
Resource 25/Schedule 25 3.3: scheduling software
OnBase: document imaging
What Do You Think: course evaluation software
Developing the System: Our Requirements
Vendor-delivered ASP solution: CollegeNET
Integrated approach: Stanford presence and security via our portal
Self-service application collects and displays results
Paper-free process, including email reminders and announcements
Existing reports replicated in the new online system
Ability to distribute raw data to key users
Our Design Considerations
No changes to existing evaluation forms
One approach for all participating schools
Ease of use (self-service model)
Ease of management (paper-free, hassle-free)
Confidentiality: results reported only when at least 3 students are enrolled in the course or section; combined courses receive one summary of aggregate data
Validity: only one evaluation per enrolled student per course
Accuracy: data validated by a third-party statistical consultant
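A minimal sketch of how the confidentiality and validity rules above might be enforced in a reporting script. The data shapes, field names, and helper functions are assumptions for illustration, not the production CollegeNET implementation:

```python
# Hypothetical reporting rules, not the production system's logic.
MIN_ENROLLED = 3  # confidentiality: suppress results when fewer than 3 students are enrolled


def deduplicate(responses):
    """Keep only one evaluation per enrolled student per course (validity rule)."""
    seen = set()
    unique = []
    for r in responses:  # each r is a dict with 'student_id', 'course_id', 'ratings'
        key = (r["student_id"], r["course_id"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique


def summarize(course_id, responses, enrolled_count):
    """Return aggregate results only when the confidentiality threshold is met."""
    if enrolled_count < MIN_ENROLLED:
        return None  # too few students enrolled: no summary is released
    course_responses = [r for r in deduplicate(responses) if r["course_id"] == course_id]
    if not course_responses:
        return None
    n = len(course_responses)
    mean = sum(r["ratings"]["overall"] for r in course_responses) / n
    return {"course_id": course_id, "respondents": n, "mean_overall": round(mean, 2)}
```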
Our Implementation Timeline
Winter 2005-06 (Pilot): 2 volunteer departments; 85 classes and no discussion sections
Spring 2005-06 (Early Adopters): 25 volunteer departments; 659 classes and 241 discussion sections
Summer 2005-06: No paper
Autumn 2006-07: All departments (excluding professional schools); 1,758 courses and 721 discussion sections
Autumn Quarter 2007-08: Law School joins
Collecting Data: The Forms
Two forms currently used: Course Form and Section Form
Only officially scheduled courses and enrolled students are included
Independent study courses are not included
Collecting Data: Communication
Email notifications to instructors and students when the evaluation period opens
Email reminders
Email notifications to instructors when data is available for viewing
Collecting Data: Timing & Incentives
Evaluation Period
Evaluations are open for two weeks at the end of the term, including finals
Results are available following the grading deadline
Grade Withholding
Grades are always viewable in the system by staff and always appear on official transcripts
Grades are released daily for viewing by students who have completed all of their evaluations
Grades become available to students who do not complete evaluations two weeks after the grading deadline
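A minimal sketch of the daily grade-release rule described above. The field names, two-week window handling, and example courses are assumptions for illustration, not Stanford's actual PeopleSoft logic:

```python
from datetime import date, timedelta

# Hypothetical daily release check; staff visibility and transcripts are unaffected by this rule.
WITHHOLD_WINDOW = timedelta(weeks=2)  # grades open to everyone two weeks after the grading deadline


def grades_visible_to_student(student, grading_deadline, today=None):
    """Return True if the student may view grades in self-service today."""
    today = today or date.today()
    # Everyone sees grades once the two-week withholding window has passed.
    if today >= grading_deadline + WITHHOLD_WINDOW:
        return True
    # Before that, only students who completed every assigned evaluation see grades.
    return all(student["evaluations_completed"].get(c, False)
               for c in student["enrolled_courses"])


# Example: a student enrolled in two courses who finished both evaluations.
student = {
    "enrolled_courses": ["COURSE-A", "COURSE-B"],
    "evaluations_completed": {"COURSE-A": True, "COURSE-B": True},
}
print(grades_visible_to_student(student, grading_deadline=date(2007, 12, 14)))
```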
Our Response Rates
[Chart: response rates by quarter (Autumn, Winter, Spring), comparing paper forms (2005) with online forms (2006); vertical axis 0% to 90%]
Consolidating Data: Basic Reporting
Means by department, school, and areas
Histograms by department, school, and areas
Individual course summaries
Student comments
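A minimal sketch of the aggregation behind reports like these, assuming the evaluation responses can be exported as a flat table; the column names and sample values are placeholders, not the delivered report code:

```python
import pandas as pd

# Hypothetical flat export of evaluation responses.
responses = pd.DataFrame({
    "department": ["History", "History", "Physics", "Physics"],
    "school":     ["H&S", "H&S", "H&S", "H&S"],
    "course":     ["HIST-X", "HIST-X", "PHYS-Y", "PHYS-Y"],
    "question":   ["Q01", "Q01", "Q01", "Q01"],
    "rating":     [5, 4, 3, 4],
})

# Mean ratings by department and question (the same groupby works for school or area).
dept_means = responses.groupby(["department", "question"])["rating"].mean()
print(dept_means)

# Histogram of ratings (1-5) for one department's courses.
hist = (responses.loc[responses["department"] == "History", "rating"]
        .value_counts()
        .reindex(range(1, 6), fill_value=0)
        .sort_index())
print(hist)
```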
Sample Instructor Summary
Sample Mean Summary
Sample Histogram
Leveraging Data: Aspects of Courses
Class size and course level
Class type (e.g., lecture vs. seminar)
Effects of team teaching
Comparisons with grade distributions
Sample Analysis: Single vs. Multiple Instructors
[Chart: mean course evaluation rating (3.5 to 5.0) by question group (Q01-02, Q03-07, Q08-09, Q10-12, Q13-17, Q18), single instructor vs. multiple instructors]
Sample Analysis: Single vs. Crosslisted Courses
[Chart: mean course evaluation rating (3.5 to 5.0) by question group, single listing vs. crosslisted courses]
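A minimal sketch of the kind of group comparison behind the two charts above, assuming a per-course summary table; the grouping flag, question bands, and sample ratings are illustrative assumptions:

```python
import pandas as pd

# Hypothetical per-course summary table.
courses = pd.DataFrame({
    "course":        ["HIST-X", "PHYS-Y", "CS-Z", "ENGR-W"],
    "multiple_inst": [False, True, True, False],
    "question_band": ["Q01-02", "Q01-02", "Q01-02", "Q01-02"],
    "mean_rating":   [4.6, 4.2, 4.4, 4.7],
})

# Mean rating per question band for single- vs. multiple-instructor courses;
# the same pattern applies to single-listing vs. crosslisted comparisons.
comparison = (courses
              .groupby(["question_band", "multiple_inst"])["mean_rating"]
              .mean()
              .unstack("multiple_inst"))
print(comparison)
```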
Leveraging Data: Aspects of Instructors
Same course but different instructors
Tenured vs. untenured instructors
Male vs. female instructors
Sample Analysis: Faculty vs. Visiting Instructors
[Chart: mean course evaluation rating (3.5 to 5.0) by question group, regular faculty vs. visiting instructors]
Sample Analysis: Gender of Instructors
[Chart: mean course evaluation rating (3.5 to 5.0) by question group, male vs. female instructors]
Leveraging Data: Aspects of Students
Self-reported demographic information
Male vs. female students
Evaluating the Evaluation Process
Did the change in medium affect the results?
Did the change in timing of the evaluation period adversely affect the results?
Is there meaning behind blank submissions, or all 1s, all 3s, all 5s?
Examples of Analysis: Paper vs. Electronic
[Histogram: difference between 2006-07 (online) and 2005-06 (paper) area means, for areas with 10 or more evaluations; difference scores binned from -1.00 to +1.00 in 0.05 increments; mean = -0.020, stdev = 0.090, n = 306; negative differences mean paper forms were rated higher, positive differences mean online forms were rated higher]
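A minimal sketch of the difference-score calculation summarized above, assuming area means are available for each year; the area names and values here are placeholders, and the real analysis covered 306 areas with 10 or more evaluations:

```python
import pandas as pd

# Hypothetical area means for the two academic years.
paper_2005_06  = pd.Series({"History": 4.35, "Physics": 4.10, "English": 4.50})
online_2006_07 = pd.Series({"History": 4.32, "Physics": 4.12, "English": 4.46})

# Difference score per area: online minus paper. Negative values mean paper
# forms were rated higher; positive values mean online forms were rated higher.
diff = (online_2006_07 - paper_2005_06).round(3)
print(diff)
print("mean:", round(diff.mean(), 3), "stdev:", round(diff.std(), 3), "n:", len(diff))
```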
The Future of Course Evaluations
Benchmarking
Evaluating learning rather than teaching
Evaluating the impact of instructor training programs
Developing systems for reviewing qualitative data
Expanding the use of course evaluation data
The Future: Mashing Data
Questions & Answers Jackie Charonis Assistant Vice Provost for Student Affairs & Associate University Registrar charonis@stanford.edu Paul Casey Associate Vice President - Sales pcasey@collegenet.com