Teacher Course Evaluations: explorance Pilot
Tara Rose, Director of Assessment, Office of Faculty Advancement & Institutional Effectiveness
Presentation Overview
- explorance Overview
- Process
- Participation
- Faculty Feedback and Results
- Discussion of Response Rates
- Ongoing Improvements
- What's Coming in 2014-2015
- Future Plans
Why explorance?
- Efficiency: a five-step process down to one. No more:
  - Scanning
  - Importing into SAS
  - Running SAS to analyze data
  - Manually distributing reports
  - College contacts responsible for distributing reports
- Flexibility in the TCE form:
  - The current (paper) TCE process: faculty can add up to 13 questions, Likert scale only
  - explorance will allow unlimited questions in any format (limited during the pilot to 13 Likert scale, 13 open-ended, and 10 yes/no)
  - Questions are analyzed for you and added to the TCE
- Students receive ONE TCE per course section regardless of the number of instructors
Why explorance?
- Robust reporting features:
  - Linked to SAP for demographic data
  - Data automatically generated and analyzed
  - Timely feedback
  - Visual reporting
- Provides anytime, anywhere accessibility: laptop, desktop, tablet, smartphone, or screen reader, through any browser and any kind of internet connection
Process
- Email communication with TCE contacts
- Requirements to participate:
  - Use the standard uniform Teacher/Course Evaluation
  - Check the TCE flag box in SAP
  - Identify a TCE liaison for adding supplemental questions at the course level
  - Review the Excel file with course and section IDs
  - Identify department- or college-level questions and send them to IE
- Launch/close question personalization
- Launch/close TCE
- Distribution of reports
Participation

                  Fall 2013   Spring 2014   Summer I 2014   Summer II 2014
Students          8,772       14,761        627             1,130
Faculty           210         748           106             184
Course Sections   365         1,346         112             209
Evaluations       13,009      31,686        745             1,681
Faculty Feedback: Fall 2013
Faculty survey sent to 210 faculty:
- Likert Scale: 82 responses (39% response rate)
- Response Rate (RR) Strategies for the University: 61 responses (29% response rate)
- RR Strategies for Faculty: 55 responses (26% response rate)
Faculty Feedback: Spring & Summer 2014
Faculty survey sent to 846 faculty:
- Likert Scale: 66 responses (7.8% response rate)
- RR Strategies for the University: 47 responses (5.6% response rate)
- RR Strategies for Faculty: 39 responses (4.6% response rate)
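The response rates above are simple percentages (responses divided by surveys sent). As a minimal sketch, the quoted figures can be reproduced with a few lines of Python (the function name and rounding to one decimal place are illustrative choices, not part of the original process):

```python
def response_rate(responses: int, sent: int) -> float:
    """Survey response rate as a percentage, rounded to one decimal place."""
    return round(100 * responses / sent, 1)

# Fall 2013 faculty survey: sent to 210 faculty
fall_likert = response_rate(82, 210)        # 39.0% (reported as 39%)
fall_university = response_rate(61, 210)    # 29.0% (reported as 29%)

# Spring & Summer 2014 faculty survey: sent to 846 faculty
ss_likert = response_rate(66, 846)          # 7.8%
ss_university = response_rate(47, 846)      # 5.6%
```

The same arithmetic recovers each figure on both slides, so the reported rates are internally consistent with the stated survey populations.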
Mean Value Comparison
[Bar chart: "Mean Values: Fall to Spring/Summer," comparing Fall with Spring & Summer mean ratings (ranging from about 3.33 to 3.93 on a 4-point scale) for five survey statements:]
- Using explorance as the main teacher course evaluation method was easy
- The email notifications, which included embedded links to view the response rates and results, were useful
- Comparing course data to the department, college, and university all within the individual faculty report was useful
- I plan to participate in explorance again this fall semester
- I would encourage other faculty in my department to choose explorance as the method of conducting teacher course evaluations
Response Rates by Method

TCE Method   Fall 2013   Spring 2014   Summer I 2014   Summer II 2014
explorance   46.50%      50.96%        38.07%          42.25%
Paper        69.70%      65.50%        N/A             N/A
Qualtrics    39.10%      35.03%        26.56%          28.73%
What are some strategies that you feel the University should consider to help promote an increase in student response rates?

Strategy                                     Fall 2013   Spring & Summer 2014
Requirement to Hold Grades                   20          17
Incentives                                   11          7
Survey Improvement (Content) and Execution   19          3
Return to Paper                              0           10

*Note: Some participants provided multiple solutions.
What are some strategies that you, the instructor, should consider to help promote an increase in student response rates?

Strategy                                     Fall 2013   Spring & Summer 2014
Announcements/Reminders                      22          12
Class time                                   10          10
Requirement/Attach to an assignment          5           0
Survey Improvement/Enhancement               5           0
Incentives                                   4           5
Engage in Student Dialogue/Explain Purpose   0           5

*Note: Some participants provided multiple solutions.
Discussion on Response Rates
- Accessibility: link embedded in email, Blackboard, and evaluate.uky.edu/blue
- Network effect: a central screen for students with all of their evaluations
- Inform students their opinions matter: valued, expected, anonymous
- Marketing the TCE: in class, on the syllabus, with a demo
- Provide time in class
- Instructor monitoring
- Instructor reminders
Ongoing Improvements: Process
- Ability to include all courses, including early/late courses
- Dynamic scheduling: launch date aligns more closely with the actual start/end date (130 different schedules on campus)
- Self-serve supplemental questions
- Reduced human error by integrating with SAP; added many layers of error checking to eliminate bad-data problems
- Instructor flag (TCE flag) in SAP/Event Planning
- Automatic reports two weeks after grades are due
- Emails go out 4x faster; monitoring server loads
- Uploading of TCE questions 20 and 21 into Digital Measures
[Timeline chart: sample scheduling calendars counting down from day 27 before finals. Question Personalization (QP) runs first: QP-Invite, reminders QP-R1 and QP-R2, then QP-Close. The TCE follows: TCE-Invite, reminders TCE-R1 and TCE-R2, then TCE-Close on day 0, immediately before finals week.]
Ongoing Improvements: Survey Instrument
- Questions are numbered
- Questions aligned to the actual type of course: UK Core, UK Core SLOs, labs, seminars, distance learning
- A comment box for each instructor and each course, not just one overall comments box
- Faculty Senate ad hoc committee
explorance Pilot for 2014-2015
- Improve automated scheduling for short classes (1-5 weeks long) (Fall 2014)
- Roll out new website (Fall 2014)
- Develop marketing strategy (Fall 2014)
- Default login Datasource to Users (Fall 2014)
- Integrate with UKAT Helpdesk (Fall 2014)
- Effective cross-listed course reports (Spring 2015)
Plans for Future explorance Pilots
- Combined cross-listed reporting (problems specifically with cross-listed courses with 3-8 cross-listings and/or multiple instructors)
- Importing paper and Qualtrics data is infeasible; UKAT is looking into combining exported data in HANA
- Qualitative analysis
- Continue to be transparent
- Continue to use faculty feedback to improve the process and systems
- Marketing
Marketing
- Newsletters
- Advertisements
- Social networking
- Banners
- Videos from the President, Provost, and Deans:
  http://www.youtube.com/watch?v=ncy1zmjtcia
  http://www.youtube.com/watch?v=nekdgrj8uza
  http://www.youtube.com/watch?v=o2zga2xfqwa
Marketing Examples
Questions?
Contact Information Visit us at http://www.uky.edu/eval/ Email us at institutionaleffectiveness@uky.edu