[Figure: Programs With Student Learning Outcomes & Curriculum Map, 2008-2013. Yearly Ns: 2013 N=232, 2012 N=233, 2011 N=230, 2010 N=227, 2009 N=226, 2008 N=224. Program SLOs exist: 77%, 88%, 91%, 90%. Program SLOs published: 60%, 89%, 87%, 81%. Course SLOs produced for most courses: 87%, 82%, 81%, 79%, 68%, 52%. Curriculum map exists: 41%, 46%, 82%, 77%, 70%, 66%. SLO = student learning outcome.]
Programs That Conducted Assessment, 2011-2013
                                          2013 (N=232)   2012 (N=233)   2011 (N=230)
  Reported that assessment took place         74%            73%            75%
  Reported that evidence was collected        69%            72%            65%
  Reported results                            55%            53%            59%
  Reported use of results                     54%            48%            51%
[Figure: Programs That Have Program SLOs: 98%, 89%. Programs That Reported They Published Program SLOs: 90%, 93%, 88%, 91%.]

Places Where Programs Published Program SLOs
210 programs (91%) published program SLOs.
  Department website: 177
  Syllabi: 121
  Student handbook: 67
  Program information sheet: 50
  Other (e.g., dept. documents, accreditation rpt.): 40
  Catalog: 38
Note: Programs reported publishing in multiple locations.
[Figure: Programs That Have a Curriculum Map: 91%, 80%, 70%, 82%. Programs That Reported Conducting Assessment in 2012-2013: 75%, 75%, 70%, 74%.]
Types of Evidence Collected for 2012-13 Program Assessment
159 programs (69%) reported collecting evidence.
  Student survey (program-developed): 54
  Embedded assignment/exam*: 50
  Oral performance (presentation, oral defense): 35
  Student interview/focus group: 28
  Thesis/dissertation (undergrad & grad): 27
  Other: 25
  Course survey (e.g., eCAFE): 19
  Faculty/advisor discussion of or report on students: 18
  Exam: qualifying/comprehensive (graduate level): 17
  Exam: exit/program/professional/nat'l: 15
  Alumni survey: 15
  Curriculum analysis/assessment discussion: 14
  Publications, proposals to external groups: 13
  Course grades: 12
  Employer survey/interview: 12
  Student self-assessment/reflection: 11
  Portfolio: 10
  Post-graduation data: 10
  IR/Student data: 9
  Exhibition/performance/project: 7
  Applied experience outside the classroom*: 5
  Capstone: 5
  IRB approval of research: 3
  Syllabi/assignment analysis by faculty: 3
  Entry-level diagnostics/evaluation*: 1
Note: Programs often collect multiple types of evidence. "Other" includes evidence such as surveys of faculty members, meeting minutes, and student course-taking patterns.
*Embedded assignment/exam: an assignment or exam completed as part of regular course requirements.
*Entry-level diagnostics: entrance exams (e.g., GRE), completion of prerequisite courses with a minimum grade, etc.
*Applied experience outside the classroom: e.g., internships and clinical practice.
Person/Group Who Evaluated Evidence Collected for 2012-2013 Program Assessment
148 programs (64%) reported the person/group.
  Faculty committee: 110
  Course instructor: 83
  Department chairperson: 66
  Faculty advisor: 54
  Dean/director: 23
  Ad hoc faculty group: 22
  Students: 22
  External reviewers: 20
  Other (e.g., staff): 18
  Advisor (student support services): 10
Note: Programs often report more than one group/individual who evaluated evidence.

Methods Used to Evaluate the Evidence, 2012-2013
149 programs (64%) reported the method used to evaluate evidence.
  Used professional judgment: 96
  Scored exams/tests/quizzes: 62
  Compiled survey results: 61
  Used a rubric or scoring guide: 61
  Used qualitative methods: 33
  Other: 14
  External organization analyzed data: 8
Note: Programs often report multiple methods to evaluate evidence.
Programs That Reported 2012-13 Results
Of the 159 programs that collected evidence, 128 reported results (55% of all degree programs).

Programs That Reported Use of 2012-2013 Program Assessment Results
126 programs reported use of results (79% of those that collected evidence; 54% of all programs). Programs reported making changes in these areas:
  Course/curriculum: 67
  Assessment processes or procedures: 40
  Students' out-of-course experiences: 37
  Program policies or personnel: 21
  Other: 20
  Results indicated no action needed: 12
Note: Programs often make multiple types of changes.
Examples:
  Courses/curriculum: changes to pedagogy, frequency of course offerings, program requirements, pre-requisites
  Assessment processes or procedures: revise SLOs, rewrite the curriculum map, create a scoring rubric, new activities to increase response rates
  Students' out-of-course experiences: change advising, improve the program website, start brown-bag lunch sessions on career planning, improve facilities
  Program policies or personnel: require SLOs on syllabi, request hiring of a new faculty line
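The headline percentages in this report are simple ratios of program counts to the relevant base (all 232 degree programs in 2013, or the 159 programs that collected evidence), rounded to whole percents. A short Python sketch checks the figures cited above; the constant names are illustrative, not from the report:

```python
# Sanity-check the report's percentages against its raw counts.
N_PROGRAMS = 232          # degree programs in 2013 (from the report)
COLLECTED_EVIDENCE = 159  # programs that collected evidence
REPORTED_RESULTS = 128    # programs that reported results
USED_RESULTS = 126        # programs that reported use of results
PUBLISHED_SLOS = 210      # programs that published program SLOs

def pct(part: int, whole: int) -> int:
    """Percentage rounded to the nearest whole number, as the report does."""
    return round(100 * part / whole)

print(pct(COLLECTED_EVIDENCE, N_PROGRAMS))   # 69% collected evidence
print(pct(REPORTED_RESULTS, N_PROGRAMS))     # 55% of all programs reported results
print(pct(USED_RESULTS, COLLECTED_EVIDENCE)) # 79% of evidence collectors used results
print(pct(USED_RESULTS, N_PROGRAMS))         # 54% of all programs used results
print(pct(PUBLISHED_SLOS, N_PROGRAMS))       # 91% published program SLOs
```

Each computed value matches the percentage stated in the report, confirming that the counts and percentages are internally consistent.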