ACADEMIC AFFAIRS FORUM
Student Course Evaluation Management and Application
Custom Research Brief
Research Associate: Anna Krenkel
Research Manager: Nalika Vasudevan
October 2012
Table of Contents
I. Research Methodology
   Project Challenge
   Project Sources
   Research Parameters
II. Executive Overview
   Key Observations
III. Process to Distribute and Collect Student Course Evaluations
   Delivery Methods and Timeline
   Transition to Paperless Collection
   Course Evaluation Oversight
   Course Evaluation Content
   Technology
IV. Management of Student Course Evaluations
   Evaluation Storage
   Access to Evaluations
V. Application of Student Course Evaluations
   Evaluation Application
I. Research Methodology

Project Challenge

Leadership at a member institution approached the Forum with the following questions:
What is the process to conduct student course evaluations?
What is the timeline to distribute and collect evaluations from students?
Which office oversees student course evaluations?
How is the content regulated across departments and colleges?
What technologies do universities use in the student course evaluation process?
How do administrators keep completed student course evaluations?
Who has access to completed course evaluations?
How do departments factor student course evaluations into promotion and tenure review?

Project Sources

The Forum consulted the following sources for this report:
Advisory Board's internal and online research libraries
The Chronicle of Higher Education
National Center for Education Statistics (NCES)

Research Parameters

The Forum interviewed academic administrators and provosts at private institutions.

A Guide to Institutions Profiled in this Brief

Institution | Location | Type | Approximate Enrollment (Undergraduate/Total) | Classification
University A | Mid-Atlantic | Private | 2,900 / 4,100 | Master's Colleges and Universities (larger programs)
University B | Midwest | Private | 1,900 / 4,900 | Master's Colleges and Universities (larger programs)
University C | South | Private | 2,500 / 4,200 | Master's Colleges and Universities (larger programs)
University D* | South | Private | 3,800 / 6,200 | Research Universities (very high research activity)
University E* | Midwest | Private | 7,200 / 13,900 | Research Universities (very high research activity)

Source: National Center for Education Statistics
*These interviews were conducted for previous research.
II. Executive Overview

Key Observations

Students complete online course evaluations at lower rates than paper course evaluations; therefore, provost's office staff and individual faculty offer rewards and academic incentives to encourage student participation in the evaluation process. Strategies to increase the percentage of students who complete course evaluations include raffles for money, electronics, and extra credit; early access to final grades; charitable donations; and access to the final exam in return for a certificate of completion.

The provost and faculty senate at all institutions write a standard set of questions for undergraduate courses; at institutions with online delivery methods, faculty can add questions to reflect course structure or content. Individual faculty can customize online forms to evaluate course-specific formats (e.g., lab sections, one-on-one music lessons) or content (e.g., texts, technology, specific assignments), and faculty can restrict department chairs from viewing answers to these customized questions. College deans and faculty determine student course evaluation questions for graduate students.

Deans, department chairs, and faculty review student course evaluations to inform tenure and promotion decisions and to recommend professional development training and workshops to faculty. Faculty members undergoing tenure review are responsible for providing summary reports (i.e., average responses for each question and overall course satisfaction) to review committee members. During the annual review process, department chairs note faculty with consistently low reviews in a specific area and recommend professional development.

The provost's office, dean's offices, department chairs, and individual faculty members store completed student course evaluations for at least ten years.
At institutions with online storage, the office of academic administration, the dean's office, the department chair, and the faculty member evaluated have access to completed student course evaluations. At institutions with paper systems, these offices receive student course evaluation summary reports, and the faculty member under review receives the original copies of evaluations with student comments.

Provost's offices collect and analyze course evaluations through software such as Banner, Qualtrics, Excel, and the Statistical Package for the Social Sciences (SPSS). These programs save resources by eliminating staff time spent on data entry and reducing spending on office supplies, and they compile summary reports for each course. Deans, department chairs, and faculty present the statistics (e.g., mean ratings, standard deviations) these programs produce during the tenure review process.
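The per-question summary statistics described above (mean rating and standard deviation for each question) can be computed with very little tooling. The sketch below is illustrative only, not the institutions' actual software; the question names and response values are hypothetical 1-to-5 Likert data.

```python
# Minimal sketch (assumed data, not from the brief): compute the per-question
# mean and standard deviation that packages such as SPSS or Excel produce
# for a course's summary report.
from statistics import mean, stdev

# Hypothetical 1-5 Likert responses for one course, keyed by question text.
responses = {
    "Quality of course discussions": [4, 5, 3, 4, 4],
    "Overall satisfaction with course": [5, 4, 4, 5, 3],
}

def summarize(course_responses):
    """Return {question: (mean rating, sample standard deviation)}."""
    return {
        question: (round(mean(scores), 2), round(stdev(scores), 2))
        for question, scores in course_responses.items()
    }

summary = summarize(responses)
```

A report like this, generated per course, is what faculty print and deliver to tenure review committee members.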
III. Process to Distribute and Collect Student Course Evaluations

Delivery Methods and Timeline

Students Complete Course Evaluations During Last Two Weeks of Semester Using Scannable Paper or Online Tools

Only students at University A complete the majority of course evaluations on paper, although contacts foresee a transition to an online system in the next five to ten years. Administrators deliver the course evaluation forms to faculty two or three weeks before the semester ends. During the last one to two weeks of the course, the instructor selects a student to distribute and collect the evaluations during class time. The instructor then leaves the room and does not return until the selected student collects the completed forms and departs the classroom to deliver them to the dean's office. Academic administrators report that ninety percent of students complete course evaluations.

Paper Evaluation Timeline
1. Administrators deliver forms to faculty two to three weeks before the end of the course
2. A student administers evaluations during the last one to two weeks of the course
3. Faculty review aggregate information three to four weeks after final grades are posted

University B and University D both use an online system to deliver and collect student course evaluations. Two weeks before courses end, all students begin to receive automated emails from academic administrators with links to course evaluation forms, and many faculty deliver in-class reminders to complete the evaluations. At University B, students may complete the evaluations during the last week of a course and the week after the course completes. Contacts at both institutions acknowledge that lower participation rates are a concern. At University D, 75 to 80 percent of students completed paper course evaluations; however, this rate has decreased since administrators implemented the online system.
Although over 50 percent of students completed evaluations during the first year with the online system, the percentage decreased in subsequent years.

Graduate Students Less Responsive to Emails

Contacts report graduate students are less responsive to automated email reminders to complete course evaluations because their courses end at different times throughout the semester; as a result, they do not develop the habit of completing evaluations at the end of the semester.

Online Evaluation Timeline
1. Administrators send reminder emails with links to students two weeks before the end of the course
2. Students complete evaluations during the last week of the course and the week after the course ends
3. Faculty review aggregate information three to four weeks after final grades are posted
Transition to Paperless Collection

Administrators Transition to Online System Through Trial Period

Administrators at University C are piloting an online course evaluation system. In fall 2011, faculty began debating whether to transition to an online course evaluation system, and at the end of the semester the faculty senate voted in favor of a one-year trial of an online evaluation system. In fall 2012, faculty will vote on whether to continue with the online system or revert to the paper system used previously. Faculty will not present data from the online evaluations collected during the trial period to tenure review committees. Additionally, the student participation rate decreased from 75 to 80 percent to 60 to 65 percent completion after the switch to the online system.

Transition to Paperless Timeline at University C
1. Faculty debate and vote on a trial period for a paperless evaluation system
2. Course evaluations are administered online for the trial period (one year)
3. Faculty vote at the end of the trial period on whether to continue the online system

Course Evaluation Oversight

Head of Academic Administration Oversees Process with Input from Faculty

At all institutions, the office for academic administration or faculty affairs manages the student course evaluation process. Staff in the provost's office work with the faculty senate and faculty committees to determine the questions on student course evaluations. They also distribute paper evaluations to faculty and remind students to complete online evaluations. Dean's offices and departments within individual colleges are responsible for analyzing and distributing completed evaluation forms. At University A, staff in the dean's office scan each paper evaluation. Administrators at University A and University C distribute paper copies of completed student course evaluations: the dean's office provides them to department chairs, who deliver them to individual faculty members.
At institutions that are entirely paperless, deans, department chairs, and faculty can access all student course evaluation data online.

At University A, centers for teaching help academic administrators at the beginning of the process by clearing previous data from the systems that receive scanned evaluations and entering the number of students enrolled in each course. At University B, by contrast, the center for teaching structures faculty development trainings and workshops based on the outcomes of evaluations. Center staff also offer to meet with individual faculty to discuss how to customize student course evaluations to solicit the most helpful student feedback.

Course Evaluation Content

Provost's Office Standardizes Undergraduate Course Evaluations Across Colleges and Departments

The questions administered across courses at the undergraduate level rarely change, and evaluations for all courses include the same 10 to 13 questions on topics such as:
Quality of course discussions
Quality of reading materials
Usefulness of laboratory work or practicums (if applicable)
Instructor knowledge of and interest in the course subject
Instructor availability outside of course
Instructor organization
Instructor lecture skills
Overall satisfaction with the course
Overall satisfaction with the instructor

At University B, faculty can add questions specific to their course. For example, if a faculty member is testing new texts or technology, they can include questions specifically about those materials. Additionally, they can adjust settings so their deans and department chairs do not receive responses to supplemental questions. This allows them to solicit feedback from students on classroom innovations without the results appearing in tenure or promotion review processes. Several institutions also have additional evaluations for courses with labs and practicums.

At the graduate level, student course evaluations vary more by college within an institution. College deans and faculty contribute to the development of these evaluations, and the questions are typically more subject-specific. College deans occasionally base questions on the undergraduate student course evaluations.

Technology

All Institutions Use Software to Summarize Student Course Evaluation Data

Online tools allow institutions to collect course evaluations from students without reducing time spent learning in the classroom. Offices for academic administration at institutions with paperless evaluation systems require fewer resources (e.g., staff hours, office supplies) to enter and evaluate data. (See below for a description of all software tools.)
Student Course Evaluation Software for Collection and Analysis

Software | Institution | Users | Purpose | Satisfaction
Class Climate | University A | Dean's office, department chairs | Evaluation data analysis | Academic administrators report low satisfaction because staff struggle to create graphs, and deans and department chairs complain that they receive an email for every scanned evaluation.
Qualtrics | University C | Office of academic administration, students | Collects student responses; some evaluation data analysis | Academic administrators report satisfaction because they already owned the software, so it required no additional investment, although Qualtrics has limited analysis capabilities.
Excel | University C | Office of academic administration | Evaluation data analysis | Academic administrators report satisfaction; they use Excel in conjunction with Qualtrics and SPSS because Excel creates more insightful graphs than the other programs.
Statistical Package for the Social Sciences (SPSS) | University C | Office of academic administration | Evaluation data analysis | Academic administrators report satisfaction; they use SPSS in conjunction with Excel and Qualtrics because individually the programs do not have adequate analytical capabilities.
Banner | University B | Office of academic administration, faculty, department chairs, deans, students | Collects student responses; evaluation data analysis | Staff in the office of academic administration report high satisfaction with Banner for collection and analytics.
Administrators Support Transition to Paperless System; Faculty More Hesitant About Change

45,000 Pieces of Paper Saved
Contacts at University C estimate they saved 45,000 pieces of paper by eliminating paper surveys. Other saved resources include 100 staff hours.

Contacts identified two methods to transition to an online course evaluation system. The first is a top-down transition, in which academic administrators decide to transition and announce the change to faculty and students. The alternative is a bottom-up transition, in which the faculty senate discusses and votes on whether to transition to an online process.

Administrator-Driven Transitions to Online Student Course Evaluations

Challenge: Faculty unfamiliar with technology
Strategy: Provide trainings to faculty. At University B, academic administrators presented the online course evaluation system during an all-faculty meeting. The presentation included a training on the interface for faculty (e.g., how to customize the questions, how to download and manipulate the data). Additionally, a student demonstrated the interface students see as they complete the online course evaluation form, so faculty could understand how the changes they make to the form affect the student experience. After the initial formal presentation, most faculty training has been informal, with faculty training other faculty.

Challenge: Faculty resistant to change
Strategy: Present a unified position. After academic administrators decide to use an online course evaluation system, every academic leader and staff member must disseminate accurate information about the transition. To ensure buy-in from faculty and staff who oppose the new platform, academic leadership must articulate the benefits (e.g., fewer administrative resources, less class time dedicated to evaluations) for all members of the community (e.g., academic staff, faculty, students).
Faculty-Driven Transitions to Online Student Course Evaluations

Challenge: Faculty concerned with the impact on tenure review
Strategy: Offer a low-risk trial period. Faculty at University C voted for a trial period before determining whether to continue with online course evaluations. One condition of this trial period was that departments would not use the evaluations collected through the online process in tenure reviews, so that faculty could experiment with the system without the threat of repercussions for their careers. At University D, a faculty working group studied the types of students who complete online course evaluations; its results allayed fears that only students with extreme viewpoints responded to online evaluations.

Challenge: Misinformation increases faculty apprehension
Strategy: Organize open forums. Provosts and associate provosts organize open forums for faculty to express concerns and for academic administrators to demonstrate the value of online course evaluation systems. Contacts indicate faculty express concern about lower response rates and lower ratings, while academic administrators emphasize the decreased workload.
Online Student Response Rates Are Lower than In-Class Response Rates

Contacts identify maintaining high student participation as the greatest challenge during the transition to online course evaluation systems. At University C, the student participation rate decreased from 75 to 80 percent to 60 to 65 percent completion after the switch to an online system. All institutions that went through this transition report a significant decrease in the percentage of students who complete online course evaluations in comparison to the number who completed paper course evaluations. Methods to increase student participation occur both at the institutional level and through individual faculty initiatives. Contacts at University D expressed concern that certain types of incentives, such as raffles and academic credit, send a message to students that they should participate only for personal gain, not community improvement.

Strategies to Increase Student Participation in Online Course Evaluations

Access to Final Grades: Faculty allow students who complete the course evaluation for their course to access the final grade before other students. Conversely, students who do not complete course evaluations may have to wait an additional period (e.g., two weeks) to access their final grade. At University B, academic administrators will implement this policy at the institutional level starting in the fall.

Institution-wide Raffles: For institutions with campus-wide raffle initiatives, automated systems enter students who complete a course evaluation into a raffle. The first year after implementing an online system, the office for academic administration at University D spent $2,000 on iPods for the raffle and experienced over 50 percent participation. In subsequent years, the amount spent on raffle prizes has decreased, and prizes include gift cards to campus bookstores and food vendors; however, the participation level also decreased. Contacts speculate that even if iPods were reinstated as raffle prizes, participation would not improve, because so many students already own iPods that they no longer serve as incentives.

Faculty-specific Raffles: Some faculty offer course-specific raffles for enrolled students. At University B, students who complete online course evaluations print out the certificate they receive upon completion and present it to the faculty member, who enters them in a raffle; potential prizes include extra credit on the final exam, $5 gift cards, and items from the school bookstore.

Charitable Incentives: At University E, academic administrators identified an alumnus willing to donate a dollar to charity for every completed online course evaluation; during the first semester of the program, they raised over $50,000. Variations on this program include alumni donations to the institution for every completed evaluation.

Academic Incentives: Individual faculty offer students extra credit in the course overall, or on the final exam, if they complete the online course evaluation. At University B, some faculty require students to print a screenshot of the evaluation completion certificate and bring it to class before they receive the final exam. If students do not have one, they must visit a computer lab and complete the evaluation before beginning the final exam.
IV. Management of Student Course Evaluations

Evaluation Storage

Course Evaluations Stored for at Least 10 Years in Digital and Paper Formats

At all institutions, multiple offices and individuals store copies of student course evaluations. Online storage systems allow institutions to maintain completed course evaluations for a longer period of time without space limitations, and they also facilitate access by faculty members and their supervisors.

Student Course Evaluation Storage

Institution | Format | Length of Time
University A | Digital (dean's office and department chairs); original paper (faculty members) | Indefinitely
University B | All digital | Indefinitely
University C | All paper | 10 years (provosts determined the length of time)
University D | All digital | Indefinitely

Access to Evaluations

Faculty Subjects of Evaluations and Their Supervisors May Access Completed Student Course Evaluations for Annual and Tenure Reviews

At University A, University B, and University C, academic administrators restrict access to completed course evaluations to the faculty member under evaluation, their department chair, and the college dean. Provosts and the office of institutional research may view the data on request but rarely do so; academic administrators view the information as confidential. At University B, all faculty access their results through a password-protected Banner system; department chairs and deans may access the evaluations for faculty under their purview. At University C, access to student course evaluations varies across departments. The office of academic administration stores paper copies in locked filing cabinets, and academic administrators access scanned forms and data through password-protected computers.
Academic Administrators Publish Evaluation Results Online for Students to Inform Course Selection and Create a Culture of Transparency

At University D, students access information from student course evaluations through an online portal, which informs students' course selection and increases transparency in the faculty evaluation process. In 2008, administrators began to publish average ratings for all evaluation questions, along with student comments, in response to students' concern that faculty and administrators did not consider their evaluations when conducting faculty reviews. Academic administrators do not review the comments before they are posted online, but faculty can request the removal of comments they deem offensive (e.g., comments about faculty appearance or personal attacks). However, contacts note that since students' comments were posted online, the tone of comments has become more civil and students invest more in the evaluation process.

Publishing Student Course Evaluations Increases Student Engagement

"Students enroll in courses that are badly taught, and they fill out evaluations, and then the course is badly taught again. Sometimes professors rated as excellent do not receive tenure, and those with poor ratings do; there seems to be no impact from student evaluations. Since we cannot bring students into the tenure review committee, we took a rather controversial step, one students saw as universally positive and faculty less so, of publishing all comments and averages online from student course evaluations. Their comments directly impact other students, and that's a visible benefit."
- Council Interview

Evaluation Content Not Changed or Removed to Protect Student Identity

All institutions allow faculty to view students' anonymous comments, but they do not remove any information that may reveal a student's identity.
To preserve student privacy, institutions do not request students' names or dates of birth, although they do collect students' majors, grade point average ranges, class years, and anticipated grades in the course. Contacts at University A, which uses a paper system, note that faculty familiar with a student's handwriting could identify the student through comments; this is not a concern for institutions with online course evaluation forms.

No Course Evaluations for Small Courses

To protect student anonymity, faculty at University C do not administer student course evaluations in undergraduate classes with fewer than ten students or in graduate courses with fewer than five students.
V. Application of Student Course Evaluations

Evaluation Application

Faculty Submit Summary Data from Student Course Evaluations to Tenure Review Committees

At all institutions, tenure review committees use the results of student course evaluations in the review process, although the weight the evaluations carry depends on the department. At University B, faculty, deans, and provosts debated whether faculty members should submit course evaluations to tenure review committees or whether the committee should access and read the evaluations online. Academic leaders decided the latter option relied too much on the proactive behavior of tenure committee members; instead, the faculty member under review prints summary data for each semester over the period under review and delivers paper copies to each member of the tenure review committee.

At University A, the faculty member under review creates a portfolio, which they submit to the tenure review committee as an argument for their eligibility for a tenured position. The portfolio includes summary information from student course evaluations and possibly quotations from responses to open-ended questions. A faculty committee at University D reads all the student course evaluation comments for a specific department and provides summaries to each faculty member on the tenure review committee.

Evaluations Inform Professional Development

Department chairs use student course evaluations to recommend professional development for faculty in areas such as:
Classroom management
Public speaking
Lesson planning
Syllabus development

Offices of Institutional Research Compare Data Across Departments

At all institutions, provosts and offices of institutional research have the option to view the results of student course evaluations; however, no provosts choose to view the data. At University A, the office of institutional research creates department summaries every five years.
This summary includes a table that compares the average overall course rating for a department to the averages across the college and university. Administrators use this number to evaluate the comparative strength of department faculty.
RESPONSIBILITIES OF GRADUATE TEACHING ASSISTANTS Department of Electrical Engineering and Computer Science South Dakota State University Copyright 2011, South Dakota State University Introduction Graduate
NCTA DUAL CREDIT POLICY Purpose: The purpose of this policy is to describe how dual credit courses are conducted at NCTA, to describe how faculty are compensated for the development of dual credit courses
Columbus State Community College English Department Course and Number: ENGL 2367 Composition II CREDITS: 3 CLASS HOURS PER WEEK: 3 INSTRUCTOR: OFFICE PHONE: EMAIL: DEPARTMENT PHONE: 614/287-2531 or 614/287-2201
Technical Center For Career and Adult Education Updated Revisions Dates: 8/12/2016, 11/10/2016 Purpose: The plan is for determining the effectiveness of student personnel services. Responsible Party: The
Guide to Planning and Resource Allocation Developed, reviewed and adopted by the: Management Leadership Council March, 2007 Planning & Budget Committee March, 2007 Faculty Senate May, 2007 SECTION I INTRODUCTION
Plan of Organization for the School of Public Health 2011 Table of Contents PREAMBLE... 3 ARTICLE I MISSION... 3 ARTICLE II SHARED GOVERNANCE... 4 ARTICLE III SCHOOL ADMINISTRATION... 4 A. Administration...
Assessment METHODS What are assessment methods? Assessment methods are the strategies, techniques, tools and instruments for collecting information to determine the extent to which students demonstrate
Department: Psychology MSU Departmental Assessment Plan 2007-2009 Department Head: Richard A. Block Assessment Coordinator: Richard A. Block Degrees/Majors/Options Offered by Department B.S. in Psychology
Unit Plan for Assessing and Improving Student Learning in Degree Programs Unit: Psychology Unit Head approval: David E. Irwin Date: May 7, 2008 SECTION 1: PAST ASSESSMENT RESULTS Brief description of changes
SOUTHEASTERN UNIVERSITY AND COLLEGE COALITION FOR ENGINEERING EDUCATION 1999-2000 SUCCEED Faculty Survey of Teaching Practices and Perceptions of Institutional Attitudes Toward Teaching a Executive Summary
FORMAT 1 Submit original with signatures + 1 copy+ electronic copy to UAF Governance. See http://www.uaf.edu/uafgov/faculty/cdfor a complete description of the rules governing curriculum & course changes.
Guide to CSUSM Online Course Evaluations 1. Introduction Cal State San Marcos implemented Class Climate, a Scantron product, for the campus-wide student course evaluations in the summer of 2006. The majority
James Madison University Best Practices for Online Programs Updated December 2013 JMU Best Practices for Online Programs I. Introduction... 2 II. Institutional Context and Commitment... 2 III. Curriculum
University of Pennsylvania Handbook for Faculty and Academic Administrators 2016 Version VI.I. Procedures for the Evaluation and Certification of the English Fluency of Undergraduate Instructional Personnel
Information Technology Services 2015 Annual Report The 2015 Information Technology Services Annual Report Key Performance Indicators (KPIs) and Highlights of the 2014-2015 Academic Year Information Technology
Jerry and Vickie Moyes College of Education TENURE DOCUMENT Approved by Faculty Senate April 16, 2009 Introduction The purpose of this document is to outline the criteria and the procedures used to evaluate
2009 MASTER PLAN/PROGRESS REPORT Academic Program: School Counseling, M.Ed Person Responsible: Dr. Christine Anthony Date Submitted: May 22, 2009 Mission: The Master of Education in School Counseling endeavors
Rules of Organization and Bylaws Gladys A. Kelce College of Business Approved by the General Faculty December 11, 2012 PREAMBLE This document provides the framework within which the Faculty of the Gladys
Graduate Student Handbook of the Mathematics Department Department of Mathematics North Dakota State University October 7, 2015 1 1 General Information The Department of Mathematics offers graduate study
Achieving AACSB International s Initial Accreditation: The Montana State University Billings Experience - The College of Business Coordinator of Accreditation and Assessment s Perspective Barbara M. Wheeling
Chapter III Supporting Data: Collection & Presentation Since the documentation of teaching must be selective to avoid unnecessary bulk in a tenure file and to increase the efficiency of the review process
GUIDELINES FOR ACADEMIC PROGRAM REVIEW For self-studies due to the Office of the Provost on October 1, 2015 GRADUATE PROGRAMS OVERVIEW OF PROGRAM REVIEW At Illinois State University, primary responsibility
Master of Arts in Teaching/Science Education Master of Arts in Teaching/Mathematics Education Assessment F12-S13 FOR ACADEMIC YEAR: 2012-2013 PROGRAM: MAT/Science and Mathematics Education SCHOOL: NS&M
HEALTH INFORMATICS PROGRAM SCHOOL OF INFORMATICS Request for a New Graduate Minor To be offered at Indiana University-Purdue University Indianapolis Approved by the HI program faculty on October 17, 2012
WEB-BASED COURSE EVALUATION: COMPARING THE EXPERIENCE AT TWO UNIVERSITIES Jack McGourty 1, Kevin Scoles and Stephen Thorpe 2 Abstract This paper describes the implementation of online student evaluation
ONLINE COURSE EVALUATIONS Table of Contents Introduction Review of published research literature Benefits of online evaluations Administration costs Data quality Student accessibility Faculty influence
ASSESSMENT PLAN UNDERGRADUATE MAJOR IN POLITICAL SCIENCE Mission Statement The Department of Political Science aims to provide instruction that enables students completing the major to acquire a college-level
#AItraining MOVING TO A RESPONSIBILITY- CENTERED BUDGET MODEL CONSIDERATIONS FOR IMPLEMENTING RESPONSIBILITY-CENTER MANAGEMENT ON CAMPUS Larry Goldstein Campus Strategies, LLC Larry.Goldstein@Campus-Strategies.com
Master of Library Science Internship Manual Guidelines for LIBS 6991, 6992 Professional Internship Department of Information and Library Science Master of Library Science Program College of Education East
7-29-2013 Composition Studies Graduate Certificate Program Online Certificate Program in English Indiana University East Department of English Composition Studies Graduate Certificate Program Online Graduate
COURSE AND TEACHER SURVEYS (CATS) AT VILLANOVA UNIVERSITY A guide for new faculty members There are probably more studies of student ratings than of all of the other data used to evaluate college teaching
Approved by Academic Affairs May 2010 DEPARTMENT OF MANAGEMENT/HRM POLICY ON REAPPOINTMENT, TENURE, AND PROMOTION (RTP) BASED ON COLLEGE OF BUSINESS ADMINISTRATION POLICY ON RTP TABLE OF CONTENTS I. COLLEGE
STUDENT AFFAIRS FORUM Building a Comprehensive Social Media Strategy for Student Recruitment Custom Research Brief Research Associate Lisa Qing Research Manager Lisa Geraci October 2012 2 of 16 3 of 16
Best Practices for Increasing Online Teaching Evaluation Response Rates Denise T. Ogden, Penn State University Lehigh Valley James R. Doc Ogden, Kutztown University of Pennsylvania ABSTRACT Different delivery
Department of Special Education Initial Licensure and Added Endorsement the Special Education Generalist Overview and Assessment Plan Purpose for Program Change In the Fall of 2013 the Department of Special
Online Class* Development Guidelines Middlesex Community College March 11, 2015 I. Online Class Proposal: Submission and Review** The proposal to develop a new online course should start six months before
Online Course Evaluation Pilot 1 On-Line Course Evaluation Pilot Project at Marquette University: Spring 2008 The Marquette Online Course Evaluation System (MOCES) Development of the Evaluation Instrument
Position No: 298237 CARLETON UNIVERSITY POSITION DESCRIPTION Position Title: Executive Assistant to the Dean Title of Immediate Supervisor: Dean, Faculty of Public Affairs Department: Office of the Dean,
To: Longwood University Faculty Senate From: Ad-Hoc Committee on Student Evaluations (online vs in-class) Chris Bjornsen (Chair), Susan Lynch, Linda Lau, Mark Lenker, Jay Lynn, Chris Shumaker Date: April
A Case Study of the Effectiveness of a Hybrid course in Engineering Education Michael V. Gangone University of Texas at Tyler Department of Civil and Environmental Engineering email@example.com Higher
Introduction Provost Huenneke convened this committee to evaluate the trial use of the Student Evaluation of Teaching Effectiveness (SETE) instrument and the SmarterSurvey systems being used as part of
1. Computer Science Unit Definition The Computer Science offers a Bachelor of Science in Computer Science (BSCS), a degree program that has grown from about 20 majors in fall 2002 to about 97 majors today.