WVU STUDENT EVALUATION OF INSTRUCTION REPORT OF RESULTS INTERPRETIVE GUIDE



Similar documents
Student Response to Instruction (SRTI)

VANDERBILT UNIVERSITY DEPARTMENT OF POLITICAL SCIENCE GRADUATE PROGRAM REQUIREMENTS Effective January 2014

CRITICAL THINKING ASSESSMENT

Department of History Policy 1.1. Faculty Evaluation. Evaluation Procedures

MAP Reports. Teacher Report (by Goal Descriptors) Displays teachers class data for current testing term sorted by RIT score.

COURSE AND TEACHER SURVEYS (CATS) AT VILLANOVA UNIVERSITY

IDEA. Student Ratings of Instruction Survey Report Fall Guam Community College

AN INTEGRATED APPROACH TO TEACHING SPREADSHEET SKILLS. Mindell Reiss Nitkin, Simmons College. Abstract

University of Nevada, Reno, Mechanical Engineering Department ABET Program Outcome and Assessment

Policies for Evaluating Faculty: Recommendations for Incorporating Student and Peer Reviews in the Faculty Evaluation Process DRAFT

CRIMINAL JUSTICE PROGRAM OUTCOMES FALL 2011 TO SPRING 2012

How U.S. News Calculated the 2015 Best Colleges Rankings

Boise State University Department of Construction Management Quality Assessment Report and Action Plan

Assessment Findings and Curricular Improvements Department of Psychology Undergraduate Program. Assessment Measures

Annual Assessment Summary B.S. in Psychology

Learning Outcomes Assessment for Building Construction Management

University Mission School Mission Department Mission Degree Program Mission

INDIANA UNIVERSITY EAST-EARLHAM COLLEGE COOPERATIVE AGREEMENT FOR THE BACHELORS DEGREE IN ELEMENTARY OR SECONDARY EDUCATION (Under Review)

ACCOUNTING Ph.D. PROGRAM REQUIREMENTS KRANNERT GRADUATE SCHOOL OF MANAGEMENT (revised July 2014)

Errors in Operational Spreadsheets: A Review of the State of the Art

SUNY Adirondack

Trinity Christian College Palos Heights, Illinois

Getting prepared: A report on recent high school graduates who took developmental/remedial courses

Course Evaluations at the Kelley School of Business

DEPARTMENT OF POLITICAL SCIENCE MICHIGAN STATE UNIVERSITY

For further support information, refer to the Help Resources appendix. To comment on the documentation, send an e-mail to

Assessment Processes. Department of Electrical and Computer Engineering. Fall 2014

Measuring the response of students to assessment: the Assessment Experience Questionnaire

Information Technology Services will be updating the mark sense test scoring hardware and software on Monday, May 18, We will continue to score

Wichita State University School Psychologist, Narrative Describing the Operation of the Program Assessment Plan Approved

Rules governing masters studies at the Reykjavík University School of Law

MPA Program Assessment Report Summer 2015

Student Ratings of College Teaching Page 1

Data Analysis, Statistics, and Probability

WHEELOCK COLLEGE FACULTY DEVELOPMENT AND EVALUATION PROGRAM

The Influence of Session Length On Student Success

feature fill the two to three million K 12 teaching recruits have? These are the questions confronting policymakers as a generation

ACT Research Explains New ACT Test Writing Scores and Their Relationship to Other Test Scores

DIVISION OF BUSINESS AND COMPUTER INFORMATION SYSTEMS

Windows-Based Meta-Analysis Software. Package. Version 2.0

PROCEDURES AND EVALUATIVE GUIDELINES FOR PROMOTION OF TERM FACULTY DEPARTMENT OF PSYCHOLOGY VIRGINIA COMMONWEALTH UNIVERSITY MARCH 31, 2014

Department of Accounting, Finance, & Economics

GRANDVIEW INDEPENDENT SCHOOL DISTRICT ENGLISH AS A SECOND LANGUAGE POLICY AND PROCEDURES

How To Determine Teaching Effectiveness

APPROVED RULES OF THE COLLEGE OF ENGINEERING AND APPLIED SCIENCE UNIVERSITY OF COLORADO BOULDER

Student Assessment That Boosts Learning

PROPOSAL TO IMPLEMENT A NEW ACADEMIC PROGRAM (Major, Minor, Master s, Dual Degree, or Certificate)

Assessing Quantitative Reasoning in GE (Owens, Ladwig, and Mills)

ASSESSMENT REPORT CMDS Master s I. CMDS Master s Degree (MS/MCD) student performance on the National Examination in Speech Language Pathology

Banner Training Manual: Department Chairs

International Baccalaureate (IB) Programme Update

Department of Business Administration, Management, & Marketing

Northeastern Illinois University Chicago, Illinois

The Mystery of Good Teaching by DAN GOLDHABER

A spreadsheet approach for monitoring faculty intellectual contributions

The Undergraduate Study and Examinations Regulations and the KFUPM Rules for Their Implementations

ANNUAL PROGRAM LEARNING OUTCOMES ASSESSMENT SUMMARY REPORT

DUAL CREDIT PROGRAM ARTICULATION AGREEMENT

Elmhurst College Elmhurst, Illinois

All students are admitted during the summer and begin their coursework in the fall. Students must commit to completing these courses in sequence.

They are reviewed annually and revised or revalidated for the upcoming academic year.

Fundamentals of Economics Courses: Fun Course or a Positive Learning Experience?

Annual Goals for Math & Computer Science

Test Scoring And Course Evaluation Service

Interpretive Guide for the Achievement Levels Report (2003 Revision) ITBS/ITED Testing Program

Interpreting Reports from the Iowa Assessments

Department of Political Science. College of Social Science. Undergraduate Bachelor s Degree in Political Science

Determine whether the data are qualitative or quantitative. 8) the colors of automobiles on a used car lot Answer: qualitative

Drawing Inferences about Instructors: The Inter-Class Reliability of Student Ratings of Instruction

The test uses age norms (national) and grade norms (national) to calculate scores and compare students of the same age or grade.

How To Pass A Queens College Course

GAO SCHOOL FINANCE. Per-Pupil Spending Differences between Selected Inner City and Suburban Schools Varied by Metropolitan Area

Adjunct Faculty Orientation and Professional Development Custom Research Brief

Bryan College of Health Sciences School of Nurse Anesthesia. Plan for Assessment of Student Learning

Transcription:

The Interpretive Guide is intended to assist instructors in reading and understanding the Student Evaluation of Instruction (SEI) Report of Results. The SEI Report of Results is a computer printout that summarizes students' responses to the standard items and to the items selected by instructors for instructor and course evaluation. The report and this interpretive guide are intended to help diagnose teaching strengths and weaknesses and to document student perceptions, which may also be reviewed during faculty evaluation procedures.

LIMITATIONS ON USE FOR FACULTY EVALUATION: General approval of student ratings as a measure of teacher effectiveness comes with recognition of their limitations. The first and most often mentioned limitation is that the perceptions of individual students are reduced to statistical, averaged reports. These are actually very complex perceptions reported in numerical form, and because they are presented as numerical data, they are easily misinterpreted as precise and objective measurements.

A second limitation is that SEI reports alone give an incomplete picture of teaching. They should be used in conjunction with other methods of assessment (such as peer review of course materials, exit interviews with seniors, alumni follow-ups, and administrative judgments) to balance out the weaknesses and uncertainties that arise when only one method is used to evaluate teaching.

A third limitation falls in the area of systematic use. If ratings are collected only sporadically, or repeatedly for only a single course, adequate information may not be available to make an evaluation decision from SEI rating data. For evaluation purposes, some researchers recommend that ratings from at least five representative classes taught over the past one or two years be presented in faculty evaluation.
At least two-thirds of the students in each class should participate in the evaluation, and the number of participating students should be at least 15. A fourth limitation is that student perception of faculty performance involves many variables over which faculty may have little or no control. Finally, the evidence relating student ratings to actual student learning is inconsistent. Over the past decade there has been some indication that "global" ratings of teacher effectiveness and course value correlate more highly with student learning than do ratings of specific instructional-style items (e.g., student-teacher interaction). Therefore, global item ratings may be more suitable for evaluative review than specific style item ratings. Despite these limitations, student ratings have been shown to be a useful feedback tool for instructors and administrators when the instrument is designed flexibly enough to elicit appropriate and valid information.
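As a rough illustration, the participation thresholds recommended above can be expressed as a simple check. This is a hypothetical helper sketched for this guide, not part of the SEI system; the function name and the (enrolled, participated) input format are assumptions.

```python
def ratings_usable_for_evaluation(classes):
    """Check whether a set of SEI ratings meets the thresholds some
    researchers recommend for evaluative use: at least five
    representative classes, each with at least two-thirds of enrolled
    students participating and at least 15 participants.

    `classes` is a list of (enrolled, participated) pairs.
    (Hypothetical helper for illustration only.)
    """
    if len(classes) < 5:
        return False
    return all(
        participated >= 15 and participated >= (2 / 3) * enrolled
        for enrolled, participated in classes
    )
```

For example, five classes of 30 students with 25 participants each would meet all three thresholds, while the same five classes with only 14 participants each would not.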

READING SEI REPORTS

DEMOGRAPHIC INFORMATION: Part of the report contains a summary of student demographic features, which can help you understand the class composition and any biases that these characteristics may introduce into the results. This information is listed as the frequency of response to each item category, including omissions.

SEI REPORT OF RESULTS: EVALUATIVE ITEMS The main body of the SEI Report consists of a listing of the standard set of 13 items (items 1-6 and 18-24), five college-selected items (items 7-11), five instructor-chosen items (items 12-16), and one free-form question composed by the instructor (item 17). Each item is followed by a summary of student responses consisting of: (1) response category frequencies, (2) a mean response, (3) a standard deviation, and (4) comparative information by University and by College. (The comparative information is not included for the instructor-composed item, item 17.) The number of students who chose each response category and the number of omissions are reported for each item under each response scale value (Always, Frequently, Usually, Seldom, Rarely for items 1-17; Excellent, Good, Satisfactory, Fair, Poor for items 18-24). Instructors should pay close attention to items for which ratings fall below expectations; it is usually cause for concern when a third of the responses give relatively low ratings to some aspect of the course or instruction. In addition, interpret cautiously any item for which the omission count is greater than 10% of the students participating in the evaluation.

The mean response for each item is calculated by using the values (1-5) of the response categories (Rarely to Always, or Poor to Excellent) to assign values to student responses, totaling the values chosen by students, and dividing this total by the number of students responding to that item. Students who reply "Not Applicable" (NA) or omit an item are not included in that item's analysis.
The standard deviation is likewise computed from the students' responses; data for students who omit an item are not included in that item's analysis. The standard deviation indicates the average spread of student responses about the mean. If the standard deviation is greater than 1.20, be cautious in interpreting the item mean: a spread this large may indicate either a heterogeneity of student backgrounds and interests, or that the instructor attended to only a proportionally small group of students within the class.

Along with the above information, comparative statistics are derived for each item (excluding the instructor-composed item 17) based on two groups: (1) all University classes using the instrument and (2) the classes using the instrument within each college. The comparative statistics are included to inform instructors of their standing on each item relative to all others.

NOTE: If you choose to submit your ratings for annual review or promotion and tenure, and you feel that your ratings have been affected by variables beyond your control, you are encouraged to attach to your report a statement citing those circumstances.
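The mean and standard deviation described above can be sketched as follows. This is an illustrative reconstruction, not the actual SEI scoring program; in particular, the guide does not state whether the population or sample form of the standard deviation is used, so the population form is assumed here.

```python
def item_summary(responses):
    """Summarize one SEI item. Responses are category values 1-5;
    'NA' and omitted (None) responses are excluded, as the guide
    describes. Returns (mean, standard deviation, caution flag),
    where the flag marks a spread greater than 1.20.
    """
    scores = [r for r in responses if isinstance(r, int)]
    n = len(scores)
    mean = sum(scores) / n
    # Population standard deviation about the mean (assumed form).
    sd = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    return mean, sd, sd > 1.20
```

For the responses 5, 5, 4, 4 plus one NA and one omit, the mean is 4.5 and the standard deviation 0.5, well under the 1.20 caution threshold.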

INTERPRETATION OF THE DATA The SEI report is composed of several sections, each presenting a different kind of information. These sections are: Identification, giving the instructor, course, term, and student response count; Demographics, describing characteristics of the members of the class; and the evaluation summary data, containing the list of evaluation items, the student responses to those items (frequency counts, means, and standard deviations), and norms expressing the evaluation of instruction as percentiles relative to other instructors or courses in the University. See the attached sample SEI report.

Section 1 - Identification The top section of the report is for identification. On the left are the instructor's name and the CRN for the course. The instructor's name is displayed just as it was entered when the evaluation forms were ordered on the on-line SEI order system. The CRN is displayed just below the instructor's name and has two parts: the first five digits are the CRN as assigned by the Office of Admissions and Records, and the last five digits are a special code assigned by the on-line SEI order system when the order is placed. In general, only the first two digits of this special code are used; they represent an instructor code. Most of the time this will be 01, but three situations result in something other than 01:

Multiple instructors in a class - Some courses are taught by more than one instructor but are still listed under a single CRN. In such courses, the instructors each order their evaluation forms using the same CRN. The first instructor to do so receives an instructor code of 01, the second 02, and so on.

Wrong CRN - It is possible for an instructor to order SEI forms using the wrong CRN, one that is the correct CRN for another course. The first instructor to order under that CRN gets an instructor code of 01; when another instructor tries to use the same CRN, the system assigns an instructor code of 02 to that order. Evaluation data and reports are tracked by the combination of the CRN and the instructor code, so all instructors using the same CRN (even if in error) may proceed with the administration of their evaluations, since each set is uniquely identified.

Change of heart - If an instructor places an order for evaluation forms and then changes his or her mind about the composition of the questions, or the forms fail to show up in campus mail, he or she may simply get back on-line and reorder. In this situation the second order will have an instructor code of 02. In the case of lost forms, the instructor must be careful to use only one set of the forms, since each set is treated by the system as a unique class.

The Course: label at the top center of the report is the identifier entered by the instructor or his or her designee when the student response forms were ordered. At the far right of the identification section is an indicator for the academic term the report covers (for example, Spring 2002). Reports for Summer session I and Summer session II courses are labeled Summer. Below the term indicator is the number of students' responses to the evaluation. Response forms with no responses on them are not included in the analyses that generated this report.

Section 2 - Demographics The second section of the report (below the first horizontal bar) is referred to as the Demographics section, in that it describes some characteristics of the students in your class. For example, the leftmost data displayed in this section describe the students' responses to whether the course was a liberal studies requirement, a major requirement, a major elective, or a general elective. The questions displayed in this section correspond to questions 25 through 30 on the student response form. The numbers displayed for each item are simple frequency counts of the students selecting each response to the corresponding question. For example, under the item labeled Course is?, the number to the right of the row labeled Major represents the number of students indicating that the course is a requirement for their major. Likewise, the next pair of columns to the right indicates the number of students in the class who are freshmen, sophomores, and so on.

Sections 3 and 4 Sections 3 and 4 are the bottom two sections of the report page and contain the data from the student responses to the evaluation items. The two sections are identical in structure, with section 3 corresponding to questions 1 through 17 of the SEI student response form and section 4 corresponding to questions 18 through 24. They are separated because questions 1 through 17 are scaled Rarely, Seldom, Usually, Frequently, and Always, while questions 18 through 24 are scaled Poor, Fair, Satisfactory, Good, and Excellent.

Section layout - These sections are laid out in four clusters of columns:

Item text - The leftmost column of the evaluation summary contains the text of the evaluation response items.
Items 1 through 6 are standard University items: they are used for every student in every course and appear on every student response form. Items 7 through 11 were selected by the Dean of your college (or his or her designee); these items appeared on the student response forms and summary reports for all courses in your college. Since other Deans may have selected the same items, they may appear on the forms and reports of other colleges as well. Items 12 through 16 were selected by you and may have appeared uniquely on your course's student forms and your report. However, since these items were available to other instructors, some or all of the items you selected may appear on the evaluations of other instructors' courses as well. Items selected by a Dean were removed from the item selection options for instructors in the same college, so it is not possible for the same item to appear twice on the same report, once under the Dean's item section and once under the instructor's item section. However, item selection by a specific Dean did not affect the item selection options for instructors in other colleges, so it is possible for a specific item to be used by the Dean in one college and by a faculty member in another college.

Note also that the SEI order process allowed instructors to select up to five items and, optionally, to enter a question of their own composition. The key term here is "up to": instructors or Deans could choose not to select any items, and an instructor may have opted to omit the fill-in question. In such cases the corresponding items are left blank on the report. In the sample report, items 16 (the fifth instructor question) and 17 (the fill-in question) were omitted.

Response Frequencies - The data displayed in this cluster of columns represent the students' ratings for the evaluation item listed in the same row. The numbers displayed here are simple frequency counts of the students in the class who marked that specific rating for the item. For example, if the number 10 appears under the column labeled Always for question 1, then 10 students marked Always in response to item 1. NA (Not Applicable) is a response option on the student form, so the number appearing in that column indicates the number of students who marked that option for the item. If a student did not mark any response to an item, it is counted as an Omit, and the frequency of such non-responses is printed in the last column of the Response Frequencies section.

Class Mean and Standard Deviation - The third column group displays the class mean and standard deviation for each item. The item responses were weighted on a five-point scale, with Rarely equal to 1 and Always equal to 5 for items 1 through 17, and Poor equal to 1 and Excellent equal to 5 for items 18 through 24. Using these numeric weights, simple arithmetic means and standard deviations were calculated; they are displayed in the two-column group labeled CLASS.

Norm - The class mean for each item (as displayed in the Class Mean column) is used to calculate norms for the item.
The norms express the students' evaluation of the course or instructor relative to other courses or instructors, represented as a percentile. The percentile score indicates, roughly speaking, the percentage of other course evaluations whose means for the same item were below this mean score. So, for example, if the mean for item 1 is 4.91 and the percentile score, using the University data as a base, is 84, then roughly 84% of courses across the University had a mean lower than 4.91 on that item.

Unv and Col norms - The norms were calculated at two levels. The score listed under the column labeled Unv was derived by comparing the mean for this item for this course with the means for the same item for all courses in the University. The score listed under the column labeled Col was derived by comparing this item mean with the means for the same item within the same college. Following the example, University (Unv) and College (Col) norm scores of 84 and 66, respectively, suggest that about 84 percent of courses across the University were rated lower on this item than this course, but only about 66 percent of courses in the same college were rated lower on the same item. An asterisk (*) in any position of the Norms column indicates that there was insufficient data to calculate norms for the corresponding item. Here, insufficient data was defined as fewer than 10 response means for that particular item.
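The percentile norm described above can be sketched as follows. This is an illustrative reconstruction; the actual SEI software may round or handle ties differently.

```python
def norm_percentile(class_mean, comparison_means):
    """Express a class mean for an item as a percentile: roughly, the
    percentage of comparison course means for the same item that fall
    below it. Returns None (printed as '*' on the report) when fewer
    than 10 comparison means are available.
    """
    if len(comparison_means) < 10:
        return None  # insufficient data: the report shows an asterisk
    below = sum(1 for m in comparison_means if m < class_mean)
    return round(100 * below / len(comparison_means))
```

Computed against the means of all University courses, the result corresponds to the Unv column; computed against the courses in the same college, to the Col column.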

Important note: The same item could appear at different locations on different instructors' evaluation forms. Suppose, for example, that two instructors selected "Criteria for grading were clearly stated." Depending on their other selections, that item could appear as item 12 on one instructor's form but as item 15 on another instructor's instrument. For purposes of norm calculation, responses to the same item, as stated, were grouped together rather than grouped by item number. In other words, mean student responses to "Criteria for grading were clearly stated" were compared with responses to the same item on other instructors' evaluations, rather than calculating norms by comparing item numbers (item 15 compared to item 15).
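The grouping described in this note can be sketched as follows. This is an illustrative sketch only; the (item text, class mean) input format is an assumption, not the SEI system's actual data layout.

```python
from collections import defaultdict

def pool_means_by_item_text(course_item_means):
    """Pool class means by the item's wording rather than its position
    number, so that 'Criteria for grading were clearly stated' is
    compared across every course that selected it, wherever it
    appeared on each form. Input is a list of (item text, class mean)
    pairs drawn from many courses.
    """
    pooled = defaultdict(list)
    for item_text, class_mean in course_item_means:
        pooled[item_text].append(class_mean)
    return dict(pooled)
```

Each pooled list is then the comparison group against which an individual class mean is expressed as a percentile.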