Pilot Study for Assessing the Viability of Using Online Course Evaluations at California State University, Sacramento

A Report from the Electronic Course Evaluation Task Force

Fall 2010
Sacramento State conducts the majority of its student teaching evaluations using paper and pencil. This practice has come under question. In particular, it is argued that collecting teaching evaluations in this manner is costly (Miller, 1987), demands large amounts of staff time to collect and process the forms (Kronholm, Wisher, Curnow, & Poker, 1999), delays feedback to the faculty (Layne, DeCristoforo, & McGinty, 1999), consumes large quantities of storage space (Dommeyer, Baum, Hanna, & Chapman, 2004), and is environmentally unfriendly (Anderson, Cain, & Bird, 2005). One alternative to conducting paper and pencil student teaching evaluations is to collect the evaluations online. To explore the implications and practical effects of shifting student evaluations from the traditional paper and pencil format to an online format at Sacramento State, an eleven-person Electronic Course Evaluation Task Force (ECETF) was formed, representing the university groups involved in the collection of teaching evaluations (see Table 1 for a full list of ECETF members). This document reviews the relevant literature, reports the findings of a pilot study, and concludes with recommendations regarding the use of online evaluations at Sacramento State.
Table 1. Electronic Course Evaluation Task Force Members

- Kimo Ah Yun: Director, Center for Teaching and Learning; Co-Chair, ECETF
- Jean-Pierre Bayard: AVP for Academic Affairs Technology Initiatives and Director, Academic Technology and Creative Services; Co-Chair, ECETF
- Wendy Cunningham: Faculty Policies Committee
- Doug Jackson: AVP for Information Resources and Technology
- Kathryn Kay (replaced in Summer 09 by Jesse Cuevas): ASI VP for Academic Affairs
- David Lang (replaced in Spring 10 by Brett Holland): Curriculum Policies Committee
- Raymond Pina: Manager, AIT, Academic Technology and Creative Services
- Ann Stoltz: Chair, Nursing
- Harry Theodorides: Faculty, Kinesiology and Health Science
- Jennifer Underhill: Student, Department of Sociology
- Jing Wang: Director, Office of Institutional Research

Literature Review

Benefits, Drawbacks, and Questions of Electronic Teacher Evaluations

The literature on shifting teaching evaluations from face-to-face to online delivery identifies the potential benefits and drawbacks of such an endeavor. Benefits include (a) timely student feedback to instructors, (b) reduced use of class time devoted to conducting evaluations, (c) increased time for students to complete evaluations, (d) a greater number of written comments from students, (e) lower costs associated with the process, and (f) increased opportunities for students to complete evaluations. In contrast, disadvantages include (a) lower student response rates, (b) the need for computer access to complete the evaluations, and (c) the risk that electronic evaluations will be less secure than paper evaluation forms (Anderson, Cain, & Bird, 2005; Lieberman, Bowers, & Moore, 2001). Given that student response rates tend to decrease when teaching evaluations are shifted from a face-to-face process to an online one, implementation research has focused on identifying which strategies maximize student response. Universities that value full student participation have created policies that require students to complete their teacher evaluations in order to retain access to their Learning Management System (LMS) or to register for classes. Blocking LMS access or class registration has been extremely successful at obtaining student compliance, but some of the collected data have been called into question, as some students may simply click through the evaluations without paying attention in order to regain access to the LMS or to register for classes. In contrast, other universities have taken a less rigid approach. For example, some universities simply make students aware that the teaching evaluations are online and require them to log into a central site to
complete their evaluations. Overall, research reveals that response rates drop by approximately twenty percent when these less rigid strategies are employed (Anderson, Cain, & Bird, 2005). Most college and university implementation strategies fall between the most and least strict models described above. It has been found that when students receive email messages with a link to their teaching evaluations, are reminded on several occasions by their teachers to complete the evaluation, are told by the faculty member that their feedback is valuable to the teaching process, and are reminded by email to complete the evaluation, the response rate nears paper and pencil rates. Further, when incentives such as slight increases in their grade or opportunities to win prizes in a raffle are offered for completing teaching evaluations, the response rate tends to be slightly higher than that of current paper and pencil formats (Dommeyer, Baum, & Hanna, 2002). In light of the current fiscal constraints Sacramento State faces, if online teaching evaluations are more cost effective while providing equally good or better data from students, then such a move should receive careful consideration. With respect to the costs of paper and pencil evaluations at Sacramento State, the current process requires printing thousands of pages of evaluation forms, purchasing scantron forms and pencils, staff hours to collect and send the data to Information Resources & Technology (IRT), and IRT resources to process, calculate, and mail teaching evaluation reports to each faculty member and department. To date, one study has attempted to quantify the cost savings of shifting from paper and pencil to online evaluations (Kronholm, Wisher, Curnow, & Poker, 1999). That study estimated that paper evaluations were approximately 30 times more costly than online evaluations.
Given these potential cost savings, moving from paper and pencil to online teaching evaluations could save Sacramento State a significant amount of money.
Based on a careful reading of the literature, information received from the Senate Executive Committee, the University Appointment, Retention, Tenure and Promotion (UARTP) Committee, and the Faculty Senate, and discussions within the ECETF, the following list of pros, cons, and issues deemed particularly relevant at Sacramento State was composed (see Table 2).

Table 2. Pros and cons of online course evaluations in comparison to traditional face-to-face course evaluations

Pros:
- Security of the submissions, with students being authenticated in the campus LMS (SacCT)
- Security of the data, given that departments get a copy of the electronic data while the original submission remains on the server
- Complete anonymity: no possibility of instructors recognizing students' handwriting
- Very quick access to the evaluation data, as soon as course grades are submitted and in time to affect the next semester's course design
- Cost reduction for departments and the University in the administration and processing of the evaluations
- Savings on storage space: no need to store paper evaluations
- Green: saves paper
- Flexibility: students can complete the evaluation at their leisure, and can participate even if they miss class

Cons:
- Lower return rate
- Students could complete the evaluations together, which may affect how they answer
- System technical reliability

Issues/Questions:
- Getting students to submit their evaluations (return rate) outside class time
- How will the data be analyzed, and who will analyze the data?
- The need for faculty buy-in
- Responding to student concerns about privacy and anonymity
- The need to change the culture to support online student evaluation processes
- Verification of submissions
- Having all students potentially submit an evaluation, including those who have not been attending class
- Faculty choosing when to administer the survey
This list was used to guide the design of the pilot study presented here in order to minimize potential problems.

Using Online Teaching Evaluations

The use of online teaching evaluations is spreading in colleges and universities throughout the country. Cited as being at the forefront of collecting teaching evaluations online, Drexel University shifted from paper and pencil to online course evaluations. Initial research on the impact of the move revealed that the response rate decreased. Follow-up research suggests, however, that colleges and universities need not suffer dramatic decreases in their response rates. For example, Columbia University, in its initial test of online teaching evaluations, achieved response rates of 85%, which match most universities' current paper and pencil response rates (Sax, Gilmartin, & Bryant, 2003). Although the Drexel and Columbia data provide a starting point for considering the impact of shifting teaching evaluations to an online format, they do not provide sufficient comparison groups between face-to-face and online teaching evaluations. Two studies help to illuminate the expected effect on completion rates of traditional versus online evaluations. In a study conducted at Georgia State University (N = 2,453) in which students were randomly assigned to either a traditional paper and pencil or an online evaluation condition, the response rate for face-to-face participants (60.6%) was higher than for those in the online format (47.8%). Most important, however, the average ratings of faculty members did not differ across the two mediums (Layne, DeCristoforo, & McGinty, 1999). This study also revealed an unanticipated effect: students in the online evaluation condition provided a substantially greater number of written comments about their educational experience than those in the paper and pencil condition.
Another representative study on the impact of shifting from a paper and pencil to an electronic format was conducted at California State University, Northridge (Dommeyer, Baum, & Hanna, 2002). In this study, 16 faculty members each had one of their sections evaluated in class and another evaluated online (N = 696). In the online evaluation condition, faculty members were also assigned to one of four treatment conditions: (1) a very modest grade incentive (one-quarter of a percent), (2) a demonstration by the teacher of how to complete the online evaluation, (3) the ability to receive early final grade feedback from the instructor, or (4) no information or incentive. Results revealed that the grade condition (87%) had the highest response rate, followed by demonstration (53%), feedback (51%), and the no information or incentive condition (28%). Overall, the online response rates differed from the paper and pencil format (75%). However, consistent with most research comparing teacher ratings collected using paper and pencil versus an online format, there were no differences in teacher rating scores between any of the online treatment groups and the paper and pencil control group. While a change from a paper and pencil format to an online one at Sacramento State deserves careful consideration, general data trends suggest that such a shift would be unlikely to yield detrimental effects on faculty members or the university. Within the CSU system, a number of individual faculty members and departments have already shifted to online evaluations without major difficulty. In fact, San Diego State has already shifted all of its teaching evaluations to an online format. On the Sacramento State campus, online evaluations are already in use. For example, the Division of Nursing shifted exclusively to online teacher evaluations several years ago.
Findings from this department reveal a drop in response rates, no differences with respect to individual teacher evaluation ratings, and an increase in student comments.
While Nursing is the only Sacramento State department to make full use of online evaluations, numerous faculty members in other university units, such as the College of Education, the College of Arts and Sciences, and the College of Business, currently use some form of online evaluations. In these units, the process continues to be used and no major issues have been noted. Although comparison data are not readily available for these units, anecdotal data suggest that such a system is viable for use at Sacramento State.

Pilot Study

The ECETF pilot study was designed using feedback from Human Resources, the Faculty Senate Executive Committee, the Faculty Senate, the UARTP Committee, CFA (state and local), and ECETF discussions. The design was such that:

1. No electronic evaluations used in this pilot study became part of any faculty member's file.
2. Representation of faculty from as many colleges as possible was sought.
3. Faculty participation was voluntary.
4. Faculty were not allowed to participate if their RTP document precluded the use of non-paper and pencil evaluations or mandated the inclusion of all course evaluations in their RTP file.
5. A sufficiently large sample was obtained to address issues of generalizability.

Sample and Methodology

Faculty were recruited through an email to department chairs asking for volunteer participants. This initial request yielded approximately 40 faculty volunteers. After the volunteer requirements were applied, some volunteers were deemed ineligible. To reach the targeted 40 faculty participants, additional recruitment was undertaken by ECETF members to ensure a representative cross section of participants. In the recruitment stage, a total of 30 classes
met the participation criteria, and electronic evaluation data were collected on these 30 classes. Unfortunately, some of the data for the 30 classes could not be used; for example, in some cases the department did not keep the hard copies of the evaluations for comparison purposes. The final count consisted of 25 courses with online evaluation and 25 courses with traditional evaluation. The 50 courses were paired by course content, were taught by the same instructors, and included about 1,600 student participants. Courses were distributed across the university as follows: Arts and Letters (n = 12), Business (n = 12), Education (n = 4), Health and Human Services (n = 6), Natural Sciences and Mathematics (n = 4), and Social Sciences and Interdisciplinary Studies (n = 12). No data were collected from Engineering and Computer Science faculty, as that College requires faculty to collect student teaching evaluations in all of their courses. All comparative analyses used paired-samples t-tests. According to the rule of thumb for mean comparison, the smallest group should not be less than one-third the size of the largest group. Therefore, the classes were divided into two groups: a large-class group, with 43 or more students in each class, and a small-class group, with fewer than 43 students in each class. One pair of classes was excluded from the group analyses because their sizes were mismatched (58 vs. 25). All incomplete evaluations (such as a student answering all but one of the questions) were included when calculating the mean score for each course. Incomplete evaluations made up a very small portion of each teaching evaluation, so they had little impact on the average score of individual courses.

Method of Deployment

This pilot study was built upon the experiences and practices of delivering electronic course evaluations at Sac State using the Flashlight survey software, which is used for many of the university's online courses.
Flashlight uses Secure Sockets Layer (SSL), the current commercial standard for protecting data by encryption over the internet. Access to the survey databases was limited to authorized persons by means of a username and password. The data center housing the survey database is physically secured, and data are backed up nightly should the need arise to restore information. Student anonymity is maintained because the system does not associate student identifiers with the raw data. The evaluation delivery process was as follows:

1. A list of faculty participants and selected courses was provided by the Task Force to ATCS.
2. Existing course evaluation formats were converted to an electronic format.
3. Respondent pools, including student email addresses, were created for each survey.
4. A template message (describing how the evaluation was to be delivered) was provided to each faculty member; it could be sent to each student by email or posted in SacCT.
5. Course evaluations were sent by email to each student, and the message included a unique URL. The evaluations were available for a two-week period.
6. Midway through the evaluation period, an email reminder was sent to each student who had not completed the evaluation.
7. The evaluation data were sent to the authorized department contact. Evaluation data were also provided to the Office of Institutional Research (OIR) for the purposes of this report. No evaluation data were provided to faculty before they submitted their final grades.

Results and Analyses

1. There was no significant difference between online and traditional evaluation among large classes in terms of response rates and mean scores.
2. The difference in response rates between online and traditional evaluation for the small classes was statistically significant. The mean scores of the two evaluation formats showed no significant difference among small classes.

3. Among the 25 paired courses, the mean scores of online evaluation were slightly lower than the mean scores of traditional evaluation, but the difference was not statistically significant. However, the response rate of online evaluation was significantly lower than the rate of traditional evaluation.

[Figure: Response rates of teaching evaluations, online vs. traditional, for large classes, small classes, and all classes; reported values: 79.5%, 72.7%, 62.1%, 55.4%, 50.3%, 45.7%.]

[Figure: Mean scores of teaching evaluations, online vs. traditional, for large classes, small classes, and all classes (0-5 scale).]

It is worth mentioning that the traditional evaluations were selected from different semesters, ranging from fall 1998 onward, whereas the online evaluations were all completed in spring 2010.
According to the results, the time difference did not affect the ratings that students gave their instructors. In conclusion, students' ratings for each of the paired courses were consistent regardless of whether online or traditional evaluation was used. However, the response rate of online evaluation was significantly lower than that of traditional evaluation for small classes as well as in the overall comparison. In general, a low response rate means the evaluation is less representative, and a larger margin of error must be taken into account when interpreting the mean score of each class.

Discussion

The results of this pilot study mirror findings from other universities. Shifting from paper and pencil to online evaluations reduced the overall percentage of students opting to complete the teacher evaluation, but the teacher rating scores were not statistically different between the two formats. While these data show differences in mean scores that are not statistically significant, the numerical differences may nonetheless be interpreted by RTP committees as indicative of a pattern of decreasing scores. Caution must therefore be used by faculty and RTP committees in the event that faculty rating scores change slightly. Finally, one might also ask whether the lower response rate affects the quality and quantity of the qualitative data collected through students' written responses. Universities that have studied the effect of shifting the medium of collecting teaching evaluations have found no negative impact on the comments. Although no statistical analysis of the qualitative data was undertaken, a review of the qualitative data collected in this study, together with informal discussions with faculty participants, suggests that the total number of student comments either remained the same or increased and that the comments reflected greater depth.
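The paired-samples comparisons reported above can be sketched in a few lines of code. This is a minimal illustration of the statistic used, not the pilot's actual analysis; the course mean scores below are hypothetical values, not data from the study.

```python
from statistics import mean, stdev

def paired_t(online, traditional):
    """Paired-samples t statistic for per-course mean evaluation scores.

    online and traditional are equal-length lists of course mean scores,
    paired by course (same content, same instructor).
    Returns (t, degrees of freedom).
    """
    diffs = [o - t for o, t in zip(online, traditional)]
    n = len(diffs)
    # t = mean(d) / (sd(d) / sqrt(n)), with n - 1 degrees of freedom
    t_stat = mean(diffs) / (stdev(diffs) / n ** 0.5)
    return t_stat, n - 1

# Hypothetical course means on the 0-5 evaluation scale
online_means = [4.1, 3.8, 4.5]
traditional_means = [4.2, 3.9, 4.4]
t, df = paired_t(online_means, traditional_means)
```

The resulting t statistic would then be compared against the t distribution with n - 1 degrees of freedom to decide whether the online and traditional mean scores differ significantly.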
Recommendations

This pilot study reveals that online evaluations yield results similar to those of traditional paper and pencil evaluations at Sacramento State. Given that online evaluations are substantially more cost effective, less labor intensive, and able to provide more timely feedback to faculty members, the following recommendation is proposed:

Recommendation 1: Faculty members should be given the option to conduct their teaching evaluations electronically with the approval of their department.

Because some departments may need to discuss their policies regarding the use of electronic evaluation, it is further recommended that departments consider how best to implement and allow electronic evaluations. As such, the following recommendation is proposed:

Recommendation 2: Academic Affairs should ask each of the Deans to communicate with individual departments so that policies regarding electronic evaluations are revisited to accommodate, where appropriate, the use of electronic teaching evaluations.

Academic Technology and Creative Services successfully served as the unit responsible for coordinating with faculty members to collect electronic evaluations for this pilot study. Given its experience working with faculty, both within and outside the pilot study, to collect electronic teaching evaluations, it seems reasonable to charge this unit with future oversight in this area. As such, the following recommendation is proposed:

Recommendation 3: Academic Technology and Creative Services should be charged with creating a formal campus procedure for collecting and distributing electronic course evaluations that is secure and expedient.
Recommendation 4: Academic Technology and Creative Services should assume all duties related to the collection of electronic course evaluations at Sacramento State.

Because the California Faculty Association plays a role in representing faculty, and because the use of online teaching evaluations has been a topic of interest to the Association, the following recommendation is made:

Recommendation 5: Human Resources and CFA should confer to find an acceptable approach to supporting the use of online teaching evaluations at Sacramento State.
References

Anderson, H. J., Cain, J., & Bird, E. (2005). Online student course evaluations: Review of literature and a pilot study. American Journal of Pharmaceutical Education, 69.

Dommeyer, C. J., Baum, P., & Hanna, R. W. (2002). College students' attitudes toward methods of collecting teaching evaluations: In-class versus on-line. The Journal of Education for Business.

Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment and Evaluation in Higher Education, 29.

Kronholm, E. A., Wisher, R. A., Curnow, C. K., & Poker, F. (1999). The transformation of a distance learning training enterprise to an internet base: From advertising to evaluation. Paper presented at the Northern Arizona University NAU/Web99 Conference, Flagstaff, AZ, September 1999.

Layne, B. H., DeCristoforo, J. R., & McGinty, D. (1999). Electronic versus traditional student ratings of instruction. Research in Higher Education, 40.

Lieberman, D. A., Bowers, N., & Moore, D. R. (2001). Use of electronic tools to enhance student evaluation feedback. In Techniques and strategies for interpreting student evaluations (New Directions for Teaching and Learning, no. 87). San Francisco: Jossey-Bass.

Miller, R. I. (1987). Evaluating faculty for tenure and promotion. San Francisco: Jossey-Bass.

Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44.
www.ncolr.org/jiol Volume 6, Number 3, Winter 2007 ISSN: 1541-4914 Online vs. Traditional Course Evaluation Formats: Student Perceptions Judy Donovan Indiana University Northwest Cynthia Mader and John
APPENDIX 2.A POLICIES CONCERNING ADJUNCT FACULTY The University recognizes the benefit both to the University and to students of instruction by adjunct faculty. The university also recognizes that it cannot
Self-Selection Bias In On-Line Accounting Courses: Student Characteristics Thomas T. Amlie, (E-mail: firstname.lastname@example.org), State University of New York Institute of Technology Abstract Over the past several
QM Research Grants FY11 The Development of Technological Pedagogical Content Knowledge (TPACK) in Instructors Using Quality Matters Training, Rubric, and Peer Collaboration: Cheryl Ward, Principle Investigator,
Senate Reference No. 09-22 To: IPFW Senate From: Cheryl Sorge, Chair Curriculum Review Subcommittee Date: March 26, 2010 Re: Proposal for the Certificate in Small Business Management The Curriculum Review
April 14, 2015 Page 1 of 10 PURPOSE: This document is intended to create and clarify policies related to a range of online teaching and learning issues at CI. BACKGROUND: The Special Committee for Online
Incorporating a Service-Learning Option in an Online Course Final Report for Summer 2012 CELT Summer Instructional Design Grant Adam D. Dircksen IPFW Department of Communication I. Introduction and Background
Designing Effective Online Course Development Programs: Key Characteristics for Far-Reaching Impact Emily Hixon, Ph.D. School of Education email@example.com Janet Buckenmeyer, Ph.D. School of Education
Program Personnel Standards Approval Form Disciplrne: Nursing ','J1* )lplll RTP Committeehair Date i Introduction Relationship of Discipline Standards to CSU Channel Islands Program Standards To understand
Demand vs. Hiring Attitudes for Online Doctoral Technical Education Jim Flowers Professor & Director of Online Education Ball State University Introduction Prior to going online with two master s degrees
IMACST: VOLUME 3 NUMBER 1 FEBRUARY 212 61 Technological Tools to Learn and Teach Mathematics and Statistics Abstract: Mujo Mesanovic American University of Sharjah, firstname.lastname@example.org The blended learning
1 of 7 Distance Technology: A National Study of Graduate Higher Education Programs Bonita K. Butner, Ph.D. Assistant Professor, Higher Education Program Texas Tech University P. O. Box 41071 Lubbock, TX
REVIEWS American Journal of Pharmaceutical Education 2005; 69 (1) Article 5. Online Student Course Evaluations: Review of Literature and a Pilot Study Heidi M. Anderson, PhD, Jeff Cain, MS, and Eleanora
Paper presented at the annual conference of the American Educational Research Association New Orleans, 2002 Trav Johnson, Brigham Young University Background Online Student Ratings: Will Students Respond?
PSYCH 7020 A 20280 Conditions of Learning 3 Semester Hours, Spring, 2014 Dewar College of Education Valdosta State University Department of Psychology and Counseling Conceptual Framework: Guiding Principles
Columbus State University Credit Hour Policy and Procedures Definition of Credit Hours The institution has policies and procedures for determining the credit hours awarded for courses and programs that
Approved: 2/23/099 Department of History Policy 1.1 Faculty Evaluation Evaluation Procedures 1. The Department of History will evaluate all tenured and non-tenure faculty by March 1 of each academic year
EVALUATION OF DEPARTMENT CHAIR Background This document is developed based on the Office of Academic Affairs Memorandum No. 05-3 (OAAM 05-3) document. In that document, the department chair s responsibilities
For more resources click here -> Online Course Delivery at 50 Accredited Institutions: The Critical Issues Robert M. Colley Associate Dean, Continuing Education Syracuse University Shelly Blowers Graduate
Online Courses: An analysis of student satisfaction Franza, T., Williams, D., Morote, E. & Cunningham, J Purpose of Study Abstract: The purpose of this study was to determine if existing student or instructor
Proceedings of the 2001 Annual ASEE Conference, ASEE, June 2001. Abstract A COMPARISON OF ELECTRONIC SURVEYING BY E-MAIL AND WEB Catherine E. Brawner, Richard M. Felder, Rodney H. Allen, Rebecca Brent,
1 SUPPORTING FIRST-YEAR TRANSITIONS TEMPLE UNIVERSITY Jodi Levine Laufgraben, Michele L. O Connor, and Jermaine Williams Learning communities at Temple University began in 1993 with about two hundred students.
Project Proposal Type Systemic Project Division of Academic Affairs Technology Fee Project Proposal 2014 Proposal Deadline: Tuesday, January 21, 2014 Projects proposed by operational units of the university
UF ONLINE ANNUAL REPORT 2013-14 Submitted July 21, 2014 by the Advisory Board for UF Online 1 Executive Summary Under Florida law, the Advisory Board for the institute for online learning, which was subsequently
Chapter 5 Market Research and Business Intelligence Lessons from Vignette 5 Advertising networks track consumers online Tracking can: reveal consumer preferences lead to better pricing decisions enable
Online Course Evaluation Literature Review and Findings Prepared by: Jessica Wode Jonathan Keiser Academic Affairs Columbia College Chicago Spring 2011 TABLE OF CONTENTS Summary of Scholarly Research on
Mission / Purpose Collaborate and support. Division of Online Programs Detailed Assessment Report 2013-2014 I. Goals and Outcomes/Objectives, with Any Related Measures, Targets, Findings, and Action Plans
ISSUED: REV. A 1/10/00 12/15/04 PAGE 1 OF 6 PURPOSE The purpose of this policy is to describe the principles and processes designed to ensure quality in distance education at Creighton University and to
WEB-BASED COURSE EVALUATION: COMPARING THE EXPERIENCE AT TWO UNIVERSITIES Jack McGourty 1, Kevin Scoles and Stephen Thorpe 2 Abstract This paper describes the implementation of online student evaluation
Annual Goals for Math & Computer Science 2010-2011 Gather student learning outcomes assessment data for the computer science major and begin to consider the implications of these data Goal - Gather student
MOTION Motion to Support Creation of Undergraduate Certificate in Web Development (Submitted by the Curriculum Committee) Motion: The Faculty Senate approves the proposal to create an Undergraduate Certificate
DISTANCE LEARNING COMPENSATION IN CSU BUSINESS SCHOOLS 19 Distance Learning Compensation in CSU Business Schools Robert L. Hurt Joumana McGowan Accounting University Community Educators confront many issues
Student Feedback: Improving the Quality through Timing, Frequency, and Format Travis Habhab and Eric Jamison This paper was completed and submitted in partial fulfillment of the Master Teacher Program,
Missouri Baptist University Center for Distance Learning Policies and Procedures Manual MBU Center for Distance Learning Vision, Mission and Goals Through technologically-enhanced teaching-learning opportunities,
International Journal of Nursing December 2014, Vol. 1, No. 2, pp. 39-47 ISSN 2373-7662 (Print) 2373-7670 (Online) Copyright The Author(s). 2014. All Rights Reserved. Published by American Research Institute