Constructive student feedback: Online vs. traditional course evaluations. Judy Donovan, Ed.D. Indiana University Northwest
Volume 5, Number 3, Winter 2006 ISSN:

Constructive student feedback: Online vs. traditional course evaluations

Judy Donovan, Ed.D., Indiana University Northwest
Cynthia E. Mader, Ph.D., Grand Valley State University
John Shinsky, Ph.D., Grand Valley State University

Abstract

Substantial efforts have been made recently to compare the effectiveness of traditional course formats to alternative formats (most often, online delivery compared to traditional on-site delivery). This study examines not the delivery format but the evaluation format, comparing traditional paper-and-pencil methods for course evaluation with electronic methods. Eleven instructors took part in the study. Each instructor taught two sections of the same course; at the end, one section received an online course evaluation and the other a traditional pencil-and-paper evaluation. Enrollment in these 22 sections was 519 students. Researchers analyzed open-ended comments as well as quantitative rankings for the course evaluations. They found no significant differences in numerical rankings between the two evaluation formats. However, differences were found in the number and length of comments, the ratio of positive to negative comments, and the ratio of formative to summative comments. Students completing faculty evaluations online wrote more comments, and the comments were more often formative in nature (a formative comment being one that gave specific reasons for judgment, so that the instructor knew what the student was suggesting be kept or changed).

Introduction

Faculty course evaluations have the potential to affect professional and career advancement, promotion, and tenure. Few areas in higher education cause more anxiety than course evaluations, and few areas have been studied more for validity and reliability (Wachtel, 1998).
At a time when online course methods such as teaching, testing, grading, and discussion are no longer a novelty, online course evaluations offer the advantage of saving time and resources over the traditional paper-and-pencil scan sheet method. Thus, instructors may be encouraged or even required to use them in place of the more cumbersome paper method. Regardless of convenience or efficiency, however, instructor resistance to online evaluations has been well documented, especially perceptions of lower return rates and a higher percentage of negative responses. The first purpose of this study was to verify the results of existing research in those two areas. The second purpose was to expand on existing research in the area of open-ended responses: despite the attention paid to them by faculty and their peers, open-ended student comments have received little research attention. These, then, were our objectives:
1. To verify research on return rate and quantitative rankings.
2. To investigate open-ended responses, based on an analysis of these factors:
   a. Extent of responses (number, length, proportion of respondents)
   b. Nature of responses (positive or negative)
   c. Usefulness of responses in improving instruction (summative or formative)

The second objective, investigating open-ended student comments, was the one that most intrigued the researchers and may offer the most significant and interesting findings.

Perspectives from the Literature

Previous research indicates both benefits and limitations inherent in online course evaluations. The benefits include time and cost savings, faster reporting of results, and possible improvements in the quantity and quality of student comments (Kuhtman, 2004). In addition, online evaluations are less subject to faculty influence and allow students to take as much time as they wish, at whatever time they wish, to complete the evaluation (Anderson, Cain, & Bird, 2005). One study reported that students preferred completing electronic instructor evaluations to paper ones (Layne, DeCristoforo, & McGinty, 1999). A focus group in another study reported that the online tool was easy to use, that students liked the anonymity of the online evaluation, and that it allowed them to offer more thoughtful remarks than did the traditional, in-class, print-based teaching evaluation (Ravelli, 2002). In another study, over 90% of students marked Agree or Strongly Agree when asked if they preferred the online to the traditional evaluation format (Anderson et al., 2005). On the other hand, online evaluations may have disadvantages, such as requiring students to have computer access (Anderson et al., 2005). Also documented is faculty resistance when moving from traditional to online evaluations (Sorenson & Reiner, in Sorenson & Johnson, 2003).
Some research found that faculty prefer traditional evaluations because they believe traditional methods produce a higher rate of return and more accurate responses (Dommeyer, Baum, Chapman, & Hanna, 2002). Some faculty who are not proficient with computers or knowledgeable about online surveys also believe that online evaluations are less accurate than traditional ones (Anderson et al., 2005). Other concerns voiced by faculty who object to online evaluations include the beliefs that quantitative scores are lower, that negative comments are more frequent, that student return rates are lower, and that while students voice more dissatisfaction with less favored instructors, they are not as motivated to express satisfaction with more favored instructors (Ravelli, 2000).

Return Rate. Some studies verify concerns about return rates. Lower response rates have been linked with online evaluations (Dommeyer et al., 2004; Layne et al., 1999; Sorenson & Johnson, 2003). Response rates may be improved by the use of course management systems such as Blackboard, which have been found to increase participation (Oliver & Sautter, 2005).

Quantitative Rankings. Little difference has been reported in quantitative scores: online evaluations produce essentially the same scores as traditionally delivered evaluations (Dommeyer et al., 2004; Layne et al., 1999; Sorenson & Johnson, 2003). A recent study concluded that although online response rates were lower, mean scores were the same as those derived from traditionally delivered evaluations (Avery, Bryant, Mathios, Kang, & Bell, 2006). It also found that smaller classes tended to have higher survey response rates and that online response rates increased over time. Johnson (2002) noted that response rates increased yearly from 40% to 51%, 62%, and finally 71% in the final year of a study. Carini, Hayek, Kuh, Kennedy, and Ouimet (2003), however, found that students who completed the web-based format of a survey responded more favorably toward faculty than students completing the paper format.

Extent of Open-Ended Responses. Online open-ended responses have been studied for potential differences in length compared to traditional evaluation responses. Researchers report that student open-ended responses are lengthier in the online evaluation format and that more comments are made. Kasiar, Schroeder, and Holstaad (2001) reported that "the total number of words typed per student using the online system was more than 7 times that of the student using the traditional system." Another study, of a graduate management class, found that students wrote an average of four times as many words online (62 vs. 15 words per student) as they did using paper-and-pencil evaluation formats (Hmieleski & Champagne, 2000, p. 5).

Nature and Usefulness of Open-Ended Responses. The researchers found no prior work analyzing the content and substance of open-ended responses. Because these are the very responses that often draw the most attention from faculty and others during the evaluation process, this study hoped to bridge that apparent gap in the literature.

Methods and Techniques

The study was conducted at a large Midwestern public university and included 22 sections of graduate and undergraduate Education courses. A total of 30 instructors were identified who fit the study's primary criterion: teaching two sections of the same course during the same semester. Of these 30 instructors, 11 were willing to administer the online evaluation to one section and the traditional paper evaluation to the other.
Those who were unwilling offered reasons such as the following: teaching the course for the first time and being unsure about varying the evaluation format; belief that they would receive more negative evaluations online; uncertainty about how to administer online evaluations; and belief that the return rate would be lower.

At the end of the course, one section of each instructor's classes filled out the course evaluation in class in the traditional manner, while the instructor's other section of the same course completed the evaluation online in electronic format. Items and questions were identical in each format. The department's office assistants then paired and coded the completed evaluations, removed identifying course and instructor information, and provided photocopied sets to the researchers.

First, the researchers analyzed return rates and quantitative rankings. Second, they analyzed open-ended comments to determine the length, number, and proportion of respondents making such comments. Finally, working in pairs, the researchers analyzed each open-ended comment and categorized it as (a) positive or negative and (b) formative or summative. A formative comment was defined as one that gave specific reasons for judgment, so that the instructor knew what the student was suggesting be kept or changed. A summative comment was defined as one lacking detail on what to keep or change. The examples in Table 1 may provide clarity as to the judgments made by the researchers.
Table 1. Examples of Researcher Interpretations

  Open-Ended Comment                                             Nature     Usefulness
  "This was a wonderful class."                                  Positive   Summative
  "Great professor!"                                             Positive   Summative
  "Good job!"                                                    Positive   Summative
  "I appreciated how you respected everyone's opinion."          Positive   Formative
  "We always had very good discussions, which made me
   want to come back."                                           Positive   Formative
  "Made me feel comfortable participating."                      Positive   Formative
  "This class was frustrating for me at times."                  Negative   Summative
  "This class was horrible."                                     Negative   Summative
  "This is one of the worst classes I've ever taken."            Negative   Summative
  "It would help if you explained assignments better."           Negative   Formative
  "He needs work with computers, really struggles."              Negative   Formative
  "Assignments were not clearly explained."                      Negative   Formative

No attempt was made to correlate the possible combinations of positive/negative and formative/summative. Although negative comments may be less pleasant to hear than positive ones, they can be just as instructive if they are formative in content rather than simply summative.

In some cases, student comments reflect circumstances beyond the instructor's control. Prior to tabulating data, the researchers determined that such comments would not be included in the data analysis. An example would be: "Classrooms in this building are always too cold." Although such comments may be useful for other reasons, they were excluded because they did not directly pertain to instructor or course evaluation.

Researchers also had to determine how to handle multiple judgments from each student. The first printing of comments came to the researchers grouped by student, regardless of the number of sentences or judgments within the block. The researchers reformatted each set of comments into discrete judgments. For example, the following two-sentence block was determined to contain three separate judgments:
"I really enjoyed her knowledge (1). She could stand to be a little more organized, (2) but overall she was very good (3)."

Results

Return Rates. Table 2 shows the difference in return rates between the two groups. A total of 413 evaluations were returned by 519 students, an overall return rate of about 80%. About half (48%) of the students who returned evaluations completed online ones, and 52% completed traditional evaluations. Of the students offered traditional evaluations, 83% (215 of 258) returned the forms, compared with 76% (198 of 261) who submitted online evaluations. In some cases the online evaluations generated a higher rate of return, but overall more students returned traditional evaluations than online evaluations.

Table 2. Summary Data on Return Rates (N=519)

  Format        Students Offered (n=519)   Evaluations Returned (n=413)   % of Returned Evaluations
  Online        261                        198                            48%
  Traditional   258                        215                            52%

  Difference: 4% higher from traditional.

(Note: Tables at the end of this paper contain complete data from each course section.)

Quantitative Rankings. Little difference was found in quantitative results between traditional and online evaluations. Using a five-point Likert scale, students responded to 14 questions asking the extent to which the instructor was prepared, fair, enthusiastic, and so on. The responses for each of the 14 questions were averaged for each individual instructor, resulting in one average score per instructor for each question. When responses were tabulated, two questions showed a .2 difference between formats, ten showed a .1 difference, and two showed no difference whatsoever. The average quantitative ranking for the online format was 1.55; for the traditional format it was 1.46 (on a scale where 1.0 represented excellent and 5.0 poor).
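The per-question tabulation just described can be sketched in a few lines. The 14 ratings below are hypothetical stand-ins (not the study's data), invented only to reproduce the reported pattern of two .2 differences, ten .1 differences, and two ties:

```python
# Hypothetical per-question average ratings (1.0 = excellent, 5.0 = poor).
# These values are illustrative, not the study's actual data.
online = [1.4, 1.5, 1.6, 1.5, 1.7, 1.5, 1.6, 1.5, 1.6, 1.5, 1.6, 1.5, 1.6, 1.6]
traditional = [1.4, 1.4, 1.5, 1.4, 1.5, 1.5, 1.5, 1.4, 1.5, 1.4, 1.5, 1.4, 1.5, 1.4]

# Per-question absolute difference between formats, rounded to one decimal
diffs = [round(abs(o - t), 1) for o, t in zip(online, traditional)]

# Overall average ranking per format
mean_online = sum(online) / len(online)
mean_trad = sum(traditional) / len(traditional)

print("ties:", diffs.count(0.0), " .1 diffs:", diffs.count(0.1), " .2 diffs:", diffs.count(0.2))
print("means:", round(mean_online, 2), round(mean_trad, 2))
```

With these invented values the overall means come out near the study's reported averages, illustrating how small per-question differences wash out in the format-level comparison.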
Three statistical tests (an independent t-test, the Mann-Whitney test, and Levene's test for unequal variance) found no statistically significant difference at the .05 level in these results. These tests were chosen because the data consisted of group averages rather than individual student scores; individual scores were not available, as results are reported to faculty only in composite format. Students rated faculty essentially the same whether the evaluations were online or in traditional form.

Extent of Open-Ended Responses. Although quantitative rankings can be extremely helpful in pointing out areas of relative strength and weakness, some instructors give more attention to open-ended comments. After completing the 14 Likert-scale questions, students in
the study were asked the open-ended question, "Do you have any additional comments?" In the area of open-ended responses, several differences were found between those who submitted online evaluations and those who submitted traditional evaluations.

Comments According to Format. Online responses to the open-ended question were consistently more numerous than those from the traditional format. Of 670 individual comments, 452 were made online compared to 218 on paper, a ratio of roughly 2:1. More than twice as many comments came from online evaluations as from traditional evaluations.

Comments within Returned Evaluations. Among those who returned evaluations, the percentage of online respondents making open-ended comments was also considerably greater. As Table 3 shows, when compared to the overall number of students who submitted evaluations in either form, approximately 27% more online respondents also wrote open-ended comments.

Table 3. Summary Data on Comments within Returned Evaluations

  Format        Students Submitting (n=413)   % Making Open-Ended Comments
  Online        198                           74%
  Traditional   215                           47%

  Difference: 27% more from online.

Comments and Words per Student. Despite the slightly higher return rate of traditional evaluations, of the 250 respondents who took the time to make open-ended comments, online respondents made on average half again as many comments (53% more) as their counterparts. Furthermore, of the 7,976 words written in comments, the average length of online comments was 54% greater than the average length of comments submitted in traditional form. Table 4 illustrates the differences.

Table 4. Summary Data on Comments and Words per Student (students making comments: n=250)

  Format        Comments (n=658)         Words Written (n=7,976)
  Online        440 (3.03 per student)   5,395 (37 per student)
  Traditional   218 (2.08 per student)   2,582 (24 per student)

Nature of Comments (Positive or Negative). Comments made in the open-ended response portions of the evaluations were examined to determine whether the ratio of positive to negative comments differed between the two evaluation formats. As Table 5 shows, the
overall ratio of positive to negative comments revealed a 6% difference: 55% of online comments were positive, compared to 61% of those submitted on traditional evaluations.

Table 5. Positive Comments vs. Negative Comments (N=644)

  Format                Positive   Negative
  Online (n=433)        55%        45%
  Traditional (n=211)   61%        39%

  Difference: 6% more positive comments from traditional.

Usefulness of Comments (Formative or Summative). After determining whether each open-ended comment was positive or negative, the researchers categorized each comment as formative or summative, based on whether it provided the instructor with information on what to change, add, or keep in the course. As shown in Table 6, online respondents made about twice as many statements, and they made formative comments at a 6% higher rate than respondents submitting traditional evaluations (79% vs. 73%).

Table 6. Formative Comments vs. Summative Comments (N=639)

  Format                Formative   Summative
  Online (n=432)        79%         21%
  Traditional (n=207)   73%         27%

  Difference: 6% more formative comments from online.

Analysis

Overall, the results of this comparative study on course evaluation formats were these: (a) it showed higher online return rates than other research had found, although the return rate for traditional evaluations was still somewhat higher (by 7%); (b) it supported other research on the similarity of quantitative rankings; (c) it supported other research showing greater number and length of online responses; and (d) it showed differences in the extent to which open-ended comments were positive or negative and formative or summative.

Return Rates. In response to concerns about return rates, this study differed somewhat from reports in the literature showing that online return rates were sometimes as low as 40% (Johnson, 2002).
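This study's rates can be recomputed directly from the raw counts reported in the Results section (215 of 258 traditional forms returned, 198 of 261 online forms); a minimal check:

```python
# Return rates from the raw counts reported in the Results section:
# (evaluations returned, evaluations offered) per format.
returned = {"traditional": (215, 258), "online": (198, 261)}

rates = {fmt: round(100 * ret / offered) for fmt, (ret, offered) in returned.items()}
print(rates)  # traditional: 83, online: 76 (percent)
```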
The results of this study showed a 76% return rate for online evaluations and an 83% return rate for traditional evaluations. It is possible that the higher return rates in this study were due to the use of a course management system (in this case, Blackboard); such systems have been found to increase response rates. It is also possible that the non-classroom setting offered circumstances more amenable to some students, such as more time and more privacy from the instructor and classroom peers. Although any number of factors can affect return rates, this study showed that comparable rates are indeed possible.

Quantitative Rankings. This is one of the most frequently voiced instructor concerns when choosing between traditional and online evaluation formats. This study verified what the literature points out consistently: there was no significant difference in quantitative rankings between online evaluation formats and traditional paper-and-pencil formats. This study found a slim .09 difference on a 5.0 scale.

Number and Length of Open-Ended Responses. As also reported in the literature, the number and length of open-ended responses was considerably greater in online evaluations than in traditional paper-and-pencil evaluations. The ratio of online comments to traditional comments was 2:1. The percentage of online respondents who made open-ended comments was 27% greater than the percentage of traditional evaluation respondents. Online respondents wrote 53% more comments per student than traditional respondents (2.9 online vs. 1.9 traditional). Finally, online respondents wrote 54% longer comments than traditional respondents (37 words per student online vs. 24 words per student in traditional format).

Thus far the study seemed to show that online evaluations resulted in comparable return rates, comparable rankings, more open-ended comments, and lengthier open-ended comments. But what about the substance of those comments?
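Before turning to substance, the volume comparisons above can be verified from the reported per-student figures; a quick sketch:

```python
# Reported per-student averages: words and comments per commenting student.
words = {"online": 37, "traditional": 24}
comments = {"online": 2.9, "traditional": 1.9}

def pct_more(metric):
    """Relative difference of online over traditional, as a rounded percent."""
    return round(100 * (metric["online"] - metric["traditional"]) / metric["traditional"])

print(pct_more(words), "% longer comments online")
print(pct_more(comments), "% more comments per student online")
```

The 54% and 53% figures in the text fall out directly from these two ratios.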
In the seeming absence of previous research on this question, this study offers the following findings.

Nature of Open-Ended Comments (Positive vs. Negative). Whereas quantitative rankings showed virtually no difference between the two formats, respondents submitting traditional evaluations made a slightly higher percentage of positive open-ended comments (by 6%). The researchers have not pursued this line of inquiry, but one reason may be the anonymity afforded by electronically typed comments as compared to handwritten ones.

Usefulness of Open-Ended Comments in Improving Instruction (Formative vs. Summative). Respondents completing online evaluations not only made more and longer comments; they also made a larger percentage of formative comments (by 6%) than their counterparts using traditional formats. Although a 6% difference may not seem large, when combined with substantially more online respondents making comments, and with those comments being substantially lengthier, the amount of information offered to instructors can be useful indeed.

Conclusions

Several conclusions can be drawn from this research, although it should be replicated in other settings with other populations. First, this study confirms existing research on similarities between evaluation formats. Second, it clarifies what some faculty believe: there are indeed differences in student evaluations based on format. The differences, however, may allay rather than exacerbate faculty concerns. Based on our results, we offer the following for consideration:
1. Instructors using either format are likely to find that quantitative rankings will be similar.
2. Instructors using either format are likely to find that the proportion of positive to negative comments will be similar.
3. Instructors using online formats, however, are likely to find that open-ended comments will be not only greater in number and length but also richer in qualitative detail than those found in traditional evaluations.

Faculty concerns are not to be dismissed lightly. Most educators would agree, however, that course evaluations are important not only for tenure and promotion but also to give faculty feedback to improve teaching, to give students a chance for valuable input into their own learning, and to help both strive for a learning community. All of these purposes seem well served by detailed, thoughtful student input into the class experience. If the findings of this study hold true, the combination of administrative convenience, instructor desire for timely feedback, and course improvement through formative evaluation may reduce the use of traditional paper evaluations and make online evaluations the format of choice.
References

Anderson, H. M., Cain, J., & Bird, E. (2005). Online course evaluations: Review of literature and a pilot study. American Journal of Pharmaceutical Education, 69(1).

Avery, R. J., Bryant, W. K., Mathios, A., Kang, H., & Bell, D. (2006). Electronic course evaluations: Does an online delivery system influence student evaluations? Journal of Economic Education, 37(1).

Carini, R., Hayek, J., Kuh, G., Kennedy, J., & Ouimet, J. (2003). College student responses to web and paper surveys: Does mode matter? Research in Higher Education, 44(1).

Dommeyer, C., Baum, P., Hanna, R., & Chapman, K. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment and Evaluation in Higher Education, 29(5), 611.

Dommeyer, C., Baum, P., Chapman, K., & Hanna, R. (2002). Attitudes of business faculty toward two methods of collecting teacher evaluations: Paper vs. online. Assessment and Evaluation in Higher Education, 27(5), 455.

Hmieleski, K., & Champagne, M. V. (2000). Plugging in to course evaluations. The Technology Source, September/October.

Johnson, T. (2002). Online student ratings: Will students respond? Paper presented at the American Educational Research Association Annual Meeting.

Kasiar, J. B., Schroeder, S. L., & Holstaad, S. G. (2001). Comparison of traditional and web-based course evaluation processes in a required, team-taught pharmacotherapy course. American Journal of Pharmaceutical Education, 63.

Kuhtman, M. (2004). Review of online student ratings of instruction. College and University Journal, 80(1).

Layne, B., DeCristoforo, J., & McGinty, D. (1999). Electronic versus traditional student ratings of instruction. Research in Higher Education, 40(2).

Oliver, R., & Sautter, E. (2005). Using course management systems to enhance the value of student evaluations for teaching. Journal of Education for Business, 80(4).

Ravelli, B. (2000). Anonymous online teaching assessments: Preliminary findings. Paper presented at the Annual National Conference of the American Association for Higher Education, June 14-18, 2000, Charlotte, North Carolina.

Sorenson, D. L., & Johnson, T. D. (Eds.). (2003). Online student ratings of teaching. San Francisco: Jossey-Bass.

Wachtel, H. K. (1998). Student evaluation of college teaching effectiveness: A brief review. Assessment and Evaluation in Higher Education, 23(2).
Appendix A. Detailed Data Tables

Table 2. Data on Return Rates
  Course sections per evaluation format; students; evaluations returned; return rate (online vs. traditional), by paired section.

Table 3. Data on Comments within Returned Evaluations
  Paired sections; students submitting evaluations (n=413); students making open-ended comments (n=250); percentage of submitters making open-ended comments (online vs. traditional).

Table 4. Data on Comments and Words per Student
  Students making open-ended comments; word count; words per student (online vs. traditional), with differences by paired section.

Table 5. Positive Comments vs. Negative Comments (N=644)
  Paired sections (n=22); online (n=433) and traditional (n=211) comments classified as positive or negative.

Table 6. Formative Comments vs. Summative Comments (N=639)
  Paired sections; online (n=432) and traditional (n=207) comments classified as formative or summative. Totals: online 79% formative, 21% summative; traditional 73% formative, 27% summative.
Alternative Online Pedagogical Models With Identical Contents: A Comparison of Two University-Level Course
The Journal of Interactive Online Learning Volume 2, Number 1, Summer 2003 www.ncolr.org ISSN: 1541-4914 Alternative Online Pedagogical Models With Identical Contents: A Comparison of Two University-Level
Assessment METHODS What are assessment methods? Why is it important to use multiple methods? What are direct and indirect methods of assessment?
Assessment METHODS What are assessment methods? Assessment methods are the strategies, techniques, tools and instruments for collecting information to determine the extent to which students demonstrate
Concordia University Course Evaluation Survey: Summary Report
Concordia University Course Evaluation Survey: Summary Report Introduction In fall 2010, the Vice-Provost, Teaching and Learning engaged the Institutional Planning Office to conduct a survey of student
Assessing Online Asynchronous Discussion in Online Courses: An Empirical Study
Assessing Online Asynchronous Discussion in Online Courses: An Empirical Study Shijuan Liu Department of Instructional Systems Technology Indiana University Bloomington, Indiana, USA [email protected]
Assessing the quality of online courses from the students' perspective
Internet and Higher Education 9 (2006) 107 115 Assessing the quality of online courses from the students' perspective Andria Young, Chari Norgard 1 University of Houston-Victoria, 3007 N. Ben Wilson, Victoria,
UNH Graduate Education Department. Quarterly Assessment Report
First Quarter Assessment Report UNH Graduate Education Department Quarterly Assessment Report First Quarter i First Quarter Assessment Report Table of Contents Introduction... Section - Purpose of the
Student Response to Instruction (SRTI)
office of Academic Planning & Assessment University of Massachusetts Amherst Martha Stassen Assistant Provost, Assessment and Educational Effectiveness 545-5146 or [email protected] Student Response
Blended Learning vs. Traditional Classroom Settings: Assessing Effectiveness and Student Perceptions in an MBA Accounting Course
Blended Learning vs. Traditional Classroom Settings: Assessing Effectiveness and Student Perceptions in an MBA Accounting Course Clement C. Chen, University of Michigan - Flint Keith T. Jones, Illinois
How To Know If Online Courses Work For Middle And High School Students
Do Online Courses Work for Middle Grades and High School Students? Online Students Have Their Say William R. Thomas Can a middle grades or high school student really take a course for credit just by using
Noel- Levitz Student Satisfaction Inventory Results: Disaggregated by Undergraduate and Graduate
Noel- Levitz Student Satisfaction Inventory Results: Disaggregated by Undergraduate and Graduate Gallaudet University Spring 2015 Report December 18, 2015 Office of Institutional Research Gallaudet Student
V. Course Evaluation and Revision
V. Course Evaluation and Revision Chapter 14 - Improving Your Teaching with Feedback There are several ways to get feedback about your teaching: student feedback, self-evaluation, peer observation, viewing
Interpretation Guide. For. Student Opinion of Teaching Effectiveness (SOTE) Results
Interpretation Guide For Student Opinion of Teaching Effectiveness (SOTE) Results Prepared by: Office of Institutional Research and Student Evaluation Review Board May 2011 Table of Contents Introduction...
Student Ratings of College Teaching Page 1
STUDENT RATINGS OF COLLEGE TEACHING: WHAT RESEARCH HAS TO SAY Lucy C. Jacobs The use of student ratings of faculty has increased steadily over the past 25 years. Large research universities report 100
Master of Education Online - Exit Survey
Master of Education - Online School of Leadership and Education Sciences University of San Diego 2015 Exit Survey Results This report was prepared by the USD-SOLES Office of Accreditation & Assessment
Title: Transforming a traditional lecture-based course to online and hybrid models of learning
Title: Transforming a traditional lecture-based course to online and hybrid models of learning Author: Susan Marshall, Lecturer, Psychology Department, Dole Human Development Center, University of Kansas.
Assessing the Impact of a Tablet-PC-based Classroom Interaction System
STo appear in Proceedings of Workshop on the Impact of Pen-Based Technology on Education (WIPTE) 2008. Assessing the Impact of a Tablet-PC-based Classroom Interaction System Kimberle Koile David Singer
A Comparison of E-Learning and Traditional Learning: Experimental Approach
A Comparison of E-Learning and Traditional Learning: Experimental Approach Asst.Prof., Dr. Wanwipa Titthasiri, : Department of Computer Science : Faculty of Information Technology, Rangsit University :
Student Mood And Teaching Evaluation Ratings
Student Mood And Teaching Evaluation Ratings Mary C. LaForge, Clemson University Abstract When student ratings are used for summative evaluation purposes, there is a need to ensure that the information
Student Perceptions of Online Learning: A Comparison of Two Different Populations
Student Perceptions of Learning: A Comparison of Two Different Populations Catharina Daniels 1 Pace University School of Computer Science and Information Systems Technology Systems Department New York,
RUNNING HEAD: TEACHER PERCEPTIONS OF ONLINE LEARNING TOOLS. Teacher Perceptions as an Integral Component in the Development of Online Learning Tools
Online Learning Tools 1 RUNNING HEAD: TEACHER PERCEPTIONS OF ONLINE LEARNING TOOLS Teacher Perceptions as an Integral Component in the Development of Online Learning Tools Jessica D. Cunningham and Kelly
A COMPARISON OF COLLEGE AND HIGH SCHOOL STUDENTS IN AN ONLINE IT FOUNDATIONS COURSE
A COMPARISON OF COLLEGE AND HIGH SCHOOL STUDENTS IN AN ONLINE IT FOUNDATIONS COURSE Amy B. Woszczynski Kennesaw State University [email protected] Debra Bass Geist Kennesaw State University [email protected]
Assessing the General Education Mathematics Courses at a Liberal Arts College for Women
1 Assessing the General Education Mathematics Courses at a Liberal Arts College for Women AbdelNaser Al-Hasan and Patricia Jaberg Mount Mary College, Milwaukee, WI 53222 [email protected] [email protected]
Final Exam Performance. 50 OLI Accel Trad Control Trad All. Figure 1. Final exam performance of accelerated OLI-Statistics compared to traditional
IN SEARCH OF THE PERFECT BLEND BETWEEN AN INSTRUCTOR AND AN ONLINE COURSE FOR TEACHING INTRODUCTORY STATISTICS Marsha Lovett, Oded Meyer and Candace Thille Carnegie Mellon University, United States of
ONLINE LEARNING AND COPING STRATEGIES. Marcela Jonas, University of the Fraser Valley, Canada
ONLINE LEARNING AND COPING STRATEGIES Marcela Jonas, University of the Fraser Valley, Canada Summary The goal of this case study is to contrast two types of students: native and non-native speakers of
Online Collection of Midterm Student Feedback
9 Evaluation Online offers a variety of options to faculty who want to solicit formative feedback electronically. Researchers interviewed faculty who have used this system for midterm student evaluation.
AC 2012-3818: FACULTY PERCEPTIONS AND USE OF A LEARNING MANAGEMENT SYSTEM AT AN URBAN, RESEARCH INSTITUTION
AC 2012-3818: FACULTY PERCEPTIONS AND USE OF A LEARNING MANAGEMENT SYSTEM AT AN URBAN, RESEARCH INSTITUTION Julie M. Little-Wiles M.S.M., Ph.D. (A.B.D.), Purdue University, West Lafayette Julie M. Little-Wiles
GRADUATE FACULTY PERCEPTIONS OF ONLINE TEACHING
GRADUATE FACULTY PERCEPTIONS OF ONLINE TEACHING Sharon Santilli and Vesna Beck Nova Southeastern University The participants for this study were 47 doctoral faculty from Nova Southeastern University Fischler
Assessing Online Learning and Effective Teaching Practices
1 21st Annual Conference on Distance Teaching and Learning 08.05 Assessing Online Learning and Effective Teaching Practices Cynthia Whitesel Adjunct Instructor Visiting Lecturer Lincoln University of the
An Examination of Assessment Methods Used in Online MBA Courses
An Examination of Assessment Methods Used in Online MBA Courses Shijuan Liu Richard Magjuka Xiaojing Liu SuengHee Lee Kelly Direct Programs, Indiana University Curtis Bonk Department of Instructional Systems
University Students' Perceptions of Web-based vs. Paper-based Homework in a General Physics Course
Eurasia Journal of Mathematics, Science & Technology Education, 2007, 3(1), 29-34 University Students' Perceptions of Web-based vs. Paper-based Homework in a General Physics Course Neşet Demirci Balıkesir
Education at the Crossroads: Online Teaching and Students' Perspectives on Distance Learning
Education at the Crossroads: Online Teaching and Students' Perspectives on Distance Learning Jacqueline Leonard and Smita Guha Temple University Abstract The Internet offers colleges and universities new
Comparison of Student and Instructor Perceptions of Best Practices in Online Technology Courses
Comparison of and Perceptions of Best Practices in Online Technology s David Batts Assistant Professor East Carolina University Greenville, NC USA [email protected] Abstract This study investigated the perception
How to Support Faculty as They Prepare to Teach Online Susan C. Biro Widener University Abstract: A survey, an in-depth interview, and a review of
How to Support Faculty as They Prepare to Teach Online Susan C. Biro Widener University Abstract: A survey, an in-depth interview, and a review of the literature were used to explore the changes faculty
GAO SCHOOL FINANCE. Per-Pupil Spending Differences between Selected Inner City and Suburban Schools Varied by Metropolitan Area
GAO United States General Accounting Office Report to the Ranking Minority Member, Committee on Ways and Means, House of Representatives December 2002 SCHOOL FINANCE Per-Pupil Spending Differences between
NEW WAYS OF THINKING ABOUT MULTIMEDIA AND ONLINE TEACHING IN HIGHER EDUCATION
NEW WAYS OF THINKING ABOUT MULTIMEDIA AND ONLINE TEACHING IN HIGHER EDUCATION Ahmad Abuhejleh Computer Science & Information Systems University Of Wisconsin River Falls [email protected] Abstract
Designing Effective Online Course Development Programs: Key Characteristics for Far-Reaching Impact
Designing Effective Online Course Development Programs: Key Characteristics for Far-Reaching Impact Emily Hixon, Ph.D. School of Education [email protected] Janet Buckenmeyer, Ph.D. School of Education
An Analysis of IDEA Student Ratings of Instruction in Traditional Versus Online Courses 2002-2008 Data
Technical Report No. 15 An Analysis of IDEA Student Ratings of Instruction in Traditional Versus Online Courses 2002-2008 Data Stephen L. Benton Russell Webster Amy B. Gross William H. Pallett September
Examining Students Performance and Attitudes Towards the Use of Information Technology in a Virtual and Conventional Setting
The Journal of Interactive Online Learning Volume 2, Number 3, Winter 2004 www.ncolr.org ISSN: 1541-4914 Examining Students Performance and Attitudes Towards the Use of Information Technology in a Virtual
Online Courses: An analysis of student satisfaction
Online Courses: An analysis of student satisfaction Franza, T., Williams, D., Morote, E. & Cunningham, J Purpose of Study Abstract: The purpose of this study was to determine if existing student or instructor
Fighting Diabetes through a Service Learning Partnership between Turtle Mountain Community College and the University of North Dakota
Introduction Fighting Diabetes through a Service Learning Partnership between Turtle Mountain Community College and the University of North Dakota Peggy Johnson, Turtle Mountain Community College, and
A PRELIMINARY COMPARISON OF STUDENT LEARNING IN THE ONLINE VS THE TRADITIONAL INSTRUCTIONAL ENVIRONMENT
A PRELIMINARY COMPARISON OF STUDENT LEARNING IN THE ONLINE VS THE TRADITIONAL INSTRUCTIONAL ENVIRONMENT Dr. Lori Willoughby Dr. Linda Cresap Minot State University Minot ND [email protected] [email protected]
EXPLORING ATTITUDES AND ACHIEVEMENT OF WEB-BASED HOMEWORK IN DEVELOPMENTAL ALGEBRA
EXPLORING ATTITUDES AND ACHIEVEMENT OF WEB-BASED HOMEWORK IN DEVELOPMENTAL ALGEBRA Kwan Eu Leong Faculty of Education, University of Malaya, Malaysia [email protected] Nathan Alexander Columbia University
KREMEN SCHOOL OF EDUCATION AND HUMAN DEVELOPMENT Spring 2009 Exit Surveys
KREMEN SCHOOL OF EDUCATION AND HUMAN DEVELOPMENT Spring 2009 Exit Surveys Yes. Cost is reasonable. I would advise them to talk to Dr. Fry Bohlin. Her personal connection is why I started and completed
Research Proposal: ONLINE VERSUS FACE-TO-FACE CLASSROOM: EVALUATING LEARNING EFFECTIVENESS FOR A MASTERS LEVEL INSTRUCTIONAL DESIGN COURSE
1 Online Versus Face-to-Face Classroom: Evaluating Learning Effectiveness for a Masters Level Instructional Design Course Ann Peniston George Mason University Author Note Submitted in partial fulfillment
Community Colleges. Measuring Internationalization. AMERICAN COUNCIL ON EDUCATION The Unifying Voice for Higher Education
Measuring Internationalization at Community Colleges Funded by the Ford Foundation AMERICAN COUNCIL ON EDUCATION The Unifying Voice for Higher Education Center for Institutional and International Initiatives
Dominican University of California Peer Mentoring in Health Professions Education Programs
Dominican University of California Peer Mentoring in Health Professions Education Programs Dr. Ruth Ramsey, OTR/L Associate Professor, Chair Dominican University of California AOTA/NBCOT Education Summit:
