College Students' Attitudes Toward Methods of Collecting Teaching Evaluations: In-Class Versus On-Line
CURT J. DOMMEYER
PAUL BAUM
ROBERT W. HANNA
California State University, Northridge
Northridge, California

ABSTRACT. In this study, the authors investigated college students' attitudes toward in-class and on-line methods of gathering teaching evaluations of faculty. Sixteen professors who taught two sections of the same class were randomly assigned to have one of their sections evaluated by the in-class method and the other by the on-line method. Students also completed a post-teaching-evaluation survey measuring their attitudes toward the method of evaluation used. The on-line method had a lower response rate than did the in-class method. On-line respondents complained that the evaluation process may not be anonymous and that it was time consuming and involved complicated log-on procedures. Suggestions are offered for improving the on-line evaluation process.

The evaluation of faculty instruction is an ongoing and ubiquitous rite on college and university campuses throughout the United States and around the world. The vast majority of U.S. institutions require it in some form (Wachtel, 1994), and the amount of research on this activity is enormous (Wachtel, 1998), involving thousands of research papers (Marsh & Dunkin, 1992). According to Cashin (1988), there are probably more studies of student evaluations of faculty than of all other means of evaluating faculty teaching combined. Although this research dates back to the late 1920s with the early work of Brandenburg and Remmers (1927), until the last couple of decades it was largely concerned with validity and reliability issues. In a comprehensive review of research concerning student evaluation of college and university instructors, Wachtel (1998) noted that after nearly seven decades of research on the use of student evaluations of teaching effectiveness, "it can safely be stated that the majority of researchers believe that student ratings are a valid, reliable, and worthwhile means of evaluating teaching" (p. 2). Yet there are many others who would argue this point, as more recent research has indicated that methodological factors and situational characteristics can affect the validity of student evaluations. It has been shown, for example, that student evaluations may be affected by student perceptions of grade leniency (Nimmer & Stone, 1991), by instructor enthusiasm (Williams & Ceci, 1997), by student perceptions of anonymity (Blunt, 1991; Feldman, 1979), and by the early formation of student opinions about a course or instructor well before the course is underway (Hewett, Chastain, & Thurber, 1988).

Students' attitudes toward the traditional methodology of student evaluations have received relatively little research attention. According to Marlin (1987), student satisfaction as a factor in student ratings has been largely ignored. He noted that some students complain that faculty and administrators pay scant attention to student opinions and that teachers do not modify their behavior based on student comments on rating forms. Abbott, Wulff, Nyquist, Ropp, and Hess (1990) stressed that students' willingness to participate in the evaluation method is directly related to their satisfaction with the evaluation process. To increase student participation in the evaluations of faculty members, university administrators must learn students' attitudes toward current and proposed methods for gathering teaching evaluations.

The wide availability of computers and the Internet, both on and off campus, has afforded students numerous learning opportunities, such as quick access to lecture material, literature databases, professors, and classmates. Wide Internet access also provides a new mode by which students may evaluate their professors, namely, on-line evaluations.
The traditional 1-day in-class evaluation of instructors via optically scanned forms may eventually give way to evaluation via the Internet, allowing for extended periods of student evaluation time. The on-line method promises more convenience to students
and ease of use for administrators (Layne, DeCristoforo, & McGinty, 1999). Although electronic surveys are a fairly recent technique, they have been compared with traditional survey methods and proven effective across a variety of organizational applications, such as market research, personnel evaluations, and psychological counseling (Sproull & Kiesler, 1991; Rosenfeld, Booth-Kewley, & Edwards, 1993). Until the last few years, there was an absence of published research comparing on-line student evaluations of faculty with traditional in-class paper-and-pencil methods. A recent study comparing these two methods was conducted by Layne et al. (1999). The researchers studied both graduate and undergraduate students, the majority of whom were in engineering-related disciplines and were computer literate. They concluded that students had higher levels of satisfaction with the electronic (on-line) method of faculty evaluation than with the traditional (in-class) method. Voluntary comments suggested that although the on-line method was perceived by students as having less anonymity, they saw it as less wasteful of resources (paper) and class time. An unexpected finding was that students who completed the survey on-line were more likely to provide voluntary comments than students who completed it in class. Overall, it appeared that the method of evaluation did not affect the responses given by the students. Nonetheless, the survey results revealed a lower response rate for students using the on-line method (47.8%) compared with students using the in-class method (60.6%). The student attitudes that might account for these response rate differences were not directly determined. The authors suggested that effective use of the on-line method requires greater institutional endorsement and guarantees of confidentiality. They also suggested that incentives may be necessary to achieve higher response rates.
In another comparable study, Baum, Chapman, Dommeyer, and Hanna (2001) compared the in-class and on-line evaluation methods among undergraduate business students. They found an even larger difference in response rates between the on-line method (32.8%) and the in-class method (76.8%) and some tendency toward higher student ratings of instructors with the on-line method. In this article, we extend that study by comparing student attitudes toward the in-class and on-line evaluation methods.

Objectives

Our primary purpose in this study was to compare the attitudes of college students who were asked to give their teaching evaluations of their professors in class with those of students who were asked to submit their evaluations on-line. The two groups were compared on their impressions of the evaluation method that they were asked to use. The following questions were addressed:

1. Which method of teaching evaluation (in-class vs. on-line) achieves the higher response rate to the teaching evaluations?
2. Does the method of teaching evaluation affect the reasons that students give for evaluating or not evaluating their professor?
3. Does the method of teaching evaluation affect students' overall impressions of the teaching evaluation process?
4. Does the method of teaching evaluation affect students' ratings of specific dimensions of the teaching evaluation process (e.g., feelings of anonymity, ability to express true feelings, simplicity, and convenience)?

A secondary purpose of this research was to determine whether either method of teaching evaluation produces a response bias, that is, a situation in which the respondents and nonrespondents to a survey differ on key characteristics (Malhotra, 1999).
We addressed this issue by determining whether students who responded to a specific teaching evaluation process differed significantly from the nonrespondents on the following factors: gender, expected grade in the class, and rating of the professor's teaching.

Method

During the spring semester of 2000 at California State University, Northridge, 16 business professors were recruited to participate in an experiment on on-line teaching evaluations. All of the professors selected for the experiment were currently teaching at least two sections of the same class during the semester. The majority were teaching two identical sections of the same class in adjacent time periods. Each of the professors was asked to alter the method of teaching evaluations in the paired sections on a random basis: One section was scheduled for in-class evaluations, whereby the students would rate the professor on forms distributed in the classroom, and the other section was asked to submit its evaluations of the professor via the Internet. Both survey methods used the same evaluation form. Students selected for the on-line evaluations were informed that they could go to a Web site at any time during a 2-week period to submit their evaluations. The experimental procedures were designed to minimize biases that might be attributed to the professor, the class, or the time of day.

Prior to the last week of the semester, the teaching evaluations from all students, both in-class and on-line, were completed. During the last week of the semester, each of the 16 professors in the experiment was asked to distribute a post-teaching-evaluation survey to his or her in-class and on-line students to gather their impressions of the evaluation method they were asked to use. The questionnaires were distributed during class time at a meeting that most students were expected to attend.
Students were told that their responses to the survey would be anonymous, that their instructor would not examine their answers, and that the completed surveys would be taken to the dean's office for analysis. The post-teaching-evaluation survey was two pages long and contained 16 questions. It began by asking whether the student had completed the earlier teaching evaluations and then asked why the student did (or did not) complete the evaluation of his or her professor. Those students who had completed the teaching evaluation were asked to indicate the method of evaluation that they used (either in-class or on-line), to explain their likes and dislikes of the method, and to rate the method they
used by agreeing or disagreeing with eight belief statements. All of the students, whether they completed the earlier teaching evaluations or not, were asked to indicate their gender, the grade they expected in the class as of the time of the survey, and their rating of their professor's teaching ability on a 10-point scale.

Results

Altogether, 1,549 students were enrolled in the classes given the post-teaching-evaluation surveys. Of this group, 987 students completed the survey. However, 26 students either stated that they did their evaluations on-line when they were asked to do them in class or submitted them in class when they were asked to do them on-line. These surveys were eliminated from the analysis. Thus, usable responses were received from 961 students, resulting in a response rate of 62%.

Randomization Check

At the end of the post-teaching-evaluation survey, all respondents were asked to give their gender, their grade in the class at the time of the survey, and their rating of the instructor on a 10-point scale. We compared the in-class and on-line respondents to the post-evaluation survey on these variables. No significant differences were found between the two treatment groups, suggesting that the random assignment of the treatments to the 32 classes did not cause any selection bias in terms of gender, grade, or instructor evaluation.

Comparing the Treatment Groups

During the post-evaluation survey, students were asked whether they had evaluated their professor's teaching performance during the previous weeks. Those who were asked to complete the teaching evaluations in class were significantly more likely to indicate that they had evaluated their professor than those who were asked to complete the evaluations on-line (92% vs. 60%, χ²(1) = 128, p < .001). These results are consistent with previous studies that have compared response rates of in-class and on-line teaching evaluations (Baum et al., 2001; Layne et al., 1999).
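A comparison like the one above amounts to a chi-square test of independence on a 2 × 2 table of respondents and nonrespondents. The sketch below uses hypothetical counts back-calculated from the reported rates (roughly 92% of about 412 in-class students and 60% of about 506 on-line students), not the authors' raw data, so the statistic will only approximate the reported χ²(1) = 128.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts reconstructed from the reported percentages;
# rows are treatment groups, columns are (evaluated, did not evaluate).
observed = [
    [379, 33],   # in-class: ~92% response
    [304, 202],  # on-line:  ~60% response
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.3g}")
```

With counts of this size the test statistic is well over 100 and the p value is far below .001, matching the direction and strength of the reported result.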
When students who had provided teaching evaluations were asked why they gave them, the main reason given by both the in-class and on-line students was that they were asked or required to give the evaluations. However, students who did not submit teaching evaluations gave reasons for not providing the evaluations that were specific to the evaluation method. That is, nonrespondents to the in-class evaluations stated that having missed class on the day of the evaluations was the main cause of their nonresponse. On-line nonrespondents, in contrast, were more likely to state that they forgot or missed the deadline or had computer problems (see Table 1). Students who completed the teaching evaluations were asked what they liked about the evaluation process; their answers are also displayed in Table 1. On-line respondents spoke mainly of the convenience, speed, or ease of the method, whereas in-class respondents listed those features as well as

TABLE 1. Reasons Given for Not Evaluating the Professor, for Having Liked the Evaluation Method, and for Not Having Liked It, in Percentages of Respondents

Student response                                   In-class   On-line
Reasons for not evaluating professor
  Forgot or missed deadline                                      67
  Absent from class                                   64          6
  Computer problems                                              13
  Inconvenient                                                    7
  Do not care about evaluations                        4          2
  Miscellaneous                                       32          4
  Total
Reasons students liked evaluation method
  Convenient                                           7         39
  Fast
  Easy
  Anonymous response                                  17          3
  Had space for free-response comments                14          2
  Well-designed questions                             15          2
  Professor left the room                             13
  Forces students to complete the evaluations          3
  Uses up class time                                   3
  Did not waste class time                                        5
  Had plenty of time for the evaluation                1
  Total                                              101a       101a
Reasons students disliked evaluation method
  Answers may not be anonymous                         9         24
  Inconvenient                                         6         21
  Questions were too broad                            42          6
  Too many questions                                   4          1
  Other students can see your answers                  2          1
  Wasted valuable class time                          27
  Not enough time to complete survey                  11
  Took too much time                                             28
  Complicated log-on process                                     12
  Computer problems                                               8
  Total                                              101a       101a

Note. For reasons for not evaluating professor, in-class n = 25 and on-line n = 184. For reasons students liked evaluation method, in-class n = 101 and on-line n = 152. For reasons students disliked evaluation method, in-class n = 55 and on-line n = 91.
aPercentages do not sum to 100 because of rounding errors.
several others, namely anonymous response, space for free-response comments, well-designed questions, and the professor leaving the room. When students were asked to list what they disliked about their method of teaching evaluation, in-class respondents mainly mentioned that the questions were too broad, that valuable class time was wasted, and that they had insufficient time to complete the survey. On-line respondents complained primarily that the process took too much time, that responses to the survey might not be anonymous, that the evaluation process was inconvenient, and that the log-on process was complicated (see Table 1).

Students who evaluated their professor were asked to use a 5-point Likert scale to rate the evaluation process on several specific dimensions (see Table 2).

TABLE 2. Respondent Attitudes Toward Method of Teaching Evaluation

Statement a                                                                  In-class   On-line   Dif. b
1. I felt relaxed when completing the evaluation of my professor.
2. When giving my evaluation, I felt free to express my true feelings.
3. I felt there was no way my professor could identify my evaluation with me.*
4. I felt there was no way my instructor could determine if I had done or not done the evaluation.*
5. It was very simple for me to provide my evaluation of my professor.
6. The process used to collect my evaluation was complicated.**
7. It was convenient for me to give my evaluation of my professor.
8. Giving my evaluation wasted good class time.*

Note. Scores are based on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Average scores are reported.
a The statements have been placed in logical groupings for presentation purposes. They were presented in a different order on the survey instrument.
b We used a t test to compare the means. Difference scores are not shown unless they are statistically significant at or below the .10 alpha level.
*p < .01. **p < .05.

TABLE 3. Examining Nonresponse Bias for Each Method

                                       Provided in-class evaluations      Provided on-line evaluations
Factor                                 Yes    No    Test statistic a      Yes    No    Test statistic a
Female                                 94%    6%                          57%    43%
Male                                   91%    9%    χ²(1) =                      36%   χ²(1) = 2.2
Expected grade b                                    t(407) =                           t(496) = .09
Professor's teaching performance c                  t(410) =                           t(504) = 1.2

a None of the test statistics was significant at or below the .10 alpha level.
b Averages based on a 4-point grade scale.
c Averages based on a 10-point scale.

The results indicate that both treatment groups felt relaxed and free to express their true feelings when giving the evaluations (items 1 and 2). However, the in-class respondents were more likely to believe that their response to the survey was anonymous. Though both groups indicated that the teaching evaluation process was simple and uncomplicated, the in-class respondents were slightly more likely than the on-line ones to disagree that the process was complicated. Both groups agreed that the evaluation process that they used was convenient. Finally, the in-class respondents were more likely to believe that the teaching evaluations wasted good class time.

Examining Each Evaluation Method's Nonresponse Bias

We used a 10% significance level when examining nonresponse bias. For each method of teaching evaluation, no statistically significant differences were found between the respondents and nonrespondents on gender, expected grade in the class, or the rating of the professor's teaching performance (see Table 3). Thus, there is no evidence that either of the teaching evaluation methods incurred any significant nonresponse bias.
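The nonresponse-bias check on a continuous factor such as expected grade amounts to a two-sample t test comparing respondents with nonrespondents. The sketch below runs that test on simulated grade data (both groups drawn from the same distribution, mimicking the "no bias" finding); the numbers are illustrative, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

# Simulated expected grades on a 4-point scale for respondents and
# nonrespondents to one evaluation method; the group sizes roughly
# mirror the in-class condition's high response rate.
respondents = rng.normal(loc=3.0, scale=0.6, size=380)
nonrespondents = rng.normal(loc=3.0, scale=0.6, size=30)

# Pooled-variance two-sample t test; df = n1 + n2 - 2.
t, p = ttest_ind(respondents, nonrespondents)
print(f"t({len(respondents) + len(nonrespondents) - 2}) = {t:.2f}, p = {p:.2f}")
```

Under the paper's criterion, a p value above .10 would be read as no evidence that respondents and nonrespondents differ on this factor.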
Discussion and Conclusions

Collecting teaching evaluations via the Internet clearly has advantages over the traditional, in-class method of data collection: On-line data collection eliminates paper costs, requires less class time, permits efficient processing of the data, is less vulnerable to influence by the faculty, and can be a fast, easy, and convenient method for students to submit their evaluations. Despite these advantages, the on-line method will not be widely adopted by universities if it cannot overcome some serious hurdles. It was shown in this survey and others that the on-line method produces a lower response rate to the teaching evaluations than the traditional paper-and-pencil method. The lower response rate is most likely related to factors such as fear that responses to the on-line survey may not be anonymous and that the on-line method can be inconvenient, time-consuming, and prone to technical problems. Moreover, as shown in Table 1, the on-line survey may be the victim of student apathy, as the majority of the nonrespondents to the on-line survey forgot or missed the deadline date.

Many of the impediments to on-line response can be dealt with effectively over time. As students become more computer literate and have greater access to computers, it will be less likely that inconvenience and technical problems will inhibit survey response. There are a number of ways that professors may be able to enhance survey response. First, they can provide a live demonstration of how to submit an on-line response, to reduce any computer-related fears and reinforce the importance of responding. Second, if a professor has the available class time and access to a bank of personal computers, he or she could use class time to direct students to a computer laboratory where they could submit their evaluations. However, this remedy does defeat some of the advantages of the on-line method. Third, to ensure that students do not forget or miss an on-line survey deadline date, professors could remind students at the end of each class of the deadline date and of the importance of the evaluations. Finally, professors might be able to use either positive or negative incentives to increase on-line response rates. For example, students could be told that if they bring the professor a validation slip indicating that they have submitted an evaluation on-line, they will be given a reward such as extra credit or early notification of their course grade.

Perhaps the greatest obstacle for the on-line method is how to deal with students' fear that their response to the on-line survey may not be anonymous. Much of this fear no doubt stems from the fact that students must submit their student identification number when accessing the on-line survey.
Although students are told that their identification number will never be associated with their response, they may not believe it. It may take some time before all students trust the integrity of the on-line survey procedures. In the future, it may be possible for students to submit their on-line evaluation without having to provide any identifying information. This could prove to be a tricky procedure, however, because there must be a way to keep track of each student's submission if students are to prove to their professor that they provided a response. Moreover, the data collection system must have a way to block on-line evaluations from people who are not in the professor's class as well as to prevent multiple evaluations from the same student.

Though there was no evidence of nonresponse bias in either the in-class or on-line teaching evaluations, one should note that we assessed nonresponse bias by examining only three variables in the post-evaluation survey. Had more variables been involved in this analysis, it is possible that other conclusions could have been drawn. It is also important to note that the post-evaluation survey may have suffered from its own form of nonresponse bias, because valid surveys were received from only 62% of the students enrolled in the classes. Had more students responded to the post-evaluation survey, the results might have been slightly different.

REFERENCES

Abbott, R. D., Wulff, D. H., Nyquist, J. D., Ropp, V. A., & Hess, C. W. (1990). Satisfaction with processes of collecting student opinions about instruction: The student perspective. Journal of Educational Psychology, 82.
Baum, P., Chapman, K., Dommeyer, C., & Hanna, R. (2001, June). On-line versus in-class student evaluations of faculty. Paper presented at the Hawaii Conference on Business, Honolulu.
Blunt, A. (1991). The effects of anonymity and manipulated grades on student ratings of instructors. Community College Review, 18.
Brandenburg, G. C., & Remmers, H. H. (1927). A rating scale for instructors. Educational Administration and Supervision, 13.
Cashin, W. E. (1988). Student rating of teaching: A summary of the research. Manhattan, KS: Center for Faculty Evaluation and Development, Kansas State University.
Feldman, K. A. (1979). The significance of circumstances for college students' ratings of their teachers and courses. Research in Higher Education, 10.
Hewett, L., Chastain, G., & Thurber, S. (1988). Course evaluations: Are student ratings dictated by first impressions? Paper presented at the Annual Meeting of the Rocky Mountain Psychological Association, Snowbird, UT.
Layne, B. H., DeCristoforo, J. R., & McGinty, D. (1999). Electronic versus traditional student ratings of instruction. Research in Higher Education, 40(2).
Malhotra, N. K. (1999). Marketing research: An applied orientation (3rd ed.). Upper Saddle River, NJ: Prentice Hall.
Marlin, J. W. (1987). Student perception of end-of-course evaluations. Journal of Higher Education, 58.
Marsh, H. W., & Dunkin, M. J. (1992). Students' evaluations of university teaching: A multidimensional perspective. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. 8). New York: Agathon Press.
Nimmer, J. G., & Stone, E. F. (1991). Effects of grading practices and time of rating on student ratings of faculty performance and student learning. Research in Higher Education, 32.
Rosenfeld, P., Booth-Kewley, S., & Edwards, J. E. (1993). Computer-administered surveys in organizational settings: Alternatives, advantages, and applications. American Behavioral Scientist, 36(4).
Sproull, L., & Kiesler, S. (1991). Connections: New ways of working in the networked organization. Cambridge, MA: MIT Press.
Wachtel, H. K. (1994). A critique of existing practices for evaluating mathematics instruction (Doctoral dissertation, University of Illinois at Chicago, 1994). Dissertation Abstracts International, 56(01A).
Wachtel, H. K. (1998). Student evaluation of college teaching effectiveness: A brief review. Assessment and Evaluation in Higher Education, 23(2).
Williams, W. M., & Ceci, S. J. (1997). "How'm I doing?" Problems with student ratings of instructors and courses. Change, 29(5).
More informationIssues in Information Systems Volume 13, Issue 2, pp. 193-200, 2012
EXPECTED ADVANTAGES AND DISADVANTAGES OF ONLINE LEARNING: PERCEPTIONS FROM COLLEGE STUDENTS WHO HAVE NOT TAKEN ONLINE COURSES Melody W. Alexander, Ball State University, malexander@bsu.edu Allen D. Truell,
More informationAlternative Online Pedagogical Models With Identical Contents: A Comparison of Two University-Level Course
The Journal of Interactive Online Learning Volume 2, Number 1, Summer 2003 www.ncolr.org ISSN: 1541-4914 Alternative Online Pedagogical Models With Identical Contents: A Comparison of Two University-Level
More informationStudents beliefs and attitudes about a business school s academic advising process
Students beliefs and attitudes about a business school s academic advising process ABSTRACT M. Wayne Alexander Minnesota State University Moorhead Deborah Kukowski Minnesota State University Moorhead Lee
More informationINTENSIVE TEACHING IN LAW SUBJECTS
INTENSIVE TEACHING IN LAW SUBJECTS Ian Ramsay* Abstract The use of intensive teaching is increasing in Australian law schools. For some Australian law schools, most of their masters subjects are now taught
More informationUW Colleges Student Motivations and Perceptions About Accelerated Blended Learning. Leanne Doyle
UW Colleges Student Motivations and Perceptions About Accelerated Blended Learning Leanne Doyle Abstract: Nationwide, both online and blended learning enrollments are causing a paradigm shift in higher
More informationBlended Learning Technology: Connecting with the Online-All-the-Time Student
Blended Learning Technology: Connecting with the Online-All-the-Time Student Table of Contents 03 Executive Summary 05 Survey Method 06 Respondent Profi les 07 Instruction delivered through a blended learning
More informationKNOWLEDGE AND PERCEPTIONS OF VIRGINIA SECONDARY AGRICULTURE EDUCATORS TOWARD THE AGRICULTURAL TECHNOLOGY PROGRAM AT VIRGINIA TECH
KNOWLEDGE AND PERCEPTIONS OF VIRGINIA SECONDARY AGRICULTURE EDUCATORS TOWARD THE AGRICULTURAL TECHNOLOGY PROGRAM AT VIRGINIA TECH Dennis W. Duncan, Assistant Professor Virginia Tech Abstract Identifying
More informationVirtual Teaching in Higher Education: The New Intellectual Superhighway or Just Another Traffic Jam?
Virtual Teaching in Higher Education: The New Intellectual Superhighway or Just Another Traffic Jam? Jerald G. Schutte California State University, Northridge email - jschutte@csun.edu Abstract An experimental
More informationIMPLEMENTATION OF ONLINE FORUMS IN GRADUATE EDUCATION
IMPLEMENTATION OF ONLINE FORUMS IN GRADUATE EDUCATION Walkyria Goode ESPAE-Graduate School of Management Escuela Superior Politécnica del Litoral Abstract 1 This paper argues that online forums in graduate
More informationPsychology. Administered by the Department of Psychology within the College of Arts and Sciences.
Psychology Dr. Spencer Thompson, Professor, is the Chair of Psychology and Coordinator of Child and Family Studies. After receiving his Ph.D. in Developmental Psychology at the University of California,
More informationA Class Project in Survey Sampling
A Class Project in Survey Sampling Andrew Gelman and Deborah Nolan July 1, 2001 Courses in quantitative methods typically require students to analyze previously collected data. There is great value in
More informationMSU IDEA Pilot Study Preliminary Results
MSU IDEA Pilot Study Preliminary Results This document was a quick attempt to share what was learned from the pilot of a new course evaluation form created and administered by the IDEA Center (http://ideaedu.org/services/student-ratings).
More informationStudent Engagement Strategies in One Online Engineering and Technology Course
Paper ID #7674 Student Engagement Strategies in One Online Engineering and Technology Course Dr. Julie M Little-Wiles, Purdue School of Engineering and Technology, IUPUI Dr. Julie Little-Wiles is a Visiting
More informationTitle: Transforming a traditional lecture-based course to online and hybrid models of learning
Title: Transforming a traditional lecture-based course to online and hybrid models of learning Author: Susan Marshall, Lecturer, Psychology Department, Dole Human Development Center, University of Kansas.
More informationEffectiveness of Online Instruction
Effectiveness of Online Instruction Delar K. Singh, Ph.D. Associate Professor Department of Education Eastern Connecticut State University Willimantic, CT 06226 E-mail: singhd@easternct.edu Paper presented
More informationFirst-time Online Instructors Use of Instructional Technology in the Face-to-face Classroom. Heather E. Arrowsmith. Kelly D.
Running Head: USE OF INSTRUCTIONAL TECH First-time Online Instructors Use of Instructional Technology in the Face-to-face Classroom Heather E. Arrowsmith Kelly D. Bradley 1 University of Kentucky 1 Use
More informationStudent Interpretations of the Scale Used in the BYU Online Student Ratings Instrument. Paul Fields, PhD Department of Statistics
Student Interpretations of the Scale Used in the BYU Online Student Ratings Instrument Paul Fields, PhD Department of Statistics Trav Johnson, PhD Center for Teaching and Learning Student Interpretations
More informationOnline Collection of Midterm Student Feedback
9 Evaluation Online offers a variety of options to faculty who want to solicit formative feedback electronically. Researchers interviewed faculty who have used this system for midterm student evaluation.
More informationThe Examination of Strength and Weakness of Online Evaluation of Faculty Members Teaching by Students in the University of Isfahan
The Examination of Strength and Weakness of Online Evaluation of Faculty Members Teaching by Students in the University of Isfahan Ansary Maryam PhD Scholar, Philosophy of Education, Faculty of Educational
More informationCOMPARISON OF INTERNET AND TRADITIONAL CLASSROOM INSTRUCTION IN A CONSUMER ECONOMICS COURSE
Journal of Family and Consumer Sciences Education, Vol. 20, No. 2, Fall/Winter 2002 COMPARISON OF INTERNET AND TRADITIONAL CLASSROOM INSTRUCTION IN A CONSUMER ECONOMICS COURSE Debbie Johnson Southeastern
More informationHow To Determine Teaching Effectiveness
Student Course Evaluations: Key Determinants of Teaching Effectiveness Ratings in Accounting, Business and Economics Courses at a Small Private Liberal Arts College Timothy M. Diette, Washington and Lee
More informationPresented at the 2014 Celebration of Teaching, University of Missouri (MU), May 20-22, 2014
Summary Report: A Comparison of Student Success in Undergraduate Online Classes and Traditional Lecture Classes at the University of Missouri Presented at the 2014 Celebration of Teaching, University of
More informationInternational Conference on Communication, Media, Technology and Design. ICCMTD 09-11 May 2012 Istanbul - Turkey
OVERCOMING COMMUNICATION BARRIERS IN ONLINE TEACHING: UNDERSTANDING FACULTY PREFERENCES Ertunga C. Ozelkan and Agnes Galambosi Systems Engineering & Engineering Management University of North Carolina
More informationOnline Collection of Student Evaluations of Teaching
Online Collection of Student Evaluations of Teaching James A. Kulik Office of Evaluations and Examinations The University of Michigan December 5, 2005 1 More than a dozen universities now collect all their
More informationTeaching and Encouraging Meaningful Reading of Mathematics Texts
Teaching and Encouraging Meaningful Reading of Mathematics Texts Jane Friedman, Perla Myers and Jeff Wright Department of Mathematics and Computer Science University of San Diego 5998 Alcala Park, San
More informationGRADUATE STUDENT SATISFACTION WITH AN ONLINE DISCRETE MATHEMATICS COURSE *
GRADUATE STUDENT SATISFACTION WITH AN ONLINE DISCRETE MATHEMATICS COURSE * Amber Settle DePaul University 243 S. Wabash Avenue Chicago, IL 60604 (312) 362-5324 asettle@cti.depaul.edu Chad Settle University
More informationQ1 How do you prefer to access your online classes? Check all that apply. 18% 53% 17% 12% 0 90 2.2
Summer Session III 2014 Distance Learning Aggregate Survey Results In order to strengthen UWG s distance and distributed learning programs, separate evaluations have been developed for distance courses.
More informationJames E. Bartlett, II is Assistant Professor, Department of Business Education and Office Administration, Ball State University, Muncie, Indiana.
Organizational Research: Determining Appropriate Sample Size in Survey Research James E. Bartlett, II Joe W. Kotrlik Chadwick C. Higgins The determination of sample size is a common task for many organizational
More informationFace to face or Cyberspace: Analysis of Course Delivery in a Graduate Educational Research Course
Face to face or Cyberspace: Analysis of Course Delivery in a Graduate Educational Research Course Robert S. Legutko DeSales University Center Valley, PA USA RSL2@desales.edu Abstract Student attitudes
More informationDo Students Understand Liberal Arts Disciplines? WHAT IS THE EDUCATIONAL PURPOSE of the curricular breadth encouraged at liberal arts institutions?
One important educational outcome should be for students to develop accurate perceptions of the disciplines they study DONALD E. ELMORE, JULIA C. PRENTICE, AND CAROL TROSSET Do Students Understand Liberal
More informationProcrastination in Online Courses: Performance and Attitudinal Differences
Procrastination in Online Courses: Performance and Attitudinal Differences Greg C Elvers Donald J. Polzella Ken Graetz University of Dayton This study investigated the relation between dilatory behaviors
More informationCHAPTER 1 INTRODUCTION. stakeholders, and a subject that goes beyond the world of researchers, given its social,
CHAPTER 1 INTRODUCTION The evaluation of teaching in higher education is an area of strong interest for different stakeholders, and a subject that goes beyond the world of researchers, given its social,
More informationReproductions supplied by EDRS are the best that can be made from the original document.
DOCUMENT RESUME ED 448 722 IR 020 485 AUTHOR Johnson, Scott D.; Aragon, Steven R.; Shaik, Najmuddin; Palma-Rivas, Nilda TITLE Comparative Analysis of Online vs. Face-to-Face Instruction. PUB DATE 1999-10-00
More informationOn-line student feedback: a pilot study
On-line student feedback: a pilot study Liz Barnettjane Galbraith, Paul Gee, Fran Jennings and Ron Riley London School of Economics and Political Science email: l.barnett@lse.ac.uk This paper reports on
More informationBuilding Online Learning Communities: Factors Supporting Collaborative Knowledge-Building. Joe Wheaton, Associate Professor The Ohio State University
For more resources click here -> Building Online Learning Communities: Factors Supporting Collaborative Knowledge-Building Joe Wheaton, Associate Professor David Stein, Associate Professor Jennifer Calvin,
More informationPRACTICUM HANDBOOK. 2008 Community and College Student Development. The College of Education & Human Development UNIVERSITY OF MINNESOTA
2008 Community and College Student Development 2009 PRACTICUM HANDBOOK The College of Education & Human Development UNIVERSITY OF MINNESOTA Department of Educational Psychology Counseling and Student Personnel
More informationInternet classes are being seen more and more as
Internet Approach versus Lecture and Lab-Based Approach Blackwell Oxford, TEST Teaching 0141-982X Journal Original XXXXXXXXX 2008 The compilation UK Articles Statistics Publishing Authors Ltd 2008 Teaching
More informationCOMMUNITY COLLEGE COMPRESSED CALENDARS: RESULTS OF A STUDENT SURVEY AND A FACULTY SURVEY 1
COMMUNITY COLLEGE COMPRESSED CALENDARS: RESULTS OF A STUDENT SURVEY AND A FACULTY SURVEY 1 Michael Carley 2 Porterville College Abstract Many community colleges are considering changes in their traditional
More informationThe Inventory of Male Friendliness in Nursing Programs (IMFNP)
The Inventory of Male Friendliness in Nursing Programs (IMFNP) Background At the 2001 annual conference of the American Assembly for Men in Nursing (AAMN), a nursing student discussed his educational experiences
More informationStudent Mood And Teaching Evaluation Ratings
Student Mood And Teaching Evaluation Ratings Mary C. LaForge, Clemson University Abstract When student ratings are used for summative evaluation purposes, there is a need to ensure that the information
More informationRunning Head: COMPARISON OF ONLINE STUDENTS TO TRADITIONAL 1. The Comparison of Online Students Education to the
Running Head: COMPARISON OF ONLINE STUDENTS TO TRADITIONAL 1 The Comparison of Online Students Education to the Traditional Students of Classroom Education Brent Alan Neumeier The University of Arkansas
More informationTest Anxiety, Student Preferences and Performance on Different Exam Types in Introductory Psychology
Test Anxiety, Student Preferences and Performance on Different Exam Types in Introductory Psychology Afshin Gharib and William Phillips Abstract The differences between cheat sheet and open book exams
More informationAssessing faculty performance using student evaluations of teaching in an uncontrolled setting
Assessment & Evaluation in Higher Education Vol. 35, No. 4, July 2010, 463 475 Assessing faculty performance using student evaluations of teaching in an uncontrolled setting Clifford Nowell*, Lewis R.
More informationTraditional In-class Examination vs. Collaborative Online Examination in Asynchronous Learning Networks: Field Evaluation Results
Traditional In-class Examination vs. Collaborative Online Examination in Asynchronous Learning Networks: Field Evaluation Results Jia Shen (jxs1866@njit.edu) Kung-E Cheng (kc37@njit.edu) Michael Bieber
More informationUniversity Undergraduate Teaching Quality 3.12. Chapter 3 Section. Background. Ministry of Training, Colleges and Universities
Chapter 3 Section 3.12 Ministry of Training, Colleges and Universities University Undergraduate Teaching Quality Background Ontario s 20 publicly assisted universities offer graduate and undergraduate
More informationThe HRM program, originally known as the Personnel and Administration Studies program, began to accept students in Fall 1981.
HUMAN RESOURCES MANAGEMENT PROGRAM Senate Undergraduate Council Report to Senate April 16, 2012 Attachment #1 Review Process The previous self study of the Human Resources Management (HRM) program was
More informationTraditional Lecture Format Compared to Computer-Assisted Instruction in Pharmacy Calculations
Traditional Lecture Format Compared to Computer-Assisted Instruction in Pharmacy Calculations Jeffrey C. Delafuente, Oscar E. Araujo and Sue M. Legg College of Pharmacy, Health Science Center, PO Box 100486,
More informationA Study of Barriers to Women in Undergraduate Computer Science
A Study of Barriers to Women in Undergraduate Computer Science Abstract Greg Scragg & Jesse Smith SUNY Geneseo Dept. of Computer Science Geneseo, NY 14454, USA scragg@cs.geneseo.edu jds97@cs.geneseo.edu
More informationThe Role of Community in Online Learning Success
The Role of Community in Online Learning Success William A. Sadera Towson University Towson, MD 21252 USA bsadera@towson.edu James Robertson University of Maryland University College Adelphia, MD USA Liyan
More informationHigher Performing High Schools
COLLEGE READINESS A First Look at Higher Performing High Schools School Qualities that Educators Believe Contribute Most to College and Career Readiness 2012 by ACT, Inc. All rights reserved. A First Look
More informationAgriculture Teachers' Attitudes toward Adult Agricultural Education in Ohio Comprehensive High Schools
Agriculture Teachers' Attitudes toward Adult Agricultural Education in Ohio Comprehensive High Schools Yung-Chul Kim, Graduate Student Department of Human and Community Resource Development The Ohio State
More informationOnline Course Evaluation and Analysis
Session 1793 Online Course Evaluation and Analysis Li Bai Saroj Biswas Department of Electrical and Computer Engineering College of Engineering Temple University Philadelphia, PA19122 lbai@temple.edu sbiswas@temple.edu
More informationKnowledge Management & E-Learning
Knowledge Management & E-Learning, Vol.5, No.3. Sep 2013 Knowledge Management & E-Learning ISSN 2073-7904 A brief examination of predictors of e-learning success for novice and expert learners Emily Stark
More informationSTUDENTS PERCEPTIONS OF ONLINE LEARNING AND INSTRUCTIONAL TOOLS: A QUALITATIVE STUDY OF UNDERGRADUATE STUDENTS USE OF ONLINE TOOLS
STUDENTS PERCEPTIONS OF ONLINE LEARNING AND INSTRUCTIONAL TOOLS: A QUALITATIVE STUDY OF UNDERGRADUATE STUDENTS USE OF ONLINE TOOLS Dr. David A. Armstrong Ed. D. D Armstrong@scu.edu ABSTRACT The purpose
More informationConcordia University Course Evaluation Survey: Summary Report
Concordia University Course Evaluation Survey: Summary Report Introduction In fall 2010, the Vice-Provost, Teaching and Learning engaged the Institutional Planning Office to conduct a survey of student
More informationStudent Perceptions of Online Learning: A Comparison of Two Different Populations
Student Perceptions of Learning: A Comparison of Two Different Populations Catharina Daniels 1 Pace University School of Computer Science and Information Systems Technology Systems Department New York,
More informationOnsite Peer Tutoring in Mathematics Content Courses for Pre-Service Teachers
IUMPST: The Journal. Vol 2 (Pedagogy), February 2011. [www.k-12prep.math.ttu.edu] Onsite Peer Tutoring in Mathematics Content Courses for Pre-Service Teachers Elaine Young Associate Professor of Mathematics
More informationEffectiveness of Flipped learning in Project Management Class
, pp. 41-46 http://dx.doi.org/10.14257/ijseia.2015.9.2.04 Effectiveness of Flipped learning in Project Management Class Jeong Ah Kim*, Hae Ja Heo** and HeeHyun Lee*** *Department of Computer Education
More informationAnswers to Faculty Concerns About Online Versus In- class Administration of Student Ratings of Instruction (SRI)
Answers to Faculty Concerns About Online Versus In- class Administration of Student Ratings of Instruction (SRI) The posting below compares online student ratings of instructors with in- class ratings.
More informationThe coach-team approach: An introductory accounting instructional alternative
ABSTRACT The coach-team approach: An introductory accounting instructional alternative Lynette I. Wood Winston-Salem State University Many students approach the introductory accounting course with a great
More informationPERCEPTIONS OF IOWA SECONDARY SCHOOL PRINCIPALS TOWARD AGRICULTURAL EDUCATION. Neasa Kalme, Instructor Hamilton, Indiana
PERCEPTIONS OF IOWA SECONDARY SCHOOL PRINCIPALS TOWARD AGRICULTURAL EDUCATION Neasa Kalme, Instructor Hamilton, Indiana James E. Dyer, Assistant Professor University of Missouri Abstract The primary purpose
More informationHigh School Counselors Influence
High School Counselors Influence Katey O Donnell and Katie Logan Undergraduate Students, Human Development and Family Studies Key Words: College freshmen, school counselors, high school, adequately prepared
More informationLangara College 2009 Current Student Survey Report
Langara College 2009 Current Student Survey Report Office of Institutional Research Langara College May 3, 2010 TABLE OF CONTENTS SURVEY SAMPLE AND METHODOLOGY... 1 Table 1: Characteristics of Student
More informationPredictors of student preference for online courses
VOLUME 2, NUMBER 1, 2013 Predictors of student preference for online courses Louis Charles Glover, EdD The University of Tennessee at Martin Veronica Evans Lewis, PhD The University of Louisiana at Monroe
More informationRunning head: THE EFFECTS OF EXTRA-CURRICULAR ACTIVITIES
Extra-Curricular Activities 1 Running head: THE EFFECTS OF EXTRA-CURRICULAR ACTIVITIES The Effects of Extra-Curricular Activities on Student s Perceived Academic Self-Efficacy Extra-Curricular Activities
More informationOpen and Distance Learning Student Retention: A Case Study of the University of Papua New Guinea Open College
Open and Distance Learning Student Retention: A Case Study of the University of Papua New Guinea Open College INTRODUCTION Prof. Dr. Abdul Mannan, University of Papua new Guinea Open College, mannanma@upng.ac.pg
More informationAttitudes Toward Science of Students Enrolled in Introductory Level Science Courses at UW-La Crosse
Attitudes Toward Science of Students Enrolled in Introductory Level Science Courses at UW-La Crosse Dana E. Craker Faculty Sponsor: Abdulaziz Elfessi, Department of Mathematics ABSTRACT Nearly fifty percent
More informationDeveloping and Teaching a Hybrid Software Engineering Introductory Course
Developing and Teaching a Hybrid Software Engineering Introductory Course Anna Koufakou 1 Florida Gulf Coast University Abstract This paper summarizes the author s experiences in developing and teaching
More informationStudent Involvement in Computer-Mediated Communication: Comparing Discussion Posts in Online and Blended Learning Contexts
The 1 st International Conference on Virtual Learning, ICVL 2006 113 Student Involvement in Computer-Mediated Communication: Comparing Discussion Posts in Online and Blended Learning Contexts Amy M. Bippus
More informationATTITUDES OF ILLINOIS AGRISCIENCE STUDENTS AND THEIR PARENTS TOWARD AGRICULTURE AND AGRICULTURAL EDUCATION PROGRAMS
ATTITUDES OF ILLINOIS AGRISCIENCE STUDENTS AND THEIR PARENTS TOWARD AGRICULTURE AND AGRICULTURAL EDUCATION PROGRAMS Edward W. Osborne, Professor University of Florida James E. Dyer, Assistant Professor
More informationFindings from the California Community College Flashlight Project 1998-99
Findings from the California Community College Flashlight Project 1998-99 Susan S. Obler, Ph.D. Project Director, Flashlight Project Rio Hondo College Robert S. Gabriner, Ed.D. Director Research, Planning
More informationCosts to Instructors in Delivering Equated Online and On-campus Courses. Dale Shaw University of Northern Colorado
The Journal of Interactive Online Learning Volume 1, Number 4, Spring 2003 www.ncolr.org ISSN: 1541-4914 Costs to Instructors in Delivering Equated Online and On-campus Courses Dale Shaw University of
More information