ONLINE COURSE EVALUATIONS

Table of Contents

Introduction
Review of published research literature
Benefits of online evaluations
Administration costs
Data quality
Student accessibility
Faculty influence
Online response rates
Comparing online and paper response rates
Potential impact of low response rates
Changes in response rate over time
Correlation of online and paper results
Comparing online and paper results
Potential bias in results
Data collection methods
Improving online response rates
Determinants of response rate
Grade incentives
Other incentives
Instructor encouragement
Training and communication
Institutional environment
Appendix A: Data and Charts
Appendix B: References


Introduction

Most academic institutions regularly conduct student evaluations of faculty teaching performance. Since the results of these evaluations are often used to make promotion, tenure, and merit pay decisions, the topic can generate significant controversy among faculty. Most early research focused on the reliability and validity of the questions or the appropriate use of the results, but with the advent of online evaluations, there is also an increasing need to address how the evaluation data is collected.

In a typical online evaluation, students are provided with a web site address where they can access the survey instrument. Before giving their responses, students are informed that instructors will not have access to any student's individual responses and that instructors will receive the results of the evaluation only after final grades have been posted. After students log on to the online system, typically using a student ID number, they can indicate their responses to multiple-choice items and type their answers to open-ended questions. After students submit their responses, they receive a printable confirmation that they have completed the evaluation. Students are generally given at least two weeks to provide their evaluations, usually near the end of the term.

Schools usually consider switching from paper evaluations to online evaluations because they are easier to administer and the results are easier to analyze. Online evaluations also provide a feedback mechanism that is universally accessible to students and less susceptible to faculty influence. Despite these potential advantages, two major concerns are frequently raised regarding online evaluation systems:

- Response rates may be significantly lower than paper evaluations
- Results may be biased and therefore significantly different than paper evaluations

These two concerns are related, but they must be evaluated separately. No assumptions can be made about the results of an evaluation based solely on the response rate. Furthermore, even when the results are different, the smaller sample is not always biased. Relatively small samples may still be an accurate, unbiased representation of the students in the class. Ultimately, the main concern is not whether the response rate is lower or even whether the responses are different, but whether the evaluation method has somehow affected the validity of the results.

Assessing the feasibility of online evaluations involves answering five fundamental questions:

1. Is the response rate for online evaluations significantly lower than paper evaluations?
2. Do online evaluations provide a representative sample of students in the class?
3. Is it possible to increase the response rate for online evaluations?
4. Are the results for online evaluations significantly different than paper evaluations?
5. Are the results for online evaluations biased?

In most published research studies, the response rate for online evaluations is lower than paper evaluations. However, in most cases, the response rate increased considerably after the research study was completed and online evaluations were officially adopted. Higher response rates may be a natural byproduct of ongoing implementation activities and increased support among students and faculty, but specific strategies can also be employed to boost response rates. Grade incentives are particularly effective, although some instructors may consider other incentives more appropriate. Even if specific incentives are not offered, response rates can be improved by repeatedly encouraging students to participate and designing the evaluation system to facilitate student responses. With proper administrative procedures and various incentives to encourage student participation, many schools have achieved online response rates that are equivalent to paper.

In addition, none of the studies found that a lower response rate or any other characteristic of the evaluation method caused the results to be invalid. The ratings obtained with online evaluations were similar to paper ratings, and there was no evidence that the online ratings were biased.

Online course evaluations are also known as online student ratings and online surveys. The term "online" is considered synonymous with "electronic," and the term "evaluation" is synonymous with "assessment"; for example, "electronic student assessments" would be considered the same as "online course evaluations." Also, the term "online" does not imply that the evaluation is conducted on the Internet or a web-based university computer system. The evaluation could also be administered with clickers, interactive voice response (IVR) systems, or other technology that does not require the student to use a computer.

Review of Published Research Literature

The Office of University Planning and Analysis (UPA) at North Carolina State University maintains a large list of Internet resources for higher education outcomes assessment. The list includes sections for the following specific topics:

- Student assessment of courses and faculty
- Using the Web for student evaluation of courses and faculty
- Comparing online and paper evaluation systems

The list includes a link to the Online Student Evaluation of Teaching in Higher Education (OnSET) site maintained by Brigham Young University, which contains an extensive bibliography of articles related to online student evaluations. To identify the most useful articles, the OnSET bibliography was cross-referenced against the bibliographies from three well-regarded research studies conducted at the following institutions:

- Augsburg College (Scott Krajewski and Diane Pike)
- McGill University (Dr. Laura Winer and Rittu Sehgal)
- Murdoch University (David Collings and Christina Ballantyne)

Six articles are referenced in two of these bibliographies (none are referenced in all three). The OnSET bibliography references all six of the most popular articles, so it is considered a comprehensive list of relevant articles on the topic. Each of the other bibliographies references four of the six articles, so the other bibliographies are considered an equal cross-section of relevant articles.

The McGill University study was chosen as the best starting point. It contains fewer references, but the references are considered an adequate cross-section of the most popular articles. The McGill University study is also the most recent: it was published in April 2006, the Augsburg College study was published in April 2005, and the Murdoch University study was published in November of an earlier year. The bibliographies for the McGill University and Augsburg College studies contain useful summaries of each reference, while the Murdoch University study does not.

Benefits of Online Evaluations

Administration Costs

Online evaluations are usually easier and cheaper to administer than paper evaluations, since they use less paper and require less time from highly paid faculty and administrators. Faced with shrinking budgets, academic institutions are seeking more efficient ways to operate under increasingly constrained administrative support resources. Online technology offers large efficiencies in the execution of course evaluations and other repetitive administrative tasks, especially for departments with significant undergraduate enrollment in large basic service courses.

Once an online evaluation system is established, many of the costs associated with traditional evaluations can be avoided, including the cost of printing, distributing, collecting, scanning, and storing the paper forms, typing student responses to open-ended questions, and delivering hard-copy summary reports to faculty. Larger-scale evaluations could generate even more cost savings, since the variable costs associated with online evaluations are minimal or nonexistent.

Data Quality

Online evaluations usually improve the quality of student responses, thereby improving the timeliness and overall value of the analysis that is performed and the reports that are distributed. Most online evaluation systems include features that reduce or eliminate errors in completed evaluations. For example, the system can verify that all required questions have been completed and that each response is in the correct format. Responses to open-ended questions are also easier for faculty and administrators to work with, since handwriting is not an issue and grammar mistakes are less likely. Still, written comments usually need to be categorized before they can be adequately analyzed.

Another potential advantage of the online method is that it provides greater flexibility in the design of the evaluation. Some online evaluation systems allow instructors to generate questions specifically designed for their courses and to use complicated skipping and branching patterns that are not possible with paper evaluations.
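
As a concrete illustration of this kind of automated checking, the following minimal Python sketch validates a submitted response before it is accepted. It is a simplified model with hypothetical question names and a 1-5 rating scale, not the interface of any particular evaluation product.

    # Minimal sketch of server-side response checking: required questions must
    # be answered, and rating answers must be integers on the 1-5 scale.
    # Question names and the form definition are hypothetical.
    FORM = {
        "q1_overall_rating":  {"type": "choice", "required": True},
        "q2_workload_rating": {"type": "choice", "required": True},
        "q3_comments":        {"type": "text",   "required": False},
    }

    def validate(response: dict) -> list[str]:
        """Return a list of error messages; an empty list means acceptance."""
        errors = []
        for name, spec in FORM.items():
            value = response.get(name)
            if value in (None, ""):
                if spec["required"]:
                    errors.append(f"{name}: an answer is required")
                continue
            if spec["type"] == "choice" and not (isinstance(value, int) and 1 <= value <= 5):
                errors.append(f"{name}: must be an integer from 1 to 5")
        return errors

    print(validate({"q1_overall_rating": 4}))                           # missing required item
    print(validate({"q1_overall_rating": 4, "q2_workload_rating": 6}))  # rating out of range

Skipping and branching patterns can be handled the same way, by attaching to each question a condition that decides whether it is shown at all.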

In research studies comparing online and in-class responses to open-ended questions, students provided much more information online. Paper evaluations often suffer from a lack of written comments, especially when students fill out the questionnaires at the end of class. Students are not constrained by time during an online session, so they can provide a more complete response. Also, since their responses are typed, online respondents do not have to worry about someone identifying their handwriting.

Relevant findings from published research literature:

- During the BYU study, 63% of the online forms contained written comments, compared to less than 10% of the paper forms. In addition, the online comments were generally longer.
- During the McGill University study, 70% of the online respondents submitted at least one written comment; 87% of the comments contained at least five words and 75% contained at least 10 words. No comparable data was available for paper evaluations, but the apparent increase in comment quality was consistent with previous literature and confirmed by anecdotal feedback from instructors.
- During the study at an unnamed university, 76% of the online evaluations contained written comments, compared to 50% of the paper evaluations.

Student Accessibility

Online evaluations are considered more accessible than paper evaluations, since students are not required to attend class to complete the evaluation. With paper evaluations conducted in the classroom, students have only one opportunity to provide their opinion: during the class period when the evaluations are distributed. With online methods of evaluation, students can provide their opinion over a much longer period of time.

However, if the online evaluation is implemented on the Internet or a university computer system, then students who do not have access to a computer will not be able to complete the evaluation. Most schools provide access to computers, but if students are not required to use the computer system for other course activities, then it may be unfair to require a computer for course evaluations. This is also true for clickers and other technology that could be used for course evaluations but may not be required for the course itself. Also, instructors may be able to overcome accessibility issues with paper evaluations by allowing students to submit their evaluation to the department outside of normal class time.

Faculty Influence

Online evaluations are less susceptible to faculty influence than in-class evaluations. Complaints about paper evaluations often include instructors manipulating ratings through their comments or actions during the evaluation or altering the responses prior to turning them in. With a typical in-class evaluation, a faculty member can perform an activity on the day of the evaluation that is designed to elicit a favorable response from students: for example, having a pizza party, announcing that the workload requirements have been reduced, or announcing a new way to earn extra credit. The mere presence of the faculty member during an in-class evaluation could affect a student's response, especially if the student fears that their response could be identified.

Online methods of evaluation are less susceptible to these influences, since the student responds to the online evaluation in an environment that is somewhat removed from the classroom experience. Moreover, since the faculty member does not have any contact with the completed forms, there is no opportunity for them to alter the data after it has been collected.

Online Response Rates

Comparing Online and Paper Response Rates

Despite the potential advantages of online evaluations, many instructors are hesitant about switching because they think it will drastically reduce the number of students responding to the evaluation. Paper evaluations are usually completed at one time, in the classroom, when there is little competition for the attention or time of the student. High response rates for paper evaluations may also be encouraged by perceived social pressure to respond, since the instructor is typically present while the evaluations are filled out.

On the other hand, online evaluations are usually completed during free time, in the personal space of the respondent, and are not subject to social pressure to respond. This allows students more time to fill out the form, but it also gives them more freedom to decide whether or not to fill it out at all.

In most published research studies, the response rate for online evaluations is significantly lower than paper evaluations. However, lower response rates don't always result in lower ratings. In many cases, the ratings are the same or even higher, even though the response rates are lower. As mentioned earlier, it is important to avoid evaluating the response rate by itself. If the response rate is lower, it is still necessary to evaluate the actual results. If the results are considered statistically valid, then any variation in the response rate is essentially irrelevant.

Relevant findings from published research studies:

- During the BYU pilot, the response rate was 50% for online evaluations and 71% for paper evaluations.
- During the Cornell University study, the response rate was 50% for online evaluations and 78% for paper evaluations. The response rate for online evaluations was lower than paper evaluations for every course, and the differences in response rates were all considered statistically significant.
- During the California State University study, the response rate was 43% for online evaluations and 75% for paper evaluations. The response rate for online evaluations was lower than paper evaluations for all but one of the instructors, and 10 of the 16 differences were considered statistically significant. When no incentive was offered for the online evaluation, the response rate was 29%, and all but one of the instructors received a significantly lower response rate than their paper evaluations.
- During the McGill University study, the response rate was 45% for online evaluations and 55% for paper evaluations. The response rate for online evaluations was lower than paper evaluations for approximately two-thirds of the courses.
- During the unnamed study, the response rate was 48% for online evaluations and 61% for paper evaluations.

Potential Impact of Lower Response Rates

The question commonly asked regarding the response rate is whether the sample is large enough for the results to be accurate. In other words, are there enough responses for the results to be a true representation of all students? These concepts are related to the confidence interval, which represents the level of certainty associated with the results. If the confidence interval is too large, then more data is needed before any definite conclusions can be reached. The approach typically involves determining whether online evaluation methods are accurate enough based on some minimum level of confidence.

In reality, it is almost impossible to determine whether the responders are a random sample from the full class. In comparing the responses for an online evaluation to a paper evaluation, it is very possible that neither sample is truly representative of the class. There is no way to tell without obtaining responses from all students. The most anyone can do is compare online with paper evaluations to determine if they are different. In other words, we can't determine if online evaluations are representative enough, but we can determine if they are less representative than paper.

Even then, while the response rate is an indicator of the representativeness of the responses, a lower response rate for online evaluations doesn't always mean that the samples obtained electronically are less representative than paper. In most published research studies, lower response rates did not result in wider confidence intervals or samples that were not representative.

Lower response rates can also have an impact on the evaluation of faculty by raising the standard error of estimates, which results in fewer statistically significant differences in performance. If one of the purposes of the evaluation is to test differences among instructors, then the lack of variation between scores could become problematic.

Relevant findings from published research studies:

- In the McGill University study, some confidence intervals were narrower and some were wider. In other words, there did not seem to be a systematic increase or decrease in precision with online evaluations.
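
To make the confidence-interval reasoning above concrete, the following Python sketch (with illustrative numbers only, not data from any of the studies) computes an approximate 95% confidence interval for a mean rating, applying a finite population correction since a class is a small, finite population. It assumes the respondents are a simple random sample of the class, which is precisely the assumption that cannot be verified in practice.

    import math

    def rating_confidence_interval(mean, sd, n_responses, class_size, z=1.96):
        """Approximate 95% confidence interval for a mean rating based on
        n_responses out of class_size students, with a finite population
        correction. Assumes respondents are a simple random sample of the
        class -- the unverifiable assumption discussed above."""
        fpc = math.sqrt((class_size - n_responses) / (class_size - 1))
        margin = z * (sd / math.sqrt(n_responses)) * fpc
        return mean - margin, mean + margin

    # Hypothetical class of 60 students with a mean rating of 4.1 (sd 0.8):
    print(rating_confidence_interval(4.1, 0.8, n_responses=26, class_size=60))  # 43% response rate
    print(rating_confidence_interval(4.1, 0.8, n_responses=45, class_size=60))  # 75% response rate

The second interval is noticeably narrower, which is the sense in which a higher response rate buys precision; it says nothing about whether either sample is representative.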

Changes in Response Rate over Time

Lower response rates documented during research studies are often temporary. There is evidence that once adopted, online evaluation systems will yield higher response rates over time. As students and faculty adjust to the new system, response rates may increase significantly, eventually nearing or exceeding the response rates observed with paper systems. On the other hand, students could be more likely to respond to the online evaluation if they know it is part of a formal research study. Either way, the response rate from the research study may not be an accurate indicator of the response rate that would be obtained after the system is fully implemented.

Relevant findings from published research studies:

- After a campus-wide implementation of online evaluations at BYU, response rates are currently approaching 70%, which is only 1% lower than paper evaluations.
- After online evaluations were adopted for all courses at Cornell University, the response rate increased in each successive semester. During the most recent semester included in the published report, the average response rate was 72%, which is only 6% lower than paper evaluations, and every course had a response rate greater than 50%.
- At McGill University, the online response rate has increased from 31% after the first pilot to 51% during the most recent semester included in the published report, which is only 4% lower than paper evaluations.

Correlation of Online and Paper Results

Comparing Online and Paper Results

As mentioned earlier, no assumptions can be made about the results of an online evaluation based solely on the response rate. Regardless of the response rate, online responses must be analyzed to determine whether they are qualitatively different from paper responses. The goal is to determine whether the evaluation method affected faculty ratings and other feedback obtained from the evaluation.

Many instructors believe that a different subset of students will respond to the online evaluation, compared to a paper evaluation distributed to the same students. Specifically, they think students who are particularly upset or disappointed with the instructor are more likely to participate. If so, the online evaluation method would yield a negative bias in the results, causing the average rating to be much lower than paper evaluations.

This type of response bias can also occur in both positive and negative directions at once. If students who really like the instructor and students who really dislike the instructor are both more likely to respond than other students, then the distribution of scores will be different than an unbiased evaluation, but the average score may turn out to be the same. In this situation, the distribution is bimodal, with one peak for high ratings and another peak for low ratings, rather than one peak centered on the overall mean. When compared to paper evaluations, the distribution of responses for online evaluations could also have one peak but more spread; in other words, the online evaluation is not completely biased toward the extremes, but it is more biased than the paper evaluation. Therefore, to fully determine whether an online evaluation is biased, the mean response, the distribution of responses, and the standard deviation of responses must all be evaluated.
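
That three-part comparison can be written down directly. The sketch below, using SciPy on made-up samples, applies a t-test to the means, a two-sample Kolmogorov-Smirnov test to the shape of the distributions, and Levene's test to the spread; the sample sizes and proportions are invented for illustration.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Invented five-point-scale ratings for one course under each method:
    paper  = rng.choice([1, 2, 3, 4, 5], size=45, p=[0.02, 0.05, 0.18, 0.45, 0.30])
    online = rng.choice([1, 2, 3, 4, 5], size=26, p=[0.05, 0.05, 0.15, 0.40, 0.35])

    t, p_mean   = stats.ttest_ind(paper, online, equal_var=False)  # difference in means
    ks, p_shape = stats.ks_2samp(paper, online)                    # difference in shape
    w, p_spread = stats.levene(paper, online)                      # difference in spread

    print(f"means:  paper={paper.mean():.2f}, online={online.mean():.2f}, p={p_mean:.3f}")
    print(f"shape:  KS={ks:.2f}, p={p_shape:.3f}")
    print(f"spread: W={w:.2f}, p={p_spread:.3f}")
    # A bimodal online sample can pass the mean test while failing the shape
    # and spread tests, which is why all three aspects must be examined.
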
Relevant findings from the Cornell University research study:

Method 1: Compare the average scores for online and paper evaluations. The scores for online and paper evaluations were similar, with only four of the questions having a statistically significant difference. Even when significant differences were found, in practical terms the scores were quite similar: for 9 of the 13 questions, the scores were within 0.1 of each other on a five-point scale. When scores were compared between different paper evaluations for the same instructor, the differences often exceeded this amount. For each question, the average online score was higher than the paper score. However, these aggregate measures can hide important differences across courses, since the courses were not the same size and each course was not evaluated the same number of times.

Method 2: Same as method 1, but compare each course separately. 51% of the comparison tests resulted in a negative value and 49% resulted in a positive value, where a negative value indicates that the online score was lower than the paper score and a positive value indicates that the online score was higher. Some of the individual values were statistically significant, but since the results were evenly split between positive and negative values, the cumulative value was not statistically significant. When the test results for each course were combined, six of the eight courses had significant differences in scores, but again, these differences were not in the same direction: some courses had significantly lower scores and some had significantly higher scores. Overall, four of the cumulative values were positive and four were negative. When the test results for each question were combined, there were no significant differences for any of the questions; six of the cumulative values were positive and seven were negative. When the course that was taught twice in the same semester was analyzed separately, there was no statistical difference between scores in one semester and a significant difference in the other semester. Again, the resulting values were of different signs: in one semester the instructor had higher online scores, and in the other semester higher paper scores.

When the results for this course were combined, the cumulative value was not statistically significant; 14 of the individual values were positive and 12 were negative.

Method 3: Compare the actual scores for all possible combinations of online and paper evaluations, rather than comparing the average scores for each evaluation method. 80% of the comparison tests resulted in a difference that was not statistically significant. Of the comparisons that were statistically significant, 51% had an online score that was higher than the paper score and 49% had an online score that was lower.

Relevant findings from other published research literature:

- During the BYU study, there was no evidence that lower response rates for online evaluations resulted in lower ratings. The overall course and instructor ratings were 0.1 point higher than paper evaluations on a seven-point scale. For 68% of the courses, the online rating was the same or higher than the paper rating; the other 32% had ratings within 0.1 to 0.5 points of the paper rating. The results for online evaluations were not highly correlated with the response rates, which suggests that online ratings are less susceptible to bias than paper ratings.
- During the California State University study, online evaluations did not produce significantly different mean scores than paper evaluations, even when different incentives were used. No significant variations were found for the eight instructors who used an incentive, indicating that there were no significant differences between their online and paper evaluations. Among the eight instructors who didn't use an incentive, only one showed a significant difference.
- During the McGill University study, there was no systematic tendency for results to be either higher or lower with online data collection, even when response rates were much lower. There was no significant difference in the mean rating, shape of distribution, or standard deviation for any of the courses.
- During the unnamed study, there was no significant effect for the method of data collection. The response distributions did not vary according to whether an online or paper evaluation was used.

Potential Impact of Biased Results

If the results for online evaluations are significantly different than paper evaluations, then the online ratings may be flawed, most likely due to a bias introduced by the evaluation method. However, biased responses are not completely unusable. Schools can choose to accept the bias, but only if all faculty members are using the same method or there is no need to compare results between methods. If the results are not correlated, then it is very difficult to compare an instructor's online ratings to their previous paper ratings. Teaching portfolios forming part of tenure and promotion files often consist almost exclusively of reports on standardized course evaluation scores from year to year. Many academic institutions may be slow to change their evaluation systems because of the significant cost of reconciling data generated before and after the changes are implemented. In addition, if some instructors are using online evaluations but others are not, then the results for those instructors can't be reliably compared.

The focus at this point is solely on bias that may exist, as identified by the variability of the results. The overall accuracy of the method is not in question, since the response rate has already been analyzed.
In addition, this form of bias is different from non-response bias, which involves determinants such as gender, expected grade in the class, and opinion of the instructor's teaching performance. Non-response bias, which is also known as response-rate bias, affects who responds, while the bias discussed here affects the results themselves.

Data Collection Methods

To avoid confusion during research studies comparing online evaluations to paper evaluations, students should not be asked to complete two evaluations for the same course. Students can be asked to complete an online form for one course and a paper form for another course, but they should not be asked to complete both forms for the same course. To obtain reliable data without confusing students, most published research studies use one of the following approaches:

- Split individual classes into two separate groups. Ask one group of students to complete an online evaluation and the other group to complete a paper evaluation.
- Identify instructors who are teaching multiple sections of the same course in the same term. Ask one section to complete an online form and the other section to complete a paper form.
- Identify instructors who are teaching the same course in different terms. If reliable data for paper evaluations is available from previous terms, the online evaluation can be administered in the current term and then compared to the previous term. Otherwise, the study must be conducted over two upcoming terms.
- Compare different courses taught by different instructors in the same term. This method may shorten the study, but it makes the analysis more difficult.

In selecting a method of data collection, the goal is to minimize any normal variation that exists between the two samples, so that most or all of the variation can be attributed to the evaluation method. In other words, if the results are different, we want to ensure that the difference is caused by the evaluation method, not some other factor. Normal variation can exist between any two samples that are not exactly the same, but choosing samples that are closely related will minimize it.

Two groups of students from the same class offer the clearest comparison, but this method is only feasible if the class is large enough. Normal variation increases with any other method of data collection, even if the evaluations are administered in multiple sections taught by the same instructor, since students in different sections of the same course may have significantly different characteristics. For example, students in an evening course may be markedly different from students in a daytime course. Variation can also occur due to the order in which the sections are taught: because of learning effects, instructors may consistently perform better in the second section. Still, comparing courses taught by the same instructor is generally better than comparing courses taught by different instructors, since normal variation can be very high between instructors. However, if historical data for paper evaluations is not already available, this method can greatly increase the length of the study, since new evaluations will have to be administered over two terms.

Improving Online Response Rates

Determinants of Response Rate

Understanding the determinants of online responses may help identify ways to increase response rates in the future. In some research studies, there appeared to be some predictability about who responds to online evaluations, based on factors such as class performance, gender, race, and class size. For example, during the Cornell University study, students who anticipated low final grades had a lower probability of submitting an evaluation than students who anticipated high final grades. This implies that instructors will receive less feedback from students who perform poorly. If those students are more likely to submit negative feedback, then the results could be biased in a positive direction, causing the instructor's rating to be artificially high. However, the relationship between performance and response rates may be equally strong for both online and paper evaluations.

Other factors that can influence the response rate for online evaluations include the length of the evaluation and the overall perception of anonymity. During the California State University study, students who complained about online evaluations were most likely to think that the evaluations were too time consuming or to fear that their responses were not anonymous. Some students felt that the integrity of the online system could be compromised, causing their ID number or other identifying information to be revealed with their responses. However, the lack of an anonymous response is also a concern for students using traditional paper evaluations, as they sometimes fear that the instructor will be able to identify their handwriting in answers to open-ended questions.

One way to assure students that their response is truly anonymous is to develop a set of access codes for the web site. In the classroom, the instructor could randomly distribute the access codes to students and explain that it is impossible for the access codes to be tied to a particular student. An official endorsement may also be effective in assuring students that online evaluations are confidential and anonymous. For example, institutional guarantees of confidentiality could be published, and endorsements could be provided by the student government.
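
A minimal sketch of the access-code idea follows, using Python's standard secrets module. The essential design point is that the codes are generated in bulk and handed out at random, so no record linking a code to a student ever exists; the code length and alphabet here are arbitrary choices.

    import secrets

    def generate_access_codes(count: int, length: int = 8) -> set[str]:
        """Generate unique random access codes for in-class distribution."""
        alphabet = "ABCDEFGHJKMNPQRSTUVWXYZ23456789"  # avoids look-alike characters
        codes = set()
        while len(codes) < count:
            codes.add("".join(secrets.choice(alphabet) for _ in range(length)))
        return codes

    codes = generate_access_codes(60)  # one per student, plus spares if desired
    # Server side, a code is marked as used when a response is submitted so it
    # cannot be reused, but the stored response carries no student identifier.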

Relevant findings from published research literature:

- During the BYU study, the length of the form did not appear to be an important factor in students' decisions whether to complete it, although there would undoubtedly be a threshold at some point. The longest form was 18 questions.
- During the Cornell University study, students who anticipated low final grades were less likely to submit an evaluation than students who anticipated high final grades. In addition, women were 18% more likely to submit an evaluation than men, Asian students were 12% less likely to submit an evaluation than students of other races, and students in larger classes were less likely to submit an evaluation than students in smaller classes.
- During the unnamed study, students with higher grade-point averages were more likely to complete an online evaluation than students with low grade-point averages. Of the five class levels studied, sophomores were most likely to respond and seniors least likely. Science students, including computer science students, were more likely to respond than students from any of the other five academic areas. The mean anonymity rating was significantly higher for paper evaluations than online evaluations, but high percentages of students in both groups (87% for paper and 72% for online) reported that they felt anonymous in completing the evaluation.

Grade Incentives

The most effective way to promote participation is to provide extra credit points to students who complete the online evaluation. When this type of incentive is used, response rates comparable to paper evaluations have been achieved. However, not all instructors are willing to use a grade incentive as an online response motivator. Some instructors believe that a grade incentive will bias the results in favor of students who are more concerned about their grades. Others argue that it is unethical to use a grade incentive, since a student's participation in a faculty evaluation should be voluntary and have no bearing on the student's grade. Instructors may also fear that grade incentives will attract responses from students who can't provide a fair evaluation because they rarely attend class.

Grade incentives require the instructor to determine which students completed the evaluation. To maintain confidentiality, the instructor must be able to obtain this information without seeing the individual responses, especially if the incentive will be provided to students before the final grades are posted. Even if steps are taken to ensure that the responses remain anonymous, revealing the names of the students who completed the evaluation may be considered a serious risk to the integrity of the evaluation.
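
One common way to reconcile a grade incentive with confidentiality is to record completion status and response content in separate tables that share no key. The sketch below illustrates that separation with SQLite; the table and column names are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Who completed the evaluation (instructors may see this, for credit).
        CREATE TABLE completions (student_id TEXT, course_id TEXT, completed_at TEXT);
        -- What was said (anonymous: no student_id, no key shared with completions).
        CREATE TABLE responses (course_id TEXT, question TEXT, answer TEXT);
    """)

    def submit(student_id, course_id, answers):
        """Record completion and answers together, without ever linking them."""
        with conn:
            conn.execute("INSERT INTO completions VALUES (?, ?, datetime('now'))",
                         (student_id, course_id))
            conn.executemany("INSERT INTO responses VALUES (?, ?, ?)",
                             [(course_id, q, a) for q, a in answers.items()])

    submit("s123", "HIST200", {"q1_overall_rating": "4", "q3_comments": "More examples, please"})
    # The instructor's view contains names but no answers:
    print(conn.execute("SELECT student_id FROM completions WHERE course_id = 'HIST200'").fetchall())

In a real system, care is also needed that timestamps or row ordering cannot be used to re-link the two tables, particularly in small classes.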

Relevant findings from published research literature:

- When points were given at BYU, the response rate was 87%, compared to 71% for paper evaluations.
- When a modest grade incentive was given at California State University, the response rate was 87%, which was the same as paper evaluations. The response rate for the grade incentive was significantly higher than for any other incentive, and it did not bias the results.

Other Incentives

Instructors can offer other positive incentives to encourage participation, such as contributing money to a charity for each form completed, giving students free coupons for food, or entering students in a drawing for various prizes. Instructors can also provide early access to grades or withhold the posting of a final grade until an evaluation is submitted. Withholding grades may seem unfair, but students in the BYU study supported this strategy, saying it would be effective and yet not too restrictive. If the registration system supports it, students who complete the online evaluation could also be assigned earlier registration times for the next term. These incentives could be offered to individual students or to the entire class as a group. For example, students could receive early feedback on their course grades if at least two-thirds of the class completes the online evaluation.

Relevant findings from published research literature:

- When an early grade feedback incentive was offered at California State University, the response rate was significantly higher than online evaluations without an incentive, but it was significantly lower than paper evaluations.

Instructor Encouragement

Instructors should be advised to show a personal interest in students completing the online evaluation, rather than simply distributing the evaluation because they are required to do so. Instructors can mention the evaluations repeatedly in class and let students know that they pay attention to the responses. They could also make the evaluation a formal assignment and dismiss students early to complete it, which sends an even stronger message that the feedback is particularly valued. When faculty members at BYU assigned students to complete the online evaluation, response rates greatly increased, even when points were not given for the assignment. Even if points are not awarded, students may think the evaluation affects their standing in the course, which is known to be a very strong response motivator.

If the instructor has access to the names of students who have completed the evaluation, they could send personal messages to students who have not completed it, encouraging them to complete it at their earliest convenience and reminding them that it is viewed as an important activity. As mentioned earlier, even if measures are taken to preserve student confidentiality, some students may be concerned that the instructor will be able to use the names to identify each student's response. Providing the names also encourages the practice of awarding extra points, which the school may have already determined to be inappropriate. In these situations, the reminders could be sent by school administrators or departmental staff who do not have any control over grades. Some online evaluation systems can also send reminders automatically. These methods are not as effective as personal reminders from the instructor, but they could help alleviate concerns about confidentiality and other potential improprieties.
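
A minimal sketch of such an automated reminder pass appears below; the roster, completion list, and URL are hypothetical, and the sketch prints messages rather than sending real mail.

    # Daily reminder pass run by administrative staff or by the evaluation
    # system itself. All names and addresses are hypothetical.
    roster    = {"ali@school.edu", "bo@school.edu", "cam@school.edu"}
    completed = {"bo@school.edu"}  # drawn from the system's completion records
    instructions_url = "https://eval.example.edu/how-to"  # hypothetical URL

    def send_reminder(address: str) -> None:
        # A real system would hand this to an SMTP server or campus mail API.
        print(f"To: {address}\n"
              f"Your course evaluation is still open. It takes about ten minutes,\n"
              f"your responses are anonymous, and instructions are at {instructions_url}.\n")

    for address in sorted(roster - completed):
        send_reminder(address)
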
Relevant findings from published research literature:

- When instructors assigned the online evaluation to students during the BYU study, the response rate was 77%. When they encouraged students to complete the evaluation but did not make it a formal assignment, the response rate was 32%.

Training and Communication

Most online evaluation systems are very easy to use, but response rates can still suffer if adequate instructions are not provided. One option is to provide a live demonstration showing how to log on to the system, how to fill out the evaluation form, and how to log off. The demonstration can be performed in the classroom and recorded as a video for students to view later. At a minimum, complete written instructions should be provided, either online or as in-class handouts.

Students should also receive additional information about the online evaluation system to help them understand the importance of their input and how the results are used. This information can be shared with students through presentations during new-student orientation and meetings with various student groups. General advertising about the online evaluation can also be distributed via school newspapers, campus posters, and easy-to-find web pages on the school computer system. Any communication to faculty and students about the online evaluation system should be standardized to ensure that a clear and consistent message is provided. The frequency should also be reviewed to ensure that the message is adequately distributed without becoming annoying or overbearing to the intended recipients.

Ideally, course evaluation results should be available to students in a user-friendly online format. If necessary, the school can allow individual instructors to give permission for their evaluations to be posted for student consultation. Students can use this information to make informed decisions about which classes to take in upcoming terms, and it also reinforces the notion that completing the evaluation is a valuable use of their time.

Relevant findings from published research literature:

- When in-class demonstrations were performed at California State University, the response rate was lower than paper evaluations in one case and the same as paper evaluations in the other case. The response rate was significantly higher than online evaluations without a demonstration, but it was not significantly different than the early grade feedback incentive.
- During the McGill University study, four separate administrative reminders to encourage students to complete their online evaluations proved to be very successful, increasing the number of submissions by over 1,000 on the day following the reminder.
- During the unnamed study, students who had not completed the online evaluation were sent a reminder along with a set of instructions, which resulted in another 156 completed evaluations.

Institutional Environment

There is evidence that electronic student evaluations can be successfully implemented at institutions of higher education where the student body is fairly computer literate and computers are readily available on campus. In addition, when completion of online evaluations is assigned or encouraged in more than one course, the likelihood that students will complete the evaluation for all their courses improves considerably. To obtain these cumulative benefits, standard procedures should be adopted across all departments, and broad implementation should be encouraged, if not required.

Higher response rates may also be achieved by allowing students to complete the surveys somewhat earlier than the end of the term, before students are pressured by final exams. Since student evaluations of faculty are fairly stable from the middle of the term to the end of the term, it is conceivable that online evaluations could start anytime after mid-term.

Appendix A: Data and Charts

In the Cornell University study:

- The average score for paper evaluations may not be completely accurate, since the raw data wasn't always available, but it was determined to be a reasonable estimate.
- In method 2, the cumulative value for each question is calculated as the sum of the individual values for each course, while in method 1, the cumulative value is an aggregation. This results in different values, although the differences are consistent for most questions (see chart below).
- For the course that was taught twice in two different semesters, some instances of the course are not included in the analysis. Specifically, the cumulative values do not include the paper evaluations from Spring 1998, Spring 1999, and a third Spring semester.

[Chart: Different Methods Used to Evaluate Variability in Cornell University Study, comparing method 1 and method 2 by question]

Appendix B: References

Avery, R. J., Bryant, W. K., Mathios, A., Kang, H., & Bell, D. (2006). Electronic course evaluations: Does an online delivery system influence student evaluations? Journal of Economic Education, 37(1).

Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5).

Johnson, T. D. (2003). Online student ratings: Will students respond? New Directions for Teaching and Learning, 96, 49-59.

Layne, B. H., DeCristoforo, J. R., & McGinty, D. (1999). Electronic versus traditional student ratings of instruction. Research in Higher Education, 40(2).

Winer, L. R., & Sehgal, R. (2006). Online Course Evaluation Analysis Report. McGill University.

The Office of University Planning and Analysis (UPA), North Carolina State University: Internet resources for higher education outcomes assessment.

OnSET: Online Student Evaluation of Teaching in Higher Education, Brigham Young University.


More information

A Downsized, Laboratory-Intensive Curriculum in Electrical Engineering

A Downsized, Laboratory-Intensive Curriculum in Electrical Engineering A Downsized, Laboratory-Intensive Curriculum in Electrical Engineering T. W. Martin and W. D. Brown Department of Electrical Engineering University of Arkansas Fayetteville, Arkansas 72701 Abstract - The

More information

Progress Review of the University-Wide Course Evaluation System

Progress Review of the University-Wide Course Evaluation System Introduction Progress Review of the University-Wide Course Evaluation System Sharon La Voy, Director of Assessment, Institutional Research, Planning and Assessment Jessica Mislevy, Graduate Assistant,

More information

COURSE AND TEACHER SURVEYS (CATS) AT VILLANOVA UNIVERSITY

COURSE AND TEACHER SURVEYS (CATS) AT VILLANOVA UNIVERSITY COURSE AND TEACHER SURVEYS (CATS) AT VILLANOVA UNIVERSITY A guide for new faculty members There are probably more studies of student ratings than of all of the other data used to evaluate college teaching

More information

Online Collection of Student Evaluations of Teaching

Online Collection of Student Evaluations of Teaching Online Collection of Student Evaluations of Teaching James A. Kulik Office of Evaluations and Examinations The University of Michigan December 5, 2005 1 More than a dozen universities now collect all their

More information

An Online Evaluation Pilot Study in the College of Arts, Humanities & Social Sciences. Richard W. Bruce. Humboldt State University.

An Online Evaluation Pilot Study in the College of Arts, Humanities & Social Sciences. Richard W. Bruce. Humboldt State University. An Online Evaluation Pilot Study in the College of Arts, Humanities & Social Sciences Richard W. Bruce Humboldt State University February 2013 2 Table of Contents Introduction... 3 Literature Review...

More information

SIR II Student Instructional Report

SIR II Student Instructional Report Testing the Invariance of Interrater Reliability Between Paper-Based and Online Modalities of the SIR II Student Instructional Report David Klieger, John Centra, John Young, Steven Holtzman, and Lauren

More information

Phase 1 pilot 2005/6 Intervention. GCU Caledonian Business School Business Management Page 1 of 8. Overview

Phase 1 pilot 2005/6 Intervention. GCU Caledonian Business School Business Management Page 1 of 8. Overview University Department Module Overview Glasgow Caledonian Business school Business Management The Business School has been implementing assessment re-engineering using a variety of technologies, initially

More information

UNIVERSITY OF DAYTON MANAGEMENT AND MARKETING DEPARTMENT MKT 315: RETAIL MARKETING Course Syllabus Winter 2008, Section 01

UNIVERSITY OF DAYTON MANAGEMENT AND MARKETING DEPARTMENT MKT 315: RETAIL MARKETING Course Syllabus Winter 2008, Section 01 UNIVERSITY OF DAYTON MANAGEMENT AND MARKETING DEPARTMENT MKT 315: RETAIL MARKETING Course Syllabus Winter 2008, Section 01 INSTRUCTOR: Serdar S. Durmuşoğlu, Ph.D. OFFICE LOCATION: Miriam Hall 703 PHONE:

More information

Examining the Role of Online Courses in Native Hawaiian Culture and Language at the University of Hawaii

Examining the Role of Online Courses in Native Hawaiian Culture and Language at the University of Hawaii Examining the Role of Online Courses in Native Hawaiian Culture and Language at the University of Hawaii Introduction Kelley Dudoit University of Hawaii, Manoa Educational Technology Graduate Student Hawaii,

More information

TIPS DATA QUALITY STANDARDS ABOUT TIPS

TIPS DATA QUALITY STANDARDS ABOUT TIPS 2009, NUMBER 12 2 ND EDITION PERFORMANCE MONITORING & EVALUATION TIPS DATA QUALITY STANDARDS ABOUT TIPS These TIPS provide practical advice and suggestions to USAID managers on issues related to performance

More information

Ad-Hoc Committee on Academic Integrity. Survey Summary Report

Ad-Hoc Committee on Academic Integrity. Survey Summary Report Ad-Hoc Committee on Academic Integrity Survey Summary Report Presented at the UAA Faculty Senate Retreat 24 August 2011 CONTENTS I. Methodology II. Perceptions of Academically Dishonest Behaviors Table

More information

Mathematics Placement And Student Success: The Transition From High School To College Mathematics

Mathematics Placement And Student Success: The Transition From High School To College Mathematics Mathematics Placement And Student Success: The Transition From High School To College Mathematics David Boyles, Chris Frayer, Leonida Ljumanovic, and James Swenson University of Wisconsin-Platteville Abstract

More information

College of Liberal Arts Online Education Course Management Policy 2011-2012 Last update: July, 2011. I. Purpose of Policy:

College of Liberal Arts Online Education Course Management Policy 2011-2012 Last update: July, 2011. I. Purpose of Policy: I. Purpose of Policy: College of Liberal Arts Online Education Course Management Policy 2011-2012 Last update: July, 2011 The College of Liberal Arts Online Education Course Management Policy 1 provides

More information

Resource 6 Workplace travel survey guide

Resource 6 Workplace travel survey guide Resource 6 Workplace travel survey guide Page 1 Resource 6 Workplace travel survey guide Overview Introduction Contents The NZ Transport Agency (NZTA) provides a workplace travel survey (hereafter referred

More information

Administering Student Course Evaluations Online

Administering Student Course Evaluations Online UNIVERSITY LEADERSHIP COUNCIL Administering Student Course Evaluations Online Custom Research Brief February 14, 2011 RESEARCH ASSOCIATE Ehui Nyatepe-Coo RESEARCH MANAGER Josh Albert I. Research Methodology

More information

Table 1: Number of students enrolled in the program in Fall, 2011 (approximate numbers)

Table 1: Number of students enrolled in the program in Fall, 2011 (approximate numbers) Program: Department: MBA Human Resource Management CBA Table 1: Number of students enrolled in the program in Fall, 2011 (approximate numbers) MBA Concentration (Program) # students Human Resource Management

More information

>> BEYOND OUR CONTROL? KINGSLEY WHITE PAPER

>> BEYOND OUR CONTROL? KINGSLEY WHITE PAPER >> BEYOND OUR CONTROL? KINGSLEY WHITE PAPER AUTHOR: Phil Mobley Kingsley Associates December 16, 2010 Beyond Our Control? Wrestling with Online Apartment Ratings Services Today's consumer marketplace

More information

Are they the same? Comparing the instructional quality of online and faceto-face graduate education courses

Are they the same? Comparing the instructional quality of online and faceto-face graduate education courses Assessment & Evaluation in Higher Education Vol. 32, No. 6, December 2007, pp. 681 691 Are they the same? Comparing the instructional quality of online and faceto-face graduate education courses Andrew

More information

Annual Goals for Math & Computer Science

Annual Goals for Math & Computer Science Annual Goals for Math & Computer Science 2010-2011 Gather student learning outcomes assessment data for the computer science major and begin to consider the implications of these data Goal - Gather student

More information

Online Course Evaluation. Access to Evaluation Data Context & Recommendations

Online Course Evaluation. Access to Evaluation Data Context & Recommendations Background and Context: Online Course Evaluation Access to Evaluation Data Context & Recommendations The campus is transitioning from a course evaluation system that is highly distributed to an enterprise,

More information

Policy on Student Ratings of Teaching

Policy on Student Ratings of Teaching May 5, 2015 Page 1 of 6 PURPOSE: To outline policy regarding the selection of a Student Rating of Teaching (SRT) instrument and how SRT rankings shall be viewed in faculty evaluation. Supersedes SP 07-12

More information

Incorporating the MARS Sales Management Simulation into a Sales Management Course

Incorporating the MARS Sales Management Simulation into a Sales Management Course Incorporating the MARS Sales Management Simulation into a Sales Management Course Joe Chapman, Ph.D. Ball State University Miller College of Business Dept. of Marketing & Management Muncie, IN 47306-0355

More information

Spring 2013 Structured Learning Assistance (SLA) Program Evaluation Results

Spring 2013 Structured Learning Assistance (SLA) Program Evaluation Results Crafton Hills College RRN 682 July 2013 Research Brief Spring 2013 Structured Learning Assistance (SLA) Program Evaluation Results Prepared by Lorena Guadiana Summary of Main Findings 85% of respondents

More information

Student Engagement Strategies in One Online Engineering and Technology Course

Student Engagement Strategies in One Online Engineering and Technology Course Paper ID #7674 Student Engagement Strategies in One Online Engineering and Technology Course Dr. Julie M Little-Wiles, Purdue School of Engineering and Technology, IUPUI Dr. Julie Little-Wiles is a Visiting

More information

Gender Equality: Student Culture Survey Guidance for Departments

Gender Equality: Student Culture Survey Guidance for Departments Gender Equality: Student Culture Survey Guidance for Departments 1 1. Introduction The steps that lead to gender equality and wider diversity within organisations are also those that promote good practice

More information

Formative Evaluations in Online Classes. course improvements is also increasing. However, there is not general agreement on the best

Formative Evaluations in Online Classes. course improvements is also increasing. However, there is not general agreement on the best The Journal of Educators Online-JEO January 2016 ISSN 1547-500X Vol 13 Number 1 1 Formative Evaluations in Online Classes Jennifer L. Peterson, Illinois State University, Normal, IL., USA Abstract Online

More information

Students beliefs and attitudes about a business school s academic advising process

Students beliefs and attitudes about a business school s academic advising process Students beliefs and attitudes about a business school s academic advising process ABSTRACT M. Wayne Alexander Minnesota State University Moorhead Deborah Kukowski Minnesota State University Moorhead Lee

More information

Online Course Evaluation and Analysis

Online Course Evaluation and Analysis Session 1793 Online Course Evaluation and Analysis Li Bai Saroj Biswas Department of Electrical and Computer Engineering College of Engineering Temple University Philadelphia, PA19122 lbai@temple.edu sbiswas@temple.edu

More information

Attitudes, Concerns and Opinions Relating to the Provision of Emergency Medical Services

Attitudes, Concerns and Opinions Relating to the Provision of Emergency Medical Services Survey of Fire and Emergency Medical Services Department Operational Employees: Attitudes, Concerns and Opinions Relating to the Provision of Emergency Medical Services District of Columbia Adrian M. Fenty,

More information

Guidelines for the Use of the Participant Pool

Guidelines for the Use of the Participant Pool Guidelines for the Use of the Participant Pool The Department of Psychology & Neuroscience maintains a participant pool composed of students in various psychology classes. Naturally, the participant pool

More information

REDESIGNING ALGEBRA COURSES: FROM IMPLEMENTATION TO RESULTS

REDESIGNING ALGEBRA COURSES: FROM IMPLEMENTATION TO RESULTS [Type text] REDESIGNING ALGEBRA COURSES: FROM IMPLEMENTATION TO RESULTS Laura J. Iossi Broward College Department of Mathematics, Broward College, 3501 S.W. Davie Rd. Davie, Fl 33314. liossi@broward.edu

More information

The Examination of Strength and Weakness of Online Evaluation of Faculty Members Teaching by Students in the University of Isfahan

The Examination of Strength and Weakness of Online Evaluation of Faculty Members Teaching by Students in the University of Isfahan The Examination of Strength and Weakness of Online Evaluation of Faculty Members Teaching by Students in the University of Isfahan Ansary Maryam PhD Scholar, Philosophy of Education, Faculty of Educational

More information

D R A F T. Faculty Senate Ad Hoc Committee on Quality in Online Learning.

D R A F T. Faculty Senate Ad Hoc Committee on Quality in Online Learning. Faculty Senate Ad Hoc Committee on Quality in Online Learning. The Faculty Senate Ad Hoc Committee on the Future Direction of Quality Education is charged with: Defining quality in online/distance education

More information

Policies for Evaluating Faculty: Recommendations for Incorporating Student and Peer Reviews in the Faculty Evaluation Process DRAFT

Policies for Evaluating Faculty: Recommendations for Incorporating Student and Peer Reviews in the Faculty Evaluation Process DRAFT Policies for Evaluating Faculty: Recommendations for Incorporating Student and Peer Reviews in the Faculty Evaluation Process DRAFT Overview In 2011, The University of Texas System Chancellor unveiled

More information

An Analysis of how Proctoring Exams in Online Mathematics Offerings Affects Student Learning and Course Integrity

An Analysis of how Proctoring Exams in Online Mathematics Offerings Affects Student Learning and Course Integrity An Analysis of how Proctoring Exams in Online Mathematics Offerings Affects Student Learning and Course Integrity Introduction Reliable assessment is central to education and educational institutions for

More information

Development of a Tablet-PC-based System to Increase Instructor-Student Classroom Interactions and Student Learning

Development of a Tablet-PC-based System to Increase Instructor-Student Classroom Interactions and Student Learning Development of a Tablet-PC-based System to Increase Instructor-Student Classroom Interactions and Student Learning Kimberle Koile David Singer MIT CS and AI Lab MIT Dept of Brain and Cognitive Science

More information

IPCE Institute for Policy and Civic Engagement http://www.uic.edu/cuppa/ipce/

IPCE Institute for Policy and Civic Engagement http://www.uic.edu/cuppa/ipce/ IPCE Institute for Policy and Civic Engagement http://www.uic.edu/cuppa/ipce/ TRANSPARENCY, CIVIC ENGAGEMENT, AND TECHNOLOGY USE IN LOCAL GOVERNMENT AGENCIES: FINDINGS FROM A NATIONAL SURVEY April 2011

More information

Online Reputation in a Connected World

Online Reputation in a Connected World Online Reputation in a Connected World Abstract This research examines the expanding role of online reputation in both professional and personal lives. It studies how recruiters and HR professionals use

More information

Student Preferences for Learning College Algebra in a Web Enhanced Environment

Student Preferences for Learning College Algebra in a Web Enhanced Environment Abstract Student Preferences for Learning College Algebra in a Web Enhanced Environment Laura Pyzdrowski West Virginia University Anthony Pyzdrowski California University of Pennsylvania It is important

More information

Bio 204 Lab reform: End of Second Summer Report 1. Project motivation and goals: 2. Project summary:

Bio 204 Lab reform: End of Second Summer Report 1. Project motivation and goals: 2. Project summary: Bio 204 Lab reform: End of Second Summer Report 1. Project motivation and goals: Biology 204 Plant and Animal form and Function was developed in 2004 and first taught in 2005 as part of a re-design of

More information

Assessing Quantitative Reasoning in GE (Owens, Ladwig, and Mills)

Assessing Quantitative Reasoning in GE (Owens, Ladwig, and Mills) Assessing Quantitative Reasoning in GE (Owens, Ladwig, and Mills) Introduction Many students at CSU, Chico, receive much of their college-level mathematics education from the one MATH course they complete

More information

Integrating Instructional Technology into the Classroom: Laptops and LearningSpace in Business Administration

Integrating Instructional Technology into the Classroom: Laptops and LearningSpace in Business Administration Abstract Integrating Instructional Technology into the Classroom: Laptops and LearningSpace in Business Administration Cynthia L. Krey Assistant Director - Instructional Technology The McGlynn Computer

More information

HE STEM Staff Culture Survey Guidance

HE STEM Staff Culture Survey Guidance HE STEM Staff Culture Survey Guidance 1 1. Introduction The steps that lead to gender equality within organisations are also those that promote good employment practice and an inclusive environment for

More information

Web-Based Student Evaluation of Instruction: Promises and Pitfalls

Web-Based Student Evaluation of Instruction: Promises and Pitfalls Web-Based Student Evaluation of Instruction: Promises and Pitfalls Jack McGourty, Associate Dean of Engineering, Columbia University Kevin Scoles, Associate Professor of Engineering, Drexel University

More information

Assessment Findings and Curricular Improvements Department of Psychology Undergraduate Program. Assessment Measures

Assessment Findings and Curricular Improvements Department of Psychology Undergraduate Program. Assessment Measures Assessment Findings and Curricular Improvements Department of Psychology Undergraduate Program Assessment Measures The Department of Psychology uses the following measures to assess departmental learning

More information

Student Course Evaluations at the University of Utah

Student Course Evaluations at the University of Utah Stephanie J Richardson, PhD, RN Director, Center for Teaching & Learning Excellence Associate Professor and Division Chair, College of Nursing Student Course Evaluations at the University of Utah Background.

More information

GRADUATE SCHOOL OF LIBRARY AND INFORMATION SCIENCE INTRODUCTION TO LIBRARY AND INFORMATION STUDIES RESEARCH REPORT

GRADUATE SCHOOL OF LIBRARY AND INFORMATION SCIENCE INTRODUCTION TO LIBRARY AND INFORMATION STUDIES RESEARCH REPORT GRADUATE SCHOOL OF LIBRARY AND INFORMATION SCIENCE INTRODUCTION TO LIBRARY AND INFORMATION STUDIES RESEARCH REPORT Matthew S. Darby Charlotte Fowles Ruth Jiu Monika Szakasits Sarah Ziebell Mann Group LIS

More information

Assessing the Impact of a Tablet-PC-based Classroom Interaction System

Assessing the Impact of a Tablet-PC-based Classroom Interaction System STo appear in Proceedings of Workshop on the Impact of Pen-Based Technology on Education (WIPTE) 2008. Assessing the Impact of a Tablet-PC-based Classroom Interaction System Kimberle Koile David Singer

More information

CEDAR CREST COLLEGE Psychological Assessment, PSY - 312 Spring 2010. Dr. Diane M. Moyer dmmoyer@cedarcrest.edu Office: Curtis 123

CEDAR CREST COLLEGE Psychological Assessment, PSY - 312 Spring 2010. Dr. Diane M. Moyer dmmoyer@cedarcrest.edu Office: Curtis 123 CEDAR CREST COLLEGE Psychological Assessment, PSY - 312 Spring 2010 Dr. Diane M. Moyer dmmoyer@cedarcrest.edu Office: Curtis 123 Course Description: The goal of this course is to expose students to the

More information

Procrastination in Online Courses: Performance and Attitudinal Differences

Procrastination in Online Courses: Performance and Attitudinal Differences Procrastination in Online Courses: Performance and Attitudinal Differences Greg C Elvers Donald J. Polzella Ken Graetz University of Dayton This study investigated the relation between dilatory behaviors

More information

Section 7: The Five-Step Process for Accommodations for English Language Learners (ELLs)

Section 7: The Five-Step Process for Accommodations for English Language Learners (ELLs) : The Five-Step Process for Accommodations for English Language Learners (ELLs) Step 1: Setting Expectations Expect English Language Learners (ELLs) to Achieve Grade-level Academic Content Standards Federal

More information

Comparison of Student Performance in an Online with traditional Based Entry Level Engineering Course

Comparison of Student Performance in an Online with traditional Based Entry Level Engineering Course Comparison of Student Performance in an Online with traditional Based Entry Level Engineering Course Ismail I. Orabi, Ph.D. Professor of Mechanical Engineering School of Engineering and Applied Sciences

More information

Program Level Assessment Report for 2012-2013

Program Level Assessment Report for 2012-2013 Program Level Assessment Report for 2012-2013 PROGRAM NAME, DEGREE NAME (e.g. Organizational Leadership, B.S.): Sociology, B.A. COLLEGE in which PROGRAM is housed: CoLA REPORT PREPARED by: Jacqueline Bergdahl

More information

CHAPTER 3 RESEARCH DESIGN AND METHODOLOGY. The review of literature has produced reoccurring themes emphasizing the

CHAPTER 3 RESEARCH DESIGN AND METHODOLOGY. The review of literature has produced reoccurring themes emphasizing the 36 CHAPTER 3 RESEARCH DESIGN AND METHODOLOGY Introduction The review of literature has produced reoccurring themes emphasizing the importance of technological literacy for citizens in the 21 st Century

More information

Designing & Conducting Survey Research

Designing & Conducting Survey Research Designing & Conducting Survey Research Santa Monica College Fall 2011 Presented by: Hannah Alford, Director Office of Institutional Research 1 Workshop Overview Part I: Overview of Survey Method Paper/Pencil

More information

How To Implement Online Course Evaluations At Boone State

How To Implement Online Course Evaluations At Boone State On-line Course Evaluation Implementation and Improvement of Response Rates Marcia Belcheir, Ph.D. Robert Anson, Ph.D. James A. Goodman, Ph.D. Boise State University 2012 Boise State University 1 Goals

More information

e-learning in College Mathematics an Online Course in Algebra with Automatic Knowledge Assessment

e-learning in College Mathematics an Online Course in Algebra with Automatic Knowledge Assessment e-learning in College Mathematics an Online Course in Algebra with Automatic Knowledge Assessment Przemysław Kajetanowicz Institute of Mathematics and Computer Science Wrocław University of Technology

More information

Experiences with Tutored Video Instruction for Introductory Programming Courses

Experiences with Tutored Video Instruction for Introductory Programming Courses Experiences with Tutored Video Instruction for Introductory Programming Courses Richard Anderson and Martin Dickey and Hal Perkins Department of Computer Science and Engineering, University of Washington

More information

Student Perceptions of Online Homework in Introductory Finance Courses. Abstract

Student Perceptions of Online Homework in Introductory Finance Courses. Abstract Student Perceptions of Online Homework in Introductory Finance Courses Abstract This paper examines student perceptions concerning online homework assignments in an introductory finance class. In general,

More information

Online and In person Evaluations: A Literature Review and Exploratory Comparison

Online and In person Evaluations: A Literature Review and Exploratory Comparison Online and In person Evaluations: A Literature Review and Exploratory Comparison Paulette Laubsch Assistant Professor School of Administrative Science Fairleigh Dickinson University Teaneck, NJ 07666 plaubsch@fdu.edu

More information