What Makes a Quality Online Course? The Student Perspective

Penny Ralston-Berg, M.S.
Senior Instructional Designer
University of Wisconsin-Extension

Leda Nath, Ph.D.
Assistant Professor of Sociology
University of Wisconsin-Whitewater

Introduction

Quality Matters (QM) is a program that is receiving a great deal of attention from campuses around the country. It offers quality assurance through a rubric for online course design. The rubric covers eight dimensions (i.e., course overview and introduction; learning objectives; assessment and measurement; resources and materials; learner interaction; course technology; learner support; and ADA compliance) and allows instructors teaching online to participate in a peer review process to check that their courses meet requirements developed from years of research, national standards, and a team of scholars ("Research Literature," 2005). The Quality Matters (2005/2006) Inter-Institutional Quality Assurance in Online Learning rubric lists specific review standards. These represent course features ranked on a scale of 1-3: 3 being essential, 2 very important, and 1 important, according to QM. Although the rubric is designed as a peer review tool, instructional designers and course developers also use it as a course design guide, a key design element checklist, and a faculty development tool. In the Division of Continuing Education, Outreach & E-Learning (CEOEL) at University of Wisconsin-Extension, designers use the FY05-06 rubric and the research behind it to emphasize and encourage key elements of quality course design to faculty. As part of validating its design and development practices, CEOEL regularly gathers feedback from online students to ensure faculty professional development efforts are in line with online student needs. At the start of this study, however, no data were available comparing the rubric items to the needs and values of online students.
This study is an effort to validate the QM rubric and CEOEL's use of it, and to determine what students value in terms of quality in online courses. This information will continue to feed the design and development of online courses. The QM rubric has been validated further through its comparison to accreditation standards for distance learning (Legon, 2006). Legon concluded that when the QM rubric is compared with the Best Practices for Electronically Offered Degree and Certificate Programs endorsed by CHEA and the eight regional accrediting agencies, the rubric is fully consistent with published accreditation standards for online education. He also concluded that the QM review program can contribute to the quality assurance processes required for accreditation.

Satisfaction

The online course environment is now a large part of the college setting, and a great deal of research has examined the benefits of satisfaction in that setting. For example, satisfaction has been linked to student performance among college students (Bean & Bradley, 1986; Organ, 1977; Schwab & Cummings, 1970). Donohue and Wong (1997) argued that satisfaction is highly correlated with
achievement motivation among both traditional and non-traditional students. This may be why others have found an association between satisfaction and college student achievement (Centra & Rock, 1971; Lavin, 1985). Grade point average (GPA) has been linked to student satisfaction (Bentler & Speckart, 1979; Fishbein & Ajzen, 1975). Student satisfaction has also been examined as a factor contributing to student retention (Aitken, 1982; Astin, 1993; Spady, 1970; Tinto, 1975, 1993) and student attrition (Bean, 1983; Spady, 1970; Tinto, 1975, 1993). Satisfaction and academic performance have also been viewed as intervening variables that affect student attrition (Bean, 1980, 1983, 1985; Pascarella, 1980; Spady, 1970; Tinto, 1975). Apart from the academic benefits outlined above, satisfaction has also been correlated with students' progress in their intellectual and social development (Pace, 1984). Scholars have argued that satisfaction is a key psychological-affective outcome, which in turn serves as a direct measure of success in college (Astin, 1977, 1993). Student satisfaction among older students has been shown to be related to a learner-centered approach (Miglietti & Strange, 1998). Many program evaluations include measures of student satisfaction because of its known practical benefits, though much of the knowledge regarding satisfaction comes from earlier studies during the late 1960s and early 1970s (e.g., Betz, Klingensmith, & Menne, 1970; Pervin, 1967; Schmidt & Sedlacek, 1972). Satisfaction has also been linked to the institutional culture of the university: cultures that value and build community are more likely to have higher student satisfaction rates (Kuh, 2001-2002). Understanding student satisfaction in an online environment thus helps inform online course design practices and faculty development programs (e.g., faculty training workshops), and supports efforts to improve student retention.
For these reasons, our study, which examines the Quality Matters recommended online course features, is important. We examine to what degree students value these recommended features, and how their perspective compares to the QM rubric's rankings.

Methods

We surveyed a random sample of online students from a small Midwestern university about the degree to which they valued the Quality Matters proposed online course features. Several features representing each of the eight Quality Matters general review standards were included. Students were asked to consider only the online course environment while rating the list of features in terms of how valuable they are, using a 6-point Likert scale (i.e., 1 = extremely worthless; 2 = worthless; 3 = somewhat worthless; 4 = somewhat valuable; 5 = valuable; 6 = extremely valuable). Students were also informed that by "value" we meant how much they wanted the feature in an online course.

Results

With the exception of the Netiquette feature (i.e., the inclusion of etiquette guidelines for how to behave online), students rated each item significantly differently from QM's ranking. For all features ranked 3 by QM, students' ratings were significantly lower; for all features ranked 1 or 2 by QM, students' ratings were significantly higher. For all items except Netiquette, students' mean scores fell between 2 and 3. Because the range of student scores held primarily between 2 and 3, we wanted to determine whether any significant difference existed between the items viewed as essential by QM (i.e., those ranked 3) and the others (i.e., those ranked 2 or 1). We therefore conducted paired-sample t-tests comparing students' overall mean scores for QM items ranked 3, items ranked 2, and items ranked 1 against one another. Results revealed that students do significantly differentiate between the items. The
ranking of items, however, differs from the QM recommendations. Students value the items marked 3 and 2 by QM equally, but value the items marked 1 by QM significantly less.

Table 1. Paired-sample t-tests comparing student mean scores across items ranked 1-3 by QM. Student mean scores (SD): items marked 3 by QM = 2.51 (.31); items marked 2 = 2.40 (.34); items marked 1 = 2.41 (.33).

Compared pair                          t-value   df
Items marked 3 vs. items marked 2      -1.36     139
Items marked 2 vs. items marked 1       7.59†    142
Items marked 1 vs. items marked 3      -6.91†    137

* p < .05; ** p < .01; *** p < .001; † p < .000

Student satisfaction relates to education in terms of learning, as described earlier. To examine the relationships between satisfaction with online learning and the QM features, correlations were performed. Results revealed that students who report high satisfaction with online courses in general also value all QM features significantly more than students who report low satisfaction. That is, QM items marked 1 (r = .18, p < .05), QM items marked 2 (r = .28, p < .001), and QM items marked 3 (r = .24, p < .01) all had a significant, positive relationship with student satisfaction in online courses.

Table 3. Pearson's correlations between general satisfaction with online courses and the value placed on QM course features.

                Items marked 1   Items marked 2   Items marked 3
Satisfaction    .18*             .28***           .24**

Discussion

These results indicate that all items in the QM rubric were valuable to students. This confirms the path QM is taking in encouraging instructors to include such features in their online designs. Student ratings also suggest that items ranked 2 by QM (with the exception of Netiquette) are as important to students as those ranked 3. In other words, while QM values all items on its list, its prescription not to value items marked 2 as highly as those marked 3 may be insensitive to student wishes.
That is, when considering the student perspective, instructors may wish to pay as much attention to QM items marked 2 as to those marked 3. Results also confirmed that items marked 1 by QM were valued significantly less than the others by students. This is consistent with QM's implied message that these items can, if necessary, be included in course design less often. It is also worth noting that although only seven students identified as having a disability, the ADA survey items (i.e., QM items beginning with VIII) were valued by survey takers in general. That is, ADA compliance is also important to students without a disability.
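The analyses reported above can be sketched in code. The following is an illustrative example only: it uses simulated 6-point Likert ratings (not the study's actual data, sample sizes, or variable names) to show how paired-sample t-tests and a Pearson correlation of the kind reported in Tables 1 and 3 are computed.

```python
# Illustrative sketch with simulated data; all values are assumptions,
# not the study's results.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_students = 140  # roughly the df reported in Table 1

# Simulated per-student mean ratings (6-point scale) for features
# marked 3, 2, and 1 by QM.
qm3 = rng.normal(loc=4.9, scale=0.5, size=n_students).clip(1, 6)
qm2 = rng.normal(loc=4.8, scale=0.5, size=n_students).clip(1, 6)
qm1 = rng.normal(loc=4.4, scale=0.5, size=n_students).clip(1, 6)

# Paired-sample t-tests, as in Table 1: the same students rate each group,
# so the observations are paired.
t_32, p_32 = stats.ttest_rel(qm3, qm2)
t_21, p_21 = stats.ttest_rel(qm2, qm1)
print(f"QM3 vs QM2: t = {t_32:.2f}, p = {p_32:.3f}")
print(f"QM2 vs QM1: t = {t_21:.2f}, p = {p_21:.3f}")

# Pearson correlation, as in Table 3: overall satisfaction (also on a
# Likert scale) against the value placed on a feature group.
satisfaction = (0.4 * qm2 + rng.normal(0, 0.5, n_students)).clip(1, 6)
r, p = stats.pearsonr(satisfaction, qm2)
print(f"satisfaction vs value of QM2 items: r = {r:.2f}, p = {p:.3f}")
```

A paired (rather than independent-samples) test is the appropriate choice here because each student contributes a score to every feature group, making the group scores correlated within students.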
Understanding what students value may have implications for their satisfaction, which in turn relates to learning and other important educational outcomes. Therefore, when designing online courses using the QM rubric, the student perspective is worth considering.

References

Aitken, N. D. (1982). College student performance, satisfaction and retention: Specification and estimation of a structural model. The Journal of Higher Education, 53(1), 32-50.

Astin, A. W. (1977). Four critical years. San Francisco: Jossey-Bass.

Astin, A. W. (1993). What matters in college? Four critical years revisited. San Francisco: Jossey-Bass.

Bean, J. P. (1980). The synthesis of a causal model of student attrition. Research in Higher Education, 12(2), 155-187.

Bean, J. P. (1983). The application of a model of turnover in work organizations to the student attrition process. The Review of Higher Education, 6(2), 129-148.

Bean, J. P. (1985). Interaction effects based on class level in an explanatory model of college student dropout syndrome. American Educational Research Journal, 22(1), 35-64.

Bean, J. P., & Bradley, R. K. (1986). Untangling the satisfaction-performance relationship for college students. The Journal of Higher Education, 57(4), 393-412.

Bentler, P. M., & Speckart, G. (1979). Models of attitude-behavior relations. Psychological Bulletin, 86(5), 452-464.

Betz, B. L., Klingensmith, J. B., & Menne, J. W. (1970). The measurement and analysis of college student satisfaction. Measurement and Evaluation in Guidance, 3(2), 110-118.

Burchfield, C. M., & Sappington, J. (1999). Participation in classroom discussion. Teaching of Psychology, 26(4), 290-291.

Centra, J., & Rock, D. (1971). College environments and student achievement. American Educational Research Journal, 8(4), 623-634.

Donohue, T. L., & Wong, E. H. (1997). Achievement motivation and college satisfaction in traditional and nontraditional students. Education, 118(2), 237-243.

Fishbein, M., & Ajzen, I. (1975).
Belief, attitude, intention and behavior. Reading, MA: Addison-Wesley.

Kuh, G. D. (2001-2002). Organizational culture and student persistence: Prospects and puzzles. Journal of College Student Retention, 3(1), 23-39.

Lavin, D. E. (1985). The prediction of academic performance. New York: Russell Sage Foundation.

Legon, R. (2006). Comparison of the Quality Matters rubric to accreditation standards for distance learning. Retrieved January 14, 2008, from http://www.qualitymatters.org/documents/comparison%20of%20the%20quality%20matters%20Rubric%20-%20Summary.pdf

Miglietti, C. L., & Strange, C. C. (1998). Learning styles, classroom environment preferences, teaching styles, and remedial course outcomes for underprepared adults at a two-year college. Community College Review, 26(1), 1-19.

Organ, D. W. (1977). A reappraisal and reinterpretation of the satisfaction-causes-performance hypothesis. Academy of Management Review, 2(1), 46-53.

Pace, C. R. (1984). Measuring the quality of college student experiences: An account of the development and use of the College Student Experience Questionnaire. Los Angeles: Higher Education Research Institute, UCLA.

Pascarella, E. T. (1980). Student-faculty informal contact and college outcomes. Review of Educational Research, 50(4), 545-595.

Pervin, L. A. (1967). Satisfaction and perceived self-environment similarity: A semantic differential study of student-college interaction. Journal of Personality, 35(4), 623-624.
Research literature and standards sets support for Quality Matters review standards. (2005). Retrieved January 14, 2008, from http://www.qualitymatters.org/documents/matrix%20of%20research%20standards%20fy0506.pdf

Schmidt, D. K., & Sedlacek, W. E. (1972). Variables related to university student satisfaction. Journal of College Student Personnel, 13(3), 233-238.

Schwab, D. L., & Cummings, L. L. (1970). Theories of performance and satisfaction. Industrial Relations, 9(3), 408-430.

Spady, W. (1970). Dropouts from higher education: An interdisciplinary review and synthesis. Interchange, 1(1), 64-85.

Tinto, V. (1975). Dropouts from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45(1), 89-125.

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. Chicago: The University of Chicago Press.

About the Presenters

Penny Ralston-Berg is a Senior Instructional Designer for the Division of Continuing Education, Outreach & E-Learning at UW-Extension.
E-mail: ralston-berg@learn.uwsa.edu
URL: http://www.uwex.edu/ce/
Phone: 608.262.8095
Fax: 608.265.9396

Leda Nath, Ph.D., Assistant Professor, teaches Sociology in the Department of Sociology, Anthropology and Criminal Justice at UW-Whitewater.
Address: White 415, UW-Whitewater, 800 West Main Street, Whitewater, WI 53190
E-mail: nathl@uww.edu
Phone: 262.472.1125
Fax: 262.472.2803