COMPARISON OF INSTRUCTORS' AND STUDENTS' PERCEPTIONS OF THE EFFECTIVENESS OF ONLINE COURSES




Soonhwa Seok, The University of Kansas
Carolyn Kinsell, Solers Research Group
Boaventura DaCosta, Solers Research Group
Chan K. Tung, Kansas City Community College

Correspondence: Soonhwa Seok, The Center for Research on Learning, The University of Kansas, 1000 Sunnyside, Lawrence, KS 66045. Telephone: (785) 864-2700.

The Quarterly Review of Distance Education, Volume 11(1), 2010, pp. 25-36. ISSN 1528-3518. Copyright 2010 Information Age Publishing, Inc. All rights of reproduction in any form reserved.

This study used an extensive online course evaluation inventory to analyze subjects' perceptions of course effectiveness in the following subscales: flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, instructional design, and content. Survey results compared perceptions across instructors, students, and demographics, including age, gender, educational level, and course experience. Results indicated that both students and instructors had positive perceptions of course effectiveness, with instructors having higher perceptions than students in some subscales. The results also indicated positive correlations between perceptions and teaching experience, suggesting the need for further research in the flexibility, communications, and online instructional design course effectiveness subscales.

INTRODUCTION

The number of online courses has dramatically increased in the United States in recent years. The largest increases have occurred in associate degree institutions, where 72% agree that online courses are part of their long-term strategy, as compared to 58% in 2003 (Allen & Seaman, 2005). Thus, Web-based pedagogy is a critical factor that affects today's education in community colleges.

As technology has improved, the delivery of online courses has changed significantly. For example, more media and interactivity, as well as community publishing, are now part of the delivery of course content. But as Web-based pedagogy brings new opportunities, it also brings new challenges to both instructors and students.

For example, in 2000 the BellSouth Foundation (2003) launched the Power to Teach Program to explore ways to help create a critical mass of K-12 teachers capable of incorporating technology into everyday classroom experiences. The data showed that while teachers felt they were making dramatic leaps in using technology to create new learning experiences for their students, students saw few changes in their classroom instruction. In fact, students revealed that they were hungry for more opportunities to use technology in the learning environment. So, how can there be such a disparity in perceptions?

The Web-based learning model known as the Model of Community of Inquiry, developed at the University of Alberta, assumes that learning occurs within the community through the interaction of three elements: cognitive presence, social presence, and teaching presence (Garrison, Anderson, & Archer, 2000; Seok, 2006, 2007a, 2007b). One of the contributing factors may have to do with teaching presence. While cognitive and social presences are considered core elements in learning, whether or not learning is achieved depends on the presence of a teacher to facilitate the learning activities. Other contributing factors may be at play as well with regard to social presence. Although the BellSouth program focused on K-12, it sheds light in general on the importance of perceptions in the classroom.

For instructors to deliver more effective Web-based learning in community college settings, we need to know more about the perceptions of online courses held by the producers and consumers of online technology: instructors and students.

Purpose

This study compared instructors' and students' perceptions of the effectiveness of online courses in community college settings. Course effectiveness was analyzed along the following composites: flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, online instructional design, and content. This study was an extension of a validation study conducted by Seok (2006) that piloted a comprehensive online course evaluation instrument at the postsecondary level.

In this study, two survey questionnaires were used to gather students' and instructors' demographic information and their perceptions of online course effectiveness in community colleges. The variables of gender, age, native language, academic major, educational level, technology skill, and experience with online courses were taken into consideration to determine whether they significantly affected subjects' perceptions of the effectiveness of Web-based courses. Educational theories, principles, practices, and research on perceptions of course effectiveness in Web-based instruction formed the research base for this study. The findings may be used to help instructors and course designers gain a better understanding of how to evaluate, design, and deliver more effective Web-based learning environments for each subscale and across student demographics (Seok, 2007a, 2007b, 2008).
Research Questions

In order to compare instructors' and students' perceptions of the effectiveness of online courses in a community college setting, answers were sought to the following research questions:

Research Question 1: Are there significant relationships between students' and instructors' perceptions of online course effectiveness and students' and instructors' demographic characteristics (i.e., gender, age, native language, major, educational level, technology skill, and learning experience)?

Research Question 2: Are there significant differences between students' perceptions and instructors' perceptions of online course effectiveness?

LITERATURE REVIEW

To better understand the importance of perceptions of online course effectiveness, the review that follows is organized into three sections: the pedagogical features, principles, and practices of online learning; students' perceptions of online course effectiveness; and instructors' perceptions of online course effectiveness.

Online Learning Pedagogical Features, Principles, and Practices

The World Wide Web provides new functionality in transmitting information and has enhanced the ability to educate students electronically in both stand-alone tutorials and online workshops. As such, the Web has increased opportunities for learning and alternative formats for information delivery (Dwyer, Barbieri, & Doerr, 1995). The pedagogical features of major course management systems include collaboration and communication, content creation and delivery, administrative tools, learning tools, and assessment tools (Online Learning, 2005). The five categories that address the major components of an educational event as designed and developed for distance education are: learning goals and content presentation, interactions, assessment and measurement, instructional media and tools, and learner support systems and services (Ragan, 1999). In summary, the pedagogical features of online learning include a tool for learning, interactions (communication and collaboration), enhanced content and its delivery, measurement, and technical and administrative services. These features are intended to enhance interactions between learners and peer learners, learners and content, learners and technology, and learners and instructors (Seok, 2007a, 2007b, 2008).

All in all, students are no longer totally dependent on instructors for learning in an online environment when there is easy access to course content and information resources. For online learning to be successful, instructors as well as students must take on new roles in the teaching-learning relationship. Instructors must be willing to motivate and release control of learning to students (Illinois Online Network, 2007), and students must be able to assume more independence.

Students' Perceptions of Online Course Effectiveness

Findings from studies examining perceptions of online course effectiveness have been positive. For example, O'Malley and McGraw (1999) found that distance learning and online learning technologies were perceived by students as having some benefits, although these were not necessarily knowledge related. Their study investigated student perceptions of the effectiveness of distance and online learning to determine which dimensions of online learning provide advantages relative to traditional methodologies. A 128-item, 7-point Likert-type survey questionnaire was developed and administered to students in a variety of business courses at a participating university. Survey results indicated that students appeared to be ambivalent regarding the effectiveness of online learning when compared with traditional methodologies. Most of the relative advantages of online learning were related to saving time, scheduling, and being able to take more courses.

Koohang and Durante (2003) designed a 10-item Likert-type instrument based on instructional objectives to collect information about students' perceptions of the web-based distance learning activities and assignments portion of a hybrid program.
Subjects were 106 students enrolled in an undergraduate hybrid program on management fundamentals designed for working adults. Results indicated that, overall, students strongly perceived that the web-based distance learning activities/assignments portion of their program promoted learning. A significant difference was found among levels of learners' experience with the Internet in their perceptions of the web-based distance learning activities/assignments portion of the hybrid program.

In other words, subjects who had more experience with the Internet indicated significantly more favorable perceptions of the web-based distance learning activities and assignments portion than did subjects with less experience with technology and the Internet.

Finally, Jurczyk, Benson, and Savery (2004) used a standards-based approach to measure student perceptions in web-based courses in an effort to develop a process for evaluating perceptions. Standards were based on a literature review and interviews with 147 individuals. Subjects included faculty members, students, and administrators at six leading accredited institutions in distance education. Forty-five benchmarks were identified and organized into seven categories: institutional support, course development, teaching/learning process, course structure, evaluation and assessment, student support, and faculty support. In addition to a process, a questionnaire was developed to measure student attitudes before, during, and after taking the web-based course.

Instructors' Perceptions of Online Course Effectiveness

Positive findings have also been reported with regard to instructors' perceptions of online course effectiveness. For example, Wingard (2004) found that a significant number of faculty thought that adding Web-based components improved preparation, for themselves as well as for their students, and contributed to greater student engagement and active learning in the classroom. Within an online environment, faculty often felt they were more familiar with their students' academic progress during the term and reported a growing expectation that students could take more responsibility for independently learning the fundamentals from the readily available resources provided on the Web.

Guidera (2004) investigated the perceptions of faculty at public and nonprofit private institutions in the United States, including 2-year institutions, 4-year colleges, and universities, on the effectiveness of online instruction in terms of the seven principles of effective undergraduate education. These seven principles of good instructional practice are: encourage student-faculty contact, encourage cooperation among students, encourage active learning, provide prompt feedback, emphasize time on task, communicate high expectations, and respect diverse talents and ways of learning. The results indicated that online instruction was rated slightly more effective overall and more effective for promoting prompt feedback, time on task, respect for diverse learning styles, and communicating high expectations. However, it was rated less effective for promoting student-faculty contact and cooperation among students. Interestingly, perceived effectiveness was higher for experienced faculty and increased with the number of online courses taught (Guidera, 2004).

Finally, Seok (2006, 2007a, 2007b, 2008) identified indicators with which to evaluate online instruction at the postsecondary level. Ninety-nine indicators applicable to the evaluation of online instruction were identified and validated.
In doing so, Seok (2006) developed 11 categories as follows: flexibility (e.g., schedules, technical skills, and work settings), user interface (e.g., consistent user interface and appealing screens), navigation (e.g., menus and labels), getting started (e.g., class orientation and performance expectations), technical assistance (e.g., online help and on-call support), course management (instructor; e.g., manage student assignments and monitor student progress), course management (student; e.g., detailed syllabus and benchmarks for completing course requirements), universal design (e.g., accommodate students with disabilities and allow students to vary font size), communication (e.g., individual responses, announcements, discussions, and chat), online instructional design (e.g., glossary of all terms, content maps of all lessons, and additional resources for enrichment), and content.

These validated indicators can be transformed into item scales and subscales to evaluate the effectiveness of online instruction, and it is these indicators that are examined in this study. It is hoped that instruments derived from this study's findings may be used to evaluate the productivity and processes of e-learning (Seok, 2007a, 2007b).

METHODOLOGY

This study is a composite of findings from quantitative statistics (multidimensional scaling and one-way analysis of variance, ANOVA) drawn from two large-scale studies: Perceptions of Students and Instructors of Online and Web-Enhanced Course Effectiveness in Community Colleges (Tung, 2007) and Validation of Indicators by Rating the Proximity Between Similarity and Dissimilarity among Indicators in Pairs for Online Course Evaluation in Postsecondary Education (Seok, 2006).

Subjects and Sampling

Two groups participated in this study: instructors teaching community-college-level online courses and students enrolled in community-college-level online courses. Using convenience sampling, 281 instructors and 176 students were recruited via e-mail and asked to complete online survey questionnaires. A total of 193 instructors and 141 students completed the survey of online course effectiveness.

Variables

Independent Variables
There were eight independent variables (see Appendix) for instructors and students: gender, age, native language, academic major, educational level, technology skills, and number of online courses completed (for students) or taught (for instructors).

Dependent Variables
Eleven dependent variables were included to ascertain subjects' perceptions of online course effectiveness: flexibility, user interface, navigation, getting started, technical assistance, course management (instructor), course management (student), universal design, communications, instructional design, and content.

Measurement Instruments

Two online survey questionnaires based on the Inventory for Online Course Evaluation in Postsecondary Education (Seok, 2006) were administered to collect instructors' and students' perceptions of the effectiveness of various aspects of online learning and teaching. The two formats, one for instructors and one for students, consisted of the same questions. Each questionnaire consisted of two sections: perceptions of course effectiveness and personal data. The first section included 99 questions and used a 5-point Likert-type scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree). Subjects were asked to indicate whether they agreed or disagreed with each statement, all of which were positively worded. An open-ended question was included at the end of the survey to collect additional comments about course effectiveness in teaching and learning. The second section asked for students' and instructors' gender, age, native language, academic major, educational level, technology skills, and number of online courses completed (for students) or taught (for instructors). The survey questionnaires focused on subjects' perceptions in the following areas of online instruction: flexibility, user interface, navigation, getting started, technical assistance, course management (instructor), course management (student), universal design, communications, instructional design, and content.
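To make the scoring of such an instrument concrete, the sketch below averages 5-point Likert responses within each subscale to obtain one subscale score per respondent. It is a minimal illustration only: the item-to-subscale assignments and the responses are invented, not taken from the study's instrument or data.

```python
import pandas as pd

# Hypothetical mapping of survey items to two of the eleven subscales.
# The real instrument assigns all 99 items to subscales such as
# flexibility, user interface, navigation, and so on.
item_to_subscale = {
    "q1": "flexibility", "q2": "flexibility", "q3": "flexibility",
    "q4": "user_interface", "q5": "user_interface", "q6": "user_interface",
}

# Made-up responses on the 5-point scale (1 = strongly disagree ... 5 = strongly agree).
responses = pd.DataFrame(
    {"q1": [4, 5, 3], "q2": [4, 4, 2], "q3": [5, 4, 3],
     "q4": [3, 5, 4], "q5": [4, 4, 4], "q6": [3, 5, 3]},
    index=["respondent_1", "respondent_2", "respondent_3"],
)

# Average the items belonging to each subscale for every respondent.
subscale_scores = responses.T.groupby(item_to_subscale).mean().T
print(subscale_scores)
```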

Validity and Reliability of the Instrument

Validity. The 99 items were developed and validated in a previous study (Seok, 2006) in order to create an instrument for evaluating online courses: subject matter experts (SMEs) rated 4,851 pairwise comparisons of the items, and multidimensional scaling (MDS) was applied to those ratings.

Reliability. Cronbach's coefficient alphas were used to compute internal-consistency estimates of reliability for each subscale (flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, instructional design, and content) of the measurement instrument. According to Nunnally (1978), 0.7 is an acceptable reliability coefficient. With an internal-consistency estimate of reliability, individuals are administered a measure with multiple parts on a single occasion (Green & Salkind, 2005). In the current instrument, no items needed to be reverse-scaled, since all survey questions presented positively worded statements. Furthermore, all items shared the same metric, since the response scale for every item ranged from 1 = strongly disagree to 5 = strongly agree. All subscales in this study were found to have alpha levels greater than 0.7, indicating acceptable reliability (see Table 1).

TABLE 1
Reliability Statistics for Internal Consistency of Instructors' and Students' Perceptions of Online Course Effectiveness

Subscale                         Cronbach's Alpha (Instructors/Students)   Number of Items
Flexibility                      .72/.76                                   6
User interface                   .83/.84                                   9
Navigation                       .84/.84                                   6
Getting started                  .80/.81                                   6
Technical assistance             .83/.81                                   4
Course management (instructor)   .89/.91                                   10
Course management (student)      .80/.84                                   7
Universal design                 .80/.79                                   7
Communications                   .87/.87                                   8
Online instructional design      .90/.94                                   22
Content                          .92/.95                                   14
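The coefficients in Table 1 are Cronbach's alphas. As an illustration of how such an internal-consistency estimate is computed, the following sketch implements the standard formula and applies it to made-up responses for a hypothetical six-item subscale; it is not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses (rows = respondents, columns = six items of one subscale).
subscale_items = np.array([
    [4, 4, 5, 4, 3, 4],
    [5, 4, 4, 5, 4, 5],
    [3, 3, 4, 3, 3, 3],
    [4, 5, 5, 4, 4, 4],
    [2, 3, 3, 2, 3, 2],
])
print(round(cronbach_alpha(subscale_items), 2))
```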

In this study, the validation followed the four typical phases of factor analysis: Phase I, item development; Phase II, item distribution to a large sample; Phase III, application of factor analysis; and Phase IV, item revision. As mentioned earlier, MDS was conducted; this quantitative application is the item-validation process, which corresponds to Phase I of factor analysis. Multidimensional scaling relies on SMEs to validate the identified and developed items (one of the differences between factor analysis and multidimensional scaling) and, for the purposes of this study, consisted of six phases. Phase I involved a literature search to identify indicators crucial to effective online learning. Phase II involved two subject matter experts reaching consensus on the indicators, yielding a comprehensive set of 99 independent indicators considered representative of online course structure for postsecondary education. Phase III involved developing an online MDS instrument to accommodate the 4,851 pairwise comparisons in random order. Phase IV involved a panel of four SME judges rating the 4,851 pairwise comparisons ([99 x 98]/2, where 99 is the number of items developed); their ratings of the similarity of indicators confirmed the indicators' validity. A multiple-point scale from 1 to 9 was used, with higher numbers indicating similarity and lower numbers indicating dissimilarity. Phase V involved data collection via the online MDS instrument, and Phase VI involved applying MDS analysis to the comparisons. For the present study, a survey was then distributed to a larger sample to compare the perceptions of instructors and students, corresponding to Phase II of factor analysis. A one-way ANOVA was conducted after MDS and after the items were solidly validated (Seok, 2006, 2007a, 2007b, 2008).

Data Analysis

The results of the Seok (2006) study indicated that a three-dimensional solution was an appropriate model for the analysis, with a STRESS fit value of .24222 and an R-squared of 74%. The three dimensions were labeled accessibility, adaptability, and clarity of communication. Four clusters of indicators were identified: contextual accommodation, instructional access, guided learning, and organizational clarity (Seok, 2006, 2007a, 2007b). ANOVA tests were used to measure whether the means of subjects' perceptions on the course effectiveness subscales differed significantly across gender, age, native language, academic major, educational level, and technology skills, and whether the means of students' and instructors' perceptions of course effectiveness differed significantly from each other.

RESULTS AND DISCUSSION

The results relating to student perceptions of course effectiveness showed satisfactory internal-consistency estimates, ranging from .76 to .96 for the individual subscales within the surveys. Instructor perceptions of course effectiveness also showed satisfactory estimates, ranging from .72 to .92 for the individual subscales. Using Cronbach's coefficient alphas, the subscale scores were found to be reasonably reliable for this ANOVA study (see Table 1). The descriptive results (see Table 2) indicated that, overall, students and instructors had positive perceptions of the effectiveness of online courses.

TABLE 2
Instructors' and Students' Perceptions of Online Course Effectiveness by Subscale (Instructor/Student)

Subscale                         N         Mean      SD
Flexibility                      193/141   4.1/4.2   .6/.58
User interface                   193/141   4.0/3.9   .5/.56
Navigation                       193/141   4.0/3.9   .6/.65
Getting started                  193/141   4.2/4.0   .6/.64
Technical assistance             193/141   3.8/3.7   .8/.71
Course management (instructor)   193/141   4.1/3.9   .6/.70
Course management (student)      193/141   4.3/4.1   .5/.60
Universal design                 193/141   3.7/3.8   .6/.58
Communications                   193/141   4.4/4.1   .5/.63
Online instructional design      193/141   4.0/3.9   .5/.63
Content                          193/141   4.3/4.1   .5

Relationships Between Instructors' and Students' Perceptions of Online Course Effectiveness and Demographic Characteristics

The following sections describe the statistically significant findings from the demographic section of the surveys.

Gender

Gender was found to be a statistically significant factor for instructors and students alike: both female instructors and female students had statistically significantly higher perceptions of online course effectiveness than their male counterparts. One-way ANOVAs indicated significant differences between male and female instructors' perceptions on the following subscales. For getting started, F(2, 191) = 4.97, p < .05, the mean of female instructors (M = 4.2, SD = .57) was significantly higher than the mean of male instructors (M = 4.1, SD = .61). For technical assistance, F(2, 191) = 7.83, p < .05, the mean of female instructors (M = 3.9, SD = .75) was significantly higher than that of male instructors (M = 3.6, SD = .80). And for universal design, F(2, 191) = 7.77, p < .05, the mean of female instructors (M = 3.8, SD = .59) was significantly higher than that of male instructors (M = 3.6, SD = .64).

One-way ANOVAs also indicated significant differences between male and female students' perceptions in six subscales: user interface, getting started, technical assistance, communications, online instructional design, and content. The largest gaps between means were found in three of these subscales: user interface, online instructional design, and content. For user interface, F(2, 139) = 7.12, p < .05, the mean of female students (M = 4.0, SD = .53) was significantly higher than the mean of male students (M = 3.7, SD = .62). For online instructional design, F(2, 139) = 6.50, p < .05, the mean of female students (M = 4.0, SD = .62) was significantly higher than the mean of male students (M = 3.6, SD = .61). And for content, F(2, 139) = 13.26, p < .05, the mean of female students (M = 4.2, SD = .61) was significantly higher than the mean of male students (M = 3.7, SD = .72).
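Group comparisons of this kind can be reproduced with a one-way ANOVA on the subscale scores. The sketch below uses scipy.stats.f_oneway on simulated scores for two groups whose means and spreads loosely echo the getting started figures reported above; the values are generated purely for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical "getting started" subscale scores (1-5 scale) for two groups.
female_instructors = rng.normal(loc=4.2, scale=0.57, size=100).clip(1, 5)
male_instructors = rng.normal(loc=4.1, scale=0.61, size=94).clip(1, 5)

# One-way ANOVA: does the mean perception differ between the groups?
f_stat, p_value = stats.f_oneway(female_instructors, male_instructors)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```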

Teaching Experience

For instructors' teaching experience, Table 3 shows that small positive correlations were found between the number of online courses taught and most subscales, except flexibility and navigation. Thus, increased online teaching experience contributed to instructors' perceptions of delivering more effectively designed online courses. These findings indicate that the delivery of effective courses may depend upon increased teaching experience.

TABLE 3
Pearson Correlation Coefficients Between Number of Online Courses Taught and the Subscales

Subscale                         Correlation with online courses taught
Flexibility                      .117**
User interface                   .197**
Navigation                       .128**
Getting started                  .159**
Technical assistance             .180**
Course management (instructor)   .198**
Course management (student)      .205**
Universal design                 .184**
Communications                   .163**
Online instructional design      .185**
Content                          .170**

Note: **Correlation is significant at the 0.01 level (2-tailed). *Correlation is significant at the 0.05 level (2-tailed).
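The coefficients in Table 3 are Pearson correlations between the number of online courses an instructor has taught and that instructor's subscale scores. A minimal sketch of the same computation on simulated values follows; scipy's pearsonr also returns the two-tailed p value that underlies significance flags of the kind shown in the table.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical data: number of online courses each instructor has taught,
# and that instructor's mean "course management" subscale score (1-5 scale).
courses_taught = rng.integers(1, 20, size=193)
subscale_score = (3.5 + 0.03 * courses_taught + rng.normal(0, 0.5, size=193)).clip(1, 5)

r, p_value = stats.pearsonr(courses_taught, subscale_score)
print(f"r = {r:.3f}, p = {p_value:.3f} (two-tailed)")
```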

Native Language

For students' native language, the results indicated that native-English-speaking students had statistically significantly higher perceptions of online course effectiveness than non-native-English-speaking students in the user interface and getting started subscales. For user interface, F(2, 139) = 5.11, p < .05, the mean of English speakers (M = 3.9, SD = .54) was significantly higher than the mean of speakers of other languages (M = 3.4, SD = .76), while for getting started, F(2, 139) = 5.39, p < .05, the mean of English speakers (M = 4.0, SD = .62) was significantly higher than the mean of speakers of other languages (M = 3.4, SD = .88). In other words, non-native-English-speaking students were significantly less satisfied with online course effectiveness in the user interface and getting started subscales than native-English-speaking students. Such a finding deserves further attention in future research. Topics might include how computer user interface design affects the online learning of culturally diverse students and how to provide a better getting-started design in online learning for non-native-English-speaking students of diverse cultural backgrounds. No statistically significant differences were found in instructors' perceptions of the online course effectiveness subscales across instructors' native languages or academic majors.

Educational Level

Significant differences were found across instructors' educational levels (less than high school, associate's degree, bachelor's degree, master's degree, and doctoral degree) in their perceptions of the following online course effectiveness subscales: navigation, F(5, 188) = 2.66, p < .05; getting started, F(5, 188) = 4.32, p < .05; course management, F(5, 188) = 3.22, p < .05; and universal design, F(5, 188) = 3.64, p < .05. Significant differences also emerged across students' educational levels in their perceptions of online course effectiveness in the subscales of instructional design, F(6, 135) = 3.68, p < .05, and content, F(6, 135) = 5.20, p < .05. This finding indicates that instructors with higher educational levels may deliver more effectively designed online courses in terms of navigation, getting started, course management, and universal design.

Technology Skills

Results indicated significant differences across instructors' technology skills in their perceptions of online course effectiveness on the following subscales: flexibility, F(3, 190) = 3.81, p < .05; user interface, F(3, 190) = 3.82, p < .05; communications, F(3, 190) = 4.44, p < .05; online instructional design, F(3, 190) = 4.55, p < .05; and content, F(3, 190) = 6.74, p < .05. For the students, statistically significant differences were found across technology skills in their perceptions of online course effectiveness for the content subscale, F(3, 138) = 3.83, p < .05.

The flexibility mean value for beginners was higher than the value for instructors with intermediate skills; however, both mean values were significantly lower than the mean value for instructors with advanced technology skills. Thus, greater technology skill may contribute to instructors' perceptions of delivering more effectively designed online courses in the following course effectiveness subscales: flexibility, user interface, communications, online instructional design, and content. As a result, the delivery of effective online courses may depend upon increased technology skills, as illustrated in Table 4.

TABLE 4
Mean Instructor Perceptions of Selected Subscales Across Technology Skill Groups

Skill group    Flexibility   User Interface   Communications   Online Instructional Design   Content
Beginner       4.0           3.7              3.7              3.4                           3.5
Intermediate   4.0           3.9              4.3              3.9                           4.3
Advanced       4.2           4.1              4.4              4.1                           4.4
Total          4.1           4.0              4.4              4.0                           4.3

Comparison Between Instructors' and Students' Perceptions of Online Course Effectiveness

Results of the ANOVA showed that instructors had statistically significantly higher perceptions of online course effectiveness than students in the following subscales: getting started, course management, communications, and content (see Table 2).

Implications

The aforementioned results indicated that instructors had statistically significantly higher perceptions of the effectiveness of online courses than did students. These results deserve further attention and should be researched subsequently. For example, it would be advantageous to explore whether the differences in perceptions are due to generational gaps between what Prensky (2003) refers to as digital natives and digital immigrants. Results also showed positive correlations between the number of online courses taught and most subscales, except flexibility and navigation. These findings indicate that the delivery of effective courses depends upon teaching experience. Instructors with advanced technology skills had statistically significantly higher perceptions in some course effectiveness subscales, further implying that, with rapid advances in information technology, the popular use of course management systems, greater comfort in using a computer, and increased online learning and teaching experience come increasingly positive perceptions of online course effectiveness on the part of both students and instructors. Furthermore, native-English-speaking students had statistically significantly higher perceptions of online course effectiveness than did non-native-English-speaking students in terms of user interface and getting started; that is, non-native-English-speaking students were significantly less satisfied with online course effectiveness in these subscales than were native English speakers. Such findings may imply that non-native English speakers need additional assistance when taking online courses.

Suggestions for Future Research

Based on the findings of this study, further research is recommended for the following course effectiveness subscales: flexibility, communications, and online instructional design. For flexibility, further research is recommended on effective self-paced online learning and teaching, given the demand for self-paced flexibility from online course students and the concerns expressed by faculty about students' success in self-paced learning environments. For communications, it is recommended that appropriate communication theories and principles that meet the needs of online learning environments be included in future research. Finally, for online instructional design, it is recommended that future studies address students' learning preferences (Dunn & Dunn, 1978) in online instructional design. It is also the recommendation of the authors that non-native English speakers' perceptions receive further attention in future research.

CONCLUSION

The descriptive results indicated that, overall, students and instructors had positive perceptions of online course effectiveness. The findings are generally in line with past studies investigating the perceptions of instructors and students with regard to online courses. As previously discussed, there are three major aspects of online learning: cognitive, social, and teaching (Seok, 2006, 2007a, 2007b). The findings of this research relate to the teaching aspect in terms of students whose native language is not English, teaching experience, and technology skills. Teaching experience and technology skills were found to be positively correlated with perceptions of online course effectiveness, while students with native languages other than English had lower perceptions of online course effectiveness. Taken together, these findings underscore the importance of faculty development in online learning and of developing cognitive and social strategies for students with different cultural and linguistic backgrounds.

APPENDIX: INDEPENDENT VARIABLES

Nominal variables:
Gender: 1 = female, 2 = male.
Native language: 1 = English, 2 = other.
Academic major: 1 = business, 2 = continuing education, 3 = engineering or technology, 4 = fine arts, 5 = humanities, 6 = information technology, 7 = mathematics, 8 = natural sciences, 9 = nursing, dental, or allied health, 10 = social sciences.

Ordinal variables:
Educational level: 1 = less than high school, 2 = high school diploma or GED, 3 = associate's degree, 4 = bachelor's degree, 5 = master's degree, 6 = doctoral degree.
Technology skills: 1 = beginner (minimal experience using a computer and the Internet), 2 = intermediate (uses a computer and the Internet on a daily basis), 3 = advanced (able to solve his or her own problems in using a computer and the Internet).

Interval variable:
Number of online courses taught (instructors) or completed (students).
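To make the coding scheme above concrete, the fragment below expresses the appendix's code-to-label assignments as simple lookup tables and decodes one respondent record; the record itself is invented for illustration.

```python
# Code-to-label lookups taken from the appendix; the respondent record is hypothetical.
GENDER = {1: "female", 2: "male"}
NATIVE_LANGUAGE = {1: "English", 2: "other"}
EDUCATIONAL_LEVEL = {
    1: "less than high school", 2: "high school diploma or GED",
    3: "associate's degree", 4: "bachelor's degree",
    5: "master's degree", 6: "doctoral degree",
}
TECHNOLOGY_SKILLS = {1: "beginner", 2: "intermediate", 3: "advanced"}

record = {"gender": 1, "native_language": 2, "educational_level": 4,
          "technology_skills": 3, "online_courses_completed": 5}

print(GENDER[record["gender"]],
      NATIVE_LANGUAGE[record["native_language"]],
      EDUCATIONAL_LEVEL[record["educational_level"]],
      TECHNOLOGY_SKILLS[record["technology_skills"]],
      record["online_courses_completed"], "online courses completed")
```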
REFERENCES

Allen, I. E., & Seaman, J. (2005). Growing by degrees: Online education in the United States. Needham, MA: Sloan-C. Retrieved from http://www.sloan-c.org/resources/growing_by_degrees.pdf

BellSouth Foundation. (2003). The big difference: The growing technology gap between schools and students [Electronic version]. Retrieved from http://www.bellsouthfoundation.org/pdfs/pttreport03.pdf

Dunn, R., & Dunn, K. (1978). Teaching students through their individual learning styles: A practical approach. Reston, VA: Prentice Hall.

Dwyer, D., Barbieri, K., & Doerr, H. (1995). Creating a virtual classroom for interactive education on the Web [Electronic version]. Retrieved from http://www.igd.fhg.de/archive/1995_www95/papers/62/ctc.virtual.class/ctc.virtual.class.html

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Green, S. B., & Salkind, N. J. (2005). Using SPSS for Windows and Macintosh: Internal consistency estimates of reliability. Upper Saddle River, NJ: Pearson Prentice Hall.

Guidera, S. G. (2004). Perceptions of the effectiveness of online instruction in terms of the seven principles of effective undergraduate education. Journal of Educational Technology Systems, 32(2-3), 139-178.

Illinois Online Network. (2007). Instructional strategies for online courses [Electronic version]. Retrieved from http://www.ion.illinois.edu/resources/tutorials/pedagogy/instructionalstrategies.asp

Jurczyk, J., Benson, S., & Savery, J. R. (2004, Winter). Measuring student perceptions in web-based courses: A standards-based approach [Electronic version]. Online Journal of Distance Learning Administration. Retrieved from http://www.westga.edu/~distance/ojdla/winter74/jurczyk74.htm

Koohang, A., & Durante, A. (2003). Learners' perceptions toward the Web-based distance learning activities/assignments portion of an undergraduate hybrid instructional model. Journal of Information Technology Education, 2, 105-113.

Nunnally, J. (1978). Psychometric theory. New York, NY: McGraw-Hill.

O'Malley, J., & McGraw, H. (1999, Winter). Students' perceptions of distance learning, online learning and the traditional classroom [Electronic version]. Online Journal of Distance Learning Administration. Retrieved from http://www.westga.edu/~distance/omalley24.html

Online Learning. (2005). Online learning: Concepts, strategies and application [Electronic version]. Retrieved from http://www.prenhall.com/dabbagh/ollresources/resources9.html

Prensky, M. (2003, May/June). Overcoming educators' digital immigrant accents: A rebuttal [Electronic version]. The Technology Source Archives. Retrieved from http://technologysource.org/article/overcoming_educators_digital_immigrant_accents

Ragan, L. (1999). Good teaching is good teaching: An emerging set of guiding principles and practices for the design and development of distance education. CAUSE/EFFECT Journal, 22(1).

Seok, S. (2006). Validation of items by rating the proximity between similarity and dissimilarity among items in pairs for an online course evaluation instrument at the postsecondary educational level. Unpublished doctoral dissertation, University of Kansas, Lawrence.

Seok, S. (2007a). Item validation of online postsecondary courses: Rating the proximity between similarity and dissimilarity among item pairs (Validation study series I: Multidimensional scaling) [Electronic version]. Retrieved March 11, 2009, from http://www.springerlink.com/content/wxp145x6125q8j27

Seok, S. (2007b). Standards, accreditations, benchmarks in distance education. Quarterly Review of Distance Education, 8(4), 387-398.

Seok, S. (2008). Teaching aspects of e-learning. International Journal on E-Learning, 7(4), 725-741.

Tung, C. K. (2007). Perceptions of students and instructors of online and Web-enhanced course effectiveness in community colleges. Unpublished doctoral dissertation, University of Kansas, Lawrence.

Wingard, R. G. (2004). Classroom teaching changes in Web-enhanced courses: A multi-institutional study. Educause Quarterly, 27(1), 26-35.

AUTHOR BIOGRAPHICAL DATA

Boaventura DaCosta has a BS in computer science and an MA and PhD in instructional systems design. He is a researcher and the cofounder of Solers Research Group, Inc. in Orlando, FL. In addition to his research interests in cognitive psychology and information and communication technology innovations, DaCosta is interested in how games can be used in learning. Complementing his work as a researcher, DaCosta has worked in the commercial and government training sectors for the past 15 years as a software engineer and has been involved in a number of defense programs, including the Warfighters Simulation, the One Semi-Automated Forces simulation, and Future Combat Systems.

Carolyn Kinsell holds a PhD in instructional technology and a certification in human performance. Her career spans more than 18 years focused on the application of training, from analysis, to the development of virtual environments, to defining requirements and solutions for human performance standards, and, more recently, to the research and development of training applications. She has worked closely with the military, including cryptologists, intelligence specialists, and naval diving and salvage experts, as well as on the Force XXI Battle Command Brigade and Below Joint Capabilities Release. She has also supported commercial clients such as Cingular and North America Honda.

Soonhwa Seok has an MA and PhD in curriculum and instruction in special education from the University of Kansas. She has interests in educational communication and technology with applications for teaching English as a second language and special education. Most recently, as a postdoctoral researcher, she has examined and developed intersensory learning models, assistive technology, and motivation and feedback for students with learning disabilities. Another research focus is assistive technology evaluation, such as functional evaluation for assistive technology and supports intensity scales implementing assistive technology for students with disabilities. She has served as a peer reviewer for conference proposals, presented on web accessibility, and published articles on distance education and special education technology.
Chan Tung holds a PhD in instructional technology and has been teaching computer systems networking and telecommunications at Kansas City Community College for over 15 years.
