Efficiency of Online vs. Offline Learning: A Comparison of Inputs and Outcomes




International Journal of Business, Humanities and Technology, Vol. 2 No. 1; January 2012

Shweta Singh, Assistant Professor of Marketing, CFO 404, School of Management, Texas Woman's University, P.O. Box 425738, Denton, Texas 76204-5738, United States of America
David H. Rylander, Associate Professor of Marketing, School of Management, Texas Woman's University, P.O. Box 425738, Denton, Texas 76204-5738, United States of America
Tina C. Mims, Adjunct Professor of Marketing, School of Management, Texas Woman's University, P.O. Box 425738, Denton, Texas 76204-5738, United States of America

Abstract

As the trend toward online education intensifies, questions remain regarding the overall efficiency of online courses versus their in-class counterparts. This paper seeks to estimate the efficiency of students who take online courses relative to the efficiency of students enrolled in offline courses. Efficiency outcomes are defined in terms of (1) the quantitative score achieved by the student at the end of the course, (2) the student's view of how much they learned in the course, and (3) the student's level of satisfaction with the course. The authors use Data Envelopment Analysis (DEA) to estimate a model of student efficiency. Demographics, student experience and student preferences are examined as differentiating attributes. The sample is taken from a course offered both online and in a traditional classroom setting, with both formats taught by the same instructor in a single semester. Implications include a better understanding of the strengths and weaknesses in efficiency of different course formats.

Keywords: efficiency analysis, online learning, offline learning, student experience

Introduction

Teaching in the 21st century is suffused with technology that offers students anytime, anywhere possibilities for completing course work.
But is this flexibility of an online delivery format as efficient as the traditional face-to-face learning experience? As the trend toward online education intensifies, it leaves in its wake a series of unanswered questions regarding the overall efficiency of these online courses versus their in-class (i.e., offline) counterparts. Research comparing online versus face-to-face learning is mixed, with results ranging from online superiority to no difference to face-to-face superiority. Many of these results can be traced to differences in sample or method. This paper improves on previous efforts by using online and face-to-face samples of the same course, drawn from the same student population and taught by the same instructor, and by applying a new approach to analysis. A Data Envelopment Analysis (DEA) approach builds a model with effort as the input and with efficiency outcomes that include student performance, perceived learning and student satisfaction (Banker, Charnes and Cooper, 1984). The efficiency ratio sought in this study is akin to the efficiency ratio achieved using the DEA model in a business setting.

Centre for Promoting Ideas, USA www.ijbhtnet.com

Therefore, estimating the efficiency of students who take online courses relative to the efficiency of students who are enrolled in offline courses expands upon the current thinking in the literature.

Background

Scholars have accumulated evidence suggesting there is no difference in online versus offline student performance based on student demographic characteristics (Huh et al., 2010). In evaluating student performance based on completion rates of course materials, Olson (2002) found insufficient evidence that online versus offline delivery influences a student's completion of his or her coursework. Others found lower student performance in online classes (e.g., Trawick, Lile and Howsen, 2010), while some even found higher learning in an online format (e.g., Detwiler, 2008). In a comparison of traditional and hybrid sections of Principles of Marketing, Priluck (2004) found no difference in performance, yet a significant difference in student satisfaction. As technology continues to weave its way into all teaching and learning methods, investigations reveal a consistent use of the term performance. Unless otherwise stated by investigators, performance is assessed at the end of the course by the student's final mark, otherwise known as the course grade (Bliuc et al., 2010; Olson, 2002). Other ways of defining student performance use student test scores or other graded items (e.g., discussion boards, homework) as a variable (McFarland and Hamilton, 2005; Rivera and Rice, 2002). In either case, the term performance tends to indicate a grade achieved by the student, whether a course grade or an item grade. So far as studies predict student performance, indications are that the format of learning, i.e.
offline or online, is not a sufficient treatment to produce a significant difference in performance outcomes (McFarland and Hamilton, 2005; Rivera and Rice, 2002; Olson, 2002). In two studies reviewed, student learning was inferred from student grades at the end of the course (Biktimirov and Klassen, 2008; Brown and Liedholm, 2002). Consistent results in the literature point to the possibility that more than the format of learning influences student performance. While educators grapple with the transformation of formats and technical solutions for delivering coursework, investigators are likewise sifting through a haze of indicators in search of practical guidance for educators. Investigators have explored everything from a student's activity log (i.e., hits, access, attendance) (Biktimirov and Klassen, 2008; Chen and Peng, 2008; Cappel and Hayen, 2004) to a student's age, race, GPA, homework, and test scores (Lundgren and Nantz, 2002; Cheung and Kan, 2002) to capture signposts on how an educator might enhance student learning in either online or offline forums. Distinctive among these research efforts are a few investigations that branch out to consider student learning styles, study patterns, and learning approaches (Bliuc et al., 2010; Lu, Yu and Liu, 2003). Taking a psychometric approach helps expand the issues for future investigators to consider. Despite an increasing amount of research on technology and teaching, questions remain unanswered with regard to the overall efficiency of online courses versus their in-class counterparts. When thinking about performance in terms of the student's role as a producer, the student becomes a decision making unit (DMU) (Banker et al., 1984). Categorizing students as DMUs suggests new factors for investigators to consider. One factor commonly used in determining the performance of a DMU is an efficiency rating.
This study seeks to estimate the efficiency of students who take online courses relative to the efficiency of students who are enrolled in offline courses. Data Envelopment Analysis (DEA) is an ex post facto tool that assesses the relative efficiency of a DMU (Banker et al., 1984). DEA examines the DMU's ability to accomplish the desired production outcome by using an efficiency rating in its calculations (Banker et al., 1984). DEA, created by Banker (1980) and Banker et al. (1984), links the actual production achieved by a manufacturer to a post hoc evaluation of the DMU inputs behind the decisions that affect performance efficiency. Considering that decision makers in DMUs have certain quantifiable inputs at their disposal, Banker and colleagues proposed a post hoc evaluation tool, the DEA model, that yields an efficiency measurement of management decisions (Banker and Morey, 1986). Using the DEA model, management can run mathematical scenarios to determine the relative efficiency of its decisions and predict production outcomes. Seeing students as the management decision makers of their own academic performance is an unprecedented way of evaluating student performance.
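For reference, the output-oriented, variable-returns-to-scale DEA program used later in the paper (the BCC model of Banker et al., 1984) can be stated for a given DMU o as follows. This formulation is standard; the notation here is ours, not the paper's:

```latex
\begin{align*}
\max_{\phi,\;\lambda}\quad & \phi \\
\text{subject to}\quad
 & \sum_{j=1}^{n} \lambda_j\, x_{ij} \le x_{io}, \qquad i = 1,\dots,m \\
 & \sum_{j=1}^{n} \lambda_j\, y_{rj} \ge \phi\, y_{ro}, \qquad r = 1,\dots,s \\
 & \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0 .
\end{align*}
```

Here x_{ij} and y_{rj} are the inputs and outputs of DMU j, and the convexity constraint on the λ_j imposes variable returns to scale. DMU o is rated efficient when the optimal φ* equals 1; φ* > 1 measures the proportional expansion of all outputs that a convex combination of peer DMUs could achieve using no more input.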

Also comparable to DMUs behaving as management decision makers of their own performance, students have certain inputs (e.g., time spent studying) that regulate the performance they achieve. Analogously, students can have their performance efficiency determined from ascertained inputs as required by the DEA model. Therefore, a model such as the DEA model discussed here opens a new conversation for understanding the performance of online versus offline student learning environments.

Methodology

Data were collected from two sections of a Consumer Behavior class: one 100% online and the other 100% offline. Both sections were taught by the same instructor in the same time period. There were 26 students enrolled in the offline section and 44 students enrolled in the online section. A survey was administered at the end of the term to collect information on the students' level of satisfaction, perceived level of learning, and other differentiating characteristics. Please refer to Table 1 for the comparative frequency estimates of the variables used in this paper to predict student efficiency. Student efficiency scores were calculated using the DEA output-oriented VRS (variable returns to scale) model. Data Envelopment Analysis (DEA) is a nonparametric approach widely used in operations research to calculate production efficiencies. It is an extreme point methodology based on the premise that firms use certain inputs to produce certain outputs; the most efficient firm is the one that produces the maximum output for a given level of input, or uses the least input to produce a given level of output.

Results

The input used in this paper is the student's effort level, measured by the number of hours per week the student spends studying for the course.
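The paper computes the efficiency scores with the EMS package. Purely as an illustration of the computation (this is our own sketch, not the authors' code, and the function and variable names are ours), the output-oriented VRS score for each student can be obtained by solving one small linear program per student:

```python
import numpy as np
from scipy.optimize import linprog

def dea_vrs_output(inputs, outputs):
    """Output-oriented BCC (VRS) DEA scores.
    inputs:  (n, m) array, one row per DMU (student), m inputs (e.g., study hours).
    outputs: (n, s) array, s outputs (e.g., score, learning, satisfaction).
    Returns phi for each DMU; phi == 1 marks an efficient DMU."""
    X = np.asarray(inputs, float)
    Y = np.asarray(outputs, float)
    n, m = X.shape
    s = Y.shape[1]
    phis = []
    for o in range(n):
        # decision variables: [phi, lambda_1..lambda_n]; maximise phi -> minimise -phi
        c = np.r_[-1.0, np.zeros(n)]
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        # input constraints: sum_j lambda_j * x_ij <= x_io
        A_ub[:m, 1:] = X.T
        b_ub[:m] = X[o]
        # output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
        A_ub[m:, 0] = Y[o]
        A_ub[m:, 1:] = -Y.T
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (1 + n))
        phis.append(res.x[0])
    return np.array(phis)
```

A score of phi = 1 marks an efficient student; phi = 1.25, say, would mean a composite of peer students produces 25% more of every output from no more study time.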
Around 65% of students enrolled in the offline section, compared to nearly 46% of students in the online section, said they put in between 3 and 4 hours of study for the course each week. Around 34% of online students and 23% of offline students said they put in between 1 and 2 hours of study, while nearly 21% of online students and around 12% of offline students reported putting in 5 hours or more. The three measures of output were (1) the absolute scores the students received at the end of the course, (2) their self-reported levels of learning and (3) their self-reported satisfaction. While the average score for the online class was 78, the average score for the offline class was 70. The students were asked to rate their levels of satisfaction and learning on a 7-point Likert scale. The average satisfaction score for the online class was 6.15, versus 5.90 for the offline class. The average learning score for the online class was 6.00, versus 5.73 for the offline class. We used the EMS software to estimate our DEA model of efficiency. The output-oriented model identifies the most efficient students as those who produce the highest possible outputs with given amounts of inputs. The variable returns to scale specification allows the amount of output produced to increase more or less than proportionately with the increase in input. While 38% of the offline students were classified as efficient, 56% of the online students were deemed efficient. To better identify the discriminating variables that distinguish the efficient students from the inefficient ones, we used a Logit model. The estimates from the Logit model are provided in Table 2. We used a dummy variable, online, to indicate whether the student was enrolled in the online or the offline section of the course. Our results indicate that online students tend to be more efficient than offline students.
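The Logit step is a standard binary logistic regression of the efficient/inefficient classification on the candidate covariates. A minimal Newton-Raphson fit (again a sketch of ours, not the authors' estimation code; any statistics package yields the same maximum-likelihood estimates) looks like this:

```python
import numpy as np

def logit_fit(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson.
    X: (n, k) design matrix including an intercept column.
    y: (n,) array of 0/1 labels (here, 1 = DEA-efficient student).
    Returns the fitted coefficient vector b."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))   # fitted probabilities
        W = p * (1.0 - p)                  # IRLS weights
        # Newton step: b <- b + (X' W X)^{-1} X' (y - p)
        H = X.T @ (X * W[:, None]) + 1e-9 * np.eye(X.shape[1])
        b = b + np.linalg.solve(H, X.T @ (y - p))
    return b
```

A positive coefficient, like the one on the online dummy in Table 2, raises the log-odds of a student being classified as efficient.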
Past studies have found no concrete evidence of the superiority of one method of instruction over the other. Consistent with past studies of the factors affecting student performance and satisfaction (McFarland and Hamilton, 2005), we find that efficiency decreases as students become busier. The variable busy is measured by the number of courses the student took in the semester. All course materials were made available online on the course webpage for both the online and the offline sections, and all quizzes and exams were conducted online for both sections. Given these conditions, doing well in either section required a certain level of proficiency and familiarity with the Internet. We asked the students to rate their familiarity with the Internet on a 7-point scale and found that familiarity with the Internet does increase efficiency. Students were also asked to select all aspects of the course they liked. We wanted to see whether students differed in their preferences for certain aspects of the course, and found that efficient students liked the study material provided online and the quizzes conducted online more than the inefficient students did. Our findings also indicate that, compared to students who did not hold a job at the time, students who worked either part time or full time proved to be more efficient. Age did not appear to be a differentiating factor between the efficient and the inefficient students.

Discussion

Comparison of online versus offline learning is no doubt of substantial interest to educators and the focus of numerous studies. As preference for online learning increases, mostly due to the convenience and flexibility it offers students, universities find themselves increasing the number of online-format courses to meet the growing demand. However, the question remains whether the delivery format of a course, i.e. online versus offline, affects student performance, satisfaction and learning. Many prior studies report mixed results. Our study takes a novel approach by opening a discussion for future investigators about measures that affect student efficiency. Using the DEA approach to estimate student efficiency, we found sufficient evidence to indicate that students taking the online course format are more efficient than their offline counterparts. The results indicate a difference between online and offline formats when the number of hours students spend studying is treated as an input to student performance. Student performance here includes the student's final grade and self-reported levels of learning and satisfaction with the course experience. Additionally, the DEA approach reveals sufficient evidence that course load negatively affects student efficiency. Finally, working full or part time, familiarity with the Internet, and a preference for online course material all positively affect a student's efficiency under the DEA approach. Limitations of this study should be noted. The sample is not necessarily representative of other courses, other teaching approaches or other student populations. Additional research is needed with a variety of samples. However, the approach presented here, where the course content, timing, testing method and instructor are all controlled, can be used as a model.
The sample courses also had differences in student demographics. The online class had more older students, who were more likely to hold jobs, while the offline students carried heavier course loads. These differences could complicate interpretation of the results. As educators continue to struggle to meet the intertwined demands of technology, teaching and learning through courses designed to fit a disparate population of students, investigators may benefit from operational models that liken the student to the producer of his or her own performance. This study opens a discussion for future investigators to consider additional indicators or even different operational measures. The DEA approach helps explain student performance in offline versus online formats by relating the student's role to that of a DMU. Without the DEA approach, the investigation might not have yielded significant findings. Using the DEA approach, sufficient evidence exists to identify characteristics that differ significantly between offline and online formats. This study and approach can help business schools make better decisions regarding course format and can demonstrate the value and potential of online courses.

REFERENCES

Banker, R.D. (1980). Studies in cost allocation and efficiency evaluation. Unpublished doctoral thesis, Harvard University, Graduate School of Business Administration.
Banker, R.D., Charnes, A., and Cooper, W.W. (1984). Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science, 30 (9, Sept.).
Banker, R.D. and Morey, R.D. (1986). Efficiency analysis for exogenously fixed inputs and outputs. Operations Research, 34 (4, July-Aug.).
Biktimirov, E.N. and Klassen, K.J. (2008). Relationship between use of online support materials and student performance in an introductory finance course. Journal of Education for Business, 83 (3, Jan.-Feb.).
Bliuc, A., Ellis, R., Goodyear, P., and Piggott, L. (2010).
Learning through face-to-face and online discussions: Associations between students' conceptions, approaches and academic performance in political science. British Journal of Educational Technology, 41 (3).
Brown, B.W. and Liedholm, C.E. (2002). Can web courses replace the classroom in principles of microeconomics? American Economic Review, 92 (2).
Cappel, J.J. and Hayen, R.L. (2004). Evaluating e-learning: A case study. Journal of Computer Information Systems, Summer.
Chen, Y.F. and Peng, S.S. (2008). University students' Internet use and its relationships with academic performance, interpersonal relationships, psychosocial adjustment, and self-evaluation. CyberPsychology & Behavior, 11 (4, Aug.), 467-469.
Cheung, L.L.W., and Kan, A.C.N. (2002). Evaluation of factors related to student performance in a distance-learning business communication course. Journal of Education for Business, 77 (5, May-June).
Detwiler, J.E. (2008). Comparing student performance in online and blended sections of a GIS programming class. Transactions in GIS, 12 (1), 131-144.

Huh, S., Jin, J.J., Lee, K.J., and Yoo, S. (2010). Differential effects of student characteristics on performance: Online vis-à-vis offline accounting courses. Academy of Educational Leadership Journal, 4.
Lu, J., Yu, C.S., and Liu, C. (2003). Learning style, learning patterns, and learning performance in a WebCT-based MIS course. Information & Management, 40 (6), 497-507.
Lundgren, T.D. and Nantz, K.S. (2002). Student attitudes toward Internet courses: A longitudinal study. Journal of Computer Information Systems, Summer.
McFarland, D. and Hamilton, D. (2005-2006). Factors affecting student performance and satisfaction: Online versus traditional course delivery. Journal of Computer Information Systems, (Winter), 25-32.
Olson, D.A. (2002). Comparison of online and lecture methods for delivering the CS 1 course. Journal of Computing Sciences in Colleges, 18 (2, Dec.).
Priluck, R. (2004). Web-assisted courses for business education: An examination of two sections of Principles of Marketing. Journal of Marketing Education, 26 (2), 161-173.
Rivera, J.C. and Rice, M.L. (2002). A comparison of student outcomes and satisfaction between traditional and web based course offerings. Online Journal of Distance Learning Administration, 3 (Fall).
Trawick, M.W., Lile, S.E. and Howsen, R.M. (2010). Predicting performance for online students: Is it better to be home alone? Journal of Applied Economics and Policy, 29 (Spring), 34-46.
Table 1: Frequency Table of Relevant Variables

Variable                                  Online section (%)   Offline section (%)
Level of learning:
  4                                       4.54                 3.84
  5                                       22.72                46.15
  6                                       40.90                23.07
  7                                       31.81                26.92
Level of satisfaction:
  4                                       4.54                 7.69
  5                                       20.45                26.92
  6                                       29.54                34.61
  7                                       45.45                30.76
Effort:
  1-2 hours/week                          34.09                23.07
  3-4 hours/week                          45.45                65.38
  More than 5 hours/week                  20.45                11.53
Age:
  18-30 years                             77.27                100.00
  Greater than 30 years                   22.72                0.00
Liking for online study material          57.14                57.69
Liking for quizzes                        58.57                50.00
Liking for course schedule                55.71                42.30
Liking for instructor's teaching style    41.42                76.92
Job:
  Working full time                       43.18                11.53
  Working part time                       31.81                65.38
  Not working                             25.00                23.07
Familiarity with the Internet:
  4                                       2.27                 3.84
  5                                       11.36                15.38
  6                                       27.27                34.61
  7                                       59.09                46.15
Busy:
  Less than 3                             0.00                 11.53
  3                                       81.81                42.30
  4                                       15.90                34.61
  More than 4                             2.27                 11.53

Table 2: Logit Output for Identifying the Most Efficient Students

Parameter                                 Estimate   Std. error   Pr > ChiSq
Online                                    3.27       1.51         0.03
Busy                                      -0.96      0.47         0.04
Age (18 to 30 years)                      0.05       1.09         0.95
Liking for online study material          1.74       1.03         0.09
Liking for quizzes                        1.69       0.96         0.07
Liking for course schedule                1.13       0.72         0.11
Liking for instructor's teaching style    1.51       1.02         0.13
Full time working                         4.14       1.47         0.004
Part time working                         2.40       1.26         0.05
Familiarity with Internet                 1.73       0.80         0.03

Model Fit Statistics:
Criterion    Intercept Only   Intercept and Covariates
AIC          88.38            63.24
SC           90.62            87.82
-2 Log L     86.38            41.24

R-Square: 48.01%   Max-rescaled R-Square: 67.24%