GRADUATE STUDENT SATISFACTION WITH AN ONLINE DISCRETE MATHEMATICS COURSE *




Amber Settle
DePaul University
243 S. Wabash Avenue
Chicago, IL 60604
(312) 362-5324
asettle@cti.depaul.edu

Chad Settle
University of Tulsa
600 South College
Tulsa, OK 74104
(918) 631-3157
chad-settle@utulsa.edu

* Copyright 2005 by the Consortium for Computing Sciences in Colleges. Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the CCSC copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Consortium for Computing Sciences in Colleges. To copy otherwise, or to republish, requires a fee and/or specific permission.

ABSTRACT

Student satisfaction with distance learning is impacted by a variety of factors, including interaction with the instructor and the structure of the course. We describe our experiences teaching discrete mathematics to graduate students using both a traditional classroom setting and two different types of distance learning formats. We then compare student evaluations between the traditional and distance-learning courses to determine if student satisfaction was affected by the course format.

INTRODUCTION

The School of Computer Science, Telecommunications, and Information Systems (CTI) is the largest and most diverse institution for information technology education in the United States. Over 40% of all information technology graduate students in Illinois are enrolled at CTI. There are 10 M.S. degree programs at CTI, one M.A. degree program in Information Technology, and joint M.S. and M.A. degrees with the School of Commerce, the College of Law, and the School for New Learning.

CTI was one of the first schools at DePaul to begin offering distance learning. Distance education at CTI is designed not only to serve students who live outside the Chicago area, but also to provide more value for students who live in or near Chicago. Distance learning has proven to be a popular option with students. There are now 8 M.S. degrees and 1 M.A. degree available as purely online programs at CTI, and 82 distance learning courses are being offered in Fall 2004. Students enrolled in online degrees accounted for 16% of the total graduate enrollments in Spring 2004.

Given the success of distance learning at CTI, it is clear that students find it a useful way to complete their education. Distance learning, however, can be difficult for students. It has been asserted that while learning outcomes from in-class and distance learning courses may be similar, the distance learning format may not be satisfying to students [1]. In this paper, we describe the first author's experience teaching graduate discrete mathematics both in the standard classroom setting and via distance learning. We investigate whether course evaluations in the distance-learning discrete mathematics classes differ from evaluations in the more traditional classes, and we attempt to understand how distance learning affects student satisfaction with both the course and the instructor.

THE COURSE

The course discussed in this paper is CSC 415: Foundations of Computer Science. While the title of CSC 415 does not include the words "discrete mathematics" for historical reasons, the course covers traditional discrete mathematics material, including propositional and predicate logic, proofs by induction, understanding basic algorithms, asymptotic analysis, recurrence relations, basic graph theory, and graph algorithms.

Each of the Master's degrees at CTI has a prerequisite phase designed to prepare students for graduate study in their chosen area, a common solution for students who wish to switch areas of study after completing their undergraduate degree [4]. CSC 415 is designed to prepare graduate students without a mathematical background for the algorithms material required in the more technical degree programs, and it is part of the prerequisite phase for a number of degrees at CTI.

DISTANCE LEARNING AT CTI

Students at CTI have two different options for distance learning courses. The majority of sections are sibling distance learning sections that run parallel to a regular section of the course. As an alternative, a small number of sections of pre-recorded distance learning courses are available.

When distance learning was introduced to CTI in the Spring quarter of 2001, the goal was to provide a large variety of distance learning courses while minimizing the impact on the faculty and the regular sections of courses. To do so, staff at CTI developed a hardware and software system called Course Online (COL). COL simultaneously captures audio, video, the instructor's notes written on the whiteboard, and the images displayed on the instructor's computer screen. The capture of the information is done automatically, and although the equipment is monitored remotely, there are no staff in the classroom when the recording is done. Instead, the lecture is captured using a camera fixed at the back of the room. The video, the audio, the whiteboard, and the computer screen are synchronized, and by the morning after the lecture they are made available to students registered in the class. The recordings remain online for the entire quarter and are part of an integrated course management system that allows faculty to post course information such as the syllabus, assignments, class notes, and grades, and that includes a homework submission system. This approach to distance learning is technology intensive rather than labor intensive, which makes sibling distance learning unique [2].

Though the system is asynchronous, distance learning students can hear the comments and questions made by in-class students. The distance-learning students and the students in the regular section can also communicate easily with each other via threaded discussion boards and with the instructor via e-mail and posted announcements.

Once sibling distance learning courses proved successful, the CTI administration began to consider whether alternative models of distance learning would be feasible. One experiment involved the creation of pre-recorded distance learning classes. Unlike the sibling distance learning courses described above, a pre-recorded distance learning class is not connected to a live section of a course. Instead, lectures are recorded in a specially equipped classroom, and the lectures for an entire quarter are recorded prior to the offering of the course. Unlike a sibling distance learning class, there are no students in the classroom when the recording is made, and the recording is made solely for the use of the distance learning students. The image quality is better, since multiple cameras are used, the cameras are placed closer to the instructor, and the cameras are manned by operators.

Prior to the start of the quarter, the pre-recorded lectures are placed on a COL site along with all of the homework, the lecture notes, the syllabus, and any other information the instructor has for the students. When the quarter begins, all of this material is available to the students. Students communicate with the instructor via e-mail and with each other via threaded discussion forums. While there are no student interactions on the recordings, the course tends to be more structured and better organized than the standard sibling distance learning course at CTI.

DISCRETE MATH AND DISTANCE LEARNING

We now describe the first author's experience teaching CSC 415, including regular sections, sibling distance learning sections, and pre-recorded distance learning sections, during the past several years.

The Courses

Since Fall 2001, the first author has taught 9 sections of CSC 415. Of these, three have been regular sections of the course, two have been sibling distance learning sections, and four have been pre-recorded distance learning courses. The regular sections were Fall 2001 (with 37 students), Winter 2002 (with 37 students), and Spring 2003 (with 18 students). The sibling DL courses were Fall 2001 (with 10 students) and Winter 2002 (with 6 students). The pre-recorded DL sections were Fall 2002 (with 4 students), Winter 2003 (with 11 students), Spring 2003 (with 7 students), and Fall 2003 (with 8 students).

The format of the course varied according to its type. The regular sections and the sibling distance learning sections consisted of 10 weeks of lectures with weekly assignments, a midterm exam, and a final exam. The grade in the Fall 2001 and Winter 2002 courses was based 30% on assignments, 35% on the midterm, and 35% on the final exam. The grade in the Spring 2003 course was based 25% on assignments, 35% on the midterm, and 40% on the final exam.
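
To make the weighting just described concrete, the following is a minimal sketch (in Python, with made-up component scores) of how a final percentage is obtained as a weighted sum of the assignment, midterm, and final averages; the function and the score values are ours, for illustration only.

    # Hypothetical sketch of the weighted grading used in the regular and
    # sibling sections; the component scores below are invented for illustration.
    def weighted_grade(scores, weights):
        assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must total 100%
        return sum(scores[part] * weights[part] for part in weights)

    fall_winter_weights = {"assignments": 0.30, "midterm": 0.35, "final": 0.35}
    spring_2003_weights = {"assignments": 0.25, "midterm": 0.35, "final": 0.40}
    scores = {"assignments": 88.0, "midterm": 76.0, "final": 81.0}  # each out of 100

    print(weighted_grade(scores, fall_winter_weights))  # about 81.35
    print(weighted_grade(scores, spring_2003_weights))  # about 81.0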

The pre-recorded distance learning courses were organized in five modules: an introduction to the course including an algebra review, propositional and predicate logic, mathematical induction, algorithms, and graph theory. Students were provided with a traditional lecture and a recorded problem-solving session for each topic.

The grading for each pre-recorded course varied. In the Fall 2002 class there were no fixed deadlines for any of the module assignments or exams. The students were required to complete the module homework with 75% or better before taking the associated exam. Each of the four module exams (there was no module exam for the introductory module) counted for 25% of the grade. While this type of schedule provided students with maximum flexibility, it also resulted in significant procrastination by the students in the class.

In the Winter and Spring 2003 quarters, the module assignments and exams were given fixed deadlines. Each module assignment counted for 2% of the course grade, for a total of 10% of the grade, and late assignments were not accepted. The exams for the second, third, and fifth modules counted for 20% of the grade each, and the exam for the algorithms module counted for 30% of the grade. This scheme resulted in significantly less procrastination, although due to the length of the algorithms module, many students allocated insufficient time for its completion.

In the Fall 2003 course, the algorithms module was broken into two parts, each counting for 1% of the grade. The grading was also changed. The logic and induction modules were still worth 20% of the grade each, but the algorithms module counted for 35% of the grade, and the graph theory module counted for only 15% of the grade.

CTI Course Evaluations

Student evaluations are conducted at CTI for every course during every quarter. The evaluations are conducted online via CTI's Web site. The students must log into a secure system and may submit only one evaluation per CTI course in which they are enrolled. No identifying information about the student is associated with the evaluation, so the evaluations are anonymous. Completing an evaluation is mandatory for all students enrolled in CTI courses. Course evaluations are completed during the 8th and 9th weeks of the 10-week quarter, although results are not made available to instructors until after grades are submitted.

The evaluations consist of 22 multiple-choice questions and several sections for comments. The multiple-choice questions ask the student to rate various aspects of the course and of the instructor for the course. The ratings are on a scale from 0 to 10, and the meaning of a rating depends on the question. In general, a higher number indicates a greater degree of student satisfaction with the area addressed by the question. A zero indicates that the student feels the question is not applicable.

Below we compare course evaluations between the regular sections and the pre-recorded distance-learning sections of CSC 415. Ideally, we would like to compare the two different models of distance learning to each other, but unfortunately, due to an error in the system during the first year distance learning was offered at CTI, course evaluations were not conducted for those courses. For this reason, we have no course evaluation data for the sibling distance learning sections taught in the Fall 2001 and Winter 2002 quarters.
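
Returning to the grading schemes described above, the following is a minimal sketch (in Python, with invented scores) of the Fall 2002 rules: the 75% homework gate and the four equally weighted module exams. The module names, data structures, and numbers are our own illustrative assumptions.

    # Hypothetical sketch of the Fall 2002 pre-recorded course rules: a module
    # exam may be taken only after the module homework is completed with 75% or
    # better, and each of the four module exams counts for 25% of the grade
    # (the introductory module has no exam).
    EXAM_MODULES = ["logic", "induction", "algorithms", "graph theory"]

    def exam_unlocked(homework_score):
        return homework_score >= 75.0

    def final_grade(exam_scores):
        return sum(exam_scores[module] * 0.25 for module in EXAM_MODULES)

    homework = {"logic": 80.0, "induction": 90.0, "algorithms": 82.0, "graph theory": 85.0}
    for module in EXAM_MODULES:
        print(module, "exam available" if exam_unlocked(homework[module]) else "homework below 75%")

    exams = {"logic": 85.0, "induction": 78.0, "algorithms": 70.0, "graph theory": 88.0}
    print(final_grade(exams))  # 80.25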

Student Satisfaction

On the course evaluation, there are ten questions labeled as course-related questions. However, one of them, "How fair is the grading of the homework and exams of this course?", focuses more on an instructor-related issue and will be considered in the next section. One of the questions listed in the instructor-related section, "Would you recommend this course to another student?", deals more with course-related factors and will be considered here. The course-related questions (Q_CR) are listed below:

1. Was this course well organized?
2. Do you feel the course objectives were accomplished?
3. The amount of work you performed outside of this course was:
4. How difficult was this course material?
5. The textbook for this course was:
6. Supplementary reading for this course was:
7. The assignments for this course were:
8. What is your overall estimate of this course?
9. How valuable was this course in terms of your technical development?
10. Would you recommend this course to another student?

We next analyze the course evaluation data for these questions. Since all sections of CSC 415, both traditional and distance-learning, were taught by the same instructor, we need not include a dummy variable to net out instructor-specific fixed effects. Instead, we focus our analysis on two key variables: time, and whether the course was a traditional section or a distance-learning section. Our Ordinary Least Squares (OLS) regression equation is given by equation (1):

    Q_i = β_0 + β_1 X_1i + β_2 X_2i + u_i     (1)

where Q_i is the response to an individual question on the course evaluation, β_0 is a constant term (β_0 is the expected question score when X_1i = 0 and X_2i = 0), X_1i is time, X_2i is a dummy variable with X_2i = 0 if the course is a traditional course and X_2i = 1 if the course is a distance-learning course, and u_i is an error term.

Time is included as a proxy in order to account for systematic and immeasurable changes in teaching and/or student attitudes across time. The time variable captures changes in the course structure, changes in the professor's teaching methods, and changes in general student attitudes toward their educational experience across time. One reason to include time as a variable is to eliminate spurious correlation. Time is measured in quarters taught at DePaul University (for Fall 2001, X_1i = 1; for Winter 2002, X_1i = 2; and so on).

The dummy variable for distance-learning sections is included in order to account for differences between distance-learning sections of the course and traditional sections of the course. If β_2 is statistically different from 0, it indicates a difference in how students view distance-learning sections as compared to traditional sections.

Table 1 summarizes our results. The pooled sample including all evaluations from all 6 sections of the course yields a total of 100 observations for each question. After dropping responses of zero for each question (since a response of zero indicates that the student felt the question was not applicable to the course or instructor), regression equation (1) is run 22 times, once for each question on the evaluation, in order to determine which questions show a statistically significant difference across time and for distance-learning sections.
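
The following is a minimal sketch of how a regression of the form in equation (1) could be estimated for a single question, using Python with pandas and statsmodels; the DataFrame layout, column names, and choice of library are our own assumptions for illustration and are not taken from the original analysis.

    # Hypothetical sketch of equation (1) for one evaluation question.
    # Assumed columns: 'score' (the 0-10 response to the question), 'time'
    # (quarter index: Fall 2001 = 1, Winter 2002 = 2, ...), and 'dl'
    # (0 = traditional section, 1 = distance-learning section).
    import pandas as pd
    import statsmodels.api as sm

    def run_question_regression(responses: pd.DataFrame):
        # A response of zero means "not applicable" and is dropped before fitting.
        data = responses[responses["score"] > 0]
        X = sm.add_constant(data[["time", "dl"]])  # beta_0 (constant), beta_1 (time), beta_2 (DL)
        model = sm.OLS(data["score"], X).fit()
        # Coefficient estimates, standard errors, and p-values for each term.
        return model.params, model.bse, model.pvalues

    # In the paper, this regression is repeated once for each of the 22 evaluation questions.
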
Table 1: Regression Results for Course-related Questions

    Question   Time                DL
    Q-CR1      -0.164** (0.075)     0.200 (0.393)
    Q-CR2      -0.120 (0.079)      -0.023 (0.416)
    Q-CR3       0.064 (0.086)       0.221 (0.460)
    Q-CR4       0.305*** (0.102)   -0.549 (0.530)
    Q-CR5       0.185 (0.144)      -0.261 (0.758)
    Q-CR6       0.046 (0.164)       0.302 (0.910)
    Q-CR7      -0.071 (0.101)       0.237 (0.533)
    Q-CR8      -0.179** (0.089)     0.594 (0.467)
    Q-CR9      -0.199* (0.110)      0.620 (0.573)
    Q-CR10     -0.202** (0.092)     0.067 (0.486)

Coefficient estimates are presented with standard errors in parentheses.
* Statistically significant at the 10% level on a two-tailed test.
** Statistically significant at the 5% level on a two-tailed test.
*** Statistically significant at the 1% level on a two-tailed test.

The coefficients for course-related questions 1, 4, 8, 9, and 10 were all significantly different from zero across time. None of the coefficients for the course-related questions were significantly different from zero for the distance-learning sections. While students recognize a change in the course over time, they do not attribute the change to the distance-learning sections. Additionally, when the regression is run without time as an explanatory variable, no course-related coefficients are statistically different from zero.

These results are interesting because they indicate that distance-learning students' satisfaction with the course does not differ in a statistically significant way from the satisfaction of the regular-section students. This is despite the differing format of the two courses, with the regular section following a traditional 10-week lecture format and the distance-learning section organized around five somewhat self-paced modules.

The remaining twelve course evaluation questions concern instructor-related (Q_IR) factors:

1. How would you characterize the instructor's knowledge of this subject?
2. How would you characterize the instructor's ability to present and explain the material?
3. Does the instructor motivate student interest in the subject?
4. How well does the instructor relate the course material to other fields?
5. Did the instructor encourage participation from the students?
6. Was the instructor accessible outside of class?
7. What was the instructor's attitude? How did he/she deal with you?
8. How well did the instructor conduct, plan, and organize classes?
9. Were the instructor's teaching methods effective?
10. How fair was the grading of the homework and exams of this course?
11. Would you take this instructor for another course?
12. Rate the teaching effectiveness of this instructor as compared to other faculty in the department.
Table 2: Regression Results for Instructor-related Questions

    Question   Time                DL
    Q-IR1      -0.010 (0.072)       0.065 (0.379)
    Q-IR2      -0.069 (0.069)      -0.029 (0.360)
    Q-IR3      -0.194** (0.083)    -0.016 (0.437)
    Q-IR4      -0.189* (0.098)      0.043 (0.515)
    Q-IR5      -0.267** (0.102)     0.481 (0.654)
    Q-IR6      -0.053 (0.090)      -0.183 (0.468)
    Q-IR7      -0.115 (0.084)       0.013 (0.437)
    Q-IR8      -0.110 (0.084)       0.042 (0.442)
    Q-IR9      -0.181** (0.088)     0.570 (0.454)
    Q-IR10      0.010 (0.087)       0.304 (0.451)
    Q-IR11     -0.096 (0.089)       0.411 (0.461)
    Q-IR12     -0.114 (0.082)       0.268 (0.443)

Coefficient estimates are presented with standard errors in parentheses.
* Statistically significant at the 10% level on a two-tailed test.
** Statistically significant at the 5% level on a two-tailed test.

The coefficients for instructor-related questions 3, 4, 5, and 9 were all significantly different from zero across time. None of the coefficients for the instructor-related questions were significantly different from zero for the distance-learning sections. While students recognize a change in the course over time, they do not attribute the change to the distance-learning sections. When the regression is run without time as an explanatory variable, none of the coefficients for the instructor-related questions are significantly different from zero.

The result for instructor-related question 12 is interesting, since that is the question that has traditionally been used to evaluate faculty for tenure and promotion as well as for annual merit raises. It is anecdotal common knowledge among CTI faculty that distance-learning courses produce lower student evaluations. While the sign of the coefficient when time is excluded from the regression is indeed negative, suggesting that instructor evaluations are lower for the distance-learning sections, it is not significantly different from zero. Furthermore, the sign of the coefficient is positive when time is included in the regression, suggesting that changes across time, rather than the distance-learning format, account for the lower evaluations. However, the coefficients for both time and the distance-learning dummy are not statistically different from zero. These results suggest that instructor evaluations do not differ between traditional sections and distance-learning sections.
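
The significance stars in Tables 1 and 2 correspond to two-tailed t-tests on the individual coefficients. As a rough back-of-the-envelope illustration (our own check, using approximate large-sample critical values of about 1.66, 1.98, and 2.63 for the 10%, 5%, and 1% levels, and not part of the original analysis), the t-statistic is simply the coefficient divided by its standard error:

    # Rough illustration of how the significance levels relate to the reported
    # coefficients and standard errors; critical values are approximate.
    def t_stat(coef, se):
        return coef / se

    # Q-IR3 time coefficient from Table 2: -0.194 with standard error 0.083.
    print(t_stat(-0.194, 0.083))  # about -2.34: |t| > 1.98, significant at the 5% level
    # Q-IR12 time coefficient from Table 2: -0.114 with standard error 0.082.
    print(t_stat(-0.114, 0.082))  # about -1.39: |t| < 1.66, not significant at the 10% level
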
JCSC 21, 1 (October 2005) suggest instructor evaluations do not differ between traditional sections and distance-learning sections. Evaluations of traditional sections and distance-learning sections are not identical. One difference is in the frequency of submitting a zero on the evaluation, indicating the student feels the question is not applicable. On instructor-related question 5, 60% of distance-learning students felt the question was not applicable while only 6% of traditional students felt the question was not applicable. Given the format of the distance-learning course, this result is to be expected. On instructor-related question 12, 15% of distance-learning students felt the question was not applicable while only 2.5% of traditional students felt the question was not applicable. A possible explanation for this is that the distance-learning students are not watching the recordings, relying on written notes and the textbook instead. While the raw score on evaluations of traditional and distance-learning sections are not significantly different, student expectations of distance-learning instructors do appear to differ. CONCLUSIONS AND FUTURE WORK Distance-learning students have significantly less interaction with the instructor, a factor that has a demonstrable influence on student satisfaction [3]. Given this, the fact that there is no statistically significant difference between pre-recorded and regular student evaluations for either course-related or instructor-related factors is surprising. This result suggests that the superior organization of the pre-recorded distance-learning course, with five, cohesive, subject-oriented modules, was able to compensate for the lack of interaction. This is consistent with previous work on distance-learning student satisfaction that showed a course with a few, highly consistent, modules resulted in both a perception of more interaction with the instructor and of better learning outcomes on the part of students [3]. When designing a distance-learning course it is beneficial to aim for simplicity in course structure. The results for instructor-related question 12 run counter to the perception that many faculty at CTI have about distance-learning evaluations. It would be interesting to determine if this result holds for other courses. We are conducting a comparison of regular-section and distance-learning section evaluations for courses taught by the first author to investigate this further. We would also like to understand why a higher percentage of distance-learning students consider that question to be inapplicable. We hypothesize that these students are not watching the recordings of the instructor, but at the present time, there is no technical way to check this. We intend to work with the COL development team to see if a feature can be added that would allow us to track the amount of time students spend viewing the recordings. Finally, it would be interesting to consider other components of distance learning such as effectiveness. We have already begun work on a study assessing student performance in online Java courses. 86

REFERENCES

[1] Carr, S., Online Psychology Instruction is Effective but Not Satisfying, Study Finds, Chronicle of Higher Education, March 10, 2000, http://chronicle.com.

[2] Knight, L., Steinbach, T., and White, J., An Alternative Approach to Web-based Education: Technology-intensive, Not Labor-intensive, Proceedings of the Information Systems Education Conference, 2002.

[3] Swan, K., Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses, Distance Education, 22 (2), 306-332, 2001.

[4] Wyatt, R., Milito, E., Epstein, R., and Kline, R., A Graduate Master's Prerequisite Program, Journal of Computing Sciences in Colleges, 17 (3), 169-177, February 2002.