Continuing Education and Knowledge Retention: A Comparison of Online and Face-to-Face Deliveries


Grand Valley State University ScholarWorks@GVSU
Articles, University Libraries, 4-1-2007

Julie A. Garrison, Grand Valley State University, garrisoj@gvsu.edu

Follow this and additional works at: http://scholarworks.gvsu.edu/library_sp
Part of the Information and Library Science Commons

Recommended Citation
Garrison, Julie A., "Continuing Education and Knowledge Retention: A Comparison of Online and Face-to-Face" (2007). Articles. Paper 2. http://scholarworks.gvsu.edu/library_sp/2

This Article is brought to you for free and open access by the University Libraries at ScholarWorks@GVSU. It has been accepted for inclusion in Articles by an authorized administrator of ScholarWorks@GVSU. For more information, please contact scholarworks@gvsu.edu.

Continuing Education and Knowledge Retention: A Comparison of Online and Face-to-Face

Connie Schardt, Duke University
Julie Garrison, Grand Valley State University

Abstract

Objective: Drawing upon earlier research that surveyed students' grasp of subject knowledge after taking either an online or a face-to-face EBM course, this paper explores the effectiveness of a Web-based professional continuing education course compared with an equivalent face-to-face version. The course was designed to teach practicing medical librarians how to participate in and advocate for Evidence Based Medicine at their individual institutions.

Methods: Seventy-two practicing librarians self-selected to participate in either the eight-week distance education (DE) course or the eight-hour face-to-face continuing education (CE) class. Using a modified version of the Fresno Test of Competence in Evidence-Based Medicine, the authors compared student pre-class, post-class, and six-month post-class assessment scores to assess subject knowledge retention, evaluate student learning, and determine the efficacy of the course delivery methods.

Results: When comparing the scores of only those who completed all assessments, the DE students averaged over 10 points higher than the CE group on each test. Based on the raw numbers, it appeared that students in the DE group came into the course with greater knowledge of the subject and retained more knowledge six months after the course had ended. However, after analyzing the data from all participants, the study showed that the differences between the distance education group and the face-to-face group were not statistically significant.

Conclusions: In this study, the distance education and face-to-face groups showed no difference in level of knowledge retention.

Introduction

Using the Web to deliver professional continuing education is increasingly common in today's technologically advanced work environments. Regular offerings of online workshops and courses are advertised via association listservs and Web sites, yet there is little literature evaluating how well professionals learn in this environment compared to the traditional face-to-face classroom. In the spring of 2000, the authors compared student responses to a post-class questionnaire given after completion of an Evidence-Based Medicine (EBM) continuing education course sponsored by the Medical Library Association (MLA). Six of these librarians took the continuing education class online, through a Blackboard course shell, while the other six participated in a face-to-face eight-hour workshop covering the same material. Six months after the end of the course, online students answered 80% of the test questions correctly, while the classroom students answered only 40% correctly. The authors speculated that the differences in knowledge retention might be due to three important attributes of distance education: more time for learning and reflection; more individual practice and attention, which encouraged deeper learning; and greater student self-motivation, which enhanced engagement in the learning process (Schardt, Garrison, & Kochi, 2002). That small study had several limitations, including the small sample size, the absence of a baseline measure of subject knowledge before the course began, and the lack of student demographic information collected at the start of the study. Drawing upon that earlier project, this paper continues to explore the effectiveness of a Web-based professional continuing education course as compared with traditional face-to-face classroom instruction. The authors designed a larger study that compared student pre-class, post-class, and six-month post-class knowledge retention from two cohorts of online and face-to-face classes.

Literature Review

Evidence-Based Medicine Curriculum Assessment

Several studies have attempted to evaluate the effectiveness of an Evidence-Based Medicine curriculum for health professionals. Coomarasamy and Khan (2004) conducted a systematic review of the literature on postgraduate evidence-based medicine teaching. Within the 23 articles included in their review, the teaching methods varied and included workshops, journal clubs, seminars, and one seven-week course. Some were integrated into a formal teaching group, which focused on "training in EBM components (such as question formulation, literature searching, and critical appraisal) in real time clinical ward rounds or basing the EBM teaching sessions on encounters with patients on the wards and in clinics" (p. 1019), while others were standalone courses. Seventeen of the 23 studies assessed knowledge retention and indicated improvements from both integrated and standalone teaching programs. However, the literature also showed that standalone curricula were unsuccessful in changing behavior over time. A study by McCluskey and Lovarini (2005) also found that a standalone workshop for health professionals improved EBM knowledge. Only one study looked at teaching EBM skills to librarians. Hicks, Booth, and Sawers (1998) evaluated a mixed-modality program on searching for various types of study designs. It combined face-to-face sessions at the beginning and end of the workshop with distance education modules that were completed independently. Participants were asked to evaluate the course by scoring aspects of course satisfaction on a Likert scale. In addition, they were offered an opportunity to suggest improvements and to indicate what they felt were the most and least relevant aspects. The study did not include any outcome measures to assess knowledge gained.

Online Versus Face-to-Face

The majority of studies comparing online education with the more traditional face-to-face approach have been done in academic settings and have shown no significant difference between the two modalities. Buckley (2003) investigated the transition of undergraduate nursing nutrition courses from a face-to-face, to a hybrid (or Web-enhanced), to an online environment, and found no significant differences in mid-term scores, final exam scores, or grades between the three groups. However, the students taking the Web-enhanced nutrition courses performed the highest of the three groups. Another study of undergraduate nursing education, by Leasure, Davis, and Thievon (2000), compared student outcomes in a traditional and a Web-based research course and found no significant difference in grades. In each of these studies, the sample of students in the online courses was less than half that of the face-to-face classroom comparison. Olmsted (2002) compared online dental hygiene students with those taking their program in a face-to-face format. Over the five-year study period, researchers found no significant differences between the two groups of students in board examination scores, grades, or grade point averages. Neuhauser (2002) found online student averages to be slightly higher in a comparison of an online and a face-to-face section of the same class; there were no significant differences in test scores, participation, or final grades. One study of MBA students, by Anstine and Skidmore (2005), did reveal statistically significantly lower examination scores among students taking a statistics class in the online environment. However, overall study results proved online outcomes to be on par with face-to-face outcomes.

While a number of studies have compared online and face-to-face education in an academic setting, there is still little evidence about how effective the online environment is for continuing education and professional development activities. Continuing education activities, as defined by the Medical Library Association, are educational opportunities that "are planned, organized learning experiences designed to augment the knowledge, skills and attitudes of those who work in health sciences libraries" (MLANET MLA Continuing Education Activities, 2002, Definition section, para. 1). Continuing education courses are generally work-related; they do not lead to an advanced degree, are not graded, and usually do not assess measurable educational outcomes. Very often these courses are short in duration and taught by peers rather than professional educators. It is difficult to measure the effectiveness and knowledge retention of these programs when offered face-to-face. The student course evaluation and the instructor's impression of the depth of conversations and student enthusiasm may be the only outcome data available at the end of a course. This is very different from an academic course that may have several graded assignments or other measurable student outcomes. As the educational market expands and more individuals consider investing their time and money in online alternatives for CE, there is a growing need for more rigorous assessment to help learners choose the most effective courses.

Methodology

The participants in this study were practicing librarians enrolled in the course Evidence Based Medicine and the Medical Librarian between January and November of 2004. The course was offered four times during that year. Two of the sessions were presented to 38 librarians as a traditional classroom course (CE) covering eight hours of face-to-face instruction. The other two sessions were presented as a distance education course (DE) to 34 librarians. The individual librarians self-selected to attend either the classroom or the distance education sessions; decisions to take either course format were based on location, timing, and funding for travel. All participants signed an informed consent and agreed to fill out a demographic survey and take three online tests over the course of six months.

The curriculum was the same for both delivery options. The course, which follows the practice of evidence-based medicine, was divided into four modules: introduction and question building, selecting resources and conducting the search, evaluating the evidence, and roles for the librarian. The content included lectures (or printed material included in the online course shell), assignments, discussion, and additional readings. Throughout the course, case studies were used to teach and practice EBM skills.

The actual delivery of content, however, was not the same for both class formats. In the traditional classroom setting, the instructor presented the core material through lecture and discussion. Students were provided with a course manual, including a glossary and a further-reading list. Group exercises and practice time were built into the daylong course to reinforce EBM concepts. Contact time between the instructors and students was limited to eight hours. The distance education participants used the Blackboard course management system to access the course manual, assignments, and the course discussion board. Exercises and practice were completed individually. Instructors provided detailed feedback on each individual's assignments via e-mail, answered specific questions via telephone or e-mail, and provided additional comments on the class discussion board. The course was open for eight weeks, during which the students and instructors could communicate with each other as much or as little as needed. All participants were asked to answer a pre-course survey to gather information about their level of professional experience and their learning styles.
In addition, all participants were asked to take three assessments: a pre-test before the start of the course, a post-test within one week after the end of the course, and a follow-up post-test six months after the end of the course.

The Fresno Test of Evidence Based Medicine was selected as the assessment for this study because it was designed and validated to objectively test the effectiveness of an evidence-based medicine curriculum. Instead of asking participants to self-report attitudes and levels of EBM skills, the Fresno Test presents two clinical scenarios and asks specific questions related to forming a clinical question, selecting appropriate resources, conducting a search, and evaluating the validity of an article (Ramos, Schafer, & Tracz, 2003). The complete test and a grading rubric were freely available online through the BMJ Web site. The author, Kathleen D. Ramos, was contacted, and permission was granted to modify and use the Fresno Test for this study. Minor changes were made to the questions regarding the validity of an article and calculating the results of a study, to reflect the focus of this specific course. The questions required a combination of short free-text responses and multiple-choice answers. The same test, with the same case scenarios and questions, was given each time. Since students would not receive any feedback or indication of the correctness of their answers, it was decided that using the same case scenarios would provide a more accurate indication of skill level than changing the scenarios and questions each time. SurveyMonkey was used to create, distribute, and collect student responses to the assessment tool. Individual test responses were printed from the database, manually stripped of identifying information, and number-coded by research assistants who were not aware of the purpose of the study. Using the grading rubric, the authors graded the blinded responses individually and then, through a series of conference calls, agreed on a final score for each test. A third reviewer was available to settle disagreements. For each open-ended question there were four possible levels for an answer (excellent, strong, limited, and not evident). The levels reflected the completeness of the answer, and points were assigned accordingly.

Results

Seventy-two professional librarians enrolled in the course during 2004. Fifty-six librarians completed the demographic survey and at least one of the pre- or post-tests. Fourteen librarians completed the course but did not submit any surveys or assessments, despite repeated requests. Two librarians dropped the course and did not finish the work. The two groups, continuing education (CE) and distance education (DE), were similar in terms of type of library (academic or hospital) and primary responsibilities (reference or education) (see Table 1).
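The four-level grading scheme described in the Methodology can be sketched as a small scoring helper. The point weights below are hypothetical illustrations; the paper reports only the level names, not the actual values assigned:

```python
# Hypothetical scoring helper for the four-level rubric described in the
# Methodology (excellent / strong / limited / not evident). The point
# weights are illustrative assumptions, not the study's actual values.
LEVEL_POINTS = {"excellent": 3, "strong": 2, "limited": 1, "not evident": 0}

def score_responses(levels):
    """Sum rubric points over a student's graded open-ended answers."""
    return sum(LEVEL_POINTS[level] for level in levels)

print(score_responses(["excellent", "limited", "strong"]))  # 3 + 1 + 2 = 6
```

Whatever the real weights were, a per-test total of this form is what the authors reconciled across graders in their conference calls.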
The groups differed in terms of age and experience. The CE students were older, with more years of experience. All (100%) of the CE students were 36 years old or older, compared to 64% of the distance education students. Eighty-two percent of the CE students had 10 or more years of library experience, compared to only 29% of the distance education students. As expected, the CE students reported a strong preference (82%) for traditional classroom instruction, while only 61% of distance education students reported this preference. When asked about reasons for taking an online course, both groups indicated the convenience of not traveling, the timeliness of the topic, and working at one's own pace as the most important factors.

Seventeen students from the traditional face-to-face class and 21 students from the distance education course completed the first post-test. As indicated in Figure 1, there was a substantial decline in the number of students completing the subsequent tests. Ultimately, data from all assessments were available for only 10 CE and 11 DE students (see Table 2). When comparing the scores of only those who completed all assessments, the DE students averaged over 10 points higher than the CE group on each test. Both groups showed a significant improvement from the pre-test to the post-test (CE increased by 57%, DE increased by 21%). As expected, average student scores declined from the first post-test to the second post-test (CE decreased by 14%, DE decreased by 11%), although they remained higher than pre-test levels (see Figure 2). Based on the raw numbers, it appeared that students in the DE group came into the course with greater knowledge of the subject and retained more knowledge six months after the course had ended.

Because of the small number of students who completed all assessments, the data from all participants were re-analyzed and adjusted for missingness (see Table 3). When the data were recalculated using a mixed model, with fixed effects for time (three levels) and group (classroom and distance) and a time-by-group interaction, the adjusted differences were not statistically significant. Considering all the data, the study showed that the distance education and face-to-face groups had no difference in level of knowledge retention (see Figure 3).

Study Limitations

This study was designed to determine whether there is a difference in learning retention between students taking a continuing education course in the classroom and students taking the same course through distance education. Students were given three assessments to determine their baseline knowledge of the topic, what they learned immediately after the course, and what they retained six months after the course. Pooled data from those assessments showed no statistical difference between the scores of the two groups, leading the authors to conclude that neither delivery method is better than the other.

While this study was designed to evaluate a larger population than the first study, it was difficult to get students to complete the assessments, especially after the course ended. Students were sent assessments via e-mail, and follow-up messages were sent from both SurveyMonkey and the instructors. Despite these reminders, the number of students responding to each subsequent assessment declined. While there appeared to be a trend favoring distance education among the students who completed all three assessments, that sample was too small to show a significant difference. When the missing data were factored into the results, the trend disappeared.

Repeating the same case scenarios for all three assessments may have affected student compliance. The researchers used the same scenarios to ensure that the answers would be comparable over time and not affected by the difficulty of the presented case. Instead, this may have contributed to a loss of interest in the study, as evidenced by skipped responses or abbreviated answers in the second and third assessments.

The open-ended questions of the Fresno Test were sometimes difficult to grade. While the researchers were blinded to the identity of the students, the type of course, and which test was being reviewed, it was a challenge to remain objective and consistent in evaluating the answers. This was especially a problem with the questions related to information resources and the search strategy. Also, while the instructions appeared clear before the test was implemented, it became evident that the level of detail required in an answer was not stated as clearly as it should have been. These problems could have been anticipated and addressed before the study through better field-testing of the assessment tool.

Discussion

There is continued debate about the effectiveness of online education. There remains a need for more rigorous study, as evidenced by the recent bill introduced by Congressman Ehlers (R-Grand Rapids, MI). As a January press release stated, the Independent Study of Distance Education Act of 2007 (H.R. 412) "requires that the National Academy of Sciences (NAS) conduct a scientifically correct, statistically valid study of the quality of distance education programs as compared to campus-based programs" (Brandt, 2007, para. 2). While this bill specifically addresses issues in academic higher education settings, there may be implications for other groups promoting online learning programs.

Professional associations spend considerable personnel and funding resources on providing continuing education courses in the classroom and through distance education. Associations often use the number of courses taken as a measure of competency or as a requirement for certification, yet these courses are typically evaluated only on the classroom experience and the impressions left by the instructors and the facilities. An evaluation that describes the objectives, teaching styles, and room environment of a course does not indicate whether the student learned the course content and has become more knowledgeable about the subject. To better grasp the usefulness and worth of continuing education, evaluation needs to focus more on knowledge retention or competency than on the experience.

References

Anstine, J., & Skidmore, M. (2005). A small sample study of traditional and online courses with sample selection adjustment. Journal of Economic Education, 36, 107-127.

Brandt, J. (2007, January 16). Ehlers bill calls for study of distance learning courses. Congressman Vernon J. Ehlers Press Releases. Retrieved February 20, 2007, from http://www.house.gov/apps/list/press/mi03_ehlers/011607distanceedstudy.html

Buckley, K. M. (2003). Evaluation of classroom-based, Web-enhanced, and Web-based distance learning nutrition courses for undergraduate nursing. Journal of Nursing Education, 42, 367-370.

Coomarasamy, A., & Khan, K. S. (2004). What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review [Electronic version]. BMJ, 329, 1017-1022.

Hicks, A., Booth, A., & Sawers, C. (1998). Becoming ADEPT: Delivering distance learning on evidence-based medicine for librarians. Health Libraries Review, 15, 175-184.

Leasure, A. R., Davis, L., & Thievon, S. L. (2000). Comparison of student outcomes and preferences in a traditional vs. World Wide Web-based baccalaureate nursing research course. Journal of Nursing Education, 39, 149-154.

McCluskey, A., & Lovarini, M. (2005). Providing education on evidence-based practice improved knowledge but did not change behaviour: A before and after study [Electronic version]. BMC Medical Education, 5, 40-51.

Medical Library Association. (2002, October). MLA continuing education activities. Retrieved January 17, 2007, from http://www.mlanet.org/education/cech/cecriteria.html

Neuhauser, C. (2002). Learning style and effectiveness of online and face-to-face instruction. The American Journal of Distance Education, 16, 99-113.

Olmsted, J. L. (2002). Longitudinal analysis of student performance in a dental hygiene distance education program. Journal of Dental Education, 66, 1012-1020.

Ramos, K. D., Schafer, S., & Tracz, S. M. (2003). Validation of the Fresno test of competence in evidence based medicine [Electronic version]. BMJ, 326, 319-321.

Schardt, C. M., Garrison, J., & Kochi, J. K. (2002). Distance education or classroom instruction for continuing education: who retains more knowledge? Journal of the Medical Library Association, 90, 455-457.

Table 1
CE and DE Student Characteristics

Baseline characteristics    Continuing Education (n=28)   Distance Education (n=28)
Type of library
  Academic                  9 (32%)                       9 (32%)
  Hospital                  16 (57%)                      10 (36%)
  Other                     3 (11%)                       8 (29%)
Current position
  Reference/education       9 (32%)                       17 (61%)
  Other                     19 (68%)                      9 (32%)
Years of experience
  Less than 5               4 (14%)                       14 (50%)
  5 to 10                   1 (4%)                        6 (21%)
  More than 10              23 (82%)                      6 (21%)
Age
  Under 35                  0 (0%)                        10 (36%)
  36 to 45                  8 (29%)                       6 (21%)
  Over 46                   20 (71%)                      11 (39%)

Table 2
Test Compliance: Number of Students Completing Assessments

Group                  Pre-Test   Post-Test   6-Month Post-Test   All Assessments
Continuing Education   25         17          14                  10
Distance Education     27         21          14                  11
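The attrition visible in Table 2 can be quantified against the enrollment figures given in the Methodology (38 CE and 34 DE enrollees); this is a simple back-of-the-envelope check, not part of the study's own analysis:

```python
# Share of enrollees who completed each assessment, from Table 2 counts
# and the enrollment figures in the Methodology (38 CE, 34 DE).
enrolled = {"CE": 38, "DE": 34}
completed = {"CE": [25, 17, 14, 10],   # pre-test, post-test, 6-month, all
             "DE": [27, 21, 14, 11]}

for group, counts in completed.items():
    rates = [round(c / enrolled[group], 2) for c in counts]
    print(group, rates)
# CE [0.66, 0.45, 0.37, 0.26]
# DE [0.79, 0.62, 0.41, 0.32]
```

Only about a quarter to a third of enrollees completed all three assessments, which is the compliance problem the Study Limitations section discusses.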

Table 3
Adjusted Data, Analyzed for Missing Student Test Scores

                                 Pre-test   Post-test   Change        2nd          Change         Change
Group (mean scores)              baseline               (pre->post)   post-test*   (post->2nd)    (pre->2nd)
CE (n=10), submitted all tests   47.78      75.15       0.57          64.78        -0.14          0.36
CE, all responses                55.78      73.77       0.32          64.41        -0.13          0.15
DE (n=11), submitted all tests   71.15      86.35       0.21          77.15        -0.11          0.08
DE, all responses                57.85      82.66       0.43          71.25        -0.14          0.23

* The 2nd post-test average is a model mean estimate that adjusts for missingness.
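The change columns in Table 3 follow directly from the reported mean scores; for example, for the students who submitted all tests:

```python
# Reproduce the "change" columns of Table 3 from the reported mean scores
# for students who submitted all tests (CE n=10, DE n=11).
def change(before, after):
    """Relative change, rounded to two decimals as in Table 3."""
    return round((after - before) / before, 2)

ce_pre, ce_post, ce_6mo = 47.78, 75.15, 64.78
de_pre, de_post, de_6mo = 71.15, 86.35, 77.15

print(change(ce_pre, ce_post))  # 0.57  (CE, pre-test to post-test)
print(change(de_pre, de_post))  # 0.21  (DE, pre-test to post-test)
print(change(ce_post, ce_6mo))  # -0.14 (CE, post-test to 6-month post-test)
print(change(de_post, de_6mo))  # -0.11 (DE, post-test to 6-month post-test)
print(change(ce_pre, ce_6mo))   # 0.36  (CE, pre-test to 6-month post-test)
print(change(de_pre, de_6mo))   # 0.08  (DE, pre-test to 6-month post-test)
```

The same arithmetic applied to the all-responses rows reproduces the remaining change columns of the table.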

Figure 1: Number of student responses to assessments. [Bar chart; y-axis: number of responses, 0-30; series: CE, DE; categories: pre-test, post-test, 6-month post-test.]

Figure 2: Test scores for students completing all tests. [Chart; y-axis: score, 0-90; series: CE, DE; categories: pre-test, post-test, 6-month post-test.]

Figure 3: Test scores recalculated to account for missingness. [Chart; y-axis: score, 0-90; series: CE, DE; categories: pre-test, post-test, 6-month post-test.]