Using a Question Generation Approach to Improve Web-Based Collaborative Learning




James Belanich, Research Psychologist, U.S. Army Research Institute for the Behavioral and Social Sciences
Robert A. Wisher, Senior Research Psychologist, U.S. Department of Defense
Kara L. Orvis, Consortium Research Fellow, George Mason University

Introduction

Whether in a traditional classroom setting or through distributed learning (DL) technologies, learning is influenced by interactions or collaborations among learners. Collaboration with other learners allows for the synergistic building of knowledge: the strengths of each team member can be shared with the others, giving the team access to a broader knowledge base. Studies have shown that collaborative learning exercises lead to improved knowledge acquisition compared to instruction without such exercises (Alkhateeb & Jumaa, 2002; Bonk & Wisher, 2000).

Learners in DL courses often feel isolated due to the lack of the natural interaction among class members that is present in traditional class settings (Muilenburg & Berge, 2001). Distance learners rarely have the opportunity to interact with other students; they cannot readily exchange ideas after class, nor are they able to ask another student for clarification during a lecture. Research has found that many distance learners feel that learner-learner interaction should be designed into online courses to alleviate this sense of isolation (Navarro & Shoemaker, 2000). Collaborative online tools, therefore, are needed to increase learner-learner interaction, which should in turn motivate students in a DL environment.

Question Generation Learning

One technique for increasing learner collaboration is the process of asking and answering questions. Asking a question requires active processing of information, leading to more in-depth learning (Graesser & Wisher, 2001). To develop a question, several meta-cognitive tasks must be completed: the questioner must identify what information they know, what information they do not know, and how to construct a question that would be understandable to others.

Method

In this research, the effects of a collaborative question-writing tool on learner outcomes, such as test performance and the quality of the questions developed, were assessed in a military training environment.

Apparatus

The collaboration tool used was TEAMThink, produced by Athenium LLC, a Web-based platform on which a team of users collaborates to develop questions. Each participating school provided computers with high-speed Internet access. The collaboration tool, a tutorial, and all performance data were stored on a central server in Boston, MA. In the current experiment, the collaboration tool was used in a classroom setting due to experimental and scheduling constraints. (Note: the use of trademark or copyright material does not imply endorsement of a product by the U.S. Army Research Institute for the Behavioral and Social Sciences or by the U.S. Department of Defense.)

Participants

The participants were students at three Army schools: the Engineer School at Fort Leonard Wood, Missouri; the Ordnance Munitions and Electronic Maintenance School at Redstone Arsenal, Alabama; and the Intelligence School at Fort Huachuca, Arizona. Two sessions, totaling about 4 hours, were needed for participants to complete the experiment. For all groups, the experiment was integrated into a course required for career advancement.

Two separate issues were investigated across the three groups of participants: the learning effect and the quality of the questions developed. Two comparisons were made to assess the learning effect. In the first, participants at the Engineer School were used for a between-group assessment: 104 students were assigned to an experimental group, which participated in the collaborative exercise, and 108 students were assigned to a control group, which did not. The second comparison involved 40 participants from the Ordnance School in a within-group design, comparing their performance on a test of the topic they completed with the collaboration tool against their performance on a test of a different topic. To assess the quality of the questions generated, 55 participants from the Intelligence School were used, along with the participants from the Engineer School and the Ordnance School.

Tutorial Procedure

Participants first completed a Web-based, self-paced tutorial instructing them on how to write effective multiple-choice questions. The tutorial took about 45 minutes to complete and consisted of three modules: (a) learning through questions, (b) writing effective questions, and (c) creating a set of answers that included one correct option and 3-4 plausible incorrect options. Participants cycled through a module until achieving a score of 100% on an end-of-module quiz before continuing to the next module.

Question Collaboration

Participants were divided into teams and instructed to compose questions individually. The size of the teams varied from 3 to 24 members. The collaboration tool provided a template that included an area to write the question stem and separate areas to write the possible answers (see Figure 1). All teams were given specific course topics on which to write their questions. In all locations, half of the teams in a given class wrote questions on one topic while the other half wrote questions on a different topic. Because of the small team sizes at the Ordnance School, those students were instructed to write three questions each, while students at the other locations were asked to write one question. As a general guideline, students were asked to write questions that 70 percent of the class could answer correctly.
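To make the question format concrete, the following minimal sketch models the kind of record such a template gathers: a stem, one keyed answer, several distractors, and space for teammates' comments, with a validity check reflecting the tutorial's one-correct-plus-3-4-incorrect guideline. The paper does not document TEAMThink's internals, so the class and field names here are illustrative assumptions rather than the tool's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class MCQuestion:
    """One multiple-choice item as gathered by a question template
    (illustrative sketch; not the actual TEAMThink data model)."""
    author: str
    topic: str
    stem: str                       # the question text
    correct: str                    # the keyed (correct) option
    distractors: list[str]          # plausible incorrect options
    comments: list[str] = field(default_factory=list)  # teammates' critiques

    def is_well_formed(self) -> bool:
        # Tutorial guideline: one correct answer plus 3-4 incorrect options.
        return bool(self.stem.strip()) and 3 <= len(self.distractors) <= 4

# Usage: one student's draft item, checked against the guideline.
q = MCQuestion(
    author="student_01",
    topic="field maintenance",
    stem="What is the first step when ...?",
    correct="Disconnect the power source",
    distractors=["Remove the cover", "Check the manual", "Notify a supervisor"],
)
assert q.is_well_formed()
```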

After writing their questions, students critiqued the questions written by their teammates. For each question, the procedure was to (1) answer the question, (2) view the correct answer, and (3) write comments on the question. When students completed the review and commenting process, they were allowed to read the comments written about their own questions and to revise their original questions.

Figure 1. The template for writing questions, answer stems, and comments.

Test of Questions

After critiquing their questions, participants were tested on questions developed by other teams that had composed questions on the same course topic. The questions that appeared on the tests were limited to those accepted by the course instructors as technically accurate. Participants taking the tests had not previously seen the questions they were given, since a separate team had developed them.

The participants from the Ordnance School took two tests. The first consisted of questions on the topic their team had used during the collaborative exercise; the second consisted of questions on a different course topic. Both topics had been covered recently in the course. This allowed for a comparison of performance on tests of the topic areas with which participants did and did not collaborate.

Control Group

For comparison with the participants from the Engineer School, participants in the same course as the Engineer experimental group were assigned to a control group. In the control condition, participants did not use the collaboration tool or go through the question-writing tutorial. They wrote questions on the same topics as members of the experimental group and then took a test consisting of those questions.
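Under the constraints just described, assembling and scoring a test reduces to filtering a question pool and counting correct responses. The sketch below illustrates that selection logic under stated assumptions: the dictionary keys (accepted, topic, team, correct) are hypothetical, and the paper does not specify how the tests were actually compiled.

```python
def build_test(questions, topic, examinee_team):
    """Select test items per the constraints above: instructor-accepted,
    on the examinee's topic, and written by a different team (so the
    examinees have not previously seen them). Key names are hypothetical."""
    return [
        q for q in questions
        if q["accepted"]                 # instructors judged it technically accurate
        and q["topic"] == topic
        and q["team"] != examinee_team   # authored by a separate team
    ]

def percent_correct(test, answers):
    """Score a test as the percentage of items answered correctly."""
    right = sum(answers[q["stem"]] == q["correct"] for q in test)
    return 100.0 * right / len(test)

# Usage: a two-item pool in which instructors rejected one item.
pool = [
    {"stem": "Q1 ...", "correct": "B", "topic": "maintenance", "team": 1, "accepted": True},
    {"stem": "Q2 ...", "correct": "D", "topic": "maintenance", "team": 1, "accepted": False},
]
test = build_test(pool, topic="maintenance", examinee_team=2)
print(percent_correct(test, answers={"Q1 ...": "B"}))  # -> 100.0
```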

Results

Quality of Questions

Across all schools, instructors accepted 258 of 336 questions (77 percent) as technically accurate. The production of so many quality questions indicates that a collaboration tool could be used to generate a corpus of questions for later use by training developers. In addition, the development of so many appropriate questions indicated that the participants understood the course material, which was reassuring to the instructors.

Learning Effect

Two analyses were conducted to assess evidence for a learning effect. In the first, the percentages of questions answered correctly on the test by the control and experimental groups from the Engineer School were compared. In the second, students from the Ordnance School answered two sets of questions: questions on the topic on which they had collaborated (within) and questions on a different topic covered in class (across).

Engineer School. The control group answered 68.4 percent of the questions correctly, while the experimental group answered 76.1 percent correctly. The 7.7-point difference between means was statistically significant (independent samples t = 3.76, df = 210, p < .01), with an effect size of 0.73 standard deviation units.

Ordnance School. Students in this group took two tests: the first with questions in the same topic area they had used during the collaborative exercise (within) and the second in a different topic area (across). On both tests, it was the first time participants had been exposed to the questions. Participants averaged 7.3 points higher on the within-topic test than on the across-topic test, but the difference was not statistically significant (p = 0.16); the effect size was 0.30 standard deviation units.

The collaboration that occurs during the question-generation exercise may aid the learning process, as shown by the significantly higher scores of the Engineer School participants who used the collaboration tool relative to the participants who only wrote questions and took a test, with no collaboration. For the Ordnance School participants, however, the within-across difference was not statistically significant, suggesting that only a limited learning effect was evident after a single use of the collaboration tool. Further experimentation would be required to assess the effect after multiple uses.
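For readers who wish to run this kind of analysis on their own data, the sketch below computes the independent-samples t statistic and an effect size in pooled-standard-deviation units (Cohen's d). The score arrays are simulated placeholders seeded to resemble the group means above; the study's raw data are not published here, so the output will not reproduce the reported statistics exactly.

```python
import numpy as np
from scipy import stats

# Placeholder per-student percent-correct scores (simulated, not the
# study's data): 108 control and 104 experimental participants.
control = np.random.default_rng(0).normal(68.4, 10.0, 108)
experimental = np.random.default_rng(1).normal(76.1, 10.0, 104)

t, p = stats.ttest_ind(experimental, control)  # independent-samples t-test
df = len(experimental) + len(control) - 2      # 104 + 108 - 2 = 210

# Cohen's d: mean difference expressed in pooled-standard-deviation units.
pooled_sd = np.sqrt(
    ((len(experimental) - 1) * experimental.var(ddof=1)
     + (len(control) - 1) * control.var(ddof=1)) / df
)
d = (experimental.mean() - control.mean()) / pooled_sd
print(f"t({df}) = {t:.2f}, p = {p:.3g}, d = {d:.2f}")
```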

Conclusion

The use of the collaborative question-generation tool produced positive outcomes for both students and instructors. The benefits to the students were increased comprehension of the course material and a novel method of interacting with other students. These benefits were realized after only one use of the program; greater benefit would be expected if the collaboration tool were used repeatedly throughout a course. The benefits to the instructors were the development of quality questions with minimal time investment and the opportunity to monitor students' comprehension of the topics.

Technologies that encourage interaction between learners are a vital ingredient for increasing the effectiveness of DL. By capitalizing on the natural advantages of quality learner-learner interaction, collaboration, and a learning environment that includes a question-generation strategy, the extent and depth of learning can be substantially improved.

References

Alkhateeb, H. M., & Jumaa, M. (2002). Cooperative learning and algebra performance of eighth grade students in United Arab Emirates. Psychological Reports, 90, 91-100.

Bonk, C. J., & Wisher, R. A. (2000). Applying collaborative and e-learning tools to military distance learning: A research framework (Technical Report 1107). Alexandria, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.

Graesser, A. C., & Wisher, R. A. (2001). Question generation as a learning multiplier in distributed learning environments (Technical Report 1121). Alexandria, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.

Muilenburg, L., & Berge, Z. L. (2001). Barriers to distance education: A factor-analytic study. The American Journal of Distance Education, 15(2), 7-24.

Navarro, P., & Shoemaker, J. (2000). Performance and perceptions of distance learners in cyberspace. The American Journal of Distance Education, 14(2), 15-35.

Biographical Sketches

James Belanich is a Research Psychologist with the U.S. Army Research Institute for the Behavioral and Social Sciences. He earned a Ph.D. in Psychology-Learning Processes from the City University of New York Graduate Center. His current research interests include advanced distributed learning and computer-supported collaborative learning. He has over 40 presentations and 15 publications in areas including distributed learning, adaptive/assistive technology, and learning processes.
Address: 5001 Eisenhower Avenue, Alexandria, VA 22333-5600
E-mail: BelanichJ@ari.army.mil
Phone: 703.617.2362
Fax: 703.617.3573

Robert A. Wisher is a Senior Research Psychologist within the Office of the Secretary of Defense, where he directs the Advanced Distributed Learning Initiative. He also serves as the U.S. delegate to NATO education and training groups. He holds a Ph.D. in Cognitive Psychology from the University of California, San Diego. His current work involves large-scale implementation of distributed learning content and services, as well as research on new technologies.
Address: Office of the Secretary of Defense, DUSD(R)/Readiness and Training, 4000 Defense Pentagon, Room 1C757, Washington, DC 20301-4000
E-mail: Robert.Wisher@osd.mil
Phone: 703.697.4992

Kara L. Orvis is a Doctoral Candidate in the Industrial/Organizational Psychology program at George Mason University and is currently employed as a Senior Research Fellow at the U.S. Army Research Institute in Alexandria, VA. Her main research interests are in teams, multi-team systems, and leadership, concentrating on team training and development. Her current research focuses on teamwork and leadership issues in dispersed teams and computer-supported collaborative learning. Recent publications include "Communication Patterns during Synchronous Web-based Military Training in Problem Solving" and "Overcoming Barriers to Information Sharing in Virtual Teams."
Address: 5001 Eisenhower Avenue, Alexandria, VA 22333-5600
E-mail: korvis@gmu.edu
Phone: 703.617.0356
Fax: 703.617.3573