Relationship between Question Prompts and Critical Thinking in Online Discussions

Ayesha Sadaf, Purdue University, 3120 Beering Hall of Liberal Arts and Education, 100 N. University St., West Lafayette, IN 47907-2098
Jennifer Richardson, Purdue University, 3138 Beering Hall of Liberal Arts and Education, 100 N. University St., West Lafayette, IN 47907-2098
Peggy A. Ertmer, Purdue University, 3144 Beering Hall of Liberal Arts and Education, 100 N. University St., West Lafayette, IN 47907-2098

Abstract

This study examined the relationship between the structure of question prompts and the levels of critical thinking demonstrated in students' responses in 27 online discussions. Discussion questions were classified using Andrews' (1980) typology: playground, brainstorm, focal, general invitation, lower-level divergent, analytical convergent, shotgun, funnel, and critical incident. Transcripts of students' responses from the 27 online discussions were analyzed for levels of critical thinking using Garrison's four-stage Practical Inquiry Model. Differences in the levels of critical thinking elicited were observed across question types. Among the nine question types, critical incident and lower-level divergent questions were the most effective in generating high levels of student thinking. Implications are discussed for instructors who are looking for general guidelines on how to structure their questions to elicit high-quality responses.

Introduction

Online and blended forms of learning have grown considerably in higher education over the past decade and continue to grow at a rapid rate today. This growth has prompted educators and researchers to identify best practices for the design of online learning experiences. Although a wide variety of instructional strategies can be used to encourage student learning online, discussion is one of the most commonly used pedagogical methods in online education because of its potential for promoting critical thinking (Richardson & Ice, 2010; Walker, 2004; Yang, 2002) and knowledge construction (Pena-Shaff & Nicholls, 2004). Yet simply giving students the opportunity to interact does not automatically lead to meaningful discourse or the use of critical thinking skills (McLoughlin & Mynard, 2009). Based on the results of their research, Garrison, Anderson, and Archer (2001) noted that over 80% of students' discussion posts reflected lower levels of thinking. Similarly, Gilbert and Dabbagh (2005) reported that approximately 75%-80% of their students' online postings were at the lower levels of Bloom's taxonomy (e.g., knowledge, comprehension, application). Gibson (1999) suggests that questions structured and constructed by instructors are the most basic tool of discussion for stimulating students' thinking as they construct an understanding of specific content. Over the years, research has consistently demonstrated a strong relationship between teachers' questions and subsequent student responses (Bloom, 1956; Dillon, 1994).

Much research has been done on questions and question types in face-to-face environments and traditional classroom settings, but only a handful of studies have explored the role of questions, specifically initial question prompts, in online discussions. As the popularity of online instruction continues to grow (Allen & Seaman, 2008), it is important to examine the nature of this relationship in the online environment as well. This study aims to fill this gap in the literature on online education.

Purpose of the Study

The purpose of this study was to examine the relationship between the structure of online discussion prompts/question types and the quality of students' responses, specifically in terms of the levels of critical thinking demonstrated in online discussions. The research questions guiding this study were:

1. What is the relationship between the type of question prompt (Andrews' typology) and students' responses in terms of levels of critical thinking?
2. Which question prompt types promote the highest levels of critical thinking?

Theoretical Framework

Garrison, Anderson, and Archer's (2001) Community of Inquiry (CoI) framework and Bloom's six levels of cognitive processing were used as the theoretical framework to guide the study. The CoI framework has frequently been used to assess the critical thinking skills of students in online discussions (Richardson & Ice, 2009). Garrison et al. developed the CoI model to explain the concept of critical thinking in online collaborative learning environments through cognitive presence, social presence, and teaching presence. Cognitive presence is defined as "the extent to which learners are able to construct and confirm meaning through sustained discourse in the critical community of inquiry" (Garrison et al., 2001, p. 1). The CoI reflects the critical thinking process of its learners and the means to create cognitive presence in the context of group collaboration. The CoI framework is grounded in the Practical Inquiry Model (PIM), which provides a framework for assessing the levels of critical thinking (CT) in students' online discussion postings. The PIM is based on the four phases of cognitive presence: (1) triggering: becoming aware of a problem; (2) exploration: exploring the problem by searching for or offering information; (3) integration: interpreting information and constructing a possible solution; and (4) resolution: providing potential solutions. These four phases are not considered discrete or linear, since students may need to return to a previous phase (Swan, Garrison, & Richardson, 2009). According to Garrison et al. (2001), the PIM focuses on thinking processes and can be used as a tool to assess critical discourse and higher-order thinking in online discussions. This framework was therefore selected to examine the influence of initial question prompts on critical thinking.

Methods

Context

This exploratory study was designed to examine the relationship between the verbal structure of question prompts and the level of students' responses in online discussions. We examined discussion prompts from 10 asynchronous courses taught by seven different instructors during five semesters: spring and fall 2008, and spring, summer, and fall 2009. Three courses were taught primarily online, while seven used online discussions to augment regular class meetings. Courses ranged in size from 9 to 221 students (n = 569) and represented six disciplines: Educational Technology; Educational Psychology; English Education; Literacy and Language; Speech, Language, and Hearing Sciences; and Veterinary Medicine.
The students in each course engaged in online discussions related to course content during 16-week semesters. In general, students received participation points for the responses they posted in the online discussions.
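As a concrete reference point for the coding described in the sections that follow, the sketch below shows one minimal way to represent the four PIM phases and a message-level code as data. It is purely illustrative: the class and field names (PIMPhase, CodedPost, and the example values) are assumptions, not the authors' instrument.

```python
# Minimal sketch of the PIM coding scheme as data structures (illustrative only;
# class and field names are assumptions, not the authors' instrument).
from dataclasses import dataclass
from enum import Enum


class PIMPhase(Enum):
    """The four phases of cognitive presence in the Practical Inquiry Model."""
    TRIGGERING = 1    # becoming aware of a problem
    EXPLORATION = 2   # searching for / offering information about the problem
    INTEGRATION = 3   # interpreting information and constructing a possible solution
    RESOLUTION = 4    # providing and defending a potential solution


@dataclass
class CodedPost:
    """One student message, coded at the message level."""
    course: str
    question_type: str      # e.g., "Critical Incident", "Lower Divergent"
    phase: PIMPhase


# Example: a post written in response to a critical-incident prompt,
# judged by a coder to reach the resolution phase.
example = CodedPost(course="Foundations of Distance Education",
                    question_type="Critical Incident",
                    phase=PIMPhase.RESOLUTION)
print(example.phase.name)  # RESOLUTION
```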

Table 3. Course and Participant Details

N   | Discipline                           | Course                                                    | Level     | Semester  | Approach
17  | Educational Psychology               | Advanced Educational Psychology                           | Graduate  | Fall 09   | Blended
9   | Educational Technology               | Educational Applications of Hypermedia                    | Graduate  | Fall 09   | Blended
29  | Educational Technology               | Educational Technology for Teaching and Learning          | Graduate  | Summer 09 | Web-based
9   | Educational Technology               | Foundations of Distance Education                         | Graduate  | Fall 08   | Web-based
221 | Educational Technology               | Introduction to Educational Technology                    | Undergrad | Spring 08 | Blended
178 | Educational Technology               | Introduction to Educational Technology                    | Undergrad | Fall 08   | Blended
21  | English Education                    | Composition for English Teachers                          | Undergrad | Spring 09 | Blended
10  | Language and Literacy                | English Language Development                              | Graduate  | Fall 09   | Blended
62  | Speech, Language, & Hearing Sciences | Introduction to Aural Rehabilitation Across the Lifespan  | Undergrad | Spring 09 | Blended
13  | Veterinary Medicine                  | Management Topics for Veterinary Technicians              | Undergrad | Spring 09 | Web-based

Data Collection

Discussion questions (n = 92) collected from the 10 courses were classified into nine question types: playground, brainstorm, focal, general invitation, lower-level divergent, analytical convergent, shotgun, funnel, and critical incident (see Table 1). To ensure accuracy of categorization, all 92 questions were classified independently by two researchers and then compared for differences. After reviewing each classification, the researchers discussed their differences and clarified individual interpretations in order to reach consensus. Eight of the question types were drawn from Andrews (1980); the ninth, critical incident, was added by the researchers to represent an additional category. After reaching consensus on the classifications for the 92 discussion prompts, we selected 27 discussions (3 from each of the final 9 categories) for the analysis of students' postings. Questions and their associated discussions were selected using the following criteria: questions representing different courses and disciplines, questions that best fit the description of the question type, and discussions with more interaction (depth) and posts (breadth).
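The two-coder classification step described above can be sanity-checked with a simple percent-agreement calculation before disagreements are talked through to consensus. The sketch below is a hypothetical illustration of that check and of grouping discussions by question type; the function names and toy labels are assumptions, and the study itself chose the three discussions per type using the qualitative criteria listed above, not code.

```python
# Illustrative sketch (hypothetical data and helper names) of two steps described above:
# comparing two coders' independent classifications of prompts, and grouping
# discussions by question type so three per type can be chosen for analysis.
from collections import defaultdict


def percent_agreement(labels_a, labels_b):
    """Share of prompts on which the two coders assigned the same question type."""
    matches = sum(1 for a, b in zip(labels_a, labels_b) if a == b)
    return matches / len(labels_a)


def group_by_type(discussions):
    """Map question type -> list of discussion ids (selection criteria are applied afterward)."""
    grouped = defaultdict(list)
    for question_type, discussion_id in discussions:
        grouped[question_type].append(discussion_id)
    return dict(grouped)


# Toy example with three prompts classified by both coders.
coder_a = ["Focal", "Brainstorm", "Critical Incident"]
coder_b = ["Focal", "Playground", "Critical Incident"]
print(percent_agreement(coder_a, coder_b))  # 0.666...; disagreements are then discussed to consensus

print(group_by_type([("Focal", "D01"), ("Focal", "D07"), ("Brainstorm", "D03")]))
```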

Table 1. Types of discussion prompts and their descriptions

1. Playground: These questions present a specific aspect of the material as a "playground" for interpretation or analysis. Students have freedom of choice to discover and interpret the material.
2. Brainstorm: These questions call for the generation of all conceivable ideas, viewpoints, or solutions to an articulated issue. Students are free to generate any or all ideas on a given topic.
3. Focal: These questions present an issue and require students to make a decision or take a position and justify it. Students can choose from a limited or unlimited number of positions to support their viewpoints.
4. General Invitation: These questions invite a wide range of responses in a formless or unfocused discussion. Students are encouraged to give a wide range of responses within a broad topic with insufficient direction.
5. Lower Divergent: These questions represent the simple reproduction of knowledge from comprehension with a lack of content richness. Students are required to analyze information or material to discover reasons, draw conclusions, or make generalizations.
6. Analytical Convergent: These questions call for analytical thought leading to a single correct answer. Students are required to examine relevant material and produce a straightforward conclusion, summarize material, or describe a sequence of steps in a process.
7. Shotgun: These questions contain multiple question-sentences and may span two or more content areas. Students are expected to answer at least one fragment of the question.
8. Funnel: These questions begin with a broad opening question, follow with narrower questions, and eventually funnel down to a very specific, concrete question.
9. Critical Incident: These questions present a scenario and ask students to respond using information from the material or from experience.

Data Analysis

After identifying the 27 discussions for coding, the researchers independently coded students' postings in four of the discussions using the Practical Inquiry Model (Garrison et al., 2001). Postings were coded at the message level; messages varied in length from a sentence to several paragraphs. After coming to consensus on the codes for the responses in these four discussions, each researcher independently coded the remaining 23 discussions. Frequencies and percentages of the levels of CT across the nine question types were then compared to explore which types produced the most productive responses in terms of levels of CT. Following this, discussion codes were entered into NVivo, a qualitative analysis software package, and matrix coding queries were performed to examine relationships among selected variables (e.g., question type, question level).

Results and Discussion

Question 1: Relationship between Question Type and Level of Response

Results based on the PIM (Table 2) show that responses to Critical Incident questions (34%), along with a small percentage of responses to Playground (6%), Analytical Convergent (5%), Funnel (4%), and Lower Divergent (4%) questions, reached the highest level of CT, the resolution level. This was followed by responses to Lower Divergent (69%), Shotgun (66%), Focal (55%), and Analytical Convergent (55%) questions, which mainly reached the second-highest level of CT, the integration level.
Although responses to General Invitation (61%) and Brainstorm (56%) questions most often fell at the exploration level, 21% of the responses to General Invitation questions fell at the lowest level of CT, the triggering level.
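Percentages like those reported in Table 2 below can be derived from the message-level codes by cross-tabulating question type against PIM phase and normalizing within each question type, mirroring the frequency comparison described in the Data Analysis section. A minimal sketch, using toy tuples rather than the study's transcripts:

```python
# Sketch of the frequency/percentage comparison described in Data Analysis:
# count coded posts per (question type, PIM phase) pair and normalize within
# each question type. The tuples below are toy data, not study transcripts.
from collections import Counter

coded_posts = [
    ("Critical Incident", "Resolution"),
    ("Critical Incident", "Integration"),
    ("General Invitation", "Exploration"),
    ("General Invitation", "Triggering"),
    # ... one (question_type, phase) tuple per coded student post
]

counts = Counter(coded_posts)                            # (type, phase) -> number of posts
totals = Counter(q_type for q_type, _ in coded_posts)    # type -> total posts for that type

percentages = {
    (q_type, phase): 100 * n / totals[q_type]
    for (q_type, phase), n in counts.items()
}

for (q_type, phase), pct in sorted(percentages.items()):
    print(f"{q_type:20s} {phase:12s} {pct:5.1f}%")
```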

Table 2. Percentage of critical thinking levels expressed by students for different question prompts

Question Prompt       | Triggering | Exploration | Integration | Resolution
Critical Incident     | 6%         | 31%         | 28%         | 34%
Playground            | 5%         | 41%         | 47%         | 6%
Funnel                | 3%         | 44%         | 48%         | 4%
Shotgun               | 4%         | 29%         | 66%         | 1%
Brainstorm            | 6%         | 56%         | 38%         | 0%
General Invitation    | 21%        | 61%         | 17%         | 0%
Lower Divergent       | 5%         | 22%         | 69%         | 4%
Focal                 | 4%         | 39%         | 55%         | 1%
Analytical Convergent | 17%        | 13%         | 55%         | 5%

Question 2: Which question prompts promote the highest levels of critical thinking?

Overall, Critical Incident questions, which required students to respond to a scenario by creating a solution, appeared to be the most effective in generating high levels of student thinking compared to the other question types (see Figure 1). These questions facilitated student responses at the resolution (34%) and integration (28%) levels of thinking. A review of students' postings revealed that for Critical Incident prompts, students tended to use knowledge from the course material and personal experience to create a solution and to give examples of how they could apply their solutions in real-life situations. Moreover, students justified their solutions with personal examples and other resources when asked to clarify peers' misunderstandings or conceptual conflicts. This suggests that structuring questions to require original, evaluative thinking, solving real-life problems and producing a solution, can help students think in more complex ways and generate unique responses.

Figure 1. Level of students' responses when presented with different types of question prompts.

Lower Divergent, Focal, Analytical Convergent, and Shotgun question prompts mostly produced responses in the integration phase. For instance, in responses to Lower Divergent questions, students analyzed material to discover reasons to support their opinions; in responses to Focal questions, they took and defended a position; and in responses to Analytical Convergent questions, they mainly summarized the required material. These question prompts seemed to facilitate students' synthesizing material, connecting relevant ideas from the discussion, and describing the issues presented in the discussion. Students also tended to negotiate meaning by producing well-conceived thoughts within their responses and by agreeing with their peers in order to reach a tentative solution. This finding aligns with Zsohar and Smith's (2008) conclusion that discussion prompts that incorporate course material and require reflective thinking, going beyond facts and using judgment to produce knowledge, can facilitate higher levels of critical thinking.

General Invitation and Brainstorm questions mostly resulted in students reaching the exploration phase of critical thinking. These questions allowed students to give a wide range of responses on a given topic. Compared to the other question types, they did not facilitate higher levels of critical thinking. Because of the structure of these question types (i.e., asking students to give a wide range of responses on a given topic), students tended to exchange ideas, search for explanations, and rely mainly on personal opinions to support their arguments. This suggests that these questions are primarily effective in prompting students to share their initial ideas on a topic and thus demonstrate a basic understanding of an issue.

Limitations

This was an exploratory, descriptive study; as such, the results are not readily generalizable. When interpreting our results, it is important to recognize the study's limitations. For example, the discussion questions were collected from 10 different courses, representing six different disciplines and both undergraduate and graduate levels. Although others (Gilbert & Dabbagh, 2005; Schrire, 2006) have reported that graduate students tend to demonstrate a higher frequency of high-level responses than undergraduates, we did not examine differences among these populations. Future research might examine more closely the differences among students' responses at different educational levels, in different courses and disciplines, and with different instructors. This has the potential to lead to more specific guidelines for the types and levels of questions to use with different populations.

Conclusions and Implications

Although questions are used for many instructional reasons, such as focusing attention, promoting recall, and encouraging reflection, using questions to stimulate critical or higher-order thinking is one of the most important goals of education (Gibson, 1999). Studies have shown that online discussions can support higher-order thinking (Gilbert & Dabbagh, 2005; Richardson & Ice, 2010), particularly through the use of effective questioning techniques. The results of this study provide additional evidence that the initial question prompt can affect the level of critical thinking students achieve in online discussions. If the goal of an online discussion is to facilitate higher levels of critical thinking, then Critical Incident question prompts may work best. While encouraging higher levels of critical thinking is often the goal of online discussions, every discussion forum has a different objective, and discussion questions should be designed around the desired educational objectives. For example, if the goal is to exchange ideas, provide explanations, and offer suggestions for exploring relevant solutions, then Focal and Analytical Convergent question prompts may be the best strategy. If the goal is to share ideas and develop a basic or foundational understanding of an issue, General Invitation and Brainstorm questions can be a good choice.
In this study, Brainstorm and General Invitation questions generated the lowest levels of thinking. Yet instructors can modify these types of questions to target the middle or higher levels of CT. On the other hand, instructors should avoid using convergent questions that promote low-level responses. For example, Shotgun and Funnel questions that target a knowledge-level response might not generate thoughtful or interactive discussions. Instructors can stimulate deeper thinking by omitting recall- or memory-level questions and instead using a Shotgun or Funnel question that requires students to describe underlying relationships or to make connections among ideas. The results of this study have important implications for instructors who teach online, especially those looking for general guidelines on how to structure discussion prompts to elicit high-quality student responses. Because instructors have considerable control over which questions they ask and how they structure them, deliberate use of different types and levels of questions may enable them to engender higher-quality responses from students. It is our hope that, by examining the patterns observed in our results, others will be able to modify their own discussion prompts to stimulate higher levels of thinking among their students.

References

Allen, I. E., & Seaman, J. (2008). Staying the course: Online education in the United States, 2008. Needham, MA: Sloan Consortium. Retrieved November 30, 2008, from http://www.sloanc.org/publications/survey/downloadreports

Andrews, J. D. W. (1980). The verbal structure of teacher questions: Its impact on class discussion. POD Quarterly, 2, 129-163.

Bloom, B. (1956). Taxonomy of educational objectives. New York: David McKay.

Dillon, J. T. (1994). The effect of questions in education and other enterprises. Journal of Curriculum Studies, 14, 127-152.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. The American Journal of Distance Education, 15(1), 7-23.

Gibson, J. (2009). Discussion approach to instruction. In C. M. Reigeluth (Ed.), Instructional-design theories and models, Vol. III: Building a common knowledge base (pp. 99-116). New York: Taylor and Francis.

Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful discourse: A case study. British Journal of Educational Technology, 36(1), 5-18.

Limbach, B., & Waugh, W. (2005, Fall). Questioning the lecture format. The NEA Higher Education Journal: Thought and Action, 20(1), 47-56. Retrieved January 18, 2011, from http://www.nea.org/assets/img/pubthoughtandaction/taa_05_05.pdf

McLoughlin, D., & Mynard, J. (2009). An analysis of higher order thinking in online discussions. Innovations in Education and Teaching International, 46(2), 147-160.

Pena-Shaff, J. B., & Nicholls, C. (2004). Analyzing student interactions and meaning construction in computer bulletin board discussions. Computers and Education, 42(3), 243-265.

Richardson, J. C., & Ice, P. (2010). Investigating students' level of thinking across instructional strategies in online discussions. Internet and Higher Education, 13, 52-59.

Schrire, S. (2006). Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis. Computers and Education, 46, 49-70.

Swan, K., Garrison, D. R., & Richardson, J. C. (2009). A constructivist approach to online learning: The Community of Inquiry framework. In C. R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks. Hershey, PA: IGI Global.

Zsohar, H., & Smith, J. A. (2008). Transition from the classroom to the Web: Successful strategies for teaching online. Nursing Education Perspectives, 29(1), 23-28.