QM Research Grants FY11




Project Title: The Development of Technological Pedagogical Content Knowledge (TPACK) in Instructors Using Quality Matters Training, Rubric, and Peer Collaboration
Researcher: Cheryl Ward, Principal Investigator, University of Akron

Findings: Pedagogy is central to the quality development of online course design, and the TPACK (Technological, Pedagogical, Content Knowledge) conceptual framework is key in enabling instructors to develop new schemas for re-conceptualizing content, pedagogy and technology. The researchers posited that the Quality Matters process helps instructors develop this complex knowledge, enabling them to discuss, develop and implement more effective online learning. The purpose of the project was therefore to study how the QM Rubric and QM training help instructors develop TPACK knowledge. Two research questions guided the study: Is the QM Rubric consistent with the TPACK framework in helping instructors construct knowledge about quality design and online instruction? How is the QM Rubric implemented and integrated as a catalyst to inform and guide online instructors toward quality design and instruction?

An alignment between the QM Rubric and the TPACK conceptual framework was conducted to determine whether any gaps existed between the Rubric and the six areas of the framework. Three professors of instructional technology and three instructional technology students performed independent alignments of the TPACK framework and the QM Rubric. One premise of the study was that the QM Program can inform and facilitate knowledge growth in the TPACK areas. The alignment indicated that the QM Rubric corresponds fairly well with the TPACK conceptual framework. It is striking that a rubric that purports to address only the design elements of an online class aligns so closely with the pedagogical elements of the TPACK framework.
This alignment supports the initial contention that the elements of the QM Rubric foster discussion and knowledge development in more areas than just the design of these environments. Technological, pedagogical and content discussions overlap and connect in so dynamic a way that they cannot be discussed or worked on in isolation during online course development.

Four participants were recruited after they finished QM training. Multiple data sources collected in the study revealed that becoming online learners themselves for the QM training helped the participants understand the needs of online learners. The QM Rubric increased the instructors' knowledge of the importance of aligning learning objectives to assessment, instructional activities, and technology integration. Even though QM training and the QM Rubric did not explicitly introduce the TPACK conceptual framework to the instructors, it is clear that their knowledge increased in the areas of technology, pedagogy and content. The learning experiences they shared with the researchers about the QM training also demonstrated that they grew into more sophisticated online instructors because of the way they designed, modified and implemented their online courses with the knowledge they gained. The data analysis suggests a developmental model with a few key transitional points on the path to becoming an effective online instructor, and indicates how QM training could address these transitional points to deliver training more efficiently, enhancing the quality of online courses with explicit guidelines that extend beyond course design to other aspects of online teaching and learning.

Project Title: Evaluating the Impact of the Quality Matters Review Process on Student and Faculty Perceptions of Course Quality
Researcher: Tina Parscal, Principal Investigator, University of the Rockies

Findings: Research questions: What is the extent to which QM certification impacts student perceptions of a course? Which standards have the greatest impact on student perceptions of a course? In an enterprise model of course development, what do instructors perceive as the positive and negative aspects of teaching the same course before and after the QM course review process?

UOR commitment to QM at the institutional level: Although a QM review is generally a 6-8 week process per course, as the phases dictate, chronological progress depends on the outcome of each particular stage. If course suggestions need to be applied, the overall length of the process grows accordingly. These dependencies are critical to the overall lifecycle, but they also affected the data collection phases and survey launches associated with the grant research project. 10-12 participants (faculty and staff) are selected per quarter to complete the APPQMR training.
Standard with the most impact: all eight standards were identified at least once.
Strengths of the QM process: collaboration (4), alignment (4), consistency (4).
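As a side note on method, summary statistics of the form "mean (SD)" like those reported for the student survey can be computed from raw Likert responses with a few lines of Python. This is a minimal sketch; the responses below are hypothetical illustration, not the study's raw data, which was not published.

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses to a single survey item
# (illustrative only; not the study's actual data).
responses = [5, 4, 5, 4, 3, 5, 4, 4]

m = mean(responses)    # sample mean
sd = stdev(responses)  # sample standard deviation (n - 1 denominator)

# Report in the "mean (SD)" format used by the study
print(f"{m:.2f} ({sd:.2f})")
```

The sample (n - 1) standard deviation is the conventional choice when, as here, the respondents are treated as a sample of the course's students rather than the whole population.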

Student survey responses, mean (SD):
4.3 (.68) The instructions on how to get started in the course were clear. (Standard 1)
4.18 (.88) The course outcomes seemed achievable. (Standard 2)
4.31 (.82) The course requirements for student interaction were clearly stated. (Standard 5)

Unintended results of this study: Although the research aimed to determine student and faculty perceptions of quality as a result of the QM review process, we discovered that the majority of our courses are QM recognized on the first submission. While this does not support the grant research process per se, it does indicate that embedding QM standards in the tools and templates that faculty and the curriculum team use to develop online courses is effective in supporting the development of high-quality courses. It has also led to high student-perceived quality on pre-tests.

Project Title: The Faculty Experience in Preparing Faculty to Use the Quality Matters Model for Course Improvement
Researcher: Carol Roehrs, Principal Investigator, University of Northern Colorado

Findings: This was a descriptive qualitative study that gathered information on the most efficient and effective ways to use the program by examining the experiences of faculty with QM training and the updating of their online courses. Six faculty members volunteered, two for each level of QM training: self (provided with the QM Rubric only), short (attended an informal QM training session, no follow-up support) and long (attended QM training, with follow-up support made available). Following training, each course was self-reviewed by the study participant, informally peer reviewed, updated to reflect recommendations, and officially QM reviewed. Participants were asked to comment on their experiences. An increase in Rubric scores from peer to QM reviews was seen as evidence that participants were able to improve their courses.
Taking QM training helped faculty improve their courses, but how to change a course to meet the standards was sometimes unclear to participants, and the help of instructional designers in course improvement was valued. There was a trend toward greater agreement between the self- and official reviews for faculty who participated in the longer training and for faculty who had more experience teaching online. Participants reported that they found QM principles and the Rubric to be a valuable approach to improving their online courses.

Feedback from the faculty peer reviewers was seen as helpful, as was feedback from the official QM review teams. Participants lamented the lack of time they had to learn how to use the Rubric and to participate in the study. Despite their ability to make improvements, instructors wanted help from experts when updating their courses so they could get things right the first time. The experiences of the faculty were positive overall: they liked the Rubric and whatever training they had received. The pervasive theme in faculty comments was lack of time to assess and revise their courses, much less to participate in the study, which implies that new approaches or processes may need to be devised to support faculty efforts to improve their courses. These findings support administrative efforts to develop and support faculty peer review programs, perhaps starting with self-review, and to provide expert help with instructional design for faculty seeking to improve the quality of their online courses.

Project Title: Impact of Quality Matters Recognition on Course Attrition Rates
Researcher: Salvatore Diomede, Principal Investigator, Cleveland State University

Findings: Attrition in distance higher education is a common problem, although recent studies point to more and more colleges improving their online retention rates (Parry, 2010). Many factors go into determining whether or not a student is likely to continue with his or her studies. Diaz and Cartnal (2006) identified four broad categories of factors that help explain and predict attrition in distance education: student situations, student dispositions, institutional systems and course content. Course design, which QM attends to, falls under the category of institutional systems factors. Although course design is important, this prediction model shows that other factors, both internal and external to the student, also play a role in explaining attrition.
This project was an attempt by the research team to organize a baseline study examining whether course design has an impact on course attrition. In this study, 11 courses were reviewed and improved during one semester (the level of improvement was not detailed) and offered by the same instructor the next semester. While there is no statistical evidence that QM impacts students' online experience and/or withdrawal rates, students' responses to open-ended questions about their online experience do suggest that QM standards are important to a positive learning experience. Student comments reflect three major themes. First, to reduce students' frustration, courses should have clear course- and unit-level objectives, organization, and instructor expectations; clear instruction on how to proceed and where to find various course components should also be provided. Many participants (especially those from non-QM courses) expressed concern about organizational and navigational issues. Second, to promote students' positive experience in an online course, sufficient interaction and engaging activities or materials are also necessary. Nonresponsiveness of instructors was noted as a frustration; unfortunately, it appears that some instructors believe online teaching is about presenting reading materials and notes and then testing what students have learned. From both the QM standards' and the students' perspectives, more interactive activities and materials are necessary, such as synchronous online interaction, case studies, videos, etc. Third, while QM standards might facilitate a positive online learning experience, without an instructor's positive presence students are very likely left with a negative online experience. Initially, we (the researchers) were surprised to see quite a few very negative comments about online courses, even some QM-recognized courses. After more detailed examination of students' feedback, we found that students were talking about instructor presence, which is a delivery-related aspect rather than course design itself. Therefore, even if a course is well designed, it does not mean students will automatically glean a positive experience from it; a well-designed course still needs an instructor's creative and diligent delivery.

Discussion: Although this study, completed in a short two-semester "turn-around", did not find statistically significant evidence either supporting or refuting QM's effect on withdrawal rates, there are ways to expand upon the research to tell a greater piece of the story of attrition. One important factor in future research would be to control for the delivery variable, meaning the instructor's level of interaction with students. According to our survey, instructor presence seems to have a direct effect on students' perceptions of their online learning experience.
This very likely influences the decisions students make about whether to persist in or withdraw from a course. Therefore, to further understand the effects of QM recognition on attrition, more accurate control of variables is necessary. Additionally, a longitudinal study may provide a more complete set of data for comparison. This study had only one semester of data available for the QM-recognized courses, allowing only a limited comparison against the three years of previous data for the pre-QM courses. Further studies could examine multiple semesters of post-QM-recognition offerings against an equal span of pre-QM offerings.

Quality Matters Research Grants FY10

Project Title: Linking Course Design to Learning Processes Using the Quality Matters and Community of Inquiry Frameworks
Researcher: Karen Swan, James J. Stukel Distinguished Professor of Educational Leadership, University of Illinois Springfield

Findings:
1. Results indicate no differences on the CoI survey but differences approaching significance on the outcome measures. On average, students in the spring 2010 class outperformed students in the fall 2009 classes on both the final exam and the research proposal, and their overall grade point averages were slightly higher. None of these differences were statistically significant as measured by independent-samples t-tests, but the difference in scores on the research proposal approached significance (p = .053).
2. Findings suggest that revising the course around its stated objectives resulted in better student outcomes related to those objectives, especially the ability to write a research proposal. Although a ceiling effect may also be operating with respect to the outcome measures, the authors believe that greater numbers (from the summer and fall 2010 sections of EDL 541) may yield significant results. (The study is ongoing.)
3. The researchers suggested that student performance may have improved because the QM revision led instructors to focus on objectives and the mapping of objectives to outcomes, and that this focus translated into their activity in the course. This possibility will be explored further through qualitative means.
4. Finally, it should be pointed out that the findings concerning course outcomes support the notion that the QM and CoI frameworks are orthogonal in nature. The linking of online course design to implementation, and of learning processes to course outcomes, is long overdue in distance education. This ongoing study is not only a first step in that direction; it also employs what are probably the two most commonly used theoretical frameworks in online education.

Project Title: Quality Matters Rubric as "Teaching Presence": Application of the Community of Inquiry Framework to Analysis of the QM Rubric's Effects on Student Learning
Researcher: Anna A. Hall, PhD, Delgado Community College

Findings: Implementation of the QM Rubric as new "design and organization":
1. Increases teacher and teaching presences.
2. Reduces direct management on the discussion board (DB) and personalizes management by moving it to the teacher domain (personal e-mail).
3. Reduces student self-management (i.e., "group concerns") on the DB, thus indirectly reducing student social presence.
4. Has a positive effect on students' higher-order cognitive presence via higher teaching presence.
5. Teacher presence, student social presence and QM positively affect course satisfaction.
6. QM and higher-order cognitive presence positively affect DB grades.
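As a methodological aside, the independent-samples t-test the Swan study used to compare fall 2009 and spring 2010 sections can be sketched in pure Python. This is a minimal sketch of the standard pooled-variance (Student's) form; the scores below are hypothetical and are not the study's data.

```python
from math import sqrt
from statistics import mean

def pooled_t(a, b):
    """Independent-samples (Student's) t statistic with pooled variance.

    Returns (t, df). The p-value would then come from the t distribution
    with df degrees of freedom (e.g. via scipy.stats.t.sf)."""
    na, nb = len(a), len(b)
    ma, mb = mean(a), mean(b)
    ssa = sum((x - ma) ** 2 for x in a)  # sum of squared deviations, group a
    ssb = sum((x - mb) ** 2 for x in b)  # sum of squared deviations, group b
    df = na + nb - 2
    pooled_var = (ssa + ssb) / df
    se = sqrt(pooled_var * (1 / na + 1 / nb))
    return (mb - ma) / se, df

# Hypothetical research-proposal scores (illustrative only)
fall_2009 = [78, 82, 75, 80]
spring_2010 = [85, 88, 83, 86]

t_stat, df = pooled_t(fall_2009, spring_2010)
print(f"t({df}) = {t_stat:.2f}")
```

With unequal group variances or sizes, Welch's variant (which does not pool the variances) is usually the safer default.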

Project Title: The Impact of Quality Matters Standards on Student Perception of Online Courses
Researcher: Evelyn Knowles, Park University

Findings: Differences in before/after ratings by students and a QM master reviewer:
1. The certified master reviewer found that both courses failed to meet many of the standards. The vast majority of students, however, disagreed and concluded that the courses met the standards ("Yes, Yes, Yes" answers on the survey).
2. Discussion: Why the surprising differences? Since the students scored all of the QM standards positively on both the initial and the redeveloped course, this may indicate a discrepancy between their level of expectations and the certified master reviewer's. Perhaps students did not access or read the content items necessary to meet the QM standards. Another possibility is that, during the teaching function of delivery, the instructor contributed clarifying statements and directions.

Project Title: Maricopa Quality Matters Program Review
Researchers: Christy Alarcon, Jennifer Strickland, Shelley Rodrigo; Maricopa Community Colleges

1. A survey and a focus group were conducted by the Maricopa Center for Learning and Instruction with 20 of 67 QM-certified faculty and administrators to gather baseline data on the impact of QM participation and its dissemination throughout the expansive Maricopa Community College system (10 colleges, 4,000 faculty, 250,000 students).
2. Knowledge gained during QM training was shared informally through conversations (100%). A majority of respondents reported sharing QM information at department committee meetings (75%), at college-wide meetings (60%) and by distributing the QM Rubric to colleagues (65%).
3. Alignment between learning objectives, outcomes/competencies and course materials was the area most affected by the QM training. Most faculty and administrators realized that alignment was the key to making course design effective.
4. A prominent theme that emerged was that the biggest obstacle to faculty developing online and hybrid courses is funding by their respective colleges. The data showed that the colleges with the most success with QM were those that implemented a funding model structure.
5. The creation of the EDUlearning Community at MCC allowed faculty to become familiar with the QM process before deciding to submit their courses for an official QM review. This is a department initiative that creates a sense of community among online and adjunct faculty.

Project Title: QM Standard 8 Pilot Project
Researcher: Ed Bowen, Dallas TeleCollege

Recommendation: Accessibility is as much about the user as it is about the course, and student involvement in accessibility efforts is strongly suggested. This project was designed to gather student input on thoughts and ideas about accessibility.