Designing Scoring Rubrics for Your Classroom




Craig A. Mertler, Ph.D.
Division of Educational Foundations & Inquiry
College of Education & Human Development
Bowling Green State University
Bowling Green, OH 43403
mertler@bgnet.bgsu.edu
Phone: 419-372-9357 | Fax: 419-372-8265

Rubrics are rating scales, as opposed to checklists, that are used specifically with performance assessments. They are formally defined as scoring guides, consisting of specific pre-established criteria, used in evaluating student work on performance assessments. Rubrics are typically the specific form of scoring instrument used when evaluating student performances or products resulting from a performance task.

There are two types of rubrics: holistic and analytic (see Figure 1). A holistic rubric requires the teacher to score the overall process or product as a whole, without judging the component parts separately (Nitko, 2001). In contrast, with an analytic rubric, the teacher first scores separate, individual parts of the product or performance, then sums the individual scores to obtain a total score (Moskal, 2000; Nitko, 2001).

Figure 1: Types of scoring instruments for performance assessments

Scoring Instruments for Performance Assessments
- Checklists
- Rating Scales
- Rubrics
  - Holistic Rubrics
  - Analytic Rubrics

Holistic rubrics are customarily utilized when errors in some part of the process can be tolerated provided the overall quality is high (Chase, 1999). Nitko (2001) further states that use of holistic rubrics is probably more appropriate when performance tasks require students to create some sort of response and where there is no definitive correct answer. The focus of a score reported using a holistic rubric is on the overall quality, proficiency, or understanding of the specific content and skills; it involves assessment on a unidimensional level (Mertler, 2001). Holistic rubrics can result in a somewhat quicker scoring process than analytic rubrics (Nitko, 2001), primarily because the teacher is required to read through or otherwise examine the student product or performance only once, in order to get an overall sense of what the student was able to accomplish (Mertler, 2001).
Since assessment of the overall performance is the key, holistic rubrics are also typically, though not exclusively, used when the purpose of the assessment is summative in nature. At most,

only limited feedback is provided to the student as a result of scoring performance tasks in this manner. A template for holistic scoring rubrics is presented in Figure 2.

Analytic rubrics are usually preferred when a fairly focused type of response is required (Nitko, 2001); that is, for performance tasks where there may be one or two acceptable responses and creativity is not an essential feature of the students' responses. Furthermore, analytic rubrics result initially in several scores, followed by a summed total score; their use represents assessment on a multidimensional level (Mertler, 2001). As previously mentioned, the use of analytic rubrics can cause the scoring process to be substantially slower, mainly because assessing several different skills or characteristics individually requires a teacher to examine the product several times. Both their construction and use can be quite time-consuming. A general rule of thumb is that an individual's work should be examined a separate time for each of the specific performance tasks or scoring criteria (Mertler, 2001). However, the advantage to the use of analytic rubrics is quite substantial. The degree of feedback offered to students and to teachers is significant. Students receive specific feedback on their performance with respect to each of the individual scoring criteria, something that does not happen when using holistic rubrics (Nitko, 2001). It is then possible to create a profile of specific student strengths and weaknesses (Mertler, 2001). A template for analytic scoring rubrics is presented in Figure 3.

Figure 2: Template for holistic rubrics

Score 5: Demonstrates complete understanding of the problem. All requirements of task are included in response.
Score 4: Demonstrates considerable understanding of the problem. All requirements of task are included.
Score 3: Demonstrates partial understanding of the problem. Most requirements of task are included.
Score 2: Demonstrates little understanding of the problem. Many requirements of task are missing.
Score 1: Demonstrates no understanding of the problem.
Score 0: No response/task not attempted.
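As a concrete illustration of the unidimensional nature of holistic scoring, the Figure 2 template can be sketched as a single score-to-descriptor lookup. The code below is an illustrative sketch, not part of the original article; the descriptors are abbreviated from the template above.

```python
# Illustrative sketch (not from the article): a holistic rubric maps ONE
# overall score to ONE descriptor -- there are no component scores to sum.
HOLISTIC_TEMPLATE = {
    5: "Demonstrates complete understanding of the problem.",
    4: "Demonstrates considerable understanding of the problem.",
    3: "Demonstrates partial understanding of the problem.",
    2: "Demonstrates little understanding of the problem.",
    1: "Demonstrates no understanding of the problem.",
    0: "No response/task not attempted.",
}

def holistic_feedback(score: int) -> str:
    """Return the single descriptor attached to an overall score."""
    return HOLISTIC_TEMPLATE[score]

print(holistic_feedback(3))
```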

Figure 3: Template for analytic rubrics

For each criterion (Criteria #1 through Criteria #4), the teacher writes a description of performance at each level and records a separate score:

Beginning (1): description reflecting a beginning level of performance
Developing (2): description reflecting movement toward mastery of performance
Accomplished (3): description reflecting achievement of mastery of performance
Exemplary (4): description reflecting the highest level of performance

Total Score = sum of the individual criterion scores

Prior to designing a specific rubric, a teacher must decide whether the performance or product will be scored holistically or analytically (Airasian, 2000, 2001). Regardless of which type of rubric is selected, specific criteria and observable indicators must be identified as an initial step to development. The decision regarding the use of a holistic or analytic approach to scoring has several possible implications. The most important of these is that teachers must consider first how they intend to use the results. If an overall, summative score is desired, a holistic scoring approach is more desirable. In contrast, if formative feedback is the goal, an analytic scoring rubric should be used. It is important to note that one type of rubric is not inherently better than the other; you must find a format that works best for you (Montgomery, 2001). Other implications include the time requirements, the nature of the task itself, and the specific criteria being observed. As demonstrated in the templates (Figures 2 and 3), the various levels of student performance can be defined using either quantitative (i.e., numerical) or qualitative (i.e., descriptive) labels. In some instances, teachers might want to utilize

both quantitative and qualitative labels. If a rubric contains four levels of proficiency or understanding on a continuum, quantitative labels would typically range from 1 to 4. When using qualitative labels, teachers have much more flexibility and can be more creative. A common type of qualitative scale might include the following labels: master, expert, apprentice, and novice. Nearly any type of qualitative scale will suffice, provided it fits with the task. The following list offers several suggestions for qualitative descriptors, from lowest to highest (Stix, 1996):

Novice, Apprentice, Proficient, Master
Attempted, Acceptable, Admirable, Awesome
Lead, Bronze, Silver, Gold
Amateur, Semi-Pro, Professional, Hall of Fame
Byte, Kilobyte, Megabyte, Gigabyte
Minnow, Goldfish, Tuna, Shark

One potentially frustrating aspect of scoring student work with rubrics is the issue of converting the scores to grades. It is not a good idea to think of rubrics in terms of percentages (Trice, 2000). For example, if a rubric has six levels (or "points"), a score of 3 should not be equated to 50% (an F in most letter grading systems). The process of converting rubric scores to grades or categories is more a process of logic than a mathematical one. Trice (2000) suggests that in a rubric scoring system, there are typically more scores in the average and above-average categories (i.e., equating to grades of C or better) than in the below-average categories. For instance, if a rubric consisted of nine score categories, the equivalent grades and categories might look like this:

Rubric Score   Grade   Category
8              A+      Excellent
7              A       Excellent
6              B+      Good
5              B       Good
4              C+      Fair
3              C       Fair
2              U       Unsatisfactory
1              U       Unsatisfactory
0              U       Unsatisfactory

When converting rubric scores to grades (typical at the secondary level) or descriptive feedback (typical at the elementary level), it is important to remember that there is not necessarily one correct way to accomplish this. The bottom line for classroom teachers is that they must find a system of conversion that works for them and fits comfortably into their individual system of reporting student performance.
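The logic-based (rather than percentage-based) conversion described above can be sketched as a simple lookup table. This is an illustrative sketch only; the function name is invented here, and the mapping follows the nine-category example from Trice (2000) shown in the text.

```python
# Illustrative sketch: converting a 0-8 rubric score to a grade and category
# by table lookup, per the logic-based mapping suggested by Trice (2000).
# Note that a middle score of 3 maps to "C"/"Fair", NOT to 3/8 = 37.5%.
GRADE_TABLE = {
    8: ("A+", "Excellent"),
    7: ("A", "Excellent"),
    6: ("B+", "Good"),
    5: ("B", "Good"),
    4: ("C+", "Fair"),
    3: ("C", "Fair"),
    2: ("U", "Unsatisfactory"),
    1: ("U", "Unsatisfactory"),
    0: ("U", "Unsatisfactory"),
}

def rubric_to_grade(score: int) -> tuple[str, str]:
    """Look up the (grade, category) pair for a rubric score."""
    if score not in GRADE_TABLE:
        raise ValueError(f"Score {score} is outside the rubric range 0-8")
    return GRADE_TABLE[score]

print(rubric_to_grade(3))  # → ('C', 'Fair')
```

Each teacher's table will differ; the point of the sketch is that the conversion is an explicit, teacher-chosen mapping rather than a percentage calculation.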

Steps in the Design of Scoring Rubrics

A step-by-step process for designing scoring rubrics for classroom use is presented below. Information for these procedures was compiled from various sources (Airasian, 2000, 2001; Mertler, 2001; Montgomery, 2001; Nitko, 2001; Tombari & Borich, 1999). The steps are summarized and discussed, followed by presentations of two sample scoring rubrics.

Step 1: Re-examine the learning objectives to be addressed by the task. This allows you to match your scoring guide with your objectives and actual instruction.

Step 2: Identify specific observable attributes that you want to see (as well as those you don't want to see) your students demonstrate in their product, process, or performance. Specify the characteristics, skills, or behaviors that you will be looking for, as well as common mistakes you do not want to see.

Step 3: Brainstorm characteristics that describe each attribute. Identify ways to describe above-average, average, and below-average performance for each observable attribute identified in Step 2.

Step 4a: For holistic rubrics, write thorough narrative descriptions for excellent work and poor work, incorporating each attribute into the description. Describe the highest and lowest levels of performance, combining the descriptors for all attributes.

Step 4b: For analytic rubrics, write thorough narrative descriptions for excellent work and poor work for each individual attribute. Describe the highest and lowest levels of performance using the descriptors for each attribute separately.

Step 5a: For holistic rubrics, complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for the collective attributes. Write descriptions for all intermediate levels of performance.

Step 5b: For analytic rubrics, complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for each attribute. Write descriptions for all intermediate levels of performance for each attribute separately.

Step 6: Collect samples of student work that exemplify each level. These will help you score in the future by serving as benchmarks.

Step 7: Revise the rubric, as necessary. Be prepared to reflect on the effectiveness of the rubric and revise it prior to its next implementation.

These steps have been summarized in Figure 4.

Figure 4: Step-by-step procedure for the design of performance assessment scoring rubrics

Step 1: Re-examine the learning objectives to be addressed by the task.
Step 2: Identify specific observable attributes that you want to see (as well as those you don't want to see) your students demonstrate in their product, process, or performance.
Step 3: Brainstorm characteristics that describe each attribute.
Step 4a (holistic): Write thorough narrative descriptions for excellent work and poor work, incorporating each attribute into the description.
Step 4b (analytic): Write thorough narrative descriptions for excellent work and poor work for each individual attribute.
Step 5a (holistic): Complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for the collective attributes.
Step 5b (analytic): Complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for each attribute.
Step 6: Collect samples of student work that exemplify each level.
Step 7: Revise the rubric, as necessary.
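The product of Steps 1 through 5 for an analytic rubric can be pictured as a small data structure: a set of criteria (Step 2), each with a descriptor per level (Steps 4b and 5b), summed to a total score. The following sketch is illustrative only; the criterion names and abbreviated descriptors are invented for the example and are not from the article.

```python
# Illustrative sketch: an analytic rubric as a data structure.
# Criterion names and descriptors here are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    levels: dict[int, str]  # score -> descriptor (written in Steps 4b/5b)

rubric = [
    Criterion("Estimation",
              {1: "inaccurate", 2: "attempted", 3: "good", 4: "accurate"}),
    Criterion("Conclusions",
              {1: "none", 2: "unsupported", 3: "logical", 4: "supported"}),
]

def total_score(scores: dict[str, int]) -> int:
    """Sum the per-criterion scores, as an analytic rubric prescribes."""
    return sum(scores[c.name] for c in rubric)

print(total_score({"Estimation": 3, "Conclusions": 4}))  # → 7
```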

Two Examples

Two sample scoring rubrics corresponding to specific performance assessment tasks are presented next. Brief discussions precede the actual rubrics. For illustrative purposes, a holistic rubric is presented for the first task and an analytic rubric for the second. It should be noted that either a holistic or analytic rubric could have been designed for either task.

Example 1: Subject: Mathematics; Grade Level(s): Upper Elementary

Mr. Harris, a fourth-grade teacher, is planning a unit on the topic of data analysis, focusing primarily on the skills of estimation and interpretation of graphs. Specifically, at the end of this unit, he wants to be able to assess his students' mastery of the following instructional objectives:

- Students will properly interpret a bar graph.
- Students will accurately estimate values from within a bar graph. (Step 1)

Since the purpose of his task is summative in nature (the results will be incorporated into the students' grades), he decides to develop a holistic rubric. He identifies the following four attributes on which to focus his rubric: estimation, mathematical computation, conclusions, and communication of explanations (Steps 2 & 3). Finally, he begins drafting descriptions of the various levels of performance for the observable attributes (Steps 4 & 5). The final rubric for his task appears in Figure 5.

Figure 5: Sample holistic rubric
Math Performance Task Scoring Rubric: Data Analysis
Name ________  Date ________

Score 4: Makes accurate estimations. Uses appropriate mathematical operations with no mistakes. Draws logical conclusions supported by graph. Sound explanations of thinking.
Score 3: Makes good estimations. Uses appropriate mathematical operations with few mistakes. Draws logical conclusions supported by graph. Good explanations of thinking.
Score 2: Attempts estimations, although many inaccurate. Uses inappropriate mathematical operations, but with no mistakes. Draws conclusions not supported by graph. Offers little explanation.
Score 1: Makes inaccurate estimations. Uses inappropriate mathematical operations. Draws no conclusions related to graph. Offers no explanations of thinking.
Score 0: No response/task not attempted.

Example 2: Subjects: Social Studies; Probability & Statistics; Grade Level(s): 9-12

Mrs. Wolfe is a high school American government teacher. She is beginning a unit on the electoral process and knows from past years that her students sometimes have difficulty with the concepts of sampling and election polling. She decides to give her students a performance assessment so they can demonstrate their levels of understanding of these concepts. The main idea that she wants to focus on is that samples (surveys) can accurately predict the viewpoints of an entire population. Specifically, she wants to be able to assess her students on the following instructional objectives:

- Students will collect data using appropriate methods.
- Students will accurately analyze and summarize their data.
- Students will effectively communicate their results. (Step 1)

Since the purpose of her task is formative in nature, she decides to develop an analytic rubric focusing on the following attributes: sampling technique, data collection, statistical analyses, and communication of results (Steps 2 & 3). She drafts descriptions of the various levels of performance for the observable attributes (Steps 4 & 5). The final rubric for this task appears in Figure 6.

Figure 6: Sample analytic rubric
Performance Task Scoring Rubric: Population Sampling
Name ________  Date ________

Sampling Technique
  Beginning (1): Inappropriate sampling technique used
  Developing (2): Appropriate technique used to select sample; major errors in execution
  Accomplished (3): Appropriate technique used to select sample; minor errors in execution
  Exemplary (4): Appropriate technique used to select sample; no errors in procedures

Survey/Interview Questions
  Beginning (1): Inappropriate questions asked to gather needed information
  Developing (2): Few pertinent questions asked; data on sample is inadequate
  Accomplished (3): Most pertinent questions asked; data on sample is adequate
  Exemplary (4): All pertinent questions asked; data on sample is complete

Statistical Analyses
  Beginning (1): No attempt at summarizing collected data
  Developing (2): Attempts analysis of data, but uses inappropriate procedures
  Accomplished (3): Proper analytical procedures used, but analysis incomplete
  Exemplary (4): All proper analytical procedures used to summarize data

Communication of Results
  Beginning (1): Communication of results is incomplete, unorganized, and difficult to follow
  Developing (2): Communicates some important information; not organized well enough to support decision
  Accomplished (3): Communicates most of important information; shows support for decision
  Exemplary (4): Communication of results is very thorough; shows insight into how data predicted outcome

Total Score = sum of the four criterion scores
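Because Mrs. Wolfe's purpose is formative, the four separate criterion scores matter more than the total: they form the strengths-and-weaknesses profile mentioned earlier. The sketch below is illustrative only; the student scores are invented for the example.

```python
# Illustrative sketch: one (hypothetical) student scored with Mrs. Wolfe's
# four-criterion analytic rubric. The per-criterion scores, not just the
# total, drive the formative feedback.
scores = {
    "Sampling Technique": 4,
    "Survey/Interview Questions": 3,
    "Statistical Analyses": 2,
    "Communication of Results": 3,
}

total = sum(scores.values())  # analytic total: 12 out of a possible 16
strengths = [c for c, s in scores.items() if s >= 3]
weaknesses = [c for c, s in scores.items() if s <= 2]

print(total)       # → 12
print(weaknesses)  # → ['Statistical Analyses']
```

A holistic rubric would have collapsed this student to a single number; the analytic profile tells Mrs. Wolfe exactly which skill to reteach.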

Resources for Rubrics on the Web

The following is a partial list of Web resources for information about and samples of scoring rubrics.

Scoring Rubrics: What, When, & How? (http://ericae.net/pare/getvn.asp?v=7&n=3)
This article appears in Practical Assessment, Research, & Evaluation and is authored by Barbara M. Moskal. It discusses what rubrics are and distinguishes between holistic and analytic types. Examples and additional resources are provided.

Performance Assessment Scoring (http://www.pgcps.pg.k12.md.us/~elc/scoringtasks.html)
Staff in the Prince George's County (MD) Public Schools have developed a series of pages that describe the steps involved in the design of performance tasks. This particular page provides several rubric samples.

Rubrics from the Staff Room for Ontario Teachers (http://www.quadro.net/~ecoxon/reporting/rubrics.htm)
This site is a collection of literally hundreds of teacher-developed rubrics for scoring performance tasks. The rubrics are categorized by subject area and type of task. This is a fantastic resource; check it out!

Rubistar Rubric Generator (http://rubistar.4teachers.org/)
Teacher Rubric Maker (http://www.teach-nology.com/web_tools/rubrics/)
These two sites house Web-based rubric generators for teachers. Teachers can customize their own rubrics based on templates on each site. In both cases, rubric templates are organized by subject area and/or type of task. These are wonderful resources for teachers!

References

Airasian, P. W. (2000). Assessment in the classroom: A concise approach (2nd ed.). Boston: McGraw-Hill.

Airasian, P. W. (2001). Classroom assessment: Concepts and applications (4th ed.). Boston: McGraw-Hill.

Mertler, C. A. (2001). Using performance assessment in your classroom. Unpublished manuscript, Bowling Green State University.

Montgomery, K. (2001). Authentic assessment: A guide for elementary teachers. New York: Longman.

Moskal, B. M. (2000). Scoring rubrics: What, when, and how? Practical Assessment, Research, & Evaluation, 7(3). Available online: http://ericae.net/pare/getvn.asp?v=7&n=3

Nitko, A. J. (2001). Educational assessment of students (3rd ed.). Upper Saddle River, NJ: Merrill.

Stix, A. (1996, November). Creating rubrics through negotiable contracting and assessment. Paper presented at the National Middle School Conference, Baltimore, MD. (ERIC Document Reproduction Service No. ED411273)

Tombari, M., & Borich, G. (1999). Authentic assessment in the classroom: Applications and practice. Upper Saddle River, NJ: Merrill.

Trice, A. D. (2000). A handbook of classroom assessment. New York: Longman.

Citation: Mertler, Craig A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25). Available online: http://www.pareonline.net/getvn.asp?v=7&n=25