Progress Monitoring for Specific Learning Disabilities (SLD) Eligibility Decisions




Tony Evers, PhD, State Superintendent
PO Box 7841, Madison, WI 53707-7841 | 125 South Webster Street, Madison, WI 53703 | (608) 266-3390 | (800) 441-4563 toll free | dpi.wi.gov

Introduction

Since the revised SLD rule went into effect in 2010, the Department has received a number of questions about the selection of progress monitoring tools used to collect data during intensive scientific, research-based or evidence-based interventions (SRBIs). This document provides information and guidance in response to the specific question: How does a local educational agency determine whether to use a Curriculum Based Measure (CBM) or a Computer Adaptive Test (CAT) when collecting progress data during intensive scientific, research-based or evidence-based interventions for the purpose of making a special education eligibility decision in the area of Specific Learning Disabilities?

Background

There are several types of progress monitoring tools that may or may not qualify as a "probe" as defined by the Wisconsin Specific Learning Disabilities (SLD) eligibility rule. Both Curriculum Based Measures (CBMs) and Computer Adaptive Tests (CATs) can be used in a school district's strategic assessment system and can serve numerous purposes when making instructional decisions for all students. Both can be used for progress monitoring over relatively long periods of time (e.g., months, semesters, years). However, when considering an instrument to monitor progress over a short period of time (weeks, or the course of a special education evaluation timeline), local educational agencies (LEAs) will need to consider carefully the purpose for which the data are used and the relative strengths and weaknesses of the measure for that purpose. This is particularly true when the data will be used to make a special education eligibility decision.

Local Decision

How an LEA conducts progress monitoring during intensive interventions is governed by administrative rule, the SLD rule (see PI 11.02 and PI 11.36(6)). However, the choice of which progress monitoring probe to use when collecting progress data during a scientific, research- or evidence-based intensive intervention (SRBI) for the purpose of making a special education eligibility decision in the area of SLD is a local decision.

Decision Errors

Identifying a student as having a disability is a high stakes decision, both for the student and for the district. Decision errors are costly. Public schools spend an average of two to three times more money on each student eligible for special education than they do on students without disabilities (Center for Special Education Finance). Educational, employment, and community-based outcomes for students with special educational needs are poor in comparison to students educated entirely in the general education system. Therefore, LEAs are keen to ensure they do not make the mistake of identifying a student as needing special educational services when he or she does not. More costly still are the lifelong consequences that a student and his or her family bear when the student is misidentified as needing special education despite not, in fact, having a disability. There is a second type of decision error to consider: finding that a student does not need special educational services when he or she does, in fact, need those services in order to progress in school. Such a student is at risk of not receiving the intensive interventions needed to close the academic gap. The risk of decision errors in SLD eligibility determinations can be reduced, among other means, by using valid and reliable data that are sensitive to growth over short periods of time.

Decision Guidelines Based on the Wisconsin Administrative Rule

Decisions are based on a variety of information, including individually administered standardized tests of achievement, progress monitoring data collected during SRBIs, and observations during classroom instruction and intervention [PI 11.36(6)]. The Wisconsin Administrative Rule [Wis. Admin. Code PI 11.02] offers the following definitions used during special education decisions about SLD eligibility: Progress monitoring is defined as a scientifically based practice to assess pupil response to interventions [Wis. Admin. Code PI 11.02(10)]. Progress monitoring requires the use of scientifically based tools called probes to measure progress. Probes are:
a. brief,
b. direct measures of specific academic skills,
c. with multiple equal or nearly equal forms,
d. sensitive to small changes in student performance, and
e. reliable and valid measures of pupil performance during intervention [Wis. Admin. Code PI 11.02(9)].
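Because the rule calls for probes that are sensitive to small changes and administered frequently enough to yield weekly data, teams commonly summarize a student's weekly probe scores as a rate of improvement (slope). The sketch below is purely illustrative: the scores, the eight-week window, and the 1.0 words-per-week goal rate are invented for this example and are not part of the rule or of any published norm.

```python
# Hypothetical illustration: summarizing weekly CBM probe scores with an
# ordinary least-squares slope and comparing it to an assumed goal rate.

def ols_slope(weeks, scores):
    """Ordinary least-squares slope: score points gained per week."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    cov = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    var = sum((w - mean_w) ** 2 for w in weeks)
    return cov / var

# Invented data: 8 weekly CBM-Reading probes (words read correctly).
weeks = list(range(1, 9))
scores = [42, 44, 43, 47, 48, 50, 49, 53]

slope = ols_slope(weeks, scores)   # observed words gained per week
goal_rate = 1.0                    # hypothetical expected growth rate

print(f"Observed rate of improvement: {slope:.2f} words/week")
print("Response:", "sufficient" if slope >= goal_rate else "insufficient")
```

The slope is only one input; as the rule makes clear, the IEP team weighs it alongside standardized achievement data and observations.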
The rule further requires Individualized Education Program (IEP) teams to analyze weekly progress monitoring data collected during two SRBIs implemented with fidelity in each area of concern. The IEP team analyzes the data to determine whether the student's response to intensive intervention was sufficient or insufficient [Wis. Admin. Code PI 11.36(6)(c)2.a.].[1]

[1] Please note there are other provisions in the SLD rule that must be considered. This document focuses only on those provisions of the rule that relate to appropriate progress monitoring tools to use during intensive interventions that fulfill the requirements of the rule.

Progress Monitoring Probes

States and districts are in the early stages of knowing how to use CATs and CBMs for determining whether a student has or has not made sufficient academic progress. Each system has relative strengths and weaknesses; neither CBMs nor CATs function perfectly as probes. Those strengths and weaknesses may be considered tolerable when making relatively low stakes decisions. However, in a high stakes decision, such as using a growth measure to determine whether a student's progress is sufficient or insufficient in the course of an evaluation for special education eligibility, the relative strengths and weaknesses are magnified. As such, LEAs have a duty to minimize the negative impact of high stakes decisions on students. The comparison below details the relative strengths and weaknesses of these two methods of measurement, especially relative to purpose.

Comparison of CATs and CBMs for Use in SLD Identification[2]

[2] This table is not an exhaustive list or comparison of measures, but rather a guidepost meant to help LEAs plan the high stakes decisions surrounding SLD identification, a process governed by Wisconsin Administrative Rule.

Theoretical Foundation
- CATs: Item response theory (IRT), a mathematical model of the relationship between performance on a test item and the test taker's level of performance on a scale of the proficiency being measured. In theory, each item's difficulty level and discrimination (diagnostic accuracy) are controlled, thus accounting for error. With technological advances, use of IRT in computer adaptive assessments has accelerated.
- CBMs: Classical test theory, a psychometric theory based on the view that an individual's observed score on a test is the sum of a true score component plus an independent measurement error component. Error is reduced through quality test construction and implementation with fidelity. The Standard Error of Measurement for the score, as well as for the slope (Standard Error of Estimate), may be calculated and analyzed by IEP teams.

a. Brief
- CATs: Each computerized assessment takes 20-30 minutes to administer; group administrations are possible. When administered individually, an adult is required to promote engagement and guard against technology failure. The amount of time needed for administration may be an important consideration, since probes must be administered at least weekly when consistent with the SLD rule.
- CBMs: Assessments range from one minute to 20 minutes by type. Assessments are administered individually, with some exceptions (e.g., math problem-solving, correct written word sequences). Most probes require individual, face-to-face administration by a trained adult.

b. Direct measure of specific academic skills
- CATs: CATs are not direct measures of specific academic skills and may not align with all targeted areas of concern listed in the SLD rule; items are not drawn from the same discrete skill area (e.g., items assessing basic reading skills are included along with items of reading comprehension). CATs tend to be considered measures of broad areas, such as reading comprehension or math problem solving, despite the item content coming from other areas as well. CATs are available for two, or at most three, of the eight areas of concern. See [Wis. Admin. Code PI 11.36(6) 1].
- CBMs: CBMs are not direct curriculum measures, nor are they direct measures of specific academic skills; they are indicators of direct skills, meaning that the CBM has high predictive validity for the direct skill. There are CBMs for some but not all targeted areas of concern listed in the SLD rule; CBMs are available for six of the eight areas of concern. See [Wis. Admin. Code PI 11.36(6) 1].

c. Has multiple equal, or nearly equal, forms
- CATs: Each item is equated using the Item Characteristic Curve from Item Response Theory after being field tested on students. Using item characteristic information, every item is placed on the test scale. This item-by-item equating process results in multiple equal or nearly equal forms. The pool of equated items is typically quite large. Item quality information is usually available in technical guidance from publishers.
- CBMs: Probes are equated by exacting standards. Different publishers offer different numbers of probes in a probe set. Probes being equal or nearly equal is essential to reduce error. A larger number of unique, equated probes is useful when measuring weekly.

d. Sensitive to small changes in student performance (over weeks, for an SLD eligibility decision)
- CATs: Currently available CATs have not achieved sensitivity to growth over weeks. Sensitivity to growth has been demonstrated over months, semesters, or years. LEAs should review information for each test before determining whether it is appropriate for the particular use.
- CBMs: Each type of CBM has associated research indicating its sensitivity to demonstrate growth by weeks and by age. Some CBMs have greater sensitivity than others (e.g., CBM-Reading vs. CBM-Writing). Sensitivity to growth is more robust at younger ages than at older ages (e.g., late middle school or high school). LEAs should review information for each test before determining whether it is appropriate for the particular use.

e. Reliable and valid
- CATs: The Standard Error of Measurement (SEM), that is, the tendency of scores to vary because of random factors, is relatively high for CATs. This means that data sometimes may be difficult to interpret. For example, interpretation of relatively flat slopes is difficult, as one cannot distinguish whether the lack of growth is an artifact of the test or whether the student's rate of progress is poor.
- CBMs: Implementation integrity is required to achieve reliability of the probe score. Sufficient duration of weekly probe data is required to achieve reliability of the slope. No free, public domain CBM-Reading assessment is known to have achieved the level of reliability to meet this standard. Some educators perceive face validity to be low; for example, CBM reading comprehension probes (e.g., Maze, CBM-Reading) are criticized for not truly measuring comprehension. Reliability and validity may vary by age and test type.

Scientifically based practice
- CATs: Peer-reviewed research on item response theory and computer adaptive testing for screening has accumulated for decades. Research on CATs for progress monitoring is in its early stages.
- CBMs: Peer-reviewed research on CBMs for screening has accumulated for decades, and for progress monitoring for about ten years.

Other factors
- CATs: CATs require an adult to monitor computerized administration and to promote engagement. CATs can be used for instructional design and for information about the point of access in the sequence of the curriculum. They are strong screening tools. At this time, few commercial publishers market their CATs as appropriate for weekly progress monitoring. Their greatest vulnerability is sensitivity to growth over short periods of time. Another important vulnerability is mismatch between the tool and the intervention target: results about progress are meaningful for decision-making only if the intervention actually addressed the area measured. Eligibility decisions must be made using a measurement tool that matches the target area of the intervention.
- CBMs: Most measures require adult, one-to-one administration; some allow group administration. CBMs can be used for instructional design and for better understanding the nature of reading problems. They can be used for screening and for frequent progress monitoring. Their greatest vulnerability is poor implementation integrity, which results in poor reliability of the score and of the slope. Another important vulnerability is mismatch between the tool and the intervention target: results about progress are meaningful for decision-making only if the intervention actually addressed the area measured. Eligibility decisions must be made using a measurement tool that matches the target area of the intervention.

Summary

The Department promotes the use of a strategic assessment system, which encourages LEAs to use various assessment tools matched to a particular purpose. Few measures are fully successful for more than one purpose. When the purpose is to monitor student progress during intensive interventions over a relatively short period of time in a manner consistent with Wisconsin's SLD rule, LEAs can reduce the chances of decision errors. LEAs may do so by choosing a measure that is matched to the intended purpose; valid and reliable; sensitive to growth over the relatively short period inherent in a special education evaluation timeline; implemented with integrity; and which best meets the requirements of the Wisconsin Administrative Rule [Wis. Admin. Code PI 11.36(6)].