
Technical Report 1501
DIBELS Data System: 2013-14 Percentile Gains for Predominant DIBELS Next Benchmark Assessments

Authors: Patrick C. Kennedy, Kelli D. Cummings, Holle A. Bauer Schaper, Mike Stoolmiller
University of Oregon, Center on Teaching and Learning

Citation: Kennedy, P. C., Cummings, K. D., Bauer Schaper, H. A., & Stoolmiller, M. (2015). DIBELS Data System: 2013-2014 percentile gains for DIBELS Next benchmark assessments (Technical Report 1501). Eugene, OR: University of Oregon.

The research reported here was supported exclusively by the Center on Teaching and Learning (CTL), a research and outreach unit in the Office for Research, Innovation, and Graduate Education (RIGE) at the University of Oregon; its Director, Hank Fien; and its Associate Director, Jeannie Smith. The data, analyses, results, and opinions expressed are those of the authors and CTL, and do not represent the views of RIGE or the University of Oregon.

This report describes the procedures used to calculate percentile gains for the predominant DIBELS Next benchmark assessments at each grade level, based on a nationally representative convenience sample of schools that used the DIBELS Data System (DDS) in the 2013-2014 school year. Percentiles are a common metric used to facilitate the interpretation of individual characteristics, such as reading proficiency, relative to the distribution of those characteristics in a particular population. Gain scores measure changes in performance across two points in time, such as the difference between the scores a student earns on an assessment given at the beginning of the year and a similar measure administered at the end of the year. Percentile gains are thus used to describe normative changes in performance over time.

The DIBELS Next benchmark percentile gains provide detailed information regarding changes in student performance from one benchmark assessment to another for the predominant DIBELS Next measure in each grade, as identified in the 2012-2013 DIBELS Data System Update Part I (University of Oregon, Center on Teaching and Learning, 2012a), for a nationally representative sample of students in kindergarten through grade 6. The percentile gains are designed to facilitate comparisons of an individual student's performance over time to the performance of other students with a similar starting score. These comparisons can provide a more nuanced understanding of student progress than simply comparing a student's score to either a criterion (i.e., benchmark) goal or a normative percentile rank, and they are an especially useful tool for evaluating the performance of students who perform below the benchmark level and whose progress over time needs to be monitored more closely.

Take, for instance, a student with a median score of 5 correct words per minute on the middle-of-first-grade Oral Reading Fluency (ORF) benchmark passages, and a median score of 30 correct words per minute at the end of the year. Based on the recommended benchmark goals (University of Oregon, Center on Teaching and Learning, 2012b), this student's performance (a) falls below the specified cut point for risk on both assessments (20 and 36 words read correct, respectively), and (b) places her at the 3rd percentile in the middle of the year and the 21st percentile at the end of the year (Cummings et al., 2011). Thus, even though this student has managed to improve her oral reading fluency rate by 25 words per minute over the course of just a few months, a status-oriented interpretation of her spring score is similar to that of her winter score: her performance is still far below average relative to the normative sample, and it is difficult to determine just how effective the instruction she received during the second half of the year really was. However, if you were to evaluate this student's performance relative to all students with a similar ORF score in the middle of first grade, it becomes clear that the gains made by this student are greater than the gains made by more than 90% of students who read fewer than 10 words correct in the middle of first grade. In other words, this student has made significant gains in reading proficiency in only a few months' time, likely indicating the presence of an effective, well-implemented intervention. Thus, with access to additional information about typical rates of growth, an educator is likely to reach a much different conclusion than if she had evaluated only the benchmark status or end-of-year percentile rank for this student.

As this example illustrates, normative gain score information is most informative for those students who perform below the benchmark level, as a way of setting ambitious yet realistic performance goals when the end-of-year benchmark goal isn't feasible.
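The kind of comparison made in this example can be sketched in a few lines of Python. This is an illustrative sketch only: the helper name, the peer-gain list, and the band definition are assumptions for the example, not DDS data.

```python
from bisect import bisect_right

def gain_percentile(peer_gains, student_gain):
    """Percentage of peers (students with a similar baseline score)
    whose gain was at or below this student's gain.  Illustrative
    sketch; in practice the DDS supplies these values as lookup
    tables."""
    peers = sorted(peer_gains)
    return 100.0 * bisect_right(peers, student_gain) / len(peers)

# Hypothetical end-of-year gains for peers who, like the example
# student, read fewer than 10 words correct in the middle of grade 1:
peer_gains = [2, 4, 5, 7, 8, 9, 10, 12, 15, 28]
print(gain_percentile(peer_gains, 25))  # the example student's gain of 25
```

With these hypothetical peer gains, the student's gain of 25 words sits at the 90th percentile of the band, mirroring the interpretation in the text.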

Methods

Participants

The DIBELS Next percentile gains are based on scores from all students who attended a DDS school in the United States during the 2013-2014 school year and for whom both baseline and end-of-year scores on the predominant measure were entered in the DDS. Although the schools and districts in the DDS represent a substantial proportion of U.S. primary schools and are distributed widely across the country, they may not be fully representative of the instruction and assessment practices used throughout the country. For schools currently using the DDS, we argue that this comparison group still provides important contextual information regarding the performance of students in their school and district. However, we must point out that our sample was not randomly selected, it is not a probability sample, and the data were collected and entered into the DDS independently by schools.

Table 1 displays the total number of districts, and the distribution of the number of students per district, for each grade in the percentile gains sample. Due to a number of large districts in the DDS, the distribution of the total number of students per district is positively skewed, with a mean that is noticeably higher than the median and a relatively large standard deviation. Conversely, the DDS also includes a number of private schools, each of which is considered a separate district in the system, which may negatively skew the median. Consequently, both values are reported. The median ranges from a low of 25 students per district in sixth grade to a high of 56 students per district in kindergarten and first grade; the mean ranges from a low of approximately 75 in sixth grade to a high of approximately 166 in first grade.
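The distributional summaries reported in Tables 1 and 2 are ordinary five-number summaries plus mean and standard deviation. A minimal Python sketch, assuming a list of per-district (or per-school) student counts as input; the function and variable names are illustrative, not the report's code:

```python
import statistics

def district_summary(counts):
    """Five-number summary plus mean and SD for a list of per-district
    (or per-school) participating-student counts, mirroring the columns
    of Tables 1 and 2.  Illustrative sketch only."""
    counts = sorted(counts)
    # statistics.quantiles with n=4 returns the three quartile cut
    # points (exclusive method, a common textbook convention; the
    # report does not state which convention it used).
    q1, med, q3 = statistics.quantiles(counts, n=4)
    return {
        "n": len(counts),
        "min": counts[0],
        "lower_quartile": q1,
        "median": med,
        "upper_quartile": q3,
        "max": counts[-1],
        "mean": statistics.mean(counts),
        "sd": statistics.stdev(counts),  # sample standard deviation
    }

# A positively skewed toy example: a few large districts pull the mean
# well above the median, the pattern described for Table 1.
print(district_summary([5, 12, 20, 25, 30, 40, 55, 300, 900]))
```

Running this on the skewed toy data yields a mean far above the median, the same diagnostic the report uses to justify publishing both values.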

Table 1
Number of Districts and Participating Students per District in DIBELS Next Gain Percentiles

                          Number of Participating Students per District
Grade   N Districts    Min   Lower Quartile   Median   Upper Quartile    Max      Mean       SD
K          2286         1        24.0          56.0        150.0         7445    159.71   377.55
1          2299         1        23.0          56.0        150.0         6871    165.69   408.08
2          2245         1        21.0          51.0        139.5         6194    153.47   379.62
3          2017         1        20.0          46.0        123.0         6035    138.66   343.96
4          1631         1        17.0          39.0        106.0         5612    116.31   287.71
5          1382         1        16.0          36.0        100.0         3892    105.38   241.68
6           746         1        11.0          25.0         58.25        3610     74.58   234.68

Table 2 displays the number of schools, and the distribution of the number of students per school, in the sample for each grade. These distributions show relatively little skew, and the mean in most cases is roughly equivalent to the median. The median number of students per school ranges from a low of 28 in sixth grade to a high of 58 in first grade. Additional details regarding the demographic composition of schools in the DDS, including comparisons based on data from the National Center for Education Statistics, are available in Cummings et al. (2011).

Table 2
Number of Schools and Participating Students per School in DIBELS Next Gain Percentiles

                          Number of Participating Students per School
Grade   N Schools    Min   Lower Quartile   Median   Upper Quartile    Max      Mean      SD
K          5903       1       32.00          57.00       82.00          482     61.85   43.08
1          6081       1       32.00          58.00       85.00          471     62.64   41.91
2          5876       1       29.00          54.00       79.75          398     58.63   39.48
3          5013       1       26.00          51.00       77.00          375     55.79   38.98
4          3717       1       19.00          46.00       73.00          326     51.04   39.08
5          3045       1       16.00          39.00       70.00          670     47.83   42.03
6          1368       1       11.00          28.00       59.00          640     40.67   43.28

Measures

The DIBELS Next assessments (Good & Kaminski, 2011) are a collection of measures designed to assess five essential early reading skills in kindergarten through sixth grade: phonemic awareness, phonics, accuracy and fluency with connected text, vocabulary, and comprehension (National Reading Panel, 2000). As students become proficient in the more foundational skills (e.g., phonemic awareness, phonics), measures of those skills are phased out and measures of the more complex skills are introduced. Each measure is standardized and individually administered, and higher scores indicate higher levels of the desired skill. DIBELS Next measures are not prorated, so if a student finishes a probe prior to the 1-minute time limit, the final score consists of the total score at completion.

Gain percentiles were calculated for the predominant measure in each grade, as identified in the 2012-2013 DIBELS Data System Update Part I (University of Oregon, Center on Teaching and Learning, 2012a), with one exception: from the beginning to the middle of kindergarten, First Sound Fluency is used, because although Letter Naming Fluency is a strong indicator of risk, it does not represent a skill on which instructional goals should be based. In all other instances (i.e., Nonsense Word Fluency Correct Letter Sounds from the middle to the end of kindergarten and the beginning to the middle of grade 1, and Oral Reading Fluency Words Read Correct from the middle to the end of grade 1 and the beginning to the end of grades 2-6), the measure represented is the empirically determined strongest indicator of future reading performance.

First Sound Fluency (FSF). First Sound Fluency assesses a student's ability to identify and articulate the beginning sounds in words (Cummings, Kaminski, Good, & O'Neil, 2011; Good et al., 2011; Kaminski, Baker, Chard, Clarke, & Smith, 2006). The examiner says a word to the student and asks the student to name the beginning sound or group of sounds in that word.

Once the student responds, the examiner presents another word, and the process is repeated until 1 minute has elapsed or until the last item has been reached. The total FSF score is the number of correct beginning sounds produced. If a student responds incorrectly to each of the first five test items, examiners are instructed to discontinue the task and record a total score of 0. FSF is administered in the beginning and middle of kindergarten.

Nonsense Word Fluency (NWF). The NWF task measures knowledge of the alphabetic principle, including both letter-sound correspondence and the ability to blend letters into words (Kaminski & Good, 1996; Good & Kaminski, 2002; Good & Kaminski, 2011). Students are presented with a page of stratified vowel-consonant (VC) and consonant-vowel-consonant (CVC) nonsense words (e.g., sog, rav, ov) and asked to (a) say the individual sound of each letter, or (b) read the whole nonsense word. For example, if the stimulus word is "sog," the student could say "/s/ /o/ /g/" to obtain a total of 3 letter sounds correct, or say the word "sog" to obtain a total of 3 letter sounds correct and 1 whole word read. The student is given 1 minute to say as many letter sounds as s/he can, and the final DIBELS Next NWF score consists of two parts: (i) the number of letter sounds produced correctly (CLS) and (ii) the number of whole words read (WWR). The discontinue rule for NWF applies to any student who produces 0 correct letter sounds in the first five items; examiners are instructed to record a total score of 0 for both CLS and WWR for students who meet the discontinue rule.

Oral Reading Fluency (ORF). ORF is a measure of accuracy and fluency with connected text (Children's Educational Services, 1987; Good & Kaminski, 2011; Good, Kaminski, & Dill, 2002).
DIBELS Next ORF consists of a set of passages calibrated to the goal level of reading for each grade, administered using standardized procedures (Powell-Smith, Good, & Atkins, 2010). To obtain the benchmark score for ORF, students are asked to

read 3 passages aloud for 1 minute each. Words omitted, substitutions, and hesitations of more than 3 seconds are scored as errors. Words self-corrected within 3 seconds are scored as correct. For benchmark assessments, the DIBELS Next ORF score is the median number of words (from the 3 passages) that are read correctly, unless a student has met one of the following discontinue rules: (a) if a student reads 0 words correctly in the first row of the passage, examiners are to discontinue ORF administration immediately and record a score of 0 words correct, along with the corresponding number of errors, as the student's score in the DDS; (b) if a student reads fewer than 10 words correctly in the first passage, neither the second nor the third passage is administered. The final score for a student who met discontinue rule (b) is the number of words correct from the first passage alone.

Calculating Percentile Gains

We employed a two-step process for calculating gain percentiles. First, raw gains between benchmark periods were calculated for each student by subtracting the baseline benchmark score from the end-of-year benchmark score. Second, gain percentiles were calculated collectively for groups of students whose baseline scores fell within one standard error of measurement (SEM) of each other. Take, for example, a raw score of 65 on the beginning-of-second-grade ORF benchmark. Given that the SEM for ORF is approximately ten correct words per minute, gain percentiles for ORF from the beginning to the end of second grade were calculated using gain scores for all students with a beginning-of-second-grade ORF score between 60 and 69 correct words per minute. Growth percentiles were calculated for each SEM band using the following formula:

P_n = 100 × n / N

where P_n represents the percentile rank for the nth gain ranked in order from lowest to highest, N represents the total sample size for each raw gain score, and n represents the position of that gain in the ranked order. For example, the gain representing the 65th percentile for students with a beginning-of-second-grade ORF score between 60 and 69 (i.e., the score that is equal to or greater than 65% of all gain scores) was 43. In other words, nearly two thirds of second-grade students in the DDS who started the year with an ORF score between 60 and 69 gained fewer than 43 points on ORF in the time between the beginning- and end-of-year assessments.

Results and Use

Due to the size of the lookup tables, gain percentiles for the predominant DIBELS Next measure for all grades are available as part of the DDS Zones of Growth Student Goals feature. This tool provides (a) the end-of-year goal at the 50th, 65th, and 80th percentiles according to a student's baseline (typically, beginning-of-year) score; and (b) the exact gain percentile represented by the difference between the student's end-of-year and baseline scores. This information serves practitioners in three ways. First, it allows them to describe a student's gain in terms of normative performance improvement. Second, it offers a way to set ambitious but realistic goals for student growth. Third, it facilitates the process of monitoring students' progress toward those goals.
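The two-step procedure can be sketched in Python. This is an illustrative sketch rather than the DDS implementation: the function name and example data are assumptions, and the rank-to-percentile step uses the simple rank-based form P_n = 100·n/N consistent with the description above.

```python
def gain_percentiles(students, band_low, band_high):
    """Sketch of the two-step gain-percentile calculation.
    students: (baseline, end_of_year) score pairs.
    band_low/band_high: inclusive bounds of one SEM band (e.g., 60-69
    for a beginning-of-grade-2 ORF score of 65 with an SEM of ~10
    correct words per minute).
    Returns (raw_gain, percentile_rank) pairs for the band, assuming
    the rank-based formula P_n = 100 * n / N.
    """
    # Step 1: raw gain = end-of-year score minus baseline score, for
    # students whose baseline falls within the SEM band.
    gains = sorted(eoy - base for base, eoy in students
                   if band_low <= base <= band_high)
    # Step 2: percentile rank of each gain within the band.
    N = len(gains)
    return [(gain, 100.0 * n / N) for n, gain in enumerate(gains, start=1)]

# Hypothetical beginning/end-of-year ORF scores for five students;
# only the three with baselines of 60-69 fall in the example band.
sample = [(62, 80), (65, 108), (61, 90), (70, 95), (50, 60)]
print(gain_percentiles(sample, 60, 69))
```

The returned pairs are exactly the kind of lookup rows the Zones of Growth feature exposes: a raw gain and the percentile it represents within the student's baseline band.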

References

Children's Educational Services. (1987). Test of reading fluency. Minneapolis, MN: Author.

Cummings, K. D., Kaminski, R. A., Good, R. H., & O'Neil, M. (2011). Assessing phonemic awareness in early kindergarten: Development and initial validation of First Sound Fluency (FSF). Assessment for Effective Intervention, 36(2), 94-106.

Cummings, K. D., Kennedy, P. C., Otterstedt, J., Baker, S. K., & Kame'enui, E. J. (2011). DIBELS Data System: 2010-2011 percentile ranks for DIBELS Next benchmark assessments (Technical Report 1101). Eugene, OR: University of Oregon. Retrieved from https:///docs/techreports/dibels_next_percentile_ranks_report.pdf

Good, R. H., & Kaminski, R. A. (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Available: https:///

Good, R. H., & Kaminski, R. A. (2011). DIBELS Next technical manual. Eugene, OR: Dynamic Measurement Group. Retrieved from https://dibels.org/next/index.php

Good, R. H., & Kaminski, R. A. (with Cummings, K., Dufour-Martel, C., Petersen, K., Powell-Smith, K., Stollar, S., & Wallin, J.). (2011). DIBELS Next assessment manual. Eugene, OR: Dynamic Measurement Group. Available: https://dibels.org

Good, R. H., Kaminski, R. A., & Dill, S. (2002). DIBELS Oral Reading Fluency. In R. H. Good & R. A. Kaminski (Eds.), Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Available: https:///

Kaminski, R. A., Baker, S. K., Chard, D., Clarke, B., & Smith, S. (2006). Final report: Reliability, validity, and sensitivity of Houghton Mifflin Early Growth Indicators (Technical Report). Eugene, OR: Dynamic Measurement Group and Pacific Institutes for Research.

Kaminski, R. A., & Good, R. H. (1996). Toward a technology for assessing early basic literacy skills. School Psychology Review, 25, 215-217.

National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. Reports of the subgroups. Washington, DC: National Institute of Child Health and Human Development. Retrieved from http://www.nichd.nih.gov/publications/pubs/nrp/documents/report.pdf

Powell-Smith, K. A., Good, R. H., & Atkins, T. (2010). DIBELS Next Oral Reading Fluency readability study (Technical Report No. 7). Eugene, OR: Dynamic Measurement Group.

University of Oregon, Center on Teaching and Learning. (2012a). 2012-2013 DIBELS Data System Update Part I: DIBELS Next Composite Score (Technical Brief No. 1202). Eugene, OR: University of Oregon. Retrieved from https:///docs/techreports/DDS2012TechnicalBriefPart1.pdf

University of Oregon, Center on Teaching and Learning. (2012b). 2012-2013 DIBELS Data System Update Part II: DIBELS Next Benchmark Goals (Technical Brief No. 1203). Eugene, OR: University of Oregon. Retrieved from https:///docs/techreports/dds2012technicalbriefpart2.pdf