Minnesota Reading Corps 2011-2012 State-Wide Evaluation

KERRY BOLLMAN, SSP, NCSP
Instructional Services Coordinator, Reading Center Director
Saint Croix River Education District
kbollman@scred.k12.mn.us

BENJAMIN SILBERGLITT, PhD
Director, Software Applications
Technology and Information Educational Services
Benjamin.Silberglitt@ties.k12.mn.us

DAVID PARKER, PhD
School Psychologist, Master Coach
South Washington County Schools, Minnesota Reading Corps
parke384@umn.edu
Table of Contents

BACKGROUND OF THE MINNESOTA READING CORPS 5
EVALUATION DESIGN 6
ASSESSMENT DATA COLLECTION 7
EVALUATION REPORT 9
1. CURRENT IMPACT 9
GROWTH IN DATA REPORTED 12
DEMOGRAPHICS 16
2. FIDELITY OF ASSESSMENT DATA COLLECTION 17
3. FIDELITY OF INTERVENTION IMPLEMENTATION 28
GROWTH IN FIDELITY REPORTING 30
4. STUDENT OUTCOMES 40
PRE-KINDERGARTEN STUDENT PERFORMANCE 40
SUPPLEMENTAL INTERVENTION IN MRC PRE-K PROGRAM 48
CLASSROOM OUTCOMES FOR PRE-K SITES 51
KINDERGARTEN-GRADE 3 STUDENT PERFORMANCE 52
PERFORMANCE ON THE STATEWIDE READING ASSESSMENT 71
5. SYSTEMS CHANGE 75
6. IMPACT ON AMERICORPS MEMBERS 77
7. ACTION RESEARCH: RESULTS OF PILOT STUDIES 79
8. APPENDIX 85
Tables and Figures

TABLE 1: OVERALL MEMBER COUNTS AND SERVICE HOURS, BY REGION AND TOTAL 9
TABLE 2: PRE-KINDERGARTEN PARTICIPATION 11
TABLE 3: NUMBER OF K-3 STUDENTS BENCHMARKED AS FOLLOW-UP TO PRIOR YEARS' TUTORING 11
TABLE 4: NUMBER OF STUDENTS WITH AT LEAST 1 DATA POINT, ANY MEASURE 11
TABLE 5: NUMBER OF STUDENTS WITH AT LEAST 5 WEEKS OF DATA (R-CBM, NWF, OR LSF) 12
TABLE 6: NUMBER OF STUDENTS WITH AT LEAST 10 WEEKS OF DATA (R-CBM, NWF, OR LSF) 12
TABLE 7: NUMBER OF STUDENTS WITH AT LEAST 20 WEEKS OF DATA (R-CBM, NWF, OR LSF) 12
FIGURE 1: GROWTH IN STUDENT DATA REPORTED 13
TABLE 8: MINNESOTA READING CORPS PARTICIPATION: ALL STUDENTS 13
TABLE 9: PARTICIPATION LEVELS IN MRC TUTORING 14
TABLE 10: PARTICIPATION LEVELS IN K-3 MRC TUTORING BY REGION 14
TABLE 11: PRE-KINDERGARTEN - GRADE 3 PARTICIPANT DEMOGRAPHIC DATA 16
TABLE 12: FIDELITY OF ASSESSMENT DATA COLLECTION PROCEDURES 28
TABLE 13: FIDELITY OF INTERVENTION IMPLEMENTATION PROCEDURES 29
FIGURE 2: GROWTH IN FIDELITY REPORTING 31
TABLE 14: PRE-KINDERGARTEN PARTICIPANT PERFORMANCE ON IGDIS: FALL BENCHMARK 41
TABLE 15: PRE-KINDERGARTEN PARTICIPANT PERFORMANCE ON IGDIS: WINTER BENCHMARK 42
TABLE 16: PRE-KINDERGARTEN PARTICIPANT PERFORMANCE ON IGDIS: SPRING BENCHMARK 42
FIGURE 3: PERCENTAGE OF STUDENTS ON OR ABOVE, NEAR, AND FAR FROM TARGET: FALL, WINTER, AND SPRING 43
FIGURE 4: NORMATIVE PERFORMANCE OF 4-YEAR-OLDS ON IGDI MEASURES: FALL, WINTER, AND SPRING 44
TABLE 17: NORMATIVE PERFORMANCE OF 4-YEAR-OLDS ON IGDI MEASURES: FALL, WINTER, AND SPRING 45
TABLE 18: FLOOR EFFECT ISSUES WITH PRE-KINDERGARTEN ASSESSMENTS - PERCENTAGE OF STUDENTS NOT COMPLETING SAMPLE IGDI ITEMS OR SCORING ZERO ON LETTER NAMING/SOUND FLUENCY, BY MEASURE AND SEASON (ALL PRE-K STUDENTS) 46
TABLE 19: PRE-KINDERGARTEN STUDENT GROWTH 46
TABLE 20: IGDI FALL-SPRING GROWTH BY HOURS PER WEEK IN CORE INSTRUCTION: 4-YEAR-OLDS 46
TABLE 21: IGDI FALL-SPRING GROWTH BY HOURS PER WEEK IN CORE INSTRUCTION: ALL PRE-KINDERGARTEN STUDENTS 47
TABLE 22: PARTICIPATION IN PRE-K SUPPLEMENTAL INTERVENTIONS: NUMBER OF STUDENTS AND NUMBER OF MINUTES PER WEEK 48
TABLE 23: IGDI FALL-SPRING GROWTH BY PARTICIPATION IN SUPPLEMENTAL INSTRUCTION: 4-YEAR-OLDS 48
TABLE 24: IGDI FALL-SPRING GROWTH BY PARTICIPATION IN SUPPLEMENTAL INSTRUCTION: ALL PRE-KINDERGARTEN STUDENTS 49
FIGURE 5: CROSS-COHORT PERCENT ABOVE TARGET ON EARLY LITERACY MEASURES 50
TABLE 25: PERCENT OF PRE-K STUDENTS MOVING FROM BELOW TO AT OR ABOVE TARGET 51
TABLE 26: ELLCO PERFORMANCE IN THE FALL AND SPRING 51
TABLE 27: KINDERGARTEN - GRADE 3 PARTICIPANT GROWTH 54
TABLE 28: AVERAGE LINEAR GROWTH RATES, BY REGION 54
FIGURE 7: GRADE 1 NONSENSE WORD FLUENCY GROWTH CURVE ESTIMATES BY REGION 56
FIGURE 8: GRADE 1: ORAL READING FLUENCY GROWTH CURVE ESTIMATES BY REGION 57
FIGURE 9: GRADE 2: ORAL READING FLUENCY GROWTH CURVE ESTIMATES BY REGION 58
FIGURE 10: GRADE 3: ORAL READING FLUENCY GROWTH CURVE ESTIMATES BY REGION 59
TABLE 29: KINDERGARTEN - GRADE 3 PERCENTAGE OF STUDENTS ABOVE GROWTH TARGETS BY REGION 60
TABLE 30: AVERAGE TUTORING PARTICIPATION BY GRADE LEVEL 61
TABLE 31: AVERAGE TUTORING PARTICIPATION FOR K-3 BY GRADE LEVEL AND REGION 61
TABLE 32: PERCENTAGE OF STUDENTS WITH 3 WEEKS OF PROGRESS MONITORING DATA WHO EXIT 63
TABLE 33: PERCENTAGE OF STUDENTS WITH 3 WEEKS OF PROGRESS MONITORING DATA WHO EXIT BY REGION 63
FIGURE 11: PERCENTAGE OF STUDENTS SUCCESSFULLY EXITED: 2009-2010 TO 2010-2011 COMPARISON 65
TABLE 34: PERCENTAGE WHO EXIT: FALL BENCHMARK TIER 2 VS. TIER 3 66
TABLE 35: PERCENTAGE WHO EXIT: FALL BENCHMARK TIER 2 VS. TIER 3 BY REGION 66
FIGURE 12: PERCENT OF STUDENTS IN EACH TIER WHO EXITED SUCCESSFULLY 67
TABLE 36: PERCENTAGE WHO EXIT WHO ALSO MEET OR EXCEED SPRING BENCHMARK 68
TABLE 37: PERCENTAGE WHO EXIT WHO ALSO MEET OR EXCEED SPRING BENCHMARK BY REGION 68
FIGURE 13: CHANGE IN PERCENTAGE OF STUDENTS EXITING AND MEETING SPRING BENCHMARK TARGETS 70
FIGURE 14: PERCENT OF STUDENTS WITH SUCCESSFUL EXIT OR BENCHMARK SCORE AND SUCCESS ON STATEWIDE READING ASSESSMENTS, GRADE 3: COMPARISON OF 2009-2010 TO 2011-2012 72
TABLE 39: OUTCOMES OF 2010-2011 SECOND GRADE MRC PARTICIPATING STUDENTS ON SPRING GRADE 3 STATEWIDE READING ASSESSMENT 73
TABLE 40: OUTCOMES OF 2009-2010 FIRST GRADE MRC PARTICIPATING STUDENTS ON SPRING GRADE 3 STATEWIDE READING ASSESSMENT 73
TABLE 41: OUTCOMES OF 2008-2009 KINDERGARTEN MRC PARTICIPATING STUDENTS ON SPRING GRADE 3 STATEWIDE READING ASSESSMENT 74
TABLE 42: TOTAL NUMBER OF STUDENTS MONITORED IN MRC PROGRAMS, WITH THIRD GRADE STATEWIDE READING ASSESSMENT IN 2011-2012 74
TABLE 45: INTERNAL COACH SYSTEMS CHANGE SURVEY RESULTS 75
TABLE 46: MRC MEMBER IMPACT SURVEY RESULTS 77
TABLE 47: K-FOCUS MRC MEMBER IMPACT SURVEY RESULTS 78
TABLE 48: COMPARISON OF PERCENT OF STUDENTS MEETING INDIVIDUAL GROWTH RATES BY PARTICIPATION IN KINDERGARTEN FOCUS MODEL 79
TABLE 49: AVERAGE WEEKLY GROWTH RATE OF ALL MRC PARTICIPATING STUDENTS IN KINDERGARTEN FOCUS MODEL* 80
FIGURE 15: GROWTH CURVE ESTIMATES, BY K-FOCUS PARTICIPATION 80
TABLE 50: THREE-YEAR HISTORY OF TOTAL NUMBERS OF KINDERGARTEN STUDENTS SERVED IN SITES IMPLEMENTING K-FOCUS, BY YEAR SITE BEGAN IMPLEMENTING K-FOCUS 81
TABLE 51: MATCHED PAIRS T-TESTS OF WORD CONSTRUCTION TREATMENT VS. COMPARISON GROUPS 82
TABLE 52: FAMILY LITERACY PILOT PROGRAM PARTICIPATION RATES 84
BACKGROUND OF THE MINNESOTA READING CORPS

Minnesota Reading Corps (MRC) is an AmeriCorps program that provides trained literacy tutors (Members) for children from age three through grade three. Some MRC Members work with preschoolers and focus on integrating talking, reading, and writing into all classroom activities. Other Members provide supplemental literacy skills tutoring for children in kindergarten through third grade. Still others recruit, train, and manage community volunteers to expand the capacity of the program. MRC Members and community volunteers are trained in specific research-based literacy instructional protocols and supported by expert coaches. Members use reliable, valid assessment tools to monitor student progress on a regular basis and, with help from their expert coaches, use the assessment data to inform tutoring strategies for each student. This combination of specific research-based instructional techniques and technically adequate assessment tools for decision making distinguishes the MRC program within the literacy landscape.

The vision of the Minnesota Reading Corps is to impact literacy in the state of Minnesota through children, AmeriCorps members, and communities as follows:

All children in Minnesota, from age 3 to grade 3, who qualify for MRC will have access to MRC and will meet reading standards by third grade.

AmeriCorps members, through the training, development, and service opportunity provided by MRC, will pursue education-related careers and/or continue to be ambassadors for children's literacy throughout their lives.

Schools and community institutions/organizations, through their experiences with MRC, will understand and adopt the MRC methods for increasing literacy; those institutions will, in turn, promote MRC literacy methods to their colleagues.
EVALUATION DESIGN

The evaluation for the Minnesota Reading Corps 2011-2012 program year addresses the six broad questions listed below. Additions to the current year's evaluation include the evaluation of a new intervention designed to support early word reading and a family literacy pilot program.

1. What is the current impact of the MRC on the state of Minnesota in terms of numbers of students and programs receiving support?
2. Are the data collection tools used with children being implemented with fidelity?
3. Are the interventions used with children being implemented with fidelity?
4. Is the performance of MRC-participating students, in terms of their literacy improvement, consistent with expectations?
5. Are the organizations with which the MRC is working changing to adopt the practices of the MRC?
6. What is the impact of the MRC experience on the AmeriCorps Members?
Assessment Data Collection

The assessment data listed below were collected during the 2011-2012 school year. Minnesota Reading Corps Members collect all student assessment data. Internal and master coaches collect fidelity data for assessments and interventions. Internal coaches, site supervisors, and Members complete survey data regarding perceptions of systems change and impact. Internal and master coaches collect Pre-K environmental observation data.

Student Data for Preschool Programs (the same measures are collected at the fall, winter, and spring benchmark windows):

Age 3 on or before Sept 1: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration
Age 4 on or before Sept 1: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration, Letter Naming Fluency, Letter Sound Fluency
Age 5 on or before Sept 1 but not enrolled in K: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration, Letter Naming Fluency, Letter Sound Fluency
Student Data for K-3 Programs (by benchmark window):

Kindergarten: Fall: Letter Naming Fluency, Letter Sound Fluency. Winter and Spring: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency.
Grade 1: Fall: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency. Winter and Spring: Nonsense Word Fluency, Oral Reading Fluency (3 passages).
Grade 2: Fall, Winter, and Spring: Oral Reading Fluency (3 passages).
Grade 3: Fall, Winter, and Spring: Oral Reading Fluency (3 passages).

Additional Data:

Observations of assessment fidelity three times per year
Observations of intervention fidelity between three and nine times per year in K-3 programs, and monthly in Pre-K programs
Early Language and Literacy Classroom Observation two times per year in Pre-K programs only
End-of-year MRC Member surveys
End-of-year MRC site surveys
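For readers who work with the benchmark calendar programmatically, the K-3 schedule above can be expressed as a simple lookup. The sketch below is illustrative only; the grade and season keys and the measure lists mirror the schedule as reconstructed above and are not drawn from any MRC data system.

    # Illustrative sketch of the K-3 benchmark schedule shown above.
    # Names and structure are assumptions for illustration, not an MRC API.
    ORF = "Oral Reading Fluency (3 passages)"
    K3_BENCHMARKS = {
        "K": {"fall": ["Letter Naming Fluency", "Letter Sound Fluency"],
              "winter": ["Letter Naming Fluency", "Letter Sound Fluency", "Nonsense Word Fluency"],
              "spring": ["Letter Naming Fluency", "Letter Sound Fluency", "Nonsense Word Fluency"]},
        "1": {"fall": ["Letter Naming Fluency", "Letter Sound Fluency", "Nonsense Word Fluency"],
              "winter": ["Nonsense Word Fluency", ORF],
              "spring": ["Nonsense Word Fluency", ORF]},
        "2": {season: [ORF] for season in ("fall", "winter", "spring")},
        "3": {season: [ORF] for season in ("fall", "winter", "spring")},
    }

    def benchmark_measures(grade, season):
        """Return the measures collected for a grade at a given benchmark window."""
        return K3_BENCHMARKS[grade][season]

    print(benchmark_measures("1", "winter"))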
EVALUATION REPORT

1. Current Impact

What is the current impact of the MRC on the state of Minnesota in terms of numbers of students and programs receiving support?

The tables below report the number of Minnesota Reading Corps Members, by position type, who served during the 2011-2012 school year, collected data for students during one or more benchmark assessment windows, and submitted those data for evaluation. In addition, the number of students receiving MRC services for whom data are recorded is reported. Numbers of participating students are compiled according to the following criteria:

Number of preschool students with data from 1 benchmark window
Number of preschool students with data from 3 benchmark windows
Number of K-3 students benchmarked as a follow-up from the previous year's tutoring
Number of K-3 students with at least 1 week of data on at least 1 measure
Number of K-3 students with at least 5 weeks of data on at least 1 measure
Number of K-3 students with at least 10 weeks of data on at least 1 measure
Number of K-3 students with at least 20 weeks of data on at least 1 measure

The following table provides a count of Minnesota Reading Corps participating Members who served during the 2011-2012 school year in Pre-K and K-3 programs. The total number of hours served is also displayed. Overall counts, regional counts, and counts by position are provided.

Table 1: Overall Member Counts and Service Hours, by Region and Total

Region / Position | Number of Members | Number of Hours Served
Central | 109 | 147882
  K-3 Literacy Tutor | 68 | 92034
  Preschool Literacy Tutor-Community Corps | 24 | 30875
  Preschool Literacy Tutor-Professional Corps | 16 | 24051
  Volunteer Coordinator | 1 | 922
Metro | 487 | 763242
  K-3 Literacy Tutor | 269 | 419026
  Family Literacy Member | 4 | 7013
  Kindergarten-Focused Literacy Tutor | 65 | 110531
  Preschool Literacy Tutor-Community Corps | 92 | 137969
  Preschool Literacy Tutor-Professional Corps | 54 | 83467
  Volunteer Coordinator | 3 | 5236
Northcentral | 30 | 45471
  K-3 Literacy Tutor | 12 | 17577
  Preschool Literacy Tutor-Community Corps | 11 | 15735
  Preschool Literacy Tutor-Professional Corps | 6 | 10436
  Volunteer Coordinator | 1 | 1723
Northeast | 29 | 49073
  K-3 Literacy Tutor | 16 | 26411
  Preschool Literacy Tutor-Community Corps | 9 | 15598
  Preschool Literacy Tutor-Professional Corps | 3 | 5277
  Volunteer Coordinator | 1 | 1787
Northwest | 41 | 50190
  K-3 Literacy Tutor | 24 | 33632
  Preschool Literacy Tutor-Community Corps | 13 | 13811
  Preschool Literacy Tutor-Professional Corps | 4 | 2747
Southeast | 44 | 73573
  K-3 Literacy Tutor | 37 | 61264
  Preschool Literacy Tutor-Community Corps | 1 | 1720
  Preschool Literacy Tutor-Professional Corps | 5 | 8854
  Volunteer Coordinator | 1 | 1735
Southwest | 49 | 73559
  K-3 Literacy Tutor | 37 | 56115
  Preschool Literacy Tutor-Community Corps | 5 | 6690
  Preschool Literacy Tutor-Professional Corps | 6 | 9033
  Volunteer Coordinator | 1 | 1721
Grand Total | 789 | 1202990
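The region and position roll-up in Table 1 amounts to a grouped count and sum over member-level service records. The sketch below shows one way such a roll-up could be produced; the column names and the sample rows are hypothetical, not the OnCorps schema.

    import pandas as pd

    # Hypothetical member-level records: region, position, hours served.
    members = pd.DataFrame(
        [("Central", "K-3 Literacy Tutor", 1710.0),
         ("Central", "Preschool Literacy Tutor-Community Corps", 1288.0),
         ("Metro", "K-3 Literacy Tutor", 1554.5)],
        columns=["region", "position", "hours_served"])

    # Counts and hours by region and position, then by region alone.
    by_position = members.groupby(["region", "position"]).agg(
        n_members=("position", "size"), hours=("hours_served", "sum"))
    by_region = members.groupby("region").agg(
        n_members=("position", "size"), hours=("hours_served", "sum"))

    print(by_position)
    print(by_region)
    print("Grand total:", len(members), "members,", members["hours_served"].sum(), "hours")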
Table 2: Pre-Kindergarten Participation

Region | Number of Members | Students with 1 or More Assessments in 1 Window | Students with 1 or More Assessments in 3 Windows
Central | 32 | 1024 | 902
Metro | 118 | 3,658 | 2776
North Central | 16 | 272 | 240
North East | 9 | 303 | 252
North West | 17 | 337 | 268
South East | 6 | 101 | 85
South West | 9 | 258 | 227
Total | 207 | 5,953 | 4,750

Table 3: Number of K-3 Students Benchmarked as Follow-Up to Prior Years' Tutoring

Region | Gr K | Gr 1 | Gr 2 | Gr 3 | TOTAL
North East | 159 | 46 | 105 | 138 | 448
North Central | 24 | 62 | 91 | 91 | 268
Metro | 314 | 838 | 1326 | 1307 | 3785
North West | 3 | 84 | 182 | 220 | 489
South East | - | 73 | 234 | 214 | 521
South West | 116 | 166 | 261 | 213 | 756
Central | 27 | 120 | 312 | 469 | 928
TOTAL | 643 | 1389 | 2511 | 2652 | 7195

Table 4: Number of Students with at Least 1 Data Point, Any Measure

Region | Gr K | Gr 1 | Gr 2 | Gr 3 | TOTAL
North East | 51 | 214 | 157 | 166 | 588
North Central | 60 | 110 | 89 | 120 | 379
Metro | 3255 | 2857 | 2358 | 2721 | 11191
North West | 193 | 196 | 186 | 214 | 789
South East | 249 | 436 | 306 | 283 | 1274
South West | 218 | 305 | 275 | 362 | 1160
Central | 329 | 421 | 349 | 339 | 1438
TOTAL | 4355 | 4539 | 3720 | 4205 | 16819
Table 5: Number of Students with at Least 5 Weeks of Data (R-CBM, NWF, or LSF)

Region | Gr K | Gr 1 | Gr 2 | Gr 3 | TOTAL
North East | 49 | 207 | 154 | 163 | 573
North Central | 59 | 98 | 82 | 115 | 354
Metro | 3114 | 2743 | 2251 | 2631 | 10742
North West | 174 | 191 | 184 | 210 | 759
South East | 238 | 426 | 297 | 273 | 1235
South West | 201 | 292 | 263 | 347 | 1103
Central | 312 | 404 | 337 | 333 | 1386
TOTAL | 4147 | 4361 | 3568 | 4072 | 16152

Table 6: Number of Students with at Least 10 Weeks of Data (R-CBM, NWF, or LSF)

Region | Gr K | Gr 1 | Gr 2 | Gr 3 | TOTAL
North East | 21 | 165 | 138 | 143 | 467
North Central | 17 | 73 | 65 | 80 | 235
Metro | 1947 | 2088 | 1897 | 2181 | 8113
North West | 56 | 126 | 133 | 147 | 462
South East | 118 | 325 | 247 | 205 | 895
South West | 99 | 206 | 212 | 268 | 785
Central | 133 | 269 | 254 | 249 | 905
TOTAL | 2391 | 3252 | 2946 | 3273 | 11862

Table 7: Number of Students with at Least 20 Weeks of Data (R-CBM, NWF, or LSF)

Region | Gr K | Gr 1 | Gr 2 | Gr 3 | TOTAL
North East | 2 | 84 | 85 | 68 | 239
North Central | 6 | 30 | 33 | 26 | 95
Metro | 393 | 1146 | 1146 | 1101 | 3786
North West | 3 | 72 | 53 | 55 | 183
South East | 26 | 172 | 150 | 109 | 457
South West | 11 | 117 | 125 | 124 | 377
Central | 9 | 138 | 100 | 98 | 345
TOTAL | 450 | 1759 | 1692 | 1581 | 5482

Growth in Data Reported

The figure below compares the total numbers of students served with data recorded for the MRC program during the 2008-2009, 2009-2010, 2010-2011, and 2011-2012 school years. Significant growth in the total amount of student data recorded is noted across the Pre-K and K-3 programs.
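The data-availability counts behind Tables 4 through 7 and the figure that follows reduce to a tally of distinct weeks of progress-monitoring data per student. A minimal sketch of that tally follows; the record layout (student identifier, week number) is an assumption for illustration.

    from collections import defaultdict

    def count_by_threshold(records, thresholds=(1, 5, 10, 20)):
        """Count students whose number of distinct data weeks meets each threshold."""
        weeks_per_student = defaultdict(set)
        for student_id, week in records:
            weeks_per_student[student_id].add(week)
        return {t: sum(1 for weeks in weeks_per_student.values() if len(weeks) >= t)
                for t in thresholds}

    # Hypothetical example: students with 22, 6, and 1 distinct weeks of data.
    demo = [("a", w) for w in range(22)] + [("b", w) for w in range(6)] + [("c", 1)]
    print(count_by_threshold(demo))  # {1: 3, 5: 2, 10: 1, 20: 1}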
Figure 1: Growth in Student Data Reported
[Bar chart comparing the 2008-2009, 2009-2010, 2010-2011, and 2011-2012 school years on: # Pre-K Students with Data in 1 or More Benchmark Windows; # Pre-K Students with Data in 3 Benchmark Windows; # K-3 Students with 5 or More Data Points; # K-3 Students with 20 or More Data Points; # K-3 Students Benchmarked as Follow-Up to Last Year's Tutoring.]

In order to more fully acknowledge the impact on student participation in the Minnesota Reading Corps, the following tables summarize the total number of students identified as having participated in the MRC program, regardless of data presence or tutoring time.

Table 8: Minnesota Reading Corps Participation: All Students

Grade | Active | Exited | Moved | Re-Enrolled | Referred to Other Services | Grand Total
PreK 2 | 65 | 5 | 5 | - | - | 75
PreK 3 | 1028 | 39 | 58 | - | 2 | 1127
PreK 4 | 4549 | 145 | 269 | 1 | 8 | 4972
PreK 5 | 208 | 5 | 11 | - | 1 | 225
PreK Age Unknown | 166 | 7 | 22 | 2 | 0 | 197
PreK Total | 6016 | 201 | 365 | 3 | 11 | 6596
K | 968 | 2956 | 188 | 116 | 96 | 4324
1 | 1533 | 2290 | 378 | 247 | 243 | 4691
2 | 1784 | 1475 | 526 | 106 | 180 | 4071
3 | 1722 | 1884 | 598 | 197 | 183 | 4584
TOTAL | 12023 | 8806 | 2055 | 669 | 713 | 24266

Further analysis was completed to identify the average participation dosage for students in both the Pre-K and K-3 MRC programs. The average number of weeks, sessions per week, and minutes per week of tutoring service provided is summarized below, first in total across the state and then broken down for the K-3 program by region. Increases in both the number of sessions per week and the number of minutes per week are noted in K-3 for the 2011-12 school year relative to 2010-11.

Table 9: Participation Levels in MRC Tutoring

Grade | Number of Students | Average Weeks | Average Sessions | Average Sessions per Week | Average Total Minutes | Average Minutes per Week | Total Minutes
PreK 2 | 2 | 14.00 | 28.00 | 2.39 | 209.50 | 15.68 | 419
PreK 3 | 175 | 7.35 | 18.12 | 2.17 | 99.15 | 12.08 | 17352
PreK 4 | 2731 | 9.84 | 26.05 | 2.59 | 142.91 | 14.19 | 390284
PreK 5 | 86 | 10.29 | 23.20 | 2.17 | 149.28 | 13.84 | 12838
PreK Unknown | 81 | N/A | N/A | N/A | N/A | N/A | N/A
PreK Total | 3075 | 9.65 | 25.40 | 2.55 | 140.05 | 14.07 | 430649
K | 4189 | 9.22 | 46.43 | 4.84 | 913.76 | 94.82 | 3827755
1 | 4388 | 13.39 | 47.46 | 3.50 | 925.79 | 68.14 | 4062362
2 | 3574 | 17.63 | 63.24 | 3.56 | 1242.29 | 69.75 | 4439956
3 | 4008 | 16.14 | 56.27 | 3.46 | 1301.64 | 134.19 | 5216982
TOTAL | 19234 | 13.24 | 48.49 | 3.64 | 934.68 | 79.37 | 17977704

Table 10: Participation Levels in K-3 MRC Tutoring by Region

Region / Grade | Number of Students | Average Weeks | Average Sessions | Average Sessions per Week | Average Total Minutes | Average Minutes per Week | Total Minutes

North East
K | 50 | 8.92 | 28.90 | 3.20 | 544.70 | 59.16 | 27235
1 | 213 | 13.26 | 47.68 | 3.58 | 912.16 | 67.84 | 194290
2 | 155 | 19.06 | 67.46 | 3.51 | 1302.37 | 67.42 | 201867
3 | 161 | 16.79 | 59.19 | 3.48 | 1142.17 | 66.80 | 183890
TOTAL | 579 | 15.42 | 54.55 | 3.50 | 1048.85 | 66.69 | 607282

North Central
K | 32 | 9.16 | 29.94 | 3.25 | 597.81 | 65.21 | 263805
1 | 90 | 13.36 | 44.84 | 3.25 | 832.33 | 59.56 | 19130
2 | 72 | 16.85 | 55.21 | 3.20 | 1092.01 | 63.28 | 74910
3 | 102 | 14.15 | 45.48 | 3.14 | 893.53 | 62.01 | 78625
TOTAL | 296 | 14.02 | 45.97 | 3.20 | 891.23 | 61.92 | 91140

Metro
K | 3155 | 9.70 | 53.24 | 5.33 | 1052.23 | 105.23 | 3319786
1 | 2757 | 13.90 | 49.27 | 3.51 | 960.56 | 68.26 | 2648254
2 | 2266 | 18.62 | 67.02 | 3.57 | 1314.26 | 69.83 | 2978124
3 | 2605 | 16.94 | 59.31 | 3.47 | 1159.07 | 67.58 | 3019367
TOTAL | 10783 | 14.40 | 56.59 | 4.04 | 1109.67 | 79.24 | 11965531

North West
K | 177 | 6.36 | 21.69 | 3.44 | 418.84 | 66.86 | 74135
1 | 185 | 11.77 | 41.62 | 3.52 | 807.45 | 68.42 | 149378
2 | 174 | 13.24 | 47.17 | 3.54 | 926.57 | 69.51 | 161224
3 | 200 | 12.31 | 42.42 | 3.43 | 827.08 | 67.22 | 165416
TOTAL | 736 | 10.96 | 38.35 | 3.48 | 747.49 | 67.98 | 550153

South East
K | 234 | 8.72 | 29.79 | 3.38 | 573.46 | 64.93 | 134190
1 | 419 | 13.75 | 49.79 | 3.60 | 982.88 | 71.00 | 411828
2 | 279 | 17.88 | 66.11 | 3.73 | 1315.35 | 74.06 | 366983
3 | 260 | 15.55 | 53.73 | 3.44 | 1069.58 | 69.07 | 278090
TOTAL | 1192 | 14.12 | 50.54 | 3.55 | 999.24 | 70.10 | 1191091

South West
K | 206 | 7.67 | 26.70 | 3.64 | 507.55 | 64.28 | 1813403
1 | 283 | 12.51 | 44.14 | 3.50 | 872.30 | 68.56 | 104556
2 | 256 | 17.12 | 60.17 | 3.50 | 1190.76 | 68.89 | 246861
3 | 326 | 15.81 | 55.22 | 3.51 | 3549.55 | 886.91 | 304834
TOTAL | 1071 | 13.68 | 47.99 | 3.53 | 1693.19 | 316.91 | 1157152

Central
K | 323 | 7.61 | 23.56 | 3.11 | 448.89 | 59.34 | 1101485
1 | 415 | 11.38 | 39.74 | 3.38 | 779.50 | 66.13 | 144993
2 | 339 | 13.74 | 49.03 | 3.51 | 966.47 | 69.19 | 323491
3 | 331 | 13.55 | 47.00 | 3.48 | 922.56 | 68.57 | 327634
TOTAL | 1408 | 11.59 | 39.97 | 3.37 | 782.30 | 65.88 | 305367
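The dosage summaries in Tables 9 and 10 (average weeks, sessions per week, and minutes per week) can be derived from session-level tutoring logs. The following is a minimal sketch under the assumption that each log row carries a student identifier, a week number, and minutes of tutoring; these field names are illustrative, not the OnCorps field names.

    from collections import defaultdict

    def dosage_summary(session_logs):
        """Summarize tutoring dosage per student from (student_id, week, minutes) rows."""
        sessions = defaultdict(int)   # total sessions per student
        minutes = defaultdict(int)    # total minutes per student
        weeks = defaultdict(set)      # distinct weeks tutored per student
        for student_id, week, mins in session_logs:
            sessions[student_id] += 1
            minutes[student_id] += mins
            weeks[student_id].add(week)
        summary = {}
        for sid in sessions:
            n_weeks = len(weeks[sid])
            summary[sid] = {"weeks": n_weeks,
                            "sessions": sessions[sid],
                            "sessions_per_week": sessions[sid] / n_weeks,
                            "total_minutes": minutes[sid],
                            "minutes_per_week": minutes[sid] / n_weeks}
        return summary

    logs = [("s1", 1, 20), ("s1", 1, 20), ("s1", 2, 20), ("s2", 1, 15)]
    print(dosage_summary(logs)["s1"])  # 2 weeks, 3 sessions, 1.5 sessions/week, 60 minutes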
Demographics

In order to more fully describe the population of children served by the Minnesota Reading Corps program, data regarding gender, ethnicity, and English Learner status were collected. These demographic data for participating students were entered into the MRC OnCorps database by Members.

Table 11: Pre-Kindergarten - Grade 3 Participant Demographic Data

Pre-K
  Gender: 51.37% Male; 48.57% Female; 0.07% Not Reported
  Ethnicity: 2.35% American Indian / Alaska Native; 19.57% African American; 8.45% Asian; 12.54% Hispanic / Latino; 0.50% Pacific Islander; 2.65% Unknown / No Response; 6.43% Multiple; 46.68% White; 0.83% Not Reported
  English Learner Status: 75.24% English as Primary Language; 24.11% ELL; 0.65% Not Reported

K-3
  Gender: 49.32% Male; 46.02% Female; 4.65% Not Reported
  Ethnicity: 2.55% American Indian / Alaska Native; 18.51% African American; 8.24% Asian; 0.02% Filipino; 10.10% Hispanic / Latino; 0.29% Pacific Islander; 2.27% Unknown / No Response; 2.58% Multiple; 48.85% White; 6.59% Not Reported
  English Learner Status: 66.71% English as Primary Language; 14.88% ELL; 18.41% Not Reported
2. Fidelity of Assessment Data Collection

What is the Research Base for MRC Program Assessments?

As listed above, the assessment tools used to determine the literacy progress of MRC-participating students include the following measures:

Picture Naming Fluency
Alliteration Fluency
Rhyming Fluency
Letter Naming Fluency
Letter Sound Fluency
Nonsense Word Fluency
Oral Reading Fluency

These tools were selected for use in the MRC because of their well-established statistical reliability and validity for screening and progress monitoring purposes. The Picture Naming, Alliteration, and Rhyming measures were developed through the University of Minnesota and are commonly referred to as Individual Growth and Development Indicators (IGDIs) of literacy. Letter Naming, Letter Sounds, and Nonsense Words are measures of early literacy skills researched by many groups and most notably packaged within two assessment systems, DIBELS and AIMSweb. Oral Reading Fluency provides an assessment of connected text reading; early and ongoing research on this measure has also been conducted at the University of Minnesota. All of these measures fit under the umbrella of Curriculum-Based Measurement (CBM) and are fluency-based assessments, meaning that students respond to as many items as they can within a fixed amount of time and the number of correct responses is counted.

The information that follows summarizes empirical findings related to the statistical reliability and validity of the measures used in the Minnesota Reading Corps.

Picture Naming Fluency:
r = .44 to .78 one-month alternate-form reliability
r = .67 three-week test-retest reliability
r = .47 to .75 with PPVT-3 and .63 to .81 with PLS-3
r = .32 to .37 with DIBELS Letter Naming Fluency and .44 to .49 with DIBELS Initial Sound Fluency
r = .41 (longitudinal) and .60 (cross-sectional) between scores and chronological age, with correlations of .63, .32, and .48 for typically developing, Head Start, and ECSE populations, respectively

Sources:

McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring Growth and Development for Preschool Children. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report). Minneapolis, MN: University of Minnesota. Accessed online at http://ggg.umn.edu/techreports/ecri_report8html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills. School Psychology Review, 36(3), 433-452.
Missall, K.N., McConnell, S.R., & Cadigan, K. (2006). Early literacy development: Skill growth and relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R.L., & Silberglitt, B. (2003). Tracking preschoolers' language and preliteracy development using a general outcome measurement system. Topics in Early Childhood Special Education, 23, 114-123.
Priest, J.S., McConnell, S.R., Walker, D., Carta, J.J., Kaminski, R.A., McEvoy, M.A., Good, R., Greenwood, C.R., & Shinn, M.R. (2001). General growth outcomes for young children: Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24, 163-180.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between birth and age eight. Assessment for Effective Intervention, 33(1), 51-54.
Alliteration:
r = .46 to .80 test-retest reliability over 3 weeks
r = .40 to .57 with PPVT-3
r = .34 to .55 with Clay's Concepts about Print
r = .75 to .79 with TOPA
r = .39 to .71 with DIBELS Letter Naming Fluency
r = .61 with chronological age

Sources:

McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring Growth and Development for Preschool Children. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report). Minneapolis, MN: University of Minnesota. Accessed online at http://ggg.umn.edu/techreports/ecri_report8html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills. School Psychology Review, 36(3), 433-452.
Missall, K.N., McConnell, S.R., & Cadigan, K. (2006). Early literacy development: Skill growth and relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R.L., & Silberglitt, B. (2003). Tracking preschoolers' language and preliteracy development using a general outcome measurement system. Topics in Early Childhood Special Education, 23, 114-123.
Priest, J.S., McConnell, S.R., Walker, D., Carta, J.J., Kaminski, R.A., McEvoy, M.A., Good, R., Greenwood, C.R., & Shinn, M.R. (2001). General growth outcomes for young children: Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24, 163-180.
VanDerHeyden, A.M., Snyder, P.A., Broussard, C., & Ramsdell, K. (2007). Measuring Response to Early Literacy Intervention with Preschoolers at Risk. Topics in Early Childhood Special Education, 27(4), 232-249.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between birth and age eight. Assessment for Effective Intervention, 33(1), 51-54.

Rhyming:
r = .83 to .89 test-retest reliability over 3 weeks
r = .56 to .62 with PPVT-3
r = .54 to .64 with Clay's Concepts about Print
r = .44 to .62 with TOPA
r = .44 to .63 with IGDI Picture Naming and .43 with IGDI Alliteration
r = .48 to .59 with DIBELS Letter Naming Fluency
r = .44 to .68 with DIBELS Initial Sound Fluency
r = .46 with chronological age

Sources:

McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring Growth and Development for Preschool Children. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report). Minneapolis, MN: University of Minnesota. Accessed online at http://ggg.umn.edu/techreports/ecri_report8html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills. School Psychology Review, 36(3), 433-452.
Missall, K.N., McConnell, S.R., & Cadigan, K. (2006). Early literacy development: Skill growth and relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R.L., & Silberglitt, B. (2003). Tracking preschoolers' language and preliteracy development using a general outcome measurement system. Topics in Early Childhood Special Education, 23, 114-123.
Priest, J.S., McConnell, S.R., Walker, D., Carta, J.J., Kaminski, R.A., McEvoy, M.A., Good, R., Greenwood, C.R., & Shinn, M.R. (2001). General growth outcomes for young children: Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24, 163-180.
VanDerHeyden, A.M., Snyder, P.A., Broussard, C., & Ramsdell, K. (2007). Measuring Response to Early Literacy Intervention with Preschoolers at Risk. Topics in Early Childhood Special Education, 27(4), 232-249.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between birth and age eight. Assessment for Effective Intervention, 33(1), 51-54.

Letter Naming Fluency:
r = .94 inter-rater reliability
r = .90 2-week test-retest reliability
r = .88 1-month alternate-form reliability
r = .93 alternate-forms reliability
r = .70 with WJ-R Readiness Cluster
r = .70 with WJ Psychoeducational Battery
r = .53 to .58 with CTOPP Composite
Predictive r = .65 with WJ Total Reading Cluster
Predictive r = .71 with R-CBM
ELL Predictive r = .67 with a composite of DIBELS NWF and R-CBM

Sources:

Assessment Committee Report for Reading First. (2002). Analysis of Reading Assessment Measures. Retrieved February 21, 2007, from http://dibels/uoregon.edu/techreports/dibels_5th_ed.pdf
Good, R.H., Kaminski, R.A., Shinn, M., Bratten, J., Shinn, M., & Laimon, L. (in preparation). Technical Adequacy and Decision Making Utility of DIBELS (Technical Report). Eugene, OR: University of Oregon.
Good, R.H., III, Kaminski, R.A., Simmons, D., & Kame'enui, E.J. (2001). Using Dynamic Indicators of Basic Early Literacy Skills (DIBELS) in an outcomes-driven model: Steps to reading outcomes. Unpublished manuscript, University of Oregon at Eugene.
Elliott, J., Lee, S.W., & Tollefson, N. (2001). A Reliability and Validity Study of the Dynamic Indicators of Basic Early Literacy Skills Modified. School Psychology Review, 30(1), 33-49.
Haager, D., & Gersten, R. (April, 2004). Predictive Validity of DIBELS for English Learners in Urban Schools. DIBELS Summit conference presentation, Albuquerque, NM.
Hintze, J.M., Ryan, A.L., & Stoner, G. (2003). Concurrent Validity and Diagnostic Accuracy of DIBELS and the CTOPP. School Psychology Review.
Kaminski, R.A., & Good, R.H. (1996). Toward a Technology for Assessing Basic Early Literacy Skills. School Psychology Review, 25, 215-227.
Rouse, H., & Fantuzzo, J.W. (2006). Validity of the Dynamic Indicators of Basic Early Literacy Skills as an Indicator of Early Literacy for Urban Kindergarten Children. School Psychology Review, 35(3), 341-355.

Letter Sound Fluency:
r = .83 2-week test-retest reliability
r = .80 alternate-form reliability
r = .79 with Letter Naming Fluency
Predictive r = .72 with R-CBM

Sources:

Elliott, J., Lee, S.W., & Tollefson, N. (2001). A Reliability and Validity Study of the Dynamic Indicators of Basic Early Literacy Skills Modified. School Psychology Review, 30(1), 33-49.
Fuchs, L., & Fuchs, D. (2004). Determining Adequate Yearly Progress from Kindergarten through Grade 6 with Curriculum-Based Measurement. Assessment for Effective Intervention, 29(4), 25-37.
Howe, K.B., Scierka, B.J., Gibbons, K.A., & Silberglitt, B. (2003). A School-Wide Organization System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based Instruction: One Education District's Experience. Assessment for Effective Intervention, 28, 59-72.
Stage, S.A., Sheppard, J., Davidson, M.M., & Browning, M.M. (2001). Prediction of First Graders' Growth in Oral Reading Fluency Using Kindergarten Letter Naming Fluency. Journal of School Psychology, 39(3), 225-237.
Ritchey, K.D. (2008). Assessing Letter Sound Knowledge: A Comparison of Letter Sound Fluency and Nonsense Word Fluency. Exceptional Children, 74(4), 487-506.

Nonsense Word Fluency:
r = .83 one-month alternate-form reliability
r = .36 to .59 with WJ-R Readiness Cluster
Predictive r = .82 with R-CBM in spring of grade 1
Predictive r = .65 with oral reading and .54 with maze in grade 3
ELL Predictive r = .63 with a composite of DIBELS NWF and R-CBM

Sources:

Burke, M.D., & Hagan-Burke, S. (2007). Concurrent criterion-related validity of early literacy indicators for middle of first grade. Assessment for Effective Intervention, 32(2), 66-77.
Good, R.H., Kaminski, R.A., Shinn, M., Bratten, J., Shinn, M., & Laimon, L. (in preparation). Technical Adequacy and Decision Making Utility of DIBELS (Technical Report). Eugene, OR: University of Oregon.
Good, R.H., Kaminski, R.A., Simmons, D., & Kame'enui, E.J. (2001). Using DIBELS in an Outcomes-Driven Model: Steps to Reading Outcomes. Unpublished manuscript, University of Oregon, Eugene.
Haager, D., & Gersten, R. (April, 2004). Predictive Validity of DIBELS for English Learners in Urban Schools. DIBELS Summit conference presentation, Albuquerque, NM.
Howe, K.B., Scierka, B.J., Gibbons, K.A., & Silberglitt, B. (2003). A School-Wide Organization System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based Instruction: One Education District's Experience. Assessment for Effective Intervention, 28, 59-72.
Kaminski, R.A., & Good, R.H. (1996). Toward a Technology for Assessing Basic Early Literacy Skills. School Psychology Review, 25, 215-227.
Ritchey, K.D. (2008). Assessing Letter Sound Knowledge: A Comparison of Letter Sound Fluency and Nonsense Word Fluency. Exceptional Children, 74(4), 487-506.
Rouse, H., & Fantuzzo, J.W. (2006). Validity of the Dynamic Indicators of Basic Early Literacy Skills as an Indicator of Early Literacy for Urban Kindergarten Children. School Psychology Review, 35(3), 341-355.
Vanderwood, M., Linklater, D., & Healy, K. (2008). Predictive Accuracy of Nonsense Word Fluency for English Language Learners. School Psychology Review, 37(1), 5-17.

Oral Reading Fluency:
r = .92 to .97 test-retest reliability
r = .89 to .94 alternate-form reliability
r = .82 to .86 with Gates-MacGinitie Reading Test
r = .83 with Iowa Test of Basic Skills
r = .88 with Stanford Achievement Test
r = .73 to .80 with Colorado Student Assessment Program
r = .67 with Michigan Student Assessment Program
r = .73 with North Carolina Student Assessment Program
r = .74 with Arizona Student Assessment Program
r = .61 to .65 with Ohio Proficiency Test, Reading Portion
r = .58 to .82 with Oregon Student Assessment Program (SAT-10)

Sources:

Barger, J. (2003). Comparing the DIBELS Oral Reading Fluency indicator and the North Carolina end of grade reading assessment (Technical Report). Asheville, NC: North Carolina Teacher Academy.
Baker, S., et al. (2008). Reading Fluency as a Predictor of Reading Proficiency in Low-Performing, High-Poverty Schools. School Psychology Review, 37(1), 18-37.
Burke, M.D., & Hagan-Burke, S. (2007). Concurrent criterion-related validity of early literacy indicators for middle of first grade. Assessment for Effective Intervention, 32(2), 66-77.
Deno, S.L., Mirkin, P.K., & Chiang, B. (1982). Identifying valid measures of reading. Exceptional Children, 49, 36-45.
Howe, K.B., Scierka, B.J., Gibbons, K.A., & Silberglitt, B. (2003). A School-Wide Organization System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based Instruction: One Education District's Experience. Assessment for Effective Intervention, 28, 59-72.
Hintze, J.M., et al. (2002). Oral Reading Fluency and Prediction of Reading Comprehension in African American and Caucasian Elementary School Children. School Psychology Review, 31(4), 540-553.
Hintze, J.M., & Silberglitt, B. (in press). A Longitudinal Examination of the Diagnostic Accuracy and Predictive Validity of R-CBM and High-Stakes Testing. School Psychology Review.
Marston, D., Fuchs, L., & Deno, S. (1987). Measuring pupil progress: a comparison of standardized achievement tests and curriculum-related measures. Diagnostique, 11, 77-90.
Marston, D. (1989). Curriculum-based measurement: What is it and why do it? In M.R. Shinn (Ed.), Curriculum-based measurement: Assessing special children (pp. 18-78). New York: Guilford Press.
McGlinchey, M.T., & Hixson, M.D. (2004). Contemporary research on curriculum-based measurement: Using curriculum-based measurement to predict performance on state assessments in reading. School Psychology Review, 33(2), 193-204.
Schilling, S.G., Carlisle, J.F., Scott, S.E., & Zeng, J. (2007). Are fluency measures accurate predictors of reading achievement? The Elementary School Journal, 107(5), 429-448.
Silberglitt, B., & Hintze, J.M. (in press). Formative Assessment Using Oral Reading Fluency Cut Scores to Track Progress Toward Success on State-Mandated
Achievement Tests: A Comparison of Methods. Journal of Psychoeducational Assessment.
Shaw, R., & Shaw, D. (2002). DIBELS Oral Reading Fluency-Based Indicators of the third-grade reading skills for Colorado State Assessment Program (CSAP) (Technical Report). Eugene, OR: University of Oregon.
Shinn, M., Good, R., Knutson, N., Tilly, W., & Collins, A. (1992). Curriculum-based measurement of oral reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review, 21, 459-479.
Stage, S.A., & Jacobsen, M.D. (2001). Predicting student success on a state-mandated performance-based assessment using oral reading fluency. School Psychology Review, 30(3), 407-420.
Tindal, G., Germann, G., & Deno, S. (1983). Descriptive research on the Pine County Norms: A compilation of findings (Research Report No. 132). Minneapolis, MN: University of Minnesota Institute for Research on Learning Disabilities.
Vander Meer, C.D., Lentz, F.E., & Stollar, S. (2005). The relationship between oral reading fluency and Ohio proficiency testing in reading (Technical Report). Eugene, OR: University of Oregon.
Wilson, J. (2005). The relationship of Dynamic Indicators of Basic Early Literacy Skills (DIBELS) Oral Reading Fluency to performance on Arizona Instrument to Measure Standards (AIMS). Tempe, AZ: Tempe School District No. 3.

Are the data collection tools used with children being implemented with fidelity?

Analysis of the fidelity with which student assessment data are collected is a critical first step in the evaluation of the MRC program, so that results based on these data may be reported with confidence. To accomplish this, a series of Accuracy of Implementation Rating Scales (AIRS) have been compiled from each Minnesota Reading Corps (MRC) site. MRC Internal Coaches and Master Coaches were trained in August 2011 to administer and score the assessment measures, and to conduct observations of Reading Corps Members as they administer and score these measures. The AIRS
are structured observational protocols that allow observers to certify that each aspect of a standardized administration of each assessment measure has been fully conducted. Master and/or Internal Coaches completed a minimum of one AIRS observation for each Reading Corps Member, for each type of assessment the Member conducted, at least three times each year, near the benchmark data collection periods. The table below documents the total number of AIRS assessments compiled, the total number of AIRS assessments completely filled out on 2011-2012 forms, and the percent fidelity documented for each measure among those completely filled out on 2011-2012 forms. In addition to documenting, in aggregate, the high fidelity of assessment procedures across the state, this observation system provided Members with immediate feedback regarding the quality of their own assessment skills and an opportunity to receive clarification or re-training as needed in a timely manner.
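Each AIRS observation yields a percent-fidelity score (steps completed correctly out of steps observed), and the table that follows summarizes those scores per measure. The sketch below illustrates the arithmetic only; the checklist items shown are hypothetical, not the actual AIRS items.

    import statistics

    def percent_fidelity(checklist):
        """Percent of administration steps marked as correctly completed."""
        return 100.0 * sum(checklist.values()) / len(checklist)

    # Two hypothetical observations of the same Member.
    observations = [
        {"directions read verbatim": True, "timing followed": True,
         "errors marked per rules": True, "score computed correctly": True},
        {"directions read verbatim": True, "timing followed": False,
         "errors marked per rules": True, "score computed correctly": True},
    ]
    scores = [percent_fidelity(obs) for obs in observations]
    print("median:", statistics.median(scores))
    print("mean:", statistics.mean(scores))
    print("sd:", statistics.pstdev(scores))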
Table 12: Fidelity of Assessment Data Collection Procedures

Measure | Total AIRS Collected | Total Complete AIRS | Fidelity Range Reported | Median % Fidelity Reported | Mean % Fidelity Reported | Standard Deviation
Rhyming | 729 | 729 | 0% - 100% | 100% | 99% | 0.07
Alliteration | 712 | 712 | 11% - 100% | 100% | 99% | 0.07
Picture Naming | 722 | 722 | 0% - 100% | 100% | 99% | 0.05
PK Letter Naming | 718 | 718 | 36% - 100% | 100% | 99% | 0.04
PK Letter Sounds | 703 | 703 | 0% - 100% | 100% | 99% | 0.05
K-3 Letter Naming | 2077 | 2077 | 46% - 100% | 100% | 98% | 0.04
K-3 Letter Sounds | 2100 | 2100 | 38% - 100% | 100% | 98% | 0.05
Nonsense Words | 2214 | 2214 | 0% - 100% | 100% | 98% | 0.06
Oral Reading Fluency | 2520 | 2520 | 15% - 100% | 100% | 98% | 0.05

3. Fidelity of Intervention Implementation

Are the interventions used with children being implemented with fidelity?

As with the assessment tools, analysis of the fidelity with which the student intervention protocols are followed is a critical first step in the evaluation of the MRC program, so that results of the student growth analysis may be attributed to accurate implementation of the intervention scripts. To accomplish this, a series of intervention integrity observations have been
compiled from each MRC site. MRC internal and master coaches were trained in August 2011 to evaluate implementation integrity for each of the MRC interventions. The integrity checklists provide an opportunity for observers to certify that each aspect of a standardized administration of each intervention has been fully conducted. Master and/or internal coaches completed a minimum of one intervention integrity checklist for each MRC Member during each monthly visit, for a possible total of nine checklists per Member per year. The interested reader is referred to Ehrhardt, Barnett, Lentz, Stollar, & Reifin (1996) for a description of how to use scripts to improve intervention integrity.

The first 10 intervention protocols listed are available to all students involved in the K-3 MRC program. The next 7 are offered in a subset of K-3 MRC sites whose Members have been nominated and provided additional training on implementation of small-group intervention protocols. The final group of intervention protocols comprises those offered in the Pre-K MRC program. The table below documents the number of integrity checklists compiled and the percent fidelity documented for each intervention.

Table 13: Fidelity of Intervention Implementation Procedures

Intervention | Total Complete Fidelity Checks | Fidelity Range Reported | Median Percent Fidelity Reported | Mean Percent Fidelity Reported | Standard Deviation
Duet Reading | 1583 | 19% - 100% | 100% | 96% | 0.08
Newscaster Reading | 530 | 18% - 100% | 100% | 96% | 0.08
Repeated Reading with Comprehension Strategy Practice | 1328 | 23% - 100% | 94% | 87% | 0.15
Stop / Go | 129 | 70% - 100% | 100% | 97% | 0.66
Pencil Tap | 282 | 21% - 100% | 100% | 94% | 0.11
Great Leaps | 1092 | 7% - 100% | 96% | 92% | 0.13
Letter Sound Identification | 586 | 25% - 100% | 100% | 95% | 0.11
Word Blending | 719 | 25% - 100% | 100% | 96% | 0.10
Phoneme Blending | 297 | 38% - 100% | 100% | 96% | 0.09
Phoneme Segmenting | 215 | 56% - 100% | 100% | 97% | 0.07
Word Construction | 138 | 35% - 100% | 95% | 91% | 0.15
Partner Reading (Paired) | 44 | 78% - 100% | 100% | 97% | 0.06
Phoneme Blending (Paired) | 88 | 36% - 100% | 100% | 95% | 0.11
Phoneme Segmenting (Paired) | 62 | 46% - 100% | 100% | 94% | 0.12
Letter Sound Identification (Paired) | 173 | 55% - 100% | 100% | 94% | 0.11
Blending Words (Paired) | 64 | 64% - 100% | 100% | 95% | 0.09
Repeated Read Aloud (K Focus) | 316 | 19% - 100% | 89% | 85% | 0.15
Pre-K Oral Language | 345 | 53% - 100% | 100% | 97% | 0.07
Pre-K Phonological Awareness 1 | 137 | 58% - 100% | 100% | 96% | 0.06
Pre-K Phonological Awareness 2 | 239 | 18% - 100% | 100% | 97% | 0.08
Pre-K Phonological Awareness 3 | 167 | 64% - 100% | 100% | 97% | 0.05
Pre-K Phonological Awareness 4 | 18 | 67% - 100% | 100% | 90% | 0.13
Pre-K Repeated Read Aloud 1 | 368 | 18% - 100% | 91% | 83% | 0.19
Pre-K Repeated Read Aloud 2 | 77 | 23% - 100% | 91% | 86% | 0.19
Pre-K Repeated Read Aloud 3-4 | 64 | 0% - 100% | 91% | 74% | 0.34
Pre-K Repeated Read Aloud 5 | 17 | 63% - 100% | 100% | 95% | 0.10
Pre-K Sign In | 442 | 20% - 100% | 100% | 95% | 0.10
Pre-K Visual Discrimination 1 | 135 | 62% - 100% | 100% | 94% | 0.09
Pre-K Visual Discrimination 2 | 173 | 61% - 100% | 100% | 94% | 0.08
Pre-K Visual Discrimination 3 | 163 | 41% - 100% | 100% | 96% | 0.07
Pre-K Visual Discrimination 4 | 72 | 82% - 100% | 100% | 97% | 0.04
Pre-K Visual Discrimination 5 | 90 | 44% - 100% | 100% | 98% | 0.07
Pre-K Visual Discrimination 6 | 36 | 80% - 100% | 100% | 100% | 0.13

Growth in Fidelity Reporting

The chart below compares the total numbers of fidelity observations recorded for the MRC program during the 2008-2009, 2009-2010, 2010-2011, and 2011-2012 school years for interventions offered across more than one year. Significant growth in total complete observations recorded is noted, mirroring growth in the program.
Figure 2: Growth in Fidelity Reporting
[Bar chart comparing total complete fidelity observations recorded in the 2008-2009, 2009-2010, 2010-2011, and 2011-2012 school years.]

What is the Research Base for MRC Program Interventions?

The interventions identified for use in the MRC program are each designed to provide additional practice that is supplemental to the core reading instructional program offered by the local school site. This practice is provided with the intention of building automaticity and fluency in important reading skills that have already been introduced by local classroom teachers. It is important to note that MRC participation is in addition to, not a replacement for, a comprehensive core reading instructional program, and that the MRC program should in no way be viewed as a substitute for high-quality core instruction. MRC provides important additional guided practice time in reading for students who need this support. For further discussion regarding the benefit of supplemental support to students at risk for reading failure, see Harn (2008). For a discussion of the benefit of well-matched interventions, see Wagner et al. (2006).

The chosen interventions share a common focus on building fluency in basic reading skills such as phonemic awareness, letter sound knowledge,
decoding skill, and sight word recognition. Fluency is interpreted in this program as incorporating rate, accuracy, and prosody (expression). Richard Allington, former president of the International Reading Association, writes:

"There are a substantial number of rigorously designed research studies demonstrating (1) that fluency can be developed, most readily through a variety of techniques that involve rereading texts and (2) that fostering fluency has reliable positive impacts on comprehension and performance. Thus when fluency is an instructional goal, as it should be for struggling readers, we have a wealth of research to guide our instructional planning." (Allington, 2001)

For further discussion of the relationship between oral reading fluency and comprehension skills, the interested reader is referred to Tenenbaum & Wolking (1989) and Sindelar, Monda, & O'Shea (1990).

A unique feature of MRC is the consistent use of research-based intervention protocols with participating students to provide this additional support. MRC Members select from a menu of research-based supplemental reading interventions for use with participating MRC students, as listed below. For each intervention protocol, a description of the research base and/or sources of empirical evidence of intervention effectiveness is listed.

Repeated Reading with Comprehension Strategy Practice

Nelson, J.S., Alber, S.R., & Grody, A. (2004). Effects of systematic error correction and repeated readings on reading accuracy and proficiency of second graders with disabilities. Education and Treatment of Children, 27, 186-198.
Staubitz, J.E., Cartledge, G., Yurick, A., & Lo, Y. (2004). Repeated reading for students with emotional or behavioral disorders: Peer- and trainer-mediated instruction. Behavioral Disorders, 31, 51-64.
Therrien, W.J. (2004). Fluency and comprehension gains as a result of repeated reading: A meta-analysis. Remedial and Special Education, 25, 252-261.
Moyer, S.B. (1982). Repeated reading. Journal of Learning Disabilities, 45, 619-623.
Rashotte, C.A., & Torgesen, J.K. (1985). Repeated reading and reading fluency in learning disabled children. Reading Research Quarterly, 20, 180-188.
Samuels, S.J. (1979). The method of repeated reading. The Reading Teacher, 32, 403-408.
Samuels, S.J. (1987). Information processing abilities and reading. Journal of Learning Disabilities, 20(1), 18-22.
Sindelar, P.T., Monda, L.E., & O'Shea, L.J. (1990). Effects of repeated reading on instructional and mastery level readers. Journal of Educational Research, 83, 220-226.
Morrow, L.M. (1985). Retelling stories: A strategy for improving young children's comprehension, concept of story structure, and oral language complexity. The Elementary School Journal, 85, 646-661.

Duet Reading

Aulls, M.W. (1982). Developing Readers in Today's Elementary Schools. Boston: Allyn & Bacon.
Blevins, W. (2001). Building Fluency: Lessons and Strategies for Reading Success. New York: Scholastic Professional Books.
Dowhower, S.L. (1991). Speaking of prosody: Fluency's unattended bedfellow. Theory into Practice, 30(3), 165-175.
Mathes, P.G., Simmons, D.C., & Davis, B.I. (1992). Assisted reading techniques for developing reading fluency. Reading Research and Instruction, 31, 70-77.
Weinstein, G., & Cooke, N.L. (1992). The effects of two repeated reading interventions on generalization of fluency. Learning Disability Quarterly, 15, 21-27.

Newscaster Reading

Armbruster, B.B., Lehr, F., & Osborn, J. (2001). Put reading first: The research building blocks for teaching children to read. Washington, DC: US Department of Education, National Institute for Literacy.
Dowhower, S.L. (1987). Effects of repeated reading on second-grade transitional readers' fluency and comprehension. Reading Research Quarterly, 22, 389-406. (listening to a tape)
Heckelman, R.G. (1969). A neurological-impress method of remedial reading instruction. Academic Therapy, 4, 277-282.
Daly, E.J., III, & Martens, B. (1994). A comparison of three interventions for increasing oral reading performance: Application of the instructional hierarchy. Journal of Applied Behavior Analysis, 29, 507-518.
Skinner, C.H., Adamson, K.L., Woodward, J.R., Jackson, R.R., Atchison, L.A., & Mims, J.W. (1993). The effects of models' rates of reading on students' reading during listening previewing. Journal of Learning Disabilities, 26, 674-681.
Rasinski, T.V. (2003). The fluent reader: Reading strategies for building word recognition, fluency, and comprehension. New York, NY: Scholastic Professional Books.
Searfoss, L. (1975). Radio Reading. The Reading Teacher, 29, 295-296.
Stahl, S. (2004). What do we Know About Fluency?: Findings of the National Reading Panel. In P. McCardle & V. Chhabra (Eds.), The Voice of Evidence in Reading Research. Baltimore: Brookes.

Stop / Go
Blevins, W. (2001). Building Fluency: Lessons and Strategies for Reading Success. New York: Scholastic Professional Books.
Rasinski, T., & Padak, N. (1994). Effects of fluency development on urban second-graders. Journal of Educational Research, 87.
Rasinski, T.V. (2003). The fluent reader: Reading strategies for building word recognition, fluency, and comprehension. New York, NY: Scholastic Professional Books.

Pencil Tap

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Howell, K.W., & Nolet, V. (2000). Curriculum-Based Evaluation: Teaching and Decision Making (3rd ed.). Belmont, CA: Wadsworth.
Lysakowski, R.S., & Walberg, H.J. (1982). Instructional effects of cues, participation, and corrective feedback: A quantitative synthesis. American Educational Research Journal, 19(4), 559-578.
Tenenbaum, G., & Goldring, E. (1989). A meta-analysis of the effects of enhanced instruction: Cues, participation, reinforcement and feedback and correctives on motor skill learning. Journal of Research & Development in Education, 22(3), 53-64.

Great Leaps

Mercer, Cecil D., Campbell, Kenneth U., Miller, W. David, Mercer, Kenneth D., & Lane, Holly B. (2000). Effects of a Reading Fluency Intervention for Middle Schoolers with Specific Learning Disabilities. Learning Disabilities Research and Practice, 15(4), 179-189.
Meyer, Marianne. (2002). Repeated Reading: An Old Standard is Revisited and Renovated. Perspectives, 28(1), 15-18.
Letter Sound Identification

Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press.
Adams, M.J. (2001). Alphabetic anxiety and explicit, systematic phonics instruction: A cognitive science perspective. In S.B. Neuman & D.K. Dickinson (Eds.), Handbook of Early Literacy Research (pp. 66-80). New York: Guilford Press.
Chard, D.J., & Osborn, J. (1999). Word recognition: Paving the road to successful reading. Intervention in School and Clinic, 34(5), 271-277.

Word Blending

Adams, M.J. (2001). Alphabetic anxiety and explicit, systematic phonics instruction: A cognitive science perspective. In S.B. Neuman & D.K. Dickinson (Eds.), Handbook of Early Literacy Research (pp. 66-80). New York: Guilford Press.
Goswami, U. (2000). Causal connections in beginning reading: The importance of rhyme. Journal of Research in Reading, 22(3), 217-240.
Greaney, K.T., Tunmer, W.E., & Chapman, J.W. (1997). Journal of Educational Psychology, 89(4), 645-651.

Phoneme Blending

Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press.
Bos, C.D., & Vaughn, S. (2002). Strategies for teaching students with learning and behavioral problems (5th ed.). Boston: Allyn & Bacon.
Ehri, L.C., Nunes, S.R., & Willows, D.M. (2001). Phonemic awareness instruction helps children learn to read: Evidence from the National Reading Panel's meta-analysis. Reading Research Quarterly, 36(3), 250-287.
Elkonin, D.B. (1973). U.S.S.R. In J. Downing (Ed.), Comparative Reading (pp. 551-579). New York: MacMillan.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. Bethesda, MD: National Institutes of Health.
Santi, K.L., Menchetti, B.M., & Edwards, B.J. (2004). A comparison of eight kindergarten phonemic awareness programs based on empirically validated instructional principles. Remedial and Special Education, 25(3), 189-196.
Smith, C.R. (1998). From gibberish to phonemic awareness: Effective decoding instruction. Exceptional Children, 30(6), 20-25.
Smith, S.B., Simmons, D.C., & Kame'enui, E.J. (1998). Phonological awareness: Research bases. In D.C. Simmons & E.J. Kame'enui (Eds.), What reading research tells us about children with diverse learning needs: Bases and basics. Mahwah, NJ: Lawrence Erlbaum Associates.
Snider, V.E. (1995). A primer on phonemic awareness: What it is, why it is important, and how to teach it. School Psychology Review, 24, 443-455.

Phoneme Segmentation

Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press.
Blachman, B.A. (1991). Early intervention for children's reading problems: Clinical applications of the research on phonological awareness. Topics in Language Disorders, 12, 51-65.
Bos, C.D., & Vaughn, S. (2002). Strategies for teaching students with learning and behavioral problems (5th ed.). Boston: Allyn & Bacon.
Ehri, L.C., Nunes, S.R., & Willows, D.M. (2001). Phonemic awareness instruction helps children learn to read: Evidence from the National
Reading Panel's meta-analysis. Reading Research Quarterly, 36(3), 250-287.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. Bethesda, MD: National Institutes of Health.
Santi, K.L., Menchetti, B.M., & Edwards, B.J. (2004). A comparison of eight kindergarten phonemic awareness programs based on empirically validated instructional principles. Remedial and Special Education, 25(3), 189-196.
Smith, C.R. (1998). From gibberish to phonemic awareness: Effective decoding instruction. Exceptional Children, 30(6), 20-25.
Smith, S.B., Simmons, D.C., & Kame'enui, E.J. (1998). Phonological awareness: Research bases. In D.C. Simmons & E.J. Kame'enui (Eds.), What reading research tells us about children with diverse learning needs: Bases and basics. Mahwah, NJ: Lawrence Erlbaum Associates.
Snider, V.E. (1995). A primer on phonemic awareness: What it is, why it is important, and how to teach it. School Psychology Review, 24, 443-455.

Partner Reading

Fuchs, D., Fuchs, L., & Burish, P. (2000). Peer-assisted learning strategies: An evidence-based practice to promote reading achievement. Learning Disabilities Research and Practice, 15(2), 85-91.
Koskinen, P., & Blum, I. (1986). Paired repeated reading: A classroom strategy for developing fluent reading. The Reading Teacher, 40(1), 70-75.
Rathvon, N. (2008). Effective school interventions (2nd ed.). New York, NY: Guilford Press.
Vaughn, S., Chard, D., Bryant, D., Coleman, M., Tyler, B., Linan-Thompson, S., & Kouzekanani, K. (2000). Fluency and comprehension interventions for third grade students. Remedial and Special Education, 21(6), 325-335.
Word Construction

McCandliss, B.D., Beck, I., Sandak, R., & Perfetti, C. (2003). Focusing attention on decoding for children with poor reading skills: A study of the Word Building intervention. Scientific Studies of Reading, 7(1), 75-105.

Repeated Read Aloud

McGee, L.M., & Schickedanz, J.A. (2007). Repeated interactive read-alouds in preschool and kindergarten. The Reading Teacher, 60(8), 742-751.
Lonigan, C.J., Anthony, J.L., Bloomfield, B.G., Dyer, S.M., & Samwel, C.S. (1999). Effects of two shared-reading interventions on emergent literacy skills of at-risk preschoolers. Journal of Early Intervention, 22(4), 306-322.
Whitehurst, G.J., Arnold, D.S., Epstein, J.N., Angell, A.L., Smith, M., & Fischel, J.E. (1994). A picture book reading intervention in day care and home for children from low-income families. Developmental Psychology, 30(5), 679-689.
Whitehurst, G.J., Epstein, J.N., Angell, A.L., Payne, A.C., Crone, D.A., & Fischel, J.E. (1994). Outcomes of an emergent literacy intervention in Head Start. Journal of Educational Psychology, 86(4), 542-555.
4. Student Outcomes

Is the literacy improvement of MRC participating students consistent with expectations?

The following sections document the growth and achievements of children from age 3 through grade 3 who participated in the MRC program in the 2011-2012 school year. It is important to acknowledge that MRC participating students are also supported by a variety of resources, most notably the instruction and guidance provided by their schools and families. This evaluation is not intended to address or control for the variables related to these resources, nor to suggest that student progress, or lack thereof, must be attributed solely to the service provided through the MRC program. The purpose of this design is to focus on the desired literacy outcomes for all children.

Pre-Kindergarten Student Performance

The five measurement tools utilized for the Pre-K Reading Corps program are listed below. For a description of the technical characteristics of these assessments, the interested reader is referred to the 2007-2008 Minnesota Reading Corps program evaluation. For each assessment tool, a target score was identified as the goal for the end of the year. These target scores were based on the target scores used in Minneapolis Public Schools for incoming kindergarten students and upon 50th percentile scores for incoming kindergarten students within school districts served by the Saint Croix River Education District. Prior to their use in the current project, these targets were reviewed by the researchers at the University of Minnesota who originally created the Individual Growth and Development Indicators (IGDIs). The measures and target scores for this project are listed below:

Measure | Spring Target Score
Rhyming | 12
Picture Naming | 26
Alliteration | 8
Letter Sound Fluency | 8
Letter Naming Fluency | 14
Pre-Kindergarten student performance on the fall, winter, and spring IGDI measures is listed in the tables below for all students with a birth date reported and data collected during the appropriate assessment window. Score ranges are also reported. Students score NA when they do not complete the sample items sufficiently to warrant participation in the assessment. A score of 0 indicates adequate performance on the sample items but no accurate responses during the assessment.

Table 14: Pre-Kindergarten Participant Performance on IGDIs: Fall Benchmark

Measure | Output | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming | Fall Number Students Tested | 955 | 4,489 | 197
Picture Naming | Range of Scores | NA-37 | NA-44 | NA-47
Picture Naming | Fall (Number) Percent Students Above Spring Target | (65) 6.81% | (909) 20.25% | (74) 37.56%
Alliteration | Fall Number Students Tested | 937 | 4,493 | 194
Alliteration | Range of Scores | NA-14 | NA-31 | NA-22
Alliteration | Fall (Number) Percent Students Above Spring Target | (13) 1.39% | (417) 9.28% | (38) 19.59%
Rhyming | Fall Number Students Tested | 954 | 4,464 | 193
Rhyming | Range of Scores | NA-21 | NA-28 | NA-25
Rhyming | Fall (Number) Percent Students Above Spring Target | (12) 1.26% | (535) 11.98% | (51) 26.42%
Letter Naming Fluency | Fall Number Students Tested | 646 | 5,577 | 227
Letter Naming Fluency | Range of Scores | NA-40 | NA-63 | NA-68
Letter Naming Fluency | Fall (Number) Percent Students Above Spring Target | (36) 5.57% | (1,212) 21.73% | (87) 38.33%
Letter Sound Fluency | Fall Number Students Tested | 255 | 4,431 | 195
Letter Sound Fluency | Range of Scores | NA-21 | NA-110 | NA-41
Letter Sound Fluency | Fall (Number) Percent Students Above Spring Target | (10) 3.92% | (628) 14.17% | (53) 27.18%
Table 15: Pre-Kindergarten Participant Performance on IGDIs: Winter Benchmark

Measure | Output | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming | Winter Number Students Tested | 917 | 4,340 | 197
Picture Naming | Range of Scores | NA-43 | NA-50 | NA-47
Picture Naming | Winter (Number) Percent Students Above Spring Target | (237) 25.85% | (2,037) 46.94% | (122) 61.93%
Alliteration | Winter Number Students Tested | 884 | 4,280 | 196
Alliteration | Range of Scores | NA-24 | NA-30 | NA-30
Alliteration | Winter (Number) Percent Students Above Spring Target | (67) 7.58% | (1,179) 27.55% | (79) 40.31%
Rhyming | Winter Number Students Tested | 916 | 4,326 | 197
Rhyming | Range of Scores | NA-25 | NA-44 | NA-29
Rhyming | Winter (Number) Percent Students Above Spring Target | (99) 10.81% | (1,570) 36.29% | (91) 46.19%
Letter Naming Fluency | Winter Number Students Tested | 373 | 4,349 | 197
Letter Naming Fluency | Range of Scores | NA-50 | NA-79 | NA-76
Letter Naming Fluency | Winter (Number) Percent Students Above Spring Target | (85) 22.79% | (2,428) 55.83% | (124) 62.94%
Letter Sound Fluency | Winter Number Students Tested | 270 | 4,282 | 195
Letter Sound Fluency | Range of Scores | NA-34 | NA-77 | NA-49
Letter Sound Fluency | Winter (Number) Percent Students Above Spring Target | (26) 9.63% | (1,696) 39.61% | (92) 47.18%

Table 16: Pre-Kindergarten Participant Performance on IGDIs: Spring Benchmark

Measure | Output | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming | Spring Number Students Tested | 806 | 3,877 | 174
Picture Naming | Range of Scores | NA-41 | NA-58 | NA-47
Picture Naming | Spring (Number) Percent Students Above Spring Target | (398) 49.38% | (2,795) 72.09% | (139) 79.89%
Alliteration | Spring Number Students Tested | 784 | 3,867 | 173
Alliteration | Range of Scores | NA-27 | NA-38 | NA-34
Alliteration | Spring (Number) Percent Students Above Spring Target | (142) 18.11% | (2,047) 52.94% | (109) 63.01%
Rhyming | Spring Number Students Tested | 797 | 3,882 | 174
Rhyming | Range of Scores | NA-28 | NA-37 | NA-35
Rhyming | Spring (Number) Percent Students Above Spring Target | (179) 22.46% | (2,330) 60.02% | (119) 68.39%
Letter Naming Fluency | Spring Number Students Tested | 291 | 3,867 | 174
Letter Naming Fluency | Range of Scores | NA-63 | NA-95 | NA-73
Letter Naming Fluency | Spring (Number) Percent Students Above Spring Target | (119) 40.89% | (3,020) 78.10% | (141) 81.03%
Letter Sound Fluency | Spring Number Students Tested | 243 | 3,819 | 173
Letter Sound Fluency | Range of Scores | NA-43 | NA-105 | NA-61
Letter Sound Fluency | Spring (Number) Percent Students Above Spring Target | (61) 25.10% | (2,512) 65.78% | (126) 72.83%

The figure below shows the percentage of students who were on or above, near, and far below target at each benchmark period. Data are displayed for each early literacy measure.

Figure 3: Percentage of Students On or Above, Near, and Far from Target: Fall, Winter, and Spring

The figure and table below show the normative performance of all 4-year-old students participating in the IGDI measures during each benchmark window of the 2011-2012 school year. As a clarification, students who did not successfully complete the sample items on an assessment measure, and therefore did not continue on to the actual assessment, were given a score of NA, which is recorded in the figure as a -1.
Figure 4: Normative Performance of 4-Year-Olds on IGDI Measures: Fall, Winter, and Spring
(Percentile markers plotted for each measure and benchmark season, fall through spring, for Rhyming, Letter Sounds, Picture Naming, Alliteration, and Letter Naming; y-axis: Items Correct. Green Diamond = 90th %ile score; Mid-Blue X = 75th %ile score; Navy Triangle = 50th %ile score; Light Blue Diamond = 25th %ile score; Red Square = 10th %ile score.)
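Because a student who does not pass the sample items receives NA rather than a numeric score, the NA results are recoded as -1 before the percentile points in the figure above and the table below are computed. A minimal sketch of that convention, assuming a pandas DataFrame named igdi with raw string scores (the frame and column names are illustrative, not the evaluation's actual data structure):

```python
import pandas as pd

# Illustrative records: raw IGDI scores as collected, "NA" = did not pass the sample items.
igdi = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "picture_naming_fall": ["14", "NA", "27", "0"],
})

# Recode NA as -1 (per the convention described above) and make the column numeric.
scores = igdi["picture_naming_fall"].replace("NA", "-1").astype(int)

# Percentile points of the kind reported in Table 17 for the norm group.
percentiles = scores.quantile([0.90, 0.75, 0.50, 0.25, 0.10])
print(percentiles)
```

Keeping the NA students in the norm group (rather than dropping them) is what allows the lower percentile points to fall at NA for the hardest measures.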
Table 17: Normative Performance of 4-Year-Olds on IGDI Measures: Fall, Winter, and Spring

Measure - Season | 90th %ile | 75th %ile | 50th %ile | 25th %ile | 10th %ile | Number Students in Norm Group
Rhyming - Fall | 12 | 8 | 3 | NA | NA | 4464
Rhyming - Winter | 18 | 14 | 8 | 2 | NA | 4328
Rhyming - Spring | 21 | 17 | 13 | 8 | 2 | 3883
Letter Sounds - Fall | 11 | 3 | 0 | 0 | NA | 4431
Letter Sounds - Winter | 22 | 13 | 4 | 0 | 0 | 4282
Letter Sounds - Spring | 29 | 20 | 12 | 5 | 0 | 3819
Picture Naming - Fall | 29 | 24 | 19 | 12 | 2 | 4489
Picture Naming - Winter | 33 | 29 | 25 | 19 | 12 | 4340
Picture Naming - Spring | 36 | 32 | 28 | 24 | 19 | 3877
Alliteration - Fall | 7 | 4 | 0 | NA | NA | 4494
Alliteration - Winter | 12 | 8 | 4 | 0 | NA | 4280
Alliteration - Spring | 16 | 12 | 8 | 4 | NA | 3868
Letter Naming - Fall | 28 | 15 | 4 | 1 | 0 | 4427
Letter Naming - Winter | 40 | 29 | 17 | 5 | 1 | 4349
Letter Naming - Spring | 45 | 35 | 25 | 14 | 6 | 3868

As these data indicate, there may be significant floor effect issues with the IGDI assessments for this population. This means that the assessments do not accurately measure the skills of some students because these students are not yet able to demonstrate the skills expected on the assessment. Assessments for which a floor effect exists may not fully capture growth that is occurring for students with low skills. The table below provides the percentage of students in fall and spring who received a score of NA on each IGDI assessment, or a score of zero on Letter Naming and Letter Sound Fluency.
Table 18: Floor Effect Issues with Pre-Kindergarten Assessments - Percentage of Students Not Completing Sample IGDIs Items or scoring zero on Letter Naming/Sound Fluency By Measure and Season (All PreK Students) Fall Spring Rhyming 34.42% 9.17% Letter Naming 8.02% 2.50% Picture Naming 7.16% 0.63% Alliteration 43.64% 14.39% Letter Sounds 16.25% 4.89% The number of children who demonstrated growth across the school year from fall to spring on each assessment measure is reported below. All students included in this analysis had a valid birthdate and a valid score for at least one measure in both fall and spring assessment windows. Table 19: Pre-Kindergarten Student Growth 3 Year Olds 4 Year Olds Number Percent Number Percent Showed Growth on 1 or More Measures 780 96.77% 3874 99.18% Showed Growth on 2 or More Measures 617 76.55% 3746 95.90% Showed Growth on 3 or More Measures 434 53.85% 3632 92.99% Showed Growth on 4 or More Measures 134 16.63% 3262 83.51% Showed Growth on 5 Measures 71 8.81% 2270 58.12% Total Students 806 3906 Additional analysis was completed for the 2011-2012 school year to evaluate the effects of length of instructional program on student growth as measured by the IGDIs. Average fall to spring growth for students who were 4 years old on September 1, 2011 is reported below for students in programs meeting 2-9 hours per week, 10-15 hours per week, and 16+ hours per week. Table 20: IGDI Fall-Spring Growth by Hours per Week in Core Instruction: 4-Year-Olds 2-9 Hrs/Week 10-15 Hrs/Week 16+ Hrs/Week Unreported Grand Total Rhyming Growth* Average 7.81 8.51 7.98 6.24 8.16 Standard Deviation 6.16 6.72 5.85 4.88 6.40 N 1139 2020 548 108 3815 Letter Naming Growth Average 14.82 17.03 18.09 15.45 16.48 MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 46
Standard Deviation 10.34 11.60 11.53 11.21 11.28 N 1146 2059 561 115 3881 Picture Naming Growth Average 8.36 11.16 9.13 8.67 9.96 Standard Deviation 8.03 8.30 7.36 7.27 8.16 N 1137 2011 550 108 3806 Alliteration Growth* Average 5.54 6.38 6.21 5.07 6.07 Standard Deviation 5.61 5.96 5.62 4.47 5.78 N 1139 2007 547 108 3801 Letter Sounds Growth* Average 10.03 10.79 12.23 10.20 10.75 Standard Deviation 9.63 8.89 9.54 9.98 9.27 N 1130 1983 547 108 3768 * - Significant floor effects may have affected the results for these measures For comparison purposes, the same data are provided below for all Pre-K enrolled students, regardless of age. Table 21: IGDI Fall-Spring Growth by Hours per Week in Core Instruction: All Pre-Kindergarten Students 2-9 Hrs/Week 10-15 Hrs/Week 16+ Hrs/Week Unreported Grand Total Rhyming Growth* Average 7.12 8.08 7.73 5.84 7.64 Standard Deviation 6.16 6.68 5.75 4.82 6.35 N 1574 2321 713 150 4758 Letter Naming Growth Average 14.16 16.34 17.21 13.82 15.69 Standard Deviation 10.40 11.77 11.72 11.13 11.39 N 1385 2237 633 148 4403 Picture Naming Growth Average 8.48 11.00 9.30 8.77 9.84 Standard Deviation 8.04 8.17 7.17 7.60 8.05 N 1579 2313 715 150 4757 Alliteration Growth* Average 4.90 6.03 5.69 4.54 5.56 Standard Deviation 5.51 5.87 5.50 4.60 5.69 N 1562 2306 711 150 4729 Letter Sounds Growth* Average 9.52 10.37 11.58 9.23 10.24 Standard Deviation 9.40 8.98 9.51 9.98 9.24 N 1325 2143 609 137 4214 * - Significant floor effects may have affected the results for these measure MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 47
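The per-student growth counts in Table 19 above can be reproduced by comparing each child's fall and spring scores measure by measure and tallying the number of measures with any gain. A minimal sketch, assuming wide-format columns named <measure>_fall and <measure>_spring (hypothetical field names, not the evaluation's actual ones):

```python
import pandas as pd

measures = ["rhyming", "picture_naming", "alliteration", "letter_naming", "letter_sounds"]

# Illustrative fall/spring scores for three students; real data would have one row per child.
df = pd.DataFrame({
    "rhyming_fall": [2, 5, 0], "rhyming_spring": [9, 5, 4],
    "picture_naming_fall": [14, 20, 8], "picture_naming_spring": [22, 27, 8],
    "alliteration_fall": [0, 3, 1], "alliteration_spring": [6, 3, 5],
    "letter_naming_fall": [4, 10, 0], "letter_naming_spring": [19, 25, 12],
    "letter_sounds_fall": [0, 6, 0], "letter_sounds_spring": [7, 6, 3],
})

# Count, per student, how many measures show any fall-to-spring gain.
grew = sum((df[f"{m}_spring"] > df[f"{m}_fall"]) for m in measures)

# Percent of students showing growth on at least k measures (cf. Table 19).
for k in range(1, 6):
    print(f"Growth on {k} or more measures: {(grew >= k).mean():.1%}")
```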
Supplemental Intervention in MRC Pre-K Program During the 2011-2012 school year, some pre-k students in each classroom were identified to receive supplemental intervention support in addition to the core instructional program. These additional small group interactions within the classroom were designed to provide additional opportunities to practice important early literacy skills. In the analysis reported below, student participation rates in supplemental interventions are provided, including number of students participating, and the average and standard deviation of the total number of minutes of supplemental intervention received during the school year. Table 22: Participation in Pre-K Supplemental Interventions: Number of Students and Number of Minutes per Week Four Year-Olds only All PreK Students Average Total Minutes 110.27 108.28 Standard Deviation of Total Minutes 113.59 115.29 Number of Participants 2730 3075 IGDI Growth Data were further analyzed to explore differences in IGDI growth by participation in supplemental interventions. Growth statistics are disaggregated by whether the student participated in the intervention ( None if no participation) and the total number of minutes of intervention received over the school year (1-99 minutes, 100-199 minutes, 200+ minutes) in the tables below. Results are reported first for students who were 4 years old on September 1, 2011, then for all pre-k participating students. Table 23: IGDI Fall-Spring Growth by Participation in Supplemental Instruction: 4-Year-Olds 1-99 Minutes 100-199 Minutes 200+ Minutes None Grand Total Rhyming Growth* Average 8.11 8.48 8.20 7.79 8.16 Standard Deviation 6.53 6.57 6.15 6.10 6.40 N 1603 886 744 582 3815 Letter Naming Growth Average 16.81 16.96 16.07 15.39 16.48 Standard Deviation 11.89 11.41 10.40 10.29 11.28 MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 48
N 1633 902 761 585 3881 Picture Naming Growth Average 8.77 9.48 10.97 12.65 9.96 Standard Deviation 8.43 7.85 7.68 7.69 8.16 N 1596 887 743 580 3806 Alliteration Growth* Average 6.98 5.93 5.29 4.75 6.07 Standard Deviation 6.12 5.72 5.35 4.99 5.78 N 1594 887 741 579 3801 Letter Sounds Growth* Average 12.67 10.94 8.98 7.49 10.75 Standard Deviation 10.55 8.41 7.74 6.93 9.27 N 1578 880 737 573 3768 * - Significant floor effects may have affected the results for these measures Table 24: IGDI Fall-Spring Growth by Participation in Supplemental Instruction: All Pre-Kindergarten Students 1-99 Minutes 100-199 Minutes 200+ Minutes None Grand Total Rhyming Growth* Average 7.23 8.18 8.11 7.68 7.64 Standard Deviation 6.37 6.57 6.17 6.02 6.35 N 2334 1011 789 624 4758 Letter Naming Growth Average 15.40 16.48 15.83 15.20 15.69 Standard Deviation 12.06 11.39 10.37 10.29 11.39 N 1992 992 801 618 4403 Picture Naming Growth Average 8.94 9.41 10.96 12.47 9.84 Standard Deviation 8.15 7.86 7.70 7.68 8.05 N 2333 1013 789 622 4757 Alliteration Growth* Average 5.86 5.63 5.25 4.70 5.56 Standard Deviation 5.95 5.68 5.34 4.98 5.69 N 2315 1006 787 621 4729 Letter Sounds Growth* Average 11.54 10.60 8.83 7.47 10.24 Standard Deviation 10.55 8.38 7.68 6.89 9.24 N 1884 952 774 604 4214 * - Significant floor effects may have affected the results for these measures MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 49
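Tables 23 and 24 disaggregate fall-to-spring growth by the total minutes of supplemental intervention a student received. A minimal sketch of that disaggregation, assuming a frame with one row per student holding a growth score and total intervention minutes (the column names and values below are illustrative):

```python
import pandas as pd

df = pd.DataFrame({
    "letter_sounds_growth": [12, 4, 9, 15, 7, 11],
    "supp_minutes": [0, 250, 45, 80, 160, 0],   # total minutes of supplemental intervention
})

# Same bands used in the tables above: None, 1-99, 100-199, 200+ minutes.
bands = pd.cut(df["supp_minutes"],
               bins=[-1, 0, 99, 199, float("inf")],
               labels=["None", "1-99 Minutes", "100-199 Minutes", "200+ Minutes"])

# Average, standard deviation, and N of growth within each band.
summary = df.groupby(bands, observed=False)["letter_sounds_growth"].agg(["mean", "std", "count"])
print(summary.round(2))
```

Grouping on explicit bands rather than raw minutes keeps the output directly comparable to the published tables.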
A cross-cohort analysis of performance by 4-year-old students has been compiled across eight years of the Minnesota Reading Corps program. The following figure shows the percent of 4-year-old MRC participants meeting the assessments' spring target scores at the fall, winter, and spring assessment times across years. It is noted that the 2006-2007 data were analyzed by an outside agency and only include students enrolled in Head Start MRC classrooms. Across successive school years, fairly stable fall performance is noted, along with an overall increase in the percent of students meeting spring target scores by the spring benchmark window. Data from the current year show an increase in performance relative to last year on 5 of 5 measures, and the highest percentage above target by spring to date on 5 of 5 measures.

Figure 5: Cross-Cohort Percent Above Target on Early Literacy Measures

Year | Rhyming | Picture Naming | Alliteration | Letter Sounds | Letter Names
2003-2004 | 33.30% | 44.70% | 26.80% | 20.30% | 38.90%
2004-2005 | 39.40% | 57.10% | 35.60% | 32.10% | 44.30%
2005-2006 | 24.00% | 35.00% | 22.00% | 32.00% | 44.00%
2006-2007 | 44.80% | 50.90% | 47.30% | 39.40% | 49.80%
2007-2008 | 44.00% | 58.90% | 43.40% | 49.80% | 63.30%
2008-2009 | 42.50% | 54.70% | 35.02% | 44.90% | 62.38%
2009-2010 | 47.99% | 58.72% | 38.90% | 44.80% | 65.46%
2010-2011 | 55.18% | 70.51% | 48.24% | 56.79% | 72.83%
2011-2012 | 60.02% | 72.09% | 52.94% | 65.78% | 78.10%
Analysis was completed to determine the percent of pre-K students with fall scores below target who moved to at or above target on each measure by spring. For each measure, students were included if they were age 3-5 on September 1, 2011, and if they had in-window scores recorded in both fall and spring for that measure. Current year data demonstrate an increase across all five measures in the percent of students moving from below to at or above target relative to the 2010-2011 school year. The table below summarizes the findings.

Table 25: Percent of Pre-K Students Moving from Below to At or Above Target

Measure | Number of Pre-K Students with Fall Scores Below Target and a Spring Score | Percent of Pre-K Students with Fall Scores Below Target and Spring Scores At or Above Target
Rhyming | 4,025 | 49.24%
Letter Naming Fluency | 2,822 | 69.49%
Picture Naming | 3,681 | 63.38%
Alliteration | 3,998 | 44.82%
Letter Sound Fluency | 3,295 | 60.00%

Classroom Outcomes for Pre-K Sites

Two times per year, Master or Internal Coaches completed a systematic observation in each classroom using the Early Language and Literacy Classroom Observation (ELLCO) tool. This observation tool is designed to address the classroom environmental factors that support language and literacy development for pre-kindergarten students. The table below illustrates fall and spring performance for the pre-kindergarten classrooms assessed using this measure. The total possible score is 95 points.

Table 26: ELLCO Performance in the Fall and Spring

Percentile | Fall ELLCO Total Score | Spring ELLCO Total Score
90th | 89.3 | 94.5
75th | 85.6 | 91
50th | 78 | 82
25th | 68 | 74
10th | 62.07 | 65
Kindergarten-Grade 3 Student Performance

The four assessment tools utilized for the K-3 Reading Corps program are listed below. A description of the technical characteristics of these assessments is provided above. For each assessment tool, a target score was identified as the goal for the beginning, middle, and end of the year. These target scores were based on research conducted at the St. Croix River Education District, which documented the predictive and concurrent validity of these measures with the Minnesota Comprehensive Assessment II. As a result of the strong correlations between performance on the selected fluency measures and performance on the statewide reading assessment, a series of cut scores has been identified. The table below specifies the assessments given at each grade level and the cut scores for each assessment at several points throughout the school year. These cut scores, or target scores, define levels of performance on the fluency measures that strongly predict future success on the grade 3 statewide reading assessment. For example, a student who reads a Grade 1 passage in the winter of first grade at a rate of 22 words read correctly per minute has a 75% chance of earning a score in the meets standards or exceeds standards ranges on the 3rd grade statewide reading assessment two years later.

Grade | Measure | Fall Target | Winter Target | Spring Target
K | Letter Sound Fluency | 10 | 21 | 41
1 | Nonsense Word Fluency | 32 | 52 | N/A
1 | Oral Reading Fluency | N/A | 22 | 52
2 | Oral Reading Fluency | 43 | 72 | 90
3 | Oral Reading Fluency | 70 | 91 | 109

The target scores listed above, which grow across years from age 3 to grade 3, define a pathway to success. By considering the growth that would occur for a child who met each of these targets, an expected growth rate at each grade level can be defined. For example, the fall grade 2 target score on oral reading fluency is 43, and the spring grade 2 target score on this measure is 90. To grow from 43 to 90 in one academic year, a student would need to gain 1.31 words correct per minute per week on the oral reading fluency assessment. Thus, 1.31 words of growth per week becomes the expectation for grade 2 growth rates. Because our targets are connected to the statewide assessment rather than to the normative performance of other
students in local districts, we have a consistent and meaningful comparison across the state.

Students participating in the Minnesota Reading Corps program are monitored frequently. The primary purpose of this progress monitoring is to enable those providing support to the student to evaluate the effectiveness of current reading instruction and to make data-based decisions regarding changes in instruction. For the purposes of outcomes evaluation, the progress monitoring data also provide a means for comparing the rate of growth of participating Minnesota Reading Corps students to the expected grade-level growth rate. Students are selected for participation in the Reading Corps program because they are identified as having below grade-level skills in reading. Students who achieve growth rates higher than those indicated by our targets are catching up to grade-level expectations by making more than one year's growth in one year's time. For the state-wide Minnesota Reading Corps project, one measure of our success is the extent to which our participating students achieve this primary goal.

The table below compares weekly growth rate expectations with the average weekly growth rates, on program assessment measures, of children participating in K-3 Reading Corps programs who had at least 3 data points collected per measure. The current analysis includes all data collected between 9/1/2011 and 6/30/2012. At all grade levels, the average growth rate of Minnesota Reading Corps participants exceeded the target growth rate. Note that in Grade 1, while mean growth of participants in the second half of the year, on Oral Reading Fluency, was slightly below target growth, the mean growth in the first half of the year, on Nonsense Word Fluency, exceeded target growth. In other words, the average growth rate for Minnesota Reading Corps participants exceeded a rate of one year's growth in one year's time. This demonstrates that participating students are actually catching up to grade-level expectations.
Table 27: Kindergarten - Grade 3 Participant Growth

 | Grade K (Letter Sound Fluency) | Grade 1 (Nonsense Word Fluency) | Grade 1 (R-CBM)** | Grade 2 (R-CBM) | Grade 3 (R-CBM)
MRC Mean Growth Rate | 2.04 | 2.03 | 1.60 | 1.54 | 1.35
Target Growth Rate | 1.15 | 1.11 | 1.67 | 1.31 | 1.08
Number of Students | 4304 | 3958 | 2682 | 3683 | 4173

* Only students with 3 or more data points on the given measure were included in growth rate calculations
** Students in this group may have also participated in Grade 1 NWF

Table 28: Average Linear Growth Rates, by Region

Region | Grade K (LSF) | Grade 1 (NWF) | Grade 1 (R-CBM) | Grade 2 (R-CBM) | Grade 3 (R-CBM)
North East | 1.56 | 2.37 | 1.52 | 1.61 | 1.36
North Central | 2.74 | 2.23 | 1.86 | 1.73 | 1.62
Metro | 2.01 | 1.97 | 1.52 | 1.48 | 1.31
North West | 2.19 | 2.15 | 1.80 | 1.82 | 1.48
South East | 1.92 | 2.04 | 1.78 | 1.50 | 1.33
South West | 2.10 | 2.18 | 1.64 | 1.63 | 1.47
Central | 2.41 | 2.03 | 1.85 | 1.72 | 1.43
TOTAL | 2.04 | 2.03 | 1.60 | 1.54 | 1.35

Average linear growth is reported above and is used throughout the evaluation. This provides an easily interpretable metric that can be used for group comparisons and can be readily compared to the growth rates implied by benchmark target scores. Research suggests, however, that growth on many general outcome measures is typically non-linear across a given school year. The following graphs are provided to allow closer examination of this non-linear growth pattern, as well as visual examination of differences in these patterns across regions. Figures 6 through 10 depict the growth curve estimates for participating MRC students. These visual displays more accurately capture the quadratic nature of growth on the assessment measures. On each chart, the bold black lines represent target scores at each season of the school year, the solid red line represents the aggregate growth curve for all students, and the remaining lines represent the growth curves for students from each region.
Figure 6: Grade K: Letter Sound Fluency Growth Curve Estimates by Region
(Line graph of letter sounds correct per minute across Weeks 1-35, with curves for NE, NC, Metro, NW, SE, SW, Central, TOTAL, and TARGET.)
Figure 7: Grade 1: Nonsense Word Fluency Growth Curve Estimates by Region
(Line graph of correct sounds per minute across Weeks 1-35, with curves for NE, NC, Metro, NW, SE, SW, Central, TOTAL, and TARGET.)
Figure 8: Grade 1: Oral Reading Fluency Growth Curve Estimates by Region
(Line graph of words correct per minute across Weeks 1-35, with curves for NE, NC, Metro, NW, SE, SW, Central, TOTAL, and TARGET.)
Figure 9: Grade 2: Oral Reading Fluency Growth Curve Estimates by Region
(Line graph of words correct per minute across Weeks 1-35, with curves for NE, NC, Metro, NW, SE, SW, Central, TOTAL, and TARGET.)
Figure 10: Grade 3: Oral Reading Fluency Growth Curve Estimates by Region
(Line graph of words correct per minute across Weeks 1-35, with curves for NE, NC, Metro, NW, SE, SW, Central, TOTAL, and TARGET.)
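The linear growth rates in Tables 27 and 28, the target growth rates derived from the benchmark scores, and the quadratic curves in Figures 6 through 10 can all be estimated with ordinary least squares. A minimal sketch for one student, assuming roughly 36 instructional weeks between the fall and spring benchmarks and using illustrative weekly scores (neither assumption is drawn from the program's actual data files):

```python
import numpy as np

# Grade 2 oral reading fluency targets: fall 43, spring 90, ~36 instructional weeks apart.
target_rate = (90 - 43) / 36          # ~1.31 words correct per minute per week

# One student's weekly progress-monitoring scores (week number, words correct per minute).
weeks  = np.array([1, 3, 5, 7, 9, 11, 13, 15])
scores = np.array([41, 44, 46, 51, 53, 58, 60, 66])

# Linear growth rate: slope of the least-squares line (requires at least 3 data points).
slope = np.polyfit(weeks, scores, 1)[0]

# Quadratic growth-curve estimate, analogous to the regional curves in the figures above.
quad = np.polyfit(weeks, scores, 2)

print(f"target rate {target_rate:.2f}, student slope {slope:.2f}, "
      f"above target: {slope > target_rate}")
print("quadratic coefficients (a, b, c):", np.round(quad, 3))
```

A student whose fitted slope exceeds the target rate is making more than one year's growth in one year's time, which is the criterion summarized by region in the table that follows.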
The table below examines K-3 students who have at least 3 data points collected per measure, represented by Total Number of Students. The table presents the percentage of these students whose individual growth rates exceeded the target growth rates for that grade level and measure. This calculation represents the portion of the population participating in a Reading Corps intervention that had growth rates in excess of one year of growth in one year s time. Percentages are given by region, and overall. Table 29: Kindergarten - Grade 3 Percentage of Students Above Growth Targets by Region Region Grade K (LSF) Grade 1 (NWF) Grade 1 (R-CBM)** Grade 2 (R-CBM) Grade 3 (R-CBM) TOTAL * % Above Target 84.21% 86.42% 39.17% 66.24% 65.66% 67.19% North East Total Number of Students 38 162 120 157 166 643 % Above Target 92.00% 81.01% 55.74% 70.93% 84.35% 77.24% North Central Total Number of Students 50 79 61 86 115 391 % Above Target 86.34% 79.46% 42.61% 64.07% 66.85% 69.88% Metro Total Number of Students 2878 2210 1718 2332 2700 11838 % Above Target 94.51% 83.11% 53.77% 84.95% 76.06% 80.17% North West Total Number of Students 164 148 106 186 213 817 % Above Target 91.30% 82.11% 53.68% 79.93% 75.90% 76.77% South East Total Number of Students 214 346 247 303 280 1390 % Above Target 84.11% 79.77% 52.63% 70.30% 64.64% 70.50% South West Total Number of Students 184 218 190 274 361 1227 % Above Target 93.84% 87.80% 57.50% 80.58% 73.08% 79.05% Central Total Number of Students 276 295 240 345 338 1494 TOTAL % Above Target 87.46% 80.97% 46.26% 68.69% 68.68% 71.77% Total Number of Students 3804 3458 2682 3683 4173 17800 * TOTAL represents the total number of slopes analyzed, not the total number of students, as students in Grade 1 may have participated in two categories ** Students in this group may have also participated in Grade 1 (NWF) Regarding student performance during the 2011-2012 school year, data were examined to determine the typical number of weeks that a successful MRC program participant could expect to receive tutoring sessions before MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 60
graduating out of the program. The results of this analysis are displayed below, overall and by region for both pre-k and k-3 populations. Table 30: Average Tutoring Participation by Grade Level Number of Students Average Sessions per Week Average Total Minutes Average Minutes per Week Average Average Total Grade Weeks Sessions Minutes PreK 2 2 14.00 28.00 2.39 209.50 15.68 419 PreK 3 175 7.35 18.12 2.17 99.15 12.08 17352 PreK 4 2731 9.84 26.05 2.59 142.91 14.19 390284 PreK 5 86 10.29 23.20 2.17 149.28 13.84 12838 PreK Unknown 81 N/A N/A N/A N/A N/A N/A PreK Total 3075 9.65 25.40 2.55 140.05 14.07 430649 K 4189 9.22 46.43 4.84 913.33 94.61 3825939 1 4388 13.39 47.46 3.50 925.79 68.14 4062362 2 3574 17.63 63.24 3.56 1242.29 69.75 4439956 3 4008 16.14 56.27 3.46 1102.06 67.66 4419221 TOTAL 19234 13.24 48.49 3.64 893.00 65.46 17175962 Table 31: Average Tutoring Participation for K-3 by Grade Level and Region Number of Students Average Weeks Average Sessions Average Sessions per Week Average Total Minutes Average Minutes per Week Total Minutes Grade North East K 50 8.92 28.90 3.20 544.70 59.16 27235 1 213 13.26 47.68 3.58 912.16 67.84 194290 2 155 19.06 67.46 3.51 1302.37 67.42 201867 3 161 16.79 59.19 3.48 1142.17 66.80 183890 TOTAL 579 15.42 54.55 3.50 1048.85 66.69 607282 North Central K 32 9.16 29.94 3.25 597.81 65.21 19130 1 90 13.36 44.84 3.25 832.33 59.56 74910 2 72 16.85 55.21 3.20 1092.01 63.28 78625 3 102 14.15 45.48 3.14 893.53 62.01 91140 TOTAL 296 14.02 45.97 3.20 891.23 61.92 263805 Metro K 3155 9.70 53.24 5.33 1051.66 104.95 3317987 1 2757 13.90 49.27 3.51 960.56 68.26 2648254 2 2266 18.62 67.02 3.57 1314.26 69.83 2978124 3 2605 16.94 59.31 3.47 1159.07 67.58 3019367 MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 61
TOTAL 10783 14.40 56.59 4.04 1103.54 78.77 11899471 North West K 177 6.36 21.69 3.44 418.84 66.86 74135 1 185 11.77 41.62 3.52 807.45 68.42 149378 2 174 13.24 47.17 3.54 926.57 69.51 161224 3 200 12.31 42.42 3.43 827.08 67.22 165416 TOTAL 736 10.96 38.35 3.48 747.49 67.98 550153 South East K 234 8.72 29.79 3.38 573.46 64.93 134190 1 419 13.75 49.79 3.60 982.88 71.00 411828 2 279 17.88 66.11 3.73 1315.35 74.06 366983 3 260 15.55 53.73 3.44 1069.58 69.07 278090 TOTAL 1192 14.12 50.54 3.55 999.24 70.10 1191091 South West K 206 7.67 26.70 3.64 507.55 64.28 104556 1 283 12.51 44.14 3.50 872.30 68.56 246861 2 256 17.12 60.17 3.50 1190.76 68.89 304834 3 326 15.81 55.22 3.51 1095.80 69.00 357230 TOTAL 1071 13.68 47.99 3.53 946.30 67.95 1013487 Central K 323 7.61 23.56 3.11 448.89 59.34 144993 1 415 11.38 39.74 3.38 779.50 66.13 323491 2 339 13.74 49.03 3.51 966.47 69.19 327634 3 331 13.55 47.00 3.48 922.56 68.57 305367 TOTAL 1408 11.59 39.97 3.37 782.30 65.88 1101485 Starting in the 2010-2011 school year, exit criteria were refined for the K-3 program to require that students have both 3-5 consecutive data points above the aim line on their progress monitoring graphs, and that they have at least one data point above the nearest upcoming benchmark target. It was anticipated that this additional exit requirement would result in longer length of service to students in general prior to exiting, and that those who successfully exited the program would also meet spring target expectations and ultimately meet standards on the grade 3 statewide reading assessment in greater numbers. The following tables provide data related to results of this change in exit criteria. In the tables below, the percent of K-3 MRC participating students with at least 3 progress monitor data points who graduated from tutoring (exit) during the 2011-2012 year is reported, first across the state, and then by region. Members are asked to report student exit status in two locations: on individual student progress monitor graphs kept within the AIMSweb data system, and within the MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 62
OnCorps data management system. For this and the following analyses involving student exit status, status was based on reporting within the OnCorps system. A 3 data point minimum was selected in order to remove from the analysis students who did not have at least a reasonable minimum length of service prior to discontinuation.

Table 32: Percentage of Students with 3 Weeks of Progress Monitoring Data Who Exit

Grade | Number of Students | Number Exited | Percent Exited
K | 4189 | 2895 | 69.11%
1 | 4388 | 2261 | 51.53%
2 | 3574 | 1448 | 40.51%
3 | 4008 | 1843 | 45.98%
K-3 Total | 16159 | 8447 | 52.27%
TOTAL | 19234 | 8533 | 44.36%

Table 33: Percentage of Students with 3 Weeks of Progress Monitoring Data Who Exit by Region

Region | Grade | Number of Students | Number Exited | Percent Exited
North East | K | 50 | 25 | 50.00%
North East | 1 | 213 | 102 | 47.89%
North East | 2 | 155 | 51 | 32.90%
North East | 3 | 161 | 73 | 45.34%
North East | TOTAL | 579 | 251 | 43.35%
North Central | K | 32 | 19 | 59.38%
North Central | 1 | 90 | 41 | 45.56%
North Central | 2 | 72 | 24 | 33.33%
North Central | 3 | 102 | 54 | 52.94%
North Central | TOTAL | 296 | 138 | 46.62%
Metro | K | 3155 | 2204 | 69.86%
Metro | 1 | 2757 | 1393 | 50.53%
Metro | 2 | 2266 | 851 | 37.56%
Metro | 3 | 2605 | 1156 | 44.38%
Metro | TOTAL | 10783 | 5604 | 51.97%
North West | K | 177 | 129 | 72.88%
North West | 1 | 185 | 110 | 59.46%
North West | 2 | 174 | 101 | 58.05%
North West | 3 | 200 | 117 | 58.50%
North West | TOTAL | 736 | 457 | 62.09%
South East | K | 234 | 165 | 70.51%
South East | 1 | 419 | 242 | 57.76%
South East | 2 | 279 | 130 | 46.59%
South East | 3 | 260 | 130 | 50.00%
South East | TOTAL | 1192 | 667 | 55.96%
South West | K | 206 | 153 | 74.27%
South West | 1 | 283 | 153 | 54.06%
South West | 2 | 256 | 118 | 46.09%
South West | 3 | 326 | 156 | 47.85%
South West | TOTAL | 1071 | 580 | 54.15%
Central | K | 323 | 253 | 78.33%
Central | 1 | 415 | 237 | 57.11%
Central | 2 | 339 | 186 | 54.87%
Central | 3 | 331 | 182 | 54.98%
Central | TOTAL | 1408 | 858 | 60.94%

The figure below captures the change in the percentage of students who successfully exited the program between the 2009-2010 and 2011-2012 school years. Interestingly, the percentage of students exited returned to levels at or above those of the 2009-2010 school year even though the more stringent exit criteria that were implemented in 2010-2011 were employed again in 2011-2012.
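Before turning to that comparison, the refined exit rule described earlier (3-5 consecutive progress-monitoring data points above the aim line, plus at least one data point above the nearest upcoming benchmark target) can be expressed as a simple check. The sketch below is illustrative only; the consecutive-point threshold of 3 and all score values are assumptions, not values taken from program data:

```python
def meets_exit_criteria(scores, aimline, next_benchmark, consecutive_needed=3):
    """True if the student has `consecutive_needed` consecutive scores above the
    aim line AND at least one score above the next upcoming benchmark target."""
    run = best_run = 0
    for score, aim in zip(scores, aimline):
        run = run + 1 if score > aim else 0   # extend or reset the consecutive run
        best_run = max(best_run, run)
    return best_run >= consecutive_needed and any(s > next_benchmark for s in scores)

# Illustrative grade 2 R-CBM data: weekly scores, corresponding aim-line values,
# and the upcoming winter benchmark target of 72.
scores  = [58, 63, 66, 70, 74, 75]
aimline = [55, 57, 59, 61, 63, 65]
print(meets_exit_criteria(scores, aimline, next_benchmark=72))   # True
```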
Figure 11: Percentage of Students Successfully Exited: 2009-2010 to 2011-2012 Comparison

Year | Kindergarten | Grade 1 | Grade 2 | Grade 3 | Total
2009-2010 | 62.54% | 52.48% | 43.83% | 46.72% | 50.16%
2010-2011 | 61.90% | 46.65% | 37.38% | 43.58% | 46.66%
2011-2012 | 69.11% | 51.53% | 40.51% | 45.98% | 52.27%

Further examination of exit rates for participating students was conducted by considering initial performance level in the fall of the 2011-2012 school year. For this analysis, students were divided into two groups based on their fall benchmark assessment level: those beginning the school year somewhat below grade-level standards (Tier 2), and those beginning the school year significantly below grade-level standards (Tier 3). Category assignment was defined by target scores that reference success on the grade 3 statewide reading assessment. Students in the somewhat below (Tier 2) category were those expected to have between a 25% and 75% chance of meeting standards on the grade 3 statewide reading assessment. Students in the significantly below (Tier 3) category were those expected to have less than a 25% chance of meeting standards on the statewide reading assessment based on their current performance on the assessment measures. Exit rates for these two groups of students are summarized in the table below. This is an important question as it relates to the level of students for whom the MRC intervention program is most effective.
Table 34: Percentage Who Exit: Fall Benchmark Tier 2 vs. Tier 3 Grade Number Fall Tier 2 Percent Tier 2 Exited Number Fall Tier 3 Percent Tier 3 Exited K 421 76.96% 1 2105 71.35% 271 40.96% 2 1934 51.76% 123 12.20% 3 2322 62.06% 79 15.19% K-3 Total 6937 61.53% 473 29.18% Table 35: Percentage Who Exit: Fall Benchmark Tier 2 vs. Tier 3 by Region Region Grade Number Fall Tier 2 Percent Tier 2 Exited Number Fall Tier 3 Percent Tier 3 Exited North East K 5 60.00% North East 1 111 70.27% 12 16.67% North East 2 85 44.71% 4 0.00% North East 3 100 62.00% 1 0.00% North East TOTAL 306 59.15% 17 11.76% North Central K 4 75.00% North Central 1 47 68.09% 3 100.00% North Central 2 47 36.17% 2 0.00% North Central 3 70 64.29% 1 0.00% North Central TOTAL 168 57.74% 6 50.00% Metro K 332 78.61% Metro 1 1292 69.81% 198 40.91% Metro 2 1174 47.61% 105 12.38% Metro 3 1452 59.09% 51 13.73% Metro TOTAL 4348 59.34% 354 28.53% North West K 13 69.23% North West 1 104 71.15% 7 57.14% North West 2 120 71.67% 3 33.33% North West 3 133 71.43% 7 42.86% North West TOTAL 376 70.21% 17 47.06% South East K 10 50.00% South East 1 214 76.17% 20 15.00% South East 2 177 53.11% 1 0.00% South East 3 166 58.43% 8 12.50% South East TOTAL 576 62.33% 29 13.79% South West K 15 53.33% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 66
South West 1 126 72.22% 15 46.67%
South West 2 140 50.71% 3 0.00%
South West 3 180 62.78% 4 25.00%
South West TOTAL 472 59.96% 22 36.36%
Central K 42 83.33%
Central 1 201 77.11% 16 68.75%
Central 2 173 71.68% 3 33.33%
Central 3 223 69.06% 4 0.00%
Central TOTAL 639 73.24% 23 52.17%

In keeping with program design, the vast majority of participating students in the Minnesota Reading Corps program have fall assessment scores that fall in the Tier 2 range. Given the large difference in the number of students in the Tier 2 and Tier 3 fall performance groups, comparisons of outcomes between the groups are difficult to make. However, the much lower success rate for the small group falling below the Tier 2 range, compared to those falling within the Tier 2 range, is noted. The following figure displays the total percentage of students with exit information who successfully exited the MRC program, based on their Tier designation at the time of the fall benchmark.

Figure 12: Percent of Students in Each Tier Who Exited Successfully
(Bar chart of percent exited: Tier 2, 61.53%; Tier 3, 29.18%.)
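The Tier 2/Tier 3 split above is driven by fall benchmark cut scores that are tied to the probability of passing the grade 3 statewide assessment. A minimal sketch of that classification logic; the lower cut point below is an illustrative placeholder rather than a published program cut score, and the on-track value simply reuses the fall grade 2 target from the earlier table:

```python
# Hypothetical fall cut scores for one measure: below the lower cut ~ <25% chance of
# passing (Tier 3); between the cuts ~ 25-75% chance (Tier 2); at/above the upper
# cut the student is on track and would not be selected for MRC tutoring.
FALL_CUTS = {"grade2_rcbm": (20, 43)}   # (tier3_below, on_track_at_or_above) -- illustrative

def assign_tier(measure, score):
    tier3_below, on_track = FALL_CUTS[measure]
    if score < tier3_below:
        return "Tier 3"     # significantly below grade-level standards
    if score < on_track:
        return "Tier 2"     # somewhat below grade-level standards
    return "On track"       # not selected for MRC tutoring

print(assign_tier("grade2_rcbm", 15),
      assign_tier("grade2_rcbm", 30),
      assign_tier("grade2_rcbm", 50))
```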
During the 2011-2012 school year, MRC members continued to collect benchmark data three times per year for all students who had previously participated in the program. This additional data collection allows an analysis of the percentage of students who exit the MRC program who also meet or exceed the spring benchmark scores at the end of the year. This is an important analysis as it allows for study of the current exit criteria for the program. Results are summarized in the tables below. As can be seen in the tables, across the state there is a reported range of 63%- 76% of students who exit the program and also meet spring benchmark targets. Overall, just over 69% of students who exited the program prior to the end of the school year exceeded the spring benchmark target across the state. Table 36: Percentage Who Exit Who Also Meet or Exceed Spring Benchmark Grade Number Exited Number Exited with Spring Benchmark Number Above Spring Benchmark Percent Above Spring Benchmark K 2895 2789 1861 66.73% 1 2261 2161 1641 75.94% 2 1448 1402 992 70.76% 3 1843 1756 1107 63.04% K-3 Total 8447 8108 5601 69.08% Table 37: Percentage Who Exit Who Also Meet or Exceed Spring Benchmark by Region Number Exited with Spring Benchmark Number Above Spring Benchmark Percent Above Spring Benchmark Region Grade Number Exited North East K 25 25 14 56.00% North East 1 102 101 81 80.20% North East 2 51 50 39 78.00% North East 3 73 70 49 70.00% North East TOTAL 251 246 183 74.39% North Central K 19 19 14 73.68% North Central 1 41 41 34 82.93% North Central 2 24 22 13 59.09% North Central 3 54 48 25 52.08% North Central TOTAL 138 130 86 66.15% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 68
Metro K 2204 2088 1410 67.53% Metro 1 1393 1332 1008 75.68% Metro 2 851 826 588 71.19% Metro 3 1156 1084 669 61.72% Metro TOTAL 5604 5330 3675 68.95% North West K 129 128 81 63.28% North West 1 110 108 74 68.52% North West 2 101 101 69 68.32% North West 3 117 117 77 65.81% North West TOTAL 457 454 301 66.30% South East K 165 155 88 56.77% South East 1 242 235 189 80.43% South East 2 130 114 83 72.81% South East 3 130 119 78 65.55% South East TOTAL 667 623 438 70.30% South West K 153 130 88 67.69% South West 1 153 145 113 77.93% South West 2 118 110 77 70.00% South West 3 156 149 102 68.46% South West TOTAL 580 534 380 71.16% Central K 253 237 163 68.78% Central 1 237 197 141 71.57% Central 2 186 173 121 69.94% Central 3 182 164 103 62.80% Central TOTAL 858 771 528 68.48% The figure below captures the change in percentage of students who successfully exited the program and met spring benchmark targets between the 2009-2010 and 2011-2012 school years. As was expected, an increase in the percentage of students meeting spring targets resulted from the increased stringency of the exit criteria. MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 69
Figure 13: Change in Percentage of Students Exiting and Meeting Spring Benchmark Targets

Year | Kindergarten | Grade 1 | Grade 2 | Grade 3 | Total
2009-2010 | 51.29% | 61.25% | 30.62% | 81.31% | 57.43%
2010-2011 | 65.41% | 75.09% | 71.74% | 63.56% | 68.92%
2011-2012 | 66.73% | 75.94% | 70.76% | 63.04% | 69.08%
Performance on the Statewide Reading Assessment For students participating in the MRC program as third graders during the 2011-2012 school year, as second graders during the 2010-2011 school year, as first graders during the 2009-2010 school year, and/or as kindergarteners during the 2008-2009 school year, analysis was completed to compare their outcome in the MRC program with their performance on the spring 2012 grade 3 statewide reading assessment. For most students in the K-3 MRC sample, this is the Minnesota Comprehensive Assessment: II of reading (MCAII). For students identified as English Language Learners, the statewide reading assessment given is the Test of Emerging Academic English (TEAE). Student scores on the grade 3 MCAII and TEAE are considered equivalent, and are reported in aggregate form throughout this report. The tables below summarize these data, first for 2011-2012 grade 3 participants, then for 2010-2011 grade 2 participants, then the 2009-2010 grade 1 participants, and finally the 2008-2009 kindergarten participants. Table 38: Outcomes of 2011-2012 Third Grade MRC Participating Students on Spring Grade 3 Statewide Reading Assessment District Total with MCA Results Total Pass MCA Percent Pass MCA Number Successful Exit or Spring Benchmark and have MCA Results Of Successful Exit or Benchmark, Percent Success MCA North East 160 100 62.50% 81 76.54% North Central 113 88 77.88% 68 82.35% Metro 2435 1471 60.41% 1061 75.12% North West 201 152 75.62% 126 81.75% South East 252 178 70.63% 136 81.62% South West 335 263 78.51% 181 85.08% Central 326 252 77.30% 190 85.26% TOTAL 3822 2504 65.52% 1843 78.40% The following figure provides comparison data on third grade student success on the statewide reading assessment in the 2009-2010 through 2011-2012 school years. In most regions and for the state in total, a greater percentage of students both exited the MRC program and demonstrated at least grade level performance on the statewide reading assessment in grade 3 in the 2011-2012 school year relative to 2009-2010. MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 71
Figure 14: Percent of Students with Successful Exit or Benchmark Score and Success on Statewide Reading Assessments, Grade 3: Comparison of 2009-2010 to 2011-2012

Year | North East | North Central | Metro | North West | South East | South West | Central | TOTAL
2009-2010 | 73.68% | 65.71% | 63.11% | 70.34% | 71.61% | 81.63% | 77.91% | 67.72%
2010-2011 | 75.95% | 63.08% | 77.34% | 79.66% | 82.98% | 88.98% | 83.45% | 78.86%
2011-2012 | 76.54% | 82.35% | 75.12% | 81.75% | 81.62% | 85.08% | 85.26% | 78.40%
Table 39: Outcomes of 2010-2011 Second Grade MRC Participating Students on Spring Grade 3 Statewide Reading Assessment District Total with MCA Results Total Pass MCA Percent Pass MCA Number Successful Exit or Spring Benchmark and have MCA Results Of Successful Exit or Benchmark, Percent Success MCA North East 139 91 65.47% 70 71.43% North Central 96 68 70.83% 47 82.98% Metro 1568 939 59.89% 672 74.85% North West 160 127 79.38% 92 86.96% South East 226 157 69.47% 107 79.44% South West 160 120 75.00% 63 87.30% Central 188 149 79.26% 101 87.13% TOTAL 2537 1651 65.08% 1152 78.13% Table 40: Outcomes of 2009-2010 First Grade MRC Participating Students on Spring Grade 3 Statewide Reading Assessment District Total with MCA Results Total Pass MCA Percent Pass MCA Number Successful Exit or Spring Benchmark and have MCA Results Of Successful Exit or Benchmark, Percent Success MCA North East 160 117 73.13% 94 86.17% North Central 63 51 80.95% 46 84.78% Metro 1233 745 60.42% 643 74.34% North West 184 150 81.52% 124 87.90% South East 347 270 77.81% 237 87.34% South West 107 81 75.70% 62 83.87% Central 349 266 76.22% 223 80.72% TOTAL 2443 1680 68.77% 1429 80.20% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 73
Table 41: Outcomes of 2008-2009 Kindergarten MRC Participating Students on Spring Grade 3 Statewide Reading Assessment District Total with MCA Results Total Pass MCA Percent Pass MCA Number Successful Exit or Spring Benchmark and have MCA Results Of Successful Exit or Benchmark, Percent Success MCA North East 75 47 62.67% 34 64.71% North Central 21 17 80.95% 17 82.35% Metro 364 229 62.91% 249 70.28% North West 53 44 83.02% 47 82.98% South East 98 67 68.37% 64 81.25% South West Central 108 82 75.93% 74 81.08% TOTAL 719 486 67.59% 485 74.64% Table 42: Total Number of Students Monitored in MRC Programs, with Third Grade Statewide Reading Assessment in 2011-2012 District Total with MCA Results Total Pass MCA Percent Pass MCA Number Successful Exit or Spring Benchmark in most recent year of participation and have MCA Results Of Successful Exit or Benchmark in most recent year or participation, Percent Success MCA North East 350 240 68.57% 195 77.95% North Central 216 166 76.85% 136 83.82% Metro 4032 2479 61.48% 1997 76.66% North West 430 343 79.77% 288 86.81% South East 651 492 75.58% 400 85.25% South West 475 370 77.89% 222 86.49% Central 740 571 77.16% 444 84.68% TOTAL 6894 4661 67.61% 3682 80.28% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 74
5. Systems Change Are the organizations with which the MRC is working changing to adopt the practices of the MRC? In the spring of the 2011-2012 program year, an electronic survey was sent to all internal coaches requesting their feedback regarding systems change impact the MRC program had on their local site. Internal coaches are employees of the host districts who work regularly to support their local members in the MRC role. Overall, 39.0% of respondents indicated that participation in the MRC program was very influential in prompting systems change at the local site. 46.5% indicated that participation in MRC was somewhat influential in prompting systems change. The following results summarize additional feedback received from 274 internal coaches regarding perceptions of specific areas of systems change in MRC sites. Table 45: Internal Coach Systems Change Survey Results The following questions are designed to capture your perceptions regarding changes to your local system resulting from participation in the MRC program. For each item below, select the response the best describes your site. Answer Options Prior to participating in the MRC program, our building collected fluency based screening measures for all students K-3 at least 3 times per year. Due (at least in part) to our participation in the MRC program, our building has begun or will begin collecting fluency based screening measures for all students K-3 at least 3 times per year. Prior to participating in the MRC program, teachers in our building used screening data to assist in identifying students for supplemental interventions. Due (at least in part) to our participation in the MRC program, teachers in our building now use screening data to assist in identifying students for supplemental interventions. Prior to participating in the MRC program, teachers regularly reviewed progress-monitoring data (weekly graphs) of students receiving supplemental interventions. Strongly Agree Agree Disagree Strongly Disagree 55.3% 22.6% 15.1% 7.0% 21.5% 31.3% 27.7% 19.5% 41.4% 41.4% 15.2% 2.0% 27.7% 41.5% 17.9% 12.8% 13.6% 36.9% 40.9% 8.6% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 75
Due (at least in part) to our participation in the MRC program, teachers now regularly review progressmonitoring data (weekly graphs) of students receiving supplemental interventions. Prior to participating in the MRC program, teachers viewed progress monitoring (weekly data) as an important method to evaluate the impact of instruction on students. Due (at least in part) to our participation in the MRC program, teachers now view progress monitoring (weekly data) as an important method to evaluate the impact of instruction on students. Prior to participating in the MRC program, our school used aggregated data as one way to evaluate the instructional practices of the site. Due (at least in part) to our participation in the MRC program, our school now uses aggregated data as one way to evaluate the instructional practices of the site. Prior to participating in the MRC program, the building principal shared data on student performance with the superintendent or school board. Due (at least in part) to our participation in the MRC program, the building principal now shares data on student performance with the superintendent or school board. Prior to participating in the MRC program, teachers shared progress monitor graphs with parents. Due (at least in part) to our participation in the MRC program, teachers now share progress monitor graphs with parents. Due (at least in part) to our participation in the MRC program, greater emphasis has been placed on selecting reading interventions for students that have a scientific research base. Prior to participating in the MRC program, instruction was modified if student performance was not improving based on the progress monitoring data collected. Due (at least in part) to our participation in the MRC program, instruction is now modified if student performance is not improving based on the progress monitoring data collected. Prior to participating in the MRC program, the district had adopted its own data warehouse system for efficiently storing and accessing data. 19.4% 54.1% 16.8% 8.7% 13.5% 40.5% 38.5% 7.5% 21.2% 52.0% 18.7% 8.1% 25.6% 53.3% 19.1% 2.0% 17.8% 49.2% 23.4% 9.6% 35.0% 48.2% 12.7% 4.1% 16.5% 41.2% 28.4% 13.9% 19.1% 32.2% 39.2% 6.5% 16.4% 55.4% 21.0% 7.2% 29.1% 54.3% 11.6% 5.0% 22.6% 52.8% 22.1% 2.5% 20.0% 59.5% 13.3% 7.2% 38.7% 39.7% 17.6% 4.0% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 76
Due (at least in part) to our participation in the MRC program, the district has now adopted its own data warehouse system for efficiently storing and accessing data. Due (at least in part) to our participation in the MRC program, the district has taken concrete steps I am aware of to formally link Pre-K with K-3 literacy instruction. 38.7% 39.7% 17.6% 4.0% 13.4% 38.7% 34.5% 13.4% 6. Impact on AmeriCorps Members What is the impact of the MRC experience on the AmeriCorps Members? In the spring of the program year, an electronic survey was sent to all members, requesting their feedback on the types of impact they perceived the MRC program had on them as individuals. The following results summarize feedback received from 465 members. Table 46: MRC Member Impact Survey Results Please indicate your agreement with each of the following statements based on your experience with Minnesota Reading Corps. Answer Options Strongly Agree Agree Disagree Strongly Disagree As a result of my participation in the MRC program, I am considering a career involving children As a result of my participation in the MRC program, I am considering a career in teaching or education As a result of my participation in the MRC program, I am committed to continued volunteering in schools As a result of my participation in the MRC program, I am committed to ongoing promotion of childhood literacy. As a result of my participation in the MRC program, I am committed to continued community service. 42.3% 39.6% 15.6% 2.4% 38.4% 33.7% 23.5% 4.4% 34.3% 52.7% 11.9% 1.1% 57.8% 38.0% 3.7% 0.4% 41.4% 47.6% 9.9% 1.1% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 77
As a result of my participation in the MRC program, if a job I hold in the future does not have community service as part of it's mission, I will encourage the organization to include opportunities for community service. 30.8% 53.5% 14.1% 1.5% During the 2011-2012 school year, a sub-group of MRC members served in a specialized capacity within a kindergarten focus program. These members worked primarily or exclusively with kindergarten age students, meeting with students for 40 minutes per day rather than 20 minutes per day. Half of this intervention time involved the same literacy interventions as those available outside of this K-Focus model. The other half of the time involved the member implementing a shared book reading intervention with groups of 4 students. Survey results for K-Focus members specifically were disaggregated to determine whether their unique role within the MRC program was related to different perceptions of outcome and future plans for members. In all, 43 K- Focus members responded to the survey. Overall while the pattern of responses for K-Focus members was not significantly different from the responses of all members, K-Focus members appeared somewhat less strong in their intention to continue in careers with children or education, but somewhat more strong in their commitment to ongoing advocacy around childhood literacy and community service. Table 47: K-Focus MRC Member Impact Survey Results Please indicate your agreement with each of the following statements based on your experience with Minnesota Reading Corps. Answer Options Strongly Agree Agree Disagree Strongly Disagree As a result of my participation in the MRC program, I am considering a career involving children 38.1% 42.9% 14.3% 4.8% As a result of my participation in the MRC program, I am considering a career in teaching or education As a result of my participation in the MRC program, I am committed to continued volunteering in schools As a result of my participation in the MRC program, I am committed to ongoing promotion of childhood literacy. 31.0% 35.7% 28.6% 4.8% 33.3% 47.6% 16.7% 2.4% 59.5% 28.6% 9.5% 2.4% MINNESOTA READING CORPS 2011-2012 STATE-WIDE EVALUATION 78
As a result of my participation in the MRC program, I am committed to continued community service. | 47.6% | 45.2% | 7.1% | 0.0%
As a result of my participation in the MRC program, if a job I hold in the future does not have community service as part of its mission, I will encourage the organization to include opportunities for community service. | 52.4% | 40.5% | 7.1% | 0.0%

7. Action Research: Results of Pilot Studies

Study of Kindergarten Focus Model

During the 2010-2011 school year, MRC began a new pilot study for students enrolled in kindergarten. Students participating in this pilot study were given two 20-minute tutoring sessions per day. One session consisted of one or more of the standard pre-literacy MRC interventions, delivered to a pair of students together. The other daily session consisted of a shared book reading intervention that incorporated elements of phonemic awareness, phonics, and vocabulary instruction. Students participated in the shared book reading intervention in groups of four. Members were chosen to participate in this pilot study based on district of service and master coach recommendation, and were given additional training to deliver the shared book reading intervention with fidelity. For purposes of evaluation, the performance of students participating in the K-Focus program was compared to the performance of kindergarten students participating in the standard MRC interventions within schools in which K-Focus was also available.

Table 48: Comparison of Percent of Students Meeting Individual Growth Rates by Participation in Kindergarten Focus Model

Grade | Group | Total % Above Target | Total Number of Students
K (LSF) | K-Focus | 85.52% | 1651
K (LSF) | Non K-Focus | 74.76% | 416
K (LSF) | Total | 83.36% | 2067

To further analyze the effect of participation in the kindergarten focus model, the average weekly growth of these students was compared to the average weekly growth of kindergarten students in schools in which K-Focus was available but who received standard 1:1 tutoring sessions instead. Results of this analysis, shown in Table 49 below, indicate that average weekly growth was significantly greater for students involved in the kindergarten focus model than for those involved in standard 1:1 tutoring sessions.
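The report does not describe the growth-rate computation in code, but the underlying procedure is an estimate of each student's weekly growth rate from repeated LSF scores, followed by a comparison of group means. The sketch below is a minimal illustration only, assuming a per-student ordinary least-squares slope, hypothetical column names (student_id, week, lsf_score, k_focus), and a Welch t-test for the group difference; the evaluators' actual statistical procedure is not specified in this report.

```python
# Illustrative sketch only: estimates each student's weekly LSF growth rate and
# compares K-Focus vs. non-K-Focus groups. Column names and the choice of an
# OLS slope per student are assumptions, not the evaluators' documented method.
import numpy as np
import pandas as pd
from scipy import stats

def weekly_slopes(scores: pd.DataFrame) -> pd.Series:
    """Return one OLS slope (points gained per week) per student.

    `scores` is assumed to have columns: student_id, week, lsf_score.
    Students with fewer than 3 data points are excluded, mirroring the
    footnote to Table 49.
    """
    def slope(group: pd.DataFrame) -> float:
        if len(group) < 3:
            return np.nan
        # np.polyfit with degree 1 returns [slope, intercept]
        return np.polyfit(group["week"], group["lsf_score"], 1)[0]

    return scores.groupby("student_id").apply(slope).dropna()

def compare_groups(scores: pd.DataFrame, roster: pd.DataFrame) -> None:
    """Compare mean weekly growth for K-Focus vs. non-K-Focus students.

    `roster` is assumed to map student_id to a boolean k_focus flag.
    """
    slopes = weekly_slopes(scores).rename("growth_rate").reset_index()
    merged = slopes.merge(roster, on="student_id")
    k_focus = merged.loc[merged["k_focus"], "growth_rate"]
    standard = merged.loc[~merged["k_focus"], "growth_rate"]
    t, p = stats.ttest_ind(k_focus, standard, equal_var=False)
    print(f"K-Focus mean growth:     {k_focus.mean():.2f} per week")
    print(f"Non K-Focus mean growth: {standard.mean():.2f} per week")
    print(f"Welch t = {t:.2f}, p = {p:.4f}")
```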
Table 49: Average Weekly Growth Rate of All MRC Participating Students in Kindergarten Focus Model*

Grade | K-Focus Growth Rate | Non K-Focus Growth Rate | Target Growth Rate | Number of K-Focus Students
K (LSF) | 2.07 | 1.62** | 1.15 | 2067

* Only students with 3 or more data points on the given measure were included in growth rate calculations.
** Difference is statistically significant, p < .001.

The following chart provides a visual depiction of the growth curves for kindergarten students in the K-Focus and non-K-Focus programs.

Figure 15: Growth Curve Estimates, by K-Focus Participation

One additional measure of growth for the K-Focus program is the number of kindergarten students served. By providing a targeted program designed for kindergarten students, the MRC program hoped to increase the number of students served in this grade level. The table below presents the number of kindergarten students served each school year, both for years with the K-Focus model and for prior years when the K-Focus model was not available.
In each year that a site began implementing K-Focus, the number of students served increased by 5-7 times over the previous year. For example, sites beginning K-Focus in 2011-12 served a total of 144 students in 2009-10 and 117 students in 2010-11 (both years in which K-Focus was not implemented), and then served a total of 888 students in 2011-12 (the first year implementing K-Focus).

Table 50: Three-Year History of Total Numbers of Kindergarten Students Served in Sites Implementing K-Focus, by Year Site Began Implementing K-Focus

2011-12 Total Number of Students Served
Year Site Began K-Focus | Active | Exited | Moved | Re-Enrolled | Referred to other services | TOTAL
2010-11 | 151 | 455 | 37 | 17 | 7 | 667
2011-12 | 215 | 581 | 48 | 29 | 15 | 888
TOTAL | 366 | 1036 | 85 | 46 | 22 | 1555

2010-11 Total Number of Students Served
Year Site Began K-Focus | Active | Exited | Moved | Re-Enrolled | Referred to other services | TOTAL
2010-11 | 157 | 467 | 19 | 44 | 7 | 694
2011-12 | 43 | 68 | 4 | 1 | 1 | 117
TOTAL | 200 | 535 | 23 | 45 | 8 | 811

2009-10 Total Number of Students Served
Year Site Began K-Focus | Active | Exited | Moved | Re-Enrolled | Referred to other services | TOTAL
2010-11 | 46 | 108 | 6 | 8 | 2 | 170
2011-12 | 60 | 68 | 4 | 5 | 7 | 144
TOTAL | 106 | 176 | 10 | 13 | 9 | 314

Study of Word Construction Intervention

In December 2011, a small cohort of members was trained to provide a new intervention to students, primarily during the second semester of first grade. This intervention was designed from the research of Bruce McCandliss and Isabel Beck, who demonstrated an effective method of supporting students' development of accurate and fluent word decoding skills. Students use letter tiles to build words, write words, and practice reading the words in isolation and in silly sentences. This protocol was selected for inclusion in the Minnesota Reading Corps program to support students who had mastered the other basic early word decoding skills covered in MRC early literacy interventions, but who were not fully ready to read connected text independently.
The interested reader is referred to the primary research publication describing the intervention method: McCandliss, B., Beck, I. L., Sandak, R., & Perfetti, C. (2003). Focusing attention on decoding for children with poor reading skills: Design and preliminary tests of the word building intervention. Scientific Studies of Reading, 7, 75-104.

An initial examination of the Word Construction intervention was conducted, exploring its effect on the slope of ORF growth in the second half of first grade. Students participating in Word Construction were matched to a comparison group of first graders who did not receive the Word Construction intervention. The comparison group was matched on the following variables:

- Performance on the Winter NWF Benchmark Assessment (within 7 words read correct per minute);
- Performance on the Winter ORF Benchmark Assessment (within 5 words read correct per minute);
- School district (same site, when possible).

A series of matched-pairs t-tests (with Bonferroni correction for family-wise error) was conducted to establish whether the treatment and comparison groups differed on the Winter ORF Benchmark and whether they differed on ORF slope in the second half of first grade. Tests were conducted first on all students receiving Word Construction, and then on only those students with strong Winter Benchmark performance on NWF (above the target of 52) but weak Winter Benchmark performance on ORF (below the target of 22), labeled "strong decoders, non-fluent readers" in the table below.

Table 51: Matched-pairs t-tests of Word Construction treatment vs. comparison groups

Measure | Treatment Mean | Comparison Mean | P(difference) | Number of Pairs
All Students in Word Construction
ORF Winter Benchmark Performance | 12.03 | 12.68 | 0.04 | 91
ORF Growth Slope (words/minute/week) | 0.93 | 1.07 | 0.09 | 91
Strong Decoders, Non-fluent Readers Only
ORF Winter Benchmark Performance | 12.67 | 13.83 | 0.07 | 24
ORF Growth Slope (words/minute/week) | 1.11 | 0.98 | 0.49 | 24
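To make the matching and testing procedure concrete, the sketch below shows one way such an analysis could be implemented. It is an illustration only, assuming a simple greedy 1:1 matching rule and hypothetical column names (student_id, nwf_winter, orf_winter, district); the report does not publish the evaluators' analysis code, and their matching algorithm may have differed.

```python
# Illustrative sketch only: greedy 1:1 matching on winter benchmark scores,
# followed by matched-pairs t-tests with a Bonferroni-corrected alpha.
# Data layout and column names are assumptions made for this example.
import pandas as pd
from scipy import stats

def match_pairs(treated: pd.DataFrame, pool: pd.DataFrame) -> list:
    """Pair each treated student with an unused comparison student whose winter
    NWF is within 7 and winter ORF within 5 points, preferring the same
    district. Returns a list of (treated_id, comparison_id) pairs."""
    pairs, used = [], set()
    for _, t in treated.iterrows():
        candidates = pool[
            (~pool["student_id"].isin(used))
            & ((pool["nwf_winter"] - t["nwf_winter"]).abs() <= 7)
            & ((pool["orf_winter"] - t["orf_winter"]).abs() <= 5)
        ]
        if candidates.empty:
            continue  # no acceptable match for this student
        same_district = candidates[candidates["district"] == t["district"]]
        chosen = (same_district if not same_district.empty else candidates).iloc[0]
        used.add(chosen["student_id"])
        pairs.append((t["student_id"], chosen["student_id"]))
    return pairs

def paired_tests(paired: pd.DataFrame, outcomes: list, alpha: float = 0.05) -> None:
    """Run matched-pairs t-tests on each outcome, e.g. columns 'orf_winter_t'
    vs. 'orf_winter_c', with a Bonferroni-corrected significance threshold."""
    corrected_alpha = alpha / len(outcomes)  # family-wise error control
    for name in outcomes:
        t, p = stats.ttest_rel(paired[f"{name}_t"], paired[f"{name}_c"])
        flag = "significant" if p < corrected_alpha else "not significant"
        print(f"{name}: t={t:.2f}, p={p:.3f} ({flag} at corrected alpha={corrected_alpha:.4f})")

# Example call (hypothetical wide-format data): paired_tests(df, ["orf_winter", "orf_slope"])
```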
No significant differences were found on initial ORF performance; however, the differences approached significance, and comparison students generally scored slightly higher than the students who participated in the Word Construction intervention. For the winter benchmark in first grade, even small differences in performance can be meaningful, because scores are typically much lower overall at that point in the year. No significant differences were found on ORF slope, indicating that students in the treatment group did not grow at a faster rate in the second half of first grade than the comparison students. Interestingly, the difference in slope was slightly positive (though non-significant) for the strong decoder, non-fluent reader subset of participants, despite their slightly lower (though non-significant) initial performance on the winter ORF benchmark. Further examination of the Word Construction intervention should be conducted with a larger sample, especially for the strong decoder, non-fluent reader group, where the sample was too small to draw meaningful conclusions.

Family Literacy Pilot Projects

During the 2011-2012 school year, Minnesota Reading Corps members serving both Pre-K and K-3 students participated in a pilot program designed to engage families in student literacy. For both age groups, the pilot program involved sending home books or stories for the students to read, together with instructions and suggestions for family members about engaging in shared reading. Family members were asked to sign a response form indicating their level of participation. In the following analysis, results were included only if responses fell within valid ranges. Overall, between 32% and 45% of participating MRC students received at least one book/story at home, and between 71% and 93% of these families responded with at least one signature indicating participation.
Table 52: Family Literacy Pilot Program Participation Rates

Measure | Pre-K | K-3
N (Valid N) | 6596 (6578) | 8658 (8643)
Number of Books/Stories Sent Home | 20,258 | 85,131
Number of Signatures Returned | 12,832 | 180,526
Number (Percent) of Families that Received at Least One Book/Story | 2125 (32%) | 3913 (45%)
Number (Percent) of Families with at Least One Signature, of Those Who Received at Least One Book/Story | 1499 (71%) | 3626 (93%)
Mean (SD) Books/Stories Sent Home for Those Participating | 9.53 (6.25) | 21.75 (20.03)
Mean (SD) Signatures Received for Those Receiving at Least One Book/Story | 6.04 (6.21) | 46.13 (52.43)
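As a rough illustration of how participation figures of this kind can be derived from per-student records, the following sketch computes the same summary quantities as Table 52. The record layout and column names (books_sent, signatures_returned, grade_band) are assumptions for the example, not the evaluators' actual data structure.

```python
# Illustrative sketch only: summarizes family literacy participation in the
# style of Table 52 from one row per student. Column names are hypothetical.
import pandas as pd

def summarize_participation(records: pd.DataFrame) -> pd.DataFrame:
    """Return one summary row per grade band (e.g. 'Pre-K', 'K-3')."""
    rows = []
    for band, grp in records.groupby("grade_band"):
        # keep only records with valid (non-missing) counts
        valid = grp.dropna(subset=["books_sent", "signatures_returned"])
        received = valid[valid["books_sent"] >= 1]            # got at least one book/story
        signed = received[received["signatures_returned"] >= 1]
        rows.append({
            "grade_band": band,
            "n": len(grp),
            "valid_n": len(valid),
            "books_sent_total": valid["books_sent"].sum(),
            "signatures_total": valid["signatures_returned"].sum(),
            "families_received_pct": 100 * len(received) / len(valid),
            "families_signed_pct": 100 * len(signed) / len(received),
            "mean_books_for_participants": received["books_sent"].mean(),
            "mean_signatures_for_participants": received["signatures_returned"].mean(),
        })
    return pd.DataFrame(rows)
```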
Appendix 1

The following table provides a count of unique Pre-K and K-3 Minnesota Reading Corps participating students by grade level, as listed in OnCorps, regardless of data presence or tutoring time. Student status is also reported.

Grade | Active | Exited | Moved | Re-Enrolled | Referred to other services | Grand Total
PreK 2 | 65 | 5 | 5 | | | 75
PreK 3 | 1028 | 39 | 58 | | 2 | 1127
PreK 4 | 4549 | 145 | 269 | 1 | 8 | 4972
PreK 5 | 208 | 5 | 11 | | 1 | 225
PreK Unknown | 166 | 7 | 22 | 2 | 0 | 197
PreK Total | 6016 | 201 | 365 | 3 | 11 | 6596
0 | 968 | 2956 | 188 | 116 | 96 | 4324
1 | 1533 | 2290 | 378 | 247 | 243 | 4691
2 | 1784 | 1475 | 526 | 106 | 180 | 4071
3 | 1722 | 1884 | 598 | 197 | 183 | 4584
TOTAL | 12023 | 8806 | 2055 | 669 | 713 | 24266
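A count table of this form can be produced directly from the student roster by cross-tabulating grade against status over unique students. The sketch below is an illustration only; the OnCorps export layout and column names (student_id, grade, status) are assumptions.

```python
# Illustrative sketch only: builds an Appendix-1-style count of unique students
# by grade and enrollment status from a hypothetical OnCorps export.
import pandas as pd

def grade_status_counts(oncorps: pd.DataFrame) -> pd.DataFrame:
    """Cross-tabulate unique students by grade and status, with totals."""
    unique_students = oncorps.drop_duplicates(subset="student_id")
    return pd.crosstab(
        unique_students["grade"],
        unique_students["status"],
        margins=True,          # adds the Grand Total row and column
        margins_name="TOTAL",
    )
```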