Running Head: FIELD TESTING 09-10

Report Number: 8 FT 09-10

Computer Based Assessment System-Reading (CBAS-R): Technical Report on Field Test Results 2009-2010

Allison McCarthy
Amanda Pike
Theodore Christ

University of Minnesota

Date Initiated: 2/2010
Finalized Draft: 11/23/10
Reviewed by Advisory Board: TBA

Preparation of this technical report was supported in part by a grant from the Office of Special Education Programs, U.S. Department of Education (H327A090064). Opinions expressed herein do not necessarily reflect the position of the U.S. Department of Education.

Abstract

The sampling plan and data collection used to establish item parameters for CBAS-2 items are presented. Data were collected in the Fall and Winter of 2009-2010 at seven schools in Minnesota. In total, 638 items from the CBAS-2 item bank were administered; 16 items were selected and used as linking items, while the 622 remaining items were parameterized and retained for the subsequent version of CBAS-R. This report summarizes the details of data collection, including sample demographics, identification of linking items, test item selection, and the rationale for the stratified sampling procedures.

Keywords: sampling plan, CBAS-2, 2009-2010

Research Log

Activity                  Lead Person                      Timeframe
Recommendations to PIs    Allison McCarthy                 2/10
Writing                   Allison McCarthy, Amanda Pike    2/10-11/10
Review                    Theodore Christ                  11/2/10

Table of Contents

Abstract...2
Introduction...5
  Purpose of CBAS-R...5
  Field Testing CBAS-R...5
Method...6
  Participants...6
  Initial Field Testing to Establish Linking Items...7
  Field Testing to Establish Item Parameters...9
  Procedure: Data Collection Overview...10
  Procedure: Sampling Plan...10
    Fall 2009 Data Collections...10
      October (Schools A & B)...10
      November (Schools C & D)...11
      December (School E)...11
    Winter 2010 Data Collections...12
      January (School A)...12
      February (Schools C, D, & G)...12
      March (Schools B, E, & F)...13
Results...13
References...14
Appendix...15

CBAS-R: Technical Report on Field Test Results 2009-2010

Purpose of CBAS-R

The Computer-Based Assessment System for Reading (CBAS-R) is a computer-administered assessment of reading achievement for use in kindergarten through fifth grade. CBAS-R outcomes will enable continuous evaluation of the level and rate of reading achievement within and across the primary grades. Contemporary psychometric theory (3-parameter Item Response Theory) has been applied to support the development of CBAS-R as a computer adaptive test (CAT). CAT-based assessments have the potential to confer substantial benefits on teachers and educators. CBAS-R targets reading achievement across five critical sub-domains put forth by the National Reading Panel (2000): concepts of print, phonemic awareness, phonics, vocabulary, and comprehension. It also provides sufficient flexibility to assess within the sub-domains and derive profiles of achievement across domains, and outcomes will be equated both horizontally and vertically (i.e., within and between grades).

Field Testing CBAS-R

The CBAS-R team sought to field test the new CBAS-2 item bank (Fall 2009) in the hope of obtaining a better and more representative bank of items than the previous (CBAS) bank. Data were collected in the Fall and Winter of 2009-2010 at seven schools in Minnesota, with methods similar to those used in the 2007 field testing data collection (see Ayodele, Jiban, Christ, & McCarthy, 2007, Tech. Rep. No. 2).
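The 3-parameter model named above can be illustrated with a short sketch. This is a generic illustration of the 3PL item response function, not project code; the D = 1.7 scaling constant is an assumption, as the report does not specify the metric used.

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    """Probability of a correct response under the 3-parameter
    logistic (3PL) model: discrimination a, difficulty b, and
    pseudo-guessing (lower asymptote) c."""
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

# An examinee of average ability (theta = 0) facing an item of
# average difficulty (b = 0) with c = 0.2 answers correctly with
# probability 0.2 + 0.8 * 0.5 = 0.6.
p = p_3pl(theta=0.0, a=1.0, b=0.0, c=0.2)
```

At very low abilities the curve approaches the guessing floor c, which is one reason stable estimation of all three parameters requires many responses across the ability range.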

Method

Participants

Students in kindergarten through fifth grade were drawn from seven schools in the suburbs near Minneapolis, Minnesota. Because the goal was to test a large number of students (300 per item), CBAS personnel wanted to ensure that the majority of students in participating schools could take part without becoming frustrated. English Language Learner (ELL) students were therefore not an ideal population for participation, as the large amount of reading required might have caused frustration. School demographics are presented in Table 1, and sample sizes by school and grade are presented in Table 2.

Schools were identified for participation based on a high rate of non-ELL students. The Project Coordinator emailed principals to gauge their interest in participating. In some cases, principals recommended that CBAS staff contact individuals at the district administration level (e.g., the Superintendent) to confirm that it was appropriate for their school to participate. Once confirmation was received from administrative staff or the principal, CBAS personnel proceeded with scheduling the data collections.

Initial Field Testing to Establish Linking Items

Item Response Theory (IRT) requires that each field-tested item be administered to approximately 300 examinees (Weiss, personal communication, 2009). Therefore, the goal was 300 students tested per item; some items were tested on more than 300 students, depending on the number of available participants in each school. Sampling at this rate helps ensure that all three parameters (a, b, and c) are stable estimates of the item parameters.
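The 300-responses-per-item target can be motivated with a small simulation. This is an illustrative sketch, not the project's procedure; the standard-normal ability distribution and the item parameters are assumptions made for the example.

```python
import math
import random
import statistics

def simulate_item(n, a, b, c, seed=2009, D=1.7):
    """Simulate n examinee responses (0/1) to a single 3PL item,
    with abilities drawn from a standard normal distribution."""
    rng = random.Random(seed)
    responses = []
    for _ in range(n):
        theta = rng.gauss(0.0, 1.0)
        # Model-implied probability of a correct response.
        prob = c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))
        responses.append(1 if rng.random() < prob else 0)
    return responses

# With 300 responses, the observed proportion correct is a fairly
# stable estimate of the item's expected score (about 0.6 here).
resp = simulate_item(300, a=1.0, b=0.0, c=0.2)
p_hat = statistics.mean(resp)
```

With n = 300 the sampling error of the proportion correct is on the order of 0.03, which is the kind of stability the calibration relies on.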

The goal of the first data collection was to identify and establish parameters for linking items. Linking items (N = 16) are items included on each subsequent field test, as required by a mixed-group common-item linking procedure (Kolen & Brennan, 2004). Without linking items, examinee responses across alternate items and examinee samples could not be compared and used to estimate item parameters. Linking items were developed and selected to represent all domains, as well as the range of item difficulties, which generally corresponds with reading performance from kindergarten through fifth grade. Identifying linking items from the first data collection was especially important because those items appear on every subsequent field test. CBAS personnel identified 16 linking items that spanned the difficulty range and were content balanced (all linking items are shown in the Appendix of this report).

CBAS personnel systematically chose 55 items from the entire bank (N = 638). First, staff members individually chose what they judged to be the best 55 items, based on findings from previous administrations during CBAS-R-1 and on literacy research. Any item that both members chose was included among the 55, on the assumption that agreement indicated a good item. The number of items chosen per domain was calculated from an a priori model. For example, Concepts of Print and Phonological Awareness develop in kindergarten and first grade; the proportion of these item types should therefore be much smaller than the proportion of Comprehension items, because Comprehension develops across all five grades, not just two. The number of items per grade level was determined similarly.

CBAS staff anticipated that some items would be too difficult for young students at the first data collection, in which the 55 items were to be tested. Thus, the administration was structured as five tests with branching and termination criteria. Each test was content balanced, and successive tests were sequenced to become progressively more difficult (item difficulty was based on the expert judgment of project personnel). For instance, the first test included the 15 items considered easiest. Based on performance within each test, a student's administration either terminated after the completed test (due to the number of incorrect responses) or continued to the next, progressively more difficult test (if enough correct responses were given). If an examinee's testing terminated, it was assumed that the examinee would not respond to subsequent items at a rate greater than chance. The results of this initial field testing were used to guide the selection of linking items; they were not included in the supermatrix or used to estimate the final item parameters.

Field Testing to Establish Item Parameters

Researchers did not want students to become frustrated by items that were too difficult. Based on the project team's best estimates, items were sequenced by difficulty and divided into six sets of three progressively more difficult tests (e.g., for Schools A & B there was a Fall set of three tests (K-1, 2-3, 4-5) and a Spring set of three tests (K-1, 2-3, 4-5)). While all reading domains were assessed at all grade levels, grade-level tests were constructed with the particular domain emphasis for that grade in mind. Specifically, personnel believed that kindergarten and first-grade examinees should be assessed in the domains of Concepts of Print, Phonological Awareness, and some Phonics skills; thus, it seemed reasonable to combine these grades and test them using the same set of items. Similarly, examinees in grades two and three, and in grades four and five, were expected to be similar in reading development. The most relevant domains for grades two and three seemed to be Phonics and Vocabulary, while the focus for grades four and five was Vocabulary and Comprehension. It is important to note that every grade was tested on Vocabulary and Comprehension, but at varying levels of difficulty corresponding with grade level.

Excluding the first data collection at School A, the Kindergarten-First Grade (K-1) tests and the Second-Third Grade (2-3) tests each contained 39 new items, while each Fourth-Fifth Grade (4-5) test contained 31 new items. The number of items on the 4-5 tests was reduced because the majority of those items belonged to the Comprehension domain, and Comprehension items generally require more time because more reading is involved. Including the linking items, the K-1 and 2-3 tests therefore had 55 items each, and the 4-5 tests had 47 items each.

Procedure: Data Collection Overview

Data were collected using a mobile computer lab of 28 laptops. At the time of administration, data collectors directed students to the appropriate computer, located in a test carrel, and provided a brief set of directions. A picture of an individual student administration is presented in Figure 1 of the Appendix. Visual stimuli were presented on a computer monitor, and auditory stimuli were presented through padded earphones (the padding helps isolate students from extraneous noise and helps ensure that the assessment can be conducted in a populated area). Students received oral directions from a proctor or teacher, followed by automated directions and on-screen demonstrations to guide them through the test. Students responded to CBAS items in sessions of 15-30 minutes. Upon completion, students were free to return to their classrooms.

Procedure: Sampling Plan

Fall 2009 Data Collections

October.

School A. The purpose of this initial data collection was to identify linking items (procedure described above). Three hundred sixty-four students participated in this data collection. Once linking items were identified (N = 16), they were included in all subsequent tests in which items were field tested, with one exception (the School B Fall data collection, 4-5 test).

School B. As this was another early data collection in which CBAS personnel had no data to aid in choosing items, items considered to be high quality and representative of a range of difficulty and content were selected. Five linking items were unintentionally left out of the 4-5 test. As a result, items from that 4-5 test were added to the supermatrix once all data collections were completed, and the researchers adjusted the parameters appropriately. Five hundred sixty total students participated in this data collection at this school; specifically, 153 students were administered the K-1 test, 211 students were administered the 2-3 test, and 196 students were administered the 4-5 test.

November.

School C. This elementary school served as a make-up school for items tested at School B; that is, the goal of 300 students per item was not achieved at School B due to its student population. Students at School C took the same test set as at School B in order to reach the goal of 300 students per item. Three hundred sixty-two total students participated in this data collection at this school; specifically, 128 students were administered the K-1 test, 123 students were administered the 2-3 test, and 111 students were administered the 4-5 test.

School D. Items were chosen with consideration for difficulty level and content balance. Seven hundred forty-one total students participated in this data collection at this school; specifically, 243 students were administered the K-1 test, 255 students were administered the 2-3 test, and 243 students were administered the 4-5 test.

December.

School E. Items were chosen with consideration for difficulty level and content balance. Seven hundred twenty-seven total students participated in this data collection at this school; specifically, 217 students were administered the K-1 test, 238 students were administered the 2-3 test, and 272 students were administered the 4-5 test.

Winter 2010 Data Collections

January.

School A. Based on an initial item analysis of the October data collection, more very easy (b < -2.0) and very difficult (b > +2.0) items were needed to expand the range of the linking items. As a result, CBAS personnel focused on adding mostly extremely easy items (in the domains of Concepts of Print and Phonological Awareness) to the K-1 test. The 4-5 test included high-level Vocabulary items as well as Comprehension items considered very difficult due to passage length, vocabulary, and question type. The tests at this school consisted of an entirely new item set. Three hundred eighty total students participated in this data collection at this school; specifically, 100 students were administered the K-1 test, 130 students were administered the 2-3 test, and 150 students were administered the 4-5 test.

February.

School C. Again, this school served as the make-up school for the previous fall data collections (i.e., Schools B, D, and E). Three hundred sixty-two total students participated in this data collection at this school; specifically, 126 students were administered the K-1 test, 122 students were administered the 2-3 test, and 114 students were administered the 4-5 test.

School D. Items were chosen with consideration for difficulty level and content balance. The tests administered consisted of an entirely new item set. Seven hundred ten total students participated in this data collection at this school; specifically, 234 students were administered the K-1 test, 252 students were administered the 2-3 test, and 224 students were administered the 4-5 test.

School G. Items were chosen with consideration for difficulty level and content balance. Five hundred seventy-one total students participated in this data collection at this school; specifically, 215 students were administered the K-1 test, 174 students were administered the 2-3 test, and 182 students were administered the 4-5 test.

March.

School B. Items were chosen with consideration for difficulty level and content balance. The tests administered at School B were the same as those administered during the winter data collection at School F. Six hundred fifty-three total students participated in this data collection at this school; specifically, 203 students were administered the K-1 test, 238 students were administered the 2-3 test, and 212 students were administered the 4-5 test.

School E. School E served as the make-up school for all data collections in need of additional administrations per item. Seven hundred sixty-eight total students participated in this data collection at this school; specifically, 278 students were administered the K-1 test, 258 students were administered the 2-3 test, and 232 students were administered the 4-5 test.

School F. Items were chosen with consideration for difficulty level and content balance. The tests administered consisted of an entirely new item set. Five hundred seventy-six total students participated in this data collection at this school; specifically, 216 students were administered the K-1 test, 180 students were administered the 2-3 test, and 180 students were administered the 4-5 test.

Results

A summary of the mean, standard deviation, minimum, and maximum values of the three IRT parameters in each reading domain is presented in Table 3. The number of items in each reading domain at each difficulty level is presented in Table 4. In total, 638 items from the CBAS-2 item bank were administered during the Fall and Winter of 2009-2010. Approximately 4,060 students participated in the study, allowing for approximately 300 responses per item. Sixteen items were selected and used as linking items, while the 622 remaining items were parameterized and retained for the subsequent version of CBAS-R, for use in indexing reading development from kindergarten through fifth grade.
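Domain-level summaries of the kind reported in Table 3 amount to grouping items by domain and computing descriptive statistics for each parameter. A minimal sketch with hypothetical parameter values (the real values come from the IRT calibration, not from this code):

```python
import statistics

# Hypothetical item parameters, for illustration only.
items = [
    {"domain": "Phonics",    "a": 1.10, "b": -0.50, "c": 0.21},
    {"domain": "Phonics",    "a": 1.25, "b":  0.30, "c": 0.18},
    {"domain": "Vocabulary", "a": 1.05, "b":  1.10, "c": 0.20},
    {"domain": "Vocabulary", "a": 1.15, "b":  0.80, "c": 0.22},
]

def summarize(items, param):
    """Per-domain N, mean, SD, min, and max for one IRT parameter."""
    summary = {}
    for domain in sorted({item["domain"] for item in items}):
        values = [item[param] for item in items if item["domain"] == domain]
        summary[domain] = {
            "N": len(values),
            "Mean": statistics.mean(values),
            "SD": statistics.stdev(values) if len(values) > 1 else 0.0,
            "Min": min(values),
            "Max": max(values),
        }
    return summary

b_summary = summarize(items, "b")
```

Running `summarize` once per parameter (a, b, c) reproduces the row structure of Table 3.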

References

Ayodele, A. N., Jiban, C. L., Christ, T. J., & McCarthy, A. M. (2007). CBAS-R: Technical report on field test results 2007 (Tech. Rep. No. 2). Minneapolis, MN: University of Minnesota, Department of Educational Psychology.

Kolen, M. J., & Brennan, R. L. (2004). Test equating, scaling, and linking: Methods and practices (2nd ed.). New York: Springer-Verlag.

National Reading Panel. (2000). Report of the National Reading Panel subgroups: Teaching children to read (NIH Publication No. 00-4754). Washington, DC: National Institute of Child Health and Human Development.

Appendix

Table 1
School Demographics

Category                      A      B      C      D      E      F      G
White                        89%    68%    78%    72%    69%    76%    63%
Black                         3%    16%     7%     7%     5%     4%     5%
Hispanic                      3%     8%     8%     7%     6%     4%    26%
Asian                         4%     8%     7%    13%    19%    15%     6%
American Indian               1%    <1%    <1%     1%     1%     1%    <1%
Free/Reduced Lunch           21%    42%    37%    24%    14%    20%    41%
Limited English Proficient    1%    12%     9%    14%    14%    15%    28%
Special Education            10%     8%     8%    11%    10%    11%    13%
Grades Served                K-6    K-5    K-5    K-5    K-5    K-5    K-5
Total Population             470    663    396    782    766    690    638

Table 2
Participants by School and Grade

Grade      A      B      C      D      E      F      G
K-1      100    203    128    243    278    216    215
2-3      130    238    123    255    258    180    174
4-5      150    212    114    243    238    180    182
Total    380    653    365    741    774    576    571

Table 3
Summary of Parameter Estimates for Each Domain

                              Parameter a                Parameter b                 Parameter c
Domain                  N   Mean   SD    Min   Max    Mean   SD    Min    Max    Mean   SD    Min   Max
Concepts of Print      64   1.13  0.13  0.72  1.43   -0.91  1.18  -2.60  1.61   0.20  0.06  0.06  0.44
Phonological Awareness 70   1.04  0.09  0.80  1.22   -0.47  1.01  -2.37  1.35   0.24  0.08  0.11  0.47
Phonics               132   1.14  0.19  0.74  2.24   -0.30  1.30  -2.95  3.06   0.20  0.06  0.08  0.43
Vocabulary            213   1.10  0.10  0.71  1.69    0.96  1.60  -2.95  3.06   0.20  0.06  0.07  0.47
Comprehension         143   1.16  0.17  0.88  2.13    0.55  0.97  -1.20  3.06   0.21  0.07  0.09  0.47
Whole Bank            622

Table 4
Item Difficulty Information

Domain                  (-3<b<=-2)  (-2<b<=-1)  (-1<b<=0)  (0<b<=1)  (1<b<=2)  (2<b<3)
Concepts of Print           15          16          18          9         6         0
Phonological Awareness       6          15          27         15         7         0
Phonics                     15          32          31         29        22         3
Vocabulary                  13           8          39         43        49        61
Comprehension                0           1          48         55        23        16
Whole Bank                  49          72         164        151       107        77
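The counts in Table 4 can be reproduced from a list of b parameters by binning on the interval boundaries shown in the column headers. A minimal sketch (the b values here are made up for illustration, and half-open intervals are assumed throughout):

```python
def bin_difficulties(b_values, edges=(-3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0)):
    """Count items whose difficulty b falls in each half-open
    interval (low, high], matching the Table 4 column bounds."""
    counts = [0] * (len(edges) - 1)
    for b in b_values:
        for i in range(len(edges) - 1):
            if edges[i] < b <= edges[i + 1]:
                counts[i] += 1
                break
    return counts

# Six illustrative difficulty values spread across the bins.
counts = bin_difficulties([-2.5, -0.4, -0.1, 0.7, 1.9, 2.2])
```

Applying this to each domain's b estimates, and then to the whole bank, yields the rows of Table 4.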

Figure 1. Illustration of CBAS-R administration conditions. The multimedia features allow visual and auditory input. They also isolate the student from extraneous stimuli that might otherwise distract the test taker. The keyboard is hidden with a keyboard cover, and students' responses are recorded by pointing and clicking with a mouse.

Figure 2. Linking item Unique ID 11; Audio: Click on the picture that shows the front of a book.

Figure 3. Linking item Unique ID 12; Audio: Click on the word where you start reading.

Figure 4. Linking item Unique ID 24; Audio: Click on the first letter of the word.

Figure 5. Linking item Unique ID 35; Audio: Choose the best answer.

Figure 6. Linking item Unique ID 43; Audio: Choose the best answer.

Figure 7. Linking item Unique ID 45; Audio: Choose the best answer.

Figure 8. Linking item Unique ID 57; Audio: Choose the best answer.

Figure 9. Linking item Unique ID 81; Audio: Choose the best answer.

Figure 10. Linking item Unique ID 67; Audio: Choose the best answer.

Figure 11. Linking item Unique ID 94; Audio: Choose the best answer.

Figure 12. Linking item Unique ID 104; Audio: Choose the best answer.

Figure 13. Linking item Unique ID 176; Audio: Which word does not have the same middle sound?

Figure 14. Linking item Unique ID 379; Audio: Click on the word big.

Figure 15. Linking item Unique ID 381; Audio: Click on the word girl.

Figure 16. Linking item Unique ID 395; Audio: Click on the word belong.

Figure 17. Linking item Unique ID 395; Audio: Choose the best answer.