An overview of Value-Added Assessment


An overview of Value-Added Assessment. Ted Hershberg, Director, Operation Public Education; Professor, Public Policy and History, University of Pennsylvania

Without much thought, we answer: Good schools are those with high test scores.

Where do we find these schools? In rich communities.

Most people would reject the definition of a successful school as one with wealthy students. They are correct.

Family Income and SAT Scores*

Family Income         SAT Score
Under $10,000         884
$10,000 - $20,000     906
$20,000 - $30,000     937
$30,000 - $40,000     967
$40,000 - $50,000     996
$50,000 - $60,000     1014
$60,000 - $70,000     1026
$70,000 - $80,000     1039
$80,000 - $100,000    1058
Over $100,000         1119

*"2005 College-Bound Seniors: Total Group Profile Report," published by the College Board

Do good schools make good students or do good students make good schools?

Field Experiment. What would happen over time if we took kids from the inner city and educated them in affluent suburban schools, and kids from the affluent suburbs and educated them in inner-city schools? Since we can't undertake this experiment, we need a statistical method that can do it for us.

The difference between achievement and growth

Achievement [Chart: actual scores on a 0-100 scale] Achievement -- a score on a vertical scale at a single moment in time (absolute or raw score, status, proficiency) -- is best predicted by family income.

New technologies and data sets make new research possible. James Coleman (1966) and Christopher Jencks (1972) concluded that family background was more important than schooling in explaining achievement. They did not have the technology to trace individual students (not cohorts) over time. Nor did they have data sets that link the teacher for every subject and grade to an individual student's record.

Growth [Chart: actual scores in September and in June across the school year] Growth (the progress students make over the course of the school year) is best predicted by the quality of instruction.

No Child Left Behind. All students must reach proficiency in reading and math within 12 years. Adequate yearly progress (AYP) is measured for: all students, all major racial/ethnic subgroups, low-income students, limited English proficiency students, and students with disabilities.

No Child Left Behind. A powerful catalyst for change: an admirable goal, but serious design flaws. Value-added assessment can improve AYP, strengthen instruction, and increase student achievement. Proposals from states to include growth in AYP will now be accepted by the federal Department of Education.

Identifying AYP's shortcomings [2x2 chart, achievement vs. growth: high achievement/low growth, low achievement/low growth, high achievement/high growth, low achievement/high growth]

Defining a successful school Each year the performance of the students exceeds what is expected of them, given their academic background. Over time all students are able to achieve high standards (NCLB).

Value-Added: A New Lens

Value-Added Assessment. First developed for Tennessee by William Sanders. Since 1992, it has tracked each of the state's 885,000 students: 10 million records, grades 2-12, with test scores in every subject and every grade linked to every teacher. The largest database ever assembled. Mandatory in Pennsylvania and Ohio, as well as in over 300 districts and consortia across the U.S.

Philosophy Behind Value-Added Assessment. Schools can and should add value for each student from September to June. This is true whether the student comes in above grade level, at grade level, or below grade level. Students are entitled to grow at least at the rate they have demonstrated in the past.

Value-Added: The Basics Value-added is not a test. It is a way of looking at the results that come from tests. Value-added lets us determine whether the students in a class, school or district are making enough academic growth each year.

Example - Student A [Chart: Student A's test scores, grades 3-11]

Student A vs. District Average [Chart: Student A's test scores compared with the district average, grades 3-11]

What can we conclude? Not much. The student appears to be consistently near average, if a little below. [Chart: Student A vs. district average, grades 3-11]

But we see that in fourth grade, the student advanced at a rate far above the district average... [Chart: Student A vs. district average, grades 3-11]

...and in fifth grade, the student advanced at a rate below the district average. [Chart: Student A vs. district average, grades 3-11]

We can draw no conclusions about the quality of classroom activity from these facts alone... [Chart: Student A vs. district average, grades 3-11]

Now, if we instead think of the line as representing: the average of all students in the teacher's classroom, and the average of three years of the teacher's classrooms, then we can draw meaningful conclusions about what's happening in that class. [Chart: classroom average vs. district average, grades 3-11]

The Concept Behind Value-Added. Value-added is statistically and computationally complex, but the idea behind it is straightforward.

Projected and Actual Scores. Value-added calculates a projected test score for a student in a given grade and subject. The projected score is based entirely on the student's prior academic achievement. It is then compared to the actual score at the end of the year.

Value-Added Levels the Playing Field [Chart: projected score vs. actual score; the actual score can fall below, equal, or exceed the projected score] Value-added measures the difference between actual and projected.
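The projection-and-comparison idea above can be sketched in a few lines of Python. This is only an illustrative sketch, not the actual TVAAS methodology (which uses longitudinal mixed models over many prior scores and subjects); here a plain least-squares regression on a single prior-year score stands in for the projection, and all scores are hypothetical.

```python
# Illustrative sketch: project each student's score from prior achievement,
# then take value-added as the gap between actual and projected.
# NOTE: hypothetical data; real systems (e.g., TVAAS) use longitudinal
# mixed models, not a one-variable regression like this.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# (prior-year score, current-year score) for a hypothetical cohort
cohort = [(500, 540), (620, 650), (450, 470), (580, 640), (700, 720)]
prior = [p for p, _ in cohort]
actual = [c for _, c in cohort]

slope, intercept = linear_fit(prior, actual)

for p, a in cohort:
    projected = slope * p + intercept
    value_added = a - projected  # positive: grew more than projected
    print(f"prior={p} actual={a} projected={projected:.1f} "
          f"value-added={value_added:+.1f}")
```

Note that, as the slides argue, the projection here depends only on the student's prior academic record, so every student is expected to progress from wherever they start.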

Don't confuse value-added assessment with mere growth or gain. Many people make this mistake. It sounds reasonable to think of the growth or gain a student makes from one year to the next as the value that's been added. But value-added assessment is much more powerful than a simple growth or gain score.

Value-Added divides the difference between projected and actual scores into two parts: that which is contributed by the student, and that which is contributed by the teacher.

Each child serves as his own statistical control [Chart: scores from grade 3 to grade 4] The environmental variables remain the same: family income, ethnicity, gender, neighborhood.

Value-added yields three outcomes: value-added above projection, value-added in the projected range, and value-added below projection. [Chart: scaled scores, grades 3-5] These records are then averaged for all students in the teacher's classroom over a 3-year period.

What makes value-added fair? For children. Value-added is fair to students because it bases their projected score only on their prior academic record. That ensures that all children are expected to make progress each year from wherever they start.

What makes value-added fair? For educators. It is fair to administrators and teachers because prior academic achievement data already incorporate the student background characteristics that bias absolute test scores.

Value-Added in a Standards World (Hypothetical Progression) [Chart: student achievement by grade, showing "the standard," a "not losing ground" trajectory from "right now," and "closing the gap through systemic change"]

Value-Added Findings Patterns from the Data

Tennessee Schools and their Value-Added Scores. It is impossible to determine where a school falls just by knowing its location or the make-up of its student body. [Bar chart: number of schools by value-added score, math 1996-97, grouped from <70 to 110+ percent of national norm] 100 (on the horizontal axis) means a year's worth of growth in a year.

Income has no effect on value-added. Cumulative gain of a large East Coast county's schools compared with the percentage of students receiving free and reduced-price lunches. Each dot represents one school. 100 (on the vertical axis) means a year's worth of growth in a year.

Minority status has no effect on value-added. Cumulative gain of Tennessee schools compared with the percent of minority students in the school. Each dot represents one school. 100 (on the vertical axis) means a year's worth of growth in a year.

Teacher Effectiveness [Chart: teacher effectiveness vs. teacher experience (first 10-12 years, second 10-12 years, after 20-24 years), compared with the typical salary schedule]

Value-Added Findings From Tennessee The Teacher Effect

Importance of Teacher Sequence

Probability that a bottom-quartile 4th grade student will pass the high-stakes graduation exam in 9th grade: Poor teacher sequence: <15%. Average teacher sequence: 38%. Good teacher sequence: 60%.

Cumulative Effects of Value-Added. TVAAS grade-level equivalents at three annual growth rates:

Grade           2     3     4     5     6     7     8
75% growth      2   2.75   3.5  4.25    5   5.75   6.5
100% growth     2     3     4     5     6     7     8
140% growth     2    3.4   4.8   6.2   7.6    9   10.4

Grade-level impact by grade 8: -1.5, 0, +2.4. This means a difference of almost 4 grade levels by the end of middle school.
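The arithmetic behind the cumulative-effects table is simple compounding: starting at grade level 2, each school year adds a fixed fraction of a year's growth. A short sketch (the 75%/100%/140% rates and the grade-2 starting point come from the slide; the helper name is ours):

```python
# Reproduce the cumulative-effects table: starting at grade level 2,
# add a fixed fraction of one grade level each school year through grade 8.

def trajectory(start_level, annual_rate, years):
    """Grade-equivalent level after each year at a constant growth rate.
    annual_rate is the share of one grade level gained per year (1.0 = 100%)."""
    levels = [start_level]
    for _ in range(years):
        levels.append(round(levels[-1] + annual_rate, 2))
    return levels

for rate in (0.75, 1.00, 1.40):
    levels = trajectory(2, rate, 6)       # six school years: grades 2 -> 8
    impact = round(levels[-1] - 8, 1)     # vs. being on track at grade 8
    print(f"{rate:.0%} growth: {levels}  grade-level impact: {impact:+}")
```

Running this reproduces the three rows of the table, including the -1.5 and +2.4 grade-level impacts, and the 10.4 - 6.5 = 3.9 gap of "almost 4 grade levels" between the fastest and slowest trajectories.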

Predicting Student Learning Results. Achievement: best predicted by family income. Growth: best predicted by the quality of instruction. Tennessee research shows that teacher effectiveness is the single most powerful predictor of student progress, stronger than income, class size, race, or family educational background.

Reducing class size is a poor policy choice to increase student learning. It ranks 40th among 46 options. Feedback and direct intervention are the highest (effect sizes of 0.81); where the average is 0.40, the effect size of reducing class size is 0.12. Source: John Hattie, Keynote, International Conference on Class Size, University of Hong Kong, May 2005.

Using Value-Added to Inform Instruction

Diagnostics 1 The Focus of Instruction

Shed Pattern [Chart: gain by previous achievement level (low, average, high)] This pattern of high early achievement followed by extremely low value-added is quite common, especially in urban districts.

Reverse Shed [Chart: gain by previous achievement level (low, average, high)] In this pattern, frequently found in suburban districts, the teacher is teaching to the high achievers at the expense of other students.

Tepee [Chart: gain by previous achievement level (low, average, high)] In this pattern, the teacher is teaching right down the middle.

Sustained Growth [Chart: gain by previous achievement level, with 115-120% gains for low achievers (closing the gap), 105% for average achievers, and 100% for high achievers (still making gains)]

Diagnostics 2 The Impact of Instruction

Value-Added: Three Results (using 3-year running averages)
- Above 100%
- No Detectable Difference (NDD) from 100%, one year's worth of growth
- Below 100%
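The three-way classification can be sketched as below. The 3-year running average comes from the slides; deciding "no detectable difference" with a band around 100 reflects the general idea that NDD is a statistical judgment, but the ±5-point band and the classroom data here are purely illustrative, not actual TVAAS thresholds.

```python
# Classify a classroom's value-added from a 3-year running average of
# annual gain scores (100 = one year's worth of growth in one year).
# The +/-5 "no detectable difference" band is illustrative only; real
# systems derive it from the standard error of the estimate.

NDD_BAND = 5.0

def classify(three_year_gains):
    """Average the last three annual gain scores and compare to 100."""
    avg = sum(three_year_gains) / len(three_year_gains)
    if avg > 100 + NDD_BAND:
        return avg, "Above"
    if avg < 100 - NDD_BAND:
        return avg, "Below"
    return avg, "No Detectable Difference (NDD)"

# Hypothetical classrooms: three most recent annual gain scores each
for name, gains in [("Room 12", [112, 108, 110]),
                    ("Room 14", [101, 97, 103]),
                    ("Room 16", [88, 92, 90])]:
    avg, label = classify(gains)
    print(f"{name}: 3-yr avg = {avg:.1f} -> {label}")
```

Averaging over three years, as the slides prescribe, keeps a single unusual cohort from tipping a classroom into "Above" or "Below" on its own.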

Diagnostics 3 Combining the Focus and Impact of Instruction

Example: Four 5th Grade Classrooms [Chart: value-added against the 100% / no detectable difference line in reading, language arts, math, and social studies]

Example: High School English Dept. [Chart: value-added against the 100% / no detectable difference line for English 9, Advanced 9, English 10, Advanced 10, English 11, Advanced 11, English 12, and AP 12]

Tepee Pattern Using Previous Academic Achievement Levels, Example 1 [Chart: value-added against the 100% / no detectable difference line for low, average, and high previous achievers]

Tepee Pattern Using Previous Academic Achievement Levels, Example 2 [Chart: value-added against the 100% / no detectable difference line for low, average, and high previous achievers]

Shed Pattern Using Previous Academic Achievement Levels, Example [Chart: value-added against the 100% / no detectable difference line for low, average, and high previous achievers]

Reverse Shed Pattern Using Previous Academic Achievement Levels, Example [Chart: value-added against the 100% / no detectable difference line for low, average, and high previous achievers]

Value-added provides powerful diagnostic data:
- Identify and improve the focus and impact of instruction
- End the isolation of teachers
- Build learning communities
- Improve data-driven decision making
- Differentiate instruction
- Create student growth trajectories to targets and develop intervention strategies
- Measure the success of schools through growth, not simply achievement

The Tennessee Experience

Tennessee NAEP Scores [Chart: NAEP scale scores, mathematics grade 4, 1992-2005, Tennessee vs. the national average]

The limits of value-added in Tennessee: it may be used in individual teacher evaluations, but may not exceed 8%, and there was a lack of professional development to accompany the statewide rollout.

Value-added assessment is only a thermometer; if we don't analyze the information and use it, nothing happens.

Why collect classroom-level data? The variation in the quality of instruction is much greater within schools than between schools. Struggling students are not randomly distributed across classrooms; they are found disproportionately in classrooms where they receive poor instruction. Classroom-level data lets you deal with underlying causes, not symptoms.

For additional information on our package of reforms, please contact: cgpinfo@pobox.upenn.edu or (215) 746-6478