February 10, 2015
Assessing the Success of Specific GEAR UP Components: Lessons Learned from a Rigorous Study in Philadelphia
Presented At: 2015 Capacity-Building GEAR UP Conference
Presented By: Dr. Manuel Gutiérrez, Julia Alemany, Michelle Grimley
Goals for This Session
• Learn about the results from a six-year evaluation of the Philadelphia GEAR UP Partnership initiative, with a focus on two key programmatic components: AVID and college readiness activities.
• Share successful practices, challenges, and lessons learned around data collection, analysis, and reporting, as well as using results to drive programmatic improvements.
The Philadelphia GEAR UP Partnership Initiative
• Need for GEAR UP: only 45% of the students in the original target high schools graduate within four years, and less than half of these graduates enroll in college.
• Six-year federal grant that began in 2009-10 with 6th- and 7th-grade cohorts in 32 middle schools.
• In 2014-15, the program serves over 4,000 11th-grade and 12th-grade students in seven high schools.
• Overseen by the School District of Philadelphia's Office of Academic Support.
• Strong network of partners, including Temple University, Community College of Philadelphia, Philadelphia Youth Network, College Board, White-Williams Scholars, Philadelphia Higher Education Network for Neighborhood Development, AVID, Community in Schools, Trizen, CoolSpeak, Metis, and BAI.
• Staffed by a Project Director, Assistant Director, Program Managers, and Site Monitors.
Program Components
• Academic enrichment and supports
  o Skills building: AVID, STEM, Robotics
  o Tutoring
  o Academic advisement
  o Credit recovery
• Supports for teachers and school staff
  o AVID Summer Institute; Teacher Institute by College Board; trainings by Temple University and others
• College and career preparation activities
  o College Readiness Series
  o PSAT and SAT test prep
  o College visits
  o Financial planning/FAFSA
• Family involvement
  o One-on-one support, workshops, newsletter
Overall Evaluation Design
OUTCOME/IMPACT STUDY
Purpose:
1. Assess impact on academic performance and preparation
2. Assess impact on student and family knowledge about college
3. Assess impact on high school graduation and college enrollment
Data sources and methods:
• Student and parent surveys
• Participation data
• School district student data
  o Dosage analyses to assess the relationship between participation and outcomes
  o Quasi-experimental impact analyses comparing outcomes of GEAR UP students to a rigorously matched comparison group

IMPLEMENTATION STUDY (WITH BRANCH ASSOCIATES)
Purpose:
1. Assess program fidelity
2. Document promising practices, challenges, and lessons learned
3. Identify areas for growth and inform programmatic decisions
Data sources and methods:
• Interviews with project staff and partners
• Case studies in participating schools (e.g., observations, interviews, and focus groups with staff and students)
• School administrator and teacher surveys
• Participation data
AVID
• AVID (Advancement Via Individual Determination) is an elective program for middle and high school students that focuses on various instructional strategies aimed at improving students' college readiness.
• Targets students who are performing in the middle range of achievement, come from low socioeconomic backgrounds, and have a desire to attend college.
• Trained AVID teachers implement the program-specific curriculum centered on the writing, inquiry, collaboration, organization, and reading (WICOR) method.
AVID: Evaluation Methods and Data Sources
AREAS ASSESSED
• Fidelity of implementation (adherence to AVID Implementation Essentials)
• Factors facilitating or hindering implementation
• Impact of AVID on students
  o 21st Century skills
  o Academic performance (standardized test scores, credits earned, Algebra I completion rates)
  o School attendance
  o (In Years 6 and 7) High school graduation and college enrollment

METHODS USED
• Observations
• Interviews and focus groups with school staff and GEAR UP oversight staff
• Focus groups with students
• Student survey
• Teacher survey
• Analyses of attendance/participation data
• Descriptive and dosage analyses of outcome data
• Rigorous impact study using a quasi-experimental design
AVID: Implementation Findings
• In 2013-14, AVID was implemented in all seven high schools, serving 356 tenth- and eleventh-grade students (~10% of the cohort).
• AVID students were more likely to be female, general education students with slightly higher attendance and achievement at baseline, but they still represented a wide range of academic skills at baseline.
• GEAR UP schools adhered to most AVID implementation best practices; two schools met AVID certification standards.
• There is evidence that teachers are using AVID instructional techniques in their content classrooms. In a survey, most teachers reported using critical reading strategies, binders, and Cornell note taking on a daily basis.
• The main implementation challenges are: variations in principal buy-in and support; AVID staff turnover due to staff lay-offs and teacher re-assignments; and rostering students into the elective class.
• Recommendations: Increase staff buy-in, particularly among school principals, to promote AVID-friendly schools. As suggested by teachers, provide more opportunities for teacher collaboration, including observations of experienced teachers implementing AVID.
AVID: Outcome Findings
Student Reports of AVID Impact
"AVID made me more focused on college and making choices; it kind of opened up my horizons."
"This class [AVID] will get me into college."
"AVID helps me stay on top of my work. I went from a D to a B in math because of AVID."
-- AVID Students
AVID: Outcome Findings
[See: The Philadelphia GEAR UP Partnership AVID Factsheet]
Teacher Reports of AVID Impact
"For my kids, [AVID] it's done wonders. I see kids for the first time in 11 years of teaching kind of competing with one another, they're always monitoring their grades, and they're taking responsibility (...). They ask more questions, they're involved in their learning."
"For the students who use the organizational strategies and the study strategies, they do better in class."
"I like that [AVID] it reinforces and teaches the kids personal responsibility, which I think is really important."
AVID: Impact Findings
• Rigorous quasi-experimental design using propensity score matching (PSM) for developing a comparison group.
• PSM is the best available approach to generating a comparable group of non-participants without random assignment.
  o First step: summarize all pertinent characteristics observed prior to treatment (i.e., the matching variables) into a single score (i.e., the propensity).
  o Second step: participants are matched to one or more comparison students based on similar propensity scores.
• Matching variables: baseline reading and math achievement, grade level, age, gender, race/ethnicity, free/reduced price lunch (FRL) eligibility, English language learner (ELL) and disability status, and previous average daily attendance.
• Two-level hierarchical linear modeling was used to account for the clustering of students within schools. (A minimal sketch of this two-step approach appears below.)
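To make the two steps concrete, here is a minimal illustrative sketch in Python. It assumes a hypothetical analysis file (students.csv) with one row per student, a 0/1 AVID participation indicator, numeric baseline covariates (a subset of the matching variables listed above), a school identifier, and a credits-earned outcome. The estimators shown (scikit-learn logistic regression for the propensity score, nearest-neighbor matching, statsmodels MixedLM for the two-level model) are stand-ins chosen for illustration, not necessarily the tools used in the study.

# Minimal sketch: propensity score estimation, nearest-neighbor matching, and a
# two-level (student-within-school) mixed model. All file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical: one row per student

# Step 1: summarize pre-treatment characteristics into a single propensity score
covariates = ["read_base", "math_base", "grade", "age", "female",
              "frl", "ell", "iep", "attendance_base"]  # subset of the matching variables
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["avid"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# Step 2: match each AVID student to the nearest non-participant on the propensity score
treated = df[df["avid"] == 1]
control = df[df["avid"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = control.iloc[idx.ravel()]
analytic = pd.concat([treated, matched], ignore_index=True)

# Two-level model: random intercept for school; the AVID coefficient is the impact estimate
hlm = smf.mixedlm("credits_earned ~ avid + read_base + math_base",
                  data=analytic, groups="school_id").fit()
print(hlm.summary())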
AVID: Impact Findings
Year 1 (2009-10)
  Outcomes assessed: None
  Results: N/A
Year 2 (2010-11)
  Outcomes assessed: Grade 7 and 8 students: PSSA reading scores, PSSA math scores
  Results: Positive, significant impact on PSSA reading scores (Grade 8)
Year 3 (2011-12)
  Outcomes assessed: Grade 8 students: PSSA reading scores, PSSA math scores; Grade 9 students: Algebra I completion rate
  Results: Positive, significant impact on PSSA reading scores (Grade 8) and credits earned (Grade 9)
Year 4 (2012-13)
  Outcomes assessed: Grade 9 students: Algebra I completion rate; Grade 9 and 10 students: credits earned, school attendance
  Results: Positive, significant impact on credits earned (Grades 9 and 10) and school attendance (Grades 9 and 10)
Year 5 (2013-14)
  Outcomes assessed: Grade 10 and 11 students: credits earned, school attendance
  Results: Positive, significant impact on credits earned (Grades 10 and 11) and school attendance (Grades 10 and 11)
Year 6 (2014-15)
  Outcomes assessed: Grade 11 and 12 students: credits earned, school attendance; Grade 12 students only: high school graduation, college enrollment
  Results: To be completed in fall 2015
AVID: Impact Findings
[Chart: Impact of AVID on Student Outcomes (2013-14). An asterisk denotes a statistically significant difference at the .05 level.]
College Readiness Programming
Multiple opportunities to participate:
• College advisement
• College week
• College Readiness Series
• PSAT and SAT test prep
• College visits
• Financial planning/FAFSA
• Parent advisement and workshops
• Other
College Readiness Programming: Evaluation Methods and Data Sources
AREAS ASSESSED
• Implementation successes and challenges
• Progress towards meeting project benchmarks
• Impact on students' college readiness:
  o Self-reported knowledge about the college selection, application, and financial aid process
  o Completion of steps, such as taking college access tests, FAFSA applications, etc.

METHODS USED
• Interviews and focus groups with school staff and GEAR UP oversight staff
• Focus groups with students
• Student survey
• Parent survey
• Analyses of attendance/participation data
• Descriptive and dosage analyses of outcome data
College Readiness: Implementation Findings
In 2013-14, 3,593 students (about 91% of the GEAR UP cohort) took part in one or more college preparation activities, with an average of 12 hours of programming per student.
[Chart: Type of College Readiness Activities]
College Readiness: Implementation Findings
• School-wide events were extremely useful for promoting a college-going culture in the schools, engaging a large number of students in activities, and energizing both students and teachers.
• College readiness activities, and specifically college visits, allowed students to build stronger relationships with GEAR UP staff.
• Visits to colleges/universities and the College Knowledge Bowl were the favorite GEAR UP activities among student focus group participants.
• Challenges included:
  o Difficulty finding teachers for SAT/ACT preparation classes; poor attendance in some of these classes, particularly those held after school
  o Low parent attendance at parent events and workshops on college readiness
Key Outcome Findings: Dosage Analyses
Students with higher GEAR UP dosage (particularly those with 20+ hours) in college preparation activities had:
• higher educational expectations,
• stronger beliefs in the importance of postsecondary education,
• increased knowledge of the cost of college,
• increased likelihood of planning to take college access tests,
• increased learning about postsecondary options and financing.
(A simplified sketch of this kind of dosage analysis appears below.)
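As a companion to the findings above, the following is a simplified sketch of how participation hours could be banded and related to a survey outcome. The file name and columns (cr_hours for college readiness hours, plans_sat as a 0/1 indicator of planning to take the SAT, read_base as a baseline control) are hypothetical, not the study's actual variables.

# Simplified dosage-analysis sketch: band students by hours of college readiness
# programming and compare a survey outcome across bands. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_and_participation.csv")  # hypothetical merged survey + participation file

# Dosage bands (0, 1-9, 10-19, 20+ hours)
bins = [-1, 0, 9, 19, float("inf")]
labels = ["0 hrs", "1-9 hrs", "10-19 hrs", "20+ hrs"]
df["dosage_band"] = pd.cut(df["cr_hours"], bins=bins, labels=labels)

# Descriptive comparison: share planning to take the SAT, by dosage band
print(df.groupby("dosage_band", observed=True)["plans_sat"].mean())

# Regression version: the same outcome on hours, adjusting for baseline reading achievement
model = smf.logit("plans_sat ~ cr_hours + read_base", data=df).fit()
print(model.summary())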
[Chart: Relationship between College Programming and Students Taking College Access Tests]
[Chart: Relationship between College Programming and Students' Self-Reported Learning]
Key Outcome Findings: Survey Analyses
Parents of students who attended a GEAR UP school the previous year were more likely than their peers to report that:
• They have spoken to someone about college entrance requirements,
• They have spoken to their child about college,
• They have enough information about college requirements and financial aid,
• They are knowledgeable about the costs and benefits of attending college.
Key Outcome Findings: Survey Analyses
[Chart: Differences between Parents of Children who Attended a GEAR UP School Last Year and Parents of Children who Did Not]
Lessons Learned
Challenge: it is very difficult to tease out the effectiveness of specific components/services, especially when students may be receiving multiple GEAR UP interventions.
• Select robust components (e.g., high intensity, frequency, and/or duration) for this type of analysis
• Assess both implementation and outcomes
• Seek input from multiple stakeholder groups
• Collect data through a variety of methods: observations, interviews, focus groups, surveys, school records, and extant data
• If possible, include a comparison group of non-GEAR UP participants to avoid potential contamination effects
• Consider carefully which outcome measures should be used to assess impact (commensurate with the type of intervention assessed)
Lessons Learned
Challenge: collecting accurate, complete, timely participation data.
• Use a web-enabled, user-friendly database to track student, teacher, and parent participation
• Identify staff (school-based or GEAR UP) who will be responsible for data entry; communicate expectations early (e.g., as part of the job description)
• Establish accountability measures for data entry (e.g., monthly data entry reports and meetings)
• Increase buy-in by encouraging staff to use participation data to inform programmatic decisions at the school, activity, and student level (e.g., through user-friendly embedded database reports)
• Create a master document of activities and corresponding APR categories so there is consistency across data entry staff and schools
• NEW: Set specific implementation benchmarks for each partner and school and produce ongoing progress reports on those benchmarks (a small reporting sketch follows this list)
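The NEW benchmark item above lends itself to a simple automated report. Below is a hedged sketch of one way to roll up participation hours by school and activity and compare them against targets; the file names, column names, and benchmark structure are hypothetical and not the project's actual database schema.

# Sketch of a benchmark progress report: sum participation hours by school and
# activity, then compare against hypothetical targets.
import pandas as pd

participation = pd.read_csv("participation_log.csv")  # hypothetical: student_id, school, activity, hours
benchmarks = pd.read_csv("benchmarks.csv")            # hypothetical: school, activity, target_hours

progress = (participation
            .groupby(["school", "activity"], as_index=False)["hours"].sum()
            .merge(benchmarks, on=["school", "activity"], how="left"))
progress["pct_of_target"] = (progress["hours"] / progress["target_hours"] * 100).round(1)

print(progress.sort_values(["school", "activity"]))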
Lessons Learned
Challenge: ensuring high response rates from surveys.
Student survey
• Multiple administration methods (during class, advisory periods, at large events)
• Explicitly identifying survey administration as a responsibility of program staff
• Sharing results with the schools to promote buy-in and involvement (school-based student survey reports)
Parent surveys
• Multiple administration methods (mailings, online, take-home with students, at large school events)
• Enlist the help of the school-based parent liaisons
• Use of incentives, if possible
NEW: Involve key stakeholders (e.g., school staff) in the development of instruments; include questions provided by them
Lessons Learned
Challenge: effectively using data to drive programmatic and policy decisions; evaluation results need to be timely, useful, and reported in a format appropriate for the intended audience.
• For program staff: frequent participation reports, quarterly evaluation meetings and reports, annual comprehensive reports
• For partners: partner meetings
• For USDOE: APRs, biennial reports, final performance report
• For the school community (school staff, students, and families): school-based survey reports, factsheets, newsletter
• For elected officials and policy makers: factsheets, personal stories, hearings and meetings
• NEW: Comprehensive school-based reports to be shared with (and used by) school-based teams; in-person presentations/discussions
• NEW: GEAR UP factsheet or policy brief to be shared with a larger audience
Questions
Contact Information
Manuel Gutiérrez, Ph.D., Senior Research Scientist, Metis Associates: mgutierrez@metisassoc.com
Julia Alemany, Senior Associate, Metis Associates: jalemany@metisassoc.com
Michelle Grimley, Assistant Director, GEAR UP, School District of Philadelphia: megrimley@philasd.org