DATA VALIDATION MONITORING
Kim Gilson, Senior Consultant, Data and Accountability
Region 10 ESC
Kim.gilson@region10.org | 972.348.1480
Today's Objectives:
- Discuss the relationship between Data Validation Monitoring (DVM), Performance-Based Monitoring (PBM), and Program Monitoring and Interventions (PMI)
- Define and understand the purpose and importance of DVM
- Examine the three areas of DVM analysis: Assessment, Discipline, Leavers
- Explore the indicators in the three areas of DVM
- Review the DVM process
THE BIG PICTURE
Performance-Based Monitoring Data Validation
The Performance-Based Monitoring (PBM) system was developed in 2003 in response to state and federal statute. It is a comprehensive system designed to improve student performance and program effectiveness. The PBM system is a data-driven system that relies on data submitted by districts; therefore, the integrity of districts' data is critical. To ensure data integrity, the PBM system includes annual data validation analyses that examine districts' leaver and dropout data, student assessment data, and discipline data. Additional data analyses, including random audits, are conducted as necessary to ensure the data submitted to the Texas Education Agency (TEA) are accurate and reliable.
Program Monitoring and Interventions (PMI) Division
Supports the state's goals for public education by reviewing, evaluating, monitoring, and intervening with campuses and their local education agencies (LEAs, which include districts and charter schools) to ensure excellence in education for all students. The major responsibilities of the division include monitoring and interventions for:
- State accountability
- Federal requirements: bilingual education/English as a second language, career and technical education, No Child Left Behind, and special education program areas, including residential facilities
- Data validation
Interventions and Strategies
Interventions occur through performance-based monitoring (PBM) strategies implemented by the agency. An LEA assigned a stage of intervention for more than one program area must engage in integrated intervention activities. Underlying strategies of the PBM system include a shift away from process toward results (program effectiveness and student performance), a strong emphasis on data integrity, a focus on a coordinated approach to agency monitoring, and more effective sanctions and interventions.
Program Monitoring and Interventions (PMI) [overview diagram]
- State Accountability Monitoring
- Performance-Based Monitoring (PBM): Performance-Based Monitoring Analysis System (PBMAS); Data Validation Monitoring (DVM)
- Federal Requirements: BL/ESL Education; No Child Left Behind (NCLB); Career and Technology Education (CTE); Special Education & Residential Facilities
DATA VALIDATION MONITORING DEFINED
Data Validation Monitoring (DVM)
The DVM system monitors the accuracy of data submitted by school districts through the Public Education Information Management System (PEIMS), as well as data used in the state's accountability ratings and the Performance-Based Monitoring Analysis System (PBMAS). It is implemented by the Program Monitoring and Interventions (PMI) Division for the Student Assessment, Student Discipline, and Student Leaver program areas. On-site reviews are conducted to validate the implementation of intervention activities and the accuracy of data driving the PBM system.
DVM VS PBMAS
Similarities: Both use indicators to determine staging and interventions.
Differences (goals):
- PBMAS: progress toward a standard. A PBMAS performance indicator yields a definitive result; for example, 85% of a district's graduates completed the Recommended High School Program.
- PBM Data Validation: report accurate data each year. A Data Validation indicator suggests an anomaly that may require a local review to determine whether the anomalous data are accurate.
Differences between Student Assessment Data Validation Indicators and PBMAS Indicators:

Student Assessment Data Validation
- Result: Suggests an anomaly
- Standards: Based on the annual review of data to identify anomalous data and trends observed over time
- District Response: Validate accuracy of data locally and, as necessary, improve local data collection and submission procedures or address program implementation concerns

PBMAS
- Result: Yields a definitive result
- Standards: Based on standards established in advance
- District Response: Improve performance or program effectiveness, or, if identification occurred because of inaccurate data, improve data collection and submission procedures
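To make the contrast concrete, here is a minimal sketch (with invented rates and thresholds; TEA's actual staging calculations are defined in its manuals, not here) of a definitive PBMAS-style check against a preset standard versus a DVM-style check that only flags an anomaly for local review:

```python
# Illustrative sketch only: rates and thresholds below are hypothetical;
# TEA does not publish its staging logic in this form.

PBMAS_STANDARD = 0.85      # hypothetical standard established in advance
ANOMALY_THRESHOLD = 0.10   # hypothetical cut-off for an unusual rate

def pbmas_check(completion_rate: float) -> str:
    """PBMAS-style indicator: a definitive result against a preset standard."""
    return "meets standard" if completion_rate >= PBMAS_STANDARD else "below standard"

def dvm_check(district_rate: float, statewide_rate: float) -> str:
    """DVM-style indicator: flags an anomaly that may warrant local review;
    the flagged data may still turn out to be accurate."""
    if district_rate - statewide_rate > ANOMALY_THRESHOLD:
        return "anomaly: validate data locally"
    return "no anomaly"

print(pbmas_check(0.85))                       # meets standard
print(dvm_check(0.18, statewide_rate=0.02))    # anomaly: validate data locally
```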
INTERVENTIONS
Interventions:
The PMI division reviews and follows up with local education agencies (LEAs) identified for potential data inaccuracies, data anomalies, or data irregularities. If noncompliance, student performance, program effectiveness, or data reporting accuracy concerns are identified during the visit, the LEA must undertake actions to address them and may be subject to additional sanctions and interventions.
Statutory Authority: 19 TAC §97.1071(h)
Purpose: DVV visits are conducted to validate and verify implementation of PBM intervention activities.
Selection criteria include, but are not limited to: size and location of the LEA; stages of intervention; and/or years identified for interventions.
To verify and validate implementation, DVV visits include:
- Document reviews;
- Staff interviews; and
- Classroom observations.
DATA VALIDATION MANUALS
Current-Year Manuals (2014)
- 2014 Student Assessment Data Validation Manual (PDF)
- Discipline Data Validation Manual (PDF)
- Leaver Records Data Validation Manual (PDF)
Data Validation Manuals main link: http://tea.texas.gov/pbm/dvmanuals.aspx
DATA VALIDATION PROGRAMS
Student Assessment Data Validation Indicators: Background
The Texas Education Code (TEC) contains two relevant statutory references. TEC §39.057 calls for special accreditation investigations when anomalous data related to reported absences are observed in the administration of the state student assessment program. TEC §7.028 provides specific authority for TEA to monitor Public Education Information Management System (PEIMS) data integrity and accountability under Chapter 39.
Student Assessment Data Indicators
1. (i-xi) STAAR 3-8 Absent Rate (Mathematics)
2. (i-xi) STAAR 3-8 Absent Rate (Reading)
3. (i-xi) STAAR 3-8 Absent Rate (Science)
4. (i-xi) STAAR 3-8 Absent Rate (Social Studies)
5. (i-xi) STAAR 3-8 Absent Rate (Writing)
6. (i-xi) STAAR 3-8 Other Rate (Mathematics)
7. (i-xi) STAAR 3-8 Other Rate (Reading)
8. (i-xi) STAAR 3-8 Other Rate (Science)
9. (i-xi) STAAR 3-8 Other Rate (Social Studies)
10. (i-xi) STAAR 3-8 Other Rate (Writing)
11. TELPAS Reading Absent Rate
12. TELPAS Reading Other Rate
13. (i-v) STAAR EOC Test Participation Rate
14. Discrepancy between PEIMS CTE Status and STAAR EOC Answer Documents
Student Assessment Data Indicator Student Groups:
(i) All Students
(ii) African American Students
(iii) American Indian Students
(iv) Asian Students
(v) Hispanic Students
(vi) Pacific Islander Students
(vii) White Students
(viii) Students with Two or More Races
(ix) Economically Disadvantaged Students
(x) English Language Learners
(xi) Students Served in Special Education
Student Assessment Indicator #13: End-of-Course (EOC) Assessments
I. Algebra I
II. English I
III. English II
IV. Biology
V. U.S. History
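Indicators 1-12 are rate calculations: the share of answer documents coded absent (or "other") for each subject, disaggregated by the student groups listed above. As a hedged illustration only (the score codes, records, and groupings below are hypothetical; the Student Assessment Data Validation Manual defines the real calculations), such a rate computation might look like this:

```python
from collections import Counter

# Hypothetical answer-document records: (student_group, score_code).
# "S" = scored, "A" = absent; real codes and groupings are defined in
# the Student Assessment Data Validation Manual, not here.
answer_docs = [
    ("All Students", "S"), ("All Students", "A"), ("All Students", "S"),
    ("Economically Disadvantaged", "A"), ("Economically Disadvantaged", "S"),
]

documents = Counter(group for group, _ in answer_docs)
absences = Counter(group for group, code in answer_docs if code == "A")

# Absent rate per student group; an unusually high rate relative to the
# state would suggest an anomaly to validate locally.
for group in documents:
    rate = absences[group] / documents[group]
    print(f"{group}: absent rate {rate:.1%}")
```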
Example Indicator Page: Assessment
Discipline Data Validation Indicators: Background
In 1995, the 74th Texas Legislature enacted the Safe Schools Act, which created Disciplinary Alternative Education Programs (DAEPs) and Juvenile Justice Alternative Education Programs (JJAEPs) to serve students who had committed disciplinary offenses. To evaluate district use of DAEPs and JJAEPs and to review the documentation of district-reported discipline information, TEA developed a process for collecting and evaluating discipline data. A record (425 Disciplinary Action Data - Student) was added to the Public Education Information Management System (PEIMS) to obtain the data necessary for these analyses. This record collects both Disciplinary Action Reason Codes and Disciplinary Action Codes in order to capture both the student's conduct and the district's subsequent response.
Discipline Data Validation Indicators:
1. Length of Out-of-School Suspension
2. Length of In-School Suspension (Report Only)
3. Unauthorized Expulsion - Students Age 10 and Older
4. Unauthorized Expulsion - Students under Age 10
5. Unauthorized DAEP Placement - Students under Age 6
6. High Number of Discretionary DAEP Placements
7. African American (Not Hispanic/Latino) Discretionary DAEP Placements
8. Hispanic Discretionary DAEP Placements
9. No Mandatory Expellable Incidents Reported for Multiple Years
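A minimal sketch of the kind of local pre-submission check these indicators suggest, assuming a simplified record layout; the field names, action labels, and day limit below are hypothetical, and the Discipline Data Validation Manual defines the actual rules:

```python
# Hypothetical disciplinary-action records; PEIMS 425 fields are
# simplified and renamed here for illustration.
records = [
    {"student_age": 9,  "action": "EXPULSION", "days": 30},
    {"student_age": 14, "action": "OSS",       "days": 5},
]

MAX_OSS_DAYS = 3  # hypothetical expected limit for out-of-school suspension

def review_record(rec: dict) -> list[str]:
    """Return anomaly flags for one record, mirroring indicators 1 and 4."""
    flags = []
    if rec["action"] == "EXPULSION" and rec["student_age"] < 10:
        flags.append("possible unauthorized expulsion: student under age 10")
    if rec["action"] == "OSS" and rec["days"] > MAX_OSS_DAYS:
        flags.append("out-of-school suspension longer than expected")
    return flags

for rec in records:
    for flag in review_record(rec):
        print(rec, "->", flag)
```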
Example Indicator Pages: Discipline
Leaver Records Data Validation Indicators: Background
Since 1997-1998, the integrity of leaver records has been evaluated annually by TEA through various indicators and data analyses. Statutory requirements have also guided TEA's leaver records data validation efforts. During the 78th Legislature, Regular Session (2003), the Texas Education Code was amended to require an annual electronic audit of dropout records and a report based on the findings of the audit. House Bill 3, passed during the 81st Legislature, Regular Session (2009), maintained the requirement in TEC §39.308.
Leaver Records Data Validation Indicators:
1. Leaver Data Analysis
2. Underreported Students
3. Use of Leaver Reason Codes by Districts with No Dropouts
4. Use of One or More Leaver Reason Codes
5. Use of Certain Leaver Reason Dropout Codes
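As a hedged sketch of one such cross-check (in the spirit of indicator 3), assuming invented records and reason codes; the Leaver Records Data Validation Manual defines the actual analyses:

```python
# Hypothetical leaver records: (student_id, leaver_reason_code).
# Real reason codes come from the PEIMS data standards; these are invented.
leaver_records = [
    ("001", "80"), ("002", "24"), ("003", "24"), ("004", "60"),
]
reported_dropouts = 0  # what the district reported for the year

distinct_codes = {code for _, code in leaver_records}

# Indicator-style check: heavy use of leaver reason codes alongside a
# report of zero dropouts is an anomaly worth validating locally.
if reported_dropouts == 0 and len(distinct_codes) >= 3:
    print(f"anomaly: {len(distinct_codes)} distinct leaver codes used "
          "but no dropouts reported; validate locally")
```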
Example Indicator Pages: Leaver
REPORTS
Data Validation Reports:
- Generated for LEAs identified on one or more indicators
- Available via the Texas Education Agency Secure Environment (TEASE) Accountability application
- LEAs not identified will receive a message that a report is not available
- Only the indicators a district triggers will be listed on the report
SYSTEM & TIMELINES
2014 PBM Data Validation Monitoring Reports
10.31.14 - Leaver Data Validation Reports
11.21.14 - Discipline Data Validation Reports
12.19.14 - Student Assessment Data Validation Reports
Posted on the TEASE Accountability application
2014 PBM Data Validation Staging Dates
1.12.15 - Leaver Data
TBD - Discipline Data
TBD - Student Assessment Data
Staging released on the Intervention Stage and Activity Manager (ISAM)
DVM Leaver Updates
- Staging release in ISAM early in the week of January 12th: stage, staging rubric, indicator workbooks, DVM-CAP guidance
- TAA letter will be released when staging goes up in ISAM (1.12.15)
- DVM-Leaver webpage will go live Friday: indicator workbooks, DVM-L CAP, How Was My District Selected document, guidance
LOOKING BACK AND LOOKING BEYOND
PMI Updates: DVM Key Take-Aways
- Establish written district-wide processes and procedures
- Develop and follow through on annual training
- Design a second verification step before PEIMS submissions
TEAL/TEASE Applications and Program Contacts
- Request the necessary TEAL/TEASE applications: Accountability, ISAM
- TEA Secure Applications Information: http://tea.texas.gov/index4_wide.aspx?id=2684 (link on the top bar of the new TEA website)
IMPORTANT
What Next:
- Assess your LEA's performance with regard to the Key Take-Aways and What's Next.
- Consider which practices and procedures impact DVM.
- Determine whether the accuracy of your data may be affecting performance.
- Identify who is responsible for the various roles that contribute to the DVM process, and what role, if any, you or people within your department have.
- Be aware of DVM timelines and requirements.
- Implement procedures to prevent missing Assessment Leavers in 2015!
- Contact us if you have questions or need support.
Resources
- TEA Program Monitoring and Interventions: http://tea.texas.gov/pmi/
- TEA Data Validation Monitoring: http://tea.texas.gov/student_testing_and_accountability/dvm/
- TEA Performance-Based Monitoring Analysis System: http://tea.texas.gov/student_testing_and_accountability/pbmas/
- R10 Special Education Data Analysis: http://www.region10.org/special-education/data-analysis-specialeducation/
R10 Contact Information
- Student Assessment Data Validation: Kim.Gilson@region10.org
- Student Discipline Data Validation: Kim.Simmons@region10.org
- Student Leaver Data Validation: Lorna.Bonner@region10.org
- Special Education Data Analysis & Accountability: Anna.Griffiths@region10.org