Ohio Department of Education Teacher Evaluation System (OTES) Pilot CSP 904912


Ohio Department of Education Teacher Evaluation System (OTES) Pilot CSP Final Report, June 29, 2012

Final Report for CSP 904912

PROJECT: ODE Teacher Evaluation System (OTES) Pilot
CLIENT REFERENCE NUMBER: CSP
MGT PROJECT NUMBER: MGT
MGT PROJECT MANAGER: Susan Zoller
CONTRACT START DATE: November 2011
PERIOD OF PERFORMANCE: November 2011 to June 2012
DATE OF REPORT: June 30, 2012

1.0 Executive Summary

Background

MGT of America, Inc. (MGT) conducted a comprehensive, mixed-methods evaluation study of the four teacher evaluation options being implemented in the Ohio Teacher Evaluation System (OTES) Pilot Program during the 2011-12 school year. All districts in the state were invited to participate in the pilot; 139 were selected. As part of its application for participation, each district identified which of the four approved teacher evaluation models it planned to implement. The four options included:

1. OTES model
2. OTES model with locally-developed student growth measures
3. Local evaluation system aligned to OTES model (e.g., Danielson, Marzano, other)
4. Local evaluation system aligned to OTES model (e.g., Danielson, Marzano, other) with locally-developed student growth measures

The evaluation examined the implementation and impacts of the four OTES options to inform further refinement of the OTES model. The research questions included the following areas:

- Implementation of the selected model
- Impact on teacher effectiveness and behavior
- Impact on administrative behavior and school/Local Educational Agency practices, policies, and procedures
- Impact on student achievement
- Sustainability
- Best practices in teacher evaluation

Methodology

MGT gathered data from the 139 Local Educational Agencies (LEAs) participating in the OTES pilot using online surveys in January 2012 (mid-year through the pilot school year) and April 2012 (summative). The online surveys were distributed by the Ohio Department of Education to all 659 participants: teachers, teacher evaluators, and administrators. The initial survey had a 49% response rate; the summative survey had a 44% response rate. Responses were received from at least one person at each LEA.

In addition to the surveys, MGT conducted site visits to a stratified random sample of 12 LEAs selected based on the evaluation model they had chosen, their geographic location (NE, NW, SE, SW, urban, and central), and the type of LEA (Exempted Village, City, Local, Community School, and ESC). MGT staff visited the selected sites during April and May 2012, conducted a total of 75 interviews with participating and non-participating teachers and principals, and reviewed pre-identified documentation of implementation. MGT staff also interviewed the superintendent in each LEA. MGT reported findings from the initial survey in the February Mid-Term Report and the case study data in the April Quarterly Report.

Findings and Recommendations

IMPLEMENTATION: All LEAs were implementing at least some of the components of the OTES model, including nearly 100% implementation of the pre- and post-observation conferences. However, some aspects of the OTES model had limited implementation. For example, only about 32% reported using student growth information and only 35% reported holding a year-end summative conference. Nearly 20% of the LEAs initially indicated that they planned to develop their own local model, but several had instead chosen to implement the OTES model. Ninety-one percent of those who answered the survey indicated that their LEA did not make any significant modification while implementing the OTES model.
Implementation of this new model had an unanticipated positive impact on the frequency and level of interactions between teachers and administrators. Nearly all participants interviewed mentioned this as a positive effect.

RECOMMENDATIONS:

- Adopt all components, at least provisionally, and leave them in place for a second pilot year.
- Make the tools, forms, and structure available through a statewide online system of support that helps both teachers and evaluators manage all the parts and pieces.
- Create a clear and simple flow chart showing activities on a sample timeline, a "year-at-a-glance."
- Provide clear documentation to identify what is required and what is recommended as best practice. Samples of each should be included, both strong and weak examples, to support the goal of transparency and improved teacher and evaluator performance.

- Provide ongoing, in-depth, and accessible professional development.
- Improve face validity of the system by ensuring that the system is fair, equitable, and reliable for all teachers.
- Present the summative teacher evaluation ratings in actionable terms that provide guidance for decision-making about classroom practices and professional development needs.
- Conduct another pilot during the 2012-13 school year.

IMPACT ON TEACHER BEHAVIOR AND EFFECTIVENESS: There was mixed impact on teacher behavior identified through this study. Only half of those responding indicated that analysis of student data, multiple assessments, and the focus on professional growth had an impact on instructional practice. The most frequently identified impacts came from teacher understanding of SMART goals (goals that are Specific, Measurable, Attainable, Relevant, and Time-Bound), collaboration and communication between teachers and their building administrators, and alignment between SMART goals and classroom activities.

RECOMMENDATIONS:

- Maintain the use of SMART goals as an expected/required component of the system.
- Provide professional development aimed at using data, developing multiple assessments, and differentiating instruction to impact instructional practice.

IMPACT ON ADMINISTRATIVE BEHAVIOR AND SCHOOL/LEA PRACTICES: None of the LEAs reported making changes to their policies or procedures based on the OTES pilot. However, many reported that they were waiting until the model was finalized before they would make any adjustments. LEAs also reported not making changes to their union contracts, except for some Memoranda of Understanding (MOUs) put in place for the pilot year. Few of the LEAs reported having a comprehensive communications plan to ensure that non-participants were informed about pilot activities and plans. Participating teachers reported that their colleagues' reactions ranged from curious to nervous.
Some teachers expressed concern about having other teachers become trained evaluators, indicating that they did not think peer evaluation was appropriate. At least one LEA had specific contract language preventing teachers from serving in this capacity. Several principals thought peer evaluation might be a way to reduce the burden on administrators.

RECOMMENDATIONS:

- Invite Ohio teacher, administrative, and school board organizations to assist in the development of model policies and procedures or contractual language.

- Invite input or feedback from administrators and teachers on ways to differentiate evaluations based on teacher performance to reduce the perceived burden of implementing this model with all teachers.

IMPACT ON STUDENT ACHIEVEMENT: Although one goal of this study was to review the impact on student achievement, there were significant barriers to gathering data in support of the research questions. During the pilot year, only teachers working in grades 4-8 in reading or mathematics had access to value-added data, and only a few of the participating teachers fell into that group. Fewer than 35% of participants reported analyzing student growth data as a component of the evaluation.

RECOMMENDATIONS:

- Conduct an analysis of the impact of the OTES model on student achievement once the model has been implemented for long enough to show trends and draw conclusions.
- Ensure that student growth scores are based on more than one year and have, to the degree possible, factored out the effects of external conditions impacting student learning.

SUSTAINABILITY: Many participants expressed concern about the time commitment required to conduct evaluations using the new model. Principals who were using OTES with only one or two teachers doubted their capacity to conduct that level of discussion with all members of their staff, given all their other obligations. Participating teachers and principals both arrived for site visit interviews with large file folders or stacks of documents. There is a need for electronic forms and support structures.

RECOMMENDATIONS:

- Develop an electronic system of support for the OTES model that supports all phases of the process and allows the uploading of artifacts or evidence in a variety of formats by both the teacher and principal.
- Provide ongoing, accessible professional development in how to use the electronic system of support.

REVIEW OF BEST PRACTICES IN TEACHER EVALUATION: MGT conducted a literature review of current teacher evaluation activities and practices.
Recent rapid changes in teacher evaluation practices have been prompted by federal and state policy initiatives. Before 2010, only four states used student achievement as a predominant influence in how teacher performance was assessed. By 2011, 13 states did so, and an additional 10 states included student performance as a small percentage of teacher evaluations. Many states, including Ohio, have developed or adopted value-added measures (VAM), a term applied to a range of approaches that vary in their data requirements and statistical complexity. Many have also tied compensation to value-added scores. However, researchers disagree on the appropriate use of VAM.

There is significant debate on how to evaluate teachers, but significant agreement that teacher effectiveness can and should be evaluated. However, teacher evaluation is best grounded in criteria over which teachers have control: setting SMART goals; regularly measuring and monitoring the learning progress of students; adjusting classroom practice to address gaps in learning or to accelerate learning through differentiated instruction; engaging in professional development aligned with needs assessment results; and collaborating with support staff, other teachers, and administrative leaders to address the needs of all students.

RECOMMENDATIONS:

- Avoid using VAM data for high-stakes decisions about an individual teacher's effectiveness, and include VAM data as only a small percentage of the evaluation.
- Conduct rigorous pilot studies of how the results of various value-added statistical models correlate with other measures of student growth and teacher effectiveness.
- Focus at least as much attention on the alignment of standards and professional development for both teachers and administrators.

2.0 Overview of Research Project

This project used a comprehensive, mixed-methods approach to study the Ohio Teacher Evaluation System (OTES) Pilot Program during the 2011-12 school year. The evaluation examined the implementation and impacts of the four OTES options to inform further refinement of the OTES model and strategies for scaling up valid and reliable teacher evaluation approaches that are relevant and useful to teachers and principals at Local Educational Agencies (LEAs), as well as to inform the overall Ohio Human Capital Management System for educating today's youth.

The evaluation explored the relevance and usefulness of the OTES model for guiding LEAs and the Ohio Department of Education (ODE) in the implementation of high-quality teacher evaluation approaches aligned with the Ohio Standards for the Teaching Profession and best practices in measuring teacher quality. The evaluation used the Ohio Standards for the Teaching Profession as the basis for defining effective teaching, and examined the impacts of using OTES options on teachers' professional growth and development, administrative behavior, and school/LEA processes.

This report includes data from a summative survey of participants conducted in April 2012 and summary findings for the study as a whole. The formative and summative feedback from the evaluation will identify and describe accomplishments, impacts, and challenges of the OTES Pilot Program options that LEAs piloted during the school year.

3.0 Methodology

This research project included data gathered through two online surveys, site visits to selected LEAs and charter schools, and a literature review of best practices in teacher evaluation. Each approach is described in detail below.

Online Survey Tools

MGT conducted a formative survey of all participants in the OTES Pilot Program in January 2012. All of the participants from 139 LEAs were surveyed. The initial survey emphasized answering the research questions for implementation, with some attention to impacts on teacher effectiveness and administrative behaviors. The summative survey was conducted in April 2012 to fully address the research questions and identify any changes over time. MGT analyzed the data from both surveys using descriptive statistics in SPSS and Excel. Open-ended questions were analyzed using qualitative coding, which identified common trends and concerns among pilot LEAs. The spring follow-up survey used indicators from the initial survey coding to gather quantitative data to further examine the trends and concerns that emerged in the formative survey data.

The data from the initial survey included the role of the respondents, and further questions were differentiated based on that role. Teacher questions covered their perceptions of principals' knowledge of the new evaluation methods, the amount and source of training on the new methods, and the range of teacher evaluation tasks completed to date. Administrator questions covered changes made or planned in policies, procedures, or union contracts, the amount and source of training, and the range of teacher evaluation tasks completed to date. The survey was intended to provide baseline data for evaluation of the OTES pilot project as well as background information needed for the selection of case study districts.
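The response rates cited for the two surveys follow directly from the participant counts given in this report; a minimal arithmetic sketch (no assumptions beyond whole-number rounding):

```python
# Participant counts as reported: 659 pilot participants surveyed,
# 325 initial-survey responses, 288 summative-survey responses.
TOTAL_PARTICIPANTS = 659
INITIAL_RESPONSES = 325
SUMMATIVE_RESPONSES = 288

def response_rate(responses: int, surveyed: int) -> int:
    """Response rate as a whole-number percentage."""
    return round(100 * responses / surveyed)

print(response_rate(INITIAL_RESPONSES, TOTAL_PARTICIPANTS))    # 49
print(response_rate(SUMMATIVE_RESPONSES, TOTAL_PARTICIPANTS))  # 44
```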
The survey was designed to gather information from each of the 139 LEAs in the pilot to ensure that each participating principal, teacher, and district and union representative at each site had an opportunity for input and feedback. The initial survey had a 49% return rate (325/659 pilot participants).

The data from the summative survey included questions regarding the implementation of OTES; the effect of OTES on teacher, evaluator, and student behaviors; and the impact on LEA administration. The survey was intended to provide summative information regarding all aspects of the OTES pilot. The summative survey had a 44% return rate (288/659 participants). Of the 288 LEA participants who answered the survey, 268 (93%) said they were directly involved with implementing the OTES in their building or district. The remaining 20 (7%) participants were not directly involved in implementing the OTES. For those not involved with OTES implementation, the survey skipped to the last survey question, about alignment between the OTES model and organizational goals. The summative survey results included in this report are only from those directly involved in implementing the OTES model during the pilot year. Survey participants identified their roles as follows:

- 30% identified their role as building principals not involved in evaluations
- 18% were evaluators, including principals
- 17% were district administrators
- 30% were classroom teachers
- 2% were union leaders
- 3% were various program coordinators

The design of both sets of survey questions was informed by analysis of the new Ohio teacher evaluation criteria and methods, the ODE's fall 2011 training survey results, and a literature review of best practices in teacher evaluation. The questions from both the initial and summative surveys are located in Appendix A.

Case Study Site Visits

The case studies examined the relevance and usefulness of the OTES model for guiding LEAs' implementation of high-quality teacher evaluation methods aligned to the Ohio Standards for the Teaching Profession and best practices in measuring teacher quality. Twelve case study sites were selected from a stratified random sample of OTES pilot sites. The types of districts and geographic regions in the statewide distribution of case study sites are shown in Exhibit 3-1.

Exhibit 3-1
Type and Geographic Region of Case Study Sites

OTES - Student Growth Measures:
- Case Study Site 1: Exempt Village, Northeast
- Case Study Site 2*: City, Urban
- Case Study Site 3: Local, Northwest
- Case Study Site 4: Local, Central
- Case Study Site 5: Local, Northeast
- Case Study Site 6: Exempt Village, Northwest

OTES - No Student Growth Measures:
- Case Study Site 7*: City, Southwest
- Case Study Site 8: City, Urban
- Case Study Site 9: Education Service Center, Northeast
- Case Study Site 10: City, Northwest
- Case Study Site 11: Local, Southeast
- Case Study Site 12: Community School, Southeast

Source: MGT of America, Inc., 2012. *These two sites included locally-developed evaluation system components in their OTES pilot.
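The stratified random selection described above can be illustrated with a short sketch. This is a conceptual example only: the LEA records and pool below are made up, and grouping by a single key simplifies the actual multi-factor stratification (evaluation model, geographic region, and LEA type) used in the study.

```python
import random
from collections import defaultdict

# Hypothetical candidate pool; the real selection drew from all 139 pilot LEAs.
pilot_leas = [
    {"name": "LEA-A", "model": "OTES - Student Growth Measures", "region": "Northeast", "type": "City"},
    {"name": "LEA-B", "model": "OTES - Student Growth Measures", "region": "Central", "type": "Local"},
    {"name": "LEA-C", "model": "OTES - No Student Growth Measures", "region": "Southeast", "type": "Community School"},
    {"name": "LEA-D", "model": "OTES - No Student Growth Measures", "region": "Northwest", "type": "City"},
]

def stratified_sample(pool, stratum_key, n_per_stratum, seed=2012):
    """Randomly draw up to n_per_stratum records from each stratum."""
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    strata = defaultdict(list)
    for lea in pool:
        strata[stratum_key(lea)].append(lea)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

# One site per evaluation model from this toy pool:
sites = stratified_sample(pilot_leas, stratum_key=lambda d: d["model"], n_per_stratum=1)
print(len(sites))  # 2 (one per model stratum)
```

Stratifying before sampling guarantees that every category of interest appears in the case-study set, which a simple random draw of 12 sites from 139 LEAs would not.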

Two evaluators from MGT conducted site visits at the 12 LEAs between March 26 and April 5, 2012. The evaluators conducted a total of 75 interviews, reviewed and gathered documents that had been previously requested, and reviewed teacher evaluations during the site visits. No teacher evaluation documents were copied or taken by MGT. Data were organized for analysis based on protocols developed by MGT to address the research questions. Responses from interviewees were coded to identify issues and patterns across locations.

Best Practice Review

MGT conducted a review of the available literature to describe best practices in teacher evaluation. The review included three components: trends in teacher evaluation methodology, perspectives on value-added measures (VAM) from measurement and research experts, and information regarding merit pay and bonus incentives. The complete best practice review, including bibliography, is located in Appendix B.

Research Questions

This research was based on a set of questions identified by ODE. The questions dealt with the implementation of OTES (how well or to what degree OTES was implemented during the pilot year), the impact of OTES (what effect OTES had on the teachers, administrators, and other evaluators who implemented the model), the impact on student achievement (what effect OTES had on student learning), and the impact on LEA policies or practices. Additionally, the research included questions dealing with the sustainability of the OTES model and a review of the literature regarding best practices in teacher evaluation. The research questions are defined below:

1. Implementation: Review the ongoing implementation of the pilot in the selected schools to identify successes and areas in need of improvement. This includes sub-questions such as:
- To what extent were teachers, administrators, and union leaders involved in the design and implementation?
- What is the fidelity in relation to the project plan?
- To what extent were comprehensive communication plans developed and successfully utilized?
- What were the best practices of the most effective implementers?

2. Impact on Teacher Effectiveness and Behavior: Report the pilot program's impact on effectiveness and behavior as measured by student achievement and value-added measures. This includes changes in individual instructional practices and levels of embedded change within LEAs. This includes sub-questions such as:
- What student achievement and growth measures were used?

- What were the intended and unintended consequences on instructional practices?

3. Impact on Student Achievement: Examine impact on student achievement.

4. Impact on Administrative Behavior and School/LEA Processes: Examine impact at the school and LEA level. Sub-questions may include:
- Have LEA policies and procedures changed?
- To what extent has the pilot evaluation model impacted professional development?
- What is the nature and degree of alignment of organizations' process and performance outcomes across school and LEA?

5. Sustainability: Examine the sustainability of the evaluation system.

6. Best Practices: Monitor and review research and practices in other states and districts to make a summary and recommendations for future refinement of the project.

Each of these research questions was explored during the course of this study. Several questions were explored using multiple methodologies to understand the issue from various perspectives, to validate responses, or to review implementation. The matrix below (Exhibit 3-2) shows the sources of data used to answer each research question.

Exhibit 3-2
Research Question Data Sources
(Matrix mapping each of the six research questions above to the data sources used to answer it: the initial survey, the case studies, the literature review, and the final survey.)
Source: MGT of America, Inc., 2012.

4.0 Findings and Recommendations

This section of the report provides a summary of the findings gathered from the initial survey and case study site visits that were previously reported in the February 2012 and April 2012 reports. New to this report are the data from the summative survey.

Initial Survey Findings Summary

The initial survey was conducted in January 2012 via an electronic survey tool. ODE sent the survey tool to all OTES participants. In total, 325 of approximately 659 pilot participants completed the survey (49%). Of the 139 districts participating in the OTES pilot, at least one person from 123 of those districts completed the survey (89%). Approximately 59% of survey respondents work in rural school districts, 25% in suburban districts, and 17% in urban settings.

Implementation

Survey respondents reported a high level of engagement and participation in the design and implementation of the new teacher evaluation methods. Ninety-nine percent of building principals are actively involved, and 95% of them serve as the primary teacher evaluator. They reported focusing mostly on implementing the teacher performance component, which is only 50% of the new evaluation methods. Most had not yet addressed the other 50% of the evaluation criteria pertaining to value-added student data. They expressed concern about the standardized use of student data across all teachers, content areas, and student levels.

Most of the participating LEAs reported conducting formal classroom observations in accordance with the OTES implementation timeline; however, only about two-thirds of them had conducted the post-observation conference meeting or provided a written observation report. They also reported that the new teacher evaluation methods were overwhelmingly time-consuming and therefore not feasible to fully implement without the creation of full-time evaluation staff at the building level, especially for those working in rural or small districts.
These results suggest the need to make modifications to OTES so that it is more adaptable to each school's culture.

Impact on Teacher Behavior and Effectiveness

Approximately 65% of those who answered the survey reported successes as a result of implementing the new teacher evaluation method. The biggest successes included gaining an appreciation for how the new teacher evaluation methods foster more reflective practice among teachers and enhance communication between teachers and evaluators, who in most cases are building principals. These successes help focus instructional practices on SMART goals and increase accountability for meeting standards and the new teacher evaluation criteria. Pilot sites also reported some success with improving evaluation measurement tools and uses of data in the instructional decision-making process. These types of successes can have a positive impact on instructional practices and student learning.

Impact on Administrative Behavior and School/LEA Processes

Survey respondents saw value in the theory behind OTES but were concerned about its current design, which was perceived to be too complex and time consuming, two factors that make it difficult to roll out in its current format. They generally thought the teacher performance component, which constitutes 50% of the criteria, was valuable; however, some components of the rubric are difficult to interpret. Many questioned the validity of weighting student growth at 50% and had unanswered questions about how to measure student growth in a fair and reliable manner for all teachers in a building or district. Some principals reported disagreeing with the teacher evaluation results produced when they applied the criteria.

Sustainability

Those responding to the survey expressed widespread concern about the complexity of the evaluation process and the length of time needed to complete forms, which indicated a need to simplify the method and make it more user-friendly. Suggestions for improving sustainability include:

- Adjusting the process to better match the daily work flow of principals and teachers.
- Simplifying steps to reduce the time between steps and provide more timely feedback.
- Simplifying the forms to reduce paperwork.
- Considering different levels of evaluation for different teachers, e.g., using the piloted method only for teachers whose performance is marginal or problematic.
- Aligning the criteria more closely with teachers' areas of influence, e.g., content areas aligned with valid student assessments for all teachers.
- Delineating student-control factors from teacher-control factors.

Case Study Findings Summary

The case studies examined the relevance and usefulness of the OTES model for guiding LEAs' implementation of high-quality teacher evaluation methods aligned to the Ohio Standards for the Teaching Profession and best practices in measuring teacher quality.
Between March 26 and April 5, 2012, two evaluators from MGT conducted site visits at 12 LEAs that were selected from a stratified random sample of OTES pilot sites. The types of districts and geographic regions had been determined through a review of all districts participating in the pilot and based on the percentage of such districts in the state.

Implementation

LEAs were giving most of their attention to implementing the teacher evaluation components, identifying how well teachers were meeting standards 1-5 through formal observations, post-observation progress conferences, written observation reports, teachers' use of self-assessment tools, and SMART goal setting. The least attention had been given to evaluating teachers' collaboration, communication, and professionalism for meeting standards 6-7, and to documenting informal walkthroughs.

During case study visits, MGT staff reviewed documentation as evidence of implementation of OTES. Written teacher evaluations were the most prevalent type of documentation, with 75% of LEAs able to show documents as evidence of their OTES pilot efforts. At 67% of the case study sites, LEAs had implemented procedures related to the new teacher evaluation process. At 75% of the LEA sites where documentation was not available, administrators explained that a comprehensive communication plan would be created only after the OTES guidelines were fully developed at the state level and adopted by their local school boards.

During site-based interviews, teachers and principals reported spending more time this pilot year than in previous years communicating during one-on-one conferences about teachers' instructional practices, strengths, and areas for improvement. Teachers found high value in completing the self-assessment and including it as part of the discussion with their evaluator. Teachers and principals reported a clearer, shared understanding of the teacher evaluation criteria based on in-depth examination of the OTES rubric and individual teachers' rubric scores following formal observations. In this regard, OTES is fostering more meaningful communication about teacher effectiveness between teachers and principals.

Impact on Teacher Behavior and Effectiveness

During interviews, teachers participating in the pilot and non-participants alike verbalized a general anxiety stemming from confusion about the definition and appropriate interpretation of value-added measures (VAM) and student growth scores. Teachers and some administrators expressed concern about how VAM scores relate to teachers' performance scores and how much weight VAM scores will carry in decisions about teachers' employment and pay scales under the OTES model.
Evaluators found a general lack of understanding among those at LEAs about the state of the art of VAM metrics, which are undergoing intense research and development. Interviewees expressed concern that the OTES model does not provide clear guidelines for the VAM formula or the sources of value-added scores. There is a lack of understanding that student growth scores can include metrics for school and learner characteristics and measures for groups of teachers who collaborate in teaching student cohorts. There also is a lack of understanding that multiple years of data are compiled into VAM scores. The confusion about VAM and student growth at LEAs is not surprising, given the current widespread technical debates among policymakers and measurement experts.

The state's plan to allow individual LEAs to define the VAM used in their setting drew both concern and relief. Those who expressed concern mentioned that it might not be fair and equitable across the state: teachers in one area might be held to a higher or different standard than those in a neighboring district. Those who expressed relief indicated that they would rather be held to locally-defined standards than statewide standards that might not be appropriate for their populations. Teachers and administrators in both groups were unsure how the student growth measures were going to be developed in their LEA and were very concerned about having salaries tied to the new evaluations.

Impact on Student Achievement

The case studies investigated the extent to which LEAs had been able to identify impacts of the new teacher evaluation on student achievement, and found that it is too early in the OTES pilot to draw conclusions about impacts on student achievement or value-added impacts of teachers on student growth.

Impact on Administrative Behavior and School/LEA Processes

The evaluators reviewed documentation of any new incentives or policies and procedures at case study sites where available. Most districts have not yet started to create policies and procedures in support of the new model of teacher evaluation.

Teachers and principals reported finding high value in the SMART goal process, which fosters more reflective teaching practices among pilot participants and informs professional development choices better aligned to professional growth needs. Both teachers and administrators reported expecting to revisit the SMART goals in the final evaluation conference, but found that the document did not include any reference to the SMART goals. Most people believed that goal setting should remain a required element in the OTES model. Those interviewed also reported that OTES creates a need for in-depth teacher training on SMART goals, assessment literacy and data-driven practices, differentiated instructional methods, and methods for fostering self-regulated learners. There were concerns about the perceived move away from unannounced observations, especially in districts whose union contracts preclude the use of walk-through visits for evaluation.

Sustainability

The amount of time principals need to spend conducting a full OTES evaluation for a single teacher was not viewed as feasible to scale up for all staff. Administrators viewed the current OTES model as a valuable and comprehensive way to evaluate new teachers and those with marginal performance.
However, there was a lack of understanding about either making modifications to OTES or using some of the other OTES tools that might be more appropriate for a veteran or distinguished teacher. One of the major concerns expressed during the site visits was the need for electronic forms and support structures. The evaluators observed both principals and teachers paging through large folders of documents to identify and describe a single observation. Most participants identified this as an obstacle to sustainability.

Summative Survey Findings

The summative survey was conducted in April 2012 and was sent to all OTES participants. The survey had a 44% return rate (288/659). Of the 288 LEA participants who answered the survey, 268 (93%) said they were directly involved with implementing the OTES in their building or district. The remaining participants (7%) were not directly involved in implementing the OTES; for them, the survey skipped to the last question, about alignment between the OTES model and organizational goals. The survey results outlined in this report are from only those who reported being directly involved in implementing the OTES during the pilot year. Participants identified themselves as follows:

30% were building principals not involved in evaluations
18% were evaluators, including principals
17% were district administrators
30% were classroom teachers
2% were union leaders
3% were various program coordinators

Implementation

Participants were asked, "To what extent did your district modify the OTES model to fit your school culture during this year's pilot?" 91% of those who answered the survey indicated that their LEA did not make any significant modification while implementing the OTES model. The other 9%, who indicated their LEA did make significant modifications to the OTES model, were asked to briefly describe the modifications and the reason(s) for them. These participants indicated that modifications to the OTES model focused primarily on aligning OTES language and forms with their LEA's existing teacher evaluation process for goal-setting, walkthroughs, examples of evidence, use of the Danielson model, number of observations, and the use of peer evaluators. The percentage of respondents who described modifying OTES is shown in Exhibit 4-1.

Exhibit 4-1 Types of OTES Modifications by LEAs
Source: MGT of America, Inc.

Exhibit 4-2 shows pilot participants' answers to the question, "Which of the following OTES components did you complete or participate in during the pilot year?" There was a high degree of fidelity in implementing the formal classroom observations and the associated pre- and post-observation conferences. In addition, 84% of pilot participants said they used SMART goals and the teacher self-assessment tool. Only 70% used the OTES written evaluation report component. Exhibit 4-2 also indicates that the majority of LEAs did not use the OTES professional growth plan, lesson reflection, or professional development resources. Approximately 32% of pilot participants said they analyzed evidence of student growth data as part of the OTES pilot.

Exhibit 4-2 Fidelity of Implementing the OTES Model (N=247)
Source: MGT of America, Inc.

These data make clear that participants did not implement all components of OTES, but did not feel that the omission constituted a significant modification of OTES. Information gained during the onsite visits reported earlier indicated that many participants were unaware or unsure of some of the components. It is unclear whether the lack of implementation is due to lack of awareness or lack of competence. Exhibit 4-3 includes data about the ease of use of the OTES process and documents. Participants were asked, "Rate how easy it was to use each of the following components of OTES." The most difficult component, for 75% of pilot participants, was information about student growth measures, and 50% said the year-end summative form was difficult. The evaluation rubric for the standards and the SMART goal-setting form were also difficult for about 40% of pilot participants. Although participants found the SMART goal form difficult to use, the large majority of pilot participants indicated that they did conduct a SMART goal-setting conference as part of their OTES implementation activities.

Exhibit 4-3 OTES - Ease of Use

OTES COMPONENT | DIFFICULT OR SOMEWHAT DIFFICULT | VERY OR SOMEWHAT EASY
Student Growth Measure Information from ODE | 75% | 25%
Year-end Summative Form | 50% | 50%
Rubric for teacher evaluation, Standards 6 and 7 | 42% | 58%
SMART goal-setting form | 41% | 59%
Rubric for teacher evaluation, Standards 1-5 | 40% | 60%
Self-assessment form | 17% | 83%

Source: MGT of America, Inc.

Impact on Teacher Behavior and Effectiveness

One of the goals of the new teacher evaluation system is to identify and define effective teaching practices. Ideally, the evaluation system should positively impact instructional practices. To understand the impact, the survey asked participants, "What, if any, impacts did the new teacher evaluation system have on instructional practices among pilot teachers during the school year?" At least two-thirds of those who answered the survey agreed on the top three areas in which the new teacher evaluation system affected instructional practices:

Understanding SMART goals
Improving collaboration and communication between teachers and their building administrator
Alignment between SMART goals and classroom activities

Exhibit 4-4 shows that half of those who answered the survey said analysis of student data for progress monitoring, use of multiple assessments, and the focus on professional growth had an impact on instructional practice. Exhibit 4-4 also shows that the majority of OTES pilot participants did not see OTES impacts on other important classroom practices, e.g., re-teaching areas of weakness in student learning or differentiated instruction.

Exhibit 4-4 Impact of OTES on Instructional Practice (N = 243)
Source: MGT of America, Inc.

Participants were asked, "To what extent does the new teacher evaluation system impact teachers' professional development needs?" Pilot participants indicated that OTES has widespread impact. Three-fourths of participants indicated six common professional development needs, as shown in Exhibit 4-5.

Five of the six top needs pertain to the use of student learning data to inform instructional practice. Designing lessons aligned to the new Common Core learning standards is also an area in which OTES has a strong impact on professional development needs.

Exhibit 4-5 OTES Impacts on Professional Development Needs (N = 239)

PROFESSIONAL DEVELOPMENT NEED | NO OR SLIGHT IMPACT | MODERATE IMPACT | EXTENSIVE IMPACT
Selecting assessments to monitor student learning for specific lessons and benchmarks | 21% | 48% | 31%
Analyzing student learning data from multiple measures | 22% | 40% | 38%
Designing and implementing SMART goals | 23% | 54% | 23%
Understanding value-added measures (VAM) | 24% | 42% | 34%
Designing lessons aligned to Common Core standards | 24% | 45% | 31%
Using student learning data to select differentiated instructional strategies | 25% | 49% | 26%
Designing classroom assessments | 29% | 43% | 28%
Implementing differentiated instruction | 29% | 45% | 26%
Using VAM scores to inform instructional and administrative practices for school improvement | 30% | 40% | 30%
Involving students in self-assessment, progress monitoring and goal setting | 30% | 44% | 26%
Best practices for specific content areas | 31% | 51% | 18%
Communicating assessment results with students, parents and colleagues | 35% | 46% | 19%
Fostering self-directed strategy use among students | 37% | 45% | 18%
Integrating technology into lesson plans | 41% | 43% | 16%
Team teaching and teacher collaboration strategies | 42% | 42% | 16%

Source: MGT of America, Inc.

Impact on Student Achievement

The Summative Survey did not collect data about the impact of the OTES pilot on student achievement.

Impact on Administrative Behavior and School/LEA Processes

Participants were asked, "To what extent does the OTES model support alignment between your organizational processes and performance goals?" 56% of those who took the survey responded to this question (161 respondents).
About 84% of those who responded indicated that, although they have not all reached full implementation, they agreed that OTES supports moving in the direction of their organizational and performance goals. Another 16% of those who answered this question indicated that

the OTES model does not support alignment between organizational processes and performance goals at the LEA level, because it is too time-consuming or complex to feasibly implement with all teachers during a school year and because it does not align with the contracts currently in place.

Sustainability

As shown in Exhibit 4-6, 57% of pilot participants indicated that their LEAs did not have a plan for new contract language related to the use of teacher evaluation results when making compensation, placement, or retention decisions. The large majority of pilot sites also do not have a comprehensive communication plan that explains OTES, nor have they made changes to policies or procedures that reinforce the new teacher evaluation process and the use of student achievement data as part of teacher evaluation. These survey results indicate that there is much to be accomplished before OTES is institutionalized at a sustainable level.

Exhibit 4-6 Status of Plans to Align LEA Policies or Procedures with OTES (N = 240)

PLAN | NO PLAN | DRAFT PLAN | PARTIAL PLAN | FULL PLAN
LEA has new contract language related to the use of teacher evaluation results in compensation, placement and retention decisions. | 57% | 26% | 13% | 4%
LEA has a new comprehensive communication plan that explains the new teacher evaluation system. | 40% | 33% | 22% | 5%
LEA has new policies and/or procedures that support using student achievement data as a part of the teacher evaluation process. | 33% | 36% | 26% | 5%
LEA has new policies and/or procedures that reinforce the new teacher evaluation process. | 25% | 37% | 28% | 10%

Source: MGT of America, Inc.

Exhibit 4-7 shows how those who took the survey ranked a number of supports for sustainability of OTES in response to the question, "What supports need to be in place to ensure sustainability of the new teacher evaluation system within your district?" There was a high degree of consensus about the top five priorities for sustainability.
Nearly two-thirds said the highest priority is step-by-step guidelines for implementing OTES. Other top priorities include:

Ongoing OTES training for administrative and peer evaluators
Professional development and support for use of VAM scores to inform instructional and administrative practices for school improvement
Online OTES data management system

Exhibit 4-7 Supports for OTES Sustainability (N = 238)

SUPPORT FOR SUSTAINABILITY | NOT A PRIORITY | LOW PRIORITY | MEDIUM PRIORITY | HIGH PRIORITY
Step-by-step guidelines for implementing OTES | 1% | 9% | 26% | 63%
Ongoing OTES training for administrative and peer evaluators | 0% | 8% | 33% | 59%
Support for how to use VAM scores to inform instructional and administrative practices for school improvement | 1% | 8% | 41% | 50%
Professional development and system supports for understanding value-added measures | 1% | 9% | 41% | 49%
Online OTES data management system with evaluator and teacher access | 2% | 12% | 38% | 49%
Interactive evaluation rubric for use with hand-held devices for capturing data in the classroom | 7% | 20% | 28% | 45%
Tool for integrating Collaboration, Communication and Professionalism (Standards 6 and 7) with SMART goals | — | 19% | 47% | 29%
Evaluation option to conduct progress monitoring of teachers' SMART goals | 2% | 21% | 50% | 26%
Tool or guideline for integrating CEUs for licensure with teacher evaluation results | 6% | 36% | 45% | 12%

Source: MGT of America, Inc.

The survey included questions regarding the Teacher Rating Matrix that had been developed to describe how the teacher performance and student growth components would create the final teacher summative evaluation score. The survey asked, "Ohio's new system for evaluating teachers combines a rating of teacher performance (based on classroom observations and other factors) with a rating of student academic growth. These two ratings are each weighted at 50 percent. How clearly does the matrix (shown in the survey) explain the method for computing a summative evaluation rating?" Although 84% of those who answered the survey said that the Teacher Rating Matrix was Somewhat or Very Clear, many OTES pilot participants raised concerns about how summative ratings are assigned to teachers based on the matrix.
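The 50/50 weighting described in the survey question can be illustrated with a small sketch. The level names, numeric scale, and rounding rule below are assumptions chosen for illustration; they are not the published OTES Teacher Rating Matrix, whose actual cell assignments are exactly what participants asked to see documented.

```python
def summative_rating(performance: str, growth: str) -> str:
    """Combine a teacher performance rating and a student growth rating,
    each weighted at 50 percent, into one summative rating.

    Every mapping here (level names, numeric scores, rounding rule) is an
    illustrative assumption, not the actual OTES matrix.
    """
    perf_scale = {"Ineffective": 1, "Developing": 2, "Proficient": 3, "Accomplished": 4}
    growth_scale = {"Below Expected": 1.0, "Expected": 2.5, "Above Expected": 4.0}
    combined = 0.5 * perf_scale[performance] + 0.5 * growth_scale[growth]
    labels = {1: "Ineffective", 2: "Developing", 3: "Proficient", 4: "Accomplished"}
    # Round to the nearest whole level, clamped to the 1-4 range (assumed rule).
    return labels[min(4, max(1, round(combined)))]
```

A matrix of this kind could also be published directly as a lookup table of cells, which is essentially what respondents asked for: explicit examples showing which combination of ratings yields which summative result.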
When asked for suggestions to help clarify or change the Teacher Rating Matrix that combines the student growth and teacher performance ratings, 37% of those who participated in the survey (107 respondents) responded with comments related to four issues, as shown in Exhibit 4-8.

Exhibit 4-8 Teacher Rating Matrix Feedback
Validity concerns: 53%; Clarification suggestions: 21%; Matrix format suggestions: 16%; SGM concerns: 10%
Source: MGT of America, Inc.

Their most pressing concern pertains to the validity of the summative ratings. About 53% of those who offered suggestions or comments about the matrix questioned the validity of the proposed formula for determining a teacher's summative evaluation score. They raised concerns about its fairness, equity, and inter-rater reliability for all teachers across grade levels and content areas, and about variance in student sub-groups. They questioned the practice of assigning an accomplished rating only to teachers whose students' student growth measure (SGM) outcomes exceed one year's growth, since one year's growth has always been considered the gold standard. They also said the summative rating needs to reflect whether a teacher is ineffective because of ethical or communication problems, even if student growth is good. Finally, they would like to see the research base behind the matrix design. In addition, 10% said they have concerns about using scores from SGMs that have not yet been identified. They consider it premature to implement a high-stakes teacher evaluation system using SGMs before the measures have been thoroughly piloted and researched for technical soundness, with measures at all grade levels aligned to the new Common Core learning standards. Another 21% offered suggestions for adding more explanation of the ranges in student growth scores that correspond to each possible summative teacher rating. They asked for more explicit examples of how the student growth scores impact teacher evaluation ratings, so that teachers better understand the expectations for student growth associated with the different levels in the matrix.
About 16% offered suggestions related to the matrix format, including creating a decision-tree style flow chart for assigning a summative rating, aligning the display of matrix categories with the OTES rubric, and replacing the numbers with descriptive rating categories. The numbering system was said to be confusing for two reasons: 1) the numbers are perceived to be a carryover from the previous, less holistic evaluation system; and 2) some are not certain what the numbers represent.

5.0 Summary

MGT conducted a comprehensive review of the pilot OTES in order to answer research questions dealing with implementation and impact. Survey data were gathered from participants both in the middle of the pilot year (January 2012 initial electronic survey) and near the end of the pilot year (April 2012 summative electronic survey). Data were also gathered through documents collected at, and face-to-face interviews conducted at, sites selected as case study locations.

Conclusions

Several conclusions can be drawn from the data collected in this study. The sequence of the following list of conclusions is not meant to imply greater or lesser value or importance.

The value of SMART goals. LEA pilot participants perceive SMART goals to have a strong positive impact on instructional practices, although nearly half reported that the SMART goals form was difficult to use. Since SMART goals are based on a review of evidence related to student learning needs, the process of establishing SMART goals can inform plans for differentiated instruction and the selection of multiple measures to monitor student learning. There is a need for more professional development support for setting SMART goals and improving teachers' assessment literacy, both of which have the potential to positively impact student learning growth.

The student growth measures (SGM). There is a significant need for a more thorough explanation of how to identify student growth measures. This is a problem not only for teachers working in non-tested areas, but also for those teaching in tested areas, since the new Common Core standards and assessments are yet to be developed and validated.

The impact of SGM on individual teachers. There is a need for more explanation of how the results of student growth measures are computed into value-added ratings for individual teachers.

The Teacher Rating Matrix.
There is a need to address concerns about the validity of the Teacher Rating Matrix, including an explanation of the research-based rationale for why one year of academic growth for students is not an acceptable goal for accomplished teachers.

The research behind the model. There is a need for a more thorough explanation of the technical measurement design behind value-added scores, so that teachers better understand the meaning of SGM and how it relates to their classroom practice.

A comprehensive understanding of the model. LEAs in the pilot were not using the full complexity of the OTES model and will likely need more supports to do so. Many participants did not understand whether there could be some differentiation between new or struggling teachers and veteran or distinguished teachers. Their questions did not imply that being new equated with struggling or that being a veteran equated with being distinguished. However, they expressed a need to address teachers differently and were unsure whether they could under this new model.

Local policy or contract changes. As of April 2012, there had been few changes in LEA policies and procedures. This may mean that LEAs are waiting for concerns or issues about the OTES model to be resolved before institutionalizing it. However, LEAs may also need some technical assistance or model policies/procedures to use as examples, rather than creating their own from scratch. The Ohio teacher, administrative, or school boards associations could be of assistance in developing sample or model documents or providing other technical assistance.

Recommendations

The recommendations included in this section are based on the entire study. Individual recommendations are provided for each of the research questions.

IMPLEMENTATION:

Adopt all components, at least provisionally, and leave them in place for a second pilot year.
Make the tools, forms, and structure available through a statewide online system of support that helps both teachers and evaluators manage all the parts and pieces.
Create a clear and simple flow chart showing activities on a sample timeline (a "year at a glance").
Provide clear documentation identifying what is required and what is recommended as best practice. Samples of each should be included, both strong and weak examples, to support the goal of transparency and improved teacher and evaluator performance.
Provide ongoing, in-depth, and accessible professional development.
Improve the face validity of the system by ensuring that it is fair, equitable, and reliable for all teachers.
Present the summative teacher evaluation ratings in actionable terms that provide guidance for decision-making about classroom practices and professional development needs.
Conduct another pilot during the next school year.

IMPACT ON TEACHER BEHAVIOR AND EFFECTIVENESS:

Maintain the use of SMART goals as an expected/required component of the system.
Provide professional development aimed at using data, developing multiple assessments, and differentiating instruction to impact instructional practice.

IMPACT ON ADMINISTRATIVE BEHAVIOR AND SCHOOL/LEA PRACTICES:

Invite Ohio teacher, administrative, and school board organizations to assist in the development of model policies and procedures or contractual language.
Invite input or feedback from administrators and teachers on ways to differentiate evaluations based on teacher performance, to reduce the perceived burden of implementing this model with all teachers.

IMPACT ON STUDENT ACHIEVEMENT:

Conduct an analysis of the impact of the OTES model on student achievement once the model has been implemented long enough to show trends and support conclusions.
Ensure that student growth scores are based on more than one year of data and have, to the degree possible, factored out the effects of external conditions impacting student learning.

SUSTAINABILITY:

Develop an electronic system of support for the OTES model that supports all phases of the process and allows uploading of artifacts or evidence in a variety of formats by both the teacher and the principal.
Provide ongoing, accessible professional development in how to use the electronic system of support.

REVIEW OF BEST PRACTICES IN TEACHER EVALUATION:

Avoid using VAM data for high-stakes decisions about an individual teacher's effectiveness, and include VAM data as only a small percentage of the evaluation.
Conduct rigorous pilot studies of how the results of various value-added statistical models correlate with other measures of student growth and teacher effectiveness.
Focus at least as much attention on the alignment of standards and on professional development for both teachers and administrators.

Overall, the new teacher evaluation system needs to be easier to implement and to gain more face validity with educators in Ohio.
Since many LEAs have not yet piloted the new teacher evaluation system, and many of those who did participate in the pilot did not include student growth measures and other components of the OTES model, it is recommended that another OTES pilot be conducted during the next school year with these additional supports for implementing OTES.

Appendices:
A Surveys: Initial and Summative surveys and responses
B Best Practices Review and Bibliography

Appendix A Survey

Appendix A - Survey Summary Report - Feb 21, 2012
Survey: Ohio Teacher Evaluation System Pilot Project Survey

Consent:
Value | Count | Percent
Yes | 325 | 100.0%
No | 0 | 0%
Total Responses: 325

Type of district in which you work:
Value | Count | Percent
Rural | — | 58.8%
Urban | — | 16.7%
Suburban | — | 24.5%
Total Responses: 323

Appendix A - Survey INITIAL SURVEY RESULTS

30 Appendix A - Survey Please select the LEA where you work from the list: Akron Public Schools 2.5% Allen East 0.9% Alternative Education Academy 0.3% Amherst Exempted Village Schools 0.9% Auglaize County ESC 0.3% Aurora City Schools 0.9% Batavia LSD 0.6% All others 93.5% Please select the LEA where you work from the list: Value Count Percent % Akron Public Schools 8 2.5% Allen East 3 0.9% Alternative Education Academy 1 0.3% Amherst Exempted Village Schools 3 0.9% Auglaize County ESC 1 0.3% Aurora City Schools 3 0.9% Batavia LSD 2 0.6% Beavercreek City 2 0.6% Belpre 4 1.2% Bettsville 1 0.3% Bloom Vernon Local School District 3 0.9% Bridges Community Academy 2 0.6% Brown Local 3 0.9% Buckeye Online School for Success 2 0.6% Canton Local Schools 1 0.3% Cincinnati City 4 1.2% Circleville City Schools 2 0.6% Columbus City 4 1.2% Conneaut City Schools 5 1.5% Coshocton City Schools 5 1.5% Coventry Local Schools 3 0.9% Crestview Local School District 4 1.2% Crooksville EVSD 1 0.3% East Cleveland 2 0.6% Edgewood City Schools 2 0.6% Edon Northwest Local 2 0.6% Elida Local Schools 1 0.3% ESC of Cuyahoga County 3 0.9% Fairlawn Local School 3 0.9% Fayette Local Schools 4 1.2% Franklin Local Schools 3 0.9% Fremont City Schools 3 0.9% Galion City Schools 2 0.6% Gallia County Local 1 0.3% Statistics Total Responses 325 Sum 15,029,990.0 Average 46,532.5 StdDev 13, Max 143,644.0 Appendix A - Survey INITIAL SURVEY RESULTS

31 Appendix A - Survey Georgetown Exempted Village School District 3 0.9% Goshen Local Schools 4 1.2% Grand Valley Local 2 0.6% Grandview Heights CSD 1 0.3% Greenfield Exempted Village School District 2 0.6% Hamilton City Schools 3 0.9% Highland Local (Medina) 2 0.6% Hilliard City School District 3 0.9% Hudson City Schools 2 0.6% Imagine Harrisburg Pike 1 0.3% Indian Lake Local Schools 1 0.3% Indian Valley Local Schools 4 1.2% Jackson Local Schools 2 0.6% Johnstown Monroe 2 0.6% Kenton City Schools 4 1.2% Lancaster City Schools 3 0.9% Liberty Center Local Schools % Liberty Union-Thurston Local Schools 2 0.6% Licking Heights Local School District 3 0.9% Lorain City Schools 2 0.6% Lucas Local School 3 0.9% Lynchburg-Clay Local Schools 1 0.3% Mad River Local School District 4 1.2% Madison Local 1 0.3% Maple Hts. City Schools 1 0.3% Marietta City Schools 4 1.2% Marion City 1 0.3% Marysville Exempted Village School District 3 0.9% Maysville Local 2 0.6% Middletown City 1 0.3% Mid-East Career and Technology Centers 2 0.6% Milford Exempted Village School District 1 0.3% Millcreek-West Unity Local Schools 2 0.6% Mississinawa Valley LSD 2 0.6% Morgan Local School District 2 0.6% Mount Vernon City 4 1.2% Muskingum Valley ESC 1 0.3% New Boston Local School District 4 1.2% New Knoxville School 3 0.9% New Lebanon Local 5 1.5% New Lexington City School District 2 0.6% New Miami Local Schools 2 0.6% Noble Local 3 0.9% Nordonia Hills City Schools 3 0.9% North Central Local 2 0.6% Northmont City Schools 3 0.9% Northwest Local School 3 0.9% Norwood City Schools 1 0.3% Ohio Connections Academy 1 0.3% Ottawa-Glandorf Local 4 1.2% Parma City 3 0.9% Appendix A - Survey INITIAL SURVEY RESULTS

32 Appendix A - Survey Paulding Exempted Village Schools 5 1.5% Perrysburg Schools 1 0.3% Pickaway-Ross JVSD 2 0.6% Pickerington Local School District 3 0.9% Plymouth-Shiloh 1 0.3% Revere Local School District 1 0.3% Ridgewood 3 0.9% River View Local 4 1.2% Rock Hill Local 2 0.6% Rolling Hills Local School District 2 0.6% Scholarts Prep and Career Center 1 0.3% Sciotoville Community School 2 0.6% Sciotoville Elementary Academy 2 0.6% Sebring Local 3 0.9% Shelby City Schools 3 0.9% Southern Local 4 1.2% Southern Local-Perry 2 0.6% St. Bernard- Elmwood Place City 2 0.6% Stryker Local School 1 0.3% Tipp City Exempted Village Schools 3 0.9% Toledo Public Schools 1 0.3% Toronto City 1 0.3% Union Local-Belmont 2 0.6% Union Scioto Local Schools 3 0.9% Valley LSD 3 0.9% Van Wert City Schools 5 1.5% Vinton County Local School District 4 1.2% Virtual Schoolhouse 1 0.3% Walnut Twp. Local Schools 3 0.9% Washington Court House City SD 2 0.6% West Muskingum Local 3 0.9% Western Local 2 0.6% Willard City Schools 4 1.2% Willoughby-Eastlake City Schools 7 2.2% Wilmington City Schools 5 1.5% Worthington City Schools 2 0.6% Xenia Community City 3 0.9% New Philadelphia City Schools 2 0.6% Akron Digital Academy 0 0% Bellefontaine City Schools 0 0% Canal Winchester Local School District 0 0% Crittenton Community School 0 0% Dayton Early College Academy 0 0% Eastern Local School District 0 0% Fairfield City School District 0 0% Lion of Judah Academy 0 0% Phoenix Community Learning Center 0 0% Renaissance Academy 0 0% Southeast Local Schools 0 0% Southern Local 0 0% Southwest Licking Local 0 0% Appendix A - Survey INITIAL SURVEY RESULTS

33 Appendix A - Survey Tomorrow Center 0 0% Troy City Schools 0 0% VLT Academy 0 0% Please list the name of your school. Count Response 1 (IRN ) Northwest Ohio Educational Service Center 1 ACE Academy 3 Allen East 1 Atkinson 1 B. L. Miller 1 BL Miller Elementary 2 Batavia High School 1 Bath Elementary 1 Belle Aire 1 Belpre City Schools, Belpre High School 1 Belpre Elementary 1 Belpre High School 1 Bettsville Local School 1 Beverly Gardens 1 Beverly Gardens Elementary 1 Bevis Elementary 1 Bloom Vernon Elementary 1 Bloom-Vernon Elementary 2 Bloomfield 1 Board Office 1 Bridgeport Elementary 1 Bridges Community Academy 2 Buckeye Online School for Success 1 Buffalo Campus 1 Bunsold Middle School 1 Central Elementary 1 Central Elementary - Vinton County Middle School 1 Central Middle School 7 Central Office 1 Central Office Staff 1 Columbus Global Academy Conneaut Area City School Distrtict 2 Conneaut High School 1 Coshocton High School 2 Coventry High School 1 Creekview 1 Crestview 1 Crestview High School 1 Crestview Local Schools 1 Crestview School District 1 Crooksville Primary School 1 Dan Emmett 1 Diley Middle School Appendix A - Survey INITIAL SURVEY RESULTS

34 Appendix A - Survey 5 District 1 District Office 1 Dixie HS 2 Dixie Middle School 1 Downs 1 ESC-CC 1 ESCCC 3 East Elementary 1 East High School 1 Edon Elementary 1 Edon Northwest Elementary 1 Elida High School 1 Evening Street Elementary 2 Everts Middle School 1 Fairbrook 1 Fairlawn Local 1 Fairlawn Local Schools 1 Fairlawn Middle/High School 1 Fayette Elementary 1 Fayette Elementary and MS?HS 1 Fayette JH/HS 1 Fayette Jr/Sr High School 3 Franklin Local Community School 1 Fremont Ross HS 1 Fremont Ross High School 1 Galion Primary 1 Gamble Montessori 1 Gateway Elementary 1 Georgetown Elementary School 2 Glandorf Elementary 2 Glenwood High School 1 Goshen High School 1 Grand Valley Elementary 1 Grand Valley Elementary School 1 Green Valley Elementary 1 Greenfield Elementary 1 Greenfield Elementary School 1 Hardin Central 1 Hardin Central Elementary 2 Harmar 1 Harmon 2 Harmon Middle School 1 Harrisburg Pike Community School 3 High School 1 Highland High School 1 Highland Middle School 1 Hilliard Heritage Middle School 1 Hilltop Elementary 1 Hilltop High School 1 Holmes 1 Hudson High School Appendix A - Survey INITIAL SURVEY RESULTS

35 Appendix A - Survey 1 Indian Lake Elementary 2 Indian Valley High School 2 Jackson High School 2 Kenton High School 1 Lancaster high school 1 Ledgeview Elementary 4 Leggett CLC 4 Liberty Center Elementary 3 Liberty Center High School 3 Liberty Center Middle School 1 Liberty Center middle school 1 Liberty Union 1 Liberty Union Middle School 1 Lincoln Elementary 1 Linden McKinley STEM Academy 2 Longfellow Middle School 1 Lucas High 1 Lucas High School 1 Lucas elementary and ms 1 Lynchburg-Clay Elementary 1 MCCormick Elementary 1 MIddletown High School 1 Madison Junior High 1 Malvern Elementary 2 Malvern Middle School 1 Marietta Middle SChool 1 Marietta Middle School 1 Marr-Cook Elementary 1 Marr/Cook 1 Marysville Schools 1 Maysville Elementary School 1 Maysville Middle School 1 McCord Middle 1 Meadowbrook H.S. 1 Middle School 1 Miller High School 2 Millersport Elementary 1 Millersport High School 1 Missisnawa Valley 1 Mississinawa Valley LSD 1 Morgan West Elementary 1 Nevin Coppock ES 1 New Knoxville 1 New Knoxville Local 1 New Knoxville School 1 New Lebanon 1 New Lexington Middle School 1 New Miami MS/HS 1 Nord 1 Nord Middle School 1 Nordonia High School Appendix A - Survey INITIAL SURVEY RESULTS

36 Appendix A - Survey 1 North Central Elementary 1 North Central High School 1 North High School 2 Northmont High School 1 Norwood High School 1 Oak Intermediate and Stanton Primary 1 Oakwood Elementary 1 Oakwood Elementary School 1 Ohdela 1 Ottawa Elementary 1 Parma 1 Parma Senior High 1 Parma Senior High School 1 Paulding H.S. 1 Paulding High School 1 Perrysburg Junior High 1 Pickaway-Ross CTC 1 Pickerington High School North 1 Pleasant Street Elementary 1 Pleasant Street Elementary School 1 Plymouth High School 2 Port Washington Elementary 1 RIver View High Schhol 1 Ridgewood 1 Ridgewood Elementary 2 Ridgewood Middle School 1 Ritzman 2 Ritzman CLC 1 River View High School 1 Robinson Elementary 1 Rock Hill High School 1 Royalview 1 Scholarts 1 Sciotoville Community School 2 Sciotoville Elementary Academy 1 Searfoss 1 Searfoss Elementary 1 Shelby Middle School 1 Shenandoah Elementary 1 Shenandoah High School 1 South Elementary 1 South Webster Jr/Sr High School 1 Southern Elementary School 2 Southern High School 1 Southern Local 1 Southern elementary 1 St. Bernard - Elmwood Place Schools 1 St. Bernard-Elmwood Place High School 1 Stanton Primary and Oak Intermediate 1 Stebbins High School 1 Stryker Local Appendix A - Survey INITIAL SURVEY RESULTS

Superintendent's Office (1); Thomas A. Edison Intermediate and Grandview Heights Middle (1); Tippecanoe Middle School (1); Toronto High School (1); Turkeyfoot elementary (1); Union Elementary (1); Union Local Elementary (1); Unioto Elementary (1); Unioto High School (2); Valley Elementary School (1); Valley High School (1); Valley Middle School (2); Van Wert Middle School (3); Vinton County Middle School (1); Vinton Elementary (1); Vinton co middle (1); Virtual Schoolhouse (1); Walker Elementary (1); Warner Middle School (2); Warsaw Elementary (2); Washington Elementary (1); West (1); West Muskingum Local Schools (1); West Muskingum Middle School (1); Western Elementary (1); Western High School (1); Willard High School (1); Willard Middle (1); Willard Middle School (1); Willoughby South (1); Willoughby South High School (4); Wilmington High School (2); Wilson Middle School (1); Xenia High School (1); Zanesville Campus (1); champion Middle (1); gfghghdghggddf (1); madison south (1); mckinley jr/sr hs (1); pickaway ross career technology center (1); shenandoah (1)

Which model of teacher evaluation are you implementing?

Value                                                                          Count   Percent
Ohio Teacher Evaluation System (OTES) without Student Growth Measures (SGM)    -       -
Ohio Teacher Evaluation System (OTES) with Student Growth Measures (SGM)       -       42.8%
Locally developed evaluation system without Student Growth Measures (SGM)      -       8.6%
Locally developed evaluation system with Student Growth Measures (SGM)         -       9.2%
Total Responses: 325

What is your current role? (Check all that apply)

Value                        Count   Percent
District Administrator       65      20%
Building Principal           -       -
Classroom Teacher            -       -
Union Leader                 -       -
Evaluator                    -       -
Professional Growth Coach    5       1.5%
Other (Please specify):      -       -
Total Responses: 325
[Chart also shows 30.8%, 8.3%, 4.3%, and 4.3%; mapping to the remaining categories not preserved.]

Open-Text Response Breakdown for "Other (Please specify):" (count in parentheses):
Assist Principal (1); ESC supervisor (1); LPDC Chair (1); LPDC chairperson (1); Music/Band Teacher (1); Race to the Top Lead Teacher (1); Reading Specialist (1); School Improvement Coordinator (1); Sp. Ed. Supervisor (1); Superintendent (1); TIF Coordinator (1); TIF coord. (1); assistant principal (1); When pilot started, I was union leader. 2 weeks ago I was moved to assistant principal at our elementary

What grade(s) do you teach? (Check all that apply)

Value          Count   Percent
K              7       7.1%
Grades 1-12    -       -
Statistics: Total Responses 99; Sum 1,560.0; Average 7.4; StdDev 3.50; Max 12.0
[Chart also shows 12.1%, 12.1%, 18.2%, 13.1%, 12.1%, and 15.2% across grade bands and "All others"; mapping not preserved.]

What subjects do you teach? (Check all that apply)

Value                                    Count   Percent
English language arts/reading            -       -
Mathematics                              -       -
Science                                  -       -
Social Studies/History/World Cultures    -       -
Other: (Please explain)                  -       -
Total Responses: 99
[Chart also shows 40.4%, 34.3%, 27.3%, and 18.2%; mapping to subjects not preserved.]

Open-Text Response Breakdown for "Other: (Please explain)":
Agriculture (1); Health PE (1); Health, etc. (1); Health/Phys Ed (1); Industdial Twchnology (1); Lifeskills, Sustained Reading (1); Music (2); Reading Intervention (1); Special Education (1); Technology (1); Visual Arts (1); art (1); elementary self contained (1); life skills (1); self contained classroom (1); technology (2)

Did you receive a teacher-level value-added report from Battelle for Kids in fall 2011?

Value   Count   Percent
Yes     -       26.3%
No      -       73.7%
Total Responses: 99

Are you using the value-added report as part of your evaluation plan during the pilot year?

Value   Count   Percent
Yes     -       53.8%
No      -       46.2%
Total Responses: 26

What percent of teachers in your district are involved in the OTES pilot this year?

Response   Count
.005%      1
.04%       1
.25%       1
.5%        1
0.4%       1
1%         11
1.8%       1
10%        8
100%       1
12%        1
15%        2
2%         5
20%        2
23%        1
3%         5
30%        1
5%         5
6%         1
7%         1
7.5%       1
9%         1

In your opinion, how knowledgeable are principals in your district about the components of OTES or your LEA-developed evaluation system?

Value                       Count   Percent
Not at all knowledgeable    4       7.1%
Somewhat knowledgeable      -       42.9%
Knowledgeable               -       37.5%
Very knowledgeable          4       7.1%
Other:                      3       5.4%
Don't know                  0       0%
Total Responses: 56

Open-Text Response Breakdown for "Other:":
ODE is not knowledgeable, so how can the principal be knowledgeable (1); Only two are familiar because of training (1); The prinicipals involved in the pilot are very knowledgable (1)

In your opinion, how knowledgeable are teachers in your district about the components of OTES or your LEA-developed evaluation system?

Value                       Count   Percent
Not at all knowledgeable    -       16.7%
Somewhat knowledgeable      -       64.8%
Knowledgeable               7       13%
Very knowledgeable          0       0%
Other:                      3       5.6%
Don't know                  0       0%
Total Responses: 54

Open-Text Response Breakdown for "Other:":
Only eight are knowledgable (1); see the above answer (1); Some are becoming knowledgeable, others are not. Union leadership is involved with the OTES pilot. (1)

How many classroom teachers work in your building?
Count   Response

What percent of classroom teachers are participating in the OTES pilot this year?

Response   Count
0%         3
1%         10
1.3%       1
1.5%       1
10%        -
100%       5
11%        2
12%        7
13%        5
14%        2
15%        6
16%        6
17%        2
18%        2
2%         3
20%        1
21%        1
24%        1
25%        2
3%         12
30%        1
33%        1
4%         3
45%        1
5%         9
50%        2
6%         3
60%        1
7%         7
70%        1
8%         10
8.3%       1
80%        1
9%         6
9.3%       1

Are you directly involved in implementing the OTES in your building?

Value   Count   Percent
Yes     143     99.3%
No      1       0.7%
Total Responses: 144

If yes, please explain your role: (Check all that apply)

Value                                    Count   Percent
Primary evaluator                        -       -
Implementation support resource/guide    -       -
Professional development coach           -       -
Other (please specify):                  1       0.7%
Total Responses: 143
[Chart also shows 8.4%; mapping not preserved.]

Open-Text Response Breakdown for "Other (please specify):":
Principal (1)

In your opinion, how knowledgeable are teachers in your school/district about the components of OTES or your LEA-developed evaluation system?

Value                       Count   Percent
Not at all knowledgeable    -       16.4%
Somewhat knowledgeable      -       64.4%
Knowledgeable               -       15.1%
Very knowledgeable          5       3.4%
Other:                      1       0.7%
Total Responses: 146

Open-Text Response Breakdown for "Other:":
Some are knowledgeable (1)

Did you receive training about the new teacher evaluation methods implemented in your school/district during the school year?

Value   Count   Percent
Yes     -       91.9%
No      -       8.1%
Total Responses: 321

I attended training on the OTES model. (Check all that apply)

Value                                                                              Count   Percent
Day 1 - Introduction and training for the team                                     -       -
Day 2 - Focus on the evaluation process, using the rubrics and other forms         -       -
Day 3 - Practice using the rubrics and planning for Post-Observation Conferences   -       -
Received information from someone who attended Day 1, 2, or 3 training             -       -
Total Responses: 295
[Chart also shows 64.1%; mapping not preserved.]

How many hours of training did you receive thus far about the new teacher evaluation methods?

Value                                      Count   Percent
1                                          -       2.4%
2                                          -       0.3%
3                                          0       0%
4                                          6       2%
5                                          -       5.8%
6                                          -       9.2%
7                                          6       2%
8                                          -       5.1%
None                                       0       0%
Other (please indicate specific hours):    -       -
All others (chart)                         -       73.0%
Statistics: Total Responses 293; Sum 1,486.0; Average 8.8; StdDev 3.37; Max 12.0

Open-Text Response Breakdown for "Other (please indicate specific hours):" (count in parentheses; some entries garbled in the source):
days worth; hrs.; full 3 day scheduled (1); 15 hours (2); 15 or so for all 3 days; hours (2); 18 to date; Full day sessions (1); 3 all day ? (1); 3 days (3); 3 days - 18 hrs. (1); 3 days of training (1); 3 days training. About 15 hours (1); 3 days x 6/7 hour sessions (1); 3 days. Approx sessions from 9am-3pm >12 (1); About 20 (1); Approx 15 (1); Day 1, 2, 3 (1); I attended all 3 sessions (1); I was at the 3 days of training (1); Just the day one training (1); More than 12 from various sources. (1); Participated in all 3 OTES Sessions (1); These hour sessions at a conference. (1); Three full days - 20 hours (1); Three training days (1); about 18 hours (1); about 25 hours (1); all 3 days so around 15 hours (1); all that have been provided (1); approximately 15 hours in three training sessions (1); at least 15 hours to date (1); attended all three sessions (1); more than 15 (1); not sure (1); over 40 hours (1); whatever the 3 days were (1); will be 4 totals days of 7 hrs each day. (1)

Who provided the training to you about the new teacher evaluation methods? (Check all that apply)

Value                                           Count   Percent
Ohio Department of Education representatives    -       -
Building principal                              -       -
Building or district evaluator or coach         -       -
Other (please specify):                         -       -
Total Responses: 294
[Chart also shows 12.6%, 6.8%, and 4.1%; mapping to categories not preserved.]

Open-Text Response Breakdown for "Other (please specify):":
Assistant Supt. (1); Battell (1); District Task Force (1); Evaluators who attended Day 2 and 3 (1); HR and Curriculum Directors (1); Local ESC sponsored (1); OAC, OEA, BfK (1); OEA (1); Ohio Department of Education representatives (1); Ohio Education Association representatives (1); Race to the Top Team (1); University of Dayton (1)

Rate the extent to which you agree with the following statements about the new teacher evaluation methods being implemented in your school/district.
(Columns: Strongly Disagree / Slightly Disagree / Slightly Agree / Strongly Agree / Don't know / Responses; "-" indicates values not preserved in the source.)

Our new teacher evaluation methods are fair to teachers in all classrooms, content areas and grade levels.   -   -   -   -   -   -
Our new teacher evaluation methods are useful for helping teachers improve student learning.   -   -   -   -   -   -
Our new teacher evaluation methods are relevant to teachers' instructional support needs.   -   -   -   -   -   -
Our new teacher evaluation methods are adaptable to my school's specific context.   -   -   -   -   -   -
Our new teacher evaluation methods are user-friendly and easy to implement.   -   -   -   -   -   -
Our new teacher evaluation methods align with Ohio Standards for the Teaching Profession.   0.3%   1.9%   17.8%   75.7%   4.4%   321
The evaluation criteria used in the new teacher evaluation methods are relevant and useful.   2.2% (7)   9.6%   -   -   -   -
The evidence indicators used in the new teacher evaluation methods are relevant and useful.   1.5% (5)   -   -   -   -   -
The evaluator's written and verbal feedback and suggestions are relevant and useful.   1.2% (4)   -   -   -   -   -
The new teacher evaluation methods foster reflective teaching practice.   1.2% (4)   -   -   -   -   -

In your opinion, how adequately do the new teacher evaluation methods provide relevant assessment and useful feedback to teachers that lead to professional growth in:
(Columns: Very Inadequate / Slightly Inadequate / Slightly Adequate / Very Adequate / Don't know / Responses.)

Understanding student learning and development, including respect for diversity of the students   2.2% (7)   -   45.0%   38.8%   4.1%   -
Knowing the content areas for which teachers have instructional responsibility   2.2% (7)   -   -   -   -   -
Using varied assessments to inform instruction and evaluate student learning   2.5% (8)   -   -   -   -   -
Planning and delivering effective instruction that advances the learning of each individual student   2.8% (9)   -   -   -   -   -
Creating an environment that promotes high levels of student learning and achievement for all   2.5% (8)   -   -   -   -   -
Collaborating and communicating with students, parents, other teachers, administrators   2.8% (9)   -   -   -   -   -
Assuming responsibility for professional growth and performance in a learning community.   2.5%   -   -   -   -   -
[Remaining matrix values, including counts of 19, 112, and 170, could not be attached to specific cells.]

Which of the following OTES tools did you or your school/district use during the pilot year? (Check all that apply)

Value                                                                        Count   Percent
Using the Standards for the Teaching Profession for Self-Assessment Tool     -       -
Ohio Continuum of Teacher Development Resource Tool                          -       -
Data Collection Tool for Communication and Professionalism                   -       -
Self-Assessment Summary Tool                                                 -       -
Other (Please specify):                                                      -       -
Total Responses: 310
[Chart also shows 51.6%; mapping not preserved.]

Open-Text Response Breakdown for "Other (Please specify):":
APS form (1); Assessment of Teacher performance (1); Danielson (1); District Walkthrough tool (1); Goal setting (1); Goals Sheet (1); Locally developed tools similar to the above (1); None due to contractual issues (1); Not sure of the name, no self assessment included. (1); Observation Rating Rubric (1); Observation Rating Rubric (that's all I know) (1); Professional Goal Setting Tool (1); Smart Goal Setting (1); WHOLE OTES DOCUMENT (1); We are using our own tool (1); We stopped because we were not happy with the way the training was going (1); goal setting tool (1); not sure (3); not whole district but all for those involved (1); observation rating rubric (1); observation rubric (1); smart goals (1); Professional goal-setting tool, Pre-observation planning and lesson reflection, observation narrative form, postobservation conference discussion, post-observation conference rating rubric and summary (1)

Please identify the teacher evaluation tasks your pilot team has completed thus far. (Check all that apply)

Value                                                                                                                     Count   Percent
Evaluation planning conference between each participating teacher and evaluator                                           -       -
Teacher Self-Assessment (based on Ohio Standards for Teaching Profession)                                                 -       -
Teacher analysis of student data                                                                                          -       -
Pre-observation conference between teacher and evaluator to discuss decision-making process for lesson                    -       -
Formal Classroom Observation of Teacher Performance                                                                       -       -
Informal Classroom Walkthroughs                                                                                           -       -
Analysis of the Professional Project                                                                                      -       -
Evaluator's written observation report based on Ohio Standards for the Teaching Profession (Standards 1-5)                -       -
Teacher's written reflection on the success of the lesson observed                                                        -       -
Post-conference meeting between each participating teacher and evaluator to assess progress and next steps                -       -
Analysis of evidence regarding teacher behaviors in Collaboration, Communication and Professionalism (Standards 6 and 7)  -       -
Use of professional development resources aligned with Teacher Growth Plan                                                -       -
Assessment of Student Growth and Achievement                                                                              -       -
OTES Improvement Plan                                                                                                     -       -
Total Responses: 323
[Chart also shows 91.6%, 86.1%, 90.4%, 73.1%, and 9.3%, and one row shows 58 responses (18%); attachments not preserved.]

Please identify the types of student data used in your school/LEA for goal setting and reflection about students' instructional needs. (Check all that apply)

Value                                          Count   Percent
Student age ranges                             -       -
Student attendance rates                       -       -
School/district graduation rates               -       -
Classroom sociograms                           -       -
School working conditions survey               -       -
Aggregate classroom grades for subject areas   -       -
Subgroup classroom grades for subject areas    -       -
Individual classroom grades for subject areas  -       -
District/Building Report Card                  -       -
EMIS classroom report                          -       -
Other (Please specify):                        -       -
Total Responses: 295
[Chart also shows 50.2%, 35.3%, 43.1%, 42.7%, and 8.1%; mapping not preserved.]

Open-Text Response Breakdown for "Other (Please specify):" (count in parentheses):
AIMS, teacher assessments (1)
AIMSWEB (1)
AYP data, OAA data, Reading DRA data, (1)
Battelle for Kids (1)
Common Assessments (1)
DIBEL's, Behavior Reports, School program Reading points (1)
DIBELS (1)
Disaggregated Data (1)
Discipline Referrals (1)
Discovery Ed Benchmark Assessements, DIBELS, KRAL, STAR, Coach Assessments (1)
District Assessments (AIMSweb) (1)
IEP's (1)
Iam not the one doing he observations with the teachers. (1)
LMS Gradebook. ODE Successnet (1)
List of gifted students (1)
Local Assessments such as Fountas & Pinnel, MAP, DIBELS (1)
MAP and Short Cycle assessment data (1)
Monthly Assessments (1)
No data was provided or discussed, All work was done individually independant from the pilot (1)
No goal setting took place (1)
OAA (1)
OAA results (1)
OAA results, 9 week exams, VA goals (1)
OAA scores from previous year, OAA practice test scores, Study Island Benchmark scores (1)
OAA/Pro-Ohio Assessment data (1)
OGT Results by Department (1)
OGT SCORES (1)
Our in house data reports (1)
Perkins Performance Measures (1)
Practice OGT Scores; 8th Grade OAA Scores (1)
Presidential standards for fitness, teacher gathered dat (1)
SCA data (1)
SCA, Study Island, pre and post data (1)
SOAR Data, AIMSWEB, DRA, DMA, Ongoing Formativer Assessments (1)
Short Cycle Assessments (1)
Study Island, Aims Web, Diagnostic Reading Assessments (1)
This is up to the individual (1)
Unsure, especially for S.S. (1)
Value Added reports (1)
Value-Added Data (1)
We used the KRAL assessment as well as classroom assessments (1)
Work in progress (1)
aimsweb screening tool (1)
common assessments (1)
has not yet been determined (1)
none (1)
ptogress monitoring (1)
student assessment data (1)
value-added measures (1)
value-added scores (1)

Please identify the frequency of formal classroom observations you have been involved with thus far during the school year. (Check all that apply)

Value                                Count   Percent
One first semester                   -       -
One second semester                  -       -
Other (Please specify frequency):    -       -
Statistics: Total Responses 314; Sum 64.0; Average 3.8; StdDev 5.45; Max 24.0

Open-Text Response Breakdown for "Other (Please specify frequency):" (count in parentheses; some entries garbled in the source):
per teacher first semester, 2 second semester (1)
2 per semester per semester (1)
24 total (1)
3 First semester and 2 second semester (1)
3 for a new teacher first semester; 2 second semester first semester (1)
At least one observation per teacher this year. 2 for 50 percent of the teachers (1)
I am not evaluating anyone as an Ed. Consultant. Only observing the process (1)
I am not the principal doing the observations with the teachers. (1)
I do not do observations (1)
I have not been an evaluator (1)
None (1)
Principals have done the evaluations. (1)
Second semester has not yet occurred but is planned. (1)
Superintendent (1)
Two First Semester (1)
Two each semester (1)
Two observations first semester (1)
Two scheduled points during the academic year (1)
based on contract (1)
frequent walk throughs (1)
n/a (2)
none (6)
none CO administrator (1)
none for the pilot (1)
none, I'm central office administrator (1)
normally one a semester however 2 times per semester for non-tenured teachers (1)
one per year (1)
two first semester (1)
two in fall and two in spring (1)
weekly (1)
I am a union leader and have not been directly involved with formal evaluations, although I know they have occurred. (1)

Please identify the frequency of informal classroom walkthroughs you have been involved with thus far during the school year? (Check all that apply)

Value                                Count   Percent
One first semester                   -       -
One second semester                  -       -
Other (Please specify frequency):    -       -
Statistics: Total Responses 298; Average 6.3; Max 50.0
[Chart also shows 45.6% and 33.9%; mapping not preserved.]

Open-Text Response Breakdown for "Other (Please specify frequency):" (count in parentheses; some entries garbled in the source):
per week weekly (1)
10 first semester (1)
10 per semester probably
2 X month (1)
2 each semester (1)
2 first semester (2)
2 per month (3)
2 times first and 2 time second semester (1)
2 times/semester (1)
2 walkthroughs in a semester (1)
3 per semester (1)
3 times per week per week times a month (1)
3: 2 first semester and 1 second semester (1)
4 first semester 2 second (1)
4 x 22 this year per semester walkthroughs/teacher each nine week period Approximately 3 (1)
At least one for the year (1)
At least two other, non-evaluative observations have occurred. (1)
Daily (1)
Daily as possible (1)
Daily when possible (1)
Each classroom each quarter (1)
Every couple weeks (1)
Have not completed walkthroughs (1)
Have not used this as part of evaluation (1)
I am not the one doing the observations with the teachers. (1)
I believe I've had three walkthroughs so far. (1)
I do not do walkthroughs (1)
I visit every classroom every day but I don't have a rubric to follow (1)
Multiple walk throughs (1)
N/A (1)
None (5)
Not Sure (1)
Once every few weeks (1)
Once per month (1)
Principals are in room 2-3 times per month (1)
Same as above. (1)
Several (1)
Several during first semester (1)
Several for every teacher throughout the year (1)
Superintendent (1)
Twice a month (1)
We schedule multiple district walkthroughs and conduct many informal observations (1)
Weekly (2)
almost daily (1)
at least 3 times weekly (1)
at least once a week per class (1)
bi-weekly (2)
daily (1)
every classroom one time each 9 weeks (1)
every other month (1)
goal of 1 per month (1)
many (1)
monthly (2)
more than 1 (1)
multiple (2)
multiple walkthroughs (1)
n/a (2)
none (7)
none as an evaluator (1)
none informal (1)
none, I'm central office administrator (1)
not used for evaluation process (1)
numerous (1)
once a month (1)
once a week usually (1)
once weekly (1)
once/twice a month (1)
one every two weeks (1)
several dozens (1)
several first semester (1)
several times a week (1)
some weekly, some monthly (1)
three first semester (1)
twice a month (1)
two first semester, one second semester (1)
unsure (1)
varied throughout the year (1)
walk throughs weekly (1)
walkthroughs are done every day (1)
weekly (5)
weekly per teacher (1)
weekly walk throughs (1)
weekly walkthroughs (1)
I have stopped in classrooms numerouls times. However, I have only conducted one formal walkthrough with follow up. (1)

Please identify the duration of formal classroom observations you have been involved with thus far during the school year?

Value                               Count   Percent
5-10 minutes                        7       2.2%
... minutes                         2       0.6%
... minutes                         2       0.6%
... minutes                         6       1.9%
... minutes                         -       12.1%
... minutes                         -       20.3%
... minutes                         -       37.5%
Other (Please specify duration):    -       24.8%
Statistics: Total Responses 315; Sum 7,435.0; Average 31.4; StdDev 6.64; Max 36.0
(Numeric bounds of the intermediate duration bands were not preserved.)

Open-Text Response Breakdown for "Other (Please specify duration):" (count in parentheses; some entries garbled in the source):
minutes (1)
1 hour minutes (1)
1hour minutes minutes (1)
30 minutes each x 3 teachers minutes min minutes minutes (3)
43 to 45 minutes each minutes (1)
45 min plus (1)
45 minutes (3)
45 minutes minutes minutes (1)
48 minutes
50 Minute periods and evaluation usually take the entire period min as teacher, 50 min as observer (1)
50 mins (1)
50 minute observations minutes (7)
50 minutes for a non-pilot observation minutes minutes (1)
60 minutes minutes (1)
>60 minutes (1)
I am not the one doing the teacher observations. (1)
N/A (1)
No observations (1)
None (1)
Other (Please specify duration):: 2 hours for evaluated teachers (1)
Over 120 minutes (1)
See above (1)
Superintendent (1)
Whole Period (50 minutes) (1)
at least 3 hours with prep, preconfr, obersvation and post conference (1)
have not had one (1)
length of a class period: 54 minutes (1)
n/a (2)
none (5)
not evaluating (1)
plus forty minues (1)
unclear question (1)

Please identify the duration of informal classroom walkthroughs you have been involved with thus far during the school year?

Value                               Count   Percent
5-10 minutes                        -       55.9%
... minutes                         -       14.8%
... minutes                         -       4.4%
... minutes                         4       1.3%
... minutes                         7       2.4%
... minutes                         2       0.7%
... minutes                         7       2.4%
Other (Please specify duration):    -       18.2%
Statistics: Total Responses 297; Sum 2,102.0; Average 8.7; StdDev 7.06; Max 36.0

Open-Text Response Breakdown for "Other (Please specify duration):" (count in parentheses; some entries garbled in the source):
minutes mins minutes (2)
45 min minutes minutes (1)
64 walk through thus far this year each lasting at least 5 minutes (1)
Complete walkthrough almost every day (1)
I am not the one doing the teacher evaluations (1)
I'm not sure of the criteria for informal classroom walktroughs (1)
N/A (2)
No observations (1)
None (1)
Not Sure (1)
See above (2)
Superintendent (1)
between two minutes and fifteen minutes (1)
depends on lesson (1)
more than 40 minutes (1)
n/a (2)
none (7)
not evaluating (1)
not for evaluation purposes (1)
over minute =360 minutes (1)
unclear question (1)
unknown (1)
walk throughs range from 5 minutes to 30 minutes (1)
whole building 2 and 1/2 hours per walkthrough (1)

In your opinion, how effective are the new teacher evaluation methods for informing decisions about the following:
(Columns: Very Ineffective / Slightly Ineffective / Slightly Effective / Very Effective / Don't Know / Responses; "-" indicates values not preserved in the source.)

Teacher retention       6.0%    -   -   -   -   -
Teacher dismissal       -       -   -   -   -   -
Teacher tenure          -       -   -   -   -   67
Teacher promotion       9.5%    -   -   -   -   -
Teacher compensation    19.6%   -   -   -   -   -

Rate the extent to which each of the following has been involved in the design of the OTES pilot within your school/district:
(Columns: Not Involved / Somewhat Involved / Very Involved / Don't Know / Responses.)

District Administration    9.7%   -   -   -   -
Building Principal         -      -   -   -   -
Classroom Teachers         -      -   -   -   -
Union Leaders              -      -   -   -   -
Other                      -      -   -   -   -
[Remaining matrix values, including 6.3% (9), 0.6% (2), and 6.6%, could not be attached to specific cells.]

If you answered "Other" as to who else was involved in the design of the OTES pilot within your school/district, please specify: (count in parentheses)
Ass't Superintendent (1)
Assistant Principal (1)
BLT (1)
BLT members (1)
Board of Education (1)
Board of Education -- Don't Know (1)
Central Administration (1)
Charter Stakeholders (1)
Classroom teacher (2)
Curriculum Director (1)
Curriculum and Instruction Supervisor/RTTT Chair (1)
DLT (1)
Did not intend to answer "other" but unable to go back to change. (1)
Do not know. (1)
ESC Consultant (1)
ESC supervisors (1)
I didn't answer "Other" (1)
I didn't answer other (2)
Mistake (1)
N/A (1)
NA (1)
NO Other, can't get the tool to unclick the answer. (2)
No one (1)
Not Sure (2)
ODE (1)
Our LPDC (licensing office) personnel (1)
Possibly superintendent

Race to the Top Committee (1)
Race to the Top Transformation Team (1)
RttT Team (1)
School Board (1)
School Board President (1)
School Psych (1)
Special Education Director (2)
Superintendent (1)
Support Staff (1)
Support Staff and Classroom Aids (2)
TIF Coordinator (1)
TIF Lead (1)
TIF coord. (1)
Teaching and Learning Department Instructional coaches (1)
Team Members (1)
Union (1)
Union Leader (1)
Unsure of who has been involved (1)
assistant principal (2)
curriculum director (1)
directors in curriculum (1)
do not know (1)
don know (1)
i must have answered it wrong. (2)
teachers and myself, the building principal (1)
literacy specialist/clp (3)
n/a (1)
na (1)
no one (1)
no other (1)
nobody (4)
none (1)
not sure (1)
other administrator (1)
other piloting teachers (1)
parents and board members (1)
special education teacher (1)
union (1)
School Board had little if any impact, RttT transformation team's participation was limited by state changes to the evaluation system

Briefly describe your role in implementing the new teacher evaluation methods in your classroom/school/district. (count in parentheses)
A member of the OIP team and RTTT team as well as district administrator. (1)
Administrative Team (1)
As a RttT district we are implementing the OTES tool in the Elemenary and Junior High/High School (1)
As a classroom teacher to assist in the design/implementation of the evaluation (1)
As a classroom teacher, I was selected to participate in the new evaluation system. (1)

As building principal work with three teachers to pilot the evaluation tool (1)
As our building union representative I was asked to participate in the OTES training. (1)
Attend OTES TRAINING with team. Monitor informal observations. (1)
Attended 3 training sessions - conducted the new process with three teachers (1)
Attending OTES meetings 1 & 4. Explaining OTES to teachers. Helping teachers understand forms. (1)
Based on contract negotiations...not much. (1)
Building Principal and evaluator (1)
Building Principal using data to inform ODE of areas of concern or need further refinement. (1)
Building administrator piloting the program. (1)
Building principal (1)
Central office oversight (1)
Classroom Teacher (1)
Classroom teacher (1)
Committee Meetings Pre and post conferences Formal Observations Trainings (1)
Coordinate, communicate, and attend OTES (1)
Coordinator of efforts; technical assistance, RttT Lead (1)
Evaluating and following the directives. (1)
Evaluation of a select group of teachers using the evaluation model. (1)
Evaluator (1)
Evaluator - staff meetings - Adm. meetings (1)
Facilitating the process and working with our building principal. (1)
Facilitator (1)
I am a building principal conducting the pilot with three of my classroom instructors. (1)
I am a building principal using the tool with three teachers. (1)
I am a classroom teacher involved in the pilot program. (1)
I am a volunteer lass room teacher in the pilot program (1)
I am an evaluator participating in the OTES pilot. (1)
I am assisting the principal in our elementary buildings. (1)
I am currently a teacher piloting the program. (1)
I am currently completing the OTES pilot and serving on the evaluation committe (1)
I am currently conducing a pilot implementation of hte OTES with 3 teachers. (1)
I am head of the committee to design the new teacher evaluation for the district (1)
I am on the team to create/implement the new system for my district. (1)
I am one of two evaluators in the district. (1)
I am part of the Race To The Top Team in our district. (1)
I am participating in the pilot as an evaluator. (1)
I am piloting the new program for our school district as a teacher. (1)
I am serving as a test teacher for our district. (1)
I am the Vice President of our union. My role is also be the district represenative. (1)
I am the assistant principal working with a teacher for the OTES pilot. (1)
I am the building principal and do all of the evals for grades k
I am the building principal and used it with three of my staff members this year. (1)
I am the chair of both the LPDC and Resident Educator Programs. (1)
I am the chief evaluator and walk thru trainer. (1)
I am the evaluator in the project pilot for our district. (1)
I am the main evaluator in our building. (1)
I am the only administrator involved in this process. (1)
I am the only principal in my building so it has been 100% up to me. (1)
I am the person that is doing the observations and evaluations (1)
I am using the tool to evaluate one classroom teacher in my building. (1)
I attended meetings and discussed the program with principals and administrators. (1)

1 I have agreed to learn about it and pilot the evaluation process.
1 I have attended the first OTES meeting and will attend the fourth meeting as well.
1 I have attended the meetings and participated in discussions.
1 I have been assigned the task of observing/evaluating one teacher in our building.
1 I have led the OTES implementation at my school. I am a primary evaluator.
1 I have worked with three teachers using the new model.
1 I help design and implement.
1 I piloted the model as a teacher and served as a union representative for the process.
1 I was bldg principal in the pilot program...assigned to evaluate three teachers
1 I was a classroom teacher involved in the pilot program of OTES both last year and this year.
1 I was a teacher being evaluated
1 I was an evaluator and took part in aligning our evaluation tool with the state
1 I was involved in the training.
1 I was one of the teachers part of the pilot program for the new evaluation method.
1 I was the sole evaluator for three high school teachers.
1 I will be a formal evaluator and staff developer
1 I will be sharing info and doing the evaluations
1 I will be sole evaluator for both buildings
1 I'm being observed.
1 I'm on the committee
1 I'm the building administrator so I have been serving in the evaluator role.
1 I'm the middle school principal. I am currently evaluating two teachers using the OTES Model.
1 Implementation with teachers.
1 Keeping building and other principals in the loop.
1 Learning how to use the instrument to evaluate teachers
1 Member of the pilot group from the district.
1 My duties range from providing informal assistance and feedback to formal evaluation.
1 My role is to be trained and in the future I may do the special education staff in our district.
1 My role is to go through the new process and evaluate how effective it is.
1 Not directly involved. Work with Curriculum Director and Building Principals to develop plan.
1 Observations
1 Observing
1 Oversight of the district's initiative.
1 Part of the OTES team.
1 Participated as a teacher
1 Pilot school
1 Pilot teacher
1 Prepared and distributed and provided PD
1 Primary evaluator
1 Primary evaluator.
1 Principal and primary evaluator
1 Principal working through the process
1 Principal/coach
1 Professional development; resource for pilot schools
1 Providing information and overview for district administrators
1 Race to the Top / Union Rep.
1 Receive training through pilot program, share info with staff, setting timelines for completion.
1 Source of communication and implementation for our district.
1 Superintendent
1 Superintendent of the district...negotiating contract with bargaining unit
1 Support from the district level to encourage cooperation, participation, and implementation.

1 TIF coordinator - the communicator to all involved
1 Teachers and principals have been involved.
1 Union Leader attending training session 1
1 Volunteer to be observed under new OTES model
1 Volunteered to use the system myself.
1 Walk-throughs, observations, refining the current instrument
1 Worked with three teachers who volunteered to pilot the system
1 central office administrator - part of team designing and revising of local tool
1 classroom teacher; set up of targeted data for individual teacher performance
1 i will be a main evaluator and making decisions as to the direction our district needs to go
1 main source of information and implementation
1 major evaluator in my building
1 pilot goal setting, pre, observation, post...getting ready to start the 2nd round.
1 superintendent with RttT and teacher evaluation negotiated agreement
1 three-day training, classroom observations
1 I serve a role as an example of a teacher who does not receive value-added data since my subject area of Resource English does not have a large enough sample (N<6) to draw data from. I teach multiple grade levels with multiple disability categories within one classroom period, therefore it is difficult to obtain a large enough sample from a mixed, small group instructional program.
1 My role has been to take part in the training and provide support for the building principals who will be completing the evaluations
1 Primary evaluator of 3 teachers. Currently using the district alternative appraisal option to incorporate the components of the OTES system. It is very cumbersome and not realistic when we will have to evaluate all teachers yearly. It will be a full-time job per building.
1 I'm not on the RttT committee but I am one of the two designated administrators to work on the new OTES pilot. I attend the OTES meetings. I do the three evaluations with all of the goal setting 2x, pre-obs 2x, obs 2x, post-obs 2x and evaluations. I will help present our findings and general info about the new system to the staff at some not yet determined time.
1 The building principal and I have met on several occasions to discuss the new program and read over the requirements. I am one of the teachers in the building participating in the pilot.
1 We have been very transparent with the process. Besides the three teachers involved in the pilot, we have had at least four other teachers attend the day 2 and day 3 training. We have utilized the observation rubric tool as professional development during our weekly morning meetings. It aligns very well with our current evaluation tool.
1 I have been formally evaluated after a pre-conference. I was involved in a post conference and have had several walk-throughs.
1 I participated in the self-assessment and goal-setting. I attended the day 1 seminar. My administrator has completed walk-throughs.
1 On scheduled PD days within my district, I have met with all staff to discuss the OTES program and all updates following the series of meetings that I've attended. I've also worked with the participating teachers to practice implementation of the new system.
1 I am the building principal, therefore I am the one that does the evaluating of the teachers in my building.
1 I am the representative for my building and also a union member. I am working with the 3 teachers from my building who are actually being evaluated to help them understand the process and be a liaison between them and administration
1 I am the building principal implementing the new OTES system with 3 teachers in the Middle School.
1 I conducted the preconference with the teachers, the goal setting followed up with walkthroughs and formal observations, and then held post conferences with teachers afterwards.
1 I was building principal. I worked with the sped coordinator and 2 teachers throughout this process. We role-played pre-conferences, conducted 2 observations and held a post conference. We did not attend the first meeting so we missed much crucial information about the process. We spent lots of time trying to figure out what everything was and how it was to be implemented.
1 I was union president when the pilot began. I went to Day 1 training. I worked closely with my building principal and 2 other staff members on the self-assessment and goal setting portions. My building principal did 2 formal evaluations in which we had a pre-conference but did not discuss the lessons afterwards. 2 weeks ago I was moved from the high school to become elementary school assistant principal. So I am no longer really involved with the pilot, but the Assistant Superintendent wants me to attend the Day 4 training.

1 I am participating in the pilot. I have had two formal observations last year and one so far this year.
1 I am the union president and was instrumental in our LEA joining the Ohio Appalachian Collaborative/RttT, etc. I have been a part of our discussions and planning throughout the past months. I also volunteered to be a pilot teacher for OTES. I am an Ohio Master Teacher and also a National Board Certified Teacher (renewed) so I have strong feelings about putting good teachers in front of our students.
1 I provided direction and, hopefully, some concrete ideas regarding how we can do this process so it will be helpful to all involved.
1 As the Assistant Superintendent in charge of Curriculum & Instruction as well as Professional Development, Assessment and Research, and Human Resources, I have worked closely with the two building principals and six teachers through the pilot. I am also facilitating a District Teacher Evaluation Revision Committee based on the new framework and our work in OTES. As a RttT district, we will be piloting the new tool in each of our buildings with small groups of teachers for the school year as long as we can be trained as credentialed evaluators by the start of the school year.
1 I am the RttT director and am observing the Principals as they go through the process so I can give the remaining teachers support.
1 Classroom teacher, union member, attended the trainings, participated in the observations and its steps
1 As a TIF district and the coordinator, I was involved with the training and helped set up the implementation in the district.
1 I have been a part of a team of educators who were part of the initial process of developing this new evaluation system within my district. We met with a representative from ODE and she briefed us on the new expectations and regulations of the new OTES.
1 As building principal I am being trained to be a teacher evaluator and I am working with three teachers in my building with the pilot program.
1 My role is to utilize the OTES model (without student performance data) with applicable staff personnel.
1 I will provide PD for teachers in my building. I will share the results of the pilot with other administrators
1 As Principal, it is my job to implement the OTES pilot with 3 classroom teachers. In addition, I have kept our DLT and building staff abreast of what the new evaluation system will be and what our district needs to do to be in compliance. I have contacted the Danielson group for the new framework and will be working with the Union to develop a revised teacher eval. tool which is part of the master agreement.
1 I am a union leader and a member of our RTTT transformation team and feel I have had virtually no role in implementing the new evaluation system. Once our administration chose to go with OTES, the only teachers involved are the few piloting OTES.
1 Evaluator of 3 teachers in my building. Sharing our experiences with our district's transformation team.
1 As a classroom teacher, I am an active participant that is trying to be open-minded to the process. I am aware that it is a pilot program and the objective is constant revision.
1 So far the principal and I have used the methods in my observations. We have completed the goal setting, preconference, observations, and post conference.
1 I am the evaluator being trained and completing all evaluations. I have created forms as needed (we wanted a preobservation conference form before the meeting) and have taken notes from the three teachers on their recommendations on forms as we've used them
1 As school improvement coordinator, I was in charge of finding the personnel to participate in the pilot. I have participated in all trainings thus far and have met with building principals to review training information and reflect on progress.
1 As the union leader, I will be involved in implementing the new teacher evaluation in our district. I will be directly involved when teachers are rated ineffective or have to develop improvement plans.
1 I am one of two evaluators. We are currently in the process of completing the evals with the pilot volunteers. I will serve on the committee that will revise our teacher ob./eval. to conform to the new law.
1 I am a teacher being evaluated with the pilot method and will work with others to design a new method that works for our district and aligns to the standards
1 I am the Race to the Top chairperson, and have taken a role as both evaluator and observed teacher during the process.
1 As supervisor it is my job to run a pilot system in order to find its effectiveness on our instructors' performances
1 I was asked to participate as an art teacher. The only data available to me was as an English teacher. I was involved in a training in the district which consisted of a poor summary of the program. A distribution of paperwork (tools) and the administrators setting deadlines to have paperwork completed. Any additional information I received was through the local teacher's union and OEA.
1 Two principals (including me) are involved with the full training process by the state. We are sharing information at staff meetings in all buildings. I have completed first observations on 3 teachers.
1 I am on the district team. I have only attended one meeting thus far. We have not shared much information with the staff as a whole due to this being a pilot project.
1 I am our Board Administration representative and have not been allowed to participate in the Evaluator Training, nor do I complete observations at the school level. I am helping to un-pack the Danielson model and explain the process through PD during Transformation Team meetings.
1 I have been a part of the trainings and then brought that information back and shared it with a sub committee.
1 Led Evaluation Committee on District's Transformation Team as a part of Race to the Top. Attended each of the trainings. Actively involved in informing District Admin Team and other building principals. Working closely with building reps and union leaders.
1 Collaboratively establishing the evaluation system. Working with principals to ensure proper implementation.
1 I was one of the Guinea pigs. I am also a union officer which will help when it is time to roll out to rest of staff.
1 As a classroom teacher I worked closely with the Building Principal to implement the new teacher evaluation. We worked together through the process and I was responsible for explaining the process to other staff.
1 I have implemented the new teacher evaluation methods in my school with three teacher volunteers. The volunteers are from Kindergarten, Third Grade, and Fifth Grade. All were invited by me to participate.
1 We are a Race to the Top School, so we agreed to participate in the pilot to see if the OTES as it is would be an effective, efficient method for teacher evaluations. We are in the process of adapting it to something more user-friendly, less time intensive for the evaluator that is still within the framework.
1 As a veteran teacher, I have worked on the self-assessment tool and talked to other teachers about how to best use it.
1 I am the primary evaluator for my building. During the pilot year, I am evaluating one teacher. We have attended the ODE provided training on the OTES system.
1 I am the union president and am involved at every step of the developing and piloting of the new teacher evaluation.
1 Conducting pre and post observation meetings, observations, writing up the observation reports, meeting to talk about goals
1 I attended day 1 training and have been through the pre/post/and observation. My principal has been in to observe and discuss my lessons.
1 I have followed the plan to its fullest - using the ODE/OTES training days as a guide. I have chosen the three teachers and have gone through the process with them; however, there are elements on this survey with which I am not familiar. There are elements of this survey which I have not learned at the training sessions.
1 Assisting two teachers in the pilot with their goal setting. Explaining the process to the two teachers in the pilot. Working closely with the pilot teachers in the pre and post conferences. Doing two formal observations of each of the pilot teachers. Doing an evaluation based on the new form.
1 I am a principal with 3 teachers being evaluated with the OTES model. I am also the co-chair of our district team.
1 I was trained and piloted the program with two classroom teachers. I will be responsible for training teachers and implementing the evaluation system next year.
1 I attended the first meeting as an introduction to the OTES pilot. I have also had meetings with my principal and fellow teachers to review, discuss, and plan for OTES.
1 I was in the first training and explained it to the other teachers in my grade level. I was also one of the test teachers.
1 I have attended OTES meetings to learn about the new evaluation methods. Also have met with my principal and other teachers involved to better understand what the evaluation consists of.
1 I attended an OEA leadership academy strand (3 days) and shared the material with our assistant superintendent. I am part of our district's evaluation team and was included in the decision to pilot the OTES. Our evaluation team has developed a job description for an effective teacher, based on the teacher's standards. We have been meeting as a team while implementing the OTES pilot. We have also been informing our certified staff to keep them informed and also get their views.
1 My role as a pilot participant is to work with the association and the district team to develop the new evaluation.
1 I am currently evaluating 3 teachers using the OTES model. We are currently working on our local evaluation system.
1 I have participated on the district committee that discusses how to implement the OTES model within our district.
1 I have attended the training and have implemented the OTES pilot with three teachers in our district.
1 I am one of 2 teachers in my school district attending the 1st and 4th OTES Training sessions with our participating principals. There are 6 teachers in our district participating in the pilot--3 from one of our elementary schools and 3 from the high school.
1 I am a member of the district's Race to the Top team. I attended the day 1 training and I am one of the teachers being evaluated using the OTES model.
1 I have been the leader in educating staff in the process and procedures. I have also been responsible for carrying out the expectations of the ODE pilot.
1 This has been a very confusing pilot for our district. The roles of the pilot were not clearly defined at the beginning, causing staff to be very confused as to what our purpose was in participating in the pilot. Following the day 3 training session, we have completed more observation and time with the pilot staff.
1 I have been responsible for development and implementation of all state standards for both health and physical education
1 Co-Chair of the Great Teachers RttT team in designing the TES in our district to align with the OTES framework.
1 I am heading up the RttT initiative in our district, therefore I am facilitating meetings regarding the OTES pilot in addition to subcommittees of our RttT Transformation Team.
1 The role was to select three teachers to pilot the OTES with in the school year. I have chosen a guidance counselor as one of the participants to see how this fits with their job. It has been a struggle to see how this evaluation tool works well with their jobs. I meet with the Race to the Top team and update how the training is progressing. The pilot is not beyond the three teachers in terms of staff knowledge about the pending changes in evaluation.
1 I participated in the evaluation system as a teacher who was being evaluated under the State model for the first time.
1 We've met together to discuss how to properly integrate the growth model into the OTES. I've met with my teacher, held a pre-conference, an observation, a post-conference, and a handful of walk-through observations.
1 We have a committee on teacher evaluation and I was put on it. Went to a meeting; administration did not want teachers involved in the evaluation.
1 Serve as the Director for School Improvement and work directly with the seven School Improvement Grant Cohort One schools.
1 I am the point person for both the union and administration. I work with both groups on learning the new system and components of the OTES model.
1 I am participating in the pre and post meetings of the evaluation process. I created goals and we discuss those goals at each evaluation meeting.
1 We have been a pilot school and I am on the district committee to develop our new evaluation system.
1 I've participated in the piloting program and have used the new evaluation on the three teachers required through the program.
1 I have attended ODE trainings. I have worked with the three teachers who are helping me pilot the system to introduce and help them learn to use the tools provided. I have conducted pre- and post-observation conferences and observations. I have worked with the teachers to set SMART goals.
1 I am a Union President. I have been actively involved with the new teacher evaluation system. We are a RTTT school. As a result, I am using my experiences with the ODE's evaluation system to guide me as I develop a new evaluation system for our school.
1 I have attended the Day 1 Training. I have met with the principal and several other teachers to sort out details.
1 I was on the original committee in 2000 when we implemented a standards based framework for our District evaluation using the Charlotte Danielson framework. As Ohio has moved forward to implement OTES I have met with the District Evaluation committee to encourage participating in the OTES pilot. I feel my voice has been instrumental in the decision to pilot.
1 I am a first grade teacher. I was selected to participate in part because I am also National Board Certified. I found this experience to be a similar one in many aspects. I did the survey, collaboration logs etc. and found them very time consuming and would be too much for an average teacher to do on top of all the other mandates that we are working with. It was also very time consuming for the principal and difficult to schedule times to meet.
1 Building administrator responsible for implementing program with 3 volunteer teachers. Have also trained the entire staff on the new system.
1 I am a classroom teacher, so my role is to complete the teacher components of the OTES as a pilot participant. I have also attended all the OTES trainings, even the ones designed for the evaluators. I will help train staff as we initiate the new evaluations next year.
1 ODE training and sharing of materials with staff. Pre conference, observations and walkthroughs, post conferences
1 We attended days 1 and 2 of the training. I was very frustrated with the training and did not feel it was something we wanted to continue to do. I am in favor of changing the evaluation system and think it could be a great thing, but to pull me to a meeting where it only really took two hours but make me miss the entire day of instruction at school on multiple occasions is actually unfair in my opinion. I drove to Dayton for one meeting. I arrived at 8:30 for a 9:00 start time. We were informed that we would start a few minutes late due to traffic. Not acceptable in my opinion. Then we were given one hour and fifteen minutes for lunch. Then the training included me watching a 62-minute video of a teacher teaching her class. Nothing like what I am dealing with in my district. It was insulting to my time. I then approached my superintendent and asked if I could be excused from this training in the future and I would do whatever she decided to prepare for the new evaluation system. She agreed and I did not go back for the day 3 training.
1 As a classroom teacher, I have taken part in the OTES training meeting (#1), self-assessment, goal-setting, and writing and implementing SMART goals for my own classroom. I have also worked cooperatively with the other two teachers in the building who are piloting the program.

1 I am one of the teachers involved in the pilot evaluation. I am also on the Race to the Top team so I attended one of the informational day-long meetings about the new program.
1 As principal, I am the primary evaluator. I have attended meetings with OTES and monthly district meetings with our evaluation team (Asst Superintendent, Special Ed Director, myself and another principal, 3 teachers, and the union president and another union rep). I have evaluated 2 of my 3 teachers (1st round). The third has not been in attendance at school this quarter.
1 I am a member of the Race to the Top committee. I am involved with the pilot, and I am attending workshops designed to "Redesign the Evaluation System" to align with the OTES model.
1 I am a classroom teacher that has been asked to go through the pilot program and work through the process.
1 As building principal I have been doing it with three volunteer teachers. I have also made sure to have regular presentations on the pilot activities to the entire staff to keep them informed. Our building leadership team regularly reviews the progress.
1 As the building principal, I have attended all of the trainings through the ODE and have taken information back to my staff. We have had multiple discussions at grade level and staff meetings about the new OTES model.
1 The only building administrator for 34 teaching staff...and the point man at all OTES meetings. Very active role!
1 Main representative for our district. Attended all training sessions, conducted all observations and conferences.
1 As building principal, attended 3 (will be 4) days of training w/ ODE. Piloted the OTES process and evaluation form w/ 3 classroom teachers. Participated in pre- and post-observation conference, professional and communication goal-setting process and gathered feedback from those 3 teachers on their impressions of the OTES.
1 As the building principal I am one of the pilot administrators along with 3 teachers in my building. I am also on the district evaluation team involving teachers from all 6 district buildings.
1 We started as a Team in July to write an evaluation for our school district. The Gap analysis we completed reflected the need for a better process for evaluation to be in place. Since our school is a part of the RttT/OAC and TIF we decided to review in detail the ODE OTES model. This we did over months of meetings and we changed the formats to fit our School District. We presented and received approval to implement the OTES forms for our District in October. We copied notebooks of the forms for all areas (Self Evaluation / Goal Setting / Formal Evaluation / Communication and Professionalism). We also copied it (OTES forms) on a jump drive for each teacher to access during the process. The principal conferenced and reviewed each part with teachers and teacher teams to explain the process and for them to be able to ask questions and get feedback prior to the start of the observations. We are now completing the evaluation but since this is what I describe to the teachers as our "learning year" conferences have taken a long time. We feel it will be quicker as we continue the process.
1 My role is as a formal evaluator. I serve as the high school principal, assistant superintendent and facilitator for the transformation team. I share the responsibility of developing and implementing professional development to inform staff of the OTES process.
1 I was involved in the Professional Goal Setting, Self-Assessment Summary, and the formal observation. I had the pre-observation meeting and the evaluation, but not the post evaluation meeting yet.
1 I am a classroom teacher who is piloting the program and then will be used to explain the evaluation to other buildings.
1 I am a classroom teacher that has been participating in the pilot program for 2 years. I have gone through the evaluation process during the and school years. I am on the committee that will be making the final recommendation as to how the evaluation process will be done in our district in the future.
1 attended training, took self evaluation tool, identified goals from the self evaluation tool, formal and informal evaluation
1 I have been one of two persons attending the 4 day training process. I am responsible for completing the pilot project of observing and evaluating 3 teachers on our campus.
1 I was personally trained in the new teacher evaluation methods by ODE on day 1 and later shared the information gained with other teachers who were to participate in the pilot program. I also completed the self assessment tool and goal setting tool on my own after being instructed to by the building principal, who asked that I submit these tools but did not offer any feedback or even schedule any conferences.
1 As building principal, I have worked with a small team of building principals and teachers as we drafted our own tool using the Ohio Standards for Teachers. We are currently piloting with a small number of teachers. The high school is piloting our own tool with some of the OTES protocol with our continuing contracted teachers. Each of the four elementaries and one middle school principal are piloting with one teacher at their buildings. Those evaluations are not complete at this time.
1 I am just the district liaison for this pilot. I have attended the first session and support the two building principals who are involved with the entire training.
1 Building Principal. Orientation of entire staff. Working closely with those teachers directly involved in the pilot.
1 As an assistant principal, I have attended Day 1-3 training so far and have been the administrator directly involved in implementing the pilot evaluations.

71 Appendix 1A - Survey I have attended all 3 training sessions sponsored through ODE. I have assisted the piloted teachers in completing the self-assessment tool, goal setting form, and meeting with specific teachers for pre-conference and post conference. 1 As union president, I have been involved in the planning that proceeds the creation of the teacher evaluation model that will be used in our school. 1 As union president I am working with administration to implement and as a teacher I am being evaluated using the new system. 1 My role as district administrator is to ensure the fidelity of the instrument and to make sure we have union support for the instruments. 1 I have yet to be observed by my principal. I have left him numerous notes about coming into my classroom; these go unanswered. 1 I am a pilot teacher. I have participated in the new observation methods and walk through's. That's about all I know. It's still pretty new to me. 1 Attend trainings offered by ODE. Implementation of goal setting with entire staff. Use of OTES tool with 3 staff members for pre-observation conference, formal observation, post-observation 1 As the Director of District Improvement, implementation of OTES and OPES has been a top priority. I facilitate our district Evaluation Committee, a team composed of both administrators and classroom teachers with union representation which meets on quarterly basis. 1 As the building principal, it is my job to complete the pilot observations on two teachers. It is also my job to create the timeline of the model and present to the Board for approval. 1 I have completed one entire formal observation of a veteran teacher. I serve on the district transformation team and have shared the process with the team, including all principals. 1 To train/inform district teachers about OTES. To facilitate entire process with building teachers participating in the pilot. 
1 I am the administrator who has been attending the sessions by ODE and completing the evaluation processes.
1 Veteran teacher who has volunteered to be part of the pilot by being observed and evaluated based on the OTES Model.
1 I have spent time with the evaluators discussing goal-setting and observation methods. I have also met with the three teachers undergoing the evaluation pilot in order to help them with their self-evaluations and goals.
1 I am one of three high school teachers involved in the pilot. I have set goals, documented said goals, and completed one formal evaluation.
1 I am working on our district committee to implement our own Teacher Evaluation instrument based on Danielson's model.
1 Participated in three days of training to date. Invested 5 hours of meeting time to "crosswalk" our evaluation instrument criteria with the state standards. Communicate to/inform staff regarding the potential changes in teacher evaluation. Working with three teachers as part of the observation/evaluation pilot this school year.
1 Attending OTES model meetings with the team and working with pilot teachers in the building as the district model is being formed.
1 I have worked with three teachers regarding sharing the information and using the information during the observation.
1 Participating in pilot. Using evaluation tools in the Resident Educator program, leading evaluation design and training meetings, communicating upcoming changes to teachers in all buildings. Will be primary source in working on contract language.
1 Agreeing to be a part of the pilot program and then attending the training and implementing the process accordingly within the building; acting as the evaluator.
1 As the building principal, I am also requiring my administrative team of assistant principals to learn the process. Each administrator at Pickerington HS North (5) will learn the pilot with 2 teachers. Total of 10 teachers and 5 administrators.
I also am a member of the Race to the Top Transformation Team, and I provide updates of OTES progress at each transformation team meeting. District personnel and other building principals who are not participating in the pilot receive this information as the process moves forward.
1 Attended first training session, teacher viewpoint, helped local teachers prepare for evaluations.
1 I was chosen as a Union Rep for the building to pilot this system. I attended day 1 training and am scheduled to attend day 4 soon. I have gone through one formal evaluation. I foresee myself acting as a go-to person for teachers in the following years as the new system is rolled out.
1 As the building principal I am completing the walk-throughs and observations, writing the reports, and meeting with the teachers for the pre- and post-conferences.
1 Attending the training, coming back to the district to make the information suit our district and building, going through the pre-observation, observation, and post-observation to feel more comfortable with the process.
1 As an ESC supervisor (Northwest Ohio Ed. Service Ctr IRN , not listed in school district choices on page 1 of survey) and principal of a residential/separate public facility, I complete evaluations for intervention specialists. I am the only evaluator of these teachers.
1 I have been leading OTES Committee meetings on the creation, analysis, & revision of our evaluation tool.
1 I attended the OTES Pilot Day 1 training and will attend Day 4. I am part of the team that will deliver professional development to district staff regarding the new evaluation tool.
1 I have been working closely with one of my staff members. We distribute or share information after each meeting.
1 I was asked to participate in the pilot as an administrator. It has been very informative to go to the trainings but very difficult to implement it as needed due to time constraints and other duties.
1 I am being evaluated using the OTES pilot. My principal and I are trying to work through the system.
1 Piloting OTES with three teachers in my building. Attending OTES training and communicating with my district. Creating an implementation timeline for the district.
1 I have piloted the program with three moderate-to-intensive intervention specialists in my building as the building administrator.
1 As District Administrator I have supported principals by helping them to understand the portions of our current evaluation system that align to OTES, and the pieces that are new. We have held meetings to orient all teachers and principals involved in the pilot.
1 As the building principal I have attended all of the trainings, and met with my 3 teachers who are helping us to pilot this project. I have conducted all of the phases of the OTES model, and held various briefings to our staff during staff meetings.
1 I am piloting the new model with three teachers: a master teacher in 7th grade science, a counselor, and an art teacher. We do not have student measures developed for teachers who do not have value-added data. I have completed one announced observation, complete with pre- and post-conferences.
All of my teachers used the goal setting rubric, and the three in the pilot completed the self-evaluation before setting goals.
1 I am a special education teacher who participated in the evaluation process in my classroom. I spent an unreasonable amount of time on pre- and post-conferences and reflection pieces. I feel my time would be better spent working with colleagues to improve student learning.
1 I served as the primary evaluator and provided continued feedback to other staff members on developments of OTES.
1 I am a teacher being evaluated. I have helped to identify problems that need to be adjusted to meet our school needs.
1 I have been the one to share all the new information with my participating teachers. I have worked all year with getting to know the rubric and implementing the tools into my observations of these teachers.
1 I have been the sole observer of teachers in my building. I have been working with the high school as a district team.
1 Participate in the OTES Pilot program and share knowledge gained and data collected with a District Wide Evaluation Committee.
1 I currently observe teachers, so I will play a strong role in implementing the new evaluation methods.
1 I am part of the team. I have participated in pre-observation, formal observation and follow-up at this time. We have been tweaking the plan to best meet what we need for our system.
1 I am the building principal who is responsible for evaluating staff; I have attended every inservice day and have had many conversations, both formal and informal, with colleagues.
What successes or benefits have you or your school/district team experienced as a result of implementing the new teacher evaluation methods thus far?
Count Response
1 *
1 ?
1 A better understanding of what is expected of teachers.
1 A degree of knowledge with the process.
1 Alignment of OTES rubric to Danielson rubric; comparison of tools.
1 All on the same page.
1 All staff are focused on what effective teaching looks like.
1 An awareness of the whole evaluation process & why each segment is important.
1 As a part of the pilot we are getting a head start on understanding the process involved.
1 Based on teacher standards.

Better reflection and using data to improve student achievement.
1 Better instruction.
1 Better observation instrument.
1 Better understanding of the tool and the requirements of HB
1 Better, richer, more reflective dialogue with teachers.
1 Clarification of the OTES.
1 Conversation; pre/post conversations.
1 Core group of Resident Teachers feel comfortable observing new teachers and peers.
1 Do not know at this time.
1 Fairness, feedback and making teachers accountable.
1 Feedback to teachers and data analysis.
1 Focus is now aligned with OSTP and improving student learning.
1 Found that our current tool closely aligns with the state standards criteria.
1 Getting a better understanding of the new evaluation process.
1 Goal/data-driven measures; teacher reflection on practice.
1 Good reflection piece.
1 Good feedback regarding the process. Understanding what is ahead.
1 Great reflection; created teacher awareness and urgency for academic performance.
1 Has given our staff an opportunity to see what the tool will look like.
1 Having more of an evidence-based model for the evaluations and what we see.
1 Helps to start the conversation of effective teaching.
1 I am not sure.
1 I am understanding the process and forms with each training day.
1 I am unsure of any benefits.
1 I believe OTES is much more structured than our current evaluation system.
1 I believe a strength of the system is it causes self-reflection on the teacher's part.
1 I cannot say because I have not been included in the experience.
1 I don't feel I have enough experience to comment.
1 I don't think there is enough information yet.
1 I have a better understanding of the process.
1 I like the reflections.
1 I liked writing the SMART goals.
1 I think we are just getting to know the system and working to understand all of the components.
1 Insight into a new evaluation process.
1 It allows teachers to reflect on their practices.
1 It has been successful so far. There have not been any big issues with the program.
1 It is too early to tell.
1 It's a better system than what we were using.
1 Just piloting the OTES evaluation method thus far. Not to implementation stage.
1 Knowledge of what is expected / opportunity to try it out.
1 Makes teachers really think more about teaching and learning.
1 More discussion as a district around teacher evaluations.
1 More in-depth and helpful.
1 NA
2 None
1 None at this point.
1 None so far.
1 None that I'm aware of.
1 None to mention at this time.
1 None. I completed "tools." I received no feedback and no observations as part of the pilot.
1 Nothing notable due to our current evaluation system lining up with the state's.

Observing the quality of teaching and improving dialogue between teacher and administrator.
1 Our teachers are becoming much more familiar with the Ohio Standards for Teachers.
1 Personal reflection of the teacher.
1 Personally, other than posting learning targets I have not changed my teaching style at all.
1 Prior knowledge of OTES before non-RttT districts.
1 Rich discussion about goal setting and standards.
1 Richer conversations than the previous tool allowed; goal setting; teacher reflection.
1 Rubrics are great! They take any chance of personal opinion out of the process.
1 Self-reflection aspect is great for teachers in goal setting for improvement in areas of need.
1 Slow process; very time consuming in addition to all that needs accomplished.
1 SMART goals; self-assessment.
1 Staying abreast of what is coming on the legislative/mandated horizon.
1 Still learning the process.
1 Successful implementation with pilot.
1 Teacher awareness of the system.
1 Teachers are becoming more familiar with the teaching standards.
1 Teachers are starting to work towards completing their goals.
1 Teachers have become much more aware of the impact of Student Growth and the Teacher Standards.
1 Teachers have been more reflective about their lessons.
1 Teachers like that they may have a choice of who their evaluator will be.
1 Teachers more reflective in planning their lessons.
1 The OTES evaluation system definitely makes a teacher more reflective about classroom decisions.
1 The discussions have been rich.
1 The evaluation is very thorough and comprehensive.
1 The goal setting process was believed to be beneficial.
1 The rubric has been a useful tool.
1 The self-reflection forms have been very valuable for both administrators and teachers.
1 The strongest and most useful piece of the model is the teacher goal setting and self-reflection.
1 The teacher evaluations are quite similar to those already used in our district.
1 There has been more discussion of the process, but that is the only success thus far.
1 This program has started the needed dialogue between teacher and administrator.
1 Too early to tell.
1 Too soon to say.
1 Understanding of how the system is likely to look in the future.
1 Very defined goals to work on. Self-reflection on their daily instruction.
1 Very little.
1 We are ahead of the game.
1 We are currently piloting the method; can't say we have had any successes or benefits yet.
1 We are learning about the new OTES system and are gaining experience using this model.
1 We are really just getting started, so the jury is still out...
1 We clearly see the need to streamline the way we are going to be evaluating staff members.
1 We developed a better, more thorough walk-through form aligned to standards for teaching.
1 We have aligned our current evaluation tool with the OTES model.
1 We have created some awareness of changes that are forthcoming.
1 We have done this same format for over 10 years.
1 We have found that our current system is aligned to the new pilot program.
1 We have not completed the evaluation process so I cannot comment.
1 We have not experienced much success because it takes way too much time to implement fully.
1 We have not implemented to a level that would yield results at this point.
1 We haven't seen any yet.
1 Working with each other in a positive reflection.

Awareness of a better preassessment tool.
1 Collaboration on project.
1 Getting a better understanding of what is expected and how it will work.
1 Goal setting and self-reflection of teachers was productive.
1 Modeled from educator standards training from the state.
1 More in-depth and effective; more administrator/instructor interaction.
1 n/a
1 none
1 Not sure at this point.
1 Ongoing, frequent meetings and discussions.
1 Still in beginning stages.
1 Using the teacher standards more consistently and measuring student growth.
1 Very few, except communication brought on by the evaluation process.
1 We are already operating at a high rate of self-evaluation and enjoying the successes of a staff that is already reflective. So we are continuing those successes, I would say.
1 The self-reflection from teachers is invaluable. Teachers are really taking the evaluation as a professional growth experience and a time to reflect on their teaching. It has not been perceived negatively by the teachers, as I thought it could be. I thought that some teachers may be worried due to the teacher value-added piece.
1 Completion of the self-assessment with staff has allowed them to identify areas for improvement in their instruction.
1 This is very similar to our current plan. The only additional thing is the goal setting. I like this component and hope to incorporate this with our current evaluation system.
1 I feel that we already have an effective evaluation tool that we currently use, with 2 observations and then an evaluation for those teachers who need to be evaluated because of contracts.
1 I would say the self-assessment and goal setting has been productive and effective, but the formal evaluations have not really supported the development of the goals written by the teachers. This has led to additional meetings and conferences to discuss how the goals are being met.
1 Involved principals find the pilot very helpful in planning for implementation of the new teacher evaluation methods. As we provide information to the buildings and teachers not currently involved in the pilot, we find teachers apprehensive and concerned but aware that many of the changes will most likely lead to improved professional development and classroom instruction.
1 Completing the GAP Analysis tool. Also, our current system and process is similar to the OTES Model.
1 As a principal, I like the OTES model. It is thorough, easy to understand, and it provides valuable information to teachers.
1 We have experienced an increased awareness of the relationship between student growth and teacher performance.
1 Greater accountability for teachers and added professional development as a result of going through the process.
1 I really like putting my comments on the rubric form. The teacher can visually see where he/she needs to move to improve.
1 Our current evaluation system is very outdated; this has helped us focus on teaching and learning.
1 The building principals are both fairly new to the evaluation process. Therefore it has been very beneficial training for them.
1 Increased communication between the principal and teacher. Also, having a chance to view the evaluation form helped shape my lesson planning.
1 I think the self-assessment tool is excellent. It requires teachers to really think about what they do and why they do it. Even for the things I know I am good at, when really reflecting on those items, I found that I could still continue to improve on them.
1 We have a decent understanding of what the new system will look like in 2 years when it is implemented. Other than that, there have been no benefits.
1 The rubric has helped to define the expectations and narrow down the habits associated with effective teaching. The role of the pre- and post-conference has been very well explicated and proved to be powerful in the professional growth model.
1 My teachers found the eval tool to be useful when I provided feedback based on their observations. My visiting their goals, and re-visiting after our lesson, proved to be very powerful.
1 The process has been very scattered and it is not fitting together for my teachers or myself. The goal setting does not reflect the 30-minute formal observation process. How to identify student growth measures has not been discussed or determined yet. The rubric is hard to follow and the post-conference write-up is inadequate and subjective.

It has increased focus on the elements of the teaching standards. The rubric for classroom observations has been a useful tool for self-reflection and discussion among the staff. Some of the 21 teachers not directly in the pilot this year have used the rubric and information to begin making adjustments. It also has been discussed in our Professional Learning Communities.
1 We seem to have a common interest in the evaluation process: making it an effective tool for improving instruction for the good of our students & community.
1 We have not implemented this yet in our district. We have just piloted this with a few teachers. It has been very beneficial to me as an administrator to talk to the teachers about the new evaluation and the expectations that will result.
1 Goal setting conversations as well as an understanding of best-practice needs have been established as a result of the OTES.
1 The self-reflection and goal setting on the teacher's part is a very valuable tool. It helped me to grow and achieve more as a teacher.
1 Time is a definite factor. The amount of paperwork involved is a little overwhelming for all involved. Duplicate versions of forms make it difficult for people not involved in the training days and cause a lack of clarity for participants.
1 Some teachers feel that the feedback given to them by their principal has been more effective and more informative in guiding their classroom instruction.
1 I feel that we had a very good tool that we have used in the past; the state's tool was similar, so it was effective in evaluating.
1 Both teachers involved have tried new activities and have gathered more data following discussions after the observation.
1 We (Race to the Top team) are finding clarity in the changes that will need to occur and the type of professional development that will be needed to convert to the new system.
1 Being able to have a part in recognizing the strengths and weaknesses of the method and having the opportunity to experience the evaluation program before it is implemented.
1 I like the guidelines set forth in the rubric, though I'd like to see better alignment across each sub-category for consistency.
1 As the two teams discuss benefits of this pilot, we agree that we have benefited from the reflection piece.
1 We see great benefit with the goal setting portion of the system. We see the opportunity for a great amount of reflection and self-improvement through this process. The collection tools for communication and professionalism are easy to use.
1 Nothing of any consequence thus far. We are still in the working and learning stages. It is quite a lengthy and rather confusing process.
1 This is my first year in this building as an administrator, so to be honest the emphasis on this pilot was very low. So, the thoroughness of implementation could have been much better on my part.
1 None, and it has caused problems with the staff, as they feel that they are being evaluated on concepts that are not in their area of influence. For example, students coming to school unprepared for learning or having limited literacy experiences.
1 The OTES tool aligns nicely with our current Danielson model for teacher effectiveness in our current evaluation system. However, the student growth measures are created based on a business model that does not align with creating productive citizens. Instead it focuses on making education a neatly packaged product, like a car or phone. It completely ignores the human factors, ignores the diversity of learners (all standardized tests to measure growth), and works to wipe out all educational areas that do not directly line up with a specific career skill set. These types of business models will create one-dimensional thinkers; a citizenry that will only be able to slide part "A" into part "B". The great thinkers of history and inventors of our time were not marginalized in this way.
1 Teachers learning to systematically reflect on their practice. Also, the rubric helps teachers focus on what is important in classroom practice and instruction.
1 The pilot has been very valuable for our district in terms of analyzing our needs and having the knowledge to anticipate problems and plan for communication/buy-in and necessary professional development for next school year.
1 The major benefit is this helped facilitate a contractual agreement with the union to change our evaluation tool. It has also been helpful to be a part of the process to get a feel for the direction of teacher evaluation.
1 There are no real benefits to anyone in regards to this system. It could be useful with new first-year teachers or teachers who have been found to be deficient in their abilities to teach. In regards to using it for the whole certified staff, it is too time consuming. We have 450 students with almost 50 staff members. I have no assistant and one secretary. I have 32 certified staff members to evaluate. I can't imagine how I would be able to get all of them done correctly in 7 months. Who is going to take care of behavior issues, meet on IATs and RTIs, meet with irate parents, and run and manage the building when I'm observing? Fortunately, we have one more year ('12-'13) on the teachers' current contract and this is my 34th year in education so I really don't have to worry about it. I have two children of my own who want to be teachers and I've told them if they want to be teachers I certainly won't help pay for their college education (I will, but I told them I wouldn't). Teachers, firemen and policemen have come under attack by our state government. Instructionally, my teachers are very good. However, I have had issues with teachers in the past who lack tact and diplomacy when working with parents. Some lack a nurturing, empathetic nature when working with children, but that really isn't addressed in this new OTES model. You can't base teacher evaluations on instruction and test scores alone. Furthermore, teachers in our state and country are not respected by legislators or the community in general. We do an excellent job of educating the students in our district, excellent...yet we are constantly under fire by politicians, community members, the business roundtable and Battelle. However, our local community is supportive and always has been because we do a fine job of educating our children and preparing them for their futures.
1 Some of our tools are more rigorous, especially the SMART goal setting based on teacher evaluation results as well as student growth and achievement. Goal setting also defines the professional development necessary to meet the SMART goals. We have developed a Multiple Measures Matrix and this is used in the goal setting to determine what measures the teacher will be using to assess student growth and achievement.
1 Updated evaluation form dealing with relevant material that can lead to data-driven decision making.
1 Teachers and administration both feel it will force more conversation and reflection on the part of both.
1 The teachers who were evaluated (including myself) found the self-evaluation process was very useful. Also, the goal-setting procedures were excellent and helpful for long-term growth.
1 I really like the pre- and post-conference of the evaluation system. Our current evaluation system does not include these steps.
1 Being able to extend teacher/administrator knowledge of what is coming and how we can better plan and prepare to make sure all our staff members reach the accomplished level.
1 This new system is assisting with the accountability piece regarding student data. Our district offers many tools to teachers to help analyze their student population and test scores. This tool forces teachers to analyze the data resources provided in order to write SMART goals for their lessons and to help inform their decisions in the classroom. OTES also forces teachers to reflect on best practices.
1 There has been more communication between teacher and administrator as to the purpose of the planned observation and how the teacher could improve their instruction. The form has also narrowed down specific details as to what the administrator should be looking for during a pre-conference, post-conference and during the observation.
1 We have not fully implemented the new teacher evaluation methods yet in our district. I like the feedback the teachers will receive from the new model.
1 So far, the SMART goals and reflective piece of the evaluation tool have been the most beneficial.
1 I am anticipating seeing an increase in the conversations around student growth; we have been working as a district to increase the collaboration among the grade-level teachers (elementary).
1 I believe the self-assessment piece is great: it makes you really stop and think about what you are doing or what you need to do.
1 We are presenting a plan for next school year for the union to "pilot" the OTES observation tool and goal setting tools.
1 After having gone through the process of pre-observation activities, observation, and post-observation activities, I know that this system will be a very useful tool for all of us who want to "sharpen the saw." The self-assessment helped me identify a couple of areas on which I should focus. Creating the SMART goals and compiling the things for the communication and professional development requirements have helped me "hone in" on some things.
1 Becoming familiar with the process and the paperwork (forms) involved with the state model has been helpful.
1 We are seeing the benefits of some of the suggested forms/parts of OTES, such as the Teacher Observation Rubric and the goal-setting pieces.
1 Personally, I have learned a lot about the intent of this effort. I have recently obtained my administrative license and I see the value of having an improved evaluation process for teachers in our district. The previous methodology was not very effective and was in need of improvement.
1 It has opened some doors to more reflective conversations between myself and the teachers. I feel like I have become a better evaluator throughout the year and can provide my teachers with more support.
1 It is encouraging to witness teachers reflecting upon their practices. This reflection has generated growth and examination of best practices.
1 It is very specific with the things that it is looking for so that you are able to be consistent with ratings.
1 The process is not complete. I do not know of any successes or benefits resulting from the evaluation implementation at this time.
1 I don't know that there were any improvements; we had an excellent system in place based on Danielson.
1 I love the process of self-evaluation, goal setting, evaluation, reflection. Then to have the discussions with the staff builds a sense of doing this together rather than doing it to you. And that really allows for planning in a much deeper way for professional needs and growth.
1 I feel that it opens up the lines of communication between teachers and administrators. It also helps me focus on how I want to grow professionally.

The successes are that we have been able to use our data to guide our goal setting, self-assessment, and teaching.
1 The system is being used with two teachers as a pilot as required for RtT. It has been reported several times that there is too much on the plate and a significant amount of time needs to be allocated to think through and develop an evaluation system that has high stakes associated with it. I do not believe it is a volunteer pilot when you are required to participate. I do not believe anyone has thought through the long-term effects of OTES and the implications it has for principals' time allocation in their schools and how it will affect the principal and teacher relationship. I believe local control has been removed from districts' developing and creating the evaluation process. May I suggest that you read Drive by Pink.
1 I was able to focus on some key areas of development that I was previously weak in. I began to utilize formative assessments and goal setting more frequently.
1 We have appreciated the depth of professional collaboration and dialogue. The new methods empower teachers to take more responsibility for their own professional development and student performance.
1 Aligns very closely with our current evaluation model; a benefit has been the goal setting and self-evaluation that has been done by the pilot teachers.
1 Reflection of standards-based teaching methods and excellent practices for educating all students.
1 Greater examination of student data on an individual classroom basis. We are good at looking at data as grade levels but not necessarily as individual classrooms.
1 Lots of questions still exist regarding the student growth piece. The evaluation tool itself has some issues regarding wording in the rubric, and the number of pages makes it much more difficult to manage than our current tool.
1 We are happy that the district is already doing many of the procedures described in the OTES model. Discussing the new model has helped us see where we have room to refine our procedures. We are happy that we have knowledge about areas that we will have to revise or change in the future.
1 Not many successes; trouble managing time to complete the process. A pleasure working with our union leader and documenting the process each step of the way.
1 We are still in the pilot phase, so I am unsure of the implementation process and how it will impact me as a classroom teacher.
1 The Evaluation System has formalized our teacher evaluations, which is a major improvement. We will now have better evidence when approaching teachers with areas of reinforcement as well as areas of improvement.
1 I like the goal setting and conferences; it better gets me in touch with what the teacher/students need to be successful.
1 The only success is that all teachers that are evaluated are evaluated the same, since I am the only one.
1 The two administrators that have attended the trainings are doing well understanding and learning to use the OTES model. The teachers that attended the trainings are struggling with it a bit because of the data requirements. However, the more work they have with it seems to build better understanding.
1 It has caused us to look at our evaluation system and opened discussion as to its effectiveness. It has allowed us to see the educational trends and prepared us to implement a new statewide evaluation system before it is required by law.
1 I appreciate the focus that writing SMART goals has given my instructional decisions. It has given me a specific area to work at improving, and I enjoy watching the progress being made. I look forward to tweaking those goals each year as I gain new knowledge and experience.
1 It is very early. I will say that there is more focus on understanding the lessons with regard to the conferences, etc.
1 We attended days 1 and 2 of the training. I was very frustrated with the training and did not feel it was something we wanted to continue to do. I am in favor of changing the evaluation system and think it could be a great thing, but to pull me to a meeting where it only really took two hours, and make me miss the entire day of instruction at school on multiple occasions, is actually unfair in my opinion. I drove to Dayton for one meeting. I arrived at 8:30 for a 9:00 start time. We were informed that we would start a few minutes late due to traffic. Not acceptable in my opinion. Then we were given one hour and fifteen minutes for lunch. Then the training included me watching a 62-minute video of a teacher teaching her class. Nothing like what I am dealing with in my district. It was insulting to my time. I then approached my superintendent and asked if I could be excused from this training in the future and I would do whatever she decided to prepare for the new evaluation system. She agreed and I did not go back for the day 3 training.
1 I have learned about the new evaluation system as a result of participating in the pilot program, which will allow me a chance to help explain the new evaluation to others.
1 Some teachers have expressed the benefit of the self-reflection process in setting goals for the upcoming year.
1 We are more informed than other districts who are not participating but still don't have a clear concept as to how the tool will work for all teachers across all disciplines and all grade levels, which is frustrating.
1 We are re-visiting short-cycle assessments used about 5 years ago that do a better job of demonstrating student growth than the 9-week assessments we adopted 5 years ago.
1 Providing evidence and having a focus for personal professional growth have been benefits to my staff and myself.
1 I feel the teachers' goal setting is very important at reflecting on student growth and current practices.

- We have been able to directly forecast what's next. We have been able to be open with our staff, preview our findings, and put all of them into an awareness mode about the new evaluation model. We have been able to receive direct feedback from staff about their likes/dislikes concerning the system. We have benefited from the self-assessment piece - people like it.
- The reflective pieces are very valuable for identifying areas in need of growth, as well as strengths. It was quite lengthy and took a great deal of time, however.
- It involves teachers more in the evaluation process by making them think about what their goals are as professionals and reflecting upon their work.
- I think that it really made us consider and get to know the teaching standards. It also pushed us to look hard at our data, but we do that already. The collaboration logs are a good habit for documenting what you do and who you work with.
- This system causes the teacher to critically reflect on what they are teaching. Even though I have been teaching for a while now, it was good to sit down and really think about why I chose to teach the lesson the way that I did.
- The teacher reflective piece helps the teacher improve overall instructional performance. The goal setting keeps the teachers on track for improvement and sets a bar of high expectations.
- I like focusing on the standards and choosing my goals based on the standard that needs my focus. Even though I had trouble deciding on my SMART goals, I can see the benefit of them. I've had good conversations with my evaluating principal about the whole process.
- I have enjoyed the idea of our school district being at the forefront of change, for the first time in a long time. We have a superintendent who makes sure we are getting as much support as possible to fully implement the evaluation system. With my experiences with the OTES Pilot, I am able to better educate, inform, and relax my staff about implementing this new system.
- A success would be that the teacher is asked to reflect upon something that he/she could improve upon.
- We are trying to get the information out to the entire building staff mainly. Just grasping the whole tool.
- It forces me to reflect on my teaching. I believe my self-assessment and the observation/evaluation results matched very well.
- It is very reflective. It allows professional conversations related specifically to instruction, assessment, and differentiation.
- We like the emphasis the model puts on the planning of the lesson. The model also provides stretch for our high-performing teachers.
- We are becoming familiar with the educator standards. We are self-reflecting on our teaching methods.
- We have all become more familiar with the teaching standards and our current evaluation tool. We use Danielson and have crosswalked the standards with the Domains and analyzed the rubrics.
- The process is great for making teachers reflect upon their strengths and weaknesses. The SMART goals are a great way to focus on areas that can be improved. The rubrics are also a great way for teachers to see what the expectations are for a great lesson.
- The teachers seem to think this is a valuable tool that gets them thinking about teaching, learning, and the instructional process. The self-assessment tool, the reflection piece, and the communication that occurs between the principal and teacher are the most beneficial parts of OTES.
- The self-assessment and goal setting components are very effective for all teachers. I used this for the whole staff when they were setting their personal goals at the beginning of the year. Several teachers expressed that this was very helpful in determining their instructional improvement goals. The goals were more rigorous. I am about to administer a mid-year check and will see if the teachers have been more successful in keeping on track for the attainment of their goals.
- Greater discussion among administration and staff in relation to evaluation practices. More administrative time in the classroom.
- The self-assessment and goal setting component was very beneficial and helped improve my instruction. The goal setting helped me maintain my focus throughout the year, and the midyear re-evaluation continued the reflection process.
- Teachers and administrators are working together on this tool. Both groups seem happy with the process and product so far.
- The greatest benefit has been to get our evaluation form in line with the teaching standards. The previous one was not as directly connected. We also did not have self-evaluation and goal setting as elements of the evaluation. Those now will be.
- We believe in working with teachers in a process of reflective dialogue about student growth and instructional support. This process lends itself to those types of conversations and growth opportunities.
- I think the self-assessment tool and goal setting page are extremely beneficial. I actually like the whole process, documentation included. I worry about the time it is going to take to complete evaluations for every person, every year. I
am also concerned about the use of data for merit pay reasons.
- I am more aware of what my students need to know and how to deliver the information for all learners.
- The greatest benefit we have experienced has been a focus on individual professional growth. The self-assessment tools led to professional goal setting which has, in turn, directed our approach to professional development. In addition, we now incorporate peer observation. These informal observations are opportunities for teachers to get feedback from peers of their choice on instruction.
- It has helped me to be a better evaluator and enhanced my abilities to communicate more effectively in the post-observation process.
- More aware of the level of differentiation built into each lesson. Developing goals that are based on student data.
- We are now a leader and have a better understanding of the OTES model... lots of work still to go and no understanding of how the value-added piece is going to work.
- The goal setting process was productive and thought-provoking for the teachers involved. I too appreciated the opportunity to discuss the goals with the two (2) teachers involved.
- Observed veteran teachers who weren't used to being formally observed - it was a good experience for them and me.
- Time to talk one on one about the teachers' goals and how they plan to improve during the school year.
- I feel I am better informed as a principal, and I am learning and gaining insight daily on the process with my teachers. The conferencing is very helpful in providing feedback and collaboratively working together to reflect, go back to the classroom, and change or extend instructional practices. I also feel the key we are looking at is assessments and monitoring the growth of the students. We are just beginning this year, so I will have more information at year end when we prepare for next year. However, I think it was necessary for the teachers to use the OTES format to better understand where we are going in education in the area of evaluation and accountability, and also to gain feedback that could inform the process for revision of the OTES forms.
- GREAT conversation with union leaders and a good benefit analysis of the current system versus what it "could" or "should" be.
- We have been using a model that is much like the new model, so we consider it to be effective and a slight upgrade.
- It has provided better communication between the teacher and administrator. As the evaluator, it was nice to have the discussion with the teacher about the before and after activities with the lesson I observed. It gave me a better understanding of the thought process of the teacher.
- I feel for the teachers that are found accomplished, this is too much work. This system would be good for teachers in question. Two observations per year is a little much.
- This has given another avenue for staff to collaborate and reflect on their current teaching, as well as what in fact good practice is. It has done a good job of forcing teachers to do more data-driven decision making during the process, instead of simply instructing and then testing the students.
- Pre- and post-observation conferences have been powerful in reflecting on teacher practices as well as ensuring improvement in student growth. Goal setting allowed us to assist teachers in setting SMART Goals to prioritize their focus based on data. The OTES aligns to our local Instructional Rounds, which are aligned to the Standards for Ohio Teaching Professionals.
- I have a better understanding of all the components and how they link together than I would have had if I had just read through the material on paper.
- We are using video tapes of our pilot teachers to help our administration team learn to use the rubric for the new teacher evaluation. As we view the tapes as a team, we are gathering ways and means to do better classroom evaluations.
- Since we have not completed the process, I don't know yet. One positive is the forced collaboration between teachers and administrators.
- The new evaluation methods are not that much different from our current system, so there are the usual benefits of having good conversation about teaching based on a common rubric used during an observation.
- It has been good getting together as a team to see what we have and what we can do to make it better. Time is a concern.
- The level of reflection on the part of the teacher is a positive; the time requirements for doing this every year with all teachers are consuming - it just doesn't seem practical at all. Who is to take care of the day-to-day administrative business when observations need to be made is a concern.
- Learning about the new State Model has reinforced the current evaluation model being used in my building. The OTES rubrics have been very useful.
- One of our teachers has been formally evaluated using the tool. As a school, we have been documenting our personal evidence on the evaluation tool.
- We realized that our current model, with the exception of gathering student data, was very much on the mark.
- The biggest asset is the concept of teacher evaluations being for growth! It has improved my pre- and post-observation
questioning skills.
- No benefits have been experienced, as the new teacher evaluation methods were not implemented due to a lack of commitment and involvement by the building principal.
- It has pointed out where future inservice must go so teachers will be able to successfully jump through the new hoops.
- I cannot speak to the benefits, as we have not completed implementation and discussion of the process.
- The evaluation method is thorough; however, the document was voluminous. I believe the process was well-received and successful. I question using the method with all staff members due to daily time constraints.
- We haven't had much success. The timing of trying to fit in our pre-conferencing, observation, and then having the post-conference in a timely manner has not been successful, due to school circumstances. Also, having teachers do the evaluation as well as the project is too demanding in one year. It should be one or the other!
- We are more aware of the new requirements from the state and in a better position to implement the changes needed to meet state demands.

What obstacles or unintended consequences have you or your school/district team encountered as a result of implementing the new teacher evaluation methods pilot thus far? Responses (a count is shown where the same response was given more than once):
- A lot of time is involved with the new evaluation process.
- Acquiring enough time to complete the OTES process.
- Adm. vs. teachers
- Amount of time the process takes
- An unintended consequence is that many teachers are very nervous about the new observations.
- As with many, it is a time-consuming process, though a good one.
- Because we are just beginning the process, I have not yet encountered many obstacles thus far.
- Better union rapport
- Communicating the expectations to our staff and making changes to what's always been done.
- Difficulty finding meaningful/useful student growth data for all types of teachers.
- Expectations - what is expected at pre/post conferences
- First day training for OTES was not clear.
- Hard to find time to complete all of the work.
- How to rank teachers and determine categories.
- I am unaware of any.
- Increased time commitment to an already busy schedule.
- It has been time consuming to have all of the conferences.
- It takes much more time on top of the other evaluations that we have to do.
- It takes up a lot of time. I think the evaluator can be biased either for the good or the bad.
- It was very time consuming and it started too late in the year to be fully successful.
- It will take a significant amount of time to get it done.
- Lack of quality training from ODE
- Lack of time.
- Lack of time. Feedback
- Length and amount of documentation is very time consuming. Drop-down response boxes would help.
- Length of process, poor evidence of student growth measures
- Measuring student achievement in classes that are not tested areas.
- More work for everyone
- N/A (×3)
- Needing more clarification on the rubrics for pre- and post-observations

- Negotiating with the union
- No real obstacles so far. Not enough volunteers have completed.
- None (×2)
- None at this time
- None at this time. It helps to be a non-union school.
- None so far. (×2)
- None this far.
- None. (×2)
- Not applicable.
- Not enough time to do the forms that ODE wants in addition to pre- and post-conferences.
- Not enough time to evaluate all using all components.
- Nothing
- Performance is going to be challenging in some areas.
- Principals are very apprehensive about the time commitment.
- Self-reflection will have to be made a part of the evaluation process so the tool is used properly.
- Some areas will be difficult to use the tool in.
- Staying caught up, as things happened very quickly.
- Still trying to utilize both OTES and our district-adopted teacher evaluation system.
- TIME (×4)
- TIME TIME TIME - rural school district with two administrators
- TIME TIME TIME - this evaluation takes hours to complete!
- TIME!
- The amount of time it takes to complete 1 person is enormous, let alone the entire staff
- The amount of time spent by myself and the principal on this process has been a crime.
- The cost may be prohibitive
- The obstacles were in the implementation of this pilot with only seven schools.
- The paperwork is extensive and very time consuming. It is difficult to fit into the day.
- The required length of time to complete this is unrealistic.
- The self-assessment tools were quite lengthy and took a great deal of time.
- The time it takes to complete the process.
- The time spent, while important and necessary, is almost overwhelming.
- Time (×4)
- Time; contractual language
- Time - lengthy process.
- Time - too time consuming. It also does not address non-academic areas.
- Time and forms are very confusing!
- Time consuming and very cumbersome.
- Time consuming. I don't like documenting my communication and collaboration on the template.
- Time is a definite issue.
- Time is always an obstacle. More training for the teachers involved is necessary.
- Time is an obstacle. Making the process understandable for all teachers.
- Time to complete the observation instrument.
- Time to fit in all the pieces.
- Time to meet and discuss
- Time!
- Time, Time, and Time
- Time, training.
- Time.
- Too many forms, and will be a challenge on time when dealing with negotiated timelines.

- Too much time and NEVER a high school classroom video.
- Too, too time consuming. I am the only principal in a building with 41 teachers.
- Trying to determine how to measure student growth for non-tested areas.
- Union
- Union and some teachers are against the project. Very time consuming process.
- Unsure of the multiple forms.
- Very unclear expectations for districts and staff.
- Way too time consuming and overwhelming if you are the only administrator in the building.
- We have not experienced any obstacles or unintended consequences thus far.
- With the process not being complete, people are apprehensive.
- Concerns that the evaluators will have the time to complete the volume of evaluations
- Difficulty of setting goals that are measured.
- Getting together to go over the conferences
- N/A (×2)
- Negotiations
- None (×2)
- None except time
- The inequity of judging teachers by student data in two content areas but not in others
- Time
- Time. The technology piece of the evaluation could be improved.
- Time needed to perform this evaluation system for each teacher
- Time!
- We are diving into iObservation as well and it is confusing.
- I have not been given any written feedback or indication of how my performance is measuring up to the standards. The rubrics are very cumbersome.
- It is extremely overwhelming. The State has done a nice job of trying to break it into pieces so we can have time to learn the system. I feel that a weak teacher can put on a great lesson for the evaluation when it is announced ahead of time. As the lone administrator of a K-8 building with 6 special education units, I can intend to do an evaluation and things happen that take priority. It isn't fair to do all of the pre-planning for the teacher only to have the principal be a no-show. I think we need to be able to show teachers examples of the rubrics - of what makes an extraordinary teacher and a not-so-good one. I think we will be okay with the process, except for the amount of time it is going to take. As a district, we have a very thorough and intensive hiring process and feel we get good teachers. I think more needs to be done at the college level to weed out potential teachers who just aren't going to make the grade. It is difficult for me to go to a training to hear things from the presenters like "we don't know HOW that is going to be done yet" or "we're still figuring that out." They are just trying to meet timelines, too, but I'd rather this took a little more time and was done well from the start.
- Large amounts of paperwork, and looking ahead to after the pilot, providing 2 evaluations for each staff member would require 190 evaluations, 190 pre-conferences, and 190 post-conferences, along with 190 mid-year evaluations. In the 184 student school days, that assumes I'd complete 4 meetings each day (one from each of the above-mentioned designations), and I would run out of school days on that schedule. However, it is unrealistic to hold the mid-year conferences the first days of school, and equally unrealistic to begin the classroom observation process the first school day with a post-conference as well. While I appreciate the process, I have concerns with the demands of scheduling as stated above.
- The need to front-load - revisit the information on the teaching standards - and the amount of time it is taking to get through the process of the evaluation.
- The process takes a large amount of time. The challenge for the committee will be to make the process effective without taking too much time for the paperwork. We also need to develop a plan for introducing the new method to the entire staff in a way that doesn't make them panic.
- Time management. The whole process has been broken up into too many pieces and this process has not been smooth - too much time in between training sessions. There has not been enough time to collaborate with the pilot team in my district. I am still very unclear on the whole process and all the paperwork that goes with it.
- I do not believe that evaluators were properly informed as to what exactly they were supposed to do. My evaluator could not answer any questions I had about filling out the paperwork.
- I have problems with the schizophrenic approach that this has in part; teachers do not need to have a PowerPoint going and have students working in groups to be good teachers. Some of our instructors were not as savvy as I believe they should be in educational standards: at one point they had stated that a teacher was not differentiating when she
was using a strategy called "jigsaw". They did not see that the instruction has to be driven by the content to an extent. Technology also need not be a big feature in every lesson.
- Administrators have spent a lot of time out of district for training. Teachers were confused about our roles and responsibilities due to lack of training. We have spent a good deal of time on paperwork that has had little or no impact on student learning or teacher performance.
- The OTES takes an incredible amount of time. Teacher evaluations already take a lot of my time, but I will be doubling or tripling the amount of time that I normally spend on them. I am not sure how I will be able to meet the many other needs of my building when the OTES monopolizes so much of my day.
- It has become very time consuming for the principal. I can't imagine her having to do a whole staff's worth of these each year.
- Time has been an issue. We feel next year we will need to block times for conferencing so we can complete them without interruptions. The conferencing, which has been very informative, is lengthy, possibly due to this being our first year of implementation. Also, we have limited time due to cuts in staffing to cover or sub in so we can talk. The majority of my conferencing has been after school or on the weekend. Therefore, we are looking at alternative ways to incorporate the conferencing into our daily time-frame. However, after school is a good time for one-on-one conferencing, so maybe we can find a way to work that into our schedule.
- The most difficult obstacle is time, and the overwhelming feeling of getting it all completed. There are also so many documents. The amount of forms and paperwork adds to the overwhelming feeling of this process. In a perfect world that would be the focus of a principal's day; however, there are urgent issues with parents and students and teachers that need immediate attention. The evaluation/professional growth momentum is lost at those points.
- Not knowing where we are headed related to our model. We are currently, as a district, in discussions trying to decide if we will be using SGM or not. A lot has to be decided by how the state implements HB 1.
- We have two buildings participating. One is doing the project-based model, and we have not had training that addresses this model. We have concerns that we still do not know much about how this will look. We are hopeful that day 4 training will address our questions.
- The amount of time to complete the goal setting, self-assessment, pre-conference, post-conference, and classroom observation has been a lot to complete for one administrator. At my building, there is only one administrator.
- Varied methods of student performance measures not outlined or undetermined for various content areas.
- We are working with Dr. Rowley from the University of Dayton on writing our new policy and he pushes the Danielson model, so we have to wait for him to create our new forms.
- Time seems to be an obstacle. The conferences between the teacher and administrator are lengthy due to the content we need to cover. The information is necessary and makes for good feedback, but the evaluator is pressed for time and I do not know how he/she could do this with everyone in his/her building.
- Time required to complete the 14 pages for each evaluation. With 1 principal and 26 teachers, this is a seemingly insurmountable task.
- Too time consuming for the principal and teacher - paperwork takes hours and hours of time that I could dedicate to my lesson plans.
- It has been frustrating trying to align our district with a model that leaves many important items unclear. For example, we have difficulty trying to base 50% of a teacher's evaluation on student growth when there is not yet a system for determining what criteria/tests will be used, and even when this is determined, the validity is questionable because of the value-added vs. achievement issue. Personally, I feel that 50% is really too high of a percent given the inconsistency of data used to determine if a teacher was successful or not. We realized also, using the rubric, that a teacher's status is down-graded if her students meet expected growth...
- It is extremely time consuming and some areas are not clearly defined. I have observed my teachers and now am focused on the goals. If goals are met mid-year, do I wait until the end-of-year assessments to monitor progress? Can I go ahead and complete the evaluation? I have been working closely with these 3 teachers but have not given the same amount of time to my other staff with their evaluations. I am not sure I will be able to complete 30 evaluations like this next year.
- TIME, teacher buy-in to the three assessment measures, using past year's data to inform current students
- Finding the time. Managing our workload. The "newness" of the process and trying to figure it out as we go. It has been a learning experience.
- Very time consuming. We were told not to have the post-conference until after Day 3 training. This put weeks in between the observation and post-observation meeting, causing it to not be effective.
- The use of data in determining teacher effectiveness is an issue at this time, as far as determining student growth in all subject areas at the high school level.
- TIME. This system is very time consuming. I only used this for three of the teachers in my building and it was very time consuming. I fear the day when we are evaluating all of the teachers in the building and the time constraints that are involved.

- Teachers are overwhelmed with the amount of time it takes. Teachers are putting on a "dog and pony" show for their observation, which curves the evaluation, then going back to their "old" ways. The rubric does not accurately measure how elementary teachers teach or are involved with their students. They don't teach ONE subject at a time.
- The tool in its current state is too cumbersome and too time consuming. The paperwork part of this process is too big. It keeps me from being in the classrooms as often as I would like. Also, it would be nice if you could create a whole document with everything in it that can be used on a laptop or iPad to easily record evidence during observations. This process has been difficult because we had to roll it out prior to ODE having completed the entire document and receiving training. Ethically, I don't think we should ask people to be evaluated with a tool that is not complete at implementation time.
- The rubric not being an even flow across the levels created questions, and we are not sure how non-"tested" areas are to gather data to support the merit pay portion.
- Finding time to devote to completing the paperwork; how the student achievement piece will apply to guidance counselors, librarians, etc.; who will determine an appeals process.
- The time involved with the new process and the completion of all the different components is overwhelming when trying to think about this program with 25 staff members while also completing the daily tasks of running a building and providing educational opportunities for teachers.
- The system is not realistic. We have too many teachers to go through this process. This is a "perfect world" system on paper but is too time consuming to be used the way it is intended to be used.
- The obstacles are time. The procedure is very involved and requires a great deal of evidence citing.
- The "Student Growth Measures" piece and the mystery about how this will be determined have made people very wary of the process.
- Time. Please do not view this as the typical response borne out of fear of change. The obstacle to effectively assess the quantity of teachers under the current legislation is daunting.
- Implementation time and follow-through. How do you have quality evaluations and feedback with pre- and post-conferences, goal setting, etc., with 25+ teachers twice a year? This is a major district concern. As a TIF school 100% of teachers are to be evaluated in I figured that at a minimum 10 meetings per teacher with two minute evaluations.
- The 5-minute informal walkthroughs do not provide accurate data. It's not possible to hit all 7 areas of critique in 5 minutes. Many times I have just covered some of the criteria, and then the administrator walks in afterwards and writes down that what I covered was not observed.
- The OTES pilot involves A LOT of paperwork (for teachers and administrators alike). I'm not sure how a large district like ours will be able to complete two OTES observations a year for every teacher. They are rather intensive. It reminds me of the Praxis with all the paperwork and documentation.
- The state has been vague about how the student measures will work, and this is 50% of the evaluation. I see unfairness in using value-added data for some teachers and not for others. How can a teacher-created pre- and post-measure in a fine arts classroom equal value-added data on a state standardized test? In Hamilton, we track students. Will teacher assignments with particular levels affect their value-added data?
- Deciding what will be used to determine student growth and getting detailed answers from the state have both been issues.
- Time constraints while evaluating the other 31 teachers, while learning the OTES model for the three teaching staff in the pilot.
- I won't have time to do anything else during the evaluation time frame, since I am the only evaluator until more are trained.
- It is a lot of additional paperwork that I have to complete. Although I already keep records in each of the areas, it has taken a little extra time and effort to track everything in the "program" format in order to have it available for evaluation. I do feel like I have probably neglected to do something - it's easy to forget some of the details that are supposed to be recorded.
- Our setting is very unique with the amount of "self-paced" curriculum that we encounter. Much of what the majority of our students do is independent study. We must evaluate how we will link those classes to classroom teachers in a fair manner.
- The amount of time needed to complete it is very long and concerning when I think about having to evaluate all teachers, every year, with the system in the future.
- Time, and some of the needed materials were not developed in a timely manner. Just learning the new processes has been challenging.
- Confusion on the process - every person at the last training had used a different form. The trainers need to provide us an organized folder with each needed piece of the puzzle.
- One obstacle of the new OTES lies solely in the amount of time involved in each evaluation. The teacher as well as the principal spends a great amount of time preparing, discussing, planning, and writing for the OTES. Although the
86 Appendix A - Survey evaluationis informative, the amount of time required is not feasible given our current resources. 1 Nothing yet. The union is working well with us, but we have only implemented a small portion. That may change going in to negotiations and next year. 1 The staff as a whole is not really aware of the Standards for Educators. Professional Development will be needed in this area. 1 It is too closely tied to RttT, which many staff members feel will be gone in two years. HOw do you attach Value Added scores to people who are Unified Arts teachers? 1 Time frame and roll out format. It has seemed rushed. Should have started training prior to this school year. 1 It seems to be developing as we implement the pilot; instead of it being already prepared and ready to test. The documents, rubrics, and general paper work that goes along with the program are very cumbersome, unorganized and not user friendly. The time spent preparing for the Pre conference/post Conference is difficult to set aside during school hours and takes away from being visiable to staff and students. Evaluators will take short cuts in order to meet requirements and deadlines. The time line for evaluations (especially second semester) is not doable or reasonable. Technically you cannot start 2nd semester evaluations until after February 10th and must have all completed by April 15th so that post conferences occur no later than April 30th. If a teacher is going to be released from a contract, we have language that requires the evaluation process to be completed for that individual by April 1st. I am not sure the evaluation system hits on elements that everyone agrees make for a good teacher and effective teaching. For example, student engagement is not really addressed where as student learning styles and differentiation of instruction is hit extremely hard. I also find that coaching a teacher may be destroyed by the holestic evaluation rating of four categories. 
I love the ineffective and developing category but the division between the Proficient and Accomplished is difficult.
1 Time constraints, and how to determine which category a teacher falls within if the lesson does not allow for the evaluation tool to be met in the one formal lesson. Also, if a teacher scores an equal amount of scores in each category, which designation do they fall within, or what process is used to "break a tie."
1 It is impossible for our principals to meet the timelines on the amount & time of evaluations/observations. We are a small District w/ one elementary & one high school principal. They have many other responsibilities in addition to evaluation.
1 It is hard to get valuable student growth data unless the teacher is in a value-added grade. We currently use DRAs and formative assessments but I am not sure how reliable these are. I am really struggling with what to do with my art teacher.
1 The lack of forms for this process has been a tremendous burden. I need more written information. There need to be written forms to fill out to document the pre-conference, the observation, the post-conference, the mid-year evaluation as well as the final evaluation.
1 Teachers who feel that they are being 'picked on' and parents feeling we push students too far too fast.
1 This is very time-consuming for teacher and administrator. The turnaround between observation and post-observation was 30 days. It is hard to remember that particular lesson, on that day, for that group of students when you have ten periods a day! Our principal is trying very hard to implement this new evaluation with little training. A caring principal could get bogged down very quickly with all that is being asked of him. An overwhelmed administrator might be inclined to inflate evaluations to lighten his work load. Also, it is hard to jump on board when we have no idea what the other half of the evaluation is going to be based on. The student achievement component has not been addressed.
Teachers are very concerned about half their evaluations being based on an unknown. Many teachers in my school do not teach a tested area.
1 Both teachers and administrators agree that the process takes so much time that a new division of labor system must be developed.
1 We have questions concerning employees that are under teacher contracts, but do not have classrooms... guidance, VOSE, etc.
1 It has hindered communication and cooperation. The only people involved are the principals and the teachers getting evaluated.
1 The amount of time required thus far has been a concern. Ten hours per teacher has been the average of the pilot evaluations. This does not seem practical for a whole school implementation. Those involved in the pilot thought they would be given input in the process. However, they have been granted minimal input thus far. The union leader was invited to the initial meeting, but has not been included or invited to any meetings since. This has created a great void in our understanding of the process and the information that we can provide about the evaluation system. We do not have growth measures in place for the majority of the teachers to generate the growth measure equal to fifty percent of evaluation. How fair is it to the teachers with value added reports that calculate into the evaluation formula when the other teachers without value added reports get a pass on that portion of their evaluation? This is a big problem waiting to happen. We have concerns about the resources available for principals and teachers to access as part of professional growth per their personal plans.
1 There is no humanly possible way that we will have time to implement this system for all teachers. Not enough time in the day or days in the year.

Because of the time I've spent in meetings and observations and all they entail, I am having difficulty getting my regular observations and evaluations done. Two of the teachers that I have chosen to evaluate with the OTES system have their principal's licenses. They are no longer pursuing a career as an administrator. I've had to reschedule observations several times because of student behavior issues that are a direct result of their disabilities, but we have 32 K-5 children on IEPs with just two sp ed teachers in cross categorical units and 2.5 aides while we try to implement an inclusion model. This OTES system cannot be implemented correctly... people will cheat and the result will be a system that has no credibility. I will do what I need to do but like I said... I only have to do it for one more year. The Lord is watching out for me.
1 Time continues to be my number one concern. It is not realistic to spend the amount of time required by the OTES model on every teacher in my building for evaluation each year. I currently have 33 certified staff to evaluate without an assistant. To follow the process as designed is not a realistic expectation.
1 Great idea. There will not be enough time to implement this model. Most schools are on very limited budgets with administrators wearing multiple hats and it will simply be impossible to effectively implement this model.
1 Lots of questions that we are unable to answer regarding the student growth piece, causing anxiety among our staff. We have spent a lot of time trying to figure out how we will complete this evaluation process with all teachers, 75 elementary school teachers and only 2 principals.
1 Staff is hung up on the compensation issue regarding the evaluation process. Many are concerned about that 50% aspect of the evaluation and how it will be implemented. This is especially true for those Fine Arts instructors.
1 Anything new is scary.
Teachers who have not been through Praxis training might find it to be more difficult than those who have been through Praxis.
1 Nothing that I can recall. It takes more time for both the teachers and the administrators. Is it worth the added time? I hope so.
1 The amount of time to pre-conference, observe, post-conference, conduct walkthroughs, write up a summative evaluation, analyze student achievement data, set goals, review goals, etc. is a daunting task when there are many teachers to observe and evaluate.
1 The time it takes to conduct each pre-conference, 60 minute observation, and post-conference is sometimes difficult. I do worry, once we have to do this for every teacher, that I will have to sacrifice spending time with my students. Although I do feel this process is important.
1 I'm concerned about the time needed to complete the evaluations the right way. With 50 staff members, I wonder how I'll effectively evaluate all of my staff members in a meaningful way each year. I want to do it right, but I'm not sure that there is enough time in the day to do it.
1 The scope of the process would be very difficult to complete if done appropriately. It's frequently very difficult to solidify meeting times for goal setting, pre-observation, post-observation, etc. With contract limitations it would call for us pulling teachers from classrooms to complete the process appropriately as it stands now.
1 There is no way our principal can take the time it involves to implement this evaluation twice a year for each teacher. It should be condensed.
1 Without a doubt... time. The evaluation system is good; we like it. Adding very thorough evaluations to the life of an administrator and a teacher has been a challenge. In addition, I am not an expert in this system (yet), so I have to work with staff in discovering key elements. They may have questions that I cannot answer. There is an instant loss of credibility when that happens.
1 The amount of time needed by the evaluator to complete the pre-conference, the classroom observation and the post-conference is such a large block that it is going to be difficult for smaller school systems.
1 Our placemat is a modified Danielson model. We have adapted the model to make it appropriate for Cincinnati. The structure of the OTES observation form is cumbersome.
1 I think the most profound concern with the entire model is the 50% student growth measures. ODE has not released the list of approved measures yet is trying to force schools to adopt this model. If these measures were clearly spelled out for people, I think schools will more willingly adopt this framework.
1 Personally I have been very frustrated with the ODE trainings. My questions haven't been answered and the training has not been effective.
1 I see time as the major obstacle to overcome. I see principals needing to set aside 5-6 hours to complete the pre-observation conference, formal observation, time to go back and reflect on your script to establish questions to ask during the post-conference, conducting the post-conference itself, writing up the final report, and then sitting down with them to go over this observation report with them. Then multiply that by two times a year, then by how many teachers you have in all, and that is much of your school year. Not to mention that this does not even include the goal setting, communication, and professionalism pieces, as well as the walkthrough piece. Plus then you still need to create a final designation. Don't get me wrong, I think there are many positives about this OTES. However, are we going to have time to do everything else well at the same time? Every district has made major cuts to teachers, but to administrations as well. I just don't see how the current makeup of administrations is going to be able to do a great job on this, while also keeping all of the other plates spinning at the same time as well.
I also think there needs to be greater emphasis on the goal setting portion. I feel this was simply glazed over on Day 1 Training because no one really knew what lay ahead on the rest of the days' trainings. I feel a lot of confusion with the goal setting.
1 Principals are very concerned about the time commitment of the new teacher evaluation system. We also recognize the importance of deepening evaluators' knowledge of effective classroom practices and explicit communication skills.
1 The process takes longer, but I plan to get more efficient. With our current observation tool, I am able to provide more teacher feedback on specific effective instructional strategies. The OTES observation tool has less emphasis on instructional strategies during the lesson.
1 The main obstacle was the teachers did not feel that their final rating was accurate. Mainly the concern of the teachers who were evaluated was that there was not a clear understanding of how their final rating was reached.
1 It takes too much time away from my teaching. It is unrealistic to think that every lesson that is observed contains evidence of every standard.
1 Of course just doing it for the first time brings its own set of challenges. We currently use the Danielson model and I am used to a more general format. I struggle a little to include some of the things that I normally would.
1 Not having a direction from the state on the 50% that will be based on Student Achievement is a huge roadblock.
1 It is very time consuming. The standard needed to achieve the "accomplished" level will be an issue as it will be inconsistently enforced by administrators throughout the state. There are many teachers who have always been the "straight A kid"; they will have a difficult time accepting the fact they are not "accomplished" in all categories. If the rating is used to determine RIF status, the pressure is on the administrator regarding how the teachers compare.
1 The amount of time to complete the process is tremendous.
I believe there need to be changes to speed up the process if it is to be successful.
1 This is an extremely time-consuming process. It baffles me how a building administrator will ever find sufficient time to include this without an overhaul of our responsibilities, which we all know is not going to happen. In essence, the expectation to evaluate every teacher each year is drastically unrealistic.
1 The forms can be time consuming to complete, but they are a good reflective piece. Time constraints for administration to complete effective observations both formal and informal may become a logistical concern in the future when more staff members fall under the umbrella of observations.
1 TIME!!!! With all of the other responsibilities which include discipline, curriculum, etc... it is hard to find the time to complete the tasks. Also, coming up with the ultimate final rating.
1 Not having a complete copy of the evaluation and receiving it in parts has made it difficult to see how it all fits together and to explain to other team members how to implement the plan.
1 All RttT processes have made our OIP confusing to teachers and other staff members who are not involved at the leadership level. OTES is an example of many instances where our focused plan is now not so focused.
1 We are just starting to scratch the surface with the model... we have lots of staff and administrators that need training...
1 Compensation, assigning percentages for Special Ed. and Title I staff in terms of responsibility for student growth. The inequality between value added for our teachers in grade 4 based on one test, one day, one time and the other growth measures used with grades K-3. Measuring growth in music, art, PE, Title I and Spec. Ed.
1 We attended days 1 and 2 of the training. I was very frustrated with the training and did not feel it was something we wanted to continue to do.
I am in favor of changing the evaluation system and think it could be a great thing, but to pull me to a meeting where it only really took two hours but make me miss the entire day of instruction at school on multiple occasions is actually unfair in my opinion. I drove to Dayton for one meeting. I arrived at 8:30 for a 9:00 start time. We were informed that we would start a few minutes late due to traffic. Not acceptable in my opinion. Then we were given one hour and fifteen minutes for lunch. Then the training included me watching a 62 minute video of a teacher teaching her class. Nothing like what I am dealing with in my district. It was insulting to my time. I then approached my superintendent and asked if I could be excused from this training in the future and I would do whatever she decided to prepare for the new evaluation system. She agreed and I did not go back for the day 3 training.
1 It is WAY TOO LONG and involved - especially when every teacher begins to be evaluated. This makes it a DIScouraging process rather than an ENcouraging one. (Or is that the intent?!) Also, my colleague, who is farther along in the process than I am, says it will be VERY difficult to ever reach the accomplished level, and she is a very driven person. She is National Board certified and feels this way. How is this going to affect teachers who are good teachers, but not as driven as this teacher? I am really concerned about the morale and stress level of my friends in teaching. Teaching is a noble and rewarding profession, but it is starting to feel like we are the villains here.
1 Obstacles encountered include a complete lack of commitment from the building principal. The building principal currently fails to complete observations as required by the negotiated agreement, so it is not really a surprise that the building principal failed to do any of the activities in the new teacher evaluation methods pilot.
Because of this lack of involvement by the administration, an effective evaluation of the new teacher evaluation methods pilot is not possible. District administration is aware of this nonperformance of duties but chooses to ignore this inactivity.
1 Who defines student growth? What standardized measurements should be used? How do we evaluate teachers using a standardized measurement? How do we ensure that tests we are using to measure growth align with the curriculums of the district? How do you ensure that multiple tests have the same degree of difficulty? This is another example of the state creating a framework that isn't fully thought out or formed and throwing it out into local districts to pilot.
1 The tool takes a large amount of time. Little information has been provided on the student growth component. We are also worried about the matrix, especially only having three categories for student growth.
1 The vast amount of time to properly implement this process is stressful and impossible to do for all building teachers.
1 Very lengthy; difficult to understand and sort through. It seems like this will take a very long time when implemented with an entire school.
1 I am the only administrator for my building and will likely be the only one doing evaluations. I know there is a possibility that teachers could take on evaluator roles, but that would require a change in the contract which is very unlikely. That means 43 teachers evaluated two times next year.
1 The new teacher evaluation methods are very time consuming. We have great concern about the feasibility of being able to goal set and evaluate every teacher every year given the process in the OTES Pilot. We are also anxious about the Student Growth Measures, including those for areas not assessed by OAA.
1 Still not sure how the teacher goals relate to the classroom observation. We are also in a unique situation in determining how many classes a teacher must teach to be evaluated. Some of our licensed teachers are used as advisors and are not actually instructing daily content.
1 I am concerned about what the student growth component is going to entail. I understand and support the teacher evaluation component but I'm not sold on how student growth is going to be measured for every grade level.
1 I am very concerned about the use of data when evaluating art teachers, physical education teachers, music teachers and guidance counselors who serve K-3 students. Also, not sure of how relevant the data sources would be when evaluating kindergarten and first grade teachers.
1 The new evaluation methods are very thorough, but cumbersome. It would be helpful to have the evaluation rubric all on one page, instead of several pages. It is quite thorough & productive, but extremely time consuming.
1 The amount of time needed to complete all the observations etc. will be a lot of work if you are the only evaluator in the building. Also, the amount of time it will take to get the pre-conferences and post-conferences. Finally, not knowing what the student growth measures will be is a little frustrating also.
1 The hardest thing I have encountered was the time factor of accomplishing the evaluations on top of the regular evaluations I need to do already.
1 Teachers don't understand how different this new model is going to be. They have no idea what's coming even though I have tried to tell them countless times when I was union President. I have 2 major issues with the new evaluations. First, I feel that if value-added scores are only based on OAA and OGT scores or some other sort of standardized test, then teaching to the test will become an even bigger problem than it is now. If half of my evaluation comes from value-added and that is based on 1 assessment, then as a teacher, that is all I will focus on all day, every day. If I can just be organized, have all my paperwork organized, etc. on my 2 formal observations, then I can probably receive a good evaluation. Districts need the flexibility to be able to create multiple assessments in order to figure value added scores, not just one test. Ideally, I would like to see at least 8-10 test scores per year figured into the value-added data.
Secondly, no one seems to know what the merit pay system is going to look like, and more importantly, how teachers in non-tested areas will be evaluated based on value-added, etc. On Day 1 of the training, I asked this question and they brought in some "expert" from the back of the room to answer the question. He gave the most political answer I have ever heard. He talked for 20 minutes and never said anything. It was obvious, at that time, that the people running that training had no clue how to answer this question. This is a BIG DEAL. You need to know what this is going to look like before you ram these radical changes down people's throats. I really hope someone reads these comments and they don't just get recycled into some dumpster somewhere. This system can work, but REAL teachers, not a bunch of politicians and PhDs who have never been in a classroom, are all asking the same questions. Someone needs to be able to give them some answers.
1 Very concerned about the amount of time administrators will need for the process. Just pushing the paper is very time intensive. For it to truly be effective and meaningful, this is a huge investment of time.
1 I anticipate union push back about the process. I am concerned with the time element of the process and the consistency with multiple evaluators evaluating the same staff. There needs to be time to make sure that ALL evaluators are similar when using the rubric. The rubric is very cumbersome and leaves much room for interpretation. Would prefer a rubric similar to the Praxis model.
1 It will take forever to do 1 teacher with the current system the way it is being set up. There are going to need to be multiple trained evaluators in a district.
1 Time, and the simultaneous completion of the current negotiated agreement's evaluation system and the evolving new methods. The uncertainty of the multiple student measures and how it will interplay with the final assessment of staff.
1 After completing the gap analysis, we determined we would use the Ohio Model instead of attempting to write our own.
1 The entire new evaluation system has not been totally implemented. There is one evaluator for the entire school district for grades K-12 in a small rural school setting. This evaluator is also the building principal, which makes it extremely difficult to use this lengthy evaluation system for the entire staff multiple times a year while still performing administrative duties that do not involve the evaluation process. The evaluation system also does not seem equal for all teachers due to some teachers not having value-added growth measures to show student growth. There are other measures that are being used, but in terms of each of these measurements being fair and equal, how can you determine that one assessment is fair for one teacher and a different teacher has to use something else?
1 The amount of time for the pre and post conferences is a huge issue. There is not enough time to complete those conferences to the extent that is being asked for each teacher. Though we have fewer teachers than many schools, time is a huge factor in completing conferences.
1 The time it takes to prepare for the pre-evaluation. It took me at least 2 and a half hours to write it up. I believe though that any teacher, good or bad, could "look good" for these 2 formal assessments and then return to former ways. Not sure it will be effective in making major changes in the profession. I believe that more frequent informal evaluations, unannounced, is the only way to get a true picture of a teacher's performance. And then immediate feedback on certain skills being observed would be more beneficial.
1 The time requirement seems to be the biggest; the evidence gathering tool/rubric doesn't allow for some things that make a great teacher; determining how to rate someone who is in more than one category is still vague; there seems to be a big difference between the ratings, and an additional level would have been incentive for teachers who are almost at that highest rating but not quite there.
1 With two people participating, it is difficult to answer this question.
I do believe as one looks to the future the obstacles are going to include: (1) time; (2) principal and teacher relationships; (3) teacher fear of a system that will cause teachers to do test prep rather than focus on student learning; (4) time line for design and implementing is rushed; and (5) the model needs to be field tested following full development and not partial development.
1 The biggest obstacles are as follows: (1) the amount of time involved for the evaluator; (2) the unwieldy nature of the forms; (3) misinformation from the local OEA uniserv rep to our local OES president. On this last one he has been saying that "negotiations" mean that if the local OEA doesn't like OTES (or its equivalent) they can refuse to agree to have it as an evaluation tool.
1 The evaluation completed for our 3 teachers was about 25 pages long vs. our old system, which was 3-5 pages long. There needs to be a balance between the old and the new.
1 Time!!!!!!!!! Small district administrators have to do it all. This is not a practical approach for small rural districts. We are sp ed directors, curriculum directors, discipline providers. This program is developed based on school administrators with more support via assistants and numerous resources. This does not represent a workable model in rural schools.
1 No obstacles. The pilot is not what I thought it would be. It seems to be a way to get administrators credentialed to evaluate teachers. It did provide information about the rubric and goal setting tools. I wish we would know more about other tools we could use to determine student growth besides value-added. There are still a lot of questions about what will be local control and determination of retention, promotion, and termination.
1 1. Changing the way observations are done through the OTES/Marzano system. It has a different way of collecting and viewing data taken from an observation. 2. Finding student growth measures and how to calculate them in the model. 3.
Changing how teachers collect data on themselves for OTES. How many artifacts do you need?
1 Takes a great deal of time to do the process correctly. Worried about when we will be doing this for all teachers every year. How are sp. ed. staff impacted by student growth measures, particularly those in classrooms of kids with severe cognitive disabilities?
1 We had some insights into the goal setting process and have been able to change our professional development schedule this year to train staff on writing SMART goals. We also realized that professional development can no longer be one size fits all. It has to be more focused for individuals or small groups.
1 It is a stressful evaluation, much more so than the models used by our district in the past. OTES, I believe, is very heavy on differentiation, using data and diagnostic testing to justify what is being taught, and use of technology. Obviously, the content standards need to drive what is being taught; use of data scores may be over-emphasized in this model.
1 It is VERY time consuming for both teachers and administrators. Neither of us have the time to use the evaluation to the full potential. The evaluation is very biased due to the role of Value Added; until all teachers have state data or Value Added, it will not be a fair evaluation for an entire staff.
1 The time constraints are huge. The reflection, if done well, takes a serious amount of time to answer. It took a lot of time to go over with the principal. The observation was fine but the post-conference was very time consuming as well.
1 An increased amount of time dedicated to the process on both the parts of administration and teachers.
1 Time factor. Teachers said it took approximately 6-7 hours to complete their action plan for the lesson.
1 The time it takes to complete all the paperwork and actual evaluations takes time from the work I need to be doing in my classroom planning and instruction.
1 Unfortunately I sensed an immediate "push back" from our union president. I really wanted the 8 of us (2 principals and 6 teachers) to be able to become familiar with this tool before others voiced opinions. (And I wondered how they could really voice opinions without having had any training about this pilot!) With good leaders this evaluation tool will help all of us become better educators/leaders for the betterment of our students. This pilot is very much a TEAM effort. It's not an "us against them" thing, which unfortunately is the message I get from our local and district OEA representatives. (Dealing with that constant negativity is the only thing I dislike about my job. Sad, but true.)
1 What about people who are on teacher contracts but are not actual teachers... nurse, guidance, psychologist, etc. How will this impact their evaluation process?
1 The OTES is extremely time consuming. Our building principals are having a difficult time finding the time to do the pre-observation conferencing, a 30-minute observation, a written summary of observation, and a post-conference once per year, let alone twice. Secondly, there is concern among teachers that the system will be overly critical of teachers in core areas due to the lack of data available in the non-tested areas. Finally, the purpose of the new system really is to determine and address staff needs. A system for addressing each individual teacher's professional development needs could be difficult and expensive to implement.
1 The goal setting papers were very cumbersome and distracted from the good conversation about the teachers' students and their needs.
1 The goal setting piece did not seem to fit into the evaluation process. It seems like an extra piece that doesn't really fit with the rest of the rubric.
1 We do not have much data to analyze at our high school. We don't have common assessments to measure growth, and are only relying on OGT data, some PLAN and EXPLORE data, and two year old OAA data. This process has identified a great need to begin work on locally developed common formative and summative assessments. In addition, we have non-core classroom teachers participating in the pilot, i.e., instructional coaches, school counselors, ELL and elective teachers. It is very difficult to make the instrument work for these special areas. I don't believe we had strong guidance in the selection of teacher participants.
The current model only works well for core area classroom teachers. 1 Time is a major issue. The amount of time needed to implement this program with only 3 teachers is staggering. I am having a hard time imagining how this is supposed to be feasible when all teachers will be doing this process. 1 The state did not provide any guidelines at all about the student-improvement aspect of the new required program. This is what we said in our RttT Scope-of-the-Work that we were going to do this year. The state's lack of ability to provide timely information to us really let us down and we do not have a working model to be in place for next year - we will have to pilot something else new. 1 Going through the entire evaluation process is extremely time consuming for each educator. I have spent 4 hours in conferences/observations and an additional 2 hours writing the evaluation with a reflective summary based on 1 observation. 1 The amount of time needed to spend on each individual teacher is unrealistic. With only one administrator for 520 students and 32 teachers it will be a challenge to implement effectively. 1 Time to complete the forms to lead to improvement student learning, accomplished teaching practices. We all want our students prepared for college and/or career readiness. How can we make this great instrument user-friendly for principals without losing the fidelity of what we are trying to do. Time to complete all steps is a real issue in many districts. How can you make this process more friendly to the evaluator. 1 Time... we anticipate 9 specific meetings with each teacher, not counting the prep work in completing the forms prior to a meeting and not counting walk throughs. I will have 32 teachers to evaluate. Frustration - regarding the "growth measures" for non Value-Added classes, and even when possible measures do come, what will determine "expected growth" vs. "less than expected growth" vs."more than expected growth." 
More frustration - disappointed that a teacher who gets a top rating from me and can show 1.9 times the standard deviation for expected growth can no way be labeled as accomplished. Why must a teacher be green to be accomplished? Why can't a top rating with a yellow be accomplished? We are going to have many, many top teachers feel so discouraged when they see this. More frustration - I have to have the evaluation done in April but won't get the growth measures for that year until September... will we be using previous year's/trends scores? What about all the teachers who don't have Value Added? I think the evaluation tool itself has many pluses, but it is only 50% of the rating. There is no way the growth measures implemented will be fair to all teachers in all content areas including our librarian, guidance counsleor, PE, art, etc. The intent is admirable, but it's not realistic, and when ratings start getting published in the paper, we're talking about people's lives here. I'm very worried for my good teachers. 1 We still have no direction on what data will be used for the evaluation of teachers...especially in elective or non-core areas. Since student achievement is to account for half of the evaluation, and evaluation will drive compensation, more concrete information/guidance is needed. The last thing I want to experience is a contentious evaluation meeting involving union representation/legal representation when compensation decisions are made. 1 We've had some technology issues. The OTES documents that allow me to type directly into them are sometimes difficult. I cannot start new paragraphs, bullet items, number items, etc. We are also overwhelmed with the amount of work this will require for teachers and administrators each year--if followed in it's current format. My report alone will be 38 pages long. Our principals struggle to evaluate a third of our staff each year. I cannot imagine how they are going to complete this type of work with every teacher. 
I've had to put some of my own lesson planning, grading, parent calls, etc. on hold to keep up with the pilot data collection and writings. I truly think focusing on two standards is too much with the work required for each. I truly cannot do justice to two areas with the steps I need to complete. 1 There is no plan to help identify how the 50% of the evaluation based on student growth will Appendix work. This A - Survey gap has caused INITIAL SURVEY RESULTS

92 Appendix A - Survey major anxiety for our team and staff. It seems like the roll out is occuring and there is this big chunk not clarified for anyone. 1 The recommended process is too time-consuming for our principals (large student bodies and high numbers of teachers), and teachers find the pre-conference form too time-consuming also. 1 Our pilot teacher spent an hour and thirty minutes preparing a detailed lesson plan for the pre-confernece evaluation. It spend several hours preparing for the pre-conference, completing the formal observation, aligning the observation data to the rubric, creating responses and preparing for the post conference. This does not include the hours that will need to be spent collecting assessment data to measure student growth. I am confident that it will take between 7-10 per teacher to complete two evaluations per year. THis is excessive and not manageable when I am responsible for over 40 staff members. 1 Obstacles include attitudes such as "if it isn't broken, don't fix it" and "this is just the latest flavor of the month education reform measure." 1 Clearly the focus of teaching staff is on the "Student Achievement" measure and how that will be defined and implemented as the 50% part of the evaluation. As for the 50% performance side, it seems to have been accepted by the pilot teachers as an adequate evaluation instrument. 1 Time considerations, especially with brand new assistant principal. Secondly, one question I have raised is the issue of how the local districts will develop and implement an Improvement Plan to assist the struggling teacher. It was presumed by my assistant principal that the subsequent year of evaluation would just be a fresh start with a new goal in mind. 
I disagreed with this notion, and that the subsequent year of a teacher in need of improvement needs to continue with the monitoring of the same goals, or similar goals and monitoring of issues of concern to evaluate whether or not improvement has been made, and exactly what interventions need to be aligned for that particular teacher so that they are not trying to dig their way of of a hole alone, rather the district supplies supports and put them in place for the struggling teacher. I realize that the LEA needs to be included in the process of this document so collaboration would be strongly considered as a necessary course of action to develop this at each district level. One additional thought: There is no mention in the training for the principals who are doing the evaluation to provide the suggestion to an outstanding teacher to develop goals as a LEADER in their profession, rather it seemed they were just to continue to improve their own work. The film that demonstrated the elementary teacher who was quite exceptional was directed to use more visual/written feedback to the student. This teacher just went from feeling as though she had worked her tail off, to being given the message, "I still found something wrong with what you did". To me, that's an indication of too much pressure on the principal to "find something". The administrator in their feedback would have been much further ahead by asking the individual teacher what she think she needed improvement on, because, she would have had more investment in her own effort to improve herself, than jump through a hoop of writing on a sticky note for the student as she cruised the room so that they would have written feedback. The administrator realized this teacher had performed well, but still needed to find a benign bone of contention. That's not the purpose of improvement. 
That particular teacher should be invited and encouraged to work with additional staff members to help improve the instructional practices that she clearly had mastery of. Instead, the focus was what to fix (as if something were amiss) instead of how to make the best use of the strong talents the individual has in the classroom, and potentially in the building. Clearly multipling the exceptional practices would benefit the whole district, AND make teacher evaluation a tool for improvement, not punishment. 1 It has been very difficult to measure "student progress." I have observed an Art Teacher and I do not feel comfortable with the data we came up with. Further, I do not feel that test data of any teacher adequately measures their "success." We have some teachers who have good test data, but are not necessarily good teachers. 1 The time that it has taken to write the plans,and carry them out has been overhelming, not to mention the time it takes to disaggregate all of the data before I could even think about beginning to make my goals. 1 Time!!! Trying to schedule pre-conference, post-conference, and the other meetings required to meet the criteria of the evaluation process are unrealistic. Two teachers at my building are piloting the new evaluation and we struggled to complete all of the conferencing times. When the principal is responsible for 20+ teachers the time requirements will occupy his/her entire day throughout the entire school year. Obviously the principals have more to do than simply evaluate teachers. The RUBRIC is not designed appropriately nor objectively. As a teacher it worries me on two levels. First, I feel it will cramp my creative planning, when I am working toward an accomplished rating I will follow the rubric to the letter, rather than developing a plan to best meet the needs of the students and curriculum. Secondly, the rubric leaves a lot of room for subjectivity. A rubric is meant to be objective. 
However, it is missing minor words like "Are the materials directly linked to the objective." The lesson we watched the the teacher was completely unorganized and presented errors in the content of her lesson, but her materials were appropriate for grade level. ODE presenters stated that they were not connected the outcome of the objective. This is not stated in the rubric as were many other details. Since these evaluations are now linked to job security, it is scary to think that the rubric is still so inconsistent. 1 Not meeting the needds to remove persisitantly low teachers (as seen compared to past leagal issues). TIMELY (right wrong or indifferent) Appendix A - Survey INITIAL SURVEY RESULTS


Summative Survey Results

1. By clicking below, I consent to take the Ohio Teacher Evaluation System Survey.
Answer Response %
I CONSENT %
Total %

2. 1a. Are you directly involved in implementing the OTES in your building or district?
Answer Response %
Yes %
No 19 7%
Total %

3. 1b. Please identify your role. (Check all that apply.)
# Answer Response %
1 Primary evaluator 49 18%
2 Building principal %
3 District administrator 59 22%
4 OTES resource/guide 27 10%
5 Professional development coach 11 4%
6 Teacher 82 31%
7 Union leader 34 13%
8 Other 14 5%

4. 1c. Please describe the "other" role you indicated in the previous question.
Statistic Value
Total Responses 12

5. 2a. To what extent did your district modify the OTES model to fit your school culture during this year's pilot?
Answer Response %
No modifications were made %
A few minor modifications were made 96 36%
Some significant modifications were made 19 7%
Extensive modifications were made 6 2%
Total %

6. 2b. Briefly describe the modifications and reason(s) for making modifications to OTES.
Text Response
- Three teachers were evaluated using the OTES
- We completely adopted OTES model, forms, etc. for our continued pilot in
- Paperwork was reformatted to make responses more streamlined.
- We modified the pre-observation and post-observation forms by using the ones that were developed in the OTES pilot.
- To allow to fit local assessments
- We chose Option #4 so that we may choose things locally. We are working with a group of people to develop forms etc. to fit the model provided along with our local needs.
- Difficult to understand all of the parts because you were being trained at the same time. I wasn't sure of certain things and didn't see how it all fit together. Plus without value added you really were off on what level of achievement the teacher was starting at. Plus ODE was working through the bugs as they went. We need another year of piloting this OTES w/ value added and all of the pieces
- Better form - more organized
- Once goal setting was discussed as being optional, we value it as part of the evaluation process, but sometimes the paperwork can be over-burdening.
- We aligned ours with the Danielson model
- We missed the 1st meeting. The goal setting was not completed because of the timeline. An observation was completed two times. The teacher performance data was also not accessed because the teachers observed were at the Primary level.
- Communication/Professionalism piece
- We created a pre-observation conference form for teachers to complete before the pre-conference meeting to help guide them. We want to rearrange the goal-setting sheet. Teachers found it not user-friendly and wanted goals at top, not in middle of document. We are still trying to develop a data collection tool to use with the self-assessment tool to help teachers guide their goal setting.
- Adapting the model to the current model used by the district. Modification may continue.
- We simplified some of the forms. We also combined some of the OTES requirements with our LPDC requirements. Some of the requirements of OTES would meet the LPDC requirements so we designed the form to reflect the LPDC information as well. This way we only have to fill out one form and it can be used for OTES and LPDC.
- Goal Setting was done earlier so we had to reset goals to meet the goal recommendations for OTES.
- We made an example of possible evidence for each area for our teachers.
- During the pilot we continued to use many of our contract forms because they agree in content with the model and are familiar to us. Near the conclusion, we realized that we will not be able to continue this because the intensive state training is all on the state's chosen forms, so we are being forced to use those instead of the ones we had developed. It would have been more honest of the state to tell us at the very beginning that we weren't going to have much choice in the matter.
- Our district participated in the pilot under Option 4. We used the evaluation tool that is part of our negotiated agreement and is aligned to Danielson.
- Instituted SMART Goals
- Rubric aligned with our current system of teacher evaluation, so we utilized the evidence rubric to fit in our system.
- Rubric clarified, goal setting form simplified
- We worked to have a measure for each level on the rubric for the Standards addressed in the model. We wanted to have distinct indicators across the rubric for each level in order to clearly define expectations.
- Better alignment to our Danielson model
- Danielson Model
- With the time constraints we modified the pre- and post-conference process, along with the goal setting process.
- Our current evaluation system was very similar to the OTES; however, some of the "terms" or "language" had to be modified to follow the model effectively.
- We were in the pre-pilot group and part of TIF, so we already had made changes to the rubric. Also, we like the pre-conference piece that was shared last year, and decided to leave it in.
- Modifying the pre/post conference questions.
- The number of observations was changed to the model. Overall, the model was similar to one we developed a few years earlier.
- We did not get all of the paperwork done in the timeframe we will in the future because we were learning about it as we went. I know we didn't get the professionalism and communication pieces done on time, but they have now changed anyway.
- Instead of only the principal being involved with the evaluations, we also had a team of teachers who participated in the goal setting, pre-observation conference, observation, post-observation conference and evaluation of the teacher. This led to rich discussions and leadership opportunities.
- Goal setting for Annual and PRE evaluations which included Student Growth Measures.
- The initial layout of the rubric was not user friendly. We attempted some adjustments to line items up.
- Evaluation set up
- Teacher evaluating teacher
- We modified the pre-observation meeting to save time. We found it extremely time consuming and we needed to make the process more concise.
- More than anything else I wanted the teachers to help me provide a more useful evaluation. So we didn't really do the evaluation as a true evaluation but more as a learning experience for me and the teacher.
- Principal modified final evaluation write-up sheet format, but used categories from the OTES model
- I don't believe the narrative was fully implemented.
- We did not use a walkthrough for evaluation purposes; the HS and MS have been using the Teachscape walkthrough tool to gather data to analyze and converse to determine the impact of instructional practices. They have trained teachers to do the walkthroughs and their staff will rotate through the responsibility.
- We had to make some changes to the process of the evaluation (pre-conference, etc.). Also walkthroughs needed to be added.
- We had to insert a specific self-assessment and formal goal setting page. We also are redoing our script pages to complement the OTES rubric.
- We had to create some of our own forms that were not available at the time of the pilot. We felt deeply that a pre-conference and post-conference for the first evaluation was a necessity. Both conferences were extremely valuable in the growing process.
- Number evaluated
- We did not fully implement the teacher effect this year.
- The pilot did not start at the beginning of the year so timelines were changed.
- To check alignment with district evaluation tool. The district is in negotiations.
- For some teachers who participated in the pilot, value added was unavailable, due to grade level or content. In addition, for many teachers in the high school, the value added, if available, is based on an ACT end-of-course exam that is unfamiliar to most teachers and is not aligned with the current statewide testing. In addition, it has been administered in a haphazard manner, which makes the data derived from it invalid.
- Did not use the OTES communication or professionalism forms. We instead utilized forms currently in use in the district addressing the same areas.
- We don't presently have goal-setting so that process was new. We continue to investigate the origin of data to use for goal setting, and when we explore student achievement, we're not sure what to use.
- Several teachers were chosen from each building to participate. All others were evaluated using the traditional evaluation. Just one evaluation was done for the pilot teachers. We also did not analyze student growth data.
- Just for time's sake they changed the number of observations to one rather than two, but did include the walkthroughs as part of the observations.
- I asked the staff to write SMART goals but not to the extent that was recommended by the state. I basically followed our adopted procedures but used the strategies and concepts when I evaluated staff. Plus I only used it for two teachers versus all I had to evaluate this year.
- We OBVIOUSLY COULD NOT USE THE 50% STUDENT GROWTH AT THIS TIME.
- Written evaluation reports may not have addressed all 5 teaching standards; a teacher professional project was not implemented during the pilot; analysis of evidence for standards 6 & 7 occurred more informally yet was documented; for the pilot no improvement plans were discussed or needed.
- Teachers in the pilot used only the OTES as their evaluation tool, rather than the district tool.
- Wording and instructions were modified to clarify and to distinguish between initiatives already in place in our district. Some vocabulary that was used was the same but had slightly different meanings.
- In the past only a sampling of teachers were evaluated each year. It will be a huge change to evaluate every teacher every year. We had to increase the number of evaluators in the district.
- We have chosen to implement the OTES fully.
- Columbus City Schools agreed to pilot the OTES with only the SIG Priority Schools in Cohort One. This was a jointly established agreement with Human Resources, The Columbus Education Association, etc.
- We only modified the language having to do with the pre and post observations. Rather than stipulating a time, we stated the pre and post could take whatever amount of time was necessary.
- We used Dr. Rowley's goal setting forms.
- As a pilot high school, we are keeping each of the standards separate and we have added our own rubric evidence for teachers and students to accompany the seven teacher standards. We worked to make a schedule to implement OTES, and we instituted peer review to get people used to being observed in a more "non-threatening" way. We studied FIP, Bloom's, etc. to get us on track with posting and talking with students about learning targets. People worked on personal professional projects and reflection pieces. We looked at value-added and STAR progress monitoring to gauge growth. Primary classes used BAS to assess reading progress and inform instruction. We held an informative meeting on one of our waiver days for all certified staff.
- Our district has an evaluation system aligned to OTES.
- Piloted the new model with a few teachers unofficially to get an idea on how the new model will work. Tried bits and pieces of the model.
- We have modified the pre-conference forms and expectations to save meeting time due to the large number of teachers that each principal will be evaluating. Our teachers felt it was not necessary to conduct pre-conference meetings either. We also made changes to the rubric that we feel are improvements, and that also deal with goal-setting in a more complete manner.
- Goal setting sheet
- Completed OTES components and added cover page for alternative evaluation per our contract and timelines.
- A group of teachers piloted the new program. Information was shared with all staff members. Time constraints to all involved.
- We looked at different assessments to make up the 50% part of the evaluation. These would be added together with the new state tests.
- The number of required observations was altered from 4 to 1. We were told by our principal that feedback from principals was given and the modification was made in an effort to make the process more doable.
- We did not use the student performance part of the evaluation.
- We crosswalked the new OTES evaluation rubric vs. our current teacher evaluation rubric to see how it matched up. We saw some similarities and some differences.
- Post-walkthrough conferences and observation conferences were not held because of time constraints of the normal evaluation cycle. Walkthroughs stopped.
- Forms and process to graduate process to the end of our contract.
- I only used the model on several of the teachers I evaluated and really just started a familiarization process with them. I used my own and tied it to the model.
- We adjusted the timing a little to mesh our current contract and the training dates. I included information from walkthroughs on one observation report and not on the other to see how it would work, since walkthroughs are not addressed in our current contract.
- Pre-conference time was limited
- The timeline was modified due to the school year already beginning. There were situations where we didn't necessarily have sufficient information about the process to effectively implement. For example, the initial SMART Goal setting conference was challenging. Student growth data may have been insufficient for teachers not serving in a tested area, i.e. computer education.
- We created our own based on the Charlotte Danielson model.

7. 3. Which of the following OTES components did you complete or participate in during the pilot year? (Check all that apply.)
Answer Response %
Teacher Self-Assessment Tool (based on Ohio Standards for the Teaching Profession) %
Initial SMART Goal-Setting Conference between teacher and evaluator %
Pre-observation conference between teacher and evaluator to discuss decision-making process for lesson %
Formal classroom observations %
Informal classroom walkthroughs %
Post-observation conference between teacher and evaluator %
Evaluator's written evaluation report after each observation based on Standards 1-5 %
Teacher's written lesson reflection 98 40%
Teacher's professional growth plan 98 40%
Analysis of student growth data 79 32%
Analysis of Teacher's Professional Project 23 9%
Year-End summative conference between teacher and evaluator 88 35%
Analysis of evidence regarding teacher behaviors in Collaboration, Communication and Professionalism (Standards 6 and 7) 87 35%
Use of professional development resources aligned with Teacher Growth Plan 35 14%
OTES Improvement Plan 31 13%

8. 4. What, if any, impacts did the new teacher evaluation system have on instructional practices among pilot teachers during the school year? (Scale: Decreased / No Change / Increased)
- Understanding of how to articulate SMART goals
- Alignment between SMART goals and classroom activities
- Analysis of student data for progress monitoring
- Knowledge of the content areas for which a teacher has instructional responsibility
- Use of multiple assessments to inform instruction and evaluate student learning
- Differentiated instruction in pilot classrooms
- Re-teaching for areas of weakness in student learning
- Enrichment teaching for areas of strength in student learning
- Active engaged learning among students
- Student motivation and setting personal learning goals
- Behavioral problems among learners in pilot classrooms
- Use of self-directed learning activities in pilot classrooms
- Use of methods to foster students taking responsibility for their own learning
- Collaboration and communication between teacher and parents
- Collaboration and communication between teacher and administrator
- Collaboration and communication among teachers in the building or district
- Responsibility for professional growth and performance in a learning community

9. 5. Which of the following describes your current status regarding each area? (Scale: No Plan / Draft Plan / Partial Plan / Full Plan)
- LEA has new policies and/or procedures that reinforce the teacher evaluation process.
- LEA has new policies and/or procedures that support using student achievement data as a part of the teacher evaluation process.
- LEA has new contract language related to the use of teacher evaluation results in compensation, placement and retention decisions.
- LEA has a new comprehensive communication plan that explains the new teacher evaluation system.

10. 6. To what extent does the new teacher evaluation system impact teachers' professional development needs in the following areas? (Scale: No Impact / Slight Impact / Moderate Impact / Extensive Impact)
- Designing and implementing SMART goals
- Designing classroom assessments
- Designing lessons aligned to common core standards
- Selecting assessments to monitor student learning for specific lessons and benchmarks
- Analyzing student learning data from multiple measures
- Using student learning data to select differentiated instructional strategies
- Involving students in self-assessment, progress monitoring and goal setting
- Implementing differentiated instruction
- Best practices for specific content areas
- Fostering self-directed strategy use among students
- Team teaching and teacher collaboration strategies
- Integrating technology into lesson plans
- Understanding value-added measures (VAM)
- Communicating assessment results with students, parents & colleagues
- Using VAM scores to inform instructional and administrative practices for school improvement

11. 7a. What supports need to be in place to ensure sustainability of the new teacher evaluation system within your district? (Check all that apply.) (Scale: Not a priority / Low priority / Medium priority / High priority)
1 Tool for integrating Collaboration, Communication and Professionalism (Standards 6 and 7) with SMART goals
2 Tool or guideline for integrating CEUs for licensure with teacher evaluation results
3 Evaluation option to conduct progress monitoring of teacher's SMART goals
4 Interactive evaluation rubric for use with hand-held devices for capturing data in the classroom
5 Online OTES data management system with evaluator and teacher access
6 Step-by-step guidelines for implementing OTES
7 Ongoing OTES training for administrative and peer evaluators
8 Professional development and system supports for understanding value-added measures
9 Support for how to use VAM scores to inform instructional and administrative practices for school improvement
10 Other

12. 7b. What "other" supports do you believe should be in place to ensure sustainability of the new teacher evaluation system within your district?
Text Response

13. 8a. Please rate how easy it was to use each of the following components of OTES: (Scale: Very easy / Somewhat easy / Somewhat difficult / Difficult / Did not use)
1 Self-assessment form
2 SMART goal setting form
3 Rubric for teacher evaluation Standards 1-5
4 Rubric for teacher evaluation Standards 6 and 7
5 Year-end Summative Form
6 Student Growth Measure information from ODE
7 Other

14. 8b. What "other" components were you referring to in the previous question?
Text Response
- NA
- Lesson reflection form
- none
- non testing areas value added
- Messages coming from ODE do not match the messages coming from the Governor's office.
- Time
- Did not use
- N/A
- none
- Checked by mistake. Could not remove! Sorry! Just stating we didn't use other measures...
- holistic portion
- It is impossible to get all of that done with 30 staff members per year. Impossible!!!
- communication form
- n/a
- none
- Instrument in usable form
- N/A
- growth plan
- With this model, it appears that the majority of teachers will be in the middle, at proficient or slightly above; also, the model doesn't reflect all of our values as educators for what is developing, proficient, accomplished. In addition, these terms are not well defined. It also places a heavier burden on content areas while lessening the burden for non-content teachers. Finally, the performance levels are not, or do not provide, an example of how they would be used to determine compensation or incentive pay.
- none
- Collab and Professional form
- None
- Did not use.
- Jointly agreed upon process and timeline developed by all stakeholders.
- The amount of time that is used to implement OTES.
- Did not use anything else.
- Time involved in pre-conferencing, observation, and post-conferencing. Overall, completing the whole process involved hours of work for both teacher and administrator. Some uncertainty about forms continuously changing and the amount of work that went into each revision.
- It was very difficult to complete the Teacher Professional Growth plan without a link to the PD resources

15. Ohio's new system for evaluating teachers combines a rating of teacher performance (based on classroom observations and other factors) with a rating of student academic growth. These two ratings are each weighted at 50 percent. How clearly does the matrix below explain the method for computing a summative evaluation rating?
# Answer Response %
1 Very clear %
2 Somewhat clear %
3 Somewhat unclear 20 8%
4 Very unclear 18 8%
Total %

16. What would make clear how the teacher performance level and student growth level are combined? (Check all that apply.)
# Answer Response %
1 Provide written description 23 61%
2 Provide examples 30 79%
3 Provide rationale or research base 16 42%
4 Other: (please describe) 5 13%

Other: (please describe)
- All of the above. The survey won't let me check more than 1.
- How will this affect teachers who work with students with disabilities?
- Label the axes with more understandable terms. Darken the outline of the box so we can clearly differentiate between the side and top descriptors and the ratings. Don't color the side and top descriptors.
- WRITTEN DESCRIPTION OF HOW YOU EQUATE NON-NUMERIC VALUES TO %S.
- Differentiate based on SES of students.

What other changes would you make to the matrix that combines student growth and the teacher performance ratings?

Text responses:
- None
- Get rid of the numbers and use the words for the column headings
- I think with the uncertainty of the student performance measures, I would make teachers able to be in the "green" area without a complete year's growth.
- None.
- Difficult to know since SLOs not in place
- none
- We were told that the teacher is evaluated holistically, not numerically, and then the matrix has numbers! Doesn't make sense. Which way is it going to be? Assign numbers or holistically?
- Explain that the data is from the previous year and determines the observation rating. Also, it is CRAZY that only grades 4-8 have value-added data when you're talking about all teachers. It makes the stress level a lot higher in grades 4-8. We need to roll out LEA data for other grades ASAP since that is half your evaluation.
- none
- Teacher Performance Level should be designated as "Summative"
- Elimination of the numbers. Leads to unnecessary conversation. We discussed this on Day 4 training.
- How student growth measurements align with each performance rating (i.e., is a specific percentage of student growth needed to reach each level, how are the various measures balanced, etc.)? The 4, 3, 2, 1 is confusing because the teacher performance rating is not based on numbers.
- None.
- I would make it in flow-chart form
- There needs to be some way to show a teacher being ineffective for ethical/communication/etc. problems even if student growth is good.
- I would change it so that a teacher rated Accomplished by evaluation who gets expected growth should be rated Accomplished overall.
- None currently.
- I would like to see the breakdown of the percentages for the student growth measures be more definitive, i.e., 20% first-year VA or equivalency, 20% benchmark data, 10% formative and/or portfolio assessments. Differentiation of subjective and objective measures, but a definitive weighting system for all districts.
- The testing list is a little light, and I would like to see the correlations of the tests on the list to the CORE. We need a district session on test selection from the lists.
- Our district is working to develop measures beyond value-added data, which we will receive for the first time this year by individual teacher. Without clear multiple measures in place, it is difficult to speak to the rubric's fairness. Also, if different districts use different tools, how will the overall teacher ratings shared be equitable across Ohio? It still feels like it is more of an opinion about where someone falls than any "fair" way of making that judgment.

- Don't use the numbers 1-4. We don't know what that means. Also, our value-added comes in five levels, not three. That confuses teachers.
- Remove the numbers since we are no longer using numbers as part of the evaluation system.
- The knowledge that ultimately it is up to the principal to make a final determination.
- I would not have Accomplished in the expected column, because if your student growth is expected and you are a 4 in teacher performance, then you should be able to be Accomplished.
- Fewer categories
- Teacher performance level of 2 with expected student progress should be Proficient
- To be Accomplished you have to have above-expected growth, when expected growth is a high expectation in itself.
- Our district feels that the matrix is slanted more toward the negative than the positive. The ineffective and accomplished are equal, but the developing outweighs the proficient by too much; they should be at least equal, if not skewed toward the proficient.
- It would be helpful to include the labels at the top of the teacher performance ratings (4-Accomplished, 3-Proficient, etc.)
- It needs a written explanation. It is geared to err in favor of students. If a rating is mixed, the teacher is rated to the lower side.
- I would like to see how the ratings will show inter-rater reliability.
- Teacher student growth data needs to be specific to subgroups. There are some anomalies that are going to cause unfair results, such as parents insisting on Gifted identification of students at an early age (5-6-7). This Gifted identification creates a subgroup that isn't an authentic Gifted group as the students move from grade to grade. There are students in the Gifted subgroup who aren't gifted and are not performing at the predicted level for Gifted students.
- None at this time.
- Please go back to the original plan and do not include a holistic portion. The evaluations must not be subjective.
- More clarity on student growth measures
- I would change the cell for "Expected Growth" and a "Performance Level of 4" to Accomplished. If a teacher has a high-performing class, meets the expected growth, and has demonstrated high levels of performance to their evaluator, they deserve to be higher than Proficient.
- Start over and have teachers and administrators who work in school settings design the measures so that they are reasonable. Business roundtable members, legislators, and Battelle for Children have no working knowledge of how things work in a school setting.
- None.
- The matrix seems to be a usable tool; the challenge for us will be establishing the student growth measures that will be used and then assuring that there is equity across the district for teachers.
- Student growth measures should include 5 levels of growth, similar to VA reports.
- Growth needs to be defined at the high school level, and a time frame is needed to measure.
- I, personally, would have designed the online program to automatically rate the teacher using data, not a matrix. The VAM should have been established a long while ago and in an automated system so as to feed the necessary information immediately. The ODE always does things backwards! We, as evaluators, are left to endless hours of PD and testing, only to use a paper matrix that is already antiquated. Furthermore, those of us who gave up precious time to attend mindless hours of instruction (that changed within the program) should have been credentialed to perform evaluations and provided ongoing optional PD.
- NA
- Key as to what each part means.
- Green doesn't copy very well, so maybe a different color to show the differences instead of green
- none at this time
- A short narrative of student performance measures/student growth measures may be helpful.
- It seems that in order to make a fundamental change in the evaluation process, student growth measures would have to be available in all content areas and grade levels. Research suggests that these student growth measures are not reliable at every grade level, PreK-2 for example. I think the new evaluation system has many positives, including more observations; however, it seems we are moving very quickly to implement data-driven decisions without all the assessments in place. This seems quite troubling.
- If a teacher is 4 and student growth is expected, I think the teacher should still be Accomplished
- Brief description and a shared shaded section between levels.
- Hard to say. There is so much uncertainty for the student growth measures, particularly in grades/subjects where there is no OAA, etc.
- None; after explanation it was easy to use
- Too difficult for the teacher to figure out above, expected, and below growth on the left-hand side of the matrix. Plus, the matrix is not a fair division for accomplished teachers. If a teacher is accomplished and meets expected growth, the teacher is Proficient, not Accomplished. That one square in the matrix will cause major problems in getting teacher buy-in and a feeling of being treated fairly.
- None!
- None
- Rating system for growth measures, e.g., what kind of "score" correlates to accomplished, proficient, etc.
- I feel that it is unfair that a teacher cannot obtain a higher score than Proficient if her students meet growth measures.
- I think the areas of rating and the expectations of these areas need to be explained thoroughly to the staff being evaluated. They need to have a solid understanding of the expectations. The rubrics do this, but I think they need more, because most will not spend the needed time with the rubrics.
- The matrix seems very clear; as long as people have clear explanations of what value-added measures mean, the matrix is clear.
- Clarity must be given to student growth measures for teachers who do not teach tested grades (PreK-2, music, art, physical education, and support teaching staff) and those who do not have value-added measures (all except 4th and 5th grades).
- Gray areas between the categories.
- NONE

- None.
- None.
- None
- Get rid of the numbers at the top and replace them with the 4 levels of teacher effectiveness
- More numeric and less subjective
- I believe a teacher should be "Accomplished" if he/she is receiving a 4 AND his/her students are meeting their expected growth. Using the current scale, a teacher is only "Accomplished" if the student growth measure is above the expected level of growth.
- VERY DEFINED VALUES... ALTHOUGH THAT GOES AGAINST HOW YOU RATE THE TEACHERS DURING THE ACTUAL OBSERVATION SIDE OF THE PROCESS (HOLISTIC).
- none
- The fact that the only way to be ranked "Accomplished" is to have above-expected student growth.
- Good, for now. I would need more experience with it before I recommended any further changes.
- It is very difficult to reach the Accomplished rating. That will be a concern when it is rolled out to the full staff. Also the concern from teachers heavily involved in inclusion classrooms about how it impacts them.
- Give an example teacher score with an explanation based on student growth and teacher performance level
- It doesn't make sense that a teacher can earn a 1 but still be Developing. If the teacher is working with a high group, the class can potentially score expected growth in spite of the teacher. Conversely, the teacher has to earn both a 4 and get above-expected growth in order to earn the highest rating.
- A narrative which clearly explains each rating.
- How the teacher is rated as Accomplished if they showed expected growth.
- I think teachers need a clear picture of how each district will rate using the various tools they can select from to assess the 50% of student measures that will affect their evaluation.
- In my opinion, the matrix is clear.
- It makes no sense that students could make expected growth and the teacher has no chance to be an accomplished teacher. That means the teacher will have done his/her job well and will not be recognized for it. That section of the matrix needs to be changed. If a teacher is evaluated by the administrator as accomplished and the students make expected growth, the teacher should be an accomplished teacher. That is just logical.
- Simplify it somehow; it is way too confusing. It is very hard to tell the "formula" for being "rock solid" Proficient!
- none
- The evaluation rubric is written with Accomplished at the far right, and the matrix is written with Accomplished at the far left. It would be easier to understand visually if the matrix and evaluation rubric put Accomplished in the same place. There is the possibility for confusion when the matrix and evaluation rubric have to be read differently (in opposite directions).
- Throw it out. It is unfair to teachers in lower-SES areas and also does not include staff who are in grades or areas that are not using OAA testing.
- There needs to be more direction from ODE on the 50% student growth: sample plans, guidelines for acceptable measures, and a reevaluation of the vendor test list. I am concerned about contracts and the equity involved in initiating according to legislative demands rather than giving districts time to work cooperatively with ODE through the ESCs to establish fair and equitable guidelines for implementing the student growth measures. Battelle has a proven track record and a history of measuring student growth using certain tools. New tests, etc., will not have an opportunity to establish this validity for this purpose.
- I would like to attend the SLO training, and perhaps this would be clearer; however, like many of the things being set up for the end of the year, I have a conflict. Next year needs to focus on this. The actual observation is really good and nicely aligned with all the initiatives we are doing as part of OIP, i.e., formative assessment, instructional practices, and differentiated instruction. I really enjoyed the training and then the opportunity to practice the skill.
- Streamline the process with extended timelines to show student growth. We were limited on the student growth measurement due to time constraints.
- I would allow for teachers to be Accomplished if they display accomplished teacher skill during observation and meet expected growth.
- Other possible ways to show student growth. The OAA is only a "snapshot" of what a child can do on ONE day of a school year. Some children do not perform well on these high-stakes tests. There need to be more options to show student achievement.
- ??????????????
- There needs to be an element added that gives an equal chance to the teacher who does not have the VAM calculated into their score rating. Example: a third-grade teacher cannot receive VAM because her students take the OAA for the first time in the third grade. A student who is going through emotional trauma at home often does not perform well on tests. Yet the teacher's performance is viewed as inferior, and this counts as half of the teacher's performance rating.
- The labels at the top, 1-4, are unclear
- We waited for an explanation of how to accurately and objectively combine all the components of OTES (Teacher Performance) into a 1-4 rating. It didn't happen. This evaluation is far too important to teachers to be reduced to the subjective evaluation that we have had all along.
- None
- I teach 1/8 of the students' classes, spend 43 minutes of 24 hours a day with a student, and that equals 50% of my rating? Student growth contains many more factors than what is being implemented
- I would wait until we had student growth measures in place for all teachers and reliable data in place before we start using data (3 years of value-added = reliable source) to evaluate teachers. We appreciate that during the implementation the LEAs have input into the weight/percentage of student growth/value-added measures.
- It should be possible for a teacher to achieve an overall "Accomplished" rating if their SGM is "at expected level" and they achieve "Accomplished" for teacher performance.
- At this point I am not sure, because I don't think the creators of this plan even know what they are doing. How can we discuss student growth measures when not all of the test documents have been identified?
- none
- I am concerned that the only way a teacher can be Accomplished is to have Above Expected Growth. If a teacher helps her students make a year's worth of growth and is rated accomplished in the performance section, he/she has accomplished the task set forth for him/her and should be rated as such.

- shortened time during observation
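The 50/50 combination method that the matrix question above asks about can be sketched as a simple decision-table lookup: one axis is the holistic teacher performance level (1-4), the other is student growth (below, expected, or above expected), and each cell holds the summative rating. The cell values below are assumptions inferred from respondents' comments (e.g., that "Accomplished" requires above-expected growth, and that a level-4 teacher with expected growth lands at Proficient); they are not the official ODE matrix.

```python
# Hypothetical sketch of the OTES summative-rating matrix discussed in the
# survey responses above. Cell values are ASSUMPTIONS inferred from the
# respondents' complaints, not the official ODE matrix.

# Keys: (student growth relative to expected, teacher performance level 1-4).
SUMMATIVE_MATRIX = {
    ("above", 4): "Accomplished",
    ("above", 3): "Accomplished",
    ("above", 2): "Proficient",
    ("above", 1): "Developing",
    ("expected", 4): "Proficient",   # the cell many respondents object to
    ("expected", 3): "Proficient",
    ("expected", 2): "Developing",
    ("expected", 1): "Developing",
    ("below", 4): "Developing",
    ("below", 3): "Developing",
    ("below", 2): "Ineffective",
    ("below", 1): "Ineffective",
}

def summative_rating(growth: str, performance: int) -> str:
    """Look up the combined (50/50) summative rating for one teacher."""
    return SUMMATIVE_MATRIX[(growth, performance)]

# The recurring complaint: a level-4 teacher whose students make expected
# (but not above-expected) growth cannot reach "Accomplished".
print(summative_rating("expected", 4))  # Proficient
print(summative_rating("above", 4))     # Accomplished
```

Note that this hypothetical layout is consistent with the comment that "the ineffective and accomplished are equal, but the developing outweighs the proficient": it has two Accomplished and two Ineffective cells, but five Developing cells against three Proficient.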

To what extent does the OTES model support alignment between your organizational processes and performance goals?

Text responses:
- From my understanding, the OTES model will be aligned to our process. At this point, what we currently have in place looks and feels different from the new model, but I don't see any reason for hang-ups between the two models.
- OTES does align with our organizational goals. It matches very closely what we already had in place as a district.
- some
- Good alignment
- The new system closely aligns the two.
- OTES largely supports the alignment between our organizational processes and performance goals. Our biggest concern is figuring out how to align evaluation processes for related-service and other staff (OT, PT, speech, school psych, paraprofessionals, etc.) so their evaluations hold them to the same level of accountability. This will be difficult, if not impossible.
- I believe the alignment process between the organizational process and performance goals is clearly stated and developed to monitor and enhance both student performance and teacher accountability.
- Aligns very well
- The OTES model supports alignment in the fact that if all these things can be done, huge benefits await. However, it detracts from alignment in the time it takes away from completing all other aspects of running an effective classroom and school. Something will have to give.
- The OTES model is almost perfectly aligned. Only minor changes will be necessary.
- You really do not want my opinion.
- Seems to focus on standards more than personal goals.
- I think it will be a good process
- total alignment
- I feel that OTES requires some focused analysis of strengths and weaknesses, which helps teachers to better be able to develop professional goals.
- This is where we want/need to be. It will just take some work to make the transition.
- It is a very good tool. With continued education on goal setting, this will be a very good thing. However, on the downside, since continuous changes are occurring in the landscape of a classroom setting, the big fear is how much change will occur in the next several years.
- In theory the process is good to improve teachers, but it is unrealistic for administrators to do a quality job when evaluating all teachers. The time frame makes this almost impossible. Take a building with 30 teachers: do a goal-setting meeting, pre-eval meeting, eval, post-eval, and then do that twice! You are talking roughly meeting with each teacher 5-7 times plus two evaluations. It seems this may be difficult to get done.
- While OTES provides a systematic approach to walking through the process, it appears that a great deal of work with our staff is needed regarding why change in our evaluation system is needed, what should be evaluated, and how teacher evaluation data can be reflected. If we want to promote OTES as an opportunity to grow teachers as professionals, rather than a compliance activity "that the people at the Board" or "the people at ODE" are requiring, we need to identify ways to develop meaning prior to roll-out. I have developed modules aligned to these goals and tied them in with Charlotte Danielson's early work in "Teacher Evaluation to Enhance Professional Practice," but I fear that the timeline will rush this integral process.
- OTES does support the alignment of our organizational processes and performance goals.
- To a moderate extent. We are currently using rubrics with our instrument, and teachers in one phase of our evaluation process write two goals and provide evidence, but may or may not have a clinical observation, depending on the goals. However, our process has more flexibility for teachers at different points in their career ladder, and SGMs are not a part of our instrument. We continue to have concerns about the two required observations.
- somewhat
- A strong extent.
- I like the model, but it is way too cumbersome. Needs to be easier to use, electronic, and easy to read and interpret.
- The OTES model supports alignment between my organizational processes and performance goals somewhat. I believe they are closely related.
- It supports it, but it is such an extensive process for every teacher to go through that I, as a single evaluator, am worried I will either be overwhelmed time-wise or take shortcuts. I don't want either to happen.
- The alignment is starting. The teacher review/assessment is similar to what we have used in the past. Our goal is to increase and measure student learning and performance.
- I am not sure. It seems unclear when all facets have not yet been negotiated and we are not sure what percentage, etc., will be used. There is disagreement among our principals, teachers, and union on how things should be. Each evaluator seems to have a different opinion. It is very subjective.
- The observation instrument is well thought out and correlates well with our current instrument. The student performance section is a do-it-yourself, easily manipulated mess. This needs to be developed with the same thoughtful consideration given to the observation instrument. It is not ready to be rolled out this summer for those poor districts who are negotiating contracts. Please develop this section with the same care you gave the other. Steps: develop the weights, do preliminary VA work with the tests on the lists, do professional development with districts for the student growth section.
- The model aligns well with our current organizational processes and performance goals. The language of the rubric, especially in the revised version (dated March 23), is clear and specific. The goal-setting piece created a laser-like focus in my building this school year.
- Extremely well aligned
- Very well. If people truly want to do the best job possible, then I think this instrument is good because it requires teachers to think and reflect on why they do what they do in the classroom. I think it will also help people identify their weaknesses and make them think about how to improve in these areas. I believe that organizational processes will be designed to better achieve the performance goals, so I believe the OTES model does support alignment.
- I am a consultant with an ESC. I am currently training educators on the CCSS, mentoring, and helping districts implement formative instructional practices. Understanding the OTES model helps with my consulting work... I like to stay current with education.
- Time consuming
- Aligns with teacher standards, but we need to see what local measures besides the OAA would be in effect for teachers with students with disabilities or behaviors. I am concerned about how other teachers will want to work with "challenging" students because "this kid could affect my evaluation."
- It matches almost exactly what we used to do (with a couple of new areas) but is much more complicated and difficult for teachers to understand without extensive briefings.
- OTES aligns well with our current teacher performance goals. More clarification and guidance is needed from ODE regarding SGMs and how to fairly assess teachers when SGMs are not the same (not all teachers will have VAM or OAAs; some teach the arts, Title I, or spec. ed). We need more guidance, especially when teacher/principal livelihoods are at stake.
- We want to get there; we just have a lot happening this year with RttT and all of this. We are trying to do this well. We just need time.
- Easier to go from district to district to check on implementation and questions
- The time commitment for one evaluator to do this well will be extremely difficult!
- It provides a more positive alignment.
- It aligns fairly well.
- 100%
- Not at all... the amount of time for this entire process is completely overwhelming. I believe there should be outside evaluators that serve districts by coming in and taking the burden off of already overly burdened principals.
- I believe the OTES model on which we have based our draft model will support the growth of educators in our district. To be an accomplished teacher, one must show dedication and the willingness to learn with changing times in education. We have a work in progress, but have a clear idea of where we are going.
- The framework supports alignment, but the overall model needs to be adjusted to fit our needs better
- The OTES model is dovetailed with what is done in the Resident Educator program and directly reflects the behaviors that need to be habits of mind when teaching.
- The OTES model's support of alignment between my organizational processes and performance goals is somewhat aligned. I think it is supportive in grades 4-8 value-added subjects. I don't see a lot of support for non-value-added subjects, K-2, and related arts areas.
- All in all, we are very happy with the process but want to see the whole package. We have it in bits and pieces and are excited to see what changes this feedback has made.
- The OTES will become the focal point for our organizational goals as it relates to student performance.
- Not sure what is meant by this question. Student performance goals? District? Teacher?
- OTES supports our organizational processes and performance goals. The model guides the teacher in a self-assessment, which assists the formation of SMART goals to enhance student learning and growth. The rubrics provide guidance and benchmarks for methods to enhance student learning and growth.
- Similar to what we are using with the 4 domains.
- They are aligned. I think it will be helpful to get all staff focused on quality instruction and student learning. It is similar to our existing tool. The challenge will be in the evaluation of all staff, the "multiple measures," and coming to agreement with the union on any topics that require negotiations.
- Our district is part of the OAC, so we were making changes with differentiation, formative assessment, student learning, etc., before we implemented the OTES pilot, so it was hard to separate what effect OTES had compared to our other initiatives.
- The model is good and has been very useful for the most part. What concerns me is the lack of information available to those who are not involved in the pilot. At our final training we were told about a "webinar" that would be available on the ODE website. We heard details about it and that it would be in a few different sections because of its length. I asked about it recently and got a very terse response with no explanation: there is no webinar. How are teachers and local districts supposed to know how to proceed when there is so much misinformation? (When we had a breakout session at our final OTES training, this was a common theme in all of the districts represented, so I know that it's not just a matter of our particular district not having the information.) We asked questions on March 9th that the presenters could not answer. VERY FRUSTRATING!
- It aligns nicely.
- It supports alignment. However, there is going to be a need to constantly revise as movement toward assessment in the Common Core unfolds. Green value-added may dramatically change as new assessments are unwrapped and students are asked to perform differently instead of predictably on the OAA assessments.
- We feel it matched very closely with our current evaluation tool. The gap was in the area of student growth measures being part of the evaluation. As a district, our teachers already have SMART goals. Reading and math teachers have student growth goals. Our job is to develop student growth goals for all content areas.
- 40/60%
- It doesn't right now, but it will in the future.
- Very much
- It aligns well other than the final matrix.
- Currently, not at all. We still have another year on our teachers' contract. OTES is not a reasonable answer to teacher evaluations and performance. It would be fine to use with new teachers or teachers who have been found to be below par, but to expect anyone to perform this system of evaluation on a whole staff every year is not realistic. If you have 30 certified staff members and you implement the system as written, you will meet with or observe 240 times before March 1st, and that's without walkthroughs. Seriously? In a building with 450 students, one secretary, and one administrator, that is unreasonable!!! Thank goodness this is my 34th year in education and we still have a year on our teachers' contract.
- Makes relationship clear.

- It is a giant step forward and is far superior to our present evaluation system.
- In my opinion, OTES, used correctly, will be an enhancing tool that will allow districts to align the RE program, CCSS, and PD (LPDC) to help teachers work more closely with the educator standards that are designed to guide our practice.
- It is not realistic for administrators to complete all of the steps with all of the teachers. Teachers will have a difficult time not being rated as accomplished.
- The OTES model helps us to focus in on the goals and align the focus across the district.
- I believe the OTES model completely supports the alignment between our organizational processes and performance goals. I also believe, as an OTES pilot teacher, that this is a great evaluation system we are putting into place that supports teacher growth and performance at all levels. Our main focus is increased student performance. We achieve increases by helping teachers to self-reflect, be honest with their own teaching skills and abilities, and search for new ways to become the best you can be by refining your own personal teaching practice.
- Still confusing
- We are using the Pathwise framework for our evaluation system. We provide calibration for teachers and administrators every year. We feel that our teachers and administrators understand the rubric we use at the present moment. Moving to the OTES will be much easier since we have this system in place.
- All I have to say is whoever was involved in putting this tool and the process together was not considering the amount of time that needs to be put into each evaluation... not to mention the other responsibilities we as educators have.
- Somewhat.
- somewhat?
- Very well.
- Alignment and support of math professional goals.
- The OTES model will serve as a final alignment piece for staff performance goals and for assuring alignment to the Common Core. We have Board-adopted the OTES format. Next year we will continue to use and edit it as needed to meet the needs of our teachers and administration. Some areas, like 6 and 7, need to be clarified, as well as the summative form and the addition of multiple growth measures to the evaluation, since our school district utilizes multiple measures PreK-6.
- It has helped to bring clarity and focus to creating an alignment between these goals.
- We have decided to use the framework and the Danielson model to design our model. We will be revising our organizational process once the current agreement expires.
- I like the way things are now, truly connected. Most years I write a SMART goal and forget it by the end of the year. With this system, I set a goal, monitored it, discussed it with my principal, and made definite gains throughout the school year. The OTES is a great tool for encouraging growth.
- More development of alignment for career tech programs is needed.
- It's very similar to the Danielson framework we use now. Our contract precludes us from OTES processes until [date]. We have time to see what works for others in the area of student achievement data.
- The goal and focus are commendable: improving instruction. I am not sure how reliable the various data pieces will be.
- A continuing concern is time to do the process well. Many administrators have over 40 certified staff to evaluate. This is a very lengthy process if done well.
- Process for the OTES model is lacking in our district.
- The OTES model has some great research support and is well intended. However, it is cumbersome and time consuming. For this reason, it will interfere with organizational processes and performance goals unless districts and each building are given more time to adjust, fine-tune, and pilot the model. The implementation date of [year] is too aggressive and will cause shortcuts and no time for collaboration, discussion, and consensus of all parties involved in formative and summative evaluations. Corrective legislation would go a long way in securing a positive outcome. Don't rush to get it done; slow down to get it done right. Allow schools latitude to ensure the OTES model greatly supports their organizational processes and goals.
- Fair extent!
- The OTES model has a nice link to the evaluation tool we are using in the district. I believe the model does support our organizational process and performance goal.
- The OTES model has a nice alignment between the two.
- It is helpful, but I feel STRONGLY that educating teachers on what it means, how to use it, how it aligns to teacher standards, AND what the various indicators LOOK LIKE will be important in guiding teacher performance growth. From what I hear from most teachers, they are still uneducated on what the Ohio Standards for Teachers mean and how to use them. For our district, they were put in our mailboxes and left there a couple of years ago, and no training or professional development was done on them. Yes, teachers should be interested in their professional growth and the standards for their professionalism. But the reality is that most of them are entrenched in their daily activities, in working with students, in implementing new technology, and more, and they simply don't have the time to do professional reading and studying. We fail our teachers as people who implement professional development in our district because we do not address them with the very teaching strategies that we want them to use. They are not only teachers but students of the profession, and in some cases we need to deliver information to them in the same structured, organized ways we deliver instruction to our students. Our teachers are at all levels, much as our students: some will take it upon themselves to keep abreast of their profession, others will wait to be led, others need assistance. Instead of just throwing things at them to read, we need to use structured, organized, thoughtful PD with them that delivers instruction, guides them in their practices, and fosters an environment in which positive growth is encouraged and rewarded. Sadly, that is not my experience in Ohio schools at this time.
- It supports alignment by giving instructional guidelines to teachers based on specific goals that they have set. However, it is very difficult to evaluate certificated staff members who are not in the classroom.
- The OTES model will strengthen our ability to help teachers grow professionally. Hopefully the model will allow us to make better decisions in regard to teacher performance.
- Not sure how to answer this question.

Appendix B

Appendix B: Best Practices Review and Bibliography

Best Practices Literature Review of Trends in Teacher Evaluation Methodology

Across the United States, policymakers, measurement researchers, and those in the teaching profession are re-examining and experimenting with innovative ways to evaluate teacher effectiveness. The following review of research and current practices in teacher evaluation methodology was conducted to identify strategies and lessons learned from current teacher evaluation initiatives. Approaches to teacher evaluation that are under development are summarized, with recommendations to inform future refinement of the Ohio Teacher Evaluation System (OTES).

Since 2010, laws and policies governing teacher evaluation have been undergoing rapid changes in the United States. The primary shift has been away from teacher evaluation methods based on a review of teacher behavior, and from salary and tenure decisions based on seniority and degrees earned, toward a focus on student learning outcomes. Before 2010, only four states were using student achievement as a predominant influence in how teacher performance was assessed. By 2011, the number had jumped to 13, with 10 additional states including student achievement as a small percentage of teacher evaluations, according to the report State of the States: Trends and Early Lessons on Teacher Evaluation and Effectiveness Policies from the National Council on Teacher Quality (October 2011).

In September 2011, the federal government directed that states wanting relief from the No Child Left Behind (NCLB) law could apply for a waiver from the law's tough-to-meet requirements for student achievement in reading and math. To qualify for a waiver, however, states must define how they would use teacher and principal evaluations to make personnel decisions. Initially, 11 states applied for waivers, and an additional 28 states said they planned to seek waivers.

In addition to the NCLB waivers, the Race to the Top (RttT) competition motivated states to adopt new evaluation systems in order to win the grant competition. While these policy changes have caused a major shift in local accountability systems for teachers and principals, the use of value-added scores is at the center of the debate about how best to measure teacher effectiveness. Driven by policy mandates, educational systems are beginning to adopt value-added methods (VAM) for evaluating teachers even before researchers have concluded investigations into how best to measure the impact of teaching practices on student growth.

Recently, the Education Commission of the States tracked 18 state legislatures that had modified teacher tenure or continuing contract policies. One of the most notable was in Idaho, where legislators enacted a bill banning tenure for new teachers and other certified employees. While some states' leaders waged intense battles with teachers' unions, Illinois changed its approach to teacher tenure with less conflict. Governor Pat Quinn signed into law a measure that links educators' tenure, hiring, and job security to performance rather than to seniority. The Illinois law makes it easier to remove an educator from the classroom for continuously poor performance.

Tennessee and Colorado are among the states taking an early role in implementing VAM in teacher evaluation systems. These two states have aggressive laws and have implemented practices using VAM for teacher evaluation. The Tennessee Comprehensive System is one of the more established systems using student achievement data in teacher evaluations. Student test-score growth will account for 35% of a teacher's year-end evaluation. Districts will use the data to decide which teachers will receive tenure and which will be let go. An additional 15% of a teacher's evaluation is made up of achievement measures chosen by the district, and 50% is based on classroom observations and other measures. Under the Tennessee 5-point rating system, teachers rated a 3, or "at expectations," are those whose students make at least one grade level of gain on the state's test. To receive tenured status, a new teacher's students must demonstrate more than one year's gain for two years. William Sanders, a former University of Tennessee researcher who now works for SAS, a private business-intelligence company, developed Tennessee's value-added formula, which does not factor in any individual student characteristics. Previously, teachers in Tennessee were evaluated only once every five years. Under the new system, principals are required to spend from 60 to 90 minutes in a teacher's classroom annually, with the amount of time in classrooms dependent on a teacher's experience. For veteran teachers, principals must conduct four 15-minute observations over the course of the school year.

The Colorado State Council for Educator Effectiveness (CSCEE) was established in 2010 as part of Senate Bill 191 to design a value-added model to provide 50% of the state's teacher evaluation data. The Colorado law requires LEAs to conduct performance evaluations for all teachers and principals at least once each school year.
At least half of each teacher's and principal's evaluation must be based on multiple measures of students' academic growth, including the state test. Teachers' performance effectiveness ratings must be used before seniority when considering district-level layoffs. Colorado law also requires LEAs to consider student mobility and the numbers of students with disabilities or at risk of failing school in its VAM formula. Colorado implemented a pilot program for new teacher evaluation methods during the school year, the results of which are not yet available. However, in an April 2011 report, the CSCEE said it had established a Technical Advisory Group of local and state stakeholders to assist with development of recommendations and is conducting an evaluation with the LEAs that are piloting new teacher evaluation methods, along with a research review of best practices.

The April 2011 recommendations from the CSCEE emphasized alignment of LEA methods with the Colorado Teacher Quality Standards and called for common statewide technical guidelines for the selection and use of valid and reliable measurements, including how to translate assessment data into growth scores for evaluation purposes. The CSCEE also called for protecting educator evaluations as private data within the state accountability system. The CSCEE recommended that the state develop summative assessments for 70 percent of the teachers who currently teach untested subjects or grades and align state policies to quality standards for licensure, accreditation of preparation programs, approval of induction programs, professional development, and educator recognition criteria. The CSCEE recommended that the state provide one complete educator evaluation model with supporting measurement tools and exemplars of LEA evaluation practices. They also recommended that LEAs be encouraged to attribute student growth to teams of educators instead of to individual teachers and to use student and parent/guardian perceptual data as part of teacher evaluation.

The CSCEE reported findings from a study of LEA costs to implement the new teacher evaluation system. The estimated annual cost was $531 for effective teachers and $3,783 for ineffective teachers, who required more supervision and remediation. The study estimated an additional one-time cost of $53 per student in the first year of implementation.

In addition to Colorado and Tennessee, the Delaware Performance Appraisal System is a statewide educator evaluation system that has been in place since 2008 but is undergoing significant modifications, which are being piloted during the school year. The teacher evaluation was initially based on Charlotte Danielson's Enhancing Professional Practice: A Framework for Teaching (2nd Edition, 2007). The recent changes to teacher evaluation are summarized in a policy briefing from the Delaware Department of Education. In the Delaware model, a school-wide assessment measure, a student cohort assessment measure, and a teacher-specific assessment measure will be combined on a 100-point scale for the student growth component of the teacher effectiveness rating. The school-wide assessment measure will be used for all teachers and specialists and accounts for 30 points. Each educator will receive either the Delaware Comprehensive Assessment System (DCAS) reading score across all grades in the school or the DCAS mathematics score across all grades in the school, with the determination based on which one shows the most positive result using DCAS Adequate Yearly Progress (AYP) scores.

For the student cohort assessment measure, districts in Delaware are asked to use fall-spring growth based on students' instructional scores on the DCAS, accounting for 20 points. A student cohort could be specific to a grade level, a subject area, or a student-based cohort within a tested grade/subject area. For example, a counselor may identify a subset of students with frequent absences and focus on math with that group of students. The teacher-specific assessment measure, which counts for 50 points, is a non-DCAS measure tied directly to the teacher or specialist's current teaching assignment and is to be approved by the Secretary of Education.

In Georgia, 26 local education systems are participating in the state's Race to the Top grant to pilot its new statewide teacher evaluation system, called Teacher Keys, beginning in January. The system includes student performance data, student surveys about teachers and teacher surveys about principals, and administrators' two classroom observations. Pilot sites in Georgia will field test the new electronic platform that provides web-based access to the evaluation process guides, templates, and support materials.
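The Delaware point allocation described above (30 points school-wide, 20 points student cohort, 50 points teacher-specific) can be sketched as a simple weighted composite. This is an illustrative sketch only: the component names, and the assumption that each component arrives as a normalized 0-to-1 score, are ours, not Delaware's actual DCAS scoring rules.

```python
# Illustrative sketch of Delaware's 100-point student-growth composite.
# Assumption (not from the source): each component has already been
# normalized to a 0.0-1.0 score; the real DCAS-based rules are more involved.

WEIGHTS = {"school_wide": 30, "student_cohort": 20, "teacher_specific": 50}

def growth_points(component_scores: dict) -> float:
    """Combine normalized component scores (0.0-1.0) into a 0-100 total."""
    for name, score in component_scores.items():
        if not 0.0 <= score <= 1.0:
            raise ValueError(f"{name} score must be between 0.0 and 1.0")
    return sum(WEIGHTS[name] * component_scores[name] for name in WEIGHTS)

# Example: strong school-wide result, average cohort, strong teacher measure.
total = growth_points(
    {"school_wide": 0.9, "student_cohort": 0.5, "teacher_specific": 0.8}
)
print(round(total, 1))  # 27 + 10 + 40 = 77.0 points out of 100
```

Any composite of this kind inherits the validity of its parts, which is why the weighting choices in such systems attract so much debate.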

Participating sites will later have access to a data warehouse for all observation records, documentation to supplement and support those observations, student survey and growth data, and other relevant information. An electronic record will be maintained of all components of the evaluation system, including orientation, familiarization, self-assessment, TAPS formative and summative documents, student surveys, Student Learning Objective (SLO) data and evaluation, and student growth percentile data and calculations. During the pilot, functionality of the platform will be limited, with linkages to student data expected in the next few years (for more details, see the Georgia State School Superintendent's explanation of the Teacher Keys Evaluation System Pilot).

Perspectives on VAM from Measurement and Research Experts

In a joint statement from the National Council on Measurement in Education, the American Psychological Association, and the American Educational Research Association, researchers explain, "Tests valid for one use may be invalid for another. Each separate use of a high-stakes test, for individual certification, for school evaluation, for curricular improvement, for increasing student motivation, or for other uses requires a separate evaluation of the strengths and limitations of both the testing program and the test itself." The National Research Council's Board on Testing and Assessment (2009) made similar claims and warned against using standardized test scores to evaluate the effectiveness of teachers without first determining the degree of alignment between the test, content standards, instruction, and curriculum.
The Council's Board on Testing and Assessment (BOTA) asserts, "BOTA has significant concerns that the [federal] Department's proposal places too much emphasis on measures of growth in student achievement (1) that have not yet been adequately studied for the purposes of evaluating teachers and principals and (2) that face substantial practical barriers to being successfully deployed in an operational personnel system that is fair, reliable, and valid.... The term value-added model (VAM) has been applied to a range of approaches, varying in their data requirements and statistical complexity. Although the idea has intuitive appeal, a great deal is unknown about the potential and the limitations of alternative statistical models for evaluating teachers' value-added contributions to student learning. BOTA agrees with other experts who have urged the need for caution and for further research prior to any large-scale, high-stakes reliance on these approaches (e.g., Braun, 2005; McCaffrey and Lockwood, 2008; McCaffrey et al., 2003)."

A primary concern among researchers is that it is technically inaccurate to assign causality for student learning outcomes to teachers in real-world classrooms, as teachers and students are not randomly assigned to class groups. Technical issues that complicate the meaning of value-added scores include this lack of random assignment of teachers to schools and students to teachers. Current VAM techniques do not control for those differences, and therefore VAM scores are not comparable between teachers who work with different populations. In addition, value-added scores may be affected by student motivation and parental support (BOTA, 2009). Linn (2008) concludes: "As with any effort to isolate causal effects from observational data when random assignment is not feasible, there are reasons to question the ability of value-added methods to achieve the goal of determining the value added by a particular teacher, school, or educational program."

Darling-Hammond et al. (2012) further explain: "Value-added models enable researchers to use statistical methods to measure changes in student scores over time while considering student characteristics and other factors often found to influence achievement. In large-scale studies, these methods have proved valuable for looking at factors affecting achievement and measuring the effects of programs or interventions." Many of the leading researchers in the area of teacher effectiveness and educational measurement concur (Baker et al., 2010; Braun, 2005, 2011; Newton, 2011) that, with respect to using value-added measures of student achievement for evaluating individual teachers, there is strong research evidence suggesting that high-stakes, individual-level decisions, or comparisons across highly dissimilar schools or student populations, should be avoided.

The term "student growth" is defined differently for different value-added models. As Darling-Hammond et al. (2012) explain, research reveals that a student's achievement and measured gains are influenced by much more than the work of any individual teacher. Other factors include:

- School factors such as class sizes, curriculum materials, instructional time, availability of specialists and tutors, and resources for learning (books, computers, science labs, etc.)
- Home and community supports or challenges
- Individual student needs and abilities, health, and attendance
- Peer culture and achievement
- Prior teachers and schooling, as well as other current teachers
- Differential summer learning loss, which especially affects low-income children
- The specific tests used, which emphasize some kinds of learning and not others, and which rarely measure achievement that is well above or below grade level

Most of these factors are not actually measured in current value-added models, and the teacher's effort and skill, while important, constitute a relatively small part of this complex equation. As a consequence, researchers have issued the following cautions about adopting value-added models to accurately measure teacher effectiveness:

- Current value-added models of teacher effectiveness are highly unstable, as research shows a teacher's effectiveness rating can differ substantially from class to class, from year to year, and across different VAM statistical models.
- Current value-added models do not account for disproportionate numbers of students with poor attendance, unstable home environments, low parental involvement, lack of internal motivation, and foreign-language backgrounds, which cause misestimates of teachers' effectiveness and create disincentives for teaching students with the greatest needs.
- The accuracy of value-added models is greatly improved with the random assignment of students to teachers; however, the likelihood of random assignment is low in our educational systems.
- Current value-added ratings do not account for the complexity of influences such as multiple teachers, school conditions, prior teachers, and classroom groupings.
- Often the VAM ratings generated by today's rudimentary methods do not correlate with teacher evaluation ratings assigned by skilled observers. In terms of statistical validity, teacher evaluations need multiple measures that all point to the same conclusion.

Harris (2011) explains that there are considerable errors in value-added measures which need to be addressed, and that caution is needed in how VAM scores are used within teacher performance systems. In addition, VAM scores are based on the bell curve, which inherently ranks scores so that some teachers will always be below average and some above average, even if the difference between them is insignificant. Harris also explains that any accountability system should focus on holding teachers accountable for what they control, which includes goals, lesson plans, progress monitoring of students, differentiated instruction, and classroom management.
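Harris's bell-curve point can be shown with a toy calculation: standardizing a set of nearly identical scores still forces some teachers "below average" even though the raw differences are negligible. The scores below are invented for illustration (kept as integers in hundredths of a point so the arithmetic is exact).

```python
# Toy illustration of normative (bell-curve) ranking: even when raw VAM
# scores are practically indistinguishable, standardizing them still
# labels some teachers "below average."
from statistics import mean, stdev

# Hypothetical VAM scores for five teachers, in hundredths of a point
# (i.e., 3.00-3.04): a total spread of only 0.04 points.
scores = [300, 301, 302, 303, 304]

mu, sigma = mean(scores), stdev(scores)
z_scores = [(s - mu) / sigma for s in scores]

below_average = sum(1 for z in z_scores if z < 0)
print(below_average)  # 2 of the 5 teachers rank "below average"
```

The ranking says nothing about whether a 0.01-point difference is meaningful; that is exactly the caution Harris raises about normative scoring.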
Students also have control in the classroom with regard to their behavior, motivation, receptiveness to learning, peer influences, attendance, health, and nutrition, all of which influence teaching. School communities control factors that influence teaching, such as class size, class grouping, administrative leadership and support, curriculum resources, support staff, and parental involvement. Larger societal trends, current events, and funding all impinge on student behaviors and teaching practices in the classroom.

Prior to 2010, the most common methods for teacher evaluation were classroom observations and evaluations by administrators, teacher portfolios documenting teaching behaviors and responsibilities, and peer review (Hinchey, 2010). Darling-Hammond et al. (2012) also point out that countries like Singapore include a major emphasis on teacher collaboration in their evaluation systems. This kind of measure is supported by studies which have found that stronger value-added gains for students are likely in schools where teachers work together as teams and have higher levels of teacher collaboration. Just as the No Child Left Behind law sought to address problems with low expectations for student performance and led to a focus on narrow curriculum and teaching to the test, Hinchey warns that the VAM movement's focus on test scores also will lead to undesirable consequences that undermine the goal of developing excellence in the teacher workforce. Harris (2011) reviewed the many models of value-added measures under development by assessment and measurement experts and concluded that VAM is not suitable for use in evaluating individual teachers, but that, when applied appropriately, school value-added measures are useful for accountability.

Merit Pay and Bonus Incentives

Another area of debate is the merit pay incentives that states and districts are packaging with the new teacher evaluation systems. School districts in at least 42 states have some form of merit pay, according to the National Center on Performance Incentives at Vanderbilt University. In a study of the District Awards for Teacher Excellence (DATE) merit pay program in Texas, Springer and Lewis (2010) concluded that there was a correlation between schools' voluntary participation in the merit pay program, increased teacher retention, and improvements in students' test scores in some, but not all, participating schools.

In 2011, Indiana passed a new law mandating that the test performance of students be factored into teacher pay raises. This move represents a major shift away from pay scales that award raises based primarily on a teacher's years of experience and the academic degrees earned. At Indiana's Wayne Township Schools, administrators are proposing a new compensation system based on a seven-point award system (Elliott & Butrymowicz, 2012). Under the plan, teachers receive one point each for years of teaching experience, degrees attained, professional leadership, and attendance, and up to three points for performance based on a formal evaluation. Within the proposed evaluation method, student test scores carry 20% weight; eighty percent of a teacher's evaluation is based on observations of their work. The school district's superintendent wanted the compensation system to motivate all teachers to meet their students' academic needs, not just those in a particular subject area or hard-to-fill position.
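The Wayne Township proposal described above can be sketched as a simple scoring function: one point for each of four credit categories plus up to three points from the formal evaluation, for a seven-point maximum. The yes/no treatment of the one-point categories is our paraphrase of the reported plan, not the district's actual rules.

```python
# Hedged sketch of the proposed seven-point award system at Wayne Township:
# one point each for four categories, plus 0-3 points from the evaluation.
# Treating each one-point category as a simple yes/no is an assumption.

def award_points(experience: bool, degree: bool, leadership: bool,
                 attendance: bool, evaluation_points: int) -> int:
    """Return a 0-7 point total; the formal evaluation contributes 0-3."""
    if not 0 <= evaluation_points <= 3:
        raise ValueError("evaluation_points must be between 0 and 3")
    return sum([experience, degree, leadership, attendance]) + evaluation_points

# Example: a teacher credited in three categories with a 2-point evaluation.
print(award_points(True, True, False, True, 2))  # 3 + 2 = 5 of 7 points
```

Note how the evaluation component (up to 3 of 7 points) dominates any single credit category, which is the point of the proposal.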
Berry and Eckert (2012) point out that incentive programs that merely pay teachers for student test scores produce limited results; other incentives can produce better outcomes and can be used to spread expertise to colleagues. In their review of the empirical evidence, Berry and Eckert note that teacher incentive proposals are rarely grounded in what high-quality research indicates are the kinds of teacher incentives that lead to school excellence and equity. For example, the authors note that empirical evidence, including large-scale studies and an increasing number of teacher testimonies, suggests that working conditions are far more important than bonuses. Moreover, those important working conditions go well beyond the issues of time, class size, and the length of the workday. The working conditions that appear to support improved teacher and student performance are those that allow teachers to teach effectively, including:

- Principals who cultivate and embrace teacher leadership
- Time and tools for teachers to learn from each other
- Specialized preparation and resources for the highest-needs schools, subjects, and students
- The elimination of out-of-field teaching assignments
- Teaching loads that are differentiated based on the diversity and mobility of students taught
- Opportunities to take risks
- Integration of academic, social, and health support services for students
- Safe and well-maintained school buildings

In addition, missing from virtually all of the currently in-vogue strategies to give teachers incentives to improve achievement is an understanding of how incentives could be used to reward teachers who spread their expertise to their colleagues. Teachers have long been organizationally siloed from each other. Berry and Eckert (2012) point out that strategic compensation could be used to reward teachers who collaborate, rather than compete, with their colleagues in helping them teach more effectively.

Still, many educators criticize performance pay plans, arguing that the promise of more money will not make teachers work harder. Performance appraisal systems that focus heavily on test scores can have negative consequences, such as cheating and the narrowing of learning opportunities when teachers focus on test content (Chetty, Friedman, & Rockoff, 2011). Jabbar (2011) argues for an approach that would incorporate psychological knowledge about human behavior related to intrinsic motivation and decision-making to enhance student performance measures in education, rather than external motivation incentives such as merit pay. Dobbie and Fryer (2011) conducted an intensive mixed-methods study of successful charter schools in New York and found five strategies in effective schools: give frequent feedback to teachers, use extensive data on individual students to guide instruction, employ heavy tutoring, increase instructional time, and maintain very high expectations.
These strategies are school-wide and require administrative support to implement in the classroom. Teachers need formative evaluation feedback during the school year; access to high-quality data systems and short-cycle assessments aligned to high-quality benchmarks and outcome standards for content-area learning; time to analyze student learning data on a regular basis; and time to plan and implement differentiated instruction, which may require the support of additional teachers or resource professionals for students performing at the low-end and high-end margins. These types of strategies require a coordinated, school-wide effort. Incentives and methods that pit individual teachers against each other, or that do not support professional collaboration and communication among teaching staff and administrators, can be counterproductive to fostering students' academic growth. Darling-Hammond et al. (2012) concur that a viable alternative to the use of VAM in teacher evaluation is to focus on teachers' planning processes and the use of evidence-based instructional strategies proven to be effective for fostering students' academic growth.

Conclusions and Recommendations from the Case Study and Review of Best Practice Literature

The use of value-added measures to evaluate teachers raises many technical and ethical issues that will be debated for years to come as assessment experts and educators investigate correlations and causal links between teacher behaviors and student growth. In the meantime, principals, school districts, and teachers struggle to meet mandates set up by new laws that call for a student growth score to constitute up to 50% of a teacher's evaluation rating. However, there are indications that policymakers are pausing to rethink the use of VAM within teacher evaluation systems. For example, the Elementary and Secondary Education Act (ESEA) reauthorization proposal of 2011 scaled back the original mandate for student growth measures in teacher evaluation systems at the federal level.

Current research indicates that student achievement and measured gains are influenced by much more than an individual teacher. School factors influencing student achievement include class sizes, instructional time, availability of specialists and tutors, the degree of alignment of curriculum resources with content standards and effective assessment measures, previous and other teachers, and school leadership. Out-of-school factors influencing student achievement include the level of supports or challenges in the home environment, individual learner needs, abilities, and intrinsic motivation, personal and family health, school attendance rate, peer culture, and differential summer learning loss.

VAM researchers agree that the lack of random pairing of students and teachers makes causal attribution a fundamental flaw of current VAM formulas. There is widespread interest in resolving this technical issue. However, until more robust value-added formulas become available, experts strongly caution against using VAM data to make high-stakes decisions about critical issues such as teacher pay, tenure, awarding of teaching licenses, and ranking of teacher effectiveness.

A major finding of the review of research and practices related to teacher evaluation methodology is the understanding that, while educational policy is well intended in seeking to link teachers' classroom practices to student learning, there is much work to be done to resolve technical issues with current value-added methodologies for doing so. Few dispute that teachers have a significant impact on student learning. The debate is over how best to measure that impact, and on that point there is yet to be consensus among policymakers and researchers. It is recommended that educators conduct pilot studies of VAM and take a more evidence-based approach to including student growth data in the evaluation of teachers. In summary:

- Experts caution against using VAM scores as the basis for high-stakes decisions about tenure, hiring, and pay incentives.
- Experts also caution against giving substantial weight to VAM scores, to avoid consequences such as cheating and teaching to the test.

- Educators need to understand how to interpret VAM scores and evidence-based VAM methodologies in order to be properly informed when selecting vendors and weighting VAM scores within the OTES model.
- Longitudinal studies need to be conducted to verify how VAM scores provided by vendor assessment systems correlate with other measures of teacher effectiveness, such as observations, classroom assessments, graduation rates, and college readiness.
- Current value-added methods offer limited value for teacher appraisals and should be only one of multiple measures used for evaluating teacher effectiveness. Other measures could include SMART goals and progress monitoring, observations and walkthroughs, lesson plans, short-cycle assessment results, benchmark assessments, student work and professional collaboration, and context measures for special needs, class grouping, and school, home, and community characteristics.
- Educators should focus on designing an evaluation system that provides useful feedback to individual teachers, teacher cohorts, and school administrators so that teacher effectiveness data can be used to improve instruction.

These findings from the literature review were compared to findings from the OTES case study sites, and the following recommendations are made relative to the improvement and scalability of a sustainable OTES model. Amid the ongoing debate about how to evaluate teachers, leaders in policy, education, and measurement all agree on one thing: teacher effectiveness can and should be evaluated. The challenge is to make operational the well-intended movement toward a better educational system for Ohio's children. With regard to using value-added measures in evaluations of teacher effectiveness, the literature reviewed indicates that such measures have significant errors that need to be minimized through more in-depth statistical research. It will take time to develop value-added measurements that are valid and reliable enough to use in schools.
In the meantime, it would be judicious to:

- Avoid using VAM data for high-stakes decisions about individual teachers' effectiveness;
- Include VAM as only a small percentage of teacher groups' evaluation results; and
- Conduct rigorous pilot studies of how the results of various value-added statistical models correlate with other measures of student growth and teacher effectiveness.

In addition, it is recommended that OTES not lose sight of the fact that this new teacher evaluation movement in the United States reaches far beyond value-added scores to coalesce a decade or more of work to improve teacher effectiveness through high-quality professional teaching standards, common core content standards, meaningful curriculum and multiple assessments aligned to standards, and the accessibility of timely data systems. While the expected outcome is improvement in student academic growth, the state and its districts need to focus on aligning their standards, policies, resources, and assessment measures while they improve the teacher evaluation system. Districts need to provide teachers and educational leaders with professional development to appropriately and effectively implement supports for improving educational opportunities for students. Educators interviewed at case study sites generally do not understand the complexity of value-added measures, how a VAM score is derived, how to interpret it in terms of improving their teaching practice, or how to explain VAM to parents. These are important topics that need to be addressed through action research, professional development, and communication plans before student growth scores are introduced to the public. In the end, teacher evaluation is best grounded in criteria over which teachers have control, such as setting SMART goals, regularly measuring and monitoring the learning progress of students, adjusting classroom practice to address gaps in learning or to accelerate learning through differentiated instruction, engaging in professional development aligned to needs assessment results, and collaborating with support staff, other teachers, and administrative leaders to address the needs of all students within a school community.
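At their simplest, the pilot studies the review calls for reduce to correlating vendor VAM scores with another effectiveness measure for the same teachers. A minimal sketch, using entirely hypothetical data in place of district records:

```python
# Sketch of a pilot analysis: how well do vendor VAM scores agree with
# observation ratings for the same teachers? Data are hypothetical; a real
# pilot would draw both measures from district evaluation records.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

vam_scores = [1.8, 2.4, 2.9, 3.1, 3.6, 2.2, 2.7, 3.3]   # hypothetical
obs_ratings = [2.0, 2.5, 3.0, 3.4, 3.5, 2.1, 3.1, 3.2]  # hypothetical

r = pearson_r(vam_scores, obs_ratings)
print(round(r, 2))
```

A low correlation between the two measures would be exactly the kind of evidence the review says should temper the weight given to VAM in a summative rating.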

BEST PRACTICE BIBLIOGRAPHY

Baker, E., Barton, P., Darling-Hammond, L., Haertel, E., Ladd, H., Linn, R., Ravitch, D., Rothstein, R., Shavelson, R., & Shepard, L. (2010). Problems with the use of student test scores to evaluate teachers. Washington, DC: Economic Policy Institute.

Barlevy, G., & Neal, D. (2012). Pay for percentile. American Economic Review [forthcoming].

Berry, B., & Eckert, J. (2012). Creating teacher incentives for school excellence and equity. Boulder, CO: National Education Policy Center. [Available online at TchrPay.pdf].

Board on Testing and Assessment. (2009). Letter report to the U.S. Department of Education on the Race to the Top Fund. Washington, DC: National Research Council.

Braun, H. (2005). Using student progress to evaluate teachers: A primer on value-added models. Educational Testing Service Policy Perspective. Princeton, NJ.

Chetty, R., Friedman, J.N., & Rockoff, J.E. (2011). The long-term impacts of teachers: Teacher value-added and student outcomes in adulthood. Cambridge, MA: National Bureau of Economic Research.

Danielson, C. (2008). The handbook for enhancing professional practice: Using the framework for teaching in your school. Alexandria, VA: Association for Supervision and Curriculum Development (ASCD).

Darling-Hammond, L., Amrein-Beardsley, A., Haertel, E., & Rothstein, J. (2012). Evaluating teacher evaluation. Phi Delta Kappan, 93(6).

Dobbie, W., & Fryer, R.G., Jr. (2011). Getting beneath the veil of effective schools: Evidence from New York City. NBER Working Paper. Cambridge, MA: National Bureau of Economic Research.

Dobbie, W., & Fryer, R.G., Jr. (2011). Are high-quality schools enough to increase achievement among the poor? Evidence from the Harlem Children's Zone. Public Policy, 3 (November).

Elliott, S., & Butrymowicz, S. (April 9, 2012). Questions abound as districts shift to merit pay for teachers. The Hechinger Report.

Harris, D.N. (2011). Value-added measures in education. Cambridge, MA: Harvard Education Press.

Hinchey, P.H. (2010). Getting teacher assessment right. Boulder, CO: National Education Policy Center.

Jabbar, H. (2011). The behavioral economics of education: New directions for research. Educational Researcher.

Linn, R.L. (2008). Measurement issues associated with value-added models. Paper prepared for the workshop of the Committee on Value-Added Methodology for Instructional Improvement, Program Evaluation, and Educational Accountability, National Research Council, Washington, DC, November 2008.

Marzano, R.J. (2011). The Marzano teacher evaluation model. Englewood, CO: Marzano Research Laboratory.

National Academies, Board on Testing and Assessment and the National Academy of Education Committee on Value-Added Methodology for Instructional Improvement, Program Evaluation, and Accountability. (November 13-14, 2008). Collection of conference workshop white papers. Washington, DC: National Academies.

Newton, S.P. (2011). Predictive validity of the Performance Assessment for California Teachers. Stanford, CA: Stanford Center for Opportunity Policy in Education.

Newton, X., Darling-Hammond, L., Haertel, E., & Thomas, E. (2010). Value-added modeling of teacher effectiveness: An exploration of stability across models and contexts. Educational Policy Analysis Archives, 18(23).

Springer, M.G., & Lewis, J.L. (2010). District Awards for Teacher Excellence (D.A.T.E.) program: Final evaluation report. Nashville, TN: National Center on Performance Incentives at Vanderbilt University. [Available online at SITE.pdf].

Olympia Office, th Ave SE, Suite 201, Olympia, WA


GUIDELINES FOR THE IEP TEAM DATA COLLECTION & GUIDELINES FOR THE IEP TEAM DATA COLLECTION & Progress Monitoring Decisions about the effectiveness of an intervention must be based on data, not guesswork. Frequent, repeated measures of progress toward

More information

Sample Completed Summative Report Form for a Secondary Teacher 1*

Sample Completed Summative Report Form for a Secondary Teacher 1* Sample Completed Summative Report Form for a Secondary Teacher 1* This form must be used for each performance appraisal. The duties of the principal may be delegated to a vice-principal in the same school,

More information

How To Pass A Queens College Course

How To Pass A Queens College Course Education Unit Assessment Analysis Guide...committed to promoting equity, excellence, and ethics in urban schools and communities 1 Unit Assessment Analysis Guide Introduction Form 1: Education Unit Core

More information

Student Achievement and School Accountability Programs (SASA) Monitoring Plan for School Improvement Grants October 1, 2010 to September 30, 2011

Student Achievement and School Accountability Programs (SASA) Monitoring Plan for School Improvement Grants October 1, 2010 to September 30, 2011 Student Achievement and School Accountability Programs (SASA) Monitoring Plan for School Improvement Grants October 1, 2010 to September 30, 2011 January 12, 2011 Table of Contents I. INTRODUCTION...

More information

EXAMPLE FIELD EXPERIENCE PLANNING TEMPLATE CCSU MAT Program

EXAMPLE FIELD EXPERIENCE PLANNING TEMPLATE CCSU MAT Program EXAMPLE FIELD EXPERIENCE PLANNING TEMPLATE CCSU MAT Program Secondary Education (Math, History/Social Studies, Science, World Languages) and Special Education (K-12) Goal: The MAT program focuses on preparing

More information

Possible examples of how the Framework For Teaching could apply to Instructional Coaches

Possible examples of how the Framework For Teaching could apply to Instructional Coaches Possible examples of how the Framework For Teaching could apply to 1b. Specific Examples 1b. Demonstrating Knowledge of Students 1a. Specific Examples 1a. Demonstrating knowledge of Content and Pedagogy

More information

Assessment, Recording and Reporting Policy

Assessment, Recording and Reporting Policy Assessment, Recording and Reporting Policy Assessment Assessment enables teachers and pupils to monitor and evaluate learning and to set new targets. Its purpose is to articulate progress and shape the

More information

Developing an implementation research proposal. Session 2: Research design

Developing an implementation research proposal. Session 2: Research design Developing an implementation research proposal Session 2: Research design Learning objectives After completing this session, you will be able to: Develop a research design outlining your data collection

More information

Fridley Alternative Compensation Plan Executive Summary

Fridley Alternative Compensation Plan Executive Summary Fridley Alternative Compensation Plan Executive Summary Fridley School District Alternative Compensation Plan is a district wide program that is approved by the Minnesota Department of Education and was

More information

Evaluating Training. Debra Wilcox Johnson Johnson & Johnson Consulting

Evaluating Training. Debra Wilcox Johnson Johnson & Johnson Consulting Debra Wilcox & Consulting Learning new behavior new or enhanced skills is the most powerful outcome of training. Behavioral change is the most difficult outcome to achieve, but may be the most important.

More information

Mississippi Counselor Appraisal Rubric M-CAR

Mississippi Counselor Appraisal Rubric M-CAR Mississippi Counselor Appraisal Rubric M-CAR 2014-15 Process Manual Last Modified 8/25/14 The Mississippi State Board of Education, the Mississippi Department of Education, the Mississippi School of the

More information