Advancing Professional Excellence Guide Table of Contents


Table of Contents
- APEX Values and Beliefs
- History and Development of APEX
- Purpose of APEX
- Components of APEX
- The APEX Process
- APEX Process Timeline
- Training/Orientation
- Professional Practice
- Self-Assessment (power in self-reflection)
- Professional Growth Plan (PGP) Goal Setting
- Classroom Visits (Observations), School Visits and Ongoing Feedback
- Mid-Year Conversations
- End of Year Conversation
- Measures of Student Learning
- Student Learning Objectives (SLO) Handbook
- Professional Practice Ratings and MSL Ratings
- Determining an Overall Rating
- Appendix A - Resources & Tools
- Appendix B - RANDA Navigation Support
- Appendix C - Additional Details About APEX

Revised 8/12/15

Advancing Professional Excellence

Advancing Professional Excellence (APEX) is the professional growth and performance system for educators in Adams 12 Five Star Schools. APEX encompasses the components of Colorado's Educator Effectiveness bill and represents the district's core values and beliefs about professional growth.

APEX Values and Beliefs
- We believe high-quality, reflective educators create high-performing schools.
- We believe in a growth mindset for all educators.
- We believe professional relationships built on mutual respect and transparency enhance everyone's practice and elevate our profession.
- We believe reflective practice and professional collaboration make educators more effective.
- We believe in continuous improvement, rather than perfection.

History and Development of APEX
Since 2011, the Five Star Schools Educator Effectiveness Advisory Committee, comprised of teachers, principals, district leaders, and District Twelve Educators Association leaders, has been proactively studying the Colorado Educator Effectiveness Model and planning for its implementation in our district. This committee was formed in response to the Colorado Senate bill "Ensuring Quality Instruction through Educator Effectiveness" (or simply: Educator Effectiveness), which was passed in May. In 2011 and 2012, the committee focused on developing a deep understanding of the purpose and intent of Educator Effectiveness and the professional practices referenced in the Colorado State Evaluation Rubrics. From 2012 to 2014, the committee focused on planning for how the Professional Practices (evaluation rubrics) portion of Educator Effectiveness would be utilized in the Five Star District as a tool for supporting educators' growth.

During the school year, the district started using RANDA, the online support tool for educator effectiveness, and the committee tackled specific questions around Measures of Student Learning (MSL) and helped guide a cohort of more than 200 educators who are learning about the use of MSL data, including setting Student Learning Objectives (SLOs). The committee fosters a strong collaborative relationship among its members and has identified the values and beliefs that support our collective goal of ensuring we have a cadre of reflective, growth-minded school leaders, teachers, and specialized service professionals throughout our district.

Purpose of APEX
We know that high-quality instruction creates high-performing schools that have a positive impact on student achievement and prepare students with the 21st-century skills needed for college and career. APEX is designed to support the professional growth of building leaders, teachers, and specialized service professionals through reflective practice and purposeful feedback. The purpose of APEX is to enhance professional learning and growth so that every child in the Adams 12 Five Star District is educated by high-quality classroom teachers and service professionals who are effectively supported by building leaders.

Benefits to Students
Research shows that when educators have opportunities to reflect on their practice in order to strengthen their instruction, student achievement increases. In Adams 12 Five Star Schools, we have established high expectations for our students based on the Colorado Academic Standards. These standards can support us in providing all students with the academic knowledge, language, and skills necessary to be successful beyond our system in college and careers. While the academic standards provide the foundation for the content we teach, APEX provides educators with the supports they need to ensure all students can achieve.

Benefits to Educators
APEX provides building leaders, teachers, and specialized service professionals with opportunities to become reflective practitioners and receive meaningful feedback about how their practice impacts student achievement. The system includes training, goal setting, instructional observations, and reflective collaborative conversations that can support all educators in refining their practice in order to continually meet the needs of all students. This Guide is intended for all educators within APEX, including principals, assistant principals, teachers, and specialized service professionals.

Components of APEX
APEX can be thought of as having two main halves. This summary of each half, and the following pie graphs, provide context for the details about each half, and how they work together, that are addressed later in this guide.

Professional Practice Standards (50%)
One half of APEX is based on Professional Practice Standards. These outline performance indicators that align with best practice for instruction that will help all students succeed. After multiple opportunities for feedback, reflection, and growth, educators will receive a rating on the Quality Standards that measure professional practice.

Teacher & Specialized Service Professional Quality Standards
I. Content Knowledge
II. Establish Classroom Environment
III. Facilitate Learning
IV. Reflect on Practice
V. Demonstrate Leadership

Principal & Assistant Principal Quality Standards
I. Strategic Leadership
II. Instructional Leadership
III. School Culture & Equity Leadership
IV. Human Resource Leadership
V. Managerial Leadership
VI. Development Leadership

Measures of Student Learning (50%)
The other half of APEX addresses Measures of Student Learning (MSL), which are based on multiple measures, including Student Learning Objectives (SLOs), not a single assessment. The MSL half applies to all educators beginning in the school year.

APEX Educator Weighting Models
The following models illustrate the approved weighting of the professional practice standards and measures of student learning, as determined by the Five Star District Educator Effectiveness Advisory Committee.
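As a concrete illustration of the 50/50 weighting described above, the sketch below combines a professional practice score and a Measures of Student Learning score into a single weighted result. The function name and the numeric scale are illustrative assumptions, not part of APEX; the district's actual rules for combining ratings are covered in the "Determining an Overall Rating" section.

```python
# Illustrative sketch only: APEX weights professional practice and
# Measures of Student Learning (MSL) at 50% each. The function and
# numeric scores here are assumptions for illustration, not policy.

def overall_score(practice_score: float, msl_score: float,
                  practice_weight: float = 0.5) -> float:
    """Combine two component scores using the stated 50/50 weighting."""
    msl_weight = 1.0 - practice_weight
    return practice_weight * practice_score + msl_weight * msl_score

# Example: a practice score of 3.0 and an MSL score of 2.0
# combine to 0.5 * 3.0 + 0.5 * 2.0 = 2.5.
print(overall_score(3.0, 2.0))  # → 2.5
```

With equal weights this is simply the average of the two halves; a different `practice_weight` would model any alternative split.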


The APEX Process

APEX Process Timeline
The APEX process puts our values and beliefs of reflection, feedback, and growth into action. Through regular reflection and collaboration, educators and their supervisors continue to build relationships and hone practices. These set the foundation for strong educational environments for our students.

APEX Training for Evaluators
Timeframe/Deadline: August
Action/Purpose:
- Ensure reliability and validity, and give all evaluators the same foundational knowledge
- Describe APEX and expectations for its implementation
- Train evaluators on how to utilize the RANDA Educator Effectiveness Management System
Training will be provided for new evaluators.

Orientation
Timeframe/Deadline: Mid-September; complete by end of September
Action/Purpose: Orientation for using the RANDA system is available on CDE's site. We will also offer help sessions for those who are interested.

Self-Assessment & Professional Growth Plan (PGP) Goals
(Link to Sample PGP Goals; Link to RANDA Login)
Timeframe/Deadline: Complete by end of October
Action/Purpose: Educators will complete a self-assessment within RANDA to reflect on professional performance within the professional practice standards and determine areas of focus for a professional growth plan (PGP). Educators draft a PGP goal and submit these goals in the RANDA system. Educators are encouraged to complete the self-assessment and draft their PGP with a colleague or team of colleagues in order to have open, honest collaborative conversations about areas of strength with instructional practice and areas where they may want to grow. A collegial approach allows educators to consider how to support each other. The self-assessment and PGP can also be completed individually, if an educator is more comfortable with that format.

Measures of Student Learning (MSL)
(Link to Teacher Categories)
Timeframe/Deadline: Complete by end of October
Action/Purpose: *NOTE: For the school year, only SLOs will be used by all certified staff for the MSL half of APEX.* Administrator and teacher agree upon the Measures of Student Learning portion of the system which, depending on teacher category, will include data from two or more of the following measures:
- Colorado Growth Model (CGM)
- State Summative Assessment (CMAS, ACT, ACCESS, etc.)
- School Performance Framework (SPF)
- Student Learning Objectives (SLOs)
Please follow the link to the teacher categories section of this document for more information about each measure. Because SLOs are new, much more information about them can be found in the next item, and later in this document in the SLO Handbook section.

Student Learning Objectives (SLOs) (as part of the Measures of Student Learning)
(Link to SLO Handbook; Link to sample SLOs)
Timeframe/Deadline: Complete by end of October
Action/Purpose: Educators work in teams to determine student learning objectives (SLOs) that will be part of their overall Measures of Student Learning. Educators draft two SLOs and submit them to their evaluator for review and approval. SLOs are entered into the RANDA system. Schools will determine the best approach for teams to have time to draft SLOs. Many support documents, tools, and some centralized support sessions will be offered. Educators are highly encouraged to complete the SLO with a team of colleagues (e.g. a grade-level or department team) in order to think about common assessments and the importance of certain standards for their students.

Supervisor Approval of PGP Goals & SLOs
Timeframe/Deadline: Complete by mid-November
Action/Purpose: Throughout the fall, administrators host individual and team meetings as needed for clarification, suggestions, etc. in regard to PGP Goals and SLOs. Administrators approve PGP Goals and SLOs within the RANDA system by 11/15.

Mid-Year Conversation
Timeframe/Deadline: December - January; complete by end of January
Action/Purpose: Reflective conversation between educator and supervisor to discuss the following:
- Progress toward PGP Goals
- Progress toward SLOs
- Review of student performance data
- Summary of ongoing conversations
- Potential ratings on Quality Standards within the professional practices rubric
Use RANDA to support the conversation and track progress. Most educators will have a reflective conversation at mid-year. However, for educators with any known performance concerns, those concerns should be discussed in this meeting, including supports provided and next steps; administrators should document these conversations.

End of Year Conversation
Timeframe/Deadline: April - May; complete by mid-May
Action/Purpose: Reflective conversation between educator and supervisor to discuss the following:
- Success with PGP Goals
- Success with SLOs
- Review of student performance data
- Summary of ongoing conversations
- Final ratings on Professional Practice Quality Standards
- Initial conversation about potential PGP Goals for next school year, based on reflection from the current year
- Review of the process for determining the overall effectiveness rating, so educators understand how the professional practices ratings will combine with the MSL ratings in the fall of 2016
Use RANDA to support the conversation, finalize professional practice ratings, and finalize the MSL worksheet.

Training/Orientation
All educators will receive training on the components of APEX. An orientation to the system will occur at the beginning of the school year, with additional trainings provided throughout the year as needed to share updates to APEX or updates from CDE. Trainings may also include directions for how to navigate and utilize RANDA. This training will primarily occur at the building level for teachers and SSPs, and will also occur for principals at the start of their school year. Prior to inputting information into the RANDA system, educators will need to acknowledge completion of the training/orientation. A sample screenshot of the training acknowledgement page can be found in Appendix B - RANDA Navigation Support.

Professional Practice
As the district implements more rigorous academic standards for our students, we want to ensure that our educators are equipped with the knowledge, skills, and strategies needed to help all students be successful in meeting these standards. The Professional Practice Standards in APEX are an outline of the performance indicators that align with best practice for instruction that will help all students succeed. APEX creates a system of high expectations for our teams of educators (including school leaders, teachers, specialized service professionals, and all central school support employees) that is based on research-based professional practices proven to have a positive impact on student achievement. This section of the APEX Guide goes into detail about the Professional Practice half of APEX in the following order:
- Self-Assessment (power in self-reflection)
- Professional Growth Plan Goal Setting (power in identification of professional growth goals)
- Classroom Visits (Observations), School Visits and Ongoing Feedback
- Mid-Year Conversations
- End of Year Conversations

Self-Assessment (power in self-reflection)
Purpose of the self-assessment: The self-assessment process provides educators an opportunity to reflect on their practice. Using the professional practices on the Colorado Evaluation Rubrics, educators assess their performance by assigning ratings and identifying areas of strength and areas for refinement. The rubric used for a self-assessment references the same professional practices that appear on the evaluator's end-of-year rubric.

Process for the self-assessment: All educators will complete the self-assessment and are encouraged to draft their PGP with a colleague or team of colleagues in order to have open, honest conversations about areas of strength with instructional practice and areas where they may want to grow. A collegial approach allows educators to consider how to support one another. The self-assessment can also be completed individually, if an educator is more comfortable with that format. Note: Even though educators who are new to the profession and/or APEX may not have historical data to draw from, they will also fill out the self-assessment. The process will familiarize them with the professional practice standards and allow them to think about their areas of strength and areas for growth in the role they serve.

Sharing the self-assessment: The practice of educators sharing their self-assessments is encouraged. Sharing one's self-assessment allows opportunity for collaborative, reflective conversations that lead to a shared understanding of an educator's strengths and areas for professional growth. It is more likely that supports, leadership opportunities, collaborative learning opportunities, and other individualized possibilities for learning will surface if the self-assessment is shared and discussed with one's supervisor.

Professional Growth Plan (PGP) Goal Setting (power in identifying professional goals)
Purpose and makeup of the PGP: Educator goal setting is an important component of reflection and feedback that leads to growth and refinement in an educator's practice. Goal setting is designed to focus educators on developing and mastering skills that will impact their overall performance and, in turn, student achievement. The establishment of PGPs promotes collaboration and reflection among educators. Experience and research show that goal setting has the greatest impact when educators use it to reflect on the professional practices that most influence student learning. The PGP includes two goals:
- One goal based on the educator's area of growth that emerged from the self-assessment.
- One goal based on a school-wide focus from the school's Unified Improvement Plan.
The PGP is not a plan that focuses on student achievement goals. Measures of Student Learning (see the corresponding section below) take into account the importance of goals aligned to student achievement. In contrast, the PGP allows educators to think critically about their practice. We know that school leadership, teaching, and the many specialized supports for students are all complex practices. The opportunity to set professional goals through the PGP allows educators to focus on their practice and elevate their craft.

Process for PGP Development: Educators are encouraged to complete the self-assessment and draft their PGP with a colleague, or team of colleagues, in order to have open, honest conversations about areas of strength in instructional practice and areas where they may want to grow. A collegial approach allows educators to consider how to support each other. The self-assessment and PGP can also be completed individually, if an educator is more comfortable with that format.

Approval of PGPs: Teachers, specialized service professionals, and assistant principals will collaborate with their school leader (supervisor) to finalize their PGP. Principals will collaborate with their district leader to finalize their PGP. All supervisors will need to approve PGPs in the online RANDA system (please see the APEX Process Timeline for deadlines). Please see Appendix A for a Professional Growth Plan Goal Setting Protocol. Please see Appendix B for guidance on navigating the PGP in the RANDA system.

Sample PGP Goals

Example 1: Classroom Management (potentially a new teacher PGP goal)
Goal Name: Classroom Management
Goal Description: I will maximize student learning by creating and sustaining a focused classroom (set clear directions, emphasize positive student behavior, support students in self-management of their behavior).
How will we measure progress toward this goal?
- Overall growth over time on Quality Standard 2, Elements A and F, of the Educator Effectiveness Rubric
- Consistently implement the School-wide Discipline System, High Behavioral Expectations, Clear Expectations, Positive Framing, and 100%
- Implement Kagan structures
- Informal observation feedback (during informal and formal observations)
- Observation data on student actions after teacher directions
What action steps will I work on to improve my practice and meet this goal?
- Modeling from my coach and strong teachers, then practicing the skill
- Reflection on classroom visits and feedback from the Instructional Coach and building leader
- Co-planning with my grade level and mentor
- Attend district Kagan workshops

Example 2: (potentially a more seasoned teacher PGP goal, or potentially a school-wide goal)
Goal Name: Instructional practice with text-based questioning in my classroom
Goal Description: I will maximize student learning by working on my practice with questioning in my classroom, trying to include more text-based questions when facilitating classroom discussion and more text-based questions in the assignments I provide students.
How will we measure progress toward this goal?
- Overall growth over time on Quality Standard 3, Elements C, E, and G, of the Educator Effectiveness Rubric
- Share with my students that this is what I'm working on and that they are going to get good at answering questions from text this year; ask them to help me track my progress
- Implement 2 LDC modules this year with emphasis on text-based questions embedded into the overall task and the skills lessons along the way
- Informal observation feedback (during informal and formal observations); ask my principal to look at this when he/she is in my classroom and share with me the questions he/she hears me ask students
- Observation data on student actions after teacher directions
What action steps will I work on to improve my practice and meet this goal?
- Reflection on observation and feedback from the Instructional Coach and building leader; have them come observe me and give me feedback on this practice when they are in my room
- Co-planning with my grade level; let them know I'm working on this and collaborate to have them help me think of these questions
- Attend LDC Learn sessions to support my understanding of the shift toward text-based questioning and build it into an LDC module

Example 3: (Specialized Service Professional - Social Worker)
Goal Name: Screening and preventive support for social-emotional needs of students
Goal Description: I will utilize a universal social-emotional screening tool to identify and support students who are at risk, including their families.
How will we measure progress toward this goal?
- Quality Standard 1, Element A, of the educator effectiveness rubric for social workers
- Implement the universal screener with all 1st, 2nd, and 3rd grade students
- Use data from the screener to identify students at risk and group them according to need
- Run regular social-emotional groups with students
- Meet 3-6 times with families of the students for updates and supportive strategies at home
- Observation from the assistant principal of both groups and family meetings, including feedback on whether the instruction matches the student need identified by the screener
What action steps will I work on to improve my practice and meet this goal?
- Regular electronic and face-to-face communication with families
- Regular electronic and face-to-face communication with students' teachers
- Attend training on the universal screener
- Attend district social worker early-release Wednesday PLC meetings to collaborate with other mental health professionals

Example 4: (Principal or Assistant Principal)
Goal Name: Principals demonstrate instructional leadership.
Goal Description: Promote school-wide efforts to establish, implement, and refine appropriate expectations for curriculum, instructional practices, assessment, and use of data from student work.
How will we measure progress toward this goal?
- Overall growth over time on Quality Standard 2, Elements A, C, D, and E, of the Educator Effectiveness Rubric
- Share with teachers that this is what I'm working on and that they will receive feedback through formal and informal observations
- Create an Early Release schedule focused on time for teachers to continue to learn about and plan for the three instructional shifts in the Common Core Standards, the district writing focus, and TLC Planning on Purpose Sessions focused on the Common Core ELA Standards and District Units of Study; teachers will complete surveys and/or reflections after each Early Release session that connect learning to their instructional practices
- Utilize the MSL Cohort Leadership Team and Elementary Writing teacher leaders to help facilitate planning sessions with teachers to increase assessment literacy aligned to writing; teachers will design SLOs and lesson plans that highlight the CCSS, including instructional shifts and formative and summative assessments
- Facilitate data day sessions with grade-level teams focused on grade-level data, specifically looking at student work to help identify student needs and refine instructional practices; use the data to support feedback conversations about school-wide instructional practices with the Executive Director during observational walkthroughs
What action steps will I work on to improve my practice and meet this goal?
- Reflection on observation and feedback from the Executive Director
- Co-planning with the assistant principal, building instructional coach, and teacher leaders to develop an aligned, predictable, yearlong professional development schedule for Early Release Days, Data Days, and Grade Level Planning
- Attend principal Just in Time Trainings
- Attend principal cohort visits with colleagues to norm expectations for instructional practice
- Track progress by engaging in instructional walkthroughs with the Executive Director during each visit

Classroom Visits (Observations), School Visits and Ongoing Feedback
As part of APEX, principals, teachers, and specialized service professionals should anticipate that their district and school leaders will provide them with feedback, so that they have the opportunity to engage in self-reflection to help them refine their craft. Here is a sample list of professional interaction opportunities for supervisors to observe and provide feedback to practitioners, with associated tools to support reflective, collaborative relationships:
- School or classroom visits with reflective feedback conversations (see Appendix A for a Classroom Visit Tool and Reflective Conversation Protocol)
- Participation in Professional Learning Communities
- Engaging in data study
- Planning sessions that include unpacking standards
- Participation in district/department/team/grade-level meetings
- Engaging in walkthroughs
- Participating in staff meetings/principal meetings

*At least one of these professional interactions needs to be documented in the RANDA system for non-probationary staff, and two for probationary staff. (See Appendix B for guidance on navigating the Observation section in the RANDA system.)
*Principals should be in classrooms and with teachers or assistant principals often enough to identify the strengths and needs of teachers and assistant principals. Executive Directors should be in schools and with principals often enough to identify the strengths and needs of principals.

A Note about Classroom Visits (Observations): Some classroom and school visits may be scheduled and more formal in nature, while others may be drop-in visits and informal in nature. Regardless of whether school and classroom visits are scheduled, teachers, principals, assistant principals, and executive directors of schools are always encouraged to provide feedback and engage in reflection about their practice.

Feedback Conversations: Feedback conversations are a critical component of APEX in that, through these conversations, educators have the opportunity to reflect and grow. The Five Star District is continually working with executive directors, principals, and assistant principals to support a shared understanding of instructional practice and a shared process for providing feedback that is helpful for educators. Please see Appendix A - Resources & Tools for a reflective conversation protocol.

Mid-Year Conversations (December - January)
Mid-year conversations are an opportunity for educators to meet with their evaluator and reflect on their professional growth. These conversations should occur between December and the end of January, and are in addition to the feedback conversations occurring throughout the school year. During a mid-year conversation, educators and their evaluator collaboratively review evidence of practice, along with student data, in order to identify strengths and areas for growth. The use of educator and student data allows educators to reflect on the impact their practices are having on student achievement and to develop strategies and next steps based on this impact. This data should also be used to reflect on the educator's progress toward meeting his or her professional goal and, if needed, make revisions to the PGP. Mid-year conversations provide an opportunity to review available student assessment information and student progress toward Student Learning Objectives (SLOs); see more about SLOs in the Measures of Student Learning and Student Learning Objectives sections of this guide. Please see Appendix A for Mid-Year Conversation Guidance for Evaluators.

Documentation to review during a Mid-Year Conversation may include:
- Draft ratings and anecdotal data for the educator's practice
- Progress toward PGP Goals
- Progress toward SLOs and potential revisions to SLOs (if necessary)
- Review of student performance data (formative and/or summative) aligned to the SLO and/or other student data
- Summary of ongoing conversations
- Potential ratings on Quality Standards within the professional practices rubric
- Student assessment information (formative and/or summative) aligned to the SLO and/or other student data

Most educators will have a reflective conversation at mid-year. However, for educators with any known performance concerns, those concerns should be discussed specifically in this meeting, including supports provided and next steps. Please see Appendix A for a mid-year reflective conversation protocol for an individual educator or small group of educators. There is also a protocol to help guide a more candid conversation.

Possible Outcomes of the Mid-Year Conversation:
- Identification of areas of professional strength
- Identification of areas for professional growth, with potential revision to the PGP Goals
- Strategies/steps the educator will implement
- Supports/opportunities the educator will benefit from
- Potential revision to an educator's SLOs
- Identification of support the educator may receive to promote their professional growth

To prepare for mid-year conversations, educators may reflect on the following questions:
- What would you identify as strengths within your practice? What evidence do you have of those strengths?
- What would you identify as an area(s) of growth in your practice? Why would you identify this area?
- What strategies/steps do you need to utilize to support your professional growth?
- How have you made progress toward meeting your professional goal? What is your evidence of this progress?
- How has this progress impacted student learning? What is your evidence of this impact?
- What support do you need in order to continue your professional growth?

Outcomes from the mid-year conversations will be documented in the RANDA system by completing the mid-year portion in the system or through a document upload. See Appendix B for guidance on navigating the mid-year conversation section in the RANDA system.

End of Year Conversation (Mid-May)
At the end of the school year (April 1 through mid-May), principals, assistant principals, teachers, specialized service professionals, and evaluators will have a reflective conversation about overall professional practice. These conversations will provide an opportunity for educators to reflect on their implementation of the professional practices on the Colorado Evaluation Rubric and the impact these practices had on student achievement. Educators may also begin the process of thinking about next year's Professional Growth Plan (PGP) Goals. During end-of-year conversations, educators also have an opportunity to review students' progress in meeting the Student Learning Objectives (SLOs).

Documentation reviewed during an End-of-Year Conversation may include:
- Final ratings and anecdotal data for the educator's professional practice
- Progress toward PGP Goals
- Progress toward SLOs
- Review of student performance data (formative and/or summative) aligned to the SLO and/or other student data
- Summary of ongoing conversations
- Potential ratings on Quality Standards within the professional practices rubric
- Student assessment information

Possible Outcomes from End-of-Year Conversations:
- Identification of areas of professional strength
- Identification of areas for professional growth
- Student performance data
- Final professional practices ratings
- Beginning of PGP goal setting for the following school year

To prepare for end-of-year conversations, educators may reflect on the following questions:
- What would you identify as strengths within your practice? What evidence do you have of these strengths?
- What would you identify as an area(s) of growth in your practice? Why would you identify this area?
- What strategies/steps do you need to utilize to support your professional growth?
- How have you made progress toward meeting your professional goal? What is your evidence of this progress?
- How has this progress impacted student learning? What is your evidence of this impact?
- What support do you need in order to continue your professional growth?

Please see Appendix A for guidance on a reflective end-of-year conversation. Outcomes from the end-of-year conversation must be documented in the RANDA system. See Appendix B for guidance on navigating the End of Year Conversation in the RANDA system.

Measures of Student Learning

(NOTE: This section addresses aspects of APEX beyond the school year. It is provided in anticipation of full implementation as data become available.)

In the Measures of Student Learning (MSL) portion of APEX, each teacher and specialized service professional is classified into one of three categories based on the summative data available. Those data are usually tied closely to the content areas and grade level(s) taught. Principals and assistant principals are in a separate category.

*NOTE: For the school year, all certified staff will use only SLOs for the MSL half of APEX.*

The teacher and specialized service professional categories in APEX are as follows:
- Category 1: Teachers who instruct students in grade levels and content areas with Colorado Growth Model (CGM) data available.
- Category 2: Teachers who instruct students in content areas measured by state summative assessments but where CGM data are unavailable for the students instructed.
- Category 3: Teachers who instruct students in content areas where there are no state summative assessments.

For help determining the correct category, please see the Teacher Type Category flowchart in Appendix A.

Teachers in each category will have different assessment results factored into the Measures of Student Learning portion of the system based on the content area(s) and grade level(s) taught. The total weight of the measures incorporated into the Measures of Student Learning portion for each teacher will be 50%. The three types of measures incorporated are:
1) Summative assessment results or Colorado Growth Model (CGM) data
2) School Performance Framework (SPF) indicators tied directly to the school's UIP
3) Student Learning Objectives (SLOs)

*With the latest revisions to state law, the APEX Committee may revise the following two tables.*
Teacher or Specialized Service Professional Category | Summative/CGM Measure Weight | SPF/UIP Measure Weight | SLO Measure(s) Weight
Category I | 10% | 10% | 30%
Category II | 10% | 10% | 30%
Category III | 0% | 20% | 30%

Measures of Student Learning for Principals and Assistant Principals

Measure | Weight
Literacy Median Growth Percentile (MGP) | 10%
Math Median Growth Percentile (MGP) | 10%
Student Learning Objectives Measures | 30%
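For readers who want to double-check the arithmetic, the weight structure above can be expressed as a short sketch. The dictionary below simply transcribes the table values (the category labels are informal, not official APEX terminology); the check confirms that each category's measures total the required 50% of the evaluation.

```python
# Sanity check: each category's Measures of Student Learning weights,
# transcribed from the tables above, must total 50% of the evaluation.

CATEGORY_WEIGHTS = {
    "Category I":          {"Summative/CGM": 0.10, "SPF/UIP": 0.10, "SLO": 0.30},
    "Category II":         {"Summative/CGM": 0.10, "SPF/UIP": 0.10, "SLO": 0.30},
    "Category III":        {"Summative/CGM": 0.00, "SPF/UIP": 0.20, "SLO": 0.30},
    "Principal/Asst.":     {"Literacy MGP": 0.10, "Math MGP": 0.10, "SLO": 0.30},
}

def total_msl_weight(weights):
    """Combined MSL weight for one category, rounded to avoid float noise."""
    return round(sum(weights.values()), 2)

for category, weights in CATEGORY_WEIGHTS.items():
    assert total_msl_weight(weights) == 0.50, category
```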

Framework for Measures of Student Learning for Educator Effectiveness

Every teacher's or special services provider's evaluation is determined by equally weighting professional practice and measures of student learning/outcomes. The professional practices score and rating are determined by the completed professional practices rubric. There is no single best way to score and evaluate a teacher based on measures of student learning/outcomes; however, there are some requirements based on the rules promulgated by the State Board of Education. The full set of State Board of Education rules for professional growth can be found in Appendix C.

Fundamental Requirements for Student Growth Measures for Teachers
a) A measure of individually attributed Student Academic Growth, meaning that outcomes on that measure are attributed to an individual licensed person;
b) A measure of collectively attributed Student Academic Growth, whether on a school-wide basis or across grades or subjects, meaning that outcomes on that measure are attributed to at least two licensed personnel (e.g., measures included in the school performance framework, required pursuant to section , C.R.S.);
c) When available, Statewide Summative Assessment results; and
d) For subjects with annual Statewide Summative Assessment results available in two consecutive grades, results from the Colorado Growth Model.

Because of these rules, creating a single system applicable to all teachers and special services providers is difficult. Some teachers teach at grade levels and in subject areas where Colorado Growth Model (CGM) data are available (e.g., 6th grade Math and 9th grade Reading and Writing), but many teachers do not. Some teachers teach in subject areas and grade levels where CMAS/TCAP proficiency data are available (e.g., 3rd grade and 8th grade Science), but most teachers do not.
Then, there are teachers who do not teach in a grade level or content area measured by CMAS/TCAP or other state assessments. Finally, there are special services providers who may or may not provide instruction to students related to any content standards. As a result of these differences, a method for determining measures of student learning must be devised based on the type(s) of assessment or outcome data available for a specific teacher or special services provider. At this time, there appear to be three categories into which teachers or special services providers could be classified:

- Category I: Teachers who instruct students in content areas where Colorado Growth Model data are available from state summative assessments (e.g., a 4th or 5th grade teacher, or a Language Arts or Math teacher at the secondary level)
- Category II: Teachers who instruct students in content areas with state summative assessments but no Colorado Growth Model data (e.g., a K-3 classroom teacher or secondary Science/Social Studies teachers)
- Category III: Teachers who instruct students in areas without state summative assessments (e.g., an elementary specials teacher or elective teacher), as well as special services providers who may or may not provide instruction tied to content standards at all (e.g., counselors, nurses, or psychologists)

On the following pages, a proposed framework for each category of teacher is outlined and explained. The framework for each category uses multiple measures to divide and weight the portion of the evaluation based on student growth. The method for each category entails using both required measures and choice measures to ensure a level of consistency as well as flexibility, given the unique needs and improvement focus of each school and each teacher. The measures used will fall into two or three of the following categories based on the data available:
1) CGM or state summative data
2) School Performance Framework (SPF) measure(s)
3) Other Student Learning Objective (SLO) measure(s)

CATEGORY I TEACHERS

Teachers in Category I have data directly attributable to them from the CGM for CMAS/TCAP Reading, Writing, and Math and/or ACCESS; teach students in one or more content areas that are assessed on state summative assessments; and also have data attributable to them from district and/or classroom assessments. CGM data are individually attributable for a Category I teacher because the CGM measures change in performance of students during the year the teacher is directly instructing those students.

The following table and explanation outline a proposed method to use multiple measures to divide and weight the portion of the evaluation based on student growth for the teacher. The method entails using both required measures and choice measures to ensure a level of consistency as well as flexibility, given the unique needs and improvement focus of each school.

Measures Used for Student Growth

Type of Measure | Attribution | Measure | Weight
CGM MGP | Individual | MGP(s) in most applicable area(s) | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | TBD by Teacher/Evaluator at the grade, department, or school level | 10%
Other SLO(s) Teacher Discretion | Individual or Collective | TBD by Teacher/Evaluator | 30%

Explanation of Growth Measures*

1) CGM MGP: The most technically sound growth measure currently available is from the CGM. As a result, a required measure for Category I teachers will be the inclusion of the MGP of his/her students in the most applicable content area(s) as an individual attribution measure. For example, a Math teacher's results would include the Math MGP of his/her students. The weight of this measure is 10%.
2) School Focus SPF Measure(s) Tied to UIP: A principal, in consultation with his/her teachers, is required to choose one or more measures from the SPF to include in the identified educator's growth calculation (e.g., Math Achievement (%P&A), ACCESS Growth (MGP), Growth of IEP students in Reading (MGP), or Composite ACT score) as a collective attribution measure. In some schools, this may be the same measure for all educators. In other schools, the principal may differentiate this measure based on the various roles that educators play in that building. For example, it may make sense for an entire elementary school to include writing growth from the SPF in everyone's evaluation; at a comprehensive high school, this may not be the case. The weight for this measure is 10%. So, if a teacher chooses two other measures from the SPF, the combined total of those measures' weights must be 10%. As an example, a teacher could choose both Math Achievement with a weight of 5% and ACCESS Growth MGP with a weight of 5%.

3) Other SLO(s), Teacher Discretion: A teacher, in consultation with his/her evaluator, must choose two or more other measures as part of his or her growth calculation, as either individual or collective attribution measures. These measures should align with the school's goal of increasing student achievement/growth and be comprised of well-constructed, sound assessments. The weight for these measures must total 30%. For example, a teacher could choose to use Fall-Winter MAP Math growth with a weight of 10%, a District Writing Assessment at 15%, and a science inquiry assessment at 5%, which overall would total 30%.

*The total combined weight of all the measures from #1-3 above must total 50%.
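To make the weighting concrete, here is a hypothetical sketch of how a Category I teacher's measures might roll up into a single MSL result. The 0-to-1 score scale, the rescaling by the 50% total weight, and the measure names are all illustrative assumptions made for this sketch; APEX's actual scoring rules are defined by the district, not by this code.

```python
# Hypothetical sketch (not district tooling): combining one Category I
# teacher's weighted measures into a single Measures of Student Learning
# result. Scores are assumed to be normalized 0.0-1.0 for illustration.

from typing import NamedTuple

class Measure(NamedTuple):
    name: str
    weight: float  # share of the full evaluation; all weights must total 0.50
    score: float   # normalized result on this measure, 0.0-1.0

def msl_result(measures):
    total_weight = sum(m.weight for m in measures)
    if round(total_weight, 2) != 0.50:
        raise ValueError(f"MSL weights must total 50%, got {total_weight:.0%}")
    # Weighted average, rescaled so a perfect score on every measure is 1.0.
    return sum(m.weight * m.score for m in measures) / total_weight

# Example mirroring the Category I structure: 10% CGM + 10% SPF + 30% SLOs.
measures = [
    Measure("Math MGP (CGM)", 0.10, 0.80),
    Measure("School SPF measure", 0.10, 0.60),
    Measure("SLO results", 0.30, 0.70),
]
print(f"{msl_result(measures):.2f}")  # prints 0.70
```

The explicit weight check mirrors the handbook's rule that the combined measures must total exactly 50%, no more and no less.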

The following examples are merely illustrations for a variety of teacher types. Not every measure would be the same for every individual teacher belonging to that type.

Example 1: Category I Middle School Math Teacher
Type of Measure | Attribution | Measure | Weight
CGM MGP | Individual | Teacher's TCAP Math MGP | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | Math Achievement for Grade 7 students | 10%
Other SLO(s) Teacher Discretion | Individual | Semester Grade 7 Math Test - % of students w/ rubric score 3+ | 15%
 | Individual | NWEA Math MAP Growth - % of students meeting growth projection | 15%

Example 2: Category I 4th Grade Teacher
Type of Measure | Attribution | Measure | Weight
CGM MGP | Individual | Teacher's TCAP Reading MGP | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | School's ACCESS MGP | 5%
 | Individual | Teacher's TCAP Math MGP | 5%
Other SLO(s) Teacher Discretion | Individual | Teacher's Language Use MAP growth - % meeting growth target | 10%
 | Collective | School's % of students scoring 12+ on district writing assessment | 10%
 | Individual | School-developed 4th grade summative reading proficiency assessment | 10%

Example 3: Category I ESL Teacher
Type of Measure | Attribution | Measure | Weight
CGM MGP | Individual | Teacher's ACCESS MGP | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | School's TCAP Reading MGP | 5%
 | Collective | School's Writing TCAP MGP for ELLs | 5%
Other SLO(s) Teacher Discretion | Collective | Percentage of students exiting ELL as FEP | 15%
 | Individual | English Literacy Development portfolio growth measure | 15%

CATEGORY II TEACHERS

Teachers in Category II have data indirectly (collectively) attributable to them from CMAS/TCAP Reading, Writing, Math, Science, or other state summative assessments, and also have data directly attributable to them from district and/or classroom assessments. The data from the CGM and state summative assessments are not directly attributable to Category II teachers because the assessments measure the contribution of multiple teachers to the performance of students over time. For example, for an 8th grade Science teacher, the data on the 8th grade Science CMAS/TCAP would be collectively attributable, as that teacher instructed those students the year of the test but not the prior two years upon which the test is also based. For a 6th grade Science teacher at the same school, the data would be collectively attributable because, while the teacher did not instruct those students the year of the test, he or she contributed in the past and/or during the current year through collaborative work within the Science department.

The following table and explanation outline a proposed method to use multiple measures to divide and weight the portion of the evaluation based on student growth for the teacher. The method entails using both required measures and choice measures to ensure a level of consistency as well as flexibility, given the unique needs and improvement focus of each school.
Measures Used for Student Growth

Type of Measure | Attribution | Measure | Weight
CGM or Summative Assessment | Collective | TCAP MGP, %P&A (TCAP), and/or avg. scale score (ACT) | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | TBD by Teacher/Evaluator at the grade, department, or school level | 10%
Other SLO(s) Teacher Discretion | Individual or Collective | TBD by Teacher/Evaluator | 30%

Explanation of Growth Measures*

1) CGM or Summative Assessment: TCAP and other state summative assessments are valid measures of student achievement, and Category II teachers teach in content areas where these tests are used as part of school accountability. As a result, a required measure for Category II teachers will be the inclusion of CGM or state summative assessment data for students in the most applicable content area(s) as a collective attribution measure. For example, a 2nd grade teacher's results could include the school's Reading MGP or the %P&A on Reading TCAP. The weight of this measure is 10%.

2) School Focus SPF Measure(s) Tied to UIP: A teacher, in consultation with his/her evaluator, is required to choose one or more measures from the SPF to include in the growth calculation (e.g., Math Achievement (%P&A), ACCESS Growth (MGP), Growth of IEP students in Reading (MGP), or Composite ACT score) as a collective attribution measure. The weight of this measure is 10%. So, if a teacher chooses two other measures from the SPF, the combined total of those measures' weights must be 10%.

3) Other SLO(s), Teacher Discretion: A teacher, in consultation with his/her evaluator, must choose two or more other measures as part of his or her growth calculation, at least one of which must be an individual attribution measure. These measures should align with the school's goal of increasing student achievement/growth and be comprised of reliable, valid assessments. The weight for these measures must total 30%.
For example, a social studies teacher could choose to use a common grade-level social studies summative assessment with a weight of 15% and a document-based social studies performance task at 15%, which overall would total 30%.

*The total combined weight of all the measures from #1-3 above must total 50%, no more and no less.

The following examples are merely illustrations for a variety of teacher types. Not every measure would be the same for every individual teacher belonging to that type.

Example 1: Category II 1st Grade Teacher
Type of Measure | Attribution | Measure | Weight
CGM or Summative Assessment | Collective | School's %P&A on Reading TCAP | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | School's Writing MGP | 10%
Other SLO(s) Teacher Discretion | Individual | Teacher's Math MAP growth - % meeting growth target | 10%
 | Collective | School's % of students scoring 12+ on district writing assessment | 10%
 | Collective | 1st grade Reading MAP growth - % meeting growth target | 10%

Example 2: Category II 7th Grade Science Teacher
Type of Measure | Attribution | Measure | Weight
CGM or Summative Assessment | Collective | School's TCAP Science %P&A | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | School's TCAP Reading MGP | 5%
 | Collective | School's ACCESS MGP | 5%
Other SLO(s) Teacher Discretion | Individual | Teacher's data from approved 7th grade common assessment - % scoring 3+ on rubric | 15%
 | Individual | Teacher's data from common district inquiry assessment - % meeting proficiency benchmark | 15%

Example 3: Category II 11th Grade English Teacher
Type of Measure | Attribution | Measure | Weight
CGM or Summative Assessment | Collective | School's ACT Score - Reading | 10%
School Focus SPF Measure(s) Tied to UIP | Collective | School's TCAP Reading MGP for minority students | 10%
Other SLO(s) Teacher Discretion | Individual | English AP test - % of students scoring 3+ | 15%
 | Collective | School's % of students scoring 12+ on district writing assessment | 15%

CATEGORY III TEACHERS

Teachers in Category III do not teach in content areas assessed by state summative assessments; they have only data directly attributable to them from district and/or classroom assessments. Further, special services providers may or may not provide instruction to students tied to content standards at all. The following table and explanation outline a proposed method to use multiple measures to divide and weight the portion of the evaluation based on student growth for the teacher or specialized service provider. The method entails using both required measures and optional measures to ensure a level of consistency as well as flexibility, given the unique needs and improvement focus of each school. For every Category III teacher, at least one Other SLO measure must be individually attributable to meet the rules regarding individual and collective attribution.

Measures Used for Student Growth

Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | TBD by Teacher/Evaluator at the grade, department, or school level | 20%*
Other SLO(s) Teacher Discretion | Individual or Collective | TBD by Teacher/Evaluator | 30%

*For special services providers whose work cannot be tied directly to SPF measures, the full 50% of the measures of student learning will come from SLOs.

Explanation of Growth Measures*

1) School Focus SPF Measure(s) Tied to UIP: A teacher, in consultation with his/her evaluator, is required to choose one or more measures from the SPF (unless the specialized service provider's work cannot be directly tied to SPF measures) to include in the growth calculation (e.g., Math Achievement (%P&A), ACCESS Growth (MGP), Growth of IEP students in Reading (MGP), or Composite ACT score) as a collective attribution measure. The weight for this measure is 20%. So, if a teacher chooses two measures from the SPF, the combined total of those measures' weights must be 20%.

2) Other SLO(s), Teacher Discretion: A teacher, in consultation with his/her evaluator, must choose two or more other measures as part of his or her growth calculation, at least one of which must be an individual attribution measure. These measures should align with the school's goal of increasing student achievement/growth and be comprised of reliable, valid assessments. The weight for these measures must total 30%. For example, a teacher could choose to use a common content area assessment approved through the school and district as well as an AP test administered to students taking that exam, both weighted at 15%, for a total of 30%.

*The total combined weight of all the measures from #1-2 above must total 50%, no more and no less.

The following examples are merely illustrations for a variety of teacher types. Not every measure would be the same for every individual teacher belonging to that type.

Example 1: Category III Elementary PE Teacher
Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | School's Reading MGP | 10%
 | Collective | School's ACCESS MGP | 10%
Other SLO(s) Teacher Discretion | Individual | Teacher's data from approved PE common assessment - % scoring 3+ on rubric | 20%
 | Individual | Teacher's data - % of students improving overall fitness score on national fitness measure | 10%

Example 2: Category III Middle School Computer Science Teacher
Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | School's Math MGP | 20%
Other SLO(s) Teacher Discretion | Individual | Teacher's data from approved Technology common assessment for 6th grade - % scoring 3+ on rubric | 15%
 | Individual | Teacher's data from approved Technology common assessment for 8th grade - % scoring 3+ on rubric | 15%

Example 3: Category III High School Music Teacher
Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | School's TCAP Writing MGP | 10%
 | Collective | School's graduation rate | 10%
Other SLO(s) Teacher Discretion | Individual | Music Theory AP test - % of students scoring 3+ | 15%
 | Individual | Percentage of students scoring 3 or higher at outside music competitions through CHSAA | 15%

Example 4: Category III High School Counselor
Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | School's Graduation Rate | 10%
 | Collective | School's Dropout Rate | 10%
Other SLO(s) Teacher Discretion | Individual | Post-secondary planning completion rate for juniors | 15%
 | Individual | Social/emotional counseling survey measure for support groups | 15%

Example 5: Category III School Nurse
Type of Measure | Attribution | Measure | Weight
School Focus SPF Measure(s) Tied to UIP | Collective | Not applicable, as there is no logical link to SPF data | 0%
Other SLO(s) Teacher Discretion | Individual | School attendance rate | 25%
 | Individual | Immunization compliance rate | 25%

Student Learning Objective/Outcome (SLO) Handbook

Student Learning Objectives Handbook Table of Contents

Definition and Structure of an SLO
Rationale for Using SLOs
SLO Timeline
SLO Process
Step 1: Create the Learning Goal
Step 2: Define Assessments and Scoring
Step 3: Establish Performance Targets
Step 4: Score the SLO
SLO Handbook Resources

Definition and Structure of an SLO

An SLO is a measure of an educator's impact on student learning or outcomes within a given interval of instruction. An SLO is a measurable, long-term goal, informed by available data, that a team of educators develops collaboratively along with an administrator at the beginning of a period of instruction (typically the beginning of the school year) for all students or for subgroups of students. The educator and students work toward the SLO growth targets throughout the period of instruction and use interim, benchmark, and formative assessments to measure progress toward the goal. Ultimately, summative assessment(s) are administered, or outcome measures are identified, to provide data about students' attainment of the SLO. At the end of the instructional period, the educator meets with an evaluator to discuss attainment of the SLO and determine the educator's impact on student learning or outcomes.

Research indicates that the act of setting rigorous goals to improve student and school outcomes, combined with the purposeful use of data, results in greater academic growth and impact on outcomes. Specifically, the process:
- Encourages educators to implement systematic, strategic practice;
- Encourages reflection and examination of practice and data;
- Provides educators opportunities to demonstrate growth; and
- Leads to improved practice.

There are three primary components to an SLO. The components provide clarity about what learning is expected (significant content, processes, and skills students will demonstrate), how student achievement will be measured, and how attainment of the expected learning will be evaluated.

1) Learning Goal: A description of what students will be able to do at the end of the instructional period, based on course- or grade-level content standards. The learning goal should convey the big idea(s), along with associated standards, and allow for a demonstration of rigorous, deep understanding of student learning of content standards.

2) Assessments* and Scoring: Assessments should be standards-based, of high quality, and designed to best measure the knowledge and skills specified in the learning goal of the SLO. This does not necessarily mean a paper-and-pencil test; other methods of assessment, such as performance tasks or portfolios, may be more appropriate depending upon the scope and content of the learning goal. Additionally, the assessments should be accompanied by clear scoring criteria or rubrics to consistently evaluate student learning.

*For specialized service providers, other outcome measures such as graduation rates, attendance rates, or other quantitative measures may be more appropriate given the educator's role and responsibilities.

3) Performance Targets: Identify the expected outcomes by the end of the instructional period for the student population (or subpopulation, where appropriate). Performance targets are based on a thorough review of baseline data to establish realistic expectations for student learning or outcomes over the instructional period measured.

Rationale for Using SLOs

There are a number of reasons SLOs were chosen for inclusion in APEX.

1) SLOs mirror effective educational practice. Setting goals for students utilizing baseline data, using data from formative assessments to evaluate student progress, and adjusting instruction based upon that progress are all part of effective teaching. All of these steps should be occurring during day-to-day practice in classrooms. Further, SLOs help underscore the importance of using growth targets to inform judgments about the impact of instruction and the formal measurement of student learning or outcomes.

2) SLOs reinforce the professional expertise of educators. The SLO process allows educators to have input on how student learning will be measured and factored into the evaluation process. The process also allows educators to focus on the objectives that are most relevant for their student population and content areas and that provide a clear, measurable connection to instruction or job performance.

3) SLOs provide flexibility and are comprehensive. State assessments measure only limited content areas and grade levels, but SLOs are not dependent upon the availability of standardized assessment scores. Further, SLOs can draw upon different data sources such as end-of-course exams, performance-based assessments, or district- or team-created assessments. SLOs are highly adaptable and are more closely tied to the instructional priorities of the district, school, or individual classroom.

4) SLOs promote professional collaboration. Developing strong, effective SLOs requires collaboration among teams of educators and administrators: writing and critiquing quality learning targets; selecting or creating aligned, rigorous assessments or outcome measures; and establishing high, yet attainable, targets based on baseline student achievement or performance.
Throughout the remainder of this document, the term "teacher" will be used for both teachers and specialized service providers to avoid repetitiveness. The basic process of developing an SLO is the same for both groups of educators.

SLO Timeline

The timeline to develop, review, monitor, and score SLOs will be very similar to the overall timeline for the professional practices portion of the educator evaluation system in Adams 12. SLOs will be developed near the beginning of the school year, monitored throughout the school year as assessments are given and more data become available, and then finalized and scored at the end of the school year. A basic overview of the timeline can be found below.

Timeframe | Activity
August through October | Gather and review baseline data and develop SLOs with colleagues and evaluator
End of October | Deadline for SLO approval by evaluator
November through mid-year conference | Gather formative and summative data to gauge students' achievement of SLO target(s)
Mid-year conference | Discuss current assessment data with evaluator and finalize/score any 1st semester SLOs (if applicable)
February through end-of-year conference | Continue to gather formative and summative data to gauge students' achievement of remaining SLO target(s)
End-of-year conference | Present assessment data to evaluator to finalize/score all SLOs

SLO Process

Developing an effective SLO involves a considerable amount of reflection and planning; however, the type of planning required, as mentioned earlier, is the foundation for effective instruction and student learning. To facilitate development of SLOs, samples of effective SLOs are incorporated into each step of the process outline, and a variety of artifacts, including a rubric to evaluate the effectiveness of an SLO, are included at the end. The SLO process comprises the following steps.

SLO Template

To document the SLO, teachers will use a common SLO template (pictured below and located at the end of the SLO section). Throughout the remainder of the document, as each step of the SLO process is further outlined and explained, examples for each step will be provided using the SLO template. Complete SLO samples are available at the end of the SLO section.

SLO Title:
Teacher:         Evaluator:
School Year:     Subject/Grade/Course:

Component | Indicator | Description
Learning Goal | Essential Content & Standards |
 | Rationale |
 | Student Population |
 | Instructional Period |
Assessments and Scoring | Assessment(s) |
 | Scoring and Results |
Performance Targets | Baseline Data |
 | Growth Targets |
 | Rationale |
Summative Data | Final Results |
 | Final SLO Rating (0-3) |

Teacher Signature:      Date:
Evaluator Signature:    Date:

General Questions and Answers about SLOs

Do all teachers and SSPs have to create SLOs?
Yes. In APEX, all teachers and SSPs will develop SLOs. SLOs tie the use of assessments or outcome measures and data analysis most closely to the educational process. In fact, SLOs will carry significant weight in the measures of student learning portion of the Adams 12 Educator Effectiveness System for all teachers.

Is there a difference between SLOs developed by teachers and SLOs developed by SSPs?
Yes, to some degree. While teachers will align the learning goal of their SLOs to meaningful content and standards they are teaching, many SSPs will align their learning goal to an important student outcome they can affect. Further, teachers will use assessments aligned to their learning goal to measure students' learning; SSPs will select an outcome measure that aligns with the outcome they are trying to affect.

Can an SLO be developed collaboratively by a group of teachers/SSPs?
Yes. In fact, developing SLOs collaboratively with other teachers or SSPs is highly recommended. Professional dialogue related to establishing learning goals, assessments/outcome measures, and data analysis related to targets will most likely result in the development of stronger SLOs and improved professional practice.

If teachers are co-teaching a class, can they collaboratively develop an SLO for that class?
Yes. This would be a very reasonable thing to do. Both teachers would then be responsible for providing instruction to meet all students' needs and would share equally in the SLO outcome.

Can an SLO be developed by an individual teacher/SSP?
Yes. However, it is preferable to develop SLOs collaboratively. If a content area or course is taught by only one teacher in a school, it is still recommended that the teacher have a colleague at the school or district level review the content and provide feedback prior to the approval process.

How many SLOs must each teacher/SSP develop?
Teachers and SSPs must create at least two SLOs, but may choose to create more. For the school year, the only measures included in the MSL portion of APEX are SLOs; having only one SLO would put an inappropriate amount of weight on that measure. Further, because the process of creating a quality SLO will be time-consuming at first, it is recommended that teachers and SSPs not create more than four SLOs, as monitoring data for each SLO could become challenging.

Do I have to create an SLO for every course or subject I teach?
No. Only two SLOs must be developed. Some teachers teach three or more grades/courses or content areas each year. In a case such as this, when determining which courses to develop SLOs for, teachers should prioritize courses with the greatest population of students to best sample student growth and achievement.

Do I have to create each SLO, or will there be some pre-generated SLOs from which to choose?
A few pre-generated SLOs, in limited content areas, will be available at the beginning of the school year. Teachers and SSPs can select these SLOs if they so choose, or modify them to meet their specific needs. Over time, more quality SLOs will be developed by groups of teachers and SSPs and made available to educators across the district.

Who approves the SLOs?
SLOs will be approved by a teacher's or SSP's evaluator.

Can the same SLO be used across multiple years?
Yes. A teacher could use the same SLO in consecutive school years, provided that it has been approved by his or her evaluator.

Step 1: Create the Learning Goal

Determining the learning goal is the foundation upon which an SLO is built. It is impossible to measure student performance or growth without knowing what is being assessed, why it is important to assess, who is being assessed, and when the assessment will occur. Adequately defining a learning goal requires that a teacher consider the following indicators.

A) Determine the essential content and standards addressed
This section of the SLO should articulate the specific concepts or skills, aligned to standards, which students will gain during a course or class. The content or skill areas should represent the essential content of the course or class, such as key skills, concepts, or overarching knowledge. In addition, the specific standard(s) that align with the skills and knowledge should be identified.

B) Provide a rationale for why the essential content and standards were chosen
The rationale provides a clear description of the importance of the selected content and standards, including a justification for why the learning goal was chosen. The rationale should describe how the learning goal represents cognitively demanding, critical content needed in the current instructional period and/or in preparation for future grade levels or courses. Referencing the rigor of the content against Webb's Depth of Knowledge (DOK) or Hess's Cognitive Rigor Matrix helps to provide context for the cognitive demand of the learning goal and to ground the development or choice of aligned assessments.

C) Identify the student population included
The student population to which the SLO will apply must be defined. A teacher's SLO should include as many students as possible to most adequately sample the impact on all students and improve the generalizability of results. The student population will be based on the content areas, grade levels, and/or courses the teacher instructs.
When creating SLOs, teachers should consider the number and type of classes taught to determine the most appropriate SLOs to develop. If a teacher instructs three classes of English 9 and two classes of AP Language and Composition, it would be reasonable to write one SLO for English 9 with all three classes pooled together and one SLO for AP Language and Composition with both classes pooled together. Teachers who teach a large number of courses or content areas may have to select some over others, with the selection guided by the overall number of students taught in each course/content area, the instructional focus of the school, and the other measures already included in measures of student learning, such as summative assessment data or SPF data.

D) Define the instructional period
The instructional period is the time period during which the educator expects learning to occur. This time period typically ranges from six weeks to a full year. Ultimately, the interval of instruction should allow adequate time for the expected growth to occur given the breadth and scope of the learning goal and the time the teacher has to work with students.

Samples of Learning Goals

Elementary Literacy
- Learning Goal / Essential Content & Standards: Students will comprehend non-fiction texts and multimedia sources to identify and summarize the main idea and explain how details support the main idea. Students will then write an opinion paper and support their claim with information derived from those sources. Standards assessed:
  - R.2.a.i: Refer to details and examples in a text when explaining what the text says explicitly and when drawing inferences from the text.
  - R.2.a.ii: Determine the main idea of a text and explain how it is supported by key details; summarize the text.
  - R.2.d: By the end of year, read and comprehend informational texts, including history/social studies, science, and technical texts, in the grades 4-5 text complexity band proficiently, with scaffolding as needed at the high end of the range.
  - RR.b.iv: Read for key ideas, take notes, and organize information read (using a graphic organizer).
  - W.1.a: Write opinion pieces on topics or texts, supporting a point of view with reasons and information.
  - W.1.a.i: Introduce a topic or text clearly, state an opinion, and create an organizational structure in which related ideas are grouped to support the writer's purpose.
  - W.1.a.ii: Provide reasons that are supported by facts and details.
  - W.1.a.iii: Link opinion and reasons using words and phrases.
  - W.1.a.iv: Provide a concluding statement or section related to the opinion presented.
- Rationale: Being able to gather information, including text-based evidence, from a variety of appropriate grade-level and above-grade-level resources and develop a written argument is a rigorous task that cuts across all content areas. Further, this type of task directly aligns with expectations on the Literacy CMAS PARCC assessment. This content requires Depth of Knowledge 3, indicative of strategic thinking.
- Student Population: All students enrolled in my 4th grade literacy class (27 students).
- Instructional Period: The instructional period runs from August through November, as we are covering opinion writing.

Middle School Math
- Learning Goal / Essential Content & Standards: Students will demonstrate a thorough knowledge of 6th grade content standards. Students will demonstrate the ability to independently solve problems ranging from procedural tasks, to problems integrating multiple mathematical concepts, to applications of content in new and varied contexts. Students must be able to demonstrate their reasoning clearly through problem solving, written explanations, and procedural fluency. Standards assessed:
  1) Connecting ratio and rate to whole number multiplication and division and using ratio and rate to solve problems.
  2) Completing understanding of division of fractions and extending the notion of number to the system of rational numbers.
  3) Writing, interpreting, and using expressions and equations.
  4) Developing an understanding of statistical thinking.
- Rationale: Rigor is ensured on this SLO by using an assessment that incorporates complex problems requiring true application of multiple mathematical concepts in novel contexts not experienced in prior instruction, aligned to at least the application level and Depth of Knowledge 3.
- Student Population: All students enrolled in my 6th grade core, comprised of 3 regular classes (94 students) and 1 honors class (28 students).
- Instructional Period: The instructional period is the entire school year. The summative assessment is administered in early May.

High School US History
- Learning Goal / Essential Content & Standards: Students will independently use primary and secondary sources to explain and analyze current civics and/or political issues. Students will do so through a written and oral argument that reflects an accurate and in-depth characterization of the relationship between the contemporary issue and historical precedent. Standards assessed (Standard 1: History):
  1. Use the historical method of inquiry to ask questions, evaluate primary and secondary sources, critically analyze and interpret data, and develop interpretations defended by evidence.
  2. Analyze key historical periods and patterns of change over time within and across nations and cultures.
- Rationale: The learning goal allows for significant cognitive depth and complexity. Students must use primary and secondary sources from a historical period to analyze factors affecting events of that period and then make coherent connections to current events and how similar relationships exist today. Students must make a logical, compelling argument both in writing and verbally. All of these skills will transfer to later social science classes as well as across all other content areas. These skills align with Depth of Knowledge 3 and 4.
- Student Population: All students enrolled in each of four year-long US History sections (121 students total).
- Instructional Period: The instructional period is the entire school year, leading up to the submission of the paper and oral presentation in late April.

High School Drama
- Learning Goal / Essential Content & Standards: Students will develop a character as part of a one-act ensemble play that showcases vocal characteristics and techniques, posture, and movement to convey the physical, social, and psychological dimensions of a character. Standards assessed:
  - Create: Character development in improvised and scripted works; creation, appreciation, and interpretation of scripted works.
  - Perform: Drama and theatre techniques, dramatic forms, performance styles, and theatrical conventions that engage audiences.
  - Critically Respond: Elements of drama, dramatic forms, performance styles, dramatic techniques, and conventions.
- Rationale: The learning goal encompasses more than merely memorizing and delivering lines in a production, which is most entering students' conception. Students have to delve into the motivation and psychology of the character to truly bring that character to life. Additionally, the instructional practices used to accomplish the learning goal incorporate student self- and peer-assessment using a rubric, which requires significant reflection and rigor. Strategic thinking is required, equivalent to DOK 3.
- Student Population: All students enrolled in two semester-long Drama I classes, one class 1st semester and one class 2nd semester (59 students total).
- Instructional Period: The instructional period will encompass the entire semester for each class. Analysis for the first semester class will be complete in January, and in May for the second semester class.

Questions and Answers about Learning Goals

Should the essential content of the learning goal be aligned primarily to course curriculum or state standards?
Because course curricula should be aligned to state standards, there is little practical difference. For the sake of consistency, the Colorado Academic Standards should be referenced. The exception would be a content area such as science, where our district has adopted the Next Generation Science Standards. For SSPs, the learning goal may not represent standards so much as an important outcome, such as improving immunization rates at a school (school nurse) or improving students' ability to demonstrate appropriate course credit progression in high school (counselor).

Should the essential content of the learning goal cover multiple standards or just one?
The focus of essential content is on rigorous big ideas vital to the current grade/course as well as future academic work. Consequently, the essential content should cover a significant portion, if not the entirety, of the current grade/course, and will therefore likely cover multiple standards. As mentioned in the previous question, SSPs should target a meaningful outcome for students to which they can directly contribute.

Can the essential content of the learning goal cover practice standards, such as the mathematical practice standards that are part of the Common Core Math Standards?
Yes. Many of the mathematical practice standards lend themselves to measuring meaningful content at a high cognitive level, which is an important component of writing a strong SLO.

Can the essential content of the learning goal for a special education teacher be organized around students meeting IEP goals?
Yes. So long as the IEP goals are tied to meaningful content standards and are cognitively demanding, an SLO may be structured around students with IEPs meeting their IEP goals.

How do I determine the student population?
Typically, the student population will include all students enrolled in the class(es) or course(s) targeted by the SLO. For SSPs, this will depend upon the outcome chosen, but should include the students the SSP works with or the entire student population.

How do I determine the student population if I teach 450 students over the course of a year?
Teachers who instruct a large number of students, such as specials teachers in elementary schools, should write SLOs targeted at specific grades/courses that best represent the school as a whole. Trying to assess and track data for a huge population could easily become overwhelming for a teacher. For example, an elementary art teacher may choose to develop SLOs specifically for 2nd and 4th grade only.

Can students with disabilities or English Language Learners (ELLs) be excluded from the student population?
No. Difficulty in achieving targets is not a reason to exclude any subgroup of students. If students demonstrate different initial levels of achievement upon entering a course, those differences can be accounted for by establishing differentiated targets, which are discussed later in this handbook. The only exception would be students with a significant cognitive disability in a mainstream course who are being taught to different standards than those covered by the SLO.

What if a student joins my class late in the year?
Students must be enrolled in a grade/course for at least 85% of the instructional period to have their data included in the SLO process.

What if students in the population are absent frequently?
Assuming a student meets the criterion of being enrolled in a grade/course for at least 85% of the instructional period, the student must be included if his or her attendance rate for that instructional period is 85% or more.

Is there a minimum number of students that should be in my SLO?
Ideally, an SLO should have at least 15 students in the student population. Because some teachers and SSPs may not work directly with that number of students, the minimum of 15 students can be waived by the teacher and evaluator when the SLO is approved.

Is the instructional period one curriculum unit or the entire school year?
Typically, the instructional period will be a semester or the entire year, but it should be an extended period over which students can master the content of the SLO and demonstrate adequate academic growth. Covering significant essential content, as described above, requires a significant period of instruction, ongoing student practice, and multiple opportunities for feedback.

When should the instructional period end?
The instructional period should end when the culminating summative assessment is administered and scored.
Can the instructional period for one SLO be 1st semester and the instructional period for a second SLO be 2nd semester?
Yes. There is no requirement that the SLOs of a teacher or SSP run concurrently. Teachers/SSPs and their evaluators have flexibility to schedule SLOs at any time within the school year, so long as they are completed by early May in order to meet the timeline for finalizing evaluations.
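The two 85% inclusion rules described above (enrollment for at least 85% of the instructional period, and an attendance rate of at least 85% over that period) can be sketched as a simple check. This is an illustration only; the function and parameter names are invented, not part of APEX or any district system, and it assumes both rates are measured against the full instructional period.

```python
def include_in_slo(days_enrolled: int, days_present: int, instructional_days: int) -> bool:
    """Illustrative check of the two inclusion rules: a student's data count
    toward the SLO only if the student was enrolled for at least 85% of the
    instructional period AND attended at least 85% of it (assumption: both
    rates are taken relative to the full instructional period)."""
    enrolled_enough = days_enrolled >= 0.85 * instructional_days
    present_enough = days_present >= 0.85 * instructional_days
    return enrolled_enough and present_enough

# A 90-day instructional period: both rules require at least 76.5 days.
print(include_in_slo(80, 78, 90))  # enrolled 80 and present 78 both clear the bar
print(include_in_slo(70, 70, 90))  # enrolled only 70 days, so the data are excluded
```

In practice the handbook leaves attendance measurement to the teacher and evaluator; the sketch simply makes the two thresholds explicit.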

Step 2: Define Assessments and Scoring

The second phase of developing an SLO is to select or create an appropriate assessment or assessments. This can often be challenging, but it is vital to the SLO process. Without an assessment directly aligned to the content specified in the learning goal, there is no way to measure the achievement or growth of students and, consequently, the instructional impact of the teacher. The learning goal should be established first and the assessment second. When examined out of context, an assessment may be well constructed and appropriate for a given content area or grade level, but for it to be of any use in the SLO process, it must align clearly and completely with the learning goal, without extraneous content or skills being measured.

The District will begin to develop a bank of district-administered assessments that could be considered for use in SLOs, such as the MAP assessment, assessments developed to align with units of study, and other measures taken directly from the School Performance Framework (SPF). Over time, the District will add other assessments to the bank as content coordinators and teams of teachers develop well-aligned measures of the standards through the SLO process.

If teachers do not wish to use an assessment from the bank of district-administered assessments, the District recommends that teachers use an assessment created collaboratively by a team of teachers for the SLO process. Creating an assessment collaboratively is more likely to improve assessment quality and will facilitate professional growth for the team involved as a result of discussions about appropriate content, the relative importance of knowledge and skills, and common understanding of student expectations.
If a teacher must create an assessment that is unique to his or her classroom, the District strongly recommends that the teacher have the assessment reviewed by an educator at the district level, other school staff members, or colleagues who teach in the same content area at other schools in the district.

When creating or selecting an appropriate assessment, the items or tasks on the assessment should align closely with the learning goal in terms of both content and cognitive demand. Assessments do not need to be pencil-and-paper tests; they can be performance-based assessments as well. Educators are encouraged to select the assessment(s) most appropriate for measuring student achievement and growth aligned to the learning goal. If an assessment requiring teacher judgment as part of the scoring process is selected or created, a rubric must be developed and, preferably, evaluated for effectiveness by multiple teachers using student work samples.

Assessment options could include:
- Traditional paper-and-pencil tests made up of selected-response or constructed-response questions;
- Performance-based assessments, such as presentations, projects, and tasks scored with a rubric;
- Portfolios of student work scored using a rubric; and
- A combination of the methods listed above.

** A set of questions to consider when selecting or creating an assessment can be found at the end of the SLO section. Answering these questions will help to improve the reliability and validity of the results of the assessment for the purpose of evaluating the impact of instruction on student achievement and growth.

Samples of Assessment and Scoring

Elementary Literacy Assessments and Scoring
- Assessment(s): The culminating assessment used will be the TCRWP 4th grade Informational Reading/Argument Writing Performance Assessment (Draft), along with its accompanying rubric. The assessment requires students to watch a multimedia clip and read two appropriate texts. While doing these activities, students are prompted to complete four tasks directly aligned to the learning goal, with the final task requiring the students to write an argument essay based on the texts and video clip. The test will be given by all 4th grade teachers in the school in mid-November. There will be two formative assessments (the initial formative serving as a pre-assessment) similar to the culminating assessment, one at the end of August and one in early October.
- Scoring and Results: The same rubric used for the culminating assessment will be used to score the formative assessments. Scoring meetings will be held after each formative assessment, during which collaborative scoring will be used to improve the accuracy and consistency of scoring using the common rubric. Culminating assessments will be scored by the classroom teacher. Each teacher will bring a representative sample of 4-8 papers for dual scoring at a team scoring meeting.

Middle School Math Assessments and Scoring
- Assessment(s): The summative assessment used will be the common district-developed 6th grade summative assessment, comprised of both selected-response and constructed-response items. The assessment covers all content in the 6th grade Colorado Academic Standards and is purposefully aligned in proportion to the Type I, Type II, and Type III tasks required on the PARCC Math assessments for 6th grade. The test will be given by all 6th grade math teachers at our school.
- Scoring and Results: Selected-response items will be scored using the answer key and assigned a value of 1 point each. Constructed-response items will be scored using a 3-point or 4-point rubric, depending upon the item. Prior to scoring their own assessments, the 6th grade math team will pull a sample of assessments and conduct blind, dual scoring of those assessments. Results will be compared across teachers, and expectations will be refined with the blind-scored papers serving as anchor papers. Teachers will then score the remainder of their own students' assessments. For scoring purposes, the regular sections will carry 3 times the weight of the honors section.

High School US History Assessments and Scoring
- Assessment(s): The summative assessment is comprised of two parts. To complete both parts, each student will be provided a packet of primary and secondary sources presenting pro and con opinions of US involvement in international military conflicts throughout US history up to the present day. Students will have a day to read and annotate the packet in class and will then be given a week to address Parts 1 and 2. Part 1: Students will compose a written argument, either pro or con, given a hypothetical scenario in which the US could intervene in a conflict between Jerusalem and a group of other Middle Eastern states. Part 2: Students will create a 5-7 minute oral presentation incorporating the use of technology to present their argument to the class.
- Scoring and Results: Students will receive two scores for the assessment: one for the essay and one for the presentation. Each portion of the assessment will be scored using a rubric collaboratively designed by the school's US History common course team. For the essay, blind, dual scoring of 5% of the essays will help to triangulate teachers' expectations. For the oral presentation, a sample of 5 recorded presentations from the prior year will be collaboratively rated and discussed by all US History teachers prior to scoring current students' presentations. For scoring purposes, the essay and the oral presentation will receive the same weight.

High School Drama Assessments and Scoring
- Assessment(s): The summative assessment is a performance task based on each student's performance in an ensemble one-act play. To accommodate all students, multiple teacher-selected one-act plays will be utilized. The plays will be held early in the month before the end of each semester. Students were given a tryout role in the spring of their prior year. Additionally, students will be given a formative assessment during the first two weeks of class, comprised of a single scene to prepare and act out with a partner. Each scene presentation will be videotaped and critiqued by the class and myself, using the rubric.
- Scoring and Results: Students will be scored using the rubric developed by a group of theatre teachers, which will be used throughout the course by both the students and me. Each performance will be videotaped, allowing me multiple opportunities to view the performance to enhance scoring. I will have another theatre teacher blind score the performances to check for accuracy and consistency. Finally, to promote student reflection, students will also watch and rate recordings of their performances during class. Student ratings will be reviewed as an additional check on my own ratings. Students with average preparedness will receive twice the weight of the other categories because this group is at least twice as large as the other groups.

Questions and Answers about Assessments and Scoring

Do I need to use pre- and post-assessments in my SLO?
No. Pre- and post-assessments are not required; however, the use of a pre-assessment can provide useful baseline data for students when they begin the instructional period, particularly if the pre- and post-assessments are comparable in terms of content coverage and difficulty.

Who scores the assessments?
Due to logistical demands, teachers will score their own assessments. Whenever possible, putting specific scoring practices such as blind scoring, collaborative scoring, or double scoring in place will help ensure that scoring is done as accurately and consistently as possible. These practices are defined in the question below. Ultimately, it is expected that educators will use their professional judgment to score assessments with integrity.

If I score my own assessments, how do I know if I'm scoring accurately?
The use of the following practices, either independently or in combination, will help to ensure assessments are scored as accurately as possible.
- Blind scoring: All references to the identity of the student whose assessment is being scored are covered to prevent preconceived notions of student achievement, strengths, or weaknesses. Also, if the assessment is scored by two or more teachers, the scores of the other teachers are not known to the current rater.
- Collaborative scoring: Two or more teachers score the assessment of the same student at the same time and discuss ratings, either during scoring or after each teacher has scored the assessment. This process is most often completed at the beginning of the scoring process to help ensure inter-rater reliability and common expectations, though it can be conducted at any time.
- Double scoring: After one teacher has scored a student's assessment, a second teacher scores the same assessment. Again, this process is most often completed at the beginning of the scoring process to help ensure inter-rater reliability and common expectations, though it can be conducted at any time.

How do I know my assessment results are valid and reliable?
Only extensive research can provide concrete evidence of the reliability and validity of assessment results for a particular purpose. However, if you have developed your assessment collaboratively, considered the common assessment development questions referenced at the end of this handbook, and put appropriate scoring practices in place, you have taken strong first steps toward improving the technical quality of your assessment.

Should students get accommodations on assessments used in SLOs?
Students who receive accommodations during classroom instruction and assessments should also receive the same accommodations on the assessments used in the SLO process, assuming the accommodation does not alter what the assessment is attempting to measure. Generally, students who receive accommodations have an educational plan (e.g., IEP, 504, Language Acquisition) in place outlining the accommodations for which they are eligible.

Can I use more than one summative assessment in the SLO?
Yes. More than one summative assessment can be used so long as the assessments are tied directly to the learning goal of the SLO and meet the other criteria outlined in this section. If multiple summative assessments are used in an SLO, consideration must be given to how they will be weighted to arrive at one overall set of data to compare to a performance target. At times it will make sense to weight the assessments equally, but at other times one assessment may carry more weight.
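The weighting decisions described above (equal weights for the US History essay and presentation, or regular sections carrying 3 times the weight of an honors section) amount to a weighted average. The sketch below is a hypothetical illustration, not a district formula; the scores and weights are invented.

```python
def combine_results(results, weights):
    """Combine per-assessment (or per-section) percent results into one
    overall figure using teacher-chosen weights. Hypothetical illustration
    only; weights are normalized, so they need not sum to 1."""
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(results, weights)) / total_weight

# Essay and oral presentation weighted equally, as in the US History sample:
print(combine_results([60.0, 70.0], [1, 1]))  # 65.0

# Regular sections weighted 3x the honors section, as in the math sample:
print(combine_results([58.0, 82.0], [3, 1]))  # 64.0
```

The second call shows why weighting matters: a simple unweighted average of 58 and 82 would overstate the contribution of the smaller honors group.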

Step 3: Establish Performance Target(s)

Ensuring that the assessments used for SLOs are well constructed and aligned to the learning goal helps teachers get an accurate picture of what students know, understand, and can do at the end of an instructional period. To gauge the effectiveness of instruction, however, performance targets must be established prior to instruction to determine what level of achievement or growth should be expected based on students' levels of achievement entering the course. In practice, establishing performance targets is the most difficult aspect of creating an SLO, particularly during the first years of implementation. As more baseline data and summative data become available over time, establishing performance targets will become more precise. To adequately establish performance targets for a teacher's students, or a subgroup of those students, the following indicators must be addressed.

A) Determine baseline data for students
In order to assess the extent to which students' learning progressed over the instructional period, it is imperative that teachers also have an accurate picture of where their students began. An important component of the SLO process, therefore, is collecting evidence of what students already know and understand and the types of skills they already possess; in other words, determining their starting points, or baseline data. Knowing students' starting points lets teachers set performance targets that are both ambitious and feasible for the students in their class, and factoring those starting points into SLOs enables teachers and evaluators to more accurately determine the amount of progress students made during the instructional period. In order to make this determination, teachers should collect multiple forms of evidence, using their professional judgment to decide which types of information would be helpful in determining students' starting points.
Common sources of evidence include:
- Results from a beginning-of-year pre-test or performance tasks (e.g., a common course pre-test, the first interim assessment);
- Results from prior-year tests that assess knowledge and skills that are prerequisites to the current course/grade;
- Results from tests in other subjects, including teacher- or school-generated tests and state tests such as TCAP or CMAS, as long as the test assessed prerequisite knowledge and skills (e.g., a physics teacher may want to examine results of students' prior math assessments); and
- Students' grades in previous classes, though teachers should ensure they understand the grading methodology used.

Baseline data could be established for the entire group of students by listing summary scores, such as average scores or the percentage of students meeting a proficiency target. This approach is most applicable for fairly homogeneous groups of students. Most groups of students, however, have heterogeneous achievement backgrounds. For this type of student population, baseline data should be established by clustering students into groups based on past or current evidence. For example, students could be classified into the following subgroups:
- Low level of preparedness: students who have yet to master prerequisite knowledge or skills needed for the current grade level/course;
- Average level of preparedness: students who are appropriately prepared to meet the demands of the current grade level/course; and
- High level of preparedness: students who start the course having already mastered some key knowledge or skills of the current grade level/course.

Teachers should use as much information as needed to help identify student starting points. It is rare to find a single assessment or previous grade that provides enough information to determine a student's starting point.
Instead, by using multiple measures of student achievement, teachers will establish a more complete picture of the level of their class as a whole or, more likely, of subgroups of students.
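As one way to picture the low/average/high grouping described above, the sketch below averages several baseline measures and applies cutoffs. Every number here (the measures, the averaging rule, the cutoffs of 40 and 80) is invented for illustration; in APEX, the grouping rests on the teacher's professional judgment across multiple measures, not a fixed formula.

```python
def preparedness_tier(baseline_scores, low_cut=40.0, high_cut=80.0):
    """Classify a student from multiple baseline measures (e.g., a pre-test
    percent score and a prior-year assessment percent score) by averaging
    them and comparing the average to hypothetical cutoffs."""
    avg = sum(baseline_scores) / len(baseline_scores)
    if avg < low_cut:
        return "low"      # prerequisite knowledge/skills not yet mastered
    if avg < high_cut:
        return "average"  # appropriately prepared for the grade/course
    return "high"         # some key knowledge/skills already mastered

print(preparedness_tier([35.0, 42.0]))  # average 38.5 -> "low"
print(preparedness_tier([70.0, 88.0]))  # average 79.0 -> "average"
```

A plain average is only one defensible combination; a teacher might instead weight a course-specific pre-test more heavily than a general prior-year score.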

B) Develop the SLO growth target(s)
Using the baseline data as a starting point, a specific growth/achievement target for the overall student population or for subpopulations must be developed based on the assessment(s) used. These targets should include specific indicators of achievement, such as the percentage of students meeting a proficiency level, growth in the percentage of points earned, or another appropriate metric based on the assessment(s) used. The target may apply equally to all students in the population or, as in most student populations, can be tiered for subgroups to allow all students to demonstrate growth, as in the example above where students were classified as having a low, average, or high level of preparedness. Regardless of whether a single target is set for the entire student population or differentiated targets are established, the targets should be rigorous yet attainable, as determined by the baseline or pre-test data. For every target set, four performance levels should be established so that the target can be scored once summative data become available. The four performance level categories are Much Less than Expected, Less than Expected, Expected, and More than Expected. The tables below provide examples of how targets could be established, one for the entire population as a whole and one for subgroups.
Example 1: Median Growth Percentile Target for Math (all students)

  Much Less than Expected (0): 1st-39th percentile
  Less than Expected (1): 40th-49th percentile
  Expected (2): 50th-59th percentile
  More than Expected (3): 60th-99th percentile

Example 2: Percentage of Students Scoring Proficient on Summative Writing Assessment (subgroups)

  Tier          Much Less than Expected (0)  Less than Expected (1)  Expected (2)  More than Expected (3)
  Low Prep      0-34%                        35-50%                  51-65%        66-100%
  Average Prep  0-45%                        46-61%                  62-75%        76-100%
  High Prep     0-60%                        61-75%                  76-90%        91-100%

C) Explain the rationale for the growth target(s)

High-quality SLOs include logical justifications for why a growth target is appropriate and achievable for a student population or subgroup. The rationale should be a precise and concise explanation that references baseline data along with any other qualitative evidence that informed the creation of the target.
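To make the scoring mechanics of tiered targets concrete, the Python sketch below converts a subgroup's summative result into a 0-3 performance level using the Example 2 cut points. The dictionary of cut points simply transcribes the sample table; it is not a district-mandated scale.

```python
# Upper bounds (inclusive, in % of students proficient) for ratings 0, 1,
# and 2 in each tier; a result above the last bound earns a 3 (More than
# Expected). Values are transcribed from the Example 2 sample table.
CUT_POINTS = {
    "Low Prep": (34, 50, 65),
    "Average Prep": (45, 61, 75),
    "High Prep": (60, 75, 90),
}

def rate(tier, pct_proficient):
    """Return the 0-3 performance level for one subgroup's result."""
    for rating, upper in enumerate(CUT_POINTS[tier]):
        if pct_proficient <= upper:
            return rating
    return 3  # More than Expected

print(rate("Low Prep", 52))      # 2 (Expected)
print(rate("Average Prep", 80))  # 3 (More than Expected)
```

Any tiered target of the kind shown in Example 2 reduces to a lookup like this once the cut points are agreed upon between teacher and evaluator.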

Sample Performance Targets

Elementary Literacy Performance Targets

Baseline Data: Baseline data were obtained from the formative assessment at the end of August. 35% of students in my class earned an average rubric score of 3 or higher on the tasks using the rubric. These data aligned closely with students' prior year 3rd grade common district writing assessment results. The results are not significantly different than those of 4th grade classes the past two years.

Growth Targets: Based on the baseline data, the following rating scale will be used to rate this SLO on the percentage of students scoring at the same level at the end of October.

  % of students with a 3+ rubric average:
  Much Less than Expected (0): 1-35%
  Less than Expected (1): 36-54%
  Expected (2): 55-74%
  More than Expected (3): 75-100%

Rationale: The growth targets established take into consideration baseline data such that if baseline data are maintained, the lowest growth category is earned. The cutoff values for the other categories were established by averaging prior years' students' results on the summative assessment. Because the baseline data for this group are not significantly different than in prior years, averaging prior year results was reasonable.

Middle School Math Performance Targets

Baseline Data: Student baseline data were determined based on prior year TCAP results (%P&A) as well as the first unit assessment. Also, our 6th grade math team compared the current 6th grade class's data to prior years' data. Our current 6th graders are coming to us better prepared overall than the previous two years' classes. Based on those data, it became clear that setting differentiated targets for regular sections and honors sections would be appropriate, as the baseline data for students in the honors sections were much higher on average. The following rating scales were determined for both regular sections and honors sections.
Growth Targets: The target is based on the % of students obtaining a score of 80% or higher on the summative assessment.

  Tier              Much Less than Expected (0)  Less than Expected (1)  Expected (2)  More than Expected (3)
  Regular Sections  0-40%                        41-52%                  53-65%        66-100%
  Honors Sections   0-70%                        71-80%                  81-90%        91-100%

Rationale: The growth targets were established by looking at distributions of students' scores in prior years and increasing them slightly, because this year's 6th grade class overall is coming in better prepared based on TCAP data and first unit assessment results.

High School Social Studies Performance Targets

Baseline Data: The following sources of data were examined to determine students' overall baseline status: prior year TCAP reading and writing, prior year grades in social studies course(s), and initial assessment data from US history unit 1 assessments. Overall, students' written assessment data were not as strong as their performance on oral and presentation skills.

Growth Targets: Two growth targets are being set, one for the essay portion of the summative assessment and one for the oral presentation. Both targets are based on achieving an average overall rubric rating of 3 (proficient).

  Target        Much Less than Expected (0)  Less than Expected (1)  Expected (2)  More than Expected (3)
  Essay         0-30%                        31-50%                  51-70%        71-100%
  Presentation  0-20%                        21-55%                  56-80%        81-100%

Rationale: The targets are set somewhat higher for the presentation than the essay based on students' initial data as a starting point. Given that this is the first year this assessment will be given, comparison to prior essay and presentation results is not possible. However, students' prior year TCAP scores indicate that a threshold of 51% required to obtain a rating of Expected is reasonable.

High School Drama Performance Targets

Baseline Data: Baseline data are comprised primarily of the tryout and, for the 1st semester's students, the initial formative play performances, though students' performance on other in-class work factored into classification decisions as well. Based on each student's baseline status, students were classified into three baseline categories relative to the expectations of the learning goal: low preparation (15 students), average preparation (32 students), and high preparation (12 students).

Growth Targets: Performance targets are set according to the percentage of students in a group who receive an overall average rubric score of Meets Expectations, corresponding to a 3 on a 5-point scale.
The targets that have been established for each group follow.

  Tier          Much Less than Expected (0)  Less than Expected (1)  Expected (2)  More than Expected (3)
  Low Prep      0-25%                        26-44%                  45-60%        61-100%
  Average Prep  0-40%                        41-60%                  61-75%        76-100%
  High Prep     0-59%                        60-79%                  80-90%        91-100%

Rationale: Because the same summative assessment, rubric, and scoring procedures have been in place for the last two years, the targets for this year's classes (which may be adjusted once baseline data are available for the second semester's class) are set in relation to prior years' student results in comparison to their baseline characteristics. Overall, the baseline data of prior years' students were stronger. Consequently, targets were adjusted downward slightly to account for this discrepancy.

Questions and Answers about Performance Targets

What data sources can be used to gather baseline data?
A wide variety of assessments can be used to establish baseline data, so long as the results of the assessment can be aligned logically to the learning goal. This could include State or District summative assessments in the same content area or in a related content area, teacher-created assessments from prior years, grades in prerequisite courses, or pre-assessments from the current school year. Using multiple sources of data will provide the richest picture of students' current level of baseline achievement.

What if I teach in a content area where no formal State or District assessments exist?
The majority of teachers instruct students in content areas or grade levels not covered by State or District summative assessments. Teacher-created assessments from prior years or current year pre-assessments can be used to establish baseline data. It is important to note, though, that State or District summative assessments in other content areas can often be beneficial in establishing baseline data. For example, a social studies teacher can use State assessment data related to reading and writing, or a physics teacher could use summative math data to help define student baselines.

Can data that are two or three years old be used in establishing baseline data?
Yes. Utilizing performance data from multiple years can provide valuable information. Data over time can help remove some year-to-year instability inherent in assessment data. Further, trend data over a period of years can provide information about recurring areas of strength or weakness in students' preparation, which could then be addressed in setting targets and devising effective instruction.

How long do I have to gather and analyze baseline data?
Baseline data can be gathered from the start of the school year until SLOs are approved at the end of October.
This allows time at the beginning of each school year for teachers to gather past data for their students as well as to administer pre-assessments at the beginning of the school year to build a more comprehensive body of baseline data.

Is it better to have an overall population growth target or tiered targets for subgroups of students based on baseline data?
Either growth target can be effective. The benefit of an overall population target is the relative simplicity of having one target to monitor. The major drawback is that setting one target may not allow all students, particularly low-performing or high-performing students, to demonstrate their learning. Conversely, setting differentiated targets allows students across the achievement continuum to demonstrate growth; however, having multiple targets makes data analysis more challenging. Based on practice in other states and districts, differentiated targets are more common than overall targets, given that diverse student populations are common.

At what point can a teacher revise his or her growth targets?
In most cases, growth targets should not be revised once they have been established. However, if there is considerable turnover or mobility in the population incorporated into an SLO, and that mobility results in significant changes to baseline data, the teacher could have a conversation with his or her evaluator about what type of reasonable adjustment should be made to the growth target(s).

Why include a rationale for growth targets?
The rationale serves the purpose of providing a concise explanation of how the baseline data and growth targets are directly aligned and reasonable.

Step 4: Score the SLO

Once summative assessments have been administered and data related to the target(s) have been compiled, the SLO may be scored and a final rating established. To score the SLO, the teacher must meet with the evaluator to present summative assessment or outcome data aligned directly to the performance targets established when the SLO was approved. For example, if performance targets were established using the metric of percentage of students meeting a specific benchmark score, the teacher must provide that percentage along with any supporting data to show how that percentage was calculated (e.g., a class list of individual student scores, any calculations necessary, etc.). Based on the summative data received, the evaluator will compare the data to the performance target(s) and score the SLO as More than Expected (3), Expected (2), Less than Expected (1), or Much Less than Expected (0). If one target was set for the entire student population, like Example 1 above, the process is simple. However, if differentiated targets have been established for multiple assessments or multiple subgroups, scoring will be more complex because averages will need to be calculated, as for Example 2 above. Another potential issue arises if multiple assessments are used with differing weights or multiple subgroups are used with differing weights. The following scenarios demonstrate how averaging and including differing weights could affect the scoring of the SLO from Example 2 above.

Scenario 1 - Multiple Groups with Equal Weights: Assume that each subgroup has been given the same weight and the following ratings were determined based on student results.

  Subgroup      Rating
  Low Prep      2
  Average Prep  1
  High Prep     2

To obtain the overall rating for the SLO, a simple average must be calculated. The average of those values would be 1.67, which rounds up to 2. The overall rating for the SLO would then be 2 (Expected).
Scenario 2 - Multiple Groups with Differing Weights: Assume that the Low Prep subgroup is given twice the weight of the Average Prep or High Prep groups because the number of students in the Low Prep group is much larger than the other groups. To obtain the overall rating for the SLO, a weighted average must be calculated based on the weights given to each group.

  Subgroup      Rating  Weight  Weighted Value
  Low Prep      2       2       4
  Average Prep  1       1       1
  High Prep     2       1       2

The weighted value is calculated by simply multiplying the rating by its weight. Finally, to calculate the weighted average, the sum of the weighted values (7) is divided by the sum of the weights (4). The weighted average would be 1.75, which rounds up to 2. The overall rating of the SLO would then be 2 (Expected).
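The arithmetic in both scenarios can be sketched in a few lines of Python. The subgroup ratings and weights are the ones given in the scenarios above; round-half-up is assumed for the final rating, matching the "rounds up to 2" language.

```python
import math

def overall_rating(ratings, weights=None):
    """Weighted average of subgroup ratings, rounded half-up to a 0-3 score."""
    if weights is None:
        weights = [1] * len(ratings)  # Scenario 1: equal weights
    avg = sum(r * w for r, w in zip(ratings, weights)) / sum(weights)
    return min(3, math.floor(avg + 0.5)), avg

# Scenario 1: Low Prep 2, Average Prep 1, High Prep 2, equal weights
rating, avg = overall_rating([2, 1, 2])
print(rating, round(avg, 2))  # 2 1.67

# Scenario 2: Low Prep weighted twice as heavily as the other groups
rating, avg = overall_rating([2, 1, 2], weights=[2, 1, 1])
print(rating, round(avg, 2))  # 2 1.75
```

Note that a weighted average with all weights equal to 1 reduces to the simple average, which is why one helper covers both scenarios.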

SLO Template, Rubric, Samples & Guiding Questions

SLO Template

SLO Title:
Teacher:                 School Year:
Evaluator:               Subject/Grade/Course:

  Component                Indicator                      Description
  Learning Goal            Essential Content & Standards
                           Rationale
                           Student Population
                           Instructional Period
  Assessments and Scoring  Assessment(s)
                           Scoring and Results
  Performance Targets      Baseline Data
                           Growth Targets
                           Rationale
  Summative Data           Final Results
                           Final SLO Rating (0-3)

Teacher Signature:       Date:
Evaluator Signature:     Date:

SLO Review Process Questions and Considerations

The school year will be the first year that the Measures of Student Learning (MSL) portion of APEX will be in place. Due to the requirements in HB passed and signed into law in May of 2015, the only measures that will be included in the school year MSL portion of APEX will be SLOs. SLOs will represent a new method to measure student learning for all educators in the Adams 12 Five Star School District. Consequently, there will be questions from both educators and evaluators about how best to create and evaluate the quality of an SLO. As evaluators and educators collaboratively review, critique, and approve SLOs, the outcome should be to develop a reasonable method to define, measure, and evaluate the extent to which students have mastered meaningful content and skills. The goal is not to create the perfect SLO, but rather to use professional judgment to map out a way to measure an educator's effectiveness in obtaining meaningful student outcomes. Because this will be the first year in which SLOs are used, it should be understood that the standard for developing an SLO should be one of reasonableness. If a team of educators has created an SLO that addresses the questions and considerations outlined below and provides a logical link between the learning goal, assessments and scoring, and performance targets, the threshold of reasonableness will have been met. As years pass and educators develop more experience with the SLO process, SLOs will improve and become better aligned, more rigorous, and more meaningful.

SLO Section: Learning Goal

Essential Content & Standards
Considerations: SLOs should cover meaningful standards and content rather than just easy-to-measure or trivial content.
The content of an SLO should lead students along a developmental continuum of content that will result in their being prepared for post-secondary success through strategies such as problem solving, information literacy, effective communication, and critical reasoning, regardless of grade level.
Questions:
- Why are the content and standards listed in the learning goal meaningful?
- Is this content preparing students, in an age-appropriate way, for post-secondary success? How do you know?

Rationale
Considerations: The rationale provides a clear description of the importance of the selected content and standards, including a justification about why the learning goal was chosen. Further, the rationale should describe how the content allows for high cognitive demand, allowing for at least some thinking at depth of knowledge (DOK) 3, strategic thinking.
Questions:
- Does the rationale paint a compelling picture of why the content and standards are worth measuring? Why?
- Which aspects of the content and standards require high cognitive demand, meaning at least Depth of Knowledge (DOK) 3, strategic thinking?

Student Population
Considerations: The population of an SLO should consist of all students to whom the essential content and standards are being taught, regardless of current level of achievement, demographic characteristics, or educational plan. It can be helpful for both the teacher and evaluator to delineate characteristics of students to provide a more complete picture of the population.

Question:
- Are all students and student groups being included appropriately in the SLO (e.g., ELLs, students with IEPs, low-performing students)?

Instructional Period
Considerations: There is no required minimum interval of instruction for an SLO, but for meaningful content and standards to be taught and learned, significant time will be needed. The broader and more complex the content and standards, the longer the instructional period is likely to be. Further, consideration needs to be given to the time teachers have with students, as this can vary significantly, from the whole year every day, to periodically for the whole year, to a shorter period of time.
Question:
- Is the instructional period long enough to allow students to show growth on the content and standards specified? Why?

SLO Section: Assessments and Scoring

Assessment(s)
Considerations: Alignment of the assessment to the learning goal is vital. There should be a meaningful link in terms of content and skills between each of the standards in the learning goal and the items/tasks on the assessment. Further, at least some of the items/tasks should measure a DOK of 3, in alignment with the learning goal. Not all items/tasks must be at DOK 3, though, as it is often important to present items/tasks at lower levels of DOK for students to demonstrate more basic understanding and to scaffold their understanding toward DOK 3 or 4. The team of teachers who created the SLO should be able to provide a meaningful explanation of the process and plan that was used to develop the assessment, including why certain types of items/tasks were chosen (e.g., selected response, constructed response, performance tasks, etc.) and why the team feels enough items/tasks were chosen to give an accurate picture of students' learning.
Questions:
- To what degree does the assessment align to the content and standards of the learning goal?
- To what degree does the assessment align to the cognitive demand of the learning goal?
Which items/tasks measure DOK 3 or 4?
- What types of items or tasks did you choose for this assessment? Why did you choose those types of items or tasks?
- Do you feel that you have enough items or tasks on this assessment to make accurate judgments about students' learning?

Scoring and Results
Considerations: There should be a method to arrive at an overall score for the assessment: points correct, an overall rubric score, etc. For assessments with constructed response or performance items/tasks, which will likely be virtually all assessments used in SLOs, quality rubrics should be provided as well. These rubrics should be clear and allow a competent scorer to make an accurate, consistent judgment about the performance of a student. In addition to the rubrics, there should be some indication of what procedures the team of teachers will use to score a sample of assessments collaboratively to promote professional dialogue about expectations for students and to enhance scoring accuracy and inter-rater reliability.
Questions:
- How will students' assessments be scored to receive an overall score?
- What rubric will you be using for this assessment? Did you create this rubric or adopt it?
- What practice have you had or will you have using this rubric to score student work?
- When scoring student work, what processes will you use to ensure that you and your colleagues are scoring consistently and accurately?

SLO Section: Performance Targets

Baseline Data
Considerations: An important part of establishing performance targets is to understand students' level of knowledge and skill at the beginning of the instructional period. There are often multiple sources of evidence to inform this understanding: prior year assessments/grades, current year pre-assessments, surveys, observational data, etc. Generally, the most compelling baseline data come from an assessment similar in nature to the SLO summative assessment. Teachers should be able to describe which sources of evidence they used to understand their students and set performance targets.
Questions:
- Are the baseline data gathered reasonable for the learning goal and assessment chosen?
- Have enough sources of data been gathered to allow for reasonable performance targets to be set?

Growth Targets
Considerations: Based on baseline data, performance targets must be set using a four-category scale for the overall population or subpopulations. The ranges in each category should match the overall score metric outlined in the assessments and scoring section earlier in the SLO. The decision to use a single scale or a tiered scale for multiple subgroups will depend on the baseline data for the student populations as well as teachers' desire for specificity. Ultimately, the targets should be set so that they are attainable for teachers yet still rigorous.
Questions:
- Have targets been set on a four-category rating scale?
- Do the ranges in each category match the overall score metric for the assessment (e.g., % of students earning a rubric score of 3 or more, the percentage of students earning a specific number of points, etc.)?
- Given the composition of the student population, is a single rating scale appropriate or would differentiated scales be more appropriate for growth target(s)? Why?
Rationale
Considerations: The rationale is intended to allow the teacher to explain why the performance targets are reasonable, given the baseline data and student population. Often, the rationale will specifically reference the baseline data as well as how the current cohort of students being taught compares to prior cohorts of students, which can provide valuable context for current performance targets.
Question:
- Given the baseline data presented and the rating scales used, is the rationale for the growth target reasonable? Why?

Additional Context for Performance Targets

Establishing high, yet realistic, performance targets for students is arguably the most challenging aspect in the creation of an SLO, particularly for an assessment that is being given for the first time. If the team of teachers creating the SLO is using reasonable baseline data that provide insight into performance related to the culminating assessment, sets plausible growth targets, and provides a thoughtful rationale for those growth targets as well as for why a single rating scale or differentiated rating scales were chosen, the team has exercised sound professional judgment in the development of its SLO. It is important to note that while the learning goal and assessments and scoring sections of an SLO for a team of teachers will be identical, the performance targets may vary based on the students each teacher is instructing. For example, if one teacher is teaching three classes of regular biology and another is teaching three classes of honors biology, the two teachers' performance targets most likely should vary based on students' initial levels of achievement.

Adams 12 Student Learning Objective (SLO) Quality Tool

The following quality tool is intended for teachers, special services providers, and school administrators in evaluating the quality of the three primary aspects of an SLO: Learning Goal, Assessments and Scoring, and Performance Targets. Effective use of this quality tool will help improve consistency of SLOs within and across schools.

Learning Goal: A description of what students will be able to do at the end of the course or grade, based on course- or grade-level content standards.
Acceptable Quality:
o Appropriately identifies and describes an important and meaningful learning goal. (Is the learning goal broad enough that it captures the major content of an extended instructional period, but focused enough that it clearly pertains to the course/subject/grade/students and can be measured? Consider Grain Size or the Goldilocks principle.)
o Standard(s) are clearly aligned to and measured by the learning goal
o There is a clear explanation of the depth of knowledge, with some items on the assessment reaching a DOK of 3 (rationale)
o A clear definition of the student population being measured, with all subgroups of students included
o A clear explanation of the instructional period

Assessments and Scoring: Assessments should be standards-based, of high quality*, and designed to best measure the knowledge/skills specified in the learning goal, including clear rubrics or scoring criteria.
Acceptable Quality:
o Appropriately identifies and clearly describes high quality assessment(s)
o Items are aligned to the content and standards of the learning goal
o Evidence to support how the appropriateness and quality of the assessments have been established
o Rubrics or scoring criteria appropriately differentiate student performance, including evidence to support that these rubrics/scoring criteria have been developed or validated
o Effective scoring procedures that will be utilized to promote accurate and consistent scoring

Performance Targets: Identify the expected outcomes by the end of the instructional period for the student population (or subpopulation, where appropriate).
Acceptable Quality:
o Clearly and thoroughly explains how the data are used to gauge teacher performance
o Appropriate baseline data/information used to establish and differentiate expected performance
o Growth targets include rigorous expectations that are realistic and attainable for students
o A compelling rationale is presented for how performance targets were set

* A high quality assessment is aligned to identified standards and depth of knowledge; utilizes items or tasks best suited to measure the content and skills measured; includes a rubric or scoring guide that allows for accurate and reliable scoring; yields data that accurately reflect student achievement; and is fair and unbiased.

Rubric adapted from Instructional Guide for Developing SLOs, part of the Center for Assessment's SLO Toolkit (2013), Center for Assessment.

APEX Student Learning Objective (SLO) Rubric

The following rubric is intended for teachers, specialized services providers, and school administrators in evaluating the quality of the three primary aspects of an SLO: Learning Goal, Assessments and Scoring, and Performance Targets. Effective use of this rubric will help improve consistency of SLOs within and across schools. For each aspect, the rubric describes three levels of quality: Acceptable Quality, Needs Improvement, and Insufficient Quality.

Learning Goal: A learning goal is a description of what students will be able to do at the end of the course or grade, based on course- or grade-level content standards.

Assessments and Scoring: Assessments should be standards-based, of high quality*, and designed to best measure the knowledge/skills specified in the learning goal, including clear rubrics or scoring criteria.

Performance Targets: The performance targets identify the expected outcomes to be achieved by the end of the instructional period for the student population (or subpopulation, where appropriate).
Acceptable Quality

Learning Goal: Appropriately identifies and describes an important and meaningful learning goal, with
o the big idea and the standard(s) clearly aligned to and measured by the learning goal
o a clear explanation of the critical content and nature of the learning goal for all students in the specific grade/course and thorough reference to deep understanding
o a clear definition of the student population being measured
o a clear definition of the instructional period

Assessments and Scoring: Appropriately identifies and clearly describes
o high quality assessment(s), with evidence to support how the appropriateness and quality of the assessments have been established
o rubrics or scoring criteria that appropriately differentiate student performance, with evidence to support that these rubrics/scoring criteria have been developed or validated
o effective scoring procedures that will be utilized to promote accurate and consistent scoring

Performance Targets: Clearly and thoroughly explains how the data are used to gauge teacher performance, including
o appropriate baseline data/information used to establish and differentiate expected performance
o rigorous expectations that are realistic and attainable for students
o a compelling rationale for how performance targets were set

Needs Improvement

Learning Goal: Generally identifies and describes a learning goal, with
o the big idea and the standard(s) somewhat aligned to the learning goal
o some explanation of the importance of the learning goal for all students in the specific grade/course, with limited reference to deep understanding
o a vague/generalized definition of the student population being measured
o a vague/generalized definition of the instructional period

Assessments and Scoring: Identifies and provides some description, which may lack specificity, of the
o assessment(s), with limited evidence to support how the appropriateness and quality of the assessments have been established
o rubrics or scoring criteria that partially differentiate student performance, with little evidence to support how the rubrics/scoring criteria have been developed or validated
o questionable scoring procedures listed with respect to promoting accurate and consistent scoring

Performance Targets: Broadly explains how the data are used to gauge teacher performance, including
o some baseline data/information used to establish and differentiate expected performance
o imprecise expectations that are somewhat realistic and/or attainable for students
o a general rationale for how performance targets were set, which may lack specificity

Insufficient Quality

Learning Goal: Identifies and describes a learning goal that is vague, trivial, or unessential, with
o the big idea and/or standards not well aligned to the learning goal
o a lack of information on the importance of the learning goal for students in the specific grade/course and no reference to deep understanding
o little or no definition of the student population being measured
o little or no definition of the instructional period being measured

Assessments and Scoring: Identifies and provides an unclear, insufficient, or confusing description of the
o assessment(s), with little or no reference to how the appropriateness and quality of the assessments have been established
o rubrics or scoring criteria that insufficiently differentiate student performance, with minimal or no evidence to support how the rubrics/scoring procedures have been developed or validated
o scoring procedures that may not be listed or provide no assurance of accurate scoring

Performance Targets: Provides an unclear, insufficient, or confusing explanation of how the data are used to define teacher performance, and may include
o no baseline data/information, or use of irrelevant information, to establish and differentiate expected performance
o low or unreasonable expectations for students
o a vague rationale for how performance targets were set, with little logical relationship to baseline data

* A high quality assessment has been determined to be aligned to identified standards and depth of knowledge, has a rubric or scoring guide that allows for reliable scoring, and is fair and unbiased.
Rubric adapted from Instructional Guide for Developing SLOs, part of the Center for Assessment's SLO Toolkit (2013), Center for Assessment.

Sample SLOs

SLO Title: Non-fiction text and opinion writing
Teacher:                 School Year:
Evaluator:               Subject/Grade/Course: 4th Grade Literacy

Learning Goal

Essential Content & Standards: Students will comprehend non-fiction texts and multimedia sources to identify and summarize the main idea and explain how details support the main idea. Students will then write an opinion paper and support their claim with information derived from those sources.
Standards assessed:
o R.2.a.i - Refer to details and examples in a text when explaining what the text says explicitly and when drawing inferences from the text
o R.2.a.ii - Determine the main idea of a text and explain how it is supported by key details; summarize the text
o R.2.d - By the end of year, read and comprehend informational texts, including history/social studies, science, and technical texts, in the grades 4-5 text complexity band proficiently, with scaffolding as needed at the high end of the range
o RR.b.iv - Read for key ideas, take notes, and organize information read (using a graphic organizer)
o W.1.a - Write opinion pieces on topics or texts, supporting a point of view with reasons and information
o W.1.a.i - Introduce a topic or text clearly, state an opinion, and create an organizational structure in which related ideas are grouped to support the writer's purpose
o W.1.a.ii - Provide reasons that are supported by facts and details
o W.1.a.iii - Link opinion and reasoning words and phrases
o W.1.a.iv - Provide a concluding statement or section related to the opinion presented

Rationale: Being able to gather information, including text-based evidence, from a variety of appropriate grade-level and above grade-level resources and develop a written argument is a rigorous task that cuts across all content areas.
Further, this type of task directly aligns with expectations on the Literacy CMAS PARCC assessment. This content requires Depth of Knowledge 3 indicative of strategic thinking. All students enrolled in my 4 th grade literacy class 27 students. The instructional period runs from August through November as we are covering opinion writing. The culminating assessment used will be the TCRWP 4 th grade Informational Reading/Argument Writing Performance Assessment (Draft ) along with its accompanying rubric. The assessment requires students to watch a multimedia clip and read two appropriate texts. While doing these activities students are prompted to complete four tasks directly aligned to the Learning Goal, with the final task requiring the students to write an argument essay based on the texts and video clip. The test will be given by all 4 th grade teachers in the school in mid-november. There will be two formative assessments (the initial formative serving as a pre-assessment) similar to the culminating assessment (one at the end of August and one in early October). The same rubric for the culminating assessment will be used to score the formative assessments. Scoring meetings will be held after each formative assessment during which collaborative scoring will be used to improve accuracy and consistency of scoring using the common rubric. Culminating assessments will be scored by the classroom teacher. Each teacher will bring a representative sample of 4-8 papers for dual scoring at a team scoring meeting. Baseline data were obtained from the formative assessment at the end of August. 35% of students in my class earned an average rubric score of 3 or higher on the tasks using the rubric. These data aligned closely with students prior year 3 rd grade common district writing assessment results. The results are not significantly different than 4 th grade classes the past two years. 
Based on the baseline data, following rating scale will be used to rate this SLO on the percentage of students scoring at the same level at the end of October % of students 1-35% 36-54% 55-74% % with a 3+ rubric average Rationale The growth targets established take into consideration baseline data such that if baseline data are maintained, the lowest growth category is earned. The cutoff values for the other categories were established by averaging prior years students results on the summative assessment. Because the baseline data for this group is not significantly different than in prior years, averaging prior year results was reasonable. Revised 8/12/15 56
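The rating-scale logic in the growth targets above can be sketched as a small helper (a hypothetical illustration, not part of APEX or RANDA; the band edges are this sample SLO's, not a district-wide rule):

```python
def slo_rating(pct_meeting_target):
    """Map the percentage of students earning a 3+ rubric average
    to an SLO rating, using the sample SLO's bands."""
    if pct_meeting_target <= 35:
        return "Much Less than Expected"
    elif pct_meeting_target <= 54:
        return "Less than Expected"
    elif pct_meeting_target <= 74:
        return "Expected"
    else:
        return "More than Expected"

# Baseline was 35% of students at a 3+ rubric average, so merely
# holding steady earns the lowest category:
print(slo_rating(35))  # Much Less than Expected
print(slo_rating(60))  # Expected
```

This mirrors the stated design intent: maintaining baseline lands in the lowest category, and each higher band requires genuine growth past the cutoffs averaged from prior years.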

SLO Title: 6th Grade Math
Teacher:    School Year:    Evaluator:
Subject/Grade/Course: 6th Grade Math

Learning Goal

Essential Content & Standards: Students will demonstrate a thorough knowledge of 6th grade content standards. Students will demonstrate the ability to independently solve problems ranging from procedural tasks, to problems integrating multiple mathematical concepts, to applications of content in new and varied contexts. Students must be able to demonstrate their reasoning clearly through problem solving, written explanations, and procedural fluency.
Standards assessed:
1) Connecting ratio and rate to whole number multiplication and division, and using ratio and rate to solve problems.
2) Complete understanding of division of fractions, and extending the notion of number to the system of rational numbers.
3) Writing, interpreting, and using expressions and equations.
4) Developing understanding of statistical thinking.

Rationale: Rigor is ensured on this SLO by using an assessment that incorporates complex problems requiring true application of multiple mathematical concepts in novel contexts not experienced in prior instruction, aligned to at least application and Depth of Knowledge 3.

Student Population: All students enrolled in my 6th grade core, comprised of 3 regular classes (94 students) and 1 honors class (28 students).

Assessments and Scoring

Instructional Period: The instructional period is the entire school year. The summative assessment is administered in early May.

Assessment(s): The summative assessment used will be the common district-developed 6th grade summative assessment, comprised of both selected-response and constructed-response items. The assessment covers all content in the 6th grade Colorado Academic Standards and is purposefully aligned in proportion to the Type I, Type II, and Type III tasks required on the PARCC Math assessments for 6th grade. The test will be given by all 6th grade Math teachers at our school.

Scoring and Results: Selected-response items will be scored using the answer key and assigned a value of 1 point each. Constructed-response items will be scored using a 3-point or 4-point rubric, depending upon the item. Prior to scoring their own assessments, the 6th grade math team will pull a sample of assessments and conduct blind, dual scoring of those assessments. Results will be compared across teachers, and refinement of expectations and common scores will be determined with the blind-scored papers as anchor papers. Teachers will then score the remainder of their own students' assessments. For scoring purposes, the regular sections will carry 3 times the weight of the honors section.

Performance Targets

Baseline Data: Student baseline data were determined based on prior year TCAP results (%P&A) as well as the first unit assessment. Also, our 6th grade math team compared the current 6th grade class's data to prior years' data. Our current 6th graders are coming to us better prepared overall than the previous two years' classes. Based on those data, it became clear that setting differentiated targets for regular sections and honors sections would be appropriate, as the baseline data for students in the honors section were much higher on average.

Growth Targets: The following rating scales were determined for regular sections and the honors section. The target is based on the % of students obtaining a score of 80% or higher on the summative assessment:
  Regular Sections - Much Less than Expected: 0-40%; Less than Expected: 41-52%; Expected: 53-65%; More than Expected: 66-100%
  Honors Section - Much Less than Expected: 0-70%; Less than Expected: 71-80%; Expected: 81-90%; More than Expected: 91-100%

Rationale: The growth targets were established by looking at distributions of students' scores in prior years and increasing them slightly, because this year's 6th grade class overall is coming in better prepared based on TCAP data and first unit assessment results.
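One plausible reading of the 3:1 weighting statement above is that the regular sections' result counts three times as much as the honors section's when the two differentiated targets are rolled into a single SLO rating. A minimal sketch under that assumption (the rounding rule is also an illustrative assumption, not district policy):

```python
def combined_slo_rating(regular_points, honors_points,
                        regular_weight=3, honors_weight=1):
    """Weighted combination of the regular-section and honors-section
    ratings, each on the 0-3 MSL point scale, with regular sections
    carrying three times the weight of the honors section."""
    total_weight = regular_weight + honors_weight
    weighted = (regular_points * regular_weight
                + honors_points * honors_weight) / total_weight
    return round(weighted)

# Regular sections rated Expected (2), honors More than Expected (3):
print(combined_slo_rating(2, 3))  # 2
```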

SLO Title: Historical-Modern Written and Oral Analysis
Teacher:    School Year:    Evaluator:
Subject/Grade/Course: US History

Learning Goal

Essential Content & Standards: Students will independently use primary and secondary sources to explain and analyze current civics and/or political issues. Students will do so through the use of a written and oral argument that reflects an accurate and in-depth characterization of the relationship between the contemporary issue and historical precedent.
Standards assessed: Standard 1: History
1. Use the historical method of inquiry to ask questions, evaluate primary and secondary sources, critically analyze and interpret data, and develop interpretations defended by evidence.
2. Analyze key historical periods and patterns of change over time within and across nations and cultures.

Rationale: The learning goal allows for significant cognitive depth and complexity. Students must use primary and secondary sources from a historical period to analyze factors affecting events of that period, and then make coherent connections to current events and how similar relationships exist today. Students must make a logical, compelling argument both in writing and verbally. All these skills will transfer to later social sciences classes as well as across all other content areas. These skills align with Depth of Knowledge 3 and 4.

Student Population: All students enrolled in each of four year-long US History sections, a total of 121 students.

Assessments and Scoring

Instructional Period: The instructional period is the entire school year, leading up to the submission of the paper and oral presentation in late April.

Assessment(s): The summative assessment is comprised of two parts. To complete both parts, the student will be provided a packet of primary and secondary sources related to pro and con opinions of US involvement in international military conflicts throughout US history up to the present day. Students will have a day to read and annotate the packet in class. Students will then be given a week to address Parts 1 and 2. Part 1: Students will compose a written argument, either pro or con, given a hypothetical scenario in which the US could intervene in a conflict between Jerusalem and a group of other Middle Eastern states. Part 2: Students will create a 5-7 minute oral presentation incorporating the use of technology to present their argument to the class.

Scoring and Results: Students will receive two scores for the assessment: one for the essay and one for the presentation. Each portion of the assessment will be scored using a rubric collaboratively designed by the school's US History common course team. For the essay, blind, dual scoring of 5% of the essays will help to triangulate teachers' expectations. For the oral presentation, a sample of 5 recorded presentations from the prior year will be collaboratively rated and discussed by all US History teachers prior to scoring current students' presentations. For scoring purposes, the essay and oral presentation will receive the same weight.

Performance Targets

Baseline Data: The following sources of data were examined to determine students' overall baseline status: prior year TCAP reading and writing, prior year grades in social studies course(s), and initial assessment data from US History unit 1 assessments. Overall, students' written assessment data were not as strong as their performance on oral and presentation skills.

Growth Targets: Two growth targets are being set: one for the essay portion of the summative assessment and one for the oral presentation. Both targets are based on achieving an average overall rubric rating of 3 (proficient):
  Essay - Much Less than Expected: 0-30%; Less than Expected: 31-50%; Expected: 51-70%; More than Expected: 71-100%
  Presentation - Much Less than Expected: 0-20%; Less than Expected: 21-55%; Expected: 56-80%; More than Expected: 81-100%

Rationale: The targets are set somewhat higher for the presentation than the essay, based on students' initial data as a starting point. Given that this is the first year this assessment will be given, comparison to prior essay and presentation results is not possible. However, students' prior year TCAP scores indicate that the threshold of 51% required to obtain a rating of Expected is reasonable.

SLO Title: Character Development and Portrayal
Evaluator:
Subject/Grade/Course: Drama I

Learning Goal

Essential Content & Standards: Students will develop a character as part of a one-act ensemble play that showcases vocal characteristics and techniques, posture, and movement to convey the physical, social, and psychological dimensions of a character.
Standards assessed:
Create: Character development in improvised and scripted works; creation, appreciation, and interpretation of scripted works.
Perform: Drama and theatre techniques, dramatic forms, performance styles, and theatrical conventions that engage audiences.
Critically Respond: Elements of drama, dramatic forms, performance styles, dramatic techniques, and conventions.

Rationale: The learning goal encompasses more than merely memorizing and delivering lines in a production, which is most entering students' conception. Students have to delve into the motivation and psychology of the character to truly bring that character to life. Additionally, instructional practices used to accomplish the learning goal incorporate student self- and peer-assessment using a rubric, which requires significant reflection and rigor. Strategic thinking is required, equivalent to DOK 3.

Student Population: All students enrolled in two semester-long Drama I classes, one class 1st semester and one class 2nd semester, 59 students total.

Assessments and Scoring

Instructional Period: The instructional period will encompass the entire semester for each class. Analysis will be complete in January for the first semester class and in May for the second semester class.

Assessment(s): The summative assessment is a performance task based on each student's performance in an ensemble one-act play. To accommodate all students, multiple teacher-selected one-act plays will be utilized. The plays will be held early in the month before the end of each semester. Students were given a tryout role in the spring of their prior year. Additionally, students will be given a formative assessment during the 1st two weeks of class, comprised of a single scene to prepare and act out with a partner. Each scene presentation will be videotaped and critiqued by the class and me, using the rubric.

Scoring and Results: Students will be scored using the rubric developed by a group of theatre teachers, which will be used throughout the course by both the students and me. Each performance will be videotaped. This will allow me multiple opportunities to view the performance to enhance scoring. I will have another theatre teacher blind score the performances to check for accuracy and consistency. Finally, to promote student reflection, students will also watch and rate recordings of their performances during class. Student ratings will be reviewed as an additional check on my own ratings.

Performance Targets

Baseline Data: Baseline data are comprised primarily of the tryout and, for the 1st semester's students, the initial formative play performances, though performance on other in-class work factored into classification decisions as well. Based on each student's baseline status, students were classified into three baseline categories relative to the expectations of the learning goal: low preparation (15 students), average preparation (32 students), and high preparation (12 students).

Growth Targets: Performance targets are set according to the percentage of students in a group who receive an overall average rubric score of Meets Expectations, corresponding to a 3 on a 5-point scale. Students with average preparedness will receive twice the weight of the other categories, because this group is at least twice as large as the other groups. The targets that have been established for each group follow:
  Low Prep - Much Less than Expected: 0-25%; Less than Expected: 26-44%; Expected: 45-60%; More than Expected: 61-100%
  Average Prep - Much Less than Expected: 0-40%; Less than Expected: 41-60%; Expected: 61-75%; More than Expected: 76-100%
  High Prep - Much Less than Expected: 0-59%; Less than Expected: 60-79%; Expected: 80-90%; More than Expected: 91-100%

Rationale: Because the same summative assessment, rubric, and scoring procedures have been in place for the last two years, the targets for this year's classes (which may be adjusted once baseline data are available for the second semester's class) are set in relation to prior years' student results in comparison to their baseline characteristics. Overall, the baseline data of prior years' students were stronger. Consequently, targets were adjusted downward slightly to account for this discrepancy.

Questions to Consider for Assessment Development*

Content
- How well do the items/tasks/criteria align to appropriate standards, curriculum, and essential outcomes for the grade level or course?
- In what ways would mastering or applying the identified content be considered essential for students learning this subject at this grade level?
- How do the content, skills, and/or concepts assessed by the items or tasks provide students with knowledge, skills, and understandings that are (1) essential for success in the next grade/course or in subsequent fields of study, or (2) otherwise of high value beyond the course?

Rigor
- In what ways do the items/tasks and criteria address appropriately challenging content?
- To what extent do the items/tasks require appropriate critical thinking and application?
- How do the items/tasks ask students to analyze, create, and/or apply their knowledge and skills to situations or problems where they must apply multiple skills and concepts?

Format
- To what extent are the items/tasks and criteria designed such that student responses/scores will identify students' levels of knowledge, understanding, and/or mastery?

Results
- When will the results be made available to the educator? (The results must be available to the educator prior to finalizing their effectiveness rating.)

Fairness
- To what extent are items or tasks free from words and knowledge that are characteristic to particular ethnicities, subcultures, and genders?
- To what extent are appropriate accommodations available and provided to students as needed?

Reliability
- Is there a sufficient number of items/tasks for each important, culminating, overarching skill?
- Do open-ended questions/performance tasks have rubrics that (1) clearly articulate what students are expected to know and do, and (2) differentiate between levels of knowledge/mastery to ensure inter-rater reliability?

Scoring
- To what extent does scoring give appropriate weight to the essential aspects?
- If the test includes selected-response items (multiple choice, true/false, etc.) as well as performance tasks, what method will be used to consistently combine those data into an overall score?

*Adapted from School/Student Learning Objectives, Developmental Pilot, Wisconsin Department of Public Instruction.

=== End SLO Handbook ===

Professional Practice Ratings and MSL Ratings

Professional Practices Ratings
The ratings system used in APEX comes directly from the Colorado State Model Evaluation System. Professional Practice Quality Standards for all educators are evaluated using a rubric, which is available in the online tracking platform, RANDA.

Teacher & Special Service Professional Quality Standards
I. Content Knowledge
II. Establish Classroom Environment
III. Facilitate Learning
IV. Reflect on Practice
V. Demonstrate Leadership

Principal Quality Standards
I. Strategic Leadership
II. Instructional Leadership
III. School Culture & Equity Leadership
IV. Human Resource Leadership
V. Managerial Leadership
VI. Development Leadership

Each Quality Standard is comprised of Elements, which are scored individually based on evidence of implementation of the related professional practices. Quality Standards and Elements for all educators can be found in the associated professional rubric for each type of educator; links to those professional rubrics are available in Appendix A below.

The table below shows the point values assigned to each performance level on the Rubric for Evaluating Teachers. Each teacher receives a professional practice rating based on the accumulation of points on the 27 elements of the rubric. The Colorado Department of Education has a detailed guide, which also provides an example of how points earned on each standard are rolled up to an overall professional practice score and rating. Note: Because ratings for APEX are tracked in the online RANDA platform, once the data are entered, these calculations are done automatically in RANDA.

Point Value of Professional Practices Ratings for Teachers
  Exemplary: 4
  Accomplished: 3
  Proficient: 2
  Partially Proficient: 1
  Basic: 0

Measures of Student Learning Ratings
Local school districts may identify the measures of student learning that will comprise the 50 percent of an educator's overall evaluation rating. In the state model, each measure is awarded points that range from 0 to 3. The table below shows how the point values correspond to measures of student learning ratings. The measures of student learning are weighted, combined, and then converted to a score that will range from 0 to 540. For more specific information on how a score between 0 and 540 is obtained on this component, please refer to the appropriate Measures of Student Learning guidance available from CDE.

Point Values for Measures of Student Learning
  More than Expected: 3
  Expected: 2
  Less than Expected: 1
  Much Less than Expected: 0
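The weighting-and-conversion step can be sketched as follows. This is a simplified illustration: it takes a weighted average of the 0-3 measure points and stretches it to the 0-540 scale; the exact conversion rules live in CDE's MSL guidance, and the measure weights shown in the usage example are hypothetical.

```python
def msl_score(measures):
    """Combine weighted measures of student learning into a 0-540 score.

    `measures` is a list of (points, weight) pairs: points on the 0-3
    scale from the table above, weights summing to 1. The scaling shown
    (weighted average out of 3, stretched to 540) is a sketch of the
    conversion, not the official CDE rule set.
    """
    weighted_points = sum(points * weight for points, weight in measures)
    return weighted_points / 3 * 540

# Hypothetical example: an SLO rated Expected (2) at 60% weight and a
# state summative measure rated More than Expected (3) at 40% weight.
score = msl_score([(2, 0.6), (3, 0.4)])
print(round(score))  # 432
```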

Determining an Overall Rating
Senate Bill , the Great Teachers and Leaders Act, requires that 50 percent of an educator's final evaluation be based on a rating using the Colorado Evaluation Rubrics and 50 percent be based on multiple measures of student learning. These two ratings are combined to determine an overall effectiveness rating of Ineffective, Partially Effective, Effective, or Highly Effective. The state model uses an additive approach, expressed through an index score, to arrive at a final effectiveness score. The figure below illustrates how scores earned on each component of the evaluation will be used to determine a final effectiveness score and rating.

The process of combining measures starts with the final scores from professional practices and the measures of student learning. Once the professional practice score and measures of student learning score are determined, they are added together to create a single effectiveness, or index, score. A final effectiveness rating is assigned to an educator based on the total number of points reported. (The figure below is for Teachers. The process for Principals, Assistant Principals, and Special Service Professionals is quite similar. However, because the different types of educators have varying numbers of Professional Practice Standards and Elements, some of the numbers used in the calculations are different. For details about Principal ratings, please see the Colorado Department of Education's guide: Determining Final Effectiveness Ratings Using the Colorado State Model Evaluation System for Principals.)

Process for Assigning Effectiveness Ratings for Teachers

Professional Practices (50%): The Adams 12 Educator Effectiveness Advisory Committee assigned weights for each standard given a performance level on the Colorado Evaluation Rubrics: Basic, Partially Proficient, Proficient, Accomplished, and Exemplary. A professional practice score is determined by summing points earned across standards. That score is then multiplied by 27 to arrive at a final score that can range from 0 to 540.*

Measures of Student Learning (50%): Weights for each measure of student learning were assigned based on feedback from Teacher Focus Groups and the Educator Effectiveness Advisory Council. The rating for each measure is calculated based on set targets: Much Less than Expected, Less than Expected, Expected, and More than Expected. A measures of student learning score is determined by summing the weighted points across measures (0-3) and applying rules* to get a final MSL score between 0 and 540.

A final effectiveness score is determined by summing the professional practice score and the measures of student learning score. The final score can range from 0 to 1,080.

* Once data are entered, the RANDA system does these calculations automatically. For detailed information about determining these ratings, please see the Colorado Department of Education document about determining a final rating for Teachers (a companion guide covers Principals).
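The arithmetic described above can be sketched in a few lines. One assumption is made explicit here: since the summed standard points are multiplied by 27 to reach a 0-540 maximum, each of the five Quality Standards is taken to earn 0-4 points (5 x 4 x 27 = 540). In practice RANDA performs this calculation automatically.

```python
def final_effectiveness_score(standard_points, msl_score):
    """Compute a teacher's final effectiveness (index) score.

    `standard_points`: the 0-4 rating points earned on each of the five
    Quality Standards (an assumption consistent with the 0-540 range).
    `msl_score`: the 0-540 measures of student learning score.
    """
    pp_score = sum(standard_points) * 27  # 0-20 summed points -> 0-540
    return pp_score + msl_score           # index score, 0-1080

# A teacher rated Proficient (2 points) on all five standards, with an
# MSL score of 300:
print(final_effectiveness_score([2, 2, 2, 2, 2], 300))  # 570
```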

The diagram above shows the professional practices and measures of student learning cut points. The vertical axis (y-axis) displays the professional practices scale of 540 points and is divided into five sections. Moving from the bottom to the top, each of these sections corresponds to a professional practices performance level: Basic, Partially Proficient, Proficient, Accomplished, or Exemplary. The horizontal axis (x-axis) displays the measures of student learning scale of 540 points and is divided into four sections of 135 points each. Moving from left to right, each of these four sections corresponds to an MSL rating: Much Less than Expected, Less than Expected, Expected, or More than Expected. The third set of cut points, which corresponds to the final score, is called out in the table below. To arrive at the final score, the professional practice score is simply added to the measures of student learning score. The final effectiveness score can be translated to a rating using this table. For more information about how the cut points were established, please see Appendix C.

Final Effectiveness Rating Based on Index Score for Teachers
  Rating Category        Index Score Range
  Ineffective
  Partially Effective
  Effective
  Highly Effective
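The final lookup step amounts to a threshold check. The cut points below are placeholders (even quarters of the 0-1,080 scale), since the adopted index-score ranges are not reproduced in the table above; substitute the district's actual cut points.

```python
def effectiveness_rating(index_score, cut_points=(270, 540, 810)):
    """Translate a 0-1080 index score into a final effectiveness rating.

    `cut_points` are the lower bounds of the top three categories;
    the defaults are illustrative placeholders, not the adopted values.
    """
    labels = ["Ineffective", "Partially Effective",
              "Effective", "Highly Effective"]
    for cut, label in zip(cut_points, labels):
        if index_score < cut:
            return label
    return labels[-1]

print(effectiveness_rating(600))  # Effective (with placeholder cuts)
```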

Appendix A - Resources & Tools
Frequently Asked Questions
Glossary
Classroom Visit Tool
Reflective Conversation Protocol
Teacher-Type Category Decision Process Flowchart
Professional Rubrics

Frequently Asked Questions
In our ongoing collaborative work to better outline and define how the Colorado State Model Evaluation System aligns with professional values and beliefs in Adams 12 Five Star Schools, our Five Star Schools Educator Effectiveness Committee is evaluating many aspects of these systems. Here are the values and beliefs identified by the Five Star Schools Educator Effectiveness Advisory Committee:
- We believe in a growth mindset for all educators.
- We believe professional relationships built on mutual respect and transparency enhance everyone's practice and elevate our profession.
- We believe reflective practice and professional collaboration make educators more effective.
- We believe in continuous improvement, rather than perfection.

In the interest of transparency, reflection, and collective responsibility, the Educator Effectiveness Advisory Committee will regularly report the results of decisions on which it has reached consensus. The following questions, decisions, and rationale are presented as district-wide expectations for educators and supervisors to use during their collaborative discussions. The craft of delivering a high-quality education to every student every day is a difficult one, and we know our collective pursuit of educational excellence will never be done. However, the following decisions and background will be helpful as we work together toward that end.

Five Star Schools Educator Effectiveness Committee Guidance

Question: Can the data for a student be individually attributed to more than one teacher if those teachers provide direct instruction to the student in the same content area (e.g., a classroom teacher and a SPED teacher at an elementary school)?

Committee decision: Yes.
If two teachers provide direct instruction to a student in the same content area on a consistent basis (at least 30 minutes a day or 2.5 hours per week), the data for that student may be individually attributed to each teacher.

Rationale: In keeping with our district's belief in collective responsibility, since both teachers provide a significant amount of direct instruction to the student, both teachers contribute to that student's level of achievement and his or her growth. This approach also encourages collaboration to work toward the district's identified outcomes for a student's overall achievement and growth.

Question: What process should be used to set cut points for targets related to summative assessment data?

In the interest of continued transparency, this decision comes with the following caveat: the Five Star Schools Educator Effectiveness Committee recognizes that the following decision is complex. For those who want to know more about the process outlined below, we're working on developing additional resources to help explain this decision in greater detail. The committee reviewed multiple options before agreeing on this approach as most appropriate for the Five Star District.

Committee decision: Two factors should be considered when setting targets for summative assessment proficiency data:
1. Past performance on the same summative assessment at the school
2. Differences in achievement between the school's incoming and outgoing students

The Educator Effectiveness Committee agreed that the following approach is most appropriate: educators will set targets based on average scale scores and then adjust them for incoming student cohort differences. Using average scale scores instead of % Strong & Distinguished (%S&D) allows schools to more accurately measure the students in their particular school.

For example, if an outgoing 4th grade cohort has an average CMAS social studies scale score of 575, then 575 would be the starting point for setting expectations for the next incoming cohort. If, based on other relevant assessment data, the incoming class is identified as starting at a higher or lower level, the expectation of 575 would be adjusted accordingly. In this example, if the incoming cohort performed 3 points higher than the previous class on a similar assessment, the expectation would be adjusted to 578. Members of the Educator Effectiveness Committee are working on additional resources to help educators identify scale scores and adjust them for their incoming students.
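The adjustment in the worked example above is a single addition; a one-line helper makes the direction of the adjustment explicit (positive when the incoming cohort is stronger, negative when it is weaker):

```python
def adjusted_expectation(outgoing_avg, cohort_difference):
    """Start from the outgoing cohort's average scale score and adjust
    for how the incoming cohort compares on similar assessment data."""
    return outgoing_avg + cohort_difference

# The example above: outgoing average of 575, incoming cohort 3 points
# higher on a similar assessment.
print(adjusted_expectation(575, 3))  # 578
```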
Background
The following information is additional background regarding how and why the committee landed on its recommendation for setting cut points related to summative assessment data.

Context: Scale scores are a common form of measurement and have been around for many years. A scale score is a numerical value attributed to a level of achievement based on a student's test performance. The higher the scale score, the greater the student's demonstrated level of achievement. The range of the scale is defined when the test is initially developed but can vary from test to test. For example, ACT uses a scale of 1 to 36, while SAT uses a scale of 200 to 800. CMAS scales range from 300 to 900, with most school averages falling between 500 and 700.

An initial Expected cut point could be set using the average scale score of the prior year's cohort of students, with an adjustment for differences between the current cohort and the prior cohort. The ranges for each category could then be based on the observed variance of the prior year's cohort. The following would be an example for the CMAS Social Studies assessment, if the Expected scale score cut point were 530 with a standard deviation of 9 scale score points:

  Much Less than Expected | Less than Expected | Expected | More than Expected
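One way to realize the variance-based bands described above is to make each middle category one standard deviation wide around the Expected cut point. This band construction is an illustrative assumption (the text only says ranges "could be based on the observed variance"), and the 530/9 values come from the example above:

```python
def msl_category(scale_score, expected_cut=530, sd=9):
    """Assign an MSL category from a school's average scale score,
    using one-standard-deviation bands around the Expected cut point
    (an illustrative rule, not the committee's adopted formula)."""
    if scale_score < expected_cut - sd:
        return "Much Less than Expected"
    elif scale_score < expected_cut:
        return "Less than Expected"
    elif scale_score < expected_cut + sd:
        return "Expected"
    else:
        return "More than Expected"

print(msl_category(533))  # Expected
print(msl_category(518))  # Much Less than Expected
```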

Rationale: Proficiency data are highly correlated with school demographic factors (e.g., free/reduced lunch, percentage of ELLs, etc.). Also, particularly at small schools, proficiency data can vary significantly from one cohort of students to the next. An incoming cohort of students can demonstrate higher or lower achievement than the prior year's cohort. Taking these two aspects into account when setting performance targets for state summative assessment data is important. Further, focusing a performance target on the percentage of students reaching a certain proficiency level, such as %Strong & Distinguished (%S&D), narrows the ability to show growth to only those students who are able to exceed the cut point into the Strong Command proficiency level. Students who are well above or below that cut point will not be able to show growth in %S&D. Finally, another major limitation of using %S&D for setting summative assessment targets is that %S&D performance levels can approach a floor (0% S&D) or a ceiling (100% S&D). For schools approaching these thresholds, defining four performance categories could become difficult. For example, if a school had a %S&D of 2%, as occurred on the spring 2014 CMAS Social Studies assessment, setting a reasonable range for the Much Less than Expected and Less than Expected categories would become impossible. Consequently, using scale scores rather than %S&D to set targets for state summative assessment data is the most reasonable of the three options discussed.

Question: The vast majority of CGM and State Summative Assessment data will not be available before the end of the school year in time to finalize MSL ratings and, therefore, overall effectiveness ratings. Should prior year data be used, or should the evaluation be finalized in the fall after summative data become available?
Committee Decision: Use current year data for the evaluation and wait until the following fall to finalize evaluation ratings.

Rationale: Basing the CGM and summative assessment portion of a teacher's MSL on current year data keeps it in alignment with the teacher's current year professional practices and SLOs. This provides the best basis for judging a teacher's current year performance, rather than prior performance.

Question: What should be the minimum number of students in a measure for it to be included in the MSL system?

Committee Decision: The minimum number of students in a population for a measure to be included in the MSL system should be 15.

Rationale: The minimum number of 15 is in alignment with the threshold Denver has used (15 students combined within the last three-year rolling average). It is also within the range outlined in the Center for Assessment publication Using Student Growth Percentiles for Educator Evaluations at the Teacher Level: Key Issues and Technical Considerations for School Districts in Colorado. The report states, "Typical values for this minimum number found across states range from 10 to 20." (p. 9)

Question: Should there be a minimum attendance rate and enrollment period for a student's data to be included in MSL calculations? If so, what should they be?

Committee Decision: Yes. Both student attendance and enrollment should be taken into consideration when deciding which students' scores are included in the MSL system. For a student's data to be included, the student must have an attendance rate of 85% or more and have been enrolled during at least 85% of the class/course.

Rationale: The thresholds listed above are in alignment with the thresholds Denver has used. By requiring minimum enrollment and attendance criteria before including a student's data in the MSL system, factors that affect student achievement and growth but are beyond a teacher's control can be taken into account, providing a fairer measure of the impact of the teacher's instruction on student outcomes.

Question: For the attendance and enrollment criteria above, should the same guidance apply to all measures (CGM, State Summative, SPF, and SLO)? If not, why?

Committee Decision: Yes. The same inclusion criteria should be used for all measures.

Rationale: Using the same inclusion criteria for every type of measure promotes consistency within the system. While the SPF uses different criteria for inclusion (it does not factor in attendance and requires continuous enrollment from October 1st until testing), there is no compelling reason for differences in inclusion criteria among measures.

Question: For CGM and Summative Assessment Data, should only one year of data be used, or should data be pooled from multiple years?

Committee Decision: Use one year of data.

Rationale: With the assessment transition, only one year of growth data will be available for PARCC-to-PARCC comparisons (2015 PARCC to 2016 PARCC).
Consequently, using one year of data will be the only possible option for the first school year in which the MSL system is in place. However, it is acknowledged that research will be conducted during the next few years to determine whether multiple years of data, once available, should be used instead of only one year of growth data.

Question: What guidance should be provided to teachers and evaluators about using assessment data across measures that overlap? For example, a teacher may have literacy CGM data individually for her students while the SPF measure is the overall literacy MGP for the school as a whole.

Committee Decision: Teachers and evaluators should be provided information about the benefits and drawbacks of using the same measure multiple times within an educator's MSL measures, particularly if there is significant overlap in population between the teacher's individual measures and a collective measure.

Rationale: If the example in the question above occurred at a small elementary school with only two teachers per grade, the overlap would be more significant than at a large, comprehensive high school where there may be many English teachers at a given grade level. The MSL system is designed to use multiple measures to evaluate a teacher's effectiveness in helping students learn. Weighing one measure too heavily could unintentionally skew that result.
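The inclusion criteria adopted earlier in this section (a 15-student minimum, an 85% attendance rate, and enrollment for 85% of the course) can be sketched as a simple filter. The data shape and field names below are illustrative, not taken from any district system.

```python
def eligible_students(students):
    """Filter students by the committee's MSL inclusion criteria:
    an attendance rate of at least 85% and enrollment during at
    least 85% of the class/course. Rates are fractions (0.0-1.0);
    the dictionary keys are hypothetical."""
    return [s for s in students
            if s["attendance_rate"] >= 0.85 and s["enrollment_rate"] >= 0.85]

def measure_has_enough_students(students, minimum=15):
    """A measure is included in the MSL system only if at least
    `minimum` (15, per the committee decision) eligible students remain."""
    return len(eligible_students(students)) >= minimum
```

For example, a class of 15 in which one student attended only half the year would drop to 14 eligible students and fall below the minimum, so the measure would be excluded.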

Question: For CGM data, should the median or the mean be used to represent average growth?

Committee Decision: The median should be used to represent the average of growth percentiles.

Rationale: In large, normally distributed data sets, the mean and the median are essentially the same. When dealing with smaller data sets, which are often not normally distributed, the median is generally less susceptible to high or low outliers than the mean, which can be skewed significantly by them. Further, growth percentiles are not interval data like inches on a ruler or degrees on a thermometer, and as such should not technically be averaged using a mean.

Question: Should a common rating scale be used when interpreting CGM MGPs?

Committee Decision: Yes, a common rating scale should be used when interpreting growth percentiles. For the first year of the system the following scale is recommended, with the acknowledgment that it could be adjusted based on ongoing research.

Much Less Than Expected | Less Than Expected | Expected | More Than Expected

Rationale: The CGM is a consistent method for calculating growth percentiles regardless of the test, provided the test being compared across years is the same (e.g., TCAP to TCAP or ACCESS to ACCESS). Having a common rating scale for all CGM measures acknowledges that consistency. Further, a common scale promotes consistency and simplicity in communicating with stakeholders across the system.
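The outlier sensitivity described in the rationale above is easy to see with a small, hypothetical set of growth percentiles:

```python
from statistics import mean, median

# Hypothetical growth percentiles for a small group; one very low outlier.
growth_percentiles = [48, 49, 50, 50, 51, 52, 1]

# The single outlier drags the mean well below the group's typical growth,
# while the median (the middle of the ordered values) is unaffected.
print(mean(growth_percentiles))    # 43.0
print(median(growth_percentiles))  # 50
```

With the outlier included, the mean would place this group near the Less than Expected range while the median reflects the typical student's growth, which is exactly why the committee chose the median.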

Glossary

Collective Attribution: The use of measures required by the current provisions of the Elementary and Secondary Education Act and/or other standardized assessments used to measure the performance of groups of teachers. Measures of collective performance may assess the performance of the school, grade level, instructional department, teams, or other groups of teachers. These measures can take a variety of forms, including school-wide student growth measures, team-based collaborative achievement projects, and shared value-added scores for co-teaching situations.

Colorado Academic Standards (CAS): The expectations of what students need to know and be able to do at the end of each grade. They also stand as the values and content organizers of what Colorado sees as the future skills and essential knowledge for our next generation to be more successful. All Colorado districts are required to adopt local standards that meet or exceed the Colorado Academic Standards, which are also the basis of the annual state assessment.

Colorado Growth Model (CGM): A statistical model that calculates each student's progress on state assessments taken in consecutive years. The calculation is normative in nature and is based on a comparison of a student's growth to that of other students statewide at the same grade level with the same level of starting achievement.

Criterion-referenced Assessment: An assessment designed to measure student performance against a fixed set of predetermined criteria or learning standards (concise, written descriptions of what students are expected to know and be able to do at a specific stage of their education). Criterion-referenced assessments are used to evaluate whether students have learned a specific body of knowledge or acquired a specific skill set.
Depth of Knowledge (DOK): A reference to the complexity of mental processing that must occur to answer a question, perform a task, or generate a product. There are four levels of DOK. Level 1 includes basic recall of facts, concepts, information, or procedures. Level 2 includes skills and concepts, such as the use of information (e.g., graphs), or tasks that require two or more steps with decision points along the way. Level 3 includes strategic thinking that requires reasoning and is abstract and complex. Level 4 includes extended thinking, such as an investigation or application to the real world.

Formative Assessment: A process used by teachers and students during instruction that provides feedback to adjust ongoing teaching and learning, resulting in improved student achievement of intended instructional outcomes.

Growth Percentile (GP): Defines how much relative growth a student made. The CGM serves as a way for educators to understand how much growth a student makes from one year to the next relative to the student's academic peers. More specifically, the Colorado Growth Model compares each student's current achievement to that of students in the same grade throughout the state who had similar assessment scores in past years. The model then produces a student growth percentile, much like the height and weight percentiles that pediatricians share with parents. Percentile scores have a relatively straightforward interpretation: a child in the 76th percentile for weight is as heavy as or heavier than 76% of other children of the same age.

Higher Order Thinking Skills: Critical, logical, reflective, metacognitive, and creative thinking. These skills are activated when individuals encounter unfamiliar problems, uncertainties, questions, or dilemmas. Successful application of the skills results in explanations, decisions, performances, and products that are valid within the context of available knowledge and experience and that promote continued growth in these and other intellectual skills.

Individual Attribution: The use of measures required by the current provisions of the Elementary and Secondary Education Act and/or other standardized assessments used to measure the performance of an individual teacher. These measures can take a variety of forms, including Colorado Growth Model data, district interim assessment data, or school- or classroom-level assessment data.

Interim Assessment: A term generally used to refer to medium-scale, medium-cycle assessments. Interim assessments: 1) evaluate students' knowledge and skills relative to a specific set of academic goals, typically within a limited time frame, and 2) are designed to inform decisions both at the classroom level and beyond, such as at the school or district level. Thus, they may be given at the classroom level to provide information for the teacher, but unlike true formative assessments, their results can be meaningfully aggregated and reported at a broader level. However, interim assessments generally provide less instructionally relevant diagnostic information than formative assessments.

Measures of Student Learning (MSL) [also referred to as Student Academic Growth]: The various types of assessments of student learning, including, for example, value-added or growth measures, curriculum-based tests, pre-/post-tests, capstone projects, oral presentations, performances, artistic portfolios, or other projects.
Median Growth Percentile (MGP): Summarizes student growth rates by district, school, grade level, or other group of interest. The median is calculated by taking all the individual student growth percentiles of the students in the group being analyzed, ordering them from lowest to highest, and identifying the middle score. The median may not be as familiar as the mean, or average, but is similar in interpretation. Medians are more appropriate than averages when summarizing a collection of percentile scores.

MSL Rating Categories: Describe performance with respect to Colorado's Quality Standard VI for teachers or specialized service providers.
- Much Less than Expected: Students' performance on measures of student learning is significantly lower than expected.
- Less than Expected: Students' performance on measures of student learning is lower than expected.
- Expected: Students' performance on measures of student learning meets or slightly exceeds expectations.
- More than Expected: Students' performance on measures of student learning significantly exceeds expectations.

Norm-referenced Assessment: A type of assessment that yields an estimate of the tested individual's performance relative to a predefined population with respect to the trait being measured. This type of assessment determines whether the test taker performed better or worse than other test takers, but not whether the test taker knows more or less material than is necessary for a given purpose.

Reliability: The ability of an instrument to measure consistently across different raters and contexts.

Rigor: A term widely used by educators to describe instruction, schoolwork, learning experiences, and educational expectations that are academically, intellectually, and personally challenging. Rigorous learning experiences help students understand knowledge and concepts that are complex, ambiguous, or contentious, and they help students acquire skills that can be applied in a variety of educational, career, and civic contexts throughout their lives.

Statewide Summative Assessment: An assessment administered pursuant to the Colorado student assessment program as listed in state law (C.R.S.). These assessments include ACCESS, CMAS, PARCC, and ACT.

Student Learning Objective/Outcome: A formalized measure of what students should know, understand, and be able to do, including a prescribed assessment or data collection and performance target(s) to allow for the evaluation of student learning or outcomes.

Validity: The ability of an instrument to measure the attribute it intends to measure, as well as the accuracy and usefulness of the instrument's results for an intended purpose.

Professional Growth Plan Goal Setting Protocol

Teachers and specialized service professionals will collaborate with their leader to develop a Professional Growth Plan (PGP) focused on the educator's professional goals. A professional practice goal names an action the teacher will take to support professional growth, reflection, and student learning. The teacher's action should be linked to his/her self-assessment on the professional practices rubric and reflect the language of a specific attribute from Standards 1-5.

Collaborative Goal
Description: Teachers and specialized service professionals will select a collaborative goal facilitated by the evaluator. Collaborative goals may be school-wide, grade-level, or team goals, based on what is most appropriate for the learning environment, with guidance from the building principal.
Guiding Questions:
- What are the school-wide goals defined in the School Unified Improvement Plan?
- What will my content area team focus on together this school year?

Individual Goal
Description: Teachers and specialized service professionals will also select one individual goal based on one of the professional practices standards and elements that the teacher or specialized service professional and the evaluator agree would best help the professional grow in his or her educational practice.
Guiding Questions:
- What professional practices will you focus on to promote your own professional growth and reflection?
- What instructional strategies will you implement to support student achievement?
- Why are you focusing on this practice? How will this practice support student learning?
- What support will you need to implement these practices?
- How will you monitor your progress toward attainment of this goal?

Collaborative Professional Learning Goal: Name and Description
- How will we measure progress toward this goal?
- What action steps will I work on to improve my practice and meet this goal?
Individual Professional Learning Goal: Name and Description
- How will we measure progress toward this goal?
- What action steps will I work on to improve my practice and meet this goal?

Classroom Visit Tool

Clarity/Pacing
Standards: Standard I, Element A (Teachers provide instruction that is aligned with the Colorado Academic Standards, their district's organized plan of instruction, and the individual needs of their students); Standard I, Element D (Teachers demonstrate knowledge of the content, central concepts, tools of inquiry, appropriate evidence-based instructional practices, and specialized character of the disciplines being taught).
Descriptors:
- Teacher implements lesson plans based on student needs, the Colorado Academic Standards, and the district's plan of instruction (A PP)
- Instructional objectives are appropriate for students (A B)
- Explanations of content are accurate, clear, concise, and comprehensive (D PP)

Content Presentation/Instructional Strategies
Standards: Standard I, Element B (Teachers demonstrate knowledge of student literacy development in reading, writing, speaking and listening); Standard I, Element D (see above); Standard I, Element F (Teachers make instruction and content relevant to students and take actions to connect students' background and contextual knowledge with new information being taught).
Descriptors (instruction presented):
- Integrates literacy skills, including vocabulary, comprehension, fluency, writing, speaking, and listening (B Elementary/Secondary PP)
- Breaks down concepts into appropriate instructional parts (D B)
- Incorporates materials that are accurate and appropriate for the lesson being taught (D B)
- Enhances literacy skill development, critical thinking, and reasoning skills (B All Teachers P)
- Is needs-based and of sufficient duration to accelerate learning (B Elementary/Secondary P)
- Helps students connect to their learning by linking the current lesson with prior knowledge, experiences, and/or cultural contexts (F PP)
- Provides supports that facilitate engagement (F PP)

Student Tasks/Product
Standards: Standard I, Element B; Standard I, Element D; Standard I, Element F (see above).
Descriptors (students engage in tasks that require them to):
- Think critically about complex texts (B All Teachers P)
- Ask and solve problems that are relevant to them (F A)
- Discuss ideas with peers that are intellectually challenging to them (D E)
- Apply literacy skills (reading, writing, speaking, and listening) to understand complex texts (B Elementary/Secondary A)

*Descriptors on the Classroom Visit Protocol are aligned to the Professional Practices for Standard I referenced in the Rubric for Evaluating Colorado Teachers, which is part of the state's Educator Effectiveness Model. A reference to the corresponding element on the state rubric is provided under the name of each indicator on the protocol.

Instructional Shifts in Literacy
- Regular practice with complex text and academic language
- Reading, writing, and speaking grounded in evidence from text, both literary and informational
- Building knowledge through content-rich nonfiction text

Reflective Conversation Protocol

Mid-Year Conversation Guidance for Evaluators (December-January)

Purpose: Feedback, reflection, and growth are integral components of APEX. In order for educators to reach their goals and continually refine their practice, teachers should expect and request frequent, shorter-duration collaborative conversations with instructional leaders and colleagues throughout the school year to reflect on their practice.

Outcomes: In addition to the ongoing collaborative conversations, in the middle of the school year (December/January) teachers, specialized service professionals, and evaluators will have a reflective conversation that addresses strengths and recommended areas of growth or recommended professional learning. These conversations provide an opportunity to reflect on and discuss professional practices and to revise the Professional Growth Plan (PGP) if needed. In addition, mid-year conversations provide an opportunity to review available student assessment information and student progress toward the Learning Goals in Student Learning Objectives (SLOs). The goal of the mid-year conversation is for the teacher and evaluator to engage in a professional dialogue focused on reflection and growth related to professional goals, professional practices on the rubric, and student learning objectives for the first half of the school year. Providing a rating on the rubric is not required at this point in the school year.

Tips to Ensure a Productive Conference Focused on Outcomes
- Establish a specific timeframe (15-30 minutes).
- Communicate expected outcomes for the Mid-Year Conference to teachers.
- Identify areas of professional strength.
- Clarify actions the educator is willing to commit to in order to improve and grow.
- If there is disparity between the person being evaluated and the evaluator in ratings, strengths, and/or areas of growth, this is a great time to have a professional, collaborative conversation about next steps.
- Provide a general impression of the educator's evaluation rating. Educators may want to know where you see them on the continuum of performance at this point in the year. This is the opportunity to further discuss a specific learning plan to move the educator along the performance continuum within the professional practices rubric.
- If there are performance concerns, this is the time to identify and outline specific needs for growth.
- Complete the mid-year component for feedback within the RANDA system.

Mid-Year Reflective Conversation Protocol (December-January)

Mid-year conversations may happen individually or in groups, if appropriate, based on collaborative goals and successful progress toward completion of goals.

Professional Practices
- Tell me about your learning related to your professional goals.
- How are you collaborating with colleagues to develop, expand, or refine your instructional practices?
- What are you learning about your practice that is helping you to grow as an educator?

Measures of Student Learning
- How are students progressing toward the Learning Goals you've set for them this year?
- What evidence/data do you have to support your analysis of student progress?
- Are the growth targets that you set at the beginning of the year attainable?
- Based on your current review of student progress, what are your plans for achieving your end-of-year growth targets?
- What additional supports do you need to ensure that you are successful with your students?

Guiding Questions for Teachers Focused on Standards I-III

Standard I (Content Knowledge)
- How will I prioritize which standards to teach (e.g., complexity, highly tested, most challenging for students to master, district plan for instruction) in this lesson or unit?
- How do I collaborate with school staff to ensure my planning and instruction support the needs of all students and align with the approved curriculum?
- How will I select instructional materials and strategies that provide relevance and central contexts and are foundational and evidence-based?

Standard II (Classroom Environment)
- How do I ensure my classroom environment is conducive to learning?
- How do I support students in developing positive relationships with their peers?
- How do I model and teach elements of respectful dialogue?
- How will I ensure each student's contributions to the lesson are valued?
- How will I develop a sense of community in the classroom?
- How will I ensure all students participate in class activities?
- How will I adapt the learning environment to address individual student needs?

Standard III (Instruction)
- How will I plan for a variety of instructional methods during a lesson?
- How will I continually monitor student readiness?
- How will I ensure my assessments (formative and summative) are aligned with academic standards and student outcomes?
- How can I model risk taking for my students?
- How will I determine the developmental and academic needs of each student?
- How can the use of technology enhance student learning and engagement?
- How will I establish and communicate high expectations for all students?

Possible Artifacts to Enhance the Mid-Year Conversation
- Examples of collaborative work with colleagues
- Differentiated lesson plans
- Student intervention plans
- Evidence of communication with parents
- Student work samples
- Graphs, tables, or rubrics describing student results
- Analysis of classroom assessments

Candid Conversation Protocol: Guidance for Evaluators

Purpose: Feedback, reflection, and growth are integral components of the APEX system. If an educator isn't progressing adequately toward his or her goals, you may need to have a candid conversation and provide supports. The goal of a candid feedback conversation is for the teacher and evaluator to engage in a professional dialogue focused on growth, defining areas that need improvement, and supports. Educators struggling to meet their goals should expect and request frequent, shorter-duration feedback conversations focused on areas of need throughout the school year. These conversations should focus on setting short-term goals and supporting educators in meeting their year-long goals.

Outcomes: Teachers, specialized service professionals, and evaluators will have specific feedback conversations focused on recommended areas of growth or recommended professional learning. These conversations provide an opportunity to reflect on and discuss professional practices and to revise supports as needed. In addition, available student assessment information and student progress toward the Learning Goals in Student Learning Objectives (SLOs) may need to be discussed. Providing a rating on the rubric is not required, but may be shared if necessary.

Suggested agenda:
- Open: set the purpose
- Share: share feedback in a concrete, specific way
- Open dialogue: allow for reaction and conversation
- Set expectations: identify specific, concrete expectations for next steps
- Outline supports: offer supports and ask what other supports would be helpful
- Summarize: confirm shared understanding of next steps and available supports

Tips to ensure a productive conversation focused on outcomes:
- Establish a specific timeframe (15-30 minutes).
- Communicate expected outcomes for the conversation.
- Name the issue and select a specific example that illustrates the challenge you want to address.
- Clarify what is at stake and indicate your wish to support the teacher.
- Invite the educator to respond.
- Identify specific next steps and available supports.

If the conversation requires redirection:
- Ask mediational questions (e.g., "What would happen if...?" "What is another way you might...?")
- Restate messages about values, mission, and vision (e.g., "We believe every student can learn.")
- Make a firm statement to move the discussion forward (e.g., "We are going to talk about some concrete next steps to take in support of student learning...")

Closing the conversation:
- End the conversation with clear expectations about deliberate next steps and supports for the educator.
- Plan for follow-up classroom visits, agreed-upon supports, and feedback conversations.

Tips for handling resistance in feedback conversations:
- Seek to understand the other person
- Put your agenda on hold
- Deal with the emotion

  o "I am sensing there is more here than just the topic of conversation."
  o "It seems like you are feeling ___ because ___."
  o "I hear you saying ___. Is my understanding of your perspective accurate?"
  o "How can we work together to help with these next steps?"
- Accept responsibility
  o "I sense that I have done something to offend you or make you mad. That was not my intent, and I am not sure what it was that I did. If you can help me understand, I can make the adjustment."
- Acknowledge universals
  o "We love children and want to do right by them."
  o "This is a hard job."
  o "We have too many demands and too little time."
- Take a side trip if necessary
  o Allow venting and paraphrasing.
  o Sometimes people just need to be heard.
- Recognize outside stressors.
- Recognize school demands, such as testing, accountability for things outside one's control, the unknown, etc.
- Find out what others see as solutions.

***Adapted from Fierce Conversations, by Susan Scott

End-of-Year Reflective Conversation Protocol (April/May)

Professional Practices
- Tell me about your professional learning related to your professional goals this year.
- How did you collaborate with colleagues to develop, expand, or refine your instructional practices?
- What did you learn or refine about your practice that helped you to grow as a teacher?

Measures of Student Learning
- How well did students progress toward the Learning Goals you set for their learning this year?
- What evidence/data do you have to support student outcomes?
- Were the growth targets that you set at the beginning of the year attainable?
- How have your Professional Practices Goals affected student outcomes?

Guiding Questions for Teachers Focused on Standards I-III

Standard I (Content Knowledge)
- How will I prioritize which standards to teach (e.g., complexity, highly tested, most challenging for students to master, district plan for instruction) in this lesson or unit?
- How do I collaborate with school staff to ensure my planning and instruction support the needs of all students and align with the approved curriculum?
- How will I select instructional materials and strategies that provide relevance and central contexts and are foundational and evidence-based?

Standard II (Classroom Environment)
- How do I ensure my classroom environment is conducive to learning?
- How do I support students in developing positive relationships with their peers?
- How do I model and teach elements of respectful dialogue?
- How will I ensure each student's contributions to the lesson are valued?
- How will I develop a sense of community in the classroom?
- How will I ensure all students participate in class activities?
- How will I adapt the learning environment to address individual student needs?

Standard III (Instruction)
- How will I plan for a variety of instructional methods during a lesson?
- How will I continually monitor student readiness?
- How will I ensure my assessments (formative and summative) are aligned with academic standards and student outcomes?
- How can I model risk taking for my students?
- How will I determine the developmental and academic needs of each student?
- How can the use of technology enhance student learning and engagement?
- How will I establish and communicate high expectations for all students?

Possible Artifacts to Enhance the Conversation
- Examples of collaborative work with colleagues
- Differentiated lesson plans
- Student intervention plans
- Evidence of communication with parents
- Student work samples
- Graphs, tables, or rubrics describing student results
- Analysis of classroom assessments

Teacher-Type Category Decision Process Flowchart

*NOTE: For the school year, all certified staff will only use SLOs for the MSL half of APEX.*

Decision Process: For most teachers, the flowchart above will be all that is needed to determine which category they fall into. However, some teachers are not so easy to classify, such as special education teachers. For teachers who are hard to classify, it will be important to look at the context of the teacher's role and instructional responsibilities, which can vary depending on the school's context.

Example: Assume an elementary school has two learning specialists who provide direct literacy instruction to students in pull-out groups. One learning specialist works with students in grades K-2 and the other with students in grades 3-5. Both educators instruct students in a content area (literacy) that is assessed using a state summative assessment. The first learning specialist instructs students in grades K-2, where no CGM data are available; consequently, the first learning specialist would be classified as Category II. The second learning specialist instructs students in grades 3-5, and CGM data are available for students in grades 4 and 5. Therefore, the second learning specialist would be classified as Category I.
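The distinction drawn in the worked example above can be sketched in a few lines. The full decision flowchart is not reproduced in this text, so this captures only the branch the example illustrates; other teacher types would follow additional branches.

```python
def teacher_category(content_is_state_assessed, cgm_data_available):
    """Sketch of the category distinction from the learning-specialist
    example: a teacher of state-assessed content is Category I when CGM
    data are available for their students and Category II when they are
    not. This is an inference from the example, not the full flowchart.
    """
    if content_is_state_assessed:
        return "Category I" if cgm_data_available else "Category II"
    return "See flowchart for other teacher types"
```

Applying it to the example: the K-2 specialist (state-assessed content, no CGM data) is Category II, while the grades 3-5 specialist (CGM data available for grades 4 and 5) is Category I.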

Professional Rubrics
Colorado Evaluation Rubrics can also be accessed on the Five Star Schools Educator Effectiveness website at or: Staff > Educator Effectiveness

Teacher Rubric
Principal and Assistant Principal Rubric
School Audiologist Rubric
School Psychologist Rubric
School Nurse Rubric
School Physical Therapist Rubric
School Occupational Therapist Rubric
School Counselor Rubric
School Social Worker Rubric
School Speech Language Pathologist Rubric
School Orientation and Mobility Specialist Rubric

Appendix B - RANDA Navigation Support

Main RANDA Teacher Navigation Page
After you have logged in to the RANDA system and clicked on My Evaluation, this is the page you will use to navigate each step of the process. The items listed under Activity will turn blue as each step is completed during the school year. You will not be able to begin your self-assessment or Professional Growth Plan until you have completed training/orientation in your building or department.

Training and Orientation Page
In order to navigate the Educator Effectiveness process in the RANDA system, educators must acknowledge that they have had Training and/or Orientation and select the date of the training. This training will take place in your building or department and may include updates from CDE, technical training on RANDA (if needed), and/or a review of the evaluation process. Select the statement that applies to you and the date of the training, click the box indicating your understanding of the evaluation process, and then click Submit.

Self-Assessment
Quick Reference Guide on how to complete a self-assessment in RANDA (please note the RANDA system has HELP icons)

Locate the appropriate Colorado Evaluation Rubric in RANDA for your job title and responsibilities.
  o Rubrics in RANDA are populated by Human Resources based on job categories.
  o The Teacher rubric is for all classroom teachers (regular, elective, ESL, special education, etc.).
  o Specialized Service Professional rubrics are populated based on job titles.
  o Principal and Assistant Principal rubrics are populated based on job titles.
Assign a rating for each of the Elements contained in each Domain of the rubric.
  o Read each Professional Practice, beginning with those referenced under the column Basic.
  o Mark each Professional Practice for which you have demonstrated evidence of implementation.
  o Rate each Element at the highest performance level for which all Professional Practices are marked, provided all Professional Practices in the levels below it are also marked. The score for each Element is determined by the last column that has all of its boxes checked. For example, if all boxes are checked in the Basic, Partially Proficient, and Proficient columns, but only one box is checked in the Accomplished column, as shown above, the rating on this Element would be Proficient.
After you have completed your self-assessment in all Quality Standards, click the Check Progress button at the bottom of the screen. If you missed any Standard or Element, it will be listed. If the assessment is complete, click the View Scoring button at the bottom of the screen.
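The element-scoring rule above can be sketched as follows. The level names come from the Colorado rubric; everything else (the data shape, the function name, the "Not Evident" label for an element with no fully checked column) is a hypothetical illustration, not the RANDA implementation.

```python
# Sketch of the element-scoring rule: the rating is the last column, reading
# left to right, in which every box is checked, provided every box in all
# earlier columns is also checked. Names here are illustrative only.
LEVELS = ["Basic", "Partially Proficient", "Proficient", "Accomplished", "Exemplary"]

def element_rating(checked):
    """checked maps each level name to (boxes_checked, boxes_total)."""
    rating = "Not Evident"  # assumption: label when no column is fully checked
    for level in LEVELS:
        done, total = checked[level]
        if done == total:
            rating = level  # this column is complete; keep moving right
        else:
            break           # an incomplete column stops the progression
    return rating

# The worked example from the guide: Basic, Partially Proficient, and
# Proficient fully checked; only one box checked in Accomplished.
example = {
    "Basic": (3, 3),
    "Partially Proficient": (2, 2),
    "Proficient": (4, 4),
    "Accomplished": (1, 3),
    "Exemplary": (0, 2),
}
print(element_rating(example))  # Proficient
```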

Professional Growth Plan Step 1

Professional Growth Plan Step 2

Professional Growth Plan Step 3

Professional Growth Plan Step 4

Mid-Year Conversation in RANDA
In the RANDA system, the supervisor initiates the mid-year conversation. This example is the teacher view after the supervisor has started the form from his/her RANDA screen. The form may be partially completed prior to the mid-year conversation, or it may be completed during or after the conversation. Once the mid-year conversation is complete and the supervisor has finished the document, the educator will click to acknowledge that he/she has participated in a mid-year conversation (this does not imply that the educator agrees or disagrees with that conversation), may make additional comments if he/she chooses, and then submit.

End-of-Year Conversation in RANDA
In the RANDA system, the supervisor initiates the end-of-year conversation. This example is the educator view after the administrator has started the form from his/her RANDA screen. The form may be partially completed prior to the end-of-year conversation, or it may be completed during or after the conversation. The end-of-year conversation will include a discussion of progress on professional goals, a review of student data, and a discussion of the overall professional practices rubric ratings. The conversation will not likely cover every Standard and Element on the professional practices rubric; however, if there are questions about ratings, this is the time to discuss them. Once an educator has had the end-of-year conversation with his/her evaluator, and the evaluator has completed the document, the educator will click to acknowledge that he/she has participated in an end-of-year review (this is not an indicator of agreement or disagreement), make additional comments if needed, and then submit.

RANDA Final Ratings - Professional Practices
After the professional growth cycle has ended, you will see a summary page similar to this example. It will provide the overall rating for each Quality Standard on the Professional Practices Rubric and will show an overall Professional Practices rating, which is 50% of the full evaluation. Administrators may comment on any Standard, but must make a comment if any Quality Standard (not a specific Element) is rated below Proficient.

RANDA Final Ratings - Measures of Student Learning Combined with Professional Practices
The top portion of this example shows the Measures of Student Learning score for each measure and the overall Measures of Student Learning score. The bottom portion illustrates how the Professional Practices score is combined with the Measures of Student Learning score to produce the final effectiveness rating.

Appendix C - Additional Details About APEX

Measures of Student Learning - State Board of Education Rules
Teacher Growth [5.01 (E) (7) and (8)]
School Districts and BOCES shall categorize Teachers into appropriate categories based on the availability and technical quality of student assessments available for the courses and subjects taught by those Teachers. School Districts and BOCES shall then choose or develop appropriate Measures of Student Academic Growth to be used in the evaluation of each personnel category. The Department will develop technical guidance, based on research and best practices that emerge from the pilot of the State Model System and the implementation of other local systems during the Pilot Period, which School Districts and BOCES may choose to use in developing their own Measures of Student Academic Growth. This technical guidance shall address methods for ensuring that such Measures of Student Academic Growth meet minimum standards of credibility, validity, and reliability. Measures of Student Academic Growth shall be generated from an approach or model that makes design choices explicit and transparent (e.g., in a value-added model, transparency about student- or school-level factors which are statistically controlled for) and has technical documentation sufficient for an outside observer to judge the technical quality of the approach (i.e., a value-added system must provide adequate information about the model). Measures of Student Academic Growth shall be generated from an approach or model that presents results in a manner that can be understood and used by Educators to improve student performance. Student Academic Growth shall be measured using multiple measures.
When compiling these measures to evaluate performance against Teacher Quality Standard VI, School Districts and BOCES shall consider the relative technical quality and rigor of the various measures.

5.01 (E) (7)
e) A measure of individually attributed Student Academic Growth, meaning that outcomes on that measure are attributed to an individual licensed person;
f) A measure of collectively attributed Student Academic Growth, whether on a school-wide basis or across grades or subjects, meaning that outcomes on that measure are attributed to at least two licensed personnel (e.g., measures included in the school performance framework, required pursuant to section , C.R.S.);
g) When available, Statewide Summative Assessment results; and
h) For subjects with annual Statewide Summative Assessment results available in two consecutive grades, results from the Colorado Growth Model.

5.01 (E) (8)
a) School Districts and BOCES shall seek to ensure that Measures of Student Academic Growth are valid, meaning that the measures are aligned with the academic standards adopted by the local school board pursuant to , C.R.S. and that analysis and inferences from the measures can be supported by evidence and logic;
b) School Districts and BOCES shall seek to ensure that Measures of Student Academic Growth are reliable, meaning that the measures should be stable over time and in substance and that data from the measures will be sufficient to warrant reasonably consistent inferences;
c) In the effort to ensure that Measures of Student Academic Growth are comparable among Teachers of similar content areas and grades, School Districts and BOCES are strongly encouraged to include Teachers in a discussion of which measures are most appropriate to the Teachers' classrooms; and
d) For Teachers teaching two or more subjects, individual Measures of Student Academic Growth shall include Student Academic Growth scores from all subjects for which the Teacher is responsible.

Establishing Cut Points for Teacher Overall Effectiveness Ratings
The first cut point is established by adding the maximum score for Basic on professional practices (54) to the minimum score for Less than Expected on measures of student learning (135), giving a combined score of 189, which is the cut point for a Partially Effective rating. To determine the cut point for Effective, the maximum score for Partially Proficient on professional practices (189) is added to the minimum score for Expected on measures of student learning (270); the cut point for an Effective rating is therefore 459 (189 + 270). The cut point for Highly Effective is determined by adding the maximum score for Proficient on professional practices (324) to the minimum score for More than Expected on measures of student learning (405); the cut point for a Highly Effective rating is 729 (324 + 405). The final effectiveness rating is determined after the professional practice score and the measures of student learning score have been combined. For example, an Effective rating is earned if the combined total is between 459 and 728.
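The cut-point arithmetic above can be checked and turned into a lookup in a few lines. This is a sketch using only the numbers stated in the guide; the function name, structure, and the "Ineffective" label for scores below the first cut point are illustrative assumptions, not part of APEX or RANDA.

```python
# Cut points = max of one professional-practices band + min of one MSL band,
# exactly as described in the guide. Labels below the first cut point are an
# assumption for illustration.
PARTIALLY_EFFECTIVE_CUT = 54 + 135   # 189: max Basic + min Less than Expected
EFFECTIVE_CUT = 189 + 270            # 459: max Partially Proficient + min Expected
HIGHLY_EFFECTIVE_CUT = 324 + 405     # 729: max Proficient + min More than Expected

def overall_rating(combined_score):
    """Map a combined (professional practices + MSL) score to an overall rating."""
    if combined_score >= HIGHLY_EFFECTIVE_CUT:
        return "Highly Effective"
    if combined_score >= EFFECTIVE_CUT:
        return "Effective"
    if combined_score >= PARTIALLY_EFFECTIVE_CUT:
        return "Partially Effective"
    return "Ineffective"  # assumption: label for scores below the first cut

print(overall_rating(459))  # Effective (bottom of the Effective band)
print(overall_rating(728))  # Effective (top of the Effective band)
print(overall_rating(729))  # Highly Effective
```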


Understanding District-Determined Measures Understanding District-Determined Measures 2013-2014 2 Table of Contents Introduction and Purpose... 5 Implementation Timeline:... 7 Identifying and Selecting District-Determined Measures... 9 Key Criteria...

More information

BUILDING CURRICULUM ACCOMMODATION PLAN

BUILDING CURRICULUM ACCOMMODATION PLAN BUILDING CURRICULUM ACCOMMODATION PLAN 2014-2015 ERIC STARK, PRINCIPAL KATE PERETZ, ASSISTANT PRINCIPAL Alone we can do so little; together we can do so much. Helen Keller FRANKLIN PUBLIC SCHOOLS VISION

More information

Teacher Generated Examples of Artifacts and Evidence. Criterion Element Example Artifacts/ Evidence

Teacher Generated Examples of Artifacts and Evidence. Criterion Element Example Artifacts/ Evidence Criterion 1: Centering instruction on high expectations for student achievement 2b: Establishing a culture for learning 3a: Communicating with students 3c: Engaging Students in learning student generated

More information

CONNECTICUT SEED Student and Educator Support Specialists Guidance Document

CONNECTICUT SEED Student and Educator Support Specialists Guidance Document CONNECTICUT SEED Student and Educator Support Specialists Guidance Document 1 This document provides guidance to administrators and Student and Educator Support Specialists (SESS) on the application of

More information

19K660. Brooklyn, NY 11207. Jocelyn Badette

19K660. Brooklyn, NY 11207. Jocelyn Badette NYSED/NYCDOE JOINT INTERVENTION TEAM REPORT AND RECOMMENDATIONS BEDS Code/DBN: 19K660 School Name: W. H. Maxwell High School 145 Pennsylvania Avenue School Address: Brooklyn, NY 11207 Principal: Jocelyn

More information

A Guide to Implementing Principal Performance Evaluation in Illinois

A Guide to Implementing Principal Performance Evaluation in Illinois A Guide to Implementing Principal Performance Evaluation in Illinois Prepared by the Illinois Principals Association & Illinois Association of School Administrators Why This Guide? Implementing a new principal

More information

The residency school counselor program does not prepare candidates to design, deliver, and

The residency school counselor program does not prepare candidates to design, deliver, and STANDARD V: KNOWLEDGE AND SKILLS SCHOOL COUNSELORS -Building on the mission to prepare educators who demonstrate a positive impact on student learning based on the Improvement of Student Achievement act

More information

North Carolina Professional Teaching Standards

North Carolina Professional Teaching Standards North Carolina Professional Teaching Standards For every student in North Carolina, a knowledgeable, skilled compassionate teacher...a star in every classroom. As Approved by the State Board of Education

More information

Texas Teacher Evaluation and Support System FAQ

Texas Teacher Evaluation and Support System FAQ A. Overview 1. *What is T-TESS? Texas Teacher Evaluation and Support System FAQ T-TESS is the Texas Teacher Evaluation and Support System. It is a new teacher evaluation system for the state of Texas designed

More information

2. Explain how the school will collect and analyze student academic achievement data.

2. Explain how the school will collect and analyze student academic achievement data. 14 Del. C. 512(4)-(7) 1. Explain how the school s Board and School Leadership Team will measure and evaluate LTA (LTA) will use multiple data sources to enhance student learning while focusing on the need

More information

District Accountability Handbook Version 3.0 September 2012

District Accountability Handbook Version 3.0 September 2012 District Accountability Handbook Version 3.0 September 2012 Colorado Department of Education Page 1 The purpose of this handbook is to provide an outline of the requirements and responsibilities for state,

More information

ENROLLED SENATE BILL No. 103

ENROLLED SENATE BILL No. 103 Act No. 173 Public Acts of 2015 Approved by the Governor November 5, 2015 Filed with the Secretary of State November 5, 2015 EFFECTIVE DATE: November 5, 2015 Introduced by Senator Pavlov STATE OF MICHIGAN

More information

GaPSC Teacher Leadership Program Standards

GaPSC Teacher Leadership Program Standards GaPSC Teacher Leadership Program Standards Purpose: Georgia has identified a need to improve P-12 students academic performance as measured by various assessments. One method to ensure improved student

More information

I. THE PLAN. Init.10/10

I. THE PLAN. Init.10/10 I. THE PLAN Professional employees will be grouped into four categories, or STAGES. Teacher evaluation occurs through a variety of activities: (1) required supervisory activities, (2) personalized supervisory

More information

08X540. School For Community Research and Learning 1980 Lafayette Avenue School Address: Bronx, NY 10473

08X540. School For Community Research and Learning 1980 Lafayette Avenue School Address: Bronx, NY 10473 NYSED/NYCDOE JOINT INTERVENTION TEAM REPORT AND RECOMMENDATIONS BEDS Code/DBN: 08X540 School Name: School For Community Research and Learning 1980 Lafayette Avenue School Address: Bronx, NY 10473 Principal:

More information

Arkansas Teaching Standards

Arkansas Teaching Standards Arkansas Teaching Standards The Arkansas Department of Education has adopted the 2011 Model Core Teaching Standards developed by Interstate Teacher Assessment and Support Consortium (InTASC) to replace

More information

GEORGIA DEPARTMENT OF EDUCATION Introduction and Overview Formative Instructional Practices Professional Learning. www.gadoe.

GEORGIA DEPARTMENT OF EDUCATION Introduction and Overview Formative Instructional Practices Professional Learning. www.gadoe. GEORGIA DEPARTMENT OF EDUCATION Introduction and Overview Formative Instructional Practices Professional Learning www.gadoe.org/georgiafip 1 Goals of the Session Share an overview the GaDOE Assessment

More information

YOUNG FIVES PROGRAM 2009-2012 THREE-YEAR SINGLE PLAN FOR STUDENT ACHIEVEMENT. Palo Alto Unified School District

YOUNG FIVES PROGRAM 2009-2012 THREE-YEAR SINGLE PLAN FOR STUDENT ACHIEVEMENT. Palo Alto Unified School District YOUNG FIVES PROGRAM THREE-YEAR SINGLE PLAN FOR STUDENT ACHIEVEMENT 2009-2012 Palo Alto Unified School District DISTRICT GOAL: Create an exceptional learning environment that engages, challenges, and supports

More information

Committee On Public Secondary Schools. Standards for Accreditation

Committee On Public Secondary Schools. Standards for Accreditation Committee On Public Secondary Schools Standards for Accreditation Effective 2011 New England Association of Schools & Colleges 3 Burlington Woods Drive, Suite 100 Burlington, MA 01803 Tel. 781-425-7700

More information

REQUIRED TEXTBOOK LIST

REQUIRED TEXTBOOK LIST 10 800 Troy-Schenectady Road, Latham, NY 12110-2455 518-213-6000 800-528-6208 FAX 518-213-6456 www.nysut.org/elt REQUIRED TEXTBOOK LIST Students will now purchase the Required Text(s) for ELT courses directly

More information

Standards for Professional Development

Standards for Professional Development Standards for Professional Development APRIL 2015 Ohio Standards for Professional Development April 2015 Page 1 Introduction All of Ohio s educators and parents share the same goal that Ohio s students

More information

Frequently Asked Questions Contact us: [email protected]

Frequently Asked Questions Contact us: RAC@doe.state.nj.us Frequently Asked Questions Contact us: [email protected] 1 P a g e Contents Identification of a Priority, Focus, or Reward School... 4 Is a list of all Priority, Focus, and Reward Schools available to

More information

Appendix E. Role-Specific Indicators

Appendix E. Role-Specific Indicators Appendix E. Role-Specific Indicators A frequent topic of debate with regard to educator evaluation both in Massachusetts and across the nation is the extent to which performance rubrics should be specific

More information

DENVER PUBLIC SCHOOLS. EduStat Case Study. Denver Public Schools: Making Meaning of Data to Enable School Leaders to Make Human Capital Decisions

DENVER PUBLIC SCHOOLS. EduStat Case Study. Denver Public Schools: Making Meaning of Data to Enable School Leaders to Make Human Capital Decisions DENVER PUBLIC SCHOOLS EduStat Case Study Denver Public Schools: Making Meaning of Data to Enable School Leaders to Make Human Capital Decisions Nicole Wolden and Erin McMahon 7/19/2013. Title: Making Meaning

More information

Fridley Alternative Compensation Plan Executive Summary

Fridley Alternative Compensation Plan Executive Summary Fridley Alternative Compensation Plan Executive Summary Fridley School District Alternative Compensation Plan is a district wide program that is approved by the Minnesota Department of Education and was

More information

TENNESSEE STATE BOARD OF EDUCATION

TENNESSEE STATE BOARD OF EDUCATION Alternative Education Program Model/Standards Standard 1.0: Mission An exemplary alternative education program operates with a clearly stated mission, a formal set of standards, and a plan for program

More information

ROCORI Schools Professional Development Program (PDP)

ROCORI Schools Professional Development Program (PDP) ROCORI SCHOOLS ISD 750 ROCORI Schools Professional Development Program (PDP) To be central Minnesota s public education standard of excellence A joint agreement between Education Minnesota-ROCORI and ISD

More information

Reading Specialist. Practicum Handbook Addendum to be used in conjunction with the Education Unit Practicum Handbook 2014-2015

Reading Specialist. Practicum Handbook Addendum to be used in conjunction with the Education Unit Practicum Handbook 2014-2015 Reading Specialist Practicum Handbook Addendum to be used in conjunction with the Education Unit Practicum Handbook 2014-2015 Nancy L. Murray, Ed.D January 2014 Adapted from Rosemarie Giovanni, Ph.D. 1

More information

Cyber School Student Teaching Competencies

Cyber School Student Teaching Competencies Cyber School Student Teaching Competencies Introduction The Pennsylvania Department of Education (PDE) has developed a general set of student teaching competencies that afford a student teacher the opportunity

More information

WHEELOCK COLLEGE FACULTY DEVELOPMENT AND EVALUATION PROGRAM

WHEELOCK COLLEGE FACULTY DEVELOPMENT AND EVALUATION PROGRAM WHEELOCK COLLEGE FACULTY DEVELOPMENT AND EVALUATION PROGRAM REVISED SPRING 2011 TABLE OF CONTENTS Development And Evaluation Process: Tenure Track Faculty... 4 Overview Of Mentoring And Evaluation Process

More information

2013 Marzano School Leader Evaluation Model Rubric

2013 Marzano School Leader Evaluation Model Rubric 2013 Marzano School Leader Evaluation Model Rubric Exclusive partners with Dr. Robert J. Marzano for the Teacher Evaluation Model and School Leader Evaluation Model Learning Sciences International 175

More information

How To Write A Curriculum Framework For The Paterson Public School District

How To Write A Curriculum Framework For The Paterson Public School District DEPARTMENT OF CURRICULUM & INSTRUCTION FRAMEWORK PROLOGUE Paterson s Department of Curriculum and Instruction was recreated in 2005-2006 to align the preschool through grade 12 program and to standardize

More information