Working Paper

Estimating the Percentage of Students Who Were Tested on Cognitively Demanding Items Through the State Achievement Tests

KUN YUAN & VI-NHUAN LE
RAND Education

WR-967-WFHF
November 2012

Prepared for the William and Flora Hewlett Foundation

RAND working papers are intended to share researchers' latest findings. Although this working paper has been peer reviewed and approved for circulation by RAND Education, the research should be treated as a work in progress. Unless otherwise indicated, working papers can be quoted and cited without permission of the author, provided the source is clearly referred to as a working paper. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors. RAND is a registered trademark.
PREFACE

The William and Flora Hewlett Foundation's Education Program initiated a new strategic initiative in 2010 that focuses on students' mastery of core academic content and their development of deeper learning skills (i.e., critical-thinking, problem-solving, collaboration, communication, and learn-how-to-learn skills). The Foundation would like to track the extent to which U.S. students are assessed in a way that emphasizes deeper learning skills during its Deeper Learning Initiative. This report presents the results of a project to estimate the percentage of U.S. elementary and secondary students being assessed on deeper learning skills through statewide mathematics and English language arts achievement tests at the beginning of the Deeper Learning Initiative. This research was conducted by RAND Education, a unit of the RAND Corporation.
CONTENTS

Preface ... iii
Figures ... vii
Tables ... ix
Summary ... xi
Acknowledgments ... xvii
Abbreviations ... xix
1. Introduction
2. Identifying Tests for Analysis
3. Choosing a Framework to Analyze the Cognitive Rigor of State Tests
4. Examining the Rigor of State Achievement Tests in Targeted States
5. Estimating the Percentage of U.S. Students Assessed on Selected Deeper Learning Skills Through State Achievement Tests ... 27
Appendix A: Exemplary Test Items at Each DOK Level ... 33
Appendix B: Percentage of Released State Test Items Rated at Each DOK Level, by Subject, State, and Grade Level ... 45
References ... 55
FIGURES

Figure S.1. Percentage of Test Items at Each DOK Level, by Subject and Item Format ... xiii
Figure 4.1. Percentage of Test Items Analyzed, by Subject
Figure 4.2. Percentage of Test Items Rated at Each DOK Level, by Subject
Figure 4.3. Percentage of Test Items Rated at Each DOK Level, by Subject and Item Format
Figure 4.4. Procedures to Determine Whether a State Test Qualified as a Deeper Learning Assessment
Figure 5.1. Estimated Percentage of Students Assessed on Selected Deeper Learning Skills in Reading Nationwide with Different Cutoff Percentages for MC Items
Figure 5.2. Estimated Percentage of Students Assessed on Selected Deeper Learning Skills in Mathematics and English Language Arts Nationwide with Different Cutoff Percentages for MC Items ... 29
TABLES

Table 2.1. Comparisons of Candidate Tests Considered for This Project ... 7
Table 4.1. Percentage of Released State Mathematics Test Items at Each DOK Level, by State and Question Format
Table 4.2. Percentage of Released State Reading Test Items at Each DOK Level, by State and Question Format
Table 4.3. Percentage of Released State Writing Test Items at Each DOK Level, by State and Question Format
Table B.1. Percentage of Released State Mathematics Test Items Rated at Each DOK Level, by State and Grade Level, and Deeper Learning Assessment Classification Results
Table B.2. Percentage of Released State Reading Test Items Rated at Each DOK Level, by State and Grade Level, and Deeper Learning Assessment Classification Results
Table B.3. Percentage of Released State Writing Test Items Rated at Each DOK Level, by State and Grade Level, and Deeper Learning Assessment Classification Results ... 53
SUMMARY

In 2010, the William and Flora Hewlett Foundation's Education Program initiated its strategic Deeper Learning Initiative, which focuses on students' mastery of core academic content and their development of deeper learning skills (i.e., critical-thinking, problem-solving, collaboration, communication, and learn-how-to-learn skills). One of the goals of the Deeper Learning Initiative is to increase the proportion of U.S. elementary and secondary students nationwide being assessed on deeper learning skills to 15 percent within five years. The Foundation asked RAND to conduct a study to examine the percentage of U.S. elementary and secondary students being assessed on deeper learning skills at the beginning of the Deeper Learning Initiative.

ABOUT THE STUDY

Selection of State Mathematics and English Language Arts Tests in 17 States

To estimate the percentage of U.S. elementary and secondary students assessed on deeper learning, we had to identify measures of student learning to include in the analysis. Moreover, we needed access to information about the test items and the number of test takers for each measure. We started by searching for tests for which these two types of information were publicly available. We conducted a literature review and an online information search, consulted educational assessment experts, and considered a variety of tests for inclusion in the analysis, such as statewide achievement tests, Advanced Placement (AP) tests, International Baccalaureate (IB) exams, and benchmark tests. Among the tests we considered, information about both the test items and the number of test takers was available only for state achievement tests. Therefore, state achievement tests were the only type of student measure that we could include in this project.
Given the available project resources, it was not feasible to analyze the state achievement tests of all states, so we had to prioritize by focusing on a group of states whose achievement tests had higher probabilities of assessing deeper learning than those used in other states. We conducted a literature review on the design, format, and rigor of statewide achievement assessments. Prior literature suggested 17 states whose achievement tests were more cognitively demanding and might have a higher probability of assessing deeper learning. Because statewide mathematics and English language arts tests are administered to students in grades 3–8 and in one high school grade level in
most states, our analyses of the items focused on mathematics and English language arts tests at these grade levels in these 17 states.

Using Webb's Depth-of-Knowledge Framework to Analyze the Cognitive Processes of Selected Deeper Learning Skills

The manner in which students are assessed on the state exams restricted our analysis to two types of deeper learning skills: critical-thinking and problem-solving skills and written communication skills. To determine the extent to which each state test measures these deeper learning skills, we reviewed multiple frameworks that had been used to describe the cognitive processes of test items and learning tasks. The frameworks we reviewed included Norman Webb's (2002a) four-level Depth-of-Knowledge (DOK) framework; Andrew Porter's (2002) five-level cognitive rigor framework; Karin Hess et al.'s (2009) matrix that combines Webb's DOK framework and Bloom's Taxonomy of Educational Objectives; Newmann, Lopez, and Bryk's (1998) set of standards to evaluate the cognitive demand of classroom assignments and student work; and Lindsay Matsumura and her colleagues' (2006) instructional quality assessment toolkit to measure the quality of instruction and the cognitive demand of student assignments. Although these frameworks differed in structure and purpose, they all focused on describing the cognitive rigor elicited by the task at hand. Therefore, we decided to assess whether a state test met the criteria for a deeper learning assessment based on the cognitive rigor of its test items. Among the five frameworks we reviewed, Webb's DOK framework is the most widely used to assess the cognitive rigor of state achievement tests and best suited the needs of this project. Therefore, we adopted Webb's DOK framework to analyze the cognitive rigor demanded by state tests.
Webb defined four levels of cognitive rigor: level 1 represents recall, level 2 represents demonstration of a skill or concept, level 3 represents strategic thinking, and level 4 represents extended thinking. We applied Webb's subject-specific descriptions of each DOK level for mathematics, reading, and writing in our analysis. Our review of the DOK framework suggested that the cognitive demands associated with DOK level 4 most closely match the Deeper Learning Initiative's notion of deeper learning, so we used DOK level 4 as our indicator that a test item measures deeper learning.
FINDINGS

The Overall Rigor of State Mathematics and English Language Arts Tests in 17 States Was Low, Especially for Mathematics

For each state test, we applied Webb's DOK framework to analyze the cognitive rigor of individual test items and summarized the percentage of items that met the criteria for each DOK level. Two researchers and two subject experts rated the cognitive rigor of more than 5,100 released state test items using Webb's DOK framework, with two raters per subject. The inter-rater reliability was high (above 0.90) for both subjects. In general, the cognitive rigor of state mathematics and English language arts tests was low. Most items were at DOK level 1 or 2. Open-ended (OE) items had a greater likelihood of reaching DOK level 3 or 4 than did multiple-choice (MC) items. Figure S.1 shows the average percentage of test items at each DOK level by subject and item format.

Figure S.1. Percentage of Test Items at Each DOK Level, by Subject and Item Format
[Bar chart: for each subject and question type (Math MC, Math OE, Reading MC, Reading OE, Writing MC, Writing OE), the percentage of items at DOK levels 1 through 4, on an axis from 0 to 100 percent.]

MC and OE items had different likelihoods of being rated at DOK level 4, so we set two different criteria for a test to be considered a deeper learning assessment, taking the question format into account. Criterion A was stricter; it required that at least 5 percent of MC items were rated at DOK level 4 and that at least one OE item was rated at DOK level 4. Criterion B was less strict; it required that at least 5 percent of MC items were rated at DOK level 4 or that at least one OE item was rated at DOK level 4. We chose 5 percent as the
cutoff level for MC items because it is the mean (and median) percentage of reading items rated at DOK level 4 on state reading tests across the 17 states. We judged each test separately against the two criteria, giving us a range of results depending on how strictly a deeper learning assessment was defined. None of the state mathematics tests we analyzed met either criterion. Depending on the criterion used, between 1 and 20 percent of the state reading tests we analyzed qualified as deeper learning assessments, as did some of the state writing tests.

Only 3–10 Percent of U.S. Elementary and Secondary Students Were Assessed on Selected Deeper Learning Skills Through State Mathematics and English Language Arts Tests

Using our DOK coding results and student enrollment data from the National Center for Education Statistics, we estimated the percentage of U.S. elementary and secondary students assessed on deeper learning skills in mathematics, reading, and writing, under the assumption that none of the tests in the states not analyzed in this study measure deeper learning. We found that, through state tests, 0 percent of U.S. students were assessed on deeper learning in mathematics, 1–6 percent in reading, and 2–3 percent in writing. Overall, 3–10 percent of U.S. elementary and secondary students were assessed on deeper learning on at least one state assessment. We also estimated the percentage of students assessed on deeper learning under different cutoff percentages for MC items. Results showed that with a cutoff of 4 percent or higher for MC items, the final estimate of the percentage of U.S. elementary and secondary students assessed on deeper learning through the state mathematics and English language arts tests stayed approximately the same.
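The two classification rules described above reduce to a simple boolean test. The following is an illustrative sketch only: the function name and the example figures are the editor's, not the study's software, though the thresholds mirror Criteria A and B.

```python
# Sketch of Criteria A and B for classifying a state test as a
# deeper learning assessment. Hypothetical function; illustrative only.

def is_deeper_learning_assessment(
    pct_mc_dok4: float,   # percentage of MC items rated at DOK level 4
    n_oe_dok4: int,       # number of OE items rated at DOK level 4
    cutoff: float = 5.0,  # cutoff percentage for MC items
    strict: bool = True,  # True applies Criterion A; False, Criterion B
) -> bool:
    mc_ok = pct_mc_dok4 >= cutoff
    oe_ok = n_oe_dok4 >= 1
    # Criterion A requires both conditions; Criterion B requires either.
    return (mc_ok and oe_ok) if strict else (mc_ok or oe_ok)

# A test with no DOK level 4 MC items but one DOK level 4 OE item
# fails Criterion A yet passes Criterion B.
print(is_deeper_learning_assessment(0.0, 1, strict=True))   # False
print(is_deeper_learning_assessment(0.0, 1, strict=False))  # True
```

This makes the report's finding concrete: because OE items were far more likely to reach DOK level 4, the choice between the conjunctive and disjunctive rule drives the range of qualifying tests.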
INTERPRETING THE RESULTS

As described above, the types of assessments analyzed and the range of deeper learning skills evaluated in this study were limited by the requirement for access to both test items and counts of test takers, and by the current state of the assessment landscape for deeper learning skills. The criterion used to define what counts as deeper learning is also limited in its capacity to capture deeper learning comprehensively. Moreover, cognitive rigor represents only one dimension of deeper
learning. Thus, the study as enacted is somewhat different from the one envisioned at the beginning. In addition, several caveats are worth noting when interpreting the results of this analysis.

First, a lack of information about the test items and the number of test takers for other types of tests, such as AP, IB, and benchmark tests, prevented us from examining the extent to which these tests measure deeper learning skills. This constraint likely means that our findings underestimate the percentage of students assessed on deeper learning skills in our sample of states.

Second, the content and format of state achievement tests and resource constraints did not allow us to analyze mastery of core content, collaboration, oral communication, or learn-how-to-learn skills. Although omitting these deeper learning skills might have caused us to overestimate the percentage of state tests that meet the criteria for deeper learning assessments, doing so allowed us to conduct a meaningful analysis of the extent to which current state tests measure other important aspects of deeper learning.

Third, given the available project resources, we had to prioritize by focusing on the tests of 17 states identified by prior studies as more rigorous than those used in the other two-thirds of U.S. states. We assumed that the results about the rigor of the 17 state tests published in prior reviews were accurate and that the tests' level of rigor had not changed substantially since those reviews were conducted. We also assumed that none of the tests used in the other two-thirds of states would meet the criteria for deeper learning assessments.

Fourth, the determination of whether a state test met the criteria for a deeper learning assessment might be biased because the full test form was not available in some states, and the unreleased items might differ from the released items in the extent to which they measure deeper learning skills.
However, the issue of partial test forms is unavoidable: there are a number of reasons that states do not release full test forms, and we could work only with the items they did release.

Fifth, we assessed whether a state test met the criteria for a deeper learning assessment based on the percentage or number of test items rated at the highest DOK level. We also considered using the portion of the total test score accounted for by DOK level 4 items to represent the cognitive rigor of a state test. However, we could not use this measure because some states did not provide the number of score points for released items or the total score of a state test.

Sixth, the choice of the cutoff percentage of MC items rated at DOK level 4 is admittedly arbitrary. Our analysis of different cutoff percentages showed that raising or
lowering the cutoff by one or two percentage points did not substantially change the estimate of the percentage of U.S. elementary and secondary students assessed on deeper learning through state tests.

NEXT STEPS FOR THE FOUNDATION

This study has implications for the Foundation as it moves to gauge progress toward the Deeper Learning Initiative's goal of increasing the percentage of students assessed on deeper learning skills to at least 15 percent within five years.

First, due to a number of constraints with the state achievement tests, we were able to assess only limited aspects of deeper learning. Future evaluations of the Deeper Learning Initiative may encounter the same types of challenges as this benchmark study, such that only a limited set of deeper learning skills can be examined.

Second, given the lack of credible standardized measures of intrapersonal or interpersonal competencies within the Deeper Learning Initiative, and of information about the number of students taking such measures, the Foundation may not be able to assess the full range of deeper learning skills outlined in the Deeper Learning Initiative without making trade-offs with respect to psychometric properties, costs, and other considerations.

Third, because of the interdependence between critical-thinking and problem-solving skills and fluency with the core concepts, practices, and organizing principles that constitute a subject domain, it is necessary to develop an analytic framework that would allow an analysis of the mastery of core conceptual content as integrated with critical thinking and problem solving. Although this task was beyond the scope of the time and resources available for this study, future studies examining the Foundation's Deeper Learning Initiative should consider frameworks that define fundamental concepts and knowledge for each subject area.
Finally, it is unclear whether there are any conceptual frameworks that can account for interpersonal competencies, such as complex communication and collaboration, or intrapersonal competencies, such as persistence and self-regulation. Because these dimensions were not included on the state tests, we did not examine any frameworks that assess these skills. However, considering the lack of standardized measures of these skills, there is also likely to be a dearth of analytic frameworks that can be used to study these competencies. As curricula, instructional practices, and assessment methodologies evolve to respond to the full range of deeper learning competencies, the analytic frameworks will need to evolve as well.
ACKNOWLEDGMENTS

This report could not have been completed without assistance from the following individuals. We would like to thank Dr. Christine Massey for her review of an earlier draft; her feedback greatly improved this report. We would also like to thank Dr. Joan Herman for her insightful comments on the analysis plan and the choice of analytical framework. We thank Susanna Berko and Justin Boyle for their help in analyzing the rigor of released state test items. We are also grateful to the sponsor of this project, the William and Flora Hewlett Foundation, and for the support and feedback we received from the program staff.

Several RAND colleagues also contributed to the completion of this report. We thank Cathleen Stasz, Kate Giglio, and Lauren Skrabala for their thoughtful reviews and comments. We would also like to thank Sarah Hauer and Sharon Koga for their administrative assistance during the completion of the report.
ABBREVIATIONS

AP      Advanced Placement
CAE     Council for Aid to Education
CWRA    College and Work Readiness Assessment
DOK     Depth-of-Knowledge
IB      International Baccalaureate
MC      multiple-choice
NECAP   New England Common Assessment Program
OE      open-ended
INTRODUCTION

THE DEEPER LEARNING INITIATIVE

In 2010, the William and Flora Hewlett Foundation's Education Program implemented the Deeper Learning Initiative, which emphasizes students' mastery of core academic content and their development of deeper learning skills. The initiative focuses on enabling students to attain the following types of deeper learning skills:

1. Master core academic content: Students will develop a set of disciplinary knowledge, including facts and theories in a variety of domains and the language and skills needed to acquire and understand this content.

2. Think critically and solve complex problems: Students will know how and when to apply core knowledge by employing statistical reasoning and scientific inquiry to formulate accurate hypotheses, offer coherent explanations, and make well-reasoned arguments, along with other skills. This category of competencies also includes creativity in analyzing and solving problems.

3. Work collaboratively: Students will cooperate to identify or create solutions to societal, vocational, and personal challenges. This includes the ability to organize people, knowledge, and resources toward a goal and to understand and accept multiple points of view.

4. Communicate effectively: Students will be able to understand and transfer knowledge, meaning, and intention. This involves the ability to express important concepts, present data and conclusions in writing and to an audience, and listen attentively.

5. Learn how to learn: Students will know how to monitor and direct their own work and learning.

As part of its efforts to promote deeper learning, the Foundation also launched a series of research activities. One aspect of the Foundation's efforts to increase students' exposure to deeper learning is the development of seven model school networks that embody the deeper learning approach (see Yuan and Le, 2010, for more details on these model school networks).
Although the networks vary with respect to organizational structure, training and supports, and student populations served, they share commonalities in their design principles for promoting deeper learning. Namely, all the networks emphasize small learning communities, personalized learning opportunities, connections to the real world, and college and work readiness as core principles of deeper learning. The networks also indicate that they often used authentic
assessments, such as student portfolios, performance during workplace internships, and cross-curricular team-based projects, to evaluate student learning.

Another aspect of these research activities included estimating the percentage of elementary and secondary students nationwide being assessed on deeper learning at the outset of the Deeper Learning Initiative. Although the Foundation currently lacks an estimate of the percentage of students being tested on deeper learning, it believes that the percentage is likely to be very low. Thus, one of its goals is to increase the nationwide percentage of students who are assessed on deeper learning skills to at least 15 percent within five years.

PURPOSE OF THIS STUDY AND STRUCTURE OF THIS REPORT

The purpose of this project was to estimate the percentage of students being tested on deeper learning skills at the beginning of the Deeper Learning Initiative. These estimates are intended to serve as baseline measures to gauge the progress of the initiative toward its 15-percent benchmark in five years.

We conducted this project in four steps. Because the types of measures that could feasibly be included in our project restricted the types of deeper learning skills we could analyze, we started by searching for tests that we could analyze for this study. Chapter Two provides detailed descriptions of the tests included in our analysis, focusing on the process, criteria, and rationale for their selection. After finalizing the choice of tests, we identified a framework to analyze the extent to which state tests measure deeper learning. In Chapter Three, we describe the types of deeper learning skills we analyzed and how we chose the framework. With the tests and the analytical framework chosen, we applied the framework to the selected tests to assess the extent to which they measured deeper learning skills.
Chapter Four presents our results about the degree to which the selected tests measure deeper learning skills and describes how we assessed whether they qualified as deeper learning assessments. Our final step was to estimate the percentage of U.S. elementary and secondary students assessed on deeper learning skills, based on whether the selected tests qualified as deeper learning assessments and on the number of students assessed by each test at the beginning of the Deeper Learning Initiative. Chapter Five presents our estimation results and discusses the caveats and limitations of this project.
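The final estimation step described above amounts to an enrollment-weighted calculation: the share of students assessed on deeper learning is the number of students who took a qualifying assessment divided by the total number of students. The sketch below is illustrative only; all state names and enrollment figures are made up, and it simplifies the study's test-by-test aggregation by counting a state's students once if any of its tests qualify.

```python
# Illustrative sketch of the aggregation step with made-up figures.
# Consistent with the report's assumption, states whose tests were not
# analyzed (or did not qualify) are marked False.

states = [
    # (state, has at least one qualifying assessment, K-12 enrollment)
    ("State A", True,  1_500_000),
    ("State B", False, 3_000_000),
    ("State C", True,    500_000),
    ("State D", False, 5_000_000),
]

total = sum(n for _, _, n in states)
assessed = sum(n for _, qualifies, n in states if qualifies)
pct = 100 * assessed / total
print(f"{pct:.0f} percent of students assessed on deeper learning")  # 20 percent
```

Running the same calculation under Criterion A and Criterion B, with different MC cutoffs, yields the ranges of estimates reported in Chapter Five.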
The report also includes two appendixes, which provide sample test items aligned with each level in our analytical framework and present detailed findings by state, grade level, subject, and classification, respectively.
IDENTIFYING TESTS FOR ANALYSIS

Educators use a variety of assessment tools to measure student learning, such as homework, formative assessments, interim or benchmark tests, and state achievement tests. These tests vary substantially in many ways: for example, in their purposes, formats, and frequency and breadth of administration. Moreover, the use of these tests varies by school, district, and state. Not all measures used to assess student learning are analyzable in a project such as this, so our first step was to identify tests for analysis. In this chapter, we describe the process and criteria for choosing the tests included in our study.

Ideally, the measures of student learning outcomes that we analyzed would reflect the core principles set forth by the model school networks. However, as we conducted our search for measures that would be available for an analysis of deeper learning, it became apparent that the types of assessments favored by the networks could not be analyzed on a large-scale basis. For example, previous studies that attempted to examine student portfolios statewide reported prohibitive costs (Koretz et al., 1994) and unreliability in scoring (Reckase, 1995). Because project resources did not allow us to analyze the types of authentic assessments that would likely capture all aspects of deeper learning on a large-scale basis, our analysis is restricted to the types of skills and knowledge likely to be assessed by tests administered on a large scale. Specifically, to estimate the percentage of students assessed on deeper learning skills, we needed access to information about test items and the number of test takers. This information allowed us to determine the extent to which a test measured deeper learning skills and how many students were tested on these skills. We set out to identify student assessments for which both types of information were publicly or readily available.
CRITERIA FOR SELECTION

We conducted a literature review and an online information search to identify tests to be included in the analysis. We considered a variety of tests that are commonly used to assess student learning in any given school year, such as statewide achievement tests, exams used by special assessment consortia (such as the New York Performance Assessment Consortium), district-level assessments, and other tests currently used in K-12 schools that might measure deeper learning skills, such as the Advanced Placement (AP) tests, International Baccalaureate (IB) exams, and the College and Work Readiness Assessment (CWRA) developed by the Council for Aid to Education (CAE).1

We examined the availability of information about test items and test takers for each candidate test to determine the feasibility of including it in this study. Our analysis showed that information about both the test items and the number of test takers is readily available only for state tests. The other types of candidate tests we considered lacked one or both types of information and thus could not be included (see Table 2.1). For example, we considered AP tests and IB exams, for which the test content was readily available; however, information about the number of test takers in a given year is not publicly available. Although each school district may collect student-level information regarding which students took the AP or IB exams, it was beyond the scope of this project to collect such information from all school districts nationwide.

Benchmark tests (also referred to as interim assessments) are used by many school districts to monitor student performance on a monthly or quarterly basis to predict students' performance on the state achievement tests (Goertz, Olah, and Riggan, 2009). The format and rigor of the test items are usually similar to those of the state achievement tests, so students' performance on the benchmark tests provides useful information about their likely scores on the state tests. School districts make local decisions about whether to administer benchmark tests to monitor student performance, and the specific grades, years, and subject areas in which these tests are administered can vary by district or even by school. This variability made it impossible for us to identify the school districts nationwide in which benchmark tests were used at the beginning of the Deeper Learning Initiative.
Moreover, for proprietary reasons, commercial test developers do not provide public access to benchmark test items. Without access to both the benchmark test items and the number of students who took these tests, we could not include benchmark assessments in the analysis.

Performance assessments were another type of test that we considered. Such tests measure student learning through constructed-response tasks (Stecher, 2010), while state

1 We did not consider national or international tests (such as the National Assessment of Educational Progress and the Program for International Student Assessment) because these tests are administered every three or four years, not in one particular school year. Including these tests in this type of analysis might cause the estimate of U.S. elementary and secondary students assessed on deeper learning skills to fluctuate according to the year in which these tests are administered.
Undergraduate Psychology Major Learning Goals and Outcomes i Goal 1: Knowledge Base of Psychology Demonstrate familiarity with the major concepts, theoretical perspectives, empirical findings, and historical
The Critical Skills Students Need 2 Why do I keep calling these critical skills? I use the term critical here, and in the title of the book, in two ways. First, I believe the skills I mentioned in Chapter
Attachment A The following table provides information on student teaching requirements across several states. There are several models for these requirements; minimum number of weeks, number of required
How does classroom assessment help teachers and students? Classroom assessments can help teachers plan and implement effective instruction and can help students learn at deeper and higher levels. Assessments
1 Conley, D. T. (2010). College and Career Ready: Helping all Students Succeed Beyond High School. San Francisco: Jossey Bass. (Abstract prepared for AVID Postsecondary by Harriet Howell Custer, Ph.D.)
NEW YORK STATE TEACHER CERTIFICATION EXAMINATIONS TEST DESIGN AND FRAMEWORK September 2014 Authorized for Distribution by the New York State Education Department This test design and framework document
Example s World Languages and Foreign Languages Pilot Priority Grades Pre-Kindergarten High School 1 Example s Table of Contents World Languages Grades Pre-Kindergarten 1... 5 Nebraska Designing s (K-12
H&G The University of the State of New York THE STATE EDUCATION DEPARTMENT Albany, New York 12234 GENERAL INFORMATION INFORMATION BOOKLET FOR SCORING REGENTS EXAMINATIONS IN GLOBAL HISTORY AND GEOGRAPHY
MARSHALL UNIVERSITY COLLEGE OF EDUCATION TEACHER EDUCATION PROGRAM Revised: August, 2011 All teacher candidates will create a Teacher Candidate Work Sample (TCWS) analyzing a sample of their work and the
U.S. Department of Education March 2014 Participation and pass rates for college preparatory transition courses in Kentucky Christine Mokher CNA Key findings This study of Kentucky students who take college
Supporting State Efforts to Design and Implement Teacher Evaluation Systems A Workshop for Regional Comprehensive Center Staff Hosted by the National Comprehensive Center for Teacher Quality With the Assessment
Building Critical Thinking Skills in General Education and Career Programs Wayne County Community College District Presented By: Mary Mahoney Jo Ann Allen Nyquist College Definition of Critical Thinking
Degree- Level Expectations and Course Learning Outcomes Introduction to DLEs Degree-Level Expectations (DLEs) are a threshold framework for the expression of the intellectual and creative development of
NAME: Lesson Plan Template DATE: Updated 11/14/2011 Title of Lesson: Subject Grade Level DOK Level(s) Unit Length Resources/Supplies needed for this lesson- GLE/CLE Code & Specific Expectation / Process
Chemistry Department Policy Assessment: Undergraduate Programs 1. MISSION STATEMENT The Chemistry Department offers academic programs which provide students with a liberal arts background and the theoretical
Alignment of Taxonomies Bloom s Taxonomy of Cognitive Domain Bloom s Taxonomy Cognitive Domain Revised Cognitive Demand Mathematics Cognitive Demand English Language Arts Webb s Depth of Knowledge Knowledge
Working Paper Evaluation of the College Bound Program: Early Findings Vi-Nhuan Le, Louis T. Mariano, Susannah Faxon-Mills RAND Education WR-971-COBND February 2013 Prepared for TG and the College Bound
TOOL 1 HESS COGNITIVE RIGOR MATRIX (READING CRM): Applying Webb s Depth-of-Knowledge Levels to Bloom s Cognitive Process Dimensions Revised Bloom s Taxonomy Webb s DOK Level 1 Recall & Reproduction Remember
USING THE PRINCIPLES OF ITIL ; SERVICE CATALOGUE Examination Syllabus V 1. October 009 ITIL is a Registered Trade Mark of the Office of Government Commerce in the United Kingdom and other countries APMG
Teacher Evaluation Missouri s Educator Evaluation System Teacher Evaluation Protocol Introduction Missouri s Educator Evaluation System was created and refined by hundreds of educators across the state.
BS Environmental Science (2013-2014) Program Information Point of Contact Brian M. Morgan (firstname.lastname@example.org) Support for University and College Missions Marshall University is a multi-campus public
The Irish Journal of Psychology 2009 Vol. 30 No. 3-4 pp. 123-131 Copyright 2009 by The Psychological Society of Ireland ISSN 0303-3910 Measuring critical thinking, intelligence, and academic performance
The Praxis Series ebooks The Official Study Guide Interdisciplinary Early Childhood Education Test Code: 0023 Study Topics Practice Questions Directly from the Test Makers Test-Taking Strategies www.ets.org/praxis
Requirements for Level 2 and Level 3 Teaching Certificates in 50 States and the District of Columbia January 20, 2011 Prepared by North Central Comprehensive Center On behalf of Minnesota Department of
Outcome-Based Education Outcome based education A student-centred learning system focusing on measuring student performance. Does not specify any specific style of teaching just that the student be able
Studying for Kaplan Nursing School Entrance Exam Kaplan Nursing School Entrance Exam study guide may be purchased online from these stores: See next page for an ADDITION to the study guide Kaplan and Lippincott,
Exemplary Planning Commentary: Elementary Literacy! This example commentary is for training purposes only. Copying or replicating responses from this example for use on a portfolio violates TPA policies.
Parents Guide to New Tests in Georgia In 2010, Georgia adopted the Common Core State Standards (CCSS) in English language arts and mathematics and incorporated them into the existing Georgia Performance
The Condition of College & Career Readiness 13 Indian Students Indian Students The Condition of College & Career Readiness 13 ACT has been measuring college readiness trends for several years. The Condition
Approaches to learning (ATL) across the IB continuum Through approaches to learning in IB programmes, students develop skills that have relevance across the curriculum that help them learn how to learn.
Industrial Engineering Definition of Tuning Tuning is a faculty-led pilot project designed to define what students must know, understand, and be able to demonstrate after completing a degree in a specific
Chapter 3 - Alternative Assessments Evaluating with Alternative Assessments In the ERIC Digest article "Alternative Assessment and Second Language Study: What and Why?" Charles R. Hancock states "In the
2011 Table of Contents Purposes for the Teacher Work Sample... Contextual Factors prompt and rubric... Design for Instruction prompt and rubric... Analysis of Student Learning prompt and rubric Reflection
May 14, 2015 achieve.org PROFICIENT VS. PREPARED: DISPARITIES BETWEEN STATE TESTS AND THE 2013 NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS (NAEP) Today s economy demands that all young people develop high-level
2013 14 Understanding Your Praxis Scores The Praxis Series Assessments are developed and administered by Educational Testing Service (E T S ). Praxis Core Academic Skills for Educators (Core) tests measure
ALIGNMENT STUDY FINAL REPORT DISTRICT OF COLUMBIA PUBLIC SCHOOLS COMPREHENSIVE ASSESSMENT SYSTEM ALTERNATE ASSESSMENT (DC CAS- ALT) Presented to District of Columbia Public Schools June 2007 Karin Hess,
ASU College of Education Course Syllabus ED 4972, ED 4973, ED 4974, ED 4975 or EDG 5660 Clinical Teaching Course: ED 4972, ED 4973, ED 4974, ED 4975 or EDG 5660 Credit: 9 Semester Credit Hours (Undergraduate),
GLOSSARY of Assessment Terms STATE COLLABORATIVE ON ASSESSMENT AND STUDENT STANDARDS ARTS EDUCATION ASSESSMENT CONSORTIUM (SCASS/ARTS) COUNCIL OF CHIEF STATE SCHOOL OFFICERS This glossary was developed
School Leader Effectiveness Evaluation: Student Learning Objectives (SLO) Guidebook 2014-15 March 11, 2015 1 Contents I. Overview of the Guidebook...3 II. An Introduction to SLOs...3 A. What is an SLO?...4
Get to Know My RE Observe Collect Evidence Mentor Moments Reflect Review Respond Tailor Support Provide Provide specific feedback specific Feedback What does my RE need? Practice Habits Of Mind Share Data
REACH/E2: Warren Township s Enrichment and Gifted and Talented Program Reach/E2 Staff Dr. Susan Cooper, Warren Middle School PJ Jones, Woodland School Dr. Susan Kline, Angelo L. Tomaso School Wendy Piller,
Bloom s Taxonomy Each of the three categories requires learners to use different sets of mental processing to achieve stated outcomes within a learning situation. Benjamin Bloom (1913-1999) was an educational
SECTION 1. GENERAL TITLE 135 PROCEDURAL RULE West Virginia Council for Community and Technical College SERIES 24 PREPARATION OF STUDENTS FOR COLLEGE 1.1 Scope - This rule sets forth minimum levels of knowledge,
appendix 3 curriculum, instruction, and assessment: rigor and relevance The old adage of learning best by experience is really true. Research confirms that more learning occurs when students are immersed
Assessment on the basis of essay writing in the admission to the master s degree in the Republic of Azerbaijan Natig Aliyev Mahabbat Akbarli Javanshir Orujov The State Students Admission Commission (SSAC),
Webb Webb s Depth of Knowledge Guide Career and Technical Education Definitions 2009 1 H T T P : / / WWW. MDE. K 12.MS. US H T T P : / / R E D E S I G N. R C U. M S S T A T E. EDU 2 TABLE OF CONTENTS Overview...
Updated 11 22 13 Question and Answers about the Illinois Common Core Standards Information Regarding the Illinois Common Core English Language Arts Standards The Illinois Learning Standards, based on the
Rover Races Middle School NGSS, Common Core, and 21 st Century Skills Alignment Document WHAT STUDENTS DO: Establishing Communication Procedures. Following Curiosity on Mars often means roving to places
The New Illinois Learning Standards Incorporating the Next Generation Science Standards (NILS-Science) Illinois Vision of K-12 Science Education All Illinois K-12 students will be better prepared for entrance
Building a Better Teacher Appraisal and Development System: Draft Instructional Practice Rubric As a part of the design process for HISD s new teacher appraisal system, the Instructional Practice working
Psychology Learning Goals and Outcomes UNDERGRADUATE PSYCHOLOGY MAJOR LEARNING GOALS AND OUTCOMES: A Report (March 2002) American Psychological Association, Task Force Members: Jane S. Halonen, Drew C.
PROGRAM LEARNING OUTCOMES Rubric for Assessing the Quality of Academic Program Learning Outcomes Criterion Initial Emerging Developed Highly Developed The list of outcomes is The list includes reasonable
The Condition of College & Career Readiness 2013 National 2013 by ACT, Inc. All rights reserved. The ACT college readiness assessment is a registered trademark of ACT, Inc., in the U.S.A. and other countries.
Benchmark Assessment for Improved Learning AN AACC POLICY BRIEF Joan L. Herman, Ellen Osmundson, & Ronald Dietel For a more detailed report, please refer to our Full Report available at aacompcenter.org
A Teacher s Guide: Getting Acquainted with the 2014 GED Test The Eight-Week Program How the program is organized For each week, you ll find the following components: Focus: the content area you ll be concentrating
This fact sheet provides the following: Background Asse ssment Fact Sheet: Performance Asse ssment using Competency Assessment Tools background information about the development and design of the competency
Step by Step Guide to Developing Targeted Professional Development Sessions Aligned with the Core Body of Knowledge Step1. Gather Information 1. If possible, conduct an observation of the early childhood
Stanford SRN Informational booklet Why Performance-Based Assessment? As the vast body of knowledge continues to expand, it is becoming impossible for individuals to keep up with the amount of information
International Baccalaureate Diploma Assessment Policy Holy Heart of Mary High School Philosophy Assessment is the process of collecting information from a variety of sources to measure student attainment
Curriculum and Instruction: A 21st Century Skills Implementation Guide Produced by To succeed in college, career and life in the 21st century, students must be supported in mastering both content and skills.
Initial Achievement Level Descriptors and College Content-Readiness Policy Introduction The Smarter Balanced Assessment Consortium (Smarter Balanced) has developed an interconnected system of initial achievement
HONORS INDEPENDENT STUDY GUIDELINES: According to the Lake Land College catalog course description, independent study is designed to permit the student to pursue a course of study not typically available
SREB State College and Career Readiness Initiative Teacher Development to Increase College and Career Readiness Guidelines and Promising Practices for States College and career readiness is an increasingly
SOUTH DAKOTA BOARD OF REGENTS Policy Manual SUBJECT: Associate Degree General Education Requirements NUMBER: 2:26 1. Associate of Arts Degree A. The general education component of all associate of arts
How To Use the Survey Tools Get the most out of the surveys We have prepared two surveys one for principals and one for teachers that can support your district- or school-level conversations about improving