American Diploma Project Algebra II End-of-Course Exam
Standard Setting Meeting
Chicago, IL
July 22-24, 2009

American Diploma Project Algebra II End-of-Course Exam
Standard Setting Briefing Book
Copyright 2009 Pearson Education, Inc. or its affiliate(s) and Achieve.
1: BRIEFING BOOK OVERVIEW

This section of the Briefing Book includes an overview of 1) the American Diploma Project, 2) the ADP Algebra II End-of-Course Exam, 3) the standard setting process, and 4) the validity studies conducted to inform standard setting.

About the American Diploma Project

In many states today, students can meet the requirements for high school graduation and still be unprepared for success in college or in the workplace. Simply put, current state standards and assessments have not kept pace with the world students are entering after high school. According to a survey commissioned by Achieve, 39 percent of recent graduates enrolled in college and 46 percent in the workforce say there were significant gaps in their preparation for life after high school. Professors and employers estimate that four out of ten graduates are unprepared for college or employment. States, postsecondary institutions, employers and young people spend an estimated $17 billion each year on remedial classes to re-teach material that should have been mastered in high school. The price tag might be acceptable if remediation were a proven fix, but one national study indicates that seven in ten students who take remedial courses in reading and 63 percent of students who take one or two remedial courses in mathematics fail to earn college degrees.

In 2004, Achieve published Ready or Not: Creating a High School Diploma that Counts, the result of two years of research launched by Achieve in partnership with The Education Trust and the Thomas B. Fordham Foundation. The report includes English and mathematics benchmarks that describe the specific content and skills that graduates should have mastered by the time they leave high school if they expect to succeed in postsecondary education or in high-growth jobs. In English, this content is equivalent to four years of grade-level English/Language Arts.
In mathematics, this content is included in courses typically found in Algebra I, Geometry, and Algebra II, although other pathways can be used. At the time of the creation of the ADP Benchmarks, only Arkansas and Texas required an Algebra II-level course for graduation, and even those states had an opt-out clause. Subsequent reports have assessed the rigor of state high school exit exams and high school graduation requirements, and examined the use of advanced math knowledge and skills in a range of workplaces.

The American Diploma Project (ADP) is an initiative created to ensure that all students graduate from high school prepared to face the challenges of work and college. Governors, state superintendents of education, business executives and college leaders are working to bring value to the high school diploma by raising the rigor of high school standards, assessments and curriculum and aligning expectations with the demands of postsecondary education and the workplace.
The ADP Network now includes 35 states, which collectively educate nearly 85 percent of all U.S. public school students: Alabama, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Nebraska, New Jersey, New Mexico, North Carolina, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, Tennessee, Texas, Virginia, Washington and Wisconsin.

[Map of the United States highlighting the ADP Network states]

What Work Have the ADP Network States Committed to Undertake?

Research shows that the ADP benchmarks, which contain skills students need to be successful in college or careers after graduating high school, are significantly more rigorous than current high school standards, resulting in an expectations gap that explains why many high school graduates aren't prepared to succeed when they arrive at college or the workplace. To close the expectations gap, ADP Network states have committed to the following four actions:

1. Align high school standards and assessments with the knowledge and skills required for success after high school.
2. Require all high school graduates to complete a college- and career-ready curriculum, including requiring students to complete a course that covers advanced Algebra, so that earning a diploma assures a student is prepared for opportunities after high school.
3. Build assessments into the statewide system that measure students' readiness for college and careers.
4. Develop an accountability system that promotes college and career readiness.

Although all Network states are committed to a common set of key policy priorities, there's no one-size-fits-all approach. Each state has developed its own action plan for carrying out the agenda.
Each year since the ADP Network was launched in 2005, Achieve has published a report entitled Closing the Expectations Gap to report on the progress the 50 states and DC have made in implementing the pillars of the ADP policy agenda. To date, ADP Network states have led the nation in implementing college- and career-ready policies, and many more are in the process of adopting these policies in their states. In 2008, Achieve released a report that described the emergence of a consistent and common core of knowledge in English and mathematics required of students in states that have adopted college- and career-ready standards. Some of these states have formally aligned their standards to the ADP benchmarks, but others worked independently.
About the ADP Algebra II End-of-Course Exam

As states began work on the ADP agenda, they soon realized they would need new assessments to match their raised expectations. They were also interested in streamlining the transition from high school to college and wanted a test that could signal to high school students whether they were ready for credit-bearing college mathematics courses. To that end, the chief state school officers in a number of ADP states formed the ADP Assessment Consortium to create a rigorous, college-ready Algebra II end-of-course exam. In March of 2007, the original ADP Assessment Consortium members, with the state of Ohio serving as the procurement lead, awarded Pearson the contract to develop and administer the ADP Algebra II End-of-Course Exam. A subset of Assessment Consortium states has also worked together, with Achieve and Pearson, to create an ADP Algebra I End-of-Course Exam.

The consortium represents the largest multi-state collaborative assessment ever undertaken. It is a dramatic departure from past testing practices, in which states developed their own exams based on their own standards, often at considerable individual state expense. Currently the ADP Assessment Consortium includes 15 states: Arkansas, Arizona, Florida, Hawaii, Indiana, Kentucky, Maryland, Massachusetts, Minnesota, New Jersey, North Carolina, Ohio, Pennsylvania, Rhode Island, and Washington. Additional ADP Network states may also join.

Why Algebra II?

Because mastery of the skills contained in an Algebra II course, or an integrated course covering the same content, is important for all high school graduates, the states wanted to be sure that the courses they offered titled Algebra II contained the right content and were at an appropriate level of rigor. Why does Algebra II matter?

- Algebra II fosters problem solving, abstract reasoning and critical thinking skills that are used long after the course ends.
- Algebra II and other higher-level mathematics classes improve access to postsecondary education. Algebra II includes the advanced content that faculty at two- and four-year institutions say is critical for success in credit-bearing college mathematics coursework.
- Students who study mathematics at least through Algebra II in high school are more than twice as likely as those who do not to earn a four-year degree, and the level of mathematics a student reaches in high school is the most accurate predictor of whether that student will earn a Bachelor's degree. In contrast, students who have not mastered Algebra II in high school are more likely to need remediation and, therefore, less likely to complete a college degree.

Why an End-of-Course Exam?

End-of-course exams are attractive to states because they align directly to curriculum standards and the courses students need to take for graduation. End-of-course exams are also more sensitive to instruction than grade-level survey exams because they are taken right after a student has completed a course and can provide teachers with relevant information about students' understanding of the content, enabling teachers to adjust instruction for subsequent classes accordingly. In addition, end-of-course tests serve as a way to ensure consistency and rigor in classrooms within and across states, so that all students are exposed to a rigorous curriculum.
The ADP Algebra II End-of-Course Exam serves as a means to ensure consistency and rigor as the number of students enrolled in the course grows, while simultaneously offering students a signal of readiness that can be valued and utilized by postsecondary institutions.

What is the Purpose of the Exam?

The ADP Algebra II End-of-Course Exam serves three goals:

1. To improve curriculum and instruction and ensure rigor and consistency across states. The test will help classroom teachers focus on the most important concepts and skills in Algebra II and identify areas where the curriculum needs to be strengthened.

2. To serve as an indicator of college readiness. The test is aligned to the ADP mathematics benchmarks and designed to help colleges determine if students are ready to do credit-bearing work. It measures rigorous content and skills students need to enter and succeed in first-year, credit-bearing mathematics courses. Postsecondary institutions will be able to use the results of the test to tell high school students whether they are ready for college-level work, or if they have content and skill gaps that need to be filled before they enroll in college. This information should help high schools better prepare their students for college, and reduce the need for colleges to provide costly remediation courses.

3. To compare performance and progress among the participating states. Having agreed on the core content expectations of Algebra II, states are interested in tracking student performance over time. Each year, Achieve will issue a report comparing performance and progress among the participating states. This report aims to help state education leaders, educators, and the public assess performance, identify areas for improvement, and evaluate the impact of state strategies for improving secondary math achievement.

What Does the Exam Measure?
The Algebra II End-of-Course Exam was created to provide an honest assessment of how well students have mastered the advanced knowledge and skills that are necessary for success in credit-bearing college mathematics courses. By design, the test is challenging. To develop the test, the participating states engaged high school educators and college faculty in all stages of the development of the exam, from the writing of the content standards to reviewing the exam questions to determining how the student responses should be scored.

Common, Rigorous Standards

The standards on which the exam is based were developed collaboratively by the partner states, based largely on the ADP mathematics benchmarks. The ADP Algebra II End-of-Course Exam standards are robust, emphasizing advanced algebra, critical thinking, and problem solving. The mathematics content assessed consists of five key strands. A description of each standard and its associated emphasis on the exam is listed in Table 1. In each strand, students are expected to model and solve problems in context and translate among multiple representations.
Table 1: Algebra II Exam Standards

Standard                              | Topics Addressed                                                                                  | Percentage of Total Points
Operations on Numbers and Expressions | Operations with numbers and algebraic expressions, involving real and complex numbers             | 15%
Equations and Inequalities            | Linear and nonlinear equations and inequalities, and systems of linear equations and inequalities | 20%
Polynomial and Rational Functions     | Quadratic functions and higher-order polynomial and simple rational functions                     | 30%
Exponential Functions                 | Exponential functions and basic logarithms and their relationship to exponents                    | 20%
Function Operations and Inverses      | Combinations and inverses of functions                                                            | 15%

The emphasis a particular standard was given on the exam is related directly to the emphasis that topic should be given in an Algebra II classroom.

Item Types

The operational core test includes 55 test questions:

- 46 multiple-choice (1 point each)
- 6 short-answer (2 points each)
- 3 extended-response (4 points each)

In response to state requests, the test was designed so that thirty percent of the score is based on the short-answer and extended-response items. Additional field-test items are embedded in the operational form but do not count toward the student score.

Calculator Use

In developing the content of the exam, the mathematics experts felt it was necessary for students to demonstrate fluency in mathematics both with and without the use of technology. As a result, the exam is structured into two sections: one that allows a calculator, and one that does not. Although not required, the use of a graphing calculator is highly recommended on the calculator section.

Room for Growth

The ADP Algebra II End-of-Course Exam content standards identify content for the core exam as well as content for seven optional modules. The modules were developed to further challenge students and to enable growth of the ADP Algebra II End-of-Course Exam beyond the traditional Algebra II curriculum. These include:

1. Data and Statistics
2. Probability
3. Logarithmic Functions
4. Trigonometric Functions
5. Matrices
6. Conic Sections
7. Sequences and Series

The modules have been field tested but have not been administered operationally. Only the core exam will be considered for standard setting.
How Can Postsecondary Institutions Use This Exam?

The ability of the K-12 system to raise standards and adopt a more rigorous curriculum depends heavily on buy-in and support from postsecondary institutions and systems, in sending signals to students, parents, schools and communities that improving standards and achievement in K-12 is essential to students' postsecondary success. Higher education's voice in support of raising standards, often in concert with the business community, has led many states to stronger curricula for all students; the same voices are equally if not more important in making the case for more rigorous high school assessments.

Although the Algebra II End-of-Course Exam is not intended as a replacement for college mathematics placement tests or institutional placement instruments such as ACCUPLACER and COMPASS, colleges can use the Algebra II test as an indication of readiness for entry-level credit-bearing mathematics courses such as College Algebra. (For higher-level courses such as calculus or pre-calculus, other placement instruments will be necessary.) For example, for a student who does well on the Algebra II exam, a college or department might exempt the student from taking a separate placement test for the first credit-bearing mathematics course or courses, provided s/he continues to take advanced mathematics every year until graduation. This would reduce the number of tests a student would take, decrease the cost to colleges and departments of administering tests, and have the added benefit of signaling to high school students that the exam is valued by postsecondary institutions and thus worth taking seriously both by students and their high schools.
When given early enough in a student's high school career, and when results are provided soon after administration, the assessment can also signal to students and schools where improvement, further instruction, and support are needed to help students reach college readiness in mathematics by the time of high school graduation.

How Does the Content Compare to Other Standards and Exams?

Although states are in the process of adopting college- and career-ready standards, the rigorous nature of the ADP Algebra II End-of-Course Exam's items, and the exam's purpose of measuring mastery of advanced Algebra content, including the knowledge and skills necessary to be successful in first-year credit-bearing college mathematics courses, make this exam more rigorous than most statewide high school exams. Many state exams, as well as college admissions and placement tests, are survey exams that often measure a broad range of mathematics content less advanced than that found in an Algebra II course. Relatively few states have developed their own Algebra II end-of-course exams, which would be more closely tied to the curriculum and the courses that the states require for graduation.

Appendix B includes results from four studies conducted by Achieve that describe the variability of Algebra II standards within the United States, the limited focus on Algebra II content found on existing college admissions and placement tests, the Algebra II content included in a sample of College Algebra and Pre-Calculus college courses, and a comparison of the Algebra II exam standards to international standards.
Although Algebra II Standards Vary Within the United States, the ADP Algebra II End-of-Course Exam Standards Address Many Topics Found in State Courses (Appendix B1)

To determine how the ADP Algebra II End-of-Course (EOC) Exam compares to Algebra II expectations from states, Achieve compared the ADP Algebra II End-of-Course Exam standards with the Algebra II course standards and end-of-course exam standards from 15 states. Ten of the states included in the analyses are members of the ADP Network, and two of those ten states are also members of the Assessment Consortium. The data were examined from two different perspectives.
- The first analysis was conducted to understand the degree to which state Algebra II course and exam standards are aligned, as a whole, with the ADP EOC Assessment.
- The second analysis was conducted to better understand how prominently the core ADP Algebra II Exam benchmarks are currently represented in state course and exam standards.

First Analysis

Of the state documents reviewed, most of the state standards documents (18 of 20) are 34% to 66% aligned to the EOC benchmarks. Only two sets of state standards are less than 33% aligned. No set of state standards is 100% aligned to the EOC standards.

Second Analysis

In comparing the 41 EOC benchmarks across the state standards, seven of the EOC benchmarks have counterparts in more than 75% of the state documents analyzed, and nine benchmarks have counterparts in fewer than 25% of the state documents. Of the remaining 25 EOC benchmarks, the majority (16) have counterparts in 50%-74% of the state documents reviewed. No single benchmark is included in all of the state documents.

Summary

The purpose of this study was to determine how the ADP Algebra II Exam compares to Algebra II expectations across the states. The findings indicate that there is great variation in Algebra II courses, and although there are some common topics, the benchmarks and skills vary within those topics. Although the ADP Algebra II End-of-Course Exam aligns partially with the majority of state standards analyzed, there is not perfect alignment, even within the topics on this exam. In other words, the ADP Algebra II End-of-Course Exam, although not a perfect match, addresses many of the topics found in the state standards documents analyzed.

College Admissions and Placement Tests Do Not Fully Measure Algebra II Content (Appendix B2)

As many states raise high school standards, college admissions and placement tests are being used for purposes for which they were not originally designed.
For example, states are using the ACT or SAT as their official statewide high school graduation exam and incorporating these tests into their state assessment and accountability systems. Achieve conducted the study Aligned Expectations? A Closer Look at College Admissions and Placement Tests to help inform the decisions that states are making about high school assessments by providing information about the content included on college admissions and placement tests. Achieve analyzed more than 2,000 questions from college admissions and placement exams to determine how these tests compare to one another and how well they measure the college and work readiness benchmarks created by the American Diploma Project.

For mathematics, the study found that admissions and placement tests emphasize algebra, which is critical for credit-bearing mathematics courses. However, the algebra content assessed tends to emphasize pre-algebra and basic algebra over the advanced algebraic concepts and skills essential for college readiness. Although placement tests, often developed by each institution, are narrowly focused on algebra, admissions tests such as the SAT and ACT are broader, measuring a range of other important topics such as data analysis, statistics, and geometry.
The ADP Algebra II End-of-Course Exam Standards Provide a Solid Foundation for College Algebra and Pre-Calculus College Courses (Appendix B3)

Achieve analyzed college syllabi from 69 College Algebra and 30 Pre-Calculus courses. Professors representing these courses attended one of three judgment studies described in the Judgment Studies section (Tab 4). The purpose of this analysis was to provide information regarding the level of college mathematics course that a student performing at the college-ready level on the ADP Algebra II End-of-Course Exam (EOC) would be prepared to take. Each type of course, college algebra and pre-calculus, was analyzed from two different perspectives.

- For the first analysis, Achieve content experts reviewed each ADP Algebra II EOC benchmark individually across each course, college algebra and pre-calculus, to determine how well the content in the Algebra II exam standards would prepare a student for the content in each course.
- For the second analysis, Achieve content experts reviewed each syllabus, as a whole, against the ADP Algebra II EOC standards to determine the degree of overlap between each syllabus and the Algebra II standards.

First Analysis

When considering each ADP Algebra II EOC benchmark individually against the content topics in the college algebra syllabi, the overlap at the content standard level ranges from 19% to 48%, and the overlap of the content in the pre-calculus syllabi and the Algebra II EOC exam at the standard level ranges from 7% to 48%, indicating that the ADP Algebra II EOC Exam standards will provide a strong mathematics foundation for students entering either a college algebra or pre-calculus course.

Second Analysis

When considering each syllabus, as a whole, in comparison to the ADP Algebra II EOC standards, there is considerable overlap between the Algebra II standards and the college algebra syllabi. On average, the college algebra syllabi overlap in 34% of the 41 benchmarks (14 benchmarks).
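The overlap figures in this study are simple proportions of the 41 core EOC benchmarks. A minimal sketch of that arithmetic (the function name is illustrative, not from the study):

```python
TOTAL_BENCHMARKS = 41  # core ADP Algebra II EOC benchmarks

def overlap_percent(matched_benchmarks):
    """Percent of the 41 core benchmarks matched by a syllabus, rounded."""
    return round(100 * matched_benchmarks / TOTAL_BENCHMARKS)

print(overlap_percent(14))  # college algebra syllabi average 14 matched benchmarks -> 34
print(overlap_percent(10))  # pre-calculus syllabi average 10 matched benchmarks -> 24
```

These reproduce the 34% (college algebra) and 24% (pre-calculus) averages reported for the second analysis.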
There is less content in common between the ADP Algebra II EOC standards and the pre-calculus syllabi. The pre-calculus syllabi average 24% of the benchmarks (10 benchmarks) overlapping with the Algebra II standards.

Summary

Overall, the content covered on the ADP Algebra II End-of-Course Exam would provide a solid foundation for either a college algebra or pre-calculus college-level course. There is a small degree of overlap between the Algebra II Exam standards and the college syllabi that would allow for a review of previous mathematics content at the beginning of a college course, but not so much that a student would be repeating a course already taken. In fact, the college courses are natural extensions of the ADP Algebra II Exam standards. The standards, therefore, seem at the appropriate level, on average, to prepare students for the first credit-bearing college mathematics courses.

The ADP Algebra II Exam Standards Are Not Too Rigorous When Viewed in an International Context (Appendix B4)

Achieve content specialists compared the content and rigor of the core ADP Algebra II End-of-Course Exam (EOC) Standards with upper-level secondary mathematics standards in eleven countries. Four analyses were included in this study:

- The first analysis compared the distribution of mathematics content across the Number, Algebra, Geometry/Measurement, and Data Analysis/Statistics strands.
- The second comparison evaluated the level of algebra content, classified as either pre-algebra, basic algebra or advanced algebra.
- The third analysis evaluated the international grade placement of the mathematics content.
- The fourth analysis determined the level of rigor of all of the reviewed standards.

Overall, the ADP Algebra II End-of-Course Exam standards are well in line with international expectations; in fact, a number of countries have standards that are more rigorous. Although not the highest in any of the four analyses, the ADP Algebra II EOC Exam Standards are not the lowest in any of them either. With this in mind, the ADP Algebra II Exam Standards (which were created for a rigorous exam in the United States, where Algebra II is not a mandatory course or exam for many high school students) are average when compared to international standards that all upper secondary students are required to meet in their respective countries. The standards from other countries demand that all of their students master at least this content, if not higher-level mathematics content.

Administration of the ADP Algebra II End-of-Course Exam

Most of the states in the ADP Assessment Consortium are in the process of developing and adopting policies that will govern participation in the exam and address how the exam results will ultimately be used. For the first operational administration in the spring of 2008, states used a variety of approaches, both in terms of who decided which students would take the exam (school, district, or state decision) and how the cost of the exam would be funded (district or state). Two states (Arkansas and Hawaii) required all students who took an Algebra II course that school year to take the exam in spring 2008. Arkansas was one of 20 states at that time that required all students, starting with the class of 2010, to complete Algebra II to graduate from high school. Hawaii is working with both its postsecondary institutions and employers to create incentives for students to complete a set of core curriculum courses, including Algebra II.
Other states chose to pilot the exam with only a subset of their students, as they considered how to incorporate it into their broader assessment systems. Since the initial 2008 operational administration, Indiana has increased its participation and, like Arkansas and Hawaii, plans to require all students completing an Algebra II course to take the exam.

Spring 2008 Results

Nearly 90,000 students from 12 states participated in the first operational administration of the ADP Algebra II End-of-Course Exam in the spring of 2008. The following tables provide results from the initial administration and reflect that, overall, performance across the states was low. This data is shared to provide context about the difficulty of the exam. The results from the spring 2009 administration will be shared with panelists after round 2 of standard setting, and will be reported to schools, districts, and states on August 24, 2009. Participation in the spring 2009 exam increased by 10% from the previous spring; over 100,000 students from 13 states participated in the spring 2009 administration.
Table 2: Average Number of Points and Percent Correct, by Question Type and State

The table reports, for each state and for the total group, the average number and percent of points correct on the multiple-choice items (46 possible points), the constructed-response items (30 possible points), and all items combined (76 possible points). Rows: Total, AZ, AR, HI, IN, KY, MN, NJ, NC, OH, PA, RI, WA. [Individual cell values were not preserved in this transcription.]

Table 3: Average Number of Points and Percent Correct, by Content Standard

Content Standard                      | Average Percent of Points Correct
Operations on Numbers and Expressions | [value not preserved]
Equations and Inequalities            | 25.0%
Polynomial and Rational Functions     | 31.8%
Exponential Functions                 | 27.6%
Function Operations and Inverses      | 17.9%
Table 4: Average Number of Points and Percent Correct, by Grade Level

Grade Level  | Number of Students Tested | Average Points Correct (of 76) | Average Percent Correct
Grade Eight  | [not preserved]           | [not preserved]                | [not preserved]
Grade Nine   | 5,[...]                   | [not preserved]                | [not preserved]
Grade Ten    | 27,[...]                  | [not preserved]                | [not preserved]
Grade Eleven | 42,[...]                  | [not preserved]                | [not preserved]
Grade Twelve | 10,[...]                  | [not preserved]                | [not preserved]

Note: While Maryland and Massachusetts were part of the Assessment Consortium, they did not participate in the spring 2008 administration. Florida did not join the Assessment Consortium until after the spring 2008 administration.

Reliability

Table 5 below shows the Cronbach's alpha (Alpha), mean raw score (Mean), standard deviation of the raw score (S.D.), standard error of measurement (SEM), average item difficulty (p-value), and average point-biserial correlation (Pt.Bis.) for the overall group of test-takers for the spring 2008 operational administration. The alpha of 0.87 is consistent with other large-scale assessments and meets industry standards for reliability. The mean raw score was about 27% of the 76 possible points. The average p-value was 0.38, which is further evidence of the rigor of this content.

Table 5: Spring 2008 Overall Reliability
[Table values were not preserved in this transcription.]

Interpreting the Results

While performance was low across the states, the ability to interpret the spring 2008 test results and make comparisons across states was limited for several reasons. First, performance standards or cut scores had not yet been established, so there was no clear basis for interpreting the results. Scores were reported based on the percentage of test questions answered correctly. A second challenge in interpreting and comparing the results was that the number of test takers varied significantly across states. In some states all students enrolled in an Algebra II course took the exam, while in others just a subset participated. It is expected that as the states continue to expand their use of the test, the number of test takers will increase and the results will yield more comparable data.
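Cronbach's alpha, the reliability statistic reported above, measures internal consistency from the item-level score matrix. A minimal sketch of the standard formula (illustrative code, not Pearson's implementation):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_examinees x n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item score variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinees' raw totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Applied to the full spring 2008 student-by-item score matrix, this computation is what yields the reported alpha of 0.87.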
When looking at the data in aggregate, initial findings from the first administration suggest:

- Student performance was low across all states and in all content strands.
- Constructed-response items are a particular challenge for students.
- Students who take Algebra II in earlier grades perform better.
- Student motivation may have impacted performance.
- States vary in their policies regarding course requirements and data collection.
Next Steps for Policymakers

Achieve issued The American Diploma Project Algebra II End-of-Course Exam: 2008 Annual Report in August 2008, which included the following next steps for policymakers:

- Strengthen K-12 standards.
- Provide supports to teachers and students.
- Make college and career readiness the focus of high school assessment and accountability systems.
- Use the ADP Algebra II Exam to determine if students are prepared to take credit-bearing college mathematics courses, specifically College Algebra or Pre-Calculus.

About the Algebra II Standard Setting Process

Until now, the ADP Algebra II End-of-Course Exam results have been reported in terms of raw scores. This was true for the Spring 2008 and End-of-Fall 2008 administrations. A total of 76 points were possible. Following standard setting, results from the Spring 2009 administration, and future administrations, will be reported in terms of scaled scores and along three performance levels.

What is Standard Setting?

Standard setting is the process of taking the continuum of student performance on an assessment and separating it into performance levels. The goal of standard setting is to define the scores on the continuum that separate one performance level from the next. Each performance level indicates a student's proficiency in Algebra II and his/her likely preparedness for first-year credit-bearing college mathematics courses. The ADP Algebra II Exam includes three performance levels:

- Needs Preparation
- Prepared
- Well Prepared

The knowledge, skills, and abilities required at each performance level are described in the Mapping to Performance Level Descriptors section (Tab 5) and are referred to as Performance Level Descriptors (PLDs). The PLDs were drafted by higher education mathematics faculty and further revised by Pearson, Achieve, and the Assessment Consortium.

Who is Involved in Standard Setting?

The ADP Algebra II standard setting panel includes:
- Fifteen state department of education representatives (one representative from each of the 15 Assessment Consortium states)
- Twelve mathematics and higher education representatives (selected by Achieve)

All panelists have been involved with the exam during its development and/or administration. The panelists represent a combination of mathematics and policy leadership experience.
The standard setting meeting will be facilitated by Pearson, the developer/owner of the ADP Algebra II End-of-Course Exam. At the request of the ADP Assessment Consortium states, Achieve will attend standard setting as an observer and will have final responsibility for setting cut scores for the ADP Algebra I and II End-of-Course Exams based on recommendations from both standard setting groups.

What is the Standard Setting Process for the ADP Algebra II Exam?

Achieve has convened an expert group of advisors to provide technical advice and expertise on critical issues, including how best to define achievement levels and what process to use in setting proficiency standards. This group, called the Research Alliance, is comprised of experts in mathematics, assessment, design, K-12 education policy, and higher education. Based on a number of considerations, in particular the multiple purposes of the exam (to improve high school curriculum and instruction, to provide a common measure across states over time, and to serve as an indicator of readiness for first-year college credit-bearing mathematics courses), the Research Alliance recommended delaying standard setting from July 2008 (after the first operational administration) to July 2009, so that validity study evidence could be collected prior to standard setting and states would have time to incorporate the standards into their curricula. The Research Alliance reasoned that delaying standard setting would ground it in real data about the validity of the exam for indicating college readiness rather than in speculation about what student performance at that level should be. A modified briefing book approach, similar to the one outlined by Haertel (2002)¹, will be used to set standards on the exam.
In this approach, policy makers are provided with a briefing book of all of the relevant data related to standard setting decisions and are asked to explicitly consider the policy implications of the full set of data. Haertel argued that stakeholder participation in a rational and coherent deliberative process is necessary to assure that the appropriate validity argument for performance standards can be satisfied. The briefing book itself (which includes evidence from a number of validity and alignment studies) will serve as the basis for the standard setting exercise. For the ADP Algebra II End-of-Course Exam standard setting, panelists will be provided with empirical data from the validity studies, in the form of a briefing book, to provide a basis for their policy-based cut score recommendations. Panelists will also review the operational items from the Spring 2009 test administration to provide context about the exam, but a traditional item mapping approach will not be followed.

Why Not a More Traditional Approach to Standard Setting?

Setting standards on a mathematics exam (determining what it means to be proficient and what score is necessary to achieve at that level) is typically done by mathematicians who review each item on the exam and its difficulty level and use that information to determine the mix of correct items that constitutes basic, proficient, or advanced performance on the test content. In this case, proficient must also mean prepared to succeed in a first-year credit-bearing college mathematics course at post-secondary institutions. Therefore, standard setting will rely heavily on data that show the relationship between scores on the Algebra II exam and performance in postsecondary education. In this

¹ Haertel, E. H. (2002). Standard setting as a participatory process: Implications for validation of standards-based accountability programs. Educational Measurement: Issues and Practice, 21(1).
case, an item-based exercise such as a modified Angoff or Bookmark procedure does not by itself provide a sufficient basis for quantifying performance level cuts.

How Is College Readiness Defined?

There is no universally accepted definition of college readiness. Twenty states have currently developed definitions of college readiness. These definitions, aimed at high school students, are couched in terms of courses, skills, standards, and/or test performance (Lloyd, 2009)². A primary purpose of the Algebra II standard setting is to define a score level on the test that indicates college readiness, that is, the level of performance indicating a student is prepared for first-year credit-bearing college mathematics courses. For the research studies summarized in this briefing book, Pearson assumed working definitions of college readiness to assist with data interpretation. For example, in the judgment studies, we asked participating college instructors to think of Prepared students as those who will ultimately earn a B or better in relevant mathematics courses without remediation. The instructors made item-level judgments about performance with these students in mind. For the predictive studies, we focused specifically on the Algebra II test scores as predictors of final course grades of B or higher. These practical definitions are useful in interpreting the results of the research studies and in thinking about their implications for the Algebra II standard setting.

What are the Ground Rules for Standard Setting?

The goal of the Algebra II standard setting is to obtain your judgments about where the Prepared and Well-Prepared cut scores should be placed. As a participant in the standard setting process, you will review the content of the Algebra II test and be exposed to the policy goals that serve as the basis for the test.
Although you may be interested in commenting on the test questions or expressing personal opinions about the goals of the ADP program, the ground rules for the standard setting ask that you refrain from using the meeting time to critique test content or the policy basis for the program. Pearson will have content development staff available at the meeting to accept written comments about specific questions should you feel compelled to provide this feedback.

² Lloyd, S. C. (2009, June 11). Consensus on meaning of readiness remains elusive. Diplomas Count 2009, Education Week, 28(34).
About the Research Used to Inform Standard Setting

Pearson conducted a number of validity studies to better understand how the ADP Algebra II Exam fits into the current landscape of mathematics instruction and assessment across public high schools and two- and four-year public colleges. Three main types of validity studies will be used to inform standard setting:

1. Concurrent studies: Student scores from the Spring 2008 ADP administration were matched to scores on other state and national assessments to establish relationships, including those with existing measures of college readiness.
2. Cross-sectional studies: The ADP Algebra II Exam was administered to students at the beginning of the semester of their college mathematics course and compared to their final grade in the course to determine how well a student's performance on the exam predicts his/her performance in the college math course.
3. Judgment studies: Feedback was gathered from 133 college professors who teach College Algebra and Pre-Calculus courses regarding the relevance of the ADP Algebra II Exam standards to their courses, draft performance level descriptors, and recommended cut scores.

How Is This Data Included in the Briefing Book?

The concurrent, cross-sectional, and judgment studies are described in sections 2-4 of the Briefing Book. Section 5 includes the Performance Level Descriptors and a study done by Pearson in which the content of each item appearing on the Spring 2009 exam was assigned to a performance level based on the Performance Level Descriptors. Section 6 includes a cross-walk of cut scores using the results from the various validity studies and the performance level mapping study. Appendix A includes additional information about the methodology used in conducting the validity studies and a complete copy of the research validity plan. Appendix B includes results from the content-based studies conducted by Achieve.

What are the Results of the Validity Studies?
Concurrent: What relationships exist between the ADP exam and state exams?

There are moderate-to-strong relationships across states for the samples of students who took both the state tests and the ADP exam. Differences across state tests are primarily due to differences in the matched student samples used for the studies.

Concurrent: What relationships exist between the ADP exam and the national exams?

There is a strong relationship across states for the samples of students who took both the national exams and the ADP exam. Differences in these relationships are primarily due to differences in the matched student samples used for the studies.

Cross-Sectional: What ADP score predicts a B or better in College Algebra and Pre-Calculus?

Predictive studies suggested that scores between 32 and 38 on the 76-point 2008 ADP exam are associated with a 65% probability of earning a B or better. Differences in predictions were seen based on institution type (community college, 4-year typical, or 4-year selective) and course (College Algebra or Pre-Calculus).
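The predictive studies locate the exam score at which a fitted model gives a 65% probability of earning a B or better. A minimal sketch of that inversion under a logistic model; the coefficients b0 and b1 below are invented for illustration, not the study's actual estimates:

```python
import math

def score_for_probability(p, b0, b1):
    """Invert a fitted logistic model P(B or better | x) = 1 / (1 + exp(-(b0 + b1*x)))
    to find the exam score x at which the probability equals p."""
    logit = math.log(p / (1 - p))
    return (logit - b0) / b1

# Illustrative coefficients only; the studies' fitted values are not shown here.
b0, b1 = -6.0, 0.19
cut = score_for_probability(0.65, b0, b1)  # score with a 65% chance of a B or better
```

With a different fitted intercept and slope for each institution type and course, the same inversion yields the range of cut scores (here, 32 to 38) reported above.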
Contrasting groups studies suggested that a 2008 ADP exam score of 25 separates the distributions of students earning a B or better in College Algebra and Pre-Calculus from the distributions of students earning grades of C, D, or F.

Judgment: What feedback was received from College Algebra and Pre-Calculus professors, and what cut scores did they recommend?

College Algebra and Pre-Calculus professors identified some clear content standards considered important and essential for incoming students to master prior to entry into credit-bearing courses. Median cut scores for the Prepared student ranged from 23 to 38 on the 2008 ADP exam. As expected, results varied based on type of course and institution.

How Much Weight Should be Given to Each Study?

While all of the studies have some limitations, the results of each study should be considered, along with those limitations, in reaching a cut score recommendation. Of the concurrent studies, the results involving the ACT are probably the strongest because of the large and representative sample and ACT's previously established benchmark college readiness score of 22. Of the two types of cross-sectional studies, the predictive studies are probably more meaningful than the contrasting groups studies, since they more directly focus on the relationship between ADP exam scores and criterion course grades of B or higher. The judgment studies are also worthy of consideration, given the representative sample of 133 professors from 79 institutions in 20 different states.

What Can Be Inferred from Converging Data?

The converging data illustrate the rigor of the ADP exam. Both the empirically based and judgment studies seem to support placing the Proficient cut score within a range of scores between 24 and 38 on the 2008 ADP exam.
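One simple way to operationalize the contrasting-groups idea described above is to place the cut midway between the two groups' mean scores. This is a crude stand-in for the study's actual procedure (which may have used smoothed distributions or the point where the two distributions cross), and the data below are toy values:

```python
def contrasting_groups_cut(success_scores, other_scores):
    """Midpoint between the mean exam score of students who earned a B or
    better and the mean of students who earned C, D, or F. A simplified
    stand-in for a contrasting-groups analysis."""
    mean_success = sum(success_scores) / len(success_scores)
    mean_other = sum(other_scores) / len(other_scores)
    return (mean_success + mean_other) / 2

# Toy score lists for illustration only.
b_or_better = [30, 34, 38]
c_d_or_f = [14, 18, 22]
cut = contrasting_groups_cut(b_or_better, c_d_or_f)
```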
Your final recommendations for the ADP exam cut scores should be based on the goals of the exam, the specific information contained in this briefing book, discussions that occur during the standard setting meeting, and your own best judgment.
2. Concurrent Studies
Executive Summary - Concurrent Validity Studies

Background Data
- The concurrent validity studies provide data about how the ADP Algebra II End-of-Course Exam relates to other mathematics assessments.
- Two types of exams were included in the concurrent validity studies:
  - National exams, specifically the math sections of the ACT, SAT, and PSAT assessments.
  - State exams, specifically mathematics assessments administered at the high school level in six different states.
- The concurrent validity studies provide indirect information for considering the use of the ADP Algebra II exam to assess college readiness. For example, the SAT and ACT already provide implicit and explicit benchmarks of college readiness, the PSAT is used as a qualifying exam for the National Merit Scholarship program, and the state tests classify students into Proficient and Advanced performance categories.
- The relevance of the various concurrent validity studies for the ADP Algebra II standard setting varies for a number of reasons. First, the exams studied vary in how closely they match the content measured by the Algebra II exam. Second, the exams differ in how well performance on them is thought to relate to college readiness. Finally, the amount and quality of the data collected across the various studies differed.
- National exam scores matched with Spring 2008 ADP Algebra II exam scores:
  - ACT scores were matched in Arkansas for 6,278 students and in Kentucky for 999 students.
  - SAT scores were matched in Indiana for 205 students and in Pennsylvania for 414 students.
  - PSAT scores were matched in Rhode Island for 954 students.
- State exam scores matched with Spring 2008 ADP Algebra II exam scores:
  - Hawaii State Assessment (HSA) scores were matched for 1,164 students.
  - Indiana Statewide Testing for Educational Progress-Plus (ISTEP+) scores were matched for 2,858 students.
  - Kentucky Core Content Test (KCCT) scores were matched for 923 students.
  - New Jersey High School Proficiency Assessment (HSPA) scores were matched for 206 students.
  - Pennsylvania System of School Assessment (PSSA) scores were matched for 3,015 students.
  - New England Common Assessment Program (NECAP) scores were matched for 882 students in Rhode Island.
Methods
- Spring 2008 ADP Algebra II exam scores and scores on the national and state exams were linked using equipercentile linking, resulting in a concordance table for each exam.
- For the ACT and SAT, linear regression was conducted to predict ACT/SAT scores from Spring 2008 ADP exam scores.
- For the ACT and SAT, logistic regression was conducted to investigate the probability of earning a particular ACT/SAT score, given an ADP exam score.

Results
- ACT
  - Strong linear relationship between the ADP exam and ACT (r = .698).
  - The ACT college-readiness score of 22 is associated with Spring 2008 ADP exam scores of 25 and 26.
  - An ADP exam score of 32 is associated with having a 65% chance of earning a 22 or better on the ACT.
  - An ADP exam score of 31 is associated with a predicted ACT score of 22.
- SAT
  - Moderate correlation between the ADP exam and SAT (r = .490).
  - Using the concordance table between the ACT and SAT, the ACT college-readiness score of 22 is associated with SAT scores of 520 and 530. These scores mapped to ADP exam scores of 23 and 24.
  - ADP exam scores of 36 and 39 are associated with a 65% chance of earning SAT scores of 520+ and 530+, respectively.
  - ADP exam scores between 36 and 40 linearly predict SAT scores in the range.
- PSAT
  - Strong linear relationship between the ADP exam and PSAT (r = .616).
  - The level of PSAT math performance associated with being recognized as a National Merit Scholar corresponds to ADP exam scores of 56 to 59.
- State exams
  - Mean scores on the Spring 2008 ADP exam for the matched students varied considerably by state, with the lowest state mean at 13.8.
  - Correlations between the ADP exam and the state exams also varied, from .323 (New Jersey HSPA) at the low end to a high for the Indiana ISTEP+.
  - Based on the concordance tables, the Proficient levels on the state exams are associated with ADP exam scores ranging from 12 to 20.
  - Based on the concordance tables, the Advanced levels on the state exams are associated with ADP exam scores of 21 and above.
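The equipercentile linking named in the Methods above maps each score on one exam to the score on the other exam that sits at the same percentile rank in the matched sample. A bare-bones sketch with toy data; the operational studies used the LEGS program with post-smoothing, which this omits:

```python
import numpy as np

def equipercentile_concordance(x_scores, y_scores, x_points):
    """Map each score in x_points to the score in the y distribution at the
    same percentile rank. Illustrative only: no smoothing is applied."""
    x = np.sort(np.asarray(x_scores, dtype=float))
    y = np.asarray(y_scores, dtype=float)
    # Percentile rank of each x point: proportion of x scores at or below it.
    ranks = np.searchsorted(x, x_points, side="right") / len(x)
    # The y score at the same rank is the concordant score.
    return np.quantile(y, ranks)

# Toy matched sample: hypothetical ADP raw scores and ACT-like scores.
adp = list(range(77))                                # 0 to 76
act = [round(1 + 35 * i / 76) for i in range(77)]    # invented 1-36 scale values
concordant = equipercentile_concordance(adp, act, [25])
```

Repeating this for every attainable ADP score produces a concordance table like Tables 8 and 10.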
Considerations
- The content match between the exams included in the concurrent studies and the Algebra II exam is limited.
  - The ACT, SAT, and PSAT contain varying amounts of algebra content, little of which is likely to be at the Algebra II level.
  - The state exams are primarily used for NCLB and/or as part of graduation requirements, and are unlikely to measure Algebra II content.
- The quality of the concurrent data varies across the studies.
  - The ACT sample is fairly representative of the ACT testing population.
  - The SAT sample is lower performing than the SAT testing population.
  - The PSAT sample is representative of the PSAT testing population in terms of mean performance, but is significantly less variable.
  - The state exam samples vary in their quality and in the ability levels of the students matched to the ADP Algebra II exam.
- The ACT concurrent validity study seems most relevant for the purposes of the ADP Algebra II standard setting.
  - The ACT sample is largest, and ACT scores are highly correlated with ADP Algebra II scores.
  - ACT's definition of college-ready is having a 50% chance of earning a B or better and a 75% chance of earning a C or better in College Algebra. ACT's benchmark for this is an ACT mathematics score of 22 or higher.
- The SAT concurrent validity study is relevant for the purposes of the ADP Algebra II standard setting; however, the SAT results should be interpreted with caution.
  - There is no established college-readiness score for the SAT.
  - The SAT sample was small and unrepresentative of the SAT testing population.
  - Correlations with ADP Algebra II scores in the concurrent validity study were lower than would be expected.
- The PSAT study provides validity data for the ADP Algebra II exam, but is less relevant for the standard setting.
  - Although a high correlation between PSAT and ADP scores was found, there is not a clear relationship between the PSAT and college readiness.
  - The concordance for the benchmark score associated with eligibility for a National Merit Scholarship provides some indication of an upper level of performance on the ADP Algebra II exam; however, this level of performance likely exceeds the level associated with college readiness.
  - PSAT data were limited to students in Rhode Island.
- Although they provide validity evidence for the ADP Algebra II exam, the studies involving state exams are probably least relevant to setting the college readiness standards.
  - The tests largely measure different mathematics skills at a different level of rigor.
  - Connections between proficiency levels on the state tests and inferences of college readiness are unknown.
National Assessments-ADP Algebra II Exam Concurrent Validity Study Report

Concurrent validity studies were conducted for the American Diploma Project (ADP) Algebra II End-of-Course Exam using scores from the Spring 2008 operational administration. These studies were designed to establish statistical and empirical relationships between performance on the ADP Algebra II exam and performance on national mathematics tests. This report focuses on relationships with the mathematics sections of the ACT, SAT, and PSAT, which are described in Table 1.

Table 1. Description of National Assessments

ACT
Description and Subject Areas: The ACT test assesses high school students' general educational development and their ability to complete college-level work. The multiple-choice tests cover four skill areas: English, mathematics, reading, and science. The Writing Test, which is optional, measures skill in planning and writing a short essay. Many students take the test twice, once as a junior and again as a senior. The ACT is administered in all 50 states and is taken by the majority of high school graduates in 26 states.
Purpose: ACT scores are accepted at all four-year colleges and universities across the country. ACT scores are also used to make appropriate course placement decisions by the majority of four-year schools in the U.S.

SAT
Description and Subject Areas: The SAT assesses the critical thinking skills students need for academic success in college. Each year, more than two million students take the SAT. The SAT includes a student-produced essay, multiple-choice questions, and student-produced responses (grid-ins) for testing critical reading, mathematics, and writing. The SAT is typically taken by high school juniors and seniors.
Purpose: Nearly every college in America accepts the SAT as a part of its admissions process. Used with GPAs and high school transcripts, SAT scores allow colleges to fairly compare applicants. Taking the SAT gives students access to scholarship opportunities.

PSAT
Description and Subject Areas: The Preliminary SAT/National Merit Scholarship Qualifying Test (hereafter referred to as the PSAT) is co-sponsored by The College Board and the National Merit Scholarship Corporation (NMSC). The PSAT measures critical reading, math, and writing skills that are important for success in college. Critical reading and math sections test a student's ability to reason with facts and concepts; the multiple-choice writing skills component assesses communication skills. Most students take the test as high school juniors, but some take it as sophomores or in earlier grades. Information obtained from test results can help students plan their education beyond high school, and PSAT scores can be used to estimate probable performance on the SAT Reasoning Test™.
Purpose: The PSAT is used by NMSC as an initial screen of entrants in its annual academic competitions for recognition and college scholarships. NMSC receives the scores of all students who take the PSAT. Those who qualify to continue in the competitions receive applications from NMSC.
It is important to understand that these national assessments contain varying degrees of algebra content relative to the ADP Algebra II End-of-Course Exam, although the distribution of cognitive demand across the three assessments is similar. The ACT and SAT assessments comprise only 43% and 49% algebra coverage, respectively. In contrast, the ADP Algebra II End-of-Course Exam, as its name suggests, is 100% algebra-based, with 58% being advanced algebra. Table 2¹ presents the content and cognitive demand distributions for the ACT, SAT, and ADP exam.

Table 2. Content Characteristics of ACT, SAT, and ADP Exam

                                      ACT    SAT    ADP
Content Strand
  Numbers                             18%    11%     0%
  Algebra                             43%    49%   100%
  Geometry/Measurement                36%    30%     0%
  Data                                 4%    10%     0%
Type of Algebra Content
  Pre-Algebra                         41%    42%    16%
  Basic Algebra                       24%    30%    26%
  Advanced Algebra                    35%    28%    58%
Cognitive Demand
  Recalling                            8%     6%     0%
  Routine Procedures                  54%    55%    61%
  Non-Routine Procedures              13%    10%     0%
  Formulating Problems/Strategizing   24%    27%    25%
  Advanced Reasoning                   1%     2%    14%

Participating Students

Five ADP Consortium states provided student scores on national assessments. Table 3 displays the number of students within each state who had both a valid ADP exam score and a national exam score.

Table 3. Number of Matched Records by State

State           ACT     SAT    PSAT
Arkansas      6,278       -       -
Indiana           -     205       -
Kentucky        999       -       -
Pennsylvania      -     414       -
Rhode Island      -       -     954
Total         7,277     619     954

¹ Data are from What Do College Placement Assessments Measure?, a presentation at the NASH/Achieve Academic Leaders P-16 Summer Institute in Minneapolis, MN, July 29-31.
Table 4 shows the grade students were in when they took the ADP exam. For the ACT sample, students were spread across several grades, but for the SAT sample, most students were in grade 12. For the PSAT, students were in grades 10 and 11.

Table 4. Distribution of Grades at ADP Administration (counts of students in grades 10, 11, 12, and unknown for the ACT, SAT, and PSAT samples)

The demographic breakdown by ethnicity and gender for each sample (S) and the 2008 national testing population (P) is displayed in Table 5. To evaluate how well the selected samples represent the nationwide population academically, Table 6² compares the samples to their respective populations in terms of test performance. The scale score ranges for the ACT, SAT, and PSAT are 1-36, 200-800, and 20-80, respectively.

Table 5. Sample (S) and Population (P) Comparison of Ethnicity and Gender by Assessment

                           ACT          SAT          PSAT
                         S     P      S     P      S     P
Ethnicity
  African American      14%   13%    17%   11%     4%   15%
  American Indian        1%    1%     0%    1%     0%    1%
  Asian/Pacific Isl.     1%    4%     9%   10%     2%    8%
  Hispanic               2%    8%     4%   12%    10%   17%
  White                 81%   63%    65%   57%    83%   53%
  Unknown                0%   12%     5%    8%     0%    7%
Gender
  Female                56%   54%    56%   54%    55%   52%
  Male                  44%   44%    44%   46%    45%   47%
  Unknown                0%    2%     0%    0%     0%    1%

*Percentages may not sum to 100% due to rounding.

² National statistics were taken from and
Table 6. Comparison of Sample (S) and Population (P) Score Distributions by Assessment (statistics: N, mean, standard deviation, and quartiles; sample Ns: ACT 7,277, SAT 619, PSAT 954)

All three samples appear to represent their respective national testing populations demographically, but the assessment performance of each sample was lower than the corresponding national performance. For both the ACT and SAT, the sample mean is at least half a standard deviation below the national mean, and quartile differences increase with the scale. However, for the PSAT, the sample mean is only slightly lower than the national mean. Because of these differences, the results of any analysis with these samples should be interpreted with caution.

Analyses Performed

This study examined raw scores from the ADP exam, since scaled scores had not yet been developed, and mathematics scaled scores for the ACT, SAT, and PSAT assessments. First, univariate statistics and boxplots were examined to compare the relationships among the various assessments. Second, the LEGS (Linking with Equivalent Groups or Single Group Design) program was used to statistically link ADP Algebra II exam scores and scores from the national mathematics assessments. Here, a post-smoothing value of S=1.0 and the equipercentile equating method were used to construct concordance tables displaying corresponding ADP/ACT, ADP/SAT, and ADP/PSAT scores. Third, logistic regression was conducted in which a specific level of achievement on a national exam (0 = below, 1 = at or above) was regressed on ADP exam scores. Results are displayed in expectancy tables presenting the probability of achieving a certain national assessment score or better based on ADP exam performance. Fourth, national assessment scores were linearly regressed on ADP exam scores in order to obtain predicted national assessment scores.
In both the concordance and expectancy tables, comparisons were made with scores of interest on the national assessments to provide insights into the range of scores providing evidence of college readiness.

Results

Table 7 reports the mean, median, standard deviation, minimum (observed and absolute), and maximum (observed and absolute) values of ADP scores, and correlations between the ADP exam scores and ACT, SAT, and PSAT mathematics scores, respectively. Note that none of the samples had ADP exam scores spanning the entire score point range of 0 to 76.
Table 7. Descriptive Statistics of ADP Score by Sample (statistics: mean, standard deviation, minimum, and maximum; correlations with ADP: ACT sample .698, SAT sample .490, PSAT sample .616)

It is evident from Table 7 that ADP exam scores vary by sample, with the ACT sample containing higher performing students than the other two samples. It is important to note that Arkansas, a state providing 87% of the ACT data analyzed, has implemented the ADP Algebra II exam on a statewide basis, so student motivation may play a role in these higher test scores. In addition, recall that the SAT sample is primarily comprised of students in grade 12 (see Table 4), further explaining the lower performance of the SAT sample, since students taking Algebra II in grade 12 are likely to be lower performing than those who take Algebra II in earlier grades. Figure 1 displays boxplots illustrating the distribution of ADP exam scores for each of the three samples. The score distributions are similar for each sample, each being positively skewed.
Figure 1. Boxplots of ADP Exam Score by National Test Score

Boxplots displaying the distribution of national assessment scores for each ADP exam score point are shown in Figures 2-4. All three plots provide evidence of a linear relationship between the ADP exam and the respective national test. For the ACT, the linear relationship appears particularly strong, which is also supported by the observed correlation coefficient (r = .698) provided in Table 7. The linear relationship between ADP exam scores and SAT mathematics scores is weaker (r = .490), likely resulting from the presence of a cluster of lower ADP exam scores that map to both low and high SAT scores. This could be a result of a lack of student motivation on the ADP exam. The linear relationship between PSAT mathematics scores and ADP exam scores appears moderately strong (r = .616), even though there still exists a cluster of both lower and higher performing ADP students who score highly on the PSAT.
Figure 2. Distribution of ACT Scores by ADP Exam Score

Figure 3. Distribution of SAT Scores by ADP Exam Score

Figure 4. Distribution of PSAT Scores by ADP Exam Score
The concordance results for ADP exam scores and ACT scores are displayed in Table 8, with each ADP exam score mapped to a corresponding ACT mathematics score. Because student scores did not span the entire ADP exam score range, the range of scores in the concordance table is restricted. Also, note that some ADP exam scores map to the same ACT score (e.g., ADP exam scores of both 8 and 9 are associated with an ACT score of 14). The ACT mathematics assessment has a published college readiness standard of 22. As evidenced by the concordance table below, an ACT mathematics score of 22 corresponds to ADP exam scores of 25 and 26. Results of this analysis suggest that ADP exam scores in the neighborhood of 25 to 26 would indicate a student is ready to learn college-level mathematics material.

Table 8. ADP Exam-ACT Mathematics Concordance Table (columns: ADP Exam Score; ACT Mathematics Score)
The ADP Exam-ACT mathematics expectancy table (Table 9) displays the probability of achieving a score on the ACT given performance on the ADP exam. In addition to these probability values, the final column in the table presents the predicted ACT mathematics score for students scoring at each score point on the ADP exam. For example, a student scoring a 25 on the ADP exam would have a 98% probability of scoring at least 16 on the ACT, a 17% probability of scoring at least a 24, and would be predicted to score a 20 (rounded) on the ACT mathematics exam. Using the probabilities provided and the ACT score of interest (22), a student would need a score of 32 to have a 65% probability of scoring a 22 (shaded in orange). Finally, using the prediction equation, a student would need to score a 31 to have a predicted ACT mathematics score of 22 (shaded in green). These analyses suggest that ADP exam scores in the range of 31 to 32 would indicate that a student is ready to learn college-level mathematics material.

Table 9. ADP Exam-ACT Mathematics Expectancy Table (columns: ADP Exam Score; probability of an ACT mathematics score at or above each threshold; Predicted ACT Mathematics Score)
*Values estimated to be 0.00 and 1.00 were changed to 0.01 and 0.99, respectively, to avoid probabilities of certainty.
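An expectancy-table row like the ones described above combines two fitted models: a logistic regression per criterion threshold (for the probability columns) and a linear regression (for the predicted-score column). A sketch of how one row could be built; the coefficients are invented to roughly mimic the worked example for an ADP score of 25, not taken from the study:

```python
import math

def logistic(x, b0, b1):
    """P(criterion score >= threshold | ADP score x) under a fitted logistic model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def expectancy_row(adp_score, threshold_models, lin_intercept, lin_slope):
    """One expectancy-table row: the probability of reaching each criterion
    threshold, plus the linearly predicted (rounded) criterion score."""
    probs = {t: round(logistic(adp_score, b0, b1), 2)
             for t, (b0, b1) in threshold_models.items()}
    predicted = round(lin_intercept + lin_slope * adp_score)
    return probs, predicted

# Invented per-threshold logistic fits (b0, b1) and linear fit; illustration only.
models = {16: (-0.86, 0.19), 22: (-5.46, 0.19), 24: (-6.34, 0.19)}
probs, predicted = expectancy_row(25, models, 12.0, 0.32)
```

Scanning such rows for the first ADP score whose probability at the threshold of interest reaches 0.65 gives the shaded cut scores discussed in the text.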
Table 10 presents the concordance results for the ADP exam and the SAT mathematics test. As was seen in the ACT concordance table, the range of ADP exam scores is restricted. Unlike the ACT, there is no college-ready standard published for the SAT mathematics assessment. For this reason, two separate comparisons were made. First, we carried forward the ADP exam score needed to demonstrate evidence of college readiness (25 or 26) based on the ACT concordance analysis and mapped those scores to the corresponding SAT scores. As shaded in orange in Table 10, SAT scores of 539 and 545 correspond to ADP exam scores of 25 and 26. Second, we used an existing concordance table linking SAT and ACT mathematics scores to link the SAT and ADP exams. According to published research conducted by The College Board, an ACT mathematics score of 22 corresponds to SAT mathematics scores of 520 and 530. While it is inappropriate to statistically equate these two assessments, since they measure slightly different content, concordance scores have been established to assist students, educators, and college admissions officers in comparing performance on these two commonly accepted college entrance exams. With that in mind, these analyses suggest that an ADP Algebra II exam score of 23 or 24 (shaded in green in Table 10) and above would provide some evidence of college readiness. The results of this analysis should be interpreted with caution, since no official college readiness standard has been developed for use with the SAT.
Table 10. ADP Exam-SAT Mathematics Concordance Table
[Table body not recoverable from this transcription: ADP exam scores and their concordant SAT mathematics scores.]

The ADP Exam-SAT mathematics expectancy table (Table 11) displays the probability of achieving a particular score or higher on the SAT given performance on the ADP exam. In addition to these probability values, the final column in the table presents the predicted SAT mathematics score for students scoring at each score point on the ADP exam. For example, a student scoring a 25 on the ADP exam would have a 38%
probability of scoring at least 500 on the SAT, a 3% probability of scoring at least a 590, and would be predicted to score a 470 (rounded) on the SAT. The probabilities in Table 11 were used in conjunction with the SAT scores of interest (539 and 545 using results from the first analysis, and 520 and 530 using results from the second analysis) to identify meaningful ADP exam scores. Since SAT scores are reported in 10-point increments (400, 410, etc.), scores of 540 and 550 were used in lieu of 539 and 545. According to the results in Table 11 that are shaded in yellow, a student would need a score of 36 to have a 65% probability of scoring at least 520, a score of 39 to have a 65% probability of scoring a 530, a score of 41 to have a 65% probability of scoring a 540, and a score of 42 to have a 65% probability of scoring a 550 or higher. Finally, using the prediction equation results that are shaded in purple, a student would need to score a 36 to have a predicted SAT mathematics score of 520, a 38 to have a predicted score of 530, a 40 to have a predicted score of 540, and a 42 to have a predicted score of 550. This analysis suggests that scores in the neighborhood of 36 to 42 provide evidence that a student is ready to learn college-level mathematics material.
Table 11. ADP Exam-SAT Mathematics Expectancy Table
[Table body not recoverable from this transcription: for each ADP exam score, the probability of each SAT mathematics score and the predicted SAT mathematics score.]
*Values estimated to be 0.00 and 1.00 were changed to 0.01 and 0.99, respectively, to avoid probabilities of certainty.
Since the PSAT is used to identify National Merit Scholars, the cut needed to qualify as a National Merit Scholar was examined. In 2008, the combined score needed was approximately 211, which means that a student needed an average section score of 70. Table 12 indicates that a PSAT mathematics score of 70 maps to the range of ADP exam scores shaded in orange.

Table 12. ADP Exam-PSAT Mathematics Concordance Table
[Table body not recoverable from this transcription: ADP exam scores and their concordant PSAT mathematics scores.]
Conclusion

This study examined the relationships between scores on the ADP Algebra II End-of-Course Exam and scores on the ACT, SAT, and PSAT mathematics assessments. Students with both an ADP exam score and a national mathematics score were included in the study. The three samples were fairly representative of their respective testing populations in terms of ethnicity and gender, although to a lesser degree for the PSAT sample. Academically, the mean scores for the samples were lower than those of each respective testing population, particularly for the ACT and SAT samples.

Linear relationships appear to exist between the ADP exam and the ACT, SAT, and PSAT mathematics assessments. In order to establish an empirical relationship between the ADP exam and each of the national tests, equipercentile linking was conducted. In addition to these analyses, expectancy tables were created to describe the probability of earning a particular score on each national mathematics assessment based upon performance on the ADP exam. Predicted scores were also calculated using linear regression. Results were varied and, using a published ACT college readiness standard and a derived SAT readiness standard, suggest a range of corresponding ADP exam scores. A separate set of scores roughly corresponds to the values needed to earn recognition as a National Merit Scholar.

Considerations

Results from these analyses should be interpreted cautiously for the reasons outlined previously. Specifically, the samples suffer from restriction of range and vary in terms of how representative they are academically. It is suggested that more consideration be given to the results using ACT scores, for several reasons. First, ACT scores were shown to be more highly correlated with ADP exam scores. Second, the sample size used for the ACT analyses was over ten times as large as the samples used for the analyses with the other national assessments, leading to more stable estimates.
In addition, the students in the ACT sample are thought to be more motivated than students in the other samples, since the ADP exam is implemented on a statewide basis for most students in that sample. Lack of student motivation (on field tests and in other situations in which no student consequences are associated with test performance) has been shown to adversely impact analyses such as these, so a highly motivated sample of students will likely yield more accurate inferences. Finally, the SAT scores of interest used in this report are unofficial and not recognized as college readiness standards by the test publisher, whereas such an indicator has been empirically studied and supported by the publisher of the ACT.
State Assessments-ADP Algebra II Exam Concurrent Validity Study Report

Concurrent validity studies were conducted for the American Diploma Project (ADP) Algebra II End-of-Course Exam using scores from the operational Spring 2008 administration. These studies were designed to establish empirical relationships between scores on the ADP Algebra II exam and scores on the mathematics sections of statewide assessments. This report focuses specifically on relationships with the mathematics sections of the Hawaii State Assessment (HSA), Indiana Statewide Testing for Educational Progress-Plus (ISTEP+), Kentucky Core Content Test (KCCT), New Jersey High School Proficiency Assessment (HSPA), Pennsylvania System of School Assessment (PSSA), and Rhode Island's use of the New England Common Assessment Program (NECAP). All of these exams are used for NCLB purposes and are further described in Table 1.

Table 1. Description of State Assessments

Hawaii State Assessment (HSA): The HSA is an annual testing program that measures student achievement in reading and mathematics (grades 3, 4, 5, 6, 7, 10), science (grades 5, 7, 11), and writing (grades 4, 6, 9, 11), based on the third edition of the Hawaii Content and Performance Standards (HCPS III). Students are tested in science and writing in the fall and in reading and mathematics in the spring.

Indiana Statewide Testing for Educational Progress-Plus (ISTEP+): Based on Indiana's Academic Standards, the ISTEP+ provides a learning check-up at each grade level to make sure students are on track and to signal whether they need extra help. Spring 2009 ISTEP+ (March, late April/early May) includes questions on English and math at grades 3-8, science at grades 4 and 6, and social studies at grades 5 and 7.
The class of 2012 will be the first group of students to take the new GQE, which consists of Algebra I and English 10 exams taken whenever students complete the corresponding course.

Kentucky Core Content Test (KCCT): The KCCT assesses student mastery of the Kentucky Core Content for Assessment, as well as higher-order thinking and communication skills. The KCCT comprises open-response items and multiple-choice questions. The test measures reading and mathematics in grades 3, 4, 5, 6, 7, 8, and 11; science in grades 4, 7, and 11; Practical Living/Vocational Studies in grades 4, 7, and 10; social studies in grades 5, 8, and 11; Arts & Humanities in grades 5, 8, and 11; and On-Demand Writing in grades 5, 8, and 12.

New Jersey High School Proficiency Assessment (HSPA): The HSPA is a state test given to students in the eleventh grade and is used to determine student achievement in reading, writing, and mathematics as specified in the New Jersey Core Curriculum Content Standards.
Pennsylvania System of School Assessment (PSSA): The annual PSSA is a standards-based, criterion-referenced assessment used to measure a student's attainment of the academic standards while also determining the degree to which school programs enable students to attain proficiency of the standards. Every Pennsylvania student in grades 3 through 8 and grade 11 is assessed in reading and math; every Pennsylvania student in grades 5, 8, and 11 is assessed in writing.

New England Common Assessment Program (NECAP): The NECAP is a collaborative partnership among New Hampshire, Vermont, and Rhode Island. It annually measures achievement of students in mathematics and reading in grades 3-8 and 11, writing in grades 5, 8, and 11, and science in grades 4, 8, and 11.

These statewide assessments contain varying degrees of Algebra content, as illustrated by Table 2 below. The information provided in Table 2 was taken from the blueprints of each exam and is approximate, as the exact content strands for each test do not perfectly fit into the four content strands provided. For all six tests, Algebra content accounts for somewhere between 25% and 40% of the material on the test; however, it is unknown how much of that content is specifically at the Algebra II level. Personal communication with several states, including Hawaii, Indiana, and Pennsylvania, indicated that their respective assessments did not cover Algebra II material. Thus, it is likely that the tests include little, if any, Algebra II material; however, since there is no robust Algebra II test with which to cross-validate the ADP exam, comparisons of this nature are useful. In addition to varying content coverage, these state assessments vary in terms of the grade in which they are administered.
The HSA and ISTEP+ are administered in the tenth grade, but the remaining four assessments are administered in the eleventh grade, as shown in Table 2 below. It is important to keep in mind that these state assessments measure the entire curriculum at the assessed grade level, while the ADP Algebra II exam measures specifically Algebra II content.

Table 2. Mathematics Content Coverage for State Assessments

Content Strand            HSA (10)  ISTEP+ (10)  KCCT (11)  HSPA (11)  PSSA (11)  NECAP (11)
Numbers & Operations        19%        30%          20%        15%        15%        15%
Geometry & Measurement      37%        26%          30%        25%        30%        30%
Algebra & Functions         25%        32%          35%        30%        40%        40%
Data Analysis               19%        12%          15%        30%        15%        15%
Participating Students

Table 3 displays the demographic breakdown by gender and ethnicity for each sample (S) and respective state population (P). Statewide statistics were taken from the National Center for Education Statistics (NCES) website. Each selected sample is fairly representative of the demographic composition of the state from which it was drawn.

Table 3. Demographics of Study Samples (S) and Their Corresponding Populations (P)

                     Hawaii     Indiana    Kentucky   New Jersey  Pennsyl.   Rhode Is.
Ethnicity            S    P     S    P     S    P     S    P      S    P     S    P
African American     2%   2%    21%  13%   11%  11%   23%  17%    11%  16%   6%   9%
American Indian      0%   1%    0%   0%    0%   0%    3%   0%     0%   0%    0%   1%
Asian/Pacific Is.    64%  73%   2%   1%    0%   1%    5%   8%     7%   3%    3%   3%
Hispanic             2%   5%    5%   6%    0%   2%    19%  19%    4%   7%    12%  18%
White                20%  20%   68%  80%   84%  86%   49%  56%    71%  76%   79%  70%
Unknown              12%  0%    3%   0%    4%   0%    1%   0%     7%   0%    0%   0%
Gender
Female               54%  48%   54%  49%   48%  48%   53%  49%    52%  48%   53%  48%
Male                 46%  52%   46%  51%   52%  52%   47%  51%    48%  52%   47%  52%
*Percentages may not add up to 100% due to rounding.

Table 4 provides information regarding the grade that students were in when they took the ADP exam. Only students with matched scores, and who are thus included in the study, are counted in Table 4. Notice that some states provided scores only for the current year's testing, while others matched across multiple years' administrations.

Table 4. Distribution of Grades for ADP Administration for Study Participants
[Counts by grade are not recoverable from this transcription; sample totals appear in Table 5.]
In order to evaluate the representativeness of the chosen samples in terms of academic performance, Table 5 compares the percentages of students classified as proficient by the respective state assessment, in both the sample and the entire state, to the percentages classified as not proficient. Statewide statistics were taken from state Department of Education websites. It is important to bear in mind that each state's definition of proficient may differ. In addition, it should be noted that while the NECAP is administered in multiple states, this study uses only NECAP data from the state of Rhode Island.

The data in Table 5 suggest that the composition of each state's sample in terms of academic performance, with the exception of Hawaii, reasonably approximates that of the population from which it was drawn. The Hawaii sample appears to contain higher performing students. This is logical, since Hawaii only provided HSA scores for 10th grade students. Since the HSA is administered in 10th grade, and the students who are enrolled in Algebra II in 10th grade are typically the higher performing students, the sample's higher performance is expected. However, this is not the case with the other 10th grade state assessment in this study (the ISTEP+), because Indiana provided three years of ISTEP+ data for ADP students: those enrolled in grades 10, 11, and 12.

Table 5. Proficiency Distributions for State Samples and Corresponding Populations

State                    N         Not Proficient  Proficient
Hawaii      Sample       1,164     10%             90%
            State        12,641    81%             19%
Indiana     Sample       2,858     21%             79%
            State        76,440    32%             77%
Kentucky    Sample       --        --              34%
            State        44,415    61%             38%
New Jersey  Sample       --        --              64%
            State        97,985    27%             73%
Pennsyl.    Sample       3,015     41%             59%
            State        135,137   44%             56%
Rhode Is.   Sample       --        --              21%
            State        11,174    78%             22%
*Percentages may not add up to 100% due to rounding. "--" marks values not recoverable from this transcription.

Analyses Performed

Raw scores for the ADP exam and scaled scores for the state assessments were analyzed in this study.
First, univariate statistics and boxplots were examined. Then, to investigate
how ADP exam scores relate to state proficiency, ADP exam performance across state proficiency classification groups was compared. That is, the ADP exam scores of students who were deemed proficient by their respective state assessment were compared with those of students who were deemed not proficient. Finally, the LEGS (Linking with Equivalent Groups or Single Group Design) program was used to statistically link ADP Algebra II exam scores and state assessment mathematics scores. Results reported in this paper are from the equipercentile method with post-smoothing. The most appropriate smoothing value was selected on a case-by-case basis by visual inspection of plots; for this report, the smoothing values selected ranged from 0.25 to 1.0.

Results

Table 6 displays the mean, standard deviation, minimum, and maximum scores for the ADP Algebra II exam, and Figure 1 displays the boxplots for each state sample. While the distributions are positively skewed, the samples, with the exception of Hawaii, appear to be fairly similar in terms of ADP exam performance. The Hawaii sample's average ADP exam score provides further evidence that the sample contains higher performing students. In addition, range restriction is a concern in this analysis, particularly for the Rhode Island sample, where the highest score was 46 out of a possible 76 points.

Table 6. ADP Algebra II Exam Descriptive Statistics by State
[Mean, SD, minimum, and maximum values by state are not recoverable from this transcription.]
Note: The maximum score on the ADP exam was 76.
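The equipercentile linking performed by LEGS can be illustrated with a minimal sketch (no smoothing step): each score on one test is mapped to the score on the other test with the closest percentile rank. The data below are small synthetic samples, not study data, and the midpoint rank convention is one common choice rather than necessarily the one LEGS uses.

```python
# Minimal equipercentile-linking sketch with synthetic data (no smoothing).
import numpy as np

def percentile_rank(scores, value):
    """Midpoint-convention percentile rank of value within scores."""
    scores = np.asarray(scores)
    return (np.sum(scores < value) + 0.5 * np.sum(scores == value)) / len(scores)

def equipercentile_link(x_scores, y_scores, x_value):
    """Map x_value to the y score whose percentile rank is closest."""
    px = percentile_rank(x_scores, x_value)
    y_unique = np.unique(np.asarray(y_scores))
    ranks = np.array([percentile_rank(y_scores, y) for y in y_unique])
    return y_unique[int(np.argmin(np.abs(ranks - px)))]

rng = np.random.default_rng(0)
adp = rng.integers(5, 60, size=500)       # synthetic ADP raw scores
state = rng.normal(250, 40, size=500)     # synthetic state scaled scores
low, high = equipercentile_link(adp, state, 15), equipercentile_link(adp, state, 45)
print(low, high)  # linked state scores increase with the ADP score
```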
Figure 1. Boxplots of ADP Algebra II Exam Scores by State
Table 7 presents the mean, standard deviation, and minimum and maximum scores for each state mathematics assessment, as well as the correlation between state assessment scores and ADP exam scores. The minimum and maximum columns specify the observed (Obs) value and the absolute (Abs) possible value. For example, for the HSA, the minimum score possible is a 100, but the lowest score observed in the sample was 134.

Table 7. State Assessment Descriptive Statistics
[Most values in this table are not recoverable from this transcription. Surviving fragments include an observed HSA maximum of 443 against an absolute maximum of 500, an observed ISTEP+ maximum of 920 (also the absolute maximum), and an observed KCCT maximum of 1180 (also the absolute maximum).]

Boxplots displaying the distribution of state assessment scores at every obtainable ADP exam score point are presented in Figures 2-7. The boxplot for the ISTEP+ reveals the strongest linear relationship between ADP exam scores and state assessment mathematics scores, while the boxplot for the HSPA suggests a much weaker linear relationship. These relationships are quantified by the correlations displayed in Table 7, where the correlation between scores on the ADP exam and the ISTEP+ is the highest and the correlation between scores on the ADP exam and the HSPA is the lowest. The magnitudes of the correlations are for the most part moderately strong, as would be expected with these exams. Nonetheless, it is important to keep in mind various factors that may influence these correlations, such as restriction of range and the homogeneity of the sample.
Figure 2. Distribution of HSA Scores by ADP Exam Score

Figure 3. Distribution of ISTEP+ Scores by ADP Exam Score

Figure 4. Distribution of KCCT Scores by ADP Exam Score

Figure 5. Distribution of HSPA Scores by ADP Exam Score

Figure 6. Distribution of PSSA Scores by ADP Exam Score

Figure 7. Distribution of NECAP Scores by ADP Exam Score
Table 8 provides descriptive statistics of ADP exam scores by proficiency classification according to each student's respective state assessment. For all six samples, the mean score of the proficient students was considerably higher than the mean score of the non-proficient students.

Table 8. ADP Exam Scores by Statewide Assessment Proficiency Classification
[N, mean, SD, minimum, and maximum by proficiency classification are not recoverable from this transcription.]

The results of the equipercentile linking analysis are shown in Table 9. Because student scores did not span the entire ADP exam score range, the restriction of range is evident in the concordance table. While state performance levels are not consistently named, each state has a cut that reflects proficiency and another, higher cut that this report will refer to as advanced. The shaded cells in Table 9 identify the proficient and advanced cut scores for the respective state assessments: proficient cuts are shaded orange and advanced cuts are shaded green. For example, the HSA proficiency cut score of 300 maps to an ADP exam score of roughly 14, and the advanced cut score maps to an ADP exam score of 21. The statewide assessment proficiency cut scores map onto a limited range of ADP exam scores, with the exception of the NECAP. For that assessment, the proficiency cut score maps to a higher ADP exam score of 20, which makes sense given the sample's composition as described earlier in this report. The advanced cut scores likewise map to a limited range of ADP exam scores, again with the exception of the NECAP, which maps to a higher range.
Table 9. ADP Exam-State Assessment Concordance Table
[Concordant HSA, ISTEP+, KCCT, HSPA, PSSA, and NECAP scores for each ADP exam score are not recoverable from this transcription.]
*Proficient & Advanced cuts are indicated by orange and green shading, respectively.

Conclusion

This study examined the relationship between scores on the ADP Algebra II End-of-Course Exam and scores on six statewide mathematics assessments. Students with both a valid ADP exam score and a statewide assessment mathematics score were analyzed in the study. Each of the six state samples appeared to be representative of its full statewide population in terms of ethnicity and gender. Academically, the samples were representative of their respective states with the exception of Hawaii, whose sample contained higher performing students.
There was evidence of a linear relationship between scores on the ADP exam and the HSA, ISTEP+, KCCT, PSSA, and NECAP. However, the observed linear relationship between scores on the ADP exam and the HSPA was weak. In order to establish an empirical relationship between scores on the ADP exam and scores on the state assessments, equipercentile linking was conducted. The ADP exam score that mapped to the proficiency cut score for each state assessment was then identified. For five of the six state assessments, the ADP exam scores that mapped to state assessment cut scores fell within a limited range for the proficient cut and within a higher range for the advanced cut.

Considerations

The results of these analyses should be interpreted with caution. First, the analyses involving the Hawaii assessment, the HSA, should be considered keeping in mind that the sample contained higher than average performing students; the observed results may not generalize to the full testing population. In addition, the analysis with the New Jersey assessment, the HSPA, should be considered in light of the fact that the linear relationship between scores on it and the ADP exam was not particularly strong. Finally, the results of the analyses with the NECAP in Rhode Island should be interpreted while recognizing that the sample was highly restricted in range, with the highest performing student scoring a 46 out of 76 on the ADP exam.
3. Cross-Sectional Studies
Executive Summary: Cross-Sectional Validity Studies

Background Data

- The cross-sectional validity studies provide data about how college students perform on the ADP Algebra II exam. Students in Algebra and Pre-Calculus courses from community colleges, 4-year typical institutions, and 4-year more selective institutions participated.
- The data collected for the cross-sectional validity studies were analyzed in two different ways:
  o Predictive Study: examined how students' ADP exam scores predict their course grades, and provides information regarding what ADP scores seem to signify college readiness based on students' final course grades. For this analysis, only Algebra and Pre-Calculus students with assigned final grades were included.
  o Contrasting Groups Study: examined how the distributions of scores for students earning a B or better differ from those of students earning less than a B. Resulting cut scores provide information regarding the scores that best separate these distributions.
- The exam was administered at the beginning of the semester and scores were matched with final course grades.
- Data were collected in Fall 2008 and Spring 2009 from 31 institutions.
- 3,132 students had matching ADP exam scores and final course grades.
- Students were split fairly evenly between institution types, but the number of students in Pre-Calculus was less than half the number in Algebra.

Methods

- Predictive: For each institution type by subject subgroup, logistic regression was conducted to investigate the score needed in order to have a 65% chance of earning an A or better, B or better, and C or better in the student's first credit-bearing course.
- Contrasting Groups: Within each institution type, smoothed score distributions for successful students (those earning a B or better) were compared to those of students earning less than a B, and the intersections were determined.

Results
- Predictive
  o ADP Exam scores associated with a 65% probability of earning a B or better in an Algebra course varied by institution type, ranging from 32 to 48.
  o For Pre-Calculus, ADP Exam scores of 32 and 38 for community colleges and 4-year typical institutions, respectively, were associated with a 65% probability of earning a B or better. The model did not fit for 4-year more selective institutions.
- Contrasting Groups
  o For community college students, a score of 25 separated successful students from those earning less than a B.
  o For 4-year typical institution students, a score of 26 optimally separated successful students from those earning less than a B.
  o For 4-year more selective institution students, a score of 23 separated successful students from those earning less than a B.

Considerations

- Lack of motivation likely influenced students' scores.
- The restriction in range of ADP Algebra II scores affected the predictive study results.
  o For the community college Algebra sample, only three out of 761 students received scores of 48 or higher.
  o For Algebra students from 4-year more selective institutions, the highest score was a 42.
  o Relatively small differences in mean Algebra II scores were seen for students receiving different grades, especially in the data based on Algebra courses.
- The contrasting groups analyses are limited by the following considerations.
  o The sample size for each distribution is relatively small (~500 students).
  o Restriction of range of ADP Algebra II scores also affected the contrasting groups results.
- The logistic regression results reported in the predictive studies seemed more credible for the Pre-Calculus groups than the Algebra groups.
  o For the Pre-Calculus groups, ADP Algebra II scores between 32 and 38 are associated with a 65% probability of earning a B or better.
  o The logistic regression results for the community college Algebra sample should probably be discounted.
- The choice of the 65% probability value used with the logistic regressions significantly impacted results relative to other probability values that might be considered.
For example, a 50% probability of earning a B or better seemed to be associated with ADP Algebra II scores in a range beginning around 20.
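The contrasting-groups step summarized above can be sketched as follows: smooth each grade group's score distribution with a Gaussian kernel, then take the point between the group means where the successful group's density overtakes the other group's density as the cut. The data are synthetic, and the fixed bandwidth of 3 score points is an arbitrary illustrative choice, not the study's smoothing method.

```python
# Contrasting-groups sketch with synthetic data and a simple Gaussian KDE.
import numpy as np

def gaussian_kde(data, grid, bandwidth=3.0):
    """Kernel density estimate of data evaluated at each grid point."""
    z = (grid[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
below_b = rng.normal(20, 8, size=500)       # synthetic scores, students earning < B
b_or_better = rng.normal(30, 8, size=500)   # synthetic scores, successful students

grid = np.linspace(0, 76, 761)              # ADP raw-score scale
diff = gaussian_kde(b_or_better, grid) - gaussian_kde(below_b, grid)

# Search between the two group means for the first point where the
# successful group's density overtakes the other group's density.
mask = (grid >= below_b.mean()) & (grid <= b_or_better.mean())
cut = grid[mask][np.argmax(diff[mask] > 0)]
print(round(cut, 1))  # a cut near where the two smoothed densities cross
```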
ADP Algebra II Exam Predictive Study

Predictive validity studies were conducted for the American Diploma Project (ADP) Algebra II End-of-Course Exam. College students enrolled in Algebra and Pre-Calculus courses were administered one section (calculator or non-calculator) of the ADP Algebra II Exam at the beginning of the semester, and final course grades were collected at the end of the semester. In some instances, students took the entire exam. An empirical relationship was then determined such that, given an ADP exam score, the probability of earning a particular final course grade could be estimated. Students enrolled in Algebra and Pre-Calculus courses at community colleges (CC), 4-year typical institutions (4T), and 4-year more selective institutions (4S) were recruited, and data were collected during both the Fall 2008 and Spring 2009 semesters.

Participating Students

Students from 28 colleges in Arizona, Arkansas, Florida, Hawaii, Indiana, New Jersey, Ohio, and Maryland participated in the study. Because students may have taken either half of the exam (the calculator or non-calculator section) or the entire exam, scores were rescaled to the Spring 2008 full test scale (see Appendix A1: Calculating Expected Score for an explanation of the rescaling process). After rescaling, every student had a score on the Spring 2008 full test raw score scale, which ranges from 0 to 76. Students enrolled in Algebra and Pre-Calculus courses participated and were distributed across institution types as shown in Table 1. The demographic breakdown of participating students is displayed in Table 2. Unfortunately, because many students failed to indicate their ethnicity on the answer document, the ethnicity of roughly one-third of the students in the sample is unknown.

Table 1. Distribution of Students by Course and Institution Type
[Counts by institution type are not fully recoverable from this transcription: 2,078 students took Algebra, 1,054 took Pre-Calculus, 929 attended community colleges, and 3,132 participated in total.]
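The rescaling step described above puts section scores onto the 0-76 full-test scale. The actual procedure is in Appendix A1 of the study and is not reproduced in this report, so the sketch below uses a naive proportional projection purely to illustrate the idea of placing every student on a common scale; it should not be read as the study's method.

```python
# Naive proportional rescaling sketch. This is NOT the Appendix A1
# "Calculating Expected Score" procedure, which is not reproduced here;
# it only illustrates projecting a section score onto the 0-76 scale.

FULL_TEST_MAX = 76  # Spring 2008 full test raw score maximum

def rescale_to_full_test(section_raw, section_max):
    """Project a section raw score onto the 0-76 full-test scale."""
    if not 0 <= section_raw <= section_max:
        raise ValueError("raw score outside section range")
    return round(section_raw / section_max * FULL_TEST_MAX)

print(rescale_to_full_test(19, 38))  # 38: half the section maps to half of 76
```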
Table 2. Demographic Characteristics of Study Participants

Ethnicity                 Percent
African American          9%
American Indian           1%
Asian/Pacific Islander    10%
Hispanic                  6%
White                     40%
Unknown                   34%
Gender
Female                    53%
Male                      47%

Analyses Performed

To evaluate the relationship between ADP exam scores and final course grades, ADP exam score descriptive statistics for students earning each course grade were calculated by course type and institution type. In addition, to visually represent these relationships, boxplots showing the distribution of ADP exam scores by course grade, and line graphs displaying mean ADP exam scores by course grade, were constructed. Next, to determine whether these relationships vary across institution types, line graphs were constructed separately for each course. Finally, logistic regression was used to model the probability of earning course grades of A or better, B or better, and C or better given a student's ADP exam score. Data were disaggregated by both course and institution type for analysis. The ADP exam scores associated with a 65% probability of achieving various course grades were identified.

Results

Table 3 presents ADP exam performance according to institution type, course type, and course grade earned. As seen in the table, average ADP exam scores generally increase with course grades.
Table 3. ADP Exam Scores by Institution Type, Course, and Course Grade
[N, mean, standard deviation, minimum, and maximum by institution type, course, and grade are not recoverable from this transcription.]
Boxplots illustrating the distribution of ADP exam scores by college course grade for each combination of course and institution type are presented in Figures 1-6. It is worth noting that the highest score earned was a 66, which is below the maximum possible score of 76.

Figure 1. Distribution of Community College Exam Scores by Algebra Course Grade
Figure 2. Distribution of 4T Institution Exam Scores by Algebra Course Grade

Figure 3. Distribution of 4S Institution Exam Scores by Algebra Course Grade

Figure 4. Distribution of CC Institution Exam Scores by Pre-Calculus Grade

Figure 5. Distribution of 4T Institution Exam Scores by Pre-Calculus Grade

Figure 6. Distribution of 4S Institution Exam Scores by Pre-Calculus Grade
Figure 7 displays the mean ADP exam scores for Algebra and Pre-Calculus students earning a particular final course grade, without respect to institution type. For both Algebra and Pre-Calculus, ADP exam scores tend to decrease as course grades decline, and Pre-Calculus averages are consistently higher than Algebra averages.

Figure 7. Relationship between Course Grade and ADP Exam Score by Course Type
Figure 8 shows the mean ADP exam score by Algebra grade and institution type. The relationships between Algebra grade and ADP exam score are similar regardless of institution type.

Figure 8. Relationship between Algebra Grade and ADP Exam Score by Institution Type
Figure 9 shows the relationship between Pre-Calculus grade and ADP exam score by institution type. While the trend of decreasing scores with declining course grades is consistent for students in community colleges (CC) and 4-year typical institutions (4T), there was little variation in average score across course grades for students in 4-year more selective institutions (4S).

Figure 9. Relationship between Pre-Calculus Grade and ADP Exam Score by Institution Type

Logistic regression was conducted on all course and institution type subgroups, and there was adequate model fit for each subgroup with the exception of the 4S Pre-Calculus group, for which the fit statistics indicated that the model did not adequately fit. The lack of fit is not surprising given the plot in Figure 9, where there is little variation in mean ADP exam scores across course grades.
Figure 10 displays the relationship between ADP exam score and the probability of earning a particular grade in Algebra for students in community colleges. Figure 10. Probability of Grade in Algebra Given ADP Exam Score-CC (curves for A or better, B or better, and C or better) Based on Figure 10, given an ADP exam score of approximately 12, a student would have a 65% probability of earning a C or better in an Algebra course. In order to have a 65% probability of earning a B or better in an Algebra course, a student would need to score approximately 48. As is shown in the figure, there is no obtainable ADP score associated with a 65% probability of earning an A or better in Algebra for community college students.
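The score thresholds quoted in these figures come from reading the fitted logistic curves at a probability of 0.65. With an intercept b0 and slope b1, the curve can be inverted in closed form; the coefficients below are hypothetical placeholders (the report does not publish the fitted values), so this is only a sketch of the computation:

```python
import math

def grade_probability(score, b0, b1):
    """Logistic model: P(grade >= target | score) = 1 / (1 + exp(-(b0 + b1*score)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * score)))

def score_for_probability(p, b0, b1):
    """Invert the logistic curve: the exam score at which the probability equals p."""
    return (math.log(p / (1.0 - p)) - b0) / b1

# Hypothetical coefficients for illustration only.
b0, b1 = -3.0, 0.08
cut = score_for_probability(0.65, b0, b1)
print(round(cut, 1))                             # score with a 65% chance of the grade
print(round(grade_probability(cut, b0, b1), 2))  # recovers 0.65
```

Repeating the inversion for each grade threshold (A, B, or C or better) and each institution type would reproduce the kind of score table summarized in the study's conclusion.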
Figure 11 displays the relationship between ADP exam score and the probability of earning a particular grade in Algebra for students in 4-year typical institutions. Figure 11. Probability of Grade in Algebra Given ADP Exam Score-4T (curves for A or better and B or better) The model estimating the probability of earning a grade of C or higher was a poor fit to the data, and thus those results are not presented here. According to Figure 11, given an ADP exam score of approximately 37, a student would have a 65% probability of earning a B or better in an Algebra course. An ADP exam score of 49 is associated with having a 65% probability of earning an A or better in an Algebra course.
Figure 12 displays the relationship between ADP exam score and the probability of earning a particular grade in Algebra for students in 4-year more selective institutions. Figure 12. Probability of Grade in Algebra Given ADP Exam Score-4S (curves for A or better, B or better, and C or better) Based on Figure 12, given an ADP exam score of approximately 15, a student would have a 65% probability of earning a C or better in an Algebra course. In order to have a 65% probability of earning a B or better in an Algebra course, a student would need to score approximately 32. Finally, an ADP exam score of 40 is associated with having a 65% probability of earning an A or better in an Algebra course. Confidential: Do not photocopy or distribute
Figure 13 displays the relationship between ADP exam score and the probability of earning a particular grade in Pre-Calculus for students in community colleges. Figure 13. Probability of Grade in Pre-Calculus Given ADP Exam Score-CC (curves for A or better, B or better, and C or better) Based on Figure 13, given an ADP exam score of approximately 6, a student would have a 65% probability of earning a C or better in a Pre-Calculus course. In order to have a 65% probability of earning a B or better in a Pre-Calculus course, a student would need to score approximately 32. Finally, an ADP exam score of 50 is associated with having a 65% probability of earning an A or better in a Pre-Calculus course.
Figure 14 displays the relationship between ADP exam score and the probability of earning a particular grade in Pre-Calculus for students in 4-year typical institutions. Figure 14. Probability of Grade in Pre-Calculus Given ADP Exam Score for 4T Students (curves for A or better, B or better, and C or better) Based on Figure 14, given an ADP exam score of approximately 21, a student would have a 65% probability of earning a C or better in a Pre-Calculus course. In order to have a 65% probability of earning a B or better in a Pre-Calculus course, a student would need to score approximately 38. Finally, an ADP exam score of 52 is associated with having a 65% probability of earning an A or better.
Conclusion

This study examined the relationship between ADP exam score and final course grade for students enrolled in Algebra and Pre-Calculus courses at community colleges, 4-year typical institutions, and 4-year more selective institutions. Data were analyzed disaggregated by course type and institution type. Analyses could not be conducted for Pre-Calculus students in 4-year more selective institutions, as there was too little variation in ADP exam scores across final course grades. Table 4 summarizes the logistic regression results for Algebra and Table 5 shows the results for Pre-Calculus. These tables indicate the ADP exam score required to have a 65% probability of achieving different course grades. Interestingly, for Algebra, the institution type requiring the highest ADP exam score varied with the grade being investigated. For Pre-Calculus, the 4-year typical institutions are associated with higher ADP scores than the community colleges.

Table 4. ADP Exam Score Associated with a 65% Probability of Algebra Grade

Algebra Grade   Community College   4-Year Typical     4-Year More Selective
A or better     (not obtainable)    49                 40
B or better     48                  37                 32
C or better     12                  (poor model fit)   15

Table 5. ADP Exam Score Associated with a 65% Probability of Pre-Calculus Grade

Pre-Calculus Grade   Community College   4-Year Typical
A or better          50                  52
B or better          32                  38
C or better          6                   21

Considerations

The results presented here should be interpreted with caution due to range restriction. The distribution of raw scores did not fully cover the available score points, resulting in artificial truncation of the results, particularly for Algebra students in 4-year more selective institutions, where the highest ADP score earned was a 42. In addition, lack of motivation was likely a factor.
ADP Algebra II Contrasting Groups Study

As a supplement to the logistic regression analyses predicting college course grades from ADP assessment performance, a study was conducted comparing the ADP Algebra II performance of several groups of students thought to differ in the degree to which they were prepared to learn college-level mathematics material. This study used the same data as the predictive study but analyzed them in a different way. The ADP assessment was administered at the beginning of the semester to students who were enrolled in college-level Algebra or Pre-Calculus courses, and final course grades were collected at the end of the semester. A contrasting groups analysis was conducted to compare the performance of students who earned an A or a B with students who earned a C, D, or F. Student performance on the ADP Algebra II examination was compared across school type.

Participating Students

Data collection for this study took place over two semesters. In the Fall 2008 data collection, students from seven colleges in Arizona and Maryland were administered the ADP examination. Students from 24 colleges in Hawaii, Indiana, Florida, Ohio, and New Jersey participated in the Spring 2009 data collection. Course grades were collected at the end of the semester and matched to students' ADP exam scores. In this study, students completed only one half of the examination (either the calculator or the non-calculator section). Because of this, scores were rescaled to the Spring 2008 full-test scale. To do so, the items in each section were first anchored to the item bank, which was on the Spring 2008 scale. Then, using the raw-score-to-theta conversions for each half test, an expected full-test raw score was calculated. So although students took only half of the exam, every student had an estimated score on the Spring 2008 full-test raw score scale, which ranges from 0 to 76.
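The half-test-to-full-test projection described above is, in IRT terms, an application of the test characteristic curve: once a student's theta is estimated from whichever half test they took, the expected full-test raw score is the sum of their predicted success probabilities over all items in the bank. The sketch below assumes a 2PL model with invented item parameters and dichotomous items (the operational exam also includes multi-point open-ended items), since the actual calibration is not published here:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta) for an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def expected_raw_score(theta, items):
    """Test characteristic curve: expected raw score is the sum of the
    per-item correct-response probabilities."""
    return sum(p_correct(theta, a, b) for a, b in items)

# Invented 76-point bank standing in for the Spring 2008 full test.
full_test = [(1.0, -2.0 + 0.05 * i) for i in range(76)]

# A theta estimated from either half test projects onto the 0-76 scale.
for theta_hat in (-1.0, 0.0, 1.0):
    print(theta_hat, round(expected_raw_score(theta_hat, full_test), 1))
```

Because the same theta-to-expected-score mapping is used for everyone, students who took different half tests end up on a common 0-76 reporting scale.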
Students enrolled in Algebra and Pre-Calculus courses participated and were distributed across institution type as shown in Table 1. The demographic breakdown of participating students is displayed in Table 2.

Table 1. Number of Participating Students by Course Type and Institution Type

               Community College   4-Year Typical   4-Year More Selective   Total
Algebra        …                   …                …                       1,078
Pre-Calculus   …                   …                …                       1,054
Total          929                 1,…              …                       2,132
Table 2. Demographic Characteristics of Study Participants

Ethnicity                Percent
African American         9%
American Indian          1%
Asian/Pacific Islander   10%
Hispanic                 6%
White                    40%
Unknown                  34%

Gender
Female                   53%
Male                     47%

Analyses Performed

A modified contrasting groups analysis was performed. The purpose of the contrasting groups analysis was to evaluate the relationship between student performance on the ADP Algebra II assessment and success in the first credit-bearing math course, where success is defined as earning a B or better. These analyses help identify the performance on ADP necessary to provide evidence of readiness to learn college-level mathematics material. The research design was taken from the standard setting literature and is a modification of the area of work typically referred to as contrasting groups.[1] In the contrasting groups study design, two populations of students are identified: those seen as masters and those seen as novices. Both populations are administered an assessment instrument, and their performance is compared. An empirical relationship is determined which illustrates how the masters performed relative to the novices on the assessment. The point where the two populations diverge is taken as the optimal cut score, or the point that would most often classify the masters and the novices correctly. For this study, the novices were considered to be the students enrolled in remedial mathematics courses, while students enrolled in either College Algebra or Pre-Calculus courses were considered to be masters. The study design takes advantage of the fact that these two groups of students had already been identified and classified as to their readiness to learn using other methods (course placement tests, ACT/SAT assessments, high school GPA, etc.). In other words, students enrolled in remedial mathematics courses were previously considered not ready to learn the mathematics content covered in Algebra and Pre-Calculus courses.
This type of analysis was used successfully in the establishment of the Higher Education Readiness Component on the Texas Assessment of Knowledge and Skills.[2]

The following contrasting groups analyses were conducted:

1. A, B students vs. C, D, F students in community colleges
2. A, B students vs. C, D, F students in 4-year typical institutions
3. A, B students vs. C, D, F students in 4-year selective institutions

Results

Figures 1-3 present the results of the contrasting groups analyses. These results provide a comparison of the ADP score distributions for students who were and were not successful in their first college credit-bearing course (either Algebra or Pre-Calculus). In all cases the distributions have been smoothed using a polynomial regression method of best fit. Visual inspection of the point where the two distributions intersect is one way of determining the test score that best classifies students as being college ready. The results displayed in Figures 1-3 are summarized in Table 3 below. The cut scores are identified by locating the point of intersection of the two distributions.

Table 3. Summary of Cut Score Results from Contrasting Groups Analyses

Figure   Comparison Groups                    Institution Type    Cut Score
1        A, B students vs. C, D, F students   Community college   25
2        A, B students vs. C, D, F students   4-year typical      26
3        A, B students vs. C, D, F students   4-year selective    23

[1] Zieky, M. J., & Livingston, S. A. (1977). Manual for setting standards on the basic skills assessment tests. Princeton, NJ: Educational Testing Service.
[2] dinessreport.pdf
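The intersection-of-distributions step can be sketched numerically: smooth each group's score frequencies with a polynomial fit (the "Poly." curves in Figures 1-4) and locate where the curves cross between the two group means. Everything below, including the sample data, is an invented illustration, not the study's actual scores or fitting code:

```python
import numpy as np

def smoothed_density(scores, grid, degree=4):
    """Proportion of students at each score point, smoothed with a
    polynomial fit (analogous to the report's 'Poly.' curves)."""
    scores = np.asarray(scores)
    counts = np.array([(scores == s).mean() for s in grid])
    return np.polyval(np.polyfit(grid, counts, degree), grid)

def contrasting_groups_cut(scores_ab, scores_cdf, degree=4):
    """Cut score: the grid point between the two group means where the
    smoothed distributions are closest, i.e. their crossing point."""
    grid = np.arange(0, 77)  # Spring 2008 full-test raw scale, 0-76
    hi = smoothed_density(scores_ab, grid, degree)
    lo = smoothed_density(scores_cdf, grid, degree)
    mask = (grid >= np.mean(scores_cdf)) & (grid <= np.mean(scores_ab))
    return int(grid[mask][np.argmin(np.abs(hi - lo)[mask])])

# Synthetic data: A/B students centered near 35, C/D/F students near 15.
rng = np.random.default_rng(0)
ab = np.clip(np.rint(rng.normal(35, 8, 500)), 0, 76)
cdf = np.clip(np.rint(rng.normal(15, 8, 500)), 0, 76)
print(contrasting_groups_cut(ab, cdf))
```

Restricting the search to the interval between the group means avoids spurious crossings in the tails, where a low-degree polynomial can wiggle below zero.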
Figure 1. Community College Students: A, B Students vs. C, D, F Students (percent of students at each ADP exam score; "Poly." curves are the polynomial-smoothed distributions)

Figure 2. 4-Year Typical Students: A, B Students vs. C, D, F Students (percent of students at each ADP exam score; "Poly." curves are the polynomial-smoothed distributions)
Figure 3. 4-Year More Selective Students: A, B Students vs. C, D, F Students (percent of students at each ADP exam score; "Poly." curves are the polynomial-smoothed distributions)

Additionally, since the cuts were similar regardless of institution type, an additional contrasting groups analysis was conducted with the data aggregated across institution types. The results displayed in Figure 4 are summarized in Table 4 below. The cut score was identified by locating the point of intersection of the two distributions.

Table 4. Summary of Cut Score Results from Contrasting Groups Analyses

Figure   Comparison Groups                    Institution Type   Cut Score
4        A, B students vs. C, D, F students   All                25
Figure 4. All Institution Types: A, B Students vs. C, D, F Students (percent of students at each ADP exam score; "Poly." curves are the polynomial-smoothed distributions)

Conclusions

The results of these analyses suggest that cut scores between 23 and 26 points (out of a total of 76) best distinguish between those ready to learn college-level mathematics material and those not ready. The cut scores observed in this study do not appear to be heavily dependent on the type of institution in which the student is enrolled. When data from all institution types were aggregated, the cut score was 25.

Considerations

The results of this study should be interpreted with caution. The sample size was fairly small: the distributions by institution type comprised roughly 500 students each. In addition, lack of motivation was likely a factor in the low ADP exam scores.
4. Judgment Studies
Executive Summary: Judgment Study

Background
- The judgment study provides data about how the ADP Algebra II exam relates to college professors' expectations of what students need to have previously mastered in order to successfully learn the material that will be covered in their College Algebra or Pre-Calculus course, where successful is defined as earning a B or better without remediation.
- The data gathered were used to inform the ADP Algebra II content-based performance level descriptors for Needs Intensive Preparation, Needs Preparation, Prepared, and Well Prepared, and to calculate an initial cut score separating Prepared students from Needs Preparation students.
- At the time of the judgment studies, four performance levels were planned; however, the lowest level, Needs Intensive Preparation, has since been dropped.

Participants
- Three one-day meetings were held in the spring of 2009:
  - Little Rock, Arkansas
  - Baltimore, Maryland
  - Cleveland, Ohio
- 133 professors representing 79 institutions from 20 states participated.
- Professors were currently teaching either College Algebra or Pre-Calculus.
- Community colleges, 4-year typical institutions, and 4-year more selective institutions were represented.

Methods
- Participants were presented with three major tasks:
  - Standards Relevance Survey: each ADP Algebra II benchmark was rated as to its relevance in preparing successful students for the first college-level credit-bearing math course. Benchmarks were rated as Essential, Important, Helpful, or Not Relevant.
  - Defining Threshold Descriptors: participants described what knowledge, skills, and abilities students in each of the performance levels would be able to demonstrate on the first day of their college-level math course.
  - Item-Level Judgments: participants examined each item in the Spring 2008 operational exam and determined the number of points a successful student would earn on the first day of class.

Results
- Standards Relevance Survey:
  - There was clear agreement that Real Numbers and Algebraic Expressions represent important or essential skills that should be previously mastered by incoming students in order to be successful in their first college-level math course.
  - Elements of Linear Equations and Inequalities, Quadratic Functions, and Function Operations were also considered important skills by most participants.
- Defining Threshold Descriptors: see Tab 5 for the final Performance Level Descriptors.
- Item-Level Judgments:
  - Following Round 2, cut scores varied by course and institution type, with subgroup medians ranging from 23.0 to ….
  - Pre-Calculus cut scores were consistently higher than Algebra cut scores.
  - Cuts for the 4-year institutions were higher than those for community colleges.
  - The median cut score from all data was ….

Considerations
- While the instructors all taught College Algebra or Pre-Calculus, the content varied considerably across the institutions. (See Appendix B3: Analysis of Judgment Study Syllabi.)
- The understanding of a "B or better" student was likely interpreted differently by each instructor based on their own individual expectations and the grading norms at their institution.
- Anecdotal evidence from comments made by the instructors at the meetings suggests that the instructors have lowered their expectations to accommodate an increasingly unprepared population of students (i.e., "I don't expect them to know this since I have to teach it anyway").
ADP Algebra II End-of-Course Exam
Judgment Study Summary Report
July 2009
Psychometric Services, Pearson
Table of Contents

Introduction and Background .......... 1
Who Were the Panelists? .......... 2
Judgment Study Meeting Tasks .......... 3
  Task 1: Standards Relevance Survey .......... 3
  Task 2: Defining Threshold Descriptors .......... 3
  Task 3: Making Item-Level Judgments .......... 4
    Round 1
    Discussion .......... 5
    Round 2
  Task 4: Filling out Final Paperwork .......... 5
Results .......... 5
  Task 1: Standards Relevance Survey .......... 5
  Task 2: Defining Threshold Descriptions .......... 9
    Algebra II Core Competencies .......... 9
    Performance Level Descriptions
  Task 3: Item Level Judgments
    Round 1
      Community College Group
      Four Year Typical Admittance Rate Group
      Four Year Selective Admittance Rate Group
    Round 2
      Community College Group
      Four Year Typical Admittance Rate Group
      Four Year Selective Admittance Rate Group
    Comparison of Round 2 Results
  Task 4: Exit Survey
Conclusions
Appendices
  Appendix A: List of Participants by Meeting
    Table 1. Little Rock Meeting Participants
  Appendix B: ADP Judgment Study Script
  Appendix C: General Session PowerPoint Slides
  Appendix D: ADP Algebra II EOC Exam Standards
  Appendix E: Exit Survey
    Overall Results
    Four-Year Selective Admittance Institution Results
    Four-Year Typical Admittance Institution Results
    Community College Results
  Appendix F: Demographic Survey
Introduction and Background

During three one-day meetings, post-secondary mathematics instructors/professors from 15 ADP Algebra II states, as well as five ADP Network states, reviewed the ADP Algebra II assessment standards and test items and made judgments about their relevance to the college mathematics course they teach. The purpose of the meetings was to determine what knowledge, skills, and abilities a college student needs to have previously mastered in order to successfully learn the material that will be covered in their first credit-bearing math course, specifically College Algebra or Pre-Calculus. "Successfully learn" refers to students who will ultimately earn either an A or B in the class without taking a remediation class. In total, 133 professors representing 79 institutions from 20 states attended the meetings. The first meeting was held February 19, 2009 in Little Rock, Arkansas. Fifty-three professors representing 29 institutions from five states (AR, AZ, FL, HI, and MN) attended. The second meeting was held March 2, 2009 in Baltimore, Maryland. Thirty-four professors representing 20 institutions from six states (MA, MD, NC, NJ, PA, and RI) attended.[1] The final meeting was held on April 9, 2009 in Cleveland, Ohio. Forty-six professors representing 30 institutions from 13 states (FL, IL, IN, KY, LA, MD, MI, NC, OH, PA, TX, WA, and WI) attended. Appendix A of this report lists the institutions represented at each meeting.

At each of the three meetings, panelists engaged in three tasks that helped define the score that separates students who are Prepared for College from those who Need Preparation. The data gathered across these meetings were used to inform the ADP Algebra II content-based performance level descriptors (PLDs) and to calculate an initial cut score separating Prepared students from Needs Preparation students.
These initial recommendations for cuts will be provided to the standard setting committee, which is ultimately responsible for making recommendations to Achieve regarding cut scores. Each judgment study meeting began with a general session providing an overview of Achieve and the ADP Network, the Assessment Consortium, the ADP Algebra II End-of-Course Exam, and the purpose of the meeting. The reason for setting performance standards was explained, and an overview of the three separate tasks to be accomplished by the panelists was provided. Following the general session, the larger group was separated into three smaller groups based on the type of institution each panelist was representing. Participants were in one of three groups: community college (CC), four-year college/university with typical admittance rates (4T), or four-year college/university with selective admittance rates (4S). Table 1 shows the distribution of institution type by meeting.[2]

[1] Due to an ice storm that impeded travel, participation was limited.
[2] Typical institutions admitted 64-89% of applicants; selective institutions had admittance rates of 40-63%.
Table 1. Institution Type by Meeting

Type    Little Rock   Baltimore   Cleveland   Total
4S      …             …           …           …
4T      …             …           …           …
CC      …             …           …           …
Total   53            34          46          133

While in the smaller breakout groups, the panelists engaged in the three tasks of the day. The first task was to fill out a survey judging the relevance of the ADP Algebra II exam standards relative to the course they were currently teaching (either College Algebra or Pre-Calculus). The second task had the panelists working as a table (based on type of course taught) to define the performance level thresholds for just-barely Prepared, just-barely Well-Prepared, and just-barely Needs Preparation students. After discussion and noting agreement across tables about the threshold level descriptions for the Prepared performance level, the panelists took the test in order to experience it as a student would. They discussed the test as a whole and then engaged in two rounds of rating the individual test items in terms of the number of score points a just-barely Prepared student would earn on the first day in a college-level credit-bearing math course. Following the first round of ratings, the panelists engaged in table-level discussions about how the individual items were rated, as well as the resulting raw score cuts for their table, their subject area, and the room as a whole. Panelists were allowed a second round of ratings following discussion, then returned all materials, including a demographic survey and an exit survey, and were adjourned for the day.

Who Were the Panelists?

The 133 panelists involved in the judgment study event represented 79 separate institutions from 20 states participating in the ADP Assessment Consortium or ADP Network: Arkansas (26), Arizona (7), Florida (11), Hawaii (10), Illinois (1), Indiana (10), Kentucky (2), Louisiana (1), Massachusetts (3), Maryland (11), Michigan (2), Minnesota (1), North Carolina (8), New Jersey (5), Ohio (14), Pennsylvania (9), Rhode Island (3), Texas (2), Washington (4), and Wisconsin (3).
The group was primarily female (61%), predominantly white (78%), and mostly currently teaching College Algebra (65%). Seventy-two percent of the group had at least 10 years' experience teaching college-level mathematics and 33% had at least 20 years' experience; 95% had a math-related degree. Table 2 shows the breakdown of various demographics of the panelists.
Table 2. Demographics of the Panelists

Gender: Male 39%, Female 61%
Ethnicity: American Indian 2%, Asian/Pacific Islander 11%, Black 6%, Hispanic 2%, White 78%, Other 1%
Years Teaching: 1-3: 4%; 4-6: 9%; …
Highest Degree: B.A./B.S. 2%, M.A./M.S. 53%, Ph.D. 46%
Math-Related Field: Yes 95%, No 5%
Course Taught: College Algebra 65%, Pre-Calculus 35%

Judgment Study Meeting Tasks

Dr. Julie Miles facilitated the general session for the group. The breakout rooms were facilitated by Mary Veazey (4S), Chris Rozunick (4T), and Julie Miles (CC). The script used by the facilitators is included in Appendix B, and the PowerPoint slides used during the general session are included in Appendix C.

Task 1: Standards Relevance Survey

Panelists were provided with a survey listing each of the 41 ADP Algebra II EOC exam benchmarks. It was explained that the purpose of the survey was to gauge the relevance of each benchmark in preparing successful students for the first college-level credit-bearing math course, specifically College Algebra or Pre-Calculus. "Successful" was defined for the panelists as incoming students who are ready to learn without remediation and who would ultimately earn a B or better in their course. Panelists were instructed to individually review each benchmark the exam may cover and rate the relevance of each with regard to the course they were currently teaching. Ratings were provided in terms of which benchmarks should be previously mastered by incoming students so that they would be successful in their first college-level math course. The response options were Essential, Important, Helpful, and Not Relevant. See Appendix D for the ADP Algebra II EOC exam benchmarks rated by each panelist.

Task 2: Defining Threshold Descriptors

Performance level descriptors are statements of what a student should know and be able to do at each achievement level given the content standards being assessed.
For the purposes of this judgment study meeting, four performance levels were discussed (Needs Intensive Preparation, Needs Preparation, Prepared, and Well-Prepared). Using the ADP Algebra II exam standards, the panelists were asked to describe what the group of Needs Preparation, Prepared, and Well-Prepared students would be able to demonstrate in terms
of knowledge, skills, or abilities (KSAs) on the first day of their college-level math course as related to each objective for the test of interest. Two steps were involved in writing these preliminary threshold descriptors:

1. Identifying what the panelists at a single table believed were the knowledge, skills, and abilities (KSAs) of students just-barely in each performance level.
2. Discussing as a group any commonalities that existed across tables.

The table-level groups began this task by focusing on the just-barely Prepared student. Since Prepared indicates the student will ultimately earn a B or better, it was suggested that they think of the just-barely Prepared student as the student who will earn a B-. The focus of this task was to identify the skills that fundamentally separate students in the Needs Preparation category from students just-barely in the Prepared category. Once the tables had defined the just-barely Prepared KSAs, they created separate lists for the just-barely Well-Prepared and the just-barely Needs Preparation performance levels. Once all lists were completed at all tables, a room-level discussion occurred to see if commonalities existed across tables for any of the levels. These lists of KSAs will be combined with the other judgment study meetings' lists to form the basis for the performance level descriptions to be used on student score reports and at the July standard setting meeting.

Task 3: Making Item-Level Judgments

After the threshold level descriptors were discussed, following lunch, panelists were told that the next task would be used to determine what score on the ADP Algebra II End-of-Course Exam a high school student would need to earn to predict success in the panelists' courses.
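Mechanically, the judgment task that follows reduces to simple bookkeeping: each panelist's per-item point judgments sum to an implied raw cut score, and the between-round feedback is descriptive statistics over those sums. A sketch with invented ratings (the actual exam has 57 operational items):

```python
from statistics import mean, median, stdev

def panelist_cut(item_ratings):
    """One panelist's implied raw cut: total points a just-barely Prepared
    student would earn (0/1 for multiple choice, 0..max for open-ended)."""
    return sum(item_ratings)

def round_feedback(all_ratings):
    """Between-round feedback: mean, median, SD, min, and max of the
    panelists' implied cut scores."""
    cuts = [panelist_cut(r) for r in all_ratings]
    return {"mean": mean(cuts), "median": median(cuts), "sd": stdev(cuts),
            "min": min(cuts), "max": max(cuts)}

# Invented ratings from three panelists on a five-item excerpt
# (the fourth item is open-ended, worth multiple points).
ratings = [[1, 0, 1, 2, 1],
           [1, 1, 0, 3, 1],
           [0, 1, 1, 2, 0]]
print(round_feedback(ratings))
```

Medians of these per-panelist cuts, taken within each table, subject, and room, are the kind of subgroup cut scores summarized earlier in this report.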
Panelists were provided a binder with the operational items from the Spring 2008 assessment, the scoring rubrics for open-ended items, and a key sheet showing information about each item in the binder: item sequence, whether or not it was a calculator item, the standard being assessed, and the correct key. Items were provided in operational order from the Spring 2008 administration; they were not in order of difficulty. Panelists were instructed to review each item in the binder and determine the number of points a successful student WOULD earn on that item if they encountered it on the first day in the panelist's course. Since the focus of this task was on predicting success in college, the panelists were instructed to focus on the just-barely Prepared students when making their item-level judgments. Following the instructions and a brief question-and-answer period, panelists indicated their readiness to proceed with Task 3 by filling out a survey indicating they understood their task and were ready to proceed.

Round 1. Panelists began with the first multiple-choice item in the binder and answered the question, "Would a successful student get this item correct on their first day in my class?" If the answer was yes, they were instructed to put a 1 on their rating sheet
next to the item; if the answer was no, they were instructed to put a 0 on their rating sheet next to the item. The panelists worked through the binder in this fashion until they encountered an open-ended item. For open-ended items, the question changed to, "How many points would a successful student receive if they answered this question on their first day in my class?" Once they reviewed the provided scoring rubrics and determined a score, they were instructed to put this score on their rating sheet next to the item. They proceeded in this manner until all 57 operational items had been judged. Throughout the task, panelists were reminded to review each item in light of the previously discussed threshold level descriptors and to consider what a successful student WOULD get correct, not what they SHOULD get correct. This distinction between would and should was made several times. As panelists worked on their individual judgments, facilitators were available to answer procedural questions and clarify the purpose of their role in the process. As panelists finished their ratings, they returned their judgment sheets to the facilitator. Once all panelists were finished with the Round 1 ratings, the data were analyzed and handouts of the results were printed for discussion.

Discussion. Following Round 1, panelists were provided information on the results of Round 1. They were shown the mean, median, standard deviation, minimum, and maximum cut scores that resulted from Round 1 for their individual table, their subject (either College Algebra or Pre-Calculus), and the room as a whole (institution type). At their individual tables, the panelists were encouraged to state the reasoning for their item-level judgments relative to their expectations of how a successful student would respond, especially for items where significant disagreement in the ratings occurred.

Round 2.
After discussion of the Round 1 results, panelists made their final ratings by reviewing each item again in light of the conversation and discussion at their table and across the room following Round 1.

Task 4: Filling out Final Paperwork

Upon completing Task 3, panelists were asked to fill out an exit survey addressing their comfort with and understanding of the procedures used, along with a demographic survey and reimbursement form. Appendix E shows the exit survey along with the results from each institution type. Appendix F shows the demographic survey.

Results

Task 1: Standards Relevance Survey

Table 3 on the following pages presents the mean rating of each content standard by subject taught (Algebra or Pre-Calculus), the number of years the panelist has been teaching college-level mathematics, and the type of institution represented (4S, 4T, or CC). For these analyses, the ratings were recoded so that Essential = 3, Important = 2, Helpful = 1, and Not Relevant = 0. Cells shaded gray indicate that the
99 average rating is greater than or equal to 2.0. The results show clear agreement among panelists, regardless of their subject, experience or institution type, that Real Numbers (O1) and Algebraic Expressions (O3) represent important or essential skills. Elements of Linear Equations and Inequalities (E1d), Quadratic Functions (P1a and P1b), and Function Operations (F1a) were also considered important skills by most subgroups of individuals with a few subgroups in disagreement on their relevance. 6
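As a minimal illustration of the recoding and shading rule described above (Essential = 3, Important = 2, Helpful = 1, Not Relevant = 0, with cells shaded when the subgroup mean is at least 2.0), the following sketch uses invented subgroup ratings; the actual panelist data are not reproduced here:

```python
from statistics import mean

# Recoding of the Standards Relevance Survey responses, as described above.
RECODE = {"Essential": 3, "Important": 2, "Helpful": 1, "Not Relevant": 0}
SHADE_THRESHOLD = 2.0  # cells are shaded when the mean rating is >= 2.0

def mean_rating(ratings):
    """Mean recoded rating for one content standard within one subgroup."""
    return mean(RECODE[r] for r in ratings)

def shaded(ratings):
    """True when the subgroup mean meets the 'important or essential' bar."""
    return mean_rating(ratings) >= SHADE_THRESHOLD

# Invented ratings of one standard by one hypothetical subgroup of panelists.
o1_example = ["Essential", "Important", "Essential", "Helpful"]
print(mean_rating(o1_example), shaded(o1_example))  # 2.25 True
```

A subgroup whose ratings average below 2.0 (for example, mostly Helpful or Not Relevant responses) would leave the corresponding cell unshaded.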
Table 3. Mean Ratings of Content Standards, with Shaded Cells for Values of 2.00 or Higher

Columns: Content Standard; Institution Type (4S, 4T, CC); Subject (AL, PC); Years of Teaching Experience; All.
Rows (content standards): O1a, O1b, O1c, O2a, O2b, O3a, O3b, O3c, O3d, O3e, O3f, E1a, E1b, E1c, E1d, E2a, E2b, E2c, E2d, E2e, P1a, P1b, P1c, P1d, P2a, P2b, P2c, P2d, P2e, P2f, X1a, X1b, X1c, X1d, F1a, F1b, F2a, F2b, F3a, F3b, F3c.
(Cell values not reproduced in this transcription.)
Task 2: Defining Threshold Descriptions

Using the ADP Algebra II exam standards, the panelists were asked to describe what the groups of Needs Preparation, Prepared, and Well-Prepared students would be able to demonstrate in terms of knowledge, skills, or abilities (KSAs) on the first day of their college-level math course, as related to each objective of the test of interest. Following the meetings, Pearson content support services staff combined all bulleted lists provided at each meeting by each subgroup and looked for commonalities and patterns in the lists of KSAs within each performance level under consideration.³ This list of commonalities resulted in a list of Core Competencies, as shown below.

Algebra II Core Competencies

In order to show mastery, students must consistently apply procedural knowledge and demonstrate conceptual understanding and skills across the Algebra II objectives. The following list represents a sample of what students will likely be able to do prior to their first credit-bearing college mathematics course.

Operations on Numbers and Expressions
- Convert between and among radical and exponential forms of numerical and algebraic expressions
- Simplify and perform operations on numerical and algebraic expressions containing radicals
- Apply the laws of exponents to numerical and algebraic expressions, including those with rational and negative exponents
- Perform operations on polynomial and rational expressions
- Write equivalent algebraic expressions in one or more variables and extract information from the expressions

Equations and Inequalities
- Solve equations and inequalities involving the absolute value of a linear expression
- Recognize and solve problems that can be represented by linear equations/inequalities in one or two variables, systems of linear equations in two or more variables, or systems of linear inequalities in two variables, and interpret the solution(s) in terms of the context of the problem
- Solve single-variable quadratic equations and inequalities over the set of complex numbers
- Solve factorable higher-order polynomial equations
- Solve rational and radical equations over the set of real numbers

³ These same staff members are engaged in ADP item and test development activities. Additionally, they acted as facilitators at each of the meetings and had first-hand knowledge of the discussions that generated the bulleted lists.
Polynomial and Rational Functions
- Determine key characteristics of quadratic functions and their graphs
- Represent quadratic functions using tables, graphs, verbal statements, and equations, and translate among these representations
- Describe and represent the effect that changes in the parameters of a quadratic function have on the shape and position of its graph
- Recognize, express, and solve problems that can be modeled using quadratic functions
- Determine key characteristics of power functions in the form f(x) = ax^n, a ≠ 0, for positive integral values of n, and their graphs
- Determine key characteristics of simple rational functions in the form f(x) = ax^n, a ≠ 0, for negative integral values of n, and their graphs

Exponential Functions
- Determine key characteristics of exponential functions and their graphs

Function Operations and Inverses
- Combine functions by addition, subtraction, multiplication, and division
- Determine the composition of two functions, including any necessary restrictions on the domain

Performance Level Descriptions

The list of core competencies was then further refined into Performance Level Descriptions that will be used for score reporting purposes. Performance level descriptions are more general and are meant as a guide to students, parents, and teachers as to the knowledge, skills, and abilities that students are able to demonstrate within each performance level assigned.

- Well Prepared - The student consistently applies concepts, procedures, and skills needed to show mastery of Algebra II. The student is highly effective at devising and clearly communicating a wide range of strategies to solve complex mathematical and contextual problems. The student computes accurately and uses precise mathematical and symbolic language to solve problems and communicate solutions.
The student's explanations demonstrate the ability to reason mathematically, making appropriate connections between ideas in or across areas of mathematics, using formal reasoning to justify solutions, and evaluating the validity of solutions.

- Prepared - The student usually applies concepts, procedures, and skills to show adequate progress toward the mastery of Algebra II. The student is usually effective at devising and communicating a variety of strategies to solve mathematical and contextual problems. The student is adept in computation and uses mathematical and symbolic language to solve problems and communicate
solutions. The student's explanations demonstrate the ability to reason mathematically, recognizing connections between ideas in or across areas of mathematics, using formal and informal reasoning to justify solutions, and evaluating the validity of solutions.

- Needs Preparation - The student inconsistently applies concepts, procedures, and skills, showing minimal progress toward the mastery of Algebra II. The student is generally effective at recalling and using routine, easily recognized, or straightforward strategies to solve simple mathematical and some contextual problems. The student can generally compute accurately and uses limited mathematical and symbolic language to solve problems and communicate solutions. The student's explanations demonstrate limited ability to reason mathematically, using informal reasoning to justify solutions, and evaluating the validity of solutions.

Task 3: Item Level Judgments

This section presents the group results of Rounds 1 and 2 of the item-level judgments for each institution type, overall and broken down by subject taught. Each table presented shows the mean, median, standard deviation, minimum, and maximum cut scores assigned within each round. The community college group had 51 panelists, while the four-year typical and selective admittance groups had 46 and 35, respectively.

Round 1

Across all institution types, and regardless of subject taught, Round 1 results reveal a wide range in expectations of what a student should know and be able to demonstrate on day one of their first credit-bearing college course in order to ultimately be successful (earn a "B or better"). The possible range in cut scores is 0 to 76; the cut scores resulting from Round 1 ratings ranged from 5 to 70.

Community College Group. The Prepared median raw score cut was The mean was with a standard deviation of and the range was 7 to Table 4.
Community College Round 1 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined

The panelist assigning the cut score of 70 incorrectly rated items on what a student SHOULD know on day one instead of what a successful student WOULD know. The panelist refined his judgment in Round 2.
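The rating procedure described earlier (a 1 or 0 judgment on each multiple-choice item, plus an expected number of score points on each open-ended item) yields each panelist's recommended raw cut score as a simple sum. A minimal sketch, with invented item judgments (the real exam had 57 operational items worth up to 76 points):

```python
# One hypothetical panelist's judgments, following the procedure described
# above: multiple-choice items are rated 1 ("would get it correct") or 0,
# and open-ended items are rated with the score points a successful student
# would earn. The values below are invented for illustration only.
mc_judgments = [1, 0, 1, 1, 0, 1]   # subset of multiple-choice ratings
oe_judgments = [3, 2, 0, 4]         # subset of open-ended point ratings

def panelist_cut_score(mc, oe):
    """Recommended raw cut score: the sum of all item-level judgments."""
    return sum(mc) + sum(oe)

print(panelist_cut_score(mc_judgments, oe_judgments))  # 4 + 9 = 13
```

Summing in this way is what produces the 0-to-76 possible range noted above; the group statistics in Tables 4-10 are then computed over these per-panelist sums.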
Four Year Typical Admittance Rate Group. The Prepared median raw score cut was The mean was with a standard deviation of 8.47 and the range was 9 to 48.

Table 5. Four Year Typical Round 1 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined

Four Year Selective Admittance Rate Group. The Prepared median raw score cut was The mean was with a standard deviation of and the range was 5 to 64.

Table 6. Four Year Selective Round 1 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined

Round 2

Community College Group. The Prepared median raw score cut was The mean was with a standard deviation of 7.53 and the range was 11 to 49.

Table 7. Community College Round 2 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined
Four Year Typical Admittance Rate Group. The Prepared median raw score cut was The mean was with a standard deviation of 6.89 and the range was 18 to 44.

Table 8. Four Year Typical Round 2 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined

Four Year Selective Admittance Rate Group. The Prepared median raw score cut was The mean was with a standard deviation of and the range was 11 to 47.

Table 9. Four Year Selective Round 2 Results

Subject Mean Median Std Dev Minimum Maximum
Algebra
Pre-Calculus
Combined

Comparison of Round 2 Results

As expected, the median cut score for the Prepared student varies across institution types. Community colleges would expect an incoming student to score a in order to be successful in their courses, while four-year institutions have higher expectations of and for the typical and more selective institutions, respectively.

Table 10. Comparison by Institution Type

Institution Type (N) Mean Median Std Dev Minimum Maximum
Community College (51)
Four Year Typical (46)
Four Year Selective (35)

When looking at the overall results by subject taught, the Prepared student would need to earn a median score of to be successful in a College Algebra course, and a score of would be needed to be successful in a Pre-Calculus course. Pre-Calculus instructors were slightly more consistent in their judgments, based on the ranges, when compared across subjects.
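The five statistics reported for each group in Tables 4-10 (mean, median, standard deviation, minimum, and maximum of the panelists' cut scores) can be sketched as follows; the cut-score values here are invented, since the report gives only summary statistics, not individual ratings:

```python
from statistics import mean, median, stdev

# Invented Round 2 cut scores grouped by institution type, for illustration.
cuts = {
    "CC": [28, 31, 25, 34, 30],
    "4T": [33, 36, 31, 38],
    "4S": [35, 40, 33, 42],
}

def summarize(scores):
    """The five statistics reported for each group in Tables 4-10."""
    return {
        "mean": round(mean(scores), 2),
        "median": median(scores),
        "std_dev": round(stdev(scores), 2),  # sample standard deviation
        "minimum": min(scores),
        "maximum": max(scores),
    }

for group, scores in cuts.items():
    print(group, summarize(scores))
```

Note that `stdev` computes the sample (n - 1) standard deviation; whether the report used the sample or population formula is not stated, so this is an assumption.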
Table 11. Comparison by Subject Taught

Subject (N) Mean Median Std Dev Minimum Maximum
Algebra (88)
Pre-Calculus (44)

Since it was possible that institution type was confounding the results by subject, the median Round 2 cut score is plotted by institution type in Figure 1, which indicates that, regardless of the subject being taught, the instructors have similar expectations of what the successful student would be able to demonstrate on day one of their first credit-bearing college math course, but that four-year institutions have slightly higher expectations than do community colleges.

Figure 1. Round 2 Median Cut Score by Institution Type and Subject (median Round 2 cut scores plotted for the AL and PC groups at each institution type: CC, 4T, 4S)

Additional analyses were done to investigate the impact of the panelists' reported years of teaching experience on their cut scores. When looking at the overall results by the number of years the panelist has been teaching college-level mathematics, there is no definitive pattern in terms of the expected score a student would need to achieve in order to be successful (r = 0.021, p = .81).
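The near-zero association between years of teaching experience and recommended cut scores (r = 0.021 above) is a Pearson correlation. A minimal self-contained sketch of that computation, using invented data chosen to show a similarly negligible correlation:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: years teaching vs. recommended cut score, with essentially
# no linear relationship, echoing the pattern reported above.
years = [2, 5, 8, 11, 14, 17, 20]
cut_scores = [31, 28, 33, 30, 29, 32, 30]
print(round(pearson_r(years, cut_scores), 3))
```

The reported p-value (.81) would additionally require a t-test on r with n - 2 degrees of freedom, which is omitted here for brevity.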
Table 12. Combined Across Years Teaching

Years Teaching (N) Mean Median Std Dev Minimum Maximum
1-3 (5)
(12)
(20)
(28)
(22)
(16)
(27)

This lack of a definitive pattern could have been caused by the confounding influence of institution type or the subject being taught by the instructors. In Figure 2 the results are disaggregated by type of institution. The pattern of scores generally shows that the 4T group has higher expectations than the CC group, and that the 4S group had higher expectations in the earlier years of teaching with declining expectations in the later years, although this erratic pattern could be an artifact of the smaller number of professors in some of these 4S groupings. None of the groups had a statistically significant relationship between years teaching and their expected cut scores.

Figure 2. Round 2 Median Cut Score by Years of Teaching (median Round 2 cut scores plotted by years teaching for the CC, 4T, and 4S groups)
To further investigate whether the subject being taught was impacting the results, Figure 3 shows the median Round 2 cut score by years of teaching for each subject. The results show a surprising concordance in expectations within each group regardless of the subject being taught, with the Pre-Calculus group having slightly higher cut scores in most instances. The single point of strong divergence (at 1-3 years of teaching experience) is likely due to the fact that the Pre-Calculus group had only a single instructor at this level of experience, and it is therefore not likely to be a reliable estimate of expectations.

Figure 3. Round 2 Median Cut Score by Subject Taught (median Round 2 cut scores plotted by years teaching for the AL and PC groups)

When combining all data across all subgroups, the resulting median cut score that separates Prepared students from Needs Preparation is

Task 4: Exit Survey

Following all activities, panelists were asked to indicate their level of agreement with each of the statements below. The overall results indicate that the majority of the panelists either Agreed (4) or Strongly Agreed (5) with each of the statements. Following each statement is the average agreement level on the 5-point Likert scale. For the complete breakout of results by level of agreement for the overall group and by institution type, please see Appendix E.

1. I was comfortable with my ratings of the relevance of each ADP Algebra II learning objective (mean = 4.3).
2. I had a good understanding of what the test was intended to measure (mean = 4.4).
3. The method for rating items (as Yes = 1 or No = 0) was conceptually clear (mean = 4.6).
4. After the first round of ratings, I felt comfortable with the standard setting procedure (mean = 4.3).
5. I found the feedback comparing my ratings to those of other panelists useful in setting the standard (mean = 4.3).
6. How confident do you feel that your final cut score recommendation reflects the ability level of the Successful student in your course (mean = 4.1)?

Conclusions

The results of the Standards Relevance Survey identify some clear content standards that panelists, regardless of background, consider to be Important or Essential for incoming students to master prior to entry into credit-bearing College Algebra or Pre-Calculus courses. These identified standards, and the information provided by the panelists during the Threshold Description task, will form the basis for the core competencies and performance level descriptors that will be used on student score reports and at the standard setting event in July.

The results from the item-level judgments indicate, as expected, that there are widely differing expectations based on the type of course taught as well as the type of institution represented. In general, the cut scores derived from the item-level judgments indicate that Pre-Calculus instructors expect students to know more on their first day in their course than do Algebra instructors and, as expected, that instructors from four-year institutions had higher expectations than instructors from community colleges. Interestingly, the number of years of experience teaching college-level mathematics has very little impact on expectations, regardless of institution type or subject taught.
Appendices
Appendix A: List of Participants by Meeting
Table 1. Little Rock Meeting Participants

Meeting Where AR AZ FL HI MN Total
Arizona Western College 2 2
Arkansas State University Mountain Home 3 3
Broward College 2 2
Central Arizona College 2 2
Cossatot Community College 1 1
FGCU 1 1
Henderson State University 3 3
Honolulu Community College 1 1
Kauai Community College 1 1
Leech Lake Tribal College 1 1
Leeward Community College 2 2
Maui Community College 1 1
Northern Arizona University 3 3
Ozarka College 3 3
Palm Beach Community College 1 1
Southern Arkansas University 3 3
St Cloud State University 1 1
University of Arkansas 1 1
University of Arkansas -Fayetteville 2 2
University of Arkansas, Fort Smith 3 3
University of Arkansas, Little Rock 3 3
University of Arkansas 1 1
University of Central Arkansas 2 2
University of Central Florida 2 2
University of Hawaii 2 2
University of Hawaii Hilo 2 2
University of North Florida 1 1
University of West Florida 1 1
Windward Community College 1 1
(blank) 1 1
Total Little Rock, Arkansas
Table 2. Baltimore Meeting Participants

Meeting Where MA MD NC NJ OH PA RI Total
Anne Arundel Community College 3 3
Baltimore City Community College 2 2
Bergen Community College 1 1
Bloomsburg University 3 3
Central Piedmont Community College 2 2
County College of Morris 1 1
Delaware Community College 1 1
Durham Tech Community College 2 2
Framingham State College 2 2
Kutztown University 2 2
Millersville University 1 1
Montclair State University 2 2
Montgomery College 2 2
Morgan State University 2 2
Penn State University 1 1
Rhode Island College 3 3
Rowan University 1 1
Towson University 1 1
University of Massachusetts Boston 1 1
Winston Salem State University 1 1
Total Baltimore, Maryland
Table 3. Cleveland Meeting Participants

Meeting Where FL IL IN KY LA MD MI NC OH PA TX WA WI Total
Ball State University 1 1
Bossier Parish Community College 1 1
California University of Pennsylvania 1 1
Central Washington University 1 1
Columbus State Community College 1 1
Eastern Washington University 3 3
Elgin Community College 1 1
Fayetteville State University, NC 3 3
Indiana Purdue Fort Wayne 1 1
Indiana State University 1 1
Indiana University, Bloomington 1 1
Indiana University, Purdue 1 1
Ivy Tech Community College 3 3
James A. Rhodes State College 1 1
Jefferson Community College 2 2
Kent State University 1 1
Kent State University, Trumbull 2 2
Lake Superior State University 1 1
Lakeland Community College 2 2
Purdue University North Central 2 2
Rhodes State College 2 2
San Jacinto College 1 1
The Ohio State University 3 3
TU 1 1
University of Cincinnati 2 2
University of Florida 1 1
University of North Texas 1 1
University of South Florida 2 2
University of Wisconsin, Madison 1 1
University of Wisconsin, Milwaukee 2 2
Total Cleveland, Ohio
Appendix B: ADP Judgment Study Script
ADP Judgment Study Script

Panelist Check In 30 Minutes (7:30-8:15)

Set up: Panelists should check in with Pearson staff. Upon arrival, each panelist will be given a folder containing the following materials in order of use: Name Tent, Agenda, Confidentiality Agreement, Content Standards Relevance Survey, Judgment Study Readiness Survey, Panelist Rating Sheet, Panelist Information Sheet, Reimbursement Form, Evaluation of Judgment Study Process, Participant List, and an ADP brochure.

Introductions and Agenda 15 Minutes (8:15-8:30)

PEARSON (Julie): Welcome the panelists to the judgment study and provide introductions of the staff attending the meeting (Slide 2). Provide an overview of the goals of the day (Slide 3).

ACHIEVE (Nevin): Review the history of the ADP Algebra II EOC assessment and the purposes of the assessment. Slides will cover: About Achieve (Slide 4), The Expectation Gap (Slide 5), Leads to High Remediation Rates (Slide 6), Majority of Graduates (Slide 7), Closing the Gap (Slide 8), ADP Benchmarks: Development (Slide 9), ADP Benchmarks: Key Findings (Slides 10-11), 34 ADP States (Slide 12), Subset States (Slide 13), Exam Content Standards and Design (Slide 14), and Purposes of the Common Exam (Slide 15). Are there any questions about Achieve, the ADP Network, the ADP Assessment Consortium, or the Content Standards? Answer any questions that are asked. Now I'll turn the meeting back to Dr. Miles so that she can walk through the outline of what you will be doing today.

Slide 16: As you can see from what has been presented so far, a lot of thought and hard work have gone into this assessment, and this next step of setting performance standards for the ADP Algebra II EOC assessment is just as important, so once again thank you for coming today to help ACHIEVE and PEARSON with this crucial activity. You will engage in three tasks today that will help us define a high school student who is Prepared for college versus a student who is Not Prepared for college.
This dividing line is called a cut score, and all the tasks you engage in today will help us define the score that separates students who are Prepared for College from those who Need Preparation.

Slide 17: This slide represents what we are trying to do by setting performance standards. We're trying to take the continuum of student performance on the exam and separate those scores into four performance levels, or buckets, if you will. For this exam we have four buckets that we'll be placing students into: Well-Prepared, Prepared, Needs Preparation, and Needs Intensive Preparation. The tasks that you engage in today will
help you make a judgment on what score point is associated with the most important dividing line, the one separating the Prepared students from the Needs Preparation students. The data collected today will be combined with data from similar meetings in two other regions, as well as additional data from other types of studies, to provide guidance to the ADP Assessment Consortium, which will be making recommendations to Achieve on what the final cut scores should be for all the cuts shown on this continuum. Are there any questions on setting performance standards? Answer any questions.

30 Minutes (8:30-9:00)

PEARSON (Julie): Provide a brief overview of the actual judgment tasks via PowerPoint presentation. Walk through each agenda item.

Slide 18: Okay, moving on to the tasks you will be engaged in that will help us decide on the performance standards. The agenda for the day is packed, but we hope that you find the activities engaging and meaningful, and if not, at least we hope the food is good. There are three major activities that we will engage in today. Each of these judgment tasks focuses on the knowledge, skills, and abilities (KSAs) an incoming college student needs to have previously mastered in high school in order to successfully learn the material that you will be covering in your College Algebra (or Pre-Calculus) course. Throughout the day, by successfully learn, we want you to think of students who will ultimately earn a "B or better" in your class without any remediation. For those institutions that award them, a grade of B- would be included in the "B or better" group. The next slides will provide a brief overview of each of the three tasks. More details will be provided in the breakout rooms as the day progresses.

Slide 19: The first task will be a survey where you tell us which ADP Algebra II EOC exam standards are important for high school students to master prior to entry into your college-level math courses.
You will be provided with a survey that lists each ADP exam standard and benchmark. While you may not agree with the standards or objectives being assessed, and may think they are too vague or too specific, these were developed and agreed upon by the ADP assessment consortium, so please review them with this in mind. After reviewing each exam benchmark, you will decide whether it is Essential, Important, Helpful, or Not Relevant that the incoming student who will be successful in your course master this topic while in high school. Remember that by successful we mean students who will earn a "B or better" in your course.

Slide 20: Please locate the Standards Relevance Survey in your folder. It is a large 11x17 sheet that has been folded; it has green highlights on it and looks like what is showing on the screen. You will read each benchmark and then circle the E, I, H, or N to indicate your judgment of its relevance. Your thoughts on this task will guide the discussion that takes place for the next task of the day. Does anyone have any questions about the Standards Relevance Survey?
Slide 21: The second task is at the heart of setting performance standards. You will create a general description of the knowledge, skills, and abilities (KSAs) which fundamentally separate students that fall within different performance levels. These are called Threshold Descriptions because we are describing students at the threshold between performance levels. You will have just completed the Standards Relevance Survey, so the benchmarks will be fresh on your mind, as will your judgment of their importance. The distinction is that for this task we want to know what the student WOULD know, not what they SHOULD know (which is what the survey will tell us).

Slide 22: To help you understand what you will be doing, I've added a few components to the continuum graphic presented earlier. You will see now that there is an X next to each cut score, or dividing line. These X's represent students who are just barely in each performance level; they just barely have the skills needed to be classified as Prepared, etc., hence "threshold" students. Your list of KSAs will focus on describing these just-barely students in terms of what they WOULD be able to demonstrate on the first day in your class. As we work through the task of creating Threshold Descriptions, the reason we focus on these just-barely students is to ensure that we are capturing all the students who truly fall within a performance level, even those at the lowest end of a level. The four performance levels that you will focus on for this task are Well-Prepared, Prepared, Needs Preparation, and Needs Intensive Preparation. The most important distinction in KSAs that you will be making is the one that separates students that are Prepared from those that Need Preparation.
Slide 23: The third and final task will be for you to review each of the 57 items on the spring 2008 exam and tell us whether or not a successful student WOULD be able to get it right on their first day in your college-level math class. This will tell us what score on the ADP Algebra II EOC assessment might predict a student earning a "B or better" in your college course. For institutions that award them, a B- would be considered "B or better."

Slide 24: Please locate the Panelist Rating Sheet in your folder. It is an 11x17 sheet that is folded; it looks like what is on the screen. Explain each column.

Slide 25: For each multiple-choice question you will answer the following question: WOULD a successful student get this item correct on their first day in my class? Not should they get it correct, but WOULD they get it correct. Given your experience with incoming students and the skill sets they have, WOULD they get this item correct? If the answer is yes, then put a 1 next to this item on the judgment sheet. If the answer to that question is no, then put a 0 next to this item on the judgment sheet.

Slide 26: When you encounter an open-ended item, review the scoring rubric that will be provided in the item binder and note the KSAs required to earn the various score points. You will then ask yourself:
How many points WOULD a successful student receive if they answered this question on their first day in my class? Remember, this is WOULD get correct, not SHOULD get correct. Once you have decided on the number of points, you will write this number next to the item on your judgment sheet. As you review these questions, please note that they have gone through multiple layers of review and field-testing. Additionally, the notation that is used in the items is the notation agreed upon by the ADP assessment consortium. Your state may use a different notation system, but the items accurately reflect what was agreed upon and reviewed throughout the development phase of this assessment. Are there any questions about this task?

Slide 27: Following your review and discussion of the exam and your score point judgments on the items, we will enter the data while you take a break, and then we will discuss the results. You will see results aggregated for three groups within your room: your table-level results, subject-level results (either Algebra or Pre-Calculus), and the overall room. You will have a discussion with those at your table, followed by room-level discussion, and then you'll have a second chance to review and make adjustments to your individual judgments of the items.

Slide 28: At the end of the day, there will be forms that we will need to collect from you that will help us provide context for the results of this meeting, as well as get you paid for your hard day's work.

Slide 29: Most of the ground rules tie back to the confidentiality of the materials you will be presented with, so please keep these in mind. Since we must track each document provided to you by your unique ID number (located inside your folder), we ask for your patience when we ask you to fill in this ID on each piece of paper you are provided, and again when we are checking you out at the end of the day.
We also ask that you be respectful of your peers' perspectives throughout the discussions that will occur today. We don't expect 100% agreement to occur during any of the tasks; rather, we are trying to get as broad a perspective on what defines college readiness as possible.

Slide 30: Are there any questions about the ADP Algebra II assessment or the tasks that you will be undertaking today? Address any questions.

Slide 31: We will now break out into rooms based on the type of institution that you are representing. Institutions were separated by characteristics that were common across the regions being represented to ensure equivalence of results across sites. Please locate your unique ID inside the folder provided to you at check-in. Look at the 6th and 7th positions: if you see 4S, please follow Mary Veazey to Conference Room 1. If you see CC, please follow me to Conference Room 2. If you see 4T, please stay with Chris Rozunick in Salon 4. We will now take a 15-minute break and regroup in the room you are assigned to.
BREAK 15 Minutes (9:00-9:15)

Task 1 Standards Relevance Survey 30 Minutes (9:15-9:45)

PEARSON (Facilitator): Welcome back! My name is [FACILITATOR], and if you are in this room, then you should be representing a [4S, 4T, OR CC] institution, as designated by your unique ID. If you are not supposed to be in this room, please let me know and I'll help you get to where you are supposed to be. We have a lot of ground to cover today, so let's get started. Please find the form titled Proprietary Information Non-Disclosure and Consulting Agreement in the folder you were provided at check-in. This is a confidentiality form, and by signing it you agree not to discuss any of the items or scoring rubrics that you will see today. However, we encourage you to discuss the ADP program and the processes used today with your colleagues who may be interested. If you have any questions about what you may discuss outside of this meeting, please see an Achieve or Pearson staff member during a break. Please fill out the form and hand it in to me. Once everyone has done this, we can begin with the Standards Relevance Survey.

FACILITATOR: COLLECT CONFIDENTIALITY FORM. If someone claims to have already filled out a form, please ask them, for the sake of consistency, to fill it out again. If someone refuses to sign the form, explain to them that without it they will not be able to participate in the activities and will be asked to leave immediately. Notify Cathy or Anne if they continue to refuse.

I have all the confidentiality forms, so let's begin. You have been introduced to the purpose of the ADP Algebra II End-of-Course Exam and a description of how the standards being assessed were developed. We would now like to know the relevance of each standard in preparing successful students for their first college-level credit-bearing math course, specifically College Algebra (or Pre-Calculus).
We would like to know which objectives incoming students should already have mastered in order to be successful in your course. There is no right or wrong answer; this is judgment-based, so rely on your expertise with your subject matter and your experience with the KSAs that incoming college students should be able to demonstrate in order to be successful in your course.
Please find the Standards Relevance Survey in your folder and write your unique ID in the upper right-hand corner. Please take a minute to review the instructions. Are there any questions? Please read each content benchmark and rate the mastery of each objective as Essential, Important, Helpful, or Not Relevant. When you are done with the survey, please stand your name tent up vertically so that I can gauge the progress of the room and keep us on track with our schedule. We have scheduled 30 minutes for this task, but if we finish sooner, we can move ahead to Task 2. Remember! When we refer to successful students, we mean that the incoming student is ready to learn without remediation and will ultimately earn a B or better in your course. If anyone needs more details on any of the standards on the survey, I have a copy of the full content standards for your review. FACILITATOR: As the panelists are working, walk around the room and make sure that each person puts their UNIQUE ID in the upper right-hand corner of the survey.
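The script later notes that the survey results can be analyzed after the fact to guide the PLDs. Tallying the relevance ratings is a simple aggregation; the sketch below is illustrative only, with objective codes borrowed from the ADP standards' O/E/P/X/F naming but entirely invented rating data.

```python
from collections import Counter

# Hypothetical panelist ratings per objective (data is invented, not from
# any actual Standards Relevance Survey).
survey = {
    "O1.a": ["Essential", "Essential", "Essential", "Essential"],
    "E1.d": ["Essential", "Important", "Important", "Helpful"],
    "F2.b": ["Helpful", "Not Relevant", "Helpful", "Important"],
}

def unanimous_essential(ratings):
    """True when every panelist at the table rated the objective Essential."""
    return all(r == "Essential" for r in ratings)

for objective, ratings in survey.items():
    counts = Counter(ratings)
    marker = "  <- candidate for the threshold list" if unanimous_essential(ratings) else ""
    print(f"{objective}: {dict(counts)}{marker}")
```

An objective every panelist marked Essential is flagged as a starting point for the table-level threshold discussion in Task 2.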
Task 2 Creating Threshold Descriptions 90 Minutes (9:45-11:15) PEARSON (Facilitator): It appears that everyone has completed Task 1, which means that you have reviewed the standards and objectives and determined which content objectives are fundamental to success in your course. The next task is to come up with Threshold Descriptions. Remember, the Threshold Descriptions are statements of what a just-barely student WOULD know and be able to do at each performance level, given the exam standards being assessed. Remember that for this assessment the four performance levels are Needs Intensive Preparation, Needs Preparation, Prepared, and Well-Prepared. In order to refresh our thoughts on those distinctions, I will pass out a copy of the slide from earlier this morning showing the continuum of ability with the cut points and those just-barely students that we are trying to describe. FACILITATOR: Pass out one continuum slide per table. We will begin by focusing on the just-barely Prepared student. Since Prepared indicates the student will ultimately earn a B or better, you may want to think of the just-barely Prepared student as the student who will earn a B-. You will focus on the skills that fundamentally separate students who are in the Needs Preparation category from students who are just barely in the Prepared category. Since each of you has your own expectations, we'll use a group discussion to facilitate this work and try to find common ground at your table and ultimately across the room. Your goal is not to come up with an exhaustive list of KSAs, but rather the 4 or 5 that really make the difference in a student being successful in your college-level math course. If you think in terms of a student sitting on a fence post between two levels, what KSAs will tip the student onto the next higher performance level's side of the fence? For instance, what fundamental KSAs would move a student from earning a C+ to earning a B-? What about from a B+ to an A-?
These should be statements of what students WOULD be able to demonstrate. For instance, your statements may be worded like "a just-barely Prepared student can interpret graphs of exponential functions." The source for your list of KSAs should be the content standards listed on the survey you filled out earlier. Each of you has already individually rated mastery of the objectives in terms of relevance, so please feel free to use this as a starting point in your table-level discussion. If everybody at your table rated an objective as Essential, then perhaps it goes on your group list for the just-barely Prepared student. Once you have tallied up the objectives that everybody agrees are essential, try to pick the top five or so that your B-or-better students WOULD be able to demonstrate on the first day in your class. Remember, the purpose of this activity is NOT to create an exhaustive list of KSAs that need to be mastered in high school. The purpose of this activity is to get you thinking about what skill set fundamentally separates the incoming students who will be successful in your class and earn at least a B from those who will not. You will see that each table has a small easel for writing your list of KSAs on. Whoever has the neatest handwriting should be designated the official scribe of your table. Please write your table number on each sheet of paper that you use during this task. Your table number is the second-to-last character in your unique ID. You will ultimately create lists not only for the just-barely Prepared, but also for the just-barely Well-Prepared and the just-barely Needs Preparation, for a total of three lists. Experience suggests that you may want to address all three lists at once so that you are considering each objective relative to which threshold area in the performance levels it may reside in, if at all. For instance, if an objective is not fundamental to shifting a student from a C+ to a B-, is it fundamental to shifting a student from a B+ to an A-, or from a D+ to a C-? Facilitator: Depending on the timing of the start of this session, adjust accordingly. The hard stop is at noon for lunch. Allow the most time for the distinction between Prepared and Needs Preparation. If we cannot get to the remaining distinctions between levels, then we can analyze the survey results after the fact to provide guidance on the PLDs. You will now create a list of KSAs that separates the just-barely Prepared from Needs Preparation. This cut is the most crucial distinction, and we will spend about 30 minutes on this separation.
Once your table is done with the list of KSAs that separates just-barely Prepared from Needs Preparation, if we have time you will make a second list that separates Prepared from just-barely Well-Prepared. Perhaps the list for the just-barely Well-Prepared starts off with those KSAs that the table agreed were at least important, or that some felt were essential while others did not. We will spend about 30 minutes on this distinction. If it helps, please think in terms of the KSAs needed to move from a B+ to an A-. And finally, we will create a third list that separates just-barely Needs Preparation from Needs Intensive Preparation. Perhaps the students in Needs Intensive Preparation are only capable of the most basic KSAs covered. We will spend about 30 minutes on this distinction. For this distinction, we're looking at the KSAs needed to push the student from a D+ to a C-. Facilitator: For the Needs Intensive Preparation list, if panelists are struggling, prompt them by asking whether perhaps students are capable of the benchmarks in O1 and/or E1. Once all the tables are done with their three lists, we will post them on the wall and discuss them as a group, beginning with the KSAs that separate Prepared from Needs Preparation. We will engage in this until lunch at noon, which is not a lot of time, so before we get started, does anyone have any questions? Answer any questions. Let's begin. While you are engaged in this task, I will be walking around to make sure that you are on track and to answer any questions that may come up as you discuss the list with your table-mates. Facilitator: Walk around the room making sure that each table is on task and focused on making the correct distinctions between levels. Keep your eye on the time and remind folks when they are running out of time on a particular distinction. Once all groups' lists are done, facilitate the room-level discussion by finding commonalities across all the tables, if they exist. Start by focusing on the cut between Prepared and Needs Preparation, as this is the most important distinction. Then move to Prepared vs. Well-Prepared and lastly to Needs Intensive Preparation vs. Needs Preparation. It is likely that the Needs Preparation group may only have one or two KSAs that they can do. Make sure that each PLD sheet has the table ID in the upper right-hand corner. 30 Minutes (11:15-11:45) Now that each table has come up with its lists of KSAs for each performance level, let's see if we have any commonalities across the room. If one person from each table will bring up their list for the Prepared performance level, I will hang them on the wall and we'll walk through them as a group. Facilitator: Beginning with the Prepared lists, hang them side by side in an easy-to-read location. Look for any common objectives on all or a majority (i.e., 3 of 4) of the tables' lists.
If there are some, use a marker to draw a star next to them and start with the script below. If there are NOT any common items, or only one or two, then ask the group to identify the single most important content objective within each standard; this should get conversation going. If any agreement forms, mark these with a question mark as tentative agreement. If commonalities are found: As you can see, there are commonalities across the groups that are considered to be fundamental KSAs distinguishing students who are Prepared from those who Need Preparation. Facilitator: If time allows, move on to the Well-Prepared group, then Needs Preparation, doing the same steps. If time does NOT allow, have each group bring their remaining sheets up to you. During the next task, compile each of the lists using the template provided by Julie. We've come to some agreement on what students in each performance level should be capable of demonstrating on the first day in your course. Since in the next task we're going to focus specifically on the just-barely Prepared student, we will leave these lists on the wall as a guideline to reflect on while you are making your item-level judgments for Task 3. Lunch 60 Minutes (11:45-12:45) FACILITATOR: Remove from the wall all the lists of KSAs except for the Prepared group. You will key-enter these lists during the next task. Immediately upon arrival back in the room, pass out the binders to each participant. Each binder should have the rubrics in the front pocket and the key sheet in the back pocket. Don't forget to put a copy of the Rubric Book with Exemplars on each table. Task 3.1 Reviewing and Taking the Assessment 75 Minutes (12:45-2:00) Welcome back from lunch. In order for you to gain an appreciation of the assessment experience and the instrument's degree of difficulty, you will now take the ADP Algebra II EOC exam that was administered in spring 2009. You will have about an hour to answer the items from this assessment and review the correct answers. The binder in front of you has the test questions, but in a slightly different format than the booklet students encountered. Specifically, it contains only the operational test items from the base form, and the items appear one per page.
The binder in front of you has the assessment items, as well as the scoring rubrics for the open-ended items in the front pocket and the correct answers in the back pocket. You will see in the rubric a column called Description. This description indicates the number of value points a student has earned. Value points are used to weight the importance of each part of a constructed-response item. The number of value points earned by a student response is mapped to a set of score points, either 0-2 or 0-4 points depending on whether the item is short answer or extended response. Both the weighting of the value points and the mapping to score points are reviewed by teachers, higher education staff, and state department personnel during content review and finalized during rangefinding. When you score your own answers, please use the SCORE points, not the value points. Additionally, on the table you will find a single copy of a Rubric Book with Exemplars; this document has actual examples of student responses at each score point for each open-ended item. You will review and answer each item in the binder and focus on the KSAs required to successfully answer each item, since in the next phase you will be asking yourself whether the just-barely Prepared student, the one who will ultimately earn at least a B- in your course, will get the item correct on the first day in your course. Please note that items 1-29 are NON-calculator, so please don't use a calculator to answer these items. You may use a calculator on the remaining items. We will NOT be using your notes or scoring your responses to the items at any point, so please use whatever shorthand you need to make meaning for yourself as you work through the items. You will work independently on this; please try to answer the items without referring to the rubrics and key sheet provided so that your experience is as similar to the student experience as possible.
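The value-point-to-score-point translation described above can be sketched as a simple lookup. The thresholds below are invented for illustration; the actual mappings are item-specific and are finalized during rangefinding, as noted above.

```python
# Hypothetical mapping for one short-answer item: 0-4 value points earned
# across the item's parts collapse to the reported 0-2 score points.
# These cut-offs are illustrative only, not taken from any real rubric.
SHORT_ANSWER_MAP = {0: 0, 1: 0, 2: 1, 3: 1, 4: 2}

def to_score_points(value_points, mapping):
    """Translate the value points earned on a response into reported score points."""
    return mapping[value_points]

# A response earning 3 of 4 value points reports as 1 of 2 score points
# under this invented mapping.
print(to_score_points(3, SHORT_ANSWER_MAP))
```

An extended-response item would use an analogous table whose values run 0-4 instead of 0-2.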
Once you have completed the assessment, score your responses using the keys and rubrics provided and stand your name tent up vertically so that I can keep track of the progress of the room. FACILITATOR: Remind panelists to write their Panelist ID on their copy of the answer key. Keep an eye on the clock and give a 15-minute warning and then a 5-minute warning if need be. 15 Minutes (2:00-2:15) It appears that most folks, if not all, are done with the exam. Let's take a few minutes and talk about it. Does anyone care to start with their first thoughts on the items and how the content standards are being assessed? What about the rigor of the exam? The purpose of this conversation is to allow a discussion of the rigor relative to the actual items; we don't want panelists to be unreasonable in their expectations and set a cut score that is severely out of alignment with the actual performance of students who would be successful in their courses. Task 3.2 Making Item-Level Judgments: Round 1
45 Minutes (2:15-3:00) Round 1 Judgments You spent the morning working with the standards and objectives of the assessment, and you have created a list of fundamental skills that an incoming student would have mastered in order to be successful in your class. This next task is designed to help us determine what score a high school student would need to earn on this assessment in order to predict success in your college course. We are focusing on the just-barely Prepared student in this judgment task. The resources that you will use during this task are: 1) your expert judgment and experience in the field of college-level mathematics, 2) your Standards Relevance Survey, 3) the Threshold Description for the Prepared student, and 4) the notes you took on the items when taking the assessment. Please find the Panelist Rating Sheet in your folder. Write your unique ID in the upper right-hand corner. This sheet shows you the item sequence (page number), the unique item ID, whether the item appeared in the calculator or non-calculator section of the exam, the standard that is being assessed, and the score points possible for each item. Using the items that you just took notes on, you will now review each item and answer the following question: WOULD a successful student get this item correct on their first day in my class? If the answer is yes, then put a 1 next to this item on the judgment sheet. If the answer is no, then put a 0 next to this item on the judgment sheet. When you encounter an open-ended item, review the scoring rubric and note the KSAs required to earn the various score points. You will then ask yourself, How many points WOULD a successful student receive if they answered this question on their first day in my class? Once you have decided on the number of SCORE points, please write this number next to the item on your judgment sheet.
Remember, for the open-ended items we are interested in the SCORE points a student would earn, not the value points in the rubric's description. This means that the maximum number of points you can assign to a short-answer item is 2 and the maximum you can assign to an extended-response item is 4. If you are struggling to make a judgment on the open-ended items and need more information than the rubric provides, the Exemplar Binder on the table is a resource you can use at this point. Keep in mind that these examples are not exhaustive of all the ways in which students may earn points, but are here to provide you with some guidance. Also keep in mind that you have two opportunities to make your judgments, so don't get caught up in information overload; rely on your experience with students and your expertise in mathematics.
You will proceed in this manner until you have reached the end of the booklet in front of you. Remember, by successful we mean that the Prepared student is ready to learn without remediation and will ultimately earn a B or better in your course. And we are interested in what they WOULD earn, NOT what they should earn. While you are making these judgments, you should reflect on how you rated each of the standards on the Standards Relevance Survey, as well as the discussion and outcome of the Threshold Descriptions that our group developed and decided were important for incoming students to demonstrate in order to earn a B or better in your course. Are there any questions about your task? Please find the Judgment Study Readiness Survey in your folder. Please indicate your agreement with the two statements for Round 1. I will walk around to ensure that everyone is ready to begin the item-level judgment task. Now that everyone is ready to proceed, you may begin. Please work independently. You have until 3:00 to complete this task. When you have judged each item, please add up the number of points you assigned, put the total where indicated at the bottom of the rating sheet, and turn it in to me. If you finish before the rest of the group, please refrain from talking, or feel free to take a break outside the room. If you leave the room for a break at any time, you may not take any materials with you.
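The arithmetic behind this Yes/No judgment task, and behind the table-level feedback passed out after the break, is a straightforward tally: each panelist's recommended cut score is the sum of the points they assigned across all items, and the feedback report summarizes those totals. The panelist IDs and judgments below are invented for illustration.

```python
import statistics

# Invented Round 1 judgments: 1/0 for multiple-choice items, and 0-2 or
# 0-4 score points for open-ended items.
panelist_judgments = {
    "4S-01": [1, 0, 1, 1, 2, 0, 3],
    "4S-02": [1, 1, 1, 0, 1, 0, 4],
    "4S-03": [0, 0, 1, 1, 2, 1, 2],
}

# Each panelist's recommended cut score is the total of the points assigned.
totals = {pid: sum(points) for pid, points in panelist_judgments.items()}
cut_scores = list(totals.values())

# The table-level feedback report summarizes those totals.
print("totals:", totals)
print("mean:", round(statistics.mean(cut_scores), 2))
print("median:", statistics.median(cut_scores))
print("stdev:", round(statistics.stdev(cut_scores), 2))
print("min/max:", min(cut_scores), max(cut_scores))
```

A large standard deviation in this summary is the signal, discussed after the break, that panelists hold different conceptions of the threshold Prepared student.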
FACILITATOR: Remind panelists to write their Panelist ID on their Panelist Rating Sheet and Readiness Survey. Keep an eye on the clock and give a 20-minute warning and then a 5-minute warning if need be. Break. 15 Minutes (3:00-3:15) PEARSON staff members will key in the data and print out feedback reports for the discussion after the break. Round 1 Feedback and Discussion: 30 Minutes (3:15-3:45) Table-Level Discussion I will now pass out the table-level results and the subject-level results. Since there is only one table for Pre-Calculus, that table gets only table-level results. The Algebra tables will get their individual table results as well as the combined results across all Algebra tables. These sheets show the mean, median, standard deviation, minimum, and maximum score assigned by people at your table and across tables within a subject. The mean is the average of the total scores provided by all panelists at your table; the median is the middle value of the cut scores from all panelists at your table; the standard deviation is a variability index describing how spread out the cut scores are at your table. When the standard deviation is large, the cut scores are quite diverse; when it is small, the cut scores are relatively similar across panelists. The mean score tells us that, based on the item-level judgments you made via the points you assigned, students who earn this score are likely to be successful in your courses. Potential discussion topics include: Are the total points you assigned higher or lower than the table or group average? If so, are you using the same expectations of what the threshold Prepared student is capable of doing on the first day in your course? Are there particular items or benchmarks where there is a wide discrepancy in the points assigned? Why the discrepancy? Do panelists have different understandings of the KSAs needed to answer successfully?
Do panelists have different conceptualizations of the threshold Prepared student? We do not intend for you to come to consensus on the item judgments, but we do want you to discuss items showing a lot of variability in ratings to get a feel for why differences exist. Since you will be revising your ratings in Round 2, feel free to flag items that you may want to go back and reconsider. You will have 30 minutes to discuss at your table, and then we'll spend another 30 minutes in a room-level discussion. If your table is ready to discuss with the room as a whole, please stand your name tent up vertically so that I can gauge the progress of the group. FACILITATOR: Keep an eye on the clock and after 30 minutes (if not before), move them to the room-level discussion. 30 Minutes (3:45-4:15) Room-Level Discussion Now that you've seen and discussed the results from your table and subject group, what are your initial impressions? FACILITATOR: Some things that may instigate conversation, depending on results: Was there anything surprising about the results to you? Were there any standards that the Pre-Calculus and Algebra groups did not agree on in general? Task 3.2 Making Item-Level Judgments: Round 2 30 Minutes (4:15-4:45) Round 2 Judgments Now that you've made your initial judgments on each of the items and discussed the results with your table and across the group, you will have a second opportunity to make judgments. You will go back through and ask yourself the same question as before: WOULD a successful student get this item correct on their first day in my class? If the answer is yes, then put a 1 next to this item on the judgment sheet. If the answer is no, then put a 0 next to this item on the judgment sheet. When you encounter an open-ended item, review the scoring rubric and note the KSAs required to earn the various score points. You will then ask yourself, How many score points would a successful student receive if they answered this question on their first day in my class? Once you have decided on the number of points, please write this number next to the item on your judgment sheet. You will proceed in this manner until you have reached the end of the booklet in front of you.
Remember, by successful we mean that the Prepared student is ready to learn without remediation and will ultimately earn a B or better in your course.
This time your judgments will go into the column titled Round 2 MY POINTS. Even if you don't change an item rating from Round 1, please carry over the score points from the Round 1 column to the Round 2 column so that each item has a score assigned, as you will need to add up this column and put the total where indicated at the bottom of the rating sheet. Are there any questions about your task? Please find the Judgment Study Readiness Survey in your folder. Please indicate your agreement with the two statements for Round 2. I will walk around to ensure that everyone is ready to begin the item-level judgment task. Now that everyone is ready to proceed, we are almost ready to begin. Since you will complete this task at your own pace, I will go ahead and explain the final task of filling out forms, so that as you complete Round 2 of your judgments, you may begin the final work of the day without waiting on the remainder of the group. Task 4 Filling out Forms 15 Minutes (4:45-5:00) Round 2 Judgments Evaluations/Check-Out Once you have finished your Round 2 judgments, you will fill out the Panelist Information Sheet, Exit Survey, and Reimbursement forms located in your folder. When you have completed all of these documents, please bring them to me along with your binder, rubrics, etc., and I will check you out for the day. You will be required to initial that you have returned all materials, so please be patient if a line forms. One of the forms that you will be filling out is the Reimbursement form. If you have any expenses over $25, you will need to attach a receipt. If you have not yet incurred the expense, say for airport parking, you may submit this form with receipts once you return home. Also, if your institution's rules prevent you from accepting the honorarium and you would like to donate it to your institution or department (if that is allowed), please let me know, as there is a separate form you will need to fill out and submit.
Does anyone have any questions about the forms that need to be turned in at the end of the day? Okay, please work independently on your final item-level judgments for Round 2 and then proceed to the final task of filling out the surveys in your folder. You have 45 minutes to complete this task. FACILITATOR: At 4:45 remind panelists that they only have 15 minutes and ask if anyone is still working on the Round 2 judgments. If so, then go to each and make sure they are on track to finish on time. As panelists return materials, double-check that they've put their ID on EACH sheet requiring it. Check off each returned document for each panelist on the Materials Tracking Sheet: Test Items; Rubrics; Key Sheet; Panelist Rating Sheet; Content Standards Relevance Survey; Judgment Readiness Form; Evaluation of Judgment Study Process; Reimbursement/Honorarium Form; Rubric Book with Exemplars (one per table). THANK EACH PARTICIPANT FOR CONTRIBUTING TO THE SUCCESS OF THE ADP ALGEBRA II EOC ASSESSMENT.
Appendix C: General Session PowerPoint Slides
Appendix D: ADP Algebra II EOC Exam Standards
Content Standards and Benchmarks

O1. Real numbers
a. Convert between and among radical and exponential forms of numerical expressions.
b. Simplify and perform operations on numerical expressions containing radicals.
c. Apply the laws of exponents to numerical expressions with rational and negative exponents to order and rewrite them in alternative forms.

O2. Complex numbers
a. Represent complex numbers in the form a + bi, where a and b are real; simplify powers of pure imaginary numbers.
b. Perform operations on the set of complex numbers.

O3. Algebraic expressions
a. Convert between and among radical and exponential forms of algebraic expressions.
b. Simplify and perform operations on radical algebraic expressions.
c. Apply the laws of exponents to algebraic expressions, including those involving rational and negative exponents, to order and rewrite them in alternative forms.
d. Perform operations on polynomial expressions.
e. Perform operations on rational expressions, including complex fractions.
f. Identify or write equivalent algebraic expressions in one or more variables to extract information.

E1. Linear equations and inequalities
a. Solve equations and inequalities involving the absolute value of a linear expression.
b. Express and solve systems of linear equations in three variables with and without the use of technology.
c. Solve systems of linear inequalities in two variables and graph the solution set.
d. Recognize and solve problems that can be represented by single-variable linear equations or inequalities or systems of linear equations or inequalities involving two or more variables. Interpret the solution(s) in terms of the context of the problem.

E2. Nonlinear equations and inequalities
a. Solve single-variable quadratic, exponential, rational, radical, and factorable higher-order polynomial equations over the set of real numbers, including quadratic equations involving absolute value.
b. Solve single-variable quadratic equations and inequalities over the complex numbers; graph real solution sets on a number line.
c. Use the discriminant, D = b^2 - 4ac, to determine the nature of the solutions of the equation ax^2 + bx + c = 0.
d. Graph the solution set of a two-variable quadratic inequality in the coordinate plane.
e. Rewrite nonlinear equations and inequalities to express them in multiple forms in order to facilitate finding a solution set or to extract information about the relationships or graphs indicated.

P1. Quadratic functions
a. Determine key characteristics of quadratic functions and their graphs.
b. Represent quadratic functions using tables, graphs, verbal statements, and equations. Translate among these representations.
c. Describe and represent the effect that changes in the parameters of a quadratic function have on the shape and position of its graph.
d. Recognize, express, and solve problems that can be modeled using quadratic functions. Interpret their solutions in terms of the context.
P2. Higher-order polynomial and rational functions
a. Determine key characteristics of power functions in the form f(x) = ax^n, a ≠ 0, for positive integral values of n, and their graphs.
b. Determine key characteristics of polynomial functions and their graphs.
c. Represent polynomial functions using tables, graphs, verbal statements, and equations. Translate among these representations.
d. Determine key characteristics of simple rational functions and their graphs.
e. Represent simple rational functions using tables, graphs, verbal statements, and equations. Translate among these representations.
f. Recognize, express, and solve problems that can be modeled using polynomial and simple rational functions. Interpret their solutions in terms of the context.

X1. Exponential functions
a. Determine key characteristics of exponential functions and their graphs.
b. Represent exponential functions using tables, graphs, verbal statements, and equations. Represent exponential equations in multiple forms. Translate among these representations.
c. Describe and represent the effect that changes in the parameters of an exponential function have on the shape and position of its graph.
d. Recognize, express, and solve problems that can be modeled using exponential functions, including those where logarithms provide an efficient method of solution. Interpret their solutions in terms of the context.

F1. Function operations
a. Combine functions by addition, subtraction, multiplication, and division.
b. Determine the composition of two functions, including any necessary restrictions on the domain.

F2. Inverse functions
a. Describe the conditions under which an inverse relation is a function.
b. Determine and graph the inverse relation of a function.

F3. Piecewise-defined functions
a. Determine key characteristics of absolute value, step, and other piecewise-defined functions.
b. Represent piecewise-defined functions using tables, graphs, verbal statements, and equations. Translate among these representations.
c. Recognize, express, and solve problems that can be modeled using absolute value, step, and other piecewise-defined functions. Interpret their solutions in terms of the context.
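Benchmark E2.c (using the discriminant to determine the nature of the solutions) can be illustrated with a short sketch; the function name and example coefficients are mine, not part of the standards.

```python
def solution_type(a, b, c):
    """Classify the solutions of ax^2 + bx + c = 0 via the discriminant D = b^2 - 4ac."""
    D = b * b - 4 * a * c
    if D > 0:
        return "two distinct real solutions"
    if D == 0:
        return "one repeated real solution"
    return "two complex conjugate solutions"

print(solution_type(1, -3, 2))  # D = 1
print(solution_type(1, 2, 1))   # D = 0
print(solution_type(1, 0, 1))   # D = -4
```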
Appendix E: Exit Survey
Overall Results
Please rate the following questions on a scale from 1 to 5, 1 being totally disagree and 5 being totally agree.
7. I was comfortable with my ratings of the relevance of each ADP Algebra II learning objective. (1) —%  (2) 3%  (3) 9%  (4) 44%  (5) 44%
8. I had a good understanding of what the test was intended to measure. (1) —%  (2) 2%  (3) 8%  (4) 40%  (5) 51%
9. The method for rating items (as Yes=1 or No=0) was conceptually clear. (1) —%  (2) 0%  (3) 2%  (4) 29%  (5) 67%
10. After the first round of ratings, I felt comfortable with the standard setting procedure. (1) —%  (2) 3%  (3) 10%  (4) 38%  (5) 48%
11. I found the feedback comparing my ratings to those of other panelists useful in setting the standard. (1) —%  (2) 3%  (3) 9%  (4) 40%  (5) 47%
12. How confident do you feel that your final cut score recommendation reflects the ability level of the Successful student in your course? (1) —%  (2) 3%  (3) 14%  (4) 45%  (5) 37%
On the back of this page, please add any additional comments or observations on the judgment study process, facilitators, discussion, etc. Thank you for all your hard work!
Four-Year Selective Admittance Institution Results

Please rate the following questions on a scale from 1 to 5, 1 being "totally disagree" and 5 being "totally agree."

1. I was comfortable with my ratings of the relevance of each ADP Algebra II learning objective: % 0% 14% 40% 46%
2. I had a good understanding of what the test was intended to measure: % 3% 9% 40% 49%
3. The method for rating items (as Yes=1 or No=0) was conceptually clear: % 0% 0% 31% 66%
4. After the first round of ratings, I felt comfortable with the standard setting procedure: % 3% 12% 29% 56%
5. I found the feedback on the ratings of panelists compared to other panelists useful in setting the standard: % 3% 12% 26% 59%
6. How confident do you feel that your final cut score recommendation reflects the ability level of the "Successful" student in your course? % 0% 15% 44% 41%

On the back of this page, please add any additional comments or observations on the judgment study process, facilitators, discussion, etc. Thank you for all your hard work!
Four-Year Typical Admittance Institution Results

Please rate the following questions on a scale from 1 to 5, 1 being "totally disagree" and 5 being "totally agree."

1. I was comfortable with my ratings of the relevance of each ADP Algebra II learning objective: % 4% 9% 54% 30%
2. I had a good understanding of what the test was intended to measure: % 2% 7% 43% 48%
3. The method for rating items (as Yes=1 or No=0) was conceptually clear: % 0% 4% 36% 58%
4. After the first round of ratings, I felt comfortable with the standard setting procedure: % 4% 13% 43% 37%
5. I found the feedback on the ratings of panelists compared to other panelists useful in setting the standard: % 7% 7% 56% 29%
6. How confident do you feel that your final cut score recommendation reflects the ability level of the "Successful" student in your course? % 9% 22% 43% 26%

On the back of this page, please add any additional comments or observations on the judgment study process, facilitators, discussion, etc. Thank you for all your hard work!
Community College Results

Please rate the following questions on a scale from 1 to 5, 1 being "totally disagree" and 5 being "totally agree."

1. I was comfortable with my ratings of the relevance of each ADP Algebra II learning objective: % 4% 6% 36% 54%
2. I had a good understanding of what the test was intended to measure: % 0% 8% 36% 56%
3. The method for rating items (as Yes=1 or No=0) was conceptually clear: % 0% 0% 22% 76%
4. After the first round of ratings, I felt comfortable with the standard setting procedure: % 2% 6% 38% 54%
5. I found the feedback on the ratings of panelists compared to other panelists useful in setting the standard: % 0% 10% 34% 56%
6. How confident do you feel that your final cut score recommendation reflects the ability level of the "Successful" student in your course? % 0% 6% 48% 44%

On the back of this page, please add any additional comments or observations on the judgment study process, facilitators, discussion, etc. Thank you for all your hard work!
Appendix F: Demographic Survey
PANELIST INFORMATION SHEET

Please provide the following demographic information, which will be used to describe the general characteristics of the panelists providing feedback on the ADP Algebra II End-of-Course Exam. Your individual responses will not be shared or reported; all data will be reported in aggregated form.

Gender (circle one): Male / Female
Ethnicity (circle one): American Indian / Asian/Pacific Islander / Black / Hispanic / White / Other
Which course best describes the course you are currently teaching? College Algebra / Pre-Calculus/Intro Calculus
What is the highest degree you have attained? B.A./B.S. / M.A./M.S. / Ph.D.
Is this degree in a math-related field (circle one)? Yes / No
Title of your degree:
How many years have you taught college-level math courses (circle one)?
Name of the institution you are representing today:
My institution is the following type (circle one): 2-Yr Technical/Community College / 4-Yr College/University
5. Mapping to Performance Level Descriptors
Content Specialists' Mapping of Items to PLDs

The purpose of mapping items to performance level descriptors (PLDs) was to get a sense of where cut scores would fall if they were based solely on the PLDs. Results for the ADP Algebra II End-of-Course Exam are reported according to three performance levels that indicate a student's proficiency in Algebra II and his or her preparedness for first-year credit-bearing college mathematics courses. The knowledge, skills, and abilities required at each performance level were defined in collaboration with higher-education mathematics professors from across the country.

Well Prepared - The student consistently applies concepts, procedures, and skills needed to show mastery of Algebra II. The student is highly effective at devising and clearly communicating a wide range of strategies to solve complex mathematical and contextual problems. The student computes accurately and uses precise mathematical and symbolic language to solve problems and communicate solutions. The student's explanations demonstrate the ability to reason mathematically, making appropriate connections between ideas in or across areas of mathematics, using formal reasoning to justify solutions, and evaluating the validity of solutions.

Prepared - The student usually applies concepts, procedures, and skills to show adequate progress toward the mastery of Algebra II. The student is usually effective at devising and communicating a variety of strategies to solve mathematical and contextual problems. The student is adept in computation and uses mathematical and symbolic language to solve problems and communicate solutions. The student's explanations demonstrate the ability to reason mathematically, recognizing connections between ideas in or across areas of mathematics, using formal and informal reasoning to justify solutions, and evaluating the validity of solutions.
Needs Preparation - The student inconsistently applies concepts, procedures, and skills to show minimal progress toward the mastery of Algebra II. The student is generally effective at recalling and using routine, easily recognized, or straightforward strategies to solve simple mathematical and some contextual problems. The student can generally compute accurately and uses limited mathematical and symbolic language to solve problems and communicate solutions. The student's explanations demonstrate limited ability to reason mathematically, using informal reasoning to justify solutions, and evaluating the validity of solutions.

Two Pearson ADP Algebra II content leads independently evaluated the content of each item appearing on the Spring 2009 form and assigned it to a performance level based on the descriptions above. Table 1 lists the Spring 2009 items by difficulty (column B) from easiest to most difficult. The next two columns show the PLD classification assigned to each item by the content specialists. Only two items were in disagreement (b = .0103 and b = .0481), indicating fairly high reliability of the assignments. The pattern of proficiency levels assigned by the content specialists was examined for potential cut score locations. Based on the pattern seen, tentative cuts are proposed to provide an additional piece of context for the standard setting process (highlighted areas). The goal of the tentative cuts was to maximize the proportion of items assigned to the particular proficiency level. The items shaded in green represent Needs Preparation, the items shaded in orange represent Prepared, and the items shaded in yellow represent Well Prepared.
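One plausible reading of the tentative-cut rule (maximizing the proportion of items assigned to each proficiency level) is an exhaustive search over cut positions in the difficulty-ordered item list. The report does not specify the exact optimization, so the sketch below is an assumed illustration with invented labels, not the procedure the content specialists used.

```python
# Assumed sketch: place two tentative cuts in a difficulty-ordered list of
# items so that each resulting segment contains as many items of its target
# performance level as possible.
from itertools import combinations

LEVELS = ["Needs Preparation", "Prepared", "Well Prepared"]

def best_cuts(assigned_levels):
    """Try every pair of cut positions; return (score, (c1, c2)) maximizing
    the count of items whose PLD assignment matches their segment's level."""
    n = len(assigned_levels)
    best = (0, (0, 0))
    for c1, c2 in combinations(range(n + 1), 2):
        segments = [assigned_levels[:c1], assigned_levels[c1:c2], assigned_levels[c2:]]
        score = sum(seg.count(lvl) for seg, lvl in zip(segments, LEVELS))
        if score > best[0]:
            best = (score, (c1, c2))
    return best

# Illustrative labels only, ordered from easiest to hardest item.
labels = (["Needs Preparation"] * 3
          + ["Prepared", "Needs Preparation", "Prepared", "Prepared",
             "Well Prepared", "Prepared", "Well Prepared", "Well Prepared"])
score, (c1, c2) = best_cuts(labels)
print(score, c1, c2)
```

The exhaustive pairwise search is feasible here because the item list is short; with 55 items there are only about 1,500 candidate cut pairs.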
Table 1. PLD Classifications of Spring 2009 Items

Unique Item Identifier | B | PL by CSS1 | PL by CSS2 | Item Sequence | Max Score Points | Standard | Calculator Usage
2023MC | | Prepared | Prepared | 6 | 1 | P2.d | N
8255MC | | Needs Preparation | Needs Preparation | 35 | 1 | P2.b | Y
5017MC | | Needs Preparation | Needs Preparation | 2 | 1 | X1.b | N
4006MC | | Needs Preparation | Needs Preparation | 1 | 1 | O1.c | N
8039MC | | Needs Preparation | Needs Preparation | 37 | 1 | P2.f | Y
8280MC | | Needs Preparation | Needs Preparation | 39 | 1 | F3.c | Y
5024MC | | Prepared | Prepared | 10 | 1 | X1.a | N
8041MC | | Needs Preparation | Needs Preparation | 14 | 1 | P1.a | N
8261MC | | Prepared | Prepared | 51 | 1 | P2.b | Y
5054MC | | Well Prepared | Well Prepared | 40 | 1 | X1.c | Y
2336MC | | Well Prepared | Well Prepared | 48 | 1 | P2.a | Y
2210MC | | Well Prepared | Well Prepared | 36 | 1 | E2.a | Y
0026MC | | Needs Preparation | Needs Preparation | 41 | 1 | O3.e | Y
3028MC | | Needs Preparation | Needs Preparation | 56 | 1 | E1.c | Y
5055MC | | Prepared | Prepared | 53 | 1 | X1.d | Y
2269MC | | Well Prepared | Well Prepared | 5 | 1 | X1.c | N
8248MC | | Prepared | Prepared | 58 | 1 | P2.a | Y
3079MC | | Needs Preparation | Prepared | 59 | 1 | E2.e | Y
4035MC | | Prepared | Prepared | 4 | 1 | O2.a | N
6160MC | | Needs Preparation | Needs Preparation | 54 | 1 | E2.d | Y
6107MC | | Needs Preparation | Prepared | 3 | 1 | E1.a | N
2157MC | | Well Prepared | Well Prepared | 43 | 1 | F1.b | Y
3039MC | | Needs Preparation | Needs Preparation | 34 | 1 | O3.d | Y
3094MC | | Needs Preparation | Needs Preparation | 22 | 1 | E2.a | N
6073MC | | Prepared | Prepared | 26 | 1 | P2.c | N
2016MC | | Prepared | Prepared | 20 | 1 | P2.d | N
6121MC | | Prepared | Prepared | 55 | 1 | O2.b | Y
6043MC | | Prepared | Prepared | 23 | 1 | P1.c | N
6174MC | | Prepared | Prepared | 28 | 1 | O3.b | N
8243MC | | Well Prepared | Well Prepared | 52 | 1 | F2.b | Y
9112MC | | Prepared | Prepared | 49 | 1 | X1.a | Y
1508SA | | Needs Preparation | Needs Preparation | 11 | 2 | P1.a | N
4013MC | | Needs Preparation | Needs Preparation | 8 | 1 | O1.b | N
4018MC | | Prepared | Prepared | 50 | 1 | E2.b | Y
3179MC | | Well Prepared | Well Prepared | 15 | 1 | P2.d | N
1032SA | | Needs Preparation | Needs Preparation | 45 | 2 | O3.d | Y
8606MC | | Well Prepared | Well Prepared | 27 | 1 | F2.a | N
0003MC | | Prepared | Prepared | 9 | 1 | P2.a | N
3176MC | | Prepared | Prepared | 18 | 1 | P2.c | N
8100MC | | Needs Preparation | Needs Preparation | 42 | 1 | F3.a | Y
2160MC | | Well Prepared | Well Prepared | 19 | 1 | F2.b | N
5030MC | | Well Prepared | Well Prepared | 17 | 1 | X1.c | N
2162MC | | Well Prepared | Well Prepared | 38 | 1 | F2.b | Y
1275SA | | Prepared | Prepared | 24 | 2 | E2.e | N
6149MC | | Well Prepared | Well Prepared | 47 | 1 | P1.c | Y
4032MC | | Prepared | Prepared | 16 | 1 | O1.b | N
2261MC | | Well Prepared | Well Prepared | 21 | 1 | X1.a | N
8258MC | | Well Prepared | Well Prepared | 7 | 1 | P2.b | N
2230MC | | Prepared | Prepared | 60 | 1 | E2.c | Y
1446ER | | Prepared | Prepared | 13 | 4 | E | N
1549SA | | Prepared | Prepared | 44 | 2 | X1.b | Y
1180ER | | Well Prepared | Well Prepared | 57 | 4 | X | Y
1691SA | | Well Prepared | Well Prepared | 25 | 2 | F1.b | N
1478ER | | Well Prepared | Well Prepared | 46 | 4 | P | Y
1515SA | | Prepared | Prepared | 12 | 2 | F3.b | N
6. Cross-Walk of Cut Scores
Cross-walk Introduction

The Cross-walk document was created to summarize and illustrate the results of the various validity studies. The first column contains the Spring 2008 raw score scale, which ranges from 0 to 76. The next six columns present the results of the respective validity studies. The cells shaded in orange represent results that are more relevant for standard setting purposes, as recommended by the Research Alliance. A key for the abbreviations used is shown below:

AL = Algebra
PC = Pre-Calculus
CC = Community College
4T = 4-year Typical Institution
4S = 4-year More Selective Institution
Cross-walk of Cut Scores

Confidential: Do not photocopy or distribute

Spring 2008 Raw Score | State Concurrent (Proficiency Levels) | National Concurrent | Predictive Study | Contrasting Groups (Predictive data) | Judgment Studies | Mapping to PLDs

C or better in CC PC NJ-Proficient C or better in CC AL 13 IN-Proficient 15 HI & PA-Proficient C or better in 4S AL 16 KY-Proficient RI-Proficient 21 HI-Advanced C or better in 4T PC 23 NJ & PA-Advanced SAT-Concordance 4-Year Selective CC AL Cut 24 KY-Advanced SAT-Concordance All CC Cut & All AL Cut 25 ACT-Concordance Community College 26 ACT-Concordance 4-Year Typical 4S AL Cut IN-Advanced 4T AL Cut ACT-Exp. & Pred. Score B or better in 4S AL & CC PC All 4S Cut Prepared 33 CC PC Cut & All 4T Cut 34 All PC Cut SAT-Exp. & Pred. Score 37 SAT-Pred. Score B or better in 4T AL 39 SAT-Exp. & Pred. Score B or better in 4T PC 4T & 4S PC Cut 40 RI-Advanced SAT-Pred. Score A or better in 4S AL 41 RI-Advanced SAT-Exp. & Pred. Score 42 RI-Advanced SAT-Exp. & Pred. Score 43 RI-Advanced Well Prepared B or better in CC AL 49 A or better in 4T AL 50 A or better in CC PC A or better in 4T PC PSAT-Concordance 57 PSAT-Concordance 58 PSAT-Concordance 59 PSAT-Concordance
Appendix A
Appendix A1: Calculating Expected Scores Using Validity Studies Data

Data were collected over two administrations for the ADP Algebra II validity studies. For the concurrent studies, high school students were administered the full Spring 2008 exam. For the cross-sectional studies, students were administered either the Spring 2008 or End-of-Fall 2008 exam, and due to time constraints, most students were instructed to take only one session (roughly half) of the exam. As a result, each unique administration (shown in Table 1) needed to be placed on the same scale so that data could be combined for analyses. The Spring 2008 scale was selected as the common scale for all validity study analyses because that administration was the first operational administration, with just over 85,000 students contributing to the calibration of the items. The Spring 2008 scale has a raw score range of 0-76 points.

Table 1. Unique Administrations for Validity Data
Unique Administration | Raw Score Range
Spring 2008 Full Test | 0-76
Spring 2008 Session
Spring 2008 Session
End-of-Fall 2008 Full Test | 0-76
End-of-Fall 2008 Session
End-of-Fall 2008 Session

No conversion was necessary for the scores of students who took the full Spring 2008 exam, since those scores are already on the scale of choice. For the five other unique administrations, converting scores to the same scale was a three-step process. The three steps were repeated for each unique administration.

• Step One: All items were anchored to the item bank, which had been re-centered to the Spring 2008 scale using a mean-mean transformation. By doing an anchored item calibration using WINSTEPS 3.60, a new score file was created which contained a raw score to theta conversion table for each variation of the test used for the validity studies.
• Step Two: An expected score on the 0-76 scale was calculated by doing a true score equating using the score file from Step One and category structure measures from the Spring 2008 exam. In true score equating, the theta associated with each raw score on the original scale is used to determine the expected raw score on the full 0-76 raw score scale. Conceptually, the calculated expected score is the score a student of a particular raw score ability would be expected to earn if administered the items on the Spring 2008 exam, based on the difficulty of those items.
• Step Three: Students' raw scores on a particular unique administration of the assessment were converted to the 0-76 Spring 2008 raw score scale using the raw score to raw score conversion table produced in Step Two.

Although data collection for the cross-sectional studies spanned more than one administration and students may have taken only one session of the exam, placing all scores on the Spring 2008 raw score scale of 0-76 points makes data from the various unique administrations comparable and allows analyses to be conducted on combined data. This rescaling also allows results from the concurrent and cross-sectional studies to be compared.
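The expected-score idea behind Step Two can be sketched for the simplest dichotomous Rasch case: given a student's theta, the expected raw score on the reference form is the sum of correct-response probabilities over that form's items. The actual computation used WINSTEPS with category structure measures (to handle polytomous short-answer and extended-response items), so the code below is a simplified, assumption-laden illustration with invented item difficulties.

```python
# Hedged sketch (not the WINSTEPS procedure itself) of true score equating
# under a Rasch model for dichotomous items: a theta is mapped to the
# expected raw score on the reference form by summing correct-response
# probabilities across that form's items.
import math

def p_correct(theta, b):
    """Rasch probability of a correct response to an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_score(theta, difficulties):
    """Expected raw score on a form with the given item difficulties."""
    return sum(p_correct(theta, b) for b in difficulties)

# Hypothetical reference-form difficulties in logits; the real Spring 2008
# form is worth 76 raw score points.
ref_form = [-1.5, -0.5, 0.0, 0.5, 1.5]
print(expected_score(0.0, ref_form))  # 2.5 by symmetry of the difficulties
```

Because the expected score is a strictly increasing function of theta, each raw-score-to-theta entry from Step One maps to exactly one expected score on the reference scale.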
Appendix A2: Mapping Spring 2009 Scale to Spring 2008 Scale

While all data from the validity studies were analyzed on the Spring 2008 scale, standards will be set on the Spring 2009 exam. It is therefore necessary to establish a statistical relationship between the two scales, since the Spring 2008 scale ranges from 0-76 raw score points while the Spring 2009 scale has a different raw score range. Two steps were necessary to map the full Spring 2009 scale to the Spring 2008 scale, as presented below:

• Step One: A score file containing a raw score to theta conversion table for the Spring 2009 exam was created by doing an anchored item calibration using WINSTEPS. Anchoring the Spring 2009 items to the Spring 2008 theta metric allows the mapping that occurs in Step Two.

• Step Two: An expected score on the 0-76 scale was calculated by doing a true score equating using the score file from Step One and category structure measures from the Spring 2008 exam. The result is all possible Spring 2009 raw scores mapped to Spring 2008 raw scores, with underlying thetas serving as the link between the two administrations.

By using the resulting conversion table, results on the Spring 2008 metric from the validity studies can be identified on the Spring 2009 operational scale, which is being used for standard setting purposes.
Appendix A3: The American Diploma Project (ADP) Algebra II End-of-Course Exam Validity Research Work Plan in Support of Standard Setting
July 1

1.0 Executive Summary

This plan documents the research studies and related tasks Pearson is conducting in collaboration and coordination with Achieve and the ADP Algebra II Coordination and Direction Team (CDT) to provide validity evidence to inform standard setting for the Algebra II End-of-Course Exam in July of 2009. The plan reflects ongoing consultation and guidance from the Research Alliance.

2.0 Program Overview

In the fall of 2005, with support from Achieve, nine American Diploma Project Network states (Arkansas, Indiana, Kentucky, Maryland, Massachusetts, New Jersey, Ohio, Pennsylvania, and Rhode Island) came together to develop specifications for a common end-of-course exam in Algebra II. Six additional states (Arizona, Florida, Hawaii, Minnesota, North Carolina, and Washington) have since joined the ADP Assessment Consortium, bringing the total number of participating states to fifteen. In March 2007, the states awarded the contract to develop and administer the ADP Algebra II End-of-Course Exam to Pearson. Field testing of Algebra II End-of-Course Exam items was conducted in October 2007 and February 2008. The first operational tests were administered in May and June 2008. Going forward, the exam will be administered during two operational administration windows each year, at the end of fall and at the end of spring. Pearson's base contract runs through June 30. The exam will be available in both online and paper formats beginning with the spring 2009 administration. A subset of the consortium states is working with Pearson to develop the ADP Algebra I End-of-Course Exam. The first operational test will be administered in the spring of

2.1 Program Goals

The American Diploma Project Algebra II End-of-Course Exam is designed to serve three critical goals:

• to improve high school Algebra II curriculum and instruction;
• to serve as an indicator of readiness for first-year college credit-bearing courses; and
• to provide a common measure of student performance across states over time.

The validity studies described in this plan are those required prior to standard setting, and they focus on the use of the exam as an indicator of readiness for first-year college credit-bearing courses. The other two goals of the program will be fulfilled through longer-term studies, such as longitudinal studies that follow a cohort of students who took the ADP Algebra II Exam in high school into college, analyzing their exam scores, the courses they took in high school, and their performance in college.
2.2 The Role of the States

The 15 states participating in the ADP Assessment Consortium vary in their usage of the ADP Algebra II Exam. During the school year, three states (Arkansas, Hawaii, and Indiana) required that all Algebra II students take the ADP Algebra II End-of-Course Exam at the end of the course. In most other states, it is up to the school or district to decide whether its students will participate in the exam. Going forward, the Research Alliance has suggested that when a state would like to test all students completing the Algebra II course but cannot due to budgetary constraints, a representative sample be identified; this allows more meaningful comparisons over time and across states, which are not as easily made with samples of convenience.

The following tables provide counts of students registered by administration compared to the number of tests completed, by mode (paper or online) and by state. High attrition rates for the field tests have made it difficult to obtain an adequate sample size for some analyses (e.g., online comparability).

Table 1: ADP Algebra II End-of-Course Exam Fall 2007 Field-Test Participation
State | # Students Registered Paper (as of 9/19/07) | # Students Tested Paper (10/1/07-10/12/07) | # Students Registered Online (as of 9/10/07) | # Students Tested Online (10/1/07-10/12/07)
Arkansas | 27,493 | 15,
Indiana | 0 | 0 | 2,
Kentucky | 7,958 | 4,728 | 1,
Maryland | 3,657 | 1,953 | 6,
Massachusetts | 2,157 | 1,
Minnesota | 1,
New Jersey | 15,260 | 11,761 | 2,
Ohio | 36,974 | 21,
Pennsylvania | 2,859 | 2,
Rhode Island |
Tennessee* | N/A
Washington | 0 | 0 | 2,
Total | 98,032 | 61,937 | 15,891 | 3,488

* Tennessee participated in the fall field test but is not a member state.
Table 2: Algebra II End-of-Course Exam Winter 2008 Field-Test Participation
State | # Students Registered (as of 2/12/08) | # Students Tested (2/11/08-2/15/08)
Indiana |
Kentucky |
Maryland |
Massachusetts | 3,293 | 1,227
Minnesota | 2,717 | 1,678
New Jersey | 1,579 | 1,036
Total | 9,722* | 4,864

* Of the total number of students registered, 1,247 students were to participate in the comparability study (half assigned to paper, half assigned to online); however, only 606 students participated in the comparability study.

Table 3: Algebra II End-of-Course Exam Spring 2008 Participation
State | # Students Registered | # Students Tested (5/1/08-6/13/08)
Arizona | 1,319 | 1,091
Arkansas | 25,996 | 22,101
Hawaii | 8,077 | 5,157
Indiana | 4,478 | 3,027
Kentucky | 2,568 | 2,019
Minnesota |
New Jersey | 12,504 | 9,813
North Carolina | 1,
Ohio | 44,002 | 33,611
Pennsylvania | 10,845 | 8,371
Rhode Island | 2,658 | 1,853
Washington |
Total | 114,352 | 88,344

Table 4: Algebra II End-of-Course Exam Fall 2008 Participation
State | # Students Registered | # Students Tested
Arkansas | 1,
Indiana | 1,146 | 1,011
Ohio | 2,712 | 1,632
Total | 4,932 | 3,419
Table 5: Algebra II End-of-Course Exam Spring 2009 Forecast (as of 5/19/09)
State | # Exams Ordered Spring 2009 (Paper) | # Exams Ordered Spring 2009 (Online)
Arizona | 4,
Arkansas | 26,887
Hawaii | 7,301
Indiana | 39,382 | 16,526
Kentucky | 1,617
Maryland | 2,304
Massachusetts | 5,000
Minnesota | 1,713
New Jersey | 11,
North Carolina | 3,
Ohio | 3,035
Pennsylvania | 8,645
Rhode Island | 435
Total | 110,097 | 21,896

The exam will be administered May 4-June 12, 2009. In addition to the ADP Assessment Consortium volumes shown in Tables 4 and 5, a small number of exams have been administered or are expected to be administered outside of the 15 states. A school district from Connecticut participated in the Fall 2008 administration. A school district from Idaho has ordered a small number of exams for the Spring 2009 administration.

2.3 The Role of Pearson

Pearson is responsible for the design, development, and delivery of the ADP Algebra II End-of-Course Exam. As part of its commitment to create a valid, reliable, high-quality exam, Pearson is also responsible for the oversight and execution of the baseline validity research required for standard setting. In addition, Pearson will work in close partnership with Achieve, the CDT, and other stakeholders in the American Diploma Project initiative to help develop and execute a long-term research agenda that establishes a broad and deep base of evidence for the purpose, value, and impact over time of the ADP Algebra II End-of-Course Exam. Pearson recommends that this build on the studies planned for this year, as well as the comprehensive validity studies document presented at the November 2007 Research Alliance meeting.

2.4 The Role of Achieve

Created by the nation's governors and business leaders, Achieve is a bipartisan, non-profit organization that helps states raise academic standards, improve assessments, and strengthen accountability to prepare all young people for postsecondary education, work, and citizenship.
Achieve was founded at the 1996 National Education Summit and has sponsored subsequent Summits in 1999, 2001, and 2005. At the 2005 Summit, Achieve launched the ADP Network and
subsequently collaborated with the ADP member states to develop the ADP Algebra II End-of-Course Exam.

Achieve currently plays a multifaceted role on the ADP Algebra II exam project. This includes:

• facilitating communications between the member states, Pearson, advisory groups, and other exam stakeholders;
• providing an oversight and quality assurance function related to test development and administration;
• championing the test to the higher education community and fostering research and collaboration on the test; and
• keeping chief state school officers informed about the exam.

Achieve will help to facilitate the execution of the validity research and spring 2009 standard setting with the CDT, the Research Alliance, and Pearson. Achieve will be responsible for selecting the final cut scores, based on recommendations from the CDT and Achieve-selected advisors who attend the standard setting meeting.

2.5 The Role of the Research Alliance

Developing the Algebra II End-of-Course Exam introduces unprecedented challenges at both the technical and the political level. Ultimately, the exam must have sufficient documented validity for postsecondary institutions to accept the exam results as credible indicators of learning and readiness for college-level instruction. The test must also have political buy-in from key leaders and those who make decisions about placement at postsecondary institutions across the participating states. The Research Alliance plays a key role in meeting these challenges. The Research Alliance includes experts in assessment, curriculum, K-12 education policy, and postsecondary issues who together have the required background and experience to advise on all aspects of the Algebra II End-of-Course Exam.
The purpose of the Research Alliance is to provide high-level expert advice to Achieve and the test developer (Pearson) on a variety of issues, such as what validity studies are required so that the exam gains credibility among postsecondary faculty and what work must be done on campuses to ensure the test is used. Other issues that may require input from the Research Alliance include how to ensure the standard setting process produces standards that truly indicate college readiness, how to develop innovative items for online testing, and the application of automated scoring for constructed-response items. Appendix A includes background notes from the November 2007, March 2008, June 2008, and October 2008 Research Alliance meetings. Appendix B includes a list of current Research Alliance members.

3.0 Validity Studies Required to Support Standard Setting

The ADP Algebra II End-of-Course Exam ultimately aims to raise the rigor of both teaching and learning in our high schools so all U.S. students graduate ready to do college-level work. Through different types of validity studies, we collected data to better understand how the ADP
Algebra II Exam fits into the current landscape of mathematics instruction and assessment across public high schools as well as 2- and 4-year colleges. Three types of validity studies to support standard setting are described in this plan: judgment studies, concurrent validity studies, and cross-sectional validity studies. Each of these studies is focused on providing evidence to inform setting the performance level on the exam that indicates readiness for college-level mathematics. Each type of study is described in the following paragraphs.

3.1 Judgment Studies

Purpose

The purpose of the judgment studies is to determine from postsecondary instructors and professors what knowledge, skills, and abilities a student needs to have previously mastered in order to successfully learn the material that will be covered in their first credit-bearing math course, specifically College Algebra or Precalculus. "Successfully learn" refers to students who will ultimately earn either an A or B in the class without taking a remediation class.

Description of Studies

The general approach that Pearson used to elicit information from college professors/instructors in the judgment studies is as follows:

1. Convene three regional meetings to gather information and feedback from the college professors/instructors.
2. Each of the meetings was one day in length. Meetings were held on February 19 (Little Rock, AR), March 2 (Baltimore, MD), and April 9 (Cleveland, OH).
3. Up to 60 participants were recruited for each meeting, for a total of 180 participants recruited across the three meetings.
4. The meetings were facilitated by Pearson's psychometric and content teams.

Below is the agenda for each judgment study meeting.
7:30-8:15  Sign-in
8:15-8:30  Overview of the ADP Program
8:30-9:00  Introduction to Tasks
9:00-9:15  Transition to breakout rooms
9:15-9:45  Task 1: Complete Standards Relevance Survey
9:45-11:45  Task 2: Draft Threshold Descriptions
11:45-12:00  Transition to lunch
12:00-12:45  Lunch
12:45-3:00  Task 3.1: Review Exam and Calculate Cut Score, Round 1
3:00-3:15  Break
3:15-3:45  Table-Level Discussion of Round 1 Results
3:45-4:15  Room-Level Discussion of Round 1 Results
4:15-4:45  Task 3.2: Review Exam and Calculate Cut Score, Round 2
4:45-5:00  Panelist Information Sheet, Evaluation of Process, Reimbursement Form
The ADP Algebra II End-of-Course Exam Judgment Study Summary Report contains a full description of each task listed above. The data from the judgment studies have been combined into a single report included in the standard setting briefing book. The data are disaggregated by type of institution (2-year community college, 4-year typical college, and 4-year more selective college) and by course title (College Algebra and Precalculus). Four-year typical colleges are defined as those with admit rates between 64% and 89%. More selective four-year colleges are defined as those with admit rates between 40% and 63%. Public colleges with admit rates of less than 40% or more than 89%, or with open admission policies, were outside of the target sample and were not recruited for this study. Private colleges were also excluded and were not part of the target sample.

Sample Size

The targeted sample size for the judgment studies is included in Table 6 and was derived as follows:

1. Recruit participation from all fifteen ADP Algebra II consortium states, expanding to non-consortium states within the geographic region of the meeting locations as needed.
2. Across the states, target instructors/professors at eligible public two- and four-year campuses. One-third of the participants should come from each of the institution types (2-year community, 4-year typical, 4-year more selective).
3. All instructors/professors should currently be teaching or supervising the instruction of College Algebra or Precalculus. Math chairs or department heads may also participate. Two-thirds of the instructors/professors within an institution type should be from a College Algebra course, and one-third from a Precalculus course.
4. Initial recruitment should start with higher education professors and instructors who have previously participated in ADP item reviews, data reviews, and/or rangefinding meetings.
5.
Each instructor who agrees to participate will be asked to provide a copy of his or her College Algebra or Precalculus course syllabus. Mathematics content experts from Achieve will review the syllabi for alignment to the exam standards to further put the results from the judgment studies in context.
Table 6: Summary of Judgment Study Sample
Variable | Total Number of Instructors/Professors Targeted | Attendance at Feb 19 Judgment Study | Attendance at Mar 2 Judgment Study | Attendance at Apr 9 Judgment Study | Total Attendance
Two-year, public community college |
Four-year, public college (typical admit rate) |
Four-year, public college (more selective admit rate) |
Total |

Fifty-three participants were originally registered to attend the March 2 Judgment Study Meeting in Baltimore, Maryland. Due to winter storms on the East Coast, there were 18 cancellations the day of the meeting.

Listed below are the college campuses that sent one or more representatives to a judgment study meeting: Arizona Western College, Anne Arundel Community College, Arkansas State University Mountain Home, Ball State University, Baltimore City Community College, Bergen Community College, Bloomsburg University, Bossier Parish Community College, Broward College, California University of Pennsylvania, Central Arizona College, Central Piedmont Community College, Central Washington University, Columbus State Community College, Cossatot Community College, County College of Morris, Delaware Community College, Durham Tech Community College, Eastern Washington University, Elgin Community College, Fayetteville State University (NC), Florida Gulf Coast University, Framingham State College, Henderson State University, Honolulu Community College, Indiana Purdue Fort Wayne, Indiana State University, Indiana University Bloomington, Indiana University-Purdue, Ivy Tech Community College, James A.
Rhodes State College Jefferson Community College Kauai Community College Kent State University Kent State University, Trumbull Kutztown University Lake Superior State University Lakeland Community College Leech Late Tribal College Leeward Community College Maui Community College Millersville University Montclair State University Montgomery College Morgan State University Northern Arizona University Ozarka College Palm Beach Community College Penn State University Purdue University North Central A3-8
Rhode Island College; Rhodes State College; Rowan University; San Jacinto College; Southern Arkansas University; St. Cloud State University; The Ohio State University; Towson University; University of Arkansas; University of Arkansas-Fayetteville; University of Arkansas, Fort Smith; University of Arkansas, Little Rock; University of Central Arkansas; University of Central Florida; University of Cincinnati; University of Florida; University of Hawaii; University of Hawaii Hilo; University of Massachusetts Boston; University of North Florida; University of North Texas; University of South Florida; University of West Florida; University of Wisconsin, Madison; University of Wisconsin, Milwaukee; Windward Community College; Winston-Salem State University.

Timeline

- Recruit college instructors/professors who have attended previous ADP meetings (Pearson; November 2008)
- Recruit additional college instructors/professors for judgment meetings (Pearson; November 2008-March 2009)
- Create judgment meeting materials, including the script for facilitators and the survey (Pearson; December 15, 2008-January 23, 2009)
- Judgment study meeting in Little Rock, Arkansas (Pearson; February 19, 2009)
- Judgment study meeting in Baltimore, Maryland (Pearson; March 2, 2009)
- Judgment study meeting in Cleveland, Ohio (Pearson; April 9, 2009)
- Summarize results from the three meetings into a report for the briefing book (Pearson; May 2009)

Risks

A known risk for this study was not recruiting a sufficient number of participants by category, particularly for the 4-year institution types. In the end, 58-86% of the target sample was obtained for each category.
3.2 Concurrent Validity Studies

Algebra II Exam Scores and ACT/SAT Scores

Purpose

Establishing concurrent relationships with the primary college admissions tests used in the United States, the ACT and the SAT, will be useful for the validity of the ADP Algebra II test. These measures already provide implicit and explicit benchmarks of college readiness.

Description of Studies

Different states tend to emphasize one college admissions test or the other. Table 7 lists the eight states providing the largest numbers of students testing in the spring 2008 ADP administration. For each state, we have listed the numbers of graduating (or college-bound) ACT and SAT testers. In the final column we have listed the preferred test for matching to Algebra II test scores, based on the spring 2008 administration.

Table 7: ACT/SAT Testing Volumes for Most Active Spring 2008 Algebra II Participants

| State | N Spring '08 | # ACT Testers (Graduates) | # SAT Testers (CB Seniors) | Preferred Match |
|---|---|---|---|---|
| Kentucky | 2,019 | 30,929 | 3,898 | ACT |
| Rhode Island | 1,853 | 1,102 | 8,287 | SAT |
| Arkansas | 22,101 | 21,403 | 1,389 | ACT |
| Hawaii | 5,157 | 2,589 | 7,982 | SAT |
| Indiana | 3,027 | 14,257 | 42,911 | SAT |
| New Jersey | 9,813 | 11,132 | 85,511 | SAT |
| Ohio | 33,611 | 86,080 | 33,902 | ACT |
| Pennsylvania | 8,371 | 15, | ,911 | SAT |

Sample Size

The original research plan recommended pursuing concurrent validity studies with at least two states for which the preferred matching test is the ACT and at least two states for which the preferred matching test is the SAT. From September 2008 until February 2009, Pearson worked with the states listed in Table 7 to obtain ACT or SAT data that could be matched to their spring 2008 ADP scores. The following data were received.

Table 8: States Providing College Entrance Data

| State | College Entrance Data |
|---|---|
| Arkansas | ACT |
| Indiana | SAT |
| Kentucky | ACT |
| Pennsylvania | SAT |
| Rhode Island | PSAT |
Rhode Island provided PSAT data instead of SAT data due to the availability of scores from the College Board (SAT files would not be available until August 2009, after standard setting).

Risks

- There is no common student identifier to facilitate matching ACT and SAT data to ADP Algebra II results. The matching process therefore used variables such as student name, date of birth, gender, and school/district identifier. While the Arkansas match yielded over 6,000 students for whom both an ACT and an ADP score are available, other states' data have yielded significantly fewer matches. For example, the Indiana file contained only 205 SAT/ADP matches.
- Confidentiality of students must be protected in this activity. For example, each state must follow the Family Educational Rights and Privacy Act (FERPA). In addition, each state may have additional laws, possibly more restrictive than FERPA, that govern the protection of student identifying information. It is not possible to successfully match ACT or SAT data with Algebra II data without including student names in the matching algorithm. Thus, each state must analyze and understand the legal implications of this matching activity with its data. While Ohio administered the most ADP exams during the spring of 2008, Ohio was not able to provide OGT or ACT data due to state privacy laws.
- The ACT and SAT are each administered several times across the course of the year, and students participating in the spring 2008 Algebra II administration may take these tests over more than one year. This further complicates the matching activity. As a result, states that did not already have their SAT data needed to request PSAT data from the College Board. PSAT data was available in February 2009, whereas SAT data would not be available until August 2009, after standard setting.

Relationships with Other High School Tests

Another source of concurrent validity data for the ADP Algebra II Exam is the other high school exams administered within the participating states. These include high school tests used for NCLB purposes and high school graduation tests.

Purpose

Empirical relationships can be established relatively easily by matching test results for students to whom both types of tests are administered. Such relationships will add to the validity evidence for the ADP Algebra II Exam and help provide context to state leaders. In these studies, relationships are summarized in the standard setting briefing book in terms of bivariate plots of exam scores for the same students and correlations between exam scores.

Sample Sizes

In the original research plan, Pearson recommended carrying out studies to summarize relationships between the ADP Algebra II End-of-Course Exam and other high school tests with at least three states for which a viable matching test exists.
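The name/date-of-birth/gender matching described in the ACT/SAT risks above is, at its core, a join on normalized identifying fields. The sketch below is a minimal illustration of that idea; the field names, normalization rules, and records are invented assumptions, not the study's actual algorithm.

```python
# Illustrative deterministic record match: no common student ID exists, so
# records are joined on normalized name, date of birth, gender, and school.
# All field names and data below are hypothetical.

def norm(s):
    """Normalize a string field: lowercase, drop punctuation and spaces."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def match_key(rec):
    return (norm(rec["last"]), norm(rec["first"]),
            rec["dob"], rec["gender"], rec["school_id"])

def match(adp_records, act_records):
    """Return (ADP record, ACT record) pairs that agree on the match key."""
    index = {match_key(r): r for r in act_records}
    return [(r, index[match_key(r)]) for r in adp_records
            if match_key(r) in index]

adp = [{"last": "O'Neil", "first": "Ann ", "dob": "1991-03-04",
        "gender": "F", "school_id": "AR-17", "adp_score": 41}]
act = [{"last": "ONeil", "first": "ann", "dob": "1991-03-04",
        "gender": "F", "school_id": "AR-17", "act_math": 22}]
print(match(adp, act))  # one pair matches despite the formatting differences
```

Normalization is what rescues near-misses like "O'Neil" versus "ONeil"; records differing in any key field would simply fail to match, which is why match counts varied so much by state.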
Table 9 lists the states that provided state high school exam scores matched, or to be matched, to ADP spring 2008 scores.

Table 9: State Tests Providing Potential Validity Evidence for the ADP Algebra II Exam

| State | State Exam |
|---|---|
| Hawaii | Grade 10 Hawaii State Assessment |
| Indiana | Grade 10 ISTEP+ |
| Kentucky | Grade 11 KCCT |
| New Jersey | Grade 11 HSPA |
| Pennsylvania | Grade 11 PSSA |
| Rhode Island | Grade 11 NECAP |

Risks

A possible drawback for these studies is that most of the students taking the Algebra II Exam will be in grades 10 and 11. Thus, for any given state, it may only be possible to match a subset of test takers based on 2008 state test results. A more complete match in some states may be possible if two years of state test data are mined; however, it may not be possible to use 2009 state test results to inform the standard setting that will occur in early August.

3.3 Cross-sectional Validity Studies

Purpose

Documenting the relationship between student performance on the Algebra II End-of-Course Exam and performance in college mathematics courses will be an important source of evidence to inform standard setting. The cross-sectional validity studies were designed to help establish the ADP Algebra II Exam as an effective gauge of student readiness to take credit-bearing courses and to establish the value/credibility of the exam at both 2- and 4-year colleges. In these studies, the ADP Algebra II End-of-Course Exam was given to students enrolled in selected math courses at two-year and four-year institutions.

Description of Studies

Two variations on this type of study were conducted:

1) Predictive studies, where students newly enrolled in relevant college courses take the ADP Algebra II End-of-Course Exam at the beginning of the semester, and the research question is: how well does a student's performance on the ADP test predict his/her performance in the math course?
2) Criterion studies, where students in relevant college courses take the ADP Algebra II End-of-Course Exam at the end of the semester, and the research question is: how well does the ADP test assess what is being taught in the math course?

In both types of studies, we collected auxiliary information about participating students, such as final grade in the course and demographic data. Course syllabi were also requested from the course instructors. When available, colleges also provided additional student information, such as scores on other placement tests used at the institutions (e.g., ACCUPLACER, COMPASS, or local instruments).

An additional goal of the predictive studies is to calculate the probability of success in the first credit-bearing college math course based on ADP scores. Specifically, levels of exam performance can be associated with probabilities that students will achieve a grade of C or better, B or better, and A or better using logistic regression. These probabilities can be used to create empirically-based performance level descriptors (PLDs) such as, "By receiving a score of X on the ADP Algebra II End-of-Course Exam, you have a Y probability of receiving a grade of Z in your first credit-bearing math course."

Status of Predictive Studies

- Seven colleges from Arizona and Maryland participated in a predictive study in the fall of 2008.
- Twenty-four colleges from Arkansas, Florida, Hawaii, Indiana, New Jersey, and Ohio participated in the predictive study in January 2009.
- Students in the predictive study were administered either the calculator or the non-calculator section of the operational exam, i.e., half of an operational form.
- While 66% of the students participating in the fall study were entering Precalculus and only 34% were beginning a College Algebra course, the recruiting emphasis for the January study was 60% College Algebra, 20% Precalculus, and 20% remedial math courses.

Sample Sizes for the Predictive Validity Studies

The following table summarizes the number of college students who were assessed using the ADP Algebra II End-of-Course Exam at the beginning of the fall 2008 semester.
Table 10: Fall 2008 Predictive Validity Study, Number of Students Assessed

| Institution | State | Students Assessed |
|---|---|---|
| Northern Arizona University | Arizona | 108 |
| Arizona State University | Arizona | 199 |
| University of Arizona | Arizona | 104 |
| Cochise Community College | Arizona | 92 |
| University of Maryland-College Park | Maryland | 176 |
| University of Maryland-Baltimore County | Maryland | 300 |
| Howard Community College | Maryland | 152 |
| Total | | 1,131 |

In the original table, counts were further broken out by institution type: 2-year public community college, 4-year public typical college, and 4-year public selective college.
The original sampling design for the January predictive studies is listed below in Table 11.

Table 11: January Predictive Studies Original Sampling Design (Expected Number of Students)

| State | 2-yr, public community college | 2-yr, public technical college | 4-yr public, typical college | 4-yr public, selective college | Total |
|---|---|---|---|---|---|
| Arkansas | | | | | ,080 |
| New Jersey | | | | | |
| Ohio | | | | | |
| Total | | | | | ,880 |

Within a category (cell), the target was 60% College Algebra, 20% remedial math, and 20% Precalculus. A typical college admits between 64% and 89% of applicants; a more selective college admits between 40% and 63% of applicants.

Despite attempts to recruit technical colleges for the study, initial efforts were not successful. Therefore, subsequent recruiting efforts focused on the remaining three categories only and expanded to include the states of Florida, Hawaii, and Indiana. Table 12 below lists the colleges that participated in the January 2009 predictive study and the number of tests scored for each college.
Table 12: January Predictive Study Participation

| Institution | State | Tests Scored |
|---|---|---|
| Arkansas Northeastern College | Arkansas | 50 |
| Arkansas State University | Arkansas | 203 |
| Arkansas Tech University | Arkansas | 229 |
| Henderson State University | Arkansas | 245 |
| NorthWest Arkansas Community College | Arkansas | 180 |
| Ozarka College | Arkansas | 39 |
| University of Arkansas | Arkansas | 570 |
| Chipola College | Florida | 91 |
| Hillsborough Community College | Florida | 51 |
| Miami-Dade College-Hialeah Campus | Florida | 107 |
| University of Central Florida | Florida | 135 |
| Valencia Community College-Orlando Campus | Florida | 184 |
| Valencia Community College-Osceola Campus | Florida | 121 |
| Honolulu Community College | Hawaii | 81 |
| Leeward Community College | Hawaii | 143 |
| University of Hawaii at Hilo | Hawaii | 142 |
| Windward Community College | Hawaii | 55 |
| University of Southern Indiana | Indiana | 144 |
| Essex County College | New Jersey | 204 |
| Rowan University | New Jersey | 117 |
| Kent State University | Ohio | 111 |
| Kent State University-Trumbull Campus | Ohio | 68 |
| Rhodes State College | Ohio | 84 |
| University of Cincinnati | Ohio | 336 |
| Total | | 3,690 |

Of the 3,690 tests scored, 1,458 came from 2-year public community colleges, 1,023 from 4-year public typical colleges, and 1,209 from 4-year public selective colleges.

Methodology for the Predictive Studies

1. Pearson created lists of eligible two-year and four-year institutions in the target states. Four-year institutions admitting more than 89% or fewer than 40% of applicants were excluded from the sampling lists. Four-year institutions were classified as typical (admitting about 64-89% of applicants) or more selective (admitting 40-63% of applicants). Two-year institutions, which were previously differentiated into technical and community colleges, were combined into a single two-year community college category.
2. Schools were contacted in the order they appeared on the lists until sufficient sample sizes were obtained for each category.
3. The exam was administered to students entering a remedial math course, a College Algebra course, or a Precalculus course during the period January 12-23, 2009. The final breakout of the 3,690 students that participated in the January study (percentage or number by type of course) will be available during spring 2009 as data sets are completed and analyses are performed.
4. The professors at each college were instructed to administer the calculator section to half of the students and the non-calculator section to the other half, if possible. IRT analyses were used to place student ability estimates on a common scale regardless of the form administered.
5. Participating schools were asked to provide the following information for each student:
   a. Final grade in the course
   b. Demographic information
   c. Any additional admissions or placement test scores that are available
6. Participating schools will send the auxiliary data, including final course grades, at the end of the semester by May 29, 2009. Each professor was also asked to provide a course syllabus or course description.
7. Pearson will match the auxiliary data with the Algebra II test scores for the participating students.

Timeline

- Arizona and Maryland predictive study administration (Colleges; August-September 2008)
- Arizona and Maryland colleges provide auxiliary information (course grades) to Pearson after the fall semester (Colleges; January 2009)
- Pearson matches auxiliary data for AZ and MD colleges to ADP scores (Pearson; February 2009)
- Colleges recruited for January cross-sectional study (AR, HI, NJ, OH) (Pearson; October-December 2008)
- January predictive study administration dates (Colleges; January 12-23, 2009)
- Arkansas, Florida, Hawaii, Indiana, New Jersey, and Ohio colleges provide auxiliary data to Pearson after the spring semester (Colleges; end of semester, by May 29, 2009)
- Pearson matches auxiliary data to ADP scores and drafts report for standard setting briefing book (Pearson; June 1-June 19, 2009)

Status of Criterion Validity Studies

Seven colleges from Hawaii participated in a criterion study during the spring 2008 test administration. A total of 149 students were assessed: 40 students from 4-year public colleges and 109 students from 2-year colleges. All students were administered the complete ADP Algebra II End-of-Course Exam.

Four community colleges from Maryland administered the ADP Algebra II End-of-Course Exam in December 2008. Approximately 331 students were assessed. Students were administered either the calculator or the non-calculator section.
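Methodology step 4 above relies on IRT to put students who took different half-forms on one ability scale. The toy sketch below shows the idea under a 2PL model: once item parameters are calibrated, an ability estimate can be computed from whichever item subset a student saw. The item parameters and response patterns are invented for illustration, not the operational calibration.

```python
import math

# 2PL maximum-likelihood ability estimation, sketched with hypothetical
# item parameters (a = discrimination, b = difficulty).

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def ml_ability(responses, items):
    """Maximum-likelihood ability estimate via a simple grid search."""
    grid = [x / 100.0 for x in range(-400, 401)]
    def loglik(theta):
        ll = 0.0
        for u, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if u == 1 else math.log(1.0 - p)
        return ll
    return max(grid, key=loglik)

# Two half-forms drawn from one calibrated pool (hypothetical parameters).
calc_items    = [(1.2, -0.5), (0.9, 0.0), (1.0, 0.8), (1.4, 1.2)]
noncalc_items = [(0.8, -1.0), (1.1, -0.2), (1.3, 0.5), (1.0, 1.5)]

# A student who answers the two easier items correctly lands at a similar
# theta no matter which half-form was administered.
print(ml_ability([1, 1, 0, 0], calc_items))
print(ml_ability([1, 1, 0, 0], noncalc_items))
```

Both estimates land close together on the theta scale, which is the sense in which scores from different half-forms become comparable.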
Table 13 summarizes the number of college students who were assessed with the ADP Algebra II End-of-Course Exam at the end of the spring 2008 semester in Hawaii and the number of Maryland college students who were assessed at the end of the fall 2008 semester.

Table 13: Criterion Validity Studies, Number of Students

| College Name | Number of Students |
|---|---|
| Honolulu Community College | 3 |
| Kauai Community College | 6 |
| Leeward Community College | 60 |
| Maui Community College | 12 |
| Windward Community College | 28 |
| University of Hawaii at Hilo | 33 |
| University of Hawaii-West Oahu | 7 |
| Hawaii Total Assessed | 149 |
| Howard Community College | 122 |
| Cecil College | 109 |
| Anne Arundel Community College | 89 |
| College of Southern Maryland | 11 |
| Maryland Total Assessed | 331 |

Risks

1. Lack of incentive or motivation for institution faculty to sign up for the study or to follow through in sufficient numbers. For the predictive studies, this risk was mitigated by offering an incentive of $15 per complete half-test to the college. Such incentives were not offered for the criterion studies.
2. Lack of motivation among participating college students may result in underperformance on the exam. In particular, the difficulty of the Algebra II End-of-Course Exam may result in relatively homogeneous scores (e.g., student scores could be clustered within a narrow range). This could falsely give the impression that the test does not discriminate well. This may have been mitigated by colleges that passed a portion of the $15-per-test incentive on to the students. It may also have been mitigated by students taking only half of the exam.
3. Use of half-tests (calculator or non-calculator) results in much less reliable measures than the full test. This will reduce the magnitude of the validity coefficients, since reliability provides an upper bound for validity. Statistical corrections may be used to adjust the validity coefficients for the shorter test lengths.
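The reliability bound and the "statistical corrections" mentioned in risk 3 follow classical test theory: the Spearman-Brown formula gives the expected reliability of a half-length test, and the correction for attenuation adjusts an observed validity coefficient for that unreliability. The numbers below are illustrative, not estimates from these studies.

```python
import math

# Classical test theory arithmetic behind risk 3. The observed validity
# coefficient is bounded by the measures' reliabilities, so a half-length
# test depresses it; disattenuation estimates the unreliability-free value.

def spearman_brown(r_full, factor):
    """Reliability of a test whose length is scaled by `factor`."""
    return factor * r_full / (1.0 + (factor - 1.0) * r_full)

def disattenuate(r_xy, r_xx, r_yy=1.0):
    """Correct an observed correlation for unreliability in the measures."""
    return r_xy / math.sqrt(r_xx * r_yy)

r_full = 0.90                           # hypothetical full-test reliability
r_half = spearman_brown(r_full, 0.5)    # expected half-test reliability
observed = 0.45                         # hypothetical observed validity coefficient

print(round(r_half, 3))                           # 0.818
print(round(disattenuate(observed, r_half), 3))   # 0.497
```

Even with a respectable full-test reliability of 0.90, halving the test drops reliability to about 0.82, and the correction raises the illustrative observed coefficient of 0.45 to about 0.50.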
Despite instructions to colleges to administer the two sections of the exam evenly, more students took the non-calculator portion, which is the first session in the test book.

3.4 Other Studies to Include in Long-Term Research Plan

There are a variety of other studies under consideration that may be included in the long-term research plan for the exam. These studies will be planned following standard setting and fall into three broad categories: 1) longitudinal studies, 2) international benchmarking, and 3) online comparability studies. Additional content-based studies may also be considered, as well as research studies for the ADP Algebra I End-of-Course Exam.
4.0 The Standard Setting Meeting

4.1 Overview

Participants will be provided with empirical data from the validity studies (judgment, concurrent, and cross-sectional) as well as background information from Achieve to provide a basis for policy-based cut score recommendations. The validity study data will inform the standard setting event by providing meaning for different levels of performance on the ADP Algebra II End-of-Course Exam. This meaning can be established during the standard setting meeting by sharing both detailed and summary results from the validity studies described above, as well as results of additional data collections (e.g., Achieve's analysis of college syllabi from the judgment study meetings and their alignment to the ADP exam standards).

Results from the validity studies that could be presented at the standard setting meeting include graphical and statistical summaries of relationships between Algebra II test performance and various criterion variables. For example, Figure 1 illustrates a hypothetical relationship between Algebra II scores and final grades in a college course. In this example, the relationship is strong, and one would be able to confidently predict college grades as a function of ADP Algebra II test performance.

[Figure 1: Hypothetical Relationship between ADP Algebra II Scores and College Grades. X-axis: ADP Algebra II Score (0-76); y-axis: College Grade (1=F, 5=A).]

In reality, these relationships are likely to be weaker and less definitive than shown in the illustration. Furthermore, the relationships may differ across data collections carried out in different states. Nevertheless, graphical summaries of relationships between ADP Algebra II scores and college grades will provide useful reference points for participants to consider during the standard setting process.

The collection of validity data from the studies described above can be summarized in a variety of ways that will inform the standard setting. Summaries can include:
- Student performance on the ADP Algebra II test, including the spring 2009 and prior administrations.
- Graphs similar to Figure 1 depicting relationships between Algebra II test performance and college grades, for both the predictive and the criterion studies.
- Information about the extent to which the content on the Algebra II Exam (and the underlying standards) is similar to the content in the course that immediately precedes College Algebra (typically the first credit-bearing college math course). This may be represented by course syllabi collected as part of the higher education predictive and criterion studies, and may be supplemented by other relevant content-related research carried out by Achieve.
- Tables that provide interpretative information by representing the probability of students achieving particular grades as a function of Algebra II test scores (e.g., probability of grade C and above, grade B and above, etc.).
- Concordance tables summarizing relationships between ADP Algebra II test scores and ACT mathematics and SAT quantitative section scores.
- Concordance tables summarizing relationships between ADP Algebra II test scores and various state test scores, with particular emphasis on state test performance levels (e.g., pass, proficient, advanced, etc.).
- Overall summaries of correlational and predictive relationships between Algebra II test scores and other variables.

This type of data will be collected and provided to the standard setting participants in the form of a briefing book. At standard setting, a spring 2009 operational item book, an impact data look-up table, a rubrics book, and a key sheet will also be provided.

4.2 Participants

Panelists will include 15 state representatives (each state's lead CDT representative or an authorized designee) and 14 Achieve-selected advisors. The participants should be aware that the deliberations in the standard setting will be policy-oriented rather than content-oriented.
The task of the panelists will be to make policy recommendations informed by the empirical data resulting from the validity research studies. Participants will receive a mailing 1-2 weeks before the standard setting meeting with a summary of each study (the briefing book). Achieve will attend the standard setting meeting as an observer and will wait for cut score recommendations from the panel before setting the final cut scores.

4.3 Process

At the June 2008 Research Alliance meeting, Achieve and the Research Alliance accepted Pearson's recommended method for standard setting: a modified briefing book method, similar to that outlined by Haertel (2002). In this approach, policy makers are provided with all of the relevant data related to the standard setting decisions and explicitly consider the policy implications of the data. Haertel argued that stakeholder participation in a rational and coherent deliberative process is necessary to assure that an appropriate validity argument for performance standards can be satisfied. The briefing book would serve as the basis for the standard setting exercise; however, an item-based exercise such as a modified Angoff or Bookmark procedure would not form the basis for quantifying the performance level cuts recommended by the standard setting participants.
In this case, the standard setting members will be provided with empirical data from the validity studies (judgment, concurrent, cross-sectional) to provide a basis for policy-based cut score recommendations.

4.3.1 Number of Performance Levels/Cuts

Originally, three cuts and four performance levels were planned for the exam. However, due to concerns raised regarding insufficient floor and early results of the validity studies, Pearson recommended, and Achieve and the states agreed to, two cuts and three performance levels. Each level will describe student proficiency in Algebra II and preparedness for the first credit-bearing college math course (e.g., well prepared, prepared, and needs preparation). It is assumed that these performance level cut scores will be set by Achieve based on the recommendations of the standard setting panel. While Pearson and Achieve do not recommend that states set separate or lower standards for demonstrating proficiency, the establishment of a college-ready performance level by Achieve does not preclude individual institutions of higher education, or consortia of institutions (e.g., California state colleges), from establishing their own cut scores for placement into particular courses or programs.

4.3.2 Setting Cut Scores

The process will include multiple rounds, discussion, the use of impact data, etc., as described in the briefing book article (Haertel, 2002). Based on feedback received from the standard setting panel, Pearson will document a recommendation for cut points to Achieve. Achieve will make the final performance level decision. The Algebra II standard setting agenda, as well as the agenda for the half-day meeting of the Assessment Consortium states following standard setting, is shown below. The ADP Algebra I End-of-Course Exam standard setting meeting is scheduled for July 20-21, immediately before the Algebra II standard setting; however, there is no explicit link planned between the two scales.
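With two cuts and three levels, the classification that ultimately appears on score reports reduces to a simple threshold rule. The sketch below uses placeholder cut scores; the actual cuts are what the panel will recommend and Achieve will finalize.

```python
# Hypothetical mapping from a scale score to the three performance levels.
# The cut values are placeholders, not the cuts set at standard setting.

def performance_level(score, cut_prepared=35, cut_well_prepared=50):
    """Classify a score; each cut is the lowest score of its level."""
    if score >= cut_well_prepared:
        return "Well Prepared"
    if score >= cut_prepared:
        return "Prepared"
    return "Needs Preparation"

for s in (20, 40, 60):
    print(s, performance_level(s))
```

Note the convention that a cut score is the minimum score earning the level, so a student exactly at a cut is classified into the higher level.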
Day 1 (Wednesday, July 22, 2009)

8:30-9:00 Registration/Breakfast
9:00-9:30 Opening Remarks & Introductions
9:30-10:45 Overview
- The American Diploma Project
- The ADP Algebra II End-of-Course Exam
- ADP Algebra II Exam Standard Setting Approach
- Validity Studies to Inform Standard Setting
10:45-11:00 Break
11:00-12:00 Review the Spring 2009 Operational Test Items
12:00-1:00 Lunch
1:00-2:00 Review and Discuss Results from the Concurrent Validity Studies
- National Exams
- State Exams
2:00-3:00 Review and Discuss Results from Cross-Sectional Studies
- Predictive Studies
- Contrasting Groups
3:00-3:15 Break
3:15-4:30 Review and Discuss the Results of the Judgment Studies
Day 2 (Thursday, July 23, 2009)

8:30-9:00 Breakfast
9:00-9:30 Mapping to Performance Level Descriptors
9:30-10:00 Cross-Walk of Cut Scores
10:00-10:30 Recap of All Data
10:30-10:45 Break
10:45-12:00 Make Round 1 Judgments on Cut Scores
12:00-1:00 Lunch
1:00-1:30 Small-group discussion of table agreement data from Round 1
1:30-2:00 Make Round 2 Judgments
2:00-2:30 Break
2:30-3:00 Large-group discussion of agreement from Round 2
3:00-4:00 Present and discuss impact data from Round 2

Day 3 (Friday, July 24, 2009)

8:30-9:00 Breakfast
9:00-10:00 Make Round 3 Judgments
10:00-10:30 Large-group discussion of agreement and impact data from Round 3
10:30-11:00 Complete exit survey
11:00-11:30 Check in materials; adjourn for CDT meeting
11:30-12:30 Break/Lunch

CDT Meeting

12:30-12:45 Opening Remarks
12:45-1:45 Algebra I
- Review standard setting process
- Results of each round
- Impact data
1:45-3:45 Algebra II
- Review standard setting process
- Results of each round
- Impact data
3:45-4:30 Recommendations for Achieve & Next Steps
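The "impact data" reviewed between rounds in the agenda above is simply the percentage of examinees that a candidate pair of cuts would place in each performance level, typically computed at a round summary such as the median of the panelists' judgments. The scores and judgments below are invented for illustration.

```python
# Impact data: percent of examinees each candidate cut pair would place in
# each of the document's three performance levels. All numbers are invented.

def impact(scores, cut_prepared, cut_well_prepared):
    """Percent of examinees in each performance level under the given cuts."""
    n = len(scores)
    well = sum(1 for s in scores if s >= cut_well_prepared)
    prep = sum(1 for s in scores if cut_prepared <= s < cut_well_prepared)
    needs = n - well - prep
    return {"Needs Preparation": 100.0 * needs / n,
            "Prepared": 100.0 * prep / n,
            "Well Prepared": 100.0 * well / n}

# A round's "Prepared" cut might be summarized as the median panelist judgment.
round2_cuts = [30, 32, 33, 35, 35, 36, 38]
cut_prepared = sorted(round2_cuts)[len(round2_cuts) // 2]  # median = 35

scores = [12, 18, 22, 25, 28, 31, 33, 36, 40, 44, 49, 55, 61, 66, 70]
print(impact(scores, cut_prepared=cut_prepared, cut_well_prepared=50))
```

Seeing these percentages shift as candidate cuts move is what lets panelists weigh the policy consequences of each recommendation between rounds.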
4.3.3 Performance Level Descriptors (PLDs)

The following ADP Algebra II performance levels were recommended by Pearson and approved by Achieve and the ADP Assessment Consortium:

Well-Prepared:
Prepared:
Needs Preparation:

There will be two types of PLDs associated with each performance level. The first will be empirically-based PLDs (e.g., "given a score of X on the ADP Algebra II exam, the probability of the student achieving a grade of C or better is Y"). Empirically-based PLDs are an expected outcome of the cross-sectional validity studies. It is expected that these probability statements, or empirical PLDs, will be used to inform standard setting but will not be used in a more public format. The second type of PLD at each level will be content-based. The content-based PLDs were drafted at the judgment study meetings and describe what a student can do in terms of Algebra II content at each performance level. They will appear on the student score reports. The content PLDs were reviewed and approved by Achieve and the Assessment Consortium.
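The empirical PLD statements above ("given a score of X, the probability of a grade of C or better is Y") come from the logistic-regression step described for the predictive studies. The sketch below fits such a model on synthetic data; everything here (scores, outcomes, fitting details) is an illustration, not study results.

```python
import math

# Illustrative logistic regression of "grade C or better" on exam score,
# the kind of model behind the empirical PLDs. Data are synthetic.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit P(y=1) = sigmoid(a + b*z), z = standardized x, by gradient ascent."""
    n = len(xs)
    mean = sum(xs) / n
    std = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    zs = [(x - mean) / std for x in xs]
    a = b = 0.0
    for _ in range(steps):
        ga = gb = 0.0
        for z, y in zip(zs, ys):
            err = y - sigmoid(a + b * z)
            ga += err
            gb += err * z
        a += lr * ga / n
        b += lr * gb / n
    # Return a probability function on the original score scale.
    return lambda x: sigmoid(a + b * (x - mean) / std)

# Hypothetical (score, C-or-better) outcomes on the 0-76 raw-score scale.
scores  = [10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65]
success = [ 0,  0,  0,  0,  1,  0,  1,  1,  1,  1,  1,  1]
prob_c_or_better = fit_logistic(scores, success)
for s in (20, 40, 60):
    print(f"score {s}: P(C or better) = {prob_c_or_better(s):.2f}")
```

Evaluating the fitted curve at a grid of scores yields exactly the "score X gives probability Y" tables described above; separate fits for B-or-better and A-or-better outcomes complete the empirical PLDs.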
Only minor edits, if necessary, are expected as a result of standard setting.

4.4 Timeline

- Presented standard setting plan to the CDT (conference call) (Pearson, CDT, Achieve; September 18, 2008)
- Presented research and standard setting plan to the Research Alliance (Pearson/Achieve; October 3, 2008)
- Presented revised standard setting plan with specific performance levels and overall scale range to the CDT (Pearson; October 22-23, 2008)
- Revised performance levels and scale values and presented them to the CDT (conference call) (Pearson; November 20, 2008 and March 5, 2009)
- Content-based PLDs drafted (Judgment Study Meeting Participants/Pearson; February 19-March 12, 2009)
- Achieve and State Content Leads provide feedback on content-based PLDs (Achieve/State Content Leads; March 12-19, 2009)
- The CDT provides feedback on content-based PLDs (CDT; March 24-27, 2009)
- Content-based PLDs final for report mock-ups; scales and performance levels final (Pearson; March 30, 2009)
- Empirical PLDs created from cross-sectional studies (Pearson; June 8-19, 2009)
- Standard setting meeting panelists make cut score recommendations to Achieve (CDT/experts; July 22-24, 2009)
- Achieve finalizes cut scores and provides them to Pearson for reporting; any requirements for minor changes to PLDs and core competencies are also provided (Achieve; July 28, 2009)
4.5 Risks

The primary risk for this plan is the timeliness of information from the cross-sectional studies for those data collections that are based on the spring 2009 college semester. In these cases, auxiliary data (e.g., college grades, demographic information, additional test scores) will have to be collected after the spring 2009 semester is completed. These data will need to be collected, analyzed, and summarized prior to finalizing the briefing books in early July.

The clustering of student scores well below 50% of the points available on the exam is also a concern. While it is possible that student performance will improve once curriculum and instruction are aligned with the ADP Algebra II exam standards in all states, additional research will be required. Validity evidence collected prior to standard setting was very influential in reducing the number of performance levels from four to three for this exam.

A traditional standard setting script will not work well for this meeting. The script will be drafted as results are obtained from the various studies so that the discussion topics can be included.

5.0 References

Buckendahl, C.W., Smith, R.W., Impara, J.C., & Plake, B.S. (2002). A comparison of Angoff and Bookmark standard setting methods. Journal of Educational Measurement, 39.

Haertel, E. H. (2002). Standard setting as a participatory process: Implications for validation of standards-based accountability programs. Educational Measurement: Issues and Practice, 21(1).

Impara, J.C., & Plake, B.S. (1997). Standard setting: An alternative approach. Journal of Educational Measurement, 34.

Phillips, S.E. (2002). Setting standards on the TAKS tests: A modified item mapping procedure. Texas Education Agency.

Plake, B.S., Impara, J.C., Buckendahl, C.W., & Ferdous, A.A. (2005). Setting multiple performance standards using the yes/no method: An alternative to item mapping procedure. Paper presented at the 2005 Annual Meeting of the National Council on Measurement, Montreal, Canada.
Appendix A: Background Notes from the Research Alliance

November 2007

At the fall 2007 meeting, a considerable portion of the agenda was devoted to standard setting and to developing a framework for validating the Algebra II exam. As the meeting progressed, the two topics became intertwined. By the end of the meeting, three recommendations were made:

1. Base performance standard(s) on the probability of success in a college course, rather than simply on a description of the content knowledge and skills demonstrated.
2. Delay the standard setting meetings (i.e., setting the cutscore(s)) from summer 2008 to summer 2009 to allow time for the collection of empirical evidence to inform the decision.
3. Identify, plan, and execute specific validity studies that will support standard setting efforts in summer 2009. (These validity studies to inform standard setting will not be exhaustive, and some ADP Algebra II exam validity studies may not be complete by summer 2009.)

March 2008

For the March 2008 Research Alliance meeting, proposed validity studies were included as an agenda topic. After lengthy discussion of the proposed scale anchoring study, the Research Alliance recommended postponing further discussion of validity studies until plans for standard setting activities through summer 2009 were more fully developed. Research Alliance members indicated that presenting the proposed validity studies at the next meeting within the context of standard setting would facilitate their evaluation, and they recommended prioritizing those studies in terms of factors such as relevance, appropriateness, and feasibility within the available time frame. For example, all else being equal, a validity study that could contribute to the standard setting effort might be evaluated more closely or given scheduling priority over a study with less direct impact on standard setting.
In particular, the Research Alliance recommended that Pearson and Achieve develop a plan for the standard setting scheduled for spring 2009 and, as part of that plan, describe: 1) the number and types of validity studies needed to inform the standard setting; and 2) how the results of these studies will be incorporated into the standard setting.

June 2008

During the June 2008 meeting, the Research Alliance members reviewed an operational form of the ADP Algebra II End-of-Course Exam. Concerns were raised about the length of the exam and a possibly insufficient floor compared to other exams used in the transition from high school to college, as well as in comparison to college Algebra exams. The bulk of the discussion focused on the range of possible research studies that would be useful in serving the three goals of the ADP program, with the recognition that the most pressing need was the collection of sufficient (in both quantity and quality) validity evidence to inform standard setting in August. Andy Porter provided the following succinct summary of the Research Alliance's advice at the June meeting:

- Some appropriate persons should think hard about the current operational form of the Algebra II end-of-course exam. The form may be too long to fit nicely within school periods, and there may be insufficient floor (the test may be too difficult) to allow the setting of multiple levels of standards and use of the test as a predictor of college readiness.
- Apparently, the Algebra II test needs to be untimed. At the same time, care needs to be taken that the conditions under which the test is taken are comparable from one state to
the next. A clear statement needs to be made as to how much time should be set aside and the protocol to be followed when giving the test.
- All agree that plans and initial steps should be undertaken immediately so that, over time, there will be results from a longitudinal study that tracks students who took the end-of-course exam in high school into their college or non-college experiences and determines the extent to which performance on the end-of-course test predicts whether they had to take remedial math and their grade in their first credit-bearing math course.
- The quality of the comparability study that Pearson has completed is not sufficient to draw conclusions about comparability between the paper-and-pencil and online tests. Although the comparability design was sufficient, after attrition the final sample is too small, and the quality of random assignment was apparently compromised. Further work is necessary (and will be included in the operational testing plans going forward).

The boundaries between the validity studies that need to be done and the studies needed in anticipation of standard setting are happily blurred. The following three pairs of studies were identified as applicable for both purposes:

A. Judgment studies
   i. A bookmark or other type of judgment study conducted to determine a recommended cut point for being judged proficient in the Algebra II course at the time the course is completed. Panelists might include a mix of high school teachers and those who were involved in setting the content standards for the course.
   ii. A bookmark or other type of judgment study to determine the level of performance necessary to take a credit-bearing college Algebra course in state 4-year colleges/universities. Panelists would be a mix of college professors teaching college Algebra, perhaps some teaching remedial courses, and high school teachers.
B. Concurrent validity studies
   i. Using the end-of-course exam to predict performance on the ACT
   ii. Using the end-of-course exam to predict performance on the SAT
C. Cross-sectional studies in which college students take the Algebra II end-of-course test as they begin a credit-bearing college Algebra course, or as they begin a remedial course
   i. A study for 2-year colleges
   ii. A study for 4-year state colleges/universities

The judgment studies would not be data free, but rather based on student performance. For example, in the bookmark method, student performance is used to create the ordered item booklet and to provide impact data. For all of the data collection efforts, care should be taken to make the data sets representative of the "frame" they are to represent.

It may be that readiness for a credit-bearing course in college Algebra is no different for 2-year colleges than for 4-year colleges. If that is the case, then cuts might be made to distinguish between enrolling in a remedial course versus a credit-bearing course, an X probability of getting a C or better, and an X probability of getting a B or better, where X might be 60 percent or whatever seems reasonable. If the 2-year and 4-year colleges differ enough that readiness must be defined differently for each, then an alternative might be to say that one cut represents readiness for 2-year credit-bearing courses and the second cut represents readiness for 4-year credit-bearing courses.
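The probability-based cuts described above can be made concrete with a score-to-success model: fit, for example, a logistic regression of "earned a C or better in credit-bearing college Algebra" on exam scale score, then invert it to find the lowest score whose predicted success probability reaches the chosen X. A sketch of that inversion; the intercept, slope, and score scale below are all made-up values for illustration, not fitted results:

```python
import math

# Hypothetical logistic model: P(C or better | exam scale score)
INTERCEPT, SLOPE = -6.0, 0.04  # made-up coefficients for illustration

def p_success(score):
    """Predicted probability of earning a C or better at a given scale score."""
    return 1.0 / (1.0 + math.exp(-(INTERCEPT + SLOPE * score)))

def cut_score(target_p):
    """Lowest integer scale score whose predicted success meets target_p."""
    # Invert the logistic: score = (logit(p) - intercept) / slope
    logit = math.log(target_p / (1.0 - target_p))
    return math.ceil((logit - INTERCEPT) / SLOPE)

cut = cut_score(0.60)  # the "X = 60 percent" example from the text
```

A second cut for "B or better" would use the same inversion against a separate model fit to the B-or-better outcome.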
October 2008

The purpose of the meeting was to share with the Research Alliance:

- Results from the spring 2008 ADP Algebra II End-of-Course Exam administration, within the broader context of the Achieve report;
- Lessons learned and changes made after the spring exam;
- An overview of the research studies that will be conducted to inform standard setting;
- Plans for Algebra II standard setting and reporting;
- An update on the ADP Algebra I End-of-Course Exam.

Below is a partial summary of the advice received during the meeting.

- The results from the first administration need to be viewed within the broader context, as presented in Achieve's annual report (e.g., number of students enrolled in Algebra II, number of students tested, and percent of students enrolled in Algebra II who were tested, by state).
- When full census testing is not possible within a state, it was recommended that Pearson and Achieve work with state leaders to obtain representative samples, so that students tested within a state match the demographics of the state.
- Communications should be framed in the context of what the exam means for overall mathematics reform initiatives for policymakers, and should encourage students to keep taking math courses as they prepare for college and work. Along these lines, more information is needed from the states about math course sequencing to provide context around the exam.
- Members reiterated the importance of being able to communicate to stakeholders the value of the exam: why should they give up so much time to administer the test?
- Even for students who do well on the exam, reports need to communicate that students are on track to be prepared and emphasize the need to continue taking math courses while still in high school.
- Caution needs to be exercised when writing performance level descriptors for the lowest achievement levels: you don't want to tell kids that it is too late to be prepared for college.
At the meeting, Pearson and Achieve clarified that college readiness is defined in terms of student readiness to take college-level Algebra without remediation. As a result, the Research Alliance recommended changes to the judgment study tasks, participants, and room/table assignments. Pearson was also asked to clarify the main question it is trying to answer with each of the three types of validity studies. These suggestions have been incorporated into the current version of this document.
Appendix B: Research Alliance Member List (*Chair)

- Dr. Eva Baker, Co-Director, UCLA Center for the Study of Evaluation (CSE)/National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
- Dr. David Bressoud, DeWitt Wallace Professor & Chair, Mathematics and Computer Science, Macalester College
- Dr. Wayne J. Camara, Vice President, Research & Analysis, College Board
- Dr. Francis (Skip) Fennell, Professor of Education, McDaniel College
- Dr. Henry (Hank) Kepner, Professor, Mathematics Education, Department of Curriculum & Instruction and Mathematical Sciences, University of Wisconsin-Milwaukee
- Dr. Robert Linn, Co-Director, CRESST, University of Colorado at Boulder
- Dr. Bernard Madison, Professor of Mathematics, Department of Mathematical Sciences, University of Arkansas
- *Dr. Andrew C. Porter, Dean, University of Pennsylvania Graduate School of Education
- Dr. William Schmidt, University Distinguished Professor, Michigan State University
- Dr. James Sellers, Associate Professor and Director of Undergraduate Mathematics, Penn State University
- Dr. Sharif Shakrani, Co-Director, Education Policy Center, Michigan State University
- Dr. David S. Spence, President, Southern Regional Education Board
- Dr. Uri Treisman, Director, Charles A. Dana Center
- Dr. Jon S. Twing, Senior Vice President, Test, Measurement & Research Services, Pearson
Appendix B
Appendix B1: Alignment of State Algebra II Standards to the ADP Algebra II End-of-Course Exam Standards

To determine how the ADP Algebra II End-of-Course (EOC) Exam compares to Algebra II expectations across states, Achieve compared the ADP Algebra II End-of-Course Exam Standards with the Algebra II course standards and end-of-course exam standards from 15 states (see Table 1 for the list of documents reviewed). Of the 15 states included in the analysis, ten are members of the American Diploma Project (ADP) Network and share a similar commitment to preparing students for college and the workforce. Two of the states are also members of the ADP Assessment Consortium.

Below is a summary of the state standards documents reviewed. Achieve experts reviewed exam standards first from states in which Algebra II end-of-course exams exist, then from states in which there is a defined set of Algebra II curriculum standards. All of these standards were gathered from state websites. This sample should not be considered exhaustive: these states may have additional Algebra II course and/or exam standards that are not listed, and other states may have standards that were not included in the analysis. It is important to note that although additional states may have Algebra II courses, there may not be a state-defined curriculum that could be evaluated in this study.
Table 1: State Documents Reviewed in Analysis

State | Course/Exam Standards Reviewed | ADP Network State | ADP Assessment Consortium State
Alabama | Algebra II course standards; Algebra II with Trigonometry course standards | Yes | No
California | Algebra II End-of-Course exam standards | Yes | No
District of Columbia | Algebra II course standards | No | No
Florida | Algebra II course standards; Algebra II Honors course standards | Yes | Yes
Georgia | Algebra II course standards | Yes | No
Indiana | Algebra II course standards | Yes | Yes
Mississippi | Algebra II course standards | Yes | No
New York | Algebra II with Trigonometry course standards | No | No
Oklahoma | Algebra II End-of-Course exam standards | Yes | No
South Carolina | Algebra II course standards | No | No
Tennessee | Algebra II course standards | Yes | No
Texas | Algebra II course standards | Yes | No
Utah | Algebra II course standards | No | No
Virginia | Algebra II End-of-Course exam standards; Algebra, Functions, and Data Analysis course standards; Algebra II course standards; Algebra II with Trigonometry course standards | Yes | No
West Virginia | Algebra II course standards (mastery level) | No | No

Methodology

Achieve content experts conducted a comparative analysis of the standards from these states against the ADP Algebra II End-of-Course (EOC) Exam Standards. Within each set of state standards reviewed, each benchmark was mapped to the corresponding benchmark within the EOC standards. The ADP EOC standards comprise a core set of 41 benchmarks in five content standards, plus seven optional modules. The core content standards included in the ADP Algebra II EOC Exam are: Operations on Numbers and Expressions; Equations and Inequalities; Polynomial and Rational Functions; Exponential Functions; and Function Operations and Inverses. The optional modules are: Data and Statistics; Probability; Logarithmic Functions; Trigonometric Functions; Matrices; Conic Sections; and Sequences and Series. For the purposes of this study, only the core standards of the ADP Algebra II End-of-Course Exam were analyzed. The modules were not included because they have not yet been administered operationally and will not be taken into consideration during the standard setting process.

The data were examined from two different perspectives:

- The first analysis was conducted to understand the degree to which state Algebra II course and exam standards are aligned, as a whole, with the ADP EOC Assessment.
- The second analysis was conducted to better understand how prominently the core ADP Algebra II Exam benchmarks are currently represented in state course and exam standards.

It is important to note that it is unlikely that a student would take more than one of the courses offered in a particular state.
Therefore, in states that had more than one course or exam, each set of course or exam standards was analyzed individually.
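The alignment percentages reported in this appendix are simple proportions: each state document's count of fully or partially aligned benchmarks over the 41 core benchmarks, and each benchmark's count of matching documents over the 20 state documents. A sketch of the calculation, with example counts taken from the tables in this appendix:

```python
CORE_BENCHMARKS = 41   # core ADP Algebra II EOC benchmarks
STATE_DOCUMENTS = 20   # state standards documents analyzed

def pct(n, total):
    """Percent aligned, rounded to the nearest whole percent as reported."""
    return round(100 * n / total)

# Per-document alignment (Table 2): e.g. Alabama's Algebra II course
# standards matched 26 of the 41 core benchmarks.
alabama = pct(26, CORE_BENCHMARKS)    # 63%

# Per-benchmark alignment (Table 3): e.g. benchmark O2.a had counterparts
# in 15 of the 20 state documents reviewed.
o2a = pct(15, STATE_DOCUMENTS)        # 75%
```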
Findings

First Analysis

Of the state documents reviewed, most (18 of 20) are 34% to 66% aligned to the EOC benchmarks. Only two sets of state standards are less than 33% aligned. An interesting finding is that not one set of state standards is 100% aligned to the EOC standards, as shown in Table 2. The second analysis details where the highest alignment occurred.

Second Analysis

In comparing the 41 EOC benchmarks across the state standards, seven of the EOC benchmarks have counterparts in more than 75% of the state documents analyzed, and nine benchmarks have counterparts in fewer than 25% of the state documents. Of the remaining 25 EOC benchmarks, the majority (16) have counterparts in 50%-74% of the state documents reviewed. Interestingly, no single benchmark is included in all of the state documents, as shown in Table 3.

The ADP Algebra II Exam benchmarks that had counterparts in more than 75% of the state documents were:

- Represent complex numbers in the form a + bi, where a and b are real; simplify powers of pure imaginary numbers. (O2.a.)
- Perform operations on rational expressions, including complex fractions. (O3.e.)
- Solve equations and inequalities involving the absolute value of a linear expression. (E1.a.)
- Solve single variable quadratic equations and inequalities over the complex numbers; graph real solution sets on a number line. (E2.b.)
- Determine key characteristics of quadratic functions and their graphs. (P1.a.)
- Represent quadratic functions using tables, graphs, verbal statements, and equations. Translate among these representations. (P1.b.)
- Describe and represent the effect that changes in the parameters of a quadratic function have on the shape and position of its graph. (P1.c.)
Many of the EOC benchmarks that appear in fewer than 25% of the state documents address topics at the extreme ends of an Algebra II course: either lower-level content typically considered to belong in an Algebra I course (e.g., operations with radicals) or more challenging curriculum that pushes the limits of the traditional Algebra II course (e.g., piecewise and step functions). However, without an analysis of state courses before and after Algebra II in the course sequence, it is difficult to determine whether the content is not covered by the state curriculum at all or is simply not covered by an Algebra II, or equivalent, course.

The ADP Algebra II exam content standards that have the highest degree of alignment with the state documents are Equations and Inequalities, Polynomial and Rational Functions, and Exponential Functions. Function Operations and Inverses, often considered a group of topics that pushes the limits of the traditional Algebra II course, has the lowest alignment with the state documents.
As it is not possible for the ADP Algebra II EOC Exam to cover all topics, the ones chosen for the core exam are the topics that the states felt were most relevant and assessable for an end-of-course exam. Across the 20 state documents analyzed, there are a few topics that occur in at least ten of the state documents (50% and above) but do not appear in the core ADP Exam. Seven of these topics are part of the modules for the Exam:

- Represent piecewise-defined functions using tables, graphs, verbal statements, and equations. Translate among these representations. (F.3.b)
- Use a computer or calculator to find a linear regression equation (least squares line) as a model for data that suggest a linear trend, and determine the correlation coefficient. (S.1.c)
- Apply the properties of logarithms and use them to manipulate logarithmic expressions. (L.1.a)
- Represent 2-variable and 3-variable systems of linear equations using matrices and use them to solve the system. (M.2.c)
- Identify a parabola, circle, ellipse, or hyperbola from its equation, description, or key characteristics. (C.1.a)
- Represent conic sections whose axes are parallel to the x- and y-axes using graphs, verbal statements, and equations. Translate among these representations. Represent the equations of conic sections in multiple forms to extract information about the parabola, circle, ellipse, or hyperbola. (C.1.b)
- Represent the general term of an arithmetic or geometric sequence and use it to generate the sequence or determine the value of any particular term. (I.1.a)

Across the state documents reviewed, no particular module is more heavily represented; rather, a topic here or there appears from each module. It is possible that the ADP Consortium states will utilize the modules in the future to expand their curriculum. Five of the topics in common among the 20 sets of documents are not part of the ADP Algebra II EOC Exam's core or modules.
There might be elements of these benchmarks contained in the ADP exam, but not enough to be considered in alignment with those state documents. The common topics are:

- Finding linear domain and range
- Function transformations
- Direct, inverse, and joint variation
- Collecting and modeling with data
- Variance and standard deviation
Summary

The purpose of this study was to determine how the ADP Algebra II Exam compares to Algebra II expectations across the states. The findings indicate that there is great variation in Algebra II courses; although there are some common topics, the benchmarks and skills vary within those topics. The ADP Algebra II End-of-Course Exam aligns partially with the majority of the state standards analyzed, but there is not perfect alignment, even on the topics covered by this exam. In other words, the ADP Algebra II End-of-Course Exam, although not a perfect match, addresses many of the topics found in the state standards documents analyzed.
Table 2: State Standard Documents Compared to ADP Algebra II End-of-Course Exam Standards

State | Course or Exam Standards Reviewed | Number of Benchmarks Fully or Partially Aligned | Percent of Benchmarks Fully or Partially Aligned
Alabama | Algebra II course standards | 26 | 63%
Alabama | Algebra II with Trigonometry course standards | 19 | 46%
California | Algebra II End-of-Course exam standards | 16 | 39%
District of Columbia | Algebra II course standards | 26 | 63%
Florida | Algebra II course standards | 25 | 61%
Florida | Algebra II Honors course standards | 20 | 49%
Georgia | Algebra II course standards | 22 | 54%
Indiana | Algebra II course standards | 22 | 54%
Mississippi | Algebra II course standards | 13 | 32%
New York | Algebra II with Trigonometry course standards | 17 | 41%
Oklahoma | Algebra II End-of-Course exam standards | 23 | 56%
South Carolina | Algebra II course standards | 18 | 44%
Tennessee | Algebra II course standards | 22 | 54%
Texas | Algebra II course standards | 17 | 41%
Utah | Algebra II course standards | 17 | 41%
Virginia | Algebra II End-of-Course exam standards | 25 | 61%
Virginia | Algebra, Functions, and Data Analysis course standards | 7 | 17%
Virginia | Algebra II course standards | 21 | 51%
Virginia | Algebra II with Trigonometry course standards | 19 | 46%
West Virginia | Algebra II course standards (mastery level) | 20 | 49%
Table 3: Analysis of ADP Algebra II End-of-Course Exam Standards Benchmarks Compared to State Standard Documents

For each benchmark, the columns give the number and percent of the 20 state documents that are partially aligned, fully aligned, and aligned in total (partially or fully).

CORE: Operations on Numbers and Expressions

O1 Real Numbers
O1.a. Convert between and among radical and exponential forms of numerical expressions. | Partially: 0 (0%) | Fully: 2 (10%) | Aligned: 2 (10%)
O1.b. Simplify and perform operations on numerical expressions containing radicals. | Partially: 0 (0%) | Fully: 1 (5%) | Aligned: 1 (5%)
O1.c. Apply the laws of exponents to numerical expressions with rational and negative exponents to order and rewrite them in alternative forms. | Partially: 1 (5%) | Fully: 3 (15%) | Aligned: 4 (20%)

O2 Complex Numbers
O2.a. Represent complex numbers in the form a + bi, where a and b are real; simplify powers of pure imaginary numbers. | Partially: 3 (15%) | Fully: 12 (60%) | Aligned: 15 (75%)
O2.b. Perform operations on the set of complex numbers. | Partially: 0 (0%) | Fully: 13 (65%) | Aligned: 13 (65%)

O3 Algebraic Expressions
O3.a. Convert between and among radical and exponential forms of algebraic expressions. | Partially: 0 (0%) | Fully: 6 (30%) | Aligned: 6 (30%)
O3.b. Simplify and perform operations on radical algebraic expressions. | Partially: 2 (10%) | Fully: 12 (60%) | Aligned: 14 (70%)
O3.c. Apply the laws of exponents to algebraic expressions, including those involving rational and negative exponents, to order and rewrite them in alternative forms. | Partially: 5 (25%) | Fully: 7 (35%) | Aligned: 12 (60%)
O3.d. Perform operations on polynomial expressions. | Partially: 6 (30%) | Fully: 5 (25%) | Aligned: 11 (55%)
O3.e. Perform operations on rational expressions, including complex fractions. | Partially: 2 (10%) | Fully: 13 (65%) | Aligned: 15 (75%)
O3.f. Identify or write equivalent algebraic expressions in one or more variables to extract information. | Partially: 1 (5%) | Fully: 0 (0%) | Aligned: 1 (5%)
CORE: Equations and Inequalities

E1 Linear Equations and Inequalities
E1.a. Solve equations and inequalities involving the absolute value of a linear expression. | Partially: 2 (10%) | Fully: 13 (65%) | Aligned: 15 (75%)
E1.b. Express and solve systems of linear equations in three variables with and without the use of technology. | Partially: 6 (30%) | Fully: 8 (40%) | Aligned: 14 (70%)
E1.c. Solve systems of linear inequalities in two variables and graph the solution set. | Partially: 0 (0%) | Fully: 14 (70%) | Aligned: 14 (70%)
E1.d. Recognize and solve problems that can be represented by single variable linear equations or inequalities or systems of linear equations or inequalities involving two or more variables. Interpret the solution(s) in terms of the context of the problem. | Partially: 3 (15%) | Fully: 9 (45%) | Aligned: 12 (60%)

E2 Nonlinear Equations and Inequalities
E2.a. Solve single-variable quadratic, exponential, rational, radical, and factorable higher-order polynomial equations over the set of real numbers, including quadratic equations involving absolute value. | Partially: 13 (65%) | Fully: 0 (0%) | Aligned: 13 (65%)
E2.b. Solve single variable quadratic equations and inequalities over the complex numbers; graph real solution sets on a number line. | Partially: 5 (25%) | Fully: 12 (60%) | Aligned: 17 (85%)
E2.c. Use the discriminant, b^2 - 4ac, to determine the nature of the solutions of the equation ax^2 + bx + c = 0. | Partially: 0 (0%) | Fully: 8 (40%) | Aligned: 8 (40%)
E2.d. Graph the solution set of a two-variable quadratic inequality in the coordinate plane. | Partially: 0 (0%) | Fully: 0 (0%) | Aligned: 0 (0%)
E2.e. Rewrite nonlinear equations and inequalities to express them in multiple forms in order to facilitate finding a solution set or to extract information about the relationships or graphs indicated. | Partially: 1 (5%) | Fully: 0 (0%) | Aligned: 1 (5%)

CORE: Polynomial and Rational Functions

P1 Quadratic Functions
P1.a. Determine key characteristics of quadratic functions and their graphs. | Partially: 3 (15%) | Fully: 13 (65%) | Aligned: 16 (80%)
P1.b. Represent quadratic functions using tables, graphs, verbal statements, and equations. Translate among these representations. | Partially: 7 (35%) | Fully: 10 (50%) | Aligned: 17 (85%)
P1.c. Describe and represent the effect that changes in the parameters of a quadratic function have on the shape and position of its graph. | Partially: 2 (10%) | Fully: 13 (65%) | Aligned: 15 (75%)
P1.d. Recognize, express, and solve problems that can be modeled using quadratic functions. Interpret their solutions in terms of the context. | Partially: 3 (15%) | Fully: 9 (45%) | Aligned: 12 (60%)

P2 Higher-order Polynomial and Rational Functions
P2.a. Determine key characteristics of power functions in the form f(x) = ax^n for positive integral values of n and their graphs. | Partially: 0 (0%) | Fully: 1 (5%) | Aligned: 1 (5%)
P2.b. Determine key characteristics of polynomial functions and their graphs. | Partially: 1 (5%) | Fully: 7 (35%) | Aligned: 8 (40%)
P2.c. Represent polynomial functions using tables, graphs, verbal statements, and equations. Translate among these representations. | Partially: 6 (30%) | Fully: 2 (10%) | Aligned: 8 (40%)
P2.d. Determine key characteristics of simple rational functions and their graphs. | Partially: 1 (5%) | Fully: 5 (25%) | Aligned: 6 (30%)
P2.e. Represent simple rational functions using tables, graphs, verbal statements, and equations. Translate among these representations. | Partially: 5 (25%) | Fully: 4 (20%) | Aligned: 9 (45%)
P2.f. Recognize, express, and solve problems that can be modeled using polynomial and simple rational functions. Interpret their solutions in terms of the context. | Partially: 7 (35%) | Fully: 3 (15%) | Aligned: 10 (50%)

CORE: Exponential Functions

X1 Exponential Functions
X1.a. Determine key characteristics of exponential functions and their graphs. | Partially: 3 (15%) | Fully: 8 (40%) | Aligned: 11 (55%)
X1.b. Represent exponential functions using tables, graphs, verbal statements, and equations. Represent exponential equations in multiple forms. Translate among these representations. | Partially: 6 (30%) | Fully: 7 (35%) | Aligned: 13 (65%)
X1.c. Describe and represent the effect that changes in the parameters of an exponential function have on the shape and position of its graph. | Partially: 0 (0%) | Fully: 8 (40%) | Aligned: 8 (40%)
X1.d. Recognize, express, and solve problems that can be modeled using exponential functions, including those where logarithms provide an efficient method of solution. Interpret their solutions in terms of the context. | Partially: 2 (10%) | Fully: 10 (50%) | Aligned: 12 (60%)

CORE: Function Operations and Inverses

F1 Function Operations
F1.a. Combine functions by addition, subtraction, multiplication, and division. | Partially: 0 (0%) | Fully: 10 (50%) | Aligned: 10 (50%)
F1.b. Determine the composition of two functions, including any necessary restrictions on the domain. | Partially: 1 (5%) | Fully: 12 (60%) | Aligned: 13 (65%)

F2 Inverse Functions
F2.a. Describe the conditions under which an inverse relation is a function. | Partially: 2 (10%) | Fully: 4 (20%) | Aligned: 6 (30%)
F2.b. Determine and graph the inverse relation of a function. | Partially: 2 (10%) | Fully: 7 (35%) | Aligned: 9 (45%)

F3 Piecewise-defined Functions
F3.a. Determine key characteristics of absolute value, step, and other piecewise-defined functions. | Partially: 1 (5%) | Fully: 3 (15%) | Aligned: 4 (20%)
F3.b. Represent piecewise-defined functions using tables, graphs, verbal statements, and equations. Translate among these representations. | Partially: 7 (35%) | Fully: 4 (20%) | Aligned: 11 (55%)
F3.c. Recognize, express, and solve problems that can be modeled using absolute value, step, and other piecewise-defined functions. Interpret their solutions in terms of the context. | Partially: 1 (5%) | Fully: 1 (5%) | Aligned: 2 (10%)
Appendix B2: Aligned Expectations

Following are excerpts from Aligned Expectations? A Closer Look at College Admissions and Placement Tests. For the full document, please visit
Aligned Expectations? A Closer Look at College Admissions and Placement Tests

American Diploma Project Network

April 2007
About Achieve

Created by the nation's governors and business leaders, Achieve, Inc., is a bipartisan, non-profit organization that helps states raise academic standards, improve assessments and strengthen accountability to prepare all young people for postsecondary education, work and citizenship. Achieve has helped more than half the states benchmark their academic standards, tests and accountability systems against the best examples in the United States and around the world. Achieve also serves as a significant national voice for quality in standards-based education reform and regularly convenes governors, CEOs and other influential leaders at National Education Summits to sustain support for higher standards and achievement for all of America's schoolchildren.

In 2005, Achieve co-sponsored the National Education Summit on High Schools. Forty-five governors attended the Summit along with corporate CEOs and K-12 and postsecondary leaders. The Summit was successful in making the case to the governors and business and education leaders that our schools are not adequately preparing students for college and 21st-century jobs and that aggressive action will be needed to address the preparation gap. As a result of the Summit, 29 states joined with Achieve to form the American Diploma Project Network, a coalition of states committed to aligning high school standards, assessments, graduation requirements and accountability systems with the demands of college and the workplace.

For more information, visit Achieve's Web site at www.achieve.org.

Published in April 2007. Copyright 2007 Achieve, Inc. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information or storage retrieval system, without permission from Achieve, Inc. Editorial assistance and design: KSA-Plus Communications, Inc.
Executive Summary

College admissions and placement tests play a crucial role in the American education system. More than 2 million students each year take admissions tests (the ACT or SAT), and the results help postsecondary institutions make critical decisions about who will go to college, where they will be admitted and the likelihood of their success in a broad range of college courses. College placement tests, meanwhile, are used by a majority of the nation's colleges and universities to determine which courses young people are prepared to enter.

Now, as many states seek to raise high school standards to ensure that more students graduate prepared for the demands of college and the workplace, college admissions and placement tests are being called upon to serve new purposes for which they were not originally designed. Some states, for example, are using the SAT and ACT as their official statewide high school graduation exam, incorporating these tests into their state assessment and accountability systems. Other states are considering whether college placement tests could be given to students while still in high school to provide early feedback on their level of readiness. Still others are developing end-of-course tests that align with their high school curriculum and tap college-ready content, or modifying their existing high school tests to make them better measures of college readiness.

Achieve launched this study to help inform the decisions states are making about high school assessments by providing greater insights into the world of college admissions and placement testing. Achieve analyzed more than 2,000 questions from college admissions and placement exams to determine how these tests compare to one another and how well they measure the college and work readiness benchmarks created by the American Diploma Project (ADP).
These benchmarks are being used by 29 states to align high school standards, curriculum, assessments and accountability systems with the demands of college and work. The study was conducted with the full cooperation of ACT (administrator of the ACT college admissions test and the COMPASS placement test) and the College Board (administrator of the SAT and the ACCUPLACER placement test), both of which provided Achieve with access to their admissions and placement exams. Achieve also acquired placement exams from a number of other organizations and postsecondary institutions around the country.

Findings

Achieve's ADP research shows that college faculty across states and institutions have a fairly consistent view of the rigorous level of reading, writing and mathematics skills that incoming freshmen need to be successful in first-year, credit-bearing college courses. In mathematics, students need knowledge and skills typically learned in a four-year mathematics sequence including Algebra I and II, Geometry, data analysis, and statistics. They also need sophisticated mathematical reasoning and problem-solving skills. In English, students need to be able to write and communicate effectively to different audiences, understand and analyze various types of complex informational texts, and apply sophisticated analytic and reasoning skills.

Achieve's analysis reveals that college admissions and placement tests vary considerably and do not fully measure the knowledge and skills that are included in the ADP benchmarks. Generally, admissions tests were found to be more demanding than the placement tests and better balanced in the types of questions asked.

Reading. College admissions tests in reading are more rigorous than placement tests, though the reading passages on placement tests more accurately reflect
the types of reading material students will encounter in college across the disciplines.

Writing. College admissions and placement tests in writing are rigorous (more rigorous than most high school tests), and they generally reflect the kind of writing students will be asked to do in college. Institution-developed placement tests are the strongest of the tests analyzed by Achieve.

Mathematics. Admissions and placement tests in mathematics emphasize algebra, which is critical for credit-bearing mathematics courses. However, the algebra content assessed tends to favor prealgebra and basic algebra over the advanced algebraic concepts and skills essential for college readiness. Although placement tests are narrowly focused on algebra, admissions tests are broader, measuring a range of other important topics such as data analysis, statistics and geometry.

Recommendations for K-12 Policymakers

To improve the preparation of high school students so that all graduates are prepared for college and work, states need a more rigorous and coherent system of assessments, one that is capable of measuring the standards students are expected to meet in high school and signaling readiness for postsecondary success. Very few states have such a system in place today. What are the implications of Achieve's study of admissions and placement exams for K-12 leaders as they seek to build college-ready tests into their high schools?

Augment admissions tests when incorporating them into statewide testing systems. The ACT and SAT are widely used by colleges and therefore have credibility with students, parents and the broader public. States that are considering incorporating these tests into their assessment and accountability systems, however, should proceed with caution. Achieve's analysis reveals that although admissions tests do some things very well, there are gaps in what they measure.
Neither the ACT nor the SAT includes the full range of advanced concepts and skills reflected in the ADP benchmarks and, increasingly, in state high school standards. To be effective, states need to augment the ACT and SAT with additional test questions or with additional performance measures to ensure stronger alignment with state standards and to assess the more advanced concepts and skills. Achieve encourages states that are considering incorporating the ACT or SAT into their state assessment and accountability systems to conduct independent alignment studies first and then work with ACT and the College Board to supplement the assessments as needed to ensure greater coherence and alignment.

Consider using end-of-course tests to tap higher-level content and skills and place students into college courses. A growing number of states are pursuing end-of-course tests in high school as a strategy for measuring college readiness at the upper grades (e.g., Algebra II and 11th grade English) while also better aligning tests with the high school curriculum. There are multiple benefits to this approach. End-of-course tests can be tied closely to the curriculum and to the courses that states require for graduation. They also are more sensitive to instruction because they are taken right after a student completes a course, and they allow states to monitor performance and ensure consistency of rigor across the state. For end-of-course tests to serve as an indicator of college readiness, states will need to administer tests in higher-level courses, such as Algebra II and 11th or 12th grade reading and writing. It also is essential for higher education to play a role in developing and/or reviewing these exams to ensure they reflect the skills needed for college success. If postsecondary institutions participate in developing these exams, they can more readily use them for placement purposes in entry-level college courses.
This will send a powerful signal to students and their parents and teachers that performance in high school pays off in college.

Modify existing high school tests to measure college readiness. Some states are considering adding questions to existing high school assessments to round out those tests and make them adequate measures of college readiness. If done well, this approach has the benefit of
streamlining the number of tests students take by serving the dual purpose of measuring student mastery of content in the state's standards as well as indicating readiness for credit-bearing college courses. The tests also can alert students if they need additional preparation for college in time to adjust their senior year coursework.

Use existing college placement tests for diagnostic purposes only. Some states are making college placement tests available for students to take voluntarily in high school. These exams can provide information to high school students about their readiness for credit-bearing, first-year college courses and allow teachers to work with students to address learning gaps in their senior year. However, placement tests should not be used as a substitute for building more comprehensive measures of college and work readiness into the state high school accountability system. The majority of the college placement tests reviewed by Achieve are narrowly focused on a subset of knowledge and skills; in math and reading in particular, they reflect relatively low levels of rigor. If states were to incorporate existing placement tests into their formal high school accountability systems, it might inadvertently lead to a narrowing and watering down of the curriculum.

Recommendations for Higher Education Policymakers

Developing a coherent system of high school assessments capable of measuring college readiness will primarily be driven by state K-12 education leaders, but postsecondary leaders have a critical role to play.

Clearly define expectations for incoming students. Postsecondary systems and institutions must be clearer and more transparent about what it means to be college ready so that the K-12 system has something more concrete to aim for. In each state, colleges and universities should collaborate with the K-12 system to define and publicize the standards for transition from high school to college.
These standards should be pegged to the knowledge and skills high school graduates need to succeed in credit-bearing, non-remedial courses. The standards must be tangibly articulated, much like the presentation of K-12 academic content standards, so that they do not simply become represented by a score on an admissions or a placement test.

Scrutinize placement tests given to incoming students to determine eligibility for entry into credit-bearing courses. Once postsecondary institutions have clearly defined their expectations, it is important that they examine their existing placement tests (and admissions tests, if they are used to make placement decisions) to see whether they measure the content and skills needed to enter and succeed in credit-bearing courses, not just to avoid remediation. If, as Achieve found, the tests in use do not fully measure the rigor and breadth of college readiness expectations, college faculty should consider commissioning or developing placement tests that reflect the real demands of what it takes to succeed in college.

Collaborate with K-12 on the development of high school tests that fully reflect the breadth and rigor of the content needed for success in postsecondary education. In addition to incorporating more challenging placement tests, colleges should work with K-12 systems to develop stronger high school tests that tap the full range of college-ready content and then use the results of those tests to place students in courses appropriate to their knowledge and skills. Colleges should then align their qualifying scores to establish a high standard of preparation that applies no matter where a student decides to go to college. This type of coordination would have two distinct benefits: It would reduce confusion about just what is required for college-level work, and it would do so in a way that reduces the number of tests a student takes.

Support the development and use of K-16 longitudinal data systems.
Institutions of higher education need to work with K-12 systems to develop effective data systems that follow students from high school to college. Without data on how students perform once they arrive in college, no one can be sure which high school programs are effective and which need improvement. An effective longitudinal data system should include high school grades and assessment results, college course-taking patterns, success in first-year college courses, and persistence and completion rates.
Findings: Mathematics

SUMMARY

To be successful in college and on the job, high school graduates need to be able to use mathematics to solve challenging problems. Achieve's ADP research with college professors from diverse institutions found that incoming freshmen need to have mastered the content taught in a four-year course sequence that includes courses such as Algebra I, Algebra II and Geometry, as well as significant data analysis and statistics. Achieve's analysis of admissions and placement tests found that:

- The admissions and placement tests put their heaviest emphasis on algebra content that is important to colleges and high-skilled workplaces. However, the algebra content assessed tends to favor prealgebra and basic algebra over the advanced algebra concepts and skills essential for college readiness and placement into College Algebra.
- Placement tests in particular are narrow and do not reflect the full range of content (such as data analysis, statistics and geometric reasoning) college students need in a wide variety of courses. Admissions tests are more balanced across topics.
- Too few questions on admissions and placement tests tap higher-level cognitive skills critical to success in college.

In contrast to the unique role of Comp 101 as the nearly universal first credit-bearing course in English, most colleges offer students several options for their first credit-bearing course in mathematics, the most common of which is College Algebra (see discussion on page 8). Other courses include Elementary Statistics, Mathematics for Liberal Arts, Mathematics for Elementary Teachers and Calculus. With the exception of Calculus, these courses build on the knowledge and skills normally acquired in three years of high school mathematics.
College Algebra and Elementary Statistics make extensive use of techniques taught in Algebra II, while other introductory credit-bearing mathematics courses depend less on mastery of techniques than on mathematical habits of mind. With few exceptions, college courses that cover topics normally taught in high school Algebra II or earlier, such as Intermediate Algebra, do not carry credits that count toward an associate or a bachelor's degree.

For this study, Achieve analyzed a variety of mathematics tests that are used for placement purposes by colleges and universities. These tests include examples of four different types: national admissions tests, national placement tests, state or systemwide tests and institution-level tests (see the table below). In addition, included in the analysis was a small pool of science items identified by ACT analysts as assessing mathematics content as it arises in natural applications. Although mathematics-intensive items on the ACT science test are incorporated into a composite score useful in making college admission decisions, such items are not reported as part of the ACT mathematics score and, therefore, are not incorporated into decision-making about course placement in mathematics.

As the table below shows, Achieve's analysis of placement tests included those used for determining placement into College Algebra and Calculus. Because College Algebra is the most common introductory college mathematics
course taken at two- and four-year colleges, the focus of this report is on the first-tier tests. Those are the tests that differentiate students who are ready for a credit-bearing College Algebra course from students in need of remediation.

TESTS ANALYZED IN MATHEMATICS

National Admissions Tests
  College Algebra: ACT Assessment Math (two forms); SAT Reasoning Test Math (two forms)
National Placement Tests
  College Algebra: ACCUPLACER Elementary Algebra; COMPASS Algebra
  Calculus: ACCUPLACER College Level Mathematics; COMPASS College Algebra
State or Systemwide Tests
  College Algebra: Texas Higher Education Assessment; Washington Intermediate Mathematics Placement Test
  Calculus: Washington Advanced Mathematics Placement Test
Institution-Level Tests
  College Algebra: Mathematical Association of America Algebra Placement Test; Purdue University North Central Student Assessment and Measurement Placement Test; Temple University Mathematics Placement Test; University of Minnesota College Math Readiness Test
  Calculus: Louisiana State University College Algebra Placement/Credit Exam; Mathematical Association of America Calculus Readiness Placement Test; University of Maryland College Park Math Placement Test; University of Minnesota Calculus Readiness Test

(Achieve's analysis focuses on the tests used for placement into College Algebra; for an analysis of the tests used for placement into Calculus, see sidebar on page 33.)

Achieve's experts analyzed more than 1,200 items (including 22 ACT science items), coding them for their mathematical content and cognitive demand (e.g., whether students are called upon to recall knowledge, reason or generalize). In addition, using a taxonomy developed from the Third International Mathematics and Science Study (TIMSS), Achieve was able to determine a typical grade level for each item.
This level, called an International Grade Placement (IGP) index, can be thought of as a kind of composite across the 40 TIMSS countries (other than the United States), showing when their curricula focus on a given mathematics topic, that is, the point at which the highest concentration of instruction on that topic occurs. The average international grade level for each test is then calculated by averaging the IGP indices of its items.

Achieve's investigation revealed both predictable and surprising features of these tests. Because national admissions tests tend to sample all mathematics taught in the first three years of high school, they typically are broader with respect to content coverage and sometimes
are more cognitively challenging than tests designed specifically for placement purposes. In contrast, placement tests generally focus on a narrower range of knowledge and often employ less cognitively demanding skills to determine which students are likely to succeed in particular courses. Because most colleges and universities offer entering students four or five different mathematics courses requiring different levels of mathematical preparation (College Algebra being the most common), some placement tests are arranged in multiple tiers, with topics ranging from arithmetic to precalculus. Even though this study focuses primarily on placement tests that are used to place students into College Algebra, Achieve also examined tests designed to indicate readiness for Calculus.

Do college admissions and placement tests measure the full range of knowledge and skills described in the ADP benchmarks?

ADP found that, to meet the demands of the many different university and college programs, high school graduates need to master a broad range of mathematical skills that are particularly important to professors from a variety of disciplines. In addition to algebraic concepts, professors identify geometry, measurement, data analysis and statistics as vital for success in credit-bearing courses. These topics are included in the ADP benchmarks and in most state standards. Achieve analyzed the admissions and placement tests to determine the distribution of topics, including algebra, geometry/measurement, data analysis/statistics and number concepts.

[Figure: Content on Math Tests. Stacked-bar chart of the percentage of items by content strand (Number, Algebra, Geometry/Measurement, Data) for the admissions tests (ACT, SAT) and the tests used for placement into College Algebra (ACCUPLACER, COMPASS, Mathematical Assoc. of America, Univ. of Minnesota, Purdue Univ., Temple Univ., Texas, Washington); totals may not equal 100 percent due to rounding.]

Admissions Tests

Both the ACT and the SAT mathematics tests place the greatest emphasis on algebra, with nearly half of their collective items assessing algebraic content. Secondary emphasis is placed on geometry, with more than 30 percent of the questions across the ACT and SAT addressing geometry and measurement. Substantially less emphasis is given to number concepts (14 percent) and to data analysis and statistics (7 percent). This distribution of math content is more balanced than that of placement tests.

Placement Tests

Achieve's analysis shows that three-quarters of the items on the placement tests measure algebra. Tests designed for placement into College Algebra courses collectively attribute only 13 percent of their items to geometry and measurement, with institution-level tests providing somewhat greater emphasis than national
tests. Treatment of data and statistics is particularly lacking on placement tests, with only 1 percent of items assessing these concepts. Assessment of number sense and numerical operations also is minimal on placement tests, with 11 percent of items collectively addressing such topics. Emphasis on number concepts is slightly more prevalent on state and institution-level placement tests than on national placement tests.

How challenging is the algebra measured by the admissions and placement tests?

As previously indicated, what is considered a credit-bearing course in mathematics varies by major and across institutions. College Algebra, which also is called Precalculus because it bridges Algebra II and Calculus, is generally accepted as a transfer-worthy, credit-bearing college course; it is at this point that course placement becomes critical for students entering college. In theory, high school students should be able to demonstrate mastery of Algebra II-level content to be placed into a College Algebra course. Achieve's detailed coding of the admissions and placement test items made it possible to group the many algebra items into levels that approximately correspond to typical courses: prealgebra (middle school), basic algebra (Algebra I) and advanced algebra (Algebra II). See the table below for more details.

ACHIEVE ALGEBRA LEVELS
Prealgebra. Course equivalent: Middle School Math/Prealgebra. Examples of topics covered: working with integers, rational numbers, patterns, representation, substitution, basic manipulation and simplification.
Basic Algebra. Course equivalent: Algebra I. Examples of topics covered: linear equations and basic functions and relations.
Advanced Algebra. Course equivalent: Algebra II. Examples of topics covered: non-linear equations or inequalities and an understanding of real and complex numbers.

[Figure: Algebra Content on Math Tests. Stacked-bar chart of the percentage of algebra items at each level (Prealgebra, Basic Algebra, Advanced Algebra) for the admissions tests (ACT, SAT) and the tests used for placement into College Algebra; totals may not equal 100 percent due to rounding.]
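The percentages reported in distributions like this reduce to a simple tally over the coded items. As an illustrative sketch only (the 20-item coding below is hypothetical, not Achieve's data or tooling):

```python
# Illustrative sketch: tallying coded test items into algebra-level
# percentages, in the spirit of Achieve's content analysis.
# The item codes below are hypothetical, not Achieve's actual data.
from collections import Counter

def level_distribution(item_levels):
    """Percentage of items at each level, rounded to whole percents
    (so totals may not equal 100 percent due to rounding)."""
    counts = Counter(item_levels)
    n = len(item_levels)
    return {level: round(100 * count / n) for level, count in counts.items()}

# A hypothetical 20-item placement test, coded item by item.
codes = ["prealgebra"] * 10 + ["basic algebra"] * 6 + ["advanced algebra"] * 4
print(level_distribution(codes))
# → {'prealgebra': 50, 'basic algebra': 30, 'advanced algebra': 20}
```

The same tally works for any coding scheme (content strands, cognitive demand levels) by swapping in the relevant item codes.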
Admissions Tests

Nearly one-third of the ACT and SAT collectively assess more rigorous advanced algebra concepts and skills typically encountered in a course such as Algebra II. Most of the advanced algebra items found on the admissions tests address solving quadratic equations and systems of equations. Of the remaining items, slightly more than 25 percent of the ACT and SAT assess basic algebra topics typically encountered in courses such as Algebra I, and the remaining 40 percent cover prealgebra concepts. Although the ACT and SAT are more rigorous than most of the placement tests with respect to algebra, many important higher-level algebra topics included in the ADP benchmarks, such as advanced functions and modeling, are barely addressed on either of these tests.

Placement Tests

On average, the items Achieve analyzed from the placement tests emphasize prealgebra knowledge and skills. Fewer than one in four items assess knowledge and skills typically encountered in courses like Algebra I. Less than 30 percent of the algebra items across all placement tests tap advanced algebra knowledge and skills typically encountered in a course such as Algebra II. There are a few notable exceptions: The Minnesota and Mathematical Association of America (MAA) Algebra tests actually place their greatest emphasis (50 percent of each) on advanced algebra knowledge and skills. More so than on the admissions tests, nearly all of the advanced algebra items on the placement tests, including the Minnesota and MAA Algebra tests, ask students to solve quadratic equations and systems of equations.

How cognitively challenging are the mathematics test questions?

The content measured by an item reveals only one aspect of a test's character. A more complete understanding also requires consideration of the cognitive demand of the questions. For example, are students asked to develop a mathematical model to solve a complex, multistep problem?
Or is the question framed in such a way that students need apply only a basic procedure, such as solving a simple linear equation? To analyze cognitive demand, Achieve used a five-point scale, with Level 1 (recalling) being the lowest and Level 5 (advanced reasoning) the highest.

[Figure: Cognitive Demand of Math Tests. Stacked-bar chart of the percentage of items at each level (Level 1: Recalling; Level 2: Routine Procedures; Level 3: Non-Routine Procedures; Level 4: Formulating Problems/Strategizing Solutions; Level 5: Advanced Reasoning) for the admissions tests (ACT, SAT) and the tests used for placement into College Algebra; totals may not equal 100 percent due to rounding.]

Admissions Tests

The ACT and SAT are the most cognitively demanding of all the tests examined in this study, with nearly 40 percent of their items categorized as Level 3 or above. Level 3 items require students to use non-routine procedures,
Level 4 items ask students to formulate problems and strategize solutions, and Level 5 items call on students to use advanced reasoning. Although mathematics-intensive items on the ACT science test are not counted for mathematics placement purposes, they are particularly noteworthy because Achieve's analysis identified all of these items as Level 3 or 4, given the cognitive complexity of reading, analyzing and interpreting scientific data. In addition, it is worth mentioning that nearly 20 percent of items on the SAT are gridded-response items in which no answer choices are given. Students must produce their own answers to these questions and bubble in their answers on a special answer grid. The SAT is the only mathematics test in this study that includes such items; the remaining tests rely exclusively on multiple-choice questions.

Placement Tests

Achieve's analysis showed that placement tests do not tap the higher levels of cognitive demand. In fact, collectively across all placement tests examined, more than three-quarters of the questions are categorized as Level 2, calling for students to apply routine procedures to solve problems. Seventeen percent of the questions fall at Levels 3 and 4. Only two questions (on the Texas assessment) across all of the placement tests call for students to use advanced reasoning (Level 5), even though research indicates it is necessary for success in college. Interestingly, state and institution-level placement tests on average tend to be more cognitively demanding than the national placement tests, placing greater emphasis on items rated as Level 3 or above. The Washington Intermediate Mathematics Placement Test and the Texas assessment are particularly notable for their considerable emphasis on assessing higher levels of cognitive demand.

How rigorous is the content on the college admissions and placement tests on an international scale?
As part of its earlier study of high school exit tests, Achieve examined the International Grade Placement (IGP) of each item on each test. The IGP, an index developed by Michigan State University, represents the average grade in which a mathematics topic typically appears in the curriculum of the 40 nations (other than the United States) that took part in the Third International Mathematics and Science Study (TIMSS). The IGP rating of a test is calculated by averaging the IGP levels of the items on the test. TIMSS data show that students in other countries typically study mathematical topics a year earlier than in most U.S. school districts; therefore, the IGP index is approximately one year lower than corresponding U.S. averages.

[Figure: Content on Math Tests on International Grade Scale. Box-and-whisker plot of item IGP grade levels, with medians, for the admissions tests (ACT, SAT) and the tests used for placement into College Algebra (ACCUPLACER, COMPASS, Mathematical Assoc. of America, Univ. of Minnesota, Purdue Univ., Temple Univ., Texas, Washington).]

Ideally, admissions and placement tests
would place greater emphasis on content taught later in high school and less emphasis on prealgebra.

Admissions and Placement Tests

Achieve's analysis indicates that although some questions on admissions and placement tests do measure advanced content, overall they place too much emphasis on content that is taught earlier in a student's high school career. The average IGP for the set of items from all of the institution-level placement tests is 8.4, indicating that on average the tests focus on content that is taught in the 8th grade in other countries but in early high school in the United States. The average IGPs for the national admissions and placement tests are comparable to those of the institution-level tests. This is largely because these tests include so many items that measure prealgebra and basic algebra content. As the box-and-whisker plot above indicates, the tests do measure some higher-level knowledge and skills that are prerequisites for success in college (as evidenced by the whisker extending upward toward the later high school years), but they tend to be weighted with large proportions of items categorized as assessing content at the 9th grade level or below internationally.
Appendix B3: Analysis of College Algebra and Pre-Calculus Course Syllabi Compared to the ADP Algebra II End-of-Course Exam Standards

Background

In early 2009, Pearson conducted a series of Judgment Studies involving higher education mathematics professors to evaluate the ADP Algebra II End-of-Course Exam standards and the spring 2008 Algebra II end-of-course exam. The goal of the Judgment Studies was to determine the relative importance of the content on the exam for preparation for credit-bearing college mathematics courses. These Judgment Studies are a unique aspect of the standard setting process for the ADP Algebra II exam: Including higher education faculty in setting standards for a high school exam is atypical but critical in order to ensure alignment between high school and college mathematics.

Professors participating in the Judgment Study sessions were selected because they teach either college algebra or pre-calculus courses at two- and four-year public institutions. The four-year institutions targeted for the study have either a typical or selective admissions policy. Participants were asked to provide a syllabus for the courses they teach. Seventy-nine institutions and campuses participated in the study, and syllabi were received from 71 of the institutions across 18 ADP Network states. Because curricula may differ across campuses within a system, each campus was reviewed as a separate entity if more than one campus from the institution provided syllabi. In some instances, several college algebra and/or pre-calculus syllabi were provided for a campus. If the syllabi contained different content, each was reviewed as a separate course. It is important to note that college mathematics courses vary greatly by state, institution, and professor. Some states require courses with the same title and number to contain the same content, while other states do not have a system-wide agreement on course curricula.
In total, 69 college algebra-type courses and 30 pre-calculus-type courses were reviewed.

Purpose of the Study

The purpose of this analysis was to provide information regarding the level of college mathematics course that a student performing at the college-ready level on the ADP Algebra II End-of-Course Exam (EOC) would be prepared to take. Although there is a myriad of credit-bearing mathematics courses in which a student might enroll in his or her first year of college, this analysis focuses on college algebra and pre-calculus. It should be noted that there is typically some degree of overlap between consecutive mathematics courses, whether in high school or college. Therefore, some overlap between the content of the ADP Algebra II EOC exam and the first college mathematics course is expected. The institutions that participated and the course codes for the syllabi provided are listed on the following pages in Table 1. July 7, 2009 B3-1
Table 1: Syllabi Provided by Post-Secondary Institutions for ADP Judgment Study
(State; post-secondary institution and campus; college algebra and/or pre-calculus course code(s) as provided)

AR  Arkansas Northeastern College: MA
AR  Arkansas State University Mountain Home: MA 1003; MA 1023
AR  Black River Technical College: MA 1023
AR  Cossatot Community College of the University of Arkansas: MATH 1023; MA 1054
AR  Henderson State University: MA 1243; MTH 1273
AR  NorthWest Arkansas Community College: MATH 1204
AR  Ozarka College: MATH 1203
AR  Southern Arkansas University Main Campus: MATH 1023; MATH 1045
AR  University of Arkansas Fort Smith: MATH 1715
AR  University of Arkansas Little Rock: College Algebra
AR  University of Arkansas Main Campus: MATH 1203
AR  University of Central Arkansas: MA 1390
AZ  Arizona Western College: MA 151; MA 187
AZ  Central Arizona College: MAT 151; MAT 187
AZ  Northern Arizona University: MAT 108; MAT 125
AZ  Paradise Valley Community College: MAT 151
FL  Florida Gulf Coast University: MAC 1147
FL  Palm Beach Community College: MAC 1105
FL  The University of West Florida: MAC
FL  University of Florida: MAC 1105
FL  University of North Florida: MAC 1990
FL  University of South Florida St. Petersburg: MAC 1105; MAC 1147
HI  Leeward Community College: MATH 103; MATH 135
HI  Maui Community College: MATH 135
HI  University of Hawaii at Hilo: MATH 103; MATH 104
HI  University of Hawaii at Manoa: MATH 135
HI  Windward Community College: MATH 103
IL  Elgin Community College: MTH 112
IN  Indiana State University: MATH 115
IN  Indiana University Purdue University Fort Wayne (IPFW): MAT 153
IN  Indiana University Bloomington: MATH M025; MATH M026; MATH M027
IN  Ivy Tech Community College Elkhart: MAT B
IN  Ivy Tech Community College Richmond: MATH 136
IN  Purdue University North Central: MA 152; MA 153
KY  Jefferson Community and Technical College: MATH 150
LA  Bossier Parish Community College: MATH 102
MA  Framingham State College: MATH 123
MA  University of Massachusetts Boston: MATH 115
MA  Worcester State College: MA
MD  Anne Arundel Community College: MAT 131; MAT 151
MD  Bowie State University: MATH 150H
MD  Coppin State University: MATH 110
MD  Howard Community College: MATH 131
MD  Towson University: MATH 115
MD  University of Maryland Baltimore County: MATH 150
MD  University of Maryland College Park: MATH 115
MN  Saint Cloud State University: MA 115
NC  Central Piedmont Community College: MAT 161; MAT 171
NC  Durham Technical Community College: MAT 161/A; MAT 171/A
NC  Halifax Community College: MATH 161; MAT 172
NC  Winston Salem State University: MAT 1313
NJ  Bergen Community College: MAT 180
NJ  County College of Morris: MAT 110
NJ  Middlesex County College: MAT 116; MAT 129
NJ  Montclair State University: MATH; MATH
NJ  Rowan University: MATH; MATH
OH  Ball State University: MATH 111
OH  Columbus State Community College: MATH 148
OH  James A Rhodes State College: MTH 121; MTH 122
OH  Kent State University Trumbull: MATH
OH  Lakeland Community College: MATH 1650
OH  Ohio State University Main Campus: MATH 148; MATH 150
PA  Bloomsburg University of Pennsylvania
PA  California University of Pennsylvania: MAT 199
PA  Delaware County Community College: MAT 140-I; MAT 140-II
PA  Pennsylvania State University: MATH 21; MATH 22
RI  Rhode Island College: MATH 120; MATH 209
TX  San Jacinto Community College: MATH 1314
TX  University of North Texas: MATH 1100
WA  Central Washington University: MATH 153
WA  Eastern Washington University: MATH 104; MATH 105; MATH 106
Methodology

Achieve content experts conducted a comparative analysis of the syllabi provided by participating institutions against the ADP Algebra II End-of-Course Exam (EOC) Standards. For each syllabus, each topic was mapped to the corresponding benchmark in the ADP Algebra II EOC exam standards. Due to variations in the level of detail provided in the syllabi, it was sometimes difficult to match information provided in the syllabi to specific EOC benchmarks, but content experts used professional judgment to map to the most appropriate benchmark. The ADP Algebra II EOC standards comprise a core set of forty-one benchmarks in five content standards and seven optional modules. The core content standards included in the ADP Algebra II EOC Exam are: Operations on Numbers and Expressions; Equations and Inequalities; Polynomial and Rational Functions; Exponential Functions; and Function Operations and Inverses. The optional modules include: Data and Statistics; Probability; Logarithmic Functions; Trigonometric Functions; Matrices; Conic Sections; and Sequences and Series. For purposes of this study, only the core standards of the ADP Algebra II End-of-Course Exam were included in the analysis. The modules were not included because they have not yet been administered in an operational administration and will not be included in the standard setting process. Each type of course, college algebra and pre-calculus, was analyzed from two different perspectives.

- The first analysis reviewed each ADP Algebra II EOC benchmark individually across each course, college algebra and pre-calculus, to determine how well the content in the Algebra II exam standards would prepare a student for the content in each course.
- The second analysis reviewed each syllabus, as a whole, against the ADP Algebra II EOC standards to determine the degree of overlap between each syllabus and the Algebra II standards.
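The two tallies described above can be sketched in code. This is a hypothetical illustration only: the study itself was a manual review by content experts, and the syllabus names, the sample benchmark mappings, and the data structures below are invented for the example.

```python
from collections import Counter

N_CORE_BENCHMARKS = 41  # core ADP Algebra II EOC benchmarks

# Each syllabus maps benchmark codes to an alignment judgment
# ("partial" or "full"); the entries here are made up.
syllabi = {
    "college_algebra_001": {"O1.a": "partial", "E1.a": "full", "P1.a": "full"},
    "college_algebra_002": {"E1.a": "full", "P1.a": "partial", "X1.a": "full"},
}

# First analysis: for each benchmark, count how many syllabi overlap with
# it, broken out by judgment (plus an "any" total).
per_benchmark = Counter()
for topics in syllabi.values():
    for code, judgment in topics.items():
        per_benchmark[(code, judgment)] += 1
        per_benchmark[(code, "any")] += 1

# Second analysis: for each syllabus as a whole, the share of the 41 core
# benchmarks it touches.
per_syllabus = {
    name: 100 * len(topics) / N_CORE_BENCHMARKS
    for name, topics in syllabi.items()
}

print(per_benchmark[("E1.a", "full")])               # -> 2
print(round(per_syllabus["college_algebra_001"], 1))  # -> 7.3
```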
Findings

First Analysis

College Algebra level: The comparison of the ADP Algebra II EOC benchmarks with the content topics in the college algebra syllabi indicates that the overlap ranges from 4% to 71% at the benchmark level, as shown in the last column in Table 2. At the content standard level this averages to an overlap of 19% to 48%. Operations on Numbers and Expressions, which contains more introductory algebra material, has the lowest degree of overlap. Polynomial and Rational Functions and Exponential Functions, which contain more of the advanced algebra material, have the highest degrees of overlap.
Table 2: Overlap of ADP Algebra II End-of-Course Exam Benchmarks and College Algebra Syllabi
(For each benchmark: number and percent of syllabi partially overlapping; number and percent fully aligned; total number and percent aligned. Maximum = 70 syllabi.)

CORE: Operations on Numbers and Expressions
O1 Real Numbers
O1.a. Convert between and among radical and exponential forms of numerical expressions. Partially overlapping: 2 (3%); Fully aligned: 2 (3%); Aligned: 4 (6%)
O1.b. Simplify and perform operations on numerical expressions containing radicals. Partially overlapping: 3 (4%); Fully aligned: 2 (3%); Aligned: 5 (7%)
O1.c. Apply the laws of exponents to numerical expressions with rational and negative exponents to order and rewrite them in alternative forms. Partially overlapping: 4 (6%); Fully aligned: 3 (4%); Aligned: 7 (10%)
O2 Complex Numbers
O2.a. Represent complex numbers in the form a + bi, where a and b are real; simplify powers of pure imaginary numbers. Partially overlapping: 9 (13%); Fully aligned: 5 (7%); Aligned: 14 (20%)
O2.b. Perform operations on the set of complex numbers. Partially overlapping: 7 (10%); Fully aligned: 10 (14%); Aligned: 17 (25%)
O3 Algebraic Expressions
O3.a. Convert between and among radical and exponential forms of algebraic expressions. Partially overlapping: 6 (9%); Fully aligned: 8 (12%); Aligned: 14 (20%)
O3.b. Simplify and perform operations on radical algebraic expressions. Partially overlapping: 4 (6%); Fully aligned: 14 (20%); Aligned: 18 (26%)
O3.c. Apply the laws of exponents to algebraic expressions, including those involving rational and negative exponents, to order and rewrite them in alternative forms. Partially overlapping: 6 (9%); Fully aligned: 9 (13%); Aligned: 15 (22%)
O3.d. Perform operations on polynomial expressions. Partially overlapping: 6 (9%); Fully aligned: 11 (16%); Aligned: 17 (25%)
O3.e. Perform operations on rational expressions, including complex fractions. Partially overlapping: 10 (14%); Fully aligned: 11 (16%); Aligned: 21 (30%)
O3.f. Identify or write equivalent algebraic expressions in one or more variables to extract information. Partially overlapping: 10 (14%); Fully aligned: 0 (0%); Aligned: 10 (14%)
CORE: Equations and Inequalities
E1 Linear Equations and Inequalities
E1.a. Solve equations and inequalities involving the absolute value of a linear expression. Partially overlapping: 18 (26%); Fully aligned: 26 (38%); Aligned: 44 (64%)
E1.b. Express and solve systems of linear equations in three variables with and without the use of technology. Partially overlapping: 28 (41%); Fully aligned: 17 (25%); Aligned: 45 (65%)
E1.c. Solve systems of linear inequalities in two variables and graph the solution set. Partially overlapping: 2 (3%); Fully aligned: 16 (23%); Aligned: 18 (26%)
E1.d. Recognize and solve problems that can be represented by single-variable linear equations or inequalities or systems of linear equations or inequalities involving two or more variables. Interpret the solution(s) in terms of the context of the problem. Partially overlapping: 23 (33%); Fully aligned: 4 (6%); Aligned: 27 (39%)
E2 Nonlinear Equations and Inequalities
E2.a. Solve single-variable quadratic, exponential, rational, radical, and factorable higher-order polynomial equations over the set of real numbers, including quadratic equations involving absolute value. Partially overlapping: 29 (42%); Fully aligned: 2 (3%); Aligned: 31 (45%)
E2.b. Solve single-variable quadratic equations and inequalities over the complex numbers; graph real solution sets on a number line. Partially overlapping: 26 (38%); Fully aligned: 23 (33%); Aligned: 49 (71%)
E2.c. Use the discriminant, b^2 - 4ac, to determine the nature of the solutions of the equation ax^2 + bx + c = 0. Partially overlapping: 0 (0%); Fully aligned: 3 (4%); Aligned: 3 (4%)
E2.d. Graph the solution set of a two-variable quadratic inequality in the coordinate plane. Partially overlapping: 2 (3%); Fully aligned: 2 (3%); Aligned: 4 (6%)
E2.e. Rewrite nonlinear equations and inequalities to express them in multiple forms in order to facilitate finding a solution set or to extract information about the relationships or graphs indicated. Partially overlapping: 3 (4%); Fully aligned: 3 (4%); Aligned: 6 (9%)
CORE: Polynomial and Rational Functions
P1 Quadratic Functions
P1.a. Determine key characteristics of quadratic functions and their graphs. Partially overlapping: 10 (14%); Fully aligned: 25 (36%); Aligned: 35 (51%)
P1.b. Represent quadratic functions using tables, graphs, verbal statements, and equations. Translate among these representations. Partially overlapping: 36 (52%); Fully aligned: 6 (9%); Aligned: 42 (61%)
P1.c. Describe and represent the effect that changes in the parameters of a quadratic function have on the shape and position of its graph. Partially overlapping: 4 (6%); Fully aligned: 11 (16%); Aligned: 15 (22%)
P1.d. Recognize, express, and solve problems that can be modeled using quadratic functions. Interpret their solutions in terms of the context. Partially overlapping: 5 (7%); Fully aligned: 28 (41%); Aligned: 33 (48%)
P2 Higher-order Polynomial and Rational Functions
P2.a. Determine key characteristics of power functions in the form for positive integral values of n and their graphs. Partially overlapping: 3 (4%); Fully aligned: 5 (7%); Aligned: 8 (12%)
P2.b. Determine key characteristics of polynomial functions and their graphs. Partially overlapping: 17 (25%); Fully aligned: 24 (35%); Aligned: 41 (59%)
P2.c. Represent polynomial functions using tables, graphs, verbal statements, and equations. Translate among these representations. Partially overlapping: 42 (61%); Fully aligned: 3 (4%); Aligned: 45 (65%)
P2.d. Determine key characteristics of simple rational functions and their graphs. Partially overlapping: 13 (19%); Fully aligned: 24 (35%); Aligned: 37 (54%)
P2.e. Represent simple rational functions using tables, graphs, verbal statements, and equations. Translate among these representations. Partially overlapping: 41 (59%); Fully aligned: 1 (1%); Aligned: 42 (61%)
P2.f. Recognize, express, and solve problems that can be modeled using polynomial and simple rational functions. Interpret their solutions in terms of the context. Partially overlapping: 15 (22%); Fully aligned: 20 (29%); Aligned: 35 (51%)
CORE: Exponential Functions
X1 Exponential Functions
X1.a. Determine key characteristics of exponential functions and their graphs. Partially overlapping: 14 (20%); Fully aligned: 25 (36%); Aligned: 39 (57%)
X1.b. Represent exponential functions using tables, graphs, verbal statements, and equations. Represent exponential equations in multiple forms. Translate among these representations. Partially overlapping: 40 (58%); Fully aligned: 6 (9%); Aligned: 46 (67%)
X1.c. Describe and represent the effect that changes in the parameters of an exponential function have on the shape and position of its graph. Partially overlapping: 4 (6%); Fully aligned: 9 (13%); Aligned: 13 (19%)
X1.d. Recognize, express, and solve problems that can be modeled using exponential functions, including those where logarithms provide an efficient method of solution. Interpret their solutions in terms of the context. Partially overlapping: 5 (7%); Fully aligned: 35 (51%); Aligned: 40 (58%)
CORE: Function Operations and Inverses
F1 Function Operations
F1.a. Combine functions by addition, subtraction, multiplication, and division. Partially overlapping: 8 (12%); Fully aligned: 16 (23%); Aligned: 24 (35%)
F1.b. Determine the composition of two functions, including any necessary restrictions on the domain. Partially overlapping: 3 (4%); Fully aligned: 20 (29%); Aligned: 23 (33%)
F2 Inverse Functions
F2.a. Describe the conditions under which an inverse relation is a function. Partially overlapping: 11 (16%); Fully aligned: 9 (13%); Aligned: 20 (29%)
F2.b. Determine and graph the inverse relation of a function. Partially overlapping: 8 (12%); Fully aligned: 13 (19%); Aligned: 21 (30%)
F3 Piecewise-defined Functions
F3.a. Determine key characteristics of absolute value, step, and other piecewise-defined functions. Partially overlapping: 4 (6%); Fully aligned: 8 (12%); Aligned: 12 (17%)
F3.b. Represent piecewise-defined functions using tables, graphs, verbal statements, and equations. Translate among these representations. Partially overlapping: 13 (19%); Fully aligned: 1 (1%); Aligned: 14 (20%)
F3.c. Recognize, express, and solve problems that can be modeled using absolute value, step, and other piecewise-defined functions. Interpret their solutions in terms of the context. Partially overlapping: 1 (1%); Fully aligned: 5 (7%); Aligned: 6 (9%)

Pre-Calculus level: In contrast, the pre-calculus-type courses have less overlap with the ADP Algebra II EOC standards than the college algebra courses. Some topics taught in these courses have no overlap with the individual Algebra II EOC benchmarks, which might indicate that the Algebra II content is taught prior to such courses. The overlap of the content in the pre-calculus syllabi and the Algebra II EOC exam at the benchmark level ranges from 0% to 63%, as shown in the last column in Table 3.
At the standard level the overlap ranges from 7% to 48%. Similar to the college algebra topics, the lowest degree of overlap was in Operations on Numbers and Expressions, and the highest degree of overlap was in Polynomial and Rational Functions and Exponential Functions.
Table 3: Overlap of ADP Algebra II End-of-Course Exam Benchmarks and Pre-Calculus Syllabi
(Benchmark descriptions are as listed in Table 2. For each benchmark: number and percent of syllabi partially aligned; number and percent fully aligned; total number and percent aligned. Maximum = 30 syllabi.)

CORE: Operations on Numbers and Expressions
O1.a. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
O1.b. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
O1.c. Partially aligned: 2 (7%); Fully aligned: 0 (0%); Aligned: 2 (7%)
O2.a. Partially aligned: 4 (13%); Fully aligned: 2 (7%); Aligned: 6 (20%)
O2.b. Partially aligned: 2 (7%); Fully aligned: 2 (7%); Aligned: 4 (13%)
O3.a. Partially aligned: 2 (7%); Fully aligned: 0 (0%); Aligned: 2 (7%)
O3.b. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
O3.c. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
O3.d. Partially aligned: 2 (7%); Fully aligned: 3 (10%); Aligned: 5 (17%)
O3.e. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
O3.f. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
CORE: Equations and Inequalities
E1.a. Partially aligned: 4 (13%); Fully aligned: 7 (23%); Aligned: 11 (37%)
E1.b. Partially aligned: 5 (17%); Fully aligned: 3 (10%); Aligned: 8 (27%)
E1.c. Partially aligned: 2 (7%); Fully aligned: 2 (7%); Aligned: 4 (13%)
E1.d. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
E2.a. Partially aligned: 7 (23%); Fully aligned: 0 (0%); Aligned: 7 (23%)
E2.b. Partially aligned: 4 (13%); Fully aligned: 9 (30%); Aligned: 13 (43%)
E2.c. Partially aligned: 0 (0%); Fully aligned: 0 (0%); Aligned: 0 (0%)
E2.d. Partially aligned: 0 (0%); Fully aligned: 1 (3%); Aligned: 1 (3%)
E2.e. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
CORE: Polynomial and Rational Functions
P1.a. Partially aligned: 4 (13%); Fully aligned: 8 (27%); Aligned: 12 (40%)
P1.b. Partially aligned: 14 (47%); Fully aligned: 0 (0%); Aligned: 14 (47%)
P1.c. Partially aligned: 0 (0%); Fully aligned: 7 (23%); Aligned: 7 (23%)
P1.d. Partially aligned: 1 (3%); Fully aligned: 10 (33%); Aligned: 11 (37%)
P2.a. Partially aligned: 2 (7%); Fully aligned: 0 (0%); Aligned: 2 (7%)
P2.b. Partially aligned: 10 (33%); Fully aligned: 9 (30%); Aligned: 19 (63%)
P2.c. Partially aligned: 16 (53%); Fully aligned: 0 (0%); Aligned: 16 (53%)
P2.d. Partially aligned: 8 (27%); Fully aligned: 10 (33%); Aligned: 18 (60%)
P2.e. Partially aligned: 17 (57%); Fully aligned: 0 (0%); Aligned: 17 (57%)
P2.f. Partially aligned: 3 (10%); Fully aligned: 5 (17%); Aligned: 8 (27%)
CORE: Exponential Functions
X1.a. Partially aligned: 10 (33%); Fully aligned: 8 (27%); Aligned: 18 (60%)
X1.b. Partially aligned: 19 (63%); Fully aligned: 0 (0%); Aligned: 19 (63%)
X1.c. Partially aligned: 0 (0%); Fully aligned: 4 (13%); Aligned: 4 (13%)
X1.d. Partially aligned: 1 (3%); Fully aligned: 16 (53%); Aligned: 17 (57%)
CORE: Function Operations and Inverses
F1.a. Partially aligned: 4 (13%); Fully aligned: 9 (30%); Aligned: 13 (43%)
F1.b. Partially aligned: 1 (3%); Fully aligned: 9 (30%); Aligned: 10 (33%)
F2.a. Partially aligned: 6 (20%); Fully aligned: 5 (17%); Aligned: 11 (37%)
F2.b. Partially aligned: 5 (17%); Fully aligned: 5 (17%); Aligned: 10 (33%)
F3.a. Partially aligned: 1 (3%); Fully aligned: 0 (0%); Aligned: 1 (3%)
F3.b. Partially aligned: 0 (0%); Fully aligned: 0 (0%); Aligned: 0 (0%)
F3.c. Partially aligned: 0 (0%); Fully aligned: 0 (0%); Aligned: 0 (0%)

Topics Included in College Syllabi That Are Not Included on the ADP Algebra II Exam

College Algebra level: Across the 69 college algebra syllabi analyzed, a few topics occur in at least 50 percent (at least 34) of the syllabi examined but do not appear in the core ADP Algebra II EOC Exam. These three are part of the modules for the Exam:

- Determine key characteristics of logarithmic functions. (L.2.a)
- Represent logarithmic functions using tables, graphs, verbal statements, and equations. Translate among these representations.
(L.2.b)
- Recognize, express, and solve problems that can be modeled using logarithmic functions. Interpret their solutions in terms of the context of the problem. (L.2.d)
Although there are other topics covered in the college algebra syllabi that are not covered in the ADP Algebra II EOC Exam, there was no commonality among these topics.

Pre-Calculus level: Across the 30 pre-calculus syllabi analyzed, a few topics occurred in at least 50 percent (at least 15) of the syllabi but did not appear in the core ADP Exam. Four of these topics are part of the modules for the Exam:

- Represent logarithmic functions using tables, graphs, verbal statements, and equations. Translate among these representations. (L.2.b)
- Use the relationship of the sine and cosine functions to a central angle of the unit circle to determine the exact trigonometric ratio of angles on the unit circle. (T.1.b)
- Represent trigonometric functions using tables, graphs, verbal statements, and equations. Translate among these representations. (T.1.d)
- Determine key characteristics of trigonometric functions and their graphs. (T.1.e)

One topic common among the 30 pre-calculus syllabi examined that is not part of the ADP Algebra II EOC Exam is:

- Trigonometric equations and basic identities

Overall, the ADP Algebra II EOC Exam standards will provide a strong mathematics foundation for students entering either a college algebra or pre-calculus course.

Second Analysis

Because one goal of the ADP Algebra II EOC Exam is to serve as an indicator of a student's readiness for the first credit-bearing mathematics course in college, such as college algebra or pre-calculus, much of the content on the exam should be taught prior to these courses. Typically, there is some degree of overlap in consecutive mathematics courses, but ideally the overlap is limited to the review of foundational topics early in the course.

College Algebra level: There is considerable overlap between the ADP Algebra II EOC standards and the college algebra syllabi.
On average, the college algebra syllabi overlap in 34% of the 41 benchmarks (14 benchmarks), with a range in overlap from 7% to 80%, as shown in Table 4.

Pre-Calculus level: There is less content in common between the ADP Algebra II EOC standards and the pre-calculus syllabi. The pre-calculus syllabi average 24% of the benchmarks (10 benchmarks) overlapping with the ADP EOC, with a range in overlap from 0% to 54%, as shown in Table 4.
Table 4: College Algebra and Pre-Calculus Syllabi Analyzed
(State; post-secondary institution and campus; course; number and percent of benchmarks overlapping, max = 41, shown as given in the source)

AR  Arkansas Northeastern College: MA, %
AR  Arkansas State University Mountain Home: MA, %; MA, %
AR  Arkansas State University Mountain Home: MA, %
AR  Black River Technical College: MA, %
AR  Cossatot Community College of the University of Arkansas: MATH, %
AR  Henderson State University: MA, %; MTH, %
AR  NorthWest Arkansas Community College: MATH, %
AR  Ozarka College: MATH, %
AR  Southern Arkansas University Main Campus: MATH, %; MATH, %
AR  University of Arkansas Fort Smith: MATH, %
AR  University of Arkansas Little Rock: College Algebra, 15 (37%)
AR  University of Arkansas Main Campus: MATH, %
AR  University of Central Arkansas: MA, %
AZ  Arizona Western College: MA, %; MA, %
AZ  Central Arizona College: MAT, %; MAT, %
AZ  Northern Arizona University: MAT, %
AZ  Northern Arizona University: MAT, %
AZ  Paradise Valley Community College: MAT, %
FL  Florida Gulf Coast University: MAC, %
FL  Palm Beach Community College: MAC, %
FL  The University of West Florida: MAC, %
FL  University of Florida: MAC, %
FL  University of North Florida: MAC, %
FL  University of South Florida St. Petersburg: MAC, %; MAC, %
HI  Leeward Community College: MATH, %
HI  Leeward Community College: MATH, %
HI  Maui Community College: MATH, %
HI  University of Hawaii at Hilo: MATH, %; MATH, %
HI  University of Hawaii at Manoa: MATH, %
HI  Windward Community College: MATH, %
IL  Elgin Community College: MTH, %
IN  Indiana State University: MATH, %
IN  Indiana University Purdue University Fort Wayne (IPFW): MAT, %
IN  Indiana University Bloomington: MATH M, %; MATH M, %
IN  Indiana University Bloomington: MATH M, %
IN  Ivy Tech Community College Elkhart: MAT B, 12 (29%)
IN  Ivy Tech Community College Richmond: MATH, %
IN  Purdue University North Central: MA, %; MA, %
KY  Jefferson Community and Technical College: MATH, %
LA  Bossier Parish Community College: MATH, %
MA  Framingham State College: MATH, %
MA  University of Massachusetts Boston: MATH, %
MA  Worcester State College: MA, %
MD  Anne Arundel Community College: MAT, %; MAT, %
MD  Bowie State University: MATH 150H, 12 (29%)
MD  Coppin State University: MATH, %
MD  Howard Community College: MATH, %
MD  Towson University: MATH, %
MD  University of Maryland Baltimore County: MATH, %
MD  University of Maryland College Park: MATH, %
MN  Saint Cloud State University: MA, %
NC  Central Piedmont Community College: MAT, %; MAT, %
NC  Central Piedmont Community College: MAT, %
NC  Durham Technical Community College: MAT 161/A, 14 (34%)
NC  Durham Technical Community College: MAT 171/A, 15 (37%)
NC  Halifax Community College: MATH, %
NC  Winston Salem State University: MAT, %
NJ  Bergen Community College: MAT, %
NJ  County College of Morris: MAT, %
NJ  Middlesex County College: MAT, %; MAT, %
NJ  Montclair State University: MATH 100-, %; MATH 112-, %
NJ  Rowan University: MATH, %; MATH, %
OH  Ball State University: MATH, %
OH  Columbus State Community College: MATH, %
OH  James A Rhodes State College: MTH, %
OH  James A Rhodes State College: MTH, %
OH  Kent State University Trumbull: MATH, %
OH  Lakeland Community College: MATH, %
OH  Ohio State University Main Campus: MATH, %; MATH, %
PA  Bloomsburg University of Pennsylvania: %; %
PA  California University of Pennsylvania: MAT, %
PA  Delaware County Community College: MAT 140-I, 17 (41%)
PA  Delaware County Community College: MAT 140-II, 9 (22%)
PA  Pennsylvania State University: MATH, %
PA  Pennsylvania State University: MATH, %
RI  Rhode Island College: MATH, %; MATH, %
TX  San Jacinto Community College: MATH, %
TX  University of North Texas: MATH, %
WA  Central Washington University: MATH, %
WA  Eastern Washington University: MATH, %; MATH, %
WA  Eastern Washington University: MATH, %
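The per-syllabus overlap percentages in Table 4 follow from simple arithmetic on the 41 core benchmarks. As a quick check of the averages quoted in the text (14 of 41 benchmarks is about 34 percent; 10 of 41 is about 24 percent), assuming ordinary rounding to whole percentages; the helper function here is illustrative, not part of the study:

```python
# Share of the 41 core ADP Algebra II EOC benchmarks that a syllabus
# overlaps, rounded to a whole percent.
def overlap_percent(n_overlapping: int, n_benchmarks: int = 41) -> int:
    return round(100 * n_overlapping / n_benchmarks)

print(overlap_percent(14))  # college algebra average -> 34
print(overlap_percent(10))  # pre-calculus average -> 24
```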
Summary

The syllabi reviewed come from a broad range of two- and four-year public institutions in 18 different ADP Network states and include both college algebra and pre-calculus courses, two of the most commonly taken first-year, credit-bearing mathematics courses. The comparisons of the ADP Algebra II EOC Standards with the content topics in the syllabi indicate that the overlap at the standard level ranges from 19% to 48% for college algebra syllabi and from 7% to 48% for pre-calculus syllabi. This suggests that the college courses are natural extensions of the ADP Algebra II Exam standards, and that the content covered on the ADP Algebra II End-of-Course Exam would provide a solid foundation for college-level course work. The standards, therefore, seem to be at the appropriate level, on average, to prepare students for the first credit-bearing college mathematics courses, particularly college algebra or pre-calculus.
Appendix B4: Achieve's ADP Algebra II End-of-Course Exam Standards Compared to Eleven Countries' Mathematics Standards

To determine how the ADP Algebra II End-of-Course Exam (EOC) Standards compare to standards from other countries, Achieve content specialists compared the content and rigor of the ADP Algebra II Exam standards with upper-level secondary mathematics standards in eleven countries. The following sets of standards were used in this comparison:

- Alberta, Canada
- China
- Chinese Taipei
- Finland
- Hong Kong
- Japan
- Korea
- Malaysia
- New Zealand
- Singapore
- Thailand

These countries were chosen because they were part of an Achieve study for the United States Department of Education analyzing standards from the APEC (Asia-Pacific Economic Cooperation) countries. Finland's standards were added to the analysis because Finland is known to be an academically high-performing nation. The mathematics standards from the countries Achieve analyzed are the upper-level secondary standards required for all students receiving a high school diploma or equivalent credential. The majority of ADP Algebra II test takers are in 10th and 11th grade, which is approximately the same level as the analyzed countries' standards. This analysis was a comparison of the countries' required standards with the assessment standards for the ADP Algebra II End-of-Course Exam, which is currently an assessment with no stakes attached. While many of the countries studied offer more challenging standards, Achieve did not include an analysis of that content because it is not required for all of their students, and in the United States it might be taken at the postsecondary level. The ADP Algebra II EOC Standards comprise a core set of 41 benchmarks classified into five core content standards and seven optional modules.
The core content standards included in the ADP Algebra II EOC Exam are: Operations on Numbers and Expressions; Equations and Inequalities; Polynomial and Rational Functions; Exponential Functions; and Function Operations and Inverses. The optional modules are: Data and Statistics; Probability; Logarithmic Functions; Trigonometric Functions; Matrices; Conic Sections; and Sequences and Series. For purposes of this study, only the core standards of the ADP Algebra II EOC Exam were included in the analysis. The modules were excluded because they have not yet been part of an operational administration and will not be included in the initial standard setting process. June 30, 2009 B4-1
241 Methodology

Achieve compared the core ADP Algebra II EOC standards with the eleven sets of country standards along four dimensions:

- The first analysis compared the distribution of mathematics content across the Number, Algebra, Geometry/Measurement, and Data Analysis/Statistics strands.
- The second comparison evaluated the level of algebra content, classified as pre-algebra, basic algebra, or advanced algebra.
- The third analysis evaluated the international grade placement of the mathematics content (see the next section for further discussion of the International Grade Placement scale).
- The fourth analysis determined the complexity, or cognitive demand, of all of the reviewed standards.

The first dimension Achieve examined was the content included in each set of standards. To determine this, Achieve content specialists used an international scale created by Michigan State University as part of the Third International Mathematics and Science Study (TIMSS), in which each individual standard statement is assigned a set of fine-grain codes corresponding to the content and performance aspects of the statement. This detailed, comprehensive taxonomy allows the mathematics content to be organized, at its most general level, according to the following major strands (or domains) of mathematics: Number, Algebra, Geometry/Measurement, and Data Analysis/Statistics. These broad categories can also be broken down into smaller units to allow for more refined comparisons. For example, geometry content is divided into categories such as two-dimensional geometry and measurement; three-dimensional geometry and measurement; transformations, congruence, and similarity; and trigonometry.

The second dimension Achieve evaluated was the rigor of the algebra content of the standards, i.e., the content assigned TIMSS codes that fall under the Algebra strand.
Within Algebra, the content is broken down into three levels to determine its rigor: pre-algebra, basic algebra (typically taught in an Algebra I, or equivalent, course), and advanced algebra (typically taught in an Algebra II, or equivalent, course).

The third dimension Achieve evaluated was the approximate international grade placement of the mathematics content of the standards (i.e., the grade level at which the content is typically taught in other countries). Each code in the TIMSS taxonomy has a corresponding grade level at which the mathematics topic is first introduced to students (the International Grade Placement scale [1]). The distribution of grade levels associated with all of the standards gives a proxy rating for the difficulty of the content encountered in the standards. [2]

[1] The International Grade Placement scale is an index developed by Michigan State University as part of its TIMSS research.
[2] Standards statements may contain a wide range of expectations. A statement that mentions lower-level expectations as well as more advanced performances is assigned multiple performance codes that cover the range of expectations within that statement.
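As a worked illustration of the grade-placement computation, the mean IGP of a set of standards is simply the average of the index values for the topics its codes cover. The topic names and index values below are invented for illustration; they are not the actual Michigan State University index.

```python
# Hypothetical IGP index: each topic maps to the average grade at which it is
# first introduced across the countries in the 1995 TIMSS (made-up values).
IGP = {
    "linear equations": 8.2,
    "quadratic functions": 9.5,
    "rational expressions": 10.1,
    "inverse functions": 10.6,
}

def mean_grade_placement(topics):
    """Mean IGP over the topics coded in a set of standards -- a proxy
    rating for the overall difficulty of the content."""
    return round(sum(IGP[t] for t in topics) / len(topics), 2)

topics = ["linear equations", "quadratic functions",
          "rational expressions", "inverse functions"]
print(mean_grade_placement(topics))  # (8.2 + 9.5 + 10.1 + 10.6) / 4 = 9.6
```

A set of standards dominated by later-introduced topics would score higher on this index, which is how the report can compare the difficulty of whole documents with a single number.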
242 The final dimension Achieve considered was the complexity (or cognitive demand) of the performance contained in the standards, i.e., what each statement expects students to do with their knowledge and skills. In mathematics, students might need to perform a simple computation or procedural process (a low-level skill), or they can be challenged to reason through a multistep problem (a more complex skill). The reporting levels used to delineate the hierarchy of cognitive demand are as follows:

- Level 1 includes demonstrating basic knowledge or recall of a fact or property.
- Level 2 includes routine problem solving that asks students to do such things as compute, graph, measure, or apply a mathematical transformation.
- Level 3 includes estimating, comparing, classifying, and using data to answer a question, or requiring students to make decisions that go beyond a routine problem-solving activity.
- Level 4 includes asking students to formulate a problem or to strategize or critique a solution method.
- Level 5 includes asking students to develop algorithms, generalizations, conjectures, justifications, or proofs.
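To make the five-level hierarchy concrete, the sketch below tallies a handful of benchmark performance verbs into a demand profile. The verb-to-level mapping is a toy heuristic for illustration only; Achieve's ratings were assigned by content specialists, not by keyword matching.

```python
from collections import Counter

# Illustrative verb-to-level heuristic, loosely following the five reporting
# levels described above (recall -> 1, routine procedures -> 2, and so on).
LEVEL_OF_VERB = {
    "recall": 1, "identify": 1,
    "compute": 2, "graph": 2, "solve": 2,
    "compare": 3, "classify": 3,
    "formulate": 4, "critique": 4,
    "prove": 5, "generalize": 5,
}

def demand_profile(benchmark_verbs):
    """Percentage of benchmarks at each cognitive-demand level (1-5)."""
    levels = Counter(LEVEL_OF_VERB[v] for v in benchmark_verbs)
    total = len(benchmark_verbs)
    return {lvl: round(100 * levels.get(lvl, 0) / total) for lvl in range(1, 6)}

verbs = ["recall", "solve", "solve", "graph", "formulate", "compute", "identify"]
print(demand_profile(verbs))  # levels 3 and 5 come out at 0%, as for the EOC
```

A profile like this is what Graph 4 plots for each set of standards: a set with no entries at Levels 3 and 5 mirrors the pattern the report describes for the ADP Algebra II EOC benchmarks.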
243 Findings

Content Strands

The ADP Algebra II EOC standards were evaluated for the mathematics content they contain. The core EOC has only 3% of its benchmarks in the Geometry/Measurement domain, with no benchmarks in the Number or Data strands; ninety-seven percent of the 41 benchmarks are in the Algebra domain. This emphasis is not surprising, as the core Algebra II exam benchmarks were designed to focus on the algebra curriculum. Across the sets of international standards reviewed, the Number domain ranges from 1% to 8% of the respective standards, the Geometry/Measurement domain ranges from 9% to 60%, and Data ranges from 4% to 39%. Algebra is the primary domain in the international benchmarks, ranging from 23% to 70% of the countries' standards reviewed. The ADP Algebra II EOC exam is more heavily weighted in Algebra than the comparison sets of standards, but that is to be expected, as it has a more limited focus, specifically those concepts and skills expected to be mastered by students after an Algebra II or equivalent course. The analysis by mathematics content of the ADP Algebra II EOC exam compared to the international sets of standards is shown in Graph 1 below.

Graph 1: Mathematics Content by Strand (percentage of each set of standards in the Number, Algebra, Geometry/Measurement, and Data strands, for the ADP Algebra II Exam and each of the eleven countries)
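The strand percentages reported above amount to tallying the fine-grain codes assigned to each benchmark and normalizing by the total. A minimal sketch, using invented codes rather than the actual TIMSS taxonomy:

```python
from collections import Counter

# Illustrative mapping from fine-grain content codes to the four major
# strands; the real TIMSS taxonomy is far more detailed than this.
CODE_TO_STRAND = {
    "1.1": "Number",
    "1.6": "Algebra",
    "1.7": "Algebra",
    "1.3": "Geometry/Measurement",
    "1.10": "Data Analysis/Statistics",
}

def strand_distribution(coded_standards):
    """Given a list of standards, each a list of assigned content codes,
    return the percentage of code assignments falling in each strand."""
    tally = Counter()
    for codes in coded_standards:
        for code in codes:
            tally[CODE_TO_STRAND[code]] += 1
    total = sum(tally.values())
    return {strand: round(100 * n / total, 1) for strand, n in tally.items()}

# A single standard statement may carry several codes (see footnote 2 above).
example = [["1.6"], ["1.6", "1.7"], ["1.3"], ["1.6"]]
print(strand_distribution(example))  # Algebra-heavy, like the core EOC
```

An Algebra-dominated profile such as this one is exactly the shape the report describes for the core EOC benchmarks.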
244 Level of Algebra Content

The benchmarks considered part of the Algebra domain in the ADP Algebra II EOC standards and in the eleven sets of standards reviewed were assessed for the rigor of their algebra content (pre-algebra, basic algebra, or advanced algebra). Forty-three percent of the ADP Algebra II EOC standards are considered advanced algebra, which falls in the middle of the range of the other sets of standards reviewed. Japan's standards cover the most advanced algebra, with 78% of their algebra standards classified as advanced; Singapore has the least, at 36%. The Algebra domain of the ADP Algebra II EOC standards has 16% of its benchmarks classified as pre-algebra curriculum. The highest proportion of pre-algebra curriculum is in Hong Kong, at 29%, while the lowest is in the Finland curriculum, at 9%. On average, other countries place a greater emphasis on advanced algebra; in fact, in six countries, over 50% of the Algebra standards focus on advanced algebra, as shown in Graph 2. The algebra in the ADP Algebra II EOC standards falls into the middle of the pack in each category analyzed, with less than 50% of the test standards focusing on advanced algebra, making the ADP Algebra II EOC standards less rigorous than the standards from most of the other countries in the study.

Graph 2: Algebra Content Assessed in the Algebra Domain (percentage of each set of standards classified as pre-algebra, basic algebra, and advanced algebra)
245 International Grade Placement

The International Grade Placement (IGP) scale is a rough measure of when content is introduced, on average, in the countries that participated in the 1995 TIMSS. For example, an IGP score of 8.5 means, generally speaking, that the content is covered midway through the 8th grade year. Although some of the content covered by the ADP Algebra II EOC standards is taught in other countries as early as 7th grade and as late as the end of 11th grade, as shown by the whiskers on the graph, the grade level for the content centers on 9th grade. Only two of the eleven countries' standards have a lower average grade placement than the ADP Algebra II Exam, and they were not dramatically lower, with Thailand being the lowest. The country with the highest grade placement is Japan. It is important to note that the other countries' standards evaluated for this study are required for all students, whereas mastery of the ADP Algebra II EOC standards is not yet required by any state. Although some sets of standards have a larger spread in the grade levels at which particular skills are covered, the ADP Algebra II EOC Exam standards have a mean that is at the lower end of the expectations in other countries, as shown in Graph 3, indicating that the mathematics content of the ADP Algebra II EOC Exam is not as challenging as the content covered in the upper-level secondary standards of the other countries analyzed.

Graph 3: Grade Placement of Mathematics Content (IGP distributions, with means and whiskers, for the ADP Algebra II Exam, Alberta, China, Chinese Taipei, Finland, Hong Kong, Japan, Korea, Malaysia, New Zealand, Singapore, and Thailand)
246 Cognitive Demand

The ADP Algebra II EOC benchmarks were each evaluated for the level of cognitive demand required by the performance aspect of each benchmark. The exam benchmarks fall into three of the five levels of demand. Twenty-nine percent of the EOC benchmarks are considered Level 1, assessing lower-level skills such as recalling basic knowledge. Fifty-eight percent of the benchmarks are considered Level 2, assessing routine problem solving. Finally, 13% of the benchmarks are considered Level 4, assessing the formulation of a plan or the critique of a solution method. There are no benchmarks classified as Level 3 or Level 5 (non-routine procedures and advanced reasoning, respectively). The country standards analyzed vary in how their standards are distributed across the levels of cognitive demand. Only two sets of standards, Korea's and Singapore's, did not address all five levels. Across the comparison standards, Level 2 is most prominent. The analysis of each set of standards is shown in Graph 4. The ADP Algebra II EOC benchmarks are not as cognitively demanding as most of the other standards evaluated (10 of the sets of standards from other countries contain Level 5 content) and are therefore only average in comparison.

Graph 4: Level of Cognitive Demand (percentage of each set of standards at Level 1: Recalling; Level 2: Routine Procedures; Level 3: Non-Routine Procedures; Level 4: Formulating Problems / Strategizing Solutions; Level 5: Advanced Reasoning)
247 Summary

The ADP Algebra II End-of-Course Exam standards are well in line with international expectations; in fact, a number of countries have standards that are more rigorous. Although not the highest in any of the four analyses, the ADP Algebra II EOC Exam Standards are not the lowest in any of them either. With this in mind, the Algebra II exam (which was created as a rigorous exam in the United States, where it is not required of all students) is average when compared to international standards that apply to all upper secondary students in their respective countries. The standards from other countries demand that all of their students master at least this content, if not higher-level mathematics content.
Impact of Undocumented Populations on 2010 Congressional Reapportionment
Impact of Undocumented Populations on 2010 Congressional Reapportionment September 19, 2007 Orlando J. Rodriguez, M.A. Manager/Connecticut State Data Center College of Liberal Arts and Sciences University
2014 Year in Review State Policies Impacting CTE. Catherine Imperatore, ACTE Andrea Zimmermann, NASDCTEc February 5, 2015
2014 Year in Review State Policies Impacting CTE Catherine Imperatore, ACTE Andrea Zimmermann, NASDCTEc February 5, 2015 Background and Context Federal policies Perkins WIOA HEA Senate and Congressional
Enrollment Snapshot of Radiography, Radiation Therapy and Nuclear Medicine Technology Programs 2013
Enrollment Snapshot of Radiography, Radiation Therapy and Nuclear Medicine Technology Programs 2013 A Nationwide Survey of Program Directors Conducted by the American Society of Radiologic Technologists
Enrollment Snapshot of Radiography, Radiation Therapy and Nuclear Medicine Technology Programs 2014
Enrollment Snapshot of, Radiation Therapy and Nuclear Medicine Technology Programs 2014 January 2015 2015 ASRT. All rights reserved. Reproduction in any form is forbidden without written permission from
Meeting Oregon s new high school math graduation requirements: examining student enrollment and teacher availability
ISSUES& ANSWERS REL 2012 No. 126 At Education Northwest Meeting Oregon s new high school math graduation requirements: examining student enrollment and teacher availability ISSUES& ANSWERS REL 2012 No.
State Insurance Information
State Insurance Information Alabama 201 Monroe St. Suite 1700 Montgomery, AL 36104 334-269-3550 fax:334-241-4192 http://www.aldoi.org/ Alaska Dept. of Commerce, Division of Insurance. P.O. Box 110805 Juneau,
Summary of State Educational Requirements for International Dentists
Summary of State Educational Requirements for International Dentists This information is subject to change the reader must contact the state dental board directly to verify the current licensure requirements
Enrollment Snapshot of Radiography, Radiation Therapy and Nuclear Medicine Technology Programs 2012
Enrollment Snapshot of, and Nuclear Medicine Programs 2012 A Nationwide Survey of Program Directors Conducted by the American Society of Radiologic Technologists January 2013 2012 ASRT. All rights reserved.
Affordable Care Act and Bariatric Surgery. John Morton, MD, MPH, FASMBS, FACS Chief, Bariatric and Minimally Invasive Surgery Stanford University
Affordable Care Act and Bariatric Surgery John Morton, MD, MPH, FASMBS, FACS Chief, Bariatric and Minimally Invasive Surgery Stanford University Tertiary Prevention Prevention of disease progression and
U.S. Department of Housing and Urban Development: Weekly Progress Report on Recovery Act Spending
U.S. Department of Housing and Urban Development: Weekly Progress Report on Recovery Act Spending by State and Program Report as of 3/7/2011 5:40:51 PM HUD's Weekly Recovery Act Progress Report: AK Grants
Home Schooling Achievement
ing Achievement Why are so many parents choosing to home school? Because it works. A 7 study by Dr. Brian Ray of the National Home Education Research Institute (NHERI) found that home educated students
