Maths Mastery in Primary Schools



Institute of Education, University of London
John Jerrim

Evaluation Summary
Age range: Year 7
Number of pupils: c. 10,000
Number of schools: 50
Design: Randomised controlled trial with randomisation at the school level
Primary outcome: Numeracy

Significance
Mathematics Mastery is based on an approach to teaching mathematics originally developed by the Singapore Ministry of Education. The Mathematics Mastery model is distinctive in two ways. First, it aims to give pupils a thorough understanding of mathematical concepts, rather than a set of techniques or routines for reaching the right answer. Mathematics Mastery shows that problems can be solved in a variety of ways, and ensures that pupils learn in sequence: first by manipulating real objects, then by drawing pictorial representations, and finally by using mathematical symbols. Second, Mathematics Mastery uses a 'mastery' approach, in which teachers do not move on until all pupils have acquired a basic understanding of the current topic. The course is also designed so that more able pupils can explore each topic in depth and therefore remain engaged.

The Institute of Education will conduct an independent evaluation using rigorous design and methods. The evaluation focuses on establishing an unbiased estimate of the impact of the intervention on short-term (after one year of treatment) and long-term (end of secondary school) academic outcomes, measured by performance on mathematics tests. The study will include an impact evaluation and a process evaluation.

Research plan: Impact evaluation

Research questions
The research question is: what is the impact of Maths Mastery on children's ability in mathematics over the short and long term?

Design
The design is a cluster randomised controlled trial, with random allocation at the school level. ARK will develop a pool of 50 schools that would like to try Maths Mastery. Approximately 25 secondary schools will be drawn at random from this pool and defined as the treatment group. The 25 secondary schools in the pool that are not selected to receive the programme will be the control group. All children and teachers in a treatment school will be required to use the programme, to avoid selection problems.

We regard 25 treatment schools as the minimum necessary to detect an effect of 0.2 of a standard deviation in maths test scores at the 95% confidence level. This assumes:
(i) an intra-class correlation (ICC) of ρ = 0.15 at the school level;
(ii) that children's Key Stage 2 test scores are used as a pre-test in the analysis (OLS regression or ANCOVA), and that the correlation between this pre-test and the post-test is 0.75.
A higher ICC, or a weaker association between Key Stage 2 scores and the post-test, will reduce statistical power.
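These assumptions can be checked against a standard minimum detectable effect size (MDES) formula for a two-level cluster-randomised trial. The sketch below is illustrative rather than part of the protocol: it assumes 80% power, roughly 200 Year 7 pupils per school (c. 10,000 pupils across 50 schools), equal cluster sizes, and that the Key Stage 2 pre-test explains the square of the stated 0.75 correlation at both the school and the pupil level.

    # A minimal sketch of the power calculation implied by the assumptions above.
    # Uses the standard MDES formula for a two-level cluster-randomised trial
    # with a normal approximation to the t multiplier. The 80% power target and
    # equal cluster sizes are assumptions of this sketch, not of the protocol.
    from math import sqrt
    from statistics import NormalDist

    def mdes(n_schools, pupils_per_school, icc, pretest_corr,
             alpha=0.05, power=0.80, treated_share=0.5):
        """Minimum detectable effect size, in standard deviation units."""
        z = NormalDist()
        multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
        r2 = pretest_corr ** 2          # variance explained by the KS2 pre-test
        p = treated_share * (1 - treated_share)
        between = icc * (1 - r2) / (p * n_schools)
        within = (1 - icc) * (1 - r2) / (p * n_schools * pupils_per_school)
        return multiplier * sqrt(between + within)

    # 50 schools, roughly 200 Year 7 pupils each, ICC = 0.15, pre/post r = 0.75
    print(round(mdes(50, 200, 0.15, 0.75), 2))   # about 0.21

Under these illustrative assumptions the MDES comes out at roughly 0.2 standard deviations, consistent with the figure quoted above.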

By February 2013, ARK will have recruited all schools (treatment and control) that are part of the Maths Mastery programme. The IoE team will then conduct the randomisation in mid-February 2013. If ARK is unable to recruit all 50 schools by this date, a second round of recruitment will take place, with the IoE conducting a subsequent round of randomisation of these additional schools.

In September 2013 there will be a treatment and a control group of Year 7 pupils in these schools. For both groups, Key Stage 2 maths test scores will act as the baseline assessment. The control and treatment group pupils will be tested in June 2014, once the treatment group has experienced one full academic year of the Maths Mastery programme. It is essential that the control schools receive no intervention in the first year of the programme. The Progress in Maths tests (provided by GL Assessment) are likely to be used. An external contractor (e.g. NatCen) will be responsible for ensuring all schools complete and return these tests on time. Comparison will then be made between the achievement gains in mathematics of Year 7 pupils in the treatment and control schools.

In May/June 2018, children in treatment and control schools will sit national maths exams. The IoE team will examine the long-run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between the treatment and control groups. This information will be taken from the National Pupil Database, which will be linked to children's Maths Mastery test scores (collected in 2013 and 2014) by NatCen on behalf of the EEF.

Analysis
Our analysis strategy will follow the intention-to-treat principle. All children within the schools that are randomised will be included in the analysis; even if a school withdraws from the intervention, the data on the children participating in the study will be collected (where possible) and included in the analyses. The mean score will be compared between treatment and control groups, controlling for children's performance on their Key Stage 2 maths test, with robust standard errors that take into account clustering at the school level. This will be supplemented by additional analysis (e.g. OLS regression) where appropriate. A 95% confidence interval for the difference in test scores between the intervention and control groups will be reported. Progress measures (i.e. the change in test scores between baseline and follow-up) will also be considered.
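A minimal sketch of the form this headline model could take is given below. It assumes a pupil-level dataset with hypothetical file and column names (posttest, ks2, treated, school_id) and uses statsmodels; it illustrates the analysis described above rather than prescribing the evaluators' code.

    # Sketch of the headline ANCOVA model: post-test on treatment and the KS2
    # pre-test, with standard errors clustered at the school level.
    # File name and column names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    pupils = pd.read_csv("pupil_level_data.csv")
    pupils = pupils.dropna(subset=["posttest", "ks2", "treated", "school_id"])

    model = smf.ols("posttest ~ treated + ks2", data=pupils).fit(
        cov_type="cluster", cov_kwds={"groups": pupils["school_id"]}
    )

    # Treatment effect with its cluster-robust 95% confidence interval
    print(model.params["treated"])
    print(model.conf_int().loc["treated"])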

The primary outcome is children's overall performance on the maths test they will sit at the end of Year 7. For secondary outcomes, ARK (the intervention provider) will identify particular questions and sub-scales of interest that they believe the Maths Mastery programme will particularly influence. They will inform the Institute of Education of these before any analysis of the test score data is conducted. The Institute of Education will then compare children's performance on these questions and sub-scales as secondary outcomes. Corrections for multiple testing will be applied where appropriate.
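One common way of applying such a correction across a family of secondary outcomes is the Holm step-down procedure. The sketch below is purely illustrative: the protocol does not commit to a particular method, and the p-values shown are placeholders, not trial results.

    # Illustrative Holm correction for a family of secondary-outcome p-values.
    # The p-values below are placeholders, not results from the trial.
    from statsmodels.stats.multitest import multipletests

    secondary_pvalues = [0.012, 0.034, 0.20, 0.41]   # one per sub-scale (hypothetical)
    reject, adjusted, _, _ = multipletests(secondary_pvalues, alpha=0.05, method="holm")

    for raw, adj, sig in zip(secondary_pvalues, adjusted, reject):
        print(f"raw p = {raw:.3f}, Holm-adjusted p = {adj:.3f}, significant: {sig}")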

Research plan: Process evaluation

The process evaluation will start at an early project training meeting in the summer term of 2012-13 and continue through the academic year 2013-14. The evaluation design is qualitative, involving questionnaires, focus groups, semi-structured interviews and classroom observations.

Research questions
The process evaluation will be conducted alongside the quantitative evaluation to identify which elements of the Maths Mastery programme appear significant in supporting success. More specifically, the qualitative analysis will aim to provide insight into:
(i) how the Maths Mastery programme affected teachers' ideas on issues related to the programme, for example their role within a department and wider communities of practice, the use of problem solving, the notion of mathematical ability, and the effective design and use of tasks;
(ii) how those ideas translated into practice in the classroom.

Spring/Summer 2013
Decisions about which schools will be in the treatment and control groups for the quantitative evaluation will be made in February 2013. All schools in both groups will then be invited to complete the initial questionnaire before that information is disseminated. An online system such as SurveyMonkey will be used. A covering statement will explain the purpose of the questionnaire and how fair implementation will be ensured. The questionnaire will take no longer than 15 minutes to complete and will aim to identify current practice in areas relevant to the Maths Mastery programme, for example: use of images in teaching; emphasis on powerful ideas as compared to procedures; attitude towards teaching through problem solving; and approaches to providing for low and high achievers. It will also seek teachers' perceptions of factors that support or constrain the development of their practice.

In addition, data will be sought separately to identify other factors that might account for successful performance during the programme, beyond the impact of the Maths Mastery programme itself. For example, because the treatment is randomly allocated, schools with a high baseline of prior achievement, a high proportion of teachers with first degrees in mathematics, or a low proportion of pupils receiving free school meals will be present in both the treatment and the control group, and might be expected to perform well irrespective of the influence of the Maths Mastery programme.

Drawing on the knowledge of the Maths Mastery project team, alongside data from the questionnaires and baseline data from the quantitative element of the evaluation, five schools will be identified as the focus of the process evaluation. In making the selection, we will aim to identify contrasting schools according to their current practice and other factors.

Baseline data collection will take place prior to the project training in the summer of 2013. During this period the IOE process evaluation team will visit the five schools in order to conduct:
- Baseline observations of two lessons from each of two teachers in their Year 7 classrooms in the sample of five schools. These early observations will establish a baseline of practice. An observation schedule will be devised to provide useful feedback to the teachers as well as evaluation data for the programme.
- Baseline focus group discussions to identify issues salient to teachers in their perception of the implementation of the programme. These focus groups will be departmentally based, with teachers from the same school in the same focus group, but may not involve all teachers in a department.

2013-2014
During 2013-14, further data collection will take place to identify changes in teachers' practice and beliefs by conducting:
- Final observations of two lessons from each of two teachers in their Year 7 classrooms in the sample of five schools, to identify how aspects of the intervention have been adopted.
- Final focus group discussions to reflect on the effectiveness of the programme.

Towards the end of the evaluation year, a final questionnaire will be given to teachers across the programme in all 50 schools (whether in the treatment or control group). The aim of this questionnaire will be to test inferences drawn from the qualitative analysis about the factors that seemed to be important in the success of the Maths Mastery programme. By comparing the presence of such factors across more and less successful schools, according to the quantitative analysis, it may be possible to ascertain how consistently effective such factors were. Similarly, it might be informative to examine the presence of these same factors in schools not participating in the programme. The final questionnaire will also enable us, by comparison with the initial questionnaire, to identify changes in teachers' practices and attitudes in aspects critical to the aims of the programme.

Finally, the IOE process team will carry out telephone interviews with two teachers from each of the five sample schools to discuss issues that have arisen in the focus groups, in the observations and from the final questionnaires.

2014/15 and beyond
Although 2014/15 lies outside the evaluation year, we would hand over all instruments used (i.e. questionnaires, interview and observation schedules) for future use if appropriate and desirable. We would ensure that all data are collected and maintained during the evaluation year according to BERA ethical standards, in such a way that the report and the data could be used for publications by the Maths Mastery programme, with clear statements about the role of the IOE. Similarly, the IOE would wish to reserve the right to publish papers from the data within the constraints dictated by ethical and professional standards agreed between the IOE and Maths Mastery teams.

Summary of process evaluation data collection
- Baseline questionnaire: all teachers (treatment and comparison groups)
- Baseline focus groups: sample of 5 treatment group schools; 5 meetings, one in each school or in a hub meeting
- Baseline observations: 5 schools x 2 teachers
- Final questionnaire: all teachers
- Final focus groups: 5 meetings, one in each school or in a hub meeting
- Final observations: 5 schools x 2 teachers
- Final interviews: 5 schools x 2 teachers

Personnel
Statistical evaluation: Dr John Jerrim, Professor Lorraine Dearden
Process evaluation: Professor Dave Pratt, Professor Candia Morgan, Dr Cosette Crisan and Dr Cathy Smith

Roles and responsibilities
- JJ: analysis and reporting of the trial
- LD: project management of the statistical evaluation
- DP: lead of the process evaluation
- CM: process evaluation
- CC: process evaluation
- CS: process evaluation

Timeline: quantitative evaluation

February 2013: ARK recruited 34 schools. IoE randomly assigned them to treatment and control (see the allocation sketch after this timeline).
March-May 2013: ARK to recruit 16 additional schools.
End May 2013: IoE to randomise the 16 additional schools to treatment and control.
September 2013: ARK to secure permission from schools for access to pupils' Unique Pupil Numbers (UPN) and all other information available in the National Pupil Database (NPD) for the purposes of the evaluation. ARK also to ask schools to complete a short spreadsheet with key information about Year 7 pupils within the school (UPN, name, date of birth, gender, ethnicity).
June 2014: NatCen (in conjunction with ARK) to contact all treatment and control schools, reminding them about the test to be conducted and informing them that it must be conducted in one specific week in June (provisional date: week beginning 23 June 2014). ARK to provide a list of main contacts within all treatment and control schools to NatCen, the IoE and the test provider (e.g. GL Assessment). GL Assessment will send all test packs to the lead contact within each school. NatCen will be responsible for making sure all tests are completed that week, will chase up any non-responding school to complete the test by the end of the following week at the latest, and will ensure all exam scripts are sent to the test provider (e.g. GL Assessment) for marking.
July 2014: The test provider (e.g. GL Assessment) will mark all exam scripts and provide the Institute of Education with a spreadsheet (e.g. CSV) that includes children's overall test scores, whether they got each question right or wrong, and information on any relevant sub-scale. While the tests are being marked, ARK will identify the particular questions and scales they are interested in as secondary outcomes, and will inform the Institute of Education of these before the statistical analysis begins.
August-October 2014: Institute of Education to carry out data cleaning and statistical analysis. Draft report produced and provided to the EEF for comments.
October half-term 2014: Treatment and control schools provided with feedback on children's test performance.
December 2014: Final report produced by the Institute of Education and provided to the EEF.
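The two randomisation rounds above amount to drawing treatment schools at random, without replacement, from each recruited batch. A minimal allocation sketch follows, using hypothetical school identifiers and a fixed seed for reproducibility; the protocol does not specify the exact procedure or software the IoE team will use.

    # Sketch of school-level random allocation for one recruitment round.
    # School identifiers and the seed are hypothetical placeholders.
    import random

    def allocate(school_ids, seed=2013):
        """Randomly split a batch of schools into treatment and control halves."""
        rng = random.Random(seed)
        shuffled = school_ids[:]
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return shuffled[:half], shuffled[half:]   # (treatment, control)

    # The 34 schools recruited by February 2013
    round_one = [f"school_{i:02d}" for i in range(1, 35)]
    treatment, control = allocate(round_one)
    print(len(treatment), len(control))   # 17 and 17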

Roles of the organisations

Test provider (e.g. GL Assessment)
- Provide and deliver the test to the specified contact person within each school.
- Ensure all tests are linked to pupils' names and Unique Pupil Numbers (e.g. printed on the tests provided to schools).
- Provide the Institute of Education with a spreadsheet (e.g. a CSV file) that includes each pupil's overall test score, their score on any sub-domain, and whether they got each question right or wrong. The spreadsheet is to be delivered to the Institute of Education by the end of July 2014.

ARK
- Recruit 16 more schools by mid-May 2013, and provide this list to the Institute of Education.
- Get permission from schools for the Institute of Education and the EEF to use and link pupils' Unique Pupil Numbers.
- Get schools to complete a spreadsheet with key administrative data on each child (e.g. UPN, school name, pupil name, date of birth, gender, ethnicity).

NatCen
- In early June 2014, contact the main contact in each school to remind them that the test is coming, and inform schools that they have to complete the test within a certain week in June 2014.
- Liaise with GL Assessment to keep up to date with school responses.
- Chase up all non-responding schools to ensure a high response rate.
- Ensure all exam scripts are received by GL Assessment on time for marking.

Institute of Education (DoQSS)
- Randomly allocate schools.
- Draw up the basic pupil data spreadsheet for ARK (for schools to complete).
- Conduct the statistical analysis.
- Write the final report of the quantitative analysis for the EEF.

Institute of Education (David Pratt)
- Conduct the process evaluation.

EEF
- Project overview.
- Assist with permission to use NPD data for the purposes of the evaluation.
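All of these data flows are keyed on the Unique Pupil Number: the ARK pupil spreadsheet, the GL Assessment results file and the NPD extract containing Key Stage 2 scores are combined into the single pupil-level dataset analysed in the Analysis section. The sketch below shows one way such a linkage might look; the file names and column names are hypothetical placeholders, not deliverables defined by the protocol.

    # Illustrative linkage of the three pupil-level files on UPN.
    # File names and column names are hypothetical placeholders.
    import pandas as pd

    roster = pd.read_csv("ark_pupil_spreadsheet.csv")    # UPN, school_id, name, dob, gender, ethnicity
    results = pd.read_csv("gl_assessment_results.csv")   # UPN, posttest, sub-scale scores
    npd = pd.read_csv("npd_extract.csv")                 # UPN, ks2 (Key Stage 2 maths score)

    analysis = (roster.merge(results, on="UPN", how="left")
                      .merge(npd, on="UPN", how="left"))

    # Pupils without a returned post-test are kept in the file by the left
    # merges, in line with the intention-to-treat principle described above.
    print(analysis.shape)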