Maths Mastery: Evidence versus Spin


On Friday 13 February, the Education Endowment Foundation (EEF) published the long-awaited evaluation reports of two randomised control trials (RCTs) of Mathematics Mastery, an Ark-sponsored programme and recipient of one of the EEF’s first tranche of awards back in 2011.

EEF, Ark and Mathematics Mastery each published a press release to mark the occasion but, given the timing, none attracted attention from journalists and they were discussed only briefly on social media.

The main purpose of this post is to distinguish evidence from spin, to establish exactly what the evaluations tell us – and what provisos should be attached to those findings.

The post is organised into three main sections which deal respectively with:

  • Background to Mathematics Mastery
  • What the evaluation reports tell us and
  • What the press releases claim

The conclusion sets out my best effort at a balanced summary of the main findings. (There is a page jump here for those who prefer to cut to the chase.)

This post is written by a non-statistician for a lay audience. I look to specialist readers to set me straight if I have misinterpreted any statistical techniques or findings.

What was published?

On Friday 13 February the EEF published six different documents relevant to the evaluation:

  • A press release: ‘Low-cost internet-based programme found to considerably improve reading ability of year 7 pupils’.
  • A blog post: ‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’.
  • An updated Maths Mastery home page (also published as a pdf Project Summary in a slightly different format).
  • The full Primary Evaluation Report
  • The full Secondary Evaluation Report and
  • An ‘Overarching Summary Report’ covering both trials.

The last three of these were written by the Independent Evaluators – Jerrim and Vignoles (et al) – employed through the UCL Institute of Education.

The Evaluators also refer to ‘a working paper documenting results from both trials’ available in early 2015 from http://ideas.repec.org/s/qss/dqsswp.html and www.johnjerrim.com. At the time of writing this is not yet available.

Press releases were issued on the same day by:

  • Ark and
  • Mathematics Mastery.

All of the materials published to date are included in the analysis below.

Background to Maths Mastery

What is Maths Mastery?

According to the NCETM (October 2014) the mastery approach in mathematics is characterised by certain common principles:

‘Teachers reinforce an expectation that all pupils are capable of achieving high standards in mathematics.

  • The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.
  • Teaching is underpinned by methodical curriculum design and supported by carefully crafted lessons and resources to foster deep conceptual and procedural knowledge.
  • Practice and consolidation play a central role. Carefully designed variation within this builds fluency and understanding of underlying mathematical concepts in tandem.
  • Teachers use precise questioning in class to test conceptual and procedural knowledge, and assess pupils regularly to identify those requiring intervention so that all pupils keep up.

The intention of these approaches is to provide all children with full access to the curriculum, enabling them to achieve confidence and competence – ‘mastery’ – in mathematics, rather than many failing to develop the maths skills they need for the future.’

The NCETM paper itemises six key features, which I paraphrase as:

  • Curriculum design: Relatively small, sequenced steps which must each be mastered before learners move to the next stage. Fundamental skills and knowledge are secured first and these often need extensive attention.
  • Teaching resources: A ‘coherent programme of high-quality teaching materials’ supports classroom teaching. There is particular emphasis on ‘developing deep structural knowledge and the ability to make connections’. The materials may include ‘high-quality textbooks’.
  • Lesson design: Often involves input from colleagues drawing on classroom observation. Plans set out in detail ‘well-tested methods’ of teaching the topic. They include teacher explanations and questions for learners.
  • Teaching methods: Learners work on the same tasks. Concepts are often explored together. Technical proficiency and conceptual understanding are developed in parallel.
  • Pupil support and differentiation: Is provided through support and intervention rather than through the topics taught, particularly at early stages. High attainers are ‘challenged through more demanding problems which deepen their knowledge of the same content’. Issues are addressed through ‘rapid intervention’ commonly undertaken the same day.
  • Productivity and practice: Fluency is developed from deep knowledge and ‘intelligent practice’. Early learning of multiplication tables is expected. The capacity to recall facts from long term memory is also important.

Its Director published a blog post (October 2014) arguing that our present approach to differentiation has ‘a very negative effect’ on mathematical attainment and that this is ‘one of the root causes’ of our performance in PISA and TIMSS.

This is because it negatively affects the ‘mindset’ of low attainers and high attainers alike. Additionally, low attainers are insufficiently challenged and get further behind because ‘they are missing out on some of the curriculum’. Meanwhile high attainers are racing ahead without developing fluency and deep understanding.

He claims that these problems can be avoided through a mastery approach:

‘Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace, allowing them all full access to the curriculum by focusing on developing deep understanding and secure fluency with facts and procedures, and providing differentiation by offering rapid support and intervention to address each individual pupil’s needs.’

But unfortunately he stops short of explaining how, for high attainers, exclusive focus on depth is preferable to a richer blend of breadth, depth and pace, combined according to each learner’s needs.

NCETM is careful not to suggest that mastery is primarily focused on improving the performance of low-attaining learners.

It has published separate guidance on High Attaining Pupils in Primary Schools (registration required), which advocates a more balanced approach, although that predates this newfound commitment to mastery.

NCETM is funded by the Department for Education. Some of the comments on the Director’s blog post complain that it is losing credibility by operating as a cheerleader for Government policy.

Ark’s involvement

Ark is an education charity and multi-academy trust with an enviable reputation.

It builds its approach on six key principles, one of which is ‘Depth before breadth’:

‘When pupils secure firm foundations in English and mathematics, they find the rest of the curriculum far easier to access. That’s why we prioritise depth in these subjects, giving pupils the best chance of academic success. To support fully our pupils’ achievement in maths, we have developed the TES Award winning Mathematics Mastery programme, a highly-effective curriculum and teaching approach inspired by pupil success in Singapore and endorsed by Ofsted. We teach Mathematics Mastery in all our primary schools and at Key Stage 3 in a selection of our secondary schools. It is also being implemented in over 170 schools beyond our network.’

Ark’s 2014 Annual Report identifies five priorities for 2014/15, one of which is:

‘…developing curricula to help ensure our pupils are well prepared as they go through school… codifying our approach to early years and, building on the success of Maths Mastery, piloting an English Mastery programme…’

Mathematics Mastery is a charity in its own right. Its website lists 15 staff, a high-powered advisory group and three partner organisations:  Ark, the EEF (presumably by virtue of the funded evaluation) and the ‘Department for Education and the Mayor of London’ (presumably by virtue of support from the London Schools Excellence Fund).

NCETM’s Director sits on Mathematics Mastery’s Advisory Board.

Ark’s Chief Executive is a member of the EEF’s Advisory Board.

Development of Ark’s Maths Mastery programme

According to this 2012 report from Reform, which features Maths Mastery as a case study, it originated in 2010:

‘The development of Mathematics Mastery stemmed from collaboration between six ARK primary academies in Greater London, and the mathematics departments in seven separate ARK secondary academies in Greater London, Portsmouth and Birmingham. Representatives from ARK visited Singapore to explore the country’s approach first-hand, and Dr Yeap Ban Har, Singapore’s leading expert in maths teaching, visited King Solomon Academy in June 2011.’

In October 2011, EEF awarded Ark a grant of £600,000 for Maths Mastery, one of its first four awards.

The EEF’s press release says:

‘The third grant will support an innovative and highly effective approach to teaching children maths called Mathematics Mastery, which originated in Singapore. The programme – run by ARK Schools, the Academies sponsor, which is also supporting the project – will receive £600,000 over the next four years to reach at least 50 disadvantaged primary and secondary schools.’

Ark’s press release adds:

‘ARK Schools has been awarded a major grant by the Education Endowment Foundation (EEF) to further develop and roll out its Mathematics Mastery programme, an innovative and highly effective approach to teaching children maths based on Singapore maths teaching. The £600,000 grant will enable ARK to launch the programme and related professional development training to improve maths teaching in at least 50 disadvantaged primary and secondary schools.

The funding will enable ARK Schools to write a UK mathematics mastery programme based on the experience of teaching the pilot programme in ARK’s academies. ARK intends to complete the development of its primary modules for use from Sept 2012 and its secondary modules for use from September 2013. In parallel ARK is developing professional training and implementation support for schools outside the ARK network.’

The project home page on EEF’s site now says the total project cost is £774,000. It may be that the balance of £174,000 is the fee paid to the independent evaluators.

This 2012 information sheet says all Ark primary schools would adopt Maths Mastery from September 2012, and that its secondary schools have also devised a KS3 programme.

It describes the launch of a Primary Pioneer Programme from September 2012 and a Secondary Pioneer Programme from September 2013. These will form the cohorts to be evaluated by the EEF.

In 2013, Ark was awarded a grant of £617,375 from the Mayor of London’s London Schools Excellence Fund for the London Primary Schools Mathematics Mastery Project.

This is to support the introduction of Mastery in 120 primary schools spread across 18 London boroughs. (Another source gives the grant as £595,000.)

It will be interesting to see whether Maths Mastery (or English Mastery) features in the Excellence Fund’s latest project to increase primary attainment in literacy and numeracy. The outcomes of the EEF evaluations may be relevant to that impending decision.

Ark’s Mathematics Mastery today

The Mathematics Mastery website advertises a branded variant of the mastery model, derived from a tripartite ‘holistic vision’:

  • Deep understanding, through a curriculum that combines universal high expectations with spending more time on fewer topics and heavy emphasis on problem-solving.
  • Integrated professional development through workshops, visits, coaching and mentoring and ‘access to exclusive online teaching and learning materials, including lesson guides for each week’.
  • Teacher collaboration – primary schools are allocated a geographical cluster of 4-6 schools while secondary schools attend a ‘national collaboration event’. There is also an online dimension.

It offers primary and secondary programmes.

The primary programme has three particular features: use of objects and pictures prior to the introduction of symbols; a structured approach to the development of mathematical vocabulary; and heavy emphasis on problem-solving.

It involves one-day training sessions for school leaders, for the Maths Mastery lead and those new to teaching it, and for teachers undertaking the programme in each year group. Each school receives two support visits and attends three local cluster meetings.

Problem-solving is also one of three listed features of the secondary programme. The other two are fewer topics undertaken in greater depth, plus joint lesson planning and departmental workshops.

There are two full training days, one for the Maths Mastery lead and one for the maths department plus an evening session for senior leadership. Each school receives two support visits and attends three national collaborative meetings. They must hold an hour-long departmental workshop each week and commit to sharing resources online.

Both primary and secondary schools are encouraged to launch the programme across Year 1/7 and then roll it upwards ‘over several years’.

The website is not entirely clear but it appears that Maths Mastery itself is being rolled out a year at a time, so even the original primary early adopters will have provision only up to Year 3 and are scheduled to introduce provision for Year 4 next academic year. In the secondary sector, activity currently seems confined to KS3, and predominantly to Year 7.

The number of participating schools is increasing steadily but is still very small.

The most recent figures I could find are 192 (Maths Mastery, November 2014) or 193 – 142 primary and 51 secondary (Ark 2015).

One assumes that this total includes

  • An original tranche of 30 primary ‘early adopters’ including 21 not managed by Ark
  • 60 or so primary and secondary ‘Pioneer Schools’ within the EEF evaluations (ie the schools undertaking the intervention but not those forming the control group, unless they have subsequently opted to take up the programme)
  • The 120 primary schools in the London project
  • Primary and secondary schools recruited outwith the London and EEF projects, either alongside them or subsequently.

But the organisation does not provide a detailed breakdown, or show how these different subsets overlap.

They are particularly coy about the cost: there is nothing about this on the website.

The EEF evaluation reports say that 2FE primary schools and secondary schools will pay ‘an upfront cost of £6,000 for participating in the programme’.

With the addition of staff time for training, the per pupil cost for the initial year is estimated as £127 for primary schools and £50 for secondary schools.

The primary report adds:

‘In subsequent years schools are able to opt for different pathways depending on the amount of support and training they wish to choose; they also have ongoing access to the curriculum materials for additional year groups. The per pupil cost therefore reduces considerably, to below £30 per pupil for additional year groups.’

In EEF terms this is deemed a low cost intervention, although an outlay of such magnitude is a significant burden for primary schools, particularly when funding is under pressure, and might be expected to act as a brake on participation.

Further coyness is evident in respect of statutory assessment outcomes. Some details are provided for individual schools, but there is precious little about the whole cohort.

All I could find was this table in the Primary Yearbook 2014-15.

[Table: EEF Maths Mastery performance]

It suggests somewhat better achievement at KS1 L2b and L3c than the national average, but there is no information about other Levels and, of course, the sample is not representative, so the comparison is of limited value.

An absence of more sophisticated analysis – combined with the impression of limited transparency for those not yet inside the programme – is likely to act as a second brake on participation.

There is a reference to high attainers in the FAQ on the website:

‘The Mathematics Mastery curriculum emphasises stretching through depth of understanding rather than giving the top end of pupils [sic] new procedures to cover.

Problem solving is central to Mathematics Mastery. The great thing about the problems is that students can take them as far as they can, so those children who grasp the basics quickly can explore tasks further. There is also differentiation in the methods used, with top-end pupils typically moving to abstract numbers more quickly and spending less time with concrete manipulatives or bar models. There are extension ideas and support notes provided with the tasks to help you with this.

A range of schools are currently piloting the programme, which is working well in mixed-ability classes, as well as in schools that have set groups.’

The same unanswered questions arise as with the NCETM statement above. Is ‘Maths Mastery’ primarily focused on the ‘long tail’, potentially at the expense of high attainers?

The IoE evaluators think so. The primary evaluation report says that:

‘Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers.’

It would be helpful to have clarity on this point.


How influential is Maths Mastery?

Extremely influential.

Much educational and political capital has already been invested in Maths Mastery, hence the peculiar significance of the results contained in the evaluation reports.

The National Curriculum Expert Panel espoused mastery in its ‘Framework for the National Curriculum‘ (December 2011), while ducking the consequences for ‘stretch and challenge’ for high attainers – so creating a tension that remains unresolved to this day.

Meanwhile, the mastery approach has already influenced the new maths programme of study, as the NCETM document makes clear:

‘The 2014 national curriculum for mathematics has been designed to raise standards in maths, with the aim that the large majority of pupils will achieve mastery of the subject…

… For many schools and teachers the shift to this ‘mastery curriculum’ will be a significant one. It will require new approaches to lesson design, teaching, use of resources and support for pupils.’

Maths Mastery confirms that its Director was on the drafting team.

Mastery is also embedded in the national collaborative projects being undertaken through the Maths Hubs. Maths Mastery is one of four national partners in the Hubs initiative.

Ministers have endorsed the Ark programme in their speeches. In April 2014, Truss said:

‘The mastery model of learning places the emphasis on understanding core concepts. It’s associated with countries like Singapore, who have very high-performing pupils.

And in this country, Ark, the academy chain, took it on and developed it.

Ark run training days for maths departments and heads of maths from other schools.

They organise support visits, and share plans and ideas online with other teachers, and share their learning with a cluster of other schools.

It’s a very practical model. We know not every school will have the time or inclination to develop its very own programmes – a small rural school, say, or single-class primary schools.

But in maths mastery, a big chain like Ark took the lead, and made it straightforward for other schools to adopt their model. They maintain an online community – which is a cheap, quick way of keeping up with the best teaching approaches.

That’s the sort of innovation that’s possible.

Of course the important thing is the results. The programme is being evaluated so that when the results come out headteachers will be able to look at it and see if it represents good value.’

In June 2014 she said:

‘This idea of mastery is starting to take hold in classrooms in England. Led by evidence of what works, teachers and schools have sought out these programmes and techniques that have been pioneered in China and East Asia….

…With the Ark Schools Maths Mastery programme, more than 100 primary and secondary schools have joined forces to transform their pupils’ experiences of maths – and more are joining all the time. It’s a whole school programme focused on setting high expectations for all pupils – not believing that some just can’t do it. The programme has already achieved excellent results in other countries.’

Several reputations are being built upon Maths Mastery, many jobs depend upon it and large sums have been invested.

It has the explicit support of one of the country’s foremost academy chains and is already impacting on national curriculum and assessment policy (including the recent consultation on performance indicators for statutory teacher assessment).

Negative or neutral evaluations could have significant consequences for all the key players and are unlikely to encourage new schools to join the Programme.

Hence there is pressure in the system for positive outcomes – hence the significance of spin.

What the EEF evaluations tell us


Evaluation Protocols

EEF published separate Protocols for the primary and secondary evaluations in April 2013. These are broadly in line with the approach set out in the final evaluation reports, except that both refer much more explicitly to subsequent longitudinal evaluation:

‘In May/June 2017/18 children in treatment and control schools will sit key stage 2 maths exams. The IoE team will examine the long–run effectiveness of the Maths Mastery programme by investigating differences in school average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2012 and 2013)’.

‘In May/June 2018 children in treatment and control schools will sit national maths exams. The IoE team will examine the long-run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2013 and 2014) by NATCEN.’

It is not clear whether the intention is to preserve the integrity of the intervention and control groups until the former have rolled out Mastery to all year groups, or simply to evaluate the long-term effects of the initial one-year interventions, allowing intervention schools to drop Mastery and control schools to adopt it, entirely as they wish.

EEF Maths Mastery Project Homepage

The EEF’s updated Maths Mastery homepage has been revised to reflect the outcomes of the evaluations. It provides the most accessible summary of those outcomes.

It offers four key conclusions (my emphases):

  • ‘On average, pupils in schools adopting Mathematics Mastery made a small amount more progress than pupils in schools that did not. The effect detected was statistically significant, which means that it is likely that the improvement was caused by the programme.’
  • ‘It is unclear whether the programme had a different impact on pupils eligible for free school meals, or on pupils with higher or lower attainment.’
  • ‘Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider.’
  • ‘The evaluations assessed the impact of the programme in its first year of adoption. It would be worthwhile to track the medium and long-term impact of the approach.’

A table is supplied showing the effect sizes and confidence intervals for overall impact (primary and secondary together), and for the primary and secondary interventions separately.

[Table: EEF effect sizes and confidence intervals]

The support materials for the EEF’s toolkit help to explain these judgements.

About the Toolkit tells us that:

‘Average impact is estimated in terms of the additional months’ progress you might expect pupils to make as a result of an approach being used in school, taking average pupil progress over a year as a benchmark.

For example, research summarised in the Toolkit shows that improving the quality of feedback provided to pupils has an average impact of eight months. This means that pupils in a class where high quality feedback is provided will make on average eight months more progress over the course of a year compared to another class of pupils who were performing at the same level at the start of the year. At the end of the year the average pupil in a class of 25 pupils in the feedback group would now be equivalent to the 6th best pupil in the control class having made 20 months progress over the year, compared to an average of 12 months in the other class.’

There is another table showing how to interpret this scale:

[Table: converting effect sizes into months of additional progress]

We can see from this that:

  • The overall Maths Mastery impact of +0.073 is towards the upper end of the ‘one month’s progress’ category.
  • The ‘primary vs comparison’ impact of +0.10 just scrapes into the ‘two months’ progress’ category.
  • The ‘secondary vs comparison’ impact of +0.06 is towards the middle of the ‘one month’s progress’ category.

All three are officially classed as ‘Low Effect’.
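
The Toolkit’s conversion from effect sizes into ‘months of additional progress’ can be sketched as a simple threshold rule. The cut-offs below are my assumption, reverse-engineered from the figures quoted in this post (roughly one extra month per 0.05 of effect size at the low end of the scale), not the EEF’s published table:

```python
# Approximate mapping from effect size to EEF 'months of additional
# progress'.  The 0.05-wide bands are an assumption inferred from the
# figures in this post: 0.10 -> 2 months; 0.099, 0.073 and 0.06 -> 1.

def months_progress(effect_size: float) -> int:
    """Approximate EEF months of progress for a small positive effect size."""
    return int(effect_size // 0.05)

for label, es in [("pooled", 0.073), ("primary (2 d.p.)", 0.10),
                  ("primary (3 d.p.)", 0.099), ("secondary", 0.06)]:
    print(f"{label}: {es:+.3f} -> {months_progress(es)} month(s)")
```

On this rule the pooled and secondary effects sit in the one-month band, while the primary effect crosses into the two-month band only when rounded to two decimal places.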

If we compare the effect size attributable to Maths Mastery with others in the Toolkit, it is evident that it ranks slightly above school uniform and slightly below learning styles.

A subsequent section explains that the overall impact rating is dependent on meta-analysis (again my emphases):

‘The findings from the individual trials have been combined using an approach called “meta-analysis”. Meta-analysis can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note that care is needed in interpreting meta-analysed findings.’

But we are not told how, in light of this, we are to exercise care in interpreting this particular finding. There are no explicit ‘health warnings’ attached to it.
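
The reports do not spell the pooling method out, but a standard fixed-effect (inverse-variance) meta-analysis, with standard errors back-derived from the reported confidence intervals, lands close to the reported pooled estimate. The secondary standard error of 0.039 below is my assumption, chosen purely to illustrate the arithmetic; the evaluators’ actual method and inputs may differ:

```python
import math

# Illustrative fixed-effect (inverse-variance) meta-analysis of the two
# trials.  The primary standard error is back-derived from its reported
# 95% CI (-0.01 to +0.21); the secondary SE of 0.039 is a guessed value
# chosen so the arithmetic lands near the reported pooled +0.073.

def pool(effects, ses):
    weights = [1 / se ** 2 for se in ses]          # precision weights
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return est, (est - 1.96 * se, est + 1.96 * se)

primary_se = (0.21 - (-0.01)) / (2 * 1.96)         # ~0.056
est, (lo, hi) = pool([0.10, 0.06], [primary_se, 0.039])
print(f"pooled effect {est:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Pooling sharpens the estimate because the combined interval is narrower than either trial’s own, which is how a ‘just significant’ overall result can emerge from two individually insignificant ones.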

The homepage does tell us that:

‘Due to the ages of pupils who participated in the individual trials, the headline findings noted here are more likely to be predictive of programme’s impact on pupils in primary school than on pupils in secondary school.’

It also offers an explanation of why the effects generated from these trials are so small compared with those for earlier studies:

‘The findings were substantially lower than the average effects seen in the existing literature on “mastery approaches”. A possible explanation for this is that many previous studies were conducted in the United States in the 1970s and 80s, so may overstate the possible impact in English schools today. An alternative explanation is that the Mathematics Mastery programme differed from some examples of mastery learning previously studied. For example classes following the Mathematics Mastery approach did not delay starting new topics until a high level of proficiency had been achieved by all students, which was a key feature in a number of apparently effective programmes.’


There is clearly an issue with the 95% confidence intervals supplied in the first table above. 

The Technical Appendices to the Toolkit say:

‘For those concerned with statistical significance, it is still readily apparent in the confidence intervals surrounding an effect size. If the confidence interval includes zero, then the effect size would be considered not to have reached conventional statistical significance.’ (p6)

The table indicates that the lower bound of the 95% confidence interval is zero or below in all three cases, suggesting that none of these findings is statistically significant.

However, the homepage claims that the overall impact of both interventions, when combined through meta-analysis, is statistically significant.

And it fails entirely to mention that the impacts of the primary and the secondary interventions, taken separately, are statistically insignificant.

The explanation of the attribution of statistical significance to the two evaluations combined is that, whereas the homepage gives confidence intervals to two decimal places, the reports calculate them to a third decimal place.

This gives a lower value of 0.004 (ie four thousandths above zero).

This can be seen from the table annexed to the primary and secondary reports and included in the ‘Overarching Summary Report’.

[Table: effect sizes and confidence intervals to three decimal places]

The distinction is marginal, to say the least. Indeed, the Evaluation Reports say:

‘…the pooled effect size of 0.073 is just significantly different from zero at conventional thresholds’

Moreover, notice that the introduction of a third decimal place drags the primary effect size down to 0.099, officially consigning it to the ‘one month’s progress’ category rather than the two months quoted above.
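
To see how finely balanced these calls are, consider what two-decimal-place rounding does to the two borderline numbers quoted above (the pooled lower confidence bound of 0.004 and the primary effect size of 0.099):

```python
# Both borderline judgements hinge on rounding.  The pooled lower CI
# bound (0.004) is positive -- 'just significant' -- but rounds to 0.00
# at two decimal places, so the homepage table looks non-significant.
# Conversely, the primary effect size (0.099) rounds up to 0.10 at two
# decimal places, lifting it into the 'two months' progress' band.

lower_bound = 0.004
primary_effect = 0.099

print(lower_bound > 0)                # CI excludes zero: True
print(round(lower_bound, 2) > 0)      # still true at 2 d.p.? False
print(round(primary_effect, 2))       # as reported at 2 d.p.: 0.1
```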

This might appear to be dancing on the head of a statistical pin but, as we shall see later, the spin value of statistical significance is huge!

Overall there is a lack of clarity here that cannot be attributed entirely to the necessity for brevity. The attempt to conflate subtly different outcomes from the separate primary and secondary evaluations has masked these distinctions and distorted the overall assessment.


The full reports add some further interesting details which are summarised in the sections below.

Primary Evaluation Report 

[Table: EEF Maths Mastery table 4]

Key points:

  • In both the primary and secondary reports, additional reasons are given for why the effects from these evaluations are so much smaller than those from previous studies. These include the fact that:

‘…some studies included in the mastery section of the toolkit show small or no effects, suggesting that making mastery learning work effectively in all circumstances is challenging.’

The overall conclusion is an indirect criticism of the Toolkit, noting as it does that ‘the relevance of such evidence for contemporary education policy in England…may be limited’.

  • The RCT was undertaken across two academic years: In AY2012/13, 40 schools (Cohort A) were involved. Of these, 20 were randomly allocated the intervention and 20 the control. In AY2013/14, 50 schools (Cohort B) participated, 25 allocated the intervention and 25 the control. After the trial, control schools in Cohort A were free to pursue Maths Mastery. (The report does not mention whether this also applied to Cohort B.) It is not clear how subsequent longitudinal evaluation will be affected by such leakage from the control group.
  • The schools participating in the trial were recruited by Ark. They had to be state-funded and not already undertaking Maths Mastery:

‘Schools were therefore purposefully selected—they cannot be considered a randomly chosen sample from a well-defined population. The majority of schools participating in the trial were from London or the South East.’

  • Unlike the secondary evaluation, no process evaluation was conducted so it is not possible to determine the extent to which schools adhered to the prescribed programme. 
  • Baseline tests were administered after allocation between intervention and control, at the beginning of each academic year. Pupils were tested again in July. Evaluators used the Number Knowledge Test (NKT) for this purpose. The report discusses reasons why this might not be an accurate predictor of subsequent maths attainment and whether it is so closely related to the intervention as to be ‘a questionable measure of the success of the trial’. The discussion suggests that there were potential advantages to both the intervention and control groups but does not say whether one outweighed the other. 
  • The results of the post-test are summarised thus:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.10 standard deviations higher on the post-test. This, however, only reached statistical significance at the 10% level (t = 1.82; p = 0.07), with the 95% confidence interval ranging from -0.01 to +0.21. Within Cohort A, children in the treatment group scored (on average) +0.09 standard deviations above those children in the control group (confidence interval -0.06 to +0.24). The analogous effect in Cohort B was +0.10 (confidence interval -0.05 to 0.26). Consequently, although the Mathematics Mastery intervention may have had a small positive effect on children’s test scores, it is not possible to rule out sampling variation as an explanation.’
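As a rough check on the arithmetic, the reported confidence interval can be approximately reproduced from the effect size and t-statistic alone. A minimal sketch in Python, assuming a simple normal approximation (the evaluators may have used clustered standard errors or a t-distribution, so this is illustrative only):

```python
# Reconstruct the 95% confidence interval for the primary trial headline
# result from the reported effect size and t-statistic.
# Assumes the normal approximation (z = 1.96), which is an assumption on
# my part rather than the evaluators' stated method.

effect = 0.10   # standardised effect size (standard deviations)
t_stat = 1.82   # reported t-statistic

se = effect / t_stat             # implied standard error, roughly 0.055
lower = effect - 1.96 * se
upper = effect + 1.96 * se

print(round(lower, 2), round(upper, 2))  # -0.01 0.21, matching the report
```

The implied p-value of 0.07 then follows directly: the interval straddles zero, so the result misses significance at the conventional 5% threshold.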

  • The comparison of pre-test and post-test results was examined for evidence of differential effects for those with lower or higher prior attainment:

‘Estimates are again presented in terms of effect sizes. The interaction effect is not significantly different from zero, with the 95% confidence interval ranging from -0.01 to +0.02. Thus there is little evidence that the effect of Mathematics Mastery differs between children with different levels of prior achievement.’

The Report adds:

‘Recall that the Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers. Thus one might anticipate the intervention to be particularly effective in the bottom half of the test score distribution. There is some, but relatively little, evidence that the intervention was less effective for the bottom half of the test distribution.’

So, on this evidence, Maths Mastery is no more effective for the low achievers it is intended to help most. This is somewhat different from the suggestion on the homepage that the answer to this question is ‘unclear’.

Several limitations are discussed, but it is important to note that they are phrased in hypothetical terms:

  • Pupils’ progress was evaluated after one academic year:

‘This may be considered a relatively small ‘dose’ of the Mathematics Mastery programme’.

  • The intervention introduced a new approach to schools, so there was a learning curve which control schools did not experience:

‘With more experience teaching the programme it is possible that teachers would become more effective in implementing it.’

  • The test may favour either control schools or intervention schools.
  • Participating schools volunteered to take part, so it is not possible to say whether similar effects would be found in all schools.
  • It was not possible to control for balance – eg by ethnic background and FSM eligibility – between intervention and control. [This is now feasible so could potentially be undertaken retrospectively to check there was no imbalance.]

Under ‘Interpretation’, the report says:

‘Within the context of the wider educational literature, the effect size reported (0.10 standard deviations) would typically be considered ‘small’….

Yet, despite the modest and statistically insignificant effect, the Mathematics Mastery intervention has shown some promise.’

The phrase ‘some promise’ is justified by reference to the meta-analysis, to cost-effectiveness (a small effect size at low cost is preferable to the same outcome at higher cost) and to the fact that the impact of the entire programme has not yet been evaluated:

‘Third, children are likely to follow the Mathematics Mastery programme for a number of years (perhaps throughout primary school), whereas this evaluation has considered the impact of just the first year of the programme. Long-run effects after sustained exposure to the programme could be significantly higher, and will be assessed in a follow-up study using Key Stage 2 data.’

This is the only reference to a follow-up study. It is less definite than the statement in the assessment protocol and there is no further explanation of how this will be managed, especially given potential ‘leakage’ from the control group.

Secondary Evaluation Report

[Table 5 from the EEF secondary evaluation report]

Key points:

  • 50 schools were recruited to participate in the RCT during AY2013/14, with 25 randomly allocated to the intervention and 25 to the control. All Year 7 pupils in the intervention schools experienced the programme. As in the primary trial, control schools were eligible to access the programme after the end of the trial year. Interestingly, 3 of the 25 intervention schools (12%) dropped out before the end of the year – their reasons are not recorded. 
  • As in the primary trial, Ark recruited the participating schools – which had to be state-funded and new to Maths Mastery. Since schools were deliberately selected they could not be considered a random sample. The report notes:

‘Trial participants, on average, performed less well in their KS1 and KS2 examinations than the state school population as a whole. For instance, their KS1 average points scores (and KS2 maths test scores) were approximately 0.2 standard deviations (0.1 standard deviations) below the population mean. This seems to be driven, at least in part, by the fact that the trial particularly under-represented high achievers (relative to the population). For instance, just 12% of children participating in the trial were awarded Level 3 in their Key Stage 1 maths test, compared to 19% of all state school pupils in England.’

  • KS1 and KS2 test results were used as the baseline. The Progress in Maths (PiM) test was used to assess pupils at the end of the year, but about 40% of its questions cover content not included in the Y7 Maths Mastery curriculum, which potentially disadvantaged intervention pupils relative to the control group. PiM also includes a calculator section, although calculators are not used in Year 7 of Maths Mastery. It was agreed that breakdowns of results would be supplied to account for this.
  • On the basis of overall test results:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.055 standard deviations higher on the PiM post-test. This did not reach statistical significance at conventional thresholds (t = 1.20; p = 0.24), with the 95% confidence interval ranging from –0.037 to +0.147. Turning to the FSM-only sample, the estimated effect size is +0.066 with the 95% confidence interval ranging from –0.037 to +0.169 (p = 0.21). Moreover, we also estimated a model including a FSM-by intervention interaction. Results suggested there was little evidence of heterogeneous intervention effects by FSM. Consequently, although the Mathematics Mastery intervention may have had a small positive effect on overall PiM test scores, one cannot rule out the possibility that this finding is due to sampling variation.’

  • When the breakdowns were analysed:

‘As perhaps expected, the Mathematics Mastery intervention did not have any impact upon children’s performance on questions covering topics outside the Mathematics Mastery curriculum. Indeed, the estimated intervention effect is essentially zero (effect size = –0.003). In contrast, the intervention had a more pronounced effect upon material that was focused upon within the Mathematics Mastery curriculum (effect size = 0.100), just reaching statistical significance at the 5% level (t = 2.15; p = 0.04)’

  • The only analysis of the comparative performance of high and low attainers is tied to the parts of the test not requiring use of a calculator. It suggests a noticeably smaller effect in the top half of the attainment distribution, with no statistical significance above the 55th percentile. This is substantively different to the finding in the primary evaluation, and it raises the question whether secondary Maths Mastery needs adjustment to make it more suitable for high attainers.
  • The process evaluation focused principally on five schools from the intervention group. Focus group discussions were held before the intervention and again towards the end, telephone interviews were conducted and lessons were observed. The sample was selected to include schools of different sizes, with different FSM intakes, and schools making both poor and good progress in maths according to their most recent inspection report. One of the recommendations is that:

‘The intervention should consider how it might give more advice and support with respect to differentiation.’

  • The process evaluation adds further detail about suitability for high attainers:

‘Another school [E] also commented that the materials were also not sufficiently challenging for the highest-attaining children, who were frustrated by revisiting at length the same topics they had already encountered at primary school. Although this observation was also made in other schools, it was generally felt that the children gradually began to realise that they were in fact enjoying the subject more by gaining extra understanding.’

It is not clear whether this latter comment also extends to the high attainers!

A similar set of limitations is explored in similar language to that used in the primary report.

Under ‘Interpretation’ the report says:

‘Although point estimates were consistent with a small, positive gain, the study did not have sufficient statistical power to rule out chance as an explanation. Within the context of the wider educational literature, the effect size reported (less than 0.10 standard deviations) would typically be considered ‘small’…

But, as in the primary report, it detects ‘some promise’ on the same grounds. There is a similar speculative reference to longitudinal evaluation.


Press releases and blogs


EEF press release

There is a certain irony in the fact that ‘unlucky’ Friday 13 February was the day selected by the EEF to release these rather disappointing reports.

But Friday is typically the day selected by communications people to release educational news that is most likely to generate negative media coverage – and a Friday immediately before a school holiday is a particularly favoured time to do so, presumably because fewer journalists and social media users are active.

Unfortunately, the practice is at risk of becoming self-defeating, since everyone now expects bad news on a Friday, whereas they might be rather less alert on a busier day earlier in the week.

On this occasion Thursday was an exceptionally busy day for education news, with reaction to Miliband’s speech and a raft of Coalition announcements designed to divert attention from it. With the benefit of hindsight, Thursday might have been a better choice.

The EEF’s press release dealt with evaluation reports on nine separate projects, so increasing the probability that attention would be diverted away from Maths Mastery.

It led on a different evaluation report which generated more positive findings – the EEF seems increasingly sensitive to concerns that too many of the RCTs it sponsors are showing negligible or no positive effect, presumably because the value-for-money police may be inclined to turn their beady eye upon the Foundation itself.

But perhaps it also did so because Maths Mastery’s relatively poor performance was otherwise the story most likely to attract the attention of more informed journalists and commentators.

On the other hand, Maths Mastery was given second billing:

‘Also published today are the results of Mathematics Mastery, a whole-school approach which aims to deepen pupils’ conceptual understanding of key mathematical ideas. Compared to traditional curricula, fewer topics are covered in more depth and greater emphasis is placed on problem solving and encouraging mathematical thinking. The EEF trials found that pupils following the Mathematics Mastery programme made an additional month’s progress over a period of a year.’


EEF blog post

Later on 13 February EEF released a blog post written by a senior analyst which mentions Maths Mastery in the following terms:

‘Another finding of note is the small positive impact of teaching children fewer mathematical concepts, but covering them in greater depth to ensure ‘mastery’. The EEF’s evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only be evident in years to come as children are able to draw on their secure mathematical foundations to tackle more complex problems.’

EEF is consistently reporting a small positive impact but, as we have seen, this is rather economical with the truth. It deserves some qualification.

More interestingly though, the post adds (my emphases):

‘Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and which principles should underpin approaches on encouraging children in reading for pleasure are all issues that have important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.’

It will be interesting to monitor the impact of this work on the communication of outcomes from these particular evaluations.

It will be important to ensure that synthesis and dissemination is not at the expense of accuracy, particularly when ‘high stakes’ results are involved, otherwise there is a risk that users will lose faith in the independence of EEF and its willingness to ‘speak truth unto power’.


Maths Mastery Press Release

By also releasing their own posts on 13 February, Mathematics Mastery and Ark made sure that they too would not be picked up by the media.

They must have concluded that, even if they placed the most positive interpretation on the outcomes, they would find it hard to create the kind of media coverage that would generate increased demand from schools.

The Mathematics Mastery release – ‘Mathematics Mastery speeds up pupils’ progress – and is value for money too’ – begins with a list of bullet points citing other evidence that the programme works, so implying that the EEF evaluations are relatively insignificant additions to this comprehensive evidence base:

  • ‘Headteachers say that the teaching of mathematics in their schools has improved
  • Headteachers are happy to recommend us to other schools
  • Numerous Ofsted inspections have praised the “new approach to mathematics” in partner schools
  • Extremely positive evaluations of our training and our school development visits
  • We have an exceptionally high retention rate – schools want to continue in the partnership
  • Great Key Stage 1 results in a large number of schools.’

Much of this is hearsay, or else vague reference to quantitative evidence that is not published openly.

The optimistic comment on the EEF evaluations is:

‘We’re pleased with the finding that, looking at both our primary and secondary programmes together, pupils in the Mathematics Mastery schools make one month’s extra progress on average compared to pupils in the other schools after a one year “dose” of the programme…

…This is a really pleasing outcome – trials of this kind are very rigorous.  Over 80 primary schools and 50 secondary schools were involved in the testing, with over 4000 pupils involved in each phase.  Studies like this often don’t show any progress at all, particularly in the early years of implementation and if, like ours, the programme is aimed at all pupils and not just particular groups.  What’s more, because of the large sample size, the difference in scores between the Mathematics Mastery and other schools is “statistically significant” which means the results are very unlikely to be due to chance.’

The section I have emboldened is in stark contrast to the EEF blog post above, which has the title:

‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’

And so suggests exactly the opposite.

I have already shown just how borderline the calculation of ‘statistical significance’ has been.

The release concludes:

‘Of course we’re pleased with the extra progress even after a limited time, but we’re interested in long term change and long term development and improvement.  We’re determined to work with our partner schools to show what’s possible over pupils’ whole school careers…but it’s nice to know we’ve already started to succeed!’

[Embedded tweet]

There was a single retweet of the Tweet above, but from a particularly authoritative source (who also sits on Ark’s Advisory Group).


Ark Press Release

Ark’s press release – ‘Independent evaluation shows Mathematics Mastery pupils doing better than their peers’ – is even more bullish.

The opening paragraph claims that:

‘A new independent report from the independent Education Endowment Foundation (EEF) demonstrates the success of the Mathematics Mastery programme. Carried out by academics from Cambridge University and the Institute of Education, the data indicates that the programme may have the potential to halve the attainment gap with high performing countries in the far East.’

The second emboldened statement – that the programme ‘may have the potential to halve the attainment gap with high performing countries in the far East’ – is particularly brazen, since there is no evidence in either of the reports that would support such a claim. It is only true in the sense that any programme ‘may have the potential’ to achieve any particularly ambitious outcome.

Statistical significance is again celebrated, though it is important to give Ark credit for adding:

‘…but it is important to note that these individual studies did not reach the threshold for statistical significance. It is only at the combined level across 127 schools and 10,114 pupils that there are sufficient schools and statistical power to determine an effect size of 1 month overall.’

Even if this rather implies that the individual evaluations were somehow at fault for being too small and so not generating ‘sufficient statistical power’.

Then the release returns to its initial theme:

‘… According to the OECD, by age fifteen, pupils in Singapore, Japan, South Korea and China are three years ahead of pupils in England in mathematical achievement. Maths Mastery is inspired by the techniques and strategies used in these countries.

Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, this could be a sustained impact. A 2 month gain every primary year and 1 month gain every secondary year could see pupils more than one and a half years ahead by age 16 – halving the gap with higher performing jurisdictions.’

In other words, Ark extrapolates equivalent gains – eschewing all statistical hedging – for each year of study, adding them together to suggest a potential 18 month gain.

It also seems to apply the effect to all participants rather than to the average participant.
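For illustration, Ark’s arithmetic can be reconstructed as follows. This is a sketch of the extrapolation being criticised, not an endorsement of it; the year counts are my assumption, since the release does not state whether Reception is included in the primary phase:

```python
# Reconstruct Ark's extrapolation: a 2-month gain for every primary year
# and a 1-month gain for every secondary year up to age 16.
# The year counts are assumptions: Years 1-6 (or Reception plus Years 1-6)
# for primary, Years 7-11 for secondary.

secondary_years = 5                         # Years 7-11, up to age 16
for primary_years in (6, 7):                # without / with Reception
    months = primary_years * 2 + secondary_years * 1
    print(primary_years, months)            # 17 or 19 months ahead
```

Only on the more generous assumption does the total clear the 18 months needed for ‘more than one and a half years’ – which underlines how loosely the claim is anchored to the evaluated one-year effect.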

This must have been a step too far, even for Ark’s publicity machine.

[Screenshot: the original version of the Ark press release]

They subsequently changed the final paragraph above – which one can still find in the version within Google’s cache – to read:

‘…Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, we expect this to be a sustained impact.  A longer follow-up study will be needed to investigate this.’

Even in sacrificing the misleading quantification, they could not resist bumping up ‘this could be a sustained impact’ to ‘we expect this to be a sustained impact’.


[Postscript: On 25 February, Bank of America Merrill Lynch published a press release announcing a £750,000 donation to Maths Mastery.

The final paragraph ‘About Maths Mastery’ says:

‘Mathematics Mastery is an innovative maths teaching framework, supporting schools, students and teachers to be successful at maths. There are currently 192 Mathematics Mastery partner schools across England, reaching 34,800 pupils. Over the next five years the programme aims to expand to 500 schools, and reach 300,000 pupils. Maths Mastery was recently evaluated by the independent Education Endowment Foundation and pupils were found to be up to two months ahead of their peers in just the first year of the programme. Longer term, this could see pupils more than a year and a half ahead by age 16 – halving the gap with pupils in countries such as Japan, Singapore and China.’

This exemplifies perfectly how such questionable statements are repurposed and recycled with impunity. It is high time that the EEF published a code of practice to help ensure that the outcomes of its evaluations are not misrepresented.]  

Conclusion

Representing the key findings

My best effort at a balanced presentation of these findings would include the key points below. I am happy to consider amendments, additions and improvements:

  • On average, pupils in primary schools adopting Mathematics Mastery made two months more progress than pupils in primary schools that did not. (This is a borderline result, in that it is only just above the score denoting one month’s progress. It falls to one month’s progress if the effect size is calculated to three decimal places.) The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • On average, pupils in secondary schools adopting Mathematics Mastery made one month more progress than pupils in secondary schools that did not. The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • When the results of the primary and secondary evaluations are combined through meta-analysis, pupils in schools adopting Maths Mastery made one month more progress than pupils in schools that did not. The effect is classified as ‘Low’. This outcome is marginally statistically significant, provided that the 95% confidence interval is calculated to three decimal places (but it is not statistically significant if calculated to two decimal places). Care is needed in interpreting meta-analysed findings because the two trials used different outcome tests with different age groups, so combining them assumes they are measuring a common underlying effect. 
  • There is relatively little evidence that the primary programme is more effective for learners with lower prior attainment, but there is such evidence for the secondary programme (in respect of non-calculator questions). There is no substantive evidence that the secondary programme has a different impact on pupils eligible for free school meals. 
  • The per-pupil cost is relatively low, but the initial outlay of £6,000 for primary schools with 2FE and above is not inconsiderable. Mathematics Mastery may represent a cost-effective change for schools to consider. 
  • The evaluations assessed the impact of the programme in its first year of adoption. It is not appropriate to draw inferences from the findings above to attribute potential value to the whole programme. EEF will be evaluating the medium and long-term impact of the approach through a follow-up study drawing on Key Stage 2 data.
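The marginal significance of the combined result can be illustrated with a standard fixed-effect (inverse-variance) meta-analysis, using standard errors reconstructed from the confidence intervals reported in the two evaluation reports. This is a sketch under a normal approximation, so the figures may differ slightly from the published meta-analysis:

```python
# Fixed-effect (inverse-variance) meta-analysis of the two trials.
# Effect sizes and 95% confidence intervals as reported; standard errors
# are reconstructed from the intervals assuming a normal approximation,
# which is my assumption rather than the evaluators' stated method.

trials = {
    "primary":   (0.10,  -0.01,  0.21),    # effect, CI lower, CI upper
    "secondary": (0.055, -0.037, 0.147),
}

weights, weighted_effects = [], []
for effect, lo, hi in trials.values():
    se = (hi - lo) / (2 * 1.96)            # half-width of the CI / 1.96
    w = 1 / se ** 2                        # inverse-variance weight
    weights.append(w)
    weighted_effects.append(w * effect)

pooled = sum(weighted_effects) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(round(pooled, 3),
      round(pooled - 1.96 * pooled_se, 3),
      round(pooled + 1.96 * pooled_se, 3))
# 0.074 0.003 0.144 -- the lower bound clears zero only at the
# third decimal place, hence 'marginally statistically significant'.
```

To two decimal places the lower bound is 0.00, which is why the significance of the combined result depends on the precision at which it is reported.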

In the meantime, it would be helpful for Ark and Maths Mastery to be much more transparent about KS1 assessment outcomes across their partner schools and possibly publish their own analysis based on comparison between schools undertaking the programme and matched control schools with similar intakes.

And it would be helpful for all partners to explain and evidence more fully the benefits to high attainers of the Maths Mastery approach – and to consider how it might be supplemented when it does not provide the blend of challenge and support that best meets their needs.

It is disappointing that, three years on, the failure of the National Curriculum Expert Panel to reconcile their advocacy for mastery with stretch and challenge for high attainers – in defiance of their remit to consider the latter as well as the former –  is being perpetuated across the system.

NCETM might usefully revisit their guidance on high attainers in primary schools to reflect their new-found commitment to mastery, while also incorporating additional material covering the point above.


Postscript

A summary of this piece, published by Schools Week, prompted two comments – one from Stephen Gorard, the other from Dylan Wiliam. The Twitter embed below is the record of a subsequent debate between us and some others, about design of the Maths Mastery evaluations, what they tell us and how useful they are, especially to policy makers.

One of the tweets contains a commitment on the part of Anna Vignoles to set up a seminar to discuss these issues further.

The widget stores the tweets in reverse order (most recent first). Scroll down to the bottom to follow the discussion in chronological order.

[Embedded Twitter widget]

GP

February 2015

High Attainment in the 2014 Secondary and 16-18 Performance Tables


This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.

Data Overload courtesy of opensourceway

It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them:

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to previous posts on:

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.


Headlines

At KS4:

  • High attainers constitute 32.4% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based on prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English; in 99 schools the same was true of maths. Only 21 schools managed 100% success in both English and maths. At the other extreme, there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers do not make the expected progress in maths and one in five do not do so in English. In free schools one in every five high attainers falls short in English as do one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.


Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
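The quoted 2013 banding rule is simple enough to express directly. A minimal sketch, assuming the methodology is indeed unchanged (the point-score bands come from the quoted 2013 statement; the function name and the illustrative level-4/level-5 point values of 27 and 33 are my own additions, not from the Tables documentation):

```python
def classify_ks2_attainer(avg_point_score):
    """Classify a pupil from their average point score across the
    KS2 English, maths and science tests (per the quoted 2013 rule)."""
    if avg_point_score < 24:
        return "low"
    elif avg_point_score < 30:      # 24 to 29.99 inclusive
        return "middle"
    else:                           # 30 or more
        return "high"

# A pupil at level 4 in every test averages about 27 points ('middle');
# level 5 across the board averages about 33 points ('high').
print(classify_ks2_attainer(27.0))   # middle
print(classify_ks2_attainer(33.0))   # high
```

Note that the rule averages across all three tests, which is what produces the 'all-rounder' effect described below: a single exceptional score cannot lift a pupil into the high band.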

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations which:
  • restrict the qualifications counted
  • prevent any qualification from counting as larger than one GCSE
  • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR 02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.

.

Secondary outcomes

. 

The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, who together account for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.
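Note the distinction being drawn here: the cohort *share* is nearly unchanged (32.4% to 32.3%), while the *headcount* falls by roughly 2%. A quick check of that arithmetic, using the figures quoted above:

```python
high_2013, high_2014 = 175_797, 172_115

fall = high_2013 - high_2014          # fewer high attainers in 2014
pct_fall = 100 * fall / high_2013     # headcount decline, in per cent

print(f"{fall} fewer pupils, a {pct_fall:.1f}% fall")
# 3682 fewer pupils, a 2.1% fall
```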

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

.


*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Chart 1: Percentage of high attainers by school type, 2011-2014

. 

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers remind us that selection by ability through 11-plus tests gives a somewhat different sample than selection exclusively on the basis of KS2 attainment.

. 

Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio colleges say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) including GCSEs in English and maths. This compares with 56.6% of all learners. Allowing of course for the impact of 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered 67% or fewer of their high attainers achieving this outcome, whereas in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.
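The 'one in fourteen' figure above is simply the success rate restated as a failure ratio: if 92.8% achieve the benchmark, 7.2% miss it, which is one pupil in roughly 100/7.2 ≈ 14. A small helper illustrating the conversion used throughout this post (the function name is mine):

```python
def one_in_n_failing(success_rate_pct):
    """Convert a success rate (%) into the 'one in N fail' ratio."""
    fail_pct = 100 - success_rate_pct
    return 100 / fail_pct

print(round(one_in_n_failing(92.8)))   # 14 -> 5+ A*-C incl. English and maths
print(round(one_in_n_failing(93.8)))   # 16 -> A*-C in English and maths
```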

. 

A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same schools that do so on the 5+ A*-C measure above. They are joined by Tong High School.

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students.
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English and maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).

Comparing achievement of these measures by school type and admissions basis 

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

.


Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

.

It shows that:

  • There is significant variation on all five measures, though it is most pronounced for achievement of the EBacc, where there is a 20 percentage point difference between the success rates in sponsored academies (39.2%) and converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English and one in six in maths. This is unacceptably low.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.

.

Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

.


Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

.

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence there is a smaller gap, by and large, between the performance of high attainers in modern and comprehensive schools respectively than there is between high attainers in comprehensive and selective schools respectively.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in comprehensives and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.

. 

Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return A*-, including Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have number of entries for all qualifications and not for GCSE only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinkley.

16-18: A level outcomes

.

A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, reveals how performance on this measure has changed since 2013 for different types of institution and for schools with different admission arrangements.

.

2013 2014
LA-maintained school 11.4 11.5
Sponsored academy 5.4 5.3
Converter academy 16.4 15.7
Free school* 11.3 16.4
Sixth form college 10.4 10.0
Other FE college 5.8 5.7

Selective school 32.4 32.3
Comprehensive school 10.7 10.5
Modern school 2.0 3.2

Table 3: Percentage of A level students achieving grades AAB or higher with at least two in facilitating subjects, by institution type and admissions basis, 2013 and 2014

.

The substantial change for free schools is affected by the inclusion of UTCs and studio schools in that line in 2013 and the addition of city technology colleges and 16-19 free schools in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.

.

Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

.


Chart 4: A level high attainment measures by gender, 2014

.

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As they decline, the lines for boys and girls are steadily diverging since girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

.


Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

.

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold with the final pair.

. 


Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

.

Chart 7 compares performance by admissions policy in the schools sector on the four measures. Selective schools enjoy a big advantage on all four. More than one in four selective school students achieves at least 3 A grades, and almost one in three achieves AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the rates achieved in modern schools and selective schools manage roughly three times the success rates in comprehensive schools. 

. 


Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

 .

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state-funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, by Colchester Royal Grammar School. This is almost 100 points ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013.
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School. This is very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, which exceeds the 2013 low of 97.7 at Appleton Academy.
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. 

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those recorded in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than in both 2012 and 2013 in converter academies and sixth form colleges.

.


Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

. 

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.


Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014.

 .

Disadvantaged high attainers 

There is nothing in either of the Performance Tables or the supporting SFRs to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected given the impact of the changes discussed above but, if the 2013 methodology is applied, so stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.

.


Chart 10: FSM gaps for headline GCSE measures, 2013-2014


Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and then by slightly more or less than one percentage point, depending on which 2014 methodology is selected.

The deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed by extensive pupil premium funding.


Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014


Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012, only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of these gaps to increase, reinforcing the point made above: the reforms themselves do not appear to be driving the negative trend.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.


Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014


Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it raises the question of why more urgent action was, and is, discounted.
  • Pupil premium is insufficiently targeted at the students/schools that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.


Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. This issue is particularly acute in sponsored academies, where one in four or five high attainers is undershooting their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.


GP

February 2015