Maths Mastery: Evidence versus Spin


On Friday 13 February, the Education Endowment Foundation (EEF) published the long-awaited evaluation reports of two randomised control trials (RCTs) of Mathematics Mastery, an Ark-sponsored programme and recipient of one of the EEF’s first tranche of awards back in 2011.

EEF, Ark and Mathematics Mastery each published a press release to mark the occasion but, given the timing, none attracted attention from journalists and they were discussed only briefly on social media.

The main purpose of this post is to distinguish evidence from spin, to establish exactly what the evaluations tell us – and what provisos should be attached to those findings.

The post is organised into three main sections which deal respectively with:

  • Background to Mathematics Mastery
  • What the evaluation reports tell us and
  • What the press releases claim

The conclusion sets out my best effort at a balanced summary of the main findings. (There is a page jump here for those who prefer to cut to the chase.)

This post is written by a non-statistician for a lay audience. I look to specialist readers to set me straight if I have misinterpreted any statistical techniques or findings.

What was published?

On Friday 13 February the EEF published six different documents relevant to the evaluation:

  • A press release: ‘Low-cost internet-based programme found to considerably improve reading ability of year 7 pupils’.
  • A blog post: ‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’.
  • An updated Maths Mastery home page (also published as a pdf Project Summary in a slightly different format).
  • The Primary Evaluation Report.
  • The Secondary Evaluation Report.
  • An Overarching Summary Report.

The last three of these were written by the Independent Evaluators – Jerrim and Vignoles (et al) – employed through the UCL Institute of Education.

The Evaluators also refer to ‘a working paper documenting results from both trials’ available in early 2015 from http://ideas.repec.org/s/qss/dqsswp.html and www.johnjerrim.com. At the time of writing this is not yet available.

Press releases were issued on the same day by:

  • Ark
  • Mathematics Mastery

All of the materials published to date are included in the analysis below.

Background to Maths Mastery

What is Maths Mastery?

According to the NCETM (October 2014) the mastery approach in mathematics is characterised by certain common principles:

  • ‘Teachers reinforce an expectation that all pupils are capable of achieving high standards in mathematics.

  • The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.
  • Teaching is underpinned by methodical curriculum design and supported by carefully crafted lessons and resources to foster deep conceptual and procedural knowledge.
  • Practice and consolidation play a central role. Carefully designed variation within this builds fluency and understanding of underlying mathematical concepts in tandem.
  • Teachers use precise questioning in class to test conceptual and procedural knowledge, and assess pupils regularly to identify those requiring intervention so that all pupils keep up.

The intention of these approaches is to provide all children with full access to the curriculum, enabling them to achieve confidence and competence – ‘mastery’ – in mathematics, rather than many failing to develop the maths skills they need for the future.’

The NCETM paper itemises six key features, which I paraphrase as:

  • Curriculum design: Relatively small, sequenced steps which must each be mastered before learners move to the next stage. Fundamental skills and knowledge are secured first and these often need extensive attention.
  • Teaching resources: A ‘coherent programme of high-quality teaching materials’ supports classroom teaching. There is particular emphasis on ‘developing deep structural knowledge and the ability to make connections’. The materials may include ‘high-quality textbooks’.
  • Lesson design: Often involves input from colleagues drawing on classroom observation. Plans set out in detail ‘well-tested methods’ of teaching the topic. They include teacher explanations and questions for learners.
  • Teaching methods: Learners work on the same tasks. Concepts are often explored together. Technical proficiency and conceptual understanding are developed in parallel.
  • Pupil support and differentiation: Is provided through support and intervention rather than through the topics taught, particularly at early stages. High attainers are ‘challenged through more demanding problems which deepen their knowledge of the same content’. Issues are addressed through ‘rapid intervention’ commonly undertaken the same day.
  • Productivity and practice: Fluency is developed from deep knowledge and ‘intelligent practice’. Early learning of multiplication tables is expected. The capacity to recall facts from long term memory is also important.

Its Director published a blog post (October 2014) arguing that our present approach to differentiation has ‘a very negative effect’ on mathematical attainment and that this is ‘one of the root causes’ of our performance in PISA and TIMSS.

This is because it negatively affects the ‘mindset’ of low attainers and high attainers alike. Additionally, low attainers are insufficiently challenged and get further behind because ‘they are missing out on some of the curriculum’. Meanwhile high attainers are racing ahead without developing fluency and deep understanding.

He claims that these problems can be avoided through a mastery approach:

‘Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace, allowing them all full access to the curriculum by focusing on developing deep understanding and secure fluency with facts and procedures, and providing differentiation by offering rapid support and intervention to address each individual pupil’s needs.’

But unfortunately he stops short of explaining how, for high attainers, exclusive focus on depth is preferable to a richer blend of breadth, depth and pace, combined according to each learner’s needs.

NCETM is careful not to suggest that mastery is primarily focused on improving the performance of low-attaining learners.

It has published separate guidance on High Attaining Pupils in Primary Schools (registration required), which advocates a more balanced approach, although that predates this newfound commitment to mastery.

NCETM is funded by the Department for Education. Some of the comments on the Director’s blog post complain that it is losing credibility by operating as a cheerleader for Government policy.

Ark’s involvement

Ark is an education charity and multi-academy trust with an enviable reputation.

It builds its approach on six key principles, one of which is ‘Depth before breadth’:

‘When pupils secure firm foundations in English and mathematics, they find the rest of the curriculum far easier to access. That’s why we prioritise depth in these subjects, giving pupils the best chance of academic success. To support fully our pupils’ achievement in maths, we have developed the TES Award winning Mathematics Mastery programme, a highly-effective curriculum and teaching approach inspired by pupil success in Singapore and endorsed by Ofsted. We teach Mathematics Mastery in all our primary schools and at Key Stage 3 in a selection of our secondary schools. It is also being implemented in over 170 schools beyond our network.’

Ark’s 2014 Annual Report identifies five priorities for 2014/15, one of which is:

‘…developing curricula to help ensure our pupils are well prepared as they go through school… codifying our approach to early years and, building on the success of Maths Mastery, piloting an English Mastery programme…’

Mathematics Mastery is a charity in its own right. Its website lists 15 staff, a high-powered advisory group and three partner organisations:  Ark, the EEF (presumably by virtue of the funded evaluation) and the ‘Department for Education and the Mayor of London’ (presumably by virtue of support from the London Schools Excellence Fund).

NCETM’s Director sits on Mathematics Mastery’s Advisory Board.

Ark’s Chief Executive is a member of the EEF’s Advisory Board.

Development of Ark’s Maths Mastery programme

According to this 2012 report from Reform, which features Maths Mastery as a case study, it originated in 2010:

‘The development of Mathematics Mastery stemmed from collaboration between six ARK primary academies in Greater London, and the mathematics departments in seven separate ARK secondary academies in Greater London, Portsmouth and Birmingham. Representatives from ARK visited Singapore to explore the country’s approach first-hand, and Dr Yeap Ban Har, Singapore’s leading expert in maths teaching, visited King Solomon Academy in June 2011.’

In October 2011, EEF awarded Ark a grant of £600,000 for Maths Mastery, one of its first four awards.

The EEF’s press release says:

‘The third grant will support an innovative and highly effective approach to teaching children maths called Mathematics Mastery, which originated in Singapore. The programme – run by ARK Schools, the Academies sponsor, which is also supporting the project – will receive £600,000 over the next four years to reach at least 50 disadvantaged primary and secondary schools.’

Ark’s press release adds:

‘ARK Schools has been awarded a major grant by the Education Endowment Foundation (EEF) to further develop and roll out its Mathematics Mastery programme, an innovative and highly effective approach to teaching children maths based on Singapore maths teaching. The £600,000 grant will enable ARK to launch the programme and related professional development training to improve maths teaching in at least 50 disadvantaged primary and secondary schools.

The funding will enable ARK Schools to write a UK mathematics mastery programme based on the experience of teaching the pilot programme in ARK’s academies. ARK intends to complete the development of its primary modules for use from Sept 2012 and its secondary modules for use from September 2013. In parallel ARK is developing professional training and implementation support for schools outside the ARK network.’

The project home page on EEF’s site now says the total project cost is £774,000. It may be that the balance of £174,000 is the fee paid to the independent evaluators.

This 2012 information sheet says all Ark primary schools would adopt Maths Mastery from September 2012, and that its secondary schools have also devised a KS3 programme.

It describes the launch of a Primary Pioneer Programme from September 2012 and a Secondary Pioneer Programme from September 2013. These will form the cohorts to be evaluated by the EEF.

In 2013, Ark was awarded a grant of £617,375 from the Mayor of London’s London Schools Excellence Fund for the London Primary Schools Mathematics Mastery Project.

This is to support the introduction of Mastery in 120 primary schools spread across 18 London boroughs. (Another source gives the grant as £595,000.)

It will be interesting to see whether Maths Mastery (or English Mastery) features in the Excellence Fund’s latest project to increase primary attainment in literacy and numeracy. The outcomes of the EEF evaluations may be relevant to that impending decision.

Ark’s Mathematics Mastery today

The Mathematics Mastery website advertises a branded variant of the mastery model, derived from a tripartite ‘holistic vision’:

  • Deep understanding, through a curriculum that combines universal high expectations with spending more time on fewer topics and heavy emphasis on problem-solving.
  • Integrated professional development through workshops, visits, coaching and mentoring and ‘access to exclusive online teaching and learning materials, including lesson guides for each week’.
  • Teacher collaboration – primary schools are allocated a geographical cluster of 4-6 schools while secondary schools attend a ‘national collaboration event’. There is also an online dimension.

It offers primary and secondary programmes.

The primary programme has three particular features: use of objects and pictures prior to the introduction of symbols; a structured approach to the development of mathematical vocabulary; and heavy emphasis on problem-solving.

It involves one-day training sessions for school leaders, for the Maths Mastery lead and those new to teaching it, and for teachers undertaking the programme in each year group. Each school receives two support visits and attends three local cluster meetings.

Problem-solving is also one of three listed features of the secondary programme. The other two are fewer topics undertaken in greater depth, plus joint lesson planning and departmental workshops.

There are two full training days, one for the Maths Mastery lead and one for the maths department plus an evening session for senior leadership. Each school receives two support visits and attends three national collaborative meetings. They must hold an hour-long departmental workshop each week and commit to sharing resources online.

Both primary and secondary schools are encouraged to launch the programme across Year 1/7 and then roll it upwards ‘over several years’.

The website is not entirely clear but it appears that Maths Mastery itself is being rolled out a year at a time, so even the original primary early adopters will have provision only up to Year 3 and are scheduled to introduce provision for Year 4 next academic year. In the secondary sector, activity currently seems confined to KS3, and predominantly to Year 7.

The number of participating schools is increasing steadily but is still very small.

The most recent figures I could find are 192 (Maths Mastery, November 2014) or 193 – 142 primary and 51 secondary (Ark 2015).

One assumes that this total includes

  • An original tranche of 30 primary ‘early adopters’ including 21 not managed by Ark
  • 60 or so primary and secondary ‘Pioneer Schools’ within the EEF evaluations (ie the schools undertaking the intervention but not those forming the control group, unless they have subsequently opted to take up the programme)
  • The 120 primary schools in the London project
  • Primary and secondary schools recruited outwith the London and EEF projects, either alongside them or subsequently.

But the organisation does not provide a detailed breakdown, or show how these different subsets overlap.

They are particularly coy about the cost. There is nothing about this on the website.

The EEF evaluation reports say that two-form entry (2FE) primary schools and secondary schools will pay ‘an upfront cost of £6,000 for participating in the programme’.

With the addition of staff time for training, the per pupil cost for the initial year is estimated as £127 for primary schools and £50 for secondary schools.

The primary report adds:

‘In subsequent years schools are able to opt for different pathways depending on the amount of support and training they wish to choose; they also have ongoing access to the curriculum materials for additional year groups. The per pupil cost therefore reduces considerably, to below £30 per pupil for additional year groups.’

In EEF terms this is deemed a low cost intervention, although an outlay of such magnitude is a significant burden for primary schools, particularly when funding is under pressure, and might be expected to act as a brake on participation.
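As a rough sketch of the arithmetic, assuming a 2FE primary school enters around 60 pupils per year group and a secondary school around 180 pupils into Year 7 (neither figure is stated in the reports), the staff-time element implied by the reported per-pupil costs can be backed out:

```python
# Illustrative sketch of the per-pupil cost arithmetic reported by the EEF.
# ASSUMED cohort sizes (not stated in the reports): ~60 pupils per year group
# in a 2FE primary; ~180 pupils entering Year 7 in a typical secondary.
UPFRONT_FEE = 6000  # upfront programme cost quoted in the evaluation reports


def implied_staff_time_cost(per_pupil_cost, cohort_size, upfront=UPFRONT_FEE):
    """Back out the staff-time cost implied by the reported per-pupil figure."""
    return per_pupil_cost * cohort_size - upfront


# Reported first-year costs: £127 per pupil (primary), £50 per pupil (secondary)
primary_staff = implied_staff_time_cost(127, 60)    # implied staff time ≈ £1,620
secondary_staff = implied_staff_time_cost(50, 180)  # implied staff time ≈ £3,000
print(primary_staff, secondary_staff)
```

On these assumed cohort sizes, the £6,000 fee dominates the primary figure, which is why the per-pupil cost is so much higher in the primary phase.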

Further coyness is evident in respect of statutory assessment outcomes. Some details are provided for individual schools, but there is precious little about the whole cohort.

All I could find was this table in the Primary Yearbook 2014-15.

[Table: Maths Mastery schools’ KS1 performance against national averages, from the Primary Yearbook 2014-15]

It suggests somewhat better achievement at KS1 L2b and L3c than the national average, but there is no information about other Levels and, of course, the sample is not representative, so the comparison is of limited value.

An absence of more sophisticated analysis – combined with the impression of limited transparency for those not yet inside the programme – is likely to act as a second brake on participation.

There is a reference to high attainers in the FAQ on the website:

‘The Mathematics Mastery curriculum emphasises stretching through depth of understanding rather than giving the top end of pupils [sic] new procedures to cover.

Problem solving is central to Mathematics Mastery. The great thing about the problems is that students can take them as far as they can, so those children who grasp the basics quickly can explore tasks further. There is also differentiation in the methods used, with top-end pupils typically moving to abstract numbers more quickly and spending less time with concrete manipulatives or bar models. There are extension ideas and support notes provided with the tasks to help you with this.

A range of schools are currently piloting the programme, which is working well in mixed-ability classes, as well as in schools that have set groups.’

The same unanswered questions arise as with the NCETM statement above. Is ‘Maths Mastery’ primarily focused on the ‘long tail’, potentially at the expense of high attainers?

The IoE evaluators think so. The primary evaluation report says that:

‘Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers.’

It would be helpful to have clarity on this point.


How influential is Maths Mastery?

Extremely influential.

Much educational and political capital has already been invested in Maths Mastery, hence the peculiar significance of the results contained in the evaluation reports.

The National Curriculum Expert Panel espoused mastery in its ‘Framework for the National Curriculum‘ (December 2011), while ducking the consequences for ‘stretch and challenge’ for high attainers – so creating a tension that remains unresolved to this day.

Meanwhile, the mastery approach has already influenced the new maths programme of study, as the NCETM document makes clear:

‘The 2014 national curriculum for mathematics has been designed to raise standards in maths, with the aim that the large majority of pupils will achieve mastery of the subject…

… For many schools and teachers the shift to this ‘mastery curriculum’ will be a significant one. It will require new approaches to lesson design, teaching, use of resources and support for pupils.’

Maths Mastery confirms that its Director was on the drafting team.

Mastery is also embedded in the national collaborative projects being undertaken through the Maths Hubs. Maths Mastery is one of four national partners in the Hubs initiative.

Ministers have endorsed the Ark programme in their speeches. In April 2014, Truss said:

‘The mastery model of learning places the emphasis on understanding core concepts. It’s associated with countries like Singapore, who have very high-performing pupils.

And in this country, Ark, the academy chain, took it on and developed it.

Ark run training days for maths departments and heads of maths from other schools.

They organise support visits, and share plans and ideas online with other teachers, and share their learning with a cluster of other schools.

It’s a very practical model. We know not every school will have the time or inclination to develop its very own programmes – a small rural school, say, or single-class primary schools.

But in maths mastery, a big chain like Ark took the lead, and made it straightforward for other schools to adopt their model. They maintain an online community – which is a cheap, quick way of keeping up with the best teaching approaches.

That’s the sort of innovation that’s possible.

Of course the important thing is the results. The programme is being evaluated so that when the results come out headteachers will be able to look at it and see if it represents good value.’

In June 2014 she said:

‘This idea of mastery is starting to take hold in classrooms in England. Led by evidence of what works, teachers and schools have sought out these programmes and techniques that have been pioneered in China and East Asia….

…With the Ark Schools Maths Mastery programme, more than 100 primary and secondary schools have joined forces to transform their pupils’ experiences of maths – and more are joining all the time. It’s a whole school programme focused on setting high expectations for all pupils – not believing that some just can’t do it. The programme has already achieved excellent results in other countries.’

Several reputations are being built upon Maths Mastery, many jobs depend upon it and large sums have been invested.

It has the explicit support of one of the country’s foremost academy chains and is already impacting on national curriculum and assessment policy (including the recent consultation on performance indicators for statutory teacher assessment).

Negative or neutral evaluations could have significant consequences for all the key players and are unlikely to encourage new schools to join the Programme.

Hence there is pressure in the system for positive outcomes – hence the significance of spin.

What the EEF evaluations tell us


Evaluation Protocols

EEF published separate Protocols for the primary and secondary evaluations in April 2013. These are broadly in line with the approach set out in the final evaluation reports, except that both refer much more explicitly to subsequent longitudinal evaluation:

‘In May/June 2017/18 children in treatment and control schools will sit key stage 2 maths exams. The IoE team will examine the long–run effectiveness of the Maths Mastery programme by investigating differences in school average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2012 and 2013)’.

‘In May/June 2018 children in treatment and control schools will sit national maths exams. The IoE team will examine the long – run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2013 and 2014) by NATCEN.’

It is not clear whether the intention is to preserve the integrity of the intervention and control groups until the former have rolled out Mastery to all year groups, or simply to evaluate the long-term effects of the initial one-year interventions, allowing intervention schools to drop Mastery and control schools to adopt it, entirely as they wish.

EEF Maths Mastery Project Homepage

The EEF’s updated Maths Mastery homepage has been revised to reflect the outcomes of the evaluations. It provides the most accessible summary of those outcomes.

It offers four key conclusions (my emphases):

  • ‘On average, pupils in schools adopting Mathematics Mastery made a small amount more progress than pupils in schools that did not. The effect detected was statistically significant, which means that it is likely that that improvement was caused by the programme.’
  • ‘It is unclear whether the programme had a different impact on pupils eligible for free school meals, or on pupils with higher or lower attainment.’
  • ‘Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider.’
  • ‘The evaluations assessed the impact of the programme in its first year of adoption. It would be worthwhile to track the medium and long-term impact of the approach.’

A table is supplied showing the effect sizes and confidence intervals for overall impact (primary and secondary together), and for the primary and secondary interventions separately.

[Table: effect sizes and 95% confidence intervals (overall, primary, secondary)]

The support materials for the EEF’s toolkit help to explain these judgements.

About the Toolkit tells us that:

‘Average impact is estimated in terms of the additional months’ progress you might expect pupils to make as a result of an approach being used in school, taking average pupil progress over a year as a benchmark.

For example, research summarised in the Toolkit shows that improving the quality of feedback provided to pupils has an average impact of eight months. This means that pupils in a class where high quality feedback is provided will make on average eight months more progress over the course of a year compared to another class of pupils who were performing at the same level at the start of the year. At the end of the year the average pupil in a class of 25 pupils in the feedback group would now be equivalent to the 6th best pupil in the control class having made 20 months progress over the year, compared to an average of 12 months in the other class.’
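The worked example can be roughly checked, assuming normally distributed scores and taking an effect size of about 0.65 to correspond to ‘eight months’ (an assumption on my part, based on the toolkit’s conversion scale):

```python
import math

# Rough check of the EEF's worked example. ASSUMPTIONS: scores are normally
# distributed, and 'eight months' corresponds to an effect size of ~0.65.
d = 0.65

# A pupil starting at the mean (50th percentile) moves to roughly the
# 74th percentile after an effect of this size.
percentile = 0.5 * (1 + math.erf(d / math.sqrt(2)))

# Position from the top in a class of 25
rank_from_top = round((1 - percentile) * 25)
print(rank_from_top)  # roughly 6th from the top, matching the EEF's example
```

So the ‘6th best pupil in the control class’ claim is consistent with a normal-distribution reading of the effect size.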

There is another table showing how to interpret this scale:

[Table: the EEF’s months-of-progress conversion scale]

We can see from this that:

  • The overall Maths Mastery impact of +0.073 is towards the upper end of the ‘one month’s progress’ category.
  • The ‘primary vs comparison’ impact of +0.10 just scrapes into the ‘two months’ progress’ category.
  • The ‘secondary vs comparison’ impact of +0.06 is towards the middle of the ‘one month’s progress’ category.

All three are officially classed as ‘Low Effect’.
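These judgements can be expressed as a simple banding function. The 0.10 boundary between one and two months is explicit in the figures above; the higher thresholds are my inference from the EEF’s published conversion table, not an official formula:

```python
def months_progress(effect_size):
    """Months-of-progress band for an effect size.

    The thresholds are inferred from the EEF toolkit's conversion table
    (only the 0.02 and 0.10 boundaries are directly evidenced in the text).
    """
    thresholds = [0.02, 0.10, 0.19, 0.27, 0.36, 0.45, 0.53, 0.62, 0.70]
    return sum(effect_size >= t for t in thresholds)


print(months_progress(0.073))  # 1 -- overall pooled effect
print(months_progress(0.10))   # 2 -- primary effect, to two decimal places
print(months_progress(0.099))  # 1 -- primary effect, to three decimal places
print(months_progress(0.06))   # 1 -- secondary effect
```

Note how sensitive the banding is at the 0.10 boundary: the primary effect changes category depending on whether it is reported to two or three decimal places, a point taken up below.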

If we compare the effect size attributable to Maths Mastery with others in the Toolkit, it is evident that it ranks slightly above school uniform and slightly below learning styles.

A subsequent section explains that the overall impact rating is dependent on meta-analysis (again my emphases):

‘The findings from the individual trials have been combined using an approach called “meta-analysis”. Meta-analysis can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note that care is needed in interpreting meta-analysed findings.’

But we are not told how, in light of this, we are to exercise care in interpreting this particular finding. There are no explicit ‘health warnings’ attached to it.

The homepage does tell us that:

‘Due to the ages of pupils who participated in the individual trials, the headline findings noted here are more likely to be predictive of programme’s impact on pupils in primary school than on pupils in secondary school.’

It also offers an explanation of why the effects generated from these trials are so small compared with those for earlier studies:

‘The findings were substantially lower than the average effects seen in the existing literature on of “mastery approaches”. A possible explanation for this is that many previous studies were conducted in the United States in the 1970s and 80s, so may overstate the possible impact in English schools today. An alternative explanation is that the Mathematics Mastery programme differed from some examples of mastery learning previously studied. For example classes following the Mathematics Mastery approach did not delay starting new topics until a high level of proficiency had been achieved by all students, which was a key feature in a number of many apparently effective programmes.’


There is clearly an issue with the 95% confidence intervals supplied in the first table above. 

The Technical Appendices to the Toolkit say:

‘For those concerned with statistical significance, it is still readily apparent in the confidence intervals surrounding an effect size. If the confidence interval includes zero, then the effect size would be considered not to have reached conventional statistical significance.’ (p6)

The table indicates that the lower bound of the confidence interval is zero or below in all three cases, meaning that none of these findings may be statistically significant.

However, the homepage claims that the overall impact of both interventions, when combined through meta-analysis, is statistically significant.

And it fails entirely to mention that the impacts of the primary and the secondary interventions, taken separately, are statistically insignificant.

The explanation is one of rounding: the homepage gives confidence intervals to two decimal places, whereas the reports calculate them to three, and only at the third decimal place does the combined result register as statistically significant.

This gives a lower value of 0.004 (ie four thousandths above zero).
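The point is easy to see in miniature: a lower bound of +0.004 is (just) positive, but it disappears when reported to two decimal places:

```python
# A lower 95% confidence bound of +0.004 is positive, so the pooled effect
# counts as statistically significant at the 5% level -- yet at two decimal
# places it prints as 0.00, which is why the homepage table appears to
# contradict the reports.
lower_bound = 0.004
print(lower_bound > 0)       # True: the interval excludes zero
print(f"{lower_bound:.2f}")  # '0.00': how it appears when rounded
```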

This can be seen from the table annexed to the primary and secondary reports and included in the ‘Overarching Summary Report’.

[Table: effect sizes reported to three decimal places, from the Overarching Summary Report]

The distinction is marginal, to say the least. Indeed, the Evaluation Reports say:

‘…the pooled effect size of 0.073 is just significantly different from zero at conventional thresholds’

Moreover, notice that the introduction of a third decimal place drags the primary effect size down to 0.099, officially consigning it to the ‘one month’s progress’ category rather than the two months quoted above.

This might appear to be dancing on the head of a statistical pin but, as we shall see later, the spin value of statistical significance is huge!

Overall there is a lack of clarity here that cannot be attributed entirely to the necessity for brevity. The attempt to conflate subtly different outcomes from the separate primary and secondary evaluations has masked these distinctions and distorted the overall assessment.


The full reports add some further interesting details which are summarised in the sections below.

 

Primary Evaluation Report 

[Table: summary of primary trial results]

Key points:

  • In both the primary and secondary reports, additional reasons are given for why the effects from these evaluations are so much smaller than those from previous studies. These include the fact that:

‘…some studies included in the mastery section of the toolkit show small or no effects, suggesting that making mastery learning work effectively in all circumstances is challenging.’

The overall conclusion is an indirect criticism of the Toolkit, noting as it does that ‘the relevance of such evidence for contemporary education policy in England…may be limited’.

  • The RCT was undertaken across two academic years: In AY2012/13, 40 schools (Cohort A) were involved. Of these, 20 were randomly allocated the intervention and 20 the control. In AY2013/14, 50 schools (Cohort B) participated, 25 allocated the intervention and 25 the control. After the trial, control schools in Cohort A were free to pursue Maths Mastery. (The report does not mention whether this also applied to Cohort B.) It is not clear how subsequent longitudinal evaluation will be affected by such leakage from the control group.
  • The schools participating in the trial were recruited by Ark. They had to be state-funded and not already undertaking Maths Mastery:

‘Schools were therefore purposefully selected—they cannot be considered a randomly chosen sample from a well-defined population. The majority of schools participating in the trial were from London or the South East.’

  • Unlike the secondary evaluation, no process evaluation was conducted, so it is not possible to determine the extent to which schools adhered to the prescribed programme.
  • Baseline tests were administered after allocation between intervention and control, at the beginning of each academic year. Pupils were tested again in July. Evaluators used the Number Knowledge Test (NKT) for this purpose. The report discusses reasons why this might not be an accurate predictor of subsequent maths attainment and whether it is so closely related to the intervention as to be ‘a questionable measure of the success of the trial’. The discussion suggests that there were potential advantages to both the intervention and control groups but does not say whether one outweighed the other. 
  • The results of the post-test are summarised thus:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.10 standard deviations higher on the post-test. This, however, only reached statistical significance at the 10% level (t = 1.82; p = 0.07), with the 95% confidence interval ranging from -0.01 to +0.21. Within Cohort A, children in the treatment group scored (on average) +0.09 standard deviations above those children in the control group (confidence interval -0.06 to +0.24). The analogous effect in Cohort B was +0.10 (confidence interval -0.05 to 0.26). Consequently, although the Mathematics Mastery intervention may have had a small positive effect on children’s test scores, it is not possible to rule out sampling variation as an explanation.’
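The relationship between the reported confidence interval and the p-value can be checked by back-deriving the standard error from the interval itself. This is an illustrative sketch using a normal approximation, not the evaluators' own calculation; the figures are taken from the quote above:

```python
import math

def p_value_from_ci(effect, ci_low, ci_high):
    """Back-derive the standard error from a 95% confidence interval
    and compute a two-sided p-value under a normal approximation."""
    se = (ci_high - ci_low) / (2 * 1.96)   # half-width of CI / 1.96
    z = effect / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return se, z, p

# Primary trial: effect +0.10 SD, 95% CI (-0.01, +0.21)
se, z, p = p_value_from_ci(0.10, -0.01, 0.21)
print(f"SE = {se:.3f}, z = {z:.2f}, p = {p:.3f}")
```

This yields p of roughly 0.075, consistent with the reported t = 1.82 and p = 0.07 (the small discrepancy reflects rounding and the use of a z rather than t distribution). Because the interval only just spans zero, the result sits right on the conventional significance boundary.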

  • The comparison of pre-test and post-test results was also examined for any evidence of differential effects for those with lower or higher prior attainment:

‘Estimates are again presented in terms of effect sizes. The interaction effect is not significantly different from zero, with the 95% confidence interval ranging from -0.01 to +0.02. Thus there is little evidence that the effect of Mathematics Mastery differs between children with different levels of prior achievement.’

The Report adds:

‘Recall that the Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers. Thus one might anticipate the intervention to be particularly effective in the bottom half of the test score distribution. There is some, but relatively little, evidence that the intervention was less effective for the bottom half of the test distribution.’

So, on this evidence, Maths Mastery is no more effective for the low achievers it is intended to help most. This is somewhat different from the suggestion on the homepage that the answer to this question is ‘unclear’.

Several limitations are discussed, but it is important to note that they are phrased in hypothetical terms:

  • Pupils’ progress was evaluated after one academic year:

’This may be considered a relatively small ‘dose’ of the Mathematics Mastery programme’.

  • The intervention introduced a new approach to schools, so there was a learning curve which control schools did not experience:

‘With more experience teaching the programme it is possible that teachers would become more effective in implementing it.’

  • The test may favour either control schools or intervention schools.
  • Participating schools volunteered to take part, so it is not possible to say whether similar effects would be found in all schools.
  • It was not possible to control for balance – eg by ethnic background and FSM eligibility – between intervention and control. [This is now feasible so could potentially be undertaken retrospectively to check there was no imbalance.]

Under ‘Interpretation’, the report says:

‘Within the context of the wider educational literature, the effect size reported (0.10 standard deviations) would typically be considered ‘small’….

Yet, despite the modest and statistically insignificant effect, the Mathematics Mastery intervention has shown some promise.’

The phrase ‘some promise’ is justified by reference to the meta-analysis, the cost-effectiveness (a small effect size for a low cost is preferable to the same outcome at a higher cost) and the fact that the impact of the entire programme has not yet been evaluated:

‘Third, children are likely to follow the Mathematics Mastery programme for a number of years (perhaps throughout primary school), whereas this evaluation has considered the impact of just the first year of the programme. Long-run effects after sustained exposure to the programme could be significantly higher, and will be assessed in a follow-up study using Key Stage 2 data.’

This is the only reference to a follow-up study. It is less definite than the statement in the assessment protocol and there is no further explanation of how this will be managed, especially given potential ‘leakage’ from the control group.

 

Secondary Evaluation Report

EEF maths mastery table 5

Key points:

  • 50 schools were recruited to participate in the RCT during AY2013/14, with 25 randomly allocated to the intervention and 25 to the control. All Year 7 pupils within the former experienced the intervention. As in the primary trial, control schools were eligible to access the programme after the end of the trial year. Interestingly, 3 of the 25 intervention schools (12%) dropped out before the end of the year; their reasons are not recorded. 
  • As in the primary trial, Ark recruited the participating schools – which had to be state-funded and new to Maths Mastery. Since schools were deliberately selected they could not be considered a random sample. The report notes:

‘Trial participants, on average, performed less well in their KS1 and KS2 examinations than the state school population as a whole. For instance, their KS1 average points scores (and KS2 maths test scores) were approximately 0.2 standard deviations (0.1 standard deviations) below the population mean. This seems to be driven, at least in part, by the fact that the trial particularly under-represented high achievers (relative to the population). For instance, just 12% of children participating in the trial were awarded Level 3 in their Key Stage 1 maths test, compared to 19% of all state school pupils in England.’

  • KS1 and KS2 tests were used as the baseline. The Progress in Maths (PiM) test was used to assess pupils at the end of the year. But about 40% of the questions cover content not included in the Y7 Maths Mastery curriculum, which potentially disadvantaged intervention pupils relative to the control group. PiM also includes a calculator section, although calculators are not used in Year 7 of Maths Mastery. It was agreed that breakdowns of results would be supplied to account for this.
  • On the basis of overall test results:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.055 standard deviations higher on the PiM post-test. This did not reach statistical significance at conventional thresholds (t = 1.20; p = 0.24), with the 95% confidence interval ranging from –0.037 to +0.147. Turning to the FSM-only sample, the estimated effect size is +0.066 with the 95% confidence interval ranging from –0.037 to +0.169 (p = 0.21). Moreover, we also estimated a model including a FSM-by intervention interaction. Results suggested there was little evidence of heterogeneous intervention effects by FSM. Consequently, although the Mathematics Mastery intervention may have had a small positive effect on overall PiM test scores, one cannot rule out the possibility that this finding is due to sampling variation.’

  • When the breakdowns were analysed:

‘As perhaps expected, the Mathematics Mastery intervention did not have any impact upon children’s performance on questions covering topics outside the Mathematics Mastery curriculum. Indeed, the estimated intervention effect is essentially zero (effect size = –0.003). In contrast, the intervention had a more pronounced effect upon material that was focused upon within the Mathematics Mastery curriculum (effect size = 0.100), just reaching statistical significance at the 5% level (t = 2.15; p = 0.04).’

  • The only analysis of the comparative performance of high and low attainers is tied to the parts of the test not requiring use of a calculator. It suggests a noticeably smaller effect in the top half of the attainment distribution, with no statistical significance above the 55th percentile. This is substantively different from the finding in the primary evaluation, and it raises the question of whether secondary Maths Mastery needs adjustment to make it more suitable for high attainers.
  • A process evaluation focused principally on five schools from the intervention group. Focus group discussions were held before the intervention and again towards the end; telephone interviews were conducted and lessons observed. The sample was selected to include schools of different sizes and FSM intakes, and schools achieving both poor and good progress in maths according to their most recent inspection reports. One of the recommendations is that:

‘The intervention should consider how it might give more advice and support with respect to differentiation.’

  • The process evaluation adds further detail about suitability for high attainers:

‘Another school [E] also commented that the materials were also not sufficiently challenging for the highest-attaining children, who were frustrated by revisiting at length the same topics they had already encountered at primary school. Although this observation was also made in other schools, it was generally felt that the children gradually began to realise that they were in fact enjoying the subject more by gaining extra understanding.’

It is not clear whether this latter comment also extends to the high attainers!

A similar set of limitations is explored in similar language to that used in the primary report.

Under ‘Interpretation’ the report says:

‘Although point estimates were consistent with a small, positive gain, the study did not have sufficient statistical power to rule out chance as an explanation. Within the context of the wider educational literature, the effect size reported (less than 0.10 standard deviations) would typically be considered ‘small’…

But, as in the primary report, it detects ‘some promise’ on the same grounds. There is a similar speculative reference to longitudinal evaluation.

.

Press releases and blogs

. 

EEF press release

There is a certain irony in the fact that ‘unlucky’ Friday 13 February was the day selected by the EEF to release these rather disappointing reports.

But Friday is typically the day selected by communications people to release educational news that is most likely to generate negative media coverage – and a Friday immediately before a school holiday is a particularly favoured time to do so, presumably because fewer journalists and social media users are active.

Unfortunately, the practice is at risk of becoming self-defeating, since everyone now expects bad news on a Friday, whereas they might be rather less alert on a busier day earlier in the week.

On this occasion Thursday was an exceptionally busy day for education news, with reaction to Miliband’s speech and a raft of Coalition announcements designed to divert attention from it. With the benefit of hindsight, Thursday might have been a better choice.

The EEF’s press release dealt with evaluation reports on nine separate projects, so increasing the probability that attention would be diverted away from Maths Mastery.

It led on a different evaluation report which generated more positive findings – the EEF seems increasingly sensitive to concerns that too many of the RCTs it sponsors are showing negligible or no positive effect, presumably because the value-for-money police may be inclined to turn their beady eye upon the Foundation itself.

But perhaps it also did so because Maths Mastery’s relatively poor performance was otherwise the story most likely to attract the attention of more informed journalists and commentators.

On the other hand, Maths Mastery was given second billing:

‘Also published today are the results of Mathematics Mastery, a whole-school approach which aims to deepen pupils’ conceptual understanding of key mathematical ideas. Compared to traditional curricula, fewer topics are covered in more depth and greater emphasis is placed on problem solving and encouraging mathematical thinking. The EEF trials found that pupils following the Mathematics Mastery programme made an additional month’s progress over a period of a year.’

.

.

EEF blog post

Later on 13 February EEF released a blog post written by a senior analyst which mentions Maths Mastery in the following terms:

‘Another finding of note is the small positive impact of teaching children fewer mathematical concepts, but covering them in greater depth to ensure ‘mastery’. The EEF’s evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only be evident in years to come as children are able to draw on their secure mathematical foundations to tackle more complex problems.’

EEF is consistently reporting a small positive impact but, as we have seen, this is rather economical with the truth. It deserves some qualification.

More interestingly though, the post adds (my emphases):

‘Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and which principles should underpin approaches on encouraging children in reading for pleasure are all issues that have important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.’

It will be interesting to monitor the impact of this work on the communication of outcomes from these particular evaluations.

It will be important to ensure that synthesis and dissemination is not at the expense of accuracy, particularly when ‘high stakes’ results are involved, otherwise there is a risk that users will lose faith in the independence of EEF and its willingness to ‘speak truth unto power’.

.

Maths Mastery Press Release

By also releasing their own posts on 13 February, Mathematics Mastery and Ark made sure that they too would not be picked up by the media.

They must have concluded that, even if they placed the most positive interpretation on the outcomes, they would find it hard to create the kind of media coverage that would generate increased demand from schools.

The Mathematics Mastery release – ‘Mathematics Mastery speeds up pupils’ progress – and is value for money too’ – begins with a list of bullet points citing other evidence that the programme works, so implying that the EEF evaluations are relatively insignificant additions to this comprehensive evidence base:

  • ‘Headteachers say that the teaching of mathematics in their schools has improved
  • Headteachers are happy to recommend us to other schools
  • Numerous Ofsted inspections have praised the “new approach to mathematics” in partner schools
  • Extremely positive evaluations of our training and our school development visits
  • We have an exceptionally high retention rate – schools want to continue in the partnership
  • Great Key Stage 1 results in a large number of schools.’

Much of this is hearsay, or else vague reference to quantitative evidence that is not published openly.

The optimistic comment on the EEF evaluations is:

‘We’re pleased with the finding that, looking at both our primary and secondary programmes together, pupils in the Mathematics Mastery schools make one month’s extra progress on average compared to pupils in the other schools after a one year “dose” of the programme…

…This is a really pleasing outcome – trials of this kind are very rigorous.  Over 80 primary schools and 50 secondary schools were involved in the testing, with over 4000 pupils involved in each phase.  Studies like this often don’t show any progress at all, particularly in the early years of implementation and if, like ours, the programme is aimed at all pupils and not just particular groups.  What’s more, because of the large sample size, the difference in scores between the Mathematics Mastery and other schools is “statistically significant” which means the results are very unlikely to be due to chance.’

The section I have emboldened is in stark contrast to the EEF blog post above, which has the title:

‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’

And so suggests exactly the opposite.

I have already shown just how borderline the calculation of ‘statistical significance’ has been.

The release concludes:

‘Of course we’re pleased with the extra progress even after a limited time, but we’re interested in long term change and long term development and improvement.  We’re determined to work with our partner schools to show what’s possible over pupils’ whole school careers…but it’s nice to know we’ve already started to succeed!’

 .

There was a single retweet of Mathematics Mastery’s tweet announcing the release, but it came from a particularly authoritative source (who also sits on Ark’s Advisory Group).

.

Ark Press Release

Ark’s press release – ‘Independent evaluation shows Mathematics Mastery pupils doing better than their peers’ – is even more bullish.

The opening paragraph claims that:

‘A new independent report from the independent Education Endowment Foundation (EEF) demonstrates the success of the Mathematics Mastery programme. Carried out by academics from Cambridge University and the Institute of Education, the data indicates that the programme may have the potential to halve the attainment gap with high performing countries in the far East.’

The second emboldened statement is particularly brazen since there is no evidence in either of the reports that would support such a claim. It is only true in the sense that any programme ‘may have the potential’ to achieve any particularly ambitious outcome.

Statistical significance is again celebrated, though it is important to give Ark credit for adding:

‘…but it is important to note that these individual studies did not reach the threshold for statistical significance. It is only at the combined level across 127 schools and 10,114 pupils that there are sufficient schools and statistical power to determine an effect size of 1 month overall.’

Even if this rather implies that the individual evaluations were somehow at fault for being too small and so not generating ‘sufficient statistical power’.

Then the release returns to its initial theme:

‘… According to the OECD, by age fifteen, pupils in Singapore, Japan, South Korea and China are three years ahead of pupils in England in mathematical achievement. Maths Mastery is inspired by the techniques and strategies used in these countries.

Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, this could be a sustained impact. A 2 month gain every primary year and 1 month gain every secondary year could see pupils more than one and a half years ahead by age 16 – halving the gap with higher performing jurisdictions.’

In other words, Ark extrapolates equivalent gains – eschewing all statistical hedging – for each year of study, adding them together to suggest a potential 18 month gain.

It also seems to apply the effect to all participants rather than to the average participant.

This must have been a step too far, even for Ark’s publicity machine.

.

maths mastery ark release capture

.

They subsequently changed the final paragraph above – which one can still find in the version within Google’s cache – to read:

‘…Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, we expect this to be a sustained impact.  A longer follow-up study will be needed to investigate this.’

Even in sacrificing the misleading quantification, they could not resist bumping up ‘this could be a sustained impact’ to ‘we expect this to be a sustained impact’.

 .

[Postscript: On 25 February, Bank of America Merrill Lynch published a press release announcing a £750,000 donation to Maths Mastery.

The final paragraph ‘About Maths Mastery’ says:

‘Mathematics Mastery is an innovative maths teaching framework, supporting schools, students and teachers to be successful at maths. There are currently 192 Mathematics Mastery partner schools across England, reaching 34,800 pupils. Over the next five years the programme aims to expand to 500 schools, and reach 300,000 pupils. Maths Mastery was recently evaluated by the independent Education Endowment Foundation and pupils were found to be up to two months ahead of their peers in just the first year of the programme. Longer term, this could see pupils more than a year and a half ahead by age 16 – halving the gap with pupils in countries such as Japan, Singapore and China.’

This exemplifies perfectly how such questionable statements are repurposed and recycled with impunity. It is high time that the EEF published a code of practice to help ensure that the outcomes of its evaluations are not misrepresented.]  

.

Conclusion

 .

Representing the key findings

My best effort at a balanced presentation of these findings would include the key points below. I am happy to consider amendments, additions and improvements:

  • On average, pupils in primary schools adopting Mathematics Mastery made two months more progress than pupils in primary schools that did not. (This is a borderline result, in that it is only just above the score denoting one month’s progress. It falls to one month’s progress if the effect size is calculated to three decimal places.) The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • On average, pupils in secondary schools adopting Mathematics Mastery made one month more progress than pupils in secondary schools that did not. The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • When the results of the primary and secondary evaluations are combined through meta-analysis, pupils in schools adopting Maths Mastery made one month more progress than pupils in schools that did not. The effect is classified as ‘Low’. This outcome is marginally statistically significant, provided that the 95% confidence interval is calculated to three decimal places (but it is not statistically significant if calculated to two decimal places). Care is needed in interpreting meta-analysed findings because pooling assumes that the two trials estimate a common underlying effect, and a result that only just clears the significance threshold at the combined level is highly sensitive to small changes in – or rounding of – the underlying estimates. 
  • There is relatively little evidence that the primary programme is more effective for learners with lower prior attainment, but there is such evidence for the secondary programme (in respect of non-calculator questions). There is no substantive evidence that the secondary programme has a different impact on pupils eligible for free schools meals. 
  • The per-pupil cost is relatively low, but the initial outlay of £6,000 for primary schools with two forms of entry (2FE) and above is not inconsiderable. Mathematics Mastery may represent a cost-effective change for schools to consider. 
  • The evaluations assessed the impact of the programme in its first year of adoption. It is not appropriate to draw inferences from the findings above to attribute potential value to the whole programme. EEF will be evaluating the medium and long-term impact of the approach by [outline the methodology agreed].
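The knife-edge nature of the combined result can be illustrated with a standard fixed-effect (inverse-variance) meta-analysis. This is a sketch under my own assumptions – the standard errors are back-derived from the confidence intervals reported in the two evaluations – and not the evaluators' own calculation:

```python
import math

def fixed_effect_meta(estimates):
    """Inverse-variance (fixed-effect) pooling of (effect, se) pairs.
    Returns the pooled effect and its 95% confidence interval."""
    weights = [1 / se**2 for _, se in estimates]
    pooled = sum(w * e for (e, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = 1 / math.sqrt(sum(weights))
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

# Standard errors derived as (CI upper - CI lower) / (2 * 1.96)
primary   = (0.100, (0.21 - (-0.01)) / (2 * 1.96))    # SE ~ 0.056
secondary = (0.055, (0.147 - (-0.037)) / (2 * 1.96))  # SE ~ 0.047
pooled, lo, hi = fixed_effect_meta([primary, secondary])
print(f"pooled = {pooled:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

With these back-derived inputs the pooled effect is about +0.07 standard deviations and the lower confidence limit is roughly +0.003: positive at three decimal places but zero at two – exactly the marginal significance described in the bullet above.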

In the meantime, it would be helpful for Ark and Maths Mastery to be much more transparent about KS1 assessment outcomes across their partner schools and possibly publish their own analysis based on comparison between schools undertaking the programme and matched control schools with similar intakes.

And it would be helpful for all partners to explain and evidence more fully the benefits to high attainers of the Maths Mastery approach – and to consider how it might be supplemented when it does not provide the blend of challenge and support that best meets their needs.

It is disappointing that, three years on, the failure of the National Curriculum Expert Panel to reconcile their advocacy for mastery with stretch and challenge for high attainers – in defiance of their remit to consider the latter as well as the former –  is being perpetuated across the system.

NCETM might usefully revisit their guidance on high attainers in primary schools to reflect their new-found commitment to mastery, while also incorporating additional material covering the point above.

.

GP

February 2015

 

High Attainment in the 2014 Secondary and 16-18 Performance Tables

.

This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.

Data Overload courtesy of opensourceway

It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them.

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to several previous posts.

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.

.

Headlines

At KS4:

  • High attainers constitute 32.4% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based on prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English; the same was true of maths in 99 schools. Only 21 schools managed 100% success in both English and maths. At the other extreme there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers do not make the expected progress in maths and one in five do not do so in English. In free schools one in every five high attainers falls short in English as do one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.

.

Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
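As an illustration, the banding rule quoted above can be sketched in a few lines. This is a hypothetical reconstruction: the thresholds (below 24, 24 to 29.99, 30 or more) come from the 2013 wording, while the function name and the simple mean across the three tests are my own assumptions.

```python
# Sketch of the KS2 prior-attainment banding quoted above (2013 wording).
# Thresholds are from the quoted text; averaging the three test point
# scores equally is an assumption about how the mean is formed.

def ks2_band(english_points: float, maths_points: float, science_points: float) -> str:
    """Classify a pupil as low/middle/high attaining from KS2 test points."""
    aps = (english_points + maths_points + science_points) / 3
    if aps < 24:
        return "low"
    if aps < 30:
        return "middle"
    return "high"

# A pupil at level 5 in maths (33 points) but level 3 in English (21)
# and level 4 in science (27) averages 27 points and lands in 'middle',
# illustrating why single-subject stars may not count as high attainers.
print(ks2_band(21, 33, 27))  # middle
```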

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations which:
  • restrict the qualifications counted
  • prevent any qualification from counting as larger than one GCSE
  • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.

.

Secondary outcomes

. 

The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, together accounting for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

.

HA sec1

*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Chart 1: Percentage of high attainers by school type, 2011-2014

. 

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers remind us that selection by ability through 11-plus tests gives a somewhat different sample than selection exclusively on the basis of KS2 attainment.

. 

Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio schools say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) at grades A*-C, including GCSEs in English and maths, compared with 56.6% of all learners. Allowing, of course, for the impact of the 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered 67% or fewer high attainers achieving this outcome, whereas in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.
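The ‘one in fourteen’ arithmetic used here can be made explicit: a success rate of 92.8% leaves 7.2% below the benchmark, i.e. roughly 1 in 14. A throwaway sketch (the helper name is mine):

```python
# Convert a percentage success rate into a rough 'one in N fail' figure.

def one_in_n_failing(success_rate_pct: float) -> int:
    """Round 100 / (100 - success rate) to the nearest whole number."""
    return round(100 / (100 - success_rate_pct))

print(one_in_n_failing(92.8))  # 14  (5+ A*-C incl. English and maths)
print(one_in_n_failing(93.8))  # 16  (A*-C in English and maths)
```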

. 

A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same schools that do so on the 5+ A*-C measure above. They are joined by Tong High School.

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students.
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English and maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).

Comparing achievement of these measures by school type and admissions basis 

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

.

HA sec2 

Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

.

It shows that:

  • There is significant variation on all five measures, though these are more pronounced for achievement of the EBacc, where there is a 20 percentage point difference between the success rates in sponsored academies (39.2%) and in converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English and one in six in maths. These are unacceptably poor outcomes.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.

.

Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

.

HA sec3

Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

.

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence there is a smaller gap, by and large, between the performance of high attainers in modern and comprehensive schools respectively than there is between high attainers in comprehensive and selective schools respectively.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in comprehensives and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.

. 

Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return an average grade of A*-, including Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have number of entries for all qualifications and not for GCSE only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinkley.

16-18: A level outcomes

.

A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, reveals how performance on this measure has changed since 2013 for different types of institution and, for schools, by admissions basis.

.

 2013 2014
LA-maintained school 11.4 11.5
Sponsored academy 5.4 5.3
Converter academy 16.4 15.7
Free school* 11.3 16.4
Sixth form college 10.4 10.0
Other FE college 5.8 5.7
 
Selective school 32.4 32.3
Comprehensive school 10.7 10.5
Modern school 2.0 3.2

Table 3: Percentage of A level students achieving grades AAB or higher with at least two in facilitating subjects, by institution type and admissions basis, 2013 and 2014

.

The substantial change for free schools will be affected by the inclusion of UTCs and studio schools in that line in 2013 and the addition of city technology colleges and 16-19 free schools in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.

.

Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

.

HA sec4

Chart 4: A level high attainment measures by gender, 2014

.

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As they decline, the lines for boys and girls are steadily diverging since girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

.

HA sec5

Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

.

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold with the final pair.

. 

HA sec6 

Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

.

Chart 7 compares performance by admissions policy in the schools sector on the four measures. Selective schools enjoy a big advantage on all four: more than one in four selective school students achieves at least three A*/A grades, and almost one in three achieves AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the rates achieved in modern schools and selective schools manage roughly three times the success rates in comprehensive schools. 

. 

HA sec7 

Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

 .

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state-funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, by Colchester Royal Grammar School. This is almost 100 points ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013.
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School. This is very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, which exceeds the 2013 low of 97.7 at Appleton Academy.
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. 
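The APS per entry figures map neatly onto average grades if one assumes the pre-2016 point scale used in the 16-18 tables (E = 150, rising in steps of 30 to A* = 300). That scale and the helper below are assumptions for illustration; the APS values themselves come from the tables above.

```python
# Relate APS per A level entry to the nearest average grade, assuming
# the pre-2016 tariff (E=150, D=180, C=210, B=240, A=270, A*=300).
# The tariff is an assumption; the APS inputs are from the tables.

GRADE_POINTS = {"A*": 300, "A": 270, "B": 240, "C": 210, "D": 180, "E": 150}

def nearest_grade(aps_per_entry: float) -> str:
    """Return the grade whose point value is closest to the APS per entry."""
    return min(GRADE_POINTS, key=lambda g: abs(GRADE_POINTS[g] - aps_per_entry))

print(nearest_grade(211.2))  # C  (state-funded average)
print(nearest_grade(271.1))  # A  (The Henrietta Barnett School)
```

On this scale the national APS per entry of 211.2 sits almost exactly on grade C, consistent with the average grade reported above.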

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those set in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than both 2012 and 2013 in converter academies and sixth form colleges.

.

HA sec8

Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

. 

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.

HA sec9

Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014.

 .

Disadvantaged high attainers 

There is nothing in either of the Performance Tables or the supporting SFRs to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected given the impact of the changes discussed above but, if the 2013 methodology is applied, so stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.

.

HA sec10

Chart 10: FSM gaps for headline GCSE measures, 2013-2014

.

Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and then by slightly more or less than one percentage point, depending on which 2014 methodology is selected.

Deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed up by extensive pupil premium funding.

.


Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014.

.

Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012, only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of the gaps to increase, so reinforcing the point made above.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.

.


Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014

.

Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it raises the question of why more urgent action was/is discounted.
  • Pupil premium is insufficiently targeted at the students/schools that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.

.

Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. This issue is particularly acute in sponsored academies where one in four or five high attainers are undershooting their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.

.

GP

February 2015

Addressed to Teach First and its Fair Education Alliance

.

This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

Betweentimes the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version which should be published on 27 February.

This version includes some relevant Twitter comments and explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

.

How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied  by BIS say that the total fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE’s  ‘FSM to Oxbridge’ impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.

.

[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).

.

.

The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]
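
As a rough check, the ratios implied by these rounded figures can be computed directly. This is a minimal sketch, assuming the 2011/12 non-FSM total is the overall 2,420 minus the ~50 FSM learners; all inputs are rounded to the nearest 10, so the ratios are indicative only:

```python
# FSM and non-FSM Oxbridge progression from state-funded institutions,
# using the rounded figures quoted above. The 2011/12 non-FSM total is
# an assumption: the overall 2,420 minus the ~50 FSM learners.
fsm = {"2011/12": 50, "2012/13": 50}
non_fsm = {"2011/12": 2420 - 50, "2012/13": 2080 + 480}

for year in ("2011/12", "2012/13"):
    ratio = non_fsm[year] / fsm[year]
    print(f"{year}: {ratio:.1f} non-FSM learners per FSM learner")
```

On these figures the ratio moves from roughly 47:1 to 51:1, which is the deterioration described above.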

.

The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted

.

.

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are  “currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals”.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.

.

.

Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report  says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.

.

.

.

.

.

Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.

.

.

.

Perhaps it should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome.
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.

 .

.

*In the Report Card, this impact goal is differently expressed, as narrowing the gap in university graduation, so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.
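
The arithmetic behind this caveat is easily illustrated. The figures below are invented for illustration only; the Report Card publishes no comparable baseline:

```python
# Hypothetical graduation counts (invented for illustration - not
# drawn from the Report Card). An extra 5,000 graduates from low
# income backgrounds narrows nothing if the high income total grows
# by more over the same period.
low_before, high_before = 20_000, 60_000

low_after = low_before + 5_000    # the targeted increase
high_after = high_before + 9_000  # hypothetical growth elsewhere

gap_before = high_before - low_before
gap_after = high_after - low_after

assert gap_after > gap_before  # the gap has widened, not narrowed
```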

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.

.

.

GP

January 2015

2014 Primary and Secondary Transition Matrices: High Attainers’ Performance

.

This is my annual breakdown of what the Transition Matrices tell us about the national performance of high attainers.

Data Overload courtesy of opensourceway

It complements my reviews of High Attainment in the 2014 Primary Performance Tables (December 2014) and of High Attainment in the 2014 Secondary and Post-16 Performance Tables (forthcoming, in February 2015).

The analysis is based on:

  • The 2014 Static national transition matrices for reading, writing and mathematics – Key Stage 1 to Key Stage 2 (October 2014) and
  • The 2014 Static Key Stage 2 to 4 national transition matrices, unamended – English and maths (December 2014).

There is also some reference to SFR41/2014: Provisional GCSE and equivalent results in England, 2013 to 2014.

The post begins with some important explanatory notes, before examining the primary and then the secondary matrices. There is a commentary on each matrix, followed by a summary of the key challenges for each sector.

.

Explanatory notes

The static transition matrices take into account results from maintained mainstream and maintained and non-maintained special schools. 

The tables reproduced below use colour coding:

  • purple = more than expected progress
  • dark green = expected progress
  • light green = less than expected progress and
  • grey = those excluded from the calculation.

I will assume that readers are familiar with expectations of progress under the current system of national curriculum levels.

I have written before about the assumptions underpinning this approach and some of the issues it raises.

(See in particular the sections called:

 ‘How much progress does the accountability regime expect from high attainers?’ and

‘Should we expect more progress from high attainers?’)

I have not reprised that discussion here.

The figures within the tables are percentages – X indicates data that has been suppressed (where the cohort comprises only one or two learners). Because of rounding, lines do not always add up to 100%.

In the case of the primary matrices, the commentary below concentrates on the progress made by learners who achieved level 3 or level 4 at KS1. In the case of the secondary matrices, it focuses on those who achieved sub-levels 5A, 5B or 5C at KS2.

Although the primary matrices include progression from KS1 level 4, the secondary matrices do not include progression from KS2 level 6 since the present level 6 tests were introduced only in 2012. Those completing GCSEs in 2014 will typically have undertaken KS2 assessment five years earlier.
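
For readers unfamiliar with the arithmetic, the ‘levels of progress’ calculation that underpins all the commentary below can be sketched briefly. The GCSE point equivalences used here (C = 7, B = 8, A = 9, A* = 10) are my assumption, inferred from how grades are paired with levels of progress in these matrices, not an official mapping:

```python
# Minimal sketch of the levels of progress (LoP) arithmetic.
# The GCSE grade equivalences are assumed (C = level 7, B = 8,
# A = 9, A* = 10), inferred from how the matrices pair grades
# with LoP; they are not an official mapping.
GCSE_POINTS = {"C": 7, "B": 8, "A": 9, "A*": 10}

def levels_of_progress(start_level: int, end_points: int) -> int:
    # Progress is the simple difference between the end point score
    # and the whole starting level (sub-levels are ignored here).
    return end_points - start_level

# KS1 to KS2: the expectation is 2 levels, e.g. KS1 L3 to KS2 L5.
assert levels_of_progress(3, 5) == 2
# KS2 to KS4: the expectation is 3 levels, e.g. KS2 L5 to GCSE B.
assert levels_of_progress(5, GCSE_POINTS["B"]) == 3
# From KS2 L5, grade A represents 4 LoP and A* represents 5 LoP.
assert levels_of_progress(5, GCSE_POINTS["A"]) == 4
assert levels_of_progress(5, GCSE_POINTS["A*"]) == 5
```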

The analysis includes comparison with the matrices for 2012 and 2013 respectively.

.

The impact of policy change on the secondary matrices

This comparison is straightforward for the primary sector (KS1 to KS2) but is problematic when it comes to the secondary matrices (KS2 to KS4).

As SFR41/2014 makes clear, the combined impact of:

  • vocational education reforms (restricting eligible qualifications and significantly reducing the weighting of some of them) and 
  • early entry policy (recording in performance measures only the first result achieved, rather than the outcome of any retakes)

has depressed overall KS4 results.

The impact of these factors on progress is not discussed within the text, although one of the tables gives overall percentages for those making the expected progress under the old and new methodologies respectively.

It does so for two separate groups of institutions, neither of which is perfectly comparable with the transition matrices because of the treatment of special schools:

  • State funded mainstream schools (excluding state-funded special schools and non-maintained special schools) and
  • State-funded schools (excluding non-maintained special schools).

However, the difference is likely to be marginal.

There is certainly very little difference between the two sets of figures for the categories above, though the percentages are very slightly larger for the first.

They show:

  • A variation of 2.3 percentage points in English (72.1% making at least the expected progress under the new methodology compared with 74.4% under the old) and
  • A variation of 2.4 percentage points in maths (66.4% making at least the expected progress compared with 68.8%).
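
A trivial arithmetic check of these variations, using the percentages quoted from SFR41/2014:

```python
# Percentages of pupils making at least the expected progress under
# the old and new methodologies, as quoted from SFR41/2014.
old_method = {"English": 74.4, "maths": 68.8}
new_method = {"English": 72.1, "maths": 66.4}

for subject in old_method:
    gap = round(old_method[subject] - new_method[subject], 1)
    print(f"{subject}: {gap} percentage point variation")
```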

There is no such distinction in the static transition matrices, nor does the SFR provide any information about the impact of these policy changes for different levels of prior attainment.

It seems a reasonable starting hypothesis that the impact will be much reduced at higher levels of prior attainment, because comparatively fewer students will be pursuing vocational qualifications.

One might also expect comparatively fewer high attainers to require English and/or maths retakes, even when the consequences of early entry are factored in, but that is rather more provisional.

It may be that the differential impact of these reforms on progression from different levels of prior attainment will be discussed in the statistical releases to be published alongside the Secondary Performance Tables. If so, I will update this analysis accordingly.

For the time being, my best counsel is:

  • To be aware that these policy changes have almost certainly had some impact on the progress of secondary high attainers, but 
  • Not to fall into the trap of assuming that they must explain all – or even a substantial proportion – of any downward trends (or absence of upward trends for that matter).

There will be more to say about this in the light of the analysis below.

Is this data still meaningful?

As we all know, the measurement of progression through national curriculum levels will shortly be replaced by a new system.

There is a temptation to regard the methodology underpinning the transition matrices as outmoded and irrelevant.

For the time being though, the transition matrices remain significant to schools (and to Ofsted) and there is an audience for analysis based on them.

Moreover, it is important that we make our best efforts to track annual changes under the present system, right up to the point of changeover.

We should also be thinking now about how to match progression outcomes under the new model with those available under the current system, so as to secure an uninterrupted perspective of trends over time.

Otherwise our conclusions about the longer-term impact of educational policies to raise standards and close gaps will be sadly compromised.

.

2014 Primary Transition Matrices

.

Reading

.

[Table: KS1 to KS2 transition matrix – reading]

.

Commentary:

  • It appears that relatively few KS1 learners with L4 reading achieved the minimum expected 2 levels of progress by securing L6 at the end of KS2. It is not possible for these learners to make more than the expected progress. The vast majority (92%) recorded a single level of progress, to KS2 L5. This contrasts with 2013, when 12% of KS1 L4 learners did manage to progress to KS2 L6, while only 88% were at KS2 L5. Caution is necessary since the sample of KS1 L4 readers is so small. (The X suggests the total cohort could be as few as 25 pupils.)
  • The table shows that 1% of learners achieving KS1 L3 reading made 3 levels of progress to KS2 L6, exactly the same proportion as in 2012 and 2013. But we know that L6 reading test entries were up 36% compared with 2013: one might reasonably have expected some increase in this percentage as a consequence. The absence of improvement may be attributable to the collapse in success rates on the 2014 L6 reading test.
  • 90% of learners achieving KS1 L3 made the expected 2 or more levels of progress to KS2 L5 or above, 89% making 2 levels of progress to L5. The comparable figures for those making 2 LoP in 2013 and 2012 were 85% and 89% respectively.
  • In 2014 only 10% of those achieving KS1 L3 made a single level of progress to KS2 L4, compared with 13% in 2013 and 10% in 2012.
  • So, when it comes to L3 prior attainers, the 2013 dip has been overcome, but there has been no improvement beyond the 2012 outcomes. Chart 1 makes this pattern more obvious, illustrating clearly that there has been relatively little improvement across the board.

.


Chart 1: Percentage of learners with KS1 L3 reading making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is significantly lower than the proportions with KS1 L2A, L2B or L2 overall who do so. This pattern is unchanged from 2012 and 2013.
  • The proportion exceeding 2 LoP is also far higher for every other level of KS1 prior achievement, also unchanged from 2012 and 2013.
  • Whereas the gap between KS1 L2 and L3 making more than 2 LoP was 36 percentage points in 2013, by 2014 it had increased substantially to 43 percentage points (44% versus 1%). This may again be partly attributable to the decline in L6 reading results.

.

Writing

.

[Table: KS1 to KS2 transition matrix – writing]

Commentary:

  • 55% of learners with L4 in KS1 writing made the expected 2 levels of progress to KS2 L6, while only 32% made a single level of progress to KS2 L5. This throws into sharper relief the comparable results for L4 readers. 
  • On the other hand, the 2013 tables recorded 61% of L4 writers making the expected progress, six percentage points higher than the 2014 success rate, so there has been a decline in success rates in both reading and writing for this small cohort. The reason for this is unknown, but it may simply be a consequence of the small sample.
  • Of those achieving KS1 L3, 12% made 3 LoP to KS2 L6, up from 6% in 2012 and 9% in 2013. The comparison with reading is again marked. A further 2% of learners with KS1 L2A made 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 writing made the expected 2 or more levels of progress, up from 89% in 2013. Some 79% made 2 LoP to L5, compared with 80% in 2013 and 79% in 2012, so there has been relatively little change.
  • However, in 2014 9% made only a single level of progress to KS2 L4. This is an improvement on 2013, when 11% did so and continues an improving trend from 2012 when 15% fell into this category, although the rate of improvement has slowed somewhat. 
  • These positive trends are illustrated in Chart 2 below, which shows reductions in the proportion achieving a single LoP broadly matched by corresponding improvements in the proportion achieving 3 LoP.


Chart 2: Percentage of learners with KS1 L3 writing making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is again lower than the proportions with KS1 L2A, L2B or L2 overall doing so. It is even lower than the proportion of those with KS1 L1 achieving this outcome. This is unchanged from 2013.
  • The proportion exceeding 2 LoP is far higher for every other level of KS1 achievement excepting L2C, again unchanged from 2013.
  • The percentage point gap between those with KS1 L2 overall and KS1 L3 making more than 2 LoP was 20 points in 2013 and remains unchanged at 20 points in 2014. Once again there is a marked contrast with reading.

.

Maths

.

[Table: KS1 to KS2 transition matrix – maths]

.

Commentary:

  • 95% of those achieving L4 maths at KS1 made the expected 2 levels of progress to KS2 L6. These learners are unable to make more than expected progress. Only 5% made a single level of progress to KS2 L5. 
  • There is a marked improvement since 2013, when 89% made the expected progress and 11% fell short. This is significantly better than KS1 L4 progression in writing and hugely better than KS1 L4 progression in reading.
  • 35% of learners with KS1 L3 maths also made 3 levels of progress to KS2 L6. This percentage is up from 26% in 2013 and 14% in 2012, indicating a continuing trend of strong improvement. In addition, 6% of those with L2A and 1% of those at L2B managed 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 made the expected progress (up one percentage point compared with 2013). Of these, 56% made 2 LoP to KS2 L5. However, 9% made only a single level of progress to KS2 L4 (down a single percentage point compared with 2013).
  • Chart 3 illustrates these positive trends. It contrasts with the similar charts for writing above, in that the rate at which the proportion of L3 learners making a single LoP is reducing is much slower than the rate of improvement in the proportion of KS1 L3 learners making 3 LoP.

.


Chart 3: Percentage of learners with KS1 L3 maths making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 in maths who achieved the expected progress is identical to the proportion achieving L2 overall that do so, at 91%. However, these rates are lower than for learners with KS1 L2B and especially L2A.
  • The proportion exceeding 2 LoP is also identical for those with KS1 L3 and L2 overall (whereas in 2013 there was a seven percentage point gap in favour of those with KS1 L2). The proportion of those with KS1 L2A exceeding 2 LoP remains significantly higher, but the gap has narrowed by six percentage points compared with 2013.

.

Key Challenges: Progress of High Attainers between KS1 and KS2

The overall picture from the primary transition matrices is one of comparatively strong progress in maths, positive progress in writing and a much more mixed picture in reading. But in none of these areas is the story unremittingly positive.

Priorities should include:

  • Improving progression from KS1 L4 to KS2 L6, so that the profile for writing becomes more similar to the profile for maths and, in particular, so that the profile for reading much more closely resembles the profile for writing. No matter how small the cohort, it cannot be acceptable that 92% of KS1 L4 readers make only a single level of progress.
  • Reducing to negligible the proportion of KS1 L3 learners making a single level of progress to KS2 L4. Approximately 1 in 10 learners continue to do so in all three assessments, although there has been some evidence of improvement since 2012, particularly in writing. Other than in maths, the proportion of KS1 L3 learners making a single LoP is significantly higher than the proportion of KS1 L2 learners doing so. 
  • Continuing to improve the proportion of KS1 L3 learners making 3 LoP in each of the three assessments, maintaining the strong rate of improvement in maths, increasing the rate of improvement in writing and moving beyond stagnation at 1% in reading. 
  • Eliminating the percentage point gaps between those with KS1 L2A making at least the expected progress and those with KS1 L3 doing so (5 percentage points in maths and 9 percentage points in each of reading and writing). At the very least, those at KS1 L3 should be matching those at KS1 L2B, but there are presently gaps between them of 2 percentage points in maths, 5 percentage points in reading and 6 percentage points in writing.

.

Secondary Transition Matrices

.

English

.

[Table: KS2 to KS4 transition matrix – English]

.

Commentary:

  • 98% of learners achieving L5A English at KS2 made at least 3 levels of progress to GCSE grade B or above in 2014. The same is true of 93% of those with KS2 L5B and 75% of those with KS2 L5C. All three figures have improved by one percentage point compared with 2013. The comparable figures in 2012 were 98%, 92% and 70% respectively.
  • 88% of learners achieving L5A at KS2 achieved at least four levels of progress from KS2 to KS4, so achieving a GCSE grade of A* or A, as did 67% of those with L5B and 34% of those with 5C. The comparable figures in 2013 were 89%, 66% and 33% respectively, while in 2012 they were 87%, 64% and 29% respectively.
  • 51% of learners with KS2 L5A made 5 levels of progress by achieving an A* grade at GCSE, compared with 25% of those with L5B, 7% of those with L5C and 1% of those with L4A. Compared with 2013, the success rate for those with KS2 L5A is down by two percentage points, while the L5B figure is up by two points and the L5C figure has also improved.
  • These cumulative totals suggest relatively little change in 2014 compared with 2013, with the possible exception of these two-percentage-point swings in the proportions of students making 5 LoP. 
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the same as the cumulative totals quoted above). This again shows relatively small changes in 2014, compared with 2013, and no obvious pattern.

.


Chart 4: Percentage of learners with KS2 L5A, L5B and L5C in English achieving 3, 4 and 5 levels of progress, 2012-2014

.

  • 1% of learners with KS2 L5A made only 2 levels of progress to GCSE grade C, as did 6% of those with L5B and 20% of those with L5C. These percentages are again little changed compared with 2013, following a much more significant improvement between 2012 and 2013.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 87% and 48% respectively – are significantly higher than the corresponding percentages for those with KS2 L5C. These gaps have also changed very little compared with 2013.

.

Maths

.

[Table: KS2 to KS4 transition matrix – maths]

.

Commentary:

  • 96% of learners with L5A at KS2 achieved the expected progress between KS2 and KS4 in 2014, as did 86% of those with KS2 L5B and 65% of those with KS2 L5C. The comparable percentages in 2013 were 97%, 88% and 70%, while in 2012 they were 96%, 86% and 67%. This means there have been declines compared with 2013 for L5A (one percentage point), L5B (two percentage points) and L5C (five percentage points).
  • 80% of learners with KS2 L5A made 4 or more levels of progress between KS2 and KS4, so achieving a GCSE grade A* or A. The same was true of 54% of those with L5B and 26% of those with L5C. In 2013, these percentages were 85%, 59% and 31% respectively, while in 2012 they were 84%, 57% and 30% respectively. So all the 2014 figures – for L5A, L5B and L5C alike – are five percentage points down compared with 2013.
  • In 2014 48% of learners with KS2 L5A made 5 levels of progress by achieving a GCSE A* grade, compared with 20% of those with L5B, 5% of those with L5C and 1% of those with L4A. All three percentages for those with KS2 L5 are down compared with 2013 – by 3 percentage points in the case of those with L5A, 2 points for those with L5B and 1 point for those with L5C.
  • It is evident that there is rather more volatility in the trends in maths progression and some of the downward swings are more pronounced than in English.
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB, these are not the cumulative totals quoted above). The only discernible pattern is that any improvement is confined to those making 3 LoP.

.


Chart 5: Percentage of learners with KS2 L5A, L5B and L5C in Maths achieving 3, 4 and 5 levels of progress, 2012-2014

  • 4% of those with KS2 L5A made only 2 LoP to GCSE grade C, as did 13% of those with L5B and 31% of those with L5C. All three percentages have worsened compared with 2013, by 1, 2 and 4 percentage points respectively.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 85% and 37% respectively – are significantly higher than the corresponding percentages for those with L5C, just as they are in English. And, as is the case with English, the percentage point gaps have changed little compared with 2013.

.

Key Challenges: Progress of High Attainers Between KS2 and KS4

The overall picture for high attainers from the secondary transition matrices is of relatively little change in English and of rather more significant decline in maths, though not by any means across the board.

It may be that the impact of the 2014 policy changes on high attainers has been relatively more pronounced in maths than in English – and perhaps more pronounced in maths than might have been expected.

If this is the case, one suspects that the decision to restrict reported outcomes to first exam entries is the most likely culprit.

On the other hand, it might be true that relatively strong improvement in English progression has been cancelled out by these policy changes, though the figures provided in the SFR for expected progress regardless of prior attainment make this unlikely.

Leaving causation aside, the most significant challenges for the secondary sector are to:

  • Significantly improve the progression rates for learners with KS2 L5A to A*. It should be a default expectation that they achieve five levels of progress, yet only 48% do so in maths and 51% in English – and these percentages are down 3 and 2 percentage points respectively compared with 2013.
  • Similarly, significantly improve the progression rates for learners with KS2 L5B to grade A. It should be a default expectation that they achieve at least 4 LoP, yet only 67% do so in English and 54% in maths – down one point since 2013 in English and 5 points in maths.
  • Reduce and ideally eliminate the rump of high attainers who make only two LoP, so reaching no better than a grade C. This proportion is especially high for those with KS2 L5C – 20% in English and, still worse, 31% in maths – but there is also a problem for those with 5B in maths, 13% of whom fall into this category. The proportion making only two LoP from 5C in maths has risen by 4 percentage points since 2013, while there has also been a 2 point rise for those with 5B. (Thankfully the L5C rate in English has improved by 2 points, but there is a long way still to go.)
  • Close significantly the progression performance gaps between learners with KS2 L5C and KS2 L4A, in both English and maths. In English there is currently a 12 percentage point gap for those making expected progress and a 14-point gap for those exceeding it. In maths, these gaps are 20 and 11 percentage points respectively. The problem in maths seems particularly pronounced. These gaps have changed little since 2013.
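The levels-of-progress arithmetic underpinning these challenges can be sketched briefly. The GCSE grade-to-level mapping below is my own assumption rather than an official table, but it reproduces the figures used throughout this post (KS2 L5 to GCSE B is the ‘expected’ 3 LoP; L5 to A* is 5 LoP; L4 to C is 3 LoP):

```python
# Minimal sketch of the levels-of-progress (LoP) calculation behind the
# transition matrices. The GCSE grade equivalences here are an assumption,
# chosen to match the figures quoted in this post.
GCSE_LEVEL_EQUIVALENT = {"C": 7, "B": 8, "A": 9, "A*": 10}

def levels_of_progress(ks2_level: int, gcse_grade: str) -> int:
    """LoP from a whole KS2 level; sub-levels 5A/5B/5C all count as level 5."""
    return GCSE_LEVEL_EQUIVALENT[gcse_grade] - ks2_level

def made_expected_progress(ks2_level: int, gcse_grade: str) -> bool:
    """'Expected progress' is at least 3 LoP."""
    return levels_of_progress(ks2_level, gcse_grade) >= 3

print(levels_of_progress(5, "A*"))     # 5 LoP: the default expectation argued for above
print(made_expected_progress(5, "C"))  # False: only 2 LoP from level 5
print(made_expected_progress(4, "C"))  # True: 3 LoP from level 4
```

Note that although the matrices report KS2 sub-levels (5A, 5B, 5C), LoP is counted from the whole level, which is why progression from L5C to grade C counts as only two levels of progress.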

.

Conclusion

This analysis of high attainers’ progression suggests a very mixed picture, across the primary and secondary sectors and between English and maths. There is some limited scope for congratulation, but too many persistent issues remain.

The commentary has identified four key challenges for each sector, which can be synthesised under two broad headings:

  • Raising expectations beyond the minimum expected progress – and significantly reducing our tolerance of underachievement amongst this cohort. 
  • Ensuring that those at the lower end of the high attaining spectrum sustain their initial momentum, at least matching the rather stronger progress of those with slightly lower prior attainment.

The secondary picture has become confused this year by the impact of policy changes.

We do not know to what extent these explain any downward trends – or depress any upward trends – for those with high prior attainment, though one may tentatively hypothesise that any impact has been rather more significant in maths than in English.

It would be quite improper to assume that the changes in high attainers’ progression rates compared with 2013 are entirely attributable to the impact of these policy adjustments.

It would be more accurate to say that they mask any broader trends in the data, making those more difficult to isolate.

We should not allow this methodological difficulty – or the impending replacement of the present levels-based system – to divert us from continuing efforts to improve the progression of high attainers.

For Ofsted is intensifying its scrutiny of how schools support the most able – and they will expect nothing less.

.

GP

January 2015

Gifted Phoenix 2014 Review and Retrospective

.

I am rounding out this year’s blogging with my customary backwards look at the various posts I published during 2014.

This is partly an exercise in self-congratulation but also flags up to readers any potentially useful posts they might have missed.

.


Norwegian Panorama by Gifted Phoenix

.

This is my 32nd post of the year, three fewer than the 35 I published in 2013. Even so, total blog views have increased by 20% compared with 2013.

Almost exactly half of these views originate in the UK. Other countries generating a large number of views include the United States, Singapore, India, Australia, Hong Kong, Saudi Arabia, Germany, Canada and South Korea. The site has been visited this year by readers located in 157 different countries.

My most popular post during 2014 was Gifted Education in Singapore: Part 2, which was published back in May 2012. This continues to attract interest in Singapore!

The most popular post written during 2014 was The 2013 Transition Matrices and High Attainers’ Performance (January).

Other 2014 posts that attracted a large readership were:

This illustrates just how strongly the accountability regime features in the priorities of English educators.

I have continued to concentrate comparatively more on domestic topics: approximately 75% of my posts this year have been about the English education system. I have not ventured beyond these shores since September.

The first section below reviews the minority of posts with a global perspective; the second covers the English material. A brief conclusion offers my take on future prospects.

.

Global Gifted Education

I began the year by updating my Blogroll, with the help of responses to Gifted Education Activity in the Blogosphere and on Twitter.

This post announced the creation of a Twitter list containing all the feeds I can find that mention gifted education (or a similar term, whether in English or another language) in their profile.

I have continued to update the list, which presently includes 1,312 feeds and has 22 subscribers. If you want to be included – or have additions to suggest – please don’t hesitate to tweet me.

While we’re on the subject, I should take this opportunity to thank my 5,960 Twitter followers, an increase of some 28% compared with this time last year.

In February I published A Brief Discussion about Gifted Labelling and its Permanency. This recorded a debate I had on Twitter about whether the ‘gifted label’ might be used more as a temporary marker than a permanent sorting device.

March saw the appearance of How Well Does Gifted Education Use Social Media?

This proposed some quality criteria for social media usage and blogs/websites that operate within the field of gifted education.

It also reviewed the social media activity of six key players (WCGTC, ECHA, NAGC, SENG, NACE and Potential Plus UK) as well as wider activity within the blogosphere, on five leading social media platforms and utilising four popular content creation tools.

Some of the websites mentioned above have been recast since the post was published and are now much improved (though I claim no direct influence).

Also in March I published What Has Become of the European Talent Network? Part One and Part Two.

These posts were scheduled just ahead of a conference organised by the Hungarian sponsors of the network. I did not attend, fearing that the proceedings would have limited impact on the future direction of this once promising initiative. I used the posts to set out my reservations, which include a failure to engage with constructive criticism.

Part One scrutinises the Hungarian talent development model on which the European Network is based. Part Two describes the halting progress made by the Network to date. It identifies several deficiencies that need to be addressed if the Network is to have a significant and lasting impact on pan-European support for talent development and gifted education.

During April I produced PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance.

This analyses the performance of high achievers from a selection of 11 jurisdictions – either world leaders or prominent English-speaking nations – on the PISA 2012 Creative Problem Solving assessment.

It is a companion piece to a 2013 post which undertook a similar analysis of the PISA 2012 assessments in Reading, Maths and Science.

In May I contributed to the Hoagies’ Bloghop for that month.

Air on the ‘G’ String: Hoagies’ Bloghop, May 2014 was my input to discussion about the efficacy of ‘the G word’ (gifted). I deliberately produced a provocative and thought-provoking piece which stirred typically intense reactions in several quarters.

Finally, September saw the production of Beware the ‘short head’: PISA’s Resilient Students’ Measure.

This takes a closer look at the relatively little-known PISA ‘resilient students’ measure – focused on high achievers from disadvantaged socio-economic backgrounds – and how well different jurisdictions perform against it.

The title reflects the post’s conclusion that, like many other countries, England:

‘…should be worrying as much about our ‘short head’ as our ‘long tail’’.

And so I pass seamlessly on to the series of domestic posts I published during 2014…

.

English Education Policy

My substantive post in January was High Attainment in the 2013 Secondary and 16-18 Performance Tables, an analysis of the data contained in last year’s Tables and the related statistical publications.

Also in January I produced a much briefer commentary on The 2013 Transition Matrices and High Attainers’ Performance.

The purpose of these annual posts (and the primary equivalent which appears each December) is to synthesise data about the performance of high attainers and high attainment at national level, so that schools can more easily benchmark their own performance.

In February I wrote What Becomes of Schools that Fail their High Attainers?*

It examines the subsequent history of schools that recorded particularly poor results with high attainers in the Secondary Performance Tables. (The asterisk references a footnote apologising ‘for this rather tabloid title’.)

By March I was focused on Challenging NAHT’s Commission on Assessment, subjecting the Commission’s Report to a suitably forensic examination and offering a parallel series of recommendations derived from it.

My April Fool’s joke this year was Plans for a National Centre for Education Research into Free Schools (CERFS). This has not materialised but, had our previous Secretary of State for Education not been reshuffled, I’m sure it would have been only a matter of time!

Also in April I was Unpacking the Primary Assessment and Accountability Reforms, exposing some of the issues and uncertainties embodied in the government’s response to consultation on its proposals.

Some of the issues I highlighted eight months ago are now being more widely discussed – not least the nature of the performance descriptors, as set out in the recent consultation exercise dedicated to those.

But the reform process is slow. Many other issues remain unresolved and it seems increasingly likely that some of the more problematic will be delayed deliberately until after the General Election.

May was particularly productive, witnessing four posts, three of them substantial:

  • How well is Ofsted reporting on the most able? explores how Ofsted inspectors are interpreting the references to the attainment and progress of the most able added to the Inspection Handbook late last year. The sample comprises the 87 secondary inspection reports that were published in March 2014. My overall assessment? Requires Improvement.


  • A Closer Look at Level 6 is a ‘data-driven analysis of Level 6 performance’. As well as providing a baseline against which to assess future Level 6 achievement, this also identifies several gaps in the published data and raises as yet unanswered questions about the nature of the new tests to be introduced from 2016.
  • One For The Echo Chamber was prompted by The Echo Chamber reblogging service, whose founder objected that my posts are too long, together with the ensuing Twitter debate. Throughout the year the vast majority of my posts have been unapologetically detailed and thorough. They are intended as reference material, to be quarried and revisited, rather than the disposable vignettes that so many seem to prefer. To this day they get reblogged on The Echo Chamber only when a sympathetic moderator is undertaking the task.
  • ‘Poor but Bright’ v ‘Poor but Dim’ arose from another debate on Twitter, sparked by a blog post which argued that the latter are a higher educational priority than the former. I argued that both deserved equal priority, since it is inequitable to discriminate between disadvantaged learners on the basis of prior attainment and the economic arguments cut both ways. This issue continues to bubble like a subterranean stream, only to resurface from time to time, most recently when the Fair Education Alliance proposed that the value of pupil premium allocations attached to disadvantaged high attainers should be halved.

In June I asked Why Can’t We Have National Consensus on Educating High Attainers? and proposed a set of core principles that might form the basis for such consensus.

These were positively received. Unfortunately though, the necessary debate has not yet taken place.

.

The principles should be valuable to schools considering how best to respond to Ofsted’s increased scrutiny of their provision for the most able. Any institution considering how best to revitalise its provision might discuss how the principles should be interpreted to suit its particular needs and circumstances.

July saw the publication of Digging Beneath the Destination Measures which explored the higher education destinations statistics published the previous month.

It highlighted the relatively limited progress made towards improving the progression of young people from disadvantaged backgrounds to selective universities.

There were no posts in August, half of which was spent in Norway, taking the photographs that have graced some of my subsequent publications.

In September I produced What Happened to the Level 6 Reading Results? an investigation into the mysterious collapse of L6 reading test results in 2014.

Test entries increased significantly. So did the success rates on the other level 6 tests (in maths and in grammar, punctuation and spelling (GPS)).  Even teacher assessment of L6 reading showed a marked upward trend.

Despite all this, the number of pupils successful on the L6 reading test fell from 2,062 in 2013 to 851 (provisional). The final statistics – released only this month – show a marginal improvement to 935, but the outcome is still extremely disappointing. No convincing explanation has been offered and the impact on 2015 entries is unlikely to be positive.

That same month I published Closing England’s Excellence Gaps: Part One and Part Two.

These present the evidence base relating to high attainment gaps between disadvantaged and other learners, to distinguish what we know from what remains unclear and so to provide a baseline for further research.

The key finding is that the evidence base is both sketchy and fragmented. We should understand much more than we do about the size and incidence of excellence gaps. We should be strengthening the evidence base as part of a determined strategy to close the gaps.

.

In October, 16-19 Maths Free Schools Revisited marked a third visit to the 16-19 maths free schools programme, concentrating on progress since my previous post in March 2013, especially at the two schools which have opened to date.

I subsequently revised the post to reflect an extended series of tweeted comments from Dominic Cummings, who was a prime mover behind the programme. The second version is called 16-19 Maths Free Schools Revisited: Oddyssean Edition.

The two small institutions at KCL and Exeter University (very similar to each other) constitute a rather limited outcome for a project that was intended to generate a dozen innovative university-sponsored establishments. There is reportedly a third school in the pipeline but, as 2014 closes, details have yet to be announced.

Excellence Gaps Quality Standard: Version One is an initial draft of a standard encapsulating effective whole school practice in supporting disadvantaged high attainers. It updates and adapts the former IQS for gifted and talented education.

This first iteration needs to be trialled thoroughly, developed and refined but, even as it stands, it offers another useful starting point for schools reviewing the effectiveness of their own provision.

The baseline standard captures the essential ‘non-negotiables’ intended to be applicable to all settings. The exemplary standard is pitched high and should challenge even the most accomplished of schools and colleges.

All comments and drafting suggestions are welcome.

.

In November I published twin studies of The Politics of Setting and The Politics of Selection: Grammar Schools and Disadvantage.

These issues have become linked since Prime Minister Cameron has regularly proposed an extension of the former as a response to calls on the right wing of his party for an extension of the latter.

This was almost certainly the source of autumn media rumours that a strategy, originating in Downing Street, would be launched to incentivise and extend setting.

Newly installed Secretary of State Morgan presumably insisted that existing government policy (which leaves these matters entirely to schools) should remain undisturbed. However, the idea might conceivably be resuscitated for the Tory election manifesto.

Now that UKIP has confirmed its own pro-selection policy there is pressure on the Conservative party to resolve its internal tensions on the issue and identify a viable alternative position. But the pro-grammar lobby is unlikely to accept increased setting as a consolation prize…

.

Earlier in December I added a companion piece to ‘The Politics of Selection’.

How Well Do Grammar Schools Perform With Disadvantaged Students? reveals that the remaining 163 grammar schools have very different records in this respect. The poor performance of a handful is a cause for concern.

I also published High Attainment in the 2014 Primary School Performance Tables – another exercise in benchmarking, this time for primary schools interested in how well they support high attainers and high attainment.

This shows that HMCI’s recent distinction between positive support for the most able in the primary sector and a much weaker record in secondary schools is not entirely accurate. There are conspicuous weaknesses in the primary sector too.

Meanwhile, Chinese learners continue to perform extraordinarily well on the Level 6 maths test, achieving an amazing 35% success rate, up six percentage points since 2013. This domestic equivalent of the Shanghai phenomenon bears closer investigation.

My penultimate post of the year, HMCI Ups the Ante on the Most Able, collates all the references to the most able in HMCI’s 2014 Annual Report and its supporting documentation.

It sets out Ofsted’s plans for the increased scrutiny of schools and for additional survey reports that reflect this scrutiny.

It asks whether Ofsted’s renewed emphasis will be sufficient to rectify the shortcomings they themselves identify and – assuming it will not – outlines an additional ten-step plan to secure system-wide improvement.

Conclusion

So what are the prospects for 2015 and beyond?

My 2013 Retrospective was decidedly negative about the future of global gifted education:

‘The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.’

Despite evidence of a few ‘green shoots’ during 2014, my overall sense of pessimism remains.

Meanwhile, future prospects for high attainers in England hang in the balance.

Several of the Coalition Government’s education reforms have been designed to shift schools’ focus away from borderline learners, so that every learner improves, including those at the top of the attainment distribution.

On the other hand, Ofsted’s judgement that a third of secondary inspections this year

‘…pinpointed specific problems with teaching the most able’

would suggest that schools’ everyday practice falls some way short of this ideal.

HMCI’s commitment to champion the interests of the most able is decidedly positive but, as suggested above, it might not be enough to secure the necessary system-wide improvement.

Ofsted is itself under pressure and faces an uncertain future, regardless of the election outcome. HMCI’s championing might not survive the arrival of a successor.

It seems increasingly unlikely that any political party’s election manifesto will have anything significant to say about this topic, unless the enthusiasm for selection in some quarters can be harnessed and redirected towards the much more pertinent question of how best to meet the needs of all high attainers in all schools and colleges, especially those from disadvantaged backgrounds.

But the entire political future is shrouded in uncertainty. Let’s wait and see how things are shaping up on the other side of the election.

From a personal perspective I am closing in on five continuous years of edutweeting and edublogging.

I once expected to extract from this commitment benefits commensurate with the time and energy invested. But that is no longer the case, if indeed it ever was.

I plan to call time at the end of this academic year.

 .

GP

December 2014

HMCI Ups the Ante on the Most Able

.

Her Majesty’s Chief Inspector Wilshaw made some important statements about the education of what Ofsted most often calls ‘the most able’ learners in his 2013/14 Annual Report and various supporting documents.


Another Norwegian Landscape by Gifted Phoenix

This short post compiles and summarises these statements, setting them in the context of current inspection policy and anticipated changes to the inspection process.

It goes on to consider what further action might be necessary to remedy the deficiencies Ofsted has identified in schools and to boost our national capacity to educate high attainers.

It continues a narrative which runs through several of my previous posts including:

.

What the Annual Report documents said

Ofsted’s press release marking publication of the 2013/14 Annual Report utilises a theme that runs consistently through all the documentation: while the primary sector continues to improve, progress has stalled in the secondary sector, resulting in a widening performance gap between the two sectors.

It conveys HMCI’s judgement that primary schools’ improvement is attributable to the fact that they ‘attend to the basics’, one of which is:

‘Enabling the more able [sic] pupils to reach their potential’

Conversely, the characteristics of secondary schools where improvement has stalled include:

‘The most able not being challenged’.

It is unclear whether Ofsted maintains a distinction between ‘more able’ and ‘most able’ since neither term is defined at any point in the Annual Report documentation.

In his speech launching the Annual Report, HMCI Wilshaw said:

‘The problem is also acute for the most able children. Primaries have made steady progress in helping this group. The proportion of pupils at Key Stage 2 gaining a Level 5 or above rose from 21% in 2013 to 24% this year. Attainment at Level 6 has also risen, particularly in mathematics, where the proportion reaching the top grade has increased from 3% to 9% in two years.

Contrast that with the situation in secondary schools. In 2013, nearly a quarter of pupils who achieved highly at primary school failed to gain even a B grade at GCSE. A third of our inspections of secondary schools this year pinpointed specific problems with teaching the most able – a third of inspections this year.

We cannot allow this lack of progress to persist. Imagine how dispiriting it must be for a child to arrive at a secondary school bursting with enthusiasm and keen to learn, only to be forced to repeat lessons already learnt and endure teaching that fails to stimulate them. To help tackle this problem, I have commissioned a report into progress at Key Stage 3 and it will report next year.’

HMCI’s written Commentary on the Annual Report says of provision in primary schools:

‘Many primary schools stretch the more able

Good and outstanding schools encourage wider reading and writing at length. Often, a school’s emphasis on the spiritual, moral, social and cultural aspects of the curriculum benefits all pupils but especially the more able, providing them with opportunities to engage with complex issues.

The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’ (Page 9)

The parallel commentary on provision in secondary schools says:

‘Too many secondary schools are not challenging the most able

In 2013, almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A or A* in those subjects at GCSE. Nearly a quarter of them did not even achieve a B grade.

Around a third of our inspections of secondary schools this year identified issues in the teaching of the most able pupils. Inspectors found that teachers’ expectations of the most able were too low. There is a worrying lack of scholarship permeating the culture of too many schools.

In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections and will publish a separate report on what we find.’ (Page 13)

The Annual Report itself adds:

‘Challenging the most able

England’s schools are still not doing enough to help the most able children realise their potential. Ofsted drew attention to this last year, but the story has yet to change significantly. Almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A* or A in those subjects at GCSE in 2013. Nearly a quarter of them did not even achieve a B grade and a disproportionate number of these are boys. Our brightest pupils are not doing as well as their peers in some other countries that are significantly outperforming England. In PISA 2012, fewer 15-year-olds in England were attaining at the highest levels in mathematics than their peers in Germany, Poland and Belgium. In reading, however, they were on a par.

This year, our inspectors looked carefully at how schools were challenging their most able pupils. Further action for individual schools was recommended in a third of our inspection reports. The majority of recommendations related to improved teaching of this group of pupils. Inspectors called on schools to ensure that the most able pupils are being given challenging work that takes full account of their abilities. Stretching the most able is a task for the whole school. It is important that schools promote a culture that supports the most able pupils to flourish, giving them opportunities to develop the skills needed by top universities and tracking their progress at every stage.

Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential. Ofsted will follow up its 2013 publication on the most able in secondary schools with another survey focusing on non-selective primary and secondary schools. As part of this survey, we will examine the transition of the most able pupils from one phase to the next.’

Rather strangely, there are substantive references in only two of the accompanying regional reports.

The Report on London – the region that arguably stands above all others in terms of overall pupil performance – says:

‘More able pupils [sic]

London does reasonably well overall for more able pupils. In 2012/13 the proportion of pupils who were high attainers in Year 6 and then went on to gain A* or A in GCSE English was 46% in London compared with 41% in England.  In mathematics, the proportions were 49% across England and 58% in London.

However, in 2012/13, seven local authorities – Croydon, Bexley, Havering, Lewisham, Lambeth, Tower Hamlets and Waltham Forest – were below the London and national proportions of previously high attaining pupils who went on to attain grade A* or A in GCSE English. With the exception of Bexley, the same local authorities also fell below the London and national levels for the proportion of previously high-attaining pupils who went on to attain grade A* or A in GCSE mathematics.

We have identified the need to secure more rapid progress for London’s more able pupils as one of our key priorities. Inspectors will be paying particular attention to the performance of the more able pupils in schools and local authorities where these pupils are not reaching their full potential.’

The Report on the North-West identifies a problem:

‘Too many of the more able students underperform at secondary school. Of the 23 local authorities in the North West, 13 are below the national level for the percentage of children achieving at least Level 5 at Key Stage 2 in English and mathematics. The proportion subsequently attaining A* or A at GCSE is very low in some areas, particularly Knowsley, Salford and Blackpool.’

But it does not mention tackling this issue amongst its regional priorities.

The six remaining regional reports are silent on the issue.

.

Summarising the key implications

Synthesising the messages from these different sources, it seems that:

  • Primary schools have made ‘steady progress’ in supporting the most able, improving their capacity to identify and develop their potential. 
  • Inspection evidence suggests one third of secondary schools have specific problems with teaching the most able. This is a whole school issue. Too many high attainers at the end of KS2 are no longer high attainers at the end of KS4. Teachers’ expectations are too low. A positive school culture is essential but there is ‘a worrying lack of scholarship permeating the culture of too many schools’.  
  • Ofsted will increase the scrutiny it gives to the performance of the most able in routine school inspections, taking account of the leadership shown by schools (which appears to mean the contribution made by school leaders within schools), and will sharpen their recommendations within school inspection reports to reflect this increased scrutiny. 
  • They will also publish a survey report in 2015 that will feature: the outcomes of their increased scrutiny; provision in ‘non-selective primary and secondary schools’ including transition between phases; and the progress of the most able learners in KS3. 
  • In London the need to secure more rapid progress for more able pupils is a priority for Ofsted’s regional team. They will focus particularly on progress in English and maths between KS2 and KS4 in seven local authorities performing below the national and London average. 

[Postscript: In his Select Committee appearance on 28 January 2015, HMCI said that the 2015 survey report will be published in May.]

All this suggests that schools would be wise to concentrate on strengthening leadership, school culture and transition – as well as eradicating any problems associated with teaching the most able.

KS3 is a particular concern in secondary schools. Although comparatively more attention will be paid to the secondary sector, primary schools will not escape Ofsted’s increased scrutiny.

This is as it should be since my recent analysis of high attainers and high attainment in the 2014 Primary Performance Tables demonstrates that there is significant underachievement amongst high attainers in the primary sector and, in particular, very limited progress in closing achievement gaps between disadvantaged and other learners at higher attainment levels.

Ofsted does not say that it will give particular attention to the most able learners in receipt of the pupil premium. The 2013 survey report committed it to doing so, but I could find no such emphasis in my survey of secondary inspection reports.

.

Will this be enough?

HMCI’s continuing concern about the quality of provision for the most able raises the question whether Ofsted’s increased scrutiny will be sufficient to bring about the requisite improvement.

Government policy is to leave this matter entirely to schools, although this has been challenged in some quarters. Labour in Opposition has been silent on the matter since Burnham’s Demos speech in July 2011.

More recent political debate about selection and setting has studiously avoided the wider question of how best to meet the needs of the most able, especially those from disadvantaged backgrounds.

If HMCI Wilshaw were minded to up the ante still further, what additional action might he undertake within Ofsted and advocate beyond it?

I sketch out below a ten-step plan for his and your consideration.

.

  1. Ofsted should strengthen its inspection procedures by publishing a glossary and supplementary inspection guidance, so that schools and inspectors alike have a clearer, shared understanding of Ofsted’s expectations and what provision should look like in outstanding and good schools. This should feature much more prominently the achievement, progress and HE destinations of disadvantaged high attainers, especially those in receipt of the Pupil Premium.

.

  2. The initiative under way in Ofsted’s London region should be extended immediately to all eight regions and a progress report should be included in Ofsted’s planned 2015 survey.

.

  3. The Better Inspection for All consultation must result in a clearer and more consistent approach to the inspection of provision for the most able learners across all sectors, with separate inspection handbooks adjusted to reflect the supplementary guidance above. Relevant high attainment, high attainer and excellence gaps data should be added to the School Data Dashboard.

.

  4. Ofsted should extend its planned 2015 survey to include a thorough review of the scope and quality of support for educating the most able provided to schools through local authority school improvement services, academy chains, multi-academy trusts and teaching school alliances. It should make recommendations for extending and strengthening such support, eliminating any patchiness of provision.

.

  5. Reforms to the assessment and accountability frameworks mean that less emphasis will be placed in future on the achievement of national benchmarks by borderline candidates and more on the attainment and progress of all learners. But there are still significant gaps in the data published about high attainment and high attainers, especially the differential performance of advantaged and disadvantaged learners. The decision to abandon the planned data portal – in which it was expected some of this data would be deposited – is problematic. Increased transparency would be helpful.

.

  6. There are unanswered questions about the support that the new levels-free assessment regime will provide for the achievement and progression of the most able. There is a risk that a ‘mastery’-focused approach will emphasise progression through increased depth of study, at the expense of greater breadth and faster pace, thus placing an unnecessary constraint on their education. Guidance is desirable to help eliminate these concerns.

.

  7. The Education Endowment Foundation (EEF) should extend its remit to include excellence gaps. All EEF-sponsored evaluations should routinely consider the impact on disadvantaged high attainers. The EEF should also sponsor projects to evaluate the blend of interventions that are most effective in closing excellence gaps. The Toolkit should be revised where necessary to highlight more clearly where specific interventions have a differential impact on high attainers.

.

  8. Efforts should be made to establish national consensus on the effective education of high attainers through consultation on and agreement of a set of common core principles.

.

  9. A ‘national conversation’ is needed to identify strategies for supporting (disadvantaged) high attainers, pushing beyond the ideological disagreements over selection and setting to consider a far wider range of options, including more innovative approaches to within-school and between-school provision.

.

  10. A feasibility study should be conducted into the viability of a national, non-governmental learner-centred support programme for disadvantaged high attainers aged 11-18. This would be market-driven but operate within a supporting national framework. It would be managed entirely within existing budgets – possibly an annual £50m pupil premium topslice plus a matching contribution from universities’ fair access outreach funding.

.

GP

December 2014

High Attainment in the 2014 Primary School Performance Tables

.

This is my annual post reviewing data about high attainment and high attainers at the end of Key Stage 2.

Data Overload courtesy of opensourceway


It draws on:

and parallel material for previous years.

‘High attainment’ is taken to mean National Curriculum Level 5 and above.

‘High attainers’ are defined in accordance with the Performance Tables, meaning those with prior attainment above Level 2 in KS1 teacher assessments (average points score of 18 or higher). This measure obviously excludes learners who are particularly strong in one area but correspondingly weak in another.

The proportions of the end-of-KS2 cohort defined as high, middle and low attainers have remained fairly constant since 2012.

High attainers presently constitute the top quartile of the relevant population, but this proportion is not fixed: it will increase as and when KS1 performance improves.

        High %   Middle %   Low %
2014      25        58        18
2013      25        57        18
2012      24        57        19

Table 1: Proportion of high, middle and low prior attainers in state-funded schools by year since 2012
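A small point on the rounding: because each figure in Table 1 is rounded to the nearest whole percentage point, a row need not sum to exactly 100 – the 2014 row sums to 101. A minimal Python sketch, using only the Table 1 figures, makes this explicit:

```python
# Rounded shares of high, middle and low prior attainers (Table 1).
shares = {2014: (25, 58, 18), 2013: (25, 57, 18), 2012: (24, 57, 19)}

# Row totals need not be exactly 100 because of rounding.
row_totals = {year: sum(values) for year, values in shares.items()}
print(row_totals)  # {2014: 101, 2013: 100, 2012: 100}
```

The same rounding effect explains why some figures differ slightly between SFR tables, as noted later in this post.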

 

The percentage of high attainers in different schools’ end-of-KS2 cohorts varies very considerably and is unlikely to remain constant from year to year. Schools with small year groups are particularly vulnerable to significant fluctuations.

The 2014 Performance Tables show that Minster School in Southwell, Nottinghamshire, and St Patrick’s Church of England Primary Academy in Solihull each had 88% high attainers.

Over 600 primary schools have 50% or more high attainers within their cohorts. But, at the other extreme, more than 570 have no high attainers at all, while some 1,150 have 5% or fewer.

This serves to illustrate the very unequal distribution of learners with high prior attainment between schools.

The commentary below opens with a summary of the headline findings. The subsequent sections focus in turn on the composite measure (reading, writing and maths combined), then on the outcomes of the reading, GPS (grammar, punctuation and spelling) and maths tests and finally on teacher assessment in writing.

I have tried to ensure that percentages are consistent throughout this analysis, but the effect of rounding means that some figures are slightly different in different SFR tables. I apologise in advance for – and will of course correct – any transcription errors.

.

Headlines

.

Overall Trends

Chart 1 below compares performance at level 5 and above (L5+) and level 4 and above (L4+) in 2013 and 2014. The bars on the left-hand side denote L4+, while those corresponding to L5+ are on the right.


Chart 1: L4+ and L5+ performance compared, 2013-2014

With the exception of maths, which has remained unchanged, there have been improvements across the board at L4+, of between two and four percentage points.

The same is true at L5+ and – in the case of reading, GPS and writing – the percentage point improvements are relatively larger. This is good news.

Chart 2 compares the gaps between disadvantaged learners (‘ever 6’ FSM plus children in care) and all other learners in state-funded schools on all five measures, for both 2013 and 2014.

.


Chart 2: Disadvantaged gaps at L4+ and L5+ for all five measures, 2013 and 2014

.

With the sole exception of the composite measure in 2013, each L4+ gap is smaller than the corresponding gap at L5+, though the difference can be as little as one percentage point (the composite measure) and as much as 11 percentage points (reading).

Whereas the L4+ gap in reading is lower than for any other measure, the L5+ reading gap is now the biggest. This suggests there is a particular problem with L5+ reading.

The distance between L4+ and L5+ gaps has typically widened since 2013, except in the case of maths, where it has narrowed by one percentage point.

While three of the L4+ gaps have closed slightly (composite, reading, GPS) the remainder are unchanged. However, two of the L5+ gaps have increased (composite, writing) and only the maths gap has closed slightly.

This suggests that what limited progress there has been in closing disadvantaged gaps has focused more on L4+ than L5+.

The pupil premium is not bringing about a radical improvement – and its impact is relatively lower at higher attainment levels.

A similar pattern is discernible with FSM gaps, as Chart 3 reveals. The chart excludes the composite measure, which is not supplied in the SFR.

Overall the picture at L4+ is cautiously positive, with small downward trends on three of the four measures, but the picture at L5+ is more mixed since two of the measures are unchanged.

.


Chart 3: FSM gaps at L4+ and L5+ compared, 2013 and 2014  

Composite measure

  • Although the proportion of learners achieving this benchmark is slightly higher in converter academies than in LA-maintained schools, the latter have improved faster since 2013. The success rate in sponsored academies is half that in converter academies. Free schools are improving but remain behind LA-maintained schools. 
  • Some 650 schools achieve 50% or higher, but another 470 record 0% (fewer than the 600 which did so in 2013). 
  • 67% of high attainers achieved this benchmark in 2014, up five percentage points on 2013 but one third still fall short, demonstrating that there is extensive underachievement amongst high attainers in the primary sector. This rather undermines HMCI’s observations in his Commentary on the 2014 Annual Report. 
  • Although over 670 schools have a 100% success rate amongst their high attainers, 42 schools have recorded 0% (down from 54 in 2013). Several of these do better by their middle attainers. In 10 primary schools no high attainers achieve L4+ in reading, writing and maths combined.

.

Reading

  • The substantial improvement in L5+ reading performance since 2013 masks an as yet unexplained crash in Level 6 test performance. Only 874 learners in state-funded schools achieved L6 reading, compared with 2,137 in 2013. This is in marked contrast to a substantive increase in L6 test entries, the success rate on L6 teacher assessment and the trend in the other L6 tests. In 2013 around 12,700 schools had no pupils who achieved L6 reading, but this increased to some 13,670 schools in 2014. Even the performance of Chinese pupils (otherwise phenomenally successful on L6 tests) went backwards. 
  • The proportion of Chinese learners achieving L5 in reading has reached 65% (compared with 50% for White learners), having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012. 
  • 43 primary schools had a 100% success rate at Level 5 in the reading test, but 29 more registered 0%. 
  • Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013. 

GPS

  •  The proportion of Chinese learners achieving L5+ in the GPS test is now 75%, a seven percentage point improvement on 2013. Moreover, 15% achieved Level 6, up eight percentage points on 2013. (The comparable Level 5+ percentage for White learners is 50%). There are unmistakeable signs that Chinese ascendancy in maths is being replicated with GPS. 
  • Some 7,210 schools had no learners achieving L6 in the GPS test, compared with 10,200 in 2013. While 18 schools recorded a perfect 100% record at Level 5 and above, 33 had no learners at L5+. 

.

Maths

  • Chinese learners continue to make great strides. The percentage succeeding on the L6 test has climbed a further six percentage points and now stands at 35% (compared with 8% for White Pupils). Chinese boys are at 39%. The proportion of Chinese learners achieving level 6 is now comparable to the proportions of other ethnic groups achieving level 5. This lends further credence to the notion that we have our own domestic equivalent of Shanghai’s PISA success – and perhaps to the suggestion that focusing on Shanghai’s classroom practice may bring only limited benefits. 
  • While it is commendable that 3% of FSM and 4% of disadvantaged learners are successful in the L6 maths test, the gaps between them and other learners are increasing as the overall success rate grows. There are now seven percentage point gaps for FSM and disadvantaged alike. 
  • Ten schools managed a L6 success rate of 50% or higher, while some 280 were at 30% or higher. On the other hand, 3,200 schools had no L6 passes (down from 5,100 in 2013). 
  • About 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013 – and two percentage points more than the proportion of successful middle attainers. But 27 schools posted a success rate of 50% or below.

.

Writing (TA)

  • Chinese pupils do not match their performance on the GPS test, though 6% achieve L6 in writing TA compared with just 2% of white pupils. 
  • Three schools managed a 50% success rate at Level 6 and 56 were at 25% or above. Only one school managed 100% at L5, but some 200 scored 0%. 
  • Some 93% of all pupils make the expected progress in writing between KS1 and KS2. This is true of 95% of high attainers – and 95% of middle attainers too.

 

Composite measure: reading, writing and maths

Table 2 shows the overall proportion of learners achieving L5 or above in all of reading, writing and maths in each year since 2012.

 

              2012   2013   2014
L5+ overall    20%    21%    24%
L5+ boys       17%    18%    20%
L5+ girls      23%    25%    27%

Table 2: Proportion of all learners achieving KS2 L5+ in reading, writing and maths, 2012-2014

The overall success rate has increased by three percentage points compared with 2013 and by four percentage points since 2012.

The percentage of learners achieving L4+ has also improved by four percentage points since 2012, so the improvement at L5+ is broadly commensurate.

Over this period, girls’ lead over boys has remained relatively stable at between six and seven percentage points.

The SFR reveals that success on this measure varies significantly between school types.

The percentages for LA-maintained schools (24%) and all academies and free schools (23%) are little different.

However mainstream converter academies stand at 26%, twice the 13% recorded by sponsored academies. Free schools are at 21%. These percentages have changed significantly compared with 2013.

.


Chart 4: Comparison of proportion of learners achieving L5+ in reading, writing and maths in 2013 and 2014

.

Whereas free schools are making rapid progress and sponsored academies are also improving at a significant rate, converter academies are improving more slowly than LA-maintained schools.

The highest percentages on this measure in the Performance Tables are recorded by Fox Primary School in Kensington and Chelsea (86%) and Hampden Gurney CofE Primary School in Westminster (85%).

Altogether, some 650 schools have achieved success rates of 50% or higher, while 23 have managed 75% or higher.

At the other end of the spectrum about 470 schools have no learners at all who achieved this measure, fewer than the 600 recording this outcome in 2013.

Table 3 shows the gap between disadvantaged (ie ‘ever 6’ FSM and children in care) learners and others, as recorded in the Performance Tables.

          2012   2013   2014
Disadv       9     10     12
Other       24     26     29
Gap         15     16     17

Table 3: Proportion of disadvantaged learners achieving L5+ in reading, writing and maths, 2012-2014

.

Although the percentage of disadvantaged learners achieving this benchmark has improved somewhat, the percentage of other learners doing so has improved faster, meaning that the gap between disadvantaged and other learners is widening steadily.

This contrasts with the trend at L4+, where the Performance Tables show a gap that has narrowed from 19 percentage points in 2012 (80% versus 61%) to 18 points in 2013 (81% versus 63%) and now to 16 points in 2014 (83% versus 67%).

Chart 5 below illustrates this comparison.

.


Chart 5: Comparing disadvantaged/other attainment gaps in KS2 reading, writing and maths combined at L4+ and L5+, 2012-2014.

While the L4+ gap has closed by three percentage points since 2012, the L5+ gap has widened by two percentage points. This suggests that disadvantaged learners amongst the top 25% by prior attainment are not benefiting commensurately from the pupil premium.
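For readers who want to check the arithmetic, a minimal Python sketch using only the gap figures quoted above (Table 3 for L5+; the Performance Tables figures for L4+):

```python
# Disadvantaged/other gaps (percentage points) on reading, writing and maths combined.
l4_gap = {2012: 19, 2013: 18, 2014: 16}  # at L4+, from the Performance Tables
l5_gap = {2012: 15, 2013: 16, 2014: 17}  # at L5+, from Table 3

print(l4_gap[2014] - l4_gap[2012])  # -3: the L4+ gap has closed by three points
print(l5_gap[2014] - l5_gap[2012])  # 2: the L5+ gap has widened by two points
```

The opposite signs of the two differences are the crux of the argument: progress at the benchmark level is not being matched at the higher attainment level.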

There are 97 primary schools where 50% or more disadvantaged learners achieve L5+ across reading, writing and maths (compared with 40 in 2013).

The highest performers record above 80% on this measure with their disadvantaged learners, albeit with cohorts of 6 to 8. Only one school with a more substantial cohort (of 34) manages over 70%. This is Tollgate Primary School in Newham.

The percentage of high attainers who achieved L5+ in 2014 was 67%, up five percentage points from 62% in 2013. (In 2012 the Performance Tables provided a breakdown for English and maths, which is not comparable).

Although this is a significant improvement, it means that one third of high attainers at KS1 still do not achieve this KS2 benchmark, suggesting that there is significant underachievement amongst this top quartile.

Thirteen percent of middle attainers also achieved this outcome, compared with 10% in 2013.

A significant number of schools – over 670 – do manage a 100% success rate amongst their high attainers, but there are also 42 schools where no high attainers achieve the benchmark (there were 54 in 2013). In several of them, more middle attainers than high attainers achieve the benchmark.

There are ten primary schools in which no high attainers achieve L4 in reading, writing and maths. Perhaps one should be thankful for the fact that no middle attainers in these schools achieve the benchmark either!

The KS2 average point score was 34.0 or higher in five schools, equivalent to a level 5A. The highest APS was 34.7, recorded by Fox Primary School, with a cohort of 42 pupils.

Across all state-funded schools, the average value added measure for high attainers across reading, writing and maths is 99.8, the same as it was in 2013.

The comparable averages for middle attainers and low attainers are 100.0 and 100.2 respectively, showing that high attainers benefit slightly less from their primary education.

The highest value-added recorded for high attainers is 104.7 by Tudor Court Primary School in Thurrock, while the lowest is 93.7 at Sacriston Junior School in Durham (now closed).

Three more schools are below 95.0 and some 250 are at 97.5 or lower.

.

Reading Test

Table 4 shows the percentage of all learners, boys and girls achieving L5+ in reading since 2010. There has been a five percentage point increase (rounded) in the overall result since 2013, which restores performance to the level it had reached in 2010.

A seven percentage point gap in favour of girls remains unchanged from 2013. This is four points less than the comparable gender gap in 2010.

.

              2010   2011   2012   2013   2014
L5+ overall     50     43     48     44     50
Boys            45     37     43     41     46
Girls           56     48     53     48     53

Table 4: Percentage of learners achieving L5+ in reading since 2010

.

As reported in my September 2014 post ‘What Happened to the Level 6 Reading Results?’, L6 performance in reading collapsed in 2014.

The figures have improved slightly since the provisional results were released, but the collapse is still marked.

Table 5 shows the numbers successful since 2012.

The number of successful learners in 2014 is less than half the number successful in 2013 and almost back to the level in 2012 when the test was first introduced.

This despite the fact that the number of entries for the level 6 test – 95,000 – was almost exactly twice the 47,000 recorded in 2012 and significantly higher than the 70,000 entries in 2013.

For comparison, the number of pupils awarded level 6 in reading via teacher assessment was 15,864 in 2013 and 17,593 in 2014.

We still have no explanation for this major decline which is entirely out of kilter with other L6 test outcomes.

.

         2012          2013          2014
         %     No      %     No      %     No
L6+      0     900     0     2,262   0     935
Boys     0     200     0     592     0     263
Girls    0     700     1     1,670   0     672

Table 5: Number and percentage of learners achieving L6 on the KS2 reading test 2012-2014

.

These figures include some pupils attending independent schools, but another table in the SFR reveals that 874 learners in state-funded primary schools achieved L6 (compared with 2,137 in 2013). Of these, all but 49 achieved L3+ in their KS1 reading assessment.

But some 13,700 of those with L3+ reading at the end of KS1 progressed to L4 or lower at the end of KS2.

The SFR does not supply numbers of learners with different characteristics achieving L6 and all percentages are negligible. The only group recording a positive percentage is Chinese learners, at 1%.

In 2013, Chinese learners were at 2% and some other minority ethnic groups recorded 1%, so not even the Chinese have been able to withstand the collapse in the L6 success rate.

According to the SFR, the FSM gap at L5 is 21 percentage points (32% versus 53% for all other pupils). The disadvantaged gap is also 21 percentage points (35% versus 56% for all other pupils).

Chart 6 shows how these percentages have changed since 2012.

.


Chart 6: FSM and disadvantaged gaps for KS2 reading test at L5+, 2012-2014

FSM performance has improved by five percentage points compared with 2013, while disadvantaged performance has grown by six percentage points.

However, gaps remain unchanged for FSM and have increased by one percentage point for disadvantaged learners. There is no discernible or consistent closing of gaps in KS2 reading at L5.

These gaps of 21 percentage points, for both FSM and disadvantaged learners, are significantly larger than the comparable gaps at L4+ of 12 (FSM) and 10 (disadvantaged) percentage points.

The analysis of level 5 performance in the SFR reveals that the proportion of Chinese learners achieving level 5 has reached 65%, having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012.

Turning to the Performance Tables, we can see that, in relation to L6:

  • The highest recorded percentage achieving L6 is 17%, at Dent CofE Voluntary Aided Primary School in Cumbria. Thirteen schools recorded a L6 success rate of 10% or higher. (The top school in 2013 recorded 19%).
  • In 2013 around 12,700 schools had no pupils who achieved L6 reading, whereas in 2014 this had increased to some 13,670 schools.

In relation to L5:

  • 43 schools achieved a 100% record in L5 reading (compared with only 18 in 2013). All but one of these recorded 0% at L6, which may suggest that they were concentrating on maximising L5 achievement rather than risking L6 entry.
  • Conversely, there are 29 primary schools where no learners achieved L5 reading.

Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013.

And 41 schools recorded a success rate of 50% or lower on this measure, most of them comfortably exceeding this with their low and middle attainers alike.

.

GPS Test

Since the grammar, punctuation and spelling test was first introduced in 2013, there is only a two-year run of data. Tables 6 and 7 below show performance at L5+ and L6+ respectively.

.

              2013 %   2014 %
L5+ overall      48       52
Boys             42       46
Girls            54       58

Table 6: Percentage of learners achieving L5+ in GPS, 2013 and 2014

         2013           2014
         %     No       %     No
L6+      2     8,606    4     21,111
Boys     1     3,233    3     8,321
Girls    2     5,373    5     12,790

Table 7: Number and percentage of learners achieving L6 in GPS, 2013 and 2014

.

Table 6 shows an overall increase of four percentage points in 2014 and the maintenance of a 12 percentage point gap in favour of girls.

Table 7 shows a very healthy improvement in L6 performance, which only serves to emphasise the parallel collapse in L6 reading. Boys have caught up a little on girls but the latter’s advantage remains significant.

The SFR shows that 75% of Chinese learners achieve L5 and above, up seven percentage points from 68% in 2013. Moreover, the proportion achieving L6 has increased by eight percentage points, to 15%. There are all the signs that Chinese eminence in maths is repeating itself with GPS.

Chart 7 shows how the FSM gap and disadvantaged gap has changed at L5+ for GPS. The disadvantaged gap has remained stable at 19 percentage points, while the FSM gap has narrowed by one percentage point.

These gaps are somewhat larger than those at L4 and above, which stand at 17 percentage points for FSM and 15 percentage points for disadvantaged learners.

.


Chart 7:  FSM and disadvantaged gaps for KS2 GPS test at L5+, 2013 and 2014

.

The Performance Tables show that, in relation to L6:

  • The school with the highest percentage achieving level 6 GPS is Fulwood, St Peter’s CofE Primary School in Lancashire, which records a 47% success rate. Some 89 schools achieve a success rate of 25% or higher.
  • In 2014 there were some 7,210 schools that recorded no L6 performers at all, but this compares favourably with 10,200 in 2013. This significant reduction is in marked contrast to the increase in schools with no L6 readers.

Turning to L5:

  • 18 schools recorded a perfect 100% record for L5 GPS. These schools recorded L6 success rates that vary between 0% and 25%.
  • There are 33 primary schools where no learners achieved L5 GPS.

.

Maths test

Table 8 below provides the percentages of learners achieving L5+ in the KS2 maths test since 2010.

Over the five year period, the success rate has improved by eight percentage points, but the improvement in 2014 is less pronounced than it has been over the last few years.

The four percentage point lead that boys have over girls has changed little since 2010, apart from a temporary increase to six percentage points in 2012.

.

              2010   2011   2012   2013   2014
L5+ overall     34     35     39     41     42
Boys            36     37     42     43     44
Girls           32     33     36     39     40

Table 8: Percentage of learners achieving L5+ in KS2 maths test, 2010-2014

.

Table 9 shows the change in achievement in the L6 test since 2012. This includes pupils attending independent schools – another table in the SFR indicates that the total number of successful learners in 2014 in state-funded schools is 47,349, meaning that almost 95% of those achieving L6 maths are located in the state-funded sector.

There has been a healthy improvement since 2013, with almost 15,000 more successful learners – an increase of over 40%. Almost one in ten of the end of KS2 cohort now succeeds at L6. This places the reversal in L6 reading into even sharper relief.

The ratio between boys and girls has remained broadly unchanged, so boys continue to account for over 60% of successful learners.

.

         2012           2013           2014
         %     No       %     No       %     No
L6+      3     19,000   7     35,137   9     50,001
Boys     –     12,400   8     21,388   11    30,173
Girls    –     6,600    5     13,749   7     19,828

Table 9: Number and percentage of learners achieving L6 in KS2 maths test, 2012-2014
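The growth figures quoted above can be verified directly from the Table 9 totals. A minimal Python sketch:

```python
# L6 maths test passes, all schools (Table 9).
l6_2013 = 35_137
l6_2014 = 50_001

absolute_increase = l6_2014 - l6_2013          # "almost 15,000 more successful learners"
relative_increase = absolute_increase / l6_2013  # "an increase of over 40%"

print(absolute_increase)                   # 14864
print(round(relative_increase * 100, 1))   # 42.3
```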

.

The SFR shows that, of those achieving L6 in state-funded schools, some 78% had achieved L3 or above at KS1. However, some 9% of those with KS1 L3 – something approaching 10,000 pupils – progressed only to L4 or lower.

The breakdown for minority ethnic groups shows that the Chinese ascendancy continues. This is illustrated by Chart 8 below.


Chart 8: KS2 L6 maths test performance by ethnic background, 2012-2014

In 2014, the percentage of Chinese achieving L5+ has increased by a respectable three percentage points to 74%, but the L6 figure has climbed by a further six percentage points to 35%. More than one third of Chinese learners now achieve L6 on the maths test.

This means that the proportion of Chinese pupils achieving L6 is now broadly similar to the proportion of other ethnic groups achieving Level 5 (34% of White pupils, for example).

They are fifteen percentage points ahead of the next best outcome – 20% recorded by Indian learners. White learners stand at 8%.

There is an eight percentage point gap between Chinese boys (39%) and Chinese girls (31%). The gap for white boys and girls is much lower, but this is a consequence of the significantly lower percentages.

Given that Chinese pupils are capable of achieving such extraordinary results under the present system, these outcomes raise significant questions about the balance between school and family effects and whether efforts to emulate Chinese approaches to maths teaching are focused on the wrong target.

Success rates in the L6 maths test are high enough to produce percentages for FSM and disadvantaged learners. The FSM and disadvantaged gaps both stand at seven percentage points, whereas they were at 5 percentage points (FSM) and 6 percentage points (disadvantaged) in 2013. The performance of disadvantaged learners has improved, but not as fast as that of other learners.

Chart 9 shows how these gaps have changed since 2012.

While the L6 gaps are steadily increasing, the L5+ gaps have remained broadly stable at 20 percentage points (FSM) and 21 percentage points (disadvantaged). There has been a small one percentage point improvement in the gap for disadvantaged learners in 2014, matching the similar small improvement for L4+.

The gaps at L5+ remain significantly larger than those at L4+ (13 percentage points for FSM and 11 percentage points for disadvantaged).


Chart 9: FSM and disadvantaged gaps, KS2 L5+ and L6 maths test, 2012 to 2014

.

The Performance Tables reveal that:

  • The school with the highest recorded percentage of L6 learners is Fox Primary School (see above) at 64%, some seven percentage points higher than its nearest rival. Ten schools achieve a success rate of 50% or higher (compared with only three in 2013), 56 at 40% or higher and 278 at 30% or higher.
  • However, over 3,200 schools record no L6 passes. This is a significant improvement on the 5,100 in this category in 2013, but the number is still far too high.
  • Nine schools record a 100% success rate for L5+ maths. This is fewer than the 17 that managed this feat in 2013.

Some 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013. This is two percentage points more than did so in reading in 2014 – and two percentage points more than the proportion of middle attainers managing this.

However, 27 schools had a success rate of 50% or below, the vast majority of them comfortably exceeding this with their middle attainers – and often their low attainers too.

.

Writing Teacher Assessment

Table 10 shows how the percentage achieving L5+ through the teacher assessment of writing has changed since 2012.

There has been a healthy five percentage point improvement overall, and an improvement of three percentage points since last year, stronger than the comparable improvement at L4+. The large gender gap of 15 percentage points in favour of girls is also unchanged since 2013.

.

2012 2013 2014
L5+ overall 28 30 33
Boys 22 23 26
Girls 35 38 41

Table 10: Percentage achieving level 5+ in KS2 writing TA 2012-2014

.

Just 2% of learners nationally achieve L6 in writing TA – 11,340 pupils (10,654 of them located in state-funded schools).

However, this is a very significant improvement on the 2,861 recording this outcome in 2013. Just 3,928 of the total are boys.

Chinese ascendancy at L6 in writing is less pronounced: the success rate for Chinese learners stands at 6%. However, if the comparator is performance at L5+, Chinese learners record 52%, compared with 33% for both White and Asian learners.

The chart below shows how FSM and disadvantaged gaps have changed at L5+ since 2012.

This indicates that the FSM gap, having widened by two percentage points in 2013, has narrowed by a single percentage point in 2014, so it remains higher than it was in 2012. Meanwhile the disadvantaged gap has widened by one percentage point since 2013.

The comparable 2014 gaps at L4+ are 15 percentage points (FSM) and 13 percentage points (disadvantaged), so the gaps at L5+ are significantly larger.

.


Chart 10: FSM and disadvantaged gaps, L5+ Writing TA, 2012-2014

.

The Performance Tables show that:

  • Three schools record an L6 success rate of 50% and only 56 are at 25% or higher.
  • At the other end of the spectrum, the number of schools with no L6s is some 9,780, about a thousand fewer than in 2013.
  • At L5+ only one school has a 100% success rate (there were four in 2013). Conversely, about 200 schools record 0% on this measure.

Some 93% of all pupils make the expected progress in writing between KS1 and KS2 and this is true of 95% of high attainers – the same percentage of middle attainers is also successful.

Conclusion

Taken together, this evidence presents a far more nuanced picture of high attainment and high attainers’ performance in the primary sector than suggested by HMCI’s Commentary on his 2014 Annual Report:

‘The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’

There are four particular areas of concern:

  • Underachievement amongst high attainers is prevalent in far too many primary schools. Although there has been some improvement since 2013, the fact that only 67% of those with high prior attainment at KS1 achieve L5 in reading, writing and maths combined is particularly worrying.
  • FSM and disadvantaged achievement gaps at L5+ remain significantly larger than those at L4+ – and there has been even less progress in closing them. The pupil premium ought to be having a significantly stronger impact on these excellence gaps.
  • The collapse of L6 reading test results is all the more stark when compared with the markedly improved success rates in GPS and maths which HMCI notes. We still have no explanation of the cause.
  • The success rates of Chinese pupils on L6 tests remain conspicuous and in maths are frankly extraordinary. This evidence of a ‘domestic Shanghai effect’ should be causing us to question why other groups are so far behind them – and whether we need to look beyond Shanghai classrooms when considering how best to improve standards in primary maths.

.

GP

December 2014

The Politics of Selection: Grammar Schools and Disadvantage

This post considers how England’s selective schools are addressing socio-economic disadvantage.

Another irrelevant Norwegian vista by Gifted Phoenix

It is intended as an evidence base against which to judge various political statements about the potential value of selective education as an engine of social mobility.

It does not deal with recent research reports about the historical record of grammar schools in this respect. These show that – contrary to received wisdom – selective education has had a very limited impact on social mobility.

Politicians of all parties would do well to acknowledge this, rather than attempting (as some do) to perpetuate the myth in defiance of the evidence.

This post concentrates instead on the current record of these schools, recent efforts to strengthen their capacity to support the Government’s gap closing strategy and prospects for the future.

It encourages advocates of increased selection to consider the wider question of how best to support high attainers from disadvantaged backgrounds.

The post is organised into four main sections:

  • A summary of how the main political parties view selection at this point, some six months ahead of a General Election.
  • A detailed profile of the socio-economic inclusiveness of grammar schools today, which draws heavily on published data but also includes findings from recent research.
  • An evaluation of national efforts over the last year to reform selective schools’ admissions, testing and outreach in support of high-attaining disadvantaged learners.
  • Comparison of the various policy options for closing excellence gaps between such learners and their more advantaged peers – and consideration of the role that reformed and/or increased selection might play in a more comprehensive strategy.

Since I know many readers prefer to read my lengthy posts selectively I have included page jumps from each of the bullet points above to the relevant sections below.

One more preliminary point.

This is the second time I have explored selection on this Blog, though my previous post, on fair access to grammar schools, appeared as far back as January 2011. This post updates some of the data in the earlier one.

One purpose of that earlier post was to draw attention to the parallels in the debates about fair access to grammar schools and to selective higher education.

I do not repeat those arguments here, although writing this has confirmed my opinion that they are closely related issues and that many of the strategies deployed at one level could be applied equally at the other.

So there remains scope to explore how appropriate equivalents of Offa, access agreements, bursaries and contextualised admissions might be applied to selective secondary admissions arrangements, alongside the reforms that are already on the table. I leave that thought hanging.

.

The Political Context

My last post on ‘The Politics of Setting’ explored how political debate surrounding within-school and between-school selection is becoming increasingly febrile as we approach the 2015 General Election.

The two have become inextricably linked because Prime Minister Cameron, in deciding not to accommodate calls on the right of his party to increase the number of selective schools, has called instead for ‘a grammar stream in every school’ and, latterly, for a wider – perhaps universal – commitment to setting.

In May 2007, Cameron wrote:

‘That’s what the grammar school row was about: moving the Conservative Party on from slogans such as ‘Bring back grammar schools’ so that we can offer serious policies for improving state education for everyone…

…Most critics seem to accept, when pressed, that as I have said, the prospect of more grammars is not practical politics.

Conservative governments in the past – and Conservative councils in the present – have both failed to carry out this policy because, ultimately, it is not what parents want….

…When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.

With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a ‘grammar stream’ in every subject in every school.

Setting would be a focus for Ofsted and a priority for all new academies.’

As ‘The Politics of Setting’ explained, this alternative aspiration to strengthen within-school selection has not yet materialised, although there are strong signs that it is still Cameron’s preferred way forward.

The Coalition has been clear that:

‘It is not the policy of the Government to establish new grammar schools in England’ (Hansard, 10 February 2014, Col. 427W).

but it has also:

  • Removed barriers to the expansion of existing grammar schools through increases to planned admission numbers (PANs) within the Admissions Code.
  • Introduced several new selective post-16 institutions through the free schools policy (though not as many as originally envisaged since the maths free schools project has made relatively little progress).
  • Made efforts to reform the admissions procedures of existing selective secondary schools and
  • Accepted in principle that these existing schools might also expand through annexes, or satellite schools. This is now a live issue since one decision is pending and a second proposal may be in the pipeline.

The Liberal Democrats have enthusiastically pursued at least the third of these policies, with Lib Dem education minister David Laws leading the Government’s efforts to push the grammar schools further and faster down this route.

In his June 2014 speech (of which much more below) Laws describes grammar schools as ‘a significant feature of the landscape in many local areas’ and ‘an established fact of our education system’.

But, as the Election approaches, the Lib Dems are increasingly distancing themselves from a pro-selective stance.

Clegg is reported to have said recently that he did not believe selective schools were the way forward:

‘The Conservatives have got this odd tendency to constantly want to turn the clock back.

Some of them seem to be hankering towards a kind of selective approach to education, which I don’t think works.

Non-selective schools stream and a lot of them stream quite forcefully, that’s all fine, but I think a segregated school system is not what this country needs.’

Leaving aside the odd endorsement of ‘forceful streaming’, this could even be interpreted as hostile to existing grammar schools.

Meanwhile, both frontrunners to replace Cameron as Tory leader have recently restated their pro-grammar school credentials:

  • Constituency MP Theresa May has welcomed consideration of the satellite option in Maidenhead.

The right wing of the Tory party has long supported increased selection and will become increasingly vociferous as the Election approaches.

Conservative Voice – which describes itself as on the ‘center-Right of the party’ [sic] – will imminently launch a campaign calling for removal of the ban on new grammar schools to be included in the Conservative Election Manifesto.

They have already conducted a survey to inform the campaign, from which it is clear that they will be playing the social mobility card.

The Conservative right is acutely aware of the election threat posed by UKIP, which has already stated its policy that:

‘Existing schools will be allowed to apply to become grammar schools and select according to ability and aptitude. Selection ages will be flexible and determined by the school in consultation with the local authority.’

Its leader has spoken of ‘a grammar school in every town’ and media commentators have begun to suggest that the Tories will lose votes to UKIP on this issue.

Labour’s previous shadow education minister, Stephen Twigg, opposed admissions code reforms that made it easier for existing grammar schools to expand.

But the present incumbent has said very little on the subject.

A newspaper interview in January 2014 hints at a reforming policy:

‘Labour would not shut surviving grammar schools but Mr Hunt said their social mix should be questioned.

“If they are simply about merit why do we see the kind of demographics and class make-up within them?”’

But it seems that this has dropped off Labour’s agenda now that the Coalition has adopted it.

I could find no formal commitment from Labour to address the issue in government, even though that might provide some sort of palliative for those within the party who oppose selection in all its forms and have suggested that funding should be withdrawn from selective academies.

So the overall picture suggests that Labour and the Lib Dems are deliberately distancing themselves from any active policy on selection, presumably regarding it as a poisoned chalice. The Tories are conspicuously riven on the issue, while UKIP has stolen a march by occupying the ground which the Tory right would like to occupy.

As the Election approaches, the Conservatives face four broad choices. They can:

  • Endorse the status quo under the Coalition, making any change of policy conditional on the outcome of a future leadership contest.
  • Advocate more between-school selection. This might or might not stop short of permitting new selective 11-18 secondary schools. Any such policy needs to be distinct from UKIP’s.
  • Advocate more within-school selection, as preferred by Cameron. This might adopt any position between encouragement and compulsion.
  • Develop a more comprehensive support strategy for high attaining learners from disadvantaged backgrounds. This might include any or all of the above, but should also consider support targeted directly at disadvantaged students.

These options are discussed in the final part of the post.

The next section provides an assessment of the current state of selective school engagement with disadvantaged learners, as a precursor to describing how the reform programme is shaping up.

.

How well do grammar schools serve disadvantaged students?

.

The Grammar School Stock and the Size of the Selective Pupil Population

Government statistics show that, as of January 2014, there are 163 selective state-funded secondary schools in England.

This is one less than previously, following the merger of Chatham House Grammar School for Boys and Clarendon House Grammar School. These two Kent schools formed the Chatham and Clarendon Grammar School with effect from 1 September 2013.

At January 2014:

  • 135 of these 163 schools (83%) are academy converters, leaving just 28 in local authority control. Twenty of the schools (12%) have a religious character.
  • Some 5.1% of pupils in state-funded schools attend selective schools. (The percentage has fluctuated between 4% and 5% over the last 20 years.) The percentage of learners under 16 attending selective schools is lower: between 2007 and 2011 it was 3.9% to 4.0%.
  • There are 162,630 pupils of all ages attending state-funded selective secondary schools, of whom 135,365 (83.2%) attend academies and 27,265 (16.8%) attend LA maintained schools. This represents an increase of 1,000 compared with 2013. The annual intake is around 22,000.

The distribution of selective schools between regions and local authority areas is shown in Table 1 below.

The percentage of selective school pupils by region varies from 12.0% in the South East to zero in the North East, a grammar-free zone. The percentage of pupils attending selective schools by local authority area (counting only those with at least one selective school) varies from 45.1% in Trafford to 2.1% in Devon.

Some of the percentages at the upper end of this range seem to have increased significantly since May 2011, although the two sets of figures may not be exactly comparable.

For example, the proportion of Trafford pupils attending selective schools has increased by almost five percentage points (from 40.2% in 2011). In Torbay there has been an increase of over four percentage points (34.8% compared with 30.5%) and in Kent an increase of almost four percentage points (33.3% compared with 29.6%).

.

Table 1: The distribution of selective schools by region and local authority area and the percentage of pupils within each authority attending them (January 2014)

Region Schools Pupils Percentage of all pupils
North East 0 0 0
North West 19 20,240 4.9
Cumbria 1 833 2.8
Lancashire 4 4,424 6.6
Liverpool 1 988 3.3
Trafford 7 7,450 45.1
Wirral 6 6,547 30.5
Yorkshire and Humberside 6 6,055 1.9
Calderdale 2 2,217 14.2
Kirklees 1 1,383 5.5
North Yorkshire 3 2,454 6.5
East Midlands 15 12,700 4.5
Lincolnshire 15 12,699 26.9
West Midlands 19 15,865 4.5
Birmingham 8 7,350 10.4
Stoke-on-Trent 1 1,078 8.7
Telford and Wrekin 2 1,283 11.7
Walsall 2 1,423 7.0
Warwickshire 5 3,980 12.0
Wolverhampton 1 753 5.0
East of England 8 7,715 2.1
Essex 4 3,398 4.0
Southend-on-Sea 4 4,319 32.8
London 19 20,770 4.4
Barnet 3 2,643 11.6
Bexley 4 5,466 26.6
Bromley 2 1,997 9.0
Enfield 1 1,378 6.1
Kingston upon Thames 2 2,021 20.5
Redbridge 2 1,822 7.9
Sutton 5 5,445 30.7
South East 57 59,910 12.0
Buckinghamshire 13 15,288 42.2
Kent 32 33,059 33.3
Medway 6 6,031 32.2
Reading 2 1,632 24.1
Slough 4 3,899 37.4
South West 20 19,370 6.2
Bournemouth 2 2,245 23.3
Devon 1 822 2.1
Gloucestershire 7 6,196 16.2
Plymouth 3 2,780 16.3
Poole 2 2,442 26.8
Torbay 3 2,976 34.8
Wiltshire 2 1,928 6.6
TOTAL 163 162,630 5.1

.

Some authorities are deemed wholly selective but different definitions have been adopted.

One PQ reply suggests that 10 of the 36 local authority areas – Bexley, Buckinghamshire, Kent, Lincolnshire, Medway, Slough, Southend, Sutton, Torbay and Trafford – are deemed wholly selective because they feature in the Education (Grammar School Ballots) Regulations 1998.

Another authoritative source – the House of Commons Library – omits Bexley, Lincolnshire and Sutton from this list, presumably because they also contain comprehensive schools.

Of course many learners who attend grammar schools live in local authority areas other than those in which their schools are located. Many travel significant distances to attend.

A PQ reply from March 2012 states that some 76.6% of all those attending grammar schools live in the same local authority as their school, while 23.2% live outside. (The remainder are ‘unknowns’.)

These figures mask substantial variation between authorities. A recent study for the Sutton Trust, ‘Entry into Grammar Schools in England’ (Cribb et al., 2013), provides equivalent figures for each local authority from 2009-10 to 2011-12.

The percentage of within-authority admissions reaches 38.5% in Trafford and 36% in Buckinghamshire but, at the other extreme, it can be as low as 1.7% in Devon and 2.2% in Cumbria.

The percentage of admissions from outside the authority can be as much as 75% (Reading) and 68% (Kingston) or, alternatively, as low as 4.5% in Gloucestershire and 6.8% in Kent.

.

Recent Trends in the Size and Distribution of the Disadvantaged Grammar School Pupil Population

Although this section of the post is intended to describe the ‘present state’, I wanted to illustrate how that compares with the relatively recent past.

I attached to my 2011 post a table showing how the proportion of FSM students attending grammar schools had changed annually since 1995. This is reproduced below, updated to reflect more recent data where available.

A health warning is attached since the figures were derived from several different PQ replies and I cannot be sure that the assumptions underpinning each were identical. Where there are known methodological differences I have described these in the footnotes.

.

Table 2: Annual percentage FSM in all grammar schools and gap between that and percentage FSM in all secondary schools, 1995-2013

Year % FSM in grammar schools % FSM in all schools Percentage point gap
1995 3.9 18.0 14.1
1996 3.8 18.3 14.5
1997 3.7 18.2 14.5
1998 3.4 17.5 14.1
1999 3.1 16.9 13.8
2000 2.8 16.5 13.7
2001 2.4 15.8 13.4
2002 2.2 14.9 12.7
2003 2.1 14.5 12.4
2004 2.2 14.3 12.1
2005 2.1 14.0 11.9
2006 2.2 14.6 12.4
2007 2.0 13.1 11.1
2008 1.9 12.8 10.9
2009 2.0 13.4 11.4
2010 15.4
2011 2.4 14.6 12.2
2012 14.8
2013 15.1
2014 14.6

(1) Prior to 2003 includes dually registered pupils and excludes boarding pupils; from 2003 onwards includes dually registered and boarding pupils.

(2) Before 2002 numbers of pupils eligible for free school meals were collected at school level. From 2002 onwards numbers have been derived from pupil level returns.

(3) 2008 and 2009 figures for all schools exclude academies

.

Between 1996 and 2005 the FSM rate in all schools fell annually, dropping by 4.3 percentage points over that period. The FSM rate in grammar schools also fell, by 1.7 percentage points. The percentage point gap between all schools and selective schools fell by 2.6 percentage points.

Both FSM rates reached their lowest point in 2008. At that point the FSM rate in grammar schools was half what it had been in 1996. Thereafter, the rate across all schools increased, but has been rather more volatile, with small swings in either direction.

One might expect the 2014 FSM rate across all grammar schools to be at or around its 2011 level of 2.4%.
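As a quick check on the trend figures quoted above, the arithmetic can be reproduced directly from Table 2 (a minimal sketch in Python; the rates are transcribed from the table):

```python
# FSM rates transcribed from Table 2:
# year -> (% FSM in grammar schools, % FSM in all schools)
fsm = {
    1996: (3.8, 18.3),
    2005: (2.1, 14.0),
    2008: (1.9, 12.8),
}

def pp_gap(year):
    """Percentage point gap between all schools and grammar schools."""
    gs, all_schools = fsm[year]
    return round(all_schools - gs, 1)

# Falls between 1996 and 2005, as quoted in the text
drop_all = round(fsm[1996][1] - fsm[2005][1], 1)  # 4.3 percentage points
drop_gs = round(fsm[1996][0] - fsm[2005][0], 1)   # 1.7 percentage points
drop_gap = round(pp_gap(1996) - pp_gap(2005), 1)  # 2.6 percentage points
```

The same dictionary confirms that the 2008 grammar school rate (1.9%) is half the 1996 rate (3.8%).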

A more recent PQ reply revealed the total number of pupil premium recipients attending selective schools over the last three financial years:

  • FY2011-12 – 3,013
  • FY2012-13 – 6,184 (on extension to ‘ever 6’)
  • FY2013-14 – 7,353

(Hansard 20 January 2014, Col. WA88)

This suggests a trend of increasing participation in the sector, though total numbers are still very low, averaging around 45 per school and slightly over six per year group.
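The per-school averages quoted above follow directly from the FY2013-14 figure (a rough sketch; the assumption of seven year groups covers years 7 to 13 in an 11-18 grammar school):

```python
# Pupil premium recipients in selective schools, FY2013-14
# (Hansard, 20 January 2014, Col. WA88)
pp_pupils = 7353
schools = 163
year_groups = 7  # assumed: years 7-13 in an 11-18 school

per_school = pp_pupils / schools           # around 45 recipients per school
per_year_group = per_school / year_groups  # slightly over six per year group
```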

.

Comparison with FSM rates in selective authorities

In 2012, a table deposited in the Commons Library (Dep 2012-0432) in response to a PQ provided the January 2011 FSM rates for selective schools and all state-funded secondary schools in each authority containing selective schools.

In this case, the FSM rates provided relate only to pupils aged 15 or under. The comparable national average rates are 2.7% for selective schools and 15.9% for all state-funded schools.

  • Selective school FSM rates per authority vary between 6.0% in Birmingham and 0.6% in Wiltshire.
  • Other authorities with particularly low FSM rates include Bromley (0.7%), Reading (0.8%) and Essex (0.9%).
  • Authorities with relatively high FSM rates include Wirral (5.2%), Walsall (4.9%) and Redbridge (4.8%).
  • The authorities with the biggest gaps between FSM rates for selective schools and all schools are Birmingham, at 28.0 percentage points, Liverpool, at 23.8 percentage points, Enfield at 21.8 percentage points and Wolverhampton, at 21.7 percentage points.
  • Conversely, Buckinghamshire has a gap of only 4.7 percentage points, since its FSM rate for all state-funded secondary schools is only 6.0%.
  • Buckinghamshire’s overall FSM rate is more than four times the rate in its grammar schools, while in Birmingham the overall rate is almost six times the grammar school rate. On this measure, the disparity is greatest in metropolitan boroughs with significant areas of disadvantage.
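The ratios in the last two bullets can be recovered from the published rates and gaps (a minimal sketch; the Buckinghamshire selective-school rate is derived from the gap rather than quoted directly):

```python
# FSM rates (%) and gaps (percentage points) from the January 2011
# figures quoted above (Dep 2012-0432)
birmingham_selective, birmingham_gap = 6.0, 28.0
bucks_overall, bucks_gap = 6.0, 4.7

# Birmingham: overall rate = selective rate + gap
birmingham_overall = birmingham_selective + birmingham_gap  # 34.0%
birmingham_ratio = birmingham_overall / birmingham_selective  # ~5.7x, 'almost six times'

# Buckinghamshire: selective rate = overall rate - gap
bucks_selective = bucks_overall - bucks_gap  # ~1.3%
bucks_ratio = bucks_overall / bucks_selective  # ~4.6x, 'more than four times'
```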

.

Proportion of disadvantaged learners in each selective school

I attached to my 2011 post a table setting out the FSM rates (all pupils, regardless of age) for each selective school in January 2009.

This updated version sets out the January 2013 FSM and disadvantaged (ie ‘ever 6 FSM’) rates by school, drawn from the latest School Performance Tables. (Click on the screenshot below to download the Excel file.)

.

[Screenshot: Excel file of FSM and ‘ever 6’ rates by grammar school]

.

Key points include:

  • The size of grammar schools varies considerably, with numbers on roll (NOR) ranging from 437 (Newport Girls’) to 1518 (Townley Girls’). The average NOR is slightly below 1,000.
  • 24 of the 163 schools (14.7%) have suppressed FSM percentages. Since the lowest published percentage is 1.1%, the impact of suppression is that all schools at or below 1.0% are affected. Since no school returns 0, we must assume that all contain a handful of FSM learners. It is notable that six of these schools are in Buckinghamshire, three in Gloucestershire and three in Essex. Both Bromley grammar schools also fall into this category.
  • 67 selective schools (41.1%) have FSM rates of 2% or lower. The average FSM rate across all these schools is 3.25%.
  • The highest recorded FSM rates are at Handsworth Grammar School (14.4%), King Edward VI Aston School (12.9%) and Stretford Grammar School (12%). These three are significant outliers – the next highest rate is 7.8%.
  • As one would expect, there is a strong correlation between FSM rates and ‘ever 6’ rates. Most of the schools with the lowest ‘ever 6’ rates are those with SUPP FSM rates. Of the 26 schools returning ‘ever 6’ rates of 3.0% or lower, all but 7 fall into this category.
  • The lowest ‘ever 6’ rate is the 0.6% returned by Sir William Borlase’s Grammar School in Buckinghamshire. On this evidence it is probably the most socio-economically selective grammar school in the country. Five of the ten schools with the lowest ‘ever 6’ rates are located in Buckinghamshire.
  • A few schools have FSM and ‘ever 6’ rates that do not correlate strongly. The most pronounced is Ribston Hall in Gloucestershire which is SUPP for FSM yet has an ‘ever 6’ rate of 5.5%, not far short of the grammar school average which is some 6.6%. Clitheroe Royal Grammar School is another outlier, returning an ‘ever 6’ rate of 4.8%.
  • The highest ‘ever 6’ rates are in Handsworth Grammar School (27.2%), Stretford Grammar School (24.3%) and King Edward VI Aston School (20.3%). These are the only three above 20%.
  • In London there is a fairly broad range of socio-economic selectivity, from St Olave’s and St Saviour’s (Bromley) – which records an ‘ever 6’ rate of 2.5% – to Woodford County High School, Redbridge, where the ‘ever 6’ rate is 11%. As noted above, the FSM rates at the two Bromley schools are SUPP. The London school with the highest FSM rate is again Woodford County High, at 5%.

Another source throws further light on the schools with the lowest FSM rates. In October 2013, a PQ reply provided a table of the 50 state secondary schools in England with the lowest entitlement to FSM, alongside a second table of the 50 schools with the highest entitlement.

These are again January 2013 figures but on this occasion the rates are for pupils aged 15 or under and the only figures suppressed (denoted by ‘x’) are where no more than two pupils are FSM.

Sir William Borlase’s tops the list, being the only school in the country with a nil return (so the one or two FSM pupils who attend must be aged over 15 and may have been admitted directly to the sixth form).

The remainder of the ‘top ten’ includes eight selective schools and one comprehensive (Old Swinford Hospital School in Dudley). The eight grammar schools are:

  • Cranbrook, Kent – x
  • Adams’, Telford and Wrekin – x
  • St Olave’s and St Saviour’s, Bromley – 0.5%
  • Dr Challoner’s High, Buckinghamshire – 0.5%
  • Dr Challoner’s Grammar, Buckinghamshire – 0.6%
  • Aylesbury Grammar, Buckinghamshire – 0.6%
  • Newstead Wood, Bromley – 0.6%
  • Pate’s, Gloucestershire – 0.6%

Comparing the data in my tables for 2009 and 2013 also throws up some interesting facts:

  • Some schools have increased significantly in size – Burnham Grammar School (Buckinghamshire), Sir Thomas Rich’s (Gloucestershire), Highworth Grammar School for Girls (Kent), Simon Langton Grammar School for Boys (Kent), Kesteven and Grantham Girls’ School (Lincolnshire), Carre’s Grammar School (Lincolnshire) and St Joseph’s College (Stoke) have all increased their NORs by 100 or more.
  • However, some other schools have shrunk significantly, notably The Skegness Grammar School in Lincolnshire (down 129), The Boston Grammar School in Lincolnshire (down 110), Fort Pitt Grammar School in Medway (down 132) and Slough Grammar School (down 175).
  • While recognising that the figures may not be fully comparable, there have also been some significant changes in the proportions of FSM pupils on roll. Significant increases are evident at King Edward VI Aston (up 5.9 percentage points), Fort Pitt (up 5.1 percentage points) and Handsworth Grammar (up 4.7 percentage points).
  • The only equally pronounced mover in the opposite direction is St Anselm’s College on The Wirral, where the FSM rate has more than halved, falling by 5.2 percentage points, from 9.8% to 4.6%.

Additional statistics were peppered throughout David Laws’ June 2014 speech.

He refers to a paper by DfE analysts which unfortunately has not been published:

  • In 2013, 21 grammar schools had fewer than 1% of pupils eligible for FSM. Ninety-eight had fewer than 3% eligible and 161 had fewer than 10% eligible. This compares to a national average of 16.3% across England. (The basis for these figures is not supplied but they more or less agree with those above.)
  • In Buckinghamshire in 2011, 14% of the year 7 cohort were eligible for the pupil premium, but only 4% of the cohort in Buckinghamshire grammar schools were eligible. In Lincolnshire the comparable percentages were 21% and 7% respectively.

.

Selectivity

Most commentary tends to regard the cadre of selective schools as very similar in character, leaving aside any religious affiliation and the fact that many are single-sex establishments.

Although the fact is rarely discussed, some grammar schools are significantly more selective than others.

The 2013 Secondary Performance Tables show that only 10 grammar schools can claim that 100% of the cohort comprises high attainers. (These are defined on the basis of performance in statutory end of KS2 tests, in which they must record an average point score (APS) of 30 or more across English, maths and science.)

At several schools – Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent) – the proportion of high attainers stands at 70% or below.

Many comprehensive schools comfortably exceed this, hence – when it comes to KS2 attainment – some comprehensives are more selective than some grammar schools.

Key variables determining a grammar school’s selectivity will include:

  • The overall number of pupils in the area served by the school and/or the maximum geographical distance that pupils may travel to it.
  • The number of pupils who take the entrance tests, including the proportion of pupils attending independent schools competing for admission.
  • The number of competing selective schools and high-performing comprehensive schools, plus the proportion of learners who remain in or are ‘siphoned off’ into the independent sector.
  • The number of places available at the school and the pass mark in the entrance tests.

I have been unable to locate any meaningful measure of the relative selectivity of grammar schools, yet this is bound to impact on the admission of disadvantaged learners.

An index of selectivity would improve efforts to compare more fairly the outcomes achieved by different grammar schools, including their records on access for disadvantaged learners.

.

Prior attainment data

In his June 2014 speech, Laws acknowledges that:

  • ‘A key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit’.
  • However, in wholly selective areas fewer than 50% of FSM learners achieving Level 5 enter selective schools compared with two-thirds of non-FSM pupils:

‘We calculated it would require a shift of just 200 level 5 FSM pupils to go into grammar schools in wholly selective areas to remove this particular bias.’

Alternative versions of this statement appear elsewhere, as we shall see below.

Using data from 2009/10 and 2011/12, the Sutton Trust study by Cribb et al explored whether advantaged and disadvantaged pupils with KS2 level 5 in both English and maths were equally likely to attend grammar schools.

They found that those not eligible for FSM are still more likely to attend. This applies regardless of whether the grammar school is located in a selective local authority, although the percentages and the gaps vary considerably.

  • In selective authorities, some 66% of these high attaining non-FSM pupils went on to grammar schools compared with under 40% of FSM pupils, giving a gap of over 26 percentage points. (Note that the percentage for FSM is ten percentage points lower than the one quoted by Laws. I can find no reason for this disparity, unless the percentage has changed dramatically since 2012.)
  • In isolated grammar schools outside London the gap is much smaller, at roughly 11 percentage points (18% non-FSM against 7% FSM).
  • In London there is a similar gap of 12 percentage points (15% non-FSM versus 3% FSM).

 

[Figure: chart from the Cribb et al study]

A similar pattern is detected on the basis of KS2 maths test fine points scores:

‘Two points are evident. First, for any given level of maths attainment, pupils who are eligible for FSM have a noticeably lower probability of attending a grammar school. Indeed, a non-FSM student with an average maths score has the same probability of entering a grammar school as an FSM pupil with a score 0.7 standard deviations above average. Second, the gap in probability of attendance between FSM and non-FSM pupils actually widens substantially: non-FSM pupils with test scores one standard deviation above average have a 55% likelihood of attending a grammar school in selective local authorities, whereas similar pupils who are eligible for FSM have only a 30% chance of attending a grammar school. This is suggestive that bright pupils from deprived families are not attending grammar schools as much as their attainment would suggest they might.’

This rather calls into question Laws’ initial statement that level 5 performance among FSM pupils is ‘a key barrier’ to admission.

The study also confirms that pupils attending primary schools with relatively high levels of deprivation are much less likely to progress to grammar schools.

On the other hand, some 13% of pupils nationally transfer into selective schools from non-state schools and schools outside England. The researchers are unable to distinguish clearly those from abroad and those from the independent sector, but note that they are typically wealthier than state school transfers.

This masks significant variation between local authority areas.

Almost 34% of such pupils transfer into grammar schools in Essex, as do 24% in Bromley, 23% in Wiltshire and 22% in Bournemouth and Southend. At the other extreme, only 6% are incomers in Kirklees.

.

Headteacher perceptions

The Sutton Trust released a parallel research report from NATCEN reporting the outcomes of interviews with a small sample of three primary school and eight grammar school headteachers.

The researchers found that:

  • Rightly or wrongly, many heads felt disadvantaged learners had relatively lower educational aspirations.
  • Disadvantaged parents were sometimes perceived to know less about grammar schools and place less value on the benefits they might confer.
  • Heads felt disadvantaged parents ‘often associated grammar schools with tradition, middle class values and elitism’. Parents felt their children ‘might struggle interacting with children from more affluent backgrounds’.
  • Grammar school heads highlighted the role of primary schools but ‘this was difficult when primary schools disagreed with assessment based entry processes and selective education in general’.
  • Heads felt grammar schools should provide more outreach and demonstrate their openness to everyone. It was suggested that, as grammar schools increasingly take in pupils from further away and/or from independent schools, this might further distance schools from their local communities.
  • It was widely acknowledged that learners from more advantaged backgrounds were coached to pass the entrance exams. Some grammar heads regarded tutoring as ‘good examination preparation’; others recognised it as a barrier for disadvantaged learners.
  • Although there are financial barriers to accessing grammar schools, including the cost of uniforms and school trips, grammar school heads claimed to deploy a variety of support strategies.

Overall

The preceding analysis is complex and difficult to synthesise into a few key messages, but here is my best effort.

The national figures show that, taken as a whole, the 163 grammar schools contain extremely low proportions of FSM-eligible and ‘ever 6’ learners.

National FSM rates across all grammar schools have fallen significantly over the past 20 years and, although the FSM gap between selective schools and all schools has narrowed a little, it is still very pronounced.

There is certainly a strong case for concerted action to reduce significantly the size of this gap and to strive towards parity.

The disparity is no doubt partly attributable to lower rates of high attainment at KS2 amongst disadvantaged learners, but high attaining disadvantaged learners are themselves significantly under-represented. This is particularly true of wholly selective authorities but also applies nationally.

Although the sample is small, the evidence suggests that grammar school and primary head teachers share the perception that disadvantaged learners are further disadvantaged by the selective admissions process.

However, the cadre of grammar schools is a very broad church. The schools are very different and operate in markedly different contexts. Some are super-selective while others are less selective than some comprehensive schools.

A handful have relatively high levels of FSM and ‘ever-6’ admissions but a significant minority have almost negligible numbers of disadvantaged learners. Although contextual factors influence FSM and ‘ever 6’ rates significantly, there are still marked disparities which cannot be explained by such factors.

Each school faces a slightly different challenge.

Transparency and public understanding would be considerably improved by the publication of statistical information showing how grammar schools differ when assessed against a set of key indicators – and identifying clear improvement targets for each school. 

There seem to me to be strong grounds for incorporating schools’ performance against such targets into Ofsted’s inspection regime.

.

Progress Towards Reform

.

The Sutton Trust Research

Although the Grammar School Heads’ Association (GSHA) argues that it has pursued reform internally for some years, a much wider-ranging initiative has developed over the last twelve months, kicked off by the publication of a tranche of research by the Sutton Trust in November 2013.

This included the two publications, by Cribb et al and NATCEN cited above, plus a third piece by Jesson.

There was also an overarching summary report ‘Poor Grammar: Entry into Grammar Schools for disadvantaged pupils in England’.

This made six recommendations which, taken together, cover the full spectrum of action required to strengthen the schools’ capacity to admit more disadvantaged learners:

  • Review selection tests to ensure they are not a barrier to the admission of learners from disadvantaged backgrounds. The text remarks that:

‘Some grammar schools and local authorities are already trying to develop tests which are regularly changed, less susceptible to coaching, intelligence-based and not culturally biased.’

  • Reduce the advantage obtained by those who can pay for private tuition by making available a minimum of ten hours of test preparation to all applicants on a free or subsidised basis.
  • Improve grammar school outreach support, targeting learners from low and middle income backgrounds. This should include: assurances on access to transport and support with other costs; active encouragement for suitable Pupil Premium recipients to apply; using the media to dispel notions that grammar schools are exclusive and elitist; and deploying existing disadvantaged students as ambassadors.
  • Use the flexibility within the Admissions Code (at this point available only to academies) to prioritise the admission of high-achieving students who are entitled to the pupil premium. There is also a suggestion that schools might:

‘…consider giving preference to students from low or middle income households who reach a minimum threshold in the admission test’.

though it is not clear how this would comply with the Code.

  • Develop primary-grammar school partnerships to provide transition support for disadvantaged students, enabling primary schools to provide stronger encouragement for applications and reassure parents.
  • Develop partnerships with non-selective secondary schools:

‘…to ensure that high achieving students from low and middle income backgrounds have access to good local teachers in their areas.’

The Sutton Trust also made its own commitment to:

‘…look at ways that we can support innovation in improved testing, test preparation, outreach, admissions and collaboration.

We will also commission independent analysis of the impact of any such programmes to create an evidence base to enhance fair access to grammar schools.’

.

Reaction

Immediate reaction was predictably polarised. The GSHA was unhappy with the presentation of the report.

Its November 2013 Newsletter grumbles:

‘It is the way in which the research is presented by the Sutton Trust rather than any of research findings that give rise to concerns. Through a process of statistical machination the press release chose to lead on the claim that 6% of prep school pupils provide four times more grammar school pupils than the 16% of FSM eligible children. Inevitably, this led to headlines that the independent sector dominates admissions. The reality, of course is that 88% of all grammar school students come from state primary schools….

….Grammars select on ability and only 10% of FSM children reach level 5 at KS2 compared with a national average of 25%. The report, quite reasonably, uses level 5 as the indicator of grammar school potential. On the basis of this data the proportions of eligible FSM children in grammar schools is significantly greater than the overall FSM proportion in the top 500 comprehensives….

In 2012 just over 500 FSM children entered grammar schools. For the success rate of L5 FSM to match that of other L5 would require 200 more FSM children a year to enter grammar schools. Just one more in each school would virtually close the gap….

….The recommendations of the report are not, as claimed, either new or radical. All are areas that had already been identified by GSHA as options to aid access and represent practices that are already adopted by schools. This work, however, is usually carefully presented to avoid promotion of a coaching culture.

It is unfortunate that the press briefing both contributed to reinforcing the false stereotyping of grammar schools and failed to signal initiatives taken by grammar schools.’

There is evidence here of retaliatory ‘statistical machination’, together with a rather defensive attitude that may not bode well for the future.

On the other hand HMCI Wilshaw was characteristically forthright in the expression of an almost diametrically opposite opinion.

In December 2013 he is reported to have said:

‘Grammar schools are stuffed full of middle-class kids. A tiny percentage are on free school meals: 3%. That is a nonsense.

Anyone who thinks grammar schools are going to increase social mobility needs to look at those figures. I don’t think they work. The fact of the matter is that there will be calls for a return to the grammar school system. Well, look what is happening at the moment. Northern Ireland has a selective system and they did worse than us in the [international comparison] table. The grammar schools might do well with 10% of the school population, but everyone else does really badly. What we have to do is make sure all schools do well in the areas in which they are located.’

 .

The Laws Speech

Liberal Democrat Education Minister David Laws made clear the Government’s interest in reform with his June 2014 speech, already referenced above.

Early on in the speech he remarks that:

‘The debate about grammar schools seems to have been put in the political deep freeze – with no plans either to increase or reduce the number of what are extremely popular schools in their localities.’

With the benefit of hindsight, this seems rather ignorant of (or else disrespectful to) UKIP, which had nailed their colours to the mast just three weeks previously.

Laws acknowledges the challenge thrown down by Wilshaw, though without attribution:

‘Are you, as some would have it, “stuffed full of middle-class kids”?

Or are you opening up opportunities to all bright children regardless of their background, or can you do more?

Why is entry to grammar schools so often maligned?’

He says he wants to work with them ‘openly and constructively on social mobility’, to ‘consider what greater role they can play in breaking the cycles of disadvantage and closing the opportunity gap’, while accepting that the Government and the primary sector must also play their parts.

He suggests that the Government will do more to increase the supply of high attaining disadvantaged learners:

‘…a key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit. So this is not just a challenge for grammar schools, but for the whole education system…

….My promise to you, alongside my challenge to you, is that this government will do everything in its power to make sure that more children from poorer backgrounds achieve their full potential.’

He lists the policies that:

‘Taken together, and over time…will start to shift the dial for poorer children – so that more and more reach level 5’

leading of course with the pupil premium.

He also proposes aspirational targets, though without any timescale attached:

‘My ambition is that all selective schools should aim for the same proportion of children on free school meals in their schools as in their local area.

This would mean an additional 3,500 free school meal pupils in selective schools every year, or an additional 35,000 pupils over 10 years.’

In relation to the flexibilities in the Admissions Code he adds:

‘I am pleased to be able to say that 32 grammar schools have implemented an admissions priority for pupils eligible for free school meals this year….

We in the Department for Education will fully support any school that chooses to change its admissions criteria in this way – in fact, I want to see all grammar schools give preference to pupil premium pupils over the next few years.’

Similarly, on coaching and testing:

‘…I really welcome the association’s work to encourage a move to entry tests that are less susceptible to coaching, and I am heartened to hear that at least 40% of grammar schools are now moving to the introduction of coaching resistant tests.

Again, I hope that all grammar schools will soon do so, and it will be interesting to see the impact of this.’

And he adds:

‘I want all schools to build on the progress that is being made and seek to close the gap by increasing parental engagement, and stronger working with local primaries – with a focus on identifying potential.’

So he overtly endorses several of the recommendations proposed by the Sutton Trust seven months earlier.

A Sutton Trust press release:

‘…welcomed the commitment by Schools Minister David Laws, to widening access to grammar schools and making the issue a priority in government’.

This may be a little over-optimistic.

A Collaborative Project Takes Shape

Laws also mentions in his speech that:

‘The GSHA will be working with us, the Sutton Trust and the University of Durham to explore ways in which access to grammar schools by highly able deprived children might be improved by looking more closely at the testing process and what may be limiting the engagement of pupils with it.’

The associated release from the Sutton Trust uses the present tense:

‘The Trust is currently working with the King Edward VI Foundation, which runs five grammar schools in Birmingham, Durham University, the Grammar School Heads Association and the Department for Education to target and evaluate the most effective strategies to broaden access to grammar schools.

A range of initiatives being run by the Foundation, including test familiarisation sessions at community locations, visits from primary schools and support for numeracy and literacy teaching for gifted and talented children at local primary schools, will be evaluated by Durham University to understand and compare their impact. The resulting analysis will provide a template for other grammar schools to work with.’

We know that Laws had been discussing these issues with the grammar schools for some time.

When he appeared before the Education Select Committee in February 2014 he said:

‘We are trying, for example, to talk to grammar schools about giving young people fairer access opportunities into those schools. We are trying to allow them to use the pupil premium as a factor in their admissions policy. We are trying to encourage them to ensure that testing is fairer to young people and is not just coachable.’

The repetition of ‘trying’ might suggest some reluctance on the part of grammar school representatives to engage on these issues.

Yet press coverage suggested the discussions were ongoing. In May the GSHA Newsletter states that it had first met Laws to discuss admissions some eighteen months previously, so perhaps as early as November 2012.

It adds:

‘We are currently working on a research project with the DfE and the Sutton Trust to try to find out what practices help to reduce barriers to access for those parents and students from deprived backgrounds.’

A parallel report in another paper comments:

‘The grammar school heads have also gone into partnership with the education charity the Sutton Trust to support more able children from middle and lower income backgrounds applying to selective schools.

Other ideas being considered include putting on test familiarisation sessions for disadvantaged children – something they have missed out on in the past.’

While an entry on CEM’s website says:

‘Access Grammar:

This project seeks to look at ways access to grammar schools for highly able children from non-privileged backgrounds can be improved. The project will identify potential target cohorts in the study areas for a range of outreach interventions and will look to evaluate these activities. For this project, the CEM Research and Evaluation team are working in collaboration with the Sutton Trust, Grammar School Heads Association, King Edwards Foundation and the Department for Education.

Start date: January 2014
End date: January 2017.’

So we know that there is a five-way partnership engaged on a three-year project. The various statements describing the project’s objectives are all slightly different, although there is a clear resemblance between them, the aims articulated by Laws and the recommendations set out by the Sutton Trust.

But I searched in vain for any more detailed specification, including key milestones, funding and intended outcomes. It is not clear whether the taxpayer is contributing through DfE funding, or whether the Sutton Trust and/or other partners are meeting the cost.

Given that we are almost a year into the programme, there is a strong case for this material to be made public.

.

Progress on Admissions Criteria

Of the issues mentioned in the Sutton Trust’s recommendations – tests and test preparation, admissions flexibility, outreach and partnership with primary and non-selective secondary schools – those at the front of the list have been most prominent (though there is also evidence that the King Edward’s Foundation is pursuing reform across a wider front).

The GSHA’s May 2014 newsletter is less grumpy than its predecessor, but still strikes a rather defensive note.

It uses a now familiar statistic, but in a slightly different fashion:

‘The actual number of students with Level 5s in their SATs who either choose not to apply to a grammar school or who apply but do not receive a place is reckoned by GSHA and the DfE to be two hundred students a year; not the very large number that the percentages originally suggested.’

This is the third time we have encountered this particular assertion, but each time it has been articulated differently. Which of the three statements is correct?

The GSHA is also keen to emphasise that progress is being made independently through its own good offices. On admissions reform, the article says:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

The GSHA was also quoted in the TES, to the effect that 30 grammar schools had already been given permission by the DfE to change their admissions policies and would do so with effect from September 2015, while a further five or six had already introduced the reform.

A November 2014 PQ reply updates the figures above, saying that 32 grammar schools have already prioritised disadvantaged learners in their admissions arrangements and a further 65 ‘intend to consult on doing so’.

That leaves 66 (40%) which are not giving this active consideration.
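The arithmetic behind this tally can be sketched as a quick check (an illustrative calculation, assuming the figure of 163 grammar schools used earlier in this post):

```python
# Illustrative tally of grammar schools' positions on an FSM admissions
# priority, per the November 2014 PQ reply (163 schools assumed in total).
total_schools = 163
already_prioritising = 32   # have adopted an FSM/pupil premium priority
intending_to_consult = 65   # 'intend to consult on doing so'

remainder = total_schools - already_prioritising - intending_to_consult
share = round(remainder / total_schools * 100)
print(f"{remainder} schools ({share}%) are not giving this active consideration")
# → 66 schools (40%) are not giving this active consideration
```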

The Chief Executive of the GSHA commented:

‘“You won’t notice a dramatic change in schools themselves because the numbers are quite small…This is reaching out at the margins in a way that won’t deprive other people of a place. The real need is to raise the standard among free school meals pupils at Key Stage 1 and Key Stage 2, that’s the key issue.

“What we are looking at in the meantime is what we can do to help these free school meals pupils who want to come to grammar school.”

Mr Sindall said that many of the country’s 164 grammar schools would not change their policies because competition for places was less fierce and it would be unnecessary. Many schools were also increasing outreach programmes and some were running eleven-plus familiarisation sessions to help prepare poorer children for the test, he added.’

There is evidence here of a desire to play down the impact of such changes, to suggest that the supply of disadvantaged high achievers is too small to do otherwise.

The data analysis above suggests that almost all selective schools need to address the issue.

Between them, the various press reports mention admissions changes at several schools, including Rugby High, South Wilts, ‘a series of Buckinghamshire grammars including Sir William Borlase’s, Dr Challoner’s and Aylesbury Grammar’, as well as the King Edward’s Foundation Schools in Birmingham.

I checked how these changes have been embodied in some of these schools’ admissions policies.

The reports indicated that Rugby was:

‘…going even further by reserving a fixed number of places for FSM-eligible children, so potentially accepting pupils with lower entrance exam scores than other applicants.’

Rugby’s admissions arrangements for 2015 do indeed include as a second overall admissions priority, immediately following children in care:

‘Up to 10 places for children living within the priority circle for children in receipt of Free School Meals whose scores are between one and ten marks below the qualifying score for entry to the school.’

South Wilts included FSM as an oversubscription criterion in its 2014 admission arrangements, replacing it with pupil premium eligibility in 2015. However, in both cases it is placed third after children in care and those living in the school’s designated [catchment] area.

Sir William Borlase’s goes one better, in that its 2015 admissions policy places children eligible for free school meals immediately after ‘children in care’ and before ‘children living in the catchment area of the school’, though again only in the oversubscription criteria.

The King Edward’s Foundation is pursuing a similar route to Rugby’s. It announced its intention to reform admissions to its five Birmingham grammar schools in April 2014:

‘The Government wishes to improve the social mobility of children in the UK and has urged selective schools to consider how their admission policies could be changed to achieve this. The King Edward VI Grammar Schools have applied to the Department for Education which can allow them to give preference in their policies, to children who are on free school meals, or have been at any point in the last six years…

… In addition the grammar schools will be offering familiarisation sessions which will introduce children from less privileged backgrounds to the idea of attending a grammar school and will encourage them to take the 11+.

All of the Grammar Schools have set themselves a target of a 20% intake of children on free school meals (Aston has already achieved this and has a target of 25%). The expansion of the grammar schools which was announced earlier this year means that these additional children will simply fill the additional space.’

According to the 2013 Performance Tables, the FSM rates at each of these schools in January 2013 were:

  • Aston – 12.9%
  • Camp Hill Boys – 3.6%
  • Camp Hill Girls – 5.3%
  • Five Ways – 2.6%
  • Handsworth Girls – 6.3%

There must have been a major improvement at Aston for the September 2013 admissions round. As for the other four schools, they would need to increase their FSM admissions by a factor of roughly three to eight to reach this target.

I wonder whether the targets are actually for ‘ever 6’ admissions?
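As a rough, illustrative check of that arithmetic (and assuming the 20% target is directly comparable with the January 2013 whole-school FSM rates, which is not certain, since the target applies to the intake):

```python
# FSM rates from the 2013 Performance Tables, and the factor by which each
# school would need to multiply its rate to reach the 20% target
# (Aston's separate 25% target is not modelled here).
fsm_rates_2013 = {
    "Aston": 12.9,
    "Camp Hill Boys": 3.6,
    "Camp Hill Girls": 5.3,
    "Five Ways": 2.6,
    "Handsworth Girls": 6.3,
}
target = 20.0  # per cent

for school, rate in fsm_rates_2013.items():
    print(f"{school}: {rate}% -> x{target / rate:.1f} to reach {target}%")
```

On these figures, the required factors for the four schools other than Aston range from about 3.2 (Handsworth Girls) to about 7.7 (Five Ways).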

In the event, the Foundation’s applications encountered some difficulties. In July the Admissions Adjudicator was obliged to reject them.

A parent had objected on the grounds that:

‘…it is necessary to request financial information from parents to achieve this priority which is contrary to paragraph 1.9(f) of the School Admissions Code.

… The objector further feels that it is unclear, unfair and unreasonable to use the pupil premium to differentiate between applications when the school is oversubscribed.’

The Adjudicator found in favour of the parent on the technical grounds that, although the schools had applied for variations of their funding agreements to permit this change, they had only done so retrospectively.

However, in each case:

‘The school is now entitled to give priority to girls eligible for the pupil premium as the funding agreement has been amended.’

By August the Foundation was able to state that the issue had been resolved:

‘Children applying for a place at any of the King Edward VI Grammar Schools must now achieve a minimum “qualifying score” in the test to be eligible for entry.

Any Looked After Child or previously Looked After Child (a child who is or has been in the care of the Local Authority) who achieves the “qualifying score” will be given priority for admission for up to 20% of the available places (25% at Aston).

Children eligible for Pupil Premium (those who have been registered for Free School meals at any point in the 6 years prior to the closing date for registration, 11 July 2014) who achieve the “qualifying score” will also be given priority for admission.

After this allocation, children not eligible for the Pupil Premium but who achieve the “qualifying score” will be admitted by rank order of scores until all places are filled.’

The Foundation has published an interesting FAQ on the new arrangements:

‘Q5. Will this mean that if you are poor you won’t have to score as high in the 11+ admission tests?
A. That is essentially correct – up to 20% of places (25% at Aston) are set aside for pupil premium children who achieve “a qualifying score”. This qualifying score will be set before the test in September after we have reviewed data in order to ensure that children who achieve the score can flourish in our schools.

Q6. Why don’t you want the cleverest children at your school anymore?
A. We want our schools to represent the City of Birmingham and the diverse backgrounds that our children might come from. We believe that there are clever children out there who just don’t have the same opportunity to succeed as those from more privileged backgrounds and we want to try to do something about that.’

It acknowledges the magnitude of the challenge ahead:

‘John Collins, Secretary to the Governors of the charity The Schools of King Edward VI in Birmingham said “This is a hugely challenging target which we do not expect to achieve in the first few years of the initiative, as currently there are relatively few free school meal pupils who apply to take the test. These low numbers are something we are trying to address with our “familiarisation” programme which seeks to encourage bright children from less privileged backgrounds to take the test.”’

Also in July the Government opened up the same possibility for grammar schools that are not academies by consulting on amendments to the Admissions Code to permit this.

In October this was confirmed in the Government’s response to the consultation which stressed it was being introduced as an option rather than a universal requirement.

.

Progress on 11+ Test Reform

The new-style 11-plus tests developed by CEM have not had a universally positive reception. Much of the attention has been focused on their adoption by Buckinghamshire grammar schools.

The GSHA’s May 2014 newsletter notes that ‘some schools in the Midlands’ have been using CEM tests for five years. From 2015, 40% of grammar schools will be using these tests, which are:

‘…designed to be immune to the influence of coaching’

adding:

‘The analysis of data from Buckinghamshire (a wholly selective area which has recently switched to the CEM Centre tests) will provide us in time with valuable hard data on the large scale impact of the change over time.’

Back in February 2014 an Observer article had already cited positive feedback from Buckinghamshire:

‘Last autumn, a handful of education authorities in England introduced an exam designed to test a wider range of abilities – ones that are already being taught in primary schools, rather than skills that can be mastered through home tutoring – to make the selection system fairer.

Provisional results indicate that a more diverse selection of pupils passed this test, and headteachers say they feel the change has made a difference.

Ros Rochefort, headteacher at Bledlow Ridge primary school in Buckinghamshire…said that this year, for the first time in her career, the test has delivered a fair result. “All the kids who got through were expected to pass and, as usual, there are a couple of appeals coming through. All our very able children were selected….

…. Philip Wayne, headteacher at Chesham grammar school and chairman of the Bucks Grammar School Heads Association, has welcomed the changes and says he is “very confident” that the new test will avoid the current situation, in which many pupils who won places at his school with the help of intensive tutoring struggle to keep up with lessons once they arrive.’

However, there were contemporary reports that the 2013 tests led to a 6% fall (110 fewer pupils) in the proportion of places awarded to children from in-county state primary schools, even though 300 more pupils applied.

In September this was further developed in a Guardian story:

‘According to the data, a child from a Buckinghamshire private school is now more than three and a half times more likely to pass the 11-plus than a child from one of its state primaries….

…FOI requests to the eight secondary schools in Wycombe, which includes some of the most deprived and diverse wards in the county, suggest that children on free school meals and of Pakistani heritage have been less successful this year.'

A local pressure group, Local Equal and Excellent, has been trying to gather and analyse the data from the initial rounds of testing in 2013 and 2014 (ie for admission in 2014 and 2015).

Their most recent analysis complains about refusals to publish the full test data and contains an analysis based on the limited material that has been released.

In November 2014, the matter was discussed at Buckinghamshire’s Education, Skills and Children’s Services Select Committee.

The ‘results and analysis’ paper prepared by Buckinghamshire’s grammar school headteachers contains many words and far too few numbers.

The section on ‘Closing the gap’ says:

‘One local group has claimed that children from poorer backgrounds and BME have ‘done worse’ in the new Secondary Transfer Test. It is not specified what ‘worse’ means; however it is not reliable to make statements about trends and patterns for specific groups from a single year’s data and as stated above the data that has been used to make such claims is a small subset of the total and unrepresentative. To substantiate such claims a detailed analysis of additional information such as the current attainment of the children concerned would be needed. We are currently considering how a longitudinal study might be achieved.’

This is overly defensive and insufficiently transparent.

There is some disagreement about whether or not the new test is less amenable to coaching.

The ‘results and analysis’ paper says:

‘There is no such thing as a ‘tutor proof’ test. However, the new tests are less susceptible to the impact of specific test tutoring because they are aligned to the National Curriculum which all children study. Additionally, the questions in the new test are less predictable than in the previous test because they cover a wider range of topics and there is a broader range of question types – points acknowledged and welcomed by primary headteachers’.

Conversely, the pressure group says:

‘The new 11-plus, devised by the Centre for Evaluation and Monitoring (CEM) at Durham University, is supposed to rely less heavily on verbal reasoning and be more closely allied to the primary curriculum. Practice papers for the CEM test are supposed to be less readily available…

But… the fact that it is modelled on what can be taught in schools means the CEM test is more amenable to coaching… if children can’t be taught to get better in maths, why are we teaching it in schools? Practice will make anyone better and I see no sign that tuition has tailed off at all.’

Elsewhere there is evidence that 11+ testing is not immune to financial pressures. North Yorkshire is presently consulting on a plan to scale back from a familiarisation test and two sets of two full tests, with the best results taken forward.

Instead there would be a single set of tests taken by all candidates on the same day at a single venue, plus sample booklets in place of the familiarisation test. A system of reviews, enabling parents to provide supporting evidence to explain under-performance, would also be discontinued.

The reason is explicit:

‘The cost of administering an overly bureaucratic system of testing is no longer sustainable in the light of very significant cuts in public expenditure.’

Even though the draft impact assessment says that the Council will consider applications for support with transport from rural areas and for those with low incomes, there is some unacknowledged risk that the new arrangements will be detrimental to efforts to increase the proportion of disadvantaged learners admitted to these schools.

.

How Best to Close Excellence Gaps

.

What to do with the status quo

The next Government will inherit:

  • The Access Grammar reform project, outlined above, which is making some progress in the right direction, but needs closer scrutiny and probably more central direction. There is an obvious tension between Laws’ aspiration that all grammar schools should ‘give preference to pupil premium pupils over the next few years’ and the GSHA position, which is that many schools do not need to change their policies. It will be important that the changes to admissions arrangements for the 163 schools are catalogued and their impact on admissions monitored and made public, so that we can see at a glance which schools are leading the pack and which are laggards. A published progress report against the Sutton Trust’s six recommendations would help to establish future priorities. Greater transparency about the project itself is also highly desirable.
  • A small cadre of selective 16-19 free schools. It will need to articulate its position on academic selection at 16+ and might need to take action to ensure a level playing field with existing sixth form colleges. It might consider raising expectations of both new and existing institutions in respect of the admission of disadvantaged learners, so securing consistency between 11+ selection and 16+ selection.
  • Flexibility within the Admissions Code for all grammar schools – academies and LA-maintained alike – to prioritise the admission of disadvantaged learners. It may need to consider whether it should move further towards compulsion in respect of grammar schools, particularly if the GSHA maintains its position that many do not need to broaden their intake in this fashion.
  • Flexibility for all grammar schools to increase Planned Admission Numbers and, potentially, to submit proposals for the establishment of Satellite institutions. The approval of such proposals rests with the local authority in the case of a maintained school but with the Secretary of State for Education in respect of academies. An incoming government may need to consider what limits and conditions should be imposed on such expansion, including requirements relating to the admission of disadvantaged learners.

It may be helpful to clarify the position on satellites. The Coalition Government has confirmed that they can be established:

‘It is possible for an existing maintained grammar school or academy with selective arrangements to expand the number of places they offer, including by extending on to another site…There are, however, limitations on that sort of expansion, meaning it could only be a continuation of the existing school. The school admissions code is written from a presumption that those schools with a split site are a single school’ (Hansard, 16 February 2012, Col. 184W).

In December 2013, a proposal to establish a grammar school annexe in Sevenoaks, Kent was rejected by the Secretary of State on the grounds that it would constitute a new school:

‘Mr Gove’s legal ruling hinged on the issue of a girls’ grammar school being the sponsor of a Sevenoaks annexe for both girls and boys. The planned entry of Sevenoaks boys to the annexe led Mr Gove to rule that the annexe’s proposed admissions policy was sufficiently different to the sponsor school’s girls-only admissions policy to constitute a wholly new grammar school.’

But a revised proposal was submitted in November 2014 for a girls-only annexe. Moreover, the local authority has committed to exploring whether another satellite could be established in Maidenhead, acknowledging that this would require the co-operation of an existing grammar school.

The timing of the decision on the revised Sevenoaks proposal ensures that selection will remain a live issue as we approach the General Election.

Further options to promote between-school selection

There are several options for strengthening a pro-selection policy further that would not require the removal of statutory constraints on opening new 11-18 grammar schools, or permitting existing schools to change their character to permit selection.

For example:

  • Pursuing the Wilshavian notion of organising schools into geographical clusters, some with academic and others with vocational specialisms, and enabling learners to switch between them at 14+. In many areas these clusters will incorporate at least one grammar school; in others the ‘academic’ role would be undertaken by high-performing comprehensive schools with strong sixth forms. The practical difficulties associated with implementing this strategy ought not to be underplayed, however. For example, how much spare capacity would the system need to carry in order to respond to annual fluctuations in demand? How likely is it that students would wish to leave their grammar schools at 14 and what tests would incomers be expected to pass? Would the system also be able to accommodate those who still wished to change institution at age 16?
  • Vigorously expanding the cadre of post-16 selective free schools. There is presumably a largely unspent budget for up to twelve 16-19 maths free schools, though it will be vulnerable to cuts. It would be relatively straightforward to develop more, extending into other curricular specialisms and removing the obligatory university sponsorship requirement. Expansion could be focused on clones of the London Academy of Excellence and the Harris Westminster Sixth Form. But there should be standard minimum requirements for the admission of disadvantaged learners. A national network might be created which could help to drive improvements in neighbouring primary and secondary schools.
  • Permitting successful selective post-16 institutions to admit high-attaining disadvantaged students at age 14, to an academic pathway, as a parallel initiative to that which enables successful colleges to take in 14 year-olds wishing to study vocational qualifications. It may be that the existing scheme already permits this, since the curriculum requirements do not seem to specify a vocational pathway.

UKIP’s policy, as presently articulated, is merely enabling: few existing schools are likely to want to change their character in this fashion.

One assumes that Tory advocates would be satisfied with legislation permitting the establishment of new free schools that select at age 11 or age 14. It seems unlikely that anyone will push for the nuclear option of ‘a grammar school in every town’… but Conservative Voice will imminently reveal their hand.

.

Further options to promote within-school selection

If the political preference is to pursue within-school provision as an alternative to between-school selection there are also several possibilities including:

  • Encouraging the development of more bilateral schools with parallel grammar and non-selective streams and/or fast-track grammar streams within standard comprehensive schools.
  • Requiring, incentivising or promoting more setting in secondary schools, potentially prioritising the core subjects.
  • Developing a wider understanding of more radical and innovative grouping practices, such as vertical and cluster grouping, and trialling the impact of these through the EEF.

It would of course be important to design such interventions to benefit all students, but especially disadvantaged high attainers.

The Government might achieve the necessary leverage through a ‘presumption’ built into Ofsted’s inspection guidance (schools are presumed to favour the specified approach unless they can demonstrate that an alternative leads consistently to higher pupil outcomes) or through a ‘flexible framework’ quality standard.

.

A national student support scheme

The most efficient method of supporting attainment and social mobility amongst disadvantaged high attainers is through a national scheme that helps them directly, rather than targeting the schools and colleges that they attend.

This need not be a structured national programme, centrally delivered by a single provider. It could operate within a framework that brings greater coherence to the existing market and actively promotes the introduction of new suppliers to fill gaps in coverage and/or compete on quality. A ‘managed market’, if you will.

The essential elements would include:

  • This supply-side framework, covering the full range of disadvantaged students’ learning and development needs, within which all suppliers – universities, third sector, commercial, schools-based – would position their services (or they would be excluded from the scheme).
  • A commitment on the part of all state-funded schools and colleges to implement the scheme with their disadvantaged high attainers (the qualifying criterion might be FSM or ‘ever 6’) – and to ensure continuity and progression when and if these students change institution, especially at 16+.
  • A coherent learning and development programme for each eligible student throughout Years 7-13. Provision in KS3 might be open access and light touch, designed principally to identify those willing and able to pursue the programme into KS4 and KS5. Provision in these latter stages would be tailored to individuals’ needs and continuation would be dependent on progress against challenging but realistic personal targets, including specified GCSE grades.
  • Schools and colleges would act as facilitators and guides, conducting periodic reviews of students’ needs; helping them to identify suitable services from the framework; ensuring that their overall learning programmes – the in-school/college provision together with the services secured from the framework – constitute a coherent learning experience; helping them to maintain learning profiles detailing their progress and achievement.
  • Each learner would have a personal budget to meet costs attached to delivering their learning programme, especially costs attached to services provided through the framework. This would be paid through an endowment fund, refreshed by an annual £50m topslice from the pupil premium budget (analogous to that for literacy and numeracy catch-up) and a matching topslice from universities’ outreach budgets for fair access.
  • Universities would be strongly encouraged to make unconditional offers on the basis of high quality learning profiles, submitted by students as part of their admissions process.
  • There would be annual national targets for improving the GCSE and A level attainment of students participating in the scheme and for admission to – and graduation from – selective universities. This would include challenging but realistic targets for improving FSM admission to Oxbridge.

.

Conclusion

The current political debate is overly fixated on aspects of the wider problem, rather than considering the issue in the round.

I have set out above the far wider range of options that should be under consideration. These are not necessarily mutually exclusive.

If I were advising any political party inclined to take this seriously, I would recommend four essential components:

  • An enhanced strategy to ensure that all existing selective schools (including 16+ institutions) take in a larger proportion of high-attaining disadvantaged learners. Approval for expansion and any new schools would be conditional on meeting specified fair access targets.
  • Development of the cadre of 163 grammar schools into a national network, with direct responsibility for leading national efforts to increase the supply of high-attaining disadvantaged learners emerging from primary schools. Selective independent schools might also join the network, to fill gaps in the coverage and fulfil partnership expectations.
  • A policy to promote in all schools effective and innovative approaches to pupil grouping, enabling them to identify the circumstances in which different methods might work optimally and how best to implement those methods to achieve success. Schools would be encouraged to develop, trial and evaluate novel and hybrid approaches, so as to broaden the range of potential methods available.
  • A national support scheme for disadvantaged high attainers aged 11-19 meeting the broad specification set out above.

Regrettably, I fear that party political points-scoring will stand in the way of a rational solution.

Grammar schools have acquired a curious symbolic value, almost entirely independent of their true purpose and largely unaffected by the evidence base.

They are much like a flag of convenience that any politician anxious to show off his right-wing credentials can wave provocatively in the face of his opponents. There is an equivalent flag for abolitionists. Anyone who proposes an alternative position is typically ignored.

.

GP

November 2014

The Politics of Setting

I had been intending never to revisit the difficult topic of setting, secure in the knowledge that I could not improve on my earlier treatment of the pros and cons.

P1010978

Irrelevant picture of Norway by Gifted Phoenix

But recent developments have caused me to reconsider, and have led me to address the issue from a different perspective.

My previous post attempted an objective and balanced statement of the educational arguments for and against, drawing on the research evidence and taking account of all learners, regardless of their attainment.

This one explores how setting – just one option within the far wider range of so-called ‘ability grouping’ strategies – has been reflected in government policy and party political policy documents since 1997, culminating in the position we have reached as the parties begin to firm up their 2015 General Election manifestos.

The post begins with brief notes on terminology and incidence.

The substantive text is organised into three sections:

  • How Labour government positions on setting developed and fluctuated between 1997 and 2010.
  • How the Conservative party turned to setting while in opposition.
  • How Coalition Government policy on setting has shifted since May 2010.

It concludes with a summary of the position we have reached as we approach the next Election, together with some ideas for how we might move forwards more constructively.

In case you prefer to read selectively I have included links to the relevant section from each of the bullet points above.

 

Terminology

I take setting to mean grouping learners in a discrete class, ostensibly selected with reference to prior attainment in a specific subject.

It is distinct from streaming, where the selection – which may well be generic and ability-based – applies to teaching across a range of different subjects. The learners in a higher stream may not be higher attainers in each of these subjects.

One sometimes also encounters references to banding, which is broadly synonymous with streaming, except that streaming tends to refer to a single class, while bands may include more than one class. It may therefore be a less differentiated form of streaming.

Both setting and streaming are within-school selection strategies, which may be adopted by selective or comprehensive schools. They may be perceived as viable alternatives to between-school selection which is no longer regarded as politically feasible by Labour, Conservatives or Liberal Democrats.

There is, however, continuing pressure from the right of the Conservative party and recently from UKIP for the restoration of grammar schools. The Coalition government has opened up the prospect of satellite establishments and overseen the introduction of several selective post-16 institutions. This might be viewed as the thin end of the wedge.

It has not always been possible to isolate the approach to setting since there is often a tendency to brigade it with streaming and/or a wider range of grouping strategies, occasionally including various approaches to within class grouping.

Sometimes these distinctions are clear and sometimes they are less so. To take a prominent example, the relevant entry in the Education Endowment Foundation’s Teaching and Learning Toolkit is not a model of clarity.

Called ‘Setting or Streaming’, it discusses effectiveness initially in terms of ‘ability grouping’ (first paragraph).

Clarity is not improved by the inclusion of the American terms for streaming (tracking) and setting (regrouping).

Nor is it clear whether ‘ability grouping’ is intended as a synonym for ‘setting or streaming’ or whether it has a broader scope.

The second paragraph reverts to ‘setting or streaming’ before discussing a wider range of interventions targeted at gifted and talented learners including several accelerative measures. One of these – promotion – is not necessarily a grouping strategy, at least as I understand the term.

The next three paragraphs relate to low attainers. The third focuses on ‘ability grouping’, although there is one reference to ‘setting or streaming’; the fourth discusses both ‘setting’ and ‘ability grouping’, while the fifth mentions only ‘ability grouping’.

This terminological imprecision is confusing and unhelpful, especially when it appears in a text that purports to present the available research evidence clearly and unambiguously.

 

How prevalent is setting?

There are few recent and reliable statistics available on the incidence of setting.

Statistics deposited in the Commons Library in March 2012 (Dep 2012-0434) provide Ofsted data on the percentage of lessons observed in secondary schools that were either setted or streamed/banded for every year from 1996/97 to 2002/03, excluding PE.

In 2002/03, 40% of all secondary lessons observed were setted and 4% were streamed or banded.

From 2003/04 to 2010/11, the table provides percentages of lessons observed that were setted, streamed or banded, for ‘lower’, ‘average’ and ‘upper ability’ learners respectively.

In 2010/11, the average percentages across all year groups were 12% for average ability, 16% for lower ability and 17% for higher ability.

The reply to a PQ from July 2011 provides 2009/10 data, for maths, English and science in primary and secondary schools respectively. The percentages relate to ‘classes setted, streamed or banded by ability where pupils are placed within an ability range within the school’.

The average figures across all year groups are set out below. For primary schools I have included Year 6 percentages in brackets:

  • Maths primary 19% (34%)
  • English primary 11% (19%)
  • Science primary 2% (3%)
  • Maths secondary 71%
  • English secondary 58%
  • Science secondary 62%

A 2014 study of primary practice found that:

Approximately 17% of the pupils studied, who were born in 2000-2001, were in ability streams. Some 8% of the total group were in the top stream, 5% in the middle and 4% in the bottom stream.

Last year Ofsted estimated that, excluding PE, some 45% of secondary lessons were set or streamed. The TES story containing these figures notes:

‘The Department for Education was unable to produce statistics on how many students are set or streamed. Ofsted produced limited data based on lessons it had inspected… but stressed that “there is no way of using this data to draw out national conclusions in any way”….

…In comments accompanying Ofsted’s figures, Sir Michael noted that, since 2005, its inspections have not involved observing all teachers in a school. Lessons that were seen were not “necessarily representative” of the school or system as a whole, he said.

….”It is not possible to deduce from inspection data the proportions of pupils nationally who are taught in setted/streamed classes or in mixed-ability groups,” the chief inspector said.’

We can only conclude that a significant proportion of secondary students and older primary learners is setted and that this practice is most prevalent in the core subjects. It is unclear whether these percentages are now increasing, stable or declining.

It would be highly desirable to obtain more accurate figures through the School Census, if only to track the influence of the presentation of the evidence base in the Toolkit.

 

Part One: The evolution of Labour government policy from 1997 to 2010

 

First Labour Government

In 1997 the incoming Labour Government published its White Paper ‘Excellence in Schools’. The chapter on ‘Modernising the comprehensive principle’ said:

Mixed ability grouping… requires excellent teaching and in some schools has worked well. But in too many cases it has failed both to stretch the brightest and to respond to the needs of those who have fallen behind. Setting, particularly in science, maths and languages, is proving effective in many schools. We do not believe that any single model of grouping pupils should be imposed on secondary schools, but unless a school can demonstrate that it is getting better than expected results through a different approach, we do make the presumption that setting should be the norm in secondary schools. In some cases, it is worth considering in primary schools. Schools should make clear in reports to parents the use they are making of different grouping approaches. OFSTED inspections will also report on this.

The clear implication is that, where the quality of teaching is not excellent, setting is likely to prove relatively more effective than ‘mixed ability grouping’, particularly in science, maths and languages.

Setting will not be made compulsory in secondary schools, but there is a presumption that it should be ‘the norm’, presumably in all subjects but certainly in science, maths and languages, unless schools can show ‘better than expected results’ through a different approach. In primary schools, setting should be considered in some unspecified cases.

Ofsted will check what schools are doing (and presumably validate or otherwise any claim of ‘better than expected results’, although the precise meaning of this term is not explained).

The text also says that the Department will publish guidance and exemplification of best practice, taken from this country and abroad ‘in organising classes to meet the different abilities of pupils’. There is a list of strategies in which it has particular interest, including:

  • ‘target-grouping, where pupils are grouped by ability for part of the week and groups are altered in line with regular assessment;
  • fast-tracking, where pupils are encouraged to learn and take qualifications ahead of their age cohort.’

Early in 1999, Ofsted published a survey on setting in primary schools. I cannot source the text online, but contemporary reviews, such as this from the LGA, show that it was strongly supportive of the practice:

Setting, rather than streaming, in primary schools provides a powerful lever for raising standards, so long as it is carefully implemented and properly managed, say Her Majesty’s Inspectors from OFSTED.

A new survey of the practice of setting – grouping children by ability for specific subjects – uses evidence from OFSTED inspection data, from a questionnaire and from focused inspections by HMI. It endorses the government’s view that setting is well worth considering.

‘Where teachers understand its potential and modify their teaching techniques accordingly, setting can be a very successful way of organising teaching groups,’ HMI say in the report Setting in Primary Schools, published today by OFSTED.

They point out that setting does not, by itself, guarantee success in raising standards nor can it compensate for poor teaching. However, evidence from school inspections suggests that the quality of teaching in setted lessons in the three core subjects is slightly better than in lessons with the full ability range.’

This introduces two important themes – that the efficacy of setting is dependent on:

  • it being implemented and managed effectively and
  • the appropriate adaptation of teaching techniques.

In September 2000 a DfEE research report on ‘Innovative Grouping Practices in Secondary Schools’ summarises the advantages and disadvantages of ability grouping more generally, but consists mainly of extended case studies of contemporary innovative practice.

The introduction sets the context thus:

The challenge now is to find ways of grouping pupils and developing pedagogy that capitalises on the advantages and minimises the disadvantages outlined above. In other words, how can schools develop grouping plans to achieve the best attainment outcomes for pupils while minimising any negative impact?

This rather more pragmatic approach reappears in subsequent guidance documents, but was set aside when government policy was articulated.

 

Second Labour Government

The 2001 Green Paper ‘Schools Building on Success’ reverts to a bullish reference to setting in the section on KS3:

We want to see further increases in the extent of setting within subjects including express sets to enable those who are capable of doing so to advance beyond the levels set for their age and to take Key Stage 3 tests early.’

But this does not survive into ‘Schools Achieving Success’, the White Paper published the same year, which makes no reference to setting specifically or ‘ability grouping’ more generally.

A roughly contemporary PQ reply also hedges its bets:

The Government supports a flexible approach to pupil grouping, including setting by ability where appropriate’.

The sentence is vacuous because deliberately imprecise. Essentially it expresses the government’s preference for schools to decide their own approaches.

It seems that there is growing indecision over which line to take. Should the government opt for consistent and wholehearted endorsement, full devolution of responsibility to schools, or a middle path that focuses on developing and disseminating effective practice to meet the needs of different settings?

This is of course redolent of wider contemporary debate about the role of the government in determining education policy and practice.

Setting is not mentioned in the ‘Five Year Strategy for Children and Learners’ which appeared in 2004.

 

Third Labour Government

Setting makes a significant reappearance in the October 2005 White Paper ‘Higher Standards, Better Schools For All’:

‘Grouping students can help to build motivation, social skills and independence; and most importantly can raise standards because pupils are better engaged in their own learning. We have encouraged schools to use setting since 1997. Putting children in different ability groups within a class is commonplace in primary schools. Ofsted reports show that the proportion of Key Stage 3 lessons which are set has risen since 1997 to over a third now, with greater rises in English and maths. The significant majority of English, science and modern foreign language lessons in secondary schools, and about nine in ten maths lessons are already organised by setting.

It will continue to be for schools to decide how and when to group and set by ability. But we will encourage more schools to adopt such grouping and help them to learn from the innovative practices that some schools are already employing without lowering expectations for pupils in lower ability groups or limiting choices in the curriculum. We will publish, in the New Year, independent research into current best practice.’

The first emboldened point implies a consistency that is not fully reflected in the narrative above, in that the encouragement for setting seems to have waned somewhat between 2001 and 2004.

The second emboldened section makes it clear that schools remain free to determine their own approaches. The presumption in favour of setting has gone by the wayside and the government will focus instead on encouragement through the continuing promotion of innovation and best practice.

Shortly afterwards, the research report ‘The Effects of Pupil Grouping: Literature Review’ appeared.

Back in 2010 I summarised its key findings thus:

  • ‘No single form of grouping benefits all pupils and there is little attainment advantage associated with setting – ie no significant difference between setting and mixed ability classes in overall attainment outcomes across all pupils.
  • ‘At the extremes of attainment’ low-achieving pupils show more progress in mixed ability classes and high-achieving pupils show more progress in sets.
  • Lower sets tend to contain a disproportionate number of boys, pupils from those ethnic groups that tend to underachieve and pupils with SEN.
  • There are aspirational and behavioural disadvantages to setting, predominantly amongst lower attainers, and there is a correlation between disaffection and setting, particularly for pupils in the lowest sets.
  • Higher sets are more likely to have experienced and highly-qualified teachers whereas lower sets experience more changes of teacher and are less likely to be taught by a specialist in the subject.’

A contemporaneous TES story argues that the report undermines the government’s position by offering too little support for setting:

‘Setting pupils by ability, one of the most widely-trailed parts of last week’s white paper, has few benefits, a study funded by the Department for Education and Skills has concluded.

There is no evidence that streamed or set classes produce, on average, higher performance than mixed-ability classes, said the report. It also found that setting pupils is already widespread, particularly in maths….

It says the debate between setting and mixed-ability teaching has become polarised and does not reflect what happens in schools where a wide range of ways of grouping pupils is used….

…The review concluded: “There are no significant differences between setting and mixed-ability teaching in overall attainment … but … low-achieving pupils show more progress in mixed-ability classes and high-achieving pupils show more progress in set classes.”’

This provides the spur for a renewed effort to push beyond the polarised debate, to refocus on helping to develop solutions to fit particular needs and circumstances.

In 2006, DfES published ‘Pupil Grouping Strategies and Practices at Key Stage 2 and 3: Case Studies of 24 Schools in England’, a companion piece to the 2005 study.

The impact of grouping on pupil attainment was summarised thus:

  • ‘Schools identified that the use of setting enabled them to tailor teaching for different ability pupils in order to impact on their understanding and achievement. However, the research did not find evidence to corroborate these expected achievement gains.
  • In secondary schools that adopted mixed ability or part mixed ability grouping approaches, the rationale given by teachers and senior managers tended not to make reference to attainment but rather to focus on the benefits in terms of social awareness and inclusivity. 
  • In primary schools, which used mixed ability as the predominant organisational grouping, pupils were often seated around tables on the basis of ability and it was not possible to differentiate attainment outcomes that related directly to setting or mixed ability from these observations.’

So advocates of secondary setting could not demonstrate stronger attainment overall, while advocates of secondary mixed ability teaching were not primarily concerned with the impact on attainment.

In the primary sector it was not possible to distinguish a differential impact on outcomes from either option.

In September of the same year, the National Strategies produced ‘Grouping Pupils for Success’, useful guidance for schools deciding on the most appropriate grouping strategies.

The introduction says that it:

‘…moves on from the old ‘for and against’ debates about grouping to a more sophisticated understanding of what it means to group pupils for success.’

Suggestions relating specifically to setting include:

  • ‘Make a careful match of individual teacher strengths with the nature of sets, for example placing a teacher experienced in challenging low attainers with the lowest set or band, to lift attainment.
  • Avoid ‘teaching to the middle’ in mixed-ability classes.
  • Monitor pupils’ learning to ensure that pupils have opportunities to demonstrate higher attainment, for example in tiered papers in the National Curriculum tests, and that access to the curriculum and resources are not limited by assumptions about ability level.
  • Ensure that teaching in top sets creates a learning atmosphere in which it is acceptable to make mistakes, to ask for clarification or repetition.
  • Develop inclusive teaching approaches, for example through differentiated questioning or the use of within-class groupings.’

It summarises the research on setting and mixed ability grouping respectively in the two tables reproduced below.

 

[Table: summary of research findings on setting]

[Table: summary of research findings on mixed ability grouping]

 

‘Effective Teaching and Learning for Pupils in Low Attaining Groups’ (2007) takes the same line as the previous studies in arguing that:

‘…the polarisation of the grouping debate does not reflect the available evidence….Rather than pointing towards the overwhelming superiority of one form of grouping over another, it suggests that different forms of grouping are effective for different ‘types’ of pupils, in relation to different kinds of outcomes.’

But it continues:

‘The decision, therefore, about whether to group by attainment, either has to be seen as a matter of principle, where empirical evidence is of limited relevance, or else has to be regarded as one that is complex and may even be too close to call.’

Nevertheless, the authors contribute some further empirical evidence to the debate, notably concerning the characteristics of pupils in low attaining sets:

  • ‘The analysis of data on pupils’ allocation to groups confirms prior attainment as the main, albeit a relatively poor predictor of set placement, for example, with over half the pupils with low prior attainment in English ending up in middle or high sets. Although prior attainment remains statistically significant, setting decisions are clearly not made on this basis alone.’
  • ‘Social class is a significant predictor of set placement. Pupils from higher socio-economic status (SES) backgrounds are more likely to be assigned to higher sets and less likely to be assigned to lower sets.’
  • ‘Special Educational Need (SEN) is a significant predictor of set placement (after controlling for social class and prior attainment), with these pupils concentrated in the low attainment sets. Less than 10% of pupils in the highest sets have SEN. This suggests that SEN and low attainment are seen as closely related or overlapping and that set placement may also be confounded by the effect of behaviour.’
  • ‘Ethnicity was a weaker significant predictor of set placement (after controlling for social class and prior attainment), with pupils of Bangladeshi origin being slightly less likely to be selected for the higher sets.’
  • ‘Gender was not a significant predictor of set placement (after controlling for social class and prior attainment), except in Key Stage 2 literacy where, against recent trends, females were more likely to be placed in a low set. Overall, males are slightly over-represented in the low sets and under-represented in the middle sets but this difference was not statistically significant.’
  • ‘Other factors, including teacher assessments, teacher judgements and pupil characteristics such as behaviour, are likely to influence set placement. Some schools allocated pupils with behavioural difficulties to high sets irrespective of prior attainment because they believed that the classroom context provided in these groups would promote positive behaviour. Other schools allocated these pupils to lower sets because they were smaller and provided higher staff ratios.’

Also in 2007, ‘The Children’s Plan’ included a section entitled ‘Good classroom practices – better use of grouping and setting’.

Essentially this replicates the approach taken in the 2005 White Paper, though the drafting is far more convoluted and so far less clear:

‘Improved understanding of each child’s progress should also lead to more effective use of group teaching. Since 1997 we have been encouraging schools to use ‘setting’ (teaching groups of pupils by ability in a particular subject rather than across a range of subjects) and other forms of pupil grouping, and we continue to encourage these practices.

Using setting and groups to teach children of similar abilities and interests can bring real educational benefits. But where it is poorly implemented, for example through ‘streaming’ (where pupils are grouped across a range of subjects based on general rather than subject-specific assessment) it can be socially divisive and detrimental to all but the highest achieving pupils. Grouping can also be used more effectively in the classroom – in particular, through proven approaches to in-class grouping by need, and guided group work when the teacher coaches a small group to apply immediately what they have been learning in the main part of the lesson. We will promote this best practice as standard practice.’

Under this new formulation, there is recognition that there can be effective practice in setting and mixed ability grouping alike.

The final sentence potentially embodies a slightly different approach, by introducing the notion of promoting ‘standard practice’, but no further details are provided about what exactly this will entail.

Then Labour appears to lose interest in setting. A 2008 publication from DCSF ‘Personalised Learning: A Practical Guide’ includes a chapter on ‘Pupil Grouping’ but it says almost nothing about setting. It is as if the authors are keen to move on from what has become a rather sterile debate.

A PQ from March 2009 uses the Children’s Plan formulation:

‘Analysis of research suggests that no single model of pupil grouping will be of benefit to all pupils all of the time. For example, there is some evidence that being taught in a mixed ability class can be beneficial for low attainers, but that ability-based classes can be beneficial for high attainers.

We promote setting — the grouping of pupils according to their ability in a particular subject — as an effective way of ensuring that individual pupils are receiving personalised help appropriate to where they are in their learning. Similarly, we promote effective pupil grouping practices, and guided work, as tools for delivering the most appropriate curriculum to each individual in mixed ability classes.

We do not promote streaming—where pupils are assigned to classes on the basis of an overall assessment of their general ability and pupils remain in their streamed classes across the majority of subjects—as it assumes that children will have the same level of ability in all subjects.’

But, three months later, the 2009 White Paper ‘Your child, your schools, our future’ has nothing to say on the subject, and references to setting are conspicuously absent from the Pupil and Parent Guarantees. Ministers have apparently decided that schools are best left to their own devices.

Setting did not appear in Labour’s 2010 Election Manifesto either.

 

Part 2: The Conservatives in opposition

In January 2006, just at the time when Government research reports were discussing the polarised nature of debate on setting and advocating a more nuanced approach, David Cameron and David Willetts (then Tory education spokesman) both made statements that explicitly supported setting.

One report has Cameron saying:

‘I want no child held back, so my priority is not selection by ability between schools but setting by ability within schools, because every parent knows that a high quality education means engaging children at the right level.’

Another attributes to him the statement:

‘I want the Conservative Party to help me campaign in setting by each subject in every school so that we actually do what I think is common sense which is to help stretch the brightest pupils and help those who are in danger of falling behind…There’s a real case for more selection within schools rather than selection between schools.’

‘The government is getting into a mess over the issues of selection and admissions.’

It seems that Cameron has identified support for setting as a means of distancing himself from calls within his party for the introduction of more selective schools.

Willetts said:

‘What I shall be looking for in the months ahead is how best to spread setting, and I would not rule out using central government more in this area…The evidence that setting works is powerful indeed, and yet you still have more than half of lessons not taught in sets, where you can target your teaching methods to children with a particular level of skill.’

Another report has a slightly different version:

‘We are not saying that an edict will go out from the Department for Education that schools are instructed to set in all circumstances but the empirical evidence is that it works.

I would not rule out ministers getting involved in the way schools organise setting, but our instincts are to cut back rather than add to central bureaucracy and direction.’

One can see writ large the tension between the mutually exclusive desires for prescription and autonomy. Willetts is leaving open the possibility of central direction of some sort.

In the event, it was decided that Ofsted would be the enforcer. The November 2007 Conservative Green Paper ‘Raising the Bar, Closing the Gap’ made this clear:

‘While every pupil must be given the opportunity of a good education, we also recognise that each pupil should be given the opportunity to learn in accordance with their particular aptitude and ability, so that the brightest pupils continue to be stretched at the same time as pupils who might be struggling are given extra support.

We believe that setting by ability is the only solution to achieving this ambition. Labour’s 1997 manifesto acknowledged the importance of setting and implied that the amount of setting in schools would be increased significantly. This has not taken place.

… We believe that school children learn more effectively when taught with children of a similar ability. We also believe setting contributes to better behaviour. We will therefore alter guidance to Ofsted to ensure that schools – particularly those not performing at high levels – set all academic subjects by ability.’

Contemporary press reports remind us that Cameron had originally spoken of ‘a grammar school stream’ in every school, but streaming was set aside in favour of setting:

‘The Tories will now make clear that streaming need only apply in smaller schools with limited timetabling.’

Cameron has decided that setting is ‘the only solution’ and the inspection regime will impose this on all schools (the document is not clear whether primary schools are included). There is no explicit exemption for those ‘performing at high levels’ although they will be a lower priority.

This new position is a restatement of the Labour position of 1997.

Hansard shows that new opposition spokesman Michael Gove maintained an interest in the issue until at least summer 2009.

In May 2008 he requests the latest data on the extent of setting and DCSF’s guidance on the issue. Ofsted answers the first point while the reply directs him to the materials referenced above.

In July 2009 he again asks for updated data on the incidence of setting.

But the enforcement of setting through Ofsted was clearly set aside when the time came to consider the content of the 2010 Tory Election Manifesto.

For the time being at least, it seemed that the alternative attractions of full autonomy for schools had triumphed.

 

Part 3: The evolution of Coalition policy on setting

 

2010 to 2014

Indeed, the 2010 Schools White Paper made a virtue of schools’ autonomy in such matters.

‘We will expect schools to set their own improvement priorities. As long as schools provide a good education, we will not mandate specific approaches.

We…believe that it is often effective to incentivise improvement and innovative ideas, rather than to mandate a uniform approach.’

But it takes some time for any evidence of this approach to emerge in relation to setting.

A PQ from July 2011 asking for data and current guidance to schools elicits the statement that, while there is no guidance,

‘Case studies showing the effective use of setting in schools will be made available on the department’s website shortly.’

The task was not a high priority. Eight months later, in March 2012, the answer to the next PQ on the topic confirms that the material has now been published.

These case studies were not transferred to gov.uk, but have been archived and are still available.

The covering article, dated 26 April 2012, reads:

‘Setting and other forms of pupil grouping are ways of tailoring teaching and learning for mixed-ability classes which can help raise standards. When setting is done well it can be an effective way to personalise teaching and learning to the differing needs of groups of pupils.’

There are five case studies in all, two of secondary and three of primary schools. Each is no more than a page in length. Compared with some of the guidance produced by Labour administrations they are of relatively limited value.

But the issue was soon stirred up by the intervention of HMCI Wilshaw.

In September 2012, he referred to the issue obliquely, but in such a way that his comments could be interpreted to fit the very different political perspectives of the newspapers that carried his comments.

I can find no official Ofsted record of what he said.

One report offers this version:

‘Heads have got to make up their mind. If they want mixed-ability, then they have got to make sure there’s differentiated teaching. And we will be very critical when we inspect schools, particularly in the secondary sector, if we see mixed-ability without mixed-ability teaching.’

He added: ‘If you have got a youngster with low basic skills sitting alongside a youngster with Oxbridge potential, then it is really important that that’s taken into account.’

Another provides a slightly different interpretation:

‘”Where there are mixed-ability classes, unless there is differentiated teaching… it doesn’t work,” he said, adding that effective differentiated teaching was “hugely difficult” to achieve.

He said mixed-ability classes could be an “article of faith” for schools who were not concerned enough about good practice and were doing something “more akin to social engineering”. In those cases Ofsted inspections would be “very critical”.’

A third suggests that Wilshaw expressly put some distance between his remarks and the setting controversy:

‘“This is not a judgment on mixed ability as opposed to setting or streaming, it is saying where there are mixed ability classes unless there is differentiated teaching to groups of school children in the class, unless there are individual programmes of work, it doesn’t work,” he said.

“It is absolutely critical that if you have a youngster with low grades at school who struggles with literacy and numeracy sitting alongside a youngster with Oxbridge potential then it is really important that is taken into account and they are taught by people who are experienced in good teaching of mixed ability classes.”’

Conversely, a fourth was much more bullish:

‘Inspectors will now be critical of schools that do not differentiate between high and low achievers.

This could lead to schools falling into the new category of ‘requires improvement’ (which replaces the old ‘satisfactory’ description), or even being labelled ‘inadequate’…

Ofsted cannot force schools to adopt setting – grouping pupils according to their academic ability in single subjects – or streaming, where ability groups cover most or all subjects.

However, Sir Michael’s intervention is likely to make headteachers rethink their practice of mixed ability classes for fear of being marked down in future inspections.

‘It’s a combination of low expectations of what these youngsters can achieve, that their progress is not sufficiently tracked, and what I would call and have done ever since I have been a teacher the curse of mixed-ability classes without mixed-ability teaching,’ he said.

The former head said mixed-ability classes did not work ‘unless there is differentiated teaching to groups of schoolchildren in the class’ and ‘individual programmes of work’….

…Many schools had recognised this and ‘moved towards setting arrangements’, he said.

It seems as though everyone heard what they wanted to hear.

Wilshaw’s fundamental point seems to echo the 1997 White Paper and some of Labour’s guidance material reviewed above.

His principal argument is that mixed ability settings require mixed ability teaching, and that effective mixed ability teaching is a difficult skill to master. He implies, but does not state explicitly, that teaching a narrower range of ability is comparatively easier.

He suggests that Ofsted will look askance at schools that adopt mixed ability teaching on ideological grounds, that cannot justify it in terms of their learners’ achievement, or where the quality of teaching is insufficient to support it.

Seven months later, in April 2013, a peculiar story appeared in the TES called ‘Conservatives abandon pledge to enforce ability grouping’:

‘The practice of grouping classes by ability has long had strong backing from the top. Ofsted, the education secretary, the prime minister and their Labour predecessors have all encouraged schools to use setting in more lessons.

But, despite their rhetoric, Conservative ministers have quietly dropped a pledge to enforce setting by ability…

… Last September, Ofsted chief inspector Sir Michael Wilshaw appeared to support the call, warning that some students were being held back by “the curse of mixed-ability classes without mixed-ability teaching”, adding that such teaching was “hugely difficult” to achieve.

But the government has now said that it does not advocate setting. “It is for schools to decide how best to organise teaching – including whether to group and set pupils by ability – as they know exactly what their students need,” a spokesman said.

And Ofsted says it “doesn’t have a view on whether setting or streaming is a good idea or not”. A spokeswoman for the inspectorate also revealed that Conservative ministers had not asked Ofsted to enforce setting.’

This is odd, since any Conservative adherence to the enforcement of setting would date back to 2007. I can find no more recent commitment than that. So why overtly drop a policy that no-one could reasonably have assumed the Conservatives still to advocate?

It is as if ministers are determined to re-impose the position on autonomy reached back in 2010, which has been compromised by Wilshaw’s insistence on linking schools’ decisions to Ofsted’s assessment of their performance.

Given more recent history, it is also conceivable that ministers were on the receiving end of pressure from the Prime Minister’s Office to adopt a more interventionist approach. Perhaps this was their way of distancing education ministers from such pressure.

But Ofsted’s alleged neutrality on setting was soon called into question when, in June 2013, it published ‘The Most Able Students’.

This developed HMCI Wilshaw’s theme:

‘In around a third of the schools visited, students were taught mainly in mixed ability groups throughout Key Stage 3. Where setting by ability occurred at an early stage, this was usually only for mathematics. Sets were introduced at various times for English and science, but often only in the later stages of Key Stage 3.

For most other subjects, mixed ability classes were retained throughout Key Stage 3. In the very best schools, this did not appear to have a detrimental impact on students’ progress because the teaching was carefully planned and well matched to the most able students’ needs. In the less effective schools, the work was pitched at the level of the average-attaining students. It was not challenging enough for the most able and their progress was insufficient…

…It was evident in some of the schools visited that school leaders had responded to recent research findings about mixed ability teaching, particularly in Key Stage 3. Eight of the schools had moved recently towards grouping by ability, particularly in English, mathematics and science. Some other school leaders recognised that their earlier grouping arrangements had not always promoted the best outcomes for the most able students. They indicated that they were moving away from mixed ability teaching to setting, streaming or banding in most subjects. Schools’ data shown to inspectors during the visits indicated that these moves were beginning to have a positive impact on outcomes for the most able students.’ 

Although Ofsted may not have an official view on the desirability of setting, it is abundantly clear that schools are encouraged to consider it where the quality of mixed ability teaching provided is not sufficiently strong to secure commensurate outcomes for able learners.

The current iteration of the inspection handbook says:

‘Inspectors should consider how effectively pupils are grouped within lessons and across year groups. For example:

  • where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched and the least able are supported sufficiently to reach their full potential. 
  • where pupils are taught in sets, inspectors will consider how leaders ensure that pupils in lower sets are not disadvantaged or that teachers take into account that pupils within a set may still have very different needs.’

The associated grade descriptors for an inadequate school mention:

‘The organisation of the curriculum and classes is resulting in some pupils achieving less well than they should.’

This carefully balanced approach makes it clear that inspectors will consider equally seriously the efficacy of sets and mixed ability groups.

Schools are likely to be pulled up if their mixed ability settings are insufficiently challenging for high attainers, but will also be challenged if their sets are holding back low attainers or if teaching within sets is insufficiently differentiated.

 

Developments in Autumn 2014

This careful balance was once more disturbed.

On 3 September, the Guardian reported that:

‘Compulsory setting according to ability in England’s secondary schools is to be proposed by the education secretary, Nicky Morgan, in her first big initiative since she took the role in July. She is due to make the announcement as early as today.’

This shock introduction was immediately undermined by a subsequent paragraph indicating that setting would not be compulsory after all, though it would be incentivised through the inspection regime:

‘It is expected that Morgan will ask the education watchdog, Ofsted, to implement and enforce the measure, probably by making it a condition of receiving an outstanding rating’.

So the strategy would be to prevent schools from receiving the top inspection rating if they had not adopted setting.

The piece also stated unequivocally that the policy had been ‘cleared with Downing Street’, although ‘The Department for Education gave no comment after being contacted.’

This implied that the source of the story was the Prime Minister’s Office.

Just six hours later, the same paper carried a second report featuring comments made by Morgan in a Commons debate that same afternoon:

‘Richard Fuller: The Secretary of State has faced a number of confusing interventions from Opposition Members, one of which repeated something that was said in The Guardian today, which was that she was about to announce a policy of compulsory setting. Will she take this opportunity to say whether she is going to do that?

Nicky Morgan: Let me confirm for the benefit of the House that there is absolutely no truth in those rumours. There are some people outside this House who have a rather unhealthy interest in speculating about what I am or am not about to announce. They would be better served if they spent less time on Twitter and talking to journalists, and more time reflecting on the importance of the policies and reforms that have already been implemented by this Government.’ (Hansard, 3 Sep 2014, Col 357)

So as not to appear entirely wrong-footed, the Guardian cited Dominic Cummings in support of its original story:

‘Gove’s former special adviser Dominic Cummings said he had been told Cameron wanted to back compulsory setting.

He added on Twitter: “I was told by No 10 and two others in Whitehall a version v close to the Guardian story. Some had warned internally it was mad.” He also suggested there was a launch plan prepared inside No 10.’

Cummings’ Twitter feed on the day in question is instructive:

[Screenshots of Cummings’ tweets]

A BBC report included comment from the Lib Dems, confirming that they would not support a Coalition policy along these lines:

‘A senior Liberal Democrat source also distanced the party from any such proposal.

“This has not been agreed by the Liberal Democrats and is not government policy. We do not think it would be appropriate to tie schools’ hands in this way.”’

And Labour in opposition took a similar line:

‘Labour’s shadow education secretary Tristram Hunt had called on the education secretary to reject political involvement in such school decisions.

“I believe that excellent heads and great teachers know better than Westminster politicians how to deliver the best schooling for all pupils.

“We thought there was political consensus on the importance of school autonomy.’

The Cummings version lends support to the idea that some sort of enforcement of setting remains under consideration for inclusion in the Conservative Election Manifesto for 2015.

It might once again help to pacify those in the Party who seek a renewed commitment to selective education. Conservative MPs will be acutely aware of UKIP’s declared policy:

‘Existing schools will be allowed to apply to become grammar schools and select according to ability and aptitude. Selection ages will be flexible and determined by the school in consultation with the local authority.’

I could find no explicit statement to the effect that a commitment to introduce setting would definitely not be in the 2015 Manifesto. The final paragraph of a related TES story claimed this was the case, but this is not supported elsewhere.

While there was no reference to setting in Morgan’s speech to the Conservative Party Conference, the idea has subsequently reappeared in a different guise.

On 12 October the Conservative party let it be known that their Manifesto would include plans to enable Regional Schools Commissioners to intervene directly in the operation of any school rated inadequate by Ofsted, whether or not an academy.

The briefing made an explicit link with setting:

‘A Conservative spokesperson said the new powers would be developed in “consultation with Ofsted and the Education Endowment Foundation”, but a “menu of options” might include forcing schools to put children into classes based on ability, or ‘sets’ as they are also known.’ (Academies Week).

So, rather than making setting a condition of an ‘outstanding’ Ofsted rating, this possible new approach is to empower RSCs to impose setting on inadequate schools.

Whether the inclusion of setting in the menu of options would survive the consultation process is open to question – and presumably RSCs would also be reluctant to impose it without hard evidence that it would radically improve the performance of an inadequate school. Such evidence would be hard to find.

Perhaps this is a method of parking the issue: giving No 10 the impression that enforcement of setting is part of the agenda when in fact it is not.

Meanwhile, the DfE has restated its existing commitment to giving schools autonomy in this matter. On 30 October, a Conservative MP tabled a PQ:

‘Andrew Rosindell (Romford):

To ask the Secretary of State for Education, what steps her Department is taking to ensure that children at secondary school are being efficiently grouped according to their academic ability.

Answered by: Mr David Laws

The Department for Education believes that individual schools are best placed to determine whether and how to group children by academic ability. There are many different models of pupil grouping, and schools themselves are best able to respond to their individual circumstances to meet the needs and capabilities of their pupils.’

Note that the reply refers to the DfE’s belief rather than the Government’s position.

This suggests that we may not have heard the last of the matter, especially if setting remains part of the Prime Minister’s strategy for buying off the siren voices calling for renewed commitment to grammar schools.

 

Part 4: The Education Endowment Foundation’s Evidence Base

The Education Endowment Foundation (EEF) exists to improve the achievement of disadvantaged learners. The website says:

‘We aim to raise the attainment of children facing disadvantage by:

  • Identifying and funding promising educational innovations that address the needs of disadvantaged children in primary and secondary schools in England;
  • Evaluating these innovations to extend and secure the evidence on what works and can be made to work at scale;
  • Encouraging schools, government, charities, and others to apply evidence and adopt innovations found to be effective.’

But, confusingly, it has also been designated jointly with the Sutton Trust as a What Works Centre for improving educational outcomes for all school age children:

‘The What Works centres will summarise and share research with local decision-makers, helping them to invest in services that deliver the best outcomes for citizens and value-for-money for taxpayers.

In the EEF’s case, decision-makers include teachers and school-leaders, parents and governors, researchers and policy-makers. They are the primary audience for our Teaching and Learning Toolkit, an accessible summary of educational research which provides guidance for teachers and schools on how to use their resources to improve the attainment of disadvantaged pupils. ‘

See the logical disconnect? The principal tool used by the EEF/Sutton Trust to inform decision makers about what works well with all learners has been designed to inform decisions about what works well with disadvantaged learners.

This is particularly problematic when it comes to setting.

 

 The Teaching and Learning Toolkit

The EEF’s website describes the Toolkit as follows:

‘The Sutton Trust-EEF Teaching and Learning Toolkit is an accessible summary of educational research which provides guidance for teachers and schools on how to use their resources to improve the attainment of disadvantaged pupils.

The Toolkit currently covers 34 topics, each summarised in terms of their average impact on attainment, the strength of the evidence supporting them and their cost.’

One of the 34 topics is ‘Setting or streaming’. This pairing is potentially problematic since the subsequent commentary does not consistently distinguish the impact of one from the other.

I have already described above how the guidance switches between setting, streaming, ability grouping and wider gifted and talented provision.

When it comes to quantification, the Toolkit arrives at an average impact measure of -1 month – ie in terms of average pupil progress over a year, the impact of ‘setting or streaming’ on disadvantaged learners is negative.

The description of the Toolkit notes:

‘Most approaches included in the Toolkit tend to have very similar average impacts on pupils with different characteristics. However, where the research summarised suggests that an approach has a different average impact on the learning of pupils from disadvantaged backgrounds compared to the learning of their peers, the Toolkit’s ‘headline’ average impact figure refers to the former.’

The section describing the impact of ‘setting or streaming’ begins:

‘Overall, ability grouping appears to benefit higher attaining pupils and be detrimental to the learning of mid-range and lower attaining learners. On average, ability grouping does not appear to be an effective strategy for raising the attainment of disadvantaged pupils, who are more likely to be assigned to lower groups.’

It continues:

‘On average, studies show that higher attaining learners make between one and two additional months progress when set or streamed compared to when taught in mixed ability groups.’

No reference is made to the plight of disadvantaged high attainers, who might be expected to benefit commensurately.

The impact of setting and streaming remains undifferentiated.

The next section of the commentary considers a wider range of grouping interventions targeted on gifted and talented learners. This does not seem directly relevant to the narrower case of ‘setting or streaming’.

The final section of the commentary is concerned with low attainers (and so, by implication, includes the majority of disadvantaged learners).

It says:

‘Low attaining learners fall behind by one or two months a year, on average, when compared with the progress of similar students in classes without ability grouping. It appears likely that routine setting or streaming arrangements undermine low attainers’ confidence and discourage the belief that attainment can be improved through effort. Research also suggests that ability grouping can have a longer term negative effect on the attitudes and engagement of low attaining pupils. It should be noted that there are some exceptions to this average, where ability grouping has benefitted all learners. Further study could be undertaken to understand what happened differently in these examples.

Evidence suggests that the impact of setting is more detrimental to low attaining pupils in mathematics who do better in mixed attainment groups, and that ability grouping particularly affects upper primary and lower secondary education. The effects appear to be less clear-cut in other subjects, though negative effects are reported for low attaining pupils across the curriculum.

Though the average impact of ability grouping on low attaining pupils is negative, evidence suggests that certain types of ability grouping are more effective than others. Some studies have shown that reducing the size of the lowest attaining groups and assigning high-performing teachers to these groups can be effective, as can providing additional targeted catch up support.’

So the text suggests that high attaining learners make between one and two months more progress in sets/streams, while low attaining learners fall behind by the same amount. There is, therefore, between two and four months’ difference in the impact on high and low attainers respectively.

But this commentary:

  • Does not provide sufficiently accurate information to distinguish the impact of setting alone from that of ‘setting or streaming’ combined.
  • Neglects the interests of high-attaining disadvantaged learners, who are assumed to be an insignificant minority.
  • Is fundamentally unclear, a particularly heinous crime given the purpose of the Toolkit.

 

The KCL Study

One of the projects funded by the EEF is examining Best Practice in Grouping Students. The four-year project began in 2014 and continues through to spring 2018. It is co-ordinated by a team based at King’s College London and is receiving a total of £1.184m. The evaluation has been assigned to the NFER.

The project summary on EEF’s website distinguishes two parallel strands:

  • A randomised control trial of an intervention ‘which trains schools in a best practice approach to setting’, focused on English and maths in Years 7 and 8. The trial begins in September 2015, but results are not expected until spring 2018. It will be conducted in a sample of 120 schools, randomly allocated either to receive the intervention or to form part of the control group.
  • A pilot of an intervention ‘to introduce mixed ability teaching to secondary schools’ and ‘examine whether it is possible to overcome the common barriers to mixed ability teaching’. The intervention will be developed initially with three schools but the pilot will subsequently be extended to ten. Results are due in spring 2017.
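For readers unfamiliar with the mechanics of such a trial, school-level randomisation of the kind described above can be sketched roughly as follows. This is a hypothetical illustration only: the school names, the function, the even 60/60 split and the fixed seed are my assumptions, not NFER's actual allocation procedure.

```python
import random

def allocate_schools(schools, seed=2015):
    # Shuffle the sample of schools, then split it evenly between the
    # intervention arm and the control arm (school-level randomisation).
    rng = random.Random(seed)  # fixed seed so the allocation is reproducible
    shuffled = list(schools)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# 120 placeholder school names standing in for the real sample
intervention, control = allocate_schools([f"School {i}" for i in range(1, 121)])
```

Randomising whole schools rather than individual pupils is the point of the design: it avoids contamination between arms within a school, at the cost of needing a larger sample.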

One of the descriptions on the King’s College site suggests that the focus is explicitly placed on lower sets and low attainers:

‘The project addresses the needs of pupils in low ‘ability’ sets and streams, wherein research has identified socially disadvantaged pupils are strongly over-represented.

The project draws on substantial existing research evidence (concerning the educational outcomes for young people in low sets and streams, and the related poor practice often associated with such groupings), as illustrated in the Education Endowment Foundation/Sutton Trust Toolkit and elsewhere. The evidence from the literature concerning existing bad practice and detrimental outcomes associated with low ‘ability’ groups establishes areas for potential improvement, which will be applied via the interventions.’

It adds that the trial will measure the impact on pupil attainment, noting that the developmental phase:

‘Will also allow us to research why schools and policy-makers appear so wedded to ‘ability’ grouping, and what might persuade the adoption of an evidence-based approach.’

A second description confirms the focus of the setting strand on lower sets:

‘One, on Best Practice in Setting, seeks to remedy the detrimental elements identified by research to be associated with low sets.’

This also mentions that a pilot study – it is not clear whether this is of one strand or both – is being undertaken in six schools in the current academic year. It states that the full study will involve around 100 schools (rather than 120) and will be completed in 2017 (rather than spring 2018).

The exclusive emphasis on low sets is directly contradicted in a TES story about the project:

‘The Education Endowment Foundation, which commissioned the study, said the research aimed to address poor practices in both low and high sets.’

Is there a difference of opinion between KCL and the EEF? It would be helpful to know the truth, since there is otherwise strong reason to believe that the needs of high-attaining disadvantaged learners will be neglected.

NFER’s description of its evaluation currently says that the protocols for both strands are not yet agreed. Hopefully they will make clear whether the operation of higher sets – and the impact on disadvantaged learners within them – is also part of the agenda.

 

What Makes Great Teaching?

On 31 October 2014, the Sutton Trust published ‘What Makes Great Teaching?’

One assumes that it has done so in its role as partner with the EEF in the What Works Centre, rather than as a charity supporting social mobility.

The press release dictated much of the media coverage. It was provocatively headed:

‘Many popular teaching practices are ineffective, warns new Sutton Trust report’

It begins:

‘Lavish praise for students is among seven popular teaching practices not supported by evidence, according to a new Sutton Trust report which reviews over 200 pieces of research on how to develop great teachers.

What Makes Great Teaching, by Professor Rob Coe and colleagues at Durham University, warns that many common practices can be harmful to learning and have no grounding in research. Examples include using praise lavishly, allowing learners to discover key ideas by themselves, grouping students by ability and presenting information to students based on their “preferred learning style”.’

Later on the press release lists seven ‘examples of strategies unsupported by evidence’. Third in the list is:

‘Grouping students by ability. Evidence on the effects of grouping by ability, either by allocating students to different classes, or to within-class groups, suggests that it makes very little difference to learning outcomes. It can result in teachers failing to accommodate different needs within an ability group and over-playing differences between groups, going too fast with the high-ability groups and too slow with the low.’

The Report itself does not include this list in its executive summary. It appears in a section called ‘Examples of Ineffective Practices’. But the text repeats, more or less verbatim, the claim in the press release:

‘The following are examples of practices whose use is not supported by the research evidence…

Group learners by ability

Evidence on the effects of grouping by ability, either by allocating students to different classes, or to within-class groups, suggests that it makes very little difference to learning outcomes (Higgins et al, 2014). Although ability grouping can in theory allow teachers to target a narrower range of pace and content of lessons, it can also create an exaggerated sense of within-group homogeneity and between-group heterogeneity in the teacher’s mind (Stipek, 2010). This can result in teachers failing to make necessary accommodations for the range of different needs within a supposedly homogeneous ‘ability’ group, and over-doing their accommodations for different groups, going too fast with the high-ability groups and too slow with the low.’

The first reference – to grouping by ability making ‘very little difference to learning outcomes’ – is to the Toolkit, though the report’s bibliography attributes it to 2013 rather than 2014. The second reference – ‘Stipek 2010’ – inexplicably appears in the bibliography under D rather than S.

As far as I can see, this is a reference to an article – an excerpt from a 2002 book called Motivation to Learn: Integrating Theory and Practice – that cites a series of other studies dating between 1976 and 1998.

Is the opening sentence an accurate description of what the Toolkit says?

As we have seen, the Toolkit considers ‘setting or streaming’ – though it also mentions a range of other strategies targeted at gifted and talented students – but it doesn’t discuss substantive evidence relating to within-class groups.

The only reference to them comes at the end of the Toolkit entry, in the section ‘What should I consider’. It says:

‘Flexible within-class grouping is preferable to tracking or streaming for low-attaining pupils’.

But that doesn’t support the statement above. (Nor, for that matter, is it supported by the evidence in the earlier parts of the Toolkit text.)

The differential impact of setting on high and low attainers is not mentioned.

How might this statement be improved to reflect the evidence? It might say:

  • When discussing evidence on the effectiveness of ability grouping, it is important to distinguish the impact of setting, streaming and within class ability grouping respectively.
  • It is also important to distinguish the differential impacts on high attainers and low attainers respectively. Great care should be taken to clarify whether the discussion relates to all learners or only to disadvantaged learners. The subset of low attainers ought not to be regarded as analogous with the subset of disadvantaged learners.
  • The evidence suggests the overall impact of setting or streaming – ie one or the other – on low attainers is negative (one to two months) whereas the impact on high attainers is positive (one to two months). There is therefore a difference of between two and four months’ progress between high and low attainers respectively.
  • There is less evidence on the differential impact of setting and streaming respectively. What we do know is x.
  • The impact of setting varies according to prior attainment of the learners, the subject of study and how well it is implemented. The available evidence suggests that setting is most likely to be successful under the following conditions….and, conversely, is least likely to be successful when….

 

Where we are now – and future prospects

 

The Evidence Base

The arguments about the advantages and disadvantages of setting have long been polarised – and there is little evidence to suggest that this will change as we head into 2015.

The EEF/Sutton Trust nexus purports to stand for evidence-based pedagogy, but both arms of the partnership are too vague and unspecific in how they present this evidence.

Because they take short cuts, it is too easy to interpret their coverage as overwhelmingly negative towards setting. A more careful and nuanced presentation would highlight the different contexts where setting might be more and less likely to operate effectively.

As things stand, the standard bearers for evidence-based practice seem more inclined to perpetuate the polarisation of views, rather than promoting a more sophisticated understanding of the issue.

This may be a deliberate reaction to the unevidenced espousal of setting by politicians, or it may simply reflect insufficient attention to detail.

Substantive amendment of the toolkit entry – along the lines set out above – is devoutly to be wished for.

And it should be accompanied by a commitment to produce and update accurate data about the incidence of setting by sector, type of school, subject and pupils’ prior attainment. The Schools Census is the perfect vehicle.

One hopes that the results from the KCL study will be more carefully presented, but the absence of evaluation protocols and the disagreements over the focus of the study are a cause for concern. The protocols should be finalised and published forthwith.

The KCL study is unlikely to reveal that best practice in setting has a substantial impact on improvements in the performance of disadvantaged learners, even the high attainers.

But everything is relative: hardly any of the studies of other interventions so far completed by the EEF have identified a significant positive effect.

I would be satisfied with confirmation of a limited but positive impact on the performance of disadvantaged high attainers, combined with recognition that any negative impact on disadvantaged low attainers can potentially be eliminated through effective practice.

Some recommendations for the implementation of hybrid approaches – perhaps combining a single top set with several parallel mixed ability groups – wouldn’t go amiss.

Any evidence that does emerge from the KCL study – positive or negative – will not appear until well after the 2015 Election.

For the future, we keenly anticipate the pronouncements on setting that will emerge from a College of Teaching and/or from the Liberal Democrats’ independent Education Standards Authority. There is no reason to believe that they will be any more inclined to withstand the ideological pressures than their predecessors.

 

The Policies

Labour and Liberal Democrat politicians seem wedded to the notion of full autonomy for schools, though their parallel enthusiasm for the new entities mentioned above might tend to undermine this position.

It is not clear whether schools would be protected as much from the setting-related pronouncements of a College of Teaching as they would from the predilections of a government minister.

As for the Conservatives, they seem caught on the horns of a dilemma. Do they too opt for autonomy and the virtues of the market, or do they get on the front foot and develop a more robust alternative to UKIP’s espousal of selection?

They could commit to more selective schools, if only of the post-16 variety. They might even push back the concept to 14+, perhaps enabling the strongest sixth form colleges to accept 14-16 year-olds just as FE colleges can.

They might develop a new cross-school support system for high attainers, especially disadvantaged high attainers. They need look no further than posts elsewhere on this blog for ideas as to how it might be constructed.

They should avoid at all costs the Sutton Trust open access wheeze, which directs a substantial taxpayer subsidy towards independent schools while denuding the state sector of high attaining learners.

Or they might continue to refine the idea of a grammar stream in every school.

The research evidence against streaming seems to me more damning than the evidence against setting, though this is often obscured by the tendency to roll them up together.

That said, several comprehensive schools operating in direct competition with grammar schools seem to have introduced a grammar stream. It would be possible to promote this practice in this subset of comprehensive schools – whether through Ofsted or otherwise – and to develop practical guidance on effective practice.

Setting might well remain the path of least resistance, although compulsory setting across the board would be too much of a straitjacket, restricting the flexibility of those schools that perform outstandingly with mixed ability teaching.

So some sort of selective imposition is necessary. The Ofsted inspection regime is the only effective lever remaining in the hands of central government. The Inspection Handbook might be altered to reinstate a presumption of the kind advanced by Labour in 1997 – and this might be weighted towards schools in either the higher or the lower performance categories. In either case the presumption might be confined to the core subjects.

But even this could only be carried forward in the teeth of opposition from the education profession, so would have the potential to reduce still further the quantum of Tory teacher votes.

The more recently suggested fall-back – adding setting to a menu of possible school improvement interventions managed through the Regional Schools Commissioners – is weak by comparison. So weak that it is tantamount to kicking the whole idea into the long grass.

There are precious few alternatives. Perhaps the only other realistic proposition is to revert to presentation of the evidence base, but to develop this into substantive guidance that schools are expected to consider before determining their approach to ability grouping – much more substantive than the half-hearted case studies published in 2012.

If it were my call, I would construct a ‘flexible framework’ quality standard that defines the broad parameters of effective practice while permitting schools significant flexibility over how they interpret those parameters in practice.

This would align with and reflect the available research evidence on the most effective approaches to setting, including advice on when setting is most likely to work and when it might be preferable to select an alternative strategy.

I would incorporate the standard into supplementary guidance for Ofsted inspectors, to ensure that inspection judgements on setting are fully consistent with it.

And I would strongly encourage schools to use it within their own improvement planning processes – and through peer-to-peer assessments undertaken by teaching schools, national leaders of education and other elements of the emerging self-improving schools system.

I would combine this with a national support programme for disadvantaged high attainers in Years 7-13, generously funded through matched topslices from the Pupil Premium and HE fair access budgets.

 

The Politics

With May and Johnson already on manoeuvres, not to mention continued pressure from the Brady/Davis camp, Cameron may be even more inclined to press ahead, even in the teeth of opposition from some DfE ministers.

In June 2013, polling (p8) suggested that 43% of all voters – and 66% of Tory voters – agreed that:

‘The Government should encourage more schools to select by academic ability and build more grammar schools’.

It seems highly likely that, without any viable alternative, the Tories will haemorrhage right-wing votes to UKIP over this issue. But there is a trade-off with teacher votes. Are they capable of defining an acceptable middle way, or are they doomed to fall between two stools?

They might at any rate consider some of the ideas set out above.

 

GP

November 2014

 

Excellence Gaps Quality Standard: Version 1

 

This post is the first stage of a potential development project.

It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.

It aims to integrate two separate educational objectives:

  • Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
  • Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.

High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.

I have adopted new design parameters for this fresh venture into quality standards:

  • The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
  • The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe or they would otherwise be inadequate. I have erred on the side of minimalism for this first effort.
  • The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
  • The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.

The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.

There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)

The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.

The terminology needn’t necessarily be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. Primary schools, for example, may find some of the language more appropriate to secondary schools.

It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.

It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.

Some of the terminology will be wanting, some important references will have been omitted while others will be over-egged. That is the nature of ‘aunt sallys’.

Feel free to propose amendments using the comments facility below.

The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.

 

 

Emerging (RI): The setting meets essential minimum criteria
Improving (G): In best fit terms the setting has progressed beyond entry level but is not yet exemplary
Exemplary (O): The setting is a model for others to follow

Performance

Emerging: Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.

Improving: Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.

Exemplary: Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally. Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.

 

 

 

  Emerging (RI) The setting meets essential minimum criteria Exemplary (O) The setting is a model for others to follow
Policy/strategy There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored. There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike.

.

SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected.

.

The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.

Classroom T&L

Emerging (RI): Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.

Exemplary (O):

• The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice.
• All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance.
• All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.

Out of class learning

Emerging (RI): A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.

Exemplary (O):

• A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate.
• All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled.
• Staff ensure that learners’ needs are regularly assessed; that learners access and complete opportunities matching their needs; and that participation and performance are monitored and compiled in a learning record.

Assessment/tracking

Emerging (RI): Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.

Exemplary (O):

• Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance.
• Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals.
• Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness.
• All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work.
• Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.

Curriculum/organisation

Emerging (RI): The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.

Exemplary (O):

• The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this.
• Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs.
• Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility.
• Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.

Ethos/pastoral

Emerging (RI): The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.

Exemplary (O):

• The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally.
• Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem; hothousing is shunned.
• High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportionate response to efforts to undermine this culture.
• Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed.
• The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.

Transition/progression

Emerging (RI): The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.

Exemplary (O):

• Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect.
• Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings.
• Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs.
• Destinations data is collected, published and used to inform monitoring.

Leadership, staffing, CPD

Emerging (RI):

• A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard).
• Professional development needs associated with closing excellence gaps are identified and addressed.

Exemplary (O):

• The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps.
• A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard).
• Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor.
• There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part.
• The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.

Parents

Emerging (RI): Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.

Exemplary (O):

• Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary.
• Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning.
• This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.

Resources

Emerging (RI):

• Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard).
• Where available, Pupil Premium is used effectively to support disadvantaged high achievers.

Exemplary (O):

• Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard).
• The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting.
• The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs.
• Settings should evidence their commitment to these principles in published material (especially the information they are required to publish about their use of the Pupil Premium).

Partnership/collaboration

Emerging (RI): The setting takes an active role in collaborative activity to close excellence gaps.

Exemplary (O):

• Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality.
• The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally.
• The setting uses collaboration strategically to build its own capacity and improve its expertise.

 


Those who are not familiar with the quality standards approach may wish to know more.

Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.

The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever more elaborate performance tables and through an inspection regime that depends on fallible human inspectors, supported by documentation that regulates towards convergence when it should be enabling diversity within defined parameters.

I see more value in supporting institutions through best-fit guidance of this kind.

My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.

I have written about the application of quality standards to gifted education, and about their benefits, on several occasions.

Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.

Rather, they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.

Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.

 

GP

October 2014