The problem of reverse excellence gaps

This post compares the performance of primary schools that record significant proportions of disadvantaged high attainers.

It explores the nature of excellence gaps, which I have previously defined as:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

It draws particular attention to the incidence at school level of sizeable reverse excellence gaps where disadvantaged learners out-perform their more advantaged peers.
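To pin the definition down, here is a minimal sketch of the calculation. The function name and sign convention are mine, chosen so that the output matches the national figures reported below (a positive gap means other learners are ahead; a negative gap is a reverse gap):

```python
def excellence_gap(pct_other: float, pct_disadvantaged: float) -> float:
    """Excellence gap in percentage points: positive = standard gap
    (other learners ahead), negative = reverse gap (disadvantaged ahead)."""
    return pct_other - pct_disadvantaged

# National L5+ reading, writing and maths rates quoted in this post,
# as (other, disadvantaged) pairs.
national = {2012: (24, 9), 2013: (26, 10), 2014: (29, 12)}

for year, (other, disadvantaged) in sorted(national.items()):
    gap = excellence_gap(other, disadvantaged)
    kind = "reverse" if gap < 0 else "standard"
    print(f"{year}: {kind} gap of {abs(gap)} percentage points")
```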

According to my theoretical model, reverse gaps threaten equilibrium and should be corrected without depressing the achievement of disadvantaged high attainers.

In this post:

  • The measure of disadvantage is eligibility for the pupil premium – those eligible for free school meals at any time in the last six years (‘ever 6 FSM’) and children in care.
  • The measure of high attainment is Level 5 or above in KS2 reading, writing and maths combined.


National figures

The 2014 Primary School Performance Tables show that 24% of the cohort attending state-funded primary schools achieved KS2 Level 5 or above in reading, writing and maths combined. In 2013, the comparable figure was 21% and in 2012 it was 20%.

In 2014 some 650 primary schools managed a success rate of 50% or higher for the entire cohort, up from 425 in 2013 and 380 in 2012.

The comparable national percentages for disadvantaged learners are 12% in 2014, 10% in 2013 and 9% in 2012. For all other learners (i.e. non-disadvantaged) they are 29% in 2014, 26% in 2013 and 24% in 2012.

In 2014, there were 97 state-funded schools where 50% or more of disadvantaged learners achieved this benchmark, compared with only 38 in 2013 and 42 in 2012. This group of schools provides the sample for this analysis.
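For anyone wanting to reproduce the sample selection, a minimal sketch follows. The rows and column names are hypothetical stand-ins; the published performance-table download uses its own headings:

```python
import pandas as pd

# Illustrative rows only; "pct_disadv_l5_rwm" is a hypothetical column name.
df = pd.DataFrame({
    "school": ["A", "B", "C"],
    "pct_disadv_l5_rwm": [67, 34, 52],
})

# Keep schools where 50% or more of disadvantaged pupils achieved L5+ in
# reading, writing and maths combined (97 schools met this test in 2014).
sample = df[df["pct_disadv_l5_rwm"] >= 50]
print(sample["school"].tolist())  # ['A', 'C']
```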

Chart 1 below illustrates the national excellence gaps over time, while Chart 2 compares the number of schools achieving 50% or higher on this measure with all learners and disadvantaged learners respectively.

Chart 1: Percentage of disadvantaged and other learners achieving L5+ in KS2 reading, writing and maths, 2012-14

Chart 1 shows that all rates are improving, but the rate of improvement is slower for disadvantaged learners. So the socio-economic achievement gap at L5+ in reading, writing and maths combined has grown from 15 percentage points in 2012, to 16 points in 2013 and 17 points in 2014.

Chart 2: Number of schools where 50% or more of all/disadvantaged learners achieved L5+ in KS2 reading, writing and maths, 2012-14

Chart 2 shows steady improvement in the number of schools achieving this outstanding level of performance for all learners and disadvantaged learners alike (though there was a dip for the latter in 2013).

Since 2012, the number of schools achieving this benchmark with disadvantaged learners has increased more substantially than the number doing so with all learners. At first sight this is a positive trend.

However Chart 1 suggests that, even with the pupil premium, the national excellence gap between higher-attaining advantaged and disadvantaged learners is increasing steadily. This is a negative trend.

It might suggest either that high-attaining disadvantaged learners are not benefiting sufficiently from the premium, or that interventions targeted towards them are ineffective in closing gaps. Or perhaps both of these factors are in play.

 

Schools achieving high success rates with disadvantaged learners

The 97 schools achieving a success rate of 50% or more with their disadvantaged high attainers are geographically dispersed across all regions, although a very high proportion (40%) is located in London and over half are in London and the South-East.

Chart 3: Distribution of schools in sample by region

Nineteen London boroughs are represented but eight of the 97 schools are located in a single borough – Greenwich – with a further five in Kensington and Chelsea. The reasons for this clustering are unclear, though it may suggest a degree of common practice.

Almost half of the sample consists of church schools, fairly equally divided between Church of England and Roman Catholic institutions. Seven of the 97 are academy converters, six are voluntary controlled, 42 are voluntary aided and the remainder are community schools.

Other variables include:

  • The average size of the KS2 cohort eligible for assessment is about 40 learners, with a range from 14 to 134.
  • The percentage of high attainers varies from 7% to 64%, compared with an average of 25% for all state-funded schools. More than one quarter of these schools record 40% or more high attainers.
  • The percentage of middle attainers ranges between 38% and 78%, compared with an average of 58% for state funded schools.
  • The percentage of low attainers lies between 0% and 38%, compared with the national average for state-funded schools of 18%. Only 15 of the sample record a percentage higher than this national average.
  • The percentage of disadvantaged learners ranges from 4% to 77%, compared with the national average for state-funded schools of 31%. Roughly one in five of the sample has 50% or more, while almost two in five have 20% or less.
  • The number of disadvantaged pupils in the cohort is between 6 and 48. (Schools with five or fewer in the cohort have their results suppressed.) In only 22 of the sample is the number of disadvantaged pupils higher than 10.
  • In 12 of the schools there are no EAL pupils in the cohort but a further 11 are at 60% or higher, compared with an average for state-funded schools of 18%.

Overall there is significant variation between these schools.


School-level performance

The vast majority of the schools in the sample are strong performers overall on the L5 reading, writing and maths measure. All but five lie above the 2014 national average of 24% for state-funded schools and almost half are at 50% or higher.

The average point score ranges from 27.9 to 34.7, compared with the state-funded average of 28.7. All but 15 of the sample record an APS of 30 or higher. The average grade per pupil is 4B in one case only and 4A in fourteen more. Otherwise it is 5C or higher.

Many of these schools are also strong performers in KS2 L6 tests, though these results are not disaggregated for advantaged and disadvantaged learners.

More than four out of five are above the average 9% success rate for L6 maths in state-funded primary schools and almost two out of five are at 20% or higher.

As for L6 grammar, punctuation and spelling (GPS), some two-thirds are above the success rate of 4% for all state-funded primary schools and almost two out of five are at 10% or higher.

When it comes to the core measure used in this analysis, those at the top of the range appear at first sight to have performed outstandingly in 2014.

Four schools come in at over 80%, though none has a disadvantaged cohort larger than eight pupils. These are:

Not far behind them is Tollgate Primary School, Newham (71%), but Tollgate also has a cohort of 34 disadvantaged learners, almost three times the size of any of its nearest rivals.

What stands out from the data above all else is the fact that very few schools show any capacity to replicate this level of performance over two or three years in succession.

In some cases results for earlier years are suppressed because five or fewer disadvantaged pupils constituted the cohort. Leaving those aside, just six schools in the sample managed a success rate of 50% or higher in 2013 as well (so for two successive years) and no school managed it for three years in a row.

The schools closest to achieving this are:

  • Tollgate Primary School, Newham (71% in 2014, 50% in 2013 and 40% in 2012)

Only nine of the sample achieved a success rate of 30% or higher for three years in a row.

The size and direction of excellence gaps

Another conspicuous finding is that several of these schools display sizeable reverse excellence gaps, where the performance of disadvantaged learners far exceeds that of their more advantaged peers.

Their success rates for all other pupils at L5 in reading, writing and maths combined vary enormously, ranging from 10% to 91%. Nineteen of the sample (20%) are at or below the national average rate for state-funded schools.

But in a clear majority of the sample the success rate for all other pupils is lower than it is for disadvantaged pupils.

The biggest reverse excellence gap is recorded by St John’s Church of England Primary School in Cheltenham, Gloucestershire, where the success rate for disadvantaged learners is 67%, compared with 19% for other learners, giving a huge disparity of 48 percentage points!

Several other schools record reverse gaps of 30 points or more, many of them church schools. This raises the intriguing possibility that the ethos and approach in such schools may be relatively more conducive to disadvantaged high attainers, although small numbers are undoubtedly a factor in some schools.

The ‘cliff-edge’ nature of the distinction between disadvantaged and other learners may also be a factor.

If schools have a relatively high proportion of comparatively disadvantaged learners who are nevertheless ineligible for the pupil premium, those learners may depress the results for the ‘all other’ group, especially if their particular needs are not being addressed.

At the other extreme, several schools perform creditably with their disadvantaged learners while also demonstrating large standard excellence gaps.

Some of the worst offenders are the schools celebrated above for their relative consistency over the three-year period:

  • Fox Primary School has a 2014 excellence gap of 34 points (57% disadvantaged versus 91% advantaged)
  • Nelson Mandela School has a similar gap of 28 points (54% disadvantaged versus 82% advantaged).

Only Tollgate School bucks this trend with a standard excellence gap of just two percentage points.

The chart below illustrates the variance in excellence gaps across the sample. Sizeable reverse gaps clearly predominate.

Chart 4: Incidence of reverse and normal excellence gaps in the sample

Out of the entire sample, only 17 schools returned success rates for disadvantaged and other learners that were within five percentage points of each other. Fewer than one-third of the sample falls within a variance of plus or minus 10 percentage points.

These extreme variations may in some cases be associated with big disparities in the sizes of the two groups: if disadvantaged high attainers are in single figures, differences can hinge on the performance of just one or two learners. But this does not apply in all cases. As noted above, the underperformance of relatively disadvantaged learners may also be a factor in the reverse gaps scenario.
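The arithmetic behind this caveat is worth making explicit: in a cohort of n disadvantaged pupils, each individual result is worth 100/n percentage points. A quick illustration, using cohort sizes drawn from the range in this sample:

```python
# Each pupil in a cohort of size n moves the success rate by 100/n points,
# so small cohorts produce volatile year-on-year success rates.
for n in (6, 8, 34, 48):  # cohort sizes spanning the range in this sample
    print(f"cohort of {n:>2}: one pupil shifts the rate by {100 / n:.1f} percentage points")
```

With eight disadvantaged pupils, a single learner moves the headline rate by 12.5 percentage points; with Tollgate's 34, the same swing is under three points.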

Ofsted inspection reports

I was curious to see whether schools with sizeable excellence gaps – whether normal or reverse – had received comment on this point from Ofsted.

Of the schools within the sample, just one – Shrewsbury Cathedral Catholic Primary School – has been rated inadequate in its last inspection report. The inspection was undertaken in July 2014, so will not have reflected a huge reverse excellence gap of 38 percentage points in the 2014 KS2 assessments.

The underachievement of the most able is identified as a contributory factor in the special measures judgement but the report comments thus on the achievement of disadvantaged learners:

‘Although in Year 6, pupils eligible for additional government funding (the pupil premium) reach similar levels to their classmates in reading, writing and mathematics, eligible pupils attain lower standards than those reached by their classmates, in Years 2, 3 and 4. The gap between the attainment of eligible and non-eligible pupils in these year groups is widening in reading, writing and mathematics. In mathematics, in Year 3, eligible pupils are over a year behind their classmates.’

Two further schools in the sample were judged by Ofsted to require improvement, both in 2013 – St Matthew’s in Surbiton and St Stephen’s in Godstone, Surrey. All others that have been inspected were deemed outstanding or good.

At St Matthew’s inspectors commented on the achievement of disadvantaged learners:

‘Weaknesses in the attainment of Year 6 pupils supported by pupil premium funding were identified in 2012 and the school took action to reduce the gap in attainment between this group of pupils and their peers. This gap reduced in 2013 so that they were just over one term behind the others in English and mathematics, but there is still a substantial gap for similar pupils in Year 2, with almost a year’s gap evident in 2013. Support is now in place to tackle this.’

In 2014, the KS2 cohort at St Matthew’s achieved a 53% success rate on L5 reading, writing and maths, with disadvantaged learners at 50%, not too far behind.

At St Stephen’s inspectors said of disadvantaged learners:

‘The school successfully closes the gap between the attainment of pupils who benefit from the pupil premium and others. Indeed, in national tests at the end of Year 6 in 2012, the very small number of eligible pupils was attaining about a term ahead of their classmates in English and mathematics. Focused support is being given to eligible pupils in the current year to help all fulfil their potential.’

A more recent report in 2015 notes:

‘The school is successfully closing the gaps between disadvantaged pupils and others. In 2014, at the end of Key Stage 2, disadvantaged pupils outperformed other pupils nationally and in the school by about three terms in mathematics. They also outperformed other pupils nationally by about two terms nationally and in the school in reading and writing. Disadvantaged pupils across the school typically make faster progress than other pupils in reading, writing and mathematics.’

It is not clear whether inspectors regard this as a positive outcome.

Unfortunately, Tollgate, Nelson Mandela and Fox – all three outstanding – have not been inspected since 2008/2009. One wonders whether the significant excellence gaps at the latter two might impact on their overall inspection grades.


Pupil Premium allocations 

I was equally curious to see what the websites for these three schools recorded about their use of the pupil premium.

Schools are required to publish details of how they spend the pupil premium and the effect this has on the attainment of the learners who attract it.

Ofsted has recently reported that only about one-third of non-selective secondary schools make appropriate use of the pupil premium to support their disadvantaged most able learners – and there is little reason to suppose that most primary schools are any more successful in this respect.

But are these three schools any different?

  • Fox Primary School has pupil premium income of £54.7K in 2014-15. It explains in its statement:

‘Beyond all of this, Fox directs a comparatively large proportion of budget to staffing to ensure small group teaching can target pupils of all attainment to attain and achieve higher than national expectations. Disadvantaged pupils who are attaining above the expected level are also benefitting from small group learning, including core subject lessons with class sizes up to 20. The impact of this approach can be seen in the APS and value added scores of disadvantaged pupils for the last 2 years at both KS1 and KS2. The improved staffing ratios are not included in pupil premium spend.’

  • Nelson Mandela School has so far not uploaded details for 2014-15. In 2013-14 it received pupil premium of £205.2K. The statement contains no explicit reference to high-attaining disadvantaged learners.
  • Tollgate Primary School received pupil premium of £302.2K in 2014-15. Its report covers this and the previous year. In 2013-14 there are entries for:

‘Aim Higher, challenging more able FSM pupils’ (Y6)

In 2014-15 funding is allocated to pay for five intervention teachers, whose role is described as:

‘Small group teaching for higher ability. Intervention programmes for FSM’.


Conclusion

The national excellence gap between disadvantaged and other learners achieving KS2 L5 in all of reading, writing and maths is growing, despite the pupil premium. The reasons for this require investigation and resolution.

Ofsted’s commitment to give the issue additional scrutiny will be helpful but may not be sufficient to turn this situation around. Other options should be considered.

The evidence suggests that schools’ capacity to sustain Level 5+ performance across reading, writing and maths for relatively large proportions of their disadvantaged learners is limited. High levels of performance are rarely maintained for two or three years in succession.

Where high success rates are achieved, more often than not this results in a significant reverse excellence gap.

Such reverse gaps may be affected by the small number of disadvantaged learners within some schools’ cohorts, but the evidence may also suggest that several schools are succeeding with their disadvantaged high achievers at the expense of those from relatively more advantaged backgrounds.

Further investigation is necessary to establish the association between this trend and a ‘cliff-edge’ definition of disadvantage.

Such an outcome is not optimal or desirable and should be addressed quickly, though without depressing the performance of disadvantaged high achievers.

A handful of strong performers, including the majority of those that are relatively more consistent year-on-year, do well despite continuing to demonstrate sizeable standard excellence gaps.

Here the advantaged do outstandingly well and the disadvantaged do significantly worse, but still significantly better than in many other schools.

This outcome is not optimal either.

There are very few schools that perform consistently highly on this measure, for advantaged and disadvantaged high attainers alike.

Newham’s Tollgate Primary School is perhaps the nearest to exemplary practice. It receives significant pupil premium income and, in 2014-15, has invested in five intervention staff whose role is partially to provide small group teaching that benefits high attainers from disadvantaged backgrounds.

Fox Primary School has also acted to reduce group sizes, but it remains to be seen whether this will help to eliminate the large standard excellence gap apparent in 2014.

This is a model that others might replicate, provided their pupil premium income is substantial enough to underwrite the cost, but the necessary conditions for success are not yet clear and further research is necessary to establish and disseminate them.

Alternative approaches will be necessary for schools with small numbers of disadvantaged learners and a correspondingly small pupil premium budget.

The Education Endowment Foundation (EEF) is the obvious source of funding. It should be much more explicitly focused on excellence gaps than it has been to date.

GP

May 2015

High Attainment in the 2014 Secondary and 16-18 Performance Tables


This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.


It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them (SFR02/2015, SFR03/2015 and SFR06/2015).

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to previous posts on:

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.


Headlines

At KS4:

  • High attainers constitute 32.4% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based solely on KS2 prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English. The same was true of maths in 99 schools. Only 21 schools managed 100% success in both English and maths. At the other extreme there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers does not make the expected progress in maths and one in five does not do so in English. In free schools one in every five high attainers falls short in English, as does one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.


Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
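Assuming the 2013 methodology quoted above does still apply, the banding reduces to a simple threshold rule on the KS2 average point score, sketched below (the function name is mine; the cut-offs are those quoted):

```python
def prior_attainment_band(ks2_aps: float) -> str:
    """Band a pupil by KS2 average point score across English, maths
    and science, per the 2013 methodology quoted above."""
    if ks2_aps < 24:
        return "low"
    if ks2_aps < 30:   # 24 to 29.99
        return "middle"
    return "high"      # 30 or more

# A pupil averaging level 4 (27 points) across the three subjects is a
# middle attainer; an average above level 4 tips into the high band.
print(prior_attainment_band(27.0))  # middle
print(prior_attainment_band(30.0))  # high
```

This averaging is also why high attainers tend to be all-rounders: a single exceptional score cannot lift a low average past the 30-point threshold.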

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations which:
     • restrict the qualifications counted
     • prevent any qualification from counting as larger than one GCSE
     • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.
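To make the early-entry element of the two bases concrete, here is a minimal sketch (illustrative grades only; the real calculation also applies the discounting rules and qualification caps described above):

```python
GRADES = ["A*", "A", "B", "C", "D", "E", "F", "G", "U"]  # best to worst

def first_entry(attempts):
    """2014 methodology: only the first attempt at a qualification counts."""
    return attempts[0]

def best_entry(attempts):
    """Proxy 2013 methodology: the best result across attempts counts."""
    return min(attempts, key=GRADES.index)

attempts = ["D", "C", "B"]        # an early entry followed by two resits
print(first_entry(attempts))      # D -> counts under the 2014 rules
print(best_entry(attempts))       # B -> counts under the proxy 2013 basis
```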

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR 02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.


Secondary outcomes


The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, who together account for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

Chart 1: Percentage of high attainers by school type, 2011-2014

*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high-attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers remind us that selection by ability through 11-plus tests gives a somewhat different sample than selection exclusively on the basis of KS2 attainment.


Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio colleges say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) including GCSEs in English and maths. This compares with 56.6% of all learners. Allowing of course for the impact of 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered a success rate of 67% or lower for their high attainers; in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.


A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same that do so on the 5+ A*-C measure above. They are joined by Tong High School. 

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students.
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English in maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).

Comparing achievement of these measures by school type

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

It shows that:

  • There is significant variation on all five measures, though it is most pronounced for achievement of the EBacc, where there is a difference of more than 20 percentage points between the success rates in sponsored academies (39.2%) and in converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English, as does one in six in maths. This is unacceptable.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.


Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence there is a smaller gap, by and large, between the performance of high attainers in modern and comprehensive schools respectively than there is between high attainers in comprehensive and selective schools respectively.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in comprehensives and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.


Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return A*-: Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have number of entries for all qualifications and not for GCSE only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinkley.

16-18: A level outcomes


A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, reveals how performance on this measure has changed since 2013 for different types of institution and, for schools, different admission arrangements.


                       2013   2014
LA-maintained school   11.4   11.5
Sponsored academy       5.4    5.3
Converter academy      16.4   15.7
Free school*           11.3   16.4
Sixth form college     10.4   10.0
Other FE college        5.8    5.7

Selective school       32.4   32.3
Comprehensive school   10.7   10.5
Modern school           2.0    3.2

Table 3: Percentage of students achieving AAB+ with at least two in facilitating subjects, by institution type and school admissions basis, 2013 and 2014

The sharp change for free schools will be affected by the inclusion of UTCs and studio schools in that line in 2013 and the addition of city technology colleges and 16-19 free schools in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.


Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

Chart 4: A level high attainment measures by gender, 2014

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As they decline, the lines for boys and girls are steadily diverging since girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold with the final pair.

Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

Chart 7 compares performance by admissions policy in the schools sector on the four measures. Selective schools enjoy a big advantage on all four: more than one in four selective school students achieve at least three A*/A grades, and almost one in three achieve AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the rates achieved in modern schools and selective schools manage roughly three times the success rates in comprehensive schools. 

Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, by Colchester Royal Grammar School. This is almost 100 ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013. 
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School. This is very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, though this is above the 2013 low of 97.7 at Appleton Academy.
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. 
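The last two measures are mechanically linked: in the performance tables each A level grade carries a fixed tariff (E = 150 points, rising in steps of 30 to A* = 300), which is why an APS per entry of 211.2 sits almost exactly on an average grade of C (210 points). A rough converter, assuming that tariff:

```python
# Performance-table A level tariff: grades are 30 points apart.
TARIFF = {"A*": 300, "A": 270, "B": 240, "C": 210, "D": 180, "E": 150}

def nearest_average_grade(aps_per_entry: float) -> str:
    """Map an APS per A level entry to the nearest whole grade."""
    return min(TARIFF, key=lambda g: abs(TARIFF[g] - aps_per_entry))

print(nearest_average_grade(211.2))  # C (the state-funded average)
print(nearest_average_grade(271.1))  # A (The Henrietta Barnett School)
```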

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those set in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than both 2012 and 2013 in converter academies and sixth form colleges.

Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.

Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014

Disadvantaged high attainers 

There is nothing in either set of Performance Tables, or in the supporting SFRs, to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty over whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected given the impact of the changes discussed above but, if the 2013 methodology is applied, so stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.
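To be concrete about the calculation: the gap on each headline measure is simply the non-FSM success rate minus the FSM success rate, computed twice for 2014 – once under each counting methodology. A minimal sketch with invented rates (the real figures are in SFR06/2015):

```python
# Illustrative sketch of the gap comparison. All rates below are invented
# to show the calculation; the actual figures appear in SFR06/2015.
def gap(other_rate: float, fsm_rate: float) -> float:
    """FSM gap on a measure: non-FSM success rate minus FSM success rate."""
    return round(other_rate - fsm_rate, 1)

# (non-FSM %, FSM %) for one hypothetical headline measure
results = {
    "2013": (65.0, 38.5),
    "2014 (2014 methodology)": (61.0, 33.5),
    "2014 (2013 methodology)": (64.0, 35.5),
}

for label, (other, fsm) in results.items():
    print(f"{label}: gap = {gap(other, fsm)} percentage points")
```

If the gap is wider under the 2013 methodology than under the 2014 one – as in this invented example – the widening cannot readily be blamed on the reforms themselves.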

.

HA sec10

Chart 10: FSM gaps for headline GCSE measures, 2013-2014

.

Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and that by slightly more or less than 1%, depending on which 2014 methodology is selected.

The deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed by extensive pupil premium funding.

.

HA sec11

Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014.

.

Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012, only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of the gaps to increase, so reinforcing the point in bold above.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.

.

HA sec12

Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014

.

Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it raises the question of why more urgent action was/is discounted.
  • Pupil premium is insufficiently targeted at the students/schools that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.

.

Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. This issue is particularly acute in sponsored academies, where one high attainer in every four or five is undershooting their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.

.

GP

February 2015

A Primary Assessment Progress Report

.

This post tracks progress towards implementation of the primary assessment and accountability reforms introduced by England’s Coalition Government.

It reviews developments since the Government’s consultation response was published, as well as the further action required to ensure full and timely implementation.

It considers the possibility of delay as a consequence of the May 2015 General Election and the potential impact of a new government with a different political complexion.

An introductory section outlines the timeline for reform. This is followed by seven thematic sections, each dealing with one strand of the reform programme.

There are page jumps to each of these sections, should readers wish to refer to them directly.

Each section summarises briefly the changes and commitments set out in the consultation response (and in the original consultation document where these appear not to have been superseded).

Each then reviews in more detail the progress made to date, itemising the tasks that remain outstanding.

I have included deadlines for all outstanding tasks. Where these are unknown I have made a ‘best guess’ (indicated by a question mark after the date).

I have done my best to steer a consistent path through the variety of material associated with these reforms, pointing out apparent conflicts between sources wherever these exist.

A final section considers progress across the reform programme as a whole – and how much remains to be done.

It discusses the likely impact of Election Purdah and the prospects for changes in direction consequent upon the outcome of the Election.

I have devoted previous posts to ‘Analysis of the Primary Assessment and Accountability Consultation Document’ (July 2013) and to the response in ‘Unpacking the Primary Assessment and Accountability Reforms’ (April 2014) so there is inevitably some repetition here, for which I apologise.

This is a long and complex post, even by my standards. I have tried to construct the big picture from a variety of different sources, to itemise all the jigsaw pieces already in place and all those that are still missing.

If you spot any errors or omissions, do let me know and I will do my best to correct them.

.

[Postscript: Please note that I have added several further postscripts to this document since the original date of publication. If you are revisiting, do pause at the new emboldened paragraphs below.]

Timeline for Reform

The consultation document ‘Primary assessment and accountability under the new national curriculum’ was published on 7 July 2013.

It contained a commitment to publish a response in ‘autumn 2013’, but ‘Reforming assessment and accountability for primary schools’ did not appear until March 2014.

The implementation timetable has to be inferred from a variety of sources but seems to be as shown in the table below. (I have set aside interim milestones until the thematic sections below.)

Month/year: Action
Sept 2014: Schools no longer expected to use levels for non-statutory assessment.
May 2015: End of KS1 and KS2 national curriculum tests and statutory teacher assessment reported through levels for the final time.
Summer term 2015: Final 2016 KS1 and KS2 test frameworks, sample materials and mark schemes published; guidance published on reporting of test results.
Sept 2015: Schools can use approved reception baseline assessments (or a KS1 baseline).
Sept/Autumn term 2015: New performance descriptors for statutory teacher assessment published.
Dec 2015: Primary Performance Tables use levels for the final time.
May 2016: New KS1 and KS2 tests introduced, reported through new attainment and progress measures.
June 2016: Statutory teacher assessment reported through new performance descriptors.
Sept 2016: Reception baseline assessment becomes the only baseline option for all-through primaries; schools must publish new headline measures on their websites; new floor standards come into effect (with the progress element still derived from the KS1 baseline).
Dec 2016: New attainment and progress measures published in Primary Performance Tables.

The General Election takes place on 7 May 2015, but pre-Election Purdah will commence on 30 March, almost exactly a year on from publication of the consultation response.

At the time of writing, some 40 weeks have elapsed since the response was published – and there are some 10 weeks before Purdah descends.

Assuming that the next Government is formed within a week of the Election (which might be optimistic), there is a second working period of roughly 10 weeks between that and the end of the AY 2014/15 summer term.

The convention is that all significant assessment and accountability reforms are notified to schools a full academic year before implementation, so allowing them sufficient time to plan for implementation.

A full year’s lead time is no longer sacrosanct (and has already been set aside in some instances below) but any shorter notification period may have significant implications for teacher workload – something that the Government is committed to tackling.

.

[Postscript: On 6 February the Government published its response to the Workload Challenge, which contained a commitment to introduce, from ‘Spring 2015’, a:

‘DfE Protocol setting out minimum lead-in times for significant curriculum, qualifications and accountability changes…’

Elsewhere the text says that the minimum lead time will be a year, thus reinforcing the convention described above.

The term ‘significant’ allows some wriggle room, but one might reasonably expect it to be applied to some of the outstanding actions below.

The Protocol was published on 23 March. The first numbered paragraph implicitly defines a significant change as one having ‘a significant workload impact on schools’, though what constitutes significance (and who determines it) is left unanswered.

There is provision for override ‘in cases where change is urgently required’ but criteria for introducing an override are not supplied.]

.

[Postscript: We now know that a minimum lead time will not be applied to the introduction of new performance descriptors for statutory teacher assessment (see below). The original timescale did not allow a full year’s notice, and it has not been adjusted in the light of consultation.]

.

Announcements made during the long summer holiday are much disliked by schools, so the end of summer term 2015 becomes the de facto target for any reforms requiring implementation from September 2016.

One might therefore conclude that:

  • We are about two-thirds of the way through the main implementation period.
  • There is a period of some 100 working days in which to complete the reforms expected to be notified to schools before the end of the AY2014/15 summer term. This is divided into two windows of some 50 working days on either side of Purdah (a rough check of this arithmetic is sketched below).
  • There is some scope to extend certain deadlines into the summer break and autumn 2015, but the costs of doing so – including loss of professional goodwill – might outweigh the benefits.
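For anyone wanting to check the working-day arithmetic, the sketch below uses numpy’s business-day counter. All four dates are my approximations, chosen to match the estimates in the text: mid-January 2015 for the time of writing, 30 March for the start of Purdah, a government in place by 14 May, and an end of summer term around 22 July.

```python
import numpy as np

# Approximate dates (my assumptions, chosen to match the text's estimates)
writing = "2015-01-19"    # time of writing
purdah = "2015-03-30"     # pre-Election Purdah begins
new_govt = "2015-05-14"   # a week after the 7 May General Election
term_end = "2015-07-22"   # end of the AY2014/15 summer term

# busday_count counts weekdays from the first date (inclusive) to the
# second (exclusive); public holidays are ignored in this rough check
before_purdah = np.busday_count(writing, purdah)
after_election = np.busday_count(new_govt, term_end)

print(before_purdah, after_election)  # 50 and 49: two windows of ~50 days
```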

Purdah will act as a brake on progress across the piece. It will delay announcements that might otherwise have been made in April and early May, such as those related to new tests scheduled for May 2016.

The implications of Purdah are discussed further in the final section of this post.

.

Reception Baseline Assessment

Consultation response

A new Reception Baseline will be introduced from September 2015. This will be undertaken by children within their first few weeks of school (so not necessarily during the first half of the autumn term).

Teachers will be able to select from a range of assessments ‘but most are likely to be administered by the reception teaching staff’.  Assessments will be ‘short’ and ‘sit within teachers’ broader assessments of children’s development’.

They will be:

‘…strong predictors of key stage 1 and key stage 2 attainment whilst reflecting the age and abilities of children in reception’

Schools that use an approved baseline assessment ‘in September 2015’ (and presumably later during the 2015/16 academic year) will have their progress measured in 2022 against that or a KS1 baseline, whichever gives the best result.

However, only the reception baseline will be available from September 2016 and, from this point, the Early Years Foundation Stage (EYFS) profile will no longer be compulsory.

The reception baseline will not be compulsory either, since:

‘Schools that choose not to use an approved baseline assessment from 2016 will be judged on an attainment floor standard alone.’

But, since the attainment floor standard is so demanding (see below), this apparent choice may prove illusory for most schools.

Further work includes:

  • Engaging experts to develop criteria for the baselines.
  • A study in autumn 2014 of schools that already use such assessments, to inform decisions on moderation and the reporting of results to parents.
  • Communicating those decisions about moderation and reporting results – to Ofsted as well as to parents – ensuring they are ‘contextualised by teachers’ broader assessments’.
  • Publishing a list of assessments that meet the prescribed criteria.

.

Developments to date

Baseline criteria were published by the STA in May 2014.

The purpose of the assessments is described thus:

‘…to support the accountability framework and help assess school effectiveness by providing a score for each child at the start of reception which reflects their attainment against a pre-determined content domain and which will be used as the basis for an accountability measure of the relative progress of a cohort of children through primary school.’

This emphasis on the relevance of the baseline to floor targets is in marked contrast with the emphasis on reporting progress to parents in the original consultation document.

Towards the end of the document there is a request for ‘supporting information in addition to the criteria’:

‘What guidance will suppliers provide to schools in order to enable them to interpret the results and report them to parents in a contextualised way, for example alongside teacher observation?’

This seems to refer to the immediate reporting of baseline outcomes rather than of subsequent progress measures. Suitability for this purpose does not appear within the criteria themselves.

Interestingly, the criteria specify that the content domain:

‘…must demonstrate a clear progression towards the key stage 1 national curriculum in English and mathematics’,

but there is no reference to progression to KS2, and nothing about assessments being ‘strong predictors’ of future attainment, whether at KS1 or KS2.

Have expectations been lowered, perhaps because of concerns about the predictive validity of the assessments currently available?

A research study was commissioned in June 2014 (so earlier than anticipated) with broader parameters than originally envisaged.

The Government awarded a 9-month contract to NFER worth £49.7K, to undertake surveys of teachers’, school leaders’ and parents’ views on baseline assessment.

The documentation reveals that CEM is also involved in a parallel quantitative study which will ‘simulate an accountability environment’ for a group of schools, to judge changes in their behaviour.

Both of these organisations are also in the running for concession contracts to deliver the assessments from September 2015 (see below).

The aims of the project are to identify:

  • The impact of the introduction of baseline assessments in an accountability context.
  • Challenges to the smooth introduction of baseline assessments as a means to constructing an accountability measure.
  • Potential needs for monitoring and moderation approaches.
  • What reporting mechanisms and formats stakeholders find most useful.

Objectives are set out for an accountability strand and a reporting strand respectively. The former refer explicitly to identification of ‘gaming’ and the exploration of ‘perverse incentives’.

It is not entirely clear from the latter whether researchers are focused solely on initial contextualised reporting of reception baseline outcomes, or are also exploring the subsequent reporting of progress.

The full objectives are reproduced below.

.

Reception baseline capture

.

The final ‘publishable’ report is to be delivered by March 2015. It will be touch and go whether this can be released before Purdah descends. Confirmation of policy decisions based on the research will likely be delayed until after the Election.

.

The process has begun to identify and publish a list of assessments that meet the criteria.

A tender appeared on Contracts Finder in September 2014 and has been updated several times subsequently, the most recent version appearing in early December.

The purpose is to award several concession contracts, giving holders the right to compete with each other to deliver baseline assessments.

Contracts were scheduled to be awarded on 26 January 2015, but there was no announcement. Each will last 19 months (to August 2016), with an option to extend for a further year. The total value of the contracts, including extensions, is calculated at £4.2m.

There is no limit to the number of concessions to be awarded, but providers must meet specified (and complex) school recruitment and delivery targets which essentially translate into a 10% sample of all eligible schools.

Under-recruiting providers can be included if fewer than four meet the 10% target, as long as they have recruited at least 1,000 eligible schools.

Moreover:

‘The minimum volume requirement may be waived if the number of schools choosing to administer the reception baseline is fewer than 8,887 [50% of the total number of schools with a reception class].’

Hence the number of suppliers in the market is likely to be limited to 10 or so: there will be some choice, but not too much.
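As I read the tender documentation, the qualification rule can be expressed compactly. In the sketch below the constants come from the documentation (8,887 being 50% of the 17,774 schools with a reception class, so the 10% target is roughly 1,777 schools); the control flow is my interpretation, not official guidance.

```python
# Sketch of the concession qualification rule as I read the tender
# documentation; the logic is my interpretation, not official guidance.
ELIGIBLE_SCHOOLS = 17_774        # implied by 8,887 being 50% of the total
TARGET = ELIGIBLE_SCHOOLS // 10  # the 10% recruitment target (~1,777)
FALLBACK_MINIMUM = 1_000         # floor for under-recruiting providers
WAIVER_THRESHOLD = 8_887         # waiver if national take-up is below 50%

def retained_suppliers(recruited: list[int]) -> list[int]:
    """Return the recruitment counts of suppliers who keep their concession."""
    if sum(recruited) < WAIVER_THRESHOLD:
        return recruited  # minimum volume requirement waived entirely
    on_target = [r for r in recruited if r >= TARGET]
    if len(on_target) >= 4:
        return on_target  # four or more suppliers met the 10% target
    # otherwise under-recruiters survive if they signed up 1,000+ schools
    return [r for r in recruited if r >= FALLBACK_MINIMUM]

print(retained_suppliers([2_000, 1_800, 1_200, 950, 3_000]))
# -> [2000, 1800, 1200, 3000]: only three met the 10% target, so the
#    1,200-school under-recruiter survives; the 950-school provider does not.
```

Since at most ten providers can hold a 10% share simultaneously, the market is self-limiting in exactly the way described above.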

My online researches unearthed four obvious candidates, together with suggestions that these might constitute the entire field.

.

The initial deadline for recruiting the target number of schools is 30 April 2015, slap-bang in the middle of Purdah. This may prove problematic.

.

[Postscript: The award of six concession contracts was quietly confirmed on Wednesday 4 February, via new guidance on DfE’s website. The two contractors missing from the list above are Early Excellence and Hodder Education.

The guidance confirms that schools must sign up with their preferred supplier. They can do so after the initial deadline of 30 April but, on 3 June, schools will be told if they have chosen a provider that has been suspended for failing to recruit sufficient schools.  They will then need to choose an alternative provider.

It adds that, in AY2015/16, LA-maintained schools, academies and free schools will be reimbursed for the ‘basic cost’ of approved reception baselines. Thereafter, school budgets will include the necessary funding.

In the event, the Government has barely contributed to publicity for the assessment, leaving it to suppliers to make the running. The initial low-key approach (including links to the contractors’ home pages rather than to details of their baseline offers) has been maintained.

The only addition to the guidance has been the inclusion, from 20 March, of the criteria used to evaluate the original bids. This seems unlikely to help schools select their preferred solution since, by definition, all the successful bids must have satisfied these criteria!

Purdah will now prevent any further Government publicity.]

.

It seems likely that the decision to allow a range of baseline assessments – as opposed to a single national measure – will create significant comparability issues.

One of the ‘clarification questions’ posed by potential suppliers is:

‘We can find no reference to providing a comparability score between provider assessments. Therefore, can we assume that each battery of assessments will be independent, stand-alone and with no need to cross reference to other suppliers?’

The answer given is:

‘The assumption is correct at this stage. However, STA will be conducting a comparability study with successful suppliers in September 2015 to determine whether concordance tables can be constructed between assessments.’

This implies that progress measures will need to be calculated separately for users of each baseline assessment – and that these will be comparable only through additional ‘concordance tables’, should these prove feasible.

There are associated administrative and workload issues for schools, particularly those with high mobility rates, which may find themselves needing to engage with several different baseline assessment products.

One answer to a supplier’s question reveals that:

‘As currently, children will be included in performance measures for the school in which they take their final assessment (i.e. key stage 2 tests) regardless of which school they were at for the input measure (i.e. reception baseline on key stage 1). We are currently reviewing how long a child needs to have attended a school in order for their progress outcome to be included in the measure.’

The issue of comparability also raises questions about the aggregation of progress measures for floor target purposes. Will targets based on several different baseline assessments be comparable with those based on only one? Will schools with high mobility rates be disadvantaged?

Schools will pay for the assessments. The supporting documentation says that:

‘The amount of funding that schools will be provided with is still to be determined. This will not be determined until after bids have been submitted to avoid accusations of price fixing.’

One of the answers to a clarification question says:

‘The funding will be available to schools from October 2015 to cover the reception baseline for the academic year 2015/16.’

Another says this funding is unlikely to be ringfenced.

There is some confusion over the payment mechanism. One answer says:

‘…the mechanism for this is still to be determined. In the longer term, money will be provided to schools through the Dedicated Schools Grant (DSG) to purchase the reception baseline. However, the Department is still considering options for the first year and may pay suppliers directly depending on the amount of data provided.’

But yet another is confident that:

‘Suppliers will be paid directly by schools. The Department will reimburse schools separately.’

The documentation also reveals that there has as yet been no decision on how to measure progress between the baseline and the end of KS2:

‘The Department is still considering how to measure this and is keen for suppliers to provide their thoughts.’

The ‘Statement of requirements’ once again foregrounds the use of the baseline for floor targets rather than reporting individual learners’ progress.

‘On 27 March 2014, the Department for Education (DfE) announced plans to introduce a new floor standard from September 2016. This will be based on the progress made by pupils from reception to the end of primary school.  The DfE will use a new Reception Baseline Assessment to capture the starting point from which the progress that schools make with their pupils will be measured.  The content of the Reception Baseline will reflect the knowledge and understanding of children at the start of reception, and will be clearly linked to the learning and development requirements of the Early Years Foundation Stage and key stage 1 national curriculum in English and mathematics.  The Reception Baseline will be administered within the first half term of a pupil’s entry to a reception class.’

In relation to reporting to parents, one of the answers to suppliers’ questions states:

‘Some parents will be aware of the reception baseline from the national media coverage of the policy announcement. We anticipate that awareness of the reception baseline will develop over time. As with other assessments carried out by a school, we would expect schools to share information with parents if asked, though there will be no requirement to report the outcome of the reception baseline to parents.’

So it appears that, regardless of the outcomes of the research above, initial short term reporting of reception baseline outcomes will be optional.

.

[Postscript: This position is still more vigorously stated in a letter dated November 2014 from Ministers to a primary group formed by two maths associations. It says (my emphasis):

‘Let me be clear that we do not intend the baseline assessment to be used to monitor the progress of individual children. You rightly point out that any assessment that was designed to be reliable at individual child level would need to take into account the different ages at which children start reception and be sufficiently detailed to account for the variation in performance one expects from young children day-to-day. Rather, the baseline assessment is about capturing the starting point for the cohort which can then be used to assess the progress of that cohort at the end of primary school.’

This distinction has not been made sufficiently explicit in material published elsewhere.]

.

The overall picture is of a process in which procurement is running in parallel with research and development work intended to help resolve several significant and outstanding issues. This is a consequence of the September 2015 deadline for introduction, which seems increasingly problematic.

Particularly so given that many professionals are yet to be convinced of the case for reception baseline assessment, expressing reservations on several fundamental grounds, extending well beyond the issues highlighted above.

A January 2015 Report from the Centre Forum – Progress matters in Primary too – defends the plan against its detractors, citing six key points of concern. Some of the counter-arguments summarised below are rather more convincing than others:

  • Validity: The contention that reception level assessments are accurate predictors of attainment at the end of KS2 is justified by reference to CEM’s PIPS assessment, which was judged in 2001 to give a correlation of 0.7. But of course KS2 tests were very different in those days.
  • Reliability: The notion that attainment can be reliably determined in reception is again justified with reference to PIPS data from 2001 (showing a 0.98 correlation on retesting). The authors argue that the potentially negative effects of test conditions on young children and the risks of bias should be ‘mitigated’ (but not eliminated) through the development and selection process.
  • Contextualisation: The risk of over-simplification through reporting a single numerical score, independent of factors such as age, needs to be set against the arguments in favour of a relatively simple and transparent methodology. Schools are free to add such context when communicating with parents.
  • Labelling: The argument that baseline outcomes will tend to undermine universally high expectations is countered by the view that assessment may actually challenge labelling attributable to other causes, and can in any case be managed in reporting to parents by providing additional contextual information.
  • Pupil mobility: Concern that the assessment will be unfair on schools with high levels of mobility is met by reference to planned guidance on ‘how long a pupil needs to have attended a school in order to be included in the progress measure’. However, the broader problems associated with a choice of assessments are acknowledged.
  • Gaming: The risk that schools will artificially depress baseline outcomes will be managed through effective moderation and monitoring.

The overall conclusion is that:

‘…the legitimate concerns raised by stakeholders around the reliability and fairness of a baseline assessment do not present fundamental impediments to implementing the progress measure. Overall, a well-designed assessment and appropriate moderation could address these concerns to the extent that a baseline assessment could provide a reasonable basis for constructing a progress measure.

That said, the Department for Education and baseline assessment providers need to address, and, where indicated, mitigate the concerns. However, in principle, there is nothing to prevent a well-designed baseline test being used to create a progress-based accountability measure.’

The report adds:

‘However, this argument still needs to be won and teachers’ concerns assuaged….

.. Since the majority of schools will be reliant on the progress measure under the new system, they need to be better informed about the validity, reliability and purpose of the baseline assessment. To win the support of school leaders and teachers, the Department for Education must release clear, defensible evidence that the baseline assessment is indeed valid, fair and reliable.’

.

[Postscript: On 25 March the STA tendered for a supplier to ‘determine appropriate models for assuring the national data from the reception baseline’. The notice continues:

‘Once models have been determined, STA will agree up to three approaches to be implemented by the supplier in small scale pilots during September/October 2015. The supplier will also be responsible for evaluating the approaches using evidence from the pilots with the aim of recommending an approach to be implemented from September 2016.’

The need for quality assurance is compounded by the fact that there are six different assessment models. The documentation makes clear that monitoring, moderation and other quality assurance methods will be considered.

The contract runs from 1 July 2015 to 31 January 2016 with the possibility of extension for a further 12 months. It will be let by 19 June.]

 .

Outstanding tasks

  • Publish list of contracts for approved baseline assessments (26 January 2015) COMPLETED
  • Explain funding arrangements for baseline assessments and how FY2015-16 funding will be distributed (January 2015?) COMPLETED
  • Publish research on baseline assessment (March/April 2015) 
  • Confirm monitoring and moderation arrangements (March/April 2015?) 
  • Deadline for contractors recruiting schools for initial baseline assessments (30 April 2015) 
  • Publish guidance on the reporting of baseline assessment results (May 2015?) 
  • Award quality assurance tender (by 19 June 2015)
  • Undertake comparability study with successful suppliers to determine whether concordance tables can be constructed (Autumn 2015) 
  • Determine funding required for AY2015/16 assessment and distribute to schools (or suppliers?) (October 2015?)
  • Pilot quality assurance models (October 2015)

KS1 and KS2 tests

.

Consultation response

The new tests will comprise:

  • At KS1 – externally set and internally marked tests of maths and reading and an externally set test of grammar, punctuation and spelling (GPS). It is unclear from the text whether the GPS test will be externally marked.
  • At KS2 – externally set and externally marked tests of maths, reading and GPS, plus a sampling test in science.

Outcomes of both KS1 and KS2 tests (other than the science sampling test) will be expressed as scaled scores. A footnote makes it clear that, in both cases, a score of ‘100 will represent the new expected standard for that stage’.

The consultation document says of the scaled scores:

‘Because it is not possible to create tests of precisely the same difficulty every year, the number of marks needed to meet the secondary readiness standard will fluctuate slightly from one year to another. To ensure that results are comparable over time, we propose to convert raw test marks into a scaled score, where the secondary readiness standard will remain the same from year to year. Scaled scores are used in all international surveys and ensure that test outcomes are comparable over time.’

It adds that the Standards and Testing Agency (STA) will develop the scale.

Otherwise very little detail is provided about next steps. The consultation response is silent on the issue. The original consultation document says only that:

‘The Standards and Testing Agency will develop new national curriculum tests, to reflect the new national curriculum programmes of study.’

Adding, in relation to the science sampling test:

‘We will continue with national sample tests in science, designed to monitor national standards over time. A nationally-representative sample of pupils will sit a range of tests, designed to produce detailed information on the cohort’s performance across the whole science curriculum. The design of the tests will mean that results cannot be used to hold individual schools or pupils accountable.’

.

Developments to date

On 31 March 2014, the STA published draft test frameworks for the seven KS1 and KS2 tests to be introduced from 2016:

  • KS1 GPS: a short written task (20 mins); short answer questions (20 mins) and a spelling task (15 mins)
  • KS1 reading: two reading tests, one with texts and questions together, the other with a separate answer booklet (2 x 20 mins)
  • KS1 maths: an arithmetic test (15 mins) and a test of fluency, problem-solving and reasoning (35 mins)
  • KS2 GPS: a grammar and punctuation test (45 mins) and a spelling task (15 mins)
  • KS2 reading: a single test (60 mins)
  • KS2 maths: an arithmetic test (30 mins) and two tests of fluency, problem-solving and reasoning (2 x 40 mins)
  • KS2 science (sampling): tests in physics, chemistry and biology contexts (3 x 25 mins).

Each test will be designed for the full range of prior attainment and questions will typically be posed in order of difficulty.

Each framework explains that all eligible children at state-funded schools will be required to take the tests, but some learners will be exempt.

For further details of which learners will be exempted, readers are referred to the current Assessment and Reporting Arrangements (ARA) booklets.

According to these, the KS1 tests should be taken by all learners working at level 1 or above and the KS2 tests by all learners working at level 3 and above. Teacher assessment data must be submitted for pupils working below the level of the tests.

But of course levels will no longer exist – and we have no equivalent in the form of scaled scores – so the draft frameworks do not define clearly the lower parameter of the range of prior attainment the tests are intended to accommodate.

It will not be straightforward to design workable tests for such broad spans of prior attainment.

Each framework has a common section on the derivation of scaled scores:

‘The raw score on the test…will be converted into a scaled score. Translating raw scores into scaled scores ensures performance can be reported on a consistent scale for all children. Scaled scores retain the same meaning from one year to the next. Therefore, a particular scaled score reflects the same level of attainment in one year as in the previous year, having been adjusted for any differences in difficulty of the test.

Additionally, each child will receive an overall result indicating whether or not he or she has achieved the required standard on the test. A standard-setting exercise will be conducted on the first live test in 2016 in order to determine the scaled score needed for a child to be considered to have met the standard. This process will be facilitated by the performance descriptor… which defines the performance level required to meet the standard. In subsequent years, the standard will be maintained using appropriate statistical methods to translate raw scores on a new test into scaled scores with an additional judgemental exercise at the expected standard. The scaled score required to achieve the expected level on the test will always remain the same.

The exact scale for the scaled scores will be determined following further analysis of trialling data. This will include a full review of the reporting of confidence intervals for scaled scores.’
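Mechanically, then, each year’s test gets its own conversion table from raw marks to points on a fixed reporting scale, with standard setting (in 2016) and statistical equating (thereafter) determining which raw mark maps to 100. A minimal sketch of the mechanism – the scale and thresholds below are purely illustrative, since the actual scale had not been published at the time of writing:

```python
# Purely illustrative: the real scale and conversion tables were still to
# be determined by the STA. The mechanism is a per-year lookup from raw
# marks to a fixed scale on which 100 always means the expected standard.
EXPECTED_STANDARD = 100

# Hypothetical thresholds for one year's test: raw mark -> scaled score.
# Standard setting fixes the raw mark worth 100; equating shifts these
# thresholds in later years as test difficulty fluctuates.
CONVERSION_2016 = {0: 80, 10: 90, 22: 100, 35: 110, 50: 120}

def scaled_score(raw_mark: int, table: dict[int, int]) -> int:
    """Return the scaled score for the highest threshold <= raw_mark."""
    best = min(table)  # lowest threshold as the default
    for threshold in sorted(table):
        if raw_mark >= threshold:
            best = threshold
    return table[best]

score = scaled_score(28, CONVERSION_2016)
print(score, score >= EXPECTED_STANDARD)  # 100 True: standard met
```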

In July 2014 STA also published sample questions, mark schemes and associated commentaries for each test.

.

Outstanding tasks

I have been unable to trace any details of the timetable for test development and trialling.

As far as I can establish, STA has not published an equivalent to QCDA’s ‘Test development, level setting and maintaining standards’ (March 2010) which describes in some detail the different stages of the test development process.

This old QCA web-page describes a 22-month cycle, from the initial stages of test development to the administration of the tests.

This aligns reasonably well with the 25-month period between publication of the draft test frameworks on 31 March 2014 and the administration of the tests in early May 2016.

Applying the same timetable to the 2016 tests – using publication of the draft frameworks as the starting point – suggests that:

  • The first pre-test should have been completed by November 2014
  • The second pre-test should take place by February 2015 
  • Mark schemes and tests should be finalised by July 2015 (the date arithmetic behind these milestones is sketched below)
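The back-calculation is straightforward date arithmetic from the 31 March 2014 starting point. A sketch – the month offsets are my reading of the old QCA cycle re-anchored to the new timetable, not published STA dates:

```python
from datetime import date
from dateutil.relativedelta import relativedelta

# Month offsets inferred from the old QCA development cycle, re-anchored
# to the 31 March 2014 publication of the draft frameworks. These are my
# assumptions, not published STA dates.
START = date(2014, 3, 31)
MILESTONES = {
    "first pre-test completed": 8,           # -> November 2014
    "second pre-test completed": 11,         # -> February 2015
    "tests and mark schemes finalised": 16,  # -> July 2015
    "tests administered": 26,                # -> May 2016
}

for task, months in MILESTONES.items():
    when = START + relativedelta(months=months)
    print(f"{task}: {when:%B %Y}")
```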

STA commits to publishing the final test frameworks and a full set of sample tests and mark schemes for each of the national curriculum tests at key stages 1 and 2 ‘during the 2015 summer term’.

Given Purdah, these seem most likely to appear towards the end of the summer term rather than a full year ahead of the tests.

In relation to the test frameworks, STA says:

‘We may make small changes as a result of this work; however, we do not expect the main elements of the frameworks to change.’

They will also produce, to the same deadline, guidance on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

So we have three further outstanding tasks:

  • Publishing the final test frameworks (summer term 2015) 
  • Finalising the scale to be used for the tests (summer term 2015) 
  • Publishing guidance explaining the use and reporting of scaled scores (summer term 2015)

.

[Postscript: Since publishing this post, I have found various STA test development contracts on Contracts Finder.

How these square with the timetable above is, as yet, unclear. If there is a possibility that final test frameworks cannot be finalised until autumn 2015, the Workload Challenge Protocol may well bite here too.]

.

Statutory teacher assessment

.

Consultation response

The response confirms statutory teacher assessment of:

  • KS1 maths, reading, writing, speaking and listening and science
  • KS2 maths, reading, writing and science.

There are to be performance descriptors for each statutory teacher assessment:

  • a single descriptor for KS1 science and KS2 science, reading and maths
  • several descriptors for KS1 maths, reading, writing and speaking and listening, and also for KS2 writing.

There is a commitment to improve KS1 moderation, given concerns expressed by Ofsted and the NAHT Commission.

In respect of low attaining pupils the response says:

‘All pupils who are not able to access the relevant end of key stage test will continue to have their attainment assessed by teachers. We will retain P-scales for reporting teachers’ judgements. The content of the P-scales will remain unchanged. Where pupils are working above the P-scales but below the level of the test, we will provide further information to enable teachers to assess attainment at the end of the relevant key stage in the context of the new national curriculum.’

And there is to be further consideration of whether to move to external moderation of P-scale teacher assessment.

So, to summarise, the further work involves:

  • Developing new performance descriptors – to be drafted by an expert group. According to the response, the KS1 descriptors would be introduced in ‘autumn 2014’. No date is given for the KS2 descriptors.
  • Improving moderation of KS1 teacher assessment, working closely with schools and Ofsted.
  • Providing guidance to support teacher assessment of those working above the P-scales but below the level of the tests.
  • Deciding whether to move to external moderation of P-scale teacher assessment.

.

Developments to date

Updated statutory guidance on the P-Scale attainment targets for pupils with SEN was released in July 2014, but neither it nor the existing guidance on when to use the P-Scales relates them to the new scaled scores, or discusses the issue of moderation.

.

In September 2014, a guidance note – ‘National curriculum and assessment from September 2014: Information for schools’ – revised the timeline for the development of performance descriptors:

‘New performance descriptors will be published (in draft) in autumn 2014 which will inform statutory teacher assessment at the end of key stage 1 and 2 in summer 2016. Final versions will be published by September 2015.’

.

A consultation document on performance descriptors: ‘Performance descriptors for use in key stage 1 and 2 statutory teacher assessment for 2015 to 2016’ was published on 23 October 2014.

The descriptors were:

‘… drafted with experts, including teachers, representatives from Local Authorities, curriculum and subject experts. Also Ofsted and Ofqual have observed and supported the drafting process’

A November 2014 FoI response revealed the names of the experts involved and brief biographies were provided in the media.

A further FoI has been submitted requesting details of their remit but, at the time of writing, this has not been answered.

.

[Postscript: The FoI response setting out the remit was published on 5 February.]

.

The consultation document revealed for the first time the complex structure of the performance descriptor framework.

It prescribes four descriptors for KS1 reading, writing and maths but five for KS2 writing.

The singleton descriptors reflect ‘working at the national standard’.

Where four descriptors are required these are termed (from the top down): ‘mastery’, ‘national’, ‘working towards national’ and ‘below national’ standard.

In the case of KS2 writing ‘above national standard’ is sandwiched between ‘mastery’ and ‘national’.

.

Performance descriptor capture 1
Performance descriptor capture 2

The document explains how these different levels cross-reference to the assessment of learners exempted from the tests.

In the case of assessments with only a single descriptor, it becomes clear that a further distinction is needed:

‘In subjects with only one performance descriptor, all pupils not assessed against the P-scales will be marked in the same way – meeting, or not meeting, the ‘national standard’.’

So ‘not meeting the national standard’ should also be included in the table above. The relation between ‘not meeting’ and ‘below’ national standard is not explained.

But still further complexity is added since:

‘There will be some pupils who are not assessed against the P-scales (because they are working above P8 or because they do not have special educational needs), but who have not yet achieved the contents of the ‘below national standard’ performance descriptor (in subjects with several descriptors). In such cases, pupils will be given a code (which will be determined) to ensure that their attainment is still captured.’

This produces a hierarchy as follows, from the bottom up (an ordered-scale sketch follows the list):

  • P Scales
  • For assessments with several descriptors, an attainment code yet to be determined
  • For assessments with a single descriptor, an undeclared ‘not meeting the national standard’ descriptor
  • The single descriptor or four/five descriptors listed above.
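Writing the proposal down as an ordered scale makes its convolutions plain. In the sketch below, the names for the yet-to-be-determined attainment code and the undeclared ‘not meeting’ judgement are mine, since the consultation document leaves both unspecified:

```python
from enum import IntEnum

# Bottom-up sketch of the proposed hierarchy. The names for the attainment
# code and the 'not meeting' judgement are mine; the consultation document
# leaves both unspecified.
class MultiDescriptorScale(IntEnum):
    P_SCALES = 0
    ATTAINMENT_CODE_TBD = 1       # above P8, below 'below national standard'
    BELOW_NATIONAL = 2
    WORKING_TOWARDS_NATIONAL = 3
    NATIONAL = 4
    ABOVE_NATIONAL = 5            # KS2 writing only
    MASTERY = 6

class SingleDescriptorScale(IntEnum):
    P_SCALES = 0
    NOT_MEETING_NATIONAL = 1      # the undeclared extra judgement
    NATIONAL = 2

# There is no value above MASTERY - the ceiling problem discussed below.
print(max(MultiDescriptorScale))  # MultiDescriptorScale.MASTERY
```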

However, the document says:

‘The performance descriptors do not include any aspects of performance from the programme of study for the following key stage. Any pupils considered to have attained the ‘Mastery standard’ are expected to explore the curriculum in greater depth and build on the breadth of their knowledge and skills within that key stage.’

This places an inappropriate brake on the progress of the highest attainers because the assessment ceiling is pitched too low to accommodate them.

This acknowledges that some high attainers will be performing above the level of the highest descriptors but, regardless of whether or not they move on to the programme for the next key stage, there is no mechanism to record their performance.

This raises the further question whether the mastery standard is pitched at the equivalent of level 6, or below it. It will be interesting to see whether this is addressed in the consultation response.

The consultation document says that the draft descriptors will be trialled during summer term 2015 in a representative sample of schools.

These trials and the consultation feedback will together inform the development of the final descriptors, but also:

  • ‘statutory arrangements for teacher assessment using the performance descriptors;
  • final guidance for schools (and those responsible for external moderation arrangements) on how the performance descriptors should be used;
  • an updated national model for the external moderation of teacher assessment; and
  • nationally developed exemplification of the work of pupils for each performance descriptor at the end of each key stage.’

Published comments on the draft descriptors have been almost entirely negative, which might suggest that the response could be delayed. The consultation document said it should appear ‘around 26 February 2015’.

According to the document, the final descriptors will be published either ‘in September 2015’ or ‘in the autumn term 2015’, depending on whether you rely on the section headed ‘Purpose’ or the one called ‘Next Steps’. The latter option would allow them to appear as late as December 2015.

A recent newspaper report suggested that the negative reception had resulted in an ‘amber/red’ assessment of primary assessment reform as a whole. The leaked commentary said that any decision to review the approach would increase the risk that the descriptors could not be finalised ‘by September as planned’.

However, the story concludes:

‘The DfE says: “We do not comment on leaks,” but there are indications from the department that the guidance will be finalised by September. Perhaps ministers chose, in the end, not to “review their approach”, despite the concerns.’

Hence it would appear that delay until after the beginning of AY2015/16 will not be countenanced.

Note that the descriptors are for use in academic year 2015/16, so even publication in September is problematic, since teachers will begin the year not knowing which descriptors to apply.

The consultation document refers only to descriptors for AY2015/16, which might imply that they will be further refined for subsequent years. Essentially therefore, the arrangements proposed here would be an imperfect interim solution.

.

[Postscript: On 26 February 2015 the Consultation Response was published – so on the date committed to in the consultation document.

As expected, it revealed significant opposition to the original proposals:

  • 74% of respondents were concerned about nomenclature
  • 76% considered that the descriptors were not spaced effectively across the range of pupils’ performance
  • 69% of respondents considered them neither clear nor easy to understand

The response acknowledges that the issues raised:

‘….amount to a request for greater simplicity, clarity and consistency to support teachers in applying performance descriptors and to help parents understand their meaning.’

But goes on to allege that: 

‘…there are some stakeholders who valued the levels system and would like performance descriptors to function in a similar way across the key stages, which is not their intention.’

Even so, although the Descriptors are not intended to inform formative assessment, respondents have raised concerns that they could be applied in this manner.

There is also the issue of comparability between formative and summative assessment measures, but this is not addressed.

The response does not entirely acknowledge that opposition to the original proposals is sending it back to the drawing board but:

‘As a result of some of the conflicting responses to the consultation, we will work with relevant experts to determine the most appropriate course of action to address the concerns raised and will inform schools of the agreed approach according to the timetable set out in the consultation document – i.e. by September 2015.’

The new assessment commission (see below) will have an as yet undefined role in this process:

‘In the meantime, and to help with this [ie determining the most appropriate course of action] the Government is establishing a Commission on Assessment Without Levels….’

Unfortunately, this role has not been clarified in the Commission’s Statement of Intended Outputs.

There is no reference to the trials in schools, which may or may not continue. A DfE Memorandum to the Education Select Committee on its 2014-15 Supplementary Estimates reveals that £0.3m has been reallocated to pay for them, but this is no guarantee that they will take place.

Implementation will not be delayed by a year, despite the commitment to allow a full year’s notice for significant reforms announced in the response to the Workload Challenge.

This part of the timetable is now seriously concertina’d and there must be serious doubt whether the timescale is feasible, especially if proper trialling is to be accommodated.]

.

Outstanding tasks 

  • Publish response to performance descriptors consultation document (26 February 2015) COMPLETED
  • Trial (revised?) draft performance descriptors (summer term 2015) 
  • Publish adjusted descriptors, revised in the light of consultation with experts and input from the commission (summer term 2015)
  • Experts and commission on assessment produce response to concerns raised and inform schools of outcomes (September 2015)
  • Confirm statutory arrangements for use of the performance descriptors (September/autumn term 2015) 
  • Publish final performance descriptors for AY2015/16 (September/autumn term 2015) 
  • Publish final guidance on the use of performance descriptors (September/autumn term 2015) 
  • Publish exemplification of each performance descriptor at each key stage (September/autumn term 2015)
  • Publish an updated model for the external moderation of teacher assessment (September/autumn term 2015?) 
  • Confirm plans for the moderation of KS1 teacher assessment and use of the P-scales (September/autumn term 2015?) 
  • Publish guidance on assessment of those working above the P-scales but below the level of the tests (September/autumn term 2015?) 
  • Decide whether performance descriptors require adjustment for AY2016/17 onwards (summer term 2016)

.

Schools’ internal assessment and tracking systems

.

Consultation response

The consultation document outlined some of the Government’s justification for the removal of national curriculum levels. The statement that:

‘Schools will be able to focus their teaching, assessment and reporting not on a set of opaque level descriptions, but on the essential knowledge that all pupils should learn’

may be somewhat called into question by the preceding discussion of performance descriptors.

The consultation document continues:

‘There will be a clear separation between ongoing, formative assessment (wholly owned by schools) and the statutory summative assessment which the government will prescribe to provide robust external accountability and national benchmarking. Ofsted will expect to see evidence of pupils’ progress, with inspections informed by the school’s chosen pupil tracking data.’

A subsequent section adds:

‘We will not prescribe a national system for schools’ ongoing assessment….

…. We expect schools to have a curriculum and assessment framework that meets a set of core principles…

 … Although schools will be free to devise their own curriculum and assessment system, we will provide examples of good practice which schools may wish to follow. We will work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’

The consultation response does not cover this familiar territory again, saying only:

‘Since we launched the consultation, we have had conversations with our expert group on assessment about how to support schools to make best use of the new assessment freedoms. We have launched an Assessment Innovation Fund to enable assessment methods developed by schools and expert organisations to be scaled up into easy-to-use packages for other schools to use.’

Further work is therefore confined to the promulgation of core principles, the application of the Assessment Innovation Fund and possibly further work to ‘signpost schools to a range of potential approaches’.

.

Developments to date

The Assessment Innovation Fund was originally announced in December 2013.

A factsheet released at that time explains that many schools are developing new curriculum and assessment systems and that the Fund is intended to enable schools to share these.

Funding of up to £10K per school is made available to help up to 10 schools to prepare simple, easy-to-use packages that can be made freely available to other schools.

They must commit to:

‘…make their approach available on an open licence basis. This means that anyone who wishes to use the package (and any trade-marked name) must be granted a non-revocable, perpetual, royalty-free licence to do so with the right to sub-licence. The intellectual property rights to the system will remain with the school/group which devised it.’

Successful applicants were to be confirmed ‘in the week commencing 21 April 2014’.

In the event, nine successful applications were announced on 1 May, although one subsequently withdrew, apparently over the licensing terms.

The packages developed with this funding are stored – in a rather user-unfriendly fashion – on this TES Community Blog, along with other material supportive of the decision to dispense with levels.

Much other useful material has been published online which has not been collected into this repository and it is not clear to what extent it will develop beyond its present limits, since the most recent addition was in early November 2014.

A recent survey by Capita Sims (itself a provider of assessment support), conducted between June and September 2014, suggested that:

  • 25% of primary and secondary schools were unprepared for the replacement of levels, and 53% had not yet finalised their plans for replacing them.
  • 28% were planning to keep the existing system of levels, 21% intended to introduce a new system and 28% had not yet made a decision.
  • 50% of those introducing an alternative expected to do so by September 2015, while 23% intended to do so by September 2016.
  • Schools’ biggest concern (cited by 53% of respondents) was measuring progress and setting targets for learners.

Although the survey is four months old and has clear limitations (there were only 126 respondents), it suggests that further support may be necessary, ideally targeted towards the least confident schools.

.

In April 2014 the Government published a set of Assessment Principles, building on earlier material in the primary consultation document. These had been developed by an ‘independent expert panel’.

It is not entirely clear whether the principles apply solely to primary schools and to schools’ own assessment processes (as opposed to statutory assessment).

The introductory statement says:

‘The principles are designed to help all schools as they implement arrangements for assessing pupils’ progress against their school curriculum; Government will not impose a single system for ongoing assessment.

Schools will be expected to demonstrate (with evidence) their assessment of pupils’ progress, to keep parents informed, to enable governors to make judgements about the school’s effectiveness, and to inform Ofsted inspections.’

This might suggest they are not intended to cover statutory assessment and testing but are relevant to secondary schools.

There are nine principles in all, divided into three groups:

.

Principles Capture

.

The last of these seems particularly demanding.

 .

In July 2014, Ofsted published guidance in the form of a ‘Note for inspectors: use of assessment information during inspections in 2014/15’. This says that:

‘In 2014/15, most schools, academies and free schools will have historic performance data expressed in national curriculum levels, except for those pupils in Year 1. Inspectors may find that schools are tracking attainment and progress using a mixture of measures for some, or all, year groups and subjects.

As now, inspectors will use a range of evidence to make judgements, including by looking at test results, pupils’ work and pupils’ own perceptions of their learning. Inspectors will not expect to see a particular assessment system in place and will recognise that schools are still working towards full implementation of their preferred approach.’

It goes on to itemise the ways in which inspectors will check that these systems are effective: rather than judging the systems themselves, they will gather evidence of effective implementation through leadership and management, the accuracy of assessment, effectiveness in securing progress and the quality of reporting to parents.

. 

In September 2014, NCTL published a research report ‘Beyond Levels: alternative assessment approaches developed by teaching schools’.

The report summarises the outcomes of small-scale research conducted in 34 teaching school alliances. It offers six rather prolix recommendations for schools and DfE to consider, which can be summarised as follows:

  • A culture shift is necessary in recognition of the new opportunities provided by the new national curriculum and the removal of levels.
  • Schools need access to conferences and seminars to help develop their assessment expertise.
  • Schools would benefit from access to peer reviewed commercial tracking systems relating to the new national curriculum. Clarification is needed about what data will be collected centrally.
  • Teaching school alliances and schools need financial support to further develop assessment practice, especially practical classroom tools, which should be made freely available online.
  • Financial support is needed for teachers to undertake postgraduate research and courses in this field.
  • It is essential to develop professional knowledge about emerging effective assessment practice.

I can find no government response to these recommendations and so have not addressed them in the list of outstanding tasks below.

.

[Postscript: On 25 February 2015, the Government announced the establishment of a ‘Commission on Assessment Without Levels’:

‘To help schools as they develop effective and valuable assessment schemes, and to help us to identify model approaches we are today announcing the formation of a commission on assessment without levels. This commission will continue the evidence-based approach to assessment which we have put in place, and will support primary and secondary schools with the transition to assessment without levels, identifying and sharing good practice in assessment.’

This appears to suggest belated recognition that the steps outlined above have provided schools with insufficient support for the transition to levels-free internal assessment. It is also a response to the possibility that Labour might revisit the decision to remove them (see below).

The Consultation Response on Performance Descriptors released on 26 February (see above) says that the Commission will help to determine the most appropriate response to concerns raised about the Descriptors, while also suggesting that this task will not be devolved exclusively to them.

It adds that the Commission will:

‘…collate, quality assure, publish and share best practice in assessment with schools across the country…and will help to foster innovation and success in assessment practice more widely.’

The membership of the Commission was announced on 9 March.

.

.

The Commission met on 10 March and 23 March 2015 and will meet four more times – in April, May, June and July.

Its Terms of Reference have been published. The Statement of Intended Outputs mentioned in the consultation response on Performance Descriptors appeared without any publicity on 27 March.

It seemed that the Commission, together with the further consultation of experts, supplied a convenient mechanism for ‘parking’ some difficult issues until the other side of the Election.

However, neither the terms of reference nor the statement of outputs mentions the Performance Descriptors, so the Commission’s role in relation to them remains shrouded in mystery.

.

.

The authors of the Statement of Outputs feel it necessary to mention in passing that it:

‘…supports the decision to remove levels, but appreciates that the reasons for removing levels are not widely understood’.

It sets out a 10-point list of outputs comprising:

  • Another statement of the purposes of assessment and another set of principles to support schools in developing effective assessment systems, presumably different to those published by the previous expert group in April 2014. (It will be interesting to compare the two sets of principles, to establish whether Government policy on what constitutes effective assessment has changed over the last 12 months. It will also be worthwhile monitoring the gap between the principles and the views of Alison Peacock, one of the Commission’s members, who also sat on the expert panel that developed the original principles, some of which seem rather at odds with her own practice and preferences.) Meanwhile, another member – Sam Freedman – has stated:

.

.

  • An explanation of ‘how assessment without levels can better serve the needs of pupils and teachers’.
  • Guidance to ‘help schools create assessment policies which reflect the principles of effective assessment without levels’.
  • Clear information about ‘the legal and regulatory assessment requirements’, intended to clarify what they are now, how they will change and when. (The fact that the Commission concludes that such information is not already available is a searing indictment of the Government’s communications efforts to date.)
  • Clarification with Ofsted of ‘the role that assessment without levels will play in the inspection process’ so schools can demonstrate effectiveness without adding to teacher workload. (So again they must believe that Ofsted has not sufficiently clarified this already.)
  • Dissemination of good practice, obtained through engagement with ‘a wide group of stakeholders including schools, local authorities, teachers and teaching unions’. (This is tacit admission that the strategy described above is not working.)
  • Advice to the Government on how ITT and CPD can support assessment without levels and guidance to schools on the use of CPD for this purpose. (There is no reference to the resource implications of introducing additional training and development.)
  • Advice to the Government on ensuring ‘appropriate provision is made for pupils with SEN in the development of assessment policy’. (Their judgement that this is not yet accounted for is a worrying indictment of Government policy to date. They see this as not simply a lapse of communication but a lacuna in the policy-making process.)
  • ‘Careful consideration’ of commitments to tackling teacher workload – which they expect to alleviate by providing information, advice and support. (There is no hint that the introduction of Performance Descriptors will be delayed in line with the Workload Challenge.)
  • A final report before the end of the summer term, though it may publish some outputs sooner. (It will not be able to do so until the outcome of the Election is decided.)

Although there is some implicit criticism of Government policy and communications to date, the failure to make any reference to the Performance Descriptors is unlikely to instil confidence in the capacity of the Commission to provide the necessary challenge to the original proposals, or support to the profession in identifying a workable alternative.]

.

Outstanding tasks

  • Further dissemination of good practice through the existing mechanisms (ongoing) 
  • Further ‘work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’ (ongoing)
  • Additional work (via the commission) to ‘collate, quality assure, publish and share’ best practice (Report by July 2015 with other outputs possible from May 2015)

Reporting to parents

.

Consultation response

The consultation document envisaged three outcomes for each test:

  • A scaled score
  • The learner’s position in the national cohort, expressed as a decile
  • The rate of progress from a baseline, derived by comparing a learner’s scaled score with that of other learners with the same level of prior attainment.

Deciles did not survive the consultation.

The consultation response confirms that, for each test, parents will receive:

  • Their own child’s scaled score; and
  • The average scaled score for the school, ‘the local area’ (presumably the geographical area covered by the authority in which the school is situated) and the country as a whole.

They must also receive information about progress, but the response only discusses how this might be published on school websites and for the purposes of the floor targets (see sections below), rather than how it should be reported directly to parents.

We have already addressed the available information about the calculation of the scaled scores.

The original consultation document also outlined the broad methodology underpinning the progress measures:

‘In order to report pupils’ progress through the primary curriculum, the scaled score for each pupil at key stage 2 would be compared to the scores of other pupils with the same prior attainment. This will identify whether an individual made more or less progress than pupils with similar prior attainment…

…. Using this approach, a school might report pupils’ national curriculum test results to parents as follows:

In the end of key stage 2 reading test, Sally received a scaled score of 126 (the secondary ready standard is 100), placing her in the top 10% of pupils nationally. The average scaled score for pupils with the same prior attainment was 114, so she has made more progress in reading than pupils with a similar starting-point.’

.

Developments to date

On this web page, first published in April 2014, STA commits to publishing guidance during summer term 2015 on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

In September 2014, a further guidance note ‘National curriculum and assessment from September 2014: Information for schools’ shed a little further light on the calculation of the progress measures:

‘Pupil progress will be determined in relation to the average progress made by pupils with the same baseline (i.e. the same KS1 average point score). For example, if a pupil had an APS of 19 at KS1, we will calculate the average scaled score in the KS2 tests for all pupils with an APS of 19 and see whether the pupil in question achieved a higher or lower scaled score than that average. The exact methodology of how this will be reported is still to be determined.’
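
Pending that final methodology, the underlying calculation is straightforward to sketch. The snippet below is a minimal illustration of the logic described in this guidance and in the Sally example above: group pupils by their prior-attainment baseline (here the KS1 APS), average the KS2 scaled scores within each group, and compare each pupil against their group’s average. All data, names and structures are illustrative assumptions, not the DfE’s actual method.

```python
from collections import defaultdict

# Illustrative pupil records: KS1 average point score and KS2 scaled score.
# Both the data and the record structure are assumptions for demonstration.
pupils = [
    {"name": "Sally", "ks1_aps": 19.0, "ks2_scaled": 126},
    {"name": "Tom",   "ks1_aps": 19.0, "ks2_scaled": 110},
    {"name": "Asha",  "ks1_aps": 19.0, "ks2_scaled": 106},
    {"name": "Ben",   "ks1_aps": 15.0, "ks2_scaled": 98},
    {"name": "Mia",   "ks1_aps": 15.0, "ks2_scaled": 104},
]

# Average KS2 scaled score for each prior-attainment group.
groups = defaultdict(list)
for p in pupils:
    groups[p["ks1_aps"]].append(p["ks2_scaled"])
group_means = {aps: sum(s) / len(s) for aps, s in groups.items()}

# A pupil's progress is their scaled score relative to that group average.
for p in pupils:
    mean = group_means[p["ks1_aps"]]
    verdict = "more" if p["ks2_scaled"] > mean else "less or equal"
    print(f'{p["name"]}: score {p["ks2_scaled"]} v peer average {mean:.1f} '
          f'-> {verdict} progress than pupils with similar prior attainment')
```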

It is hard to get a clear sense of the full range of assessment information that parents will receive.

I have been unable to find any comprehensive description, which would suggest that this is being held back until the methodology for calculating the various measures is finalised.

The various sections above suggest that they will receive details of:

  • Reception baseline assessment outcomes.
  • Attainment in end of KS1 and end of KS2 tests, now expressed as scaled scores (or via teacher assessment, code or P-scales if working below the level of the tests). This will be supplemented by a series of average scaled scores for each test.
  • Progress between the baseline assessment (reception baseline from 2022; KS1 baseline beforehand) and end of KS2 tests, relative to learners with similar prior attainment at the baseline.
  • Attainment in statutory teacher assessments, normally expressed through performance descriptors, but with different arrangements for low attainers.
  • Attainment and progress between reception baseline, KS1 and KS2 tests, provided through schools’ own internal assessment and tracking systems.

We have seen that reporting mechanisms for the first and fourth are not yet finalised.

The fifth is now for schools to determine, taking account of Ofsted’s guidance and, if they wish, the Assessment Principles.

The scales necessary to report the second are not yet published, and these also form the basis of the remaining progress measures.

Parents will be receiving this information in a variety of different formats: scaled scores, average scaled scores, baseline scores, performance descriptors, progress scores and internal tracking measures.

Moreover, the performance descriptor scales will vary according to the assessment and internal tracking will vary from school to school.

This is certainly much more complex than the current unified system of reporting based on levels. Parents will require extensive support to understand what they are receiving.

Outstanding tasks

Previous sections have already referenced expected guidance on reporting baseline assessments, scaled scores and the use of performance descriptors (which presumably includes parental reporting).

One assumes that there will also need to be unified guidance on all aspects of reporting to parents, intended for parental consumption.

So, avoiding duplication of previous sections, the remaining outstanding tasks are to:

  • Finalise the methodology for reporting on pupil progress (summer term 2015) 
  • Provide comprehensive guidance to parents on all aspects of reporting (summer term 2015?)

Publication of outcomes

.

Consultation response

This section covers publication of material for public consumption, within and alongside the Primary School Performance Tables and on schools’ websites.

The initial consultation document has much to say about the first of these, while the consultation response barely mentions the Tables, focusing almost exclusively on school websites.

The original document suggests that the Performance Tables will include a variety of measures, including:

  • The percentage of pupils meeting the secondary readiness standard
  • The average scaled score
  • Where the school’s pupils fit in the national cohort
  • Pupils’ rate of progress
  • How many of the school’s pupils are among the highest-attaining nationally, through a measure showing the percentage of pupils attaining a high scaled score in each subject.
  • Teacher assessment outcomes in English, maths and science
  • Comparisons of each school’s performance with that of schools with similar intake
  • Data about the progress of those with very low prior attainment.

All the headline measures will be published separately for pupils in receipt of the pupil premium.

All measures will be published as three year rolling averages in addition to annual results.

There is also a commitment to publish a wide range of test and teacher assessment data, relating to both attainment and progress, through a Data Portal:

‘The department is currently procuring a new data portal or “data warehouse” to store the school performance data that we hold and provide access to it in the most flexible way. This will allow schools, governors and parents to find and analyse the data about schools in which they are most interested, for example focusing on the progress of low attainers in mathematics in different schools or the attainment of certain pupil groups.’

The consultation response acknowledges as a guiding principle:

‘…a broad range of information should be published to help parents and the wider public know how well schools are performing.’

The accountability system will:

‘…require schools to publish information on their websites so that parents can understand both the progress pupils make and the standards they achieve.’

Data on low attainers’ attainment and progress will not be published since the diversity of this group demands extensive contextual information.

But when it comes to Performance Tables, the consultation response says only:

‘As now, performance tables will present a wide range of information about primary school performance.’

By implication, they will include progress measures since the text adds:

‘In 2022 performance tables, we will judge schools on whichever is better: their progress from the reception baseline to key stage 2; or their progress from key stage 1 to key stage 2.’

However, schools will be required to publish a suite of indicators in standard format on their websites, including:

  • The average progress made by pupils in reading, writing and maths
  • The percentage of pupils achieving the expected standard at the end of KS2 in reading, writing and maths
  • The average score of pupils in their end of KS2 assessments and
  • The ‘percentage of pupils who achieve a high score in all areas’ at the end of KS2.

The precise form of the last of these indicators is not explained. This is not quite the same as the ‘measure showing the percentage of pupils attaining a high scaled score in each subject’ mentioned in the original consultation document.

Does ‘all areas’ mean reading, writing and maths? Must learners achieve a minimum score in each assessment, or a single aggregate score above a certain threshold?

In addition:

‘So that parents can make comparisons between schools, we would like to show each school’s position in the country on these measures and present these results in a manner that is clear for all audiences to understand. We will discuss how best to do so with stakeholders, to ensure that the presentation of the data is clear, fair and statistically robust.’

.

Developments to date

In June 2014, a consultation document was issued ‘Accountability: publishing headline performance measures on school and college websites’. This was accompanied by a press release.

The consultation document explains the intended relationship between the Performance Tables, Data Portal and material published on schools’ websites:

‘Performance tables will continue to provide information about individual schools and colleges and be the central source of school and college performance information.’

Moreover:

‘Future changes to the website, through the school and college performance data portal, will improve accessibility to a wide range of information, including the headline performance measures. It will enable interested parents, students, schools, colleges and researchers to interrogate educational data held by the Department for Education to best meet their requirements.’

But:

‘Nevertheless, the first place many parents and students look for information about a school or college is the institution’s own website’.

Schools are already required to publish such information, but there is inconsistency in where and how it is presented. The document expresses the intention that consistent information should be placed ‘on the front page of every school and college website’.

The content proposed for primary schools’ websites covers the four headline measures set out in the consultation response.

A footnote says:

‘These measures will apply to all-through primary, junior and middle schools. Variants of these measures will apply for infant and first schools.’

But the variants are not set out.

There is no reference to the plan to show ‘each school’s position in the country on these measures’ as mentioned in the consultation response.

The consultation proposes a standard visual presentation which, for primary schools, looks like this:

.

school websites Capture

.

The response to this consultation ‘Publishing performance measures on school and college websites’ appeared in December 2014 (the consultation document had said ‘Autumn 2014’).

The summary of responses says:

‘The majority of respondents to the consultation welcomed the proposals to present headline performance measures in a standard format. There was also strong backing for the proposed visual presentation of data to aid understanding of performance. However, many respondents suggested that without some sense of scale or spread to provide some context to the visual presentation, the data could be misleading. Others said that the language used alongside the charts should be clearer…

…Whilst most respondents favoured a data application tool that would remove the burden of annually updating performance data on school and college websites, they also highlighted the difficulties of developing a data application that would be compatible with a wide range of school and college websites.’

It is clear that some respondents had questioned why school websites should not simply carry a link on their homepage to the School Performance Tables.

In the light of this reaction, further research will be undertaken to:

  • develop a clear and simple visual representation of the data, but with added contextual information.
  • establish how performance tables data can be presented ‘in a way that reaches more parents’.

The timeline suggests that this will result in ‘proposals for redevelopment of performance tables’ by May 2015, so we can no longer assume that the Tables will cover the list of material suggested in the original consultation document.

The timeline indicates that if initial user research concludes that a data application is required, that will be developed and tested between June and October 2015, for roll out between September 2016 and January 2017.

Schools will be informed by autumn 2015 whether they should carry a link to the Tables, download a data application or pursue a third option.

But, nevertheless:

‘All schools and colleges, including academies, free schools and university technical colleges, will be required to publish the new headline performance measures in a consistent, standard format on their websites from 2016.’

So, if an application is not introduced, it seems that schools will still have to publish the measures on their websites: they will not be able to rely solely on a link to the Performance Tables.

Middle schools will only be required to publish the primary measures. No mention is made of infant or first schools.

.

There is no further reference to the data portal, since this project was quietly shelved in September 2014, following unexplained delays in delivery.

.

.

There has been no subsequent explanation of the implications of this decision. Will the material intended for inclusion in the Portal be included in the Performance Tables, or published by another route, or will it no longer be published?

.

Finally, some limited information has emerged about accountability arrangements for infant schools.

This appears on a web page – New accountability arrangements for infant schools from 2016 – published in June 2014.

It explains that the reception baseline will permit the measurement of progress alongside attainment. The progress of infant school pupils will be published for the first time in the 2019 Performance Tables.

This might mean a further addition to the list of information reported to parents set out in the previous section.

There is also a passing reference to moderation:

‘To help increase confidence and consistency in our moderation of infant schools, we will be increasing the proportion of schools where KS1 assessments are moderated externally. From summer 2015, half of all infant schools will have their KS1 assessments externally moderated.’

But no further information is forthcoming about the nature of other headline measures and how they will be reported.

.

Outstanding tasks

  • Complete user research and publish proposals for redevelopment of Performance Tables (May 2015) 
  • Confirm what data will be published in the 2016 Performance Tables (summer term 2015?)
  • Confirm how material originally intended for inclusion in Data Portal will be published (summer term 2015?)
  • Confirm the format and publication route for data showing each school’s position in the country on the headline measures (summer term 2015?) 
  • Confirm headline performance measures for infant and first schools (summer term 2015?) 
  • If necessary, further develop and test a prototype data application for schools’ websites (October 2015) 
  • Inform schools whether a data application will be introduced (autumn 2015) 
  • Amend School Information Regulations to require publication of headline measures in standard format (April 2016) 
  • If proceeding, complete development and testing of a data application (May 2016) 
  • If proceeding, complete roll out of data application (February 2017)

.

Floor standards

.

Consultation response

Minimum expectations of schools will continue to be embodied in floor standards. Schools falling below the floor will attract ‘additional scrutiny through inspection’ and ‘intervention may be required’.

Although the new standard:

‘holds schools to account both on the progress they make and on how well their pupils achieve.’

in practice schools are able to choose between one and the other.

An all-through primary school will be above the floor standards if:

  • Pupils make sufficient progress between the reception baseline and the end of KS2 in all of reading, writing and maths or
  • 85% or more of pupils meet the new expected standard at the end of KS2 (similar to Level 4b under the current system).

A junior or middle school will be above the floor standard if:

  • pupils make sufficient progress at key stage 2 from their starting point at key stage 1; or
  • 85% or more of pupils meet the new expected standard at the end of key stage 2

At this stage arrangements for measuring the progress of pupils in infant or first schools are still to be considered.

Since the reception baseline will be introduced in 2015, progress in all-through primary schools will continue to be measured from the end of KS1 until 2022.

This should mean that, prior to 2022, the standard would be achieved by ensuring that the progress made by pupils in a school – in reading, writing and maths – equals or exceeds the national average progress made by pupils with similar prior attainment at the end of KS1.

Exactly how individual progress will be aggregated to create a whole school measure is not yet clear. The original consultation document holds out the possibility that slightly below average progress will be acceptable:

‘…we expect the value-added score required to be above the floor to be between 98.5 and 99 (a value-added score of 100 represents average progress).’

The consultation response says the amount of progress required will be determined in 2016:

‘The proposed progress measure will be based on value-added in each of reading, writing and mathematics. Each pupil’s scaled scores in each area at key stage 2 will be compared with the scores of pupils who had the same results in their assessments at key stage 1.

For a school to be above the progress floor, pupils will have to make sufficient progress in all of reading, writing and mathematics. For 2016, we will set the precise extent of progress required once key stage 2 tests have been sat for the first time. Once pupils take a reception baseline, progress will continue to be measured using a similar value added methodology.’
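
Taken with the earlier quotation, this implies a school-level calculation along the following lines. Because the aggregation method has not been confirmed, the sketch below simply assumes the school score is 100 plus the mean pupil-level deviation from same-prior-attainment peers, tested against the 98.5 lower bound mooted in the consultation document; in reality the check would apply to each of reading, writing and maths separately, and the scaling will almost certainly be more sophisticated.

```python
# Hypothetical 100-centred school value-added score tested against the
# mooted floor. Every step here is an assumption, not the DfE's method.

FLOOR = 98.5  # lower end of the range quoted in the consultation document

def school_value_added(pupil_scores, peer_means):
    """Mean pupil-level deviation from same-prior-attainment peers,
    re-centred on 100 (100 = average progress nationally)."""
    deviations = [s - m for s, m in zip(pupil_scores, peer_means)]
    return 100 + sum(deviations) / len(deviations)

# Illustrative figures: each pupil's KS2 scaled score alongside the national
# average scaled score for pupils with the same baseline starting point.
scores     = [104, 99, 110, 96]
peer_means = [103, 101, 107, 99]

va = school_value_added(scores, peer_means)
print(f"Value-added score {va:.2f}: "
      f"{'above' if va >= FLOOR else 'below'} the progress floor")
```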

In 2022 schools will be assessed against either the reception or KS1 baseline, whichever gives the best result. From 2023 only the reception baseline will be in play.

The attainment standard will be based on achievement of ‘a scaled score of 100 or more’ in each of the reading and maths tests and achievement, via teacher assessment, of the new expected standard in writing (presumably the middle of the five described above).

The attainment standard is significantly more demanding: the present requirement is for 65% of learners to meet the expected standard, the new threshold is 85%, and the standard itself will be pitched higher, at the equivalent of Level 4B.

The original consultation document says:

‘Our modelling suggests that a progress measure set at this level, combined with the 85% threshold attainment measure, would result in a similar number of schools falling below the floor as at present. Over time we will consider whether schools should make at least average progress as part of floor standards.’

The consultation response does not confirm this judgement.

.

Developments to date

The only significant development since the publication of the consultation response is the detail provided on the June 2014 webpage New accountability arrangements for infant schools from 2016.

In addition to the points in the previous section, this also confirms that:

‘…there will not be a floor standard for infant schools’

But this statement has been called into question, since the table from the performance descriptors consultation, reproduced above, appears to suggest that KS1 teacher assessments in reading, writing and maths do contribute to a floor standard – whether for infant or all-through primary schools is unclear.

.

The aforementioned Centre Forum Report ‘Progress matters in Primary too’ (January 2015) also appears to call into question the results of the modelling reported in the initial consultation document.

It says:

‘…the likelihood is that, based on current performance, progress will be the measure used for the vast majority of schools, at least in the short to medium term. Even those schools which achieve the attainment floor target will only do so by ensuring at least average progress is made by their pupils. As a result, progress will in practice be the dominant accountability metric.’

It undertakes modelling based on 2013 attainment data – ie simulating the effect of the new standards had they been in place in 2013, using selected learning areas within the EYFSP as a proxy for the reception baseline – which suggests that just 10% of schools in 2013 would have met the new attainment floor.

It concludes that:

‘For the vast majority of schools, progress will be their only option for avoiding intervention when the reforms come into effect.’

Unfortunately though, it does not provide an estimate of the proportion of schools likely to achieve the progress floor standard, with either the current KS1 baseline or its proxy for a reception baseline.

Outstanding Tasks

  • Confirm the detailed methodology for deriving both the attainment and progress elements of the floor standards, in relation to both the new reception baseline and the interim KS1 baseline (summer 2015?)
  • Set the amount of progress required to achieve the progress element of the floor standards (summer 2016)
  • (In the consultation document) Consider whether schools should make at least average progress as part of floor standards and ‘move to three year rolling averages for floor standard measures’ (long term)

.

Overall progress, Purdah and General Election outcomes

Progress to date and actions outstanding

The lists of outstanding actions above record some 40 tasks necessary to the successful implementation of the primary assessment and accountability reforms.

If the ‘advance notice’ conventions are observed, roughly half of these require completion by the end of the summer term in July 2015, within the two windows of 50 working days on either side of Purdah.

These conventions have already been set aside in some cases, most obviously in respect of reception baseline assessment and the performance descriptors for statutory teacher assessment.

Unsurprisingly, the commentary above suggests that these two strands of the reform programme are the most complex and potentially the most problematic.

The sheer number of outstanding tasks and the limited time in which to complete them could pose problems.

It is important to remember that there are similar reforms in the secondary and post-16 sectors that need to be managed in parallel.

The leaked amber/red rating was attributed solely to the negative reaction to the draft performance descriptors, but it could also reflect a wider concern that all the necessary steps may not be completed in time to give schools the optimal period for planning and preparation.

Schools may be able to cope with shorter notice in a few instances, where the stakes are relatively low, but if too substantial a proportion of the overall reform programme is delayed into next academic year, they will find the cumulative impact much harder to manage.

In a worst case scenario, implementation of some elements might need to be delayed by a year, although the corollary would be an extended transition period for schools that would be less than ideal. It may also be difficult to disentangle the different strands given the degree of interdependency between them.

Given the proximity of a General Election, it may not be politic to confirm such delays before Purdah intervenes: the path of least resistance is probably to postpone any difficult decisions for consideration by the incoming government.

.

The implications of Purdah

As noted above, if the General Election result is clear-cut, Purdah will last some five-and-a-half weeks and will occur at a critical point in the implementation timetable.

The impact of Purdah should not be under-estimated.

From the point at which Parliament is dissolved on Monday 30 March, the Government must abstain from major policy decisions and announcements.

The Election is typically announced a few days before the dissolution of Parliament. This ‘wash up’ period between announcement and dissolution is typically used to complete essential unfinished business.

The Cabinet Office issues guidance on conduct during Purdah shortly before it begins.

The 2015 guidance has not yet been issued, so the 2010 guidance is the best source of information about what to expect.

.

[Postscript: 2015 Guidance was posted on 30 March 2015 and is substantively the same as the 2010 edition.]

.

Key points include:

  • ‘Decisions on matters of policy on which a new Government might be expected to want the opportunity to take a different view from the present Government should be postponed until after the Election, provided that such postponement would not be detrimental to the national interest or wasteful of public money.’
  • ‘Officials should not… be asked to devise new policies or arguments…’
  • ‘Departmental communications staff may…properly continue to discharge during the Election period their normal function only to the extent of providing factual explanation of current Government policy, statements and decisions.’
  • ‘There would normally be no objection to issuing routine factual publications, for example, health and safety advice but these will have to be decided on a case by case basis taking account of the subject matter and the intended audience.’
  • ‘Regular statistical releases and research reports (e.g. press notices, bulletins, publications or electronic releases) will continue to be issued and published on dates which have been pre-announced. Ad hoc statistical releases or research reports should be released only where a precise release date has been published prior to the Election period. Where a pre-announcement has specified that the information would be released during a specified period (e.g. a week, or longer time period), but did not specify a precise day, releases should not be published within the Election period.’
  • ‘Research: Fieldwork involving interviews with the public or sections of it will be postponed or abandoned although regular, continuous and on-going statistical surveys may continue.’
  • ‘Official websites…the release of new online services and publication of reworked content should not occur until after the General Election… Content may be updated for factual accuracy but no substantial revisions should be made and distributed.’
  • The general principles and conventions set out in this guidance apply to NDPBs and similar public bodies.

Assuming similar provisions in 2015, most if not all of the assessment and accountability work programme would grind to a halt.

To take an example, it is conceivable that those awarded baseline assessment contracts would be able to recruit schools after 30 March, but they would receive little or no help from the DfE during the Purdah period. Given that the recruitment deadline is 30 April, this may be expected to depress recruitment significantly.

.

The impact of different General Election outcomes

Forming a Government in the case of a Hung Parliament may also take some time, further delaying the process.

The six days taken in 2010 may not be a guide to what will happen in 2015.

The Cabinet Manual (2011) says:

‘Where an election does not result in an overall majority for a single party, the incumbent government remains in office unless and until the Prime Minister tenders his or her resignation and the Government’s resignation to the Sovereign. An incumbent government is entitled to wait until the new Parliament has met to see if it can command the confidence of the House of Commons, but is expected to resign if it becomes clear that it is unlikely to be able to command that confidence and there is a clear alternative…

…The nature of the government formed will be dependent on discussions between political parties and any resulting agreement. Where there is no overall majority, there are essentially three broad types of government that could be formed:

  • single-party, minority government, where the party may (although not necessarily) be supported by a series of ad hoc agreements based on common interests;
  • formal inter-party agreement, for example the Liberal–Labour pact from 1977 to 1978; or
  • formal coalition government, which generally consists of ministers from more than one political party, and typically commands a majority in the House of Commons’.

If one or more of the parties forming the next government has a different policy on assessment and accountability, this could result in pressure to amend or withdraw parts of the reform programme.

If a single party is involved, pre-Election contact with civil servants may have clarified its intentions, enabling work to resume as soon as the new government is in place but, if more than one party is involved, it may take longer to agree the preferred way forward.

Under a worst case scenario, planners might need to allow for Purdah and post-Election negotiations to consume eight weeks or longer.

The impact of the Election on the shape and scope of the primary assessment and accountability reforms will also depend on which party or parties enter government.

If the same Coalition partners are returned, one might expect uninterrupted implementation, unless the minority Lib Dems seek to negotiate different arrangements, which seems unlikely.

But if a different party or a differently constituted Coalition forms the Government, one might expect decisions to abandon or delay some aspects of the programme.

If Labour forms the Government, or is the major party in a Coalition, some unravelling will be necessary.

They are broadly committed to the status quo:

‘Yet when it comes to many of the technical day-to-day aspects of school leadership – child protection, curriculum reform, assessment and accountability – we believe that a period of stability could prove beneficial for raising pupil achievement. This may not be an exciting rallying cry, but it is crucial that the incoming government takes account of the classroom realities.’

Hunt has also declared:

‘Do not mistake me: I am a zealot for minimum standards, rigorous assessment and intelligent accountability.

But if we choose to focus upon exam results and league tables to the detriment of everything else, then we are simply not preparing our young people for the demands of the 21st century.’

And, thus far, Labour has made few specific commitments in this territory.

  • They support reception baseline assessment but whether that extends to sustaining a market of providers is unknown. Might they be inclined to replace this with a single national assessment?
  • There is very little about floor targets – a Labour invention – although the Blunkett Review appears to suggest that Directors of School Standards will enjoy some discretion in respect of their enforcement.

Reading between the lines, it seems likely that they would delay some of the strands described above – and potentially simplify others.

.

Conclusion

The primary assessment reform programme is both extensive and highly complex, comprising several strands and many interdependencies.

Progress to date can best be described as halting.

There are still many steps to be taken and difficult issues to resolve, about half of which should be completed by the end of this academic year. Pre-Election Purdah will cut significantly into the time available.

More announcements may be delayed into the summer holidays or the following autumn term, but this reduces the planning and preparation time available to schools and has potentially significant workload implications.

Alternatively, implementation of some elements or strands may be delayed by a year, but this extends the transition period between old and new arrangements. Any such rationalisation seems likely to be delayed until after the Election and decisions will be influenced by its outcome.

.

[Postscript: The commitment in the Government’s Workload Challenge response to a one-year lead time, now encapsulated in the Protocol published on 23 March, has not resulted in any specific commitments to delay ahead of the descent of Purdah.

At the onset of Purdah on 30 March some 18 actions appear to be outstanding and requiring completion by the end of the summer term. This will be a tall order for a new Government, especially one of a different complexion.]

.

If Labour is the dominant party, they may be more inclined to simplify some strands, especially baseline assessment and statutory teacher assessment, while also providing much more intensive support for schools wrestling with the removal of levels.

Given the evidence set out above, ‘amber/red’ seems an appropriate rating for the programme as a whole.

It seems increasingly likely that some significant adjustments will be essential, regardless of the Election outcome.

.

GP

January 2015

2014 Primary and Secondary Transition Matrices: High Attainers’ Performance

.

This is my annual breakdown of what the Transition Matrices tell us about the national performance of high attainers.

Data Overload courtesy of opensourceway

It complements my reviews of High Attainment in the 2014 Primary Performance Tables (December 2014) and of High Attainment in the 2014 Secondary and Post-16 Performance Tables (forthcoming, in February 2015).

The analysis is based on:

  • The 2014 Static national transition matrices for reading, writing and mathematics – Key Stage 1 to Key Stage 2 (October 2014) and
  • The 2014 static Key Stage 2 to 4 national transition matrices (unamended) – English and maths (December 2014).

There is also some reference to SFR41/2014: Provisional GCSE and equivalent results in England, 2013 to 2014.

The post begins with some important explanatory notes, before examining the primary and then the secondary matrices. There is a commentary on each matrix, followed by a summary of the key challenges for each sector.

.

Explanatory notes

The static transition matrices take into account results from maintained mainstream and maintained and non-maintained special schools. 

The tables reproduced below use colour coding:

  • purple = more than expected progress
  • dark green = expected progress
  • light green = less than expected progress and
  • grey = those excluded from the calculation.

I will assume that readers are familiar with expectations of progress under the current system of national curriculum levels.

I have written before about the assumptions underpinning this approach and some of the issues it raises.

(See in particular the sections called:

 ‘How much progress does the accountability regime expect from high attainers?’ and

‘Should we expect more progress from high attainers?’)

I have not reprised that discussion here.

The figures within the tables are percentages – X indicates data that has been suppressed (where the cohort comprises only one or two learners). Because of rounding, lines do not always add up to 100%.
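
For those who wish to reproduce the format, the matrices are straightforward to reconstruct from pupil-level data. The sketch below (using invented figures) shows the essential steps: count KS1-to-KS2 transitions, convert each row of counts to percentages, suppress cells containing only one or two learners and round to whole numbers – which is also why rows do not always sum to 100%.

```python
from collections import Counter

# Rows are KS1 prior-attainment levels, columns KS2 outcome levels,
# cells row percentages. Data and level labels are illustrative only.
KS2_LEVELS = ["2", "3", "4", "5", "6"]  # simplified outcome columns

def transition_matrix(pairs):
    """pairs: iterable of (ks1_level, ks2_level) tuples, one per pupil."""
    counts = Counter(pairs)
    matrix = {}
    for ks1 in sorted({k for k, _ in counts}):
        row = {ks2: counts[(ks1, ks2)] for ks2 in KS2_LEVELS}
        total = sum(row.values())
        matrix[ks1] = {
            # Suppress cells of one or two learners, as in the published
            # matrices, and round to whole percentages.
            ks2: "X" if 0 < n <= 2 else round(100 * n / total)
            for ks2, n in row.items()
        }
    return matrix

# e.g. 200 pupils at KS1 L3: 2 reach L6, 178 reach L5, 20 reach L4
pairs = [("3", "6")] * 2 + [("3", "5")] * 178 + [("3", "4")] * 20
print(transition_matrix(pairs))  # {'3': {'2': 0, '3': 0, '4': 10, '5': 89, '6': 'X'}}
```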

In the case of the primary matrices, the commentary below concentrates on the progress made by learners who achieved level 3 or level 4 at KS1. In the case of the secondary matrices, it focuses on those who achieved sub-levels 5A, 5B or 5C at KS2.

Although the primary matrices include progression from KS1 level 4, the secondary matrices do not include progression from KS2 level 6 since the present level 6 tests were introduced only in 2012. Those completing GCSEs in 2014 will typically have undertaken KS2 assessment five years earlier.
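
For reference, counting levels of progress between KS2 and GCSE relies on the conventional mapping under which GCSE grade C is treated as equivalent to level 7, so that three levels of progress from level 4 means at least a C, and from level 5 at least a B; sub-levels such as 5A, 5B and 5C all count as level 5 for this purpose. The helper below encodes that convention – the function names are mine, and the mapping should be read as the standard working assumption rather than a published formula.

```python
# Conventional grade-to-level equivalences used for levels of progress
# (a working assumption consistent with the matrices analysed below).
GRADE_AS_LEVEL = {"G": 3, "F": 4, "E": 5, "D": 6, "C": 7, "B": 8, "A": 9, "A*": 10}

def levels_of_progress(ks2_level: int, gcse_grade: str) -> int:
    """Whole levels of progress; KS2 sub-levels all map to the whole level."""
    return GRADE_AS_LEVEL[gcse_grade] - ks2_level

def meets_expectation(ks2_level: int, gcse_grade: str) -> bool:
    return levels_of_progress(ks2_level, gcse_grade) >= 3  # expected = 3 LoP

print(levels_of_progress(5, "B"))   # 3 -> expected progress (L5 to grade B)
print(levels_of_progress(5, "A*"))  # 5 -> well beyond expected progress
print(meets_expectation(4, "C"))    # True: L4 to grade C is the minimum
```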

The analysis includes comparison with the matrices for 2012 and 2013 respectively.

.

The impact of policy change on the secondary matrices

This comparison is straightforward for the primary sector (KS1 to KS2) but is problematic when it comes to the secondary matrices (KS2 to KS4).

As SFR41/2014 makes clear, the combined impact of:

  • vocational education reforms (restricting eligible qualifications and significantly reducing the weighting of some of them) and 
  • early entry policy (recording in performance measures only the first result achieved, rather than the outcome of any retakes)

has depressed overall KS4 results.

The impact of these factors on progress is not discussed within the text, although one of the tables gives overall percentages for those making the expected progress under the old and new methodologies respectively.

It does so for two separate groups of institutions, neither of which is perfectly comparable with the transition matrices because of the treatment of special schools:

  • State funded mainstream schools (excluding state-funded special schools and non-maintained special schools) and
  • State-funded schools (excluding non-maintained special schools).

However, the difference is likely to be marginal.

There is certainly very little difference between the two sets of figures for the categories above, though the percentages are very slightly larger for the first.

They show:

  • A variation of 2.3 percentage points in English (72.1% making at least the expected progress under the new methodology compared with 74.4% under the old) and
  • A variation of 2.4 percentage points in maths (66.4% making at least the expected progress compared with 68.8%).

There is no such distinction in the static transition matrices, nor does the SFR provide any information about the impact of these policy changes for different levels of prior attainment.

It seems a reasonable starting hypothesis that the impact will be much reduced at higher levels of prior attainment, because comparatively fewer students will be pursuing vocational qualifications.

One might also expect comparatively fewer high attainers to require English and/or maths retakes, even when the consequences of early entry are factored in, but that is rather more provisional.

It may be that the differential impact of these reforms on progression from different levels of prior attainment will be discussed in the statistical releases to be published alongside the Secondary Performance Tables. In that case I will update this treatment.

For the time being, my best counsel is:

  • To be aware that these policy changes have almost certainly had some impact on the progress of secondary high attainers, but 
  • Not to fall into the trap of assuming that they must explain all – or even a substantial proportion – of any downward trends (or absence of upward trends for that matter).

There will be more to say about this in the light of the analysis below.

Is this data still meaningful?

As we all know, the measurement of progression through national curriculum levels will shortly be replaced by a new system.

There is a temptation to regard the methodology underpinning the transition matrices as outmoded and irrelevant.

For the time being though, the transition matrices remain significant to schools (and to Ofsted) and there is an audience for analysis based on them.

Moreover, it is important that we make our best efforts to track annual changes under the present system, right up to the point of changeover.

We should also be thinking now about how to match progression outcomes under the new model with those available under the current system, so as to secure an uninterrupted perspective of trends over time.

Otherwise our conclusions about the longer-term impact of educational policies to raise standards and close gaps will be sadly compromised.

.

2014 Primary Transition Matrices

.

Reading

.

TM reading KS12 Capture

.

Commentary:

  • It appears that relatively few KS1 learners with L4 reading achieved the minimum expected 2 levels of progress by securing L6 at the end of KS2. It is not possible for these learners to make more than the expected progress. The vast majority (92%) recorded a single level of progress, to KS2 L5. This contrasts with 2013, when 12% of KS1 L4 learners did manage to progress to KS2 L6, while only 88% were at KS2 L5. Caution is necessary since the sample of KS1 L4 readers is so small. (The X suggests the total cohort could be as few as 25 pupils.)
  • The table shows that 1% of learners achieving KS1 L3 reading made 3 levels of progress to KS2 L6, exactly the same proportion as in 2012 and 2013. But we know that L6 reading test entries were up 36% compared with 2013: one might reasonably have expected some increase in this percentage as a consequence. The absence of improvement may be attributable to the collapse in success rates on the 2014 L6 reading test.
  • 90% of learners achieving KS1 L3 made the expected 2 or more levels of progress to KS2 L5 or above, 89% making 2 levels of progress to L5. The comparable figures for those making 2 LoP in 2013 and 2012 were 85% and 89% respectively.
  • In 2014 only 10% of those achieving KS1 L3 made a single level of progress to KS2 L4, compared with 13% in 2013 and 10% in 2012.
  • So, when it comes to L3 prior attainers, the 2013 dip has been overcome, but there has been no improvement beyond the 2012 outcomes. Chart 1 makes this pattern more obvious, illustrating clearly that there has been relatively little improvement across the board.

.

TM chart 1

Chart 1: Percentage of learners with KS1 L3 reading making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is significantly lower than the proportions with KS1 L2A, L2B or L2 overall who do so. This pattern is unchanged from 2012 and 2013.
  • The proportion exceeding 2 LoP is also far higher for every other level of KS1 prior achievement, also unchanged from 2012 and 2013.
  • Whereas the gap between KS1 L2 and L3 making more than 2 LoP was 36 percentage points in 2013, by 2014 it had increased substantially to 43 percentage points (44% versus 1%). This may again be partly attributable to the decline in L6 reading results.

.

Writing

.

TM writing KS12 Capture

Commentary:

  • 55% of learners with L4 in KS1 writing made the expected 2 levels of progress to KS2 L6, while only 32% made a single level of progress to KS2 L5. This throws into sharper relief the comparable results for L4 readers. 
  • On the other hand, the 2013 tables recorded 61% of L4 writers making the expected progress, six percentage points higher than the 2014 success rate, so there has been a decline in success rates in both reading and writing for this small cohort. The reason for this is unknown, but it may simply be a consequence of the small sample.
  • Of those achieving KS1 L3, 12% made 3 LoP to KS2 L6, up from 6% in 2012 and 9% in 2013. The comparison with reading is again marked. A further 2% of learners with KS1 L2A made 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 writing made the expected 2 or more levels of progress, up from 89% in 2013. Some 79% made 2 LoP to L5, compared with 80% in 2013 and 79% in 2012, so there has been relatively little change.
  • However, in 2014 9% made only a single level of progress to KS2 L4. This is an improvement on 2013, when 11% did so, and continues an improving trend from 2012, when 15% fell into this category, although the rate of improvement has slowed somewhat.
  • These positive trends are illustrated in Chart 2 below, which shows reductions in the proportion achieving a single LoP broadly matched by corresponding improvements in the proportion achieving 3 LoP.

TM chart 2 

Chart 2: Percentage of learners with KS1 L3 writing making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is again lower than the proportions with KS1 L2A, L2B or L2 overall doing so. It is even lower than the proportion of those with KS1 L1 achieving this outcome. This is unchanged from 2013.
  • The proportion exceeding 2 LoP is far higher for every other level of KS1 achievement excepting L2C, again unchanged from 2013.
  • The percentage point gap between those with KS1 L2 overall and KS1 L3 making more than 2 LoP was 20 points in 2013 and remains unchanged at 20 points in 2014. Once again there is a marked contrast with reading.

.

Maths

.

TM maths KS12 Capture

.

Commentary:

  • 95% of those achieving L4 maths at KS1 made the expected 2 levels of progress to KS2 L6. These learners are unable to make more than expected progress. Only 5% made a single level of progress to KS2 L5. 
  • There is a marked improvement since 2013, when 89% made the expected progress and 11% fell short. This is significantly better than KS1 L4 progression in writing and hugely better than KS1 L4 progression in reading.
  • 35% of learners with KS1 L3 maths also made 3 levels of progress to KS2 L6. This percentage is up from 26% in 2013 and 14% in 2012, indicating a continuing trend of strong improvement. In addition, 6% of those with L2A and 1% of those at L2B managed 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 made the expected progress (up one percentage point compared with 2013). Of these, 56% made 2 LoP to KS2 L5. However, 9% made only a single level of progress to KS2 L4 (down a single percentage point compared with 2013).
  • Chart 3 illustrates these positive trends. It contrasts with the equivalent chart for writing above, in that the proportion of KS1 L3 learners making a single LoP is falling much more slowly than the proportion making 3 LoP is rising.

.

TM chart 3

Chart 3: Percentage of learners with KS1 L3 maths making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 in maths who achieved the expected progress is identical to the proportion achieving L2 overall that do so, at 91%. However, these rates are lower than for learners with KS1 2B and especially 2A.
  • The proportion exceeding 2 LoP is also identical for those with KS1 L3 and L2 overall (whereas in 2013 there was a seven percentage point gap in favour of those with KS1 L2). The proportion of those with KS1 L2A exceeding 2 LoP remains significantly higher, but the gap has narrowed by six percentage points compared with 2013.

.

Key Challenges: Progress of High Attainers between KS1 and KS2

The overall picture from the primary transition matrices is one of comparatively strong progress in maths, positive progress in writing and a much more mixed picture in reading. But in none of these areas is the story unremittingly positive.
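
Since the discussion here leans heavily on levels-of-progress (LoP) arithmetic, a minimal sketch may help. The function names are my own; the two-level expectation between KS1 and KS2 is as described above.

```python
def levels_of_progress(ks1_level: int, ks2_level: int) -> int:
    """Number of whole national curriculum levels gained between KS1 and KS2."""
    return ks2_level - ks1_level

def made_expected_progress(ks1_level: int, ks2_level: int) -> bool:
    """Expected progress between KS1 and KS2 is two or more whole levels."""
    return levels_of_progress(ks1_level, ks2_level) >= 2

# Worked examples matching the commentary above:
assert levels_of_progress(3, 5) == 2     # KS1 L3 -> KS2 L5: expected progress
assert levels_of_progress(3, 6) == 3     # KS1 L3 -> KS2 L6: 3 LoP
assert levels_of_progress(3, 4) == 1     # KS1 L3 -> KS2 L4: a single LoP
assert not made_expected_progress(4, 5)  # KS1 L4 -> KS2 L5 falls one level short
```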

Priorities should include:

  • Improving progression from KS1 L4 to KS2 L6, so that the profile for writing becomes more similar to the profile for maths and, in particular, so that the profile for reading much more closely resembles the profile for writing. No matter how small the cohort, it cannot be acceptable that 92% of KS1 L4 readers make only a single level of progress.
  • Reducing to a negligible level the proportion of KS1 L3 learners making a single level of progress to KS2 L4. Approximately 1 in 10 learners continue to do so in all three assessments, although there has been some evidence of improvement since 2012, particularly in writing. Other than in maths, the proportion of KS1 L3 learners making a single LoP is significantly higher than the proportion of KS1 L2 learners doing so.
  • Continuing to improve the proportion of KS1 L3 learners making 3 LoP in each of the three assessments, maintaining the strong rate of improvement in maths, increasing the rate of improvement in writing and moving beyond stagnation at 1% in reading. 
  • Eliminating the percentage point gaps between those with KS1 L2A making at least the expected progress and those with KS1 L3 doing so (5 percentage points in maths and 9 percentage points in each of reading and writing). At the very least, those at KS1 L3 should be matching those at KS1 L2B, but there are presently gaps between them of 2 percentage points in maths, 5 percentage points in reading and 6 percentage points in writing.

.

Secondary Transition Matrices

.

English

.

TM English KS24 Capture

.

Commentary:

  • 98% of learners achieving L5A English at KS2 made at least 3 levels of progress to GCSE grade B or above in 2014. The same is true of 93% of those with KS2 L5B and 75% of those with KS2 L5C. All three figures have improved by one percentage point compared with 2013. The comparable figures in 2012 were 98%, 92% and 70% respectively.
  • 88% of learners achieving L5A at KS2 achieved at least four levels of progress from KS2 to KS4, so achieving a GCSE grade of A* or A, as did 67% of those with L5B and 34% of those with 5C. The comparable figures in 2013 were 89%, 66% and 33% respectively, while in 2012 they were 87%, 64% and 29% respectively.
  • 51% of learners with KS2 L5A made 5 levels of progress by achieving an A* grade at GCSE, compared with 25% of those with L5B, 7% of those with L5C and 1% of those with L4A. The L5B and L5C figures were improvements on 2013 outcomes. The 2014 success rate for those with KS2 L5A is down by two percentage points, while that for L5B is up by two points.
  • These cumulative totals suggest relatively little change in 2014 compared with 2013, with the possible exception of these two-percentage-point swings in the proportions of students making 5 LoP. 
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the same as the cumulative totals quoted above). This again shows relatively small changes in 2014, compared with 2013, and no obvious pattern.

.

TM chart 4

Chart 4: Percentage of learners with KS2 L5A, L5B and L5C in English achieving 3, 4 and 5 levels of progress, 2012-2014

.

  • 1% of learners with KS2 L5A made only 2 levels of progress to GCSE grade C, as did 6% of those with L5B and 20% of those with L5C. These percentages are again little changed compared with 2013, following a much more significant improvement between 2012 and 2013.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 87% and 48% respectively – are significantly higher than the corresponding percentages for those with KS2 L5C. These gaps have also changed very little compared with 2013.
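
Before turning to maths, it may help to make the grade arithmetic behind these figures explicit. The sketch below treats GCSE grades as a ladder of level-equivalents (C ≈ 7, B ≈ 8, A ≈ 9, A* ≈ 10), so that, as in the commentary above, three levels of progress from KS2 L5 corresponds to grade B, four to grade A and five to A*. This framing is my own illustration; sub-levels (5A/5B/5C) share the same whole-level count and differ only in how likely each outcome is.

```python
# GCSE grades as level-equivalents, so that KS2 level + LoP implies a grade.
GRADE_FOR_LEVEL_EQUIVALENT = {7: 'C', 8: 'B', 9: 'A', 10: 'A*'}

def gcse_grade(ks2_level: int, levels_of_progress: int) -> str:
    """GCSE grade implied by a KS2 level plus a number of levels of progress."""
    return GRADE_FOR_LEVEL_EQUIVALENT.get(ks2_level + levels_of_progress, 'below C')

assert gcse_grade(5, 3) == 'B'    # expected progress from KS2 L5
assert gcse_grade(5, 4) == 'A'
assert gcse_grade(5, 5) == 'A*'
assert gcse_grade(4, 3) == 'C'    # expected progress from KS2 L4
```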

.

Maths

.

TM Maths KS24 Capture

.

Commentary:

  • 96% of learners with L5A at KS2 achieved the expected progress between KS2 and KS4 in 2014, as did 86% of those with KS2 L5B and 65% of those with KS2 L5C. The comparable percentages in 2013 were 97%, 88% and 70%, while in 2012 they were 96%, 86% and 67%. This means there have been declines compared with 2013 for L5A (one percentage point), L5B (two percentage points) and L5C (five percentage points).
  • 80% of learners with KS2 L5A made 4 or more levels of progress between KS2 and KS4, so achieving a GCSE grade A* or A. The same was true of 54% of those with L5B and 26% of those with L5C. In 2013, these percentages were 85%, 59% and 31% respectively, while in 2012 they were 84%, 57% and 30% respectively. So all the 2014 figures – for L5A, L5B and L5C alike – are five percentage points down compared with 2013.
  • In 2014 48% of learners with KS2 L5A made 5 levels of progress by achieving a GCSE A* grade, compared with 20% of those with L5B, 5% of those with L5C and 1% of those with L4A. All three percentages for those with KS2 L5 are down compared with 2013 – by 3 percentage points in the case of those with L5A, 2 points for those with L5B and 1 point for those with L5C.
  • It is evident that there is rather more volatility in the trends in maths progression and some of the downward swings are more pronounced than in English.
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB, these are not the cumulative totals quoted above). The only discernible pattern is that any improvement is confined to those making 3 LoP.

.

TM chart 5

Chart 5: Percentage of learners with KS2 L5A, L5B and L5C in Maths achieving 3, 4 and 5 levels of progress, 2012-2014
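
Note that Charts 4 and 5 plot exact rather than cumulative proportions. A minimal conversion sketch, using the 2014 maths figures for KS2 L5A quoted above:

```python
# 2014 maths, KS2 L5A: 96% made at least 3 LoP, 80% at least 4, 48% made 5.
at_least = {3: 96, 4: 80, 5: 48}   # cumulative: % making >= k levels of progress

# % making exactly k LoP = (% making >= k) - (% making >= k + 1)
exactly = {k: pct - at_least.get(k + 1, 0) for k, pct in at_least.items()}

print(exactly)   # {3: 16, 4: 32, 5: 48} -- the exact percentages plotted
```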

  • 4% of those with KS2 L5A made only 2 LoP to GCSE grade C, as did 13% of those with L5B and 31% of those with L5C. All three percentages have worsened compared with 2013, by 1, 2 and 4 percentage points respectively.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 85% and 37% respectively – are significantly higher than the corresponding percentages for those with L5C, just as they are in English. And, as is the case with English, the percentage point gaps have changed little compared with 2013.

.

Key Challenges: Progress of High Attainers Between KS2 and KS4

The overall picture for high attainers from the secondary transition matrices is of relatively little change in English and of rather more significant decline in maths, though not by any means across the board.

It may be that the impact of the 2014 policy changes on high attainers has been relatively more pronounced in maths than in English – and perhaps more pronounced in maths than might have been expected.

If this is the case, one suspects that the decision to restrict reported outcomes to first exam entries is the most likely culprit.

On the other hand, it might be that relatively strong improvement in English progression has been cancelled out by these policy changes, though the figures provided in the SFR for expected progress regardless of prior attainment make this less likely.

Leaving causation aside, the most significant challenges for the secondary sector are to:

  • Significantly improve the progression rates for learners with KS2 L5A to A*. It should be a default expectation that they achieve five levels of progress, yet only 48% do so in maths and 51% in English – and these percentages are down 3 and 2 percentage points respectively compared with 2013.
  • Similarly, significantly improve the progression rates for learners with KS2 L5B to grade A. It should be a default expectation that they achieve at least 4 LoP, yet only 67% do so in English and 54% in maths – up one point since 2013 in English but down 5 points in maths.
  • Reduce and ideally eliminate the rump of high attainers who make only two LoP. This is especially high for those with KS2 L5C – 20% in English and, still worse, 31% in maths – but there is also a problem for those with L5B in maths, 13% of whom fall into this category. The proportion making only two LoP from L5C in maths has risen by 4 percentage points since 2013, while there has also been a 2 point rise for those with L5B. (Thankfully the L5C rate in English has improved by 2 points, but there is a long way still to go.)
  • Significantly close the progression performance gaps between learners with KS2 L5C and KS2 L4A, in both English and maths. In English there is currently a 12 percentage point gap for those making expected progress and a 14-point gap for those exceeding it. In maths, these gaps are 20 and 11 percentage points respectively. The problem in maths seems particularly pronounced. These gaps have changed little since 2013.

.

Conclusion

This analysis of high attainers’ progression suggests a very mixed picture, across the primary and secondary sectors and between English and maths. There is some limited scope for congratulation, but too many persistent issues remain.

The commentary has identified four key challenges for each sector, which can be synthesised under two broad headings:

  • Raising expectations beyond the minimum expected progress – and significantly reducing our tolerance of underachievement amongst this cohort. 
  • Ensuring that those at the lower end of the high attaining spectrum sustain their initial momentum, at least matching the rather stronger progress of those with slightly lower prior attainment.

The secondary picture has become confused this year by the impact of policy changes.

We do not know to what extent these explain any downward trends – or depress any upward trends – for those with high prior attainment, though one may tentatively hypothesise that any impact has been rather more significant in maths than in English.

It would be quite improper to assume that the changes in high attainers’ progression rates compared with 2013 are entirely attributable to the impact of these policy adjustments.

It would be more accurate to say that they mask any broader trends in the data, making those more difficult to isolate.

We should not allow this methodological difficulty – or the impending replacement of the present levels-based system – to divert us from continuing efforts to improve the progression of high attainers.

For Ofsted is intensifying its scrutiny of how schools support the most able – and it will expect nothing less.

.

GP

January 2015

High Attainment in the 2014 Primary School Performance Tables

.

This is my annual post reviewing data about high attainment and high attainers at the end of Key Stage 2.

Data Overload courtesy of opensourceway

It draws on the 2014 Primary School Performance Tables, the associated Statistical First Release (SFR) and parallel material for previous years.

‘High attainment’ is taken to mean National Curriculum Level 5 and above.

‘High attainers’ are defined in accordance with the Performance Tables, meaning those with prior attainment above Level 2 in KS1 teacher assessments (average points score of 18 or higher). This measure obviously excludes learners who are particularly strong in one area but correspondingly weak in another.
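
To make the definition concrete, here is a minimal sketch of how the prior-attainment bands fall out of KS1 average points scores (APS). The point values per level are the standard KS1 point scores and the 18-point threshold is as stated above; the 12-point low/middle boundary and the function itself are my own illustrative assumptions.

```python
# Standard KS1 point scores per level; the 12-point low/middle boundary is
# an assumption here, based on the usual Performance Tables definition.
KS1_POINTS = {'W': 3, 'L1': 9, 'L2C': 13, 'L2B': 15, 'L2A': 17, 'L3': 21, 'L4': 27}

def prior_attainment_band(reading: str, writing: str, maths: str) -> str:
    aps = sum(KS1_POINTS[level] for level in (reading, writing, maths)) / 3
    if aps >= 18:
        return 'high'      # above Level 2 on average
    if aps >= 12:
        return 'middle'    # assumed boundary with low attainers
    return 'low'

# A pupil strong in one subject but weaker elsewhere can miss the 'high' band:
print(prior_attainment_band('L2B', 'L2B', 'L3'))   # 'middle' (APS = 17.0)
print(prior_attainment_band('L3', 'L2A', 'L3'))    # 'high'   (APS ~ 19.7)
```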

The proportions of the end-of-KS2 cohort defined as high, middle and low attainers have remained fairly constant since 2012.

High attainers presently constitute the top quartile of the relevant population, but this proportion is not fixed: it will increase as and when KS1 performance improves.

     | High % | Middle % | Low %
2014 |     25 |       58 |    18
2013 |     25 |       57 |    18
2012 |     24 |       57 |    19

Table 1: Proportion of high, middle and low prior attainers in state-funded schools by year since 2012

 

The percentage of high attainers in different schools’ end-of-KS2 cohorts varies very considerably and is unlikely to remain constant from year to year. Schools with small year groups are particularly vulnerable to significant fluctuations.

The 2014 Performance Tables show that Minster School in Southwell, Nottinghamshire, and St Patrick’s Church of England Primary Academy in Solihull each had 88% high attainers.

Over 600 primary schools have 50% or more high attainers within their cohorts. But, at the other extreme, more than 570 have no high attainers at all, while some 1,150 have 5% or fewer.

This serves to illustrate the very unequal distribution of learners with high prior attainment between schools.

The commentary below opens with a summary of the headline findings. The subsequent sections focus in turn on the composite measure (reading, writing and maths combined), then on the outcomes of the reading, GPS (grammar, punctuation and spelling) and maths tests and finally on teacher assessment in writing.

I have tried to ensure that percentages are consistent throughout this analysis, but the effect of rounding means that some figures are slightly different in different SFR tables. I apologise in advance for – and will of course correct – any transcription errors.

.

Headlines

.

Overall Trends

Chart 1 below compares performance at level 5 and above (L5+) and level 4 and above (L4+) in 2013 and 2014. The bars on the left hand side denote L4+, while those corresponding to L5+ are on the right.

HA 1

Chart 1: L4+ and L5+ performance compared, 2013-2014

With the exception of maths, which has remained unchanged, there have been improvements across the board at L4+, of between two and four percentage points.

The same is true at L5+ and – in the case of reading, GPS and writing – the percentage point improvements are relatively larger. This is good news.

Chart 2 compares the gaps between disadvantaged learners (‘ever 6’ FSM plus children in care) and all other learners in state-funded schools on all five measures, for both 2013 and 2014.

.

HA 2

Chart 2: Disadvantaged gaps at L4+ and L5+ for all five measures, 2013 and 2014

.

With the sole exception of the composite measure in 2013, each L4+ gap is smaller than the corresponding gap at L5+, though the difference can be as small as one percentage point (the composite measure) and as large as 11 percentage points (reading).

Whereas the L4+ gap in reading is lower than for any other measure, the L5+ reading gap is now the biggest. This suggests there is a particular problem with L5+ reading.

The distance between L4+ and L5+ gaps has typically widened since 2013, except in the case of maths, where it has narrowed by one percentage point.

While three of the L4+ gaps have closed slightly (composite, reading, GPS) the remainder are unchanged. However, two of the L5+ gaps have increased (composite, writing) and only the maths gap has closed slightly.

This suggests that what limited progress there has been in closing disadvantaged gaps has focused more on L4+ than L5+.

The pupil premium is not bringing about a radical improvement – and its impact is relatively lower at higher attainment levels.

A similar pattern is discernible with FSM gaps, as Chart 3 reveals. The chart excludes the composite measure, which is not supplied in the SFR.

Overall the picture at L4+ is cautiously positive, with small downward trends on three of the four measures, but the picture at L5+ is more mixed since two of the measures are unchanged.

.

HA 3

Chart 3: FSM gaps at L4+ and L5+ compared, 2013 and 2014  

Composite measure

  • Although the proportion of learners achieving this benchmark is slightly higher in converter academies than in LA-maintained schools, the latter have improved faster since 2013. The success rate in sponsored academies is half that in converter academies. Free schools are improving but remain behind LA-maintained schools. 
  • Some 650 schools achieve 50% or higher, but another 470 record 0% (fewer than the 600 which did so in 2013). 
  • 67% of high attainers achieved this benchmark in 2014, up five percentage points on 2013, but one third still fall short, demonstrating that there is extensive underachievement amongst high attainers in the primary sector. This rather undermines HMCI’s observations in his Commentary on the 2014 Annual Report. 
  • Although over 670 schools have a 100% success rate amongst their high attainers, 42 schools have recorded 0% (down from 54 in 2013). Several of these do better by their middle attainers. In 10 primary schools no high attainers achieve L4+ in reading, writing and maths combined.

.

Reading

  • The substantial improvement in L5+ reading performance since 2013 masks an as yet unexplained crash in Level 6 test performance. Only 874 learners in state-funded schools achieved L6 reading, compared with 2,137 in 2013. This is in marked contrast to a substantive increase in L6 test entries, the success rate on L6 teacher assessment and the trend in the other L6 tests. In 2013 around 12,700 schools had no pupils who achieved L6 reading, but this increased to some 13,670 schools in 2014. Even the performance of Chinese pupils (otherwise phenomenally successful on L6 tests) went backwards. 
  • The proportion of Chinese learners achieving L5 in reading has reached 65% (compared with 50% for White learners), having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012. 
  • 43 primary schools had a 100% success rate at Level 5 in the reading test, but 29 more registered 0%. 
  • Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013. 

GPS

  •  The proportion of Chinese learners achieving L5+ in the GPS test is now 75%, a seven percentage point improvement on 2013. Moreover, 15% achieved Level 6, up eight percentage points on 2013. (The comparable Level 5+ percentage for White learners is 50%). There are unmistakeable signs that Chinese ascendancy in maths is being replicated with GPS. 
  • Some 7,210 schools had no learners achieving L6 in the GPS test, compared with 10,200 in 2013. While 18 schools recorded a perfect 100% record at Level 5 and above, 33 had no learners at L5+. 

.

Maths

  • Chinese learners continue to make great strides. The percentage succeeding on the L6 test has climbed a further six percentage points and now stands at 35% (compared with 8% for White pupils). Chinese boys are at 39%. The proportion of Chinese learners achieving level 6 is now comparable to the proportions of other ethnic groups achieving level 5. This lends further credence to the notion that we have our own domestic equivalent of Shanghai’s PISA success – and perhaps to the suggestion that focusing on Shanghai’s classroom practice may bring only limited benefits. 
  • While it is commendable that 3% of FSM and 4% of disadvantaged learners are successful in the L6 maths test, the gaps between them and other learners are increasing as the overall success rate grows. There are now seven percentage point gaps for FSM and disadvantaged alike. 
  • Ten schools managed a L6 success rate of 50% or higher, while some 280 were at 30% or higher. On the other hand, 3,200 schools had no L6 passes (down from 5,100 in 2013). 
  • About 94% of high attainers made the expected progress in maths – a one percentage point improvement on 2013 – and two percentage points more than the proportion of successful middle attainers. But 27 schools posted a success rate of 50% or below.

.

Writing (TA)

  • Chinese pupils do not match their performance on the GPS test, though 6% achieve L6 in writing TA compared with just 2% of White pupils. 
  • Three schools managed a 50% success rate at Level 6 and 56 were at 25% or above. Only one school managed 100% at L5, but some 200 scored 0%. 
  • Some 93% of all pupils make the expected progress in writing between KS1 and KS2. This is true of 95% of high attainers – and 95% of middle attainers too.

 

Composite measure: reading, writing and maths

Table 2 shows the overall proportion of learners achieving L5 or above in all of reading, writing and maths in each year since 2012.

 

            | 2012 | 2013 | 2014
L5+ overall |  20% |  21% |  24%
L5+ boys    |  17% |  18% |  20%
L5+ girls   |  23% |  25% |  27%

Table 2: Proportion of all learners achieving KS2 L5+ in reading, writing and maths, 2012-2014

The overall success rate has increased by three percentage points compared with 2013 and by four percentage points since 2012.

The percentage of learners achieving L4+ has also improved by four percentage points since 2012, so the improvement at L5+ is broadly commensurate.

Over this period, girls’ lead over boys has remained relatively stable at between six and seven percentage points.

The SFR reveals that success on this measure varies significantly between school type.

The percentages for LA-maintained schools (24%) and all academies and free schools (23%) are little different.

However, mainstream converter academies stand at 26%, twice the 13% recorded by sponsored academies. Free schools are at 21%. These percentages have changed significantly compared with 2013.

.

HA 4

Chart 4:  Comparison of proportion of learners achieving L5+ in reading writing and maths in 2013 and 2014

.

Whereas free schools are making rapid progress and sponsored academies are also improving at a significant rate, converter academies are improving more slowly than LA-maintained schools.

The highest percentages on this measure in the Performance Tables are recorded by Fox Primary School in Kensington and Chelsea (86%) and Hampden Gurney CofE Primary School in Westminster (85%).

Altogether, some 650 schools have achieved success rates of 50% or higher, while 23 have managed 75% or higher.

At the other end of the spectrum about 470 schools have no learners at all who achieved this measure, fewer than the 600 recording this outcome in 2013.

Table 3 shows the gap between disadvantaged (ie ‘ever 6’ FSM and children in care) learners and others, as recorded in the Performance Tables.

       | 2012 | 2013 | 2014
Disadv |    9 |   10 |   12
Other  |   24 |   26 |   29
Gap    |   15 |   16 |   17

Table 3: Proportion of disadvantaged learners achieving L5+ in reading, writing and maths, 2012-2014

.

Although the percentage of disadvantaged learners achieving this benchmark has improved somewhat, the percentage of other learners doing so has improved faster, meaning that the gap between disadvantaged and other learners is widening steadily.

This contrasts with the trend at L4+, where the Performance Tables show a gap that has narrowed from 19 percentage points in 2012 (80% versus 61%) to 18 points in 2013 (81% versus 63%) and now to 16 points in 2014 (83% versus 67%).
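
The gap arithmetic here is simple percentage point differencing; a quick check of the figures above:

```python
# (other %, disadvantaged %) achieving each threshold, from Table 3 and the
# Performance Tables figures quoted above.
l5_plus = {2012: (24, 9), 2013: (26, 10), 2014: (29, 12)}
l4_plus = {2012: (80, 61), 2013: (81, 63), 2014: (83, 67)}

for label, series in (('L5+', l5_plus), ('L4+', l4_plus)):
    gaps = {year: other - disadv for year, (other, disadv) in series.items()}
    print(label, gaps)

# L5+ {2012: 15, 2013: 16, 2014: 17}   -- widening
# L4+ {2012: 19, 2013: 18, 2014: 16}   -- narrowing
```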

Chart 5 below illustrates this comparison.

.

HA 5

Chart 5: Comparing disadvantaged/other attainment gaps in KS2 reading, writing and maths combined at L4+ and L5+, 2012-2014.

While the L4+ gap has closed by three percentage points since 2012, the L5+ gap has widened by two percentage points. This suggests that disadvantaged learners amongst the top 25% by prior attainment are not benefiting commensurately from the pupil premium.

There are 97 primary schools where 50% or more disadvantaged learners achieve L5+ across reading, writing and maths (compared with 40 in 2013).

The highest performers record above 80% on this measure with their disadvantaged learners, albeit with cohorts of 6 to 8. Only one school with a more substantial cohort (of 34) manages over 70%. This is Tollgate Primary School in Newham.

The percentage of high attainers who achieved L5+ in 2014 was 67%, up five percentage points from 62% in 2013. (In 2012 the Performance Tables provided a breakdown for English and maths, which is not comparable).

Although this is a significant improvement, it means that one third of high attainers at KS1 still do not achieve this KS2 benchmark, suggesting that there is significant underachievement amongst this top quartile.

Thirteen percent of middle attainers also achieved this outcome, compared with 10% in 2013.

A significant number of schools – over 670 – do manage a 100% success rate amongst their high attainers, but there are also 42 schools where no high attainers achieve the benchmark (there were 54 in 2013). In several of them, more middle attainers than high attainers achieve the benchmark.

There are ten primary schools in which no high attainers achieve L4 in reading, writing and maths. Perhaps one should be thankful for the fact that no middle attainers in these schools achieve the benchmark either!

The KS2 average point score was 34.0 or higher in five schools, equivalent to a level 5A. The highest  APS was 34.7, recorded by Fox Primary School, with a cohort of 42 pupils.

Across all state-funded schools, the average value added measure for high attainers across reading, writing and maths is 99.8, the same as it was in 2013.

The comparable averages for middle attainers and low attainers are 100.0 and 100.2 respectively, showing that high attainers benefit slightly less from their primary education.

The highest value-added recorded for high attainers is 104.7 by Tudor Court Primary School in Thurrock, while the lowest is 93.7 at Sacriston Junior School in Durham (now closed).

Three more schools are below 95.0 and some 250 are at 97.5 or lower.

.

Reading Test

Table 4 shows the percentage of all learners, boys and girls achieving L5+ in reading since 2010. There has been a five percentage point increase (rounded) in the overall result since 2013, which restores performance to the level it had reached in 2010.

A seven percentage point gap in favour of girls remains unchanged from 2013. This is four points less than the comparable gender gap in 2010.

.

            | 2010 | 2011 | 2012 | 2013 | 2014
L5+ overall |   50 |   43 |   48 |   44 |   50
Boys        |   45 |   37 |   43 |   41 |   46
Girls       |   56 |   48 |   53 |   48 |   53

Table 4: Percentage of learners achieving L5+ in reading since 2010

.

As reported in my September 2014 post ‘What Happened to the Level 6 Reading Results?’, L6 performance in reading collapsed in 2014.

The figures have improved slightly since the provisional results were released, but the collapse is still marked.

Table 5 shows the numbers successful since 2012.

The number of successful learners in 2014 is less than half the number successful in 2013 and almost back to the level in 2012 when the test was first introduced.

This despite the fact that the number of entries for the level 6 test – 95,000 – was almost exactly twice the 47,000 recorded in 2012 and significantly higher than the 70,000 entries in 2013.

For comparison, the number of pupils awarded level 6 in reading via teacher assessment was 15,864 in 2013 and 17,593 in 2014.

We still have no explanation for this major decline, which is entirely out of kilter with other L6 test outcomes.

.

      | 2012 % | 2012 No | 2013 % | 2013 No | 2014 % | 2014 No
L6+   |      0 |     900 |      0 |   2,262 |      0 |     935
Boys  |      0 |     200 |      0 |     592 |      0 |     263
Girls |      0 |     700 |      1 |   1,670 |      0 |     672

Table 5: Number and percentage of learners achieving L6 on the KS2 reading test 2012-2014

.

These figures include some pupils attending independent schools, but another table in the SFR reveals that 874 learners in state-funded primary schools achieved L6 (compared with 2,137 in 2013). Of these, all but 49 achieved L3+ in their KS1 reading assessment.

But some 13,700 of those with L3+ reading at the end of KS1 progressed to L4 or lower at the end of KS2.

The SFR does not supply numbers of learners with different characteristics achieving L6 and all percentages are negligible. The only group recording a positive percentage is Chinese learners, at 1%.

In 2013, Chinese learners were at 2% and some other minority ethnic groups recorded 1%, so not even the Chinese have been able to withstand the collapse in the L6 success rate.

According to the SFR, the FSM gap at L5 is 21 percentage points (32% versus 53% for all other pupils). The disadvantaged gap is also 21 percentage points (35% versus 56% for all other pupils).

Chart 6 shows how these percentages have changed since 2012.

.

HA 6

Chart 6: FSM and disadvantaged gaps for KS2 reading test at L5+, 2012-2014

FSM performance has improved by five percentage points compared with 2013, while disadvantaged performance has grown by six percentage points.

However, gaps remain unchanged for FSM and have increased by one percentage point for disadvantaged learners. There is no discernible or consistent closing of gaps in KS2 reading at L5.

These gaps of 21 percentage points, for both FSM and disadvantaged learners, are significantly larger than the comparable gaps at L4+ of 12 (FSM) and 10 (disadvantaged) percentage points.

The analysis of level 5 performance in the SFR reveals that the proportion of Chinese learners achieving level 5 has reached 65%, having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012.

Turning to the Performance Tables, we can see that, in relation to L6:

  • The highest recorded percentage achieving L6 is 17%, at Dent CofE Voluntary Aided Primary School in Cumbria. Thirteen schools recorded a L6 success rate of 10% or higher. (The top school in 2013 recorded 19%).
  • In 2013 around 12,700 schools had no pupils who achieved L6 reading, whereas in 2014 this had increased to some 13,670 schools.

In relation to L5:

  • 43 schools achieved a 100% record in L5 reading (compared with only 18 in 2013). All but one of these recorded 0% at L6, which may suggest that they were concentrating on maximising L5 achievement rather than risking L6 entry.
  • Conversely, there are 29 primary schools where no learners achieved L5 reading.

Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so.  However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013.

And 41 schools recorded a success rate of 50% or lower on this measure, most of them comfortably exceeding this with their low and middle attainers alike.

.

GPS Test

Since the grammar, punctuation and spelling test was first introduced in 2013, there is only a two-year run of data. Tables 6 and 7 below show performance at L5+ and L6+ respectively.

.

            | 2013 % | 2014 %
L5+ overall |     48 |     52
Boys        |     42 |     46
Girls       |     54 |     58

Table 6: Percentage of learners achieving L5+ in GPS, 2013 and 2014

      | 2013 % | 2013 No | 2014 % | 2014 No
L6+   |      2 |   8,606 |      4 |  21,111
Boys  |      1 |   3,233 |      3 |   8,321
Girls |      2 |   5,373 |      5 |  12,790

Table 7: Number and percentage of learners achieving L6 in GPS, 2013 and 2014

.

Table 6 shows an overall increase of four percentage points in 2014 and the maintenance of a 12 percentage point gap in favour of girls.

Table 7 shows a very healthy improvement in L6 performance, which only serves to emphasise the parallel collapse in L6 reading. Boys have caught up a little on girls but the latter’s advantage remains significant.

The SFR shows that 75% of Chinese learners achieve L5 and above, up seven percentage points from 68% in 2013. Moreover, the proportion achieving L6 has increased by eight percentage points, to 15%. There are all the signs that Chinese eminence in maths is repeating itself with GPS.

Chart 7 shows how the FSM and disadvantaged gaps have changed at L5+ for GPS. The disadvantaged gap has remained stable at 19 percentage points, while the FSM gap has narrowed by one percentage point.

These gaps are somewhat larger than those at L4 and above, which stand at 17 percentage points for FSM and 15 percentage points for disadvantaged learners.

.

HA 7

Chart 7:  FSM and disadvantaged gaps for KS2 GPS test at L5+, 2013 and 2014

.

The Performance Tables show that, in relation to L6:

  • The school with the highest percentage achieving level 6 GPS is Fulwood, St Peter’s CofE Primary School in Lancashire, which records a 47% success rate. Some 89 schools achieve a success rate of 25% or higher.
  • In 2014 there were some 7,210 schools that recorded no L6 performers at all, but this compares favourably with 10,200 in 2013. This significant reduction is in marked contrast to the increase in schools with no L6 readers.

Turning to L5:

  • 18 schools recorded a perfect 100% record for L5 GPS. These schools recorded L6 success rates that vary between 0% and 25%.
  • There are 33 primary schools where no learners achieved L5 GPS.

.

Maths test

Table 8 below provides the percentages of learners achieving L5+ in the KS2 maths test since 2010.

Over the five-year period, the success rate has improved by eight percentage points, but the improvement in 2014 is less pronounced than in previous years.

The four percentage point lead that boys have over girls has changed little since 2010, apart from a temporary increase to six percentage points in 2012.

.

            | 2010 | 2011 | 2012 | 2013 | 2014
L5+ overall |   34 |   35 |   39 |   41 |   42
Boys        |   36 |   37 |   42 |   43 |   44
Girls       |   32 |   33 |   36 |   39 |   40

Table 8: Percentage of learners achieving L5+ in KS2 maths test, 2010-2014

.

Table 9 shows the change in achievement in the L6 test since 2012. This includes pupils attending independent schools – another table in the SFR indicates that the total number of successful learners in 2014 in state-funded schools is 47,349, meaning that almost 95% of those achieving L6 maths are located in the state-funded sector.

There has been a healthy improvement since 2013, with almost 15,000 more successful learners – an increase of over 40%. Almost one in ten of the end of KS2 cohort now succeeds at L6. This places the reversal in L6 reading into even sharper relief.

The ratio between boys and girls has remained broadly unchanged, so boys continue to account for over 60% of successful learners.

.

      | 2012 % | 2012 No | 2013 % | 2013 No | 2014 % | 2014 No
L6+   |      3 |  19,000 |      7 |  35,137 |      9 |  50,001
Boys  |      – |  12,400 |      8 |  21,388 |     11 |  30,173
Girls |      – |   6,600 |      5 |  13,749 |      7 |  19,828

Table 9: Number and percentage of learners achieving L6 in KS2 maths test, 2012-2014

.

The SFR shows that, of those achieving L6 in state-funded schools, some 78% had achieved L3 or above at KS1. However, some 9% of those with KS1 L3 – something approaching 10,000 pupils – progressed only to L4, or lower.

The breakdown for minority ethnic groups shows that the Chinese ascendancy continues. This is illustrated by Chart 8 below.

HA 8

Chart 8: KS2 L6 maths test performance by ethnic background, 2012-2014

In 2014, the percentage of Chinese achieving L5+ has increased by a respectable three percentage points to 74%, but the L6 figure has climbed by a further six percentage points to 35%. More than one third of Chinese learners now achieve L6 on the maths test.

This means that the proportion of Chinese pupils achieving L6 is now broadly similar to the proportions of other ethnic groups achieving Level 5 (34% of White pupils, for example).

They are fifteen percentage points ahead of the next best outcome – 20% recorded by Indian learners. White learners stand at 8%.

There is an eight percentage point gap between Chinese boys (39%) and Chinese girls (31%). The gap between White boys and girls is much lower, but this is a consequence of the significantly lower percentages.

Given that Chinese pupils are capable of achieving such extraordinary results under the present system, these outcomes raise significant questions about the balance between school and family effects and whether efforts to emulate Chinese approaches to maths teaching are focused on the wrong target.

Success rates in the L6 maths test are high enough to produce percentages for FSM and disadvantaged learners. The FSM and disadvantaged gaps both stand at seven percentage points, whereas they were at 5 percentage points (FSM) and 6 percentage points (disadvantaged) in 2013. The performance of disadvantaged learners has improved, but not as fast as that of other learners.

Chart 9 shows how these gaps have changed since 2012.

While the L6 gaps are steadily increasing, the L5+ gaps have remained broadly stable at 20 percentage points (FSM) and 21 percentage points (disadvantaged). There has been a small one percentage point improvement in the gap for disadvantaged learners in 2014, matching the similar small improvement for L4+.

The gaps at L5+ remain significantly larger than those at L4+ (13 percentage points for FSM and 11 percentage points for disadvantaged).

HA 9

Chart 9: FSM and disadvantaged gaps, KS2 L5+ and L6 maths test, 2012 to 2014

.

The Performance Tables reveal that:

  • The school with the highest recorded percentage of L6 learners is Fox Primary School (see above) at 64%, some seven percentage points higher than its nearest rival. Ten schools achieve a success rate of 50% or higher (compared with only three in 2013), 56 at 40% or higher and 278 at 30% or higher.
  • However, over 3,200 schools record no L6 passes. This is a significant improvement on the 5,100 in this category in 2013, but the number is still far too high.
  • Nine schools record a 100% success rate for L5+ maths. This is fewer than the 17 that managed this feat in 2013.

Some 94% of high attainers made the expected progress in maths – a one percentage point improvement on 2013 – two percentage points more than did so in reading in 2014, and two percentage points more than the proportion of middle attainers managing this.

However, 27 schools had a success rate of 50% or below, the vast majority of them comfortably exceeding this with their middle attainers – and often their low attainers too.

.

Writing Teacher Assessment

Table 10 shows how the percentage achieving L5+ through the teacher assessment of writing has changed since 2012.

There has been a healthy five percentage point improvement overall, and an improvement of three percentage points since last year, stronger than the comparable improvement at L4+. The large gender gap of 15 percentage points in favour of girls is also unchanged since 2013.

.

            | 2012 | 2013 | 2014
L5+ overall |   28 |   30 |   33
Boys        |   22 |   23 |   26
Girls       |   35 |   38 |   41

Table 10: Percentage achieving level 5+ in KS2 writing TA 2012-2014

.

Just 2% of learners nationally achieve L6 in writing TA – 11,340 pupils (10,654 of them located in state-funded schools).

However, this is a very significant improvement on the 2,861 recording this outcome in 2013. Just 3,928 of the total are boys.

Chinese ascendancy at L6 is not so significant. The Chinese success rate stands at 6%. However, if the comparator is performance at L5+, Chinese learners record 52%, compared with 33% for both White and Asian learners.

The chart below shows how FSM and disadvantaged gaps have changed at L5+ since 2012.

This indicates that the FSM gap, having widened by two percentage points in 2013, has narrowed by a single percentage point in 2014, so it remains higher than it was in 2012. Meanwhile the disadvantaged gap has widened by one percentage point since 2013.

The comparable 2014 gaps at L4+ are 15 percentage points (FSM) and 13 percentage points (disadvantaged), so the gaps at L5+ are significantly larger.

.

HA 10

Chart 10: FSM and disadvantaged gaps, L5+ Writing TA, 2012-2014

.

The Performance Tables show that:

  • Three schools record a L6 success rate of 50% and only 56 are at 25% or higher.
  • At the other end of the spectrum, the number of schools with no L6s is some 9,780, about a thousand fewer than in 2013.
  • At L5+ only one school has a 100% success rate (there were four in 2013). Conversely, about 200 schools record 0% on this measure.

Some 93% of all pupils make the expected progress in writing between KS1 and KS2 and this is true of 95% of high attainers – the same percentage of middle attainers is also successful.

Conclusion

Taken together, this evidence presents a far more nuanced picture of high attainment and high attainers’ performance in the primary sector than suggested by HMCI’s Commentary on his 2014 Annual Report:

‘The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’

There are four particular areas of concern:

  • Underachievement amongst high attainers remains prevalent in far too many primary schools. Although there has been some improvement since 2013, the fact that only 67% of those with high prior attainment at KS1 achieve L5 in reading, writing and maths combined is particularly worrying.
  • FSM and disadvantaged achievement gaps at L5+ remain significantly larger than those at L4+ – and there has been even less progress in closing them. The pupil premium ought to be having a significantly stronger impact on these excellence gaps.
  • The collapse of L6 reading test results is all the more stark when compared with the markedly improved success rates in GPS and maths which HMCI notes. We still have no explanation of the cause.
  • The success rates of Chinese pupils on L6 tests remain conspicuous and in maths are frankly extraordinary. This evidence of a ‘domestic Shanghai effect’ should be causing us to question why other groups are so far behind them – and whether we need to look beyond Shanghai classrooms when considering how best to improve standards in primary maths.

.

GP

December 2014

Unpacking the Primary Assessment and Accountability Reforms

This post examines the Government response to consultation on primary assessment and accountability.

It sets out exactly what is planned, what further steps will be necessary to make these plans viable and the implementation timetable.

It is part of a sequence of posts I have devoted to this topic. Earlier posts in the series include The Removal of National Curriculum Levels and the Implications for Able Pupils’ Progression (June 2012) and Whither National Curriculum Assessment Without Levels? (February 2013).

The consultation response contrives to be both minimal and dense. It is necessary to unpick each element carefully, to consider its implications for the package as a whole and to reflect on how that package fits in the context of wider education reform.

I have organised the post so that it considers sequentially:

  • The case for change, including the aims and core principles, to establish the policy frame for the planned reforms.
  • The impact on the assessment experience of children aged 2-11 and how that is likely to change.
  • The introduction of baseline assessment in Year R.
  • The future shape of end of KS1 and end of KS2 assessment respectively.
  • How the new assessment outcomes will be derived, reported and published.
  • The impact on floor standards.

Towards the end of the post I have also provided a composite ‘to do’ list containing all the declared further steps necessary to make the plan viable, with a suggested deadline for each.

And the post concludes with an overall judgement on the plans, in the form of a summary of key issues and unanswered questions arising from the earlier commentary. Impatient readers may wish to jump straight to that section.

I am indebted to Warwick Mansell for his previous post on this topic. I shall try hard not to parrot the important points he has already made, though there is inevitably some overlap.

Readers should also look to Michael Tidd for more information about the shape and content of the new tests.

What has been published?

The original consultation document ‘Primary assessment and accountability under the new national curriculum’ was published on 17 July 2013 with a deadline for response of 17 October 2013. At that stage the Government’s response was due ‘in autumn 2013’.

The response was finally published on 27 March, some four months later than planned and only five months prior to the introduction of the revised national curriculum which these arrangements are designed to support.

It is likely that the Government will have decided that 31 March was the latest feasible date to issue the response, so they were right up against the wire.

It was accompanied by:

  • A press release which focused on the full range of assessment reforms – for primary, secondary and post-16.

Shortly before the response was published, the reply to a Parliamentary question asked on 17 March explained that test frameworks were expected to be included within it:

‘Guidance on the nature of the revised key stage 1 and key stage 2 tests, including mathematics, will be published by the Standards and Testing Agency in the form of test framework documents. The frameworks are due to be released as part of the Government’s response to the primary assessment and accountability consultation. In addition, some example test questions will be made available to schools this summer and a full sample test will be made available in the summer of 2015.’ (Col 383W)

.

In the event, these documents – seven in all – did not appear until 31 March and there was no reference to any of the three commitments above in what appeared on 27 March.

Finally, the Standards and Testing Agency published on 3 April a guidance page on national curriculum tests from 2016. At present it contains very little information but further material will be added as and when it is published.

Partly because the initial consultation document was extremely ‘drafty’, the reaction of many key external respondents to the consultation was largely negative. One imagines that much of the period since 17 October has been devoted to finding the common ground.

Policy makers will have had to do most of their work after the consultation document was issued, because they were not ready beforehand.

But the length of the delay in issuing the response would suggest that they also encountered significant dissent amongst internal stakeholders – and that the eventual outcome is likely to be a compromise of sorts between these competing interests.

Such compromises tend to have observable weaknesses and/or put off problematic issues for another day.

A brief summary of consultation responses is included within the Government’s response. I will refer to this at relevant points during the discussion below.

 .

The Case for Change

 .

Aims

The consultation response begins – as did the original consultation document – with a section setting out the case for reform.

It provides a framework of aims and principles intended to underpin the changes that are being set in place.

The aims are:

  • The most important outcome of primary education is to ‘give as many pupils as possible the knowledge and skills to flourish in the later phases of education’. This is a broader restatement of the ‘secondary ready’ concept adopted in the original consultation document.
  • The primary national curriculum and accountability reforms ‘set high expectations so that all children can reach their potential and are well prepared for secondary school’. Here the ‘secondary ready’ hurdle is more baldly stated. The parallel notion is that all children should do as well as they can – and that they may well achieve different levels of performance. (‘Reach their potential’ is disliked by some because it is considered to imply a fixed ceiling for each child and fixed mindset thinking.)
  • To raise current threshold expectations. These are set too low, since too few learners (47%) with KS2 level 4C in both English and maths go on to achieve five or more GCSE grades A*-C including English and maths, while 72% of those with KS2 level 4B do so. So the new KS2 bar will be set at this higher level, but with the expectation that 85% of learners per school will jump it, 13% more than the current national figure. Meanwhile the KS4 outcome will also change, to achievement across eight GCSEs rather than five, quite probably at a more demanding level than the present C grade. In the true sense, this is a moving target.
  • ‘No child should be allowed to fall behind’. This is a reference to the notion of ‘mastery’ in its crudest sense, though the model proposed will not deliver this outcome. We have noted already a reference to ‘as many children as possible’ and the school-level target – initially at least – will be set at 85%. In reality, a significant minority of learners will progress more slowly and will fall short of the threshold at the end of KS2.
  • The new system ‘will set a higher bar’ but ‘almost all pupils should leave primary school well-placed to succeed in the next phase of their education’. Another nuanced version of ‘secondary ready’ is introduced. This marks a recognition that some learners will not jump over the higher bar. In the light of subsequent references to 85%, ‘almost all’ is rather over-optimistic.
  • ‘We also want to celebrate the progress that pupils make in schools with more challenging intakes’. Getting ‘nearly all pupils to meet this standard…’ (the standard of secondary readiness?) ‘…is very demanding, at least in the short term’. There will therefore be recognition of progress ‘from a low starting point’ – even though these learners have, by definition, been allowed to fall behind and will continue to do so.

So there is something of a muddle here, no doubt engendered by a spirit of compromise.

The black and white distinction of ‘secondary-readiness’ has been replaced by various verbal approximations, but the bottom line is that there will be a defined threshold denoting preparedness that is pitched higher than the current threshold.

And the proportion likely to fall short is downplayed – there is apparent unwillingness at this stage to acknowledge the norm that up to 15% of learners in each school will undershoot the threshold – substantially more in schools with ‘challenging intakes’.

What this boils down to is a desire that all will achieve the new higher hurdle – and that all will be encouraged to exceed it if they can – tempered by recognition that this is presently impossible. No child should be allowed to fall behind but many inevitably will do so.

It might have been better to express these aims in the form of future aspirations – and our collective efforts to bridge the gap between present reality and those ambitious aspirations.

Principles

The section concludes with a new set of principles governing pedagogy, assessment and accountability:

  • ‘Ongoing, teacher-led assessment is a crucial part of effective teaching;
  • Schools should have the freedom to decide how to teach their curriculum and how to track the progress that pupils make;
  • Both summative teacher assessment and external testing are important;
  • Accountability is key to a successful school system, and therefore must be fair and transparent;
  • Measures of both progress and attainment are important for understanding school performance; and
  • A broad range of information should be published to help parents and the wider public know how well schools are performing.’

These are generic ‘motherhood and apple pie’ statements and so largely uncontroversial. I might have added a seventh – that schools’ in-house assessment and reporting systems must complement summative assessment and testing, including by predicting for parents the anticipated outcomes of the latter.

Perhaps interestingly, there is no repetition of the defence for the removal of national curriculum levels. Instead, the response concentrates on the support available to schools.

It mentions discussion with an ‘expert group on assessment’ about ‘how to support schools to make best use of the new assessment freedoms’. We are not told the membership of this group (which, as far as I know, has not been made public) or the nature of its remit.

There is also a link to information about the Assessment Innovation Fund, which will provide up to 10 grants of up to £10,000 which schools and organisations can use to develop packages that share their innovative practice with others.

 

Children’s experience of assessment up to the end of KS2

The response mentions the full range of national assessments that will impact on children between the ages of two and 11:

  • The statutory progress check at two years of age.
  • A new baseline assessment undertaken within a few weeks of the start of Year R, introduced from September 2015.
  • An Early Years Foundation Stage Profile undertaken in the final term of the year in which children reach the age of five. A revised profile was introduced from September 2012. It is currently compulsory but will be optional from September 2016. The original consultation document said that the profile would no longer be moderated and data would no longer be collected. Neither of those commitments is repeated here.
  • The Phonics Screening Check, normally undertaken in Year 1. The possibility of making these assessments non-statutory for all-through primary schools, suggested in the consultation document, has not been pursued: 53% of respondents opposed this idea, whereas 32% supported it.
  • End of KS1 assessment and
  • End of KS2 assessment.

So a total of six assessments are in place between the ages of two and 11. At least four – and possibly five – will be undertaken between ages two and seven.

It is likely that early years professionals will baulk at this amount of assessment, no matter how sensitively it is designed. But the cost and inefficiency of the model is also open to criticism.

The Reception Baseline

Approach

The original consultation document asked whether:

  • KS1 assessment should be retained as a baseline – 45% supported this and 41% were opposed.
  • A baseline check should be introduced at the start of Reception – 51% supported this and 34% were opposed.
  • Such a baseline check should be optional – 68% agreed and 19% disagreed.
  • Schools should be allowed to choose from a range of commercially available materials for this baseline check – 73% said no and only 15% said yes.

So, whereas views were mixed on where the baseline should be set, there were substantial majorities in favour of any Year R baseline check being optional and following a single, standard national format.

The response argues that Year R is the most sensible point at which to position the baseline since that is:

‘…the earliest point that nearly all children are in school’.

What happens in respect of children who are not in school at this point is not discussed.

There is no explanation of why the Government has disregarded the clear majority of respondents by choosing to permit a range of assessment approaches, so this decision must be ideologically motivated.

The response says ‘most’ are likely to be administered by teaching staff, leaving open the possibility that some options will be administered externally.

Design

Such assessments will need to be:

‘…strong predictors of key stage 1 and key stage 2 attainment, whilst reflecting the age and abilities of children in Reception’.

Presumably this means predictors of attainment in each of the three core subjects – English, maths and science – rather than any broader notion of attainment. The challenge inherent in securing a reasonable predictor of attainment across these domains seven years further on in a child’s development should not be under-estimated.

The response points out that such assessment tools are already available for use in Year R, some are used widely and some schools have long experience of using them. But there is no information about how many of these are deemed to meet already the description above.

In any case, new criteria need to be devised which all such assessments must meet. Some degree of modification will be necessary for all existing products and new products will be launched to compete in the market.

There is an opportunity to use this process to ratchet up the Year R Baseline beyond current expectations, so matching the corresponding process at the end of KS2. The consultation response says nothing about whether this is on the cards.

Interestingly, in his subsequent ‘Unsure start’ speech about early years inspection, HMCI refers to:

‘…the government’s announcement last week that they will be introducing a readiness-for-school test at age four. This is an ideal opportunity to improve accountability. But I think it should go further.

I hope that the published outcomes of these tests will be detailed enough to show parents how their own child has performed. I fear that an overall school grade will fail to illuminate the progress of poor children. I ask government to think again about this issue.’

The terminology – ‘readiness for school’ – is markedly blunter than the references to a reception baseline in the consultation response. There is nothing in the response about the outcomes of these tests being published, nor anything about ‘an overall school grade’.

Does this suggest that decisions have already been made that were not communicated in the consultation response?

.

Timeline, options, questions

Several pieces of further work are required in short order to inform schools and providers about what will be required – and to enable both to prepare for introduction of the assessments from September 2015. All these should feature in the ‘to do’ list below.

One might reasonably have hoped that – especially given the long delay – some attempt might have been made to publish suggested draft criteria for the baseline alongside the consultation response. The fact that even preliminary research into existing practice has not been undertaken is a cause for concern.

Although the baseline will be introduced from September 2015, there is a one-year interim measure which can only apply to all-through primary schools:

  • They can opt out of the Year R baseline measure entirely, relying instead on KS1 outcomes as their baseline; or
  • They can use an approved Year R baseline assessment and have this cohort’s progress measured at the end of KS2 (which will be in 2022) by either the Year R or the KS1 baseline, whichever demonstrates the most progress.

In the period up to and including 2021, progress will continue to be measured from the end of KS1. So learners who complete KS2 in 2021 for example will be assessed on progress since their KS1 tests in 2017.

Junior and middle schools will also continue to use a KS1 baseline.

Arrangements for infant and first schools are still to be determined, another rather worrying omission at this stage in proceedings.

It is also clear that all-through primary schools (and infant/first schools?) will continue to be able to opt out from the Year R baseline from September 2016 onwards, since the response says:

‘Schools that choose not to use an approved baseline assessment from 2016 will be judged on an attainment floor standard alone’.

Hence the Year R baseline check is entirely optional and a majority of schools could choose not to undertake it.

However, they would need to be confident of meeting the demanding 85% attainment threshold in the floor standard.

They might be wise to postpone that decision until the pitch of the progress expectation is determined, for neither the Year R baseline nor the amount of progress that learners are expected to make from their starting point in Year R has yet been defined.

This latter point applies at the average school level (for the purposes of the floor standard) and in respect of the individual learner. For example, if a four year-old is particularly precocious in, say, maths, what scaled scores must they register seven years later to be judged to have made sufficient progress?

There are several associated questions that follow on from this.

Will it be in schools’ interests to acknowledge that they have precocious four year-olds at all? Will the Year R baseline reinforce the tendency to use Reception to bring all children to the same starting point in readiness for Year 1, regardless of their precocity?

Will the moderation arrangements be hard-edged enough to stop all-through primary schools gaming the system by artificially depressing their baseline outcomes?

Who will undertake this moderation and how much will it cost? Will not the decision to permit schools to choose from a range of measures unnecessarily complicate the moderation process and add to the expense?

The consultation response neither poses these questions nor supplies answers.

The future shape of end KS1 and end KS2 assessment

.

What assessment will take place?

At KS1 learners will be assessed in:

  • Reading – test plus teacher assessment
  • Writing – test (of grammar, punctuation and spelling) plus teacher assessment
  • Speaking and listening – teacher assessment
  • Maths – test plus teacher assessment
  • Science – teacher assessment

The new test of grammar, punctuation and spelling did not feature in the original consultation and has presumably been introduced to strengthen the marker of progress to which four year-olds should aspire at age seven.

The draft test specifications for the KS1 tests in reading, GPS and maths outline the requirements placed on the test developers, so it is straightforward to compare the specifications for reading and maths with the current tests.

The GPS test will include a 20 minute written grammar and punctuation task; a 20 minute test comprising short grammar, punctuation and vocabulary questions; and a 15 minute spelling task.

There is a passing reference to further work on KS1 moderation which is included in the ‘to do’ list below.

At KS2 learners will be assessed in:

  • Reading – test plus teacher assessment
  • Writing – test (of grammar, punctuation and spelling) plus teacher assessment
  • Maths – test plus teacher assessment
  • Science – teacher assessment plus a science sampling test.

Once again, the draft test specifications – reading, GPS, maths and science sampling – describe the shape of each test and the content they are expected to assess.

I will leave it to experts to comment on the content of the tests.

 .

Academies and free schools

It is important to note that the framing of this content – by means of detailed ‘performance descriptors’ – means that the freedom academies and free schools enjoy in departing from the national curriculum will be largely illusory.

I raised this issue back in February 2013:

  • ‘We know that there will be a new grading system in the core subjects at the end of KS2. If this were to be based on the ATs as drafted, it could only reflect whether or not learners can demonstrate that they know, can apply and understand ‘the matters, skills and processes specified’ in the PoS as a whole. Since there is no provision for ATs that reflect sub-elements of the PoS – such as reading, writing, spelling – grades will have to be awarded on the basis of separate syllabuses for end of KS2 tests associated with these sub-elements.
  • This grading system must anyway be applied universally if it is to inform the publication of performance tables. Since some schools are exempt from National Curriculum requirements, it follows that grading cannot be derived directly from the ATs and/or the PoS, but must be independent of them. So this once more points to end of KS2 tests based on entirely separate syllabuses which nevertheless reflect the relevant part of the draft PoS. The KS2 arrangements are therefore very similar to those planned at KS4.’

I have more to say about the ‘performance descriptors’ below.

 .

Single tests for all learners

A critical point I want to emphasise at this juncture – not mentioned at all in the consultation document or the response – is the test development challenge inherent in producing single papers suitable for all learners, regardless of their attainment.

We know from the response that the P-scales will be retained for those who are unable to access the end of key stage tests. (Incidentally, the content of the P-scales will remain unchanged so they will not be aligned with the revised national curriculum, as suggested in the consultation document.)

There will also be provision for pupils who are working ‘above the P-scales but below the level of the test’.

Now the P-scales are for learners working below level 1 (in old currency). This is the first indication I have seen that the tests may not cater for the full range from Level 1-equivalent to Level 6-equivalent and above. But no further information is provided.

It may be that this is a reference to learners who are working towards level 1 (in old currency) but do not have SEN.

The 2014 KS2 ARA booklet notes:

‘Children working towards level 1 of the national curriculum who do not have a special educational need should be reported to STA as ‘W’ (Working below the level). This includes children who are working towards level 1 solely because they have English as an additional language. Schools should use the code ‘NOTSEN’ to explain why a child working towards level 1 does not have P scales reported. ‘NOTSEN’ replaces the code ‘EAL’ that was used in previous years.’

The consultation document said:

‘We do not propose to develop an equivalent to the current level 6 tests, which are used to challenge the highest-attaining pupils. Key stage 2 national curriculum tests will include challenging material (at least of the standard of the current level 6 test) which all pupils will have the opportunity to answer, without the need for a separate test’.

The draft test specifications make it clear that the tests should:

‘provide a suitable challenge for all children and give every child the opportunity to achieve as high a standard…as possible.’

Moreover:

‘In order to improve general accessibility for all children, where possible, questions will be placed in order of difficulty.’

The development of single tests covering this span of attainment – from level 1 to above level 6 – in which the questions are posed in order of difficulty and even the highest attainers must answer every question, seems to me a very tall order, especially in maths.

More than that, I urgently need persuading that this is not a waste of high attainers’ time and poor assessment practice.

 .

How assessment outcomes will be derived, reported and published

Deriving assessment outcomes

One of the reasons cited for replacing national curriculum levels was the complexity of the system and the difficulty parents experienced in understanding it.

The Ministerial response to the original report from the National Curriculum Expert Panel said:

‘As you rightly identified, the current system is confusing for parents and restrictive for teachers. I agree with your recommendation that there should be a direct relationship between what children are taught and what is assessed. We will therefore describe subject content in a way which makes clear both what should be taught and what pupils should know and be able to do as a result.’

The consultation document glossed the same point thus:

‘Schools will be able to focus their teaching, assessment and reporting not on a set of opaque level descriptions, but on the essential knowledge that all pupils should learn.’

However, the consultation response introduces for the first time the concept of a ‘performance descriptor’.

This term is defined in the glossaries at the end of each draft test specification:

‘Description of the typical characteristics of children working at a particular standard. For these tests, the performance descriptor will characterise the minimum performance required to be working at the appropriate standard for the end of the key stage.’

Essentially this is a collective term for something very similar to old-style level descriptions.

Except that, in the case of the tests, they are all describing the same level of performance.

They have been rendered necessary by the odd decision to provide only a single generic attainment target for each programme of study. But, as noted back in February 2013, the test developers need a more sophisticated framework on which to base their assessments.

According to the draft test specifications, they will also be used:

‘By a panel of teachers to set the standards on the new tests following their first administration in May 2016’.

When it comes to teacher assessment, the consultation response says:

‘New performance descriptors will be introduced to inform the statutory teacher assessments at the end of key stage one [and]…key stage two.’

But there are two models in play simultaneously.

In four cases – science at KS1 and reading, maths and science at KS2 – there will be ‘a single performance descriptor of the new expected standard’, just as in the test specifications.

But in five cases – reading, writing, speaking and listening and maths at KS1; and writing at KS2:

‘teachers will assess pupils as meeting one of several performance descriptors’.

These are old-style level descriptors by another name. They perform exactly the same function.

The response says that the KS1 teacher assessment performance descriptors will be drafted by an expert group for introduction in autumn 2014. It does not mention whether KS2 teacher assessment performance descriptors will be devised in the same way and to the same timetable.

 .

Reporting assessment outcomes to parents

When it comes to reporting to parents, there will be three different arrangements in play at both KS1 and KS2:

  • Test results will be reported by means of scaled scores (of which more in a moment).
  • One set of teacher assessments will be reported by selecting from a set of differentiated performance descriptors.
  • A second set of teacher assessments will be reported according to whether learners have achieved a single threshold performance descriptor.

This is already significantly more complex than the previous system, which applied the same framework of national curriculum levels across the piece.

It seems that KS1 test outcomes will be reported as straightforward scaled scores (though this is only mentioned on page 8 of the main text of the response and not in Annex B, which compares the new arrangements with those currently in place).

But, in the case of KS2:

‘Parents will be provided with their child’s score alongside the average for their school, the local area and nationally. In the light of the consultation responses, we will not give parents a decile ranking for their child due to concerns about whether decile rankings are meaningful and their reliability at individual pupil level.’

The consultation document proposed a tripartite reporting system comprising:

  • A scaled score for each KS2 test, derived from raw test marks and built around a ‘secondary readiness standard’. This standard would be set at a scaled score of 100, which would remain unchanged. It was suggested for illustrative purposes that a scale based on the current national curriculum tests might run from 80 to 130.
  • An average scaled score in each test for other pupils nationally with the same prior attainment at the baseline. Comparison of a learner’s scaled score with the average scaled score would show whether they had made more or less progress than the national average.
  • A national ranking in each test – expressed in terms of deciles – showing how a learner’s scaled score compared with the range of performance nationally.

The latter has been dispensed with, given that 35% of consultation respondents disagreed with it, but there were clearly technical reservations too.

In its place, the ‘value added’ progress measure has been expanded so that there is a comparison with other pupils in the learner’s own school and the ‘local area’ (which presumably means local authority). This beefs up the progression element in reporting at the expense of information about the attainment level achieved.

So at the end of KS2 parents will receive scaled scores and three average scaled scores for each of reading, writing and maths – twelve scores in all – plus four performance descriptors, of which three will be singleton threshold descriptors (reading, maths and science) and one will be selected from a differentiated series (writing). That makes sixteen assessment outcomes altogether, provided in four different formats.
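For concreteness, the arithmetic can be restated as a short sketch (Python is used for illustrations here; the groupings below simply restate the reporting arrangements described above):

```python
# Reported end-of-KS2 outcomes per child under the proposed arrangements.
tests = ['reading', 'writing (via the GPS test)', 'maths']
scores_per_test = ['child', 'school average', 'local average', 'national average']

# Teacher assessment: three singleton threshold descriptors plus one
# differentiated descriptor for writing.
teacher_assessments = ['reading', 'maths', 'science', 'writing']

total = len(tests) * len(scores_per_test) + len(teacher_assessments)
print(total)  # 3 x 4 + 4 = 16 assessment outcomes, in four formats
```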

The consultation response tells us nothing more about the range of the scale that will be used to provide scaled scores. We do not even know if it will be the same for each test.

The draft test specifications say that:

‘The exact scale for the scaled scores will be determined following further analysis of trialling data. This will include a full review of the reporting of confidence intervals for scaled scores.’

But they also contain this worrying statement:

‘The provision of a scaled score will aid in the interpretation of children’s performance over time as the scaled score which represents the expected standard will be the same year on year. However, at the extremes of the scaled score distribution, as is standard practice, the scores will be truncated such that above and below a certain point, all children will be awarded the same scaled score in order to minimise the effect for children at the ends of the distribution where the test is not measuring optimally.’

This appears to suggest that scaled scores will not accurately describe performance at the extremes of the distribution, because the tests will not accurately measure such performance. This might be describing a statistical truism, but it again raises the question of whether the highest attainers are being short-changed by the selected approach.
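To illustrate what truncation means in practice, here is a minimal sketch. The 80 to 130 range is borrowed from the scale floated for illustrative purposes in the consultation document; the actual scale and truncation points remain undetermined:

```python
def report_scaled_score(scaled_score, lower=80, upper=130):
    """Clamp a scaled score to the published reporting range.

    The 80-130 range is purely illustrative; the real scale and
    truncation points have not been announced.
    """
    return max(lower, min(upper, scaled_score))

# Two pupils whose performance differs markedly at the top of the
# distribution nevertheless receive the same reported score:
assert report_scaled_score(131) == report_scaled_score(140) == 130
```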

.

Publication of assessment outcomes

The response introduces the idea that ‘a suite of indicators’ will be published on each school’s own website in a standard format. These are:

  • The average progress made by pupils in reading, writing and maths. (This is presumably relevant to both KS1 and KS2 and to both tests and teacher assessment.)
  • The percentage of pupils reaching the expected standard in reading, writing and mathematics at the end of key stage 2. (This is presumably relevant to both tests and teacher assessment.)
  • The average score of pupils in their end of key stage 2 assessments. (The final word suggests teacher assessment as well as tests, even though there will not be a score from the former.)
  • The percentage of pupils who achieve a high score in all areas at the end of key stage 2. (Does ‘all areas’ imply something more than statutory tests and teacher assessments? Does it mean treating each area separately, or providing details only of those who have achieved high scores across all areas?)

The latter is the only reference to high attainers in the entire response. It does not give any indication of what will count as a high score for these purposes. Will it be designed to catch the top third of attainers, or something more demanding, perhaps equivalent to the top decile?

A decision has been taken not to publish the outcomes of assessment against the P-scales, on the grounds that such information needs relatively greater contextualisation.

And, as noted above, HMCI let slip the fact that the outcomes of reception baselines would also be published, but apparently in the form of a single overall grade.

We are not told when these requirements will be introduced, but presumably they must be in place to report the outcomes of assessments undertaken in spring 2016.

Additionally:

‘So that parents can make comparisons between schools, we would like to show each school’s position in the country on these measures and present these results in a manner that is clear for all audiences to understand. We will discuss how best to do so with stakeholders, to ensure that the presentation of the data is clear, fair and statistically robust.’

This suggests inclusion in the 2016 School Performance Tables, but this is not stated explicitly.

Indeed, apart from references to the publication of progress measures in the 2022 Performance Tables, there is no explicit coverage in the response of the Tables’ contribution, nor any reference to the planned supporting data portal, or to how data will be distributed between the Tables and the portal.

The original consultation document gave several commitments on the future content of performance tables. They included:

  • How many of a school’s pupils are amongst the highest attaining nationally, by showing the percentage of pupils achieving a high scaled score in each subject.
  • Measures to show the attainment and progress of learners attracting the Pupil Premium.
  • Comparison of each school’s performance with that of schools with similar intakes.

None are mentioned here, nor are any of the suggestions advanced by respondents taken up.

Floor standards

Changes are proposed to the floor standards with effect from September 2016.

This section of the response begins by committing to:

‘…a new floor standard that holds schools to account both on the progress that they make and on how well their pupils achieve.’

But the plans set out subsequently do not meet this description.

The progress element of the current floor standard relates to any of reading, writing or mathematics but, under the new floor standard, it will relate to all three of these together.

An all-through primary school must demonstrate that:

‘…pupils make sufficient progress at key stage 2 from their starting point…’

As we have noted above, all-through primaries can opt to use the KS1 baseline or the Year R baseline in 2015. Moreover, from 2016 they can choose not to use the Year R baseline and be assessed solely on the attainment measure in the floor standards (see below).

Junior and middle schools obviously apply the KS1 baseline, while arrangements for infant and first schools have yet to be finalised.

What constitutes ‘sufficient progress’ is not defined. Annex C of the response says:

‘For 2016 we will set the precise extent of progress required once key stage 2 tests have been sat for the first time.’

Presumably this will be progress from KS1 to KS2, since progress from the Year R baseline will not be introduced until 2023.

The attainment element of the new floor standards is for schools to have 85% or more of pupils meeting the new, higher threshold standard at the end of KS2 in all of reading, writing and maths. The text says explicitly that this threshold is ‘similar to a level 4b under the current system’.

Annex C clarifies that this will be judged by the achievement of a scaled score of 100 or more in each of the reading and maths tests, plus teacher assessment that learners have reached the expected standard in writing (so the GPS test does not count in the same way, simply informing the teacher assessment).

As noted above, this is a far bigger ask than the current requirement for 65% of learners to meet the expected (and lower 4c) standard. The summary at the beginning of the response refers to it as ‘a challenging aspiration’:

‘Over time we expect more and more schools to achieve this standard.’

The statement in the first paragraph of this section of the response led us to believe that these two requirements – for progress and attainment respectively – would be combined, so that schools would be held to account for both (unless, presumably, they exercised their right to opt out of the Year R baseline assessment).

But this is not the case. Schools need only achieve one or the other.

It follows that schools with a very high performing intake may exceed the floor standards on the basis of all-round high attainment alone, regardless of the progress made by their learners.
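The either/or structure can be made concrete in a minimal sketch. Everything here is an assumption on my part – the field names are invented and ‘sufficient progress’ is represented by a placeholder threshold, since the response does not define it:

```python
def meets_floor_standard(pupils, sufficient_progress):
    """Sketch of the proposed post-2016 floor standard logic.

    pupils: list of dicts with (invented) keys 'reading' and 'maths'
    (scaled scores), 'writing_ta_met' (bool) and 'progress' (a
    placeholder combined progress measure).
    """
    # Attainment element: 85%+ of pupils at the expected standard in
    # all of reading and maths (scaled score of 100+) and writing (TA).
    at_standard = sum(
        1 for p in pupils
        if p['reading'] >= 100 and p['maths'] >= 100 and p['writing_ta_met']
    )
    attainment_met = at_standard / len(pupils) >= 0.85

    # Progress element: average progress across reading, writing and
    # maths together must reach the as-yet-undefined level.
    avg_progress = sum(p['progress'] for p in pupils) / len(pupils)
    progress_met = avg_progress >= sufficient_progress

    # Crucially, a school need satisfy only ONE of the two elements.
    return attainment_met or progress_met

# A school can clear the floor on attainment alone, with zero progress:
cohort = [{'reading': 105, 'maths': 104,
           'writing_ta_met': True, 'progress': 0.0}] * 20
assert meets_floor_standard(cohort, sufficient_progress=1.0)
```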

The reason for this provision is unclear, though one suspects that schools with an extremely high attaining intake, whether at Reception or Year 3, will be harder pressed to achieve sufficient progress, presumably because some ceiling effects come into play at the end of KS2.

This in turn might suggest that the planned tests do not have sufficient headroom for the highest attainers, even though they are supposed to provide similar challenge to level 6 and potentially extend beyond it.

Meanwhile, schools with less than stellar attainment results will be obliged to follow the progress route to clear the floor standard. This too will be demanding, because all three domains will be in play.

Some internal modelling must have been undertaken to judge how many schools would be likely to fall short of the floor standards under these arrangements, and it would be very useful to know these estimates, however unreliable they prove to be.

In their absence, one suspects that the majority of schools will be below the floor standards, at least initially. That of course materially changes the nature and purpose of the standards.

To Do List

The response and the draft specifications together contain a long list of work to be carried out over the next two years or so. I have included below my best guess as to the latest possible date for each decision to be completed and communicated:

  • Decide how progress will be measured for infants and first schools between the Year R baseline and the end of KS1 (April 2014)
  • Make available to schools a ‘small number’ of sample test questions for each key stage and subject (Summer 2014)
  • Work with experts to establish the criteria for the Year R baseline (September 2014)
  • KS1 [and KS2?] teacher assessment performance descriptors to be drafted by an expert group (September 2014)
  • Complete and report outcomes of a study with schools that already use Year R baseline assessments (December 2014)
  • Decide how Year R baseline assessments will be moderated (December 2014)
  • Publish a list of assessments that meet the Year R baseline criteria (March 2015)
  • Decide how Year R baseline results will be communicated to parents and to Ofsted (March 2015)
  • Make available to schools a full set of sample materials including tests and mark schemes for all KS1 and KS2 tests (September 2015)
  • Complete work with Ofsted and teachers to improve KS1 moderation (September 2015)
  • Provide further information to enable teachers to assess pupils at the end of KS1 and KS2 who are ‘working above the P-scales but below the level of the test’ (September 2015)
  • Decide whether to move to external moderation of P-scale teacher assessment (September 2015)
  • Agree with stakeholders how to compare schools’ performance on a suite of assessment outcomes published in a standard format (September 2015)
  • Publish all final test frameworks (Autumn 2015)
  • Introduce new requirements for schools to publish a suite of assessment outcomes in a standard format (Spring 2016)
  • Panels of teachers use performance descriptors to set the standards on the new tests following their first administration in May 2016 (Summer 2016)
  • Define what counts as sufficient progress from the Year R baseline to end KS1 and end KS2 respectively (Summer 2016)

Conclusion

Overall the response is rather more cogent and coherent than the original consultation document, though there are several inconsistencies and many sins of omission.

Drawing together the key issues emerging from the commentary above, I would highlight twelve key points:

  • The declared aims express the policy direction clumsily and without conviction. The ultimate aspirations are universal ‘secondary readiness’ (though expressed in broader terms), ‘no child left behind’ and ‘every child fulfilling their potential’ but there is no real effort to reconcile these potentially conflicting notions into a consensual vision of what primary education is for. Moreover, an inconvenient truth lurks behind these statements. By raising expectations so significantly – 4b equivalent rather than 4c; 85% over the attainment threshold rather than 65%; ‘sufficient progress’ rather than median progress and across three domains rather than one – there will be much more failure in the short to medium term. More learners will fall behind and fall short of the thresholds; many more schools are likely to undershoot the floor standards. It may also prove harder for some learners to demonstrate their potential. It might have been better to acknowledge this reality and to frame the vision in terms of creating the conditions necessary for subsequent progress towards the ultimate aspirations.
  • Younger children are increasingly caught in the crossbeam from the twin searchlights of assessment and accountability. HMCI’s subsequent intervention has raised the stakes still further. This creates obvious tensions in the sector which can be traced back to disagreements over the respective purposes of early years and primary provision and how they relate to each other. (HMCI’s notion of ‘school readiness’ is no doubt as narrow to early years practitioners as ‘secondary readiness’ is to primary educators.) But this is not just a theoretical point. Additional demands for focused inspection, moderation and publication of outcomes all carry a significant price tag. It must be open to question whether the sheer weight of assessment activity is optimal and delivers value for money. Should a radical future Government – probably with a cost-cutting remit – have rationalisation in mind?
  • Giving schools the freedom to choose from a range of Year R baseline assessment tools also seems inherently inefficient and flies in the face of the clear majority of consultation responses. We are told nothing of the perceived quality of existing services, none of which can – by definition – satisfy these new expectations without significant adjustment. It will not be straightforward to construct a universal and child-friendly instrument that is a sufficiently strong predictor of Level 4b-equivalent performance in KS2 reading, writing and maths assessments undertaken seven years later. Moreover, there will be a strong temptation for the Government to pitch the baseline higher than current expectations, so matching the realignment at the other end of the process. Making the Reception baseline assessment optional – albeit with strings attached – seems rather half-hearted, almost an insurance against failure. Effective (and expensive) moderation may protect against widespread gaming, but the risk remains that Reception teachers will be even more predisposed to prioritise universal school readiness over stretching their more precocious four year-olds.
  • The task of designing an effective test for all levels of prior attainment at the end of key stage 2 is equally fraught with difficulty. The P-scales will be retained (in their existing format, unaligned with the revised national curriculum) for learners with special needs working below the equivalent of what is currently level 1. There will also be undefined provision ‘for those working above the level of the P-scales but below the level of the test’, even though the draft test development frameworks say:

‘All eligible children who are registered at maintained schools, special schools, or academies (including free schools) in England and are at the end of key stage 2 will be required to take the…test, unless they have taken it in the past.’

And this applies to all learners other than those in the exempted categories set out in the ARA booklets. The draft specifications add that test questions will be placed in order of difficulty. I have grave difficulty in understanding how such assessments can be optimal for high attainers and fear that this is bad assessment practice.

  • On top of this there is the worrying statement in the test development frameworks that scaled scores will be ‘truncated’ at the extremes of the distribution. This does not fill one with confidence that the highest and lowest attainers will have their test performance properly recognised and reported.
  • The necessary invention of ‘performance descriptors’ removes any lingering illusion that academies and free schools have significant freedom to depart from the national curriculum, at least as far as the core subjects are concerned. It is hard to understand why these descriptors could not have been published alongside the programmes of study within the national curriculum.
  • The ‘performance descriptors’ in the draft test specifications carry all sorts of health warnings that they are inappropriate for teacher assessment because they cover only material that can be assessed in a written test. But there will be significant overlap between the test and teacher assessment versions, particularly in those that describe threshold performance at the equivalent of level 4b. For we know now that there will also be hierarchies of performance descriptors – aka level descriptors – for KS1 teacher assessment in reading, writing, speaking and listening and maths, as well as for KS2 teacher assessment in writing. Levels were so problematic that it has been necessary to reinvent them!
  • What with scaled scores, average scaled scores, threshold performance descriptors and ‘levelled’ performance descriptors, schools face an uphill battle in convincing parents that the reporting of test outcomes under this system will be simpler and more understandable. At the end of KS2 they will receive 16 different assessments in four different formats. (Remember that parents will also need to cope with schools’ approaches to internal assessment, which may or may not align with these arrangements.)
  • We are told about new requirements to be placed on schools to publish assessment outcomes, but the description is infuriatingly vague. We do not know whether certain requirements apply to both KS1 and 2, and/or to both tests and teacher assessment. The reference to ‘the percentage of pupils who achieve a high score in all areas at the end of key stage 2’ is additionally vague because it is unclear whether it applies to performance in each assessment, or across all assessments combined. Nor is the pitch of the high score explained. This is the only reference to high attainers in the entire response and it raises more questions than it answers.
  • We also have negligible information about what will appear in the school performance tables and what will be relegated to the accompanying data portal. We know there is an intention to compare schools’ performance on the measures they are required to publish and that is all. Much of the further detail in the original consultation document may or may not have fallen by the wayside.
  • The new floor standards have all the characteristics of a last-minute compromise hastily stitched together. The consultation document was explicit that floor standards would:

‘…focus on threshold attainment measures and value-added progress measures’

It anticipated that the progress measure would require average scaled scores of between 98.5 and 99.0, adding:

‘Our modelling suggests that a progress measure set at this level, combined with the 85% threshold attainment measure, would result in a similar number of schools falling below the floor as at present.’

But the analysis of responses fails to report at all on the question ‘Do you have any comments about these proposals for the Department’s floor standards?’ It does include the response to a subsequent question about including an average point score attainment measure in the floor standards (39% of respondents in favour, 31% against). But the main text does not discuss this option at all. It begins by stating that both an attainment and a progress dimension are in play, but then describes a system in which schools can choose one or the other. There is no attempt to quantify ‘sufficient progress’ and no revised modelling of the impact of standards set at this level. We are left with the suspicion that a very significant proportion of schools will not exceed the floor. There is also a potential perverse incentive for schools with very high attaining intakes not to bother about progress at all.

  • Finally, the ‘to do’ list is substantial. Several of the tasks with the tightest deadlines ought really to have been completed ahead of the consultation response, especially given the significant delay. There is nothing about the interaction between this work programme and that proposed by NAHT’s Commission on Assessment. Much of this work would need to take place on the other side of a General Election, while the lead time for assessing KS2 progress against a Year R baseline is a full nine years. This makes the project as a whole particularly vulnerable to the whims of future governments.

I’m struggling to find the right description for the overall package. I don’t think it’s quite substantial or messy enough to count as a dog’s breakfast. But, like a poorly airbrushed portrait, it flatters to deceive. Seen from a distance it appears convincing but, on closer inspection, there are too many wrinkles that have not been properly smoothed out.

GP

April 2014

What Becomes of Schools That Fail Their High Attainers?*

.

This post reviews the performance and subsequent history of schools with particularly poor results for high attainers in the Secondary School Performance Tables over the last three years.

Seahorse in Perth Aquarium by Gifted Phoenix

It establishes a high attainer ‘floor target’ so as to draw a manageable sample of poor performers and, having done so:

  • Analyses the characteristics of this sample;
  • Explores whether these schools typically record poor performance in subsequent years or manage to rectify matters;
  • Examines the impact of various interventions, including falling below the official floor targets, being placed in special measures or deemed to have serious weaknesses following inspection, becoming an academy and receiving a pre-warning and/or warning notice;
  • Considers whether the most recent Ofsted reports on these schools do full justice to this issue, including those undertaken after September 2013 when new emphasis was placed on the performance of the ‘most able’.

The post builds on my previous analysis of high attainment in the 2013 School Performance Tables (January 2014). It applies the broad definition of high attainers used in the Tables, which I discussed in that post and have not repeated here.

I must emphasise at the outset that factors other than poor performance may partially explain particularly low scores in the Tables.

There may be several extenuating circumstances that are not reflected in the results. Sometimes these may surface in Ofsted inspection reports, but the accountability and school improvement regime typically imposes a degree of rough justice, and I have followed its lead.

It is also worth noting that the Performance Tables do not provide data for schools where the number of high attainers is five or fewer, because of the risk that individuals may be identifiable even though the data is anonymised.

This is unfortunate since the chances are that schools with very few high attainers will find it more difficult than others to address their needs. We may never know, but there is more on the impact of cohort size below.

Finally please accept my customary apology for any transcription errors. Do let me know if you notice any and I will correct them.

.

Drawing the Sample

The obvious solution would be to apply the existing floor targets to high attainers.

So it would include all schools recording:

  • Fewer than 35% (2011) or 40% (2012 and 2013) of high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and mathematics and
  • Below median scores for the percentage of high attainers making at least the expected three levels of progress between Key Stages 2 and 4 in English and maths respectively.

But the first element is far too undemanding a threshold to apply for high attaining learners and the overall target generates a tiny sample.

The only school failing to achieve it in 2013 was Ark Kings Academy in Birmingham, which recorded just six high attainers, forming 9% of the cohort (so only just above the level at which results would have been suppressed).

In 2012 two schools were in the same boat:

  • The Rushden Community College in Northamptonshire, with 35 high attainers (26% of the cohort), which became a sponsored academy with the same name on 1 December 2012; and
  • Culverhay School in Bath and North East Somerset, with 10 high attainers (19% of the cohort), which became Bath Community Academy on 1 September 2012.

No schools at all performed at this level in 2011.

A sample of just three schools is rather too unrepresentative, so it is necessary to set a more demanding benchmark which combines the same threshold and progress elements.

The problem is not with the progress measure. Far too many schools fail to meet the median level of performance – around 70% each year in both English and maths – even with their cadres of high attainers. Hence I need to lower the pitch of this element to create a manageable sample.

I plumped for 60% or fewer high attainers making at least the expected progress between KS2 and KS4 in both English and maths. This captured 22 state-funded schools in 2013, 31 in 2012 and 38 in 2011. (It also enabled Ark Kings Academy to escape, by virtue of the fact that 67% of its high attaining learners achieved the requisite progress in English.)

For the threshold element I opted for 70% or fewer high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and maths. This captured 19 state-funded schools in 2013, 29 in 2012 and 13 in 2011.
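Expressed as a simple filter – with invented field names, since the Performance Tables data would need reshaping into this form – the combined illustrative floor target looks like this:

```python
def below_illustrative_ha_floor(school):
    """True if a school falls below BOTH elements of my illustrative
    high attainer floor target (field names are hypothetical)."""
    progress_element = (school['ha_3lop_english'] <= 60
                        and school['ha_3lop_maths'] <= 60)
    threshold_element = school['ha_5acem'] <= 70
    return progress_element and threshold_element

# For example, Gloucester Academy's 2013 figures from Table 2 below:
gloucester_2013 = {'ha_3lop_english': 28, 'ha_3lop_maths': 50, 'ha_5acem': 44}
assert below_illustrative_ha_floor(gloucester_2013)
```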

.

Venn diagram: schools below the threshold element, the progress element, or both

The numbers of state-funded schools that met both criteria were seven in 2013, eight in 2012 and five in 2011, so 20 in all.

I decided to feature this small group of schools in the present post while also keeping in mind the schools occupying each side of the Venn Diagram. I particularly wanted to see whether schools which emerged from the central sample in subsequent years continued to fall short on one or other of the constituent elements.

Table 1 below lists the 20 schools in the main sample and provides more detail about each.

.

Table 1: Schools Falling Below Illustrative High Attainer Floor Targets 2011-2013

Name | Type | LA | Status/Sponsor | Subsequent History
2011
Carter Community School | 12-16 mixed modern | Poole | Community | Sponsored academy (ULT) 1/4/13
Hadden Park High School | 11-16 mixed comp | Nottingham | Foundation | Sponsored academy (Bluecoat School) 1/1/14
Merchants Academy | 11-18 mixed comp | Bristol | Sponsored academy (Merchant Venturers/University of Bristol) | –
The Robert Napier School | 11-18 mixed modern | Medway | Foundation | Sponsored academy (Fort Pitt Grammar School) 1/9/12
Bishop of Rochester Academy | 11-18 mixed comp | Kent | Sponsored academy (Medway Council/Canterbury Christ Church University/Diocese of Rochester) | –
2012
The Rushden Community College | 11-18 mixed comp | Northants | Community | Sponsored academy (The Education Fellowship) 12/12
Culverhay School | 11-18 boys comp | Bath and NE Somerset | Community | Bath Community Academy – mixed (Cabot Learning) 1/9/12
Raincliffe School | 11-16 mixed comp | N Yorks | Community | Closed 8/12 (merged with Graham School)
The Coseley School | 11-16 mixed comp | Dudley | Foundation | –
Fleetwood High School | 11-18 mixed comp | Lancs | Foundation | –
John Spendluffe Foundation Technology College | 11-16 mixed modern | Lincs | Academy converter | –
Parklands High School | 11-18 mixed | Liverpool | Foundation | Discussing academy sponsorship (Bright Tribe)
Frank F Harrison Engineering College | 11-18 mixed comp | Walsall | Foundation | Mirus Academy (sponsored by Walsall College) 1/1/12
2013
Gloucester Academy | 11-19 mixed comp | Glos | Sponsored academy (Prospect Education/Gloucestershire College) | –
Christ the King Catholic and Church of England VA School | 11-16 mixed comp | Knowsley | VA | Closed 31/8/13
Aireville School | 11-16 mixed modern | N Yorks | Community | –
Manchester Creative and Media Academy for Boys | 11-19 boys comp | Manchester | Sponsored academy (Manchester College/Manchester Council/Microsoft) | –
Fearns Community Sports College | 11-16 mixed comp | Lancs | Community | –
Unity College Blackpool | 5-16 mixed comp | Blackpool | Community | Unity Academy Blackpool (sponsored by Fylde Coast Academies)
The Mirus Academy | 3-19 mixed comp | Walsall | Sponsored academy (Walsall College) | –

 .

Only one school appears twice over the three-year period albeit in two separate guises – Frank F Harrison/Mirus.

Of the 20 in the sample, seven were recorded in the relevant year’s Performance Tables as community schools, six as foundation schools, one was VA, one was an academy converter and the five remaining were sponsored academies.

Of the 14 that were not originally academies, seven have since become sponsored academies and one is discussing the prospect. Two more have closed, so just five – 25% of the sample – remain outside the academies sector.

All but two of the schools are mixed (the other two are boys’ schools). Four are modern schools and the remainder comprehensive.

Geographically they are concentrated in the Midlands and the North, with a few in the South-West and the extreme South-East. There are no representatives from London, the East or the North-East.

.

Performance of the Core Sample

Table 2 below looks at key Performance Table results for these schools. I have retained the separation by year and the order in which the schools appear, which reflects their performance on the GCSE threshold measure, with the poorest performing at the top of each section.

.

Table 2: Performance of schools falling below proposed high attainer floor targets 2011-2013

Name | No of HA | % HA | 5+ A*-C incl E+M | 3+ LoP En | 3+ LoP Ma | APS (GCSE)
2011
Carter Community School | 9 | 13 | 56 | 56 | 44 | 304.9
Hadden Park High School | 15 | 13 | 60 | 40 | 20 | 144.3
Merchants Academy | 19 | 19 | 68 | 58 | 42 | 251.6
The Robert Napier School | 28 | 12 | 68 | 39 | 46 | 292.8
Bishop of Rochester Academy | 10 | 5 | 70 | 50 | 60 | 298.8
2012
The Rushden Community College | 35 | 26 | 3 | 0 | 54 | 326.5
Culverhay School | 10 | 19 | 30 | 40 | 20 | 199.3
Raincliffe School | 6 | 11 | 50 | 50 | 33 | 211.5
The Coseley School | 35 | 20 | 60 | 51 | 60 | 262.7
Fleetwood High School | 34 | 22 | 62 | 38 | 24 | 272.9
John Spendluffe Foundation Technology College | 14 | 12 | 64 | 50 | 43 | 283.6
Parklands High School | 13 | 18 | 69 | 23 | 8 | 143.7
Frank F Harrison Engineering College | 20 | 12 | 70 | 35 | 60 | 188.3
2013
Gloucester Academy | 18 | 13 | 44 | 28 | 50 | 226.8
Christ the King Catholic and Church of England VA School | 22 | 22 | 55 | 32 | 41 | 256.5
Aireville School | 23 | 23 | 61 | 35 | 57 | 267.9
Manchester Creative and Media Academy for Boys | 16 | 19 | 63 | 50 | 50 | 244.9
Fearns Community Sports College | 22 | 13 | 64 | 36 | 59 | 306.0
Unity College Blackpool | 21 | 18 | 67 | 57 | 52 | 277.1
The Mirus Academy | 23 | 13 | 70 | 57 | 52 | 201.4

.

The size of the high attainer population in these schools varies between 6 (the minimum for which statistics are published) and 35, with an average of just under 20.

The percentage of high attainers within each school’s cohort ranges from 5% to 26% with an average of slightly over 16%.

This compares with a national average in 2013 for all state-funded schools of 32.4%, almost twice the size of the average cohort in this sample. All 20 schools here record a high attainer population significantly below this national average.

This correlation may be significant – tending to support the case that high attainers are more likely to struggle in schools where they are less strongly concentrated – but it does not prove the relationship.

Achievement against the GCSE threshold measure falls as low as 3% (Rushden in 2012) but this was reportedly attributable to the school selecting ineligible English specifications.

Otherwise the poorest result is 30% at Culverhay, also in 2012, followed by Gloucester Academy (44% in 2013) and Raincliffe (50% in 2012). Only these four schools have recorded performance at or below 50%.

Indeed there is a very wide span of performance even amongst these small samples, especially in 2012 when it reaches an amazing 67 percentage points (40 percentage points excluding Rushden). In 2013 there was a span of 26 percentage points and in 2011 a span of 14 percentage points.

The overall average amongst the 20 schools is almost 58%. This varies by year. In 2011 it was 64%, in 2012 it was significantly lower at 51% (but rose to 58% if Rushden is excluded) and in 2013 it was 61%.

This compares with a national average for high attainers in state-funded schools of 94.7% in 2013. The extent to which some of these outlier schools are undershooting the national average is truly eye-watering.

Turning to the progress measures, one might expect even greater variance, given that so many more schools fail to clear this element of the official floor targets with their high attainers.

The overall average across these 20 schools is 41% in English and 44% in maths, suggesting that performance is slightly stronger in maths than English.

But in 2011 the averages were 49% in English and 42% in maths, reversing this general pattern and producing a much wider gap in favour of English.

In 2012 they were 36% in English and 38% in maths, but the English average improves to 41% if Rushden’s result is excluded. This again bucks the overall trend.

The overall average is cemented by the 2013 figures, when the average for maths stood at 52% compared with 42% for English.

Hence, over the three years, we can see that the sharp drop in English in 2012 – most probably attributable to the notorious marking issue – was barely recovered in 2013. Conversely, a drop in maths in 2012 was followed by a sharp recovery in 2013.

The small sample size calls into question the significance of these patterns, but they are interesting nevertheless.

The comparable national averages among all state-funded schools in 2013 were 86.2% in English and 87.8% in maths. So the schools in this sample are typically operating at around half the national average levels. This is indeed worse than the comparable record on the threshold measure.

That said, the variation in these results is again huge – 35 percentage points in English (excluding Rushden) and as much as 52 percentage points in maths.

There is no obvious pattern in these schools’ comparative performance in English and maths. Ten schools scored more highly in English and nine in maths, with one school recording the same percentage in both. English was in the ascendancy in 2011 and 2012, but maths supplanted it in 2013.

The final column in Table 2 shows the average point score (APS) for high attainers’ best eight GCSE results. There is once more a very big range, from 144.3 to 326.5 – over 180 points – compared with a 2013 national average for high attainers in state-funded schools of 377.6.

The schools at the bottom of the distribution are almost certainly relying heavily on GCSE-equivalent qualifications, rather than pushing their high attainers towards GCSEs.

Those schools that record relatively high APS alongside relatively low progress scores are most probably taking their high attaining learners with L5 at KS2 to GCSE grade C, but no further.

.

Changes in Performance from 2011 to 2013

Table 3, below, shows how the performance of the 2011 sample changed in 2012 and 2013, while Table 4 shows how the 2012 sample performed in 2013.

The numbers in green show improvements compared with the schools’ 2011 baselines and those in bold are above my illustrative high attainer floor target. The numbers in red are those which are lower than the schools’ 2011 baselines.

.

Table 3: Performance of the 2011 Sample in 2012 and 2013

Name | % HA (11/12/13) | 5+ A*-C incl E+M (11/12/13) | 3+ LOP En (11/12/13) | 3+ LOP Ma (11/12/13)
Carter Community School | 13 / 14 / 13 | 56 / 100 / 92 | 56 / 80 / 75 | 44 / 80 / 33
Hadden Park High School | 13 / 15 / 8 | 60 / 87 / 75 | 40 / 80 / 75 | 20 / 53 / 50
Merchants Academy | 19 / 16 / 20 | 68 / 79 / 96 | 58 / 79 / 88 | 42 / 47 / 71
The Robert Napier School | 12 / 12 / 11 | 68 / 83 / 96 | 39 / 59 / 92 | 46 / 62 / 80
Bishop of Rochester Academy | 5 / 7 / 8 | 70 / 83 / 73 | 50 / 67 / 47 | 60 / 75 / 53

.

All but one of the five schools showed little variation in the relative size of their high attainer populations over the three years in question.

More importantly, all five schools made radical improvements in 2012.

Indeed, all five exceeded the 5+ GCSE threshold element of my illustrative floor target in both 2012 and 2013 though, more worryingly, three of the five fell back somewhat in 2013 compared with 2012, which might suggest that short term improvement is not being fully sustained.

Four of the five exceeded the English progress element of the illustrative floor target in 2012 while the fifth – Robert Napier – missed by only 1%.

Four of the five also exceeded the floor in 2013, including Robert Napier which made a 33 percentage point improvement compared with 2012. On this occasion, Bishop of Rochester was the exception, having fallen back even below its 2011 level.

In the maths progress element, all five schools made an improvement in 2012, three of the five exceeding the floor target, the exceptions being Hadden Park and Merchants Academy.

But by 2013, only three schools remained above their 2011 baseline and only two – Merchants and Robert Napier – remained above the floor target.

None of the five schools would have remained below my floor target in either 2012 or 2013, by virtue of their improved performance on the 5+ GCSE threshold element, but there was significantly greater insecurity in the progress elements, especially in maths.

There is also evidence of huge swings in performance on the progress measures. Hadden Park improved progression in English by 40 percentage points between 2011 and 2012. Carter Community School almost matched this in maths, improving by 36 percentage points, only to fall back by a huge 47 percentage points in the following year.

Overall this would appear to suggest that this small sample of schools made every effort to improve against the threshold and progress measures in 2012 but, while most were able to sustain improvement – or at least control their decline – on the threshold measure into 2013, this was not always possible with the progress elements.

There is more than a hint of two markedly different trajectories, with one group of schools managing to sustain initial improvements from a very low base and the other group falling back after an initial drive.

Is the same pattern emerging amongst the group of schools that fell below my high attainer floor target in 2012?

.

Table 4: Performance of the 2012 Sample in 2013

Name | % HA (12/13) | 5+ A*-C incl E+M (12/13) | 3+ LOP E (12/13) | 3+ LOP M (12/13)
The Rushden Community College | 26 / 23 | 3 / 90 | 0 / 74 | 54 / 87
Culverhay School | 19 / 12 | 30 / 67 | 40 / 67 | 20 / 67
Raincliffe School (closed 8/12) | 11 / – | 50 / – | 50 / – | 33 / –
The Coseley School | 20 / 26 | 60 / 88 | 51 / 82 | 60 / 78
Fleetwood High School | 22 / 24 | 62 / 84 | 38 / 36 | 24 / 67
John Spendluffe Foundation Technology College | 12 / 15 | 64 / 100 | 50 / 61 | 43 / 83
Parklands High School | 18 / 11 | 69 / 78 | 23 / 56 | 8 / 56
Frank F Harrison Engineering College | 12 / 13 | 70 / 70 | 35 / 57 | 60 / 52

.

We must rule out Raincliffe, which closed, leaving seven schools under consideration.

Some of these schools experienced slightly more fluctuation in the size of their high attainer populations – and over the shorter period of two years rather than three.

Six of the seven managed significant improvements in the 5+ GCSE threshold with the remaining school – Frank F Harrison – maintaining its 2012 performance.

Two schools – Frank F Harrison and Culverhay – did not exceed the illustrative floor on this element. Meanwhile, John Spendluffe achieved a highly creditable perfect score, comfortably exceeding the national average for state-funded schools. Rushden was not too far behind.

There was greater variability with the progress measures. In English, three schools remained below the illustrative floor in 2013 with one – Fleetwood High – falling back compared with its 2012 performance.

Conversely, Coseley improved by 31 percentage points to not far below the national average for state-funded schools.

In maths two schools failed to make it over the floor. Parklands made a 48 percentage point improvement but still fell short, while Frank F Harrison fell back eight percentage points compared with its 2012 performance.

On the other hand, Rushden and John Spendluffe are closing in on national average performance for state-funded schools. Both have made improvements of over 30 percentage points.

Of the seven, only Frank F Harrison would remain below my overall illustrative floor target on the basis of its 2013 performance.

Taking the two samples together, the good news is that many struggling schools are capable of making radical improvements in their performance with high attainers.

But question marks remain over the capacity of some schools to sustain initial improvements over subsequent years.

 .

What Interventions Have Impacted on these Schools?

Table 5 below reveals how different accountability and school improvement interventions have been brought to bear on this sample of 20 schools since 2011.

.

Table 5: Interventions Impacting on Sample Schools 2011-2014

Name | Floor targets | Most recent inspection | Ofsted rating | (Pre-)warning notice | Academised
2011
Carter Community School | FT 2011, FT 2013 | 29/11/12; NYI as academy | 2 | – | Sponsored
Hadden Park High School | FT 2011, FT 2012, FT 2013 | 13/11/13; NYI as academy | SM | – | Sponsored
Merchants Academy | FT 2011, FT 2012 | 9/6/11 | 2 | – | –
The Robert Napier School | FT 2011, FT 2012 | 17/09/09; NYI as academy | 3 | – | Sponsored
Bishop of Rochester Academy | FT 2011, FT 2013 | 28/6/13 | 3 | PWN 3/1/12 | –
2012
The Rushden Community College | FT 2012 | 10/11/10; NYI as academy | 3 | – | Sponsored
Culverhay School | FT 2011, FT 2012, (FT 2013) | 11/1/12; NYI as academy | SM | – | Sponsored
Raincliffe School | FT 2012 | 19/10/10 | 3 | – | Closed
The Coseley School | FT 2012 | 13/9/12 | SM | – | –
Fleetwood High School | FT 2012, FT 2013 | 20/3/13 | SWK | – | –
John Spendluffe Foundation Technology College | FT 2012 | 3/3/10; as academy 18/9/13 | 1; 2 as academy | – | Academy converter 9/11
Parklands High School | FT 2011, FT 2012, FT 2013 | 5/12/13 | SM | – | Discussing sponsorship
Frank F Harrison Engineering College | FT 2011, FT 2012, (FT 2013) | 5/7/11; see Mirus Academy below | 3 | – | Now Mirus Academy (see below)
2013
Gloucester Academy | FT 2011, FT 2012, FT 2013 | 4/10/12 | SWK | PWN 16/9/13; WN 16/12/13 | –
Christ the King RC and CofE VA School | FT 2011, FT 2012, FT 2013 | 18/9/12 | SM | – | Closed
Aireville School | FT 2012, FT 2013 | 15/5/13 | SM | – | –
Manchester Creative and Media Academy for Boys | FT 2011, FT 2012, FT 2013 | 13/6/13 | SWK | PWN 3/1/12 | –
Fearns Community Sports College | FT 2011, FT 2013 | 28/6/12 | 3 | – | –
Unity College Blackpool | FT 2011, FT 2012, FT 2013 | 9/11/11; NYI as academy | 3 | – | Sponsored
The Mirus Academy | FT 2013 | 7/11/13 | SM | – | –

Key: FT = below the official floor targets in the year shown (bracketed entries post-date the school’s conversion to academy status); SM = special measures; SWK = serious weaknesses; PWN = pre-warning notice; WN = warning notice; NYI = not yet inspected.

 .

Floor Targets

The first and obvious point to note is that every single school in this list fell below the official floor targets in the year in which they also undershot my illustrative high attainers’ targets.

It is extremely reassuring that none of the schools returning particularly poor outcomes with high attainers are deemed acceptable performers in generic terms. I had feared that a few schools at least would achieve this feat.

In fact, three-quarters of these schools have fallen below the floor targets in at least two of the three years in question, while eight have done so in all three years, two having changed their status by becoming academies in the final year (which, strictly speaking, prevents them from scoring the hat-trick). One has since closed.

Some schools appear to have been spared intervention by receiving a relatively positive Ofsted inspection grade despite their floor target records. For example, Carter Community School had a ‘good’ rating sandwiched between two floor target appearances, while Merchants Academy presumably received its good rating before subsequently dropping below the floor.

John Spendluffe managed an outstanding rating two years before it dropped below the floor target and was rated good – in its new guise as an academy – a year afterwards.

The consequences of falling below the floor targets are surprisingly unclear, as indeed are the complex rules governing the wider business of intervention in underperforming schools.

DfE press notices typically say something like:

‘Schools below the floor and with a history of underperformance face being taken over by a sponsor with a track record of improving weak schools.’

But of course that can only apply to schools that are not already academies.

Moreover, LA-maintained schools may appeal to Ofsted against standards and performance warning notices issued by their local authorities; and schools and LAs may also challenge forced academisation in the courts, arguing that they have sufficient capacity to drive improvement.

As far as I can establish, it is nowhere clearly explained what exactly constitutes a ‘history of underperformance’, so there is inevitably a degree of subjectivity in the application of this criterion.

Advice elsewhere suggests that a school’s inspection outcomes and ‘the local authority’s position in terms of securing improvement as a maintained school’ should also be taken into account alongside achievement against the floor targets.

We do not know what weighting is given to these different sources of evidence, nor can we rule out the possibility that other factors – tangible or intangible – are also weighed in the balance.

Some might argue that this gives politicians the necessary flexibility to decide each case on its merits, taking careful account of the unique circumstances that apply rather than imposing a standard set of cookie-cutter judgements.

Others might counter that the absence of standard criteria, imposed rigorously but with flexibility to take additional special circumstances into account, lays such decisions unnecessarily open to dispute and is likely to generate costly and time-consuming legal challenge.

.

Academy Warning Notices

When it comes to academies:

‘In cases of sustained poor academic performance at an academy, ministers may issue a pre-warning notice to the relevant trust, demanding urgent action to bring about substantial improvements, or they will receive a warning notice. If improvement does not follow after that, further action – which could ultimately lead to a change of sponsor – can be taken. In cases where there are concerns about the performance of a number of a trust’s schools, the trust has been stopped from taking on new projects.’

‘Sustained poor academic performance’ may or may not be different from a ‘history of underperformance’ and it too escapes definition.

One cannot but conclude that it would be very helpful indeed to have some authoritative guidance, so that there is much greater transparency in the processes through which these various provisions are being applied, to academies and LA-maintained schools alike.

In the absence of such guidance, it seems rather surprising that only three of the academies in this sample – Bishop of Rochester, Gloucester and Manchester Creative and Media – have received pre-warning letters to date, while only Gloucester’s has been superseded by a full-blown warning notice. None of these mention specifically the underperformance of high attainers.

  • Bishop of Rochester received its notice in January 2012, but subsequently fell below the floor targets in both 2012 and 2013 and – betweentimes – received an Ofsted inspection rating of 3 (‘requires improvement’).
  • Manchester Creative and Media also received its pre-warning notice in January 2012. It too has been below the floor targets in both 2012 and 2013 and was deemed to have serious weaknesses in a June 2013 inspection.
  • Gloucester received its pre-warning notice much more recently, in September 2013, followed by a full warning notice just three months later.

These pre-warning letters invite the relevant trusts to set out within 15 days what action they will take to improve matters, whereas the warning notices demand a series of specific improvements within a tight deadline. (In the case of Gloucester Academy, the notice issued on 16 December 2013 imposed a deadline of 15 January 2014. We do not yet know the outcome.)

Other schools in my sample have presumably been spared a pre-warning letter because of their relatively recent acquisition of academy status, although several other 2012 openers have already received them. One anticipates that more will attract such attention in due course.

 .

Ofsted Inspection

The relevant columns of Table 5 reveal that, of the 12 schools that are now academies (taking care to count Harrison/Mirus as one rather than two), half have not yet been inspected in their new guise.

As noted above, it is strictly the case that, when schools become academies – whether sponsored or via conversion – they are formally closed and replaced by successor schools, so the old inspection reports no longer apply to the new school.

However, this does not prevent many academies from referring to such reports on their websites – and they do have a certain currency when one wishes to see whether or not a recently converted academy has been making progress.

But, if we accept the orthodox position, there are only six academies with bona fide inspection reports: Merchants, Bishop of Rochester, John Spendluffe, Gloucester, Manchester Creative and Media and Mirus.

All five of the LA-maintained schools still open have been inspected fairly recently: Coseley, Fleetwood, Parklands, Aireville and Fearns.

This gives us a sample of 11 schools with valid inspection reports:

  • Two academies are rated ‘good’ (2) – Merchants and John Spendluffe;
  • One academy – Bishop of Rochester – and one LA-maintained school – Fearns – ‘require improvement’ (3);
  • Two academies – Gloucester and Manchester – and one LA-maintained school – Fleetwood – are inadequate (4), having serious weaknesses; and
  • One academy – Mirus – and three LA-maintained schools – Parklands, Coseley and Aireville – are inadequate (4) and in special measures.

The School Inspection Handbook explains the distinction between these two variants of ‘inadequate’:

‘A school is judged to require significant improvement where it has serious weaknesses because one or more of the key areas is ‘inadequate’ (grade 4) and/or there are important weaknesses in the provision for pupils’ spiritual, moral, social and cultural development. However, leaders, managers and governors have been assessed as having the capacity to secure improvement

…A school requires special measures if:

  • it is failing to give its pupils an acceptable standard of education and
  • the persons responsible for leading, managing or governing are not demonstrating the capacity to secure the necessary improvement in the school.’

Schools in each of these categories are subject to more frequent monitoring reports. Those with serious weaknesses are typically re-inspected within 18 months, while, for those in special measures, the timing of re-inspection depends on the school’s rate of improvement.

It may be a surprise to some that only seven of the 11 are currently deemed inadequate given the weight of evidence stacked against them.

There is some support for the contention that Ofsted inspection ratings, floor target assessments and pre-warning notices do not always link together as seamlessly as one might imagine, although apparent inconsistencies may sometimes arise from the chronological sequence of these different judgements.

But what do these 11 reports say, if anything, about the performance of high attainers? Is there substantive evidence of a stronger focus on ‘the most able’ in those reports that have been issued since September 2013?

.

The Content of Ofsted Inspection Reports

Table 6, below, sets out what each report contains on this topic, presenting the schools in the order of their most recent inspection.

One might therefore expect the judgements to be more specific and explicit in the three reports at the foot of the table, which should reflect the new guidance introduced last September. I discussed that guidance at length in this October 2013 post.

.

Table 6: Specific references to high attainers/more able/most able in inspection reports

Merchants Academy (29/6/11; Good (2)): ‘In Year 9… an impressive proportion of higher-attaining students… have been entered early for the GCSE examinations in mathematics and science. Given their exceptionally low starting points on entry into the academy, this indicates that these students are making outstanding progress in their learning and their achievement is exceptional. More-able students are fast-tracked to early GCSE entry and prepared well to follow the International Baccalaureate route.’

Fearns Community Sports College (28/6/12; Requires improvement (3)): ‘Setting has been introduced across all year groups to ensure that students are appropriately challenged and supported, especially more-able students. This is now beginning to increase the number of students achieving higher levels earlier in Key Stage 3.’

The Coseley School (13/9/12; Special measures (4)): ‘Teaching is inadequate because it does not always extend students, particularly the more able. What does the school need to do to improve further? Raise achievement, particularly for the most able, by ensuring that:

  • work consistently challenges and engages all students so that they make good progress in lessons
  • challenging targets are set as a minimum expectation
  • students do not end studies in English language and mathematics early without having the chance to achieve the best possible grade
  • GCSE results in all subjects are at least in line with national expectations.

Target setting is not challenging enough for all ability groups, particularly for the more-able students who do not make sufficient progress by the end of Key Stage 4.’

Gloucester Academy (4/10/12; Serious weaknesses (4)): No specific reference.

Fleetwood High School (20/3/13; Serious weaknesses (4)): No specific reference.

Aireville School (15/5/13; Special measures (4)): ‘Teachers tend to give the same task to all students despite a wide range of ability within the class. Consequently, many students will complete their work and wait politely until the teacher has ensured the weaker students complete at least part of the task. This limits the achievement of the more-able students and undermines the confidence of the least-able. There is now a good range of subjects and qualifications that meet the diverse needs and aspirations of the students, particularly the more-able students.’

Manchester Creative and Media Academy for Boys (13/6/13; Serious weaknesses (4)): ‘The most-able boys are not consistently challenged to attain at the highest levels. In some lessons they work independently and make rapid progress, whereas on other occasions their work is undemanding. What does the academy need to do to improve further? Improve the quality of teaching in Key Stages 3 and 4 so that it is at least good leading to rapid progress and raised attainment for all groups of boys, especially in English, mathematics and science by… ensuring that tasks are engaging and challenge all students, including the most-able. The most-able boys receive insufficient challenge to enable them to excel. Too many lessons do not require them to solve problems or link their learning to real-life contexts. In some lessons teachers’ planning indicates that they intend different students to achieve different outcomes, but they provide them all with the same tasks and do not adjust the pace or nature of work for higher- or lower-attaining students. This results in a slow pace of learning and some boys becoming frustrated.’

Bishop of Rochester Academy (28/6/13; Requires improvement (3)): No specific reference.

John Spendluffe Foundation Technology College (18/9/13; Good (2)): ‘Not enough lessons are outstanding in providing a strong pace, challenge and opportunities for independent learning, particularly for the most able. The 2013 results show a leap forward in attainment and progress, although the most able could still make better progress. Leadership and management are not outstanding because the achievement of pupils, though improving quickly, has not been maintained at a high level over a period of time, and a small number of more-able students are still not achieving their full potential.’

The Mirus Academy (7/11/13; Special measures (4)): ‘The academy’s early entry policy for GCSE has made no discernible difference to pupils’ achievement, including that of more able pupils.’

Parklands High School (5/12/13; Special measures (4)): ‘The achievement of students supported by the pupil premium generally lags behind that of their classmates. All groups, including the most able students and those who have special educational needs, achieve poorly. Students who join the school having achieved Level 5 in national Key Stage 2 tests in primary school fare less well than middle attainers, in part due to early GCSE entry. They did a little better in 2013 than in 2012.’

.

There is inconsistency within both parts of the sample – the first eight reports that pre-date the new guidance and the three produced subsequently.

Three of the eleven reports make no specific reference to high attainers/most able learners, all of them undertaken before the new guidance came into effect.

In three more cases the references are confined to early entry or setting, one of those published since September 2013.

Only four of the eleven make what I judge to be substantive comments:

  • The Coseley School (special measures) – where the needs of the most able are explicitly marked out as an area requiring improvement;
  • The Manchester Creative and Media Academy for Boys (serious weaknesses) – where attention is paid to the most able throughout the report;
  • John Spendluffe Foundation Technology College (good) – which includes some commentary on the performance of the most able; and
  • Parklands High School (special measures) – which also provides little more than the essential minimum coverage.

The first two predate the new emphasis on the most able, but they are comfortably the most thorough. It is worrying that not all reports published since September are taking the needs of the most able as seriously as they might.

One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.

.

Conclusion

This post established an illustrative floor target to identify a small sample of 20 schools that have demonstrated particularly poor performance with high attainers in the Performance Tables for 2011, 2012 or 2013.

It:

  • Compared the performance of these schools in the year in which they fell below the floor, noting significant variance by year and between institutions, but also highlighting the fact that the proportion of high attainers attending these schools is significantly lower than the national average for state-funded schools.
  • Examined the subsequent performance of schools below the illustrative floor in 2011 and 2012, finding that almost all made significant improvements in the year immediately following, but that some of the 2011 cohort experienced difficulty in sustaining this improvement across all elements into a second year. It seems that progress in English, maths or both is more vulnerable to slippage than the 5+ A*-C GCSE threshold measure.
  • Confirmed – most reassuringly – that every school in the sample fell below the official, generic floor targets in the year in which they also undershot my illustrative high attainer floor targets.
  • Reviewed the combination of assessments and interventions applied to the sample of schools since 2011, specifically the interaction between academisation, floor targets, Ofsted inspection and (pre)warning notices for academies. These do not always point in the same direction, although chronology can be an extenuating factor. New guidance about how these and other provisions apply and interact would radically improve transparency in a complex and politically charged field.
  • Analysed the coverage of high attainers/most able students in recent inspection reports on 11 schools from amongst the sample of 20, including three published after September 2013 when new emphasis on the most able came into effect. This exposed grave inconsistency in the scope and quality of the coverage, both before and after September 2013, which did not correlate with the grade of the inspection. Inspectors would benefit from succinct additional guidance.

In the process of determining which schools fell below my high attainers floor target, I also identified the schools that undershot one or other of the elements but not both. This wider group included 46 schools in 2011, 52 schools in 2012 and 34 schools in 2013.

Several of these schools reappear in two or more of the three years, either in their existing form or following conversion to academy status.

Together they constitute a ‘watch list’ of more than 100 institutions, the substantial majority of which remain vulnerable to continued underperformance with their high attainers for the duration of the current accountability regime.

The chances are that many will also continue to struggle following the introduction of the new ‘progress 8’ floor measure from 2015.

Perhaps unsurprisingly, the significant majority are now sponsored academies.

I plan to monitor their progress.

.

*Apologies for this rather tabloid title!

.

GP

February 2014

The 2013 Transition Matrices and High Attainers’ Performance

.


Since last year’s post on the Secondary Transition Matrices attracted considerable interest, I thought I’d provide a short commentary on what the 2013 Matrices – primary and secondary – tell us about the national performance of high attainers.

This note is a postscript to my recent offerings and completes the set of benchmarking resources that I planned to make available.

I am using the national matrices rather than the interactive matrices (which, at the time of writing, are not yet available for 2013 results). I have included a few figures from the 2012 national matrices for comparative purposes.

According to RAISEonline, the national matrices are derived from the data for ‘maintained mainstream and maintained and non-maintained special schools’.

They utilise KS2 fine points scores as set out below.

| Sub-level | Points | Fine points range |
|---|---|---|
| 6 | 39 | 36–41.99 |
| 5A | – | 34–35.99 |
| 5B | 33 | 32–33.99 |
| 5C | – | 30–31.99 |
| 4A | – | 28–29.99 |
| 4B | 27 | 26–27.99 |
| 4C | – | 24–25.99 |
| 3A | – | 22–23.99 |
| 3B | 21 | 20–21.99 |
| 3C | – | 18–19.99 |
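For illustration only, the banding above can be expressed as a simple lookup. This is a sketch assuming each range is a half-open interval (so 41.99 stands for ‘anything below 42’); the function name is mine.

```python
# Map a KS2 fine points score to its sub-level, using the bands
# tabulated above. Each range is treated as [lower, upper).

BANDS = [
    ("6",  36.0, 42.0),
    ("5A", 34.0, 36.0),
    ("5B", 32.0, 34.0),
    ("5C", 30.0, 32.0),
    ("4A", 28.0, 30.0),
    ("4B", 26.0, 28.0),
    ("4C", 24.0, 26.0),
    ("3A", 22.0, 24.0),
    ("3B", 20.0, 22.0),
    ("3C", 18.0, 20.0),
]

def sub_level(fine_points):
    for label, lower, upper in BANDS:
        if lower <= fine_points < upper:
            return label
    return None  # outside the tabulated range

print(sub_level(31.5))  # 5C
```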

.

2013 Primary Transition Matrices

The Primary Matrices track back the KS1 performance of learners completing KS2 tests in 2013.

.

Reading

.

[2013 primary reading transition matrix]

This table shows that:

  • 12% of KS1 learners with L4 reading secured the minimum expected 2 levels of progress by securing L6 at the end of KS2. It is not possible for such learners to make more than 2 levels of progress. Almost all the remaining 88% of Level 4 learners made a single level of progress to Level 5.
  • By comparison, just 1% of learners achieving Level 3 in KS1 made 3 levels of progress to Level 6 (the same percentage as in 2012).
  • 87% of KS1 learners achieving L3 in reading secured the expected 2 or more levels of progress, 85% of them making 2 levels of progress to L5. However, some 13% made only 1 level of progress to L4. (In 2012, 89% of those with L3 at KS1 secured L5 and 10% reached L4.)
  • The proportion of learners with L3 in reading at KS1 who made the expected 2 levels of progress was lower than the proportions of learners with L2 overall, L2A, or L2B doing so. The proportion exceeding 2 levels of progress was far higher for every other level of KS1 achievement. (This was also true in 2012.)
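The progress judgements in these bullets all rest on the same simple arithmetic: levels of progress is the difference in whole levels between KS1 and KS2, with two or more levels counting as expected progress. A minimal sketch (the sub-levels in the matrices refine the KS1 starting point but do not change the level count):

```python
# Levels of progress from KS1 to KS2 is the difference in whole levels;
# 2+ levels counts as expected progress.

def levels_of_progress(ks1_level, ks2_level):
    return ks2_level - ks1_level

def made_expected_progress(ks1_level, ks2_level):
    return levels_of_progress(ks1_level, ks2_level) >= 2

# A learner with L3 at KS1: L5 is exactly 2 levels (expected), L6 is 3,
# while stopping at L4 is only 1 level.
for ks2 in (4, 5, 6):
    print(ks2, levels_of_progress(3, ks2), made_expected_progress(3, ks2))
```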

.

Writing

.

[2013 primary writing transition matrix]

This table shows that:

  • 61% of learners achieving L4 in writing at KS1 made the requisite 2 levels of progress to L6 at KS2. Such learners are unable to make more than 2 levels of progress. The remaining 39% of L4 learners made a single level of progress to L5.
  • This compares with 9% of learners with L3 at KS1 who made 3 levels of progress to L6 (up from 6% in 2012). A further 2% of learners with L2A made 4 levels of progress to L6.
  • 89% of learners with L3 in KS1 writing made the expected 2 or more levels of progress, 80% of them making 2 levels of progress to L5. But 11% made only a single level of progress to L4. (In 2012, 79% of those with L3 at KS1 reached L5 and 15% made only L4.)
  • The proportion of learners with L3 at KS1 in writing achieving the expected 2 levels of progress was lower than the proportions of learners with L2 overall, L2A or L2B, or even L1 doing so. The proportion exceeding 2 levels of progress was far higher for every other level of KS1 achievement with the exception of L2C. (A similar pattern was evident in 2012.)

.

Maths

.

[2013 primary maths transition matrix]

This table shows that:

  • 89% of those achieving L4 in maths at KS1 made the requisite 2 levels of progress to L6 in KS2. These learners are unable to make more than 2 levels of progress. But the remaining 11% of those with L4 at KS1 made only a single level of progress to KS2 L5.
  • This compares with 26% of learners at L3 in KS1 maths who made 3 levels of progress to KS2 L6 (up significantly from 14% in 2012). In addition, 4% of those at KS1 L2A and 1% of those at 2B also managed 4 levels of progress to KS2 L6.
  • 90% of learners with L3 in KS1 maths made the expected 2 or more levels of progress, 64% making 2 levels of progress to L5. But the remaining 10% made only a single level of progress to KS2 L4. (In 2012, 74% of those with L3 at KS1 made it to KS2 L5 and 11% secured L4.)
  • The proportion of learners with L3 at KS1 in maths who achieved the expected 2 levels of progress was lower than the proportions of those with KS1 L2A or L2B doing so. The proportion of learners exceeding 2 levels of progress was significantly higher for those with KS1 L2 overall, those with L2A, and even those with L1, but it was lower for those with L2B and especially L2C. (In 2012 the pattern was similar, but the gap between the proportions with L2B and L3 exceeding 2 levels of progress has narrowed significantly.)

.

Key Challenges

The key challenges in respect of high attainers in the primary sector are to:

  • Enable a higher proportion of learners with L4 at KS1 to make the expected 2 levels of progress to KS2 L6. There is a particular problem in reading where 88% of these learners are making a single level of progress.
  • Enable a higher proportion of learners with L3 at KS1 to make 3 levels of progress to KS2 L6. Reading is again the least advanced, but there is huge scope for improvement across the board. Efforts should be made to close the gaps between L2A and L3 making three levels of progress, which currently stand at 55 percentage points (reading), 49 percentage points (writing) and 30 percentage points (maths) – the arithmetic is sketched after this list. While the KS2 L6 tests remain in existence, increasing take-up should secure further improvement.
  • Increase the proportions of learners with L3 at KS1 making 2 levels of progress so they are comparable with what is achieved by those with L2A and L2B at KS1. There are currently gaps of 11 percentage points (reading), 10 percentage points (writing) and 9 percentage points (maths) between those with L3 and those with L2A. The gaps between those with L3 and those with L2B are 5 percentage points (reading), 8 percentage points (writing) and 1 percentage point (maths).
  • Ensure that far fewer learners with L3 at KS1 manage only a single level of progress across KS2. The current levels – 13% in reading, 11% in writing and 10% in maths – are unacceptable.
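As a worked check on the gap figures in the second challenge, the quoted gaps can be reproduced with a couple of lines. The L3 figures come from the matrices above; the L2A figures shown are those implied by the quoted gaps, since the matrices themselves are not reproduced in full here.

```python
# Gap between L2A and L3 learners making 3 levels of progress: from L2A
# that means reaching L5, from L3 it means reaching L6.

l3_three_lop = {"reading": 1, "writing": 9, "maths": 26}  # % of L3 reaching L6
gap_pp = {"reading": 55, "writing": 49, "maths": 30}      # quoted gaps

for subject, l3 in l3_three_lop.items():
    l2a = l3 + gap_pp[subject]  # implied % of L2A making 3 levels of progress
    print(f"{subject}: L2A {l2a}% - L3 {l3}% = {l2a - l3} percentage points")
```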

.

Secondary

The Secondary Matrices track back the KS2 performance of learners completing GCSEs in 2013.

.

English

.

[2013 secondary English sub-levels transition matrix]

The table shows:

  • 97% of KS2 learners achieving 5A in English secured at least 3 levels of progress from KS2 to KS4 in 2013. This compares with 92% of learners achieving 5B and 74% of learners achieving 5C. (The comparable figures in 2012 were 98%, 92% and 70% respectively.)
  • 89% of KS2 learners achieving 5A in English achieved 4 or more levels of progress from KS2 to KS4 in 2013, so achieving an A* or A grade at GCSE, compared with 66% of those achieving 5B and 33% of those achieving 5C. (The comparable figures in 2012 were 87%, 64% and 29% respectively.)
  • The percentages of learners with 4A in English at KS2 who completed 3 and 4 or more levels of progress – 87% and 46% respectively – were significantly higher than the comparable percentages for learners achieving 5C.
  • 53% of KS2 learners achieving 5A in English made 5 levels of progress by achieving A* at GCSE, compared with 23% of those achieving 5B and 6% of those achieving 5C. (These are significantly higher than the comparable figures for 2012, which were 47%, 20% and 4% respectively).
  • 1% of learners achieving 5A at KS2 made only two levels of progress to GCSE grade C, compared with 6% of those with 5B and 22% of those with 5C. (These percentages have fallen significantly compared with 2012, when they were 3%, 13% and 30% respectively.)

.

Maths

[2013 secondary maths sub-levels transition matrix]

This table shows:

  • 97% of those achieving 5A in maths secured at least 3 levels of progress from KS2 to KS4, whereas 88% of learners achieving 5B did so and 70% of learners at 5C. (The comparable 2012 figures were 96%, 86% and 67% respectively.)
  • 85% of KS2 learners achieving 5A in maths made 4 or more levels of progress in 2013 to GCSE A* or A grades, compared with 59% of those at 5B and 31% of those at 5C. (The comparable 2012 figures were 84%, 57% and 30%.)
  • The percentages of learners achieving 4A in maths at KS2 who completed 3 and 4 or more levels of progress – 91% and 43% respectively – were significantly higher than the percentages of those with 5C who did so.
  • 53% of KS2 learners with 5A in maths made 5 levels of progress to achieve an A* grade in maths, compared with 22% of those with 5B and 6% of those with 5C. (The comparable figures for 2012 were 50%, 20% and 6% respectively).
  • 3% of learners with 5A at KS2 made only two levels of progress to GCSE grade C, compared with 11% of those with 5B and 27% of those with 5C. (These percentages were 3%, 13% and 30% in 2012.)

.

Key challenges

The key challenges in respect of high attainers in the secondary sector are to:

  • Ensure that, so far as possible, all learners with L5 at KS2 make at least 3 levels of progress to at least GCSE grade B. Currently more than 1 in 5 students with Level 5C fail to achieve this in English and more than 1 in 4 fail to do so in maths. Moreover, more than 1 in 10 of those with 5B at KS2 fall short of 3 levels of progress in maths. This is disappointing.
  • Ensure that a higher proportion of learners with L5 at KS2 make 4 and 5 levels of progress. The default expectation for those with L5A at KS2 should be an A* grade at GCSE (5 levels of progress), while the default for those with L5B at KS2 should be at least grade A at GCSE (4 levels of progress). Currently 47% of those with L5A are falling short of A* in both English and maths, while 34% of those with L5B are falling short of A*/A in English and 41% are doing so in maths.
  • Narrow the gaps between the performance of those with L5C at KS2 and those with L4A. Currently there are 13 percentage point gaps in English both between the proportions making the expected 3 levels of progress and between the proportions exceeding 3 levels of progress, while in maths there are gaps of 21 percentage points between those making 3 levels of progress and 12 percentage points between those exceeding it. (The level arithmetic behind these progress judgements is sketched below.)
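The level arithmetic referred to above can be sketched as follows. This assumes the standard DfE expected-progress ladder, under which KS2 L3 to grade D, L4 to C and L5 to B each count as three levels of progress; the function name and rank values are illustrative only.

```python
# KS2-to-GCSE levels of progress under the standard expected-progress
# ladder (L3 -> D, L4 -> C, L5 -> B all equal 3 levels of progress).

GRADE_RANK = {"U": 0, "G": 1, "F": 2, "E": 3, "D": 4,
              "C": 5, "B": 6, "A": 7, "A*": 8}

def ks2_to_ks4_progress(ks2_level, gcse_grade):
    # Offset chosen so that L4 -> C comes out at exactly 3 levels.
    return GRADE_RANK[gcse_grade] - ks2_level + 2

print(ks2_to_ks4_progress(5, "B"))   # 3, the expected minimum from L5
print(ks2_to_ks4_progress(5, "A"))   # 4
print(ks2_to_ks4_progress(5, "A*"))  # 5
print(ks2_to_ks4_progress(5, "C"))   # 2, falls short of expected progress
```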

.

GP

January 2014

High Attainment in the 2013 Secondary and 16-18 Performance Tables

.

.

This post reviews high attainment and high attaining student data in the 2013 Secondary and 16-18 Performance Tables, relating to GCSE and A level respectively. It compares key outcomes with those reported in last year’s tables.

It also draws extensively on two accompanying statistical publications, deriving three-year trends from these and the comparable 2011 and 2012 publications, focused primarily on variations by sector and school admission arrangements.

This post complements a briefer analysis of High Attainment in the 2013 Primary School Performance Tables published on 12 December 2013 and updates last year’s High Attaining Students in the 2012 Secondary School Performance Tables (January 2013).

This year’s secondary/post-16 analysis is presented in a somewhat different format, organised into sections relating to key measures, beginning with GCSE and moving on to A level.

A few preliminaries:

There are sometimes discrepancies between the figures given in the Tables and those in the supporting statistical publications that I cannot explain.

The commentary highlights results – some extraordinarily good, others correspondingly poor – from specific institutions identified in the Tables. This adds some richness and colour to what might otherwise have been a rather dry post.

But there may of course be extenuating circumstances to justify particularly poor results which are not allowed for in the Tables. Equally, strong results may not always be solely attributable to the quality of education provided in the institution that secures them.

As always, I apologise in advance for any transcription errors and urge you to report them through the comments facility provided.

Those who prefer not to read the full post will find a headline summary immediately below. The main text provides additional detail but is intended primarily for reference purposes.

.

Headlines

.

Media Coverage

There has been relatively little media coverage of what the Performance Tables reveal about the achievement of high attainers, though one article appeared in the Daily Telegraph.

It said that the high attainer population comprised some 175,800 students, of whom:

  • about 9,300 ‘failed to gain five good GCSEs at grades A* to C, including English and maths’;
  • around 48% (over 84,200) did not pass the EBacc and 35% [approaching 62,000] did not enter all the necessary subjects;
  • almost 14% [24,000] ‘effectively went backwards in English by gaining lower scores at GCSE level than comparable tests taken at 11’, while 12% [21,000] did so in maths.

The figures in square brackets are my own, derived from the percentages provided in the article.
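For transparency, each bracketed figure is simply the quoted percentage applied to the article’s cohort figure of 175,800:

```python
# Back-of-envelope derivation of the bracketed estimates above.

cohort = 175_800

print(round(0.48 * cohort))  # 84384 -> 'over 84,200'
print(round(0.35 * cohort))  # 61530 -> 'approaching 62,000'
print(round(0.14 * cohort))  # 24612 -> roughly 24,000 (the 14% is 'almost')
print(round(0.12 * cohort))  # 21096 -> roughly 21,000
```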

The final point suggests that sizeable minorities of high attainers achieved the equivalent of Level 4 in English and maths GCSEs, but this is incorrect.

These figures relate to the proportion of high attainers who did not make at least three levels of progress from KS2 to KS4 in English and maths (see below) – quite a different matter.

.

Headlines from this analysis

The following extended bullet points summarise the key findings from my own analysis:

  • The high attainer population constitutes almost exactly one third of the population of mainstream state-funded schools. A gender gap that had almost closed in 2012 has widened again in favour of girls. There are significant variations between school types – for example just over 20% of students attending sponsored academies are high attainers compared with just under 40% in academy converters. The population in free schools, UTCs and studio schools has fallen by 11.5% since 2012, presumably as a consequence of the sector’s rapid expansion. Only 90% of the selective school cohort constitutes high attainers, which suggests 10% of their intake are middle attainers who perform well on ability-based 11+ assessments. The selective school high attainer population has fallen by 1.4% since 2011. Ten selective schools record that their cohort consists entirely of high attainers, but some selective schools register a cohort in which two-thirds or fewer students are high attainers. This is comfortably lower than some non-selective schools, raising awkward questions about the nature of selective education. Although there are no schools with no high attainers, two schools recorded 3% and 99 have fewer than 10% (down from 110 in 2012). Academies in coastal towns are well-represented. The schools containing these small high attaining groups demonstrate huge variations in high attainer performance. This warrants further investigation.
  • 60.6% of all students in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths, a 1.8% improvement compared with 2012. But the success rate for high attainers improved by only 0.7% to 94.7%. This is better than the 0.2% fall in the success rate amongst low attainers but falls well short of the 2.3% improvement for middle attainers. One in every twenty high attainers continues to miss this essential benchmark. But 50 more schools recorded 100% returns than in 2012 – and 19 fewer schools were at 75% or below. Apart from selective schools falling foul of the ineligibility of some IGCSEs, Ark King’s Academy in Birmingham was the lowest performer at 33%. Trends vary considerably according to school type. Free schools, UTCs and studio schools have improved by 4.2% since 2012, which must be partly a consequence of the growth of that sector. Meanwhile high attainers in selective schools have fallen back by 2.0% (and selective schools overall by 2.3%) since 2011. It is unlikely that idiosyncratic IGCSE choices are solely responsible. The profiles of sponsored and converter academies are still markedly different, though the gap between their high attainers’ performance has halved since 2011, from 5.3 percentage points to 2.7 percentage points.
  • There were big increases in the percentage of all students entered for all EBacc subjects in state-funded schools – up 12.4% to 35.5% – and the percentage successful – up 6.6% to 22.8%. The comparable entry and success rates for high attainers were 65.0% and 52.1% respectively. The entry rate for 2012 was 46.3%, so that has improved by almost 19 percentage points, a much faster rate of improvement than the headline figure. The success rate has improved from 38.5% last year, so by 13.6 percentage points, more than double the improvement in the headline figure. The EBacc is clearly catching on for high attainers following a relatively slow start. That said, one could make a case that the high attainer success rate in particular remains rather disappointing, since something like one in five high attainers entered for the EBacc fail to convert entry into achievement. Forty-seven schools entered all of their high attainers but only four recorded 100% success, two selective (Chelmsford County High for Girls and Queen Elizabeth’s Barnet) and two comprehensive (St Ursula’s Convent School in Greenwich and The Urswick School in Hackney). Only 55 schools entered no high attainers for the EBacc, compared with 186 in 2012. Seventy-nine schools recorded 0% of high attainers achieving the EBacc, also down significantly, from 235 in 2012. Seven of these were grammar schools, presumably all falling foul of IGCSE restrictions.
  • 70.4% of all students in state-funded schools made at least the expected three levels of progress in English and 70.7% did so in maths. These constitute improvements of 2.4% and 2.0% respectively. High attainers registered 86.2% success in English and 87.8% in maths. Their rates of improvement were broadly comparable with the headline figures, though slightly stronger in English. It remains disturbing that one in seven high attainers fail to make the expected progress in English and 1 in 8 fail to do so in maths. More schools achieved 100% success amongst their high attainers on each measure than in 2012 – 108 in English and 120 in maths. Forty-four schools were at or below 50% on this measure in English, some IGCSE-favouring grammar schools amongst them. Apart from those, the worst performer was Gloucester Academy at 28%. In maths 31 schools were at or below this 50% benchmark and the worst performer was Stafford Sports College at 29%. Six schools managed 50% or below in both English and maths, several of them academies. Amongst those at 50% or below in English, 11 had better rates of performance for both their middle and their low attainers than for their high attainers. Amongst those at 50% or below in maths, only one school achieved this feat – St Peter’s Catholic College of Maths and Computing (!) in Redcar and Cleveland. It is a cause for concern that high attainers in English attending selective schools continue to fall back on this measure and that one in five high attainers in sponsored academies, free schools, UTCs and studios is failing to make three levels of progress in English, while the same is true of maths in sponsored academies.
  • 7.5% of students in state-funded schools and colleges achieved grades of AAB or higher at A level with all three in facilitating subjects, an improvement of 0.1% compared with 2012. But the comparable percentage for students who achieved these grades with at least two in facilitating subjects shot up to 12.1%, an improvement of 4.3% on 2012. There are big variations between sectors, with the percentage achieving the former measure ranging from 3.5% (FE colleges) to 10.4% (converter academies). The figure for selective schools is 21.1%. Turning to the latter measure, percentages vary from 5.4% in mainstream sponsored academies to 16.4% in mainstream converter academies, while selective schools stand at 32.4%. Across all sectors, more students achieve grades AAA or higher in any A level subjects than achieve AAB or higher in three facilitating subjects. The proportion of students achieving AAA or higher in any A levels is falling in most sectors and institutional types, except in free schools, UTCs and studios and in FE colleges. The proportion achieving AAB or higher in any subjects is falling except in sponsored academies and FE colleges. Conversely there are improvements for AAB or higher with all three in facilitating subjects in LA-maintained mainstream schools, sponsored academies, sixth form colleges and FE colleges (and also across all comprehensive schools). Across all state-funded mainstream schools, the percentage of A level A* grades has fallen back by 0.5% since 2011 while the percentage of A*/A grades has declined by 0.1%.

The full commentary below names 22 schools which perform particularly badly on one or more GCSE high attainer measures (leaving aside selective schools that have adopted ineligible GCSEs).

Of those 22, only nine are below the floor targets and, of those nine, only four are not already academies. Hence the floor targets regime leaves the vast majority of these schools untouched.

The only hope is that these schools will be caught by Ofsted’s renewed emphasis on the attainment and progress of the ‘most able’ learners (though that provision could do with further clarification as this previous post explained).

 

Definitions

The analysis of GCSE performance is focused primarily on high attainers, while the A level analysis is confined to high attainment.

This is a consequence of the way the two sets of performance tables are constructed (such distinctions were brought out more fully in this October 2013 post.)

There is no coverage of A*/A performance at GCSE within the Secondary Tables so we must necessarily rely on performance against standard measures, such as 5+ GCSEs at A*-C including English and maths and the English Baccalaureate (EBacc).

The Government response to the consultation on secondary accountability reform suggests that this will remain the case, with material about achievement of top GCSE grades confined to the supporting Data Portal. It remains to be seen whether this arrangement will give high attainment the prominence it needs and deserves.

The current definition of high attainers is based on prior performance at the end of KS2. Most learners will have taken these KS2 tests five years previously, in 2008:

  • High attainers are those who achieved above Level 4 in KS2 tests – ie their average point score (APS) in English, maths and science tests was 30 or higher.
  • Middle attainers are those who achieved at the expected Level 4 in KS2 tests – ie their APS in these tests was between 24 and 29.99 – and
  • Low attainers are those who achieved below Level 4 in KS2 tests – ie their APS in these tests was under 24.

Since high attainers are determined on the basis of APS across three subjects, the definition will include all-rounders who achieve good (if not outstanding) results across all three tests, as well as some with a relatively spiky achievement profile who compensate for middling performance in one area through very high attainment in another.

Conversely, learners who are exceptionally strong in one subject but relatively poor in the other two are unlikely to pass the APS 30 threshold.
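Expressed as code, the banding is a one-line classification on APS. A minimal sketch (the function name is mine), including one all-rounder profile and two spiky profiles of the kind just described:

```python
# Classify prior attainment from KS2 average points score (APS) across
# English, maths and science, using the bands defined above.

def attainer_band(aps):
    if aps >= 30:
        return "high"
    elif aps >= 24:
        return "middle"
    return "low"

# An all-rounder with three secure L5s (33 points each) averages 33:
print(attainer_band((33 + 33 + 33) / 3))  # high
# A spiky profile, very strong in one test and middling elsewhere, can
# still clear the threshold: (39 + 27 + 28) / 3 = 31.3...
print(attainer_band((39 + 27 + 28) / 3))  # high
# ...but exceptional in one subject and weaker in the other two may not:
print(attainer_band((35 + 27 + 26) / 3))  # middle (APS 29.3)
```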

Both the Secondary Tables and the associated statistical publications remain bereft of data about the performance of high attainers from disadvantaged backgrounds and how that compares with the performance of their more advantaged high attaining peers.

This is unfortunate, since schools that are bucking the trend in this respect – achieving a negligible ‘excellence gap’ between their high attainers from advantaged and disadvantaged backgrounds – richly deserve to be celebrated and emulated.

At A level a variety of high attainment measures are reported in the statistical publications, but the Performance Tables focus on the achievement of AAB+ grades in the so-called ‘facilitating subjects’.

The Statement of Intent for last year’s Tables confirmed the intention to introduce:

‘Percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, reflecting the subjects and grades most commonly required by Russell Group and other top universities.’

These subjects are listed as ‘biology, chemistry, physics, mathematics, geography, history, English literature, modern and classical languages.’
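To make the measure concrete, here is a hedged sketch of the AAB+ facilitating-subjects check, covering both the ‘all three’ and ‘at least two’ variants. The subject spellings, grade points and function name are my own illustrative choices, not the DfE’s actual coding.

```python
# Does a student's best three A levels meet 'AAB or higher, with at
# least N in facilitating subjects'? Illustrative implementation only.

FACILITATING = {"biology", "chemistry", "physics", "mathematics",
                "geography", "history", "English literature",
                "modern languages", "classical languages"}

GRADE_POINTS = {"A*": 5, "A": 4, "B": 3, "C": 2, "D": 1, "E": 0}

def meets_aab_facilitating(results, min_facilitating=3):
    """results: list of (subject, grade) for the student's best 3 A levels."""
    grades = sorted((GRADE_POINTS[g] for _, g in results), reverse=True)
    # AAB or higher: two grades at A/A* and the third at B or better.
    aab_plus = (len(grades) == 3 and grades[0] >= 4
                and grades[1] >= 4 and grades[2] >= 3)
    n_fac = sum(1 for subject, _ in results if subject in FACILITATING)
    return aab_plus and n_fac >= min_facilitating

print(meets_aab_facilitating(
    [("mathematics", "A"), ("physics", "A"), ("history", "B")]))       # True
print(meets_aab_facilitating(
    [("mathematics", "A"), ("physics", "A"), ("media studies", "B")],
    min_facilitating=2))                                               # True
```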

Such measures have been widely criticised for their narrowness, the Russell Group itself asserting that:

‘It would be wrong to use this simple indicator as a measure of the number of pupils in a school who are qualified to apply successfully to a Russell Group university.’

Nevertheless, they support one of the Government’s preferred Social Mobility Indicators which compares the percentage of students attending state and independent schools who achieve this measure. (In 2012 the gap was 15.1%, a full percentage point smaller than in 2011.)

There is nothing in the 16-18 Tables about high attainers, although the consultation document on 16-19 accountability reform includes a commitment to:

‘Consider how we can report the results of low, middle and high attainers similarly [to KS4] in the expanded 16-19 performance tables’.

At the time of writing, the response to this consultation has not been published.

.

GCSE Achievement

 .

The High Attainer Population

Before examining the performance data it is important to review the size of the high attaining population and how this varies between genders, sectors and types of school.

Tables 1A, B and C below show that the population has remained relatively stable since 2011. It accounts consistently for almost exactly one third of students in state-funded mainstream schools.

The gender gap amongst high attainers has changed slightly since 2011. The percentage of high attaining girls has fallen back, slightly but consistently, while the percentage of high attaining boys increased in 2012, only to fall back again in 2013.

A gender gap that had almost been eliminated in 2012 has now widened again to a full percentage point. The percentages of high attaining learners of both genders are the lowest they have been over the three year period.

There are significant variations according to sector and school type, since the high attainer population in converter academies is almost double that in sponsored academies, where it constitutes barely a fifth of the student body. This is a strikingly similar proportion to that found in modern schools.

The percentage of high attainers in comprehensive schools is only very slightly lower than the overall figure.

At the other end of the spectrum, the high attaining cohort constitutes around 90% of the selective school population, which raises interesting questions about the nature of the other 10% and possible discrepancies between KS2 results and ability-focused 11+ assessment.

It cannot be the case that the majority of the missing 10% attended independent preparatory schools and did not take KS2 tests, since those without test results are excluded from the calculations.

The underlying trend is downward in all types of school. There has been a huge 11.5% fall in the proportion of high attainers in free schools, UTCs and studio schools. This is presumably consequent upon the expansion of that sector and brings it much more into line with the figures for all maintained mainstream and other comprehensive schools.

Otherwise the most substantial reduction has been in converter academies. The percentage in selective schools has fallen by 1.4% since 2011, twice the rate of decline in comprehensive schools.

.

Table 1A: Percentage of high attainers by sector 2011, 2012 and 2013

| | All maintained mainstream | LA maintained mainstream | Sponsored academies | Converter academies | Free schools, UTCs and studio schools |
|---|---|---|---|---|---|
| 2013 | 32.8 | 30.6 | 20.5 | 39.3 | 31.4 |
| 2012 | 33.6 | 32.0 | 20.9 | 42.5 | 42.9 |
| 2011 | 33.5 | N/A | 20.6 | 47.5 | N/A |

.

Table 1B: Percentage of high attainers by admissions practice 2011, 2012 and 2013

| | Selective | Comprehensive | Modern |
|---|---|---|---|
| 2013 | 88.9 | 30.9 | 20.5 |
| 2012 | 89.8 | 31.7 | 20.9 |
| 2011 | 90.3 | 31.6 | 20.4 |

.

Table 1C: Percentage of high attainers by gender, all state-funded mainstream schools 2011, 2012, 2013

| | Boys | Girls |
|---|---|---|
| 2013 | 32.3 | 33.3 |
| 2012 | 33.4 | 33.8 |
| 2011 | 32.6 | 34.4 |

 

The 2013 Performance Tables list 10 schools where 100% of pupils are deemed high attainers, all of which are selective. Thirteen selective schools were in this position in 2012.

But there is also a fairly wide spread amongst selective schools, with some recording as few as 70% high attainers, broadly comparable with some prominent non-selective schools.

For example, Dame Alice Owen’s School and The Cardinal Vaughan Memorial RC School – both comprehensive – have high attainer populations of 79% and 77% respectively, while Fort Pitt Grammar School in Chatham, Kent and Skegness Grammar School are at 65% and 66% respectively.

This raises intriguing questions about the nature of selective education and the dividing line between selective and non-selective schools.

At the other extreme there are no schools recording zero high attainers, but 99 record 10% or fewer, several of them prominent academies. This is an improvement on 2012 when 110 schools fell into this category.

The two schools with the fewest high attainers (3%) are Barnfield Business and Enterprise Studio (which opened in 2013) and St Aldhelm’s Academy in Poole.

Academies based in coastal towns are well represented amongst the 99.

It is interesting to speculate whether very small high attainer cohorts generally perform better than slightly larger cohorts that perhaps constitute a ‘critical mass’.

Certainly there are huge variations in performance on the key measures amongst those schools with few high attainers (where results have not been suppressed). This is particularly true of EBacc entry and success rates.

For example, amongst the 50 schools with the fewest high attainers:

  • the EBacc success rate varies from 73% at Aston Manor Academy to zero, returned by 12 of the 50.
  • The percentage of high attaining pupils making the expected progress in English varies from 50% to 100% while the corresponding range in maths is from 47% to 100%.

Many of these figures are derived from very small cohorts (all are between 1 and 20), but the point stands nevertheless.

.

The Performance of High Attainers

As noted above, there are no true high attainment measures relating to the achievement of GCSE A*/A grades within the Secondary Tables, so this section is necessarily reliant on the universal measures they contain.

.

5+ GCSEs at Grades A*-C including English and maths

The 2013 Secondary Performance Tables reveal that:

  • 53.6% of students at state-funded schools achieved 5+ GCSEs at Grades A*-C including English and maths, up 1.7% from 51.9% in 2012.
  • But the Tables pay more attention to the percentage achieving 5+ GCSEs at Grades A*-C (or equivalent) including GCSEs in English and maths: 60.6% of students attending state-funded schools achieved that measure in 2013, up 1.8% from 58.8% in 2012.
  • 94.7% of high attainers in state-funded schools secured this outcome, an improvement on 94.0% in 2012. The comparable figures for middle attainers and low attainers (with 2012 figures in brackets) are 57.4% (55.1%) and 6.9% (7.1%) respectively. Hence the overall increase of 1.8% masks a slight fall amongst low attainers and a significantly smaller increase amongst high attainers. Although there has been improvement, one in every 20 high attainers continues to fall short.
  • But it is notable that around 530 schools achieved 100% amongst their high attainers on this measure, compared with some 480 in 2012. Moreover, only 14 schools are at or below 67%, compared with 19 in 2012, and 47 are at or below 75% compared with 66 in 2012. This is positive news and suggests that the inclusion of the distinction within the Tables is beginning to bear fruit.

Tables 2A and 2B below show there has been an increase of 3.5% on this measure since 2011 across all pupils in state-funded mainstream schools. Meanwhile the proportion of high attainers securing this outcome has fallen by 0.4% over the same period, dipping more sharply in 2012 before partially recovering in 2013.

It may well be harder for schools to eradicate the last vestiges of underachievement at the top end than to strengthen performance amongst middle attainers, where there is significantly more scope for improvement. But some may also be concentrating disproportionately on those middle attainers.

This overall picture masks very different trends in different types of school.

In sponsored academies an overall improvement of 4.8% coincides with a slight 0.1% fall amongst high attainers, who have recovered following a substantial dip in 2012.

But in converter academies the overall success rate has fallen by almost 9% since 2011, while the rate for high attainers has fallen by only 2.7%.

And in free schools, UTCs and studios a slight overall fall since 2012 (there are no figures for 2011) is accompanied by an improvement for high attainers of over 4%.

Comprehensive schools have improved by 2.6% overall since 2011, yet their high attainers have fallen back by 0.3%. In selective schools the overall rate has fallen back by 2.3% while the high attainer rate has dropped by a similar 2.0%. This is concerning.

It is not straightforward to work out what is happening here, though the changing size of different sectors must be having a significant impact. 2012 GCSE results in English will certainly have influenced the dip in last year’s figures.

High attainers in free schools, UTCs and studios still have some ground to make up on other sectors and it will be interesting to see whether their improving trend will continue in 2014.

.

Table 2A: Percentage achieving 5+ A*-C grades (or equivalent) including English and maths by sector

| | All state-funded mainstream (All / HA) | LA maintained mainstream (All / HA) | Sponsored academies (All / HA) | Converter academies (All / HA) | Free schools, UTCs and studio schools (All / HA) |
|---|---|---|---|---|---|
| 2013 | 61.7 / 94.7 | 59.2 / 94.1 | 51.2 / 93.0 | 68.2 / 95.7 | 54.6 / 91.7 |
| 2012 | 59.8 / 94.0 | 58.2 / 93.5 | 49.3 / 91.5 | 68.4 / 95.5 | 55.7 / 87.5 |
| 2011 | 58.2 / 95.1 | N/A | 46.8 / 93.1 | 77.1 / 98.4 | N/A |

.

Table 2B: Percentage achieving 5+ A*-C grades (or equivalent) including English and maths by admissions practice

| | Selective (All / HA) | Comprehensive (All / HA) | Modern (All / HA) |
|---|---|---|---|
| 2013 | 96.4 / 97.3 | 60.4 / 94.5 | 55.3 / 92.5 |
| 2012 | 97.4 / 98.2 | 58.5 / 93.5 | 53.1 / 92.2 |
| 2011 | 98.7 / 99.3 | 57.8 / 94.8 | 50.8 / 91.8 |

.

A*-C grades in GCSE English and maths

According to the 2013 Secondary Tables:

  • 61.3% of all students in state-funded schools achieved GCSE grades A*-C in English and maths, compared with 59.5% in 2012, an improvement of 1.8%.
  • However, 95.1% of high attainers in state-funded schools achieved this measure compared with 94.3% in 2012, an increase of only 0.8%. The comparable figures for middle and low attainers (with 2012 figures in brackets) were 58.5% (55.8%) and 7.1% (7.3%) respectively. The pattern is therefore similar to the 5A*-C measure, with limited improvement at the top, significant improvement in the middle and a slight decline at the bottom.
  • Some 610 state-funded schools had 100% of their high attainers achieve this outcome, a significant improvement on the 530 recorded in 2012. There were 12 schools where the percentage was 67% or lower, compared with 18 in 2012, and 38 where the percentage was 75% or lower, compared with almost 60 in 2012.
  • These latter figures include Pate’s, King Edward VI Camp Hill and Bishop Wordsworth’s, all presumably tripped up again by their choice of IGCSE. Other poor performers were Gloucester Academy (again) at 44% and St Antony’s Catholic College in Trafford (59%). The worst performers were relatively stronger than their predecessors from 2012.

The trend data derived from the associated statistical publications shows that the overall figure for high attainers in state-funded schools has increased by 0.9% compared with 2012, recovering most of the 1.2% dip that year compared with 2011.

Sponsored academies have improved significantly with their high attainers back to 93.5% (their 2011 percentage) following a 1.7% dip in 2012. On the other hand, high attainers in converter academies have made little improvement compared with 2012, while free schools, studio schools and UTCs have improved by 3.9%.

Once again these patterns are probably influenced strongly by change in the size of some sectors and the impact of 2012 GCSE English results.

Interestingly though, selective school high attainers – having managed 99.5% on this measure in 2011 – are continuing to fall back, recording 98.3% in 2012 and now 97.5%. This may have something to do with the increasing attraction of IGCSE.

 .

Entry to and achievement of the EBacc

The 2013 Secondary Tables show that:

  • 35.5% of all students at state-funded schools were entered for all English Baccalaureate subjects, compared with 23.1% in 2012, and 22.8% achieved all EBacc subjects, up 6.6% from 16.2% in 2012.
  • Both entry (65.0%) and success (52.1%) rates continue to be much higher for high attainers than for middle attainers (27.8% entered and 11.8% successful) and low attainers (3.4% entered and 0.5% successful).
  • In 2012, the entry rate for high attainers was 46.3%, so there has been a substantial improvement of almost 19%. The 2012 success rate was 38.5%, so that has improved by 13.6% (the arithmetic is worked through in the sketch after this list).
  • One could reasonably argue that a 52.1% success rate is lower than might be expected and relatively disappointing given that almost two-thirds of high attainers now enter all EBacc subjects. But, compared with the two previous measures, schools are much further away from the 100% ceiling with the EBacc, so further significant improvement amongst high attainers is likely over the next few years. However, the forthcoming shift to the ‘Progress 8’ measure is likely to have a significant impact.
  • 55 schools entered no high attainers for the EBacc, down considerably from 186 in 2012. Zero high attainers achieved the EBacc at 79 schools, compared with 235 in 2012. Several grammar schools were among them.
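A quick check of the EBacc arithmetic referred to in the list above, with figures transcribed from the bullets; the ‘roughly one in five entrants falls short’ reading follows directly from the entry and success rates:

```python
# EBacc entry and success arithmetic for high attainers (figures in percent).
entry_2012, entry_2013 = 46.3, 65.0
success_2012, success_2013 = 38.5, 52.1

print(f"Entry improvement:   {entry_2013 - entry_2012:+.1f}")      # +18.7, 'almost 19%'
print(f"Success improvement: {success_2013 - success_2012:+.1f}")  # +13.6
# Of the high attainers entered, roughly how many achieve all EBacc subjects?
print(f"Conversion rate:     {100 * success_2013 / entry_2013:.1f}%")  # ~80.2%, so about one in five entrants falls short
```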

Tables 3A and B below indicate that, despite rapid improvement since 2012, only a third of high attainers in sponsored academies achieve the EBacc, compared with almost 6 in 10 attending converter academies.

The success rate for high attainers at free schools, UTCs and studios is only slightly higher than that for sponsored academies and both are improving at a similar rate.

Almost exactly half of high attainers at comprehensive schools are successful, as are almost exactly three quarters of high attainers at selective schools, but the rate of improvement is much faster in comprehensive schools – and indeed in modern schools too.

.

Table 3A: Percentages achieving the EBacc by sector

| Year | All state-funded mainstream (All / HA) | LA maintained mainstream (All / HA) | Sponsored academies (All / HA) | Converter academies (All / HA) | Free schools, UTCs and studio schools (All / HA) |
| 2013 | 23.2 / 52.1 | 20.9 / 49.1 | 11.0 / 34.7 | 30.1 / 58.1 | 16.1 / 35.6 |
| 2012 | 16.4 / 38.5 | 14.5 / 35.0 | 6.3 / 21.1 | 25.7 / 49.1 | 12.2 / 23.6 |
| 2011 | 15.6 / 37.2 | N/A | 5.2 / 17.7 | 31.5 / 55.4 | N/A |

.

Table 3B: Percentages achieving the EBacc by admissions practice

| Year | Selective (All / HA) | Comprehensive (All / HA) | Modern (All / HA) |
| 2013 | 71.6 / 74.6 | 21.5 / 49.9 | 12.1 / 33.3 |
| 2012 | 68.2 / 70.7 | 14.5 / 35.0 | 7.2 / 20.7 |
| 2011 | 68.1 / 70.5 | 13.7 / 33.6 | 6.7 / 20.3 |

.

Three Levels of Progress in English and maths

The Tables inform us that:

  • 70.4% of all pupils in state-funded secondary schools made at least three levels of progress in English (up 2.4% from 68% in 2012) and 70.7% did so in maths (up 2.0% from 68.7% in 2012).
  • In both subjects more high attainers made the requisite progress than middle and low attainers: 86.2% in English (up 2.8% on 2012) and 87.8% in maths (up 2.0%). Despite these improvements, it remains the case that approximately one in seven high attainers fail to make the expected progress in English and one in eight fail to do so in maths (derived in the sketch after this list). This is extremely disappointing.
  • There were 108 schools in which every high attainer made the requisite progress in English, up from 93 in 2012. In maths, 120 schools ensured every high attainer made the expected progress, compared with 100 in 2012. A total of 36 schools managed this feat in both English and maths, whereas only 26 did so in 2012.
  • At the three grammar schools we have already encountered, no high attainers made the expected progress in English. Forty-four schools were at or below 50% on this measure, down markedly from 75 in 2012. The worst performer apart from the grammar schools was Gloucester Academy at 28%.
  • Thirty-one schools had 50% or fewer high attainers making the expected progress in maths, an improvement on the 46 registering this result last year. The poorest performer was Stafford Sports College at 29%.
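The ‘one in seven’ and ‘one in eight’ figures above are simply rough reciprocals of the high attainer failure rates. A minimal sketch:

```python
# Derivation of the 'one in seven' (English) and 'one in eight' (maths) claims.
progress_rates = {"English": 86.2, "maths": 87.8}  # % of high attainers making 3 levels of progress, 2013

for subject, rate in progress_rates.items():
    failing = 100 - rate
    print(f"{subject}: {failing:.1f}% not making expected progress ~ one in {100 / failing:.0f}")
# English: 13.8% not making expected progress ~ one in 7
# maths:   12.2% not making expected progress ~ one in 8
```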

Tables 4A and B below contain the trend data for the achievement of three levels of progress in English while Tables 5A and B cover maths.

The figures within these tables are not strictly comparable, since the statistics unaccountably define high attainment slightly differently for the two populations. In the case of the ‘all’ column, they use achievement of Level 5 in the relevant KS2 test (ie English or maths), rather than above Level 4 achievement across all three core subjects, while the definition for the ‘high attainers’ column is the customary one set out above.

Nevertheless, one can see that, overall, the percentage of high attainers meeting this benchmark in English is recovering following a significant fall last year. Free schools, UTCs and studios have just overtaken sponsored academies while converter academies are 3.5 percentage points ahead of LA maintained mainstream schools.

In maths converter academies have an even more substantial 4.5 percentage point lead over LA maintained mainstream schools. Sponsored academies are a full 10 percentage points behind converters and five percentage points behind free schools, UTCs and studios, but the latter category is recording a downward trend while everyone else is moving in the opposite direction.

The fact that one in five high attainers in sponsored academies, free schools, UTCs and studios is failing to make three levels of progress in English is serious cause for concern.  Worse still, the same is true of maths in sponsored academies. This state of affairs requires urgent attention.

It is noticeable that the general recovery in performance amongst high attainers in English does not extend to selective schools, which have fallen back still further since 2012 and are now a full 3.5 percentage points behind their 2011 level. Regardless of causality – and early entry policy as well as the increasing popularity of IGCSE may be involved – this too is a matter for concern. The situation is more positive in maths however.

.

Table 4A: Percentages achieving three levels of progress in English by sector

| Year | All state-funded mainstream (All / HA) | LA maintained mainstream (All / HA) | Sponsored academies (All / HA) | Converter academies (All / HA) | Free schools, UTCs and studio schools (All / HA) |
| 2013 | 79.7 / 86.2 | – / 85.0 | – / 80.7 | – / 88.5 | – / 81.0 |
| 2012 | 76.9 / 83.4 | – / 82.5 | – / 76.0 | – / 86.7 | – / 68.1 |
| 2011 | 69.0 / 87.2 | N/A | – / 79.6 | – / 94.5 | N/A |

(– = no figure published)

.

Table 4B: Percentages achieving three levels of progress in English by admissions practice

| Year | Selective (HA) | Comprehensive (HA) | Modern (HA) |
| 2013 | 93.0 | 85.5 | 81.0 |
| 2012 | 93.4 | 82.3 | 77.1 |
| 2011 | 96.5 | 86.2 | 81.1 |

.

Table 5A: Percentages achieving three levels of progress in maths by sector

| Year | All state-funded mainstream (All / HA) | LA maintained mainstream (All / HA) | Sponsored academies (All / HA) | Converter academies (All / HA) | Free schools, UTCs and studio schools (All / HA) |
| 2013 | 81.7 / 87.8 | – / 86.2 | – / 80.8 | – / 90.7 | – / 85.6 |
| 2012 | 79.7 / 85.8 | – / 84.4 | – / 77.8 | – / 90.2 | – / 87.5 |
| 2011 | 76.8 / 85.2 | N/A | – / 75.6 | – / 93.2 | N/A |

(– = no figure published)

.

Table 5B: Percentages achieving three levels of progress in maths by admissions practice

| Year | Selective (HA) | Comprehensive (HA) | Modern (HA) |
| 2013 | 96.6 | 86.9 | 84.4 |
| 2012 | 95.9 | 84.7 | 80.5 |
| 2011 | 96.6 | 83.9 | 79.3 |

.

Other measures

The Performance Tables show:

  • The average point score (APS) per pupil for the best eight subjects (GCSE only) across all state-funded schools was 280.1, up from 276.7 in 2012. Amongst high attainers this rose to 377.6, up from 375.4 in 2012. Only five schools – all selective – achieved an APS above 450 for their high attainers (eight schools managed this in 2012). The top performer was Colyton Grammar School in Devon. At the other extreme, four schools were at 200 or lower (much reduced from 16 in 2012): Hadden Park High School in Nottingham (160.1), Pent Valley Technology College in Kent (188.5), Aylesford School in Kent (195.7) and Bolton St Catherine’s Academy (198.2). A sketch after this list shows roughly how these APS figures translate into average grades.
  • According to the value added (best 8) measure, the best results for high attainers were achieved by four schools that scored over 1050 (seven schools managed this in 2012). These were Tauheedul Islam Girls High School, Harris Girls’ Academy, East Dulwich, Sheffield Park Academy and Lordswood Girls’ School and Sixth Form Centre. Conversely there were three schools where the score was 900 or below. The lowest VA scores were recorded by Ark Kings Academy; Hadden Park High School; and Manchester Creative and Media Academy for Boys.
  • The Tables also provide an average grade per GCSE per high attainer (uncapped) but, at the time of writing, the relevant column in the Tables refuses to sort in ascending/descending order. This press article draws on another measure – average grade per pupil per qualification (capped at best 8) – to identify Colyton Grammar School as the only state-funded school to achieve an average A* on this measure. It is highly likely that Colyton Grammar will top the rankings for the uncapped high attainer measure too. The article adds that a total of 195 schools (state and independent presumably) achieved an average of either A*, A*-, A+, A or A- on the capped measure, noting that the Hull Studio School posted G- (though with only seven pupils) while two further schools were at E+ and a further 82 schools averaged D grades.
  •  The average number of GCSE entries for high attainers in state-funded schools was 9.9, up slightly from 9.7 in 2012. The highest level of GCSE entries per high attainer was 15.5 at Colyton Grammar School, repeating its 2012 achievement in this respect. At three schools – Hadden Park, Aylesford and Bolton St Catherine’s – high attainers were entered for fewer than five GCSEs (15 schools were in this category last year). One school – Ormesby in Middlesbrough – entered its high attainers for 22 qualifications, which seems a little excessive.
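As a rough guide to what the best eight APS figures above mean, here is a minimal sketch converting a capped (best 8) APS into a nearest average grade. It assumes the pre-2014 GCSE point scores (G = 16 rising in steps of 6 to A* = 58); treat it as an illustration only, not the DfE’s official conversion, and the function name is mine:

```python
# A rough conversion from a capped (best 8) GCSE APS to a nearest average grade,
# assuming the pre-2014 point scores (G = 16, then +6 per grade up to A* = 58).
POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40, "D": 34, "E": 28, "F": 22, "G": 16}

def average_grade(aps_best8: float, entries: int = 8) -> str:
    """Return the grade whose point score is closest to the per-entry average."""
    per_entry = aps_best8 / entries
    return min(POINTS, key=lambda grade: abs(POINTS[grade] - per_entry))

print(average_grade(377.6))  # national high attainer average -> 'B' (47.2 points per entry)
print(average_grade(450.0))  # the five top selective schools -> 'A*' (56.25 points per entry)
```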

 

A Level Achievement

.

Percentage achieving 3+ A levels at AAB+ in facilitating subjects

According to the 16-18 Performance Tables:

  • 7.5% of A level students in all state funded schools and colleges achieved three A levels, all in facilitating subjects, at AAB or higher, up slightly from 7.4% in 2012. This is another column in the Tables that – at the time of writing – will not sort results into ascending/descending order. In 2012 a handful of state-funded institutions achieved 60% on this measure and that is likely to have been repeated. There were also 574 schools and colleges that recorded zero in 2012 and there may have been a slight improvement on that this year.
  • The same problem arises with the parallel measure showing the percentage of students achieving AAB+ with at least two in facilitating subjects. We know that 12.1% of A level students in state funded schools and colleges achieved this, up very significantly from 7.8% in 2012, but there is no information about the performance of individual schools. In 2012 a handful of institutions achieved over 80% on this measure, with Queen Elizabeth’s Barnet topping the state schools at 88%. At the other extreme, there were about 440 schools and colleges which recorded zero in 2012.

Tables 6A and B below show that the success rate on the first of these measures is creeping up in LA-maintained mainstream schools, sponsored academies, sixth form colleges and FE colleges. The same is true of comprehensive schools.

On the other hand, the success rate is falling somewhat in converter academies, free schools, UTCs and studios – and also in selective and modern schools.

It is noticeable how badly sponsored academies fare on this measure, achieving exactly half the rate of LA-maintained mainstream schools.

.

Table 6A: Percentages of students achieving 3+ A levels at AAB+ in facilitating subjects by sector

| Year | All state-funded mainstream | LA maintained mainstream | Sponsored academies | Converter academies | Free schools, UTCs and studio schools | Sixth form colleges | FE colleges |
| 2013 | 8.7 | 7.4 | 3.7 | 10.4 | 5.1 | 6.0 | 3.5 |
| 2012 | 8.6 | 7.2 | 3.4 | 11.4 | 7.5 | 5.8 | 3.3 |
| 2011 | – | – | – | – | – | – | – |

(– = no figure published)

.

Table 6B: Percentages of students achieving 3+ A levels at AAB+ in facilitating subjects in schools by admissions practice

| Year | Selective | Comprehensive | Modern |
| 2013 | 21.1 | 6.8 | 1.0 |
| 2012 | 21.5 | 6.6 | 1.6 |
| 2011 | – | – | – |

.

The 2013 statistics also contain breakdowns for the ‘AAB+ with two in facilitating subjects’ measure, as shown in Table 6C below.

.

Table 6C: Percentages of students achieving 3+ A levels at AAB+ with two in facilitating subjects by sector and admissions practice, 2013 only.

State-funded mainstream schools 13.6
LA-funded mainstream schools 11.4
Sponsored academies (mainstream) 5.4
Converter academies (mainstream) 16.4
Mainstream free schools, UTCs and studios 11.3
Sixth Form Colleges 10.4
FE Colleges 5.8
Selective schools 32.4
Comprehensive schools 10.7
Modern schools 2.0

.

While sponsored academies are achieving unspectacular results – in that they are even further behind LA-funded schools and even below FE colleges on this measure – selective schools are managing to get almost one third of their students to this level.

.

Percentage achieving 3+ A levels at A*/A

The Performance Tables do not include this measure, but it is included in the statistical reports. Tables 7A and B below show the trends – downwards in all sectors and types of school except FE colleges and free schools, UTCs and studios.

It is unclear why their performance should be improving on this measure while declining on the AAB+ in three facilitating subjects measure, though UTCs and studios may simply be less likely to enter their students for facilitating subjects.

.

Table 7A: Percentages of all students achieving 3+ A levels at A*/A by sector

| Year | All state-funded mainstream | LA maintained mainstream | Sponsored academies | Converter academies | Free schools, UTCs and studio schools | Sixth form colleges | FE colleges |
| 2013 | 10.7 | 8.7 | 4.1 | 13.1 | 7.9 | 9.3 | 5.1 |
| 2012 | 10.9 | 9.1 | 4.2 | 14.8 | 6.0 | 9.7 | 5.0 |
| 2011 | 11.4 | 10.2 | 4.9 | – | – | – | – |

(– = no figure published)

.

Table 7B: Percentages of students achieving 3+ A levels at A*/A in schools by admissions practice

| Year | Selective | Comprehensive | Modern |
| 2013 | 27.0 | 8.1 | 1.7 |
| 2012 | 27.7 | 8.3 | 1.9 |
| 2011 | 27.7 | 8.4 | 2.3 |

.

Percentage achieving 3+ A levels at AAB+

Once again, this measure is not in the Tables but is in the statistical bulletin. Tables 8A and B below compare trends. The broad trend is again downwards, although it is being bucked (just) by FE colleges and (more significantly) by sponsored academies. So while sponsored academies are recording slightly lower results at AAA+, their results are improving at AAB+.

 .

Table 8A: Percentages of students achieving 3+ A levels at AAB+ by sector

| Year | All state-funded mainstream | LA maintained mainstream | Sponsored academies | Converter academies | Free schools, UTCs and studio schools | Sixth form colleges | FE colleges |
| 2013 | 17.9 | 15.1 | 7.9 | 21.4 | 13.0 | 16.4 | 9.5 |
| 2012 | 17.9 | 15.4 | 7.5 | 23.4 | 16.4 | 16.8 | 9.4 |
| 2011 | – | – | – | – | – | – | – |

(– = no figure published)

.

Table 8B: Percentages of students achieving 3+ A levels at AAB+ in schools by admissions practice

| Year | Selective | Comprehensive | Modern |
| 2013 | 40.0 | 14.5 | 4.4 |
| 2012 | 40.6 | 14.5 | 4.7 |
| 2011 | 40.9 | 14.8 | 5.4 |

It is interesting to compare 2013 performance across these different high attainment measures and Table 9 below does this, enabling one to see more clearly the differentiated response to facilitating subjects amongst high attainers.

In most parts of the schools sector, the success rate for AAB+ in any subjects is roughly twice that of AAB+ in facilitating subjects, but this is not true of FE and sixth form colleges, nor of free schools, UTCs and studios.

At the same time, every sector and school type shows a higher rate at AAA+ than at AAB+ in three facilitating subjects.

.

Table 9: Percentages of students by sector/school type achieving different A level high attainment measures in 2013

| Sector / school type | AAB+ in 3 FS | AAB+ with 2 in FS | AAB+ | AAA+ |
| All state-funded mainstream | 8.7 | 13.6 | 17.9 | 10.7 |
| LA-funded mainstream | 7.4 | 11.4 | 15.1 | 8.7 |
| Sponsored academies | 3.7 | 5.4 | 7.9 | 4.1 |
| Converter academies | 10.4 | 16.3 | 21.4 | 13.1 |
| Free schools, UTCs and studios | 5.1 | 11.3 | 13.0 | 7.9 |
| Sixth form colleges | 6.0 | 10.4 | 16.4 | 9.3 |
| FE colleges | 3.5 | 5.8 | 9.5 | 5.1 |
| Selective schools | 21.1 | 32.4 | 40.0 | 27.0 |
| Comprehensive schools | 6.8 | 10.7 | 14.5 | 8.1 |
| Modern schools | 1.0 | 2.0 | 4.4 | 1.7 |

(FS = facilitating subjects)

.

Other Measures

Reverting back to the Performance Tables:

  • The APS per A level student across all state-funded institutions is 782.3, up significantly from 736.2 in 2012. The highest APS – 1650.0 – was recorded by Dartford Grammar School. At the other end of the spectrum, Hartsdown Technology College in Kent recorded 252.6.
  • The APS per A level entry across all state-funded institutions was 211.3, compared with 210.2 in 2012. The strongest performer in the maintained sector was Queen Elizabeth’s School, Barnet, which achieved 271.4. The lowest score in the maintained sector was 97.7, at Appleton Academy in Bradford. A sketch after this list shows roughly how APS per entry translates into grades.
  • A new average point score per A level pupil expressed as a grade is dominated by independent schools, but the top state-funded performers – both achieving an average A grade – are Henrietta Barnett and Queen Elizabeth’s Barnet. A handful of schools record U on this measure: Appleton Academy, The Gateway Academy in Thurrock, Hartsdown Technology College and The Mirus Academy in Walsall.
  • A new A level value added measure has also been introduced for the first time. It shows Ripon Grammar School as the top performer scoring 0.61. The lowest score generated on this measure is -1.03 at Appleton Academy, which comes in comfortably below any of its competitors.
  • The Statistical Bulletin also tells us what percentage of A level entries were awarded A* and A grades. Tables 10A and B below record this data and show the trend since 2011. It is evident that A* performance is falling back slightly in every context, with the sole exceptions of FE colleges (a very slight improvement) and free schools, UTCs and studios. A*/A performance is generally holding up better, other than in converter academies.
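As a rough guide to the APS per entry figures above, here is a minimal sketch converting APS per A level entry into the nearest grade. It assumes the pre-2016 A level point scores (E = 150 rising in steps of 30 to A* = 300); an illustration only, not the DfE’s official conversion, and the function name is mine:

```python
# A rough conversion from APS per A level entry to the nearest grade,
# assuming the pre-2016 point scores (E = 150, then +30 per grade up to A* = 300).
POINTS = [("A*", 300), ("A", 270), ("B", 240), ("C", 210), ("D", 180), ("E", 150)]

def nearest_grade(aps_per_entry: float) -> str:
    grade, _ = min(POINTS, key=lambda gp: abs(gp[1] - aps_per_entry))
    return grade

print(nearest_grade(211.3))  # national average per entry -> 'C'
print(nearest_grade(271.4))  # Queen Elizabeth's, Barnet -> 'A'
print(nearest_grade(97.7))   # Appleton Academy -> 'E' (well below the bottom of the scale)
```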

 .

Table 10A: Percentage of A* and A*/A grades by sector

| Year | All state-funded mainstream (A* / A*-A) | LA maintained mainstream (A* / A*-A) | Sponsored academies (A* / A*-A) | Converter academies (A* / A*-A) | Free schools, UTCs and studio schools (A* / A*-A) | Sixth form colleges (A* / A*-A) | FE colleges (A* / A*-A) |
| 2013 | 6.8 / 24.4 | 5.9 / 21.4 | 3.7 / 15.1 | 7.9 / 27.5 | 4.7 / 20.0 | 5.7 / 21.6 | 3.9 / 15.4 |
| 2012 | 7.2 / 24.3 | 6.3 / 21.8 | 4.0 / 14.4 | 8.9 / 29.1 | 4.3 / 20.3 | 5.8 / 21.8 | 3.8 / 15.6 |
| 2011 | 7.3 / 24.5 | 6.3 / 22.4 | 3.9 / 16.0 | – | – | – | – |

(– = no figure published)

.

Table 10B: Percentage of A* and A*/A grades in schools by admissions practice

| Year | Selective (A* / A*-A) | Comprehensive (A* / A*-A) | Modern (A* / A*-A) |
| 2013 | 12.7 / 40.6 | 5.6 / 21.1 | 2.5 / 10.6 |
| 2012 | 13.4 / 41.2 | 5.9 / 20.9 | 2.6 / 11.1 |
| 2011 | 13.4 / 41.0 | 6.0 / 21.1 | 3.0 / 11.6 |

.

Conclusion

What are we to make of this analysis overall?

The good news is that high attainers have registered improvements since 2012 across all the most important GCSE measures:

  • 5+ GCSEs or equivalent including English and maths GCSEs (up 0.7%)
  • GCSEs in English and maths (up 0.8%)
  • 3 levels of progress in English (up 2.8%)
  • 3 levels of progress in maths (up 2.0%)
  •  EBacc entry (up 19%) and EBacc achievement (up 13.6%).

The last of these is particularly impressive.

At A level, the underlying trend for high attainment per se is slightly downward, but significantly upward for AAB+ grades with two in facilitating subjects.

The less good news is that some of these improvements have been made from a relatively low base, so the overall levels of performance still fall short of what is acceptable.

It is not cause for congratulation that one in seven high attainers still fail to make the expected progress in English, while one in eight still fail to do so in maths. Nor is it encouraging that one in twenty high attainers still fail to secure five or more GCSEs (or equivalent) including GCSEs in English and maths.

The significant improvement with the EBacc masks the fact that one fifth of high attainers who enter exams in all the requisite subjects still fail to secure the necessary grades.

Moreover, some schools are demonstrating very limited capacity to secure high achievement and – in particular – sufficient progress from their high attainers.

The fact that several schools achieve better progression in English for their middle and low attainers than for their high attainers is particularly scandalous and needs urgent attention.

As noted above, the floor targets regime is too blunt an instrument to address the shortcomings of the substantial majority of the schools highlighted in this study for poor performance on one or more high attainer measures.

The combined impact of the secondary accountability reforms planned for 2016 is as yet unclear. For the time being at least, Ofsted inspection is the only game in town.

The school inspection guidance now demands more attention to the performance and progress made by the ‘most able students’. But will inspection bring about with sufficient rapidity the requisite improvements amongst the poorest performers highlighted here?

I have it in mind to monitor progress in this small sample of twenty-two schools – and also to look back on what has happened to a parallel group of the poorest performers in 2012.

.

GP

January 2014


High Attainment in the 2013 Primary School Performance Tables

.

This is a distillation of data about high attainment and the performance of high attaining learners in the 2013 Primary School Performance Tables.

It draws on the statistics contained in SFR51/2013 – National curriculum assessments at key stage 2: 2012-13.

For the purposes of this post, high attainment is Level 5 and above at KS2.

The definition of high attainers is taken from the School Performance Tables. A distinction between the performance of low, medium and high attaining pupils was first introduced into the 2011 Tables. It is based on prior attainment four years earlier at the end of Key Stage 1.

The User Guide to the Tables explains the distinction thus:

‘Prior attainment definitions are based on KS1 Teacher Assessment (using the KS1 Average Point Score) as follows:

  • Low attaining = those below Level 2 at KS1 (ie those with a KS1 APS < 12);
  • Middle attaining = those at Level 2 at KS1 (ie those with a KS1 APS >= 12 but <18);
  • High attaining = those above Level 2 at KS1 (ie those with a KS1 APS >= 18).

Where a pupil does not have a KS1 assessment (eg. because they weren’t in the country at the time), they will not be included in these figures.’
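For illustration, this banding is straightforward to express in code. A minimal sketch, assuming the KS1 average point score is supplied as a number (None where no KS1 assessment exists); the function name is mine, not the DfE’s:

```python
# A minimal sketch of the prior attainment banding quoted above.
from typing import Optional

def prior_attainment_band(ks1_aps: Optional[float]) -> Optional[str]:
    if ks1_aps is None:   # no KS1 assessment: excluded from the Tables' figures
        return None
    if ks1_aps < 12:      # below Level 2 at KS1
        return "low"
    if ks1_aps < 18:      # at Level 2 at KS1
        return "middle"
    return "high"         # above Level 2 at KS1 (APS >= 18)

print(prior_attainment_band(18.0))  # -> 'high'
print(prior_attainment_band(15.0))  # -> 'middle'
```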

It follows that this definition will not include learners who are particularly strong in one area and comparatively weak in another, but it will include those who achieve relatively strongly across the board.

The proportions of the KS2 cohort defined as high, middle and low attainers in state-funded schools in 2013 are as follows:

| Year | High % | Middle % | Low % |
| 2013 | 25 | 57 | 18 |

 

Headlines

  • The percentage of pupils achieving Level 5 and above is down 4% in reading but up 2% in maths.
  • 7% of pupils achieved Level 6 in maths, up from 3% in 2012. This includes a staggering 29% of Chinese pupils. Some 2% of pupils achieved Level 6 in writing and in grammar, punctuation and spelling (GPS), but less than 1% achieved Level 6 in reading.
  • According to the Tables, there is a 16% achievement gap between the proportions of advantaged and disadvantaged learners achieving Level 5 and above in reading, writing and maths, up 1% on 2012. But this is a smaller gap than exists at Level 4B and above (21%) and at Level 4 and above (18%).
  • On the other hand, the SFR shows that the FSM/non-FSM and advantaged/disadvantaged gaps for each assessment are invariably significantly higher at Level 5 and above than they are at Level 4 and above. The biggest differences are in reading (10 percentage points worse for disadvantaged; 8 percentage points worse for FSM) and in maths (10 percentage points worse for disadvantaged; 7 percentage points worse for FSM).
  • A worrying 37% of high attainers in state-funded schools did not achieve Level 5 or above in reading, writing and maths. Not one high attainer achieved this in 64 primary schools.
  • Significant numbers of schools had no pupils at Level 6 in each assessment: some 12,700 had none in reading; about 10,750 had none in writing; some 10,200 had none in GPS; and over 5,100 had none in maths.

 

Summary of Outcomes in the 2013 Primary Performance Tables

.

Aggregated – Reading, Writing and Maths

  • Overall, 21% of pupils in state-funded schools achieved Level 5 or above in reading, writing and maths (up 1% from 20% in 2012).
  • 25% of girls achieved this (up from 23% in 2012) and 18% of boys did so (up from 17% in 2012) giving an unchanged 7% gender gap. Some 19% of EAL pupils achieved this outcome.
  • 10% of disadvantaged pupils achieved this, compared with 26% of other pupils, giving an achievement gap of 16% (in 2012 the figures were 9% and 24% respectively, so the gap has widened by 1% since last year; the arithmetic is sketched after this list). However, this gap is significantly smaller than the 21% gap at Level 4B and the 19% gap at Level 4.
  • 63% of high attainers in state-funded schools achieved this benchmark, meaning that a worrying 37% fell short. Meanwhile, 10% of middle attainers were successful.
  • Almost all high attainers secured Level 4B and above (97%) and Level 4 and above (99%).
  • The percentage achieving this benchmark varied by school type from 25% (converter academies); to 21% (LA maintained mainstream schools);  to 14% (free schools); and 10% (sponsored academies).
  • One school – Litton C of E Primary (Buxton) – achieved 100% on this measure (six pupils). A dozen schools managed 75% or more, including two with 1FE – Grinling Gibbons Primary School (Lewisham) and Lowbrook Academy (Maidenhead).
  • At Grinling Gibbons, 88% of disadvantaged pupils achieved this measure (cohort of 16). Almost 40 schools recorded over 50%, two of them with cohorts of 30+ – Nelson Mandela School (Birmingham) and Tollgate Primary (Newham).
  • Seven schools achieved an average point score of 34.0 or above (equivalent to an average Level 5A) the largest being Lowbrook Academy and Fox Primary School (Kensington and Chelsea).
  • In over 600 primary schools no pupils achieved this benchmark.  In 64 schools, not one high attainer managed to do so (though, in a handful of these, up to 20% of middle attainers did so).
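The disadvantage gaps quoted in this list are simple percentage point differences. A minimal sketch of the arithmetic, with values transcribed from the bullets above:

```python
# Gap = rate for 'other' pupils minus rate for disadvantaged pupils
# (L5+ in reading, writing and maths combined, in percent).
l5_rwm = {2012: {"disadvantaged": 9, "other": 24},
          2013: {"disadvantaged": 10, "other": 26}}

for year, rates in l5_rwm.items():
    print(year, rates["other"] - rates["disadvantaged"])
# 2012 -> 15; 2013 -> 16: the gap has widened by one percentage point
```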

.

Reading

  • 44% of pupils in state-funded schools achieved Level 5 or above in reading (down 4% – rounded – from 48% in 2012).
  • Around 2,300 pupils achieved Level 6 – 592 boys and 1,670 girls.
  • 18% of boys and 25% of girls achieved Level 5 or higher, giving a gender gap of 7%. Compared with 2012, Level 5 attainment declined more amongst girls (down 5%) than amongst boys (down 2%), so the gender gap closed by 3%.
  • 86% of high attainers achieved Level 5 or above.
  • 87% of those with KS1 reading at Level 3 or higher managed Level 5 – and a further 2% achieved Level 6.
  • The FSM gap at Level 5 and above is 21% (48% versus 27%) compared with 13% at Level 4 and above.
  • The advantaged/disadvantaged gap at Level 5 and above is 21% (51% versus 30%) compared with 11% at Level 4 and above.
  • 89% of high attainers made the expected progress in reading (compared with 92% of middle attainers).
  • One primary school – Iford and Kingston C of E Primary School (Lewes) – recorded 19% of its pupils achieving Level 6.
  • 18 primary schools recorded 100% achieving Level 5 or above in Reading – no pupils in any of those schools achieved Level 6.
  • About 12,700 schools had no pupils at Level 6 in reading.

.

Grammar, Punctuation and Spelling (GPS)

  • 47% of pupils in state-funded schools achieved Level 5 or above in GPS.
  • 2% (around 8,600) achieved Level 6 including 3,233 boys and 5,373 girls.
  • 7% of Chinese pupils achieved Level 6.
  • 42% of boys and 54% of girls achieved Level 5 or above giving a gender gap of 12%.
  • 91% of high attainers achieved Level 5 or above.
  • The FSM gap at Level 5 and above is 20% (51% versus 31%) compared with 18% at Level 4 and above.
  • The advantaged/disadvantaged gap at Level 5 and above is 19% (53% versus 34%) compared with 17% at Level 4 and above.
  • In two primary schools – St Joseph’s Catholic Primary (Southwark) and The Vineyard School (Richmond) – 38% of pupils achieved Level 6.
  • 20 schools had 100% of pupils at Level 5 or above.
  • About 10,200 schools posted zero Level 6 results.

.

Writing

  • 30% of pupils in state-funded schools achieved Level 5 or above in writing teacher assessment.
  • 2% (over 8,400 pupils) achieved Level 6 including 2,861 boys and 5,549 girls.
  • 80% of those with Level 3 writing at KS1 achieved Level 5 and a further 9% achieved Level 6.
  • 76% of high attainers achieved Level 5.
  • The FSM gap at Level 5 and above is 19% (34% versus 15%) compared with 16% at Level 4 and above.
  • The disadvantaged/non-disadvantaged gap at Level 5 and above is 18% (36% versus 18%) compared with 13% at Level 4 and above.
  • 94% of high attainers made the expected progress in writing (compared with 93% of middle attainers).
  • At Newton Farm Nursery Infant and Junior School (Harrow) 63% of pupils achieved Level 6.
  • Just 4 schools achieved 100% at Level 5 or above – Litton C of E Primary (Buxton), Newton Farm (Harrow), St Joseph’s Hurst Green (Clitheroe) and St Oswald’s C of E Primary (Chester).
  • 10,750 schools had no pupils at Level 6.

.

Maths

  • 41% of pupils in state-funded schools achieved Level 5 or above in maths (up 2% from 39% in 2012).
  • 7% (around 35,000 pupils) achieved Level 6 (up 3% – rounded – from 3% in 2012) including 21,388 boys and 13,749 girls.
  • 29% of Chinese pupils achieved Level 6  (19% did so in 2012).
  • 2% of FSM pupils achieved Level 6.
  • 43% of boys and 39% of girls achieved Level 5 or above (compared with 2012, girls improved by 2% whereas boys improved by only 1%, narrowing the gender gap slightly).
  • 64% of those with Level 3 or above in maths at KS1 made it to Level 5 at KS2 and a further 26% achieved Level 6.
  • 83% of high attainers achieved Level 5 or above.
  • 93% of high attainers made the expected progress in maths (compared with 90% of middle attainers).
  • The FSM gap at Level 5 and above is 20% (44% versus 24%) compared with 13% at Level 4 and above.
  • The advantaged/disadvantaged gap at Level 5 and above is 21% (47% versus 26%) compared with 11% at level 4 and above.
  • St Oswald’s CE Aided Primary (Chester) had 75% of its entry achieve Level 6 in maths and two other schools exceeded 50% – St Joseph’s RC Primary Hurst Green (Clitheroe) and Haselor School (Alcester).
  • 17 schools had 100% of their entry at Level 5 or above.
  • In over 5,100 schools no pupils achieved Level 6.

GP

December 2013