High Attainment in the 2014 Primary School Performance Tables


This is my annual post reviewing data about high attainment and high attainers at the end of Key Stage 2.

Data Overload courtesy of opensourceway


It draws on the 2014 Primary School Performance Tables, the associated Statistical First Release (SFR) on key stage 2 national curriculum assessments and parallel material for previous years.

‘High attainment’ is taken to mean National Curriculum Level 5 and above.

‘High attainers’ are defined in accordance with the Performance Tables, meaning those with prior attainment above Level 2 in KS1 teacher assessments (average points score of 18 or higher). This measure obviously excludes learners who are particularly strong in one area but correspondingly weak in another.
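For readers who want to reproduce this banding from raw KS1 results, the sketch below shows the basic calculation. It is a minimal illustration only: the point-score equivalences (L1 = 9, 2C = 13, 2B = 15, 2A = 17, L3 = 21) are the standard KS1 values, but the 12-point boundary between low and middle attainers and the simple three-subject average are assumptions on my part.

```python
# Minimal sketch: assigning the Performance Tables' prior-attainment bands
# from KS1 teacher assessments. Point scores and the low/middle boundary
# are assumptions for illustration.

KS1_POINTS = {"W": 3, "1": 9, "2C": 13, "2B": 15, "2A": 17, "3": 21, "4": 27}

def ks1_aps(reading, writing, maths):
    """Average points score across the three KS1 teacher assessments."""
    return sum(KS1_POINTS[level] for level in (reading, writing, maths)) / 3

def prior_attainment_band(reading, writing, maths):
    aps = ks1_aps(reading, writing, maths)
    if aps >= 18:        # above Level 2 on average: 'high attainer'
        return "high"
    if aps >= 12:        # assumed boundary between low and middle
        return "middle"
    return "low"

# A pupil with L3 reading but only L1 writing averages 15 points and so is
# not counted as a high attainer, despite the strong reading result.
print(prior_attainment_band("3", "1", "2B"))  # -> "middle"
```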

The proportions of the end-of-KS2 cohort defined as high, middle and low attainers have remained fairly constant since 2012.

High attainers presently constitute the top quartile of the relevant population, but this proportion is not fixed: it will increase as and when KS1 performance improves.

Year   High %   Middle %   Low %
2014   25       58         18
2013   25       57         18
2012   24       57         19

Table 1: Proportion of high, middle and low prior attainers in state-funded schools by year since 2012

 

The percentage of high attainers in different schools’ end-of-KS2 cohorts varies very considerably and is unlikely to remain constant from year to year. Schools with small year groups are particularly vulnerable to significant fluctuations.

The 2014 Performance Tables show that Minster School, in Southwell, Nottinghamshire and St Patrick’s Church of England Primary Academy in Solihull each had 88% high attainers.

Over 600 primary schools have 50% or more high attainers within their cohorts. But, at the other extreme, more than 570 have no high attainers at all, while some 1,150 have 5% or fewer.

This serves to illustrate the very unequal distribution of learners with high prior attainment between schools.

The commentary below opens with a summary of the headline findings. The subsequent sections focus in turn on the composite measure (reading, writing and maths combined), then on the outcomes of the reading, GPS (grammar, punctuation and spelling) and maths tests and finally on teacher assessment in writing.

I have tried to ensure that percentages are consistent throughout this analysis, but the effect of rounding means that some figures are slightly different in different SFR tables. I apologise in advance for – and will of course correct – any transcription errors.


Headlines


Overall Trends

Chart 1 below compares performance at level 5 and above (L5+) and level 4 and above (L4+) in 2013 and 2014. The bars on the left hand side denote L4+, while those corresponding to L5+ are on the right.


Chart 1: L4+ and L5+ performance compared, 2013-2014

With the exception of maths, which has remained unchanged, there have been improvements across the board at L4+, of between two and four percentage points.

The same is true at L5+ and – in the case of reading, GPS and writing – the percentage point improvements are larger than those at L4+. This is good news.

Chart 2 compares the gaps between disadvantaged learners (‘ever 6’ FSM plus children in care) and all other learners in state-funded schools on all five measures, for both 2013 and 2014.

Chart 2: Disadvantaged gaps at L4+ and L5+ for all five measures, 2013 and 2014

With the sole exception of the composite measure in 2013, each L4+ gap is smaller than the corresponding gap at L5+, though the difference can be as little as one percentage point (the composite measure) and as high as 11 percentage points (reading).

Whereas the L4+ gap in reading is lower than for any other measure, the L5+ reading gap is now the biggest. This suggests there is a particular problem with L5+ reading.

The distance between L4+ and L5+ gaps has typically widened since 2013, except in the case of maths, where it has narrowed by one percentage point.

While three of the L4+ gaps have closed slightly (composite, reading, GPS) the remainder are unchanged. However, two of the L5+ gaps have increased (composite, writing) and only the maths gap has closed slightly.

This suggests that what limited progress there has been in closing disadvantaged gaps has focused more on L4+ than L5+.

The pupil premium is not bringing about a radical improvement – and its impact is relatively lower at higher attainment levels.

A similar pattern is discernible with FSM gaps, as Chart 3 reveals. The chart excludes the composite measure, which is not supplied in the SFR.

Overall the picture at L4+ is cautiously positive, with small downward trends on three of the four measures, but the picture at L5+ is more mixed since two of the measures are unchanged.

Chart 3: FSM gaps at L4+ and L5+ compared, 2013 and 2014

Composite measure

  • Although the proportion of learners achieving this benchmark is slightly higher in converter academies than in LA-maintained schools, the latter have improved faster since 2013. The success rate in sponsored academies is half that in converter academies. Free schools are improving but remain behind LA-maintained schools. 
  • Some 650 schools achieve 50% or higher, but another 470 record 0% (fewer than the 600 which did so in 2013). 
  • 67% of high attainers achieved this benchmark in 2014, up five percentage points on 2013 but one third still fall short, demonstrating that there is extensive underachievement amongst high attainers in the primary sector. This rather undermines HMCI’s observations in his Commentary on the 2014 Annual Report. 
  • Although over 670 schools have a 100% success rate amongst their high attainers, 42 schools have recorded 0% (down from 54 in 2013). Several of these do better by their middle attainers. In 10 primary schools no high attainers achieve L4+ in reading, writing and maths combined.


Reading

  • The substantial improvement in L5+ reading performance since 2013 masks an as yet unexplained crash in Level 6 test performance. Only 874 learners in state-funded schools achieved L6 reading, compared with 2,137 in 2013. This is in marked contrast to the sharp increase in L6 test entries, the improved success rate in L6 reading teacher assessment and the upward trend in the other L6 tests. In 2013 around 12,700 schools had no pupils who achieved L6 reading, but this increased to some 13,670 schools in 2014. Even the performance of Chinese pupils (otherwise phenomenally successful on L6 tests) went backwards. 
  • The proportion of Chinese learners achieving L5 in reading has reached 65% (compared with 50% for White learners), having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012. 
  • 43 primary schools had a 100% success rate at Level 5 in the reading test, but 29 more registered 0%. 
  • Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013. 

GPS

  •  The proportion of Chinese learners achieving L5+ in the GPS test is now 75%, a seven percentage point improvement on 2013. Moreover, 15% achieved Level 6, up eight percentage points on 2013. (The comparable Level 5+ percentage for White learners is 50%). There are unmistakeable signs that Chinese ascendancy in maths is being replicated with GPS. 
  • Some 7,210 schools had no learners achieving L6 in the GPS test, compared with 10,200 in 2013. While 18 schools recorded a perfect 100% record at Level 5 and above, 33 had no learners at L5+. 


Maths

  • Chinese learners continue to make great strides. The percentage succeeding on the L6 test has climbed a further six percentage points and now stands at 35% (compared with 8% for White Pupils). Chinese boys are at 39%. The proportion of Chinese learners achieving level 6 is now comparable to the proportions of other ethnic groups achieving level 5. This lends further credence to the notion that we have our own domestic equivalent of Shanghai’s PISA success – and perhaps to the suggestion that focusing on Shanghai’s classroom practice may bring only limited benefits. 
  • While it is commendable that 3% of FSM and 4% of disadvantaged learners are successful in the L6 maths test, the gaps between them and other learners are increasing as the overall success rate grows. There are now seven percentage point gaps for FSM and disadvantaged alike. 
  • Ten schools managed an L6 success rate of 50% or higher, while some 280 were at 30% or higher. On the other hand, 3,200 schools had no L6 passes (down from 5,100 in 2013). 
  • About 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013 – and two percentage points more than the proportion of successful middle attainers. But 27 schools posted a success rate of 50% or below.


Writing (TA)

  • In writing TA, Chinese pupils do not match their performance on the GPS test, though 6% achieve L6 compared with just 2% of White pupils. 
  • Three schools managed a 50% success rate at Level 6 and 56 were at 25% or above. Only one school managed 100% at L5, but some 200 scored 0%. 
  • Some 93% of all pupils make the expected progress in writing between KS1 and KS2. This is true of 95% of high attainers – and 95% of middle attainers too.

 

Composite measure: reading, writing and maths

Table 2 shows the overall proportion of learners achieving L5 or above in all of reading, writing and maths in each year since 2012.

 

2012 2013 2014
L5+ overall 20% 21% 24%
L5+ boys 17% 18% 20%
L5+ girls 23% 25% 27%

Table 2: Proportion of all learners achieving KS2 L5+ in reading, writing and maths, 2012-2014

The overall success rate has increased by three percentage points compared with 2013 and by four percentage points since 2012.

The percentage of learners achieving L4+ has also improved by four percentage points since 2012, so the improvement at L5+ is broadly commensurate.

Over this period, girls’ lead over boys has remained relatively stable at between six and seven percentage points.

The SFR reveals that success on this measure varies significantly between school types.

The percentages for LA-maintained schools (24%) and all academies and free schools (23%) are little different.

However mainstream converter academies stand at 26%, twice the 13% recorded by sponsored academies. Free schools are at 21%. These percentages have changed significantly compared with 2013.

Chart 4: Comparison of the proportion of learners achieving L5+ in reading, writing and maths, 2013 and 2014

Whereas free schools are making rapid progress and sponsored academies are also improving at a significant rate, converter academies are improving more slowly than LA-maintained schools.

The highest percentages on this measure in the Performance Tables are recorded by Fox Primary School in Kensington and Chelsea (86%) and Hampden Gurney CofE Primary School in Westminster (85%).

Altogether, some 650 schools have achieved success rates of 50% or higher, while 23 have managed 75% or higher.

At the other end of the spectrum about 470 schools have no learners at all who achieved this measure, fewer than the 600 recording this outcome in 2013.

Table 3 shows the gap between disadvantaged (ie ‘ever 6’ FSM and children in care) learners and others, as recorded in the Performance Tables.

2012 2013 2014
Disadvantaged % 9 10 12
Other % 24 26 29
Gap (pp) 15 16 17

Table 3: Proportion of disadvantaged learners achieving L5+ in reading, writing and maths, 2012-2014


Although the percentage of disadvantaged learners achieving this benchmark has improved somewhat, the percentage of other learners doing so has improved faster, meaning that the gap between disadvantaged and other learners is widening steadily.

This contrasts with the trend at L4+, where the Performance Tables show a gap that has narrowed from 19 percentage points in 2012 (80% versus 61%) to 18 points in 2013 (81% versus 63%) and now to 16 points in 2014 (83% versus 67%).
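For clarity, the gaps quoted here and throughout are simple percentage point differences between the success rates of ‘other’ and disadvantaged pupils. A minimal sketch of the calculation, using the L4+ figures above and the L5+ figures from Table 3:

```python
# Percentage point gaps on the composite measure, using the L4+ figures
# quoted above (Performance Tables) and the L5+ figures from Table 3.
# Each entry is (other %, disadvantaged %).
l4_plus = {2012: (80, 61), 2013: (81, 63), 2014: (83, 67)}
l5_plus = {2012: (24, 9), 2013: (26, 10), 2014: (29, 12)}

def gap(other_pct, disadvantaged_pct):
    """Attainment gap in percentage points (positive = other pupils ahead)."""
    return other_pct - disadvantaged_pct

for year in (2012, 2013, 2014):
    print(year, "L4+ gap:", gap(*l4_plus[year]), "pp;",
          "L5+ gap:", gap(*l5_plus[year]), "pp")
# 2012 L4+ gap: 19 pp; L5+ gap: 15 pp
# 2013 L4+ gap: 18 pp; L5+ gap: 16 pp
# 2014 L4+ gap: 16 pp; L5+ gap: 17 pp
```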

Chart 5 below illustrates this comparison.

Chart 5: Comparing disadvantaged/other attainment gaps in KS2 reading, writing and maths combined at L4+ and L5+, 2012-2014.

While the L4+ gap has closed by three percentage points since 2012, the L5+ gap has widened by two percentage points. This suggests that disadvantaged learners amongst the top 25% by prior attainment are not benefiting commensurately from the pupil premium.

There are 97 primary schools where 50% or more disadvantaged learners achieve L5+ across reading, writing and maths (compared with 40 in 2013).

The highest performers record above 80% on this measure with their disadvantaged learners, albeit with cohorts of 6 to 8. Only one school with a more substantial cohort (of 34) manages over 70%. This is Tollgate Primary School in Newham.

The percentage of high attainers who achieved L5+ in 2014 was 67%, up five percentage points from 62% in 2013. (In 2012 the Performance Tables provided a breakdown for English and maths, which is not comparable).

Although this is a significant improvement, it means that one third of high attainers at KS1 still do not achieve this KS2 benchmark, suggesting that there is significant underachievement amongst this top quartile.

Thirteen percent of middle attainers also achieved this outcome, compared with 10% in 2013.

A significant number of schools – over 670 – do manage a 100% success rate amongst their high attainers, but there are also 42 schools where no high attainers achieve the benchmark (there were 54 in 2013). In several of them, more middle attainers than high attainers achieve the benchmark.

There are ten primary schools in which no high attainers achieve L4+ in reading, writing and maths combined. Perhaps one should be thankful for the fact that no middle attainers in these schools achieve the benchmark either!

The KS2 average point score was 34.0 or higher in five schools, equivalent to a level 5A. The highest APS was 34.7, recorded by Fox Primary School, with a cohort of 42 pupils.

Across all state-funded schools, the average value added measure for high attainers across reading, writing and maths is 99.8, the same as it was in 2013.

The comparable averages for middle attainers and low attainers are 100.0 and 100.2 respectively, showing that high attainers benefit slightly less from their primary education.

The highest value-added recorded for high attainers is 104.7 by Tudor Court Primary School in Thurrock, while the lowest is 93.7 at Sacriston Junior School in Durham (now closed).

Three more schools are below 95.0 and some 250 are at 97.5 or lower.
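For anyone reproducing the band-level averages above (99.8, 100.0 and 100.2) from the Performance Tables download, each figure is just the mean of the published value added scores for the relevant group. A minimal pandas sketch – the column names (prior_band, ks2_va) and the six illustrative rows are my own assumptions, not the actual file layout:

```python
import pandas as pd

# Illustrative pupil- or school-level records; column names are assumed,
# not the actual Performance Tables field names.
df = pd.DataFrame({
    "prior_band": ["high", "middle", "low", "high", "middle", "low"],
    "ks2_va":     [99.6,   100.1,    100.3, 100.0,  99.9,     100.1],
})

# Mean reading/writing/maths value added by prior-attainment band.
va_by_band = df.groupby("prior_band")["ks2_va"].mean().round(1)
print(va_by_band)
# The same groupby on the real 2014 data gives roughly 99.8 (high),
# 100.0 (middle) and 100.2 (low), the averages quoted above.
```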


Reading Test

Table 4 shows the percentage of all learners, boys and girls achieving L5+ in reading since 2010. There has been a five percentage point increase (rounded) in the overall result since 2013, which restores performance to the level it had reached in 2010.

A seven percentage point gap in favour of girls remains unchanged from 2013. This is four points less than the comparable gender gap in 2010.


2010 2011 2012 2013 2014
L5+ overall 50 43 48 44 50
Boys 45 37 43 41 46
Girls 56 48 53 48 53

Table 4: Percentage of learners achieving L5+ in reading since 2010


As reported in my September 2014 post ‘What Happened to the Level 6 Reading Results?’, L6 performance in reading collapsed in 2014.

The figures have improved slightly since the provisional results were released, but the collapse is still marked.

Table 5 shows the numbers successful since 2012.

The number of successful learners in 2014 is less than half the number successful in 2013 and almost back to the level in 2012 when the test was first introduced.

This despite the fact that the number of entries for the level 6 test – 95,000 – was almost exactly twice the 47,000 recorded in 2012 and significantly higher than the 70,000 entries in 2013.

For comparison, the number of pupils awarded level 6 in reading via teacher assessment was 15,864 in 2013 and 17,593 in 2014.

We still have no explanation for this major decline which is entirely out of kilter with other L6 test outcomes.


        2012        2013        2014
        %   No      %   No      %   No
L6+     0   900     0   2,262   0   935
Boys    0   200     0   592     0   263
Girls   0   700     1   1,670   0   672

Table 5: Number and percentage of learners achieving L6 on the KS2 reading test 2012-2014


These figures include some pupils attending independent schools, but another table in the SFR reveals that 874 learners in state-funded primary schools achieved L6 (compared with 2,137 in 2013). Of these, all but 49 achieved L3+ in their KS1 reading assessment.

But some 13,700 of those with L3+ reading at the end of KS1 progressed only to L4, or lower, at the end of KS2.

The SFR does not supply numbers of learners with different characteristics achieving L6 and all percentages are negligible. The only group recording a positive percentage is Chinese learners, at 1%.

In 2013, Chinese learners were at 2% and some other minority ethnic groups recorded 1%, so not even the Chinese have been able to withstand the collapse in the L6 success rate.

According to the SFR, the FSM gap at L5 is 21 percentage points (32% versus 53% for all other pupils). The disadvantaged gap is also 21 percentage points (35% versus 56% for all other pupils).

Chart 6 shows how these percentages have changed since 2012.

Chart 6: FSM and disadvantaged gaps for KS2 reading test at L5+, 2012-2014

FSM performance has improved by five percentage points compared with 2013, while disadvantaged performance has grown by six percentage points.

However, gaps remain unchanged for FSM and have increased by one percentage point for disadvantaged learners. There is no discernible or consistent closing of gaps in KS2 reading at L5.

These gaps of 21 percentage points, for FSM and disadvantaged learners alike, are significantly larger than the comparable gaps at L4+ of 12 (FSM) and 10 (disadvantaged) percentage points.

The analysis of level 5 performance in the SFR reveals that the proportion of Chinese learners achieving level 5 has reached 65%, having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012.

Turning to the Performance Tables, we can see that, in relation to L6:

  • The highest recorded percentage achieving L6 is 17%, at Dent CofE Voluntary Aided Primary School in Cumbria. Thirteen schools recorded a L6 success rate of 10% or higher. (The top school in 2013 recorded 19%).
  • In 2013 around 12,700 schools had no pupils who achieved L6 reading, whereas in 2014 this had increased to some 13,670 schools.

In relation to L5:

  • 43 schools achieved a 100% record in L5 reading (compared with only 18 in 2013). All but one of these recorded 0% at L6, which may suggest that they were concentrating on maximising L5 achievement rather than risking L6 entry.
  • Conversely, there are 29 primary schools where no learners achieved L5 reading.

Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so.  However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013.

And 41 schools recorded a success rate of 50% or lower on this measure, most of them comfortably exceeding this with their low and middle attainers alike.
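For reference, ‘expected progress’ in these measures means at least two whole National Curriculum levels of progress between KS1 and KS2 in the same subject. The sketch below shows the basic check; the numeric mapping is illustrative and the official measure has further rules (for sub-levels and non-numeric outcomes) that are not reproduced here.

```python
# Sketch of the 'expected progress' check: at least two whole National
# Curriculum levels between KS1 and KS2 in the same subject. The simple
# numeric mapping below is illustrative only.
LEVEL_VALUE = {"1": 1, "2": 2, "3": 3, "4": 4, "5": 5, "6": 6}

def made_expected_progress(ks1_level, ks2_level):
    return LEVEL_VALUE[ks2_level] - LEVEL_VALUE[ks1_level] >= 2

# A high attainer assessed at L3 in KS1 reading therefore needs at least
# L5 at KS2 to register expected progress.
print(made_expected_progress("3", "4"))  # False
print(made_expected_progress("3", "5"))  # True
```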


GPS Test

Since the grammar, punctuation and spelling test was first introduced in 2013, there is only a two-year run of data. Tables 6 and 7 below show performance at L5+ and L6+ respectively.


2013 % 2014 %
L5+ overall 48 52
Boys 42 46
Girls 54 58

Table 6: Percentage of learners achieving L5+ in GPS, 2013 and 2014

       2013          2014
       %   No        %   No
L6+    2   8,606     4   21,111
Boys   1   3,233     3   8,321
Girls  2   5,373     5   12,790

Table 7: Number and percentage of learners achieving L6 in GPS, 2013 and 2014


Table 6 shows an overall increase of four percentage points in 2014 and the maintenance of a 12 percentage point gap in favour of girls.

Table 7 shows a very healthy improvement in L6 performance, which only serves to emphasise the parallel collapse in L6 reading. Boys have caught up a little on girls but the latter’s advantage remains significant.

The SFR shows that 75% of Chinese learners achieve L5 and above, up seven percentage points from 68% in 2013. Moreover, the proportion achieving L6 has increased by eight percentage points, to 15%. There are all the signs that Chinese eminence in maths is repeating itself with GPS.

Chart 7 shows how the FSM and disadvantaged gaps have changed at L5+ for GPS. The disadvantaged gap has remained stable at 19 percentage points, while the FSM gap has narrowed by one percentage point.

These gaps are somewhat larger than those at L4 and above, which stand at 17 percentage points for FSM and 15 percentage points for disadvantaged learners.

Chart 7: FSM and disadvantaged gaps for KS2 GPS test at L5+, 2013 and 2014

The Performance Tables show that, in relation to L6:

  • The school with the highest percentage achieving level 6 GPS is Fulwood, St Peter’s CofE Primary School in Lancashire, which records a 47% success rate. Some 89 schools achieve a success rate of 25% or higher.
  • In 2014 there were some 7,210 schools that recorded no L6 performers at all, but this compares favourably with 10,200 in 2013. This significant reduction is in marked contrast to the increase in schools with no L6 readers.

Turning to L5:

  • 18 schools recorded a perfect 100% record for L5 GPS. These schools recorded L6 success rates that vary between 0% and 25%.
  • There are 33 primary schools where no learners achieved L5 GPS.


Maths test

Table 8 below provides the percentages of learners achieving L5+ in the KS2 maths test since 2010.

Over the five year period, the success rate has improved by eight percentage points, but the improvement in 2014 is less pronounced than it has been over the last few years.

The four percentage point lead that boys have over girls has changed little since 2010, apart from a temporary increase to six percentage points in 2012.


2010 2011 2012 2013 2014
L5+ overall 34 35 39 41 42
Boys 36 37 42 43 44
Girls 32 33 36 39 40

Table 8: Percentage of learners achieving L5+ in KS2 maths test, 2010-2014


Table 9 shows the change in achievement in the L6 test since 2012. This includes pupils attending independent schools – another table in the SFR indicates that the total number of successful learners in 2014 in state-funded schools is 47,349, meaning that almost 95% of those achieving L6 maths are located in the state-funded sector.

There has been a healthy improvement since 2013, with almost 15,000 more successful learners – an increase of over 40%. Almost one in ten of the end of KS2 cohort now succeeds at L6. This places the reversal in L6 reading into even sharper relief.

The ratio between boys and girls has remained broadly unchanged, so boys continue to account for over 60% of successful learners.


        2012         2013          2014
        %   No       %   No        %   No
L6+     3   19,000   7   35,137    9   50,001
Boys    –   12,400   8   21,388    11  30,173
Girls   –   6,600    5   13,749    7   19,828

Table 9: Number and percentage of learners achieving L6 in KS2 maths test, 2012-2014


The SFR shows that, of those achieving L6 in state-funded schools, some 78% had achieved L3 or above at KS1. However, some 9% of those with KS1 L3 – something approaching 10,000 pupils – progressed only to L4, or lower.

The breakdown for minority ethnic groups shows that the Chinese ascendancy continues. This is illustrated by Chart 8 below.


Chart 8: KS2 L6 maths test performance by ethnic background, 2012-2014

In 2014, the percentage of Chinese learners achieving L5+ has increased by a respectable three percentage points to 74%, but the L6 figure has climbed by a further six percentage points to 35%. More than one third of Chinese learners now achieve L6 on the maths test.

This means that the proportion of Chinese pupils achieving L6 is now broadly similar to the proportions of other ethnic groups achieving Level 5 (34% of White pupils, for example).

They are fifteen percentage points ahead of the next best outcome – 20% recorded by Indian learners. White learners stand at 8%.

There is an eight percentage point gap between Chinese boys (39%) and Chinese girls (31%). The gap between White boys and girls is much smaller, but this is a consequence of the significantly lower percentages involved.

Given that Chinese pupils are capable of achieving such extraordinary results under the present system, these outcomes raise significant questions about the balance between school and family effects and whether efforts to emulate Chinese approaches to maths teaching are focused on the wrong target.

Success rates in the L6 maths test are high enough to produce percentages for FSM and disadvantaged learners. The FSM and disadvantaged gaps both stand at seven percentage points, whereas they were at 5 percentage points (FSM) and 6 percentage points (disadvantaged) in 2013. The performance of disadvantaged learners has improved, but not as fast as that of other learners.

Chart 9 shows how these gaps have changed since 2012.

While the L6 gaps are steadily increasing, the L5+ gaps have remained broadly stable at 20 percentage points (FSM) and 21 percentage points (disadvantaged). There has been a small one percentage point improvement in the gap for disadvantaged learners in 2014, matching the similar small improvement for L4+.

The gaps at L5+ remain significantly larger than those at L4+ (13 percentage points for FSM and 11 percentage points for disadvantaged).


Chart 9: FSM and disadvantaged gaps, KS2 L5+ and L6 maths test, 2012 to 2014


The Performance Tables reveal that:

  • The school with the highest recorded percentage of L6 learners is Fox Primary School (see above) at 64%, some seven percentage points higher than its nearest rival. Ten schools achieve a success rate of 50% or higher (compared with only three in 2013), 56 at 40% or higher and 278 at 30% or higher.
  • However, over 3,200 schools record no L6 passes. This is a significant improvement on the 5,100 in this category in 2013, but the number is still far too high.
  • Nine schools record a 100% success rate for L5+ maths. This is fewer than the 17 that managed this feat in 2013.

Some 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013, two percentage points more than did so in reading in 2014 – and two percentage points more than the proportion of middle attainers managing this.

However, 27 schools had a success rate of 50% or below, the vast majority of them comfortably exceeding this with their middle attainers – and often their low attainers too.


Writing Teacher Assessment

Table 10 shows how the percentage achieving L5+ through the teacher assessment of writing has changed since 2012.

There has been a healthy five percentage point improvement overall, and an improvement of three percentage points since last year, stronger than the comparable improvement at L4+. The large gender gap of 15 percentage points in favour of girls is also unchanged since 2013.


2012 2013 2014
L5+ overall 28 30 33
Boys 22 23 26
Girls 35 38 41

Table 10: Percentage achieving level 5+ in KS2 writing TA 2012-2014


Just 2% of learners nationally achieve L6 in writing TA – 11,340 pupils (10,654 of them located in state-funded schools).

However, this is a very significant improvement on the 2,861 recording this outcome in 2013. Just 3,928 of the total are boys.

Chinese ascendancy at L6 is not so significant: the Chinese success rate stands at 6%. However, if the comparator is performance at L5+, Chinese learners record 52%, compared with 33% for both White and Asian learners.

The chart below shows how FSM and disadvantaged gaps have changed at L5+ since 2012.

This indicates that the FSM gap, having widened by two percentage points in 2013, has narrowed by a single percentage point in 2014, so it remains higher than it was in 2012. Meanwhile the disadvantaged gap has widened by one percentage point since 2013.

The comparable 2014 gaps at L4+ are 15 percentage points (FSM) and 13 percentage points (disadvantaged), so the gaps at L5+ are significantly larger.

Chart 10: FSM and disadvantaged gaps, L5+ Writing TA, 2012-2014

The Performance Tables show that:

  • Three schools record an L6 success rate of 50% and only 56 are at 25% or higher.
  • At the other end of the spectrum, the number of schools with no L6s is some 9,780, about a thousand fewer than in 2013.
  • At L5+ only one school has a 100% success rate (there were four in 2013). Conversely, about 200 schools record 0% on this measure.

Some 93% of all pupils make the expected progress in writing between KS1 and KS2 and this is true of 95% of high attainers – the same percentage of middle attainers is also successful.

Conclusion

Taken together, this evidence presents a far more nuanced picture of high attainment and high attainers’ performance in the primary sector than suggested by HMCI’s Commentary on his 2014 Annual Report:

‘The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’

There are four particular areas of concern:

  • Underachievement amongst high attainers is too prevalent in far too many primary schools. Although there has been some improvement since 2013, the fact that only 67% of those with high prior attainment at KS1 achieve L5 in reading, writing and maths combined is particularly worrying.
  • FSM and disadvantaged achievement gaps at L5+ remain significantly larger than those at L4+ – and there has been even less progress in closing them. The pupil premium ought to be having a significantly stronger impact on these excellence gaps.
  • The collapse of L6 reading test results is all the more stark when compared with the markedly improved success rates in GPS and maths which HMCI notes. We still have no explanation of the cause.
  • The success rates of Chinese pupils on L6 tests remain conspicuous and in maths are frankly extraordinary. This evidence of a ‘domestic Shanghai effect’ should be causing us to question why other groups are so far behind them – and whether we need to look beyond Shanghai classrooms when considering how best to improve standards in primary maths.


GP

December 2014

What Happened to the Level 6 Reading Results?

 

Provisional 2014 key stage 2 results were published on 28 August.

This brief supplementary post considers the Level 6 test results – in reading, in maths and in grammar, punctuation and spelling (GPS) – and how they compare with Level 6 outcomes in 2012 and 2013.

An earlier post, A Closer Look at Level 6, published in May 2014, provides a fuller analysis of these earlier results.

Those not familiar with the 2014 L6 test materials can consult the papers, mark schemes and level thresholds at these links:

 

Number of Entries

Entry levels for the 2014 Level 6 tests were published in the media in May 2014. Chart 1 below shows the number of entries for each test since 2012 (2013 in the case of GPS). These figures are for all schools, independent as well as state-funded.

 


Chart 1: Entry rates for Level 6 tests 2012 to 2014 – all schools

 

In 2014, reading entries were up 36%, GPS entries up 52% and maths entries up 36%. There is as yet no indication of a backlash from the decision to withdraw Level 6 tests after 2015, though this may have an impact next year.

The postscript to A Closer Look estimated that, if entries continue to increase at current rates, we might expect something approaching 120,000 in reading, 130,000 in GPS and 140,000 in maths.

Chart 2 shows the percentage of all eligible learners entered for Level 6 tests, again for all schools. Nationally, between one in six and one in five eligible learners are now entered for Level 6 tests. Entry rates for reading and maths have almost doubled since 2012.

 


Chart 2: Percentage of eligible learners entered for Level 6 tests 2012 to 2014, all schools

 

Success Rates

The headline percentages in the SFR show:

  • 0% achieving L6 reading (unchanged from 2013)
  • 4% achieving L6 GPS (up from 2% in 2013) and
  • 9% achieving L6 maths (up from 7% in 2013).

Local authority and regional percentages are also supplied.

  • Only in Richmond did the L6 pass rate in reading register above 0% (at 1%). Hence all regions are at 0%.
  • For GPS the highest percentages are 14% in Richmond, 10% in Kensington and Chelsea and Kingston, 9% in Sutton and 8% in Barnet, Harrow and Trafford. Regional rates vary between 2% in Yorkshire and Humberside and 6% in Outer London.
  • In maths, Richmond recorded 22%, Kingston 19%, Trafford, Harrow and Sutton were at 18% and Kensington and Chelsea at 17%. Regional rates range from 7% in Yorkshire and Humberside and the East Midlands to 13% in Outer London.

Further insight into the national figures can be obtained by analysing the raw numbers supplied in the SFR.

Chart 3 shows how many of those entered for each test were successful in each year. Here there is something of a surprise.

 


Chart 3: Percentage of learners entered achieving Level 6, 2012 to 2014, all schools

 

Nearly half of all entrants are now successful in L6 maths, though the improvement in the success rate has slowed markedly compared with the nine percentage point jump in 2013.

In GPS, the success rate has improved by nine percentage points between 2013 and 2014 and almost one in four entrants is now successful. Hence the GPS success rate is roughly half that for maths. This may be attributable in part to its shorter history, although the 2014 success rate is significantly below the rate for maths in 2013.

But in reading an already very low success rate has declined markedly, following a solid improvement in 2013 from a very low base in 2012. The 2014 success rate is now less than half what it was in 2012. Fewer than one in a hundred of those entered have passed this test.
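The success rates in Chart 3 are simply the number awarded Level 6 divided by the number entered for each test. As a minimal illustration, the calculation below uses the approximate 2014 reading figures quoted in these posts (around 95,000 entries and the provisional 851 passes); treat the inputs as indicative rather than definitive.

```python
# Success rate among entrants = number awarded L6 / number entered.
# Approximate 2014 reading figures as quoted in these posts; illustrative only.
entries_2014 = 95_000
passes_2014 = 851

success_rate = passes_2014 / entries_2014
print(f"{success_rate:.1%}")  # about 0.9% - fewer than one entrant in a hundred
```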

Chart 4 shows how many learners were successful in the L6 reading test in 2014 compared with previous years, giving results for boys and girls separately.

 


Chart 4: Percentage of learners entered achieving Level 6 in reading, 2012 to 2014, by gender

 

The total number of successful learners in 2014 is over 5% lower than in 2012, when the reading test was introduced, and down 62% on the number successful in 2013.

Girls appear to have suffered disproportionately from the 2014 decline. While the number of successful girls is down 63%, the decline for boys is slightly smaller, at 61%. The number of successful boys remains above where it was in 2012 but, for girls, it is about 12% down on 2012.

In 2012, only 22% of successful candidates were boys. This rose to 26% in 2013 and has again increased slightly, to 28% in 2014. The gap between girls’ and boys’ performance remains substantially bigger than those for GPS and maths.

Charts 5 and 6 give the comparable figures for GPS and maths respectively.

In GPS, the total number of successful entries has increased by almost 140% compared with 2013. Girls form a slightly lower proportion of this group than in 2013, their share falling from 62% to 60%. Boys are therefore beginning to close what remains a substantial performance gap.

 


Chart 5: Percentage of learners entered achieving Level 6 in GPS, 2012 to 2014, by gender

 

In maths, the total number of successful entries is up by about 40% on 2013 and demonstrates rapid improvement over the three year period.

Compared with 2013, the number of successful girls has increased by 43%, whereas the corresponding increase for boys is closer to 41%. Boys formed 65% of the successful cohort in 2012, 61% in 2013 and 60% in 2014, so girls’ progress in narrowing this substantial performance gap is slowing.

 


Chart 6: Percentage of learners entered achieving Level 6 in maths, 2012 to 2014, by gender

 

Progress

The SFR also provides a table, this time for state-funded schools only, showing the KS1 outcomes of those successful in achieving Level 6. (For maths and reading, this data includes those with a non-numerical grade in the test who have been awarded L6 via teacher assessment. The data for writing is derived solely from teacher assessment.)

Not surprisingly, over 94% of those achieving Level 6 in reading had achieved Level 3 in KS1, but 4.8% were at L2A and a single learner was recorded at Level 1. The proportion with KS1 Level 3 in 2013 was higher, at almost 96%.

In maths, however, only some 78% of those achieving Level 6 were at Level 3 in KS1. A further 18% were at 2A and almost 3% were at 2B. A further 165 were recorded as 2C or 1. In 2013, over 82% had KS1 L3 while almost 15% had 2A.

It seems, therefore, that KS1 performance was a slightly weaker indicator of KS2 level 6 success in 2014 than in the previous year, but this trend was apparent in both reading and maths – and KS1 performance remains a significantly weaker indicator in maths than it is in reading.
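The underlying analysis is a straightforward breakdown of L6 achievers by their KS1 outcome in the same subject. A minimal pandas sketch, in which the column name (ks1_level) and the handful of rows are invented purely to show the shape of the calculation:

```python
import pandas as pd

# Hypothetical pupil-level extract restricted to those awarded L6 at KS2;
# 'ks1_level' is an assumed column name, and the rows are made up.
l6_achievers = pd.DataFrame(
    {"ks1_level": ["3", "3", "3", "3", "3", "3", "3", "2A", "2A", "2B"]}
)

# Share of L6 achievers by KS1 outcome, as a percentage.
breakdown = l6_achievers["ks1_level"].value_counts(normalize=True).mul(100).round(1)
print(breakdown)
# Run against the real 2014 maths data, this kind of breakdown gives the
# figures quoted above: roughly 78% at KS1 L3, 18% at 2A and almost 3% at 2B.
```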

 

Why did the L6 reading results decline so drastically?

Given that the number of entries for the Level 6 reading test increased dramatically, the declining pass rate suggests either a problematic test or that schools entered a higher proportion of learners who had relatively little chance of success. A third possibility is that the test was deliberately made more difficult.

The level threshold for the 2014 Level 6 reading test was 24 marks, compared with 22 marks in 2013, but there are supposed to be sophisticated procedures in place to ensure that standards are maintained. We should be able to discount the third cause.

The second cause is also unlikely to be significant, since schools are strongly advised only to enter learners who are already demonstrating attainment beyond KS2 Level 5. There is no benefit to learners or schools from entering pupils for tests that they are almost certain to fail.

The existing pass rate was very low, but it was on an upward trajectory. Increasing familiarity with the test ought to have improved schools’ capacity to enter the right learners and to prepare them to pass it.

That leaves only the first possibility – something must have been wrong with the test.

Press coverage from May 2014, immediately after the test was administered, explained that it contained different rules for learners and invigilators about the length of time available for answering questions.

The paper gave learners one hour for completion, while invigilators were told pupils had 10 minutes’ reading time followed by 50 minutes in which to answer the questions. Schools interpreted this contradiction differently and several reported disruption to the examination as a consequence.

The NAHT was reported to have written to the Standards and Testing Agency:

‘…asking for a swift review into this error and to seek assurance that no child will be disadvantaged after having possibly been given incorrect advice on how to manage their time and answers’.

The STA statement says:

‘We apologise for this error. All children had the same amount of time to complete the test and were able to consult the reading booklet at any time. We expect it will have taken pupils around 10 minutes to read the booklet, so this discrepancy should not have led to any significant advantage for those pupils where reading time was not correctly allotted.’

NAHT has now posted the reply it received from STA on 16 May. It says:

‘Ofqual, our regulator, is aware of the error and of the information set out below and will, of course, have to independently assure itself that the test remains valid. We would not expect this to occur until marking and level setting processes are complete, in line with their normal timescales.’

It then sets out the reasons why it believes the test remains valid. These suggest the advantage to the learners following the incorrect instructions was minimal since:

  • few would need less than 10 minutes’ reading time;
  • pre-testing showed 90% of learners completed the test within 50 minutes;
  • in 2013 only 3.5% of learners were within 1 or 2 marks of the threshold;
  • a comparative study in which the timing of the Levels 3-5 test was changed showed little difference in item difficulty.

NAHT says it will now review the test results in the light of this response.

 

 

Who is responsible?

According to its most recent business plan, STA:

‘is responsible for setting and maintaining test standards’ (p3)

but it publishes little or nothing about the process involved, or how it handles representations such as that from NAHT.

Meanwhile, Ofqual says its role is:

‘to make sure the assessments are valid and fit for purpose, that the assessments are fair and manageable, that the standards are properly set and maintained and the results are used appropriately.

We have two specific objectives as set out by law:

  • to promote assessment arrangements which are valid, reliable and comparable
  • to promote public confidence in the arrangements.

We keep national assessments under review at all times. If we think at any point there might be a significant problem with the system, then we notify the Secretary of State for Education.’

Ofqual’s Chair has confirmed via Twitter that Ofqual was:

‘made aware at the time, considered the issues and observed level setting’.

Ofqual was content that the level-setting was properly undertaken.

 

 

I asked whether, in the light of that, Ofqual saw a role for itself in investigating the atypical results. I envisaged that this might take place under the Regulatory Framework for National Curriculum Assessments (2011).

This commits Ofqual to publishing annually its ‘programme for reviewing National Assessment arrangements’ (p14) as well as ‘an annual report on the outcomes of the review programme’ (p18).

However the most recent of these relates to 2011/12 and appeared in November of that year.

 

 

I infer from this that we may see some reaction from Ofqual, if and when it finally produces an annual report on National Curriculum Assessments in 2014, but that’s not going to appear before 2015 at the earliest.

I can’t help but feel that this is not quite satisfactory – that atypical test performance of this magnitude ought to trigger an automatic and transparent review, even if the overall number of learners affected is comparatively small.

If I were part of the system I would want to understand promptly exactly what happened, for fear that it might happen again.

If you are in any doubt quite how out of kilter the reading test outcomes were, consider the parallel results for Level 6 teacher assessment.

In 2013, 5,698 learners were assessed at Level 6 in reading through teacher assessment – almost exactly two-and-a-half times as many as achieved Level 6 in the test.

In 2014, a whopping 17,582 learners were assessed at Level 6 through teacher assessment, around 20 times as many as secured a Level 6 in the reading test.

If the ratio between test and teacher assessment results in 2014 had been the same as it was in 2013, the number successful on the test would have been around 7,000 – more than eight times the reported 851.
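The arithmetic behind that estimate is simple enough to set out in full, using the figures quoted in this post:

```python
# If the 2014 reading test had kept the 2013 ratio of teacher-assessed to
# test-awarded Level 6s, how many test passes would we have expected?
ta_2013, test_2013 = 5_698, 2_262
ta_2014, test_2014 = 17_582, 851

ratio_2013 = ta_2013 / test_2013          # about 2.5 TA awards per test pass
implied_test_2014 = ta_2014 / ratio_2013  # about 7,000 expected test passes

print(round(ratio_2013, 1))                     # 2.5
print(round(implied_test_2014))                 # roughly 6,980
print(round(implied_test_2014 / test_2014, 1))  # about 8.2 times the 851 reported
```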

I rest my case.

 

The new regime

In February 2013, a DfE-commissioned report Investigation of Key Stage 2 Level 6 Tests recommended that:

‘There is a need to review whether the L6 test in Reading is the most appropriate test to use to discriminate between the highest ability pupils and others given:

a) that only around 0.3 per cent of the pupils that achieved at least a level 5 went on to achieve a level 6 in Reading compared to 9 per cent for Mathematics

b) there was a particular lack of guidance and school expertise in this area

c) pupil maturity was seen to be an issue

d) the cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits.’

This has been overtaken by the decision to withdraw all three Level 6 tests and to rely on single tests of reading, GPS and maths for all learners when the new assessment regime is introduced from 2016.

Draft test frameworks were published in March 2014, supplemented in July by sample questions, mark schemes and commentary.

Given the imminent introduction of this new regime, together with schools’ experience in 2014, it seems increasingly unlikely that 2015 Level 6 test entries in reading will approach the 120,000 figure suggested by the trend.

Perhaps more importantly, schools and assessment experts alike seem remarkably sanguine about the prospect of single tests for pupils demonstrating the full range of prior attainment, apart from those assessed via the P-Scales. (The draft test frameworks are worryingly vague about whether those operating at the equivalent of Levels 1 and 2 will be included.)

I could wish to be equally sanguine, on behalf of all those learners capable of achieving at least the equivalent of Level 6 after 2015. But, as things stand, the evidence to support that position is seemingly non-existent.

In October 2013, Ofqual commented that:

‘There are also some significant technical challenges in designing assessments which can discriminate effectively and consistently across the attainment range so they can be reported at this level of precision.’

A year on, we still have no inkling whether those challenges have been overcome.

 

GP

September 2014