High Attainment in the 2014 Primary School Performance Tables


This is my annual post reviewing data about high attainment and high attainers at the end of Key Stage 2.

Data Overload courtesy of opensourceway

It draws on the 2014 Primary School Performance Tables, the accompanying Statistical First Release (SFR) and parallel material for previous years.

‘High attainment’ is taken to mean National Curriculum Level 5 and above.

‘High attainers’ are defined in accordance with the Performance Tables, meaning those with prior attainment above Level 2 in KS1 teacher assessments (average points score of 18 or higher). This measure obviously excludes learners who are particularly strong in one area but correspondingly weak in another.
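For readers who want to replicate the classification, here is a minimal sketch of the banding logic in Python. The high attainer threshold (a KS1 average points score of 18 or more) is the one given above; the boundary between low and middle attainers (taken here as an APS below 12) is my assumption for illustration, not a figure quoted in this post.

    def prior_attainment_band(ks1_aps):
        """Classify a pupil's KS1 average points score (APS) into the
        Performance Tables' prior attainment bands."""
        if ks1_aps >= 18:      # above Level 2 overall, as stated above
            return "high"
        elif ks1_aps >= 12:    # assumed lower boundary for 'middle'
            return "middle"
        else:
            return "low"

    # A pupil averaging Level 3 across the KS1 assessments (21 points)
    print(prior_attainment_band(21))  # -> "high"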

The proportions of the end-of-KS2 cohort defined as high, middle and low attainers have remained fairly constant since 2012.

High attainers presently constitute the top quartile of the relevant population, but this proportion is not fixed: it will increase as and when KS1 performance improves.

        High %   Middle %   Low %
2014      25        58        18
2013      25        57        18
2012      24        57        19

Table 1: Proportion of high, middle and low prior attainers in state-funded schools by year since 2012

 

The percentage of high attainers in different schools’ end-of-KS2 cohorts varies very considerably and is unlikely to remain constant from year to year. Schools with small year groups are particularly vulnerable to significant fluctuations.

The 2014 Performance Tables show that Minster School, in Southwell, Nottinghamshire and St Patrick’s Church of England Primary Academy in Solihull each had 88% high attainers.

Over 600 primary schools have 50% or more high attainers within their cohorts. But, at the other extreme, more than 570 have no high attainers at all, while some 1,150 have 5% or fewer.

This serves to illustrate the very unequal distribution of learners with high prior attainment between schools.

The commentary below opens with a summary of the headline findings. The subsequent sections focus in turn on the composite measure (reading, writing and maths combined), then on the outcomes of the reading, GPS (grammar, punctuation and spelling) and maths tests and finally on teacher assessment in writing.

I have tried to ensure that percentages are consistent throughout this analysis, but the effect of rounding means that some figures are slightly different in different SFR tables. I apologise in advance for – and will of course correct – any transcription errors.

Headlines

Overall Trends

Chart 1 below compares performance at level 5 and above (L5+) and level 4 and above (L4+) in 2013 and 2014. The bars on the left hand side denote L4+, while those corresponding to L5+ are on the right.


Chart 1: L4+ and L5+ performance compared, 2013-2014

With the exception of maths, which has remained unchanged, there have been improvements across the board at L4+, of between two and four percentage points.

The same is true at L5+ and, in the case of reading, GPS and writing, the percentage point improvements are larger than those at L4+. This is good news.

Chart 2 compares the gaps between disadvantaged learners (‘ever 6’ FSM plus children in care) and all other learners in state-funded schools on all five measures, for both 2013 and 2014.

Chart 2: Disadvantaged gaps at L4+ and L5+ for all five measures, 2013 and 2014

With the sole exception of the composite measure in 2013, each L4+ gap is smaller than the corresponding gap at L5+, though the difference can be as little as one percentage point (the composite measure) and as high as 11 percentage points (reading).

Whereas the L4+ gap in reading is lower than for any other measure, the L5+ reading gap is now the biggest. This suggests there is a particular problem with L5+ reading.

The distance between L4+ and L5+ gaps has typically widened since 2013, except in the case of maths, where it has narrowed by one percentage point.

While three of the L4+ gaps have closed slightly (composite, reading, GPS) the remainder are unchanged. However, two of the L5+ gaps have increased (composite, writing) and only the maths gap has closed slightly.

This suggests that what limited progress there has been in closing disadvantaged gaps has focused more on L4+ than L5+.

The pupil premium is not bringing about a radical improvement – and its impact is relatively lower at higher attainment levels.

A similar pattern is discernible with FSM gaps, as Chart 3 reveals. The chart excludes the composite measure, which is not supplied in the SFR.

Overall the picture at L4+ is cautiously positive, with small downward trends on three of the four measures, but the picture at L5+ is more mixed since two of the measures are unchanged.

Chart 3: FSM gaps at L4+ and L5+ compared, 2013 and 2014

Composite measure

  • Although the proportion of learners achieving this benchmark is slightly higher in converter academies than in LA-maintained schools, the latter have improved faster since 2013. The success rate in sponsored academies is half that in converter academies. Free schools are improving but remain behind LA-maintained schools. 
  • Some 650 schools achieve 50% or higher, but another 470 record 0% (fewer than the 600 which did so in 2013). 
  • 67% of high attainers achieved this benchmark in 2014, up five percentage points on 2013 but one third still fall short, demonstrating that there is extensive underachievement amongst high attainers in the primary sector. This rather undermines HMCI’s observations in his Commentary on the 2014 Annual Report. 
  • Although over 670 schools have a 100% success rate amongst their high attainers, 42 schools have recorded 0% (down from 54 in 2013). Several of these do better by their middle attainers. In 10 primary schools no high attainers achieve L4+ in reading, writing and maths combined.


Reading

  • The substantial improvement in L5+ reading performance since 2013 masks an as yet unexplained crash in Level 6 test performance. Only 874 learners in state-funded schools achieved L6 reading, compared with 2,137 in 2013. This is in marked contrast to a substantive increase in L6 test entries, the success rate on L6 teacher assessment and the trend in the other L6 tests. In 2013 around 12,700 schools had no pupils who achieved L6 reading, but this increased to some 13,670 schools in 2014. Even the performance of Chinese pupils (otherwise phenomenally successful on L6 tests) went backwards. 
  • The proportion of Chinese learners achieving L5 in reading has reached 65% (compared with 50% for White learners), having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012. 
  • 43 primary schools had a 100% success rate at Level 5 in the reading test, while 29 others registered 0%. 
  • Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013. 

GPS

  •  The proportion of Chinese learners achieving L5+ in the GPS test is now 75%, a seven percentage point improvement on 2013. Moreover, 15% achieved Level 6, up eight percentage points on 2013. (The comparable Level 5+ percentage for White learners is 50%). There are unmistakeable signs that Chinese ascendancy in maths is being replicated with GPS. 
  • Some 7,210 schools had no learners achieving L6 in the GPS test, compared with 10,200 in 2013. While 18 schools recorded a perfect 100% record at Level 5 and above, 33 had no learners at L5+. 


Maths

  • Chinese learners continue to make great strides. The percentage succeeding on the L6 test has climbed a further six percentage points and now stands at 35% (compared with 8% for White Pupils). Chinese boys are at 39%. The proportion of Chinese learners achieving level 6 is now comparable to the proportions of other ethnic groups achieving level 5. This lends further credence to the notion that we have our own domestic equivalent of Shanghai’s PISA success – and perhaps to the suggestion that focusing on Shanghai’s classroom practice may bring only limited benefits. 
  • While it is commendable that 3% of FSM and 4% of disadvantaged learners are successful in the L6 maths test, the gaps between them and other learners are increasing as the overall success rate grows. There are now seven percentage point gaps for FSM and disadvantaged alike. 
  • Ten schools managed a L6 success rate of 50% or higher, while some 280 were at 30% or higher. On the other hand, 3,200 schools had no L6 passes (down from 5,100 in 2013). 
  • About 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013 – and two percentage points more than the proportion of successful middle attainers. But 27 schools posted a success rate of 50% or below.


Writing (TA)

  • Chinese pupils do not match their performance on the GPS test, though 6% achieve L6 in writing TA compared with just 2% of white pupils. 
  • Three schools managed a 50% success rate at Level 6 and 56 were at 25% or above. Only one school managed 100% at L5, but some 200 scored 0%. 
  • Some 93% of all pupils make the expected progress in writing between KS1 and KS2. This is true of 95% of high attainers – and 95% of middle attainers too.

 

Composite measure: reading, writing and maths

Table 2 shows the overall proportion of learners achieving L5 or above in all of reading, writing and maths in each year since 2012.

 

              2012   2013   2014
L5+ overall   20%    21%    24%
L5+ boys      17%    18%    20%
L5+ girls     23%    25%    27%

Table 2: Proportion of all learners achieving KS2 L5+ in reading, writing and maths, 2012-2014

The overall success rate has increased by three percentage points compared with 2013 and by four percentage points since 2012.

The percentage of learners achieving L4+ has also improved by four percentage points since 2012, so the improvement at L5+ is broadly commensurate.

Over this period, girls’ lead over boys has remained relatively stable at between six and seven percentage points.

The SFR reveals that success on this measure varies significantly between school types.

The percentages for LA-maintained schools (24%) and all academies and free schools (23%) are little different.

However, mainstream converter academies stand at 26%, twice the 13% recorded by sponsored academies. Free schools are at 21%. These percentages have changed significantly compared with 2013.

Chart 4: Comparison of the proportion of learners achieving L5+ in reading, writing and maths, 2013 and 2014

Whereas free schools are making rapid progress and sponsored academies are also improving at a significant rate, converter academies are improving more slowly than LA-maintained schools.

The highest percentages on this measure in the Performance Tables are recorded by Fox Primary School in Kensington and Chelsea (86%) and Hampden Gurney CofE Primary School in Westminster (85%).

Altogether, some 650 schools have achieved success rates of 50% or higher, while 23 have managed 75% or higher.

At the other end of the spectrum about 470 schools have no learners at all who achieved this measure, fewer than the 600 recording this outcome in 2013.

Table 3 shows the gap between disadvantaged (ie ‘ever 6’ FSM and children in care) learners and others, as recorded in the Performance Tables.

          2012   2013   2014
Disadv      9     10     12
Other      24     26     29
Gap        15     16     17

Table 3: Proportion of disadvantaged learners achieving L5+ in reading, writing and maths, 2012-2014
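A note on method: the 'Gap' row here, and the gaps quoted throughout this post, are simple differences in percentage points. A minimal sketch reproducing them from the Table 3 figures:

    # Disadvantaged gaps at L5+ in reading, writing and maths (Table 3)
    disadvantaged = {2012: 9, 2013: 10, 2014: 12}   # % achieving the benchmark
    other         = {2012: 24, 2013: 26, 2014: 29}

    for year in sorted(disadvantaged):
        gap = other[year] - disadvantaged[year]     # gap in percentage points
        print(year, gap)                            # 15, 16, 17 respectively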


Although the percentage of disadvantaged learners achieving this benchmark has improved somewhat, the percentage of other learners doing so has improved faster, meaning that the gap between disadvantaged and other learners is widening steadily.

This contrasts with the trend at L4+, where the Performance Tables show a gap that has narrowed from 19 percentage points in 2012 (80% versus 61%) to 18 points in 2013 (81% versus 63%) and now to 16 points in 2014 (83% versus 67%).

Chart 5 below illustrates this comparison.


Chart 5: Comparing disadvantaged/other attainment gaps in KS2 reading, writing and maths combined at L4+ and L5+, 2012-2014.

While the L4+ gap has closed by three percentage points since 2012, the L5+ gap has widened by two percentage points. This suggests that disadvantaged learners amongst the top 25% by prior attainment are not benefiting commensurately from the pupil premium.

There are 97 primary schools where 50% or more disadvantaged learners achieve L5+ across reading, writing and maths (compared with 40 in 2013).

The highest performers record above 80% on this measure with their disadvantaged learners, albeit with cohorts of 6 to 8. Only one school with a more substantial cohort (of 34) manages over 70%. This is Tollgate Primary School in Newham.

The percentage of high attainers who achieved L5+ in 2014 was 67%, up five percentage points from 62% in 2013. (In 2012 the Performance Tables provided a breakdown for English and maths, which is not comparable).

Although this is a significant improvement, it means that one third of high attainers at KS1 still do not achieve this KS2 benchmark, suggesting that there is significant underachievement amongst this top quartile.

Thirteen percent of middle attainers also achieved this outcome, compared with 10% in 2013.

A significant number of schools – over 670 – do manage a 100% success rate amongst their high attainers, but there are also 42 schools where no high attainers achieve the benchmark (there were 54 in 2013). In several of them, more middle attainers than high attainers achieve the benchmark.

There are ten primary schools in which no high attainers achieve L4 in reading, writing and maths. Perhaps one should be thankful that no middle attainers in these schools achieve the benchmark either!

The KS2 average point score was 34.0 or higher in five schools, equivalent to a level 5A. The highest  APS was 34.7, recorded by Fox Primary School, with a cohort of 42 pupils.

Across all state-funded schools, the average value added measure for high attainers across reading, writing and maths is 99.8, the same as it was in 2013.

The comparable averages for middle attainers and low attainers are 100.0 and 100.2 respectively, showing that high attainers benefit slightly less from their primary education.

The highest value-added recorded for high attainers is 104.7 by Tudor Court Primary School in Thurrock, while the lowest is 93.7 at Sacriston Junior School in Durham (now closed).

Three more schools are below 95.0 and some 250 are at 97.5 or lower.


Reading Test

Table 4 shows the percentage of all learners, boys and girls achieving L5+ in reading since 2010. There has been a five percentage point increase (rounded) in the overall result since 2013, which restores performance to the level it had reached in 2010.

A seven percentage point gap in favour of girls remains unchanged from 2013. This is four points less than the comparable gender gap in 2010.


              2010   2011   2012   2013   2014
L5+ overall    50     43     48     44     50
Boys           45     37     43     41     46
Girls          56     48     53     48     53

Table 4: Percentage of learners achieving L5+ in reading since 2010


As reported in my September 2014 post ‘What Happened to the Level 6 Reading Results?’, L6 performance in reading collapsed in 2014.

The figures have improved slightly since the provisional results were released, but the collapse is still marked.

Table 5 shows the numbers successful since 2012.

The number of successful learners in 2014 is less than half the number successful in 2013 and almost back to the level of 2012, when the current test was introduced.

This is despite the fact that the number of entries for the level 6 test – 95,000 – was almost exactly twice the 47,000 recorded in 2012 and significantly higher than the 70,000 entries in 2013.

For comparison, the number of pupils awarded level 6 in reading via teacher assessment was 15,864 in 2013 and 17,593 in 2014.

We still have no explanation for this major decline which is entirely out of kilter with other L6 test outcomes.


          2012         2013         2014
        %    No      %    No      %    No
L6+     0    900     0   2,262    0    935
Boys    0    200     0    592     0    263
Girls   0    700     1   1,670    0    672

Table 5: Number and percentage of learners achieving L6 on the KS2 reading test 2012-2014


These figures include some pupils attending independent schools, but another table in the SFR reveals that 874 learners in state-funded primary schools achieved L6 (compared with 2,137 in 2013). Of these, all but 49 achieved L3+ in their KS1 reading assessment.

But some 13,700 of those with L3+ reading at the end of KS1 progressed to L4 or lower at the end of KS2.

The SFR does not supply the numbers of learners with different characteristics achieving L6 and all percentages are negligible. The only group recording a positive percentage is Chinese learners, at 1%.

In 2013, Chinese learners were at 2% and some other minority ethnic groups recorded 1%, so not even the Chinese have been able to withstand the collapse in the L6 success rate.

According to the SFR, the FSM gap at L5 is 21 percentage points (32% versus 53% for all other pupils). The disadvantaged gap is also 21 percentage points (35% versus 56% for all other pupils).

Chart 6 shows how these percentages have changed since 2012.


Chart 6: FSM and disadvantaged gaps for KS2 reading test at L5+, 2012-2014

FSM performance has improved by five percentage points compared with 2013, while disadvantaged performance has grown by six percentage points.

However, gaps remain unchanged for FSM and have increased by one percentage point for disadvantaged learners. There is no discernible or consistent closing of gaps in KS2 reading at L5.

These gaps of 21 percentage points for both FSM and disadvantaged learners are significantly larger than the comparable gaps at L4+ of 12 (FSM) and 10 (disadvantaged) percentage points.

The analysis of level 5 performance in the SFR reveals that the proportion of Chinese learners achieving level 5 has reached 65%, having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012.

Turning to the Performance Tables, we can see that, in relation to L6:

  • The highest recorded percentage achieving L6 is 17%, at Dent CofE Voluntary Aided Primary School in Cumbria. Thirteen schools recorded a L6 success rate of 10% or higher. (The top school in 2013 recorded 19%).
  • In 2013 around 12,700 schools had no pupils who achieved L6 reading, whereas in 2014 this had increased to some 13,670 schools.

In relation to L5:

  • 43 schools achieved a 100% record in L5 reading (compared with only 18 in 2013). All but one of these recorded 0% at L6, which may suggest that they were concentrating on maximising L5 achievement rather than risking L6 entry.
  • Conversely, there are 29 primary schools where no learners achieved L5 reading.

Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so.  However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013.

And 41 schools recorded a success rate of 50% or lower on this measure, most of them comfortably exceeding this with their low and middle attainers alike.


GPS Test

Since the grammar, punctuation and spelling test was first introduced in 2013, there is only a two-year run of data. Tables 6 and 7 below show performance at L5+ and L6+ respectively.


              2013 %   2014 %
L5+ overall     48       52
Boys            42       46
Girls           54       58
Girls 54 58

Table 6: Percentage of learners achieving L5+ in GPS, 2013 and 2014

          2013           2014
        %    No        %    No
L6+     2   8,606      4   21,111
Boys    1   3,233      3    8,321
Girls   2   5,373      5   12,790

Table 7: Number and percentage of learners achieving L6 in GPS, 2013 and 2014


Table 6 shows an overall increase of four percentage points in 2014 and the maintenance of a 12 percentage point gap in favour of girls.

Table 7 shows a very healthy improvement in L6 performance, which only serves to emphasise the parallel collapse in L6 reading. Boys have caught up a little on girls but the latter’s advantage remains significant.

The SFR shows that 75% of Chinese learners achieve L5 and above, up seven percentage points from 68% in 2013. Moreover, the proportion achieving L6 has increased by eight percentage points, to 15%. There are all the signs that Chinese eminence in maths is repeating itself with GPS.

Chart 7 shows how the FSM and disadvantaged gaps have changed at L5+ for GPS. The disadvantaged gap has remained stable at 19 percentage points, while the FSM gap has narrowed by one percentage point.

These gaps are somewhat larger than those at L4 and above, which stand at 17 percentage points for FSM and 15 percentage points for disadvantaged learners.


Chart 7:  FSM and disadvantaged gaps for KS2 GPS test at L5+, 2013 and 2014


The Performance Tables show that, in relation to L6:

  • The school with the highest percentage achieving level 6 GPS is Fulwood, St Peter’s CofE Primary School in Lancashire, which records a 47% success rate. Some 89 schools achieve a success rate of 25% or higher.
  • In 2014 there were some 7,210 schools that recorded no L6 performers at all, but this compares favourably with 10,200 in 2013. This significant reduction is in marked contrast to the increase in schools with no L6 readers.

Turning to L5:

  • 18 schools recorded a perfect 100% record for L5 GPS. These schools recorded L6 success rates that vary between 0% and 25%.
  • There are 33 primary schools where no learners achieved L5 GPS.


Maths test

Table 8 below provides the percentages of learners achieving L5+ in the KS2 maths test since 2010.

Over the five year period, the success rate has improved by eight percentage points, but the improvement in 2014 is less pronounced than it has been over the last few years.

The four percentage point lead that boys have over girls has changed little since 2010, apart from a temporary increase to six percentage points in 2012.


              2010   2011   2012   2013   2014
L5+ overall    34     35     39     41     42
Boys           36     37     42     43     44
Girls          32     33     36     39     40

Table 8: Percentage of learners achieving L5+ in KS2 maths test, 2010-2014


Table 9 shows the change in achievement in the L6 test since 2012. This includes pupils attending independent schools – another table in the SFR indicates that the total number of successful learners in 2014 in state-funded schools is 47,349, meaning that almost 95% of those achieving L6 maths are located in the state-funded sector.

There has been a healthy improvement since 2013, with almost 15,000 more successful learners – an increase of over 40%. Almost one in ten of the end of KS2 cohort now succeeds at L6. This places the reversal in L6 reading into even sharper relief.

The ratio between boys and girls has remained broadly unchanged, so boys continue to account for over 60% of successful learners.


          2012          2013          2014
        %    No       %    No       %    No
L6+     3   19,000    7   35,137    9   50,001
Boys    x   12,400    8   21,388   11   30,173
Girls   x    6,600    5   13,749    7   19,828

Table 9: Number and percentage of learners achieving L6 in KS2 maths test, 2012-2014


The SFR shows that, of those achieving L6 in state-funded schools, some 78% had achieved L3 or above at KS1. However, some 9% of those with KS1 L3 – something approaching 10,000 pupils – progressed only to L4, or lower.

The breakdown for minority ethnic groups shows that the Chinese ascendancy continues. This is illustrated by Chart 8 below.


Chart 8: KS2 L6 maths test performance by ethnic background, 2012-2014

In 2014, the percentage of Chinese achieving L5+ has increased by a respectable three percentage points to 74%, but the L6 figure has climbed by a further six percentage points to 35%. More than one third of Chinese learners now achieve L6 on the maths test.

This means that the proportion of Chinese pupils achieving L6 is now broadly similar to the proportions of other ethnic groups achieving Level 5 (34% of white pupils, for example).

They are fifteen percentage points ahead of the next best outcome – 20% recorded by Indian learners. White learners stand at 8%.

There is an eight percentage point gap between Chinese boys (39%) and Chinese girls (31%). The gap for white boys and girls is much lower, but this is a consequence of the significantly lower percentages.

Given that Chinese pupils are capable of achieving such extraordinary results under the present system, these outcomes raise significant questions about the balance between school and family effects and whether efforts to emulate Chinese approaches to maths teaching are focused on the wrong target.

Success rates in the L6 maths test are high enough to produce percentages for FSM and disadvantaged learners. The FSM and disadvantaged gaps both stand at seven percentage points, whereas they were at 5 percentage points (FSM) and 6 percentage points (disadvantaged) in 2013. The performance of disadvantaged learners has improved, but not as fast as that of other learners.

Chart 9 shows how these gaps have changed since 2012.

While the L6 gaps are steadily increasing, the L5+ gaps have remained broadly stable at 20 percentage points (FSM) and 21 percentage points (disadvantaged). There has been a small one percentage point improvement in the gap for disadvantaged learners in 2014, matching the similar small improvement for L4+.

The gaps at L5+ remain significantly larger than those at L4+ (13 percentage points for FSM and 11 percentage points for disadvantaged).


Chart 9: FSM and disadvantaged gaps, KS2 L5+ and L6 maths test, 2012 to 2014


The Performance Tables reveal that:

  • The school with the highest recorded percentage of L6 learners is Fox Primary School (see above) at 64%, some seven percentage points higher than its nearest rival. Ten schools achieve a success rate of 50% or higher (compared with only three in 2013), 56 at 40% or higher and 278 at 30% or higher.
  • However, over 3,200 schools record no L6 passes. This is a significant improvement on the 5,100 in this category in 2013, but the number is still far too high.
  • Nine schools record a 100% success rate for L5+ maths. This is fewer than the 17 that managed this feat in 2013.

Some 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013, two percentage points more than did so in reading in 2014 – and two percentage points more than the proportion of middle attainers managing this.

However, 27 schools had a success rate of 50% or below, the vast majority of them comfortably exceeding this with their middle attainers – and often their low attainers too.


Writing Teacher Assessment

Table 10 shows how the percentage achieving L5+ through the teacher assessment of writing has changed since 2012.

There has been a healthy five percentage point improvement overall, and an improvement of three percentage points since last year, stronger than the comparable improvement at L4+. The large gender gap of 15 percentage points in favour of girls is also unchanged since 2013.


              2012   2013   2014
L5+ overall    28     30     33
Boys           22     23     26
Girls          35     38     41

Table 10: Percentage achieving level 5+ in KS2 writing TA 2012-2014


Just 2% of learners nationally achieve L6 in writing TA – 11,340 pupils (10,654 of them located in state-funded schools).

However, this is a very significant improvement on the 8,410 recording this outcome in 2013. Just 3,928 of the total are boys (up from 2,861 in 2013).

Chinese ascendancy at L6 is not so significant: the Chinese success rate stands at 6%. However, if the comparator is performance at L5+, Chinese learners record 52%, compared with 33% for both white and Asian learners.

The chart below shows how FSM and disadvantaged gaps have changed at L5+ since 2012.

This indicates that the FSM gap, having widened by two percentage points in 2013, has narrowed by a single percentage point in 2014, so it remains higher than it was in 2012. Meanwhile the disadvantaged gap has widened by one percentage point since 2013.

The comparable 2014 gaps at L4+ are 15 percentage points (FSM) and 13 percentage points (disadvantaged), so the gaps at L5+ are significantly larger.


Chart 10: FSM and disadvantaged gaps, L5+ Writing TA, 2012-2014


The Performance Tables show that:

  • Three schools record an L6 success rate of 50% and only 56 are at 25% or higher.
  • At the other end of the spectrum, the number of schools with no L6s is some 9,780, about a thousand fewer than in 2013.
  • At L5+ only one school has a 100% success rate (there were four in 2013). Conversely, about 200 schools record 0% on this measure.

Some 93% of all pupils make the expected progress in writing between KS1 and KS2 and this is true of 95% of high attainers – the same percentage of middle attainers is also successful.

Conclusion

Taken together, this evidence presents a far more nuanced picture of high attainment and high attainers’ performance in the primary sector than suggested by HMCI’s Commentary on his 2014 Annual Report:

‘The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’

There are four particular areas of concern:

  • Underachievement amongst high attainers is too prevalent in far too many primary schools. Although there has been some improvement since 2013, the fact that only 67% of those with high prior attainment at KS1 achieve L5 in reading, writing and maths combined is particularly worrying.
  • FSM and disadvantaged achievement gaps at L5+ remain significantly larger than those at L4+ – and there has been even less progress in closing them. The pupil premium ought to be having a significantly stronger impact on these excellence gaps.
  • The collapse of L6 reading test results is all the more stark when compared with the markedly improved success rates in GPS and maths which HMCI notes. We still have no explanation of the cause.
  • The success rates of Chinese pupils on L6 tests remain conspicuous and in maths are frankly extraordinary. This evidence of a ‘domestic Shanghai effect’ should be causing us to question why other groups are so far behind them – and whether we need to look beyond Shanghai classrooms when considering how best to improve standards in primary maths.


GP

December 2014


A Closer Look at Level 6

This post provides a data-driven analysis of Level 6 (L6) performance at Key Stage 2, so as to:


  • Marshal the published information and provide a commentary that properly reflects this bigger picture;
  • Establish which data is not yet published but ought to be in the public domain;
  • Provide a baseline against which to measure L6 performance in the 2014 SATs; and
  • Initiate discussion about the likely impact of new tests for the full attainment span on the assessment and performance of the highest attainers, both before and after those tests are introduced in 2016.

Following an initial section highlighting key performance data across the three L6 tests – reading; grammar, punctuation and spelling (GPS); and maths – the post undertakes a more detailed examination of L6 achievement in English, maths and science, taking in both teacher assessment and test outcomes.

It  concludes with a summary of key findings reflecting the four purposes above.

Those who prefer not to read the substantive text can jump straight to the summary from here

I apologise in advance for any transcription errors and statistical shortcomings in the analysis below.

Background

Relationship with previous posts

This discussion picks up themes explored in several previous posts.

In May 2013 I reviewed an Investigation of Level 6 Key Stage 2 Tests commissioned by the Department for Education and published in February of that year.

My overall assessment of that report?

‘A curate’s egg really. Positive and useful in a small way, not least in reminding us that primary-secondary transition for gifted learners remains problematic, but also a missed opportunity to flag up some other critical issues – and of course heavily overshadowed by the primary assessment consultation on the immediate horizon.’

The performance of the highest primary attainers also featured strongly in an analysis of the outcomes of NAHT’s Commission on Assessment (February 2014) and this parallel piece on the response to the consultation on primary assessment and accountability (April 2014).

The former offered the Commission two particularly pertinent recommendations, namely that it should:

‘shift from its narrow and ‘mildly accelerative’ view of high attainment to accommodate a richer concept that combines enrichment (breadth), extension (depth) and acceleration (faster pace) according to learners’ individual needs.’

Additionally it should:

‘incorporate a fourth ‘far exceeded’ assessment judgement, since the ‘exceeded’ judgement covers too wide a span of attainment.’

The latter discussed plans to discontinue L6 tests by introducing from 2016 single tests for the full attainment span at the end of KS2, from the top of the P-scales to a level the initial consultation document described as ‘at least of the standard of’ the current L6.

It opined:

‘The task of designing an effective test for all levels of prior attainment at the end of key stage 2 is…fraught with difficulty…I have grave difficulty in understanding how such assessments can be optimal for high attainers and fear that this is bad assessment practice.’

Aspects of L6 performance also featured in a relatively brief review of High Attainment in 2013 Primary School Performance Tables (December 2013). This post expands significantly on the relevant data included in that one.

The new material is drawn from three principal sources:

The recent history of L6 tests

Level 6 tests have a rather complex history. The footnotes to SFR 51/2013 simplify this considerably, noting that:

  • L6 tests were initially available from 1995 to 2002
  • In 2010 there was an L6 test for mathematics only
  • Since 2012 there have been tests of reading and mathematics
  • The GPS test was introduced in 2013.

In fact, the 2010 maths test was the culmination of an earlier QCDA pilot of single level tests. In that year the results from the pilot were reported as statutory National Curriculum test results in pilot schools.

In 2011 optional L6 tests were piloted in reading, writing and maths. These were not externally marked and the results were not published.

The June 2011 Bew Report came out in favour:

‘We believe that the Government should continue to provide level 6 National Curriculum Tests for schools to use on an optional basis, whose results should be reported to parents and secondary schools.’

Externally marked L6 tests were offered in reading and maths in 2012, alongside L6 teacher assessment in writing. The GPS test was added to the portfolio in the following year.

In 2012, ministers were talking up the tests, describing them as:

‘…a central element in the Coalition’s drive to ensure that high ability children reach their potential. Nick Gibb, the schools minister, said: “Every child should be given the opportunity to achieve to the best of their abilities.

“These tests will ensure that the brightest pupils are stretched and standards are raised for all.”’

In 2012 the Primary Performance Tables used L6 results only in the calculation of ‘level 5+’, APS, value-added and progress measures, but this was not the case in 2013.

The Statement of Intent on the Tables said:

‘…the percentage of the number of children at the end of Key Stage 2 achieving level 6 in a school will also be shown in performance tables. The Department will not publish any information at school level about the numbers of children entered for the level 6 tests, or the percentage achieving level 6 of those entered for level 6.’

The nature of the tests is unchanged for 2014: they took place on 12, 13 and 15 May respectively. This post is timed to coincide with their administration.

The KS2 ARA booklet continues to explain that:

‘Children entered for level 6 tests are required to take the levels 3-5 tests. Headteachers should consider a child’s expected attainment before registering them for the level 6 tests as they should be demonstrating attainment above level 5. Schools may register children for the level 6 tests and subsequently withdraw them.

The child must achieve a level 5 in the levels 3-5 test and pass the corresponding level 6 test in the same year in order to be awarded an overall level 6 result. If the child does not pass the level 6 test they will be awarded the level achieved in the levels 3-5 test.’
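The award logic in that quotation reduces to a simple rule. A minimal sketch, assuming straightforward pass/fail outcomes on each paper:

    def overall_ks2_level(levels_3_5_result, entered_l6=False, passed_l6=False):
        """Apply the ARA rule quoted above: L6 is awarded only if the pupil
        achieves L5 on the levels 3-5 test AND passes the L6 test in the
        same year; otherwise the levels 3-5 result stands."""
        if entered_l6 and passed_l6 and levels_3_5_result == 5:
            return 6
        return levels_3_5_result

    print(overall_ks2_level(5, entered_l6=True, passed_l6=True))   # -> 6
    print(overall_ks2_level(5, entered_l6=True, passed_l6=False))  # -> 5
    print(overall_ks2_level(4, entered_l6=True, passed_l6=True))   # -> 4 (no L5 on the levels 3-5 test)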

Anticipated future developments

At the time of writing the Government has not published a Statement of Intent explaining whether there will be any change in the reporting of L6 results in the December 2014 Primary School Performance Tables.

An accompanying Data Warehouse (aka Portal) is also under development and early iterations are expected to appear before the next set of Tables. The Portal will make available a wider range of performance data, some of it addressing high attainment.

The discussion in this post of material not yet in the public domain is designed in part as a marker to influence consideration of material for inclusion in the Portal.

As noted above, the Government has published its response to the consultation on primary assessment and accountability arrangements, confirming that new single assessments for the full attainment span will be introduced in 2016.

At the time of writing, there is no published information about the number of entries for the 2014 tests. (In 2013 these details were released in the reply to a Parliamentary Question.)

Entries had to be confirmed by March 2014, so it may be that the decision to replace the L6 tests, not confirmed until that same month, has not impacted negatively on demand. The effect on 2015 entries remains to be seen, but there is a real risk that these will be significantly depressed.

L6 tests are scheduled to be taken for the final time in May 2015. The reading and maths tests will have been in place for four consecutive years; the GPS test for three.

Under the new arrangements there will continue to be tests in reading, GPS and maths – plus a sampling test in science – as well as teacher assessment in reading, writing, maths and science.

KS2 test outcomes (but not teacher assessment) will be reported by means of a scaled score for each test, alongside three average scaled scores, for the school, the local area and nationally.

The original consultation document proposed that each scaled score would be built around a ‘secondary readiness standard’ loosely aligned with the current L4B, but converted into a score of 100.

The test development frameworks mention that:

‘at the extremes of the scaled score distribution, as is standard practice, the scores will be truncated such that above and below a certain point, all children will be awarded the same scaled score in order to minimise the effect for children at the ends of the distribution where the test is not measuring optimally.’
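In other words, scores will be clamped at the extremes. The sketch below illustrates the principle only: the framework does not publish the truncation points, so the floor and ceiling values here are invented for the example, with 100 representing the proposed ‘secondary readiness standard’.

    def truncated_scaled_score(raw_scaled, floor=80, ceiling=120):
        """Clamp a scaled score so that every child beyond either extreme
        receives the same score. The 80/120 bounds are illustrative
        assumptions, not published values."""
        return max(floor, min(ceiling, raw_scaled))

    print(truncated_scaled_score(126))  # -> 120: all very high attainers score alike
    print(truncated_scaled_score(100))  # -> 100: the 'secondary readiness standard'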

A full set of sample materials including tests and mark schemes for every test will be published by September 2015, the beginning of the academic year in which the new tests are first deployed.

The consultation document said these single tests would:

‘include challenging material (at least of the standard of the current level 6 test) which all pupils will have the opportunity to answer, without the need for a separate test’.

The development frameworks published on 31 March made it clear that the new tests should:

‘provide a suitable challenge for all children and give every child the opportunity to achieve as high a standard…as possible.’

Additionally:

‘In order to improve general accessibility for all children, where possible, questions will be placed in order of difficulty.’

These various and potentially conflicting statements informed the opinion I have already repeated.

The question then arises whether the Government’s U-turn on separate tests for the highest attainers is in the latter’s best interests. There cannot be a continuation of L6 tests per se, because the system of levels that underpins them will no longer exist, but separate tests could in principle continue.

Even if the new universal tests provide equally valid and reliable judgements of their attainment – which is currently open to question – one might reasonably argue that the U-turn itself may undermine continuity of provision and continued improvement in schools’ practice.

The fact that this practice needs substantive improvement is evidenced by Ofsted’s recent decision to strengthen the attention given to the attainment and progress of what they call ‘the most able’ in all school inspection reports.

L6 tests: Key Performance Data

Entry and success rates

As noted above, the information in the public domain about entry rates to L6 tests is incomplete.

The 2013 Investigation provides the number of pupils entered for each test in 2012. We do not have comparable data for 2013, but a PQ reply does supply the number of pupils registered for the tests in both 2012 and 2013. This can be supplemented by material in the 2013 SFR and the corresponding 2012 publication.

The available data is synthesised in this table showing for each year – and where available – the number registered for each test, the number entered, the total number of pupils achieving L6 and, of those, the number attending state-funded schools.

                     2012                              2013
          Reg      Ent      Pass     Pass SF   Reg      Ent   Pass     Pass SF
Reading   47,148   46,810   942      x         73,118   x     2,262    2,137
GPS       x        x        x        x         61,883   x     8,606    x
Maths     55,809   55,212   18,953   x         80,925   x     35,137   33,202

One can see that there are relatively small differences between the numbers of pupils registered and the number entered, so the former is a decent enough proxy for the latter. I shall use the former in the calculations immediately below.

It is also evident that the proportions of learners attending independent schools who achieve L6 are small though significant. But, given the incomplete data set for state-funded schools, I shall use the pass rate for all schools in the following calculations.

In sum then, in 2012, the pass rates per registered entry were:

  • Reading – 2.0%
  • Maths – 34.0%

And in 2013 they were:

  • Reading – 3.1%
  • GPS – 13.9%
  • Maths – 43.4%
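These rates follow directly from the registration and pass figures in the table above. A minimal sketch of the calculation (all-schools data, registrations used as a proxy for entries):

    registered = {("reading", 2012): 47148, ("maths", 2012): 55809,
                  ("reading", 2013): 73118, ("gps", 2013): 61883,
                  ("maths", 2013): 80925}
    passed     = {("reading", 2012): 942,   ("maths", 2012): 18953,
                  ("reading", 2013): 2262,  ("gps", 2013): 8606,
                  ("maths", 2013): 35137}

    for key in sorted(registered):
        rate = 100 * passed[key] / registered[key]   # pass rate per registration
        print(key, round(rate, 1))  # reading 2012 -> 2.0; maths 2013 -> 43.4; etc.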

The pass rates in 2013 have improved significantly in both reading and maths, the former from a very low base. However, the proportion of learners successful in the L6 reading test remains extremely small.

The 2013 Investigation asserted, on the basis of the 2012 results, that:

‘The cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits’

However it did not publish any information about that cost.

It went on to suggest that there is a case for reviewing whether the L6 test is the most appropriate means to  ‘identify a range of higher performing pupils, for example the top 10%’. The Government chose not to act on this suggestion.

Gender, ethnic background and disadvantage

The 2013 results demonstrate some very significant gender disparities, as revealed in Chart 1 below.

Girls account for 62% of successful pupils in GPS and a whopping 74% in reading, while boys account for 61% of successful pupils in maths. These imbalances raise important questions about whether gender differences in high attainment are really this pronounced, or whether there is significant underachievement amongst the under-represented gender in each case.

Chart 1: Number of pupils successful in 2013 L6 tests by gender


There are equally significant disparities in performance by ethnic background. Chart 2 below illustrates how the performance of three selected ethnic minority groups – white, Asian and Chinese – varies by test and gender.

It shows that pupils from Chinese backgrounds have a marked ascendancy in all three tests, while Asian pupils are ahead of white pupils in GPS and maths but not reading. Within all three ethnic groups, girls lead in reading and GPS while boys lead in maths. Chinese girls comfortably out-perform white and Asian boys.

Chinese pupils are way ahead in maths, with 29% overall achieving L6 and an astonishing 35% of Chinese boys achieving this outcome.

The reasons for this vast disparity are not explained and raise equally awkward questions about the distribution of high attainment and the incidence of underachievement.

 

Chart 2: Percentages of pupils successful in 2013 L6 tests by gender and selected ethnic background


There are also significant excellence gaps on each of the tests, though these are hard to visualise when working solely with percentages (pupil numbers have not been published).

The percentage variations are shown in the table below. This sets out the FSM gap and the disadvantaged gap, the latter being based on the ever-6 FSM measure that underpins the Pupil Premium.

These figures suggest that, while learners eligible for the Pupil Premium are demonstrating success on the maths test (and, for girls at least, on the GPS test too), they are less than a third as likely to be successful as their more advantaged peers. The impact of the Pupil Premium is therefore limited.

The gap between the two groups reaches as high as seven percentage points for boys in maths. Although this is low by comparison with the corresponding gap at level 4, it is nonetheless significant. There is more about excellence gaps in maths below.

 

           Reading        GPS          Maths
           G     B        G     B      G     B
FSM        0     0        1     0      2     3
Non-FSM    1     0        2     1      6     9
Gap        1     0        1     1      4     6

Dis        0     0        1     0      2     3
Non-Dis    1     0        3     2      7    10
Gap        1     0        2     2      5     7

Schools achieving L6 success

Finally in this opening section, a comparison of schools achieving L6 success in the 2013 Primary School Performance Tables reveals different patterns for each test.

The table below shows how many schools secured different percentages of pupils at L6. The number of schools achieving 11-20% at L6 in the GPS test is over twelve times the number that achieved that outcome in reading. But over eight times more schools secured this outcome in maths than managed it in GPS.

No schools made it beyond 20% at L6 in reading and none pushed beyond 40% at L6 in GPS, but the outliers in maths managed well over 60% and even 70% returns.

          11-20%   21-30%   31-40%   41-50%   51-60%   61-70%   71-80%   Total
Reading   24       –        –        –        –        –        –        24
GPS       298      22       2        –        –        –        –        322
Maths     2,521    531      106      25       0        1        2        3,186

There is also some evidence of schools being successful in more than one test.

Amongst the small sample of 28 schools that secured 41% or more L6s in maths, two also featured amongst the top 24 performers in reading and five amongst the top 24 performers in GPS.

The school with arguably the best record across all three tests is Christ Church Primary School in Hampstead, which secured 13% in reading, 21% in GPS and 46% in maths, from a KS2 cohort of 24. The FSM/Pupil Premium rates at the school are low but, nevertheless, this is an outstanding result.

The following sections look more closely at L6 test and teacher assessment results in each subject. Each section consists of a series of bullet points highlighting significant findings.

English

 

Reading Test

The evidence on performance on the L6 reading test is compromised to some extent by the tiny proportions of pupils that achieve it. However:

  • 9,605 schools registered pupils for the 2013 L6 reading test, up 48% from 6,469 in 2012, and the number of pupils registered increased from 47,148 in 2012 to 73,118 in 2013, an increase of 55%.
  • Of the 539,473 learners who undertook the 2013 KS2 reading tests, only 2,262 (about 0.42%) achieved L6. This figure includes some in independent schools; the comparable figure for state-funded schools only is 2,137, so 5.5% of L6s were secured in the independent sector.
  • Of this first total – ie including pupils from independent schools – 1,670 were girls (0.63% of all girls who undertook the KS2 reading tests) and 592 were boys (0.21% of all boys who undertook the KS2 reading tests).
  • These are significant improvements on the comparable 2012 figures which showed about 900 learners achieving L6, including 700 girls and 200 boys. (The figures were rounded in the SFR but the 2013 evaluation confirmed the actual number as 942). The overall percentage achieving L6 therefore increased by about 140% in 2013, compared with 2012. If we assume registration for L6 tests as a proxy for entry, this suggests that just over 3% of entrants passed in 2013.
  • In state-funded schools only, the percentage of learners from a Chinese background entered for KS2 reading tests who achieved L6 reaches 2%, compared with 1% for those of mixed background and 0% for learners from white, Asian and black backgrounds.
  • Amongst the defined sub-groups, learners of Irish, any other white, white and Asian and any other Asian backgrounds also make it to 1%. All the remainder are at 0%.
  • The same is true of EAL learners and native English speakers, FSM-eligible and disadvantaged learners, making worthwhile comparisons almost impossible.
  • The 2013 transition matrices show that 12% of learners who had achieved L4 at the end of KS1 went on to achieve L6, while 1% of those who had achieved L3 did so. Hence the vast majority of those at L4 in KS1 did not make two levels of progress.
  • Progression data in the SFR shows that, of the 2,137 learners achieving L6 in state funded schools, 2,047 were at L3 or above at KS1, 77 were at L2A, 10 were at L2B and 3 were at L2C. Of the total population at KS1 L3 or above, 1.8% progressed to L6.
  • Regional and local authority breakdowns are given only as percentages, of limited value for comparative purposes because they are so small. Only London and the South East record 1% at L6 overall, with all the remaining regions at 0%. Only one local authority – Richmond upon Thames – reaches 2%.
  • However 1% of girls reach L6 in all regions apart from Yorkshire and Humberside and a few more authorities record 2% of girls at L6: Camden, Hammersmith and Fulham, Kensington and Chelsea, Kingston, Richmond and Solihull.
  • The 2013 Primary School Performance Tables show that some 12,700 schools recorded no learners achieving L6.
  • At the other end of the spectrum, 36 schools recorded 10% or more of their KS2 cohort achieving L6. Four of these recorded 15% or higher:

Iford and Kingston C of E Primary School, East Sussex (19%; cohort of 21).

Emmanuel C of E Primary School, Camden (17%; cohort of 12).

Goosnargh Whitechapel Primary School, Lancashire (17%; cohort of 6).

High Beech  C of E VC Primary School, Essex (15%; cohort of 13).

Reading TA

There is relatively little data about teacher assessment outcomes.

  • The total number of pupils in all schools achieving L6 in reading TA in 2013 is 15,864 from a cohort of 539,729 (2.94%). This is over seven times as many as achieved L6 in the comparable test (whereas in maths the figures are very similar). It would be useful to know how many pupils achieved L6 in TA, were entered for the test and did not succeed.
  • The number of successful girls is 10,166 (3.85% of females assessed) and the number of boys achieving L6 is 5,698 (2.06% of males assessed). Hence the gap between girls and boys is far narrower on TA than it is on the corresponding test.
  • Within the 2013 Performance Tables, eight schools recorded 50% or more of their pupils at L6, the top performer being Peppard Church of England Primary School, Oxfordshire, which reached 83% (five from a cohort of six).

 

Writing (including GPS)

 

GPS Test

The L6 Grammar, Punctuation and Spelling (GPS) test was newly introduced in 2013. This is what we know from the published data:

  • The number of schools that registered for the test was 7,870, almost 2,000 fewer than registered for the reading test. The number of pupil registrations was 61,883, over 12,000 fewer than for reading.
  • The total number of successful learners is 8,606, from a total of 539,438 learners assessed at KS2, including those in independent schools taking the tests, giving an actual percentage of 1.6%. As far as I can establish, a comparable figure for state-funded schools is not available.
  • As with reading, there are significant differences between boys and girls. There were 5,373 successful girls (2.04% of girls entered for KS2 GPS tests) and 3,233 successful boys (1.17% of boys entered for KS2 GPS). This imbalance in favour of girls is significant, but not nearly as pronounced as in the reading test.
  • The proportion of pupil registrations for the L6 GPS test resulting in L6 success is around one in seven (13.9%), well over four times as high as for reading.
  • The ethnic breakdown in state-funded schools shows that Chinese learners are again in the ascendancy. Overall, 7% of pupils from a Chinese background achieved L6, compared with 1% white, 2% mixed, 2% Asian and 1% black.
  • Chart 3 below shows how L6 achievement in GPS varies between ethnic sub-groups. Indian pupils reach 4% while white and Asian pupils score 3%, as do pupils from any other Asian background.

Chart 3: 2013 GPS L6 performance by ethnic sub-groups


  • When gender differences are taken into account, Chinese girls are at 8% (compared with boys at 7%), ahead of Indian girls at 5% (boys 3%), white and Asian girls at 4% (boys 3%) and any other Asian girls also at 4% (boys 3%). The ascendancy of Chinese girls over boys from any other ethnic background is particularly noteworthy and replicates the situation in maths (see below).
  • Interestingly, EAL learners and learners with English as a native language both record 2% at L6. Although these figures are rounded, it suggests that exceptional performance in this aspect of English does not correlate with being a native speaker.
  • FSM-eligible learners register 0%, compared with 2% for those not eligible. However, disadvantaged learners are at 1% and non-disadvantaged 2% (Disadvantaged boys are at 0% and non-disadvantaged girls at 3%). Without knowing the numbers involved we can draw few reliable conclusions from this data.
  • Chart 4 below illustrates the regional breakdown for boys, girls and both genders. At regional level, London reaches 3% success overall, with both the South East and Eastern regions at 2% and all other regions at 1%. Girls record 2% in every region apart from the North West and Yorkshire and Humberside. Only in London do boys reach 2%.

 

Chart 4: 2013 L6 GPS outcomes by gender and region


  • At local authority level the highest scoring are Richmond (7%); the Isles of Scilly (6%); Kingston and Sutton (5%); and Harrow, Hillingdon and Wokingham (4%).
  • The School Performance Tables reveal that some 10,200 schools posted no L6 results while, at the other extreme, 34 schools recorded 20% or more of their KS2 cohort at L6 and 463 schools managed 10% or above. The best records were achieved by:

St Joseph’s Catholic Primary School, Southwark (38%; cohort of 24).

The Vineyard School, Richmond  (38%; cohort of 56).

Cartmel C of E Primary School (29%; cohort of 7) and

Greystoke School (29%; cohort of 7).

Writing TA

When it comes to teacher assessment:

  • 8,410 learners from both state and independent schools out of a total of 539,732 assessed (1.56%) were judged to be at L6 in writing. The total figure for state-funded schools is 7,877 pupils. This is very close to the number successful in the L6 GPS test, even though the focus is somewhat different.
  • Of these, 5,549 are girls (2.1% of all girls assessed) and 2,861 boys (1.04% of all boys assessed). Hence the imbalance in favour of girls is more pronounced in writing TA than in the GPS test, whereas the reverse is true for reading. 
  • About 5% of learners from Chinese backgrounds achieve L6, as do 3% of white and Asian pupils and 3% of Irish pupils.
  • The 2013 transition matrices record progression in writing TA, rather than in the GPS test. They show that 61% of those assessed at L4 at KS1 go on to achieve L6, so only 6 out of 10 are making the expected minimum two levels of progress. On the other hand, some 9% of those with KS1 L3 go on to achieve L6, as do 2% of those at L2A.
  • The SFR provides further progression data – again based on the TA outcomes – for state-funded schools only. It shows us that one pupil working towards L1 at KS1 went on to achieve L6 at KS2, as did 11 at L1, 54 at L2C, 393 at L2B, 1,724 at L2A and 5,694 at L3 or above. Hence some pupils are making five or more levels of progress.
  • The regional breakdown – this time including independent schools – gives the East Midlands, West Midlands, London and the South West at 2%, with all the rest at 1%. At local authority level, the best performers are: City of London at 10%; Greenwich, Kensington and Chelsea and Richmond at 5%; and Windsor and Maidenhead at 4%.

English TA

There is additionally a little information about pupils achieving L6 across English as a whole:

  • The SFR confirms that 8,087 pupils (1.5%) were assessed at L6 in English, including 5,244 girls (1.99% of all girls entered) and 2,843 boys (1.03% of all boys entered). These figures are for all schools, including independent schools.
  • There is a regional breakdown showing the East and West Midlands, London and the South West at 2%, with all the remainder at 1%. Amongst local authorities, the strongest performers are City of London (10%); and Bristol, Greenwich, Hackney, Richmond, Windsor and Maidenhead (4%). The exceptional performance of Bristol, Greenwich and Hackney is noteworthy.
  • In the Performance Tables, 27 schools record 30% or more pupils at L6 across English, the top performer again being Newton Farm, at 60%.

Maths

L6 performance in maths is more common than in the other tests and subjects, and the higher percentages typically support more meaningful comparisons.

  • The number of school registrations for L6 maths in 2013 was 11,369, up almost 40% from 8,130 in 2012. The number of pupil registrations was 80,925, up some 45% from 55,809 in 2012.
  • The number of successful pupils – in both independent and state schools – was 35,137 (6.51% of all KS2 candidates). The gender imbalance in reading and GPS is reversed, with 21,388 boys at this level (7.75% of males entered for the overall KS2 test) compared with 13,749 girls (5.22% of females entered for the test). The SFR gives a total for state-funded schools of 33,202 pupils, so some 5.5% of Level 6s were achieved in independent schools.
  • Compared with 2012, the number of successful pupils has increased from 18,953 – a rise of 85%, not as large as the increase for reading but very substantial nevertheless.
  • The number of successful girls has risen by some 108% from 6,600 (rounded) and the number of successful boys by about 72%, from 12,400 (rounded), so the improvement in girls’ success is markedly larger than the corresponding improvement for boys.  
  • Assuming L6 test registration as a proxy for entry, the success rate in 2013 is around 43.4%, massively better than for reading (3%) and GPS (13.9%). The corresponding success rate in 2012 was around 34%. (Slightly different results would be obtained if one used actual entry rates and passes for state schools only, but we do not have these figures for both years.) The short calculation sketched before Chart 5 below sets out this arithmetic.
  • The breakdown in state-funded schools for the main ethnic groups by gender is illustrated by Chart 5 below. This shows how performance by boys and girls varies according to whether they are white (W), mixed (M), Asian (A), black (B) or Chinese (C). It also compares the outcomes in 2012 and 2013. The superior performance of Chinese learners is evident, with Chinese boys reaching a staggering 35% success rate in 2013. As things stand, Chinese boys are almost nine times more likely to achieve L6 than black girls.
  • Chart 5 also shows that none of the gender or ethnic patterns has changed between 2012 and 2013, but some groups are making faster progress, albeit from a low base. This is especially true of white girls, black boys and, to a slightly lesser extent, Asian girls.
  • Chinese girls and boys have improved at roughly the same rate and black boys have progressed faster than black girls but, in the remaining three groups, girls are improving at a faster rate than boys.
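
The arithmetic behind these success rates is simple division. Here is a minimal sketch using the registration and pass figures quoted above, treating registration as a proxy for entry:

```python
# L6 maths: success rate = successful pupils / pupil registrations,
# with registration used as a proxy for entry, as in the text above.
registrations = {2012: 55_809, 2013: 80_925}
passes = {2012: 18_953, 2013: 35_137}

for year in (2012, 2013):
    rate = passes[year] / registrations[year]
    print(f"{year}: {rate:.1%} of registered pupils achieved L6")
# 2012: 34.0%   2013: 43.4%

growth = passes[2013] / passes[2012] - 1
print(f"Increase in successful pupils, 2012 to 2013: {growth:.0%}")  # ~85%
```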

Chart 5: L6 Maths test by main ethnic groups and gender

L6 chart 5

  • Amongst sub-groups not included on this chart, the highest performing are: any other Asian background 15%, Indian 14%, White and Asian 11% and Irish 10%. Figures for Gypsy/Roma and any other white background are suppressed, while travellers of Irish heritage are at 0%, black Caribbean at 2% and any other black background at 3%. In these latter cases, the differential with Chinese performance is huge.
  • EAL learners record a 7% success rate, compared with 6% for native English speakers, an improvement on the level pegging recorded for GPS. The gap is two percentage points for boys (9% versus 7% in favour of EAL) and one percentage point for girls (6% versus 5% in favour of EAL). The advantage enjoyed by EAL learners was also evident in 2012.
  • The table below shows the position for FSM and disadvantaged learners by gender, and how this has changed since 2012.
             FSM   Non-FSM   Gap   Dis   Non-dis   Gap
Boys  2012    1%        5%    4%    1%        6%    5%
Boys  2013    3%        9%    6%    3%       10%    7%
Girls 2012    1%        3%    2%    1%        3%    2%
Girls 2013    2%        6%    4%    2%        7%    5%
All   2012    1%        4%    3%    1%        4%    3%
All   2013    2%        7%    5%    2%        8%    6%
  • This shows that the gaps between FSM and non-FSM learners, and between disadvantaged and non-disadvantaged learners, have grown – for boys, girls and the groups as a whole – between 2012 and 2013. All the gaps have widened by two or three percentage points, with the largest increases in the disadvantaged gaps for girls and for the cohort as a whole.
  • The gaps all lie between two and seven percentage points, so they are not large compared with those lower down the attainment spectrum, but the fact that they are widening is a significant cause for concern, suggesting that Pupil Premium funding is not having an impact at L6 in maths.
  • The Transition Matrices show that 89% of learners assessed at L4 in KS1 went on to achieve L6, while 26% of those with L3 at KS1 did so, as did 4% of those with L2A and 1% of those with L2B. Hence a noticeable minority is making four levels of progress.
  • The progression data in the SFR, relating to state-funded schools, show that one pupil made it from W at KS1 to L6, while 8 had L1, 82 had 2C, 751 had 2B, 4,983 had 2A and 27,377 had L3. Once again, a small minority of learners is making four or five levels of progress.
  • At regional level, the breakdown is: NE 6%, NW 6%, Y+H 5%, EM 6%, WM 6%, E 6%, London 9%, SE 7% and SW 6%. So London has a clear lead in respect of the proportion of its learners achieving L6.
  • The local authorities leading the rankings are: City of London 24%, Richmond 19%, Isles of Scilly 17%, Harrow and Kingston 15%, Trafford and Sutton 14%. No real surprises there!
  • The Performance Tables show 33 schools achieved 40% or higher on this measure. Eight schools were at 50% or above. The best performing schools were:

St Oswald’s C of E Aided Primary School, Cheshire West and Chester (75%; cohort 8)

St Joseph’s Roman Catholic Primary School, Hurst Green, Lancashire (71%; cohort 7)

Haselor School, Warwickshire (67%; cohort 6).

  • Some of the schools achieving 50% were significantly larger, notably Bowdon C of E Primary School, Trafford, which had a KS2 cohort of 60.

Maths TA

The data available on maths TA is more limited:

  • Including pupils at independent schools, a total of 33,668 were assessed at L6 in maths (6.24% of all KS2 candidates). This included 20,336 boys (7.37% of all male KS2 candidates) and 13,332 girls (5.06% of all female candidates). The number achieving L6 maths TA is slightly lower than the corresponding number achieving L6 in the test.
  • The regional breakdown was as follows: NE 5%, NW 5%, Y+H 5%, EM 5%, WM 6%, E 6%, London 8%, SE 7%, SW 6%, so London’s ascendancy is not as significant as in the test.
  • The strongest local authority performers are: City of London 24%; Harrow and Richmond 15%; Sutton 14%; Trafford 13%; Solihull and Bromley 12%.
  • In the Performance Tables, 63 schools recorded 40% or higher on this measure, 15 of them at 50% or higher. The top performer was St Oswald’s C of E Aided Primary School (see above) with 88%.

Science

Science data is confined to teacher assessment outcomes.

  • A total of just 1,633 pupils achieved L6 in 2013, equivalent to 0.3% of the KS2 science cohort. Of these, 1,029 were boys (0.37%) and 604 were girls (0.23%), suggesting a gender imbalance broadly similar to that in maths.
  • No region, and only a handful of local authorities, recorded a success rate of even 1%.
  • In the Performance Tables, 31 schools managed 20% or higher and seven schools were above 30%. The best performing were:

Newton Farm (see above) (50%; cohort 30)

Hunsdon Junior Mixed and Infant School, Hertfordshire (40%; cohort 10)

Etchingham Church of England Primary School, East Sussex (38%; cohort 16)

St Benedict’s Roman Catholic Primary School Ampleforth, North Yorkshire (36%; cohort 14).

Conclusions

 

Key findings from this data analysis

I will not repeat all of the significant points highlighted above, but the following seem particularly worthy of attention and further analysis:

  • The huge variation in success rates for the three L6 tests. The proportion of learners achieving L6 in the reading test is improving at a faster rate than in maths, but from a very low base. It remains unacceptably low, is significantly out of kilter with the TA results for L6 reading and – unless there has been a major improvement in 2014 – is likely to stay depressed for the limited remaining lifetime of the test.
  • In the tests, 74% of those successful in reading are girls, 62% of those successful in GPS are girls and 61% of those successful in maths are boys. In reading there are also interesting disparities between the gender distribution at L6 in the test and in teacher assessment. Can these differences be attributed solely to genuine differences between boys’ and girls’ attainment, or is there significant gender-related underachievement at the top of the attainment distribution? If so, how can this be addressed?
  • There are also big variations in performance by ethnic background. Chinese learners in particular are hugely successful, especially in maths. In 2013, Chinese girls significantly outscored boys from all other backgrounds, while an astonishing 35% of Chinese boys achieved L6. This raises important questions about the distribution of high attainment, the incidence of underachievement and how the interaction between gender and ethnic background affects both.
  • There are almost certainly significant excellence gaps in performance on all three tests (ie between advantaged and disadvantaged learners), though in reading and GPS these are masked by the absence of numerical data. In maths we can see that the gaps are not as large as those lower down the attainment spectrum, but they widened significantly in 2013 compared with 2012. This suggests that the impact of the Pupil Premium on the performance of the highest attainers from disadvantaged backgrounds is extremely limited.  What can and should be done to address this issue?
  • EAL learners perform as well as their native-speaker counterparts in the GPS test and even better in maths. This raises interesting questions about the relationship between language acquisition and mathematical performance and, even more intriguingly, the relationship between language acquisition and skill in manipulating language in its written form. Further analysis of why EAL learners are so successful may provide helpful clues that would improve L6 teaching for all learners.
  • Schools are recording very different success rates in each of the tests. Some schools that secure very high L6 success rates in one test fail to do so in the others, but a handful of schools are strong performers across all three tests. We should know more than we do about the characteristics and practices of these highly successful schools.

Significant gaps in the data

A data portal to underpin the School Performance Tables is under construction. There have been indications that it will contain material about high attainers’ performance. While levels continue to be used in the Tables, this should include comprehensive coverage of L6 performance, as well as addressing the achievement of high attainers as they are defined for Performance Table purposes (a much broader subset of learners).

Subject to the need to suppress small numbers for data protection purposes, the portal might reasonably include, in addition to the data currently available, the following (a sketch of the shape such records might take appears after the list):

  • For each test and TA, numbers of registrations, entries and successful pupils from FSM and disadvantaged backgrounds respectively, including analysis by gender and ethnic background, both separately and combined. All the data below should also be available for these subsets of the population.
  • Registrations and entries for each L6 test, for every year in which the tests have been administered, showing separately rates for state-funded and all schools and rates for different types of state-funded school.
  • Cross-referencing of L6 test and TA performance, to show how many learners are successful in one, the other and both – as well as how many learners achieve L6 on more than one test and/or TA and different combinations of assessments.
  • Numbers of pupils successful in each test and TA by region and LA, as well as regional breakdowns of the data above and below.
  • Trends in this data across all the years in which the tests and TA have been administered.
  • The annual cost of developing and administering each of the L6 tests so we can make a judgement about value for money.
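
To make this wish-list concrete, here is a minimal sketch of the shape one record in such a portal might take. Every field name is hypothetical – an illustration of the breakdowns requested above, not any published specification:

```python
# Hypothetical shape of one portal record; the fields mirror the
# wish-list above and come from no real specification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class L6PortalRecord:
    year: int                     # e.g. 2013
    assessment: str               # 'reading test', 'maths test', 'writing TA', ...
    area: str                     # national, regional or local authority
    school_type: str              # state-funded, independent, ...
    gender: str                   # 'all', 'boys' or 'girls'
    ethnicity: str                # 'all' or a specific group
    disadvantage: str             # 'all', 'FSM', 'ever-6 FSM', 'non-disadvantaged'
    registrations: Optional[int]  # None where suppressed for data protection
    entries: Optional[int]
    passes: Optional[int]

    @property
    def success_rate(self) -> Optional[float]:
        """Passes as a proportion of entries, where both are published."""
        if self.entries and self.passes is not None:
            return self.passes / self.entries
        return None
```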

It would also be helpful to produce case studies of schools that are especially successful in maximising L6 performance, especially for under-represented groups.

 

The impact of the new tests pre- and post-2016

We do not yet know whether the announcement that L6 tests will disappear after 2015 depressed registration, entry and success rates in 2014. Any effect is more likely in 2015, since the 2014 registration deadline and the publication of the response to the primary assessment and accountability consultation were broadly coterminous.

All the signs are that the accountability regime will continue to focus some attention on the performance of high attainers:

  • Ofsted is placing renewed emphasis on the attainment and progress of the ‘most able’ in school inspection, though it has a broad conceptualisation of that term and will not necessarily highlight L6 achievement.
  • From 2016, schools will be required to publish ‘the percentage of pupils who achieve a high score in all areas at the end of key stage 2.’ But we do not know whether this means publishing separately the percentage of pupils achieving high scores in each area, or only the percentage of pupils achieving high scores across all areas. Nor do we know what will count as a high score for these purposes.
  • The original primary assessment and accountability consultation document included commitments to measures in the Primary Performance Tables setting out:

‘How many of a school’s pupils are amongst the highest attaining nationally, by showing the percentage of pupils achieving a high scaled score in each subject.’

but these were not repeated in the consultation response.

In short, there are several unanswered questions and some cause to doubt the extent to which Level 6-equivalent performance will continue to be a priority. The removal of L6 tests could therefore reduce significantly the attention primary schools give to their highest attainers.

Moreover, questions remain over the suitability of the new tests for these highest attainers. The difficulties may yet be overcome, but there is considerable cause for concern.

It is quite conceivable that the test developers will not be able to accommodate effective assessment of L6 performance within single tests as planned.

If that is the case, the Government faces a choice between perpetuating separate tests and effectively relegating the assessment of the highest attainers to teacher assessment alone.

Such a decision would almost certainly need to be taken on this side of a General Election, though of course it need not be binding on the successor administration. Labour has made no commitments about support for high attainers, which suggests they will not be a priority should it form the next Government.

The recently published Assessment Principles are intended to underpin effective assessment systems within schools. They state that such systems:

‘Differentiate attainment between pupils of different abilities, giving early recognition of pupils who are falling behind and those who are excelling.’

This lends welcome support to the recommendations I offered to NAHT’s Commission on Assessment.

But the national system for assessment and accountability has an equally strong responsibility to differentiate throughout the attainment spectrum and to recognise the achievement of those who excel.

As things stand, there must be some doubt whether it will do so.

Postscript

On 19 May 2014 two newspapers helpfully provided the entry figures for the 2014 L6 tests. These are included in the chart below.

L6 postscript chart

It is clear that entries to all three tests held up well in 2014 and, as predicted, numbers have not yet been depressed as a consequence of the decision to drop L6 tests after 2015.

The corresponding figures for the numbers of schools entering learners for each test have not been released, so we do not know to what extent the increase is driven by new schools signing up, as opposed to schools with previous entries increasing the numbers they enter.

This additional information makes it easier to project approximate trends into 2015, so we shall be able to tell next year whether the change of assessment policy will cause entry rates to tail off.

  • Entries for the L6 reading test were 49% up in 2013 and 36% up in 2014. Assuming the rate of increase in 2015 falls to 23% (ie again 13 percentage points down on the previous year), there would be some 117,000 entries in 2015.
  • Entries for the L6 maths test were 41% up in 2013 and 36% up in 2014. Assuming the rate of increase in 2015 falls to 31% (ie again five percentage points down on the previous year), there would be around 139,000 entries in 2015.
  • GPS is more problematic because we have only two years on which to base the trend. If we assume that the 2015 rate of increase in GPS entries will fall somewhere between the rates projected above for reading (23%) and maths (31%), there would be somewhere between 126,000 and 133,000 entries in 2015 – so approximately 130,000 entries. (The sketch after this list sets out the projection arithmetic.)
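
To make the projection method explicit, here is a minimal sketch of the declining-growth-rate arithmetic. The 2014 entry figures are read off the chart rather than quoted in the text, so the values below are back-calculated approximations, not published numbers:

```python
# Project 2015 entries by assuming the year-on-year growth rate falls
# by the same number of percentage points as it fell in 2014. The 2014
# entry figures are approximate back-calculations, not published data.
def project_2015(entries_2014: int, growth_2013: float, growth_2014: float) -> int:
    growth_2015 = growth_2014 - (growth_2013 - growth_2014)
    return round(entries_2014 * (1 + growth_2015))

print(project_2015(95_000, 0.49, 0.36))   # reading: ~117,000 (23% growth assumed)
print(project_2015(106_000, 0.41, 0.36))  # maths: ~139,000 (31% growth assumed)
```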

It is almost certainly a projection too far to estimate the 2014 pass rates on the basis of the 2014 entry rates, so I will resist the temptation. Nevertheless, we ought to expect continued improvement at broadly commensurate rates.

The press stories include a Government ‘line to take’ on the L6 tests.

In the Telegraph, this is:

‘[We] want to see every school stretching all their pupils and these figures show that primary schools have embraced the opportunity to stretch their brightest 11-year-olds.’

‘This is part of a package of measures – along with toughening up existing primary school tests, raising the bar and introducing higher floor standards – that will raise standards and help ensure all children arrive at secondary school ready to thrive.’

In the Mail it is:

‘We brought back these tests because we wanted to give teachers the chance to set high aspirations for pupils in literacy and numeracy.’

‘We want to see every school stretching all their pupils. These figures show that primary schools have embraced the opportunity to stretch their brightest 11-year-olds by  teaching them more demanding new material, in line with the new curriculum, and by entering them for the Level 6 test.’

There is additionally confirmation in the Telegraph article that ‘challenging material currently seen in the level 6 exams would be incorporated into all SATs tests’ when the new universal assessments are introduced, but nothing about the test development difficulties that this presents.

Each piece also attributed this welcome statement to Mr Gove:

‘It is plain wrong to set a ceiling on the talents of the very brightest pupils and let them drift in class.’

‘Letting teachers offer level 6 tests means that the most talented children will be fully stretched and start secondary school razor sharp.’

Can we read into that a commitment to ensure that the new system – including curriculum, assessment, qualifications, accountability and (critically) Pupil Premium support for the disadvantaged – is designed in a joined up fashion to meet the needs of ‘the very brightest pupils’?

I wonder if Mr Hunt feels able to follow suit.

GP

May 2014

A Summer of Love for English Gifted Education? Episode One: KS2 Level 6 Tests

.

summer of love 1967 by 0 fairy 0

summer of love 1967 by 0 fairy 0

This post is the first in a short series, scheduled to coincide with three publications – two yet to be published – that focus directly on provision for gifted learners in England.

Each Episode will foreground one of the publications, set within the emerging overall narrative. Each will assess the likely impact of the target publication and the broader narrative as it unfolds, while also reflecting on associated developments in educational policy anticipated during the next few months.

Episode One:

  • Analyses the first publication, an Investigation of Level 6 Key Stage 2 Tests, already published in February 2013, exploring its findings in the context of current uncertainty about future arrangements for assessment in primary schools.
  • Reviews the outcomes of the most recent Ofsted survey of gifted and talented education, conducted in December 2009, so establishing a benchmark for consideration of a new Ofsted survey of how schools educate their most able pupils, due for publication in May 2013.
  • Sets out what we know about the third document, an Investigation of School and College-level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to Pursue Higher Education, due for publication by mid-September 2013.

Future Episodes will scrutinise the new Ofsted Survey and the second Investigation respectively, linking them with other developments over the summer period, not all of which may yet be in the public domain.

By this means I plan to provide a kind of iterative stocktake of current issues and future prospects for their resolution. I am curious to learn whether I will be more or less positive at the end of the series than at the beginning.

For I enter the fray in a spirit of some world-weariness and pessimism over the continuing inability of the gifted education community to act collaboratively, to reform itself and to improve practice. This is seemingly a global malaise, though some countries stand out as bucking the trend. Many have featured in previous posts.

Will the Summer of Love provide the spur for trend-bucking reform here in England, or will the groundswell of energy it generates be dissipated in the long, languorous, lazy sunshine days ahead?

.

Publications in the Last Two Years and Associated Developments

Following a lengthy period in the doldrums, we may be on the verge of a rather livelier season in the evolving history of English gifted education.

It would be wrong to suggest that we have been entirely becalmed. Over the past two years we have digested a trio of key publications, all of which have been reviewed on this Blog:

  • The Sutton Trust’s ‘Educating the Highly Able’ (July 2012), which I took seriously to task for its over-emphasis on excellence at the expense of equity and almost entire failure to address the needs of underachieving gifted learners, especially those from disadvantaged backgrounds. Given the sponsoring organisation’s raison d’être (improving social mobility) that seemed, frankly, bizarre.

These documents may have had some limited positive impact, by maintaining gifted education’s profile within wider education policy, but I can find no evidence to suggest that they have reformed our collective thinking about effective gifted education, let alone improved the learning experience and life chances of English gifted learners.

Indeed, it is conceivable that the two latter publications have set back the cause of gifted education by taking us down two successive blind alleys.

I have made my own small efforts to refocus attention on a more productive direction of travel through The Gifted Phoenix Manifesto for Gifted Education.

I do not claim any great status or significance for the Manifesto, though there are encouraging early signs that it is stimulating productive debate amongst others in the field, at least amongst those who are not firmly wedded to the status quo.

The Sutton Trust promises further work, however:

‘Helping the highly able

Piloting programmes that support and stretch bright students from non-privileged backgrounds in state schools, and opening up selective state schools to bright children from low and middle income homes.’

This presumably includes the outcome of the call for proposals that it issued as long ago as July 2012, ‘with a view to developing the first project by the end of the year’ – ie 31 December 2012 (see attachment at the bottom of the linked page).

The call for proposals sought:

‘Cost-effective, scalable projects which support highly able pupils in non-selective maintained schools.  The Trust is particularly interested in initiatives which are based on sound evidence and / or which draw on proven models of intervention.’

It expressed interest in:

  • ‘proposals that focus on those pupils capable of excellence in core academic school subjects’;
  • ‘various methods of defining this group – for example those attaining at the 90th percentile and above, the 95th percentile, or the new Level 6’ or ‘on the basis of school performance and local context’;
  • Support for ‘“exceptionally able” pupils’ especially ‘imaginative ways of bringing them together’;
  • Provision that is ‘integral to schools and not simply a “bolt-on” to mainstream provision’;
  • Programmes that start ‘in key stage three or four, but which may continue to support the students through their transition to FE and HE’.

There is some reasonable hope therefore that the Trust might still contribute in a positive way to the Summer of Love! If there is an announcement during the timeframe of this series I will of course feature the details in a future Episode.

But I plan to build the series around a second trio of documents which have the capacity to be somewhat more influential than those published from 2011 to 2012.

.

Kew once more 1 by giftedphoenix

Kew once more 1 by giftedphoenix

.

Key Stage 2 Level 6

One is already with us: an ‘Investigation of Key Stage 2 Level 6 Tests’ commissioned by the Department for Education and published in late February 2013. (OK, so I’m stretching a point by extending Summer back into the Winter, but this study has so far escaped serious in-depth attention.)

The authors are Mike Coldwell, Ben Willis and Colin McCaig from the Centre for Education and Inclusion Research (CEIR) at Sheffield Hallam University.

Before engaging directly with their findings, it is necessary to sketch in a fair amount of contextual background, since that will be critical to the broader narrative we expect to evolve over the coming months.

 .

Background: Level 6 Tests

Level 6 Tests are by no means the first example of efforts to raise the assessment ceiling for high-attaining learners at the end of Key Stage 2 (KS2) (typically the final year of primary school when children are aged 11), but there is insufficient space here to trace the history of their predecessors.

The current iteration, optional Level 6 tests, was introduced in 2011 in reading, writing and maths. The tests were not externally marked, nor were results published.

QCDA was still in place. Its website said:

‘The tests provide the opportunity to stretch high attaining pupils and also provide a useful tool for measuring the ability and progression of gifted and talented pupils. You are advised to view the tests to make a judgement on how appropriate they are for your pupils.’

In June 2011, the Bew Report into KS2 testing, assessment and accountability reflected this experience:

‘We recognise that the current system of National Curriculum tests can appear to place a ceiling on attainment for the most able pupils. This has important implications for measures of progress, since a pupil who achieves level 3 at the end of Key Stage 1 can currently only achieve level 5 in the end of Key Stage 2 tests, and can therefore only make two levels of progress (currently the expected rate of progress).

Allowing pupils to attain level 6 at the end of Key Stage 2 would enable pupils with high Key Stage 1 attainment to make better than expected progress. Secondary schools receiving pupils who had attained level 6 would understand that these pupils would need to be particularly challenged and stretched from the start of Year 7…

It is important to challenge the most able pupils. We welcome the Government’s decision to make level 6 tests available to schools on an optional basis this year. We believe that these optional tests could allow particularly able pupils an opportunity to develop and fully demonstrate their knowledge and understanding.

However, we do have some concerns, in particular over the extent to which it will be possible for primary schools to cover enough of the Key Stage 3 curriculum to allow pupils to attain level 6. NFER, one of the few respondents who commented on this issue, suggested that it would be more appropriate to award a ‘high 5’ than a level 6.’

So Bew concluded:

‘We believe that the Government should continue to provide level 6 National Curriculum Tests for schools to use on an optional basis, whose results should be reported to parents and secondary schools.’

But there was also a rider:

‘If, following the review of the National Curriculum, any changes are made to the current system of levels, alternative arrangements should be put in place to ensure the most able pupils are challenged.’

More about that anon.

In the light of this, externally marked KS2 Level 6 tests were offered in 2012 in Reading and Maths. There was also an option to undertake internally marked Level 6 teacher assessment in Writing.

The 2012 KS2 Assessment and Reporting Arrangements Booklet offered a brief commentary:

‘These tests are optional and are aimed at high attaining children. Headteachers should take into account a child’s expected attainment prior to entering them for these tests as they should already be demonstrating attainment above level 5…

To be awarded an overall level 6 in a subject, a child must achieve both a level 5 in the end of Key Stage 2 test and pass the level 6 test for that subject. Schools can refer to the 2011 level 6 test papers in order to inform their assessment of whether to enter children for the test.’

The Investigation examines this 2012 experience, but is confined to the two externally marked tests.

Meanwhile – and skipping ahead for a moment – in 2013, the optional Reading and Maths tests are once again available, alongside a new optional test of Grammar, Punctuation and Spelling, in place of the teacher assessment of writing.

Reporting of Level 6 results in School Performance Tables has also changed. In 2012, Level 6 outcomes were used only in the ‘calculation of progress measures, Value Added,  percentage achieving level 5+ and average point scores’.

When it comes to the 2013 Performance Tables:

‘…the percentage of the number of children at the end of Key Stage 2 achieving level 6 in a school will also be shown in performance tables. The Department will not publish any information at school level about the numbers of children entered for the level 6 tests, or the percentage achieving level 6 of those entered for level 6.’

This change may have been significant in driving increased interest in the tests, though not necessarily for all the right reasons, as the discussion below will reveal.

Although the 2012 Performance Tables made limited use of Level 6 results some aggregated performance data was published, as my post on the outcomes noted:

‘900 pupils achieved Level 6 in the KS2 reading test and 19,000 did so in the maths test. While the former is significantly lower than 1% of total entries, the latter is equivalent to 3%, so roughly one pupil per class is now achieving Level 6 in maths. (About 700 pupils also achieved Level 6 in science teacher assessment). Almost all learners achieving a Level 6 will have demonstrated three levels of progress. We know from other provisional data that some 2,500 of those securing Level 6 in maths achieved either Level 2A or even Level 2B in maths alone at KS1, so managing four levels of progress in crude whole-level terms.’

Incidentally, we now know from DfE’s website that:

‘There will not be a Key Stage 2 science sampling test in 2013; a new, biennial (every other year), pupil-level sampling system will be introduced in 2014.’

And slightly more accurate performance data was supplied in an Appendix to the Investigation itself. It tells us that, across all schools (including independent schools that opted to take the tests):

  • 55,212 learners were entered for Level 6 Maths and 18,953 of them (34.3%) achieved it; and
  • 46,810 pupils were entered for Level 6 Reading and 942 (2.0%) achieved it.

That gives a total of 102,022 entries, though we do not know how many came from independent schools or, indeed, how many learners were entered for Level 6 tests in both Maths and Reading.

.

Background: The Future of National Curriculum Assessment

We have known since June 2012 that National Curriculum levels will be phased out and were informed, through a kind of policy aside in March 2013, that this would happen ‘from 2016’.

The new National Curriculum will be introduced from September 2014, so will be assessed through the existing assessment framework during its first year of implementation, despite the apparently strong case for keeping it and the associated assessment reforms fully synchronised.

It may be that this decision is associated with recent difficulties over the procurement of a contractor to undertake external marking of the KS2 tests from 2014 to 2016, or that progress on determining the new arrangements was insufficiently advanced by the time that contract came to be negotiated.

At the time of writing we still await a promised consultation document on primary assessment and accountability, some 10 months after the removal of levels was first communicated.

The issues discussed below will need revisiting once the Government’s proposals are safely in the public domain: the spectre of assessment reform hangs over this post as well as the Investigation it is supposed to be reviewing.

There are few clues to the direction of travel, apart from some suggestion that the Government has been influenced by Bew’s deliberations, even though his clarity on this point left something to be desired.

I quote the relevant sections fully below, to ensure that I haven’t missed any vital inflection or hint of what Bew intended. The emphases are mine:

‘In the short term, we believe we need to retain levels as a means of measuring pupils’ progress and attainment… However, in the long term, we believe the introduction of a new National Curriculum provides an opportunity to improve how we report from statutory assessment. We believe it is for the National Curriculum Review to determine the most appropriate way of defining the national standards which are used to categorise pupils’ attainment.

We realise that, in order to measure progress, it is necessary to have an appropriate scale against which attainment and progress can be measured at various points. For example in Australia, a ‘vertical scale’ (where a movement along the scale between any two equally spaced points must reflect similar levels of progress) is created by testing several year-groups, using some common questions to link scores on each test together. A particular question might be considered difficult for a Year 3 pupil, but much easier for a Year 5 pupil. Although this is technically defensible, it does require tests at more regular intervals than we currently have in England.

In England, we currently use National Curriculum levels as a scale against which to measure progress. However, as stated later in this chapter, concerns have been raised as to whether the levels, as they currently exist, are appropriate as a true vertical scale. We recommend that, as part of the review of the National Curriculum, consideration is given to creating a more appropriate ‘vertical scale’ with which to measure progress.’

And, a little later in the Report:

‘In the longer term, we feel it may be helpful for statutory assessment to divide into two parts. All pupils could be expected to master a ‘core’ of essential knowledge by the end of Key Stage 2, concentrating on the basic literacy and numeracy which all pupils require if they are to access the secondary curriculum. This ‘core’ could be assessed through a ‘mastery’ test which all pupils should be expected to pass (only excepting cases of profound Special Educational Needs), providing a high minimum standard of literacy and numeracy at the end of primary education.

We recognise the risk that this approach may lead to ‘teaching to the test’, may set an unhelpfully low ceiling on attainment and would not reflect pupils’ progress. We would suggest two solutions. Firstly, it might be helpful to allow pupils to take ‘core’ tests in Years 4, 5 or 6 to ensure that able pupils are challenged. Secondly, we feel there could also be a separate assessment at the end of Key Stage 2 to allow pupils to demonstrate the extent of their knowledge and therefore to measure pupils’ progress during the Key Stage. This assessment could be designed to identify the extent of pupils’ attainment and understanding at the end of Year 6, spreading them out on a ‘vertical scale’ rather than being a pass/fail mastery test. Such an assessment should be as useful as possible to pupils, parents and teachers. It may be helpful for the results to report in greater detail than is currently provided by National Curriculum Test data, so they can identify more effectively the pupil’s attainment in key broad aspects of a subject.

We feel the combination of these statutory assessments could ensure that all pupils reach a minimum standard of attainment while also allowing pupils to demonstrate the progress they have made – which would indicate the quality of the school’s contribution to their education. It could provide a safety net in that all pupils should achieve a basic minimum, but would not impose a low ceiling on the able.’

And then finally:

‘A key criticism of the current Key Stage 2 tests is that pupils’ knowledge and skills over a four-year Key Stage is assessed via tests in a single specified week in May. Some critics have raised concerns that this approach causes stress for pupils, particularly those working at the lower end of a spectrum, and may have unfair implications for schools, whose overall results may be affected if for example a highly-performing pupil is absent on test day. In addition, criticism suggests there is little incentive to challenge the more able children, who may well be working at level 5 at an earlier point in the Key Stage or year.

We believe that our earlier recommendations address these issues. However, we also recognise the benefits of a system based on the principle of ‘testing when ready’. The proponents of such an approach argue that it would allow each pupil to be entered for statutory tests when he/she is ready, and then able to move on to more advanced learning. We believe that it would be possible for a statutory ‘testing when ready’ system to meet the statutory assessment purposes we have specified.

However, we are not convinced that moving to a ‘testing when ready’ approach is the best way of achieving the purposes of statutory assessment under the current National Curriculum. We suggest that the principle of ‘testing when ready’ should be considered in the future following the National Curriculum Review. We believe that the principle of ‘testing when ready’ may fit well if computer administered testing is introduced, making it easier for each pupil to sit his/her own personalised test at any point in time when teachers deem him/her to be ready.’

In summary then, Bew appears to suggest:

  • Assessment of mastery of an essential core of knowledge that all should pass but which might be undertaken as early as Year 4, two years before the end of KS2;
  • A separate end-of-KS2 assessment of the extent of learners’ knowledge and of their progress over time, measured against a new ‘vertical scale’, potentially incorporating reporting on attainment in ‘key broad aspects of a subject’;
  • Consideration of transition to a universal ‘testing when ready’ approach at some indeterminate future point (which may or may not be contemporaneous with and complementary to the changes above).

Quite what learners will do after they have successfully completed the mastery test – and its relationship to the draft Programmes of Study that have now been published – is not explained, or even explored.

Are learners expected to begin anticipating the Key Stage 3 programme of study, or to confine themselves to pursuing the KS2 programme in greater breadth and depth, or a combination of the above?

In short, Bew raises more questions than he answers (and so effectively reinforces the argument for keeping curricular and assessment reforms fully synchronised).

At this point we simply do not know whether the Government is ready to unveil plans for the introduction of a radically new ‘test when ready’ assessment regime from 2016, or whether some sort of intermediate position will be adopted.

The former decision would be a very bold reform given the ‘high stakes’ nature of these tests and the current state of cutting edge assessment practice. Given the difficult history of National Curriculum assessment, the risk of catastrophic error might well be too great to contemplate at this stage.

Awash in all this uncertainty, one might be forgiven for assuming that an analysis of the impact of the introduction of Level 6 tests has been overtaken – or almost overtaken – by events.

But that would be unjustified since the Investigation addresses some important issues about gifted education in the upper primary years, effective management of the transition between primary and secondary schools and the role of assessment in that process.

.

Kew once more 2 by giftedphoenix

Kew once more 2 by giftedphoenix

.

The Investigation: Key Points

The Report is structured around the sequence of events leading from a school’s decision to enter learners for the tests, proceeding from there to consider the identification and selection of participants, the support provided to them in the run-up to taking the test, and the outcomes for participants, other pupils, the host school and receiving secondary schools.

It addresses five research questions:

  • How have the tests affected school behaviour towards the most able pupils?
  • What is the difference in behaviours between schools that do well in the tests and those which do not?
  • What are the positive and negative effects of the tests, on schools and pupils respectively?
  • Why did some schools enter pupils for the tests whereas others did not?
  • How are schools identifying pupils to enter the tests?

It does so by means of a tripartite methodology, drawing on 20 case studies of schools undertaking the tests, 40 telephone interviews with schools that decided not to take part and 20 telephone interviews with secondary schools.

.

The Decision to Enter Learners

Schools that decided to enter pupils for the tests did so because:

  • They wanted to provide additional challenge for able pupils and/or remove an unhelpful ceiling on their attainment. There was a perceived motivational benefit, for staff as well as learners,  while some primary schools ‘hoped that an externally validated exam might make secondary schools more secure in their views about primaries’ judgements’, as well as protecting learners from expectations that they would repeat work at their receiving secondary schools.
  • They wanted to evidence positive performance by the school, by demonstrating additional progress by learners and confirming teacher assessment outcomes. Entry was assumed to assert their high expectations of able pupils. Some were anxious that failure to take part would be perceived negatively by Ofsted.
  • Some were encouraged by the ‘low stakes’ nature of the assessment, identified entry as consistent with the school’s existing priorities, saw a positive marketing opportunity, or wanted to attract or retain staff ‘with sufficient confidence and expertise to teach level 6 content’.

Conversely, schools deciding against participation most often did so because they judged that they had no pupils for whom the tests would be suitable (though there was recognition that this was a cohort-specific issue).

Many said they had received insufficient guidance, about the test itself and about the need to teach the Key Stage 3 programme of study, and there was related concern about the absence of dedicated teaching materials.

Some objected to the tests in principle, preferring an alternative approach to assessing these learners, or were concerned at a disproportionate focus on the core subjects. ‘Quite a number’ took the reverse, negative position on secondary schools’ anticipated response, assuming that receiving schools would re-test pupils and repeat the work they had undertaken.

.

Identification and Selection of Participants

Concern about lack of guidance extended to advice on selection of participants. There was widespread worry at the limited availability of past papers. Lack of confidence led to schools adopting very different approaches, some rather liberal and others much more conservative.

Some entered only those learners they believed had a very good chance of passing. Others extended entry to all those they believed had some chance of success, sometimes including even those they felt probably would not pass.

On average, case study schools nominated 41% of the subset of learners who achieved Level 5 in Maths, though some entered 20% or fewer and others 81% or more. Most fell between these two extremes. (The national figure is given as 26%.)

But, in Reading, case study schools nominated on average only 25% of learners who had achieved Level 5. Only a minority of schools nominated over 41%. (The national figure is given as 18%.)

Timing of selection varied considerably. Identifying potential entrants relatively early in Year 6 and confirming selection nearer the April deadline was a common strategy.

Decisions typically took into account several factors, foremost of which were learners’ own preferences. Few schools consulted parents systematically. There was generally less clarity and confidence in respect of Reading.

Schools typically utilised a mix of objective, quantifiable and subjective, value-driven measures, but ‘many schools struggled to convey coherently a specific selection strategy’ and it is clear that the probability of a learner being entered varied considerably according to which school they attended.

Objective evidence included formative assessment, tracking data, cross-moderation of work between partner schools and the outcomes of practice tests. Though schools felt secure in their levelling, only a handful stated explicitly that they had learners working at Level 6, either at the point of selection for the tests or subsequently. In reality, most made their judgements on the basis of performance at Level 5.

Subjective considerations – eg learners’ ‘wellbeing’ – were significant:

‘In certain instances possessing the raw ingredients of academic ability and a track record of high academic performance in isolation were not necessarily seen to be sufficient grounds for selection. Instead a number of schools also attached considerable importance to the particular pupils’ maturity, personality and, in some cases, behaviour.’

Many schools expected to tighten their selection criteria in response to low pass rates, especially in Reading. There was marked dissatisfaction with ‘the increased threshold marks (compared with those from the pilot tests)’ and a feeling that this had led schools to underestimate the difficulty of the tests.

The Executive Summary argues that ‘schools were largely effective in ensuring that the very top ability pupils were identified and put forward’, but the substantive text is not quite so bullish.

There was clear evidence of reticence on teachers’ parts in outlining the characteristics of learners working at Level 6. Reference was made to independence, tenacity and motivation and ‘an innate flare or capability to excel at a particular subject’.

Some schools struggled to pin down these traits, especially for Reading. Teachers mentioned ‘excellent inferential skills and capacity to access authorial intent’.

Maturity was also a key consideration:

‘The parameters of the Level 6 Reading test are just not compatible with the vast majority of pupils aged 11 (even the very brightest ones) – they simply do not possess the experiences and emotional maturity to be able to access what is required of them within the level 6 test.’

.

Support Provided to Participants

Limited guidance was a prominent issue, leading schools to use ‘an array of ad hoc means of support’ derived from their own research and experience.

Many adopted aspects of the KS3 Programme of Study, despite concern at the attitude of receiving secondary schools. Materials and support were much more evident in Maths than in Reading.

Lack of clarity over the relationship between Level 6 tests and the KS3 programmes of study was a significant issue. Most schools drew on the KS3 curriculum but a few preferred to emphasise breadth and depth at KS2 instead.

Schools were generally more confident in their support for Maths because ‘there appeared to be more internal and external expertise available’ and they found selection of participants less problematic.

Two aspects of support were prominent:

  • Classroom differentiation, focused on specific aspects of the curriculum – though the tests themselves were not widely perceived to have had a material impact on such practice. Some form of ability grouping was in place in all schools in respect of maths and most schools in respect of reading (as part of literacy).
  • Test preparation, mostly undertaken in additional booster sessions combining teaching with test-taking practice and the wider use of practice papers.

The Report characterises three broad approaches adopted by schools: outcome-focused (heavily emphasising test preparation); teaching-and-learning-focused (with markedly less emphasis on booster sessions and test practice); and a composite approach occupying the continuum between these two extremes.

Several schools reported an intention ‘to focus more on teaching and learning’ in the coming year.

.

Outcomes of the Tests

In Maths it was possible ‘to identify a small number of schools that performed particularly well and others that performed relatively poorly’.

The analysis focuses on the simple pass rate, the Level 5 to 6 conversion rate and a ‘top Level 5’ to Level 6 conversion rate across the 20 case study schools.

The simple pass rate was 40% (34% nationally), though this masked significant variation – from 0% to 100%.

These outcomes correlated broadly with the Level 5 to 6 conversion rates, for which the case study school average was 17%, with variance from 0% to 50%.

However, when it came to the ‘top Level 5’ to Level 6 conversion rate, the Report can only admit that, while there was some degree of correlation with the other two measures:

‘On this measure there was polarity: most schools either found that all of their ‘top level 5s’ achieved level 6 or that none of them achieved it. This is difficult to interpret, and the qualitative data does not shed a light on this.’
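
For clarity, here is a minimal sketch of how the Report’s three measures relate, using illustrative counts for a single hypothetical school (the Investigation does not publish school-level raw numbers):

```python
# The Report's three measures, computed for one hypothetical school.
# All the raw counts below are illustrative, not taken from the study.
entered, passed = 10, 4           # pupils entered for / passing the L6 test
level5 = 24                       # pupils achieving Level 5 in the subject
top_level5 = 5                    # of whom 'top Level 5' (highest L5 marks)
top_level5_passed = 3             # 'top Level 5' pupils who achieved L6

pass_rate = passed / entered                     # simple pass rate: 40%
conversion = passed / level5                     # L5 to L6 conversion: ~17%
top_conversion = top_level5_passed / top_level5  # 'top L5' to L6: 60%

print(f"pass rate {pass_rate:.0%}, L5-to-L6 {conversion:.0%}, "
      f"top-L5-to-L6 {top_conversion:.0%}")
```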

Even more problematically, only one learner in the entire sample was successful in achieving Level 6 in the Reading test – equivalent to a 1% success rate (the national pass rate was 2%).

The Report offers some rather approximate findings, wrapped around with health warnings, suggesting that better results were more typically found in schools with a combined approach featuring learning and outcomes (see above), as opposed to either of those two extremes.

Positive outcomes for schools have already been outlined above.

Benefits for learners, identified by teachers and learners alike, included the scope provided by the tests for learners to demonstrate (even fulfil) their potential. Wider personal outcomes were also mentioned including a positive impact on motivation (though there were also corresponding concerns about overloading and over-pressurising learners).

Secondary schools rather tended to reinforce the negative expectations of some primary schools:

  • They were ‘generally ambivalent about primary schools’ use of L6 test and aspects of the KS3 curriculum…due to the fact that secondary schools in general felt that measures of KS2 outcomes were not accurate… Consequently, they preferred to test the children pre-entry or at the beginning of Year 7’.
  • ‘Many of the secondary schools were concerned about primary schools ‘teaching to the test’ and thus producing L6 pupils with little breadth and depth of understanding of L6 working…Generally secondaries viewed such results as unreliable, albeit useful for baseline assessment, as they help to identify ‘high fliers’’
  • While most noted the benefits for learners ‘some felt that inaccurate test outcomes made the transition more difficult’. The usual range of concerns was expressed.

.

The Investigation’s own Conclusions

The Investigation offers four main conclusions:

  • ‘It is abundantly clear…that greater guidance on pupil selection and support and more practice materials are key issues’. This needs to incorporate guidance on coverage, or otherwise, of the KS3 curriculum. The main text (but not the executive summary) identifies this as a responsibility of ‘DfE with the STA’. It remains to be seen whether the Government will take on this task or will look instead to the market to respond.
  • Schools adopting a strongly outcome-focussed approach were less likely to produce successful results than those adopting a mixed learning and outcome approach. Some schools seemed too heavily driven by pressure to secure positive inspection results, and

‘responded to the direction from inspectors and policymakers to support the most able by a narrowing of the curriculum and overemphasising test preparation, which is not in the best interests of pupil, teachers or schools’

There is a ‘need for policy to aim to drive home the vital importance of pedagogy and learning to counteract the tendency’.

  • Secondary schools confirm primary schools’ scepticism that they will not ‘judge the tests as an accurate reflection of levels’. There is therefore ‘a strong need to engage secondaries much more with primaries in, for example, curriculum, assessment and moderation’. This is presumably a process that is most easily undertaken through local collaboration.
  • The very low pass rate in Reading, selection issues (including maturity as a key component) and secondary scepticism point to a need ‘to review whether the L6 Reading test in its current form is the most appropriate test to use to identify a range of higher performing pupils, for example the top 10%’. The full commentary also notes that:

‘The cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits’.

.

My Conclusions

There is relatively little here that would be unusual or surprising to a seasoned observer of how gifted education is currently practised and of wider educational issues such as the impact of Ofsted on school practice and transfer and transition issues.

The study is rather narrow in its conceptualisation, in that it fails to address the interface between the Level 6 tests and other relevant aspects of Government thinking, not least emerging policy on the curriculum and (of course) assessment.

It entirely ignores the fact that a decision to abandon National Curriculum Levels was announced eight months prior to publication.

There is no attempt to analyse the national data in any depth, or to look at any issues concerning the gender, ethnic and socio-economic profile of learners entered for the tests and successful in them, even though there will have been some heavy biases, especially in favour of those from comparatively advantaged backgrounds.

It would have been particularly helpful to see how much bigger the FSM gap at Level 6 is compared with Level 5; whether schools had focused on this issue; and, if so, what action they had taken to address it. Was there any evidence of the positive use of Pupil Premium funding for this purpose?

The Investigation’s general point about the negative impact of Ofsted on schools’ practice may also be rather misleading, in that the negative influence of overly outcomes-focussed thinking is at least partly attributable to School Performance Tables rather than Ofsted’s school inspection framework.

In that guise it will probably also feature in Ofsted’s own upcoming publication (see below). Whether Ofsted’s report will make any reference to the case for rebalancing schools towards pedagogy and learning, so that these are more in equilibrium with the pursuit of assessment outcomes, is rather more doubtful. Quite how that rebalancing might be achieved is ducked by the Level 6 Investigation, so the question is likely to be sidelined.

The issues relating to transition and transfer are longstanding and a heavy drag on the efficiency of our school system, both for gifted learners and the wider population. If the upcoming consultation affects the timing of Key Stage 2 assessment, that may provide the impetus for renewed efforts to address the generic problem. Otherwise this seems unlikely to be a priority for the Government.

The response to date to the call for additional guidance has been rather limited.

Certainly, a range of sample material has been posted to assist schools interested in taking up the new test of grammar, punctuation and spelling. But the information available to support the Maths and Reading tests remains relatively thin. I have found nothing that substantively addresses the issues around pre-empting elements of Key Stage 3.

Despite the limited support available, evidence has recently emerged that Level 6 test entries are significantly higher for 2013 than for 2012. A total of 113,600 pupils have been entered, equivalent to 21% of the relevant pupil population.

This is said to be an increase of 55% compared with the 73,300 entered in 2012 (though that figure does not seem to agree with those quoted in the Investigation and reproduced above).

Moreover, some 11,300 schools have registered for the tests, up 41% on the 2012 figure of 8,300 schools.
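As a quick sanity check on those figures (a minimal sketch, using only the rounded numbers quoted above), the percentage increases can be recomputed directly. The pupil figure does reproduce the reported 55% rise, but the school figure works out nearer 36% than the quoted 41%, which suggests rounding or a slightly different baseline somewhere in the source.

```python
def pct_increase(new, old):
    """Percentage increase from old to new (e.g. 150 from 100 -> 50.0)."""
    return (new - old) / old * 100

# Entry and registration figures as quoted above (rounded, as reported)
pupil_entries_2012, pupil_entries_2013 = 73_300, 113_600
school_regs_2012, school_regs_2013 = 8_300, 11_300

print(f"Pupil entries: +{pct_increase(pupil_entries_2013, pupil_entries_2012):.0f}%")
# -> +55%, matching the reported increase
print(f"School registrations: +{pct_increase(school_regs_2013, school_regs_2012):.0f}%")
# -> +36%, rather than the reported 41%
```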

Given the issues associated with the Reading test set out in the Report, one might hazard a reasonable guess that the increase will be attributable largely to the Maths test and perhaps to schools experimenting with the new grammar, punctuation and spelling test (though the figures are not broken down by test).

The increased emphasis on Level 6 results in the 2013 Performance Tables (see above) will also be a significant factor. Does this suggest that schools are increasingly slaves to the outcomes-driven mentality that the Investigation strives so hard to discourage?

.

The key point here is that it is unlikely to be wise or appropriate to enter over one fifth of all end-of-KS2 learners for tests in which so few are likely to be successful.

One might reasonably hope that the design principles for whatever assessment instruments replace the Level 6 tests will explicitly recognise that a basic pass/fail distinction, combined with an exceptionally high pass threshold, is not the optimal solution.

It is important to retain a high threshold for those with the capacity to achieve it, but other relatively strong candidates also need opportunities to demonstrate a positive outcome at a slightly lower level. A new approach might look to recognise positively the performance of the top 10%, top 5% and top 1% respectively.
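By way of illustration only (this is a hypothetical sketch, not a design drawn from the Investigation or the STA), graded recognition of this kind amounts to replacing a single pass mark with percentile cutoffs on the score distribution, so that strong candidates below the very top still record a positive outcome:

```python
import numpy as np

# Hypothetical score distribution standing in for a cohort's test results
rng = np.random.default_rng(seed=1)
scores = rng.normal(loc=50, scale=15, size=10_000)

# Percentile cutoffs replace a single pass/fail threshold
cut_90, cut_95, cut_99 = np.percentile(scores, [90, 95, 99])

def recognition_band(score):
    """Return the highest band a score reaches (bands nest: the top 1% also sit in the top 5%)."""
    if score >= cut_99:
        return "top 1%"
    if score >= cut_95:
        return "top 5%"
    if score >= cut_90:
        return "top 10%"
    return "no additional recognition"

print(recognition_band(cut_95 + 1))  # e.g. 'top 5%' for a score just above that cutoff
```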

It will also be critical to ensure an orderly transition from the current arrangements to those in place from 2016. There is a valuable window of opportunity to pilot new approaches thoroughly alongside the existing models. The reform need not be rushed – that is the silver lining to the cloud associated with decoupling curriculum and assessment reforms.

So, what is my overall judgement of the contribution made by this first publication to my wished-for ‘Summer of Love’?

A curate’s egg really. Positive and useful in a small way, not least in reminding us that primary-secondary transition for gifted learners remains problematic, but also a missed opportunity to flag up some other critical issues – and of course heavily overshadowed by the primary assessment consultation on the immediate horizon.

Still, one hopes that its recommendations will be revisited as part of a holistic response to all three publications, and that those to follow will take full account of its findings. Otherwise the overall narrative will be somewhat impoverished and will almost certainly fail to give due prominence to the critically important upper primary phase.

.

Kew once more 3 by giftedphoenix

 

The Ofsted Survey

.

Background

Next in line for publication is an Ofsted Survey, conducted using the Inspectorate’s rapid response methodology, which will examine ‘how state schools teach the most able children’.

Unusually, this was announced in January 2013 through a press briefing with a national newspaper. Given the political leanings of the paper in question, the contents of the story may be a somewhat biased version of reality.

There is no information whatsoever on Ofsted’s own website, with the sole (and recently added) exception of a publication schedule confirming that the survey will be published in May.

The newspaper report explains that:

  • Despite being a rapid response exercise, this publication ‘will be the most extensive investigation of gifted and talented provision undertaken’ by Ofsted.
  • It will focus predominantly – if not exclusively – on secondary schools where ‘children who get top marks in primary school are being let down by some secondary school teachers who leave them to coast rather than stretch them to achieve the best exam results’.
  • It will examine ‘concerns that bright pupils who are taught in mixed ability classes are failing to be stretched and that schools are entering clever children too early for GCSE exams so that they gain only the C grades that count in league tables and are not pushed to the full extent of their abilities’.
  • Ofsted will interrogate existing inspection data on educational provision for gifted and talented learners, as well as pupil progress data. They will also survey provision afresh, through visits to a representative sample of over 50 secondary schools.

HMCI Sir Michael Wilshaw is quoted extensively:

‘I am concerned that our most able pupils are not doing as well as they should be…Are schools pushing them in the way they should be pushed and are pushed in the independent sector and in the selective system?

The statistic that four independent schools and a very prestigious six [sic] form college are sending more youngsters to Oxbridge than 2,000 state secondary schools is a nonsense. When the history of comprehensive education is written people need to say that they did as well by the most able pupils as they did by the least able…

I am passionate about this, it will be a landmark report…I am as concerned as the next person on the issue of social mobility. Are our children and our children from the poorest backgrounds who are naturally bright doing as well as they should?

…I would like to see GCSE league tables reformed…The anxiety to get as many through those C boundaries have [sic] sometimes meant that schools haven’t pushed children beyond that.

We need sophisticated league tables which shows [sic] progress. Youngsters leaving primary school with level 5 should be getting A*, A or B at GCSE.’

It is arguable that the Government has already responded to the final specific point via its proposal – in the consultation on secondary accountability released alongside the draft National Curriculum – to publish an ‘average point score 8’ measure based on each pupil’s achievement across eight qualifications at the end of KS4 (though whether it has done enough to counterbalance other pressures in the system to prioritise the C/D borderline is open to question).

Otherwise there are several familiar themes here:

  • whether gifted learners are insufficiently challenged, particularly in secondary comprehensive schools;
  • whether they are making sufficient progress between the end of Key Stage 2 and the end of Key Stage 4;
  • whether they are held back by poor differentiation, including a preponderance of mixed ability teaching;
  • to what extent they are supported by schools’ policies on early entry to examinations, particularly GCSEs;
  • whether more can be done to support progression by state school students to the most competitive universities, especially by those from disadvantaged backgrounds; and
  • whether there are perverse incentives in the accountability system that result in gifted learners being short-changed.

Given the puff generated by Sir Michael, expectations are high that this will be a substantial and influential piece of work. It follows that, if it turns out to be a comparative damp squib, the sense of disappointment and frustration will be all the greater.

The Report will be judged by what new and fresh light it can bring to bear on these issues and, critically, by the strength of the recommendations it directs towards stakeholders at national, local and school level.

Just how interventionist will Ofsted show itself in backing up its leader’s passion? Will it take responsibility for co-ordinating a response from central government to any recommendations that it points in that direction – and what exactly will Ofsted commit itself to doing to help bring about real and lasting change?

Not to labour the point (though I fear I may be doing so), a limp effort that repackages familiar findings and appeals rather weakly to stakeholders’ better judgement will not display the landmark qualities of which HMCI has boasted.

A future Episode in this series will be dedicated to assessing whether or not these inflated expectations have been satisfied, and what the consequences are for the Summer of Love.

.

Benchmarking the New Report

In the meantime, it is instructive to look back at the most recent inspection report on gifted education, thus supplying a benchmark of sorts against which to judge the findings in this new publication.

This will help to establish whether the new report is simply bearing out what we know already about long-standing shortcomings in gifted education, or whether it has important messages to convey about the impact – positive or negative – of the predominantly ‘school led’ approach adopted by successive Governments over the past three years.

The most recent report was published in December 2009, in the latter days of the previous government.

‘Gifted and Talented Pupils in Schools’ is based on a rapid response survey of 26 primary and secondary schools, selected because their most recent school-wide inspections had identified gifted and talented education as ‘an improvement point’.

The survey was undertaken shortly after the previous government had, in the Report’s words:

‘Reviewed its national programme for gifted and talented pupils and concluded that it was not having sufficient impact on schools. As a result, provision is being scaled back to align it more closely with wider developments in personalising learning. Schools will be expected to do more themselves for these pupils.’

Eight of the 26 schools (31%) were judged to be well-placed to respond to this new environment, 14 (54%) displayed adequate capacity for improvement and the remaining four (15%) had ‘poorly developed’ capacity to sustain improvement.

The schools that were well-placed to build their own capacity could demonstrate that their improved provision was having a positive impact on outcomes for all pupils, were making use of available national resources – including the critically important Quality Standards – and were making sure that all pupils were suitably challenged in lessons.

The majority of schools in the middle group could demonstrate some improvement in pupil outcomes since their last inspection, but ‘many of the developments in these schools were fragile and the changes had had limited success in helping gifted and talented pupils to make appropriate and sustained progress’.

Gifted education was not a priority and:

‘To build their capacity to improve provision, they would benefit from better guidance, support and resources from outside agencies and organisations.’

In the four schools with inadequate capacity to improve, lead staff had insufficient status to influence strategic planning, teachers had not received appropriate training and schools:

‘Did not sufficiently recognise their own responsibilities to meet the needs of their gifted and talented pupils’.

The Report’s Key Findings identify a series of specific issues:

  • Many schools’ gifted education policies were ‘generic versions from other schools or the local authority’ and so insufficiently effective.
  • In the large majority of schools (77%) pupils said their views were not adequately reflected in curriculum planning and they experienced an inconsistent level of challenge.
  • None of the schools had engaged fully with the parents of gifted learners to understand their needs and discuss effective support.
  • The better-placed schools were characterised by strong senior leadership in this field and lead staff with sufficient status to influence and implement policy. Conversely, in the poorer schools, senior staff demonstrated insufficient drive or commitment to this issue in the face of competing priorities.
  • In schools judged to have adequate capacity to improve, subject leaders had too much flexibility to interpret school policy, resulting in inconsistency and lack of coherence across the curriculum.
  • Most schools ‘needed further support to identify the most appropriate regional and national resources and training to meet their particular needs’. Lead staff were seeking practical subject-specific training for classroom teachers.
  • All schools ‘felt they needed more support and guidance about how to judge what gifted and talented pupils at different ages should be achieving and how well they were making progress towards attaining their challenging targets across key stages’.
  • Just over half the schools had established collaborative partnerships with other schools in their localities. Lack of such support was evident in the schools with limited capacity to improve. There was comparatively little scrutiny through local accountability arrangements.
  • All the schools had developed out-of-hours provision, though the link with school-based provision was not always clear and schools were not consistently evaluating the impact of such provision.
  • There was little analysis of progression by different groups of gifted learners.

The Report offers the customary series of recommendations, directed at central and local government and schools and designed to help schools build the capacity to improve their performance in these areas. It will be telling whether the new Report assesses progress in implementing them.

Rather oddly, the recommendations fail to endorse or propose arrangements for the ongoing application of the Quality Standards in a ‘school-led’ environment, even though the Standards incorporate all these elements of effective practice and provide a clear framework for continuous improvement.

With the benefit of hindsight, one might argue that many of the problems Ofsted cited in 2009 would have been rather less pronounced had the Inspectorate fully embraced the Standards as their official criteria for judging the effectiveness of gifted education when they were first introduced.

The Standards are now growing significantly out of date and require an urgent refresh if they are to remain a valuable resource for schools as they continue to pursue improvement.

Ideally Ofsted might lead that process and subsequently endorse the revised Standards as the universal measure for judging the quality of English schools’ gifted education. I can think of nothing that would have a more significant impact on the overall quality of provision.

But I suspect that will be an idea too interventionist for even the most passionate HMCI to entertain.

It will be fascinating, nevertheless, to map the shortcomings identified in the upcoming Report against the existing Standards, as well as against those flagged in the predecessor Report. But that’s a topic for another day.

.

Kew once more 4 by giftedphoenix

.

Raising the Aspirations of High-Achieving Disadvantaged Pupils

Thirdly and finally, DfE has commissioned an ‘Investigation of School and College-level Strategies to Raise the Aspirations of High-achieving Disadvantaged Pupils to Pursue Higher Education’.

This is still some way from publication, but the contract – including the specification – is available for public scrutiny (see documents section on this link).

The contract was awarded to TNS-BMRB (where the Project Lead is Mark Peters) working with the Institute for Policy Studies in Education (IPSE) based at London Metropolitan University (where the lead is Carole Leathwood).

IPSE is undertaking the qualitative element of the research and carries this outline of the project on its website.

According to the contract, the contractors must deliver their final report by 28 June and the Department must publish it within 12 weeks of this date, so by 20 September 2013 at the latest. The project is costing £114,113 plus VAT.

Its aims, as set down in the contract, are to discover:

  • ‘What strategies are being used by schools across years 7-11 and in school sixth forms (years 12-13) to support high-achieving disadvantaged pupils in to [sic] pursue HE.
  • If the pupil premium is being used in schools to fund aspiration raising activities for high-achieving disadvantaged pupils.
  • What strategies are being used by colleges to support high-achieving disadvantaged pupils pursue HE and
  • To identify assess [sic] any areas of potential good practice.’

‘High-achieving’ is defined for these purposes as ‘pupils who achieve a Level 5 or higher in English and Maths at KS2’.

As reported in a previous post, some 27% of pupils achieved this outcome in 2012, up from 21% in 2011, so the focus is on roughly the top quartile of pupils on this measure (or the top two deciles on the 2011 figures).

‘Disadvantaged’ is defined as ‘pupils eligible for free school meals’ (and, in the case of post-16 students, those who were eligible for FSM in Year 11). This is of course a somewhat narrower definition than eligibility for the Pupil Premium, even though the Premium is pivotal to the study.

The national proportion of FSM-eligible pupils achieving Level 5 in KS2 English and maths in 2012 is, I believe, 14%, compared with 32% of non-FSM pupils, giving a gap on this measure of 18 percentage points.

This data is not provided in the School Performance Tables, nor is it easily sourced from published national statistics, though it does appear in schools’ RAISEonline reports. (Incidentally, the comparable gap at Level 4 is somewhat lower, at 16 percentage points.)

The full set of objectives for the project is as follows (my emphases, but not my punctuation):

‘For Schools:

  • To identify to what extent schools are supporting high-achieving disadvantaged pupils to raise their aspiration to go on to HE?
  • To identify what activities take place in Years 7 -11 for high-achieving disadvantaged pupils to raise their aspiration to go on to HE and the Russell Group universities?
  • To identify whether the Pupil Premium being used [sic] to fund specific activities to help pupils pursue HE?
  • To identify what good practice looks like for supporting high-achieving disadvantaged pupils to pursue HE? (Focusing particularly on schools that have a high percentage of FSM pupils who go on to HE).

For FE colleges, sixth forms colleges and school sixth forms:

  • To identify to what extent are colleges supporting high-achieving disadvantaged learners post-16 to pursue HE?
  • To identify what strategies, if any, do high-achieving disadvantaged learners receive post-16 to pursue HE and more specifically Russell Group Universities?
  • To identify what good practice looks like for supporting high-achieving disadvantaged learners to pursue HE? (Focusing in particular on the strategies used by colleges that have a high percentage of disadvantaged learners who go on to HE).

For schools and colleges

  • To establish how schools and colleges are identifying ‘high-achieving, disadvantaged’ pupils/learners?
  • To identify which particular groups (if any) are being identified as requiring specific support and why?
  • To identify what extent schools/colleges engage in aspiration raising activities specifically designed to increase participation in Russell Group Institutions (rather than HE in general)?
  • To identify what good practice look like in relation to different groups of pupils/learners?’

It is evident from this that there is some confusion between aspiration-raising activities and wider support strategies. But there is clearly interest in comparing strategies in the school and post-16 sectors respectively (and perhaps in different parts of the post-16 sector too). The primary sector does not feature.

There is also interest in establishing approaches to identifying the beneficiaries of such support; how such provision is differentiated between progression to HE and progression to ‘Russell Group universities’ respectively; the nature of good practice in each sector, drawn particularly from institutions where a significant proportion of students progress to HE; and distinguishing practice for different (but undefined) groups of learners.

Finally, there is some interest – though perhaps a little underplayed – in exploring the extent to which the Pupil Premium is used to fund this activity in schools. (Funding sources in post-16 environments are not mentioned.)

The study comprises six phases: pre-survey scoping; survey piloting; a national school survey (a sample of 500 schools, including 100 that send a high proportion of FSM-eligible pupils to HE); a national FE and sixth form college survey (a sample of 100 institutions); case studies (eight schools and two colleges); and results analysis.

The latter will incorporate:

  • ‘To what extent schools and colleges are providing aspiration raising activities to high achieving disadvantaged pupils.’
  • ‘What activities take place across different year groups.’
  • ‘Analysis by school characteristics including region, school size, distance to the nearest Russell group university, proportion of FSM eligible pupils’
  • Comparison of the 400 schools with the 100 sending a high proportion of their FSM pupils on to higher education.
  • Whether ‘activities are associated with higher numbers of pupils progressing to HE and trends in what works for different pupil groups’
  • Triangulation of data from different strands
  • Analysis of ‘best practice’, incorporating ‘comparisons between schools and colleges’.

There is no overt reference to other Government policies and initiatives that might be expected to impact on institutions’ practice, such as the Destination Measures (which will be presented separately for FSM-eligible learners in 2013, as well as being incorporated in School and College Performance Tables) or the Dux Scheme. Nor is there any explicit reference to the outreach activities of universities.

One assumes, however, that the outcomes will help inform Government decisions as to the effectiveness of existing school- and college-level policy interventions that contribute towards the achievement of its Social Mobility Indicators.

The Report is likely to result in arrangements of some sort for disseminating effective practice between institutions, even if that amounts only to a few brief case studies.

It may even help to inform decisions about whether additional interventions are required and, if so, the nature of those interventions.

Previous posts on this Blog have made the case for a nationally co-ordinated and targeted intervention provided through a ‘flexible framework’ which would synergise the currently separate ‘push’ strategies from schools/colleges with the ‘pull’ strategies from higher education in support of the ‘most disadvantaged, most able’.

This would be a subset of the 14% of FSM-eligible learners achieving KS2 Level 5 in English and maths, defined by their capacity to enter the most competitive universities. It might incorporate a specific focus on substantively increasing progression to particular ‘elite’ targets, whether expressed in terms of courses (eg medicine, veterinary science, law) or institutions (notably Oxbridge).

At the moment all the running is being made on the ‘pull’ side, spearheaded by joint OFFA/HEFCE efforts to develop a ‘National Strategy for Access and Student Success’.

A joint effort would:

  • Passport funding to individual learners and support them through transition at 16 and 18, probably topslicing the Pupil Premium for the purpose.
  • Enable learners and facilitators to draw on provision offered via the (currently fragmented) supply side, drawing in third party providers as well as schools/colleges and universities.
  • Provide for a menu of such provision from various sources to be synthesised into a personalised programme based on needs assessment and subject to regular monitoring and updating.

Although there is presently some ideological inhibition hindering the adoption of such scaffolded programmes, an intervention of this nature – targeted exclusively at a select cohort of ‘high ability, high need’ students – would be likely to result in much more significant improvements against these indicators, and do so much more quickly than generic system-wide reform.

In ‘holding the Government’s feet to the fire’ over social mobility issues, perhaps the recently-established Social Mobility and Child Poverty Commission might see its way to making that case when it reports on Government progress in the Autumn.

.

Kew once more 5 by giftedphoenix

 

Drawing These Strands Together

So, as things stand at the end of Episode One:

  • There is a decent, if relatively narrow, report on the table, which draws attention to longstanding transition and transfer problems and an outcomes-obsessed mentality at the top end of Key Stage 2, as well as a range of narrower issues associated with the effective delivery of Level 6 tests.
  • We impatiently await a consultation document on primary accountability that should provide some clarity over the future assessment of high-attaining learners within Key Stage 2, so enabling us to complete the bigger picture of National Curriculum and associated assessment reforms across Key Stages 1-4.
  • We also await a much-vaunted Ofsted survey report which – if it satisfies our high expectations – might provide the spur for real action at national, local and school levels, perhaps even inspiring the Sutton Trust to announce the outcomes of its 2012 call for proposals.
  • Then in September the third report (the second Investigation) will ideally be sufficiently strategic and influential to cause some important joining up to be undertaken across that part of the agenda focused on progression to higher education by high-attaining learners from disadvantaged backgrounds, potentially at the behest of the Social Mobility and Child Poverty Commission.

I am hopeful that this series of posts will support the process of distilling and synthesising these different elements to provide a composite picture of national strengths and weaknesses in gifted education throughout the continuum from upper Key Stage 2 to university entry. Some kind of audit, if you will.

But the question this raises is how to respond to the state of affairs that the ‘joining up’ process reveals.

As matters stand, at the end of this first post in the series, I have proffered unto the melting pot a cautiously provisional wishlist comprising three main items: a Manifesto that sets out some principles and arguments for a genuinely collaborative response; revised Quality Standards integrated within the accountability machinery; and a targeted intervention for ‘high ability, high need’ learners designed to eliminate the fragmentation that bedevils current efforts.

This menu may well grow and change as the ‘Summer of Love’ progresses, not least to reflect planned and unplanned discussion of the issues. I would be delighted if some of that discussion were to take place in the comments facility below.

I believe one of the Manifesto principles must be to pursue an optimal middle way that is neither top-down nor bottom-up but a ‘strategy of all the talents’. That is reflected in my own version. Your comments are ever welcome about that, too.

But that principle presupposes a national gifted education community with the capacity and wherewithal to build on strengths and tackle weaknesses in a strategic, collaborative, inclusive and universal fashion.

For, if the next stage of reform is once more to be school-led, it is abundantly clear from the evidence presented above that schools will need our support to bring about real and lasting improvements in gifted education practice, for the benefit of all English gifted learners.

I was once optimistic about the prospects, but now I’m not so sure. Perhaps the Summer of Love is a chance in a generation – maybe the last chance – to galvanise the putative community into a real community and so make that happen.

.

GP

May 2013