How High Attainers Feature in Ofsted Inspection and School Performance Tables (and what to do about it) – Part Two


This is the second and final part of a post about how the school accountability system reflects the performance of high attaining learners.

Part One considered recent amendments to Ofsted’s inspection guidance and compared Ofsted’s approach with how high attainers are defined in the School Performance Tables. It reviewed expectations of progression by high attainers and proposed that these should be increased.

Part Two:

  • Reviews how the next School Performance Tables (2013 results) will feature high attainers, compared with the current Tables (2012 results).
  • Explains how high attainers feature in the proposals for assessment and accountability reform set out in three recent consultation documents – covering primary, secondary and post-16 education respectively. This takes in the recently published Government response to the secondary consultation.
  • Offers guidance for schools on how they might set about planning to improve the performance of their high attainers, given current accountability arrangements and future prospects and
  • Proposes for further discussion a basket of key indicators that schools might publish and pursue alongside learners, parents and other stakeholders.

I have adopted a simple taxonomy of measures throughout the discussion that follows.

This distinguishes measures relating specifically to high attainers according to whether they feature:

  • Attainment: the achievement of specified grades or levels in assessments conducted at the end of a key stage of education.
  • Progress: the expected trajectory between two or more of these assessments, consistent with achieving commensurate outcomes in each.
  • Destination: the nature of the educational or other setting to which learners have moved at the end of a key stage.
  • Closing the Gap: the difference between the outcomes of disadvantaged and other learners on any of these measures, whether this is falling and, if so, by how much.
  • Comparison: how the performance of schools/colleges on any of these measures compares with broadly similar institutions.

I have also flagged up related measures of high attainment – and measures which reflect high attainment – where these have been applied to the entire cohort rather than separately to high attainers.


High Attainers in the 2012 and 2013 Performance Tables

We begin with a comparison between the 2012 and 2013 School Performance Tables, taking the primary, secondary and 16-18 tables in turn.

The details of reporting in the 2013 Tables are drawn from the published Statement of Intent. This confirms that they should be published according to the standard timetable, in mid-December 2013 (primary) and late January 2014 (secondary and post-16) respectively, with any data not then ready for publication added as and when it becomes available.

For ease of reference I have included in brackets the national figure for each high attainer measure in the 2012 Tables (state-funded schools only).


Primary Tables

Significant changes will be apparent in 2013. These follow from: the introduction of Level 4B+ and Level 6 performance; the removal of an overall level in English; the introduction of the grammar, punctuation and spelling (GPS) test; and the addition of three year averages for specified measures.

Level 4B+ has been introduced as a new marker of ‘secondary readiness’. The reason given is that analysis of sub levels showed that, in 2012, only 47% of those with 4C in both English and maths went on to achieve 5 A*-C grade GCSEs including English and maths, while the comparable percentages for 4B and 4A were 72% and 81% respectively.

I pause only to note that, if the threshold is raised in this manner, there is even stronger logic behind the idea of raising the threshold for high attainers in parallel – an idea I floated in Part One of this post.

The table below compares the 2012 and 2013 measures using the taxonomy set out above.


High Attainers measures

Attainment
  2012: % achieving L3- (0%), L4+ (99%) and L5+ (72%) in both English and maths; KS1-2 value added measure (English and maths) (99.8)
  2013: % achieving L3- and L5+ in all three of reading and maths tests and writing TA; % achieving L4+ and L4B+ in reading test; % achieving L4+ in writing TA; % achieving L3-, L4+, L4B+, L5+ and L6 in GPS test; % achieving L3-, L4+, L4B+, L5+ and L6 in maths test; KS1-2 VA (reading, writing and maths)

Progress
  2012: % making at least expected progress in English (87%) and maths (92%)
  2013: % making at least expected progress in each of reading, writing and maths

Destinations
  2012: None
  2013: None

Closing the Gap
  2012: None
  2013: None

Comparison
  2012: % achieving L4+ in English and maths (99%)
  2013: % achieving L4+ in reading, writing and maths

High Attainment measures
  2012: % achieving L5+ in reading and maths tests and writing TA (20%); % achieving L5+ in English (37%), maths (39%) and reading (48%); % achieving L5+ in English TA (36%), maths TA (40%), science TA (36%), reading TA (46%) and writing TA (28%); average point score (28.2)
  2013: % achieving L5+ and L6 in reading test; % achieving L5+ and L6 in writing TA; % achieving L5+ and L6 in English, reading, maths and science TA; average point score (reading, writing and maths)


The number of attainment measures applied specifically to high attainers has increased, but it is not entirely clear why so many different combinations of levels will be reported for different elements (there are four variants in all). The extent of L6 attainment is clearly a factor, but cannot be the sole reason.

For the first time we will be able to compare the performance of disadvantaged and other learners for both L5 and L6 in GPS and maths, at least in theory, since very few schools are likely to have sufficient L6 performance by disadvantaged learners to register on this measure.

But this is a high attainment measure. We still cannot see what proportion of high attainers are disadvantaged – and how their performance compares with their peers.

A national primary destinations measure is not really feasible and would tell us little, unless receiving secondary schools were categorised on the basis of their performance in the secondary tables or their Ofsted rating. It might be interesting to see what proportion of high attainers (particularly disadvantaged high attainers) transfer to secondary schools deemed outstanding – compared with middle and low attainers – but the benefits would be limited. However, this might be more relevant at local level.

The basis of the comparison with other schools is important to understand, since this methodology will be carried forward in the accountability reforms reviewed below.

The probability is calculated of pupils achieving KS2 L4+ in both English and maths, based on the achievement of pupils with the same prior attainment at KS1. These probabilities are averaged to provide a figure for the school. Then a similar school group is established by selecting the 62 schools with immediately stronger average performance and the 62 with immediately weaker average performance, giving a group of 125 schools. An average Level per pupil is calculated from the average points score.

The same methodology is used in the secondary tables. Here the calculation is built on the probability of achieving 5+ A*-C grades in GCSE, or equivalent, plus English and maths GCSEs, and based on the achievement of pupils with the same prior attainment at KS2. In this case the similar school group is derived from the 27 schools immediately above and the 27 immediately below, giving a group of 55 comparator schools.
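The mechanics of this grouping can be sketched in a few lines of Python. This is purely illustrative – the school names and probabilities below are invented, and the handling of schools at the extremes of the ranking is a simplifying assumption:

```python
# Illustrative sketch of the 'similar schools' grouping described above.
# In practice the per-pupil probabilities are modelled from prior attainment
# (KS1 for the primary tables, KS2 for the secondary tables) and averaged
# to give each school a single figure.

def similar_school_group(schools, target, half_width):
    """Rank schools by average pupil probability of reaching the benchmark,
    then take `half_width` schools either side of the target school.
    half_width is 62 for the primary tables (group of 125) and 27 for the
    secondary tables (group of 55)."""
    ranked = sorted(schools, key=lambda s: s["avg_probability"])
    i = next(idx for idx, s in enumerate(ranked) if s["name"] == target)
    lo = max(0, i - half_width)            # simplification at the extremes
    hi = min(len(ranked), i + half_width + 1)
    return [s["name"] for s in ranked[lo:hi]]

# Hypothetical example: 500 schools with made-up average probabilities.
schools = [{"name": f"school_{n}", "avg_probability": (n * 37 % 500) / 500}
           for n in range(500)]
group = similar_school_group(schools, "school_10", half_width=62)
print(len(group))  # 125 comparator schools (including the target itself)
```

Note that, on this construction, the comparator group is different for every school: each sits at the centre of its own window of similarly performing institutions.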


Secondary Tables

Compared with the extensive changes to the Primary Tables in 2013, there is very little difference in the Secondary Tables, apart from the introduction of an average grade per pupil (best 8) measure.


High attainers measures

Attainment
  2012: % achieving 5+ A*-C GCSEs or equivalent including GCSEs in English and maths (94%); % achieving Grades A*-C in GCSE English and maths (94.3%); APS (best 8) for all qualifications (398.5) and GCSE only (375.4); average grade per qualification and per GCSE; average entries per pupil for all qualifications (12.4) and GCSE (9.7); % entered for all EBacc subjects (46.3%); % achieving all EBacc qualifications (38.5%); EBacc VA in English (1000.2), maths (1000.1), science (1000.4), humanities (1000.8) and languages (1000.2)
  2013: As 2012, plus average grade per pupil (best 8)

Progress
  2012: % making at least expected progress in English (96.6%) and maths (96.8%); VA (best 8) (1000.8)
  2013: As 2012

Destinations
  2012: None (see below)
  2013: Published data included

Closing the Gap
  2012: None
  2013: None

Comparisons
  2012: % achieving 5+ A*-C GCSEs or equivalent, including GCSEs in English and maths (94%)
  2013: As 2012

High Attainment measures
  2012: None
  2013: None


I have not included in the high attainment section the few additional attainment measures not applied specifically to high attainers (percentage achieving 5+ GCSEs at A*-C or equivalent (83%); percentage entered for and achieving EBacc subjects – English, maths, science, humanities and languages). These are not specifically high attainment measures and tell us relatively little.

As in the Primary Tables, there is no analysis of schools’ success in closing the gap for high attainers. And, since there are no substantive high attainment measures in the Secondary Tables (with the possible exception of the EBacc), this omission is comparatively more significant.

Whereas we can see some L5+ and L6 performance for disadvantaged learners in the primary tables, there is no equivalent focus – say on 5+ GCSEs at A*/A including English and maths – in the secondary tables.

Destinations measures have already been published separately for 2010/11 and education destinations will be included in the 2013 Performance Tables, but the breakdown of destinations data by pupil characteristics does not include a prior attainment category.

(Compare this with the more cautious statements about the longer term use of KS4 destinations data set out below; the Statement of Intent is similarly guarded about the KS5 data.)

We have also had the separate announcement that Performance Tables will henceforward record only a learner’s first entry for any given GCSE examination (or IGCSE or Level 1/Level 2 Certificate).

This is too late for the imminent 2013 Tables, but will impact on the 2014 Tables (published in January 2015), when it will bite on all EBacc subjects. From 2015 it will impact on all subjects.

There are two schools of thought about the potential impact on high attainers.

One might argue that this change should not affect them unduly, since they are much more likely to be entered early because they are ready to achieve a commensurately high grade, rather than to ‘bank’ a Grade C which they may or may not subsequently seek to improve via retakes.

As the DfE announcement says:

‘If schools are confident that pupils will achieve well even when entered early and that early entry is therefore in the interests of the pupil, they should not need to make any changes to entry plans.’

On the other hand, we have already seen in Part One that Ofsted has included in the Subsidiary Guidance supporting inspection the advice that:

‘Inspectors should investigate whether a policy of early entry to GCSE for pupils is preventing them from making as much progress as they should, for example because…

  • The widespread use of early GCSE entry and repeated sitting of examinations has encouraged short-term gains in learning but has led to underachievement at GCSE, particularly for able pupils
  • Opportunities are missed to meet the needs of high-attaining pupils through depth of GCSE study and additional qualifications.’ (para 34)

Schools would do well to ensure that their plans to improve high attainers’ performance are reflected in any revision of their early entry policies, and vice versa.

Given the significance now attached to this issue, any measure that depends on increasing the incidence of early entry for high attainers is likely to receive close scrutiny.

Schools should not be cowed from adopting such an approach where it is clearly in the best interest of their highest attainers, but they will need strong supporting evidence that early entry will result in an A*/A grade (and ideally A*), and that appropriate progression routes are in place.


16-18 Tables

The post-16 Tables are comparatively less well-developed and continue to rely exclusively on high attainment measures rather than separately delineating outcomes for high attainers.

The structure of the 2013 Tables is undergoing significant change, with separate reporting of three different performance categories, depending on whether students have pursued A levels, A levels and other advanced academic qualifications, or advanced vocational qualifications respectively.

The 2013 entries in the table below cover only the A level strand.


High attainers measures

Attainment, Progress, Destinations, Closing the Gap and Comparisons
  2012: None
  2013 (A levels): None (for destinations, see below)

High attainment measures
  2012: % of KS5 students (4.8%) and of A level students (7.4%) achieving 3 A levels at AAB+ in three facilitating subjects; % of KS5 students (7.8%) and of A level students (11.9%) achieving 3 A levels at AAB+ with two in facilitating subjects; APS per student and per entry for A level only; APS per student and per entry for A level and other academic qualifications; APS per student and per entry for A level and equivalent qualifications (including 4 year time series)
  2013 (A levels): % of A level students achieving 3 A levels at AAB+ in three facilitating subjects; % of A level students achieving 3 A levels at AAB+ with two in facilitating subjects; APS per A level student (FTE) and per A level entry; A level VA (see below)


New value added measures are also expected for A level and the other two performance categories, though the release of this data is said to be ‘subject to further analysis’ and no substantive detail is provided.

There is no commitment to introduce KS5 destinations data into the 2013 Tables though the Statement of Intent says:

‘We will continue to evaluate both the KS5 and employment destinations measures as part of our aim to include in future performance tables.’

‘Facilitating subjects’ continue to hold sway in the key high attainment measures, despite continuing criticism of the concept, as well as concern that a ‘three facilitating subjects’ measure is not consistent with the Russell Group’s advice.


What can we learn from this comparison?

It is noticeable how little consistency there is between each set of Tables as presently formulated. High attainers feature strongly in the Secondary Tables, to a limited extent in the Primary Tables and not at all in the Post-16 Tables.

Conversely, there are several measures of high attainment in the Primary Tables and a couple in the Post-16 tables, but the Secondary Tables concentrate exclusively on generic measures. There is no measure for achievement pitched at GCSEs Grades A*/A.

Closing the gap data and comparisons data is so far entirely absent from the post-16 Tables and, perhaps less surprisingly, destinations data is absent from the Primary Tables.

Where destinations and closing the gap data is available, there is no breakdown for high attainers.

The next section will explore whether the changes proposed in the three accountability-related consultation documents are likely to bring about any greater consistency in the coverage of the respective Performance Tables and, more specifically, how they are likely to report on high attainers.


Likely Impact of Accountability Reforms

At the time of writing, three staged consultations have been launched but only one has been concluded.

All three consultations are predicated on a new approach to the publication of accountability data which has three components:

  • A requirement to publish a handful of headline indicators on schools’ own websites. The secondary consultation response describes this as a ‘snapshot’ in standard format. It seems likely that this provision will be extended to primary and post-16 institutions but, as yet, this is nowhere confirmed.
  • Continued publication of Performance Tables which contain the same headline indicators, but with additional material showing how different groups of learners perform against them and how performance compares with other institutions.
  • The introduction by March 2015 of a new data portal which contains all the (performance) information held about schools (and colleges?). We are told in the secondary consultation response that:

‘It will be an easily accessible website that allows the public to search all the information we hold about schools, subject to protecting individuals’ anonymity. Respondents to the consultation argued that it would be useful to see measures showing school by school performance in vocational qualifications, the percentage of pupils achieving the top grades in GCSEs, and average grades by subject. We agree that all these measures will be of interest to many people, and the Data Portal is being designed so that parents can search for this type of information.’

There is otherwise comparatively little information about this portal in the public domain.

DfE’s Digital Strategy, published in December 2012, says that the parent School Performance Data (SPD) programme will ‘see the consolidation of 8 existing data-based services into one’ and that delivery of the programme will be staggered from the second quarter of 2014 until the final quarter of 2015. At some point within this timeline, it will absorb and replace RAISE Online.



Primary Accountability Reform

Those with a wider interest in the proposed changes to primary accountability and assessment are invited to read this analysis.  This commentary deals only with the material likely to be published through one of the three routes described above.

The primary document is comfortably the vaguest of the three consultations. However, we know that new KS2 tests – single tests for the full attainment spectrum – will be introduced in 2016, with results first appearing in the 2016 Performance Tables, likely to be published in December that year.

We are told that tests will be available in maths and in elements of English including reading (GPS is not mentioned explicitly). Writing will continue to be assessed via teacher assessment.

Published performance measures are typically described in the consultation as relating to ‘each subject’ but it seems most likely that elements of English will continue to be reported separately.

All the measures outlined below therefore apply to maths and to each tested element of English. One assumes they also apply to writing TA – and potentially to other TA outcomes too – but this is unclear.

So there may be one or two sets of measures for maths (test and potentially TA), while for English there will be somewhere between one (reading test) and five (reading and GPS tests; reading, writing and GPS TA).

For each subject/element, Performance Tables are expected to include:

  • The percentage of pupils ‘meeting the secondary readiness standard’, which will already have been introduced in the 2013 Tables. Secondary readiness will be reported using scaled scores derived from raw test marks. A score of 100 is proposed to denote the threshold of secondary readiness, so the Tables will show the percentage of a school’s learners at or above 100.
  • The corresponding average scaled scores for each school. The consultation document adopts an illustrative scale – based on the current national curriculum tests – which runs from 80 to 130. An average scaled score significantly over 100 is a positive indicator but, of itself, says nothing about the distribution of scores, or whether this outcome is attributable to prior attainment.
  • ‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score’. There is no information about where these measures will be pitched in relation to the deciles that will be used to report to parents on individual learners’ performance. It might be that the cut-off for high attainers is pitched at a particular score – say 120 on the illustrative scale above – or at the 20th percentile, for example. (It seems unlikely that numbers performing in each decile will be reported in the tables though they could presumably be found via the data portal.)
  • Progress measures based on comparison of pupils’ scaled scores with the scores of other pupils with the same prior attainment. The paper gives the impression that there will be separate progress measures for each subject/element, rather than a composite measure such as that planned for secondary schools (see below). These will be derived from performance on a baseline assessment, which seems increasingly likely to be moved back from the end of KS1 to YR. We are not told what categories of prior attainment will be applied. A sophisticated gradation, based on deciles, is unlikely to be consistent with a simple baseline check for 4 year-olds. A cruder tripartite judgement of ‘primary-readiness’ is more probable, with learners categorised as ‘not yet primary ready’, ‘primary ready’ or ‘performing beyond primary readiness’.
  • Comparison of each school’s performance with other schools with similar intakes. Whether this will be undertaken in relation to all the measures above, or only selected measures, is so far unclear.
  • The attainment and progress of those in receipt of the Pupil Premium. Whether this analysis will be provided for all the measures above also remains unclear.
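On the illustrative 80-130 scale, the three attainment-related headline measures above might be computed along these lines. This is a hypothetical sketch: the 120 cut-off for a ‘high scaled score’ is my own assumption, since the document does not say where that threshold will be pitched:

```python
# Hypothetical sketch of the proposed primary headline measures, assuming a
# scaled-score range of 80-130 and a secondary-readiness threshold of 100.
# The high-attainer cut-off of 120 is an assumption for illustration only.

def headline_measures(scores, threshold=100, high_cut=120):
    """Compute the three headline attainment figures for one school."""
    n = len(scores)
    return {
        "pct_secondary_ready": 100 * sum(s >= threshold for s in scores) / n,
        "average_scaled_score": sum(scores) / n,
        "pct_high_scaled_score": 100 * sum(s >= high_cut for s in scores) / n,
    }

cohort = [88, 96, 100, 104, 108, 112, 118, 121, 124, 129]  # made-up scores
m = headline_measures(cohort)
print(m["pct_secondary_ready"])    # 80.0
print(m["average_scaled_score"])   # 110.0
print(m["pct_high_scaled_score"])  # 30.0
```

As the sketch makes plain, an average scaled score well above 100 can coexist with very different distributions of individual scores, which is why the average alone says little about high attainers.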

All measures will be published as annual results and as three year rolling averages.

The document asks respondents to comment on whether other measures should be prioritised within performance tables. It is not clear how much of this data will be published on schools’ own websites as well as in performance tables.


Secondary accountability reform

The response to the secondary consultation confirms that schools will be required to publish five measures in a standard format on their own websites:

  • Attainment across a suite of up to eight qualifications (called ‘Attainment 8’). This suite includes: English and maths, both of which are double weighted to reflect their significance (English language will only be double weighted if combined with English literature); three further EBacc subjects (combined science double award will count as two qualifications; subjects can be drawn from the same subject area, eg three sciences or two languages); and three further qualifications, which may be arts-based, academic or approved vocational qualifications (English literature may count as one of these). The measure will be applied to learners entering fewer than eight qualifications and those entering more will have their highest graded qualifications counted in the final category. The outcome will be expressed as an average grade, but with finer gradations – exemplification in the response refers to ‘a high B grade or a low D grade’. (The consultation document noted that learners would know their APS and be able to compare it with ‘easily available local and national benchmarks’).
  • An aggregated progress measure (called ‘Progress 8’) based on the same suite of up to eight qualifications. KS4 results will be predicted on the basis of prior attainment at KS2. This involves calculating an estimated KS4 outcome based on the actual outcomes of all learners with a specified level of KS2 attainment across English and maths. (The two examples in the response are based on the existing KS2 APS methodology and averaged NC levels respectively, rather than the proposed new ‘scaled scores’ methodology outlined above. We do not know on what basis such scores would be aggregated in future for the purpose of this calculation.) Each learner’s VA score is calculated by subtracting their estimated outcome from their actual outcome. So, if a learner with given prior attainment is estimated to achieve 8C grades at GCSE and actually achieves 4Bs and 4Cs, that gives a VA score of +0.5 (4 grades over 8 subjects). The school’s average VA score is calculated by aggregating these outcomes.
  • The percentage of learners achieving a ‘pass’ grade or better in English and maths. The response uses the existing nomenclature of a C grade or better, but this will change when a new GCSE grading system is finalised. Outcomes from the Ofqual consultation – which proposed a new grading scale from 1 (low) to 8 (high) and a subsequent standards-setting consultation – should be available shortly.
  • The percentage achieving ‘good grades’ in the EBacc (the same issue applies) and
  • A destination measure, derived from the current experimental approach. This is provisional since ‘we want to be sure the statistics are robust before committing to use this…as a headline indicator’.
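The ‘Progress 8’ arithmetic in the second bullet can be sketched as follows. The grade-to-points mapping (A* = 8 down to G = 1) is assumed purely for illustration and is not the confirmed methodology:

```python
# Sketch of the 'Progress 8' value-added arithmetic described above, using the
# worked example from the consultation response: a pupil estimated at 8 C
# grades who actually achieves 4 Bs and 4 Cs scores +0.5 (4 grades over 8
# subjects). The points mapping below is an assumption for illustration.

POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def progress8(estimated, actual):
    """Pupil VA score: actual minus estimated, averaged over the 8 slots."""
    diff = sum(POINTS[a] - POINTS[e] for e, a in zip(estimated, actual))
    return diff / 8

estimated = ["C"] * 8            # estimated from KS2 prior attainment
actual = ["B"] * 4 + ["C"] * 4   # actual KS4 results
print(progress8(estimated, actual))  # 0.5
```

A school's published figure would then be the average of its pupils' individual VA scores.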

The response adds that the calculation process for ‘Progress 8’ is under review. It is likely to be adjusted, by calculating expected progress from the results of learners who completed KS4 three years previously, though possibly not until 2019.

There might also be a shift to calculating expected progress at subject level, which is then averaged, to reduce the likelihood of schools entering learners for qualifications ‘in which it is easier to score points for the progress measure’.

A straightforward linear point scoring system is under discussion – eg 1 for a current GCSE Grade G up to 8 for a current A* grade. This would give more credit to schools for higher results than the current non-linear approach, which awards a G 16 points and an A* 58 points. (This might suggest a similar adjustment to the primary APS methodology.)
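To make the contrast concrete, here is a small sketch comparing the relative credit under the two scales. The intermediate grade values on the current scale (6-point steps from G = 16 to A* = 58) are my assumption of the standard published points:

```python
# Comparing the relative weight of top grades under the current point scale
# (G = 16 ... A* = 58; intermediate 6-point steps assumed) and the proposed
# linear 1-8 scale. An A* is worth only 3.625 Gs on the current scale but
# eight Gs on the linear scale, so higher grades carry more relative credit
# in the proposed scheme.

grades = ["G", "F", "E", "D", "C", "B", "A", "A*"]
current = {g: 16 + 6 * i for i, g in enumerate(grades)}  # G=16 ... A*=58
linear = {g: 1 + i for i, g in enumerate(grades)}        # G=1 ... A*=8

print(current["A*"] / current["G"])  # 3.625
print(linear["A*"] / linear["G"])    # 8.0
```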

Finally, the treatment of an estimated 1.2% of low attainers who enter no relevant qualifications is still under consideration. The methodology will not be finalised until Spring 2014.

Performance Tables will ‘eventually’ include the five headline indicators above (no date is provided for this).

They will also contain:

  • A value added progress measure in English and maths, showing whether learners have performed better or worse than expected given their prior attainment at KS2. (The original consultation document implied this would relate to English and maths separately and would be provided for low, middle and high attainers respectively.)
  • A comparison measure with similar schools, using the existing methodology, but further developed to give ‘an indication of disadvantage’ in each group of similar schools.
  • By implication, ‘closing the gap’ indicators showing the attainment of disadvantaged learners eligible for the Pupil Premium, the progress of those learners and ‘the in-school gap in attainment between disadvantaged pupils and their peers’. Both single year data and three year rolling averages will be published.
  • By implication there will also be further analysis of performance by high, middle and low attainers on each measure. The drafting is, however, very unclear, referring to the present rather than the future:

‘Performance tables give a breakdown of the performance of different pupil groups on the headline indicators. They show how well pupils with low, middle and high prior attainment perform on each measure…For each indicator, local and national benchmarks are provided to make it easier to judge each school’s performance. Using this information, parents can search for the information which is most relevant to them. For example, they can see how many pupils with high prior attainment go on to achieve the EBacc at different schools in their area.’

It seems most likely that all learners judged ahead of the new ‘secondary ready’ threshold would count as high attainers, even though it would now be possible to calculate more sophisticated gradations based on deciles of performance.

Hence we would continue to have three broad categories of prior attainment: exceeding secondary ready standard, at secondary ready standard and not yet secondary ready.

However, the consultation response is silent on this matter.

We are told that material about pupils achieving the top grades at GCSE will only be available through the data portal, which means that – on this issue – the new secondary tables will continue to be out of kilter with the primary and post-16 tables.

There is one further unexplained reference in the response:

‘Confidence intervals will also be important when we present each school’s percentile ranking on the range of headline measures. For example, a school could have performed well on the Attainment 8 measure and be in the 10th percentile, with a confidence interval that indicates that the school’s true ranking is likely to lie between the 5th and 15th percentiles.’

This appears to suggest that schools will be ranked on each headline measure – but it remains unclear whether this material will be included in the Performance Tables.

There is also a beguiling reference to Ofsted:

‘In addition Ofsted may choose to specify some of these measures, for example the percentage of pupils achieving the best GCSE grades, in their inspection guidance’.

In addition:

‘Schools in which pupils make an average of one grade more progress than expected across their 8 subjects will not be inspected by Ofsted during the next academic year (unless there are exceptional circumstances, for example where there are safeguarding concerns).’

This suggests further changes to the Handbook and Subsidiary Guidance, and potentially to the Framework itself.

Changes will be introduced into the 2016 Performance Tables published in January 2017, but schools will receive information based on 2014 exam results to illustrate their performance on the new measures, and will be able to opt in to a new floor standards methodology in 2015.



Post-16 accountability reform

This consultation proposes that performance should be reported separately at Level 2 (including continued study of English and maths for those so far without a GCSE ‘pass’ – currently graded A*-C) and Level 3.

In respect of Level 3, performance should be reported for three strands of provision – Academic (A level, AS level, IB, EPQ etc), Applied General and Technical – as has been introduced for the 2013 Tables.

This analysis focuses exclusively on Level 3 Academic provision.

At Level 3, five ‘topline performance measures’ are proposed which will be included in Performance Tables (it is not stated whether institutions will also be required to publish them on their websites):

The proposed topline measures are:

  • Two principal attainment measures: average grade and average points per full-time A level student – a best 3 A levels measure is under consideration ‘to encourage substantial A level programmes’; and average grade and points score per A level entry. This is given pride of place, above the measures that rely on facilitating subjects and there is no reference to restrictions being placed on the subjects studied.
  • An aggregate KS4-5 progress measure ‘showing the progress of similar students nationally who, according to their results at the end of Key Stage 4, were of the same level of ability’ (by which they mean attainment). This sounds broadly consistent with what is proposed for the primary and secondary tables. The annex says:

‘Only students with the same prior attainment, taking the same subjects, will be compared to provide a subject score. Subject scores will then be aggregated with other academic…scores to provide an overall academic score.’

This presumably means that A level students will be compared with those with similar KS4 attainment who are taking the same A level subjects.

  • A destination measure ‘showing student progression to a positive destination’. No reference is made to the controversial question whether this will continue to distinguish Russell Group universities and Oxbridge.
  • A completion measure.

There is additionally a commitment to ‘consider how we can report the results of low, middle and high attainers similarly [to KS4] in the expanded 16-19 performance tables’ but no further clue as to how these will be devised.

A number of additional measures are also laid out, which may or may not appear in the performance tables:

  • The percentage of students achieving AAB+ grades at A level in two and in three ‘facilitating subjects’. Note that the benchmark is still AAB+, even though, from 2013-14 onwards, HEFCE’s student number control relaxation – from which this measure was originally derived – is extended from AAB+ to ABB+.
  • A ‘closing the gap’ measure which will show attainment by pupils who were eligible for Pupil Premium funding in Year 11. The annex suggests that this ‘can be compared with the top line attainment measure’ which, in the context of A level, may mean one or both of the two described above. It is not clear whether this will also be applied to the progress measure.
  • Attainment of approved level 3 maths qualifications for students who do not take A or AS level. These qualifications are under development and will be available for teaching from 2015.

The timetable for the introduction of these reforms is not specified.


Comparison of Primary, Secondary and Post-16 reforms

The table below shows the extent to which the overall proposals for performance table reform reflect a consistent application of the typology set out above.


Attainment
  • Primary: % achieving secondary-ready standard; average scaled scores
  • Secondary: ‘Attainment 8’ expressed as average grade; pass grade in E+M; pass grades in EBacc
  • Post-16: average grade and APS per full-time A level student (potentially on best 3 A levels); average grade and APS per A level entry; AAB+ in 2 and 3 facilitating subjects; performance in L3 maths qualifications

Progress
  • Primary: average scaled scores compared with prior attainment
  • Secondary: ‘Progress 8’ expressed as +/- average grade; progress in E+M
  • Post-16: yes, but no detail

Destinations
  • Primary: none mentioned
  • Secondary: provisionally
  • Post-16: yes, but no detail

Closing the gap
  • Primary: yes (indicators unclear)
  • Secondary: yes (unclear if applied to all indicators above)
  • Post-16: yes, but no detail

Comparisons
  • Primary: yes (indicators unclear)
  • Secondary: yes (with ‘indication of disadvantage’)
  • Post-16: none mentioned

Notes
  • Primary: reference to ‘percentile rankings on headline measures’
  • Post-16: unclear whether facilitating subjects and L3 maths will appear in the tables

There is evidence of a shift towards greater consistency of approach compared with the current performance tables, although many of the details have yet to be clarified.

Unfortunately, this lack of clarity extends to the definition of high attainers and how their performance will be reported.

  • In the primary sector, it seems most likely that high attainers will be defined according to some yet-to-be-determined measure of ‘primary readiness’ and in the secondary sector according to the new ‘secondary-ready’ standard. One could extend the same logic into post-16 by devising a ‘sixth form ready’ standard based on performance against new-style GCSE grades. This would transpose into the new Tables the rather crude tripartite distinction in place currently at primary and secondary level, though we know the pitch will be somewhat higher. But this is little more than an educated guess. It would be quite possible to introduce a more sophisticated distinction and a narrower definition of what constitutes a high attainer, though this would be much more straightforward in the secondary sector than in primary.
  • We are equally unclear how high attainers’ performance will be reported. There is nothing explicit in the primary consultation document to explain which measures will be applied to high, middle and low attainers. Indeed, the only direct reference to such distinctions applies to Ofsted inspection:

‘Schools in which low, middle and high attaining pupils all make better than average progress will be much less likely to be inspected sooner.’

The secondary consultation response implies that all the headline measures will be applied to these three categories of prior attainment, but fails to state this explicitly, while the post-16 document doesn’t go beyond the broader commitment mentioned above.

  • As for the reporting of high attainment, the methodology for the principal progress measures (confirmed for the primary and secondary tables and planned for post-16) is specifically designed to discourage schools from concentrating over-much on learners on the borderline of a threshold measure. This is welcome. But whereas the primary tables will include new measures focused explicitly on how many of a school’s population have achieved national measures of high attainment (as expressed by high scaled scores), and high attainment measures will be retained in the new post-16 tables, the continued omission of an equivalent GCSE measure from the secondary tables seems inconsistent, even though we are told it will be possible to find this data in the accompanying portal. It is hard to understand the logic behind this inconsistency when the broader direction of travel is very much about bringing the three sets of tables more closely into line.


How Should Schools Respond?

With so much change in the offing, it can be all too easy for schools to slip into a defensive, reactive mode, particularly if they are under pressure on other fronts.

There is a temptation to concentrate effort elsewhere, on the grounds that high attainers will do comparatively well with relatively little support. Most will be secure L5 performers and go on to pick up a clutch of A grades at GCSE. The opportunity cost of lifting them to L6 and converting their As into A*s may be perceived as simply too great.

On the other hand, while I can cite no hard evidence to support the contention, I have often observed that schools which are successful with their high attainers are rarely unsuccessful in other respects. It is as if support for high attainers is a litmus test of personalised education, going the extra mile and wider school effectiveness.

And there is a real opportunity for schools to get on to the front foot with this issue, given the absence of any substantive lead or guidance from the centre, or any conspicuous consensus amongst schools – or between experts, academics and service providers – over what constitutes effective practice.

Naturally schools will want to frame their response in a way that addresses their priorities and fits their particular contexts. They will need to take into account how their success is defined and reported by Ofsted and in School Performance Tables – and how this might change in the future – but they will not plan exclusively on that basis.

They must find the ‘best fit’ between the demands of the accountability regime and what is in the best interests of their learners. The regime is not intended to impose rigid conformity, and higher performing schools in particular must be allowed to trust their judgement rather than devoting themselves exclusively to these ‘one-size-fits-all’ measures of success.

The first part of this final section sets out how a school might rethink and redefine its support for high attainers from first principles – though it stops short of advocating or discussing any specific elements of effective whole school practice.

The second part draws on the analysis elsewhere in this post to inform a suggested basket of key measures from which schools might select when constructing a plan to improve their high attainers’ performance. This is very much intended as a starting point for discussion, rather than a blueprint that all must follow.


Rethinking Support for High Attainers

A high attainers’ support strategy will only be completely successful if it has the full commitment of staff, learners, parents and other key stakeholders. It should extend across the whole school and all dimensions of whole school practice, including learning at home and parental and community engagement.

The best way to secure wholesale commitment is through an inclusive and transparent consultative process. The outcomes of that process are most readily captured in an accessible document that should:

  • Include a clear, comprehensive yet succinct statement of whole school policy for challenging and supporting high attainers that is meaningful and relevant to lay and professional audiences alike.
  • Incorporate a concise improvement plan that is regularly monitored and updated and that feeds into the wider school improvement plan.
  • Be published openly, so that the school’s priorities are understood and acted on by all parties – and so that prospective parents and learners can use them to inform decisions about whether to apply for admission.

In formulating a support plan, schools must consider what relationship it should have with any parallel gifted and talented education policy (or equivalent terminology adopted by the school).

It is not appropriate simply to substitute one for the other, without giving careful consideration to what might be lost in doing so.

In future, how will the school support learners with high ability but who might not realise it through high attainment? What of twice-exceptional learners, for example, and those with diverse talents that are not demonstrated through high attainment, whether in arts, sports, interpersonal skills, or any other field judged significant by the school?

Some schools may prefer to have parallel and mutually supportive policies for these two overlapping populations; others may prefer an integrated policy.

The most straightforward approach is to distinguish high attainers as a subset of the school’s wider gifted and talented population. Ignore the school of thought that suggests high attainers are somehow different, ‘bright but not gifted’.

But, if the policy is integrated, it must be clear where and how support for high attainers is distinct.

The support plan must rest on a clear definition of what constitutes a high attainer – and a potential high attainer – in the context of the school, together with explicit recognition of the gradations of high attainment within the general definition. (Ofsted’s example shows how confusion can be caused by inconsistent terminology and a failure to define terms.)

The plan should be framed around core priorities. These will typically include:

  • A judicious blend of challenge and support for high attainers, designed to ensure that they continue to perform highly, but are not exposed to undue pressure or stress; and that their high attainment is not achieved at the expense of personal wellbeing, or wider personal development.
  • Challenge and support for learners who are not yet high attainers but might become so. (This might include the remainder of the school population, or a more narrowly defined group, depending on context and ideological preference.)
  • Targeted challenge and support for disadvantaged learners and under-represented groups. This must include those in receipt of the Pupil Premium but might also reflect gender, minority ethnic, SEN and summer-born considerations. Schools understand the complex causes of underachievement and that most underachieving learners are affected by a combination of factors. Avoid a simplistic quota-driven model. Critically, this equity-driven support must not be achieved at the expense of advantaged learners. The optimal strategy is to continue to raise standards for all high attainers, but to raise them relatively faster for those from disadvantaged backgrounds.

The plan must show how the current state will be changed into the desired future state. This necessitates:

  • A thorough review process and full statement of the baseline position, preserving a balance between the celebration of strengths and the identification of weaknesses and areas for development. Schools should not gloss over gaps in their skillset, or evidence that particular subjects/departments are weaker than others. This should not be an exercise in finding and attributing fault, but a collective endeavour undertaken in a spirit of continuous improvement.
  • A coherent set of priorities for improvement which must be SMART (Specific, Measurable, Achievable, Realistic and Timebound). A named individual should be accountable for each priority and it should be stated explicitly what staff time, budget and any other resources are allocated to securing it.
  • Improvement priorities should be aligned with a matching set of outcome measures, or success criteria, which might draw from the suggested basket set out below. These should capture the intended impact of the improvement priorities on high attainers’ performance.
  • Arrangements for regular monitoring and review. There should be capacity to manage slippage and risk and to accommodate succession planning. A senior manager should be accountable for the plan as a whole, but it should be evident how all staff and all stakeholders – including learners and parents – can contribute positively to its achievement.

If the school is sufficiently confident it should consider incorporating an explicit entitlement to challenge and support for all its high attaining learners. This should not be vague and generalised, but sharp and explicit, so that parents and learners can challenge the school and seek redress if this entitlement is not forthcoming.


A potential basket of key measures

What measures might schools and colleges select from when developing such plans? I have set out below some first efforts at three baskets of key measures relating to primary, secondary and post-16 respectively.

These broadly reflect the current accountability regime, though I have suggested some departures, to fill gaps or in response to known concerns.

I do not pretend that these are more than illustrative. Some of the measures beg questions about definition, a few are rather iffy and there are certainly gaps in the coverage.

But they should serve to exemplify the broad approach, as well as providing a basis for more rigorous discussion at institutional level. I don’t have all the answers and very much want to start the conversation, rather than attempting to close it off.

Planners might reasonably consider drawing one measure from each of the five areas in the typology I have set out. Failing that, they might aim for a ‘balanced scorecard’ rather than relying excessively on measures in just one or two categories.

The optimal number of measures is probably between three and five – if there are fewer than three the scope of the plan will be too narrow; if there are more than five it will be too complex.

It should be possible to develop and refine these baskets over time, to reflect ideas and suggestions from those engaging with them. I hope to revisit them in future.

They will need significant adjustment to reflect the new accountability regime, once the proposals in the three consultation documents have been implemented.

And, hopefully, the new Data Portal will make it much easier to construct the necessary measures in all five of these categories, once it is introduced from 2014.


                                     PRIMARY BASKET OF INDICATORS
Attainment % of high attainers at KS1
  % of high attainers at KS2
  % of high attainers achieving KS1 L3
  % of high attainers achieving KS2 L5B/5A/6
Progress 100% of high attainers make expected progress in KS1
  % of high attainers making more than expected progress in KS1
  100% of high attainers make at least 2 levels of progress in KS2
  % of high attainers making more than 2/up to 3 levels of progress in KS2
Destinations % of high attainers transferring to outstanding secondary schools
  % of high attainers transferring to selective schools
Closing the gap For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  For any of the indicators above, the FSM gap is closed by x%
  High attainers are representative of school population (by eg FSM, gender, ethnic background, SEN, month of birth)
Comparisons For any of the indicators above, compare with all schools located in LA area/governed by academy trust
  For any of the indicators above, compare with family of schools with broadly similar intake



                                            SECONDARY BASKET OF INDICATORS
Attainment % of KS2 high attainers in Y7 intake
  % of KS2 high attainers achieving 5+ A*-A grades at GCSE or equivalent including GCSE English and maths
  % of KS2 high attainers’ GCSE entries awarded A*/A grades, or A* grades only
  % of KS2 high attainers awarded GCSE A*/A grades or A* grades via early entry
  Increase in GCSE APS (Best 8) and/or improvement in average grade
  Increase in GCSE APS (new Attainment 8 measure) and/or improvement in average grade
Progress 100% of KS2 high attainers making expected progress from KS2-4
  % of KS2 high attainers making more than expected progress from KS2-4
  % of KS2 high attainers making 4/5 levels of progress from KS2-4
Destinations % of KS2 high attainers transferring to outstanding sixth forms/post-16 institutions
Closing the gap For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  For any of the indicators above, the FSM gap is closed by x%
  High attainers are representative of school population by eg FSM, ethnic background, SEN, month of birth
Comparisons For any of the indicators above, compare with other schools located in LA area/governed by academy trust
  For any of the indicators above, compare with a family of schools with broadly similar intake



                                             POST-16 BASKET OF INDICATORS
Attainment % of KS2 high attainers achieving AAB+/ABB+ grades at A level (whether or not in facilitating subjects)
  % of KS2 high attainers’ A level entries awarded B/A/A* or A*/A or A* grades only
  Increase in A level APS (best 3) and/or improvement in average grade
Progress % with GCSE A* achieving A level grade A* in same subject
  % with 5+ GCSEs at A*/A including English and maths achieving 3+ A levels at AAB/ABB
Destinations % transferring to high tariff undergraduate degree courses
  % transferring to selective/Russell Group/Oxbridge universities
Closing the gap For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  For any of the indicators above, the FSM gap is closed by x%
  High attainers are representative of sixth form population by eg FSM, gender, ethnic background, SEN, month of birth
Comparisons For any of the indicators above, compare with other schools located in LA area/governed by academy trust
  For any of the indicators above, compare with a family of schools/colleges with broadly similar intake
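To make the ‘closing the gap’ rows in these baskets concrete, here is a minimal sketch of how a school might compute its FSM gap on any pass/fail indicator and test a gap-narrowing target. The cohort figures and the 40% target are invented for illustration; a real analysis would draw on the school’s own pupil-level data.

```python
def fsm_gap(cohort):
    """Percentage-point gap on a pass/fail indicator between non-FSM and
    FSM pupils. cohort: list of (is_fsm, met_indicator) pairs."""
    def rate(group):
        return 100.0 * sum(met for _, met in group) / len(group)
    fsm = [p for p in cohort if p[0]]
    non_fsm = [p for p in cohort if not p[0]]
    return rate(non_fsm) - rate(fsm)

# Invented figures: (is_fsm, met_indicator) for high and middle attainers
high_attainers = [(True, 1), (True, 1), (True, 1), (True, 0),
                  (False, 1), (False, 1), (False, 1), (False, 1)]
middle_attainers = [(True, 1), (True, 0), (True, 0), (True, 0),
                    (False, 1), (False, 1), (False, 1), (False, 0)]

gap_high = fsm_gap(high_attainers)      # 25.0 percentage points
gap_middle = fsm_gap(middle_attainers)  # 50.0 percentage points

# Illustrative success criterion: high attainers' gap at least 40% lower
# than middle attainers' gap (i.e. x = 40 in the basket wording)
print(gap_high <= 0.6 * gap_middle)  # True
```

With cohorts this small a single pupil moves the gap by 25 percentage points, which is one reason to express targets as ranges rather than hard thresholds.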



In this second part of the post, I have adopted a simple typology of performance measures to explain:

  • Imminent changes in how the Performance Tables address high attainers and high attainment, highlighting key differences between sectors and outlining the direction of travel towards longer term reform.
  • What is known and what is still unknown about the focus on high attainers in new performance table arrangements, mostly scheduled for implementation from 2016, and to what extent these reforms will introduce a more robust approach and greater consistency between sectors.
  • How, pending those longer term reforms, schools and colleges might set about developing a high attainers’ support plan, so seizing the initiative and skirting around some of the difficulties presented by revisions to the inspection guidance.
  • How such support plans might be constructed around a basket of key outcome measures, so that they are a. focused explicitly on improvements in institutional performance as well as improved provision for high attaining learners and b. broadly reflect performance table measures without requiring slavish adherence.

This post is very much a work in progress, striving as it does to pin down a moving target while also setting out a basic support framework with universal application. I am unhappy with some aspects of this first edition and will aim to eliminate its shortcomings in a future iteration. All suggestions for improvement are welcome.



October 2013

