Has Ofsted improved inspection of the most able?

This post examines the quality of Ofsted reporting on how well secondary schools educate their most able learners.

The analysis is based on a sample of 87 Section 5 inspection reports published during March 2015.

I have compared the results with those obtained from a parallel exercise undertaken a year ago and published in How well is Ofsted reporting on the most able? (May 2014).

This new post considers how inspectors’ assessments have changed in the light of their increased experience, additional guidance and – most recently – the publication of Ofsted’s survey report: The most able students: An update on progress since June 2013.

This appeared on 4 March 2015, at the beginning of my survey period, although it was heralded in HMCI’s Annual Report and the various supporting materials published alongside it in December 2014. One might therefore expect it to have had an immediate effect on inspection practice.

Those seeking further details of either of these publications are cordially invited to consult the earlier posts I dedicated to them.

The organisation of this post is straightforward.

The first section considers how Ofsted expects its inspectors to report on provision for the most able, as required by the current Inspection Handbook and associated guidance. It also explores how those expectations were intended to change in the light of the Update on Progress.

Subsequent sections set out the findings from my own survey:

  • The nature of the 2015 sample – and how this differs from the 2014 sample
  • Coverage in Key Findings and Areas for Improvement
  • Coverage in the main body of reports, especially under Quality of Teaching and Achievement of Pupils, the sections that most commonly feature material about the most able

The final section follows last year’s practice in offering a set of key findings and areas for improvement for consideration by Ofsted.

I have supplied page jumps to each section from the descriptions above.

How inspectors should address the most able

Definition and distribution

Ofsted nowhere explains how inspectors are to define the most able. It is not clear whether schools are permitted to supply their own definitions, or whether inspectors apply the distinctions adopted in Ofsted’s survey reports. This is not entirely helpful to schools.

In the original survey – The most able students: Are they doing as well as they should in our non-selective secondary schools? (June 2013) – Ofsted described the most able as:

‘…the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

The measure of potential is not defined, but an example is given: EAL students who are new to the country and so might not (yet) have achieved Level 5.

In the new survey prior attainment at KS2 remains the indicator, but the reference to potential is dropped:

‘…students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2’

The size of this group varies at national level according to the year group.

If we take learners in Year 7 who completed KS2 in 2014, the data shows that 24% achieved KS2 Level 5 in both English (reading and writing) and maths. A further 5% secured L5 in English (reading and writing) only, while another 20% reached L5 in maths only.

So 49% of the present Year 7 are deemed high attainers.
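Since the three subgroups are mutually exclusive, the overall figure is simply their sum – a calculation small enough to sketch in a few lines of Python, purely by way of illustration:

```python
# Shares of the 2014 KS2 cohort reaching Level 5, as quoted above.
# The three subgroups are mutually exclusive, so Ofsted's overall
# high attainer share is simply their sum.
l5_both = 24          # % with L5 in both English (reading and writing) and maths
l5_english_only = 5   # % with L5 in English (reading and writing) only
l5_maths_only = 20    # % with L5 in maths only

high_attainers = l5_both + l5_english_only + l5_maths_only
print(f"High attainers (Ofsted measure): {high_attainers}%")  # 49%
```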

But this proportion falls to about 40% amongst those who completed KS4 in 2014 and so typically undertook KS2 assessment five years earlier in 2009.

Ofsted’s measure is different to the definition adopted in the Secondary Performance Tables which, although also based on prior attainment at KS2, depends on an APS of 30 or higher in KS2 tests in the core subjects.

Only ‘all-rounders’ count according to this definition, while Ofsted includes those who are relatively strong in either maths or English but who might be weak in the other subject. Neither approach considers achievement beyond the core subjects.

According to the Performance Tables definition, amongst the cohort completing KS4 in 2014, only 32.3% of those in state-funded schools were deemed high attainers, some eight percentage points lower than Ofsted’s figure.

The sheer size of Ofsted’s most able cohort will surprise some, who might naturally assume a higher hurdle and a correspondingly smaller group. The span of attainment it covers is huge, from a single L5C (possibly paired with an L3) to three L6s.

But the generosity of Ofsted’s assumptions does mean that every year group in every school should contain at least a handful of high attainers, regardless of the characteristics of its intake.

Unfortunately, Ofsted’s survey report does not say exactly how many schools have negligible numbers of high attainers, telling us only how many non-selective schools had at least one pupil in their 2014 GCSE cohort with the requisite prior attainment in English, in maths and in both English and maths.

In each case, some 2,850 secondary schools had at least one student within scope, meaning that some 9% of schools had no students in that category; but we have no way of establishing how many had no students in all three categories.

Using the rival Performance Table definition, only some 92 state-funded non-selective secondary schools reported a 2014 GCSE cohort with 10% or fewer high attainers. The lowest recorded percentage is 3% and, of those with 5% or fewer, the number of high attaining students ranges from 1 to 9.

Because Ofsted’s definition is more liberal, one might reasonably assume that every secondary school has at least one high-attaining student per year group, though there will be a handful of schools with very few indeed.

At the other extreme, according to the Performance Tables definition, over 100 state-funded non-selective schools can boast a 2014 GCSE population where high attainers are in the majority – and the highest recorded percentage for a state-funded comprehensive is 86%. Using Ofsted’s measure, the number of schools in this position will be substantially higher.

For the analysis below, I have linked the number of high attainers (according to the Performance Tables) in a school’s 2014 GCSE cohort with the outcomes of inspection, so as to explore whether there is a relationship between these two variables.

Framework and Handbook

The current Framework for School Inspection (December 2014) makes no reference to the most able.

Inspectors must consider:

‘…the extent to which the education provided by the school meets the needs of the range of pupils at the school, and in particular the needs of disabled pupils and those who have special educational needs.’

One of the principles of school inspection is that it will:

‘focus on pupils’ and parents’ needs by…evaluating the extent to which schools provide an inclusive environment that meets the needs of all pupils, irrespective of age, disability, gender, race, religion or belief, or sexual orientation’.

Neither ability nor attainment is mentioned. This may or may not change when the Common Inspection Framework is published.

The most recent version of the School Inspection Handbook (December 2014) has much more to say on the issue. All relevant references in the main text and in the grade descriptors are set out in the Annex at the end of this post.

Key points include:

  • Ofsted uses inconsistent terminology (‘most able’, ‘more able’, ‘highest attainers’) without distinguishing between these terms.
  • Most of the references to the most able occur in lists of different groups of learners, another of which is typically ‘disadvantaged pupils’. This gives the mistaken impression that the two groups are distinct – that there is no such thing as a most able disadvantaged learner.
  • The Common Inspection Framework will be supported by separate inspection handbooks for each sector. The consultation response does not mention any revisions relating to the most able; neither does the March 2015 survey report say that revisions will be introduced in these handbooks to reflect its findings and recommendations (but see below). 

Guidance

Since the first survey report was published in 2013, several pieces of guidance have been issued to inspectors.

  • In Schools and Inspection (October 2013), inspectors’ attention is drawn to key revisions to the section 5 inspection framework:

‘In judging the quality of teaching…Inspectors will evaluate how teaching meets the needs of, and provides appropriate challenge to, the most able pupils. Underachievement of the most able pupils can trigger the judgements of inadequate achievement and inadequate teaching.’

In relation to report writing:

‘Inspectors are also reminded that they should include a short statement in the report on how well the most able pupils are learning and making progress and the outcomes for these pupils.’

  • In Schools and Inspection (March 2014) several amendments are noted to Section 5 inspection and report writing guidance from January of that year, including:

‘Most Able – Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

Moreover, for secondary schools:

‘There must be a comment on early entry for GCSE examinations. Where the school has an early entry policy, inspectors must be clear on whether early entry is limiting the potential of the most able pupils. Where early entry is not used, inspectors must comment briefly to that effect.’

  • In School Inspection Update (December 2014) Ofsted’s National Director, Schools reminds inspectors, following the first of a series of half-termly reviews of ‘the impact of policy on school inspection practice’, to:

‘…place greater emphasis, in line with the handbook changes from September, on the following areas in section 5 inspection reports…The provision and outcomes for different groups of children, notably the most-able pupils and the disadvantaged (as referred to in the handbook in paragraphs 40, 129, 137, 147, 155, 180, 186, 194, 195, 196, 207, 208, 210 and 212).’

HMCI’s Annual Report

The 2014 Annual Report said (my emphasis):

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential.’

HMCI’s Commentary on the Report added for good measure:

‘In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections.’

So we are to expect a combination of broader focus, closer scrutiny and sharper recommendations.

The Annual Report relates to AY2013/14 and was published at the end of the first term of AY2014/15 and the end of calendar year 2014, so one assumes that references to the ‘coming year’ and ‘the year ahead’ are to calendar year 2015.

We should be able to see the impact of this ramping up in the sample I have selected, but some further change is also likely.

March 2015 survey report

One of the key findings from the March 2015 survey was (my emphasis):

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

Ofsted directed three recommendations at itself which do not altogether reflect this (my emboldening):

‘Ofsted should:

  • Make sure that inspections continue to focus sharply on the progress made by students who are able and disadvantaged.
  • Report more robustly about how well schools promote the needs of the most able through the quality of their curriculum and the information, advice and guidance they offer to the most able students.
  • Ensure thematic surveys investigate, where appropriate, how well the most able are supported through, for example, schools’ use of the pupil premium and the curriculum provided.’

The first of these recommendations implies that inspections already focus sufficiently on the progress of able and disadvantaged learners – and hence that no further change is necessary. We shall test that assumption in the analysis below.

The third alludes to the most able disadvantaged but relates solely to thematic surveys, not to Section 5 inspection reports.

The second may imply that further emphasis will be placed on inspecting the appropriateness of the curriculum and IAG. Both of these topics seem likely to feature more strongly in a generic sense in the new Framework and Handbooks. One assumes that this will be extended to the most able, amongst other groups.

Though not mentioned in the survey report, we do know that Ofsted is preparing an evaluation toolkit. This was mentioned in a speech given by its Schools Director almost immediately after publication:

‘In this region specifically, inspectors have met with headteachers to address the poor achievement of the brightest disadvantaged children.

And inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals.’

It is not clear from this whether the toolkit will be confined to the most able disadvantaged or will have wider coverage.

Moreover, this statement raises the prospect that the toolkit might be similar in style to The Pupil Premium: Analysis and challenge tools for schools (January 2013). This is more akin to an old spanner than a Swiss army penknife. Anything of this nature would be rather less helpful than the term ‘toolkit’ implies.

At his request, I emailed Ofsted’s Director, Schools with questions on 21 March 2015. I requested further details of the toolkit. At the time of writing I have still to receive a reply.

The sample

I have selected an almost identical sample to that used in my 2014 analysis, one year on. It includes the 87 Section 5 inspection reports on secondary schools (excluding middle schools deemed secondary) that were published by Ofsted in the month of March 2015.

The bulk of the inspections were undertaken in February 2015, though a few took place in late January or early March.

Chart 1 gives the regional breakdown of the schools in the sample. All nine regions are represented, though there are only five schools from the North East, while Yorkshire and Humberside boasts 15. There are between seven and 11 schools in each of the other regions. In total 59 local authorities are represented.

In regional terms, this sample is more evenly balanced than the 2014 equivalent and the total number of authorities is two higher.

Chart 1: Schools within the sample by region

Chart 2 shows how different statuses of school are represented within the sample.

All are non-selective. Fifty-three schools (61%) are academies, divided almost equally between the sponsored and converter varieties.

Community and foundation schools together form a third group of equivalent size, while the seven remaining schools have voluntary status, just one of them voluntary controlled. There are no free schools.

Chart 2: Schools within the sample by status

All but three of the schools are mixed – and those three are boys’ schools.

As for age range, there is one 13-18 and one 14-18 school. Otherwise there are 32 11-16 institutions (37% of the sample) while the remaining 53 (61%) are 11-18 or 11-19 institutions.

Chart 3 shows the variation in numbers on roll. The smallest school – a new 11-18 secondary school – has just 125 pupils; the largest, 2,083. The average is 912.

Fifty-two schools (60%) have between 600 and 1,200 pupils and twenty-three (26%) between 800 and 1,000.

Chart 3: Schools within the sample by NOR

Chart 4 shows the overall inspection grade of schools within the sample. A total of 19 schools (22%) are rated inadequate, seven of them attracting special measures. Only nine (10%) are outstanding, while 27 (31%) are good and 32 (37%) require improvement.

This is very similar to the distribution in the 2014 sample, except that there are slightly more inadequate schools and slightly fewer requiring improvement.

Chart 4: Schools within the sample by overall inspection grade

Unlike the 2014 analysis, I have also explored the distribution of all grades within reports. The results are set out in Chart 5.

Schools in the sample are relatively more secure on Leadership and management (55% outstanding or good) and Behaviour and safety of pupils (60% outstanding or good) than they are on Quality of teaching (43% outstanding or good) and Achievement of pupils (41% outstanding or good).

Chart 5: Schools within the sample by inspection sub-grades

Another addition this year is a comparison with the number and percentage of high attainers.

Amongst the sample, the number of high attainers in the 2014 GCSE cohort varied from three to 196 and the percentage from 3% to 52%. (Two schools did not have a GCSE cohort in 2014.)

These distributions are shown on the scatter charts 6 and 7, below.

Chart 6 (number) shows one major outlier at the top of the distribution. Most schools – 64% of the sample – record numbers between 20 and 60. The average number is 41.

Chart 6: Schools within the sample by number of high attainers (Secondary Performance Tables measure)

Chart 7 again has a single outlier, this time at the bottom of the distribution. The average is 32%, slightly less than the 32.3% reported for all state-funded schools in the Performance Tables.

Two in five of the sample register a high attainer percentage of between 20% and 30%, while three in five register between 20% and 40%.

But almost a third have a high attainer population of 20% or lower.

Chart 7: Schools within the sample by percentage of high attainers (Secondary Performance Tables measure)

Out of curiosity, I compared the overall inspection grade with the percentage of high attainers.

  • Amongst the nine outstanding schools, the percentage of high attainers ranged from 22% to 47%, averaging 33% (there was also one without a high attainer percentage).
  • Amongst the 27 good schools, the percentage of high attainers was between 13% and 52% (plus one without a high attainer percentage) and averaged 32%.
  • Amongst the 32 schools requiring improvement, the percentage of high attainers varied between 3% and 40% and averaged 23%.
  • Amongst the 19 inadequate schools, the percentage of high attainers lay between 10% and 38% and also averaged 23%.

This may suggest a tendency for outstanding/good schools to have a somewhat larger proportion of high attainers than schools judged to be requiring improvement or inadequate.
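For the curious, the grouping behind these averages can be sketched in a few lines of Python. The records below are invented placeholders rather than the actual sample data:

```python
from collections import defaultdict

# Invented placeholder records of (overall grade, % high attainers) -
# NOT the real sample, just an illustration of the grouping step.
schools = [
    ("Outstanding", 33), ("Good", 29), ("Good", 35),
    ("Requires improvement", 21), ("Requires improvement", 25),
    ("Inadequate", 23),
]

by_grade = defaultdict(list)
for grade, pct in schools:
    by_grade[grade].append(pct)

# Average high attainer share within each overall inspection grade
for grade, pcts in by_grade.items():
    print(f"{grade}: {sum(pcts) / len(pcts):.0f}%")
```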

Key findings and areas for improvement

Distribution of comments

Thirty-nine of the reports in the sample (45%) address the most able in the Summary of key findings, while 33 (38%) do so in the section about what the school needs to do to improve further.

In 24 cases (28%) there were entries in both these sections, but in 39 of the reports (45%) there was no reference to the most able in either section.
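These figures are mutually consistent, as a quick inclusion–exclusion check (sketched in Python purely for illustration) confirms:

```python
# Cross-checking the counts above with inclusion-exclusion.
sample_size = 87
in_key_findings = 39   # most able addressed in Summary of key findings
in_improvement = 33    # addressed in what the school needs to do to improve
in_both = 24           # addressed in both sections

in_either = in_key_findings + in_improvement - in_both
in_neither = sample_size - in_either
print(in_either, in_neither)  # 48 39
```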

In 2014, 34% of reports in the sample addressed the issue in both the main findings and recommendations and 52% mentioned it in neither of these sections.

These percentage point shifts do not suggest a substantially stronger commitment to the issue.

In the 2015 sample it was rather more likely for a reference to appear in the key findings for community schools (53%) and foundation schools (50%) than it was for converter academies (44%), sponsored academies (42%) or voluntary schools (29%).

Chart 8 shows the distribution of comments in these sections according to the overall inspection grade. In numerical terms, schools rated as requiring improvement overall are most likely to attract comments in both Key findings and Areas for improvement related to the most able.

Chart 8: Most able mentioned in key findings and areas for improvement by overall inspection grade (percentages)

But, when expressed as percentages of the total number of schools in the sample awarded each grade, it becomes apparent that the lower the grade, the more likely a school is to attract such a comment.

Of the 39 reports making reference in the key findings, 10 comments were positive, 28 were negative and one managed to be both positive and negative simultaneously:

‘While the most-able students achieve well, they are capable of even greater success, notably in mathematics.’ (Harewood College, Bournemouth)

Positive key findings

Five of the ten exclusively positive comments were directed at community schools.

The percentage of high attainers in the 2014 GCSE cohorts at the schools attracting positive comments varied from 13% to 52% and included three of the five schools with the highest percentages in the sample.

Interestingly, only two of the schools with positive comments received an overall outstanding grade, while three required improvement.

Examples of positive comments, which were often generic, include:

  • ‘The most able students achieve very well, and the proportion of GCSE A* and A grades is significantly above average across the curriculum.’ (Durham Johnston Comprehensive School, Durham)
  • ‘The most able students do well because they are given work that challenges them to achieve their potential’. (The Elton High School Specialist Arts College, Bury)
  • ‘Most able students make good progress in most lessons because of well-planned activities to extend their learning’. (Endon High School, Staffordshire)
  • ‘Teachers encourage the most able students to explore work in depth and to master skills at a high level’. (St Richard Reynolds Catholic High School, Richmond-upon-Thames).

Negative key findings

The distribution of the 28 negative comments in Key findings according to overall inspection grade was: Outstanding, nil; Good, five (19%); Requires improvement, twelve (38%); Inadequate, eleven (58%).
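The bracketed percentages can be re-derived from the grade distribution in Chart 4 (nine outstanding, 27 good, 32 requiring improvement, 19 inadequate) – sketched here in Python for illustration:

```python
# Negative Key findings comments as a share of all schools awarded each grade.
negative_comments = {"Outstanding": 0, "Good": 5,
                     "Requires improvement": 12, "Inadequate": 11}
schools_per_grade = {"Outstanding": 9, "Good": 27,
                     "Requires improvement": 32, "Inadequate": 19}

for grade, count in negative_comments.items():
    share = 100 * count / schools_per_grade[grade]
    print(f"{grade}: {count} ({share:.0f}%)")
```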

This suggests a relatively strong correlation between the quality of provision for the most able and the overall quality of the school.

The proportion of high attainers in the 2014 GCSE cohorts of the schools attracting negative comments varied between 3% and 42%. All but three are below the national average for state-funded schools on this measure and half reported 20% or fewer high attainers.

This broadly supports the hypothesis that quality is less strong in schools where the proportion of high attainers is comparatively low.

Examples of typical negative comments:

  • ‘The most able students are not given work that is hard enough’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Too many students, particularly the most able, do not make the progress of which they are capable’ (New Line Learning Academy, Kent)
  • ‘Students, particularly the more able, make slower progress in some lessons where they are not sufficiently challenged. This can lead to some off task behaviour which is not always dealt with by staff’ (The Ferrers School, Northamptonshire)
  • ‘Teachers do not always make sufficient use of assessment information to plan work that fully stretches or challenges all groups of students, particularly the most able’ (Noel-Baker School, Derby).

The menu of shortcomings identified is limited, consisting of seven items: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information.

Of these, the most common comprise a familiar litany. They are (in descending order): 

  • Insufficiently challenging work 
  • Insufficient progress 
  • Underachievement and 
  • Low expectations.

Inspectors often point out inconsistent practice, though in the worst instances these shortcomings are dominant or even school-wide.

No key findings

Chart 9 shows the distribution of reports with no comments about the most able in Key findings and Areas for improvement according to overall inspection grade. When expressed as percentages, these again show that schools rated as outstanding are most likely to escape such comments, while inadequate schools are most likely to be in the firing line.

Chart 9: Most able not mentioned in key findings and areas for improvement by inspection grade (percentages)

This pattern replicates the findings from 2014. Orders of magnitude are also broadly comparable.  There is no substantive evidence of a major increase in emphasis from inspectors.

It seems particularly surprising that, in over half of schools requiring improvement and a third or more of inadequate schools, issues with educating the most able are still not significant enough to feature in these sections of inspection reports.

Areas for improvement

By definition, recommendations for improvement are always associated with identified shortcomings.

The correlation between key findings and areas for improvement is inconsistent. In six cases there were key findings relating to the most able, but no area for improvement specifically associated with them. Conversely, nine reports identified areas for improvement that were not picked up in the key findings.

Areas for improvement are almost always formulaic and expressed as lists: the school should improve x through y and z.

When it comes to the most able, the area for improvement is almost invariably teaching quality, though sometimes this is indicated as the route to higher achievement while on other occasions teaching quality and raising achievement are perceived as parallel priorities.

Just one report in the sample mentioned the quality of leadership and management:

‘Ensure that leadership and management take the necessary steps to secure a significant rise in students’ achievement at the end of Year 11 through…ensuring that work set for the most able is always sufficiently challenging’ (New Line Learning Academy, Kent).

This is despite the fact that leadership was specifically mentioned as a focus in HMCI’s Annual Report.

The actions needed to bring about improvement reflect the issues mentioned in the analysis of key findings above. The most common involve applying assessment information to planning and teaching:

  • ‘Raise students’ achievement and the quality of teaching further by ensuring that:…all staff develop their use of class data to plan learning so that students, including the most able, meet their challenging targets’ (Oasis Academy Isle of Sheppey, Kent)
  • ‘Ensure the quality of teaching is always good or better, in order to raise attainment and increase rates of progress, especially in English and mathematics, by:…ensuring teachers use all the information available to them to plan lessons that challenge students, including the most able’ (Oasis Academy Lister Park, Bradford)
  • ‘Embed and sustain improvements in achievement overall and in English in particular so that teaching is consistently good and outstanding by: making best use of assessment information to set work that is appropriately challenging, including for the least and most able students’ (Pleckgate High School Mathematics and Computing College, Blackburn with Darwen)

Other typical actions involve setting more challenging tasks, raising the level of questioning, providing accurate feedback, improving lesson planning and maintaining consistently high expectations.

Coverage in the main body of reports

Leadership and management

Given the reference to this in HMCI’s Annual Report, one might have expected a new and significant emphasis within this section of the reports in the sample.

In fact, the most able were only mentioned in this section in 13 reports (15% of the total). Hardly any of these comments identified shortcomings. The only examples I could find were:

  • ‘The most-able students are not challenged sufficiently in all subjects to achieve the higher standards of which they are capable’ (Birkbeck School and Community Arts College, Lincolnshire)
  • ‘Action to improve the quality of teaching is not focused closely enough on the strengths and weaknesses of the school and, as a result, leaders have not done enough to secure good teaching of students and groups of students, including…the most able’ (Ashington High School Sports College, Northumberland)

Inspectors are much more likely to accentuate the positive:

  • ‘The school has been awarded the Challenge Award more than once. This is given for excellent education for a school’s most-able, gifted and talented students and for challenge across all abilities. Representatives from all departments attend meetings and come up with imaginative ways to deepen these students’ understanding.’ (Cheam High School, Sutton)
  • ‘Leaders and governors are committed to ensuring equality of opportunity for all students and are making effective use of student achievement data to target students who may need additional support or intervention. Leaders have identified the need to improve the achievement of…the most-able in some subjects and have put in place strategies to do so’ (Castle Hall Academy Trust, Kirklees)
  • ‘Measures being taken to improve the achievement of the most able are effective. Tracking of progress is robust and two coordinators have been appointed to help raise achievement and aspirations. Students say improvements in teaching have been made, and the work of current students shows that their attainment and progress is on track to reach higher standards.’ (The Byrchall High School, Wigan).

Not one report mentioned the role of governors in securing effective provision for the most able. 

Given how often school leadership escapes censure for issues identified elsewhere in reports, this outcome could be interpreted as somewhat complacent. 

HMCI is quite correct to insist that provision for the most able is a whole school issue and, as such, a school’s senior leadership team should be held to account for such shortcomings.

Behaviour and safety

The impact of under-challenging work on pupils’ behaviour is hardly ever identified as a problem.

One example has been identified in the analysis of Key findings above. Only one other report mentions the most able in this section, and the comment is about the role of the school council rather than behaviour per se:

‘The academy council is a vibrant organisation and is one of many examples where students are encouraged to take an active role in the life of the academy. Sixth form students are trained to act as mentors to younger students. This was seen being effectively employed to…challenge the most able students in Year 9’ (St Thomas More High School, Southend)

A handful of reports make some reference under ‘Quality of teaching’ but one might reasonably conclude that neither bullying of the most able nor disruptive behaviour from bored high attainers is particularly widespread.

Quality of teaching

Statements about the most able are much more likely to appear in this section of reports. Altogether 59 of the sample (68%) made some reference.

Chart 10 shows the correlation between the incidence of comments and the sub-grade awarded by inspectors to this aspect of provision. It demonstrates that, while differences are relatively small, schools deemed outstanding are rather more likely to attract such comment.

But only one of the comments on outstanding provision is negative and that did not mention the most able specifically:

‘Also, in a small minority of lessons, activities do not always deepen students’ knowledge and understanding to achieve the very highest grades at GCSE and A level.’ (Central Foundation Boys’ School, Islington)

.


Chart 10: Incidence of comments under quality of teaching by grade awarded for quality of teaching

.

Comments are much more likely to be negative in schools where the quality of teaching is judged to be good (41%), requiring improvement (59%) and inadequate (58%).

Even so, a few schools in the lower two categories receive surprisingly positive endorsements:

  • ‘On the other hand, the most able students and the younger students in school consistently make good use of the feedback. They say they greatly value teachers’ advice….The teaching of the most able students is strong and often very strong. As a result, these students make good progress and, at times, achieve very well.’ (RI – The Elton High School Specialist Arts College, Bury)
  • ‘Teaching in mathematics is more variable, but in some classes, good and outstanding teaching is resulting in students’ rapid progress. This is most marked in the higher sets where the most able students are being stretched and challenged and are on track to reach the highest grades at GCSE…. In general, the teaching of the most able students….is good.’ (RI – New Charter Academy, Tameside)
  • ‘At its most effective, teaching is well organised to support the achievement of the most able, whose progress is better than other students. This is seen in some of the current English and science work.’ (I – Ely College, Cambridgeshire).

Negative comments on the quality of teaching supply a familiar list of shortcomings.

Some of the most perceptive are rather more specific. Examples include:

  • ‘While the best teaching allows all students to make progress, sometimes discussions that arise naturally in learning, particularly with more able students, are cut short. As a result, students do not have the best opportunity to explore ideas fully and guide their own progress.’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Teachers’ planning increasingly takes account of current information about students’ progress. However, some teachers assume that because the students are organised into ability sets, they do not need to match their teaching to individual and groups of students’ current progress. This has an inhibiting effect on the progress of the more able students in some groups.’ (Chulmleigh Community College, Devon)
  • ‘In too many lessons, particularly boys’ classes, teachers do not use questioning effectively to check students’ learning or promote their thinking. Teachers accept responses that are too short for them to assess students’ understanding. Neither do they adjust their teaching to revisit aspects not fully grasped or move swiftly to provide greater stretch and new learning for all, including the most able.’ (The Crest Academies, Brent)
  • ‘In some lessons, students, including the most able, are happy to sit and wait for the teacher to help them, rather than work things out for themselves’ (Willenhall E-ACT Academy, Walsall).

Were one compiling a list of what to do to impress inspectors, it would include the following items:

  • Plan lessons meticulously with the needs of the most able in mind 
  • Use assessment information to inform planning of work for the most able 
  • Differentiate work (and homework) to match most able learners’ needs and starting points 
  • Deploy targeted questioning, as well as opportunities to develop deeper thinking and produce more detailed pieces of work 
  • Give the most able the flexibility to pursue complex tasks and do not force them to participate in unnecessary revision and reinforcement 
  • Do not use setting as an excuse for neglecting differentiation 
  • Ensure that work for the most able is suitably challenging 
  • Ensure that subject knowledge is sufficiently secure for this purpose 
  • Maintain the highest expectations of what the most able students can achieve 
  • Support the most able to achieve more highly but do not allow them to become over-reliant on support 
  • Deploy teaching assistants to support the most able 
  • Respond to restlessness and low level disruption from the most able when insufficiently challenged.

While many of the reports implicitly acknowledge that the most able learners will have different subject-specific strengths and weaknesses, the implications of this are barely discussed.

Moreover, while a few reports attempt a terminological distinction between ‘more able’ and ‘most able’, the vast majority seem to assume that, in terms of prior attainment, the most able are a homogenous group, whereas – given Ofsted’s preferred approach – there is enormous variation.

Achievement of pupils 

This is the one area of reports where reference to the most able is now apparently compulsory – or almost compulsory.

Just one report in the sample has nothing to say about the achievement of the most able in this section: that on Ashby School in Leicestershire.

Some of the comments are relatively long and detailed, but others are far more cursory and the coverage varies considerably.

Taking as an example the subset of schools awarded an outstanding sub-grade for the achievement of pupils, we can illustrate the different types of response:

  • Generic: ‘The school’s most able students make rapid progress and attain excellent results. This provides them with an excellent foundation to continue to achieve well in their future studies.’ (Kelvin Hall School, Hull)
  • Generic, progress-focused: ‘The most-able students make rapid progress and the way they are taught helps them to probe topics in greater depth or to master skills at a high level.’ (St Richard Reynolds Catholic High School, Richmond-upon-Thames)
  • Achievement-focused, core subjects: ‘Higher attaining students achieve exceptionally well as a result of the support and challenge which they receive in class. The proportion of students achieving the higher A* to A grade was similar to national averages in English but significantly above in mathematics.’
  • Specific, achievement- and progress-focused: ‘Although the most able students make exceptional progress in the large majority of subjects, a few do not reach the very highest GCSE grades of which they are capable. In 2014, in English language, mathematics and science, a third of all students gained A and A* GCSE grades. Performance in the arts is a real strength. For example, almost two thirds of students in drama and almost half of all music students achieved A and A* grades. However, the proportions of A and A* grades were slightly below the national figures in English literature, geography and some of the subjects with smaller numbers of students.’ (Central Foundation Boys’ School, Islington)

If we look instead at the schools with a sub-grade of inadequate, the comments are typically more focused on progress, but limited progress is invariably described as ‘inadequate’, ‘requiring improvement’, ‘weak’, ‘not good’, ‘not fast enough’. It is never quantified.

On the relatively few occasions when achievement is discussed, the measure is typically GCSE A*/A grades, most often in the core subjects.

It is evident from cross-referencing the Achievement of pupils sub-grade against the percentage of high attainers in the 2014 GCSE cohort that there is a similar correlation to that with the overall inspection grade:

  • In schools judged outstanding on this measure, the high attainer population ranges from 22% to 47% (average 33%)
  • In schools judged good, the range is from 13% to 52% (average 32%)
  • In schools requiring improvement it is between 3% and 40% (average 23%)
  • In schools rated inadequate it varies from 10% to 32% (average 22%)

.

Sixth Form Provision 

Coverage of the most able in sections dedicated to the sixth form is also extremely variable. Relatively few reports deploy the term itself when referring to 16-19 year-old students.

Sometimes there is discussion of progression to higher education and sometimes not. Where this does exist there is little agreement on the appropriate measure of selectivity in higher education:

  • ‘Students are aspiring to study at the top universities in Britain. This is a realistic prospect and illustrates the work the school has done in raising their aspirations.’ (Welling School, Bexley)
  • ‘The academy carefully tracks the destination of leavers with most students proceeding to university and one third of students gaining entry to a Russell Group university’ (Ashcroft Technology Academy, Wandsworth)
  • ‘Provision for the most able students is good, and an increasing proportion of students are moving on to the highly regarded ‘Russell group’ or Oxbridge universities. A high proportion of last year’s students have taken up a place at university and almost all gained a place at their first choice’ (Ashby School, Leicestershire)
  • ‘Large numbers of sixth form students progress to well-regarded universities’ (St Bartholomew’s School, West Berkshire)
  • ‘Students receive good support in crafting applications to universities which most likely match their attainment; this includes students who aspire to Oxford or Cambridge’ (Anthony Gell School, Derbyshire).

Most able and disadvantaged

Given the commitment in the 2015 survey report to ‘continue to focus sharply on the progress made by students who are able and disadvantaged’, I made a point of reviewing the coverage of this issue across all sections of the sample reports.

Suffice to say that only one report discussed provision for the most able disadvantaged students, in these terms:

‘Pupil premium funding is being used successfully to close the wide achievement gaps apparent at the previous inspection….This funding is also being effectively used to extend the range of experiences for those disadvantaged students who are most able. An example of this is their participation in a residential writing weekend.’ (St Hild’s C of E VA School, Hartlepool)

Take a bow, Lead Inspector Petts!

A handful of other reports made more general statements to the effect that disadvantaged students perform equivalently to their non-disadvantaged peers, most often with reference to the sixth form:

  • ‘The few disadvantaged students in the sixth form make the same progress as other students, although overall, they attain less well than others due to their lower starting points’ (Sir Thomas Wharton Community College, Doncaster)
  • ‘There is no difference between the rates of progress made by disadvantaged students and their peers’ (Sarum Academy, Wiltshire)
  • ‘In many cases the progress of disadvantaged students is outstripping that of others. Disadvantaged students in the current Year 11 are on course to do
    every bit as well as other students.’ (East Point Academy, Suffolk).

On two occasions, the point was missed entirely:

  • ‘The attainment of disadvantaged students in 2014 was lower than that of other students because of their lower starting points. In English, they were half a grade behind other students in the school and nationally. In mathematics, they were a grade behind other students in the school and almost a grade behind students nationally. The wider gap in mathematics is due to the high attainment of those students in the academy who are not from disadvantaged backgrounds.’ (Chulmleigh Community College, Devon)
  • ‘Disadvantaged students make good progress from their starting points in relation to other students nationally. These students attained approximately two-thirds of a GCSE grade less than non-disadvantaged students nationally in English and in mathematics. This gap is larger in school because of the exceptionally high standards attained by a large proportion of the most able students…’ (Durham Johnston Comprehensive School, Durham)

If Ofsted believes that inspectors are already focusing sharply on this issue then, on this evidence, it is sadly misinformed.

Key findings and areas for improvement

.

Key findings: Guidance

  • Ofsted inspectors have no reliable definition of ‘most able’ and no guidance on the appropriateness of definitions adopted by the schools they visit. The approach taken in the 2015 survey report is different to that adopted in the initial 2013 survey and is now exclusively focused on prior attainment. It is also significantly different to the high attainer measure in the Secondary Performance Tables.
  • Using Ofsted’s approach, the national population of most able in Year 7 approaches 50% of all learners; in Year 11 it is some 40% of all learners. The latter is some eight percentage points higher than the cohort derived from the Performance Tables measure.
  • The downside of such a large cohort is that it masks the huge attainment differences within the cohort, from a single L5C (and possibly a L3 in either maths or English) to a clutch of L6s. Inspectors might be encouraged to regard this as a homogenous group.
  • The upside is that there should be a most able presence in every year group of every school. In some comprehensive schools, high attainers will be a substantial majority in every year group; in others there will be no more than a handful.
  • Ofsted has not released data showing the incidence of high attainers in each school according to its measure (or the Performance Tables measure for that matter). Nor does this feature in Ofsted’s Data Dashboard.
  • Guidance in the current School Inspection Handbook is not entirely helpful. There is not space in a Section 5 inspection report to respond to all the separate references (see Appendix for the full list). The terminology is confused (‘most able’, ‘more able’, ‘high attainers’). Too often the Handbook mentions several different groups alongside the most able, one of which is disadvantaged pupils. This perpetuates the false assumption that there are no most able disadvantaged learners. We do not yet know whether there will be wholesale revision when new Handbooks are introduced to reflect the Common Inspection Framework.
  • At least four pieces of subsidiary guidance have been issued to inspectors since October 2013. But there has been nothing to reflect the commitments in HMCI’s Annual Report (including a stronger focus on school leadership of this issue) or the March 2015 Survey report. This material requires enhancement and consolidation.
  • The March 2015 Report apparently commits to more intensive scrutiny of curricular and IAG provision in Section 5 inspections, as well as ‘continued focus’ on able and disadvantaged students (see below). A subsequent commitment to an evaluation toolkit would be helpful to inspectors as well as schools, but its structure and content has not yet been revealed.

Key findings: Survey

  • The sample for my survey is broadly representative of regions, school status and variations in NOR. In terms of overall inspection grades, 10% are outstanding, 31% good, 37% require improvement and 22% are inadequate. In terms of sub-grades, they are notably weaker on Quality of teaching and Achievement of pupils, the two sections that most typically feature material about the most able.
  • There is huge variation within the sample by percentage of high attainers (2014 GCSE population according to the Secondary Performance Tables measure). The range is from 3% to 52%. The average is 32%, very slightly under the 32.3% average for all state-funded schools. Comparing overall inspection grade with percentage of high attainers suggests a marked difference between those rated outstanding/good (average 32/33%) and those rated as requiring improvement/inadequate (average 23%).
  • 45% of the reports in the sample addressed the most able under Key findings; 38% did so under Areas for improvement and 28% made reference in both sections. However, 45% made no reference in either of these sections. In 2014, 34% mentioned the most able in both main findings and recommendations, while 52% mentioned it in neither. On this measure, inspectors’ focus on the most able has not increased substantively since last year.
  • Community and foundation schools were rather more likely to attract such comments than either converter or sponsored academies. Voluntary schools were least likely to attract them. The lower the overall inspection grade, the more likely a school is to receive such comments.
  • In Key findings, negative comments outnumbered positive comments by a ratio of 3:1. Schools with high percentages of high attainers were well represented amongst those receiving positive comments.
  • Unsurprisingly, schools rated inadequate overall were much more likely to attract negative comments. A correlation between overall quality and quality of provision for the most able was somewhat more apparent than in 2014. There was also some evidence to suggest a correlation between negative comments and a low proportion of high attainers.
  • On the other hand, over half of schools with an overall requiring improvement grade and a third with an overall inspection grade of inadequate did not attract comments about the most able under Key findings. This is not indicative of greater emphasis.
  • The menu of shortcomings is confined to seven principal faults: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information. In most cases practice is inconsistent but occasionally problems are school-wide.
  • Areas for improvement are almost always expressed in formulaic fashion. Those relating to the most able focus almost invariably on the Quality of teaching. The improvement most commonly urged is more thorough application of assessment information to planning and teaching.
  • Only 15% of reports mention the most able under Leadership and management and, of those, only two are negative comments. The role of governors was not raised once. Too often the school leadership escapes censure for shortcomings identified elsewhere in the report. This is not consistent with indications of new-found emphasis in this territory.
  • The most able are hardly ever mentioned in the Behaviour and safety section of reports. It would seem that bullying is invisible and low level disruption by bored high attainers rare.
  • Conversely, 68% of reports referenced the most able under Quality of teaching. Although negative comments are much more likely in schools judged as inadequate or requiring improvement in this area, a few appear to be succeeding with their most able against the odds. The main text identifies a list of twelve good practice points gleaned from the sample.
  • Only one report fails to mention the most able under Achievement of pupils, but the quality and coverage varies enormously. Some comments are entirely generic; some focus on achievement, others on progress and some on both. Few venture beyond the core subjects. There is very little quantification, especially of insufficient progress (and especially compared with equivalent discussion of progress by disadvantaged learners).
  • Relatively few reports deploy the term ‘most able’ when discussing sixth form provision. Progression to higher education is sometimes mentioned and sometimes not. There is no consensus on how to refer to selective higher education.
  • Only one report in this sample mentions disadvantaged most able students. Two reports betray the tendency to assume these two groups are mutually exclusive but, worse still, the sin of omission is almost universal. This provides no support whatsoever for Ofsted’s claim that inspectors already address the issue.

Areas for improvement

Ofsted has made only limited improvements since the previous inspection in May 2014 and its more recent commitments are not yet reflected in Section 5 inspection practice.

In order to pass muster it should:

  • Appoint a lead inspector for the most able who will assume responsibility across Ofsted, including communication and consultation with third parties.
  • Consolidate and clarify material about the most able in the new Inspection Handbooks and supporting guidance for inspectors.
  • Prepare and publish a high quality evaluation toolkit, to support schools and inspectors alike. This should address definitional and terminological issues as well as supplying benchmarking data for achievement and progress. It might also set out the core principles underpinning effective practice.
  • Include within the toolkit a self-assessment and evaluation framework based on the quality standards. This should model Ofsted’s understanding of whole school provision for the most able that aligns with outstanding, good and requiring improvement grades, so that schools can understand the progression between these points.
  • Incorporate data about the incidence of the most able and their performance in the Data Dashboard.
  • Extend all elements of this work programme to the primary and post-16 sectors.
  • Undertake this work programme in consultation with external practitioners and experts in the field, completing it as soon as possible and by December 2015 at the latest.

 .

Verdict: (Still) Requires Improvement.

GP

April 2015

.

.

Annex: Coverage in the School Inspection Handbook (December 2014)

Main Text

Inspectors should:

  • Gather evidence about how well the most able are ‘learning, gaining knowledge and understanding, and making progress’ (para 40)
  • Take account of them when considering performance data (para 59)
  • Take advantage of opportunities to gather evidence from them (para 68)
  • Consider the effectiveness of pupil grouping, for example ‘where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched…’ (para 153)
  • Explore ‘how well the school works with families to support them in overcoming the cultural obstacles that often stand in the way of the most able pupils from deprived backgrounds attending university’ (para 154)
  • Consider whether ‘teachers set homework in line with the school’s policy and that challenges all pupils, especially the most able’ (para 180)
  • Consider ‘whether work in Key Stage 3 is demanding enough, especially for the most able when too often undemanding work is repeated unnecessarily’ (para 180)
  • Consider whether ‘teaching helps to develop a culture and ethos of scholastic excellence, where the highest achievement in academic work is recognised, especially in supporting the achievement of the most able’ (para 180)
  • When judging achievement, have regard for ‘the progress that the most able are making towards attaining the highest grades’ and ‘pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should’. They must ‘summarise the achievements of the most able pupils in a separate paragraph of the inspection report’ (paras 185-7)
  • Consider ‘how the school uses assessment information to identify pupils who…need additional support to reach their full potential, including the most able.’ (para 193)
  • Consider how well ‘assessment, including test results, targets, performance descriptors or expected standards are used to ensure that…more able pupils do work that deepens their knowledge and understanding’ and ‘pupils’ strengths and misconceptions are identified and acted on by teachers during lessons and more widely to… deepen the knowledge and understanding of the most able’ (para 194)
  • Take account of ‘the learning and progress across year groups of different groups of pupils currently on the roll of the school, including…the most able’. Evidence gathered should include ‘the school’s own records of pupils’ progress, including… the most able pupils such as those who joined secondary schools having attained highly in Key Stage 2’ (para 195)
  • Take account of ‘pupils’ progress in the last three years, where such data exist and are applicable, including that of…the most able’ (para 195)
  • ‘When inspecting and reporting on students’ achievement in the sixth form, inspectors must take into account all other guidance on judging the achievement, behaviour and development of students, including specific groups such as…the most able ‘ (para 210)
  • Talk to sixth form students to discover ‘how well individual study programmes meet their expectations, needs and future plans, including for…the most able’ (para 212)

However, the terminology is not always consistent. In assessing the overall effectiveness of a school, inspectors must judge its response to ‘the achievement of…the highest and lowest attainers’ (para 129)

Grade descriptors

Outstanding

  • Overall effectiveness:

‘The school’s practice consistently reflects the highest expectations of staff and the highest aspirations for pupils, including the most able…’

  • Quality of teaching:

‘Much teaching over time in all key stages and most subjects is outstanding and never less than consistently good. As a result, almost all pupils currently on roll in the school, including…the most able, are making sustained progress that leads to outstanding achievement.’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is consistently good or better.’

  • Effectiveness of sixth form provision:

‘All groups of pupils make outstanding progress, including…the most able’

Good

  • Overall effectiveness:

‘The school takes effective action to enable most pupils, including the most able…’

  • Quality of teaching:

‘Teaching over time in most subjects, including English and mathematics, is consistently good. As a result, most pupils and groups of pupils on roll in the school, including…the most able, make good progress and achieve well over time.’

‘Effective teaching strategies, including setting appropriate homework and well-targeted support and intervention, are matched closely to most pupils’ needs, including those most and least able, so that pupils learn well in lessons’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is generally good.’

  • Effectiveness of sixth form provision:

‘As a result of teaching that is consistently good over time, students make good progress, including…the most able’

Inadequate

  • Quality of teaching:

‘As a result of weak teaching over time, pupils or particular groups of pupils, including…the most able, are making inadequate progress.’

  • Achievement of pupils:

‘Groups of pupils, particularly disabled pupils and/or those who have special educational needs and/or disadvantaged pupils and/or the most able, are underachieving’

  • Effectiveness of sixth form provision:

‘Students or specific groups such as… the most able do not achieve as well as they can. Low attainment of any group shows little sign of rising.’

The most able students: Has Ofsted made progress?

.

This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.

.

Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour’s response:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’

.

  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is  well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breadth they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response, from Conservatives or Liberal Democrats, to the challenge laid down by Labour, which has decided that some degree of arm’s-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas over common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions

.

Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

It fails to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
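The Venn diagram arithmetic is straightforward inclusion-exclusion on the three SFR percentages above; a minimal check in Python:

```python
# Share of the Year 7 cohort inside Ofsted's 'most able' definition (2014 KS2 data)
# P(English or maths) = P(English) + P(maths) - P(both)
english = 29  # % with L5+ in KS2 reading and writing
maths = 44    # % with L5+ in KS2 maths
both = 24     # % with L5+ in reading, writing and maths

most_able = english + maths - both
print(most_able)  # 49 -> roughly half the Year 7 cohort
```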

.

Ofsted venn Capture

.

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, i.e. once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

Since the separate maths and English figures each include those with Level 5 in both subjects, subtracting the overlap to avoid double-counting gives a total population of 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
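The same inclusion-exclusion arithmetic, applied to the Annex A headcounts and the 2011 cohort proxy, reproduces the 39% figure:

```python
# Annex A headcounts for the 2014 GCSE cohort (KS2 tests taken in 2009)
l5_maths_and_english = 91_944
l5_maths = 165_340
l5_english = 138_789

# Subtract the overlap once so students with L5 in both are not double-counted
total_l5 = l5_maths + l5_english - l5_maths_and_english
print(total_l5)  # 212185

# Proxy for the 2009 KS2 cohort, taken from the 2011 Primary Performance Tables
ks2_cohort = 547_025
print(round(100 * total_l5 / ks2_cohort))  # 39
```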

.

Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

.

Ofsted graph 1

Percentage of high attainers within each state-funded non-selective secondary school’s cohort 2014 (Performance Tables measure)

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained a L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained a L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained a L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils in each of these categories within the relevant cohort. (It is of course a different 9% in each case.)

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students; indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
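The 28,000 estimate can be reproduced from the two figures the Report does supply, on the stated assumption that the 13.2% disadvantage rate holds across the whole Level 5 population:

```python
disadvantaged_both = 12_150  # 'ever 6 FSM' students with KS2 L5 in English and maths
all_both = 91_944            # all students with KS2 L5 in both subjects (Annex A)

rate = disadvantaged_both / all_both
print(round(100 * rate, 1))  # 13.2

# Assume the same rate applies across the whole L5 population of 212,185
total_most_able = 212_185
print(round(rate * total_most_able))  # 28040 - almost exactly 28,000
```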

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so-called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

Ofsted Capture 1

Ofsted Capture 2

 .

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

Approximately one in four high attainers falls short of each of these progression targets, even though the targets are not particularly demanding.
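The ‘one in four’ observation is simply the complement of the Grade B+ percentages in the charts; a quick sanity check:

```python
# Percentage of KS2 L5 students reaching at least a Grade B at GCSE
b_or_better = {'English': 77, 'maths': 76, 'English and maths': 73}

for subject, pct in b_or_better.items():
    print(subject, 100 - pct)  # 23, 24 and 27 - roughly one in four below B
```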

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.

.

The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sublevels, amply evidenced by the transition matrices.

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.

.

Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).
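These quoted gaps are simple differences between the ‘other’ and ‘ever 6 FSM’ percentages; recomputing them from the figures above:

```python
# (other, ever-6 FSM) percentages as quoted in the Report
measures = {
    'English, B or above': (79, 66),
    'maths, B or above': (78, 61),
    'Both subjects, A*/A': (37, 20),
    'Both subjects, B or above': (75, 59),
}

for measure, (other, fsm) in measures.items():
    print(measure, other - fsm)  # gaps of 13, 17, 17 and 16 points
```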

A table is supplied showing progression by sub-level in English and maths separately.

.

Ofsted Capture 3

. 

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

.

Ofsted chart 2

GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment

.

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.

 .

Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.

.

Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools’

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s evidence base: Inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

.

Ofsted Capture 4

.

  • Supplementary questions asked during 130 Section 5 inspections, focused on how well the most able students are maintaining their progress in KS3, plus the level of challenge and the availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly on to the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged was small. Consequently leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.

.

.

Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and 5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that the work was too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.

. 

Assessment and tracking

‘Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tended to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes made as a consequence. Only in eight schools were the most able students’ views sought as a cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable the most able to reach their potential. Weaknesses at KS3 meant there was too much to catch up at KS4 and 5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.

 .

Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’

. 

School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.

.

Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.

Too often poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared a factor in stifled progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 school asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: ‘There is confusion (in schools) between excellence and elitism’.

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.

.

Ofsted and other Central Government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

.

Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, too often, the most able students are succeeding despite their schools, rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It as if the ‘school-led, self-improving’ ideal is already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers;’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

Ofsted capture 5

Ofsted Capture 6

Those aimed by Ofsted towards itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it is possible that it might do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that: 

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 

.

.

If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is why schools should be any more likely to respond this time round than in 2013?

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take. Until we can:

  • Establish the framework necessary to secure universally high standards across all schools without resorting to national prescription

we – and Ofsted – are whistling in the wind.

GP

March 2015

HMCI Ups the Ante on the Most Able

.

Her Majesty’s Chief Inspector Wilshaw made some important statements about the education of what Ofsted most often calls ‘the most able’ learners in his 2013/14 Annual Report and various supporting documents.

P1020587

Another Norwegian Landscape by Gifted Phoenix

This short post compiles and summarises these statements, setting them in the context of current inspection policy and anticipated changes to the inspection process.

It goes on to consider what further action might be necessary to remedy the deficiencies Ofsted has identified in schools and to boost our national capacity to educate high attainers.

It continues a narrative which runs through several of my previous posts including:

.

What the Annual Report documents said

Ofsted’s press release marking publication of the 2013/14 Annual Report utilises a theme that runs consistently through all the documentation: while the primary sector continues to improve, progress has stalled in the secondary sector, resulting in a widening performance gap between the two sectors.

It conveys HMCI’s judgement that primary schools’ improvement is attributable to the fact that they ‘attend to the basics’, one of which is:

‘Enabling the more able [sic] pupils to reach their potential’

Conversely, the characteristics of secondary schools where improvement has stalled include:

‘The most able not being challenged’.

It is unclear whether Ofsted maintains a distinction between ‘more able’ and ‘most able’ since neither term is defined at any point in the Annual Report documentation.

In his speech launching the Annual Report, HMCI Wilshaw said:

‘The problem is also acute for the most able children. Primaries have made steady progress in helping this group. The proportion of pupils at Key Stage 2 gaining a Level 5 or above rose from 21% in 2013 to 24% this year. Attainment at Level 6 has also risen, particularly in mathematics, where the proportion reaching the top grade has increased from 3% to 9% in two years.

Contrast that with the situation in secondary schools. In 2013, nearly a quarter of pupils who achieved highly at primary school failed to gain even a B grade at GCSE. A third of our inspections of secondary schools this year pinpointed specific problems with teaching the most able – a third of inspections this year.

We cannot allow this lack of progress to persist. Imagine how dispiriting it must be for a child to arrive at a secondary school bursting with enthusiasm and keen to learn, only to be forced to repeat lessons already learnt and endure teaching that fails to stimulate them. To help tackle this problem, I have commissioned a report into progress at Key Stage 3 and it will report next year.’

HMCI’s written Commentary on the Annual Report says of provision in primary schools:

‘Many primary schools stretch the more able

Good and outstanding schools encourage wider reading and writing at length. Often, a school’s emphasis on the spiritual, moral, social and cultural aspects of the curriculum benefits all pupils but especially the more able, providing them with opportunities to engage with complex issues.

The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’ (Page 9)

The parallel commentary on provision in secondary schools says:

Too many secondary schools are not challenging the most able

In 2013, almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A or A* in those subjects at GCSE. Nearly a quarter of them did not even achieve a B grade.

Around a third of our inspections of secondary schools this year identified issues in the teaching of the most able pupils. Inspectors found that teachers’ expectations of the most able were too low. There is a worrying lack of scholarship permeating the culture of too many schools.

In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections and will publish a separate report on what we find.’ (Page 13)

The Annual Report itself adds:

‘Challenging the most able

England’s schools are still not doing enough to help the most able children realise their potential. Ofsted drew attention to this last year, but the story has yet to change significantly. Almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A* or A in those subjects at GCSE in 2013. Nearly a quarter of them did not even achieve a B grade and a disproportionate number of these are boys. Our brightest pupils are not doing as well as their peers in some other countries that are significantly outperforming England. In PISA 2012, fewer 15-year-olds in England were attaining at the highest levels in mathematics than their peers in Germany, Poland and Belgium. In reading, however, they were on a par.

This year, our inspectors looked carefully at how schools were challenging their most able pupils. Further action for individual schools was recommended in a third of our inspection reports. The majority of recommendations related to improved teaching of this group of pupils. Inspectors called on schools to ensure that the most able pupils are being given challenging work that takes full account of their abilities. Stretching the most able is a task for the whole school. It is important that schools promote a culture that supports the most able pupils to flourish, giving them opportunities to develop the skills needed by top universities and tracking their progress at every stage.

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential. Ofsted will follow up its 2013 publication on the most able in secondary schools with another survey focusing on non-selective primary and secondary schools. As part of this survey, we will examine the transition of the most able pupils from one phase to the next.’

Rather strangely, there are substantive references in only two of the accompanying regional reports.

The Report on London – the region that arguably stands above all others in terms of overall pupil performance – says:

More able pupils [sic]

London does reasonably well overall for more able pupils. In 2012/13 the proportion of pupils who were high attainers in Year 6 and then went on to gain A* or A in GCSE English was 46% in London compared with 41% in England.  In mathematics, the proportions were 49% across England and 58% in London.

However, in 2012/13, seven local authorities – Croydon, Bexley, Havering, Lewisham, Lambeth, Tower Hamlets and Waltham Forest – were below the London and national proportions of previously high attaining pupils who went on to attain grade A* or A in GCSE English. With the exception of Bexley, the same local authorities also fell below the London and national levels for the proportion of previously high-attaining pupils who went on to attain grade A* or A in GCSE mathematics.

We have identified the need to secure more rapid progress for London’s more able pupils as one of our key priorities. Inspectors will be paying particular attention to the performance of the more able pupils in schools and local authorities where these pupils are not reaching their full potential.’

The Report on the North-West identifies a problem:

‘Too many of the more able students underperform at secondary school. Of the 23 local authorities in the North West, 13 are below the national level for the percentage of children achieving at least Level 5 at Key Stage 2 in English and mathematics. The proportion subsequently attaining A* or A at GCSE is very low in some areas, particularly Knowsley, Salford and Blackpool.’

But it does not mention tackling this issue amongst its regional priorities.

The six remaining regional reports are silent on the issue.

.

Summarising the key implications

Synthesising the messages from these different sources, it seems that:

  • Primary schools have made ‘steady progress’ in supporting the most able, improving their capacity to identify and develop their potential. 
  • Inspection evidence suggests one third of secondary schools have specific problems with teaching the most able. This is a whole school issue. Too many high attainers at the end of KS2 are no longer high attainers at the end of KS4. Teachers’ expectations are too low. A positive school culture is essential but there is ‘a worrying lack of scholarship permeating the culture of too many schools’.  
  • Ofsted will increase the scrutiny it gives to the performance of the most able in routine school inspections, taking account of the leadership shown by schools (which appears to mean the contribution made by school leaders within schools), and will sharpen their recommendations within school inspection reports to reflect this increased scrutiny. 
  • They will also publish a survey report in 2015 that will feature: the outcomes of their increased scrutiny; provision in ‘non-selective primary and secondary schools’ including transition between phases; and the progress of the most able learners in KS3. 
  • In London the need to secure more rapid progress for more able pupils is a priority for Ofsted’s regional team. They will focus particularly on progress in English and maths between KS2 and KS4 in seven local authorities performing below the national and London average. 

[Postscript: In his Select Committee appearance on 28 January 2015, HMCI said that the 2015 survey report will be published in May.

However, there were press reports a few days ahead that it would be brought forward to Wednesday 4 March.

Publication ahead of the General Election, rather than immediately afterwards, puts pressure on the political parties to set out their response.

Will they continue to advance the familiar line that their generic standards-raising policies will ‘lift all ships’, or will they commit to a more targeted solution, such as the one I proposed here?]

.

All this suggests that schools would be wise to concentrate on strengthening leadership, school culture and transition – as well as eradicating any problems associated with teaching the most able.

KS3 is a particular concern in secondary schools. Although there will be comparatively more attention paid to the secondary sector, primary schools will not escape Ofsted’s increased scrutiny.

This is as it should be since my recent analysis of high attainers and high attainment in the 2014 Primary Performance Tables demonstrates that there is significant underachievement amongst high attainers in the primary sector and, in particular, very limited progress in closing achievement gaps between disadvantaged and other learners at higher attainment levels.

Ofsted does not say that they will give particular attention to most able learners in receipt of the pupil premium. The 2013 survey report committed them to doing so, but I could find no such emphasis in my survey of secondary inspection reports.

.

Will this be enough?

HMCI’s continuing concern about the quality of provision for the most able raises the question whether Ofsted’s increased scrutiny will be sufficient to bring about the requisite improvement.

Government policy is to leave this matter entirely to schools, although this has been challenged in some quarters. Labour in Opposition has been silent on the matter since Burnham’s Demos speech in July 2011.

More recent political debate about selection and setting has studiously avoided the wider question of how best to meet the needs of the most able, especially those from disadvantaged backgrounds.

If HMCI Wilshaw were minded to up the ante still further, what additional action might he undertake within Ofsted and advocate beyond it?

I sketch out below a ten-step plan for his and your consideration.

.

  1. Ofsted should strengthen its inspection procedures by publishing a glossary and supplementary inspection guidance, so that schools and inspectors alike have a clearer, shared understanding of Ofsted’s expectations and what provision should look like in outstanding and good schools. This should feature much more prominently the achievement, progress and HE destinations of disadvantaged high attainers, especially those in receipt of the Pupil Premium.

.

  1. The initiative under way in Ofsted’s London region should be extended immediately to all eight regions and a progress report should be included in Ofsted’s planned 2015 survey.

.

  1. The Better Inspection for All consultation must result in a clearer and more consistent approach to the inspection of provision for the most able learners across all sectors, with separate inspection handbooks adjusted to reflect the supplementary guidance above. Relevant high attainment, high attainer and excellence gaps data should be added to the School Data Dashboard.

.

  1. Ofsted should extend its planned 2015 survey to include a thorough review of the scope and quality of support for educating the most able provided to schools through local authority school improvement services, academy chains, multi-academy trusts and teaching school alliances. It should make recommendations for extending and strengthening such support, eliminating any patchiness of provision.

.

  1. Reforms to the assessment and accountability frameworks mean that less emphasis will be placed in future on the achievement of national benchmarks by borderline candidates and more on the attainment and progress of all learners. But there are still significant gaps in the data published about high attainment and high attainers, especially the differential performance of advantaged and disadvantaged learners. The decision to abandon the planned data portal – in which it was expected some of this data would be deposited – is problematic. Increased transparency would be helpful.

.

  1. There are unanswered questions about the support that the new levels-free assessment regime will provide for the achievement and progression of the most able. There is a risk that a ‘mastery’-focused approach will emphasise progression through increased depth of study, at the expense of greater breadth and faster pace, thus placing an unnecessary constraint on their education. Guidance is desirable to help eliminate these concerns.

.

  1. The Education Endowment Foundation (EEF) should extend its remit to include excellence gaps. All EEF-sponsored evaluations should routinely consider the impact on disadvantaged high attainers. The EEF should also sponsor projects to evaluate the blend of interventions that are most effective in closing excellence gaps. The Toolkit should be revised where necessary to highlight more clearly where specific interventions have a differential impact on high attainers.

.

  1. Efforts should be made to establish national consensus on the effective education of high attainers through consultation on and agreement of a set of common core principles.

.

  1. A ‘national conversation’ is needed to identify strategies for supporting (disadvantaged) high attainers, pushing beyond the ideological disagreements over selection and setting to consider a far wider range of options, including more innovative approaches to within-school and between-school provision.

.

  1. A feasibility study should be conducted into the viability of a national, non-governmental learner-centred support programme for disadvantaged high attainers aged 11-18. This would be market-driven but operate within a supporting national framework. It would be managed entirely within existing budgets – possibly an annual £50m pupil premium topslice plus a matching contribution from universities’ fair access outreach funding.

.

GP

December 2014

How well is Ofsted reporting on the most able?

 

 

This post considers how Ofsted’s new emphasis on the attainment and progress of the most able learners is reflected in school inspection reports.

My analysis is based on the 87 Section 5 secondary school inspection reports published in the month of March 2014.

keep-calm-and-prepare-for-ofsted-6I shall not repeat here previous coverage of how Ofsted’s emphasis on the most able has been framed. Interested readers may wish to refer to previous posts for details:

The more specific purpose of the post is to explore how consistently Ofsted inspectors are applying their guidance and, in particular, whether there is substance for some of the concerns I expressed in these earlier posts, drawn together in the next section.

The remainder of the post provides an analysis of the sample and a qualitative review of the material about the most able (and analogous terms) included in the sample of 87 inspection reports.

It concludes with a summary of the key points, a set of associated recommendations and an overall inspection grade for inspectors’ performance to date. Here is a link to this final section for those who prefer to skip the substance of the post.

 

Background

Before embarking on the real substance of this argument I need to restate briefly some of the key issues raised in those earlier posts:

  • Ofsted’s definition of ‘the most able’ in its 2013 survey report is idiosyncratically broad, including around half of all learners on the basis of their KS2 outcomes.
  • The evidence base for this survey report included material suggesting that the most able students are supported well or better in only 20% of lessons – and are not making the progress of which they are capable in about 40% of schools.
  • The survey report’s recommendations included three commitments on Ofsted’s part. It would:

 o   ‘focus more closely in its inspections on the teaching and progress of the most able students, the curriculum available to them, and the information, advice and guidance provided to the most able students’;

o   ‘consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds’ and

o   ‘report its inspection findings about this group of students more clearly in school inspection, sixth form and college reports.’

  • Subsequently the school inspection guidance was revised somewhat  haphazardly, resulting in the parallel use of several undefined terms (‘able pupils’, ‘most able’, ‘high attaining’, ‘highest attaining’),  the underplaying of the attainment and progress of the most able learners attracting the Pupil Premium and very limited reference to appropriate curriculum and IAG.
  • Within the inspection guidance, emphasis was placed primarily on learning and progress. I edited together the two relevant sets of level descriptors in the guidance to provide this summary for the four different inspection categories:

In outstanding schools the most able pupils’ learning is consistently good or better and they are making rapid and sustained progress.

In good schools the most able pupils’ learning is generally good, they make good progress and achieve well over time.

In schools requiring improvement the teaching of the most able pupils and their achievement are not good.

In inadequate schools the most able pupils are underachieving and making inadequate progress.

  • No published advice has been made available to inspectors on the interpretation of these amendments to the inspection guidance. In October 2013 I wrote:

‘Unfortunately, there is a real risk that the questionable clarity of the Handbook and Subsidiary Guidance will result in some inconsistency in the application of the Framework, even though the fundamental purpose of such material is surely to achieve the opposite.’

  • Analysis of a very small sample of reports for schools reporting poor results for high attainers in the school performance tables suggested inconsistency both before and after the amendments were introduced into the guidance. I commented:

‘One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.’

The material below considers the impact of these revisions on a more substantial sample of reports and whether this justifies some of the concerns expressed above.

It is important to add that, in January 2014, Ofsted revised its guidance document ‘Writing the report for school inspections’ to include the statement that:

‘Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’ (p8)

This serves to reinforce the changes to the inspection guidance and clearly indicates that coverage of this issue – at least in these terms – is non-negotiable: we should expect to see appropriate reference in every single Section 5 report.

 

The Sample

The sample comprises 87 secondary schools whose Section 5 inspection reports were published by Ofsted in March 2014.

The inspections were conducted between 26 November 2013 and 11 March 2014, so the inspectors will have had time to become familiar with the revised guidance.

However, up to 20 of the inspections took place before Ofsted felt it necessary to emphasise that coverage of the progress and teaching of the most able is compulsory.

The sample happens to include several institutions inspected as part of wider-ranging reviews of schools in Birmingham and of schools operated by the E-ACT academy chain. It also incorporates several middle deemed secondary schools.

Chart 1 shows the regional breakdown of the sample, adopting the regions Ofsted uses to categorise reports rather than its own regional structure (i.e. with the North East identified separately from Yorkshire and Humberside).

It contains a disproportionately large number of schools from the West Midlands, while the South West is significantly under-represented. All the remaining regions supply between 5 and 13 schools. A total of 57 local authority areas are represented.

 

Chart 1: Schools within the sample by region

Ofsted chart 1

 

Chart 2 shows the different statuses of schools within the sample. Over 40% are community schools, while almost 30% are sponsored academies. There are no academy converters but sponsored academies, free schools and studio schools together account for some 37% of the sample.

 

Chart 2: Schools within the sample by status

Ofsted chart 2

 

The vast majority of schools in the sample are 11-16 or 11-18 institutions, but four are all-through schools, five provide for learners aged 13 or 14 upwards and 10 are middle schools. There are four single sex schools.

Chart 3 shows the variation in school size. Some of the studio schools, free schools and middle schools are very small by secondary standards, while the largest secondary school in the sample has some 1,600 pupils. A significant proportion of schools have between 600 and 1,000 pupils.

 

Chart 3: Schools within the sample by number on roll

Ofsted chart 3

The distribution of overall inspection grades across the sample is illustrated by Chart 4 below. Eight schools were rated outstanding, 28 good, 35 requiring improvement and 16 inadequate.

Of those rated inadequate, 12 were subject to special measures and four had serious weaknesses.

 

Chart 4: Schools within the sample by overall inspection grade

 Ofsted chart 4

The eight schools rated outstanding include:

  • A mixed 11-18 sponsored academy
  • A mixed 14-19 studio school
  • A mixed 11-18 free school
  • A mixed 11-16 VA comprehensive
  • A girls’ 11-18 VA comprehensive
  • A boys’ 11-18 VA selective school
  • A girls’ 11-18 community comprehensive and
  • A mixed 11-18 community comprehensive

The sixteen schools rated inadequate include:

  • Eight mixed 11-18 sponsored academies
  • Two mixed 11-16 sponsored academies
  • A mixed all-through sponsored academy
  • A mixed 11-16 free school
  • Two mixed 11-16 community comprehensives
  • A mixed 11-18 community comprehensive and
  • A mixed 13-19 community comprehensive

 

Coverage of the most able in main findings and recommendations

 

Terminology 

Where they were mentioned, such learners were most often described as ‘most able’, but a wide range of other terminology was deployed, including ‘most-able’, ‘the more able’, ‘more-able’, ‘higher attaining’, ‘high-ability’, ‘higher-ability’ and ‘able students’.

The idiosyncratic adoption of redundant hyphenation is an unresolved mystery.

It is not unusual for two or more of these terms to be used in the same report. In the absence of a glossary, this makes some reports rather less straightforward to interpret accurately.

It is also more difficult to compare and contrast reports. Helpful services like Watchsted’s word search facility become less useful.

 

Incidence of commentary in the main findings and recommendations

Thirty of the 87 inspection reports (34%) explicitly addressed the school’s most able learners (or used a similar term) in both the main findings and the recommendations.

The analysis showed that 28% of reports on academies (including studios and free schools) met this criterion, whereas 38% of reports on non-academy schools did so.

Chart 5 shows how the incidence of reference in both main findings and recommendations varies according to the overall inspection grade awarded.

One can see that this level of attention is most prevalent in schools requiring improvement, followed by those rated inadequate. It is less common in schools rated good and less common still in outstanding schools. The gap between good and outstanding schools is perhaps smaller than expected.

The slight lead for schools requiring improvement over inadequate schools may be attributable to a view that the latter face more pressing priorities, or it may have something to do with the varying proportions of high attainers in such schools, or both of these factors could be in play, amongst others.

 

Chart 5: Most able covered in both main findings and recommendations by overall inspection rating (percentage)

Ofsted chart 5

A further eleven reports (13%) addressed the most able learners in the recommendations but not the main findings.

Only one report managed to feature the most able in the main findings but not in the recommendations and this was because the former recorded that ‘the most able students do well’.

Consequently, a total of 45 reports (52%) did not mention the most able in either the main findings or the recommendations.

This applied to some 56% of reports on academies (including free schools and studio schools) and 49% of reports on other state-funded schools.

So, according to these proxy measures, the most able in academies appear to receive less attention from inspectors than those in non-academy schools. It is not clear why. (The samples are almost certainly too small to support reliable comparison of academies and non-academies with different inspection ratings.)

Chart 6 below shows the inspection ratings for this subset of reports.

 

Chart 6: Most able covered in neither main findings nor recommendations by overall inspection rating (percentage)

Ofsted chart 6

Here is further evidence that the significant majority of outstanding schools are regarded as having no significant problems in respect of provision for the most able.

On the other hand, this is far from being universally true, since it is an issue for one in four of them. This ratio of 3:1 does not lend complete support to the oft-encountered truism that outstanding schools invariably provide outstandingly for the most able – and vice versa.

At the other end of the spectrum, and perhaps even more surprisingly, over 30% of inadequate schools are assumed not to have issues significant enough to warrant reference in these sections. Sometimes this may be because they are equally poor at providing for all their learners, so the most able are not separately singled out.

Chart 7 below shows differences by school size, giving the percentage of reports mentioning the most able in both main findings and recommendations and in neither.

It divides schools into three categories: small (24 schools with a NOR of 599 or lower), medium (35 schools with a NOR of 600-999) and large (28 schools with a NOR of 1000 or higher.

 

Chart 7: Reports mentioning the most able in main findings and recommendations by school size 

 Ofsted chart 7

It is evident that ‘neither’ exceeds ‘both’ in all three categories. Small and large schools record very similar profiles.

But there is a much more significant difference for medium-sized schools, which show a much smaller percentage of ‘both’ reports and comfortably the largest percentage of ‘neither’ reports.

This pattern – suggesting that inspectors are markedly less likely to emphasise provision for the most able in medium-sized schools – is worthy of further investigation.

It would be particularly interesting to explore further the relationship between school size, the proportion of high attainers in a school and their achievement.

 

Typical references in the main findings and recommendations

I could detect no obvious and consistent variations in these references by school status or size, but there was a noticeably different emphasis between schools rated outstanding and those rated inadequate.

Where the most able featured in reports on outstanding schools, these included recommendations such as:

‘Further increase the proportion of outstanding teaching in order to raise attainment even higher, especially for the most able students.’ (11-16 VA comprehensive).

‘Ensure an even higher proportion of students, including the most able, make outstanding progress across all subjects’ (11-18 sponsored academy).

These statements suggest that such schools have made good progress in eradicating underachievement amongst the most able but still have further room for improvement.

But where the most able featured in recommendations for inadequate schools, they were typically of this nature:

‘Improve teaching so that it is consistently good or better across all subjects, but especially in mathematics, by: raising teachers’ expectations of the quality and amount of work students of all abilities can do, especially the most and least able.’ (11-16 sponsored academy).

‘Improve the quality of teaching in order to speed up the progress students make by setting tasks that are at the right level to get the best out of students, especially the most able.’ (11-18 sponsored academy).

‘Rapidly improve the quality of teaching, especially in mathematics, by ensuring that teachers: have much higher expectations of what students can achieve, especially the most able…’ (11-16 community school).

These make clear that poor and inconsistent teaching quality is causing significant underachievement at the top end (and ‘especially’ suggests that this top end underachievement is particularly pronounced compared with other sections of the attainment spectrum in such schools).

Recommendations for schools requiring improvement are akin to those for inadequate schools but typically more specific, pinpointing particular dimensions of good quality teaching that are absent, so limiting effective provision for the most able. It is as if these schools have some of the pieces in place but not yet the whole jigsaw.

By comparison, recommendations for good schools can seem rather more impressionistic and/or formulaic, focusing more generally on ‘increasing the proportion of outstanding teaching’. In such cases the assessment is less about missing elements and more about the consistent application of all of them across the school.

One gets the distinct impression that inspectors have a clearer grasp of the ‘fit’ between provision for the most able and the other three inspection outcomes, at least as far as the distinction between ‘good’ and ‘outstanding’ is concerned.

But it would be misleading to suggest that these lines of demarcation are invariably clear. The boundary between ‘good’ and ‘requires improvement’ seems comparatively distinct, but there was more evidence of overlap at the intersections between the other grades.

 

Coverage of the most able in the main body of reports 

References to the most able rarely appear in the sections dealing with behaviour and safety or with leadership and management. I counted no examples in the former and no more than one or two in the latter.

I could find no examples where information, advice and guidance available to the most able are separately and explicitly discussed and little specific reference to the appropriateness of the curriculum for the most able. Both are less prominent than the recommendations in the June 2013 survey report led us to expect.

Within this sample, the vast majority of reports include some description of the attainment and/or progress of the most able in the section about pupils’ achievement, while roughly half pick up the issue in relation to the quality of teaching.

The extent of the coverage of most able learners varied enormously. Some devoted a single sentence to the topic while others referred to it separately in main findings, recommendations, pupils’ achievement and quality of teaching. In a handful of cases reports seemed to give disproportionate attention to the topic.

 

Attainment and progress

Analyses of attainment and progress are sometimes entirely generic, as in:

‘The most able students make good progress’ (inadequate 11-18 community school).

‘The school has correctly identified a small number of the most able who could make even more progress’ (outstanding 11-16 RC VA school).

‘The most able students do not always secure the highest grades’ (11-16 community school requiring improvement).

‘The most able students make largely expected rates of progress. Not enough yet go on to attain the highest GCSE grades in all subjects.’ (Good 11-18 sponsored academy).

Sometimes such statements can be damning:

‘The most-able students in the academy are underachieving in almost every subject. This is even the case in most of those subjects where other students are doing well. It is an academy-wide issue.’ (Inadequate 11-18 sponsored academy).

These do not in my view constitute reporting ‘in detail on the progress of the most able pupils’ and so probably fall foul of Ofsted’s guidance to inspectors on writing reports.

More specific comments on attainment typically refer explicitly to the achievement of A*/A grades at GCSE and ideally to specific subjects, for example:

‘In 2013, standards in science, design and technology, religious studies, French and Spanish were also below average. Very few students achieved the highest A* and A grades.’ (Inadequate 11-18 sponsored academy)

‘Higher-ability students do particularly well in a range of subjects, including mathematics, religious education, drama, art and graphics. They do as well as other students nationally in history and geography.’ (13-18 community school requiring improvement)

More specific comments on progress include:

‘The progress of the most able students in English is significantly better than that in other schools nationally, and above national figures in mathematics. However, the progress of this group is less secure in science and humanities.’ (Outstanding 11-18 sponsored academy)

‘In 2013, when compared to similar students nationally, more-able students made less progress than less-able students in English. In mathematics, where progress is less than in English, students of all abilities made similar progress.’ (11-18 sponsored academy requiring improvement).

Statements about progress rarely extend beyond English and maths (the first example above is exceptional) but, when attainment is the focus, some reports take a narrow view based exclusively on the core subjects, while others are far wider-ranging.

Despite the reference in Ofsted’s survey report, and subsequently the revised subsidiary guidance, to coverage of high attaining learners in receipt of the Pupil Premium, this is hardly ever addressed.

I could find only two examples amongst the 87 reports:

‘The gap between the achievement in English and mathematics of students for whom the school receives additional pupil premium funding and that of their classmates widened in 2013… During the inspection, it was clear that the performance of this group is a focus in all lessons and those of highest ability were observed to be achieving equally as well as their peers.’ (11-16 foundation school requiring improvement)

‘Students eligible for the pupil premium make less progress than others do and are consequently behind their peers by approximately one GCSE grade in English and mathematics. These gaps reduced from 2012 to 2013, although narrowing of the gaps in progress has not been consistent over time. More-able students in this group make relatively less progress.’ (11-16 sponsored academy requiring improvement)

More often than not it seems that the most able and those in receipt of the Pupil Premium are assumed to be mutually exclusive groups.

 

Quality of teaching 

There was little variation in the issues raised under teaching quality. Most inspectors select two or three options from a standard menu:

‘Where teaching is best, teachers provide suitably challenging materials and through highly effective questioning enable the most able students to be appropriately challenged and stretched…. Where teaching is less effective, teachers are not planning work at the right level of difficulty. Some work is too easy for the more able students in the class.’ (Good 11-16 community school)

‘In teaching observed during the inspection, the pace of learning for the most able students was too slow because the activities they were given were too easy. Although planning identified different activities for the most able students, this was often vague and not reflected in practice. Work lacks challenge for the most able students.’ (Inadequate 11-16 community school)

‘In lessons where teaching requires improvement, teachers do not plan work at the right level to ensure that students of differing abilities build on what they already know. As a result, there is a lack of challenge in these lessons, particularly for the more able students, and the pace of learning is slow. In these lessons teachers do not have high enough expectations of what students can achieve.’ (11-18 community school requiring improvement)

‘Tasks set by teachers are sometimes too easy and repetitive for pupils, particularly the most able. In mathematics, pupils are sometimes not moved on quickly enough to new and more challenging tasks when they have mastered their current work.’ (9-13 community middle school requiring improvement)

‘Targets which are set for students are not demanding enough, and this particularly affects the progress of the most able because teachers across the year groups and subjects do not always set them work which is challenging. As a result, the most able students are not stretched in lessons and do not achieve as well as they should.’ (11-16 sponsored academy rated inadequate)

All the familiar themes are present – assessment informing planning, careful differentiation, pace and challenge, appropriate questioning, the application of subject knowledge, the quality of homework, high expectations and extending effective practice between subject departments.

 

Negligible coverage of the most able

Only one of the 87 reports failed to make any mention of the most able whatsoever. This is the report on North Birmingham Academy, an 11-19 mixed school requiring improvement.

This clearly does not meet the injunction to:

‘…report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough’.

It ought not to have passed through Ofsted’s quality assurance processes unscathed. The inspection was conducted in February 2014, after this guidance was issued, so there is no excuse.

Several other inspections make only cursory references to the most able in the main body of the report, for example:

‘Where teaching is not so good, it was often because teachers failed to check students’ understanding or else to anticipate when to intervene to support students’ learning, especially higher attaining students in the class.’ (Good 11-18 VA comprehensive).

‘… the teachers’ judgements matched those of the examiners for a small group of more-able students who entered early for GCSE in November 2013.’ (Inadequate 11-18 sponsored academy).

‘More-able students are increasingly well catered for as part of the academy’s focus on raising levels of challenge.’ (Good 11-18 sponsored academy).

‘The most able students do not always pursue their work to the best of their capability.’ (11-16 free school requiring improvement).

These would also fall well short of the report writing guidance. At least 6% of my sample falls into this category.

Some reports note explicitly that the most able learners are not making sufficient progress, but fail to capture this in the main findings or recommendations, for example:

‘The achievement of more able students is uneven across subjects. More able students said to inspectors that they did not feel they were challenged or stretched in many of their lessons. Inspectors agreed with this view through evidence gathered in lesson observations…lessons do not fully challenge all students, especially the more able, to achieve the grades of which they are capable.’ (11-19 sponsored academy requiring improvement).

‘The 2013 results of more-able students show they made slower progress than is typical nationally, especially in mathematics.  Progress is improving this year, but they are still not always sufficiently challenged in lessons.’ (11-18 VC CofE school requiring improvement).

‘There is only a small proportion of more-able students in the academy. In 2013 they made less progress in English and mathematics than similar students nationally. Across all of their subjects, teaching is not sufficiently challenging for more-able students and they leave the academy with standards below where they should be.’ (Inadequate 11-18 sponsored academy).

‘The proportion of students achieving grades A* and A was well below average, demonstrating that the achievement of the most able also requires improvement.’  (11-18 sponsored academy requiring improvement).

Something approaching 10% of the sample fell into this category. It was not always clear why this issue was not deemed significant enough to feature amongst schools’ priorities for improvement. This state of affairs was more typical of schools requiring improvement than inadequate schools, so one could not so readily argue that the schools concerned were overwhelmed with the need to rectify more basic shortcomings.

That said, the example from an inadequate academy above may be significant. It is almost as if the small number of more able students is the reason why this shortcoming is not taken more seriously.

Inspectors must carry in their heads a somewhat subjective hierarchy of issues that schools are expected to tackle. Some inspectors appear to feature the most able at a relatively high position in this hierarchy; others push it further down the list. Some appear more flexible in the application of this hierarchy to different settings than others.

 

Formulaic and idiosyncratic references 

There is clear evidence of formulaic responses, especially in the recommendations for how schools can improve their practice.

Many reports adopt the strategy of recommending a series of actions featuring the most able, either in the target group:

‘Improve the quality of teaching to at least good so that students, including the most able, achieve higher standards, by ensuring that: [followed by a list of actions]’ (9-13 community middle school requiring improvement)

Or in the list of actions:

‘Improve the quality of teaching in order to raise the achievement of students by ensuring that teachers:…use assessment information to plan their work so that all groups of students, including those supported by the pupil premium and the most-able students, make good progress.’ (11-16 community school requiring improvement)

It was rare indeed to come across a report that referred explicitly to interesting or different practice in the school, or approached the topic in a more individualistic manner, but here are a few examples:

‘More-able pupils are catered for well and make good progress. Pupils enjoy the regular, extra challenges set for them in many lessons and, where this happens, it enhances their progress. They enjoy that extra element which often tests them and gets them thinking about their work in more depth. Most pupils are keen to explore problems which will take them to the next level or extend their skills.’  (Good 9-13 community middle school)

‘Although the vast majority of groups of students make excellent progress, the school has correctly identified a small number of the most able who could make even more progress. It has already started an impressive programme of support targeting the 50 most able students called ‘Students Targeted A grade Results’ (STAR). This programme offers individualised mentoring using high-quality teachers to give direct intervention and support. This is coupled with the involvement of local universities. The school believes this will give further aspiration to these students to do their very best and attend prestigious universities.’  (Outstanding 11-16 VA school)

I particularly liked:

‘Policies to promote equality of opportunity are ineffective because of the underachievement of several groups of students, including those eligible for the pupil premium and the more-able students.’ (Inadequate 11-18 academy) 

 

Conclusion

 

Main Findings

The principal findings from this survey, admittedly based on a rather small and not entirely representative sample, are that:

  • Inspectors are terminologically challenged in addressing this issue, because there are too many synonyms or near-synonyms in use.
  • Approximately one-third of inspection reports address provision for the most able in both main findings and recommendations. This is less common in academies than in community, controlled and aided schools. It is most prevalent in schools with an overall ‘requires improvement’ rating, followed by those rated inadequate. It is least prevalent in outstanding schools, although one in four outstanding schools is dealt with in this way.
  • Slightly over half of inspection reports address provision for the most able in neither the main findings nor the recommendations. This is relatively more common in the academies sector and in outstanding schools. It is least prevalent in schools rated inadequate, though almost one-third of inadequate schools fall into this category. Sometimes this is the case even though provision for the most able is identified as a significant issue in the main body of the report.
  • There is an unexplained tendency for reports on medium-sized schools to be significantly less likely to feature the most able in both main findings and recommendations and significantly more likely to feature it in neither. This warrants further investigation.
  • Overall coverage of the topic varies excessively between reports. One ignored it entirely, while several provided only cursory coverage and a few covered it to excess. The scope and quality of the coverage does not necessarily correlate with the significance of the issue for the school.
  • Coverage of the attainment and progress of the most able learners is variable. Some reports offer only generic descriptions of attainment and progress combined, some are focused exclusively on attainment in the core subjects while others take a wider curricular perspective. Outside the middle school sector, desirable attainment outcomes for the most able are almost invariably defined exclusively in terms of A* and A grade GCSEs.
  • Hardly any reports consider the attainment and/or progress of the most able learners in receipt of the Pupil Premium.
  • None of these reports make specific and explicit reference to IAG for the most able. It is rarely stated whether the school’s curriculum satisfies the needs of the most able.
  • Too many reports adopt formulaic approaches, especially in the recommendations they offer the school. Too few include reference to interesting or different practice.

In my judgement, too much current inspection reporting falls short of the commitments contained in the original Ofsted survey report and of the more recent requirement to:

‘always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

 

Recommendations

  • Ofsted should publish a glossary defining clearly all the terms for the most able that it employs, so that both inspectors and schools understand exactly what is intended when a particular term is deployed and which learners should be in scope when the most able are discussed.
  • Ofsted should co-ordinate the development of supplementary guidance clarifying its expectations of schools in respect of provision for the most able. This should set out in more detail what would be expected for such provision to be rated outstanding, good, requiring improvement or inadequate respectively. This should include the most able in receipt of the Pupil Premium, the suitability of the curriculum and the provision of IAG.
  • Ofsted should provide supplementary guidance for inspectors outlining and exemplifying the full range of evidence they might interrogate concerning the attainment and progress of the most able learners, including those in receipt of the Pupil Premium.
  • This guidance should specify the essential minimum coverage expected in reports and the ‘triggers’ that would warrant it being referenced in the main findings and/or recommendations for action.
  • This guidance should discourage inspectors from adopting formulaic descriptors and recommendations and specifically encourage them to identify unusual or innovative examples of effective practice.
  • The school inspection handbook and subsidiary guidance should be amended to reflect the supplementary guidance.
  • The School Data Dashboard should be expanded to include key data highlighting the attainment and progress of the most able.
  • These actions should also be undertaken for inspection of the primary and 16-19 sectors respectively.

 

Overall assessment: Requires Improvement.

 

GP

May 2014


What Becomes of Schools That Fail Their High Attainers?*

.

This post reviews the performance and subsequent history of schools with particularly poor results for high attainers in the Secondary School Performance Tables over the last three years.


Seahorse in Perth Aquarium by Gifted Phoenix

It establishes a high attainer ‘floor target’ so as to draw a manageable sample of poor performers and, having done so:

  • Analyses the characteristics of this sample;
  • Explores whether these schools typically record poor performance in subsequent years or manage to rectify matters;
  • Examines the impact of various interventions, including falling below the official floor targets, being placed in special measures or deemed to have serious weaknesses following inspection, becoming an academy and receiving a pre-warning and/or warning notice;
  • Considers whether the most recent Ofsted reports on these schools do full justice to this issue, including those undertaken after September 2013 when new emphasis was placed on the performance of the ‘most able’.

The post builds on my previous analysis of high attainment in the 2013 School Performance Tables (January 2014). It applies the broad definition of high attainers used in the Tables, which I discussed in that post and have not repeated here.

I must emphasise at the outset that factors other than poor performance may partially explain particularly low scores in the Tables.

There may be several extenuating circumstances that are not reflected in the results. Sometimes these may surface in Ofsted inspection reports, but the accountability and school improvement regime typically imposes a degree of rough justice, and I have followed its lead.

It is also worth noting that the Performance Tables do not provide data for schools where the number of high attainers is five or fewer, because of the risk that individuals may be identifiable even though the data is anonymised.

This is unfortunate since the chances are that schools with very few high attainers will find it more difficult than others to address their needs. We may never know, but there is more on the impact of cohort size below.

Finally please accept my customary apology for any transcription errors. Do let me know if you notice any and I will correct them.

.

Drawing the Sample

The obvious solution would be to apply the existing floor targets to high attainers.

So it would include all schools recording:

  • Fewer than 35% (2011) or 40% (2012 and 2013) of high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and mathematics and
  • Below median scores for the percentage of high attainers making at least the expected three levels of progress between Key Stages 2 and 4 in English and maths respectively.

But the first element is far too undemanding a threshold for high-attaining learners, and the overall target generates a tiny sample.

The only school failing to achieve it in 2013 was Ark Kings Academy in Birmingham, which recorded just six high attainers, forming 9% of the cohort (so only just above the level at which results would have been suppressed).

In 2012 two schools were in the same boat:

  • The Rushden Community College in Northamptonshire, with 35 high attainers (26% of the cohort), which became a sponsored academy with the same name on 1 December 2012; and
  • Culverhay School in Bath and North East Somerset, with 10 high attainers (19% of the cohort), which became Bath Community Academy on 1 September 2012.

No schools at all performed at this level in 2011.

A sample of just three schools is too small to be representative, so it is necessary to set a more demanding benchmark combining the same threshold and progress elements.

The problem does not lie with the progress measure: far too many schools – around 70% each year in both English and maths – fall below the median level of performance, even with their cadres of high attainers. Hence I need to lower the pitch of this element to create a manageable sample.

I plumped for 60% or fewer high attainers making at least the expected progress between KS2 and KS4 in both English and maths. This captured 22 state-funded schools in 2013, 31 in 2012 and 38 in 2011. (It also enabled Ark Kings Academy to escape, by virtue of the fact that 67% of its high attainers achieved the requisite progress in English.)

For the threshold element I opted for 70% or fewer high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and maths. This captured 19 state-funded schools in 2013, 29 in 2012 and 13 in 2011.

.

[Venn diagram: overlap between the threshold and progress elements]

The numbers of state-funded schools that met both criteria were seven in 2013, eight in 2012 and five in 2011 – 20 in all.

I decided to feature this small group of schools in the present post while also keeping in mind the schools occupying each side of the Venn Diagram. I particularly wanted to see whether schools which emerged from the central sample in subsequent years continued to fall short on one or other of the constituent elements.
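The selection logic amounts to a simple intersection of the two criteria. A minimal sketch in Python, assuming the figures as given in the Performance Tables (the two real schools use 2013 values from Table 2 below; "Hypothetical High School" is invented purely to show exclusion):

```python
# Illustrative sketch of the sample-drawing logic: a school enters the
# sample only if it falls below BOTH illustrative floor-target elements.
# "Hypothetical High School" is an invented example, not a real school.

schools = [
    # (name, % 5+ A*-C incl E+M, % 3+ LoP English, % 3+ LoP maths)
    ("Gloucester Academy", 44, 28, 50),
    ("Aireville School", 61, 35, 57),
    ("Hypothetical High School", 85, 75, 80),  # invented example
]

def below_floor(threshold, lop_en, lop_ma):
    """70% or fewer on the GCSE threshold AND 60% or fewer making
    expected progress in both English and maths."""
    return threshold <= 70 and lop_en <= 60 and lop_ma <= 60

sample = [name for name, t, en, ma in schools if below_floor(t, en, ma)]
print(sample)  # the two real schools qualify; the hypothetical one does not
```

The official floor target differs in requiring the progress shortfall in English *or* maths; the illustrative version above demands it in both, which is what shrinks the sample to a manageable size.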

Table 1 below lists the 20 schools in the main sample and provides more detail about each.

.

Table 1: Schools Falling Below Illustrative High Attainer Floor Targets 2011-2013

Name | Type | LA | Status/Sponsor | Subsequent History
2011
Carter Community School | 12-16 mixed modern | Poole | Community | Sponsored academy (ULT) 1/4/13
Hadden Park High School | 11-16 mixed comp | Nottingham | Foundation | Sponsored academy (Bluecoat School) 1/1/14
Merchants Academy | 11-18 mixed comp | Bristol | Sponsored academy (Merchant Venturers/University of Bristol) | –
The Robert Napier School | 11-18 mixed modern | Medway | Foundation | Sponsored academy (Fort Pitt Grammar School) 1/9/12
Bishop of Rochester Academy | 11-18 mixed comp | Kent | Sponsored academy (Medway Council/Canterbury Christ Church University/Diocese of Rochester) | –
2012
The Rushden Community College | 11-18 mixed comp | Northants | Community | Sponsored academy (The Education Fellowship) 12/12
Culverhay School | 11-18 boys comp | Bath and NE Somerset | Community | Bath Community Academy – mixed (Cabot Learning) 1/9/12
Raincliffe School | 11-16 mixed comp | N Yorks | Community | Closed 8/12 (merged with Graham School)
The Coseley School | 11-16 mixed comp | Dudley | Foundation | –
Fleetwood High School | 11-18 mixed comp | Lancs | Foundation | –
John Spendluffe Foundation Technology College | 11-16 mixed modern | Lincs | Academy converter | –
Parklands High School | 11-18 mixed | Liverpool | Foundation | Discussing academy sponsorship (Bright Tribe)
Frank F Harrison Engineering College | 11-18 mixed comp | Walsall | Foundation | Mirus Academy (sponsored by Walsall College) 1/1/12
2013
Gloucester Academy | 11-19 mixed comp | Glos | Sponsored academy (Prospect Education/Gloucestershire College) | –
Christ the King Catholic and Church of England VA School | 11-16 mixed comp | Knowsley | VA | Closed 31/8/13
Aireville School | 11-16 mixed modern | N Yorks | Community | –
Manchester Creative and Media Academy for Boys | 11-19 boys comp | Manchester | Sponsored academy (Manchester College/Manchester Council/Microsoft) | –
Fearns Community Sports College | 11-16 mixed comp | Lancs | Community | –
Unity College Blackpool | 5-16 mixed comp | Blackpool | Community | Unity Academy Blackpool (sponsored by Fylde Coast Academies)
The Mirus Academy | 3-19 mixed comp | Walsall | Sponsored academy (Walsall College) | –

 .

Only one school appears twice over the three-year period, albeit in two separate guises: Frank F Harrison/Mirus.

Of the 20 in the sample, seven were recorded in the relevant year’s Performance Tables as community schools, six as foundation schools, one was VA, one was an academy converter and the five remaining were sponsored academies.

Of the 14 that were not originally academies, seven have since become sponsored academies and one is discussing the prospect. Two more have closed, so just five – 25% of the sample – remain outside the academies sector.

All but two of the schools are mixed (the other two are boys’ schools). Four are modern schools and the remainder are comprehensives.

Geographically they are concentrated in the Midlands and the North, with a few in the South-West and the extreme South-East. There are no representatives from London, the East or the North-East.

.

Performance of the Core Sample

Table 2 below looks at key Performance Table results for these schools. I have retained the separation by year and the order in which the schools appear, which reflects their performance on the GCSE threshold measure, with the poorest performing at the top of each section.

.

Table 2: Performance of schools falling below proposed high attainer floor targets 2011-2013

Name | No of HA | % HA | 5+ A*-C incl E+M | 3+ LoP En | 3+ LoP Ma | APS (GCSE)
2011
Carter Community School | 9 | 13 | 56 | 56 | 44 | 304.9
Hadden Park High School | 15 | 13 | 60 | 40 | 20 | 144.3
Merchants Academy | 19 | 19 | 68 | 58 | 42 | 251.6
The Robert Napier School | 28 | 12 | 68 | 39 | 46 | 292.8
Bishop of Rochester Academy | 10 | 5 | 70 | 50 | 60 | 298.8
2012
The Rushden Community College | 35 | 26 | 3 | 0 | 54 | 326.5
Culverhay School | 10 | 19 | 30 | 40 | 20 | 199.3
Raincliffe School | 6 | 11 | 50 | 50 | 33 | 211.5
The Coseley School | 35 | 20 | 60 | 51 | 60 | 262.7
Fleetwood High School | 34 | 22 | 62 | 38 | 24 | 272.9
John Spendluffe Foundation Technology College | 14 | 12 | 64 | 50 | 43 | 283.6
Parklands High School | 13 | 18 | 69 | 23 | 8 | 143.7
Frank F Harrison Engineering College | 20 | 12 | 70 | 35 | 60 | 188.3
2013
Gloucester Academy | 18 | 13 | 44 | 28 | 50 | 226.8
Christ the King Catholic and Church of England VA School | 22 | 22 | 55 | 32 | 41 | 256.5
Aireville School | 23 | 23 | 61 | 35 | 57 | 267.9
Manchester Creative and Media Academy for Boys | 16 | 19 | 63 | 50 | 50 | 244.9
Fearns Community Sports College | 22 | 13 | 64 | 36 | 59 | 306.0
Unity College Blackpool | 21 | 18 | 67 | 57 | 52 | 277.1
The Mirus Academy | 23 | 13 | 70 | 57 | 52 | 201.4

.

The size of the high attainer population in these schools varies between 6 (the minimum for which statistics are published) and 35, with an average of just under 20.

The percentage of high attainers within each school’s cohort ranges from 5% to 26% with an average of slightly over 16%.

This compares with a national average in 2013 for all state-funded schools of 32.4% – almost twice the average share in this sample. All 20 schools here record a high attainer population significantly below this national average.

This pattern may be significant – tending to support the case that high attainers are more likely to struggle in schools where they are less strongly concentrated – but it does not prove the relationship.

Achievement against the GCSE threshold measure falls as low as 3% (Rushden in 2012) but this was reportedly attributable to the school selecting ineligible English specifications.

Otherwise the poorest result is 30% at Culverhay, also in 2012, followed by Gloucester Academy (44% in 2013) and Raincliffe (50% in 2012). Only these four schools have recorded performance at or below 50%.

Indeed there is a very wide span of performance even amongst these small samples, especially in 2012 when it reaches an amazing 67 percentage points (40 percentage points excluding Rushden). In 2013 there was a span of 26 percentage points and in 2011 a span of 14 percentage points.

The overall average amongst the 20 schools is almost 58%. This varies by year. In 2011 it was 64%, in 2012 it was significantly lower at 51% (but rose to 58% if Rushden is excluded) and in 2013 it was 61%.

This compares with a national average for high attainers in state-funded schools of 94.7% in 2013. The extent to which some of these outlier schools are undershooting the national average is truly eye-watering.

Turning to the progress measures, one might expect even greater variance, given that so many more schools fail to clear this element of the official floor targets with their high attainers.

The overall average across these 20 schools is 41% in English and 44% in maths, suggesting that performance is slightly stronger in maths than English.

But in 2011 the averages were 49% in English and 42% in maths, reversing this general pattern and producing a much wider gap in favour of English.

In 2012 they were 36% in English and 38% in maths, but the English average improves to 41% if Rushden’s result is excluded. This again bucks the overall trend.

The overall pattern is driven by the 2013 figures, when the average for maths stood at 52% compared with 42% for English.

Hence, over the three years, we can see that the sharp drop in English in 2012 – most probably attributable to the notorious marking issue – was barely recovered in 2013. Conversely, a drop in maths in 2012 was followed by a sharp recovery in 2013.

The small sample size calls into question the significance of these patterns, but they are interesting nevertheless.

The comparable national averages among all state-funded schools in 2013 were 86.2% in English and 87.8% in maths. So the schools in this sample are typically operating at around half the national average levels. This is indeed worse than the comparable record on the threshold measure.

That said, the variation in these results is again huge – 35 percentage points in English (excluding Rushden) and as much as 52 percentage points in maths.

There is no obvious pattern in these schools’ comparative performance in English and maths. Ten schools scored more highly in English and nine in maths, with one school recording equally in both. English was in the ascendancy in 2011 and 2012, but maths supplanted it in 2013.
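As a quick arithmetic check, the sample-wide averages quoted above can be recomputed directly from the Table 2 figures. A minimal Python sketch, with the values transcribed from the table:

```python
# Recomputing the sample-wide averages from the Table 2 figures.
# threshold = % 5+ A*-C incl E+M; en/ma = % making 3+ levels of progress.
threshold = [56, 60, 68, 68, 70,             # 2011
             3, 30, 50, 60, 62, 64, 69, 70,  # 2012
             44, 55, 61, 63, 64, 67, 70]     # 2013
en = [56, 40, 58, 39, 50,
      0, 40, 50, 51, 38, 50, 23, 35,
      28, 32, 35, 50, 36, 57, 57]
ma = [44, 20, 42, 46, 60,
      54, 20, 33, 60, 24, 43, 8, 60,
      50, 41, 57, 50, 59, 52, 52]

mean = lambda xs: sum(xs) / len(xs)
print(round(mean(threshold)))  # 58 – the 'almost 58%' threshold average
print(round(mean(en)))         # 41 – English progress average
print(round(mean(ma)))         # 44 – maths progress average
```

The same lists can be sliced by year (5, 8 and 7 schools respectively) to reproduce the per-year averages and the Rushden-excluded variants.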

The final column in Table 2 shows the average point score (APS) for high attainers’ best eight GCSE results. There is once more a very big range, from 144.3 to 326.5 – over 180 points – compared with a 2013 national average for high attainers in state-funded schools of 377.6.

The schools at the bottom of the distribution are almost certainly relying heavily on GCSE-equivalent qualifications, rather than pushing their high attainers towards GCSEs.

Those schools that record relatively high APS alongside relatively low progress scores are most probably taking their high attaining learners with L5 at KS2 to GCSE grade C, but no further.

.

Changes in Performance from 2011 to 2013

Table 3, below, shows how the performance of the 2011 sample changed in 2012 and 2013, while Table 4 shows how the 2012 sample performed in 2013.

The numbers in green show improvements compared with the schools’ 2011 baselines and those in bold are above my illustrative high attainer floor target. The numbers in red are those which are lower than the schools’ 2011 baselines.

.

Table 3: Performance of the 2011 Sample in 2012 and 2013

Name | % HA (11/12/13) | 5+ A*-C incl E+M (11/12/13) | 3+ LoP En (11/12/13) | 3+ LoP Ma (11/12/13)
Carter Community School | 13/14/13 | 56/100/92 | 56/80/75 | 44/80/33
Hadden Park High School | 13/15/8 | 60/87/75 | 40/80/75 | 20/53/50
Merchants Academy | 19/16/20 | 68/79/96 | 58/79/88 | 42/47/71
The Robert Napier School | 12/12/11 | 68/83/96 | 39/59/92 | 46/62/80
Bishop of Rochester Academy | 5/7/8 | 70/83/73 | 50/67/47 | 60/75/53

.

All but one of the five schools showed little variation in the relative size of their high attainer populations over the three years in question.

More importantly, all five schools made radical improvements in 2012.

Indeed, all five exceeded the 5+ GCSE threshold element of my illustrative floor target in both 2012 and 2013 though, more worryingly, three of the five fell back somewhat in 2013 compared with 2012, which might suggest that short term improvement is not being fully sustained.

Four of the five exceeded the English progress element of the illustrative floor target in 2012, while the fifth – Robert Napier – missed by only one percentage point.

Four of the five also exceeded the floor in 2013, including Robert Napier, which made a 33 percentage point improvement compared with 2012. On this occasion, Bishop of Rochester was the exception, having fallen back even below its 2011 level.

In the maths progress element, all five schools made an improvement in 2012, with three of the five exceeding the floor target; the exceptions were Hadden Park and Merchants Academy.

But by 2013, only three schools remained above their 2011 baseline and only two – Merchants and Robert Napier – remained above the floor target.

None of the five schools would have remained below my floor target in either 2012 or 2013, by virtue of their improved performance on the 5+ GCSE threshold element, but there was significantly greater insecurity in the progress elements, especially in maths.

There is also evidence of huge swings in performance on the progress measures. Hadden Park improved progression in English by 40 percentage points between 2011 and 2012. Carter Community School almost matched this in maths, improving by 36 percentage points, only to fall back by a huge 47 percentage points in the following year.

Overall this would appear to suggest that this small sample of schools made every effort to improve against the threshold and progress measures in 2012 but, while most were able to sustain improvement – or at least control their decline – on the threshold measure into 2013, this was not always possible with the progress elements.

There is more than a hint of two markedly different trajectories, with one group of schools managing to sustain initial improvements from a very low base and the other group falling back after an initial drive.

Is the same pattern emerging amongst the group of schools that fell below my high attainer floor target in 2012?

.

Table 4: Performance of the 2012 Sample in 2013

Name | % HA (12/13) | 5+ A*-C incl E+M (12/13) | 3+ LoP En (12/13) | 3+ LoP Ma (12/13)
The Rushden Community College | 26/23 | 3/90 | 0/74 | 54/87
Culverhay School | 19/12 | 30/67 | 40/67 | 20/67
Raincliffe School | 11/– | 50/– | 50/– | 33/–
The Coseley School | 20/26 | 60/88 | 51/82 | 60/78
Fleetwood High School | 22/24 | 62/84 | 38/36 | 24/67
John Spendluffe Foundation Technology College | 12/15 | 64/100 | 50/61 | 43/83
Parklands High School | 18/11 | 69/78 | 23/56 | 8/56
Frank F Harrison Engineering College | 12/13 | 70/70 | 35/57 | 60/52

.

We must rule out Raincliffe, which closed, leaving seven schools under consideration.

Some of these schools experienced slightly more fluctuation in the size of their high attainer populations – and over the shorter period of two years rather than three.

Six of the seven managed significant improvements in the 5+ GCSE threshold with the remaining school – Frank F Harrison – maintaining its 2012 performance.

Two schools – Frank F Harrison and Culverhay – did not exceed the illustrative floor on this element. Meanwhile John Spendluffe achieved a highly creditable perfect score, comfortably exceeding the national average for state-funded schools. Rushden was not far behind.

There was greater variability with the progress measures. In English, three schools remained below the illustrative floor in 2013 with one – Fleetwood High – falling back compared with its 2012 performance.

Conversely, Coseley improved by 31 percentage points to not far below the national average for state-funded schools.

In maths two schools failed to make it over the floor. Parklands made a 48 percentage point improvement but still fell short, while Frank F Harrison fell back eight percentage points compared with its 2012 performance.

On the other hand, Rushden and John Spendluffe are closing in on national average performance for state-funded schools. Both have made improvements of over 30 percentage points.

Of the seven, only Frank F Harrison would remain below my overall illustrative floor target on the basis of its 2013 performance.

Taking the two samples together, the good news is that many struggling schools are capable of making radical improvements in their performance with high attainers.

But question marks remain over the capacity of some schools to sustain initial improvements over subsequent years.

 .

What Interventions Have Impacted on these Schools?

Table 5 below reveals how different accountability and school improvement interventions have been brought to bear on this sample of 20 schools since 2011.

.

Table 5: Interventions Impacting on Sample Schools 2011-2014

Name | Below floor targets | Most recent inspection | Ofsted rating | (Pre-) warning notice | Academised
2011
Carter Community School | FT 2011, FT 2013 | 29/11/12; NYI as academy | 2 | – | Sponsored
Hadden Park High School | FT 2011, FT 2012, FT 2013 | 13/11/13; NYI as academy | SM | – | Sponsored
Merchants Academy | FT 2011, FT 2012 | 9/6/11 | 2 | – | –
The Robert Napier School | FT 2011, FT 2012 | 17/09/09; NYI as academy | 3 | – | Sponsored
Bishop of Rochester Academy | FT 2011, FT 2013 | 28/6/13 | 3 | PWN 3/1/12 | –
2012
The Rushden Community College | FT 2012 | 10/11/10; NYI as academy | 3 | – | Sponsored
Culverhay School | FT 2011, FT 2012, (FT 2013) | 11/1/12; NYI as academy | SM | – | Sponsored
Raincliffe School | FT 2012 | 19/10/10 | 3 | – | Closed
The Coseley School | FT 2012 | 13/9/12 | SM | – | –
Fleetwood High School | FT 2012, FT 2013 | 20/3/13 | SWK | – | –
John Spendluffe Foundation Technology College | FT 2012 | 3/3/10; as academy 18/9/13 | 1; 2 as academy | – | Academy converter 9/11
Parklands High School | FT 2011, FT 2012, FT 2013 | 5/12/13 | SM | – | Discussing sponsorship
Frank F Harrison Engineering College | FT 2011, FT 2012, (FT 2013) | 5/7/11; see Mirus Academy below | 3 | – | Now Mirus Academy (see below)
2013
Gloucester Academy | FT 2011, FT 2012, FT 2013 | 4/10/12 | SWK | PWN 16/9/13; WN 16/12/13 | –
Christ the King RC and CofE VA School | FT 2011, FT 2012, FT 2013 | 18/9/12 | SM | – | Closed
Aireville School | FT 2012, FT 2013 | 15/5/13 | SM | – | –
Manchester Creative and Media Academy for Boys | FT 2011, FT 2012, FT 2013 | 13/6/13 | SWK | PWN 3/1/12 | –
Fearns Community Sports College | FT 2011, FT 2013 | 28/6/12 | 3 | – | –
Unity College Blackpool | FT 2011, FT 2012, FT 2013 | 9/11/11; NYI as academy | 3 | – | Sponsored
The Mirus Academy | FT 2013 | 7/11/13 | SM | – | –

Key: FT = below the official floor targets in that year (brackets indicate the school had changed status by then); NYI = not yet inspected; SM = special measures; SWK = serious weaknesses; PWN = pre-warning notice; WN = warning notice.

 .

Floor Targets

The first and obvious point to note is that every single school in this list fell below the official floor targets in the year in which they also undershot my illustrative high attainers’ targets.

It is extremely reassuring that none of the schools returning particularly poor outcomes with high attainers are deemed acceptable performers in generic terms. I had feared that a few schools at least would achieve this feat.

In fact, three-quarters of these schools have fallen below the floor targets in at least two of the three years in question, while eight have done so in all three years, two having changed their status by becoming academies in the final year (which, strictly speaking, prevents them from scoring the hat-trick). One has since closed.

Some schools appear to have been spared intervention by receiving a relatively positive Ofsted inspection grade despite their floor target records. For example, Carter Community School had a ‘good’ rating sandwiched between two floor target appearances, while Merchants Academy presumably received its good rating before subsequently dropping below the floor.

John Spendluffe managed an outstanding rating two years before it dropped below the floor target and was rated good – in its new guise as an academy – a year afterwards.

The consequences of falling below the floor targets are surprisingly unclear, as indeed are the complex rules governing the wider business of intervention in underperforming schools.

DfE press notices typically say something like:

‘Schools below the floor and with a history of underperformance face being taken over by a sponsor with a track record of improving weak schools.’

But of course that can only apply to schools that are not already academies.

Moreover, LA-maintained schools may appeal to Ofsted against standards and performance warning notices issued by their local authorities; and schools and LAs may also challenge forced academisation in the courts, arguing that they have sufficient capacity to drive improvement.

As far as I can establish, it is nowhere clearly explained what exactly constitutes a ‘history of underperformance’, so there is inevitably a degree of subjectivity in the application of this criterion.

Advice elsewhere suggests that a school’s inspection outcomes and ‘the local authority’s position in terms of securing improvement as a maintained school’ should also be taken into account alongside achievement against the floor targets.

We do not know what weighting is given to these different sources of evidence, nor can we rule out the possibility that other factors – tangible or intangible – are also weighed in the balance.

Some might argue that this gives politicians the necessary flexibility to decide each case on its merits, taking careful account of the unique circumstances that apply rather than imposing a standard set of cookie-cutter judgements.

Others might counter that the absence of standard criteria, imposed rigorously but with flexibility to take additional special circumstances into account, lays such decisions unnecessarily open to dispute and is likely to generate costly and time-consuming legal challenge.

.

Academy Warning Notices

When it comes to academies:

‘In cases of sustained poor academic performance at an academy, ministers may issue a pre-warning notice to the relevant trust, demanding urgent action to bring about substantial improvements, or they will receive a warning notice. If improvement does not follow after that, further action – which could ultimately lead to a change of sponsor – can be taken. In cases where there are concerns about the performance of a number of a trust’s schools, the trust has been stopped from taking on new projects.’

‘Sustained poor academic performance’ may or may not be different from a ‘history of underperformance’ and it too escapes definition.

One cannot but conclude that it would be very helpful indeed to have some authoritative guidance, so that there is much greater transparency in the processes through which these various provisions are being applied, to academies and LA-maintained schools alike.

In the absence of such guidance, it seems rather surprising that only three of the academies in this sample – Bishop of Rochester, Gloucester and Manchester Creative and Media – have received pre-warning letters to date, while only Gloucester’s has been superseded by a full-blown warning notice. None of these mention specifically the underperformance of high attainers.

  • Bishop of Rochester received its notice in January 2012, but subsequently fell below the floor targets in both 2012 and 2013 and – betweentimes – received an Ofsted inspection rating of 3 (‘requires improvement’).
  • Manchester Creative and Media also received its pre-warning notice in January 2012. It too has been below the floor targets in both 2012 and 2013 and was deemed to have serious weaknesses in a June 2013 inspection.
  • Gloucester received its pre-warning notice much more recently, in September 2013, followed by a full warning notice just three months later.

These pre-warning letters invite the relevant trusts to set out within 15 days what action they will take to improve matters, whereas the warning notices demand a series of specific improvements within a tight deadline. (In the case of Gloucester Academy, the notice issued on 16 December 2013 imposed a deadline of 15 January 2014. We do not yet know the outcome.)

Other schools in my sample have presumably been spared a pre-warning letter because of their relatively recent acquisition of academy status, although several other 2012 openers have already received them. One anticipates that more will attract such attention in due course.

 .

Ofsted Inspection

The relevant columns of Table 5 reveal that, of the 12 schools that are now academies (taking care to count Harrison/Mirus as one rather than two), half have not yet been inspected in their new guise.

As noted above, it is strictly the case that, when schools become academies – whether sponsored or via conversion – they are formally closed and replaced by successor schools, so the old inspection reports no longer apply to the new school.

However, this does not prevent many academies from referring to such reports on their websites – and they do have a certain currency when one wishes to see whether or not a recently converted academy has been making progress.

But, if we accept the orthodox position, there are only six academies with bona fide inspection reports: Merchants, Bishop of Rochester, John Spendluffe, Gloucester, Manchester Creative and Media and Mirus.

All five of the LA-maintained schools still open have been inspected fairly recently: Coseley, Fleetwood, Parklands, Aireville and Fearns.

This gives us a sample of 11 schools with valid inspection reports:

  • Two academies are rated ‘good’ (2)  – Merchants and John Spendluffe;
  • One academy – Bishop of Rochester – and one LA-maintained school –  Fearns – ‘require improvement’ (3);
  • Two academies – Gloucester and Manchester – and one LA-maintained school – Fleetwood – are inadequate (4) with serious weaknesses; and
  • One academy – Mirus – and three LA-maintained schools – Parklands, Coseley and Aireville – are inadequate (4) and in Special Measures.

The School Inspection Handbook explains the distinction between these two variants of ‘inadequate’:

‘A school is judged to require significant improvement where it has serious weaknesses because one or more of the key areas is ‘inadequate’ (grade 4) and/or there are important weaknesses in the provision for pupils’ spiritual, moral, social and cultural development. However, leaders, managers and governors have been assessed as having the capacity to secure improvement

…A school requires special measures if:

  • it is failing to give its pupils an acceptable standard of education and
  • the persons responsible for leading, managing or governing are not demonstrating the capacity to secure the necessary improvement in the school.’

Schools in each of these categories are subject to more frequent monitoring reports. Those with serious weaknesses are typically re-inspected within 18 months, while, for those in special measures, the timing of re-inspection depends on the school’s rate of improvement.

It may be a surprise to some that only seven of the 11 are currently deemed inadequate given the weight of evidence stacked against them.

There is some support for the contention that Ofsted inspection ratings, floor target assessments and pre-warning notices do not always link together as seamlessly as one might imagine, although apparent inconsistencies may sometimes arise from the chronological sequence of these different judgements.

But what do these 11 reports say, if anything, about the performance of high attainers? Is there substantive evidence of a stronger focus on ‘the most able’ in those reports issued since September 2013?

.

The Content of Ofsted Inspection Reports

Table 6, below, sets out what each report contains on this topic, presenting the schools in the order of their most recent inspection.

One might therefore expect the judgements to be more specific and explicit in the three reports at the foot of the table, which should reflect the new guidance introduced last September. I discussed that guidance at length in this October 2013 post.

.

Table 6: Specific references to high attainers/more able/most able in inspection reports

Name Date Outcome Comments
Merchants Academy 29/6/11 Good (2) In Year 9… an impressive proportion of higher-attaining students… have been entered early for the GCSE examinations in mathematics and science. Given their exceptionally low starting points on entry into the academy, this indicates that these students are making outstanding progress in their learning and their achievement is exceptional. More-able students are fast-tracked to early GCSE entry and prepared well to follow the International Baccalaureate route.
Fearns Community Sports College 28/6/12 Requires improvement (3) Setting has been introduced across all year groups to ensure that students are appropriately challenged and supported, especially more-able students. This is now beginning to increase the number of students achieving higher levels earlier in Key Stage 3.
The Coseley School 13/9/12 Special Measures (4) Teaching is inadequate because it does not always extend students, particularly the more able. What does the school need to do to improve further? Raise achievement, particularly for the most able, by ensuring that:

  • work consistently challenges and engages all students so that they make good progress in lessons
  • challenging targets are set as a minimum expectation
  • students do not end studies in English language and mathematics early without having the chance to achieve the best possible grade
  • GCSE results in all subjects are at least in line with national expectations.

Target setting is not challenging enough for all ability groups, particularly for the more-able students who do not make sufficient progress by the end of Key Stage 4.

Gloucester Academy 4/10/12 Serious Weaknesses (4) No specific reference
Fleetwood High School 20/3/13 Serious Weaknesses (4) No specific reference
Aireville School 15/5/13 Special Measures (4) Teachers tend to give the same task to all students despite a wide range of ability within the class. Consequently, many students will complete their work and wait politely until the teacher has ensured the weaker students complete at least part of the task. This limits the achievement of the more-able students and undermines the confidence of the least-able. There is now a good range of subjects and qualifications that meet the diverse needs and aspirations of the students, particularly the more-able students.
Manchester Creative and Media Academy for Boys 13/6/13 Serious Weaknesses(4) The most-able boys are not consistently challenged to attain at the highest levels. In some lessons they work independently and make rapid progress, whereas on other occasions their work is undemanding.What does the academy need to do to improve further?Improve the quality of teaching in Key Stages 3 and 4 so that it is at least good leading to rapid progress and raised attainment for all groups of boys, especially in English, mathematics and science by…  ensuring that tasks are engaging and challenge all students, including the most-able.The most-able boys receive insufficient challenge to enable them to excel. Too many lessons donot require them to solve problems or link their learning to real-life contexts.In some lessons teachers’ planning indicates that they intend different students to achieve different outcomes, but they provide them all with the same tasks and do not adjust the pace or nature of work for higher- or lower-attaining students. This results in a slow pace of learning and some boys becoming frustrated.
Bishop of Rochester Academy 28/6/13 Requires improvement (3) No specific reference
John Spendluffe Foundation Technology College 18/9/13 Good (2) Not enough lessons are outstanding in providing a strong pace, challenge and opportunities for independent learning, particularly for the most able. The 2013 results show a leap forward in attainment and progress, although the most able could still make better progress. Leadership and management are not outstanding because the achievement of pupils, though improving quickly, has not been maintained at a high level over a period of time, and a small number of more-able students are still not achieving their full potential.
The Mirus Academy 7/11/13 Special Measures (4) The academy’s early entry policy for GCSE has made no discernible difference to pupils’ achievement, including that of more able pupils.
Parklands High School 5/12/13 Special Measures (4) The achievement of students supported by the pupil premium generally lags behind that of their classmates. All groups, including the most able students and those who have special educational needs, achieve poorly. Students who join the school having achieved Level 5 in national Key Stage 2 tests in primary school fare less well than middle attainers, in part due to early GCSE entry. They did a little better in 2013 than in 2012.

.

There is inconsistency within both parts of the sample – the first eight reports that pre-date the new guidance and the three produced subsequently.

Three of the eleven reports make no specific reference to high attainers/most able learners, all of them undertaken before the new guidance came into effect.

In three more cases the references are confined to early entry or setting, one of those published since September 2013.

Only four of the eleven make what I judge to be substantive comments:

  • The Coseley School (special measures) – where the needs of the most able are explicitly marked out as an area requiring improvement;
  • The Manchester Creative and Media Academy for Boys (serious weaknesses) – where attention is paid to the most able throughout the report;
  • John Spendluffe Foundation Technology College (good) – which includes some commentary on the performance of the most able; and
  • Parklands High School (special measures) – which also provides little more than the essential minimum coverage.

The first two predate the new emphasis on the most able, but they are comfortably the most thorough. It is worrying that not all reports published since September are taking the needs of the most able as seriously as they might.

One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.

.

Conclusion

This post established an illustrative floor target to identify a small sample of 20 schools that have demonstrated particularly poor performance with high attainers in the Performance Tables for 2011, 2012 or 2013.

It:

  • Compared the performance of these schools in the year in which they fell below the floor, noting significant variance by year and between institutions, but also highlighting the fact that the proportion of high attainers attending these schools is significantly lower than the national average for state-funded schools.
  • Examined the subsequent performance of schools below the illustrative floor in 2011 and 2012, finding that almost all made significant improvements in the year immediately following, but that some of the 2011 cohort experienced difficulty in sustaining this improvement across all elements into a second year. It seems that progress in English, maths or both is more vulnerable to slippage than the 5+ A*-C GCSE threshold measure.
  • Confirmed – most reassuringly – that every school in the sample fell below the official, generic floor targets in the year in which they also undershot my illustrative high attainer floor targets.
  • Reviewed the combination of assessments and interventions applied to the sample of schools since 2011, specifically the interaction between academisation, floor targets, Ofsted inspection and (pre)warning notices for academies. These do not always point in the same direction, although chronology can be an extenuating factor. New guidance about how these and other provisions apply and interact would radically improve transparency in a complex and politically charged field.
  • Analysed the coverage of high attainers/most able students in recent inspection reports on 11 schools from amongst the sample of 20, including three published after September 2013 when new emphasis on the most able came into effect. This exposed grave inconsistency in the scope and quality of the coverage, both before and after September 2013, which did not correlate with the grade of the inspection. Inspectors would benefit from succinct additional guidance.

In the process of determining which schools fell below my high attainers floor target, I also identified the schools that undershot one or other of the elements but not both. This wider group included 46 schools in 2011, 52 schools in 2012 and 34 schools in 2013.

Several of these schools reappear in two or more of the three years, either in their existing form or following conversion to academy status.

Together they constitute a ‘watch list’ of more than 100 institutions, the substantial majority of which remain vulnerable to continued underperformance with their high attainers for the duration of the current accountability regime.

The chances are that many will also continue to struggle following the introduction of the new ‘progress 8’ floor measure from 2015.

Perhaps unsurprisingly, the significant majority are now sponsored academies.

I plan to monitor their progress.

.

*Apologies for this rather tabloid title!

.

GP

February 2014

How High Attainers Feature in Ofsted Inspection and School Performance Tables (and what to do about it) – Part Two

.

This is the second and final part of a post about how the school accountability system reflects the performance of high attaining learners.

Part One considered recent amendments to Ofsted’s inspection guidance and compared Ofsted’s approach with how high attainers are defined in the School Performance Tables. It reviewed expectations of progression by high attainers and proposed that these should be increased.

Part Two:

  • Reviews how the next School Performance Tables (2013 results) will feature high attainers, compared with the current Tables (2012 results).
  • Explains how high attainers feature in the proposals for assessment and accountability reform set out in three recent consultation documents – covering primary, secondary and post-16 education respectively. This takes in the recently published Government response to the secondary consultation.
  • Offers guidance for schools on how they might set about planning to improve the performance of their high attainers, given current accountability arrangements and future prospects and
  • Proposes for further discussion a basket of key indicators that schools might publish and pursue alongside learners, parents and other stakeholders.

I have adopted a simple taxonomy of measures throughout the discussion that follows.

This distinguishes measures relating specifically to high attainers according to which of the following they feature:

  • Attainment: the achievement of specified grades or levels in assessments conducted at the end of a key stage of education.
  • Progress: the expected trajectory between two or more of these assessments, consistent with achieving commensurate outcomes in each.
  • Destination: the nature of the educational or other setting to which learners have moved at the end of a key stage.
  • Closing the Gap: the difference between the outcomes of disadvantaged and other learners on any of these measures, whether this is falling and, if so, by how much.
  • Comparison: how the performance of schools/colleges on any of these measures compares with broadly similar institutions.

I have also flagged up related measures of high attainment – and measures which reflect high attainment – where these have been applied to the entire cohort rather than separately to high attainers.

.

High Attainers in the 2012 and 2013 Performance Tables

We begin with a comparison between the 2012 and 2013 School Performance Tables covering the primary, secondary and 16-18 tables respectively.

The details of reporting in the 2013 Tables are drawn from the published Statement of Intent. This confirms that they should be published according to the standard timetable, in mid-December 2013 (primary) and late January 2014 (secondary and post-16) respectively, with any data not then ready for publication added as and when it becomes available.

For ease of reference I have included in brackets the national figure for each high attainer measure in the 2012 Tables (state-funded schools only).

 .

Primary Tables

Significant changes will be apparent in 2013. These are consequential on: the introduction of Level 4B+ and Level 6 performance; the removal of an overall level in English; the introduction of the grammar, punctuation and spelling (GPS) test and the addition of three year averages for specified measures.

Level 4B+ has been introduced as a new marker of ‘secondary readiness’. The reason given is that analysis of sub levels showed that, in 2012, only 47% of those with 4C in both English and maths went on to achieve 5 A*-C grade GCSEs including English and maths, while the comparable percentages for 4B and 4A were 72% and 81% respectively.

I pause only to note that, if the threshold is raised in this manner, there is even stronger logic behind the idea of raising the threshold for high attainers in parallel – an idea I floated in Part One of this post.

The table below compares the 2012 and 2013 measures using the taxonomy set out above.

 .

High attainers measures

Attainment
  • 2012: % achieving L3- (0%), L4+ (99%) and L5+ (72%) in both English and maths; KS1-2 value added measure (English and maths) (99.8)
  • 2013: % achieving L3- and L5+ in all three of the reading and maths tests and writing TA; % achieving L4+ and L4B+ in the reading test; % achieving L4+ in writing TA; % achieving L3-, L4+, L4B+, L5+ and L6 in the GPS test; % achieving L3-, L4+, L4B+, L5+ and L6 in the maths test; KS1-2 VA (reading, writing and maths)

Progress
  • 2012: % making at least expected progress in English (87%) and maths (92%)
  • 2013: % making at least expected progress in each of reading, writing and maths

Destinations
  • 2012: None
  • 2013: None

Closing the gap
  • 2012: None
  • 2013: None

Comparison
  • 2012: % achieving L4+ in English and maths (99%)
  • 2013: % achieving L4+ in reading, writing and maths

High attainment measures
  • 2012: % achieving L5+ in reading and maths tests and writing TA (20%); % achieving L5+ in English (37%), in maths (39%) and in reading (48%); % achieving L5+ in English TA (36%), maths TA (40%), science TA (36%), reading TA (46%) and writing TA (28%); average point score (28.2)
  • 2013: % achieving L5+ and L6 in the reading test; % achieving L5+ and L6 in writing TA; % achieving L5+ and L6 in English, reading, maths and science TA; average point score (reading, writing and maths)

.

The number of attainment measures applied specifically to high attainers has increased, but it is not entirely clear why so many different combinations of levels will be reported for different elements (there are four variants in all). The extent of L6 attainment is clearly a factor, but cannot be the sole reason.

For the first time we will be able to compare the performance of disadvantaged and other learners for both L5 and L6 in GPS and maths, at least in theory, since very few schools are likely to have sufficient L6 performance by disadvantaged learners to register on this measure.

But this is a high attainment measure. We still cannot see what proportion of high attainers are disadvantaged – and how their performance compares with their peers.

A national primary destinations measure is not really feasible and would tell us little, unless receiving secondary schools are categorised on the basis of their performance in the secondary tables or their Ofsted rating. It might be interesting to see what proportion of high attainers (particularly disadvantaged high attainers) transfer to secondary schools deemed outstanding – compared with middle and low attainers – but the benefits would be limited. However, this might be more relevant at local level.

The basis of the comparison with other schools is important to understand, since this methodology will be carried forward in the accountability reforms reviewed below.

The probability of each pupil achieving KS2 L4+ in both English and maths is calculated, based on the achievement of pupils with the same prior attainment at KS1. These probabilities are averaged to provide a figure for the school. Then a similar school group is established by selecting the 62 schools with immediately stronger average performance and the 62 with immediately weaker average performance, giving a group of 125 schools. An average level per pupil is calculated from the average points score.

The same methodology is used in the secondary tables. Here the calculation is built on the probability of achieving 5+ A*-C grades in GCSE, or equivalent, plus English and maths GCSEs, and based on the achievement of pupils with the same prior attainment at KS2. In this case the similar school group is derived from the 27 schools immediately above and the 27 immediately below, giving a group of 55 comparator schools.
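
The grouping logic described in the two paragraphs above can be sketched in a few lines of Python. The function name, the (name, probability) data layout and the clipping at the ends of the ranking are illustrative assumptions of mine; only the half-widths (62 for the primary tables, 27 for the secondary tables) come from the published methodology.

```python
def similar_school_group(schools, target, half_width):
    """Return the comparator group for `target`.

    `schools` is a list of (name, avg_probability) pairs, where
    avg_probability is the school's mean pupil-level probability of
    reaching the headline threshold, given prior attainment.
    `half_width` is 62 for the primary tables (a group of 125)
    or 27 for the secondary tables (a group of 55).
    """
    # Rank all schools by their average probability.
    ranked = sorted(schools, key=lambda s: s[1])
    i = [name for name, _ in ranked].index(target)
    # Take the schools immediately below and above in the ranking,
    # plus the target itself; clipping at the extremes is an
    # assumption - the methodology does not say how ends are handled.
    lo = max(0, i - half_width)
    hi = min(len(ranked), i + half_width + 1)
    return ranked[lo:hi]
```

So a mid-ranking primary school is compared against its 124 nearest neighbours in the probability ranking, and a secondary school against its 54.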

 .

Secondary Tables

Compared with the extensive changes to the Primary Tables in 2013, there is very little change in the Secondary Tables, apart from the introduction of an average grade per pupil (best 8) measure.

 .

High attainers measures

Attainment (2012, retained for 2013)
  • % achieving 5+ A*-C GCSEs or equivalent including GCSEs in English and maths (94%)
  • % achieving grades A*-C in GCSE English and maths (94.3%)
  • APS (best 8) for all qualifications (398.5) and GCSEs only (375.4)
  • Average grade per qualification and per GCSE
  • Average entries per pupil for all qualifications (12.4) and GCSEs (9.7)
  • % entered for all EBacc subjects (46.3%); % achieving all EBacc qualifications (38.5%)
  • EBacc VA in English (1000.2), maths (1000.1), science (1000.4), humanities (1000.8) and languages (1000.2)
  • New for 2013: average grade per pupil (best 8)

Progress (2012, retained for 2013)
  • % making at least expected progress in English (96.6%) and maths (96.8%)
  • VA (best 8) (1000.8)

Destinations
  • 2012: None (see below)
  • 2013: Published data included

Closing the Gap
  • 2012 and 2013: None

Comparisons (2012, retained for 2013)
  • % achieving 5+ A*-C GCSEs or equivalent, including GCSEs in English and maths (94%)

High attainment measures
  • 2012 and 2013: None

 .

I have not included in the high attainment section the few additional attainment measures not applied to high attainers (percentage achieving 5+ GCSEs at A*-C or equivalent (83%); percentage entered for and achieving EBacc subjects – English, maths, science, humanities and languages). These are not specifically high attainment measures and tell us relatively little.

As in the Primary Tables, there is no analysis of schools’ success in closing the gap for high attainers. And, since there are no substantive high attainment measures in the Secondary Tables (with the possible exception of the EBacc), this omission is comparatively more significant.

Whereas we can see some L5+ and L6 performance for disadvantaged learners in the primary tables, there is no equivalent focus – say on 5+ GCSEs at A*/A including English and maths – in the secondary tables.

Destinations measures have already been published separately for 2010/11 and education destinations will be included in the 2013 Performance Tables, but the breakdown of destinations data by pupil characteristics does not include a prior attainment category.

(Compare this with the more cautious statements about the longer term use of KS4 destinations data set out below. The Statement of Intent is comparatively more cautious about the KS5 data.)

We have also had the separate announcement that Performance Tables will henceforward record only a learner’s first entry for any given GCSE examination (or IGCSE or Level 1/Level 2 Certificate).

This is too late for the imminent 2013 Tables, but will impact on the 2014 Tables (published in January 2015), when it will bite on all EBacc subjects. From 2015 it will impact on all subjects.

There are two schools of thought about the potential impact on high attainers.

One might argue that this change should not affect them unduly, since they are much more likely to be entered early because ready to achieve a commensurately high grade, rather than to ‘bank’ a Grade C which they may or may not subsequently seek to improve via retakes.

As the DfE announcement says:

‘If schools are confident that pupils will achieve well even when entered early and that early entry is therefore in the interests of the pupil, they should not need to make any changes to entry plans.’

On the other hand, we have already seen in Part One that Ofsted have included in the Subsidiary Guidance supporting inspection the advice that:

‘Inspectors should investigate whether a policy of early entry to GCSE for pupils is preventing them from making as much progress as they should, for example because…

  • The widespread use of early GCSE entry and repeated sitting of examinations has encouraged short-term gains in learning but has led to underachievement at GCSE, particularly for able pupils
  • Opportunities are missed to meet the needs of high-attaining pupils through depth of GCSE study and additional qualifications.’ (para 34)

Schools would do well to ensure that their plans to improve high attainers’ performance are reflected in any revision of their early entry policies, and vice versa.

Given the significance now attached to this issue, any measure that depends on increasing the incidence of early entry for high attainers is likely to receive close scrutiny.

Schools should not be cowed from adopting such an approach where it is clearly in the best interest of their highest attainers, but they will need strong supporting evidence that early entry will result in an A*/A grade (and ideally A*), and that appropriate progression routes are in place.

 .

16-18 Tables

The post-16 Tables are comparatively less well-developed and continue to rely exclusively on high attainment measures rather than separately delineating outcomes for high attainers.

The structure of the 2013 Tables is undergoing significant change, with separate reporting of three different performance categories, depending on whether students have pursued A levels, A levels plus other advanced academic qualifications, or advanced vocational qualifications respectively.

The 2013 entries in the table below cover only the A level strand.

 .

High attainers measures
  • Attainment, Progress, Destinations (see below), Closing the Gap, Comparisons: None in 2012 and None in the 2013 (A level) tables

High attainment measures
  • 2012: % of KS5 students (4.8%) and of A level students (7.4%) achieving 3 A levels at AAB+ in three facilitating subjects; % of KS5 students (7.8%) and of A level students (11.9%) achieving 3 A levels at AAB+ with two in facilitating subjects; APS per student and per entry for A level only; APS per student and per entry for A level and other academic qualifications; APS per student and per entry for A level and equivalent qualifications (including 4 year time series)
  • 2013 (A levels): % of A level students achieving 3 A levels at AAB+ in three facilitating subjects; % of A level students achieving 3 A levels at AAB+ with two in facilitating subjects; APS per A level student (FTE) and per A level entry; A level VA (see below)

 .

New value added measures are also expected for A level and the other two performance categories, though the release of this data is said to be ‘subject to further analysis’ and no substantive detail is provided.

There is no commitment to introduce KS5 destinations data into the 2013 Tables though the Statement of Intent says:

‘We will continue to evaluate both the KS5 and employment destinations measures as part of our aim to include in future performance tables.’

‘Facilitating subjects’ continue to hold sway in the key high attainment measures, despite continuing criticism of the concept, as well as concern that a ‘three facilitating subjects’ measure is not consistent with the Russell Group’s advice.

 .

What can we learn from this comparison?

It is noticeable how little consistency there is between each set of Tables as presently formulated. High attainers feature strongly in the Secondary Tables, to a limited extent in the Primary Tables and not at all in the Post-16 Tables.

Conversely, there are several measures of high attainment in the Primary Tables and a couple in the Post-16 tables, but the Secondary Tables concentrate exclusively on generic measures. There is no measure for achievement pitched at GCSEs Grades A*/A.

Closing the gap data and comparisons data is so far entirely absent from the post-16 Tables and, perhaps less surprisingly, destinations data is absent from the Primary Tables.

Where destinations and closing the gap data is available, there is no breakdown for high attainers.

The next section will explore whether the changes proposed in the three accountability-related consultation documents are likely to bring about any greater consistency in the coverage of the respective Performance Tables and, more specifically, how they are likely to report on high attainers.

 .

Likely Impact of Accountability Reforms

At the time of writing, three staged consultations have been launched but only one has been concluded.

All three consultations are predicated on a new approach to the publication of accountability data which has three components:

  • A requirement to publish a handful of headline indicators on schools’ own websites. The secondary consultation response describes this as a ‘snapshot’ in standard format. It seems likely that this provision will be extended to primary and post-16 institutions but, as yet, this is nowhere confirmed.
  • Continued publication of Performance Tables which contain the same headline indicators, but with additional material showing how different groups of learners perform against them and how performance compares with other institutions.
  • The introduction by March 2015 of a new data portal which contains all the (performance) information held about schools (and colleges?). We are told in the secondary consultation response that:

‘It will be an easily accessible website that allows the public to search all the information we hold about schools, subject to protecting individuals’ anonymity. Respondents to the consultation argued that it would be useful to see measures showing school by school performance in vocational qualifications, the percentage of pupils achieving the top grades in GCSEs, and average grades by subject. We agree that all these measures will be of interest to many people, and the Data Portal is being designed so that parents can search for this type of information.’

There is otherwise comparatively little information about this portal in the public domain.

DfE’s Digital Strategy, published in December 2012, says that the parent School Performance Data (SPD) programme will ‘see the consolidation of 8 existing data-based services into one’ and that delivery of the programme will be staggered from the second quarter of 2014 until the final quarter of 2015. At some point within this timeline, it will absorb and replace RAISE Online.

.

Primary Accountability Reform

Those with a wider interest in the proposed changes to primary accountability and assessment are invited to read this analysis.  This commentary deals only with the material likely to be published through one of the three routes described above.

The primary document is comfortably the vaguest of the three consultations. However, we know that new KS2 tests – single tests for the full attainment spectrum – will be introduced in 2016, with results first appearing in the 2016 Performance Tables, likely to be published in December that year.

We are told that tests will be available in maths and in elements of English including reading (GPS is not mentioned explicitly). Writing will continue to be assessed via teacher assessment.

Published performance measures are typically described in the consultation as relating to ‘each subject’ but it seems most likely that elements of English will continue to be reported separately.

All the measures outlined below therefore apply to maths and to each tested element of English. One assumes they also apply to writing TA – and potentially to other TA outcomes too – but this is unclear.

So there may be one or two sets of measures for maths (test and potentially TA), while for English there will be somewhere between one (reading test) and five (reading and GPS tests; reading, writing and GPS TA).

For each subject/element, Performance Tables are expected to include:

  • The percentage of pupils ‘meeting the secondary readiness standard’, which will already have been introduced in the 2013 Tables. Secondary readiness will be reported using scaled scores derived from raw test marks. A score of 100 is proposed to denote the threshold of secondary readiness, so the Tables will show the percentage of a school’s learners at or above 100.
  • The corresponding average scaled scores for each school. The consultation document adopts an illustrative scale – based on the current national curriculum tests – which runs from 80 to 130. An average scaled score significantly over 100 is a positive indicator but, of itself, says nothing about the distribution of scores, or whether this outcome is attributable to prior attainment.
  • ‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score’. There is no information about where these measures will be pitched in relation to the deciles that will be used to report to parents on individual learners’ performance. It might be that the cut-off for high attainers is pitched at a particular score – say 120 on the illustrative scale above – or at the 20th percentile, for example. (It seems unlikely that numbers performing in each decile will be reported in the tables though they could presumably be found via the data portal.)
  • Progress measures based on comparison of pupils’ scaled scores with the scores of other pupils with the same prior attainment. The paper gives the impression that there will be separate progress measures for each subject/element, rather than a composite measure such as that planned for secondary schools (see below). These will be derived from performance on a baseline assessment, which seems increasingly likely to be moved back from the end of KS1 to YR. We are not told what categories of prior attainment will be applied. A sophisticated gradation, based on deciles, is unlikely to be consistent with a simple baseline check for 4 year-olds. A cruder tripartite judgement of ‘primary-readiness’ is more probable, with learners categorised as ‘not yet primary ready’, ‘primary ready’ or ‘performing beyond primary readiness’.
  • Comparison of each school’s performance with other schools with similar intakes. Whether this will be undertaken in relation to all the measures above, or only selected measures, is so far unclear.
  • The attainment and progress of those in receipt of the Pupil Premium. Whether this analysis will be provided for all the measures above also remains unclear.

All measures will be published as annual results and as three year rolling averages.
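
The first two headline measures in the list above are simple to compute once scaled scores exist. As a sketch, with invented scores on the consultation's purely illustrative 80-130 scale:

```python
# Hypothetical scaled scores for one school's cohort, on the
# illustrative 80-130 scale; 100 denotes the proposed
# 'secondary readiness' threshold.
scores = [88, 95, 100, 104, 112, 121]

# Percentage of pupils meeting the secondary readiness standard
# (at or above a scaled score of 100).
pct_ready = 100 * sum(s >= 100 for s in scores) / len(scores)

# The school's average scaled score.
average_score = sum(scores) / len(scores)
```

As the text notes, an average well above 100 says nothing about the spread of scores, which is why the high-scaled-score and progress measures are reported separately.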

The document asks respondents to comment on whether other measures should be prioritised within performance tables. It is not clear how much of this data will be published on schools’ own websites as well as in performance tables.

.

Secondary accountability reform

The response to the secondary consultation confirms that schools will be required to publish five measures in a standard format on their own websites:

  • Attainment across a suite of up to eight qualifications (called ‘Attainment 8’). This suite includes: English and maths, both of which are double weighted to reflect their significance (English language will only be double weighted if combined with English literature); three further EBacc subjects (combined science double award will count as two qualifications; subjects can be drawn from the same subject area, eg three sciences or two languages); and three further qualifications, which may be arts-based, academic or approved vocational qualifications (English literature may count as one of these). The measure will be applied to learners entering fewer than eight qualifications and those entering more will have their highest graded qualifications counted in the final category. The outcome will be expressed as an average grade, but with finer gradations – exemplification in the response refers to ‘a high B grade or a low D grade’. (The consultation document noted that learners would know their APS and be able to compare it with ‘easily available local and national benchmarks’).
  • An aggregated progress measure (called ‘Progress 8’) based on the same suite of up to eight qualifications. KS4 results will be predicted on the basis of prior attainment at KS2. This involves calculating an estimated KS4 outcome based on the actual outcomes of all learners with a specified level of KS2 attainment across English and maths. (The two examples in the response are based on the existing KS2 APS methodology and averaged NC levels respectively, rather than the proposed new ‘scaled scores’ methodology outlined above. We do not know on what basis such scores would be aggregated in future for the purpose of this calculation.) Each learner’s VA score is calculated by subtracting their estimated outcome from their actual outcome. So, if a learner with given prior attainment is estimated to achieve 8C grades at GCSE and actually achieves 4Bs and 4Cs, that gives a VA score of +0.5 (4 grades over 8 subjects). The school’s average VA score is calculated by aggregating these outcomes.
  • The percentage of learners achieving a ‘pass’ grade or better in English and maths. The response uses the existing nomenclature of a C grade or better, but this will change when a new GCSE grading system is finalised. Outcomes from the Ofqual consultation – which proposed a new grading scale from 1 (low) to 8 (high) and a subsequent standards-setting consultation – should be available shortly.
  • The percentage achieving ‘good grades’ in the EBacc (the same issue applies) and
  • A destination measure, derived from the current experimental approach. This is provisional since ‘we want to be sure the statistics are robust before committing to use this…as a headline indicator’.
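The ‘Progress 8’ value-added arithmetic described above can be sketched in a few lines of code. The grade-to-points mapping here (one point per grade step, G = 1 up to A* = 8) is an illustrative assumption only – the consultation response had not fixed a points scale at this stage.

```python
# Illustrative sketch of the 'Progress 8' value-added calculation.
# The points scale is an assumption for demonstration purposes.
GRADE_POINTS = {"G": 1, "F": 2, "E": 3, "D": 4, "C": 5, "B": 6, "A": 7, "A*": 8}

def progress8_va(estimated_grades, actual_grades):
    """Learner VA = (actual points - estimated points) / number of subjects."""
    est = sum(GRADE_POINTS[g] for g in estimated_grades)
    act = sum(GRADE_POINTS[g] for g in actual_grades)
    return (act - est) / len(estimated_grades)

# The worked example from the response: estimated 8 C grades,
# actual 4 Bs and 4 Cs - four grades over eight subjects.
va = progress8_va(["C"] * 8, ["B"] * 4 + ["C"] * 4)
print(va)  # 0.5
```

A school’s average VA score would then be the mean of these per-learner scores across the cohort.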

The response adds that the calculation process for ‘Progress 8’ is under review. It is likely to be adjusted, by calculating expected progress from the results of learners who completed KS4 three years previously, though possibly not until 2019.

There might also be a shift to calculating expected progress at subject level, which is then averaged, to reduce the likelihood of schools entering learners for qualifications ‘in which it is easier to score points for the progress measure’.

A straightforward linear point scoring system is under discussion – eg 1 for a current GCSE Grade G to 8 for a current A* grade. This would give more credit to schools for higher results than the current non-linear approach, which awards a G 16 points and an A* 58 points. (This might suggest a similar adjustment to the primary APS methodology.)
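The effect of that change can be illustrated with the two endpoints quoted above (only the G and A* values are given in the response, so intermediate grades are omitted here): the relative credit for the top grade over the bottom grade would rise substantially.

```python
# Comparing the proposed linear 1-8 scale with the endpoints of the
# current scale quoted above (G = 16 points, A* = 58 points).
proposed = {"G": 1, "A*": 8}
current = {"G": 16, "A*": 58}

# Under the proposal an A* is worth 8 times a G; currently ~3.6 times.
print(proposed["A*"] / proposed["G"])              # 8.0
print(round(current["A*"] / current["G"], 3))      # 3.625
```

This is why a linear scale would reward schools relatively more for higher results.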

Finally, the treatment of an estimated 1.2% of low attainers who enter no relevant qualifications is still under consideration. The methodology will not be finalised until Spring 2014.

Performance Tables will ‘eventually’ include the five headline indicators above (no date is provided for this).

They will also contain:

  • A value added progress measure in English and maths, showing whether learners have performed better or worse than expected given their prior attainment at KS2. (The original consultation document implied this would relate to English and maths separately and would be provided for low, middle and high attainers respectively.)
  • A comparison measure with similar schools, using the existing methodology, but further developed to give ‘an indication of disadvantage’ in each group of similar schools.
  • By implication, ‘closing the gap’ indicators showing the attainment of disadvantaged learners eligible for the Pupil Premium, the progress of those learners and ‘the in-school gap in attainment between disadvantaged pupils and their peers’. Both single year data and three year rolling averages will be published.
  • By implication there will also be further analysis of performance by high, middle and low attainers on each measure. The drafting is however very unclear, referring to the present rather than the future:

‘Performance tables give a breakdown of the performance of different pupil groups on the headline indicators. They show how well pupils with low, middle and high prior attainment perform on each measure…For each indicator, local and national benchmarks are provided to make it easier to judge each school’s performance. Using this information, parents can search for the information which is most relevant to them. For example, they can see how many pupils with high prior attainment go on to achieve the EBacc at different schools in their area.’

It seems most likely that all learners judged ahead of the new ‘secondary ready’ threshold would count as high attainers, even though it would now be possible to calculate more sophisticated gradations based on deciles of performance.

Hence we would continue to have three broad categories of prior attainment: exceeding secondary ready standard, at secondary ready standard and not yet secondary ready.

However, the consultation response is silent on this matter.

We are told that material about pupils achieving the top grades at GCSE will only be available through the data portal, which means that – on this issue – the new secondary tables will continue to be out of kilter with the primary and post-16 tables.

There is one further unexplained reference in the response:

‘Confidence intervals will also be important when we present each school’s percentile ranking on the range of headline measures. For example, a school could have performed well on the Attainment 8 measure and be in the 10th percentile, with a confidence interval that indicates that the school’s true ranking is likely to lie between the 5th and 15th percentiles.’

This appears to suggest that schools will be ranked on each headline measure – but it remains unclear whether this material will be included in the Performance Tables.

There is also a beguiling reference to Ofsted:

‘In addition Ofsted may choose to specify some of these measures, for example the percentage of pupils achieving the best GCSE grades, in their inspection guidance’.

Furthermore:

‘Schools in which pupils make an average of one grade more progress than expected across their 8 subjects will not be inspected by Ofsted during the next academic year (unless there are exceptional circumstances, for example where there are safeguarding concerns).’

This suggests further changes to the Handbook and Subsidiary Guidance, and potentially to the Framework itself.

Changes will be introduced into the 2016 Performance Tables published in January 2017, but schools will receive information based on 2014 exam results to illustrate their performance on the new measures, and will be able to opt in to a new floor standards methodology in 2015.

 .

Post-16

This consultation proposes that performance should be reported separately at Level 2 (including continued study of English and maths for those so far without a GCSE ‘pass’ – currently graded A*-C) and Level 3.

In respect of Level 3, performance should be reported for three strands of provision: Academic (A level, AS level, IB, EPQ etc), Applied General and Technical, as introduced for the 2013 Tables.

This analysis focuses exclusively on Level 3 Academic provision.

At Level 3, five ‘topline performance measures’ are proposed which will be included in Performance Tables (it is not stated whether institutions will also be required to publish them on their websites):

The proposed topline measures are:

  • Two principal attainment measures: average grade and average points per full-time A level student – a best 3 A levels measure is under consideration ‘to encourage substantial A level programmes’; and average grade and points score per A level entry. This is given pride of place, above the measures that rely on facilitating subjects, and there is no reference to restrictions being placed on the subjects studied.
  • An aggregate KS4-5 progress measure ‘showing the progress of similar students nationally who, according to their results at the end of Key Stage 4, were of the same level of ability’ (by which they mean attainment). This sounds broadly consistent with what is proposed for the primary and secondary tables. The annex says:

‘Only students with the same prior attainment, taking the same subjects, will be compared to provide a subject score. Subject scores will then be aggregated with other academic…scores to provide an overall academic score.’

This presumably means that A level students will be compared with those with similar KS4 attainment who are taking the same A level subjects.

  • A destination measure ‘showing student progression to a positive destination’. No reference is made to the controversial question whether this will continue to distinguish Russell Group universities and Oxbridge.
  • A completion measure.

There is additionally a commitment to ‘consider how we can report the results of low, middle and high attainers similarly [to KS4] in the expanded 16-19 performance tables’ but no further clue as to how these will be devised.

A number of additional measures are also laid out, which may or may not appear in the performance tables:

  • The percentage of students achieving AAB+ grades at A level in two and in three ‘facilitating subjects’. Note that the benchmark is still AAB+, even though, from 2013-14 onwards, HEFCE’s student number control relaxation – from which this measure was originally derived – is extended from AAB+ to ABB+.
  • A ‘closing the gap’ measure which will show attainment by pupils who were eligible for Pupil Premium funding in Year 11. The annex suggests that this ‘can be compared with the top line attainment measure’ which, in the context of A level, may mean one or both of the two described above. It is not clear whether this will also be applied to the progress measure.
  • Attainment of approved level 3 maths qualifications for students who do not take A or AS level (these are under development). These will be available for teaching from 2015.

The timetable for the introduction of these reforms is not specified.

 .

Comparison of Primary, Secondary and Post-16 reforms

The table below shows the extent to which the overall proposals for performance table reform reflect a consistent application of the typology set out above.

.

Attainment
  • Primary: % achieving secondary ready standard; average scaled scores
  • Secondary: ‘Attainment 8’ expressed as average grade; pass grade in E+M; pass grades in EBacc
  • Post-16: average grade and APS per FT A level student (potentially on best 3 A levels); average grade and APS per A level entry; AAB+ in 2 and 3 facilitating subjects; performance in L3 maths qualifications

Progress
  • Primary: averaged scaled scores compared with prior attainment
  • Secondary: ‘Progress 8’ expressed as +/- average grade; progress in E+M
  • Post-16: yes, but no detail

Destinations
  • Primary: none mentioned
  • Secondary: provisionally
  • Post-16: yes, but no detail

Closing the gap
  • Primary: yes (indicators unclear)
  • Secondary: yes (unclear if applied to all indicators above)
  • Post-16: yes, but no detail

Comparisons
  • Primary: yes (indicators unclear)
  • Secondary: yes (with ‘indication of disadvantage’)
  • Post-16: none mentioned

Notes
  • Secondary: reference to ‘percentile rankings on headline measures’
  • Post-16: unclear if facilitating subjects and L3 maths in tables

There is evidence of a shift towards greater consistency of approach compared with the current performance tables, although many of the details have yet to be clarified.

Unfortunately, this lack of clarity extends to the definition of high attainers and how their performance will be reported.

  • In the primary sector, it seems most likely that high attainers will be defined according to some yet-to-be-determined measure of ‘primary readiness’ and in the secondary sector according to the new ‘secondary-ready’ standard. One could extend the same logic into post-16 by devising a ‘sixth form ready’ standard based on performance against new-style GCSE grades. This would transpose into the new Tables the rather crude tripartite distinction in place currently at primary and secondary level, though we know the pitch will be somewhat higher. But this is little more than an educated guess. It would be quite possible to introduce a more sophisticated distinction and a narrower definition of what constitutes a high attainer, though this would be much more straightforward in the secondary sector than in primary.
  • We are equally unclear how high attainers’ performance will be reported. There is nothing explicit in the primary consultation document to explain which measures will be applied to high, middle and low attainers. Indeed, the only direct reference to such distinctions applies to Ofsted inspection:

‘Schools in which low, middle and high attaining pupils all make better than average progress will be much less likely to be inspected sooner.’

The secondary consultation response implies that all the headline measures will be applied to these three categories of prior attainment, but fails to state this explicitly, while the post-16 document doesn’t go beyond the broader commitment mentioned above.

  • As for the reporting of high attainment, the methodology for the principal progress measures – confirmed for the primary and secondary tables and planned for post-16 – is specifically designed to discourage schools from concentrating over-much on learners on the borderline of a threshold measure. This is welcome. But whereas the primary tables will include new measures focused explicitly on how many of a school’s population have achieved national measures of high attainment (as expressed by high scaled scores), and high attainment measures will be retained in the new post-16 tables, the continued omission of an equivalent GCSE measure from the secondary tables seems inconsistent, even though we are told it will be possible to find this data in the accompanying portal. It is hard to understand the logic that justifies this continued inconsistency of approach, when the broader direction of travel seems very much about bringing the three sets of tables more closely into line.

 .

How Should Schools Respond?

With so much change in the offing, it can be all too easy for schools to slip into a defensive, reactive mode, particularly if they are under pressure on other fronts.

There is a temptation to concentrate effort elsewhere, on the grounds that high attainers will do comparatively well with relatively little support. Most will be secure L5 performers and go on to pick up a clutch of A grades at GCSE. The opportunity cost of lifting them to L6 and converting their As into A*s may be perceived as simply too great.

On the other hand, while I can cite no hard evidence to support the contention, I have often observed that schools which are successful with their high attainers are rarely unsuccessful in other respects. It is as if support for high attainers is a litmus test of personalised education, going the extra mile and wider school effectiveness.

And there is a real opportunity for schools to get on to the front foot with this issue, given the absence of any substantive lead or guidance from the centre, or any conspicuous consensus amongst schools – or between experts, academics and service providers – over what constitutes effective practice.

Naturally schools will want to frame their response in a way that addresses their priorities and fits their particular contexts. They will need to take into account how their success is defined and reported by Ofsted and in School Performance Tables – and how this might change in the future – but they will not plan exclusively on that basis.

They must find the ‘best fit’ between the demands of the accountability regime and what is in the best interests of their learners. The regime is not intended to impose rigid conformity, and higher performing schools in particular must be allowed to trust their judgement rather than devoting themselves exclusively to these ‘one-size-fits-all’ measures of success.

The first part of this final section sets out how a school might rethink and redefine its support for high attainers from first principles – though it stops short of advocating or discussing any specific elements of effective whole school practice.

The second part draws on the analysis elsewhere in this post to inform a suggested basket of key measures from which schools might select when constructing a plan to improve their high attainers’ performance. This is very much intended as a starting point for discussion, rather than a blueprint that all must follow.

.

Rethinking Support for High Attainers

A high attainers’ support strategy will only be completely successful if it has the full commitment of staff, learners, parents and other key stakeholders. It should extend across the whole school and all dimensions of whole school practice, including learning at home and parental and community engagement.

The best way to secure wholesale commitment is through an inclusive and transparent consultative process. The outcomes of that process are most readily captured in an accessible document that should:

  • Include a clear, comprehensive yet succinct statement of whole school policy for challenging and supporting high attainers that is meaningful and relevant to lay and professional audiences alike.
  • Incorporate a concise improvement plan that is regularly monitored and updated and that feeds into the wider school improvement plan.
  • Be published openly, so that the school’s priorities are understood and acted on by all parties – and so that prospective parents and learners can use them to inform decisions about whether to apply for admission.

In formulating a support plan, schools must consider what relationship it should have with any parallel gifted and talented education policy (or equivalent terminology adopted by the school).

It is not appropriate simply to substitute one for the other, without giving careful consideration to what might be lost in doing so.

In future, how will the school support learners with high ability who might not realise it through high attainment? What of twice-exceptional learners, for example, and those with diverse talents that are not demonstrated through high attainment, whether in arts, sports, interpersonal skills, or any other field judged significant by the school?

Some schools may prefer to have parallel and mutually supportive policies for these two overlapping populations; others may prefer an integrated policy.

The most straightforward approach is to distinguish high attainers as a subset of the school’s wider gifted and talented population. Ignore the school of thought that suggests high attainers are somehow different, ‘bright but not gifted’.

But, if the policy is integrated, it must be clear where and how support for high attainers is distinct.

The support plan must rest on a clear definition of what constitutes a high attainer – and a potential high attainer – in the context of the school, together with explicit recognition of the gradations of high attainment within the general definition. (Ofsted’s example shows how confusion may be caused by using inconsistent terminology and a failure to define terms.)

The plan should be framed around core priorities. These will typically include:

  • A judicious blend of challenge and support for high attainers, designed to ensure that they continue to perform highly, but are not exposed to undue pressure or stress; and that their high attainment is not achieved at the expense of personal wellbeing, or wider personal development.
  • Challenge and support for learners who are not yet high attainers but might become so. (This might include the remainder of the school population, or a more narrowly defined group, depending on context and ideological preference.)
  • Targeted challenge and support for disadvantaged learners and under-represented groups. This must include those in receipt of the Pupil Premium but might also reflect gender, minority ethnic, SEN and summer-born considerations. Schools understand the complex causes of underachievement and that most underachieving learners are affected by a combination of factors. Avoid a simplistic quota-driven model. Critically, this equity-driven support must not be achieved at the expense of advantaged learners. The optimal strategy is to continue to raise standards for all high attainers, but to raise them relatively faster for those from disadvantaged backgrounds.

The plan must show how the current state will be changed into the desired future state. This necessitates:

  • A thorough review process and full statement of the baseline position, preserving a balance between the celebration of strengths and the identification of weaknesses and areas for development. Schools should not gloss over gaps in their skillset, or evidence that particular subjects/departments are weaker than others. This should not be an exercise in finding and attributing fault, but a collective endeavour undertaken in a spirit of continuous improvement.
  • A coherent set of priorities for improvement which must be SMART (Specific, Measurable, Achievable, Realistic and Timebound). A named individual should be accountable for each priority and it should be stated explicitly what staff time, budget and any other resources are allocated to securing it.
  • Improvement priorities should be aligned with a matching set of outcome measures, or success criteria, which might draw from the suggested basket set out below. These should capture the intended impact of the improvement priorities on high attainers’ performance.
  • Arrangements for regular monitoring and review. There should be capacity to manage slippage and risk and to accommodate succession planning. A senior manager should be accountable for the plan as a whole, but it should be evident how all staff and all stakeholders – including learners and parents – can contribute positively to its achievement.

If the school is sufficiently confident it should consider incorporating an explicit entitlement to challenge and support for all its high attaining learners. This should not be vague and generalised, but sharp and explicit, so that parents and learners can challenge the school and seek redress if this entitlement is not forthcoming.

.

A potential basket of key measures

What measures might schools and colleges select from when developing such plans? I have set out below some first efforts at three baskets of key measures relating to primary, secondary and post-16 respectively.

These broadly reflect the current accountability regime, though I have suggested some departures, to fill gaps or in response to known concerns.

I do not pretend that these are more than illustrative. Some of the measures beg questions about definition, a few are rather iffy and there are certainly gaps in the coverage.

But they should serve to exemplify the broad approach, as well as providing a basis for more rigorous discussion at institutional level. I don’t have all the answers and very much want to start the conversation, rather than attempting to close it off.

Planners might reasonably consider drawing one measure from each of the five areas in the typology I have set out. Failing that, they might aim for a ‘balanced scorecard’ rather than relying excessively on measures in just one or two categories.

The optimal number of measures is probably between three and five – if there are fewer than three the scope of the plan will be too narrow; if there are more than five it will be too complex.

It should be possible to develop and refine these baskets over time, to reflect ideas and suggestions from those engaging with them. I hope to revisit them in future.

They will need significant adjustment to reflect the new accountability regime, once the proposals in the three consultation documents have been implemented.

And, hopefully, the new Data Portal will make it much easier to construct the necessary measures in all five of these categories, once it is introduced from 2014.

 .

                                     PRIMARY BASKET OF INDICATORS

Attainment
  • % of high attainers at KS1
  • % of high attainers at KS2
  • % of high attainers achieving KS1 L3
  • % of high attainers achieving KS2 L5B/5A/6

Progress
  • 100% of high attainers make expected progress in KS1
  • % of high attainers making more than expected progress in KS1
  • 100% of high attainers make at least 2 levels of progress in KS2
  • % of high attainers making more than 2/up to 3 levels of progress in KS2

Destinations
  • % of high attainers transferring to outstanding secondary schools
  • % of high attainers transferring to selective schools

Closing the gap
  • For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  • For any of the indicators above, the FSM gap is closed by x%
  • High attainers are representative of school population (by eg FSM, gender, ethnic background, SEN, month of birth)

Comparisons
  • For any of the indicators above, compare with all schools located in LA area/governed by academy trust
  • For any of the indicators above, compare with family of schools with broadly similar intake
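The ‘closing the gap’ indicators in this basket can be made concrete with a short sketch. All figures below are invented purely for illustration; schools would substitute their own attainment rates and choose their own value of x.

```python
# Illustrative 'closing the gap' calculation. All rates are hypothetical.
def fsm_gap(non_fsm_rate, fsm_rate):
    """Gap in percentage points between non-FSM and FSM attainment rates."""
    return non_fsm_rate - fsm_rate

# Hypothetical school: rates on some chosen indicator, first for
# high attainers, then for middle attainers.
high_gap = fsm_gap(92.0, 85.0)    # 7.0 point gap among high attainers
middle_gap = fsm_gap(70.0, 55.0)  # 15.0 point gap among middle attainers

# First indicator: is the high attainers' gap at least x% lower than
# the middle attainers' gap?
reduction = (middle_gap - high_gap) / middle_gap * 100
print(high_gap < middle_gap, round(reduction, 1))  # True 53.3
```

The same arithmetic applies, with different inputs, to the equivalent indicators in the secondary and post-16 baskets below.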

 .

 .

                                            SECONDARY BASKET OF INDICATORS

Attainment
  • % of KS2 high attainers in Y7 intake
  • % of KS2 high attainers achieving 5+ A*-A grades at GCSE or equivalent including GCSE English and maths
  • % of KS2 high attainers’ GCSE entries awarded A*/A grades, or A* grades only
  • % of KS2 high attainers awarded GCSE A*/A grades or A* grades via early entry
  • Increase in GCSE APS (Best 8) and/or improvement in average grade
  • Increase in GCSE APS (new Attainment 8 measure) and/or improvement in average grade

Progress
  • 100% of KS2 high attainers making expected progress from KS2-4
  • % of KS2 high attainers making more than expected progress from KS2-4
  • % of KS2 high attainers making 4/5 levels of progress from KS2-4

Destinations
  • % of KS2 high attainers transferring to outstanding sixth forms/post-16 institutions

Closing the gap
  • For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  • For any of the indicators above, the FSM gap is closed by x%
  • High attainers are representative of school population (by eg FSM, ethnic background, SEN, month of birth)

Comparisons
  • For any of the indicators above, compare with other schools located in LA area/governed by academy trust
  • For any of the indicators above, compare with a family of schools with broadly similar intake

.

 .

                                             POST-16 BASKET OF INDICATORS

Attainment
  • % of KS2 high attainers achieving AAB+/ABB+ grades at A level (whether or not in facilitating subjects)
  • % of KS2 high attainers’ A level entries awarded B/A/A* or A*/A or A* grades only
  • Increase in A level APS (best 3) and/or improvement in average grade

Progress
  • % with GCSE A* achieving A level grade A* in same subject
  • % with 5+ GCSEs at A*/A including English and maths achieving 3+ A levels at AAB/ABB

Destinations
  • % transferring to high tariff undergraduate degree courses
  • % transferring to selective/Russell Group/Oxbridge universities

Closing the gap
  • For any of the indicators above, the FSM gap for high attainers is at least x% lower than the FSM gap for middle attainers
  • For any of the indicators above, the FSM gap is closed by x%
  • High attainers are representative of sixth form population (by eg FSM, gender, ethnic background, SEN, month of birth)

Comparisons
  • For any of the indicators above, compare with other schools located in LA area/governed by academy trust
  • For any of the indicators above, compare with a family of schools/colleges with broadly similar intake

.

Conclusion

In this second part of the post, I have adopted a simple typology of performance measures to explain:

  • Imminent changes in how the Performance Tables address high attainers and high attainment, highlighting key differences between sectors and outlining the direction of travel towards longer term reform.
  • What is known and what is still unknown about the focus on high attainers in new performance table arrangements, mostly scheduled for implementation from 2016, and to what extent these reforms will introduce a more robust approach and greater consistency between sectors.
  • How, pending those longer term reforms, schools and colleges might set about developing a high attainers’ support plan, so seizing the initiative and skirting around some of the difficulties presented by revisions to the inspection guidance.
  • How such support plans might be constructed around a basket of key outcome measures, so that they (a) focus explicitly on improvements in institutional performance as well as improved provision for high attaining learners and (b) broadly reflect performance table measures without requiring slavish adherence.

This post is very much a work in progress, striving as it does to pin down a moving target while also setting out a basic support framework with universal application. I am unhappy with some aspects of this first edition and will aim to eliminate its shortcomings in a future iteration. All suggestions for improvement are welcome.

.

GP

October 2013

 

How High Attainers Feature in School Inspection and Performance Tables (and what to do about it)

 

.

This post explains:

  • How revised Ofsted inspection guidance gives greater prominence to high-attaining learners (or ‘the most able’ in Ofsted terminology).
  • How this differs from the treatment of high attainers in the School Performance Tables as presently formulated.
  • How high attainers feature in current proposals for accountability reform.
  • How schools might respond to inconsistent expectations from each side of the accountability framework and prepare for an uncertain future.

 

Outline of Content

Because of the length of this piece, I have divided it into two parts. Each part has two main sections.

Part One covers:

  • Changes to Ofsted’s inspection guidance. This explains and analyses the key changes to the School Inspection Handbook and Subsidiary Guidance which came into effect from September 2013.
  • Terminology, definitions, measures and data. This examines how Ofsted has begun to use the term ‘most able’ while the Performance Tables refer to ‘high attainers’. It compares the definitions adopted by Ofsted and in the Performance Tables. It discusses the ‘expected levels of progress’ methodology, highlighting a fundamental inconsistency in current guidance, and reflects on whether the accountability system should expect more progress from high attainers.

I have reversed the logical order of these sections to accommodate readers who wish only to understand how Ofsted’s guidance has changed. The second section begins the process of setting those revisions in the context of the wider accountability regime.

Part Two includes:

  • Performance Tables and Proposals for Accountability Reform. This summarises how high attainment and progress are reported in the 2012 Performance Tables and how this will change in 2013. It also offers a comparative analysis of how high attainers’ performance is expected to feature in a reformed accountability framework, for the primary, secondary and post-16 sectors respectively. This is based on the three parallel consultation documents, which begin to explain how the accountability framework will respond to the withdrawal of National Curriculum levels from 2016.
  • How schools should aim to satisfy these expectations. This provides some introductory guidance to shape the development and review of whole school plans to improve support for high attainers. It does not discuss the different ways in which schools can improve their provision – that is a topic for another day – but concentrates on the broad framing of policies and plans. It proposes a basket of key measures, for primary and secondary schools respectively, that fit the current context and can be adjusted to reflect future developments.

Within this post I have drawn together several elements from earlier posts to create the bigger picture. There is some overlap, but I have tried to keep it to a minimum. I hope it is helpful to readers to have all this material within a single frame, focused explicitly on how schools should respond to the challenges presented by the accountability system.

Like all of my posts, this is a free and open access resource (but please observe the provisions of the Creative Commons Licence located at the top right hand corner of this Blog).

And do please use this contact form if you would like to discuss additional, customised advice and support.

 

Changes to Ofsted’s Inspection Guidance

 

Background and Scope

In June 2013, Ofsted published a survey report: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’

That same month I produced an extended analysis of the Report drawing out its comparative strengths and weaknesses, as well as summarising the guidance it contains on elements of effective whole school practice.

Readers requiring a full blow-by-blow account of the Report and its contents are cordially invited to digest the older post first.

The recommendations contained in ‘The most able students’ led to the changes recently introduced into the school inspection guidance, which uses the same terminology, rendering ‘most able’ Ofsted’s term of art for inspection purposes henceforward.

The revisions were introduced during the 2013 summer holidays and came into almost immediate effect in September 2013, ostensibly fulfilling the recommendations in ‘The most able students’ that Ofsted should:

  • ‘focus more closely in its inspections on the teaching and progress of the most able students, the curriculum available to them, and the information, advice and guidance provided to the most able students
  • consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds
  • report its inspection findings about this group of students more clearly in school inspection, sixth form and college reports.’ (page 11).

We might therefore expect the revisions to embed these priorities – and perhaps also to reflect related issues highlighted in the key findings and recommendations, such as: creating a culture of excellence, primary-secondary transition, progress in KS3 specifically, high expectations, the quality of homework, evaluation of mixed ability teaching, tracking and targeting, information for parents/carers and supporting progression to HE.

It is important to note that, while the source document was confined to non-selective secondary schools, the revisions to the inspection guidance apply to all schools – primary as well as secondary – that fall within scope of The Framework for School Inspection.

This means they cover school sixth forms and even extend to maintained nursery schools.

On the other hand, they exclude 16-19 academies, 16-19 UTCs and 16-19 studio schools, as well as sixth form colleges and FE colleges, all of which are covered by the Common Inspection Framework for Education and Skills.

No equivalent changes have been introduced into that Framework, or the relevant supporting documentation. It follows that there is some inconsistency between the expectations placed on 11-18 secondary schools and on 16-19 institutions.

Provision and support for high attainers is optimal when fully connected and co-ordinated across Years R-13, with particular emphasis on the key transition points at ages 11 and 16. But roughly half of the relevant post-16 population will be attending colleges that are not affected by these changes.

I have deliberately postponed detailed scrutiny of definitions until after this opening section, but it is important at the outset to supply what is conspicuously missing from the inspection guidance: a basic explanation of what Ofsted means by the ‘most able’.

This is not easy to establish, but can be derived from a footnote spread across the bottom of pages 6-7 of ‘The Most Able Students’. In the absence of any statement to the contrary, one can only assume that the transfer of that identical phrase into the guidance means that the definition applied to the phrase has also been transferred.

So, according to this source, the most able in secondary schools are:

‘…students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

Hence Ofsted means all learners with KS2 Level 5 in English, maths or both, plus those falling below this threshold who nevertheless had the potential to achieve it.

An equivalent definition for KS2 in primary schools (not supplied in ‘The most able students’) would be:

‘Learners starting KS2 having attained Level 3 or above, or having the potential to achieve Level 3 and above in (any element of) English and/or maths at the end of KS1.’

The bracketed phrase is included because a single level for English will not be reported in Primary Performance Tables from 2013.

There is no obvious equivalent for KS1 in primary schools or KS5 in secondary schools, though it would be possible to create similar measures relating to achievement at GCSE and in a Year 1 National Curriculum baseline assessment (assuming the ELGs cannot be made to serve the purpose).

The critical point to bring out at this stage is the sheer size of the most able population as defined on this basis.

For example, if we were to use 2012 KS2 results to calculate the national Year 7 population falling within the secondary definition, 50% of the year group would qualify on the basis of Level 5 achievement alone. Once the ‘potential Level 5s’ are factored in, we are dealing with a clear majority of the year group.
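To see how the ‘English and/or maths’ condition aggregates into a single proportion, here is a minimal inclusion-exclusion sketch. The percentages below are illustrative placeholders, not the actual 2012 national figures.

```python
# Hypothetical sketch: the share of a Year 7 cohort meeting Ofsted's
# 'most able' threshold (KS2 Level 5 in English, maths or both).
# All three percentages are invented for illustration.
level5_english = 0.38   # hypothetical share with Level 5 in English
level5_maths = 0.39     # hypothetical share with Level 5 in maths
level5_both = 0.27      # hypothetical share with Level 5 in both

# 'English and/or maths' is the union of the two groups, so pupils
# with Level 5 in both must not be counted twice:
level5_either = level5_english + level5_maths - level5_both
print(f"Share meeting the attainment threshold: {level5_either:.0%}")
# prints: Share meeting the attainment threshold: 50%
```

The same union logic applies within a single school: a roll count of pupils at Level 5 in English plus those at Level 5 in maths will overstate the group unless the overlap is subtracted.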

In any given school, this population will vary considerably according to the sector, the year group in question, prior attainment of the intake and how ‘potential to achieve’ is determined.

Many schools might reasonably calculate that all of their pupils – or all but a small minority – fall within scope. Even in schools with the most depressed intakes, this population will be sizeable if generous allowance is made for the impact of disadvantage on learners’ capacity to achieve the specified attainment threshold.

It is helpful to hold in mind a rough sense of the size of the most able population as one begins to engage with the inspection guidance.

.

The Framework for School Inspection

In fact, the School Inspection Framework itself has not been amended at all. Ofsted has sought to adjust its practical application through changes to two supporting documents:

  • The School Inspection Handbook (31 July 2013) which ‘provides instructions and guidance for inspectors conducting inspections… It sets out what inspectors must do and what schools can expect, and provides guidance for inspectors on making their judgements.’
  • The Subsidiary Guidance for inspectors, which supplements the Handbook and is examined in a later section of this post.

This is not entirely satisfactory.

For example, the current version of the Framework stresses that inspections assess whether schools provide an inclusive environment:

‘which meets the needs of all pupils, irrespective of age, disability, gender reassignment, race, religion or belief, sex, or sexual orientation.’ (pp 13-14)

This list may be confined to distinctions that feature in the Equalities legislation, but there is no inherent reason why that should be the case. One might reasonably argue that, if HMI were really serious about inclusion and support for the most able, ‘attainment or ability’ should be added to the list!

It is more concerning that the section of the Framework dealing with pupil achievement says:

‘When judging achievement, inspectors have regard both for pupils’ progress and for their attainment. They take into account their starting points and age. Particular consideration is given to the progress that the lowest attaining pupils are making.’ (p17)

Why shouldn’t equally particular consideration be given to the progress of the highest attaining pupils? If a reference to low attainers is on the face of the Framework, while references to high attainers are confined to the supporting guidance, schools will draw the obvious conclusion about relative priorities.

Elsewhere in the Framework, there are generalised inclusive statements, applied to quality of teaching:

‘Inspectors will consider the extent to which… teaching strategies, including setting appropriate homework, together with support and intervention, match individual needs.’ (p 18)

and to quality of leadership and management:

‘Inspectors will consider the extent to which leaders and managers… demonstrate an ambitious vision for the school and high expectations of all pupils and teachers… provide a broad and balanced curriculum that meets the needs of all pupils, enables all pupils to achieve their full educational potential and make progress in their learning’ (pp 19-20)

but, if these statements are genuinely intended to reflect equality of opportunity, including for the ‘most able’, why has the progress of the lowest attaining learners been singled out beforehand for special attention?

Clearly it was too much to expect amendments on the face of the Framework itself, presumably because they could not be introduced without a formal consultation exercise. The large number of amendments introduced via the supporting guidance – covering a broad spectrum of issues – might have justified a consultation, though it would have delayed their implementation by several months.

But there is nothing to prevent Ofsted from publishing a list of draft amendments to the Framework that, subject to consultation, will be introduced when it is next revised and updated. Such an approach would help schools (and inspectors) to understand much more clearly the intended impact of complementary amendments to the supporting guidance.

.

School Inspection Handbook: Main Text

Prior to this round of amendments, there was a single relevant reference, in paragraph 108 of the Handbook, applying to judgements of the quality of a school:

‘Inspection is primarily about evaluating how well individual pupils benefit from their school. It is important to test the school’s response to individual needs by observing how well it helps all pupils to make progress and fulfil their potential. Depending on the type of school, it may be relevant to pay particular attention to the achievement of:

  • disabled pupils, and those who have special educational needs
  • those with protected characteristics, including Gypsy, Roma and Traveller children, as defined by the Equality Act 2010
  • boys and girls
  • the highest and lowest attainers
  • pupils for whom the pupil premium provides support, including:
      • looked after children
      • pupils known to be eligible for free school meals – a school is unlikely to be judged outstanding if these pupils are not making at least good progress
      • children of service families
  • those receiving alternative provision’.

Notice that the relevance of the highest attainers is optional – ‘it may be relevant’ – and depends on the type of school being inspected, rather than being applied universally. It is left to the inspection team to make a judgement call.

Note, too, that the preferred terminology is ‘highest attainers’, rather than ‘the most able’. ‘Highest’ is an absolute term – rather than ‘high’ or ‘higher’ – which might be taken to imply the very extreme of the attainment spectrum, but there is no way of knowing.

This reference to the achievement of the ‘highest attainers’ remains in place, but is now juxtaposed against a series of newly inserted references to ‘the most able’. The former is optional, to be applied at inspectors’ discretion; the latter apply to all settings regardless.

There are no clues to tell us whether Ofsted is using the two terms synonymously, or if they intend to maintain a subtle distinction. The fact that the phrase has not been replaced by ‘the most able’ might suggest the latter, but that presupposes that this was picked up and consciously addressed during what seems to have been a rather cursory redrafting process.

There is no published glossary to inform interpretation of the terminology used in the Framework and its supporting guidance. By contrast, Estyn in Wales has published a ‘Glossary of Inspection Terms’, though that is hardly a model to be emulated, since it does not include their own preferred formulation ‘more able and talented’.

The term ‘most able’ now appears in several parts of the main text:

  • Lesson observations must ‘gather evidence about how well individual pupils and particular groups of pupils are learning and making progress, including those with special needs, those for whom the pupil premium provides support and the most able, and assess the extent to which pupils have grown in knowledge’ (para 26);
  • Through meetings with pupils, parents, staff and other stakeholders, inspectors must: ‘gather evidence from a wide range of pupils, including disabled pupils, those with special educational needs, those for whom the pupil premium provides support, pupils who are receiving other forms of support and the most able.’ (para 41);
  • When it comes to judging achievement of pupils at the school, inspectors must: ‘have regard for pupils’ starting points in terms of their prior attainment and age. This includes the progress that the lowest attaining pupils are making and its effect on raising their attainment, and the progress that the most able are making towards attaining the highest levels and grades.’ (para 115);
  • They must also: ‘take account of: the learning and progress across year groups of different groups of pupils currently on the roll of the school, including disabled pupils, those who have special educational needs, those for whom the pupil premium provides support and the most able.’ (para 116).
  • They should take account of: ‘pupils’ progress in the last three years, including that for looked after children, disabled pupils, those who have special educational needs and the most able. Evidence gathered by inspectors during the course of the inspection should include: the proportions making expected progress and the proportions exceeding expected progress in English and in mathematics from each starting point, compared with national figures, for all pupils and for those for whom the pupil premium provides support.’ (para 116)
  • And in relation to Key Stage 1, they should take account of: ‘how well pupils with a lower starting point have made up ground, and the breadth and depth of progress made by the most able.’ (para 117)
  • When it comes to observing the quality of teaching and learning, inspectors must: ‘consider whether…teaching engages and includes all pupils, with work that is challenging enough and that meets their individual needs, including for the most able pupils’ (para 124)

The bulk of these references relate to data-driven judgements of attainment and progress, but it is worth pausing to emphasise the final point.

This, together with the reference to ‘the extent to which [the most able] pupils have grown in knowledge’ is the nearest we get to any explicit reference to the curriculum.

When it comes to qualitative judgement, and a priority for qualitative whole school improvement, schools need to examine how well – and how consistently – their teaching engages, includes and challenges the most able.

Incidentally, there is nothing in these amendments to indicate a preference for setting, though schools might do well to remember HMCI’s previously expressed concerns about:

‘the curse of mixed ability classes without mixed ability teaching’

The third point – about progress – seems to be explicitly and deliberately reinforcing the statement on page 17 of the Framework that I quoted above. But while the Framework mentions only progress by the lowest attaining pupils, the Handbook now emphasises progress by the lowest and highest attaining alike. This is not a model of clarity.

Given the emphasis in ‘The most able students’ it seems odd that there is no explicit reference in the Handbook to the most able who are eligible for the Pupil Premium, unless one counts what is said about those who ‘exceed expected progress’ in English and maths, but that is not quite the same thing.

The way in which ‘the most able’ is tacked onto lists of different pupil groups also gives the rather unfortunate impression that these groups are mutually exclusive, rather than overlapping.

So far there is nothing significant about support for the most able to progress to competitive universities, apart from a brief and very general statement in the section on quality of leadership and management, referring to how well leaders and managers:

‘Ensure that the curriculum…provides timely independent information, advice and guidance to assist pupils on their next steps in training, education or employment.’

.

School Inspection Handbook: Level Descriptions

‘The most able’ has been inserted into two sets of descriptors within the Handbook.

In relation to ‘achievement of pupils at the school’:

  • In Outstanding schools: ‘The learning of groups of pupils, particularly those who are disabled, those who have special educational needs, those for whom the pupil premium provides support, and the most able is consistently good or better.’
  • In Good schools: ‘The learning of groups of pupils, particularly those who are disabled, those who have special educational needs, those for whom the pupil premium provides support and the most able, is generally good.’
  • In schools requiring improvement: there is only a generic ‘Pupils’ achievement requires improvement as it is not good’.
  • In Inadequate schools: ‘Groups of pupils, particularly disabled pupils and/or those who have special educational needs and/or those for whom the pupil premium provides support, and/or the most able, are underachieving.’

And, in the Descriptions for quality of teaching:

  • In Outstanding schools: ‘Much of the teaching in all key stages and most subjects is outstanding and never less than consistently good. As a result, almost all pupils currently on roll in the school, including disabled pupils, those who have special educational needs, those for whom the pupil premium provides support and the most able, are making rapid and sustained progress.’
  • In Good schools: ‘Teaching in most subjects, including English and mathematics, is usually good, with examples of some outstanding teaching. As a result, most pupils and groups of pupils on roll in the school, including disabled pupils, those who have special educational needs, those for whom the pupil premium provides support and the most able, make good progress and achieve well over time.’
  • In Schools Requiring Improvement there is the generic ‘Teaching requires improvement as it is not good’
  • In Inadequate Schools: ‘As a result of weak teaching over time, pupils or particular groups of pupils, including disabled pupils, those who have special educational needs, those for whom the pupil premium provides support and the most able, are making inadequate progress.’

Here the dual emphasis on attainment and progress is writ large. I won’t labour the point I have made already about the overlapping nature of the groups listed.

There is nothing here either about the most able in receipt of the Pupil Premium, about curriculum or IAG, so we must continue our search for these missing pieces of the jigsaw within the parallel Subsidiary Guidance.

.

School Inspection Handbook: Postscript

Before we leave the Handbook behind, it is well worth examining one critical section in more detail, especially since it has been amended quite significantly.

Paragraphs 114-117 set out the evidence of attainment and progress that Ofsted inspectors will now draw upon. Because these seem so central to Ofsted’s interest in the most able, I have paraphrased the full list below, applying it exclusively to them.

The exercise illustrates that, if schools are to prioritise improvement by the most able, they must ensure that this is reflected throughout their evidence base. It also helpfully emphasises the importance of extending this effort to learners who attract the Pupil Premium.

Ofsted will want to examine:

  • Learning and progress by the most able across different year groups. Evidence is gathered from: lesson observations; scrutiny of pupils’ work; schools’ records of pupils’ progress and progress of those receiving support from the Pupil Premium; ‘the quality and rigour of assessment’ (particularly in nursery, reception and KS1); discussions with pupils about their work; the views of parents, pupils and staff; discussion with staff and senior leaders; case studies of individual pupils; and listening to pupils read.
  • Progress made by the most able in the last three years. Evidence should include: the proportions making and exceeding expected progress ‘in English and in mathematics from each starting point, compared with national figures, for all pupils and for those whom the Pupil Premium provides support’; value-added indices for pupils and subjects; ‘other relevant indicators, including value-added data’; performance measures for the sixth form, including success rates; EYFS profile data; and ‘any analysis of progress data presented by the school, including information provided by external organisations’.
  • The most able learners’ attainment in relation to national standards (where available) and compared with all schools, based on data over the last three years, noting any evidence of performance significantly above or below national averages, trends of improvement or decline and inspection evidence of current pupils’ attainment across year groups. The latter will include, where relevant: the proportion of pupils achieving particular standards; capped average point scores; average point scores; pupils’ attainment in reading and writing and in maths; outcomes of the most recent phonics screening check and any follow-up by the schools; and attainment shown by test and exam results not yet validated or benchmarked nationally.
  • Difference in the achievement of the most able for whom the Pupil Premium provides support and others in the school including attainment gaps, particularly in English and maths (these to include differences in average points score in each of English and maths at the end of KS2 and at GCSE); and differences in progress from different starting points (see above).

Curiously, the footnotes attached to the original version of this section ignore the relevance of KS2 Level 6.

‘…starting points at Key Stage 2 include Levels W (and P levels), 1, 2, 3, 4 and 5’

I can only assume that this is an oversight.

.

References in the Subsidiary Guidance

One searches in vain for anything explicit about the curriculum or IAG. It seems that Ofsted decided not to give any prominence to these two critically important and controversial areas.

The section on ‘The Use of Prior Performance Data’ now says:

‘Inspectors should compare a school’s proportions of pupils making expected progress and the school’s proportions of pupils making more than expected progress in English and in mathematics with the national figures for each starting point. Consistency in being close to or above the national figures for pupils at each prior-attainment level, including the most able, is an important aspect of good achievement… Inspectors should pay particular attention to the sizeable prior-attainment groups in the school, and the most able, and note that school proportions below national figures for one starting point should not be considered to be compensated for by school proportions above national figures for another starting point. Inspectors should consider the school and national figures for the most recent year and the previous year, and how much they have changed.’ (para 7)

The insertion of references to ‘the most able’ makes for rather clumsy sentence structure, but it does serve to highlight the new emphasis on their progress.
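The comparison described in paragraph 7 can be sketched in a few lines. The figures below are invented for illustration; they are not drawn from RAISEonline or any real school.

```python
# Hypothetical sketch of the para 7 comparison: for each KS2 starting
# level, set the school's proportion of pupils making expected progress
# against the national figure. All numbers here are invented.
national = {"L3": 0.85, "L4": 0.90, "L5": 0.93}   # illustrative national proportions
school   = {"L3": 0.88, "L4": 0.84, "L5": 0.93}   # illustrative school proportions

for level in national:
    gap = school[level] - national[level]
    status = "close to or above national" if gap >= 0 else "below national"
    print(f"{level}: school {school[level]:.0%} vs national "
          f"{national[level]:.0%} ({status})")
```

Because the guidance insists that a shortfall at one starting point cannot be offset by a surplus at another, each prior-attainment band is reported on its own line rather than rolled into a single average: in the invented data above, the strong L3 figure does nothing to excuse the weak L4 one.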

Provision in KS1 is once more singled out, but in a slightly different manner:

‘If all pupils are making good progress and levels of attainment are consistently high, overall achievement between the end of the Early Years Foundation Stage and end of Key Stage 1 is likely to be at least good and may be outstanding. To be outstanding, pupils known to be eligible for free school meals and the most able should be making good or better progress.’ (para 32, final bullet point)

There is a most welcome bullet point in the section about the achievement of disabled learners and those with SEN:

‘A category of ‘need’ such as autistic spectrum disorder, does not by itself indicate expected levels that pupils would usually be at, given their starting points (i.e. one pupil may be working towards 12 A* GCSE grades whereas another pupil of the same age may be working towards Level P6).’

At last a paragraph appears that confirms inspectors’ interest in the most able attracting the Pupil Premium:

‘Inspectors must take account of the performance of the group for whom the pupil premium provides support, however small. Within this group, the progress in English and in mathematics of each different prior-attainment group should be considered and compared with that of the other pupils in the school, using the tables in RAISE online that show proportions making expected progress and proportions exceeding expected progress from each starting level. Inspectors should pay particular attention to the sizeable prior-attainment groups (those containing around 20% or more of the pupils for whom the pupil premium provides support) and the most able.’ (para 8)

Refreshing though it is to see that every school must pay attention to the most able supported by the Pupil Premium, regardless of the number attracting the Premium and how many are amongst the most able, one wonders why this message is not conveyed through the Handbook in similar terms.

Something similar occurs in respect of early entry to GCSE. The Handbook introduces generic concerns about early entry, especially in maths:

‘Inspectors should evaluate the school’s approach to early entry for GCSE mathematics and its impact on achievement and subsequent curriculum pathways. Inspectors should challenge the use of inappropriate early and multiple entry to GCSE examinations, including where pupils stop studying mathematics before the end of Year 11.’

This is subsequently applied to all subjects in the level descriptions for Leadership and management, but only the Subsidiary Guidance relates the issue directly to ability and attainment:

‘Inspectors should investigate whether a policy of early entry to GCSE for pupils is preventing them from making as much progress as they should, for example because:

  • the extensive and inappropriate use of early GCSE entry, particularly in mathematics, puts too much emphasis on attaining a grade C at the expense of developing the understanding necessary to succeed at A level and beyond
  • the policy is having a detrimental impact on the uptake of advanced level courses
  • the widespread use of early GCSE entry and repeated sitting of examinations has encouraged short-term gains in learning but has led to underachievement at GCSE, particularly for able pupils
  • the policy has resulted in a lack of attention to the attainment of the least able
  • opportunities are missed to meet the needs of high-attaining pupils through depth of GCSE study and additional qualifications.

In evaluating any approach to early entry, inspectors should consider the impact not only on the judgement on pupils’ achievement but also on leadership and management in terms of whether the school is providing a curriculum that meets the pupils’ needs.’ (para 34)

Here we have even more terminological confusion, with the use of both ‘able pupils’ and ‘high-attaining pupils’, while ‘most able’ is conspicuous by its absence.

The Handbook and Subsidiary Guidance between them have referred to: ‘highest attaining’, ‘high attaining’, ‘most able’ and ‘able’ without defining any of these terms, or differentiating between them.

 .

Overall

The amendments introduced into the Handbook and Subsidiary Guidance place a stronger emphasis on the most able principally through the repetition of that phrase at various points in the text.

These amendments are focused predominantly on pupil attainment and progress, rather underplaying any wider emphasis on effective whole school practice. References to curricular challenge and IAG for progression to competitive universities are generalised and scant.

The impact on overall Ofsted judgements can best be appreciated by editing together the relevant elements of the two sets of level descriptors referenced above:

  • In outstanding schools the most able pupils’ learning is consistently good or better and they are making rapid and sustained progress.
  •  In good schools the most able pupils’ learning is generally good, they make good progress and achieve well over time.
  • In schools requiring improvement the teaching of the most able pupils and their achievement are not good.
  • In inadequate schools the most able pupils are underachieving and making inadequate progress.

The attainment and progress of the most able supported by the Pupil Premium is integral to these judgements, though this latter point is underplayed in the guidance.

One might have hoped for a more considered and more carefully drafted response, built upon a careful definition of the term, which explains whether it differs from other similar phrases used in these materials and, if so, how.

Unfortunately, there is a real risk that the questionable clarity of the Handbook and Subsidiary Guidance will result in some inconsistency in the application of the Framework, even though the fundamental purpose of such material is surely to achieve the opposite.

A dedicated piece of additional briefing would have been particularly helpful, but there is nothing on this topic in the most recently published package (22 September 2013).

.

Terminology, Definitions, Measures and Data

Some readers may find that parts of this section tell them mostly what they already know. But even those who feel secure in the basics might want to cast an eye over the critical distinctions and issues set out below. Some may want to take issue with certain steps along the path I have negotiated through the tricky terminological issues.

I hope others will find it helpful to have the full scaffolding in place as they grapple with the implications of Ofsted’s new emphasis on the most able learners – and how that relates to the parallel emphasis in the School Performance Tables.

 .

Terminology: Most Able and High Attainers

In the section above, I have faithfully replicated the terminology adopted by Ofsted, while highlighting the problems caused by switching between terms that might or might not be synonymous.

Meanwhile the School Performance Tables have consistently adopted an alternative term: ‘high attainers’.

So what is the distinction – and which terminology should we prefer?

This treatment is necessarily brief and begs many questions that are best addressed in the margins. I shall set out the argument as best I can and move rapidly on.

A failure to distinguish properly between attainment and ability bedevils this field and consistently sullies wider educational debate. The two terms are often used synonymously, especially by economists, who should really know better!

Here is my rough and ready effort at pinning down the distinction in terms that fit the current context:

  • Attainment involves securing specified measurable educational outcomes, typically assessed through graded tests and public examinations (eg KS2 tests, GCSE, A Level). Some authorities (Ofsted included) maintain a distinction between attainment and progress, but it is also used in a general sense to encompass both. Attainment is (only) one dimension of wider educational achievement.
  • Ability is a measure of potential, not a measure of achievement. It may be hidden and/or its realisation obstructed. Consequently it is not easily assessed. Moreover, ability is complex, multifaceted and not synonymous with intelligence. Single identification instruments – for example IQ tests, CAT scores – may well be misleading and/or culturally biased and/or provide an incomplete picture. Some eschew the assessment and identification of ability because of the issues and difficulties associated with the concept. Some deploy questionable identification practice. Others adopt a pragmatic ‘best-fit’ approach, utilising a broad range of qualitative and quantitative evidence including ‘identification through provision’. Attainment-based evidence may feature within this portfolio, but should not be relied on exclusively or excessively otherwise the critical distinction is lost.

The best performers in key stage tests and public examinations are at the top end of the attainment distribution, but not necessarily at the top end of the ability distribution. High attainment may be a proxy for high ability but it is not the same thing, however ability is conceived (which is a separate, complex and highly controversial issue).

Similarly, high-attaining pupils may be regarded as a subset of a school’s gifted and talented population (or whatever alternative terminology it prefers to use) but one might reasonably expect that population also to include other learners who – for a variety and combination of reasons and for the time being at least – are not realising their ability through high attainment.

While some schools may find ability too difficult and controversial a concept to wrestle with (especially since they are no longer expected by the Government to do so), all are pushed by the accountability system to focus on high attainment and on the performance of their high attaining learners.

Schools cannot entirely abdicate from engagement with ability, since their success as judged by the accountability system depends in part on their capacity to unlock high attainment amongst those who are not yet demonstrating it.

But, since their focus is the nurturing of attainment, rather than the nurturing of ability, this can be articulated in terms of the former rather than the latter. Hence the imperative is to maximise the number of high attaining learners and the level at which they attain.

A subset of ‘potential high attainers’ is supported to cross the appropriate threshold. At one extreme, schools may decide that all their learners who are not yet high attaining should be regarded thus. At the other, schools may prefer to focus exclusively on a significantly smaller group of ‘borderline high attainers’.

But schools must balance this attainment imperative against the wider purposes of education and the wider needs of learners, some of which may be influenced by ability. There is always concern that the accountability system overplays attainment at the expense of these wider needs, but that is an argument for another day.

In the past, high attainers may have been regarded as a second-order priority, since emphasis was placed disproportionately on the achievement of key threshold measures set at a lower level and the borderline candidates who could be supported to achieve them. But schools are increasingly driven by the accountability system to improve performance at all levels of prior attainment.

Ofsted’s choice of ‘most able’ is misleading because:

  • Ability and its derivative ‘able’ are heavily loaded and contentious terms. There is comparatively little consensus over what they mean, hence their application without careful definition is always problematic. Ability and attainment are not synonymous but Ofsted’s focus is exclusively on the latter. There is no measure of ability in the School Performance Tables which confine themselves to measures of attainment (including progress) and destination.
  • ‘Most able’ is an absolute term normally denoting those at the extreme of the ability distribution. It suggests a markedly higher threshold than ‘highly able’, ‘more able’ or simply ‘able’. Yet Ofsted’s own definition accounts for some 50% of all learners (see below).

It may be that Ofsted wished to include within its definition some learners who would feature amongst the high attaining population but are underachieving, perhaps as a consequence of disadvantage. But Ofsted is not interested in ability per se, only in its successful conversion to high attainment.

It may be that Ofsted’s choice of terminology was also influenced by their wish to maintain a clear distinction between attainment and progress. Perhaps they were concerned that using the term ‘high attainer’ might confuse this distinction.

Given the considerable scope for confusion I have adopted the terminology ‘high attainers/high attaining learners’ and ‘potential high attainers’. The former means those who have achieved or exceeded a specified assessment outcome and are making commensurate progress. The latter means those who, with appropriate support, might become high attainers. 

 .

Comparing Ofsted and Performance Table Definitions

The Primary and Secondary School Performance Tables report attainment and progress for the pupil population as a whole, but also separately for ‘high attainers’, ‘middle attainers’ and ‘low attainers’.

Each of these groups is defined by reference to their performance in the earliest relevant key stage assessment. Currently KS1 assessment is used for the Primary Tables – though this may change in future – and KS2 for the Secondary Tables.

The Tables report attainment at later key stages by those who achieved or exceeded the initial baseline marker. By this means, and through the expected levels of progress methodology (of which more below), they highlight the improvement made by such learners across one (primary) or two (secondary 11-16) key stages.

The assumption is that a perfect school will ensure that all of its pupils – whether high, middle or low attainers – achieve the commensurate attainment benchmarks at later key stages and so make at least the expected progress.

Of course, many circumstances can intervene to prevent even the best schools from achieving perfection!  The worst case scenario is that no learners make the expected progress. There is inevitably a distribution of schools between these two extremes.

Other things being equal, one might expect more high attainers than middle attainers to make the expected progress, and more middle attainers than low attainers to do so. This is borne out by the national data in the 2012 Performance Tables.

At school level, if the success rate for high attainers compares unfavourably with those for middle and low attainers, and so is out of kilter with the national data, this is taken as evidence that the former are comparatively less well served by the school. The assumption is that high attainers have not received the same degree of targeted challenge and support as their lower attaining peers.

In practice, other factors may come into play, principal among them the proportion of each sub-population within the relevant year group. Do high attainers tend to perform better in the schools and year groups where they are most heavily concentrated, or is the reverse true in certain circumstances? Is there an optimal proportion? Tempting though it is to pursue that question, we must return to the matter in hand.

The User Guide for the 2012 Secondary Tables explains:

‘Prior attainment definitions are based on the KS2 test results attained by pupils on completion of the primary school phase:

  • Below expected level = those below Level 4 in the KS2 tests;
  • At expected level = those at Level 4 in the KS2 tests;
  • Above expected level = those above Level 4 in the KS2 tests.

To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in National Curriculum Tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

So the 2012 Secondary Performance Tables define high attainers as those who average above Level 4 performance across the three core subjects.

Learners who achieve highly in one subject are not counted if their performance in the other two drags them below the average.
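The User Guide methodology reduces to a short calculation, sketched below. The 24 and 30 thresholds come directly from the User Guide extract above; the points per level (Level 3 = 21, Level 4 = 27, Level 5 = 33) follow the standard National Curriculum points scale and are my assumption, not quoted in the guide.

```python
# Classify a pupil as low / middle / high attaining using the
# 2012 Secondary Performance Tables methodology: average the KS2
# point scores for English, maths and science, then apply the
# User Guide thresholds (<24 low, 24-29.99 middle, >=30 high).

# Standard National Curriculum points per KS2 level (an assumption
# here, not stated in the User Guide extract quoted above).
POINTS = {3: 21, 4: 27, 5: 33}

def classify(english_level, maths_level, science_level):
    aps = (POINTS[english_level] + POINTS[maths_level] + POINTS[science_level]) / 3
    if aps < 24:
        return "low"
    if aps < 30:
        return "middle"
    return "high"

# Level 5 in all three core subjects: APS 33 -> high attainer.
print(classify(5, 5, 5))    # high

# A strong English result (Level 5) dragged down by Level 4 in the
# other two subjects: APS 29 -> counted only as a middle attainer.
print(classify(5, 4, 4))    # middle
```

The second example illustrates the point in the text: a learner who achieves highly in one subject is not counted as a high attainer if performance in the other two drags the average below 30.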

And, on this measure, the 2012 Secondary Tables show that, nationally, 33.1% of pupils attending state maintained schools qualified as high attainers.

The comparable percentage in the Primary Tables, based on an average point score of 18 or more across KS1 English and maths assessments, is 24%, quite considerably lower.

The implications of a definition of high attainers that includes one quarter of learners in the later primary years and one third of learners at secondary level are rarely discussed. Is there a case for a more consistent approach across the two sectors, or is it a reasonable assumption that there are significantly more high attainers in the secondary sector? Let us leave that question hanging and return to the comparison with Ofsted.

One might expect Ofsted to have adopted this same Performance Tables definition, so ensuring consistency across both arms of the accountability regime. That would have been most straightforward for schools on the receiving end of both.

But, as we have seen, for reasons unexplained and best known to itself, Ofsted uses a completely different threshold on which to base its definition, which is confined to the secondary sector because ‘The Most Able Students’ does not deal with primary schools.

The crux of Ofsted’s definition is the achievement of National Curriculum Level 5 in English, in mathematics, or in both English and maths. This is quite different from an average above Level 4 across English, maths and science.

This has the virtue of ‘counting in’ learners with relatively high attainment in one of the two principal subjects, but relatively low attainment in the other. Performance in science is not deemed relevant.

The 2012 Primary Performance Tables report the percentage achieving Level 5 in both English and maths as 27%, while 39% achieved Level 5 in maths and 38% did so in English.

Hence 12% achieved Level 5 in maths but not English (39-27) and 11% did so in English but not maths (38-27) – so the total achieving Level 5 in one, the other or both subjects in 2012 is 27 + 12 + 11 = 50%.
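The arithmetic here is simple inclusion-exclusion, sketched with the 2012 national figures quoted above:

```python
# Inclusion-exclusion over the 2012 KS2 results quoted above:
# 38% achieved Level 5 in English, 39% in maths, 27% in both.
english = 38
maths = 39
both = 27

english_only = english - both    # 11% - Level 5 in English alone
maths_only = maths - both        # 12% - Level 5 in maths alone

# Percentage achieving Level 5 in one subject, the other, or both.
either = english_only + maths_only + both
print(either)    # 50 - half the cohort meets Ofsted's attainment threshold
```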

I cannot find all the data to undertake the same calculation for the equivalent Ofsted-derived definition for the primary sector. We know the national percentages achieving Level 3 in 2012 – 27% reading, 14% writing, 22% maths – but not what proportion of KS1 learners achieved one, two and three Level 3s respectively.

One can reasonably predict that the total will significantly exceed the 24% obtained by the Performance Tables methodology.

Reverting to the secondary data, it might be argued that, while 2012 outcomes are applicable for learners now in Year 7, one must use progressively earlier KS2 data for learners in older year groups.

That might be expected to depress slightly the percentage exceeding the threshold – but it does not alter the fact that the basis of the Ofsted definition is entirely different to (and substantively more generous than) that in the Performance Tables.

And of course we have not yet factored in that proportion of learners judged to have had ‘the potential to achieve Level 5’ (or the equivalent Level 3 at the primary level).

There is little information in ‘The most able students’ about the likely size of this ‘potential high attainers’ group. The definitional footnote mentions EAL learners who do not yet have the skills to demonstrate Level 5 performance, but does not estimate how many learners are affected.

Any methodology adopted by schools might also be expected to factor in:

  • Near misses – most schools would include learners who achieved Level 4A in English or maths or both;
  • Disadvantaged learners – schools might include learners attracting the Pupil Premium who would have achieved Level 5 had no gap existed between the performance of advantaged and disadvantaged learners; and
  • An ideological predisposition – some schools might base their approach on the principle that all their learners are capable of Level 5 performance; others might regard this as imposing unrealistic expectations on a proportion of their learners.

Were this exercise to be undertaken at national level, it would encompass a comfortable majority of the secondary student population.

As noted above, there is likely to be significant variation between the size of schools’ ‘high attainer and potential high attainer’ populations, but most will find them more substantial than the term ‘most able’ might initially have led them to believe.

So, to summarise: we have two distinct measures of what constitutes a high attainer, one of which also includes potential high attainers. Both are catholic interpretations but one (Ofsted’s) is significantly more generous than the other.

That said, we have had to evolve Ofsted’s definition from indistinct clues. There is nothing overt and explicit to tell us what is meant by the term ‘most able’ as now used in the Inspection Guidance.

This situation is less than optimal for schools wishing to show themselves to best advantage on both sides of the accountability regime.

.

How Much Progress Does the Accountability Regime Expect from High Attainers?

It goes without saying that what constitutes high attainment depends on the context. In England, attainment measures are typically associated with end of key stage assessment and the instruments and grading scales applied to them. High-performing learners are expected to achieve a commensurately high grade in the appropriate assessment.

But, once a learner has demonstrated high attainment, they are expected to continue to demonstrate it, achieving commensurately high grades in subsequent assessments and so making consistently good progress between these different stage-related outcomes.

This assumption is integral to the accountability system, which makes no allowance for the non-linear development of most learners. It is assumed that, when viewed over the longer term, these inconsistencies are smoothed out: high attainers will typically remain so, throughout a key stage and even across key stages, indeed throughout their school careers.

The assumption is contestable but that, too, is an argument for another day.

The rate of progress is currently determined with reference to National Curriculum Levels. It is typically expected that all learners should make at least two levels of progress between KS1 and KS2; and at least three levels of progress between KS2 and KS4, but the reality is somewhat more complex.

For this purpose, GCSE Grades are mapped onto National Curriculum levels in such a way that learners achieving Level 5 at KS2 need to achieve at least GCSE Grade B to show three levels of progress across KS2-4.

.

  • GCSE Grade C corresponds to NC Level 7
  • GCSE Grade B corresponds to NC Level 8
  • GCSE Grade A corresponds to NC Level 9
  • GCSE Grade A* corresponds to NC Level 10

.

For learners with Level 5 at KS2, an A grade at GCSE would denote four levels of progress across KS2-4, while an A* grade would mean five levels of progress.

This is the ceiling – it is not possible for any Level 5 learner to achieve more progress than is denoted by an A* grade, though this would of course denote six levels of progress from KS2 Level 4.

So, in effect, the progression ceiling is lower for those with higher prior attainment than it is for middle and lower attainers, even though the former are arguably more likely to make further and faster progress than their peers.
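Under this mapping, ‘levels of progress’ reduces to a subtraction. A minimal sketch, assuming the standard grade-to-level correspondence implicit in the table above (C = level 7, B = 8, A = 9, A* = 10) and the standard three-levels expectation discussed in the text:

```python
# GCSE grades mapped onto National Curriculum levels, as in the
# table above (C = level 7, B = 8, A = 9, A* = 10).
GCSE_LEVEL = {"C": 7, "B": 8, "A": 9, "A*": 10}

def levels_of_progress(ks2_level, gcse_grade):
    return GCSE_LEVEL[gcse_grade] - ks2_level

def made_expected_progress(ks2_level, gcse_grade, expected=3):
    return levels_of_progress(ks2_level, gcse_grade) >= expected

# A Level 5 pupil needs at least a B for three levels of progress...
print(made_expected_progress(5, "B"))    # True
print(made_expected_progress(5, "C"))    # False

# ...and the ceiling effect: a Level 6 pupil can make at most four
# levels (A* = level 10), against six for a Level 4 pupil.
print(levels_of_progress(6, "A*"))    # 4
print(levels_of_progress(4, "A*"))    # 6
```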

The ‘levels of progress’ methodology rests on a further assumption – that these steps of progress are equidistant, equally steep, and so equally demanding. Hence it requires the same effort to climb three levels from Level 4 to GCSE Grade C as it does to climb from Level 6 to GCSE Grade A. I have sometimes seen this assumption disputed.

The methodology is far from perfect, which might help to justify the decision to dispense with it when National Curriculum levels go in 2016.

In the meantime, however, schools need to work within current expectations, as applied in the School Performance Tables. These will continue in force for at least two more sets of Tables – 2013 and 2014 – and probably a third in 2015.

So what are the current expectations?

The User Guide to the 2012 Performance Tables includes material which explains what it calls ‘progression trajectories’.

The note relating to the primary tables says:

‘The majority of children are expected to leave Key Stage 1 (age 7), working at least at level 2. During Key Stage 2, pupils are expected to make at least two levels’ progress, with the majority achieving at least a level 4 by age 11. Pupils entering Key Stage 2 at level 3 should progress at least to level 5; while those entering at level 1 should progress at least to level 3…These are minimum expectations and opportunities exist for schools to provide greater stretch for more able children, with the introduction of a level 6 test for 11 year olds.’

.

[Image: Progression trajectories (primary)]

.

A few hundred pupils a year reach level 4 at Key Stage 1 in maths and/or the different elements of English. An associated technical note reminds us that it was only the advent of KS2 Level 6 tests that enabled these learners to achieve the expected two levels of progress – previously they were limited to only one.

But Level 3 is the norm for primary high attainers and the introduction of the Level 6 tests has raised the ceiling for them, permitting many to exceed the standard expectation by making three levels of progress from KS1 to KS2. Three is the limit however.

Although the proportions of learners achieving Level 6 are still relatively small, numbers are increasing rapidly, especially in maths. The SFR containing provisional results from the 2013 tests shows that 7% of learners achieved Level 6 in KS2 maths.

.

The note relating to the secondary performance tables says:

‘The majority of children are expected to leave Key Stage 2 (age 11), working at least at level 4.  By the end of Key Stage 4, pupils who were at level 4 should progress to achieve at least a Grade C at GCSE; while pupils working at level 6 should be expected to achieve at least an A at GCSE…These are minimum expectations.’

.

[Image: Progression trajectories (secondary)]

.

But the associated technical note disagrees.

A diagram shows that the minimum expected progress from a KS2 Level 6 is Grade B, equivalent to only two levels of progress.

.

[Image: Secondary progression matrix diagram]

The text reinforces this:

‘Pupils attaining level 5 or level 6 at KS2 are expected to achieve at least a grade B at GCSE. Therefore all pupils achieving an A* – B are deemed to have made the expected progress, whether or not their prior attainment is known.’

The technical note is the current version. I checked the RAISE Online library in case it contained more recent information but – at the time of writing at least – it does not.

It seems that there is currently some confusion about whether or not learners with Level 6 are expected to progress at least three levels to GCSE grade A, at least as far as the Performance Tables are concerned.

This may be because the interpretation in the technical note predates the interpretation in the guidance note and has not been updated.

Clearly the higher level of expectation is preferable, because it is nonsensical that the very highest attainers need make only two levels of progress across five years of secondary schooling when everyone else is expected to make at least three.

.

Should We Expect More Progress from High Attainers?

Many schools have pushed beyond these minimum expectations, especially for their high attainers. It is fairly common practice for learners to be expected to make somewhere between two and three levels of progress from KS1 to KS2 and four levels of progress from KS2 to KS4.

Given that this practice is already firmly established, there seems to be a strong case for both arms of the accountability system to emulate it, so raising expectations for high attainers nationally, regardless of the schools they attend.

That would confirm the value and significance of Level 6 tests to primary schools, while secondary schools would reasonably expect the rapidly increasing number of learners arriving with KS2 L6 to reach GCSE A* five years later.

Another option would be to combine this additional stretch with a recalibration of the definition of high attainers – and of course its application to both arms of the accountability regime.

For the evidence from the Secondary Transition Matrices, held in the RAISE online library, shows just how much progress varies according to National Curriculum sub-levels.

When it comes to full levels, the Matrices show that, nationally, 51% of learners with Level 5 in maths made four or more levels of progress (to Grade A or above) in 2012, while 41% with Level 5 did so in English.

(Incidentally, the Matrices do not show progress from Level 6 because the KS2 level relates to performance some years earlier, typically in 2007 for those taking GCSE in 2012.)

.

[Image: Maths transition matrix – full grades]

[Image: English transition matrix – full grades]

.

But the real value of these Matrices lies in the breakdown they provide of progression by sub-level.

I have already drawn out the key points in an earlier post and will not repeat that material here, other than to note that, in 2012:

  • 87% of learners with a Level 5A at KS2 in English achieved at least four levels of progress, compared with 64% of those with 5B and 29% of those with 5C (the latter lower than the comparable conversion rate for those with a Level 4A). Moreover, 47% of those with 5A achieved five levels of progress to A*, compared with 20% of those with 5B and only 4% of those with 5C;
  • 84% of learners with Level 5A in maths secured at least four levels of progress, whereas 57% of those with 5B and 30% of those with 5C managed this (and once again, the conversion rate from Level 4A was higher than for 5C). And 50% of those with 5A made five levels of progress to A*, compared with 20% of those with 5B and just 6% of those with 5C.
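The sub-level figures quoted above can be collected into a small lookup, which makes the gap between 5A and 5C explicit (all values are the 2012 national conversion rates cited in the bullets):

```python
# Percentage making at least four levels of progress (to grade A or
# above) by KS2 sub-level, from the 2012 transition matrices quoted
# in the bullets above.
FOUR_PLUS_LEVELS = {
    "english": {"5A": 87, "5B": 64, "5C": 29},
    "maths":   {"5A": 84, "5B": 57, "5C": 30},
}

for subject, rates in FOUR_PLUS_LEVELS.items():
    gap = rates["5A"] - rates["5C"]
    print(f"{subject}: 5A-5C gap = {gap} percentage points")
# english: 58-point gap; maths: 54-point gap - a full-level measure
# that lumps 5C in with 5A masks very different progression rates.
```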

.

[Image: Maths transition matrix (by sub-level)]

 Maths

 .

[Image: English transition matrix (by sub-level)]

English

.

There is a clear distinction between the progress made by learners with Level 5A/B and by those with 5C, which might argue for the Performance Tables to adjust upwards the average point score expected of high attainers, while simultaneously raising the expectation to four levels of progress.

There may be reluctance to adjust the levels-driven progress methodology when it has a limited lifespan of three years at most. On the other hand:

  • There is already an issue – and it will become more pronounced over the next three years as more learners achieve KS2 L6.
  • As the plans for post-2016 assessment and accountability are developed and finalised, it is important that suitably high expectations of high attainers are transferred across from the current system to its successor.
  • Implementation of the 2016 reforms may be dependent on the outcome of the 2015 General Election – there is currently no guarantee that Labour will proceed with the removal of National Curriculum levels and/or follow the timetable laid down by the Government.

.

Summing Up

This marks the end of the first part of this post. I have tried to show:

  • How Ofsted inspection guidance places greater emphasis on high attainers, what this really means and where the meaning is unclear.
  • That the revisions introduced by Ofsted are a not quite comprehensive response to the self-imposed recommendations in ‘The most able students’.
  • How – in the absence of any guidance to the contrary – Ofsted’s assumed definition of high attainers (aka ‘the most able’) differs from that applied in the School Performance Tables, resulting in inconsistency between the two sides of the accountability regime which is sub-optimal for schools.
  • That, if Ofsted’s assumed definition stands, schools need to be prepared for the likelihood that the majority of their learners will fall within it.
  • That expectations of progress for high attainers in the Secondary Performance Tables are currently unclear.
  • That there is a strong case for increasing those expectations – for primary as well as secondary high attainers – which could if necessary be combined with an upwards adjustment of the definitions.

I have called for Ofsted to publish brief supplementary guidance to clarify its definitions and the wider implications of the revisions of its inspection guidance. This would ideally confirm that Ofsted’s definitions and Performance Table definitions are one and the same.

In Part Two, I will review how high attainers will be reported on in the 2013 Performance Tables, and how this differs from the arrangements in the 2012 Tables.

I will also set out how the proposals in the consultations on primary, secondary and post-16 accountability are expected to impact on high attainers. (If the response to the secondary consultation appears imminently I will reflect that in the analysis.)

Finally, I will offer some guidance for schools on how they might set about planning to improve their high attainers’ performance – and what they might include in a basket of key improvement measures, designed to be shared with learners, parents and other key stakeholders.

.

GP

October 2013

A Summer of Love for English Gifted Education? Episode 2: Ofsted’s ‘The Most Able Students’

 .

This post provides a close analysis of Ofsted’s Report: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

.


summer of love 1967 by 0 fairy 0

This is the second post in a short series, predicated on the assumption that we are currently enjoying a domestic ‘summer of love’ for gifted education.

According to this conceit, the ‘summer of love’ is built around three official publications, all of them linked in some way with the education of gifted learners, and various associated developments.

Part One in the series introduced the three documents:

  • An Ofsted Survey of how schools educate their most able pupils (still unpublished at that point); and
  • A planned ‘Investigation of school and college level strategies to raise the aspirations of high-achieving disadvantaged pupils to pursue higher education’, with the report programmed for publication in September 2013.

It provided a full analysis of the KS2 L6 Investigation and drew on the contractual specification for the Investigation of aspiration-raising strategies to set out what we know about its likely content and coverage.

It also explored the pre-publicity surrounding Ofsted’s Survey, which has been discussed exclusively by HMCI Wilshaw in the media. (There was no official announcement on Ofsted’s own website, though it did at least feature in their schedule of forthcoming publications.)

Part One also introduced a benchmark for the ‘The most able students’, in the shape of a review of Ofsted’s last foray into this territory – a December 2009 Survey called ‘Gifted and talented pupils in schools’.

I will try my best not to repeat too much material from Part One in this second Episode so, if you feel a little at sea without this background detail, I strongly recommend that you start with the middle section of that first post before reading this one.

I will also refer you, at least once, to various earlier posts of mine, including three I wrote on the day ‘The most able students’ was published:

  • My Twitter Feed – A reproduction of the real time Tweets I published immediately the Report was made available online, summarising its key points and recommendations and conveying my initial reactions and those of several influential commentators and respondents. (If you don’t like long posts, go there for the potted version!);

Part Two is dedicated almost exclusively to analysis of ‘The most able students’ and the reaction to its publication to date.

It runs a fine tooth comb over the content of the Report, comparing its findings with those set out in Ofsted’s 2009 publication and offering some judgement as to whether it possesses the ‘landmark’ qualities boasted of it by HMCI in media interviews and/or whether it justifies the criticism heaped on it in some quarters.

It also matches Ofsted’s findings against the Institutional Quality Standards (IQS) for Gifted Education – the planning and improvement tool last refreshed in 2010 – to explore what that reveals about the coverage of each document.

For part of my argument is that, if schools are to address the issues exposed by Ofsted, they will need help and support to do so – not only a collaborative mechanism such as that proposed in ‘Driving Gifted Education Forward’ – but also some succinct, practical guidance that builds on the experience developed during the lifetime of the late National Gifted and Talented Programme.

For – if you’d like a single succinct take-away from this analysis – I firmly believe that it is now timely for the IQS to be reviewed and updated to better reflect current policy and the new evidence base created in part by Ofsted and the other two publications I am ‘celebrating’ as part of the Summer of Love.

Oh, and if you want to find out more about my ‘big picture’ vision, may I refer you finally to the Gifted Phoenix Manifesto for Gifted Education.

But now it’s high time I began to engage you directly with what has proved to be a rather controversial text.

.

Ofsted’s Definition of ‘Most Able’

The first thing to point out is that Ofsted’s Report is focused very broadly in one sense, but rather narrowly in another.

The logic-defying definition of ‘most able students’ Ofsted adopts – for the survey that informs the Report – is tucked away in a footnote divided between the bottom of pages 6 and 7 of the Report.

This says:

For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.

It is hard to reconcile this definition with the emphasis in the title of the Report on ‘the most able students’, which suggests a much narrower population at one extreme of an ability distribution (not an attainment distribution, although most of the Report is actually about high attaining students, something quite different).

In fact, Ofsted’s sample includes:

  • All pupils achieving Level 5 and above in English – 38% of all pupils taking end KS2 tests in 2012 achieved this.
  • All pupils achieving Level 5 and above in maths – 39% of all pupils achieved this in 2012.
  • We also know that 27% of pupils achieved Level 5 or above in both English and maths in 2012. This enables us to deduce that approximately 11% of pupils managed Level 5 only in English and approximately 12% only in maths.
  • So adding these three together we get 27% + 11% + 12% = 50%. In other words, we have already included exactly half of the entire pupil population and have so far counted only ‘high attaining’ pupils.
  • But we also need to include a further proportion of pupils who ‘have the potential’ to achieve Level 5 in one or other of these subjects but do not do so. This sub-population is unquantifiable, since Ofsted gives only the example of EAL pupils, rather than the full range of qualifying circumstances it has included. A range of different special needs might also cause a learner to be categorised thus. So might a particularly disadvantaged background (although that rather cuts across other messages within the Report). In practice, individual learners are typically affected by the complex interaction of a whole range of different factors, including gender, ethnic and socio-economic background, special needs, month of birth – and so on. Ofsted fails to explain which factors it has decided are within scope and which outside, or to provide any number or percentage for this group that we can tack on to the 50% already deemed high attainers.

Some might regard this lack of precision as unwarranted in a publication by our national Inspectorate, finding reason therein to ignore the important findings that Ofsted presents later in the Report. That would be unfortunate.

Not only is Ofsted’s definition very broad, it is also idiosyncratic, even in Government terms, because it is not the same as the slightly less generous version in the Secondary School Performance Tables, which is based on achievement of Level 5 in Key Stage 2 tests of English, maths and science.

So, according to this metric, Ofsted is concerned with the majority of pupils in our secondary schools – several million in fact.

But ‘The Most Able Students’ is focused exclusively on the segment of this population that attends non-selective 11-16 and 11-18 state schools.

We are told that only 160,000 students from a total of 3.235m in state-funded secondary schools attend selective institutions.

Another footnote adds that, in 2012, of 116,000 students meeting Ofsted’s ‘high attainers’ definition in state-funded schools who took GCSEs in English and maths, around 100,000 attended non-selective schools, compared with 16,000 in selective schools (so some 86%).

This imbalance is used to justify the exclusion of selective schools from the evidence base, even though some further direct comparison of the two sectors might have been instructive – possibly even supportive of the claim that there is a particular problem in comprehensive schools that is not found in selective institutions. Instead, we are asked to take this claim largely on trust.

.

Exeter1 by Gifted Phoenix


.

The Data-Driven Headlines

The Report includes several snippets of data-based evidence to illustrate its argument, most of which relate to subsets of the population it has rather loosely defined, rather than that population as a whole. This creates a problematic disconnect between the definition and the data.

One can group the data into three categories: material relating to progression between Key Stages 2 and 4, material relating to achievement of AAB+ grades at A level in the so-called ‘facilitating subjects’ and material drawn from international comparisons studies. The first predominates.

.

Data About Progression from KS2 to KS4

Ofsted does not explain up front the current expectation that pupils should make at least three full levels of progress between the end of Key Stage 2 and the end of Key Stage 4, or explore the fact that this assumption must disappear when National Curriculum levels are removed in 2016.

The conversion tables say that pupils achieving Level 5 at the end of Key Stage 2 should manage at least a Grade B at GCSE. Incidentally – and rather confusingly – that also includes pupils who are successful in the new Level 6 tests.

Hence the expectation does not apply to some of the very highest attainers who, rather than facing extra challenge, need only make two levels of progress in (what is typically) five years of schooling.

I have argued consistently that three levels of progress is insufficiently challenging for many high attainers. Ofsted makes that assumption too – even celebrates schools that push beyond it – but fails to challenge the source or substance of that advice.
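The arithmetic behind the expected-progress rule is easy to sketch. The snippet below is my own illustration, not anything Ofsted or the DfE publishes: it treats adjacent GCSE grades as one National Curriculum level apart and maps a KS2 level to the minimum GCSE grade implied by the three-levels-of-progress expectation, which is why Level 5 equates to a Grade B and why Level 6 pupils grouped with Level 5 are effectively asked for only two levels.

```python
# Illustrative sketch (my own, not official) of the "expected progress"
# mapping from KS2 level to minimum GCSE grade. Adjacent GCSE grades are
# treated as one National Curriculum level apart.
GCSE_GRADES = ["G", "F", "E", "D", "C", "B", "A", "A*"]

def expected_minimum_grade(ks2_level: int, levels_of_progress: int = 3) -> str:
    """Minimum GCSE grade implied by the expected-progress rule.

    Anchored on the conventional mapping KS2 Level 4 -> Grade C
    (three levels of progress); other levels shift along the scale.
    """
    index = (ks2_level - 4) + 4 + (levels_of_progress - 3)
    return GCSE_GRADES[min(index, len(GCSE_GRADES) - 1)]

# A Level 5 pupil is expected to reach at least a Grade B...
assert expected_minimum_grade(5) == "B"
# ...whereas a Level 6 pupil would need a Grade A for three levels of
# progress, yet is grouped with Level 5 and so effectively asked for two.
assert expected_minimum_grade(6) == "A"
```

On this mapping, holding the most able only to three levels of progress caps the expectation at a Grade B, which is precisely the complacency the post describes.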

We are supplied with the following pieces of data, all relating to 2012:

  • 65% of ‘high attainers’ in non-selective secondary schools – not according to Ofsted’s definition above, but the narrower one of those achieving Key Stage 2 Level 5 in both English and maths – did not achieve GCSEs at A/A* in both those subjects. (Achieving A/A* from Level 5 equates to four or five levels of progress in the two subjects combined.) This group includes over 65,000 students (see pages 4, 6, 8, 12).
  • Within the same population, 27% of students did not achieve GCSEs at B or above in both English and maths. (Grade B from Level 5 represents the expected three levels of progress.) This accounts for just over 27,000 students (see pages 4, 6 and 12).
  • On the basis of this measure, 42% of FSM-eligible students did not achieve GCSEs at B or above in both English and maths, whereas the comparable figure for non-FSM students was 25%, giving a gap between FSM and non-FSM (rather than between FSM and all students) of 17%. We are not told what the gap was at A*/A, or for the ‘survey population’ as a whole (page 14)
  • Of those who achieved Level 5 in English (only) at Key Stage 2, 62% of those attending non-selective state schools did not achieve an A* or A Grade at GCSE (so making 4 or 5 levels of progress) and 25% did not achieve a GCSE B grade or higher (so making 3+ levels of progress) (page 12)
  • Of those who achieved Level 5 in maths (only) at Key Stage 2, 53% did not achieve A*/A at GCSE (4 or 5 levels of progress) and 22% did not achieve B or higher (3+ levels of progress) (page 12)
  • We are also given the differentials between boys and girls on several of these measures, but not the percentages for each gender. In English, for A*/A and for B and above, the gap is 11% in favour of girls. In maths, the gap is 6% in favour of girls at A*/A and 5% at B and above. In English and maths combined, the gap is 10% in favour of girls for A*/A and B and above alike (page 15).
  • As for ethnic background, we learn that non-White British students outperformed White British students by 2% in maths and 1% in English and maths together, but the two groups performed equally in English at Grades B and above. The comparable data for Grades A*/A show non-White British outperforming White British by 3% in maths and again 1% in English and maths together, while the two groups again performed equally in English (page 16)

What can we deduce from this? Well, not to labour the obvious, but what is the point of setting out a definition, however exaggeratedly inclusive, only to move to a different definition in the data analysis?

Why bother to spell out a definition based on achievement in English or maths, only to rely so heavily on data relating to achievement in English and maths?

There are also no comparators. We cannot see how the proportion of high attainers making expected progress compares with the proportion of middle and low attainers doing so, so there is no way of knowing whether there is a particular problem at the upper end of the spectrum. We can’t see the comparable pattern in selective schools either.

There is no information about the trend over time – whether the underperformance of high attainers is improving, static or deteriorating compared with previous years – and how that pattern differs from the trend for middle and low attainers.

The same applies to the information about the FSM gap, which is confined solely to English and maths, and solely to Grade B and above, so we cannot see how FSM students’ performance compares between the two subjects or at the top A*/A grades, even though that data is supplied for boys versus girls and White versus non-White British.

The gender, ethnic and socio-economic data is presented separately, so we cannot see how these different factors interact. This is despite HMI’s known concern about the underperformance of disadvantaged White boys in particular. It would have been helpful to see that concern linked across to this one.

Overall, the findings do not seem particularly surprising. The large gaps between the percentages of students achieving four and three levels of progress respectively are to be expected, given the orthodoxy that students need only make a minimum of three levels of progress rather than the maximum progress of which they are capable.

The FSM gap of 17% at Grade B and above is actually substantially lower than the gap at Grade C and above, which stood at 26.2% in 2011/12. Whether the A*/A gap demonstrates a further widening at the top end remains shrouded in mystery.

Although it is far too soon to have progression data, the report almost entirely ignores the impact of Level 6 on the emerging picture. And it forbears to mention the implications for any future data analysis – including trend analysis – of the decision to dispense with National Curriculum levels entirely with effect from 2016.

Clearly additional data of this kind might have overloaded the main body of the Report, but a data Annex could and should have been appended.

.

Why Ignore the Transition Matrices?

There is a host of information available about the performance of high attaining learners at Key Stage 4 and Key Stage 5 respectively, much of which I drew on for this post back in January 2013.

This applies to all state-funded schools and makes the point about high attainers’ underachievement in spades.

It reveals that, to some extent at least, there is a problem in selective schools too:

‘Not surprisingly (albeit rather oddly), 89.8% of students in selective schools are classified as ‘above Level 4’, whereas the percentage for comprehensive schools is 31.7%. Selective schools do substantially better on all the measures, especially the EBacc where the percentage of ‘above Level 4’ students achieving this benchmark is double the comprehensive school figure (70.7% against 35.0%). More worryingly, 6.6% of these high-attaining pupils in selective schools are not making the expected progress in English and 4.1% are not doing so in maths. In comprehensive school there is even more cause for concern, with 17.7% falling short of three levels of progress in English and 15.3% doing so in maths.’

It is unsurprising that selective schools tend to perform relatively better than comprehensive schools in maximising the achievement of high attainers, because they are specialists in that field.

But, by concentrating exclusively on comprehensive schools, Ofsted gives the false impression that there is no problem in selective schools when there clearly is, albeit not quite so pronounced.

More recently, I have drawn attention to the enormous contribution that can be added to this evidence base by the Key Stage 2 to 4 Transition Matrices available in the Raise Online library.

.

Transition matrices and student numbers: English (top) and maths (bottom)

.

.

These have the merit of analysing progress to GCSE on the basis of National Curriculum sub-levels, and illustrate the very different performance of learners who achieve 5C, 5B and 5A respectively.

This means we are able to differentiate within the hugely wide Ofsted sample and begin to see how GCSE outcomes are affected by the strength of learners’ KS2 level 5 performance some five years previously.

The tables above show the percentages for English and maths respectively, for those completing GCSEs in 2012. I have also included the tables giving the pupil numbers in each category.

We can see from the percentages that:

  • Of those achieving 5A in English, 47% go on to achieve an A* in the subject, whereas for 5B the percentage is 20% and for 5C as low as 4%.
  • Similarly, of those achieving 5A in Maths, 50% manage an A*, compared with 20% for those with 5B and only 6% for those with 5C.
  • Of those achieving 5A in English, 40% achieve Grade A, so there is a fairly even split between the top two grades. Some 11% achieve a Grade B and just 1% a Grade C.
  • In maths, 34% of those with 5A at KS2 go on to secure a Grade A, so there is a relatively heavier bias in favour of A* grades. A slightly higher 13% progress to a B and 3% to a Grade C.
  • The matrices show that, when it comes to the overall group of learners achieving Level 5, in English 10% get A*, 31% get A and 36% a B. Meanwhile, in maths, 20% get an A*, 31% an A and 29% a B. This illustrates perfectly the very significant advantage enjoyed by those with a high Level 5 compared with Level 5 as a whole.
  • More worryingly, the progress made by learners who achieve upper Level 4s at Key Stage 2 tends to outstrip that of those with 5Cs. In English, 70% of those with 5C made 3 levels of progress and 29% made 4 levels of progress. For those with 4A, the comparable percentages were 85% and 41% respectively. For those with 4B they were 70% (so equal to the 5Cs) and 21% respectively.
  • Turning to maths, the percentages of those with Level 5C achieving three and four levels of progress were 67% and 30% respectively, while for those with 4A they were 89% and 39% respectively and for 4B, 76% (so higher) and 19% (lower) respectively.
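To show how a transition-matrix row converts into the progress measures quoted above, here is a small sketch of my own (using the English 5A percentages listed earlier; the grade-to-levels mapping assumes that Grade B from KS2 Level 5 counts as three levels of progress):

```python
# Hedged illustration (my own, not Ofsted's or RAISEonline's code):
# computing levels-of-progress shares from one transition-matrix row.
GRADE_ORDER = ["G", "F", "E", "D", "C", "B", "A", "A*"]

def progress_shares(row: dict, ks2_level: int) -> dict:
    """Return the % of pupils making at least 3 and at least 4 levels of
    progress, given {GCSE grade: % of pupils} for one KS2 sub-level."""
    # Grade C sits three levels above KS2 Level 4, so the threshold grade
    # for n levels of progress from ks2_level is index (ks2_level + n - 3).
    def share(min_levels):
        threshold = ks2_level + min_levels - 3
        return sum(pct for g, pct in row.items()
                   if GRADE_ORDER.index(g) >= threshold)
    return {"3+ levels": share(3), "4+ levels": share(4)}

# English 5A row from the matrices quoted above.
english_5a = {"A*": 47, "A": 40, "B": 11, "C": 1}
print(progress_shares(english_5a, ks2_level=5))
# 98% of 5A pupils clear the three-levels bar; 87% make four or more.
```

On these figures virtually every 5A pupil makes the expected three levels of progress, which is exactly why the expected progress measure tells us so little about the most able.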

This suggests that, while there is undeniably an urgent and important issue at the very top, with half or fewer of 5As being translated into A* grades, the bulk of the problem lies at the lower end of Level 5, where there is a conspicuous dip compared with both higher and lower attainers.

I realise that there are health warnings attached to the transition matrices, but one can immediately see how this information significantly enriches Ofsted’s relatively simplistic analysis.

.

Data About A Level Achievement and International Comparisons

The data supplied to illustrate progression to A level and international comparisons is comparatively limited.

For A Level:

  • In 2012, 334 (so 20%) of a total of 1,649 non-selective 11-18 schools had no students achieving AAB+ Grades at A Level including at least two of the facilitating subjects.  A footnote tells us that this applies only to 11-18 schools entering at least five pupils at A level. There is nothing about the controversy surrounding the validity of the ‘two facilitating subjects’ proviso (pages 4, 6, 14)
  • Sutton Trust data is quoted from a 2008 publication suggesting that some 60,000 learners who were in the top quintile (20%) of performers in state schools at ages 11, 14 and 16 had not entered higher education by the age of 18; also that those known to have been eligible for FSM were 19% less likely than others to enter higher education by age 19. The most significant explanatory factor was ‘the level and nature of the qualifications’ obtained by those who had been FSM-eligible (page 15).
  • A second Sutton Trust report is referenced showing that, from 2007-2009, students from independent schools were over twice as likely to gain admission to ‘one of the 30 most highly selective universities’ as students from non-selective state schools (48.2% compared with 18%). However, this ‘could not be attributed solely to the schools’ average A level or equivalent results’ since 58% of applicants from the 30 strongest-performing comprehensive schools on this measure were admitted to these universities, compared with 87.1% from the highest-performing independent schools and 74.1% from the highest-performing grammar schools (pages 16-17)
  • The only international comparisons data is drawn from PISA 2009. The Report uses performance against the highest level in the tests of reading, maths and science respectively. It notes that, in reading, England ranked 15th on this measure though above the OECD average, in maths England ranked 33rd and somewhat below the OECD average and in science England was a strong performer somewhat above the OECD average (page 17)

Apart from the first item, all this material is now at least four years old.

There is no attempt to link KS2 progression to KS5 achievement, which would have materially strengthened the argument (and which is the focus of one of the Report’s central recommendations).

Nor is there any effort to link the PISA assessment to GCSE data, by explaining the key similarities and differences between the two instruments and exploring what that tells us about particular areas of strength and weakness for high attainers in these subjects.

There is, again, a wealth of pertinent data available, much of it presented in previous posts on this blog.

Given the relatively scant use of data in the Report, and the significant question marks about the manner in which it has been applied to support the argument, it is hardly surprising that much of the criticism levelled at Ofsted can be traced back to this issue.

All the material I have presented on this blog is freely available online and was curated by someone with no statistical expertise.

While I cannot claim my analysis is error-free, it seems to me that Ofsted’s coverage of the issue is impoverished by comparison. Not only is there too little data, there is too little of the right data to exemplify the issues under discussion.

But, as I have already stated, that is not sufficient reason to condemn the entire Report out of hand.

.

Exeter2 by Gifted Phoenix


 

The Qualitative Dimension of the Report

The Evidence Base

If you read some of the social media criticism heaped upon ‘The most able students’ you would be forgiven for thinking that the evidence base consisted entirely of a few dodgy statistics.

But Ofsted also drew on:

  • Field visits to 41 non-selective secondary schools across England, undertaken in March 2013. The sample (which is reproduced as an Annex to the Report) was drawn from each of Ofsted’s eight regions and included schools of different sizes and ‘type’ and ‘different geographical contexts’. Twenty-seven were 11-18 schools, two are described as 11-19 schools, 11 were 11-16 schools and one admitted pupils at 14. Eighteen were academy converters. Inspectors spent a day in each school, discussing issues with school leaders, staff and pupils (asking similar questions to check sources against each other) and they ‘investigated analyses of the school’s [sic] current data’. We know that:

‘Nearly all of the schools visited had a broadly average intake in terms of their students’ prior attainment at the end of Key Stage 2, although this varied from year group to year group.’

Three selective schools were also visited ‘to provide comparison’ but – rather strangely – that comparative evidence was not used in the Report.

  • A sample of 2,327 lesson observation forms collected from Section 5 inspections of a second sample of 109 non-selective secondary schools undertaken in academic year 2012/13. We are not told anything about the selection of this sample, so we have no idea how representative it was.
  • A survey of 93 responses made by parents and carers to a questionnaire Ofsted placed on the website of the National Association for Able Children in Education (NACE). Ofsted also ‘sought the views of some key external organisations and individuals’ but these are not named. I have been able to identify just one organisation and one individual who were approached, which perhaps betrays a rather thin sample.

I have no great problem with the sample of schools selected for the survey. Some have suggested that 41 is too few. It falls short of the 50 mentioned in HMCI’s pre-publicity but it is enough, especially since Ofsted’s last Report in December 2009 drew on evidence from just 26 primary and secondary schools.

The second sample of lesson observations is more suspect, in that no information is supplied about how it was drawn. So it is entirely possible that it included all observations from those schools whose inspections were critical of provision for high attainers, or that all the schools were rated as underperforming overall, or against one of Ofsted’s key measures. There is a sin of omission here.

The parental survey is very small and, since it was filtered through a single organisation that focuses predominantly on teacher support, is likely to have generated a biased sample. The failure to engage a proper cross-section of organisations and individuals is regrettable: in these circumstances one should either consult many or none at all.

.

Survey Questions

Ofsted is comparatively generous with information about its Survey instrument.

There were two fundamental questions, each supported by a handful of supplementary questions:

‘Are the most able students in non-selective state secondary schools achieving as well as they should?’ (with ‘most able’ defined as set out above). This was supported by four supplementary questions:

  • Are comprehensive schools challenging bright students in the way that the independent sector and selective system do?
  • Do schools track progression effectively enough? Do they know how their most able students are doing? What enrichment programme is offered to the most able students and what is its impact?
  • What is the effect of mixed ability classes on the most able students?
  • What is the impact of early entry at GCSE on the most able students?

‘Why is there such disparity in admissions to the most prestigious universities between a small number of independent and selective schools and the great majority of state-maintained non-selective schools and academies?’ This was supported by three supplementary questions:

  • What is the quality of careers advice and its impact on A level students, particularly in terms of their successful application to top universities? Are students receiving good advice and support on how to complete their UCAS forms/personal statements?
  • Are the most able students from disadvantaged backgrounds as likely as the most able students from more affluent families to progress to top universities, and if not why?
  • What are successful state schools doing to increase application success rates and what lessons can be learnt?

Evidence from the 41 non-selective schools was collected under six broad themes:

  • ‘the leadership of the school
  • the achievement of the most able students throughout the school
  • the transfer and transition of these students from their primary schools and their induction into secondary school
  • the quality of teaching, learning and assessment of the most able students
  • the curriculum and extension activities offered to the most able students
  • the support and guidance provided for the most able students, particularly when they were choosing subjects and preparing for university.’

But the survey also ‘focused on five key elements’ (page 32), which are virtually identical to the last five themes above.

.

Analysis of Key Findings

 

Top Level Conclusions

Before engaging in detail with the qualitative analysis from these sources, it is worth pausing to highlight two significant quantitative findings which are far more telling than those generated by the data analysis foregrounded in the Report.

Had I the good fortune to have reviewed the Report’s key findings prior to publication, I would have urged far greater prominence for:

  • ‘The 2,327 lesson observation evidence forms… showed that the most able students in only a fifth of these lessons were supported well or better.’
  • ‘In around 40% of the schools visited in the survey, the most able students were not making the progress of which they were capable. In a few of the schools visited, teachers did not even know who the most able students were.’

So, in a nutshell, one source of evidence suggests that, in 80% of lessons, support for the most able students is either inadequate or requires improvement.

Another source suggests that, in 40% of schools, the most able students are underachieving in terms of progress while, in a few schools, their identity is unknown.

And these findings apply not to a narrow group of the very highest attaining learners but, on the basis of Ofsted’s own definition, to over 50% of pupils!

Subject to the methodological concerns above, the samples appear sufficiently robust to be extrapolated to all English secondary schools – or the non-selective majority at least.

We do not need to apportion blame, or make schools feel that this is entirely their fault. But this is scandalous – indeed so problematic that it surely requires a concerted national effort to tackle it.

We will consider below whether the recommendations set out in the Report match that description, but first we need to engage with some of the qualitative detail.

The analysis below looks in turn at each of the six themes, in the order that they appear in the main body of the Report.

.

Theme 1 – Achievement of the Most Able Students

Key finding: ‘The most able students in non-selective secondary schools are not achieving as well as they should. In many schools, expectations of what the most able students should achieve are too low.’

Additional points:

  • [Too] many of the students in the problematic 40% of surveyed schools ‘failed to attain the highest levels at GCSE and A level’.
  • Academic progress in KS3 required improvement in 17 of the 41 schools. Data was neither accurate nor robust in seven of the 41. Progress differed widely by subject.
  • At KS4, the most able were making less progress than other students in 19 of the 41 schools.
  • At KS5, the most able were making ‘less than expected progress’ in one or more subjects at 17 of the 41 schools.

.

Theme 2 – Leadership and Management

Key Finding: ‘Leaders in our secondary schools have not done enough to create a culture of scholastic excellence, where the highest achievement in academic work is recognised as vitally important. Schools do not routinely give the same attention to the most able as they do to low-attaining students or those who struggle at school.’

Additional points:

  • Nearly all school leaders claimed to be ambitious for their most able students, but this was not realised in practice in over 40% of the sample.
  • In less effective schools initiatives were usually new or rudimentary and had not been evaluated.
  • Students were taught mainly in mixed ability groups in about a third of the schools visited. Setting was typically restricted to core subjects and often introduced for English and science relatively late in KS3.
  • This had no detrimental effect in ‘the very best schools’ but, in the less effective, work was typically pitched to average attainers.
  • Seven schools had revised their policy on early GCSE entry because of a negative impact on the number of the most able achieving top grades.
  • Leaders in the best schools showed high aspirations for their most able students, providing high-quality teaching and work matched to their needs. Results were well above average and high proportions achieved A*/A grades at GCSE and A level.
  • The best leaders ensure their high aspirations are understood throughout the school community, set high expectations embodied in stretching targets, recruit strong staff and deploy them as specialists and create ‘a dynamic, innovative learning environment’.

.

Theme 3 – Transfer and Transition

Key Finding: ‘Transition arrangements from primary to secondary school are not effective enough to ensure that students maintain their academic momentum into Year 7. Information is not used carefully so that teachers can plan to meet the most able students’ needs in all lessons from the beginning of their secondary school career.’

Additional points:

  • The quality of transition is much too variable. Arrangements were weak in over one quarter of schools visited. Work was repeated in KS3 or was insufficiently challenging. Opportunities were missed to extend and consolidate previous learning.
  • Simple approaches were most effective, and were easier to implement in schools with few primary feeders or long-established cluster arrangements.
  • In the best examples secondary schools supported the most able before transfer, through specialist teaching and enrichment/extension activities.
  • In many schools activities were typically generic rather than targeted at the most able and many leaders didn’t know how effective they were for this group.
  • In over a quarter of schools the most able ‘did not get off to a good start’ in Year 7 because expectations were too low, work was insufficiently demanding and pupils were under-challenged.
  • Overall inspectors found serious weaknesses in this practice.
  • Effective practice includes: pre-transfer liaison with primary teachers and careful discussion about the most able; gathering a wide range of data to inform setting or class groups; identifying the most able early and implementing support for them to maintain their momentum; and fully evaluating pre-transfer activities and adapting them in the light of that.

.

Exeter3 by Gifted Phoenix


.

Theme 4 – The Quality of Teaching, Learning and Assessment

Key Findings:

‘Teaching is insufficiently focused on the most able at KS3. In over two-fifths of the schools visited for the survey, students did not make the progress that they should, or that they were capable of, between the ages of 11 and 14. Students said that too much work was repetitive and undemanding in KS3. As a result, their progress faltered and their interest in school waned.

Many students became used to performing at a lower level than they are capable of. Parents or carers and teachers accepted this too readily. Students did not do the hard work and develop the resilience needed to perform at a higher level because more challenging tasks were not regularly demanded of them. The work was pitched at the middle and did not extend the most able. School leaders did not evaluate how well mixed-ability group teaching was challenging the most able students.’

Additional points:

  • The reasons for slow progress varied between schools and subjects but included: failure to recognise and challenge the most able; variability in approaches across subjects and year groups; inconsistent application of school policy; and lack of focus by senior and middle leaders.
  • Weaker provision demonstrated: insufficient tracking of the most able, inadequate rapid intervention strategies, insufficiently differentiated homework, failure to apply Pupil Premium funding and little evaluation of the impact of teaching and support.
  • In a few schools the organisation of classes inhibited progress, as evidenced by limited knowledge of the effectiveness of differentiation in mixed ability settings and lack of challenge, particularly in KS3.
  • Eight schools had moved recently to grouping by ability, particularly in core subjects. Others indicated they were moving towards setting, streaming or banding most subjects. Schools’ data showed this beginning to have a positive impact on outcomes.

.

Theme 5 – Curriculum and Extension Activities

Key Findings:

‘The curriculum and the quality of homework required improvement. The curriculum in KS3 and early entry to GCSE examination are among the key weaknesses found by inspectors. Homework and the programme of extension activities for the most able students, where they existed, were not checked routinely for their impact or quality. Students said that too much homework was insufficiently challenging; it failed to interest them, extend their thinking or develop their skills.

Inequalities between different groups of the most able students are not being tackled satisfactorily. The attainment of the most able students who are eligible for FSM, especially the most able boys, lags behind that of other groups. Few of the schools visited used the Pupil Premium funding to support the most able students from the poorest backgrounds.

Assessment, tracking and targeting are not used sufficiently well in many schools. Some of the schools visited paid scant attention to the progress of their most able students.’

Additional points:

  • In over a quarter of schools visited, aspects of the curriculum, including homework, required improvement. In two schools the curriculum failed to meet the needs of the most able.
  • In one in seven schools, leaders had made significant changes recently, including more focus on academic subjects and more setting.
  • But schools did not always listen to feedback from their most able students. Many did not ask students how well the school was meeting their needs or how to improve further.
  • In weaker schools students were rarely given extension work. Sixth form students reported insufficient opportunities to think reflectively and too few suggestions for wider, independent reading.
  • Many in less effective schools felt homework could be more challenging. Few were set wider research or extension tasks.
  • While some leaders said extra challenge was incorporated in homework, many students disagreed. Few school leaders were aware of the homework provided to these students. Many schools had limited strategies for auditing and evaluating its quality.
  • Most school leaders said a wide range of extension tasks, extra-curricular and enrichment activities was provided for the most able, but these were usually for all students. Targeted activities, when undertaken, were rarely evaluated.
  • Research suggests it is important to provide access to such activities for the most able students where parents are not doing so. Schools used Pupil Premium for this in only a few instances.
  • The Premium was ‘generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds’.
  • Strong, effective practice was exemplified by a curriculum well-matched to the needs of most able students, a good range and quality of extra-curricular activity, effective use of the Pupil Premium to enrich students’ curriculum and educational experience and motivating and engaging homework, tailored to students’ needs, designed to develop creativity and independence.
  • In over a third of schools visited, tracking of the most able was ‘not secure, routine or robust’. Intervention was often too slow.
  • In weaker schools, leaders were focused mainly on the C/D borderline; stronger schools focused on A*/A grades too, believing their pupils could do better than ‘the B grade that is implied by the expected progress measure’.
  • Some schools used assessment systems inconsistently, especially in some KS3 foundation subjects where there was insufficient or inaccurate data. In one in five schools, targets for the most able ‘lacked precision and challenge’.
  • In a fifth of schools, senior leaders had introduced improved monitoring systems to hold staff to account, but implementation was often at a very early stage. Only in the best schools were such systems well established.
  • The most effective included lesson observation, work scrutiny, data analysis and reviews of teacher planning. In the better schools students knew exactly what they needed to do to attain the next level/grade and received regular feedback on progress.
  • The most successful schools had in place a wide range of strategies including: ensuring staff had detailed knowledge of the most able, their strengths and interests; through comprehensive assessment, providing challenging programmes and high quality support that met students’ needs;  and rigorous tracking by year, department and key stage combined with swift intervention where needed.
  • Many leaders had not introduced professional development focused on the most able students. In over one fifth of schools visited, staff had not tackled these students’ needs, so teachers had developed neither the skills required to meet them nor up-to-date knowledge of the Year 6 curriculum and assessment arrangements. Stronger schools were learning with and from their peers and had formed links with a range of external agencies.

.

Theme 6 – Support and Guidance for University Entry

Key Findings:

‘Too few of the schools worked with families to support them in overcoming the cultural and financial obstacles that stood in the way of the most able students attending university, particularly universities away from the immediate local area. Schools did not provide much information about the various benefits of attending different universities or help the most able students to understand more about the financial support available.

Most of the 11-16 schools visited were insufficiently focused on university entrance. These schools did not provide students with sufficiently detailed advice and guidance on all the post-16 options available.

Schools’ expertise in and knowledge about how to apply to the most prestigious universities was not always current and relevant. Insufficient support and guidance were provided to those most able students whose family members had not attended university.’

Additional points:

  • Support and guidance varied in quality, accuracy and depth. Around half of schools visited ‘accepted any university as an option’. Almost a quarter had much to do to convince students and their families of the benefits of higher education, and began doing so too late.
  • Data provided by 26 of the 29 11-18 schools showed that just 16 students went to Oxbridge in 2011, only one of them eligible for FSM, and almost half of these came from just two of the schools. Nineteen schools had no students accepted at Oxbridge. The 2012 figures showed some improvement, with 26 students admitted to Oxbridge from 28 schools, three of them FSM-eligible.
  • In 2011, 293 students went to Russell Group universities, but only six were FSM eligible. By 2012 this had increased to 352, including 30 eligible for FSM, but over a quarter of the 352 came from just two schools.
  • Factors inhibiting application to prestigious universities included pressure to stay in the locality, cost (including fees), aversion to debt and low expectations. Almost half of the schools visited tackled this through partnership with local universities.
  • Schools did not always provide early or effective careers advice or information about the costs and benefits of attending university.
  • Some schools showed a lack of up-to-date intelligence about universities and their entrance requirements, but one third of those visited provided high quality support and guidance.
  • Some schools regarded going to any university as the indicator of success, disagreeing that it was appropriate to push students towards prestigious universities, rather than the ‘right’ institution for the student.
  • Most of the 11-16 schools visited were insufficiently focused on university entrance. They did not provide sufficiently detailed advice on post-16 options and did not track students’ destinations effectively, either post-16 or post-18.
  • The best schools: provided early on a planned programme to raise students’ awareness of university education; began engaging with students and parents about this as soon as they entered the school; provided support and guidance about subject choices, entry requirements and course content; supported UCAS applications; enabled students to visit a range of universities; and used alumni as role models.

.

Exeter4 by Gifted Phoenix

 

Ofsted’s Recommendations

There are two sets of recommendations in the Report, each with an associated commentary about the key constituents of good and bad practice. The first is in HMCI’s Foreword; the second in the main body of the Report.

.

HMCI’s Version

This leads with material from the data analysis, rather than with some of the more convincing evidence from the survey, or at least a judicious blend of the two sources.

He rightly describes the outcomes as unacceptable and inconsistent with the principle of comprehensive education, though his justification for omitting selective schools from the analysis is rather less convincing, especially since he is focused in part on narrowing the gap between the two as far as admission to prestigious universities is concerned.

Having pointed up deficiencies at whole school level and in lessons he argues that:

‘The term ‘special needs’ should be as relevant to the most able as it is to those who require support for their learning difficulties’

This is rather out of left field and is not repeated in the main body or the official recommendations. There are pros and cons to such a route – and it would anyway be entirely inappropriate for a group comprising over half of the secondary population.

HMCI poses ‘three key challenges’:

‘First, we need to make sure that our most able students do as well academically as those of our main economic competitors. This means aiming for A* and A grades and not being satisfied with less. Not enough has changed since 2009, when the PISA tests found that England’s teenagers were just over half as likely as those from other developed nations to reach the highest levels in mathematics in international tests.

The second challenge is to ensure, from early on, that students know what opportunities are open to them and develop the confidence to make the most of these. They need tutoring, guidance and encouragement, as well as a chance to meet other young people who have embraced higher education. In this respect, independent schools as well as universities have an important role to play in supporting state schools.

The third challenge is to ensure that all schools help students and families overcome cultural barriers to attending higher education. Many of our most able students come from homes where no parent or close relative has either experienced, or expects, progression to university. Schools, therefore, need to engage more effectively with the parents or carers of these students to tackle this challenge.’

This despite the fact that comparison with international competitors is almost entirely lacking from the Report, save for one brief section on PISA data.

The role of independent schools is also underplayed, while the role of universities is seen very much from the schools’ perspective – there is no effort to link together the ‘fair access’ and ‘most able’ agendas in any meaningful fashion.

Parental engagement is also arguably under-emphasised or, at least, confined almost exclusively to the issue of progression.

.

Ofsted’s Version

The ‘official’ text provides a standard overarching bullet point profile of poor and strong provision respectively.

  • Poor provision is characterised by: ‘fragile’ primary/secondary transfer; placement in groups where teaching is not challenging; irregular progress checks; a focus on D/C borderline students at the expense of the more able; and failure to prepare students well for A levels.
  • Strong provision features: leadership determined to improve standards for all students; high expectations of the most able amongst students, families and teachers; effective transition to sustain the momentum of the most able; early identification to inform tailoring of teaching and the curriculum; curricular flexibility to permit challenge and extension; grouping to support stretch from the start of secondary school;  expert teaching, formative assessment and purposeful homework; effective training and capacity for teachers to learn from each other; close monitoring of progress to inform rapid intervention where necessary; and effective support for application to prestigious universities.

A series of 13 recommendations is provided, alongside three Ofsted commitments. Ten of the 13 are aimed at schools and three at central Government.

I have set out the recommendations in the table below, alongside those from the previous Report, published in 2009.

 

Central Government

2009 Report:

  • Ensure planned catalogue of learning and professional development opportunities meets the needs of parents, schools and LAs
  • Ensure LAs hold schools more rigorously to account for the impact of their G&T provision

2013 Report:

  • DfE to ensure parents receive annual report recording whether students are on track to achieve as well as they should in national tests and exams
  • DfE to develop progress measures from KS2 to KS4 and KS5
  • DfE to promote new destination data showing progression to (Russell Group) universities
  • Ofsted will focus inspections more closely on teaching and progress of most able, their curriculum and the information, advice and guidance provided to them
  • Ofsted will consider in more detail during inspection how well Pupil Premium is used to support disadvantaged most able
  • Ofsted will report inspection findings about this group more clearly in school, sixth form and college reports

Local Authorities

2009 Report:

  • Hold schools more rigorously to account for the impact of their G&T provision
  • Encourage best practice by sharing with schools what works well and how to access appropriate resources and training
  • Help schools produce clearer indicators of achievement and progress at different ages

2013 Report: none

Schools

2009 Report:

  • Match teaching to pupils’ individual needs
  • Listen to pupil feedback and act on it
  • Inform parents and engage them more constructively
  • Use funding to improve provision through collaboration
  • Ensure lead staff have strategic clout
  • Ensure rigorous audit and evaluation processes

2013 Report:

  • Develop culture and ethos so needs of most able are championed by school leaders
  • Help most able to leave school with best qualifications by developing skills, confidence and attitudes needed to succeed at the best universities
  • Improve primary-secondary transfer so all Year 7 teachers know which students achieved highly and what aspects of the curriculum they studied in Year 6, and use this to inform KS3 teaching
  • Ensure work remains challenging throughout KS3 so most able make rapid progress
  • Ensure leaders evaluate mixed ability teaching so most able are sufficiently challenged and make good progress
  • Evaluate homework to ensure it is sufficiently challenging
  • Give parents better and more frequent information about what their children should achieve and raise expectations where necessary
  • Work more closely with families, especially first generation HE applicants and FSM-eligible, to overcome cultural and financial obstacles to HE application
  • Develop more knowledge and expertise to support applications to the most prestigious universities
  • Publish more widely the university destinations of their students

TABLE 1: COMPARING OFSTED RECOMMENDATIONS IN 2009 AND 2013

The comparison serves to illustrate the degree of crossover between the two Reports – and to what extent the issues raised in the former remain pertinent four years on.

The emboldened items under the 2009 Report are still outstanding and are not addressed in the latest Report. There is nothing about providing support for schools from the centre; and nothing whatsoever about the role of the ‘middle tier’, however that is composed. Ofsted’s new Report might have been enriched by some cross-reference to its predecessor.

The three recommendations directed at the centre are relatively limited in scope – fundamentally restricted to elements of the status quo and probably demanding negligible extra work or resource:

  • The reference to an annual report to parents could arguably be satisfied by the existing requirements, which are encapsulated in secondary legislation.
  • It is not clear whether promoting the new destination measures requires anything more than their continuing publication – the 2013 version is scheduled for release this very week.
  • The reference to development of progress measures may be slightly more significant but probably reflects work already in progress. The consultation document on Secondary School Accountability proposed a progress measure based on a new ‘APS8’ indicator, calculated through a Value Added method and using end KS2 results in English and maths as a baseline:

‘It will take the progress each pupil makes between Key Stage 2 and Key Stage 4 and compare that with the progress that we expect to be made by pupils nationally who had the same level of attainment at Key Stage 2 (calculated by combining results at end of Key Stage 2 in English and mathematics).’

However this applies only to KS4, not KS5, and we are still waiting to discover how the KS2 baseline will be graded from 2016 when National Curriculum levels disappear.

This throws attention back on the Secretary of State’s June 2012 announcement, so far unfulfilled by any public consultation:

‘In terms of statutory assessment, however, I believe that it is critical that we both recognise the achievements of all pupils, and provide for a focus on progress. Some form of grading of pupil attainment in mathematics, science and English will therefore be required, so that we can recognise and reward the highest achievers as well as identifying those that are falling below national expectations. We will consider further the details of how this will work.’
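For concreteness, the Value Added method quoted earlier – comparing each pupil’s KS4 result with the national average achieved by pupils with the same KS2 baseline – can be sketched as follows. This is a minimal illustration only: the cohort data, point scores and baselines below are invented, not DfE figures, and the real APS8 calculation involves a defined basket of qualifications.

```python
# Illustrative sketch of a Value Added progress measure: each pupil's KS4
# points are compared with the national mean KS4 points achieved by pupils
# with the same KS2 (English + maths) baseline. All data here is invented.
from collections import defaultdict
from statistics import mean

# (ks2_baseline, ks4_points) for a notional national cohort
national = [
    (4.0, 280), (4.0, 300), (4.5, 310), (4.5, 330),
    (5.0, 340), (5.0, 360), (5.5, 370), (5.5, 390),
]

# 1. Expected (national mean) KS4 score for each KS2 baseline.
by_baseline = defaultdict(list)
for ks2, ks4 in national:
    by_baseline[ks2].append(ks4)
expected = {ks2: mean(scores) for ks2, scores in by_baseline.items()}

# 2. A pupil's value-added score is actual minus expected for their baseline.
def value_added(ks2_baseline, ks4_points):
    return ks4_points - expected[ks2_baseline]

print(value_added(5.0, 365))   # prints 15: above national expectation
print(value_added(4.5, 305))   # prints -15: below national expectation
```

A school-level progress figure would then be the average of its pupils’ value-added scores – which is why, as noted above, everything hinges on how the KS2 baseline is graded once National Curriculum levels disappear.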

.

The Balance Between Challenge and Support

It is hard to escape the conclusion that Ofsted believe inter-school collaboration, the third sector and the market can together provide all the support that schools need (while the centre’s role is confined to providing commensurate challenge through a somewhat stiffened accountability regime).

After four years of school-driven gifted education, I am not entirely sure I share their confidence that schools and the third sector can rise collectively to that challenge.

They seem relatively hamstrung at present by insufficient central investment in capacity-building and an unwillingness on the part of key players to work together collaboratively to update existing guidance and provide support. The infrastructure is limited and fragmented and leadership is lacking.

As I see it, there are two immediate priorities:

  • To provide and maintain the catalogue of learning opportunities and professional support mentioned in Ofsted’s 2009 report; and
  • To update and disseminate national guidance on what constitutes effective whole school gifted and talented education.

The latter should in my view be built around an updated version of the Quality Standards for gifted education, last refreshed in 2010. It should be adopted once more as the single authoritative statement of effective practice which more sophisticated tools – some, such as the Challenge Award, with fairly hefty price tags attached – can adapt and apply as necessary.

The Table appended to this post maps the main findings in both the 2009 and 2013 Ofsted Reports against the Standards. I have also inserted a cross in those sections of the Standards which are addressed by the main text of the more recent Report.

One can see from this how relevant the Standards remain to discussion of what constitutes effective whole school practice.

But one can also identify one or two significant gaps in Ofsted’s coverage, including:

  • identification – and the issues it raises about the relationship between ability and attainment
  • the critical importance of a coherent, thorough, living policy document incorporating an annually updated action plan for improvement
  • the relevance of new technology (such as social media)
  • the significance of support for affective issues, including bullying, and
  • the allocation of sufficient resources – human and financial –  to undertake the work.

.

Exeter5 by Gifted Phoenix

 

Reaction to the Report

I will not trouble to reproduce some of the more vituperative comment from certain sources, since I strongly suspect much of it to be inspired by personal hostility to HMCI and to gifted education alike.

  • To date there has been no formal written response from the Government, although David Laws recorded one or two interviews, such as this, which simply reflect existing reforms to accountability and qualifications. At the time of writing, the DfE page on Academically More Able Pupils has not been updated to reflect the Report.
  •  The Opposition criticised the Government for having ‘no plan for gifted and talented children’ but did not offer any specific plan of their own.
  • The Sutton Trust called the Report ‘A wake-up call to Ministers’ adding:

‘Schools must improve their provision, as Ofsted recommends. But the Government should play its part too by providing funding to trial the most effective ways to enable our brightest young people to fulfil their potential. Enabling able students to fulfil their potential goes right to the heart of social mobility, basic fairness and economic efficiency.’

Contrary to my expectations, there was no announcement arising from the call for proposals the Trust itself issued back in July 2012 (see Word attachment at bottom). A subsequent blog post called for:

‘A voluntary scheme which gives head teachers an incentive – perhaps through a top-up to their pupil premium or some other matched-funding provided centrally – to engage with evidence based programmes which have been shown to have an impact on the achievement of the most able students.’

‘We warned the Government in 2010 when it scrapped the gifted and talented programme that this would be the result. Many schools are doing a fantastic job in supporting these children. However we know from experience that busy schools will often only have time to focus on the latest priorities. The needs of the most able children have fallen to the bottom of the political and social agenda and it’s time to put it right to the top again.’

‘It is imperative that Ofsted, schools and organisations such as NACE work in partnership to examine in detail the issues surrounding this report. We need to disseminate more effectively what works. There are schools that are outstanding in how they provide for the brightest students. However there has not been enough rigorous research into this.’

  • Within the wider blogosphere, Geoff Barton was first out of the traps, criticising Ofsted for lack of rigour, interference in matters properly left to schools, ‘fatuous comparisons’ and ‘easy soundbites’.
  • The same day Tom Bennett was much more supportive of the Report and dispensed some commonsense advice based firmly on his experience as a G&T co-ordinator.
  • Then Learning Spy misunderstood Tom’s suggestions about identification, asking ‘how does corralling the boffins and treating them differently’ serve the aim of high expectations for all? He far preferred Headguruteacher’s advocacy of a ‘teach to the top’ curriculum, which is eminently sensible.
  • Accordingly, Headguruteacher contributed The Anatomy of High Expectations which drew out the value of the Report for self-evaluation purposes (so not too different to my call for a revised IQS).
  • Finally Chris Husbands offered a contribution on the IoE Blog which also linked Ofsted’s Report to the abolition of National Curriculum levels, reminding us of some of the original design features built in by TGAT but never realised in practice.

Apologies to any I have missed!

As for yours truly, I included the reactions of all the main teachers’ associations in the collection of Tweets I posted on the day of publication.

I published Driving Gifted Education Forward, a single page proposal for the kind of collaborative mechanism that could bring about system-wide improvement, built on school-to-school collaboration. It proposes a network of Learning Schools, complementing Teaching Schools, established as centres of excellence with a determinedly outward-looking focus.

And I produced a short piece about transition matrices which I have partly integrated into this post.

Having all but completed this extended analysis, have I changed the initial views I Tweeted on the day of publication?

.

.

Well, not really. My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

.


Bishop: ‘I’m afraid you’ve got a bad egg Mr Jones’, Curate: ‘Oh, no, my Lord, I assure you that parts of it are excellent!’

.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.

The commitments to toughen up the inspection regime are welcome but we need more explicit details of exactly how this will be managed, including any amendments to the framework for inspection and supporting guidance. Such adjustments must be prominent and permanent rather than tacked on as an afterthought.

We – all of us with an interest – need to fillet the key messages from the text and integrate them into a succinct piece of guidance as I have suggested, but carefully so that it applies to every setting and has built-in progression for even the best-performing schools. That’s what the Quality Standards did – and why they are still needed. Perhaps Ofsted should lead the revision exercise and incorporate them wholesale into the inspection framework.

As we draw a veil over the second of these three ‘Summer of Love’ publications, what are the immediate prospects for a brighter future for English gifted education?

Well, hardly incandescent sunshine, but rather more promising than before. Ofsted’s Report isn’t quite the ‘landmark’ HMCI Wilshaw promised and it won’t be the game changer some of us had hoped for, but it’s better than a poke in the eye with the proverbial blunt stick.

Yet the sticking point remains the capacity of schools, organisations and individuals to set aside their differences and secure the necessary commitment to work together to bring about the improvements called for in the Report.

Without such commitment too many schools will fail to change their ways.

.

GP

June 2013

.

.

ANNEX: MAPPING KEY FINDINGS FROM THE 2009 AND 2013 REPORTS AGAINST THE IQS

IQS Element IQS Sub-element Ofsted 2009 Ofsted 2013
Standards and progress Attainment levels high and progress strong Schools need more support and advice about standards and expectations Most able aren’t achieving as well as they should. Expectations are too low. 65% who achieved KS2 L5 in English and maths failed to attain GCSE A*/A grades. Teaching is insufficiently focused on the most able at KS3. Inequalities between different groups aren’t being tackled satisfactorily
SMART targets set for other outcomes x
Effective classroom provision Effective pedagogical strategies Pupils experienced inconsistent levels of challenge x
Differentiated lessons x
Effective application of new technologies
Identification Effective identification strategies x
Register is maintained
Population is broadly representative of intake
Assessment Data informs planning and progression Assessment, tracking and targeting not used sufficiently well in many schools
Effective target-setting and feedback x
Strong peer and self-assessment
Transfer and transition Effective information transfer between classes, years and institutions Transition doesn’t ensure students maintain academic momentum into Year 7
Enabling curriculum entitlement and choice Curriculum matched to learners’ needs Pupils’ views not reflected in curriculum planning The KS3 curriculum is a key weakness, as is early GCSE entry
Choice and accessibility to flexible pathways
Leadership Effective support by SLT, governors and staff Insufficient commitment in poorer performing schools School leaders haven’t done enough to create a culture of scholastic excellence. Schools don’t routinely give the same attention to most able as low-attaining or struggling students.
Monitoring and evaluation Performance regularly reviewed against challenging targets Little evaluation of progression by different groups x
Evaluation of provision for learners to inform development x
Policy Policy is integral to school planning, reflects best practice and is reviewed regularly Many policies generic versions from other schools or the LA; too much inconsistency and incoherence between subjects
School ethos and pastoral care Setting high expectations and celebrating achievement Many students become used to performing at a lower level than they are capable of. Parents and teachers accept this too readily.
Support for underachievers and socio-emotional needs
Support for bullying and academic pressure/opportunities to benefit the wider community
Staff development Effective induction and professional development x
Professional development for managers and whole staff x
Resources Appropriate budget and resources applied effectively
Engaging with the community, families and beyond Parents informed, involved and engaged Less than full parental engagement Too few schools supporting families in overcoming cultural and financial obstacles to attending university
Effective networking and collaboration with other schools and organisations Schools need more support to source best resources and training; limited collaboration in some schools; little local scrutiny/accountability Most 11-16 schools insufficiently focused on university entrance. Schools’ expertise and knowledge of prestigious universities not always current and relevant
Learning beyond the classroom Participation in a coherent programme of out-of-hours learning Link with school provision not always clear; limited evaluation of impact Homework and extension activities were not checked routinely for impact and quality

My Twitter Feed Summarising Key Points from Ofsted’s Report ‘The Most Able Students’

.

Here is the record of my Tweets from this morning summarising the main points from Ofsted’s newly-published Survey Report: ‘The Most Able Students’.

.

.

.

OFSTED’S KEY FINDINGS

.

.

.

OFSTED’S RECOMMENDATIONS

.

.

.

OFSTED’S COMMITMENTS

.

.

.

OVERALL ASSESSMENT

.

.

.

GOVERNMENT RESPONSE

.

.

.

OPPOSITION RESPONSE

.

.

.

POTENTIAL PLUS PRESS RELEASE

.

.

.

SUTTON TRUST PRESS RELEASE

.

.

.

WHAT THE UNIONS THINK

.