Gifted Phoenix 2014 Review and Retrospective

.

I am rounding out this year’s blogging with my customary backwards look at the various posts I published during 2014.

This is partly an exercise in self-congratulation but also flags up to readers any potentially useful posts they might have missed.

.


Norwegian Panorama by Gifted Phoenix

.

This is my 32nd post of the year, three fewer than the 35 I published in 2013. Even so, total blog views have increased by 20% compared with 2013.

Almost exactly half of these views originate in the UK. Other countries generating a large number of views include the United States, Singapore, India, Australia, Hong Kong, Saudi Arabia, Germany, Canada and South Korea. The site has been visited this year by readers located in 157 different countries.

My most popular post during 2014 was Gifted Education in Singapore: Part 2, which was published back in May 2012. This continues to attract interest in Singapore!

The most popular post written during 2014 was The 2013 Transition Matrices and High Attainers’ Performance (January).

Other 2014 posts that attracted a large readership were:

This illustrates just how strongly the accountability regime features in the priorities of English educators.

I have continued to concentrate on domestic topics: approximately 75% of my posts this year have been about the English education system. I have not ventured beyond these shores since September.

The first section below reviews the minority of posts with a global perspective; the second covers the English material. A brief conclusion offers my take on future prospects.

.

Global Gifted Education

I began the year by updating my Blogroll, with the help of responses to Gifted Education Activity in the Blogosphere and on Twitter.

This post announced the creation of a Twitter list containing all the feeds I can find that mention gifted education (or a similar term, whether in English or another language) in their profile.

I have continued to update the list, which presently includes 1,312 feeds and has 22 subscribers. If you want to be included – or have additions to suggest – please don’t hesitate to tweet me.

While we’re on the subject, I should take this opportunity to thank my 5,960 Twitter followers, an increase of some 28% compared with this time last year.

In February I published A Brief Discussion about Gifted Labelling and its Permanency. This recorded a debate I had on Twitter about whether the ‘gifted label’ might be used more as a temporary marker than a permanent sorting device.

March saw the appearance of How Well Does Gifted Education Use Social Media?

This proposed some quality criteria for social media usage and blogs/websites that operate within the field of gifted education.

It also reviewed the social media activity of six key players (WCGTC, ECHA, NAGC, SENG, NACE and Potential Plus UK) as well as wider activity within the blogosphere, on five leading social media platforms and utilising four popular content creation tools.

Some of the websites mentioned above have been recast since the post was published and are now much improved (though I claim no direct influence).

Also in March I published What Has Become of the European Talent Network? Part One and Part Two.

These posts were scheduled just ahead of a conference organised by the Hungarian sponsors of the network. I did not attend, fearing that the proceedings would have limited impact on the future direction of this once promising initiative. I used the posts to set out my reservations, which include a failure to engage with constructive criticism.

Part One scrutinises the Hungarian talent development model on which the European Network is based. Part Two describes the halting progress the Network has made to date. It identifies several deficiencies that need to be addressed if the Network is to have a significant and lasting impact on pan-European support for talent development and gifted education.

During April I produced PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance.

This analyses the performance of high achievers from a selection of 11 jurisdictions – either world leaders or prominent English-speaking nations – on the PISA 2012 Creative Problem Solving assessment.

It is a companion piece to a 2013 post which undertook a similar analysis of the PISA 2012 assessments in Reading, Maths and Science.

In May I contributed to the Hoagies’ Bloghop for that month.

Air on the ‘G’ String: Hoagies’ Bloghop, May 2014 was my input to discussion about the efficacy of ‘the G word’ (gifted). I deliberately produced a provocative piece, which stirred typically intense reactions in several quarters.

Finally, September saw the production of Beware the ‘short head’: PISA’s Resilient Students’ Measure.

This takes a closer look at the relatively little-known PISA ‘resilient students’ measure – focused on high achievers from disadvantaged socio-economic backgrounds – and how well different jurisdictions perform against it.

The title reflects the post’s conclusion that, like many other countries, England:

‘…should be worrying as much about our ‘short head’ as our ‘long tail’’.

And so I pass seamlessly on to the series of domestic posts I published during 2014…

.

English Education Policy

My substantive post in January was High Attainment in the 2013 Secondary and 16-18 Performance Tables, an analysis of the data contained in last year’s Tables and the related statistical publications.

Also in January I produced a much briefer commentary on The 2013 Transition Matrices and High Attainers’ Performance.

The purpose of these annual posts (and the primary equivalent which appears each December) is to synthesise data about the performance of high attainers and high attainment at national level, so that schools can more easily benchmark their own performance.

In February I wrote What Becomes of Schools that Fail their High Attainers?*

It examines the subsequent history of schools that recorded particularly poor results with high attainers in the Secondary Performance Tables. (The asterisk references a footnote apologising ‘for this rather tabloid title’.)

By March I was focused on Challenging NAHT’s Commission on Assessment, subjecting the Commission’s Report to a suitably forensic examination and offering a parallel series of recommendations derived from it.

My April Fool’s joke this year was Plans for a National Centre for Education Research into Free Schools (CERFS). This has not materialised but, had our previous Secretary of State for Education not been reshuffled, I’m sure it would have been only a matter of time!

Also in April I was Unpacking the Primary Assessment and Accountability Reforms, exposing some of the issues and uncertainties embodied in the government’s response to consultation on its proposals.

Some of the issues I highlighted eight months ago are now being more widely discussed – not least the nature of the performance descriptors, as set out in the recent consultation exercise dedicated to those.

But the reform process is slow. Many other issues remain unresolved and it seems increasingly likely that some of the more problematic will be delayed deliberately until after the General Election.

May was particularly productive, witnessing four posts, three of them substantial:

  • How well is Ofsted reporting on the most able? explores how Ofsted inspectors are interpreting the references to the attainment and progress of the most able added to the Inspection Handbook late last year. The sample comprises the 87 secondary inspection reports that were published in March 2014. My overall assessment? Requires Improvement.

.

  • A Closer Look at Level 6 is a ‘data-driven analysis of Level 6 performance’. As well as providing a baseline against which to assess future Level 6 achievement, this also identifies several gaps in the published data and raises as yet unanswered questions about the nature of the new tests to be introduced from 2016.
  • One For The Echo Chamber was prompted by The Echo Chamber reblogging service, whose founder objected that my posts are too long, together with the ensuing Twitter debate. Throughout the year the vast majority of my posts have been unapologetically detailed and thorough. They are intended as reference material, to be quarried and revisited, rather than the disposable vignettes that so many seem to prefer. To this day they get reblogged on The Echo Chamber only when a sympathetic moderator is undertaking the task.
  • ‘Poor but Bright’ v ‘Poor but Dim’ arose from another debate on Twitter, sparked by a blog post which argued that the latter are a higher educational priority than the former. I argued that both deserved equal priority, since it is inequitable to discriminate between disadvantaged learners on the basis of prior attainment and the economic arguments cut both ways. This issue continues to bubble like a subterranean stream, only to resurface from time to time, most recently when the Fair Education Alliance proposed that the value of pupil premium allocations attached to disadvantaged high attainers should be halved.

In June I asked Why Can’t We Have National Consensus on Educating High Attainers? and proposed a set of core principles that might form the basis for such consensus.

These were positively received. Unfortunately though, the necessary debate has not yet taken place.

.

The principles should be valuable to schools considering how best to respond to Ofsted’s increased scrutiny of their provision for the most able. Any institution seeking to revitalise its provision might discuss how the principles should be interpreted to suit its particular needs and circumstances.

July saw the publication of Digging Beneath the Destination Measures, which explored the higher education destinations statistics published the previous month.

It highlighted the relatively limited progress made towards improving the progression of young people from disadvantaged backgrounds to selective universities.

There were no posts in August, half of which was spent in Norway, taking the photographs that have graced some of my subsequent publications.

In September I produced What Happened to the Level 6 Reading Results?, an investigation into the mysterious collapse of L6 reading test results in 2014.

Test entries increased significantly. So did the success rates on the other Level 6 tests (in maths and in grammar, punctuation and spelling (GPS)). Even teacher assessment of L6 reading showed a marked upward trend.

Despite all this, the number of pupils successful on the L6 reading test fell from 2,062 in 2013 to 851 (provisional). The final statistics – released only this month – show a marginal improvement to 935, but the outcome is still extremely disappointing. No convincing explanation has been offered and the impact on 2015 entries is unlikely to be positive.

That same month I published Closing England’s Excellence Gaps: Part One and Part Two.

These present the evidence base relating to high attainment gaps between disadvantaged and other learners, to distinguish what we know from what remains unclear and so to provide a baseline for further research.

The key finding is that the evidence base is both sketchy and fragmented. We should understand much more than we do about the size and incidence of excellence gaps. We should be strengthening the evidence base as part of a determined strategy to close the gaps.

.

In October, 16-19 Maths Free Schools Revisited marked a third visit to the 16-19 maths free schools programme, concentrating on progress since my previous post in March 2013, especially at the two schools which have opened to date.

I subsequently revised the post to reflect an extended series of tweeted comments from Dominic Cummings, who was a prime mover behind the programme. The second version is called 16-19 Maths Free Schools Revisited: Odyssean Edition.

The two small institutions at KCL and Exeter University (very similar to each other) constitute a rather limited outcome for a project that was intended to generate a dozen innovative university-sponsored establishments. There is reportedly a third school in the pipeline but, as 2014 closes, details have yet to be announced.

Excellence Gaps Quality Standard: Version One is an initial draft of a standard encapsulating effective whole school practice in supporting disadvantaged high attainers. It updates and adapts the former IQS for gifted and talented education.

This first iteration needs to be trialled thoroughly, developed and refined but, even as it stands, it offers another useful starting point for schools reviewing the effectiveness of their own provision.

The baseline standard captures the essential ‘non-negotiables’ intended to be applicable to all settings. The exemplary standard is pitched high and should challenge even the most accomplished of schools and colleges.

All comments and drafting suggestions are welcome.

.

In November I published twin studies of The Politics of Setting and The Politics of Selection: Grammar Schools and Disadvantage.

These issues have become linked since Prime Minister Cameron has regularly proposed an extension of the former as a response to calls on the right wing of his party for an extension of the latter.

This was almost certainly the source of autumn media rumours that a strategy, originating in Downing Street, would be launched to incentivise and extend setting.

Newly installed Secretary of State Morgan presumably insisted that existing government policy (which leaves these matters entirely to schools) should remain undisturbed. However, the idea might conceivably be resuscitated for the Tory election manifesto.

Now that UKIP has confirmed its own pro-selection policy there is pressure on the Conservative party to resolve its internal tensions on the issue and identify a viable alternative position. But the pro-grammar lobby is unlikely to accept increased setting as a consolation prize…

.

Earlier in December I added a companion piece to ‘The Politics of Selection’.

How Well Do Grammar Schools Perform With Disadvantaged Students? reveals that the remaining 163 grammar schools have very different records in this respect. The poor performance of a handful is a cause for concern.

I also published High Attainment in the 2014 Primary School Performance Tables – another exercise in benchmarking, this time for primary schools interested in how well they support high attainers and high attainment.

This shows that HMCI’s recent distinction between positive support for the most able in the primary sector and a much weaker record in secondary schools is not entirely accurate. There are conspicuous weaknesses in the primary sector too.

Meanwhile, Chinese learners continue to perform extraordinarily well on the Level 6 maths test, achieving an amazing 35% success rate, up six percentage points since 2013. This domestic equivalent of the Shanghai phenomenon bears closer investigation.

My penultimate post of the year, HMCI Ups the Ante on the Most Able, collates all the references to the most able in HMCI’s 2014 Annual Report and its supporting documentation.

It sets out Ofsted’s plans for the increased scrutiny of schools and for additional survey reports that reflect this scrutiny.

It asks whether Ofsted’s renewed emphasis will be sufficient to rectify the shortcomings they themselves identify and – assuming it will not – outlines an additional ten-step plan to secure system-wide improvement.

Conclusion

So what are the prospects for 2015 and beyond?

My 2013 Retrospective was decidedly negative about the future of global gifted education:

‘The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.’

Despite evidence of a few ‘green shoots’ during 2014, my overall sense of pessimism remains.

Meanwhile, future prospects for high attainers in England hang in the balance.

Several of the Coalition Government’s education reforms have been designed to shift schools’ focus away from borderline learners, so that every learner improves, including those at the top of the attainment distribution.

On the other hand, Ofsted’s judgement that a third of secondary inspections this year

‘…pinpointed specific problems with teaching the most able’

would suggest that schools’ everyday practice falls some way short of this ideal.

HMCI’s commitment to champion the interests of the most able is decidedly positive but, as suggested above, it might not be enough to secure the necessary system-wide improvement.

Ofsted is itself under pressure and faces an uncertain future, regardless of the election outcome. HMCI’s championing might not survive the arrival of a successor.

It seems increasingly unlikely that any political party’s election manifesto will have anything significant to say about this topic, unless the enthusiasm for selection in some quarters can be harnessed and redirected towards the much more pertinent question of how best to meet the needs of all high attainers in all schools and colleges, especially those from disadvantaged backgrounds.

But the entire political future is shrouded in uncertainty. Let’s wait and see how things are shaping up on the other side of the election.

From a personal perspective I am closing in on five continuous years of edutweeting and edublogging.

I once expected to extract from this commitment benefits commensurate with the time and energy invested. But that is no longer the case, if indeed it ever was.

I plan to call time at the end of this academic year.

.

GP

December 2014

HMCI Ups the Ante on the Most Able

.

Her Majesty’s Chief Inspector Wilshaw made some important statements about the education of what Ofsted most often calls ‘the most able’ learners in his 2013/14 Annual Report and various supporting documents.


Another Norwegian Landscape by Gifted Phoenix

This short post compiles and summarises these statements, setting them in the context of current inspection policy and anticipated changes to the inspection process.

It goes on to consider what further action might be necessary to remedy the deficiencies Ofsted has identified in schools and to boost our national capacity to educate high attainers.

It continues a narrative which runs through several of my previous posts including:

.

What the Annual Report documents said

Ofsted’s press release marking publication of the 2013/14 Annual Report utilises a theme that runs consistently through all the documentation: while the primary sector continues to improve, progress has stalled in the secondary sector, resulting in a widening performance gap between the two sectors.

It conveys HMCI’s judgement that primary schools’ improvement is attributable to the fact that they ‘attend to the basics’, one of which is:

‘Enabling the more able [sic] pupils to reach their potential’

Conversely, the characteristics of secondary schools where improvement has stalled include:

‘The most able not being challenged’.

It is unclear whether Ofsted maintains a distinction between ‘more able’ and ‘most able’ since neither term is defined at any point in the Annual Report documentation.

In his speech launching the Annual Report, HMCI Wilshaw said:

‘The problem is also acute for the most able children. Primaries have made steady progress in helping this group. The proportion of pupils at Key Stage 2 gaining a Level 5 or above rose from 21% in 2013 to 24% this year. Attainment at Level 6 has also risen, particularly in mathematics, where the proportion reaching the top grade has increased from 3% to 9% in two years.

Contrast that with the situation in secondary schools. In 2013, nearly a quarter of pupils who achieved highly at primary school failed to gain even a B grade at GCSE. A third of our inspections of secondary schools this year pinpointed specific problems with teaching the most able – a third of inspections this year.

We cannot allow this lack of progress to persist. Imagine how dispiriting it must be for a child to arrive at a secondary school bursting with enthusiasm and keen to learn, only to be forced to repeat lessons already learnt and endure teaching that fails to stimulate them. To help tackle this problem, I have commissioned a report into progress at Key Stage 3 and it will report next year.’

HMCI’s written Commentary on the Annual Report says of provision in primary schools:

‘Many primary schools stretch the more able

Good and outstanding schools encourage wider reading and writing at length. Often, a school’s emphasis on the spiritual, moral, social and cultural aspects of the curriculum benefits all pupils but especially the more able, providing them with opportunities to engage with complex issues.

The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’ (Page 9)

The parallel commentary on provision in secondary schools says:

‘Too many secondary schools are not challenging the most able

In 2013, almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A or A* in those subjects at GCSE. Nearly a quarter of them did not even achieve a B grade.

Around a third of our inspections of secondary schools this year identified issues in the teaching of the most able pupils. Inspectors found that teachers’ expectations of the most able were too low. There is a worrying lack of scholarship permeating the culture of too many schools.

In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections and will publish a separate report on what we find.’ (Page 13)

The Annual Report itself adds:

‘Challenging the most able

England’s schools are still not doing enough to help the most able children realise their potential. Ofsted drew attention to this last year, but the story has yet to change significantly. Almost two thirds of the pupils in non-selective schools who attained highly at primary school in English and mathematics did not reach an A* or A in those subjects at GCSE in 2013. Nearly a quarter of them did not even achieve a B grade and a disproportionate number of these are boys. Our brightest pupils are not doing as well as their peers in some other countries that are significantly outperforming England. In PISA 2012, fewer 15-year-olds in England were attaining at the highest levels in mathematics than their peers in Germany, Poland and Belgium. In reading, however, they were on a par.

‘This year, our inspectors looked carefully at how schools were challenging their most able pupils. Further action for individual schools was recommended in a third of our inspection reports. The majority of recommendations related to improved teaching of this group of pupils. Inspectors called on schools to ensure that the most able pupils are being given challenging work that takes full account of their abilities. Stretching the most able is a task for the whole school. It is important that schools promote a culture that supports the most able pupils to flourish, giving them opportunities to develop the skills needed by top universities and tracking their progress at every stage.

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential. Ofsted will follow up its 2013 publication on the most able in secondary schools with another survey focusing on non-selective primary and secondary schools. As part of this survey, we will examine the transition of the most able pupils from one phase to the next.’

Rather strangely, there are substantive references in only two of the accompanying regional reports.

The Report on London – the region that arguably stands above all others in terms of overall pupil performance – says:

‘More able pupils [sic]

London does reasonably well overall for more able pupils. In 2012/13 the proportion of pupils who were high attainers in Year 6 and then went on to gain A* or A in GCSE English was 46% in London compared with 41% in England.  In mathematics, the proportions were 49% across England and 58% in London.

However, in 2012/13, seven local authorities – Croydon, Bexley, Havering, Lewisham, Lambeth, Tower Hamlets and Waltham Forest – were below the London and national proportions of previously high attaining pupils who went on to attain grade A* or A in GCSE English. With the exception of Bexley, the same local authorities also fell below the London and national levels for the proportion of previously high-attaining pupils who went on to attain grade A* or A in GCSE mathematics.

We have identified the need to secure more rapid progress for London’s more able pupils as one of our key priorities. Inspectors will be paying particular attention to the performance of the more able pupils in schools and local authorities where these pupils are not reaching their full potential.’

The Report on the North-West identifies a problem:

‘Too many of the more able students underperform at secondary school. Of the 23 local authorities in the North West, 13 are below the national level for the percentage of children achieving at least Level 5 at Key Stage 2 in English and mathematics. The proportion subsequently attaining A* or A at GCSE is very low in some areas, particularly Knowsley, Salford and Blackpool.’

But it does not mention tackling this issue amongst its regional priorities.

The six remaining regional reports are silent on the issue.

.

Summarising the key implications

Synthesising the messages from these different sources, it seems that:

  • Primary schools have made ‘steady progress’ in supporting the most able, improving their capacity to identify and develop their potential. 
  • Inspection evidence suggests one third of secondary schools have specific problems with teaching the most able. This is a whole school issue. Too many high attainers at the end of KS2 are no longer high attainers at the end of KS4. Teachers’ expectations are too low. A positive school culture is essential but there is ‘a worrying lack of scholarship permeating the culture of too many schools’.  
  • Ofsted will increase the scrutiny it gives to the performance of the most able in routine school inspections, taking account of the leadership shown by schools (which appears to mean the contribution made by school leaders within schools), and will sharpen its recommendations within school inspection reports to reflect this increased scrutiny. 
  • They will also publish a survey report in 2015 that will feature: the outcomes of their increased scrutiny; provision in ‘non-selective primary and secondary schools’ including transition between phases; and the progress of the most able learners in KS3. 
  • In London the need to secure more rapid progress for more able pupils is a priority for Ofsted’s regional team. They will focus particularly on progress in English and maths between KS2 and KS4 in seven local authorities performing below the national and London average. 

[Postscript: In his Select Committee appearance on 28 January 2015, HMCI said that the 2015 survey report will be published in May.

However, press reports a few days beforehand suggested that publication would be brought forward to Wednesday 4 March.

Publication ahead of the General Election, rather than immediately afterwards, puts pressure on the political parties to set out their response.

Will they continue to advance the familiar line that their generic standards-raising policies will ‘lift all ships’, or will they commit to a more targeted solution, such as the one I proposed here?]

.

All this suggests that schools would be wise to concentrate on strengthening leadership, school culture and transition – as well as eradicating any problems associated with teaching the most able.

KS3 is a particular concern in secondary schools. Although comparatively more attention will be paid to the secondary sector, primary schools will not escape Ofsted’s increased scrutiny.

This is as it should be since my recent analysis of high attainers and high attainment in the 2014 Primary Performance Tables demonstrates that there is significant underachievement amongst high attainers in the primary sector and, in particular, very limited progress in closing achievement gaps between disadvantaged and other learners at higher attainment levels.

Ofsted does not say that it will give particular attention to most able learners in receipt of the pupil premium. The 2013 survey report committed it to doing so, but I could find no such emphasis in my survey of secondary inspection reports.

.

Will this be enough?

HMCI’s continuing concern about the quality of provision for the most able raises the question whether Ofsted’s increased scrutiny will be sufficient to bring about the requisite improvement.

Government policy is to leave this matter entirely to schools, although this has been challenged in some quarters. Labour in Opposition has been silent on the matter since Burnham’s Demos speech in July 2011.

More recent political debate about selection and setting has studiously avoided the wider question of how best to meet the needs of the most able, especially those from disadvantaged backgrounds.

If HMCI Wilshaw were minded to up the ante still further, what additional action might he undertake within Ofsted and advocate beyond it?

I sketch out below a ten-step plan for his and your consideration.

.

  1. Ofsted should strengthen its inspection procedures by publishing a glossary and supplementary inspection guidance, so that schools and inspectors alike have a clearer, shared understanding of Ofsted’s expectations and what provision should look like in outstanding and good schools. This should feature much more prominently the achievement, progress and HE destinations of disadvantaged high attainers, especially those in receipt of the Pupil Premium.

.

  2. The initiative under way in Ofsted’s London region should be extended immediately to all eight regions and a progress report should be included in Ofsted’s planned 2015 survey.

.

  3. The Better Inspection for All consultation must result in a clearer and more consistent approach to the inspection of provision for the most able learners across all sectors, with separate inspection handbooks adjusted to reflect the supplementary guidance above. Relevant high attainment, high attainer and excellence gaps data should be added to the School Data Dashboard.

.

  4. Ofsted should extend its planned 2015 survey to include a thorough review of the scope and quality of support for educating the most able provided to schools through local authority school improvement services, academy chains, multi-academy trusts and teaching school alliances. It should make recommendations for extending and strengthening such support, eliminating any patchiness of provision.

.

  5. Reforms to the assessment and accountability frameworks mean that less emphasis will be placed in future on the achievement of national benchmarks by borderline candidates and more on the attainment and progress of all learners. But there are still significant gaps in the data published about high attainment and high attainers, especially the differential performance of advantaged and disadvantaged learners. The decision to abandon the planned data portal – in which it was expected some of this data would be deposited – is problematic. Increased transparency would be helpful.

.

  6. There are unanswered questions about the support that the new levels-free assessment regime will provide for the achievement and progression of the most able. There is a risk that a ‘mastery’-focused approach will emphasise progression through increased depth of study, at the expense of greater breadth and faster pace, thus placing an unnecessary constraint on their education. Guidance is desirable to help eliminate these concerns.

.

  7. The Education Endowment Foundation (EEF) should extend its remit to include excellence gaps. All EEF-sponsored evaluations should routinely consider the impact on disadvantaged high attainers. The EEF should also sponsor projects to evaluate the blend of interventions that are most effective in closing excellence gaps. The Toolkit should be revised where necessary to highlight more clearly where specific interventions have a differential impact on high attainers.

.

  8. Efforts should be made to establish national consensus on the effective education of high attainers through consultation on and agreement of a set of common core principles.

.

  9. A ‘national conversation’ is needed to identify strategies for supporting (disadvantaged) high attainers, pushing beyond the ideological disagreements over selection and setting to consider a far wider range of options, including more innovative approaches to within-school and between-school provision.

.

  10. A feasibility study should be conducted into the viability of a national, non-governmental learner-centred support programme for disadvantaged high attainers aged 11-18. This would be market-driven but operate within a supporting national framework. It would be managed entirely within existing budgets – possibly an annual £50m pupil premium topslice plus a matching contribution from universities’ fair access outreach funding.

.

GP

December 2014

High Attainment in the 2014 Primary School Performance Tables

.

This is my annual post reviewing data about high attainment and high attainers at the end of Key Stage 2.

Data Overload courtesy of opensourceway

It draws on the 2014 Primary School Performance Tables and the associated Statistical First Release (SFR), together with parallel material for previous years.

‘High attainment’ is taken to mean National Curriculum Level 5 and above.

‘High attainers’ are defined in accordance with the Performance Tables, meaning those with prior attainment above Level 2 in KS1 teacher assessments (average points score of 18 or higher). This measure obviously excludes learners who are particularly strong in one area but correspondingly weak in another.
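In practice the Performance Tables classification reduces to a simple threshold test on the KS1 average points score. A minimal sketch in Python (the 18-point threshold is as quoted above; the function name is illustrative):

```python
def is_high_attainer(ks1_aps: float) -> bool:
    """Performance Tables definition of a high attainer: prior attainment
    above Level 2 in KS1 teacher assessments, i.e. a KS1 average points
    score (APS) of 18 or higher."""
    return ks1_aps >= 18.0

print(is_high_attainer(19.0))  # True: an APS of 18+ counts as a high attainer
print(is_high_attainer(15.0))  # False: at or below the Level 2 threshold
```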

The proportions of the end-of-KS2 cohort defined as high, middle and low attainers have remained fairly constant since 2012.

High attainers presently constitute the top quartile of the relevant population, but this proportion is not fixed: it will increase as and when KS1 performance improves.

Year   High %   Middle %   Low %
2014     25        58        18
2013     25        57        18
2012     24        57        19

Table 1: Proportion of high, middle and low prior attainers in state-funded schools by year since 2012

 

The percentage of high attainers in different schools’ end-of-KS2 cohorts varies very considerably and is unlikely to remain constant from year to year. Schools with small year groups are particularly vulnerable to significant fluctuations.

The 2014 Performance Tables show that Minster School, in Southwell, Nottinghamshire and St Patrick’s Church of England Primary Academy in Solihull each had 88% high attainers.

Over 600 primary schools have 50% or more high attainers within their cohorts. But, at the other extreme, more than 570 have no high attainers at all, while some 1,150 have 5% or fewer.

This serves to illustrate the very unequal distribution of learners with high prior attainment between schools.

The commentary below opens with a summary of the headline findings. The subsequent sections focus in turn on the composite measure (reading, writing and maths combined), then on the outcomes of the reading, GPS (grammar, punctuation and spelling) and maths tests and finally on teacher assessment in writing.

I have tried to ensure that percentages are consistent throughout this analysis, but the effect of rounding means that some figures are slightly different in different SFR tables. I apologise in advance for – and will of course correct – any transcription errors.

.

Headlines

.

Overall Trends

Chart 1 below compares performance at level 5 and above (L5+) and level 4 and above (L4+) in 2013 and 2014. The bars on the left hand side denote L4+, while those corresponding to L5+ are on the right.

HA 1

Chart 1: L4+ and L5+ performance compared, 2013-2014

With the exception of maths, which has remained unchanged, there have been improvements across the board at L4+, of between two and four percentage points.

The same is true at L5+ and – in the case of reading, GPS and writing – the percentage point improvements are relatively larger. This is good news.

Chart 2 compares the gaps between disadvantaged learners (‘ever 6’ FSM plus children in care) and all other learners in state-funded schools on all five measures, for both 2013 and 2014.

.

HA 2

Chart 2: Disadvantaged gaps at L4+ and L5+ for all five measures, 2013 and 2014

.

With the sole exception of the composite measure in 2013, each L4+ gap is smaller than the corresponding gap at L5+, though the difference ranges from as little as one percentage point (the composite measure) to as large as 11 percentage points (reading).

Whereas the L4+ gap in reading is lower than for any other measure, the L5+ reading gap is now the biggest. This suggests there is a particular problem with L5+ reading.

The distance between L4+ and L5+ gaps has typically widened since 2013, except in the case of maths, where it has narrowed by one percentage point.

While three of the L4+ gaps have closed slightly (composite, reading, GPS) the remainder are unchanged. However, two of the L5+ gaps have increased (composite, writing) and only the maths gap has closed slightly.

This suggests that what limited progress there has been in closing disadvantaged gaps has focused more on L4+ than L5+.

The pupil premium is not bringing about a radical improvement – and its impact is relatively lower at higher attainment levels.

A similar pattern is discernible with FSM gaps, as Chart 3 reveals. The chart excludes the composite measure, which is not supplied in the SFR.

Overall the picture at L4+ is cautiously positive, with small downward trends on three of the four measures, but the picture at L5+ is more mixed since two of the measures are unchanged.

.

HA 3

Chart 3: FSM gaps at L4+ and L5+ compared, 2013 and 2014  

Composite measure

  • Although the proportion of learners achieving this benchmark is slightly higher in converter academies than in LA-maintained schools, the latter have improved faster since 2013. The success rate in sponsored academies is half that in converter academies. Free schools are improving but remain behind LA-maintained schools. 
  • Some 650 schools achieve 50% or higher, but another 470 record 0% (fewer than the 600 which did so in 2013). 
  • 67% of high attainers achieved this benchmark in 2014, up five percentage points on 2013, but one third still fall short, demonstrating that there is extensive underachievement amongst high attainers in the primary sector. This rather undermines HMCI’s observations in his Commentary on the 2014 Annual Report.
  • Although over 670 schools have a 100% success rate amongst their high attainers, 42 schools have recorded 0% (down from 54 in 2013). Several of these do better by their middle attainers. In 10 primary schools no high attainers achieve L4+ in reading, writing and maths combined.

.

Reading

  • The substantial improvement in L5+ reading performance since 2013 masks an as yet unexplained crash in Level 6 test performance. Only 874 learners in state-funded schools achieved L6 reading, compared with 2,137 in 2013. This is in marked contrast to a substantive increase in L6 test entries, the success rate on L6 teacher assessment and the trend in the other L6 tests. In 2013 around 12,700 schools had no pupils who achieved L6 reading, but this increased to some 13,670 schools in 2014. Even the performance of Chinese pupils (otherwise phenomenally successful on L6 tests) went backwards. 
  • The proportion of Chinese learners achieving L5 in reading has reached 65% (compared with 50% for White learners), having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012. 
  • 43 primary schools had a 100% success rate at Level 5 in the reading test, but 29 more registered 0%. 
  • Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so. However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013. 

GPS

  •  The proportion of Chinese learners achieving L5+ in the GPS test is now 75%, a seven percentage point improvement on 2013. Moreover, 15% achieved Level 6, up eight percentage points on 2013. (The comparable Level 5+ percentage for White learners is 50%). There are unmistakeable signs that Chinese ascendancy in maths is being replicated with GPS. 
  • Some 7,210 schools had no learners achieving L6 in the GPS test, compared with 10,200 in 2013. While 18 schools recorded a perfect 100% record at Level 5 and above, 33 had no learners at L5+. 

.

Maths

  • Chinese learners continue to make great strides. The percentage succeeding on the L6 test has climbed a further six percentage points and now stands at 35% (compared with 8% for White pupils). Chinese boys are at 39%. The proportion of Chinese learners achieving level 6 is now comparable to the proportions of other ethnic groups achieving level 5. This lends further credence to the notion that we have our own domestic equivalent of Shanghai’s PISA success – and perhaps to the suggestion that focusing on Shanghai’s classroom practice may bring only limited benefits.
  • While it is commendable that 3% of FSM and 4% of disadvantaged learners are successful in the L6 maths test, the gaps between them and other learners are increasing as the overall success rate grows. There are now seven percentage point gaps for FSM and disadvantaged alike. 
  • Ten schools managed a L6 success rate of 50% or higher, while some 280 were at 30% or higher. On the other hand, 3,200 schools had no L6 passes (down from 5,100 in 2013). 
  • About 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013, and two percentage points more than the proportion of successful middle attainers. But 27 schools posted a success rate of 50% or below.

.

Writing (TA)

  • Chinese pupils do not match their performance on the GPS test, though 6% achieve L6 in writing TA compared with just 2% of white pupils. 
  • Three schools managed a 50% success rate at Level 6 and 56 were at 25% or above. Only one school managed 100% at L5, but some 200 scored 0%. 
  • Some 93% of all pupils make the expected progress in writing between KS1 and KS2. This is true of 95% of high attainers – and 95% of middle attainers too.

 

Composite measure: reading, writing and maths

Table 2 shows the overall proportion of learners achieving L5 or above in all of reading, writing and maths in each year since 2012.

 

              2012   2013   2014
L5+ overall    20%    21%    24%
L5+ boys       17%    18%    20%
L5+ girls      23%    25%    27%

Table 2: Proportion of all learners achieving KS2 L5+ in reading, writing and maths, 2012-2014

The overall success rate has increased by three percentage points compared with 2013 and by four percentage points since 2012.

The percentage of learners achieving L4+ has also improved by four percentage points since 2012, so the improvement at L5+ is broadly commensurate.

Over this period, girls’ lead over boys has remained relatively stable at between six and seven percentage points.

The SFR reveals that success on this measure varies significantly between school type.

The percentages for LA-maintained schools (24%) and all academies and free schools (23%) are little different.

However mainstream converter academies stand at 26%, twice the 13% recorded by sponsored academies. Free schools are at 21%. These percentages have changed significantly compared with 2013.

.

HA 4

Chart 4:  Comparison of proportion of learners achieving L5+ in reading writing and maths in 2013 and 2014

.

Whereas free schools are making rapid progress and sponsored academies are also improving at a significant rate, converter academies are improving more slowly than LA-maintained schools.

The highest percentages on this measure in the Performance Tables are recorded by Fox Primary School in Kensington and Chelsea (86%) and Hampden Gurney CofE Primary School in Westminster (85%).

Altogether, some 650 schools have achieved success rates of 50% or higher, while 23 have managed 75% or higher.

At the other end of the spectrum about 470 schools have no learners at all who achieved this measure, fewer than the 600 recording this outcome in 2013.

Table 3 shows the gap between disadvantaged (ie ‘ever 6’ FSM and children in care) learners and others, as recorded in the Performance Tables.

          2012   2013   2014
Disadv       9     10     12
Other       24     26     29
Gap         15     16     17

Table 3: Proportion of disadvantaged learners achieving L5+ in reading, writing and maths, 2012-2014

.

Although the percentage of disadvantaged learners achieving this benchmark has improved somewhat, the percentage of other learners doing so has improved faster, meaning that the gap between disadvantaged and other learners is widening steadily.

This contrasts with the trend at L4+, where the Performance Tables show a gap that has narrowed from 19 percentage points in 2012 (80% versus 61%) to 18 points in 2013 (81% versus 63%) and now to 16 points in 2014 (83% versus 67%).
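The gap figures used throughout this post are simple differences, in percentage points, between the two groups’ success rates. A sketch in Python, with the data hard-coded from Table 3 and the L4+ figures just quoted:

```python
# L5+ composite measure (percentages, from Table 3)
disadvantaged_l5 = {2012: 9, 2013: 10, 2014: 12}
other_l5         = {2012: 24, 2013: 26, 2014: 29}

# Gap = other minus disadvantaged, in percentage points
gaps_l5 = {year: other_l5[year] - disadvantaged_l5[year] for year in other_l5}
print(gaps_l5)  # {2012: 15, 2013: 16, 2014: 17} - widening at L5+

# The corresponding L4+ gaps narrow over the same period
gaps_l4 = {2012: 80 - 61, 2013: 81 - 63, 2014: 83 - 67}
print(gaps_l4)  # {2012: 19, 2013: 18, 2014: 16}
```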

Chart 5 below illustrates this comparison.

.

HA 5

Chart 5: Comparing disadvantaged/other attainment gaps in KS2 reading, writing and maths combined at L4+ and L5+, 2012-2014.

While the L4+ gap has closed by three percentage points since 2012, the L5+ gap has widened by two percentage points. This suggests that disadvantaged learners amongst the top 25% by prior attainment are not benefiting commensurately from the pupil premium.

There are 97 primary schools where 50% or more disadvantaged learners achieve L5+ across reading, writing and maths (compared with 40 in 2013).

The highest performers record above 80% on this measure with their disadvantaged learners, albeit with cohorts of 6 to 8. Only one school with a more substantial cohort (of 34) manages over 70%. This is Tollgate Primary School in Newham.

The percentage of high attainers who achieved L5+ in 2014 was 67%, up five percentage points from 62% in 2013. (In 2012 the Performance Tables provided a breakdown for English and maths, which is not comparable).

Although this is a significant improvement, it means that one third of high attainers at KS1 still do not achieve this KS2 benchmark, suggesting that there is significant underachievement amongst this top quartile.

Thirteen percent of middle attainers also achieved this outcome, compared with 10% in 2013.

A significant number of schools – over 670 – do manage a 100% success rate amongst their high attainers, but there are also 42 schools where no high attainers achieve the benchmark (there were 54 in 2013). In several of them, more middle attainers than high attainers achieve the benchmark.

There are ten primary schools in which no high attainers achieve L4 in reading, writing and maths. Perhaps one should be thankful for the fact that no middle attainers in these schools achieve the benchmark either!

The KS2 average point score was 34.0 or higher in five schools, equivalent to a level 5A. The highest  APS was 34.7, recorded by Fox Primary School, with a cohort of 42 pupils.

Across all state-funded schools, the average value added measure for high attainers across reading, writing and maths is 99.8, the same as it was in 2013.

The comparable averages for middle attainers and low attainers are 100.0 and 100.2 respectively, showing that high attainers benefit slightly less from their primary education.

The highest value-added recorded for high attainers is 104.7 by Tudor Court Primary School in Thurrock, while the lowest is 93.7 at Sacriston Junior School in Durham (now closed).

Three more schools are below 95.0 and some 250 are at 97.5 or lower.

.

Reading Test

Table 4 shows the percentage of all learners, boys and girls achieving L5+ in reading since 2010. There has been a five percentage point increase (rounded) in the overall result since 2013, which restores performance to the level it had reached in 2010.

A seven percentage point gap in favour of girls remains unchanged from 2013. This is four points less than the comparable gender gap in 2010.

.

              2010   2011   2012   2013   2014
L5+ overall     50     43     48     44     50
Boys            45     37     43     41     46
Girls           56     48     53     48     53

Table 4: Percentage of learners achieving L5+ in reading since 2010

.

As reported in my September 2014 post ‘What Happened to the Level 6 Reading Results?’ L6 performance in reading has collapsed in 2014.

The figures have improved slightly since the provisional results were released, but the collapse is still marked.

Table 5 shows the numbers successful since 2012.

The number of successful learners in 2014 is less than half the number successful in 2013 and almost back to the level in 2012 when the test was first introduced.

This is despite the fact that the number of entries for the level 6 test – 95,000 – was almost exactly twice the 47,000 recorded in 2012 and significantly higher than the 70,000 entries in 2013.

For comparison, the number of pupils awarded level 6 in reading via teacher assessment was 15,864 in 2013 and 17,593 in 2014.

We still have no explanation for this major decline which is entirely out of kilter with other L6 test outcomes.
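Setting the pass figures against the entry figures makes the scale of the collapse clearer. A rough calculation (entry and pass numbers as quoted above; pass numbers include some independent-school pupils):

```python
# L6 reading test: entries versus successful pupils
entries = {2012: 47_000, 2013: 70_000, 2014: 95_000}
passes  = {2012: 900,    2013: 2_262, 2014: 935}

for year in sorted(entries):
    rate = 100 * passes[year] / entries[year]
    print(f"{year}: {rate:.1f}% of L6 reading entries were successful")
# Roughly 1.9% in 2012 and 3.2% in 2013, but only about 1.0% in 2014:
# the success rate per entry fell to around a third of its 2013 level.
```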

.

           2012          2013          2014
         %     No      %     No      %     No
L6+      0    900      0  2,262      0    935
Boys     0    200      0    592      0    263
Girls    0    700      1  1,670      0    672

Table 5: Number and percentage of learners achieving L6 on the KS2 reading test 2012-2014

.

These figures include some pupils attending independent schools, but another table in the SFR reveals that 874 learners in state-funded primary schools achieved L6 (compared with 2,137 in 2013). Of these, all but 49 achieved L3+ in their KS1 reading assessment.

But some 13,700 of those with L3+ reading at the end of KS1 progressed to L4 or lower at the end of KS2.

The SFR does not supply numbers of learners with different characteristics achieving L6 and all percentages are negligible. The only group recording a positive percentage is Chinese learners, at 1%.

In 2013, Chinese learners were at 2% and some other minority ethnic groups recorded 1%, so not even the Chinese have been able to withstand the collapse in the L6 success rate.

According to the SFR, the FSM gap at L5 is 21 percentage points (32% versus 53% for all other pupils). The disadvantaged gap is also 21 percentage points (35% versus 56% for all other pupils).

Chart 6 shows how these percentages have changed since 2012.

.

HA 6

Chart 6: FSM and disadvantaged gaps for KS2 reading test at L5+, 2012-2014

FSM performance has improved by five percentage points compared with 2013, while disadvantaged performance has grown by six percentage points.

However, gaps remain unchanged for FSM and have increased by one percentage point for disadvantaged learners. There is no discernible or consistent closing of gaps in KS2 reading at L5.

These gaps of 21 percentage points, for both FSM and disadvantaged learners, are significantly larger than the comparable gaps at L4+ of 12 (FSM) and 10 (disadvantaged) percentage points.

The analysis of level 5 performance in the SFR reveals that the proportion of Chinese learners achieving level 5 has reached 65%, having increased by seven percentage points since 2013 and overtaken the 61% recorded in 2012.

Turning to the Performance Tables, we can see that, in relation to L6:

  • The highest recorded percentage achieving L6 is 17%, at Dent CofE Voluntary Aided Primary School in Cumbria. Thirteen schools recorded a L6 success rate of 10% or higher. (The top school in 2013 recorded 19%).
  • In 2013 around 12,700 schools had no pupils who achieved L6 reading, whereas in 2014 this had increased to some 13,670 schools.

In relation to L5:

  • 43 schools achieved a 100% record in L5 reading (compared with only 18 in 2013). All but one of these recorded 0% at L6, which may suggest that they were concentrating on maximising L5 achievement rather than risking L6 entry.
  • Conversely, there are 29 primary schools where no learners achieved L5 reading.

Some 92% of high attainers made at least the expected progress in reading, fewer than the 94% of middle attainers who did so.  However, this was a three percentage point improvement on the 89% who made the requisite progress in 2013.

And 41 schools recorded a success rate of 50% or lower on this measure, most of them comfortably exceeding this with their low and middle attainers alike.

.

GPS Test

Since the grammar, punctuation and spelling test was first introduced in 2013, there is only a two-year run of data. Tables 6 and 7 below show performance at L5+ and L6+ respectively.

.

              2013 %   2014 %
L5+ overall       48       52
Boys              42       46
Girls             54       58

Table 6: Percentage of learners achieving L5+ in GPS, 2013 and 2014

           2013            2014
         %     No       %      No
L6+      2  8,606       4  21,111
Boys     1  3,233       3   8,321
Girls    2  5,373       5  12,790

Table 7: Number and percentage of learners achieving L6 in GPS, 2013 and 2014

.

Table 6 shows an overall increase of four percentage points in 2014 and the maintenance of a 12 percentage point gap in favour of girls.

Table 7 shows a very healthy improvement in L6 performance, which only serves to emphasise the parallel collapse in L6 reading. Boys have caught up a little on girls but the latter’s advantage remains significant.

The SFR shows that 75% of Chinese learners achieve L5 and above, up seven percentage points from 68% in 2013. Moreover, the proportion achieving L6 has increased by eight percentage points, to 15%. There are all the signs that Chinese eminence in maths is repeating itself with GPS.

Chart 7 shows how the FSM gap and disadvantaged gap has changed at L5+ for GPS. The disadvantaged gap has remained stable at 19 percentage points, while the FSM gap has narrowed by one percentage point.

These gaps are somewhat larger than those at L4 and above, which stand at 17 percentage points for FSM and 15 percentage points for disadvantaged learners.

.

HA 7

Chart 7:  FSM and disadvantaged gaps for KS2 GPS test at L5+, 2013 and 2014

.

The Performance Tables show that, in relation to L6:

  • The school with the highest percentage achieving level 6 GPS is Fulwood, St Peter’s CofE Primary School in Lancashire, which records a 47% success rate. Some 89 schools achieve a success rate of 25% or higher.
  • In 2014 there were some 7,210 schools that recorded no L6 performers at all, but this compares favourably with 10,200 in 2013. This significant reduction is in marked contrast to the increase in schools with no L6 readers.

Turning to L5:

  • 18 schools recorded a perfect 100% record for L5 GPS. These schools recorded L6 success rates that vary between 0% and 25%.
  • There are 33 primary schools where no learners achieved L5 GPS.

.

Maths test

Table 8 below provides the percentages of learners achieving L5+ in the KS2 maths test since 2010.

Over the five year period, the success rate has improved by eight percentage points, but the improvement in 2014 is less pronounced than it has been over the last few years.

The four percentage point lead that boys have over girls has changed little since 2010, apart from a temporary increase to six percentage points in 2012.

.

              2010   2011   2012   2013   2014
L5+ overall     34     35     39     41     42
Boys            36     37     42     43     44
Girls           32     33     36     39     40

Table 8: Percentage of learners achieving L5+ in KS2 maths test, 2010-2014

.

Table 9 shows the change in achievement in the L6 test since 2012. This includes pupils attending independent schools – another table in the SFR indicates that the total number of successful learners in 2014 in state-funded schools is 47,349, meaning that almost 95% of those achieving L6 maths are located in the state-funded sector.

There has been a healthy improvement since 2013, with almost 15,000 more successful learners – an increase of over 40%. Almost one in ten of the end of KS2 cohort now succeeds at L6. This places the reversal in L6 reading into even sharper relief.

The ratio between boys and girls has remained broadly unchanged, so boys continue to account for over 60% of successful learners.

.

           2012            2013            2014
         %      No       %      No       %      No
L6+      3  19,000       7  35,137       9  50,001
Boys     –  12,400       8  21,388      11  30,173
Girls    –   6,600       5  13,749       7  19,828

Table 9: Number and percentage of learners achieving L6 in KS2 maths test 2012-2014

.

The SFR shows that, of those achieving L6 in state-funded schools, some 78% had achieved L3 or above at KS1. However, some 9% of those with KS1 L3 – something approaching 10,000 pupils – progressed only to L4, or lower.

The breakdown for minority ethnic groups shows that the Chinese ascendancy continues. This is illustrated by Chart 8 below.

HA 8

Chart 8: KS2 L6 maths test performance by ethnic background, 2012-2014

In 2014, the percentage of Chinese achieving L5+ has increased by a respectable three percentage points to 74%, but the L6 figure has climbed by a further six percentage points to 35%. More than one third of Chinese learners now achieve L6 on the maths test.

This means that the proportion of Chinese pupils achieving L6 is now broadly similar to the proportion of other minorities achieving Level 5 (34% of white pupils for example).

They are fifteen percentage points ahead of the next best outcome – 20% recorded by Indian learners. White learners stand at 8%.

There is an eight percentage point gap between Chinese boys (39%) and Chinese girls (31%). The gap for white boys and girls is much lower, but this is a consequence of the significantly lower percentages.

Given that Chinese pupils are capable of achieving such extraordinary results under the present system, these outcomes raise significant questions about the balance between school and family effects and whether efforts to emulate Chinese approaches to maths teaching are focused on the wrong target.

Success rates in the L6 maths test are high enough to produce percentages for FSM and disadvantaged learners. The FSM and disadvantaged gaps both stand at seven percentage points, whereas they were at 5 percentage points (FSM) and 6 percentage points (disadvantaged) in 2013. The performance of disadvantaged learners has improved, but not as fast as that of other learners.

Chart 9 shows how these gaps have changed since 2012.

While the L6 gaps are steadily increasing, the L5+ gaps have remained broadly stable at 20 percentage points (FSM) and 21 percentage points (disadvantaged). There has been a small one percentage point improvement in the gap for disadvantaged learners in 2014, matching the similar small improvement for L4+.

The gaps at L5+ remain significantly larger than those at L4+ (13 percentage points for FSM and 11 percentage points for disadvantaged).

HA 9

Chart 9: FSM and disadvantaged gaps, KS2 L5+ and L6 maths test, 2012 to 2014

.

The Performance Tables reveal that:

  • The school with the highest recorded percentage of L6 learners is Fox Primary School (see above) at 64%, some seven percentage points higher than its nearest rival. Ten schools achieve a success rate of 50% or higher (compared with only three in 2013), 56 at 40% or higher and 278 at 30% or higher.
  • However, over 3,200 schools record no L6 passes. This is a significant improvement on the 5,100 in this category in 2013, but the number is still far too high.
  • Nine schools record a 100% success rate for L5+ maths. This is fewer than the 17 that managed this feat in 2013.

Some 94% of high attainers made the expected progress in maths, a one percentage point improvement on 2013. This is two percentage points more than did so in reading in 2014 – and two percentage points more than the proportion of middle attainers managing this.

However, 27 schools had a success rate of 50% or below, the vast majority of them comfortably exceeding this with their middle attainers – and often their low attainers too.

.

Writing Teacher Assessment

Table 10 shows how the percentage achieving L5+ through the teacher assessment of writing has changed since 2012.

There has been a healthy five percentage point improvement overall, and an improvement of three percentage points since last year, stronger than the comparable improvement at L4+. The large gender gap of 15 percentage points in favour of girls is also unchanged since 2013.

.

              2012   2013   2014
L5+ overall     28     30     33
Boys            22     23     26
Girls           35     38     41

Table 10: Percentage achieving level 5+ in KS2 writing TA 2012-2014

.

Just 2% of learners nationally achieve L6 in writing TA – 11,340 pupils (10,654 of them located in state-funded schools).

However, this is a very significant improvement on the 2,861 recording this outcome in 2013. Just 3,928 of the total are boys.

Chinese ascendancy at L6 is not so significant. The Chinese success rate stands at 6%. However, if the comparator is performance at L5+, Chinese learners record 52%, compared with 33% for both White and Asian learners.

The chart below shows how FSM and disadvantaged gaps have changed at L5+ since 2012.

This indicates that the FSM gap, having widened by two percentage points in 2013, has narrowed by a single percentage point in 2014, so it remains higher than it was in 2012. Meanwhile the disadvantaged gap has widened by one percentage point since 2013.

The comparable 2014 gaps at L4+ are 15 percentage points (FSM) and 13 percentage points (disadvantaged), so the gaps at L5+ are significantly larger.

.

HA 10

Chart 10: FSM and disadvantaged gaps, L5+ Writing TA, 2012-2014

.

The Performance Tables show that:

  • Three schools record a L6 success rate of 50% and only 56 are at 25% or higher.
  • At the other end of the spectrum, the number of schools with no L6s is some 9,780, about a thousand fewer than in 2013.
  • At L5+ only one school has a 100% success rate (there were four in 2013). Conversely, about 200 schools record 0% on this measure.

Some 93% of all pupils make the expected progress in writing between KS1 and KS2 and this is true of 95% of high attainers – the same percentage of middle attainers is also successful.

Conclusion

Taken together, this evidence presents a far more nuanced picture of high attainment and high attainers’ performance in the primary sector than suggested by HMCI’s Commentary on his 2014 Annual Report:

‘The proportion of pupils at Key Stage 2 attaining a Level 5 or above in reading, writing and mathematics increased from 21% in 2013 to 24% in 2014.

Attainment at Level 6 has also risen. In mathematics, the proportion of pupils achieving Level 6 rose from 3% in 2012 to 9% in 2014. The proportion achieving Level 6 in grammar, punctuation and spelling rose by two percentage points in the last year to 4%.

These improvements suggest that primary schools are getting better at identifying the brightest children and developing their potential.’

There are four particular areas of concern:

  • Underachievement amongst high attainers is prevalent in far too many primary schools. Although there has been some improvement since 2013, the fact that only 67% of those with high prior attainment at KS1 achieve L5 in reading, writing and maths combined is particularly worrying.
  • FSM and disadvantaged achievement gaps at L5+ remain significantly larger than those at L4+ – and there has been even less progress in closing them. The pupil premium ought to be having a significantly stronger impact on these excellence gaps.
  • The collapse of L6 reading test results is all the more stark when compared with the markedly improved success rates in GPS and maths which HMCI notes. We still have no explanation of the cause.
  • The success rates of Chinese pupils on L6 tests remain conspicuous and in maths are frankly extraordinary. This evidence of a ‘domestic Shanghai effect’ should be causing us to question why other groups are so far behind them – and whether we need to look beyond Shanghai classrooms when considering how best to improve standards in primary maths.

.

GP

December 2014

How Well Do Grammar Schools Perform With Disadvantaged Students?

This supplement to my previous post on The Politics of Selection compares the performance of disadvantaged learners in different grammar schools.

It adds a further dimension to the evidence base set out in my earlier post, intended to inform debate about the potential value of grammar schools as engines of social mobility.

The commentary is based on the spreadsheet embedded below, which relies entirely on data drawn from the 2013 Secondary School Performance Tables.

.

.

If you find any transcription errors please alert me and I will correct them.

.

Preliminary Notes

The 2013 Performance Tables define disadvantaged learners as those eligible for free school meals in the last six years and children in care. Hence both these categories are caught by the figures in my spreadsheet.

Because the number of disadvantaged pupils attending grammar schools is typically very low, I have used the three year average figures contained in the ‘Closing the Gap’ section of the Tables.

These are therefore the number of disadvantaged students in each school’s end of KS4 cohort for 2011, 2012 and 2013 combined. They should illustrate the impact of pupil premium support and wider closing the gap strategies on grammar schools since the Coalition government came to power.

Even when using three year averages the data is frustratingly incomplete, since 13 of the 163 grammar schools have so few disadvantaged students – fewer than six across all three cohorts combined – that the results are suppressed. We have no information at all about how well or how badly these schools are performing in terms of closing gaps.
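The suppression rule can be sketched as below; the record layout and names are hypothetical, used purely for illustration, and are not the actual Performance Tables schema.

```python
# Illustrative sketch of the suppression rule described above: results are
# withheld where a school's combined three-year disadvantaged cohort is
# below six pupils. The dict layout here is hypothetical.
SUPPRESSION_THRESHOLD = 6

def visible_results(schools):
    """Keep only schools whose cohort is large enough to be published."""
    return [s for s in schools if s["cohort"] >= SUPPRESSION_THRESHOLD]

sample = [
    {"name": "School A", "cohort": 2},   # suppressed
    {"name": "School B", "cohort": 22},  # published
]
print([s["name"] for s in visible_results(sample)])  # ['School B']
```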

My analysis uses each of the three performance measures within this section of the Performance Tables:

  • The percentage of pupils at the end of KS4 achieving five or more GCSEs (or equivalents) at grades A*-C, including GCSEs in English and maths. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in English. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in maths.

In each case I have recorded the percentage of disadvantaged learners who achieve the measure and the percentage point gap between that and the corresponding figure for ‘other’ – ie non-disadvantaged – students.
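As a minimal sketch of that calculation (the function name is my own, not drawn from the Tables):

```python
# Sketch of the gap calculation used throughout this analysis. By the post's
# convention, a 'negative gap' means 'other' (non-disadvantaged) pupils
# outperform disadvantaged ones; this returns that gap as a positive number.
def percentage_point_gap(disadvantaged_pct, other_pct):
    """Return the percentage point gap, rounded to one decimal place."""
    return round(other_pct - disadvantaged_pct, 1)

# Using the all-schools 5+ GCSE figures quoted later in the post:
print(percentage_point_gap(38.7, 66.3))  # 27.6
```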

For comparison I have also included the corresponding percentages for all disadvantaged pupils in all state-funded schools and for all high attainers in state-funded schools. The latter is for 2013 only rather than a three-year average.

Unfortunately the Tables do not provide data for high attaining disadvantaged students. The vast majority of disadvantaged students attending grammar schools will be high-attaining according to the definition used in the Tables (average points score of 30 or higher across KS2 English, maths and science).
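That definition can be expressed as a simple check; the point scores below are invented examples, not real pupil data.

```python
# The Tables' high-attainer definition described above: an average points
# score of 30 or higher across KS2 English, maths and science.
def is_high_attainer(english_pts, maths_pts, science_pts):
    return (english_pts + maths_pts + science_pts) / 3 >= 30

print(is_high_attainer(33, 33, 27))  # True  (average = 31)
print(is_high_attainer(27, 27, 27))  # False (average = 27)
```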

But, as my previous post showed, some grammar schools record 70% or fewer high attainers, disadvantaged or otherwise. These include: Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent).

Some of these schools feature in the analysis below, while some do not, suggesting that the correlation between selectivity and the performance of disadvantaged students is not straightforward.

.

Number of disadvantaged learners in each school

The following schools are those with suppressed results, placed in order according to the number of disadvantaged learners within scope, from lowest to highest:

  • Tonbridge Grammar School, Kent (2)
  • Bishop Wordsworth’s Grammar School, Wiltshire (3)
  • Caistor Grammar School, Lincolnshire (3)
  • Sir William Borlase’s Grammar School, Buckinghamshire (3)
  • Adams’ Grammar School, Telford and Wrekin (4)
  • Chelmsford County High School for Girls, Essex (4)
  • Dr Challoner’s High School, Buckinghamshire (4)
  • King Edward VI School, Warwickshire (4)
  • Alcester Grammar School, Warwickshire (5)
  • Beaconsfield High School, Buckinghamshire (5)
  • King Edward VI Grammar School, Chelmsford, Essex (5)
  • Reading School, Reading (5)
  • St Bernard’s Catholic Grammar School, Slough (5).

Some of these schools feature among those with the lowest proportions of ‘ever 6 FSM’ pupils on roll, as shown in the spreadsheet accompanying my previous post, but some do not.

The remaining 150 schools each record a combined cohort of between six and 96 students, with an average of 22.

A further 19 schools have a combined cohort of 10 or fewer, meaning that 32 grammar schools in all (20% of the total) are in this category.

At the other end of the distribution, only 16 schools (10% of all grammar schools) have a combined cohort of 40 or more disadvantaged students – and only four have one of 50 or more.

These are:

  • Handsworth Grammar School, Birmingham (96)
  • Stretford Grammar School, Trafford (76)
  • Dane Court Grammar School, Kent (57)
  • Slough Grammar School (Upton Court) (50).

Because the ratio of disadvantaged to other pupils in the large majority of grammar schools is so marked, the results below must be treated with a significant degree of caution.

Outcomes based on such small numbers may well be misleading, but they are all we have.

Arguably, grammar schools should find it relatively easy to achieve success with a very small cohort of students eligible for the pupil premium – since fewer require separate monitoring and, potentially, additional support.

On the other hand, the comparative rarity of disadvantaged students may mean that some grammar schools have too little experience of addressing such needs, or believe that closing gaps is simply not an issue for them.

Then again, it is perhaps more likely that grammar schools will fall short of 100% success with their much larger proportions of ‘other’ students, simply because the probability of special circumstances arising is relatively higher. One might expect therefore to see ‘positive gaps’ with success rates for disadvantaged students slightly higher than those for their relatively more advantaged peers.

Ideally though, grammar schools should be aiming for a perfect 100% success rate for all students on these three measures, regardless of whether they are advantaged or disadvantaged. None is particularly challenging, for high attainers in particular – and most of these schools have been rated as outstanding by Ofsted.

.

Five or more GCSE A*-C grades or equivalent including GCSEs in English and maths

In all state-funded schools, the percentage of disadvantaged students achieving this measure across the three year period is 38.7% while the percentage of other students doing so is 66.3%, giving a gap of 27.6 percentage points.

In 2013, 94.7% of all high attainers in state-funded secondary schools achieved this measure.

No grammar school falls below the 38.7% benchmark for its disadvantaged learners. The nearest to it is Pate’s Grammar School, at 43%. But these results were affected by the School’s decision to sit English examinations which were not recognised for Performance Table purposes.

The next lowest percentages are returned by:

  • Spalding Grammar School, Lincolnshire (59%)
  • Simon Langton Grammar School for Boys, Kent (65%)
  • Stratford Grammar School for Girls, Warwickshire (71%)
  • The Boston Grammar School, Lincolnshire (74%)

These were the only four schools below 75%.

Table 1 below illustrates these percentages and the percentage point gap for each of these four schools.

.

Table 1

Table 1: 5+ GCSEs at A*-C or equivalent including GCSEs in English and maths: Lowest performing and largest gaps

.

A total of 46 grammar schools (31% of the 150 without suppressed results) fall below the 2013 figure for high attainers across all state-funded schools.

On the other hand, 75 grammar schools (exactly 50%) achieve 100% on this measure, for combined student cohorts ranging in size from six to 49.

Twenty-six of the 28 schools that had no gap between the performance of their advantaged and disadvantaged students were amongst those scoring 100%. (The other two were at 97% and 95% respectively.)

The remaining 49 with a 100% record amongst their disadvantaged students demonstrate a ‘positive gap’, in that the disadvantaged do better than the advantaged.

The biggest positive gap is seven percentage points, recorded by Clarendon House Grammar School in Kent and Queen Elizabeth’s Grammar School in Alford, Lincolnshire.

Naturally enough, schools recording relatively lower success rates amongst their disadvantaged students also tend to demonstrate a negative gap, where the advantaged do better than the disadvantaged.

Three schools had an achievement gap higher than the 27.6 percentage point national average. They were:

  • Simon Langton Grammar School for Boys (30 percentage points)
  • Spalding Grammar School (28 percentage points)
  • Stratford Grammar School for Girls (28 percentage points)

So three of the four with the lowest success rates for disadvantaged learners demonstrated the biggest gaps. Twelve more schools had double-digit achievement gaps of 10 percentage points or higher.

These 15 schools – 10% of the total for which we have data – have a significant issue to address, regardless of the size of their disadvantaged populations.

One noticeable oddity at this end of the table is King Edward VI Camp Hill School for Boys in Birmingham, which returns a positive gap of 14 percentage points (rounded), with 80% for disadvantaged students and 67% for advantaged. On this measure at least, it is doing relatively badly with its disadvantaged students, but considerably worse with those from advantaged backgrounds!

However, this idiosyncratic pattern is also likely to be attributable to the School using some examinations not eligible for inclusion in the Tables.

.

At least expected progress in English

Across all state-funded schools, the percentage of disadvantaged students making at least three levels of progress in English is 55.5%, compared with 75.1% of ‘other’ students, giving a gap of 19.6 percentage points.

In 2013, 86.2% of high attainers achieved this benchmark.

If we again discount Pate’s from consideration, the lowest performing school on this measure is The Boston Grammar School which is at 53%, lower than the national average figure.

A further 43 schools (29% of those for which we have data) are below the 2013 average for all high attainers. Six more of these fall below 70%:

  • The Skegness Grammar School, Lincolnshire (62%)
  • Queen Elizabeth Grammar School, Cumbria (62%)
  • Plymouth High School for Girls (64%)
  • Spalding Grammar School, Lincolnshire (65%)
  • Devonport High School for Boys, Plymouth (65%)
  • Simon Langton Grammar School for Boys, Kent (67%)

Table 2 below illustrates these outcomes, together with the attainment gaps recorded by these schools and others with particularly large gaps.

.

Table 2

Table 2: At least expected progress in English from KS2 to KS4: Lowest performing and largest gaps

.

At the other end of the table, 44 grammar schools achieve 100% on this measure (29% of those for which we have data). This is significantly fewer than achieved perfection on the five or more GCSEs benchmark.

When it comes to closing the gap, only 16 of the 44 achieve a perfect 100% score with both advantaged and disadvantaged students, again much lower than on the attainment measure above.

The largest positive gaps (where disadvantaged students outscore their advantaged classmates) are at The King Edward VI Grammar School, Louth, Lincolnshire (11 percentage points) and John Hampden Grammar School, Buckinghamshire (10 percentage points).

Amongst the schools propping up the table on this measure, six record negative gaps of 20 percentage points or higher, so exceeding the average gap in state-funded secondary schools:

  • The Skegness Grammar School (30 percentage points)
  • Queen Elizabeth Grammar School Cumbria (28 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)
  • Plymouth High School for Girls (25 percentage points)
  • Devonport High School for Boys, Plymouth (23 percentage points)
  • Loreto Grammar School, Trafford (20 percentage points).

There is again a strong correlation between low disadvantaged performance and large gaps, although the relationship does not apply in all cases.

Another 23 grammar schools have a negative gap of 10 percentage points or higher.

There is again a curious pattern at King Edward VI Camp Hill in Birmingham, whose disadvantaged students come in at 75% on this measure, outscoring the advantaged, at 65%, by ten percentage points. As noted above, there may well be extenuating circumstances.

.

At least expected progress in maths

The percentage of disadvantaged students making at least three levels of progress in maths across all state-funded schools is 50.7%, compared with a figure for ‘other’ students of 74.1%, giving a gap of 23.4 percentage points.

In 2013, 87.8% of high attainers achieved this.

On this occasion Pate’s is unaffected (in fact it scores 100%), as is King Edward VI Camp Hill School for Boys (in its case for advantaged and disadvantaged alike).

No school comes in below the national average for disadvantaged students; in fact all comfortably exceed it. However, the lowest performers are still a long way behind some of their fellow grammar schools.

The worst performing grammar schools on this measure are:

  • Spalding Grammar School, Lincolnshire (59%)
  • Queen Elizabeth Grammar School Cumbria (62%)
  • Simon Langton Grammar School for Boys, Kent (63%)
  • Dover Grammar School for Boys, Kent (67%)
  • The Boston Grammar School, Lincolnshire (68%)
  • Borden Grammar School, Kent (68%)

These are very similar to the corresponding rates for the lowest performers in English.

Table 3 illustrates these outcomes, together with other schools demonstrating very large gaps between advantaged and disadvantaged students.

.

Table 3

Table 3: At least expected progress in maths from KS2 to KS4: Lowest performing and largest gaps

A total of 32 schools (21% of those for which we have data) undershoot the 2013 average for high attainers, a slightly better outcome than for English.

At the other extreme, there are 54 schools (36% of those for which we have data) that score 100% on this measure, slightly more than do so on the comparable measure for English, but still significantly fewer than achieve this on the 5+ GCSE measure.

Seventeen of the 54 also achieve a perfect 100% for advantaged students.

The largest positive gaps recorded are 11 percentage points at The Harvey Grammar School in Kent (which achieved 94% for disadvantaged students) and seven percentage points at Queen Elizabeth’s Grammar School, Alford, Lincolnshire (91% for disadvantaged students).

The largest negative gaps on this measure are just as substantial as those relating to English. Four schools perform significantly worse than the average gap of 23.4 percentage points:

  • Spalding Grammar School, Lincolnshire (32 percentage points)
  • Queen Elizabeth Grammar School, Cumbria (31 percentage points)
  • Simon Langton Grammar School for Boys, Kent (31 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)

Queen Elizabeth’s and Stratford Girls’ appeared in the same list for English. Stratford Girls’ appeared in the same list for the 5+ GCSE measure.

A further 20 schools have a double-digit negative gap of 10 percentage points or higher, very similar to the outcome in English.

.

Comparison across the three measures

As will be evident from the tables and lists above, some grammar schools perform consistently poorly on all three measures.

Others perform consistently well, while a third group have ‘spiky profiles’.

The number of schools that achieve 100% on all three measures with their disadvantaged students is 25 (17% of those for which we have data).

Eight of these are located in London; none is located in Birmingham. Just two are in Buckinghamshire and there is one each in Gloucestershire, Kent and Lincolnshire.

Only six schools achieve 100% on all three measures with advantaged and disadvantaged students alike. They are:

  • Queen Elizabeth’s, Barnet
  • Colyton Grammar School, Devon
  • Nonsuch High School for Girls, Sutton
  • St Olave’s and St Saviour’s Grammar School, Bromley
  • Tiffin Girls’ School, Kingston
  • Kendrick School, Reading

Five schools recorded comparatively low performance across all three measures (ie below 80% on each):

  • Spalding Grammar School, Lincolnshire
  • Simon Langton Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • St Joseph’s College, Stoke on Trent

Their overall performance is illustrated in Table 4.

.

Table 4

Table 4: Schools where 80% or fewer disadvantaged learners achieved each measure

.

This small group of schools is a major cause for concern.

A total of 16 schools (11% of those for which we have data) score 90% or less on all three measures and they, too, are potentially concerning.

Schools which record negative gaps of 10 percentage points or more on all three measures are:

  • Simon Langton Grammar School for Boys, Kent
  • Dover Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • Wilmington Grammar School for Boys, Kent
  • St Joseph’s College, Stoke-on-Trent
  • Queen Elizabeth’s Grammar School, Horncastle, Lincolnshire

Table 5 records these outcomes.

.

Table 5

Table 5: Schools with gaps of 10 percentage points or higher on all three measures

.

Of these, Boston and Stratford have gaps of 20 percentage points or higher on all three measures.

A total of 32 grammar schools (21% of those for which we have data) record 80% or lower on at least one of the three measures.

.

Selective University Destinations

I had also wanted to include in the analysis some data on progression to selective (Russell Group) universities, drawn from the experimental destination statistics.

Unfortunately, the results for FSM students are suppressed for the vast majority of schools, making comparison impossible. According to the underlying data for 2011/12, all I can establish with any certainty is that:

  • In 29 grammar schools, there were no FSM students in the cohort.
  • Five schools returned 0%, meaning that no FSM students successfully progressed to a Russell Group university. These were Wycombe High School, Wallington High School for Girls, The Crossley Heath School in Calderdale, St Anselm’s College on the Wirral and Bacup and Rawtenstall Grammar School.
  • Three schools were relatively successful – King Edward VI Five Ways in Birmingham reported 58% of FSM students progressing, while King Edward VI Handsworth reported 53% and the Latymer School achieved an impressive 75%.
  • All remaining grammar schools – some 127 in that year – are reported as ‘x’, meaning that there were either one or two students in the cohort, so the percentages are suppressed.
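The reporting rules described in these bullets could be rendered roughly as follows; the function and labels are my own, not the official presentation.

```python
# Hypothetical rendering of the destination-data reporting rules above:
# cohorts of one or two pupils are shown as 'x'; an empty cohort has no
# percentage to report at all.
def destination_cell(cohort_size, pct_progressing=None):
    if cohort_size == 0:
        return "no FSM students"
    if cohort_size <= 2:
        return "x"  # percentage suppressed
    return f"{pct_progressing}%"

print(destination_cell(2))       # x
print(destination_cell(12, 58))  # 58%
```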

We can infer from this that, at least in 2011/12, very few grammar schools indeed were specialising in providing an effective route to Russell Group universities for FSM students.

.

Conclusion

Even allowing for the unreliability of statistics based on very small cohorts, this analysis is robust enough to show that the performance of grammar schools in supporting disadvantaged students is extremely disparate.

While there is a relatively large group of consistently high performers, roughly one in five grammar schools is a cause for concern on at least one of the three measures. Approximately one in ten is performing no more than satisfactorily across all three. 

The analysis hints at the possibility that the biggest problems tend to be located in rural and coastal areas rather than in London and other urban centres, but this pattern is not always consistent. The majority of the poorest performers seem to be located in wholly selective authorities but, again, this is not always the case.

A handful of grammar schools are recording significant negative gaps between the performance of disadvantaged students and their peers. This is troubling. There is no obvious correlation between the size of the disadvantaged cohort and the level of underperformance.

There may be extenuating circumstances in some cases, but there is no public national record of what these are – an argument for greater transparency across the board.

One hopes that the grammar schools that are struggling in this respect are also those at the forefront of the reform programme described in my previous post – and that they are improving rapidly.

One hopes, too, that those whose business it is to ensure that schools make effective use of the pupil premium are monitoring these institutions closely. Some of the evidence highlighted above would not, in my view, be consistent with an outstanding Ofsted inspection outcome.

If the same pattern is evident when the 2014 Performance Tables are published in January 2015, there will be serious cause for concern.

As for the question of whether grammar schools are currently meeting the needs of their – typically few – disadvantaged students, the answer is ‘some are; some aren’t’. This argues for intervention in inverse proportion to success.

.

GP

December 2014