Protecting pupil premium for high attainers

.

This post continues the campaign I have been waging against the Fair Education Alliance, a Teach First-inspired ‘coalition for change in education’, over a proposal in its Report Card 2014 to halve the pupil premium for disadvantaged learners with high prior attainment.

I am:

  • Inviting Fair Education Alliance members (and Read On. Get On. partners) to defend the proposal or else distance themselves from it and
  • Calling on both campaigns to withdraw it.

.

Background

The Fair Education Alliance was launched by Teach First in June 2014. It aims to:

‘…significantly narrow the achievement gap between young people from our poorest communities and their wealthier peers by 2022’.

There are 27 members in all (see below).

The Alliance plans to monitor progress against five Fair Education Impact Goals through an annual Report Card.

The first Report Card, published in December 2014, explains that the Alliance was formed:

‘…in response to the growing demand for a national debate on why thousands of children do not get a fair education’.

The Impact Goals are described thus:

  • ‘Narrow the gap in literacy and numeracy at primary school

The Fair Education Alliance is committed to closing the attainment gap between primary schools serving lower income pupils and those educating higher income pupils. Our goal is for this gap to be narrowed by 90% by 2022.

  • Narrow the gap in GCSE attainment at secondary school

The Fair Education Alliance is committed to closing the attainment gap between secondary schools serving lower income pupils and those educating higher income pupils. Our goal is to close 44% of this gap by 2022.

  • Ensure young people develop key strengths, including resilience and wellbeing, to support high aspirations

The Fair Education Alliance is committed to ensuring young people develop non-cognitive skills, including the positive wellbeing and resilience they need to succeed in life. The Alliance will be working with other organisations to develop measurement tools which will allow the development of these key skills to be captured.

  • Narrow the gap in the proportion of young people taking part in further education or employment-based training after finishing their GCSEs.

The Fair Education Alliance wants to see an increase in the number of young people from low-income communities who stay in further education or employment-based training once they have completed Key Stage 4. Our goal is for 90% of young people from schools serving low income communities to be in post-16 education or employment-based training by 2022.

  • Narrow the gap in university graduation, including from the 25% most selective universities

The Fair Education Alliance is committed to closing the graduation gap between young people from low income backgrounds and those from high income backgrounds. Our goal is for at least 5,000 more pupils from low income backgrounds to graduate each year, with 1,600 of these young people graduating from the most selective universities.’

The problematic proposal relates to Impact Goal 2, focused on the GCSE attainment gap in secondary schools.

The gap in question is between:

  • Schools serving low income communities: ‘State schools where 50% or more of the pupils attending come from the most deprived 30% of families according to the Income Deprivation Affecting Children Index (IDACI)’ and
  • Schools serving high income communities: ‘State schools where 50% or more of the pupils attending come from the least deprived 30% of families according to IDACI’.

The Report Card explains that the Alliance is focused on gaps between schools rather than gaps between pupils:

‘…to better capture data that includes those pupils whose families are on a low income but are just above the income threshold for free school meals (the poverty measure in schooling). This measurement also helps monitor the impact of the Alliance’s efforts towards meeting the goals as many members work with and through schools to tackle educational inequality, rather than with individual pupils.’

Under Goal 2, the gap the Alliance wishes to close relates to:

‘Average point score…across eight GCSE subjects, with extra weighting for English and maths’

The measure excludes equivalent qualifications. The baseline gap – derived from 2012/13 data –

‘…is currently 101.7 average points – the difference between 8 C grades and 8 A grades.’

The Report Card says this gap has narrowed by 10.5% since 2010/11, but warns that new accountability measures could work in the opposite direction.

The problematic recommendation

The Report Card discusses the distribution of funding to support deprivation, arguing that:

  • Some aspects of disadvantage ‘are given less recognition in the current funding system. For instance FSM Ever 6 does not include low income families who just miss the eligibility criteria for free school meals; and the national funding formula is not able to compensate for geographical isolation and high transport costs which can compound low incomes in parts of the country.’
  • ‘Consequently – due to the combination of a high intake of pupils attracting the premium and a currently unequal national school funding formula – there are a small number of very successful schools building up large surpluses. Meanwhile some schools with arguably greater need, where pupils suffer different socioeconomic disadvantages that affect their attainment, are receiving comparatively little extra funding. This hampers their ability to deal with the challenges that their students face and to prevent those vulnerable pupils from falling behind their peers.’

To rectify this problem, the Report Card recommends a significant policy adjustment:

Target pupil premium by attainment as well as disadvantage measures: This could be achieved through halving current funding per pupil for FSM Ever 6. Half of this funding could then be re-allocated to pupils eligible for FSM Ever 6 who have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend. The change of funding model would increase school accountability for ‘catching up’ pupils.

The proposal is advanced in a section about secondary schools; it is unclear whether it is intended to apply equally to primary schools.

Quite what constitutes low prior attainment is never made entirely clear either. One assumes that, for secondary students, it is anything below the scaled score equivalent of KS2 L4b in English (reading and writing), maths or both.

This means, of course, that learners attracting the pupil premium who achieve the requisite scores would be short-changed just as much as those who exceed them: low attainers would take precedence over middle attainers as well as high attainers.

I am minded to extend my campaign to encompass the ‘squeezed middle’, but perhaps I should let someone else bear that standard.

.

Why this is objectionable

I oppose this proposal because:

  • The pupil premium is described as ‘additional funding for publicly funded schools in England to raise the attainment of disadvantaged pupils and close the gap between them and their peers’. Although not a personal funding entitlement – the funding can be aggregated and deployed as schools see fit – schools are held accountable for the impact of the pupil premium on the attainment and progress of the pupils that attract it. There is presently no distinction according to the attainment of these students, but the change proposed by the Alliance would shift the accountability focus to prioritise the achievement and progress of disadvantaged low attainers over disadvantaged middle and high attainers.
  • The pupil premium should not be treated as part of the overall school budget. As Ofsted said in its first report on the premium (September 2012):

‘School leaders, including governing bodies, should ensure that Pupil Premium funding is not simply absorbed into mainstream budgets, but instead is carefully targeted at the designated children. They should be able to identify clearly how the money is being spent.’

Since the premium follows the pupil, schools with large numbers of eligible pupils should not have any part of this funding clawed back, nor should those with relatively few eligible pupils have it supplemented.

  • If there are problems with the distribution of deprivation funding, this should be addressed through the school funding formula. It is wrong to suggest that a national funding formula would be incapable of compensating for associated sparsity factors. It is for those devising such a formula to determine whether to compensate for pupils not eligible for the premium and factors such as geographical isolation and high transport costs. The Alliance is perfectly entitled to lobby for this. But, in the absence of such a formula, the premium should not be rationed or redistributed to compensate.

  • Ofsted expects schools to use the pupil premium to support their most able disadvantaged pupils, yet reports that too few do so effectively:

‘Our report in 2013 found few instances of the pupil premium being used effectively to support the disadvantaged most able pupils. In the schools visited for this survey, about a third were using the pupil premium funding effectively to target the needs of these pupils.’

  • Any decision to double weight pupil premium for disadvantaged learners with low prior attainment would be likely to penalise disadvantaged high attainers. Although schools could theoretically decide to aggregate the funding and spend it differently, the clear intention is that the accountability framework would incentivise correspondingly stronger improvement by low attainers relative to middle and higher attainers. It is hard to understand how this, combined with the redistribution of funding, would help schools to support the latter and so meet Ofsted’s expectations.
  • There are strong equity arguments against such a redistribution: disadvantaged learners should not be penalised on the basis of their prior attainment. That is not ‘A fair education for all’, nor is it consistent with the ‘sound moral argument for giving every child an equal chance to succeed’ mentioned in the Executive Summary of the Report Card. There is a fundamental distinction between reflecting the additional costs attributable to supporting all low attainers in the funding formula and redistributing allocations associated with individual disadvantaged learners for the same purpose.
  • The Report Card itself recognises the significance of disadvantaged high attainers:

‘As the Level 5 attainment gap highlights, there is not only a need to catch up those ‘slipping behind’ but also an imperative to ‘stretch the top’ when looking at pupils from low income communities. Some schools do well by this measure: sharing best practice in making better than expected levels of progress and stretching the highest attainers is crucial for ensuring all schools can replicate the successes some have already developed.’

How this can be squared with the proposed redistribution of pupil premium is not addressed. 

  • Such a policy would make the Alliance’s own goal of narrowing the gap in university graduation from the 25% most selective universities much harder to achieve, since it would reduce the likelihood of disadvantaged learners reaching the level of attainment necessary to secure admission.
  • There is already additional funding, outside the school funding settlement, dedicated to ‘catch-up’ for those with low prior attainment. Well over £50m per year is allocated to the ‘catch-up premium’, providing £500 for each pupil who did not achieve at least KS2 L4 in reading and/or maths. This may be used for individual or small group tuition, summer schools or resources and materials. A further £50m has been top-sliced from the pupil premium to provide an annual summer schools programme for those at the end of KS2. A core purpose is ‘to help disadvantaged pupils who are behind in key areas such as literacy and numeracy to catch up with their peers’. There is no corresponding funding for disadvantaged high attainers.
  • For FY2015/16, the Government adjusted the funding formula to allocate an additional £390m to schools in the least fairly funded authorities. This involved setting a minimum funding level for five pupil characteristics, one being ‘pupils from deprived backgrounds’, another ‘pupils with low attainment before starting at their primary or secondary school’. The values for the latter are £660 for primary schools and £940 for secondary schools. This establishes a precedent for reflecting the needs of low attaining learners in further progress towards a national funding formula.

.

The campaign to date

I had an inconclusive discussion with Teach First officials on the day the Report Card was published.

.

Subsequently I pressed the Fair Education Alliance spokesperson at Teach First on some specific questions.

.

I received two undertakings to respond online but nothing has materialised. Finally, on 17 April I requested a response within 24 hours.

.

Nothing doing.

Meanwhile, though, Sam Freedman published a piece that appeared to accept that such imbalances should be rectified through the schools funding formula:

‘The distribution, in turn, will depend on whether the next Government maintains the pupil premium at the same level – which has shifted funds towards poorer parts of the country – and whether they introduce a “National Funding Formula” (NFF).

At the moment there are significant and historic differences between funding in different parts of the country. Inner London for instance is overfunded, and many schools have significant surpluses, whereas other parts of the country, often more rural, have much tighter margins. The current Government have taken steps to remedy this but plan to go further if they win the election by introducing a NFF. Doing this would help alleviate the worst effects of the cuts for schools that are currently underfunded.’

Freedman himself retweeted this comment.

We had a further conversation on 20 April after this post had been published.

.

.

Another influential member of the Twitterati also appeared to have been influenced – if not yet fully converted – by my line of argument.

Positive though some of these indications are, there are grounds to fear that at least some Alliance Members remain wedded to the redistribution of pupil premium.

The idea recently reappeared in a publication underpinning the Read On. Get On. campaign, supported by a variety of organisations including Teach First and some members of the Fair Education Alliance.

The report in question – The Power of Reading (April 2015) – mentions that:

‘The Read On. Get On. campaign is working closely with the Fair Education Alliance and the National Literacy Forum to achieve our core goals, and this report reflects and builds on their recommendations.’

One of its ‘recommendations to the new Government’ is ‘Ensure stronger support for disadvantaged children who are falling behind’.

‘In what is likely to be a tight public spending round, our priority for further investment is to improve the quality of early education for the poorest children, as set out above. However, there are options for reforming existing pupil premium spending for primary school children so that it focuses resources and accountability on children from disadvantaged backgrounds who are falling behind…

….One option proposed by the Fair Education Alliance is to refocus the existing pupil premium on children who are eligible for free school meals and who start primary school behind. This would use existing funding and accountability mechanisms for the pupil premium to focus attention on children who need the most urgent help to progress, including in reading. It would make primary schools more accountable for how they support disadvantaged children who are falling behind. The primary pupil premium will be worth £1,300 per pupil in 2015–16 and is paid straight to schools for any child registered as eligible for free school meals at any point in the last six years. The FEA proposes halving the existing premium, and redistributing the other half to children who meet the existing eligibility criteria and have low prior attainment. New baseline tests for children at the start of the reception year, to be introduced in September 2016, could be used as the basis for measuring the prior attainment of children starting primary school.’

Interestingly, this appears to confirm that the Fair Education Alliance supports a redistribution of pupil premium in the primary sector as well as the secondary, something I could not find expressed on the face of the Report Card.

I reacted angrily.

.

The campaign continued

It won’t be long now before I leave the education world behind for ever, but I have decided to devote spare moments to the pursuit on social media of the organisations that form the Fair Education Alliance and/or support Read On. Get On.

I am asking each organisation to:

  • Justify their support for the policy that has been advanced or 
  • Formally distance themselves from it

I also extend an invitation to both campaigns to formally withdraw their proposals.

I shall publish the outcomes here.

The organisations involved are listed below. If any of them would care to cut to the chase, they are most welcome to use the comments facility on this blog or tweet me @GiftedPhoenix.

Since my experience to date has been of surprising coyness when organisations are challenged over their ill-conceived policy ideas, I am imposing a ‘three strikes’ rule.

Any organisation that fails to respond having been challenged three times will be awarded a badge of shame and consigned to the Scrapheap.

Let’s see who’s in there by the end of term.

GP

April 2015

.


Fair Education Alliance

Read On. Get On.

.

The Scrapheap

.

Fair Access Trends in DfE’s Destinations Data 2010-13

This is a brief supplementary post about progression by FSM students to selective universities.

In preparing my last post, I had occasion to look again at DfE statistics on KS5 student destinations.

.

Destinations Data

These experimental statistics were first published in 2012 and most recently in January 2015. To date they cover four academic years, starting with AY2009/10 and ending with AY2012/13.

Underlying data is published each year and since AY2010/11 this has included the number of FSM students admitted to different categories of selective university: the ‘top third’, Russell Group and Oxbridge.

Allowing for a health warning about potential comparability issues (see Technical Notes below) I wanted to investigate how FSM admissions to these categories had changed over the three years in question.

The numbers are set out in this embedded spreadsheet.

.

On closer inspection they reveal some interesting information.

Graph 1, below, shows the percentage increase between AY2010/11 and AY2012/13 for FSM and non-FSM students in each category of selective higher education.

On the face of it, this is extremely good news for fair access, since the increase in FSM admissions significantly exceeds the increase in non-FSM admissions for all three categories of selective higher education.

The increase in FSM progression to Oxbridge is exactly in line with the increase at Russell Group universities.

The improvement at ‘top third’ HEIs is some 40 percentage points lower, but these institutions are almost 10 percentage points ahead of the rate of improvement for all HE.

Over the same period non-FSM progression to Russell Group universities has increased at almost twice the rate of non-FSM progression at Oxbridge, which is only slightly ahead of the 10% or so improvement at ‘top third’ institutions.

But non-FSM progression to all higher education has actually fallen slightly over the period.

.


Graph 1: Percentage increase in FSM and non-FSM students attending selective HE destinations between AY2010/11 and AY2012/13 (From DfE destination statistics, underlying data)

 .

The similarity between the FSM increases for Oxbridge and Russell Group universities may help to substantiate the improvement for the former, despite the potentially drastic impact that rounding can have on such small totals (see Technical Notes below).

On the other hand, it should not be forgotten that this radical improvement was achieved in a single year, between AY2010/11 and AY2011/12.

In the following year there was no change at all for Oxbridge, with FSM admissions stalled on 50, whereas the improvement at Russell Group universities was much more consistent, increasing by some 22% compared with AY2011/12.

Further insights can be gleaned by looking at the figures in a different way.

Graph 2 shows the percentage of total admissions to the different categories of selective higher education accounted for by FSM students – and how these have changed by academic year.

This reveals a somewhat different picture. The FSM progression rate to Oxbridge remains some two percentage points behind the rate for progression to the Russell Group as a whole (although the gap closed temporarily in AY2011/12). Whereas there has been steady improvement across the Russell Group, the FSM share fell back at Oxbridge between AYs 2011/12 and 2012/13.

The overall improvement for all higher education has also been strong, particularly so between AYs 2011/12 and 2012/13. At ‘top third’ universities the FSM share fell back a little in 2011/12 but recovered strongly in 2012/13.

.


Graph 2: Percentage of admissions to Oxbridge, RG, Top third and all HEIs accounted for by FSM students, 2010/11 to 2012/13 (From DfE destination statistics, underlying data)

.

One might normally be wary of expressing changes in comparatively small percentages as percentages themselves, but since the UCAS End of Cycle Report (see below) includes such calculations, it seems equally justifiable in this context.

They reveal a substantial 24-point difference in the change in the FSM share of total admissions between 2012 and 2013, with Oxbridge recording -10% and the remainder of the Russell Group +14%.

.

.

This coincides with a change in the constitution of the Russell Group, as Durham, Exeter, Queen Mary’s and York Universities joined in 2012. This might have had some small impact on share, but does not explain the 24-point gap.

A more tantalising question is the impact of the relaxation of student number controls for students with A level grades of AAB+ or equivalent, combined with a fall in the total number of applicants. Did these factors contribute to the improvement at Russell Group universities, or was the improvement achieved in spite of them?

UCAS End of Cycle Data

The DfE destinations data provides a more differentiated view of FSM progression to selective universities than the oft-quoted UCAS End of Cycle Report 2014, which has only a small section on this topic, based on matched NPD and UCAS admissions data.

In the UCAS analysis, FSM eligibility is determined when the student is aged 15, and the selective ‘high-tariff’ institutions appear to be identified on the same basis as the ‘top third’. This ensures a degree of comparability with the destinations statistics, although the UCAS data relates to the progression of 18 year-olds from state-funded schools only (so excludes colleges).

Furthermore there is no expectation of sustained participation (see technical notes below) and the ‘top third’ of universities has probably been calculated in a different year.

The UCAS analysis is confined exclusively to entry rates – the proportions of the total FSM and non-FSM 18 year-old populations progressing to high-, medium- and low-tariff universities respectively.

Graph 3, below, is derived from the data underpinning the Report. It shows progression to high-tariff universities for FSM and non-FSM students.

.


Graph 3: FSM and non-FSM entry rates to UCAS high-tariff universities, 2011-2014

.

This reveals that:

  • There were very small increases in entry rates between 2013 and 2014, for both FSM and non-FSM populations. (The Report notes that this is a 3.7% improvement for FSM and a 2.9% improvement for non-FSM.)
  • The ratio between non-FSM and FSM has also narrowed minimally, but the gap between them has widened minimally too (from 6.4 points to 6.5 points).
  • Since 2011, the FSM entry rate has increased by some 50% while the improvement in the non-FSM entry rate is nearer 25%. The ratio between the two rates has improved, but the gap between them has widened from 5.6 points to 6.5 points (a rough illustration of how both can be true appears after this list).
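That combination can look paradoxical, so here is a minimal sketch. The entry rates below are back-solved from the percentage changes and gaps quoted above, so they are rough approximations for illustration only, not the published UCAS figures:

```python
# Illustrative entry rates (%) consistent with the changes quoted above:
# FSM up ~50%, non-FSM up ~25%, gap moving from 5.6 to 6.5 percentage points.
# These are back-solved approximations, not the published UCAS entry rates.
fsm_2011, non_fsm_2011 = 2.0, 7.6
fsm_2014, non_fsm_2014 = 3.0, 9.5

print(non_fsm_2011 / fsm_2011, non_fsm_2011 - fsm_2011)  # ratio 3.8, gap 5.6
print(non_fsm_2014 / fsm_2014, non_fsm_2014 - fsm_2014)  # ratio ~3.2, gap 6.5
```

The FSM rate grows faster in proportional terms because it starts from a much lower base, but the non-FSM rate still adds more percentage points in absolute terms, so the gap widens.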

This is not the universally positive story for fair access suggested in media coverage and subsequent political commentary.

.

Oxbridge Data

Data published by Oxford and Cambridge, either in their access agreements or admissions statistics, show that progress over the three years in question has been inconsistent.

  • At Oxford the total number of applicants from Acorn 4 and 5 postcodes reached a peak of 1,246 in 2010/11, only to fall to 1,079 in 2011/12 and 1,070 in 2012/13. The percentage of all students admitted with Acorn 4 and 5 postcodes was 7.6% in 2010/11, but fell to 6.7% in 2011/12, increasing only slightly to 6.8% in 2012/13.
  • At Cambridge, 4.1% of applicants in 2010/11 were home applicants from Polar 2 quintile 1 postcodes, and their success rate was 17.6%. There was an improvement in 2011/12, to 4.6% of applicants and a 22.6% success rate, but in 2012/13 applications remained at 4.6% and the success rate fell back to 20.2%.

Unfortunately neither chooses to make public any data they might hold on annual admissions from FSM and non-FSM students.

Reasons cited in access agreements include the effects of the new student funding regime, a fall in the number of school leavers and the argument that an impact will only become apparent after sustained activity over a five year period. Oxford is however predicting significant improvement in AY2013/14 on the basis of its provisional data.

But one might reasonably expect these factors to have had a similar effect on other Russell Group universities. So how does one justify the disparity revealed by graph 2 above – between Oxbridge and the remainder of the Russell Group?

.

Possible reasons for the disparity between Oxbridge and other Russell Group universities

The explanation most often supplied by Oxbridge is that very few FSM-eligible students manage the exceptionally high attainment required for admission.

Admissions statistics from the two universities show that, in 2012/13:

  • At Oxford 37.1% of students accepted had A*A*A*, 27.2% had A*A*A, 24% had A*AA and 9.4% had AAA (best three A levels).
  • At Cambridge, 59.5% of applicants achieving a UCAS tariff equivalent to A*A*A* were accepted, as were 23.6% of those with A*A*A and 13.9% of those with A*AA.

Data on FSM achievement at the highest A level grades (or equivalent) is particularly hard to come by. I have previously drawn on answers to various Parliamentary Questions that show an increase of some 45% in FSM students achieving AAA or better at A level between 2006 and 2011.

The most recent of these (Col 35W) was answered in July 2012. It says that, of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, there were 546 who achieved 3 or more GCE A levels at A*-A. This includes students in both the school and FE sectors. By comparison, there were 22,353 non-FSM students achieving the same feat.

If we look at the ratio between achievement at this level and admission to Oxbridge in the same year (a quick check of the arithmetic appears after the list):

  • 546 FSM students corresponded with 30 places secured (ratio 18:1)
  • 22,353 non-FSM students corresponded with 2,260 places secured (ratio 10:1)
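As a rough check – remembering that the Oxbridge place counts are rounded to the nearest 10 in the underlying data, so the ratios are approximate:

```python
# Quick check of the ratios quoted above (figures from the July 2012 PQ answer
# and the rounded destinations data).
fsm_achievers, fsm_places = 546, 30            # FSM: 3+ A levels at A*-A; Oxbridge places
other_achievers, other_places = 22_353, 2_260  # non-FSM equivalents

print(round(fsm_achievers / fsm_places))       # 18, i.e. roughly 18:1
print(round(other_achievers / other_places))   # 10, i.e. roughly 10:1
```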

So what exactly is happening? There are several possible further reasons for FSM under-representation:

  • Too few FSM students are gaining A* grades (or equivalent), as opposed to A grades, at A level.
  • Too few FSM students are gaining the necessary grades in suitable subject combinations and/or in facilitating subjects. (There has been some suggestion recently that subject choice is an issue, though this study adopts a broader definition of disadvantage and does not apply specifically to Oxbridge admission.)
  • When Oxbridge chooses FSM students pre A-level, their GCSE/AS level performance does not reflect their eventual A level performance.
  • Too few of the highest attaining FSM students are applying to Oxbridge, quite possibly for a variety of different reasons.
  • Too many FSM applicants to Oxbridge are seeking entry to the most competitive courses; too few to those where there are fewer applicants per place. (At Oxford in 2012/13, for example, the success rate for medicine was 10% while for classics it was 42%.)
  • FSM students do apply in proportion, but are relatively less successful at gaining admission for reasons other than (predicted) attainment. One reason might be that neither University specifically targets FSM students through its access strategy, preferring alternative indicators of disadvantage.

Unfortunately, there is very little data available publicly to test which of these hypotheses are correct, their relative impact and how they operate in combination.

As attention switches to the pupil premium measure, one wonders whether the next government will ensure that reliable data can be made available to selective universities and, through OFFA, expect them to feature this in their access targets, as well as their policies for contextualised admissions.

.

Technical Notes

There is a time lag associated with the HESA dataset, which has to be matched with the National Pupil Database. For example, the January 2015 publication matches data on students in KS5 taking A level and equivalent qualifications in AY2011/12 and on those in HE in AY2012/13.

The most recent publication appeared in January 2015. Since HESA collects data at the end of each academic year, the lag was approximately 18 months.

The next publication, relating to academic year 2013/14, is not scheduled for release until October/November 2015, indicating a lag of 15/16 months.

According to the Technical Note linked to the most recent SFR, KS5 students are included if they:

  • Entered for at least one A level or equivalent level 3 qualification similar in size to an A level.
  • Attend state-funded mainstream schools, independent schools, FE and sixth form colleges and maintained, non-maintained and independent special schools. (However, it seems that only a few independent schools – those that provide tracking information to local authorities – are included.)

Students must record sustained participation – in all of the first two terms of the year – at one or more HE destinations. In 2012/13 this was defined as between October 2012 and March 2013.

Higher education is defined as any UK HE institution, so those admitted to institutions abroad are excluded. Students undertaking HE courses at FE institutions are included. The note is not quite clear about the treatment of students accepted for deferred entry.

The categories of selective HE are nested within each other:

  • The top third of HEIs when grouped by mean UCAS tariff score from entrants’ best three A level grades. KS5 students with other qualifications, or with no A level points, are excluded from the calculation. The ‘top third’ methodology is preferred by BIS. The constitution of the group changes annually, though 88% of institutions were within scope for six consecutive years up to 2011/12. (The 2011/12 list is used on this occasion.)
  • The Russell Group (Birmingham, Bristol, Cambridge, Cardiff, Durham, Edinburgh, Exeter, Glasgow, Imperial, KCL, Leeds, Liverpool, LSE, Manchester, Newcastle, Nottingham, Oxford, Queen Mary’s, Queens Belfast, Sheffield, Southampton, UCL, Warwick and York).
  • Oxbridge (Oxford and Cambridge)

Eligibility for free school meals (FSM) means students eligible for and claiming FSM in Year 11. Pupil premium was not introduced until September 2011, when these students were already beyond Year 11.

All national figures are rounded to the nearest 10, which makes small totals particularly unreliable. (For example, 40 + 10 could represent 35 + 5 or 44 + 14, so anywhere between 40 and 58.)
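To spell out the point, here is a minimal sketch of the bounds implied by rounding to the nearest 10 (assuming conventional rounding, so a published 40 covers true values from 35 to 44):

```python
# Bounds on the sum of two published figures, each rounded to the nearest 10.
def true_bounds(published):
    # smallest and largest integers that round (half-up) to `published`
    return published - 5, published + 4

low = sum(true_bounds(x)[0] for x in (40, 10))   # 35 + 5  = 40
high = sum(true_bounds(x)[1] for x in (40, 10))  # 44 + 14 = 58
print(low, high)  # 40 58
```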

The technical note advises that:

‘Some of the differences across years may be attributable to the tightening of methodology or the improvements in data matching, so comparisons across years must be treated with caution.’

.

GP

March 2015

The most able students: Has Ofsted made progress?

.

This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.

.

Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour, meanwhile, restated its commitment to a Gifted and Talented Fund:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’

 .

 .

  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is  well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breadth they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response, from Conservatives or Liberal Democrats, to the challenge laid down by Labour, which has decided that some degree of arms-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas ahead of common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions

.

Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

They fail to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 Maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
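The arithmetic is straightforward inclusion-exclusion; here is a minimal sketch using the percentages above, treating the published ‘reading, writing and maths’ figure as the overlap between the two single-subject groups:

```python
# Share of the 2014 Year 7 cohort within Ofsted's 'most able' definition:
# L5+ in KS2 English (reading and writing) and/or L5+ in KS2 maths.
l5_english = 29  # % with L5+ in reading and writing
l5_maths = 44    # % with L5+ in maths
l5_both = 24     # % with L5+ in reading, writing and maths (taken as the overlap)

most_able_share = l5_english + l5_maths - l5_both  # inclusion-exclusion
print(most_able_share)  # 49 - roughly half the cohort
```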

.

[Venn diagram: KS2 Level 5 attainment in English (reading and writing) and maths, 2014 Year 7 cohort]

.

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, ie once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

Netting off the overlap between the two single-subject totals (138,789 + 165,340 - 91,944) gives a combined population of 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
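The same calculation can be sketched as follows; the 2011 Primary Performance Tables cohort stands in for the 2009 KS2 test-taking population, as noted above:

```python
# Rough reconstruction of the ~39% figure for the 2014 GCSE cohort in
# non-selective schools, using the Annex A counts quoted above.
l5_both = 91_944      # L5 in both KS2 English and maths
l5_maths = 165_340    # L5 in KS2 maths (total)
l5_english = 138_789  # L5 in KS2 English, reading and writing (total)

union = l5_english + l5_maths - l5_both  # net off pupils counted in both totals
share = 100 * union / 547_025            # proxy denominator from the 2011 tables
print(union, round(share, 1))            # 212185 38.8
```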

.

Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

.


Percentage of high attainers within each state-funded non-selective secondary school’s cohort 2014 (Performance Tables measure)

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils in each of these categories within the relevant cohort. (It is of course a different 9% in each case.)

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students, indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
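A minimal sketch of that estimate, resting on the stated assumption that the 13.2% share holds across the wider L5 population:

```python
# Sketch of the ~28,000 estimate for Ofsted's disadvantaged most able population
# within the 2014 GCSE cohort in state-funded, non-selective schools.
fsm_both = 12_150   # 'ever 6 FSM' pupils with L5 in both KS2 English and maths
all_both = 91_944   # all pupils with L5 in both subjects
fsm_share = fsm_both / all_both            # ~0.132

all_l5 = 212_185    # union of the L5 English and L5 maths populations (see earlier)
print(round(fsm_share, 3), round(fsm_share * all_l5))  # 0.132 28039
```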

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so-called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

[Charts: GCSE grades achieved in English and in maths by students with KS2 Level 5, non-selective schools, 2014]

 .

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

Approximately one in four high attainers is not achieving each of these progression targets, even though they are not particularly demanding.

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.

.

The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sub-levels, amply evidenced by the transition matrices:

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.

.

Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).

A table is supplied showing progression by sub-level in English and maths separately.

.

[Table: progression from KS2 sub-levels in English and maths to GCSE grades, ever 6 FSM versus other students]

. 

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

.


GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment

.

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.

 .

Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.

.

Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools’.

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s Evidence base – inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

.

Ofsted Capture 4

.

  • Supplementary questions asked during 130 Section 5 inspections, focused on how well the most able students are maintaining their progress in KS3, plus challenge and availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly on to the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged was small. Consequently leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.

.

.

Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and KS5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that work was too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.

. 

Assessment and tracking

‘Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tended to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes made as a consequence. In only eight schools were the views of the most able students sought as a distinct cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable these students to reach their potential. Weaknesses at KS3 meant there was too much ground to make up at KS4 and KS5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.

 .

Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’

. 

School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.

.

Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.’

Too often, poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared to be a factor in stifled progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 schools asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: ‘There is confusion (in schools) between excellence and elitism’.

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.

.

Ofsted and other Central Government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

.

Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, too often, the most able students are succeeding despite their schools, rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question of whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It is as if the ‘school-led, self-improving’ ideal is already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

Ofsted Capture 5

Ofsted Capture 6

Those that Ofsted directs at itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it is possible that it might do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that: 

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 

.

.

If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is why schools should be any more likely to respond this time round than they were in 2013.

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take. Until we can establish the framework necessary to secure universally high standards across all schools, without resorting to national prescription, we – and Ofsted – are whistling in the wind.

GP

March 2015

Addressed to Teach First and its Fair Education Alliance

.

This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

In the meantime, the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version, which they said would be published on 27 February. It never appeared.

This Tweet

.

.

prompted an undertaking that it would appear on 27 March. I’ll believe that when I see it.

But there’s no reason why you should wait any longer. This version is more comprehensive anyway, in that it includes several relevant Twitter comments and additional explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

As the final sequence of Tweets below shows, Teach First committed to an online response on 14 February. Still waiting…

.

.

.

How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied by BIS say that the total number of FSM students progressing to Oxbridge fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE’s ‘FSM to Oxbridge’ impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.

.

[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).

.

.

The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]
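
To make the rounding point concrete, here is a rough sketch of the arithmetic, using only the rounded figures quoted above and treating the 2011/12 figure of 2,420 as the comparable non-FSM baseline (which the quoted 6% increase implies). It is illustrative only.

```python
# Rough sketch of the ratio comparison, using only the rounded figures quoted above.
fsm = 40 + 10                   # FSM total in both years, each component rounded to the nearest 10
non_fsm_2011_12 = 2420          # treated here as the comparable non-FSM baseline
non_fsm_2012_13 = 2080 + 480    # 2,560

print(f"Increase in non-FSM progression: {non_fsm_2012_13 / non_fsm_2011_12 - 1:.1%}")  # ~5.8%
print(f"2011/12 ratio: ~{non_fsm_2011_12 / fsm:.0f} non-FSM students per FSM student")   # ~48
print(f"2012/13 ratio: ~{non_fsm_2012_13 / fsm:.0f} non-FSM students per FSM student")   # ~51

# Because 40 and 10 are each rounded to the nearest 10, the true 2012/13 FSM total
# could lie roughly anywhere between 40 and 58, so the ratio sits in a wide band.
for fsm_true in (40, 50, 58):
    print(f"If the true FSM total were {fsm_true}: ~{non_fsm_2012_13 / fsm_true:.0f} : 1")
```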

.

The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted

.

.

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are  “currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals”.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.

.

.

Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report  says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.

.

.

.

.

.

Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.

.

.

.

Perhaps it should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome.
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.

 .

.

*In the Report Card, this impact goal is differently expressed, as narrowing the gap in university graduation, so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.

.

.

.

GP

January 2015

How Well Do Grammar Schools Perform With Disadvantaged Students?

This supplement to my previous post on The Politics of Selection  compares the performance of disadvantaged learners in different grammar schools.

It adds a further dimension to the evidence base set out in my earlier post, intended to inform debate about the potential value of grammar schools as engines of social mobility.

The commentary is based on the spreadsheet embedded below, which relies entirely on data drawn from the 2013 Secondary School Performance Tables.

.

.

If you find any transcription errors please alert me and I will correct them.

.

Preliminary Notes

The 2013 Performance Tables define disadvantaged learners as those eligible for free school meals in the last six years and children in care. Hence both these categories are caught by the figures in my spreadsheet.

Because the number of disadvantaged pupils attending grammar schools is typically very low, I have used the three year average figures contained in the ‘Closing the Gap’ section of the Tables.

These are therefore the number of disadvantaged students in each school’s end of KS4 cohort for 2011, 2012 and 2013 combined. They should illustrate the impact of pupil premium support and wider closing the gap strategies on grammar schools since the Coalition government came to power.

Even when using three year averages the data is frustratingly incomplete, since 13 of the 163 grammar schools have so few disadvantaged students – fewer than six across all three cohorts combined – that the results are suppressed. We have no information at all about how well or how badly these schools are performing in terms of closing gaps.

My analysis uses each of the three performance measures within this section of the Performance Tables:

  • The percentage of pupils at the end of KS4 achieving five or more GCSEs (or equivalents) at grades A*-C, including GCSEs in English and maths. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in English. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in maths.

In each case I have recorded the percentage of disadvantaged learners who achieve the measure and the percentage point gap between that and the corresponding figure for ‘other’ – ie non-disadvantaged – students.

For comparison I have also included the corresponding percentages for all disadvantaged pupils in all state-funded schools and for all high attainers in state-funded schools. The latter is for 2013 only rather than a three-year average.

Unfortunately the Tables do not provide data for high attaining disadvantaged students. The vast majority of disadvantaged students attending grammar schools will be high-attaining according to the definition used in the Tables (average points score of 30 or higher across KS2 English, maths and science).

But, as my previous post showed, some grammar schools record 70% or fewer high attainers, disadvantaged or otherwise. These include: Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent).

Some of these schools feature in the analysis below, while some do not, suggesting that the correlation between selectivity and the performance of disadvantaged students is not straightforward.
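
To make the method explicit, the sketch below shows how the gap figures used in the rest of this analysis can be derived from the ‘Closing the Gap’ three-year-average columns, and how suppressed cohorts are handled. The file and column names are hypothetical; this is not the spreadsheet embedded above.

```python
# Minimal sketch of the gap calculation used below. Hypothetical file and column names;
# 'SUPP' stands in for results suppressed because fewer than six disadvantaged students
# feature across the three combined cohorts.
import pandas as pd

df = pd.read_csv("grammar_schools_closing_the_gap_2011_13.csv")

def gap(row, measure):
    disadvantaged = row[f"{measure}_disadvantaged"]
    other = row[f"{measure}_other"]
    if disadvantaged == "SUPP":
        return None  # no gap can be calculated for suppressed cohorts
    # Positive values mean disadvantaged students outperform their peers (a 'positive gap');
    # negative values mean they do worse.
    return float(disadvantaged) - float(other)

for measure in ("pct_5ac_em", "pct_exp_prog_eng", "pct_exp_prog_maths"):
    df[f"gap_{measure}"] = df.apply(lambda row, m=measure: gap(row, m), axis=1)

# Largest negative gaps first, i.e. the schools of greatest concern on this measure.
print(df[["school", "gap_pct_5ac_em"]].sort_values("gap_pct_5ac_em").head(10))
```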

.

Number of disadvantaged learners in each school

The following schools are those with suppressed results, placed in order according to the number of disadvantaged learners within scope, from lowest to highest:

  • Tonbridge Grammar School, Kent (2)
  • Bishop Wordsworth’s Grammar School, Wiltshire (3)
  • Caistor Grammar School, Lincolnshire (3)
  • Sir William Borlase’s Grammar School, Buckinghamshire (3)
  • Adams’ Grammar School, Telford and Wrekin (4)
  • Chelmsford County High School for Girls, Essex (4)
  • Dr Challoner’s High School, Buckinghamshire (4)
  • King Edward VI School, Warwickshire (4)
  • Alcester Grammar School, Warwickshire (5)
  • Beaconsfield High School, Buckinghamshire (5)
  • King Edward VI Grammar School, Chelmsford, Essex (5)
  • Reading School, Reading (5)
  • St Bernard’s Catholic Grammar School, Slough (5).

Some of these schools feature among those with the lowest proportions of ‘ever 6 FSM’ pupils on roll, as shown in the spreadsheet accompanying my previous post, but some do not.

The remaining 150 schools each record a combined cohort of between six and 96 students, with an average of 22.

A further 19 schools have a combined cohort of 10 or fewer, meaning that 32 grammar schools in all (20% of the total) are in this category.

At the other end of the distribution, only 16 schools (10% of all grammar schools) have a combined cohort of 40 or more disadvantaged students – and only four have 50 or more.

These are:

  • Handsworth Grammar School, Birmingham (96)
  • Stretford Grammar School, Trafford (76)
  • Dane Court Grammar School, Kent (57)
  • Slough Grammar School (Upton Court) (50).

Because the ratio of disadvantaged to other pupils in the large majority of grammar schools is so marked, the results below must be treated with a significant degree of caution.

Outcomes based on such small numbers may well be misleading, but they are all we have.

Arguably, grammar schools should find it relatively easier to achieve success with a very small cohort of students eligible for the pupil premium – since fewer require separate monitoring and, potentially, additional support.

On the other hand, the comparative rarity of disadvantaged students may mean that some grammar schools have too little experience of addressing such needs, or believe that closing gaps is simply not an issue for them.

Then again, it is perhaps more likely that grammar schools will fall short of 100% success with their much larger proportions of ‘other’ students, simply because the probability of special circumstances arising is relatively higher. One might expect therefore to see ‘positive gaps’ with success rates for disadvantaged students slightly higher than those for their relatively more advantaged peers.

Ideally though, grammar schools should be aiming for a perfect 100% success rate for all students on these three measures, regardless of whether they are advantaged or disadvantaged. None is particularly challenging, for high attainers in particular – and most of these schools have been rated as outstanding by Ofsted.

.

Five or more GCSE A*-C grades or equivalent including GCSEs in English and maths

In all state-funded schools, the percentage of disadvantaged students achieving this measure across the three year period is 38.7% while the percentage of other students doing so is 66.3%, giving a gap of 27.6 percentage points.

In 2013, 94.7% of all high attainers in state-funded secondary schools achieved this measure.

No grammar school falls below the 38.7% benchmark for its disadvantaged learners. The nearest to it is Pate’s Grammar School, at 43%, but these results were affected by the School’s decision to enter students for English examinations that were not recognised for Performance Table purposes.

The next lowest percentages are returned by:

  • Spalding Grammar School, Lincolnshire (59%)
  • Simon Langton Grammar School for Boys, Kent (65%)
  • Stratford Grammar School for Girls, Warwickshire (71%)
  • The Boston Grammar School, Lincolnshire (74%)

These were the only four schools below 75%.

Table 1 below illustrates these percentages and the percentage point gap for each of these four schools.

.

Table 1

Table 1: 5+ GCSEs at A*-C or equivalent including GCSEs in English and maths: Lowest performing and largest gaps

.

A total of 46 grammar schools (31% of the 150 without suppressed results) fall below the 2013 figure for high attainers across all state-funded schools.

On the other hand, 75 grammar schools (exactly 50%) achieve 100% on this measure, for combined student cohorts ranging in size from six to 49.

Twenty-six of the 28 schools that had no gap between the performance of their advantaged and disadvantaged students were amongst those scoring 100%. (The other two were at 97% and 95% respectively.)

The remaining 49 with a 100% record amongst their disadvantaged students demonstrate a ‘positive gap’, in that the disadvantaged do better than the advantaged.

The biggest positive gap is seven percentage points, recorded by Clarendon House Grammar School in Kent and Queen Elizabeth’s Grammar School in Alford, Lincolnshire.

Naturally enough, schools recording relatively lower success rates amongst their disadvantaged students also tend to demonstrate a negative gap, where the advantaged do better than the disadvantaged.

Three schools had an achievement gap higher than the 27.6 percentage point national average. They were:

  • Simon Langton Grammar School for Boys (30 percentage points)
  • Spalding Grammar School (28 percentage points)
  • Stratford Grammar School for Girls (28 percentage points)

So three of the four with the lowest success rates for disadvantaged learners demonstrated the biggest gaps. Twelve more schools had double-digit achievement gaps of 10 percentage points or higher.

These 15 schools – 10% of the total for which we have data – have a significant issue to address, regardless of the size of their disadvantaged populations.

One noticeable oddity at this end of the table is King Edward VI Camp Hill School for Boys in Birmingham, which returns a positive gap of 14 percentage points (rounded), with 80% for disadvantaged and 67% for advantaged students. On this measure at least, it is doing relatively badly with its disadvantaged students, but considerably worse with those from advantaged backgrounds!

However, this idiosyncratic pattern is also likely to be attributable to the School using some examinations not eligible for inclusion in the Tables.

.

At least expected progress in English

Across all state-funded schools, the percentage of disadvantaged students making at least three levels of progress in English is 55.5%, compared with 75.1% of ‘other’ students, giving a gap of 19.6 percentage points.

In 2013, 86.2% of high attainers achieved this benchmark.

If we again discount Pate’s from consideration, the lowest performing school on this measure is The Boston Grammar School which is at 53%, lower than the national average figure.

A further 43 schools (29% of those for which we have data) are below the 2013 average for all high attainers. Six more of these fall below 70%:

  • The Skegness Grammar School, Lincolnshire (62%)
  • Queen Elizabeth Grammar School, Cumbria (62%)
  • Plymouth High School for Girls (64%)
  • Spalding Grammar School, Lincolnshire (65%)
  • Devonport High School for Boys, Plymouth (65%)
  • Simon Langton Grammar School for Boys, Kent (67%)

Table 2 below illustrates these outcomes, together with the attainment gaps recorded by these schools and others with particularly large gaps.

.

Table 2

Table 2: At least expected progress in English from KS2 to KS4: Lowest performing and largest gaps

.

At the other end of the table, 44 grammar schools achieve 100% on this measure (29% of those for which we have data.) This is significantly fewer than achieved perfection on the five or more GCSEs benchmark.

When it comes to closing the gap, only 16 of the 44 achieve a perfect 100% score with both advantaged and disadvantaged students, again much lower than on the attainment measure above.

The largest positive gaps (where disadvantaged students outscore their advantaged classmates) are at The King Edward VI Grammar School, Louth, Lincolnshire (11 percentage points) and John Hampden Grammar School, Buckinghamshire (10 percentage points).

Amongst the schools propping up the table on this measure, six record negative gaps of 20 percentage points or higher, so exceeding the average gap in state-funded secondary schools:

  • The Skegness Grammar School (30 percentage points)
  • Queen Elizabeth Grammar School Cumbria (28 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)
  • Plymouth High School for Girls (25 percentage points)
  • Devonport High School for Boys, Plymouth (23 percentage points)
  • Loreto Grammar School, Trafford (20 percentage points).

There is again a strong correlation between low disadvantaged performance and large gaps, although the relationship does not apply in all cases.

Another 23 grammar schools have a negative gap of 10 percentage points or higher.

There is again a curious pattern at King Edward VI Camp Hill in Birmingham: its disadvantaged students come in at 75% on this measure, outscoring the advantaged, who are at 65%, ten percentage points lower. As noted above, there may well be extenuating circumstances.

.

At least expected progress in maths

The percentage of disadvantaged students making at least three levels of progress in maths across all state-funded schools is 50.7%, compared with a figure for ‘other’ students of 74.1%, giving a gap of 23.4 percentage points.

In 2013, 87.8% of high attainers achieved this.

On this occasion Pate’s is unaffected (in fact it scores 100%), as is King Edward VI Camp Hill School for Boys (in its case for advantaged and disadvantaged alike).

No schools come in below the national average for disadvantaged students; in fact, all comfortably exceed it. However, the lowest performers are still a long way behind some of their fellow grammar schools.

The worst performing grammar schools on this measure are:

  • Spalding Grammar School, Lincolnshire (59%)
  • Queen Elizabeth Grammar School Cumbria (62%)
  • Simon Langton Grammar School for Boys, Kent (63%)
  • Dover Grammar School for Boys, Kent (67%)
  • The Boston Grammar School, Lincolnshire (68%)
  • Borden Grammar School, Kent (68%)

These are very similar to the corresponding rates for the lowest performers in English.

Table 3 illustrates these outcomes, together with other schools demonstrating very large gaps between advantaged and disadvantaged students.

.

Table 3

Table 3: At least expected progress in maths from KS2 to KS4: Lowest performing and largest gaps

A total of 32 schools (21% of those for which we have data) undershoot the 2013 average for high attainers, a slightly better outcome than for English.

At the other extreme, there are 54 schools (36% of those for which we have data) that score 100% on this measure, slightly more than do so on the comparable measure for English, but still significantly fewer than achieve this on the 5+ GCSE measure.

Seventeen of the 54 also achieve a perfect 100% for advantaged students.

The largest positive gaps recorded are 11 percentage points at The Harvey Grammar School in Kent (which achieved 94% for disadvantaged students) and 7 percentage points at Queen Elizabeth’s Grammar School, Alford, Lincolnshire (91% for disadvantaged students).

The largest negative gaps on this measure are just as substantial as those relating to English. Four schools record gaps significantly larger than the national average of 23.4 percentage points:

  • Spalding Grammar School, Lincolnshire (32 percentage points)
  • Queen Elizabeth Grammar School, Cumbria (31 percentage points)
  • Simon Langton Grammar School for Boys, Kent (31 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)

Queen Elizabeth’s and Stratford Girls’ also appeared in the corresponding list for English, while Spalding, Simon Langton and Stratford Girls’ appeared in the list for the 5+ GCSE measure.

A further 20 schools have a double-digit negative gap of 10 percentage points or higher, very similar to the outcome in English.

.

Comparison across the three measures

As will be evident from the tables and lists above, some grammar schools perform consistently poorly on all three measures.

Others perform consistently well, while a third group have ‘spiky profiles’.

The number of schools that achieve 100% on all three measures with their disadvantaged students is 25 (17% of those for which we have data).

Eight of these are located in London; none is located in Birmingham. Just two are in Buckinghamshire and there is one each in Gloucestershire, Kent and Lincolnshire.

Only six schools achieve 100% on all three measures with advantaged and disadvantaged students alike. They are:

  • Queen Elizabeth’s, Barnet
  • Colyton Grammar School, Devon
  • Nonsuch High School for Girls, Sutton
  • St Olave’s and St Saviour’s Grammar School, Bromley
  • Tiffin Girls’ School, Kingston
  • Kendrick School, Reading

Five schools recorded comparatively low performance across all three measures (ie below 80% on each):

  • Spalding Grammar School, Lincolnshire
  • Simon Langton Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • St Joseph’s College, Stoke on Trent

Their overall performance is illustrated in Table 4.

.

Table 4

Table 4: Schools where 80% or fewer disadvantaged learners achieved each measure

.

This small group of schools is a major cause for concern.

A total of 16 schools (11% of those for which we have data) score 90% or less on all three measures and they, too, are potentially concerning.

Schools which record negative gaps of 10 percentage points or more on all three measures are:

  • Simon Langton Grammar School for Boys, Kent
  • Dover Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • Wilmington Grammar School for Boys, Kent
  • St Joseph’s College, Stoke-on-Trent
  • Queen Elizabeth’s Grammar School, Horncastle, Lincolnshire

Table 5 records these outcomes.

.

Table 5

Table 5: Schools with gaps of 10 percentage points or higher on all three measures

.

Of these, Boston and Stratford have gaps of 20 percentage points or higher on all three measures.

A total of 32 grammar schools (21% of those for which we have data) record a success rate of 80% or lower on at least one of the three measures.

.

Selective University Destinations

I had also wanted to include in the analysis some data on progression to selective (Russell Group) universities, drawn from the experimental destination statistics.

Unfortunately, the results for FSM students are suppressed for the vast majority of schools, making comparison impossible. According to the underlying data for 2011/12, all I can establish with any certainty is that:

  • In 29 grammar schools, there were no FSM students in the cohort.
  • Five schools returned 0%, meaning that no FSM students successfully progressed to a Russell Group university. These were Wycombe High School, Wallington High School for Girls, The Crossley Heath School in Calderdale, St Anselm’s College on the Wirral and Bacup and Rawtenstall Grammar School.
  • Three schools were relatively successful – King Edward VI Five Ways in Birmingham reported 58% of FSM students progressing, while King Edward VI Handsworth reported 53% and the Latymer School achieved an impressive 75%.
  • All remaining grammar schools – some 127 in that year – are reported as ‘x’ meaning that there were either one or two students in the cohort, so the percentages are suppressed.

We can infer from this that, at least in 2011/12, very few grammar schools indeed were specialising in providing an effective route to Russell Group universities for FSM students.
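
For completeness, a short sketch of how those categories can be recovered from the underlying data despite the suppression, again with hypothetical file and column names (suppressed FSM percentages assumed to appear as ‘x’ and empty cohorts as blanks):

```python
# Sketch only: classify grammar schools by what the suppressed destination data allows
# us to say about FSM progression to Russell Group universities in 2011/12.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("ks5_destinations_2011_12_grammar.csv")

def classify(value):
    if pd.isna(value) or str(value).strip() == "":
        return "no FSM students in the cohort"
    if str(value).strip().lower() == "x":
        return "suppressed (one or two FSM students)"
    return "published percentage"

print(df["fsm_russell_group_pct"].map(classify).value_counts())
```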

.

Conclusion

Even allowing for the unreliability of statistics based on very small cohorts, this analysis is robust enough to show that the performance of grammar schools in supporting disadvantaged students is extremely disparate.

While there is a relatively large group of consistently high performers, roughly one in five grammar schools is a cause for concern on at least one of the three measures. Approximately one in ten is performing no more than satisfactorily across all three. 

The analysis hints at the possibility that the biggest problems tend to be located in rural and coastal areas rather than in London and other urban centres, but this pattern is not always consistent. The majority of the poorest performers seem to be located in wholly selective authorities but, again, this is not always the case.

A handful of grammar schools are recording significant negative gaps between the performance of disadvantaged students and their peers. This is troubling. There is no obvious correlation between the size of the disadvantaged cohort and the level of underperformance.

There may be extenuating circumstances in some cases, but there is no public national record of what these are – an argument for greater transparency across the board.

One hopes that the grammar schools that are struggling in this respect are also those at the forefront of the reform programme described in my previous post – and that they are improving rapidly.

One hopes, too, that those whose business it is to ensure that schools make effective use of the pupil premium are monitoring these institutions closely. Some of the evidence highlighted above would not, in my view, be consistent with an outstanding Ofsted inspection outcome.

If the same pattern is evident when the 2014 Performance Tables are published in January 2015, there will be serious cause for concern.

As for the question whether grammar schools are currently meeting the needs of their – typically few – disadvantaged students, the answer is ‘some are; some aren’t’. This argues for intervention in inverse proportion to success.

.

GP

December 2014

The Politics of Selection: Grammar Schools and Disadvantage

This post considers how England’s selective schools are addressing socio-economic disadvantage.

Another irrelevant Norwegian vista by Gifted Phoenix

It is intended as an evidence base against which to judge various political statements about the potential value of selective education as an engine of social mobility.

It does not deal with recent research reports about the historical record of grammar schools in this respect. These show that – contrary to received wisdom – selective education has had a very limited impact on social mobility.

Politicians of all parties would do well to acknowledge this, rather than attempting (as some do) to perpetuate the myth in defiance of the evidence.

This post concentrates instead on the current record of these schools, recent efforts to strengthen their capacity to support the Government’s gap closing strategy and prospects for the future.

It encourages advocates of increased selection to consider the wider question of how best to support high attainers from disadvantaged backgrounds.

The post is organised into four main sections:

  • A summary of how the main political parties view selection at this point, some six months ahead of a General Election.
  • A detailed profile of the socio-economic inclusiveness of grammar schools today, which draws heavily on published data but also includes findings from recent research.
  • An evaluation of national efforts over the last year to reform selective schools’ admissions, testing and outreach in support of high-attaining disadvantaged learners.
  • Comparison of the various policy options for closing excellence gaps between such learners and their more advantaged peers – and consideration of the role that reformed and/or increased selection might play in a more comprehensive strategy.

Since I know many readers prefer to read my lengthy posts selectively, I have included page jumps from each of the bullet points above to the relevant sections below.

One more preliminary point.

This is the second time I have explored selection on this Blog, though my previous post, on fair access to grammar schools, appeared as far back as January 2011. This post updates some of the data in the earlier one.

One purpose of that earlier post was to draw attention to the parallels in the debates about fair access to grammar schools and to selective higher education.

I do not repeat those arguments here, although writing this has confirmed my opinion that they are closely related issues and that many of the strategies deployed at one level could be applied equally at the other.

So there remains scope to explore how appropriate equivalents of Offa, access agreements, bursaries and contextualised admissions might be applied to selective secondary admissions arrangements, alongside the reforms that are already on the table. I leave that thought hanging.

.

The Political Context

My last post on ‘The Politics of Setting’ explored how political debate surrounding within-school and between-school selection is becoming increasingly febrile as we approach the 2015 General Election.

The two have become inextricably linked because Prime Minister Cameron, in deciding not to accommodate calls on the right of his party to increase the number of selective schools, has called instead for ‘a grammar stream in every school’ and, latterly, for a wider – perhaps universal – commitment to setting.

In May 2007, Cameron wrote:

‘That’s what the grammar school row was about: moving the Conservative Party on from slogans such as ‘Bring back grammar schools’ so that we can offer serious policies for improving state education for everyone…

…Most critics seem to accept, when pressed, that as I have said, the prospect of more grammars is not practical politics.

Conservative governments in the past – and Conservative councils in the present – have both failed to carry out this policy because, ultimately, it is not what parents want….

…When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.

With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a ‘grammar stream’ in every subject in every school.

Setting would be a focus for Ofsted and a priority for all new academies.’

As ‘The Politics of Setting’ explained, this alternative aspiration to strengthen within-school selection has not yet materialised, although there are strong signs that it is still Cameron’s preferred way forward.

The Coalition has been clear that:

‘It is not the policy of the Government to establish new grammar schools in England’ (Hansard, 10 February 2014, Col. 427W).

but it has also:

  • Removed barriers to the expansion of existing grammar schools through increases to planned admission numbers (PANs) within the Admissions Code.
  • Introduced several new selective post-16 institutions through the free schools policy (though not as many as originally envisaged since the maths free schools project has made relatively little progress).
  • Made efforts to reform the admissions procedures of existing selective secondary schools and
  • Accepted in principle that these existing schools might also expand through annexes, or satellite schools. This is now a live issue since one decision is pending and a second proposal may be in the pipeline.

The Liberal Democrats have enthusiastically pursued at least the third of these policies, with Lib Dem education minister David Laws leading the Government’s efforts to push the grammar schools further and faster down this route.

In his June 2014 speech (of which much more below) Laws describes grammar schools as ‘a significant feature of the landscape in many local areas’ and ‘an established fact of our education system’.

But, as the Election approaches, the Lib Dems are increasingly distancing themselves from a pro-selective stance.

Clegg is reported to have said recently that he did not believe selective schools were the way forward:

‘The Conservatives have got this odd tendency to constantly want to turn the clock back.

Some of them seem to be hankering towards a kind of selective approach to education, which I don’t think works.

Non-selective schools stream and a lot of them stream quite forcefully, that’s all fine, but I think a segregated school system is not what this country needs.’

Leaving aside the odd endorsement of ‘forceful streaming’, this could even be interpreted as hostile to existing grammar schools.

Meanwhile, both frontrunners to replace Cameron as Tory leader have recently restated their pro-grammar school credentials:

  • Constituency MP Theresa May has welcomed consideration of the satellite option in Maidenhead.

The right wing of the Tory party has long supported increased selection and will become increasingly vociferous as the Election approaches.

Conservative Voice – which describes itself as on the ‘center-Right of the party’ [sic] – will imminently launch a campaign calling for removal of the ban on new grammar schools to be included in the Conservative Election Manifesto.

They have already conducted a survey to inform the campaign, from which it is clear that they will be playing the social mobility card.

The Conservative right is acutely aware of the election threat posed by UKIP, which has already stated its policy that:

‘Existing schools will be allowed to apply to become grammar schools and select according to ability and aptitude. Selection ages will be flexible and determined by the school in consultation with the local authority.’

Its leader has spoken of ‘a grammar school in every town’ and media commentators have begun to suggest that the Tories will lose votes to UKIP on this issue.

Labour’s previous shadow education minister, Stephen Twigg, opposed admissions code reforms that made it easier for existing grammar schools to expand.

But the present incumbent has said very little on the subject.

A newspaper interview in January 2014 hints at a reforming policy:

‘Labour would not shut surviving grammar schools but Mr Hunt said their social mix should be questioned.

“If they are simply about merit why do we see the kind of demographics and class make-up within them?”’

But it seems that this has dropped off Labour’s agenda now that the Coalition has adopted it.

I could find no formal commitment from Labour to address the issue in government, even though that might provide some sort of palliative for those within the party who oppose selection in all its forms and have suggested that funding should be withdrawn from selective academies.

So the overall picture suggests that Labour and the Lib Dems are deliberately distancing themselves from any active policy on selection, presumably regarding it as a poisoned chalice. The Tories are conspicuously riven on the issue, while UKIP has stolen a march by occupying the ground which the Tory right would like to occupy.

As the Election approaches, the Conservatives face four broad choices. They can:

  • Endorse the status quo under the Coalition, making any change of policy conditional on the outcome of a future leadership contest.
  • Advocate more between-school selection. This might or might not stop short of permitting new selective 11-18 secondary schools. Any such policy needs to be distinct from UKIP’s.
  • Advocate more within-school selection, as preferred by Cameron. This might adopt any position between encouragement and compulsion.
  • Develop a more comprehensive support strategy for high attaining learners from disadvantaged backgrounds. This might include any or all of the above, but should also consider support targeted directly at disadvantaged students.

These options are discussed in the final part of the post.

The next section provides an assessment of the current state of selective school engagement with disadvantaged learners, as a precursor to describing how the reform programme is shaping up.

.

How well do grammar schools serve disadvantaged students?

.

The Grammar School Stock and the Size of the Selective Pupil Population

Government statistics show that, as of January 2014, there are 163 selective state-funded secondary schools in England.

This is one less than previously, following the merger of Chatham House Grammar School for Boys and Clarendon House Grammar School. These two Kent schools formed the Chatham and Clarendon Grammar School with effect from 1 September 2013.

At January 2014:

  • 135 of these 163 schools (83%) are academy converters, leaving just 28 in local authority control. Twenty of the schools (12%) have a religious character.
  • Some 5.1% of pupils in state-funded secondary schools attend selective schools. (The percentage has fluctuated between 4% and 5% over the last 20 years.) The percentage of learners under 16 attending selective schools is lower: between 2007 and 2011 it was 3.9% to 4.0%.
  • There are 162,630 pupils of all ages attending state-funded selective secondary schools, of whom 135,365 (83.2%) attend academies and 27,265 (16.8%) attend LA maintained schools. This represents an increase of 1,000 compared with 2013. The annual intake is around 22,000.

The distribution of selective schools between regions and local authority areas is shown in Table 1 below.

The percentage of selective school pupils by region varies from 12.0% in the South East to zero in the North East, a grammar-free zone. The percentage of pupils attending selective schools by local authority area (counting only those with at least one selective school) varies from 45.1% in Trafford to 2.1% in Devon.

Some of the percentages at the upper end of this range seem to have increased significantly since May 2011, although the two sets of figures may not be exactly comparable.

For example, the proportion of Trafford pupils attending selective schools has increased by almost five percentage points (from 40.2% in 2011). In Torbay there has been an increase of over four percentage points (34.8% compared with 30.5%) and in Kent an increase of almost four percentage points (33.3% compared with 29.6%).

.

Table 1: The distribution of selective schools by region and local authority area and the percentage of pupils within each authority attending them (January 2014)

Region / Local authority   Schools   Pupils   Percentage of all pupils
North East 0 0 0
North West 19 20,240 4.9
Cumbria 1 833 2.8
Lancashire 4 4,424 6.6
Liverpool 1 988 3.3
Trafford 7 7,450 45.1
Wirral 6 6,547 30.5
Yorkshire and Humberside 6 6,055 1.9
Calderdale 2 2,217 14.2
Kirklees 1 1,383 5.5
North Yorkshire 3 2,454 6.5
East Midlands 15 12,700 4.5
Lincolnshire 15 12,699 26.9
West Midlands 19 15,865 4.5
Birmingham 8 7,350 10.4
Stoke-on-Trent 1 1,078 8.7
Telford and Wrekin 2 1,283 11.7
Walsall 2 1,423 7.0
Warwickshire 5 3,980 12.0
Wolverhampton 1 753 5.0
East of England 8 7,715 2.1
Essex 4 3,398 4.0
Southend-on-Sea 4 4,319 32.8
London 19 20,770 4.4
Barnet 3 2,643 11.6
Bexley 4 5,466 26.6
Bromley 2 1,997 9.0
Enfield 1 1,378 6.1
Kingston upon Thames 2 2,021 20.5
Redbridge 2 1,822 7.9
Sutton 5 5,445 30.7
South East 57 59,910 12.0
Buckinghamshire 13 15,288 42.2
Kent 32 33,059 33.3
Medway 6 6,031 32.2
Reading 2 1,632 24.1
Slough 4 3,899 37.4
South West 20 19,370 6.2
Bournemouth 2 2,245 23.3
Devon 1 822 2.1
Gloucestershire 7 6,196 16.2
Plymouth 3 2,780 16.3
Poole 2 2,442 26.8
Torbay 3 2,976 34.8
Wiltshire 2 1,928 6.6
TOTAL 163 162,630 5.1

.

Some authorities are deemed wholly selective but different definitions have been adopted.

One PQ reply suggests that 10 of the 36 local authority areas – Bexley, Buckinghamshire, Kent, Lincolnshire, Medway, Slough, Southend, Sutton, Torbay and Trafford – are deemed wholly selective because they feature in the Education (Grammar School Ballots) Regulations 1998.

Another authoritative source – the House of Commons Library – omits Bexley, Lincolnshire and Sutton from this list, presumably because they also contain comprehensive schools.

Of course many learners who attend grammar schools live in local authority areas other than those in which their schools are located. Many travel significant distances to attend.

A PQ reply from March 2012 states that some 76.6% of all those attending grammar schools live in the same local authority as their school, while 23.2% live outside. (The remainder are ‘unknowns’.)

These figures mask substantial variation between authorities. A recent study for the Sutton Trust, ‘Entry into Grammar Schools in England’ (Cribb et al, 2013), provides equivalent figures for each local authority from 2009-10 to 2011-12.

The percentage of within authority admissions reaches 38.5% in Trafford and 36% in Buckinghamshire but, at the other extreme, it can be as low as 1.7% in Devon and 2.2% in Cumbria.

The percentage of admissions from outside the authority can be as much as 75% (Reading) and 68% (Kingston) or, alternatively, as low as 4.5% in Gloucestershire and 6.8% in Kent.

.

Recent Trends in the Size and Distribution of the Disadvantaged Grammar School Pupil Population

Although this section of the post is intended to describe the ‘present state’, I wanted to illustrate how that compares with the relatively recent past.

I attached to my 2011 post a table showing how the proportion of FSM students attending grammar schools had changed annually since 1995. This is reproduced below, updated to reflect more recent data where it is available.

A health warning is attached since the figures were derived from several different PQ replies and I cannot be sure that the assumptions underpinning each were identical. Where there are known methodological differences I have described these in the footnotes.

.

Table 2: Annual percentage FSM in all grammar schools and gap between that and percentage FSM in all secondary schools, 1995-2014

Year   % FSM in grammar schools   % FSM in all schools   Percentage point gap
1995 3.9 18.0 14.1
1996 3.8 18.3 14.5
1997 3.7 18.2 14.5
1998 3.4 17.5 14.1
1999 3.1 16.9 13.8
2000 2.8 16.5 13.7
2001 2.4 15.8 13.4
2002 2.2 14.9 12.7
2003 2.1 14.5 12.4
2004 2.2 14.3 12.1
2005 2.1 14.0 11.9
2006 2.2 14.6 12.4
2007 2.0 13.1 11.1
2008 1.9 12.8 10.9
2009 2.0 13.4 11.4
2010 15.4
2011 2.4 14.6 12.2
2012 14.8
2013 15.1
2014 14.6

(1) Prior to 2003 includes dually registered pupils and excludes boarding pupils; from 2003 onwards includes dually registered and boarding pupils.

(2) Before 2002 numbers of pupils eligible for free school meals were collected at school level. From 2002 onwards numbers have been derived from pupil level returns.

(3) 2008 and 2009 figures for all schools exclude academies

.

Between 1996 and 2005 the FSM rate in all schools fell annually, dropping by 4.3 percentage points over that period. The FSM rate in grammar schools also fell, by 1.7 percentage points. The percentage point gap between all schools and selective schools fell by 2.6 percentage points.
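
For readers who want to verify that arithmetic, here is a minimal sketch in Python using the 1996 and 2005 rows of Table 2 as published (the variable names are mine):

    # Table 2 values for 1996 and 2005 (percentages)
    fsm_all_1996, fsm_all_2005 = 18.3, 14.0   # all secondary schools
    fsm_gs_1996, fsm_gs_2005 = 3.8, 2.1       # grammar schools

    fall_all = fsm_all_1996 - fsm_all_2005                                    # fall in the all-schools rate
    fall_gs = fsm_gs_1996 - fsm_gs_2005                                       # fall in the grammar school rate
    gap_change = (fsm_all_1996 - fsm_gs_1996) - (fsm_all_2005 - fsm_gs_2005)  # narrowing of the gap

    print(round(fall_all, 1), round(fall_gs, 1), round(gap_change, 1))        # 4.3 1.7 2.6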

Both FSM rates reached their lowest point in 2008. At that point the FSM rate in grammar schools was half what it had been in 1996. Thereafter, the rate across all schools increased, but has been rather more volatile, with small swings in either direction.

One might expect the 2014 FSM rate across all grammar schools to be at or around its 2011 level of 2.4%.

A more recent PQ reply revealed the total number of pupil premium recipients attending selective schools over the last three financial years:

  • FY2011-12 – 3,013
  • FY2012-13 – 6,184 (on extension to ‘ever 6’)
  • FY2013-14 – 7,353

(Hansard 20 January 2014, Col. WA88)

This suggests a trend of increasing participation in the sector, though total numbers are still very low, averaging around 45 per school and slightly over six per year group.
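
The per-school averages can be reproduced in a couple of lines. This is a rough sketch only: I have assumed 163 schools and seven year groups per 11-18 grammar school.

    recipients = 7353    # pupil premium recipients in selective schools, FY2013-14
    schools = 163        # state-funded selective schools, January 2014
    year_groups = 7      # years 7 to 13 in a typical 11-18 grammar school

    per_school = recipients / schools
    print(round(per_school, 1), round(per_school / year_groups, 1))  # 45.1 6.4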

.

Comparison with FSM rates in selective authorities

In 2012, a table deposited in the Commons Library (Dep 2012-0432) in response to a PQ provided the January 2011 FSM rates for selective schools and all state-funded secondary schools in each authority containing selective schools.

In this case, the FSM rates provided relate only to pupils aged 15 or under. The comparable national average rates are 2.7% for selective schools and 15.9% for all state-funded schools.

  • Selective school FSM rates per authority vary between 6.0% in Birmingham and 0.6% in Wiltshire.
  • Other authorities with particularly low FSM rates include Bromley (0.7%), Reading (0.8%) and Essex (0.9%).
  • Authorities with relatively high FSM rates include Wirral (5.2%), Walsall (4.9%) and Redbridge (4.8%).
  • The authorities with the biggest gaps between FSM rates for selective schools and all schools are Birmingham, at 28.0 percentage points, Liverpool, at 23.8 percentage points, Enfield at 21.8 percentage points and Wolverhampton, at 21.7 percentage points.
  • Conversely, Buckinghamshire has a gap of only 4.7 percentage points, since its FSM rate for all state-funded secondary schools is only 6.0%.
  • Buckinghamshire’s overall FSM rate is more than four times the rate in its grammar schools, while in Birmingham the overall rate is almost six times the grammar school rate. On this measure, the disparity is greatest in metropolitan boroughs with significant areas of disadvantage.

.

Proportion of disadvantaged learners in each selective school

I attached to my 2011 post a table setting out the FSM rates (all pupils, regardless of age) for each selective school in January 2009.

This updated version sets out the January 2013 FSM and disadvantaged (ie ‘ever 6 FSM’) rates by school, drawn from the latest School Performance Tables. (Click on the screenshot below to download the Excel file.)

.

[Screenshot: link to the Excel file of FSM and ‘ever 6’ rates for each selective school]

.

Key points include:

  • The size of grammar schools varies considerably, with numbers on roll (NOR) ranging from 437 (Newport Girls’) to 1518 (Townley Girls’). The average NOR is slightly below 1000.
  • 24 of the 163 schools (14.7%) have suppressed FSM percentages. Since the lowest published percentage is 1.1%, the impact of suppression is that all schools at or below 1.0% are affected. Since no school returns 0, we must assume that all contain a handful of FSM learners. It is notable that six of these schools are in Buckinghamshire, three in Gloucestershire and three in Essex. Both Bromley grammar schools also fall into this category.
  • 67 selective schools (41.1%) have FSM rates of 2% or lower. The average FSM rate across all 163 selective schools is 3.25%.
  • The highest recorded FSM rates are at Handsworth Grammar School (14.4%), King Edward VI Aston School (12.9%) and Stretford Grammar School (12%). These three are significant outliers – the next highest rate is 7.8%.
  • As one would expect, there is a strong correlation between FSM rates and ‘ever 6’ rates. Most of the schools with the lowest ‘ever 6’ rates are those with SUPP FSM rates. Of the 26 schools returning ‘ever 6’ rates of 3.0% or lower, all but 7 fall into this category.
  • The lowest ‘ever 6’ rate is the 0.6% returned by Sir William Borlase’s Grammar School in Buckinghamshire. On this evidence it is probably the most socio-economically selective grammar school in the country. Five of the ten schools with the lowest ‘ever 6’ rates are located in Buckinghamshire.
  • A few schools have FSM and ‘ever 6’ rates that do not correlate strongly. The most pronounced is Ribston Hall in Gloucestershire, which is SUPP for FSM yet has an ‘ever 6’ rate of 5.5%, not far short of the grammar school average of some 6.6%. Clitheroe Royal Grammar School is another outlier, returning an ‘ever 6’ rate of 4.8%.
  • The highest ‘ever 6’ rates are in Handsworth Grammar School (27.2%), Stretford Grammar School (24.3%) and King Edward VI Aston School (20.3%). These are the only three above 20%.
  • In London there is a fairly broad range of socio-economic selectivity, from St Olave’s and St Saviour’s (Bromley) – which records an ‘ever 6’ rate of 2.5% – to Woodford County High School, Redbridge, where the ‘ever 6’ rate is 11%. As noted above, the FSM rates at the two Bromley schools are SUPP. The London school with the highest FSM rate is again Woodford County High, at 5%.

Another source throws further light on the schools with the lowest FSM rates. In October 2013, a PQ reply provided a table of the 50 state secondary schools in England with the lowest entitlement to FSM, alongside a second table of the 50 schools with the highest entitlement.

These are again January 2013 figures but on this occasion the rates are for pupils aged 15 or under and the only figures suppressed (denoted by ‘x’) are where no more than two pupils are FSM.

Sir William Borlase’s tops the list, being the only school in the country with a nil return (so the one or two FSM pupils who attend must be aged over 15 and may have been admitted directly to the sixth form).

The remainder of the ‘top ten’ includes eight selective schools and one comprehensive (Old Swinford Hospital School in Dudley). The eight grammar schools are:

  • Cranbrook, Kent – x
  • Adams’, Telford and Wrekin – x
  • St Olave’s and St Saviour’s, Bromley – 0.5%
  • Dr Challoner’s High, Buckinghamshire – 0.5%
  • Dr Challoner’s Grammar, Buckinghamshire – 0.6%
  • Aylesbury Grammar, Buckinghamshire – 0.6%
  • Newstead Wood, Bromley – 0.6%
  • Pate’s, Gloucestershire – 0.6%

Comparing the data in my tables for 2009 and 2013 also throws up some interesting facts:

  • Some schools have increased significantly in size – Burnham Grammar School (Buckinghamshire), Sir Thomas Rich’s (Gloucestershire), Highworth Grammar School for Girls (Kent), Simon Langton Grammar School for Boys (Kent), Kesteven and Grantham Girls’ School (Lincolnshire), Carre’s Grammar School (Lincolnshire) and St Joseph’s College (Stoke) have all increased their NORs by 100 or more.
  • However, some other schools have shrunk significantly, notably The Skegness Grammar School in Lincolnshire (down 129), The Boston Grammar School in Lincolnshire (down 110), Fort Pitt Grammar School in Medway (down 132) and Slough Grammar School (down 175).
  • While recognising that the figures may not be fully comparable, there have also been some significant changes in the proportions of FSM pupils on roll. Significant increases are evident at King Edward VI Aston (up 5.9 percentage points), Fort Pitt (up 5.1 percentage points) and Handsworth Grammar (up 4.7 percentage points).
  • The only equally pronounced mover in the opposite direction is St Anselm’s College on The Wirral, where the FSM rate has more than halved, falling by 5.2 percentage points, from 9.8% to 4.6%.

Additional statistics were peppered throughout David Laws’ June 2014 speech.

He refers to a paper by DfE analysts which unfortunately has not been published:

  • In 2013, 21 grammar schools had fewer than 1% of pupils eligible for FSM. Ninety-eight had fewer than 3% eligible and 161 had fewer than 10% eligible. This compares to a national average of 16.3% across England. (The basis for these figures is not supplied but they more or less agree with those above.)
  • In Buckinghamshire in 2011, 14% of the year 7 cohort were eligible for the pupil premium, but only 4% of the cohort in Buckinghamshire grammar schools were eligible. In Lincolnshire the comparable percentages were 21% and 7% respectively.

.

Selectivity

Most commentary tends to regard the cadre of selective schools as very similar in character, leaving aside any religious affiliation and the fact that many are single sex establishments.

Although the fact is rarely discussed, some grammar schools are significantly more selective than others.

The 2013 Secondary Performance Tables show that only 10 grammar schools can claim that 100% of the cohort comprises high attainers. (These are defined on the basis of performance in statutory end of KS2 tests, in which they must record an average point score (APS) of 30 or more across English, maths and science.)

At several schools – Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent) – the proportion of high attainers stands at 70% or below.

Many comprehensive schools comfortably exceed this, hence – when it comes to KS2 attainment – some comprehensives are more selective than some grammar schools.

Key variables determining a grammar school’s selectivity will include:

  • The overall number of pupils in the area served by the school and/or the maximum geographical distance that pupils may travel to it.
  • The number of pupils who take the entrance tests, including the proportion of pupils attending independent schools competing for admission.
  • The number of competing selective schools and high-performing comprehensive schools, plus the proportion of learners who remain in or are ‘siphoned off’ into the independent sector.
  • The number of places available at the school and the pass mark in the entrance tests.

I have been unable to locate any meaningful measure of the relative selectivity of grammar schools, yet this is bound to impact on the admission of disadvantaged learners.

An index of selectivity would improve efforts to compare more fairly the outcomes achieved by different grammar schools, including their records on access for disadvantaged learners.
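
No such index exists, but a very crude illustration of what one might look like could be built from data that the Performance Tables and admissions authorities already hold. The sketch below is purely hypothetical – the inputs and the weighting are my own assumptions, not anything proposed by the DfE, Ofsted or the schools:

    def selectivity_index(high_attainer_share, applicants_per_place):
        """Crude illustrative index: the share of the intake who are KS2 high
        attainers, scaled by the level of competition for places."""
        return high_attainer_share * applicants_per_place

    # Hypothetical schools, not real figures
    print(round(selectivity_index(1.00, 4.0), 2))   # all high attainers, four applicants per place -> 4.0
    print(round(selectivity_index(0.70, 1.5), 2))   # 70% high attainers, little competition -> 1.05

Publishing even something this simple alongside each school’s FSM and ‘ever 6’ rates would make like-for-like comparison considerably easier.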

.

Prior attainment data

In his June 2014 speech, Laws acknowledges that:

  • ‘A key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit’.
  • However, in wholly selective areas fewer than 50% of FSM learners achieving Level 5 enter selective schools compared with two-thirds of non-FSM pupils:

‘We calculated it would require a shift of just 200 level 5 FSM pupils to go into grammar schools in wholly selective areas to remove this particular bias’.

Alternative versions of this statement appear elsewhere, as we shall see below.

Using data from 2009/10 and 2011/12, the Sutton Trust study by Cribb et al explored whether advantaged and disadvantaged pupils with KS2 level 5 in both English and maths were equally likely to attend grammar schools.

They found that those not eligible for FSM are still more likely to attend. This applies regardless of whether the grammar school is located in a selective local authority, although the percentages and the gaps vary considerably.

  • In selective authorities, some 66% of these high attaining non-FSM pupils went on to grammar schools compared with under 40% of FSM pupils, giving a gap of over 26 percentage points. (Note that the percentage for FSM is ten percentage points lower than the one quoted by Laws. I can find no reason for this disparity, unless the percentage has changed dramatically since 2012.)
  • In isolated grammar schools outside London the gap is much smaller, at roughly 11 percentage points (18% non-FSM against 7% FSM).
  • In London there is a similar 12 percentage point gap (15% non-FSM versus 3% FSM).

 

[Chart from Cribb et al: percentage of high-attaining FSM and non-FSM pupils entering grammar schools, by type of area]

A similar pattern is detected on the basis of KS2 maths test fine points scores:

‘Two points are evident. First, for any given level of maths attainment, pupils who are eligible for FSM have a noticeably lower probability of attending a grammar school. Indeed, a non-FSM student with an average maths score has the same probability of entering a grammar school as an FSM pupil with a score 0.7 standard deviations above average. Second, the gap in probability of attendance between FSM and non-FSM pupils actually widens substantially: non-FSM pupils with test scores one standard deviation above average have a 55% likelihood of attending a grammar school in selective local authorities, whereas similar pupils who are eligible for FSM have only a 30% chance of attending a grammar school. This is suggestive that bright pupils from deprived families are not attending grammar schools as much as their attainment would suggest they might.’

This rather calls into question Laws’ initial statement that level 5 performance among FSM pupils is ‘a key barrier’ to admission.

The study also confirms that pupils attending primary schools with relatively high levels of deprivation are much less likely to progress to grammar schools.

On the other hand, some 13% of pupils nationally transfer into selective schools from non-state schools and schools outside England. The researchers are unable to distinguish clearly those from abroad and those from the independent sector, but note that they are typically wealthier than state school transfers.

This masks significant variation between local authority areas.

Almost 34% of such pupils transfer into grammar schools in Essex, as do 24% in Bromley, 23% in Wiltshire and 22% in Bournemouth and Southend. At the other extreme, only 6% are incomers in Kirklees.

.

Headteacher perceptions

The Sutton Trust released a parallel research report from NATCEN reporting the outcomes of interviews with a small sample of three primary school and eight grammar school headteachers.

The researchers found that:

  • Rightly or wrongly, many heads felt disadvantaged learners had relatively lower educational aspirations.
  • Disadvantaged parents were sometimes perceived to know less about grammar schools and place less value on the benefits they might confer.
  • Heads felt disadvantaged parents ‘often associated grammar schools with tradition, middle class values and elitism’. Parents felt their children ‘might struggle interacting with children from more affluent backgrounds’.
  • Grammar school heads highlighted the role of primary schools but ‘this was difficult when primary schools disagreed with assessment based entry processes and selective education in general’.
  • Heads felt grammar schools should provide more outreach and demonstrate their openness to everyone. It was suggested that, as grammar schools increasingly take in pupils from further away and/or from independent schools, this might further distance schools from their local communities.
  • It was widely acknowledged that learners from more advantaged backgrounds were coached to pass the entrance exams. Some grammar heads regarded tutoring as ‘good examination preparation’; others recognised it as a barrier for disadvantaged learners.
  • Although there are financial barriers to accessing grammar schools, including the cost of uniforms and school trips, grammar school heads claimed to deploy a variety of support strategies.

Overall

The preceding analysis is complex and difficult to synthesise into a few key messages, but here is my best effort.

The national figures show that, taken as a whole, the 163 grammar schools contain extremely low proportions of FSM-eligible and ‘ever 6’ learners.

National FSM rates across all grammar schools have fallen significantly over the past 20 years and, although the FSM gap between selective schools and all schools has narrowed a little, it is still very pronounced.

There is certainly a strong case for concerted action to reduce significantly the size of this gap and to strive towards parity.

The disparity is no doubt partly attributable to lower rates of high attainment at KS2 amongst disadvantaged learners, but high attaining disadvantaged learners are themselves significantly under-represented. This is particularly true of wholly selective authorities but also applies nationally.

Although the sample is small, the evidence suggests that grammar school and primary head teachers share the perception that disadvantaged learners are further disadvantaged by the selective admissions process.

However, the cadre of grammar schools is a very broad church. The schools are very different and operate in markedly different contexts. Some are super-selective while others are less selective than some comprehensive schools.

A handful have relatively high levels of FSM and ‘ever-6’ admissions but a significant minority have almost negligible numbers of disadvantaged learners. Although contextual factors influence FSM and ‘ever 6’ rates significantly, there are still marked disparities which cannot be explained by such factors.

Each school faces a slightly different challenge.

Transparency and public understanding would be considerably improved by the publication of statistical information showing how grammar schools differ when assessed against a set of key indicators – and identifying clear improvement targets for each school. 

There seem to me to be strong grounds for incorporating schools’ performance against such targets into Ofsted’s inspection regime.

.

Progress Towards Reform

.

The Sutton Trust Research

Although the Grammar School Heads’ Association (GSHA) argues that it has pursued reform internally for some years, a much wider-ranging initiative has developed over the last twelve months, kicked off by the publication of a tranche of research by the Sutton Trust in November 2013.

This included the two publications cited above, by Cribb et al and NATCEN, plus a third piece by Jesson.

There was also an overarching summary report ‘Poor Grammar: Entry into Grammar Schools for disadvantaged pupils in England’.

This made six recommendations which, taken together, cover the full spectrum of action required to strengthen the schools’ capacity to admit more disadvantaged learners:

  • Review selection tests to ensure they are not a barrier to the admission of learners from disadvantaged backgrounds. The text remarks that:

‘Some grammar schools and local authorities are already trying to develop tests which are regularly changed, less susceptible to coaching, intelligence-based and not culturally biased.’

  • Reduce the advantage obtained by those who can pay for private tuition by making available a minimum of ten hours of test preparation to all applicants on a free or subsidised basis.
  • Improve grammar school outreach support, targeting learners from low and middle income backgrounds. This should include: assurances on access to transport and support with other costs; active encouragement for suitable Pupil Premium recipients to apply; using the media to dispel notions that grammar schools are exclusive and elitist; and deploying existing disadvantaged students as ambassadors.
  • Use the flexibility within the Admissions Code (at this point available only to academies) to prioritise the admission of high achieving students who are entitled to the pupil premium. There is also a suggestion that schools might:

‘…consider giving preference to students from low or middle income households who reach a minimum threshold in the admission test’.

though it is not clear how this would comply with the Code.

  • Develop primary-grammar school partnerships to provide transition support for disadvantaged students, enabling primary schools to provide stronger encouragement for applications and reassure parents.
  • Develop partnerships with non-selective secondary schools:

‘…to ensure that high achieving students from low and middle income backgrounds have access to good local teachers in their areas.’

The Sutton Trust also made its own commitment to:

‘…look at ways that we can support innovation in improved testing, test preparation, outreach, admissions and collaboration.

We will also commission independent analysis of the impact of any such programmes to create an evidence base to enhance fair access to grammar schools.’

.

Reaction

Immediate reaction was predictably polarised. The GSHA was unhappy with the presentation of the report.

Its November 2013 Newsletter grumbles:

‘It is the way in which the research is presented by the Sutton Trust rather than any of research findings that give rise to concerns. Through a process of statistical machination the press release chose to lead on the claim that 6% of prep school pupils provide four times more grammar school pupils than the 16% of FSM eligible children. Inevitably, this led to headlines that the independent sector dominates admissions. The reality, of course is that 88% of all grammar school students come from state primary schools….

….Grammars select on ability and only 10% of FSM children reach level 5 at KS2 compared with a national average of 25%. The report, quite reasonably, uses level 5 as the indicator of grammar school potential. On the basis of this data the proportions of eligible FSM children in grammar schools is significantly greater than the overall FSM proportion in the top 500 comprehensives….

In 2012 just over 500 FSM children entered grammar schools. For the success rate of L5 FSM to match that of other L5 would require 200 more FSM children a year to enter grammar schools. Just one more in each school would virtually close the gap….

….The recommendations of the report are not, as claimed, either new or radical. All are areas that had already been identified by GSHA as options to aid access and represent practices that are already adopted by schools. This work, however, is usually carefully presented to avoid promotion of a coaching culture.

It is unfortunate that the press briefing both contributed to reinforcing the false stereotyping of grammar schools and failed to signal initiatives taken by grammar schools.’

There is evidence here of retaliatory ‘statistical machination’, together with a rather defensive attitude that may not bode well for the future.

On the other hand HMCI Wilshaw was characteristically forthright in the expression of an almost diametrically opposite opinion.

In December 2013 he is reported to have said:

‘Grammar schools are stuffed full of middle-class kids. A tiny percentage are on free school meals: 3%. That is a nonsense.

Anyone who thinks grammar schools are going to increase social mobility needs to look at those figures. I don’t think they work. The fact of the matter is that there will be calls for a return to the grammar school system. Well, look what is happening at the moment. Northern Ireland has a selective system and they did worse than us in the [international comparison] table. The grammar schools might do well with 10% of the school population, but everyone else does really badly. What we have to do is make sure all schools do well in the areas in which they are located.’

 .

The Laws Speech

Liberal Democrat Education Minister David Laws made clear the Government’s interest in reform with his June 2014 speech, already referenced above.

Early on in the speech he remarks that:

‘The debate about grammar schools seems to have been put in the political deep freeze – with no plans either to increase or reduce the number of what are extremely popular schools in their localities.’

With the benefit of hindsight, this seems rather ignorant of (or else disrespectful to) UKIP, which had nailed its colours to the mast just three weeks previously.

Laws acknowledges the challenge thrown down by Wilshaw, though without attribution:

‘Are you, as some would have it, “stuffed full of middle-class kids”?

Or are you opening up opportunities to all bright children regardless of their background, or can you do more?

Why is entry to grammar schools so often maligned?’

He says he wants to work with them ‘openly and constructively on social mobility’, to ‘consider what greater role they can play in breaking the cycles of disadvantage and closing the opportunity gap’, while accepting that the Government and the primary sector must also play their parts.

He suggests that the Government will do more to increase the supply of high attaining disadvantaged learners:

‘…a key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit. So this is not just a challenge for grammar schools, but for the whole education system…

….My promise to you, alongside my challenge to you, is that this government will do everything in its power to make sure that more children from poorer backgrounds achieve their full potential.’

He lists the policies that:

‘Taken together, and over time…will start to shift the dial for poorer children – so that more and more reach level 5’

leading of course with the pupil premium.

He also proposes aspirational targets, though without any timescale attached:

‘My ambition is that all selective schools should aim for the same proportion of children on free school meals in their schools as in their local area.

This would mean an additional 3,500 free school meal pupils in selective schools every year, or an additional 35,000 pupils over 10 years.’

In relation to the flexibilities in the Admissions Code he adds:

‘I am pleased to be able to say that 32 grammar schools have implemented an admissions priority for pupils eligible for free school meals this year….

We in the Department for Education will fully support any school that chooses to change its admissions criteria in this way – in fact, I want to see all grammar schools give preference to pupil premium pupils over the next few years.’

Similarly, on coaching and testing:

‘…I really welcome the association’s work to encourage a move to entry tests that are less susceptible to coaching, and I am heartened to hear that at least 40% of grammar schools are now moving to the introduction of coaching resistant tests.

Again, I hope that all grammar schools will soon do so, and it will be interesting to see the impact of this.’

And he adds:

‘I want all schools to build on the progress that is being made and seek to close the gap by increasing parental engagement, and stronger working with local primaries – with a focus on identifying potential.’

So he overtly endorses several of the recommendations proposed by the Sutton Trust seven months earlier.

A Sutton Trust press release:

‘…welcomed the commitment by Schools Minister David Laws, to widening access to grammar schools and making the issue a priority in government’.

This may be a little over-optimistic.

A Collaborative Project Takes Shape

Laws also mentions in his speech that:

‘The GSHA will be working with us, the Sutton Trust and the University of Durham to explore ways in which access to grammar schools by highly able deprived children might be improved by looking more closely at the testing process and what may be limiting the engagement of pupils with it.’

The associated release from the Sutton Trust uses the present tense:

‘The Trust is currently working with the King Edward VI Foundation, which runs five grammar schools in Birmingham, Durham University, the Grammar School Heads Association and the Department for Education to target and evaluate the most effective strategies to broaden access to grammar schools.

A range of initiatives being run by the Foundation, including test familiarisation sessions at community locations, visits from primary schools and support for numeracy and literacy teaching for gifted and talented children at local primary schools, will be evaluated by Durham University to understand and compare their impact. The resulting analysis will provide a template for other grammar schools to work with.’

We know that Laws had been discussing these issues with the grammar schools for some time.

When he appeared before the Education Select Committee in February 2014 he said:

‘We are trying, for example, to talk to grammar schools about giving young people fairer access opportunities into those schools. We are trying to allow them to use the pupil premium as a factor in their admissions policy. We are trying to encourage them to ensure that testing is fairer to young people and is not just coachable.’

The repetition of ‘trying’ might suggest some reluctance on the part of grammar school representatives to engage on these issues.

Yet press coverage suggested the discussions were ongoing. In May the GSHA Newsletter states that it had first met Laws to discuss admissions some eighteen months previously, so perhaps as early as November 2012.

It adds:

‘We are currently working on a research project with the DfE and the Sutton Trust to try to find out what practices help to reduce barriers to access for those parents and students from deprived backgrounds.’

A parallel report in another paper comments:

‘The grammar school heads have also gone into partnership with the education charity the Sutton Trust to support more able children from middle and lower income backgrounds applying to selective schools.

Other ideas being considered include putting on test familiarisation sessions for disadvantaged children – something they have missed out on in the past.’

While an entry on CEM’s website says:

‘Access Grammar:

This project seeks to look at ways access to grammar schools for highly able children from non-privileged backgrounds can be improved. The project will identify potential target cohorts in the study areas for a range of outreach interventions and will look to evaluate these activities. For this project, the CEM Research and Evaluation team are working in collaboration with the Sutton Trust, Grammar School Heads Association, King Edwards Foundation and the Department for Education.

Start date: January 2014
End date: January 2017.’

So we know that there is a five-way partnership engaged on a three-year project. The various statements describing the project’s objectives are all slightly different, although there is a clear resemblance between them, the aims articulated by Laws and the recommendations set out by the Sutton Trust.

But I searched in vain for any more detailed specification, including key milestones, funding and intended outcomes. It is not clear whether the taxpayer is contributing through DfE funding, or whether the Sutton Trust and/or other partners are meeting the cost.

Given that we are almost a year into the programme, there is a strong case for this material to be made public.

.

Progress on Admissions Criteria

Of the issues mentioned in the Sutton Trust’s recommendations – tests and test preparation, admissions flexibility, outreach and partnership with primary and non-selective secondary schools – those at the front of the list have been most prominent (though there is also evidence that the King Edward’s Foundation is pursuing reform across a wider front).

The GSHA’s May 2014 newsletter is less grumpy than its predecessor, but still strikes a rather defensive note.

It uses a now familiar statistic, but in a slightly different fashion:

‘The actual number of students with Level 5s in their SATs who either choose not to apply to a grammar school or who apply but do not receive a place is reckoned by GSHA and the DfE to be two hundred students a year; not the very large number that the percentages originally suggested.’

This is the third time we have encountered this particular assertion, but each time it has been articulated differently. Which of the three statements is correct?

The GSHA is also keen to emphasise that progress is being made independently through its own good offices. On admissions reform, the article says:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

The GSHA was also quoted in the TES, to the effect that 30 grammar schools had already been given permission by DfE to change their admissions policies and would do so with effect from September 2015, while a further five or six had already introduced the reform.

A November 2014 PQ reply updates the figures above, saying that 32 grammar schools have already prioritised disadvantaged learners in their admissions arrangements and a further 65 ‘intend to consult on doing so’.

That leaves 66 (40%) which are not giving this active consideration.

The Chief Executive of the GSHA commented:

‘“You won’t notice a dramatic change in schools themselves because the numbers are quite small…This is reaching out at the margins in a way that won’t deprive other people of a place. The real need is to raise the standard among free school meals pupils at Key Stage 1 and Key Stage 2, that’s the key issue.

“What we are looking at in the meantime is what we can do to help these free school meals pupils who want to come to grammar school.”

Mr Sindall said that many of the country’s 164 grammar schools would not change their policies because competition for places was less fierce and it would be unnecessary. Many schools were also increasing outreach programmes and some were running eleven-plus familiarisation sessions to help prepare poorer children for the test, he added.’

There is evidence here of a desire to play down the impact of such changes, to suggest that the supply of disadvantaged high achievers is too small to do otherwise.

The data analysis above suggests that almost all selective schools need to address the issue.

Between them, the various press reports mention admissions changes at several schools, including Rugby High, South Wilts, ‘a series of Buckinghamshire grammars including Sir William Borlase’s, Dr Challoner’s  and Aylesbury Grammar’, as well as the King Edward’s Foundation Schools in Birmingham.

I checked how these changes have been embodied in some of these schools’ admissions policies.

The reports indicated that Rugby was:

‘…going even further by reserving a fixed number of places for FSM-eligible children, so potentially accepting pupils with lower entrance exam scores than other applicants.’

Rugby’s admissions arrangements for 2015 do indeed include as a second overall admissions priority, immediately following children in care:

‘Up to 10 places for children living within the priority circle for children in receipt of Free School Meals whose scores are between one and ten marks below the qualifying score for entry to the school.’

South Wilts included FSM as an oversubscription criterion in its 2014 admission arrangements, replacing it with pupil premium eligibility in 2015. However, in both cases it is placed third after children in care and those living in the school’s designated [catchment] area.

Sir William Borlase’s goes one better, in that its 2015 admissions policy places children eligible for free school meals immediately after ‘children in care’ and before ‘children living in the catchment area of the school’, though again only in the oversubscription criteria.

The King Edward’s Foundation is pursuing a similar route to Rugby’s. It announced its intention to reform admissions to its five Birmingham grammar schools in April 2014:

‘The Government wishes to improve the social mobility of children in the UK and has urged selective schools to consider how their admission policies could be changed to achieve this. The King Edward VI Grammar Schools have applied to the Department for Education which can allow them to give preference in their policies, to children who are on free school meals, or have been at any point in the last six years…

… In addition the grammar schools will be offering familiarisation sessions which will introduce children from less privileged backgrounds to the idea of attending a grammar school and will encourage them to take the 11+.

All of the Grammar Schools have set themselves a target of a 20% intake of children on free school meals (Aston has already achieved this and has a target of 25%). The expansion of the grammar schools which was announced earlier this year means that these additional children will simply fill the additional space.’

According to the 2013 Performance Tables, the FSM rates at each of these schools in January 2013 were:

  • Aston – 12.9%
  • Camp Hill Boys – 3.6%
  • Camp Hill Girls – 5.3%
  • Five Ways – 2.6%
  • Handsworth Girls – 6.3%

There must have been a major improvement at Aston for the September 2013 admissions round. As for the other four schools, they must increase their FSM admissions roughly three- to eight-fold to reach this target (the rough arithmetic is sketched below).
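
Here is the arithmetic behind that range, a minimal sketch using the January 2013 whole-school rates listed above. The Foundation’s targets presumably apply to year 7 intakes rather than whole schools, so treat the multiples as indicative only.

    # Multiple of the January 2013 whole-school FSM rate needed to reach a 20% intake
    rates = {"Camp Hill Boys": 3.6, "Camp Hill Girls": 5.3,
             "Five Ways": 2.6, "Handsworth Girls": 6.3}
    target = 20.0
    for school, rate in rates.items():
        print(school, round(target / rate, 1))
    # Camp Hill Boys 5.6, Camp Hill Girls 3.8, Five Ways 7.7, Handsworth Girls 3.2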

I wonder whether the targets are actually for ‘ever 6’ admissions?

In the event, the Foundation’s applications encountered some difficulties. In July the Admissions Adjudicator was obliged to reject them.

A parent had objected on the grounds that:

‘…it is necessary to request financial information from parents to achieve this priority which is contrary to paragraph 1.9(f) of the School Admissions Code.

… The objector further feels that it is unclear, unfair and unreasonable to use the pupil premium to differentiate between applications when the school is oversubscribed.’

The Adjudicator found in favour of the parent on the technical grounds that, although the schools had applied for variations of their funding agreements to permit this change, they had only done so retrospectively.

However, in each case:

‘The school is now entitled to give priority to girls eligible for the pupil premium as the funding agreement has been amended.’

By August the Foundation was able to state that the issue had been resolved:

‘Children applying for a place at any of the King Edward VI Grammar Schools must now achieve a minimum “qualifying score” in the test to be eligible for entry.

Any Looked After Child or previously Looked After Child (a child who is or has been in the care of the Local Authority) who achieves the “qualifying score” will be given priority for admission for up to 20% of the available places (25% at Aston).

Children eligible for Pupil Premium (those who have been registered for Free School meals at any point in the 6 years prior to the closing date for registration, 11 July 2014) who achieve the “qualifying score” will also be given priority for admission.

After this allocation, children not eligible for the Pupil Premium but who achieve the “qualifying score” will be admitted by rank order of scores until all places are filled.’

The Foundation has published an interesting FAQ on the new arrangements:

‘Q5. Will this mean that if you are poor you won’t have to score as high in the 11+ admission tests?
A. That is essentially correct – up to 20% of places (25% at Aston) are set aside for pupil premium children who achieve “a qualifying score”. This qualifying score will be set before the test in September after we have reviewed data in order to ensure that children who achieve the score can flourish in our schools.

Q6. Why don’t you want the cleverest children at your school anymore?
A. We want our schools to represent the City of Birmingham and the diverse backgrounds that our children might come from. We believe that there are clever children out there who just don’t have the same opportunity to succeed as those from more privileged backgrounds and we want to try to do something about that.’

It acknowledges the magnitude of the challenge ahead:

‘John Collins, Secretary to the Governors of the charity The Schools of King Edward VI in Birmingham said “This is a hugely challenging target which we do not expect to achieve in the first few years of the initiative, as currently there are relatively few free school meal pupils who apply to take the test. These low numbers are something we are trying to address with our “familiarisation” programme which seeks to encourage bright children from less privileged backgrounds to take the test.”’

Also in July the Government opened up the same possibility for grammar schools that are not academies by consulting on amendments to the Admissions Code to permit this.

In October this was confirmed in the Government’s response to the consultation which stressed it was being introduced as an option rather than a universal requirement.

.

Progress on 11+ Test Reform

The new-style 11-plus tests developed by CEM have not had a universally positive reception. Much of the attention has been focused on their adoption by Buckinghamshire grammar schools.

The GSHA’s May 2014 newsletter notes that ‘some schools in the Midlands’ have been using CEM tests for five years. From 2015, 40% of grammar schools will be using these tests, which are:

‘…designed to be immune to the influence of coaching’

adding:

‘The analysis of data from Buckinghamshire (a wholly selective area which has recently switched to the CEM Centre tests) will provide us in time with valuable hard data on the large scale impact of the change over time.’

Back in February 2014 an Observer article had already cited positive feedback from Buckinghamshire:

‘Last autumn, a handful of education authorities in England introduced an exam designed to test a wider range of abilities – ones that are already being taught in primary schools, rather than skills that can be mastered through home tutoring – to make the selection system fairer.

Provisional results indicate that a more diverse selection of pupils passed this test, and headteachers say they feel the change has made a difference.

Ros Rochefort, headteacher at Bledlow Ridge primary school in Buckinghamshire…said that this year, for the first time in her career, the test has delivered a fair result. “All the kids who got through were expected to pass and, as usual, there are a couple of appeals coming through. All our very able children were selected….

…. Philip Wayne, headteacher at Chesham grammar school and chairman of the Bucks Grammar School Heads Association, has welcomed the changes and says he is “very confident” that the new test will avoid the current situation, in which many pupils who won places at his school with the help of intensive tutoring struggle to keep up with lessons once they arrive.’

However, there were contemporary reports that the 2013 tests led to a 6% fall (110 fewer pupils) in the proportion of places awarded to children from in-county state primary schools, even though 300 more pupils applied.

In September this was further developed in a Guardian story:

‘According to the data, a child from a Buckinghamshire private school is now more than three and a half times more likely to pass the 11-plus than a child from one of its state primaries….

…FOI requests to the eight secondary schools in Wycombe, which includes some of the most deprived and diverse wards in the county, suggest that children on free school meals and of Pakistani heritage have been less successful this year. ‘

A local pressure group, Local Equal and Excellent, has been trying to gather and analyse the data from the initial rounds of testing in 2013 and 2014 (ie for admission in 2014 and 2015).

Their most recent analysis complains about refusals to publish the full test data and draws on the limited material that has been released.

In November 2014, the matter was discussed at Buckinghamshire’s Education, Skills and Children’s Services Select Committee.

The ‘results and analysis’ paper prepared by Buckinghamshire’s grammar school headteachers contains many words and far too few numbers.

The section on ‘Closing the gap’ says:

‘One local group has claimed that children from poorer backgrounds and BME have ‘done worse’ in the new Secondary Transfer Test. It is not specified what ‘worse’ means; however it is not reliable to make statements about trends and patterns for specific groups from a single year’s data and as stated above the data that has been used to make such claims is a small subset of the total and unrepresentative. To substantiate such claims a detailed analysis of additional information such as the current attainment of the children concerned would be needed. We are currently considering how a longitudinal study might be achieved.’

This is overly defensive and insufficiently transparent.

There is some disagreement about whether or not the new test is less amenable to coaching.

The ‘results and analysis’ paper says:

‘There is no such thing as a ‘tutor proof’ test. However, the new tests are less susceptible to the impact of specific test tutoring because they are aligned to the National Curriculum which all children study. Additionally, the questions in the new test are less predictable than in the previous test because they cover a wider range of topics and there is a broader range of question types – points acknowledged and welcomed by primary headteachers’.

Conversely, the pressure group says:

‘The new 11-plus, devised by the Centre for Evaluation and Monitoring (CEM) at Durham University, is supposed to rely less heavily on verbal reasoning and be more closely allied to the primary curriculum. Practice papers for the CEM test are supposed to be less readily available…

But… the fact that it is modelled on what can be taught in schools means the CEM test is more amenable to coaching… if children can’t be taught to get better in maths, why are we teaching it in schools? Practice will make anyone better and I see no sign that tuition has tailed off at all.’

Elsewhere there is evidence that 11+ testing is not immune to financial pressures. North Yorkshire is presently consulting on a plan to scale back its current arrangements: a familiarisation test followed by two sets of two full tests, with the best results taken forward.

Instead there would be a single set of tests taken by all candidates on the same day at a single venue, plus sample booklets in place of the familiarisation test. A system of reviews, enabling parents to provide supporting evidence to explain under-performance, would also be discontinued.

The reason is explicit:

‘The cost of administering an overly bureaucratic system of testing is no longer sustainable in the light of very significant cuts in public expenditure.’

Even though the draft impact assessment says that the Council will consider applications for support with transport from rural areas and for those with low incomes, there is some unacknowledged risk that the new arrangements will be detrimental to efforts to increase the proportion of disadvantaged learners admitted to these schools.

.

How Best to Close Excellence Gaps

.

What to do with the status quo

The next Government will inherit:

  • The Access Grammar reform project, outlined above, which is making some progress in the right direction, but needs closer scrutiny and probably more central direction. There is an obvious tension between Laws’ aspiration that all grammar schools should ‘give preference to pupil premium pupils over the next few years’ and the GSHA position, which is that many schools do not need to change their policies. It will be important that the changes to admissions arrangements for the 163 schools are catalogued and their impact on admissions monitored and made public, so that we can see at a glance which schools are leading the pack and which are laggards. A published progress report against the Sutton Trust’s six recommendations would help to establish future priorities. Greater transparency about the project itself is also highly desirable.
  • A small cadre of selective 16-19 free schools. It will need to articulate its position on academic selection at 16+ and might need to take action to ensure a level playing field with existing sixth form colleges. It might consider raising expectations of both new and existing institutions in respect of the admission of disadvantaged learners, so securing consistency between 11+ selection and 16+ selection.
  • Flexibility within the Admissions Code for all grammar schools – academies and LA-maintained alike – to prioritise the admission of disadvantaged learners. It may need to consider whether it should move further towards compulsion in respect of grammar schools, particularly if the GSHA maintains its position that many do not need to broaden their intake in this fashion.
  • Flexibility for all grammar schools to increase published admission numbers and, potentially, to submit proposals for the establishment of satellite institutions. The approval of such proposals rests with the local authority in the case of a maintained school but with the Secretary of State for Education in respect of academies. An incoming government may need to consider what limits and conditions should be imposed on such expansion, including requirements relating to the admission of disadvantaged learners.

It may be helpful to clarify the position on satellites. The Coalition Government has confirmed that they can be established:

‘It is possible for an existing maintained grammar school or academy with selective arrangements to expand the number of places they offer, including by extending on to another site…There are, however, limitations on that sort of expansion, meaning it could only be a continuation of the existing school. The school admissions code is written from a presumption that those schools with a split site are a single school’ (Hansard, 16 February 2012, Col. 184W).

In December 2013, a proposal to establish a grammar school annexe in Sevenoaks, Kent was rejected by the Secretary of State on the grounds that it would constitute a new school:

‘Mr Gove’s legal ruling hinged on the issue of a girls’ grammar school being the sponsor of a Sevenoaks annexe for both girls and boys. The planned entry of Sevenoaks boys to the annexe led Mr Gove to rule that the annexe’s proposed admissions policy was sufficiently different to the sponsor school’s girls-only admissions policy to constitute a wholly new grammar school.’

But a revised proposal for a girls-only annexe was submitted in November 2014. Moreover, the local authority has committed to exploring whether another satellite could be established in Maidenhead, acknowledging that this would require the co-operation of an existing grammar school.

The timing of the decision on the revised Sevenoaks proposal ensures that selection will remain a live issue as we approach the General Election.

Further options to promote between-school selection

There are several options for strengthening a pro-selection policy further that would not require removing the statutory constraints on opening new 11-18 grammar schools, or allowing existing schools to change their character and become selective.

For example:

  • Pursuing the Wilshavian notion of organising schools into geographical clusters, some with academic and others with vocational specialisms, and enabling learners to switch between them at 14+. In many areas these clusters will incorporate at least one grammar school; in others the ‘academic’ role would be undertaken by high-performing comprehensive schools with strong sixth forms. The practical difficulties associated with implementing this strategy ought not to be underplayed, however. For example, how much spare capacity would the system need to carry in order to respond to annual fluctuations in demand? How likely is it that students would wish to leave their grammar schools at 14 and what tests would incomers be expected to pass? Would the system also be able to accommodate those who still wished to change institution at age 16?
  • Vigorously expanding the cadre of post-16 selective free schools. There is presumably a largely unspent budget for up to twelve 16-19 maths free schools, though it will be vulnerable to cuts. It would be relatively straightforward to develop more, extending into other curricular specialisms and removing the obligatory university sponsorship requirement. Expansion could be focused on clones of the London Academy of Excellence and the Harris Westminster Sixth Form. But there should be standard minimum requirements for the admission of disadvantaged learners. A national network might be created which could help to drive improvements in neighbouring primary and secondary schools.
  • Permitting successful selective post-16 institutions to admit high-attaining disadvantaged students at age 14, to an academic pathway, as a parallel initiative to the one that enables successful colleges to take in 14-year-olds wishing to study vocational qualifications. It may be that the existing scheme already permits this, since the curriculum requirements do not seem to specify a vocational pathway.

UKIP’s policy, as presently articulated, is merely enabling: few existing schools are likely to want to change their character in this fashion.

One assumes that Tory advocates would be satisfied with legislation permitting the establishment of new free schools that select at age 11 or age 14. It seems unlikely that anyone will push for the nuclear option of ‘a grammar school in every town’… but Conservative Voice will imminently reveal their hand.

.

Further options to promote within-school selection

If the political preference is to pursue within-school provision as an alternative to between-school selection, there are also several possibilities, including:

  • Encouraging the development of more bilateral schools with parallel grammar and non-selective streams and/or fast-track grammar streams within standard comprehensive schools.
  • Requiring, incentivising or promoting more setting in secondary schools, potentially prioritising the core subjects.
  • Developing a wider understanding of more radical and innovative grouping practices, such as vertical and cluster grouping, and trialling the impact of these through the EEF.

It would of course be important to design such interventions to benefit all students, but especially disadvantaged high attainers.

The Government might achieve the necessary leverage through a ‘presumption’ built into Ofsted’s inspection guidance (schools are presumed to favour the specified approach unless they can demonstrate that an alternative leads consistently to higher pupil outcomes) or through a ‘flexible framework’ quality standard.

.

A national student support scheme

The most efficient method of supporting attainment and social mobility amongst disadvantaged high attainers is through a national scheme that helps them directly, rather than targeting the schools and colleges that they attend.

This need not be a structured national programme, centrally delivered by a single provider. It could operate within a framework that brings greater coherence to the existing market and actively promotes the introduction of new suppliers to fill gaps in coverage and/or compete on quality. A ‘managed market’ if you will.

The essential elements would include:

  • This supply-side framework, covering the full range of disadvantaged students’ learning and development needs, within which all suppliers – universities, third sector, commercial, schools-based – would position their services (or they would be excluded from the scheme).
  • A commitment on the part of all state-funded schools and colleges to implement the scheme with their disadvantaged high attainers (the qualifying criterion might be FSM or ‘ever 6’) – and to ensure continuity and progression when and if these students change institution, especially at 16+.
  • A coherent learning and development programme for each eligible student throughout Years 7-13. Provision in KS3 might be open access and light touch, designed principally to identify those willing and able to pursue the programme into KS4 and KS5. Provision in these latter stages would be tailored to individuals’ needs and continuation would be dependent on progress against challenging but realistic personal targets, including specified GCSE grades.
  • Schools and colleges would act as facilitators and guides, conducting periodic reviews of students’ needs; helping them to identify suitable services from the framework; ensuring that their overall learning programmes – the in-school/college provision together with the services secured from the framework – constitute a coherent learning experience; helping them to maintain learning profiles detailing their progress and achievement.
  • Each learner would have a personal budget to meet the costs of delivering their learning programme, especially costs attached to services provided through the framework. This would be paid through an endowment fund, refreshed by an annual £50m topslice from the pupil premium budget (analogous to that for literacy and numeracy catch-up) and a matching topslice from universities’ outreach budgets for fair access.
  • Universities would be strongly encouraged to make unconditional offers on the basis of high quality learning profiles, submitted by students as part of their admissions process.
  • There would be annual national targets for improving the GCSE and A level attainment of students participating in the scheme and for admission to – and graduation from – selective universities. This would include challenging but realistic targets for improving FSM admission to Oxbridge.

.

Conclusion

The current political debate is overly fixated on aspects of the wider problem, rather than considering the issue in the round.

I have set out above the far wider range of options that should be under consideration. These are not necessarily mutually exclusive.

If I were advising any political party inclined to take this seriously, I would recommend four essential components:

  • An enhanced strategy to ensure that all existing selective schools (including 16+ institutions) take in a larger proportion of high-attaining disadvantaged learners. Approval for expansion and any new schools would be conditional on meeting specified fair access targets.
  • Development of the cadre of 163 grammar schools into a national network, with direct responsibility for leading national efforts to increase the supply of high-attaining disadvantaged learners emerging from primary schools. Selective independent schools might also join the network, to fill gaps in the coverage and fulfil partnership expectations.
  • A policy to promote in all schools effective and innovative approaches to pupil grouping, enabling them to identify the circumstances in which different methods might work optimally and how best to implement those methods to achieve success. Schools would be encouraged to develop, trial and evaluate novel and hybrid approaches, so as to broaden the range of potential methods available.
  • A national support scheme for disadvantaged high attainers aged 11-19 meeting the broad specification set out above.

Regrettably, I fear that party political points-scoring will stand in the way of a rational solution.

Grammar schools have acquired a curious symbolic value, almost entirely independent of their true purpose and largely unaffected by the evidence base.

They are much like a flag of convenience that any politician anxious to show off his right-wing credentials can wave provocatively in the face of his opponents. There is an equivalent flag for abolitionists.  Anyone who proposes an alternative position is typically ignored.

.

GP

November 2014

Excellence Gaps Quality Standard: Version 1

 

This post is the first stage of a potential development project.

It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.

It aims to integrate two separate educational objectives:

  • Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
  • Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.

High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.

I have adopted new design parameters for this fresh venture into quality standards:

  • The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
  • The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe or they would otherwise be inadequate. I have erred on the side of minimalism for this first effort.
  • The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
  • The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.

The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.

There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)

The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.

The terminology needn’t be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. Primary schools, for example, are likely to find some of the language more appropriate to secondary settings.

It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.

It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.

Some of the terminology will be wanting; some important references will have been omitted while others will be over-egged. That is the nature of ‘aunt sallies’.

Feel free to propose amendments using the comments facility below.

The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.

 

 

Emerging (RI): The setting meets essential minimum criteria
Improving (G): In best fit terms the setting has progressed beyond entry level but is not yet exemplary
Exemplary (O): The setting is a model for others to follow

Performance

Emerging (RI): Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.

Improving (G): Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.

Exemplary (O): Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally. Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.

 

 

 

Emerging (RI): The setting meets essential minimum criteria
Exemplary (O): The setting is a model for others to follow
Policy/strategy

Emerging (RI): There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored.

Exemplary (O): There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike.

.

SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected.

.

The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.

Classroom T&L

Emerging (RI): Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.

Exemplary (O): The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice.

.

All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance.

.

All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.

Out of class learning

Emerging (RI): A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.

Exemplary (O): A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate.

.

All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled.

.

Staff ensure that: learners’ needs are regularly assessed; they access and complete opportunities that match their needs; participation and performance are monitored and compiled in a learning record.

Assessment/tracking

Emerging (RI): Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.

Exemplary (O): Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance.

.

Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals.

.

Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness.

.

All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work.

.

Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.

Curriculum/organisation

Emerging (RI): The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.

Exemplary (O): The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this.

.

Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs.

.

Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility.

.

Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.

Ethos/pastoral

Emerging (RI): The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.

Exemplary (O): The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally.

.

Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem;hothousing is shunned.

.

High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportional response to efforts to undermine this culture.

.

Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed.

.

The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.

Transition/progression

Emerging (RI): The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.

Exemplary (O): Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect.

.

Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings.

.

Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs.

.

Destinations data is collected, published and used to inform monitoring.

.

Leadership, staffing, CPD

Emerging (RI): A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard). Professional development needs associated with closing excellence gaps are identified and addressed.

Exemplary (O): The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps.

.

A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard).

.

Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor.

.

There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part.

.

The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.

Parents

Emerging (RI): Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.

Exemplary (O): Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary.

.

Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning.

.

This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.

Resources

Emerging (RI): Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard). Where available, Pupil Premium is used effectively to support disadvantaged high achievers.

Exemplary (O): Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard).

.

The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting.

.

The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs.

.

Settings should evidence their commitment to these principles in published material (especially information required to be published about the use of Pupil Premium).

Partnership/collaboration

Emerging (RI): The setting takes an active role in collaborative activity to close excellence gaps.

Exemplary (O): Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality.

.

The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally.

.

The setting uses collaboration strategically to build its own capacity and improve its expertise.

 


 

 

 

 

Those who are not familiar with the quality standards approach may wish to know more.

Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.

The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever-more elaborate performance tables and an inspection regime that depends on fallible human inspectors and documentation that regulates towards convergence when it should be enabling diversity, albeit within defined parameters.

I see more value in supporting institutions through best-fit guidance of this kind.

My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.

I have written about the application of quality standards to gifted education and their benefits on several occasions.

Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.

Rather they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.

Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.

 

GP

October 2014

Beware the ‘short head': PISA’s Resilient Students’ Measure

 

This post takes a closer look at the PISA concept of ‘resilient students’ – essentially a measure of disadvantaged high attainment amongst 15 year-olds – and how this varies from country to country.

The measure was addressed briefly in my recent review of the evidence base for excellence gaps in England but there was not space on that occasion to provide a thoroughgoing review.

The post is organised as follows:

  • A definition of the measure and explanation of how it has changed since the concept was first introduced.
  • A summary of key findings, including selected international comparisons, and of trends over recent PISA cycles.
  • A brief review of OECD and related research material about the characteristics of resilient learners.

I have not provided background about the nature of PISA assessments, but this can be found in previous posts about the mainstream PISA 2012 results and PISA 2012 Problem Solving.

 

Defining the resilient student

In 2011, the OECD published ‘Against the Odds: Disadvantaged students who succeed in school’, which introduced the notion of PISA as a study of resilience. It uses PISA 2006 data throughout and foregrounds science, as did the entire PISA 2006 cycle.

There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.

The international benchmark relates to the top third of PISA performers (ie above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.

Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.

This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.

The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.

The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).

‘Against the Odds’ describes it thus:

‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’

Further details are provided in the original PISA 2006 Report (p333).

Rather confusingly, the parameters of the international benchmark were subsequently changed.

PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes Volume II describes the new methodology in this fashion:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.
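
Read mechanically, the definition is a two-stage filter: disadvantage is judged within the country of assessment, while high performance is judged across all countries pooled. The sketch below is purely illustrative – it assumes a student-level table with hypothetical column names (country, escs, score_adjusted), treats the ‘after accounting for socio-economic background’ adjustment as already applied, and ignores the sampling weights and plausible values a real PISA analysis would require – but it may help to make the mechanics concrete.

```python
# Illustrative sketch only: not the OECD's own method or code.
# Column names and the adjusted-score input are assumptions.
import pandas as pd

def resilience_rate(df: pd.DataFrame, q: float = 0.25) -> pd.Series:
    """Share of disadvantaged students classed as 'resilient', by country.

    Expects one row per student with:
      country        -- jurisdiction of assessment
      escs           -- PISA index of economic, social and cultural status
      score_adjusted -- performance after accounting for socio-economic background
    """
    # Disadvantage is defined within each country of assessment...
    escs_cut = df.groupby("country")["escs"].transform(lambda s: s.quantile(q))
    disadvantaged = df["escs"] <= escs_cut

    # ...but high performance is defined across all countries pooled.
    perf_cut = df["score_adjusted"].quantile(1 - q)
    high_performing = df["score_adjusted"] >= perf_cut

    resilient = disadvantaged & high_performing
    # Denominator is the disadvantaged group, so no 'multiply by 4' step
    # is needed to compare with the PISA 2009 presentation.
    return resilient[disadvantaged].groupby(df["country"][disadvantaged]).mean()
```

Setting q to one third approximates the original ‘Against the Odds’ benchmark, while q of one quarter follows the convention adopted from PISA 2009 onwards; because the denominator here is the disadvantaged group itself, the result is already a ‘share of disadvantaged students’.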

The new methodology is seemingly retained in PISA 2012 Results: Excellence through Equity: Giving every student the chance to succeed – Volume II:

‘A student is class­ed as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’

However, multiplication by four is dispensed with.

This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable with some straightforward multiplication. However the 2006 results foreground science, while in 2009 the focus is reading – and shifts on to maths in 2012.

Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.

 

PISA 2006 results

The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.

 

[Chart: percentages of resilient students and disadvantaged low achievers in science, PISA 2006]

By contrast, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion (since the indicator is based on ‘top third attainment, bottom third disadvantage’).
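
To spell out the arithmetic behind that multiplier (the 13% used in the worked example below is an invented figure, purely for illustration):

```latex
% p = resilient students as a share of ALL students;
% the disadvantaged group is the bottom third of the ESCS distribution.
\[
\frac{\text{resilient students}}{\text{disadvantaged students}}
  \;=\; \frac{p\,N}{\tfrac{1}{3}N} \;=\; 3p
\]
% With the quarter-based definition used from PISA 2009 onwards the
% denominator is N/4, giving the 'multiply by 4' rule instead.
% Worked example: if p = 13% of all students are resilient, then
% 3 x 13% = 39% of disadvantaged students are resilient.
```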

I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.

I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.

Chart 1 shows these results.

 

Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)

 

All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.

Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.

The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.

Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.

The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.

The first chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.

In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.

 

‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.

 


Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006

 

Amongst the jurisdictions for which we have data there is a relatively similar pattern, with between 47% and 56% of students resilient in all three subjects.

In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.

The document summarises the outcomes thus:

‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’

 

PISA 2009 Results

The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above make them incompatible with those for 2006.

The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.

[Graph: percentage of resilient students in reading, all participating jurisdictions, PISA 2009]

 


Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)

 

The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.

The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.

There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.

The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in the performance of our sample.

 


Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009

At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).

At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far behind they are at both the top and the bottom of the attainment spectrum.

 

PISA 2012 Results

In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.

 

[Graph: percentage of resilient students in maths, all participating jurisdictions, PISA 2012]

Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)

 

Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.

Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.

Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.

On the OECD average measure, boys have a four percentage point lead over girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy. This is most strongly seen in Poland. In the UK the difference is just two percentage points.

The comparison with disadvantaged low achievers is illustrated in Chart 6.

 


Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012

 

Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample whose ‘tail’ of disadvantaged low achievers is longer than their share of resilient students. The reverse is true in the UK, but only just!

 

Another OECD publication, ‘Strengthening Resilience through Education: PISA Results – background document’, contains a graph showing the variation in jurisdictions’ mathematical performance by decile of socio-economic disadvantage. This is reproduced below.

 

[Graph: mathematics performance by decile of socio-economic disadvantage, PISA 2012]

The text adds:

‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’

One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, certainly as far as disadvantaged learners are concerned.

 

The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.

 

[Graph: change in the share of resilient students in maths, PISA 2003 to PISA 2012]

Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.

The UK is not included in this analysis because of issues with its PISA 2003 results.

Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.

It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.

The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.

 

[Graph: percentage of resilient students in problem solving, PISA 2012]

 

Resilience in the home countries

By way of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, comparing them with the top performer in my sample for each cycle and with the relevant OECD average. Problem-solving is omitted.

Only in science (using the ‘top third attainer, bottom third disadvantage’ formula) does the UK exceed the OECD average figure and then only slightly.

In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.

It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.

 


Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average

 

Unfortunately NFER does not pick up the concept of resilience in its analysis of England’s PISA 2012 results.

The only comparative analysis across the Home Countries that I can find is contained in a report prepared for the Northern Ireland Ministry of Education by NFER called ‘PISA 2009: Modelling achievement and resilience in Northern Ireland’ (March 2012).

This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.

The results show that 41% of disadvantaged English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.

Whether the same relationship holds true in maths and science using the ‘top quartile, bottom quartile’ methodology is unknown. One suspects though that each of the UK figures given above will also apply to England.

 

The characteristics of resilient learners

‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:

  • Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (eg. number of books in the home, parental level of education) rather than parental occupation and income.
  • In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
  • Students with an immigrant background – either born outside the country of residence or with parents who were born outside the country – tend to be under-represented amongst resilient students.
  • Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
  • Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.

Volume III of the PISA 2012 Report: ‘Ready to Learn: Students’ engagement, drive and self-beliefs’ offers a further gloss on these characteristics from a mathematical perspective:

‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…

….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.

Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…

….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’

Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:

The aforementioned NFER study for Northern Ireland uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland using PISA 2009 data.

It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; possessions (classic literature) 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0-10 books in the home 0.35.
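
Odds ratios are easily misread as ‘times more likely’. The snippet below is a minimal illustration of how an odds ratio translates into a probability; the 7.44 is the grammar school figure reported above, but the 20% baseline probability of resilience is an invented number used purely for illustration.

```python
# Illustrative only: 7.44 is the odds ratio reported in the NFER study;
# the 0.20 baseline probability is an invented figure.
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Probability obtained by multiplying the baseline odds by an odds ratio."""
    odds = p_baseline / (1 - p_baseline)   # probability -> odds
    new_odds = odds * odds_ratio           # apply the odds ratio
    return new_odds / (1 + new_odds)       # odds -> probability

print(round(apply_odds_ratio(0.20, 7.44), 2))  # 0.65, not 0.20 * 7.44
```

In other words, an odds ratio of 7.44 would raise an assumed 20% chance of resilience to roughly 65%, rather than multiplying the probability itself by 7.44.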

On the positive impact of selection the report observes:

‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’

Another study – ‘Predicting academic resilience with mathematics learning and demographic variables’ (Cheung et al 2014) – concludes that, amongst East Asian jurisdictions such as Hong Kong, Japan and South Korea, resilience is associated with avoiding ‘redoublement’ (grade repetition) and with having attended kindergarten for more than a year.

Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.

Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.

In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.

Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.

Resilience varies in different subjects according to year of education.

[Chart: resilience by subject and year of education, Turkey, PISA 2009]

There are also significant regional differences.

Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.

An Italian study – ‘A way to resilience: How can Italian disadvantaged students and schools close the achievement gap?’ (Agasisti and Longobardi, undated) – uses PISA 2009 data to examine the characteristics of resilient students attending schools with high levels of disadvantage.

This confirms some of the findings above in respect of student characteristics, finding a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.

School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.

A second Italian study – ‘Does public spending improve educational resilience? A longitudinal analysis of OECD PISA data’ (Agasisti et al 2014) – finds a positive correlation between the proportion of a country’s public expenditure devoted to education and the proportion of resilient students.

Finally, this commentary from Marc Tucker in the US links its relatively low incidence of resilient students to national views about the nature of ability:

‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability.  This leads to much lower expectations for students who come from low-income families…

My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…

… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’

 

Conclusion

What are we to make of all this?

It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.

The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.

We are not alone in facing this difficulty, although it is significantly more pronounced here than in several of our most prominent PISA competitors.

We should be worrying as much about our ‘short head’ as our ‘long tail’.

 

GP

September 2014


Closing England’s Excellence Gaps: Part 2

This is the second part of an extended post considering what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.


Mind the Gap by Clicsouris

Part one provided an England-specific definition, articulated a provisional theoretical model for addressing excellence gaps and set out the published data about the size of excellence gaps at Key Stages 2, 4 and 5 respectively.

Part two continues the review of the evidence base for excellence gaps, covering whether disadvantaged high attainers remain so, international comparisons data and related research, and excellence gap analysis from the USA.

It also describes those elements of present government policy that impact directly on excellence gaps and offers some recommendations for strengthening our national emphasis on this important issue.

 

Whether disadvantaged high achievers remain so

 

The Characteristics of High Attainers

The Characteristics of high attainers (DfES 2007) includes investigation of:

  • whether pupils in the top 10% at KS4 in 2006 were also high attainers at KS3 in 2004 and KS2 in 2001, by matching back to their fine grade point scores; and
  • chances of being a KS4 high attainer given a range of pupil characteristics at KS2 and KS3.

On the first point it finds that 4% of all pupils remain in the top 10% throughout, while 83% of pupils are never in the top 10% group.

Some 63% of those who were high attainers at the end of KS2 are still high attainers at the end of KS3, while 72% of KS3 high attainers are still in that group at the end of KS4. Approximately half of high attainers at KS2 are high attainers at KS4.

The calculation is not repeated for advantaged and disadvantaged high attainers respectively, but this shows that – while there is relatively little movement between  the high attaining population and other learners (with only 17% of the overall population falling within scope at any point) – there is a sizeable ‘drop out’ amongst high attainers at each key stage.

Turning to the second point, logistic regression is used to calculate the odds of being a KS4 high attainer given different levels of prior attainment and a range of pupil characteristics. Results are controlled to isolate the impact of individual characteristics and for attainment.

The study finds that pupils with a KS2 average points score (APS) above 33 are more likely than not to be high attainers at KS4, and this probability increases as their KS2 APS increases. For those with an APS of 36, the odds are 23.73, equivalent to a probability of about 96% – roughly a 24 in 25 chance – of being a KS4 high attainer.

For FSM-eligible learners, though, the odds ratio is 0.55, meaning that the odds of being a KS4 high attainer are 45% lower for FSM-eligible pupils than for their non-FSM counterparts with similar prior attainment and characteristics.

The full set of findings for individual characteristics is reproduced below.

Ex gap Capture 7

 

An appendix supplies the exact ratios for each characteristic and the text points out that these can be multiplied to calculate odds ratios for different combinations.

The odds for different prior attainment levels and other characteristics combined with FSM eligibility are not worked through, but could easily be calculated. It would be extremely worthwhile to repeat this analysis using more recent data to see whether the results would be replicated for those completing KS4 in 2014.
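To illustrate how such a combination might work, the sketch below multiplies the two odds quoted above (23.73 for a KS2 APS of 36 and 0.55 for FSM eligibility) and converts the results to probabilities. The combined figure is purely indicative – it assumes the study’s ratios can be multiplied in the way its text suggests – and is not a result reported by DfES.

```python
# Indicative sketch only: combining odds quoted in the DfES study.
# An odds value converts to a probability via p = odds / (1 + odds).

def odds_to_prob(odds):
    return odds / (1 + odds)

aps36_odds = 23.73     # odds of being a KS4 high attainer for a pupil with KS2 APS of 36
fsm_odds_ratio = 0.55  # odds ratio associated with FSM eligibility

print(round(odds_to_prob(aps36_odds), 2))                   # ~0.96, i.e. roughly a 24 in 25 chance
print(round(odds_to_prob(aps36_odds * fsm_odds_ratio), 2))  # ~0.93 for an otherwise similar FSM-eligible pupil
```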

 

Sutton Trust

In 2008, the Sutton Trust published ‘Wasted talent? Attrition rates of high achieving pupils between school and university’ which examines the attrition rates for FSM-eligible learners among the top 20% of performers at KS2, KS3 and KS4.

A footnote says that this calculation was ‘on the basis of their English and maths scores at age 11, and at later stages of schooling’, which is somewhat unclear. A single, unidentified cohort is tracked across key stages.

The report suggests ‘extremely high rates of ‘leakage’ amongst the least privileged pupils’. The key finding is that two-thirds of disadvantaged top performers at KS2 are not amongst the top performers at KS4, whereas the equivalent figure for advantaged top performers is 42%.

 

EPPSE

Also in the longitudinal tradition, ‘Performing against the odds: developmental trajectories of children in the EPPSE 3-16 study’ (Siraj-Blatchford et al, June 2011) investigated, through interviews, the factors that enabled a small group of disadvantaged learners to ‘succeed against the odds’.

Twenty learners were identified who were at the end of KS3 or at KS4 and who had achieved well above predicted levels in English and maths at the end of KS2. Achievement was predicted for the full sample of 2,800 children within the EPPSE study via multi-level modelling, generating:

‘…residual scores for each individual child, indicating the differences between predicted and attained achievement at age 11, while controlling for certain child characteristics (i.e., age, gender, birth weight, and the presence of developmental problems) and family characteristics (i.e., mothers’ education, fathers’ education, socio-economic status [SES] and family income). ‘

The 20 identified as succeeding against the odds had KS2 residual scores for both English and maths within the highest 20% of the sample. ‘Development trajectories’ were created for the group using a range of assessments conducted at age 3, 4, 5, 7, 11 and 14.

The highest job level held in the family when the children were aged 3-4 was manual, semi-skilled or unskilled, or the parent(s) had never worked.

The 20 were randomly selected from each gender – eight boys and 12 girls – while ensuring representation of ‘the bigger minority ethnic groups’. It included nine students characterised as White UK, five Black Caribbean, two Black African and one each of Indian (Sikh), Pakistani, Mixed Heritage and Indian (Hindu).

Interviews were conducted with the children, their parents and the teacher at their current secondary school whom the learners felt ‘knew them best’. Teacher interviews were secured for 11 of the 20.

Comparison of development trajectories showed significant gaps between this ‘low SES high attainment’ group and a comparative sample of ‘low SES, predicted attainment’ students. They were ahead from the outset and pulled further away.

They also exceeded a comparator group of high SES learners performing at predicted levels from entry to primary education until KS2. Even at KS3, 16 of the 20 were still performing above the mean of the high SES sample.

These profiles – illustrated in the two charts below – were very similar in English and maths. In each case, Group 1 comprises those with ‘low SES, high attainment’, while Group 4 comprises ‘high SES, predicted attainment’ students.

 

Supp exgap Eng Capture

Supp exgap Maths Capture

 

Interviews identified five factors that helped to explain this success:

  • The child’s perceived cognitive ability, strong motivation for school and learning and their hobbies and interests. Most parents and children regarded cognitive ability as ‘inherent to the child’, but they had experienced many opportunities to develop their abilities and received support in developing a ‘positive self-image’. Parenting ‘reflected a belief in the parent’s efficacy to positively influence the child’s learning’. Children also demonstrated ability to self-regulate and positive attitudes to homework. They had a positive attitude to learning and made frequent use of books and computers for this purpose. They used school and learning as distractions from wider family problems. Many were driven to learn, to succeed educationally and achieve future aspirations.
  • Home context – effective practical and emotional support with school and learning. Families undertook a wide range of learning activities, especially in the early years. These were perceived as enjoyable but also valuable preparation for subsequent schooling. During the primary years, almost all families actively stimulated their children to read. In the secondary years, many parents felt their efforts to regulate their children’s activities and set boundaries were significant. Parents also provided practical support with school and learning, taking an active interest and interacting with their child’s school. Their parenting style is described as ‘authoritative: warm, firm and accepting of their needs for psychological autonomy but demanding’. They set clear standards and boundaries for behaviour while granting extra autonomy as their children matured. They set high expectations and felt strongly responsible for their child’s education and attitude to learning. They believed in their capacity to influence their children positively. Some were motivated by the educational difficulties they had experienced.
  • (Pre-)School environment – teachers who are sensitive and responsive to the child’s needs and use ‘an authoritative approach to teaching and interactive teaching strategies’; and, additionally, supportive school policies. Parents had a positive perception of the value of pre-school education, though the value of highly effective pre-school provision was not clear cut with this sample. Moreover ‘very few clear patterns of association could be discerned between primary school effectiveness and development of rankings on trajectories’. That said both parents and children recognised that their schools had helped them address learning and behavioural difficulties. Success was attributed to the quality of teachers. ‘They thought that good quality teaching meant that teachers were able to explain things clearly, were enthusiastic about the subject they taught, were approachable when things were difficult to understand, were generally friendly, had control over the class and clearly communicated their expectations and boundaries.’
  • Peers providing practical, emotional and motivational support. Friends were especially valuable in helping them to respond to difficulties, helping in class, with homework and revision. Such support was often mutual, helping to build understanding and develop self-esteem, as a consequence of undertaking the role of teacher. Friends also provided role models and competitors.
  • Similar support provided by the extended family and wider social, cultural and religious communities. Parents encouraged their children to take part in extra-curricular activities and were often aware of their educational benefits. Family networks often provided additional learning experiences, particularly for Caribbean and some Asian families.

 

Ofsted

Ofsted’s The most able students: Are they doing as well as they should in our non-selective secondary schools? (2013) defines this population rather convolutedly as those:

‘…starting secondary school in Year 7 attaining level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’ (Footnote p6-7)

There is relatively little data in the report about the performance of high-attaining disadvantaged learners, other than the statement that only 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points.

I have been unable to find national transition matrices for advantaged and disadvantaged learners, which would enable us to compare the proportion of advantaged and disadvantaged pupils making and exceeding the expected progress between key stages.

 

Regression to the mean and efforts to circumvent it

Much prominence has been given to Feinstein’s 2003 finding that, whereas high-scoring children from advantaged and disadvantaged backgrounds (defined by parental occupation) perform at a broadly similar level when tested at 22 months, the disadvantaged group are subsequently overtaken by relatively low-scoring children from advantaged backgrounds during the primary school years.

The diagram that summarises this relationship has been reproduced widely and much used as the centrepiece of arguments justifying efforts to improve social mobility.

Feinstein Capture

But Feinstein’s findings were subsequently challenged on methodological grounds associated with the effects of regression to the mean.

Jerrim and Vignoles (2011) concluded:

‘There is currently an overwhelming view amongst academics and policymakers that highly able children from poor homes get overtaken by their affluent (but less able) peers before the end of primary school. Although this empirical finding is treated as a stylised fact, the methodology used to reach this conclusion is seriously flawed. After attempting to correct for the aforementioned statistical problem, we find little evidence that this is actually the case. Hence we strongly recommend that any future work on high ability–disadvantaged groups takes the problem of regression to the mean fully into account.’
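For readers unfamiliar with the statistical point, the toy simulation below illustrates it. Two groups differ only in their average ‘true ability’, and each test score is true ability plus independent noise: selecting ‘high achievers’ on the noisy baseline guarantees that the lower-scoring group appears to fall back further on the later test, even though nothing real has changed. The parameters are invented purely for illustration and do not model any of the datasets discussed here.

```python
# Toy illustration of regression to the mean (invented parameters, illustration only).
import random

random.seed(1)

def average_fall_back(group_mean, n=50000, noise_sd=8.0, threshold=115):
    """Mean drop between two noisy tests for pupils selected on the first test."""
    drops = []
    for _ in range(n):
        ability = random.gauss(group_mean, 10)
        test1 = ability + random.gauss(0, noise_sd)  # baseline used to select 'high achievers'
        test2 = ability + random.gauss(0, noise_sd)  # later test used to track them
        if test1 > threshold:
            drops.append(test1 - test2)
    return sum(drops) / len(drops)

print("advantaged group fall-back:   ", round(average_fall_back(100), 1))
print("disadvantaged group fall-back:", round(average_fall_back(90), 1))
```

Selecting the high-achieving group on one assessment and then tracking their progress on different assessments removes the shared noise term; this is essentially the approach taken in the June 2014 analysis discussed below.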

On the other hand, Whitty and Anders comment:

‘Although some doubt has been raised regarding this analysis on account of the potential for regression to the mean to exaggerate the phenomenon (Jerrim and Vignoles, 2011), it is highly unlikely that this would overturn the core finding that high SES, lower ability children catch up with their low-SES, higher-ability peers.’

Their point is borne out by Progress made by high-attaining children from disadvantaged backgrounds (June 2014), which suggests that Vignoles, as part of the writing team, has changed her mind somewhat since 2011.

This research adopts a methodological route to minimise the impact of regression to the mean. This involves assigning learners to achievement groups using a different test to those used to follow their attainment trajectories and focusing principally on those trajectories from KS2 onwards.

The high attaining group is defined as those achieving Level 3 or above in KS1 writing, a group comprising 12.6% of the sample. (For comparison, the same calculations are undertaken based on achieving Level 3 or above in KS1 maths.) These pupils are ranked and assigned a percentile on the basis of their performance on the remaining KS1 tests and at each subsequent key stage.

The chart summarising the outcomes in the period from KS1 to KS4 is reproduced below, showing the different trajectories of the ‘most deprived’ and ‘least deprived’. These are upper and lower quintile groups of state school students, derived from FSM eligibility together with area-based measures of disadvantage and census-based measures of socio-economic status.

 

Ex gap 8 Capture

The trajectories do not alter significantly beyond KS4.

The study concludes:

‘…children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’

Hence:

‘The period between Key Stage 2 and Key Stage 4 appears to be a crucial time to ensure that higher-achieving pupils from poor backgrounds remain on a high achievement trajectory.’

In short, a Feinstein-like relationship is established but it operates at a somewhat later stage in the educational process.

 

International comparisons studies

 

PISA: Resilience

OECD PISA studies have recently begun to report on the performance of what they call ‘resilient’ learners.

Against the Odds: Disadvantaged Students Who Succeed in Schools (OECD, 2011) describes this population as those who fall within the bottom third of their country’s distribution by socio-economic background, but who achieve within the top third on PISA assessments across participating countries.

This publication uses PISA 2006 science results as the basis of its calculations. The relative position of different countries is shown in the chart reproduced below. Hong Kong tops the league at 24.8%, the UK is at 13.5%, slightly above the OECD average of 13%, while the USA is languishing at 9.9%.

Ex Gap Capture 9

The findings were discussed further in PISA in Focus 5 (OECD 2011), where PISA 2009 data is used to make the calculation. The methodology is also significantly adjusted so that it includes a substantially smaller population:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

According to this measure, the UK is at 24% and the US has leapfrogged them at 28%. Both are below the OECD average of 31%, while Shanghai and Hong Kong stand at over 70%.

The Report on PISA 2012 (OECD 2013) retains the more demanding definition of resilience but dispenses with the multiplication by 4, so its results need to be multiplied by four to be comparable with those for 2009.

This time round, Shanghai is at 19.2% (76.8%) and Hong Kong at 18.1% (72.4%). The OECD average is 6.4% (25.6%), the UK at 5.8% (23.2%) and the US at 5.2% (20.8%).
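Since the 2012 report omits the multiplication step, converting its published figures back onto the 2009 basis is a simple rescaling, as the short sketch below shows (using the figures quoted above).

```python
# Rescaling the published PISA 2012 resilience shares so they are comparable with 2009:
# multiplying by 4 converts a share of all students into a share of disadvantaged
# students (the bottom quarter by ESCS).
published_2012 = {"Shanghai": 19.2, "Hong Kong": 18.1, "OECD average": 6.4, "UK": 5.8, "US": 5.2}

for name, share in published_2012.items():
    print(f"{name}: {share * 4:.1f}% of disadvantaged students are resilient")
```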

So the UK has lost a little ground compared with 2009, but is much closer to the OECD average and has overtaken the US, which has fallen back by some seven percentage points.

I could find no commentary on these changes.

NFER has undertaken some work on resilience in Northern Ireland, using PISA 2009 reading results (and the original ‘one third’ methodology) as a base. This includes odds ratios showing how different student and school characteristics are associated with being resilient. This could be replicated for England using PISA 2012 data and the latest definition of resilience.

 

Research on socio-economic gradients

The Socio-Economic Gradient in Teenagers’ Reading Skills: How Does England Compare with Other Countries? (Jerrim 2012) compares the performance of students within the highest and lowest quintiles of the ISEI Index of Occupational Status on the PISA 2009 reading tests.

It quantifies the proportion of these two populations within each decile of  achievement, so generating a gradient, before reviewing how this gradient has changed between PISA 2000 and PISA 2009, comparing outcomes for England, Australia, Canada, Finland, Germany and the US.
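For anyone wanting to reproduce a gradient of this kind, the rough sketch below shows one way it could be constructed from student-level data. The column names (reading_score, isei) are my own assumptions rather than Jerrim’s actual variables, and this is not his code.

```python
# Rough sketch of a socio-economic gradient of the kind described above (illustrative only).
# For each decile of the reading-score distribution, it reports the share of students
# drawn from the most and least advantaged ISEI quintiles.
import pandas as pd

def socio_economic_gradient(df):
    df = df.copy()
    df["decile"] = pd.qcut(df["reading_score"], 10, labels=list(range(1, 11)))
    df["ses_quintile"] = pd.qcut(df["isei"], 5, labels=list(range(1, 6)))
    shares = (df.groupby("decile", observed=True)["ses_quintile"]
                .value_counts(normalize=True)
                .unstack())
    return shares[[1, 5]].rename(columns={1: "least_advantaged", 5: "most_advantaged"})
```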

Jerrim summarises his findings thus:

‘The difference between advantaged and disadvantaged children’s PISA 2009 reading test scores in England is similar (on average) to that in most other developed countries (including Australia, Germany and, to some extent, the US). This is in contrast to previous studies from the 1990s, which suggested that there was a particularly large socio-economic gap in English pupils’ academic achievement.

Yet the association between family background and high achievement seems to be stronger in England than elsewhere.

There is some evidence that the socio-economic achievement gradient has been reduced in England over the last decade, although not amongst the most able pupils from advantaged and disadvantaged homes.’

Jerrim finds that the link in England between family background and high achievement is stronger than in most other OECD countries, whereas this is not the case at the other end of the distribution.

He hypothesises that this might be attributable to the recent policy focus on reducing the ‘long tail’ while:

‘much less attention seems to be paid to helping disadvantaged children who are already doing reasonably well to push on and reach the top grades’.

He dismisses the notion that the difference is associated with the fact that  disadvantaged children are concentrated in lower-performing schools, since it persists even when controls for school effects are introduced.

In considering why PISA scores show the achievement gap in reading has reduced between 2000 and 2009 at the lower end of the attainment distribution but not at the top, he cites two possibilities: that Government policy has been disproportionately successful at the lower end; and that there has been a more substantial decline in achievement amongst learners from advantaged backgrounds than amongst their disadvantaged peers. He is unable to rule out the latter possibility.

He also notes in passing that PISA scores in maths do not generate the same pattern.

These arguments are further developed in ‘The Reading Gap: The socio-economic gap in children’s reading skills: A cross-national comparison using PISA 2009’ (Jerrim, 2013) which applies the same methodology.

This finds that high-achieving (top decile of the test distribution) boys from the most advantaged quintile in England are two years and seven months ahead of high-achieving boys from the most disadvantaged quintile, while the comparable gap for girls is slightly lower, at two years and four months.

The chart reproduced below illustrates international comparisons for boys. It shows that only Scotland has a larger high achievement gap than England. (The black lines indicate 99% confidence intervals – he attributes the uncertainty to ‘sampling variation’.)

Gaps in countries at the bottom of the table are approximately half the size of those in England and Scotland.

Ex gap 10 capture

 

One of the report’s recommendations is that:

‘The coalition government has demonstrated its commitment to disadvantaged pupils by establishing the Education Endowment Foundation… A key part of this Foundation’s future work should be to ensure highly able children from disadvantaged backgrounds succeed in school and have the opportunity to enter top universities and professional jobs. The government should provide additional resources to the foundation to trial interventions that specifically target already high achieving children from disadvantaged homes. These should be evaluated using robust evaluation methodologies (e.g. randomised control trials) so that policymakers develop a better understanding of what schemes really have the potential to work.’

The study is published by the Sutton Trust whose Chairman – Sir Peter Lampl – is also chairman of the EEF.

In ‘Family background and access to high ‘status’ universities’ (2013), Jerrim provides a different chart showing country-by-country estimates of the proportion of disadvantaged high-achieving learners. The measure of achievement is PISA Level 5 in reading and the measure of disadvantage remains quintiles derived from the ISEI index.

Ex Gap 12 Capture 

The underlying figures are not supplied.

Also in 2013, in ‘The mathematical skills of school children: how does England compare to the high-performing East Asian jurisdictions?’ Jerrim and Choi construct a similar gradient for maths, drawing on a mix of PISA and TIMSS assessments conducted between 2003 and 2009, so enabling them to consider variation according to the age at which assessment takes place.

The international tests selected are TIMSS 2003, 4th grade; TIMSS 2007, 8th grade and PISA 2009. The differences between what these tests measure are described as ‘slight’. The analysis of achievement relies on deciles of the achievement distribution.

Thirteen comparator countries are included, including six wealthy western economies, three ‘middle income’ western economies and four Asian Tigers (Hong Kong, Japan, Singapore and Taiwan).

This study uses the number of books in the family home as the best available proxy for socio-economic status, comparing the most advantaged group (over 200 books) with the least advantaged (under 25 books). It acknowledges the limitations of this proxy, which Jerrim discusses elsewhere.

The evidence suggests that:

‘between primary school and the end of secondary school, the gap between the lowest achieving children in England and the lowest achieving children in East Asian countries is reduced’

but remains significant.

Conversely, results for the top 10% of the distribution:

‘suggest that the gap between the highest achieving children in England and the highest achieving children in East Asia increases between the end of primary school and the end of secondary school’.

The latter outcome is illustrated in the chart reproduced below.

Ex gap 11 Capture

 

The authors do not consider variation by socio-economic background amongst the high-achieving cohort, presumably because the data still does not support the pattern they previously identified for reading.

 

US studies

In 2007 the Jack Kent Cooke Foundation published ‘Achievement Trap: How America is Failing Millions of High-Achieving Students from Low Income Backgrounds’ (Wyner, Bridgeland and Diiulio). The text was subsequently revised in 2009.

This focuses exclusively on gaps attributable to socio-economic status, by comparing the performance of those in the top and bottom halves of the family income distribution in the US, as adjusted for family size.

The achievement measure is top quartile performance on nationally normalised exams administered within two longitudinal studies: The National Education Longitudinal Study (NELS) and the Baccalaureate and Beyond Longitudinal Study (B&B).

The study reports that relatively few lower income students remain high achievers throughout their time in elementary and high school:

  • 56% remain high achievers in reading by Grade 5, compared with 69% of higher income students.
  • 25% fall out of the high achiever cohort in high school, compared with 16% of higher income students.
  • Higher income learners who are not high achievers in Grade 1 are more than twice as likely as their lower income counterparts to be high achievers by Grade 5. The same is true between Grades 8 and 12.

2007 also saw the publication of ‘Overlooked Gems: A national perspective on low income promising learners’ (Van Tassel-Baska and Stambaugh). This is a compilation of the proceedings of a 2006 conference which does not attempt a single definition of the target group, but draws on a variety of different research studies and programmes, each with different starting points.

An influential 2009 McKinsey study, ‘The Economic Impact of the Achievement Gap in America’s Schools’, acknowledges the existence of what it calls a ‘top gap’. The authors use this term with reference to:

  • the number of top performers and the level of top performance in the US compared with other countries and
  • the gap in the US between the proportion of Black/Latino students and the proportion of all students achieving top levels of performance.

The authors discuss the colossal economic costs of achievement gaps more generally, but fail to extend this analysis to the ‘top gap’ specifically.

In 2010 ‘Mind the Other Gap: The Growing Excellence Gap in K-12 Education’ (Plucker, Burroughs and Song) was published – and seems to have been the first study to use this term.

The authors define such gaps straightforwardly as

‘Differences between subgroups of students performing at the highest levels of achievement’

The measures of high achievement deployed are the advanced standards on US NAEP maths and reading tests, at Grades 4 and 8 respectively.

The study identifies gaps based on four sets of learner characteristics:

  • Socio-economic status (eligible or not for free or reduced price lunch).
  • Ethnic background (White versus Black and/or Hispanic).
  • English language proficiency (what we in England would call EAL, compared with non-EAL).
  • Gender (girls versus boys).

Each characteristic is dealt with in isolation, so there is no discussion of the gaps between – for example – disadvantaged Black/Hispanic and disadvantaged White boys.

In relation to socio-economic achievement gaps, Plucker et al find that:

  • In Grade 4 maths, from 1996 to 2007, the proportion of advantaged learners achieving the advanced level increased by 5.6 percentage points, while the proportion of disadvantaged learners doing so increased by 1.2 percentage points. In Grade 8 maths, these percentage point changes were 5.7 and 0.8 percentage points respectively. Allowing for changes in the size of the advantaged and disadvantaged cohorts, excellence gaps are estimated to have widened by 4.1 percentage points in Grade 4 (to 7.3%) and 4.9 percentage points in Grade 8 (to 8.2%).
  • In Grade 4 reading, from 1998 to 2007, the proportion of advantaged learners achieving the advanced level increased by 1.2 percentage points, while the proportion of disadvantaged students doing so increased by 0.8 percentage points. In Grade 8 reading, these percentage point changes were almost negligible for both groups. The Grade 4 excellence gap is estimated to have increased slightly, by 0.4 percentage points (to 9.4%) whereas Grade 8 gaps have increased minimally by 0.2 percentage points (to 3.1%).

They observe that the sizes of excellence gaps are, at best, only moderately correlated with the sizes of gaps at lower levels of achievement.

There is a weak relationship between gaps at basic and advanced level – indeed ‘smaller achievement gaps among minimally competent students is related to larger gaps among advanced students’ – but there is some inter-relationship between those at proficient and advanced level.

They conclude that, whereas No Child Left Behind (NCLB) helped to narrow achievement gaps, this does not extend to high achievers.

There is no substantive evidence that the NCLB focus on lower achievers has increased the excellence gap, although the majority of states surveyed by the NAGC felt that NCLB had diverted attention and resource away from gifted education.

Published in 2011, ‘Do High Fliers Maintain their Altitude?’ (Xiang et al) provides a US analysis of whether individual students remain high achievers throughout their school careers.

They do not report outcomes for disadvantaged high achievers, but do consider briefly those attending schools with high and low proportions respectively of students eligible for free and reduced price lunches.

For this section of the report, high achievement is defined as ‘those whose math or reading scores placed them within the top ten per cent of their individual grades and schools’. Learners were tracked from Grades 3 to 5 and Grades 6 to 8.

It is described as exploratory, because the sample was not representative.

However:

‘High-achieving students attending high-poverty schools made about the same amount of academic growth over time as their high-achieving peers in low-poverty schools…It appears that the relationship between a school’s poverty rate and the growth of its highest-achieving students is weak. In other words, attending a low-poverty school adds little to the average high achiever’s prospects for growth.’

The wider study was criticised in a review by the NEPC, in part on the grounds that the results may have been distorted by regression to the mean, a shortcoming only briefly discussed in an appendix.

The following year saw the publication of Unlocking Emergent Talent: Supporting High Achievement of Low-Income, High-Ability Students (Olszewski-Kubilius and Clarenbach, 2012).

This is the report of a national summit on the issue convened in that year by the NAGC.

It follows Plucker (one of the summit participants) in taking as its starting point the achievement of the advanced level on selected NAEP assessments by learners eligible for free and reduced price lunches.

But it also reports some additional outcomes for Grade 12 and for assessments of civics and writing:

  • ‘Since 1998, 1% or fewer of 4th-, 8th-, and 12th-grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.
  • Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’

The bulk of the report is devoted to identifying barriers to progress and offering recommendations for improving policy, practice and research. I provided an extended analysis in this post from May 2013.

Finally, ‘Talent on the Sidelines: Excellence Gaps and America’s Persistent Talent Underclass’ (Plucker, Hardesty and Burroughs 2013) is a follow-up to ‘Mind the Other Gap’.

It updates the findings in that report, as set out above:

  • In Grade 4 maths, from 1996 to 2011, the proportion of advantaged students scoring at the advanced level increased by 8.3 percentage points, while the proportion of disadvantaged learners doing so increased by 1.5 percentage points. At Grade 8, the comparable changes were 8.5 percentage points and 1.5 percentage points respectively. Excellence gaps have increased by 6.8 percentage points at Grade 4 (to 9.6%) and by 7 percentage points at Grade 8 (to 10.3%).
  • In Grade 4 reading, from 1998 to 2011, the proportion of advantaged students scoring at the advanced level increased by 2.6 percentage points, compared with an increase of 0.9 percentage points amongst disadvantaged learners. Grade 8 saw equivalent increases of 1.8 and 0.9 percentage points respectively. Excellence gaps are estimated to have increased at Grade 4 by 1.7 percentage points (to 10.7%) and marginally increased at Grade 8 by 0.9 percentage points (to 4.2%).

In short, many excellence gaps remain large and most continue to grow. The report’s recommendations are substantively the same as those put forward in 2010.

 

How Government education policy impacts on excellence gaps

Although many aspects of Government education policy may be expected to have some longer-term impact on raising the achievement of all learners, advantaged and disadvantaged alike, relatively few interventions are focused exclusively and directly on closing attainment gaps between advantaged and disadvantaged learners – and so have the potential to make a significant difference to excellence gaps.

The most significant of these include:

 

The Pupil Premium:

In November 2010, the IPPR voiced concerns that the benefits of the pupil premium might not reach all those learners who attract it.

Accordingly they recommended that pupil premium should be allocated directly to those learners through an individual Pupil Premium Entitlement which might be used to support a menu of approved activities, including ‘one-to-one teaching to stretch the most able low income pupils’.

The recommendation has not been repeated and the present Government shows no sign of restricting schools’ freedom to use the premium in this manner.

However, the Blunkett Labour Policy Review ‘Putting students and parents first’ recommends that Labour in government should:

‘Assess the level and use of the Pupil Premium to ensure value for money, and that it is targeted to enhance the life chances of children facing the biggest challenges, whether from special needs or from the nature of the background and societal impact they have experienced.’

In February 2013 Ofsted reported that schools spending the pupil premium successfully to improve achievement:

‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.

Conversely, where schools were less successful in spending the funding, they:

‘focused on pupils attaining the nationally expected level at the end of the key stage…but did not go beyond these expectations, so some more able eligible pupils underachieved.’

In July 2013, DfE’s Evaluation of Pupil Premium reported that, when deciding which disadvantaged pupils to target for support, the most commonly applied criterion was ‘low attainment’, used by 91% of primary schools and 88% of secondary schools.

In June 2013, in ‘The Most Able Students’, Ofsted reported that:

‘Pupil Premium funding was used in only a few instances to support the most able students who were known to be eligible for free school meals. The funding was generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds.’

Accordingly, it gave a commitment that:

‘Ofsted will… consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds.’

However, this was not translated into the school inspection guidance.

The latest edition of the School Inspection Handbook says only:

‘Inspectors should pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should. For example, does a large enough proportion of those pupils who had the highest attainment at the end of Key Stage 2 in English and mathematics achieve A*/A GCSE grades in these subjects by the age of 16?

Inspectors should summarise the achievements of the most able pupils in a separate paragraph of the inspection report.’

There is no reference to the most able in parallel references to the pupil premium.

There has, however, been some progress in giving learners eligible for the pupil premium priority in admission to selective schools.

In May 2014, the TES reported that:

‘Thirty [grammar] schools have been given permission by the Department for Education to change their admissions policies already. The vast majority of these will introduce the changes for children starting school in September 2015…A small number – five or six – have already introduced the reform.’

The National Grammar Schools Association confirmed that:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

In July 2014, the Government launched a consultation on the School Admissions Code which proposes extending to all state-funded schools the option to give priority in their admission arrangements to learners eligible for the pupil premium. This was previously open to academies and free schools via their funding agreements.

 

The Education Endowment Foundation (EEF)

The EEF describes itself as:

‘An independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.’

The 2010 press release announcing its formation emphasised its role in raising standards in underperforming schools. This was reinforced by the Chairman in a TES article from June 2011:

‘So the target group for EEF-funded projects in its first couple of years are pupils eligible for free school meals in primary and secondary schools underneath the Government’s floor standards at key stages 2 and 4. That’s roughly 1,500 schools up and down the country. Projects can benefit other schools and pupils, as long as there is a significant focus on this core target group of the most needy young people in the most challenging schools.’

I have been unable to trace any formal departure from this position, though it no longer appears in this form in the Foundation’s guidance. The Funding FAQs say only:

‘In the case of projects involving the whole school, rather than targeted interventions, we would expect applicants to be willing to work with schools where the proportion of FSM-eligible pupils is well above the national average and/or with schools where FSM-eligible pupils are under-performing academically.’

I can find no EEF-funded projects that are exclusively or primarily focused on high-attaining disadvantaged learners, though a handful of its reports do refer to the impact on this group.

 

Changes to School Accountability Measures

As we have seen in Part one, the School Performance Tables currently provide very limited information about the performance of disadvantaged high achievers.

The July 2013 consultation document on primary assessment and accountability reform included a commitment to publish a series of headline measures in the tables including:

‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score in each subject.’

Moreover, it added:

‘We will publish all the headline measures to show the attainment and progress of pupils for whom the school is in receipt of the pupil premium.’

Putting two and two together, this should mean that, from 2016, we will be able to see the percentage of pupil premium-eligible students achieving a high scaled score, though we do not yet know what ‘high scaled score’ means, nor do we know whether the data will be for English and maths separately or combined.

The October 2013 response to the secondary assessment and accountability consultation document fails to say explicitly whether excellence gap measures will be published in School Performance Tables.

It mentions that:

‘Schools will now be held to account for (a) the attainment of their disadvantaged pupils, (b) the progress made by their disadvantaged pupils, and (c) the in-school gap in attainment between disadvantaged pupils and their peers.’

Meanwhile a planned data portal will contain:

‘the percentage of pupils achieving the top grades in GCSEs’

but the interaction between these two elements, if any, remains unclear.

The March 2014 response to the consultation on post-16 accountability and assessment says:

‘We intend to develop measures covering all five headline indicators for students in 16-19 education who were in receipt of pupil premium funding in year 11.’

The post-16 headline measures include a new progress measure and an attainment measure showing the average points score across all level 3 qualifications.

It is expected that a destination measure will also be provided, as long as the methodology can be made sufficiently robust. The response says:

‘A more detailed breakdown of destinations data, such as entry to particular groups of universities, will continue to be published below the headline. This will include data at local authority level, so that destinations for students in the same area can be compared.’

and this should continue to distinguish the destinations of disadvantaged students.

Additional A level attainment measures – the average grade across the best three A levels and the achievement of AAB grades with at least two in facilitating subjects – seem unlikely to be differentiated according to disadvantage.

There remains a possibility that much more excellence gap data, for primary, secondary and post-16, will be made available through the planned school portal, but no specification had been made public at the time of writing.

More worryingly, recent news reports have suggested that the IT project developing the portal and the ‘data warehouse’ behind it has been abandoned. The statements refer to continuing to deliver ‘the school performance tables and associated services’, but there is no clarification of whether this latter phrase includes the portal. Given the absence of an official statement, one suspects the worst.

 

 

The Social Mobility and Child Poverty Commission (SMCPC)

The Commission was established with the expectation that it would ‘hold the Government’s feet to the fire’ to encourage progress on these two topics.

It publishes annual ‘state of the nation’ reports that are laid before Parliament and also undertakes ‘social mobility advocacy’.

The first annual report – already referenced in Part one – was published in November 2013. The second is due in October 2014.

The Chairman of the Commission was less than complimentary about the quality of the Government’s response to its first report, which made no reference to its comments about attainment gaps at higher grades. It remains to be seen whether the second will be taken any more seriously.

The Commission has already shown significant interest in disadvantaged high achievers – in June 2014 it published the study ‘Progress made by high-attaining children from disadvantaged backgrounds’ referenced above – so there is every chance that the topic will feature again in the 2014 annual report.

The Commission is of course strongly interested in the social mobility indicators and progress made against them, so may also include recommendations for how they might be adjusted to reflect changes to the schools accountability regime set out above.

 

Recommended reforms to close excellence gaps

Several proposals emerge from the commentary on current Government policy above:

  • It would be helpful to have further evaluation of the pupil premium to check whether high-achieving disadvantaged learners are receiving commensurate support. Schools need further guidance on ways in which they can use the premium to support high achievers. This should also be a focus for the pupil premium Champion and in pupil premium reviews.
  • Ofsted’s school inspection handbook requires revision to fulfil its commitment to focus on the most able in receipt of the premium. Inspectors also need guidance (published so schools can see it) to ensure common expectations are applied across institutions. These provisions should be extended to the post-16 inspection regime.
  • All selective secondary schools should be invited to prioritise pupil premium recipients in their admissions criteria, with the Government reserving the right to impose this on schools that do not comply voluntarily.
  • The Education Endowment Foundation should undertake targeted studies of interventions to close excellence gaps, but should also ensure that the impact on excellence gaps is mainstreamed in all the studies they fund. (This should be straightforward since their Chairman has already called for action on this front.)
  • The Government should consider the case for the inclusion of data on excellence gaps in all the headline measures in the primary, secondary and post-16 performance tables. Failing that, such data (percentages and numbers) should be readily accessible from a new data portal as soon as feasible, together with historical data of the same nature. (If the full-scale portal is no longer deliverable, a suitable alternative openly accessible database should be provided.) It should also publish annually a statistical analysis of all excellence gaps and the progress made towards closing them. As much progress as possible should be made before the new assessment and accountability regime is introduced. At least one excellence gap measure should be incorporated into revised DfE impact indicators and the social mobility indicators.
  • The Social Mobility and Child Poverty Commission (SMCPC) should routinely consider the progress made in closing excellence gaps within its annual report – and the Government should commit to consider seriously any recommendations they offer to improve such progress.

This leaves the question of whether there should be a national programme dedicated to closing excellence gaps, and so improving fair access to competitive universities. (It makes excellent sense to combine these twin objectives and to draw on the resources available to support the latter.)

Much of the research above – whether it originates in the US or UK – argues for dedicated state/national programmes to tackle excellence gaps.

More recently, the Sutton Trust has published a Social Mobility Manifesto for 2015 which recommends that the next government should:

‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.

Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.

Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

This is not as clear as it might be about the balance between support for the most able and the most able disadvantaged respectively.

I have written extensively about what shape such a programme should have, most recently in the final section of ‘Digging Beneath the Destination Measures’ (July 2014).

The core would be:

‘A light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously…

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs….

…Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.’

 

Close

We know far less than we should about the size of excellence gaps in England – and whether or not progress is being made in closing them.

I hope that this post makes some small contribution towards rectifying matters, even though the key finding is that the picture is fragmented and extremely sketchy.

Rudimentary as it is, this survey should provide a baseline of sorts, enabling us to judge more easily what additional information is required and how we might begin to frame effective practice, whether at institutional or national level.

 

GP

September 2014

Closing England’s Excellence Gaps: Part One

This post examines what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.

Mind the Gap by Clicsouris

It assesses the capacity of current national education policy to close these gaps and recommends further action to improve the prospects of doing so rapidly and efficiently.

Because the post is extremely long I have divided it into two parts.

Part one comprises:

  • A working definition for the English context, explanation of the significance of excellence gaps, description of how this post relates to earlier material and provisional development of the theoretical model articulated in those earlier posts.
  • A summary of the headline data on socio-economic attainment gaps in England, followed by a review of published data relevant to excellence gaps at primary, secondary and post-16 levels.

Part two contains:

  • A distillation of research evidence, including material on whether disadvantaged high attainers remain so, international comparisons studies and research derived from them, and literature covering excellence gaps in the USA.
  • A brief review of how present Government policy might be expected to impact directly on excellence gaps, especially via the Pupil Premium, school accountability measures, the Education Endowment Foundation (EEF) and the Social Mobility and Child Poverty Commission (SMCPC). I have left to one side the wider set of reforms that might have an indirect and/or longer-term impact.
  • Some recommendations for strengthening our collective capacity to quantify, address and ultimately close excellence gaps.

The post is intended to synthesise, supplement and update earlier material, so providing a baseline for further analysis – and ultimately consideration of further national policy intervention, whether under the present Government or a subsequent administration.

It does not discuss the economic and social origins of educational disadvantage, or the merits of wider policy to eliminate poverty and strengthen social mobility.

It starts from the premiss that, while education reform cannot eliminate the effects of disadvantage, it can make a significant, positive contribution by improving the life chances of disadvantaged learners.

It does not debate the fundamental principle that, when prioritising educational support to improve the life chances of learners from disadvantaged backgrounds, governments should not discriminate on the basis of ability or prior attainment.

It assumes that optimal policies will deliver improvement for all disadvantaged learners, regardless of their starting point. It suggests, however, that intervention strategies should aim for equilibrium, prioritising gaps that are furthest away from it and taking account of several different variables in the process.

 

A working definition for the English context

The literature in Part two reveals that there is no accepted universal definition of excellence gaps, so I have developed my own England-specific working definition for the purposes of this post.

An excellence gap is:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

This demands further clarification of what typically constitutes a disadvantaged learner and a threshold of high achievement.

In the English context, the measures of disadvantage with the most currency are FSM eligibility (eligible for and receiving free school meals) and eligibility for the deprivation element of the pupil premium (eligible for and receiving FSM at some point in the preceding six years – often called ‘ever 6’).

Throughout this post, for the sake of clarity, I have given priority to the former over the latter, except where the former is not available.

The foregrounded characteristic is socio-economic disadvantage, but this does not preclude analysis of the differential achievement of sub-groups defined according to secondary characteristics including gender, ethnic background and learning English as an additional language (EAL) – as well as multiple combinations of these.

Some research is focused on ‘socio-economic gradients’, which show how gaps vary at different points of the achievement distribution on a given assessment.

The appropriate thresholds of high achievement are most likely to be measured through national assessments of pupil attainment, notably end of KS2 tests (typically Year 6, age 11), GCSE and equivalent examinations (typically Year 11, age 16) and A level and equivalent examinations (typically Year 13, age 18).

Alternative thresholds of high achievement may be derived from international assessments, such as PISA, TIMSS or PIRLS.

Occasionally – and especially in the case of these international studies – an achievement threshold is statistically derived, in the form of a percentile range of performance, rather than with reference to a particular grade, level or score. I have not allowed for this within the working definition.

Progress measures typically relate to the distance travelled between: baseline assessment (currently at the end of KS1 – Year 2, age 7 – but scheduled to move to Year R, age 4) and end of KS2 tests; or between KS2 tests and the end of KS4 (GCSE); or between GCSE and the end of KS5 (Level 3/A level).

Some studies extend the concept of progress between two thresholds to a longitudinal approach that traces how disadvantaged learners who achieve a particular threshold perform throughout their school careers – do they sustain early success, or fall away, and what proportion are ‘late bloomers’?
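Put in simple notation – a sketch of the convention used in the rest of this post, not an extension of the definition itself – the excellence gap at a given high-achievement threshold $T$ is:

$$\text{gap}(T) = p_{\text{other}}(T) - p_{\text{dis}}(T)$$

where $p_{\text{dis}}(T)$ is the percentage of disadvantaged learners reaching threshold $T$ (or making the requisite progress between two thresholds) and $p_{\text{other}}(T)$ is the corresponding percentage of all other eligible learners. I have written the other-learner percentage first simply so that the gaps quoted later in the post come out as positive percentage points.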

 

Why are excellence gaps important?

Excellence gaps are important for two different sets of reasons: those applying to all achievement gaps and those which apply more specifically or substantively to excellence gaps.

Under the first heading:

  • The goal of education should be to provide all learners, including disadvantaged learners, with the opportunity to maximise their educational potential, so eliminating ‘the soft bigotry of low expectations’.
  • Schools should be ‘engines of social mobility’, helping disadvantaged learners to overcome their backgrounds and compete equally with their more advantaged peers.
  • International comparisons studies reveal that the most successful education systems can and do raise attainment for all and close socio-economic achievement gaps simultaneously.
  • There is a strong economic case for reducing – and ideally eradicating – underachievement attributable to disadvantage.

Under the second heading:

  • An exclusive or predominant focus on gaps at the lower end of the attainment distribution is fundamentally inequitable and tends to reinforce the ‘soft bigotry of low expectations’.
  • Disadvantaged learners benefit from successful role models – predecessors or peers from a similar background who have achieved highly and are reaping the benefits.
  • An economic imperative to increase the supply of highly-skilled labour will place greater emphasis on the top end of the achievement distribution. Some argue that there is a ‘smart fraction’ tying national economic growth to a country’s stock of high achievers. There may be additional spin-off benefits from increasing the supply of scientists, writers, artists, or even politicians!
  • The most highly educated disadvantaged learners are least likely to confer disadvantage on their children, so improving the proportion of such learners may tend to improve inter-generational social mobility.

Excellence gaps are rarely identified as such – the term is not yet in common usage in UK education, though it has greater currency in the US. Regardless of terminology, they seldom receive attention, either as part of a wider set of achievement gaps or separately in their own right.

 

Relationship with earlier posts

Since this blog was founded in April 2010 I have written extensively about excellence gaps and how to address them.

The most pertinent of my previous posts are:

I have also written about excellence gaps in New Zealand – Part 1 and Part 2 (June 2012) – but do not draw on that material here.

Gifted education (or apply your alternative term) is amongst those education policy areas most strongly influenced by political and ideological views on the preferred balance between excellence and equity. This is particularly true of decisions about how best to address excellence gaps.

The excellence-equity trade-off was identified in my first post (May 2010) as one of three fundamental polarities that determine the nature of gifted education and provide the basis for most discussion about what form it should take.

The Gifted Phoenix Manifesto for Gifted Education (March 2013) highlighted its significance thus:

‘Gifted education is about balancing excellence and equity. That means raising standards for all while also raising standards faster for those from disadvantaged backgrounds.

Through combined support for excellence and equity we can significantly increase our national stock of high level human capital and so improve economic growth…

…Excellence in gifted education is about maximising the proportion of high achievers reaching advanced international benchmarks (eg PISA, TIMSS and PIRLS) so increasing the ‘smart fraction’ which contributes to economic growth.

Equity in gifted education is about narrowing (and ideally eliminating) the excellence gap between high achievers from advantaged and disadvantaged backgrounds (which may be attributable in part to causes other than poverty). This also increases the proportion of high achievers, so building the ‘smart fraction’ and contributing to economic growth.’

More recently, one of the 10 draft core principles I set out in ‘Why Can’t We Have National Consensus on Educating High Attainers?’ (June 2014) said:

‘We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.’

 

The model provisionally developed

Using my working definition as a starting point, this section describes a theoretical model showing how excellence and equity are brought to bear when considering excellence gaps – and then how best to address them.

This should be applicable at any level, from a single school to a national education system and all points in between.

The model depends on securing the optimal balance between excellence and equity where:

  • Excellence is focused on increasing the proportion of all learners who achieve highly and, where necessary, increasing the pitch of high achievement thresholds to remove unhelpful ceiling effects. The thresholds in question may be nationally or internationally determined and are most likely to register high attainment through a formal assessment process. (This may be extended so there is complementary emphasis on increasing the proportion of high-achieving learners who make sufficiently strong progress between two different age- or stage-related thresholds.)
  • Equity is focused on increasing the proportion of high-achieving disadvantaged learners (and/or the proportion of disadvantaged learners making sufficiently strong progress) at a comparatively faster rate, so they form a progressively larger proportion of the overall high-achieving population, up to the point of equilibrium, where advantaged and disadvantaged learners are equally likely to achieve the relevant thresholds (and/or progress measure). This must be secured without deliberately repressing improvement amongst advantaged learners – ie by introducing policies designed explicitly to limit their achievement and/or progress relative to disadvantaged learners – but a decision to do nothing or to redistribute resources in favour of disadvantage is entirely permissible.

The optimal policy response will depend on the starting position and the progress achieved over time.

If excellence gaps are widening, the model suggests that interventions and resources should be concentrated in favour of equity. Policies should be reviewed and adjusted, or strengthened where necessary, to meet the desired objectives.

If excellence gaps are widening rapidly, this reallocation and adjustment process will be relatively more substantial (and probably more urgent) than if they are widening more slowly.

Slowly widening gaps will demand more reallocation and adjustment than a situation where gaps are stubbornly resistant to improvement, or else closing too slowly. But even in the latter case there should be some reallocation and adjustment until equilibrium is achieved.

When excellence gaps are already closing rapidly – and there are no overt policies in place to deliberately repress improvement amongst high-achieving advantaged learners – it may be that unintended pressures in the system are inadvertently bringing this about. In that case, policy and resources should be adjusted to correct these pressures and so restore the correct twin-speed improvement.

The aim is to achieve and sustain equilibrium, even beyond the point when excellence gaps are eliminated, so that they are not permitted to reappear.

If ‘reverse gaps’ begin to materialise, where disadvantaged learners consistently outperform their more advantaged peers, this also threatens equilibrium and would suggest a proportionate redistribution of effort towards excellence.

Such scenarios are most likely to occur in settings where there is a large proportion of learners who, while not disadvantaged according to the ‘cliff edge’ definition required to make the distinction, are still relatively disadvantaged.

Close attention must therefore be paid to the distribution of achievement across the full spectrum of disadvantage, to ensure that success at the extreme of the distribution does not mask significant underachievement elsewhere.
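To make the coarse logic above concrete, here is a toy sketch in Python. It is my own illustration rather than part of the model as stated, and the numeric cut-offs standing in for ‘rapidly’ and ‘slowly’ are arbitrary placeholders, since the model deliberately leaves them undefined.

def suggested_rebalancing(gap_pp: float, annual_change_pp: float) -> str:
    """Broad direction of reallocation implied by the model for a given excellence gap.

    gap_pp           -- current gap in percentage points (other learners minus disadvantaged)
    annual_change_pp -- recent annual change in the gap (positive = widening)
    """
    if gap_pp < 0:
        return "reverse gap: redistribute effort proportionately towards excellence"
    if gap_pp == 0:
        return "equilibrium: sustain the balance so the gap cannot reappear"
    if annual_change_pp >= 1.0:     # arbitrary placeholder for 'widening rapidly'
        return "substantial and urgent reallocation towards equity"
    if annual_change_pp > 0:        # widening more slowly
        return "significant reallocation towards equity"
    if annual_change_pp >= -0.5:    # static, or closing too slowly
        return "some continued reallocation towards equity until equilibrium is reached"
    return "closing rapidly: check for unintended pressures and restore twin-speed improvement"


# Example: a gap of roughly 26.5 points that has recently widened by 0.3 points a year
# (the KS4 English and maths FSM gap discussed later in this post).
print(suggested_rebalancing(26.5, 0.3))

The point is not the particular cut-offs but the shape of the decision: every state short of equilibrium implies some adjustment, and the direction of redistribution only reverses once a reverse gap appears.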

One should be able to determine a more precise policy response by considering a restricted set of variables. These include:

  • The size of the gaps at the start of the process and, associated with this, the time limit allowed for equilibrium to be reached. Clearly larger gaps are more likely to take longer to close. Policy makers may conclude that steady improvement over several years is more manageable for the system than a rapid sprint towards equilibrium. On the other hand, there may be benefits associated with pace and momentum.
  • The rate at which overall high achievement is improving. If this is relatively fast, the rate of improvement amongst advantaged high achievers will be correspondingly strong, so the rate for disadvantaged high achievers must be stronger still.
  • The variance between excellence gaps at different ages/stages. If the gaps are larger at particular stages of education, the pursuit of equilibrium suggests disproportionate attention is given to those, so that gaps are closed consistently. If excellence gaps are small for relatively young learners and increase with age, priority should be given to the latter, but there may be other factors in play, such as evidence that closing relatively small gaps at an early stage will have a more substantial ‘knock-on’ effect later on.
  • The level at which high achievement thresholds are pitched. Obviously this will influence the size of the gaps that need to be closed. But, other things being equal, enabling a higher proportion of learners to achieve a relatively high threshold will demand more intensive support. On the other hand, relatively fewer learners – whether advantaged or disadvantaged – are likely to be successful. Does one need to move a few learners a big distance or a larger proportion a smaller one? (A rough expression at the end of this list puts some notation on this trade-off.)
  • Whether or not gaps at lower achievement thresholds are smaller and/or closing at a faster rate. If so, there is a strong case for securing parity of progress at higher and lower thresholds alike. On the other hand, if excellence gaps are closing more quickly, it may be appropriate to reallocate resources away from them and towards lower levels of achievement.
  • The relative size of the overall disadvantaged population, the associated economic gap between advantage and disadvantage and (as suggested above) the distribution in relation to the cut-off. If the definition of disadvantage is pitched relatively low (ie somewhat disadvantaged), the disadvantaged population will be correspondingly large, but the economic gap between advantage and disadvantage will be relatively small. If the definition is pitched relatively high (ie very disadvantaged) the reverse will be true, giving a comparatively small disadvantaged population but a larger gap between advantage and disadvantage.
  • The proportion of the disadvantaged population that is realistically within reach of the specified high achievement benchmarks. This variable is a matter of educational philosophy. There is merit in an inclusive approach – indeed it seems preferable to overestimate this proportion than the reverse. Extreme care should be taken not to discourage late developers or close off opportunities on the basis of comparatively low current attainment, so reinforcing existing gaps through unhelpfully low expectations. On the other hand, supporting unrealistically high expectations may be equally damaging and ultimately waste scarce resources. There may be more evidence to support such distinctions with older learners than with their younger peers. 
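As trailed in the bullet on threshold pitch, a rough expression helps frame the trade-off between where the threshold sits and how many learners are involved. This is my own back-of-envelope framing, and it holds the advantaged rate fixed even though the model expects that rate to keep rising:

$$\Delta N_{\text{dis}} \approx N_{\text{dis}} \times \bigl(p_{\text{adv}} - p_{\text{dis}}\bigr)$$

where $\Delta N_{\text{dis}}$ is the additional number of disadvantaged learners who must reach the threshold before equilibrium is achieved, $N_{\text{dis}}$ is the size of the disadvantaged cohort, and $p_{\text{adv}}$ and $p_{\text{dis}}$ are the proportions of each group currently reaching it. Pitching the threshold higher shrinks both proportions, so fewer learners need to be moved but each must travel further; pitching the definition of disadvantage wider increases $N_{\text{dis}}$ while narrowing the average economic distance between the two groups.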

 

How big are England’s headline attainment gaps and how fast are they closing?

Closing socio-economic achievement gaps has been central to English educational policy for the last two decades, including under the current Coalition Government and its Labour predecessor.

It will remain an important priority for the next Government, regardless of the outcome of the 2015 General Election.

The present Government cites ‘Raising the achievement of disadvantaged children’ as one of ten schools policies it is pursuing.

The policy description sets out the issue thus:

‘Children from disadvantaged backgrounds are far less likely to get good GCSE results. Attainment statistics published in January 2014 show that in 2013 37.9% of pupils who qualified for free school meals got 5 GCSEs, including English and mathematics at A* to C, compared with 64.6% of pupils who do not qualify.

We believe it is unacceptable for children’s success to be determined by their social circumstances. We intend to raise levels of achievement for all disadvantaged pupils and to close the gap between disadvantaged children and their peers.’

The DfE’s input and impact indicators  – showing progress against the priorities set out in its business plan – do not feature the measure mentioned in the policy description (which is actually five or more GCSEs at Grades A*-C or equivalents, including GCSEs in English and maths).

The gap on this measure was 27.7% in 2009, improving to 26.7% in 2013, so there has been a small 1.0 percentage point improvement over five years, spanning the last half of the previous Government’s term in office and the first half of this Government’s term.

Instead the impact indicators include three narrower measures focused on closing the attainment gap between free school meal pupils and their peers, at 11, 16 and 19 respectively:

  • Impact Indicator 7 compares the percentages of FSM-eligible and all other pupils achieving Level 4 or above in KS2 assessments of reading, writing and maths. The 2013 gap is 18.7 percentage points, down 0.4 points from 19.1 in 2012.
  • Impact Indicator 8 compares the percentages of FSM-eligible and all other pupils achieving A*-C grades in GCSE maths and English. The 2013 gap is 26.5 percentage points, up 0.3 points from 26.2 in 2012.
  • Impact Indicator 9 compares the percentages of learners who were FSM-eligible at age 15 and all other learners who attain a Level 3 qualification by the end of the academic year in which they are 19. The 2013 gap is 24.3 percentage points, up 0.1 points from 24.2 in 2012.

These small changes, not always pointing in the right direction, reflect the longer term narrative, as is evident from the Government’s Social Mobility Indicators which also incorporate these three measures.

  • In 2005-06 the KS2 L4 maths and English gap was 25.0%, so there has been a fairly substantial 6.3 percentage point reduction over seven years, but only about one quarter of the gap has been closed.
  • In 2007-08 the KS4 GCSE maths and English gap was 28.0%, so there has been a minimal 1.5 percentage point reduction over six years, equivalent to national progress of 0.25 percentage points per year. At that rate it will take another century to complete the process (the worked arithmetic appears below this list).
  • In 2004-05 the Level 3 qualification gap was 26.4%, so there has been a similarly modest 2.1 percentage point reduction over eight years.
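The ‘century’ in the second bullet follows directly from the figures quoted – a back-of-envelope projection that simply assumes the recent rate of progress continues unchanged:

$$\frac{28.0 - 26.5}{6 \text{ years}} = 0.25 \text{ points per year}, \qquad \frac{26.5 \text{ points}}{0.25 \text{ points per year}} \approx 106 \text{ years}$$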

The DfE impact indicators also include a set of three destination measures that track the percentage of FSM learners progressing to Oxford and Cambridge, any Russell Group university and any university.

There is a significant time lag with all of these – the most recent available data relates to 2011/2012 – and only two years of data have been collected.

All show an upward trend. Oxbridge is up from 0.1% to 0.2%, Russell Group up from 3% to 4% and any university up from 45% to 47% – actually a 2.5 percentage point improvement.

The Oxbridge numbers are so small that a percentage measure is a rather misleading indicator of marginal improvement from a desperately low base.

It is important to note that forthcoming changes to the assessment regime will impose a different set of headline indicators at ages 11 and 16 that will not be comparable with these.

From 2014, methodological adjustments to the School Performance Tables significantly restrict the range of qualifications counted as equivalent to GCSEs. In addition, only the first entry in each subject will count for Performance Table purposes, applying to English Baccalaureate subjects in 2014 and to all subjects in 2015.

Both these factors will tend to depress overall results and may be expected to widen attainment gaps on the headline KS4 measure as well as the oft-cited 5+ GCSEs measure.

From 2016 new baseline assessments, the introduction of scaled scores at the end of KS2 and a new GCSE grading system will add a further layer of change.

As a consequence there will be substantial revisions to the headline measures in Primary, Secondary and Post-16 Performance Tables. The latter will include destination measures, provided they can be made methodologically sound.

At the time of writing, the Government has made negligible reference to the impact of these reforms on national measures of progress, including its own Impact Indicators and the parallel Social Mobility indicators, though the latter are reportedly under review.

 

Published data on English excellence gaps

The following sections summarise what data I can find in the public domain about excellence gaps at primary (KS2), secondary (KS4) and post-16 (KS5) respectively.

I have cited the most recent data derivable from Government statistical releases and performance tables, supplemented by other interesting findings gleaned from research and commentary.

 

Primary (KS2) 

The most recent national data is contained in SFR51/2013: National Curriculum Assessments at Key Stage 2: 2012 to 2013. This provides limited information about the differential performance of learners eligible for and receiving FSM (which I have referred to as ‘FSM’), and for those known to be eligible for FSM at any point from Years 1 to 6 (known as ‘ever 6’ and describing those in receipt of the Pupil Premium on grounds of deprivation).

There is additional information in the 2013 Primary School Performance Tables, where the term ‘disadvantaged’ is used to describe ‘ever 6’ learners and ‘children looked after’.

There is comparatively little variation between these different sets of figures at national level. In the analysis below (and in the subsequent section on KS4) I have used FSM data wherever possible, but have substituted ‘disadvantaged’ data where FSM is not available. All figures apply to state-funded schools only.

I have used Level 5 and above as the best available proxy for high attainment. Some Level 6 data is available, but in percentages only, and these are all so small that comparisons are misleading.

The Performance Tables distinguish a subset of high attainers, on the basis of prior attainment (at KS1 for KS2 and at KS2 for KS4) but no information is provided about the differential performance of advantaged and disadvantaged high attainers.

In 2013:

  • 21% of all pupils achieved Level 5 or above in reading, writing and maths combined, but only 10% of FSM pupils did so, compared with 26% of others, giving an attainment gap of 16%. The comparable gap at Level 4B (in reading and maths and L4 in writing) was 18%. At Level 4 (across the board) it was 20%. In this case, the gaps are slightly larger at lower attainment levels but, whereas the L4 gap has narrowed by one percentage point since 2012, the L5 gap has widened by one percentage point.
  • In reading, 44% of all pupils achieved Level 5 and above, but only 21% of FSM pupils did so, compared with 48% of others, giving an attainment gap of 21%. The comparable gap at Level 4 and above was eight percentage points lower at 13%.
  • In writing (teacher assessment), 31% of all pupils achieved level 5 and above, but only 15% of FSM pupils did so, compared with 34% of others, giving an attainment gap of 19%. The comparable gap at Level 4 and above was three percentage points lower at 16%.
  • In grammar, punctuation and spelling (GPS), 47% of all pupils achieved Level 5 and above, but only 31% of FSM pupils did so, compared with 51% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above was two percentage points lower at 18%.
  • In maths, 41% of pupils in state-funded schools achieved Level 5 and above, up two percentage points on 2012. But only 24% of FSM pupils achieved this compared with 44% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above is 13%.

Chart 1 shows these outcomes graphically. In four cases out of five, the gap at the higher attainment level is greater, substantially so in reading and maths. All the Level 5 gaps fall between 16% and 20%.

 


Chart 1: Percentage point gaps between FSM and all other pupils’ attainment at KS2 L4 and above and KS2 L5 and above, 2013 

 

It is difficult to trace reliably the progress made in reducing these gaps in English, since the measures have changed frequently. There has been more stability in maths, however, and the data reveals that – whereas the FSM gap at Level 4 and above has reduced by 5 percentage points since 2008 (from 18 points to 13 points) – the FSM gap at Level 5 and above has remained between 19 and 20 points throughout. Hence the gap between L4+ and L5+ on this measure has increased in the last five years.

There is relatively little published about KS2 excellence gaps elsewhere, though one older Government publication, a DfES Statistical Bulletin: The characteristics of high attainers (2007) offers a small insight.

It defines KS2 high attainers as the top 10%, on the basis of finely grained average points scores across English, maths and science, so a more selective but wider-ranging definition than any of the descriptors of Level 5 performance above.

According to this measure, some 2.7% of FSM-eligible pupils were high attainers in 2006, compared with 11.6% of non-FSM pupils, giving a gap of 8.9 percentage points.

The Bulletin supplies further analysis of this population of high attainers, summarised in the table reproduced below.

 

[Table reproduced from The characteristics of high attainers (2007), summarising further analysis of the KS2 high attainer population]

  

Secondary (KS4) 

While Government statistical releases provide at least limited data about FSM performance at high levels in end of KS2 assessments, this is entirely absent from KS4 data, because there is no information about the achievement of GCSE grades above C, whether for single subjects or combinations.

The most recent publication: SFR05/2014: GCSE and equivalent attainment by pupil characteristics, offers a multitude of measures based on Grades G and above or C and above, many of which are set out in Chart 2, which illustrates the FSM gap on each, organised in order from the smallest gap to the biggest.

(The gap cited here for A*-C grades in English and maths GCSEs is very slightly different to the figure in the impact indicator.)

 


Chart 2: Percentage point gaps between FSM and all other pupils’ attainment on different KS4 measures, 2013

 

In its State of the Nation Report 2013, the Social Mobility and Child Poverty Commission included a table comparing regional performance on a significantly more demanding ‘8+ GCSEs excluding equivalents and including English and maths’ measure. This uses ‘ever 6’ rather than FSM as the indicator of disadvantage.

The relevant table is reproduced below. It shows regional gaps of between 20 and 26 percentage points on the tougher measure, so a similar order of magnitude to the national indicators at the top end of Chart 2.

 

[Table reproduced from the SMCPC State of the Nation Report 2013, showing regional gaps on this measure]

 

Comparing the two measures, one can see that:

  • The percentages of ‘ever 6’ learners achieving the more demanding measure are very much lower than the comparable percentages achieving the 5+ GCSEs measure, but the same is also true of their more advantaged peers.
  • Consequently, in every region but London and the West Midlands, the attainment gap is actually larger for the less demanding measure.
  • In London, the gaps are much closer, at 19.1 percentage points on the 5+ measure and 20.9 percentage points on the 8+ measure. In the West Midlands, the gap on the 8+ measure is larger by five percentage points. In all other cases, the difference is at least six percentage points in the other direction.

We do not really understand the reasons why London and the West Midlands are atypical in this respect.

The Characteristics of High Attainers (2007) provides a comparable analysis for KS4 to that already referenced at KS2. In this case, the top 10% of high attainers is derived on the basis of capped GCSE scores.

This gives a gap of 8.8 percentage points between the proportion of non-FSM (11.2%) and FSM (2.4%) students within the defined population, very similar to the parallel calculation at KS2.
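Another way of reading these two findings – my own arithmetic, not offered in the Bulletin – is as a relative representation ratio:

$$\text{KS2: } \frac{11.6}{2.7} \approx 4.3 \qquad \text{KS4: } \frac{11.2}{2.4} \approx 4.7$$

In other words, non-FSM pupils were roughly four to five times as likely as FSM-eligible pupils to appear in the top 10% at either key stage.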

Other variables within this population are set out in the table reproduced below.

 

[Table reproduced from The characteristics of high attainers (2007), summarising further analysis of the KS4 high attainer population]

Finally, miscellaneous data has also appeared from time to time in the answers to Parliamentary Questions. For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8 percentage points. By 2009 the comparable percentages were 1.7% and 9.0% respectively, resulting in an increased gap of 7.3 percentage points (Col 568W)
  • In 2006/07, the percentages of FSM-eligible pupils and of all pupils in maintained schools securing A*/A grades at GCSE in different subjects were as shown in the table below (Col 808W)
Subject      FSM    All pupils    Gap
Maths        3.7    15.6          11.9
Eng lit      4.1    20.0          15.9
Eng lang     3.5    16.4          12.9
Physics      2.2    49.0          46.8
Chemistry    2.5    48.4          45.9
Biology      2.5    46.8          44.3
French       3.5    22.9          19.4
German       2.8    23.2          20.4

Table 1: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)

 

Post-16 (KS5)

The most recent post-16 attainment data is provided in SFR10/2014: Level 2 and 3 attainment by young people aged 19 in 2013 and SFR02/14: A level and other level 3 results: academic year 2012 to 2013.

The latter contains a variety of high attainment measures – 3+ A*/A grades;  AAB grades or better; AAB grades or better with at least two in facilitating subjects;  AAB grades or better, all in facilitating subjects – yet none of them distinguish success rates for advantaged and disadvantaged learners.

The former does include a table which provides a time series of gaps for achievement of Level 3 at age 19 through 2 A levels or the International Baccalaureate. The measure of disadvantage is FSM-eligibility in Year 11. The gap was 22.0 percentage points in 2013, virtually unchanged from 22.7 percentage points in 2005.

In ‘(How) did New Labour narrow the achievement and participation gap’ (Whitty and Anders, 2014), the authors reproduce a chart from a DfE roundtable event held in March 2013 (on page 44).

This is designed to show how FSM gaps vary across key stages and also provides ‘odds ratios’ – the relative chances of FSM and other pupils achieving each measure. It relies on 2012 outcomes.

The quality of the reproduction is poor, but it seems to suggest that, using the AAB+ in at least two facilitating subjects measure, there is a five percentage point gap between FSM students and others (3% versus 8%), while the odds ratio indicates that the odds of non-FSM students achieving this outcome are 2.9 times those of FSM students.
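For readers unfamiliar with the statistic, the quoted figure can be reconstructed approximately from the rounded percentages (the chart presumably uses unrounded values, hence the small discrepancy):

$$\text{odds ratio} = \frac{0.08 / (1 - 0.08)}{0.03 / (1 - 0.03)} \approx \frac{0.087}{0.031} \approx 2.8$$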

Once again, occasional replies to Parliamentary Questions provide some supplementary information:

  • In 2007, 189 FSM-eligible students (3.7%) in maintained mainstream schools (so excluding sixth form colleges and FE colleges) achieved 3 A grades at A level. This compared with 13,467 other students (9.5%), giving a gap of 5.8 percentage points. (Parliamentary Question, 26 November 2008, Hansard (Col 1859W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. This compares with 14,431 (10.5%) of those not eligible for FSM, giving a gap of 7.0 percentage points. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are counted. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • Of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, 546 (4.1%) achieved 3 or more GCE A levels at A*-A compared with 22,353 other pupils (10.6%) so giving a gap of 6.5 percentage points. These figures include students in both the schools and FE sectors. (Parliamentary Question, 9 July 2012, Hansard (Col 35W)) 

 In September 2014, a DfE response to a Freedom of Information request provided some additional data about FSM gaps at A level over the period from 2009 to 2013. This is set out in the table below, which records the gaps between FSM and all other pupils, presumably for all schools and colleges, whether or not state-funded.

Apart from the atypical result for the top indicator in 2010, all these percentages fall in the range 6.0% to 10%, so are in line with the sources above.

 

Measure                                              2009    2010    2011    2012    2013
3+ grades at A*/A or applied single/double award      9.0    12.8     9.3     8.7     8.3
AAB+ grades in facilitating subjects                                           6.3     6.2
AAB+ grades, at least 2 in facilitating subjects                                       9.8

 

Additional evidence of Key Stage excellence gaps from a sample born in 1991

Progress made by high-achieving children from disadvantaged backgrounds (Crawford, Macmillan and Vignoles, 2014) provides useful data on the size of excellence gaps at different key stages, as well as analysis of whether disadvantaged high achievers remain so through their school careers.

The latter appears in Part two, but the first set of findings provides a useful supplement to the broad picture set out above.

This study is based on a sample of learners born in 1991/1992, so they would presumably have taken end of KS2 tests in 2002, GCSEs in 2007 and A levels in 2009. It includes all children who attended a state primary school, including those who subsequently attended an independent secondary school.

It utilises a variety of measures of disadvantage, including whether learners were always FSM-eligible (in Years 7-11), or ‘ever FSM’ during that period. This summary focuses on the distinction between ‘always FSM’ and ‘never FSM’.

It selects a basket of high attainment measures spread across the key stages, including:

  • At KS1, achieving Level 3 or above in reading and maths.
  • At KS2, achieving Level 5 or above in English and maths.
  • At KS4, achieving six or more GCSEs at grades A*-C in EBacc subjects (as well as five or more).
  • At KS5, achieving two or more (and three or more) A levels at grades A-B in any subjects.
  • Also at KS5, achieving two or more (and three or more) A levels at grades A-B in facilitating subjects.

The choice of measures at KS2 and KS5 is reasonable, reflecting the data available at the time. For example, one assumes that A* grades at A level do not feature in the KS5 measures since they were not introduced until 2010.

At KS4, the selection is rather more puzzling and idiosyncratic. It would have been preferable to have included at least one measure based on performance across a range of GCSEs at grades A*-B or A*/A.

The authors justify their decision on the basis that ‘there is no consensus on what is considered high attainment’, even though most commentators would expect this to reflect higher grade performance, while few are likely to define it solely in terms of breadth of study across a prescribed set of ‘mainstream’ subjects.

Outcomes for ‘always FSM’ and ‘never FSM’ on the eight measures listed above are presented in Chart 3.


Chart 3: Achievement of ‘always FSM’ and ‘never FSM’ on a basket of high attainment measures for pupils born in 1991/92

 

This reveals gaps of 12 to 13 percentage points at Key Stages 1 and 2, somewhat smaller than several of those described above.

It is particularly notable that the 2013 gap for KS2 L5 reading, writing and maths is 16 percentage points, whereas the almost comparable 2002 (?) gap for KS2 English and maths amongst this sample is 13.5%. Even allowing for comparability issues, there may be tentative evidence here to suggest widening excellence gaps at KS2 over the last decade.

The KS4 gaps are significantly larger than those existing at KS1/2, at 27 and 18 percentage points respectively. But comparison with the previous evidence reinforces the point that the size of the gaps in this sample is attributable to subject mix: this must be the case since the grade expectation is no higher than C.

The data for A*/A performance on five or more GCSEs set out above, which does not insist on coverage of EBacc subjects other than English and maths, suggests a gap of around seven percentage points. But it also demonstrates big gaps – again at A*/A – for achievement in single subjects, especially the separate sciences.

The KS5 gaps on this sample range from 2.5 to 13 percentage points. We cited data above suggesting a five percentage point gap in 2012 for AAB+, at least two in facilitating subjects. These findings do not seem wildly out of kilter with that, or with the evidence of gaps of around six to seven percentage points for AAA grades or higher.

 

Overall pattern 

The published data provides a beguiling glimpse of the size of excellence gaps and how they compare with FSM gaps on the key national benchmarks.

But discerning the pattern is like trying to understand the picture on a jigsaw when the majority of pieces are missing.

The received wisdom is captured in the observation by Whitty and Anders that:

‘Even though the attainment gap in schools has narrowed overall, it is largest for the elite measures’

and the SMCPC’s comment that:

‘…the system is better at lifting children eligible for FSM above a basic competence level (getting 5A*–C) than getting them above a tougher level of attainment likely to secure access to top universities.’

This seems broadly true, but the detailed picture is rather more complicated.

  • At KS2 there are gaps at L5 and above of around 16-20 percentage points, the majority higher than the comparable gaps at L4. But the gaps for core subjects combined are smaller than for each assessment. There is tentative evidence that the former may be widening.
  • At KS4 there are very significant differences between results in individual subjects. When it comes to multi-subject indicators, differences in the choice of subject mix – as well as choice of grade – make it extremely difficult to draw even the most tentative conclusions about the size of excellence gaps and how they relate to benchmark-related gaps at KS4 and excellence gaps at KS2.
  • At KS5, the limited evidence suggests that A level excellence gaps at the highest grades are broadly similar to those at GCSE A*/A. If anything, gaps seem to narrow slightly compared with KS4. But the confusion over KS4 measures makes this impossible to verify.

We desperately need access to a more complete dataset so we can understand these relationships more clearly.

This is the end of Part one. In Part two, we move on to consider evidence about whether high attainers remain so, before examining international comparisons data and related research, followed by excellence gaps analysis from the USA.

Part two concludes with a short review of how present government policy impacts on excellence gaps and some recommendations for strengthening the present arrangements.

 

GP

September 2014