The most able students: Has Ofsted made progress?


This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.


Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour’s position, restating its commitment to a Gifted and Talented Fund, is:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’


  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is  well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breadth they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response, from Conservatives or Liberal Democrats, to the challenge laid down by Labour, which has decided that some degree of arms-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas over common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions


Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

They fail to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
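The arithmetic is just the union of two overlapping groups. Here is a minimal sketch of the calculation – the percentages are the SFR figures quoted above, and the inclusion-exclusion step is the only assumption about how they combine:

```python
# Union of the two KS2 Level 5 groups in the 2014 Year 7 cohort,
# via inclusion-exclusion: English-or-maths = English + maths - both
l5_english = 29  # % at L5 in reading and writing
l5_maths = 44    # % at L5 in maths
l5_both = 24     # % at L5 in reading, writing and maths

most_able_share = l5_english + l5_maths - l5_both
print(most_able_share)  # 49 -> 49% of this year's Year 7 cohort
```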

[Venn diagram: overlap between KS2 Level 5 in reading and writing and Level 5 in maths, 2014 Year 7 cohort]

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, ie once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

Subtracting the overlap (those with Level 5 in both subjects) to avoid double-counting, this gives a total population of 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
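The same union step, applied to the Annex A pupil counts, reproduces both the 212,185 total and the 39% share. A minimal sketch, using the 2011 test-taker total as a proxy denominator as explained above:

```python
# Annex A counts for the 2014 GCSE cohort in non-selective schools
l5_maths = 165_340
l5_english = 138_789   # reading and writing
l5_both = 91_944

most_able = l5_maths + l5_english - l5_both   # overlap removed: 212,185
ks2_test_takers = 547_025                     # 2011 figure, standing in for 2009

print(most_able)                                    # 212185
print(round(100 * most_able / ks2_test_takers, 1))  # 38.8 -> 'some 39%'
```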


Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

[Graph: percentage of high attainers within each state-funded non-selective secondary school’s GCSE cohort, 2014 (Performance Tables measure)]

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained a L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained a L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils in each of these categories within the relevant cohort. (It is of course a different 9% in each case.)
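As a rough cross-check, one can back out the number of schools this implies. The sketch below infers the total from Ofsted’s count and the 9% estimate, so the resulting figure is derived rather than taken from the Performance Tables:

```python
# If 2,869 schools had at least one pupil with KS2 L5 English, and that is
# roughly 91% of all state-funded non-selective secondary schools...
schools_with_l5_english = 2_869
share_with_at_least_one = 0.91   # i.e. roughly 9% had none

implied_total = schools_with_l5_english / share_with_at_least_one
print(round(implied_total))      # ~3153 schools (an inferred figure, not a published count)
```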

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students, indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
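A minimal sketch of that estimate – the assumption, as stated above, is that the 13.2% share observed for the ‘both subjects’ group also holds for the English-only and maths-only groups:

```python
# Share of disadvantaged ('ever 6 FSM') students among those with L5 in both subjects
disadvantaged_both = 12_150
all_both = 91_944
share = disadvantaged_both / all_both    # ~0.132

# Apply that share to the whole 'most able' population derived earlier
most_able_total = 212_185
estimate = share * most_able_total
print(round(estimate))                   # ~28,000 disadvantaged most able students
```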

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

[Charts: GCSE grades in English and in mathematics achieved by students with KS2 Level 5, non-selective schools, 2014]

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

Approximately one in four high attainers is not achieving each of these progression targets, even though they are not particularly demanding.

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.


The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sublevels, amply evidenced by the transition matrices.

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.


Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).

A table is supplied showing progression by sub-level in English and maths separately.

[Table: progression to GCSE by KS2 sub-level in English and in maths, ‘ever 6 FSM’ versus other students]

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

[Chart: GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment]

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.


Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.


Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools’

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s Evidence base – inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

[Image: the six survey questions]

  • Supplementary questions asked during 130 Section 5 inspections, focused on how well the most able students are maintaining their progress in KS3, plus challenge and availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly onto the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged was small. Consequently leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.


Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and 5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that work was too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.


Assessment and tracking

‘Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tends to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes as a consequence. Only in eight schools were able students’ views sought as a cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable them to reach their potential. Weaknesses at KS3 meant there was too much to catch up at KS4 and 5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.


Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’


School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.


Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.’

Too often poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared a factor in stifled progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 schools asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: “There is confusion (in schools) between excellence and elitism”.’

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.


Ofsted and other Central Government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’


Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, too often, the most able students are succeeding despite their schools, rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question of whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It is as if the ‘school-led, self-improving’ ideal is already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

[Tables: Ofsted’s recommendations – seven directed at school leaders, three at Ofsted and one at the DfE]

Those Ofsted directs at itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it is possible that it might do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that: 

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 


If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is this: why should schools be any more likely to respond this time round than they were in 2013?

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take. Until we can establish the framework necessary to secure universally high standards across all schools, without resorting to national prescription, we – and Ofsted – are whistling in the wind.

GP

March 2015

Addressed to Teach First and its Fair Education Alliance


This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

Betweentimes the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version, which they said would be published on 27 February. It never appeared.

This tweet

[Embedded tweet]

prompted an undertaking that it would appear on 27 March. I’ll believe that when I see it.

But there’s no reason why you should wait any longer. This version is more comprehensive anyway, in that it includes several relevant Twitter comments and additional explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

As the final sequence of Tweets below shows, Teach First committed to an online response on 14 February. Still waiting…


How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied by BIS say that the number of FSM-eligible students progressing to Oxbridge fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE’s  ‘FSM to Oxbridge’ impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.


[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).


The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]
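That deterioration can be verified from the rounded figures quoted above, treating the 2,420 total for 2011/12 as the non-FSM comparator (which is what the 6% increase implies). Given the rounding to the nearest 10, the result is indicative only:

```python
# Progression to Oxbridge from state-funded schools and colleges
# (all figures rounded to the nearest 10 in the published destination measures)
fsm_2011_12, non_fsm_2011_12 = 50, 2_420
fsm_2012_13, non_fsm_2012_13 = 50, 2_560

print(round(non_fsm_2011_12 / fsm_2011_12, 1))  # 48.4 non-FSM students per FSM student
print(round(non_fsm_2012_13 / fsm_2012_13, 1))  # 51.2 -> the ratio has worsened
```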


The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted

[Embedded tweet]

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are  “currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals”.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.


Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report  says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.


Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.


Perhaps the Alliance should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome.
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.


*In the Report card, this impact goal is differently expressed, as narrowing the gap in university graduation, so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.

.

.

GP

January 2015

How Well Do Grammar Schools Perform With Disadvantaged Students?

This supplement to my previous post on The Politics of Selection  compares the performance of disadvantaged learners in different grammar schools.

It adds a further dimension to the evidence base set out in my earlier post, intended to inform debate about the potential value of grammar schools as engines of social mobility.

The commentary is based on the spreadsheet embedded below, which relies entirely on data drawn from the 2013 Secondary School Performance Tables.

.

.

If you find any transcription errors please alert me and I will correct them.

.

Preliminary Notes

The 2013 Performance Tables define disadvantaged learners as those eligible for free school meals in the last six years and children in care. Hence both these categories are caught by the figures in my spreadsheet.

Because the number of disadvantaged pupils attending grammar schools is typically very low, I have used the three year average figures contained in the ‘Closing the Gap’ section of the Tables.

These are therefore the numbers of disadvantaged students in each school’s end of KS4 cohorts for 2011, 2012 and 2013 combined. They should illustrate the impact of pupil premium support and wider closing the gap strategies on grammar schools since the Coalition government came to power.

Even when using three year averages the data is frustratingly incomplete, since 13 of the 163 grammar schools have so few disadvantaged students – fewer than six across all three cohorts combined – that the results are suppressed. We have no information at all about how well or how badly these schools are performing in terms of closing gaps.

My analysis uses each of the three performance measures within this section of the Performance Tables:

  • The percentage of pupils at the end of KS4 achieving five or more GCSEs (or equivalents) at grades A*-C, including GCSEs in English and maths. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in English. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in maths.

In each case I have recorded the percentage of disadvantaged learners who achieve the measure and the percentage point gap between that and the corresponding figure for ‘other’ – ie non-disadvantaged – students.
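For transparency, this is essentially the calculation behind each row of the spreadsheet, sketched with invented figures; the field names are mine, not the DfE’s column headings, and the suppression rule is the one described above.

```python
# A minimal sketch of the calculations behind the spreadsheet, using invented
# figures; the field names below are my own, not the DfE's column headings.
school = {
    "disadvantaged_cohort": 22,     # 2011-13 end of KS4 cohorts combined
    "disadvantaged_achieving": 19,  # e.g. 5+ A*-C including English and maths
    "other_cohort": 540,
    "other_achieving": 530,
}

if school["disadvantaged_cohort"] < 6:
    print("Suppressed: fewer than six disadvantaged pupils across the three cohorts")
else:
    dis = 100 * school["disadvantaged_achieving"] / school["disadvantaged_cohort"]
    oth = 100 * school["other_achieving"] / school["other_cohort"]
    gap = dis - oth   # positive when disadvantaged pupils do better (a 'positive gap')
    print(f"Disadvantaged {dis:.0f}%, other {oth:.0f}%, gap {gap:.0f} percentage points")
```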

For comparison I have also included the corresponding percentages for all disadvantaged pupils in all state-funded schools and for all high attainers in state-funded schools. The latter is for 2013 only rather than a three-year average.

Unfortunately the Tables do not provide data for high attaining disadvantaged students. The vast majority of disadvantaged students attending grammar schools will be high-attaining according to the definition used in the Tables (average points score of 30 or higher across KS2 English, maths and science).

But, as my previous post showed, in some grammar schools high attainers – disadvantaged or otherwise – account for 70% or less of the intake. These include: Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent).

Some of these schools feature in the analysis below, while some do not, suggesting that the correlation between selectivity and the performance of disadvantaged students is not straightforward.

.

Number of disadvantaged learners in each school

The following schools are those with suppressed results, placed in order according to the number of disadvantaged learners within scope, from lowest to highest:

  • Tonbridge Grammar School, Kent (2)
  • Bishop Wordsworth’s Grammar School, Wiltshire (3)
  • Caistor Grammar School, Lincolnshire (3)
  • Sir William Borlase’s Grammar School, Buckinghamshire (3)
  • Adams’ Grammar School, Telford and Wrekin (4)
  • Chelmsford County High School for Girls, Essex (4)
  • Dr Challoner’s High School, Buckinghamshire (4)
  • King Edward VI School, Warwickshire (4)
  • Alcester Grammar School, Warwickshire (5)
  • Beaconsfield High School, Buckinghamshire (5)
  • King Edward VI Grammar School, Chelmsford, Essex (5)
  • Reading School, Reading (5)
  • St Bernard’s Catholic Grammar School, Slough (5).

Some of these schools feature among those with the lowest proportions of ‘ever 6 FSM’ pupils on roll, as shown in the spreadsheet accompanying my previous post, but some do not.

The remaining 150 schools each record a combined cohort of between six and 96 students, with an average of 22.

A further 19 schools have a combined cohort of 10 or fewer, meaning that 32 grammar schools in all (20% of the total) are in this category.

At the other end of the distribution, only 16 schools (10% of all grammar schools) have a combined cohort of 40 disadvantaged students or higher – and only four have one of 50 disadvantaged students or higher.

These are:

  • Handsworth Grammar School, Birmingham (96)
  • Stretford Grammar School, Trafford (76)
  • Dane Court Grammar School, Kent (57)
  • Slough Grammar School (Upton Court) (50).

Because disadvantaged pupils are so heavily outnumbered by other pupils in the large majority of grammar schools, the results below must be treated with a significant degree of caution.

Outcomes based on such small numbers may well be misleading, but they are all we have.
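A quick sketch shows why: with cohorts of this size, a single pupil can move the headline percentage a long way.

```python
# How far one pupil can move a headline percentage, for cohort sizes taken
# from the range described above (six to 96, average 22).
for cohort in (6, 10, 22, 40, 96):
    print(f"Cohort of {cohort:>2}: one pupil is worth {100 / cohort:.1f} percentage points")
```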

Arguably, grammar schools should find it relatively easier to achieve success with a very small cohort of students eligible for the pupil premium – since fewer require separate monitoring and, potentially, additional support.

On the other hand, the comparative rarity of disadvantaged students may mean that some grammar schools have too little experience of addressing such needs, or believe that closing gaps is simply not an issue for them.

Then again, it is perhaps more likely that grammar schools will fall short of 100% success with their much larger proportions of ‘other’ students, simply because the probability of special circumstances arising is relatively higher. One might expect therefore to see ‘positive gaps’ with success rates for disadvantaged students slightly higher than those for their relatively more advantaged peers.

Ideally though, grammar schools should be aiming for a perfect 100% success rate for all students on these three measures, regardless of whether they are advantaged or disadvantaged. None of the measures is especially challenging for high attainers – and most of these schools have been rated as outstanding by Ofsted.

.

Five or more GCSE A*-C grades or equivalent including GCSEs in English and maths

In all state-funded schools, the percentage of disadvantaged students achieving this measure across the three year period is 38.7% while the percentage of other students doing so is 66.3%, giving a gap of 27.6 percentage points.

In 2013, 94.7% of all high attainers in state-funded secondary schools achieved this measure.

No grammar school falls below the 38.7% benchmark for its disadvantaged learners. The nearest to it is Pate’s Grammar School, at 43%. But these results were affected by the School’s decision to sit English examinations which were not recognised for Performance Table purposes.

The next lowest percentages are returned by:

  • Spalding Grammar School, Lincolnshire (59%)
  • Simon Langton Grammar School for Boys, Kent (65%)
  • Stratford Grammar School for Girls, Warwickshire (71%)
  • The Boston Grammar School, Lincolnshire (74%)

Apart from Pate’s, these were the only schools below 75%.

Table 1 below illustrates these percentages and the percentage point gap for each of these four schools.

.

Table 1

Table 1: 5+ GCSEs at A*-C or equivalent including GCSEs in English and maths: Lowest performing and largest gaps

.

A total of 46 grammar schools (31% of the 150 without suppressed results) fall below the 2013 figure for high attainers across all state-funded schools.

On the other hand, 75 grammar schools (exactly 50%) achieve 100% on this measure, for combined student cohorts ranging in size from six to 49.

Twenty-six of the 28 schools that had no gap between the performance of their advantaged and disadvantaged students were amongst those scoring 100%. (The other two were at 97% and 95% respectively.)

The remaining 49 with a 100% record amongst their disadvantaged students demonstrate a ‘positive gap’, in that the disadvantaged do better than the advantaged.

The biggest positive gap is seven percentage points, recorded by Clarendon House Grammar School in Kent and Queen Elizabeth’s Grammar School in Alford, Lincolnshire.

Naturally enough, schools recording relatively lower success rates amongst their disadvantaged students also tend to demonstrate a negative gap, where the advantaged do better than the disadvantaged.

Three schools had an achievement gap higher than the 27.6 percentage point national average. They were:

  • Simon Langton Grammar School for Boys (30 percentage points)
  • Spalding Grammar School (28 percentage points)
  • Stratford Grammar School for Girls (28 percentage points)

So three of the four with the lowest success rates for disadvantaged learners demonstrated the biggest gaps. Twelve more schools had double-digit achievement gaps of 10 percentage points or more.

These 15 schools – 10% of the total for which we have data – have a significant issue to address, regardless of the size of their disadvantaged populations.

One noticeable oddity at this end of the table is King Edward VI Camp Hill School for Boys in Birmingham, which returns a positive gap of 14 percentage points (rounded): with 80% for disadvantaged and 67% for advantaged. On this measure at least, it is doing relatively badly with its disadvantaged students, but considerably worse with those from advantaged backgrounds!

However, this idiosyncratic pattern is also likely to be attributable to the School using some examinations not eligible for inclusion in the Tables.

.

At least expected progress in English

Across all state-funded schools, the percentage of disadvantaged students making at least three levels of progress in English is 55.5%, compared with 75.1% of ‘other’ students, giving a gap of 19.6 percentage points.

In 2013, 86.2% of high attainers achieved this benchmark.

If we again discount Pate’s from consideration, the lowest performing school on this measure is The Boston Grammar School which is at 53%, lower than the national average figure.

A further 43 schools (29% of those for which we have data) are below the 2013 average for all high attainers. Six of these also fall below 70%:

  • The Skegness Grammar School, Lincolnshire (62%)
  • Queen Elizabeth Grammar School, Cumbria (62%)
  • Plymouth High School for Girls (64%)
  • Spalding Grammar School, Lincolnshire (65%)
  • Devonport High School for Boys, Plymouth (65%)
  • Simon Langton Grammar School for Boys, Kent (67%)

Table 2 below illustrates these outcomes, together with the attainment gaps recorded by these schools and others with particularly large gaps.

.

Table 2

Table 2: At least expected progress in English from KS2 to KS4: Lowest performing and largest gaps

.

At the other end of the table, 44 grammar schools achieve 100% on this measure (29% of those for which we have data.) This is significantly fewer than achieved perfection on the five or more GCSEs benchmark.

When it comes to closing the gap, only 16 of the 44 achieve a perfect 100% score with both advantaged and disadvantaged students, again much lower than on the attainment measure above.

The largest positive gaps (where disadvantaged students outscore their advantaged classmates) are at The King Edward VI Grammar School, Louth, Lincolnshire (11 percentage points) and John Hampden Grammar School, Buckinghamshire (10 percentage points).

Amongst the schools propping up the table on this measure, six record negative gaps of 20 percentage points or higher, so exceeding the average gap in state-funded secondary schools:

  • The Skegness Grammar School (30 percentage points)
  • Queen Elizabeth Grammar School Cumbria (28 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)
  • Plymouth High School for Girls (25 percentage points)
  • Devonport High School for Boys, Plymouth (23 percentage points)
  • Loreto Grammar School, Trafford (20 percentage points).

There is again a strong correlation between low disadvantaged performance and large gaps, although the relationship does not apply in all cases.

Another 23 grammar schools have a negative gap of 10 percentage points or higher.

There is again a curious pattern at King Edward VI Camp Hill in Birmingham, which comes in at 75% on this measure for its disadvantaged students, yet they outscore the advantaged students, who are at 65%, ten percentage points lower. As noted above, there may well be extenuating circumstances.

.

At least expected progress in maths

The percentage of disadvantaged students making at least three levels of progress in maths across all state-funded schools is 50.7%, compared with a figure for ‘other’ students of 74.1%, giving a gap of 23.4 percentage points.

In 2013, 87.8% of high attainers achieved this.

On this occasion Pate’s is unaffected (in fact it scores 100%), as is King Edward VI Camp Hill School for Boys (in its case for advantaged and disadvantaged alike).

No schools come in below the national average for disadvantaged students; in fact all comfortably exceed it. However, the lowest performers are still a long way behind some of their fellow grammar schools.

The worst performing grammar schools on this measure are:

  • Spalding Grammar School, Lincolnshire (59%)
  • Queen Elizabeth Grammar School Cumbria (62%)
  • Simon Langton Grammar School for Boys, Kent (63%)
  • Dover Grammar School for Boys, Kent (67%)
  • The Boston Grammar School, Lincolnshire (68%)
  • Borden Grammar School, Kent (68%)

These are very similar to the corresponding rates for the lowest performers in English.

Table 3 illustrates these outcomes, together with other schools demonstrating very large gaps between advantaged and disadvantaged students.

.

Table 3

Table 3: At least expected progress in maths from KS2 to KS4: Lowest performing and largest gaps

A total of 32 schools (21% of those for which we have data) undershoot the 2013 average for high attainers, a slightly better outcome than for English.

At the other extreme, there are 54 schools (36% of those for which we have data) that score 100% on this measure, slightly more than do so on the comparable measure for English, but still significantly fewer than achieve this on the 5+ GCSE measure.

Seventeen of the 54 also achieve a perfect 100% for advantaged students.

The largest positive gaps recorded are 11 percentage points at The Harvey Grammar School in Kent (which achieved 94% for disadvantaged students) and 7 percentage points at Queen Elizabeth’s Grammar School, Alford, Lincolnshire (91% for disadvantaged students).

The largest negative gaps on this measure are just as substantial as those relating to English. Four schools record gaps significantly wider than the national average of 23.4 percentage points:

  • Spalding Grammar School, Lincolnshire (32 percentage points)
  • Queen Elizabeth Grammar School, Cumbria (31 percentage points)
  • Simon Langton Grammar School for Boys, Kent (31 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)

Queen Elizabeth’s (Cumbria) and Stratford Girls’ appeared in the corresponding list for English; Spalding, Simon Langton and Stratford Girls’ appeared in the corresponding list for the 5+ GCSE measure.

A further 20 schools have a double-digit negative gap of 10 percentage points or higher, very similar to the outcome in English.

.

Comparison across the three measures

As will be evident from the tables and lists above, some grammar schools perform consistently poorly on all three measures.

Others perform consistently well, while a third group have ‘spiky profiles’.

The number of schools that achieve 100% on all three measures with their disadvantaged students is 25 (17% of those for which we have data).

Eight of these are located in London; none is located in Birmingham. Just two are in Buckinghamshire and there is one each in Gloucestershire, Kent and Lincolnshire.

Only six schools achieve 100% on all three measures with advantaged and disadvantaged students alike. They are:

  • Queen Elizabeth’s, Barnet
  • Colyton Grammar School, Devon
  • Nonsuch High School for Girls, Sutton
  • St Olave’s and St Saviour’s Grammar School, Bromley
  • Tiffin Girls’ School, Kingston
  • Kendrick School, Reading

Five schools recorded comparatively low performance across all three measures (ie below 80% on each):

  • Spalding Grammar School, Lincolnshire
  • Simon Langton Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • St Joseph’s College, Stoke on Trent

Their overall performance is illustrated in Table 4.

.

Table 4

Table 4: Schools where 80% or fewer disadvantaged learners achieved each measure

.

This small group of schools is a major cause for concern.

A total of 16 schools (11% of those for which we have data) score 90% or less on all three measures and they, too, are potentially concerning.

Schools which record negative gaps of 10 percentage points or more on all three measures are:

  • Simon Langton Grammar School for Boys, Kent
  • Dover Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • Wilmington Grammar School for Boys, Kent
  • St Joseph’s College, Stoke-on-Trent
  • Queen Elizabeth’s Grammar School, Horncastle, Lincolnshire

Table 5 records these outcomes.

.

Table 5

Table 5: Schools with gaps of 10 percentage points or higher on all three measures

.

Of these, Boston and Stratford have gaps of 20 percentage points or higher on all three measures.

A total of 32 grammar schools (21% of those for which we have data) record 80% or lower on at least one of the three measures.

.

Selective University Destinations

I had also wanted to include in the analysis some data on progression to selective (Russell Group) universities, drawn from the experimental destination statistics.

Unfortunately, the results for FSM students are suppressed for the vast majority of schools, making comparison impossible. According to the underlying data for 2011/12, all I can establish with any certainty is that:

  • In 29 grammar schools, there were no FSM students in the cohort.
  • Five schools returned 0%, meaning that no FSM students successfully progressed to a Russell Group university. These were Wycombe High School, Wallington High School for Girls, The Crossley Heath School in Calderdale, St Anselm’s College on the Wirral and Bacup and Rawtenstall Grammar School.
  • Three schools were relatively successful – King Edward VI Five Ways in Birmingham reported 58% of FSM students progressing, while King Edward VI Handsworth reported 53% and the Latymer School achieved an impressive 75%.
  • All remaining grammar schools – some 127 in that year – are reported as ‘x’ meaning that there were either one or two students in the cohort, so the percentages are suppressed.

We can infer from this that, at least in 2011/12, very few grammar schools indeed were providing an effective route to Russell Group universities for their FSM students.

.

Conclusion

Even allowing for the unreliability of statistics based on very small cohorts, this analysis is robust enough to show that the performance of grammar schools in supporting disadvantaged students is extremely disparate.

While there is a relatively large group of consistently high performers, roughly one in five grammar schools is a cause for concern on at least one of the three measures. Approximately one in ten is performing no more than satisfactorily across all three. 

The analysis hints at the possibility that the biggest problems tend to be located in rural and coastal areas rather than in London and other urban centres, but this pattern is not always consistent. The majority of the poorest performers seem to be located in wholly selective authorities but, again, this is not always the case.

A handful of grammar schools are recording significant negative gaps between the performance of disadvantaged students and their peers. This is troubling. There is no obvious correlation between the size of the disadvantaged cohort and the level of underperformance.

There may be extenuating circumstances in some cases, but there is no public national record of what these are – an argument for greater transparency across the board.

One hopes that the grammar schools that are struggling in this respect are also those at the forefront of the reform programme described in my previous post – and that they are improving rapidly.

One hopes, too, that those whose business it is to ensure that schools make effective use of the pupil premium are monitoring these institutions closely. Some of the evidence highlighted above would not, in my view, be consistent with an outstanding Ofsted inspection outcome.

If the same pattern is evident when the 2014 Performance Tables are published in January 2015, there will be serious cause for concern.

As for the question whether grammar schools are currently meeting the needs of their – typically few – disadvantaged students, the answer is ‘some are; some aren’t’. This argues for intervention in inverse proportion to success.

.

GP

December 2014

The Politics of Selection: Grammar Schools and Disadvantage

This post considers how England’s selective schools are addressing socio-economic disadvantage.

Another irrelevant Norwegian vista by Gifted Phoenix

It is intended as an evidence base against which to judge various political statements about the potential value of selective education as an engine of social mobility.

It does not deal with recent research reports about the historical record of grammar schools in this respect. These show that – contrary to received wisdom – selective education has had a very limited impact on social mobility.

Politicians of all parties would do well to acknowledge this, rather than attempting (as some do) to perpetuate the myth in defiance of the evidence.

This post concentrates instead on the current record of these schools, recent efforts to strengthen their capacity to support the Government’s gap closing strategy and prospects for the future.

It encourages advocates of increased selection to consider the wider question of how best to support high attainers from disadvantaged backgrounds.

The post is organised into four main sections:

  • A summary of how the main political parties view selection at this point, some six months ahead of a General Election.
  • A detailed profile of the socio-economic inclusiveness of grammar schools today, which draws heavily on published data but also includes findings from recent research.
  • An evaluation of national efforts over the last year to reform selective schools’ admissions, testing and outreach in support of high-attaining disadvantaged learners.
  • Comparison of the various policy options for closing excellence gaps between such learners and their more advantaged peers – and consideration of the role that reformed and/or increased selection might play in a more comprehensive strategy.

Since I know many readers prefer to read my lengthy posts selectively I have included page jumps from each of the bullet points above to the relevant sections below.

One more preliminary point.

This is the second time I have explored selection on this Blog, though my previous post, on fair access to grammar schools, appeared as far back as January 2011. This post updates some of the data in the earlier one.

One purpose of that earlier post was to draw attention to the parallels in the debates about fair access to grammar schools and to selective higher education.

I do not repeat those arguments here, although writing this has confirmed my opinion that they are closely related issues and that many of the strategies deployed at one level could be applied equally at the other.

So there remains scope to explore how appropriate equivalents of Offa, access agreements, bursaries and contexualised admissions might be applied to selective secondary admissions arrangements, alongside the reforms that are already on the table. I leave that thought hanging.

.

The Political Context

My last post on ‘The Politics of Setting’ explored how political debate surrounding within-school and between-school selection is becoming increasingly febrile as we approach the 2015 General Election.

The two have become inextricably linked because Prime Minister Cameron, in deciding not to accommodate calls on the right of his party to increase the number of selective schools, has called instead for ‘a grammar stream in every school’ and, latterly, for a wider – perhaps universal – commitment to setting.

In May 2007, Cameron wrote:

‘That’s what the grammar school row was about: moving the Conservative Party on from slogans such as ‘Bring back grammar schools’ so that we can offer serious policies for improving state education for everyone…

…Most critics seem to accept, when pressed, that as I have said, the prospect of more grammars is not practical politics.

Conservative governments in the past – and Conservative councils in the present – have both failed to carry out this policy because, ultimately, it is not what parents want….

…When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.

With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a ‘grammar stream’ in every subject in every school.

Setting would be a focus for Ofsted and a priority for all new academies.’

As ‘The Politics of Setting’ explained, this alternative aspiration to strengthen within-school selection has not yet materialised, although there are strong signs that it is still Cameron’s preferred way forward.

The Coalition has been clear that:

‘It is not the policy of the Government to establish new grammar schools in England’ (Hansard, 10 February 2014, Col. 427W).

but it has also:

  • Removed barriers to the expansion of existing grammar schools through increases to published admission numbers (PANs) within the Admissions Code.
  • Introduced several new selective post-16 institutions through the free schools policy (though not as many as originally envisaged since the maths free schools project has made relatively little progress).
  • Made efforts to reform the admissions procedures of existing selective secondary schools and
  • Accepted in principle that these existing schools might also expand through annexes, or satellite schools. This is now a live issue since one decision is pending and a second proposal may be in the pipeline.

The Liberal Democrats have enthusiastically pursued at least the third of these policies, with Lib Dem education minister David Laws leading the Government’s efforts to push the grammar schools further and faster down this route.

In his June 2014 speech (of which much more below) Laws describes grammar schools as ‘a significant feature of the landscape in many local areas’ and ‘an established fact of our education system’.

But, as the Election approaches, the Lib Dems are increasingly distancing themselves from a pro-selective stance.

Clegg is reported to have said recently that he did not believe selective schools were the way forward:

‘The Conservatives have got this odd tendency to constantly want to turn the clock back.

Some of them seem to be hankering towards a kind of selective approach to education, which I don’t think works.

Non-selective schools stream and a lot of them stream quite forcefully, that’s all fine, but I think a segregated school system is not what this country needs.’

Leaving aside the odd endorsement of ‘forceful streaming’, this could even be interpreted as hostile to existing grammar schools.

Meanwhile, both frontrunners to replace Cameron as Tory leader have recently restated their pro-grammar school credentials:

  • Constituency MP Theresa May has welcomed consideration of the satellite option in Maidenhead.

The right wing of the Tory party has long supported increased selection and will become increasingly vociferous as the Election approaches.

Conservative Voice – which describes itself as on the ‘center-Right of the party’ [sic] – will imminently launch a campaign calling for removal of the ban on new grammar schools to be included in the Conservative Election Manifesto.

They have already conducted a survey to inform the campaign, from which it is clear that they will be playing the social mobility card.

The Conservative right is acutely aware of the election threat posed by UKIP, which has already stated its policy that:

‘Existing schools will be allowed to apply to become grammar schools and select according to ability and aptitude. Selection ages will be flexible and determined by the school in consultation with the local authority.’

Its leader has spoken of ‘a grammar school in every town’ and media commentators have begun to suggest that the Tories will lose votes to UKIP on this issue.

Labour’s previous shadow education minister, Stephen Twigg, opposed admissions code reforms that made it easier for existing grammar schools to expand.

But the present incumbent has said very little on the subject.

A newspaper interview in January 2014 hints at a reforming policy:

‘Labour would not shut surviving grammar schools but Mr Hunt said their social mix should be questioned.

“If they are simply about merit why do we see the kind of demographics and class make-up within them?”’

But it seems that this has dropped off Labour’s agenda now that the Coalition has adopted it.

I could find no formal commitment from Labour to address the issue in government, even though that might provide some sort of palliative for those within the party who oppose selection in all its forms and have suggested that funding should be withdrawn from selective academies.

So the overall picture suggests that Labour and the Lib Dems are deliberately distancing themselves from any active policy on selection, presumably regarding it as a poisoned chalice. The Tories are conspicuously riven on the issue, while UKIP has stolen a march by occupying the ground which the Tory right would like to occupy.

As the Election approaches, the Conservatives face four broad choices. They can:

  • Endorse the status quo under the Coalition, making any change of policy conditional on the outcome of a future leadership contest.
  • Advocate more between-school selection. This might or might not stop short of permitting new selective 11-18 secondary schools. Any such policy needs to be distinct from UKIP’s.
  • Advocate more within-school selection, as preferred by Cameron. This might adopt any position between encouragement and compulsion.
  • Develop a more comprehensive support strategy for high attaining learners from disadvantaged backgrounds. This might include any or all of the above, but should also consider support targeted directly at disadvantaged students.

These options are discussed in the final part of the post.

The next section provides an assessment of the current state of selective school engagement with disadvantaged learners, as a precursor to describing how the reform programme is shaping up.

.

How well do grammar schools serve disadvantaged students?

.

The Grammar School Stock and the Size of the Selective Pupil Population

Government statistics show that, as of January 2014, there are 163 selective state-funded secondary schools in England.

This is one less than previously, following the merger of Chatham House Grammar School for Boys and Clarendon House Grammar School. These two Kent schools formed the Chatham and Clarendon Grammar School with effect from 1 September 2013.

At January 2014:

  • 135 of these 163 schools (83%) are academy converters, leaving just 28 in local authority control. Twenty of the schools (12%) have a religious character.
  • Some 5.1% of pupils in state-funded schools attend selective schools. (The percentage fluctuated between 4% and 5% over the last 20 years.) The percentage of learners under 16 attending selective schools is lower. Between 2007 and 2011 it was 3.9% to 4.0%.
  • There are 162,630 pupils of all ages attending state-funded selective secondary schools, of which 135,365 (83.2%) attend academies and 27,265 (16.8%) attend LA maintained schools. This represents an increase of 1,000 compared with 2013. The annual intake is around 22,000.

The distribution of selective schools between regions and local authority areas is shown in Table 1 below.

The percentage of selective school pupils by region varies from 12.0% in the South East to zero in the North East, a grammar-free zone. The percentage of pupils attending selective schools by local authority area (counting only those with at least one selective school) varies from 45.1% in Trafford to 2.1% in Devon.

Some of the percentages at the upper end of this range seem to have increased significantly since May 2011, although the two sets of figures may not be exactly comparable.

For example, the proportion of Trafford pupils attending selective schools has increased by almost 5 percentage points (from 40.2% in 2011). In Torbay there has been an increase of over 4 percentage points (34.8% compared with 30.5%) and in Kent an increase of almost 4 percentage points (33.3% compared with 29.6%).

.

Table 1: The distribution of selective schools by region and local authority area and the percentage of pupils within each authority attending them (January 2014)

Region Schools Pupils Percentage of all pupils
North East 0 0 0
North West 19 20,240 4.9
Cumbria 1 833 2.8
Lancashire 4 4,424 6.6
Liverpool 1 988 3.3
Trafford 7 7,450 45.1
Wirral 6 6,547 30.5
Yorkshire and Humberside 6 6,055 1.9
Calderdale 2 2,217 14.2
Kirklees 1 1,383 5.5
North Yorkshire 3 2,454 6.5
East Midlands 15 12,700 4.5
Lincolnshire 15 12,699 26.9
West Midlands 19 15,865 4.5
Birmingham 8 7,350 10.4
Stoke-on-Trent 1 1,078 8.7
Telford and Wrekin 2 1,283 11.7
Walsall 2 1,423 7.0
Warwickshire 5 3,980 12.0
Wolverhampton 1 753 5.0
East of England 8 7,715 2.1
Essex 4 3,398 4.0
Southend-on-Sea 4 4,319 32.8
London 19 20,770 4.4
Barnet 3 2,643 11.6
Bexley 4 5,466 26.6
Bromley 2 1,997 9.0
Enfield 1 1,378 6.1
Kingston upon Thames 2 2,021 20.5
Redbridge 2 1,822 7.9
Sutton 5 5,445 30.7
South East 57 59,910 12.0
Buckinghamshire 13 15,288 42.2
Kent 32 33,059 33.3
Medway 6 6,031 32.2
Reading 2 1,632 24.1
Slough 4 3,899 37.4
South West 20 19,370 6.2
Bournemouth 2 2,245 23.3
Devon 1 822 2.1
Gloucestershire 7 6,196 16.2
Plymouth 3 2,780 16.3
Poole 2 2,442 26.8
Torbay 3 2,976 34.8
Wiltshire 2 1,928 6.6
TOTAL 163 162,630 5.1

.

Some authorities are deemed wholly selective but different definitions have been adopted.

One PQ reply suggests that 10 of the 36 local authority areas – Bexley, Buckinghamshire, Kent, Lincolnshire, Medway, Slough, Southend, Sutton, Torbay and Trafford – are deemed wholly selective because they feature in the Education (Grammar School Ballots) Regulations 1998.

Another authoritative source – the House of Commons Library – omits Bexley, Lincolnshire and Sutton from this list, presumably because they also contain comprehensive schools.

Of course many learners who attend grammar schools live in local authority areas other than those in which their schools are located. Many travel significant distances to attend.

A PQ reply from March 2012 states that some 76.6% of all those attending grammar schools live in the same local authority as their school, while 23.2% live outside. (The remainder are ‘unknowns’.)

These figures mask substantial variation between authorities. A recent study, for the Sutton Trust  ‘Entry into Grammar Schools in England’ (Cribb et al, 2013) provides equivalent figures for each local authority from 2009-10 to 2011-12.

The percentage of within authority admissions reaches 38.5% in Trafford and 36% in Buckinghamshire but, at the other extreme, it can be as low as 1.7% in Devon and 2.2% in Cumbria.

The percentage of admissions from outside the authority can be as much as 75% (Reading) and 68% (Kingston) or, alternatively, as low as 4.5% in Gloucestershire and 6.8% in Kent.

.

Recent Trends in the Size and Distribution of the Disadvantaged Grammar School Pupil Population

Although this section of the post is intended to describe the ‘present state’, I wanted to illustrate how that compares with the relatively recent past.

I attached to my 2011 post a table showing how the proportion of FSM students attending grammar schools had changed annually since 1995. This is reproduced below, updated to reflect more recent data where it is available.

A health warning is attached since the figures were derived from several different PQ replies and I cannot be sure that the assumptions underpinning each were identical. Where there are known methodological differences I have described these in the footnotes.

.

Table 2: Annual percentage FSM in all grammar schools and gap between that and percentage FSM in all secondary schools, 1995-2014

Year    % FSM in grammar schools    % FSM in all schools    Percentage point gap
1995 3.9 18.0 14.1
1996 3.8 18.3 14.5
1997 3.7 18.2 14.5
1998 3.4 17.5 14.1
1999 3.1 16.9 13.8
2000 2.8 16.5 13.7
2001 2.4 15.8 13.4
2002 2.2 14.9 12.7
2003 2.1 14.5 12.4
2004 2.2 14.3 12.1
2005 2.1 14.0 11.9
2006 2.2 14.6 12.4
2007 2.0 13.1 11.1
2008 1.9 12.8 10.9
2009 2.0 13.4 11.4
2010 15.4
2011 2.4 14.6 12.2
2012 14.8
2013 15.1
2014 14.6

(1) Prior to 2003 includes dually registered pupils and excludes boarding pupils; from 2003 onwards includes dually registered and boarding pupils.

(2) Before 2002 numbers of pupils eligible for free school meals were collected at school level. From 2002 onwards numbers have been derived from pupil level returns.

(3) 2008 and 2009 figures for all schools exclude academies

.

Between 1996 and 2005 the FSM rate in all schools fell annually, dropping by 4.3 percentage points over that period. The FSM rate in grammar schools also fell, by 1.7 percentage points. The percentage point gap between all schools and selective schools fell by 2.6 percentage points.

Both FSM rates reached their lowest point in 2008. At that point the FSM rate in grammar schools was half what it had been in 1996. Thereafter, the rate across all schools increased, but has been rather more volatile, with small swings in either direction.

One might expect the 2014 FSM rate across all grammar schools to be at or around its 2011 level of 2.4%.

A more recent PQ reply revealed the total number of pupil premium recipients attending selective schools over the last three financial years:

  • FY2011-12 – 3,013
  • FY2012-13 – 6,184 (on extension to ‘ever 6’)
  • FY2013-14 – 7,353

(Hansard 20 January 2014, Col. WA88)

This suggests a trend of increasing participation in the sector, though total numbers are still very low, averaging around 45 per school and slightly over six per year group.

.

Comparison with FSM rates in selective authorities

In 2012, a table deposited in the Commons Library (Dep 2012-0432) in response to a PQ provided the January 2011 FSM rates for selective schools and all state-funded secondary schools in each authority containing selective schools.

In this case, the FSM rates provided relate only to pupils aged 15 or under. The comparable national average rates are 2.7% for selective schools and 15.9% for all state-funded schools.

  • Selective school FSM rates per authority vary between 6.0% in Birmingham and 0.6% in Wiltshire.
  • Other authorities with particularly low FSM rates include Bromley (0.7%), Reading (0.8%) and Essex (0.9%).
  • Authorities with relatively high FSM rates include Wirral (5.2%), Walsall (4.9%) and Redbridge (4.8%).
  • The authorities with the biggest gaps between FSM rates for selective schools and all schools are Birmingham, at 28.0 percentage points, Liverpool, at 23.8 percentage points, Enfield at 21.8 percentage points and Wolverhampton, at 21.7 percentage points.
  • Conversely, Buckinghamshire has a gap of only 4.7 percentage points, since its FSM rate for all state-funded secondary schools is only 6.0%.
  • Buckinghamshire’s overall FSM rate is more than four times the rate in its grammar schools, while in Birmingham the overall rate is almost six times the grammar school rate (the arithmetic is sketched below). On this measure, the disparity is greatest in metropolitan boroughs with significant areas of disadvantage.
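The arithmetic behind those two ratios, using only the rates and gaps quoted in this list:

```python
# Sketch of how the 'more than four times' and 'almost six times' figures
# follow from the rates and gaps quoted above (all figures in percent).
bucks_all, bucks_gap = 6.0, 4.7
bucks_grammar = bucks_all - bucks_gap          # 1.3
print(round(bucks_all / bucks_grammar, 1))     # 4.6 - more than four times

birm_grammar, birm_gap = 6.0, 28.0
birm_all = birm_grammar + birm_gap             # 34.0
print(round(birm_all / birm_grammar, 1))       # 5.7 - almost six times
```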

.

Proportion of disadvantaged learners in each selective school

I attached to my 2011 post a table setting out the FSM rates (all pupils, regardless of age) for each selective school in January 2009.

This updated version sets out the January 2013 FSM and disadvantaged (ie ‘ever 6 FSM’) rates by school, drawn from the latest School Performance Tables. (Click on the screenshot below to download the Excel file.)

.

GS excel Capture

.

Key points include:

  • The size of grammar schools varies considerably, with numbers on roll (NORs) ranging from 437 (Newport Girls’) to 1518 (Townley Girls’). The average NOR is slightly below 1000.
  • 24 of the 163 schools (14.7%) have suppressed FSM percentages. Since the lowest published percentage is 1.1%, the impact of suppression is that all schools at or below 1.0% are affected. Since no school returns 0, we must assume that all contain a handful of FSM learners. It is notable that six of these schools are in Buckinghamshire, three in Gloucestershire and three in Essex. Both Bromley grammar schools also fall into this category.
  • 67 selective schools (41.1%) have FSM rates of 2% or lower. The average FSM rate across all these schools is 3.25%.
  • The highest recorded FSM rates are at Handsworth Grammar School (14.4%), King Edward VI Aston School (12.9%) and Stretford Grammar School (12%). These three are significant outliers – the next highest rate is 7.8%.
  • As one would expect, there is a strong correlation between FSM rates and ‘ever 6’ rates. Most of the schools with the lowest ‘ever 6’ rates are those with SUPP FSM rates. Of the 26 schools returning ‘ever 6’ rates of 3.0% or lower, all but 7 fall into this category.
  • The lowest ‘ever 6’ rate is the 0.6% returned by Sir William Borlase’s Grammar School in Buckinghamshire. On this evidence it is probably the most socio-economically selective grammar school in the country. Five of the ten schools with the lowest ‘ever 6’ rates are located in Buckinghamshire.
  • A few schools have FSM and ‘ever 6’ rates that do not correlate strongly. The most pronounced is Ribston Hall in Gloucestershire which is SUPP for FSM yet has an ‘ever 6’ rate of 5.5%, not far short of the grammar school average which is some 6.6%. Clitheroe Royal Grammar School is another outlier, returning an ‘ever 6’ rate of 4.8%.
  • The highest ‘ever 6’ rates are in Handsworth Grammar School (27.2%), Stretford Grammar School (24.3%) and King Edward VI Aston School (20.3%). These are the only three above 20%.
  • In London there is a fairly broad range of socio-economic selectivity, from St Olave’s and St Saviour’s (Bromley) – which records an ‘ever 6’ rate of 2.5% – to Woodford County High School, Redbridge, where the ‘ever 6’ rate is 11%. As noted above, the FSM rates at the two Bromley schools are SUPP. The London school with the highest FSM rate is again Woodford County High, at 5%.

Another source throws further light on the schools with the lowest FSM rates. In October 2013, a PQ reply provided a table of the 50 state secondary schools in England with the lowest entitlement to FSM, alongside a second table of the 50 schools with the highest entitlement.

These are again January 2013 figures but on this occasion the rates are for pupils aged 15 or under and the only figures suppressed (denoted by ‘x’) are where no more than two pupils are FSM.

Sir William Borlase’s tops the list, being the only school in the country with a nil return (so the one or two FSM pupils who attend must be aged over 15 and may have been admitted directly to the sixth form).

The remainder of the ‘top ten’ includes eight selective schools and one comprehensive (Old Swinford Hospital School in Dudley). The eight grammar schools are:

  • Cranbrook, Kent – x
  • Adams’, Telford and Wrekin – x
  • St Olave’s and St Saviour’s, Bromley – 0.5%
  • Dr Challoner’s High Buckinghamshire – 0.5%
  • Dr Challoner’s Grammar, Buckinghamshire – 0.6%
  • Aylesbury Grammar, Buckinghamshire – 0.6%
  • Newstead Wood, Bromley – 0.6%
  • Pate’s, Gloucestershire – 0.6%

Comparing the data in my tables for 2009 and 2013 also throws up some interesting facts:

  • Some schools have increased significantly in size – Burnham Grammar School (Buckinghamshire), Sir Thomas Rich’s (Gloucestershire), Highworth Grammar School for Girls (Kent), Simon Langton Grammar School for Boys (Kent), Kesteven and Grantham Girls’ School (Lincolnshire), Carre’s Grammar School (Lincolnshire) and St Joseph’s College (Stoke) have all increased their NORs by 100 or more.
  • However, some other schools have shrunk significantly, notably The Skegness Grammar School in Lincolnshire (down 129), The Boston Grammar School in Lincolnshire (down 110), Fort Pitt Grammar School in Medway (down 132) and Slough Grammar School (down 175).
  • While recognising that the figures may not be fully comparable, there have also been some significant changes in the proportions of FSM pupils on roll. Significant increases are evident at King Edward VI Aston (up 5.9 percentage points), Fort Pitt (up 5.1 percentage points) and Handsworth Grammar (up 4.7 percentage points).
  • The only equally pronounced mover in the opposite direction is St Anselm’s College on The Wirral, where the FSM rate has more than halved, falling by 5.2 percentage points, from 9.8% to 4.6%.

Additional statistics were peppered throughout David Laws’ June 2014 speech.

He refers to a paper by DfE analysts which unfortunately has not been published:

  • In 2013, 21 grammar schools had fewer than 1% of pupils eligible for FSM. Ninety-eight had fewer than 3% eligible and 161 had fewer than 10% eligible. This compares to a national average of 16.3% across England. (The basis for these figures is not supplied but they more or less agree with those above.)
  • In Buckinghamshire in 2011, 14% of the year 7 cohort were eligible for the pupil premium, but only 4% of the cohort in Buckinghamshire grammar schools were eligible. In Lincolnshire the comparable percentages were 21% and 7% respectively.

.

Selectivity

Most commentary tends to regard the cadre of selective schools as very similar in character, leaving aside any religious affiliation and the fact that many are single sex establishments.

Although the fact is rarely discussed, some grammar schools are significantly more selective than others.

The 2013 Secondary Performance Tables show that only 10 grammar schools can claim that 100% of the cohort comprises high attainers. (These are defined on the basis of performance in statutory end of KS2 tests, in which they must record an APS of 30 or more across English, maths and science.)

At several schools – Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent) – the proportion of high attainers stands at 70% or below.

Many comprehensive schools comfortably exceed this, hence – when it comes to KS2 attainment – some comprehensives are more selective than some grammar schools.

Key variables determining a grammar school’s selectivity will include:

  • The overall number of pupils in the area served by the school and/or the maximum geographical distance that pupils may travel to it.
  • The number of pupils who take the entrance tests, including the proportion of pupils attending independent schools competing for admission.
  • The number of competing selective schools and high-performing comprehensive schools, plus the proportion of learners who remain in or are ‘siphoned off’ into the independent sector.
  • The number of places available at the school and the pass mark in the entrance tests.

I have been unable to locate any meaningful measure of the relative selectivity of grammar schools, yet this is bound to impact on the admission of disadvantaged learners.

An index of selectivity would improve efforts to compare more fairly the outcomes achieved by different grammar schools, including their records on access for disadvantaged learners.
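Purely as an illustration of what such an index might look like, here is a hypothetical sketch combining two of the variables listed above – the proportion of high attainers on entry and the number of applicants per place. The weighting and the cap are arbitrary assumptions of mine, not anything official.

```python
# A purely hypothetical selectivity index, sketched to show the kind of measure
# that could be published; neither the inputs nor the weights are official.
def selectivity_index(high_attainer_share, applicants_per_place, weight=0.5):
    """Blend the proportion of high attainers on entry (0-1) with a capped
    measure of competition for places into a single 0-100 score."""
    competition = min(applicants_per_place / 10, 1.0)  # cap at 10 applicants per place
    return 100 * (weight * high_attainer_share + (1 - weight) * competition)

# Invented examples: a super-selective school and a much less selective one.
print(round(selectivity_index(1.00, 8.0), 1))   # 90.0
print(round(selectivity_index(0.68, 1.5), 1))   # 41.5
```

A real index would obviously need agreed inputs and weights; the point of the sketch is only that the ingredients already exist in published data.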

.

Prior attainment data

In his June 2014 speech, Laws acknowledges that:

  • ‘A key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit’.
  • However, in wholly selective areas fewer than 50% of FSM learners achieving Level 5 enter selective schools compared with two-thirds of non-FSM pupils:

‘We calculated it would require a shift of just 200 level 5 FSM pupils to go into grammar schools in wholly selective areas to remove this particular bias.’

Alternative versions of this statement appear elsewhere, as we shall see below.

Using data from 2009/10 and 2011/12, the Sutton Trust study by Cribb et al explored whether advantaged and disadvantaged pupils with KS2 level 5 in both English and maths were equally likely to attend grammar schools.

They found that those not eligible for FSM are still more likely to attend. This applies regardless of whether the grammar school is located in a selective local authority, although the percentages and the gaps vary considerably.

  • In selective authorities, some 66% of these high attaining non-FSM pupils went on to grammar schools compared with under 40% of FSM pupils, giving a gap of over 26 percentage points. (Note that the percentage for FSM is ten percentage points lower than the one quoted by Laws. I can find no reason for this disparity, unless the percentage has changed dramatically since 2012.)
  • In isolated grammar schools outside London the gap is much smaller, at roughly 11 percentage points (18% non-FSM against 7% FSM).
  • In London there is a similar 12 percentage point gap (15% non-FSM versus 3% FSM)

 

Cribb Capture 1

A similar pattern is detected on the basis of KS2 maths test fine points scores:

‘Two points are evident. First, for any given level of maths attainment, pupils who are eligible for FSM have a noticeably lower probability of attending a grammar school. Indeed, a non-FSM student with an average maths score has the same probability of entering a grammar school as an FSM pupil with a score 0.7 standard deviations above average. Second, the gap in probability of attendance between FSM and non-FSM pupils actually widens substantially: non-FSM pupils with test scores one standard deviation above average have a 55% likelihood of attending a grammar school in selective local authorities, whereas similar pupils who are eligible for FSM have only a 30% chance of attending a grammar school. This is suggestive that bright pupils from deprived families are not attending grammar schools as much as their attainment would suggest they might.’
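To make the quoted pattern concrete, here is a toy illustration – my own construction, not the study’s model – in which the probability of attending a grammar school follows a simple logistic curve and FSM eligibility shifts that curve by 0.7 standard deviations. The slope and midpoint are chosen simply so that the curve roughly reproduces the 55% and 30% figures quoted; none of the parameters comes from the research.

```python
# Purely illustrative: a toy logistic curve reproducing the two headline figures
# in the quotation (55% vs 30% at one SD above average, and a 0.7 SD offset for
# FSM pupils). This is my own sketch, not the study's actual model.
from math import exp

def p_attend(score_sd, fsm, slope=1.5, midpoint=0.87, fsm_offset=0.7):
    x = score_sd - midpoint - (fsm_offset if fsm else 0)
    return 1 / (1 + exp(-slope * x))

print(round(p_attend(1.0, fsm=False), 2))  # about 0.55
print(round(p_attend(1.0, fsm=True), 2))   # about 0.30
print(round(p_attend(0.0, fsm=False), 2))  # about 0.21 - a non-FSM pupil with an average score...
print(round(p_attend(0.7, fsm=True), 2))   # ...matches an FSM pupil 0.7 SD above average
```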

This rather calls into question Laws’ initial statement that level 5 performance among FSM pupils is ‘a key barrier’ to admission.

The study also confirms that pupils attending primary schools with relatively high levels of deprivation are much less likely to progress to grammar schools.

On the other hand, some 13% of pupils nationally transfer into selective schools from non-state schools and schools outside England. The researchers are unable to distinguish clearly those from abroad and those from the independent sector, but note that they are typically wealthier than state school transfers.

This masks significant variation between local authority areas.

Almost 34% of such pupils transfer into grammar schools in Essex, as do 24% in Bromley, 23% in Wiltshire and 22% in Bournemouth and Southend. At the other extreme, only 6% are incomers in Kirklees.

.

Headteacher perceptions

The Sutton Trust released a parallel research report from NATCEN reporting the outcomes of interviews with a small sample of three primary school and eight grammar school headteachers.

The researchers found that:

  • Rightly or wrongly, many heads felt disadvantaged learners had relatively lower educational aspirations.
  • Disadvantaged parents were sometimes perceived to know less about grammar schools and place less value on the benefits they might confer.
  • Heads felt disadvantaged parents ‘often associated grammar schools with tradition, middle class values and elitism’. Parents felt their children ‘might struggle interacting with children from more affluent backgrounds’.
  • Grammar school heads highlighted the role of primary schools but ‘this was difficult when primary schools disagreed with assessment based entry processes and selective education in general’.
  • Heads felt grammar schools should provide more outreach and demonstrate their openness to everyone. It was suggested that, as grammar schools increasingly take in pupils from further away and/or from independent schools, this might further distance schools from their local communities.
  • It was widely acknowledged that learners from more advantaged backgrounds were coached to pass the entrance exams. Some grammar heads regarded tutoring as ‘good examination preparation’; others recognised it as a barrier for disadvantaged learners.
  • Although there are financial barriers to accessing grammar schools, including the cost of uniforms and school trips, grammar school heads claimed to deploy a variety of support strategies.

Overall

The preceding analysis is complex and difficult to synthesise into a few key messages, but here is my best effort.

The national figures show that, taken as a whole, the 163 grammar schools contain extremely low proportions of FSM-eligible and ‘ever 6’ learners.

National FSM rates across all grammar schools have fallen significantly over the past 20 years and, although the FSM gap between selective schools and all schools has narrowed a little, it is still very pronounced.

There is certainly a strong case for concerted action to reduce significantly the size of this gap and to strive towards parity.

The disparity is no doubt partly attributable to lower rates of high attainment at KS2 amongst disadvantaged learners, but high attaining disadvantaged learners are themselves significantly under-represented. This is particularly true of wholly selective authorities but also applies nationally.

Although the sample is small, the evidence suggests that grammar school and primary head teachers share the perception that disadvantaged learners are further disadvantaged by the selective admissions process.

However, the cadre of grammar schools is a very broad church. The schools are very different and operate in markedly different contexts. Some are super-selective while others are less selective than some comprehensive schools.

A handful have relatively high levels of FSM and ‘ever-6’ admissions but a significant minority have almost negligible numbers of disadvantaged learners. Although contextual factors influence FSM and ‘ever 6’ rates significantly, there are still marked disparities which cannot be explained by such factors.

Each school faces a slightly different challenge.

Transparency and public understanding would be considerably improved by the publication of statistical information showing how grammar schools differ when assessed against a set of key indicators – and identifying clear improvement targets for each school. 

There seem to me to be strong grounds for incorporating schools’ performance against such targets into Ofsted’s inspection regime.

.

Progress Towards Reform

.

The Sutton Trust Research

Although the Grammar School Heads’ Association (GSHA) argues that it has pursued reform internally for some years, a much wider-ranging initiative has developed over the last twelve months, kicked off by the publication of a tranche of research by the Sutton Trust in November 2013.

This included the two publications cited above, by Cribb et al and NATCEN, plus a third piece by Jesson.

There was also an overarching summary report ‘Poor Grammar: Entry into Grammar Schools for disadvantaged pupils in England’.

This made six recommendations which, taken together, cover the full spectrum of action required to strengthen the schools’ capacity to admit more disadvantaged learners:

  • Review selection tests to ensure they are not a barrier to the admission of learners from disadvantaged backgrounds. The text remarks that:

‘Some grammar schools and local authorities are already trying to develop tests which are regularly changed, less susceptible to coaching, intelligence-based and not culturally biased.’

  • Reduce the advantage obtained by those who can pay for private tuition by making available a minimum of ten hours of test preparation to all applicants on a free or subsidised basis.
  • Improve grammar school outreach support, targeting learners from low and middle income backgrounds. This should include: assurances on access to transport and support with other costs; active encouragement for suitable Pupil Premium recipients to apply; using the media to dispel notions that grammar schools are exclusive and elitist; and deploying existing disadvantaged students as ambassadors.
  • Use the flexibility within the Admissions Code (at this point available only to academies) to prioritise the admission of high achieving students who are entitled to the pupil premium. There is also a suggestion that schools might: 

‘…consider giving preference to students from low or middle income households who reach a minimum threshold in the admission test’.

though it is not clear how this would comply with the Code.

  • Develop primary-grammar school partnerships to provide transition support for disadvantaged students, enabling primary schools to provide stronger encouragement for applications and reassure parents.
  • Develop partnerships with non-selective secondary schools:

‘…to ensure that high achieving students from low and middle income backgrounds have access to good local teachers in their areas.’

The Sutton Trust also made its own commitment to:

‘…look at ways that we can support innovation in improved testing, test preparation, outreach, admissions and collaboration.

We will also commission independent analysis of the impact of any such programmes to create an evidence base to enhance fair access to grammar schools.’

.

Reaction

Immediate reaction was predictably polarised. The GSHA was unhappy with the presentation of the report.

Its November 2013 Newsletter grumbles:

‘It is the way in which the research is presented by the Sutton Trust rather than any of research findings that give rise to concerns. Through a process of statistical machination the press release chose to lead on the claim that 6% of prep school pupils provide four times more grammar school pupils than the 16% of FSM eligible children. Inevitably, this led to headlines that the independent sector dominates admissions. The reality, of course is that 88% of all grammar school students come from state primary schools….

….Grammars select on ability and only 10% of FSM children reach level 5 at KS2 compared with a national average of 25%. The report, quite reasonably, uses level 5 as the indicator of grammar school potential. On the basis of this data the proportions of eligible FSM children in grammar schools is significantly greater than the overall FSM proportion in the top 500 comprehensives….

In 2012 just over 500 FSM children entered grammar schools. For the success rate of L5 FSM to match that of other L5 would require 200 more FSM children a year to enter grammar schools. Just one more in each school would virtually close the gap….

….The recommendations of the report are not, as claimed, either new or radical. All are areas that had already been identified by GSHA as options to aid access and represent practices that are already adopted by schools. This work, however, is usually carefully presented to avoid promotion of a coaching culture.

It is unfortunate that the press briefing both contributed to reinforcing the false stereotyping of grammar schools and failed to signal initiatives taken by grammar schools.’

There is evidence here of retaliatory ‘statistical machination’, together with a rather defensive attitude that may not bode well for the future.

On the other hand, HMCI Wilshaw was characteristically forthright in expressing an almost diametrically opposite opinion.

In December 2013 he is reported to have said:

‘Grammar schools are stuffed full of middle-class kids. A tiny percentage are on free school meals: 3%. That is a nonsense.

Anyone who thinks grammar schools are going to increase social mobility needs to look at those figures. I don’t think they work. The fact of the matter is that there will be calls for a return to the grammar school system. Well, look what is happening at the moment. Northern Ireland has a selective system and they did worse than us in the [international comparison] table. The grammar schools might do well with 10% of the school population, but everyone else does really badly. What we have to do is make sure all schools do well in the areas in which they are located.’

 .

The Laws Speech

Liberal Democrat Education Minister David Laws made clear the Government’s interest in reform with his June 2014 speech, already referenced above.

Early on in the speech he remarks that:

‘The debate about grammar schools seems to have been put in the political deep freeze – with no plans either to increase or reduce the number of what are extremely popular schools in their localities.’

With the benefit of hindsight, this seems rather ignorant of (or else disrespectful to) UKIP, which had nailed its colours to the mast just three weeks previously.

Laws acknowledges the challenge thrown down by Wilshaw, though without attribution:

‘Are you, as some would have it, “stuffed full of middle-class kids”?

Or are you opening up opportunities to all bright children regardless of their background, or can you do more?

Why is entry to grammar schools so often maligned?’

He says he wants to work with them ‘openly and constructively on social mobility’, to ‘consider what greater role they can play in breaking the cycles of disadvantage and closing the opportunity gap’, while accepting that the Government and the primary sector must also play their parts.

He suggests that the Government will do more to increase the supply of high attaining disadvantaged learners:

‘…a key barrier is the low level of free school meal pupils achieving level 5, typically a proxy for pupils you admit. So this is not just a challenge for grammar schools, but for the whole education system…

….My promise to you, alongside my challenge to you, is that this government will do everything in its power to make sure that more children from poorer backgrounds achieve their full potential.’

He lists the policies that:

‘Taken together, and over time…will start to shift the dial for poorer children – so that more and more reach level 5’

leading of course with the pupil premium.

He also proposes aspirational targets, though without any timescale attached:

‘My ambition is that all selective schools should aim for the same proportion of children on free school meals in their schools as in their local area.

This would mean an additional 3,500 free school meal pupils in selective schools every year, or an additional 35,000 pupils over 10 years.’

In relation to the flexibilities in the Admissions Code he adds:

‘I am pleased to be able to say that 32 grammar schools have implemented an admissions priority for pupils eligible for free school meals this year….

We in the Department for Education will fully support any school that chooses to change its admissions criteria in this way – in fact, I want to see all grammar schools give preference to pupil premium pupils over the next few years.’

Similarly, on coaching and testing:

‘…I really welcome the association’s work to encourage a move to entry tests that are less susceptible to coaching, and I am heartened to hear that at least 40% of grammar schools are now moving to the introduction of coaching resistant tests.

Again, I hope that all grammar schools will soon do so, and it will be interesting to see the impact of this.’

And he adds:

‘I want all schools to build on the progress that is being made and seek to close the gap by increasing parental engagement, and stronger working with local primaries – with a focus on identifying potential.’

So he overtly endorses several of the recommendations proposed by the Sutton Trust seven months earlier.

A Sutton Trust press release:

‘…welcomed the commitment by Schools Minister David Laws, to widening access to grammar schools and making the issue a priority in government’.

This may be a little over-optimistic.

A Collaborative Project Takes Shape

Laws also mentions in his speech that:

‘The GSHA will be working with us, the Sutton Trust and the University of Durham to explore ways in which access to grammar schools by highly able deprived children might be improved by looking more closely at the testing process and what may be limiting the engagement of pupils with it.’

The associated release from the Sutton Trust uses the present tense:

‘The Trust is currently working with the King Edward VI Foundation, which runs five grammar schools in Birmingham, Durham University, the Grammar School Heads Association and the Department for Education to target and evaluate the most effective strategies to broaden access to grammar schools.

A range of initiatives being run by the Foundation, including test familiarisation sessions at community locations, visits from primary schools and support for numeracy and literacy teaching for gifted and talented children at local primary schools, will be evaluated by Durham University to understand and compare their impact. The resulting analysis will provide a template for other grammar schools to work with.’

We know that Laws had been discussing these issues with the grammar schools for some time.

When he appeared before the Education Select Committee in February 2014 he said:

‘We are trying, for example, to talk to grammar schools about giving young people fairer access opportunities into those schools.  We are trying to allow them to use the pupil premium as a factor in their admissions policy.  We are trying to encourage them to ensure that testing is fairer to young people and is not just coachable. ‘

The repetition of ‘trying’ might suggest some reluctance on the part of grammar school representatives to engage on these issues.

Yet press coverage suggested the discussions were ongoing. In May the GSHA Newsletter stated that it had first met Laws to discuss admissions some eighteen months previously, so perhaps as early as November 2012.

It adds:

‘We are currently working on a research project with the DfE and the Sutton Trust to try to find out what practices help to reduce barriers to access for those parents and students from deprived backgrounds.’

A parallel report in another paper comments:

‘The grammar school heads have also gone into partnership with the education charity the Sutton Trust to support more able children from middle and lower income backgrounds applying to selective schools.

Other ideas being considered include putting on test familiarisation sessions for disadvantaged children – something they have missed out on in the past.’

While an entry on CEM’s website says:

‘Access Grammar:

This project seeks to look at ways access to grammar schools for highly able children from non-privileged backgrounds can be improved. The project will identify potential target cohorts in the study areas for a range of outreach interventions and will look to evaluate these activities. For this project, the CEM Research and Evaluation team are working in collaboration with the Sutton Trust, Grammar School Heads Association, King Edwards Foundation and the Department for Education.

Start date: January 2014
End date: January 2017.’

So we know that there is a five-way partnership engaged on a three-year project. The various statements describing the project’s objectives are all slightly different, although there is a clear resemblance between them, the aims articulated by Laws and the recommendations set out by the Sutton Trust.

But I searched in vain for any more detailed specification, including key milestones, funding and intended outcomes. It is not clear whether the taxpayer is contributing through DfE funding, or whether the Sutton Trust and/or other partners are meeting the cost.

Given that we are almost a year into the programme, there is a strong case for this material to be made public.

.

Progress on Admissions Criteria

Of the issues mentioned in the Sutton Trust’s recommendations – tests and test preparation, admissions flexibility, outreach and partnership with primary and non-selective secondary schools – those at the front of the list have been most prominent (though there is also evidence that the King Edward’s Foundation is pursuing reform across a wider front).

The GSHA’s May 2014 newsletter is less grumpy than its predecessor, but still strikes a rather defensive note.

It uses a now familiar statistic, but in a slightly different fashion:

‘The actual number of students with Level 5s in their SATs who either choose not to apply to a grammar school or who apply but do not receive a place is reckoned by GSHA and the DfE to be two hundred students a year; not the very large number that the percentages originally suggested.’

This is the third time we have encountered this particular assertion, but each time it has been articulated differently. Which of the three statements is correct?

The GSHA is also keen to emphasise that progress is being made independently through its own good offices. On admissions reform, the article says:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

The GSHA was also quoted in the TES, to the effect that 30 grammar schools had already been given permission by DfE to change their admissions policies and would do so with effect from September 2015, while a further five or six had already introduced the reform.

A November 2014 PQ reply updates the figures above, saying that 32 grammar schools have already prioritised disadvantaged learners in their admissions arrangements and a further 65 ‘intend to consult on doing so’.

That leaves 66 (40%) which are not giving this active consideration.

The Chief Executive of the GSHA commented:

‘“You won’t notice a dramatic change in schools themselves because the numbers are quite small…This is reaching out at the margins in a way that won’t deprive other people of a place. The real need is to raise the standard among free school meals pupils at Key Stage 1 and Key Stage 2, that’s the key issue.

“What we are looking at in the meantime is what we can do to help these free school meals pupils who want to come to grammar school.”

Mr Sindall said that many of the country’s 164 grammar schools would not change their policies because competition for places was less fierce and it would be unnecessary. Many schools were also increasing outreach programmes and some were running eleven-plus familiarisation sessions to help prepare poorer children for the test, he added.’

There is evidence here of a desire to play down the impact of such changes, to suggest that the supply of disadvantaged high achievers is too small to do otherwise.

The data analysis above suggests that almost all selective schools need to address the issue.

Between them, the various press reports mention admissions changes at several schools, including Rugby High, South Wilts, ‘a series of Buckinghamshire grammars including Sir William Borlase’s, Dr Challoner’s  and Aylesbury Grammar’, as well as the King Edward’s Foundation Schools in Birmingham.

I checked how these changes have been embodied in some of these schools’ admissions policies.

The reports indicated that Rugby was:

‘…going even further by reserving a fixed number of places for FSM-eligible children, so potentially accepting pupils with lower entrance exam scores than other applicants.’

Rugby’s admissions arrangements for 2015 do indeed include as a second overall admissions priority, immediately following children in care:

‘Up to 10 places for children living within the priority circle for children in receipt of Free School Meals whose scores are between one and ten marks below the qualifying score for entry to the school.’

South Wilts included FSM as an oversubscription criterion in its 2014 admission arrangements, replacing it with pupil premium eligibility in 2015. However, in both cases it is placed third after children in care and those living in the school’s designated [catchment] area.

Sir William Borlase’s goes one better, in that its 2015 admissions policy places children eligible for free school meals immediately after ‘children in care’ and before ‘children living in the catchment area of the school’, though again only in the oversubscription criteria.

The King Edward’s Foundation is pursuing a similar route to Rugby’s. It announced its intention to reform admissions to its five Birmingham grammar schools in April 2014:

‘The Government wishes to improve the social mobility of children in the UK and has urged selective schools to consider how their admission policies could be changed to achieve this. The King Edward VI Grammar Schools have applied to the Department for Education which can allow them to give preference in their policies, to children who are on free school meals, or have been at any point in the last six years…

… In addition the grammar schools will be offering familiarisation sessions which will introduce children from less privileged backgrounds to the idea of attending a grammar school and will encourage them to take the 11+.

All of the Grammar Schools have set themselves a target of a 20% intake of children on free school meals (Aston has already achieved this and has a target of 25%). The expansion of the grammar schools which was announced earlier this year means that these additional children will simply fill the additional space.’

According to the 2013 Performance Tables, the FSM rates at each of these schools in January 2013 were:

  • Aston – 12.9%
  • Camp Hill Boys – 3.6%
  • Camp Hill Girls – 5.3%
  • Five Ways – 2.6%
  • Handsworth Girls – 6.3%

There must have been a major improvement at Aston for the September 2013 admissions round. As for the other four schools, they would need to increase their FSM rates roughly three- to eight-fold to reach this target.
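As a rough check, the sketch below runs the arithmetic (it compares the January 2013 whole-school rates quoted above with the published intake targets, so it glosses over the difference between a whole-school rate and a single year’s intake):

```python
# Back-of-the-envelope check: the multiple by which each school's FSM rate
# would need to grow to hit the Foundation's stated target
# (20% of the intake, 25% at Aston). Rates are the January 2013
# whole-school figures quoted above, in percent.
fsm_jan_2013 = {
    "Aston": 12.9,
    "Camp Hill Boys": 3.6,
    "Camp Hill Girls": 5.3,
    "Five Ways": 2.6,
    "Handsworth Girls": 6.3,
}

for school, rate in fsm_jan_2013.items():
    target = 25.0 if school == "Aston" else 20.0
    print(f"{school}: {rate}% -> {target}% target, x{target / rate:.1f}")

# The four schools other than Aston come out at roughly x3.2 (Handsworth
# Girls) up to x7.7 (Five Ways).
```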

I wonder whether the targets are actually for ‘ever 6’ admissions?

In the event, the Foundation’s applications encountered some difficulties. In July the Admissions Adjudicator was obliged to reject them.

A parent had objected on the grounds that:

‘…it is necessary to request financial information from parents to achieve this priority which is contrary to paragraph 1.9(f) of the School Admissions Code.

… The objector further feels that it is unclear, unfair and unreasonable to use the pupil premium to differentiate between applications when the school is oversubscribed.’

The Adjudicator found in favour of the parent on the technical grounds that, although the schools had applied for variations of their funding agreements to permit this change, they had only done so retrospectively.

However, in each case:

‘The school is now entitled to give priority to girls eligible for the pupil premium as the funding agreement has been amended.’

By August the Foundation was able to state that the issue had been resolved:

‘Children applying for a place at any of the King Edward VI Grammar Schools must now achieve a minimum “qualifying score” in the test to be eligible for entry.

Any Looked After Child or previously Looked After Child (a child who is or has been in the care of the Local Authority) who achieves the “qualifying score” will be given priority for admission for up to 20% of the available places (25% at Aston).

Children eligible for Pupil Premium (those who have been registered for Free School meals at any point in the 6 years prior to the closing date for registration, 11 July 2014) who achieve the “qualifying score” will also be given priority for admission.

After this allocation, children not eligible for the Pupil Premium but who achieve the “qualifying score” will be admitted by rank order of scores until all places are filled.’
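Read as an allocation procedure, the rule might work roughly as sketched below. This is only one reading of the published wording (it does not spell out, for instance, how pupil premium children beyond the set-aside are treated; here they simply compete on score with everyone else), and all names and values in the sketch are mine:

```python
# Illustrative only: one reading of the King Edward VI oversubscription rule.
def allocate(applicants, places, set_aside_share=0.20, qualifying_score=100):
    """applicants: list of dicts with 'score', 'looked_after', 'pupil_premium'."""
    qualified = [a for a in applicants if a["score"] >= qualifying_score]
    set_aside = int(places * set_aside_share)  # 20% of places (25% at Aston)

    # Looked-after and pupil premium children who reach the qualifying
    # score are given priority for the set-aside places.
    priority = sorted(
        (a for a in qualified if a["looked_after"] or a["pupil_premium"]),
        key=lambda a: a["score"],
        reverse=True,
    )
    admitted = priority[:set_aside]

    # Remaining qualified children are admitted by rank order of score
    # until all places are filled.
    remainder = sorted(
        (a for a in qualified if a not in admitted),
        key=lambda a: a["score"],
        reverse=True,
    )
    admitted += remainder[: places - len(admitted)]
    return admitted
```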

The Foundation has published an interesting FAQ on the new arrangements:

‘Q5. Will this mean that if you are poor you won’t have to score as high in the 11+ admission tests?
A. That is essentially correct – up to 20% of places (25% at Aston) are set aside for pupil premium children who achieve “a qualifying score”. This qualifying score will be set before the test in September after we have reviewed data in order to ensure that children who achieve the score can flourish in our schools.

Q6. Why don’t you want the cleverest children at your school anymore?
A. We want our schools to represent the City of Birmingham and the diverse backgrounds that our children might come from. We believe that there are clever children out there who just don’t have the same opportunity to succeed as those from more privileged backgrounds and we want to try to do something about that.’

It acknowledges the magnitude of the challenge ahead:

‘John Collins, Secretary to the Governors of the charity The Schools of King Edward VI in Birmingham said “This is a hugely challenging target which we do not expect to achieve in the first few years of the initiative, as currently there are relatively few free school meal pupils who apply to take the test. These low numbers are something we are trying to address with our “familiarisation” programme which seeks to encourage bright children from less privileged backgrounds to take the test.”’

Also in July the Government opened up the same possibility for grammar schools that are not academies by consulting on amendments to the Admissions Code to permit this.

In October this was confirmed in the Government’s response to the consultation which stressed it was being introduced as an option rather than a universal requirement.

.

Progress on 11+ Test Reform

The new-style 11-plus tests developed by CEM have not had a universally positive reception. Much of the attention has been focused on their adoption by Buckinghamshire grammar schools.

The GSHA’s May 2014 newsletter notes that ‘some schools in the Midlands’ have been using CEM tests for five years. From 2015, 40% of grammar schools will be using these tests, which are:

‘…designed to be immune to the influence of coaching’

adding:

‘The analysis of data from Buckinghamshire (a wholly selective area which has recently switched to the CEM Centre tests) will provide us in time with valuable hard data on the large scale impact of the change over time.’

Back in February 2014 an Observer article had already cited positive feedback from Buckinghamshire:

‘Last autumn, a handful of education authorities in England introduced an exam designed to test a wider range of abilities – ones that are already being taught in primary schools, rather than skills that can be mastered through home tutoring – to make the selection system fairer.

Provisional results indicate that a more diverse selection of pupils passed this test, and headteachers say they feel the change has made a difference.

Ros Rochefort, headteacher at Bledlow Ridge primary school in Buckinghamshire…said that this year, for the first time in her career, the test has delivered a fair result. “All the kids who got through were expected to pass and, as usual, there are a couple of appeals coming through. All our very able children were selected….

…. Philip Wayne, headteacher at Chesham grammar school and chairman of the Bucks Grammar School Heads Association, has welcomed the changes and says he is “very confident” that the new test will avoid the current situation, in which many pupils who won places at his school with the help of intensive tutoring struggle to keep up with lessons once they arrive.’

However, there were contemporary reports that the 2013 tests led to a 6% fall (110 fewer pupils) in the proportion of places awarded to children from in-county state primary schools, even though 300 more pupils applied.

In September this was further developed in a Guardian story:

‘According to the data, a child from a Buckinghamshire private school is now more than three and a half times more likely to pass the 11-plus than a child from one of its state primaries….

…FOI requests to the eight secondary schools in Wycombe, which includes some of the most deprived and diverse wards in the county, suggest that children on free school meals and of Pakistani heritage have been less successful this year. ‘

A local pressure group, Local Equal and Excellent, has been trying to gather and analyse the data from the initial rounds of testing in 2013 and 2014 (ie for admission in 2014 and 2015).

Their most recent analysis complains about refusals to publish the full test data and offers an assessment based on the limited material that has been released.

In November 2014, the matter was discussed at Buckinghamshire’s Education, Skills and Children’s Services Select Committee.

The ‘results and analysis’ paper prepared by Buckinghamshire’s grammar school headteachers contains many words and far too few numbers.

The section on ‘Closing the gap’ says:

‘One local group has claimed that children from poorer backgrounds and BME have ‘done worse’ in the new Secondary Transfer Test. It is not specified what ‘worse’ means; however it is not reliable to make statements about trends and patterns for specific groups from a single year’s data and as stated above the data that has been used to make such claims is a small subset of the total and unrepresentative. To substantiate such claims a detailed analysis of additional information such as the current attainment of the children concerned would be needed. We are currently considering how a longitudinal study might be achieved.’

This is overly defensive and insufficiently transparent.

There is some disagreement about whether or not the new test is less amenable to coaching.

The ‘results and analysis’ paper says:

‘There is no such thing as a ‘tutor proof’ test. However, the new tests are less susceptible to the impact of specific test tutoring because they are aligned to the National Curriculum which all children study. Additionally, the questions in the new test are less predictable than in the previous test because they cover a wider range of topics and there is a broader range of question types – points acknowledged and welcomed by primary headteachers’.

Conversely, the pressure group says:

‘The new 11-plus, devised by the Centre for Evaluation and Monitoring (CEM) at Durham University, is supposed to rely less heavily on verbal reasoning and be more closely allied to the primary curriculum. Practice papers for the CEM test are supposed to be less readily available…

But… the fact that it is modelled on what can be taught in schools means the CEM test is more amenable to coaching… if children can’t be taught to get better in maths, why are we teaching it in schools? Practice will make anyone better and I see no sign that tuition has tailed off at all.’

Elsewhere there is evidence that 11+ testing is not immune to financial pressures. North Yorkshire is presently consulting on a plan to scale back its current arrangements: a familiarisation test followed by two sets of two full tests, with the best results taken forward.

Instead there would be a single set of tests taken by all candidates on the same day at a single venue, plus sample booklets in place of the familiarisation test. A system of reviews, enabling parents to provide supporting evidence to explain under-performance, would also be discontinued.

The reason is explicit:

‘The cost of administering an overly bureaucratic system of testing is no longer sustainable in the light of very significant cuts in public expenditure.’

Even though the draft impact assessment says that the Council will consider applications for support with transport from rural areas and for those with low incomes, there is some unacknowledged risk that the new arrangements will be detrimental to efforts to increase the proportion of disadvantaged learners admitted to these schools.

.

How Best to Close Excellence Gaps

.

What to do with the status quo

The next Government will inherit:

  • The Access Grammar reform project, outlined above, which is making some progress in the right direction, but needs closer scrutiny and probably more central direction. There is an obvious tension between Laws’ aspiration that all grammar schools should ‘give preference to pupil premium pupils over the next few years’ and the GSHA position, which is that many schools do not need to change their policies. It will be important that the changes to admissions arrangements for the 163 schools are catalogued and their impact on admissions monitored and made public, so that we can see at a glance which schools are leading the pack and which are laggards. A published progress report against the Sutton Trust’s six recommendations would help to establish future priorities. Greater transparency about the project itself is also highly desirable.
  • A small cadre of selective 16-19 free schools. It will need to articulate its position on academic selection at 16+ and might need to take action to ensure a level playing field with existing sixth form colleges. It might consider raising expectations of both new and existing institutions in respect of the admission of disadvantaged learners, so securing consistency between 11+ selection and 16+ selection.
  • Flexibility within the Admissions Code for all grammar schools – academies and LA-maintained alike – to prioritise the admission of disadvantaged learners. It may need to consider whether it should move further towards compulsion in respect of grammar schools, particularly if the GSHA maintains its position that many do not need to broaden their intake in this fashion.
  • Flexibility for all grammar schools to increase Planned Admission Numbers and, potentially, to submit proposals for the establishment of Satellite institutions. The approval of such proposals rests with the local authority in the case of a maintained school but with the Secretary of State for Education in respect of academies. An incoming government may need to consider what limits and conditions should be imposed on such expansion, including requirements relating to the admission of disadvantaged learners.

It may be helpful to clarify the position on satellites. The Coalition Government has confirmed that they can be established:

‘It is possible for an existing maintained grammar school or academy with selective arrangements to expand the number of places they offer, including by extending on to another site…There are, however, limitations on that sort of expansion, meaning it could only be a continuation of the existing school. The school admissions code is written from a presumption that those schools with a split site are a single school’ (Hansard, 16 February 2012, Col. 184W).

In December 2013, a proposal to establish a grammar school annexe in Sevenoaks, Kent was rejected by the Secretary of State on the grounds that it would constitute a new school:

‘Mr Gove’s legal ruling hinged on the issue of a girls’ grammar school being the sponsor of a Sevenoaks annexe for both girls and boys. The planned entry of Sevenoaks boys to the annexe lead Mr Gove to rule that the annexe’s proposed admissions policy was sufficiently different to the sponsor school’s girls-only admissions policy to constitute a wholly new grammar school.’

But a revised proposal was submitted in November 2014 for a girls’ only annexe. Moreover, the local authority has committed to exploring whether another satellite could be established in Maidenhead, acknowledging that this would require the co-operation of an existing grammar school.

The timing of the decision on the revised Sevenoaks proposal ensures that selection will remain a live issue as we approach the General Election.

Further options to promote between-school selection

There are several options for strengthening a pro-selection policy further that would not require the removal of statutory constraints on opening new 11-18 grammar schools, or permitting existing schools to change their character to permit selection.

For example:

  • Pursuing the Wilshavian notion of organising schools into geographical clusters, some with academic and others with vocational specialisms, and enabling learners to switch between them at 14+. In many areas these clusters will incorporate at least one grammar school; in others the ‘academic’ role would be undertaken by high-performing comprehensive schools with strong sixth forms. The practical difficulties associated with implementing this strategy ought not to be underplayed, however. For example, how much spare capacity would the system need to carry in order to respond to annual fluctuations in demand? How likely is it that students would wish to leave their grammar schools at 14 and what tests would incomers be expected to pass? Would the system also be able to accommodate those who still wished to change institution at age 16?
  • Vigorously expanding the cadre of post-16 selective free schools. There is presumably a largely unspent budget for up to twelve 16-19 maths free schools, though it will be vulnerable to cuts. It would be relatively straightforward to develop more, extending into other curricular specialisms and removing the obligatory university sponsorship requirement. Expansion could be focused on clones of the London Academy of Excellence and the Harris Westminster Sixth Form. But there should be standard minimum requirements for the admission of disadvantaged learners. A national network might be created which could help to drive improvements in neighbouring primary and secondary schools.
  • Permitting successful selective post-16 institutions to admit high-attaining disadvantaged students at age 14, to an academic pathway, as a parallel initiative to that which enables successful colleges to take in 14 year-olds wishing to study vocational qualifications. It may be that the existing scheme already permits this, since the curriculum requirements do not seem to specify a vocational pathway.

UKIP’s policy, as presently articulated, is merely enabling: few existing schools are likely to want to change their character in this fashion.

One assumes that Tory advocates would be satisfied with legislation permitting the establishment of new free schools that select at age 11 or age 14. It seems unlikely that anyone will push for the nuclear option of ‘a grammar school in every town’… but Conservative Voice will imminently reveal their hand.

.

Further options to promote within-school selection

If the political preference is to pursue within-school provision as an alternative to between-school selection, there are also several possibilities, including:

  • Encouraging the development of more bilateral schools with parallel grammar and non-selective streams and/or fast-track grammar streams within standard comprehensive schools.
  • Requiring, incentivising or promoting more setting in secondary schools, potentially prioritising the core subjects.
  • Developing a wider understanding of more radical and innovative grouping practices, such as vertical and cluster grouping, and trialling the impact of these through the EEF.

It would of course be important to design such interventions to benefit all students, but especially disadvantaged high attainers.

The Government might achieve the necessary leverage through a ‘presumption’ built into Ofsted’s inspection guidance (schools are presumed to favour the specified approach unless they can demonstrate that an alternative leads consistently to higher pupil outcomes) or through a ‘flexible framework’ quality standard.

.

A national student support scheme

The most efficient method of supporting attainment and social mobility amongst disadvantaged high attainers is through a national scheme that helps them directly, rather than targeting the schools and colleges that they attend.

This need not be a structured national programme, centrally delivered by a single provider. It could operate within a framework that brings greater coherence to the existing market and actively promotes the introduction of new suppliers to fill gaps in coverage and/or compete on quality. A ‘managed market’ if you will.

The essential elements would include:

  • This supply-side framework, covering the full range of disadvantaged students’ learning and development needs, within which all suppliers – universities, third sector, commercial, schools-based – would position their services (or they would be excluded from the scheme).
  • A commitment on the part of all state-funded schools and colleges to implement the scheme with their disadvantaged high attainers (the qualifying criterion might be FSM or ‘ever 6’) – and to ensure continuity and progression when and if these students change institution, especially at 16+.
  • A coherent learning and development programme for each eligible student throughout Years 7-13. Provision in KS3 might be open access and light touch, designed principally to identify those willing and able to pursue the programme into KS4 and KS5. Provision in these latter stages would be tailored to individuals’ needs and continuation would be dependent on progress against challenging but realistic personal targets, including specified GCSE grades.
  • Schools and colleges would act as facilitators and guides, conducting periodic reviews of students’ needs; helping them to identify suitable services from the framework; ensuring that their overall learning programmes – the in-school/college provision together with the services secured from the framework – constitute a coherent learning experience; helping them to maintain learning profiles detailing their progress and achievement.
  • Each learner would have a personal budget to meet costs attached to delivering his learning programme, especially costs attached to services provided through the framework. This would be paid through an endowment fund, refreshed by an annual £50m topslice from the pupil premium budget (analogous to that for literacy and numeracy catch-up) and a matching topslice from universities’ outreach budgets for fair access.
  • Universities would be strongly encouraged to make unconditional offers on the basis of high quality learning profiles, submitted by students as part of their admissions process.
  • There would be annual national targets for improving the GCSE and A level attainment of students participating in the scheme and for admission to – and graduation from – selective universities. This would include challenging but realistic targets for improving FSM admission to Oxbridge.

.

Conclusion

The current political debate is overly fixated on aspects of the wider problem, rather than considering the issue in the round.

I have set out above the far wider range of options that should be under consideration. These are not necessarily mutually exclusive.

If I were advising any political party inclined to take this seriously, I would recommend four essential components:

  • An enhanced strategy to ensure that all existing selective schools (including 16+ institutions) take in a larger proportion of high-attaining disadvantaged learners. Approval for expansion and any new schools would be conditional on meeting specified fair access targets.
  • Development of the cadre of 163 grammar schools into a national network, with direct responsibility for leading national efforts to increase the supply of high-attaining disadvantaged learners emerging from primary schools. Selective independent schools might also join the network, to fill gaps in the coverage and fulfil partnership expectations.
  • A policy to promote in all schools effective and innovative approaches to pupil grouping, enabling them to identify the circumstances in which different methods might work optimally and how best to implement those methods to achieve success. Schools would be encouraged to develop, trial and evaluate novel and hybrid approaches, so as to broaden the range of potential methods available.
  • A national support scheme for disadvantaged high attainers aged 11-19 meeting the broad specification set out above.

Regrettably, I fear that party political points-scoring will stand in the way of a rational solution.

Grammar schools have acquired a curious symbolic value, almost entirely independent of their true purpose and largely unaffected by the evidence base.

They are much like a flag of convenience that any politician anxious to show off his right-wing credentials can wave provocatively in the face of his opponents. There is an equivalent flag for abolitionists.  Anyone who proposes an alternative position is typically ignored.

.

GP

November 2014

Excellence Gaps Quality Standard: Version 1

 

This post is the first stage of a potential development project.

It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.

It aims to integrate two separate educational objectives:

  • Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
  • Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.

High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.

I have adopted new design parameters for this fresh venture into quality standards:

  • The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
  • The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe or they would otherwise be inadequate. I have erred on the side of minimalism for this first effort.
  • The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
  • The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.

The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.

There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)

The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.

The terminology will not necessarily be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. It is to be expected, for example, that primary schools will find some of the language more appropriate to secondary schools.

It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.

It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.
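Purely by way of illustration (this is not part of the standard), a setting’s best-fit judgement could be captured in something as simple as the record sketched below; the twelve element names are those of the standard, while the levels recorded and the evidence notes are invented:

```python
# Illustrative record of a best-fit self-evaluation against the standard.
# Element names come from the standard; judgements and evidence notes
# are invented for the example.
ELEMENTS = [
    "Performance", "Policy/strategy", "Classroom T&L", "Out of class learning",
    "Assessment/tracking", "Curriculum/organisation", "Ethos/pastoral",
    "Transition/progression", "Leadership, staffing, CPD", "Parents",
    "Resources", "Partnership/collaboration",
]
LEVELS = ("Emerging", "Improving", "Exemplary")

self_evaluation = {
    "Performance": ("Improving", "disadvantaged high attainers close to national similar learners"),
    "Policy/strategy": ("Emerging", "policy published; monitoring not yet systematic"),
    # ...judgements for the remaining ten elements follow the same pattern...
}

for element, (level, evidence) in self_evaluation.items():
    assert element in ELEMENTS and level in LEVELS
    print(f"{element}: {level} - {evidence}")
```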

Some of the terminology will be wanting, some important references will have been omitted while others will be over-egged. That is the nature of ‘aunt sallys’.

Feel free to propose amendments using the comments facility below.

The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.

 

 

Performance

Emerging (RI) – the setting meets essential minimum criteria: Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.

Improving (G) – in best fit terms the setting has progressed beyond entry level but is not yet exemplary: Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.

Exemplary (O) – the setting is a model for others to follow: Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally. Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.

For the remaining elements the middle column of the standard is empty, so each is described at two levels: Emerging (RI), where the setting meets essential minimum criteria, and Exemplary (O), where the setting is a model for others to follow.

Policy/strategy

Emerging (RI): There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored.

Exemplary (O): There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike.

.

SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected.

.

The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.

Classroom T&L

Emerging (RI): Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.

Exemplary (O): The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice.

.

All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance.

.

All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.

Out of class learning

Emerging (RI): A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.

Exemplary (O): A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate.

.

All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled.

.

Staff ensure that: learners’ needs are regularly assessed; they access and complete opportunities that match their needs; participation and performance are monitored and compiled in a learning record.

Assessment/tracking

Emerging (RI): Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.

Exemplary (O): Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance.

.

Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals.

.

Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness.

.

All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work.

.

Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.

Curriculum/organisation

Emerging (RI): The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.

Exemplary (O): The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this.

.

Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs.

.

Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility.

.

Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.

Ethos/pastoral

Emerging (RI): The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.

Exemplary (O): The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally.

.

Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem; hothousing is shunned.

.

High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportional response to efforts to undermine this culture.

.

Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed.

.

The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.

Transition/progression

Emerging (RI): The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.

Exemplary (O): Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect.

.

Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings.

.

Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs.

.

Destinations data is collected, published and used to inform monitoring.

.

Leadership, staffing, CPD

Emerging (RI): A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard). Professional development needs associated with closing excellence gaps are identified and addressed.

Exemplary (O): The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps.

.

A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard).

.

Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor.

.

There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part.

.

The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.

Parents

Emerging (RI): Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.

Exemplary (O): Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary.

.

Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning.

.

This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.

Resources

Emerging (RI): Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard). Where available, Pupil Premium is used effectively to support disadvantaged high achievers.

Exemplary (O): Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard).

.

The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting.

.

The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs.

.

Settings should evidence their commitment to these principles in published material (especially information required to be published about the use of Pupil Premium).

Partnership/collaboration

Emerging (RI): The setting takes an active role in collaborative activity to close excellence gaps.

Exemplary (O): Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality.

.

The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally.

.

The setting uses collaboration strategically to build its own capacity and improve its expertise.

 

Those who are not familiar with the quality standards approach may wish to know more.

Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.

The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever-more elaborate performance tables and an inspection regime that depends on fallible human inspectors and documentation that regulates towards convergence when it should be enabling diversity, albeit within defined parameters.

I see more value in supporting institutions through best-fit guidance of this kind.

My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.

I have written about the application of quality standards to gifted education and their benefits on several occasions.

Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.

Rather they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.

Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.

 

GP

October 2014

Beware the ‘short head’: PISA’s Resilient Students’ Measure

 

This post takes a closer look at the PISA concept of ‘resilient students’ – essentially a measure of disadvantaged high attainment amongst 15 year-olds – and how this varies from country to country.

The measure was addressed briefly in my recent review of the evidence base for excellence gaps in England but there was not space on that occasion to provide a thoroughgoing review.

The post is organised as follows:

  • A definition of the measure and explanation of how it has changed since the concept was first introduced.
  • A summary of key findings, including selected international comparisons, and of trends over recent PISA cycles.
  • A brief review of OECD and related research material about the characteristics of resilient learners.

I have not provided background about the nature of PISA assessments, but this can be found in previous posts about the mainstream PISA 2012 results and PISA 2012 Problem Solving.

 

Defining the resilient student

In 2011, the OECD published ‘Against the Odds: Disadvantaged students who succeed in school’, which introduced the notion of PISA as a study of resilience. It uses PISA 2006 data throughout and foregrounds science, as did the entire PISA 2006 cycle.

There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.

The international benchmark relates to the top third of PISA performers (ie above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.

Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.

This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.

The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.

The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).

‘Against the Odds’ describes it thus:

‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’

Further details are provided in the original PISA 2006 Report (p333).
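By way of illustration only, the sketch below shows how the original international benchmark could be approximated from pupil-level data. The function and column names are hypothetical, and it deliberately omits the adjustment PISA makes for socio-economic background when identifying the top third of performers.

```python
import pandas as pd

def flag_resilient(df: pd.DataFrame,
                   score_col: str = "science_score",
                   escs_col: str = "escs",
                   country_col: str = "country") -> pd.DataFrame:
    """Rough version of the 'Against the Odds' international benchmark:
    bottom third of ESCS within the student's own jurisdiction AND
    top third of performance across all jurisdictions pooled.
    (The real measure defines the top third after accounting for
    socio-economic background; that adjustment is omitted here.)"""
    out = df.copy()
    # Disadvantage is judged within each country/jurisdiction.
    out["disadvantaged"] = out.groupby(country_col)[escs_col].transform(
        lambda s: s <= s.quantile(1 / 3))
    # Performance is judged against the pooled international distribution.
    out["high_performer"] = out[score_col] >= out[score_col].quantile(2 / 3)
    out["resilient"] = out["disadvantaged"] & out["high_performer"]
    return out

# Share of disadvantaged students who are resilient, by jurisdiction:
# flag_resilient(pisa).groupby("country").apply(
#     lambda g: g.loc[g["disadvantaged"], "resilient"].mean())
```

The national benchmark described above differs only in that the performance cut-off would be taken within each jurisdiction rather than from the pooled distribution.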

Rather confusingly, the parameters of the international benchmark were subsequently changed.

PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes Volume II describes the new methodology in this fashion:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.

The new methodology is seemingly retained in PISA 2012 Results: Excellence through Equity: Giving every student the chance to succeed – Volume II:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’

However, multiplication by four is dispensed with.

This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable after some straightforward multiplication. However, the 2006 results foreground science, while in 2009 the focus is reading – and shifts on to maths in 2012.
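As a worked example of that multiplication, using figures that appear later in this post (and assuming the bottom ESCS quarter is exactly a quarter of the cohort):

```python
# PISA 2012 reports the share of resilient students among ALL students.
uk_share_of_all_students = 5.8        # per cent, UK, maths, PISA 2012

# Resilient students are drawn only from the bottom quarter of ESCS, so the
# share among DISADVANTAGED students is four times larger.
uk_share_of_disadvantaged = uk_share_of_all_students * 4   # = 23.2 per cent

# The 2006 'top third / bottom third' figures work in the same way, except
# that the multiplier is three rather than four.
```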

Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.

 

PISA 2006 results

The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.

 

[Chart: percentages of resilient students and disadvantaged low achievers in science, all jurisdictions (PISA 2006)]

By contrast, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion (since the indicator is based on ‘top third attainment, bottom third disadvantage’).

I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.

I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.

Chart 1 shows these results.

 

Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)

 

All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.

Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.

The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.

Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.

The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.

The first chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.

In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.

 

‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.

 


Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006

 

Amongst the jurisdictions for which we have data there is a relatively similar pattern, with between 47% and 56% of students resilient in all three subjects.

In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.

The document summarises the outcomes thus:

‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’

 

PISA 2009 Results

The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above mean they are not comparable with those for 2006.

The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.

[Graph: percentage of resilient students in reading, all participating jurisdictions (PISA 2009)]

 


Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)

 

The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.

The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.

There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.

The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in performance across our sample.

 


Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009

At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).

At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far such countries lag behind at both the top and the bottom of the attainment spectrum.

 

PISA 2012 Results

In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.

 

[Graph: percentage of resilient students in maths, all participating jurisdictions (PISA 2012)]

Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)

 

Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.

Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, the UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.

Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.

Within the OECD average, boys have a four percentage point lead over girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy. This is most pronounced in Poland. The difference in the UK is just two percentage points.

The comparison with disadvantaged low achievers is illustrated in Chart 6.

 


Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012

 

Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample where disadvantaged low achievers outnumber resilient students. The reverse is true in the UK, but only just!

 

Another OECD publication, ‘Strengthening Resilience through Education: PISA Results – background document’, contains a graph showing the variance in jurisdictions’ mathematical performance by deciles of socio-economic disadvantage. This is reproduced below.

 

[Graph: maths performance by decile of socio-economic disadvantage, by jurisdiction]

The text adds:

‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’

One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, at least as far as disadvantaged learners are concerned.

 

The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.

 

[Graph: change in the share of resilient students in maths, PISA 2003 to PISA 2012]

Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.

The UK is not included in this analysis because of issues with its PISA 2003 results.

Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.

It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.

The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.

 

[Graph: percentage of resilient students in problem solving, PISA 2012]

 

Resilience in the home countries

By way of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, comparing them with the top performer in my sample for each cycle and with the relevant OECD average. Problem-solving is omitted.

Only in science (using the ‘top third attainer, bottom third disadvantage’ formula) does the UK exceed the OECD average figure and then only slightly.

In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.

It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.

 


Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average

 

Unfortunately NFER does not pick up the concept of resilience in its analysis of England’s PISA 2012 results.

The only comparative analysis across the Home Countries that I can find is contained in a report prepared for the Northern Ireland Ministry of Education by NFER called ‘PISA 2009: Modelling achievement and resilience in Northern Ireland’ (March 2012).

This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.

The results show that 41% of English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.

Whether the same relationship holds true in maths and science, using the ‘top quartile, bottom quartile’ methodology, is unknown. One suspects, though, that the UK figures given above are a reasonable proxy for England.

 

The characteristics of resilient learners

‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:

  • Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (eg. number of books in the home, parental level of education) rather than parental occupation and income.
  • In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
  • Students with an immigrant background – either born outside the country of residence or with parents who were born outside the country – tend to be under-represented amongst resilient students.
  • Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
  • Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.

Volume III of the PISA 2012 Report: ‘Ready to Learn: Students’ engagement, drive and self-beliefs’ offers a further gloss on these characteristics from a mathematical perspective:

‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…

….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.

Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…

….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’

Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:

The aforementioned NFER study for Northern Ireland uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland using PISA 2009 data.

It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; possessions (classic literature) 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0-10 books in the home 0.35.

On the positive impact of selection the report observes:

‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’
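For readers unfamiliar with how odds ratios of this kind are produced, here is a much-simplified sketch using entirely invented data. The real NFER analysis is a multi-level model with school effects; this single-level logistic regression is illustrative only, and the variable names and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Invented pupil-level predictors of 'resilience'.
pupils = pd.DataFrame({
    "grammar_school": rng.integers(0, 2, n),
    "female":         rng.integers(0, 2, n),
    "books_0_10":     rng.integers(0, 2, n),
})

# Invented data-generating process, loosely echoing the direction of the
# published ratios (positive for grammar school and female, negative for
# having very few books at home).
log_odds = (-2.0 + 2.0 * pupils["grammar_school"]
            + 0.7 * pupils["female"] - 1.0 * pupils["books_0_10"])
pupils["resilient"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(pupils[["grammar_school", "female", "books_0_10"]])
fit = sm.Logit(pupils["resilient"], X).fit(disp=0)
print(np.exp(fit.params))   # exponentiated coefficients are the odds ratios
```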

Another study – ‘Predicting academic resilience with mathematics learning and demographic variables’ (Cheung et al 2014) – concludes that, amongst East Asian jurisdictions such as Hong Kong, Japan and South Korea, resilience is associated with avoidance of ‘redoublement’ (grade repetition) and with having attended kindergarten for more than a year.

Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.

Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.

In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.

Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.

Resilience varies in different subjects according to year of education.

[Chart: resilience in Turkey by subject and year of education]

There are also significant regional differences.

Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.

An Italian study – ‘A way to resilience: How can Italian disadvantaged students and schools close the achievement gap?’ (Agasisti and Longobardi, undated) uses PISA 2009 data to examine the characteristics of resilient students attending schools with high levels of disadvantage.

This confirms some of the findings above in respect of student characteristics, finding a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.

School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.

A second Italian study – ‘Does public spending improve educational resilience? A longitudinal analysis of OECD PISA data’ (Agasisti et al 2014) finds a positive correlation between the proportion of a country’s public expenditure devoted to education and the proportion of resilient students.

Finally, this commentary from Marc Tucker in the US links its relatively low incidence of resilient students to national views about the nature of ability:

‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability.  This leads to much lower expectations for students who come from low-income families…

My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…

… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’

 

Conclusion

What are we to make of all this?

It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.

The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.

We are not alone in facing this difficulty, although it is significantly more pronounced than in several of our most prominent PISA competitors.

We should be worrying as much about our ‘short head’ as our ‘long tail’.

 

GP

September 2014

 

 

 

 

 

 

Closing England’s Excellence Gaps: Part 2

This is the second part of an extended post considering what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.


Mind the Gap by Clicsouris

Part one provided an England-specific definition, articulated a provisional theoretical model for addressing excellence gaps and set out the published data about the size of excellence gaps at Key Stages 2, 4 and 5 respectively.

Part two continues to review the evidence base for excellence gaps, covering the question of whether high attainers remain so, international comparisons data and related research, and excellence gaps analysis from the USA.

It also describes those elements of present government policy that impact directly on excellence gaps and offers some recommendations for strengthening our national emphasis on this important issue.

 

Whether disadvantaged high achievers remain so

 

The Characteristics of High Attainers

The Characteristics of high attainers (DfES 2007) includes investigation of:

  • whether pupils in the top 10% at KS4 in 2006 were also high attainers at KS3 in 2004 and KS2 in 2001, by matching back to their fine grade point scores; and
  • chances of being a KS4 high attainer given a range of pupil characteristics at KS2 and KS3.

On the first point it finds that 4% of all pupils remain in the top 10% throughout, while 83% of pupils are never in the top 10% group.

Some 63% of those who were high attainers at the end of KS2 are still high attainers at the end of KS3, while 72% of KS3 high attainers are still in that group at the end of KS4. Approximately half of high attainers at KS2 are high attainers at KS4.

The calculation is not repeated for advantaged and disadvantaged high attainers respectively, but this shows that – while there is relatively little movement between the high attaining population and other learners (with only 17% of the overall population falling within scope at any point) – there is a sizeable ‘drop out’ amongst high attainers at each key stage.
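A sketch of how retention figures of this kind could be reproduced from pupil-level data follows; the column and function names are hypothetical and stand in for National Pupil Database fields.

```python
import pandas as pd

def top_decile_flags(pupils: pd.DataFrame) -> pd.DataFrame:
    """Flag whether each pupil sits in the top 10% at each key stage."""
    flags = pd.DataFrame(index=pupils.index)
    for stage in ["ks2", "ks3", "ks4"]:
        cutoff = pupils[f"{stage}_score"].quantile(0.9)
        flags[stage] = pupils[f"{stage}_score"] >= cutoff
    return flags

# Retention rates of the kind quoted above:
# flags = top_decile_flags(pupils)
# print(flags.loc[flags["ks2"], "ks3"].mean())   # KS2 high attainers still there at KS3
# print(flags.loc[flags["ks2"], "ks4"].mean())   # KS2 high attainers still there at KS4
# print(flags.all(axis=1).mean())                # in the top 10% throughout
```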

Turning to the second point, logistic regression is used to calculate the odds of being a KS4 high attainer given different levels of prior attainment and a range of pupil characteristics. Results are controlled for prior attainment and for other characteristics, so as to isolate the impact of each characteristic individually.

The study finds that pupils with a KS2 average points score (APS) above 33 are more likely than not to be high attainers at KS4, and this probability increases as their KS2 APS increases. For those with an APS of 36, the odds are 23.73, meaning they have a 24/25 chance of being a KS4 high attainer.

For FSM-eligible learners though, the odds are 0.55, meaning that the chances of being a KS4 high attainer are 45% lower amongst FSM-eligible pupils, compared with their non-FSM counterparts with similar prior attainment and characteristics.

The full set of findings for individual characteristics is reproduced below.

[Table: odds of being a KS4 high attainer by pupil characteristic (DfES 2007)]

 

An appendix supplies the exact ratios for each characteristic and the text points out that these can be multiplied to calculate odds ratios for different combinations.

The odds for different prior attainment levels and other characteristics combined with FSM eligibility are not worked through, but could easily be calculated. It would be extremely worthwhile to repeat this analysis using more recent data to see whether the results would be replicated for those completing KS4 in 2014.
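To illustrate the arithmetic (on the assumption, not spelled out in the report, that the 23.73 baseline refers to otherwise-average, non-FSM pupils):

```python
def odds_to_probability(odds: float) -> float:
    """Convert odds p/(1-p) back to a probability p."""
    return odds / (1 + odds)

odds_aps_36 = 23.73   # odds of being a KS4 high attainer with a KS2 APS of 36
or_fsm = 0.55         # odds ratio attached to FSM eligibility

print(round(odds_to_probability(odds_aps_36), 2))           # 0.96 - the '24/25 chance'
print(round(odds_to_probability(odds_aps_36 * or_fsm), 2))  # ~0.93 once FSM eligibility is factored in
```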

 

Sutton Trust

In 2008, the Sutton Trust published ‘Wasted talent? Attrition rates of high achieving pupils between school and university’ which examines the attrition rates for FSM-eligible learners among the top 20% of performers at KS2, KS3 and KS4.

A footnote says that this calculation was ‘on the basis of their English and maths scores at age 11, and at later stages of schooling’, which is somewhat unclear. A single, unidentified cohort is tracked across key stages.

The report suggests ‘extremely high rates of ‘leakage’ amongst the least privileged pupils’. The key finding is that two-thirds of disadvantaged top performers at KS2 are no longer amongst the top performers at KS4, whereas the equivalent figure for advantaged top performers is 42%.

 

EPPSE

Also in the longitudinal tradition, ‘Performing against the odds: developmental trajectories of children in the EPPSE 3-16 study’ (Siraj-Blatchford et al, June 2011) used interviews to investigate the factors that enabled a small group of disadvantaged learners to ‘succeed against the odds’.

Twenty learners were identified who were at the end of KS3 or at KS4 and who had achieved well above predicted levels in English and maths at the end of KS2. Achievement was predicted for the full sample of 2,800 children within the EPPSE study via multi-level modelling, generating:

‘…residual scores for each individual child, indicating the differences between predicted and attained achievement at age 11, while controlling for certain child characteristics (i.e., age, gender, birth weight, and the presence of developmental problems) and family characteristics (i.e., mothers’ education, fathers’ education, socio-economic status [SES] and family income). ‘

The 20 identified as succeeding against the odds had KS2 residual scores for both English and maths within the highest 20% of the sample. ‘Development trajectories’ were created for the group using a range of assessments conducted at age 3, 4, 5, 7, 11 and 14.

The highest job level held in the family when the children were aged 3-4 was manual, semi-skilled or unskilled, or the parent(s) had never worked.

The 20 were selected randomly within gender – eight boys and 12 girls – while ensuring representation of ‘the bigger minority ethnic groups’. The group included nine students characterised as White UK, five Black Caribbean, two Black African and one each of Indian (Sikh), Pakistani, Mixed Heritage and Indian (Hindu).

Interviews were conducted with children, parents and the teacher at their secondary school whom the learners felt ‘knew them best’. Teacher interviews were secured for 11 of the 20.

Comparison of development trajectories showed significant gaps between this ‘low SES high attainment’ group and a comparative sample of ‘low SES, predicted attainment’ students. They were ahead from the outset and pulled further away.

They also exceeded a comparator group of high SES learners performing at predicted levels from entry to primary education until KS2. Even at KS3, 16 of the 20 were still performing above the mean of the high SES sample.

These profiles – illustrated in the two charts below – were very similar in English and maths. In both cases, Group 1 are those with ‘low SES, high attainment’, while Group 4 are ‘high SES, predicted attainment’ students.

 

[Chart: developmental trajectories in English, by group]

[Chart: developmental trajectories in maths, by group]

 

Interviews identified five factors that helped to explain this success:

  • The child’s perceived cognitive ability, strong motivation for school and learning and their hobbies and interests. Most parents and children regarded cognitive ability as ‘inherent to the child’, but they had experienced many opportunities to develop their abilities and received support in developing a ‘positive self-image’. Parenting ‘reflected a belief in the parent’s efficacy to positively influence the child’s learning’. Children also demonstrated ability to self-regulate and positive attitudes to homework. They had a positive attitude to learning and made frequent use of books and computers for this purpose. They used school and learning as distractions from wider family problems. Many were driven to learn, to succeed educationally and achieve future aspirations.
  • Home context – effective practical and emotional support with school and learning. Families undertook a wide range of learning activities, especially in the early years. These were perceived as enjoyable but also valuable preparation for subsequent schooling. During the primary years, almost all families actively stimulated their children to read. In the secondary years, many parents felt their efforts to regulate their children’s activities and set boundaries were significant. Parents also provided practical support with school and learning, taking an active interest and interacting with their child’s school. Their parenting style is described as ‘authoritative: warm, firm and accepting of their needs for psychological autonomy but demanding’. They set clear standards and boundaries for behaviour while granting extra autonomy as their children matured. They set high expectations and felt strongly responsible for their child’s education and attitude to learning. They believed in their capacity to influence their children positively. Some were motivated by the educational difficulties they had experienced.
  • (Pre-)School environment – teachers who are sensitive and responsive to the child’s needs and use ‘an authoritative approach to teaching and interactive teaching strategies’; and, additionally, supportive school policies. Parents had a positive perception of the value of pre-school education, though the value of highly effective pre-school provision was not clear cut with this sample. Moreover, ‘very few clear patterns of association could be discerned between primary school effectiveness and development of rankings on trajectories’. That said, both parents and children recognised that their schools had helped them address learning and behavioural difficulties. Success was attributed to the quality of teachers. ‘They thought that good quality teaching meant that teachers were able to explain things clearly, were enthusiastic about the subject they taught, were approachable when things were difficult to understand, were generally friendly, had control over the class and clearly communicated their expectations and boundaries.’
  • Peers providing practical, emotional and motivational support. Friends were especially valuable in helping them to respond to difficulties, helping in class, with homework and revision. Such support was often mutual, helping to build understanding and develop self-esteem, as a consequence of undertaking the role of teacher. Friends also provided role models and competitors.
  • Similar support provided by the extended family and wider social, cultural and religious communities. Parents encouraged their children to take part in extra-curricular activities and were often aware of their educational benefits. Family networks often provided additional learning experiences, particularly for Caribbean and some Asian families.

 

Ofsted

Ofsted’s The most able students: Are they doing as well as they should in our non-selective secondary schools? (2013) defines this population rather convolutedly as those:

‘…starting secondary school in Year 7 attaining level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’ (Footnote p6-7)

There is relatively little data in the report about the performance of high-attaining disadvantaged learners, other than the statement that only 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points.

I have been unable to find national transition matrices for advantaged and disadvantaged learners, which would enable us to compare the proportions of advantaged and disadvantaged pupils making and exceeding the expected progress between key stages.

 

Regression to the mean and efforts to circumvent it

Much prominence has been given to Feinstein’s 2003 finding that, whereas high-scoring children from advantaged and disadvantaged backgrounds (defined by parental occupation) perform at a broadly similar level when tested at 22 months, the disadvantaged group are subsequently overtaken by relatively low-scoring children from advantaged backgrounds during the primary school years.

The diagram that summarises this relationship has been reproduced widely and much used as the centrepiece of arguments justifying efforts to improve social mobility.

[Chart: Feinstein’s 2003 trajectories of high- and low-scoring children from advantaged and disadvantaged backgrounds]

But Feinstein’s findings were subsequently challenged on methodological grounds associated with the effects of regression to the mean.

Jerrim and Vignoles (2011) concluded:

‘There is currently an overwhelming view amongst academics and policymakers that highly able children from poor homes get overtaken by their affluent (but less able) peers before the end of primary school. Although this empirical finding is treated as a stylised fact, the methodology used to reach this conclusion is seriously flawed. After attempting to correct for the aforementioned statistical problem, we find little evidence that this is actually the case. Hence we strongly recommend that any future work on high ability–disadvantaged groups takes the problem of regression to the mean fully into account.’

On the other hand, Whitty and Anders comment:

‘Although some doubt has been raised regarding this analysis on account of the potential for regression to the mean to exaggerate the phenomenon (Jerrim and Vignoles, 2011), it is highly unlikely that this would overturn the core finding that high SES, lower ability children catch up with their low-SES, higher-ability peers.’

Their point is borne out by ‘Progress made by high-attaining children from disadvantaged backgrounds’ (June 2014), which suggests that Vignoles, as part of the writing team, has changed her mind somewhat since 2011.

This research adopts a methodological route to minimise the impact of regression to the mean. This involves assigning learners to achievement groups using a different test to those used to follow their attainment trajectories and focusing principally on those trajectories from KS2 onwards.

The high attaining group is defined as those achieving Level 3 or above in KS1 writing, which selects 12.6% of the sample. (For comparison, the same calculations are undertaken based on achieving L3 or above in KS1 maths.) These pupils are ranked and assigned a percentile on the basis of their performance on the remaining KS1 tests and at each subsequent key stage.
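A minimal sketch of that ranking step, with hypothetical column names standing in for the National Pupil Database fields the study actually uses:

```python
import pandas as pd

def trajectory_percentiles(pupils: pd.DataFrame) -> pd.DataFrame:
    """Select the 'high attaining at 7' group on one measure (KS1 writing L3+),
    then express their scores at each later stage as percentiles of that group.
    Selecting on a different measure from the ones tracked is what limits
    regression to the mean."""
    high = pupils[pupils["ks1_writing_level"] >= 3].copy()
    for col in ["ks1_other_score", "ks2_score", "ks3_score", "ks4_score"]:
        high[col + "_pctile"] = high[col].rank(pct=True) * 100
    return high

# Mean trajectory by deprivation quintile (1 = most deprived, 5 = least deprived):
# trajectory_percentiles(pupils).groupby("deprivation_quintile")[
#     ["ks2_score_pctile", "ks3_score_pctile", "ks4_score_pctile"]].mean()
```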

The chart summarising the outcomes in the period from KS1 to KS4 is reproduced below, showing the different trajectories of the ‘most deprived’ and ‘least deprived’. These are upper and lower quintile groups of state school students derived on the basis of FSM eligibility and a set of area-based measures of disadvantage and measures of socio-economic status derived from the census.

 

[Chart: attainment trajectories of high-attaining pupils from the most and least deprived quintiles, KS1 to KS4]

The trajectories do not alter significantly beyond KS4.

The study concludes:

‘…children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’

Hence:

‘The period between Key Stage 2 and Key Stage 4 appears to be a crucial time to ensure that higher-achieving pupils from poor backgrounds remain on a high achievement trajectory.’

In short, a Feinstein-like relationship is established but it operates at a somewhat later stage in the educational process.

 

International comparisons studies

 

PISA: Resilience

OECD PISA studies have recently begun to report on the performance of what they call ‘resilient’ learners.

Against the Odds: Disadvantaged Students Who Succeed in Schools (OECD, 2011) describes this population as those who fall within the bottom third of their country’s distribution by socio-economic background, but who achieve within the top third on PISA assessments across participating countries.

This publication uses PISA 2006 science results as the basis of its calculations. The relative position of different countries is shown in the chart reproduced below. Hong Kong tops the league at 24.8%, the UK is at 13.5%, slightly above the OECD average of 13%, while the USA is languishing at 9.9%.

[Chart: percentage of resilient students in science by country (PISA 2006)]

The findings were discussed further in PISA in Focus 5 (OECD 2011), where PISA 2009 data is used to make the calculation. The methodology is also significantly adjusted, so that it captures a substantially smaller population:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

According to this measure, the UK is at 24% and the US has leapfrogged it at 28%. Both are below the OECD average of 31%, while Shanghai and Hong Kong stand at over 70%.

The Report on PISA 2012 (OECD 2013) retains the more demanding definition of resilience, but dispenses with multiplication by 4, so these results need to be so multiplied to be comparable with those for 2009.

This time round, Shanghai is at 19.2% (76.8%) and Hong Kong at 18.1% (72.4%). The OECD average is 6.4% (25.6%), the UK at 5.8% (23.2%) and the US at 5.2% (20.8%).

So the UK has lost a little ground compared with 2009, but is much closer to the OECD average and has overtaken the US, which has fallen back by some seven percentage points.

I could find no commentary on these changes.

NFER has undertaken some work on resilience in Northern Ireland, using PISA 2009 reading results (and the original ‘one third’ methodology) as a base. This includes odds ratios for different characteristics of being resilient. This could be replicated for England using PISA 2012 data and the latest definition of resilience.

 

Research on socio-economic gradients

The Socio-Economic Gradient in Teenagers’ Reading Skills: How Does England Compare with Other Countries? (Jerrim 2012) compares the performance of students within the highest and lowest quintiles of the ISEI Index of Occupational Status on the PISA 2009 reading tests.

It quantifies the proportion of these two populations within each decile of achievement, so generating a gradient, before reviewing how this gradient has changed between PISA 2000 and PISA 2009, comparing outcomes for England, Australia, Canada, Finland, Germany and the US.
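A rough sketch of that calculation follows, with invented column names; the real analysis works from PISA plausible values and survey weights, which are ignored here.

```python
import pandas as pd

def achievement_gradient(students: pd.DataFrame,
                         score_col: str = "reading_score",
                         sei_col: str = "isei") -> pd.Series:
    """For the top and bottom quintiles of the ISEI occupational status index,
    return the share of each group falling within each decile of achievement."""
    df = students.copy()
    df["achievement_decile"] = pd.qcut(df[score_col], 10, labels=False) + 1
    df["sei_quintile"] = pd.qcut(df[sei_col], 5, labels=False) + 1
    extremes = df[df["sei_quintile"].isin([1, 5])]
    counts = extremes.groupby(["sei_quintile", "achievement_decile"]).size()
    return counts / counts.groupby(level="sei_quintile").transform("sum")
```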

Jerrim summarises his findings thus:

‘The difference between advantaged and disadvantaged children’s PISA 2009 reading test scores in England is similar (on average) to that in most other developed countries (including Australia, Germany and, to some extent, the US). This is in contrast to previous studies from the 1990s, which suggested that there was a particularly large socio-economic gap in English pupils’ academic achievement.

Yet the association between family background and high achievement seems to be stronger in England than elsewhere.

There is some evidence that the socio-economic achievement gradient has been reduced in England over the last decade, although not amongst the most able pupils from advantaged and disadvantaged homes.’

Jerrim finds that the link in England between family background and high achievement is stronger than in most other OECD countries, whereas this is not the case at the other end of the distribution.

He hypothesises that this might be attributable to recent policy focus on reducing the ‘long tail’ while:

‘much less attention seems to be paid to helping disadvantaged children who are already doing reasonably well to push on and reach the top grades’.

He dismisses the notion that the difference is associated with the fact that disadvantaged children are concentrated in lower-performing schools, since it persists even when controls for school effects are introduced.

In considering why PISA scores show the achievement gap in reading has reduced between 2000 and 2009 at the lower end of the attainment distribution but not at the top, he cites two possibilities: that Government policy has been disproportionately successful at the lower end; and that there has been a more substantial decline in achievement amongst learners from advantaged backgrounds than amongst their disadvantaged peers. He is unable to rule out the latter possibility.

He also notes in passing that PISA scores in maths do not generate the same pattern.

These arguments are further developed in ‘The Reading Gap: The socio-economic gap in children’s reading skills: A cross-national comparison using PISA 2009’ (Jerrim, 2013) which applies the same methodology.

This finds that high-achieving (top decile of the test distribution) boys from the most advantaged quintile in England are two years and seven months ahead of high-achieving boys from the most disadvantaged quintile, while the comparable gap for girls is slightly lower, at two years and four months.

The chart reproduced below illustrates international comparisons for boys. It shows that only Scotland has a larger high achievement gap than England. (The black lines indicate 99% confidence intervals – he attributes the uncertainty to ‘sampling variation’.)

Gaps in countries at the bottom of the table are approximately half the size of those in England and Scotland.

[Chart: socio-economic gap in high achievement in reading for boys, by country (PISA 2009)]

 

One of the report’s recommendations is that:

‘The coalition government has demonstrated its commitment to disadvantaged pupils by establishing the Education Endowment Foundation… A key part of this Foundation’s future work should be to ensure highly able children from disadvantaged backgrounds succeed in school and have the opportunity to enter top universities and professional jobs. The government should provide additional resources to the foundation to trial interventions that specifically target already high achieving children from disadvantaged homes. These should be evaluated using robust evaluation methodologies (e.g. randomised control trials) so that policymakers develop a better understanding of what schemes really have the potential to work.’

The study is published by the Sutton Trust whose Chairman – Sir Peter Lampl – is also chairman of the EEF.

In ‘Family background and access to high ‘status’ universities’ (2013) Jerrim provides a different chart showing estimates by country of disadvantaged high achieving learners. The measure of achievement is PISA Level 5 in reading and the measure of disadvantage remains quintiles derived from the ISEI index.

[Chart: estimated proportions of disadvantaged students achieving PISA Level 5 in reading, by country]

The underlying figures are not supplied.

Also in 2013, in ‘The mathematical skills of school children: how does England compare to the high-performing East Asian jurisdictions?’ Jerrim and Choi construct a similar gradient for maths, drawing on a mix of PISA and TIMSS assessments conducted between 2003 and 2009, so enabling them to consider variation according to the age at which assessment takes place.

The international tests selected are TIMSS 2003, 4th grade; TIMSS 2007, 8th grade and PISA 2009. The differences between what these tests measure are described as ‘slight’. The analysis of achievement relies on deciles of the achievement distribution.

Thirteen comparator countries are included: six wealthy western economies, three ‘middle income’ western economies and four Asian Tigers (Hong Kong, Japan, Singapore and Taiwan).

This study applies as the best available proxy for socio-economic status the number of books in the family home, comparing the most advantaged (over 200 books) with the least (under 25 books). It acknowledges the limitations of this proxy, which Jerrim discusses elsewhere.

The evidence suggests that:

‘between primary school and the end of secondary school, the gap between the lowest achieving children in England and the lowest achieving children in East Asian countries is reduced’

but remains significant.

Conversely, results for the top 10% of the distribution:

‘suggest that the gap between the highest achieving children in England and the highest achieving children in East Asia increases between the end of primary school and the end of secondary school’.

The latter outcome is illustrated in the chart reproduced below.

[Chart: gap between the highest-achieving children in England and in East Asia, by age of assessment]

 

The authors do not consider variation by socio-economic background amongst the high-achieving cohort, presumably because the data still does not support the pattern they previously identified for reading.

 

US studies

In 2007 the Jack Kent Cooke Foundation published ‘Achievement Trap: How America is Failing Millions of High-Achieving Students from Low Income Backgrounds’ (Wyner, Bridgeland, Diiulio). The text was subsequently revised in 2009.

This focuses exclusively on gaps attributable to socio-economic status, by comparing the performance of those in the top and bottom halves of the family income distribution in the US, as adjusted for family size.

The achievement measure is top quartile performance on nationally normalised exams administered within two longitudinal studies: The National Education Longitudinal Study (NELS) and the Baccalaureate and Beyond Longitudinal Study (B&B).

The study reports that relatively few lower income students remain high achievers throughout their time in elementary and high school:

  • 56% remain high achievers in reading by Grade 5, compared with 69% of higher income students.
  • 25% fall out of the high achiever cohort in high school, compared with 16% of higher income students.
  • Higher income learners who are not high achievers in Grade 1 are more than twice as likely to be high achievers by Grade 5. The same is true between Grades 8 and 12.

2007 also saw the publication of ‘Overlooked Gems: A national perspective on low income promising learners’ (Van Tassel-Baska and Stambaugh). This is a compilation of the proceedings of a 2006 conference which does not attempt a single definition of the target group, but draws on a variety of different research studies and programmes, each with different starting points.

An influential 2009 McKinsey study ‘The Economic Impact of the Achievement Gap in America’s Schools’ acknowledges the existence of what it calls a ‘top gap’. The term is used with reference to:

  • the number of top performers and the level of top performance in the US compared with other countries and
  • the gap in the US between the proportion of Black/Latino students and the proportion of all students achieving top levels of performance.

The authors discuss the colossal economic costs of achievement gaps more generally, but fail to extend this analysis to the ‘top gap’ specifically.

In 2010 ‘Mind the Other Gap: The Growing Excellence Gap in K-12 Education’ (Plucker, Burroughs and Song) was published – and seems to have been the first study to use this term.

The authors define such gaps straightforwardly as

‘Differences between subgroups of students performing at the highest levels of achievement’

The measures of high achievement deployed are the advanced standards on US NAEP maths and reading tests, at Grades 4 and 8 respectively.

The study identifies gaps based on four sets of learner characteristics:

  • Socio-economic status (eligible or not for free or reduced price lunch).
  • Ethnic background (White versus Black and/or Hispanic).
  • English language proficiency (what we in England would call EAL, compared with non-EAL).
  • Gender (girls versus boys).

Each characteristic is dealt with in isolation, so there is no discussion of the gaps between – for example – disadvantaged Black/Hispanic and disadvantaged White boys.

In relation to socio-economic achievement gaps, Plucker et al find that:

  • In Grade 4 maths, from 1996 to 2007, the proportion of advantaged learners achieving the advanced level increased by 5.6 percentage points, while the proportion of disadvantaged learners doing so increased by 1.2 percentage points. In Grade 8 maths, these percentage point changes were 5.7 and 0.8 percentage points respectively. Allowing for changes in the size of the advantaged and disadvantaged cohorts, excellence gaps are estimated to have widened by 4.1 percentage points in Grade 4 (to 7.3%) and 4.9 percentage points in Grade 8 (to 8.2%).
  • In Grade 4 reading, from 1998 to 2007, the proportion of advantaged learners achieving the advanced level increased by 1.2 percentage points, while the proportion of disadvantaged students doing so increased by 0.8 percentage points. In Grade 8 reading, these percentage point changes were almost negligible for both groups. The Grade 4 excellence gap is estimated to have increased slightly, by 0.4 percentage points (to 9.4%) whereas Grade 8 gaps have increased minimally by 0.2 percentage points (to 3.1%).

They observe that the size of excellence gaps is, at best, only moderately correlated with the size of gaps at lower levels of achievement.

There is a weak relationship between gaps at basic and advanced level – indeed ‘smaller achievement gaps among minimally competent students is related to larger gaps among advanced students’ – but there is some inter-relationship between those at proficient and advanced level.

They conclude that, whereas No Child Left Behind (NCLB) helped to narrow achievement gaps, this does not extend to high achievers.

There is no substantive evidence that the NCLB focus on lower achievers has increased the excellence gap, although the majority of states surveyed by the NAGC felt that NCLB had diverted attention and resource away from gifted education.

‘Do High Fliers Maintain their Altitude?’ (Xiang et al, 2011) provides a US analysis of whether individual students remain high achievers throughout their school careers.

They do not report outcomes for disadvantaged high achievers, but do consider briefly those attending schools with high and low proportions respectively of students eligible for free and reduced price lunches.

For this section of the report, high achievement is defined as ‘those whose math or reading scores placed them within the top ten per cent of their individual grades and schools’. Learners were tracked from Grades 3 to 5 and Grades 6 to 8.

It is described as exploratory, because the sample was not representative.

However:

‘High-achieving students attending high-poverty schools made about the same amount of academic growth over time as their high-achieving peers in low-poverty schools…It appears that the relationship between a school’s poverty rate and the growth of its highest-achieving students is weak. In other words, attending a low-poverty school adds little to the average high achiever’s prospects for growth.’

The wider study was criticised in a review by the NEPC, in part on the grounds that the results may have been distorted by regression to the mean, a shortcoming only briefly discussed in an appendix.

The following year saw the publication of ‘Unlocking Emergent Talent: Supporting High Achievement of Low-Income, High-Ability Students’ (Olszewski-Kubilius and Clarenbach, 2012).

This is the report of a national summit on the issue convened in that year by the NAGC.

It follows Plucker (one of the summit participants) in using as its starting point the achievement of the advanced level on selected NAEP assessments by learners eligible for free and reduced price lunches.

But it also reports some additional outcomes for Grade 12 and for assessments of civics and writing:

  • ‘Since 1998, 1% or fewer of 4th-, 8th-, and 12th-grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.
  • Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’

The bulk of the report is devoted to identifying barriers to progress and offering recommendations for improving policy, practice and research. I provided an extended analysis in this post from May 2013.

Finally, ‘Talent on the Sidelines: Excellence Gaps and America’s Persistent Talent Underclass’ (Plucker, Hardesty and Burroughs 2013) is a follow-up to ‘Mind the Other Gap’.

It updates the findings in that report, as set out above:

  • In Grade 4 maths, from 1996 to 2011, the proportion of advantaged students scoring at the advanced level increased by 8.3 percentage points, while the proportion of disadvantaged learners doing so increased by 1.5 percentage points. At Grade 8, the comparable changes were 8.5 percentage points and 1.5 percentage points respectively. Excellence gaps have increased by 6.8 percentage points at Grade 4 (to 9.6%) and by 7 percentage points at Grade 8 (to 10.3%).
  • In Grade 4 reading, from 1998 to 2011, the proportion of advantaged students scoring at the advanced level increased by 2.6 percentage points, compared with an increase of 0.9 percentage points amongst disadvantaged learners. Grade 8 saw equivalent increases of 1.8 and 0.9 percentage points respectively. Excellence gaps are estimated to have increased at Grade 4 by 1.7 percentage points (to 10.7%) and marginally increased at Grade 8 by 0.9 percentage points (to 4.2%).

In short, many excellence gaps remain large and most continue to grow. The report’s recommendations are substantively the same as those put forward in 2010.

 

How Government education policy impacts on excellence gaps

Although many aspects of Government education policy may be expected to have some longer-term impact on raising the achievement of all learners, advantaged and disadvantaged alike, relatively few interventions focus exclusively and directly on closing attainment gaps between advantaged and disadvantaged learners – and so have the potential to make a significant difference to excellence gaps.

The most significant of these include:

 

The Pupil Premium:

In November 2010, the IPPR voiced concerns that the benefits of the pupil premium might not reach all those learners who attract it.

Accordingly they recommended that pupil premium should be allocated directly to those learners through an individual Pupil Premium Entitlement which might be used to support a menu of approved activities, including ‘one-to-one teaching to stretch the most able low income pupils’.

The recommendation has not been repeated and the present Government shows no sign of restricting schools’ freedom to use the premium in this manner.

However, the Blunkett Labour Policy Review ‘Putting students and parents first’ recommends that Labour in government should:

‘Assess the level and use of the Pupil Premium to ensure value for money, and that it is targeted to enhance the life chances of children facing the biggest challenges, whether from special needs or from the nature of the background and societal impact they have experienced.’

In February 2013 Ofsted reported that schools spending the pupil premium successfully to improve achievement:

‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.

Conversely, where schools were less successful in spending the funding, they:

‘focused on pupils attaining the nationally expected level at the end of the key stage…but did not go beyond these expectations, so some more able eligible pupils underachieved.’

In July 2013, DfE’s Evaluation of Pupil Premium reported that, when deciding which disadvantaged pupils to target for support, the most commonly applied criterion was ‘low attainment’, used by 91% of primary schools and 88% of secondary schools.

In June 2013, in ‘The Most Able Students’, Ofsted reported that:

‘Pupil Premium funding was used in only a few instances to support the most able students who were known to be eligible for free school meals. The funding was generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds.’

Accordingly, it gave a commitment that:

‘Ofsted will… consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds.’

However, this was not translated into the school inspection guidance.

The latest edition of the School Inspection Handbook says only:

‘Inspectors should pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should. For example, does a large enough proportion of those pupils who had the highest attainment at the end of Key Stage 2 in English and mathematics achieve A*/A GCSE grades in these subjects by the age of 16?

Inspectors should summarise the achievements of the most able pupils in a separate paragraph of the inspection report.’

There is no reference to the most able in parallel references to the pupil premium.

There has, however, been some progress in giving learners eligible for the pupil premium priority in admission to selective schools.

In May 2014, the TES reported that:

‘Thirty [grammar] schools have been given permission by the Department for Education to change their admissions policies already. The vast majority of these will introduce the changes for children starting school in September 2015…A small number – five or six – have already introduced the reform.’

The National Grammar Schools Association confirmed that:

‘A significant number of schools – 38 – have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

In July 2014, the Government launched a consultation on the School Admissions Code which proposes extending to all state-funded schools the option to give priority in their admission arrangements to learners eligible for the pupil premium. This was previously open to academies and free schools via their funding agreements.

 

The Education Endowment Foundation (EEF)

The EEF describes itself as:

‘An independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.’

The 2010 press release announcing its formation emphasised its role in raising standards in underperforming schools. This was reinforced by the Chairman in a TES article from June 2011:

‘So the target group for EEF-funded projects in its first couple of years are pupils eligible for free school meals in primary and secondary schools underneath the Government’s floor standards at key stages 2 and 4. That’s roughly 1,500 schools up and down the country. Projects can benefit other schools and pupils, as long as there is a significant focus on this core target group of the most needy young people in the most challenging schools.’

I have been unable to trace any formal departure from this position, though it no longer appears in this form in the Foundation’s guidance. The Funding FAQs say only:

‘In the case of projects involving the whole school, rather than targeted interventions, we would expect applicants to be willing to work with schools where the proportion of FSM-eligible pupils is well above the national average and/or with schools where FSM-eligible pupils are under-performing academically.’

I can find no EEF-funded projects that are exclusively or primarily focused on high-attaining disadvantaged learners, though a handful of its reports do refer to the impact on this group.

 

Changes to School Accountability Measures

As we have seen in Part one, the School Performance Tables currently provide very limited information about the performance of disadvantaged high achievers.

The July 2013 consultation document on primary assessment and accountability reform included a commitment to publish a series of headline measures in the tables including:

‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score in each subject.’

Moreover, it added:

‘We will publish all the headline measures to show the attainment and progress of pupils for whom the school is in receipt of the pupil premium.’

Putting two and two together, this should mean that, from 2016, we will be able to see the percentage of pupil premium-eligible students achieving a high scaled score, though we do not yet know what ‘high scaled score’ means, nor do we know whether the data will be for English and maths separately or combined.

The October 2013 response to the secondary assessment and accountability consultation document fails to say explicitly whether excellence gap measures will be published in School Performance Tables.

It mentions that:

‘Schools will now be held to account for (a) the attainment of their disadvantaged pupils, (b) the progress made by their disadvantaged pupils, and (c) the in-school gap in attainment between disadvantaged pupils and their peers.’

Meanwhile a planned data portal will contain:

‘the percentage of pupils achieving the top grades in GCSEs’

but the interaction between these two elements, if any, remains unclear.

The March 2014 response to the consultation on post-16 accountability and assessment says:

‘We intend to develop measures covering all five headline indicators for students in 16-19 education who were in receipt of pupil premium funding in year 11.’

The post-16 headline measures include a new progress measure and an attainment measure showing the average points score across all level 3 qualifications.

It is expected that a destination measure will also be provided, as long as the methodology can be made sufficiently robust. The response says:

‘A more detailed breakdown of destinations data, such as entry to particular groups of universities, will continue to be published below the headline. This will include data at local authority level, so that destinations for students in the same area can be compared.’

and this should continue to distinguish the destinations of disadvantaged students.

Additional A level attainment measures – the average grade across the best three A levels and the achievement of AAB grades with at least two in facilitating subjects – seem unlikely to be differentiated according to disadvantage.

There remains a possibility that much more excellence gap data, for primary, secondary and post-16, will be made available through the planned school portal, but no specification had been made public at the time of writing.

More worryingly, recent news reports have suggested that the IT project developing the portal and the ‘data warehouse’ behind it has been abandoned. The statements refer to continuing to deliver ‘the school performance tables and associated services’, but there is no clarification of whether this latter phrase includes the portal. Given the absence of an official statement, one suspects the worst.

 

 

The Social Mobility and Child Poverty Commission (SMCPC)

The Commission was established with the expectation that it would ‘hold the Government’s feet to the fire’ to encourage progress on these two topics.

It publishes annual ‘state of the nation’ reports that are laid before Parliament and also undertakes ‘social mobility advocacy’.

The first annual report – already referenced in Part one – was published in November 2013. The second is due in October 2014.

The Chairman of the Commission was less than complimentary about the quality of the Government’s response to its first report, which made no reference to its comments about attainment gaps at higher grades. It remains to be seen whether the second will be taken any more seriously.

The Commission has already shown significant interest in disadvantaged high achievers – in June 2014 it published the study ‘Progress made by high-attaining children from disadvantaged backgrounds’ referenced above – so there is every chance that the topic will feature again in the 2014 annual report.

The Commission is of course strongly interested in the social mobility indicators and progress made against them, so may also include recommendations for how they might be adjusted to reflect changes to the schools accountability regime set out above.

 

Recommended reforms to close excellence gaps

Several proposals emerge from the commentary on current Government policy above:

  • It would be helpful to have further evaluation of the pupil premium to check whether high-achieving disadvantaged learners are receiving commensurate support. Schools need further guidance on ways in which they can use the premium to support high achievers. This should also be a focus for the pupil premium Champion and for pupil premium reviews.
  • Ofsted’s school inspection handbook requires revision to fulfil its commitment to focus on the most able in receipt of the premium. Inspectors also need guidance (published so schools can see it) to ensure common expectations are applied across institutions. These provisions should be extended to the post-16 inspection regime.
  • All selective secondary schools should be invited to prioritise pupil premium recipients in their admissions criteria, with the Government reserving the right to impose this on schools that do not comply voluntarily.
  • The Education Endowment Foundation should undertake targeted studies of interventions to close excellence gaps, but should also ensure that the impact on excellence gaps is mainstreamed in all the studies they fund. (This should be straightforward since their Chairman has already called for action on this front.)
  • The Government should consider the case for the inclusion of data on excellence gaps in all the headline measures in the primary, secondary and post-16 performance tables. Failing that, such data (percentages and numbers) should be readily accessible from a new data portal as soon as feasible, together with historical data of the same nature. (If the full-scale portal is no longer deliverable, a suitable alternative openly accessible database should be provided.) It should also publish annually a statistical analysis of all excellence gaps and the progress made towards closing them. As much progress as possible should be made before the new assessment and accountability regime is introduced. At least one excellence gap measure should be incorporated into revised DfE impact indicators and the social mobility indicators.
  • The Social Mobility and Child Poverty Commission (SMCPC) should routinely consider the progress made in closing excellence gaps within its annual report – and the Government should commit to consider seriously any recommendations they offer to improve such progress.

This leaves the question whether there should be a national programme dedicated to closing excellence gaps, and so improving fair access to competitive universities. (It makes excellent sense to combine these twin objectives and to draw on the resources available to support the latter.)

Much of the research above – whether it originates in the US or UK – argues for dedicated state/national programmes to tackle excellence gaps.

More recently, the Sutton Trust has published a Social Mobility Manifesto for 2015 which recommends that the next government should:

‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.

Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.

Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

This is not as clear as it might be about the balance between support for the most able and the most able disadvantaged respectively.

I have written extensively about what shape such a programme should have, most recently in the final section of ‘Digging Beneath the Destination Measures’ (July 2014).

The core would be:

‘A light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously…

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs….

…Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.’
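
Purely as an illustration of the funding model quoted above, here is a minimal Python sketch that assembles a learner’s annual personal budget from the three streams described. Only the £935 deprivation premium is a real figure; the other amounts, and the function itself, are hypothetical placeholders rather than part of any proposal.

```python
def personal_budget(pupil_premium: float,
                    he_fund_share: float,
                    philanthropic: float,
                    match_rate: float = 0.0) -> float:
    """Annual personal budget: pupil premium, plus a share of the national
    fund topsliced from higher education outreach budgets, plus philanthropic
    income with optional government match funding on that element."""
    return pupil_premium + he_fund_share + philanthropic * (1 + match_rate)

# £935 is the current deprivation premium; the other figures are
# illustrative placeholders only.
print(personal_budget(935.0, 300.0, 150.0, match_rate=1.0))  # 1535.0
```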

 

Close

We know far less than we should about the size of excellence gaps in England – and about whether progress is being made in closing them.

I hope that this post makes some small contribution towards rectifying matters, even though the key finding is that the picture is fragmented and extremely sketchy.

Rudimentary as it is, this survey should provide a baseline of sorts, enabling us to judge more easily what additional information is required and how we might begin to frame effective practice, whether at institutional or national level.

 

GP

September 2014

Closing England’s Excellence Gaps: Part One

This post examines what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.

Mind the Gap by Clicsouris

It assesses the capacity of current national education policy to close these gaps and recommends further action to improve the prospects of doing so rapidly and efficiently.

Because the post is extremely long I have divided it into two parts.

Part one comprises:

  • A working definition for the English context, explanation of the significance of excellence gaps, description of how this post relates to earlier material and provisional development of the theoretical model articulated in those earlier posts.
  • A summary of the headline data on socio-economic attainment gaps in England, followed by a review of published data relevant to excellence gaps at primary, secondary and post-16 levels.

Part two contains:

  • A distillation of research evidence, including material on whether disadvantaged high attainers remain so, international comparisons studies and research derived from them, and literature covering excellence gaps in the USA.
  • A brief review of how present Government policy might be expected to impact directly on excellence gaps, especially via the Pupil Premium, school accountability measures, the Education Endowment Foundation (EEF) and the Social Mobility and Child Poverty Commission (SMCPC). I have left to one side the wider set of reforms that might have an indirect and/or longer-term impact.
  • Some recommendations for strengthening our collective capacity to quantify, address and ultimately close excellence gaps.

The post is intended to synthesise, supplement and update earlier material, so providing a baseline for further analysis – and ultimately consideration of further national policy intervention, whether under the present Government or a subsequent administration.

It does not discuss the economic and social origins of educational disadvantage, or the merits of wider policy to eliminate poverty and strengthen social mobility.

It starts from the premiss that, while education reform cannot eliminate the effects of disadvantage, it can make a significant, positive contribution by improving the life chances of disadvantaged learners.

It does not debate the fundamental principle that, when prioritising educational support to improve the life chances of learners from disadvantaged backgrounds, governments should not discriminate on the basis of ability or prior attainment.

It assumes that optimal policies will deliver improvement for all disadvantaged learners, regardless of their starting point. It suggests, however, that intervention strategies should aim for equilibrium, prioritising gaps that are furthest away from it and taking account of several different variables in the process.

 

A working definition for the English context

The literature in Part two reveals that there is no accepted universal definition of excellence gaps, so I have developed my own England-specific working definition for the purposes of this post.

An excellence gap is:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’
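
In arithmetic terms the definition reduces to a simple difference between two threshold attainment rates. The short Python sketch below shows the calculation, using the 2013 KS2 Level 5 reading, writing and maths figures cited later in this post (10% of FSM pupils and 26% of all others); the function name is mine, not an established measure.

```python
def excellence_gap(pct_disadvantaged: float, pct_others: float) -> float:
    """Excellence gap in percentage points: the share of all other eligible
    learners reaching a high-achievement threshold minus the share of
    disadvantaged learners doing so."""
    return pct_others - pct_disadvantaged

# 2013 KS2 Level 5+ in reading, writing and maths combined (state-funded schools).
print(excellence_gap(10.0, 26.0))  # 16.0 percentage points
```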

This demands further clarification of what typically constitutes a disadvantaged learner and a threshold of high achievement.

In the English context, the measures of disadvantage with the most currency are FSM eligibility (eligible for and receiving free school meals) and eligibility for the deprivation element of the pupil premium (eligible for and receiving FSM at some point in the preceding six years – often called ‘ever 6’).

Throughout this post, for the sake of clarity, I have given priority to the former over the latter, except where the former is not available.

The foregrounded characteristic is socio-economic disadvantage, but this does not preclude analysis of the differential achievement of sub-groups defined according to secondary characteristics including gender, ethnic background and learning English as an additional language (EAL) – as well as multiple combinations of these.

Some research is focused on ‘socio-economic gradients’, which show how gaps vary at different points of the achievement distribution on a given assessment.

The appropriate thresholds of high achievement are most likely to be measured through national assessments of pupil attainment, notably end of KS2 tests (typically Year 6, age 11), GCSE and equivalent examinations (typically Year 11, age 16) and A level and equivalent examinations (typically Year 13, age 18).

Alternative thresholds of high achievement may be derived from international assessments, such as PISA, TIMSS or PIRLS.

Occasionally – and especially in the case of these international studies – an achievement threshold is statistically derived, in the form of a percentile range of performance, rather than with reference to a particular grade, level or score. I have not allowed for this within the working definition.

Progress measures typically relate to the distance travelled between: baseline assessment (currently at the end of KS1 – Year 2, age 7 – but scheduled to move to Year R, age 4) and end of KS2 tests; or between KS2 tests and the end of KS4 (GCSE); or between GCSE and the end of KS5 (Level 3/A level).

Some studies extend the concept of progress between two thresholds to a longitudinal approach that traces how disadvantaged learners who achieve a particular threshold perform throughout their school careers – do they sustain early success, or fall away, and what proportion are ‘late bloomers’?

 

Why are excellence gaps important?

Excellence gaps are important for two different sets of reasons: those applying to all achievement gaps and those which apply more specifically or substantively to excellence gaps.

Under the first heading:

  • The goal of education should be to provide all learners, including disadvantaged learners, with the opportunity to maximise their educational potential, so eliminating ‘the soft bigotry of low expectations’.
  • Schools should be ‘engines of social mobility’, helping disadvantaged learners to overcome their backgrounds and compete equally with their more advantaged peers.
  • International comparisons studies reveal that the most successful education systems can and do raise attainment for all and close socio-economic achievement gaps simultaneously.
  • There is a strong economic case for reducing – and ideally eradicating – underachievement attributable to disadvantage.

Under the second heading:

  • An exclusive or predominant focus on gaps at the lower end of the attainment distribution is fundamentally inequitable and tends to reinforce the ‘soft bigotry of low expectations’.
  • Disadvantaged learners benefit from successful role models – predecessors or peers from a similar background who have achieved highly and are reaping the benefits.
  • An economic imperative to increase the supply of highly-skilled labour will place greater emphasis on the top end of the achievement distribution. Some argue that there is a ‘smart fraction’ tying national economic growth to a country’s stock of high achievers. There may be additional spin-off benefits from increasing the supply of scientists, writers, artists, or even politicians!
  • The most highly educated disadvantaged learners are least likely to confer disadvantage on their children, so improving the proportion of such learners may tend to improve inter-generational social mobility.

Excellence gaps are rarely identified as such – the term is not yet in common usage in UK education, though it has greater currency in the US. Regardless of terminology, they rarely receive attention, either as part of a wider set of achievement gaps, or separately in their own right.

 

Relationship with earlier posts

Since this blog was founded in April 2010 I have written extensively about excellence gaps and how to address them.

The most pertinent of my previous posts are:

I have also written about excellence gaps in New Zealand – Part 1 and Part 2 (June 2012) – but do not draw on that material here.

Gifted education (or apply your alternative term) is amongst those education policy areas most strongly influenced by political and ideological views on the preferred balance between excellence and equity. This is particularly true of decisions about how best to address excellence gaps.

The excellence-equity trade-off was identified in my first post (May 2010) as one of three fundamental polarities that determine the nature of gifted education and provide the basis for most discussion about what form it should take.

The Gifted Phoenix Manifesto for Gifted Education (March 2013) highlighted their significance thus:

‘Gifted education is about balancing excellence and equity. That means raising standards for all while also raising standards faster for those from disadvantaged backgrounds.

Through combined support for excellence and equity we can significantly increase our national stock of high level human capital and so improve economic growth…

…Excellence in gifted education is about maximising the proportion of high achievers reaching advanced international benchmarks (eg PISA, TIMSS and PIRLS) so increasing the ‘smart fraction’ which contributes to economic growth.

Equity in gifted education is about narrowing (and ideally eliminating) the excellence gap between high achievers from advantaged and disadvantaged backgrounds (which may be attributable in part to causes other than poverty). This also increases the proportion of high achievers, so building the ‘smart fraction’ and contributing to economic growth.’

More recently, one of the 10 draft core principles I set out in ‘Why Can’t We Have National Consensus on Educating High Attainers?’ (June 2014) said:

‘We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.’

 

This model provisionally developed

Using my working definition as a starting point, this section describes a theoretical model showing how excellence and equity are brought to bear when considering excellence gaps – and then how best to address them.

This should be applicable at any level, from a single school to a national education system and all points in between.

The model depends on securing the optimal balance between excellence and equity where:

  • Excellence is focused on increasing the proportion of all learners who achieve highly and, where necessary, increasing the pitch of high achievement thresholds to remove unhelpful ceiling effects. The thresholds in question may be nationally or internationally determined and are most likely to register high attainment through a formal assessment process. (This may be extended so there is complementary emphasis on increasing the proportion of high-achieving learners who make sufficiently strong progress between two different age- or stage-related thresholds.)
  • Equity is focused on increasing the proportion of high-achieving disadvantaged learners (and/or the proportion of disadvantaged learners making sufficiently strong progress) at a comparatively faster rate, so they form a progressively larger proportion of the overall high-achieving population, up to the point of equilibrium, where advantaged and disadvantaged learners are equally likely to achieve the relevant thresholds (and/or progress measure). This must be secured without deliberately repressing improvement amongst advantaged learners – ie by introducing policies designed explicitly to limit their achievement and/or progress relative to disadvantaged learners – but a decision to do nothing or to redistribute resources in favour of disadvantage is entirely permissible.

The optimal policy response will depend on the starting position and the progress achieved over time.

If excellence gaps are widening, the model suggests that interventions and resources should be concentrated in favour of equity. Policies should be reviewed and adjusted, or strengthened where necessary, to meet the desired objectives.

If excellence gaps are widening rapidly, this reallocation and adjustment process will be relatively more substantial (and probably more urgent) than if they are widening more slowly.

Slowly widening gaps will demand more reallocation and adjustment than a situation where gaps are stubbornly resistant to improvement, or else closing too slowly. But even in the latter case there should be some reallocation and adjustment until equilibrium is achieved.

When excellence gaps are already closing rapidly – and there are no overt policies in place to deliberately repress improvement amongst high-achieving advantaged learners – it may be that unintended pressures in the system are inadvertently bringing this about. In that case, policy and resources should be adjusted to correct these pressures and so restore the correct twin-speed improvement.

The aim is to achieve and sustain equilibrium, even beyond the point when excellence gaps are eliminated, so that they are not permitted to reappear.

If ‘reverse gaps’ begin to materialise, where disadvantaged learners consistently outperform their more advantaged peers, this also threatens equilibrium and would suggest a proportionate redistribution of effort towards excellence.
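
The decision logic sketched in the preceding paragraphs can be summarised compactly. The Python fragment below is a minimal sketch of that logic only; the category names and the wording of the responses are mine, condensed from the narrative above, not part of any formal model.

```python
def suggested_response(trend: str) -> str:
    """Broad policy response for a given excellence-gap trajectory,
    following the narrative model set out above."""
    responses = {
        "widening_rapidly":
            "Substantial and urgent reallocation of policy and resources towards equity",
        "widening_slowly":
            "Significant reallocation towards equity, reviewed and adjusted over time",
        "static_or_closing_too_slowly":
            "Some continued reallocation towards equity until equilibrium is reached",
        "closing_rapidly":
            "Check for unintended pressures depressing advantaged high achievers "
            "and correct them to restore twin-speed improvement",
        "reverse_gap":
            "Proportionate redistribution of effort towards excellence",
    }
    return responses[trend]

# Example: a system where excellence gaps are widening slowly.
print(suggested_response("widening_slowly"))
```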

Such scenarios are most likely to occur in settings where there is a large proportion of learners who, while not disadvantaged according to the ‘cliff edge’ definition used to make the distinction, are still relatively disadvantaged.

Close attention must therefore be paid to the distribution of achievement across the full spectrum of disadvantage, to ensure that success at the extreme of the distribution does not mask significant underachievement elsewhere.

One should be able to determine a more precise policy response by considering a restricted set of variables. These include:

  • The size of the gaps at the start of the process and, associated with this, the time limit allowed for equilibrium to be reached. Clearly larger gaps are more likely to take longer to close. Policy makers may conclude that steady improvement over several years is more manageable for the system than a rapid sprint towards equilibrium. On the other hand, there may be benefits associated with pace and momentum.
  • The rate at which overall high achievement is improving. If this is relatively fast, the rate of improvement amongst advantaged high achievers will be correspondingly strong, so the rate for disadvantaged high achievers must be stronger still.
  • The variance between excellence gaps at different ages/stages. If the gaps are larger at particular stages of education, the pursuit of equilibrium suggests that disproportionate attention should be given to those stages, so that gaps close consistently across them. If excellence gaps are small for relatively young learners and increase with age, priority should be given to the older group, but there may be other factors in play, such as evidence that closing relatively small gaps at an early stage will have a more substantial ‘knock-on’ effect later on.
  • The level at which high achievement thresholds are pitched. Obviously this will influence the size of the gaps that need to be closed. But, other things being equal, enabling a higher proportion of learners to achieve a relatively high threshold will demand more intensive support. On the other hand, relatively fewer learners – whether advantaged or disadvantaged – are likely to be successful. Does one need to move a few learners a big distance or a larger proportion a smaller one?
  • Whether or not gaps at lower achievement thresholds are smaller and/or closing at a faster rate. If so, there is a strong case for securing parity of progress at higher and lower thresholds alike. On the other hand, if excellence gaps are closing more quickly, it may be appropriate to reallocate resources away from them and towards lower levels of achievement.
  • The relative size of the overall disadvantaged population, the associated economic gap between advantage and disadvantage and (as suggested above) the distribution in relation to the cut-off. If the definition of disadvantage is pitched relatively low (ie somewhat disadvantaged), the disadvantaged population will be correspondingly large, but the economic gap between advantage and disadvantage will be relatively small. If the definition is pitched relatively high (ie very disadvantaged) the reverse will be true, giving a comparatively small disadvantaged population but a larger gap between advantage and disadvantage.
  • The proportion of the disadvantaged population that is realistically within reach of the specified high achievement benchmarks. This variable is a matter of educational philosophy. There is merit in an inclusive approach – indeed it seems preferable to overestimate this proportion than the reverse. Extreme care should be taken not to discourage late developers or close off opportunities on the basis of comparatively low current attainment, so reinforcing existing gaps through unhelpfully low expectations. On the other hand, supporting unrealistically high expectations may be equally damaging and ultimately waste scarce resources. There may be more evidence to support such distinctions with older learners than with their younger peers. 

 

How big are England’s headline attainment gaps and how fast are they closing?

Closing socio-economic achievement gaps has been central to English educational policy for the last two decades, including under the current Coalition Government and its Labour predecessor.

It will remain an important priority for the next Government, regardless of the outcome of the 2015 General Election.

The present Government cites ‘Raising the achievement of disadvantaged children’ as one of ten schools policies it is pursuing.

The policy description frames the issue thus:

‘Children from disadvantaged backgrounds are far less likely to get good GCSE results. Attainment statistics published in January 2014 show that in 2013 37.9% of pupils who qualified for free school meals got 5 GCSEs, including English and mathematics at A* to C, compared with 64.6% of pupils who do not qualify.

We believe it is unacceptable for children’s success to be determined by their social circumstances. We intend to raise levels of achievement for all disadvantaged pupils and to close the gap between disadvantaged children and their peers.’

The DfE’s input and impact indicators  – showing progress against the priorities set out in its business plan – do not feature the measure mentioned in the policy description (which is actually five or more GCSEs at Grades A*-C or equivalents, including GCSEs in English and maths).

The gap on this measure was 27.7% in 2009, improving to 26.7% in 2013, so there has been a small 1.0 percentage point improvement over five years, spanning the last half of the previous Government’s term in office and the first half of this Government’s term.

Instead the impact indicators include three narrower measures focused on closing the attainment gap between free school meal pupils and their peers, at 11, 16 and 19 respectively:

  • Impact Indicator 7 compares the percentages of FSM-eligible and all other pupils achieving level 4 or above in KS2 assessment of reading, writing and maths. The 2013 gap is 18.7%, down 0.4% from 19.1% in 2012.
  • Impact Indicator 8 compares the percentages of FSM-eligible and all other pupils achieving A*-C grades in GCSE maths and English. The 2013 gap is 26.5%, up 0.3% from 26.2% in 2012.
  • Impact Indicator 9 compares the percentages of learners who were FSM-eligible at age 15 and all other learners who attain a level 3 qualification by the end of the academic year in which they are 19. The 2013 gap is 24.3%, up 0.1% from 24.2% in 2012.

These small changes, not always pointing in the right direction, reflect the longer term narrative, as is evident from the Government’s Social Mobility Indicators which also incorporate these three measures.

  • In 2005-06 the KS2 L4 maths and English gap was 25.0%, so there has been a fairly substantial 6.3 percentage point reduction over seven years, but only about one quarter of the gap has been closed.
  • In 2007-08 the KS4 GCSE maths and English gap was 28.0%, so there has been a minimal 1.5 percentage point reduction over six years, equivalent to annual national progress of 0.25 percentage points. At that rate it would take another century to close the gap completely (see the short calculation after this list).
  • In 2004-05 the Level 3 qualification gap was 26.4%, so there has been a very similar 2.1 percentage point reduction over 8 years.
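
The ‘another century’ estimate is simple arithmetic. The short sketch below makes the projection explicit; it assumes, purely for illustration, that the recent rate of closure continues unchanged.

```python
remaining_gap = 26.5         # KS4 English and maths FSM gap in 2013, percentage points
annual_reduction = 1.5 / 6   # 1.5 point reduction spread over six years
print(remaining_gap / annual_reduction)  # 106.0 years at the current rate
```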

The DfE impact indicators also include a set of three destination measures that track the percentage of FSM learners progressing to Oxford and Cambridge, any Russell Group university and any university.

There is a significant time lag with all of these – the most recent available data relates to 2011/2012 – and only two years of data have been collected.

All show an upward trend. Oxbridge is up from 0.1% to 0.2%, Russell Group up from 3% to 4% and any university up from 45% to 47% – actually a 2.5 percentage point improvement.

The Oxbridge numbers are so small that a percentage measure is a rather misleading indicator of marginal improvement from a desperately low base.

It is important to note that forthcoming changes to the assessment regime will impose a different set of headline indicators at ages 11 and 16 that will not be comparable with these.

From 2014, methodological adjustments to the School Performance Tables significantly restrict the range of qualifications counted as equivalent to GCSEs. Only the first entry in each subject will count for Performance Table purposes, applying to English Baccalaureate subjects in 2014 and to all subjects in 2015.

Both these factors will tend to depress overall results and may be expected to widen attainment gaps on the headline KS4 measure as well as the oft-cited 5+ GCSEs measure.

From 2016 new baseline assessments, the introduction of scaled scores at the end of KS2 and a new GCSE grading system will add a further layer of change.

As a consequence there will be substantial revisions to the headline measures in Primary, Secondary and Post-16 Performance Tables. The latter will include destination measures, provided they can be made methodologically sound.

At the time of writing, the Government has made negligible reference to the impact of these reforms on national measures of progress, including its own Impact Indicators and the parallel Social Mobility indicators, though the latter are reportedly under review.

 

Published data on English excellence gaps

The following sections summarise what data I can find in the public domain about excellence gaps at primary (KS2), secondary (KS4) and post-16 (KS5) respectively.

I have cited the most recent data derivable from Government statistical releases and performance tables, supplemented by other interesting findings gleaned from research and commentary.

 

Primary (KS2) 

The most recent national data is contained in SFR51/2013: National Curriculum Assessments at Key Stage 2: 2012 to 2013. This provides limited information about the differential performance of learners eligible for and receiving FSM (which I have referred to as ‘FSM’), and for those known to be eligible for FSM at any point from Years 1 to 6 (known as ‘ever 6’ and describing those in receipt of the Pupil Premium on grounds of deprivation).

There is also additional information in the 2013 Primary School Performance Tables, where the term ‘disadvantaged’ is used to describe ‘ever 6’ learners and ‘children looked after’.

There is comparatively little variation between these different sets of figures at national level. In the analysis below (and in the subsequent section on KS4) I have used FSM data wherever possible, but have substituted ‘disadvantaged’ data where FSM is not available. All figures apply to state-funded schools only.

I have used Level 5 and above as the best available proxy for high attainment. Some Level 6 data is available, but in percentages only, and these are all so small that comparisons are misleading.

The Performance Tables distinguish a subset of high attainers, on the basis of prior attainment (at KS1 for KS2 and at KS2 for KS4) but no information is provided about the differential performance of advantaged and disadvantaged high attainers.

In 2013:

  • 21% of all pupils achieved Level 5 or above in reading, writing and maths combined, but only 10% of FSM pupils did so, compared with 26% of others, giving an attainment gap of 16%. The comparable gap at Level 4B (in reading and maths and L4 in writing) was 18%. At Level 4 (across the board) it was 20%. In this case, the gaps are slightly larger at lower attainment levels but, whereas the L4 gap has narrowed by 1% since 2012, the L5 gap has widened by 1%.
  • In reading, 44% of all pupils achieved Level 5 and above, but only 27% of FSM pupils did so, compared with 48% of others, giving an attainment gap of 21%. The comparable gap at Level 4 and above was eight percentage points lower at 13%.
  • In writing (teacher assessment), 31% of all pupils achieved level 5 and above, but only 15% of FSM pupils did so, compared with 34% of others, giving an attainment gap of 19%. The comparable gap at Level 4 and above was three percentage points lower at 16%.
  • In grammar, punctuation and spelling (GPS), 47% of all pupils achieved Level 5 and above, but only 31% of FSM pupils did so, compared with 51% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above was two percentage points lower at 18%.
  • In maths, 41% of pupils in state-funded schools achieved Level 5 and above, up 2% on 2012. But only 24% of FSM pupils achieved this compared with 44% of others, giving an attainment gap of 20%. The comparable gap at level 4 and above is 13%.

Chart 1 shows these outcomes graphically. In four cases out of five, the gap at the higher attainment level is greater, substantially so in reading and maths. All the Level 5 gaps fall between 16% and 21%.

 


Chart 1: Percentage point gaps between FSM and all other pupils’ attainment at KS2 L4 and above and KS2 L5 and above, 2013 

 

It is difficult to trace reliably the progress made in reducing these gaps in English, since the measures have changed frequently. There has been more stability in maths, however, and the data reveals that – whereas the FSM gap at Level 4 and above has reduced by 5 percentage points since 2008 (from 18 points to 13 points) – the FSM gap at Level 5 and above has remained between 19 and 20 points throughout. Hence the gap between L4+ and L5+ on this measure has increased in the last five years.

There is relatively little published about KS2 excellence gaps elsewhere, though one older Government publication, a DfES Statistical Bulletin: The characteristics of high attainers (2007) offers a small insight.

It defines KS2 high attainers as the top 10%, on the basis of finely grained average points scores across English, maths and science, so a more selective but wider-ranging definition than any of the descriptors of Level 5 performance above.

According to this measure, some 2.7% of FSM-eligible pupils were high attainers in 2006, compared with 11.6% of non-FSM pupils, giving a gap of 8.9 percentage points.

The Bulletin supplies further analysis of this population of high attainers, summarised in the table reproduced below.

 

EX Gap Capture 1 

  

Secondary (KS4) 

While Government statistical releases provide at least limited data about FSM performance at high levels in end of KS2 assessments, this is entirely absent from KS4 data, because there is no information about the achievement of GCSE grades above C, whether for single subjects or combinations.

The most recent publication: SFR05/2014: GCSE and equivalent attainment by pupil characteristics, offers a multitude of measures based on Grades G and above or C and above, many of which are set out in Chart 2, which illustrates the FSM gap on each, organised in order from the smallest gap to the biggest.

(The gap cited here for A*-C grades in English and maths GCSEs is very slightly different to the figure in the impact indicator.)

 


Chart 2: Percentage point gaps between FSM and all other pupils’ attainment on different KS4 measures, 2013

 

In its State of the Nation Report 2013, the Social Mobility and Child Poverty Commission included a table comparing regional performance on a significantly more demanding ‘8+ GCSEs excluding equivalents and including English and maths’ measure. This uses ‘ever 6’ rather than FSM as the indicator of disadvantage.

The relevant table is reproduced below. It shows regional gaps of between 20 and 26 percentage points on the tougher measure, so a similar order of magnitude to the national indicators at the top end of Chart 2.

 

ExGap 2 Capture

 

Comparing the two measures, one can see that:

  • The percentages of ‘ever 6’ learners achieving the more demanding measure are very much lower than the comparable percentages achieving the 5+ GCSEs measure, but the same is also true of their more advantaged peers.
  • Consequently, in every region but London and the West Midlands, the attainment gap is actually larger for the less demanding measure.
  • In London, the gaps are much closer, at 19.1 percentage points on the 5+ measure and 20.9 percentage points on the 8+ measure. In the West Midlands, the gap on the 8+ measure is larger by five percentage points. In all other cases, the difference is at least six percentage points in the other direction.

We do not really understand the reasons why London and the West Midlands are atypical in this respect.

The Characteristics of High Attainers (2007) provides a comparable analysis for KS4 to that already referenced at KS2. In this case, the top 10% of high attainers is derived on the basis of capped GCSE scores.

This gives a gap of 8.8 percentage points between the proportion of non-FSM (11.2%) and FSM (2.4%) students within the defined population, very similar to the parallel calculation at KS2.

Other variables within this population are set out in the table reproduced below.

 

ExGap Capture 3

Finally, miscellaneous data has also appeared from time to time in the answers to Parliamentary Questions. For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8 percentage points. By 2009 the comparable percentages were 1.7% and 9.0% respectively, resulting in an increased gap of 7.3 percentage points (Col 568W)
  • In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)
               FSM     All pupils     Gap
Maths          3.7     15.6           11.9
Eng lit        4.1     20.0           15.9
Eng lang       3.5     16.4           12.9
Physics        2.2     49.0           46.8
Chemistry      2.5     48.4           45.9
Biology        2.5     46.8           44.3
French         3.5     22.9           19.4
German         2.8     23.2           20.4

Table 1: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)

 

Post-16 (KS5)

The most recent post-16 attainment data is provided in SFR10/2014: Level 2 and 3 attainment by young people aged 19 in 2013 and SFR02/14: A level and other level 3 results: academic year 2012 to 2013.

The latter contains a variety of high attainment measures – 3+ A*/A grades;  AAB grades or better; AAB grades or better with at least two in facilitating subjects;  AAB grades or better, all in facilitating subjects – yet none of them distinguish success rates for advantaged and disadvantaged learners.

The former does include a table which provides a time series of gaps for achievement of Level 3 at age 19 through 2 A levels or the International Baccalaureate. The measure of disadvantage is FSM-eligibility in Year 11. The gap was 22.0 percentage points in 2013, virtually unchanged from 22.7 percentage points in 2005.

In ‘(How) did New Labour narrow the achievement and participation gap’ (Whitty and Anders, 2014), the authors reproduce a chart from a DfE roundtable event held in March 2013 (on page 44).

This is designed to show how FSM gaps vary across key stages and also provides ‘odds ratios’ – the relative chances of FSM and other pupils achieving each measure. It relies on 2012 outcomes.

The quality of the reproduction is poor, but it seems to suggest that, using the AAB+ in at least two facilitating subjects measure, there is a five percentage point gap between FSM students and others (3% versus 8%), while the odds ratio shows that non-FSM students are 2.9 times more likely than FSM students to achieve this outcome.
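
For readers unfamiliar with odds ratios, the reported figure can be reproduced (allowing for rounding of the underlying percentages) from the 3% and 8% shares read off the chart. The Python sketch below assumes those two figures; nothing else is taken from the source.

```python
def odds(p: float) -> float:
    """Convert a probability into odds."""
    return p / (1.0 - p)

p_fsm, p_other = 0.03, 0.08   # shares achieving AAB+ with at least two facilitating subjects
odds_ratio = odds(p_other) / odds(p_fsm)
print(round(odds_ratio, 1))   # ~2.8, consistent with the reported 2.9 given rounding
```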

Once again, occasional replies to Parliamentary Questions provide some supplementary information:

  • In 2007, 189 FSM-eligible students (3.7%) in maintained mainstream schools (so excluding sixth form colleges and FE colleges) achieved 3 A grades at A level. This compared with 13,467 other students (9.5%), giving a gap of 5.8 percentage points (Parliamentary Question, 26 November 2008, Hansard (Col 1859W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. This compares with 14,431 (10.5%) of those not eligible for FSM, giving a gap of 7.0 percentage points. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are counted. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • Of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, 546 (4.1%) achieved 3 or more GCE A levels at A*-A compared with 22,353 other pupils (10.6%) so giving a gap of 6.5 percentage points. These figures include students in both the schools and FE sectors. (Parliamentary Question, 9 July 2012, Hansard (Col 35W)) 

 In September 2014, a DfE response to a Freedom of Information request provided some additional data about FSM gaps at A level over the period from 2009 to 2013. This is set out in the table below, which records the gaps between FSM and all other pupils, presumably for all schools and colleges, whether or not state-funded.

Apart from the atypical result for the top indicator in 2010, all these gaps fall in the range of six to ten percentage points, so they are in line with the sources above.

 

                                                   2009   2010   2011   2012   2013
3+ grades at A*/A or applied single/double award    9.0   12.8    9.3    8.7    8.3
AAB+ grades in facilitating subjects                 n/a    n/a    n/a    6.3    6.2
AAB+ grades, at least 2 in facilitating subjects     n/a    n/a    n/a    n/a    9.8

 

Additional evidence of Key Stage excellence gaps from a sample born in 1991/92

‘Progress made by high-achieving children from disadvantaged backgrounds’ (Crawford, Macmillan and Vignoles, 2014) provides useful data on the size of excellence gaps at different key stages, as well as analysis of whether disadvantaged high achievers remain so through their school careers.

The latter appears in Part two, but the first set of findings provides a useful supplement to the broad picture set out above.

This study is based on a sample of learners born in 1991/1992, so they would presumably have taken end of KS2 tests in 2002, GCSEs in 2007 and A levels in 2009. It includes all children who attended a state primary school, including those who subsequently attended an independent secondary school.

It utilises a variety of measures of disadvantage, including whether learners were always FSM-eligible (in Years 7-11), or ‘ever FSM’ during that period. This summary focuses on the distinction between ‘always FSM’ and ‘never FSM’.

It selects a basket of high attainment measures spread across the key stages, including:

  • At KS1, achieving Level 3 or above in reading and maths.
  • At KS2, achieving Level 5 or above in English and maths.
  • At KS4, achieving six or more GCSEs at grades A*-C in EBacc subjects (as well as five or more).
  • At KS5, achieving two or more (and three or more) A levels at grades A-B in any subjects.
  • Also at KS5, achieving two or more (and three or more) A levels at grades A-B in facilitating subjects.

The choice of measures at KS2 and KS5 is reasonable, reflecting the data available at the time. For example, one assumes that A* grades at A level do not feature in the KS5 measures because they were not introduced until 2010.

At KS4, the selection is rather more puzzling and idiosyncratic. It would have been preferable to have included at least one measure based on performance across a range of GCSEs at grades A*-B or A*/A.

The authors justify their decision on the basis that ‘there is no consensus on what is considered high attainment’, even though most commentators would expect this to reflect higher grade performance, while few are likely to define it solely in terms of breadth of study across a prescribed set of ‘mainstream’ subjects.

Outcomes for ‘always FSM’ and ‘never FSM’ on the eight measures listed above are presented in Chart 3.


Chart 3: Achievement of ‘always FSM’ and ‘never FSM’ on a basket of high attainment measures for pupils born in 1991/92

 

This reveals gaps of 12 to 13 percentage points at Key Stages 1 and 2, somewhat smaller than several of those described above.

It is particularly notable that the 2013 gap for KS2 L5 reading, writing and maths is 16 percentage points, whereas the almost comparable 2002 (?) gap for KS2 English and maths amongst this sample is 13.5 percentage points. Even allowing for comparability issues, there may be tentative evidence here to suggest widening excellence gaps at KS2 over the last decade.

The KS4 gaps are significantly larger than those at KS1/2, at 27 and 18 percentage points on the two measures. But comparison with the previous evidence reinforces the point that the size of the gaps in this sample is attributable to subject mix: this must be the case, since the grade expectation is no higher than C.

The data for A*/A performance on five or more GCSEs set out above, which does not insist on coverage of EBacc subjects other than English and maths, suggests a gap of around seven percentage points. But it also demonstrates big gaps – again at A*/A – for achievement in single subjects, especially the separate sciences.

The KS5 gaps on this sample range from 2.5 to 13 percentage points. We cited data above suggesting a five percentage point gap in 2012 for AAB+, at least two in facilitating subjects. These findings do not seem wildly out of kilter with that, or with the evidence of gaps of around six to seven percentage points for AAA grades or higher.

 

Overall pattern 

The published data provides a beguiling glimpse of the size of excellence gaps and how they compare with FSM gaps on the key national benchmarks.

But discerning the pattern is like trying to understand the picture on a jigsaw when the majority of pieces are missing.

The received wisdom is captured in the observation by Whitty and Anders that:

‘Even though the attainment gap in schools has narrowed overall, it is largest for the elite measures’

and the SMCPC’s comment that:

‘…the system is better at lifting children eligible for FSM above a basic competence level (getting 5A*–C) than getting them above a tougher level of attainment likely to secure access to top universities.’

This seems broadly true, but the detailed picture is rather more complicated.

  • At KS2 there are gaps at L5 and above of around 16-20 percentage points, the majority higher than the comparable gaps at L4. But the gaps for the core subjects combined are smaller than those for the separate assessments. There is tentative evidence that the former may be widening.
  • At KS4 there are very significant differences between results in individual subjects. When it comes to multi-subject indicators, differences in the choice of subject mix – as well as choice of grade – make it extremely difficult to draw even the most tentative conclusions about the size of excellence gaps and how they relate to benchmark-related gaps at KS4 and excellence gaps at KS2.
  • At KS5, the limited evidence suggests that A level excellence gaps at the highest grades are broadly similar to those at GCSE A*/A. If anything, gaps seem to narrow slightly compared with KS4. But the confusion over KS4 measures makes this impossible to verify.

We desperately need access to a more complete dataset so we can understand these relationships more clearly.

This is the end of Part one. In Part two, we move on to consider evidence about whether high attainers remain so, before examining international comparisons data and related research, followed by excellence gaps analysis from the USA.

Part two concludes with a short review of how present government policy impacts on excellence gaps and some recommendations for strengthening the present arrangements.

 

GP

September 2014

‘Poor but Bright’ v ‘Poor but Dim’

 

This post explores whether, in supporting learners from disadvantaged backgrounds, educators should prioritise low attainers over high attainers, or give them equal priority.

 

 

 

Introduction

Last week I took umbrage at a blog post and found myself engaged in a Twitter discussion with the author, one Mr Thomas.

 

 

Put crudely, the discussion hinged on the question whether the educational needs of ‘poor but dim’ learners should take precedence over those of the ‘poor but bright’. (This is Mr Thomas’s shorthand, not mine.)

He argued that the ‘poor but dim’ are the higher priority; I countered that all poor learners should have equal priority, regardless of their ability and prior attainment.

We began to explore the issue:

  • as a matter of educational policy and principle
  • with reference to inputs – the allocation of financial and human resources between these competing priorities and
  • in terms of outcomes – the comparative benefits to the economy and to society from investment at the top or the bottom of the attainment spectrum.

This post presents the discussion, adding more flesh and gloss from the Gifted Phoenix perspective.

It might or might not stimulate some interest in how this slightly different take on a rather hoary old chestnut plays out in England’s current educational landscape.

But I am particularly interested in how gifted advocates in different countries respond to these arguments. What is the consensus, if any, on the core issue?

Depending on the answer to this first question, how should gifted advocates frame the argument for educationalists and the wider public?

To help answer the first question I have included a poll at the end of the post.

Do please respond to that – and feel free to discuss the second question in the comments section below.

The structure of the post is fairly complex, comprising:

  • A (hopefully objective) summary of Mr Thomas’s original post.
  • An embedded version of the substance of our Twitter conversation. I have removed some Tweets – mostly those from third parties – and reordered a little to make this more accessible. I don’t believe I’ve done any significant damage to either case.
  • Some definition of terms, because there is otherwise much cause for confusion as we push further into the debate.
  • A digressionary exploration of the evidence base, dealing with attainment data and budget allocations respectively. The former exposes what little we are told about how socio-economic gaps vary across the attainment spectrum; the latter is relevant to the discussion of inputs. Those pressed for time may wish to proceed directly to…
  • …A summing up, which expands in turn the key points we exchanged on the point of principle, on inputs and on outcomes respectively.

I have reserved until close to the end a few personal observations about the encounter and how it made me feel.

And I conclude with the customary brief summary of key points and the aforementioned poll.

It is an ambitious piece and I am in two minds as to whether it hangs together properly, but you are ultimately the judges of that.

 

What Mr Thomas Blogged

The post was called ‘The Romance of the Poor but Bright’ and the substance of the argument (incorporating several key quotations) ran like this:

  • The ‘effort and resources, of schools but particularly of business and charitable enterprise, are directed disproportionately at those who are already high achieving – the poor but bright’.
  • Moreover ‘huge effort is expended on access to the top universities, with great sums being spent to make marginal improvements to a small set of students at the top of the disadvantaged spectrum. They cite the gap in entry, often to Oxbridge, as a significant problem that blights our society.’
  • This however is ‘the pretty face of the problem. The far uglier face is the gap in life outcomes for those who take least well to education.’
  • ‘Popular discourse is easily caught up in the romance of the poor but bright’ but ‘we end up ignoring the more pressing problem – of students for whom our efforts will determine whether they ever get a job or contribute to society’. For ‘when did you last hear someone advocate for the poor but dim?’
  • ‘The gap most damaging to society is in life outcomes for the children who perform least well at school.’ Three areas should be prioritised to improve their educational outcomes:

o   Improving alternative provision (AP) which ‘operates as a shadow school system, largely unknown and wholly unappreciated’ - ‘developing a national network of high–quality alternative provision…must be a priority if we are to close the gap at the bottom’.

o   Improving ‘consistency in SEN support’ because ‘schools are often ill equipped to cope with these, and often manage only because of the extraordinary effort of dedicated staff’. There is ‘inconsistency in funding and support between local authorities’.

o   Introducing clearer assessment of basic skills, ‘so that a student could not appear to be performing well unless they have mastered the basics’.

  • While ‘any student failing to meet their potential is a dreadful thing’, the educational successes of ‘students with incredibly challenging behaviour’ and ‘complex special needs…have the power to change the British economy, far more so than those of their brighter peers.’

A footnote adds ‘I do not believe in either bright or dim, only differences in epigenetic coding or accumulated lifetime practice, but that is a discussion for another day.’

Indeed it is.

 

Our ensuing Twitter discussion

The substance of our Twitter discussion is captured in the embedded version immediately below. (Scroll down to the bottom for the beginning and work your way back to the top.)

 

 

Defining Terms

 

Poor, Bright and Dim

I take poor to mean socio-economic disadvantage, as opposed to any disadvantage attributable to the behaviours, difficulties, needs, impairments or disabilities associated with AP and/or SEN.

I recognise of course that such a distinction is more theoretical than practical, because, when learners experience multiple causes of disadvantage, the educational response must be holistic rather than disaggregated.

Nevertheless, the meaning of ‘poor’ is clear – that term cannot be stretched to include these additional dimensions of disadvantage.

The available performance data foregrounds two measures of socio-economic disadvantage: current eligibility for and take up of free school meals (FSM) and qualification for the deprivation element of the Pupil Premium, determined by FSM eligibility at some point within the last 6 years (known as ‘ever-6’).

Both are used in this post. Distinctions are typically between disadvantaged and non-disadvantaged learners, though some of the supporting data compares outcomes for disadvantaged learners with outcomes for all learners, advantaged and disadvantaged alike.

The gaps that need closing are therefore:

  • between ‘poor and bright’ and other ‘bright’ learners (The Excellence Gap) and 
  • between ‘poor and dim’ and other ‘dim’ learners. I will christen this The Foundation Gap.

The core question is whether The Foundation Gap takes precedence over The Excellence Gap or vice versa, or whether they should have equal billing.

This involves immediate and overt recognition that classification as AP and/or SEN is not synonymous with the epithet ‘poor’, because there are many comparatively advantaged learners within these populations.

But such a distinction is not properly established in Mr Thomas’ blog, which applies the epithet ‘poor’ but then treats the AP and SEN populations as homogenous and somehow associated with it.

 

By ‘dim’ I take Mr Thomas to mean the lowest segment of the attainment distribution – one of his tweets specifically mentions ‘the bottom 20%’. The AP and/or SEN populations are likely to be disproportionately represented within these two deciles, but they are not synonymous with them either.

This distinction will not be lost on gifted advocates who are only too familiar with the very limited attention paid to twice exceptional learners.

Those from poor backgrounds within the AP and/or SEN populations are even more likely to be disproportionately represented in ‘the bottom 20%’ than their more advantaged peers, but even they will not constitute the entirety of ‘the bottom 20%’. A Venn diagram would likely show significant overlap, but that is all.

Hence disadvantaged AP/SEN are almost certainly a relatively poor proxy for the ‘poor but dim’.

That said I could find no data that quantifies these relationships.

The School Performance Tables distinguish a ‘low attainer’ cohort. (In the Secondary Tables the definition is determined by prior KS2 attainment and in the Primary Tables by prior KS1 attainment.)

These populations comprise some 15.7% of the total population in the Secondary Tables and about 18.0% in the Primary Tables. But neither set of Tables applies the distinction in their reporting of the attainment of those from disadvantaged backgrounds.

 

It follows from the definition of ‘dim’ that, by ‘bright’, Mr Thomas probably intends the two corresponding deciles at the top of the attainment distribution (even though he seems most exercised about the subset with the capacity to progress to competitive universities, particularly Oxford and Cambridge. This is a far more select group of exceptionally high attainers – and an even smaller group of exceptionally high attainers from disadvantaged backgrounds.)

A few AP and/or SEN students will likely fall within this wider group, fewer still within the subset of exceptionally high attainers. AP and/or SEN students from disadvantaged backgrounds will be fewer again, if indeed there are any at all.

The same issues with data apply. The School Performance Tables distinguish ‘high attainers’, who constitute over 32% of the secondary cohort and 25% of the primary cohort. As with low attainers, we cannot isolate the performance of those from disadvantaged backgrounds.

We are forced to rely on what limited data is made publicly available to distinguish the performance of disadvantaged low and high attainers.

At the top of the distribution there is a trickle of evidence about performance on specific high attainment measures and access to the most competitive universities. Still greater transparency is fervently to be desired.

At the bottom, I can find very little relevant data at all – we are driven inexorably towards analyses of the SEN population, because that is the only dataset differentiated by disadvantage, even though we have acknowledged that such a proxy is highly misleading. (Equivalent AP attainment data seems conspicuous by its absence.)

 

AP and SEN

Before exploring these datasets I ought to provide some description of the different programmes and support under discussion here, if only for the benefit of readers who are unfamiliar with the English education system.

 

Alternative Provision (AP) is intended to meet the needs of a variety of vulnerable learners:

‘They include pupils who have been excluded or who cannot attend mainstream school for other reasons: for example, children with behaviour issues, those who have short- or long-term illness, school phobics, teenage mothers, pregnant teenagers, or pupils without a school place.’

AP is provided in a variety of settings where learners engage in timetabled education activities away from their school and school staff.

Providers include further education colleges, charities, businesses, independent schools and the public sector. Pupil Referral Units (PRUs) are perhaps the best-known settings – there are some 400 nationally.

A review of AP was undertaken by Taylor in 2012 and the Government subsequently embarked on a substantive improvement programme. This rather gives the lie to Mr Thomas’ contention that AP is ‘largely unknown and wholly unappreciated’.

Taylor complains of a lack of reliable data about the number of learners in AP but notes that the DfE’s 2011 AP census recorded 14,050 pupils in PRUs and a further 23,020 in other settings on a mixture of full-time and part-time placements. This suggests a total of slightly over 37,000 learners, though the FTE figure is unknown.

He states that AP learners are:

‘…twice as likely as the average pupil to qualify for free school meals’

A supporting Equality Impact Assessment qualifies this somewhat:

‘In Jan 2011, 34.6% of pupils in PRUs and 13.8%* of pupils in other AP, were eligible for and claiming free school meals, compared with 14.6% of pupils in secondary schools. [*Note: in some AP settings, free school meals would not be available, so that figure is under-stated, but we cannot say by how much.]’

If the PRU population is typical of the wider AP population, approximately one third qualify under this FSM measure of disadvantage, meaning that the substantial majority are not ‘poor’ according to our definition above.

Taylor confirms that overall GCSE performance in AP is extremely low, pointing out that in 2011 just 1.4% achieved five or more GCSE grades A*-C including [GCSEs in] maths and English, compared to 53.4% of pupils in all schools.

By 2012/13 the comparable percentages were 1.7% and 61.7% respectively (the latter for all state-funded schools), suggesting that the overall gap has widened from around 52 to 60 percentage points. This is a cause for concern but not directly relevant to the issue under consideration.

The huge disparity is at least partly explained by the facts that many AP students take alternative qualifications and that the national curriculum does not apply to PRUs.

Data is available showing the full range of qualifications pursued. Taylor recommended that all students in AP should continue to receive ‘appropriate and challenging English and Maths teaching’.

Interestingly, he also pointed out that:

‘In some PRUs and AP there is no provision for more able pupils who end up leaving without the GCSE grades they are capable of earning.’

However, he fails to offer a specific recommendation to address this point.

 

Special Educational Needs (SEN) are needs or disabilities that affect children’s ability to learn. These may include behavioural and social difficulties, learning difficulties or physical impairments.

This area has also been subject to a major Government reform programme now being implemented.

There is significant overlap between AP and SEN, with Taylor’s review of the former noting that the population in PRUs is 79% SEN.

We know from the 2013 SEN statistics that 12.6% of all pupils on roll at PRUs had SEN statements and 68.9% had SEN without statements. But these populations represent only a tiny proportion of the total SEN population in schools.

SEN learners also have higher than typical eligibility for FSM. In January 2013, 30.1% of all SEN categories across all primary, secondary and special schools were FSM-eligible, roughly twice the rate for all pupils. However, this means that almost seven in ten are not caught by the definition of ‘poor’ provided above.

In 2012/13 23.4% of all SEN learners achieved five or more GCSEs at A*-C or equivalent, including GCSEs in English and maths, compared with 70.4% of those having no identified SEN – another significant overall gap, but not directly relevant to our comparison of the ‘poor but bright’ and the ‘poor but dim’.

 

Data on socio-economic attainment gaps across the attainment spectrum

Those interested in how socio-economic attainment gaps vary at different attainment levels cannot fail to be struck by how little material of this kind is published, particularly in the secondary sector, when such gaps tend to increase in size.

One cannot entirely escape the conviction that this reticence deliberately masks some inconvenient truths.

  • The ideal would be to have the established high/middle/low attainer distinctions mapped directly onto performance by advantaged/disadvantaged learners in the Performance Tables but, as we have indicated, this material is conspicuous by its absence. Perhaps it will appear in the Data Portal now under development.
  • Our next best option is to examine socio-economic attainment gaps on specific attainment measures that will serve as decent proxies for high/middle/low attainment. We can do this to some extent but the focus is disproportionately on the primary sector because the Secondary Tables do not include proper high attainment measures (such as measures based exclusively on GCSE performance at grades A*/A). Maybe the Portal will come to the rescue here as well. We can however supply some basic Oxbridge fair access data.
  • The least preferable option is to deploy our admittedly poor proxies for low attainers – SEN and AP. But there isn’t much information from this source either.

The analysis below looks consecutively at data for the primary and secondary sectors.

 

Primary

We know, from the 2013 Primary School Performance Tables, that the percentages of disadvantaged and other learners achieving different KS2 levels in reading, writing and maths combined, in 2013 and 2012 respectively, were as follows:

 

Table 1: Percentage of disadvantaged and all other learners achieving each national curriculum level at KS2 in 2013 in reading, writing and maths combined

      L3 or below         L4 or above         L4B or above        L5 or above
      Dis   Oth   Gap     Dis   Oth   Gap     Dis   Oth   Gap     Dis   Oth   Gap
2013  13    5     +8      63    81    -18     49    69    -20     10    26    -16
2012  x     x     x       61    80    -19     x     x     x       9     24    -15

 

This tells us relatively little, apart from the fact that disadvantaged learners are heavily over-represented at L3 and below and heavily under-represented at L5 and above.

The L5 gap is somewhat lower than the gaps at L4 and 4B respectively, but not markedly so. However, the L5 gap has widened slightly since 2012 while the reverse is true at L4.

This next table synthesises data from SFR51/13: ‘National curriculum assessments at key stage 2: 2012 to 2013’. It also shows gaps for disadvantage, as opposed to FSM gaps.

 

Table 2: Percentage of disadvantaged and all other learners achieving each national curriculum level, including differentiation by gender, in each 2013 end of KS2 test

                    L3                L4                L4B               L5                L6
              D     O     Gap   D     O     Gap   D     O     Gap   D     O     Gap   D     O     Gap
Reading  All  12    6     +6    48    38    +10   63    80    -17   30    50    -20   0     1     -1
         B    13    7     +6    47    40    +7    59    77    -18   27    47    -20   0     0     0
         G    11    5     +6    48    37    +11   67    83    -16   33    54    -21   0     1     -1
GPS      All  28    17    +11   28    25    +3    52    70    -18   33    51    -18   1     2     -1
         B    30    20    +10   27    27    0     45    65    -20   28    46    -18   0     2     -2
         G    24    13    +11   28    24    +4    58    76    -18   39    57    -18   1     3     -2
Maths    All  16    9     +7    50    41    +9    62    78    -16   24    39    -15   2     8     -6
         B    15    8     +7    48    39    +9    63    79    -16   26    39    -13   3     10    -7
         G    17    9     +8    52    44    +8    61    78    -17   23    38    -15   2     7     -5

 

This tells a relatively consistent story across each test and for boys as well as girls.

We can see that, at Level 4 and below, learners from disadvantaged backgrounds are clearly over-represented, perhaps with the exception of L4 GPS. But at L4B and above they are markedly under-represented.

Moreover, with the exception of L6 where low percentages across the board mask the true size of the gaps, disadvantaged learners tend to be significantly more under-represented at L4B and above than they are over-represented at L4 and below.

A different way of looking at this data is to compare the percentages of advantaged and disadvantaged learners respectively at L4 and L5 in each assessment.

  • Reading: amongst disadvantaged learners the proportion at L5 is 18 percentage points lower than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is 12 percentage points higher than at L4.
  • GPS: amongst disadvantaged learners the proportion at L5 is 5 percentage points higher than the proportion at L4, but amongst advantaged learners it is 26 percentage points higher.
  • Maths: amongst disadvantaged learners the proportion at L5 is 26 percentage points lower than the proportion at L4, but amongst advantaged learners it is only 2 percentage points lower. (A short sketch recomputing these differences from Table 2 follows below.)
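
These differentials can be recomputed directly from the ‘All’ rows of Table 2. The short Python sketch below simply transcribes the percentages quoted in that table; nothing else is assumed:

```python
# Percentages achieving each level in the 2013 KS2 tests, transcribed from the
# 'All' rows of Table 2: D = disadvantaged learners, O = all other learners.
table2 = {
    "Reading": {"D": {"L4": 48, "L5": 30}, "O": {"L4": 38, "L5": 50}},
    "GPS":     {"D": {"L4": 28, "L5": 33}, "O": {"L4": 25, "L5": 51}},
    "Maths":   {"D": {"L4": 50, "L5": 24}, "O": {"L4": 41, "L5": 39}},
}

for subject, groups in table2.items():
    for group, levels in groups.items():
        diff = levels["L5"] - levels["L4"]  # positive means more at L5 than at L4
        print(f"{subject} ({group}): L5 minus L4 = {diff:+d} percentage points")
```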

If we look at 2013 gaps compared with 2012 (with teacher assessment of writing included in place of the GPS test introduced in 2013), we can see there has been relatively little change across the board, with the exception of L5 maths, which has been affected by the increasing success of advantaged learners at L6.

 

Table 3: Percentage of disadvantaged and all other learners achieving  national curriculum levels 3-6 in each of reading, writing and maths in 2012 and 2013 respectively

              L3                L4                L5                L6
              D     O     Gap   D     O     Gap   D     O     Gap   D     O     Gap
Reading 2012  11    6     +5    46    36    +10   33    54    -21   0     0     0
        2013  12    6     +6    48    38    +10   30    50    -20   0     1     -1
Writing 2012  22    11    +11   55    52    +3    15    32    -17   0     1     -1
        2013  19    10    +9    56    52    +4    17    34    -17   1     2     -1
Maths   2012  17    9     +8    50    41    +9    23    43    -20   1     4     -3
        2013  16    9     +7    50    41    +9    24    39    -15   2     8     -6

 

To summarise, as far as KS2 performance is concerned, there are significant imbalances at both the top and the bottom of the attainment distribution and these gaps have not changed significantly since 2012. There is some evidence to suggest that gaps at the top are larger than those at the bottom.

 

Secondary

Unfortunately there is a dearth of comparable data at secondary level, principally because of the absence of published measures of high attainment.

SFR05/2014 provides us with FSM gaps (as opposed to disadvantaged gaps) for a series of GCSE measures, none of which serve our purpose particularly well:

  • 5+ A*-C GCSE grades: gap = 16.0%
  • 5+ A*-C grades including English and maths GCSEs: gap = 26.7%
  • 5+ A*-G grades: gap = 7.6%
  • 5+ A*-G grades including English and maths GCSEs: gap = 9.9%
  • A*-C grades in English and maths GCSEs: gap = 26.6%
  • Achieving the English Baccalaureate: gap = 16.4%

Perhaps all we can deduce is that the gaps vary considerably in size, but tend to be smaller for the relatively less demanding and larger for the relatively more demanding measures.

For specific high attainment measures we are forced to rely principally on data snippets released in answer to occasional Parliamentary Questions.

For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8%. By 2009 the comparable percentages were 1.7% and 9.0% respectively, giving an increased gap of 7.3% (Col 568W)
  • In 2006/07, the percentages of FSM-eligible pupils securing A*/A grades in different GCSE subjects, compared with the percentages of all pupils in maintained schools doing so, were as shown in the table below (Col 808W)

 

Table 4: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

Subject      FSM   All pupils   Gap
Maths        3.7   15.6         11.9
Eng lit      4.1   20.0         15.9
Eng lang     3.5   16.4         12.9
Physics      2.2   49.0         46.8
Chemistry    2.5   48.4         45.9
Biology      2.5   46.8         44.3
French       3.5   22.9         19.4
German       2.8   23.2         20.4

 

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)

There is much variation in the subject-specific outcomes at A*/A described above. But, when it comes to the overall 5+ GCSEs high attainment measure based on grades A*/A, the gap is much smaller than on the corresponding standard measure based on grades A*-C.

Further material of broadly the same vintage is available in a 2007 DfE statistical publication: ‘The Characteristics of High Attainers’.

There is a complex pattern in evidence here which is very hard to explain with the limited data available. More time series data of this nature – illustrating Excellence and Foundation Gaps alike – should be published annually so that we have a more complete and much more readily accessible dataset.

I could find no information at all about the comparative performance of disadvantaged learners in AP settings compared with those not from disadvantaged backgrounds.

Data is published showing the FSM gap for SEN learners on all the basic GCSE measures listed above. I have retained the generic FSM gaps in brackets for the sake of comparison:

  • 5+ A*-C GCSE grades: gap = 12.5% (16.0%)
  • 5+ A*-C grades including English and maths GCSEs: gap = 12.1% (26.7%)
  • 5+ A*-G grades: gap = 10.4% (7.6%)
  • 5+ A*-G grades including English and maths GCSEs: gap = 13.2% (9.9%)
  • A*-C grades in English and maths GCSEs: gap = 12.3% (26.6%)
  • Achieving the English Baccalaureate: gap = 3.5% (16.4%)

One can see that the FSM gaps for the more demanding measures are generally lower for SEN learners than they are for all learners. This may be interesting but, for the reasons given above, this is not a reliable proxy for the FSM gap amongst ‘dim’ learners.

 

When it comes to fair access to Oxbridge, I provided a close analysis of much relevant data in this post from November 2013.

The chart below shows the number of 15 year-olds eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19. The figures are rounded to the nearest five.

 

Chart 1: FSM-eligible learners admitted to Oxford and Cambridge 2005/06 to 2010/11


 

In sum, there has been no change in these numbers over the last six years for which data has been published. So while there may have been consistently significant expenditure on access agreements and multiple smaller mentoring programmes, it has had negligible impact on this measure at least.

My previous post set out a proposal for what to do about this sorry state of affairs.

 

Budgets

For the purposes of this discussion we need ideally to identify and compare total national budgets for the ‘poor but bright’ and the ‘poor but dim’. But that is simply not possible. 

Many funding streams cannot be disaggregated in this manner. As we have seen, some – including the AP and SEN budgets – may be aligned erroneously with the second of these groups, although they also support learners who are neither ‘poor’ nor ‘dim’ and have a broader purpose than raising attainment. 

There may be some debate, too, about which funding streams should be weighed in the balance.

On the ‘bright but poor’ side, do we include funding for grammar schools, even though the percentage of disadvantaged learners attending many of them is virtually negligible (despite recent suggestions that some are now prepared to do something about this)? Should the Music and Dance Scheme (MDS) be within scope of this calculation?

The best I can offer is a commentary that gives a broad sense of orders of magnitude, to illustrate in very approximate terms how the scales tend to tilt more towards the ‘poor but dim’ rather than the ‘poor but bright’, but also to weave in a few relevant asides about some of the funding streams in question.

 

Pupil Premium and the EEF

I begin with the Pupil Premium – providing schools with additional funding to raise the attainment of disadvantaged learners.

The Premium is not attached to the learners who qualify for it, so schools are free to aggregate the funding and use it as they see fit. They are held accountable for these decisions through Ofsted inspection and the gap-narrowing measures in the Performance Tables.

Mr Thomas suggests in our Twitter discussion that AP students are not significant beneficiaries of such support, although provision in PRUs features prominently in the published evaluation of the Premium. It is for local authorities to determine how Pupil Premium funding is allocated in AP settings.

One might also make a case that ‘bright but poor’ learners are not a priority either, despite suggestions from the Pupil Premium Champion to the contrary.

 

 

As we have seen, the Performance Tables are not sharply enough focused on the excellence gaps at the top of the distribution and I have shown elsewhere that Ofsted’s increased focus on the most able does not yet extend to the impact on those attracting the Pupil Premium, even though there was a commitment that it would do so. 

If there is Pupil Premium funding heading towards high attainers from disadvantaged backgrounds, the limited data to which we have access does not yet suggest a significant impact on the size of Excellence Gaps. 

The ‘poor but bright’ are not a priority for the Education Endowment Foundation (EEF) either.

This 2011 paper explains that it is prioritising the performance of disadvantaged learners in schools below the floor targets. At one point it says:

‘Looking at the full range of GCSE results (as opposed to just the proportions who achieve the expected standards) shows that the challenge facing the EEF is complex – it is not simply a question of taking pupils from D to C (the expected level of attainment). Improving results across the spectrum of attainment will mean helping talented pupils to achieve top grades, while at the same time raising standards amongst pupils currently struggling to pass.’

But this is just after it has shown that the percentages of disadvantaged high attainers in its target schools are significantly lower than elsewhere. Other things being equal, the ‘poor but dim’ will be the prime beneficiaries.

It may now be time for the EEF to expand its focus to all schools. A diagram from this paper – reproduced below – demonstrates that, in 2010, the attainment gap between FSM and non-FSM was significantly larger in schools above the floor than in those below the floor that the EEF is prioritising. This is true in both the primary and secondary sectors.

It would be interesting to see whether this is still the case.

 

[Chart reproduced from the EEF paper: FSM/non-FSM attainment gaps in schools above and below the floor targets, 2010]

 

AP and SEN

Given the disaggregation problems discussed above, this section is intended simply to give some basic sense of orders of magnitude – lending at least some evidence to counter Mr Thomas’ assertion that the ‘effort and resources, of schools… are directed disproportionately at those who are already high achieving – the poor but bright’.

It is surprisingly hard to get a grip on the overall national budget for AP. A PQ from early 2011 (Col 75W) supplies a net current expenditure figure for all English local authorities of £530m.

Taylor’s Review fails to offer a comparable figure, but my rough estimates, based on the per pupil costs he supplies, suggest a revenue budget of at least £400m. (Taylor suggests average per pupil costs of £9,500 per year for full-time AP, although PRU places are said to cost between £12,000 and £18,000 per annum.)
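
For what it is worth, the order of magnitude of that estimate can be reconstructed from figures already quoted: the 2011 AP census headcounts and Taylor’s per-place costs. The sketch below assumes, almost certainly wrongly, that every placement is full-time, so it indicates a ceiling rather than a firm figure:

```python
# 2011 AP census headcounts and Taylor's quoted per-place annual costs.
# Assumption (mine): every placement is full-time, so this overstates the total.
pru_pupils, other_ap_pupils = 14_050, 23_020
pru_cost_low, pru_cost_high = 12_000, 18_000   # PRU place, pounds per year
other_ap_cost = 9_500                          # full-time AP, pounds per year

low = pru_pupils * pru_cost_low + other_ap_pupils * other_ap_cost
high = pru_pupils * pru_cost_high + other_ap_pupils * other_ap_cost
print(f"roughly £{low / 1e6:.0f}m to £{high / 1e6:.0f}m")  # roughly £387m to £472m
```

Part-time placements would pull the true figure down, which is why the £400m above is offered only as a rough order of magnitude.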

I found online a consultation document from Kent – England’s largest local authority – stating its revenue costs at over £11m in FY2014-15. Approximately 454 pupils attended Kent’s AP/PRU provision in 2012-13.

There must also be a significant capital budget. There are around 400 PRUs, not to mention a growing cadre of specialist AP academies and free schools. The total capital cost of the first AP free school – Derby Pride Academy – was £2.147m for a 50 place setting.

In FY2011-12, total annual national expenditure on SEN was £5.77 billion (Col 391W). There will have been some cost-cutting as a consequence of the latest reforms, but the order of magnitude is clear.

The latest version of the SEN Code of Practice outlines the panoply of support available, including the compulsory requirement that each school has a designated teacher to be responsible for co-ordinating SEN provision (the SENCO).

In short, the national budget for AP is sizeable and the national budget for SEN is huge. Per capita expenditure is correspondingly high. If we could isolate the proportion of these budgets allocated to raising the attainment of the ‘poor but dim’, the total would be substantial.

 

Fair Access, especially to Oxbridge, and some related observations

Mr Thomas refers specifically to funding to support fair access to universities – especially Oxbridge – for those from disadvantaged backgrounds. This is another area in which it is hard to get a grasp on total expenditure, not least because of the many small-scale mentoring projects that exist.

Mr Thomas is quite correct to remark on the sheer number of these, although they are relatively small beer in budgetary terms. (One suspects that they would be much more efficient and effective if they could be linked together within some sort of overarching framework.)

The Office for Fair Access (OFFA) estimates University access agreement expenditure on outreach in 2014-15 at £111.9m and this has to be factored in, as does DfE’s own small contribution – the Future Scholar Awards.

Were any expenditure in this territory to be criticised, it would surely be the development and capital costs for new selective 16-19 academies and free schools that specifically give priority to disadvantaged students.

The sums are large, perhaps not outstandingly so compared with national expenditure on SEN for example, but they will almost certainly benefit only a tiny localised proportion of the ‘bright but poor’ population.

There are several such projects around the country. Some of the most prominent are located in London.

The London Academy of Excellence (capacity 420) is fairly typical. It cost an initial £4.7m to establish, plus a lease requiring a further £400K annually.

But this is dwarfed by the projected costs of the Harris Westminster Sixth Form, scheduled to open in September 2014. Housed in a former government building, the capital cost is reputed to be £45m for a 500-place institution.

There were reportedly disagreements within Government:

‘It is understood that the £45m cost was subject to a “significant difference of opinion” within the DfE where critics say that by concentrating large resources on the brightest children at a time when budgets are constrained means other children might miss out…

But a spokeswoman for the DfE robustly defended the plans tonight. “This is an inspirational collaboration between the country’s top academy chain and one of the best private schools in the country,” she said. “It will give hundreds of children from low income families across London the kind of top quality sixth-form previously reserved for the better off.”’

Here we have in microcosm the debate to which this post is dedicated.

One blogger – a London College Principal – pointed out that the real issue was not whether the brightest should benefit over others, but how few of the ‘poor but bright’ would do so:

‘£45m could have a transformative effect on thousands of 16-19 year olds across London… £45m could have funded at least 50 extra places in each college for over 10 years, helped build excellent new facilities for all students and created a city-wide network to support gifted and talented students in sixth forms across the capital working with our partner universities and employers.’

 

Summing Up

There are three main elements to the discussion: the point of principle, the inputs and the impact. The following sections deal with each of these in turn.

 

Principle

Put bluntly, should ‘poor but dim’ kids have higher priority for educators than ‘poor but bright’ kids (Mr Thomas’ position) or should all poor kids have equal priority and an equal right to the support they need to achieve their best (the Gifted Phoenix position)?

For Mr Thomas, it seems this priority is determined by whether – and how far – the learner is behind undefined ‘basic levels of attainment’ and/or mastery of ‘the basics’ (presumably literacy and numeracy).

Those below the basic attainment threshold have higher priority than those above it. He does not say so but this logic suggests that those furthest below the threshold are the highest priority and those furthest above are the lowest.

So, pursued to its logical conclusion, this would mean that the highest attainers would get next to no support while a human vegetable would be the highest priority of all.

However, since Mr Thomas’ focus is on marginal benefit, it may be that those nearest the threshold would be first in the queue for scarce resources, because they would require the least effort and resources to lift above it.

This philosophy drives the emphasis on achievement of national benchmarks and predominant focus on borderline candidates that, until recently, dominated our assessment and accountability system.

For Gifted Phoenix, every socio-economically disadvantaged learner has an equal claim to the support they need to improve their attainment, by virtue of that disadvantage.

There is no question of elevating some ahead of others in the pecking order because they are further behind on key educational measures since, in effect, that is penalising some disadvantaged learners on the grounds of their ability or, more accurately, their prior attainment.

This philosophy underpins the notion of personalised education and is driving the recent and welcome reforms of the assessment and accountability system, designed to ensure that schools are judged by how well they improve the attainment of all learners, rather than predominantly on the basis of the proportion achieving the standard national benchmarks.

I suggested that, in deriding ‘the romance of the poor but bright’, Mr Thomas ran the risk of falling into ‘the slough of anti-elitism’. He rejected that suggestion, while continuing to emphasise the need to ‘concentrate more’ on ‘those at risk of never being able to engage with society’.

I have made the assumption that Mr Thomas is interested primarily in KS2 and GCSE or equivalent qualifications at KS4, given his references to KS2 L4, basic skills and ‘paper qualifications needed to enter meaningful employment’.

But his additional references to ‘real qualifications’ (as opposed to paper ones) and engaging with society could well imply a wider range of personal, social and work-related skills for employability and adult life.

My preference for equal priority would apply regardless: there is no guarantee that high attainers from disadvantaged backgrounds will necessarily possess these vital skills.

But, as indicated in the definition above, there is an important distinction to be maintained between:

  • educational support to raise the attainment, learning and employability skills of socio-economically disadvantaged learners and prepare them for adult life and
  • support to manage a range of difficulties – whether behavioural problems, disability, physical or mental impairment – that impact on the broader life chances of the individuals concerned.

Such a distinction may well be masked in the everyday business of providing effective holistic support for learners facing such difficulties, but this debate requires it to be made and sustained given Mr Thomas’s definition of the problem in terms of the comparative treatment of the ‘poor but bright’ and the ‘poor but dim’.

Having made this distinction, it is not clear whether he himself sustains it consistently through to the end of his post. In the final paragraphs the term ‘poor but dim’ begins to morph into a broader notion encompassing all AP and SEN learners regardless of their socio-economic status.

Additional dimensions of disadvantage are potentially being brought into play. This is inconsistent and radically changes the nature of the argument.

 

Inputs

By inputs I mean the resources – financial and human – made available to support the education of ‘dim’ and ‘bright’ disadvantaged learners respectively.

Mr Thomas also shifts his ground as far as inputs are concerned.

His post opens with a statement that ‘the effort and resources’ of schools, charities and businesses are ‘directed disproportionately’ at the poor but bright – and he exemplifies this with reference to fair access to competitive universities, particularly Oxbridge.

When I point out the significant investment in AP compared with fair access, he changes tack – ‘I’m measuring outcomes not just inputs’.

Then later he says ‘But what some need is just more expensive’, to which I respond that ‘the bottom end already has the lion’s share of funding’.

At this point we have both fallen into the trap of treating the entirety of the AP and SEN budgets as focused on the ‘poor but dim’.

We are failing to recognise that they are poor proxies because the majority of AP and SEN learners are not ‘poor’, many are not ‘dim’, these budgets are focused on a wider range of needs and there is significant additional expenditure directed at ‘poor but dim’ learners elsewhere in the wider education budget.

Despite Mr Thomas’s opening claim, it should be reasonably evident from the preceding commentary that my ‘lion’s share’ point is factually correct. His suggestion that AP is ‘largely unknown and wholly unappreciated’ flies in the face of the Taylor Review and the Government’s subsequent work programme.

SEN may depend heavily on the ‘extraordinary effort of dedicated staff’, but at least there are such dedicated staff! There may be inconsistencies in local authority funding and support for SEN, but the global investment is colossal by comparison with the funding dedicated on the other side of the balance.

Gifted Phoenix’s position acknowledges that inputs are heavily loaded in favour of the SEN and AP budgets. This is as it should be since, as Mr Thomas rightly notes, many of the additional services these learners need are frequently more expensive to provide. These services are not simply dedicated to raising their attainment, but also to tackling more substantive problems associated with their status.

Whether the balance of expenditure on the ‘bright’ and ‘dim’ respectively is optimal is a somewhat different matter. Contrary to Mr Thomas’s position, gifted advocates are often convinced that too much largesse is focused on the latter at the expense of the former.

Turning to advocacy, Mr Thomas says ‘we end up ignoring the more pressing problem’ of the poor but dim. He argues in the Twitter discussion that too few people are advocating for these learners, adding that they are failed ‘because it’s not popular to talk about them’.

I could not resist countering that advocacy for gifted learners is equally unpopular, indeed ‘the word is literally taboo in many settings’. I cannot help thinking – from his footnote reference to ‘epigenetic coding’ – that Mr Thomas is amongst those who are distinctly uncomfortable with the term.

Where advocacy does survive it is focused exclusively on progression to competitive universities and, to some extent, high attainment as a route towards that outcome. The narrative has shifted away from concepts of high ability or giftedness, because of the very limited consensus about that condition (even amongst gifted advocates) and even considerable doubt in some quarters whether it exists at all.

 

Outcomes

Mr Thomas maintains in his post that the successes of his preferred target group ‘have the power to change the British economy, far more so than those of their brighter peers’. This is because ‘the gap most damaging to society is in life outcomes for the children that perform least well at school’.

As noted above, it is important to remember that we are discussing here the addition of educational and economic value by tackling underachievement amongst learners from disadvantaged backgrounds, rather than amongst all the children that perform least well.

We are also leaving to one side the addition of value through any wider engagement by health and social services to improve life chances.

It is quite reasonable to advance the argument that improving the outcomes of ‘the bottom 20%’ (the Tail) will have ‘a huge socio-economic impact’ and ‘make the biggest marginal difference to society’.

But one could equally make the case that society would derive similar or even higher returns from a decision to concentrate disproportionately on the highest attainers (the Smart Fraction).

Or, as Gifted Phoenix would prefer, one could reasonably propose that the optimal returns should be achieved by means of a balanced approach that raises both the floor and the ceiling, avoiding any arbitrary distinctions on the basis of prior attainment.

From the Gifted Phoenix perspective, one should balance the advantages of removing the drag on productivity of an educational underclass against those of developing the high-level human capital needed to drive economic growth and improve our chances of success in what Coalition ministers call the ‘global race’.

According to this perspective, by eliminating excellence gaps between disadvantaged and advantaged high attainers we will secure a stream of benefits broadly commensurate to that at the bottom end.

These will include substantial spillover benefits, achieved as a result of broadening the pool of successful leaders in political, social, educational and artistic fields, not to mention significant improvements in social mobility.

It is even possible to argue that, by creating a larger pool of more highly educated parents, we can also achieve a significant positive impact on the achievement of subsequent generations, thus significantly reducing the size of the tail.

And in the present generation we will create many more role models: young people from disadvantaged backgrounds who become educationally successful and who can influence the aspirations of younger disadvantaged learners.

This avoids the risk that low expectations will be reinforced and perpetuated through a ‘deficit model’ approach that places excessive emphasis on removing the drag from the tail by producing a larger number of ‘useful members of society’.

This line of argument is integral to the Gifted Phoenix Manifesto.

It seems to me entirely conceivable that economists might produce calculations to justify any of these different paths.

But it would be highly inequitable to put all our eggs in the ‘poor but bright’ basket, because that penalises some disadvantaged learners for their failure to achieve high attainment thresholds.

And it would be equally inequitable to focus exclusively on the ‘poor but dim’, because that penalises some disadvantaged learners for their success in becoming high attainers.

The more equitable solution must be to opt for a ‘balanced scorecard’ approach that generates a proportion of the top end benefits and a proportion of the bottom end benefits simultaneously.

There is a risk that this reduces the total flow of benefits, compared with one or other of the inequitable solutions, but there is a trade-off here between efficiency and a socially desirable outcome that balances the competing interests of the two groups.

 

The personal dimension

After we had finished our Twitter exchanges, I thought to research Mr Thomas online. Turns out he’s quite the Big-Cheese-in-Embryo. Provided he escapes the lure of filthy lucre, he’ll be a mover and shaker in education within the next decade.

I couldn’t help noticing his own educational experience – public school, a First in PPE from Oxford, leading light in the Oxford Union – then graduation from Teach First alongside internships with Deutsche Bank and McKinsey.

Now he’s serving his educational apprenticeship as joint curriculum lead for maths at a prominent London Academy. He’s also a trustee of ‘a university mentoring project for highly able 11-14 year old pupils from West London state schools’.

Lucky I didn’t check earlier. Such a glowing CV might have been enough to cow this grammar school Oxbridge reject, even if I did begin this line of work several years before he was born. Not that I have a chip on my shoulder…

The experience set me wondering about the dominant ideology amongst the Teach First cadre, and how it is tempered by extended exposure to teaching in a challenging environment.

There’s more than a hint of idealism about someone from this privileged background espousing the educational philosophy that Mr Thomas professes. But didn’t he wonder where all the disadvantaged people were during his own educational experience, and doesn’t he want to change that too?

His interest in mentoring highly able pupils would suggest that he does, but also seems directly to contradict the position he’s reached here. It would be a pity if the ‘poor but bright’ could not continue to rely on his support, equal in quantity and quality to the support he offers the ‘poor but dim’.

For he could make a huge difference at both ends of the attainment spectrum – and, with his undeniable talents, he should certainly be able to do so.

 

Conclusion

We are entertaining three possible answers to the question whether in principle to prioritise the needs of the ‘poor but bright’ or the ‘poor but dim’:

  • Concentrate principally – perhaps even exclusively – on closing the Excellence Gaps at the top
  • Concentrate principally – perhaps even exclusively – on closing the Foundation Gaps at the bottom
  • Concentrate equally across the attainment spectrum, at the top and bottom and all points in between.

Speaking as an advocate for those at the top, I favour the third option.

It seems to me incontrovertible – though hard to quantify – that, in the English education system, the lion’s share of resources goes towards closing the Foundation Gaps.

That is perhaps as it should be, although one could wish that the financial scales were not tipped so excessively in their direction, for ‘poor but bright’ learners do in my view have an equal right to challenge and support, and should not be penalised for their high attainment.

Our current efforts to understand the relative size of the Foundation and Excellence Gaps and how these are changing over time are seriously compromised by the limited data in the public domain.

There is a powerful economic case to be made for prioritising the Foundation Gaps as part of a deliberate strategy for shortening the tail – but an equally powerful case can be constructed for prioritising the Excellence Gaps, as part of a deliberate strategy for increasing the smart fraction.

Neither of these options is optimal from an equity perspective, however. The stream of benefits might be compromised somewhat by not focusing exclusively on one or the other, but a balanced approach should otherwise be in our collective best interests.

You may or may not agree. Here is a poll so you can register your vote. Please use the comments facility to share your wider views on this post.

 

 

 

 

Epilogue

 

We must beware the romance of the poor but bright,

But equally beware

The romance of rescuing the helpless

Poor from their sorry plight.

We must ensure the Tail

Does not wag the disadvantaged dog!

 

GP

May 2014

A Summer of Love for English Gifted Education? Episode 3: Improving Fair Access to Oxbridge

.

This post is a critical examination of policy and progress on improving progression for the highest attainers from disadvantaged backgrounds to selective universities, especially Oxford and Cambridge.

.

.

It:

  • Uncovers evidence of shaky statistical interpretation by these universities and their representative body;
  • Identifies problems with the current light-touch regulatory and monitoring apparatus, including shortcomings in the publication of data and reporting of progress at national level;
  • Proposes a series of additional steps to address this long-standing shortcoming of our education system.

.

Background

[Image: summer of love 1967 by 0 fairy 0]

Regular readers may recall that I have completed two parts of a trilogy of posts carrying the optimistic strapline ‘A Summer of Love for Gifted Education’.

The idea was to structure these posts around three key government publications.

  • This final part was supposed to analyse another DfE-commissioned research report, an ‘Investigation of school- and college- level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to pursue higher education’.

We know from the published contract (see attachment in ‘Documents’ section) that this latter study was undertaken by TNS/BMRB and the Institute for Policy Studies in Education (IPSE) based at London Metropolitan University. The final signed off report should have been produced by 28 June 2013 and published within 12 weeks of approval, so by the end of September. As I write, it has still to appear, which would suggest that there is a problem with the quality and/or size of the evidence base.

In the five months since the appearance of Part Two I have published a series of posts developing the themes explored in the first two-thirds of my incomplete trilogy.

But what to do about the missing final episode of ‘A Summer of Love’, which was going to develop this latter fair access theme in more detail?

My initial idea was to survey and synthesise the large number of other recently published studies on the topic. But, as I reviewed the content of these publications, it struck me that such a post would be stuffed full of descriptive detail but lack any real bite – by which I mean substantial and serious engagement with the central problem.

I decided to cut to the chase.

I also decided to foreground material about the highest reaches of A level attainment and progression to Oxbridge, not because I see the issue solely in these stratospheric terms, but because:

  • The top end of fair access is important in its own right, especially for those with a gifted education perspective. Oxford and Cambridge consistently declare themselves a special case and I wanted to explore the substance of their position.
  • There is compelling evidence that Oxford and Cambridge are amongst the weakest performers when it comes to fair access for the highest attaining disadvantaged learners. There are reasons why the task may be comparatively more difficult for them but, equally, as our most prestigious universities, they should be at the forefront when it comes to developing and implementing effective strategies to tackle the problem.
  • The Government has itself made Oxbridge performance a litmus test of progress (or lack of progress) on fair access and on higher education’s wider contribution to social mobility.

The first part of the post briefly reviews the range of measures and regulatory apparatus devoted to improving fair access. This is to provide a frame from which to explore the available data and its shortcomings, rather than an in-depth analysis of relative strengths and weaknesses. Readers who are familiar with this background may prefer to skip it.

The mid-section concentrates on the limited data in the public domain and how it has been (mis)interpreted.

The final section reviews the criticisms made by the Social Mobility and Child Poverty Commission (SMCPC) and, while endorsing them thoroughly, offers a set of further proposals – many of them data-driven – for ratcheting up our collective national efforts to remedy the unsatisfactory progress made to date.

.

A Troubling Tale of Unnecessary Complexity and Weak Regulation

.

A Proliferation of Measures

There is little doubt that we have a problem in England when it comes to progression to selective, competitive higher education (however defined) by learners from disadvantaged backgrounds (however defined).

We may not be unique in that respect, but that does not alter the fact that the problem is longstanding and largely unresolved.

The recent ‘State of the Nation 2013’ Report from the SMCPC says ‘there has been little change in the social profile of the most elite institutions for over a decade’, adding that ‘while some of the building blocks are in place to lift children off the bottom, opening up elites remains elusive.’

Part of the problem is that the debates about these respective definitions continue to receive disproportionate coverage. Such debates are sometimes deployed as a diversionary tactic, intentionally drawing us away from the unpalatable evidence that we are making decidedly poor headway in tackling the core issue.

The definitional complexities are such that they lend themselves to exploitation by those with a vested interest in preserving the status quo and defending themselves against what they regard as unwonted state intervention.

I shall resist the temptation to explore the comparative advantages and disadvantages of different measures, since that would risk falling into the trap I have just identified.

But I do need to introduce some of the more prominent – and pin down some subtle distinctions – if only for the benefit of readers in other countries.

One typically encounters four different categorisations of competitive, selective higher education here in the UK:

  • Oxbridge – a convenient shorthand reference to Oxford and Cambridge Universities. These two institutions are commonly understood to be qualitatively superior to other UK universities and, although that advantage does not apply universally, to every undergraduate course and subject, there is some academic support for treating them as a category in their own right.
  • Russell Group – The Russell Group was formed in 1994 and originally comprised 17 members. There are currently 24 members, 20 of them located in England, including Oxford and Cambridge. Four institutions – Durham, Exeter, Queen Mary’s and York – joined as recently as 2012 and membership is likely to increase as the parallel 1994 Group has just disbanded. DfE (as opposed to BIS) often uses Russell Group membership as its preferred proxy for selective, competitive higher education, although there are no objective criteria that apply exclusively to all members.
  • Sutton Trust 30 – The Sutton Trust originally identified a list of 13 universities, derived from ‘average newspaper league table rankings’. This list – Birmingham, Bristol, Cambridge, Durham, Edinburgh, Imperial, LSE, Nottingham, Oxford, St Andrews, UCL, Warwick and York – still appears occasionally in research commissioned by the Trust, although it was subsequently expanded to 30 institutions. In ‘Degrees of Success’, a July 2011 publication, they were described thus:

‘The Sutton Trust 30 grouping of highly selective universities comprises universities in Scotland, England and Wales with over 500 undergraduate entrants each year, where it was estimated that less than 10 per cent of places are attainable to pupils with 200 UCAS tariff points (equivalent to two D grades and a C grade at A-level) or less. These 30 universities also emerge as the 30 most selective according to the latest Times University Guide.’

The full list includes all but two of the Russell Group (Queen Mary’s and Queen’s Belfast) plus eight additional institutions.

  • Top third of HEIs – the categorisation used in DfE’s destination measures and adopted by BIS as its proxy for the most selective institutions, derived from the A level attainment of each institution’s entrants. The accompanying guidance notes that:

‘The HEIs included in this group change every year; although 94% of HEIs remained in the top third for 5 consecutive years, from 2006/07 to 2010/11. The calculation is restricted to the top three A level attainment; pupils who study other qualifications at Key Stage 5 will be excluded. Institutions with a considerable proportion of entrants who studied a combination of A levels and other qualifications may appear to have low scores. As the analysis covers students from schools and colleges in England, some institutions in other UK countries have scores based on small numbers of students. As this measure uses matched data, all figures should be treated as estimates.’

This categorisation includes seven further mainstream universities (Aston, City, Dundee, East Anglia, Goldsmiths, Loughborough, Sussex) plus a range of specialist institutions.

Indicators of educational disadvantage are legion, but these are amongst the most frequently encountered:

  • Eligibility for free school meals (FSM): DfE’s preferred measure. The term is misleading since the measure only includes learners who meet the FSM eligibility criteria and for whom a claim is made, so eligibility in itself is insufficient. Free school meals are available for learners in state-funded secondary schools, including those in sixth forms. From September 2014, eligibility will be extended to all in Years R, 1 and 2 and to disadvantaged learners in further education and sixth form colleges. The phased introduction of Universal Credit will also impact on the eligibility criteria (children of families receiving Universal Credit between April 2013 and March 2014 are eligible for FSM, but the cost of extending FSM to all Universal Credit recipients once fully rolled out is likely to be prohibitive). We do not yet know whether these reforms will cause DfE to select an alternative preferred measure and, if so, what that will be. Eligibility for the Pupil Premium is one option, more liberal than FSM, though this currently applies only up to age 16.
  • Residual Household Income below £16,000: This is broadly the income at which eligibility for free school meals becomes available. It is used by selective universities (Oxford included) because it can be applied universally, regardless of educational setting and whether or not free school meals have been claimed. Oxford explains that:

‘Residual income is based on gross household income (before tax and National Insurance) minus certain allowable deductions. These can include pension payments, which are eligible for certain specified tax relief, and allowances for other dependent children.’

The threshold is determined through the assessment conducted by Student Finance England, so is fully consistent with its guidance.

  • Low participation schools: This measure focuses on participation by school attended rather than where students live. It may be generic – perhaps derived from the Government’s experimental destinations statistics – or based on admissions records for a particular institution. As far as I can establish, there is no standard or recommended methodology: institutions decide for themselves the criteria they wish to apply.
  • POLAR (Participation Of Local Areas): HEFCE’s area-based classification of participation in higher education. Wards are categorised in five quintiles, with Quintile 1 denoting those with lowest participation. The current edition is POLAR 3.
  • Other geodemographic classifications: these include commercially developed systems such as ACORN and MOSAIC based on postcodes and Output Area Classification (OAC) based on census data. One might also include under this heading the Indices of Multiple Deprivation (IMD) and the associated sub-domain Income Deprivation Affecting Children Index (IDACI).
  • National Statistics Socio-Economic Classification (NS-SEC): an occupationally-based definition of socio-economic status applied via individuals to their households. There are typically eight classes:
  1. Higher managerial, administrative and professional
  2. Lower managerial, administrative and professional
  3. Intermediate
  4. Small employers and own account workers
  5. Lower supervisory and technical
  6. Semi-routine
  7. Routine
  8. Never worked and long-term unemployed

Data is often reported for NS-SEC 4-7.

Sitting alongside these measures of disadvantage is a slightly different animal – recruitment from state-funded schools and colleges compared with recruitment from the independent sector.

While this may be a useful social mobility indicator, it is a poor proxy for fair access.

Many learners attending independent schools are from comparatively disadvantaged backgrounds and, of course, substantially more learners at state-maintained schools are comparatively advantaged.

The Office For Fair Access (OFFA) confirms that:

‘in most circumstances we would not approve an access agreement allowing an institution to measure the diversity of its student body solely on the basis of the numbers of state school pupils it recruits….it is conceivable that a university could improve its proportion of state school students without recruiting greater proportions of students from disadvantaged groups.’

Nevertheless, independent/state balance continues to feature prominently in some access agreements drawn up by selective universities and approved by OFFA.

There is a risk that some institutions are permitted to give this indicator disproportionate attention, at the expense of their wider commitment to fair access.

 .

Securing National Improvement

Given the embarrassment of riches set out above, comparing progress between institutions is well-nigh impossible, let alone assessing the cumulative impact on fair access at national level.

When it came to determining their current strategy, the government of the day must have faced a choice between:

  • Imposing a standard set of measures on all institutions, ignoring complaints that those selected were inappropriate for some settings, particularly those that were somehow atypical;
  • Allowing institutions to choose their own measures, even though that would have a negative impact on the rate of improvement against the Government’s own preferred national indicators; and
  • A half-way house which insisted on universal adoption of one or two key measures while permitting institutions to choose from a menu of additional measures, so creating a basket more or less appropriate to their circumstances.

For reasons that are not entirely clear – but presumably owe something to vigorous lobbying from higher education interests – the weaker middle option was preferred and remains in place to this day.

The standard-setting and monitoring process is currently driven by OFFA, though we expect imminently the final version of a National Strategy for Access and Student Success, developed jointly with HEFCE.

A new joint process for overseeing OFFA’s access agreements (from 2015/16) and HEFCE’s widening participation strategic statements (from 2014-2017) will be introduced in early 2014.

There were tantalising suggestions that the status quo might be adjusted through work on the wider issue of evaluation.

An early letter referred to plans to:

‘Commission feasibility study to establish if possible to develop common evaluation measures that all institutions could adopt to assess the targeting and impact of their access and student success work’.

The report would be completed by Spring 2013.

Then an Interim Report on the Strategy said the study would be commissioned in ‘early 2013 to report in May 2013’ (Annex B).

It added:

‘Informal discussions with a range of institutional representatives have indicated that many institutions would welcome a much clearer indication of the kind of evidence and indicators that we would wish to see. Therefore a key strand within the strategy development will be work undertaken with the sector to develop an evaluation framework to guide them in their efforts to evidence the impact of their activity. Within this, we intend to test the feasibility of developing some common measures for the gathering of high-level evidence that might be aggregated to provide a national picture. We will also investigate what more can be done by national bodies including ourselves to make better use of national data sets in supporting institutions as they track the impact of their interventions on individual students.’

However, HEFCE’s webpage setting out research and stakeholder engagement in support of the National Strategy still says the study is ‘to be commissioned’ and that the publication date is ‘to be confirmed’.

I can find no explanation of the reasons for this delay.

For the time being, OFFA is solely responsible for issuing guidance to institutions on the content of their access agreements, approving the Agreements and monitoring progress against them.

OFFA’s website says:

‘Universities and colleges set their own targets based on where they need to improve and what their particular institution is trying to achieve under its access agreement…These targets must be agreed by OFFA. We require universities and colleges to set themselves at least one target around broadening their entrant pool. We also encourage (but do not require) them to set themselves further targets, particularly around their work on outreach and, where appropriate, retention. Most choose to do so. We normally expect universities and colleges to have a range of targets in order to measure their progress effectively. When considering whether targets are sufficiently ambitious, we consider whether they represent a balanced view of the institution’s performance, and whether they address areas where indicators suggest that the institution has furthest to go to improve access.

From 2012-13, in line with Ministerial guidance, we are placing a greater emphasis on progress against targets. We would not, however, impose a sanction solely on the basis of a university or college not meeting its targets or milestones.’

The interim report on a National Strategy suggests that – informally at least – many universities recognise that this degree of flexibility is not helpful to their prospects of improving fair access, either individually or collectively.

But the fact that the promised work has not been undertaken might imply a counterforce pushing in precisely the opposite direction.

The expectations placed on universities are further complicated by the rather unclear status of the annual performance indicators for widening participation of under-represented groups supplied by the Higher Education Statistics Agency (HESA).

HESA’s table for young full-time first degree entrants shows progress by each HEI against benchmarks for ‘from state schools or colleges’, ‘from NS-SEC classes 4, 5, 6 and 7’ and ‘from low participation neighbourhoods (based on POLAR3 methodology)’ respectively.

HESA describes its benchmarks thus:

‘Because there are such differences between institutions, the average values for the whole of the higher education sector are not necessarily helpful when comparing HEIs. A sector average has therefore been calculated which is then adjusted for each institution to take into account some of the factors which contribute to the differences between them. The factors allowed for are subject of study, qualifications on entry and age on entry (young or mature).’

HESA’s benchmarks are clearly influential in terms of the measures adopted in many access agreements (and much of the attention given to the state versus independent sector intake may be attributable to them).

On the other hand, the indicators receive rather cavalier treatment in the most recent access agreements from Oxford and Cambridge. Oxford applies the old POLAR2 methodology in place of the latest POLAR3, while Cambridge adjusts the POLAR3 benchmarks to reflect its own research.

The most recent 2011/12 HESA results for Oxford and Cambridge are as follows:

.

Oxford – State schools: benchmark 71.2%, performance 57.7%; NS-SEC 4-7: benchmark 15.9%, performance 11.0%; LPN (POLAR3): benchmark 4.7%, performance 3.1%

Cambridge – State schools: benchmark 71.4%, performance 57.9%; NS-SEC 4-7: benchmark 15.9%, performance 10.3%; LPN (POLAR3): benchmark 4.5%, performance 2.5%

.

That probably explains why Oxbridge would prefer an alternative approach! But the reference to further work in the Interim Strategy perhaps also suggests that few see these benchmarks as the best way forward.

.

National Targets

The Government also appears to be in something of a bind over its preferred measures for monitoring national progress.

When it comes to fair access (as opposed to widening participation) the Social Mobility Indicators rely exclusively on the gap between state and independent school participation at the most selective HEIs, as defined by BIS.

As noted above, this has major shortcomings as a measure of fair access, though more validity as a social mobility measure.

The relevant indicator shows that the gap held between 37% and 39% between 2006 and 2010, but this has just been updated to reflect an unfortunate increase to 40% in 2010/11.

BIS uses the same measure as a Departmental Performance Indicator for its work on higher education.  The attachment on the relevant gov.uk page is currently the wrong one – which might indicate that BIS is less than comfortable with its lack of progress against the measure.

DfE takes a different approach declaring an ‘Outcome of Education’ indicator:

‘Outcome of education:

i) Percentage of children on free school meals progressing to Oxford or Cambridge*

ii) Percentage of children on free school meals progressing to a Russell Group university*

iii) Percentage of children on free school meals progressing to any university*

iv) Participation in education and work based training at age 16 to 17

*Available June 2013’

But progress against this indicator is nowhere to be found in the relevant section of the DfE website or, as far as I can establish, anywhere within the DfE pages on gov.uk.

.

.

Oxbridge Access Agreement Targets for 2014/15

Perhaps the best way to link this section with the next is by showing how Oxford and Cambridge have decided to frame the targets in their access agreements for 2014/15.

Oxford has OFFA’s agreement to target:

  • Schools and colleges that secure limited progression to Oxford. They use ‘historic UCAS data’ to estimate that ‘in any one year up to 1,680…will have no students who achieve AAA grades but, over a three-year period they may produce a maximum of two AAA candidates’. They also prioritise an estimated 1,175 institutions which have larger numbers achieving AAA grades ‘but where the success rate for an application to Oxford is below 10%’. In 2010, 19.4% of Oxford admissions were from these two groups and it plans to increase the proportion to 25% by 2016-17;
  • UK undergraduates from disadvantaged socio-economic backgrounds, based on ‘ACORN postcodes 4 and 5’. Some 7.6% of admissions came from these postcodes in 2010/11 and Oxford proposes to reach 9.0% by 2016/17.
  • UK undergraduates from neighbourhoods with low participation in higher education, as revealed by POLAR2. It will focus on ‘students domiciled in POLAR quintiles 1 and 2’. In 2012, 10.6% of admissions were from this group and Oxford proposes to increase this to 13.0% by 2016-17.

In addition to a target for admitting disabled students, Oxford also says it will monitor and report on the state/independent school mix, despite evidence ‘that this measure is often misleading as an indicator of social diversity’. It notes that:

‘30% of 2012 entrants in receipt of the full Oxford Bursary (students with a household income of £16,000 or less) were educated in the independent sector…The University will continue to monitor the level of students from households with incomes of £16,000 or less. It is considered that these are the most financially disadvantaged in society, and it is below this threshold that some qualify for receipt of free schools meals, and the pupil premium. The University does not consider that identifying simply those students who have been in receipt of free school meals provides a suitably robust indicator of disadvantage as they are not available in every school or college with post-16 provision, nor does every eligible student choose to receive them.

There are no national statistics currently available on the number of students whose household income is £16,000 or less and who attain the required academic threshold to make a competitive application to Oxford. In 2011-12, around one in ten of the University’s UK undergraduate intake was admitted from a household with this level of declared income.’

Meanwhile, Cambridge proposes only two relevant targets, one of them focused on the independent/state divide:

  • Increase the proportion of UK resident students admitted from UK state sector schools and colleges to between 61% and 63%. This is underpinned by the University’s research finding that ‘the proportion of students nationally educated at state schools securing examination grades in subject combinations that reflect our entrance requirements and the achievement level of students admitted to Cambridge stands at around 62%’.
  • Increase the proportion of UK resident students from low participation neighbourhoods to approximately 4% by 2016. It argues:

‘Currently HESA performance indicators and other national datasets relating to socio-economic background do not take adequate account of the entry requirements of individual institutions. Whilst they take some account of attainment, they do not do so in sufficient detail for highly selective institutions such as Cambridge where the average candidate admitted has 2.5 A* grades with specific subject entry requirements. For the present we have adjusted our HESA low participation neighbourhood benchmark in line with the results of our research in relation to state school entry and will use this as our target.’

Each of these approaches has good and bad points. Cambridge’s is more susceptible to the criticism that it is overly narrow. There is no real basis to compare the relative performance of the two institutions since there is negligible overlap between their preferred indicators. That may be more comfortable for them, but it is not in the best interests of their customers, or of those seeking to improve their performance.

 

Investigating the Data on High Attainment and Fair Access to Oxbridge

Those seeking statistics about high attainment amongst disadvantaged young people and their subsequent progression to Oxbridge are bound to be disappointed.

There is no real appreciation of the excellence gap in this country and this looks set to continue. The fact that gaps between advantaged and disadvantaged learners are typically wider at the top end of the attainment distribution seems to have acted as a brake on the publication of data that proves the point.

It is possible that the current round of accountability reforms will alter this state of affairs, but this has not yet been confirmed.

For the time being at least, almost all published statistics about high A level attainment amongst disadvantaged learners have come via answers to Parliamentary Questions. This material invariably measures disadvantage in terms of FSM eligibility.

Information about the admission of disadvantaged learners to Oxbridge is equally scant, but a picture of sorts can be built up from a mixture of PQ replies, university admission statistics and the DfE’s destination measures. The material supplied by the universities draws on measures other than FSM.

The following two sections set out what little we know, including the ever important statistical caveats.

.

High Attainment Data

  • In 2003, 94 students (1.9%) eligible for FSM achieved three or more A grades at A level. The figures relate to 16-18 year-olds in maintained schools only who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008/09, 232 pupils at maintained mainstream schools eligible for FSM achieved three or more A grades at A level (including applied A level and double award), 179 of them attending comprehensive schools. The figures exclude students in FE and sixth form colleges previously eligible for FSM. (Parliamentary Question, 7 April 2010, Hansard (Col 1451W))
  • The numbers of Year 13 A level candidates eligible for FSM in Year 11 achieving 3 or more A grades at A level (including applied A levels and double award) were: 2006 – 377; 2007 – 433; 2008 – 432; 2009 – 509. These figures include students in both the schools and FE sectors. (Parliamentary Question, 27 July 2010, Hansard (Col 1223W))

 .

To summarise, the total number of students who were FSM-eligible at age 16 and went on to achieve three or more GCE A levels at Grade A*/A – including those in maintained schools, sixth form and FE colleges – has been increasing significantly since 2006.

2006 – 377; 2007 – 433; 2008 – 432; 2009 – 509; 2010 – not available; 2011 – 546

The overall increase between 2006 and 2011 is about 45%.
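
For those who want to see the working, here is a minimal check in Python, using only the figures in the series above:

```python
# FSM-eligible students achieving three or more top A level grades (schools and FE sectors combined)
counts = {2006: 377, 2007: 433, 2008: 432, 2009: 509, 2011: 546}  # 2010 not available

increase = (counts[2011] - counts[2006]) / counts[2006]
print(f"Increase 2006-2011: {increase:.1%}")  # prints 44.8%, i.e. 'about 45%'
```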

 .

Oxbridge Admission/Acceptance Data

  • The numbers of learners eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19 between 2005/06 and 2008/09 (rounded to the nearest five) were:

Oxford – 2005/06: 25; 2006/07: 20; 2007/08: 20; 2008/09: 25

Cambridge – 2005/06: 20; 2006/07: 25; 2007/08: 20; 2008/09: 20

TOTAL – 2005/06: 45; 2006/07: 45; 2007/08: 40; 2008/09: 45

Sources: Parliamentary Question, 13 December 2010, Hansard (Col 549W) and Parliamentary Question 21 February 2012, Hansard (Col 755W)

.

[Postscript (January 2014):

In January 2014, BIS answered a further PQ which provided equivalent figures for 2009/10 and 2010/11 – again rounded to the nearest five and derived from matching the National Pupil Database (NPD), HESA Student Record and the Individualised Learner Record (ILR) owned by the Skills Funding Agency.

The revised table is as follows:

Oxford – 2005/06: 25; 2006/07: 20; 2007/08: 20; 2008/09: 25; 2009/10: 15; 2010/11: 15

Cambridge – 2005/06: 20; 2006/07: 25; 2007/08: 20; 2008/09: 20; 2009/10: 25; 2010/11: 25

TOTAL – 2005/06: 45; 2006/07: 45; 2007/08: 40; 2008/09: 45; 2009/10: 40; 2010/11: 40

 

Sources:

Parliamentary Question, 13 December 2010, Hansard (Col 549W)

Parliamentary Question 21 February 2012, Hansard (Col 755W)

Parliamentary Question 7 January 2014, Hansard (Col 191W)

Although the 2010/11 total is marginally more positive than the comparable figure derived from the Destination Indicators (see below) this confirms negligible change overall during the last six years for which data is available.  The slight improvement at Cambridge during the last two years of the sequence is matched by a corresponding decline at Oxford, from what is already a desperately low base.]

.

  • DfE’s experimental destination measures for 2010/11 provide the following breakdown of students progressing to higher education, showing how many were eligible for and claiming FSM in Year 11 and the share of each intake they represent:

UK HEIs – 164,620 students in total, of whom 10,080 (6%) were FSM
Top third of HEIs – 49,030 students in total, of whom 2,000 (4%) were FSM
Russell Group – 28,620 students in total, of whom 920 (3%) were FSM
Oxbridge – 2,290 students in total, of whom 30 (1%) were FSM

.

.

These are experimental statistics and all figures – including the 30 at Oxbridge – are rounded to the nearest 10. The introductory commentary explains that:

‘This statistical first release (experimental statistics) on destination measures shows the percentage of students progressing to further learning in a school, further education or sixth-form college, apprenticeship, higher education institution or moving into employment or training.’

It adds that:

‘To be included in the measure, young people have to show sustained participation in an education or employment destination in all of the first 2 terms of the year after they completed KS4 or took A level or other level 3 qualifications. The first 2 terms are defined as October to March.’

The Technical Notes published alongside the data also reveal that: the measure includes only learners aged 16-18 who have entered at least one A level or an equivalent L3 qualification; the data collection process incorporates ‘an estimate of young people who have been accepted through the UCAS system for entry into the following academic year’ but ‘deferred acceptances are not reported as a distinct destination’; and FSM data for KS5 learners relates to those eligible for and claiming FSM in Year 11.

  • Cambridge’s 2012 intake ‘included 50+ students who had previously been in receipt of FSM’ (It is not stated whether all were eligible in Year 11, so it is most likely that this is the number of students who had received FSM at one time or another in their school careers.) This shows that Cambridge at least is collecting FSM data that it does not publish amongst its own admission statistics or use in its access agreement. (Cambridge University Statement, 26 September 2013)
  • In 2012, Cambridge had 418 applications from the most disadvantaged POLAR2 quintile (4.6% of all applicants) and, of those, 93 were accepted (3.6% of all acceptances), giving a 22.2% success rate. (Cambridge University Admission Statistics 2012 (page 23))

.

To summarise, the numbers of disadvantaged learners progressing to Oxbridge are very small; exceedingly so as far as those formerly receiving FSM are concerned.

Even allowing for methodological variations, the balance of evidence suggests that, at best, the numbers of FSM learners progressing to Oxbridge have remained broadly the same since 2005.

During that period, the concerted efforts of the system described above have had zero impact. The large sums invested in outreach and bursaries have made not one iota of difference.

This is true even though the number achieving the AAA A level benchmark has increased by about 45%. If Oxbridge admission were solely dependent on attainment, one would have expected a commensurate increase, to around 65 FSM entrants per year.

On the basis of the 2010/11 Destination Indicators, we can estimate that, whereas Oxbridge admits approximately 8% of all Russell Group students, it only admits slightly over 3% of Russell Group FSM students. If Oxbridge achieved the performance of its Russell Group peers, the numbers of formerly FSM admissions would be over 100 per year.
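
The arithmetic behind the ~65 estimate and the Oxbridge/Russell Group shares can be sketched as follows. This is illustrative only: the PQ figures are rounded to the nearest five and the destination-measures figures to the nearest ten, so the outputs are indicative.

```python
# FSM entrants to Oxbridge have hovered around 45 a year since 2005/06 (PQ data above)
baseline_fsm_entrants = 45
attainment_growth = 0.45        # ~45% rise in FSM students achieving the top A level grades, 2006-2011

# If admissions had tracked attainment, the expected annual intake would be roughly:
print(round(baseline_fsm_entrants * (1 + attainment_growth)))          # ~65

# Shares derived from the 2010/11 destination measures figures above
print(f"Oxbridge share of all Russell Group entrants: {2_290 / 28_620:.1%}")   # ~8%
print(f"Oxbridge share of Russell Group FSM entrants: {30 / 920:.1%}")         # ~3.3%
```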

.

Misleading Use of This Data

To add insult to injury, this data is frequently misinterpreted and misused. Here are some examples, all of which draw selectively on the data set out above.

  • Of the 80,000 FSM-eligible students in the UK only 176 received three As at A level…more than one quarter of those students….ended up at either Oxford or Cambridge – Nicholson (Oxford Undergraduate Admissions Director, Letter to Guardian, 7 March 2011)
  • ‘Of the 80,000 children eligible for free school meals in the UK in 2007, only 176 received 3 As at A level. Of those 45 (more than a quarter) got places at Oxford or Cambridge’ (Undated Parliamentary Briefing ‘Access and admissions to Oxford University’ )
  • ‘The root causes of underrepresentation of students from poorer backgrounds at leading universities include underachievement in schools and a lack of good advice on subject choices. For example, in 2009 only 232 students who had been on free school meals (FSM) achieved 3As at A-level or the equivalent.  This was 4.1% of the total number of FSM students taking A-levels, and less than an estimated 0.3% of all those who had received free school meals when aged 15.’ (Russell Group Press release, 23 July 2013).
  • ‘Such data as is available suggests that less than 200 students per year who are recorded as being eligible for FSM secure grades of AAA or better at A level. The typical entrance requirement for Cambridge is A*AA, and so on that basis the University admits in excess of one quarter of all FSM students who attain the grades that would make them eligible for entry.’ (Cambridge University Statement, 26 September 2013)
  • ‘According to data produced by the Department for Children, Schools and Families, of the 4,516 FSM students who secured a pass grade at A Level in 2008 only 160 secured the grades then required for entry to the University of Cambridge (ie AAA). Students who were eligible for FSM therefore make up less than 1% of the highest achieving students nationally each year.

Assuming that all 160 of these students applied to Oxford or Cambridge in equal numbers (ie 80 students per institution) and 22 were successful in securing places at Cambridge (in line with the 2006-08 average) then this would represent a success rate of 27.5% – higher than the average success rate for all students applying to the University (25.6% over the last three years). In reality of course not every AAA student chooses to apply to Oxford or Cambridge, for instance because neither university offers the course they want to study, e.g. Dentistry.’ (Cambridge Briefing, January 2011 repeated in Cambridge University Statement, 26 September 2013)

.

.

To summarise, Oxford, Cambridge and the Russell Group are all guilty of implying that FSM-eligible learners in the schools sector are the only FSM-eligible learners progressing to selective universities.

They persist in using the school sector figures even though combined figures for the school and FE sectors have been available since 2010.

Oxbridge’s own admission statistics show that, in 2012:

  • 9.6% of acceptances at Cambridge (332 students) were extended to students attending sixth form, FE and tertiary colleges (UK figures)
  • 10.5% of UK domiciled acceptances at Oxford (283 students) were extended to students attending sixth form colleges and FE institutions of all types

We can rework Cambridge’s calculation using the figure of 546 students with three or more A*/A grades in 2011:

  • assuming that all applied to Oxford and Cambridge in equal numbers gives a figure of 273 per institution
  • assuming a success rate of 25.6% – the average over the last three years
  • the number of FSM students that would have been admitted to Cambridge is roughly 70 (see the arithmetic sketch below).
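
A minimal sketch of that reworking, on the stated assumptions (equal application to each university and Cambridge’s 25.6% average success rate applying across the board):

```python
fsm_with_top_grades = 546        # 2011: FSM students with three or more A*/A grades, schools and FE combined
success_rate = 0.256             # Cambridge's average success rate over the last three years

applicants_per_institution = fsm_with_top_grades / 2   # assume equal application to Oxford and Cambridge
print(round(applicants_per_institution * success_rate))  # roughly 70
```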

Part of the reason why high-attaining disadvantaged students do not apply to Oxbridge may be that they want to study one of the relatively few mainstream subjects, such as dentistry, that are not available there.

But it is highly likely that other factors are at play, including the perception that Oxbridge is not doing all that it might to increase numbers of disadvantaged students from the state sector.

If this favourable trend in A level performance stalls, as a consequence of recent A level reforms, it will not be reasonable – in the light of the evidence presented above – for Oxbridge to argue that this is impacting negatively on the admission of FSM-eligible learners.

.

Building on the work of the SMCPC

 

‘Higher Education: The Fair Access Challenge’

There is no shortage of publications on fair access and related issues; several more have appeared in the last year alone.

Easily the most impressive has been the Social Mobility and Child Poverty Commission’s ‘Higher Education: The Fair Access Challenge’ (June 2013), though it does tend to rely a little too heavily on evidence of the imbalance between state and independent-educated students.

.

.

It examines the response of universities to recommendations first advanced in an earlier publication ‘University Challenge: How Higher Education Can Advance Social Mobility’ (2012) published by Alan Milburn, now Chair of the Commission, in his former role as Independent Reviewer on Social Mobility.

The analysis sets out key points from the earlier work:

  • Participation levels at the most selective universities by the least advantaged are unchanged since the mid-90s.
  • The most advantaged young people are seven times more likely to attend the most selective universities than the most disadvantaged.
  • The probability of a state secondary pupil eligible for FSM in Year 11 entering Oxbridge by 19 is almost 2000 to 1; for a privately educated pupil the probability is 20 to 1.

New research is presented to show that the intake of Russell Group universities has become less socially representative in the last few years:

  • The number of state school pupils entering Russell Group Universities has increased by an estimated 2.6% from 2002/03 to 2011/12, but the corresponding increase in privately educated entrants is 7.9%. The proportion of young full-time state-educated entrants has consequently fallen from 75.6% to 74.6% over this period. The worst performers on this measure are: Durham (-9.2%), Newcastle (-4.6%), Warwick (-4.5%) and Bristol (-3.9%). The best are: Edinburgh (+4.6%), UCL (+3.3%), LSE (+3.0%) and Southampton (+2.9%). The Oxbridge figures are: Cambridge (+0.3%) and Oxford (+2.3%).
  • Similarly, the proportion of young full-time entrants from NS-SEC classes 4-7 has fallen from 19.9% in 2002/03 to 19.0% in 2011/12. A table (reproduced below) shows that the worst offenders on this measure are Queen’s Belfast (-4.6%), Liverpool (-3.2%), Cardiff (-2.9%) and Queen Mary’s (-2.7%). Conversely, the best performers are Nottingham (+2.2%), York (+0.9%), Warwick and LSE (+0.8%). The figures for Oxbridge are: Cambridge (-1.0%) and Oxford (0.0%).

.

[Table not reproduced: change in the proportion of NS-SEC 4-7 entrants by Russell Group institution, 2002/03 to 2011/12]

  • An estimated 3,700 state-educated learners have the necessary grades for admission to Russell Group universities but do not take up places. This calculation is based on the fact that, if all of the 20 Russell Group universities in England achieved their HESA widening participation benchmarks, they would have recruited an extra 3,662 students from state schools. (The benchmarks show how socially representative each intake would be if it were representative of all entrants with the grades required for entry – though see Cambridge’s reservations on this point, above.) Some universities would need to increase significantly the percentage of state students recruited – for example, Bristol and Durham (26.9%), Oxford (23.4%) and Cambridge (23.3%).
  • Using the same methodology to calculate the shortfall per university in NS-SEC 4-7 students results in the table below, showing the worst offenders to require percentage increases of 54.4% (Cambridge), 48.5% (Bristol), 45.5% (Oxford) and 42.2% (Durham). Conversely, Queen Mary’s, Queen’s Belfast, LSE and King’s College are over-recruiting from this population on this measure.

.

[Table not reproduced: shortfall in NS-SEC 4-7 students by Russell Group institution]

  • Even if every Russell Group university met the self-imposed targets in its access agreement, the number of ‘missing’ state educated students would drop by only 25% by 2016/17, because the targets are insufficiently ambitious. (This is largely because only seven have provided such targets in their 2013/14 access agreements and there are, of course, no collective targets.)
  • Boliver’s research is cited to show that there is a gap in applications from state school pupils compared with those educated in the independent sector. But there is also evidence that a state school applicant needs, on average, one grade higher in their A levels (eg AAA rather than AAB) to be as likely to be admitted as an otherwise identical student from the independent sector.
  • A Financial Times analysis of 2011 applications to Oxford from those with very good GCSEs found that those from independent schools were 74% more likely to apply than those from the most disadvantaged state secondary schools. Amongst applicants, independently educated students were more than three times as likely to be admitted as their peers in disadvantaged state schools. They were also 20% more likely to be admitted than those at the 10% most advantaged state secondary schools. As shown by the table below, the probabilities involved varied considerably. The bottom line is that the total probability of a place at Oxford for an independent school student is 2.93%, whereas the comparable figure for a student at one of the 10% most disadvantaged state secondary schools is just 0.07%.

.

[Table not reproduced: probability of admission to Oxford by school background, from the Financial Times analysis]

When it comes to the causes of the fair access gap, subject to controls for prior attainment, the report itemises several contributory factors, noting the limited evidence available to establish their relative importance and interaction:

  • low aspirations among students, parents and teachers
  • less knowledge of the applications process, problems in demonstrating potential through the admissions process and a tendency to apply to the most over-subscribed courses
  • not choosing the right A-level subjects and teachers’ under-prediction of expected A level grades
  • a sense that selective universities ‘are socially exclusive and “not for the likes of them”’

The Report states unequivocally that:

‘The Social Mobility and Child Poverty Commission is deeply concerned about the lack of progress on fair access. The most selective universities need to be doing far more to ensure that they are recruiting from the widest possible pool of talent. The Commission will be looking for evidence of a step change in both intention and action in the years to come.’

It identifies several areas for further action, summarising universities’ responses to ‘University Challenge’:

  • Building links between universities and schools: The earlier report offered several recommendations, including that universities should have explicit objectives to help schools close attainment gaps. No evidence is given to suggest that such action is widespread, though many universities are strengthening their outreach activities and building stronger relationships with the schools sector. Several universities highlighted the difficulties inherent in co-ordinating their outreach activity given the demise of Aimhigher, but several retain involvement in a regional partnership.
  • Setting targets for fair access: The earlier report recommended that HE representative bodies should set statistical targets for progress on fair access over the next five years. This was not met positively:

‘Representative bodies in the Higher Education Sector did not feel this would be a useful step for them to take, saying that it was difficult to aggregate the different targets that individual institutions set themselves. There was also a feeling among some highly selective institutions that the report overestimated the number of students who have the potential to succeed at the most selective universities.’

Nevertheless, the Commission is insistent:

‘The Commission believes it is essential that the Russell Group signals its determination to make a real difference to outcomes by setting a clear collective statistical target for how much progress its members are aiming to make in closing the ‘fair access gap’. Not doing so risks a lack of sustained focus among the most selective universities’.

  • Using contextual admissions data: The report argues that ‘there is now a clear evidence base that supports the use of contextual data’. Recommendations from the earlier report were intended to universalise the use of contextual data, including commitment from the various representative bodies through a common statement of support and a collaborative guide to best practice. There is no sign of the former, although the Commission reports ‘widespread agreement that the use of contextual data during the admissions process should be mainstreamed’. However it notes that there is much more still to do. (The subsequent SPA publication should have helped to push forward this agenda.)
  • Reforming the National Scholarship Programme: The earlier report called on the Government to undertake a ‘strategic review of government funding for access’ to include the National Scholarship Programme (NSP). The suggestion that the imminent HEFCE/OFFA National Strategy should tackle the issue has been superseded by a Government decision to refocus the NSP on postgraduate education.
  • Postgraduate funding reform: The earlier report recommended work on a postgraduate loan scheme and further data collection to inform future decisions. The current report says that:

‘…the Government appears to have decided against commissioning an independent report looking at the issue of postgraduate access. This is very disappointing.’

and calls on it ‘to take heed’. However, this has again been superseded by the NSP announcement.

The SMCPC’s ‘State of the Nation 2013’ report reinforces its earlier publication, arguing that:

‘…despite progress, too much outreach work that aims to make access to university fairer and participation wider continues to rely on unproven methods or on work that is ad hoc, uncoordinated and duplicative… These are all issues that the higher education sector needs to address with greater intentionality if progress is to be made on breaking the link between social origin and university education.

The UK Government also needs to raise its game… much more needs to be done… to address the loss of coordination capacity in outreach work following the abolition of Aimhigher.’

It recommends that:

‘All Russell Group universities should agree five-year aims to close the fair access gap, all universities should adopt contextual admissions processes and evidence-based outreach programmes, and the Government should focus attention on increasing university applications from mature and part-time students.’

 .

What Else Might Be Done?

I set myself the challenge of drawing up a reform programme that would build on the SMCPC’s recommendations but would also foreground the key issues I have highlighted above, namely:

  • A significant improvement in the rate of progression for disadvantaged high-attaining learners to Oxbridge;
  • A more rigorous approach to defining, applying and monitoring improvement measures; and
  • The publication of more substantive and recent data

A determined administration that is prepared to take on the vested interests could do worse than pursue the following 10-point plan:

  • 1. Develop a new approach to specifying universities’ fair access targets for young full-time undergraduate students. This would require all institutions meeting the BIS ‘most selective HEI’ criteria to pursue two universal measures and no more than two measures of their own devising, so creating a basket of no more than four measures. Independent versus state representation could be addressed as one of the two additional measures.
  • 2. The universal measures should relate explicitly to students achieving a specified A level threshold that has currency at these most selective HEIs. It could be pitched at the equivalent of ABB at A level, for example. The measures should comprise:
    • A progression measure for all learners eligible for the Pupil Premium in Year 11 of their secondary education (so a broader measure than FSM eligibility); and
    • A progression measure for all learners – whether or not formerly eligible for the Pupil Premium – attending a state-funded sixth form or college with a relatively poor historical record of securing places for their learners at such HEIs. This measure would be nationally defined and standardised across all institutions other than Oxbridge.
  • 3. In the case of Oxford and Cambridge the relevant A level tariff would be set higher, say at the equivalent of AAA grades at A level, and the nationally defined  ‘relatively poor historical record’ would reflect only Oxbridge admission.
  • 4. These two universal measures would be imposed on institutions through the new National Strategy for Access and Student Success. All institutions would be required to set challenging but realistic annual targets. There would be substantial financial incentives for institutions achieving their targets and significant financial penalties for institutions that fail to achieve them.
  • 5. The two universal measures would be embedded in the national Social Mobility Indicators and the KPIs of BIS and DfE respectively.
  • 6. Central Government would publish annually data setting out:
    • The number and percentage of formerly Pupil Premium-eligible learners achieving the specified A level thresholds for selective universities and Oxbridge respectively.
    • A ‘league table’ of the schools and colleges with relatively poor progression to selective universities and Oxbridge respectively.
    • A ‘league table’ of the universities with relatively poor records of recruitment from these schools and colleges.
    • A time series showing the numbers of students and percentage of their intake drawn from these two populations by selective universities and Oxbridge respectively each year. This should cover both applications and admissions.
  • 7. All parties would agree new protocols for data sharing and transparency, including tracking learners through unique identifiers across the boundaries between school and post-16 and school/college and higher education, so ensuring that the timelag in the publication of this data is minimal.
  • 8. Universities defend fiercely their right to determine their own undergraduate admissions without interference from the centre, meaning that the business of driving national improvement is much more difficult than it should be. But, given the signal lack of progress at the top end of the attainment distribution, there are strong grounds for common agreement to override this autonomy in the special case of high-achieving disadvantaged students.  A new National Scholarship Scheme should be introduced to support learners formerly in receipt of the Pupil Premium who go on to achieve the Oxbridge A Level tariff:
    • Oxford and Cambridge should set aside 5% additional places per year (ie on top of their existing complement) reserved exclusively for such students. On the basis of 2012 admissions figures, this would amount to almost exactly 250 places for England divided approximately equally between the two institutions (the scheme could be for England only or UK-wide). This would provide sufficient places for approximately 45% of those FSM learners currently achieving 3+ A*/A grades.
    • All eligible students with predicted grades at or above the tariff would be eligible to apply for one of these scholarship places. Admission decisions would be for the relevant university except that – should the full allocation not be taken up by those deemed suitable for admission who go on to achieve the requisite grades – the balance would be made available to the next best applicants until the quota of places at each university is filled.
    • The Government would pay a premium fee set 50% above the going rate (so £4,500 per student per annum currently) for each National Scholarship student admitted to Oxbridge. However, the relevant University would be penalised the full fee plus the premium (so £13,500 per student per year) should the student fail to complete their undergraduate degree with a 2.2 or better. Penalties would be offset against the costs of running the scheme. Assuming fees remain unchanged and 100% of students graduate with a 2.2 or better, this would cost the Government £1.125m pa.
  • 9. In addition, the Government would support the establishment of a National Framework Programme covering Years 9-13, along the lines set out in my November 2010 post on this topic, with the explicit aim of increasing the number of Pupil Premium-eligible learners who achieve these tariffs. The budget could be drawn in broadly equal proportions from Pupil Premium/16-19 bursary, a matched topslice from universities’ outreach expenditure and a matched sum from the Government. If the programme supported 2,500 learners a year to the tune of £2,500 per year, the total steady state cost would be slightly over £30m, approximately £10m of which would be new money (though even this could be topsliced from the overall Pupil Premium budget). (The arithmetic behind the costings in points 8 and 9 is sketched below, after the plan.)
  • 10. The impact of this plan would be carefully monitored and evaluated, and adjusted as appropriate to maximise the likelihood of success. It would be a condition of funding that all selective universities would continue to comply with the plan.
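
As flagged above, here is a rough check of the cost arithmetic in points 8 and 9. It is a sketch only: it assumes the current standard fee is £9,000 (implied by the £4,500 premium being ‘50% above the going rate’) and that the 2,500 learners in point 9 refer to each year-group across Years 9-13, which is what the ‘slightly over £30m’ steady-state figure implies.

```python
# Point 8: National Scholarship Scheme premium payments
places = 250                        # ~5% additional places across Oxford and Cambridge combined
standard_fee = 9_000                # assumed 'going rate' implied by a 50% premium of £4,500
premium = standard_fee // 2         # £4,500 per student per year
print(places * premium)             # 1125000 -> £1.125m pa if every student graduates with a 2.2 or better
print(standard_fee + premium)       # 13500 -> penalty per student who fails to complete with a 2.2 or better

# Point 9: National Framework Programme, Years 9-13 (five year-groups in steady state)
learners_per_year_group = 2_500
unit_cost = 2_500                   # £ per learner per year
print(learners_per_year_group * unit_cost * 5)   # 31250000 -> 'slightly over £30m', roughly a third of it new money
```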

Do I honestly believe anything of this kind will ever happen?

.

[Image: flying pig]

.

GP

November 2013