Fair Access Trends in DfE’s Destinations Data 2010-13

This is a brief supplementary post about progression by FSM students to selective universities.

In preparing my last post, I had occasion to look again at DfE statistics on KS5 student destinations.


Destinations Data

These experimental statistics were first published in 2012 and most recently in January 2015. To date they cover four academic years, starting with AY2009/10 and ending with AY2012/13.

Underlying data is published each year and since AY2010/11 this has included the number of FSM students admitted to different categories of selective university: the ‘top third’, Russell Group and Oxbridge.

Allowing for a health warning about potential comparability issues (see Technical Notes below) I wanted to investigate how FSM admissions to these categories had changed over the three years in question.

The numbers are set out in this embedded spreadsheet.


On closer inspection they reveal some interesting information.

Graph 1, below, shows the percentage increase between AY2010/11 and AY2012/13 for FSM and non-FSM students in each category of selective higher education.

On the face of it, this is extremely good news for fair access, since the increase in FSM admissions significantly exceeds the increase in non-FSM admissions for all three categories of selective higher education.

The increase in FSM progression to Oxbridge is exactly in line with the increase at Russell Group universities.

The improvement at ‘top third’ HEIs is some 40 percentage points lower, but these institutions are almost 10 percentage points ahead of the rate of improvement for all HE.

Over the same period non-FSM progression to Russell Group universities has increased at almost twice the rate of non-FSM progression at Oxbridge, which is only slightly ahead of the 10% or so improvement at ‘top third’ institutions.

But non-FSM progression to all higher education has actually fallen slightly over the period.


Graph 1: Percentage increase in FSM and non-FSM students attending selective HE destinations between AY2010/11 and AY2012/13 (From DfE destination statistics, underlying data)


The similarity between the FSM increases for Oxbridge and Russell Group universities may help to substantiate the improvement for the former, despite the potentially drastic impact that rounding can have on such small totals (see Technical Notes below).

On the other hand, it should not be forgotten that this radical improvement was achieved in a single year, between AY2010/11 and AY2011/12.

In the following year there was no change at all for Oxbridge, with FSM admissions stalled on 50, whereas the improvement at Russell Group universities was much more consistent, increasing by some 22% compared with AY2011/12.

Further insights can be gleaned by looking at the figures in a different way.

Graph 2 shows the percentage of total admissions to the different categories of selective higher education accounted for by FSM students – and how these have changed by academic year.

This reveals a somewhat different picture. The FSM progression rate to Oxbridge remains some two percentage points behind the rate for progression to the Russell Group as a whole (although the gap closed temporarily in AY2011/12). Whereas there has been steady improvement across the Russell Group, the FSM share fell back at Oxbridge between AYs 2011/12 and 2012/13.

The overall improvement for all higher education has also been strong, particularly so between AYs 2011/12 and 2012/13. At ‘top third’ universities the FSM share fell back a little in 2011/12 but recovered strongly in 2012/13.


Graph 2: Percentage of admissions to Oxbridge, RG, Top third and all HEIs accounted for by FSM students, 2010/11 to 2012/13 (From DfE destination statistics, underlying data)


One might normally be wary of expressing changes in comparatively small percentages as percentages themselves, but since the UCAS End of Cycle Report (see below) includes such calculations, it seems equally justifiable in this context.

They reveal a substantial 24-point difference in the change in the FSM share of total admissions between 2012 and 2013, with Oxbridge recording -10% and the remainder of the Russell Group +14%.


This coincides with a change in the constitution of the Russell Group, as Durham, Exeter, Queen Mary and York joined in 2012. This might have had some small impact on share, but does not explain the 24-point gap.

A more tantalising question is the impact of the relaxation of student number controls for students with A level grades of AAB+ or equivalent, combined with a fall in the total number of applicants. Did these factors contribute to the improvement at Russell Group universities, or was the improvement achieved in spite of them?

UCAS End of Cycle Data

The destinations data provides a more differentiated view of FSM progression to selective universities than the oft-quoted UCAS End of Cycle Report 2014, which has a small section on this topic, based on matched NPD and UCAS admissions data.

FSM eligibility is determined when the student is aged 15, and the selective ‘high-tariff’ institutions appear to be identified on much the same basis as the ‘top third’. This ensures a degree of comparability with the Destinations statistics, although the UCAS data relates to the progression of 18 year-olds from state-funded schools only (so excludes colleges).

Furthermore there is no expectation of sustained participation (see technical notes below) and the ‘top third’ of universities has probably been calculated in a different year.

The UCAS analysis is confined exclusively to entry rates – the proportions of the total FSM and non-FSM 18 year-old populations progressing to high-, medium- and low-tariff universities respectively.

Graph 3, below, is derived from the data underpinning the Report. It shows progression to high-tariff universities for FSM and non-FSM students.


Graph 3: FSM and non-FSM entry rates to UCAS high-tariff universities, 2011-2014


This reveals that:

  • There were very small increases in entry rates between 2013 and 2014, for both FSM and non-FSM populations. (The Report notes that this is a 3.7% improvement for FSM and a 2.9% improvement for non-FSM.)
  • The ratio between non-FSM and FSM has also narrowed minimally, but the gap between them has widened minimally too (from 6.4 points to 6.5 points).
  • Since 2011, the FSM entry rate has increased by some 50% while the improvement in the non-FSM entry rate is nearer 25%. The ratio between the two rates has improved, but the gap between them has widened from 5.6 points to 6.5 points.
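
The final bullet describes a pattern worth spelling out: a ratio can narrow while the absolute gap widens, because the FSM base rate is so much smaller. A minimal sketch, using illustrative entry rates chosen only to match the changes described above (these are not the actual UCAS figures):

```python
# Hypothetical entry rates in percent, chosen to match the changes in the
# text (FSM up ~50%, non-FSM up ~25%, gap widening from 5.6 to 6.5 points).
fsm_2011, non_fsm_2011 = 2.0, 7.6
fsm_2014, non_fsm_2014 = 3.0, 9.5

gap_2011 = non_fsm_2011 - fsm_2011    # 5.6 percentage points
gap_2014 = non_fsm_2014 - fsm_2014    # 6.5 percentage points
ratio_2011 = non_fsm_2011 / fsm_2011  # 3.8
ratio_2014 = non_fsm_2014 / fsm_2014  # ~3.2

# The relative position improves even as the absolute gap widens.
assert ratio_2014 < ratio_2011 and gap_2014 > gap_2011
```

This is why the same entry rate data can support both an optimistic reading (the ratio) and a pessimistic one (the gap) at the same time.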

This is not the universally positive story for fair access suggested in media coverage and subsequent political commentary.


Oxbridge Data

Data published by Oxford and Cambridge, either in their access agreements or admissions statistics, show that progress over the three years in question has been inconsistent.

  • At Oxford the total number of applicants from Acorn 4 and 5 postcodes reached a peak of 1,246 in 2010/11, only to fall to 1,079 in 2011/12 and 1,070 in 2012/13. The percentage of all students admitted with Acorn 4 and 5 postcodes was 7.6% in 2010/11, but fell to 6.7% in 2011/12, increasing only slightly to 6.8% in 2012/13.
  • At Cambridge, 4.1% of home applicants in 2010/11 were from Polar 2 quintile 1 postcodes, with a success rate of 17.6%. There was an improvement in 2011/12, to 4.6% of applicants and a 22.6% success rate but, in 2012/13, the application share remained at 4.6% and the success rate fell back to 20.2%.

Unfortunately neither chooses to make public any data they might hold on annual admissions from FSM and non-FSM students.

Reasons cited in access agreements include the effects of the new student funding regime, a fall in the number of school leavers and the argument that an impact will only become apparent after sustained activity over a five year period. Oxford is however predicting significant improvement in AY2013/14 on the basis of its provisional data.

But one might reasonably expect these factors to have had a similar effect on other Russell Group universities. So how does one justify the disparity revealed by graph 2 above – between Oxbridge and the remainder of the Russell Group?


Possible reasons for the disparity between Oxbridge and other Russell Group universities

The explanation most often supplied by Oxbridge is that very few FSM-eligible students manage the exceptionally high attainment required for admission.

Admissions statistics from the two universities show that, in 2012/13:

  • At Oxford 37.1% of students accepted had A*A*A*, 27.2% had A*A*A, 24% had A*AA and 9.4% had AAA (best three A levels).
  • At Cambridge, 59.5% of applicants achieving a UCAS tariff equivalent to A*A*A* were accepted, as were 23.6% of those with A*A*A and 13.9% of those with A*AA.

Data on FSM achievement at the highest A level grades (or equivalent) is particularly hard to come by. I have previously drawn on answers to various Parliamentary Questions that show an increase of some 45% in FSM students achieving AAA or better at A level between 2006 and 2011.

The most recent of these (Col 35W) was answered in July 2012. It says that, of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, there were 546 who achieved 3 or more GCE A levels at A*-A. This includes students in both the school and FE sectors. By comparison, there were 22,353 non-FSM students achieving the same feat.

If we look at the ratio between achievement at this level and admission to Oxbridge in the same year:

  • 546 FSM students corresponded with 30 places secured (ratio 18:1)
  • 22,353 non-FSM students corresponded with 2,260 places secured (ratio 10:1)
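
The two ratios follow directly from the figures quoted above and can be checked quickly:

```python
# Figures quoted in the text: students achieving 3+ A levels at A*-A in
# 2010/11, set against Oxbridge places secured in the same year.
fsm_achievers, fsm_places = 546, 30
other_achievers, other_places = 22_353, 2_260

fsm_ratio = fsm_achievers / fsm_places        # ~18.2, roughly 18:1
other_ratio = other_achievers / other_places  # ~9.9, roughly 10:1
```

On these figures, an FSM student with top grades converted to an Oxbridge place at roughly half the rate of a non-FSM peer.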

So what exactly is happening? There are several possible further reasons for FSM under-representation:

  • Too few FSM students are gaining A* grades (or equivalent), as opposed to A grades, at A level.
  • Too few FSM students are gaining the necessary grades in suitable subject combinations and/or in facilitating subjects. (There has been some suggestion recently that subject choice is an issue, though this study adopts a broader definition of disadvantage and does not apply specifically to Oxbridge admission.)
  • When Oxbridge chooses FSM students pre A-level, their GCSE/AS level performance does not reflect their eventual A level performance.
  • Too few of the highest attaining FSM students are applying to Oxbridge, quite possibly for a variety of different reasons.
  • Too many FSM applicants to Oxbridge are seeking entry to the most competitive courses; too few to those where there are fewer applicants per place. (At Oxford in 2012/13, for example, the success rate for medicine was 10% while for classics it was 42%.)
  • FSM students do apply in proportion, but are relatively less successful at gaining admission for reasons other than (predicted) attainment. One reason might be that neither University specifically targets FSM students through its access strategy, preferring alternative indicators of disadvantage.

Unfortunately, there is very little data available publicly to test which of these hypotheses are correct, their relative impact and how they operate in combination.

As attention switches to the pupil premium measure, one wonders whether the next government will ensure that reliable data can be made available to selective universities and, through Offa, expect them to feature this in their access targets, as well as their policies for contextualised admissions.


Technical Notes

There is a timelag associated with the HESA dataset, which has to be matched with the National Pupil Database. For example, the January 2015 publication matches data on students in KS5 taking A level and equivalent qualifications in AY2011/12 and on those in HE in AY2012/13.

The most recent publication appeared in January 2015. Since HESA collects data at the end of each academic year the lag was approximately 18 months.

The next publication, relating to academic year 2013/14, is not scheduled for release until October/November 2015, indicating a lag of 15/16 months.

According to the Technical Note linked to the most recent SFR, KS5 students are included if they:

  • Entered for at least one A level or equivalent level 3 qualification similar in size to an A level.
  • Attend state-funded mainstream schools, independent schools, FE and sixth form colleges and maintained, non-maintained and independent special schools. (However, it seems that only a few independent schools – those that provide tracking information to local authorities – are included.)

Students must record sustained participation – in all of the first two terms of the year – at one or more HE destinations. In 2012/13 this was defined as between October 2012 and March 2013.

Higher education is defined as any UK HE institution, so those admitted to institutions abroad are excluded. Students undertaking HE courses at FE institutions are included. The note is not quite clear about the treatment of students accepted for deferred entry.

The categories of selective HE are nested within each other:

  • The top third of HEIs when grouped by mean UCAS tariff score from entrants’ best three A level grades. KS5 students with other qualifications, or with no A level points, are excluded from the calculation. The ‘top third’ methodology is preferred by BIS. The constitution of the group changes annually, though 88% of institutions were within scope for six consecutive years up to 2011/12. (The 2011/12 list is used on this occasion.)
  • The Russell Group (Birmingham, Bristol, Cambridge, Cardiff, Durham, Edinburgh, Exeter, Glasgow, Imperial, KCL, Leeds, Liverpool, LSE, Manchester, Newcastle, Nottingham, Oxford, Queen Mary, Queen’s Belfast, Sheffield, Southampton, UCL, Warwick and York).
  • Oxbridge (Oxford and Cambridge)

Eligibility for free school meals (FSM) means students eligible for and claiming FSM in Year 11. Pupil premium was not introduced until September 2011, when these students were already beyond Year 11.

All national figures are rounded to the nearest 10, which makes small totals particularly unreliable. (For example, a published 40 + 10 could represent anything from 35 + 5 to 44 + 14, so anywhere between 40 and 58.)
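
The arithmetic of that rounding example can be made explicit. Assuming conventional rounding to the nearest 10 (so a published 40 covers true values from 35 to 44):

```python
# Bounds on a sum of counts, each independently rounded to the nearest 10.
# Assumes a published value r covers true values from r - 5 to r + 4.
def bounds(rounded_value):
    return rounded_value - 5, rounded_value + 4

school_lo, school_hi = bounds(40)    # 35, 44
college_lo, college_hi = bounds(10)  # 5, 14

total_lo = school_lo + college_lo    # 40
total_hi = school_hi + college_hi    # 58
```

So a published total of 50 is consistent with any true total from 40 to 58, which is why small year-on-year movements in these figures cannot be read as real change.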

The technical note advises that:

‘Some of the differences across years may be attributable to the tightening of methodology or the improvements in data matching, so comparisons across years must be treated with caution.’


GP

March 2015

Addressed to Teach First and its Fair Education Alliance


This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

Betweentimes the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version, which they said would be published on 27 February. It never appeared.

A Tweet prompted an undertaking that it would appear on 27 March. I’ll believe that when I see it.

But there’s no reason why you should wait any longer. This version is more comprehensive anyway, in that it includes several relevant Twitter comments and additional explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

Teach First committed via Twitter to an online response on 14 February. Still waiting…


How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied by BIS say that the total fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE’s ‘FSM to Oxbridge’ impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.


[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).


The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]


The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted about this.

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are “currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals”.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.


Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.


Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.


Perhaps it should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome.
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.


*In the Report Card, this impact goal is expressed differently, as narrowing the gap in university graduation so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.


GP

January 2015

How Well Do Grammar Schools Perform With Disadvantaged Students?

This supplement to my previous post on The Politics of Selection compares the performance of disadvantaged learners in different grammar schools.

It adds a further dimension to the evidence base set out in my earlier post, intended to inform debate about the potential value of grammar schools as engines of social mobility.

The commentary is based on the spreadsheet embedded below, which relies entirely on data drawn from the 2013 Secondary School Performance Tables.


If you find any transcription errors please alert me and I will correct them.


Preliminary Notes

The 2013 Performance Tables define disadvantaged learners as those eligible for free school meals in the last six years and children in care. Hence both these categories are caught by the figures in my spreadsheet.

Because the number of disadvantaged pupils attending grammar schools is typically very low, I have used the three year average figures contained in the ‘Closing the Gap’ section of the Tables.

These figures therefore cover the disadvantaged students in each school’s end of KS4 cohorts for 2011, 2012 and 2013 combined. They should illustrate the impact of pupil premium support and wider closing the gap strategies on grammar schools since the Coalition government came to power.

Even when using three year averages the data is frustratingly incomplete, since 13 of the 163 grammar schools have so few disadvantaged students – fewer than six across all three cohorts combined – that the results are suppressed. We have no information at all about how well or how badly these schools are performing in terms of closing gaps.

My analysis uses each of the three performance measures within this section of the Performance Tables:

  • The percentage of pupils at the end of KS4 achieving five or more GCSEs (or equivalents) at grades A*-C, including GCSEs in English and maths. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in English. 
  • The proportion of pupils who, by the end of KS4, have made at least the expected progress in maths.

In each case I have recorded the percentage of disadvantaged learners who achieve the measure and the percentage point gap between that and the corresponding figure for ‘other’ – ie non-disadvantaged – students.

For comparison I have also included the corresponding percentages for all disadvantaged pupils in all state-funded schools and for all high attainers in state-funded schools. The latter is for 2013 only rather than a three-year average.

Unfortunately the Tables do not provide data for high attaining disadvantaged students. The vast majority of disadvantaged students attending grammar schools will be high-attaining according to the definition used in the Tables (average points score of 30 or higher across KS2 English, maths and science).

But, as my previous post showed, some grammar schools record 70% or fewer high attainers, disadvantaged or otherwise. These include: Clarendon House (Kent, now merged), Fort Pitt (Medway), Skegness (Lincolnshire), Dover Boys’ and Girls’ (Kent), Folkestone Girls’ (Kent), St Joseph’s (Stoke), Boston High (Lincolnshire) and the Harvey School (Kent).

Some of these schools feature in the analysis below, while some do not, suggesting that the correlation between selectivity and the performance of disadvantaged students is not straightforward.


Number of disadvantaged learners in each school

The following schools are those with suppressed results, placed in order according to the number of disadvantaged learners within scope, from lowest to highest:

  • Tonbridge Grammar School, Kent (2)
  • Bishop Wordsworth’s Grammar School, Wiltshire (3)
  • Caistor Grammar School, Lincolnshire (3)
  • Sir William Borlase’s Grammar School, Buckinghamshire (3)
  • Adams’ Grammar School, Telford and Wrekin (4)
  • Chelmsford County High School for Girls, Essex (4)
  • Dr Challoner’s High School, Buckinghamshire (4)
  • King Edward VI School, Warwickshire (4)
  • Alcester Grammar School, Warwickshire (5)
  • Beaconsfield High School, Buckinghamshire (5)
  • King Edward VI Grammar School, Chelmsford, Essex (5)
  • Reading School, Reading (5)
  • St Bernard’s Catholic Grammar School, Slough (5).

Some of these schools feature among those with the lowest proportions of ‘ever 6 FSM’ pupils on roll, as shown in the spreadsheet accompanying my previous post, but some do not.

The remaining 150 schools each record a combined cohort of between six and 96 students, with an average of 22.

A further 19 schools have a combined cohort of 10 or fewer, meaning that 32 grammar schools in all (20% of the total) are in this category.

At the other end of the distribution, only 16 schools (10% of all grammar schools) have a combined cohort of 40 or more disadvantaged students – and only four have 50 or more.

These are:

  • Handsworth Grammar School, Birmingham (96)
  • Stretford Grammar School, Trafford (76)
  • Dane Court Grammar School, Kent (57)
  • Slough Grammar School (Upton Court) (50).

Because the ratio of disadvantaged to other pupils in the large majority of grammar schools is so marked, the results below must be treated with a significant degree of caution.

Outcomes based on such small numbers may well be misleading, but they are all we have.

Arguably, grammar schools should find it relatively easy to achieve success with a very small cohort of students eligible for the pupil premium – since fewer require separate monitoring and, potentially, additional support.

On the other hand, the comparative rarity of disadvantaged students may mean that some grammar schools have too little experience of addressing such needs, or believe that closing gaps is simply not an issue for them.

Then again, it is perhaps more likely that grammar schools will fall short of 100% success with their much larger proportions of ‘other’ students, simply because the probability of special circumstances arising is relatively higher. One might expect therefore to see ‘positive gaps’ with success rates for disadvantaged students slightly higher than those for their relatively more advantaged peers.

Ideally though, grammar schools should be aiming for a perfect 100% success rate for all students on these three measures, regardless of whether they are advantaged or disadvantaged. None is particularly challenging, for high attainers in particular – and most of these schools have been rated as outstanding by Ofsted.


Five or more GCSE A*-C grades or equivalent including GCSEs in English and maths

In all state-funded schools, the percentage of disadvantaged students achieving this measure across the three year period is 38.7% while the percentage of other students doing so is 66.3%, giving a gap of 27.6 percentage points.

In 2013, 94.7% of all high attainers in state-funded secondary schools achieved this measure.

No grammar school falls below the 38.7% benchmark for its disadvantaged learners. The nearest to it is Pate’s Grammar School, at 43%. But these results were affected by the School’s decision to sit English examinations which were not recognised for Performance Table purposes.

The next lowest percentages are returned by:

  • Spalding Grammar School, Lincolnshire (59%)
  • Simon Langton Grammar School for Boys, Kent (65%)
  • Stratford Grammar School for Girls, Warwickshire (71%)
  • The Boston Grammar School, Lincolnshire (74%)

These were the only four schools below 75%.

Table 1 below illustrates these percentages and the percentage point gap for each of these four schools.


Table 1: 5+ GCSEs at A*-C or equivalent including GCSEs in English and maths: Lowest performing and largest gaps


A total of 46 grammar schools (31% of the 150 without suppressed results) fall below the 2013 figure for high attainers across all state-funded schools.

On the other hand, 75 grammar schools (exactly 50%) achieve 100% on this measure, for combined student cohorts ranging in size from six to 49.

Twenty-six of the 28 schools that had no gap between the performance of their advantaged and disadvantaged students were amongst those scoring 100%. (The other two were at 97% and 95% respectively.)

The remaining 49 with a 100% record amongst their disadvantaged students demonstrate a ‘positive gap’, in that the disadvantaged do better than the advantaged.

The biggest positive gap is seven percentage points, recorded by Clarendon House Grammar School in Kent and Queen Elizabeth’s Grammar School in Alford, Lincolnshire.

Naturally enough, schools recording relatively lower success rates amongst their disadvantaged students also tend to demonstrate a negative gap, where the advantaged do better than the disadvantaged.

Three schools had an achievement gap higher than the 27.6 percentage point national average. They were:

  • Simon Langton Grammar School for Boys (30 percentage points)
  • Spalding Grammar School (28 percentage points)
  • Stratford Grammar School for Girls (28 percentage points)

So three of the four with the lowest success rates for disadvantaged learners also demonstrated the biggest gaps. Twelve more schools had double-digit achievement gaps of 10 percentage points or higher.

These 15 schools – 10% of the total for which we have data – have a significant issue to address, regardless of the size of their disadvantaged populations.

One noticeable oddity at this end of the table is King Edward VI Camp Hill School for Boys in Birmingham, which returns a positive gap of 14 percentage points (rounded): with 80% for disadvantaged and 67% for advantaged. On this measure at least, it is doing relatively badly with its disadvantaged students, but considerably worse with those from advantaged backgrounds!

However, this idiosyncratic pattern is also likely to be attributable to the School using some examinations not eligible for inclusion in the Tables.

.

At least expected progress in English

Across all state-funded schools, the percentage of disadvantaged students making at least three levels of progress in English is 55.5%, compared with 75.1% of ‘other’ students, giving a gap of 19.6 percentage points.

In 2013, 86.2% of high attainers achieved this benchmark.

If we again discount Pate’s from consideration, the lowest performing school on this measure is The Boston Grammar School which is at 53%, lower than the national average figure.

A further 43 schools (29% of those for which we have data) are below the 2013 average for all high attainers. Six of these fall below 70%:

  • The Skegness Grammar School, Lincolnshire (62%)
  • Queen Elizabeth Grammar School, Cumbria (62%)
  • Plymouth High School for Girls (64%)
  • Spalding Grammar School, Lincolnshire (65%)
  • Devonport High School for Boys, Plymouth (65%)
  • Simon Langton Grammar School for Boys, Kent (67%)

Table 2 below illustrates these outcomes, together with the attainment gaps recorded by these schools and others with particularly large gaps.

.

Table 2

Table 2: At least expected progress in English from KS2 to KS4: Lowest performing and largest gaps

.

At the other end of the table, 44 grammar schools achieve 100% on this measure (29% of those for which we have data). This is significantly fewer than achieved perfection on the five or more GCSEs benchmark.

When it comes to closing the gap, only 16 of the 44 achieve a perfect 100% score with both advantaged and disadvantaged students, again much lower than on the attainment measure above.

The largest positive gaps (where disadvantaged students outscore their advantaged classmates) are at The King Edward VI Grammar School, Louth, Lincolnshire (11 percentage points) and John Hampden Grammar School, Buckinghamshire (10 percentage points).

Amongst the schools propping up the table on this measure, six record negative gaps of 20 percentage points or higher, so exceeding the average gap in state-funded secondary schools:

  • The Skegness Grammar School (30 percentage points)
  • Queen Elizabeth Grammar School Cumbria (28 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)
  • Plymouth High School for Girls (25 percentage points)
  • Devonport High School for Boys, Plymouth (23 percentage points)
  • Loreto Grammar School, Trafford (20 percentage points).

There is again a strong correlation between low disadvantaged performance and large gaps, although the relationship does not apply in all cases.

Another 23 grammar schools have a negative gap of 10 percentage points or higher.

King Edward VI Camp Hill in Birmingham again shows a curious pattern: its disadvantaged students come in at 75% on this measure, outscoring the advantaged, who are ten percentage points lower at 65%. As noted above, there may well be extenuating circumstances.

.

At least expected progress in maths

The percentage of disadvantaged students making at least three levels of progress in maths across all state-funded schools is 50.7%, compared with a figure for ‘other’ students of 74.1%, giving a gap of 23.4 percentage points.

In 2013, 87.8% of high attainers achieved this.

On this occasion Pate’s is unaffected (in fact it scores 100%), as is King Edward VI Camp Hill School for Boys (in its case for advantaged and disadvantaged alike).

No school comes in below the national average for disadvantaged students; in fact, all comfortably exceed it. However, the lowest performers are still a long way behind some of their fellow grammar schools.

The worst performing grammar schools on this measure are:

  • Spalding Grammar School, Lincolnshire (59%)
  • Queen Elizabeth Grammar School Cumbria (62%)
  • Simon Langton Grammar School for Boys, Kent (63%)
  • Dover Grammar School for Boys, Kent (67%)
  • The Boston Grammar School, Lincolnshire (68%)
  • Borden Grammar School, Kent (68%)

These are very similar to the corresponding rates for the lowest performers in English.

Table 3 illustrates these outcomes, together with other schools demonstrating very large gaps between advantaged and disadvantaged students.

.

Table 3

Table 3: At least expected progress in maths from KS2 to KS4: Lowest performing and largest gaps

.

A total of 32 schools (21% of those for which we have data) undershoot the 2013 average for high attainers, a slightly better outcome than for English.

At the other extreme, there are 54 schools (36% of those for which we have data) that score 100% on this measure, slightly more than do so on the comparable measure for English, but still significantly fewer than achieve this on the 5+ GCSE measure.

Seventeen of the 54 also achieve a perfect 100% for advantaged students.

The largest positive gaps recorded are 11 percentage points at The Harvey Grammar School in Kent (which achieved 94% for disadvantaged students) and 7 percentage points at Queen Elizabeth’s Grammar School, Alford, Lincolnshire (91% for disadvantaged students).

The largest negative gaps on this measure are just as substantial as those relating to English. Four schools perform significantly worse than the average gap of 23.4 percentage points:

  • Spalding Grammar School, Lincolnshire (32 percentage points)
  • Queen Elizabeth Grammar School, Cumbria (31 percentage points)
  • Simon Langton Grammar School for Boys, Kent (31 percentage points)
  • Stratford Grammar School for Girls (27 percentage points)

Queen Elizabeth’s and Stratford Girls’ also appeared in the corresponding list for English, while Spalding, Simon Langton and Stratford Girls’ all appeared in the corresponding list for the 5+ GCSE measure.

A further 20 schools have a double-digit negative gap of 10 percentage points or higher, very similar to the outcome in English.

.

Comparison across the three measures

As will be evident from the tables and lists above, some grammar schools perform consistently poorly on all three measures.

Others perform consistently well, while a third group have ‘spiky profiles’.

The number of schools that achieve 100% on all three measures with their disadvantaged students is 25 (17% of those for which we have data).

Eight of these are located in London; none is located in Birmingham. Just two are in Buckinghamshire and there is one each in Gloucestershire, Kent and Lincolnshire.

Only six schools achieve 100% on all three measures with advantaged and disadvantaged students alike. They are:

  • Queen Elizabeth’s, Barnet
  • Colyton Grammar School, Devon
  • Nonsuch High School for Girls, Sutton
  • St Olave’s and St Saviour’s Grammar School, Bromley
  • Tiffin Girls’ School, Kingston
  • Kendrick School, Reading

Five schools recorded comparatively low performance across all three measures (ie 80% or lower on each):

  • Spalding Grammar School, Lincolnshire
  • Simon Langton Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • St Joseph’s College, Stoke on Trent

Their overall performance is illustrated in Table 4.

.

Table 4

Table 4: Schools where 80% or fewer disadvantaged learners achieved each measure

.

This small group of schools is a major cause for concern.

A total of 16 schools (11% of those for which we have data) score 90% or less on all three measures and they, too, are potentially concerning.

Schools which record negative gaps of 10 percentage points or more on all three measures are:

  • Simon Langton Grammar School for Boys, Kent
  • Dover Grammar School for Boys, Kent
  • The Boston Grammar School, Lincolnshire
  • Stratford Grammar School for Girls
  • Wilmington Grammar School for Boys, Kent
  • St Joseph’s College, Stoke-on-Trent
  • Queen Elizabeth’s Grammar School, Horncastle, Lincolnshire

Table 5 records these outcomes.

.

Table 5

Table 5: Schools with gaps of 10 percentage points or higher on all three measures

.

Of these, Boston and Stratford have gaps of 20 percentage points or higher on all three measures.

A total of 32 grammar schools (21% of those for which we have data) record 80% or lower on at least one of the three measures.

.

Selective University Destinations

I had also wanted to include in the analysis some data on progression to selective (Russell Group) universities, drawn from the experimental destination statistics.

Unfortunately, the results for FSM students are suppressed for the vast majority of schools, making comparison impossible. According to the underlying data for 2011/12, all I can establish with any certainty is that:

  • In 29 grammar schools, there were no FSM students in the cohort.
  • Five schools returned 0%, meaning that no FSM students successfully progressed to a Russell Group university. These were Wycombe High School, Wallington High School for Girls, The Crossley Heath School in Calderdale, St Anselm’s College on the Wirral and Bacup and Rawtenstall Grammar School.
  • Three schools were relatively successful – King Edward VI Five Ways in Birmingham reported 58% of FSM students progressing, while King Edward VI Handsworth reported 53% and the Latymer School achieved an impressive 75%.
  • All remaining grammar schools – some 127 in that year – are reported as ‘x’, meaning that there were either one or two students in the cohort, so the percentages are suppressed.

We can infer from this that, at least in 2011/12, very few grammar schools were providing an effective route to Russell Group universities for their FSM students.

.

Conclusion

Even allowing for the unreliability of statistics based on very small cohorts, this analysis is robust enough to show that the performance of grammar schools in supporting disadvantaged students is extremely disparate.

While there is a relatively large group of consistently high performers, roughly one in five grammar schools is a cause for concern on at least one of the three measures. Approximately one in ten is performing no more than satisfactorily across all three. 

The analysis hints at the possibility that the biggest problems tend to be located in rural and coastal areas rather than in London and other urban centres, but this pattern is not always consistent. The majority of the poorest performers seem to be located in wholly selective authorities but, again, this is not always the case.

A handful of grammar schools are recording significant negative gaps between the performance of disadvantaged students and their peers. This is troubling. There is no obvious correlation between the size of the disadvantaged cohort and the level of underperformance.

There may be extenuating circumstances in some cases, but there is no public national record of what these are – an argument for greater transparency across the board.

One hopes that the grammar schools that are struggling in this respect are also those at the forefront of the reform programme described in my previous post – and that they are improving rapidly.

One hopes, too, that those whose business it is to ensure that schools make effective use of the pupil premium are monitoring these institutions closely. Some of the evidence highlighted above would not, in my view, be consistent with an outstanding Ofsted inspection outcome.

If the same pattern is evident when the 2014 Performance Tables are published in January 2015, there will be serious cause for concern.

As for the question whether grammar schools are currently meeting the needs of their – typically few – disadvantaged students, the answer is ‘some are; some aren’t’. This argues for intervention in inverse proportion to success.

.

GP

December 2014

A Summer of Love for English Gifted Education? Episode 3: Improving Fair Access to Oxbridge

.

This post is a critical examination of policy and progress on improving progression for the highest attainers from disadvantaged backgrounds to selective universities, especially Oxford and Cambridge.

.

.

It:

  • Uncovers evidence of shaky statistical interpretation by these universities and their representative body;
  • Identifies problems with the current light-touch regulatory and monitoring apparatus, including shortcomings in the publication of data and reporting of progress at national level;
  • Proposes a series of additional steps to address this long-standing shortcoming of our education system.

.

Background

summer of love 1967 by 0 fairy 0

Regular readers may recall that I have completed two parts of a trilogy of posts carrying the optimistic strapline ‘A Summer of Love for Gifted Education’.

The idea was to structure these posts around three key government publications.

  • This final part was supposed to analyse another DfE-commissioned research report, an ‘Investigation of school- and college- level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to pursue higher education’.

We know from the published contract (see attachment in ‘Documents’ section) that this latter study was undertaken by TNS/BMRB and the Institute for Policy Studies in Education (IPSE) based at London Metropolitan University. The final signed off report should have been produced by 28 June 2013 and published within 12 weeks of approval, so by the end of September. As I write, it has still to appear, which would suggest that there is a problem with the quality and/or size of the evidence base.

In the five months since the appearance of Part Two I have published a series of posts developing the themes explored in the first two-thirds of my incomplete trilogy.

But what to do about the missing final episode of ‘A Summer of Love’, which was going to develop this latter fair access theme in more detail?

My initial idea was to survey and synthesise the large number of other recently published studies on the topic. But, as I reviewed the content of these publications, it struck me that such a post would be stuffed full of descriptive detail but lack any real bite – by which I mean substantial and serious engagement with the central problem.

I decided to cut to the chase.

I also decided to foreground material about the highest reaches of A level attainment and progression to Oxbridge, not because I see the issue solely in these stratospheric terms, but because:

  • The top end of fair access is important in its own right, especially for those with a gifted education perspective. Oxford and Cambridge consistently declare themselves a special case and I wanted to explore the substance of their position.
  • There is compelling evidence that Oxford and Cambridge are amongst the weakest performers when it comes to fair access for the highest attaining disadvantaged learners. There are reasons why the task may be comparatively more difficult for them but, equally, as our most prestigious universities, they should be at the forefront when it comes to developing and implementing effective strategies to tackle the problem.
  • The Government has itself made Oxbridge performance a litmus test of progress (or lack of progress) on fair access and on higher education’s wider contribution to social mobility.

The first part of the post briefly reviews the range of measures and regulatory apparatus devoted to improving fair access. This is to provide a frame from which to explore the available data and its shortcomings, rather than an in-depth analysis of relative strengths and weaknesses. Readers who are familiar with this background may prefer to skip it.

The mid-section concentrates on the limited data in the public domain and how it has been (mis)interpreted.

The final section reviews the criticisms made by the SMCPC and, while endorsing them thoroughly, offers a set of further proposals – many of them data-driven – for ratcheting up our collective national efforts to reverse the unsatisfactory progress made to date.

.

A Troubling Tale of Unnecessary Complexity and Weak Regulation

.

A Proliferation of Measures

There is little doubt that we have a problem in England when it comes to progression to selective, competitive higher education (however defined) by learners from disadvantaged backgrounds (however defined).

We may not be unique in that respect, but that does not alter the fact that the problem is longstanding and largely unresolved.

The recent ‘State of the Nation 2013’ Report from the SMCPC says ‘there has been little change in the social profile of the most elite institutions for over a decade’, adding that ‘while some of the building blocks are in place to lift children off the bottom, opening up elites remains elusive.’

Part of the problem is that the debates about these respective definitions continue to receive disproportionate coverage. Such debates are sometimes deployed as a diversionary tactic, intentionally drawing us away from the unpalatable evidence that we are making decidedly poor headway in tackling the core issue.

The definitional complexities are such that they lend themselves to exploitation by those with a vested interest in preserving the status quo and defending themselves against what they regard as unwonted state intervention.

I shall resist the temptation to explore the comparative advantages and disadvantages of different measures, since that would risk falling into the trap I have just identified.

But I do need to introduce some of the more prominent – and pin down some subtle distinctions – if only for the benefit of readers in other countries.

One typically encounters four different categorisations of competitive, selective higher education here in the UK:

  • Oxbridge – a convenient shorthand reference to Oxford and Cambridge Universities. These two institutions are commonly understood to be qualitatively superior to other UK universities and, although that advantage does not apply universally to every undergraduate course and subject, there is some academic support for treating them as a category in their own right.
  • Russell Group – The Russell Group was formed in 1994 and originally comprised 17 members. There are currently 24 members, 20 of them located in England, including Oxford and Cambridge. Four institutions – Durham, Exeter, Queen Mary’s and York – joined as recently as 2012 and membership is likely to increase as the parallel 1994 Group has just disbanded. DfE (as opposed to BIS) often uses Russell Group membership as its preferred proxy for selective, competitive higher education, although there are no objective criteria that apply exclusively to all members.
  • Sutton Trust 30 – The Sutton Trust originally identified a list of 13 universities, derived from ‘average newspaper league table rankings’. This list – Birmingham, Bristol, Cambridge, Durham, Edinburgh, Imperial, LSE, Nottingham, Oxford, St Andrews, UCL, Warwick and York – still appears occasionally in research commissioned by the Trust, although it was subsequently expanded to 30 institutions. In ‘Degrees of Success’, a July 2011 publication, they were described thus:

‘The Sutton Trust 30 grouping of highly selective universities comprises universities in Scotland, England and Wales with over 500 undergraduate entrants each year, where it was estimated that less than 10 per cent of places are attainable to pupils with 200 UCAS tariff points (equivalent to two D grades and a C grade at A-level) or less. These 30 universities also emerge as the 30 most selective according to the latest Times University Guide.’

The full list includes all but two of the Russell Group (Queen Mary’s and Queen’s Belfast) plus eight additional institutions.

  • Top third of HEIs – the category used in DfE’s experimental destinations statistics, comprising (broadly) the third of higher education institutions whose entrants achieve the highest average attainment in their best three A levels. The accompanying technical notes explain:

‘The HEIs included in this group change every year; although 94% of HEIs remained in the top third for 5 consecutive years, from 2006/07 to 2010/11. The calculation is restricted to the top three A level attainment; pupils who study other qualifications at Key Stage 5 will be excluded. Institutions with a considerable proportion of entrants who studied a combination of A levels and other qualifications may appear to have low scores. As the analysis covers students from schools and colleges in England, some institutions in other UK countries have scores based on small numbers of students. As this measure uses matched data, all figures should be treated as estimates.’

This categorisation includes seven further mainstream universities (Aston, City, Dundee, East Anglia, Goldsmiths, Loughborough, Sussex) plus a range of specialist institutions.

Indicators of educational disadvantage are legion, but these are amongst the most frequently encountered:

  • Eligibility for free school meals (FSM): DfE’s preferred measure. The term is misleading since the measure only includes learners who meet the FSM eligibility criteria and for whom a claim is made, so eligibility of itself is insufficient. Free school meals are available for learners in state-funded secondary schools, including those in sixth forms. From September 2014, eligibility will be extended to all in Years R, 1 and 2 and to disadvantaged learners in further education and sixth form colleges. The phased introduction of Universal Credit will also impact on the eligibility criteria (children of families receiving Universal Credit between April 2013 and March 2014 are eligible for FSM, but the cost of extending FSM to all Universal Credit recipients once fully rolled out is likely to be prohibitive). We do not yet know whether these reforms will cause DfE to select an alternative preferred measure and, if so, what that will be. Eligibility for the Pupil Premium is one option, more liberal than FSM, though this currently applies only up to age 16.
  • Residual Household Income below £16,000: This is broadly the income threshold below which eligibility for free school meals applies. It is used by selective universities (Oxford included) because it can be applied universally, regardless of educational setting and whether or not free school meals have been claimed. Oxford explains that:

‘Residual income is based on gross household income (before tax and National Insurance) minus certain allowable deductions. These can include pension payments, which are eligible for certain specified tax relief, and allowances for other dependent children.’

The threshold is determined through the assessment conducted by Student Finance England, so is fully consistent with its guidance.

  • Low participation schools: This measure focuses on participation by school attended rather than where students live. It may be generic – perhaps derived from the Government’s experimental destinations statistics – or based on admissions records for a particular institution. As far as I can establish, there is no standard or recommended methodology: institutions decide for themselves the criteria they wish to apply.
  • POLAR (Participation Of Local Areas): HEFCE’s area-based classification of participation in higher education. Wards are categorised in five quintiles, with Quintile 1 denoting those with lowest participation. The current edition is POLAR 3.
  • Other geodemographic classifications: these include commercially developed systems such as ACORN and MOSAIC based on postcodes and Output Area Classification (OAC) based on census data. One might also include under this heading the Indices of Multiple Deprivation (IMD) and the associated sub-domain Income Deprivation Affecting Children Index (IDACI).
  • National Statistics Socio-Economic Classification (NS-SEC): an occupationally-based definition of socio-economic status applied via individuals to their households. There are typically eight classes:
  1. Higher managerial, administrative and professional
  2. Lower managerial, administrative and professional
  3. Intermediate
  4. Small employers and own account workers
  5. Lower supervisory and technical
  6. Semi-routine
  7. Routine
  8. Never worked and long-term unemployed

Data is often reported for NS-SEC 4-7.

Sitting alongside these measures of disadvantage is a slightly different animal – recruitment from state-funded schools and colleges compared with recruitment from the independent sector.

While this may be a useful social mobility indicator, it is a poor proxy for fair access.

Many learners attending independent schools are from comparatively disadvantaged backgrounds, and of course substantially more learners at state-maintained schools are comparatively advantaged.

The Office For Fair Access (OFFA) confirms that:

‘in most circumstances we would not approve an access agreement allowing an institution to measure the diversity of its student body solely on the basis of the numbers of state school pupils it recruits….it is conceivable that a university could improve its proportion of state school students without recruiting greater proportions of students from disadvantaged groups.’

Nevertheless, independent/state balance continues to feature prominently in some access agreements drawn up by selective universities and approved by OFFA.

There is a risk that some institutions are permitted to give this indicator disproportionate attention, at the expense of their wider commitment to fair access.

 .

Securing National Improvement

Given the embarrassment of riches set out above, comparing progress between institutions is well-nigh impossible, let alone assessing the cumulative impact on fair access at national level.

When it came to determining their current strategy, the government of the day must have faced a choice between:

  • Imposing a standard set of measures on all institutions, ignoring complaints that those selected were inappropriate for some settings, particularly those that were somehow atypical;
  • Allowing institutions to choose their own measures, even though that had a negative impact on the rate of improvement against the Government’s own preferred national indicators; and
  • A half-way house which insisted on universal adoption of one or two key measures while permitting institutions to choose from a menu of additional measures, so creating a basket more or less appropriate to their circumstances.

For reasons that are not entirely clear – but presumably owe something to vigorous lobbying from higher education interests – the weaker middle option was preferred and remains in place to this day.

The standard-setting and monitoring process is currently driven by OFFA, though we expect imminently the final version of a National Strategy for Access and Student Success, developed jointly with HEFCE.

A new joint process for overseeing OFFA’s access agreements (from 2015/16) and HEFCE’s widening participation strategic statements (from 2014-2017) will be introduced in early 2014.

There were tantalising suggestions that the status quo might be adjusted through work on the wider issue of evaluation.

An early letter referred to plans to:

‘Commission feasibility study to establish if possible to develop common evaluation measures that all institutions could adopt to assess the targeting and impact of their access and student success work’.

The report would be completed by Spring 2013.

Then an Interim Report on the Strategy said the study would be commissioned in ‘early 2013 to report in May 2013’ (Annex B).

It added:

‘Informal discussions with a range of institutional representatives have indicated that many institutions would welcome a much clearer indication of the kind of evidence and indicators that we would wish to see. Therefore a key strand within the strategy development will be work undertaken with the sector to develop an evaluation framework to guide them in their efforts to evidence the impact of their activity. Within this, we intend to test the feasibility of developing some common measures for the gathering of high-level evidence that might be aggregated to provide a national picture. We will also investigate what more can be done by national bodies including ourselves to make better use of national data sets in supporting institutions as they track the impact of their interventions on individual students.’

However, HEFCE’s webpage setting out research and stakeholder engagement in support of the National Strategy still says the study is ‘to be commissioned’ and that the publication date is ‘to be confirmed’.

I can find no explanation of the reasons for this delay.

For the time being, OFFA is solely responsible for issuing guidance to institutions on the content of their access agreements, approving the Agreements and monitoring progress against them.

OFFA’s website says:

‘Universities and colleges set their own targets based on where they need to improve and what their particular institution is trying to achieve under its access agreement…These targets must be agreed by OFFA. We require universities and colleges to set themselves at least one target around broadening their entrant pool. We also encourage (but do not require) them to set themselves further targets, particularly around their work on outreach and, where appropriate, retention. Most choose to do so. We normally expect universities and colleges to have a range of targets in order to measure their progress effectively. When considering whether targets are sufficiently ambitious, we consider whether they represent a balanced view of the institution’s performance, and whether they address areas where indicators suggest that the institution has furthest to go to improve access.

From 2012-13, in line with Ministerial guidance, we are placing a greater emphasis on progress against targets. We would not, however, impose a sanction solely on the basis of a university or college not meeting its targets or milestones.’

The interim report on a National Strategy suggests that – informally at least – many universities recognise that this degree of flexibility is not helpful to their prospects of improving fair access, either individually or collectively.

But the fact that the promised work has not been undertaken might imply a counterforce pushing in precisely the opposite direction.

The expectations placed on universities are further complicated by the rather unclear status of the annual performance indicators for widening participation of under-represented groups supplied by the Higher Education Statistics Agency (HESA).

HESA’s table for young full-time first degree entrants shows progress by each HEI against benchmarks for ‘from state schools or colleges’, ‘from NS-SEC classes 4, 5, 6 and 7’ and ‘from low participation neighbourhoods (based on POLAR3 methodology)’ respectively.

HESA describes its benchmarks thus:

‘Because there are such differences between institutions, the average values for the whole of the higher education sector are not necessarily helpful when comparing HEIs. A sector average has therefore been calculated which is then adjusted for each institution to take into account some of the factors which contribute to the differences between them. The factors allowed for are subject of study, qualifications on entry and age on entry (young or mature).’

HESA’s benchmarks are clearly influential in terms of the measures adopted in many access agreements (and much of the attention given to the state versus independent sector intake may be attributable to them).

On the other hand, the indicators receive rather cavalier treatment in the most recent access agreements from Oxford and Cambridge. Oxford applies the old POLAR2 methodology in place of the latest POLAR3, while Cambridge adjusts the POLAR3 benchmarks to reflect its own research.

The most recent 2011/12 HESA results for Oxford and Cambridge are as follows:

.

Institution   State schools               NS-SEC 4-7                  LPN (POLAR3)
              Benchmark    Performance    Benchmark    Performance    Benchmark    Performance
Oxford        71.2%        57.7%          15.9%        11.0%          4.7%         3.1%
Cambridge     71.4%        57.9%          15.9%        10.3%          4.5%         2.5%

.

That probably explains why Oxbridge would prefer an alternative approach! But the reference to further work in the Interim Strategy perhaps also suggests that few see these benchmarks as the best way forward.

.

National Targets

The Government also appears in something of a bind with its preferred measures for monitoring national progress.

When it comes to fair access (as opposed to widening participation) the Social Mobility Indicators rely exclusively on the gap between state and independent school participation at the most selective HEIs, as defined by BIS.

As noted above, this has major shortcomings as a measure of fair access, though more validity as a social mobility measure.

The relevant indicator shows that the gap held steady at between 37% and 39% from 2006 to 2010, but it has just been updated to reflect an unfortunate increase to 40% in 2010/11.

BIS uses the same measure as a Departmental Performance Indicator for its work on higher education.  The attachment on the relevant gov.uk page is currently the wrong one – which might indicate that BIS is less than comfortable with its lack of progress against the measure.

DfE takes a different approach declaring an ‘Outcome of Education’ indicator:

‘Outcome of education:

i) Percentage of children on free school meals progressing to Oxford or Cambridge*.

ii) Percentage of children on free school meals progressing to a Russell Group university*.

iii) Percentage of children on free school meals progressing to any university*.

iv) Participation in education and work based training at age 16 to 17

*Available June 2013’

But progress against this indicator is nowhere to be found in the relevant section of the DfE website or, as far as I can establish, anywhere within the DfE pages on gov.uk.

.

.

Oxbridge Access Agreement Targets for 2014/15

Perhaps the best way to link this section with the next is by showing how Oxford and Cambridge have decided to frame the targets in their access agreements for 2014/15.

Oxford has OFFA’s agreement to target:

  • Schools and colleges that secure limited progression to Oxford. They use ‘historic UCAS data’ to estimate that ‘in any one year up to 1,680…will have no students who achieve AAA grades but, over a three-year period they may produce a maximum of two AAA candidates’. They also prioritise an estimated 1,175 institutions which have larger numbers achieving AAA grades ‘but where the success rate for an application to Oxford is below 10%’. In 2010, 19.4% of Oxford admissions were from these two groups and it plans to increase the proportion to 25% by 2016-17;
  • UK undergraduates from disadvantaged socio-economic backgrounds, based on ‘ACORN postcodes 4 and 5’. Some 7.6% of admissions came from these postcodes in 2010/11 and Oxford proposes to reach 9.0% by 2016/17.
  • UK undergraduates from neighbourhoods with low participation in higher education, as revealed by POLAR2. It will focus on ‘students domiciled in POLAR quintiles 1 and 2’. In 2012, 10.6% were from this group and Oxford proposes to increase this to 13.0% by 2016-17.

In addition to a target for admitting disabled students, Oxford also says it will monitor and report on the state/independent school mix, despite evidence ‘that this measure is often misleading as an indicator of social diversity’. It notes that:

‘30% of 2012 entrants in receipt of the full Oxford Bursary (students with a household income of £16,000 or less) were educated in the independent sector…The University will continue to monitor the level of students from households with incomes of £16,000 or less. It is considered that these are the most financially disadvantaged in society, and it is below this threshold that some qualify for receipt of free schools meals, and the pupil premium. The University does not consider that identifying simply those students who have been in receipt of free school meals provides a suitably robust indicator of disadvantage as they are not available in every school or college with post-16 provision, nor does every eligible student choose to receive them.

There are no national statistics currently available on the number of students whose household income is £16,000 or less and who attain the required academic threshold to make a competitive application to Oxford. In 2011-12, around one in ten of the University’s UK undergraduate intake was admitted from a household with this level of declared income.’

Meanwhile, Cambridge proposes only two relevant targets, one of them focused on the independent/state divide:

  • Increase the proportion of UK resident students admitted from UK state sector schools and colleges to between 61% and 63%. This is underpinned by the University’s research finding that ‘the proportion of students nationally educated at state schools securing examination grades in subject combinations that reflect our entrance requirements and the achievement level of students admitted to Cambridge stands at around 62%’.
  • Increase the proportion of UK resident students from low participation neighbourhoods to approximately 4% by 2016. It argues:

‘Currently HESA performance indicators and other national datasets relating to socio-economic background do not take adequate account of the entry requirements of individual institutions. Whilst they take some account of attainment, they do not do so in sufficient detail for highly selective institutions such as Cambridge where the average candidate admitted has 2.5 A* grades with specific subject entry requirements. For the present we have adjusted our HESA low participation neighbourhood benchmark in line with the results of our research in relation to state school entry and will use this as our target.’

Each of these approaches has good and bad points. Cambridge’s is more susceptible to the criticism that it is overly narrow. There is no real basis to compare the relative performance of the two institutions since there is negligible overlap between their preferred indicators. That may be more comfortable for them, but it is not in the best interests of their customers, or of those seeking to improve their performance.

 

Investigating the Data on High Attainment and Fair Access to Oxbridge

Those seeking statistics about high attainment amongst disadvantaged young people and their subsequent progression to Oxbridge are bound to be disappointed.

There is no real appreciation of the excellence gap in this country and this looks set to continue. The fact that gaps between advantaged and disadvantaged learners are typically wider at the top end of the attainment distribution seems to have acted as a brake on the publication of data that proves the point.

It is possible that the current round of accountability reforms will alter this state of affairs, but this has not yet been confirmed.

For the time being at least, almost all published statistics about high A level attainment amongst disadvantaged learners have come via answers to Parliamentary Questions. This material invariably measures disadvantage in terms of FSM eligibility.

Information about the admission of disadvantaged learners to Oxbridge is equally scant, but a picture of sorts can be built up from a mixture of PQ replies, university admission statistics and the DfE’s destination measures. The material supplied by the universities draws on measures other than FSM.

The following two sections set out what little we know, including the ever important statistical caveats.

.

High Attainment Data

  • In 2003, 94 students (1.9%) eligible for FSM achieved three or more A grades at A level. The figures relate to 16-18 year-olds in maintained schools only who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008/09, 232 pupils at maintained mainstream schools eligible for FSM achieved three or more A grades at A level (including applied A level and double award), 179 of them attending comprehensive schools. The figures exclude students in FE and sixth form colleges previously eligible for FSM. (Parliamentary Question, 7 April 2010, Hansard (Col 1451W))
  • The numbers of Year 13 A level candidates eligible for FSM in Year 11 achieving 3 or more A grades at A level (including applied A levels and double award) were: 2006 – 377; 2007 – 433; 2008 – 432; 2009 – 509. These figures include students in both the schools and FE sectors. (Parliamentary Question, 27 July 2010, Hansard (Col 1223W))

 .

To summarise, the total number of students who were FSM-eligible at age 16 and went on to achieve three or more GCE A levels at Grade A*/A – including those in maintained schools, sixth form and FE colleges – has been increasing significantly since 2006.

          2006   2007   2008   2009   2010   2011
Number     377    433    432    509      ?    546

The overall increase between 2006 and 2011 is about 45%.
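As a quick arithmetic check, the stated growth rate can be recomputed from the PQ figures above (a minimal sketch in Python; the 2010 figure is not available in the source, so the comparison uses 2006 and 2011):

```python
# FSM-eligible students achieving three or more A grades at A level,
# schools and FE sectors combined (figures from the PQ answers above).
counts = {2006: 377, 2007: 433, 2008: 432, 2009: 509, 2011: 546}

# Percentage increase between the first and last years in the series.
increase = (counts[2011] - counts[2006]) / counts[2006] * 100
print(f"{increase:.1f}%")  # 44.8%, i.e. "about 45%"
```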

 .

Oxbridge Admission/Acceptance Data

  • The numbers of learners eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19 between 2005/06 and 2008/09 (rounded to the nearest five) were:

            2005/06   2006/07   2007/08   2008/09
Oxford           25        20        20        25
Cambridge        20        25        20        20
TOTAL            45        45        40        45

Sources: Parliamentary Question, 13 December 2010, Hansard (Col 549W) and Parliamentary Question 21 February 2012, Hansard (Col 755W)

.

[Postscript (January 2014):

In January 2014, BIS answered a further PQ which provided equivalent figures for 2009/10 and 2010/11 – again rounded to the nearest five and derived from matching the National Pupil Database (NPD), HESA Student Record and the Individualised Learner Record (ILR) owned by the Skills Funding Agency.

The revised table is as follows:

            2005/06   2006/07   2007/08   2008/09   2009/10   2010/11
Oxford           25        20        20        25        15        15
Cambridge        20        25        20        20        25        25
TOTAL            45        45        40        45        40        40

 

Sources:

Parliamentary Question, 13 December 2010, Hansard (Col 549W)

Parliamentary Question 21 February 2012, Hansard (Col 755W)

Parliamentary Question 7 January 2014, Hansard (Col 191W)

Although the 2010/11 total is marginally more positive than the comparable figure derived from the Destination Indicators (see below), this confirms negligible change overall during the last six years for which data is available. The slight improvement at Cambridge during the last two years of the sequence is matched by a corresponding decline at Oxford, from what is already a desperately low base.]

.

                    All students   %age FSM   FSM students
UK HEIs                  164,620         6%         10,080
Top third of HEIs         49,030         4%          2,000
Russell Group             28,620         3%            920
Oxbridge                   2,290         1%             30
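The percentage column can be recovered from the two count columns (a rough consistency check; all counts in the table are rounded, so the shares are approximate):

```python
# (all students, FSM students) for each destination category, from the table above
destinations = {
    "UK HEIs": (164_620, 10_080),
    "Top third of HEIs": (49_030, 2_000),
    "Russell Group": (28_620, 920),
    "Oxbridge": (2_290, 30),
}

for category, (total, fsm) in destinations.items():
    share = fsm / total * 100
    print(f"{category}: {share:.1f}% FSM")  # rounds to 6%, 4%, 3% and 1% respectively
```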

.

.

These are experimental statistics and all figures – including the 30 at Oxbridge – are rounded to the nearest 10. The introductory commentary explains that:

‘This statistical first release (experimental statistics) on destination measures shows the percentage of students progressing to further learning in a school, further education or sixth-form college, apprenticeship, higher education institution or moving into employment or training.’

It adds that:

‘To be included in the measure, young people have to show sustained participation in an education or employment destination in all of the first 2 terms of the year after they completed KS4 or took A level or other level 3 qualifications. The first 2 terms are defined as October to March.’

The Technical Notes published alongside the data also reveal that:

  • only learners aged 16-18 who have entered at least one A level or an equivalent L3 qualification are included;
  • the data collection process incorporates ‘an estimate of young people who have been accepted through the UCAS system for entry into the following academic year’ but ‘deferred acceptances are not reported as a distinct destination’; and
  • FSM data for KS5 learners relates to those eligible for and claiming FSM in Year 11.

  • Cambridge’s 2012 intake ‘included 50+ students who had previously been in receipt of FSM’ (It is not stated whether all were eligible in Year 11, so it is most likely that this is the number of students who had received FSM at one time or another in their school careers.) This shows that Cambridge at least is collecting FSM data that it does not publish amongst its own admission statistics or use in its access agreement. (Cambridge University Statement, 26 September 2013)
  • In 2012, Cambridge had 418 applications from the most disadvantaged POLAR2 quintile (4.6% of all applicants) and, of those, 93 were accepted (3.6% of all acceptances), giving a 22.2% success rate. (Cambridge University Admission Statistics 2012 (page 23))

.

To summarise, the numbers of disadvantaged learners progressing to Oxbridge are very small; exceedingly so as far as those formerly receiving FSM are concerned.

Even allowing for methodological variations, the balance of evidence suggests that, at best, the numbers of FSM learners progressing to Oxbridge have remained broadly the same since 2005.

During that period, the concerted efforts of the system described above have had zero impact. The large sums invested in outreach and bursaries have made not one iota of difference.

This is true even though the proportion achieving the AAA A level benchmark has increased by about 45%. If Oxbridge admission were solely dependent on attainment, one would have expected a commensurate increase, to around 65 FSM entrants per year.
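The "around 65" figure follows from scaling the historic baseline of roughly 45 FSM entrants per year by the growth in the AAA cohort (a sketch; both inputs are rounded in the source data):

```python
# FSM entrants to Oxbridge have hovered around 45 per year
# (PQ data above, rounded to the nearest five).
baseline = 45

# The FSM cohort with three or more A grades grew from 377 (2006) to 546 (2011).
growth = 546 / 377  # roughly 1.45

expected = baseline * growth
print(round(expected))  # 65
```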

On the basis of the 2010/11 Destination Indicators, we can estimate that, whereas Oxbridge admits approximately 8% of all Russell Group students, it only admits slightly over 3% of Russell Group FSM students. If Oxbridge achieved the performance of its Russell Group peers, the numbers of formerly FSM admissions would be over 100 per year.
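The two shares quoted here can be reproduced from the 2010/11 destinations table above (a sketch; the counts are rounded to the nearest 10 in the source, so the results are approximate):

```python
# From the 2010/11 Destination Indicators: all students and FSM students
# at Russell Group universities and at Oxbridge.
russell_all, russell_fsm = 28_620, 920
oxbridge_all, oxbridge_fsm = 2_290, 30

share_all = oxbridge_all / russell_all * 100  # Oxbridge share of all Russell Group students
share_fsm = oxbridge_fsm / russell_fsm * 100  # Oxbridge share of Russell Group FSM students

print(f"{share_all:.1f}% of all Russell Group students")  # 8.0%
print(f"{share_fsm:.1f}% of Russell Group FSM students")  # 3.3%
```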

.

Misleading Use of This Data

To add insult to injury, this data is frequently misinterpreted and misused. Here are some examples, all of which draw selectively on the data set out above.

  • Of the 80,000 FSM-eligible students in the UK only 176 received three As at A level…more than one quarter of those students…ended up at either Oxford or Cambridge – Nicholson (Oxford Undergraduate Admissions Director, Letter to Guardian, 7 March 2011)
  • ‘Of the 80,000 children eligible for free school meals in the UK in 2007, only 176 received 3 As at A level. Of those 45 (more than a quarter) got places at Oxford or Cambridge’ (Undated Parliamentary Briefing ‘Access and admissions to Oxford University’ )
  • ‘The root causes of underrepresentation of students from poorer backgrounds at leading universities include underachievement in schools and a lack of good advice on subject choices. For example, in 2009 only 232 students who had been on free school meals (FSM) achieved 3As at A-level or the equivalent.  This was 4.1% of the total number of FSM students taking A-levels, and less than an estimated 0.3% of all those who had received free school meals when aged 15.’ (Russell Group Press release, 23 July 2013).
  • ‘Such data as is available suggests that less than 200 students per year who are recorded as being eligible for FSM secure grades of AAA or better at A level. The typical entrance requirement for Cambridge is A*AA, and so on that basis the University admits in excess of one quarter of all FSM students who attain the grades that would make them eligible for entry.’ (Cambridge University Statement, 26 September 2013)
  • ‘According to data produced by the Department for Children, Schools and Families, of the 4,516 FSM students who secured a pass grade at A Level in 2008 only 160 secured the grades then required for entry to the University of Cambridge (ie AAA). Students who were eligible for FSM therefore make up less than 1% of the highest achieving students nationally each year.

Assuming that all 160 of these students applied to Oxford or Cambridge in equal numbers (ie 80 students per institution) and 22 were successful in securing places at Cambridge (in line with the 2006-08 average) then this would represent a success rate of 27.5% – higher than the average success rate for all students applying to the University (25.6% over the last three years). In reality of course not every AAA student chooses to apply to Oxford or Cambridge, for instance because neither university offers the course they want to study, e.g. Dentistry.’ (Cambridge Briefing, January 2011 repeated in Cambridge University Statement, 26 September 2013)

.

.

To summarise, Oxford, Cambridge and the Russell Group are all guilty of implying that FSM-eligible learners in the schools sector are the only FSM-eligible learners progressing to selective universities.

They persist in using the school sector figures even though combined figures for the school and FE sectors have been available since 2010.

Oxbridge’s own admission statistics show that, in 2012:

  • 9.6% of acceptances at Cambridge (332 students) were extended to students attending sixth form, FE and tertiary colleges (UK figures)
  • 10.5% of UK domiciled acceptances at Oxford (283 students) were extended to students attending sixth form colleges and FE institutions of all types

We can rework Cambridge’s calculation using the figure of 546 students with three or more A*/A grades in 2011:

  • assuming that all applied to Oxford and Cambridge in equal numbers gives a figure of 273 per institution
  • assuming a success rate of 25.6% – the average over the last three years
  • the number of FSM students that would have been admitted to Cambridge is roughly 70.
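The steps above amount to a one-line calculation (a sketch reusing Cambridge's own assumptions with the combined schools and FE cohort figure):

```python
# 546 FSM students achieved three or more A*/A grades in 2011 (schools + FE).
fsm_high_attainers = 546

per_institution = fsm_high_attainers / 2  # assume an equal split between Oxford and Cambridge
success_rate = 0.256                      # Cambridge's three-year average success rate

implied_admissions = per_institution * success_rate
print(round(implied_admissions))  # 70
```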

Part of the reason high-attaining disadvantaged students do not apply to Oxbridge may be that they want to study one of the relatively few mainstream subjects, such as dentistry, that neither university offers.

But it is highly likely that other factors are at play, including the perception that Oxbridge is not doing all that it might to increase numbers of disadvantaged students from the state sector.

If this favourable trend in A level performance stalls, as a consequence of recent A level reforms, it will not be reasonable – in the light of the evidence presented above – for Oxbridge to argue that this is impacting negatively on the admission of FSM-eligible learners.

.

Building on the work of the SMCPC

 

‘Higher Education: The Fair Access Challenge’

There is no shortage of publications on fair access and related issues; several more have appeared in the last year alone.

Easily the most impressive has been the Social Mobility and Child Poverty Commission’s ‘Higher Education: The Fair Access Challenge’ (June 2013), though it does tend to rely a little too heavily on evidence of the imbalance between state and independent-educated students.

.

.

It examines the response of universities to recommendations first advanced in an earlier publication ‘University Challenge: How Higher Education Can Advance Social Mobility’ (2012) published by Alan Milburn, now Chair of the Commission, in his former role as Independent Reviewer on Social Mobility.

The analysis sets out key points from the earlier work:

  • Participation levels at the most selective universities by the least advantaged are unchanged since the mid-90s.
  • The most advantaged young people are seven times more likely to attend the most selective universities than the most disadvantaged.
  • The probability of a state secondary pupil eligible for FSM in Year 11 entering Oxbridge by 19 is almost 2000 to 1; for a privately educated pupil the probability is 20 to 1.

New research is presented to show that the intake of Russell Group universities has become less socially representative in the last few years:

  • The number of state school pupils entering Russell Group universities has increased by an estimated 2.6% from 2002/03 to 2011/12, but the commensurate increase in privately educated entrants is 7.9%. The proportion of young full-time state-educated entrants has consequently fallen from 75.6% to 74.6% over this period. The worst performers on this measure are: Durham (-9.2%), Newcastle (-4.6%), Warwick (-4.5%) and Bristol (-3.9%). The best are: Edinburgh (+4.6%), UCL (+3.3%), LSE (+3.0%) and Southampton (+2.9%). The Oxbridge figures are: Cambridge (+0.3%) and Oxford (+2.3%).
  • Similarly, the proportion of young full-time entrants from NS-SEC classes 4-7 has fallen from 19.9% in 2002/03 to 19.0% in 2011/12. A table (reproduced below) shows that the worst offenders on this measure are Queen’s Belfast (-4.6%), Liverpool (-3.2%), Cardiff (-2.9%) and Queen Mary’s (-2.7%). Conversely, the best performers are Nottingham (+2.2%), York (+0.9%), Warwick and LSE (+0.8%). The figures for Oxbridge are: Cambridge (-1.0%) and Oxford (0.0%).

.

[Table: change in the proportion of young full-time entrants from NS-SEC classes 4-7, by institution]

  • An estimated 3,700 state-educated learners have the necessary grades for admission to Russell Group universities but do not take up places. This calculation is based on the fact that, if all of the 20 Russell Group universities in England achieved their HESA widening participation benchmarks, they would have recruited an extra 3,662 students from state schools. (The benchmarks show how socially representative each intake would be if it were representative of all entrants with the grades required for entry – though see Cambridge’s reservations on this point, above.) Some universities would need to increase significantly the percentage of state students recruited – for example, Bristol and Durham (26.9%), Oxford (23.4%) and Cambridge (23.3%).
  • Using the same methodology to calculate the shortfall per university in NS-SEC 4-7 students results in the table below, showing that the worst offenders require percentage increases of 54.4% (Cambridge), 48.5% (Bristol), 45.5% (Oxford) and 42.2% (Durham). Conversely, Queen Mary’s, Queen’s Belfast, LSE and King’s College are over-recruiting from this population on this measure.

.

[Table: shortfall in NS-SEC 4-7 entrants against HESA benchmarks, by institution]

  • Even if every Russell Group university met the self-imposed targets in its access agreement, the number of ‘missing’ state educated students would drop by only 25% by 2016/17, because the targets are insufficiently ambitious. (This is largely because only seven have provided such targets in their 2013/14 access agreements and there are, of course, no collective targets.)
  • Boliver’s research is cited to show that there is a gap in applications from state school pupils compared with those educated in the independent sector. But there is also evidence that a state school applicant needs, on average, one grade higher in their A levels (eg AAA rather than AAB) to be as likely to be admitted as an otherwise identical student from the independent sector.
  • A Financial Times analysis of 2011 applications to Oxford from those with very good GCSEs found that those from independent schools were 74% more likely to apply than those from the most disadvantaged state secondary schools. Amongst applicants, independently educated students were more than three times as likely to be admitted as their peers in disadvantaged state schools. They were also 20% more likely to be admitted than those at the 10% most advantaged state secondary schools. As shown by the table below, the probabilities involved varied considerably. The bottom line is that the total probability of a place at Oxford for an independent school student is 2.93%, whereas the comparable figure for a student at one of the 10% most disadvantaged state secondary schools is just 0.07%.

.

[Table: probability of admission to Oxford by school background]

When it comes to the causes of the fair access gap, subject to controls for prior attainment, the report itemises several contributory factors, noting the limited evidence available to establish their relative importance and interaction:

  • low aspirations among students, parents and teachers
  • less knowledge of the applications process, problems in demonstrating potential through the admissions process and a tendency to apply to the most over-subscribed courses
  • not choosing the right  A-level subjects and teachers’ under-prediction of expected A level grades
  • a sense that selective universities ‘are socially exclusive and “not for the likes of them”’

The Report states unequivocally that:

‘The Social Mobility and Child Poverty Commission is deeply concerned about the lack of progress on fair access. The most selective universities need to be doing far more to ensure that they are recruiting from the widest possible pool of talent. The Commission will be looking for evidence of a step change in both intention and action in the years to come.’

It identifies several areas for further action, summarising universities’ responses to ‘University Challenge’:

  • Building links between universities and schools: The earlier report offered several recommendations, including that universities should have explicit objectives to help schools close attainment gaps. No evidence is given to suggest that such action is widespread, though many universities are strengthening their outreach activities and building stronger relationships with the schools sector. Several universities highlighted the difficulties inherent in co-ordinating their outreach activity given the demise of Aimhigher, but several retain involvement in a regional partnership.
  • Setting targets for fair access: The earlier report recommended that HE representative bodies should set statistical targets for progress on fair access over the next five years. This was not met positively:

‘Representative bodies in the Higher Education Sector did not feel this would be a useful step for them to take, saying that it was difficult to aggregate the different targets that individual institutions set themselves. There was also a feeling among some highly selective institutions that the report overestimated the number of students who have the potential to succeed at the most selective universities.’

Nevertheless, the Commission is insistent:

‘The Commission believes it is essential that the Russell Group signals its determination to make a real difference to outcomes by setting a clear collective statistical target for how much progress its members are aiming to make in closing the “fair access gap”. Not doing so risks a lack of sustained focus among the most selective universities’.

  • Using contextual admissions data: The report argues that ‘there is now a clear evidence base that supports the use of contextual data’. Recommendations from the earlier report were intended to universalise the use of contextual data, including commitment from the various representative bodies through a common statement of support and a collaborative guide to best practice. There is no sign of the former, although the Commission reports ‘widespread agreement that the use of contextual data during the admissions process should be mainstreamed’. However it notes that there is much more still to do. (The subsequent SPA publication should have helped to push forward this agenda.)
  • Reforming the National Scholarship Programme: The earlier report called on the Government to undertake a ‘strategic review of government funding for access’ to include the National Scholarship Programme (NSP). The suggestion that the imminent HEFCE/OFFA National Strategy should tackle the issue has been superseded by a Government decision to refocus the NSP on postgraduate education.
  • Postgraduate funding reform: The earlier report recommended work on a postgraduate loan scheme and further data collection to inform future decisions. The current report says that:

‘…the Government appears to have decided against commissioning an independent report looking at the issue of postgraduate access. This is very disappointing.’

and calls on it ‘to take heed’. However, this has again been superseded by the NSP announcement.

The SMCPC’s ‘State of the Nation 2013’ report reinforces its earlier publication, arguing that:

‘…despite progress, too much outreach work that aims to make access to university fairer and participation wider continues to rely on unproven methods or on work that is ad hoc, uncoordinated and duplicative… These are all issues that the higher education sector needs to address with greater intentionality if progress is to be made on breaking the link between social origin and university education.

The UK Government also needs to raise its game… much more needs to be done… to address the loss of coordination capacity in outreach work following the abolition of Aimhigher.’

It recommends that:

‘All Russell Group universities should agree five-year aims to close the fair access gap, all universities should adopt contextual admissions processes and evidence-based outreach programmes, and the Government should focus attention on increasing university applications from mature and part-time students.’

 .

What Else Might Be Done?

I set myself the challenge of drawing up a reform programme that would build on the SMCPC’s recommendations but would also foreground the key issues I have highlighted above, namely:

  • A significant improvement in the rate of progression for disadvantaged high-attaining learners to Oxbridge;
  • A more rigorous approach to defining, applying and monitoring improvement measures; and
  • The publication of more substantive and recent data

A determined administration that is prepared to take on the vested interests could do worse than pursue the following 10-point plan:

  • 1. Develop a new approach to specifying universities’ fair access targets for young full-time undergraduate students. This would require all institutions meeting the BIS ‘most selective HEI’ criteria to pursue two universal measures and no more than two measures of their own devising, so creating a basket of no more than four measures. Independent versus state representation could be addressed as one of the two additional measures.
  • 2. The universal measures should relate explicitly to students achieving a specified A level threshold that has currency at these most selective HEIs. It could be pitched at the equivalent of ABB at A level, for example. The measures should comprise:
    • A progression measure for all learners eligible for the Pupil Premium in Year 11 of their secondary education (so a broader measure than FSM eligibility); and
    • A progression measure for all learners – whether or not formerly eligible for the Pupil Premium – attending a state-funded sixth form or college with a relatively poor historical record of securing places for their learners at such HEIs. This measure would be nationally defined and standardised across all institutions other than Oxbridge.
  • 3. In the case of Oxford and Cambridge the relevant A level tariff would be set higher, say at the equivalent of AAA grades at A level, and the nationally defined ‘relatively poor historical record’ would reflect only Oxbridge admission.
  • 4. These two universal measures would be imposed on institutions through the new National Strategy for Access and Student Success. All institutions would be required to set challenging but realistic annual targets. There would be substantial financial incentives for institutions achieving their targets and significant financial penalties for institutions that fail to achieve them.
  • 5. The two universal measures would be embedded in the national Social Mobility Indicators and the KPIs of BIS and DfE respectively.
  • 6. Central Government would annually publish data setting out:
    • The number and percentage of formerly Pupil Premium-eligible learners achieving the specified A level thresholds for selective universities and Oxbridge respectively.
    • A ‘league table’ of the schools and colleges with relatively poor progression to selective universities and Oxbridge respectively.
    • A ‘league table’ of the universities with relatively poor records of recruitment from these schools and colleges.
    • A time series showing the numbers of students and percentage of their intake drawn from these two populations by selective universities and Oxbridge respectively each year. This should cover both applications and admissions.
  • 7. All parties would agree new protocols for data sharing and transparency, including tracking learners through unique identifiers across the boundaries between school and post-16 and school/college and higher education, so ensuring that the timelag in the publication of this data is minimal.
  • 8. Universities fiercely defend their right to determine their own undergraduate admissions without interference from the centre, which makes the business of driving national improvement much more difficult than it should be. But, given the signal lack of progress at the top end of the attainment distribution, there are strong grounds for agreement to override this autonomy in the special case of high-achieving disadvantaged students. A new National Scholarship Scheme should be introduced to support learners formerly in receipt of the Pupil Premium who go on to achieve the Oxbridge A level tariff:
    • Oxford and Cambridge should set aside 5% additional places per year (i.e. on top of their existing complement), reserved exclusively for such students. On the basis of 2012 admissions figures, this would amount to almost exactly 250 places for England, divided approximately equally between the two institutions (the scheme could be for England only or UK-wide). This would provide sufficient places for approximately 45% of those FSM learners currently achieving 3+ A*/A grades.
    • All eligible students with predicted grades at or above the tariff would be eligible to apply for one of these scholarship places. Admission decisions would be for the relevant university except that – should the full allocation not be taken up by those deemed suitable for admission who go on to achieve the requisite grades – the balance would be made available to the next best applicants until the quota of places at each university is filled.
    • The Government would pay a premium fee set 50% above the going rate (so £4,500 per student per annum currently) for each National Scholarship student admitted to Oxbridge. However, the relevant University would be penalised the full fee plus the premium (so £13,500 per student per year) should the student fail to complete their undergraduate degree with a 2.2 or better. Penalties would be offset against the costs of running the scheme. Assuming fees remain unchanged and 100% of students graduate with a 2.2 or better, this would cost the Government £1.125m pa.
  • 9. In addition, the Government would support the establishment of a National Framework Programme covering Years 9-13, along the lines set out in my November 2010 post on this topic with the explicit aim of increasing the number of Pupil Premium-eligible learners who achieve these tariffs. The budget could be drawn in broadly equal proportions from Pupil Premium/16-19 bursary, a matched topslice from universities’ outreach expenditure and a matched sum from the Government. If the programme supported 2,500 learners a year to the tune of £2,500 per year, the total steady state cost would be slightly over £30m, approximately £10m of which would be new money (though even this could be topsliced from the overall Pupil Premium budget).
  • 10. The impact of this plan would be carefully monitored and evaluated, and adjusted as appropriate to maximise the likelihood of success. It would be a condition of funding that all selective universities would continue to comply with the plan.
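For what it is worth, the cost figures in points 8 and 9 can be sanity-checked with some back-of-envelope arithmetic. The sketch below assumes the current £9,000 headline tuition fee (consistent with the £4,500 premium and £13,500 penalty quoted in point 8) and reads point 9 as supporting 2,500 learners in each of the five year groups from Year 9 to Year 13 simultaneously; the latter interpretation is mine rather than something the post states explicitly.

```python
# Rough check of the cost arithmetic in points 8 and 9. The £9,000 base
# fee and the five-year-group reading of point 9 are assumptions on my
# part, not figures stated explicitly in the post.

# Point 8: National Scholarship Scheme
base_fee = 9_000                    # assumed standard annual tuition fee (£)
premium = base_fee * 0.5            # 50% above the going rate -> £4,500
places = 250                        # additional Oxbridge places per year
annual_cost = places * premium      # Government cost for one cohort
penalty = base_fee + premium        # clawback per non-completing student

print(premium)      # 4500.0 -> the post's £4,500 premium fee
print(annual_cost)  # 1125000.0 -> the post's £1.125m pa
print(penalty)      # 13500.0 -> the post's £13,500 penalty

# Point 9: National Framework Programme, Years 9-13
learners = 2_500                    # learners supported in each year group
unit_cost = 2_500                   # £ per learner per year
year_groups = 5                     # Years 9, 10, 11, 12 and 13
steady_state = learners * unit_cost * year_groups

print(steady_state)                 # 31250000 -> "slightly over £30m"
print(round(steady_state / 3))      # 10416667 -> the "approximately £10m" share
```

Note that the £1.125m pa figure covers a single cohort; if the premium were payable for each year of a typical three-year degree, the steady-state cost to Government would be nearer three times that.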

Do I honestly believe anything of this kind will ever happen?

.

flying pig capture

.

GP

November 2013