How strong is Oxbridge access?

.

This post assesses how well Oxford and Cambridge Universities support fair access for students from disadvantaged backgrounds attending state-funded schools and colleges.

courtesy of Wellcome Images

It sets out an evidence base to inform and support an Access Lecture I have been asked to give at Brasenose College, Oxford on 28 April 2015.

The outline for that Lecture is as follows:

‘If national efforts:

  • by state-funded schools and colleges to close high attainment gaps between learners from advantaged and disadvantaged backgrounds
  • by selective higher education institutions to secure fair access for students from disadvantaged backgrounds

could be integrated more effectively, much more substantial progress could be achieved on both fronts.

There is scope for reform in both sectors, to ensure a closer fit between the ‘push’ from schools and colleges and the ‘pull’ from higher education.

Faster progress will be achieved through a national framework that brings greater coherence to the market on both the demand and supply sides. It should be feasible to focus all support directly on learners, regardless of their educational setting.

Oxford and Cambridge should position themselves at the forefront of such efforts, serving as beacons of excellence and exemplary practice.’

This is a companion piece to two previous posts.

The first of these explores the issue from first principles, considering measures, targets and data before outlining a 10-point improvement plan. The second advances a simplified version of this plan.

This post concentrates principally on describing the access-related activities of the two universities, placing them in the wider context of updated material about national policy developments and the relatively disappointing outcomes achieved to date.

It is organised into five main sections:

  • A review of key changes to the national access effort since November 2013.
  • A note on outcomes, which questions whether Oxbridge reflects the positive trends reported for selective higher education as a whole.
  • In-depth analysis of how fair access work has developed at Oxford and Cambridge respectively, as revealed by their successive access agreements.
  • Analysis of signature access programmes at Oxford and Cambridge, featuring their rival residential summer schools and efforts to develop a longer term relationship with disadvantaged students, as recommended by Offa.
  • My personal assessment of strengths and areas for development, including a slightly revised version of the improvement strategy I have proposed in earlier posts.

Given the length of the post, I have inserted page jumps to each section.

.

Recent developments in national fair access policy

My November 2013 post supplies considerable detail about the regulation of fair access to English universities which I shall not repeat here.

Amongst other things, it deals with published data on high attainment by disadvantaged students and their progression to Oxbridge – and how this has not always been used appropriately.

This section describes briefly the principal changes to the national fair access mechanisms introduced by and subsequent to the National Strategy – and explains how access agreements fit into these mechanisms.

.

National Strategy for Access and Student Success

The National Strategy sets out a ‘student lifecycle approach’ in which access forms the first of three main stages.

It seeks to address:

‘…the wide gap in participation rates between people from advantaged and disadvantaged backgrounds in society, and between students with different characteristics, particularly at the most selective institutions.’

There are six key actions:

  • Introduce a national approach to collaborative outreach that will foster new collaborative partnerships, reduce duplication and support the tracking of students who have undertaken outreach activities. Hefce will fund the national roll-out of a tracking system.
  • Secure a more coherent approach to the provision of information, advice and guidance. HE outreach activity and schools policy will be ‘joined up’.
  • Develop a national evaluation framework, so universities can evaluate their activities more effectively and provide comparable national data. Hefce and Offa will examine the feasibility of sector-wide evaluation measures and publish good practice guidance by January 2015.
  • Co-ordinate national research into access, build the evidence base for effective outreach and share good practice.
  • Introduce a joint Hefce-Offa approach to requesting information from institutions and
  • Encourage institutions to re-balance their funding from financial support towards outreach and collaborative outreach.

The new national approach to collaborative outreach will be derived from a set of principles:

  • ‘Outreach is most effective when delivered as a progressive, sustained programme of activity and engagement over time.
  • Outreach programmes need to be directed towards young people at different stages of their educational career and begin at primary level.
  • The effective delivery of outreach programmes requires the full, adequately resourced involvement and engagement of HEIs, FECs and schools.
  • The collaborative provision of outreach delivers significant benefits in terms of scale, engagement, co-ordination and impartiality.
  • Progression pathways for learners with non-traditional or vocational qualifications need to be clearly articulated.
  • Outreach to mature learners depends on good links with FECs, employers and the community.
  • Without good advice and guidance, outreach is impoverished and less effective.’

In November 2013, institutions were advised that they would be expected to prepare their own Strategies for Access and Student Success (SASS), which would replace Offa’s access agreements and Hefce’s widening participation strategic statements.

These would cover the period 2014-19, incorporating the information and commitments that would otherwise have featured in 2015-16 access agreements. In future these arrangements would be updated each spring. Full guidance was promised by late January 2014.

However, further guidance was issued in February 2014 stating that separate returns would continue because:

‘…of the Department for Business, Innovation and Skills’ unexpected delay in sending HEFCE’s grant letter, and because we appreciate that institutions need to make progress with their access and student success plans, which must be approved by the Director of Fair Access to Higher Education. Separating our information requirements is the most pragmatic approach at this time.’

Hefce now says:

‘We are no longer requesting widening participation strategic statements from institutions and are moving towards an outcomes framework for 2014-15 onwards.’

It appears that the SASS concept has been set aside permanently. Certainly Offa’s 2016-17 guidance (February 2015) envisages the continuation of separate access agreements, although there is now a single monitoring return to Offa and Hefce.

Initiatives prompted by the National Strategy

The outcomes framework will be informed by two research projects, one developing a data return, the other designed to establish how an outcomes framework ‘could lead us to understand the relative impact of a wider range of access and student success activities and expenditure’.

As far as I can establish, there has been nothing further on evaluation. Hefce’s website mentions guidance, but the link is to material published in 2010.

However, the current work programme does include rolling out a Higher Education Access Tracker (HEAT) which helps universities track outreach participants through to HE entry. Hefce is funding this to the tune of £3m over 2014-17, but institutions must also pay a subscription – and only 21 are currently signed up.

The strategy is also establishing National Networks for Collaborative Outreach (NNCOs) which, it is claimed:

‘will deliver a nationally coordinated approach to working with schools, universities and colleges to help people access HE’.

In fact, the purpose of the networks is almost exclusively the provision of information.

They will supply a single point of contact providing information for teachers and advisers about outreach activity in their area, as well as general advice about progression to HE. They will undertake this through websites to be available ‘in early spring 2015’.

At the time of writing, Hefce’s website merely lists the institutions participating in each network – there are no links to live websites for any of these.

There is a budget of £22m for the networks over academic years 2014/15 and 2015/16. Each network receives £120,000 per year and there is also a small additional allocation for each institution.

Three of the networks have national reach, one of them supporting students wishing to progress to Oxbridge. This is called the Oxford and Cambridge Collaborative Network. Oxford is the lead institution.

A Google search confirms that the network has no web presence at the time of writing. However, Oxford’s press release says:

‘Oxford will lead the Oxford and Cambridge NNCO, which will aim to offer specific support to students hoping to study at Oxford and Cambridge by reaching out to students and teachers in more than 1,600 schools across England. The collaboration will build on the current information and advice already offered to students and teachers, but enhanced by activities including a new interactive website, online webinars with admissions staff from Oxford and Cambridge, and more resources for activities in local schools linked to Oxford and Cambridge colleges….

… Online webinars with admissions staff from both universities will make it easier to make contact with students and schools from hard to reach geographic areas, and those schools with limited numbers of high-achieving students each year.

The new network will aim to work with state schools across England with particular emphasis on those in areas that currently have little engagement with Oxford and Cambridge outreach; those in schools offering post-16 (GCSE) education; those from schools with low progression to Oxford or Cambridge, or from areas of socioeconomic disadvantage.’

Offa guidance and strategic plan

Offa’s latest access agreement guidance (for 2016-17 agreements) sets out future priorities that are consistent with the national strategy. These include:

  • Greater emphasis on long-term outreach: ‘Evidence suggests that targeted, long-term outreach which boosts achievement and aspirations among disadvantaged people is a more effective way of widening access than institutional financial support. Where appropriate, you should therefore consider how you can strengthen your work to raise the aspiration and attainment of potential students of all ages, from primary school pupils through to adults.’
  • More effective collaboration: ‘Collaboration between institutions providing outreach is not limited to alliances of higher education institutions (HEIs). We would normally expect collaborative outreach to include many stakeholders rather than to be between a single HEI and schools, colleges or other stakeholders receiving outreach. For example, collaboration may be between one HEI and further education colleges (FECs), other higher education providers, employers, third sector organisations, schools, colleges, training providers, local authorities and so on.’
  • Stretching targets for achieving faster progress: ‘we now ask you to review and set new stretching targets which set out the desired outcomes of the work set out in your access agreement. When reviewing your targets, we expect all institutions, particularly those with relatively low proportions of students from under-represented groups, to demonstrate how they intend to make faster progress in improving access, success and/or progression for these students. This is in line with the aims expressed in our forthcoming strategic plan, which is informed by guidance from Ministers.’

This strategic plan was published in February 2015. It notes that, while some progress has been made in improving access for disadvantaged students to selective higher education, there is much more still to do.

‘Despite these improvements, the gaps between the most advantaged and most disadvantaged people remain unacceptably large. The latest UCAS data shows that, on average, the most advantaged 20 per cent of young people are 2.5 times more likely to go to higher education than the most disadvantaged 20 per cent. At the most selective institutions this ratio increases – with the most advantaged young people on average 6.8 times more likely to attend one of these institutions compared to the most disadvantaged young people.’

One of Offa’s targets (described as ‘Sector Outcome Objectives’) is:

‘To make faster progress to increase the entry rate of students from underrepresented and disadvantaged groups entering more selective institutions, and narrow the participation gap between people from the most and least advantaged backgrounds at such institutions.’

The measure selected is English 18-year-old entry rates by POLAR 2 for higher tariff providers. The targets are:

‘….for the entry rate from quintile 1 to increase from 3.2 per cent in 2014-15 to 5 per cent by 2019-20, and from 5.1 per cent in 2014-15 to 7 per cent by 2019-20 for quintile 2. To reduce the gap in participation, our target is for the quintile 5: quintile 1 ratio to decrease from 6.8 in 2014-15 to 5.0 by 2019-20.’
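Read together, the two targets also imply something about the advantaged end of the distribution. Here is a back-of-envelope sketch (my own arithmetic from the figures quoted above, not Offa’s, and it assumes the quoted ratio is a simple ratio of entry rates):

```python
# Offa targets: English 18-year-old entry rates to higher-tariff providers (POLAR quintiles).
q1_2014, q1_2020 = 3.2, 5.0        # quintile 1 entry rate, per cent
ratio_2014, ratio_2020 = 6.8, 5.0  # quintile 5 : quintile 1 ratio

q5_2014 = q1_2014 * ratio_2014     # implied quintile 5 rate, ~21.8%
q5_2020 = q1_2020 * ratio_2020     # implied quintile 5 rate, 25.0%
print(f"Implied Q5 entry rate: {q5_2014:.1f}% (2014-15) -> {q5_2020:.1f}% (2019-20)")
```

On these assumptions, the gap narrows because quintile 1 is expected to grow faster than quintile 5, not because entry from advantaged backgrounds is expected to fall.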

.

A Note on Outcomes

High tariff HEIs

As Offa suggests, there is some cause for optimism about wider progress towards fair access, albeit from an exceedingly low base.

The UCAS End of Cycle Report 2014 indicates that:

  • Students from POLAR Quintile 1 are 40% more likely to enter a high-tariff institution than in 2011, though the percentage achieving this is still tiny (it has increased from 2.3% to 3.2%).
  • FSM-eligible students (the Report doesn’t indicate whether they were ‘ever 6 FSM’ or FSM in Year 11) are 50% more likely to enter a higher tariff institution than in 2011, but the 2014 success rate is still only 2.1%.

As noted above, Offa’s Strategic Plan for 2015-20 includes a target to increase the POLAR Quintile 1 success rate from 3.2% to 5% by 2019-20.

This is an increase of 56% in the six years between 2014 and 2020, compared with an increase of 40% in the three years between 2011 and 2014. Looked at in this way, it is relatively unambitious.
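For transparency, the arithmetic behind these two percentages (quintile 1 entry rates of 2.3% in 2011, 3.2% in 2014 and the 5% target for 2019-20) is simply:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

print(f"2011 to 2014: {pct_increase(2.3, 3.2):.0f}%")  # ~39%, reported above as 40%
print(f"2014 to 2020: {pct_increase(3.2, 5.0):.0f}%")  # ~56%
```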

But what of Oxbridge? How does its performance compare with other high-tariff institutions?

Oxbridge argues that it is a special case – because of its higher entrance requirements – so should not be judged by the same criteria as other high tariff institutions. It is for this reason that Oxford and Cambridge are reluctant to be assessed against HESA’s Performance Indicators.

Offa’s access agreement methodology enables universities to set targets that reflect their different circumstances, but its own KPIs are framed according to national measures which might not be appropriate to some.

There is no separate Offa target to improve Oxbridge access. When it comes to system-wide performance measures, only DfE’s Impact Indicator 12: Percentage of children on free school meals progressing to Oxford or Cambridge University is specific to Oxbridge.

This is based on the DfE’s experimental Destination Measures statistics. FSM eligibility is determined in Year 11 rather than via the ‘ever 6’ methodology.

The Indicator reports an increase from 0.1% in 2010/11 to 0.2% in 2011/12. (This compares with a reported increase in FSM progression to Russell Group universities from 3.0% to 4.0%.)

However, as I have pointed out:

  • The 2010/11 intake was 30 and the 2011/12 intake 50.
  • The 2011/12 intake comprised 40 students from state-funded schools and 10 from state-funded colleges, but both numbers are rounded to the nearest 10.
  • The 2012/13 intake, not yet incorporated into the Indicator, is unchanged from 2010/11, both numbers again rounded to the nearest 10, so any improvement achieved in 2011/12 stalled completely in 2012/13.

The most recent data reported to Offa by Oxford and Cambridge also relates to 2012/13.

.

Cambridge

Cambridge uses the POLAR Quintile 1 measure, also a HESA benchmark, though adjusted downwards to reflect its high attainment threshold. It is aiming for a target of 4.0% by 2016/17, against a 2009/10 baseline of 3.1%.

The 2011/12 outcome is given as 2.5%. The 2012/13 line is blank, on the grounds that HESA has not yet reported it. We can now see that the outcome was in fact 3.5% (POLAR2) – a significant improvement, more than recovering the previous year’s decline. HESA has recently published the 2013/14 outcome, which is 3.6%, a very slight improvement on the previous year.

HESA’s own benchmarks for Cambridge (again POLAR2) were 4.4% in 2011/12, 4.7% in 2012/13 and 4.6% in 2013/14, so it continues to undershoot these quite significantly.

In its latest 2015-16 agreement, Cambridge’s 2017/18 target is unchanged at 4.0% (but now transferred to POLAR3 quintile 1). It has not set a target for 2018/19.

Given Offa’s commitment to achieving a 5.0% outcome by 2019/20, it will be interesting to see where Cambridge pitches its own target in its 2016-17 access agreement. Will it, too, aim for 5%, or will it scale back its own target on the grounds that the attainment profile of its intake is atypically high?

.

Oxford

Oxford opts for a different approach. It reports outcomes only for POLAR Quintiles 1 and 2 combined – which is insufficiently specific – and instead uses a measure based on ACORN postcode analysis as its principal indicator of access for disadvantaged students.

On the ACORN measure, it reports a target of 9.0% by 2016/17 against a 2009/10 baseline of 6.1% and, more recently, has projected this forward to 10% by 2018/19.

The 2011/12 outcome is 7.6% and the 2012/13 outcome is 6.7%. This fall of 0.9 percentage points is annotated ‘Progress made – but less than anticipated’.

If we were to apply the POLAR2 HESA Quintile 1 measure to Oxford, it would have registered 2.6% in 2011/12 (against a HESA benchmark of 4.7%), 3.0% in 2012/13 (against a benchmark of 4.9%) and only 2.4% in 2013/14 (against a benchmark of 4.8%).

The reason is presumably the atypically high attainment threshold for admission to Oxford.

Oxford does not have the benefit of an Offa marker against which to pitch its ACORN target for 2019-20.

Comparing Oxford and Cambridge

Graph 1, below, illustrates progress against each university’s principal measure of fair access, as well as the trend implied by its targets.

.


Graph 1: Oxford and Cambridge: Progress against principal fair access target and projected outcomes for future years

The graph shows inconsistent progress to 2012/13. Oxford’s trend is broadly positive, but Cambridge has not yet caught up where it was in 2008/09. The trajectory implied by Oxford’s targets is more ambitious than Cambridge’s.

Graph 2, below, provides further analysis of Oxford’s outcomes, based on data provided in the most recent 2015-16 access agreement. Unfortunately, Cambridge is less transparent in this respect.

Graph 2 shows the same pattern of progress against the ACORN target as in Graph 1, except that the 2013 figure is an actual outcome (6.8%) rather than a target (7.5%).

It also shows for each year the percentage of all applicants from ACORN 4/5 postcodes who applied successfully. These compare with a success rate for all applicants of around 20%, giving a gap of three or four percentage points to make up. Progress on this measure has also fluctuated, falling back significantly in 2010 and not yet returning to the mark achieved in 2009.

Preliminary data for 2014 suggests a marked improvement, however. The agreement says that 320 conditional offers have been made, giving an estimated figure for acceptances of 275 (my estimate, not Oxford’s) and a corresponding success rate of 19.2%. If confirmed, this will be a significant step forward.
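To make the estimate explicit, the implied arithmetic runs roughly as follows. The 86% offer-to-acceptance conversion rate is my assumption, chosen to be consistent with the figures above rather than anything Oxford publishes, and the applicant pool is simply what the 19.2% success rate implies:

```python
offers = 320                    # conditional offers to ACORN 4/5 applicants (2015-16 agreement)
conversion = 0.86               # assumed offer-to-acceptance rate (my assumption, not Oxford's)
acceptances = round(offers * conversion)   # ~275

success_rate = 0.192            # 19.2%, as quoted above (acceptances over applicants)
implied_applicants = round(acceptances / success_rate)
print(acceptances, implied_applicants)     # 275 and roughly 1,430 ACORN 4/5 applicants
```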

.


Graph 2:  The percentage of all successful applicants drawn from ACORN 4/5 postcodes and the percentage of all applicants from ACORN 4/5 postcodes who are successful, 2008-2013

.

Graph 3, below, is derived from the data underpinning the DfE’s experimental KS5 destination statistics for 2010/11, 2011/12 and 2012/13. It provides, for each year, the percentage of admissions to Oxbridge, Russell Group, Top Third and all HEIs accounted for by FSM students.

.

Graph 3: Percentage of admissions to Oxbridge, Russell Group, Top Third and all HEIs accounted for by FSM students, 2010/11 to 2012/13 (from DfE destination statistics, underlying data)

.

The Oxbridge data, especially, must be treated with a degree of caution, since all figures are derived from separate totals for state-funded schools and colleges, each rounded to the nearest 10. Consequently, changes from year to year may be inflated or deflated by the generous rounding.

Nevertheless, one can see that FSM admission to Oxbridge continues to lag well behind the rates for admission to selective higher education more generally. Although one might argue that Oxbridge is improving at a faster rate, it is doing so from a significantly lower base and, in the most recent year (2012/13), the improvement in all other respects is not mirrored in the Oxbridge figures.

Although the rounded number of FSM admissions to Oxbridge in 2012/13 remained unchanged from 2011/12 (at 50), the number of non-FSM admissions increased by 190, so dragging down the percentage.
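To illustrate the scale of the problem: the published figure of 50 is the sum of two components (40 from state-funded schools and 10 from state-funded colleges), each independently rounded to the nearest 10. A minimal sketch of the resulting uncertainty:

```python
# True values consistent with each component rounded to the nearest 10.
schools = (35, 44)    # published as 40
colleges = (5, 14)    # published as 10

low = schools[0] + colleges[0]     # 40
high = schools[1] + colleges[1]    # 58
print(f"True FSM admissions lie somewhere between {low} and {high}")
```

A possible spread of nearly 20 on a headline figure of 50 is wide enough to mask – or manufacture – the kind of year-on-year change under discussion here.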

To summarise:

  • There is an unhelpful two-year lag in outcomes data and limited commonality in the basis of the measures used to set targets, making comparison much more difficult than it needs to be.
  • Neither university routinely releases details of the number of FSM or ‘ever 6’ FSM students within its intake, but DfE destinations data, also affected by a two-year lag, shows that FSM admissions to Oxbridge are significantly lower than to selective HE more generally. The actual number of FSM students admitted has been more or less stalled at 50 or fewer for a decade.
  • Fair access to Oxbridge is improving slightly, but not consistently. Cambridge has not yet caught up where it was in 2008/09. Oxford’s progress is more secure than Cambridge’s, and Oxford’s target is more challenging.

Access Agreements

Access agreements are approved annually by Offa, which issues annual guidance to inform the review process.

It looks particularly at the nature of the access measures adopted, the resources allocated and whether targets and milestones are suitably challenging.

Offa archives old access agreements on its website as well as universities’ self-assessments. The latter should:

  • ‘assess their progress against each target they set themselves in their agreements
  • provide data showing their progress against targets for each academic year since 2006-07 and
  • provide a commentary setting their access work in context, highlighting any particular challenges they have faced, and, if they have not made the progress they wished, explaining the reasons for this.’

The archive includes:

  • Access agreements for Oxford and Cambridge for 2006-07 through to 2015-16 and
  • Self-assessments for Oxford and Cambridge for 2010-11 through to 2012-13

Self-assessments for 2013-14 were due during January 2015 but have not yet been published. In previous years they have not appeared until July.

Access agreements for 2016-17 are due for submission during April 2015. They too are unlikely to appear before July.

Analysis of how access agreements have changed over time provides a valuable insight into the evolution of institutional policies, including the extent to which these have been modified in line with Offa’s guidance.

Comparison between Oxford and Cambridge’s access agreements also helps to draw out key differences between their respective access policies, as well as comparative strengths and weaknesses and areas in which they might potentially learn from each other.

The sections below explore the chronological development of each university’s access agreement under four headings:

  • Budget: The total budget devoted to activity within scope of the agreement, and the balance between funding for bursaries and outreach respectively
  • Bursaries: The bursaries provided to students from the most disadvantaged backgrounds
  • Outreach: The range of activities undertaken 
  • Targets: The targets and milestones set and progress against those not already discussed above.

I have also included a section of Commentary, intended to capture observations that throw additional light on the institution’s approach and attitude to access.

It is important to note that the two universities now adopt a somewhat different approach to the nature of access agreements.

The agreements for 2006-07 were nine (Oxford) and eight (Cambridge) pages in length. Cambridge’s 2015-16 agreement is slightly longer, at 11 pages, but Oxford’s is 48 pages long.

In recent years, Oxford’s agreement has consistently been much more detailed and more informative. This distinction will be apparent from the analysis below.

Moreover, Cambridge’s agreement was unchanged from 2006-07 to 2009-10, whereas Oxford’s changed somewhat in this period. Both universities submitted a single agreement covering 2010-11 and 2011-12, but both have changed their agreements – at least to some degree – each year since then.

.

Budget (£m pa)

Costs are not always as clearly expressed as one would wish, nor are they always fully comparable. This is despite the fact that Offa now produces a template for the purpose.

There is very limited information in Cambridge’s most recent agreement, whereas Oxford supplies extensive detail, including (at Offa’s behest) what is and is not ‘Offa-countable’:

‘When calculating your progression spend, please note that OFFA’s remit only extends to students and courses that are fee-regulated. This means that only measures targeted at undergraduate students (or postgraduate ITT students) from under-represented and disadvantaged groups should be included in your OFFA-countable spend. For example, you should not include spend on financial support for postgraduate students in your OFFA-countable expenditure, although you may include this in your total expenditure on progression.’ (Offa, 2015-16 Resource Plan)

The tables below represent my best effort at harvesting comparable figures. The first table summarises Cambridge’s budget, the second Oxford’s.

Year Bursaries Outreach Total Notes
2006-10 £7.0m £1.15m £8.15m Bursary cost in steady state; £0.425m of outreach budget from Aimhigher and Hefce funds.
2010-12 £7.5m £1.15m £8.65m Bursary cost in steady state; £0.45m of outreach budget from Aimhigher and Hefce funds.
2012-13 £8.3m £4.2m £12.5m Bursary cost in steady state and includes £1.2m steady state assumption for NSP; total outreach cost includes £2.7m current expenditure plus £1.5m from fee income.
2013-14 £8.3m £4.2m £12.5m As above.
2014-15 £8.0m £4.66m £12.66m Bursary cost in steady state and includes £0.9m for NSP; total outreach cost includes £2.7m current expenditure plus £1.96m fee income (of which £0.258m is redirected from NSP).
2015-16 £6.9m £3.0m £9.9m Bursary cost in steady state; total outreach cost includes unspecified fee income.

Table 1: Summary of costs in Cambridge’s access agreements, 2006-2016

.

Year Bursaries Outreach Total Notes
2006-07 £6.8m £1.35m £8.15m Bursary cost in steady state; an additional £3m is provided through college support.
2007-08 £6.8m £1.35m £8.15m Bursary cost in steady state; an additional £3m is provided through college support.
2008-09 £6.3m £1.075m £7.375m Bursary cost in steady state.
2009-10 £6.4m £0.968m £7.368m Bursary cost in steady state.
2010-11 £6.4m £0.968m £7.368m Bursary cost in steady state.
2011-12 £6.6m £1.415m £8.015m
2012-13 £8.8m £2.6m £11.65m Bursary total includes £2.2m for tuition fee waivers; plus additional £0.25m for retention, support and employability; includes NSP allocation of £0.4m.
2013-14 £9.4m (£9.4m) £4.52m (£2.44m) £13.92m Bursary total includes £2.9m for tuition fee waivers; plus additional £0.41m for retention, support and employability; includes NSP allocation of £0.79m.
2014-15 £11.32m (£11.05m) £5.23m (£2.92m) £16.55m Bursary includes £4.06m for tuition fee waivers; plus additional £0.54m for retention, support and employability; includes NSP allocation of £0.34m.
2015-16 £10.89m (£10.6m) £5.67m (£3.24m) £16.56m Bursary includes £3.63m for tuition fee waivers; plus additional £0.71m for retention, support and employability; of the total, only £13.81m is ‘Offa-countable’.

Figures in brackets are ‘Offa-countable’.

Table 2: Summary of costs in Oxford’s access agreements, 2006-2016

These suggest that:

  • Total combined expenditure in 2006-07 was £16.3m, but by 2015-16 this had increased to £23.74m (excluding Oxford’s non-Offa-countable expenditure), an increase of around 46%.
  • Whereas in 2006-07 both universities were spending exactly the same, by 2015-16 total expenditure at Cambridge had increased by some 21%, while total Offa-countable expenditure at Oxford had increased by about 70%.
  • In 2006-07, the percentage of total funding spent on bursaries was 86% at Cambridge and 83% at Oxford. By 2015-16, the comparable percentages are 70% and 77%. Hence Cambridge has reduced the proportion spent on bursaries more substantially than Oxford, but both universities continue to direct their funding predominantly towards bursaries.
  • In 2006-07, expenditure on bursaries by each university was very similar. Although the total devoted to bursaries by Cambridge increased slightly in the intervening years, by 2015-16 it was almost the same as in 2006-07. However, expenditure on bursaries at Oxford is some 56% above what it was in 2006-07.
  • Since 2006-07, both Oxford and Cambridge have more than doubled their expenditure on outreach. Taken together, the two universities expect to spend some £6.24m on outreach in 2015-16. Cambridge’s ratio of bursary to outreach spend is approaching 2:1, whereas Oxford’s is more than 3:1.
  • Although the sums they now spend on outreach (Offa-countable in Oxford’s case) are relatively similar, Cambridge spends 30% of its total expenditure on outreach while Oxford spends 23%. However, Cambridge spends significantly less than it did at its peak in 2014-15, while Oxford’s expenditure has increased steadily since 2010-11.
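These percentages can be checked directly against the headline figures in Tables 1 and 2. A quick re-computation of my own (note that Oxford’s 2015-16 Offa-countable components sum to £13.84m rather than the published £13.81m, presumably because of rounding in the components):

```python
# (bursaries, outreach) in £m for 2006-07 and 2015-16, from Tables 1 and 2.
# Oxford's 2015-16 figures are the Offa-countable ones.
spend = {
    "Cambridge": {"2006-07": (7.0, 1.15), "2015-16": (6.9, 3.0)},
    "Oxford":    {"2006-07": (6.8, 1.35), "2015-16": (10.6, 3.24)},
}

for uni, years in spend.items():
    b06, o06 = years["2006-07"]
    b15, o15 = years["2015-16"]
    t06, t15 = b06 + o06, b15 + o15
    print(f"{uni}: total £{t06:.2f}m -> £{t15:.2f}m (+{(t15 - t06) / t06:.0%}); "
          f"bursary share {b06 / t06:.0%} -> {b15 / t15:.0%}")
```

This reproduces the figures in the bullets above: Cambridge up 21% with a bursary share falling from 86% to 70%; Oxford up 70% with a bursary share falling from 83% to 77%.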

Bursaries

Bursary arrangements have shifted subtly, especially as NSP fee waivers have arrived and then disappeared. The details below relate only to the most generous bursary rates for students with the lowest residual household incomes.

Cambridge’s access agreements suggest that:

  • For 2006-10 Cambridge’s bursary offer for students eligible for a full maintenance grant – with a residual household income of £16,000 or below – is £3,000 per year. It estimates that some 10% of its full fee-paying undergraduates – around 955 students – will qualify.
  • For 2010-12 the maximum bursary is £3,400 for all students qualifying for a full maintenance grant – now equivalent to a residual household income of £25,000 or below – and about 1,100 students (13% of Cambridge’s UK undergraduates) will qualify.
  • For 2012-13 the maximum bursary is £3,500 for those with a full maintenance grant. There is an additional fee waiver of £6,000 in the first year of study for such students who are also from ‘particularly disadvantaged backgrounds’ including those formerly in receipt of FSM. (The University points out that these are the Government’s criteria).
  • For 2013-14 the same arrangements apply.
  • For 2014-15 the same arrangements apply, except that recipients can no longer allocate part of their bursary towards an additional fee waiver.
  • For 2015-16 only the bursary of £3,500 remains in place for those with a full maintenance grant.

Oxford’s access agreements reveal that:

  • In 2006-07, students whose residual household income is below £17,500 receive a bursary of £3,000 per year, plus an additional £1,000 in the first year of the course. About 1,200 students are expected to benefit.
  • From 2007-08, these rates increase to £3,070 and £1,025 extra in the first year.
  • From 2008-09, new entrants with a residual household income below £25,000 receive a bursary of £3,150, but all those with an income below £18,000 will receive an extra £850 in the first year of their course.
  • In 2009-10, these rates increase to £3,225 and £875 respectively. This is unchanged for 2010-11.
  • In 2012-13, students with a residual household income below £16,000 a year will receive a bursary of £3,300 per year, plus a tuition fee waiver of £5,500 in the first year of the course and £3,000 in subsequent years.
  • In 2013-14, these arrangements are unchanged.
  • In 2014-15, the bursary rate remains at £3,300, but the fee waiver is reduced to £3,000 a year.
  • In 2015-16, the bursary rate increases substantially to £4,500 per year. A more select group of Moritz-Heyman scholars (with residual income below £16,000 but also ‘flagged on a number of contextual data disadvantage indicators’) also receive an annual tuition fee waiver of £3,000.

In more recent agreements, Cambridge’s maximum rate of bursary is available for all students below a residual income of £25,000, whereas at Oxford it is confined to students with a residual income of less than £16,000.

Hence Cambridge is comparatively more generous to students with a residual income above £16,000 but below £25,000.

Until 2015-16, the maximum bursary rates were broadly similar, but Oxford has now added a significant increase, offering £1,000 more than Cambridge. Moreover, a fee waiver remains in place for the most disadvantaged students.

Hence Oxford is now more generous to students with a residual income below £16,000. Oxford argues:

‘The University will be monitoring the level of students from households with income of less than £16,000. It is considered that these are the most financially disadvantaged in society, and it is below this threshold that some qualify for receipt of free school meals, and consideration for the proposed pupil premium. The University does not consider that identifying simply those students who have actually been in receipt of free school meals provides a suitably robust indicator of disadvantage as they are not available in every school or college with post-16 provision, nor does every eligible student choose to receive them.’

The 2014-15 agreement states that 30% of 2012 entrants in receipt of the full bursary – and so with a household income of £16,000 or less – were educated in the independent sector. These students would of course be ineligible for FSM and pupil premium.

The 2015-16 agreement adds that roughly 10% of Home/EU full time undergraduates would qualify for such a bursary. This is supported by the University’s published admissions statistics for 2013, which give the percentage as 9.9% and the number of students as 297.

In 2013, we know that 2,510 admissions were from England, so we can estimate the number of English full bursary holders at approximately 250, of whom some 175 were educated in the maintained sector.

But DfE’s destination indicators suggest that only some 25 of these were FSM-eligible.

And other DfE research suggests that only some 14% of students entitled to FSM are not claiming (though that rises to 22% for 15 year-olds).

Taking the latter figure, one might conclude that roughly 30 of the 175 were FSM-eligible or non-claimants, so what of the remaining 145 (some 83%)?
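Setting the chain of estimation out explicitly (every input is a rounded published figure, so the outputs are necessarily rough):

```python
english_admissions = 2510   # Oxford admissions from England, 2013
full_bursary_rate = 0.099   # 9.9% of Home/EU undergraduates hold a full bursary
independent_share = 0.30    # 30% of full-bursary holders came from independent schools

full_bursary = english_admissions * full_bursary_rate   # ~248, i.e. roughly 250
maintained = full_bursary * (1 - independent_share)     # ~174, i.e. roughly 175

fsm_claimants = 25          # FSM-eligible Oxford admissions per DfE destination indicators
non_claim_rate = 0.22       # 22% of entitled 15-year-olds do not claim FSM
fsm_entitled = fsm_claimants / (1 - non_claim_rate)     # ~32, i.e. roughly 30

unexplained = maintained - fsm_entitled  # ~142, which the text rounds to 145 (some 83%)
print(round(full_bursary), round(maintained), round(fsm_entitled), round(unexplained))
```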

It seems likely that they were drawn into residual household income of £16,000 or lower by some combination of:

  • Allowances for additional dependent children (£1,130 per dependent child)
  • Allowances for AVCs and other pension contributions
  • Other allowable expenses.

Interestingly, Oxford’s 2013 admissions data shows that the proportion of its intake with incomes between £16,000 and £25,000 was roughly half that of the group with incomes below £16,000.

. 

Outreach

Cambridge

For 2006-2012, Cambridge divides its outreach provision into three categories:

  • Activity to encourage applications from under-represented groups to Cambridge. This is targeted at students in the first generation in their families to attend HE; those who attend schools or colleges with low or below average GCSE and A level performance; and those attending schools or colleges with little recent history of sending students to Cambridge. Three sub-categories are identified: information events for teachers and parents, residential Easter and summer schools and a miscellany of visits to Cambridge, visits to schools, masterclasses, workshops, study days etc.
  • Collaborative activities with other HE partners to raise aspirations and encourage participation. This includes regional Aimhigher projects and gifted and talented events provided through NAGTY.
  • General aspiration-raising activities for the HE sector generally. These are predominantly subject-based and online activities.

For 2012-16, Cambridge continues to describe its provision under the first and third of these categories, adding that both involve collaborative work. It also identifies a wider range of target groups:

‘These include children in care; students eligible for free school meals [NB]; Black, Asian and minority ethnicity students; mature learners; students educated in further education colleges; and bright students in schools and colleges which have not historically sent students to the University of Cambridge.’

‘Or previously eligible’ is added to FSM eligibility in later iterations.

The description of provision is short, mentioning a national programme of visits and events provided by colleges through an Area Links Scheme plus centrally provided summer schools and taster events.

Five priorities are identified:

  • Increasing the number of places available on events with demonstrable impact, particularly summer schools, taster days and events for teachers.
  • Preserving the legacy of local Aimhigher work.
  • Providing a sustained programme of advice and activities for younger students in local secondary schools.
  • Developing initiatives to encourage state school students to choose appropriate subject combinations and apply to selective universities and
  • Working closely with Oxford.

A sixth priority is added in 2013-14 – ensuring PGCE intakes reflect the population from which Cambridge recruits and building networks of graduate teachers to support wider outreach activity.

In 2014-15 these priorities are unchanged, except that the second and third are conflated into one. There is also an added reference to the long-term nature of some of this work:

‘A number of our initiatives engage with younger age groups and consist of a series of sustained engagements over a number of years. For example, our work in Cambridgeshire and with looked-after children involves secondary school students of all ages, whilst our core programme for black, Asian and minority ethnicity students is delivered to each cohort over a three year period.’

.

Oxford

Oxford’s outreach activity is harder to synthesise, because the agreements vary more often and some of the more recent ones are much more detailed.

In its 2006-07 Agreement, Oxford establishes a distinction between activities designed to encourage applications to the University and more general aspiration-raising activities.

However, these are not separately identified in the list it provides, which includes:

  • Hosting Aspiration Days for students from Years 9-11 drawn from ‘Oxford’s specific “target areas”’
  • A HEFCE specialist summer school for 150 Year 11 students from under-represented groups
  • Local Aimhigher provision
  • A programme of some 500 annual outreach visits targeting schools and colleges with little history of sending students to Oxford or into HE more generally
  • A Year 12 Sutton Trust Summer School for 250 students from non-traditional backgrounds
  • A programme of regional events to encourage applications from non-traditional backgrounds
  • A programme of events for teachers from schools with little history of sending students to Oxford, supporting some 100 teachers a year
  • Support for student-led programmes including the Oxford Access Scheme (for students from inner city schools) and a Target Schools Scheme run by the Student Union
  • A Further Education Access Initiative reaching 100 colleges a year and
  • Subject-specific enrichment activities.

In the following year, the items on the list change slightly. The University is said to be undertaking a thorough audit of these activities.

By 2008-09, Oxford describes the objective of its access work as increasing representation from: state school students, students from lower socio-economic groups, students from BME groups and care leavers.

It is focused on two areas: increasing the number of high-quality applications from target groups and ensuring fair admissions processes. It undertakes wider aspiration-raising work on top of this.

The list of central access initiatives annexed to the agreement is missing.

For 2009-10 and 2010-11, the agreement refers to ‘detailed operational plans’ being developed to achieve its objectives.

By 2011-12, Oxford has added a third area of focus to the two immediately above: ensuring that teachers and advisers are able to support intending applicants.

Detailed operational plans are still under development. However, the subsequent agreements introduce several key elements:

  • UNIQ residential summer schools for Year 12 students. Participants are selected on the basis of GCSE A* performance compared with their average school attainment, ACORN postcode, school’s history of sending pupils to Oxford and any care history. A personal statement is also required. There were 380 participants in 2009, rising to 500 in 2010. Capacity is projected to increase to 650 in 2011, 700 in 2012, 850 in 2013, and 1,000 in 2014.
  • By 2012-13, two other ‘flagship programmes’ are identified: a programme of seven regional one-day teacher conferences and a link programme connecting every local authority with a named college. Participants in the teacher conferences are drawn from schools and colleges with low numbers of students achieving high grades or limited success in achieving offers. Oxford’s target is a 15% success rate for applications from these teachers’ schools.
  • In 2013-14, there is the first reference to a Pathways Programme – longitudinal provision for students across Years 10-13 in schools with little history of engagement with Oxford. By 2014-15 this has expanded to accommodate 500 Year 12 students attending study days and 1,800 Year 10 students attending a taster day. In the 2015-16 agreement there is reference to 3,000 participants.
  • The 2012-13 agreement also outlines a system of access flags attached to certain student applicants, denoting educational and social disadvantage. Some 500 applicants were flagged in 2009/10, 630 in 2010/11 and 928 in 2011/12. The intention is that flagged candidates will achieve the same success rate in receiving offers as all applicants from the same sector. (The sectors specified are comprehensive, grammar, FEC, 6FC and independent.) A flag for students from low participation neighbourhoods is incorporated from 2011-12 and one for students from schools and colleges with historically low progression to Oxford is introduced in 2012-13. The 2014-15 agreement notes that the proportion of flagged students achieving an offer and subsequently admitted has risen from 15.6% in 2010-11 to 17.2% in 2011-12. The gap between the success rates of flagged applicants and all UK-domiciled applicants has also fallen, from 6.4 to 5.6 percentage points. In the 2015-16 agreement, the offer rate for flagged candidates is reported as 19.1% in 2012-13 and 21.9% in 2013-14. However, there is no comparison with the sector-specific data for all applicants.

The 2012-13 agreement is the first to mention the preparation of an Oxford Common Framework for Access but this is not ready until the publication of the 2014-15 agreement.

In that agreement, Oxford describes a four-fold approach it has developed for targeting different types of schools:

  • The large proportion producing few students with the necessary attainment to apply to Oxford – highly tailored individual activities such as UNIQ, school-cluster visits and the student union’s student shadowing scheme.
  • Schools with little history of sending students to Oxford or students who have been relatively unsuccessful – application and interview preparation workshops and awareness-raising events.
  • Schools where there are many high-attaining students but little history of sending students to Oxford – increase understanding of the application process and break down myths.
  • Schools who have significant numbers of successful applicants – maintain a working relationship.

Targets

.

Cambridge

Cambridge begins by adopting selected HESA benchmarks, even though these have:

‘severe limitations in a Cambridge context, in that they take insufficient account of the University’s entry requirements, both in terms of subject combinations and of levels of qualification. We hope in due course to develop our own internally derived milestones or, alternatively, consider the applicability of any milestones which OFFA might develop.’

Three targets are adopted:

  • Increasing the proportion of UK undergraduates from state schools or colleges to between 60% and 63%, compared with a HESA benchmark for 2001-02 of 65%.
  • Increasing the proportion of students admitted whose parental occupation falls within NS-SEC 4-7 to 13-14%, compared with a HESA benchmark for 2001-02 of 13%.
  • Increasing the proportion of students from low participation neighbourhoods to approximately 8-9% compared with the HESA 2001-02 benchmark of 7%.

For 2010-12, the third of these targets is lowered to 5-6% because HESA has changed the basis of its calculation, reducing Cambridge’s benchmark by 33%.

By 2012-13, the first of these targets is described as the University’s ‘principal objective’, so it is deemed more important than improving fair access for disadvantaged students. This statement is subsequently removed, however.

The third objective is again recalibrated downwards, this time to 4%, because:

‘Currently HESA performance indicators and other national datasets relating to socio-economic background do not take adequate account of the entry requirements of individual institutions. Whilst they take some account of attainment, they do not do so in sufficient detail for highly selective institutions such as Cambridge where the average candidate admitted has 2.5 A* grades with specific subject entry requirements. For the present we have adjusted our HESA low participation neighbourhood benchmark in line with the results of our research in relation to state school entry and will use this as our five-year target….We will seek data through HESA or otherwise to amend or update our target in relation to socio-economic background in a revised access agreement next year.’

A paper is available explaining the recalibration (which applies a scaling factor of 0.88).

Two new targets are also introduced: a retention benchmark and a process target relating to the minimum number of summer school places – a minimum of 600 places a year for the next five years.

The substantive details are unchanged in all subsequent agreements.

Oxford

In its 2006-07 access agreement, Oxford discusses setting a performance indicator for recruitment from the maintained sector, adding that from 2006 it will begin to collect data on recruitment from lower socio-economic groups.

In 2007-08 it notes that recruitment from SEG 4-7 ‘increased by 7% and drew the University closer to its benchmark’.

In 2008-09, Oxford is continuing to monitor participation by SEG 4-7 and planning to introduce an internally developed benchmark, adjusted to reflect the high attainment required for entry to Oxford. By 2009-10/2010-11, work is still ongoing to develop such a benchmark.

In 2011-12 it seems still not to be ready, but in 2012-13 Oxford introduces its current indicators:

  • Increase the percentage of UK undergraduates at Oxford from schools and colleges which historically have had limited progression to Oxford.
  • Increase the percentage of UK undergraduate students at Oxford from disadvantaged socio-economic backgrounds. ACORN is adopted because:

‘The University has found the ACORN information to be the most accurate source of verifiable information to highlight socioeconomic factors that may signify disadvantage, and has used it as a contextual flag in the undergraduate admissions process since 2008-9, and also as a factor when selecting participants for the UNIQ summer schools programmes.’

  • Increase the percentage of UK undergraduate students at Oxford from neighbourhoods with low participation in higher education. This utilises POLAR quintiles 1 and 2 ‘in line with HEFCE and OFFA recommendations’.
  • Meet the HEFCE benchmark on disabled students at Oxford.

It supplements these with three ‘activity targets and outcomes’:

  • 60% of those participating in the UNIQ summer schools to make an application to Oxford, and 30% of those applying to receive an offer of a place.
  • Improve the participation, application, and success levels from schools and colleges who have had teachers attend the Regional Teacher Conferences, where these schools and colleges have either limited numbers of qualified candidates or where there has historically been limited success in securing offers.
  • Using contextual information in the admissions process to identify candidates who may be suitable to be interviewed on the basis of either time in care, or socio-economic and educational disadvantage. The expectation is that identified candidates would then achieve the same success rate in receiving offers as all applicants to Oxford from equivalent school or college sectors.

These are unchanged in subsequent agreements though, as we have seen, there is no reporting of flagged applicants’ success compared with all students in their respective sectors, only compared with all applicants.

Commentary

There are, within the series of access agreements, valuable insights into the thinking within Oxford and Cambridge about such issues. Here is an annotated selection, presented in broadly chronological order:

  • Improvement will take time: ‘Cambridge will continue to strive to encourage applications from qualified applicants from groups currently under-represented and to admit a greater proportion of them within the context of our admissions policies and without compromising entry standards. Experience has, however, demonstrated that outreach activity takes time to alter the composition of the student population.’ (Cambridge, 2006-10)
  • Partnership and collaboration is necessary: ‘In setting itself these objectives, the University recognises that the problems relating to access to higher education are complex and deep-seated, and beyond the capability of the University to solve by itself. They require the input of all parts of the organisation to address, and indeed the input of agencies external to the University. Oxford is committed to playing its part in addressing these issues…’ (Oxford, 2008-09)
  • Increases in intake are unlikely: ‘Because, in part, of the full-time, residential nature of Cambridge’s undergraduate courses, it is unlikely that the university’s undergraduate intake will significantly increase over the next five years.’ (In 2012-13, this is qualified by the addition of the phrase ‘…beyond the colleges’ capacity to admit them’, but this is dropped again the following year.) (Cambridge 2006-10 and 2012-13)
  • Access is focused on application rather than admission: ‘The selection process aims to identify the most able, by subject, from among a very highly qualified field of candidates. While the purpose of our access work is to ensure that all students who are likely to be able to meet the required standards have the opportunity to apply, our admissions procedures aim to select those candidates who best meet our published selection criteria.’ (Oxford, 2012-13)
  • The balance of expenditure in favour of bursaries is justified: ‘Whilst mindful of OFFA guidance on this subject, we do not believe that there is a sufficient body of evidence that greater benefit would be derived from different proportions of expenditure. As suggested above…we believe that our financial support has a significant bearing on retention. We have also taken full account of student feedback in the formulation of the present scheme. Students have confirmed during the current year that they do not want to see a reduction in bursary levels. It should be noted that the level of expenditure on outreach activity outlined in this agreement is supplemented from very substantial funding through other sources, and so we believe our commitment in this area to be considerable and appropriate.’ (Cambridge 2013-14)
  • This balance of expenditure in favour of bursaries is open to challenge: ‘Our package of financial support to undergraduate students, through both tuition charge waivers and maintenance bursaries, is expected to contribute in broad terms to meeting the targets and outcomes. As yet, however, the evidence for a demonstrable connection between financial support for students and improvements in access to higher education amongst under-represented groups is unclear. We will continue to review our position on the basis of further evidence and analysis.’ (Oxford, 2012-13)
  • Explanations of limited progress: ‘Progress against these targets in 2012 has proved extremely challenging, particularly against the backdrop of the new funding regime combined with a demographic decline in the number of school leavers. In relation to the three targets dealing with educational, social and economic disadvantage, Oxford has seen both a decline in applicants and a decline in the number of students that have been admitted…Oxford will continue to focus its outreach efforts and resources on recruiting and encouraging a wider range of students to apply successfully to the University.’ (Oxford 2014-15)
  • Student funding reforms have depressed performance: ‘The 2011, 2012 and 2013 entry cycles proved atypical, given the extensive changes to student funding, and this was reflected in the limited success against the targets…The provisional figures for 2014 entry, however, indicate that we have made headway across the board, particularly in regard to candidates who are from postcodes with high levels of socio-economic disadvantage using the Acorn (A Classification Of Residential Neighbourhoods) postcode classification. …Sustained long term outreach activity takes time to show in the admissions process, and the need to allow a five year period to assess progress has been reiterated by Oxford on a regular basis.’ (Oxford 2015-16)
  • Potentially negative impact of A level reform: ‘We are concerned that current proposals for A-level reform would significantly reduce student choice and flexibility; in particular, the lack of formal end of Year 12 examinations will adversely affect student confidence and the quality of the advice they receive about higher education options, and also prevent institutions such as Cambridge from accurately assessing current academic performance and trajectory. If effected these proposed reforms could have a significant bearing on our ability to make progress on access measures.’ (In 2015-16 there is also concern ‘…that proposed funding arrangements would effectively restrict students in many state schools to three A-levels, meaning that the opportunity to study extremely valuable fourth subjects such as Further Mathematics would be lost.’) (Cambridge, 2014-15 and 2015-16)
  • There is an evidence base for effective practice: ‘There is also increasing evidence that sustained work with students over a longer period of time is more effective than one-off interventions, particularly if this work is tailored to the requirements of each age group.’ and ‘Research into access activities has identified that, provided they have a sufficient depth of content, summer schools are a particularly valuable experience for students who have higher academic achievements and aspirations than others in their peer group.’ (Oxford 2014-15 and 2015-16)
  • Universities’ role in raising attainment: ‘There is a larger question about the role of universities in raising attainment rates within schools. Universities can, and Oxford does, work in partnership with schools, local authorities, and third parties to form collaborative networks that can work together to raise the attainment rates of students from the most deprived backgrounds’ (Oxford 2014-15)

Some of these issues will be picked up again in the final section of this post.

.

Oxbridge’s Signature Access Programmes

This section reviews information about key programmes within each university’s access portfolio that reflect their long-term commitment to residential programmes and a more recent focus on longer-term partnership programmes targeting secondary students from disadvantaged backgrounds.

Before engaging with these specific programmes, it is important to give a sense of the full range of activity presently under way. In Oxford’s case, the most recent 2015-16 access agreement provides the basis for this. In Cambridge’s case, I have drawn on online material and an online brochure.

Cambridge’s Access Portfolio 

Cambridge’s Outreach and Access webpages provide details of:

  • Insight, which supports students attracting the Pupil Premium from Year 9 through to Year 13 (see below)
  • Experience Cambridge, a 3-week subject-specific academic project, undertaken predominantly through the University’s virtual learning environment (VLE).
  • HE+, a pilot programme involving regional consortia of state schools and colleges working with their link Cambridge College to enable their academically able students to make competitive applications to selective universities including Cambridge.
  • HE Partnership, an aspiration-raising initiative targeting Year 9-11 students in Cambridgeshire and Peterborough schools with lower than average progression rates – and particularly students attending them with no family background of attending higher education.

A separate Raising Aspirations booklet mentions, in addition:

  • The Subject Matters, events for Year 11 students to support their A level subject choice
  • Year 12 subject masterclasses
  • A Black, Asian and minority ethnic (BAME) outreach programme
  • Further education and mature student outreach
  • Various examples of outreach by University Departments
  • Activity under the College Area Links Scheme
  • The CUSU Shadowing Scheme
  • Open Days
  • Oxford and Cambridge student conferences
  • Participation in higher education conventions

Oxford’s Access Portfolio

Oxford’s 2015-16 access agreement describes:

  • Briefings for Teach First and PGCE students which typically attract 150 students annually.
  • An annual programme of school and college visits, which involved over 3,300 UK schools and colleges in 2012-13. These are undertaken through Link Colleges (see below).
  • Target Schools, an OUSU programme involving undergraduate visits and a student Shadowing Scheme.
  • A variety of Departmental and subject-specific outreach activities

Cambridge: Sutton Trust Summer Schools 

Sutton Trust summer schools are subject-specific residential courses for Year 12 students. They are currently provided at ten institutions including Cambridge. There are about 2,000 places nationally and Cambridge accounts for 550 of them.

Cambridge offers 26 five-day courses in July and August, hosted by six of its colleges. They are free to attend. The providers meet all costs including travel to and from the venue, food and accommodation.

Successful applicants must meet most or all of the following eligibility criteria:

  • In the first generation of their family to attend university (in fact this means neither parent has a first degree or equivalent)
  • Eligible for FSM [not pupil premium] during secondary education
  • Have achieved at least 5A*/A grades at GCSE or equivalent and be taking subjects relevant to the summer schools they wish to attend
  • Attend schools/colleges with a low overall A level point score (typically below the national average) and/or low progression to HE
  • Live in neighbourhoods with low progression rates to HE and/or high rates of socio-economic deprivation.

Participants must attend a UK state-funded school or college, so those attending independent schools are ineligible, even if they have moved subsequently into a state sixth form. Priority is given to children who are, or were formerly, looked after or in care.

Cambridge’s website says:

‘We look at a combination of the contextual priority criteria met and GCSE grades (or equivalent) in subjects relevant to the course for which you have applied. In 2014, the majority of our 550 summer school participants met two or more of these criteria.’

In answer to the question ‘Does attending a summer school increase my chances of getting a place at Cambridge?’, the University says:

‘Applications to the University are completely separate from the Summer Schools and use different criteria to those of the Summer School.  Admissions Tutors will not know whether an applicant has attended a Summer School, unless you choose to mention it in your personal statement…Equally, being unsuccessful in a summer school application does not correlate to the likelihood of being accepted to Cambridge as an undergraduate: we use very different criteria and it is in no way a statement about your academic record or potential.’

. 

Oxford: UNIQ summer schools

The UNIQ summer schools website describes a very similar animal. It is also targeted at Year 12 students in state schools and colleges. The courses are also one-week, subject-specific residential experiences undertaken during July and August. All costs are covered.

According to the access agreements, Oxford planned to increase the number of places available to 1,000 in 2014. The published statistics confirm that this outcome was achieved, adding that there were 4,327 applications and that 507 ‘near miss applicants’ were invited to undertake other outreach activities.

Interestingly though, the number of places available in 2015 fell back substantially, to 850. The number of courses was 35, unchanged from 2014, suggesting a drop in the average number of students per course from 29 to 24.
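
As a quick sanity check on those per-course averages, here is a minimal sketch: the place and course counts come from the figures above; the rounding is mine.

```python
# Average UNIQ students per course, using the published place and course counts.
places = {"2014": 1000, "2015": 850}
courses = 35  # unchanged between 2014 and 2015

for year, n in places.items():
    # 2014 -> 28.6 (reported as 29); 2015 -> 24.3 (reported as 24)
    print(f"{year}: {n / courses:.1f} students per course")
```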

Courses are categorised according to whether they are in Humanities, Medical Sciences, Mathematical, Physical and Life Sciences or Social Sciences. Sixteen of the 35 are in Humanities subjects.

The eligibility criteria are also similar to those for Sutton Trust summer schools, but those relating to disadvantage are not described with any degree of specificity. They include:

  • The number of A* GCSE grades achieved compared with the average for the applicant’s school when they took GCSEs. (Applicants are only permitted to have completed one A level.)
  • Academic attainment and history of progression to Oxford at the school or college where the applicant is taking A levels
  • ACORN postcode data
  • POLAR 3 data and
  • The quality of a personal statement

Applications from looked after children are considered ‘on an individual basis’.

A referee, normally a teacher, needs to confirm the details of their application.

Students who complete a UNIQ summer school fulfil the requirements for the ASDAN Universities Award.

The website adds that from 2015, Oxford is ‘running a virtual learning programme for selected applicants’.

The answer given to the question ‘Will attending a UNIQ summer school make it more likely that I will get a place at Oxford University?’ is:

‘Students who attend UNIQ and decide to apply to Oxford University do not receive any preferential treatment at the application stage.

Admissions tutors who make decisions about undergraduate offers select entirely on academic merit. Unless students mention on their UCAS Personal Statement that they have attended the UNIQ Summer School, admissions tutors will not know, as we do not provide them with separate information.’

Cambridge: Insight 

Insight is described in the guide for teachers as:

‘an [sic] multidisciplinary programme which aims develop [sic] and broaden students’ academic interests and tackle the barriers many students face when applying to university. We hope to achieve this through inspiring subject days, discussions with current university students and academics and sessions about university.’

Eligible students are in Year 9, attract the Pupil Premium, can travel to and from Cambridge in a day and are ‘on track to achieve Level 7 English, maths and science but [sic] the end of Key Stage 3’.

The programme is predominantly focused on six London boroughs, but applications are also invited from non-selective state schools elsewhere with ‘above average eligibility for free school meals’.

There is a series of Saturday and holiday events, including:

  • Core sessions, including an introductory event in the Spring term of Year 9 and ‘Subject Matters’ – events to support A level choices – in the Autumn Term of Year 11.
  • Additional subject days provided throughout Years 10 and 11
  • A one-night residential at the end of Year 10 and a four-night residential at the end of Year 11 for ‘those who have shown enthusiasm and commitment to the programme’.
  • A regular email newsletter during Years 12 and 13 providing information about open days, masterclasses, residentials and competitions.

The programme is free of charge.

I could find no evaluation of the impact of this programme, which is not mentioned in Cambridge’s ‘Raising Aspirations’ brochure, even though it seems to be their only substantial long-term programme targeting disadvantaged students outside the local area.

.

Oxford: Pathways Programme 

The website describes Pathways as an initiative co-ordinated by Oxford’s colleges with support from the Sutton Trust.

‘The programme aims to provide information, advice and guidance on higher education and Oxford to academically able students, and staff members, in non-selective state schools with little history of student progression to Oxford.’

The components are:

  • Year 10 taster days which provide sessions on higher education and student finance. Applications are made by schools, which need to be in the state sector, ‘usually without sixth forms’ and with little or no history of sending students to Oxford.
  • Year 11 investigating options events, focused on the significance of GCSE results and post-16 choices. These are aimed at students who have undertaken a taster event and who attend schools fitting the description above. Schools are encouraged to bring up to ten students. There are also two subject-focused days, one devoted to Medicine, the other to Humanities.
  • Year 12 study days providing a taste of subject-specific university-level study. This involves two taster sessions undertaken in small groups, two talks from admissions tutors and a college tour. There are twenty-one subjects offered. Participants are from non-selective state schools and colleges. They are normally expected to have at least 5 GCSE A* grades (7 for medicine) and be predicted to achieve at least 3 A grades at A level, or equivalent.
  • A Year 13 application information day, providing advice on personal statements, tests and interviews. These cover seven broad subject areas. Participants are again drawn from non-selective state schools and colleges.

Although the programme is not confined to students from disadvantaged backgrounds, teachers are advised that:

‘When selecting participants for the Year 12 and 13 events, we also take into account socio-economic data, such as parental HE participation and eligibility for benefits or free school meals.’

The Sutton Trust explains that Pathways involved almost 3,000 students and 400 teachers in its first year. The Trust is funding the further development of the Year 12 and 13 components.

I could find no separate evaluation of the effectiveness of Pathways.

Strengths and weaknesses of Oxbridge provision

.

Summer schools 

Both Oxford and Cambridge place extensive reliance on the effectiveness of summer schools as an instrument for improving access, with summer school provision forming the centrepiece of their respective strategies.

The evidence base in support of this strategy appears relatively slim. Both appear to be relying principally on evaluation of the Sutton Trust’s programme.

The Sutton Trust appears to publish an annual Targeting and Progression Report, but the 2014 edition has all the institution-specific data stripped out, which is not entirely helpful.

However, it does reveal that, amongst applicants for summer schools in all ten locations, only:

  • 59.5% were from the first generation of their family with experience of HE.
  • 54.8% came from schools and colleges with below average A level point scores and/or low progression rates to HE.
  • 29.9% were in POLAR 2 quintiles 1 or 2.

There is no reference to the FSM eligibility criterion, so presumably that was not in place last year.

There is limited information about the status of those accepted onto courses. Between them, the document and a parallel PowerPoint presentation tell us that:

  • The majority of attendees met two or three of the eligibility criteria
  • 77% met three of the criteria, but we don’t know which three
  • 85% met the ‘first generation’ criterion
  • 74% ‘came from schools with low attainment’
  • 49% ‘lived in areas with the lowest level of progression to university’ (presumably POLAR quintiles 1 and 2).

Given the focus of this post, the last outcome is particularly disappointing, since it means that over half were not disadvantaged on the Trust’s only measure. Perhaps the additional FSM criterion has been introduced in an effort to secure a larger majority of applicants from disadvantaged backgrounds.

The presentation also reveals that the Trust specifically targeted 900 ‘hard to reach schools’ which eventually supplied 257 attendees, 88% of them meeting three or more of the eligibility criteria.

The implication must be that, if such an exercise had not taken place, the proportion of attendees from disadvantaged backgrounds would have been significantly lower.

The Report also reveals that, of the 2012 Sutton Trust summer school cohort, 58% of university applicants took up a place at a Russell Group university. A total of 125 students (10% of the cohort) accepted a place at the institution that hosted their summer school.

Oxford publishes information about its summer schools in its access agreements.

The target is for 60% of participants to apply and for 30% of applicants to receive an offer. The University also aims that summer school participants will have the same success rate in securing an offer as the average for all applicants from the state sector.

Each agreement provides detail about the number of participants who apply to Oxford, the number receiving offers and the proportion of those from ACORN groups 4 and 5.

These are summarised in Graph 4, below, which illustrates that the impact on recruitment of students from ACORN 4 and 5 postcodes is not fully commensurate with the increase in the number of participants.


Graph 4: Impact of UNIQ summer schools, 2010-2013

.

Oxford also provides details of the proportion of summer school participants from Polar quintiles 1 and 2 receiving an admission offer, for 2011 (19.5%), 2012 (15%) and 2013 (20.3%). In 2013, the comparable ‘success rate’ for all applicants to the University was 20.1%.

The evaluation evidence cited by Oxbridge is captured in a Sutton Trust Summer School Impact Report, dating from 2011. This is based on analysis of the 2008 and 2009 summer school intakes, when courses were located at Bristol, Nottingham and St Andrews, as well as at Oxford and Cambridge.

It concludes that:

  • Summer schools successfully select students who fit the eligibility criteria (though that is not entirely borne out by the more recent outcomes above).
  • Amongst the disadvantaged cohort, less disadvantaged students are more likely to take up places than their more disadvantaged peers.
  • However, attending a summer school closes the gap between the success rates – in terms of obtaining admission offers – of more and less disadvantaged students. Exactly why this happens is unclear.
  • There are significant differences between universities. Cambridge exhibits ‘relatively poor conversion of attendees into applications (not least when compared to the equivalent performance of Oxford)’.

The overall conclusion is that summer schools do have a positive impact, compared with control groups, but the study does not offer recommendations for how they might work better, or consider value for money.

The closing section notes that:

‘They achieve this by raising two of the three ‘As’ of the WP canon – student awareness and student aspirations. It may not directly enhance the third – student attainment – though summer schools can support students’ study skills – but the growing adoption of a ‘contextual data’ approach to the treatment of university admissions should be to the further benefit of the sorts of students who pass through summer schools.’

Overall then, summer schools have a positive impact, but if we are judging their efficiency as a mechanism for improving the intake of students from disadvantaged backgrounds, it is clear that there is extensive deadweight. They might be better targeted on the most disadvantaged students.

If this is true of summer schools, it is almost certainly true of other elements of Oxbridge’s access programmes.

.

Other more general issues 

  • A smorgasbord of provision: It is evident that both Oxford and Cambridge are engaged in multiple overlapping initiatives designed to improve access, both to their own institutions and to selective HE more generally. At Offa’s behest, they are targeting several sub-populations. The 2016-17 guidance on completing access agreements invites them to consider a variety of under-represented groups: minority ethnic students, disabled students, care leavers and students in care, part-time students, mature students, medical students, PGCE students. There seems to be a tendency to invent a series of small targeted initiatives for each sub-group, rather than focusing principally on two or three substantial programmes that would make a real difference to core target groups. 
  • Too many priorities too vaguely expressed: Both universities identify core priorities through the targets they have selected. In Oxford’s case those involve increasing representation from: schools and colleges with limited progression to Oxford; postcodes associated with significant socio-economic disadvantage; postcodes associated with low HE participation; and disabled students. However, the first three overlap to some extent and recent access agreements do not indicate the relative priority attached to each. In Cambridge’s case only two targets relate to admissions, one focused on increasing representation from state schools, the other from low participation postcodes. In older agreements, the former has clear priority over the latter but it is unclear whether this remains the case. Offa’s framework requires simplification so that both universities have no option but to prioritise admissions from disadvantaged learners educated in state-funded institutions. It should be much clearer exactly which activities are dedicated to this end and what funding is allocated for this purpose.
  • A plethora of measures: The Offa system permits Oxbridge and other universities too much leeway in defining the populations whose access they seek to promote and in determining how they measure success. This makes it harder to compare universities’ records and more complex to harmonise with the measures most often applied in schools and colleges. If universities refuse to foreground eligibility for the pupil premium and for FSM, they should at the very least publish annual data about the proportion of their intake falling within these categories, and without the present two year time lag.
  • Limited transparency: There is too much variability in the degree of transparency permitted by the Offa framework. Oxford provides much more data in its access agreement than does Cambridge, but the range of data published in support of fair access is limited across the board. Within the bounds of data protection legislation, it should be possible for each university to state annually, without the present two-year time lag, what proportion of its intake falls within certain specified categories, how that varies between subjects and the range of attainment demonstrated in each case. The publication of such material would go a long way towards removing any sense that Oxbridge is overly defensive about these issues.
  • Limited investment in long term collaborative programmes: Summer schools are valuable but they do not impact early enough, nor do they raise attainment. The Insight and Pathways programmes demonstrate growing recognition of the potential value of establishing long-term relationships with prospective students that begin as early as primary school and certainly before the end of KS3. Such programmes require schools, colleges and universities to preserve continuity for each eligible student through to the point of university entry. Existing programmes are insufficiently intensive and reach too few students. Scalability is an obvious issue. 
  • Negligible involvement in attainment-raising work: Both Oxford and Cambridge state frequently that the principal obstacle to recruiting more disadvantaged students is the scarcity of sufficiently high attainment within the target group. Yet rarely, if ever, do they invest in long-term activities designed to raise these students’ attainment, seeming to believe that this is entirely a matter for schools and colleges. The precedent offered by university involvement in academy sponsorship and A level reform would suggest that there is no fundamental obstacle to much closer engagement in such activities.

.

Tackling the core problem

The proposed solution is a framework that supports a coherent long-term programme for all high-attaining disadvantaged students attending state-funded institutions in England, stretching from Year 7 to Year 13. These might be defined as all those eligible for pupil premium. An additional high attainment criterion, based on achievement in end of KS2 tests, could be introduced if necessary.

Such a programme could be extended to the other home countries and additional populations subject to the availability of funding.

The framework would position the school/college as the co-ordinator, facilitator and quality assurer of each eligible student’s learning experience (with handover as appropriate as and when a learner transfers school or into a post-16 setting).

It would stretch across the full range of competence required for admission to selective HE, including high attainment, personal and learning skills, strong yet realistic aspirations, cultural capital and access to tailored information, advice and guidance (IAG).

On the demand side, the framework would be used to identify each student’s strengths and areas for development, and to monitor progress against challenging but realistic personal targets.

From Years 7-9 the programme would be light-touch and open access for all eligible disadvantaged students. Emphasis would be placed on awareness-raising and the initial cultivation of relevant skills.

Entry to the programme from Year 10 would be conditional on the achievement of an appropriate high attainment threshold at the end of KS3. From this point, provision would be tailored to the individual and more intensive.

Continuation in subsequent years would be dependent on the student achieving appropriate high attainment thresholds and challenging interim targets.

Schools’ and colleges’ performance would be monitored through destinations data and Ofsted inspection.

On the supply side the framework would be used to identify, organise and catalogue all opportunities to develop the full range of competence required for admission to selective HE, whether provided by the student’s own school or college, other education providers in the school, college and HE sectors or reputable private and third sector providers.

Opportunities offered by external providers, whether at national or regional level, would be catalogued and mapped against the framework in a searchable national database. Schools and colleges would be responsible for mapping their own provision and other local provision against the framework.

Each student would have a personal budget supplied from a central fund. Personal budgets would be administered by the school/college and used to purchase suitable learning opportunities with a cost attached. The fund would be fed by an annual £50m topslice from the pupil premium. This would cover the full cost of personal budgets.

The annual budget of £50m per year might be divided between:

  • Light-touch open access activities in Years 7-9 – £10m
  • Intensive programme in Years 10-13 – £10m per year group.

The latter would be sufficient to support 5,000 eligible students to the tune of £2,000 per student per year, or 4,000 to the tune of £2,500.
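
The arithmetic behind this split is straightforward; here is a minimal sketch in which all the figures are taken from the proposal above and only the consistency checks are mine:

```python
# Proposed annual £50m topslice (2% of the £2.5bn pupil premium) and its split.
total_budget = 50_000_000
open_access_y7_9 = 10_000_000         # light-touch, open access phase, Years 7-9
intensive_per_year_group = 10_000_000
intensive_year_groups = 4             # Years 10, 11, 12 and 13

assert open_access_y7_9 + intensive_per_year_group * intensive_year_groups == total_budget

# Per-student allocation within each intensive year group
print(intensive_per_year_group / 5_000)  # 2000.0 -> 5,000 students at £2,000 each
print(intensive_per_year_group / 4_000)  # 2500.0 -> 4,000 students at £2,500 each
```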

By comparison, DfE’s destination indicators suggest that, in 2012/13, ‘top third’ universities admitted 2,650 FSM-eligible students; some 1,520 of these were admitted to Russell Group universities and, of those, just 50 were admitted to Oxbridge.

Selective universities would make a small contribution, the sum adjusted to reflect their comparative performance against fair access targets. These contributions would be used to meet the administrative costs associated with the programme. Total annual running costs have not been estimated but are unlikely to be more than £2.5m per year.

Universities might choose to invest additional funding, covered by their annual Offa access agreements, in developing free-to-access products and services that sit within the supply side of the framework. Attainment-raising activities might be a particular priority, especially for Oxbridge.

Philanthropic contributions might also be channelled towards filling gaps in the supply of products and services where, for whatever reason, the market failed to respond.

Selective universities would have access to information about the progress and performance of participating students. Students would apply for higher education via UCAS as normal, but strong performers would expect to receive unconditional offers from their preferred universities, on the strength of their achievement within the programme to date.

Participation in the programme would be a condition of funding for all selective universities. All processes and outcomes would be transparent, unless data protection legislation prevented this. The programme would be independently evaluated.

Optionally, universities might be further incentivised to make unconditional offers and provide the necessary support during undergraduate study. The Government might pay the receiving university a fee supplement, 50% above the going rate, for every student on the programme admitted unconditionally (so up to £22.5m per cohort per year assuming a supplement of £4,500 and 100% recruitment). This supplement would not be provided for conditional offers.

The Government would also claw back the full fee plus the supplement for every student on the programme – whether admitted conditionally or unconditionally – who failed to graduate with a good degree (so £40,500 per student assuming a 3-year degree and a £9,000 fee).
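
The sums behind the incentive scheme can be checked just as quickly. The fee, supplement rate and cohort size come from the paragraphs above; the arithmetic is mine.

```python
# Fee supplement and clawback under the optional incentive scheme.
fee = 9_000                    # annual tuition fee, £
supplement = fee * 0.5         # 50% above the going rate -> £4,500
cohort = 5_000                 # assumes 100% recruitment of a 5,000-strong cohort

print(supplement)              # 4500.0
print(supplement * cohort)     # 22500000 -> up to £22.5m per cohort per year

# Clawback per student failing to graduate with a good degree (3-year degree):
print((fee + supplement) * 3)  # 40500.0 -> £40,500
```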

GP

March 2015

Why McInerney is just plain wrong

.

I should be completing my next evidence-based post but, 24 hours on from reading this evidence-light Guardian article by Laura McInerney, I am still incandescent.

.

.

I find I cannot return to normal business until I have shredded these flimsy arguments.  So this post is by way of catharsis.

McInerney’s core premiss is that political parties of all colours focus disproportionately on ‘the smartest children’ while ‘ignoring lower ability learners’.

This poisonous ideology seems particularly prevalent amongst Teach First types. I imagine they are regurgitating lessons they learned on its courses.

I have seen it promulgated by rising stars in the profession. One such exchange prompted this previous post, which attempted a balanced, rational analysis of our respective positions.

Ideologues cannot be persuaded by evidence, so there is no hope for McInerney and her ilk, but I hope that more open-minded readers will be swayed a little by the reasoning below.

.

What does she mean by ability?

McInerney distinguishes learners who are ‘smart’ or ‘bright’ from those who are ‘lower ability’. This betrays a curious adherence to old-fashioned notions of fixed ability, dividing children into sheep and goats.

There is no recognition of ability as a continuum, or of the capacity of learners to improve through effort, if given the right support.

The principles of personalised learning are thrown out of the window.

On this view, education is not a matter of enabling every learner to ‘become the best that they can be’. Instead it is a zero-sum game, trading off the benefits given to one fixed group – the smart kids – against those allegedly denied to another – the lower ability learners.

There is also an elementary confusion between ability and attainment.

It seems that McInerney is concerned with the latter (‘get good marks’; ‘received a high grade’) yet her terminology (‘lower-ability pupils’; ‘the smartest children’; ‘gifted and talented’) is heavily redolent of the former.

.

What does she mean by focusing on the top rather than the tail?

According to McInerney’s notions, these ‘lower ability’ kids face a sad destiny. They are ‘more likely to truant, be excluded or become unemployed’, more likely to ‘slip into unskilled jobs’ and, by implication, form part of the prison population (‘75% of prisoners are illiterate’).

If we accept that low attainers are preponderant in these categories, then it is logical to conclude that programmes focused on tackling such problems are predominantly benefiting low attainers.

So governments’ investment in action to improve behaviour and discipline, tackle truancy and offer Alternative Provision must be distributed accordingly when we are calculating the inputs on either side of this equation.

Since the bulk of those with special educational needs are also low attainers, the same logic must be applied to SEN funding.

And of course most of the £2.5bn pupil premium budget is headed in the same direction.

Set against the size of some of these budgets, Labour’s commitment to invest a paltry £15 million in supporting high attainers pales into insignificance.

There are precious few programmes that disproportionately support high attainers. One might cite BIS support for fair access and possibly DfE support for the Music and Dance Scheme. Most are ‘penny packages’ by comparison.

When the national gifted and talented programme was at its peak it also cost no more than £15m a year.

Viewed in this way, it is abundantly clear that low attainers continue to attract the lion’s share of educational funding and political attention. The distasteful medical analogy with which McInerney opens her piece is just plain wrong.

The simple reason is that substantial investment in high attainers is politically unacceptable, even though one could make a convincing case that the economic benefits of investing in the ‘smart fraction’ are broadly commensurate with those derived from shortening the ‘long tail’.

Of course we need to do both simultaneously. This is not a zero sum game.

.

Deficit model thinking

McInerney is engaged in deficit model thinking.

There is no substance to her suggestion that the government’s social mobility strategy is disproportionately focused on ‘making high court judges’. Take a look at the Social Mobility Indicators if you don’t believe me.

McInerney is dangerously close to suggesting that, because low attainers are predominantly disadvantaged, all disadvantaged learners are low attainers – and that Labour’s commitment is therefore a sop for the middle classes. Nothing could be further from the truth.

But high-attaining learners from disadvantaged backgrounds will not succeed without the requisite support. They have an equal right to such support: they are not ‘the healthiest’, pushing in front of ‘the sickest’ low attainers. Equally, they should not be expected to go to the back of the queue.

There are powerful economic and equity arguments for ensuring that more learners from disadvantaged backgrounds progress to competitive universities and professional careers.

As and when more succeed, they serve as role models for younger learners, persuading them that they too can follow suit.

McInerney has made that journey personally so I find it hard to understand why she has fallen prey to anti-elitism.

Her criticism of Labour is sadly misplaced. She should be asking instead why other parties are not matching their commitment.

According to her, there was a golden age under Blunkett, ‘who really believed in helping all children, not mostly the smartest.’

Guess who was Secretary of State when Labour first offered support to gifted and talented learners?

He fully appreciated that the tail should not wag the dog.

[Postscript: Here is the Twitter debate that followed this post. Scroll down to the bottom and work upwards to read the discussion in broadly chronological order.]

.

 

GP

March 2015

The Policy Exchange National Scholarships Programme

.

This post is a short critical analysis of the proposal for a new National Scholarships Programme contained in the Policy Exchange Education Manifesto, published in March 2015.

.

Background

Policy Exchange describes itself as ‘the UK’s leading think tank’.

It is Right-leaning, having been established in 2002 by a group including Boles (the founding Director), Gove and Maude, all currently Conservative Ministers in the Coalition Government.

On Friday 6 March, Policy Exchange published an Education Manifesto, authored by its Education Team: Jonathan Simons, Natasha Porter and Annaliese Briggs.

The Manifesto’s Introduction says:

‘This is not a manifesto in its traditional sense. What is published here is a collection of short ideas around particular areas which are more localised than those in our main reports. It is our hope and our belief that any or all of them could be taken up by any main political party in May 2015, and they complement the broader policy recommendations we have put forward in our published reports.’

There are seven ‘ideas’, the last of which is for National Scholarships, summarised as follows:

‘Government should design a prestigious scholarship scheme to financially support the most talented undergraduates in the country – covering approximately 200 individuals a year – if they attend a UK university and remain in the UK for at least three years after graduation.’

Despite the authors named above, this has unmistakeably Odyssean fingerprints!

.

Rationale

The purpose of the Programme seems to be to ensure that the economic benefits vested in the most outstanding undergraduates are not lost to the UK through ‘brain drain’:

‘The intention would be to marry the most able students within the UK with some of the world class provision on offer at UK universities (though the scholar would have their free choice of which institution to attend). The financial package would act less as a facilitator to go to university in general but as a nudge to incentivise scholars to remain in the UK throughout university and beyond, as opposed to going abroad, which is becoming an increasingly competitive battleground. [sic]’

The paper emphasises the economic benefits of investing in a country’s very highest attainers:

‘If such highly able individuals can accrue great awards and accomplishments which benefit not just themselves but, through positive spillovers, drive increase in human capital more widely, then this will be of wider benefit.’

This idea is associated with Benbow and Lubinski, Co-Directors of the Study of Mathematically Precocious Youth (SMPY) located at Vanderbilt University in the US:

‘They argue for a national scheme to identify such individuals and nurture them, both for the individuals’ own benefits but also for the benefits of their home nations. This is because in advanced economies in particular, with a shift towards higher skilled jobs, the economic prosperity of a country depends on its human capital potential. Education today is the economy of tomorrow. If such individuals as these under discussion can generate further talent by virtue of their own accomplishments, then there is a competitive rationale for countries to identify and support these individuals.’

In fact, these arguments have a longer pedigree.

There is no explanation of how the highest attainers ‘can generate further talent by virtue of their own accomplishments’, though this might be a reference to potential future employment as university academics.

Some limited evidence is cited to support fears of a brain drain:

‘A BIS report from 2010 found that some 2.8 per cent of state sector pupils and 5.5 per cent of independent sector pupils apply to universities outside the UK – small in absolute terms but “It is particularly significant that it is the academically most gifted pupils who are the most likely to apply to foreign universities”. Longitudinal data – which unfortunately only goes to 2011 – nevertheless shows a consistent increase since 2005.

Most recently, the Institute for International Education and the US-UK Fulbright Commission releaed [sic] data in late 2014 showing that there were a record number of UK students studying in the USA, which has always been the most popular country for foreign study. 10,191 British students pursued study in the US during the 2013/14 academic year, up from around 9,500 12 months earlier and the largest year-on-year increase in more than a decade. Undergraduates accounted for 49.6 per cent of all UK students heading to the US. Some 23.9 per cent were postgraduates and the remainder were taking part in short-term exchanges or graduate work programmes.’

.

What is proposed?

The proposed Programme would award £10,000 per year for three years of undergraduate study at an English university to ‘the top 200 scholars in the country’. The total cost of the awards would be ‘£6m a year in steady state’.
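
The quoted steady-state figure reconciles straightforwardly, since three cohorts would be funded concurrently once the scheme is three years old. A minimal check using the paper’s own numbers; the reconciliation is mine.

```python
# Steady-state cost of the proposed National Scholarships Programme.
scholars_per_year = 200
award_per_year = 10_000  # £ per scholar
course_length = 3        # years of undergraduate study

# From year three onwards, three cohorts of 200 are funded at once.
print(scholars_per_year * award_per_year * course_length)  # 6000000 -> £6m a year
```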

This would involve the Government collaborating with universities and other unspecified partners to develop a new optional test for 17-18 year-olds.

Any student resident in the UK would be eligible, so there would be no screening process.

The test would:

‘…seek to measure via a range of metrics a combination of academic ability and academic potential. The test would be calibrated to accurately identify those with ability found in approximately 1 in 10,000 individuals (or variants of this depending on how wide the entry criteria are drawn). A proportion of the top ranked scores on this test would be designated National Scholars and be eligible for a package of incentives under the National Scholarship Scheme, contingent upon enrolling as an undergraduate at a UK university.’

Anyone who received a scholarship and subsequently left the country within three years of graduating would be required to repay it.

Hence the scheme would obstruct enrolment as an undergraduate overseas and also place a significant obstacle in the path of postgraduate mobility.

Analysis

There is no problem

The idea is a solution in search of a problem.

There is no specific evidence that the 200 students with the highest ability and academic potential (however that is measured) are any more likely to study abroad.

The 2010 BIS research report quoted above notes that 76% of the students in its survey planned to return to the UK, although many wanted to work abroad before doing so.

Furthermore:

‘Significantly, the survey results point to the students with the strongest A level results being more likely to want to return to the UK at some point after their studies. International student mobility should not therefore be interpreted as a brain drain of the UK’s best and brightest young people.’

The BIS report quite rightly explores this issue in the context of international student mobility, the globalisation of higher education and the postgraduate labour market.

The threat of brain drain can be countered by the argument that the strongest UK students should be encouraged to attend the best courses at the world’s best universities (language of tuition permitting). Only by doing so will they maximise their skills and their subsequent economic value.

Meanwhile, the best overseas students should be welcomed to UK universities and encouraged to consider postgraduate study and employment here, so that the UK economy benefits from their engagement.

Poor policy design

There is insufficient information about the nature of the test.

It would not be an intelligence test, but would assess ‘academic ability and potential’.

Since it must be applicable to all students, regardless of their current subjects of study or their intended undergraduate field(s) of study, it must not rely in any way on subject content, otherwise it would be biased in favour of specialists in those fields.

It seems unlikely that such a test already exists, unless one is prepared to argue that the US SAT test fits the bill and, even if it does, the ceiling is almost certainly too low.

The footnotes acknowledge that:

‘…such a proposed test has no track record on validity and there will be a large number of students therefore caught in statistical noise just outside the cut off score.’

The development process would be lengthy and complex – and the costs correspondingly high. These development costs are not included in the £6m budget.

If the test is coachable, this opens up the possibility of a further market for the private tuition industry. Students will be diverted from their A level studies as a consequence.

The reference to ‘a range of metrics’ suggests the possibility of a complex test battery rather than a single assessment. The ongoing cost of administering the test is also excluded from the budget.

Similarly, the ongoing costs of administering the scholarship scheme, evaluating its effectiveness, monitoring the movements of alumni and pursuing repayments are also excluded.

The relationship between the scholarship and other forms of student support is not properly developed. Why not link the incentive to student loan repayments instead of introducing a separate scholarship scheme? One section of the paper suggests it could meet living costs, or be offset against tuition fees.

It acknowledges that many of the beneficiaries of such scholarships are likely to come from privileged backgrounds and be educated in the independent sector.

It seems unlikely that they would be swayed by financial inducements at this level, especially if their parents have been forking out upwards of £25,000 a year for school fees.

It is likely that those who are determined to study abroad will choose not to take the test. The benefits of £30,000 now will be more than outweighed by the additional earnings they might subsequently expect as a consequence of pursuing a better course elsewhere. This will be especially true of those from affluent backgrounds.

Finally, one doubts whether a sample as tiny as 200 students a year – no matter how talented they are – would have any substantive impact on the UK economy, even assuming that the arguments in favour of globalisation could be set aside. Such a scheme would be more effective if it had a wider reach.

Redundant lines of argument and poor research

The first part of the paper is devoted to describing the original National Scholarship Programme, a completely different animal, designed to provide financial support to enable disadvantaged students to participate in higher education. It is a red herring.

In contrast, the new proposal has nothing to do with fair access or social mobility. It is ‘targeted on talent rather than socio-economic background’.

The paper argues that there are few incentives that ‘recognise and support the most intellectually able’, continuing:

 ‘At a school level, the previous National Academy for Gifted and Talented Youth, was cancelled in 2010 and its funds used for the National Scholarship Programme! [sic]’

This is hopelessly wrong.

NAGTY’s five year contract ended in 2007. Its sponsor, Warwick University, chose not to bid for the subsequent contract, which was intended to extend support to all England’s gifted and talented learners (then numbered at approximately one million), rather than the top 5% of 11-19 year-olds who were NAGTY’s main target group.

The subsequent contract, for Young, Gifted and Talented, ended in 2010 and was not renewed, as the then Labour Government decided to devolve responsibility to schools. This funding stream was not diverted to the NSP, which was administered by HEFCE through BIS.

The paper continues:

‘In line with a general approach towards autonomy, there is also no agreed definition of able students or gifted and talented students. Anecdotally, it is often tended to be used for somewhere around the top 15% or so of the cohort in ability terms. However, this note takes a different and much narrower definition, and is concerned with what might be called the extremely able – those with ability levels found in approximately 1 in every 10,000 of the population.’

The problematic co-existence of definitional autonomy and Ofsted’s emphasis on assessing the effectiveness of all schools’ support for the most able is not discussed.

The reference to ‘somewhere around the top 15%’ is more than anecdotal – it is plucked entirely out of the air. Having introduced this topic, what is the justification given for shifting the emphasis away from 15% of learners to the 0.01% (1 in 10,000) of prospective undergraduates? The policy response to one has negligible bearing on the other.

(In fact, the footnotes reveal that a cadre of 200 scholarships would accommodate some 0.003% of the undergraduate population.)

The next section of the paper suggests that SMPY has been focused on different countries, yet SMPY participants have all been resident in the United States (though Cohort 5 covers graduate students enrolled in the top-ranked maths, science and engineering courses located there).

Benbow and Lubinski argue for a national scheme to identify and nurture such learners from the age of 13. Yet the paper switches again to discuss university scholarship schemes in the US, India, France and Russia. The three still extant are all focused on maths, science and technology, so are not direct parallels with what is proposed here.

A comparison is drawn with elite sports funding:

‘This approach mirrors closely the “no compromise approach” of elite sporting organisations funded by UK Sport, which requires tangible outcomes of high performance (ie realistic chances of an Olympic medal) in exchange for funding. Less successful sports, however, popular, are not entitled to the same levels of funding. The net result is that performance at the elite end of UK sport has exponentially grown – whilst alongside that, other funding helps develop grass roots sport and widening participation.’

I struggle to understand the parallels between funding for successful sports and for successful students, unless this is supposed to make the case for not linking the scholarships to socio-economic disadvantage.

The inclusion of a table of five countries’ Olympic medal tallies from 1996-2012 is, however, entirely spurious and redundant.

.

Conclusion

The end of the paper says:

‘There should also be a renewed focus on how to stretch all pupils within the state sector at whatever level, and further work on identifying potential highly able talent across the wider state education sector as Ofsted have identified – both of which will be the focus of future Policy Exchange work. But this is not the same thing, and nor should it be confused with, a scheme to reward and nurture excellence at 18 now, wherever it comes from.’

This is surely ironic, in that much of the commentary above shows how these two issues have been interleaved in the paper itself.

The fact that Policy Exchange plans fresh work on the wider question of support for the most able in schools is welcome. I look forward to being involved.

But, meanwhile, this idea should be consigned to the bin.

.

.

GP

March 2015

The most able students: Has Ofsted made progress?

.

This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.

.

Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour’s response, by contrast, restates the commitment noted above:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’

 .

 .

  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breadth they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response, from Conservatives or Liberal Democrats, to the challenge laid down by Labour, which has decided that some degree of arms-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas over common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions

.

Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

They fail to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort: adding the English and maths groups and subtracting the 24% counted in both gives 29% + 44% − 24% = 49%.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
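For anyone wanting to check the arithmetic, here is a minimal sketch of the inclusion-exclusion calculation in Python, using the SFR percentages quoted above:

```python
# Share of the Year 7 cohort meeting Ofsted's 'most able' definition:
# L5+ in English (reading and writing) and/or maths (2014 KS2 SFR figures).
english = 29  # % achieving Level 5+ in reading and writing
maths = 44    # % achieving Level 5+ in maths
both = 24     # % achieving Level 5+ in reading, writing and maths

# Inclusion-exclusion: pupils in both groups would otherwise be counted twice
either = english + maths - both
print(f"'Most able' share of the Year 7 cohort: {either}%")  # prints 49%
```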

.

Ofsted venn Capture

.

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, ie once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

The separate maths and English figures each include pupils who achieved L5 in both, so subtracting that overlap gives a total population of 165,340 + 138,789 − 91,944 = 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
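The same inclusion-exclusion sketch, this time with the Annex A pupil counts (and with the 2011 figure standing in for the unavailable 2009 cohort size):

```python
# Pupils in non-selective schools with L5 at KS2 (Ofsted Annex A, 2009 cohort)
l5_maths = 165_340
l5_english = 138_789
l5_both = 91_944  # already included in each of the two figures above

# Subtract the overlap so pupils with L5 in both subjects are counted once
most_able = l5_maths + l5_english - l5_both  # 212,185

ks2_cohort = 547_025  # 2011 Primary Performance Tables figure, proxy for 2009
print(f"{most_able:,} pupils = {most_able / ks2_cohort:.0%} of the cohort")  # ~39%
```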

.

Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

.

Ofsted graph 1

Percentage of high attainers within each state-funded non-selective secondary school’s cohort 2014 (Performance Tables measure)
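For anyone who wants to reproduce a distribution chart of this kind from the Performance Tables download, a sketch along these lines would do it; the file name and column name are hypothetical, standing in for wherever the school-level percentages have been saved:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV: one row per state-funded non-selective school, with the
# percentage of high attainers in its 2014 GCSE cohort (Performance Tables measure)
schools = pd.read_csv("secondary_performance_tables_2014.csv")

# Band the schools into 10-percentage-point bins and count schools per band
bands = pd.cut(schools["pct_high_attainers"], bins=range(0, 101, 10),
               include_lowest=True)
bands.value_counts().sort_index().plot(kind="bar")

plt.xlabel("Percentage of high attainers in GCSE cohort")
plt.ylabel("Number of schools")
plt.title("High attainers across non-selective schools, 2014")
plt.tight_layout()
plt.show()
```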

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained a L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained a L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained a L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils at all in the relevant category within the GCSE cohort. (It is of course a different 9% in each case.)

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students; indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
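The scaling is simple – a minimal sketch, reusing the figures above and assuming, as the text does, that the 13.2% share carries over to the single-subject groups:

```python
# Disadvantaged share of the group with L5 in both English and maths (Annex A)
disadvantaged_both = 12_150
all_both = 91_944
share = disadvantaged_both / all_both  # ~13.2%

# Assume the same share holds across the whole 'most able' population
most_able = 212_185  # derived above via inclusion-exclusion
print(f"Estimated disadvantaged most able: {share * most_able:,.0f}")  # ~28,000
```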

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so-called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

Ofsted Capture 1

Ofsted Capture 2

 .

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

Approximately one in four high attainers is not achieving each of these progression targets, even though they are not particularly demanding.

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.

.

The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sub-levels, even though this is amply evidenced by the transition matrices:

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.

.

Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).
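Expressed as arithmetic – a minimal sketch using the four pairs of percentages quoted above:

```python
# (disadvantaged 'ever 6 FSM' %, other %) GCSE outcomes for KS2 L5 pupils
pairs = {
    "English, grade B and above": (66, 79),
    "Maths, grade B and above": (61, 78),
    "English and maths, A*/A": (20, 37),
    "English and maths, B and above": (59, 75),
}
for measure, (ever6, other) in pairs.items():
    print(f"{measure}: {other - ever6} percentage point gap")  # 13, 17, 17, 16
```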

A table is supplied showing progression by sub-level in English and maths separately.

.

Ofsted Capture 3

. 

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

.

Ofsted chart 2

GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment

.

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.

 .

Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.

.

Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools’

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s evidence base: Inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

.

Ofsted Capture 4

.

  • Supplementary questions asked during 130 Section 5 inspections focused on how well the most able students are maintaining their progress in KS3, plus the challenge and availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly onto the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged was small. Consequently leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.

.

Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and 5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that the work is too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.

. 

Assessment and tracking

Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tended to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes made as a consequence. Only in eight schools were the most able students’ views sought as a distinct cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable the most able to reach their potential. Weaknesses at KS3 meant there was too much to catch up at KS4 and 5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.

 .

Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’

. 

School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.

.

Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.’

Too often, poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared to be a factor stifling progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

‘While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 schools asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: “There is confusion (in schools) between excellence and elitism”.’

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.

.

Ofsted and other central government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

.

Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, too often, the most able students are succeeding despite their schools, rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question of whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It is as if the ‘school-led, self-improving’ ideal were already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

Ofsted capture 5

Ofsted Capture 6

Those Ofsted directs at itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it is possible that it might do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that: 

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 

.

If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is why schools should be any more likely to respond this time round than in 2013.

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take.

Until we can establish the framework necessary to secure universally high standards across all schools, without resorting to national prescription, we – and Ofsted – are whistling in the wind.

GP

March 2015

Labour’s Commitment to Gifted Education: Can the Tories match it?

.

Today, Labour announced that it would support gifted and talented children.

.

This short post examines what is so far in the public domain.

Is this concerted action?

We heard on Sunday (1 March 2015) that Ofsted is bringing forward publication of its second survey report on the education of the ‘most able’.

Plans for the survey were announced in HMCI’s Annual Report, published in December 2014. I set out exactly what was proposed in this contemporaneous post.

At the end of January, HMCI Wilshaw told the Education Select Committee that the second survey report would be published in May (see page 41) but newspaper reports over the weekend said it would appear tomorrow (4 March).

Labour’s announcement is obviously timed to anticipate Ofsted’s report.

By bringing forward his report to this side of the General Election, HMCI has certainly ensured that it will exert much more leverage on political decision-making. He will want that to impact on the Conservatives as well as Labour.

.

What exactly is Labour’s commitment?

The original newspaper report is so far our only source. (I will add any further details from material that appears subsequently.)

It says that, if elected:

  • Labour would establish an independently-administered Gifted and Talented Fund, which is likely to ‘have a £15m pot initially’.
  • Schools would be able to bid for money from the Fund to ‘help their work in stretching the most able pupils’.
  • The Fund would help to establish ‘a new evidence base on how to encourage talented children’.

The current evidence base, cited in support of this decision, comprises: material from Ofsted’s first survey report (June 2013); the Social Mobility and Child Poverty Commission’s report on High-attaining children from disadvantaged backgrounds (June 2014); and PISA data (which I analysed in this post from December 2013).

.

Unanswered questions

There are many.

The use of ‘gifted and talented’ terminology may be misleading, in that the remainder of the text suggests Labour is focused on high attainers including (but not exclusively) those from disadvantaged backgrounds.

It is not clear whether the £15m funding commitment is an annual commitment or an initial investment that might or might not be topped up subsequently.

It seems to be available to both primary and secondary schools, but this is not made explicit.

It is not clear how bids for the funding would be assessed, or who would assess them.

The purpose of the funding seems primarily to support teachers and schools rather than to support high attaining learners themselves.

The relationship between the Fund and building the evidence base is not made clear. Will there be an expectation of school-based action research, for example?

There is no explicit ‘joining up’ with wider Labour action on social mobility or fair access to selective higher education (and there is an unfortunate allusion to the pupil premium which suggests it is exclusively to help lower attainers).

In a separate blog, Shadow Minister Hunt does link the Fund to these twin aims:

‘The long and the short of it is this: if we could help talented, disadvantaged children to achieve at the same trajectory as their better off peers it would almost double the number of children from poor backgrounds attending the top universities.’

but the mechanism by which this will be achieved – and the link with Offa’s regime – is left unexplained.

Then in a third statement, Hunt implies that the funding is:

‘…to support the most able pupils from low and middle income backgrounds to progress into the professions, high quality apprenticeships and the best universities’

This suggests that the funding will not be targeted exclusively towards those from disadvantaged backgrounds, but it will be targeted at learners rather than teachers and schools.

It will be interesting to see whether the Fund is described more specifically in Labour’s Manifesto.

.

Is anyone on the inside track?

The word on the street is that Labour developed its policy through an internal review.

But the inclusion of a statement from Peter Lampl might suggest that they are in cahoots with the Sutton Trust, where an ex-Labour SPAD is ensconced as Director of Research and Communications.

The Trust’s Mobility Manifesto (September 2014) includes a call for:

‘…an effective national programme for highly able state school pupils, with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress.’

Unfortunately, it is also wedded to the misguided Open Access scheme, which involves denuding state-funded schools of high attainers and diverting them to independent schools instead. (For a more balanced and careful analysis see this post from April 2012.)

It cannot be entirely accidental that Lampl published his latest article pushing this wheeze on the same day as Labour’s announcement.

The Education Endowment Foundation might be a potential home for the Fund – and of course the Sutton Trust has a close relationship with the EEF.

 .

Pressure on the Tories?

The combined weight of Labour’s announcement and HMCI’s report will put significant pressure on the Tories, in particular, to follow suit.

They are already in a difficult position in this territory, having publicly wavered between selection and setting.

Back in 2007 then Opposition Leader Cameron ruled out new grammar schools and proposed universal setting as an alternative.

‘Most critics seem to accept, when pressed, that as I have said, the prospect of more grammars is not practical politics….

…When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.

With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a ‘grammar stream’ in every subject in every school.

Setting would be a focus for Ofsted and a priority for all new academies.’

More recently, he has enthused about the expansion of existing grammar schools:

 ‘”I strongly support the right of all good schools to expand. I think that’s very important and that should include grammar schools,” the prime minister said:

“Under this government grammar schools have been able to expand and that is all to the good.”‘

The as-yet-unresolved decision on the Sevenoaks satellite is keeping this a live issue as we approach the Election.

There are now media reports that, while the proposal is ready to be approved, Cameron has insisted that the decision is shelved until after the Election, in an effort to prevent it becoming a significant issue during the Tories’ campaign.

Meanwhile, in September 2014, there was a brief resurgence of the plan for compulsory setting. But this was rapidly relegated to one of a menu of options in the armoury of regional schools commissioners, who would be granted new powers to intervene in failing schools.

In March 2015, the Tory-leaning Policy Exchange think tank published its Education Manifesto, which proposed that:

‘Government should design a prestigious scholarship scheme to financially support the most talented undergraduates in the country – covering approximately 200 individuals a year – if they attend a UK university and remain in the UK for at least three years after graduation.’

This seems too small a fig-leaf to conceal the Tories’ embarrassment – and it is anyway poorly conceived – see my analysis here.

.

The Tories’ only other fallback is the claim that the Coalition Government’s more generic policies will raise standards across the board, including at the top of the attainment spectrum.  This seems increasingly threadbare, however.

With no viable plan C, they could still be squeezed between Labour’s new-found commitment to gifted education and UKIP’s espousal of grammar schools.

Initial reaction to Labour’s announcement?

This is the first time Labour have expressed support for high attainers since Andy Burnham was Shadow Minister.

If the sum they have announced is an annual commitment, this broadly matches the budget for the National Gifted and Talented Programme when it was at its height in the mid-2000s.

They are clearly anxious to keep this support at arm’s length from Government – they don’t want to return to a national programme.

The disadvantages of full autonomy could be avoided if bids are invited against a framework of priorities, rather than left entirely for schools to determine. Labour presumably want this funding to make a difference to the statistics they cite from the evidence base.

If the funding is for educators rather than learners, that raises the question of whether those from disadvantaged backgrounds might not also be supported through a £50m pupil premium topslice, as I have suggested elsewhere.

It would also be helpful if the funding was linked to a national effort to reach consensus on the education of high attainers, as embodied in these ten core principles.

But this is a decent start. ‘Better than a poke in the eye with a blunt stick’, as my favourite colloquialism has it.

.

GP

March 2015

Maths Mastery: Evidence versus Spin

.

On Friday 13 February, the Education Endowment Foundation (EEF) published the long-awaited evaluation reports of two randomised control trials (RCTs) of Mathematics Mastery, an Ark-sponsored programme and recipient of one of the EEF’s first tranche of awards back in 2011.

EEF, Ark and Mathematics Mastery each published a press release to mark the occasion but, given the timing, none attracted attention from journalists and they were discussed only briefly on social media.

The main purpose of this post is to distinguish evidence from spin, to establish exactly what the evaluations tell us – and what provisos should be attached to those findings.

The post is organised into three main sections which deal respectively with:

  • Background to Mathematics Mastery
  • What the evaluation reports tell us and
  • What the press releases claim

The conclusion sets out my best effort at a balanced summary of the main findings. (There is a page jump here for those who prefer to cut to the chase.)

This post is written by a non-statistician for a lay audience. I look to specialist readers to set me straight if I have misinterpreted any statistical techniques or findings.

What was published?

On Friday 13 February the EEF published six different documents relevant to the evaluation:

  • A press release: ‘Low-cost internet-based programme found to considerably improve reading ability of year 7 pupils’.
  • A blog post: ‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’.
  • An updated Maths Mastery home page (also published as a pdf Project Summary in a slightly different format).

The last three of these were written by the Independent Evaluators – Jerrim and Vignoles (et al) – employed through the UCL Institute of Education.

The Evaluators also refer to ‘a working paper documenting results from both trials’ available in early 2015 from http://ideas.repec.org/s/qss/dqsswp.html and www.johnjerrim.com. At the time of writing this is not yet available.

Press releases were issued on the same day by Ark and by Mathematics Mastery.

All of the materials published to date are included in the analysis below.

Background to Maths Mastery

What is Maths Mastery?

According to the NCETM (October 2014) the mastery approach in mathematics is characterised by certain common principles:

‘  • Teachers reinforce an expectation that all pupils are capable of achieving high standards in mathematics.

  • The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.
  • Teaching is underpinned by methodical curriculum design and supported by carefully crafted lessons and resources to foster deep conceptual and procedural knowledge.
  • Practice and consolidation play a central role. Carefully designed variation within this builds fluency and understanding of underlying mathematical concepts in tandem.
  • Teachers use precise questioning in class to test conceptual and procedural knowledge, and assess pupils regularly to identify those requiring intervention so that all pupils keep up.

The intention of these approaches is to provide all children with full access to the curriculum, enabling them to achieve confidence and competence – ‘mastery’ – in mathematics, rather than many failing to develop the maths skills they need for the future.’

The NCETM paper itemises six key features, which I paraphrase as:

  • Curriculum design: Relatively small, sequenced steps which must each be mastered before learners move to the next stage. Fundamental skills and knowledge are secured first and these often need extensive attention.
  • Teaching resources: A ‘coherent programme of high-quality teaching materials’ supports classroom teaching. There is particular emphasis on ‘developing deep structural knowledge and the ability to make connections’. The materials may include ‘high-quality textbooks’.
  • Lesson design: Often involves input from colleagues drawing on classroom observation. Plans set out in detail ‘well-tested methods’ of teaching the topic. They include teacher explanations and questions for learners.
  • Teaching methods: Learners work on the same tasks. Concepts are often explored together. Technical proficiency and conceptual understanding are developed in parallel.
  • Pupil support and differentiation: Is provided through support and intervention rather than through the topics taught, particularly at early stages. High attainers are ‘challenged through more demanding problems which deepen their knowledge of the same content’. Issues are addressed through ‘rapid intervention’ commonly undertaken the same day.
  • Productivity and practice: Fluency is developed from deep knowledge and ‘intelligent practice’. Early learning of multiplication tables is expected. The capacity to recall facts from long term memory is also important.

Its Director published a blog post (October 2014) arguing that our present approach to differentiation has ‘a very negative effect’ on mathematical attainment and that this is ‘one of the root causes’ of our performance in PISA and TIMSS.

This is because it negatively affects the ‘mindset’ of low attainers and high attainers alike. Additionally, low attainers are insufficiently challenged and get further behind because ‘they are missing out on some of the curriculum’. Meanwhile high attainers are racing ahead without developing fluency and deep understanding.

He claims that these problems can be avoided through a mastery approach:

‘Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace, allowing them all full access to the curriculum by focusing on developing deep understanding and secure fluency with facts and procedures, and providing differentiation by offering rapid support and intervention to address each individual pupil’s needs.’

But unfortunately he stops short of explaining how, for high attainers, exclusive focus on depth is preferable to a richer blend of breadth, depth and pace, combined according to each learner’s needs.

NCETM is careful not to suggest that mastery is primarily focused on improving the performance of low-attaining learners.

It has published separate guidance on High Attaining Pupils in Primary Schools (registration required), which advocates a more balanced approach, although that predates this newfound commitment to mastery.

NCETM is funded by the Department for Education. Some of the comments on the Director’s blog post complain that it is losing credibility by operating as a cheerleader for Government policy.

Ark’s involvement

Ark is an education charity and multi-academy trust with an enviable reputation.

It builds its approach on six key principles, one of which is ‘Depth before breadth’:

‘When pupils secure firm foundations in English and mathematics, they find the rest of the curriculum far easier to access. That’s why we prioritise depth in these subjects, giving pupils the best chance of academic success. To support fully our pupils’ achievement in maths, we have developed the TES Award winning Mathematics Mastery programme, a highly-effective curriculum and teaching approach inspired by pupil success in Singapore and endorsed by Ofsted. We teach Mathematics Mastery in all our primary schools and at Key Stage 3 in a selection of our secondary schools. It is also being implemented in over 170 schools beyond our network.’

Ark’s 2014 Annual Report identifies five priorities for 2014/15, one of which is:

‘…developing curricula to help ensure our pupils are well prepared as they go through school… codifying our approach to early years and, building on the success of Maths Mastery, piloting an English Mastery programme…’

Mathematics Mastery is a charity in its own right. Its website lists 15 staff, a high-powered advisory group and three partner organisations:  Ark, the EEF (presumably by virtue of the funded evaluation) and the ‘Department for Education and the Mayor of London’ (presumably by virtue of support from the London Schools Excellence Fund).

NCETM’s Director sits on Mathematics Mastery’s Advisory Board.

Ark’s Chief Executive is a member of the EEF’s Advisory Board.

Development of Ark’s Maths Mastery programme

According to this 2012 report from Reform, which features Maths Mastery as a case study, it originated in 2010:

‘The development of Mathematics Mastery stemmed from collaboration between six ARK primary academies in Greater London, and the mathematics departments in seven separate ARK secondary academies in Greater London, Portsmouth and Birmingham. Representatives from ARK visited Singapore to explore the country’s approach first-hand, and Dr Yeap Ban Har, Singapore’s leading expert in maths teaching, visited King Solomon Academy in June 2011.’

In October 2011, EEF awarded Ark a grant of £600,000 for Maths Mastery, one of its first four awards.

The EEF’s press release says:

‘The third grant will support an innovative and highly effective approach to teaching children maths called Mathematics Mastery, which originated in Singapore. The programme – run by ARK Schools, the Academies sponsor, which is also supporting the project – will receive £600,000 over the next four years to reach at least 50 disadvantaged primary and secondary schools.’

Ark’s press release adds:

‘ARK Schools has been awarded a major grant by the Education Endowment Foundation (EEF) to further develop and roll out its Mathematics Mastery programme, an innovative and highly effective approach to teaching children maths based on Singapore maths teaching. The £600,000 grant will enable ARK to launch the programme and related professional development training to improve maths teaching in at least 50 disadvantaged primary and secondary schools.

The funding will enable ARK Schools to write a UK mathematics mastery programme based on the experience of teaching the pilot programme in ARK’s academies. ARK intends to complete the development of its primary modules for use from Sept 2012 and its secondary modules for use from September 2013. In parallel ARK is developing professional training and implementation support for schools outside the ARK network.’

The project home page on EEF’s site now says the total project cost is £774,000. It may be that the balance of £174,000 is the fee paid to the independent evaluators.

This 2012 information sheet says all Ark primary schools would adopt Maths Mastery from September 2012, and that its secondary schools have also devised a KS3 programme.

It describes the launch of a Primary Pioneer Programme from September 2012 and a Secondary Pioneer Programme from September 2013. These will form the cohorts to be evaluated by the EEF.

In 2013, Ark was awarded a grant of £617,375 from the Mayor of London’s London Schools Excellence Fund for the London Primary Schools Mathematics Mastery Project.

This is to support the introduction of Mastery in 120 primary schools spread across 18 London boroughs. (Another source gives the grant as £595,000.)

It will be interesting to see whether Maths Mastery (or English Mastery) features in the Excellence Fund’s latest project to increase primary attainment in literacy and numeracy. The outcomes of the EEF evaluations may be relevant to that impending decision.

Ark’s Mathematics Mastery today

The Mathematics Mastery website advertises a branded variant of the mastery model, derived from a tripartite ‘holistic vision’:

  • Deep understanding, through a curriculum that combines universal high expectations with spending more time on fewer topics and heavy emphasis on problem-solving.
  • Integrated professional development through workshops, visits, coaching and mentoring and ‘access to exclusive online teaching and learning materials, including lesson guides for each week’.
  • Teacher collaboration – primary schools are allocated a geographical cluster of 4-6 schools while secondary schools attend a ‘national collaboration event’. There is also an online dimension.

It offers primary and secondary programmes.

The primary programme has three particular features: use of objects and pictures prior to the introduction of symbols; a structured approach to the development of mathematical vocabulary; and heavy emphasis on problem-solving.

It involves one-day training sessions for school leaders, for the Maths Mastery lead and those new to teaching it, and for teachers undertaking the programme in each year group. Each school receives two support visits and attends three local cluster meetings.

Problem-solving is also one of three listed features of the secondary programme. The other two are fewer topics undertaken in greater depth, plus joint lesson planning and departmental workshops.

There are two full training days, one for the Maths Mastery lead and one for the maths department plus an evening session for senior leadership. Each school receives two support visits and attends three national collaborative meetings. They must hold an hour-long departmental workshop each week and commit to sharing resources online.

Both primary and secondary schools are encouraged to launch the programme across Year 1/7 and then roll it upwards ‘over several years’.

The website is not entirely clear but it appears that Maths Mastery itself is being rolled out a year at a time, so even the original primary early adopters will have provision only up to Year 3 and are scheduled to introduce provision for Year 4 next academic year. In the secondary sector, activity currently seems confined to KS3, and predominantly to Year 7.

The number of participating schools is increasing steadily but is still very small.

The most recent figures I could find are 192 (Maths Mastery, November 2014) or 193 – 142 primary and 51 secondary (Ark 2015).

One assumes that this total includes:

  • An original tranche of 30 primary ‘early adopters’ including 21 not managed by Ark
  • 60 or so primary and secondary ‘Pioneer Schools’ within the EEF evaluations (ie the schools undertaking the intervention but not those forming the control group, unless they have subsequently opted to take up the programme)
  • The 120 primary schools in the London project
  • Primary and secondary schools recruited outwith the London and EEF projects, either alongside them or subsequently.

But the organisation does not provide a detailed breakdown, or show how these different subsets overlap.

The organisation is particularly coy about the cost: there is nothing about this on the website.

The EEF evaluation reports say that two-form entry (2FE) primary schools and secondary schools will pay ‘an upfront cost of £6,000 for participating in the programme’.

With the addition of staff time for training, the per pupil cost for the initial year is estimated as £127 for primary schools and £50 for secondary schools.

The primary report adds:

‘In subsequent years schools are able to opt for different pathways depending on the amount of support and training they wish to choose; they also have ongoing access to the curriculum materials for additional year groups. The per pupil cost therefore reduces considerably, to below £30 per pupil for additional year groups.’

In EEF terms this is deemed a low cost intervention, although an outlay of such magnitude is a significant burden for primary schools, particularly when funding is under pressure, and might be expected to act as a brake on participation.
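The reports do not show the working behind these per-pupil figures, but a back-of-envelope sketch makes the arithmetic visible. The cohort sizes (roughly 60 pupils in a 2FE Year 1 intake, roughly 150 in a Year 7 intake) and the staff-time costs below are my assumptions, chosen only to show how figures of about £127 and £50 could arise; they are not taken from the reports.

```python
# Back-of-envelope reconstruction of the first-year per-pupil costs.
# The cohort sizes and staff-time costs are assumptions, not report figures.

def per_pupil_cost(upfront_fee: float, staff_time: float, cohort: int) -> float:
    """First-year cost per pupil: programme fee plus staff time, spread over the cohort."""
    return (upfront_fee + staff_time) / cohort

# 2FE primary: the programme launches in Year 1 only, so roughly 60 pupils.
print(f"primary:   £{per_pupil_cost(6000, 1600, 60):.0f} per pupil")   # ~£127
# Secondary: a whole Year 7 cohort, here assumed to be about 150 pupils.
print(f"secondary: £{per_pupil_cost(6000, 1500, 150):.0f} per pupil")  # £50
```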

Further coyness is evident in respect of statutory assessment outcomes. Some details are provided for individual schools, but there is precious little about the whole cohort.

All I could find was this table in the Primary Yearbook 2014-15.

.

EEF maths mastery performance

It suggests somewhat better achievement at KS1 L2b and L3c than the national average, but there is no information about other Levels and, of course, the sample is not representative, so the comparison is of limited value.

An absence of more sophisticated analysis – combined with the impression of limited transparency for those not yet inside the programme – is likely to act as a second brake on participation.

There is a reference to high attainers in the FAQ on the website:

‘The Mathematics Mastery curriculum emphasises stretching through depth of understanding rather than giving the top end of pupils [sic] new procedures to cover.

Problem solving is central to Mathematics Mastery. The great thing about the problems is that students can take them as far as they can, so those children who grasp the basics quickly can explore tasks further. There is also differentiation in the methods used, with top-end pupils typically moving to abstract numbers more quickly and spending less time with concrete manipulatives or bar models. There are extension ideas and support notes provided with the tasks to help you with this.

A range of schools are currently piloting the programme, which is working well in mixed-ability classes, as well as in schools that have set groups.’

The same unanswered questions arise as with the NCETM statement above. Is ‘Maths Mastery’ primarily focused on the ‘long tail’, potentially at the expense of high attainers?

The IoE evaluators think so. The primary evaluation report says that:

‘Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers.’

It would be helpful to have clarity on this point.

.

How influential is Maths Mastery?

Extremely influential.

Much educational and political capital has already been invested in Maths Mastery, hence the peculiar significance of the results contained in the evaluation reports.

The National Curriculum Expert Panel espoused mastery in its ‘Framework for the National Curriculum‘ (December 2011), while ducking the consequences for ‘stretch and challenge’ for high attainers – so creating a tension that remains unresolved to this day.

Meanwhile, the mastery approach has already influenced the new maths programme of study, as the NCETM document makes clear:

‘The 2014 national curriculum for mathematics has been designed to raise standards in maths, with the aim that the large majority of pupils will achieve mastery of the subject…

… For many schools and teachers the shift to this ‘mastery curriculum’ will be a significant one. It will require new approaches to lesson design, teaching, use of resources and support for pupils.’

Maths Mastery confirms that its Director was on the drafting team.

Mastery is also embedded in the national collaborative projects being undertaken through the Maths Hubs. Maths Mastery is one of four national partners in the Hubs initiative.

Ministers have endorsed the Ark programme in their speeches. In April 2014, Truss said:

‘The mastery model of learning places the emphasis on understanding core concepts. It’s associated with countries like Singapore, who have very high-performing pupils.

And in this country, Ark, the academy chain, took it on and developed it.

Ark run training days for maths departments and heads of maths from other schools.

They organise support visits, and share plans and ideas online with other teachers, and share their learning with a cluster of other schools.

It’s a very practical model. We know not every school will have the time or inclination to develop its very own programmes – a small rural school, say, or single-class primary schools.

But in maths mastery, a big chain like Ark took the lead, and made it straightforward for other schools to adopt their model. They maintain an online community – which is a cheap, quick way of keeping up with the best teaching approaches.

That’s the sort of innovation that’s possible.

Of course the important thing is the results. The programme is being evaluated so that when the results come out headteachers will be able to look at it and see if it represents good value.’

In June 2014 she said:

‘This idea of mastery is starting to take hold in classrooms in England. Led by evidence of what works, teachers and schools have sought out these programmes and techniques that have been pioneered in China and East Asia….

…With the Ark Schools Maths Mastery programme, more than 100 primary and secondary schools have joined forces to transform their pupils’ experiences of maths – and more are joining all the time. It’s a whole school programme focused on setting high expectations for all pupils – not believing that some just can’t do it. The programme has already achieved excellent results in other countries.’

Several reputations are being built upon Maths Mastery, many jobs depend upon it and large sums have been invested.

It has the explicit support of one of the country’s foremost academy chains and is already impacting on national curriculum and assessment policy (including the recent consultation on performance indicators for statutory teacher assessment).

Negative or neutral evaluations could have significant consequences for all the key players and are unlikely to encourage new schools to join the Programme.

Hence there is pressure in the system for positive outcomes – hence the significance of spin.

What the EEF evaluations tell us

.

Evaluation Protocols

EEF published separate Protocols for the primary and secondary evaluations in April 2013. These are broadly in line with the approach set out in the final evaluation reports, except that both refer much more explicitly to subsequent longitudinal evaluation:

‘In May/June 2017/18 children in treatment and control schools will sit key stage 2 maths exams. The IoE team will examine the long-run effectiveness of the Maths Mastery programme by investigating differences in school average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2012 and 2013)’.

‘In May/June 2018 children in treatment and control schools will sit national maths exams. The IoE team will examine the long-run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2013 and 2014) by NATCEN.’

It is not clear whether the intention is to preserve the integrity of the intervention and control groups until the former have rolled out Mastery to all year groups, or simply to evaluate the long-term effects of the initial one-year interventions, allowing intervention schools to drop Mastery and control schools to adopt it, entirely as they wish.

EEF Maths Mastery Project Homepage

The EEF’s updated Maths Mastery homepage has been revised to reflect the outcomes of the evaluations. It provides the most accessible summary of those outcomes.

It offers four key conclusions (my emphases):

  • ‘On average, pupils in schools adopting Mathematics Mastery made a small amount more progress than pupils in schools that did not. The effect detected was statistically significant, which means that it is likely that that improvement was caused by the programme.’
  • ‘It is unclear whether the programme had a different impact on pupils eligible for free school meals, or on pupils with higher or lower attainment.’
  • ‘Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider.’
  • ‘The evaluations assessed the impact of the programme in its first year of adoption. It would be worthwhile to track the medium and long-term impact of the approach.’

A table is supplied showing the effect sizes and confidence intervals for overall impact (primary and secondary together), and for the primary and secondary interventions separately.

EEF table 1 Capture

.

The support materials for the EEF’s toolkit help to explain these judgements.

About the Toolkit tells us that:

‘Average impact is estimated in terms of the additional months’ progress you might expect pupils to make as a result of an approach being used in school, taking average pupil progress over a year as a benchmark.

For example, research summarised in the Toolkit shows that improving the quality of feedback provided to pupils has an average impact of eight months. This means that pupils in a class where high quality feedback is provided will make on average eight months more progress over the course of a year compared to another class of pupils who were performing at the same level at the start of the year. At the end of the year the average pupil in a class of 25 pupils in the feedback group would now be equivalent to the 6th best pupil in the control class having made 20 months progress over the year, compared to an average of 12 months in the other class.’

There is another table showing us how to interpret this scale:

EEF table 2 Capture

.

We can see from this that:

  • The overall Maths Mastery impact of +0.073 is towards the upper end of the ‘1 months progress’ category.
  • The ‘primary vs comparison’ impact of +0.10 just scrapes into the ‘2 months progress’ category.
  • The secondary vs comparison impact of +0.06 is towards the middle of the ‘1 months progress’ category.

All three are officially classed as ‘Low Effect’.

If we compare the effect size attributable to Maths Mastery with others in the Toolkit, it is evident that it ranks slightly above school uniform and slightly below learning styles.
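To make the conversion mechanical, here is a minimal sketch. The band boundaries are inferred from the figures discussed above (effect sizes of 0.02–0.09 equating to one month’s progress and 0.10–0.18 to two months), so treat them as assumptions rather than the EEF’s published definition. It also shows how sensitive the result is around the 0.10 boundary – a point that matters below.

```python
# Minimal sketch of the EEF-style effect-size-to-months conversion.
# Band lower bounds are inferred from the figures quoted in this post.
import bisect

BAND_LOWER_BOUNDS = [0.02, 0.10, 0.19, 0.27]  # months 1, 2, 3, 4...

def months_progress(effect_size: float) -> int:
    """Map an effect size to 'months of additional progress'."""
    return bisect.bisect_right(BAND_LOWER_BOUNDS, effect_size)

for label, es in [("overall", 0.073), ("primary (2 d.p.)", 0.10),
                  ("primary (3 d.p.)", 0.099), ("secondary", 0.06)]:
    print(f"{label}: {es} -> {months_progress(es)} month(s)")
# The primary result drops from 2 months to 1 when a third decimal is used.
```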

A subsequent section explains that the overall impact rating is dependent on meta-analysis (again my emphases):

‘The findings from the individual trials have been combined using an approach called “meta-analysis”. Meta-analysis can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note that care is needed in interpreting meta-analysed findings.’

But we are not told how, in light of this, we are to exercise care in interpreting this particular finding. There are no explicit ‘health warnings’ attached to it.

The homepage does tell us that:

‘Due to the ages of pupils who participated in the individual trials, the headline findings noted here are more likely to be predictive of programme’s impact on pupils in primary school than on pupils in secondary school.’

It also offers an explanation of why the effects generated from these trials are so small compared with those for earlier studies:

‘The findings were substantially lower than the average effects seen in the existing literature on “mastery approaches”. A possible explanation for this is that many previous studies were conducted in the United States in the 1970s and 80s, so may overstate the possible impact in English schools today. An alternative explanation is that the Mathematics Mastery programme differed from some examples of mastery learning previously studied. For example classes following the Mathematics Mastery approach did not delay starting new topics until a high level of proficiency had been achieved by all students, which was a key feature in a number of apparently effective programmes.’

 

There is clearly an issue with the 95% confidence intervals supplied in the first table above. 

The Technical Appendices to the Toolkit say:

‘For those concerned with statistical significance, it is still readily apparent in the confidence intervals surrounding an effect size. If the confidence interval includes zero, then the effect size would be considered not to have reached conventional statistical significance.’ (p6)

The table indicates that the lower confidence limit is zero or below in all three cases, meaning that, on this test, none of these findings is statistically significant.

However, the homepage claims that the overall impact of both interventions, when combined through meta-analysis, is statistically significant.

And it fails entirely to mention that the impacts of the primary and the secondary interventions, taken separately, are statistically insignificant.

The explanation of the attribution of statistical significance to the two evaluations combined is that, whereas the homepage gives confidence intervals to two decimal places, the reports calculate them to a third decimal place.

This gives a lower value of 0.004 (ie four thousandths above zero).

This can be seen from the table annexed to the primary and secondary reports and included in the ‘Overarching Summary Report’.

EEF maths mastery 3 decimal places Capture
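For readers who want to check the arithmetic, the pooled figure can be roughly reproduced by a fixed-effect (inverse-variance) meta-analysis, back-calculating each trial’s standard error from its reported 95% confidence interval. This is an approximation under those assumptions, not a replication of the evaluators’ exact method.

```python
# Rough reproduction of the pooled estimate via fixed-effect
# (inverse-variance) meta-analysis; standard errors are back-calculated
# from the reported 95% confidence intervals, so results are approximate.
import math

Z = 1.96  # two-sided 95% critical value

def se_from_ci(lower: float, upper: float) -> float:
    """Back out a standard error from a 95% confidence interval."""
    return (upper - lower) / (2 * Z)

studies = {
    "primary":   (0.099, se_from_ci(-0.01, 0.21)),
    "secondary": (0.055, se_from_ci(-0.037, 0.147)),
}

weights = {name: 1 / se ** 2 for name, (_, se) in studies.items()}
pooled = sum(w * studies[name][0] for name, w in weights.items()) / sum(weights.values())
pooled_se = math.sqrt(1 / sum(weights.values()))

print(f"pooled effect size: {pooled:.3f}")  # ~0.073
print(f"95% CI: ({pooled - Z * pooled_se:.3f}, {pooled + Z * pooled_se:.3f})")
# The lower bound lands a few thousandths above zero -- 'just significant'.
```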

.

The distinction is marginal, to say the least. Indeed, the Evaluation Reports say:

‘…the pooled effect size of 0.073 is just significantly different from zero at conventional thresholds’

Moreover, notice that the introduction of a third decimal place drags the primary effect size down to 0.099, officially consigning it to the ‘one month’s progress’ category rather than the two months quoted above.

This might appear to be dancing on the head of a statistical pin but, as we shall see later, the spin value of statistical significance is huge!

Overall there is a lack of clarity here that cannot be attributed entirely to the necessity for brevity. The attempt to conflate subtly different outcomes from the separate primary and secondary evaluations has masked these distinctions and distorted the overall assessment.

.

The full reports add some further interesting details which are summarised in the sections below.

Primary Evaluation Report 

EEF maths mastery table 4

Key points:

  • In both the primary and secondary reports, additional reasons are given for why the effects from these evaluations are so much smaller than those from previous studies. These include the fact that:

‘…some studies included in the mastery section of the toolkit show small or no effects, suggesting that making mastery learning work effectively in all circumstances is challenging.’

The overall conclusion is an indirect criticism of the Toolkit, noting as it does that ‘the relevance of such evidence for contemporary education policy in England…may be limited’.

  • The RCT was undertaken across two academic years: In AY2012/13, 40 schools (Cohort A) were involved. Of these, 20 were randomly allocated the intervention and 20 the control. In AY2013/14, 50 schools (Cohort B) participated, 25 allocated the intervention and 25 the control. After the trial, control schools in Cohort A were free to pursue Maths Mastery. (The report does not mention whether this also applied to Cohort B.) It is not clear how subsequent longitudinal evaluation will be affected by such leakage from the control group.
  • The schools participating in the trial were recruited by Ark. They had to be state-funded and not already undertaking Maths Mastery:

‘Schools were therefore purposefully selected—they cannot be considered a randomly chosen sample from a well-defined population. The majority of schools participating in the trial were from London or the South East.’

  • Unlike in the secondary evaluation, no process evaluation was conducted, so it is not possible to determine the extent to which schools adhered to the prescribed programme.
  • Baseline tests were administered after allocation between intervention and control, at the beginning of each academic year. Pupils were tested again in July. Evaluators used the Number Knowledge Test (NKT) for this purpose. The report discusses reasons why this might not be an accurate predictor of subsequent maths attainment and whether it is so closely related to the intervention as to be ‘a questionable measure of the success of the trial’. The discussion suggests that there were potential advantages to both the intervention and control groups but does not say whether one outweighed the other. 
  • The results of the post-test are summarised thus:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.10 standard deviations higher on the post-test. This, however, only reached statistical significance at the 10% level (t = 1.82; p = 0.07), with the 95% confidence interval ranging from -0.01 to +0.21. Within Cohort A, children in the treatment group scored (on average) +0.09 standard deviations above those children in the control group (confidence interval -0.06 to +0.24). The analogous effect in Cohort B was +0.10 (confidence interval -0.05 to 0.26). Consequently, although the Mathematics Mastery intervention may have had a small positive effect on children’s test scores, it is not possible to rule out sampling variation as an explanation.’

  • The comparison of pre-test and post-test results provides no convincing evidence of differential effects for those with lower or higher prior attainment:

‘Estimates are again presented in terms of effect sizes. The interaction effect is not significantly different from zero, with the 95% confidence interval ranging from -0.01 to +0.02. Thus there is little evidence that the effect of Mathematics Mastery differs between children with different levels of prior achievement.’

The Report adds:

‘Recall that the Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers. Thus one might anticipate the intervention to be particularly effective in the bottom half of the test score distribution. There is some, but relatively little, evidence that the intervention was less effective for the bottom half of the test distribution.’

So, on this evidence, Maths Mastery is no more effective for the low achievers it is intended to help most. This is somewhat different to the suggestion on the homepage that the answer given to this question is ‘unclear’.

Several limitations are discussed, but it is important to note that they are phrased in hypothetical terms:

  • Pupils’ progress was evaluated after one academic year:

‘This may be considered a relatively small ‘dose’ of the Mathematics Mastery programme’.

  • The intervention introduced a new approach to schools, so there was a learning curve which control schools did not experience:

‘With more experience teaching the programme it is possible that teachers would become more effective in implementing it.’

  • The test may favour either control schools or intervention schools.
  • Participating schools volunteered to take part, so it is not possible to say whether similar effects would be found in all schools.
  • It was not possible to control for balance – eg by ethnic background and FSM eligibility – between intervention and control. [This is now feasible so could potentially be undertaken retrospectively to check there was no imbalance.]

Under ‘Interpretation’, the report says:

‘Within the context of the wider educational literature, the effect size reported (0.10 standard deviations) would typically be considered ‘small’….

Yet, despite the modest and statistically insignificant effect, the Mathematics Mastery intervention has shown some promise.’

The phrase ‘some promise’ is justified by reference to the meta-analysis, the cost effectiveness (a small effect size for a low cost is preferable to the same outcome for a higher cost) and the fact that the impact of the entire programme has not yet been evaluated:

‘Third, children are likely to follow the Mathematics Mastery programme for a number of years (perhaps throughout primary school), whereas this evaluation has considered the impact of just the first year of the programme. Long-run effects after sustained exposure to the programme could be significantly higher, and will be assessed in a follow-up study using Key Stage 2 data.’

This is the only reference to a follow-up study. It is less definite than the statement in the assessment protocol and there is no further explanation of how this will be managed, especially given potential ‘leakage’ from the control group.

Secondary Evaluation Report

EEF maths mastery table 5

Key points:

  • 50 schools were recruited to participate in the RCT during AY2013/14, with 25 randomly allocated to the intervention and 25 to the control. All Year 7 pupils within the former experienced the intervention. As in the primary trial, control schools were eligible to access the programme after the end of the trial year. Interestingly, 3 of the 25 intervention schools (12%) dropped out before the end of the year – their reasons are not recorded.
  • As in the primary trial, Ark recruited the participating schools – which had to be state-funded and new to Maths Mastery. Since schools were deliberately selected they could not be considered a random sample. The report notes:

‘Trial participants, on average, performed less well in their KS1 and KS2 examinations than the state school population as a whole. For instance, their KS1 average points scores (and KS2 maths test scores) were approximately 0.2 standard deviations (0.1 standard deviations) below the population mean. This seems to be driven, at least in part, by the fact that the trial particularly under-represented high achievers (relative to the population). For instance, just 12% of children participating in the trial were awarded Level 3 in their Key Stage 1 maths test, compared to 19% of all state school pupils in England.’

  • KS1 and KS2 tests were used to baseline. The Progress in Maths (PiM) test was used to assess pupils at the end of the year. But about 40% of the questions cover content not included in the Y7 Maths Mastery curriculum, which disadvantaged intervention pupils relative to the control group. PiM also includes a calculator section, although calculators are not used in Year 7 of Maths Mastery. It was agreed that breakdowns of results would be supplied to account for this.
  • On the basis of overall test results:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.055 standard deviations higher on the PiM post-test. This did not reach statistical significance at conventional thresholds (t = 1.20; p = 0.24), with the 95% confidence interval ranging from –0.037 to +0.147. Turning to the FSM-only sample, the estimated effect size is +0.066 with the 95% confidence interval ranging from –0.037 to +0.169 (p = 0.21). Moreover, we also estimated a model including a FSM-by intervention interaction. Results suggested there was little evidence of heterogeneous intervention effects by FSM. Consequently, although the Mathematics Mastery intervention may have had a small positive effect on overall PiM test scores, one cannot rule out the possibility that this finding is due to sampling variation.’

  • When the breakdowns were analysed:

‘As perhaps expected, the Mathematics Mastery intervention did not have any impact upon children’s performance on questions covering topics outside the Mathematics Mastery curriculum. Indeed, the estimated intervention effect is essentially zero (effect size = –0.003). In contrast, the intervention had a more pronounced effect upon material that was focused upon within the Mathematics Mastery curriculum (effect size = 0.100), just reaching statistical significance at the 5% level (t = 2.15; p = 0.04).’

  • The only analysis of the comparative performance of high and low attainers is tied to the parts of the test not requiring use of a calculator. It suggests a noticeably smaller effect in the top half of the attainment distribution, with no statistical significance above the 55th percentile. This is substantively different from the finding in the primary evaluation, and it raises the question of whether secondary Maths Mastery needs adjustment to make it more suitable for high attainers.
  • A process evaluation was focused principally on 5 schools from the intervention group. Focus group discussions were held before the intervention and again towards the end. Telephone interviews were conducted and lessons observed. The sample was selected to ensure different sizes of school, FSM intake and schools achieving both poor and good progress in maths according to their most recent inspection report. One of the recommendations is that:

‘The intervention should consider how it might give more advice and support with respect to differentiation.’

  • The process evaluation adds further detail about suitability for high attainers:

‘Another school [E] also commented that the materials were also not sufficiently challenging for the highest-attaining children, who were frustrated by revisiting at length the same topics they had already encountered at primary school. Although this observation was also made in other schools, it was generally felt that the children gradually began to realise that they were in fact enjoying the subject more by gaining extra understanding.’

It is not clear whether this latter comment also extends to the high attainers!

A similar set of limitations is explored in similar language to that used in the primary report.

Under ‘Interpretation’ the report says:

‘Although point estimates were consistent with a small, positive gain, the study did not have sufficient statistical power to rule out chance as an explanation. Within the context of the wider educational literature, the effect size reported (less than 0.10 standard deviations) would typically be considered ‘small’…

But, as in the primary report, it detects ‘some promise’ on the same grounds. There is a similar speculative reference to longitudinal evaluation.

.

Press releases and blogs

. 

EEF press release

There is a certain irony in the fact that ‘unlucky’ Friday 13 February was the day selected by the EEF to release these rather disappointing reports.

But Friday is typically the day selected by communications people to release educational news that is most likely to generate negative media coverage – and a Friday immediately before a school holiday is a particularly favoured time to do so, presumably because fewer journalists and social media users are active.

Unfortunately, the practice is at risk of becoming self-defeating, since everyone now expects bad news on a Friday, whereas they might be rather less alert on a busier day earlier in the week.

On this occasion Thursday was an exceptionally busy day for education news, with reaction to Miliband’s speech and a raft of Coalition announcements designed to divert attention from it. With the benefit of hindsight, Thursday might have been a better choice.

The EEF’s press release dealt with evaluation reports on nine separate projects, so increasing the probability that attention would be diverted away from Maths Mastery.

It led on a different evaluation report which generated more positive findings – the EEF seems increasingly sensitive to concerns that too many of the RCTs it sponsors are showing negligible or no positive effect, presumably because the value-for-money police may be inclined to turn their beady eye upon the Foundation itself.

But perhaps it also did so because Maths Mastery’s relatively poor performance was otherwise the story most likely to attract the attention of more informed journalists and commentators.

On the other hand, Maths Mastery was given second billing:

‘Also published today are the results of Mathematics Mastery, a whole-school approach which aims to deepen pupils’ conceptual understanding of key mathematical ideas. Compared to traditional curricula, fewer topics are covered in more depth and greater emphasis is placed on problem solving and encouraging mathematical thinking. The EEF trials found that pupils following the Mathematics Mastery programme made an additional month’s progress over a period of a year.’

.

.

EEF blog post

Later on 13 February EEF released a blog post written by a senior analyst which mentions Maths Mastery in the following terms:

Another finding of note is the small positive impact of teaching children fewer mathematical concepts, but covering them in greater depth to ensure ‘mastery’. The EEF’s evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only be evident in years to come as children are able to draw on their secure mathematical foundations to tackle more complex problems.’

EEF is consistently reporting a small positive impact but, as we have seen, this is rather economical with the truth. It deserves some qualification.

More interestingly though, the post adds (my emphases):

‘Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and which principles should underpin approaches on encouraging children in reading for pleasure are all issues that have important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.’

It will be interesting to monitor the impact of this work on the communication of outcomes from these particular evaluations.

It will be important to ensure that synthesis and dissemination is not at the expense of accuracy, particularly when ‘high stakes’ results are involved, otherwise there is a risk that users will lose faith in the independence of EEF and its willingness to ‘speak truth unto power’.

.

Maths Mastery Press Release

By also releasing their own posts on 13 February, Mathematics Mastery and Ark made sure that they too would not be picked up by the media.

They must have concluded that, even if they placed the most positive interpretation on the outcomes, they would find it hard to create the kind of media coverage that would generate increased demand from schools.

The Mathematics Mastery release – ‘Mathematics Mastery speeds up pupils’ progress – and is value for money too’ – begins with a list of bullet points citing other evidence that the programme works, so implying that the EEF evaluations are relatively insignificant additions to this comprehensive evidence base:

  • ‘Headteachers say that the teaching of mathematics in their schools has improved
  • Headteachers are happy to recommend us to other schools
  • Numerous Ofsted inspections have praised the “new approach to mathematics” in partner schools
  • Extremely positive evaluations of our training and our school development visits
  • We have an exceptionally high retention rate – schools want to continue in the partnership
  • Great Key Stage 1 results in a large number of schools.’

Much of this is hearsay, or else vague reference to quantitative evidence that is not published openly.

The optimistic comment on the EEF evaluations is:

‘We’re pleased with the finding that, looking at both our primary and secondary programmes together, pupils in the Mathematics Mastery schools make one month’s extra progress on average compared to pupils in the other schools after a one year “dose” of the programme…

…This is a really pleasing outcome – trials of this kind are very rigorous.  Over 80 primary schools and 50 secondary schools were involved in the testing, with over 4000 pupils involved in each phase.  Studies like this often don’t show any progress at all, particularly in the early years of implementation and if, like ours, the programme is aimed at all pupils and not just particular groups.  What’s more, because of the large sample size, the difference in scores between the Mathematics Mastery and other schools is “statistically significant” which means the results are very unlikely to be due to chance.’

The section I have emboldened – the claim that studies like this ‘often don’t show any progress at all’ – is in stark contrast to the EEF blog post above, which is titled:

‘Today’s findings; impact, no-impact and inconclusive – a normal distribution of findings’

and so suggests exactly the opposite.

I have already shown just how borderline the calculation of ‘statistical significance’ has been.

The release concludes:

‘Of course we’re pleased with the extra progress even after a limited time, but we’re interested in long term change and long term development and improvement.  We’re determined to work with our partner schools to show what’s possible over pupils’ whole school careers…but it’s nice to know we’ve already started to succeed!’

 .

There was a single retweet of the Tweet above, but from a particularly authoritative source (who also sits on Ark’s Advisory Group).

.

Ark Press Release

Ark’s press release – ‘Independent evaluation shows Mathematics Mastery pupils doing better than their peers’ – is even more bullish.

The opening paragraph claims that:

‘A new independent report from the independent Education Endowment Foundation (EEF) demonstrates the success of the Mathematics Mastery programme. Carried out by academics from Cambridge University and the Institute of Education, the data indicates that the programme may have the potential to halve the attainment gap with high performing countries in the far East.’

The second emboldened statement – that the programme ‘may have the potential to halve the attainment gap’ – is particularly brazen, since there is no evidence in either of the reports that would support such a claim. It is only true in the sense that any programme ‘may have the potential’ to achieve any particularly ambitious outcome.

Statistical significance is again celebrated, though it is important to give Ark credit for adding:

‘…but it is important to note that these individual studies did not reach the threshold for statistical significance. It is only at the combined level across 127 schools and 10,114 pupils that there are sufficient schools and statistical power to determine an effect size of 1 month overall.’

Even if this rather implies that the individual evaluations were somehow at fault for being too small and so not generating ‘sufficient statistical power’.

Then the release returns to its initial theme:

‘… According to the OECD, by age fifteen, pupils in Singapore, Japan, South Korea and China are three years ahead of pupils in England in mathematical achievement. Maths Mastery is inspired by the techniques and strategies used in these countries.

Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, this could be a sustained impact. A 2 month gain every primary year and 1 month gain every secondary year could see pupils more than one and a half years ahead by age 16 – halving the gap with higher performing jurisdictions.’

In other words, Ark extrapolates equivalent gains – eschewing all statistical hedging – for each year of study, adding them together to suggest a potential 18 month gain.

It also seems to apply the effect to all participants rather than to the average participant.
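For clarity, the arithmetic behind the extrapolation runs roughly as follows. The assumption that the first-year effect repeats and accumulates in every subsequent year is Ark’s, not the evaluators’ – the trials measured a single year only.

```python
# The naive arithmetic behind Ark's 'year and a half ahead' claim.
# Repeating and accumulating the first-year effect is Ark's assumption.

primary_gain = 2    # months per year (rounded first-year primary effect)
secondary_gain = 1  # months per year (rounded first-year secondary effect)

# Years 1-6 then Years 7-11; counting Reception too would give 19 months.
total = primary_gain * 6 + secondary_gain * 5
print(f"Extrapolated gain by age 16: {total} months (~{total / 12:.1f} years)")
# -> 17 months, so even on its own terms the 'more than one and a half
#    years' claim sits at the optimistic end.
```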

This must have been a step too far, even for Ark’s publicity machine.

.

maths mastery ark release capture

.

They subsequently changed the final paragraph above – which one can still find in the version within Google’s cache – to read:

‘…Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, we expect this to be a sustained impact.  A longer follow-up study will be needed to investigate this.’

Even in sacrificing the misleading quantification, they could not resist bumping up ‘this could be a sustained impact’ to ‘we expect this to be a sustained impact’.

 .

[Postscript: On 25 February, Bank of America Merrill Lynch published a press release announcing a £750,000 donation to Maths Mastery.

The final paragraph ‘About Maths Mastery’ says:

‘Mathematics Mastery is an innovative maths teaching framework, supporting schools, students and teachers to be successful at maths. There are currently 192 Mathematics Mastery partner schools across England, reaching 34,800 pupils. Over the next five years the programme aims to expand to 500 schools, and reach 300,000 pupils. Maths Mastery was recently evaluated by the independent Education Endowment Foundation and pupils were found to be up to two months ahead of their peers in just the first year of the programme. Longer term, this could see pupils more than a year and a half ahead by age 16 – halving the gap with pupils in countries such as Japan, Singapore and China.’

This exemplifies perfectly how such questionable statements are repurposed and recycled with impunity. It is high time that the EEF published a code of practice to help ensure that the outcomes of its evaluations are not misrepresented.]  

.

Conclusion

.

Representing the key findings

My best effort at a balanced presentation of these findings would include the key points below. I am happy to consider amendments, additions and improvements:

  • On average, pupils in primary schools adopting Mathematics Mastery made two months more progress than pupils in primary schools that did not. (This is a borderline result, in that it is only just above the score denoting one month’s progress. It falls to one month’s progress if the effect size is calculated to three decimal places.) The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • On average, pupils in secondary schools adopting Mathematics Mastery made one month more progress than pupils in secondary schools that did not. The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • When the results of the primary and secondary evaluations are combined through meta-analysis, pupils in schools adopting Maths Mastery made one month more progress than pupils in schools that did not. The effect is classified as ‘Low’. This outcome is marginally statistically significant, provided that the 95% confidence interval is calculated to three decimal places (but it is not statistically significant if calculated to two decimal places). Care is needed in analysing meta-analysed findings because… [add explanation]. 
  • There is relatively little evidence that the primary programme is more effective for learners with lower prior attainment, but there is such evidence for the secondary programme (in respect of non-calculator questions). There is no substantive evidence that the secondary programme has a different impact on pupils eligible for free school meals.
  • The per-pupil cost is relatively low, but the initial outlay of £6,000 for primary schools with 2FE and above is not inconsiderable. Mathematics Mastery may represent a cost-effective change for schools to consider. 
  • The evaluations assessed the impact of the programme in its first year of adoption. It is not appropriate to draw inferences from the findings above to attribute potential value to the whole programme. EEF will be evaluating the medium and long-term impact of the approach by [outline the methodology agreed].

In the meantime, it would be helpful for Ark and Maths Mastery to be much more transparent about KS1 assessment outcomes across their partner schools and possibly publish their own analysis based on comparison between schools undertaking the programme and matched control schools with similar intakes.

And it would be helpful for all partners to explain and evidence more fully the benefits to high attainers of the Maths Mastery approach – and to consider how it might be supplemented when it does not provide the blend of challenge and support that best meets their needs.

It is disappointing that, three years on, the failure of the National Curriculum Expert Panel to reconcile their advocacy for mastery with stretch and challenge for high attainers – in defiance of their remit to consider the latter as well as the former –  is being perpetuated across the system.

NCETM might usefully revisit their guidance on high attainers in primary schools to reflect their new-found commitment to mastery, while also incorporating additional material covering the point above.

.

Postscript

A summary of this piece, published by Schools Week, prompted two comments – one from Stephen Gorard, the other from Dylan Wiliam. The Twitter embed below is the record of a subsequent debate between us and some others, about design of the Maths Mastery evaluations, what they tell us and how useful they are, especially to policy makers.

One of the tweets contains a commitment on the part of Anna Vignoles to set up a seminar to discuss these issues further.

The widget stores the tweets in reverse order (most recent first). Scroll down to the bottom to follow the discussion in chronological order.

.

.

GP

February 2015

High Attainment in the 2014 Secondary and 16-18 Performance Tables

.

This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.

Data Overload courtesy of opensourceway

It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them.

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to previous posts.

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.

.

Headlines

At KS4:

  • High attainers constitute 32.4% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based on KS2 prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English; in 99 schools the same was true of maths. Only 21 schools managed 100% success in both English and maths. At the other extreme there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers does not make the expected progress in maths and one in five does not do so in English. In free schools one in every five high attainers falls short in English, as does one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.

.

Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
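
To make the banding concrete, here is a minimal sketch of the calculation as the 2013 wording describes it, assuming the thresholds are unchanged in 2014. The function name is mine, and the illustrative point scores use the standard national curriculum tariff (level 3 = 21 points, level 4 = 27, level 5 = 33):

```python
# Sketch of the quoted banding rule; thresholds from the 2013 documentation.
def ks2_band(english_pts, maths_pts, science_pts):
    """Classify KS2 prior attainment as low, middle or high attaining."""
    aps = (english_pts + maths_pts + science_pts) / 3
    if aps < 24:
        return "low"
    if aps < 30:
        return "middle"
    return "high"

# An uneven profile: level 5 in maths (33 points) but level 3 (21 points) in
# English and science averages 25 points - a 'middle' attainer despite the
# exceptional maths result. This is the 'all-rounder' effect described above.
print(ks2_band(21, 33, 21))  # -> middle
```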

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations which:
  • restrict the qualifications counted
  • prevent any qualification from counting as larger than one GCSE
  • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.
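
To see how the two bases can diverge, the sketch below isolates just the first-entry rule. The data and function are hypothetical, and the real calculation applies many more rules (discounting across qualification types, caps on non-GCSEs and so on), so treat this as illustration only:

```python
# Hypothetical illustration of the first-entry rule in isolation.
GRADE_RANK = {g: i for i, g in enumerate(["A*", "A", "B", "C", "D", "E", "F", "G", "U"])}

def counted_grade(attempts, first_entry_only):
    """attempts: list of (iso_date, grade) tuples for one pupil in one subject."""
    if first_entry_only:                       # 2014 methodology
        return min(attempts)[1]                # earliest attempt counts
    return min(attempts, key=lambda a: GRADE_RANK[a[1]])[1]  # best attempt counts

attempts = [("2013-11", "D"), ("2014-06", "B")]          # early entry, then retake
print(counted_grade(attempts, first_entry_only=True))    # D - 2014 basis
print(counted_grade(attempts, first_entry_only=False))   # B - proxy 2013 basis
```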

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.

.

Secondary outcomes

. 

The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, who together account for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

.


*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Chart 1: Percentage of high attainers by school type, 2011-2014

. 

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14
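
As a quick check of the gap arithmetic, the figures in Table 1 can be recomputed directly; this restates the table above, nothing more:

```python
# Girls' lead over boys, in percentage points, from Table 1.
table1 = {2011: (32.6, 34.4), 2012: (33.4, 33.8), 2013: (32.3, 33.3), 2014: (32.1, 33.7)}

for year, (boys, girls) in sorted(table1.items()):
    print(f"{year}: +{girls - boys:.1f} percentage points to girls")
# 2011: +1.8 (the widest), 2012: +0.4 (the narrowest), 2013: +1.0, 2014: +1.6
```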

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers remind us that selection by ability through 11-plus tests gives a somewhat different sample than selection exclusively on the basis of KS2 attainment.

. 

Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio colleges say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) at grades A*-C, including GCSEs in English and maths. This compares with 56.6% of all learners. Allowing, of course, for the impact of the 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered 67% or fewer high attainers achieving this outcome, whereas in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.

. 

A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same schools that do so on the 5+ A*-C measure above. They are joined by Tong High School.

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students.
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English and maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).

Comparing achievement of these measures by school type

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

.


Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

.

It shows that:

  • There is significant variation on all five measures, though these are more pronounced for achievement of the EBacc, where there is a 20 percentage point difference between the success rates in sponsored academies (39.2%) and in converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English, as does one in six in maths. These success rates are unacceptably low.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.

.

Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

.


Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

.

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence there is a smaller gap, by and large, between the performance of high attainers in modern and comprehensive schools respectively than there is between high attainers in comprehensive and selective schools respectively.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in comprehensives and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.

. 

Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return an average grade of A*-, including Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have the number of entries for all qualifications and not for GCSEs only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinckley.

16-18: A level outcomes

.

A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, reveals how performance on this measure has changed since 2013 for different types of institution and, for schools, by admissions basis.

.

                      2013  2014
LA-maintained school  11.4  11.5
Sponsored academy      5.4   5.3
Converter academy     16.4  15.7
Free school*          11.3  16.4
Sixth form college    10.4  10.0
Other FE college       5.8   5.7

Selective school      32.4  32.3
Comprehensive school  10.7  10.5
Modern school          2.0   3.2

Table 3: Percentage of A level students achieving AAB+ with at least two in facilitating subjects, by institution type and admissions basis, 2013 and 2014

.

The substantive change for free schools will be affected by the inclusion of UTCs and studio schools in that line in 2013 and the addition of city technology colleges and 16-19 free schools in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.

.

Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

.


Chart 4: A level high attainment measures by gender, 2014

.

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As they decline, the lines for boys and girls are steadily diverging since girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

.


Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

.

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold for the final pair, sponsored academies and other FE institutions.

. 


Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

.

Chart 7 compares performance by admissions basis in the schools sector on the four measures. Selective schools enjoy a big advantage on all four: more than one in four selective school students achieves at least three A*/A grades, and almost one in three achieves AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the rates achieved in modern schools, and selective schools manage roughly three times the success rates of comprehensive schools.

. 


Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

 .

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state-funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, at Colchester Royal Grammar School. This is almost 100 points ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013.
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School, very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, which exceeds the 2013 low of 97.7 at Appleton Academy. (The points scale behind these per-entry figures is sketched after this list.)
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. 
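
These per-entry scores are easier to interpret against the points tariff. If I have the scale right, each A level grade spans 30 points, from E = 150 up to A* = 300, so an APS per entry converts straightforwardly to an average grade; the helper below is mine and assumes that tariff:

```python
# Mapping APS per A level entry back to an average grade, assuming the
# standard tariff (A* = 300, A = 270, B = 240, C = 210, D = 180, E = 150).
POINTS = [("A*", 300), ("A", 270), ("B", 240), ("C", 210), ("D", 180), ("E", 150)]

def average_grade(aps_per_entry):
    """Highest grade whose point value the average score meets or exceeds."""
    for grade, points in POINTS:
        if aps_per_entry >= points:
            return grade
    return "U"

print(average_grade(211.2))  # C - the state-funded sector average
print(average_grade(271.1))  # A - The Henrietta Barnett School
print(average_grade(108.6))  # U - the lowest recorded per-entry score
```

This squares with the average grades reported above: a sector-wide APS of 211.2 sits just above grade C (210), and Henrietta Barnett’s 271.1 just above grade A (270).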

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those achieved in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than in both 2012 and 2013 in converter academies and sixth form colleges.

.


Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

. 

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.


Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014.

 .

Disadvantaged high attainers 

There is nothing in either of the Performance Tables or the supporting SFRs to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty about whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected, given the impact of the changes discussed above. But if the 2013 methodology is applied, thereby stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.

.


Chart 10: FSM gaps for headline GCSE measures, 2013-2014

.

Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and that by slightly more or slightly less than one percentage point, depending on which 2014 methodology is selected.

Deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed up by extensive pupil premium funding.

.


Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014.

.

Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012, only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of the gaps to increase, reinforcing the point made above.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.

.


Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014

.

Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it raises the question of why more urgent action was, and is, discounted.
  • Pupil premium is insufficiently targeted at the students/schools that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.

.

Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. The issue is particularly acute in sponsored academies, where one in every four or five high attainers undershoots their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.

.

GP

February 2015

Addressed to Teach First and its Fair Education Alliance

.

This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

Betweentimes the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version, which they said would be published on 27 February. It never appeared.

This Tweet

.

.

prompted an undertaking that it would appear on 27 March. I’ll believe that when I see it.

But there’s no reason why you should wait any longer. This version is more comprehensive anyway, in that it includes several relevant Twitter comments and additional explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

As the final sequence of Tweets below shows, Teach First committed to an online response on 14 February. Still waiting…

.

.

.

How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied by BIS say that the total number of FSM-eligible students progressing to Oxbridge fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE’s ‘FSM to Oxbridge’ impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.

.

[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).

.

.

The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]
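
It is worth spelling out just how much the rounding conceals. A minimal sketch, assuming the standard convention that each published figure is rounded to the nearest 10:

```python
# Range of true values consistent with a figure rounded to the nearest 10.
def bounds(rounded):
    return rounded - 5, rounded + 4   # e.g. a published 40 covers 35 to 44

lo_s, hi_s = bounds(40)   # FSM progression from state-funded schools
lo_c, hi_c = bounds(10)   # FSM progression from state-funded colleges
print(f"True 2012/13 FSM total: between {lo_s + lo_c} and {hi_s + hi_c}")
# -> between 40 and 58: an unchanged '40 + 10' is consistent with a modest
#    rise, a fall, or no change at all.
```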

.

The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted

.

.

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are “currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals”.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.

.

.

Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.

.

.

.

.

.

Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.

.

.

.

Perhaps it should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome. (Some illustrative arithmetic follows this list.)
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.
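
To give a rough sense of scale, here is some purely illustrative arithmetic: the £50m topslice and the matching sum come from the list above, but the cohort size is invented for the purpose and is not proposed anywhere in this post:

```python
# Illustrative only: the participant figure is hypothetical.
topslice = 50_000_000   # annual £50m topslice from the Pupil Premium
match    = 50_000_000   # matching sum from universities' outreach budgets
cohort   = 100_000      # invented: FSM participants across KS3 to KS5

print(f"£{(topslice + match) / cohort:,.0f} per participant per year")  # £1,000
```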

 .

.

*In the Report Card, this impact goal is expressed differently: narrowing the gap in university graduation so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.

.

.

GP

January 2015

2014 Primary and Secondary Transition Matrices: High Attainers’ Performance

.

This is my annual breakdown of what the Transition Matrices tell us about the national performance of high attainers.

Data Overload courtesy of opensourceway

It complements my reviews of High Attainment in the 2014 Primary Performance Tables (December 2014) and of High Attainment in the 2014 Secondary and Post-16 Performance Tables (forthcoming, in February 2015).

The analysis is based on:

  • The 2014 Static national transition matrices for reading, writing and mathematics – Key Stage 1 to Key Stage 2 (October 2014) and
  • The 2014 Static Key Stage 2 to 4 national transition matrices, unamended – English and maths (December 2014).

There is also some reference to SFR41/2014: Provisional GCSE and equivalent results in England, 2013 to 2014.

The post begins with some important explanatory notes, before examining the primary and then the secondary matrices. There is a commentary on each matrix, followed by a summary of the key challenges for each sector.

.

Explanatory notes

The static transition matrices take into account results from maintained mainstream and maintained and non-maintained special schools. 

The tables reproduced below use colour coding:

  • purple = more than expected progress
  • dark green = expected progress
  • light green = less than expected progress and
  • grey = those excluded from the calculation.

I will assume that readers are familiar with expectations of progress under the current system of national curriculum levels.

I have written before about the assumptions underpinning this approach and some of the issues it raises.

(See in particular the sections called:

 ‘How much progress does the accountability regime expect from high attainers?’ and

‘Should we expect more progress from high attainers?’)

I have not reprised that discussion here.

The figures within the tables are percentages – X indicates data that has been suppressed (where the cohort comprises only one or two learners). Because of rounding, lines do not always add up to 100%.

In the case of the primary matrices, the commentary below concentrates on the progress made by learners who achieved level 3 or level 4 at KS1. In the case of the secondary matrices, it focuses on those who achieved sub-levels 5A, 5B or 5C at KS2.

Although the primary matrices include progression from KS1 level 4, the secondary matrices do not include progression from KS2 level 6 since the present level 6 tests were introduced only in 2012. Those completing GCSEs in 2014 will typically have undertaken KS2 assessment five years earlier.
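
For readers who want the progression expectations made explicit, the sketch below encodes them as I understand them: two whole levels from KS1 to KS2, and three whole levels (read as a GCSE grade) from KS2 to KS4, with all three sub-levels of KS2 L5 sharing grade B as their target. The mappings are inferred from the matrices discussed below, not copied from DfE guidance:

```python
# Expected ('at least') outcomes under the levels-of-progress regime.
KS2_TARGET = {1: 3, 2: 4, 3: 5, 4: 6}    # KS1 level -> expected KS2 level (+2)
KS4_TARGET = {3: "D", 4: "C", 5: "B"}    # KS2 level -> expected GCSE grade (+3)

def meets_expected_ks2(ks1_level, ks2_level):
    """True if a pupil made at least 2 levels of progress from KS1 to KS2."""
    return ks2_level >= KS2_TARGET[ks1_level]

print(meets_expected_ks2(3, 4))   # False - KS1 L3 to KS2 L4 is one level short
print(KS4_TARGET[5])              # B - the target for KS2 L5A, L5B and L5C alike
```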

The analysis includes comparison with the matrices for 2012 and 2013 respectively.

.

The impact of policy change on the secondary matrices

This comparison is straightforward for the primary sector (KS1 to KS2) but is problematic when it comes to the secondary matrices (KS2 to KS4).

As SFR41/2014 makes clear, the combined impact of:

  • vocational education reforms (restricting eligible qualifications and significantly reducing the weighting of some of them) and 
  • early entry policy (recording in performance measures only the first result achieved, rather than the outcome of any retakes)

has depressed overall KS4 results.

The impact of these factors on progress is not discussed within the text, although one of the tables gives overall percentages for those making the expected progress under the old and new methodologies respectively.

It does so for two separate groups of institutions, neither of which is perfectly comparable with the transition matrices because of the treatment of special schools:

  • State funded mainstream schools (excluding state-funded special schools and non-maintained special schools) and
  • State-funded schools (excluding non-maintained special schools).

However, the difference is likely to be marginal.

There is certainly very little difference between the two sets of figures for the categories above, though the percentages are very slightly larger for the first.

They show:

  • A variation of 2.3 percentage points in English (72.1% making at least the expected progress under the new methodology compared with 74.4% under the old) and
  • A variation of 2.4 percentage points in maths (66.4% making at least the expected progress compared with 68.8%).

There is no such distinction in the static transition matrices, nor does the SFR provide any information about the impact of these policy changes for different levels of prior attainment.

It seems a reasonable starting hypothesis that the impact will be much reduced at higher levels of prior attainment, because comparatively fewer students will be pursuing vocational qualifications.

One might also expect comparatively fewer high attainers to require English and/or maths retakes, even when the consequences of early entry are factored in, but that is rather more provisional.

It may be that the differential impact of these reforms on progression from different levels of prior attainment will be discussed in the statistical releases to be published alongside the Secondary Performance Tables. In that case I will update this treatment.

For the time being, my best counsel is:

  • To be aware that these policy changes have almost certainly had some impact on the progress of secondary high attainers, but 
  • Not to fall into the trap of assuming that they must explain all – or even a substantial proportion – of any downward trends (or absence of upward trends for that matter).

There will be more to say about this in the light of the analysis below.

Is this data still meaningful?

As we all know, the measurement of progression through national curriculum levels will shortly be replaced by a new system.

There is a temptation to regard the methodology underpinning the transition matrices as outmoded and irrelevant.

For the time being though, the transition matrices remain significant to schools (and to Ofsted) and there is an audience for analysis based on them.

Moreover, it is important that we make our best efforts to track annual changes under the present system, right up to the point of changeover.

We should also be thinking now about how to match progression outcomes under the new model with those available under the current system, so as to secure an uninterrupted perspective of trends over time.

Otherwise our conclusions about the longer-term impact of educational policies to raise standards and close gaps will be sadly compromised.

.

2014 Primary Transition Matrices

.

Reading

.

[Transition matrix table: KS1 to KS2 reading]

.

Commentary:

  • It appears that relatively few KS1 learners with L4 reading achieved the minimum expected 2 levels of progress by securing L6 at the end of KS2. It is not possible for these learners to make more than the expected progress. The vast majority (92%) recorded a single level of progress, to KS2 L5. This contrasts with 2013, when 12% of KS1 L4 learners did manage to progress to KS2 L6, while only 88% were at KS2 L5. Caution is necessary since the sample of KS1 L4 readers is so small. (The X suggests the total cohort could be as few as 25 pupils.)
  • The table shows that 1% of learners achieving KS1 L3 reading made 3 levels of progress to KS2 L6, exactly the same proportion as in 2012 and 2013. But we know that L6 reading test entries were up 36% compared with 2013: one might reasonably have expected some increase in this percentage as a consequence. The absence of improvement may be attributable to the collapse in success rates on the 2014 L6 reading test.
  • 90% of learners achieving KS1 L3 made the expected 2 or more levels of progress to KS2 L5 or above, 89% making 2 levels of progress to L5. The comparable figures for those making 2 LoP in 2013 and 2012 were 85% and 89% respectively.
  • In 2014 only 10% of those achieving KS1 L3 made a single level of progress to KS2 L4, compared with 13% in 2013 and 10% in 2012.
  • So, when it comes to L3 prior attainers, the 2013 dip has been overcome, but there has been no improvement beyond the 2012 outcomes. Chart 1 makes this pattern more obvious, illustrating clearly that there has been relatively little improvement across the board.

.


Chart 1: Percentage of learners with KS1 L3 reading making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is significantly lower than the proportions with KS1 L2A, L2B or L2 overall who do so. This pattern is unchanged from 2012 and 2013.
  • The proportion exceeding 2 LoP is also far higher for every other level of KS1 prior achievement, also unchanged from 2012 and 2013.
  • Whereas the gap between KS1 L2 and L3 making more than 2 LoP was 36 percentage points in 2013, by 2014 it had increased substantially to 43 percentage points (44% versus 1%). This may again be partly attributable to the decline in L6 reading results.

.

Writing

.

[Transition matrix table: KS1 to KS2 writing]

Commentary:

  • 55% of learners with L4 in KS1 writing made the expected 2 levels of progress to KS2 L6, while only 32% made a single level of progress to KS2 L5. This throws into sharper relief the comparable results for L4 readers. 
  • On the other hand, the 2013 tables recorded 61% of L4 writers making the expected progress, six percentage points higher than the 2014 success rate, so there has been a decline in success rates in both reading and writing for this small cohort. The reason for this is unknown, but it may simply be a consequence of the small sample.
  • Of those achieving KS1 L3, 12% made 3 LoP to KS2 L6, up from 6% in 2012 and 9% in 2013. The comparison with reading is again marked. A further 2% of learners with KS1 L2A made 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 writing made the expected 2 or more levels of progress, up from 89% in 2013. Some 79% made 2 LoP to L5, compared with 80% in 2013 and 79% in 2012, so there has been relatively little change.
  • However, in 2014 9% made only a single level of progress to KS2 L4. This is an improvement on 2013, when 11% did so, and continues an improving trend from 2012, when 15% fell into this category, although the rate of improvement has slowed somewhat.
  • These positive trends are illustrated in Chart 2 below, which shows reductions in the proportion achieving a single LoP broadly matched by corresponding improvements in the proportion achieving 3 LoP.


Chart 2: Percentage of learners with KS1 L3 writing making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is again lower than the proportions with KS1 L2A, L2B or L2 overall doing so. It is even lower than the proportion of those with KS1 L1 achieving this outcome. This is unchanged from 2013.
  • The proportion exceeding 2 LoP is far higher for every other level of KS1 achievement excepting L2C, again unchanged from 2013.
  • The percentage point gap between those with KS1 L2 overall and KS1 L3 making more than 2 LoP was 20 points in 2013 and remains unchanged at 20 points in 2014. Once again there is a marked contrast with reading.

.

Maths

.

[Transition matrix table: KS1 to KS2 maths]

.

Commentary:

  • 95% of those achieving L4 maths at KS1 made the expected 2 levels of progress to KS2 L6. These learners are unable to make more than expected progress. Only 5% made a single level of progress to KS2 L5. 
  • There is a marked improvement since 2013, when 89% made the expected progress and 11% fell short. This is significantly better than KS1 L4 progression in writing and hugely better than KS1 L4 progression in reading.
  • 35% of learners with KS1 L3 maths also made 3 levels of progress to KS2 L6. This percentage is up from 26% in 2013 and 14% in 2012, indicating a continuing trend of strong improvement. In addition, 6% of those with L2A and 1% of those at L2B managed 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 made the expected progress (up one percentage point compared with 2013). Of these, 56% made 2 LoP to KS2 L5. However, 9% made only a single level of progress to KS2 L4 (down a single percentage point compared with 2013).
  • Chart 3 illustrates these positive trends. It contrasts with the similar chart for writing above, in that the proportion of L3 learners making a single LoP is falling much more slowly than the proportion of KS1 L3 learners making 3 LoP is rising.

.

TM chart 3

Chart 3: Percentage of learners with KS1 L3 maths making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 in maths who achieved the expected progress is identical to the proportion of those with KS1 L2 overall who did so, at 91%. However, these rates are lower than for learners with KS1 L2B and especially L2A.
  • The proportion exceeding 2 LoP is also identical for those with KS1 L3 and L2 overall (whereas in 2013 there was a seven percentage point gap in favour of those with KS1 L2). The proportion of those with KS1 L2A exceeding 2 LoP remains significantly higher, but the gap has narrowed by six percentage points compared with 2013.

.

Key Challenges: Progress of High Attainers between KS1 and KS2

The overall picture from the primary transition matrices is one of comparatively strong progress in maths, positive progress in writing and a much more mixed picture in reading. But in none of these areas is the story unremittingly positive.
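
Since so much of this commentary turns on the levels of progress measure, a short sketch may help to make it concrete before turning to priorities. This is a minimal illustration, not the DfE's actual methodology code: the pupil records and helper names are invented, and the assumed convention is the standard one, in which sub-levels (L2A/L2B/L2C) collapse to whole level 2 and expected progress between KS1 and KS2 is two whole levels.

```python
# A minimal sketch of how one row of a KS1-KS2 transition matrix is
# derived (illustrative only; pupil records are invented).
from collections import Counter

EXPECTED_LOP = 2  # assumed convention: expected KS1-KS2 progress is 2 levels

def levels_of_progress(ks1: int, ks2: int) -> int:
    """LoP is simply the difference between the two whole levels."""
    return ks2 - ks1

# Hypothetical pupil records: (KS1 whole level, KS2 whole level).
pupils = [(3, 4), (3, 5), (3, 5), (3, 6), (2, 4), (2, 5)]

# One matrix row: the distribution of outcomes for KS1 L3 starters.
from_l3 = [levels_of_progress(k1, k2) for k1, k2 in pupils if k1 == 3]
for lop, n in sorted(Counter(from_l3).items()):
    print(f"{lop} LoP: {100 * n / len(from_l3):.0f}%")

# The cumulative 'expected progress' figure counts 2 or more LoP.
met = sum(1 for lop in from_l3 if lop >= EXPECTED_LOP)
print(f"Making at least expected progress: {100 * met / len(from_l3):.0f}%")
```

On this convention a KS1 L3 pupil reaching KS2 L5 has made the expected two levels, one reaching L6 has made three, and one reaching only L4 has made a single level – the group targeted by the second priority below.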

Priorities should include:

  • Improving progression from KS1 L4 to KS2 L6, so that the profile for writing becomes more similar to the profile for maths and, in particular, so that the profile for reading much more closely resembles the profile for writing. No matter how small the cohort, it cannot be acceptable that 92% of KS1 L4 readers make only a single level of progress.
  • Reducing to negligible the proportion of KS1 L3 learners making a single level of progress to KS2 L4. Approximately 1 in 10 learners continue to do so in all three assessments, although there has been some evidence of improvement since 2012, particularly in writing. Other than in maths, the proportion of KS1 L3 learners making a single LoP is significantly higher than the proportion of KS1 L2 learners doing so. 
  • Continuing to improve the proportion of KS1 L3 learners making 3 LoP in each of the three assessments, maintaining the strong rate of improvement in maths, increasing the rate of improvement in writing and moving beyond stagnation at 1% in reading. 
  • Eliminating the percentage point gaps between those with KS1 L2A making at least the expected progress and those with KS1 L3 doing so (5 percentage points in maths and 9 percentage points in each of reading and writing). At the very least, those at KS1 L3 should be matching those at KS1 L2B, but there are presently gaps between them of 2 percentage points in maths, 5 percentage points in reading and 6 percentage points in writing.

.

Secondary Transition Matrices

.

English

.

TM English KS24 Capture

.

Commentary:

  • 98% of learners achieving L5A English at KS2 made at least 3 levels of progress to GCSE grade B or above in 2014. The same is true of 93% of those with KS2 L5B and 75% of those with KS2 L5C. All three figures have improved by one percentage point compared with 2013. The comparable figures in 2012 were 98%, 92% and 70% respectively.
  • 88% of learners achieving L5A at KS2 achieved at least four levels of progress from KS2 to KS4, so achieving a GCSE grade of A* or A, as did 67% of those with L5B and 34% of those with L5C. The comparable figures in 2013 were 89%, 66% and 33% respectively, while in 2012 they were 87%, 64% and 29% respectively. (The sketch at the end of this subsection makes the underlying grade mapping explicit.)
  • 51% of learners with KS2 L5A made 5 levels of progress by achieving an A* grade at GCSE, compared with 25% of those with L5B, 7% of those with L5C and 1% of those with L4A. The L5B and L5C figures were improvements on 2013 outcomes. The 2014 success rate for those with KS2 L5A is down by two percentage points, while that for L5B is up by two points.
  • These cumulative totals suggest relatively little change in 2014 compared with 2013, with the possible exception of these two-percentage-point swings in the proportions of students making 5 LoP. 
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the same as the cumulative totals quoted above). This again shows relatively small changes in 2014, compared with 2013, and no obvious pattern.

.

TM chart 4

Chart 4: Percentage of learners with KS2 L5A, L5B and L5C in English achieving 3, 4 and 5 levels of progress, 2012-2014

.

  • 1% of learners with KS2 L5A made only 2 levels of progress to GCSE grade C, as did 6% of those with L5B and 20% of those with L5C. These percentages are again little changed compared with 2013, following a much more significant improvement between 2012 and 2013.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 87% and 48% respectively – are significantly higher than the corresponding percentages for those with KS2 L5C. These gaps have also changed very little compared with 2013.
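
For readers less familiar with the secondary LoP convention, here is a minimal sketch of the grade mapping referred to above, together with the distinction between 'at least' and 'exactly' N LoP. The point-score equivalences are my assumption of the usual convention (GCSE grades C, B, A and A* sitting at levels 7 to 10), and the function and variable names are hypothetical.

```python
# Assumed grade equivalences: C, B, A, A* at levels 7, 8, 9, 10.
# Sub-levels (L5A/L5B/L5C) all share the whole-level starting point of 5.
GRADE_AT_LEVEL = {7: "C", 8: "B", 9: "A", 10: "A*"}

def gcse_grade(ks2_whole_level: int, lop: int) -> str:
    """GCSE grade implied by making `lop` levels of progress from KS2."""
    return GRADE_AT_LEVEL.get(ks2_whole_level + lop, "below C")

assert gcse_grade(5, 3) == "B"   # expected progress from L5 gives grade B
assert gcse_grade(5, 4) == "A"
assert gcse_grade(5, 5) == "A*"
assert gcse_grade(4, 3) == "C"   # expected progress from L4 gives grade C

# 'Exactly N LoP' (as charted) differs from the cumulative 'at least
# N LoP' figures quoted in the commentary. For KS2 L5A in English:
at_least = {3: 98, 4: 88, 5: 51}
exactly = {n: at_least[n] - at_least.get(n + 1, 0) for n in at_least}
print(exactly)  # {3: 10, 4: 37, 5: 51}
```

The same subtraction converts the cumulative totals quoted above into the 'exactly N LoP' proportions shown in Charts 4 and 5.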

.

Maths

.

TM Maths KS24 Capture

.

Commentary:

  • 96% of learners with L5A at KS2 achieved the expected progress between KS2 and KS4 in 2014, as did 86% of those with KS2 L5B and 65% of those with KS2 L5C. The comparable percentages in 2013 were 97%, 88% and 70%, while in 2012 they were 96%, 86% and 67%. This means there have been declines compared with 2013 for L5A (one percentage point), L5B (two percentage points) and L5C (five percentage points).
  • 80% of learners with KS2 L5A made 4 or more levels of progress between KS2 and KS4, so achieving a GCSE grade A* or A. The same was true of 54% of those with L5B and 26% of those with L5C. In 2013, these percentages were 85%, 59% and 31% respectively, while in 2012 they were 84%, 57% and 30% respectively. So all the 2014 figures – for L5A, L5B and L5C alike – are five percentage points down compared with 2013.
  • In 2014 48% of learners with KS2 L5A made 5 levels of progress by achieving a GCSE A* grade, compared with 20% of those with L5B, 5% of those with L5C and 1% of those with L4A. All three percentages for those with KS2 L5 are down compared with 2013 – by 3 percentage points in the case of those with L5A, 2 points for those with L5B and 1 point for those with L5C.
  • It is evident that there is rather more volatility in the trends in maths progression, and some of the downward swings are more pronounced than in English.
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the cumulative totals quoted above.) The only discernible pattern is that any improvement is confined to those making 3 LoP.

.

TM chart 5

Chart 5: Percentage of learners with KS2 L5A, L5B and L5C in Maths achieving 3, 4 and 5 levels of progress, 2012-2014

  • 4% of those with KS2 L5A made only 2 LoP to GCSE grade C, as did 13% of those with L5B and 31% of those with L5C. All three percentages have worsened compared with 2013, by 1, 2 and 4 percentage points respectively.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 85% and 37% respectively – are significantly higher than the corresponding percentages for those with L5C, just as they are in English. And, as is the case with English, the percentage point gaps have changed little compared with 2013.

.

Key Challenges: Progress of High Attainers Between KS2 and KS4

The overall picture for high attainers from the secondary transition matrices is of relatively little change in English and of rather more significant decline in maths, though not by any means across the board.

It may be that the impact of the 2014 policy changes on high attainers has been relatively more pronounced in maths than in English – and perhaps more pronounced than might have been expected.

If this is the case, one suspects that the decision to restrict reported outcomes to first exam entries is the most likely culprit.

On the other hand, it might be true that relatively strong improvement in English progression has been cancelled out by these policy changes, though the figures provided in the SFR for expected progress regardless of prior attainment make this unlikely.

Leaving causation aside, the most significant challenges for the secondary sector are to:

  • Significantly improve the progression rates for learners with KS2 L5A to A*. It should be a default expectation that they achieve five levels of progress, yet only 48% do so in maths and 51% in English – and these percentages are down 3 and 2 percentage points respectively compared with 2013.
  • Similarly, significantly improve the progression rates for learners with KS2 L5B to grade A. It should be a default expectation that they achieve at least 4 LoP, yet only 67% do so in English and 54% in maths – up one point since 2013 in English but down 5 points in maths.
  • Reduce and ideally eliminate the rump of high attainers who fall short of expected progress, making only 2 LoP. This group is especially large for those with KS2 L5C – 20% in English and, still worse, 31% in maths – but there is also a problem for those with 5B in maths, 13% of whom fall into this category. The proportion making only 2 LoP from 5C in maths has risen by 4 percentage points since 2013, while there has also been a 2 point rise for those with 5B. (Thankfully the L5C rate in English has improved by 2 points, but there is a long way still to go.)
  • Close significantly the progression performance gaps between learners with KS2 L5C and KS2 L4A, in both English and maths. In English there is currently a 12 percentage point gap for those making expected progress and a 14-point gap for those exceeding it. In maths, these gaps are 20 and 11 percentage points respectively, so the problem there seems particularly pronounced. These gaps have changed little since 2013 (the short sketch below recomputes them from the figures quoted earlier).
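
As a worked check on that final challenge, the snippet below recomputes the quoted gaps from the cumulative percentages given in the commentary above (the variable names are mine).

```python
# Recomputing the L4A vs L5C progression gaps from the percentages
# quoted earlier. Each pair is (at least 3 LoP %, at least 4 LoP %).
figures = {
    ("English", "L4A"): (87, 48),
    ("English", "L5C"): (75, 34),
    ("maths", "L4A"): (85, 37),
    ("maths", "L5C"): (65, 26),
}

for subject in ("English", "maths"):
    expected = figures[(subject, "L4A")][0] - figures[(subject, "L5C")][0]
    exceeding = figures[(subject, "L4A")][1] - figures[(subject, "L5C")][1]
    print(f"{subject}: {expected}-point gap at expected progress, "
          f"{exceeding}-point gap beyond it")
# English: 12- and 14-point gaps; maths: 20- and 11-point gaps.
```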

.

Conclusion

This analysis of high attainers’ progression suggests a very mixed picture, across the primary and secondary sectors and between English and maths. There is some limited scope for congratulation, but too many persistent issues remain.

The commentary has identified four key challenges for each sector, which can be synthesised under two broad headings:

  • Raising expectations beyond the minimum expected progress – and significantly reducing our tolerance of underachievement amongst this cohort. 
  • Ensuring that those at the lower end of the high attaining spectrum sustain their initial momentum, at least matching the rather stronger progress of those with slightly lower prior attainment.

The secondary picture has become confused this year by the impact of policy changes.

We do not know to what extent these explain any downward trends – or depress any upward trends – for those with high prior attainment, though one may tentatively hypothesise that any impact has been rather more significant in maths than in English.

It would be quite improper to assume that the changes in high attainers’ progression rates compared with 2013 are entirely attributable to the impact of these policy adjustments.

It would be more accurate to say that they mask any broader trends in the data, making those more difficult to isolate.

We should not allow this methodological difficulty – or the impending replacement of the present levels-based system – to divert us from continuing efforts to improve the progression of high attainers.

For Ofsted is intensifying its scrutiny of how schools support the most able – and they will expect nothing less.

.

GP

January 2015

Gifted Phoenix 2014 Review and Retrospective

.

I am rounding out this year’s blogging with my customary backwards look at the various posts I published during 2014.

This is partly an exercise in self-congratulation but also flags up to readers any potentially useful posts they might have missed.

.

P1020553

Norwegian Panorama by Gifted Phoenix

.

This is my 32nd post of the year, three fewer than the 35 I published in 2013. Even so, total blog views have increased by 20% compared with 2013.

Almost exactly half of these views originate in the UK. Other countries generating a large number of views include the United States, Singapore, India, Australia, Hong Kong, Saudi Arabia, Germany, Canada and South Korea. The site has been visited this year by readers located in 157 different countries.

My most popular post during 2014 was Gifted Education in Singapore: Part 2, which was published back in May 2012. This continues to attract interest in Singapore!

The most popular post written during 2014 was The 2013 Transition Matrices and High Attainers’ Performance (January).

Other 2014 posts that attracted a large readership were:

This illustrates just how strongly the accountability regime features in the priorities of English educators.

I have continued to concentrate on domestic topics: approximately 75% of my posts this year have been about the English education system. I have not ventured beyond these shores since September.

The first section below reviews the minority of posts with a global perspective; the second covers the English material. A brief conclusion offers my take on future prospects.

.

Global Gifted Education

I began the year by updating my Blogroll, with the help of responses to Gifted Education Activity in the Blogosphere and on Twitter.

This post announced the creation of a Twitter list containing all the feeds I can find that mention gifted education (or a similar term, whether in English or another language) in their profile.

I have continued to update the list, which presently includes 1,312 feeds and has 22 subscribers. If you want to be included – or have additions to suggest – please don’t hesitate to tweet me.

While we’re on the subject, I should take this opportunity to thank my 5,960 Twitter followers, an increase of some 28% compared with this time last year.

In February I published A Brief Discussion about Gifted Labelling and its Permanency. This recorded a debate I had on Twitter about whether the ‘gifted label’ might be used more as a temporary marker than a permanent sorting device.

March saw the appearance of How Well Does Gifted Education Use Social Media?

This proposed some quality criteria for social media usage and blogs/websites that operate within the field of gifted education.

It also reviewed the social media activity of six key players (WCGTC, ECHA, NAGC, SENG, NACE and Potential Plus UK) as well as wider activity within the blogosphere, on five leading social media platforms and utilising four popular content creation tools.

Some of the websites mentioned above have been recast since the post was published and are now much improved (though I claim no direct influence).

Also in March I published What Has Become of the European Talent Network? Part One and Part Two.

These posts were scheduled just ahead of a conference organised by the Hungarian sponsors of the network. I did not attend, fearing that the proceedings would have limited impact on the future direction of this once promising initiative. I used the posts to set out my reservations, which include a failure to engage with constructive criticism.

Part One scrutinises the Hungarian talent development model on which the European Network is based. Part Two describes the halting progress made to date. It identifies several deficiencies that need to be addressed if the Network is to have a significant and lasting impact on pan-European support for talent development and gifted education.

During April I produced PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance.

This analyses the performance of high achievers from a selection of 11 jurisdictions – either world leaders or prominent English-speaking nations – on the PISA 2012 Creative Problem Solving assessment.

It is a companion piece to a 2013 post which undertook a similar analysis of the PISA 2012 assessments in Reading, Maths and Science.

In May I contributed to the Hoagies’ Bloghop for that month.

Air on the ‘G’ String: Hoagies’ Bloghop, May 2014 was my input to discussion about the efficacy of ‘the G word’ (gifted). I deliberately produced a provocative and thought-provoking piece which stirred typically intense reactions in several quarters.

Finally, September saw the production of Beware the ‘short head’: PISA’s Resilient Students’ Measure.

This takes a closer look at the relatively little-known PISA ‘resilient students’ measure – focused on high achievers from disadvantaged socio-economic backgrounds – and how well different jurisdictions perform against it.

The title reflects the post’s conclusion that, like many other countries, England:

‘…should be worrying as much about our ‘short head’ as our ‘long tail’’.

And so I pass seamlessly on to the series of domestic posts I published during 2014…

.

English Education Policy

My substantive post in January was High Attainment in the 2013 Secondary and 16-18 Performance Tables, an analysis of the data contained in last year’s Tables and the related statistical publications.

Also in January I produced a much briefer commentary on The 2013 Transition Matrices and High Attainers’ Performance.

The purpose of these annual posts (and the primary equivalent which appears each December) is to synthesise data about the performance of high attainers and high attainment at national level, so that schools can more easily benchmark their own performance.

In February I wrote What Becomes of Schools that Fail their High Attainers?*

It examines the subsequent history of schools that recorded particularly poor results with high attainers in the Secondary Performance Tables. (The asterisk references a footnote apologising ‘for this rather tabloid title’.)

By March I was focused on Challenging NAHT’s Commission on Assessment, subjecting the Commission’s Report to a suitably forensic examination and offering a parallel series of recommendations derived from it.

My April Fool’s joke this year was Plans for a National Centre for Education Research into Free Schools (CERFS). This has not materialised but, had our previous Secretary of State for Education not been reshuffled, I’m sure it would have been only a matter of time!

Also in April I was Unpacking the Primary Assessment and Accountability Reforms, exposing some of the issues and uncertainties embodied in the government’s response to consultation on its proposals.

Some of the issues I highlighted eight months ago are now being more widely discussed – not least the nature of the performance descriptors, as set out in the recent consultation exercise dedicated to those.

But the reform process is slow. Many other issues remain unresolved and it seems increasingly likely that some of the more problematic will be delayed deliberately until after the General Election.

May was particularly productive, witnessing four posts, three of them substantial:

  • How well is Ofsted reporting on the most able? explores how Ofsted inspectors are interpreting the references to the attainment and progress of the most able added to the Inspection Handbook late last year. The sample comprises the 87 secondary inspection reports that were published in March 2014. My overall assessment? Requires Improvement.

.

  • A Closer Look at Level 6 is a ‘data-driven analysis of Level 6 performance’. As well as providing a baseline against which to assess future Level 6 achievement, this also identifies several gaps in the published data and raises as yet unanswered questions about the nature of the new tests to be introduced from 2016.
  • One For The Echo Chamber was prompted by The Echo Chamber reblogging service – whose founder objected that my posts are too long – and the ensuing Twitter debate. Throughout the year the vast majority of my posts have been unapologetically detailed and thorough. They are intended as reference material, to be quarried and revisited, rather than the disposable vignettes that so many seem to prefer. To this day they get reblogged on The Echo Chamber only when a sympathetic moderator is undertaking the task.
  • ‘Poor but Bright’ v ‘Poor but Dim’ arose from another debate on Twitter, sparked by a blog post which argued that the latter are a higher educational priority than the former. I argued that both deserved equal priority, since it is inequitable to discriminate between disadvantaged learners on the basis of prior attainment and the economic arguments cut both ways. This issue continues to bubble away like a subterranean stream, resurfacing from time to time – most recently when the Fair Education Alliance proposed that the value of pupil premium allocations attached to disadvantaged high attainers should be halved.

In June I asked Why Can’t We Have National Consensus on Educating High Attainers? and proposed a set of core principles that might form the basis for such consensus.

These were positively received. Unfortunately, though, the necessary debate has not yet taken place.

.

The principles should be valuable to schools considering how best to respond to Ofsted’s increased scrutiny of their provision for the most able. Any institution considering how best to revitalise its provision might discuss how the principles should be interpreted to suit its particular needs and circumstances.

July saw the publication of Digging Beneath the Destination Measures which explored the higher education destinations statistics published the previous month.

It highlighted the relatively limited progress made towards improving the progression of young people from disadvantaged backgrounds to selective universities.

There were no posts in August, half of which was spent in Norway, taking the photographs that have graced some of my subsequent publications.

In September I produced What Happened to the Level 6 Reading Results? – an investigation into the mysterious collapse of L6 reading test results in 2014.

Test entries increased significantly. So did the success rates on the other level 6 tests (in maths and in grammar, punctuation and spelling (GPS)). Even teacher assessment of L6 reading showed a marked upward trend.

Despite all this, the number of pupils successful on the L6 reading test fell from 2,062 in 2013 to 851 (provisional). The final statistics – released only this month – show a marginal improvement to 935, but the outcome is still extremely disappointing. No convincing explanation has been offered and the impact on 2015 entries is unlikely to be positive.

That same month I published Closing England’s Excellence Gaps: Part One and Part Two.

These present the evidence base relating to high attainment gaps between disadvantaged and other learners, to distinguish what we know from what remains unclear and so to provide a baseline for further research.

The key finding is that the evidence base is both sketchy and fragmented. We should understand much more than we do about the size and incidence of excellence gaps. We should be strengthening the evidence base as part of a determined strategy to close the gaps.

.

In October, 16-19 Maths Free Schools Revisited marked a third visit to the 16-19 maths free schools programme, concentrating on progress since my previous post in March 2013, especially at the two schools which have opened to date.

I subsequently revised the post to reflect an extended series of tweeted comments from Dominic Cummings, who was a prime mover behind the programme. The second version is called 16-19 Maths Free Schools Revisited: Oddyssean Edition.

The two small institutions at KCL and Exeter University (both very similar) constitute a rather limited outcome for a project that was intended to generate a dozen innovative university-sponsored establishments. There is reportedly a third school in the pipeline but, as 2014 closes, details have yet to be announced.

Excellence Gaps Quality Standard: Version One is an initial draft of a standard encapsulating effective whole school practice in supporting disadvantaged high attainers. It updates and adapts the former IQS for gifted and talented education.

This first iteration needs to be trialled thoroughly, developed and refined but, even as it stands, it offers another useful starting point for schools reviewing the effectiveness of their own provision.

The baseline standard captures the essential ‘non-negotiables’ intended to be applicable to all settings. The exemplary standard is pitched high and should challenge even the most accomplished of schools and colleges.

All comments and drafting suggestions are welcome.

.

In November I published twin studies of The Politics of Setting and The Politics of Selection: Grammar Schools and Disadvantage.

These issues have become linked since Prime Minister Cameron has regularly proposed an extension of the former as a response to calls on the right wing of his party for an extension of the latter.

This was almost certainly the source of autumn media rumours that a strategy, originating in Downing Street, would be launched to incentivise and extend setting.

Newly installed Secretary of State Morgan presumably insisted that existing government policy (which leaves these matters entirely to schools) should remain undisturbed. However, the idea might conceivably be resuscitated for the Tory election manifesto.

Now that UKIP has confirmed its own pro-selection policy there is pressure on the Conservative party to resolve its internal tensions on the issue and identify a viable alternative position. But the pro-grammar lobby is unlikely to accept increased setting as a consolation prize…

.

Earlier in December I added a companion piece to ‘The Politics of Selection’.

How Well Do Grammar Schools Perform With Disadvantaged Students? reveals that the remaining 163 grammar schools have very different records in this respect. The poor performance of a handful is a cause for concern.

I also published High Attainment in the 2014 Primary School Performance Tables – another exercise in benchmarking, this time for primary schools interested in how well they support high attainers and high attainment.

This shows that HMCI’s recent distinction between positive support for the most able in the primary sector and a much weaker record in secondary schools is not entirely accurate. There are conspicuous weaknesses in the primary sector too.

Meanwhile, Chinese learners continue to perform extraordinarily well on the Level 6 maths test, achieving an amazing 35% success rate, up six percentage points since 2013. This domestic equivalent of the Shanghai phenomenon bears closer investigation.

My penultimate post of the year, HMCI Ups the Ante on the Most Able, collates all the references to the most able in HMCI’s 2014 Annual Report and its supporting documentation.

It sets out Ofsted’s plans for the increased scrutiny of schools and for additional survey reports that reflect this scrutiny.

It asks whether Ofsted’s renewed emphasis will be sufficient to rectify the shortcomings they themselves identify and – assuming it will not – outlines an additional ten-step plan to secure system-wide improvement.

Conclusion

So what are the prospects for 2015 and beyond?

My 2013 Retrospective was decidedly negative about the future of global gifted education:

‘The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.’

Despite evidence of a few ‘green shoots’ during 2014, my overall sense of pessimism remains.

Meanwhile, future prospects for high attainers in England hang in the balance.

Several of the Coalition Government’s education reforms have been designed to shift schools’ focus away from borderline learners, so that every learner improves, including those at the top of the attainment distribution.

On the other hand, Ofsted’s judgement that a third of secondary inspections this year

‘…pinpointed specific problems with teaching the most able’

would suggest that schools’ everyday practice falls some way short of this ideal.

HMCI’s commitment to champion the interests of the most able is decidedly positive but, as suggested above, it might not be enough to secure the necessary system-wide improvement.

Ofsted is itself under pressure and faces an uncertain future, regardless of the election outcome. HMCI’s championing might not survive the arrival of a successor.

It seems increasingly unlikely that any political party’s election manifesto will have anything significant to say about this topic, unless the enthusiasm for selection in some quarters can be harnessed and redirected towards the much more pertinent question of how best to meet the needs of all high attainers in all schools and colleges, especially those from disadvantaged backgrounds.

But the entire political future is shrouded in uncertainty. Let’s wait and see how things are shaping up on the other side of the election.

From a personal perspective I am closing in on five continuous years of edutweeting and edublogging.

I once expected to extract from this commitment benefits commensurate with the time and energy invested. But that is no longer the case, if indeed it ever was.

I plan to call time at the end of this academic year.

.

GP

December 2014