A Summer of Love for English Gifted Education? Episode 3: Improving Fair Access to Oxbridge

.

This post is a critical examination of policy and progress on improving progression for the highest attainers from disadvantaged backgrounds to selective universities, especially Oxford and Cambridge.

.

.

It:

  • Uncovers evidence of shaky statistical interpretation by these universities and their representative body;
  • Identifies problems with the current light-touch regulatory and monitoring apparatus, including shortcomings in the publication of data and reporting of progress at national level;
  • Proposes a series of additional steps to address this long-standing shortcoming of our education system.

.

Background

[Image: summer of love 1967 by 0 fairy 0]

Regular readers may recall that I have completed two parts of a trilogy of posts carrying the optimistic strapline ‘A Summer of Love for Gifted Education’.

The idea was to structure these posts around three key government publications.

  • This final part was supposed to analyse another DfE-commissioned research report, an ‘Investigation of school- and college-level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to pursue higher education’.

We know from the published contract (see attachment in ‘Documents’ section) that this latter study was undertaken by TNS/BMRB and the Institute for Policy Studies in Education (IPSE) based at London Metropolitan University. The final signed-off report should have been produced by 28 June 2013 and published within 12 weeks of approval, so by the end of September. As I write, it has still to appear, which would suggest that there is a problem with the quality and/or size of the evidence base.

In the five months since the appearance of Part Two I have published a series of posts developing the themes explored in the first two-thirds of my incomplete trilogy.

But what to do about the missing final episode of ‘A Summer of Love’, which was going to develop this latter fair access theme in more detail?

My initial idea was to survey and synthesise the large number of other recently published studies on the topic. But, as I reviewed the content of these publications, it struck me that such a post would be stuffed full of descriptive detail but lack any real bite – by which I mean substantial and serious engagement with the central problem.

I decided to cut to the chase.

I also decided to foreground material about the highest reaches of A level attainment and progression to Oxbridge, not because I see the issue solely in these stratospheric terms, but because:

  • The top end of fair access is important in its own right, especially for those with a gifted education perspective. Oxford and Cambridge consistently declare themselves a special case and I wanted to explore the substance of their position.
  • There is compelling evidence that Oxford and Cambridge are amongst the weakest performers when it comes to fair access for the highest attaining disadvantaged learners. There are reasons why the task may be comparatively more difficult for them but, equally, as our most prestigious universities, they should be at the forefront when it comes to developing and implementing effective strategies to tackle the problem.
  • The Government has itself made Oxbridge performance a litmus test of progress (or lack of progress) on fair access and on higher education’s wider contribution to social mobility.

The first part of the post briefly reviews the range of measures and regulatory apparatus devoted to improving fair access. This is to provide a frame from which to explore the available data and its shortcomings, rather than an in-depth analysis of relative strengths and weaknesses. Readers who are familiar with this background may prefer to skip it.

The mid-section concentrates on the limited data in the public domain and how it has been (mis)interpreted.

The final section reviews the criticisms made by the SMCPC and, while endorsing them thoroughly, offers a set of further proposals – many of them data-driven – for ratcheting up our collective national efforts to reverse the unsatisfactory progress made to date.

.

A Troubling Tale of Unnecessary Complexity and Weak Regulation

.

A Proliferation of Measures

There is little doubt that we have a problem in England when it comes to progression to selective, competitive higher education (however defined) by learners from disadvantaged backgrounds (however defined).

We may not be unique in that respect, but that does not alter the fact that the problem is longstanding and largely unresolved.

The recent ‘State of the Nation 2013’ Report from the SMCPC says ‘there has been little change in the social profile of the most elite institutions for over a decade’, adding that ‘while some of the building blocks are in place to lift children off the bottom, opening up elites remains elusive.’

Part of the problem is that the debates about these respective definitions continue to receive disproportionate coverage. Such debates are sometimes deployed as a diversionary tactic, intentionally drawing us away from the unpalatable evidence that we are making decidedly poor headway in tackling the core issue.

The definitional complexities are such that they lend themselves to exploitation by those with a vested interest in preserving the status quo and defending themselves against what they regard as unwarranted state intervention.

I shall resist the temptation to explore the comparative advantages and disadvantages of different measures, since that would risk falling into the trap I have just identified.

But I do need to introduce some of the more prominent measures – and pin down some subtle distinctions – if only for the benefit of readers in other countries.

One typically encounters four different categorisations of competitive, selective higher education here in the UK:

  • Oxbridge – a convenient shorthand reference to Oxford and Cambridge Universities. These two institutions are commonly understood to be qualitatively superior to other UK universities and, although that advantage does not apply universally to every undergraduate course and subject, there is some academic support for treating them as a category in their own right.
  • Russell Group – The Russell Group was formed in 1994 and originally comprised 17 members. There are currently 24 members, 20 of them located in England, including Oxford and Cambridge. Four institutions – Durham, Exeter, Queen Mary’s and York – joined as recently as 2012 and membership is likely to increase as the parallel 1994 Group has just disbanded. DfE (as opposed to BIS) often uses Russell Group membership as its preferred proxy for selective, competitive higher education, although there are no objective criteria that apply exclusively to all members.
  • Sutton Trust 30 – The Sutton Trust originally identified a list of 13 universities, derived from ‘average newspaper league table rankings’. This list – Birmingham, Bristol, Cambridge, Durham, Edinburgh, Imperial, LSE, Nottingham, Oxford, St Andrews, UCL, Warwick and York – still appears occasionally in research commissioned by the Trust, although it was subsequently expanded to 30 institutions. In ‘Degrees of Success’, a July 2011 publication, they were described thus:

‘The Sutton Trust 30 grouping of highly selective universities comprises universities in Scotland, England and Wales with over 500 undergraduate entrants each year, where it was estimated that less than 10 per cent of places are attainable to pupils with 200 UCAS tariff points (equivalent to two D grades and a C grade at A-level) or less. These 30 universities also emerge as the 30 most selective according to the latest Times University Guide.’

The full list includes all but two of the Russell Group (Queen Mary’s and Queen’s Belfast) plus eight additional institutions.

  • Most selective HEIs (‘top third’) – used by BIS and in DfE’s experimental destination measures, this comprises broadly the top third of HEIs when ranked by the average A level attainment (best three A levels) of their entrants. The accompanying methodological note explains:

‘The HEIs included in this group change every year; although 94% of HEIs remained in the top third for 5 consecutive years, from 2006/07 to 2010/11. The calculation is restricted to the top three A level attainment; pupils who study other qualifications at Key Stage 5 will be excluded. Institutions with a considerable proportion of entrants who studied a combination of A levels and other qualifications may appear to have low scores. As the analysis covers students from schools and colleges in England, some institutions in other UK countries have scores based on small numbers of students. As this measure uses matched data, all figures should be treated as estimates.’

This categorisation includes seven further mainstream universities (Aston, City, Dundee, East Anglia, Goldsmiths, Loughborough, Sussex) plus a range of specialist institutions.

Indicators of educational disadvantage are legion, but these are amongst the most frequently encountered:

  • Eligibility for free school meals (FSM): DfE’s preferred measure. The term is misleading since the measure only includes learners who meet the FSM eligibility criteria and for whom a claim is made, so eligibility of itself is insufficient. Free school meals are available for learners in state-funded secondary schools, including those in sixth forms. From September 2014, eligibility will be extended to all in Years R, 1 and 2 and to disadvantaged learners in further education and sixth form colleges. The phased introduction of Universal Credit will also impact on the eligibility criteria (children of families receiving Universal Credit between April 2013 and March 2014 are eligible for FSM, but the cost of extending FSM to all Universal Credit recipients once fully rolled out is likely to be prohibitive). We do not yet know whether these reforms will cause DfE to select an alternative preferred measure and, if so, what that will be. Eligibility for the Pupil Premium is one option, more liberal than FSM, though this currently applies only up to age 16.
  • Residual Household Income below £16,000: This is broadly the household income level below which eligibility for free school meals applies. It is used by selective universities (Oxford included) because it can be applied universally, regardless of educational setting and whether or not free school meals have been claimed. Oxford explains that:

‘Residual income is based on gross household income (before tax and National Insurance) minus certain allowable deductions. These can include pension payments, which are eligible for certain specified tax relief, and allowances for other dependent children.’

The threshold is determined through the assessment conducted by Student Finance England, so is fully consistent with its guidance.

  • Low participation schools: This measure focuses on participation by school attended rather than where students live. It may be generic – perhaps derived from the Government’s experimental destinations statistics – or based on admissions records for a particular institution. As far as I can establish, there is no standard or recommended methodology: institutions decide for themselves the criteria they wish to apply.
  • POLAR (Participation Of Local Areas): HEFCE’s area-based classification of participation in higher education. Wards are categorised in five quintiles, with Quintile 1 denoting those with lowest participation. The current edition is POLAR 3.
  • Other geodemographic classifications: these include commercially developed systems such as ACORN and MOSAIC based on postcodes and Output Area Classification (OAC) based on census data. One might also include under this heading the Indices of Multiple Deprivation (IMD) and the associated sub-domain Income Deprivation Affecting Children Index (IDACI).
  • National Statistics Socio-Economic Classification (NS-SEC): an occupationally-based definition of socio-economic status applied via individuals to their households. There are typically eight classes:
  1. Higher managerial, administrative and professional
  2. Lower managerial, administrative and professional
  3. Intermediate
  4. Small employers and own account workers
  5. Lower supervisory and technical
  6. Semi-routine
  7. Routine
  8. Never worked and long-term unemployed

Data is often reported for NS-SEC 4-7.

Sitting alongside these measures of disadvantage is a slightly different animal – recruitment from state-funded schools and colleges compared with recruitment from the independent sector.

While this may be a useful social mobility indicator, it is a poor proxy for fair access.

Many learners attending independent schools are from comparatively disadvantaged backgrounds and, of course, substantially more learners at state-maintained schools are comparatively advantaged.

The Office For Fair Access (OFFA) confirms that:

‘in most circumstances we would not approve an access agreement allowing an institution to measure the diversity of its student body solely on the basis of the numbers of state school pupils it recruits….it is conceivable that a university could improve its proportion of state school students without recruiting greater proportions of students from disadvantaged groups.’

Nevertheless, independent/state balance continues to feature prominently in some access agreements drawn up by selective universities and approved by OFFA.

There is a risk that some institutions are permitted to give this indicator disproportionate attention, at the expense of their wider commitment to fair access.

 .

Securing National Improvement

Given the embarrassment of riches set out above, comparing progress between institutions is well-nigh impossible, let alone assessing the cumulative impact on fair access at national level.

When it came to determining their current strategy, the government of the day must have faced a choice between:

  • Imposing a standard set of measures on all institutions, ignoring complaints that those selected were inappropriate for some settings, particularly those that were somehow atypical;
  • Allowing institutions to choose their own measures, even though that had a negative impact on the rate of improvement against the Government’s own preferred national indicators; and
  •  A half-way house which insisted on universal adoption of one or two key measures while permitting institutions to choose from a menu of additional measures, so creating a basket more or less appropriate to their circumstances.

For reasons that are not entirely clear – but presumably owe something to vigorous lobbying from higher education interests – the weaker middle option was preferred and remains in place to this day.

The standard-setting and monitoring process is currently driven by OFFA, though we expect imminently the final version of a National Strategy for Access and Student Success, developed jointly with HEFCE.

A new joint process for overseeing OFFA’s access agreements (from 2015/16) and HEFCE’s widening participation strategic statements (from 2014-2017) will be introduced in early 2014.

There were tantalising suggestions that the status quo might be adjusted through work on the wider issue of evaluation.

An early letter referred to plans to:

‘Commission feasibility study to establish if possible to develop common evaluation measures that all institutions could adopt to assess the targeting and impact of their access and student success work’.

The report would be completed by Spring 2013.

Then an Interim Report on the Strategy said the study would be commissioned in ‘early 2013 to report in May 2013’ (Annex B).

It added:

‘Informal discussions with a range of institutional representatives have indicated that many institutions would welcome a much clearer indication of the kind of evidence and indicators that we would wish to see. Therefore a key strand within the strategy development will be work undertaken with the sector to develop an evaluation framework to guide them in their efforts to evidence the impact of their activity. Within this, we intend to test the feasibility of developing some common measures for the gathering of high-level evidence that might be aggregated to provide a national picture. We will also investigate what more can be done by national bodies including ourselves to make better use of national data sets in supporting institutions as they track the impact of their interventions on individual students.’

However, HEFCE’s webpage setting out research and stakeholder engagement in support of the National Strategy still says the study is ‘to be commissioned’ and that the publication date is ‘to be confirmed’.

I can find no explanation of the reasons for this delay.

For the time being, OFFA is solely responsible for issuing guidance to institutions on the content of their access agreements, approving the Agreements and monitoring progress against them.

OFFA’s website says:

‘Universities and colleges set their own targets based on where they need to improve and what their particular institution is trying to achieve under its access agreement…These targets must be agreed by OFFA. We require universities and colleges to set themselves at least one target around broadening their entrant pool. We also encourage (but do not require) them to set themselves further targets, particularly around their work on outreach and, where appropriate, retention. Most choose to do so. We normally expect universities and colleges to have a range of targets in order to measure their progress effectively. When considering whether targets are sufficiently ambitious, we consider whether they represent a balanced view of the institution’s performance, and whether they address areas where indicators suggest that the institution has furthest to go to improve access.

From 2012-13, in line with Ministerial guidance, we are placing a greater emphasis on progress against targets. We would not, however, impose a sanction solely on the basis of a university or college not meeting its targets or milestones.’

The interim report on a National Strategy suggests that – informally at least – many universities recognise that this degree of flexibility is not helpful to their prospects of improving fair access, either individually or collectively.

But the fact that the promised work has not been undertaken might imply a counterforce pushing in precisely the opposite direction.

The expectations placed on universities are further complicated by the rather unclear status of the annual performance indicators for widening participation of under-represented groups supplied by the Higher Education Statistics Agency (HESA).

HESA’s table for young full-time first degree entrants shows progress by each HEI against benchmarks for ‘from state schools or colleges’, ‘from NS-SEC classes 4, 5, 6 and 7’ and ‘from low participation neighbourhoods (based on POLAR3 methodology)’ respectively.

HESA describes its benchmarks thus:

‘Because there are such differences between institutions, the average values for the whole of the higher education sector are not necessarily helpful when comparing HEIs. A sector average has therefore been calculated which is then adjusted for each institution to take into account some of the factors which contribute to the differences between them. The factors allowed for are subject of study, qualifications on entry and age on entry (young or mature).’

HESA’s benchmarks are clearly influential in terms of the measures adopted in many access agreements (and much of the attention given to the state versus independent sector intake may be attributable to them).

On the other hand, the indicators receive rather cavalier treatment in the most recent access agreements from Oxford and Cambridge. Oxford applies the old POLAR2 methodology in place of the latest POLAR3, while Cambridge adjusts the POLAR3 benchmarks to reflect its own research.

The most recent 2011/12 HESA results for Oxford and Cambridge are as follows:

.

Institution     State schools              NS-SEC 4-7                 LPN (POLAR3)
                Benchmark   Performance    Benchmark   Performance    Benchmark   Performance
Oxford          71.2%       57.7%          15.9%       11.0%          4.7%        3.1%
Cambridge       71.4%       57.9%          15.9%       10.3%          4.5%        2.5%
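
To make the scale of the shortfall explicit, here is a minimal sketch (in Python; variable names are mine) that turns the table above into percentage-point gaps and the proportional uplift each institution would need to reach benchmark. The proportional figures correspond closely to the SMCPC ‘required percentage increase’ figures quoted later in this post, which appear to be calculated on a similar basis; small differences reflect rounding in the underlying data.

```python
# HESA 2011/12 benchmark and performance figures from the table above (percentages).
hesa = {
    "Oxford":    {"state schools": (71.2, 57.7), "NS-SEC 4-7": (15.9, 11.0), "LPN (POLAR3)": (4.7, 3.1)},
    "Cambridge": {"state schools": (71.4, 57.9), "NS-SEC 4-7": (15.9, 10.3), "LPN (POLAR3)": (4.5, 2.5)},
}

for institution, measures in hesa.items():
    for measure, (benchmark, performance) in measures.items():
        gap = benchmark - performance                    # shortfall in percentage points
        uplift_needed = benchmark / performance - 1      # proportional increase required to reach benchmark
        print(f"{institution:9} {measure:13} gap {gap:4.1f} points, uplift needed {uplift_needed:.1%}")
```

On these figures Oxford would need to increase its state school intake by roughly 23% and Cambridge its NS-SEC 4-7 intake by roughly 54% simply to reach benchmark – broadly the uplifts the SMCPC quotes below.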

.

That probably explains why Oxbridge would prefer an alternative approach! But the reference to further work in the Interim Strategy perhaps also suggests that few see these benchmarks as the best way forward.

.

National Targets

The Government also appears to be in something of a bind over its preferred measures for monitoring national progress.

When it comes to fair access (as opposed to widening participation) the Social Mobility Indicators rely exclusively on the gap between state and independent school participation at the most selective HEIs, as defined by BIS.

As noted above, this has major shortcomings as a measure of fair access, though more validity as a social mobility measure.

The relevant indicator shows that the gap remained between 37% and 39% from 2006 to 2010, but it has just been updated to show an unfortunate increase to 40% in 2010/11.

BIS uses the same measure as a Departmental Performance Indicator for its work on higher education.  The attachment on the relevant gov.uk page is currently the wrong one – which might indicate that BIS is less than comfortable with its lack of progress against the measure.

DfE takes a different approach, declaring an ‘Outcome of Education’ indicator:

‘Outcome of education:

i) Percentage of children on free school meals progressing to Oxford or Cambridge*.

ii) Percentage of children on free school meals progressing to a Russell Group university*.

iii) Percentage of children on free school meals progressing to any university*.

iv) Participation in education and work based training at age 16 to 17

*Available June 2013’

But progress against this indicator is nowhere to be found in the relevant section of the DfE website or, as far as I can establish, anywhere within the DfE pages on gov.uk.

.

.

Oxbridge Access Agreement Targets for 2014/15

Perhaps the best way to link this section with the next is by showing how Oxford and Cambridge have decided to frame the targets in their access agreements for 2014/15.

Oxford has OFFA’s agreement to target:

  • Schools and colleges that secure limited progression to Oxford. They use ‘historic UCAS data’ to estimate that ‘in any one year up to 1,680…will have no students who achieve AAA grades but, over a three-year period they may produce a maximum of two AAA candidates’. They also prioritise an estimated 1,175 institutions which have larger numbers achieving AAA grades ‘but where the success rate for an application to Oxford is below 10%’. In 2010, 19.4% of Oxford admissions were from these two groups and it plans to increase the proportion to 25% by 2016-17;
  • UK undergraduates from disadvantaged socio-economic backgrounds, based on ‘ACORN postcodes 4 and 5’. Some 7.6% of admissions came from these postcodes in 2010/11 and Oxford proposes to reach 9.0% by 2016/17.
  • UK undergraduates from neighbourhoods with low participation in higher education, as revealed by POLAR2. It will focus on ‘students domiciled in POLAR quintiles 1 and 2’. In 2012, 10.6% of admissions were from this group and Oxford proposes to increase this to 13.0% by 2016-17.

In addition to a target for admitting disabled students, Oxford also says it will monitor and report on the state/independent school mix, despite evidence ‘that this measure is often misleading as an indicator of social diversity’. It notes that:

‘30% of 2012 entrants in receipt of the full Oxford Bursary (students with a household income of £16,000 or less) were educated in the independent sector…The University will continue to monitor the level of students from households with incomes of £16,000 or less. It is considered that these are the most financially disadvantaged in society, and it is below this threshold that some qualify for receipt of free schools meals, and the pupil premium. The University does not consider that identifying simply those students who have been in receipt of free school meals provides a suitably robust indicator of disadvantage as they are not available in every school or college with post-16 provision, nor does every eligible student choose to receive them.

There are no national statistics currently available on the number of students whose household income is £16,000 or less and who attain the required academic threshold to make a competitive application to Oxford. In 2011-12, around one in ten of the University’s UK undergraduate intake was admitted from a household with this level of declared income.’

Meanwhile, Cambridge proposes only two relevant targets, one of them focused on the independent/state divide:

  • Increase the proportion of UK resident students admitted from UK state sector schools and colleges to between 61% and 63%. This is underpinned by the University’s research finding that ‘the proportion of students nationally educated at state schools securing examination grades in subject combinations that reflect our entrance requirements and the achievement level of students admitted to Cambridge stands at around 62%’.
  • Increase the proportion of UK resident students from low participation neighbourhoods to approximately 4% by 2016. It argues:

‘Currently HESA performance indicators and other national datasets relating to socio-economic background do not take adequate account of the entry requirements of individual institutions. Whilst they take some account of attainment, they do not do so in sufficient detail for highly selective institutions such as Cambridge where the average candidate admitted has 2.5 A* grades with specific subject entry requirements. For the present we have adjusted our HESA low participation neighbourhood benchmark in line with the results of our research in relation to state school entry and will use this as our target.’

Each of these approaches has good and bad points. Cambridge’s is more susceptible to the criticism that it is overly narrow. There is no real basis to compare the relative performance of the two institutions since there is negligible overlap between their preferred indicators. That may be more comfortable for them, but it is not in the best interests of their customers, or of those seeking to improve their performance.

 

Investigating the Data on High Attainment and Fair Access to Oxbridge

Those seeking statistics about high attainment amongst disadvantaged young people and their subsequent progression to Oxbridge are bound to be disappointed.

There is no real appreciation of the excellence gap in this country and this looks set to continue. The fact that gaps between advantaged and disadvantaged learners are typically wider at the top end of the attainment distribution seems to have acted as a brake on the publication of data that proves the point.

It is possible that the current round of accountability reforms will alter this state of affairs, but this has not yet been confirmed.

For the time being at least, almost all published statistics about high A level attainment amongst disadvantaged learners have come via answers to Parliamentary Questions. This material invariably measures disadvantage in terms of FSM eligibility.

Information about the admission of disadvantaged learners to Oxbridge is equally scant, but a picture of sorts can be built up from a mixture of PQ replies, university admission statistics and the DfE’s destination measures. The material supplied by the universities draws on measures other than FSM.

The following two sections set out what little we know, including the ever important statistical caveats.

.

High Attainment Data

  • In 2003, 94 students (1.9%) eligible for FSM achieved three or more A grades at A level. The figures relate to 16-18 year-olds in maintained schools only who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are included. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • In 2008/09, 232 pupils at maintained mainstream schools eligible for FSM achieved three or more A grades at A level (including applied A level and double award), 179 of them attending comprehensive schools. The figures exclude students in FE and sixth form colleges previously eligible for FSM. (Parliamentary Question, 7 April 2010, Hansard (Col 1451W))
  • The numbers of Year 13 A level candidates eligible for FSM in Year 11 achieving 3 or more A grades at A level (including applied A levels and double award) were: 2006 – 377; 2007 – 433; 2008 – 432; 2009 – 509. These figures include students in both the schools and FE sectors. (Parliamentary Question, 27 July 2010, Hansard (Col 1223W))

 .

To summarise, the total number of students who were FSM-eligible at age 16 and went on to achieve three or more GCE A levels at Grade A*/A – including those in maintained schools, sixth form and FE colleges – has been increasing significantly since 2006.

Year       2006    2007    2008    2009    2010    2011
Number     377     433     432     509     ?       546

The overall increase between 2006 and 2011 is about 45%.
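
For completeness, here is a minimal check of that headline figure using only the numbers in the table above (the variable name is illustrative; the missing 2010 figure is simply omitted):

```python
# FSM-eligible students achieving three or more A*/A grades at A level
# (schools and FE sectors combined), from the PQ answers summarised above.
# No figure is available for 2010.
fsm_high_attainers = {2006: 377, 2007: 433, 2008: 432, 2009: 509, 2011: 546}

increase = (fsm_high_attainers[2011] - fsm_high_attainers[2006]) / fsm_high_attainers[2006]
print(f"Increase, 2006 to 2011: {increase:.1%}")   # 44.8%, i.e. roughly 45%
```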

 .

Oxbridge Admission/Acceptance Data

  • The numbers of learners eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19 between 2005/06 and 2008/09 (rounded to the nearest five) were:

                2005/06    2006/07    2007/08    2008/09
Oxford          25         20         20         25
Cambridge       20         25         20         20
TOTAL           45         45         40         45

Sources: Parliamentary Question, 13 December 2010, Hansard (Col 549W) and Parliamentary Question 21 February 2012, Hansard (Col 755W)

.

[Postscript (January 2014):

In January 2014, BIS answered a further PQ which provided equivalent figures for 2009/10 and 2010/11 – again rounded to the nearest five and derived from matching the National Pupil Database (NPD), HESA Student Record and the Individualised Learner Record (ILR) owned by the Skills Funding Agency.

The revised table is as follows:

                2005/06    2006/07    2007/08    2008/09    2009/10    2010/11
Oxford          25         20         20         25         15         15
Cambridge       20         25         20         20         25         25
TOTAL           45         45         40         45         40         40

 

Sources:

Parliamentary Question, 13 December 2010, Hansard (Col 549W)

Parliamentary Question 21 February 2012, Hansard (Col 755W)

Parliamentary Question 7 January 2014, Hansard (Col 191W)

Although the 2010/11 total is marginally more positive than the comparable figure derived from the Destination Indicators (see below) this confirms negligible change overall during the last six years for which data is available.  The slight improvement at Cambridge during the last two years of the sequence is matched by a corresponding decline at Oxford, from what is already a desperately low base.]

.

  • DfE’s experimental destination measures for 2010/11 show the following numbers of KS5 students progressing to higher education, together with the number and percentage who were eligible for and claiming FSM in Year 11:

                        Number     %age FSM    Number FSM
UK HEIs                 164,620    6%          10,080
Top third of HEIs       49,030     4%          2,000
Russell Group           28,620     3%          920
Oxbridge                2,290      1%          30

.

.

These are experimental statistics and all figures – including the 30 at Oxbridge – are rounded to the nearest 10. The introductory commentary explains that:

‘This statistical first release (experimental statistics) on destination measures shows the percentage of students progressing to further learning in a school, further education or sixth-form college, apprenticeship, higher education institution or moving into employment or training.’

It adds that:

‘To be included in the measure, young people have to show sustained participation in an education or employment destination in all of the first 2 terms of the year after they completed KS4 or took A level or other level 3 qualifications. The first 2 terms are defined as October to March.’

The Technical Notes published alongside the data also reveal that: it includes only learners aged 16-18 who have entered at least one A level or an equivalent L3 qualification; the data collection process incorporates ‘an estimate of young people who have been accepted through the UCAS system for entry into the following academic year’ but ‘deferred acceptances are not reported as a distinct destination’; and FSM data for KS5 learners relates to those eligible for and claiming FSM in Year 11.

  • Cambridge’s 2012 intake ‘included 50+ students who had previously been in receipt of FSM’ (It is not stated whether all were eligible in Year 11, so it is most likely that this is the number of students who had received FSM at one time or another in their school careers.) This shows that Cambridge at least is collecting FSM data that it does not publish amongst its own admission statistics or use in its access agreement. (Cambridge University Statement, 26 September 2013)
  • In 2012, Cambridge had 418 applications from the most disadvantaged POLAR2 quintile (4.6% of all applicants) and, of those, 93 were accepted (3.6% of all acceptances), giving a 22.2% success rate. (Cambridge University Admission Statistics 2012, page 23)

.

To summarise, the numbers of disadvantaged learners progressing to Oxbridge are very small; exceedingly so as far as those formerly receiving FSM are concerned.

Even allowing for methodological variations, the balance of evidence suggests that, at best, the numbers of FSM learners progressing to Oxbridge have remained broadly the same since 2005.

During that period, the concerted efforts of the system described above have had zero impact. The large sums invested in outreach and bursaries have made not one iota of difference.

This is true even though the number of FSM-eligible students achieving the AAA A level benchmark has increased by about 45%. If Oxbridge admission were solely dependent on attainment, one would have expected a commensurate increase, to around 65 FSM entrants per year.

On the basis of the 2010/11 Destination Indicators, we can estimate that, whereas Oxbridge admits approximately 8% of all Russell Group students, it only admits slightly over 3% of Russell Group FSM students. If Oxbridge achieved the performance of its Russell Group peers, the numbers of formerly FSM admissions would be over 100 per year.
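
The expected figure of around 65 entrants and the 8% versus 3% comparison can both be reproduced from the published figures. A minimal sketch, taking the rounded destination-measure figures at face value (variable names are mine):

```python
# 1. Expected FSM entrants if Oxbridge admissions had kept pace with the ~45%
#    rise in FSM students achieving three or more A*/A grades.
baseline_fsm_entrants = 45        # approximate annual FSM-eligible entrants to Oxbridge since 2005/06
attainment_growth = 0.45
print(round(baseline_fsm_entrants * (1 + attainment_growth)))     # ~65

# 2. Oxbridge's share of all Russell Group entrants versus its share of Russell Group
#    FSM entrants, from the 2010/11 destination measures table above (rounded figures).
oxbridge_all, russell_all = 2290, 28620
oxbridge_fsm, russell_fsm = 30, 920
print(f"{oxbridge_all / russell_all:.1%} of all entrants, "
      f"{oxbridge_fsm / russell_fsm:.1%} of FSM entrants")         # ~8.0% versus ~3.3%
```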

.

Misleading Use of This Data

To add insult to injury, this data is frequently misinterpreted and misused. Here are some examples, all of which draw selectively on the data set out above.

  • ‘Of the 80,000 FSM-eligible students in the UK only 176 received three As at A level…more than one quarter of those students…ended up at either Oxford or Cambridge’ – Nicholson (Oxford Undergraduate Admissions Director, Letter to Guardian, 7 March 2011)
  • ‘Of the 80,000 children eligible for free school meals in the UK in 2007, only 176 received 3 As at A level. Of those 45 (more than a quarter) got places at Oxford or Cambridge’ (Undated Parliamentary Briefing ‘Access and admissions to Oxford University’ )
  • ‘The root causes of underrepresentation of students from poorer backgrounds at leading universities include underachievement in schools and a lack of good advice on subject choices. For example, in 2009 only 232 students who had been on free school meals (FSM) achieved 3As at A-level or the equivalent.  This was 4.1% of the total number of FSM students taking A-levels, and less than an estimated 0.3% of all those who had received free school meals when aged 15.’ (Russell Group Press release, 23 July 2013).
  • ‘Such data as is available suggests that less than 200 students per year who are recorded as being eligible for FSM secure grades of AAA or better at A level. The typical entrance requirement for Cambridge is A*AA, and so on that basis the University admits in excess of one quarter of all FSM students who attain the grades that would make them eligible for entry.’ (Cambridge University Statement, 26 September 2013)
  • ‘According to data produced by the Department for Children, Schools and Families, of the 4,516 FSM students who secured a pass grade at A Level in 2008 only 160 secured the grades then required for entry to the University of Cambridge (ie AAA). Students who were eligible for FSM therefore make up less than 1% of the highest achieving students nationally each year.

Assuming that all 160 of these students applied to Oxford or Cambridge in equal numbers (ie 80 students per institution) and 22 were successful in securing places at Cambridge (in line with the 2006-08 average) then this would represent a success rate of 27.5% – higher than the average success rate for all students applying to the University (25.6% over the last three years). In reality of course not every AAA student chooses to apply to Oxford or Cambridge, for instance because neither university offers the course they want to study, e.g. Dentistry.’ (Cambridge Briefing, January 2011 repeated in Cambridge University Statement, 26 September 2013)

.

.

To summarise, Oxford, Cambridge and the Russell Group are all guilty of implying that FSM-eligible learners in the schools sector are the only FSM-eligible learners progressing to selective universities.

They persist in using the school sector figures even though combined figures for the school and FE sectors have been available since 2010.

Oxbridge’s own admission statistics show that, in 2012:

  • 9.6% of acceptances at Cambridge (332 students) were extended to students attending sixth form, FE and tertiary colleges (UK figures)
  • 10.5% of UK domiciled acceptances at Oxford (283 students) were extended to students attending sixth form colleges and FE institutions of all types

We can rework Cambridge’s calculation using the figure of 546 students with three or more A*/A grades in 2011 (a short worked sketch follows the list):

  • assuming that all applied to Oxford and Cambridge in equal numbers gives a figure of 273 per institution
  • assuming a success rate of 25.6% – the average over the last three years
  • the number of FSM students that would have been admitted to Cambridge is roughly 70.
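
Spelled out as a minimal sketch, on the same simplifying assumptions as the bullets above (an equal split of applicants between the two universities and Cambridge’s quoted average success rate):

```python
# Reworking Cambridge's illustrative calculation with the combined schools and FE
# figure of 546 FSM students achieving three or more A*/A grades in 2011.
fsm_aaa_2011 = 546
applicants_per_institution = fsm_aaa_2011 / 2     # assume an equal split between Oxford and Cambridge
success_rate = 0.256                              # Cambridge's quoted three-year average success rate
print(round(applicants_per_institution * success_rate))    # roughly 70
```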

Part of the reason why high-attaining disadvantaged students do not apply to Oxbridge may be that they want to study one of the relatively few mainstream subjects, such as dentistry, which are not available there.

But it is highly likely that other factors are at play, including the perception that Oxbridge is not doing all that it might to increase numbers of disadvantaged students from the state sector.

If this favourable trend in A level performance stalls, as a consequence of recent A level reforms, it will not be reasonable – in the light of the evidence presented above – for Oxbridge to argue that this is impacting negatively on the admission of FSM-eligible learners.

.

Building on the work of the SMCPC

 

‘Higher Education: The Fair Access Challenge’

There is no shortage of publications on fair access and related issues, with a steady stream appearing in the last year alone.

Easily the most impressive has been the Social Mobility and Child Poverty Commission’s ‘Higher Education: The Fair Access Challenge’ (June 2013), though it does tend to rely a little too heavily on evidence of the imbalance between state and independent-educated students.

.

.

It examines the response of universities to recommendations first advanced in an earlier publication ‘University Challenge: How Higher Education Can Advance Social Mobility’ (2012) published by Alan Milburn, now Chair of the Commission, in his former role as Independent Reviewer on Social Mobility.

The analysis sets out key points from the earlier work:

  • Participation levels at the most selective universities by the least advantaged are unchanged since the mid-90s.
  • The most advantaged young people are seven times more likely to attend the most selective universities than the most disadvantaged.
  • The probability of a state secondary pupil eligible for FSM in Year 11 entering Oxbridge by 19 is almost 2000 to 1; for a privately educated pupil the probability is 20 to 1.

New research is presented to show that the intake of Russell Group universities has become less socially representative in the last few years:

  • The number of state school pupils entering Russell Group Universities has increased by an estimated 2.6% from 2002/03 to 2011/12, but the corresponding increase in privately educated entrants is 7.9%. The proportion of young full-time state-educated entrants has consequently fallen from 75.6% to 74.6% over this period. The worst performers on this measure are: Durham (-9.2%), Newcastle (-4.6%), Warwick (-4.5%) and Bristol (-3.9%). The best are: Edinburgh (+4.6%), UCL (+3.3%), LSE (+3.0%) and Southampton (+2.9%). The Oxbridge figures are: Cambridge (+0.3%) and Oxford (+2.3%).
  • Similarly, the proportion of young full-time entrants from NS-SEC classes 4-7 has fallen from 19.9% in 2002/03 to 19.0% in 2011/12. A table (reproduced below) shows that the worst offenders on this measure are Queen’s Belfast (-4.6%), Liverpool (-3.2%), Cardiff (-2.9%) and Queen Mary’s (-2.7%). Conversely, the best performers are Nottingham (+2.2%), York (+0.9%), Warwick and LSE (+0.8%). The figures for Oxbridge are: Cambridge (-1.0%) and Oxford (0.0%).

.

[Table: change in the proportion of young full-time entrants from NS-SEC classes 4-7, 2002/03 to 2011/12, by institution]

  • An estimated 3,700 state-educated learners have the necessary grades for admission to Russell Group universities but do not take up places. This calculation is based on the fact that, if all of the 20 Russell Group universities in England achieved their HESA widening participation benchmarks, they would have recruited an extra 3,662 students from state schools. (The benchmarks show how socially representative each intake would be if it were representative of all entrants with the grades required for entry – though see Cambridge’s reservations on this point, above.) Some universities would need to increase significantly the percentage of state students recruited – for example, Bristol and Durham (26.9%), Oxford (23.4%) and Cambridge (23.3%).
  • Using the same methodology to calculate the shortfall per university in NS-SEC 4-7 students results in the table below, showing that the worst offenders require percentage increases of 54.4% (Cambridge), 48.5% (Bristol), 45.5% (Oxford) and 42.2% (Durham). Conversely, Queen Mary’s, Queen’s Belfast, LSE and King’s College are over-recruiting from this population on this measure.

.

[Table: estimated shortfall in NS-SEC 4-7 entrants against HESA benchmarks, by institution]

  • Even if every Russell Group university met the self-imposed targets in its access agreement, the number of ‘missing’ state educated students would drop by only 25% by 2016/17, because the targets are insufficiently ambitious. (This is largely because only seven have provided such targets in their 2013/14 access agreements and there are, of course, no collective targets.)
  • Boliver’s research is cited to show that there is a gap in applications from state school pupils compared with those educated in the independent sector. But there is also evidence that a state school applicant needs, on average, one grade higher in their A levels (eg AAA rather than AAB) to be as likely to be admitted as an otherwise identical student from the independent sector.
  • A Financial Times analysis of 2011 applications to Oxford from those with very good GCSEs found that those from independent schools were 74% more likely to apply than those from the most disadvantaged state secondary schools. Amongst applicants, independently educated students were more than three times as likely to be admitted as their peers in disadvantaged state schools. They were also 20% more likely to be admitted than those at the 10% most advantaged state secondary schools. As shown by the table below, the probabilities involved varied considerably. The bottom line is that the total probability of a place at Oxford for an independent school student is 2.93%, whereas the comparable figure for a student at one of the 10% most disadvantaged state secondary schools is just 0.07%.

.

[Table: probability of application and admission to Oxford by school type, from the Financial Times analysis of 2011 applications]

When it comes to the causes of the fair access gap, subject to controls for prior attainment, the report itemises several contributory factors, noting the limited evidence available to establish their relative importance and interaction:

  • low aspirations among students, parents and teachers
  • less knowledge of the applications process, problems in demonstrating potential through the admissions process and a tendency to apply to the most over-subscribed courses
  • not choosing the right A level subjects and teachers’ under-prediction of expected A level grades
  • a sense that selective universities ‘are socially exclusive and “not for the likes of them”’

The Report states unequivocally that:

‘The Social Mobility and Child Poverty Commission is deeply concerned about the lack of progress on fair access. The most selective universities need to be doing far more to ensure that they are recruiting from the widest possible pool of talent. The Commission will be looking for evidence of a step change in both intention and action in the years to come.’

It identifies several areas for further action, summarising universities’ responses to ‘University Challenge’:

  • Building links between universities and schools: The earlier report offered several recommendations, including that universities should have explicit objectives to help schools close attainment gaps. No evidence is given to suggest that such action is widespread, though many universities are strengthening their outreach activities and building stronger relationships with the schools sector. Several universities highlighted the difficulties inherent in co-ordinating their outreach activity given the demise of Aimhigher, but several retain involvement in a regional partnership.
  • Setting targets for fair access: The earlier report recommended that HE representative bodies should set statistical targets for progress on fair access over the next five years. This was not met positively:

‘Representative bodies in the Higher Education Sector did not feel this would be a useful step for them to take, saying that it was difficult to aggregate the different targets that individual institutions set themselves. There was also a feeling among some highly selective institutions that the report overestimated the number of students who have the potential to succeed at the most selective universities.’

Nevertheless, the Commission is insistent:

‘The Commission believes it is essential that the Russell Group signals its determination to make a real difference to outcomes by setting a clear collective statistical target for how much progress its members are aiming to make in closing the ‘fair access gap’. Not doing so risks a lack of sustained focus among the most selective universities’.

  • Using contextual admissions data: The report argues that ‘there is now a clear evidence base that supports the use of contextual data’. Recommendations from the earlier report were intended to universalise the use of contextual data, including commitment from the various representative bodies through a common statement of support and a collaborative guide to best practice. There is no sign of the former, although the Commission reports ‘widespread agreement that the use of contextual data during the admissions process should be mainstreamed’. However it notes that there is much more still to do. (The subsequent SPA publication should have helped to push forward this agenda.)
  • Reforming the National Scholarship Programme: The earlier report called on the Government to undertake a ‘strategic review of government funding for access’ to include the National Scholarship Programme (NSP). The suggestion that the imminent HEFCE/OFFA National Strategy should tackle the issue has been superseded by a Government decision to refocus the NSP on postgraduate education.
  • Postgraduate funding reform: The earlier report recommended work on a postgraduate loan scheme and further data collection to inform future decisions. The current report says that:

‘…the Government appears to have decided against commissioning an independent report looking at the issue of postgraduate access. This is very disappointing.’

and calls on it ‘to take heed’. However, this has again been superseded by the NSP announcement.

The SMCPC’s ‘State of the Nation 2013’ report reinforces its earlier publication, arguing that:

‘…despite progress, too much outreach work that aims to make access to university fairer and participation wider continues to rely on unproven methods or on work that is ad hoc, uncoordinated and duplicative… These are all issues that the higher education sector needs to address with greater intentionality if progress is to be made on breaking the link between social origin and university education.

The UK Government also needs to raise its game… much more needs to be done… to address the loss of coordination capacity in outreach work following the abolition of Aimhigher.’

It recommends that:

‘All Russell Group universities should agree five-year aims to close the fair access gap, all universities should adopt contextual admissions processes and evidence-based outreach programmes, and the Government should focus attention on increasing university applications from mature and part-time students.’

 .

What Else Might Be Done?

I set myself the challenge of drawing up a reform programme that would build on the SMCPC’s recommendations but would also foreground the key issues I have highlighted above, namely:

  • A significant improvement in the rate of progression for disadvantaged high-attaining learners to Oxbridge;
  • A more rigorous approach to defining, applying and monitoring improvement measures; and
  • The publication of more substantive and recent data

A determined administration that is prepared to take on the vested interests could do worse than pursue the following 10-point plan:

  • 1. Develop a new approach to specifying universities’ fair access targets for young full-time undergraduate students. This would require all institutions meeting the BIS ‘most selective HEI’ criteria to pursue two universal measures and no more than two measures of their own devising, so creating a basket of no more than four measures. Independent versus state representation could be addressed as one of the two additional measures.
  • 2. The universal measures should relate explicitly to students achieving a specified A level threshold that has currency at these most selective HEIs. It could be pitched at the equivalent of ABB at A level, for example. The measures should comprise:
    • A progression measure for all learners eligible for the Pupil Premium in Year 11 of their secondary education (so a broader measure than FSM eligibility); and
    • A progression measure for all learners – whether or not formerly eligible for the Pupil Premium – attending a state-funded sixth form or college with a relatively poor historical record of securing places for their learners at such HEIs. This measure would be nationally defined and standardised across all institutions other than Oxbridge.
  • 3. In the case of Oxford and Cambridge the relevant A level tariff would be set higher, say at the equivalent of AAA grades at A level, and the nationally defined  ‘relatively poor historical record’ would reflect only Oxbridge admission.
  • 4. These two universal measures would be imposed on institutions through the new National Strategy for Access and Student Success. All institutions would be required to set challenging but realistic annual targets. There would be substantial financial incentives for institutions achieving their targets and significant financial penalties for institutions that fail to achieve them.
  • 5. The two universal measures would be embedded in the national Social Mobility Indicators and the KPIs of BIS and DfE respectively.
  • 6. Central Government would publish annually data setting out:
    • The number and percentage of formerly Pupil Premium-eligible learners achieving the specified A level thresholds for selective universities and Oxbridge respectively.
    • A ‘league table’ of the schools and colleges with relatively poor progression to selective universities and Oxbridge respectively.
    • A ‘league table’ of the universities with relatively poor records of recruitment from these schools and colleges.
    • A time series showing the numbers of students and percentage of their intake drawn from these two populations by selective universities and Oxbridge respectively each year. This should cover both applications and admissions.
  • 7. All parties would agree new protocols for data sharing and transparency, including tracking learners through unique identifiers across the boundaries between school and post-16 and school/college and higher education, so ensuring that the timelag in the publication of this data is minimal.
  • 8. Universities defend fiercely their right to determine their own undergraduate admissions without interference from the centre, meaning that the business of driving national improvement is much more difficult than it should be. But, given the signal lack of progress at the top end of the attainment distribution, there are strong grounds for common agreement to override this autonomy in the special case of high-achieving disadvantaged students.  A new National Scholarship Scheme should be introduced to support learners formerly in receipt of the Pupil Premium who go on to achieve the Oxbridge A Level tariff:
    • Oxford and Cambridge should set aside 5% additional places per year (ie on top of their existing complement) reserved exclusively for such students. On the basis of 2012 admissions figures, this would amount to almost exactly 250 places for England divided approximately equally between the two institutions (the scheme could be for England only or UK-wide). This would provide sufficient places for approximately 45% of those FSM learners currently achieving 3+ A*/A grades.
    • All eligible students with predicted grades at or above the tariff would be eligible to apply for one of these scholarship places. Admission decisions would be for the relevant university except that – should the full allocation not be taken up by those deemed suitable for admission who go on to achieve the requisite grades – the balance would be made available to the next best applicants until the quota of places at each university is filled.
    • The Government would pay a premium fee set 50% above the going rate (so £4,500 per student per annum currently) for each National Scholarship student admitted to Oxbridge. However, the relevant University would be penalised the full fee plus the premium (so £13,500 per student per year) should the student fail to complete their undergraduate degree with a 2.2 or better. Penalties would be offset against the costs of running the scheme. Assuming fees remain unchanged and 100% of students graduate with a 2.2 or better, this would cost the Government £1.125m pa.
  • 9. In addition, the Government would support the establishment of a National Framework Programme covering Years 9-13, along the lines set out in my November 2010 post on this topic, with the explicit aim of increasing the number of Pupil Premium-eligible learners who achieve these tariffs. The budget could be drawn in broadly equal proportions from Pupil Premium/16-19 bursary funding, a matched topslice from universities’ outreach expenditure and a matched sum from the Government. If the programme supported 2,500 learners a year to the tune of £2,500 per year, the total steady state cost would be slightly over £30m, approximately £10m of which would be new money (though even this could be topsliced from the overall Pupil Premium budget). A rough costing sketch for points 8 and 9 follows this list.
  • 10. The impact of this plan would be carefully monitored and evaluated, and adjusted as appropriate to maximise the likelihood of success. It would be a condition of funding that all selective universities would continue to comply with the plan.
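
To make the arithmetic behind points 8 and 9 explicit, here is a rough costing sketch. All of the inputs – 250 places, a premium of 50% on the current £9,000 fee, the 546 FSM learners achieving 3+ A*/A grades in 2011, and 2,500 learners per cohort at £2,500 each across five year groups – are taken from the plan above; the calculation itself is simple multiplication.

```python
# Point 8: National Scholarship Scheme premium payments.
scholarship_places = 250                   # ~5% additional places across Oxford and Cambridge
premium_per_student = 9000 // 2            # 50% above the current £9,000 fee, i.e. an extra £4,500 per student
print(scholarship_places * premium_per_student)     # £1,125,000 pa, i.e. £1.125m
print(f"{scholarship_places / 546:.0%}")            # ~46% of the 546 FSM learners with 3+ A*/A grades

# Point 9: National Framework Programme for Years 9-13 (five year groups in steady state).
learners_per_cohort = 2500
support_per_learner = 2500                 # £ per learner per year
steady_state_cost = learners_per_cohort * support_per_learner * 5
print(steady_state_cost)                   # £31,250,000 - 'slightly over £30m'
print(round(steady_state_cost / 3))        # ~£10.4m: the roughly one third that would be new money
```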

Do I honestly believe anything of this kind will ever happen?

.

flying pig capture

.

GP

November 2013

A Summer of Love for English Gifted Education? Episode 2: Ofsted’s ‘The Most Able Students’

 .

This post provides a close analysis of Ofsted’s Report: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

.


[Image: summer of love 1967 by 0 fairy 0]

This is the second post in a short series, predicated on the assumption that we are currently enjoying a domestic ‘summer of love’ for gifted education.

According to this conceit, the ‘summer of love’ is built around three official publications, all of them linked in some way with the education of gifted learners, and various associated developments.

Part One in the series introduced the three documents:

  • A DfE-commissioned investigation of the new Key Stage 2 Level 6 tests (the ‘KS2 L6 Investigation’);
  • An Ofsted Survey of how schools educate their most able pupils (still unpublished at that point); and
  • A planned ‘Investigation of school and college level strategies to raise the aspirations of high-achieving disadvantaged pupils to pursue higher education’, this report programmed for publication in September 2013.

It provided a full analysis of the KS2 L6 Investigation and drew on the contractual specification for the Investigation of aspiration-raising strategies to set out what we know about its likely content and coverage.

It also explored the pre-publicity surrounding Ofsted’s Survey, which was conducted exclusively by HMCI Wilshaw through the media. (There was no official announcement on Ofsted’s own website, though the Survey did at least feature in its schedule of forthcoming publications.)

Part One also introduced a benchmark for the ‘The most able students’, in the shape of a review of Ofsted’s last foray into this territory – a December 2009 Survey called ‘Gifted and talented pupils in schools’.

I will try my best not to repeat too much material from Part One in this second Episode so, if you feel a little at sea without this background detail, I strongly recommend that you start with the middle section of that first post before reading this one.

I will also refer you, at least once, to various earlier posts of mine, including three I wrote on the day ‘The most able students’ was published:

  • My Twitter Feed – A reproduction of the real time Tweets I published immediately the Report was made available online, summarising its key points and recommendations and conveying my initial reactions and those of several influential commentators and respondents. (If you don’t like long posts, go there for the potted version!);

Part Two is dedicated almost exclusively to analysis of ‘The most able students’ and the reaction to its publication to date.

It runs a fine-tooth comb over the content of the Report, comparing its findings with those set out in Ofsted’s 2009 publication and offering some judgement as to whether it possesses the ‘landmark’ qualities boasted of it by HMCI in media interviews and/or whether it justifies the criticism heaped on it in some quarters.

It also matches Ofsted’s findings against the Institutional Quality Standards (IQS) for Gifted Education – the planning and improvement tool last refreshed in 2010 – to explore what that reveals about the coverage of each document.

For part of my argument is that, if schools are to address the issues exposed by Ofsted, they will need help and support to do so – not only a collaborative mechanism such as that proposed in ‘Driving Gifted Education Forward’ – but also some succinct, practical guidance that builds on the experience developed during the lifetime of the late National Gifted and Talented Programme.

Indeed – if you’d like a single succinct take-away from this analysis – I firmly believe that it is now timely for the IQS to be reviewed and updated to better reflect current policy and the new evidence base created in part by Ofsted and the other two publications I am ‘celebrating’ as part of the Summer of Love.

Oh, and if you want to find out more about my ‘big picture’ vision, may I refer you finally to the Gifted Phoenix Manifesto for Gifted Education.

But now it’s high time I began to engage you directly with what has proved to be a rather controversial text.

.

Ofsted’s Definition of ‘Most Able’

The first thing to point out is that Ofsted’s Report is focused very broadly in one sense, but rather narrowly in another.

The logic-defying definition of ‘most able students’ Ofsted adopts – for the survey that informs the Report – is tucked away in a footnote divided between the bottom of pages 6 and 7 of the Report.

This says:

For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.

It is hard to reconcile this definition with the emphasis in the title of the Report on ‘the most able students’, which suggests a much narrower population at one extreme of an ability distribution (not an attainment distribution, although most of the Report is actually about high attaining students, something quite different).

In fact, Ofsted’s sample includes:

  • All pupils achieving Level 5 and above in English – 38% of all pupils taking end KS2 tests in 2012 achieved this.
  • All pupils achieving Level 5 and above in maths – 39% of all pupils achieved this in 2012.
  • We also know that 27% of pupils achieved Level 5 or above in both English and maths in 2012. This enables us to deduce that approximately 11% of pupils managed Level 5 only in English and approximately 12% only in maths.
  • So, adding these three together, we get 27% + 11% + 12% = 50%. In other words, we have already included fully half of the entire pupil population and have so far counted only ‘high attaining’ pupils (the arithmetic is checked in the sketch after this list).
  • But we also need to include a further proportion of pupils who ‘have the potential’ to achieve Level 5 in one or other of these subjects but do not do so. This sub-population is unquantifiable, since Ofsted gives only the example of EAL pupils, rather than the full range of qualifying circumstances it has included. A range of different special needs might also cause a learner to be categorised thus. So might a particularly disadvantaged background (although that rather cuts across other messages within the Report). In practice, individual learners are typically affected by the complex interaction of a whole range of different factors, including gender, ethnic and socio-economic background, special needs, month of birth – and so on. Ofsted fails to explain which factors it has decided are within scope and which outside, or to provide any number or percentage for this group that we can tack on to the 50% already deemed high attainers.
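
The deduction in the bullets above is simple inclusion-exclusion, as this minimal check of the quoted 2012 percentages shows (the variable names are mine):

```python
# Share of the 2012 KS2 cohort covered by the attainment part of Ofsted's definition.
english_l5 = 38   # % achieving Level 5+ in English
maths_l5 = 39     # % achieving Level 5+ in maths
both_l5 = 27      # % achieving Level 5+ in both subjects

english_only = english_l5 - both_l5            # 11%
maths_only = maths_l5 - both_l5                # 12%
either = both_l5 + english_only + maths_only   # equivalently 38 + 39 - 27 = 50%

print(english_only, maths_only, either)        # 11 12 50
# The unquantified 'potential to attain Level 5' group then sits on top of this 50%.
```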

Some might regard this lack of precision as unwarranted in a publication by our national Inspectorate, finding reason therein to ignore the important findings that Ofsted presents later in the Report. That would be unfortunate.

Not only is Ofsted’s definition very broad, it is also idiosyncratic, even in Government terms, because it is not the same as the slightly less generous version in the Secondary School Performance Tables, which is based on achievement of Level 5 in Key Stage 2 tests of English, maths and science.

So, according to this metric, Ofsted is concerned with the majority of pupils in our secondary schools – several million in fact.

But ‘The Most Able Students’ is focused exclusively on the segment of this population that attends non-selective 11-16 and 11-18 state schools.

We are told that only 160,000 students from a total of 3.235m in state-funded secondary schools attend selective institutions.

Another footnote adds that, in 2012, of 116,000 students meeting Ofsted’s ‘high attainers’ definition in state-funded schools who took GCSEs in English and maths, around 100,000 attended non-selective schools, compared with 16,000 in selective schools (so some 86%).

This imbalance is used to justify the exclusion of selective schools from the evidence base, even though some further direct comparison of the two sectors might have been instructive – possibly even supportive of the claim that there is a particular problem in comprehensive schools that is not found in selective institutions. Instead, we are asked to take this claim largely on trust.

.

Exeter1 by Gifted Phoenix

Exeter1 by Gifted Phoenix

.

The Data-Driven Headlines

The Report includes several snippets of data-based evidence to illustrate its argument, most of which relate to subsets of the population it has rather loosely defined, rather than that population as a whole. This creates a problematic disconnect between the definition and the data.

One can group the data into three categories: material relating to progression between Key Stages 2 and 4, material relating to achievement of AAB+ grades at A level in the so-called ‘facilitating subjects’ and material drawn from international comparisons studies. The first predominates.

 .

Data About Progression from KS2 to KS4

Ofsted does not explain up front the current expectation that pupils should make at least three full levels of progress between the end of Key Stage 2 and the end of Key Stage 4, or explore the fact that this assumption must disappear when National Curriculum levels go in 2016.

The conversion tables say that pupils achieving Level 5 at the end of Key Stage 2 should manage at least a Grade B at GCSE. Incidentally – and rather confusingly – that also includes pupils who are successful in the new Level 6 tests.

Hence the expectation offers no extra challenge to some of the very highest attainers, who need only make two levels of progress in (what is typically) five years of schooling.

I have argued consistently that three levels of progress is insufficiently challenging for many high attainers. Ofsted makes that assumption too – even celebrates schools that push beyond it – but fails to challenge the source or substance of that advice.
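
To make the conversion logic concrete, here is a minimal sketch of the expected-progress arithmetic as I read it, assuming the conventional one-grade-per-level equivalence implied by the tables (Level 4 plus three levels of progress gives a C, Level 5 a B, and so on); the function and its names are mine.

```python
# Expected-progress logic under (soon to be abolished) National Curriculum levels.
# Assumption: each GCSE grade step above C counts as one further 'level', so from
# KS2 Level 5 a Grade B represents 3 levels, an A 4 levels and an A* 5 levels.

GCSE_GRADES = ["G", "F", "E", "D", "C", "B", "A", "A*"]

def expected_minimum_grade(ks2_level: int, levels_of_progress: int = 3) -> str:
    """Minimum GCSE grade implied by making the given levels of progress from KS2."""
    # Anchor: Level 4 plus 3 levels of progress maps to a Grade C.
    index = GCSE_GRADES.index("C") + (ks2_level - 4) + (levels_of_progress - 3)
    return GCSE_GRADES[min(index, len(GCSE_GRADES) - 1)]

print(expected_minimum_grade(5))      # B  - the 'expected' three levels
print(expected_minimum_grade(5, 5))   # A* - what the strongest Level 5s might target
print(expected_minimum_grade(6))      # A  - but the tables group Level 6 with Level 5,
                                      #      so in practice only a B (two levels) is required
```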

We are supplied with the following pieces of data, all relating to 2012:

  • 65% of ‘high attainers’ in non-selective secondary schools – not according to Ofsted’s definition above, but the narrower one of those achieving  Key Stage 2 Level 5 in both English and maths – did not achieve GCSEs at A/A* in both those subjects. (So this is equivalent to 4 or 5 levels of progress in the two subjects combined.) This group includes over 65,000 students (see pages 4, 6, 8, 12).
  • Within the same population, 27% of students did not achieve GCSEs at B or above in both English and maths (so falling short of the expected 3+ levels of progress). This accounts for just over 27,000 students (see pages 4, 6 and 12).
  • On the basis of this measure, 42% of FSM-eligible students did not achieve GCSEs at B or above in both English and maths, whereas the comparable figure for non-FSM students was 25%, giving a gap between FSM and non-FSM (rather than between FSM and all students) of 17 percentage points. We are not told what the gap was at A*/A, or for the ‘survey population’ as a whole (page 14)
  • Of those attending non-selective state schools who achieved Level 5 in English (only) at Key Stage 2, 62% did not achieve an A* or A grade at GCSE (which would have represented 4 or 5 levels of progress) and 25% did not achieve a GCSE B grade or higher (the expected 3+ levels of progress) (page 12)
  • Of those who achieved Level 5 in maths (only) at Key Stage 2, 53% did not achieve A*/A at GCSE (4 or 5 levels of progress) and 22% did not achieve B or higher (3+ levels of progress) (page 12)
  • We are also given the differentials between boys and girls on several of these measures, but not the percentages for each gender. In English, for A*/A and for B and above, the gap is 11% in favour of girls. In maths, the gap is 6% in favour of girls at A*/A and 5% at B and above. In English and maths combined, the gap is 10% in favour of girls for A*/A and B and above alike (page 15).
  • As for ethnic background, we learn that non-White British students outperformed White British students by 2% in maths and 1% in English and maths together, but the two groups performed equally in English at Grades B and above. The comparable data for Grades A*/A show non-White British outperforming White British by 3% in maths and again 1% in English and maths together, while the two groups again performed equally in English (page 16)

What can we deduce from this? Well, not to labour the obvious, but what is the point of setting out a definition, however exaggeratedly inclusive, only to move to a different definition in the data analysis?

Why bother to spell out a definition based on achievement in English or maths, only to rely so heavily on data relating to achievement in English and maths?

There are also no comparators. We cannot see how the proportion of high attainers making expected progress compares with the proportion of middle and low attainers doing so, so there is no way of knowing whether there is a particular problem at the upper end of the spectrum. We can’t see the comparable pattern in selective schools either.

There is no information about the trend over time – whether the underperformance of high attainers is improving, static or deteriorating compared with previous years – and how that pattern differs from the trend for middle and low attainers.

The same applies to the information about the FSM gap, which is confined solely to English and maths, and solely to Grade B and above, so we can’t see how their performance compares between the two subjects and for the top A*/A grades, even though that data is supplied for boys versus girls and white versus non-white British.

The gender, ethnic and socio-economic data is presented separately so we cannot see how these different factors impact on each other. This despite HMI’s known concern about the underperformance of disadvantaged white boys in particular. It would have been helpful to see that concern linked across to this one.

Overall, the findings do not seem particularly surprising. The large gaps between the percentages of students achieving four and three levels of progress respectively are to be expected, given the orthodoxy that students need only make a minimum of three levels of progress rather than the maximum progress of which they are capable.

The FSM gap of 17 percentage points at Grade B and above is actually substantially lower than the gap at Grade C and above, which stood at 26.2 percentage points in 2011/12. Whether the A*/A gap demonstrates a further widening at the top end remains shrouded in mystery.

Although it is far too soon to have progression data, the report almost entirely ignores the impact of Level 6 on the emerging picture. And it forbears to mention the implications for any future data analysis – including trend analysis – of the decision to dispense with National Curriculum levels entirely with effect from 2016.

Clearly additional data of this kind might have overloaded the main body of the Report, but a data Annex could and should have been appended.

.

Why Ignore the Transition Matrices?

There is a host of information available about the performance of high attaining learners at Key Stage 4 and Key Stage 5 respectively, much of which I drew on for this post back in January 2013.

This applies to all state-funded schools and makes the point about high attainers’ underachievement in spades.

It reveals that, to some extent at least, there is a problem in selective schools too:

‘Not surprisingly (albeit rather oddly), 89.8% of students in selective schools are classified as ‘above Level 4’, whereas the percentage for comprehensive schools is 31.7%. Selective schools do substantially better on all the measures, especially the EBacc where the percentage of ‘above Level 4’ students achieving this benchmark is double the comprehensive school figure (70.7% against 35.0%). More worryingly, 6.6% of these high-attaining pupils in selective schools are not making the expected progress in English and 4.1% are not doing so in maths. In comprehensive school there is even more cause for concern, with 17.7% falling short of three levels of progress in English and 15.3% doing so in maths.’

It is unsurprising that selective schools tend to perform relatively better than comprehensive schools in maximising the achievement of high attainers, because they are specialists in that field.

But, by concentrating exclusively on comprehensive schools, Ofsted gives the false impression that there is no problem in selective schools when there clearly is, albeit not quite so pronounced.

More recently, I have drawn attention to the enormous contribution that can be added to this evidence base by the Key Stage 2 to 4 Transition Matrices available in the Raise Online library.

.

Transition matrices and student numbers: English (top) and maths (bottom)

.

TM English Capture

Transition matrices numbers English capture

TM Maths Capture

Transition matrices maths numbers Capture

.

These have the merit of analysing progress to GCSE on the basis of National Curriculum sub-levels, and illustrate the very different performance of learners who achieve 5C, 5B and 5A respectively.

This means we are able to differentiate within the hugely wide Ofsted sample and begin to see how GCSE outcomes are affected by the strength of learners’ KS2 level 5 performance some five years previously.

The tables above show the percentages for English and maths respectively, for those completing GCSEs in 2012. I have also included the tables giving the pupil numbers in each category.

We can see from the percentages that:

  • Of those achieving 5A in English, 47% go on to achieve an A* in the subject, whereas for 5B the percentage is 20% and for 5C as low as 4%.
  • Similarly, of those achieving 5A in Maths, 50% manage an A*, compared with 20% for those with 5B and only 6% for those with 5C.
  • Of those achieving 5A in English, 40% achieve Grade A, so there is a fairly even split between the top two grades. Some 11% achieve a Grade B and just 1% a Grade C.
  • In maths, 34% of those with 5A at KS2 go on to secure a Grade A, so there is a relatively heavier bias in favour of A* grades. A slightly higher 13% progress to a B and 3% to a Grade C.
  • The matrices show that, when it comes to the overall group of learners achieving Level 5, in English 10% get A*, 31% get A and 36% a B. Meanwhile, in maths, 20% get an A*, 31% an A and 29% a B. This illustrates perfectly the very significant advantage enjoyed by those with a high Level 5 compared with Level 5 as a whole.
  • More worryingly, the progression made by learners who achieve upper Level 4s at Key Stage 2 tends to outstrip that of learners with 5Cs. In English, 70% of those with 5C made 3 levels of progress and 29% made 4 levels of progress. For those with 4A, the comparable percentages were 85% and 41% respectively. For those with 4B they were 70% (so equal to the 5Cs) and 21% respectively.
  • Turning to maths, the percentages of those with Level 5C achieving three and four levels of progress were 67% and 30% respectively, while for those with 4A they were 89% and 39% respectively and for 4B, 76% (so higher) and 19% (lower) respectively.

This suggests that, while there is undeniably an urgent and important issue at the very top, with half or fewer of 5As being translated into A* Grades, the bulk of the problem seems to be at the lower end of Level 5, where there is a conspicuous dip compared with both comparatively higher and comparatively lower attainers.

I realise that there are health warnings attached to the transition matrices, but one can immediately see how this information significantly enriches Ofsted’s relatively simplistic analysis.
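
As an illustration of the kind of interrogation the matrices support, here is a minimal sketch using the 2012 English figures quoted above; the dictionary is hand-keyed from those bullets (the 5A row is derived from the grade shares given) and is not a reproduction of the full RAISEonline tables.

```python
# 2012 KS2->KS4 transition figures for English, keyed from the bullets above:
# the share of each KS2 sub-level making at least 3 and at least 4 levels of progress.
english_progress = {
    "4B": {"3+ levels": 70, "4+ levels": 21},
    "4A": {"3+ levels": 85, "4+ levels": 41},
    "5C": {"3+ levels": 70, "4+ levels": 29},
    "5A": {"3+ levels": 98, "4+ levels": 87},   # derived: 47% A* + 40% A (+ 11% B)
}

for sub_level, shares in english_progress.items():
    print(f"{sub_level}: {shares['3+ levels']}% made 3+ levels, "
          f"{shares['4+ levels']}% made 4+ levels of progress")

# The 5C dip is immediately visible: on the three-levels measure 5C (70%) only
# matches 4B and falls well short of 4A (85%).
```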

.

Data About A Level Achievement and International Comparisons

The data supplied to illustrate progression to A level and international comparisons is comparatively limited.

For A Level:

  • In 2012, 334 (so 20%) of a total of 1,649 non-selective 11-18 schools had no students achieving AAB+ Grades at A Level including at least two of the facilitating subjects.  A footnote tells us that this applies only to 11-18 schools entering at least five pupils at A level. There is nothing about the controversy surrounding the validity of the ‘two facilitating subjects’ proviso (pages 4, 6, 14)
  • Sutton Trust data is quoted from a 2008 publication suggesting that some 60,000 learners who were in the top quintile (20%) of performers in state schools at ages 11, 14 and 16 had not entered higher education by the age of 18; also that those known to have been eligible for FSM were 19% less likely than others to enter higher education by age 19. The most significant explanatory factor was ‘the level and nature of the qualifications’ obtained by those who had been FSM-eligible (page 15).
  • A second Sutton Trust report is referenced showing that, from 2007-2009, students from independent schools were over twice as likely to gain admission to ‘one of the 30 most highly selective universities’ as students from non-selective state schools (48.2% compared with 18%). However, this ‘could not be attributed solely to the schools’ average A level or equivalent results’ since 58% of applicants from the 30 strongest-performing comprehensive schools on this measure were admitted to these universities, compared with 87.1% from the highest-performing independent schools and 74.1% from the highest-performing grammar schools (pages 16-17)
  • The only international comparisons data is drawn from PISA 2009. The Report uses performance against the highest level in the tests of reading, maths and science respectively. It notes that, in reading, England ranked 15th on this measure, though above the OECD average; in maths, England ranked 33rd, somewhat below the OECD average; and in science, England was a strong performer, somewhat above the OECD average (page 17)

Apart from the first item, all this material is now at least four years old.

There is no attempt to link KS2 progression to KS5 achievement, which would have materially strengthened the argument (and which is the focus of one of the Report’s central recommendations).

Nor is there any effort to link the PISA assessment to GCSE data, by explaining the key similarities and differences between the two instruments and exploring what that tells us about particular areas of strength and weakness for high attainers in these subjects.

There is, again, a wealth of pertinent data available, much of it presented in previous posts on this blog.

Given the relatively scant use of data in the Report, and the significant question marks about the manner in which it has been applied to support the argument, it is hardly surprising that much of the criticism levelled at Ofsted can be traced back to this issue.

All the material I have presented on this blog is freely available online and was curated by someone with no statistical expertise.

While I cannot claim my analysis is error-free, it seems to me that Ofsted’s coverage of the issue is impoverished by comparison. Not only is there too little data, there is too little of the right data to exemplify the issues under discussion.

But, as I have already stated, that is not sufficient reason to condemn the entire Report out of hand.

.

Exeter2 by Gifted Phoenix

Exeter2 by Gifted Phoenix

 

The Qualitative Dimension of the Report

The Evidence Base

If you read some of the social media criticism heaped upon ‘The most able students’ you would be forgiven for thinking that the evidence base consisted entirely of a few dodgy statistics.

But Ofsted also drew on:

  • Field visits to 41 non-selective secondary schools across England, undertaken in March 2013. The sample (which is reproduced as an Annex to the Report) was drawn from each of Ofsted’s eight regions and included schools of different sizes and ‘type’ and ‘different geographical contexts’. Twenty-seven were 11-18 schools, two are described as 11-19 schools, 11 were 11-16 schools and one admitted pupils at 14. Eighteen were academy converters. Inspectors spent a day in each school, discussing issues with school leaders, staff and pupils (asking similar questions to check sources against each other) and they ‘investigated analyses of the school’s [sic] current data’. We know that:

‘Nearly all of the schools visited had a broadly average intake in terms of their students’ prior attainment at the end of Key Stage 2, although this varied from year group to year group.’

Three selective schools were also visited ‘to provide comparison’ but – rather strangely – that comparative evidence was not used in the Report.

  • A sample of 2,327 lesson observation forms collected from Section 5 inspections of a second sample of 109 non-selective secondary schools undertaken in academic year 2012/13. We are not told anything about the selection of this sample, so we have no idea how representative it was.
  • A survey of 93 responses made by parents and carers to a questionnaire Ofsted placed on the website of the National Association for Able Children in Education (NACE). Ofsted also ‘sought the views of some key external organisations and individuals’ but these are not named. I have been able to identify just one organisation and one individual who were approached, which perhaps betrays a rather thin sample.

I have no great problem with the sample of schools selected for the survey. Some have suggested that 41 is too few. It falls short of the 50 mentioned in HMCI’s pre-publicity but it is enough, especially since Ofsted’s last Report in December 2009 drew on evidence from just 26 primary and secondary schools.

The second sample of lesson observations is more suspect, in that no information is supplied about how it was drawn. So it is entirely possible that it included all observations from those schools whose inspections were critical of provision for high attainers, or that all the schools were rated as underperforming overall, or against one of Ofsted’s key measures. There is a sin of omission here.

The parental survey is very small and, since it was filtered through a single organisation that focuses predominantly on teacher support, is likely to have generated a biased sample. The failure to engage a proper cross-section of organisations and individuals is regrettable: in these circumstances one should either consult many or none at all.

 .

Survey Questions

Ofsted is comparatively generous with information about its Survey instrument.

There were two fundamental questions, each supported by a handful of supplementary questions:

‘Are the most able students in non-selective state secondary schools achieving as well as they should?’ (with ‘most able’ defined as set out above). This was supported by four supplementary questions:

  • Are comprehensive schools challenging bright students in the way that the independent sector and selective system do?
  • Do schools track progression effectively enough? Do they know how their most able students are doing? What enrichment programme is offered to the most able students and what is its impact?
  • What is the effect of mixed ability classes on the most able students?
  • What is the impact of early entry at GCSE on the most able students?

The second fundamental question was: ‘Why is there such disparity in admissions to the most prestigious universities between a small number of independent and selective schools and the great majority of state-maintained non-selective schools and academies?’ This was supported by three supplementary questions:

  • What is the quality of careers advice and its impact on A level students, particularly in terms of their successful application to top universities? Are students receiving good advice and support on how to complete their UCAS forms/personal statements?
  • Are the most able students from disadvantaged backgrounds as likely as the most able students from more affluent families to progress to top universities, and if not why?
  • What are successful state schools doing to increase application success rates and what lessons can be learnt?

Evidence from the 41 non-selective schools was collected under six broad themes:

  • ‘the leadership of the school
  • the achievement of the most able students throughout the school
  • the transfer and transition of these students from their primary schools and their induction into secondary school
  • the quality of teaching, learning and assessment of the most able students
  • the curriculum and extension activities offered to the most able student
  • the support and guidance provided for the most able students, particularly when they were choosing subjects and preparing for university.’

But  the survey also ‘focused on five key elements’ (page 32) which are virtually identical to the last five themes above.

.

Analysis of Key Findings

 

Top Level Conclusions

Before engaging in detail with the qualitative analysis from these sources, it is worth pausing to highlight two significant quantitative findings which are far more telling than those generated by the data analysis foregrounded in the Report.

Had I the good fortune to have reviewed the Report’s key findings prior to publication, I would have urged far greater prominence for:

  • ‘The 2,327 lesson observation evidence forms… showed that the most able students in only a fifth of these lessons were supported well or better.’
  • ‘In around 40% of the schools visited in the survey, the most able students were not making the progress of which they were capable. In a few of the schools visited, teachers did not even know who the most able students were.’

So, in a nutshell, one source of evidence suggests that, in 80% of lessons, support for the most able students is either inadequate or requires improvement.

Another source suggests that, in 40% of schools, the most able students are underachieving in terms of progress while, in a few schools, their identity is unknown.

And these findings apply not to a narrow group of the very highest attaining learners but, on the basis of Ofsted’s own definition, to over 50% of pupils!

Subject to the methodological concerns above, the samples appear sufficiently robust to be extrapolated to all English secondary schools – or the non-selective majority at least.

We do not need to apportion blame, or make schools feel that this is entirely their fault. But this is scandalous – indeed so problematic that it surely requires a concerted national effort to tackle it.

We will consider below whether the recommendations set out in the Report match that description, but first we need to engage with some of the qualitative detail.

The analysis below looks in turn at each of the six themes, in the order that they appear in the main body of the Report.

.

Theme 1 – Achievement of the Most Able Students

 Key finding: ‘The most able students in non-selective secondary schools are not achieving as well as they should. In many schools, expectations of what the most able students should achieve are too low.’

 Additional points:

  • [Too] many of the students in the problematic 40% of surveyed schools ‘failed to attain the highest levels at GCSE and A level’.
  • Academic progress in KS3 required improvement in 17 of the 41 schools. Data was neither accurate nor robust in seven of the 41. Progress differed widely by subject.
  • At KS4, the most able were making less progress than other students in 19 of the 41 schools.
  • At KS5, the most able were making ‘less than expected progress’ in one or more subjects at 17 of the 41 schools.

 .

Theme 2 – Leadership and Management

Key Finding: ‘Leaders in our secondary schools have not done enough to create a culture of scholastic excellence, where the highest achievement in academic work is recognised as vitally important. Schools do not routinely give the same attention to the most able as they do to low-attaining students or those who struggle at school.’

Additional points:

  • Nearly all school leaders claimed to be ambitious for their most able students, but this was not realised in practice in over 40% of the sample.
  • In less effective schools initiatives were usually new or rudimentary and had not been evaluated.
  • Students were taught mainly in mixed ability groups in about a third of the schools visited. Setting was typically restricted to core subjects and often introduced for English and science relatively late in KS3.
  • This had no detrimental effect in ‘the very best schools’ but, in the less effective, work was typically pitched to average attainers.
  • Seven schools had revised their policy on early GCSE entry because of a negative impact on the number of the most able achieving top grades.
  • Leaders in the best schools showed high aspirations for their most able students, providing high-quality teaching and work matched to their needs. Results were well above average and high proportions achieved A*/A grades at GCSE and A level.
  • The best leaders ensure their high aspirations are understood throughout the school community, set high expectations embodied in stretching targets, recruit strong staff and deploy them as specialists and create ‘a dynamic, innovative learning environment’.

.

Theme 3 – Transfer and Transition

Key Finding: ‘Transition arrangements from primary to secondary school are not effective enough to ensure that students maintain their academic momentum into Year 7. Information is not used carefully so that teachers can plan to meet the most able students’ needs in all lessons from the beginning of their secondary school career.’

Additional points:

  • The quality of transition is much too variable. Arrangements were weak in over one quarter of schools visited. Work was repeated in KS3 or was insufficiently challenging. Opportunities were missed to extend and consolidate previous learning.
  • Simple approaches were most effective, easier to implement in schools with few primary feeders or long-established cluster arrangements.
  • In the best examples secondary schools supported the most able before transfer, through specialist teaching and enrichment/extension activities.
  • In many schools activities were typically generic rather than targeted at the most able and many leaders didn’t know how effective they were for this group.
  • In over a quarter of schools the most able ‘did not get off to a good start’ in Year 7 because expectations were too low, work was insufficiently demanding and pupils were under-challenged.
  • Overall inspectors found serious weaknesses in this practice.
  • Effective practice includes: pre-transfer liaison with primary teachers and careful discussion about the most able; gathering a wide range of data to inform setting or class groups; identifying the most able early and implementing support for them to maintain their momentum; and fully evaluating pre-transfer activities and adapting them in the light of that.

.

Exeter3 by Gifted Phoenix

Exeter3 by Gifted Phoenix

 .

Theme 4 – The Quality of Teaching, Learning and Assessment

Key Findings:

‘Teaching is insufficiently focused on the most able at KS3. In over two-fifths of the schools visited for the survey, students did not make the progress that they should, or that they were capable of, between the ages of 11 and 14. Students said that too much work was repetitive and undemanding in KS3. As a result, their progress faltered and their interest in school waned.

Many students became used to performing at a lower level than they are capable of. Parents or carers and teachers accepted this too readily. Students did not do the hard work and develop the resilience needed to perform at a higher level because more challenging tasks were not regularly demanded of them. The work was pitched at the middle and did not extend the most able. School leaders did not evaluate how well mixed-ability group teaching was challenging the most able students.’

Additional points:

  • The reasons for slow progress varied between schools and subjects but included: failure to recognise and challenge the most able; variability in approaches across subjects and year groups; inconsistent application of school policy; and lack of focus by senior and middle leaders.
  • Weaker provision demonstrated: insufficient tracking of the most able, inadequate rapid intervention strategies, insufficiently differentiated homework, failure to apply Pupil Premium funding and little evaluation of the impact of teaching and support.
  • In a few schools the organisation of classes inhibited progress, as evidenced by limited knowledge of the effectiveness of differentiation in mixed ability settings and lack of challenge, particularly in KS3.
  • Eight schools had moved recently to grouping by ability, particularly in core subjects. Others indicated they were moving towards setting, streaming or banding most subjects. Schools’ data showed this beginning to have a positive impact on outcomes.

.

Theme 5 – Curriculum and Extension Activities

Key Findings:

‘The curriculum and the quality of homework required improvement. The curriculum in KS3 and early entry to GCSE examination are among the key weaknesses found by inspectors. Homework and the programme of extension activities for the most able students, where they existed, were not checked routinely for their impact or quality. Students said that too much homework was insufficiently challenging; it failed to interest them, extend their thinking or develop their skills.

Inequalities between different groups of the most able students are not being tackled satisfactorily. The attainment of the most able students who are eligible for FSM, especially the most able boys, lags behind that of other groups. Few of the schools visited used the Pupil Premium funding to support the most able students from the poorest backgrounds.

Assessment, tracking and targeting are not used sufficiently well in many schools. Some of the schools visited paid scant attention to the progress of their most able students.’

Additional points:

  • In over a quarter of schools visited, aspects of the curriculum, including homework, required improvement. In two schools the curriculum failed to meet the needs of the most able.
  • In one in seven schools, leaders had made significant changes recently, including more focus on academic subjects and more setting.
  • But schools did not always listen to feedback from their most able students. Many did not ask students how well the school was meeting their needs or how to improve further.
  • In weaker schools students were rarely given extension work. Sixth form students reported insufficient opportunities to think reflectively and too few suggestions for wider, independent reading.
  • Many in less effective schools felt homework could be more challenging. Few were set wider research or extension tasks.
  • While some leaders said extra challenge was incorporated in homework, many students disagreed. Few school leaders were aware of the homework provided to these students. Many schools had limited strategies for auditing and evaluating its quality.
  • Most school leaders said a wide range of extension tasks, extra-curricular and enrichment activities was provided for the most able, but these were usually for all students. Targeted activities, when undertaken, were rarely evaluated.
  • Research suggests it is important to provide access to such activities for the most able students where parents are not doing so. Schools used Pupil Premium for this in only a few instances.
  • The Premium was ‘generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds’.
  • Strong, effective practice was exemplified by a curriculum well-matched to the needs of most able students, a good range and quality of extra-curricular activity, effective use of the Pupil Premium to enrich students’ curriculum and educational experience and motivating and engaging homework, tailored to students’ needs, designed to develop creativity and independence.
  • In over a third of schools visited, tracking of the most able was ‘not secure, routine or robust’. Intervention was often too slow.
  • In weaker schools, leaders were focused mainly on the C/D borderline; stronger schools focused on A*/A grades too, believing their pupils could do better than ‘the B grade that is implied by the expected progress measure’.
  • Some schools used assessment systems inconsistently, especially in some KS3 foundation subjects where there was insufficient or inaccurate data. In one in five schools, targets for the most able ‘lacked precision and challenge’.
  • In a fifth of schools, senior leaders had introduced improved monitoring systems to hold staff to account, but implementation was often at a very early stage. Only in the best schools were such systems well established.
  • The most effective included lesson observation, work scrutiny, data analysis and reviews of teacher planning. In the better schools students knew exactly what they needed to do to attain the next level/grade and received regular feedback on progress.
  • The most successful schools had in place a wide range of strategies including: ensuring staff had detailed knowledge of the most able, their strengths and interests; through comprehensive assessment, providing challenging programmes and high quality support that met students’ needs;  and rigorous tracking by year, department and key stage combined with swift intervention where needed.
  • Many leaders had not introduced professional development focused on the most able students. Their needs had not been tackled by staff in over one fifth of schools visited, so teachers had not developed the required skills to meet their needs, or up-to-date knowledge of the Year 6 curriculum and assessment arrangements. Stronger schools were learning with and from their peers and had formed links with a range of external agencies.

.

Theme 6 – Support and Guidance for University Entry

Key Findings:

‘Too few of the schools worked with families to support them in overcoming the cultural and financial obstacles that stood in the way of the most able students attending university, particularly universities away from the immediate local area. Schools did not provide much information about the various benefits of attending different universities or help the most able students to understand more about the financial support available.

Most of the 11-16 schools visited were insufficiently focused on university entrance. These schools did not provide students with sufficiently detailed advice and guidance on all the post-16 options available.

Schools’ expertise in and knowledge about how to apply to the most prestigious universities was not always current and relevant. Insufficient support and guidance were provided to those most able students whose family members had not attended university.’

Additional points:

  • Support and guidance varied in quality, accuracy and depth. Around half of schools visited ‘accepted any university as an option’. Almost a quarter had much to do to convince students and their families of the benefits of higher education, and began doing so too late.
  • Data provided by 26 of the 29 11-18 schools showed just 16 students went to Oxbridge in 2011, one eligible for FSM, but almost half came from just two of the schools. Nineteen had no students accepted at Oxbridge. The 2012 figures showed some improvement with 26 admitted to Oxbridge from 28 schools, three of them FSM-eligible.
  • In 2011, 293 students went to Russell Group universities, but only six were FSM eligible. By 2012 this had increased to 352, including 30 eligible for FSM, but over a quarter of the 352 came from just two schools.
  • Factors inhibiting application to prestigious universities included pressure to stay in the locality, cost (including fees), aversion to debt and low expectations. Almost half of the schools visited tackled this through partnership with local universities.
  • Schools did not always provide early or effective careers advice or information about the costs and benefits of attending university.
  • Some schools showed a lack of up-to-date intelligence about universities and their entrance requirements, but one third of those visited provided high quality support and guidance.
  • Some schools regarded going to any university as the indicator of success, disagreeing that it was appropriate to push students towards prestigious universities, rather than the ‘right’ institution for the student.
  • Most of the 11-16 schools visited were insufficiently focused on university entrance. They did not provide sufficiently detailed advice on post-16 options and did not track students’ destinations effectively, either post-16 or post-18.
  • The best schools: provided early on a planned programme to raise students’ awareness of university education; began engaging with students and parents about this as soon as they entered the school; provided support and guidance about subject choices, entry requirements and course content; supported UCAS applications; enabled students to visit a range of universities; and used alumni as role models.

.

Exeter4 by Gifted Phoenix

Exeter4 by Gifted Phoenix

 

Ofsted’s Recommendations

There are two sets of recommendations in the Report, each with an associated commentary about the key constituents of good and bad practice. The first is in HMCI’s Foreword; the second in the main body of the Report.

.

HMCI’s Version

This leads with material from the data analysis, rather than some of the more convincing data from the survey, or at least a judicious blend of both sources.

He rightly describes the outcomes as unacceptable and inconsistent with the principle of comprehensive education, though his justification for omitting selective schools from the analysis is rather less convincing, especially since he is focused in part on narrowing the gap between the two as far as admission to prestigious universities is concerned.

Having pointed up deficiencies at whole school level and in lessons he argues that:

‘The term ‘special needs’ should be as relevant to the most able as it is to those who require support for their learning difficulties’

This is rather out of left field and is not repeated in the main body or the official recommendations. There are pros and cons to such a route – and it would anyway be entirely inappropriate for a group comprising over 50% of the secondary population.

HMCI poses ‘three key challenges’:

‘First, we need to make sure that our most able students do as well academically as those of our main economic competitors. This means aiming for A* and A grades and not being satisfied with less. Not enough has changed since 2009, when the PISA tests found that England’s teenagers were just over half as likely as those from other developed nations to reach the highest levels in mathematics in international tests.

The second challenge is to ensure, from early on, that students know what opportunities are open to them and develop the confidence to make the most of these. They need tutoring, guidance and encouragement, as well as a chance to meet other young people who have embraced higher education. In this respect, independent schools as well as universities have an important role to play in supporting state schools.

The third challenge is to ensure that all schools help students and families overcome cultural barriers to attending higher education. Many of our most able students come from homes where no parent or close relative has either experienced, or expects, progression to university. Schools, therefore, need to engage more effectively with the parents or carers of these students to tackle this challenge.’

This despite the fact that comparison with international competitors is almost entirely lacking from the Report, save for one brief section on PISA data.

The role of independent schools is also underplayed, while the role of universities is seen very much from the schools’ perspective – there is no effort to link together the ‘fair access’ and ‘most able’ agendas in any meaningful fashion.

Parental engagement is also arguably under-emphasised or, at least, confined almost exclusively to the issue of progression.

.

Ofsted’s Version

The ‘official’ text provides a standard overarching bullet point profile of poor and strong provision respectively.

  • Poor provision is characterised by: ‘fragile’ primary/secondary transfer; placement in groups where teaching is not challenging; irregular progress checks; a focus on D/C borderline students at the expense of the more able; and failure to prepare students well for A levels.
  • Strong provision features: leadership determined to improve standards for all students; high expectations of the most able amongst students, families and teachers; effective transition to sustain the momentum of the most able; early identification to inform tailoring of teaching and the curriculum; curricular flexibility to permit challenge and extension; grouping to support stretch from the start of secondary school;  expert teaching, formative assessment and purposeful homework; effective training and capacity for teachers to learn from each other; close monitoring of progress to inform rapid intervention where necessary; and effective support for application to prestigious universities.

A series of 13 recommendations is provided, alongside three Ofsted commitments. Ten of the 13 are aimed at schools and three at central Government.

I have set out the recommendations in the table below, alongside those from the previous Report, published in 2009.

 

| 2009 Report | 2013 Report |
| --- | --- |
| Central Government | Central Government |
| Ensure planned catalogue of learning and professional development opportunities meets the needs of parents, schools and LAs | DfE to ensure parents receive annual report recording whether students are on track to achieve as well as they should in national tests and exams |
| Ensure LAs hold schools more rigorously to account for the impact of their G&T provision | DfE to develop progress measures from KS2 to KS4 and KS5 |
| | DfE to promote new destination data showing progression to (Russell Group) universities |
| | Ofsted will focus inspections more closely on teaching and progress of most able, their curriculum and the information, advice and guidance provided to them |
| | Ofsted will consider in more detail during inspection how well Pupil Premium is used to support disadvantaged most able |
| | Ofsted will report inspection findings about this group more clearly in school, sixth form and college reports |
| Local Authorities | Local Authorities |
| Hold schools more rigorously to account for the impact of their G&T provision | |
| Encourage best practice by sharing with schools what works well and how to access appropriate resources and training | |
| Help schools produce clearer indicators of achievement and progress at different ages | |
| Schools | Schools |
| Match teaching to pupils’ individual needs | Develop culture and ethos so needs of most able are championed by school leaders |
| Listen to pupil feedback and act on it | Help most able to leave school with best qualifications by developing skills, confidence and attitudes needed to succeed at the best universities |
| Inform parents and engage them more constructively | Improve primary-secondary transfer so all Year 7 teachers know which students achieved highly and what aspects of the curriculum they studied in Year 6, and use this to inform KS3 teaching |
| Use funding to improve provision through collaboration | Ensure work remains challenging throughout KS3 so most able make rapid progress |
| Ensure lead staff have strategic clout | Ensure leaders evaluate mixed ability teaching so most able are sufficiently challenged and make good progress |
| Ensure rigorous audit and evaluation processes | Evaluate homework to ensure it is sufficiently challenging |
| | Give parents better and more frequent information about what their children should achieve and raise expectations where necessary |
| | Work more closely with families, especially first generation HE applicants and FSM-eligible, to overcome cultural and financial obstacles to HE application |
| | Develop more knowledge and expertise to support applications to the most prestigious universities |
| | Publish more widely the university destinations of their students |

TABLE 1: COMPARING OFSTED RECOMMENDATIONS IN 2009 AND 2013

The comparison serves to illustrate the degree of crossover between the two Reports – and to what extent the issues raised in the former remain pertinent four years on.

Several items in the left-hand column are still outstanding and are not addressed in the latest Report. There is nothing about providing support for schools from the centre; and nothing whatsoever about the role of the ‘middle tier’, however that is composed. Ofsted’s new Report might have been enriched by some cross-reference to its predecessor.

The three recommendations directed at the centre are relatively limited in scope – fundamentally restricted to elements of the status quo and probably demanding negligible extra work or resource:

  • The reference to an annual report to parents could arguably be satisfied by the existing requirements, which are encapsulated in secondary legislation.
  • It is not clear whether promoting the new destination measures requires anything more than their continuing publication – the 2013 version is scheduled for release this very week.
  • The reference to development of progress measures may be slightly more significant but probably reflects work already in progress. The consultation document on Secondary School Accountability proposed a progress measure based on a new ‘APS8’ indicator, calculated through a Value Added method and using end KS2 results in English and maths as a baseline:

‘It will take the progress each pupil makes between Key Stage 2 and Key Stage 4 and compare that with the progress that we expect to be made by pupils nationally who had the same level of attainment at Key Stage 2 (calculated by combining results at end of Key Stage 2 in English and mathematics).’

However, this applies only to KS4, not KS5, and we are still waiting to discover how the KS2 baseline will be graded from 2016, when National Curriculum levels disappear.
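
For what it is worth, the value-added logic described in the consultation passage can be sketched as follows. This is my own illustration of the general method, not the DfE’s published methodology: the baseline groups and point scores are invented for the example.

```python
# Sketch of a value-added progress measure of the kind described above: compare each
# pupil's KS4 points with the average KS4 points achieved nationally by pupils with
# the same KS2 (English and maths) baseline.
from collections import defaultdict
from statistics import mean

# (ks2_baseline, ks4_points) pairs -- invented, illustrative data only
pupils = [
    ("4A", 46), ("4A", 52), ("4A", 40),
    ("5C", 50), ("5C", 58), ("5C", 44),
    ("5A", 64), ("5A", 70), ("5A", 58),
]

# 1. Estimate the expected outcome for each prior-attainment group.
by_baseline = defaultdict(list)
for baseline, points in pupils:
    by_baseline[baseline].append(points)
expected = {baseline: mean(points) for baseline, points in by_baseline.items()}

# 2. A pupil's value-added score is the gap between actual and expected points.
def value_added(baseline: str, points: float) -> float:
    return points - expected[baseline]

print(value_added("5A", 70))   # 6: well above pupils with the same starting point
print(value_added("5C", 44))   # about -6.7: behind similar starters, despite a respectable raw score
```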

This throws attention back on the Secretary of State’s June 2012 announcement, so far unfulfilled by any public consultation:

‘In terms of statutory assessment, however, I believe that it is critical that we both recognise the achievements of all pupils, and provide for a focus on progress. Some form of grading of pupil attainment in mathematics, science and English will therefore be required, so that we can recognise and reward the highest achievers as well as identifying those that are falling below national expectations. We will consider further the details of how this will work.’

.

The Balance Between Challenge and Support

It is hard to escape the conclusion that Ofsted believe inter-school collaboration, the third sector and the market can together provide all the support that schools need (while the centre’s role is confined to providing commensurate challenge through a somewhat stiffened accountability regime).

After four years of school-driven gifted education, I am not entirely sure I share their confidence that schools and the third sector can rise collectively to that challenge.

They seem relatively hamstrung at present by insufficient central investment in capacity-building and an unwillingness on the part of key players to work together collaboratively to update existing guidance and provide support. The infrastructure is limited and fragmented and leadership is lacking.

As I see it, there are two immediate priorities:

  • To provide and maintain the catalogue of learning opportunities and professional support mentioned in Ofsted’s 2009 report; and
  • To update and disseminate national guidance on what constitutes effective whole school gifted and talented education.

The latter should in my view be built around an updated version of the Quality Standards for gifted education, last refreshed in 2010. It should be adopted once more as the single authoritative statement of effective practice which more sophisticated tools – some, such as the Challenge Award, with fairly hefty price tags attached – can adapt and apply as necessary.

The Table appended to this post maps the main findings in both the 2009 and 2013 Ofsted Reports against the Standards. I have also inserted a cross in those sections of the Standards which are addressed by the main text of the more recent Report.

One can see from this how relevant the Standards remain to discussion of what constitutes effective whole school practice.

But one can also identify one or two significant gaps in Ofsted’s coverage, including:

  • identification – and the issues it raises about the relationship between ability and attainment
  • the critical importance of a coherent, thorough, living policy document incorporating an annually updated action plan for improvement
  • the relevance of new technology (such as social media)
  • the significance of support for affective issues, including bullying, and
  • the allocation of sufficient resources – human and financial –  to undertake the work.

.

Exeter5 by Gifted Phoenix

Exeter5 by Gifted Phoenix

 

Reaction to the Report

I will not trouble to reproduce some of the more vituperative comment from certain sources, since I strongly suspect much of it to be inspired by personal hostility to HMCI and to gifted education alike.

  • To date there has been no formal written response from the Government, although David Laws recorded one or two interviews, such as this, which simply reflect existing reforms to accountability and qualifications. At the time of writing, the DfE page on Academically More Able Pupils has not been updated to reflect the Report.
  • The Opposition criticised the Government for having ‘no plan for gifted and talented children’ but did not offer any specific plan of their own.
  • The Sutton Trust called the Report ‘A wake-up call to Ministers’ adding:

‘Schools must improve their provision, as Ofsted recommends. But the Government should play its part too by providing funding to trial the most effective ways to enable our brightest young people to fulfil their potential. Enabling able students to fulfil their potential goes right to the heart of social mobility, basic fairness and economic efficiency.’

Contrary to my expectations, there was no announcement arising from the call for proposals the Trust itself issued back in July 2012 (see word attachment at bottom). A subsequent blog post called for:

‘A voluntary scheme which gives head teachers an incentive – perhaps through a top-up to their pupil premium or some other matched-funding provided centrally – to engage with evidence based programmes which have been shown to have an impact on the achievement of the most able students.’

‘We warned the Government in 2010 when it scrapped the gifted and talented programme that this would be the result. Many schools are doing a fantastic job in supporting these children. However we know from experience that busy schools will often only have time to focus on the latest priorities. The needs of the most able children have fallen to the bottom of the political and social agenda and it’s time to put it right to the top again.’

‘It is imperative that Ofsted, schools and organisations such as NACE work in partnership to examine in detail the issues surrounding this report. We need to disseminate more effectively what works. There are schools that are outstanding in how they provide for the brightest students. However there has not been enough rigorous research into this.’

  • Within the wider blogosphere, Geoff Barton was first out of the traps, criticising Ofsted for lack of rigour, interference in matters properly left to schools, ‘fatuous comparisons’ and ‘easy soundbites’.
  • The same day Tom Bennett was much more supportive of the Report and dispensed some commonsense advice based firmly on his experience as a G&T co-ordinator.
  • Then Learning Spy misunderstood Tom’s suggestions about identification, asking how ‘corralling the boffins and treating them differently’ serves the aim of high expectations for all. He far preferred Headguruteacher’s advocacy for a ‘teach to the top’ curriculum, which is eminently sensible.
  • Accordingly, Headguruteacher contributed The Anatomy of High Expectations which drew out the value of the Report for self-evaluation purposes (so not too different to my call for a revised IQS).
  • Finally Chris Husbands offered a contribution on the IoE Blog which also linked Ofsted’s Report to the abolition of National Curriculum levels, reminding us of some of the original design features built in by TGAT but never realised in practice.

Apologies to any I have missed!

As for yours truly, I included the reactions of all the main teachers’ associations in the collection of Tweets I posted on the day of publication.

I published Driving Gifted Education Forward, a single page proposal for the kind of collaborative mechanism that could bring about system-wide improvement, built on school-to-school collaboration. It proposes a network of Learning Schools, complementing Teaching Schools, established as centres of excellence with a determinedly outward-looking focus.

And I produced a short piece about transition matrices which I have partly integrated into this post.

Having all but completed this extended analysis, have I changed the initial views I Tweeted on the day of publication?

.

.

Well, not really. My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

.

Curate’s Egg (‘True Humility’)

Bishop: ‘I’m afraid you’ve got a bad egg Mr Jones’, Curate: ‘Oh, no, my Lord, I assure you that parts of it are excellent!’

.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.

The commitments to toughen up the inspection regime are welcome but we need more explicit details of exactly how this will be managed, including any amendments to the framework for inspection and supporting guidance. Such adjustments must be prominent and permanent rather than tacked on as an afterthought.

We – all of us with an interest – need to fillet the key messages from the text and integrate them into a succinct piece of guidance as I have suggested, but carefully so that it applies to every setting and has built-in progression for even the best-performing schools. That’s what the Quality Standards did – and why they are still needed. Perhaps Ofsted should lead the revision exercise and incorporate them wholesale into the inspection framework.

As we draw a veil over the second of these three ‘Summer of Love’ publications, what are the immediate prospects for a brighter future for English gifted education?

Well, hardly incandescent sunshine, but rather more promising than before. Ofsted’s Report isn’t quite the ‘landmark’ HMCI Wilshaw promised and it won’t be the game changer some of us had hoped for, but it’s better than a poke in the eye with the proverbial blunt stick.

Yet the sticking point remains the capacity of schools, organisations and individuals to set aside their differences and secure the necessary collateral to work collectively to bring about the improvements called for in the Report.

Without such commitment too many schools will fail to change their ways.

.

GP

June 2013

.

.

ANNEX: MAPPING KEY FINDINGS FROM THE 2009 AND 2013 REPORTS AGAINST THE IQS

IQS Element | IQS Sub-element | Ofsted 2009 | Ofsted 2013
Standards and progress | Attainment levels high and progress strong | Schools need more support and advice about standards and expectations | Most able aren’t achieving as well as they should. Expectations are too low. 65% who achieved KS2 L5 in English and maths failed to attain GCSE A*/A grades. Teaching is insufficiently focused on the most able at KS3. Inequalities between different groups aren’t being tackled satisfactorily
| SMART targets set for other outcomes | | x
Effective classroom provision | Effective pedagogical strategies | Pupil experienced inconsistent level of challenge | x
| Differentiated lessons | | x
| Effective application of new technologies | |
Identification | Effective identification strategies | | x
| Register is maintained | |
| Population is broadly representative of intake | |
Assessment | Data informs planning and progression | | Assessment, tracking and targeting not used sufficiently well in many schools
| Effective target-setting and feedback | | x
| Strong peer and self-assessment | |
Transfer and transition | Effective information transfer between classes, years and institutions | | Transition doesn’t ensure students maintain academic momentum into Year 7
Enabling curriculum entitlement and choice | Curriculum matched to learners’ needs | Pupils’ views not reflected in curriculum planning | The KS3 curriculum is a key weakness, as is early GCSE entry
| Choice and accessibility to flexible pathways | |
Leadership | Effective support by SLT, governors and staff | Insufficient commitment in poorer performing schools | School leaders haven’t done enough to create a culture of scholastic excellence. Schools don’t routinely give the same attention to most able as low-attaining or struggling students.
Monitoring and evaluation | Performance regularly reviewed against challenging targets | Little evaluation of progression by different groups | x
| Evaluation of provision for learners to inform development | | x
Policy | Policy is integral to school planning, reflects best practice and is reviewed regularly | Many policies generic versions from other schools or the LA; too much inconsistency and incoherence between subjects |
School ethos and pastoral care | Setting high expectations and celebrating achievement | | Many students become used to performing at a lower level than they are capable of. Parents and teachers accept this too readily.
| Support for underachievers and socio-emotional needs | |
| Support for bullying and academic pressure/opportunities to benefit the wider community | |
Staff development | Effective induction and professional development | | x
| Professional development for managers and whole staff | | x
Resources | Appropriate budget and resources applied effectively | |
Engaging with the community, families and beyond | Parents informed, involved and engaged | Less than full parental engagement | Too few schools supporting families in overcoming cultural and financial obstacles to attending university
| Effective networking and collaboration with other schools and organisations | Schools need more support to source best resources and training. Limited collaboration in some schools; little local scrutiny/accountability | Most 11-16 schools insufficiently focused on university entrance. Schools’ expertise and knowledge of prestigious universities not always current and relevant
Learning beyond the classroom | Participation in a coherent programme of out-of-hours learning | Link with school provision not always clear; limited evaluation of impact | Homework and extension activities were not checked routinely for impact and quality

‘Unlocking Emergent Talent’

.

This post reviews ‘Unlocking Emergent Talent’, a recent publication about support for low income high ability students by the National Association for Gifted Children (NAGC) in the United States, and considers its relevance to other national settings, especially England.

summer of love 1967 by 0 fairy 0

summer of love 1967 by 0 fairy 0

Although not formally part of the ‘Summer of Love’ series, this is linked to those posts. It offers a useful comparator for an upcoming report on supporting high-achieving disadvantaged learners towards higher education, the third of the trio of publications that serve as staging-posts in the sequence.

It also offers some basis for judging whether the wider narrative devotes sufficient attention to the equity dimension of gifted education. My Gifted Phoenix Manifesto asserts that it is essential to maintain equity-driven gap-narrowing in judicious balance with excellence-driven efforts to raise standards for all gifted learners regardless of background.

I am particularly interested in the implications for the design of suitable policy interventions, but also in the application in England of the Pupil Premium: additional funding, allocated according to the number of disadvantaged pupils in a school, which schools are expected to use to reduce the attainment gap between those pupils and their peers.

The key issue is whether or not the Premium is being utilised effectively to tackle excellence gaps between high-attaining learners from disadvantaged and more advantaged backgrounds – and the prospects for further improvement in that quarter, should it be needed.

The NAGC report does not help in this respect, but I have taken the liberty of introducing additional material relevant to the topic, because it is so pivotal to the equity strand of the emerging ‘Summer of Love’ narrative. Put crudely, understanding what constitutes an effective intervention is of limited value if there is no resource or incentive to implement it.

While ‘Unlocking Emergent Talent’ remains the centrepiece of the post, I have also factored in other recent and relevant material from a variety of US and English sources, especially where it seems to me that the argument in NAGC’s publication is guilty of elision, or needs tempering to enhance its relevance to English settings.

.

The Summit

‘Unlocking Emergent Talent’ made its appearance in November 2012, the product of a two-day National Summit on Low-Income High Ability Learners which took place on 30-31 May, with support from the Jack Kent Cooke Foundation.

The NAGC Website retains a page dedicated to the Summit including biographies of many of the participants and a multitude of background reading. The supporting resources include a list of Summit Presenter Recommended Readings and an Annotated Bibliography. Other useful contributions have been linked into the text below.

According to the Agenda:

  • The event began with an overview and expectation-setting session led by Paula Olszewski-Kubilius, Director of CTD at Northwestern University and current NAGC President;
  • There was a presentation on The Effects of Poverty on Educational Opportunity by Josh Wyner, Executive Director of the College Excellence Program at The Aspen Institute. Three respondents subsequently shared their thoughts on how poverty-related issues present amongst different US populations.
  • Paula Olszewski-Kubilius introduced the themes and rationale for an ensuing discussion focused respectively on school programmes and supplemental programmes that ‘work with promising learners from poverty’. Brief composite summaries of the featured school and supplemental programmes are provided. (Further links to each programme are supplied below.)
  • Following small group discussion and a first stab at delineating an emerging research agenda, the next session focused on ‘Building a Psychological Entity that Supports Commitment to High Achievement/Psycho-social Skills and Issues with Promising Learners from Poverty’.  This featured Angela Duckworth from the University of Pennsylvania (whose presentation is here) and Frank Worrell, from the University of California.
  • The second day kicked off with a session on ‘Research and Policy: Next Steps for Action/Reinventing the System for High Ability Learners from Poverty’ with inputs from Chester Finn, President of the Thomas B Fordham Institute and Jonathan Plucker, then Director of the Center for Evaluation and Education Policy at Indiana University (whose presentation is here).
  • Finally ‘Overlooked Gems Then and Now: What’s Changed, What’s the Same’ – a comparison between the outcomes of an earlier NAGC venture into this territory and the current effort – was led by Joyce VanTassel-Baska from the College of William and Mary Center for Gifted Education.

The resulting publication ‘Unlocking Emergent Talent’ divides participants in a slightly different way, citing Olszewski-Kubilius, Duckworth, Finn, Plucker, Worrell and Wyner respectively as ‘Featured Presenters’, followed by 18 ‘Moderators, Panelists and Respondents’ and a further 35 ‘Participants’.

Of these 59, all are resident in the United States. Almost half are academics employed in US universities; a further 15 or so work in district, county or state education departments or state associations for the gifted. The remainder are associated with selected programmes featured in the publication or with the sponsors (whose programmes also feature).

It says that the Summit was intended to:

  • Share recent research on the education and development of low-income high ability learners;
  • Identify barriers that prevent them from reaching the highest levels of school achievement and ‘success in adulthood commensurate with their abilities’;
  • Share details of successful school-based and supplementary programmes;
  • Synthesise best practice for identifying and supporting low-income learners, ‘especially culturally and linguistically diverse students’; and
  • Generate a research agenda to inform future practice.

It explains that the Summit and Report together were designed to build on the earlier publication ‘Overlooked Gems: A National Perspective on Low-Income Promising Learners’ dating from 2007. NAGC’s page on the Summit carries a shorter summary of the proceedings of the April 2006 conference that generated this report.

‘Unlocking Emergent Talent’ is divided into a series of short chapters which dart around the territory and include a fair degree of duplication. So, in undertaking this analysis, I have taken the liberty of reorganising the material to focus respectively on:

  • The nature of the problem, as currently manifested in US education, including evidence of underachievement and analysis of the barriers to progress and participation by this target group. I have undertaken a good deal more ‘ground-clearing’ than appears in the report itself;
  • The skills and attitudes that can inhibit progress by such learners (which the Report calls ‘Psychosocial Issues’);
  • Effective policies, initiatives, programmes and practice – and the problems associated with replication and scaling (which are given rather cursory treatment);
  • The identified research agenda, insofar as this throws further light on the material already presented.

I have introduced commentary on different but associated material throughout the analysis, wherever it seems to fit best. Much is concentrated in the first part of the post, which considers in some detail the issues that ‘Unlocking Emergent Talent’ is designed to address.

.

Park Fauna by Gifted Phoenix

Park Fauna by Gifted Phoenix

 

Defining the Target Group

The report is rather remiss in not bothering to define with any exactitude what constitutes a ‘Low Income High Ability Student’ and in failing to engage with the issues that arise from the adoption of a definition.

The low income dimension is associated principally with eligibility for free and reduced-price lunches, the criterion applied to data published through the National Assessment of Educational Progress (NAEP). The analysis also makes use of PISA data on the comparative performance of learners from different socio-economic backgrounds and how this varies between countries.

There is no comparison of these measures and no exploration of their good and bad points compared with alternative approaches to defining educational disadvantage.

Any treatment of these issues in England would be certain to include some commentary on the pros and cons of eligibility for free school meals (FSM) as a measure, compared with alternatives that utilise a localised geographical indicator, based on wards or neighbourhoods, or possibly even an alternative proxy derived from family background.

This analysis suggests that such issues are equally pertinent in the US:

‘Students are entitled to free lunches if their families’ incomes are below 130 percent of the annual income poverty level guideline established by the U.S. Department of Health and Human Services and updated annually by the Census Bureau (currently $21,756 for a family of four). Children who are members of households receiving food stamp benefits or cash assistance through the Temporary Assistance for Needy Families block grant, as well as homeless, runaway, and migrant children, also qualify for free meals. Students with family incomes below 185 percent of poverty are eligible for a reduced price lunch…

…Researchers often use free or reduced price lunch (FRPL) enrollment figures as a proxy for poverty at the school level, because Census poverty data (which is used at the state and district level) is not available disaggregated below the school district level and is not collected annually…

While FRPL data is generally a reliable poverty indicator in the elementary grades, it is less so in the high school grades. Because free and reduced price lunch is an opt-in program at the majority of schools, researchers believe that high school students are greatly under-represented in school lunch program enrollment. High school students may refuse to enroll in FRPL due to a perceived stigma attached to the program.’
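
To make the eligibility rules in the quoted passage concrete, here is a minimal sketch of the logic, using the poverty guideline figure quoted above. It is illustrative only, not an official implementation, and the guideline figures change annually.

```python
def frpl_status(household_income: float,
                poverty_guideline: float = 21756.0,
                categorically_eligible: bool = False) -> str:
    """Return 'free', 'reduced' or 'ineligible' for school lunch support.

    categorically_eligible covers the cases listed in the quotation: households
    receiving food stamp benefits or TANF cash assistance, and homeless,
    runaway and migrant children, who qualify for free meals regardless of income.
    """
    if categorically_eligible or household_income < 1.30 * poverty_guideline:
        return "free"
    if household_income < 1.85 * poverty_guideline:
        return "reduced"
    return "ineligible"

# Example: a family of four on $35,000 falls in the reduced-price band
# (above 130% but below 185% of the quoted guideline).
print(frpl_status(35000.0))  # -> reduced
```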

The high ability dimension is muddier, in that the report relies principally on attainment measures – the Advanced Level on NAEP assessments and, on one occasion, ‘the highest achievement levels’ in PISA assessments of reading, maths and science (for background on the latter see this previous post).

This introduces into proceedings the oft-encountered confusion between ability and attainment/achievement, which are of course quite different animals. Indeed the difference between ability unfulfilled and ability already manifested through high attainment/achievement is absolutely pivotal to this topic.

The problem is that much of the available data relates to high achievement, as opposed to high ability. The resulting bias towards achievement data reproduces at macro level an issue often encountered in identification for gifted programmes, where attainment evidence is paramount, resulting in neglect of learners with unfulfilled potential, often attributable to disadvantage.

It is strange that no use is made of data about the composition of the population served by gifted programmes of different kinds and levels, even though there must be abundant evidence that many of these are heavily skewed against learners from disadvantaged backgrounds.

There may even be aggregated national data available. If there is a gifted flag and an FRPL flag in the national data collection, what is the problem in establishing the relationship?

Certainly the Office for Civil Rights publishes information (page 9) about the ethnic composition of gifted programmes nationally.

Their March 2012 summary notes that almost three-quarters of students enrolled in gifted and talented education (GATE) are either White (62%) or Asian (10%), whereas the overall enrolment rates for these populations in areas offering GATE programming are 49% and 5% respectively. By contrast, 16% of GATE enrolments are Hispanic and 10% are Black, while the comparable overall enrolment rates are 25% and 19% respectively.

Across the sample, only 4% of African-American and 5% of Hispanic students are enrolled in gifted programmes.
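
One way of making the disproportion in these figures concrete is a simple representation ratio: each group’s share of GATE enrolment divided by its share of overall enrolment, with 1.0 indicating proportional representation. The calculation below uses the percentages quoted above; the ratio itself is my own illustration rather than part of the OCR summary.

```python
# Shares (%) of GATE enrolment and of overall enrolment, as quoted from the
# Office for Civil Rights summary above.
gate_share = {"White": 62, "Asian": 10, "Hispanic": 16, "Black": 10}
overall_share = {"White": 49, "Asian": 5, "Hispanic": 25, "Black": 19}

for group in gate_share:
    ratio = gate_share[group] / overall_share[group]
    print(f"{group}: {ratio:.2f}")

# White: 1.27, Asian: 2.00, Hispanic: 0.64, Black: 0.53 - values below 1.0
# indicate under-representation in gifted and talented programmes.
```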

This introduces a second problem, in that there is evidence throughout that the report is relying disproportionately on material – both data and research – about the under-representation of (and limited support for) learners from minority ethnic backgrounds in gifted programmes, as opposed to material that relates directly to learners of any ethnic background who are from low-income families.

This is understandable, given the prominent historical focus on minority provision in the US. There are signs that the focus is beginning to shift, given recent data about the increasing size of income achievement gaps compared with minority achievement gaps (see below).

England has already moved to perceiving this issue predominantly through the lens of financial disadvantage, an adjustment that also came about in recognition that some minority ethnic achievement gaps are narrowing (although others remain pronounced) and that financial disadvantage is apparently the core problem.

This approach is not without its critics, since other explanations of minority ethnic gaps may tend to be underplayed as a consequence.

On the other hand, the historical emphasis on minorities may have tended to obscure and even aggravate achievement gaps between advantaged and disadvantaged learners in majority populations. In England, white working class boys are a particular cause for concern.

While there is clear and significant overlap between minority ethnic and financially disadvantaged populations, whether in the US or England, they are by no means synonymous in either country, so prominent health warnings are necessary whenever such assumptions are made.

I have made similar observations in respect of New Zealand, where minority ethnic issues are so prominent in educational discourse – including discourse about gifted education – that they appear to overshadow the issue of financial disadvantage.

To give this report credit, it does point out quite clearly that, while poverty and ethnicity overlap, they are by no means the same thing. Three general assumptions are expressed:

  • ‘Poverty and minority status are not the same. Although there is overlap, poverty manifests differently based on geography, ethnicity, and race.
  • Poverty is pervasive and includes students from rural, White, urban, African American, Hispanic, Asian, and other cultural backgrounds.
  • Typical characteristics of gifted students may manifest differently in low-income, high-ability learners.’

Earlier in the report, 2010 Census data is quoted revealing that 38% of African-American, 32% of Hispanic, 17% of White and 14% of Asian children ‘live in low socio-economic circumstances’. (It is not stated whether this is defined by FRPL or some alternative indicator).

It might have gone further in clarifying that the broader construct of disadvantage reflects the complex interaction between these factors and several others, not least gender, parental level of education, incidence of special educational needs, English as an additional language and even month of birth. As in the UK, it is quite likely that social class may also be a factor.

The large number of variables that may impact on disadvantage in any one individual reinforces the danger of embarking on analysis that gives particular prominence to any single factor, even if evidence suggests that it is the main driver.

It is also a salutary reminder that the response to disadvantage – whether or not within gifted programmes – must be tailored to individual circumstances. The data and research evidence may point to significant trends, but programmes will stand or fall on their capacity to address each learner’s unique needs.

It follows that regular assessment of those needs and how they are changing over time is an essential element of effective practice (and one that is probably underplayed in ‘Unlocking Emergent Talent’).

.

Park Flora 1 by Gifted Phoenix

Park Flora 1 by Gifted Phoenix

.

Analysis of the Problem

The initial Overview section of the report identifies these constituent elements of the problem it seeks to address:

  • Relatively few US students of any description are achieving levels of excellence, whether defined in terms of NAEP Advanced Level or the highest levels of PISA assessment.
  • Poverty has a negative impact on educational achievement. The report draws first on evidence of the impact of socio-economic disadvantage on achievement gaps in the US, compared with other countries, drawn from analysis of PISA 2006 and 2009. The point could have been illustrated pertinently by this diagram:

[Chart: impact of socio-economic background on PISA performance, by country]

Incidentally, the UK is close to the OECD average (14.0) on this measure.

  • Within the US there are also achievement gaps at every level, including ‘excellence gaps’ as evidenced by NAEP. Three different measures are cited:

‘Between 1998 and 2007, 1.7% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP math exam compared to between 6% and 10% of non-eligible students.

Since 1998, 1% or fewer of 4th-, 8th-, and 12th- grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.

Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’

  • Some evidence is also offered to support the argument that US schooling does not currently improve or sustain the performance of the top-achieving students compared with comparatively lower achievers, nor does it close the gap in performance between high- and low-income high-achieving students, as measured by attendance at selective universities, graduation and completion of a postgraduate degree.
  • High ability students (as opposed to high-achieving students) are not perceived as a priority within US education policy. Moreover:

‘Success in closing achievement gaps amongst lower achieving students does not appear to impact gaps amongst groups of top students’.

This is compounded because efforts to address equity in education often fail to embrace those ‘who are already showing advanced ability and/or achievement’ while the overall commitment to supporting gifted education per se is described as ‘tenuous’. The level of support depends on where one lives, and remaining funding is often under threat.

.

A very recent US publication ‘Breaking the Glass Ceiling of Achievement for Low-Income Students and Students of Color’ from The Education Trust (May 2013) provides more in-depth analysis of the excellence gap data (though its coverage is frustratingly incomplete and it too is guilty of unhelpfully interweaving minority ethnic and economically disadvantaged data).

It also relies on NAEP advanced level data for FRPL-eligible students, examining trends from 2003 to 2011, particularly in maths and reading at grades 4 and 8 respectively.

  • In 4th grade maths, the percentage of low-income learners achieving the advanced benchmark increased from 1% to 2% between 2003 and 2011; meanwhile the percentage of high-income learners improved from 6 to 12%, thus widening the gap. A similar pattern was seen in 8th grade maths.
  • In 4th grade reading, the percentage of low-income learners achieving the advanced benchmark remained at 2% between 2003 and 2011, whereas high-income learners improved slightly, from 11% to 13%. The gap also widened at 8th grade.

Meanwhile, gaps were typically narrowing at the ‘below basic’ benchmark (though there was no significant change in 4th grade maths at this level).

This study also analyses progress at the 90th percentile of performance, so independently of the NAEP advanced benchmark, finding some evidence of gap-narrowing (which isn’t quantified for low-income students).

By 2011 there are wide gaps in performance between low-income and high-income learners: 21% for 4th grade maths, 26% for 8th grade maths, 24% for 4th grade reading and 21% for 8th grade reading. But these are invariably smaller gaps than apply at the 10th percentile for low-achieving learners.

Only at 12th Grade is this pattern reversed. At that age, the gap at the 90th percentile in maths is 24%, compared with 18% at the 10th percentile; in reading the 90th percentile gap is 21% compared with 19% for the 10th percentile.

So the overall picture is perhaps somewhat less clear-cut than the selective facts provided in ‘Unlocking Emergent Talent’ would suggest.

.

The pattern is by no means identical in England. I included materials about England’s own excellence gaps in this recent post, which draws particularly on Jerrim’s work on PISA reading assessments.

His work reveals that, on this measure at least, countries display significantly different profiles when it comes to the relationship between background and achievement at different deciles of achievement:

‘He comments on the difference between the US – where the association between background and achievement is relatively strong across the achievement deciles – and Finland, where the association is comparatively weak.

In England there is a relatively strong link between socio-economic background and high achievement:

‘Socio-economic test score differences at the 80th percentile are greater here than in 18 out of the other 22 OECD countries considered (and significantly so on 11 occasions). The same is not true, however, at the bottom of the PISA reading test distribution, where England is actually ranked above the median, having smaller socioeconomic test score differences.’

…He finds that, while the average gap has declined [over time] and that is repeated at the bottom end of the achievement distribution, this is not true at the top.

…He finds that the narrowing of the gap appears to have been driven by a relatively greater decline in achievement amongst those from advantaged backgrounds but:

‘Whereas the apparent decline in performance for the top SES quintile seems to have occurred quite evenly across the achievement distribution… the decline suffered by the most disadvantaged group is most apparent at the top end.’

It would be fascinating to pursue further the apparent disparities between the US and England that this amalgamation of sources begins to uncover, but we must content ourselves for the time being with the broader truth that both countries have significant issues with their socio-economic excellence gaps that urgently need addressing.

.

What can Education Contribute to Gap-Narrowing?

.

How much Difference Does Education Make?

There is nothing at all in ‘Unlocking Emergent Talent’ about the relative impact of educational interventions on disadvantage compared with other strategies, such as tackling the root causes of poverty by redistributing wealth. It seems to be taken for granted that the interventions described will address the problems identified, as long as such effective practice is more widely adopted.

The omission is curious, since Plucker’s presentation to the Summit is unfailingly explicit about the fundamental importance of reducing poverty to tackling the excellence gap.

Plucker poverty 1 Capture

.

Plucker poverty 2 Capture

.


Another recent publication, ‘Improving Performance of Low-Achieving and Culturally and Linguistically Diverse Students’, written by Ben Levin for the Global Cities Education Network, sets the context nicely:

‘The relationship between these social factors and school outcomes has been known for a long time. And at least since the Coleman Report (done in the United States in the mid-1960s), there has been a vigorous debate about how much schools can actually do to overcome these differences. That debate continues, with some contending that schools are rather powerless in the face of social disadvantage and others claiming that schools can do a great deal to overcome social inequities. According to various estimates in the research literature, anywhere from 50 to 80 percent of the variance in student achievement is due to factors outside the school, and anywhere from 20 to 50 percent of the variance is “explainable” (in statistical terms) by factors inside the school.’

Levin goes on to point out that there is huge variance between schools’ performance at any given socio-economic level – and that there are similar disparities between countries, as revealed by the PISA data. Although system-wide improvement is feasible, significant achievement gaps remain in even the most successful countries.

The assumption that school factors may account for up to 50% of variance seems relatively optimistic from a UK perspective. For example, the 2010 BERA Paper ‘Social Equality: can schools narrow the gap?’ warns:

‘However, school effects must not be overstated, as they have sometimes been by national policy-makers. According to studies in the UK, typically between 10-20 per cent of the variance in attainment between pupils is related to school factors – though this does not mean all variance is down to school-level factors, since some will be attributable to teachers.’

In addressing the contribution that gifted education can make to reducing excellence gaps, we would do well to inject a dose of realism about the overall impact of such interventions, while not succumbing to the temptation to underplay their potential significance.

.

The latter position can be all too easy to reach in the light of some contributions to this debate. In recent months, significant attention has been paid to discussion of Sean Reardon’s comparatively pessimistic assessment.

In July 2011 he published The Widening Opportunity Gap Between the Rich and Poor: New Evidence and Possible Explanations which examines the changing relationship between family economic background and educational achievement since the 1970s.

He compares learners from families at the 90th percentile of the income distribution (around $165,000) with those at the 10th percentile (around $15,000). This is of course a significantly more polarised distinction than exists between those eligible for FRPL and those ineligible.

He notes that income inequality has become much more pronounced since the 1970s, such that a family with school-age children at the 90th percentile in 1970 earned five times the amount of a family at the 10th percentile. Nowadays, the multiple is 11. As a consequence, wealthy families now have a comparatively higher proportion of income to invest in their children’s development.

He argues that:

  • The income achievement gap is almost twice the size of the achievement gap between black and white students whereas, in the 1960s, this ethnic achievement gap was almost twice as large as the income-related gap. Hence family income has become a significantly better predictor of success in school than ethnic background.
  • The increasing gap does not seem attributable to differences in parents’ educational level – the relationship between these two factors has remained fairly stable since the 1960s. Consequently, family income is now almost as strong an indicator of children’s achievement as their parental level of education.
  • The size of the gap is at least partly attributable to a significantly stronger association between income and achievement for families with above average incomes, where the effect is now some 30-60% larger than it was for children born in the 1970s.
  • The gap is already sizeable when US children enter kindergarten but then remains relatively stable throughout the remainder of their schooling, neither increasing nor decreasing, so schooling appears to make relatively little difference (though Reardon appears to compromise this position slightly elsewhere.)
  • Evidence suggests that the increase is partly associated with increasing parental investment in children’s cognitive development at the top end of the distribution. Children from wealthier families are better prepared to succeed in school when they enter kindergarten, and they retain this advantage throughout their subsequent schooling.

Reardon’s research has recently been given fresh impetus by an article in the New York Times which glosses his argument thus:

‘The most potent development over the past three decades is that the test scores of children from high-income families have increased very rapidly. Before 1980, affluent students had little advantage over middle-class students in academic performance; most of the socioeconomic disparity in academics was between the middle class and the poor. But the rich now outperform the middle class by as much as the middle class outperform the poor. Just as the incomes of the affluent have grown much more rapidly than those of the middle class over the last few decades, so, too, have most of the gains in educational success accrued to the children of the rich.’

He suggests that wealthier parents are increasingly focussed on the school success of their children because such success has become increasingly important in an environment where a university degree is no longer a guarantee of a good job. Upward social mobility is much harder to secure, so parents are increasingly competing to secure their children’s success.

The level of this investment is significantly higher amongst high-income families than amongst middle and low income families. The gap between the rich and the middle class – ‘upper tail inequality’ – is a new and unfamiliar condition and little thought has been given to addressing it.

Wealthier parents are gaining this advantage through:

‘More stable home environments, more time for parents to read to their children, access to higher-quality child care and preschool and — in places like New York City, where 4-year-old children take tests to determine entry into gifted and talented programs — access to preschool test preparation tutors or the time to serve as tutors themselves.’

It is this fundamental ‘opportunity gap’ that needs to be addressed, rather than the achievement gap evident in schools, which is partly a consequence of it.

Breaking the link between education and family background might involve replicating the behaviour of wealthy families, by investing heavily in the development of high-quality childcare and pre-school experience, paying relatively more attention to improving the quality of parenting than to improving the quality of teachers.

In the light of this there is arguably negligible benefit in investing in subsequent educational interventions that support low-income high-ability learners, because the damage has already been done and later investment is unlikely to level the playing field sufficiently to make a real difference.

But, as noted above, comparisons between the 90th and 10th percentiles by income – as opposed to eligibility and non-eligibility by FRPL or FSM – are bound to result in a relatively pronounced effect.

Moreover, it is not clear whether Reardon’s conclusions apply equally at all levels of achievement. There might be some reason to believe that the effects he describes are somewhat less pronounced in the case of disadvantaged learners who are relatively high attainers, or who have the potential to be so.

And some might argue that an intervention tailored to individual need, which also includes an explicit focus on parental education, might stand a better chance than most of having a positive effect at least commensurate with its cost.

.

Inter-school Variance Still Matters at the Micro-Level

Given the temptation to surrender to negativity, it is important that we do not lose sight of Levin’s point about inter-school variance (as well as inter-national variance). There must be scope for improvement if we can bring more schools (and more countries) up to the level demonstrated by the strongest performers.

This of course raises further difficult questions about the transferability and replicability of effective practice – whether between schools or between countries – that must be set aside as beyond the scope of this post.

Let us continue on the brave assumption that, given the right inputs and distribution processes, improved outcomes can be spread and embedded within a much wider range of settings – and that the right inputs and processes are understood and available to us.

Inter-school variance in support for high-achieving low-income learners has been discussed in another recent US publication. ‘A Level Playing Field: How College Readiness Standards Change the Accountability Game’ reports the findings of a three-year study of 35,000 high attaining learners in elementary and middle schools. The sample was drawn from the top 10% of achievers from each school.

The analysis compares the performance of high-achieving learners from high-poverty and low-poverty schools respectively (as defined by the top and bottom quartiles according to the percentage of learners eligible for FRPL).

It is important to note that high achievers in high-poverty schools are not necessarily from a disadvantaged background, though that is significantly more likely. The same goes for advantaged high achievers in low poverty schools. The study is rather quiet about this issue, though its findings are nevertheless significant.

Two measures are used: improvement in outcomes over time, measured through maths and reading achievement on Measures of Academic Progress (MAP) tests, and projected ACT College Readiness Benchmarks in maths and reading, which were derived from a study that linked MAP scores with these benchmarks.
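
Before turning to the findings, here is a minimal sketch of the kind of comparison described above: schools sorted into high- and low-poverty groups using the top and bottom quartiles of FRPL eligibility, with the average growth of their high achievers then compared. The field names and figures are hypothetical, and this is not the study’s own methodology or code.

```python
from statistics import mean, quantiles

# Hypothetical data: (school_id, % of pupils eligible for FRPL,
#                     mean MAP growth of that school's high achievers).
schools = [
    ("S1", 85.0, 6.2), ("S2", 12.0, 6.5), ("S3", 70.0, 4.8),
    ("S4", 15.0, 5.9), ("S5", 90.0, 7.1), ("S6", 8.0, 5.2),
    ("S7", 65.0, 5.5), ("S8", 20.0, 6.8),
]

# Quartile cut points for FRPL eligibility across the sample.
frpl_rates = [pct for _, pct, _ in schools]
q1, _, q3 = quantiles(frpl_rates, n=4)

# Top quartile = high-poverty schools; bottom quartile = low-poverty schools.
high_poverty_growth = [g for _, pct, g in schools if pct >= q3]
low_poverty_growth = [g for _, pct, g in schools if pct <= q1]

print("High-poverty schools, mean growth:", mean(high_poverty_growth))
print("Low-poverty schools, mean growth:", mean(low_poverty_growth))
```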

Key findings were:

  • The vast majority of middle school high-achievers were projected to achieve the ACT benchmarks – 95% in low-poverty schools in both maths and reading; and over 85% in maths and over 80% in reading in the high-poverty schools. So, on this measure, while there is a disparity, the gap between high and low poverty schools is relatively small.
  • As for improvement in performance, the research finds that high- and low-poverty schools ‘produce roughly consistent rates of improvement over time in both reading and mathematics’. The achievement gap between the high- and low-poverty schools did not widen during the study period (though it didn’t narrow either).
  • There is, however, very significant variation between schools on this measure, both in the low-poverty and the high-poverty samples:

‘For example, at the beginning of the study, the average high-achieving math student in a high-poverty school started out performing at about the 90th percentile relative to national (NWEA) achievement norms. But if such a student attended a school that produced 10th percentile growth, that student would enter middle school performing at only the 77th percentile, whereas a comparable student at a 90th percentile growth school would enter middle school performing at the 93rd percentile. For these two students, the differences in opportunities could be quite large.

In short, given the large variance in growth across schools, it is quite clear that factors other than poverty largely control the relative growth of high achievers generated by any given school. This trend is interesting because it is counterintuitive. Given the advantages in resources available to wealthier schools, many might expect that students attending such schools would show superior growth over time. This was not necessarily the case.’

It follows that transferring from a high-poverty to a low-poverty school will not necessarily produce a dramatic improvement in high achievers’ performance. And it is a mistake to assume that low poverty equates to high quality, or vice versa for that matter. Quality operates independently of the relative poverty of the intake.

  • The study calculates that, if all high-poverty schools were able to produce the growth achieved by schools in the 75th percentile of the sample, the college-readiness gap between high- and low-poverty schools would be eliminated. The Preface comments:

‘Perhaps the best news coming from this study is that many high-poverty schools meet and exceed that target. The top high-poverty schools show growth that not only equals the best low-poverty schools but also dwarfs the meagre returns achieved by the worst ones. In fact, the 22 high-poverty elementary schools with the best growth rates entirely erased and surpassed their achievement gap relative to the 27 low-poverty schools with the lowest growth rates. And the 13 high-poverty middle schools with the highest rates of growth closed and surpassed their achievement gap relative to the 16 low-poverty schools with the lowest growth rates.’

So, to sum up, when it comes to narrowing achievement gaps – including excellence gaps – education may not matter that much at the macro level when compared with other key variables, but which school a learner attends matters considerably at the micro level for the individual gifted learner.

Moreover, if all schools could perform at the level of the best, that would have a significant effect within the relatively narrow limits of education’s contribution to the overall equation. So attention shifts to the optimal way of transmitting effective practice between settings (or it would, had we not set aside the difficult questions about this). This issue is another missing link in the argument set out in ‘Unlocking Emergent Talent’.

In passing, it is worth noting that one of the policy recommendations in ‘A Level Playing Field’ would be very familiar to those involved in English gifted education:

‘Moving forward, this study encourages policymakers to reframe the national discussion about how to best serve high achievers by recognizing that the nation’s “elite students” should not be defined solely as the top 1%, 5%, or 10% in the standardized testing pool, and that each and every school has its own group of elite students.’

There is real value in framing policy to address the needs of the most able pupils in every school, even though this population would vary considerably compared with national norms. This takes one stage further the arguments in the report in favour of local norms.

Not only should interventions be tailored to the needs of individual learners, but they should also be sufficiently flexible to be adopted in every school, since no school should be allowed to assume that it has no gifted learners. If exceptions are permitted, it follows that high-ability learners within them who are held back by disadvantage will miss out on their entitlement.

This has been an extensive detour and it is high time that we returned to the substance of ‘Unlocking Emergent Talent’.

.

Park Flora 2 by Gifted Phoenix

Park Flora 2 by Gifted Phoenix

.

Barriers to Overcome

The report identifies seven barriers to participation by disadvantaged learners in programmes suited to their educational needs which, it says, are particularly problematic for those catered for by public (as opposed to private) schools.

  • Narrow conceptions of giftedness that perceive it as an inherited and fixed trait rather than malleable and potentially evidenced through unfulfilled potential. The Report speaks of ‘already-developed ability’ as opposed to ‘potential to achieve’, but this is inaccurate and confusing since the distinction is fundamentally between selection on the basis of achievement (which favours those from advantaged backgrounds) and selection on the basis of ability (which should not do so, assuming that ability is evenly distributed within the population). The report avoids confronting this issue of the distribution of ability head on (see below), though it does acknowledge the deleterious effect of limited exposure to ‘a literacy-rich home’ and ‘challenging curriculum and enriched learning opportunities’.
  • Misconceptions about disadvantaged high-ability learners which boil down to low expectations and over-emphasis on what these learners lack by way of ‘economic, social and cultural capital’ rather than their strengths. These impact negatively on teacher nominations for gifted programmes, often dictated by poor identification practice that fails to utilise qualitative evidence and does not take account of learners’ different cultural backgrounds.
  • Limitations of pedagogy and curriculum which do not foreground talent development but tend to underestimate learners’ capabilities, concentrating overmuch on tackling ‘perceived academic deficits’ through ‘drill to build up missing basic skills and content knowledge’. It is also suggested that US schools do not offer a sufficiently culturally responsive curriculum that reflects the experiences, heritage, language and values of minority ethnic groups as well as of ‘majority cultures living in geographically depressed areas’.
  • Poor identification practice, including using a narrow range of evidence, failing to take account of the limited learning opportunities formerly made available to such students, perhaps by applying inappropriate national norms, relying overmuch on nominations from inexperienced teachers who have had no appropriate training, and failing to offer learners more than one opportunity to demonstrate their ability and to take proper account of improvement over time.
  • Introducing obstacles to programme participation, such as expecting learners to travel outside their own area or expecting them to meet associated transport costs. Sometimes parents’ inability to press for appropriate educational adjustments or secure access to the best quality schooling can also prove problematic.
  • The Gifted Label which can damage relationships between the learner and his peers, even resulting in rejection and/or bullying. Consequently, potential gifted learners may avoid the imposition of the label, or be dissuaded if their own background is under-represented in the gifted group.
  • Limited access to out-of school opportunities, which – in the US particularly – have been used by parents to compensate for ‘the shortage, or absence, of advanced courses in their children’s schools’. There is an extensive tradition of such provision in the US, especially summer schools and shorter weekend and holiday courses, often linked to talent search procedures. But the vast majority require payment of tuition fees, so they are largely enclaves for the advantaged middle classes.

All of these are familiar in the English setting, though the last is somewhat less pronounced, simply because the range of opportunities of this kind is significantly more limited here, and there may be a stronger tradition of schools providing their own out-of-hours learning opportunities.

They are all perfectly valid, but they stand as proxy for the more substantial barrier that I have alluded to above: the assumption that ability (as opposed to achievement) is unequally distributed in the population, whether by ethnicity, gender or socio-economic background.

This issue is now so toxic that there is often a tendency to ignore it. There are continuing research traditions which make it their business to detect perceived differences in intelligence or ability, and to conclude that these impact significantly on educational achievement.

But, even if these arguments can be made to stand up (and they are open to challenge on a variety of grounds), the fundamental difficulty is that they serve to reinforce precisely the low expectations that lie at the root of the problem.

It follows that there is much virtue in starting from the fixed and incontrovertible assumption that, while the distribution of achievement is undoubtedly affected by gender, ethnic and socio-economic background, the distribution of ability is not.

Then the equity-driven side of the equation for gifted educators is far more straightforward to grasp and aim towards: it is simply to ensure that entry to gifted programmes is broadly representative and that success – whether demonstrated by a measure of high achievement, progression to selective higher education or any other outcome – is evenly distributed.

If too few low-income learners are admitted to a gifted programme, this may well be indicative that identification procedures are over-reliant on attainment measures, as opposed to evidence of hidden or emergent potential.

If too few low-income learners are successful within a gifted programme, this may well be indicative that the content and/or assessment is inappropriately weighted against learners from such a background.

This is not to argue for fixed quotas, or affirmative action, but simply to advance a straightforward corrective to the ‘deficit thinking’ that is outlined in the report.

It is only by following these arguments through to this ultimate position that we can effectively counter the hold of unfairly low expectations on our efforts to narrow and ultimately eliminate unhelpful excellence gaps.

.

Psychosocial Factors

The report poses two questions: which non-cognitive factors are most significant in determining the success of low-income high ability students, and which of these most lend themselves to improvement through education?

It calls for more research into the characteristics of successful learners with this background, which is perhaps tantamount to an admission that the treatment subsequently offered is both provisional and potentially incomplete.

As a precursor to that treatment, it offers an outline drawn from research on African-American and Latino gifted students which may not be fully transferrable to the low-income population (the emphases are mine):

‘These students had high educational and career aspirations and were extremely motivated to accomplish them. They demonstrated a strong work ethic and commitment to study. Their families were emotionally supportive and they had extended family and other adults such as teachers, coaches, mentors, and church leaders to turn to for additional support and guidance. High self-esteem gave them the confidence to actively seek advice and assistance from adults outside the family when they needed it. They had a peer network of other students with similarly high goals and commitment to academic achievement who provided psychological, emotional and social support to remain on track despite setbacks or obstacles. They were confident in their own racial identity and open to multicultural experiences, including friendships.’

The subsequent text does not dwell on the importance of support networks within and beyond the family, concentrating exclusively on the learners’ own characteristics. Nor does it treat all of those, selecting instead the following factors, which it suggests are ‘especially critical and malleable’:

  • Mindsets, or beliefs about intelligence and ability. Those who see their capability as fixed are disadvantaged compared with those who believe they can improve their performance through effort. This is allied with the concept of ‘grit’, or resilience, associated with recognition of the significance of persistent effort over time. Educational settings can encourage learners to appreciate the contribution to success made by their own effort and persistence.

‘Grit’ is currently receiving significant attention. Duckworth’s presentation concludes with an admirably brief summary of the findings from her research into this phenomenon.

[Screenshot: Duckworth’s summary of her research into ‘grit’]

  • Motivation, which is associated with students’ belief that they can do well in school, and that doing well is important to them and will contribute significantly to their life chances. Motivation is associated with high expectations from educators, who give learners opportunities to succeed, so building their confidence and motivation to succeed further.
  • Some other factors are identified as particularly relevant to high achievers, though the commentary suggests that findings associated with minority ethnic groups are being applied here to low-income students without too much supporting evidence. Factors include: negative stereotypes of groups to which the learner belongs, which can impact on their engagement and performance; a perceived choice between achievement and affiliation with a group of friends or peers, and the risk that choosing the former lays the student open to isolation and bullying; and the capacity to develop ‘dual identities’ to reconcile conflicting expectations and norms.

There is a fairly extensive literature in England about the impact of aspirations and attitudes – whether the learner’s or their parents’ – on learners from disadvantaged backgrounds, though the extent to which these vary according to ability or prior achievement is relatively less explored.

It will be interesting to compare the findings from the forthcoming ‘Investigation of school and college-level strategies to raise the aspirations of high-achieving disadvantaged pupils to pursue higher education’ with other more generic material and also with the list above.

A 2012 study by Gorard et al., ‘The impact of attitudes and aspirations on educational attainment and participation’, offered a meta-analysis covering 13 different kinds of aspiration, attitude or behaviour (AAB), four of which relate to parents (parental involvement, parenting style, parental expectations and parental substance abuse).

Five more relate to a learner’s own attitudes and aspirations: self-concept or esteem (self-perception and evaluation of one’s worth or goodness), self-efficacy or locus of control (belief in one’s ability to achieve and that one’s actions can make a difference), aspiration (what one hopes will happen in the future), motivation (the reason for a decision and strength of purpose in carrying it out) and attitude (one’s feelings about school and education).

The remaining four are behavioural: engagement with extra-curricular activities, engagement with paid work, substance abuse and poor behaviour.

The survey sought evidence of a causal relationship between each of these and attainment/participation, having determined that such a relationship involves four aspects:

  • There is an association, or correlation, between the two variables;
  • The AAB pre-existed any improvement in attainment/participation and can be used to predict subsequent changes;
  • Controlled interventions have altered the level of an AAB, so producing changes in attainment/participation that cannot be otherwise explained; and
  • There is a plausible account of how the AAB influenced attainment/participation.

The authors comment:

‘The evidence in most areas is generally too immature at present to estimate the effect sizes or the costs of any type of intervention. It is important, therefore, that future work moves towards estimates of both, which can then be broken down into estimates of cost-effectiveness for specific sub-groups of learners, such as low attainers and families of low socio-economic status (SES).

Much of the work found in this review on the causes of attainment was conducted in the USA. Its results are relevant to the experience on this side of the Atlantic, but it would be helpful to see more of this kind of work, concerning both participation and attainment, being carried out in the UK, and reflecting the country’s specific context and culture.’

The parallel summary report, ‘The Role of Aspirations, Attitudes and Behaviour in Closing the Educational Attainment Gap’, concludes:

‘The existing evidence supports the use of interventions focused on parental involvement in children’s education to improve outcomes. The immediate focus should be on rolling out and closely monitoring such interventions.

There is mixed evidence on the impact of interventions focused on extra-curricular activities, mentoring, children’s self-belief and motivation. Further development of such interventions should be trialled alongside evaluations of their effectiveness.

There is little or no evidence of impact for interventions focused on things like addressing children’s general attitudes to education or the amount of paid work children do during term time. Such interventions might be pursued for other reasons, but the evidence does not currently support their use to raise attainment.’

Given the clear differences between the typologies adopted – and the fact that the English research relates to all disadvantaged learners rather than just high-ability learners – there is cause for caution.

While ‘psychosocial factors’ may be significant, the evidence base is thin and, without such evidence, we may be tempted to exaggerate their impact relative to other factors that may more readily explain achievement and excellence gaps.

.

Park Flora 3 by Gifted Phoenix

Park Flora 3 by Gifted Phoenix

.

Effective Policies, Initiatives, Programmes and Practice

The report comes at effective provision in three overlapping chapters, devoted to programme models, policies and initiatives, and best practices respectively.

Six effective practices are identified from analysis of a range of different school-based and supplementary programmes (one or two of these practices are called slightly into question by the analysis above):

  • Gateway function: a focus on preparation for subsequent educational experience, often at critical transition points, so helping to ‘increase access, create additional entry points into, and address “leaks” in existing pipelines of talent development’. Ideally provision should comprise ‘comprehensive talent development paths…that begin in pre-school (or earlier) and continue through Grade 12 and beyond’.
  • Selection criteria matched to level of developed talent: provision for younger learners is more inclusive and less selective than provision for older students. Selection criteria draw on multiple evidence sources to produce a holistic assessment, including quantitative data based on local norms rather than rigid national cut-off scores.
  • A challenging enriched curriculum that requires higher-level thinking skills: learners with developing abilities can benefit from challenge as much as the highest-achieving students. This often demands professional development to raise teachers’ expectations and develop their differentiation skills.
  • Significantly extended learning time beyond the school day: this may be as important in tackling underachievement amongst potentially high-achieving students as for those performing at lower levels.
  • Components that compensate for the benefits enjoyed by more advantaged students: this might include tutoring, mentoring and counselling, internship opportunities and careers advice.
  • Expanded student support networks: providing opportunities for learners to work with similar students from other schools or localities, so creating a stronger peer network. This might be complemented by mentor support and parental education, so as to strengthen family support.

This is followed by a series of seven ‘policies and action initiatives’:

  • Increase expectations, by introducing and working towards clearer definitions of advanced levels of learning on state tests, focusing simultaneously on increasing the proportion of learners achieving those levels and narrowing achievement gaps. Similar goals should be set in respect of NAEP and PISA measures of advanced performance. Also ensure that high-quality teaching is available to these learners, especially in high-poverty schools.
  • Support high achievement through a range of strategies including more specialist STEM schools, implementing a ‘gifted education pedagogy’, additional focus on gifted education in initial teacher education and subsequent professional development, extending access to Advanced Placement and International Baccalaureate courses, and improved access to out-of-hours supplementary programmes.
  • Start early and sustain, by supporting pre-school and elementary school enrichment activity, identifying high achievers and then providing them with consistent support throughout their time in school. This will demand focus on instilling psychosocial skills ‘supportive of continued commitment to high achievement’.
  • Provide additional support alongside the school curriculum, such as mentoring, tutoring, advice on university entry and access to role models.  Given the significance of family support, programmes must develop parents’ understanding and advocacy.
  • Remove barriers to programme participation, ensuring that definitions and identification processes are inclusive of ‘marginalised and under-identified gifted students’, that information is translated into community languages and that districts and schools are supportive of learners progressing through the curriculum at their own pace.
  • Focus wider school reform on high ability: ensure that efforts to address achievement gaps incorporate excellence gaps, that Response To Intervention (RTI) and grouping strategies address these learners’ needs and that success is measured in a way that incorporates high achievement. Effective practice must be shared, so that successful programmes can be replicated and adapted elsewhere.
  • Invest in research to determine ‘the conditions under which interventions are effective and with whom’. It is critical that these are cost-effective and scalable. (There is a brief and not too helpful section on replicability and scalability which rather vaguely suggests exploration of distance education models and the development of ersatz supplementary education within school settings, possibly built on partnership between organisations offering supplementary programmes and school districts.)

Finally, there is a third series of ‘best educational practices’ which highlights material earlier in the text. In summary it advocates:

  • Inclusive, culturally responsive and holistic identification practice, supported by teacher education.
  • Culturally responsive programmes and services incorporating development of both cognitive and psychosocial skills.
  • Positive cultures in schools that ‘exalt individual differences of all kinds and value and reward high academic achievement create [sic] contexts in which low-income, high-ability students from all backgrounds can thrive’.

There is also a final exhortation:

‘A list of best practices will remain just that unless it is coupled with a commitment to looking at low-income and culturally and linguistically diverse students from a different lens and from a perspective that emphasises strengths instead of weaknesses, differences rather than deficits, possibilities as opposed to limitations, and solutions instead of obstacles.’

The Appendix to the report provides separate summaries of eight different programmes featured at the Summit. This is both a small sample and a mixed bag, containing some very small programmes and some rather large ones. There are also two projects focused exclusively on supporting learners from minority ethnic backgrounds.

The links below are to project websites where these are available:

  • Project Excite, a year-round out-of-school programme for minority learners in Grades 3-8 provided by Northwestern University and two local school districts.
  • Project Nexus, a former programme of the Maryland State Education Department (2005-2008) helping to prepare low-income students for higher education.
  • The Scholars Program, provided by Sponsors for Educational Opportunity (SEO), a year-round out-of-school programme supporting urban students in New York and San Francisco to progress to selective universities.
  • The TEAK Fellowship, a year-round out-of-school programme for talented New York City students from low income families supporting admission to high school and university.
  • The Young Scholars Program, operated by Fairfax County, Virginia to support low-income high ability learners in grades K-2, preparing them for subsequent gifted programmes.

.

Moving Forward

The report admits to ‘a lingering concern’ associated with the interaction of different variables – it specifies rural/urban/suburban, race and culture – and the implications for effective provision. This is welcome in light of some of the reservations expressed above.

It also quite rightly rejects ‘categorical designations’ because they ‘fail to capture the variation in levels of poverty, opportunity and education within the subgroups included in each category’. A one-size-fits-all approach will not work.

It proposes a research agenda that foregrounds our limited understanding of the characteristics of successful learners from low-income backgrounds since:

‘Although we can speculate on obstacles and impediments, there is not a deep understanding of how these intersect with race, culture, gender and domain of talent.’

There is surely a risk that the interaction of so many different factors – elements of disadvantage, as well as variations in background, schooling and personal attitudes – is so complex and individualised that it will not be possible to draw general conclusions that can be consistently applied across this population.

The research agenda proposes further work to investigate the characteristics of successful learners, the development of psychosocial skills, the removal of barriers (professionals’ perceptions and assumptions, identification, family and community beliefs) and effective provision (appropriate curriculum and instruction, the characteristics of successful programmes, scaling and replication and teacher education).

One cannot help feeling that, rather than providing a basis for extensive further work of this nature, any available funding might be better spent in devising cost-effective and scalable interventions that start from our current understanding of effective practice – and evaluating them formatively and summatively so as to refine that understanding and adjust the programmes accordingly.

But maybe this is the tension between giftedness and gifted education once more rearing its ugly head. Or maybe it is my bias against research and in favour of policy-making; or perhaps a little of both.

Still, a focus on the tangible and immediate – on inputs and processes and their success in efficiently generating the right mix of positive outcomes – is likely to yield more substantive and more immediate returns than in-depth psychological study.

.

Apple Blossom by Gifted Phoenix

Apple Blossom by Gifted Phoenix

.

Drawing the Strands Together

.

Unlocking Emergent Talent and Elements of Effective Provision

Unlocking Emergent Talent is a helpful résumé of what is currently understood as effective practice in identifying and meeting the needs of high-ability, low-income learners, but it does not add conspicuously to our collective understanding of such practice.

It also displays some shortcomings, in substituting evidence about minority ethnic students to fill gaps in the evidence base for low-income students and, to a lesser extent, in not consistently differentiating findings about high-achieving students from findings about high-ability students.

It does not fully address, or else skips over, a series of substantive issues including:

  • Different definitions of ‘high ability’ and ‘low income’ and the issues associated with selecting one of several alternatives.
  • The wider evidence base on excellence gaps, which presents a rather more complex picture than that presented in the report.
  • The range of factors that contribute towards disadvantage and the complex manner in which different factors interact and impact on the learner.
  • The relatively limited contribution that education can make to tackling disadvantage and the correspondingly significant impact of poverty on educational achievement.
  • Variation in the quality of support between settings and the potential impact of reducing this variance (together with associated questions about our capacity to spread and embed effective practice).
  • The distribution of ability within the population.
  • The value of parental engagement compared with learners’ own ‘psychosocial skills’, and the significance of those skills relative to other variables.
  • Cost and efficiency and their influence on the shape of interventions to support the target group.
  • Identifying the right blend of in-school and out-of-hours provision.
  • Considering the relative advantages and disadvantages of stand-alone provision for disadvantaged learners, integrated support for advantaged and disadvantaged alike, or a mixed economy.

All that said, it provides a helpful framework against which to assess current practice and from which to begin to develop new practice. From a domestic perspective it supplies a reasonable reference point for consideration of the relatively similar English publication we expect in September.

The read-across will not be perfect. The English report will be dedicated specifically to support for progression to higher education and its focus will be exclusively on 11-18 year-olds. It will adopt a relatively liberal definition of ‘high-achieving’ which is broad in terms of the range of achievement it embraces, but does not otherwise accommodate those whose ability is not yet translated into high achievement. It is likely to concentrate substantively on in-school and in-college strategies, as opposed to external programmes.

Nevertheless, my forthcoming review will undoubtedly be aided by this prior excursion into broadly similar territory on the other side of the Atlantic.

.

The Pupil Premium

That said, one further critical issue will not be assisted by the comparison: whether available funding, principally in the form of the Pupil Premium, is allocated in such a manner that high-achieving disadvantaged learners receive their fair share of support – and whether such funding is making a real difference to their expectations of progression to higher education, and especially to selective universities.

I have raised in at least one previous post the question of whether:

‘Gifted learners from disadvantaged backgrounds will receive the same level of benefit from the Premium as other disadvantaged learners, notably those who are not likely to achieve national benchmarks at Key Stage 2 and Key Stage 4.

For the Premium does not currently operate as an individual entitlement following the learner. The Government has issued no advice to schools to suggest that it should be deployed in this fashion…

The Institute for Public Policy Research (IPPR) has argued that each eligible learner should receive a Pupil Premium Entitlement, so ensuring that the funding directly benefits those eligible for it. The IPPR argues that this should pay for:

‘extra catch-up tuition, small group tuition or one-to-one teaching to stretch the most able low-income pupils’.’

While there has been no apparent shift towards such an entitlement, other levers have been brought to bear to increase the general emphasis on gap narrowing. Ofsted inspectors will be monitoring the attainment gap in every school and will not rate a school outstanding unless it is closing that gap. Schools that are struggling will be required to appoint a head teacher from a successful school to advise them.

Ofsted has reinforced the message that schools should have:

‘Carefully ringfenced the funding so that they always spen[d] it on the target group of pupils’.

And, when it comes to high achievers, has expressed the desire that schools have:

‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.

One might reasonably expect that the imminent Ofsted report on provision for highly able learners, next in line for publication in the ‘Summer of Love’, will incorporate some further coverage of this kind, including some guidelines to differentiate effective and less effective practice. That messaging should then be traceable across to the third and final publication, where it should be an important feature.

It remains to be seen whether the other key accountability lever, School Performance Tables, will be used to incentivise schools to support their higher achievers. The consultation on secondary school accountability – recently closed – proposed the publication of generic attainment data for pupils attracting the Premium, but did not commit to differentiating that data by prior achievement.

We know that the current method of delineating such achievement, National Curriculum levels, is set to disappear in 2016 and, although there has been a commitment to a new system for grading high attainment in the core subjects at the end of KS2, we do not yet know how that will be done.

The Government has published a series of short case studies of effective use of the Premium, one of which features the gifted and talented programme at Paignton Community and Sports College. It doesn’t offer any startling insights into best practice, but it does confirm official endorsement for deploying some of the available funding in this fashion.

There is evidence elsewhere that the broad message has already been taken on board. A new scheme administered by the National College for Teaching and Leadership, ‘Closing the Gap: Test and Learn’, supports school-based research into effective approaches to narrowing the gap. It is beginning with a consultation phase in which schools have been asked ‘which group of pupils should we be most attending to?’

The initial results make positive reading.

[Screenshot: Curee consultation results]

As things stand, one might reasonably expect that a significant proportion of the funded projects will be focused on our target group.

But there are also issues associated with the fact that the Pupil Premium is not available in post-16 settings, where entirely different funding arrangements apply. There is no mechanism for securing consistent support across the transition between 11-16 and 16-19 education for the substantial proportion of students who progress to higher education via two separate institutions with a break at 16 (or, for that matter, for those who change institutions at some other point in their school careers, most often as a consequence of moving house).

There have been suggestions that this might change. Press coverage in May 2012 reported that consideration was being given to a Student Premium for all pupils eligible for free school meals who passed the EBacc. The funding, worth up to £2,500 a year, would be confirmed at age 16, subject to confirmation of a university place, but would not be available until the student entered higher education.

Then the Government’s report on progress in the first year of its social mobility strategy mentioned:

‘Options for reform of the National Scholarship Programme and other forms of student support, including a possible ‘HE Premium’, alongside other models… and whether we can give greater certainty of the support available to individuals at the point they are considering applying to university.’

No reforms of this nature have so far been forthcoming.

Towards the end of the first post in the Summer of Love series, I proposed a targeted intervention programme supported by an annual Pupil Premium topslice. The funding would be transferred into a personal entitlement or voucher that could be passported with the individual learner, following them across into a post-16 setting if necessary.

There is a precedent for such a topslice in the form of the £50m of Pupil Premium funding set aside for summer schools. A further £50m topslice represents just 2% of the total sum available for the Pupil Premium in 2014-15.

It should be possible to generate a matched contribution from a separate 16-19 funding source if necessary, though the total amount required would be relatively small.

Let us end with some traditional but provisional ‘back-of-the-envelope’ costings.

In the early secondary years the funding might be targeted at broader awareness-raising for all Premium-eligible learners achieving Level 5 at KS2 in English and Maths (or the equivalent in the new assessment regime). This is currently some 14% of the Year 6 cohort so, assuming a total year group of 600,000, some 84,000 learners in each year group across Years 7-9.

From Year 9/10 onwards it might be focused more tightly on a tailored programme for each Premium-eligible learner with the capacity to enter a selective higher education course, or a selective university, or to achieve a specified benchmark, such as A levels at Grades AAB+. In 2010-11, just 7% of all state school students achieved these grades (though admittedly in ‘facilitating subjects’ only).

I cannot find a reliable estimate of the proportion formerly eligible for free school meals, but modelling undertaken by HEFCE in 2011 (Annex D) suggests very small numbers in POLAR Quintile 1 (2,741 aged under 21) achieved this outcome (and not only in ‘facilitating subjects’ either). It is highly unlikely that the national cohort of Premium-eligible learners considered likely to achieve this would exceed 5,000 per year group.

So we might expect a steady-state national cohort of around 250,000 in Years 7-9 and some 20,000 in Years 10-13. A sum of £50m would enable one to allocate:

  • £1,500 per year to learners in Years 10-13 (20,000 x £1,500 = £30m)
  • An average of £6,000 per year per school for learners in Years 7-9 (3,000 x £6,000 = £18m), though the sums provided would be weighted to reflect distribution while avoiding ‘penny packages’.

That would leave sufficient change – roughly £2m – for formative and summative evaluation, possibly even a thorough randomised controlled trial!
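For anyone who wants to test or vary these assumptions, here is a minimal sketch of the same arithmetic in Python. The cohort sizes, school count and per-head sums are simply the illustrative figures used above, not official data, so they can be swapped for better estimates as these become available.

```python
# Back-of-the-envelope costing for the proposed Pupil Premium topslice.
# All figures are the illustrative assumptions used in the text, not official data.

TOPSLICE = 50_000_000          # proposed annual topslice (£), roughly 2% of a ~£2.5bn Pupil Premium

# Years 7-9: awareness-raising for Premium-eligible learners with KS2 Level 5 in English and Maths
year_group_size = 600_000      # assumed national year group
level5_premium_share = 0.14    # ~14% of the Year 6 cohort
ks3_per_year_group = int(year_group_size * level5_premium_share)  # ~84,000 learners
ks3_cohort = ks3_per_year_group * 3                               # ~252,000 across Years 7-9

secondary_schools = 3_000      # assumed number of participating secondary schools
per_school_grant = 6_000       # average annual allocation per school (£)
ks3_cost = secondary_schools * per_school_grant                   # £18m

# Years 10-13: tailored programme for learners with AAB+/selective HE potential
per_year_group_10_13 = 5_000   # generous upper estimate per year group
ks4_5_cohort = per_year_group_10_13 * 4                           # ~20,000 across Years 10-13
per_learner_grant = 1_500      # annual entitlement per learner (£)
ks4_5_cost = ks4_5_cohort * per_learner_grant                     # £30m

remainder = TOPSLICE - ks3_cost - ks4_5_cost                      # ~£2m left over for evaluation

print(f"Years 7-9:   {ks3_cohort:,} learners, £{ks3_cost:,}")
print(f"Years 10-13: {ks4_5_cohort:,} learners, £{ks4_5_cost:,}")
print(f"Remaining for evaluation: £{remainder:,}")
```

On these assumptions the two strands together cost £48m, leaving around £2m of the £50m topslice as change for evaluation.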

.

GP

May 2013