Excellence Gaps Quality Standard: Version 1

 

This post is the first stage of a potential development project.

It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.

It aims to integrate two separate educational objectives:

  • Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
  • Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.

High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.

I have adopted new design parameters for this fresh venture into quality standards:

  • The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
  • The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe if they are not to be judged inadequate. I have erred on the side of minimalism for this first effort.
  • The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
  • The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.

The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.

There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)

The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.

The terminology will not necessarily be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. Primary schools, for example, may find some of the language better suited to secondary settings.

It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.

It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.

Some of the terminology will be wanting; some important references will have been omitted, while others will be over-egged. That is the nature of ‘aunt sallys’.

Feel free to propose amendments using the comments facility below.

The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.

 

 

Emerging (RI): The setting meets essential minimum criteria
Improving (G): In best fit terms the setting has progressed beyond entry level but is not yet exemplary
Exemplary (O): The setting is a model for others to follow

Performance

Emerging: Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.

Improving: Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.

Exemplary: Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally. Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.

 

 

 

Emerging (RI): The setting meets essential minimum criteria
Exemplary (O): The setting is a model for others to follow
Policy/strategy

Emerging: There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored.

Exemplary:

  • There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike.
  • SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected.
  • The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.

Classroom T&L

Emerging: Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.

Exemplary:

  • The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice.
  • All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance.
  • All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.

Out of class learning

Emerging: A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.

Exemplary:

  • A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate.
  • All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled.
  • Staff ensure that: learners’ needs are regularly assessed; they access and complete opportunities that match their needs; participation and performance are monitored and compiled in a learning record.

Assessment/tracking

Emerging: Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.

Exemplary:

  • Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance.
  • Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals.
  • Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness.
  • All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work.
  • Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.

Curriculum/organisation

Emerging: The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.

Exemplary:

  • The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this.
  • Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs.
  • Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility.
  • Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.

Ethos/pastoral

Emerging: The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.

Exemplary:

  • The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally.
  • Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem; hothousing is shunned.
  • High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportional response to efforts to undermine this culture.
  • Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed.
  • The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.

Transition/progression

Emerging: The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.

Exemplary:

  • Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect.
  • Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings.
  • Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs.
  • Destinations data is collected, published and used to inform monitoring.

Leadership, staffing, CPD

Emerging: A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard). Professional development needs associated with closing excellence gaps are identified and addressed.

Exemplary:

  • The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps.
  • A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard).
  • Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor.
  • There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part.
  • The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.

Parents

Emerging: Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.

Exemplary:

  • Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary.
  • Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning.
  • This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.

Resources

Emerging: Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard). Where available, Pupil Premium is used effectively to support disadvantaged high achievers.

Exemplary:

  • Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard).
  • The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting.
  • The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs.
  • Settings should evidence their commitment to these principles in published material (especially information required to be published about the use of Pupil Premium).

Partnership/collaboration

Emerging: The setting takes an active role in collaborative activity to close excellence gaps.

Exemplary:

  • Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality.
  • The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally.
  • The setting uses collaboration strategically to build its own capacity and improve its expertise.

 


Those who are not familiar with the quality standards approach may wish to know more.

Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.

The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever more elaborate performance tables and through an inspection regime that depends on fallible human inspectors, supported by documentation that regulates towards convergence when it should be enabling diversity, albeit within defined parameters.

I see more value in supporting institutions through best-fit guidance of this kind.

My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.

I have written about the application of quality standards to gifted education and their benefits on several occasions.

Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.

Rather they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.

Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.

 

GP

October 2014

16-19 Maths Free Schools Revisited: Odyssean Edition

This is the second edition of a post that marks the opening of two university-sponsored 16-19 maths free schools by taking a fresh look at the wider programme that spawned them.

Courtesy of Andrew J Hanson, Indiana University

I have revised the text to reflect substantive comments provided by Dominic Cummings through the @odysseanproject Twitter feed. Cummings was political adviser to former Secretary of State for Education Michael Gove until January 2014. He was the instigator and champion of the maths free schools programme.

I feel obliged to point out that the inclusion of these comments does not constitute his endorsement or approval of the text. I have reserved the right to part company with him on matters of interpretation (rather than matters of fact) and have signalled where such instances occur.

The post scrutinises developments since the publication of ‘A Progress Report on 16-19 Maths Free Schools’ (March 2013), building on the foundations within ‘The Introduction in England of Selective 16-19 Maths Free Schools’ (November 2011).

The broad structure of the post is as follows:

  • A description of the genesis of the programme and a summary of developments up to March 2013.
  • The subsequent history of the programme, from March 2013 to the present day. This reviews efforts to recruit more university sponsors into the programme – and to resist the publication of information showing which had submitted expressions of interest and, subsequently, formal proposals.
  • An assessment of the prospects for the programme at this point and for wider efforts to expand and remodel England’s national maths talent pipeline.

Since many readers will be interested in some of these sections but not others, I have included direct links to the main text from the first word of each bullet point above.

 

Genesis and early developments

Capital investment to support the programme was confirmed in the 2011 Autumn Statement, which referred to:

‘…an extra £600 million to fund 100 additional Free Schools by the end of this parliament. This will include new specialist maths Free Schools for 16-18 year olds, supported by strong university maths departments and academics’.

This followed an orchestrated sequence of stories fed to the media immediately prior to the Statement.

One source reported a plan to establish 12 such schools in major cities by the end of the Parliament (Spring 2015) ‘before the model is expanded nationwide’. These would:

‘…act as a model for similar institutions specialising in other subjects’.

Another confirmed the number of institutions, adding that there would be ‘…a special application process outside the regular free school application process…’

A third added that the project was viewed as an important part of the Government’s strategy for economic growth, suggesting that some of the schools:

‘…would offer pure maths, while others would combine the subject with physics, chemistry or computer sciences.’

Assuming provision for 12 schools at £6m a time, the Treasury had provided a capital budget of £72m available until 2015. It remains unclear whether this sum was ringfenced for university-sponsored maths schools or could be diverted into the wider free schools programme.

We now know that Cummings was behind the maths free schools project. But these original press briefings originated from the Treasury, showing that they were indeed committed to a 12-school programme within the lifetime of the Parliament.

 

 

The most recent edition of Cummings’ essay ‘Some thoughts on education and political priorities’ (2013) sets out the rationale for the programme:

‘We know that at the top end of the ability range, specialist schools, such as the famous Russian ‘Kolmogorov schools’…show that it is possible to educate the most able and interested pupils to an extremely high level…We should give this ~2% a specialist education as per Eton or Kolmogorov, including deep problem-solving skills in maths and physics.

The first English specialist maths schools, run by King’s College and Exeter University, have been approved by the Department for Education and will open in 2014. All of the pupils will be prepared for the maths ‘STEP’ paper that Cambridge requires for entry (or Oxford’s equivalent) – an exam that sets challenging problems involving unfamiliar ways of considering familiar  material, rather than the formulaic multi-step questions of A Level.’

Back in February 2012, TES reported that:

‘The DfE has hosted a consultation meeting on the new free schools with interested parties from the mathematical community in order to outline its plans.’

‘TES understands that officials within the Department for Education are now keen to establish the schools on the model of Kolmogorov, a boarding school that selects the brightest mathematicians in Russia.’

Andrey Kolmogorov, courtesy of Svjo

 

In fact, the meeting discussed a variety of international models and, on 20 February, Education Minister Nick Gibb answered a PQ thus:

‘Alex Cunningham: To ask the Secretary of State for Education when he expects the first free school specialising in mathematics for 16 to 18 year-olds to open; how many 16 to 18 year-olds he expects to enrol in free schools specialising in mathematics by 2015; with which universities he has discussed these free schools; and what guidance he plans to provide to people who wish to apply to open such a school.

Mr Gibb: We are developing proposals on how specialist maths schools for 16 to 18-year-olds might operate and will announce further details in due course. We are keen to engage with all those who have an interest to explore possible models and innovative ideas.’ (Col. 723W).

However, no proposals were published.

The minutes from King’s College London (KCL) Council meeting of 26 June 2012 reveal that:

‘Following approval by the Principal’s Central Team, the College was pursuing discussions with the Department for Education about sponsoring one of 12 specialist Maths schools for 16-18 year olds to be established with the support of university Mathematics departments. The initiative was intended to address national deficiencies in the subject and to promote a flow of highly talented students into university. In discussion, members noted that while the financial and reputational risks and the costs in management time needed to be carefully analysed, the project supported the College’s commitment to widening participation and had the potential to enhance the strengths of the Mathematics Department and the Department of Education and Professional Services, as well as addressing a national problem. The Council approved the College’s continued engagement with this initiative.’

By December 2012 KCL had announced that it would establish a maths free school, with both its maths and education departments involved. The school was scheduled to open in September 2014.

KCL confirmed that it had received from DfE a development grant plus a parallel outreach grant to support a programme for mathematically talented 14-16 year-olds, some of whom might subsequently attend the school.

The minutes of the University of Exeter Council meeting of 13 December 2012 record that:

‘As Council were aware, Exeter was going to be a partner in an exciting regional development to set up one of the first two Maths specialist schools with Exeter College. The other school would be led by King’s College London. This would cater for talented Maths students as a Free School with intake from four counties (Devon, Cornwall, Somerset and Dorset) with a planned total number of students of 120 by September 2017. The bid was submitted to the Department of Education on 11th December and the outcome would be announced in early January, with the school opening in 2014. It would be taught by Exeter College teachers with contributions from staff in pure and applied Maths in the College of Engineering, Mathematics and Physical Sciences (CEMPS), input from the Graduate School of Education and from CEMPS students as mentors and ambassadors. It was hoped that at least some of these talented students would choose to progress to the University. Council would be kept informed of the progress of the bid.’

In January 2013 a DfE press release announced approval of this second school. It would indeed have capacity for 120 students, with Monday-Thursday boarding provision for 20% (24 students), enabling it to recruit from across the four counties named above, so acting as a ‘regional centre of excellence’.

This project had also received a development grant – which we know was up to £300K – had agreement in principle to an outreach grant and also expected to open in September 2014.

There is also reference to plans for Met Office involvement with the School.

The press release repeats that:

‘The ultimate aim is to create a network of schools that operate across England which identify and nurture mathematical and scientific talent.’

A page added to DfE’s website in March 2013 invites further expressions of interest to open maths free schools in September 2014 and beyond.

A parallel Q and A, which has now been removed, made clear that development grants would not be available to new applicants:

‘Is there financial support available to develop our plans?

Not at the beginning. Once we have approved a proposal, we do offer some support to cover the costs of project management, and recruiting some staff before the school opens, in the same way we would for any Free School.’

This has subsequently been reversed (see below).

 

Progress since March 2013

 

The Hard Sell

While KCL and Exeter developed their plans, strenuous efforts were made to encourage other universities to participate in the programme.

A TES piece from May 2013, profiling the newly-appointed head of the KCL school, includes a quote from Alison Wolf – the prominent chair of the project group at KCL:

“The Brit School is a really good comparison,” she says. “When we were working on the new school and thinking about what to do, we’d look at their website.

“Maths is very glamorous if you’re a young mathematician, which is why they’ll do well when they are around other people who adore maths.”

The story adds that 16 schools are now planned rather than the original 12, but no source is attributed to this statement. Cummings says it is a mistake.

 

 

It seems that the wider strategy at this stage was to convince other potential university sponsors that maths schools were an opportunity not to be missed, and to imply that there was already substantial interest from prominent competitors, encouraging them to climb on board for fear of missing the boat.

 

Playing the Fair Access Card

But there was soon an apparent change of tack. In June 2013, the Guardian reported that education minister Liz Truss had written to the heads of university maths departments to encourage bids.

‘As an incentive to open the new schools, universities will be allowed to fund them using budgets otherwise reserved for improving access to higher education for under-represented and disadvantaged groups….

Les Ebdon, director of Offa, said: “I’d be happy to see more university-led maths free schools because of the role they can play in helping able students from disadvantaged backgrounds access higher education.

“It is for individual universities and colleges to decide whether or not this is something they want to do, but Offa is supportive of anything that is targeted at under-represented groups and helps them to fulfil their potential.”

…According to Truss’s letter, Ebdon confirmed it would be “perfectly legitimate to allocate funding ringfenced for improving access for under-represented groups towards the establishment of such schools,” counting the spending as “widening access”.’

My initial post had pointed to the potential significance of this coupling of excellence and equity as early as November 2011:

‘It is not clear whether a fundamental purpose of these institutions is to support the Government’s drive towards greater social mobility through fair access to competitive universities. However, one might reasonably suggest it would be an oversight not to deploy them…encouraging institutions to give priority during the admissions process would be the likely solution.’

What appeared to be Ministers’ rather belated conversion to the merits of alignment with social mobility and fair access might have been interpreted as opportunism rather than a sincere effort to join together two parallel strands of Government policy, especially since it had not been identified as a central feature in either KCL’s or Exeter’s plans.

But Cummings reveals that such alignment was intended from the outset.

 

 

I can find nothing on Offa’s website confirming the statement that funding ringfenced for fair access might be allocated by universities to the development of maths free schools. There is no contemporary press notice and nothing in subsequent guidance on the content of access agreements. This raises the question of whether Ebdon’s comments constitute official Offa advice.

I asked Cummings why it took so long to get the line from Ebdon and why that line wasn’t encapsulated in Offa guidance.

 

 

The Cummings view of the dysfunctionality of central government is well-known, but to have to wait nineteen months for a brief statement on a high-priority programme – with inevitably long lead times yet time-limited to the duration of the Parliament – must have been deeply frustrating.

It would seem that Offa had to be persuaded away from sympathy with the negative views Cummings attributes to so many vice chancellors – and that this required a personal meeting at ministerial level.

But this was a priority programme with strong ministerial backing.

 

 

One must draw one’s own private conclusions about the motivations and commitment of the key protagonists – I will not apportion blame.

The text of Truss’s letter is preserved online and the identical wording appears within it:

‘I want to encourage other universities to consider whether they could run similar schools: selective, innovative and stretching our brightest and best young mathematicians. It is a logical extension of the role that dozens of universities have already played in sponsoring academies.

I also wanted to highlight to your colleagues that Professor Les Ebdon, Director of the Office for Fair Access, is enthusiastic about the role university led Maths Free Schools can have in encouraging more young people to go on to study maths at university, and to reap the benefits that brings. Professor Ebdon has also confirmed to me that he considers the sponsorship and development of Maths Free Schools as contributing to higher education ‘widening access’ activity, and that it would be perfectly legitimate to allocate funding ring-fenced for improving access for underrepresented groups towards the establishment of such schools.

Unlike our usual practice for Free Schools, there is no competitive application process for Maths Free Schools. Instead we ask interested universities to submit a short proposal setting out the key features of the school. These proposals need not be long: King’s and Exeter both submitted initial proposals that were around 12 pages…

[There follows a list of bullet points describing the content of these initial proposals, none of which address the admission of students from disadvantaged backgrounds.]

….Both King’s College and the University of Exeter had a number of detailed discussions with colleagues in the Department to develop and refine their proposals and we are always happy to work with universities to help them focus their plans before submitting a formal proposal. If we approve a proposal, we do then offer financial support to cover the costs of project management, and of recruiting some staff before the school opens, in the same way we would for any free school.’

(By way of an aside, note that the final sentence in the quotation above corrects the statement in the Q and A mentioned above. It seems that maths free schools are now treated comparably with all other free school projects in this respect, even though the application process remains different.

The latest version of free school pre-opening guidance gives the sum available in Project Development Grant for 16-19 free schools as £0.25m.)

Going back to Offa, there are no conditions imposed by Ebdon in respect of admissions to the schools, which seems a little over-relaxed, given that they might well attract a predominantly advantaged intake. I wonder whether Ebdon was content to offer personal support but refused to provide official Offa endorsement.

 

 

In July 2013 the BBC reported a speech by Truss at the 2013 ACME Conference. Oddly, the speech is not preserved on the gov.uk site. According to the BBC:

“We want this movement to spread still further,” she told delegates.

“So we’re allowing universities to apply to sponsor new maths free schools through a fast-track, simplified procedure, without having to go through the normal competitive application process.

“These schools will not only improve standards in maths teaching, but will equip talented young people from low-income backgrounds with the skills they need to study maths at university.”

Mrs Truss said the Office for Fair Access had confirmed that, when universities contributed to the sponsorship or development of maths free schools, this would be considered as one of their activities to widen access to under-represented groups – and therefore as part of their access agreement.

“I hope that this is the start of a new network of world-class free schools, under the aegis of top universities, helping to prepare talented 16- to 19-year-olds from any and every background for the demands of university study.”

Note that Ebdon’s endorsement is now Offa’s.

Cummings’ essay remarks in a footnote:

‘Other maths departments were enthusiastic about the idea but Vice Chancellor offices were hostile because of the political fear of accusations of ‘elitism’. Hopefully the recent support of Les Ebdon for the idea will change this.’

A year on, we have no evidence that it has done so. Cummings comments.

 

 

What that ‘not none’ amounts to – beyond references (reproduced later in this post) in KCL’s and Exeter’s access agreements – remains to be established; as we shall see, it does not feature prominently in the priorities of either of their schools.

 

The Soft Sell

By the beginning of the following academic year, a more subtle strategy was adopted. The two schools-in-development launched a maths competition for teams from London and the South-West with prizes awarded by education ministers.

 

 

A November 2013 DfE press release marks the ceremony. Michael Gove is quoted:

‘We need specialist maths free schools like King’s College London (KCL) Maths School and Exeter Mathematics School. They will develop the talents of exceptional young mathematicians and ensure they can compete in the global race.’

The release continues:

‘The KCL and Exeter schools are the first to take advantage of a development grant made available by the Department for Education for the creation of university-led specialist maths free schools.’

The notes include a link to the 1 March webpage mentioned above for ‘Universities interested in developing their own maths free school’.

 

Publicity avoided

We now know that a Freedom of Information request had been submitted to DfE in October 2013, asking how many expressions of interest and firm proposals had been received, which institutions had submitted these and which proposals had been approved and rejected.

The source is an ICO Decision Notice published on 12 June 2014.

The request was initially rejected and this decision was upheld in January 2014 following an internal review. A complaint was immediately lodged with the Information Commissioner’s Office.

The Decision Notice records the Commissioner’s decision that public interest outweighs the case for withholding the information. Accordingly he directs that it should be released to the complainant within 35 calendar days of the date of the Notice (ie by 17 July 2014).

The Notice contains some interesting snippets:

  • ‘It has been the DfE’s experience that interested Heads of Maths have contacted it for further information before seeking to discuss the idea with their Vice Chancellor.’ There is no process for accepting formal expressions of interest.
  • ‘There are…no fixed criteria against which all proposals are assessed.’
  • ‘The DfE confirmed that the application is and has always been the first formal stage of the maths free schools process and it has already stated publicly that it has received three applications from King’s College London, Exeter University and the University of Central Lancashire.’
  • ‘It [ie DfE] confirmed that funding arrangements were only confirmed for the development of maths free schools in February 2014 and many policy decisions on this issue have been shaped by the specifics of the two schools that are due to open soon. It expects the policy to develop even further as more maths free schools are approved.’
  • ‘The DfE explained that universities are extremely risk adverse when it comes to protecting their reputation and so do not want to be publically named until they have submitted an application. As such, if they are named at an earlier point it may make them pull out altogether and may make universities unwilling to approach the DfE with ideas.’
  • ‘Similarly, the DfE argued that if it were to release the reasons why one of the applications was rejected it would be likely to deter future interest as the university would not want the public criticism of its ideas. Given that the policy is driven by university interest, if all potential groups are deterred the policy will fail and students will not be able to enjoy the potential benefits.’

The Commissioner gave these arguments short shrift, pointing out the benefits of transparency for policy development and the encouragement of more successful applications.

The text does not say so explicitly, but one can imagine the Commissioner thinking  ‘given the low level of interest stimulated to date, you might at least try a more open strategy – what have you got to lose?’

It does seem unlikely that university heads of maths departments would submit speculative expressions of interest without internal clearance. Their approaches were presumably of the informal ‘sounding out’ variety. They would understand the shaky internal politics of failing to consult the corporate centre – not to mention their education faculties.

The lack of specific and transparent assessment criteria does appear to have backfired. What guarantees might universities otherwise receive that their proposals would be judged objectively?

One can imagine the questions:

  • Is the scheme open to all universities, Russell Group or otherwise?
  • If not, what criteria must the host university satisfy?
  • What counts as a ‘strong mathematics department?’
  • Can projects be led by university departments of education, or only undertaken jointly (as at KCL)?

Without explicit and consistent answers one can readily understand why many universities would be disinclined to pursue the idea.

Cummings disagrees strongly with this suggestion.

 

 

But I am still unconvinced. Personal experience of working with sceptical vice chancellors and their offices leads me to believe that some distinct parameters would have been better than none, provided that they were flexible parameters, in all the areas where ministers were genuinely flexible.

Some flagging up of ministerial preferences might also have been helpful, provided it was also made clear that ministers could be persuaded away from them by a strong enough bid with a different complexion.

Since ministers set so much store by the fair access dimension, and were acutely aware of the need to face down universities’ concerns about elitism, some explicit statement of the importance they attached to this dimension would not have gone amiss.

And the reference to bespoke solutions rings rather hollow when – as we shall see – the proposals from KCL and Exeter were so strikingly similar.

I suspect this difference of opinion boils down to ideology – our very different ideas about bureaucracy and how best to harness innovation. The point is moot in any case.

 

The reference to belated confirmation of funding arrangements – as recently as February 2014 – is intriguing. It cannot apply to capital funding, unless that was vired in extremis. I wondered whether it might relate to the parallel recurrent funding pot or simply the availability of project development grants.

The latter seems unlikely given the statement in the letter to HoDOMS, dated some eight months previously.

One suspects that there might have been internal difficulties in ringfencing sufficient recurrent funding to honour proposals as and when they were received. Some prospective bidders might have baulked at being told that their budget could not be confirmed until a later date.

But the eventual resolution of this issue a little over a year before the end of the spending round would be unlikely to have a significant impact on the number of successful bids, especially if unspent capital funding has to be surrendered by Spring 2015.

Cummings throws some light on this issue.

 

 

It sounds as though there were internal pressures to integrate maths free schools into the 16-19 free schools programme, where levels of bureaucracy might have caused further delay.

But these comments tend to play down the budgetary issue flagged up to the ICO. Although it might have been strictly correct that: ‘funding arrangements were only confirmed for the development of maths free schools in February 2014‘, the associated suggestion that this had been a significant factor holding up the approval of further projects seems rather more suspect.

 

Recent developments

In July 2014 the TES revealed that it had been the source of this FoI request.

 

 

But the story itself reveals little new, other than that:

‘Five further expressions of interest have been made but not yet yielded an application’

The sources of these EoIs are not listed, even though they must have been divulged to the paper by this point.

David Reynolds opines that:

‘Having a small number of schools doesn’t matter if we can get the knowledge from them around the system. So we need them to be excellent schools and we need to somehow get that knowledge around.’

A DfE statement concludes:

‘We continue to welcome applications and expressions of interest from universities and the first maths free schools, set up by two leading universities, will be opening in September.’

So we know there have been eight expressions of interest, three of them converted into firm proposals.

The receipt of the third proposal, from the University of Central Lancashire (UCLan), is said to have been made public, but I can find no record of it in the lists of Wave 1 to 7 free school applications so far released, or anywhere else for that matter. (KCL and Exeter were both included in Wave 3.)

There is a reference in UCLAN’s 2013-14 access agreement dated 31 May 2012:

‘The University is currently consulting on the formation of a Maths Free School which would be run alongside its new Engineering Innovation Centre at the Preston Campus.’

Nothing is said about the plans in the access agreements for 2014-15 and 2015-16.

There is one further reference on the New Schools Network site to a:

‘Consultant engaged to carry out a feasibility study re a Maths Free School on behalf of the University of Central Lancashire (UCLan)’.

One assumes that this must be out-of-date, unless UCLan is considering a second bid.

Otherwise, a simple process of elimination tells us that UCLan’s proposal must have been rejected. The reason for this is now presumably known to TES, as are the sources of the five expressions of interest that were not converted into proposals. Why have they not published this information?

Perhaps they are waiting for DfE to place these details on its website but, at the time of writing – almost three months after the Decision Notice was issued – this information has not been uploaded.

Meanwhile, there are no further maths free school proposals in the most recent Wave 7 information relating to applications received by 9 May 2014.

The deadline for Wave 8 is imminent. That may well be the last on this side of the Election.

Cummings reveals that there is a fourth proposal in the pipeline which is not yet ready to be made public.

 

 

One assumes a September 2015 start and we must wait to see whether it catches Wave 8.

We discussed the relationship of this proposal to the evidence submitted to the ICO. We do not know whether it features among the five expressions of interest but it might be supernumerary. Cummings is at pains to justify a cautious approach to FoI requests.

 

 

He is content to release details only at the point where development funding is committed.

So, assuming DfE is pursuing the same strategy, one can reasonably conclude that development funding has not yet been agreed for this fourth proposal. Although it has progressed beyond the status of an expression of interest, it is not yet an approved application.

Almost nine months have passed since Cummings left the Department, yet negotiations have not reached the point where development funding is confirmed. This must be a complex and sensitive negotiation indeed! Perhaps there is a Big Fish on the end of this particular hook…or perhaps the host university has cold feet. We must wait and see.

A further feature published by the TES in October 2014 throws no fresh light on these matters, though it carries a quote by new Secretary of State Nicky Morgan, interviewed at the KCL School launch:

‘I think that some [universities] are clearly waiting to see how the King’s and Exeter schools go. Clearly there is a huge amount of effort required, but I think King’s will be enormously successful, and I am hoping they will be leading by example.’

That sounds suspiciously like tacit admission that there will be no new proposals before a General Election.

Another opinion, diametrically opposed to David Reynolds’ view, is contributed by the head of the school of education at Nottingham University who is also Deputy Chair of ACME:

‘“I’m very supportive of more people doing more maths, but even if you have 12 schools, you are really scratching the surface,” said Andrew Noyes, head of the school of education at Nottingham University and a former maths teacher.

“These kinds of policy experiments are very nice and they’re beneficial for a certain number of young people, but they’re relatively cheap compared with providing high-quality maths education at every stage in every school.”’

So what are the prospects for the success of the KCL and Exeter Schools? The next section reviews the evidence so far in the public domain.

 

The KCL and Exeter Free Schools

 

KCL School

The KCL School opened in September 2014 with 68 students, against a planned admissions number of 60. The most recent TES article says that there were 130 applicants and nearly all of those successful were drawn from state schools.

However, another reliable source – a member of the governing body – says that only 85% (ie 58) are from maintained schools, so the independent sector is actually over-represented.

He adds that:

‘Many are from families where neither parent has attended university’

but that is not necessarily an indicator of disadvantage.

We also know that some 43% (29 students) were female, which is a laudable outcome.

The School is located in Lambeth Walk, some distance from KCL’s main campuses. The capital cost of refurbishing the School was seemingly £5m. It occupies two buildings and the main building is shared with a doctor’s surgery.

My March 2013 post summarised KCL’s plans, as revealed by material on the University’s site at that time, supplemented by the content of an information pack for potential heads which is no longer available online.

I have reproduced the main points below, to provide a baseline against which to judge the finished article.

  • The full roll will be 120, with an annual admission number of 60. Potential applicants must have at least 5 GCSE grades A*-C including A*/A in both maths and physics or maths and dual award science.
  • Other admissions criteria will probably include a school reference, ‘our judgement about how much difference attending the school will make to your future based on a number of factors, including the results from an interview’ and the results of a test of aptitude for problem-solving and mathematical thinking.
  • The headteacher information pack adds that ‘the school will also be committed to recruiting a significant proportion of students from socially disadvantaged backgrounds, and to an outreach programme… to further this objective.’
  • All students will take Maths, Further Maths and Physics A levels. They will be expected to take STEP papers and may take a further AS level (an FAQ suggests this will be an Extended Project). Every student will have a maths mentor, either an undergraduate or ‘a junior member of the maths department’.
  • They will also ‘continue with a broad general curriculum, including other sciences, social science, humanities and languages, and have opportunities for sport and the visual and performing arts.’ Some of this provision will be ‘delivered through existing King’s facilities’. The provisional timetable assumes a 40-hour working week, including independent study.
  • The University maths department ‘will be closely involved in curriculum development’ and academics will have ‘regular timetabled contact’, potentially via masterclasses.
  • There will be strong emphasis on collaboration with partner schools. In the longer term, the school ‘intends to seek independent funding for a larger CPD programme associated with the school’s curriculum and pedagogy, and to offer it to a wide range of  schools and students, using school premises out of hours’.

At the time of writing, the KCL Maths School website does not have a working link to the admissions policy, although it can be found online.

As expected, 60 students will be admitted in September 2015. Minimum requirements are now:

‘A or A* in GCSE Mathematics or in iGCSE Mathematics

Either an A or A* in GCSE Physics or iGCSE Physics, or an AA, A*A or A*A* in GCSE Science and GCSE Additional Science, or an A or A* in all three Physics modules contained within the GCSE Science, Additional Science and Further Additional Science qualifications; and

A*-C grade in 5 other GCSEs or other qualifications that count towards the Key Stage 4 performance tables compiled by the Department of Education, normally including English language.’

So the minimum requirement has been stiffened to at least seven GCSEs, or equivalent, including A*/A grades in maths and physics and at least a C in English language.

The application process does indeed include a reference, an aptitude test and an interview.

The test is based on KS3 national curriculum material up to Level 8, containing ‘routine and less familiar problems’. Some specimen questions are supplied.

The latest TES story says there are two interviews but this is wrong – there is one interview but two interview scores.

Cummings queries this point.

 

 

I can no longer check the original admissions policy to establish whether there was exceptionally provision for two interviews for admission in 2014, but all the other material I have seen – including the admissions policy for 2015 – refers to a single interview.

One of the two scores is ‘to assess to what extent the school is likely to add value in terms of making a difference to [candidates’] future careers’ but there is no explicit reference to priority for disadvantaged students anywhere in the admissions policy.

Indeed, the section headed Equality and Diversity says:

‘All places at King’s College London Mathematics School are offered on the basis of academic ability and aptitude.’

This does not amount to a commitment to recruit ‘a significant proportion of students from socially disadvantaged backgrounds’, as stated in the headteacher information pack.

A deputy headteacher information pack published in November 2013 had already rowed back from this, simply stating that:

‘Students will be recruited from a wide variety of backgrounds.’

The reasons for such backtracking remain unclear. Perhaps it was only ever insurance against accusations of elitism that never actually materialised.

The website confirms that all students take A levels in maths, further maths and physics, together with an AS EPQ. But now they can also take an optional AS level in computing in Year 12 and may convert it to an A level in Year 13. They will also take either the AEA or STEP papers.

The description of additional curricular provision is somewhat vague. Students will have a series of lessons and educational visits. Each fortnight a KCL lecturer will introduce a new theme, to be explored through ‘mini research projects’. Students will also learn a modern language but to what level is unclear.

A mentor will be assigned to support work for the EPQ. There will also be a maths mentor – always an undergraduate, never ‘a junior member of the maths department’ – available for one meeting a week.

Tuesday afternoons seem to be set aside for sport and exercise. Visual and performing arts will be explored through extra-curricular activity, though this is currently aspirational rather than real:

‘…the school hopes to have sufficient interest to form a student choir, orchestra and dramatic society.’

The length of the school day is six hours and 55 minutes, with five hours of lessons (though the FAQ implies that students will not have a full timetable).

The present staff complement is 10, six of whom seem to be teaching staff. The head was formerly Head of Maths at Highgate School.

Outreach continues for students in Years 10 and 11. There is also a CPD programme for those new to teaching further maths. This is funded by a £75,000 grant from the Mayor’s London Schools Excellence Fund and supports 30 teachers from six schools spread across five boroughs.

KCL’s Access Agreement for 2015/16 says:

‘King’s College London Mathematics School aims to increase substantially the number of young people with the right levels of mathematical attainment to study STEM subjects at top-rated universities. It also aims to improve access to high quality mathematical education at sixth form level and is targeting individuals from schools where such provision is not easily available (in particular, 11-16 schools and schools where further mathematics is not offered as part of the curriculum at A-level). The school has implemented an extensive outreach programme for pupils at KS4, aged 14-16, whereby pupils come to King’s College London for two hours per fortnight over a two-year period. Through this programme, the school will provide students with limited access [sic] to high quality sixth form provision the understanding and skills they need to prepare for A-levels in Maths and Further Maths should they decide to study them, and also to support applications to the maths school should they wish to make them.

The school has also just launched a programme of continuing professional development for maths teachers in London schools. The programme will run for two consecutive years, and will enable high-quality teaching of Further Maths for those new to teaching this A-level. One of the key aims of this programme is to improve take up and retention rates in A-level Further Maths, with a view to increasing numbers of well-trained applicants to STEM subjects at university.’

Exeter

The Exeter School also opened in September 2014, with 34 students, against a planned admission number of 30. Disappointingly only seven are girls. Eleven (32%) are boarders. We do not know the number of applicants.

The School is located in Rougemont House, a Grade II listed building close to the University and College. The cost of refurbishment is as yet unknown.

Relatively few details of Exeter’s plans were available at the time I wrote my previous post. The January 2013 press release revealed that:

  • As we have seen, the roll would be 120 students, 60 per year group, with boarding places available for 20% of them.
  • All students would take maths A level and the STEP paper and all would have 1:1 maths mentoring.
  • University academics would provide an ‘enrichment and critical thinking programme’.
  • The Met Office would be involved.

The 2014 admissions policy dates from September 2013.  It indicates that the School will admit 30 students in September 2014, 50 in September 2015 and 60 in September 2016. It will not reach full capacity until September 2017.

Minimum entry requirements are:

  • A* in GCSE Mathematics
  • A or A* in double sciences or single science Physics (in 2015 computer science is also acceptable as an alternative)
  • At least 6 GCSEs at C grade or above, normally to include English Language at a grade B.

So Exeter is more demanding than KCL in respect of the grades required for both GCSE maths and English language, but the minimum number of GCSEs required is one fewer.

The policy says that the School will aim for allocated places to reflect the incidence of potential students across Devon (47%) and in the other three counties served by the school (Cornwall 23%, Somerset 23%, Dorset 6%) but they will not be selected on this basis. There is nothing in the admissions criteria to secure this outcome, so the purpose of this paragraph is unclear.

The selection process involves a written application, a reference, an interview and ‘a mathematics-based entry exam’, subsequently called an aptitude test. This is described in identical terms to the test used by KCL – indeed the specimen questions are identical.

The oversubscription criteria involve giving priority to ‘interview answers and the candidates’ potential to thrive and succeed on the course’.

Under ‘Equality and Diversity’ the document says:

‘EMS is committed to widening participation and broadening access to high quality mathematics education. As such, we will target our recruitment in areas which have high levels of deprivation and in schools for which provision is currently limited, such as those without 6th forms.

EMS will encourage applications from female students through targeted marketing and recruitment. However, there will be no positive discrimination for girls in the admissions criteria.’

The first statement is largely meaningless since neither residence in a deprived area nor attendance at a school without a sixth form is mentioned explicitly in the admissions criteria.

The second statement is reflected in the fact that only 20% of the inaugural cohort is female.

The document notes that boarding will be available for learners living more than an hour distant. The proportion of boarders in the first cohort is significantly higher than expected.

It adds that boarding fees will be payable (and published on the School’s website) but it is expected they ‘will be subsidised by a government grant and a private investor’. There will also be a limited number of means-tested full bursaries, the criteria for which will also be published.

At the time of writing neither fees nor subsidies nor bursary criteria are published on the open pages of the website. It also mentions a subsidised transport scheme but provides no details. This is unhelpful to prospective candidates.

Students take A levels in maths and further maths, plus an A level in either physics or computer science. They are also prepared for STEP papers. All students pursue one further AS level at Exeter College, selecting from a choice of over 30 subjects, with the option to complete the A level in Year 13. Amongst the 30 are several non-traditional options such as fashion and design, media studies and world development. The School is clearly not wedded to facilitating subjects!

In maths students will:

‘…collaborate with those in other mathematics schools and meet, converse and work with staff and students from Exeter University’s mathematics department. They will have access to mathematical mentors from the University who will provide 1:1 and small group support for individual development and project work.’

Maths mentors will be 3rd or 4th year undergraduates and sessions will take place fortnightly.

All students will have a pastoral tutor who will ‘deliver a curriculum designed to meet the students’ development needs’. Some extra-curricular options may also be available:

‘Several clubs and societies will exist within EMS, these will be established as a result of students’ own interests. In addition, Exeter College’s specialist facilities, learning centres and other services will be accessible to them. Students will join their friends and other students from the College for sporting and enrichment activities including, for example, structured voluntary work, theatre productions and the Duke of Edinburgh’s Award Scheme.’

I could find no reference to a University-provided enrichment and critical thinking programme or to Met Office involvement.

The Head of Exeter School was formerly a maths teacher and maths AST at Torquay Boys’ Grammar School. Other staff responsibilities are not enumerated, but the Contacts page mentions only one teacher apart from the Head.

Another section of the site says the School will be advertising for a Deputy and ‘teachers of Mathematics, Computer Science and Physics (p/t)’. So the original intention to deploy Exeter College staff seems to have been set aside. Advertisements have been placed for several posts including a Pastoral Leader and an Outreach and Admissions Officer.

An outreach programme is being launched and business links will be established, but there are no details as yet. There are links to a KS4/5 maths teachers’ network sponsored by the Further Maths Support Programme.

Exeter’s 2015/16 Access Agreement says:

‘The University and the College are already joint sponsors of the innovative new Exeter Maths School and are developing a strategic approach to outreach that supports both curriculum enhancement in local schools and progression for the students enrolled in the school. Together with the South Devon UTC, these two new education providers offer opportunities for innovative collaborative approaches to outreach in the region.’

This sounds very much a work in progress.

 

 

Comparing the two schools

My 2013 post observed:

‘From the information so far published, the Exeter project seems very close conceptually to the one at King’s, indeed almost a clone. It would have been good to have seen evidence of a fundamentally different approach.’

If anything, the two projects have grown even more similar as they have matured. To the extent that these are pilot institutions testing out a diversity of models, this is not entirely helpful.

Both schools are very small and KCL in particular offers a very restricted range of post-16 qualifications. There is a downside to post-16 education on this model – otherwise we wouldn’t be exercised about the negative effects of small sixth forms – though both projects make some effort to broaden their students’ experience and, as we have seen, Exeter includes some shared provision with Exeter College.

The admissions requirements and processes are almost identical. It is important to recognise that neither institution is highly selective, especially in terms of overall GCSE performance and, in this respect, the comparisons with Kolmogorov and other institutions elsewhere in the world are rather misleading.

This is not the top 2% that Cummings cited as the beneficiaries in his essay. Even in terms of mathematical ability, the intake to these schools will be relatively broad.

The expectation that all will take STEP papers may be realistic but, despite the use of an aptitude test, any expectation of universal success is surely over-optimistic.

For Cambridge says STEP papers are ‘aimed at the top 5% or so of all A-level mathematics candidates’.  Fewer than 1,500 students took the most popular Paper 1 in 2013 and, in 2014, over 20% of participants received an Unclassified grade.

Cummings queries my conclusions here, and I have to admit that these are inferred from the evidence set out above. But, on the basis of that evidence, I would be surprised indeed if STEP results for these two schools exceed the national profile in 2016.

Cummings notes that approximately one third of those entered for STEP attend independent schools, meaning that roughly 1,000 of the 2013 cohort were in maintained institutions. There may be some marginal increase in state-funded STEP entry through these two schools, but the impact of MEI support elsewhere is likely to be more significant.

 

The priority attached to excellence is less pronounced than expected. But this is not matched (and justified) by a correspondingly stronger emphasis on equity.

Neither school gives priority within its admissions or oversubscription criteria to students from disadvantaged backgrounds. A major opportunity has been lost as a consequence.

Cummings responds

There are questions to be asked here about just how tightly the universities were held to the specifications they agreed.

There is nothing about the admission of disadvantaged students in the KCL funding agreement (I can’t find Exeter’s). It would be interesting to know what exactly they set down in their proposals, as approved.

One suspects that some effort has been made to prioritise admissions from state schools, especially state schools without sixth forms, but all this is swept up into the interview scores: there is nothing explicit and binding. The fact that 15% of the KCL intake has come from the independent sector shows that  this is insufficient.

Comparison with the admissions policy for the Harris Westminster Sixth Form is instructive:

‘Applicants who have achieved the qualifying score will then be awarded points as follows:

  • One point for the applicant’s home address…if it is in an area of high deprivation, based on an independently published assessment of levels of deprivation of postcodes;
  • One point if they qualify for, or have previously qualified for, Free School Meals.

If Year 12 is oversubscribed then, after the admission of pupils with Special Educational Needs where the Harris Westminster Sixth Form is named on the statement, the criteria will be applied in the order in which they are set out below to those who have achieved a qualifying score:

a. Looked after and former looked after young people;

b. Applicants who have 2 points in accordance with the paragraph above;

c. Applicants who have 1 point in accordance with the paragraph above…’
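Read as a procedure, this is a two-stage selection: points are awarded to every applicant who reaches the qualifying score, and the ordered criteria then decide priority only if Year 12 is oversubscribed. The sketch below is purely illustrative of that logic under my own assumptions; the Applicant record, the deprivation flag and the omission of the SEN-statement stage are simplifications of mine, not anything published by Harris Westminster.

```python
# Illustrative sketch only: the two-stage Harris Westminster oversubscription
# logic described above, under assumed data structures. Pupils whose SEN
# statement names the school (admitted first in the real policy) are omitted.

from dataclasses import dataclass


@dataclass
class Applicant:                      # hypothetical record structure
    name: str
    qualifying_score_met: bool        # achieved the academic qualifying score
    looked_after: bool                # looked after / former looked after
    deprived_postcode: bool           # home address in an area of high deprivation
    fsm_eligible: bool                # qualifies, or has qualified, for free school meals


def points(a: Applicant) -> int:
    """One point for a deprived-area home address, one for FSM eligibility."""
    return int(a.deprived_postcode) + int(a.fsm_eligible)


def oversubscription_rank(a: Applicant) -> tuple:
    """Lower tuples sort first: looked-after applicants, then two points, then one, then none."""
    return (0 if a.looked_after else 1, -points(a))


def allocate(applicants: list[Applicant], places: int) -> list[Applicant]:
    """Filter to those meeting the qualifying score, then apply the ordered criteria."""
    eligible = [a for a in applicants if a.qualifying_score_met]
    return sorted(eligible, key=oversubscription_rank)[:places]


if __name__ == "__main__":
    cohort = [
        Applicant("A", True, False, True, True),    # two points
        Applicant("B", True, True, False, False),   # looked after
        Applicant("C", True, False, False, True),   # one point
        Applicant("D", True, False, False, False),  # no points
    ]
    print([a.name for a in allocate(cohort, 2)])    # -> ['B', 'A']
```

The point is simply that disadvantage is hard-wired into the ranking itself, whereas nothing comparable appears in either maths school’s criteria.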

The funding allocations for academic year 2014/15 show that both maths free schools have been awarded zero free meals funding, suggesting that no pupils eligible for free school meals in Year 11 have been admitted.

So there is too little emphasis on excellence and equity alike. These institutions exemplify a compromise position which, while tenable, will reduce their overall impact on the system.

The only substantive difference between the two schools is that one is located in London and the other in a much more sparsely populated and geographically dispersed region. These latter conditions necessitate a boarding option for some students. The costs associated with boarding are not transparent, but one suspects that they will also serve as a brake on the recruitment of disadvantaged students.

Exeter has no real competitors in its region, other than existing sixth forms and post-16 institutions, but KCL faces stiff competition from the likes of the London Academy of Excellence and the Harris Westminster Sixth Form, both of which are much more substantial institutions offering a wider range of qualifications and, quite possibly, a richer learning experience.

Both Schools are designed to suit students who wish to specialise early and who are content with only limited opportunities to work outside that specialisation. That subgroup does not necessarily include the strongest mathematicians.

It might have been a different story if the Schools could have guaranteed progression into the most selective higher education courses, but this they cannot offer. There is no guaranteed progression even to the host universities (whose mathematics departments are not the strongest – one obvious reason why they were attracted to hosting maths schools in the first place).

Exeter and King’s no doubt expect that their Schools will help them to compete more effectively for prospective students – both through direct recruitment and, more indirectly, by raising their profile in the maths education sector – but they will not state this overtly, preferring to emphasise their contribution to improving standards system-wide.

There is no reference to independent evaluation, so one assumes that success indicators will focus on recruitment, a strong showing in the Performance Tables and especially Ofsted inspection outcomes.

A level performance must be consistently high and HE destinations must be commensurate. Because recruitment of disadvantaged students has not been a priority, fair access measures are largely irrelevant.

Other indicators should reflect the Schools’ contribution to strengthening the maths talent pipeline and maths education more generally, particularly by offering leadership at regional and national levels.

At this early stage, my judgement is that the KCL project seems rather better placed than Exeter to achieve success. It has hit the ground running while Exeter has some rapid catching up to do. One is good; the other requires improvement.

 

Future Prospects

 

Prospects for the maths school programme

With just seven months before Election Purdah, there is no prospect whatsoever that the programme will reach its target of 12 schools. Indeed it seems highly unlikely that any further projects can be brought to fruition before the end of the spending round, with the possible exception of the mysterious ‘4th proposal’.

One assumes that the Regional Schools Commissioners are now responsible for stimulating and supporting new maths school projects – though this has not been made explicit – but they already have their hands full with many other more pressing priorities.

If Labour were to win the Election it seems unlikely that they would want to extend the programme beyond the schools already established.

Even under the Conservatives it would be extremely vulnerable given its poor track record, the very tight budgetary constraints in the next spending round (especially if schools funding is no longer ringfenced) and the fact that its original champions are no longer in place at DfE.

Cummings suggests that a further five schools might be a reasonable objective for the next Parliament, but only if the commitment within DfE is sustained.

Even that unlikely prospect would result in a network of only eight schools by 2020, four short of the original target that was to have been delivered five years earlier.

With the benefit of hindsight, one might have taken a different approach to programme design and targeting. Paradoxically, the centre has appeared overly prescriptive – favouring a ‘Kolmogorov-lite’ model, ideally hosted by a Russell Group institution – but also too vague, omitting to clarify its expectations in a specification with explicit ‘non-negotiables’.

Universities were hesitant to come forward. Some will have had other fish to fry, some may have had reservations arising from fear of elitism, but more still are likely to have been unclear about the Government’s agenda and how best to satisfy it.

The belated decision to flag up the potential contribution to fair access was locking the door after the horse had bolted. Other universities will have noted that neither KCL nor Exeter paid lip service in this direction.

Cummings rejects this analysis. For him, the resistance from vice chancellors had a straightforward explanation.

According to his narrative, many university mathematicians were on the side of the angels, understanding the advantage to their departments of securing a bigger flow of undergraduates, far better prepared for university study.

But they were thwarted by the corporate centres in their institutions, the vice chancellors hamstrung by their fear of potential reputational damage, invariably associated with the charge of elitism.

Yet I have seen negligible evidence of media criticism of KCL and Exeter on these grounds, or any others for that matter.

The only occasion on which I have seen the term ‘elitism’ wielded involved the massed ranks of the Devon Branch of the NUT. Neither KCL nor Exeter has had to play the trump card of priority for disadvantaged students – indeed I have shown above how they have apparently rowed back from earlier commitments on this front.

We shall probably never know the truth since there are no records of these discussions – and I very much doubt whether any vice chancellors will read this and decide to put the record straight.

My own experience has been that, by and large, universities are reluctant to serve as instruments to further government education policy. Their knee-jerk reaction is more of the ‘not invented here’ variety and, even if they are given carte blanche, they remain highly suspicious of government motives. Fundamentally, there is an absence of trust.

An internal champion, such as Alison Wolf at KCL, can help to break this down.

 

There were also policy design issues. Because they were awarded a substantial capital budget – and were wedded to the value of free schools – ministers were driven to focus on creating new stand-alone institutions that might ultimately form a network, rather than on building the network itself.

Creating a set of maths hubs would have been the most sensible place to start, enabling new maths schools to take on the role of hubs when they were ready to do so. But the maths hubs were a later invention and, to date at least, there have been no efforts to ‘retro-fit’ the maths schools into the network, meaning that these parallel policy strands are not yet integrated.

 

Prospects for the national maths talent pipeline

England is far from having a coherent national strategy to improve maths education or, as one element within that, a convincing plan to strengthen the maths talent pipeline.

Maths education enjoys a surfeit of players with overlapping remits. National organisations include:

A host of other organisations are involved, including the Joint Mathematical Council (JMC), an umbrella body, the Advisory Committee on Mathematics Education (ACME), the United Kingdom Mathematics Trust (UKMT) and the School Mathematics Project (SMP).

This leaves to one side the maths-related element of broader programmes to support between-school collaboration, recruit teachers and develop new-style qualifications. There is a parallel set of equally complex relationships in science education.

Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline.

Ministers have been energetic in generating a series of stand-alone initiatives. The overarching vision has been sketched out in a series of set-piece speeches, but there is no plan showing how the different elements knit together to create a whole greater than the sum of its parts.

This probably has something to do with an ideological distaste for national strategies of any kind.

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

Cummings has me down as a hopeless idealist, and who am I to contest his more recent and much more wide-ranging experience? I will only say that I can still recollect the conditions under which many such obstacles can be overcome.

 


 

Last words

A network-driven approach to talent development might just work – I suggested as much at the end of my previous post – but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while as much as £60m may have to be surrendered back to the Treasury.

We are better off than we would have been without the KCL and Exeter Schools, but two schools – or perhaps three – is a drop in the ocean. Even 12 schools of this size would have been hard-pressed to drive improvement across the system.

The failure to capitalise on the potential of these projects to support progression by genuinely disadvantaged students is disappointing and deserves to be revisited.

This might have been a once-in-a-generation chance to mend the maths talent pipeline. I do hope we haven’t blown it.

 

GP

October 2014

16-19 Maths Free Schools Revisited

This post marks the opening of two university-sponsored 16-19 maths free schools with a fresh look at the wider programme that spawned them.

It scrutinises developments since the publication of ‘A Progress Report on 16-19 Maths Free Schools’ (March 2013), building on the foundations within ‘The Introduction in England of Selective 16-19 Maths Free Schools’ (November 2011).


The broad structure of the post is as follows:

  • A description of the genesis of the programme and a summary of developments up to March 2013.
  • The subsequent history of the programme, from March 2013 to the present day. This reviews efforts to recruit more university sponsors into the programme – and to resist the publication of information showing which had submitted expressions of interest and, subsequently, formal proposals.
  • An assessment of the prospects for the programme at this point and for wider efforts to expand and remodel England’s national maths talent pipeline.

Since many readers will be interested in some of these sections but not others, I have included direct links to the main text from the first word of each bullet point above.

 

Genesis and early developments

Capital investment to support the programme was confirmed in the 2011 Autumn Statement, which referred to:

‘…an extra £600 million to fund 100 additional Free Schools by the end of this parliament. This will include new specialist maths Free Schools for 16-18 year olds, supported by strong university maths departments and academics’.

This followed an orchestrated sequence of stories fed to the media immediately prior to the Statement.

One source reported a plan to establish 12 such schools in major cities by the end of the Parliament (Spring 2015) ‘before the model is expanded nationwide’. These would:

‘…act as a model for similar institutions specialising in other subjects’.

Another confirmed the number of institutions, adding that there would be ‘…a special application process outside the regular free school application process…’

A third added that the project was viewed as an important part of the Government’s strategy for economic growth, suggesting that some of the schools:

‘…would offer pure maths, while others would combine the subject with physics, chemistry or computer sciences.’

Assuming provision for 12 schools at £6m a time, the Treasury had provided a capital budget of £72m available until 2015. It remains unclear whether this sum was ringfenced for university-sponsored maths schools or could be diverted into the wider free schools programme.

We now know that former political adviser Dominic Cummings was a prime instigator of the maths free schools project – and presumably behind the press briefings outlined above.

The most recent edition of his essay ‘Some thoughts on education and political priorities’ (2013) says:

‘We know that at the top end of the ability range, specialist schools, such as the famous Russian ‘Kolmogorov schools’…show that it is possible to educate the most able and interested pupils to an extremely high level…We should give this ~2% a specialist education as per Eton or Kolmogorov, including deep problem-solving skills in maths and physics.

The first English specialist maths schools, run by King’s College and Exeter University, have been approved by the Department for Education and will open in 2014. All of the pupils will be prepared for the maths ‘STEP’ paper that Cambridge requires for entry (or Oxford’s equivalent) – an exam that sets challenging problems involving unfamiliar ways of considering familiar  material, rather than the formulaic multi-step questions of A Level.’

Back in February 2012, TES reported that:

‘The DfE has hosted a consultation meeting on the new free schools with interested parties from the mathematical community in order to outline its plans.’

‘TES understands that officials within the Department for Education are now keen to establish the schools on the model of Kolmogorov, a boarding school that selects the brightest mathematicians in Russia.’

In fact, the meeting discussed a variety of international models and, on 20 February, Education Minister Nick Gibb answered a PQ thus:

‘Alex Cunningham: To ask the Secretary of State for Education when he expects the first free school specialising in mathematics for 16 to 18 year-olds to open; how many 16 to 18 year-olds he expects to enrol in free schools specialising in mathematics by 2015; with which universities he has discussed these free schools; and what guidance he plans to provide to people who wish to apply to open such a school.

Mr Gibb: We are developing proposals on how specialist maths schools for 16 to 18-year-olds might operate and will announce further details in due course. We are keen to engage with all those who have an interest to explore possible models and innovative ideas.’ (Col. 723W).

However, no proposals were published.

The minutes from King’s College London (KCL) Council meeting of 26 June 2012 reveal that:

‘Following approval by the Principal’s Central Team, the College was pursuing discussions with the Department for Education about sponsoring one of 12 specialist Maths schools for 16-18 year olds to be established with the support of university Mathematics departments. The initiative was intended to address national deficiencies in the subject and to promote a flow of highly talented students into university. In discussion, members noted that while the financial and reputational risks and the costs in management time needed to be carefully analysed, the project supported the College’s commitment to widening participation and had the potential to enhance the strengths of the Mathematics Department and the Department of Education and Professional Services, as well as addressing a national problem. The Council approved the College’s continued engagement with this initiative.’

By December 2012 KCL had announced that it would establish a maths free school, with both its maths and education departments involved. The school was scheduled to open in September 2014.

KCL confirmed that it had received from DfE a development grant plus a parallel outreach grant to support a programme for mathematically talented 14-16 year-olds, some of whom might subsequently attend the school.

The minutes of the University of Exeter Council meeting of 13 December 2012 record that:

‘As Council were aware, Exeter was going to be a partner in an exciting regional development to set up one of the first two Maths specialist schools with Exeter College. The other school would be led by King’s College London. This would cater for talented Maths students as a Free School with intake from four counties (Devon, Cornwall, Somerset and Dorset) with a planned total number of students of 120 by September 2017. The bid was submitted to the Department of Education on 11th December and the outcome would be announced in early January, with the school opening in 2014. It would be taught by Exeter College teachers with contributions from staff in pure and applied Maths in the College of Engineering, Mathematics and Physical Sciences (CEMPS), input from the Graduate School of Education and from CEMPS students as mentors and ambassadors. It was hoped that at least some of these talented students would choose to progress to the University. Council would be kept informed of the progress of the bid.’

In January 2013 a DfE press release announced approval of this second school. It would indeed have capacity for 120 students, with Monday-Thursday boarding provision for 20% (24 students), enabling it to recruit from across the four counties named above, so acting as a ‘regional centre of excellence’.

This project had also received a development grant – which we know was up to £300K – had agreement in principle to an outreach grant and also expected to open in September 2014.

There is also reference to plans for Met Office involvement with the School.

The press release repeats that:

‘The ultimate aim is to create a network of schools that operate across England which identify and nurture mathematical and scientific talent.’

A page added to DfE’s website in March 2013 invites further expressions of interest to open maths free schools in September 2014 and beyond.

A parallel Q and A, which has now been removed, made clear that development grants would not be available to new applicants:

‘Is there financial support available to develop our plans?

Not at the beginning. Once we have approved a proposal, we do offer some support to cover the costs of project management, and recruiting some staff before the school opens, in the same way we would for any Free School.’

This has subsequently been reversed (see below).

 

Progress since March 2013

 

The Hard Sell

While KCL and Exeter developed their plans, strenuous efforts were made to encourage other universities to participate in the programme.

A TES piece from May 2013, profiling the newly-appointed head of the KCL school, includes a quote from Alison Wolf – the prominent chair of the project group at KCL:

‘’The Brit School is a really good comparison,” she says. “When we were working on the new school and thinking about what to do, we’d look at their website.

“Maths is very glamorous if you’re a young mathematician, which is why they’ll do well when they are around other people who adore maths.”

The story adds that 16 schools are now planned rather than the original 12, but no source is given for this statement.

It seems that the wider strategy at this stage was to convince other potential university sponsors that maths schools were an opportunity not to be missed, to imply that there was already substantial interest from prominent competitors, so encouraging them to climb on board for fear of missing the boat.

 

Playing the Fair Access Card

But there was soon a change of tack. In June 2013, the Guardian reported that education minister Liz Truss had written to the heads of university maths departments to encourage bids.

‘As an incentive to open the new schools, universities will be allowed to fund them using budgets otherwise reserved for improving access to higher education for under-represented and disadvantaged groups….

Les Ebdon, director of Offa, said: “I’d be happy to see more university-led maths free schools because of the role they can play in helping able students from disadvantaged backgrounds access higher education.

“It is for individual universities and colleges to decide whether or not this is something they want to do, but Offa is supportive of anything that is targeted at under-represented groups and helps them to fulfil their potential.”

…According to Truss’s letter, Ebdon confirmed it would be “perfectly legitimate to allocate funding ringfenced for improving access for under-represented groups towards the establishment of such schools,” counting the spending as “widening access”.’

My initial post had pointed to the potential significance of this coupling of excellence and equity as early as November 2011:

‘It is not clear whether a fundamental purpose of these institutions is to support the Government’s drive towards greater social mobility through fair access to competitive universities. However, one might reasonably suggest it would be an oversight not to deploy them…encouraging institutions to give priority during the admissions process would be the likely solution.’

But Ministers’ rather belated conversion to the merits of alignment with social mobility and fair access might have been interpreted as opportunism rather than a sincere effort to join together two parallel strands of Government policy, especially since it had not been identified as a central feature in either KCL’s or Exeter’s plans.

I can find nothing on Offa’s website confirming the statement that funding ringfenced for fair access might be allocated by universities to the development of maths free schools. There is no contemporary press notice and nothing in subsequent guidance on the content of access agreements. This raises the question of whether Ebdon’s comments constitute official Offa advice.

However the text of the letter is preserved online and the identical text appears within it:

‘I want to encourage other universities to consider whether they could run similar schools: selective, innovative and stretching our brightest and best young mathematicians. It is a logical extension of the role that dozens of universities have already played in sponsoring academies.

I also wanted to highlight to your colleagues that Professor Les Ebdon, Director of the Office for Fair Access, is enthusiastic about the role university led Maths Free Schools can have in encouraging more young people to go on to study maths at university, and to reap the benefits that brings. Professor Ebdon has also confirmed to me that he considers the sponsorship and development of Maths Free Schools as contributing to higher education ‘widening access’ activity, and that it would be perfectly legitimate to allocate funding ring-fenced for improving access for underrepresented groups towards the establishment of such schools.

Unlike our usual practice for Free Schools, there is no competitive application process for Maths Free Schools. Instead we ask interested universities to submit a short proposal setting out the key features of the school. These proposals need not be long: King’s and Exeter both submitted initial proposals that were around 12 pages…

[There follows a list of bullet points describing the content of these initial proposals, none of which address the admission of students from disadvantaged backgrounds.]

….Both King’s College and the University of Exeter had a number of detailed discussions with colleagues in the Department to develop and refine their proposals and we are always happy to work with universities to help them focus their plans before submitting a formal proposal. If we approve a proposal, we do then offer financial support to cover the costs of project management, and of recruiting some staff before the school opens, in the same way we would for any free school.’

(By way of an aside, note that the final emboldened sentence in the quotation above corrects the statement in the Q and A mentioned above. It seems that maths free schools are now treated comparably with all other free school projects in this respect, even though the application process remains different.

The latest version of free school pre-opening guidance gives the sum available in Project Development Grant for 16-19 free schools as £0.25m.)

Going back to Offa, there are no conditions imposed by Ebdon in respect of admissions to the schools, which seems a little over-relaxed, given that they might well attract a predominantly advantaged intake. I wonder whether Ebdon was content to offer personal support but refused to provide official Offa endorsement.

 

 

In July 2013 the BBC reported a speech by Truss at the 2013 ACME Conference. Oddly, the speech is not preserved on the gov.uk site. According to the BBC:

“We want this movement to spread still further,” she told delegates.

“So we’re allowing universities to apply to sponsor new maths free schools through a fast-track, simplified procedure, without having to go through the normal competitive application process.

“These schools will not only improve standards in maths teaching, but will equip talented young people from low-income backgrounds with the skills they need to study maths at university.”

Mrs Truss said the Office for Fair Access had confirmed that, when universities contributed to the sponsorship or development of maths free schools, this would be considered as one of their activities to widen access to under-represented groups – and therefore as part of their access agreement.

“I hope that this is the start of a new network of world-class free schools, under the aegis of top universities, helping to prepare talented 16- to 19-year-olds from any and every background for the demands of university study.”

Note that Ebdon’s endorsement is now Offa’s.

Cummings’ essay remarks in a footnote:

‘Other maths departments were enthusiastic about the idea but Vice Chancellor offices were hostile because of the political fear of accusations of ‘elitism’. Hopefully the recent support of Les Ebdon for the idea will change this.’

A year on, we have no evidence that it has done so.

 

The Soft Sell

By the beginning of the following academic year, a more subtle strategy was adopted. The two schools-in-development launched a maths competition for teams from London and the South-West with prizes awarded by education ministers.

 

 

A November 2013 DfE press release marks the ceremony. Michael Gove is quoted:

‘We need specialist maths free schools like King’s College London (KCL) Maths School and Exeter Mathematics School. They will develop the talents of exceptional young mathematicians and ensure they can compete in the global race.’

The release continues:

‘The KCL and Exeter schools are the first to take advantage of a development grant made available by the Department for Education for the creation of university-led specialist maths free schools.’

The notes include a link to the 1 March webpage mentioned above for ‘Universities interested in developing their own maths free school’.

 

Publicity avoided

We now know that a Freedom of Information request had been submitted to DfE in October 2013, asking how many expressions of interest and firm proposals had been received, which institutions had submitted these and which proposals had been approved and rejected.

The source is an ICO Decision Notice published on 12 June 2014.

The request was initially rejected and this decision was upheld in January 2014 following an internal review. A complaint was immediately lodged with the Information Commissioner’s Office.

The Decision Notice records the Commissioner’s decision that public interest outweighs the case for withholding the information. Accordingly he directs that it should be released to the complainant within 35 calendar days of the date of the Notice (ie by 17 July 2014).

The Notice contains some interesting snippets:

  • ‘It has been the DfE’s experience that interested Heads of Maths have contacted it for further information before seeking to discuss the idea with their Vice Chancellor.’ There is no process for accepting formal expressions of interest.
  • ‘There are…no fixed criteria against which all proposals are assessed.’
  • ‘The DfE confirmed that the application is and has always been the first formal stage of the maths free schools process and it has already stated publicly that it has received three applications from King’s College London, Exeter University and the University of Central Lancashire.’
  • ‘It [ie DfE] confirmed that funding arrangements were only confirmed for the development of maths free schools in February 2014 and many policy decisions on this issue have been shaped by the specifics of the two schools that are due to open soon. It expects the policy to develop even further as more maths free schools are approved.’
  • ‘The DfE explained that universities are extremely risk adverse when it comes to protecting their reputation and so do not want to be publically named until they have submitted an application. As such, if they are named at an earlier point it may make them pull out altogether and may make universities unwilling to approach the DfE with ideas.’
  • ‘Similarly, the DfE argued that if it were to release the reasons why one of the applications was rejected it would be likely to deter future interest as the university would not want the public criticism of its ideas. Given that the policy is driven by university interest, if all potential groups are deterred the policy will fail and students will not be able to enjoy the potential benefits.’

The Commissioner gave these arguments short shrift, pointing out the benefits of transparency for policy development and the encouragement of more successful applications.

The text does not say so explicitly, but one can imagine the Commissioner thinking ‘given the low level of interest stimulated to date, you might at least try a more open strategy – what have you got to lose?’

It does seem unlikely that university heads of maths departments would submit speculative expressions of interest without internal clearance. Their approaches were presumably of the informal ‘sounding out’ variety. They would understand the shaky internal politics of failing to consult the corporate centre – not to mention their education faculties.

The lack of specific and transparent assessment criteria does appear to have backfired. What guarantees might universities otherwise receive that their proposals would be judged objectively?

One can imagine the questions:

  • Is the scheme open to all universities, Russell Group or otherwise?
  • If not, what criteria must the host university satisfy?
  • What counts as a ‘strong mathematics department’?
  • Can projects be led by university departments of education, or only undertaken jointly (as at KCL)?

Without explicit and consistent answers one can readily understand why many universities would be disinclined to pursue the idea.

The reference to belated confirmation of funding arrangements – as recently as February 2014 – is intriguing. It cannot apply to capital funding, unless that was vired in extremis. Perhaps it relates to the parallel recurrent funding pot or simply the availability of project development grants.

The latter seems unlikely given the statement in the letter to HoDOMS, dated some eight months previously.

One suspects that there might have been internal difficulties in ringfencing sufficient recurrent funding to honour proposals as and when they were received. Some prospective bidders might have baulked on being told that their budget could not be confirmed until a later date.

But the eventual resolution of this issue a little over a year before the end of the spending round would be unlikely to have a significant impact on the number of successful bids, especially if unspent capital funding has to be surrendered by Spring 2015.

 

Recent developments

In July 2014 the TES revealed that it had been the source of this FoI request.

 

 

But the story reveals little new, other than that:

‘Five further expressions of interest have been made but not yet yielded an application’

The sources are not revealed.

David Reynolds opines that:

‘Having a small number of schools doesn’t matter if we can get the knowledge from them around the system. So we need them to be excellent schools and we need to somehow get that knowledge around.’

A DfE statement concludes:

‘We continue to welcome applications and expressions of interest from universities and the first maths free schools, set up by two leading universities, will be opening in September.’

So we know there have been eight expressions of interest, three of them converted into firm proposals.

The receipt of the third proposal, from the University of Central Lancashire (UCLan), is said to have been made public, but I can find no record of it in the lists of Wave 1 to 7 free school applications so far published.

There is a reference in UCLan’s 2013-14 access agreement, dated 31 May 2012:

‘The University is currently consulting on the formation of a Maths Free School which would be run alongside its new Engineering Innovation Centre at the Preston Campus.’

Nothing is said about the plans in the access agreements for 2014-15 and 2015-16.

There is one further reference on the New Schools Network site to a:

‘Consultant engaged to carry out a feasibility study re a Maths Free School on behalf of the University of Central Lancashire (UCLan)’.

One assumes that this must be out-of-date, unless UCLan is considering a second bid.

Otherwise, a simple process of elimination tells us that UCLan’s proposal must have been rejected. The reason for this is now presumably known to TES, as are the sources of the five expressions of interest that were not converted into proposals. Why have they not published this information?

Perhaps they are waiting for DfE to place these details on its website but, at the time of writing – almost three months after the Decision Notice was issued – nothing has been uploaded.

Meanwhile, there are no further maths free school proposals in the most recent Wave 7 information relating to applications received by 9 May 2014.

The deadline for Wave 8 is imminent. That may well be the last on this side of the Election.

A further feature published by the TES in October 2014 throws no fresh light on these matters, though it carries a quote by new Secretary of State Nicky Morgan, interviewed at the KCL School launch:

‘I think that some [universities] are clearly waiting to see how the King’s and Exeter schools go. Clearly there is a huge amount of effort required, but I think King’s will be enormously successful, and I am hoping they will be leading by example.’

That sounds suspiciously like a tacit admission that there will be no new proposals before a General Election.

Another opinion, diametrically opposed to David Reynolds’ view, is contributed by the head of the school of education at Nottingham University who is also Deputy Chair of ACME:

‘“I’m very supportive of more people doing more maths, but even if you have 12 schools, you are really scratching the surface,” said Andrew Noyes, head of the school of education at Nottingham University and a former maths teacher.

“These kinds of policy experiments are very nice and they’re beneficial for a certain number of young people, but they’re relatively cheap compared with providing high-quality maths education at every stage in every school.”’

So what are the prospects for the success of the KCL and Exeter Schools? The next section reviews the evidence so far in the public domain.

 

The KCL and Exeter Free Schools

 

KCL School

The KCL School opened in September 2014 with 68 students, against a planned admissions number of 60. The most recent TES article says that there were 130 applicants and nearly all of those successful were drawn from state schools.

However, another reliable source – a member of the governing body – says that only 85% (ie 58) are from maintained schools, so the independent sector is actually over-represented.

He adds that:

‘Many are from families where neither parent has attended university’

but that is not necessarily an indicator of disadvantage.

We also know that some 43% (29 students) were female, which is a laudable outcome.

The School is located in Lambeth Walk, some distance from KCL’s main campuses. The capital cost of refurbishing the School was seemingly £5m. It occupies two buildings and the main building is shared with a doctor’s surgery.

My March 2013 post summarised KCL’s plans, as revealed by material on the University’s site at that time, supplemented by the content of an information pack for potential heads which is no longer available online.

I have reproduced the main points below, to provide a baseline against which to judge the finished article.

  • The full roll will be 120, with an annual admission number of 60. Potential applicants must have at least 5 GCSE grades A*-C including A*/A in both maths and physics or maths and dual award science.
  • Other admissions criteria will probably include a school reference, ‘our judgement about how much difference attending the school will make to your future based on a number of factors, including the results from an interview’ and the results of a test of aptitude for problem-solving and mathematical thinking.
  • The headteacher information pack adds that ‘the school will also be committed to recruiting a significant proportion of students from socially disadvantaged backgrounds, and to an outreach programme… to further this objective.’
  • All students will take Maths, Further Maths and Physics A levels. They will be expected to take STEP papers and may take a further AS level (an FAQ suggests this will be an Extended Project). Every student will have a maths mentor, either an undergraduate or ‘a junior member of the maths department’.
  • They will also ‘continue with a broad general curriculum, including other sciences, social science, humanities and languages, and have opportunities for sport and the visual and performing arts.’ Some of this provision will be ‘delivered through existing King’s facilities’. The provisional timetable assumes a 40-hour working week, including independent study.
  • The University maths department ‘will be closely involved in curriculum development’ and academics will have ‘regular timetabled contact’, potentially via masterclasses.
  • There will be strong emphasis on collaboration with partner schools. In the longer term, the school ‘intends to seek independent funding for a larger CPD programme associated with the school’s curriculum and pedagogy, and to offer it to a wide range of  schools and students, using school premises out of hours’.

At the time of writing, the KCL Maths School website does not have a working link to the admissions policy, although it can be found online.

As expected, 60 students will be admitted in September 2015. Minimum requirements are now:

‘A or A* in GCSE Mathematics or in iGCSE Mathematics

Either an A or A* in GCSE Physics or iGCSE Physics, or an AA, A*A or A*A* in GCSE Science and GCSE Additional Science, or an A or A* in all three Physics modules contained within the GCSE Science, Additional Science and Further Additional Science qualifications; and

A*-C grade in 5 other GCSEs or other qualifications that count towards the Key Stage 4 performance tables compiled by the Department of Education, normally including English language.’

So the minimum requirement has been stiffened to at least seven GCSEs, or equivalent, including A*/A grades in maths and physics and at least a C in English language.
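To make the arithmetic of the stiffened requirement concrete, here is a rough sketch of the published minimum expressed as a rule check. The grade encoding, the field names and the collapsing of the iGCSE and science-route variants into a single physics grade are my own simplifying assumptions, not anything taken from KCL’s documentation.

```python
# Illustrative sketch only: a simplified check of the KCL minimum entry
# requirements quoted above. Grade encoding and field names are assumptions;
# the iGCSE and double/triple-science routes are collapsed into one grade.

GRADE_ORDER = ["A*", "A", "B", "C", "D", "E", "F", "G"]


def at_least(grade: str, threshold: str) -> bool:
    """True if `grade` is at or above `threshold` on the A*-G scale."""
    return GRADE_ORDER.index(grade) <= GRADE_ORDER.index(threshold)


def meets_kcl_minimum(maths: str, physics: str, other_gcses: dict[str, str]) -> bool:
    """
    maths       -- GCSE/iGCSE Mathematics grade (A or A* required)
    physics     -- Physics grade, or the weaker of the Science/Additional Science
                   pair (either route requires A or A*)
    other_gcses -- five or more further subjects mapped to grades, at A*-C,
                   normally including English language
    """
    if not (at_least(maths, "A") and at_least(physics, "A")):
        return False
    if sum(at_least(g, "C") for g in other_gcses.values()) < 5:
        return False
    english = other_gcses.get("English language")
    return english is None or at_least(english, "C")  # 'normally' including English


# Seven GCSEs in total: maths and physics at A*/A plus five others at A*-C.
print(meets_kcl_minimum("A*", "A", {"English language": "B", "Chemistry": "A",
                                    "History": "B", "French": "C", "Geography": "B"}))  # True
```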

The application process does indeed include a reference, an aptitude test and an interview.

The test is based on KS3 national curriculum material up to Level 8, containing ‘routine and less familiar problems’. Some specimen questions are supplied.

The latest TES story says there are two interviews but this is wrong – there is one interview but two interview scores. One of the two scores is ‘to assess to what extent the school is likely to add value in terms of making a difference to [candidates’] future careers’ but there is no explicit reference to priority for disadvantaged students anywhere in the admissions policy.

Indeed, the section headed Equality and Diversity says:

‘All places at King’s College London Mathematics School are offered on the basis of academic ability and aptitude.’

This does not amount to a commitment to recruit ‘a significant proportion of students from socially disadvantaged backgrounds’, as stated in the headteacher information pack.

The website confirms that all students take A levels in maths, further maths and physics, together with an AS EPQ. But now they can also take an optional AS level in computing in Year 12 and may convert it to an A level in Year 13. They will also take either the AEA or STEP papers.

The description of additional curricular provision is somewhat vague. Students will have a series of lessons and educational visits. Each fortnight a KCL lecturer will introduce a new theme, to be explored through ‘mini research projects’. Students will also learn a modern language but to what level is unclear.

A mentor will be assigned to support work for the EPQ. There will also be a maths mentor – always an undergraduate, never ‘a junior member of the maths department’ – available for one meeting a week.

Tuesday afternoons seem to be set aside for sport and exercise. Visual and performing arts will be explored through extra-curricular activity, though this is currently aspirational rather than real:

‘…the school hopes to have sufficient interest to form a student choir, orchestra and dramatic society.’

The length of the school day is six hours and 55 minutes, with five hours of lessons (though the FAQ implies that students will not have a full timetable).

The present staff complement is 10, six of whom seem to be teaching staff. The head was formerly Head of Maths at Highgate School.

Outreach continues for students in Years 10 and 11. There is also a CPD programme for those new to teaching further maths. This is funded by a £75,000 grant from the Mayor’s London Schools Excellence Fund and supports 30 teachers from six schools spread across five boroughs.

KCL’s Access Agreement for 2015/16 says:

‘King’s College London Mathematics School aims to increase substantially the number of young people with the right levels of mathematical attainment to study STEM subjects at top-rated universities. It also aims to improve access to high quality mathematical education at sixth form level and is targeting individuals from schools where such provision is not easily available (in particular, 11-16 schools and schools where further mathematics is not offered as part of the curriculum at A-level). The school has implemented an extensive outreach programme for pupils at KS4, aged 14-16, whereby pupils come to King’s College London for two hours per fortnight over a two-year period. Through this programme, the school will provide students with limited access [sic] to high quality sixth form provision the understanding and skills they need to prepare for A-levels in Maths and Further Maths should they decide to study them, and also to support applications to the maths school should they wish to make them.

The school has also just launched a programme of continuing professional development for maths teachers in London schools. The programme will run for two consecutive years, and will enable high-quality teaching of Further Maths for those new to teaching this A-level. One of the key aims of this programme is to improve take up and retention rates in A-level Further Maths, with a view to increasing numbers of well-trained applicants to STEM subjects at university.’

Exeter

The Exeter School also opened in September 2014, with 34 students, against a planned admission number of 30. Disappointingly only seven are girls. Eleven (32%) are boarders. We do not know the number of applicants.

The School is located in Rougemont House, a Grade II listed building close to the University and College. The cost of refurbishment is as yet unknown.

Relatively few details of Exeter’s plans were available at the time I wrote my previous post. The January 2013 press release revealed that:

  • As we have seen, the roll would be 120 students, 60 per year group, with boarding places available for 20% of them.
  • All students would take maths A level and the STEP paper and all would have 1:1 maths mentoring.
  • University academics would provide an ‘enrichment and critical thinking programme’.
  • The Met Office would be involved.

The 2014 admissions policy dates from September 2013.  It indicates that the School will admit 30 students in September 2014, 50 in September 2015 and 60 in September 2016. It will not reach full capacity until September 2017.

Minimum entry requirements are:

  • A* in GCSE Mathematics
  • A or A* in double sciences or single science Physics (in 2015 computer science is also acceptable as an alternative)
  • At least 6 GCSEs at C grade or above, normally to include English Language at a grade B.

So Exeter is more demanding than KCL in respect of the grades required for both GCSE maths and English language, but the minimum number of GCSEs required is one fewer.

The policy says that the School will aim for allocated places to reflect the incidence of potential students across Devon (47%) and in the other three counties served by the school (Cornwall 23%, Somerset 23%, Dorset 6%) but they will not be selected on this basis. There is nothing in the admissions criteria to secure this outcome, so the purpose of this paragraph is unclear.

The selection process involves a written application, a reference, an interview and ‘a mathematics-based entry exam’, subsequently called an aptitude test. This is described in identical terms to the test used by KCL – indeed the specimen questions are identical.

The oversubscription criteria involve giving priority to ‘interview answers and the candidates’ potential to thrive and succeed on the course’.

Under ‘Equality and Diversity’ the document says:

‘EMS is committed to widening participation and broadening access to high quality mathematics education. As such, we will target our recruitment in areas which have high levels of deprivation and in schools for which provision is currently limited, such as those without 6th forms.

EMS will encourage applications from female students through targeted marketing and recruitment. However, there will be no positive discrimination for girls in the admissions criteria.’

The first statement is largely meaningless since neither residence in a deprived area nor attendance at a school without a sixth form is mentioned explicitly in the admissions criteria.

The second statement is reflected in the fact that only 20% of the inaugural cohort is female.

The document notes that boarding will be available for learners living more than an hour distant. The proportion of boarders in the first cohort is significantly higher than expected.

It adds that boarding fees will be payable (and published on the School’s website) but it is expected they ‘will be subsidised by a government grant and a private investor’. There will also be a limited number of means-tested full bursaries, the criteria for which will also be published.

At the time of writing neither fees nor subsidies nor bursary criteria are published on the open pages of the website. It also mentions a subsidised transport scheme but provides no details. This is unhelpful to prospective candidates.

Students take A levels in maths and further maths, plus an A level in either physics or computer science. They are also prepared for STEP papers. All students pursue one further AS level at Exeter College, selecting from a choice of over 30 subjects, with the option to complete the A level in Year 13. Amongst these are several non-traditional options such as fashion and design, media studies and world development. The School is clearly not wedded to facilitating subjects!

In maths students will:

‘…collaborate with those in other mathematics schools and meet, converse and work with staff and students from Exeter University’s mathematics department. They will have access to mathematical mentors from the University who will provide 1:1 and small group support for individual development and project work.’

Maths mentors will be 3rd or 4th year undergraduates and sessions will take place fortnightly.

All students will have a pastoral tutor who will ‘deliver a curriculum designed to meet the students’ development needs’. Some extra-curricular options may also be available:

‘Several clubs and societies will exist within EMS, these will be established as a result of students’ own interests. In addition, Exeter College’s specialist facilities, learning centres and other services will be accessible to them. Students will join their friends and other students from the College for sporting and enrichment activities including, for example, structured voluntary work, theatre productions and the Duke of Edinburgh’s Award Scheme.’

I could find no reference to a University-provided enrichment and critical thinking programme or to Met Office involvement.

The Head of Exeter School was formerly a maths teacher and maths AST at Torquay Boys’ Grammar School. Other staff responsibilities are not enumerated, but the Contacts page mentions only one teacher apart from the Head.

Another section of the site says the School will be advertising for a Deputy and ‘teachers of Mathematics, Computer Science and Physics (p/t)’. Advertisements have been placed for several posts including a Pastoral Leader and an Outreach and Admissions Officer.

An outreach programme is being launched and business links will be established, but there are no details as yet. There are links to a KS4/5 maths teachers’ network sponsored by the Further Maths Support Programme.

Exeter’s 2015/16 Access Agreement says:

‘The University and the College are already joint sponsors of the innovative new Exeter Maths School and are developing a strategic approach to outreach that supports both curriculum enhancement in local schools and progression for the students enrolled in the school. Together with the South Devon UTC, these two new education providers offer opportunities for innovative collaborative approaches to outreach in the region.’

This sounds very much a work in progress.

 

Comparing the two schools

My 2013 post observed:

‘From the information so far published, the Exeter project seems very close conceptually to the one at King’s, indeed almost a clone. It would have been good to have seen evidence of a fundamentally different approach.’

If anything, the two projects have grown even more similar as they have matured. To the extent that these are pilot institutions testing out a diversity of models, this is not entirely helpful.

Both Schools are very small and KCL in particular offers a very restricted range of post-16 qualifications. There is a downside to post-16 education on this model – otherwise we wouldn’t be exercised about the negative effects of small sixth forms – though both projects make some effort to broaden their students’ experience and, as we have seen, Exeter includes some shared provision with Exeter College.

The admissions requirements and processes are almost identical. It is important to recognise that neither institution is highly selective, especially in terms of overall GCSE performance and, in this respect, the comparisons with Kolmogorov and other institutions elsewhere in the world are rather misleading.

This is not the top 2% that Cummings cited as the beneficiaries in his essay. Even in terms of mathematical ability, the intake to these schools will be relatively broad.

The expectation that all will take STEP papers may be realistic but, despite the use of an aptitude test, any expectation of universal success is surely over-optimistic.

Cambridge says that STEP papers are ‘aimed at the top 5% or so of all A-level mathematics candidates’. Fewer than 1,500 students took the most popular Paper 1 in 2013 and, in 2014, over 20% of participants received an Unclassified grade.

Cummings notes that approximately one third of those entered for STEP attend independent schools, meaning that roughly 1,000 of the 2013 cohort were in maintained institutions. There may be some marginal increase in state-funded STEP entry through these two schools, but the impact of MEI support elsewhere is likely to be more significant.

The priority attached to excellence is less pronounced than expected. But this is not matched (and justified) by a correspondingly stronger emphasis on equity.

Neither school gives priority within its admissions or oversubscription criteria to students from disadvantaged backgrounds. A major opportunity has been lost as a consequence.

So there is insufficient emphasis on excellence and equity alike. These institutions exemplify a compromise position which, while tenable, will reduce their overall impact on the system.

The only substantive difference between the two schools is that one is located in London and the other in a much more sparsely populated and geographically dispersed region. These latter conditions necessitate a boarding option for some students. The costs associated with boarding are not transparent, but one suspects that they will also serve as a brake on the recruitment of disadvantaged students.

Exeter has no real competitors in its region, other than existing sixth forms and post-16 institutions, but KCL faces stiff competition from the likes of the London Academy of Excellence and the Harris Westminster Sixth Form, both of which are much more substantial institutions offering a wider range of qualifications and, quite possibly, a richer learning experience.

Both Schools are designed to suit students who wish to specialise early and who are content with only limited opportunities to work outside that specialisation. That subgroup does not necessarily include the strongest mathematicians.

It might have been a different story if the Schools could have guaranteed progression into the most selective higher education courses, but this they cannot offer. There is no guaranteed progression even to the host universities (whose mathematics departments are not the strongest – one obvious reason why they were attracted to hosting maths schools in the first place).

Exeter and King’s no doubt expect that their Schools will help them to compete more effectively for prospective students – both through direct recruitment and, more indirectly, by raising their profile in the maths education sector – but they will not state this overtly, preferring to emphasise their contribution to improving standards system-wide.

There is no reference to independent evaluation, so one assumes that success indicators will focus on recruitment, a strong showing in the Performance Tables and especially Ofsted inspection outcomes.

A level performance must be consistently high and HE destinations must be commensurate. Because recruitment of disadvantaged students has not been a priority, fair access measures are largely irrelevant.

Other indicators should reflect the Schools’ contribution to strengthening the maths talent pipeline and maths education more generally, particularly by offering leadership at regional and national levels.

At this early stage, my judgement is that the KCL project seems rather better placed than Exeter to achieve success. It has hit the ground running while Exeter has some rapid catching up to do. One is good; the other requires improvement.

 

Future Prospects

 

Prospects for the maths school programme

With just seven months before Election Purdah, there is no prospect whatsoever that the programme will reach its target of 12 schools. Indeed it seems highly unlikely that any further projects can be brought to fruition before the end of the spending round.

One assumes that the Regional Schools Commissioners are now responsible for stimulating and supporting new maths school projects – though this has not been made explicit – but they already have their hands full with many other more pressing priorities.

If Labour were to win the Election it seems unlikely that they would want to extend the programme beyond the two schools already established.

Even under the Conservatives it would be extremely vulnerable given its poor track record, the very tight budgetary constraints in the next spending round (especially if schools funding is no longer ringfenced) and the fact that its original champions are no longer in place at DfE.

With the benefit of hindsight one might have taken a different approach to programme design and targeting. Paradoxically, the centre has appeared overly prescriptive – favouring a ‘Kolmogorov-lite’ model, ideally hosted by a Russell Group institution – but also too vague, omitting to clarify its expectations in a specification with explicit ‘non-negotiables’.

Universities were hesitant to come forward. Some will have had other fish to fry, some may have had reservations arising from fear of elitism, but more still are likely to have been unclear about the Government’s agenda and how best to satisfy it.

The belated decision to flag up the potential contribution to fair access was locking the door after the horse had bolted. Other universities will have noted that neither KCL nor Exeter paid lip service in this direction.

Because they were awarded a substantial capital budget – and were wedded to the value of free schools – ministers were driven to focus on creating new stand-alone institutions that might ultimately form a network, rather than on building the network itself.

A network of maths hubs would have been the most sensible place to start, enabling new maths schools to take on the role of hubs when they were ready to do so. But the maths hubs were a later invention and, to date at least, there have been no efforts to ‘retro-fit’ the maths schools into the network, meaning that these parallel policy strands are not yet integrated.

 

Prospects for the national maths talent pipeline

England is far from having a coherent national strategy to improve maths education or, as one element within that, a convincing plan to strengthen the maths talent pipeline.

Maths education enjoys a surfeit of players with overlapping remits. National organisations include:

A host of other organisations are involved, including the Joint Mathematical Council (JMC), an umbrella body, the Advisory Committee on Mathematics Education (ACME), the United Kingdom Mathematics Trust (UKMT) and the School Mathematics Project (SMP).

This leaves to one side the maths-related element of broader programmes to support between-school collaboration, recruit teachers and develop new-style qualifications. There is a parallel set of equally complex relationships in science education.

Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline.

Ministers have been energetic in generating a series of stand-alone initiatives. The overarching vision has been sketched out in a series of set-piece speeches, but there is no plan showing how the different elements knit together to create a whole greater than the sum of its parts.

This probably has something to do with an ideological distaste for national strategies of any kind.

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

A network-driven approach to talent development might just work – I suggested as much at the end of my previous post – but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while a further £60m may have to be surrendered back to the Treasury.

We are better off than we would have been without the KCL and Exeter Schools, but two schools is a drop in the ocean. Even 12 schools of this size would have been hard-pressed to drive improvement across the system.

This might have been a once-in-a-generation chance to mend the maths talent pipeline. I hope we haven’t blown it.

 

GP

October 2014

Beware the ‘short head’: PISA’s Resilient Students’ Measure

 

This post takes a closer look at the PISA concept of ‘resilient students’ – essentially a measure of disadvantaged high attainment amongst 15 year-olds – and how this varies from country to country.

The measure was addressed briefly in my recent review of the evidence base for excellence gaps in England but there was not space on that occasion to provide a thoroughgoing review.

The post is organised as follows:

  • A definition of the measure and explanation of how it has changed since the concept was first introduced.
  • A summary of key findings, including selected international comparisons, and of trends over recent PISA cycles.
  • A brief review of OECD and related research material about the characteristics of resilient learners.

I have not provided background about the nature of PISA assessments, but this can be found in previous posts about the mainstream PISA 2012 results and PISA 2012 Problem Solving.

 

Defining the resilient student

In 2011, the OECD published ‘Against the Odds: Disadvantaged students who succeed in school’, which introduced the notion of PISA as a study of resilience. It uses PISA 2006 data throughout and foregrounds science, as did the entire PISA 2006 cycle.

There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.

The international benchmark relates to the top third of PISA performers (ie above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.

Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.

This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.

The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.

The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).

‘Against the Odds’ describes it thus:

‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’

Further details are provided in the original PISA 2006 Report (p333).

Rather confusingly, the parameters of the international benchmark were subsequently changed.

PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes Volume II describes the new methodology in this fashion:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.

The new methodology is seemingly retained in PISA 2012 Results: Excellence through Equity: Giving every student the chance to succeed – Volume II:

‘A student is classed as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’

However, multiplication by four is dispensed with.

This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable with some straightforward multiplication. However, the 2006 results foreground science, while in 2009 the focus is reading – and shifts on to maths in 2012.
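
For readers who want to see how these definitions cash out in practice, here is a minimal sketch in Python (using pandas) of how a resilience share might be computed from student-level data. The column names, and the crude linear adjustment for socio-economic background, are my own simplifications rather than the OECD’s published methodology; swapping the quartile cut-offs for thirds gives something closer to the original 2006 benchmark.

```python
import numpy as np
import pandas as pd

# Hypothetical student-level data: one row per student with columns
# 'country', 'escs' (socio-economic index) and 'score' (PISA test score).
# These names are illustrative only.
def resilience_shares(df: pd.DataFrame) -> pd.DataFrame:
    # Crude adjustment for socio-economic background: residualise scores on
    # ESCS across the pooled sample (the OECD adjustment is more elaborate).
    slope = np.polyfit(df['escs'], df['score'], 1)[0]
    df = df.assign(adj_score=df['score'] - slope * df['escs'])

    # Disadvantage is defined within each country/jurisdiction...
    df['disadvantaged'] = df.groupby('country')['escs'].transform(
        lambda s: s <= s.quantile(0.25))
    # ...but the performance cut is taken across all countries combined.
    df['resilient'] = df['disadvantaged'] & (
        df['adj_score'] >= df['adj_score'].quantile(0.75))

    out = df.groupby('country')['resilient'].mean().to_frame('share_of_all')
    # PISA 2009 presentation: multiply by 4 to express the share among
    # disadvantaged students (who are one quarter of each country's sample).
    out['share_of_disadvantaged'] = out['share_of_all'] * 4
    return out
```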

Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.

 

PISA 2006 results

The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.

 

Resil 2006 science Capture

Conversely, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion (since the indicator is based on ‘top third attainment, bottom third advantage’).

I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.

I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.

Chart 1 shows these results.

 

Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)

 

All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.

Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.

The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.

Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.

The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.

The first chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.

In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.

 

‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.

 


Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006

 

Amongst the jurisdictions for which we have data there is a relatively similar pattern, with between 47% and 56% of students resilient in all three subjects.

In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.

The document summarises the outcomes thus:

‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’

 

PISA 2009 Results

The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above make them incompatible with those for 2006.

The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.

Resil PISA 2009 Capture

 


Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)

 

The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.

The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.

There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.

The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in the performance of our sample.

 


Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009

At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).

At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far behind they are at both the top and the bottom of the attainment spectrum.

 

PISA 2012 Results

In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.

 

Resil PISA 2012 Capture

Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)

 

Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.

Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, the UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.

Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.

Within the OECD average, boys have a four percentage point lead on girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy. This is most strongly seen in Poland. In the UK the difference is just two percentage points.

The comparison with disadvantaged low achievers is illustrated in Chart 6.

 


Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012

 

Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample with a longer ‘tail’ of disadvantaged low achievers than of resilient students. The reverse is true in the UK, but only just!

 

Another OECD Publication ‘Strengthening Resilience through Education: PISA Results – background document’ contains a graph showing the variance in jurisdictions’ mathematical performance by deciles of socio-economic disadvantage. This is reproduced below.

 

resil maths deciles Capture

The text adds:

‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’

One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, certainly as far as disadvantaged learners are concerned.

 

The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.

 

resil comparing PISA 2003 to 2012 capture

Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.

The UK is not included in this analysis because of issues with its PISA 2003 results.

Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.

It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.

The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.

 

resil problem solving 2012 Capture

 

Resilience in the home countries

By way of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, as compared with the top performer in my sample for each cycle and the appropriate OECD average. Problem-solving is omitted.

Only in science (using the ‘top third attainer, bottom third disadvantage’ formula) does the UK exceed the OECD average figure and then only slightly.

In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.

It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.

 


Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average

 

Unfortunately NFER does not pick up the concept of resilience in its analysis of England’s PISA 2012 results.

The only comparative analysis across the Home Countries that I can find is contained in a report prepared for the Northern Ireland Ministry of Education by NFER called ‘PISA 2009: Modelling achievement and resilience in Northern Ireland’ (March 2012).

This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.

The results show that 41% of English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.

Whether the same relationship holds true in maths and science using the ‘top quartile, bottom quartile’ methodology is unknown. One suspects though that each of the UK figures given above will also apply to England.

 

The characteristics of resilient learners

‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:

  • Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (eg. number of books in the home, parental level of education) rather than parental occupation and income.
  • In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
  • Students with an immigrant background – either born outside the country of residence or with parents who were born outside the country – tend to be under-represented amongst resilient students.
  • Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
  • Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.

Volume III of the PISA 2012 Report: ‘Ready to Learn: Students’ engagement, drive and self-beliefs’ offers a further gloss on these characteristics from a mathematical perspective:

‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…

….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.

Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…

….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’

Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:

The aforementioned NFER study for Northern Ireland uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland using PISA 2009 data.

It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; possessions (classic literature) 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0-10 books in the home 0.35.
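
For readers unfamiliar with how such odds ratios are produced, the sketch below shows the general approach in Python using statsmodels. It fits a single-level logistic regression rather than NFER’s multi-level model, and the variable names are my own stand-ins for the characteristics listed above, so treat it as an illustration of the technique rather than a reproduction of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df is a hypothetical pupil-level dataset with a binary 'resilient' outcome
# and predictor columns named by me to echo the NFER variables.
def resilience_odds_ratios(df: pd.DataFrame) -> pd.Series:
    model = smf.logit(
        'resilient ~ grammar_school + female + classic_literature_at_home'
        ' + wealth + school_fsm_pct + few_books_at_home',
        data=df,
    ).fit(disp=False)
    # Exponentiated coefficients are odds ratios: above 1 raises the odds of
    # being resilient, below 1 reduces them.
    return np.exp(model.params)
```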

On the positive impact of selection the report observes:

‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’

Another study – ‘Predicting academic resilience with mathematics learning and demographic variables’ (Cheung et al 2014) – concludes that, amongst East Asian jurisdictions such as Hong Kong, Japan and South Korea, resilience is associated with avoidance of ‘redoublement’ (grade repetition) and having attended kindergarten for more than a year.

Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.

Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.

In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.

Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.

Resilience varies in different subjects according to year of education.

resil Turkey Capture

There are also significant regional differences.

Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.

An Italian study – ‘A way to resilience: How can Italian disadvantaged students and schools close the achievement gap?’ (Agasisti and Longobardi, undated) uses PISA 2009 data to examine the characteristics of resilient students attending schools with high levels of disadvantage.

This confirms some of the findings above in respect of student characteristics, finding a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.

School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.

A second Italian study – ‘Does public spending improve educational resilience? A longitudinal analysis of OECD PISA data’ (Agasisti et al 2014) finds a positive correlation between the proportion of a country’s public expenditure devoted to education and the proportion of resilient students.

Finally, this commentary from Marc Tucker in the US links its relatively low incidence of resilient students to national views about the nature of ability:

‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability.  This leads to much lower expectations for students who come from low-income families…

My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…

… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’

 

Conclusion

What are we to make of all this?

It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.

The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.

We are not alone in facing this difficulty, although it is significantly more pronounced than in several of our most prominent PISA competitors.

We should be worrying as much about our ‘short head’ as our ‘long tail’.

 

GP

September 2014

 

 

 

 

 

 

Closing England’s Excellence Gaps: Part 2

This is the second part of an extended post considering what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.


Mind the Gap by Clicsouris

Part one provided an England-specific definition, articulated a provisional theoretical model for addressing excellence gaps and set out the published data about the size of excellence gaps at Key Stages 2, 4 and 5 respectively.

Part two continues to review the evidence base for excellence gaps, covering the question of whether high attainers remain so, international comparisons data and related research, and excellence gaps analysis from the USA.

It also describes those elements of present government policy that impact directly on excellence gaps and offers some recommendations for strengthening our national emphasis on this important issue.

 

Whether disadvantaged high achievers remain so

 

The Characteristics of High Attainers

The Characteristics of high attainers (DfES 2007) includes investigation of:

  • whether pupils in the top 10% at KS4 in 2006 were also high attainers at KS3 in 2004 and KS2 in 2001, by matching back to their fine grade point scores; and
  • chances of being a KS4 high attainer given a range of pupil characteristics at KS2 and KS3.

On the first point it finds that 4% of all pupils remain in the top 10% throughout, while 83% of pupils are never in the top 10% group.

Some 63% of those who were high attainers at the end of KS2 are still high attainers at the end of KS3, while 72% of KS3 high attainers are still in that group at the end of KS4. Approximately half of high attainers at KS2 are high attainers at KS4.

The calculation is not repeated for advantaged and disadvantaged high attainers respectively, but this shows that – while there is relatively little movement between  the high attaining population and other learners (with only 17% of the overall population falling within scope at any point) – there is a sizeable ‘drop out’ amongst high attainers at each key stage.
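
The persistence calculation itself is straightforward once pupils’ scores have been matched across key stages. A minimal sketch, with column names assumed by me rather than taken from the study:

```python
import pandas as pd

# df is a hypothetical matched pupil-level dataset with fine grade point
# scores at each key stage; the column names are mine.
def top_decile_persistence(df: pd.DataFrame) -> pd.Series:
    flags = pd.DataFrame({
        stage: df[stage].rank(pct=True) >= 0.90     # top 10% at that stage
        for stage in ('ks2_points', 'ks3_points', 'ks4_points')
    })
    stages_in_top = flags.sum(axis=1)
    return pd.Series({
        'in the top 10% at every key stage': (stages_in_top == 3).mean(),
        'never in the top 10%': (stages_in_top == 0).mean(),
        'KS2 high attainers still high attainers at KS4':
            flags.loc[flags['ks2_points'], 'ks4_points'].mean(),
    })
```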

Turning to the second point, logistic regression is used to calculate the odds of being a KS4 high attainer given different levels of prior attainment and a range of pupil characteristics. Results are controlled to isolate the impact of individual characteristics and for attainment.

The study finds that pupils with a KS2 average points score (APS) above 33 are more likely than not to be high attainers at KS4, and this probability increases as their KS2 APS increases. For those with an APS of 36, the odds are 23.73, meaning they have a 24/25 chance of being a KS4 high attainer.

For FSM-eligible learners though, the odds ratio is 0.55, meaning that the chances of being a KS4 high attainer are 45% lower amongst FSM-eligible pupils compared to their non-FSM counterparts with similar prior attainment and characteristics.
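
The conversion between odds and chances is easily checked. A minimal sketch (mine, not the study’s):

```python
def odds_to_probability(odds: float) -> float:
    """Convert odds, p / (1 - p), back to a probability p."""
    return odds / (1 + odds)

print(round(odds_to_probability(23.73), 2))   # 0.96 - roughly a 24-in-25 chance
# An odds ratio of 0.55 means the odds (strictly, rather than the raw chances)
# are 45% lower for FSM-eligible pupils with similar prior attainment.
print(1 - 0.55)                               # 0.45
```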

The full set of findings for individual characteristics is reproduced below.

Ex gap Capture 7

 

An appendix supplies the exact ratios for each characteristic and the text points out that these can be multiplied to calculate odds ratios for different combinations.

The odds for different prior attainment levels and other characteristics combined with FSM eligibility are not worked through, but could easily be calculated. It would be extremely worthwhile to repeat this analysis using more recent data to see whether the results would be replicated for those completing KS4 in 2014.
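
By way of illustration only, combining the two figures quoted above would work as follows; the function is mine and simply applies the multiplication the report describes.

```python
def combined_odds(base_odds: float, *odds_ratios: float) -> float:
    """Multiply baseline odds by one or more odds ratios for extra characteristics."""
    for ratio in odds_ratios:
        base_odds *= ratio
    return base_odds

# A pupil with a KS2 APS of 36 (odds 23.73) who is also FSM-eligible (ratio 0.55):
odds = combined_odds(23.73, 0.55)      # ~13.1
probability = odds / (1 + odds)        # ~0.93, still a strong chance, but lower
```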

 

Sutton Trust

In 2008, the Sutton Trust published ‘Wasted talent? Attrition rates of high achieving pupils between school and university’ which examines the attrition rates for FSM-eligible learners among the top 20% of performers at KS2, KS3 and KS4.

A footnote says that this calculation was ‘on the basis of their English and maths scores at age 11, and at later stages of schooling’, which is somewhat unclear. A single, unidentified cohort is tracked across key stages.

The report suggests ‘extremely high rates of ‘leakage’ amongst the least privileged pupils’. The key finding is that two-thirds of disadvantaged top performers at KS2 are not amongst the top performers at KS4, whereas 42% of advantaged top performers are not.

 

EPPSE

Also in the longitudinal tradition, ‘Performing against the odds: developmental trajectories of children in the EPPSE 3-16 study’ (Siraj-Blatchford et al, June 2011) investigated through interviews the factors that enabled a small group of disadvantaged learners to ‘succeed against the odds’.

Twenty learners were identified who were at the end of KS3 or at KS4 and who had achieved well above predicted levels in English and maths at the end of KS2. Achievement was predicted for the full sample of 2,800 children within the EPPSE study via multi-level modelling, generating:

‘…residual scores for each individual child, indicating the differences between predicted and attained achievement at age 11, while controlling for certain child characteristics (i.e., age, gender, birth weight, and the presence of developmental problems) and family characteristics (i.e., mothers’ education, fathers’ education, socio-economic status [SES] and family income). ‘

The 20 identified as succeeding against the odds had KS2 residual scores for both English and maths within the highest 20% of the sample. ‘Development trajectories’ were created for the group using a range of assessments conducted at age 3, 4, 5, 7, 11 and 14.
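
A rough sketch of that selection step is shown below. It substitutes a plain OLS regression for the study’s multi-level models and invents its own column names, so it illustrates the logic of ‘top 20% of residuals in both subjects’ rather than replicating EPPSE’s analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

def succeed_against_the_odds(df: pd.DataFrame) -> pd.Series:
    # Controls echo those listed in the quotation above; names are mine.
    controls = ('age + gender + birth_weight + developmental_problems'
                ' + mother_education + father_education + ses + family_income')
    flags = []
    for subject in ('ks2_english', 'ks2_maths'):
        # Residual = attained minus predicted achievement at age 11.
        resid = smf.ols(f'{subject} ~ {controls}', data=df).fit().resid
        flags.append(resid >= resid.quantile(0.80))    # top 20% of residuals
    # Flag children whose residuals are in the top 20% for BOTH subjects.
    return flags[0] & flags[1]
```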

The highest job level held in the family when the children were aged 3-4 was manual, semi-skilled or unskilled, or the parent(s) had never worked.

The 20 were randomly selected from each gender – eight boys and 12 girls – while ensuring representation of ‘the bigger minority ethnic groups’. It included nine students characterised as White UK, five Black Caribbean, two Black African and one each of Indian (Sikh), Pakistani, Mixed Heritage and Indian (Hindu).

Interviews were conducted with the children, their parents and the teacher at their present secondary school whom the learners felt ‘knew them best’. Teacher interviews were secured for 11 of the 20.

Comparison of development trajectories showed significant gaps between this ‘low SES high attainment’ group and a comparative sample of ‘low SES, predicted attainment’ students. They were ahead from the outset and pulled further away.

They also exceeded a comparator group of high SES learners performing at predicted levels from entry to primary education until KS2. Even at KS3, 16 of the 20 were still performing above the mean of the high SES sample.

These profiles – illustrated in the two charts below – were very similar in English and maths respectively. In either case, Group 1 are those with ‘low SES, high attainment’, while Group 4 are ‘high SES predicted attainment’ students.

 

Supp exgap Eng Capture

Supp exgap Maths Capture

 

Interviews identified five factors that helped to explain this success:

  • The child’s perceived cognitive ability, strong motivation for school and learning and their hobbies and interests. Most parents and children regarded cognitive ability as ‘inherent to the child’, but they had experienced many opportunities to develop their abilities and received support in developing a ‘positive self-image’. Parenting ‘reflected a belief in the parent’s efficacy to positively influence the child’s learning’. Children also demonstrated ability to self-regulate and positive attitudes to homework. They had a positive attitude to learning and made frequent use of books and computers for this purpose. They used school and learning as distractions from wider family problems. Many were driven to learn, to succeed educationally and achieve future aspirations.
  • Home context – effective practical and emotional support with school and learning. Families undertook a wide range of learning activities, especially in the early years. These were perceived as enjoyable but also valuable preparation for subsequent schooling. During the primary years, almost all families actively stimulated their children to read. In the secondary years, many parents felt their efforts to regulate their children’s activities and set boundaries were significant. Parents also provided practical support with school and learning, taking an active interest and interacting with their child’s school. Their parenting style is described as ‘authoritative: warm, firm and accepting of their needs for psychological autonomy but demanding’. They set clear standards and boundaries for behaviour while granting extra autonomy as their children matured. They set high expectations and felt strongly responsible for their child’s education and attitude to learning. They believed in their capacity to influence their children positively. Some were motivated by the educational difficulties they had experienced.
  • (Pre-)School environment – teachers who are sensitive and responsive to the child’s needs and use ‘an authoritative approach to teaching and interactive teaching strategies’; and, additionally, supportive school policies. Parents had a positive perception of the value of pre-school education, though the value of highly effective pre-school provision was not clear cut with this sample. Moreover ‘very few clear patterns of association could be discerned between primary school effectiveness and development of rankings on trajectories’. That said both parents and children recognised that their schools had helped them address learning and behavioural difficulties. Success was attributed to the quality of teachers. ‘They thought that good quality teaching meant that teachers were able to explain things clearly, were enthusiastic about the subject they taught, were approachable when things were difficult to understand, were generally friendly, had control over the class and clearly communicated their expectations and boundaries.’
  • Peers providing practical, emotional and motivational support. Friends were especially valuable in helping them to respond to difficulties, helping in class, with homework and revision. Such support was often mutual, helping to build understanding and develop self-esteem, as a consequence of undertaking the role of teacher. Friends also provided role models and competitors.
  • Similar support provided by the extended family and wider social, cultural and religious communities. Parents encouraged their children to take part in extra-curricular activities and were often aware of their educational benefits. Family networks often provided additional learning experiences, particularly for Caribbean and some Asian families.

 

Ofsted

Ofsted’s The most able students: Are they doing as well as they should in our non-selective secondary schools? (2013) defines this population rather convolutedly as those:

‘…starting secondary school in Year 7 attaining level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and or mathematics at the end of Key Stage 2.’ (Footnote p6-7)

There is relatively little data in the report about the performance of high-attaining disadvantaged learners, other than the statement that only 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points.

I have been unable to find national transition matrices for advantaged and disadvantaged learners, which would enable us to compare the proportion of advantaged and disadvantaged pupils making and exceeding the expected progress between key stages.

 

Regression to the mean and efforts to circumvent it

Much prominence has been given to Feinstein’s 2003 finding that, whereas high-scoring children from advantaged and disadvantaged backgrounds (defined by parental occupation) perform at a broadly similar level when tested at 22 months, the disadvantaged group are subsequently overtaken by relatively low-scoring children from advantaged backgrounds during the primary school years.

The diagram that summarises this relationship has been reproduced widely and much used as the centrepiece of arguments justifying efforts to improve social mobility.

Feinstein Capture

But Feinstein’s findings were subsequently challenged on methodological grounds associated with the effects of regression to the mean.

Jerrim and Vignoles (2011) concluded:

‘There is currently an overwhelming view amongst academics and policymakers that highly able children from poor homes get overtaken by their affluent (but less able) peers before the end of primary school. Although this empirical finding is treated as a stylised fact, the methodology used to reach this conclusion is seriously flawed. After attempting to correct for the aforementioned statistical problem, we find little evidence that this is actually the case. Hence we strongly recommend that any future work on high ability–disadvantaged groups takes the problem of regression to the mean fully into account.’

On the other hand, Whitty and Anders comment:

‘Although some doubt has been raised regarding this analysis on account of the potential for regression to the mean to exaggerate the phenomenon (Jerrim and Vignoles, 2011), it is highly unlikely that this would overturn the core finding that high SES, lower ability children catch up with their low-SES, higher-ability peers.’

Their point is borne out by Progress made by high-attaining children from disadvantaged backgrounds (June 2014), which suggests that Vignoles, as part of the writing team, has changed her mind somewhat since 2011.

This research adopts a methodological route to minimise the impact of regression to the mean. This involves assigning learners to achievement groups using a different test to those used to follow their attainment trajectories and focusing principally on those trajectories from KS2 onwards.

The high attaining group is defined as those achieving Level 3 or above in KS1 writing, which selects 12.6% of the sample. (For comparison, the same calculations are undertaken based on achieving L3 or above in KS1 maths.) These pupils are ranked and assigned a percentile on the basis of their performance on the remaining KS1 tests and at each subsequent key stage.
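
In outline, the approach looks something like the sketch below: the group is selected on one assessment (KS1 writing) and then followed using percentile ranks on different assessments, which is what limits regression to the mean. Column names are assumed, not taken from the study’s dataset.

```python
import pandas as pd

def mean_percentile_trajectories(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    stages = ['ks1_other', 'ks2', 'ks3', 'ks4']
    for stage in stages:
        # Percentile rank within the full cohort at each stage.
        df[f'{stage}_pct'] = df[stage].rank(pct=True) * 100
    # High attainers are selected on a DIFFERENT assessment (KS1 writing,
    # Level 3 or above) from those used to track them.
    high = df[df['ks1_writing_level'] >= 3]
    # Average trajectory by deprivation quintile (1 = least, 5 = most deprived).
    cols = [f'{s}_pct' for s in stages]
    return high.groupby('deprivation_quintile')[cols].mean()
```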

The chart summarising the outcomes in the period from KS1 to KS4 is reproduced below, showing the different trajectories of the ‘most deprived’ and ‘least deprived’. These are upper and lower quintile groups of state school students derived on the basis of FSM eligibility and a set of area-based measures of disadvantage and measures of socio-economic status derived from the census.

 

Ex gap 8 Capture

The trajectories do not alter significantly beyond KS4.

The study concludes:

‘…children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’

Hence:

‘The period between Key Stage 2 and Key Stage 4 appears to be a crucial time to ensure that higher-achieving pupils from poor backgrounds remain on a high achievement trajectory.’

In short, a Feinstein-like relationship is established but it operates at a somewhat later stage in the educational process.

 

International comparisons studies

 

PISA: Resilience

OECD PISA studies have recently begun to report on the performance of what they call ‘resilient’ learners.

Against the Odds: Disadvantaged Students Who Succeed in Schools (OECD, 2011) describes this population as those who fall within the bottom third of their country’s distribution by socio-economic background, but who achieve within the top third on PISA assessments across participating countries.

This publication uses PISA 2006 science results as the basis of its calculations. The relative position of different countries is shown in the chart reproduced below. Hong Kong tops the league at 24.8%, the UK is at 13.5%, slightly above the OECD average of 13%, while the USA is languishing at 9.9%.

[Chart: percentage of resilient students by country, PISA 2006 science]

The findings were discussed further in PISA in Focus 5 (OECD 2011), where PISA 2009 data is used to make the calculation. The methodology is also significantly adjusted so that it includes a substantially smaller population:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

According to this measure, the UK is at 24% and the US has leapfrogged them at 28%. Both are below the OECD average of 31%, while Shanghai and Hong Kong stand at over 70%.

The Report on PISA 2012 (OECD 2013) retains the more demanding definition of resilience, but dispenses with the multiplication by 4, so its results must be multiplied by four to be comparable with those for 2009.

This time round, Shanghai is at 19.2% (76.8%) and Hong Kong at 18.1% (72.4%). The OECD average is 6.4% (25.6%), the UK at 5.8% (23.2%) and the US at 5.2% (20.8%).
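
The conversion between the two reporting bases is simple arithmetic: because the disadvantaged group is the bottom quarter, the share of resilient students among all students is multiplied by four to give the share among disadvantaged students. A quick sketch using the figures above:

```python
# Convert the PISA 2012 figures (share of all students) to the 2009 reporting
# basis (share of disadvantaged students) by multiplying by 4, since the
# disadvantaged group is defined as the bottom quarter of the ESCS index.
resilient_share_of_all_students = {
    "Shanghai": 19.2, "Hong Kong": 18.1, "OECD average": 6.4,
    "UK": 5.8, "US": 5.2,
}
resilient_share_of_disadvantaged = {
    country: round(share * 4, 1)
    for country, share in resilient_share_of_all_students.items()
}
print(resilient_share_of_disadvantaged)
# {'Shanghai': 76.8, 'Hong Kong': 72.4, 'OECD average': 25.6, 'UK': 23.2, 'US': 20.8}
```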

So the UK has lost a little ground compared with 2009, but is much closer to the OECD average and has overtaken the US, which has fallen back by some seven percentage points.

I could find no commentary on these changes.

NFER has undertaken some work on resilience in Northern Ireland, using PISA 2009 reading results (and the original ‘one third’ methodology) as a base. This includes odds ratios for different characteristics of being resilient. This could be replicated for England using PISA 2012 data and the latest definition of resilience.

 

Research on socio-economic gradients

The Socio-Economic Gradient in Teenagers’ Reading Skills: How Does England Compare with Other Countries? (Jerrim 2012) compares the performance of students within the highest and lowest quintiles of the ISEI Index of Occupational Status on the PISA 2009 reading tests.

It quantifies the proportion of these two populations within each decile of  achievement, so generating a gradient, before reviewing how this gradient has changed between PISA 2000 and PISA 2009, comparing outcomes for England, Australia, Canada, Finland, Germany and the US.

Jerrim summarises his findings thus:

‘The difference between advantaged and disadvantaged children’s PISA 2009 reading test scores in England is similar (on average) to that in most other developed countries (including Australia, Germany and, to some extent, the US). This is in contrast to previous studies from the 1990s, which suggested that there was a particularly large socio-economic gap in English pupils’ academic achievement.

Yet the association between family background and high achievement seems to be stronger in England than elsewhere.

There is some evidence that the socio-economic achievement gradient has been reduced in England over the last decade, although not amongst the most able pupils from advantaged and disadvantaged homes.’

Jerrim finds that the link in England between family background and high achievement is stronger than in most other OECD countries, whereas this is not the case at the other end of the distribution.

He hypothesises that this might be attributable to recent policy focus on reducing the ‘long tail’ while:

‘much less attention seems to be paid to helping disadvantaged children who are already doing reasonably well to push on and reach the top grades’.

He dismisses the notion that the difference is associated with the fact that  disadvantaged children are concentrated in lower-performing schools, since it persists even when controls for school effects are introduced.

In considering why PISA scores show the achievement gap in reading has reduced between 2000 and 2009 at the lower end of the attainment distribution but not at the top, he cites two possibilities: that Government policy has been disproportionately successful at the lower end; and that there has been a more substantial decline in achievement amongst learners from advantaged backgrounds than amongst their disadvantaged peers. He is unable to rule out the latter possibility.

He also notes in passing that PISA scores in maths do not generate the same pattern.

These arguments are further developed in ‘The Reading Gap: The socio-economic gap in children’s reading skills: A cross-national comparison using PISA 2009’ (Jerrim, 2013) which applies the same methodology.

This finds that high-achieving (top decile of the test distribution) boys from the most advantaged quintile in England are two years and seven months ahead of high-achieving boys from the most disadvantaged quintile, while the comparable gap for girls is slightly lower, at two years and four months.

The chart reproduced below illustrates international comparisons for boys. It shows that only Scotland has a larger high achievement gap than England. (The black lines indicate 99% confidence intervals – he attributes the uncertainty to ‘sampling variation’.)

Gaps in countries at the bottom of the table are approximately half the size of those in England and Scotland.

[Chart: high achievement gaps for boys by country, PISA 2009 reading]

 

One of the report’s recommendations states:

‘The coalition government has demonstrated its commitment to disadvantaged pupils by establishing the Education Endowment Foundation… A key part of this Foundation’s future work should be to ensure highly able children from disadvantaged backgrounds succeed in school and have the opportunity to enter top universities and professional jobs. The government should provide additional resources to the foundation to trial interventions that specifically target already high achieving children from disadvantaged homes. These should be evaluated using robust evaluation methodologies (e.g. randomised control trials) so that policymakers develop a better understanding of what schemes really have the potential to work.’

The study is published by the Sutton Trust whose Chairman – Sir Peter Lampl – is also chairman of the EEF.

In ‘Family background and access to high ‘status’ universities’ (2013) Jerrim provides a different chart showing, country by country, estimates of the proportion of disadvantaged learners who are high-achieving. The measure of achievement is PISA Level 5 in reading and the measure of disadvantage remains quintiles derived from the ISEI index.

[Chart: estimates by country of disadvantaged learners achieving PISA Level 5 in reading]

The underlying figures are not supplied.

Also in 2013, in ‘The mathematical skills of school children: how does England compare to the high-performing East Asian jurisdictions?’ Jerrim and Choi construct a similar gradient for maths, drawing on a mix of PISA and TIMSS assessments conducted between 2003 and 2009, so enabling them to consider variation according to the age at which assessment takes place.

The international tests selected are TIMSS 2003, 4th grade; TIMSS 2007, 8th grade and PISA 2009. The differences between what these tests measure are described as ‘slight’. The analysis of achievement relies on deciles of the achievement distribution.

Thirteen comparator countries are included: six wealthy western economies, three ‘middle income’ western economies and four Asian Tigers (Hong Kong, Japan, Singapore and Taiwan).

This study uses the number of books in the family home as the best available proxy for socio-economic status, comparing the most advantaged (over 200 books) with the least (under 25 books). It acknowledges the limitations of this proxy, which Jerrim discusses elsewhere.

The evidence suggests that:

‘between primary school and the end of secondary school, the gap between the lowest achieving children in England and the lowest achieving children in East Asian countries is reduced’

but remains significant.

Conversely, results for the top 10% of the distribution:

‘suggest that the gap between the highest achieving children in England and the highest achieving children in East Asia increases between the end of primary school and the end of secondary school’.

The latter outcome is illustrated in the chart reproduced below.

[Chart: gap between the highest-achieving children in England and East Asia, by stage of assessment]

 

The authors do not consider variation by socio-economic background amongst the high-achieving cohort, presumably because the data still does not support the pattern they previously identified for reading.

 

US studies

In 2007 the Jack Kent Cooke Foundation published ‘Achievement Trap: How America is Failing Millions of High-Achieving Students from Low Income Backgrounds’ (Wyner, Bridgeland and Diiulio). The text was subsequently revised in 2009.

This focuses exclusively on gaps attributable to socio-economic status, by comparing the performance of those in the top and bottom halves of the family income distribution in the US, as adjusted for family size.

The achievement measure is top quartile performance on nationally normalised exams administered within two longitudinal studies: The National Education Longitudinal Study (NELS) and the Baccalaureate and Beyond Longitudinal Study (B&B).

The study reports that relatively few lower income students remain high achievers throughout their time in elementary and high school:

  • 56% remain high achievers in reading by Grade 5, compared with 69% of higher income students.
  • 25% fall out of the high achiever cohort in high school, compared with 16% of higher income students.
  • Higher income learners who are not high achievers in Grade 1 are more than twice as likely as their lower income peers to be high achievers by Grade 5. The same is true between Grades 8 and 12.

2007 also saw the publication of ‘Overlooked Gems: A national perspective on low income promising learners’ (Van Tassel-Baska and Stambaugh). This is a compilation of the proceedings of a 2006 conference which does not attempt a single definition of the target group, but draws on a variety of different research studies and programmes, each with different starting points.

An influential 2009 McKinsey study, ‘The Economic Impact of the Achievement Gap in America’s Schools’, acknowledges the existence of what it calls a ‘top gap’. Its authors use the term with reference to:

  • the number of top performers and the level of top performance in the US compared with other countries and
  • the gap in the US between the proportion of Black/Latino students and the proportion of all students achieving top levels of performance.

The authors discuss the colossal economic costs of achievement gaps more generally, but fail to extend this analysis to the ‘top gap’ specifically.

In 2010 ‘Mind the Other Gap: The Growing Excellence Gap in K-12 Education’ (Plucker, Burroughs and Song) was published – and seems to have been the first study to use this term.

The authors define such gaps straightforwardly as

‘Differences between subgroups of students performing at the highest levels of achievement’

The measures of high achievement deployed are the advanced standards on US NAEP maths and reading tests, at Grades 4 and 8 respectively.

The study identifies gaps based on four sets of learner characteristics:

  • Socio-economic status (eligible or not for free or reduced price lunch).
  • Ethnic background (White versus Black and/or Hispanic).
  • English language proficiency (what we in England would call EAL, compared with non-EAL).
  • Gender (girls versus boys).

Each characteristic is dealt with in isolation, so there is no discussion of the gaps between – for example – disadvantaged Black/Hispanic and disadvantaged White boys.

In relation to socio-economic achievement gaps, Plucker et al find that:

  • In Grade 4 maths, from 1996 to 2007, the proportion of advantaged learners achieving the advanced level increased by 5.6 percentage points, while the proportion of disadvantaged learners doing so increased by 1.2 percentage points. In Grade 8 maths, these percentage point changes were 5.7 and 0.8 percentage points respectively. Allowing for changes in the size of the advantaged and disadvantaged cohorts, excellence gaps are estimated to have widened by 4.1 percentage points in Grade 4 (to 7.3%) and 4.9 percentage points in Grade 8 (to 8.2%).
  • In Grade 4 reading, from 1998 to 2007, the proportion of advantaged learners achieving the advanced level increased by 1.2 percentage points, while the proportion of disadvantaged students doing so increased by 0.8 percentage points. In Grade 8 reading, these percentage point changes were almost negligible for both groups. The Grade 4 excellence gap is estimated to have increased slightly, by 0.4 percentage points (to 9.4%), whereas the Grade 8 gap has increased minimally, by 0.2 percentage points (to 3.1%). (The arithmetic behind these estimates is sketched below.)
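
The sketch below reproduces only the naive difference between the two sets of increases; it does not match the report’s estimates exactly in every case, because those also allow for changes in the size of the two cohorts.

```python
# Naive (unadjusted) widening of the excellence gap: the difference between
# the percentage-point increases for advantaged and disadvantaged learners.
# The report's own estimates differ slightly where cohort sizes changed.
naep_advanced_increases = {            # (advantaged, disadvantaged) in pp
    "Grade 4 maths, 1996-2007": (5.6, 1.2),
    "Grade 8 maths, 1996-2007": (5.7, 0.8),
    "Grade 4 reading, 1998-2007": (1.2, 0.8),
}
for label, (adv, dis) in naep_advanced_increases.items():
    print(f"{label}: naive widening = {adv - dis:.1f} percentage points")
```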

They observe that excellence gaps are, at best, only moderately correlated with gaps at lower levels of achievement.

There is a weak relationship between gaps at basic and advanced level – indeed ‘smaller achievement gaps among minimally competent students is related to larger gaps among advanced students’ – but there is some inter-relationship between those at proficient and advanced level.

They conclude that, whereas No Child Left Behind (NCLB) helped to narrow achievement gaps, this does not extend to high achievers.

There is no substantive evidence that the NCLB focus on lower achievers has increased the excellence gap, although the majority of states surveyed by the NAGC felt that NCLB had diverted attention and resource away from gifted education.

‘Do High Fliers Maintain their Altitude?’ (Xiang et al, 2011) provides a US analysis of whether individual students remain high achievers throughout their school careers.

They do not report outcomes for disadvantaged high achievers, but do consider briefly those attending schools with high and low proportions respectively of students eligible for free and reduced price lunches.

For this section of the report, high achievement is defined as ‘those whose math or reading scores placed them within the top ten per cent of their individual grades and schools’. Learners were tracked from Grades 3 to 5 and Grades 6 to 8.

It is described as exploratory, because the sample was not representative.

However:

‘High-achieving students attending high-poverty schools made about the same amount of academic growth over time as their high-achieving peers in low-poverty schools…It appears that the relationship between a school’s poverty rate and the growth of its highest-achieving students is weak. In other words, attending a low-poverty school adds little to the average high achiever’s prospects for growth.’

The wider study was criticised in a review by the NEPC, in part on the grounds that the results may have been distorted by regression to the mean, a shortcoming only briefly discussed in an appendix.

The following year saw the publication of ‘Unlocking Emergent Talent: Supporting High Achievement of Low-Income, High-Ability Students’ (Olszewski-Kubilius and Clarenbach, 2012).

This is the report of a national summit on the issue convened in that year by the NAGC.

It follows Plucker (one of the summit participants) in using as its starting point the achievement of the advanced level on selected NAEP assessments by learners eligible for free and reduced price lunches.

But it also reports some additional outcomes for Grade 12 and for assessments of civics and writing:

  • ‘Since 1998, 1% or fewer of 4th-, 8th-, and 12th-grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.
  • Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’

The bulk of the report is devoted to identifying barriers to progress and offering recommendations for improving policy, practice and research. I provided an extended analysis in this post from May 2013.

Finally, ‘Talent on the Sidelines: Excellence Gaps and America’s Persistent Talent Underclass’ (Plucker, Hardesty and Burroughs 2013) is a follow-up to ‘Mind the Other Gap’.

It updates the findings in that report, as set out above:

  • In Grade 4 maths, from 1996 to 2011, the proportion of advantaged students scoring at the advanced level increased by 8.3 percentage points, while the proportion of disadvantaged learners doing so increased by 1.5 percentage points. At Grade 8, the comparable changes were 8.5 percentage points and 1.5 percentage points respectively. Excellence gaps have increased by 6.8 percentage points at Grade 4 (to 9.6%) and by 7 percentage points at Grade 8 (to 10.3%).
  • In Grade 4 reading, from 1998 to 2011, the proportion of advantaged students scoring at the advanced level increased by 2.6 percentage points, compared with an increase of 0.9 percentage points amongst disadvantaged learners. Grade 8 saw equivalent increases of 1.8 and 0.9 percentage points respectively. Excellence gaps are estimated to have increased at Grade 4 by 1.7 percentage points (to 10.7%) and marginally increased at Grade 8 by 0.9 percentage points (to 4.2%).

In short, many excellence gaps remain large and most continue to grow. The report’s recommendations are substantively the same as those put forward in 2010.

 

How Government education policy impacts on excellence gaps

Although many aspects of Government education policy may be expected to have some longer-term impact on raising the achievement of all learners, advantaged and disadvantaged alike, relatively few interventions are focused exclusively and directly on closing attainment gaps between advantaged and disadvantaged learners – and so have the potential to make a significant difference to excellence gaps.

The most significant of these include:

 

The Pupil Premium

In November 2010, the IPPR voiced concerns that the benefits of the pupil premium might not reach all those learners who attract it.

Accordingly they recommended that the pupil premium should be allocated directly to those learners through an individual Pupil Premium Entitlement, which might be used to support a menu of approved activities, including ‘one-to-one teaching to stretch the most able low income pupils’.

The recommendation has not been repeated and the present Government shows no sign of restricting schools’ freedom to use the premium in this manner.

However, the Blunkett Labour Policy Review ‘Putting students and parents first’ recommends that Labour in government should:

‘Assess the level and use of the Pupil Premium to ensure value for money, and that it is targeted to enhance the life chances of children facing the biggest challenges, whether from special needs or from the nature of the background and societal impact they have experienced.’

In February 2013 Ofsted reported that schools spending the pupil premium successfully to improve achievement:

‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.

Conversely, where schools were less successful in spending the funding, they:

‘focused on pupils attaining the nationally expected level at the end of the key stage…but did not go beyond these expectations, so some more able eligible pupils underachieved.’

In July 2013, DfE’s Evaluation of Pupil Premium reported that, when deciding which disadvantaged pupils to target for support, the top criterion was ‘low attainment’, applied in 91% of primary schools and 88% of secondary schools.

In June 2013, in ‘The Most Able Students’, Ofsted reported that:

‘Pupil Premium funding was used in only a few instances to support the most able students who were known to be eligible for free school meals. The funding was generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds.’

Accordingly, it gave a commitment that:

‘Ofsted will… consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds.’

However, this was not translated into the school inspection guidance.

The latest edition of the School Inspection Handbook says only:

‘Inspectors should pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should. For example, does a large enough proportion of those pupils who had the highest attainment at the end of Key Stage 2 in English and mathematics achieve A*/A GCSE grades in these subjects by the age of 16?

Inspectors should summarise the achievements of the most able pupils in a separate paragraph of the inspection report.’

There is no reference to the most able in parallel references to the pupil premium.

There has, however, been some progress in giving learners eligible for the pupil premium priority in admission to selective schools.

In May 2014, the TES reported that:

‘Thirty [grammar] schools have been given permission by the Department for Education to change their admissions policies already. The vast majority of these will introduce the changes for children starting school in September 2015…A small number – five or six – have already introduced the reform.’

The National Grammar Schools Association confirmed that:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

In July 2014, the Government launched a consultation on the School Admissions Code which proposes extending to all state-funded schools the option to give priority in their admission arrangements to learners eligible for the pupil premium. This was previously open to academies and free schools via their funding agreements.

 

The Education Endowment Foundation (EEF)

The EEF describes itself as:

‘An independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.’

The 2010 press release announcing its formation emphasised its role in raising standards in underperforming schools. This was reinforced by the Chairman in a TES article from June 2011:

‘So the target group for EEF-funded projects in its first couple of years are pupils eligible for free school meals in primary and secondary schools underneath the Government’s floor standards at key stages 2 and 4. That’s roughly 1,500 schools up and down the country. Projects can benefit other schools and pupils, as long as there is a significant focus on this core target group of the most needy young people in the most challenging schools.’

I have been unable to trace any formal departure from this position, though it no longer appears in this form in the Foundation’s guidance. The Funding FAQs say only:

‘In the case of projects involving the whole school, rather than targeted interventions, we would expect applicants to be willing to work with schools where the proportion of FSM-eligible pupils is well above the national average and/or with schools where FSM-eligible pupils are under-performing academically.’

I can find no EEF-funded projects that are exclusively or primarily focused on high-attaining disadvantaged learners, though a handful of its reports do refer to the impact on this group.

 

Changes to School Accountability Measures

As we have seen in Part one, the School Performance Tables currently provide very limited information about the performance of disadvantaged high achievers.

The July 2013 consultation document on primary assessment and accountability reform included a commitment to publish a series of headline measures in the tables including:

‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score in each subject.’

Moreover, it added:

‘We will publish all the headline measures to show the attainment and progress of pupils for whom the school is in receipt of the pupil premium.’

Putting two and two together, this should mean that, from 2016, we will be able to see the percentage of pupil premium-eligible students achieving a high scaled score, though we do not yet know what ‘high scaled score’ means, nor do we know whether the data will be for English and maths separately or combined.

The October 2013 response to the secondary assessment and accountability consultation document fails to say explicitly whether excellence gap measures will be published in School Performance Tables.

It mentions that:

‘Schools will now be held to account for (a) the attainment of their disadvantaged pupils, (b) the progress made by their disadvantaged pupils, and (c) the in-school gap in attainment between disadvantaged pupils and their peers.’

Meanwhile a planned data portal will contain:

‘the percentage of pupils achieving the top grades in GCSEs’

but the interaction between these two elements, if any, remains unclear.

The March 2014 response to the consultation on post-16 accountability and assessment says:

‘We intend to develop measures covering all five headline indicators for students in 16-19 education who were in receipt of pupil premium funding in year 11.’

The post-16 headline measures include a new progress measure and an attainment measure showing the average points score across all level 3 qualifications.

It is expected that a destination measure will also be provided, as long as the methodology can be made sufficiently robust. The response says:

‘A more detailed breakdown of destinations data, such as entry to particular groups of universities, will continue to be published below the headline. This will include data at local authority level, so that destinations for students in the same area can be compared.’

and this should continue to distinguish the destinations of disadvantaged students.

Additional A level attainment measures – the average grade across the best three A levels and the achievement of AAB grades with at least two in facilitating subjects – seem unlikely to be differentiated according to disadvantage.

There remains a possibility that much more excellence gap data, for primary, secondary and post-16, will be made available through the planned school portal, but no specification had been made public at the time of writing.

More worryingly, recent news reports have suggested that the IT project developing the portal and the ‘data warehouse’ behind it has been abandoned. The statements refer to continuing to deliver ‘the school performance tables and associated services’, but there is no clarification of whether this latter phrase includes the portal. Given the absence of an official statement, one suspects the worst.

 

 

The Social Mobility and Child Poverty Commission (SMCPC)

The Commission was established with the expectation that it would ‘hold the Government’s feet to the fire’ to encourage progress on these two topics.

It publishes annual ‘state of the nation’ reports that are laid before Parliament and also undertakes ‘social mobility advocacy’.

The first annual report – already referenced in Part one – was published in November 2013. The second is due in October 2014.

The Chairman of the Commission was less than complimentary about the quality of the Government’s response to its first report, which made no reference to its comments about attainment gaps at higher grades. It remains to be seen whether the second will be taken any more seriously.

The Commission has already shown significant interest in disadvantaged high achievers – in June 2014 it published the study ‘Progress made by high-attaining children from disadvantaged backgrounds’ referenced above – so there is every chance that the topic will feature again in the 2014 annual report.

The Commission is of course strongly interested in the social mobility indicators and progress made against them, so may also include recommendations for how they might be adjusted to reflect changes to the schools accountability regime set out above.

 

Recommended reforms to close excellence gaps

Several proposals emerge from the commentary on current Government policy above:

  • It would be helpful to have further evaluation of the pupil premium to check whether high-achieving disadvantaged learners are receiving commensurate support. Schools need further guidance on ways in which they can use the premium to support high achievers. This should also be a focus for the pupil premium Champion and in pupil premium reviews.
  • Ofsted’s school inspection handbook requires revision to fulfil its commitment to focus on the most able in receipt of the premium. Inspectors also need guidance (published so schools can see it) to ensure common expectations are applied across institutions. These provisions should be extended to the post-16 inspection regime.
  • All selective secondary schools should be invited to prioritise pupil premium recipients in their admissions criteria, with the Government reserving the right to impose this on schools that do not comply voluntarily.
  • The Education Endowment Foundation should undertake targeted studies of interventions to close excellence gaps, but should also ensure that the impact on excellence gaps is mainstreamed in all the studies they fund. (This should be straightforward since their Chairman has already called for action on this front.)
  • The Government should consider the case for the inclusion of data on excellence gaps in all the headline measures in the primary, secondary and post-16 performance tables. Failing that, such data (percentages and numbers) should be readily accessible from a new data portal as soon as feasible, together with historical data of the same nature. (If the full-scale portal is no longer deliverable, a suitable alternative openly accessible database should be provided.) It should also publish annually a statistical analysis of all excellence gaps and the progress made towards closing them. As much progress as possible should be made before the new assessment and accountability regime is introduced. At least one excellence gap measure should be incorporated into revised DfE impact indicators and the social mobility indicators.
  • The Social Mobility and Child Poverty Commission (SMCPC) should routinely consider the progress made in closing excellence gaps within its annual report – and the Government should commit to consider seriously any recommendations they offer to improve such progress.

This leaves the question whether there should be a national programme dedicated to closing excellence gaps, and so improving fair access to competitive universities. (It makes excellent sense to combine these twin objectives and to draw on the resources available to support the latter.)

Much of the research above – whether it originates in the US or UK – argues for dedicated state/national programmes to tackle excellence gaps.

More recently, the Sutton Trust has published a Social Mobility Manifesto for 2015 which recommends that the next government should:

‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.

Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.

Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

This is not as clear as it might be about the balance between support for the most able and the most able disadvantaged respectively.

I have written extensively about what shape such a programme should have, most recently in the final section of ‘Digging Beneath the Destination Measures’ (July 2014).

The core would be:

‘A light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously…

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs….

…Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.’

 

Close

We know far less than we should about the size of excellence gaps in England – and whether or not progress is being made in closing them.

I hope that this post makes some small contribution towards rectifying matters, even though the key finding is that the picture is fragmented and extremely sketchy.

Rudimentary as it is, this survey should provide a baseline of sorts, enabling us to judge more easily what additional information is required and how we might begin to frame effective practice, whether at institutional or national level.

 

GP

September 2014

Closing England’s Excellence Gaps: Part One

This post examines what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.

[Image: Mind the Gap by Clicsouris]

It assesses the capacity of current national education policy to close these gaps and recommends further action to improve the prospects of doing so rapidly and efficiently.

Because the post is extremely long I have divided it into two parts.

Part one comprises:

  • A working definition for the English context, explanation of the significance of excellence gaps, description of how this post relates to earlier material and provisional development of the theoretical model articulated in those earlier posts.
  • A summary of the headline data on socio-economic attainment gaps in England, followed by a review of published data relevant to excellence gaps at primary, secondary and post-16 levels.

Part two contains:

  • A distillation of research evidence, including material on whether disadvantaged high attainers remain so, international comparisons studies and research derived from them, and literature covering excellence gaps in the USA.
  • A brief review of how present Government policy might be expected to impact directly on excellence gaps, especially via the Pupil Premium, school accountability measures, the Education Endowment Foundation (EEF) and the Social Mobility and Child Poverty Commission (SMCPC). I have left to one side the wider set of reforms that might have an indirect and/or longer-term impact.
  • Some recommendations for strengthening our collective capacity to quantify, address and ultimately close excellence gaps.

The post is intended to synthesise, supplement and update earlier material, so providing a baseline for further analysis – and ultimately consideration of further national policy intervention, whether under the present Government or a subsequent administration.

It does not discuss the economic and social origins of educational disadvantage, or the merits of wider policy to eliminate poverty and strengthen social mobility.

It starts from the premiss that, while education reform cannot eliminate the effects of disadvantage, it can make a significant, positive contribution by improving significantly the life chances of disadvantaged learners.

It does not debate the fundamental principle that, when prioritising educational support to improve the life chances of learners from disadvantaged backgrounds, governments should not discriminate on the basis of ability or prior attainment.

It assumes that optimal policies will deliver improvement for all disadvantaged learners, regardless of their starting point. It suggests, however, that intervention strategies should aim for equilibrium, prioritising gaps that are furthest away from it and taking account of several different variables in the process.

 

A working definition for the English context

The literature in Part two reveals that there is no accepted universal definition of excellence gaps, so I have developed my own England-specific working definition for the purposes of this post.

An excellence gap is:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

This demands further clarification of what typically constitutes a disadvantaged learner and a threshold of high achievement.
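
In practical terms the definition reduces to a difference between two proportions. The sketch below is purely illustrative and uses hypothetical figures rather than real data:

```python
# Illustrative only. An excellence gap, on the working definition above, is
# the difference between the proportion of all other eligible learners who
# reach a high achievement threshold and the proportion of disadvantaged
# learners who do so.
def excellence_gap(disadvantaged_high: int, disadvantaged_total: int,
                   other_high: int, other_total: int) -> float:
    """Gap in percentage points (positive = disadvantaged learners behind)."""
    pct_disadvantaged = 100 * disadvantaged_high / disadvantaged_total
    pct_other = 100 * other_high / other_total
    return pct_other - pct_disadvantaged

# Hypothetical cohort: 9% of disadvantaged learners reach the threshold
# against 24% of their peers, giving a 15 percentage point excellence gap.
print(excellence_gap(90, 1000, 1200, 5000))  # 15.0
```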

In the English context, the measures of disadvantage with the most currency are FSM eligibility (eligible for and receiving free school meals) and eligibility for the deprivation element of the pupil premium (eligible for and receiving FSM at some point in the preceding six years – often called ‘ever 6’).

Throughout this post, for the sake of clarity, I have given priority to the former over the latter, except where the former is not available.

The foregrounded characteristic is socio-economic disadvantage, but this does not preclude analysis of the differential achievement of sub-groups defined according to secondary characteristics including gender, ethnic background and learning English as an additional language (EAL) – as well as multiple combinations of these.

Some research is focused on ‘socio-economic gradients’, which show how gaps vary at different points of the achievement distribution on a given assessment.

The appropriate thresholds of high achievement are most likely to be measured through national assessments of pupil attainment, notably end of KS2 tests (typically Year 6, age 11), GCSE and equivalent examinations (typically Year 11, age 16) and A level and equivalent examinations (typically Year 13, age 18).

Alternative thresholds of high achievement may be derived from international assessments, such as PISA, TIMSS or PIRLS.

Occasionally – and especially in the case of these international studies – an achievement threshold is statistically derived, in the form of a percentile range of performance, rather than with reference to a particular grade, level or score. I have not allowed for this within the working definition.

Progress measures typically relate to the distance travelled between: baseline assessment (currently at the end of KS1 – Year 2, age 7 – but scheduled to move to Year R, age 4) and end of KS2 tests; or between KS2 tests and the end of KS4 (GCSE); or between GCSE and the end of KS5 (Level 3/A level).

Some studies extend the concept of progress between two thresholds to a longitudinal approach that traces how disadvantaged learners who achieve a particular threshold perform throughout their school careers – do they sustain early success, or fall away, and what proportion are ‘late bloomers’?

 

Why are excellence gaps important?

Excellence gaps are important for two different sets of reasons: those applying to all achievement gaps and those which apply more specifically or substantively to excellence gaps.

Under the first heading:

  • The goal of education should be to provide all learners, including disadvantaged learners, with the opportunity to maximise their educational potential, so eliminating ‘the soft bigotry of low expectations’.
  • Schools should be ‘engines of social mobility’, helping disadvantaged learners to overcome their backgrounds and compete equally with their more advantaged peers.
  • International comparisons studies reveal that the most successful education systems can and do raise attainment for all and close socio-economic achievement gaps simultaneously.
  • There is a strong economic case for reducing – and ideally eradicating – underachievement attributable to disadvantage.

Under the second heading:

  • An exclusive or predominant focus on gaps at the lower end of the attainment distribution is fundamentally inequitable and tends to reinforce the ‘soft bigotry of low expectations’.
  • Disadvantaged learners benefit from successful role models – predecessors or peers from a similar background who have achieved highly and are reaping the benefits.
  • An economic imperative to increase the supply of highly-skilled labour will place greater emphasis on the top end of the achievement distribution. Some argue that there is a ‘smart fraction’ tying national economic growth to a country’s stock of high achievers. There may be additional spin-off benefits from increasing the supply of scientists, writers, artists, or even politicians!
  • The most highly educated disadvantaged learners are least likely to confer disadvantage on their children, so improving the proportion of such learners may tend to improve inter-generational social mobility.

Excellence gaps are rarely identified as such – the term is not yet in common usage in UK education, though it has greater currency in the US. Regardless of terminology, they rarely receive attention, either as part of a wider set of achievement gaps, or separately in their own right.

 

Relationship with earlier posts

Since this blog was founded in April 2010 I have written extensively about excellence gaps and how to address them.

The most pertinent of my previous posts are:

I have also written about excellence gaps in New Zealand – Part 1 and Part 2 (June 2012) – but do not draw on that material here.

Gifted education (or apply your alternative term) is amongst those education policy areas most strongly influenced by political and ideological views on the preferred balance between excellence and equity. This is particularly true of decisions about how best to address excellence gaps.

The excellence-equity trade-off was identified in my first post (May 2010) as one of three fundamental polarities that determine the nature of gifted education and provide the basis for most discussion about what form it should take.

The Gifted Phoenix Manifesto for Gifted Education (March 2013) highlighted their significance thus:

‘Gifted education is about balancing excellence and equity. That means raising standards for all while also raising standards faster for those from disadvantaged backgrounds.

Through combined support for excellence and equity we can significantly increase our national stock of high level human capital and so improve economic growth…

…Excellence in gifted education is about maximising the proportion of high achievers reaching advanced international benchmarks (eg PISA, TIMSS and PIRLS) so increasing the ‘smart fraction’ which contributes to economic growth.

Equity in gifted education is about narrowing (and ideally eliminating) the excellence gap between high achievers from advantaged and disadvantaged backgrounds (which may be attributable in part to causes other than poverty). This also increases the proportion of high achievers, so building the ‘smart fraction’ and contributing to economic growth.’

More recently, one of the 10 draft core principles I set out in ‘Why Can’t We Have National Consensus on Educating High Attainers?’ (June 2014) said:

‘We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.’

 

This model provisionally developed

Using my working definition as a starting point, this section describes a theoretical model showing how excellence and equity are brought to bear when considering excellence gaps – and then how best to address them.

This should be applicable at any level, from a single school to a national education system and all points in between.

The model depends on securing the optimal balance between excellence and equity where:

  • Excellence is focused on increasing the proportion of all learners who achieve highly and, where necessary, increasing the pitch of high achievement thresholds to remove unhelpful ceiling effects. The thresholds in question may be nationally or internationally determined and are most likely to register high attainment through a formal assessment process. (This may be extended so there is complementary emphasis on increasing the proportion of high-achieving learners who make sufficiently strong progress between two different age- or stage-related thresholds.)
  • Equity is focused on increasing the proportion of high-achieving disadvantaged learners (and/or the proportion of disadvantaged learners making sufficiently strong progress) at a comparatively faster rate, so they form a progressively larger proportion of the overall high-achieving population, up to the point of equilibrium, where advantaged and disadvantaged learners are equally likely to achieve the relevant thresholds (and/or progress measure). This must be secured without deliberately repressing improvement amongst advantaged learners – ie by introducing policies designed explicitly to limit their achievement and/or progress relative to disadvantaged learners – but a decision to do nothing or to redistribute resources in favour of disadvantage is entirely permissible.

The optimal policy response will depend on the starting position and the progress achieved over time.

If excellence gaps are widening, the model suggests that interventions and resources should be concentrated in favour of equity. Policies should be reviewed and adjusted, or strengthened where necessary, to meet the desired objectives.

If excellence gaps are widening rapidly, this reallocation and adjustment process will be relatively more substantial (and probably more urgent) than if they are widening more slowly.

Slowly widening gaps will demand more reallocation and adjustment than a situation where gaps are stubbornly resistant to improvement, or else closing too slowly. But even in the latter case there should be some reallocation and adjustment until equilibrium is achieved.

When excellence gaps are already closing rapidly – and there are no overt policies in place to deliberately repress improvement amongst high-achieving advantaged learners – it may be that unintended pressures in the system are inadvertently bringing this about. In that case, policy and resources should be adjusted to correct these pressures and so restore the correct twin-speed improvement.

The aim is to achieve and sustain equilibrium, even beyond the point when excellence gaps are eliminated, so that they are not permitted to reappear.

If ‘reverse gaps’ begin to materialise, where disadvantaged learners consistently outperform their more advantaged peers, this also threatens equilibrium and would suggest a proportionate redistribution of effort towards excellence.

Such scenarios are most likely to occur in settings where there are a large proportion of learners that, while not disadvantaged according to the ‘cliff edge’ definition required to make the distinction, are still relatively disadvantaged.

Close attention must therefore be paid to the distribution of achievement across the full spectrum of disadvantage, to ensure that success at the extreme of the distribution does not mask significant underachievement elsewhere.
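
Purely as an illustration of the broad decision logic described in the preceding paragraphs (the thresholds are arbitrary placeholders, not part of the model), the responses might be summarised as follows:

```python
# Illustrative summary of the model's broad responses; thresholds are arbitrary.
def suggested_response(gap_now: float, gap_previous: float,
                       fast: float = 1.0, slow: float = 0.25) -> str:
    change = gap_now - gap_previous               # positive = gap widening
    if gap_now < 0:
        return "Reverse gap emerging: redistribute some effort towards excellence."
    if change >= fast:
        return "Widening rapidly: substantial, urgent reallocation towards equity."
    if change > 0:
        return "Widening slowly: significant reallocation towards equity."
    if change > -slow:
        return "Static or closing too slowly: continued reallocation towards equity."
    return ("Closing rapidly: check whether unintended pressures are depressing "
            "achievement among advantaged high achievers; restore twin-speed improvement.")
```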

One should be able to determine a more precise policy response by considering a restricted set of variables. These include:

  • The size of the gaps at the start of the process and, associated with this, the time limit allowed for equilibrium to be reached. Clearly larger gaps are more likely to take longer to close. Policy makers may conclude that steady improvement over several years is more manageable for the system than a rapid sprint towards equilibrium. On the other hand, there may be benefits associated with pace and momentum.
  • The rate at which overall high achievement is improving. If this is relatively fast, the rate of improvement amongst advantaged high achievers will be correspondingly strong, so the rate for disadvantaged high achievers must be stronger still.
  • The variance between excellence gaps at different ages/stages. If the gaps are larger at particular stages of education, the pursuit of equilibrium suggests that disproportionate attention should be given to those stages, so that gaps are closed consistently. If excellence gaps are small for relatively young learners and increase with age, priority should be given to the latter, but there may be other factors in play, such as evidence that closing relatively small gaps at an early stage will have a more substantial ‘knock-on’ effect later on.
  • The level at which high achievement thresholds are pitched. Obviously this will influence the size of the gaps that need to be closed. But, other things being equal, enabling a higher proportion of learners to achieve a relatively high threshold will demand more intensive support. On the other hand, relatively fewer learners – whether advantaged or disadvantaged – are likely to be successful. Does one need to move a few learners a big distance or a larger proportion a smaller one?
  • Whether or not gaps at lower achievement thresholds are smaller and/or closing at a faster rate. If so, there is a strong case for securing parity of progress at higher and lower thresholds alike. On the other hand, if excellence gaps are closing more quickly, it may be appropriate to reallocate resources away from them and towards lower levels of achievement.
  • The relative size of the overall disadvantaged population, the associated economic gap between advantage and disadvantage and (as suggested above) the distribution in relation to the cut-off. If the definition of disadvantage is pitched relatively low (ie somewhat disadvantaged), the disadvantaged population will be correspondingly large, but the economic gap between advantage and disadvantage will be relatively small. If the definition is pitched relatively high (ie very disadvantaged) the reverse will be true, giving a comparatively small disadvantaged population but a larger gap between advantage and disadvantage.
  • The proportion of the disadvantaged population that is realistically within reach of the specified high achievement benchmarks. This variable is a matter of educational philosophy. There is merit in an inclusive approach – indeed it seems preferable to overestimate this proportion than the reverse. Extreme care should be taken not to discourage late developers or close off opportunities on the basis of comparatively low current attainment, so reinforcing existing gaps through unhelpfully low expectations. On the other hand, supporting unrealistically high expectations may be equally damaging and ultimately waste scarce resources. There may be more evidence to support such distinctions with older learners than with their younger peers. 

 

How big are England’s headline attainment gaps and how fast are they closing?

Closing socio-economic achievement gaps has been central to English educational policy for the last two decades, including under the current Coalition Government and its Labour predecessor.

It will remain an important priority for the next Government, regardless of the outcome of the 2015 General Election.

The present Government cites ‘Raising the achievement of disadvantaged children’ as one of ten schools policies it is pursuing.

The policy description describes the issue thus:

‘Children from disadvantaged backgrounds are far less likely to get good GCSE results. Attainment statistics published in January 2014 show that in 2013 37.9% of pupils who qualified for free school meals got 5 GCSEs, including English and mathematics at A* to C, compared with 64.6% of pupils who do not qualify.

We believe it is unacceptable for children’s success to be determined by their social circumstances. We intend to raise levels of achievement for all disadvantaged pupils and to close the gap between disadvantaged children and their peers.’

The DfE’s input and impact indicators  – showing progress against the priorities set out in its business plan – do not feature the measure mentioned in the policy description (which is actually five or more GCSEs at Grades A*-C or equivalents, including GCSEs in English and maths).

The gap on this measure was 27.7% in 2009, improving to 26.7% in 2013, so there has been a small 1.0 percentage point improvement over five years, spanning the last half of the previous Government’s term in office and the first half of this Government’s term.

Instead the impact indicators include three narrower measures focused on closing the attainment gap between free school meal pupils and their peers, at 11, 16 and 19 respectively:

  • Impact Indicator 7 compares the percentages of FSM-eligible and all other pupils achieving level 4 or above in KS2 assessment of reading, writing and maths. The 2013 gap is 18.7%, down 0.4% from 19.1% in 2012.
  • Impact Indicator 8 compares the percentages of FSM-eligible and all other pupils achieving A*-C grades in GCSE maths and English. The 2013 gap is 26.5%, up 0.3% from 26.2% in 2012.
  • Impact Indicator 9 compares the percentages of learners who were FSM-eligible at age 15 and all other learners who attain a level 3 qualification by the end of the academic year in which they are 19. The 2013 gap is 24.3%, up 0.1% from 24.2% in 2012.

These small changes, not always pointing in the right direction, reflect the longer term narrative, as is evident from the Government’s Social Mobility Indicators which also incorporate these three measures.

  • In 2005-06 the KS2 L4 maths and English gap was 25.0%, so there has been a fairly substantial 6.3 percentage point reduction over seven years, but only about one quarter of the gap has been closed.
  • In 2007-08 the KS4 GCSE maths and English gap was 28.0%, so there has been a minimal 1.5 percentage point reduction over six years, equivalent to annual national progress of 0.25 percentage points. At that rate it would take another century or so to close the gap entirely (see the sketch after this list).
  • In 2004-05 the Level 3 qualification gap was 26.4%, so there has been a 2.1 percentage point reduction over eight years – a very similar rate of progress.
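
The ‘century’ figure in the second bullet is a straightforward linear extrapolation from the numbers quoted above; a minimal sketch in Python, for anyone who wants to check the arithmetic:

```python
# Linear extrapolation of the KS4 GCSE English and maths FSM gap, using only the figures
# quoted above: 28.0 percentage points in 2007-08, 26.5 in 2013.
start_gap, end_gap, years_elapsed = 28.0, 26.5, 6

annual_closure = (start_gap - end_gap) / years_elapsed    # 0.25 percentage points per year
years_remaining = end_gap / annual_closure                # ~106 more years at this rate

print(f"Annual closure: {annual_closure:.2f}pp; years still needed: {years_remaining:.0f}")
```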

The DfE impact indicators also include a set of three destination measures that track the percentage of FSM learners progressing to Oxford and Cambridge, any Russell Group university and any university.

There is a significant time lag with all of these – the most recent available data relates to 2011/2012 – and only two years of data have been collected.

All show an upward trend. Oxbridge is up from 0.1% to 0.2%, Russell Group up from 3% to 4% and any university up from 45% to 47% – actually a 2.5 percentage point improvement.

The Oxbridge numbers are so small that a percentage measure is a rather misleading indicator of marginal improvement from a desperately low base.

It is important to note that forthcoming changes to the assessment regime will impose a different set of headline indicators at ages 11 and 16 that will not be comparable with these.

From 2014 significant methodological adjustments are being introduced to the School Performance Tables, restricting the range of qualifications counted as equivalent to GCSEs. In addition, only the first entry in each subject will count for Performance Table purposes, applying to English Baccalaureate subjects in 2014 and to all subjects in 2015.

Both these factors will tend to depress overall results and may be expected to widen attainment gaps on the headline KS4 measure as well as the oft-cited 5+ GCSEs measure.

From 2016 new baseline assessments, the introduction of scaled scores at the end of KS2 and a new GCSE grading system will add a further layer of change.

As a consequence there will be substantial revisions to the headline measures in Primary, Secondary and Post-16 Performance Tables. The latter will include destination measures, provided they can be made methodologically sound.

At the time of writing, the Government has made negligible reference to the impact of these reforms on national measures of progress, including its own Impact Indicators and the parallel Social Mobility indicators, though the latter are reportedly under review.

 

Published data on English excellence gaps

The following sections summarise what data I can find in the public domain about excellence gaps at primary (KS2), secondary (KS4) and post-16 (KS5) respectively.

I have cited the most recent data derivable from Government statistical releases and performance tables, supplemented by other interesting findings gleaned from research and commentary.

 

Primary (KS2) 

The most recent national data is contained in SFR51/2013: National Curriculum Assessments at Key Stage 2: 2012 to 2013. This provides limited information about the differential performance of learners eligible for and receiving FSM (which I have referred to as ‘FSM’), and for those known to be eligible for FSM at any point from Years 1 to 6 (known as ‘ever 6’ and describing those in receipt of the Pupil Premium on grounds of deprivation).

There is also additional information in the 2013 Primary School Performance Tables, where the term ‘disadvantaged’ is used to describe ‘ever 6’ learners and ‘children looked after’.

There is comparatively little variation between these different sets of figures at national level. In the analysis below (and in the subsequent section on KS4) I have used FSM data wherever possible, but have substituted ‘disadvantaged’ data where FSM is not available. All figures apply to state-funded schools only.

I have used Level 5 and above as the best available proxy for high attainment. Some Level 6 data is available, but in percentages only, and these are all so small that comparisons are misleading.

The Performance Tables distinguish a subset of high attainers, on the basis of prior attainment (at KS1 for KS2 and at KS2 for KS4) but no information is provided about the differential performance of advantaged and disadvantaged high attainers.

In 2013:

  • 21% of all pupils achieved Level 5 or above in reading, writing and maths combined, but only 10% of FSM pupils did so, compared with 26% of others, giving an attainment gap of 16%. The comparable gap at Level 4B (in reading and maths and L4 in writing) was 18%. At Level 4 (across the board) it was 20%. In this case, the gaps are slightly larger at lower attainment levels but, whereas the L4 gap has narrowed by 1% since 2012, the L5 gap has widened by 1%.
  • In reading, 44% of all pupils achieved Level 5 and above, but only 27% of FSM pupils did so, compared with 48% of others, giving an attainment gap of 21%. The comparable gap at Level 4 and above was eight percentage points lower at 13%.
  • In writing (teacher assessment), 31% of all pupils achieved level 5 and above, but only 15% of FSM pupils did so, compared with 34% of others, giving an attainment gap of 19%. The comparable gap at Level 4 and above was three percentage points lower at 16%.
  • In grammar, punctuation and spelling (GPS), 47% of all pupils achieved Level 5 and above, but only 31% of FSM pupils did so, compared with 51% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above was two percentage points lower at 18%.
  • In maths, 41% of pupils in state-funded schools achieved Level 5 and above, up 2% on 2012. But only 24% of FSM pupils achieved this compared with 44% of others, giving an attainment gap of 20%. The comparable gap at level 4 and above is 13%.

Chart 1 shows these outcomes graphically. In four cases out of five, the gap at the higher attainment level is greater, substantially so in reading and maths. The Level 5 gaps all fall between 16 and 21 percentage points.

 

Ex gap table 1

Chart 1: Percentage point gaps between FSM and all other pupils’ attainment at KS2 L4 and above and KS2 L5 and above, 2013 

 

It is difficult to trace reliably the progress made in reducing these gaps in English, since the measures have changed frequently. There has been more stability in maths, however, and the data reveals that – whereas the FSM gap at Level 4 and above has reduced by 5 percentage points since 2008 (from 18 points to 13 points) – the FSM gap at Level 5 and above has remained between 19 and 20 points throughout. Hence the gap between L4+ and L5+ on this measure has increased in the last five years.

There is relatively little published about KS2 excellence gaps elsewhere, though one older Government publication, a DfES Statistical Bulletin: The characteristics of high attainers (2007) offers a small insight.

It defines KS2 high attainers as the top 10%, on the basis of finely grained average points scores across English, maths and science, so a more selective but wider-ranging definition than any of the descriptors of Level 5 performance above.

According to this measure, some 2.7% of FSM-eligible pupils were high attainers in 2006, compared with 11.6% of non-FSM pupils, giving a gap of 8.9 percentage points.

The Bulletin supplies further analysis of this population of high attainers, summarised in the table reproduced below.

 

EX Gap Capture 1 

  

Secondary (KS4) 

While Government statistical releases provide at least limited data about FSM performance at high levels in end of KS2 assessments, this is entirely absent from KS4 data, because there is no information about the achievement of GCSE grades above C, whether for single subjects or combinations.

The most recent publication, SFR05/2014: GCSE and equivalent attainment by pupil characteristics, offers a multitude of measures based on grades G and above or C and above. Many of these are set out in Chart 2, which illustrates the FSM gap on each, ordered from the smallest gap to the largest.

(The gap cited here for A*-C grades in English and maths GCSEs is very slightly different to the figure in the impact indicator.)

 

Ex gap table 2

Chart 2: Percentage point gaps between FSM and all other pupils’ attainment on different KS4 measures, 2013

 

In its State of the Nation Report 2013, the Social Mobility and Child Poverty Commission included a table comparing regional performance on a significantly more demanding ‘8+ GCSEs excluding equivalents and including English and maths’ measure. This uses ‘ever 6’ rather than FSM as the indicator of disadvantage.

The relevant table is reproduced below. It shows regional gaps of between 20 and 26 percentage points on the tougher measure, so a similar order of magnitude to the national indicators at the top end of Chart 2.

 

ExGap 2 Capture

 

Comparing the two measures, one can see that:

  • The percentages of ‘ever 6’ learners achieving the more demanding measure are very much lower than the comparable percentages achieving the 5+ GCSEs measure, but the same is also true of their more advantaged peers.
  • Consequently, in every region but London and the West Midlands, the attainment gap is actually larger for the less demanding measure.
  • In London, the gaps are much closer, at 19.1 percentage points on the 5+ measure and 20.9 percentage points on the 8+ measure. In the West Midlands, the gap on the 8+ measure is larger by five percentage points. In all other cases, the difference is at least six percentage points in the other direction.

We do not really understand the reasons why London and the West Midlands are atypical in this respect.

The Characteristics of High Attainers (2007) provides a comparable analysis for KS4 to that already referenced at KS2. In this case, the top 10% of high attainers is derived on the basis of capped GCSE scores.

This gives a gap of 8.8 percentage points between the proportion of non-FSM (11.2%) and FSM (2.4%) students within the defined population, very similar to the parallel calculation at KS2.

Other variables within this population are set out in the table reproduced below.

 

ExGap Capture 3

Finally, miscellaneous data has also appeared from time to time in the answers to Parliamentary Questions. For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8 percentage points. By 2009 the comparable percentages were 1.7% and 9.0% respectively, resulting in an increased gap of 7.3 percentage points (Col 568W)
  • In 2006/07, the percentages of FSM-eligible pupils and of all pupils in maintained schools securing A*/A grades in different GCSE subjects were as shown in Table 1 below (Col 808W)
Subject      FSM    All pupils   Gap
Maths        3.7    15.6         11.9
Eng lit      4.1    20.0         15.9
Eng lang     3.5    16.4         12.9
Physics      2.2    49.0         46.8
Chemistry    2.5    48.4         45.9
Biology      2.5    46.8         44.3
French       3.5    22.9         19.4
German       2.8    23.2         20.4

Table 1: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)

 

Post-16 (KS5)

The most recent post-16 attainment data is provided in SFR10/2014: Level 2 and 3 attainment by young people aged 19 in 2013 and SFR02/14: A level and other level 3 results: academic year 2012 to 2013.

The latter contains a variety of high attainment measures – 3+ A*/A grades;  AAB grades or better; AAB grades or better with at least two in facilitating subjects;  AAB grades or better, all in facilitating subjects – yet none of them distinguish success rates for advantaged and disadvantaged learners.

The former does include a table providing a time series of gaps in the achievement of Level 3 at age 19 through 2 A levels or the International Baccalaureate. The measure of disadvantage is FSM eligibility in Year 11. The gap was 22.0 percentage points in 2013, little changed from 22.7 percentage points in 2005.

In (How) did New Labour narrow the achievement and participation gap (Whitty and Anders, 2014) the authors reproduce a chart from a DfE roundtable event held in March 2013 (on page 44).

This is designed to show how FSM gaps vary across key stages and also provides ‘odds ratios’ – the relative chances of FSM and other pupils achieving each measure. It relies on 2012 outcomes.

The quality of the reproduction is poor, but it seems to suggest that, on the AAB+ with at least two facilitating subjects measure, there is a five percentage point gap between FSM students and others (3% versus 8%), while the odds ratio of 2.9 indicates that the odds of a non-FSM student achieving this outcome are almost three times those of an FSM student.
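
For readers less familiar with odds ratios, the 2.9 figure can be reproduced approximately from the two rounded percentages in the chart; a minimal sketch in Python (the small discrepancy presumably reflects rounding in the published figures):

```python
# Reproducing the odds ratio for AAB+ with at least two facilitating subjects, treating
# the chart's rounded figures (FSM 3%, all other students 8%) as exact.
p_fsm, p_other = 0.03, 0.08

odds_fsm = p_fsm / (1 - p_fsm)          # odds of success for FSM students
odds_other = p_other / (1 - p_other)    # odds of success for all other students

odds_ratio = odds_other / odds_fsm      # ~2.8, close to the reported 2.9
gap_pp = (p_other - p_fsm) * 100        # the five percentage point excellence gap

print(f"Odds ratio: {odds_ratio:.1f}; gap: {gap_pp:.0f} percentage points")
```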

Once again, occasional replies to Parliamentary Questions provide some supplementary information:

  • In 2007, 189 FSM-eligible students (3.7%) in maintained mainstream schools (so excluding sixth form colleges and FE colleges) achieved 3 A grades at A level. This compared with 13,467 other students (9.5%), giving a gap of 5.8 percentage points (Source: Parliamentary Question, 26 November 2008, Hansard (Col 1859W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. This compares with 14,431 (10.5%) of those not eligible for FSM, giving a gap of 7.0 percentage points. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are counted. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • Of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, 546 (4.1%) achieved 3 or more GCE A levels at A*-A compared with 22,353 other pupils (10.6%) so giving a gap of 6.5 percentage points. These figures include students in both the schools and FE sectors. (Parliamentary Question, 9 July 2012, Hansard (Col 35W)) 

 In September 2014, a DfE response to a Freedom of Information request provided some additional data about FSM gaps at A level over the period from 2009 to 2013. This is set out in the table below, which records the gaps between FSM and all other pupils, presumably for all schools and colleges, whether or not state-funded.

Apart from the atypical result for the top indicator in 2010, all these gaps fall in the range of 6 to 10 percentage points, so they are in line with the sources above.

 

2009 2010 2011 2012 2013
3+ grades at A*/A or applied single/double award 9.0 12.8 9.3 8.7 8.3
AAB+ grades in facilitating subjects 6.3 6.2
AAB+ grades at least 2 in facilitating subjects 9.8

 

Additional evidence of Key Stage excellence gaps from a sample born in 1991

Progress made by high-achieving children from disadvantaged backgrounds (Crawford, Macmillan and Vignoles, 2014) provides useful data on the size of excellence gaps at different key stages, as well as analysis of whether disadvantaged high achievers remain so through their school careers.

The latter appears in Part two, but the first set of findings provides a useful supplement to the broad picture set out above.

This study is based on a sample of learners born in 1991/1992, so they would presumably have taken end of KS2 tests in 2002, GCSEs in 2007 and A levels in 2009. It includes all children who attended a state primary school, including those who subsequently attended an independent secondary school.

It utilises a variety of measures of disadvantage, including whether learners were always FSM-eligible (in Years 7-11), or ‘ever FSM’ during that period. This summary focuses on the distinction between ‘always FSM’ and ‘never FSM’.

It selects a basket of high attainment measures spread across the key stages, including:

  • At KS1, achieving Level 3 or above in reading and maths.
  • At KS2, achieving Level 5 or above in English and maths.
  • At KS4, achieving six or more GCSEs at grades A*-C in EBacc subjects (as well as five or more).
  • At KS5, achieving two or more (and three or more) A levels at grades A-B in any subjects.
  • Also at KS5, achieving two or more (and three or more) A levels at grades A-B in facilitating subjects.

The choice of measures at KS2 and KS5 is reasonable, reflecting the data available at the time. For example, one assumes that A* grades at A level do not feature in the KS5 measures since they were not introduced until 2010.

At KS4, the selection is rather more puzzling and idiosyncratic. It would have been preferable to have included at least one measure based on performance across a range of GCSEs at grades A*-B or A*/A.

The authors justify their decision on the basis that ‘there is no consensus on what is considered high attainment’, even though most commentators would expect this to reflect higher grade performance, while few are likely to define it solely in terms of breadth of study across a prescribed set of ‘mainstream’ subjects.

Outcomes for ‘always FSM’ and ‘never FSM’ on the eight measures listed above are presented in Chart 3.

Ex gap Table 3

Chart 3: Achievement of ‘always FSM’ and ‘never FSM’ on a basket of high attainment measures for pupils born in 1991/92

 

This reveals gaps of 12 to 13 percentage points at Key Stages 1 and 2, somewhat smaller than several of those described above.

It is particularly notable that the 2013 gap for KS2 L5 reading, writing and maths is 16 percentage points, whereas the almost comparable 2002 (?) gap for KS2 English and maths amongst this sample is 13.5 percentage points. Even allowing for comparability issues, there may be tentative evidence here to suggest widening excellence gaps at KS2 over the last decade.

The KS4 gaps are significantly larger than those existing at KS1/2, at 27 and 18 percentage points respectively. But comparison with the previous evidence reinforces the point that the size of the gaps in this sample is attributable to subject mix: this must be the case since the grade expectation is no higher than C.

The data for A*/A performance on five or more GCSEs set out above, which does not insist on coverage of EBacc subjects other than English and maths, suggests a gap of around seven percentage points. But it also demonstrates big gaps – again at A*/A – for achievement in single subjects, especially the separate sciences.

The KS5 gaps on this sample range from 2.5 to 13 percentage points. We cited data above suggesting a five percentage point gap in 2012 for AAB+, at least two in facilitating subjects. These findings do not seem wildly out of kilter with that, or with the evidence of gaps of around six to seven percentage points for AAA grades or higher.

 

Overall pattern 

The published data provides a beguiling glimpse of the size of excellence gaps and how they compare with FSM gaps on the key national benchmarks.

But discerning the pattern is like trying to understand the picture on a jigsaw when the majority of pieces are missing.

The received wisdom is captured in the observation by Whitty and Anders that:

‘Even though the attainment gap in schools has narrowed overall, it is largest for the elite measures’

and the SMCPC’s comment that:

‘…the system is better at lifting children eligible for FSM above a basic competence level (getting 5A*–C) than getting them above a tougher level of attainment likely to secure access to top universities.’

This seems broadly true, but the detailed picture is rather more complicated.

  • At KS2 there are gaps at L5 and above of around 16 to 21 percentage points, the majority higher than the comparable gaps at L4. But the gap for the core subjects combined is smaller than for each separate assessment. There is tentative evidence that this combined-subject gap may be widening.
  • At KS4 there are very significant differences between results in individual subjects. When it comes to multi-subject indicators, differences in the choice of subject mix – as well as choice of grade – make it extremely difficult to draw even the most tentative conclusions about the size of excellence gaps and how they relate to benchmark-related gaps at KS4 and excellence gaps at KS2.
  • At KS5, the limited evidence suggests that A level excellence gaps at the highest grades are broadly similar to those at GCSE A*/A. If anything, gaps seem to narrow slightly compared with KS4. But the confusion over KS4 measures makes this impossible to verify.

We desperately need access to a more complete dataset so we can understand these relationships more clearly.

This is the end of Part one. In Part two, we move on to consider evidence about whether high attainers remain so, before examining international comparisons data and related research, followed by excellence gaps analysis from the USA.

Part two concludes with a short review of how present government policy impacts on excellence gaps and some recommendations for strengthening the present arrangements.

 

GP

September 2014

What Happened to the Level 6 Reading Results?

 

Provisional 2014 key stage 2 results were published on 28 August.

This brief supplementary post considers the Level 6 test results – in reading, in maths and in grammar, punctuation and spelling (GPS) – and how they compare with Level 6 outcomes in 2012 and 2013.

An earlier post, A Closer Look at Level 6, published in May 2014, provides a fuller analysis of these earlier results.

Those not familiar with the 2014 L6 test materials can consult the papers, mark schemes and level thresholds at these links:

 

Number of Entries

Entry levels for the 2014 Level 6 tests were published in the media in May 2014. Chart 1 below shows the number of entries for each test since 2012 (2013 in the case of GPS). These figures are for all schools, independent as well as state-funded.

 

L6 Sept chart 1

Chart 1: Entry rates for Level 6 tests 2012 to 2014 – all schools

 

In 2014, reading entries were up 36%, GPS entries up 52% and maths entries up 36%. There is as yet no indication of a backlash from the decision to withdraw Level 6 tests after 2015, though this may have an impact next year.

The postscript to A Closer Look estimated that, if entries continued to increase at current rates, we might expect something approaching 120,000 entries in reading, 130,000 in GPS and 140,000 in maths in 2015.

Chart 2 shows the percentage of all eligible learners entered for Level 6 tests, again for all schools. Nationally, between one in six and one in five eligible learners are now entered for Level 6 tests. Entry rates for reading and maths have almost doubled since 2012.

 

L6 Sept chart 2

Chart 2: Percentage of eligible learners entered for Level 6 tests 2012 to 2014, all schools

 

Success Rates

The headline percentages in the SFR show:

  • 0% achieving L6 reading (unchanged from 2013)
  • 4% achieving L6 GPS (up from 2% in 2013) and
  • 9% achieving L6 maths (up from 7% in 2013).

Local authority and regional percentages are also supplied.

  • Only in Richmond did the L6 pass rate in reading register above 0% (at 1%). Hence all regions are at 0%.
  • For GPS the highest percentages are 14% in Richmond, 10% in Kensington and Chelsea and Kingston, 9% in Sutton and 8% in Barnet, Harrow and Trafford. Regional rates vary between 2% in Yorkshire and Humberside and 6% in Outer London.
  • In maths, Richmond recorded 22%, Kingston 19%, Trafford, Harrow and Sutton were at 18% and Kensington and Chelsea at 17%. Regional rates range from 7% in Yorkshire and Humberside and the East Midlands to 13% in Outer London.

Further insight into the national figures can be obtained by analysing the raw numbers supplied in the SFR.

Chart 3 shows what proportion of those entered for each test was successful in each year. Here there is something of a surprise.

 

L6 Sept chart 3

Chart 3: Percentage of learners entered achieving Level 6, 2012 to 2014, all schools

 

Nearly half of all entrants are now successful in L6 maths, though the improvement in the success rate has slowed markedly compared with the nine percentage point jump in 2013.

In GPS, the success rate has improved by nine percentage points between 2013 and 2014 and almost one in four entrants is now successful. Hence the GPS success rate is roughly half that for maths. This may be attributable in part to its shorter history, although the 2014 success rate is significantly below the rate for maths in 2013.

But in reading an already very low success rate has declined markedly, following a solid improvement in 2013 from a very low base in 2012. The 2014 success rate is now less than half what it was in 2012. Fewer than one in a hundred of those entered have passed this test.

Chart 4 shows how many learners were successful in the L6 reading test in 2014 compared with previous years, giving results for boys and girls separately.

 

L6 Sept chart 4

Chart 4: Percentage of learners entered achieving Level 6 in reading, 2012 to 2014, by gender

 

The total number of successful learners in 2014 is over 5% lower than in 2012, when the reading test was introduced, and 62% down on the number successful in 2013.

Girls appear to have suffered disproportionately from the decline in 2014. The number of successful girls is down 63% on 2013, while the decline for boys is slightly smaller, at 61%. The number of successful boys remains above where it was in 2012 but, for girls, it is about 12% down on 2012.

In 2012, only 22% of successful candidates were boys. This rose to 26% in 2013 and has again increased slightly, to 28% in 2014. The gap between girls’ and boys’ performance remains substantially bigger than the equivalent gaps in GPS and maths.

Charts 5 and 6 give the comparable figures for GPS and maths respectively.

In GPS, the total number of successful entries has increased by almost 140% compared with 2013. Girls form a slightly lower proportion of this group than in 2013, their share falling from 62% to 60%. Boys are therefore beginning to close what remains a substantial performance gap.

 

L6 Sept chart 5

Chart 5: Percentage of learners entered achieving Level 6 in GPS, 2012 to 2014, by gender

 

In maths, the total number of successful entries is up by about 40% on 2013 and demonstrates rapid improvement over the three year period.

Compared with 2013, the number of successful girls has increased by 43%, whereas the corresponding increase for boys is closer to 41%. Boys formed 65% of the successful cohort in 2012, 61% in 2013 and 60% in 2014, so girls’ progress in narrowing this substantial performance gap is slowing.

 

L6 Sept chart 6

Chart 6: Percentage of learners entered achieving Level 6 in maths, 2012 to 2014, by gender

 

Progress

The SFR also provides a table, this time for state-funded schools only, showing the KS1 outcomes of those successful in achieving Level 6. (For maths and reading, this data includes those with a non-numerical grade in the test who have been awarded L6 via teacher assessment. The data for writing is derived solely from teacher assessment.)

Not surprisingly, over 94% of those achieving Level 6 in reading had achieved Level 3 in KS1, but 4.8% were at L2A and a single learner was recorded at Level 1. The proportion with KS1 Level 3 in 2013 was higher, at almost 96%.

In maths, however, only some 78% of those achieving Level 6 were at Level 3 in KS1. A further 18% were at 2A and almost 3% were at 2B. A further 165 learners were recorded at 2C or Level 1. In 2013, over 82% had KS1 L3 while almost 15% had 2A.

It seems, therefore, that KS1 performance was a slightly weaker indicator of KS2 level 6 success in 2014 than in the previous year, but this trend was apparent in both reading and maths – and KS1 performance remains a significantly weaker indicator in maths than it is in reading.

 

Why did the L6 reading results decline so drastically?

Given that the number of entries for the Level 6 reading test increased dramatically, the declining pass rate suggests either a problematic test or that schools entered a higher proportion of learners who had relatively little chance of success. A third possibility is that the test was deliberately made more difficult.

The level threshold for the 2014 Level 6 reading test was 24 marks, compared with 22 marks in 2013, but there are supposed to be sophisticated procedures in place to ensure that standards are maintained. We should be able to discount the third cause.

The second cause is also unlikely to be significant, since schools are strongly advised only to enter learners who are already demonstrating attainment beyond KS2 Level 5. There is no benefit to learners or schools from entering pupils for tests that they are almost certain to fail.

The existing pass rate was very low, but it was on an upward trajectory. Increasing familiarity with the test ought to have improved schools’ capacity to enter the right learners and to prepare them to pass it.

That leaves only the first possibility – something must have been wrong with the test.

Press coverage from May 2014, immediately after the test was administered, explained that it contained different rules for learners and invigilators about the length of time available for answering questions.

The paper gave learners one hour for completion, while invigilators were told pupils had 10 minutes’ reading time followed by 50 minutes in which to answer the questions. Schools interpreted this contradiction differently and several reported disruption to the examination as a consequence.

The NAHT was reported to have written to the Standards and Testing Agency:

‘…asking for a swift review into this error and to seek assurance that no child will be disadvantaged after having possibly been given incorrect advice on how to manage their time and answers’.

The STA statement says:

‘We apologise for this error. All children had the same amount of time to complete the test and were able to consult the reading booklet at any time. We expect it will have taken pupils around 10 minutes to read the booklet, so this discrepancy should not have led to any significant advantage for those pupils where reading time was not correctly allotted.’

NAHT has now posted the reply it received from STA on 16 May. It says:

‘Ofqual, our regulator, is aware of the error and of the information set out below and will, of course, have to independently assure itself that the test remains valid. We would not expect this to occur until marking and level setting processes are complete, in line with their normal timescales.’

It then sets out the reasons why it believes the test remains valid. These suggest the advantage to the learners following the incorrect instructions was minimal since:

  • few would need less than 10 minutes’ reading time;
  • pre-testing showed 90% of learners completed the test within 50 minutes;
  • in 2013 only 3.5% of learners were within 1 or 2 marks of the threshold;
  • a comparability study in which the timing of the Levels 3-5 test was changed showed little difference in item difficulty.

NAHT says it will now review the test results in the light of this response.

 

 

Who is responsible?

According to its most recent business plan, STA:

‘is responsible for setting and maintaining test standards’ (p3)

but it publishes little or nothing about the process involved, or how it handles representations such as that from NAHT.

Meanwhile, Ofqual says its role is:

‘to make sure the assessments are valid and fit for purpose, that the assessments are fair and manageable, that the standards are properly set and maintained and the results are used appropriately.

We have two specific objectives as set out by law:

  • to promote assessment arrangements which are valid, reliable and comparable
  • to promote public confidence in the arrangements.

We keep national assessments under review at all times. If we think at any point there might be a significant problem with the system, then we notify the Secretary of State for Education.’

Ofqual’s Chair has confirmed via Twitter that Ofqual was:

‘made aware at the time, considered the issues and observed level setting’.

Ofqual was content that the level-setting was properly undertaken.

 

 

I asked whether, in the light of that, Ofqual saw a role for itself in investigating the atypical results. I envisaged that this might take place under the Regulatory Framework for National Curriculum Assessments (2011).

This commits Ofqual to publishing annually its ‘programme for reviewing National Assessment arrangements’ (p14) as well as ‘an annual report on the outcomes of the review programme’ (p18).

However the most recent of these relates to 2011/12 and appeared in November of that year.

 

 

I infer from this that we may see some reaction from Ofqual, if and when it finally produces an annual report on National Curriculum Assessments in 2014, but that’s not going to appear before 2015 at the earliest.

I can’t help but feel that this is not quite satisfactory – that atypical test performance of this magnitude ought to trigger an automatic and transparent review, even if the overall number of learners affected is comparatively small.

If I were part of the system I would want to understand promptly exactly what happened, for fear that it might happen again.

If you are in any doubt quite how out of kilter the reading test outcomes were, consider the parallel results for Level 6 teacher assessment.

In 2013, 5,698 learners were assessed at Level 6 in reading through teacher assessment – almost exactly two-and-a-half times as many as achieved Level 6 in the test.

In 2014, a whopping 17,582 learners were assessed at Level 6 through teacher assessment, around 20 times as many as secured a Level 6 in the reading test.

If the ratio between test and teacher assessment results in 2014 had been the same as it was in 2013, the number successful on the test would have been over 7,000, eight-fold higher than the reported 851.
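
That counterfactual is simple arithmetic; a minimal sketch in Python, using only the figures quoted above:

```python
# Counterfactual: if the 2013 ratio of teacher assessment awards to test passes in reading
# (almost exactly 2.5 to 1) had held in 2014, how many passes would 17,582 TA awards imply?
ta_awards_2014 = 17_582
reported_test_passes_2014 = 851
ta_to_test_ratio_2013 = 2.5

implied_test_passes = ta_awards_2014 / ta_to_test_ratio_2013     # ~7,033
factor = implied_test_passes / reported_test_passes_2014         # ~8.3 times the reported figure

print(f"Implied passes: {implied_test_passes:.0f} ({factor:.1f}x the reported 851)")
```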

I rest my case.

 

The new regime

In February 2013, a DfE-commissioned report Investigation of Key Stage 2 Level 6 Tests recommended that:

‘There is a need to review whether the L6 test in Reading is the most appropriate test to use to discriminate between the highest ability pupils and others given:

a) that only around 0.3 per cent of the pupils that achieved at least a level 5 went on to achieve a level 6 in Reading compared to 9 per cent for Mathematics

b) there was a particular lack of guidance and school expertise in this area

c) pupil maturity was seen to be an issue

d) the cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits.’

This has been overtaken by the decision to withdraw all three Level 6 tests and to rely on single tests of reading, GPS and maths for all learners when the new assessment regime is introduced from 2016.

Draft test frameworks were published in March 2014, supplemented in July by sample questions, mark schemes and commentary.

Given the imminent introduction of this new regime, together with schools’ experience in 2014, it seems increasingly unlikely that 2015 Level 6 test entries in reading will approach the 120,000 figure suggested by the trend.

Perhaps more importantly, schools and assessment experts alike seem remarkably sanguine about the prospect of single tests for pupils demonstrating the full range of prior attainment, apart from those assessed via the P-Scales. (The draft test frameworks are worryingly vague about whether those operating at the equivalent of Levels 1 and 2 will be included.)

I could wish to be equally sanguine, on behalf of all those learners capable of achieving at least the equivalent of Level 6 after 2015. But, as things stand, the evidence to support that position is seemingly non-existent.

In October 2013, Ofqual commented that:

‘There are also some significant technical challenges in designing assessments which can discriminate effectively and consistently across the attainment range so they can be reported at this level of precision.’

A year on, we still have no inkling whether those challenges have been overcome.

 

GP

September 2014

 

 

 

 

Digging Beneath the Destination Measures

 

This post takes as its starting point the higher education destination data published by the Department for Education (DfE) in June 2014.

It explores:

  • The gaps between progression rates for students from disadvantaged backgrounds (defined in terms of eligibility for free school meals) and those of their more advantaged peers.
  • How these rates vary according to whether the students come from schools or colleges and the selectivity of the higher education to which they progress.
  • Regional differences, with a particular focus on Inner and Outer London.

Although these are officially classified as experimental statistics, they supply a valuable alternative perspective on national progress towards fair access for disadvantaged learners to selective universities.

Securing such progress is integral to the Government’s education and social mobility strategy, since it is embedded in DfE’s Impact Indicators, in BIS Performance Indicators and the Social Mobility Indicators. The DfE indicators depend on these destination measures.

The final section discusses the optimal policy response to the position revealed by this analysis. It:

  • Discusses the limitations of a free market solution combined with institutional autonomy, structural reform – especially the introduction of specialist post-16 providers – and the expected incorporation of these measures into the post-16 accountability framework.
  • Sets out the advantages of introducing a framework to support the market on both the demand and supply sides. This would secure a coherent and consistent menu of opportunities that might be targeted directly at the learners most likely to benefit. This might be undertaken at national or at regional level, including in London.
  • Suggests that – given the abundant evidence of stalled progress – the latter approach is most likely to bring about more immediate, significant and sustained improvement without excessive deadweight cost.

I am publishing this on the eve of The Brilliant Club’s Inaugural Conference, which asks the question

‘How can universities and schools help pupils from low participation backgrounds secure places and succeed at highly competitive universities?’

The organisers and participants are cordially invited to admit this second personal contribution to this debate, for I have already written extensively about the particular problem of fair access to Oxbridge for disadvantaged learners.

That post exposed some rather shaky statistical interpretation by the universities concerned and proposed a series of policy steps to address the worryingly low progression rates to these two universities. I will refer to it occasionally below, keeping repetition to a minimum. I commend it to you as a companion piece to this.

 

The Destination Data

DfE published SFR 19/2014: ‘Destinations of key stage 4 and key stage 5 pupils: 2011 to 2012’ on 26 June 2014.

These are described as ‘experimental statistics…as data are still being evaluated and remain subject to further testing in terms of their reliability and ability to meet customer needs’.

Nevertheless, subject to possible further refinement, DfE plans to incorporate KS5 destination measures into the new post-16 accountability arrangements to be introduced from 2016.  They are set to become increasingly significant for school sixth forms and post-16 providers alike.

The measures are based on student activity in the year immediately following the completion of A level or other Level 3 qualifications.

Students are included if:

  • They are aged 16, 17 or 18 and entered for at least one A level or other L3 qualification. (Those entered for AS level only are therefore excluded.)
  • They ‘show sustained participation…in all of the first two terms of the year after…’ ie from October 2011 to March 2012. (Dropouts are excluded but there is provision to pick up students transferring from one provider to another.)

The time lag is caused by the need to match data from the national pupil database (NPD) and the Higher Education Statistics Agency (HESA). The most recent matchable dataset combines the HESA data for academic year 2011/12 with the KS5 performance data for academic year 2010/11.

The 2011/12 destination data includes partial coverage of independent schools for the first time, alongside state-funded schools and colleges, but my analysis is confined to state-funded institutions.

The measure of disadvantage is eligibility for free school meals (FSM). Students are considered disadvantaged if they were eligible for and receiving free school meals at any point in Year 11, so immediately prior to KS5. This post typically uses ‘FSM’ or ‘FSM-eligible’ to describe this group.

FSM is a narrower definition of disadvantage than the Pupil Premium, which is based on FSM eligibility at any point in the preceding six years. These two definitions continue to have most currency in the schools sector, but are frequently disregarded in the higher education sector where several alternatives are deployed.

All measures of disadvantage have their upsides and downsides and, having explored this issue extensively in my previous post about Oxbridge, I do not propose to cover the same ground here.

I will only repeat the contention that, far too often, those facing criticism for their failure to improve fair access will criticise in turn the measures adopted, so producing a smokescreen to deflect attention from that failure.

The analysis that follows draws principally on tables included in the underlying data published alongside the SFR. The presentation of the data in these tables – used in all the published material – is important to bear in mind.

All totals are rounded to the nearest ten, while any single figure less than 6 is suppressed and replaced with ‘x’.

Hence a total of ‘10’ is an approximation which might represent any figure between 6 and 14.

It follows that a calculation involving several totals may be even more approximate. To take an important example, the sum of five totals, each given as ‘10’, may represent anything between 30 and 70.
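
The effect of rounding and suppression can be tracked explicitly with simple interval arithmetic; a minimal sketch in Python (the helper functions are mine, not part of any DfE methodology):

```python
# Interval arithmetic for the SFR's rounded and suppressed cell values: published totals
# are rounded to the nearest ten, and any single figure below 6 is suppressed ('x').
def cell_bounds(published):
    """Range of true values consistent with a published (non-zero) cell."""
    if published == "x":                       # suppressed: the true figure is 0 to 5
        return (0, 5)
    return (max(published - 4, 6), published + 4)

def sum_bounds(cells):
    """Bounds on the sum of several published cells."""
    bounds = [cell_bounds(c) for c in cells]
    return (sum(lo for lo, _ in bounds), sum(hi for _, hi in bounds))

print(cell_bounds(10))        # (6, 14), as noted above
print(sum_bounds([10] * 5))   # (30, 70): five cells of '10' could sum to anything in between
```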

This degree of imprecision is less than helpful when smaller cohorts – such as FSM-eligible students progressing to the most competitive universities – are under discussion.

A more detailed and sophisticated explanation of the methodology supporting the measures can be found in the Technical Note published alongside the SFR.

 

Nature of the Total Population

Table 1, below, shows how the national population is distributed between state-funded schools and colleges – and between FSM and non-FSM students from each of those settings.

 

Table 1: Distribution of national KS5 population and numbers progressing to a sustained education destination 2011/12

                                | State-funded schools         | State-funded colleges        | Total
                                | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total
No of students                  | 11,100 | 153,480 | 164,580   | 17,680 | 153,230 | 170,910   | 28,770 | 306,720 | 335,490
Sustained education destination | 8,020  | 114,470 | 122,490   | 10,430 | 90,830  | 101,260   | 18,450 | 205,300 | 223,760

 

Key points include:

  • Of the total KS5 student population of 335,490, only some 8.6% are FSM-eligible. Hence the analysis below is derived from a sample of 28,770 students.
  • Some 49% of this population attend mainstream state-funded schools compared with 51% at state-funded colleges. Total numbers are therefore distributed fairly evenly between the two sectors.
  • The FSM-eligible population attending schools is 6.7% of the total population attending schools and over 38% of the total FSM population. The former percentage is significantly lower than the proportion of FSM-eligible students aged 11-15 in the national secondary school population, which stood at around 16% in 2012.
  • The FSM-eligible population attending colleges is 10.3% of the total population attending colleges and over 61% of the total FSM population.

Hence the overall population is spread fairly evenly between schools and colleges, but a significant majority of the FSM-eligible population is located in the latter.

Furthermore:

  • The proportion of KS5 students progressing to a sustained education destination (as opposed to not progressing to any destination, or progressing to employment or training) is almost 67%, but amongst FSM-eligible learners this falls slightly, to 64%.
  • Amongst those attending schools, the proportion of FSM-eligible students progressing to a sustained education destination is approximately 72%; amongst those attending colleges it is much lower – some 59%.

The analysis below uses the total population as a base, rather than the proportion that progresses to a sustained educational destination.
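
The headline figures in the bullets above can be reproduced directly from Table 1; a minimal sketch in Python (variable names are mine, and because the published totals are themselves rounded, the derived percentages are approximate):

```python
# Reproducing the headline figures above from Table 1.
# (sector, group) -> (KS5 students, of whom sustained education destination)
table1 = {
    ("schools", "FSM"):      (11_100, 8_020),
    ("schools", "non-FSM"):  (153_480, 114_470),
    ("colleges", "FSM"):     (17_680, 10_430),
    ("colleges", "non-FSM"): (153_230, 90_830),
}

total = sum(students for students, _ in table1.values())
fsm = sum(students for (sector, group), (students, _) in table1.items() if group == "FSM")
print(f"FSM share of the KS5 cohort: {fsm / total:.1%}")          # ~8.6%

for (sector, group), (students, sustained) in table1.items():
    print(f"{sector:9s}{group:8s} sustained education rate: {sustained / students:.0%}")
# schools/FSM ~72% and colleges/FSM ~59%, matching the bullets above
```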

The incidence of FSM-eligible students also varies considerably by region. Chart 1 below shows the percentage of FSM and other students in each region’s overall KS5 cohort.

 

Chart 1: Percentages of FSM and non-FSM in KS5 cohort by region 2010/11

Destinations chart 1

 

The percentage of FSM-eligible students ranges from as low as 4.3% in the South East up to 30.3% in Inner London – a vast differential.

Inner London has comfortably more than twice the incidence of FSM students in Outer London, the next highest, and some seven times the rate in the South East.

The sizes of these cohorts are also extremely variable. There are over 4,000 students in the FSM populations for each of Inner and Outer London, compared with as few as 1,400 in the South West region. Taken together, Inner and Outer London account for slightly over 30% of the total English FSM-eligible population.

However, the total KS5 population is far bigger in the South East (58,260) than in any other region, while Inner London (14,030) is the smallest population. The South East alone accounts for over 17% of the total KS5 cohort.

These variations – particularly the high incidence of FSM students within a relatively small overall KS5 population in Inner London – are bound to have a profound effect on progression to higher education.

The concentration in Inner London is such that it will almost certainly be a relatively easy task to prioritise FSM students’ needs and also achieve economies of scale through provision across multiple schools.

There will be heavy concentrations of FSM-eligible students in many secondary schools, as well as in post-16 provision in both schools and colleges. Significantly fewer institutions – secondary or post-16 – will have negligible FSM-eligible populations.

There will be a similar effect in Outer London, though patchier and not so profound.

 

Progression to a UK Higher Education Institution

Table 2: National breakdown of numbers progressing to a UK Higher education institution, 2011/12

                                | State-funded schools         | State-funded colleges        | Total
                                | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total
No of students                  | 11,100 | 153,480 | 164,580   | 17,680 | 153,230 | 170,910   | 28,770 | 306,720 | 335,490
UK HEI destination              | 6,250  | 95,880  | 102,130   | 7,290  | 67,130  | 74,420    | 13,540 | 163,010 | 176,550

 

Table 2, above, shows that:

  • The overall proportion progressing to a UK higher education institution is almost 53%, but this falls to 47% for FSM-eligible students.
  • The proportion of FSM students attending schools that progresses to a UK HEI is 56%, whereas the comparable proportion for those attending colleges is 41% – a significant difference of 15 percentage points.
  • The number of FSM students progressing from colleges (7,290) remains larger than that progressing from schools (6,250).
  • There is a six percentage point variation between the progression rates for FSM and non-FSM students attending schools (56% versus 62%). In colleges the variation is only three percentage points (41% versus 44%).

Chart 2, below, shows the percentage of the KS5 FSM cohort in each region progressing to a UK higher education institution, compared with the percentage of the KS5 non-FSM cohort doing so.

The overall progression rate for FSM-eligible students is very nearly twice as high in each of Inner and Outer London as it is in the South West, the lowest performing region.

Incredibly, in Inner London, the progression rate for FSM-eligible students slightly exceeds the rate for non-FSM students – and these two rates are also very close in Outer London.

 

Chart 2: Percentages of FSM and non-FSM progressing to UK HE by region 2011/12

Destinations chart 2

 

There is relatively little disparity between the regional progression rates for non-FSM students – only 16 percentage points variation between the highest and lowest performing regions (63% in Outer London versus 47% in the South West), compared with a 30 percentage point variation for FSM students (63% in Inner London versus 33% in South West England).

Outside London, the regions with the smallest variation between progression rates for FSM and non-FSM respectively are the West Midlands (nine percentage points) and Yorkshire and Humberside (eleven percentage points). The largest variation is in the North East (seventeen percentage points).

It is worth labouring the point by noting that FSM-eligible students located in London are almost twice as likely to progress to some form of UK higher education as those in the South West and the South East, and more likely to progress than non-FSM students in every other region, with the sole exception of Outer London.

London is clearly an outstanding success in these terms, so bearing out all the recent publicity given to London’s relative success in securing high levels of attainment while simultaneously closing FSM gaps.

Some other regions need to work much harder than others to close this widening participation gap.

 

Progression to Selective UK Higher Education

But does this marked disparity between London and other English regions extend to progression to selective universities?

The destinations data incorporates several different measures of selectivity, each a subset of its predecessor:

  • Top third: the top 33% of HEIs, as measured by their mean UCAS tariff score, based on the best three A level grades of students admitted (other qualifications are excluded). The subset of institutions within this group changes annually, although 88% of those represented in 2011/12 had been included for six consecutive years, from 2006/07 onwards. (The technical note includes a full list at Annex 1.)
  • Russell Group: institutions belonging to the self-selecting Russell Group, all of which are represented within the top third.
  • Oxbridge: comprising Oxford and Cambridge, two particularly prominent members of the Russell Group which, rightly or wrongly, are perceived to be the pinnacle of selectivity in UK higher education (an assumption discussed in my Oxbridge post).

The last two of these feature in DfE’s Impact Indicators, alongside the percentage of FSM-eligible learners progressing to any university. The first is utilised in the Social Mobility Indicators (number 13), but to compare progression from state and independent institutions respectively.

The sections that follow look at each of these in order of selectivity, beginning with a national level comparison between progression rates for schools and colleges and proceeding to examine regional disparities for schools and colleges together.

 

Progression to the Top Third

Table 3 compares numbers of FSM-eligible and non-FSM learners progressing to top third institutions from state-funded schools and colleges respectively.

 

Table 3: National numbers progressing to UK HEIs and ‘Top Third’ HEIs in 2011/12

                                | State-funded schools         | State-funded colleges        | Total
                                | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total     | FSM    | Non-FSM | Total
No of students                  | 11,100 | 153,480 | 164,580   | 17,680 | 153,230 | 170,910   | 28,770 | 306,720 | 335,490
UK HEI destination              | 6,250  | 95,880  | 102,130   | 7,290  | 67,130  | 74,420    | 13,540 | 163,010 | 176,550
Top third destination           | 1,300  | 35,410  | 36,710    | 920    | 15,000  | 15,920    | 2,210  | 50,410  | 52,620

 

The numbers reveal that:

  • The overall progression rate for KS5 students to top third institutions is 15.7%, but this masks a difference of almost nine percentage points between non-FSM students (16.4%) and their FSM peers (7.7%). Hence non-FSM students are more than twice as likely to gain a place at a top third institution.
  • School-based students are much more likely to reach top third institutions than those at colleges (22.3% versus 9.3%). The same is true amongst the FSM population – the FSM-eligible progression rate from schools is 11.7%, compared with just 5.2% from colleges. This is a substantively larger differential than applies in respect of all UK higher education.
  • Whereas the raw number of FSM learners progressing to any UK HE destination is higher in colleges, the reverse is true when it comes to the top third.
  • Overall, almost 30% of KS5 students progressing to a UK HE institution make it to one in the top third. But whereas roughly one in three (31%) of non-FSM students do so, only one in six (16.3%) of FSM students manage this.
  • When it comes to FSM students from schools and colleges respectively, approximately one in five (20.8%) of FSM students from schools who progress to a UK HE institution make it to a top third institution, whereas this is true of around one in eight of those from colleges (12.6%).

In sum, there are very significant gaps at national level between FSM-eligible progression rates to all UK higher education on one hand and top third institutions on the other. There are equally significant gaps in the FSM progression rates to top third institutions from schools and colleges respectively.
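
The progression rates quoted in the bullets above follow directly from Table 3; a minimal sketch in Python, subject to the same rounding caveats (the sector-level non-FSM rates are not quoted in the text but come from the same table):

```python
# Top third progression rates, computed from Table 3 (state-funded schools and colleges).
cohort    = {("schools", "FSM"): 11_100, ("schools", "non-FSM"): 153_480,
             ("colleges", "FSM"): 17_680, ("colleges", "non-FSM"): 153_230}
top_third = {("schools", "FSM"): 1_300,  ("schools", "non-FSM"): 35_410,
             ("colleges", "FSM"): 920,   ("colleges", "non-FSM"): 15_000}

for key in cohort:
    print(f"{key}: {top_third[key] / cohort[key]:.1%} of the KS5 cohort reach a 'top third' HEI")

# National rates by group, as quoted above (FSM ~7.7%, non-FSM ~16.4%)
for group in ("FSM", "non-FSM"):
    rate = (sum(v for k, v in top_third.items() if k[1] == group)
            / sum(v for k, v in cohort.items() if k[1] == group))
    print(f"{group}: {rate:.1%}")
```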

Chart 3, below, compares FSM and non-FSM progressions to top third higher education institutions in different regions.

 

Chart 3: Percentages of FSM and non-FSM students in the overall KS5 cohort who progressed to ‘top third’ HEIs in 2011/12

Destinations chart 3

One can see that:

  • The highest rate for non-FSM students is 24% in Outer London. Inner London ranks only fourth on this measure, having dropped behind the Eastern and South Eastern regions. It is only one percentage point above the national average.
  • The highest rate for FSM-eligible students is 12%, again in Outer London, with Inner London just behind at 11%. These are significantly higher than the next highest rates (7%) in the West Midlands and the South East.
  • The non-FSM rates exceed the FSM rates in every region. In the East and South West, the non-FSM rate is three times higher than the FSM rate and, even in Inner London, the gap is six percentage points in favour of non-FSM.

The huge differences between regional success rates for progression to all UK higher education and top third institutions respectively are illustrated by Chart 4.

 

Chart 4: Comparison of regional progression to all UK HE and ‘top third institutions, comparing FSM and non-FSM, 2011/12

Destinations chart 4

It is immediately clear that the top third progression rates are invariably much lower than for progression to all UK higher education institutions, for both FSM-eligible and non-FSM students.

  • The gap at national level between non-FSM students progressing to all institutions and top third institutions is 37 percentage points (53% versus 16%). The comparable gap for FSM students is 39 percentage points (47% versus 8%). So whereas almost half of FSM students progress to any UK higher education institution, fewer than one in ten progress to ‘top third’ institutions.
  • Whereas Inner London recorded 63% of FSM students progressing to all institutions and Outer London wasn’t far behind at 62%, their comparable percentages for FSM progression to ‘top third’ institutions are 11% and 12% respectively. Both these gaps – standing at 50 percentage points or so – are huge, and significantly larger than the national average of 39 percentage points. The smallest gap between these two progression rates for FSM students is 27 percentage points in the South East. So the gap in London is almost twice the size of the gap in the South East. Moreover, the gap between these two rates is larger for non-FSM than FSM students in every region outside London, where the reverse is true.
  • On the other hand, whereas nationally there is a ratio of around 6:1 between FSM progression rates to UK higher education and top third institutions respectively, this falls to around 5:1 in both Inner and Outer London. Conversely it reaches 9:1 in the North East. (The sketch following this list sets out the gap and ratio calculations.)
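
The distinction drawn above between percentage-point gaps and ratios is easy to reproduce. A minimal sketch in Python, using only the rounded FSM rates quoted in the text (so the ratios differ slightly from those derived from unrounded data):

```python
# FSM progression rates (%) quoted above: to all UK HE and to 'top third' HEIs
fsm_rates = {
    "England":      (47, 8),
    "Inner London": (63, 11),
    "Outer London": (62, 12),
}

for region, (all_he, top_third) in fsm_rates.items():
    gap = all_he - top_third            # percentage points
    ratio = all_he / top_third          # e.g. roughly 6:1 nationally
    print(f"{region}: gap {gap}pp, ratio {ratio:.1f}:1")

# England: gap 39pp, ratio 5.9:1
# Inner London: gap 52pp, ratio 5.7:1
# Outer London: gap 50pp, ratio 5.2:1
```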

Overall, it is clear that London leads the way on both measures of FSM progression. But the huge lead London has established in terms of progression to all UK higher education only serves to emphasise its rather more limited progress against the more demanding benchmark. That said, London is still achieving close to twice the rate of the next best region on the more demanding measure.

 

Russell Group

We might expect a broadly similar pattern in respect of progression rates to Russell Group universities, but it should also be instructive to compare performance on these two selective measures, even though cohorts are now small enough for the impact of rounding to be felt.

 

Table 4: National numbers progressing to all UK HE institutions, ‘top third’ institutions and Russell Group universities in 2011/12

                            State-funded schools          State-funded colleges         Total
                            FSM      Non-FSM   Total      FSM      Non-FSM   Total      FSM      Non-FSM   Total
No of students              11,100   153,480   164,580    17,680   153,230   170,910    28,770   306,720   335,490
UK HEI destination          6,250    95,880    102,130    7,290    67,130    74,420     13,540   163,010   176,550
Top third destination       1,300    35,410    36,710     920      15,000    15,920     2,210    50,410    52,620
Russell Group destination   740      24,180    24,920     510      9,790     10,300     1,240    33,970    35,220

 

Table 4 reveals that (a short sketch reproducing these derived percentages follows the list):

  • The overall national progression rate for KS5 students to Russell Group universities is 10.5%, compared with 15.7% for the top third. There is again a marked difference between the non-FSM rate (11.1%, compared with 16.4% for the top third) and the FSM rate (4.3%, compared with 7.7% for the top third). Whereas one in every nine non-FSM students progress to a Russell Group university, the corresponding odds for FSM are closer to one in 23. The ratio between FSM and non-FSM progression rates is larger at this higher level of selectivity.
  • The progression rate for all school-based students to Russell Group universities is 15.1% (compared with 22.3% for the top third), whereas the progression rate from colleges is much lower, at 6% (compared with 9.3% for the top third).
  • On the schools side, the FSM-eligible progression rate stands at 6.7% (against 11.7% for the top third), while in colleges it is as low as 2.9% (compared with 5.2% for the top third). The non-FSM rates are 15.8% for schools and 6.4% for colleges, so a higher proportion of FSM-eligible students from schools are successful than non-FSM students from colleges.
  • Almost 20% of all students who progress to a UK higher education institution go to a Russell Group university (compared with 30% going to a top third institution) but, for FSM-eligible learners, this falls to 9.2% (compared with 16.3% going to the top third). Whereas the FSM success rate for the top third was slightly more than half the non-FSM success rate, it is slightly less than half the non-FSM rate for Russell Group progression. The comparable percentages for schools and colleges are 11.8% and 7% respectively.
  • Overall, 66.9% of students reaching a ‘top third’ university are attending a Russell Group institution. But this overall ‘top third/RG conversion rate’ for FSM-eligible students is only 56.1%, almost eleven percentage points lower than the rate for all students. (There is only a small difference between schools and colleges in this respect.) Hence the chances of FSM-eligible students attending Russell Group institutions within the ‘top third’ are significantly lower than those of their more advantaged peers.
  • It is also instructive to compare the different size of these cohorts. The overall non-FSM cohort progressing to Russell Group universities is 27 times the size of the FSM cohort doing so. Put another way, the overall FSM cohort is just 3.5% of the total population progressing to Russell Group institutions. (Interestingly, this falls to 3% for those attending schools whereas the comparable percentage for those attending colleges is higher at 5%.) The total number of FSM-eligible students going on to all Russell Group institutions is about half the number of non-FSM students progressing to Oxbridge alone.
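
The derived percentages in the bullets above can be reproduced directly from the rounded counts in Table 4. A minimal sketch in Python (all published figures are rounded to the nearest ten, so the results are approximate):

```python
# Rounded counts from Table 4: (FSM, non-FSM, total)
table4 = {
    "cohort":        (28_770, 306_720, 335_490),
    "uk_he":         (13_540, 163_010, 176_550),
    "top_third":     ( 2_210,  50_410,  52_620),
    "russell_group": ( 1_240,  33_970,  35_220),
}
FSM, NON_FSM, TOTAL = 0, 1, 2

def pct(numerator, denominator):
    return round(100 * numerator / denominator, 1)

# Progression rates against the whole KS5 cohort
print(pct(table4["russell_group"][TOTAL],   table4["cohort"][TOTAL]))    # 10.5
print(pct(table4["russell_group"][NON_FSM], table4["cohort"][NON_FSM]))  # 11.1
print(pct(table4["russell_group"][FSM],     table4["cohort"][FSM]))      # 4.3

# Share of HE entrants who reach a Russell Group university
print(pct(table4["russell_group"][TOTAL], table4["uk_he"][TOTAL]))       # 19.9 ('almost 20%')
print(pct(table4["russell_group"][FSM],   table4["uk_he"][FSM]))         # 9.2

# 'Top third/RG conversion rate'
print(pct(table4["russell_group"][TOTAL], table4["top_third"][TOTAL]))   # 66.9
print(pct(table4["russell_group"][FSM],   table4["top_third"][FSM]))     # 56.1
```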

Chart 5, immediately below, provides a region-by-region comparison of FSM-eligible and non-FSM progression rates to Russell Group universities.

 

Chart 5: Percentage of KS5 cohort – FSM and non-FSM – progressing to Russell Group universities by region, 2011/12

Destinations chart 5

 

This shows that:

  • Outer London is leading the way in terms of progression by FSM-eligible and non-FSM students alike. On the non-FSM side it is comfortably ahead of the North West, followed by the rest of the pack. Inner London brings up the rear, a full five percentage points behind the outer boroughs.
  • When it comes to FSM-eligible students there is little to choose between the regions, since they are all clustered between 3% and 6%. But it is much harder to establish real distinctions when percentages are so low. Inner London seems to be in the middle of the pack for FSM progression, suggesting it is performing respectably but not outstandingly on this measure.
  • The numbers – see Table 5 below – indicate that Outer London contributes one in five of the FSM cohort progressing to Russell Group institutions, while Inner and Outer London together account for more than a third. (This is an important fact to bear in mind when contemplating the case for a separate London-wide strategy to improve FSM progression rates.) Numbers contributed by the North East, East Midlands and South West regions are markedly low by comparison.

 

Table 5: Numbers of FSM-eligible students progressing to Russell Group universities from each region, and each region’s share of the national total, 2011/12

NE NW YH EM WM EE IL OL SE SW Eng
Numbers progressing to RG universities 50 240 110 50 160 60 190 260 80 50 1240
%age of total 4% 19% 9% 4% 13% 5% 15% 21% 6% 4% 100%

 

Oxbridge

Table 6 below shows national progression rates to Oxbridge by sector, differentiating FSM-eligible and non-FSM. It reveals that:

  • The overall progression rate for all students to Oxbridge is 0.72%, so roughly one in every 140 KS5 students goes to Oxbridge. Among those progressing to UK higher education, the odds shorten to around one in every 70. Of those progressing to Russell Group universities, 6.9% are headed to Oxbridge, equivalent to roughly one in every 15.
  • But, when it comes to FSM students, these rates are far lower. Of those progressing to Russell Group institutions, only one in 25 are destined for Oxbridge. Roughly one in every 270 FSM students progressing to UK higher education will attend these two universities.

 

Table 6: National numbers progressing to all UK HE institutions, top third, Russell Group and Oxbridge 2011/12

                            State-funded schools          State-funded colleges         Total
                            FSM      Non-FSM   Total      FSM      Non-FSM   Total      FSM      Non-FSM   Total
No of students              11,100   153,480   164,580    17,680   153,230   170,910    28,770   306,720   335,490
UK HEI destination          6,250    95,880    102,130    7,290    67,130    74,420     13,540   163,010   176,550
Top third destination       1,300    35,410    36,710     920      15,000    15,920     2,210    50,410    52,620
Russell Group destination   740      24,180    24,920     510      9,790     10,300     1,240    33,970    35,220
Oxbridge destination        40       1,850     1,890      10       520       530        50       2,370     2,420

 

  • If FSM-eligible students made up the same proportion of the Oxbridge intake as they do of the Russell Group intake, the two universities would together take in some 85 such students rather than the 50 recorded here.

But, for all we know, they may already be doing so, since we are at the very limits of the usefulness of these statistics.

The totals in the data above are rounded to the nearest 10, so the number of FSM students progressing to Oxbridge could be as low as 40 (35 from schools + 5 from colleges) or as high as 58 (44 from schools + 14 from colleges).
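
Both the counterfactual Oxbridge intake and the range implied by rounding can be reproduced with a few lines of arithmetic. A minimal sketch, assuming only the rounded figures from Table 6:

```python
# Rounded figures from Table 6
rg_fsm, rg_total = 1_240, 35_220                       # Russell Group destinations
oxbridge_total = 2_420                                 # all Oxbridge destinations
oxbridge_fsm_by_sector = {"schools": 40, "colleges": 10}

# Counterfactual: Oxbridge admitting FSM students at the Russell Group rate
fsm_share_of_rg = rg_fsm / rg_total                    # roughly 3.5%
print(round(oxbridge_total * fsm_share_of_rg))         # 85

# Bounds implied by rounding each published figure to the nearest 10
def bounds(rounded):
    """Smallest and largest integers that round to the published value."""
    return max(rounded - 5, 0), rounded + 4

low = sum(bounds(n)[0] for n in oxbridge_fsm_by_sector.values())
high = sum(bounds(n)[1] for n in oxbridge_fsm_by_sector.values())
print(low, high)                                       # 40 58
```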

This degree of possible variance rather calls into question the wisdom of using this data to support a national impact indicator.

It also reinforces the case for Oxford and Cambridge to publish accurate annual data on the actual numbers of formerly FSM-eligible students they admit, ensuring that they define that term in exactly the same manner as these destination measures.

A figure at the lower end of this range would be broadly consistent with other data and would suggest a continuing long-term failure to shift the numbers upwards.

BIS has provided figures over the years in answer to various Parliamentary Questions. These are derived by matching the NPD, the HESA Student Record and the Individualised Learner Record (ILR). They are rounded to the nearest five, rather than the nearest ten, and together supply annual outcomes from 2005/06 to 2010/11.

 

Table 7: FSM-eligible progression to Oxbridge 2005-2011, sourced from BIS replies to PQs

2005/06 2006/07 2007/08 2008/09 2009/10 2010/11
Oxford 25 20 20 25 15 15
Cambridge 20 25 20 20 25 25
TOTAL 45 45 40 45 40 40

 

My educated guess is that this number remained at or below 45 in 2011/12 and is unlikely to rise significantly for the foreseeable future.

But we should not be satisfied even if it doubles between 2010/11 and 2015/16, reaching 80-90 over that five year period. The desperately low base should not be used to justify such poverty of ambition.

I note in passing that the approach to rounding in the regional destination data is markedly unhelpful. Remember that all figures in the data are rounded to the nearest 10 and x indicates a number between 1 and 5. Table 8 shows the possible impact on figures for FSM progression to Oxbridge by region.

 

Table 8: Potential variance in numbers of FSM-eligible students progressing to Oxbridge by region 2011/12

Region Given Min Max Mean
NE X 1 5 3
NW 10 6 14 10
YH X 1 5 3
EM X 1 5 3
WM 10 6 14 10
EE 10 6 14 10
IL 10 6 14 10
OL 10 6 14 10
SE X 1 5 3
SW X 1 5 3
Eng 50 35 95 65

 

The obvious point is that the given total of 50 students could stand proxy for any figure between 35 and 95 (though one assumes that the real total must lie between 40 and 58, as indicated by the national figures in Table 6).
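
Table 8 follows mechanically from the publication rules. A minimal sketch, assuming the stated conventions (figures rounded to the nearest 10, with ‘x’ standing for a suppressed value between 1 and 5):

```python
# Published regional figures for FSM progression to Oxbridge (Table 8)
published = {"NE": "x", "NW": 10, "YH": "x", "EM": "x", "WM": 10,
             "EE": 10, "IL": 10, "OL": 10, "SE": "x", "SW": "x"}

def bounds(value):
    """Range of true values consistent with a published figure."""
    if value == "x":
        return 1, 5                     # suppressed: somewhere between 1 and 5
    low, high = value - 5, value + 4    # integers that round to the published figure
    return max(low, 6), high            # anything from 1 to 5 would have appeared as 'x'

total_low = sum(bounds(v)[0] for v in published.values())
total_high = sum(bounds(v)[1] for v in published.values())
total_mid = sum(sum(bounds(v)) / 2 for v in published.values())
print(total_low, total_high, int(total_mid))   # 35 95 65
```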

 

Putting it all Together

What are the headlines from the preceding analysis, as far as the progression of FSM-eligible students is concerned?

  • The destinations data generates a national population of almost 29,000 FSM-eligible students who constitute 8.6% of the total cohort. Over 60% of these are located in colleges, the remainder in schools. These national figures mask substantial regional variations: the FSM-eligible population ranges from 4.3% of the total (South East) to 30.3% (Inner London). The size of these regional FSM cohorts is also extremely variable. Inner and Outer London combined account for over 30% of the national FSM-eligible population.
  • At national level, some 64% of FSM-eligible KS5 learners progress to a sustained educational destination (as opposed to no sustained destination or else employment/training) but this rate is 72% amongst those who attended schools compared with 59% amongst those who attended colleges.
  • Over half (53%) of all KS5 students progress to a UK higher education institution, but the progression rate for FSM-eligible students is six percentage points lower at 47%.
  • About one in six of all KS5 students progress to a ‘top third’ institution, but only about one in 13 FSM-eligible students do so. About one in ten of all KS5 students attend Russell Group institutions, but this falls to one in 23 for FSM-eligible students.
  • There are significant differences between progression rates from schools and colleges respectively. From schools, the FSM-eligible progression rate to all UK higher education is 56%, to top third institutions it is 11.7% and to Russell Group Institutions it is 6.7%. The comparable percentages for colleges are consistently lower at 41%, 5.2% and 2.9% respectively. Whereas the number progressing to UK higher education is higher in colleges, the majority of those progressing to top third institutions are from schools. Almost 60% of those progressing to Russell Group universities are located in schools.
  • In regional terms, the FSM progression rate to all UK higher education ranges from 33.6% in the South West to 63.1% in Inner London, a huge 30 percentage point variation. Outer London is only one point behind at 61.9%. Exceptionally, the FSM progression rate in Inner London exceeds the non-FSM progression rate. Elsewhere, the non-FSM rate exceeds the FSM rate by between nine and 17 percentage points.
  • FSM progression rates to top third institutions are much lower, ranging from 4.4% (North East) to 12.4% (Outer London), which outscores Inner London at 10.6%. Both are well ahead of the national average at 7.7%. The non-FSM progression rates significantly exceed the FSM-eligible rates in every region. The gap is smallest in Inner London at 6.6 percentage points.
  • The gaps in London between FSM-eligible progression rates to all UK HE and the top third institutions reach 50 percentage points, significantly higher than the 39 percentage point national average. The smallest gap is 27 percentage points in the South East. Although London is leading the way on both these measures, its conspicuous success on the less demanding measure throws into sharper relief the limited progress made against the other.
  • A similar pattern is revealed when it comes to Russell Group universities, though the differences are more severe. The FSM progression rate ranges from 2.9% in Eastern England to 5.9% in Outer London, with Inner London only very slightly above the national average at 4.5%. Inner London also falls behind the North West on this measure. There are again significant differences between the rates for FSM and non-FSM. This gap is smallest in Inner London at 4.8 percentage points.
  • Chart 6, below, compares FSM and non-FSM progression rates by region to all UK higher education, the top third and Russell Group institutions respectively. The data is shown rounded to a single decimal place. This shows that the gaps between Russell Group and top third progression rates for FSM students are far bigger in London than anywhere else – 6.1 percentage points in Inner London and 6.5 percentage points in Outer London compared with a national average of 3.4 percentage points. FSM progression to Russell Group universities seems to be the point at which the celebrated London effect has stalled.

 

Chart 6: FSM and non-FSM progression by region to all UK HE, ‘top third’ and Russell Group institutions 2011/12

Destinations chart 6

 

  • As far as FSM progression to Oxbridge is concerned, the data is too limited and approximate to tell us anything substantial, other than to confirm that national FSM progression rates are scandalously low. There might have been a slight improvement – we can’t tell for certain – but from a horrifically low base. Five regions sent a maximum of 5 FSM-eligible learners to Oxbridge in 2012 while the other five each managed between 6 and 14.

 

What limits FSM progression to selective higher education?

Selective universities frequently argue that the main obstacle preventing the admission of more disadvantaged students is that far too few of them achieve the highest attainment levels necessary to secure admission.

Much is made in particular of the comparatively low number of FSM-eligible students achieving AAA+ grades at A level – though a PQ reply confirmed (Col 35W) that 546 students achieved this in 2011 and, as we have seen, the data above shows 1,240 FSM students progressing to Russell Group universities in 2012. Even if every one of those 546 had progressed to a Russell Group university, well over 50% of the FSM entrants must have held lower grades. Some courses require slightly lower grades and contextualised admissions practice is almost certainly more widespread than many are prepared to admit.

Unfortunately though, there is very little published data defining excellence gaps – the difference in performance at high attainment levels between advantaged and disadvantaged students – so it is much more difficult than it should be to find hard evidence of this relationship and how it varies by region.

There seems to be broad consensus in the research literature that, although attainment is not the only contributory factor, it is the most significant cause of under-representation, not least because under-representation is much reduced once controls for high attainment are introduced.

But it is also recognised that a variety of other factors are in play, including:

  • Personal, peer and community aspirations
  • Motivation and resilience
  • Acquisition of social and cultural capital
  • Subject choice (often discussed in terms of ‘facilitating subjects’)
  • Access to and quality of information, advice and guidance
  • Aversion to student debt
  • Whether educators demonstrate consistently high expectations and are favourably disposed towards the most selective universities.

Of course it is overly simplistic to regard such factors as distinct from high attainment, since several of them contribute indirectly towards it.

It is also important to bear in mind that the most demanding and highest tariff courses in particular disciplines are not necessarily located at the most prestigious universities, so – even allowing for screening effects – schools and colleges may be acting in many students’ best interests by pointing them in other directions.

And it is open to question whether disadvantaged students should be persuaded to attend higher education institutions that do not suit them personally, even if the future flow of economic benefits suggests this is the most rational decision. There is a trade-off between present happiness and future income and potential students – as adults – should arguably be able to exercise some freedom of choice. There is also the risk of drop-out to consider.

These factors will impact on different students with different intensities in different combinations and in very different ways: there can be no ‘one size fits all’ solution.

All this aside, it seems that – for the disadvantaged student cohort as a whole – the cumulative impact of such factors is much less significant than the impact of attainment.

So it would be a reasonable hypothesis that regions whose FSM (and non-FSM) students are under-represented at Russell Group universities demonstrate relatively lower levels of high attainment at GCSE and A level.

Could this help to explain why Inner London, so successful in terms of progression to UK higher education institutions, is far less so where Russell Group universities are concerned? The remainder of this section struggles to test this hypothesis with the very limited data available.

Taking A level first, Chart 7, below, compares top grade A level performance in 2013, the most recent year for which this data is available, while Chart 8 compares achievement of AAB A level grades or higher in 2011 and 2012 with FSM-eligible and non-FSM progression rates in 2012 drawn from the destinations data. (Note that the 2011 data does not supply separate AAB+ outcomes for Inner and Outer London).

 

Chart 7: Top A level performance by region 2012/13

Destinations chart 7

Chart 8: Regional achievement of AAB+ grades at A level in 2011 and 2012 compared with 2012 FSM and non-FSM progression rates to RG universities

Destinations chart 8

 

Chart 7 shows that Inner London returns the lowest rates of top-grade A level attainment, while Outer London is at the top of the range. This suggests that top grade A level attainment is depressed in Inner London, which might well be attributable to the exceptionally high incidence of relatively lower attaining FSM-eligible students.

Chart 8 again shows Outer London performing strongly on both top grade A level attainment and Russell Group progression, while Inner London lags behind.

A straightforward bilateral comparison between Inner and Outer London suggests a clear correlation between these two variables, although correlation does not amount to causation.

Moreover, the picture becomes somewhat more complex when other regions are factored in. Outer London has similar top grade A level attainment to the South East, but performs significantly better on Russell Group progression, even with a significantly higher proportion of FSM students.

Meanwhile Inner London, clearly the laggard in terms of top grade A level performance, is also the backmarker for non-FSM Russell Group progression. However, it still seems to perform comparatively well in terms of FSM progression, especially when compared with the South East.

This could be explained by the fact that relatively more FSM students in Inner London achieve the highest grades, or perhaps they are disproportionately the beneficiaries of contextualised admissions practice. Other factors could also be in play, not least the geographical proximity of several Russell Group institutions.

There is some evidence – published by the Social Mobility and Child Poverty Commission (SMCPC) and recently taken up in CfBT’s research on the ‘London effect’ – that disadvantaged students across London as a whole are relatively strong performers in higher grade GCSEs.

The SMCPC’s 2013 State of the Nation report (page 191) drew attention to overall London success on a measure of 8+ GCSEs at grades A*-B including English and maths (and excluding equivalents) – albeit distinguishing between those attracting Pupil Premium funding and their peers.

 

destinations capture 1

 

This table was converted into a chart in a recent CfBT research report on London.

 

destinations capture 2

 

Unfortunately, we cannot see the data for Inner and Outer London separately, so the ‘London effect’ may be disproportionately attributable to the Outer boroughs.

So where does this leave us?

The balance of probabilities suggests that the incidence of high attainment at GCSE and post-16 impacts strongly on progression to selective higher education, and so lies at the root of regional differences in progression rates.

Regions wishing to improve their performance need to look first at increasing high attainment, taking full account of disparities between the performance of FSM and non-FSM students.

There is some evidence to suggest that the celebrated ‘London effect’ has not translated into achievement of the highest attainment levels at A level in Inner London, especially compared with Outer London. This is impacting negatively on progression rates for FSM students but, ironically, progression rates for non-FSM students seem to be taking a bigger hit, perhaps because they do not benefit so significantly from contextualised admissions.

Any London-wide regional strategy to improve progression to the most selective universities would need to focus strongly on closing the gaps between FSM and non-FSM progression rates in Inner and Outer London respectively.

 

The policy response to poor FSM progression

The current policy response is multi-faceted but focused primarily on system-wide improvement, rather than organising and targeting support directly at the students most likely to benefit.

This is partly a function of a market-driven political philosophy, fundamental aversion to centrally organised programmes and commitment to a distributed model in which institutions enjoy substantial autonomy, subject to a strong accountability regime which focuses primarily (but not exclusively) on outcomes, including via the introduction of the destination measures discussed in this post.

By strengthening the system as a whole, it is anticipated that standards will rise across the board. A more rigorous national curriculum and more demanding qualifications will raise performance thresholds, ensuring that all learners are better prepared for progression, regardless of their destination. Some examinations are being revised to remove ceilings on the performance of the highest attainers.

Reporting of performance is adjusted to ensure that schools focus on improving attainment and progress of all learners, regardless of their starting point. Inspection includes checks that high attainers are not underachieving.

A series of interventions has been introduced to strengthen attainment and progression in maths and across other STEM subjects.

There have been efforts to strengthen the role of the Office for Fair Access (OFFA) and to introduce a co-ordinated ‘National Strategy for Access and Student Success’ involving collaboration between OFFA and HEFCE. Meanwhile HE student number controls have been relaxed, enabling institutions to expand their intakes of suitably qualified students.

Some degree of localised intervention is taking place through the free schools programme as a first tranche of selective 16-19 institutions has been established, often with an explicit mission to increase the flow of disadvantaged students to selective higher education.

Financial support has been targeted towards disadvantaged learners through the Pupil Premium, ensuring that schools receive extra funding for each disadvantaged learner they admit up to and including Year 11. Academies – including many selective schools – are permitted to prioritise admission of these learners when oversubscribed.

There are issues with aspects of this agenda, for example:

  • The introduction of universal end of KS2 tests may reduce their capacity to differentiate the performance of the highest attainers, so recently enhanced through the adoption of Level 6 tests. There is an associated risk that schools’ internal assessment systems will impose artificially low ceilings restricting high attainers’ progress.
  • Ofsted’s welcome focus on the most able in schools gives insufficient emphasis to those attracting the Pupil Premium and is not backed up by explicit guidance. Nor does it apply to the separate inspection of post-16 settings, undertaken under a different inspection framework.
  • OFFA and HEFCE cannot readily alter the behaviour of independent higher education institutions that make too little progress with fair access, or which improve too slowly. There are too few carrots and sticks and widespread resistance to the imposition of robust targets, even though the SMCPC has called for this repeatedly. Efforts at strengthening institutional collaboration are equally constrained.
  • As yet there are too few selective 16-19 institutions to make a real difference. They are too little focused on supporting improvements in neighbouring institutions and, even within their own intakes, do not always give sufficient priority to the most disadvantaged students.
  • The Pupil Premium stops at age 16 and schools are largely free to use it as they wish – there is no guarantee that each learner attracting the Premium will benefit commensurately and some risk that high attainers are amongst the most vulnerable in this respect.
  • One wonders whether the destination indicators, when introduced into the accountability regime in 2016, will be influential enough to change institutional behaviour. The simultaneous deployment of several different measures of selectivity may dilute their impact. On the other hand, a single measure would be too blunt an instrument.

But these are second order issues. Overall, the current education reform programme can be expected to bring about some improvement in FSM progression rates to selective higher education.

However:

  • It will take a comparatively long time.
  • There is significant deadweight.
  • Fault lines between higher education and schools policy remain problematic.
  • Nothing holds these disparate policy elements together to ensure that ‘the whole is greater than the sum of the parts’.

 

Solving the Policy Design Problem

Given the context of wider government policy, what additional policy dimension should be introduced to secure significant improvements in progression rates for disadvantaged learners to selective higher education?

The missing component – which might be introduced nationally or piloted at regional level and subsequently rolled out – is a light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously.

This is by no means equivalent to a rigid, centralised top-down programme, but it does recognise that, left to its own devices, the free market will not create the conditions necessary for success. Some limited intervention is essential.

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education.

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs.

The supply side would use market intelligence to adjust the range of programmes and services to meet need from different constituencies and localities, acting swiftly to fill gaps in the market and eradicate over-supply.  Programmes and services attracting insufficient demand would close down, while popular programmes and services would expand to meet demand. Small providers with many competitors would discuss the benefits of collaboration to achieve economies of scale, so bringing down costs and increasing demand.

Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.

This would be a more than adequate replacement for Aimhigher funding, the loss of which is still felt keenly according to this recent DfE research report.
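
By way of illustration only, a minimal sketch of how such a personal budget might be assembled from the three streams listed above. The £935 Pupil Premium figure is the one quoted earlier; every other value is a hypothetical placeholder rather than part of the proposal:

```python
# Illustrative only: the £935 Pupil Premium rate is quoted in the text;
# all other values are invented placeholders, not proposed amounts.

def personal_budget(pupil_premium=935,       # annual allocation, up to Year 11
                    outreach_topslice=150,   # hypothetical share of the national fund
                    philanthropy=100,        # hypothetical bursaries / sponsorship
                    government_match=True):  # match funding, economic conditions permitting
    matched = philanthropy if government_match else 0
    return pupil_premium + outreach_topslice + philanthropy + matched

print(personal_budget())                     # 1285 (illustrative annual budget per learner)
```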

A solution of this kind would be largely self-regulating, requiring only minimal co-ordination and a small administrative budget. It would have several conspicuous advantages in terms of securing much greater coherence and consistency:

  • Across the age range, securing continuity and progression for all participating learners throughout secondary education up to the point of entry into higher education.
  • Between educational settings, especially at the key transition point between secondary and tertiary education at age 16, when half or more students might be expected to move to a different setting.
  • Regardless of geographical location, so that students are less disadvantaged by virtue of where they live, able to draw on high quality blended and online provision in locations where face-to-face provision is unviable.
  • Incorporating the contribution of national, regional and local centres of excellence – including for example new selective 16-19 institutions such as the London Academy for Excellence and the Harris Westminster Sixth Form – providing them with a platform to share and spread excellent practice and supply outreach of their own.
  • Providing a nexus for cross-sectoral partnership and collaboration, including collaborative efforts in the higher education sector recently launched by OFFA and HEFCE.
  • Supplying a context in which selective higher education institutions can be more transparent about their contextual admission offers and other fair access policies, enabling students to make proper comparisons when selecting their preferred institutions.
  • Accommodating and complementing the reform package I have already proposed to improve fair access to Oxbridge.

The first of these dimensions is particularly important given recent research published by the Social Mobility and Child Poverty Commission, which finds that:

‘Of 7,853 children from the most deprived homes who achieve level 5 in English and maths at age 11 (8.5%…), only 906 (11.5%) make it to an elite university. If they had the same trajectory as a child from one of the least deprived families, then 3,066 of these children would be likely to go to an elite university (39.0%) – suggesting that 2,160 children are falling behind.’

The report concludes:

‘Poorer students have lower average achievement at each stage of their education and even those who start strongly with higher achievement at Key Stages 1 and 2 are more likely to fall off their high achievement trajectory than their wealthier peers. The achievement of students from poorer backgrounds is particularly likely to fall away between Key Stage 2 and Key Stage 4, making secondary school a potentially important area of intervention for policymakers interested in increasing participation at high-status universities amongst young people from more deprived backgrounds.’
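
As an aside, the headline figures in the first quoted passage follow from simple arithmetic; the minor discrepancies below arise only because the quoted 39.0% is itself a rounded rate:

```python
# Reproducing the Commission's arithmetic from the figures quoted above
deprived_level5 = 7_853        # most deprived children reaching L5 in English and maths at 11
reached_elite = 906            # of whom progress to an 'elite' university
counterfactual_rate = 0.390    # rate for least deprived children with the same starting point

print(round(100 * reached_elite / deprived_level5, 1))   # 11.5 (%)
expected = round(deprived_level5 * counterfactual_rate)  # 3063 (the report gives 3,066)
print(expected - reached_elite)                          # 2157 (the report gives 2,160)
```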

Such an approach would be relatively inexpensive and fully scalable. (I have not properly costed it, but a £50m topslice from the annual £2.5bn national Pupil Premium budget – for which there is precedent – would be more than enough to meet the full cost to the taxpayer.)

A regional pilot – perhaps in London, or perhaps elsewhere – would accommodate an EEF-funded randomised controlled trial, though this would need to be extended if it incorporated a cohort undertaking the full cycle from Year 7 upwards.

The full benefits would not be realised until this first seven year cycle was completed, but one would anticipate significant positive impacts on attainment much sooner than that and, if students were allowed to participate from Year 10, or possibly even later, the impact on progression to selective universities would be felt within the lifetime of the next government.

 

Conclusion

There are strong equity and social mobility arguments for improving significantly the attainment of disadvantaged students and increasing their rates of progression to selective universities. This is also a sound investment in human capital, improving our national standing in the ‘global race’.

These progression rates have been stalled for a generation. Recent attempts to claim ‘green shoots of recovery’ relate only to the least selective top third measure. Even if they are realised, they are unlikely to wash through to Russell Group and Oxbridge admissions where the under-representation of FSM students is marked and, some would argue, a national scandal.

The publication of Destination Measures provides a valuable addition to our evidence base, though we know far too little about excellence gaps – between the performance of advantaged and disadvantaged learners on high attainment measures – so cannot readily explore the impact of these on progression rates.

Current education policy will likely bring about improvements, but only very slowly. Progression rates to the most selective institutions will be the hardest and slowest to shift. There are ongoing risks associated with cross-policy coherence and the fault lines between education policy for schools and higher education respectively (with the post-16 sector caught somewhere in between).

An additional policy strand is needed to secure vertical, horizontal and lateral coherence and deliver a whole greater than the sum of its parts. Potential design principles for this strand are set out above.  Substantial benefits would be realised during the lifetime of the next government.

Perfect Manifesto material!

 

GP

July 2014

 

Why Can’t We Have National Consensus on Educating High Attainers?

 

 

This post proposes a statement of core principles to provoke debate and ultimately build consensus about the education of high attaining learners.

It incorporates an Aunt Sally – admittedly imperfect, provocative and prolix – to illustrate the concept and stimulate initial thinking about what such a statement might contain.

The principles are designed to underpin effective provision. They are intended to apply at every level of the education system (whether national, regional or local) and to every learning setting and age group, from entry to Reception to admission to higher education (or equivalent) and all points in between.

Alongside the draft core principles – which should have more or less global application – I offer a complementary set of ‘reform principles’ which are specific to the English context and describe how our national education reform programme might be harnessed and applied more consistently to support high attainers.

This is expressed in system-wide terms, but could be translated fairly straightforwardly into something more meaningful for schools and colleges.

 

Justification

As education reforms continue to be developed and implemented at a rapid pace, it is essential that they fit together coherently. The various reforms must operate together smoothly, like interlocking cogs in a well-oiled machine, such that the whole is greater than the sum of the parts.

Coherence must be achieved across three dimensions:

  • Horizontally, across the span of education policy.
  • Vertically, across the age range, taking in the primary, secondary and tertiary sectors.
  • Laterally, for each and every learning setting to which it applies.

There is a risk that such co-ordination becomes more approximate as capacity is stretched by the sheer weight of reform, especially if the central resource traditionally devoted to this task is contracting simultaneously.

In an increasingly bottom-up system, some of the responsibility for ensuring the ‘fit’ across the span of education reforms can be devolved from the centre, initially to a range of intermediary bodies and ultimately to learning settings themselves.

Regardless of where the responsibility lies, there can be a tendency to cut corners, by making these judgements with reference to some notional average learner. But this ignores the needs and circumstances of atypical constituencies including high attainers.

High attainers may even find themselves at the bottom of the pecking order amongst these atypical constituencies, typically as a consequence of the misguided view that they are more or less self-sufficient educationally speaking.

A framework of sorts is necessary to support this process, to protect against the risk that high attainers may otherwise be short-changed and also to ensure flexibility of provision within broad but common parameters.

The Government has recently set a precedent by publishing a set of Assessment Principles ‘to underpin effective assessment systems within schools’.

This post applies that precedent to support the education of high attainers, providing a flexible framework, capable of adoption (with adaptation where necessary) by all the different bodies and settings engaged in this process.

 

The English policy context

I have sought to incorporate in the second set of ‘reform’ principles the full range of areas explored by this blog, which began life at roughly the same time as the present Government began its education reform programme.

They are designed to capture the reform agenda now, as we draw to the close of the 2013/14 academic year. They highlight aspects of reform that are likely to be dominant over the next three academic years, subject of course to any adjustments to the reform programme in the light of the 2015 General Election.

These include:

  • Introduction of a new national curriculum incorporating both greater challenge and greater flexibility, together with full exemption for academies.
  • Introduction of new assessment arrangements, including internal assessment in schools following the withdrawal of national curriculum levels and external assessment arrangements, particularly at the end of KS2.
  • Introduction of revised GCSE and A level qualifications, including a new recalibrated grading system for GCSE.
  • Radical changes to the accountability system, including the reporting of learners’ achievement and the inspection of provision in different learning settings. 
  • Ensuring that the Pupil Premium drives accelerated progress in closing attainment gaps between disadvantaged and advantaged learners.
  • Ensuring accelerated progress against updated social mobility indicators, including improvements in fair access to selective universities.
  • Strengthening system-wide collaboration, ensuring that new types of institution play a significant role in this process, developing subject-specific support networks (especially in STEM) and building the capacity and reach of teaching school alliances.

 

Process

The Aunt Sally might be used as a starting point by a small group charged with generating a viable draft set of principles, either stand-alone or supported by any additional scaffolding deemed necessary.

The preparation of the draft core principles would itself be a consensus-establishing exercise, helping to distinguish areas of agreement and critical sticking points requiring negotiation to resolve.

This draft might be issued for consultation for a fixed period. Responses would be sought directly from a range of key national organisations, all of which would subsequently be invited to endorse formally the final version, revised in the light of consultation.

This stage might entail some further extended negotiation, but the process itself would help to raise the profile of the issue.

Out in the wider system, educators might be encouraged to interact with the final version of the principles, to discuss and record how they might be adjusted or qualified to fit their own particular settings.

There might be an online repository and forum (using a free online platform) enabling educators to discuss their response to the principles, suggest localised adjustments and variants to fit their unique contexts, provide exemplification and share supporting resources, materials and links.

Some of the key national organisations might be encouraged to develop programmes and resources within their own purlieux which would link explicitly with the core principles.

Costs would be limited to the human resource necessary to co-ordinate the initial task and subsequently curate the online repository.

 

Provisos

The focus on high attainment (as a subset of high achievement) has been selected in preference to any categorisation of high ability, talent or giftedness because there are fewer definitional difficulties, the terminology is less problematic and there should be a correspondingly stronger chance of reaching consensus.

I have not at this stage included a definition of high attainers. Potentially one could adopt the definition used in the Primary and Secondary Performance Tables, or an alternative derived from Ofsted’s ‘most able’ concept.

The PISA high achievement benchmarks could be incorporated, so permitting England to compare its progress with other countries.

But, since we are working towards new attainment measures at the end of KS2 and KS4 alike, it may be more appropriate to develop a working definition based on what we know of those measures, adapting the definition as necessary once the measures are themselves more fully defined.

In the two sections following I have set out the two parts of my Aunt Sally:

  • A set of ten core principles, designed to embody a shared philosophy underpinning the education of high attainers and
  • A parallel set of ten reform principles, designed to show how England’s education reform agenda might be adapted and applied to support the education of high attainers.

As noted above, I have cast the latter in system-wide terms, hopefully as a precursor to developing a version that will apply (with some customisation) to every learning setting. I have chosen deliberately to set out the big picture from which these smaller versions might be derived.

My Aunt Sally is imbued with a personal belief in the middle way between a bottom-up, school-driven and market-based system on one hand and a rigid, top-down and centrally prescribed system on the other. The disadvantages of the latter still live in the memory, while those of the former are writ large in the current crisis.

Some of this flavour will be obvious below, especially in the last two reform principles, which embody what I call ‘flexible framework thinking’. You will need to make some allowances if you are of a different persuasion.

I have also been deliberately a little contentious in places, so as to stimulate reaction in readers. The final version will need to be more felicitously worded, but it should still be sharp enough to have real meaning and impact.

For there is no point in generating an anodyne ‘motherhood and apple pie’ statement that has no prospect of shifting opinion and behaviour in the direction required.

Finally, the current text is too long-winded, but I judged it necessary to include some broader context and signposting for those coming to this afresh. I am hopeful that, when this is shorn away, the slimmed-down version will be closer to its fighting weight.

 

Ten Core Principles

This section sets out ten essential principles that all parts of the education system should follow in providing for high achievers.

 

  1. Raising achievement – within the education system as a whole and for each and every learner – is one of the principal aims of education. It does not conflict with other aims, or with our duty to promote learners’ personal and social development, or their health, welfare and well-being.

 

  2. Securing high achievement – increasing the proportion of high achievers and raising the achievement of existing high-achievers – is integral to this aim.

 

  3. Both existing and potential high achievers have a right, equal to that of all other learners, to the blend of challenge and support they need to improve further – to become the best that they can be. No learner should be discriminated against educationally on the basis of their prior achievement, whether high or low or somewhere in between.

 

  4. We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.

 

  5. Securing high attainment is integral to securing high achievement. The route to high attainment may involve any or all of greater breadth, increased depth and a faster pace of learning. These elements should be prioritised and combined appropriately to meet each learner’s needs; a one-size-fits-all solution should not be imposed, nor should any of these elements be ruled out automatically.

 

  6. There must be no artificial ceilings or boundaries restricting high attainment, whether imposed by chronological age or by the expertise available in the principal learning setting; equally, there must be no ‘hot-housing’, resulting from an imbalance between challenge and support and an associated failure to respond with sensitivity to the learner’s wider needs.

 

  7. High attainers are an extremely diverse and disparate population. Some are much higher attainers than others. Some may be ‘all-rounders’ while others have particular strengths and areas for development. All need the right blend of challenge and support to improve alike in areas of strength and any areas of comparative weakness.

 

  8. Amongst the high-attaining population there is significant over-representation of some learner characteristics. But there is also significant diversity, resulting from the interaction between gender, special needs, ethnic and socio-economic background (and several other characteristics besides). This diversity can and should increase as excellence gaps are closed.

 

  9. Educators must guard against the false assumption that high attainment is a corollary of advantage. Equally, they must accept that, while effective education can make a significant difference, external factors beyond their control will also impact upon high attainment. The debate about the relative strength of genetic and environmental influences is irrelevant, except insofar as it obstructs universally high expectations and the instilling of a positive ‘growth mindset’ in all learners.

 

  10. High attainers cannot meet their own educational needs without the support of educators. Nor is it true that they have no such needs by virtue of their prior attainment. Investment in their continued improvement is valuable to them as individuals, but also to the country as a whole, economically, socially and culturally.

 

Ten Reform Principles

This section describes how different elements of educational reform might be harnessed to ensure a coherent, consistent and mutually supportive strategy for increasing high attainment.

The elements below are described in national system-wide terms, as they apply to the primary and secondary school sectors, but each should be capable of adjustment so it is directly relevant at any level of the system and to every learning setting.

 

  1. Revised national curriculum arrangements offer greater flexibility to design school curricula to meet high attainers’ needs. ‘Top down’ curriculum design, embodying the highest expectations of all learners, is preferable to a ‘deficit model’ approach derived from lowest common denominator thresholds. Exemplary models should be developed and disseminated to support schools in developing their own.

 

  2. The assessment system must enable high attainers to show what they know, understand and can do. Their needs should not be overlooked in the pursuit of universally applicable assessment processes. Formative assessment must provide accurate, constructive feedback and sustain high expectations, regardless of the starting point. Internal and external assessment alike must be free of undesirable ceiling effects.

 

  3. Regardless of their school, all high attainers should have access to opportunities to demonstrate excellence through national assessments and public examinations, including Level 6 assessment (while it exists) and early entry (where it is in their best interests). Progression across transition points – eg primary to secondary – should not require unnecessary repetition and reinforcement. It should be pre-planned, monitored and kept under review.

 

  4. High attainment measures should feature prominently when results are reported, especially in national School and College Performance Tables, but also on school websites and in the national data portal. Reporting should reveal clearly the extent of excellence gaps between the performance of advantaged and disadvantaged high attainers respectively.

 

  5. Ofsted’s inspection framework now focuses on the attainment and progress of ‘the most able’ in every school. Inspectors should adopt a consistent approach to judging all settings’ provision for high attainers, including explicit focus on disadvantaged high attainers. Inspectors and settings alike would benefit from succinct guidance on effective practice.

 

  6. The impact of the Pupil Premium on closing excellence gaps should be monitored closely. Effective practice should be captured and shared. The Education Endowment Foundation should ensure that impact on excellence gaps is mainstreamed within all its funded programmes and should also stimulate and support programmes dedicated to closing excellence gaps.

 

  7. The closing of excellence gaps should improve progression for disadvantaged high attainers, including to selective secondary, tertiary and higher education. Destination indicators should enable comparison of institutional success in this regard. Disadvantaged high attainers need access to tailored IAG to support fair access at every level. Targeted outreach to support effective transition is also essential at each transition point (typically 11, 16 and 18). Universities should be involved from KS2 onwards. The relevant social mobility measures should align with Pupil Premium ‘eligibility’. Concerted corrective action is required to improve progress whenever and wherever it stalls.

 

  8. System-wide collaboration is required to drive improvement. It must include all geographical areas, educational sectors and institutional types, including independent and selective schools. All silos – whether associated with localities, academy chains, teaching school alliances, subject specialism or any other subset of provision – must be broken down. This requires joint action by educational settings, voluntary sector organisations and private sector providers alike. Organisations active in the field must stop protecting their fiefdoms and work together for the common good.

 

  9. To minimise fragmentation and patchiness of provision, high attaining learners should have guaranteed access to a menu of opportunities organised within a coherent but flexible framework. Their schools, as lead providers, should facilitate and co-ordinate on their behalf. A similar approach is required to support educators with relevant school improvement, initial training, professional development and research. To support this parallel framework, both theoretical and practical knowledge of the ‘pedagogy of high attainment’ should be collected, organised and shared.

 

  10. All providers should be invited to position their services within these frameworks, using intelligence about the balance between demand and supply to inform the development of new products and services. Responsibility for overseeing the frameworks and for monitoring and reporting progress should be allocated to an independent entity within this national community. As far as possible this should be a self-funding and self-sustaining system.

 

Next Steps

I have already had some welcome interest in developing a set of core principles to support the education of high attaining learners.

This may be a vehicle to stimulate a series of useful partnerships, but it would be premature to publicise these preliminary discussions for fear that they do not reach fruition.

This post is intended to stimulate others to consider the potential benefits of such an approach – and I am at your service should you wish to discuss the idea further.

But if I have only caused you to reflect more deeply about your personal contribution to the education of high attainers, even then this effort has been worthwhile.

GP

May 2014

 

 

‘Poor but Bright’ v ‘Poor but Dim’

 

This post explores whether, in supporting learners from disadvantaged backgrounds, educators should prioritise low attainers over high attainers, or give them equal priority.

 

 

 

Introduction

Last week I took umbrage at a blog post and found myself engaged in a Twitter discussion with the author, one Mr Thomas.

 

 

Put crudely, the discussion hinged on the question whether the educational needs of ‘poor but dim’ learners should take precedence over those of the ‘poor but bright’. (This is Mr Thomas’s shorthand, not mine.)

He argued that the ‘poor but dim’ are the higher priority; I countered that all poor learners should have equal priority, regardless of their ability and prior attainment.

We began to explore the issue:

  • as a matter of educational policy and principle
  • with reference to inputs – the allocation of financial and human resources between these competing priorities and
  • in terms of outcomes – the comparative benefits to the economy and to society from investment at the top or the bottom of the attainment spectrum.

This post presents the discussion, adding more flesh and gloss from the Gifted Phoenix perspective.

It might or might not stimulate some interest in how this slightly different take on a rather hoary old chestnut plays out in England’s current educational landscape.

But I am particularly interested in how gifted advocates in different countries respond to these arguments. What is the consensus, if any, on the core issue?

Depending on the answer to this first question, how should gifted advocates frame the argument for educationalists and the wider public?

To help answer the first question I have included a poll at the end of the post.

Do please respond to that – and feel free to discuss the second question in the comments section below.

The structure of the post is fairly complex, comprising:

  • A (hopefully objective) summary of Mr Thomas’s original post.
  • An embedded version of the substance of our Twitter conversation. I have removed some Tweets – mostly those from third parties – and reordered a little to make this more accessible. I don’t believe I’ve done any significant damage to either case.
  • Some definition of terms, because there is otherwise much cause for confusion as we push further into the debate.
  • A digressive exploration of the evidence base, dealing with attainment data and budget allocations respectively. The former exposes what little we are told about how socio-economic gaps vary across the attainment spectrum; the latter is relevant to the discussion of inputs. Those pressed for time may wish to proceed directly to…
  • …A summing up, which expands in turn the key points we exchanged on the point of principle, on inputs and on outcomes respectively.

I have reserved until close to the end a few personal observations about the encounter and how it made me feel.

And I conclude with the customary brief summary of key points and the aforementioned poll.

It is an ambitious piece and I am in two minds as to whether it hangs together properly, but you are ultimately the judges of that.

 

What Mr Thomas Blogged

The post was called ‘The Romance of the Poor but Bright’ and the substance of the argument (incorporating several key quotations) ran like this:

  • The ‘effort and resources, of schools but particularly of business and charitable enterprise, are directed disproportionately at those who are already high achieving – the poor but bright’.
  • Moreover ‘huge effort is expended on access to the top universities, with great sums being spent to make marginal improvements to a small set of students at the top of the disadvantaged spectrum. They cite the gap in entry, often to Oxbridge, as a significant problem that blights our society.’
  • This however is ‘the pretty face of the problem. The far uglier face is the gap in life outcomes for those who take least well to education.’
  • Popular discourse is easily caught up in ‘the romance of the poor but bright’ but ‘we end up ignoring the more pressing problem – of students for whom our efforts will determine whether they ever get a job or contribute to society’. For ‘when did you last hear someone advocate for the poor but dim?’
  • ‘The gap most damaging to society is in life outcomes for the children who perform least well at school.’ Three areas should be prioritised to improve their educational outcomes:

    • Improving alternative provision (AP) which ‘operates as a shadow school system, largely unknown and wholly unappreciated’ – ‘developing a national network of high-quality alternative provision…must be a priority if we are to close the gap at the bottom’.

    • Improving ‘consistency in SEN support’ because ‘schools are often ill equipped to cope with these, and often manage only because of the extraordinary effort of dedicated staff’. There is ‘inconsistency in funding and support between local authorities’.

    • Introducing clearer assessment of basic skills, ‘so that a student could not appear to be performing well unless they have mastered the basics’.

  • While ‘any student failing to meet their potential is a dreadful thing’, the educational successes of ‘students with incredibly challenging behaviour’ and ‘complex special needs…have the power to change the British economy, far more so than those of their brighter peers.’

A footnote adds ‘I do not believe in either bright or dim, only differences in epigenetic coding or accumulated lifetime practice, but that is a discussion for another day.’

Indeed it is.

 

Our ensuing Twitter discussion

The substance of our Twitter discussion is captured in the embedded version immediately below. (Scroll down to the bottom for the beginning and work your way back to the top.)

 

 

Defining Terms

 

Poor, Bright and Dim

I take poor to mean socio-economic disadvantage, as opposed to any disadvantage attributable to the behaviours, difficulties, needs, impairments or disabilities associated with AP and/or SEN.

I recognise of course that such a distinction is more theoretical than practical, because, when learners experience multiple causes of disadvantage, the educational response must be holistic rather than disaggregated.

Nevertheless, the meaning of ‘poor’ is clear – that term cannot be stretched to include these additional dimensions of disadvantage.

The available performance data foregrounds two measures of socio-economic disadvantage: current eligibility for and take up of free school meals (FSM) and qualification for the deprivation element of the Pupil Premium, determined by FSM eligibility at some point within the last 6 years (known as ‘ever-6’).

Both are used in this post. Distinctions are typically drawn between disadvantaged and non-disadvantaged learners, though some of the supporting data compares outcomes for disadvantaged learners with outcomes for all learners, advantaged and disadvantaged alike.

The gaps that need closing are therefore:

  • between ‘poor and bright’ and other ‘bright’ learners (The Excellence Gap) and 
  • between ‘poor and dim’ and other ‘dim’ learners. I will christen this The Foundation Gap. (Both gaps are expressed a little more formally in the short sketch below.)
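
One informal way to write both gaps – my own shorthand, not a definition the post relies on – is as a difference in attainment rates on a chosen measure:

```latex
% Illustrative notation only: the post does not fix a formal definition.
% Let A denote success on a chosen attainment measure, e.g. KS2 L5+ or
% GCSE A*/A for an Excellence Gap, or a basic threshold such as KS2 L4
% for a Foundation Gap.
\[
\mathrm{Gap}(A) \;=\; P\bigl(A \mid \text{disadvantaged}\bigr) \;-\; P\bigl(A \mid \text{other}\bigr)
\]
% The KS2 tables later in the post report gaps with this sign convention
% (disadvantaged minus other), so Excellence Gaps appear as negative values
% at high thresholds; some of the GCSE figures quoted later give the same
% quantity as a positive magnitude instead.
```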

The core question is whether The Foundation Gap takes precedence over The Excellence Gap or vice versa, or whether they should have equal billing.

This involves immediate and overt recognition that classification as AP and/or SEN is not synonymous with the epithet ‘poor’, because there are many comparatively advantaged learners within these populations.

But such a distinction is not properly established in Mr Thomas’ blog, which applies the epithet ‘poor’ but then treats the AP and SEN populations as homogeneous and somehow associated with it.

 

By ‘dim’ I take Mr Thomas to mean the lowest segment of the attainment distribution – one of his tweets specifically mentions ‘the bottom 20%’. The AP and/or SEN populations are likely to be disproportionately represented within these two deciles, but they are not synonymous with that group either.

This distinction will not be lost on gifted advocates who are only too familiar with the very limited attention paid to twice exceptional learners.

Those from poor backgrounds within the AP and/or SEN populations are even more likely to be disproportionately represented in ‘the bottom 20%’ than their more advantaged peers, but even they will not constitute the entirety of ‘the bottom 20%’. A Venn diagram would likely show significant overlap, but that is all.

Hence disadvantaged AP/SEN learners are almost certainly a relatively poor proxy for the ‘poor but dim’.

That said I could find no data that quantifies these relationships.

The School Performance Tables distinguish a ‘low attainer’ cohort. (In the Secondary Tables the definition is determined by prior KS2 attainment and in the Primary Tables by prior KS1 attainment.)

These populations comprise some 15.7% of the total population in the Secondary Tables and about 18.0% in the Primary Tables. But neither set of Tables applies the distinction in their reporting of the attainment of those from disadvantaged backgrounds.

 

It follows from the definition of ‘dim’ that, by ‘bright’, Mr Thomas probably intends the two corresponding deciles at the top of the attainment distribution (even though he seems most exercised about the subset with the capacity to progress to competitive universities, particularly Oxford and Cambridge. This is a far more select group of exceptionally high attainers – and an even smaller group of exceptionally high attainers from disadvantaged backgrounds.)

A few AP and/or SEN students will likely fall within this wider group, fewer still within the subset of exceptionally high attainers. AP and/or SEN students from disadvantaged backgrounds will be fewer again, if indeed there are any at all.

The same issues with data apply. The School Performance Tables distinguish ‘high attainers’, who constitute over 32% of the secondary cohort and 25% of the primary cohort. As with low attainers, we cannot isolate the performance of those from disadvantaged backgrounds.

We are forced to rely on what limited data is made publicly available to distinguish the performance of disadvantaged low and high attainers.

At the top of the distribution there is a trickle of evidence about performance on specific high attainment measures and access to the most competitive universities. Still greater transparency is fervently to be desired.

At the bottom, I can find very little relevant data at all – we are driven inexorably towards analyses of the SEN population, because that is the only dataset differentiated by disadvantage, even though we have acknowledged that such a proxy is highly misleading. (Equivalent AP attainment data seems conspicuous by its absence.)

 

AP and SEN

Before exploring these datasets I ought to provide some description of the different programmes and support under discussion here, if only for the benefit of readers who are unfamiliar with the English education system.

 

Alternative Provision (AP) is intended to meet the needs of a variety of vulnerable learners:

‘They include pupils who have been excluded or who cannot attend mainstream school for other reasons: for example, children with behaviour issues, those who have short- or long-term illness, school phobics, teenage mothers, pregnant teenagers, or pupils without a school place.’

AP is provided in a variety of settings where learners engage in timetabled education activities away from their school and school staff.

Providers include further education colleges, charities, businesses, independent schools and the public sector. Pupil Referral Units (PRUs) are perhaps the best-known settings – there are some 400 nationally.

A review of AP was undertaken by Taylor in 2012 and the Government subsequently embarked on a substantive improvement programme. This rather gives the lie to Mr Thomas’ contention that AP is ‘largely unknown and wholly unappreciated’.

Taylor complains of a lack of reliable data about the number of learners in AP but notes that the DfE’s 2011 AP census recorded 14,050 pupils in PRUs and a further 23,020 in other settings on a mixture of full-time and part-time placements. This suggests a total of slightly over 37,000 learners, though the FTE figure is unknown.

He states that AP learners are:

‘…twice as likely as the average pupil to qualify for free school meals’

A supporting Equality Impact Assessment qualifies this somewhat:

‘In Jan 2011, 34.6% of pupils in PRUs and 13.8%* of pupils in other AP, were eligible for and claiming free school meals, compared with 14.6% of pupils in secondary schools. [*Note: in some AP settings, free school meals would not be available, so that figure is under-stated, but we cannot say by how much.]’

If the PRU population is typical of the wider AP population, approximately one third qualify under this FSM measure of disadvantage, meaning that the substantial majority are not ‘poor’ according to our definition above.

Taylor confirms that overall GCSE performance in AP is extremely low, pointing out that in 2011 just 1.4% achieved five or more GCSE grades A*-C including [GCSEs in] maths and English, compared to 53.4% of pupils in all schools.

By 2012/13 the comparable percentages were 1.7% and 61.7% respectively (the latter for all state-funded schools), suggesting an increasing gap in overall performance. This is a cause for concern but not directly relevant to the issue under consideration.

The huge disparity is at least partly explained by the facts that many AP students take alternative qualifications and that the national curriculum does not apply to PRUs.

Data is available showing the full range of qualifications pursued. Taylor recommended that all students in AP should continue to receive ‘appropriate and challenging English and Maths teaching’.

Interestingly, he also pointed out that:

‘In some PRUs and AP there is no provision for more able pupils who end up leaving without the GCSE grades they are capable of earning.’

However, he fails to offer a specific recommendation to address this point.

 

Special Educational Needs (SEN) are needs or disabilities that affect children’s ability to learn. These may include behavioural and social difficulties, learning difficulties or physical impairments.

This area has also been subject to a major Government reform programme now being implemented.

There is significant overlap between AP and SEN, with Taylor’s review of the former noting that the population in PRUs is 79% SEN.

We know from the 2013 SEN statistics that 12.6% of all pupils on roll at PRUs had SEN statements and 68.9% had SEN without statements. But these populations represent only a tiny proportion of the total SEN population in schools.

SEN learners also have higher than typical eligibility for FSM. In January 2013, 30.1% of pupils across all SEN categories in primary, secondary and special schools were FSM-eligible, roughly twice the rate for all pupils. However, this also means that almost seven in ten are not caught by the definition of ‘poor’ provided above.

In 2012/13 23.4% of all SEN learners achieved five or more GCSEs at A*-C or equivalent, including GCSEs in English and maths, compared with 70.4% of those having no identified SEN – another significant overall gap, but not directly relevant to our comparison of the ‘poor but bright’ and the ‘poor but dim’.

 

Data on socio-economic attainment gaps across the attainment spectrum

Those interested in how socio-economic attainment gaps vary at different attainment levels cannot fail to be struck by how little material of this kind is published, particularly in the secondary sector, where such gaps tend to increase in size.

One cannot entirely escape the conviction that this reticence deliberately masks some inconvenient truths.

  • The ideal would be to have the established high/middle/low attainer distinctions mapped directly onto performance by advantaged/disadvantaged learners in the Performance Tables but, as we have indicated, this material is conspicuous by its absence. Perhaps it will appear in the Data Portal now under development.
  • Our next best option is to examine socio-economic attainment gaps on specific attainment measures that will serve as decent proxies for high/middle/low attainment. We can do this to some extent but the focus is disproportionately on the primary sector because the Secondary Tables do not include proper high attainment measures (such as measures based exclusively on GCSE performance at grades A*/A). Maybe the Portal will come to the rescue here as well. We can however supply some basic Oxbridge fair access data.
  • The least preferable option is to deploy our admittedly poor proxies for low attainers – SEN and AP. But there isn’t much information from this source either.

The analysis below looks consecutively at data for the primary and secondary sectors.

 

Primary

We know, from the 2013 Primary School Performance Tables, that the percentage of disadvantaged and other learners achieving different KS2 levels in reading, writing and maths combined, in 2013 and 2012 respectively, were as follows:

 

Table 1: Percentage of disadvantaged and all other learners achieving each national curriculum level at KS2 in 2013 in reading, writing and maths combined

       |  L3 or below  |  L4 or above  | L4B or above  |  L5 or above
       | Dis  Oth  Gap | Dis  Oth  Gap | Dis  Oth  Gap | Dis  Oth  Gap
 2013  |  13    5   +8 |  63   81  -18 |  49   69  -20 |  10   26  -16
 2012  |   x    x    x |  61   80  -19 |   x    x    x |   9   24  -15

 

This tells us relatively little, apart from the fact that disadvantaged learners are heavily over-represented at L3 and below and heavily under-represented at L5 and above.

The L5 gap is somewhat lower than the gaps at L4 and 4B respectively, but not markedly so. However, the L5 gap has widened slightly since 2012 while the reverse is true at L4.

This next table synthesises data from SFR51/13: ‘National curriculum assessments at key stage 2: 2012 to 2013’. It also shows gaps for disadvantage, as opposed to FSM gaps.

 

Table 2: Percentage of disadvantaged and all other learners achieving each national curriculum level, including differentiation by gender, in each 2013 end of KS2 test

            |     L3      |     L4      |     L4B     |     L5      |     L6
            |  D   O  Gap |  D   O  Gap |  D   O  Gap |  D   O  Gap |  D   O  Gap
Reading All | 12   6   +6 | 48  38  +10 | 63  80  -17 | 30  50  -20 |  0   1   -1
        B   | 13   7   +6 | 47  40   +7 | 59  77  -18 | 27  47  -20 |  0   0    0
        G   | 11   5   +6 | 48  37  +11 | 67  83  -16 | 33  54  -21 |  0   1   -1
GPS     All | 28  17  +11 | 28  25   +3 | 52  70  -18 | 33  51  -18 |  1   2   -1
        B   | 30  20  +10 | 27  27    0 | 45  65  -20 | 28  46  -18 |  0   2   -2
        G   | 24  13  +11 | 28  24   +4 | 58  76  -18 | 39  57  -18 |  1   3   -2
Maths   All | 16   9   +7 | 50  41   +9 | 62  78  -16 | 24  39  -15 |  2   8   -6
        B   | 15   8   +7 | 48  39   +9 | 63  79  -16 | 26  39  -13 |  3  10   -7
        G   | 17   9   +8 | 52  44   +8 | 61  78  -17 | 23  38  -15 |  2   7   -5

 

This tells a relatively consistent story across each test and for boys as well as girls.

We can see that, at Level 4 and below, learners from disadvantaged backgrounds are clearly over-represented, perhaps with the exception of L4 GPS. But at L4B and above they are markedly under-represented.

Moreover, with the exception of L6 where low percentages across the board mask the true size of the gaps, disadvantaged learners tend to be significantly more under-represented at L4B and above than they are over-represented at L4 and below.

A different way of looking at this data is to compare the percentages of advantaged and disadvantaged learners respectively at L4 and L5 in each assessment (the short sketch after this list makes the arithmetic explicit).

  • Reading: Amongst disadvantaged learners the proportion at L5 is 18 percentage points lower than the proportion at L4, but amongst advantaged learners the proportion at L5 is 12 percentage points higher than at L4.
  • GPS: Amongst disadvantaged learners the proportion at L5 is 5 percentage points higher than the proportion at L4, but amongst advantaged learners the proportion at L5 is 26 percentage points higher than at L4.
  • Maths: Amongst disadvantaged learners the proportion at L5 is 26 percentage points lower than the proportion at L4, but amongst advantaged learners the proportion at L5 is only 2 percentage points lower than at L4.
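
The arithmetic behind those three bullets can be reproduced directly from the ‘All’ rows of Table 2. The sketch below is purely illustrative – my own working, not part of any published analysis:

```python
# Within-group change from L4 to L5 (percentage points), taken from the
# "All" rows of Table 2 above: disadvantaged (D) versus other (O) learners.
table2_all = {
    # test: (D at L4, D at L5, O at L4, O at L5)
    "Reading": (48, 30, 38, 50),
    "GPS":     (28, 33, 25, 51),
    "Maths":   (50, 24, 41, 39),
}

for test, (d_l4, d_l5, o_l4, o_l5) in table2_all.items():
    print(f"{test}: disadvantaged {d_l5 - d_l4:+d} pp, other {o_l5 - o_l4:+d} pp")

# Reading: disadvantaged -18 pp, other +12 pp
# GPS: disadvantaged +5 pp, other +26 pp
# Maths: disadvantaged -26 pp, other -2 pp
```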

If we look at 2013 gaps compared with 2012 (with teacher assessment of writing included in place of the GPS test introduced in 2013) we can see there has been relatively little change across the board, with the exception of L5 maths, which has been affected by the increasing success of advantaged learners at L6.

 

Table 3: Percentage of disadvantaged and all other learners achieving national curriculum levels 3-6 in each of reading, writing and maths in 2012 and 2013 respectively

             |     L3      |     L4      |     L5      |     L6
             |  D   O  Gap |  D   O  Gap |  D   O  Gap |  D   O  Gap
Reading 2012 | 11   6   +5 | 46  36  +10 | 33  54  -21 |  0   0    0
        2013 | 12   6   +6 | 48  38  +10 | 30  50  -20 |  0   1   -1
Writing 2012 | 22  11  +11 | 55  52   +3 | 15  32  -17 |  0   1   -1
        2013 | 19  10   +9 | 56  52   +4 | 17  34  -17 |  1   2   -1
Maths   2012 | 17   9   +8 | 50  41   +9 | 23  43  -20 |  1   4   -3
        2013 | 16   9   +7 | 50  41   +9 | 24  39  -15 |  2   8   -6

 

To summarise, as far as KS2 performance is concerned, there are significant imbalances at both the top and the bottom of the attainment distribution and these gaps have not changed significantly since 2012. There is some evidence to suggest that gaps at the top are larger than those at the bottom.

 

Secondary

Unfortunately there is a dearth of comparable data at secondary level, principally because of the absence of published measures of high attainment.

SFR05/2014 provides us with FSM gaps (as opposed to disadvantaged gaps) for a series of GCSE measures, none of which serve our purpose particularly well:

  • 5+ A*-C GCSE grades: gap = 16.0%
  • 5+ A*-C grades including English and maths GCSEs: gap = 26.7%
  • 5+ A*-G grades: gap = 7.6%
  • 5+ A*-G grades including English and maths GCSEs: gap = 9.9%
  • A*-C grades in English and maths GCSEs: gap = 26.6%
  • Achieving the English Baccalaureate: gap = 16.4%

Perhaps all we can deduce is that the gaps vary considerably in size, but tend to be smaller for the relatively less demanding and larger for the relatively more demanding measures.

For specific high attainment measures we are forced to rely principally on data snippets released in answer to occasional Parliamentary Questions.

For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8%. By 2009 the comparable percentages were 1.7% and 9.0% respectively, giving an increased gap of 7.3% (Col 568W)
  • In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)

 

Table 4: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

           FSM   All pupils   Gap
Maths      3.7      15.6      11.9
Eng lit    4.1      20.0      15.9
Eng lang   3.5      16.4      12.9
Physics    2.2      49.0      46.8
Chemistry  2.5      48.4      45.9
Biology    2.5      46.8      44.3
French     3.5      22.9      19.4
German     2.8      23.2      20.4

 

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10% (Col 488W)

There is much variation in the subject-specific outcomes at A*/A described above. But, when it comes to the overall 5+ GCSEs high attainment measure based on grades A*/A, the gap is much smaller than on the corresponding standard measure based on grades A*-C.

Further material of broadly the same vintage is available in a 2007 DfE statistical publication: ‘The Characteristics of High Attainers’.

There is a complex pattern in evidence here which is very hard to explain with the limited data available. More time series data of this nature – illustrating Excellence and Foundation Gaps alike – should be published annually so that we have a more complete and much more readily accessible dataset.

I could find no information at all about the comparative performance of disadvantaged learners in AP settings compared with those not from disadvantaged backgrounds.

Data is published showing the FSM gap for SEN learners on all the basic GCSE measures listed above. I have retained the generic FSM gaps in brackets for the sake of comparison:

  • 5+ A*-C GCSE grades: gap = 12.5% (16.0%)
  • 5+ A*-C grades including English and maths GCSEs: gap = 12.1% (26.7%)
  • 5+ A*-G grades: gap = 10.4% (7.6%)
  • 5+ A*-G grades including English and maths GCSEs: gap = 13.2% (9.9%)
  • A*-C grades in English and maths GCSEs: gap = 12.3% (26.6%)
  • Achieving the English Baccalaureate: gap = 3.5% (16.4%)

One can see that the FSM gaps for the more demanding measures are generally lower for SEN learners than they are for all learners. This may be interesting but, for the reasons given above, this is not a reliable proxy for the FSM gap amongst ‘dim’ learners.

 

When it comes to fair access to Oxbridge, I provided a close analysis of much relevant data in this post from November 2013.

The chart below shows the number of 15 year-olds eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19. The figures are rounded to the nearest five.

 

Chart 1: FSM-eligible learners admitted to Oxford and Cambridge 2005/06 to 2010/11


 

In sum, there has been no change in these numbers over the last six years for which data has been published. So while there may have been consistently significant expenditure on access agreements and multiple smaller mentoring programmes, it has had negligible impact on this measure at least.

My previous post set out a proposal for what to do about this sorry state of affairs.

 

Budgets

For the purposes of this discussion we need ideally to identify and compare total national budgets for the ‘poor but bright’ and the ‘poor but dim’. But that is simply not possible. 

Many funding streams cannot be disaggregated in this manner. As we have seen, some – including the AP and SEN budgets – may be aligned erroneously with the second of these groups, although they also support learners who are neither ‘poor’ nor ‘dim’ and have a broader purpose than raising attainment. 

There may be some debate, too, about which funding streams should be weighed in the balance.

On the ‘bright but poor’ side, do we include funding for grammar schools, even though the percentage of disadvantaged learners attending many of them is virtually negligible (despite recent suggestions that some are now prepared to do something about this)? Should the Music and Dance Scheme (MDS) be within scope of this calculation?

The best I can offer is a commentary that gives a broad sense of orders of magnitude, to illustrate in very approximate terms how the scales tend to tilt more towards the ‘poor but dim’ rather than the ‘poor but bright’, but also to weave in a few relevant asides about some of the funding streams in question.

 

Pupil Premium and the EEF

I begin with the Pupil Premium – providing schools with additional funding to raise the attainment of disadvantaged learners.

The Premium is not attached to the learners who qualify for it, so schools are free to aggregate the funding and use it as they see fit. They are held accountable for these decisions through Ofsted inspection and the gap-narrowing measures in the Performance Tables.

Mr Thomas suggests in our Twitter discussion that AP students are not significant beneficiaries of such support, although provision in PRUs features prominently in the published evaluation of the Premium. It is for local authorities to determine how Pupil Premium funding is allocated in AP settings.

One might also make a case that ‘bright but poor’ learners are not a priority either, despite suggestions from the Pupil Premium Champion to the contrary.

 

 

As we have seen, the Performance Tables are not sharply enough focused on the excellence gaps at the top of the distribution and I have shown elsewhere that Ofsted’s increased focus on the most able does not yet extend to the impact on those attracting the Pupil Premium, even though there was a commitment that it would do so. 

If there is Pupil Premium funding heading towards high attainers from disadvantaged backgrounds, the limited data to which we have access does not yet suggest a significant impact on the size of Excellence Gaps. 

The ‘poor but bright’ are not a priority for the Education Endowment Foundation (EEF) either.

This 2011 paper explains that it is prioritising the performance of disadvantaged learners in schools below the floor targets. At one point it says:

‘Looking at the full range of GCSE results (as opposed to just the proportions who achieve the expected standards) shows that the challenge facing the EEF is complex – it is not simply a question of taking pupils from D to C (the expected level of attainment). Improving results across the spectrum of attainment will mean helping talented pupils to achieve top grades, while at the same time raising standards amongst pupils currently struggling to pass.’

But this is just after it has shown that the percentages of disadvantaged high attainers in its target schools are significantly lower than elsewhere. Other things being equal, the ‘poor but dim’ will be the prime beneficiaries.

It may now be time for the EEF to expand its focus to all schools. A diagram from this paper – reproduced below – demonstrates that, in 2010, the attainment gap between FSM and non-FSM was significantly larger in schools above the floor than in those below the floor that the EEF is prioritising. This is true in both the primary and secondary sectors.

It would be interesting to see whether this is still the case.

 

[EEF diagram: FSM/non-FSM attainment gaps in primary and secondary schools above and below the floor, 2010]

 

AP and SEN

Given the disaggregation problems discussed above, this section is intended simply to give some basic sense of orders of magnitude – lending at least some evidence to counter Mr Thomas’ assertion that the ‘effort and resources, of schools… are directed disproportionately at those who are already high achieving – the poor but bright’.

It is surprisingly hard to get a grip on the overall national budget for AP. A PQ from early 2011 (Col 75W) supplies a net current expenditure figure for all English local authorities of £530m.

Taylor’s Review fails to offer a comparable figure, but my rough estimates based on the per pupil costs he supplies suggest a revenue budget of at least £400m. (Taylor suggests average per pupil costs of £9,500 per year for full-time AP, although PRU places are said to cost between £12,000 and £18,000 per annum.)
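
For what it is worth, a back-of-envelope reconstruction along these lines, combining the 2011 census headcounts quoted earlier with Taylor’s per-place costs, lands in the same ballpark. The assumptions – every place treated as full-time, PRU places costed across Taylor’s quoted range – are mine, not Taylor’s:

```python
# Rough AP revenue estimate from the figures quoted above (illustrative only).
# Headcounts are the 2011 AP census numbers; per-place costs are Taylor's.
# Many placements are part-time, so the FTE-adjusted total would be lower.

pru_pupils = 14_050          # pupils in PRUs (2011 AP census)
other_ap_pupils = 23_020     # pupils in other AP settings (2011 AP census)

pru_cost_low, pru_cost_high = 12_000, 18_000   # Taylor: PRU place, per year
other_ap_cost = 9_500                          # Taylor: average full-time AP

low = pru_pupils * pru_cost_low + other_ap_pupils * other_ap_cost
high = pru_pupils * pru_cost_high + other_ap_pupils * other_ap_cost

print(f"£{low / 1e6:.0f}m to £{high / 1e6:.0f}m")   # £387m to £472m
```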

I found online a consultation document from Kent – England’s largest local authority – putting its AP revenue costs at over £11m for FY2014-15. Some 454 pupils attended Kent’s AP/PRU provision in 2012-13.

There must also be a significant capital budget. There are around 400 PRUs, not to mention a growing cadre of specialist AP academies and free schools. The total capital cost of the first AP free school – Derby Pride Academy – was £2.147m for a 50-place setting.

In FY2011-12, total annual national expenditure on SEN was £5.77 billion (Col 391W). There will have been some cost-cutting as a consequence of the latest reforms, but the order of magnitude is clear.

The latest version of the SEN Code of Practice outlines the panoply of support available, including the compulsory requirement that each school has a designated teacher to be responsible for co-ordinating SEN provision (the SENCO).

In short, the national budget for AP is sizeable and the national budget for SEN is huge. Per capita expenditure is correspondingly high. If we could isolate the proportion of these budgets allocated to raising the attainment of the ‘poor but dim’, the total would be substantial.

 

Fair Access, especially to Oxbridge, and some related observations

Mr Thomas refers specifically to funding to support fair access to universities – especially Oxbridge – for those from disadvantaged backgrounds. This is another area in which it is hard to get a grasp on total expenditure, not least because of the many small-scale mentoring projects that exist.

Mr Thomas is quite correct to remark on the sheer number of these, although they are relatively small beer in budgetary terms. (One suspects that they would be much more efficient and effective if they could be linked together within some sort of overarching framework.)

The Office for Fair Access (OFFA) estimates University access agreement expenditure on outreach in 2014-15 at £111.9m and this has to be factored in, as does DfE’s own small contribution – the Future Scholar Awards.

Were any expenditure in this territory to be criticised, it would surely be the development and capital costs for new selective 16-19 academies and free schools that specifically give priority to disadvantaged students.

The sums are large, perhaps not outstandingly so compared with national expenditure on SEN for example, but they will almost certainly benefit only a tiny localised proportion of the ‘bright but poor’ population.

There are several such projects around the country. Some of the most prominent are located in London.

The London Academy of Excellence (capacity 420) is fairly typical. It cost an initial £4.7m to establish, plus a lease requiring a further £400K annually.

But this is dwarfed by the projected costs of the Harris Westminster Sixth Form, scheduled to open in September 2014. Housed in a former government building, the capital cost is reputed to be £45m for a 500-place institution.

There were reportedly disagreements within Government:

‘It is understood that the £45m cost was subject to a “significant difference of opinion” within the DfE where critics say that by concentrating large resources on the brightest children at a time when budgets are constrained means other children might miss out…

But a spokeswoman for the DfE robustly defended the plans tonight. “This is an inspirational collaboration between the country’s top academy chain and one of the best private schools in the country,” she said. “It will give hundreds of children from low income families across London the kind of top quality sixth-form previously reserved for the better off.”’

Here we have in microcosm the debate to which this post is dedicated.

One blogger – a London College Principal – pointed out that the real issue was not whether the brightest should benefit over others, but how few of the ‘poor but bright’ would do so:

‘£45m could have a transformative effect on thousands of 16-19 year olds across London… £45m could have funded at least 50 extra places in each college for over 10 years, helped build excellent new facilities for all students and created a city-wide network to support gifted and talented students in sixth forms across the capital working with our partner universities and employers.’

 

Summing Up

There are three main elements to the discussion: the point of principle, the inputs and the impact. The following sections deal with each of these in turn.

 

Principle

Put bluntly, should ‘poor but dim’ kids have higher priority for educators than ‘poor but bright’ kids (Mr Thomas’ position) or should all poor kids have equal priority and an equal right to the support they need to achieve their best (the Gifted Phoenix position)?

For Mr Thomas, it seems this priority is determined by whether – and how far – the learner is behind undefined ‘basic levels of attainment’ and/or mastery of ‘the basics’ (presumably literacy and numeracy).

Those below the basic attainment threshold have higher priority than those above it. He does not say so but this logic suggests that those furthest below the threshold are the highest priority and those furthest above are the lowest.

So, pursued to its logical conclusion, this would mean that the highest attainers would get next to no support while a human vegetable would be the highest priority of all.

However, since Mr Thomas’ focus is on marginal benefit, it may be that those nearest the threshold would be first in the queue for scarce resources, because they would require the least effort and resources to lift above it.

This philosophy drives the emphasis on achievement of national benchmarks and predominant focus on borderline candidates that, until recently, dominated our assessment and accountability system.

For Gifted Phoenix, every socio-economically disadvantaged learner has, by virtue of that disadvantage, an equal claim to the support they need to improve their attainment.

There is no question of elevating some ahead of others in the pecking order because they are further behind on key educational measures since, in effect, that is penalising some disadvantaged learners on the grounds of their ability or, more accurately, their prior attainment.

This philosophy underpins the notion of personalised education and is driving the recent and welcome reforms of the assessment and accountability system, designed to ensure that schools are judged by how well they improve the attainment of all learners, rather than predominantly on the basis of the proportion achieving the standard national benchmarks.

I suggested that, in deriding ‘the romance of the poor but bright’, Mr Thomas ran the risk of falling into ‘the slough of anti-elitism’. He rejected that suggestion, while continuing to emphasise the need to ‘concentrate more’ on ‘those at risk of never being able to engage with society’.

I have made the assumption that Mr Thomas is interested primarily in KS2 and GCSE or equivalent qualifications at KS4, given his references to KS2 L4, basic skills and ‘paper qualifications needed to enter meaningful employment’.

But his additional references to ‘real qualifications’ (as opposed to paper ones) and engaging with society could well imply a wider range of personal, social and work-related skills for employability and adult life.

My preference for equal priority would apply regardless: there is no guarantee that high attainers from disadvantaged backgrounds will necessarily possess these vital skills.

But, as indicated in the definition above, there is an important distinction to be maintained between:

  • educational support to raise the attainment, learning and employability skills of socio-economically disadvantaged learners and prepare them for adult life and
  • support to manage a range of difficulties – whether behavioural problems, disability, physical or mental impairment – that impact on the broader life chances of the individuals concerned.

Such a distinction may well be masked in the everyday business of providing effective holistic support for learners facing such difficulties, but this debate requires it to be made and sustained given Mr Thomas’s definition of the problem in terms of the comparative treatment of the ‘poor but bright’ and the ‘poor but dim’.

Having made this distinction, it is not clear whether he himself sustains it consistently through to the end of his post. In the final paragraphs the term ‘poor but dim’ begins to morph into a broader notion encompassing all AP and SEN learners regardless of their socio-economic status.

Additional dimensions of disadvantage are potentially being brought into play. This is inconsistent and radically changes the nature of the argument.

 

Inputs

By inputs I mean the resources – financial and human – made available to support the education of ‘dim’ and ‘bright’ disadvantaged learners respectively.

Mr Thomas also shifts his ground as far as inputs are concerned.

His post opens with a statement that ‘the effort and resources’ of schools, charities and businesses are ‘directed disproportionately’ at the poor but bright – and he exemplifies this with reference to fair access to competitive universities, particularly Oxbridge.

When I point out the significant investment in AP compared with fair access, he changes tack – ‘I’m measuring outcomes not just inputs’.

Then later he says ‘But what some need is just more expensive’, to which I respond that ‘the bottom end already has the lion’s share of funding’.

At this point we have both fallen into the trap of treating the entirety of the AP and SEN budgets as focused on the ‘poor but dim’.

We are failing to recognise that they are poor proxies because the majority of AP and SEN learners are not ‘poor’, many are not ‘dim’, these budgets are focused on a wider range of needs and there is significant additional expenditure directed at ‘poor but dim’ learners elsewhere in the wider education budget.

Despite Mr Thomas’s opening claim, it should be reasonably evident from the preceding commentary that my ‘lion’s share’ point is factually correct. His suggestion that AP is ‘largely unknown and wholly unappreciated’ flies in the face of the Taylor Review and the Government’s subsequent work programme.

SEN may depend heavily on the ‘extraordinary effort of dedicated staff’, but at least there are such dedicated staff! There may be inconsistencies in local authority funding and support for SEN, but the global investment is colossal by comparison with the funding dedicated to the other side of the balance.

Gifted Phoenix’s position acknowledges that inputs are heavily loaded in favour of the SEN and AP budgets. This is as it should be since, as Thomas rightly notes, many of the additional services they need are frequently more expensive to provide. These services are not simply dedicated to raising their attainment, but also to tackling more substantive problems associated with their status.

Whether the balance of expenditure on the ‘bright’ and ‘dim’ respectively is optimal is a somewhat different matter. Contrary to Mr Thomas’s position, gifted advocates are often convinced that too much largesse is focused on the latter at the expense of the former.

Turning to advocacy, Mr Thomas says ‘we end up ignoring the more pressing problem’ of the poor but dim. He argues in the Twitter discussion that too few people are advocating for these learners, adding that they are failed ‘because it’s not popular to talk about them’.

I could not resist countering that advocacy for gifted learners is equally unpopular, indeed ‘the word is literally taboo in many settings’. I cannot help thinking – from his footnote reference to ‘epigenetic coding’ – that Mr Thomas is amongst those who are distinctly uncomfortable with the term.

Where advocacy does survive it is focused exclusively on progression to competitive universities and, to some extent, high attainment as a route towards that outcome. The narrative has shifted away from concepts of high ability or giftedness, because of the very limited consensus about that condition (even amongst gifted advocates) and even considerable doubt in some quarters whether it exists at all.

 

Outcomes

Mr Thomas maintains in his post that the successes of his preferred target group ‘have the power to change the British economy, far more so than those of their brighter peers’. This is because ‘the gap most damaging to society is in life outcomes for the children who perform least well at school’.

As noted above, it is important to remember that we are discussing here the addition of educational and economic value by tackling underachievement amongst learners from disadvantaged backgrounds, rather than amongst all the children who perform least well.

We are also leaving to one side the addition of value through any wider engagement by health and social services to improve life chances.

It is quite reasonable to advance the argument that improving the outcomes of ‘the bottom 20%’ (the Tail) will have ‘a huge socio-economic impact’ and ‘make the biggest marginal difference to society’.

But one could equally make the case that society would derive similar or even higher returns from a decision to concentrate disproportionately on the highest attainers (the Smart Fraction).

Or, as Gifted Phoenix would prefer, one could reasonably propose that the optimal returns should be achieved by means of a balanced approach that raises both the floor and the ceiling, avoiding any arbitrary distinctions on the basis of prior attainment.

From the Gifted Phoenix perspective, one should balance the advantages of removing the drag on productivity of an educational underclass against those of developing the high-level human capital needed to drive economic growth and improve our chances of success in what Coalition ministers call the ‘global race’.

According to this perspective, by eliminating excellence gaps between disadvantaged and advantaged high attainers we will secure a stream of benefits broadly commensurate with that at the bottom end.

These will include substantial spillover benefits, achieved as a result of broadening the pool of successful leaders in political, social, educational and artistic fields, not to mention significant improvements in social mobility.

It is even possible to argue that, by creating a larger pool of more highly educated parents, we can also achieve a significant positive impact on the achievement of subsequent generations, thus significantly reducing the size of the tail.

And in the present generation we will create many more role models: young people from disadvantaged backgrounds who become educationally successful and who can influence the aspirations of younger disadvantaged learners.

This avoids the risk that low expectations will be reinforced and perpetuated through a ‘deficit model’ approach that places excessive emphasis on removing the drag from the tail by producing a larger number of ‘useful members of society’.

This line of argument is integral to the Gifted Phoenix Manifesto.

It seems to me entirely conceivable that economists might produce calculations to justify any of these different paths.

But it would be highly inequitable to put all our eggs in the ‘poor but bright’ basket, because that penalises some disadvantaged learners for their failure to achieve high attainment thresholds.

And it would be equally inequitable to focus exclusively on the ‘poor but dim’, because that penalises some disadvantaged learners for their success in becoming high attainers.

The more equitable solution must be to opt for a ‘balanced scorecard’ approach that generates a proportion of the top end benefits and a proportion of the bottom end benefits simultaneously.

There is a risk that this reduces the total flow of benefits, compared with one or other of the inequitable solutions, but there is a trade-off here between efficiency and a socially desirable outcome that balances the competing interests of the two groups.

 

The personal dimension

After we had finished our Twitter exchanges, I thought to research Mr Thomas online. Turns out he’s quite the Big-Cheese-in-Embryo. Provided he escapes the lure of filthy lucre, he’ll be a mover and shaker in education within the next decade.

I couldn’t help noticing his own educational experience – public school, a First in PPE from Oxford, leading light in the Oxford Union – then graduation from Teach First alongside internships with Deutsche Bank and McKinsey.

Now he’s serving his educational apprenticeship as joint curriculum lead for maths at a prominent London Academy. He’s also a trustee of ‘a university mentoring project for highly able 11-14 year old pupils from West London state schools’.

Lucky I didn’t check earlier. Such a glowing CV might have been enough to cow this grammar school Oxbridge reject, even if I did begin this line of work several years before he was born. Not that I have a chip on my shoulder…

The experience set me wondering about the dominant ideology amongst the Teach First cadre, and how it is tempered by extended exposure to teaching in a challenging environment.

There’s more than a hint of idealism about someone from this privileged background espousing the educational philosophy that Mr Thomas professes. But didn’t he wonder where all the disadvantaged people were during his own educational experience, and doesn’t he want to change that too?

His interest in mentoring highly able pupils would suggest that he does, but also seems directly to contradict the position he’s reached here. It would be a pity if the ‘poor but bright’ could not continue to rely on his support, equal in quantity and quality to the support he offers the ‘poor but dim’.

For he could make a huge difference at both ends of the attainment spectrum – and, with his undeniable talents, he should certainly be able to do so.

 

Conclusion

We are entertaining three possible answers to the question whether in principle to prioritise the needs of the ‘poor but bright’ or the ‘poor but dim’:

  • Concentrate principally – perhaps even exclusively – on closing the Excellence Gaps at the top
  • Concentrate principally – perhaps even exclusively – on closing the Foundation Gaps at the bottom
  • Concentrate equally across the attainment spectrum, at the top and bottom and all points in between.

Speaking as an advocate for those at the top, I favour the third option.

It seems to me incontrovertible – though hard to quantify – that, in the English education system, the lion’s share of resources goes towards closing the Foundation Gaps.

That is perhaps as it should be, although one could wish that the financial scales were not tipped so excessively in their direction, for ‘poor but bright’ learners do in my view have an equal right to challenge and support, and should not be penalised for their high attainment.

Our current efforts to understand the relative size of the Foundation and Excellence Gaps and how these are changing over time are seriously compromised by the limited data in the public domain.

There is a powerful economic case to be made for prioritising the Foundation Gaps as part of a deliberate strategy for shortening the tail – but an equally powerful case can be constructed for prioritising the Excellence Gaps, as part of a deliberate strategy for increasing the smart fraction.

Neither of these options is optimal from an equity perspective however. The stream of benefits might be compromised somewhat by not focusing exclusively on one or the other, but a balanced approach should otherwise be in our collective best interests.

You may or may not agree. Here is a poll so you can register your vote. Please use the comments facility to share your wider views on this post.

 

 

 

 

Epilogue

 

We must beware the romance of the poor but bright,

But equally beware

The romance of rescuing the helpless

Poor from their sorry plight.

We must ensure the Tail

Does not wag the disadvantaged dog!

 

GP

May 2014