A Digression on Breadth, Depth, Pace and Mastery

This post explores the emerging picture of mastery-based differentiation for high attainers and compares it with a model we used in the National G&T Programme, back in the day.

It is a rare venture into pedagogical territory by a non-practitioner, so may not bear close scrutiny from the practitioner’s perspective. But it seeks to pose intelligent questions from a theoretical position and so promote further debate.

Breadth, depth and pace

Quality standards

In the original National Quality Standards in Gifted and Talented Education (2005) one aspect of exemplary ‘Effective Provision in the Classroom’ was:

‘Teaching and learning are suitably challenging and varied, incorporating the breadth, depth and pace required to progress high achievement. Pupils routinely work independently and self-reliantly.’

In the 2010 version it was still in place:

‘Lessons consistently challenge and inspire pupils, incorporating the breadth, depth and pace required to support exceptional rates of progress. Pupils routinely work creatively, independently and self-reliantly.’

These broad standards were further developed in the associated Classroom Quality Standards (2007) which offered a more sophisticated model of effective practice.

The original quality standards were developed by small expert working groups, reporting to wider advisory groups, and were carefully trialled in primary and secondary classrooms.

They were designed not to be prescriptive but, rather, to provide a flexible framework within which schools could develop and refine their own preferred practice.

Defining the terms

What did we mean by breadth, depth and pace?

  • Breadth (sometimes called enrichment) gives learners access to additional material beyond the standard programme of study. They might explore additional dimensions of the same topic, or an entirely new topic. They might need to make cross-curricular connections, and/or to apply their knowledge and skills in an unfamiliar context.
  • Depth (sometimes called extension) involves delving further into the same topic, or considering it from a different perspective. It might foreground problem solving. Learners might need to acquire new knowledge and skills and may anticipate material that typically occurs later in the programme of study.
  • Pace (sometimes called acceleration) takes two different forms. It may be acceleration of the learner, for example advancing an individual to a higher year group in a subject where they are particularly strong. More often, it is acceleration of the learning, enabling learners to move through the programme of study at a relatively faster pace than some or all of their peers. Acceleration of learning can take place at a ‘micro’ level in differentiated lesson planning, or in a ‘macro’ sense, typically through setting. Both versions of acceleration will cause the learner to complete the programme of study sooner and they may be entered early for an associated test or examination.

It should be readily apparent that these concepts are not distinct but overlapping. There might be an element of faster pace in extension, for example, or of increased depth in acceleration. A single learning opportunity may include two, or possibly all three. It is not always straightforward to disentangle them completely.

Applying these terms

From the learner’s perspective, one of these three elements can be dominant, with the preferred strategy determined by that learner’s attainment, progress and wider needs.

  • Enrichment might be dominant if the learner is an all-rounder, relatively strong in this subject but with equal or even greater strength elsewhere.
  • Extension might be dominant if the learner shows particular aptitude or interest in specific aspects of the programme of study.
  • Acceleration might be dominant if the learner is exceptionally strong in this subject, or has independently acquired and introduced knowledge or skills that are not normally encountered until later in this or a subsequent key stage.

Equally though, the richest learning experience is likely to involve a blend of all three elements in different combinations: restricting advanced learners to one or two of them might not always be in their best interests. Moreover, some high attainers will thrive with a comparatively ‘balanced scorecard’.

The intensity or degree of enrichment, extension or acceleration will also vary according to the learners’ needs. Even in a top set, decisions about how broadly to explore, how deeply to probe or how far and how fast to press forward must reflect their starting point and the progress achieved to date.

Acceleration of the learner may be appropriate if he or she is exceptionally advanced.  Social and emotional maturity will need to be taken into account, but all learners are different – this should not be used as a blanket excuse for failing to apply the approach.

There must be evidence that the learner is in full command of the programme of study to date and that restricting his or her pace is having a detrimental effect. A pedagogical preference for moving the class along at the same pace should never override the learner’s needs.

Both variants of acceleration demand careful long-term planning, so the learner can continue on a fast track where appropriate, or step off without loss of esteem. It will be frustrating for a high attainer expected to ‘mark time’ when continuity is lost. This may be particularly problematic on transfer and transition between settings.

Careful monitoring is also required, to ensure that the learner continues to benefit, is comfortable and remains on target to achieve the highest grades. No good purpose is served by ‘hothousing’.

Mastery and depth

The Expert Panel

The recent evolution of a mastery approach can be traced back to the Report of the Expert Panel for the National Curriculum Review (December 2011).

‘Amongst the international systems which we have examined, there are several that appear to focus on fewer things in greater depth in primary education, and pay particular attention to all pupils having an adequate understanding of these key elements prior to moving to the next body of content – they are ‘ready to progress’…

… it is important to understand that this model applies principally to primary education. Many of the systems in which this model is used progressively change in secondary education to more selective and differentiated routes. Spread of attainment then appears to increase in many of these systems, but still with higher overall standards than we currently achieve in England…

There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others

These views cohere with our notion of a revised model that focuses on inclusion, mastery and progress. However, more work needs to be done around these issues, both with respect to children with learning difficulties and those regarded as high attainers.’

For reasons best known to itself, the Panel never undertook that further work in relation to high attainers, or at least it was never published. This has created a gap in the essential groundwork necessary for the adoption of a mastery-driven approach.

National curriculum

Aspects of this thinking became embodied in the national curriculum, but there are some important checks and balances.

The inclusion statement requires differentiation for high attainers:

‘Teachers should set high expectations for every pupil. They should plan stretching work for pupils whose attainment is significantly above the expected standard.’

The primary programmes of study for all the core subjects remind everyone that:

‘Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage, if appropriate.’

But, in mathematics, both the primary and secondary PoS say:

‘The expectation is that the majority of pupils will move through the programmes of study at broadly the same pace. However, decisions about when to progress should always be based on the security of pupils’ understanding and their readiness to progress to the next stage. Pupils who grasp concepts rapidly should be challenged through being offered rich and sophisticated problems before any acceleration through new content. Those who are not sufficiently fluent with earlier material should consolidate their understanding, including through additional practice, before moving on.’

These three statements are carefully worded and, in circumstances where all apply, they need to be properly reconciled.

NCETM champions the maths mastery movement

The National Centre for Excellence in the Teaching of Mathematics (NCETM), a Government-funded entity responsible for raising levels of achievement in maths, has emerged as a cheerleader for and champion of a maths mastery approach.

It has published a paper ‘Mastery approaches to mathematics and the new national curriculum’ (October 2014).

Its Director, Charlie Stripp, has also written two blog posts on the topic, in October 2014 and April 2015.

The October 2014 paper argues (my emphasis):

‘Though there are many differences between the education systems of England and those of east and south-east Asia, we can learn from the ‘mastery’ approach to teaching commonly followed in these countries. Certain principles and features characterise this approach…

… The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.’

It continues:

‘Taking a mastery approach, differentiation occurs in the support and intervention provided to different pupils, not in the topics taught, particularly at earlier stages. There is no differentiation in content taught, but the questioning and scaffolding individual pupils receive in class as they work through problems will differ, with higher attainers challenged through more demanding problems which deepen their knowledge of the same content.’

In his October 2014 post, Stripp opines:

‘Put crudely, standard approaches to differentiation commonly used in our primary school maths lessons involve some children being identified as ‘mathematically weak’ and being taught a reduced curriculum with ‘easier’ work to do, whilst others are identified as ‘mathematically able’ and given extension tasks….

…For the children identified as ‘mathematically able’:

  1. Extension work, unless very skilfully managed, can encourage the idea that success in maths is like a race, with a constant need to rush ahead, or it can involve unfocused investigative work that contributes little to pupils’ understanding. This means extension work can often result in superficial learning. Secure progress in learning maths is based on developing procedural fluency and a deep understanding of concepts in parallel, enabling connections to be made between mathematical ideas. Without deep learning that develops both of these aspects, progress cannot be sustained.
  2. Being identified as ‘able’ can limit pupils’ future progress by making them unwilling to tackle maths they find demanding because they don’t want to challenge their perception of themselves as being ‘clever’ and therefore finding maths easy….

…I do think much of what I’m saying here also applies at secondary level.

Countries at the top of the table for attainment in mathematics education employ a mastery approach to teaching mathematics. Teachers in these countries do not differentiate their maths teaching by restricting the mathematics that ‘weaker’ children experience, whilst encouraging ‘able’ children to ‘get ahead’ through extension tasks… Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace…’

The April 2015 post continues in a similar vein, commenting directly on the references in the PoS quoted above (my emphases):

‘The sentence: ‘Pupils who grasp concepts rapidly should be challenged through rich and sophisticated problems before any acceleration through new content’, directly discourages acceleration through content, instead requiring challenge through ‘rich and sophisticated (which I interpret as mathematically deeper) problems’. Engaging with ‘rich and sophisticated problems’ involves reasoning mathematically and applying maths to solve problems, addressing all three curriculum aims. All pupils should encounter such problems; different pupils engage with problems at different depths, but all pupils benefit

…Meeting the needs of all pupils without differentiation of lesson content requires ensuring that both (i) when a pupil is slow to grasp an aspect of the curriculum, he or she is supported to master it and (ii) all pupils should be challenged to understand more deeply…

The success of teaching for mastery in the Far East (and in the schools employing such teaching here in England) suggests that all pupils benefit more from deeper understanding than from acceleration to new material. Deeper understanding can be achieved for all pupils by questioning that asks them to articulate HOW and WHY different mathematical techniques work, and to make deep mathematical connections. These questions can be accessed by pupils at different depths and we have seen the Shanghai teachers, and many English primary teachers who are adopting a teaching for mastery approach, use them very skilfully to really challenge even the highest attaining pupils.’

The NCETM is producing guidance on assessment without levels, showing how to establish when a learner

‘…has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

Commentary

NCETM wants to establish a distinction between depth via problem-solving (good) and depth via extension tasks (bad).

There is some unhelpful terminological confusion in the assumption that extension tasks necessarily require learners to anticipate material not yet covered by the majority of the class.

Leaving that aside, notice how the relatively balanced wording in the programme of study is gradually adjusted until the balance has disappeared.

The PoS says ‘the majority of pupils will move through…at broadly the same pace’ and that they ‘should be challenged through being offered rich and sophisticated problems before any acceleration through new content’.

This is first translated into

‘…the large majority of pupils progress through the curriculum content at the same pace’ (NCETM paper) then it becomes

‘…expose almost all of the children to the same curriculum content at the same pace’ (Stripp’s initial post) and finally emerges as

‘Meeting the needs of all pupils without differentiation of lesson content’ and

‘…all pupils benefit more from deeper understanding than from acceleration to new material.’ (Stripp’s second post).

Any non-mathematician will tell you that the difference between the majority (over 50%) and all (100%) may be close to 50%.

Such a minority could very comfortably include all children achieving L3 equivalent at KS1 or L5 equivalent at KS2, or all those deemed high attainers in the Primary and Secondary Performance Tables.

The NCETM pretends that this minority does not exist.

It does not consider the scope for acceleration towards new content subsequent to the delivery of ‘rich and sophisticated problems’.

Instead it argues that the statement in the PoS ‘directly discourages acceleration through content’ when it does no such thing.

This is propaganda, but why is NCETM advancing it?

One possibility, not fully developed in these commentaries, is the notion that teachers find it easier to work in this way. To be successful, ‘extension work’ demands exceptionally skilful management.

On the other hand, Stripp celebrates the fact that Shanghai teachers:

‘…were very skilled at questioning and challenging children to engage more deeply with maths within the context of whole class teaching.’

It is a moot point whether such questioning, combined with the capacity to develop ‘rich and sophisticated problems’, is any more straightforward for teachers to master than the capacity to devise suitable extension tasks, especially when one approach is relatively more familiar than the other.

Meanwhile, every effort is made to associate maths mastery with other predilections and prejudices entertained by educational professionals:

  • The claim that it will have a positive impact on teacher workload, though no evidence – real or imagined – is cited to support this belief.
  • The belief that all children can be successful at maths (though with no acknowledgement that some will always be comparatively more successful than others) and an associated commitment to ‘mindset’, encouraging learners to associate success with effort and hard work rather than underlying aptitude.
  • The longstanding opposition of many in the maths education community to any form of acceleration, fuelled by alarming histories of failed prodigies at one extreme and poorly targeted early entry policies at the other. (I well remember discussing this with them as far back as the nineties.)
  • The still contested benefits of life without levels.

On this latter point, the guidance NCETM is developing appears to assume that ‘exceeding national expectations’ in maths must necessarily involve ‘working deeper’.

I have repeatedly argued that, for high attainers, such measures should acknowledge the potential contributions of breadth, depth and pace.

Indeed, following a meeting and email exchanges last December, NAHT said it wanted to employ me to help develop such guidance, as part of its bigger assessment package.

(Then nothing more – no explanation, no apology, zilch. Shame on you, Mr Hobby. That’s no way to run an organisation.)

Conclusion

Compared with the richness of the tripartite G&T model, the emphasis placed exclusively on depth in the NCETM mastery narrative seems relatively one-dimensional and impoverished.

There is no great evidence in this NCETM material of a willingness to develop an alternative understanding of ‘stretch and challenge’ for high attainers. Vague terms like ‘intelligent practice’, ‘deep thinking’ and ‘deep learning’ are bandied about like magical incantations, but what do they really mean?

NCETM needs to revisit the relevant statement in the programme of study and strip away (pun intended) the ‘Chinese whispers’ (pun once more intended) in which they have cocooned it.

Teachers jumping on the maths mastery bandwagon need meaningful free-to-access guidance that helps them construct suitably demanding and sophisticated problems and deploy advanced questioning techniques that get the best out of their high attainers.

I do not dismiss the possibility that high attainers can thrive under a mastery model that foregrounds depth over breadth and pace, but it is a mistake to neglect breadth and pace entirely.

Shanghai might be an exception, but most of the other East Asian cradles of mastery also run parallel gifted education programmes in which accelerated maths is typically predominant. I’ve reviewed several on this Blog.

GP

April 2015

Protecting pupil premium for high attainers

This post continues the campaign I have been waging against the Fair Education Alliance, a Teach First-inspired ‘coalition for change in education’, over a proposal in its Report Card 2014 to halve the pupil premium for disadvantaged learners with high prior attainment.

I am:

  • Inviting Fair Education Alliance members (and Read On. Get On. partners) to defend the proposal or else distance themselves from it and
  • Calling on both campaigns to withdraw it.

Background

The Fair Education Alliance was launched by Teach First in June 2014. It aims to:

‘…significantly narrow the achievement gap between young people from our poorest communities and their wealthier peers by 2022’.

There are 27 members in all (see below).

The Alliance plans to monitor progress annually against five Fair Education Impact Goals through an annual Report Card.

The first Report Card, published in December 2014, explains that the Alliance was formed:

‘…in response to the growing demand for a national debate on why thousands of children do not get a fair education’.

The Impact Goals are described thus:

  • ‘Narrow the gap in literacy and numeracy at primary school

The Fair Education Alliance is committed to closing the attainment gap between primary schools serving lower income pupils and those educating higher income pupils. Our goal is for this gap to be narrowed by 90 % by 2022.

  • Narrow the gap in GCSE attainment at secondary school

The Fair Education Alliance is committed to closing the attainment gap between secondary schools serving lower income pupils and those educating higher income pupils. Our goal is to close 44 % of this gap by 2022.

  • Ensure young people develop key strengths, including resilience and wellbeing, to support high aspirations

The Fair Education Alliance is committed to ensuring young people develop non-cognitive skills, including the positive wellbeing and resilience they need to succeed in life. The Alliance will be working with other organisations to develop measurement tools which will allow the development of these key skills to be captured.

  • Narrow the gap in the proportion of young people taking part in further education or employment-based training after finishing their GCSEs.

The Fair Education Alliance wants to see an increase in the number of young people from low-income communities who stay in further education or employment-based training once they have completed Key Stage 4. Our goal is for 90% of young people from schools serving low income communities to be in post-16 education or employment-based training by 2022.

  • Narrow the gap in university graduation, including from the 25% most selective universities

The Fair Education Alliance is committed to closing the graduation gap between young people from low income backgrounds and those from high income backgrounds. Our goal is for at least 5,000 more pupils from low income backgrounds to graduate each year, with 1,600 of these young people graduating from the most selective universities.’

The problematic proposal relates to Impact Goal 2, focused on the GCSE attainment gap in secondary schools.

The gap in question is between:

  • Schools serving low income communities: ‘State schools where 50 % or more of the pupils attending come from the most deprived 30 % of families according to the Income Deprivation Affecting Children Index (IDACI)’ and
  • Schools serving high income communities: ‘State schools where 50 % or more of the pupils attending come from the least deprived 30 % of families according to IDACI’.

The Report Card explains that the Alliance is focused on gaps between schools rather than gaps between pupils:

‘…to better capture data that includes those pupils whose families are on a low income but are just above the income threshold for free school meals (the poverty measure in schooling). This measurement also helps monitor the impact of the Alliance’s efforts towards meeting the goals as many members work with and through schools to tackle educational inequality, rather than with individual pupils.’

Under Goal 2, the gap the Alliance wishes to close relates to:

‘Average point score…across eight GCSE subjects, with extra weighting for English and maths’

The measure excludes equivalent qualifications. The baseline gap – derived from 2012/13 data:

‘…is currently 101.7 average points – the difference between 8 C grades and 8 A grades.’

The Report Card says this gap has narrowed by 10.5% since 2010/11, but warns that new accountability measures could work in the opposite direction.

The problematic recommendation

The Report Card discusses the distribution of funding to support deprivation, arguing that:

  • Some aspects of disadvantage ‘are given less recognition in the current funding system. For instance FSM Ever 6 does not include low income families who just miss the eligibility criteria for free school meals; and the national funding formula is not able to compensate for geographical isolation and high transport costs which can compound low incomes in parts of the country.’
  • ‘Consequently – due to the combination of a high intake of pupils attracting the premium and a currently unequal national school funding formula – there are a small number of very successful schools building up large surpluses. Meanwhile some schools with arguably greater need, where pupils suffer different socioeconomic disadvantages that affect their attainment, are receiving comparatively little extra funding. This hampers their ability to deal with the challenges that their students face and to prevent those vulnerable pupils from falling behind their peers.’

To rectify this problem, the Report Card recommends a significant policy adjustment:

‘Target pupil premium by attainment as well as disadvantage measures: This could be achieved through halving current funding per pupil for FSM Ever 6. Half of this funding could then be re-allocated to pupils eligible for FSM Ever 6 who have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend. The change of funding model would increase school accountability for ‘catching up’ pupils.’

The proposal is advanced in a section about secondary schools; it is unclear whether it is intended to apply equally to primary schools.

Quite what constitutes low prior attainment is never made entirely clear either. One assumes that, for secondary students, it is anything below the scaled score equivalent of KS2 L4b in English (reading and writing), maths or both.

This does of course mean that learners attracting the pupil premium who achieve the requisite scores will be as much short-changed as those who exceed them. Low attainers must take precedence over middle attainers as well as high attainers.

I am minded to extend my campaign to encompass the ‘squeezed middle’, but perhaps I should let someone else bear that standard.

Why this is objectionable

I oppose this proposal because:

  • The pupil premium is described as ‘additional funding for publicly funded schools in England to raise the attainment of disadvantaged pupils and close the gap between them and their peers’. Although not a personal funding entitlement – the funding can be aggregated and deployed as schools see fit – schools are held accountable for the impact of the pupil premium on the attainment and progress of the pupils that attract it. There is presently no distinction according to the attainment of these students, but the change proposed by the Alliance would shift the accountability focus to prioritise the achievement and progress of disadvantaged low attainers over disadvantaged middle and high attainers.
  • The pupil premium should not be treated as part of the overall school budget. As Ofsted said in its first report on the premium (September 2012):

‘School leaders, including governing bodies, should ensure that Pupil Premium funding is not simply absorbed into mainstream budgets, but instead is carefully targeted at the designated children. They should be able to identify clearly how the money is being spent.’

Since the premium follows the pupil, schools with large numbers of eligible pupils should not have any part of this funding clawed back, nor should those with relatively few eligible pupils have it supplemented.

  • If there are problems with the distribution of deprivation funding, this should be addressed through the school funding formula. It is wrong to suggest that a national funding formula would be incapable of compensating for associated sparsity factors. It is for those devising such a formula to determine whether to compensate for pupils not eligible for the premium and factors such as geographical isolation and high transport costs. The Alliance is perfectly entitled to lobby for this. But, in the absence of such a formula, the premium should not be rationed or redistributed to compensate.

  • Ofsted has since reported on schools’ use of the premium to support their disadvantaged most able pupils:

‘Our report in 2013 found few instances of the pupil premium being used effectively to support the disadvantaged most able pupils. In the schools visited for this survey, about a third were using the pupil premium funding effectively to target the needs of these pupils.’

  • Any decision to double weight pupil premium for disadvantaged learners with low prior attainment would be likely to penalise disadvantaged high attainers. Although schools could theoretically decide to aggregate the funding and spend it differently, the clear intention is that the accountability framework would incentivise correspondingly stronger improvement by low attainers relative to middle and higher attainers. It is hard to understand how this, combined with the redistribution of funding, would help schools to support the latter and so meet Ofsted’s expectations.
  • There are strong equity arguments against such a redistribution: disadvantaged learners should not be penalised on the basis of their prior attainment. That is not ‘A fair education for all’, nor is it consistent with the ‘sound moral argument for giving every child an equal chance to succeed’ mentioned in the Executive Summary of the Report Card. There is a fundamental distinction between reflecting the additional costs attributable to supporting all low attainers in the funding formula and redistributing allocations associated with individual disadvantaged learners for the same purpose.
  • The Report Card itself recognises the significance of disadvantaged high attainers:

‘As the Level 5 attainment gap highlights, there is not only a need to catch up those ‘slipping behind’ but also an imperative to ‘stretch the top’ when looking at pupils from low income communities. Some schools do well by this measure: sharing best practice in making better than expected levels of progress and stretching the highest attainers is crucial for ensuring all schools can replicate the successes some have already developed.’

How this can be squared with the proposed redistribution of pupil premium is not addressed. 

  • Such a policy would make the Alliance’s own goal of narrowing the gap in university graduation from the 25% most selective universities much harder to achieve, since it would reduce the likelihood of disadvantaged learners reaching the level of attainment necessary to secure admission.
  • There is already additional funding, outside the school funding settlement, dedicated to ‘catch-up’ for those with low prior attainment. Well over £50m per year is allocated to the ‘catch-up premium’ providing £500 per pupil who did not achieve at least KS2 L4 in reading and/or maths. This may be used for individual or small group tuition, summer schools or resources and materials. A further £50m has also been top-sliced from the pupil premium to provide an annual summer schools programme for those at the end of KS2. A core purpose is ‘to help disadvantaged pupils who are behind in key areas such as literacy and numeracy to catch up with their peers’. There is no corresponding funding for disadvantaged high attainers.
  • For FY2015/16, the Government adjusted the funding formula to allocate an additional £390m to schools in the least fairly funded authorities. This involved setting a minimum funding level for five pupil characteristics, one being ‘pupils from deprived backgrounds’, another ‘pupils with low attainment before starting at their primary or secondary school’. The values for the latter are £660 for primary schools and £940 for secondary schools. This establishes a precedent for reflecting the needs of low attaining learners in further progress towards a national funding formula.

.

The campaign to date

I had an inconclusive discussion with Teach First officials on the day the Report Card was published.

.

Subsequently I pressed the Fair Education Alliance spokesperson at Teach First on some specific questions.

.

I received two undertakings to respond online but nothing has materialised. Finally, on 17 April I requested a response within 24 hours.

.

Nothing doing.

Meanwhile though, Sam Freedman published a piece that appeared to accept that such imbalances should be rectified through the schools funding formula:

‘The distribution, in turn, will depend on whether the next Government maintains the pupil premium at the same level – which has shifted funds towards poorer parts of the country – and whether they introduce a “National Funding Formula” (NFF).

At the moment there are significant and historic differences between funding in different parts of the country. Inner London for instance is overfunded, and many schools have significant surpluses, whereas other parts of the country, often more rural, have much tighter margins. The current Government have taken steps to remedy this but plan to go further if they win the election by introducing a NFF. Doing this would help alleviate the worst effects of the cuts for schools that are currently underfunded.’

Freedman himself retweeted this comment.

We had a further conversation on 20 April after this post had been published.

.

.

Another influential member of the Twitterati also appeared influenced – if not yet fully converted – by my line of argument:

Positive though some of these indications are, there are grounds to fear that at least some Alliance Members remain wedded to the redistribution of pupil premium.

The idea recently reappeared in a publication underpinning the Read On Get On campaign, supported by a variety of organisations including Teach First and some members of the Fair Education Alliance.

The report in question – The Power of Reading (April 2015) – mentions that:

‘The Read On. Get On. campaign is working closely with the Fair Education Alliance and the National Literacy Forum to achieve our core goals, and this report reflects and builds on their recommendations.’

One of its ‘recommendations to the new Government’ is ‘Ensure stronger support for disadvantaged children who are falling behind’.

‘In what is likely to be a tight public spending round, our priority for further investment is to improve the quality of early education for the poorest children, as set out above. However, there are options for reforming existing pupil premium spending for primary school children so that it focuses resources and accountability on children from disadvantaged backgrounds who are falling behind…

….One option proposed by the Fair Education Alliance is to refocus the existing pupil premium on children who are eligible for free school meals and who start primary school behind. This would use existing funding and accountability mechanisms for the pupil premium to focus attention on children who need the most urgent help to progress, including in reading. It would make primary schools more accountable for how they support disadvantaged children who are falling behind. The primary pupil premium will be worth £1,300 per pupil in 2015–16 and is paid straight to schools for any child registered as eligible for free school meals at any point in the last six years. The FEA proposes halving the existing premium, and redistributing the other half to children who meet the existing eligibility criteria and have low prior attainment. New baseline tests for children at the start of the reception year, to be introduced in September 2016, could be used as the basis for measuring the prior attainment of children starting primary school.’

Interestingly, this appears to confirm that the Fair Education Alliance supports a redistribution of pupil premium in the primary sector as well as the secondary, something I could not find expressed on the face of the Report Card.

I reacted angrily

.

The campaign continued

It won’t be long now before I leave the education world behind for ever, but I have decided to devote spare moments to the pursuit on social media of the organisations that form the Fair Education Alliance and/or support Read On. Get On.

I am asking each organisation to:

  • Justify their support for the policy that has been advanced or 
  • Formally distance themselves from it

I also extend an invitation to both campaigns to formally withdraw their proposals.

I shall publish the outcomes here.

The organisations involved are listed below. If any of them would care to cut to the chase, they are most welcome to use the comments facility on this blog or tweet me @GiftedPhoenix

Since my experience to date has been of surprising coyness when organisations are challenged over their ill-conceived policy ideas, I am imposing a ‘three strikes’ rule.

Any organisation that fails to respond having been challenged three times will be awarded a badge of shame and consigned to the Scrapheap.

Let’s see who’s in there by the end of term.

GP

April 2015

.


Fair Education Alliance

.

I have published a comment from Future Leaders in which they accept that:

‘…mid- and high-attainers from poor backgrounds should not be deprived of the support that they need to succeed’.

Thanks to them for their prompt and clear response.

.

Read On. Get On.

.

The Scrapheap

.

Has Ofsted improved inspection of the most able?

.

This post examines the quality of Ofsted reporting on how well secondary schools educate their most able learners.

The analysis is based on a sample of 87 Section 5 inspection reports published during March 2015.

I have compared the results with those obtained from a parallel exercise undertaken a year ago and published in How well is Ofsted reporting on the most able? (May 2014).

This new post considers how inspectors’ assessments have changed in the light of their increased experience, additional guidance and – most recently – the publication of Ofsted’s survey report: The most able students: An update on progress since June 2013.

This appeared on 4 March 2015, at the beginning of my survey period, although it was heralded in HMCI’s Annual Report and the various supporting materials published alongside it in December 2014. One might therefore expect it to have had an immediate effect on inspection practice.

Those seeking further details of either of these publications are cordially invited to consult the earlier posts I dedicated to them.

The organisation of this post is straightforward.

The first section considers how Ofsted expects its inspectors to report on provision for the most able, as required by the current Inspection Handbook and associated guidance. It also explores how those expectations were intended to change in the light of the Update on Progress.

Subsequent sections set out the findings from my own survey:

  • The nature of the 2015 sample – and how this differs from the 2014 sample
  • Coverage in Key Findings and Areas for Improvement
  • Coverage in the main body of reports, especially under Quality of Teaching and Achievement of Pupils, the sections that most commonly feature material about the most able

The final section follows last year’s practice in offering a set of key findings and areas for improvement for consideration by Ofsted.

I have supplied page jumps to each section from the descriptions above.

How inspectors should address the most able

.

Definition and distribution

Ofsted nowhere explains how inspectors are to define the most able. It is not clear whether they permit schools to supply their own definitions, or else apply the distinctions adopted in their survey reports. This is not entirely helpful to schools.

In the original survey – The most able students: Are they doing as well as they should in our non-selective secondary schools? (June 2013) – Ofsted described the most able as:

‘…the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

The measure of potential is not defined, but an example is given of EAL students who are new to the country and so might not (yet) have achieved Level 5.

In the new survey prior attainment at KS2 remains the indicator, but the reference to potential is dropped:

‘…students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2’

The size of this group varies at national level according to the year group.

If we take learners in Year 7 who completed KS2 in 2014, the data shows that 24% achieved KS2 Level 5 in both English (reading and writing) and maths. A further 5% secured L5 in English (reading and writing) only, while another 20% reached L5 in maths only.

So 49% of the present Year 7 are deemed high attainers.
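The 49% figure is simply the sum of the three mutually exclusive categories just quoted. A minimal sketch of that arithmetic:

```python
# Shares of the 2014 KS2 cohort reaching Level 5 (figures quoted above)
both = 0.24          # L5 in both English (reading and writing) and maths
english_only = 0.05  # L5 in English (reading and writing) only
maths_only = 0.20    # L5 in maths only

# Ofsted's definition is L5 in English and/or maths, so these three
# disjoint categories simply add up
high_attainers = both + english_only + maths_only
print(f"{high_attainers:.0%}")  # → 49%
```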

.

[Venn diagram: KS2 L5 attainment in English (reading and writing) and/or maths]

But this proportion falls to about 40% amongst those who completed KS4 in 2014 and so typically undertook KS2 assessment five years earlier in 2009.

Ofsted’s measure is different to the definition adopted in the Secondary Performance Tables which, although also based on prior attainment at KS2, depends on an average point score (APS) of 30 or higher in KS2 tests in the core subjects.

Only ‘all-rounders’ count according to this definition, while Ofsted includes those who are relatively strong in either maths or English but who might be weak in the other subject. Neither approach considers achievement beyond the core subjects.

According to the Performance Tables definition, amongst the cohort completing KS4 in 2014, only 32.3% of those in state-funded schools were deemed high attainers, some eight percentage points lower than Ofsted’s figure.

The sheer size of Ofsted’s most able cohort will be surprising to some, who might naturally assume a higher hurdle and a correspondingly smaller group. The span of attainment it covers is huge, from one L5C (possibly paired with an L3) to three L6s.

But the generosity of Ofsted’s assumptions does mean that every year group in every school should contain at least a handful of high attainers, regardless of the characteristics of its intake.

Unfortunately, Ofsted’s survey report does not say exactly how many schools have negligible numbers of high attainers, telling us only how many non-selective schools had at least one pupil in their 2014 GCSE cohort with the requisite prior attainment in English, in maths and in both English and maths.

In each case some 2,850 secondary schools had at least one student within scope, implying that some 9% of schools had none in that category. But we have no way of establishing how many had no students in all three categories.
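As a rough sketch of the arithmetic behind that 9% figure – assuming, purely for illustration, a total of around 3,130 non-selective secondary schools in scope (Ofsted does not state the total):

```python
# Hypothetical total of non-selective secondary schools in scope
# (an assumption for illustration; not stated in Ofsted's report)
total_schools = 3130
with_at_least_one = 2850  # schools with at least one qualifying student

share_with_none = 1 - with_at_least_one / total_schools
print(f"{share_with_none:.0%}")  # → 9%
```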

Using the rival Performance Table definition, only some 92 state-funded non-selective secondary schools reported a 2014 GCSE cohort with 10% or fewer high attainers. The lowest recorded percentage is 3% and, of those with 5% or fewer, the number of high attaining students ranges from 1 to 9.

Because Ofsted’s definition is more liberal, one might reasonably assume that every secondary school has at least one high-attaining student per year group, though there will be a handful of schools with very few indeed.

At the other extreme, according to the Performance Tables definition, over 100 state-funded non-selective schools can boast a 2014 GCSE population where high attainers are in the majority – and the highest recorded percentage for a state-funded comprehensive is 86%. Using Ofsted’s measure, the number of schools in this position will be substantially higher.

For the analysis below, I have linked the number of high attainers (according to the Performance Tables) in a school’s 2014 GCSE cohort with the outcomes of inspection, so as to explore whether there is a relationship between these two variables.

Framework and Handbook

The current Framework for School Inspection (December 2014) makes no reference to the most able.

Inspectors must consider:

‘…the extent to which the education provided by the school meets the needs of the range of pupils at the school, and in particular the needs of disabled pupils and those who have special educational needs.’

One of the principles of school inspection is that it will:

‘focus on pupils’ and parents’ needs by…evaluating the extent to which schools provide an inclusive environment that meets the needs of all pupils, irrespective of age, disability, gender, race, religion or belief, or sexual orientation’.

Neither ability nor attainment is mentioned. This may or may not change when the Common Inspection Framework is published.

The most recent version of the School Inspection Handbook (December 2014) has much more to say on the issue. All relevant references in the main text and in the grade descriptors are set out in the Annex at the end of this post.

Key points include:

  • Ofsted uses inconsistent terminology (‘most able’, ‘more able’, ‘highest attainers’) without distinguishing between these terms.
  • Most of the references to the most able occur in lists of different groups of learners, another of which is typically ‘disadvantaged pupils’. This gives the mistaken impression that the two groups are distinct – that there is no such thing as a most able disadvantaged learner.
  • The Common Inspection Framework will be supported by separate inspection handbooks for each sector. The consultation response does not mention any revisions relating to the most able; neither does the March 2015 survey report say that revisions will be introduced in these handbooks to reflect its findings and recommendations (but see below). 

.

Guidance

Since the first survey report was published in 2013, several pieces of guidance have been issued to inspectors.

  • In Schools and Inspection (October 2013), inspectors’ attention is drawn to key revisions to the section 5 inspection framework:

‘In judging the quality of teaching…Inspectors will evaluate how teaching meets the needs of, and provides appropriate challenge to, the most able pupils. Underachievement of the most able pupils can trigger the judgements of inadequate achievement and inadequate teaching.’

In relation to report writing:

‘Inspectors are also reminded that they should include a short statement in the report on how well the most able pupils are learning and making progress and the outcomes for these pupils.’

  • In Schools and Inspection (March 2014) several amendments are noted to Section 5 inspection and report writing guidance from January of that year, including:

‘Most Able – Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

Moreover, for secondary schools:

‘There must be a comment on early entry for GCSE examinations. Where the school has an early entry policy, inspectors must be clear on whether early entry is limiting the potential of the most able pupils. Where early entry is not used, inspectors must comment briefly to that effect.’

  • In School Inspection Update (December 2014) Ofsted’s National Director, Schools reminds inspectors, following the first of a series of half-termly reviews of ‘the impact of policy on school inspection practice’, to:

‘…place greater emphasis, in line with the handbook changes from September, on the following areas in section 5 inspection reports…The provision and outcomes for different groups of children, notably the most-able pupils and the disadvantaged (as referred to in the handbook in paragraphs 40, 129, 137, 147, 155, 180, 186, 194, 195, 196, 207, 208, 210 and 212).’

HMCI’s Annual Report

The 2014 Annual Report said (my emphasis):

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential.’

HMCI’s Commentary on the Report added for good measure:

‘In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections.’

So we are to expect a combination of broader focus, closer scrutiny and sharper recommendations.

The Annual Report relates to AY2013/14 and was published at the end of the first term of AY2014/15 and the end of calendar year 2014, so one assumes that references to the ‘coming year’ and ‘the year ahead’ are to calendar year 2015.

We should be able to see the impact of this ramping up in the sample I have selected, but some further change is also likely.

March 2015 survey report

One of the key findings from the March 2015 survey was (my emphasis):

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

Ofsted directed three recommendations at itself which do not altogether reflect this (my emboldening):

‘Ofsted should:

  • Make sure that inspections continue to focus sharply on the progress made by students who are able and disadvantaged.
  • Report more robustly about how well schools promote the needs of the most able through the quality of their curriculum and the information, advice and guidance they offer to the most able students.
  • Ensure thematic surveys investigate, where appropriate, how well the most able are supported through, for example, schools’ use of the pupil premium and the curriculum provided.’

The first of these recommendations implies that inspections already focus sufficiently on the progress of able and disadvantaged learners – an assumption that we shall test in the analysis below – and hence that no further change is necessary.

The third alludes to the most able disadvantaged but relates solely to thematic surveys, not to Section 5 inspection reports.

The second may imply that further emphasis will be placed on inspecting the appropriateness of the curriculum and IAG. Both of these topics seem likely to feature more strongly in a generic sense in the new Framework and Handbooks. One assumes that this will be extended to the most able, amongst other groups.

Though not mentioned in the survey report, we do know that Ofsted is preparing an evaluation toolkit. This was mentioned in a speech given by its Schools Director almost immediately after publication:

‘In this region specifically, inspectors have met with headteachers to address the poor achievement of the brightest disadvantaged children.

And inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals.’

It is not clear from this whether the toolkit will be confined to the most able disadvantaged or will have wider coverage.

Moreover, this statement raises the prospect that the toolkit might be similar in style to The Pupil Premium: Analysis and challenge tools for schools (January 2013). This is more akin to an old spanner than a Swiss army penknife. Anything of this nature would be rather less helpful than the term ‘toolkit’ implies.

At his request, I emailed Ofsted’s Director, Schools with questions on 21 March 2015. I requested further details of the toolkit. At the time of writing I have still to receive a reply.

.

The sample

I have selected an almost identical sample to that used in my 2014 analysis, one year on. It includes the 87 Section 5 inspection reports on secondary schools (excluding middle schools deemed secondary) that were published by Ofsted in the month of March 2015.

The bulk of the inspections were undertaken in February 2015, though a few took place in late January or early March.

Chart 1 gives the regional breakdown of the schools in the sample. All nine regions are represented, though there are only five schools from the North East, while Yorkshire and Humberside boasts 15. There are between seven and 11 schools in each of the other regions. In total 59 local authorities are represented.

In regional terms, this sample is more evenly balanced than the 2014 equivalent and the total number of authorities is two higher.

 .


Chart 1: Schools within the sample by region

Chart 2 shows how different statuses of school are represented within the sample.

All are non-selective. Fifty-three schools (61%) are academies, divided almost equally between the sponsored and converter varieties.

Community and foundation schools together form a third group of equivalent size, while the seven remaining schools have voluntary status, just one of them voluntary controlled. There are no free schools.

.


Chart 2: Schools within the sample by status

.

All but three of the schools are mixed – and those three are boys’ schools.

As for age range, there is one 13-18 and one 14-18 school. Otherwise there are 32 11-16 institutions (37% of the sample) while the remaining 53 (61%) are 11-18 or 11-19 institutions.

Chart 3 shows the variation in numbers on roll. The smallest school – a new 11-18 secondary school – has just 125 pupils; the largest 2,083. The average is 912.

Fifty-two schools (60%) have between 600 and 1,200 pupils, and twenty-three (26%) between 800 and 1,000.

.


Chart 3: Schools within the sample by NOR

. 

Chart 4 shows the overall inspection grade of schools within the sample. A total of 19 schools (22%) are rated inadequate, seven of them attracting special measures. Only nine (10%) are outstanding, while 27 (31%) are good and 32 (37%) require improvement.

This is very similar to the distribution in the 2014 sample, except that there are slightly more inadequate schools and slightly fewer requiring improvement.

.


Chart 4: Schools within the sample by overall inspection grade

Unlike the 2014 analysis, I have also explored the distribution of all grades within reports. The results are set out in Chart 5.

Schools in the sample are relatively more secure on Leadership and management (55% outstanding or good) and Behaviour and safety of pupils (60% outstanding or good) than they are on Quality of teaching (43% outstanding or good) and Achievement of pupils (41% outstanding or good).

.


Chart 5: Schools within the sample by inspection sub-grades

Another new addition this year is comparison with the number and percentage of high attainers.

Amongst the sample, the number of high attainers in the 2014 GCSE cohort varied from three to 196 and the percentage from 3% to 52%. (Two schools did not have a GCSE cohort in 2014.)

These distributions are shown on the scatter charts 6 and 7, below.

Chart 6 (number) shows one major outlier at the top of the distribution. Most schools – 64% of the sample – record numbers between 20 and 60. The average number is 41.

.


Chart 6: Schools within the sample by number of high attainers (Secondary Performance Tables measure)

. 

Chart 7 again has a single outlier, this time at the bottom of the distribution. The average is 32%, slightly less than the 32.3% reported for all state-funded schools in the Performance Tables.

Two in five of the sample register a high attainer percentage of between 20% and 30%, while three in five register between 20% and 40%.

But almost a third have a high attainer population of 20% or lower.

.


Chart 7: Schools within the sample by percentage of high attainers (Secondary Performance Tables measure)

Out of curiosity, I compared the overall inspection grade with the percentage of high attainers.

  • Amongst the nine outstanding schools, the percentage of high attainers ranged from 22% to 47%, averaging 33% (there was also one without a high attainer percentage).
  • Amongst the 27 good schools, the percentage of high attainers was between 13% and 52% (plus one without a high attainer percentage) and averaged 32%.
  • Amongst the 32 schools requiring improvement, the percentage of high attainers varied between 3% and 40% and averaged 23%.
  • Amongst the 19 inadequate schools, the percentage of high attainers lay between 10% and 38% and also averaged 23%.

This may suggest a tendency for outstanding/good schools to have a somewhat larger proportion of high attainers than schools judged to be requiring improvement or inadequate.

Key findings and areas for improvement

.

Distribution of comments

Thirty-nine of the reports in the sample (45%) address the most able in the Summary of key findings, while 33 (38%) do so in the section about what the school needs to do to improve further.

In 24 cases (28%) there were entries in both these sections, but in 39 of the reports (45%) there was no reference to the most able in either section.

In 2014, 34% of reports in the sample addressed the issue in both the main findings and recommendations and 52% mentioned it in neither of these sections.

These percentage point changes do not strongly suggest an increased commitment to this issue.

In the 2015 sample it was rather more likely for a reference to appear in the key findings for community schools (53%) and foundation schools (50%) than it was for converter academies (44%), sponsored academies (42%) or voluntary schools (29%).

Chart 8 shows the distribution of comments in these sections according to the overall inspection grade. In numerical terms, schools rated as requiring improvement overall are most likely to attract comments in both Key findings and Areas for improvement related to the most able.

.


Chart 8: Most able mentioned in key findings and areas for improvement by overall inspection grade (percentages)

.

But, when expressed as percentages of the total number of schools in the sample attracting these grades, it becomes apparent that the lower the grade, the more likely a school is to receive such a comment.

Of the 39 reports making reference in the key findings, 10 comments were positive, 28 were negative and one managed to be both positive and negative simultaneously:

‘While the most-able students achieve well, they are capable of even greater success, notably in mathematics.’ (Harewood College, Bournemouth)

.

Positive key findings

Five of the ten exclusively positive comments were directed at community schools.

The percentage of high attainers in the 2014 GCSE cohorts at the schools attracting positive comments varied from 13% to 52% and included three of the five schools with the highest percentages in the sample.

Interestingly, only two of the schools with positive comments received an overall outstanding grade, while three required improvement.

Examples of positive comments, which were often generic, include:

  • ‘The most able students achieve very well, and the proportion of GCSE A* and A grades is significantly above average across the curriculum.’ (Durham Johnston Comprehensive School, Durham)
  • ‘The most able students do well because they are given work that challenges them to achieve their potential’. (The Elton High School Specialist Arts College, Bury)
  • ‘Most able students make good progress in most lessons because of well-planned activities to extend their learning’. (Endon High School, Staffordshire)
  • ‘Teachers encourage the most able students to explore work in depth and to master skills at a high level’. (St Richard Reynolds Catholic High School, Richmond-upon-Thames).

Negative key findings

The distribution of the 28 negative comments in Key findings according to overall inspection grade was: Outstanding, none; Good, five (19%); Requires improvement, twelve (38%); Inadequate, eleven (58%).

This suggests a relatively strong correlation between the quality of provision for the most able and the overall quality of the school.

The proportion of high attainers in the 2014 GCSE cohorts of the schools attracting negative comments varied between 3% and 42%. All but three are below the national average for state-funded schools on this measure and half reported 20% or fewer high attainers.

This broadly supports the hypothesis that quality is less strong in schools where the proportion of high attainers is comparatively low.

Examples of typical negative comments:

  • ‘The most able students are not given work that is hard enough’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Too many students, particularly the most able, do not make the progress of which they are capable’ (New Line Learning Academy, Kent)
  • ‘Students, particularly the more able, make slower progress in some lessons where they are not sufficiently challenged. This can lead to some off task behaviour which is not always dealt with by staff’ (The Ferrers School, Northamptonshire)
  • ‘Teachers do not always make sufficient use of assessment information to plan work that fully stretches or challenges all groups of students, particularly the most able’ (Noel-Baker School, Derby).

The menu of shortcomings identified is limited, consisting of seven items: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information.

Of these, the most common comprise a familiar litany. They are (in descending order): 

  • Insufficiently challenging work 
  • Insufficient progress 
  • Underachievement and 
  • Low expectations.

Inspectors often point out inconsistent practice, though in the worst instances these shortcomings are dominant or even school-wide.

.

No key findings

Chart 9 shows the distribution of reports with no comments about the most able in Key findings and Areas for improvement according to overall inspection grade. When expressed as percentages, these again show that schools rated as outstanding are most likely to escape such comments, while inadequate schools are most likely to be in the firing line.

.


Chart 9: Most able not mentioned in key findings and areas for improvement by inspection grade (percentages)

This pattern replicates the findings from 2014. Orders of magnitude are also broadly comparable.  There is no substantive evidence of a major increase in emphasis from inspectors.

It seems particularly surprising that, in over half of schools requiring improvement and a third or more of inadequate schools, issues with educating the most able are still not significant enough to feature in these sections of inspection reports.

.

Areas for improvement

By definition, recommendations for improvement are always associated with identified shortcomings.

The correlation between key findings and areas for improvement is inconsistent. In six cases there were Key findings relating to the most able, but no area for improvement specifically associated with them. Conversely, nine reports had identified areas for improvement that were not picked up in the key findings.

Areas for improvement are almost always formulaic and expressed as lists: the school should improve x through y and z.

When it comes to the most able, the area for improvement is almost invariably teaching quality, though sometimes this is indicated as the route to higher achievement while on other occasions teaching quality and raising achievement are perceived as parallel priorities.

Just one report in the sample mentioned the quality of leadership and management:

‘Ensure that leadership and management take the necessary steps to secure a significant rise in students’ achievement at the end of Year 11 through…ensuring that work set for the most able is always sufficiently challenging’ (New Line Learning Academy, Kent).

This is despite the fact that leadership was specifically mentioned as a focus in HMCI’s Annual Report.

The actions needed to bring about improvement reflect the issues mentioned in the analysis of key findings above. The most common involve applying assessment information to planning and teaching:

  • ‘Raise students’ achievement and the quality of teaching further by ensuring that:…all staff develop their use of class data to plan learning so that students, including the most able, meet their challenging targets’ (Oasis Academy Isle of Sheppey, Kent)
  • ‘Ensure the quality of teaching is always good or better, in order to raise attainment and increase rates of progress, especially in English and mathematics, by:…ensuring teachers use all the information available to them to plan lessons that challenge students, including the most able’ (Oasis Academy Lister Park, Bradford)
  • ‘Embed and sustain improvements in achievement overall and in English in particular so that teaching is consistently good and outstanding by: making best use of assessment information to set work that is appropriately challenging, including for the least and most able students’ (Pleckgate High School Mathematics and Computing College, Blackburn with Darwen)

Other typical actions involve setting more challenging tasks, raising the level of questioning, providing accurate feedback, improving lesson planning and maintaining consistently high expectations.

.

Coverage in the main body of reports

.

Leadership and management

Given the reference to this in HMCI’s Annual Report, one might have expected a new and significant emphasis within this section of the reports in the sample.

In fact, the most able were only mentioned in this section in 13 reports (15% of the total). Hardly any of these comments identified shortcomings. The only examples I could find were:

  • ‘The most-able students are not challenged sufficiently in all subjects to achieve the higher standards of which they are capable’ (Birkbeck School and Community Arts College, Lincolnshire)
  • ‘Action to improve the quality of teaching is not focused closely enough on the strengths and weaknesses of the school and, as a result, leaders have not done enough to secure good teaching of students and groups of students, including…the most able’ (Ashington High School Sports College, Northumberland)

Inspectors are much more likely to accentuate the positive:

  • ‘The school has been awarded the Challenge Award more than once. This is given for excellent education for a school’s most-able, gifted and talented students and for challenge across all abilities. Representatives from all departments attend meetings and come up with imaginative ways to deepen these students’ understanding.’ (Cheam High School, Sutton)
  • ‘Leaders and governors are committed to ensuring equality of opportunity for all students and are making effective use of student achievement data to target students who may need additional support or intervention. Leaders have identified the need to improve the achievement of…the most-able in some subjects and have put in place strategies to do so’ (Castle Hall Academy Trust, Kirklees)
  • ‘Measures being taken to improve the achievement of the most able are effective. Tracking of progress is robust and two coordinators have been appointed to help raise achievement and aspirations. Students say improvements in teaching have been made, and the work of current students shows that their attainment and progress is on track to reach higher standards.’ (The Byrchall High School, Wigan).

Not one report mentioned the role of governors in securing effective provision for the most able. 

Given how often school leadership escapes censure for issues identified elsewhere in reports, this outcome could be interpreted as somewhat complacent. 

HMCI is quite correct to insist that provision for the most able is a whole school issue and, as such, a school’s senior leadership team should be held to account for such shortcomings.

Behaviour and safety

The impact of under-challenging work on pupils’ behaviour is hardly ever identified as a problem.

One example has been identified in the analysis of Key findings above. Only one other report mentions the most able in this section, and the comment is about the role of the school council rather than behaviour per se:

‘The academy council is a vibrant organisation and is one of many examples where students are encouraged to take an active role in the life of the academy. Sixth form students are trained to act as mentors to younger students. This was seen being effectively employed to…challenge the most able students in Year 9’ (St Thomas More High School, Southend)

A handful of reports make some reference under ‘Quality of teaching’ but one might reasonably conclude that neither bullying of the most able nor disruptive behaviour from bored high attainers is particularly widespread.

Quality of teaching

Statements about the most able are much more likely to appear in this section of reports. Altogether 59 of the sample (68%) made some reference.

Chart 10 shows the correlation between the incidence of comments and the sub-grade awarded by inspectors to this aspect of provision. It demonstrates that, while differences are relatively small, schools deemed outstanding are rather more likely to attract such comment.

But only one of the comments on outstanding provision is negative and that did not mention the most able specifically:

‘Also, in a small minority of lessons, activities do not always deepen students’ knowledge and understanding to achieve the very highest grades at GCSE and A level.’ (Central Foundation Boys’ School, Islington)

.


Chart 10: Incidence of comments under quality of teaching by grade awarded for quality of teaching

.

Comments are much more likely to be negative in schools where the quality of teaching is judged to be good (41%), requiring improvement (59%) and inadequate (58%).

Even so, a few schools in the lower two categories receive surprisingly positive endorsements:

  • ‘On the other hand, the most able students and the younger students in school consistently make good use of the feedback. They say they greatly value teachers’ advice….The teaching of the most able students is strong and often very strong. As a result, these students make good progress and, at times, achieve very well.’ (RI – The Elton High School Specialist Arts College, Bury)
  • ‘Teaching in mathematics is more variable, but in some classes, good and outstanding teaching is resulting in students’ rapid progress. This is most marked in the higher sets where the most able students are being stretched and challenged and are on track to reach the highest grades at GCSE…. In general, the teaching of the most able students….is good.’ (RI – New Charter Academy, Tameside)
  • ‘At its most effective, teaching is well organised to support the achievement of the most able, whose progress is better than other students. This is seen in some of the current English and science work.’ (I – Ely College, Cambridgeshire).

Negative comments on the quality of teaching supply a familiar list of shortcomings.

Some of the most perceptive are rather more specific. Examples include:

  • ‘While the best teaching allows all students to make progress, sometimes discussions that arise naturally in learning, particularly with more able students, are cut short. As a result, students do not have the best opportunity to explore ideas fully and guide their own progress.’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Teachers’ planning increasingly takes account of current information about students’ progress. However, some teachers assume that because the students are organised into ability sets, they do not need to match their teaching to individual and groups of students’ current progress. This has an inhibiting effect on the progress of the more able students in some groups.’ (Chulmleigh Community College, Devon)
  • ‘In too many lessons, particularly boys’ classes, teachers do not use questioning effectively to check students’ learning or promote their thinking. Teachers accept responses that are too short for them to assess students’ understanding. Neither do they adjust their teaching to revisit aspects not fully grasped or move swiftly to provide greater stretch and new learning for all, including the most able.’ (The Crest Academies, Brent)
  • ‘In some lessons, students, including the most able, are happy to sit and wait for the teacher to help them, rather than work things out for themselves’ (Willenhall E-ACT Academy, Walsall).

Were one compiling a list of what to do to impress inspectors, it would include the following items:

  • Plan lessons meticulously with the needs of the most able in mind 
  • Use assessment information to inform planning of work for the most able 
  • Differentiate work (and homework) to match most able learners’ needs and starting points 
  • Deploy targeted questioning, as well as opportunities to develop deeper thinking and produce more detailed pieces of work 
  • Give the most able the flexibility to pursue complex tasks and do not force them to participate in unnecessary revision and reinforcement 
  • Do not use setting as an excuse for neglecting differentiation 
  • Ensure that work for the most able is suitably challenging 
  • Ensure that subject knowledge is sufficiently secure for this purpose 
  • Maintain the highest expectations of what the most able students can achieve 
  • Support the most able to achieve more highly but do not allow them to become over-reliant on support 
  • Deploy teaching assistants to support the most able 
  • Respond to restlessness and low level disruption from the most able when insufficiently challenged.

While many of the reports implicitly acknowledge that the most able learners will have different subject-specific strengths and weaknesses, the implications of this are barely discussed.

Moreover, while a few reports attempt a terminological distinction between ‘more able’ and ‘most able’, the vast majority seem to assume that, in terms of prior attainment, the most able are a homogeneous group, whereas – given Ofsted’s preferred approach – there is enormous variation.

Achievement of pupils 

This is the one area of reports where reference to the most able is now apparently compulsory – or almost compulsory.

Just one report in the sample has nothing to say about the achievement of the most able in this section: that on Ashby School in Leicestershire.

Some of the comments are relatively long and detailed, but others are far more cursory and the coverage varies considerably.

Using as an example the subset of schools awarded a sub-grade of outstanding for the achievement of pupils, we can exemplify different types of response:

  • Generic: ‘The school’s most able students make rapid progress and attain excellent results. This provides them with an excellent foundation to continue to achieve well in their future studies.’ (Kelvin Hall School, Hull)
  • Generic, progress-focused: ‘The most-able students make rapid progress and the way they are taught helps them to probe topics in greater depth or to master skills at a high level.’ (St Richard Reynolds Catholic High School, Richmond-upon-Thames)
  • Achievement-focused, core subjects: ‘Higher attaining students achieve exceptionally well as a result of the support and challenge which they receive in class. The proportion of students achieving the higher A* to A grade was similar to national averages in English but significantly above in mathematics.’
  • Specific, achievement- and progress-focused: ‘Although the most able students make exceptional progress in the large majority of subjects, a few do not reach the very highest GCSE grades of which they are capable. In 2014, in English language, mathematics and science, a third of all students gained A and A* GCSE grades. Performance in the arts is a real strength. For example, almost two thirds of students in drama and almost half of all music students achieved A and A* grades. However, the proportions of A and A* grades were slightly below the national figures in English literature, geography and some of the subjects with smaller numbers of students’ (Central Foundation Boys’ School, Islington)

If we look instead at the schools with a sub-grade of inadequate, the comments are typically more focused on progress, but limited progress is invariably described as ‘inadequate’, ‘requiring improvement’, ‘weak’, ‘not good’, ‘not fast enough’. It is never quantified.

On the relatively few occasions when achievement is discussed, the measure is typically GCSE A*/A grades, most often in the core subjects.

It is evident from cross-referencing the Achievement of pupils sub-grade against the percentage of high attainers in the 2014 GCSE cohort that there is a similar correlation to that with the overall inspection grade:

  • In schools judged outstanding on this measure, the high attainer population ranges from 22% to 47% (average 33%)
  • In schools judged good, the range is from 13% to 52% (average 32%)
  • In schools requiring improvement it is between 3% and 40% (average 23%)
  • In schools rated inadequate it varies from 10% to 32% (average 22%)

.

Sixth Form Provision 

Coverage of the most able in sections dedicated to the sixth form is also extremely variable. Relatively few reports deploy the term itself when referring to 16-19 year-old students.

Sometimes there is discussion of progression to higher education and sometimes not. Where this does exist there is little agreement on the appropriate measure of selectivity in higher education:

  • ‘Students are aspiring to study at the top universities in Britain. This is a realistic prospect and illustrates the work the school has done in raising their aspirations.’ (Welling School, Bexley)
  • ‘The academy carefully tracks the destination of leavers with most students proceeding to university and one third of students gaining entry to a Russell Group university’ (Ashcroft Technology Academy, Wandsworth)
  • ‘Provision for the most able students is good, and an increasing proportion of students are moving on to the highly regarded ‘Russell group’ or Oxbridge universities. A high proportion of last year’s students have taken up a place at university and almost all gained a place at their first choice’ (Ashby School, Leicestershire)
  • ‘Large numbers of sixth form students progress to well-regarded universities’ (St Bartholomew’s School, West Berkshire)
  • ‘Students receive good support in crafting applications to universities which most likely match their attainment; this includes students who aspire to Oxford or Cambridge’ (Anthony Gell School, Derbyshire).

Most able and disadvantaged

Given the commitment in the 2015 survey report to ‘continue to focus sharply on the progress made by students who are able and disadvantaged’, I made a point of reviewing the coverage of this issue across all sections of the sample reports.

Suffice to say that only one report discussed provision for the most able disadvantaged students, in these terms:

‘Pupil premium funding is being used successfully to close the wide achievement gaps apparent at the previous inspection….This funding is also being effectively used to extend the range of experiences for those disadvantaged students who are most able. An example of this is their participation in a residential writing weekend.’ (St Hild’s C of E VA School, Hartlepool)

Take a bow Lead Inspector Petts!

A handful of other reports made more general statements to the effect that disadvantaged students perform equivalently to their non-disadvantaged peers, most often with reference to the sixth form:

  • ‘The few disadvantaged students in the sixth form make the same progress as other students, although overall, they attain less well than others due to their lower starting points’ (Sir Thomas Wharton Community College, Doncaster)
  • ‘There is no difference between the rates of progress made by disadvantaged students and their peers’ (Sarum Academy, Wiltshire)
  • ‘In many cases the progress of disadvantaged students is outstripping that of others. Disadvantaged students in the current Year 11 are on course to do every bit as well as other students.’ (East Point Academy, Suffolk).

On two occasions, the point was missed entirely:

  • ‘The attainment of disadvantaged students in 2014 was lower than that of other students because of their lower starting points. In English, they were half a grade behind other students in the school and nationally. In mathematics, they were a grade behind other students in the school and almost a grade behind students nationally. The wider gap in mathematics is due to the high attainment of those students in the academy who are not from disadvantaged backgrounds.’ (Chulmleigh Community College, Devon)
  • ‘Disadvantaged students make good progress from their starting points in relation to other students nationally. These students attained approximately two-thirds of a GCSE grade less than non-disadvantaged students nationally in English and in mathematics. This gap is larger in school because of the exceptionally high standards attained by a large proportion of the most able students…’ (Durham Johnston Comprehensive School, Durham)

If Ofsted believes that inspectors are already focusing sharply on this issue then, on this evidence, it is sadly misinformed.

Key Findings and areas for improvement

.

Key findings: Guidance

  • Ofsted inspectors have no reliable definition of ‘most able’ and no guidance on the appropriateness of definitions adopted by the schools they visit. The approach taken in the 2015 survey report is different to that adopted in the initial 2013 survey and is now exclusively focused on prior attainment. It is also significantly different to the high attainer measure in the Secondary Performance Tables.
  • Using Ofsted’s approach, the national population of most able in Year 7 approaches 50% of all learners; in Year 11 it is some 40% of all learners. The latter is some eight percentage points lower than the cohort derived from the Performance Tables measure.
  • The downside of such a large cohort is that it masks the huge attainment differences within the cohort, from a single L5C (and possibly an L3 in either maths or English) to a clutch of L6s. Inspectors might be encouraged to regard this as a homogeneous group.
  • The upside is that there should be a most able presence in every year group of every school. In some comprehensive schools, high attainers will be a substantial majority in every year group; in others there will be no more than a handful.
  • Ofsted has not released data showing the incidence of high attainers in each school according to its measure (or the Performance Tables measure for that matter). This does not feature in Ofsted’s Data Dashboard.
  • Guidance in the current School Inspection Handbook is not entirely helpful. There is not space in a Section 5 inspection report to respond to all the separate references (see Appendix for the full list). The terminology is confused (‘most able’, ‘more able’, ‘high attainers’). Too often the Handbook mentions several different groups alongside the most able, one of which is disadvantaged pupils. This perpetuates the false assumption that there are no most able disadvantaged learners. We do not yet know whether there will be wholesale revision when new Handbooks are introduced to reflect the Common Inspection Framework.
  • At least four pieces of subsidiary guidance have been issued to inspectors since October 2013. But there has been nothing to reflect the commitments in HMCI’s Annual Report (including a stronger focus on school leadership of this issue) or the March 2015 survey report. This material requires enhancement and consolidation.
  • The March 2015 Report apparently commits to more intensive scrutiny of curricular and IAG provision in Section 5 inspections, as well as ‘continued focus’ on able and disadvantaged students (see below). A subsequent commitment to an evaluation toolkit would be helpful to inspectors as well as schools, but its structure and content has not yet been revealed.

Key findings: Survey

  • The sample for my survey is broadly representative of regions, school status and variations in NOR (number on roll). In terms of overall inspection grades, 10% are outstanding, 31% good, 37% require improvement and 22% are inadequate. In terms of sub-grades, they are notably weaker on Quality of teaching and Achievement of pupils, the two sections that most typically feature material about the most able.
  • There is huge variation within the sample by percentage of high attainers (2014 GCSE population according to the Secondary Performance Tables measure). The range is from 3% to 52%. The average is 32%, very slightly under the 32.3% average for all state-funded schools. Comparing overall inspection grade with percentage of high attainers suggests a marked difference between those rated outstanding/good (average 32/33%) and those rated as requiring improvement/inadequate (average 23%).
  • 45% of the reports in the sample addressed the most able under Key findings; 38% did so under Areas for improvement and 28% made reference in both sections. However, 45% made no reference in either of these sections. In 2014, 34% mentioned the most able in both main findings and recommendations, while 52% mentioned it in neither. On this measure, inspectors’ focus on the most able has not increased substantively since last year.
  • Community and foundation schools were rather more likely to attract such comments than either converter or sponsored academies. Voluntary schools were least likely to attract them. The lower the overall inspection grade, the more likely a school is to receive such comments.
  • In Key findings, negative comments outnumbered positive comments by a ratio of 3:1. Schools with high percentages of high attainers were well represented amongst those receiving positive comments.
  • Unsurprisingly, schools rated inadequate overall were much more likely to attract negative comments. A correlation between overall quality and quality of provision for the most able was somewhat more apparent than in 2014. There was also some evidence to suggest a correlation between negative comments and a low proportion of high attainers.
  • On the other hand, over half of schools with an overall requiring improvement grade and a third with an overall inspection grade of inadequate did not attract comments about the most able under Key findings. This is not indicative of greater emphasis.
  • The menu of shortcomings is confined to seven principal faults: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information. In most cases practice is inconsistent but occasionally problems are school-wide.
  • Areas for improvement are almost always expressed in formulaic fashion. Those relating to the most able focus almost invariably on the Quality of teaching. The improvement most commonly urged is more thorough application of assessment information to planning and teaching.
  • Only 15% of reports mention the most able under Leadership and management and, of those, only two are negative comments. The role of governors was not raised once. Too often the school leadership escapes censure for shortcomings identified elsewhere in the report. This is not consistent with indications of new-found emphasis in this territory.
  • The most able are hardly ever mentioned in the Behaviour and safety section of reports. It would seem that bullying is invisible and low level disruption by bored high attainers rare.
  • Conversely, 68% of reports referenced the most able under Quality of teaching. Although negative comments are much more likely in schools judged as inadequate or requiring improvement in this area, a few appear to be succeeding with their most able against the odds. The main text identifies a list of twelve good practice points gleaned from the sample.
  • Only one report fails to mention the most able under Achievement of pupils, but the quality and coverage varies enormously. Some comments are entirely generic; some focus on achievement, others on progress and some on both. Few venture beyond the core subjects. There is very little quantification, especially of insufficient progress (and especially compared with equivalent discussion of progress by disadvantaged learners).
  • Relatively few reports deploy the term ‘most able’ when discussing sixth form provision. Progression to higher education is sometimes mentioned and sometimes not. There is no consensus on how to refer to selective higher education.
  • Only one report in this sample mentions disadvantaged most able students. Two reports betray a tendency to assume these two groups are mutually exclusive but, worse still, the sin of omission is almost universal. This provides no support whatsoever for Ofsted’s claim that inspectors already address the issue.

Areas for improvement

Ofsted has made only limited improvements since the previous inspection in May 2014 and its more recent commitments are not yet reflected in Section 5 inspection practice.

In order to pass muster it should:

  • Appoint a lead inspector for the most able who will assume responsibility across Ofsted, including communication and consultation with third parties.
  • Consolidate and clarify material about the most able in the new Inspection Handbooks and supporting guidance for inspectors.
  • Prepare and publish a high quality evaluation toolkit, to support schools and inspectors alike. This should address definitional and terminological issues as well as supplying benchmarking data for achievement and progress. It might also set out the core principles underpinning effective practice.
  • Include within the toolkit a self-assessment and evaluation framework based on the quality standards. This should model Ofsted’s understanding of whole school provision for the most able that aligns with outstanding, good and requiring improvement grades, so that schools can understand the progression between these points.
  • Incorporate data about the incidence of the most able and their performance in the Data Dashboard.
  • Extend all elements of this work programme to the primary and post-16 sectors.
  • Undertake this work programme in consultation with external practitioners and experts in the field, completing it as soon as possible and by December 2015 at the latest.

.

Verdict: (Still) Requires Improvement.

GP

April 2015


.

Annex: Coverage in the School Inspection Handbook (December 2014)

Main Text

Inspectors should:

  • Gather evidence about how well they are ‘learning, gaining knowledge and understanding, and making progress’ (para 40)
  • Take account of them when considering performance data (para 59)
  • Take advantage of opportunities to gather evidence from them (para 68)
  • Consider the effectiveness of pupil grouping, for example ‘where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched…’ (para 153)
  • Explore ‘how well the school works with families to support them in overcoming the cultural obstacles that often stand in the way of the most able pupils from deprived backgrounds attending university’ (para 154)
  • Consider whether ‘teachers set homework in line with the school’s policy and that challenges all pupils, especially the most able’ (para 180)
  • Consider ‘whether work in Key Stage 3 is demanding enough, especially for the most able when too often undemanding work is repeated unnecessarily’ (para 180)
  • Consider whether ‘teaching helps to develop a culture and ethos of scholastic excellence, where the highest achievement in academic work is recognised, especially in supporting the achievement of the most able’ (para 180)
  • When judging achievement, have regard for ‘the progress that the most able are making towards attaining the highest grades’ and ‘pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should’. They must ‘summarise the achievements of the most able pupils in a separate paragraph of the inspection report’ (paras 185-7)
  • Consider ‘how the school uses assessment information to identify pupils who…need additional support to reach their full potential, including the most able.’ (para 193)
  • Consider how well ‘assessment, including test results, targets, performance descriptors or expected standards are used to ensure that…more able pupils do work that deepens their knowledge and understanding’ and ‘pupils’ strengths and misconceptions are identified and acted on by teachers during lessons and more widely to… deepen the knowledge and understanding of the most able’ (para 194)
  • Take account of ‘the learning and progress across year groups of different groups of pupils currently on the roll of the school, including…the most able’. Evidence gathered should include ‘the school’s own records of pupils’ progress, including… the most able pupils such as those who joined secondary schools having attained highly in Key Stage 2’ (para 195)
  • Take account of ‘pupils’ progress in the last three years, where such data exist and are applicable, including that of…the most able’ (para 195)
  • ‘When inspecting and reporting on students’ achievement in the sixth form, inspectors must take into account all other guidance on judging the achievement, behaviour and development of students, including specific groups such as…the most able ‘ (para 210)
  • Talk to sixth form students to discover ‘how well individual study programmes meet their expectations, needs and future plans, including for…the most able’ (para 212)

However, the terminology is not always consistent. In assessing the overall effectiveness of a school, inspectors must judge its response to ‘the achievement of…the highest and lowest attainers’ (para 129)

Grade descriptors

Outstanding

  • Overall effectiveness:

‘The school’s practice consistently reflects the highest expectations of staff and the highest aspirations for pupils, including the most able…’

  • Quality of teaching:

‘Much teaching over time in all key stages and most subjects is outstanding and never less than consistently good. As a result, almost all pupils currently on roll in the school, including…the most able, are making sustained progress that leads to outstanding achievement.’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is consistently good or better.’

  • Effectiveness of sixth form provision:

‘All groups of pupils make outstanding progress, including…the most able’

Good

  • Overall effectiveness:

‘The school takes effective action to enable most pupils, including the most able…’

  • Quality of teaching:

‘Teaching over time in most subjects, including English and mathematics, is consistently good. As a result, most pupils and groups of pupils on roll in the school, including…the most able, make good progress and achieve well over time.’

‘Effective teaching strategies, including setting appropriate homework and well-targeted support and intervention, are matched closely to most pupils’ needs, including those most and least able, so that pupils learn well in lessons’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is generally good.’

  • Effectiveness of sixth form provision:

‘As a result of teaching that is consistently good over time, students make good progress, including…the most able’

Inadequate

  • Quality of teaching:

‘As a result of weak teaching over time, pupils or particular groups of pupils, including…the most able, are making inadequate progress.’

  • Achievement of pupils:

‘Groups of pupils, particularly disabled pupils and/or those who have special educational needs and/or disadvantaged pupils and/or the most able, are underachieving’

  • Effectiveness of sixth form provision:

‘Students or specific groups such as… the most able do not achieve as well as they can. Low attainment of any group shows little sign of rising.’

How strong is Oxbridge access?

.

This post assesses how well Oxford and Cambridge Universities support fair access for students from disadvantaged backgrounds attending state-funded schools and colleges.

courtesy of Wellcome Images

It sets out an evidence base to inform and support an Access Lecture I have been asked to give at Brasenose College, Oxford on 28 April 2015.

The outline for that Lecture is as follows:

‘If national efforts:

  • by state-funded schools and colleges to close high attainment gaps between learners from advantaged and disadvantaged backgrounds
  • by selective higher education institutions to secure fair access for students from disadvantaged backgrounds

could be integrated more effectively, much more substantial progress could be achieved on both fronts.

There is scope for reform in both sectors, to ensure a closer fit between the ‘push’ from schools and colleges and the ‘pull’ from higher education.

Faster progress will be achieved through a national framework that brings greater coherence to the market on both the demand and supply sides. It should be feasible to focus all support directly on learners, regardless of their educational setting.

Oxford and Cambridge should position themselves at the forefront of such efforts, serving as beacons of excellence and exemplary practice.’

This is a companion piece to two previous posts:

The first of these explores the issue from first principles, considering measures, targets and data before outlining a 10-point improvement plan. The second advances a simplified version of this plan.

This post concentrates principally on description of the access-related activities of these two universities, placing those in the wider context of updated material about national policy developments and the relatively disappointing outcomes achieved to date.

It is organised into five main sections:

  • A review of key changes to the national access effort since November 2013.
  • A note on outcomes, which questions whether Oxbridge reflects the positive trends reported for selective higher education as a whole.
  • In-depth analysis of how fair access work has developed, at Oxford and Cambridge respectively, as revealed by their successive access agreements.
  • Analysis of signature access programmes at Oxford and Cambridge, featuring their rival residential summer schools and efforts to develop a longer term relationship with disadvantaged students, as recommended by Offa.
  • My personal assessment of strengths and areas for development, including a slightly revised version of the improvement strategy I have proposed in earlier posts.

Given the length of the post I have inserted page jumps to each section.

.

Recent developments in national fair access policy

My November 2013 post supplies considerable detail about the regulation of fair access to English universities which I shall not repeat here.

Amongst other things, it deals with:

  • Published data on high attainment by disadvantaged students and their progression to Oxbridge – and how this has not always been used appropriately.

This section describes briefly the principal changes to the national fair access mechanisms introduced by and subsequent to the National Strategy – and explains how access agreements fit into these mechanisms.

.

National Strategy for Access and Student Success

The National Strategy sets out a ‘student lifecycle approach’ in which access forms the first of three main stages.

It seeks to address:

‘…the wide gap in participation rates between people from advantaged and disadvantaged backgrounds in society, and between students with different characteristics, particularly at the most selective institutions.’

There are six key actions:

  • Introduce a national approach to collaborative outreach that will foster new collaborative partnerships, reduce duplication and support the tracking of students who have undertaken outreach activities. Hefce will fund the national roll-out of a tracking system.
  • Secure a more coherent approach to the provision of information, advice and guidance. HE outreach activity and schools policy will be ‘joined up’.
  • Develop a national evaluation framework, so universities can evaluate their activities more effectively and provide comparable national data. Hefce and Offa will examine the feasibility of sector-wide evaluation measures and publish good practice guidance by January 2015.
  • Co-ordinate national research into access, build the evidence base for effective outreach and share good practice.
  • Introduce a joint Hefce-Offa approach to requesting information from institutions and
  • Encourage institutions to re-balance their funding from financial support towards outreach and collaborative outreach.

The new national approach to collaborative outreach will be derived from a set of principles (the emboldening is mine):

  • ‘Outreach is most effective when delivered as a progressive, sustained programme of activity and engagement over time.
  • Outreach programmes need to be directed towards young people at different stages of their educational career and begin at primary level.
  • The effective delivery of outreach programmes requires the full, adequately resourced involvement and engagement of HEIs, FECs and schools.
  • The collaborative provision of outreach delivers significant benefits in terms of scale, engagement, co-ordination and impartiality.
  • Progression pathways for learners with non-traditional or vocational qualifications need to be clearly articulated.
  • Outreach to mature learners depends on good links with FECs, employers and the community.
  • Without good advice and guidance, outreach is impoverished and less effective.’

In November 2013, institutions were advised that they would be expected to prepare their own Strategies for Access and Student Success (SASS), which would replace Offa’s access agreements and Hefce’s widening participation strategic statements.

These would cover the period 2014-19, incorporating the information and commitments that would otherwise have featured in 2015-16 access agreements. In future these arrangements would be updated each spring. Full guidance was promised by late January 2014.

However, further guidance was issued in February 2014 stating that separate returns would continue because:

‘…of the Department for Business, Innovation and Skills’ unexpected delay in sending HEFCE’s grant letter, and because we appreciate that institutions need to make progress with their access and student success plans, which must be approved by the Director of Fair Access to Higher Education. Separating our information requirements is the most pragmatic approach at this time.’

Hefce now says:

‘We are no longer requesting widening participation strategic statements from institutions and are moving towards an outcomes framework for 2014-15 onwards.’

It appears that the SASS concept has been set aside permanently. Certainly Offa’s 2016-17 guidance (February 2015) envisages the continuation of separate access agreements, although there is now a single monitoring return to Offa and Hefce.

Initiatives prompted by the National Strategy

The outcomes framework will be informed by two research projects, one developing a data return, the other designed to establish how an outcomes framework ‘could lead us to understand the relative impact of a wider range of access and student success activities and expenditure’.

As far as I can establish there has been nothing further on evaluation. Hefce’s website mentions guidance, but the link is to material published in 2010.

However, the current work programme does include rolling out a Higher Education Access Tracker (HEAT) which helps universities track outreach participants through to HE entry. Hefce is funding this to the tune of £3m over 2014-17, but institutions must also pay a subscription – and only 21 are currently signed up.

The strategy is also establishing National Networks for Collaborative Outreach (NNCOs) which, it is claimed:

‘will deliver a nationally coordinated approach to working with schools, universities and colleges to help people access HE’.

In fact, the purpose of the networks is almost exclusively the provision of information.

They will supply a single point of contact providing information for teachers and advisers about outreach activity in their area, as well as general advice about progression to HE. They will undertake this through websites to be available ‘in early spring 2015’.

At the time of writing, Hefce’s website merely lists the institutions participating in each network – there are no links to live websites for any of these.

There is a budget of £22m for the networks over academic years 2014/15 and 2015/16. Each network receives £120,000 per year and there is also a small additional allocation for each institution.

Three of the networks have national reach, one of them to support students wishing to progress to Oxbridge. This is called the Oxford and Cambridge Collaborative Network. Oxford is the lead institution.

A Google search confirms that the network had no web presence at the time of writing. However, Oxford’s press release says:

‘Oxford will lead the Oxford and Cambridge NNCO, which will aim to offer specific support to students hoping to study at Oxford and Cambridge by reaching out to students and teachers in more than 1,600 schools across England. The collaboration will build on the current information and advice already offered to students and teachers, but enhanced by activities including a new interactive website, online webinars with admissions staff from Oxford and Cambridge, and more resources for activities in local schools linked to Oxford and Cambridge colleges….

… Online webinars with admissions staff from both universities will make it easier to make contact with students and schools from hard to reach geographic areas, and those schools with limited numbers of high-achieving students each year.

The new network will aim to work with state schools across England with particular emphasis on those in areas that currently have little engagement with Oxford and Cambridge outreach; those in schools offering post-16 (GCSE) education; those from schools with low progression to Oxford or Cambridge, or from areas of socioeconomic disadvantage.’

Offa guidance and strategic plan

Offa’s latest access agreement guidance (for 2016-17 agreements) sets out future priorities that are consistent with the national strategy. These include:

  • Greater emphasis on long-term outreach: ‘Evidence suggests that targeted, longterm outreach which boosts achievement and aspirations among disadvantaged people is a more effective way of widening access than institutional financial support. Where appropriate, you should therefore consider how you can strengthen your work to raise the aspiration and attainment of potential students of all ages, from primary school pupils through to adults.’
  • More effective collaboration: ‘Collaboration between institutions providing outreach is not limited to alliances of higher education institutions (HEIs). We would normally expect collaborative outreach to include many stakeholders rather than to be between a single HEI and schools, colleges or other stakeholders receiving outreach. For example, collaboration may be between one HEI and further education colleges (FECs), other higher education providers, employers, third sector organisations, schools, colleges, training providers, local authorities and so on.’
  • Stretching targets for achieving faster progress: ‘we now ask you to review and set new stretching targets which set out the desired outcomes of the work set out in your access agreement. When reviewing your targets, we expect all institutions, particularly those with relatively low proportions of students from under-represented groups, to demonstrate how they intend to make faster progress in improving access, success and/or progression for these students. This is in line with the aims expressed in our forthcoming strategic plan, which is informed by guidance from Ministers.’

This strategic plan was published in February 2015. It notes that, while some progress has been made in improving access for disadvantaged students to selective higher education, there is much more still to do.

‘Despite these improvements, the gaps between the most advantaged and most disadvantaged people remain unacceptably large. The latest UCAS data shows that, on average, the most advantaged 20 per cent of young people are 2.5 times more likely to go to higher education than the most disadvantaged 20 per cent. At the most selective institutions this ratio increases – with the most advantaged young people on average 6.8 times more likely to attend one of these institutions compared to the most disadvantaged young people.’

One of Offa’s targets (described as ‘Sector Outcome Objectives’) is:

‘To make faster progress to increase the entry rate of students from underrepresented and disadvantaged groups entering more selective institutions, and narrow the participation gap between people from the most and least advantaged backgrounds at such institutions.’

The measure selected is English 18-year-old entry rates by POLAR2 for higher tariff providers. The targets are:

‘….for the entry rate from quintile 1 to increase from 3.2 per cent in 2014-15 to 5 per cent by 2019-20, and from 5.1 per cent in 2014-15 to 7 per cent by 2019-20 for quintile 2. To reduce the gap in participation, our target is for the quintile 5: quintile 1 ratio to decrease from 6.8 in 2014-15 to 5.0 by 2019-20.’

.

A Note on Outcomes

High tariff HEIs

As Offa suggests, there is some cause for optimism about wider progress towards fair access, albeit from an exceedingly low base.

The UCAS End of Cycle Report 2014 indicates that:

  • Students from POLAR Quintile 1 are 40% more likely to enter a high-tariff institution than in 2011, though the percentage achieving this is still tiny (it has increased from 2.3% to 3.2%).
  • FSM-eligible students (the Report doesn’t indicate whether they were ‘ever 6 FSM’ or FSM in Year 11) are 50% more likely to enter a higher tariff institution than in 2011, but the 2014 success rate is still only 2.1%.

As noted above, Offa’s Strategic Plan for 2015-20 includes a target to increase the POLAR Quintile 1 success rate from 3.2% to 5% by 2019-20.

This is an increase of 56% in the six years between 2014 and 2020, compared with an increase of 40% in the three years between 2011 and 2014. Looked at in this way it is relatively unambitious.
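As a rough check on that comparison, the implied annual growth rates can be computed directly. This is a quick sketch in Python, using only the figures quoted above:

```python
# Compare the implied annual growth rates of the POLAR Quintile 1 entry rate:
# 2.3% (2011) -> 3.2% (2014) actual, versus the Offa target of
# 3.2% (2014) -> 5.0% (2020).

def annualised_growth(start, end, years):
    """Compound annual growth rate implied by moving from start to end."""
    return (end / start) ** (1 / years) - 1

actual_2011_14 = annualised_growth(2.3, 3.2, 3)   # ~11.6% per year
target_2014_20 = annualised_growth(3.2, 5.0, 6)   # ~7.7% per year

print(f"Actual 2011-14: {actual_2011_14:.1%} per year")
print(f"Target 2014-20: {target_2014_20:.1%} per year")
```

On an annualised basis, the improvement actually achieved between 2011 and 2014 (roughly 11.6% a year) comfortably outpaces the rate of improvement the target demands (roughly 7.7% a year).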

But what of Oxbridge? How does its performance compare with other high-tariff institutions?

Oxbridge argues that it is a special case – because of its higher entrance requirements – so should not be judged by the same criteria as other high tariff institutions. It is for this reason that Oxford and Cambridge are reluctant to be assessed against HESA’s Performance Indicators.

Offa’s access agreement methodology enables universities to set targets that reflect their different circumstances, but its own KPIs are framed according to national measures which might not be appropriate to some.

There is no separate Offa target to improve Oxbridge access. When it comes to system-wide performance measures, only DfE’s Impact Indicator 12: Percentage of children on free school meals progressing to Oxford or Cambridge University is specific to Oxbridge.

This is based on the DfE’s experimental Destination Measures statistics. FSM eligibility is determined in Year 11 rather than via the ‘ever 6’ methodology.

The Indicator reports an increase from 0.1% in 2010/11 to 0.2% in 2011/12. (This compares with a reported increase in FSM progression to Russell Group universities from 3.0% to 4.0%.)

However, as I have pointed out:

  • The 2010/11 intake was 30 and the 2011/12 intake 50.
  • The 2011/12 intake comprised 40 students from state-funded schools and 10 from state-funded colleges, but both numbers are rounded to the nearest 10.
  • The 2012/13 intake, not yet incorporated into the Indicator, is unchanged from 2010/11, both numbers again rounded to the nearest 10, so any improvement achieved in 2011/12 stalled completely in 2012/13.

The most recent data reported to Offa by Oxford and Cambridge also relates to 2012/13.

.

Cambridge

Cambridge uses the POLAR Quintile 1 measure, also a HESA benchmark, though adjusted downwards to reflect its high attainment threshold. It is aiming for a target of 4.0% by 2016/17, against a 2009/10 baseline of 3.1%.

The 2011/12 outcome is given as 2.5%. The 2012/13 line is blank, on the grounds that HESA had not yet reported it. We can now see that the 2012/13 outcome was in fact 3.5% (POLAR2): a significant improvement, more than recovering the previous year’s decline. HESA has recently published the 2013/14 outcome, 3.6%, a very slight improvement on the previous year.

HESA’s own benchmarks for Cambridge (again POLAR2) were 4.4% in 2011/12, 4.7% in 2012/13 and 4.6% in 2013/14, so it continues to undershoot these quite significantly.

In its latest 2015-16 agreement, Cambridge’s 2017/18 target is unchanged at 4.0% (but is now expressed in terms of POLAR3 quintile 1). It has not set a target for 2018/19.

Given Offa’s commitment to achieving a 5.0% outcome by 2019/20, it will be interesting to see where Cambridge pitches its own target in its 2016-17 access agreement. Will it, too, aim for 5%, or will it scale back its own target on the grounds that the attainment profile of its intake is atypically high?

.

Oxford

Oxford opts for a different measure. It reports outcomes only for POLAR Quintiles 1 and 2 combined, which is insufficiently specific, and instead uses a measure based on ACORN postcode analysis as its principal indicator of access for disadvantaged students.

On this second measure, it reports a target of 9.0% by 2016/17 against a 2009/10 baseline of 6.1% and, more recently, has projected this forward to 10% by 2018/19.

The 2011/12 outcome is 7.6% and the 2012/13 outcome is 6.7%. This fall of 0.9 percentage points is annotated ‘Progress made – but less than anticipated’.

If we were to apply the POLAR2 HESA Quintile 1 measure to Oxford, it would have registered 2.6% in 2011/12 (against a HESA benchmark of 4.7%), 3.0% in 2012/13 (against a benchmark of 4.9%) and only 2.4% in 2013/14 (against a benchmark of 4.8%).

The reason is presumably the atypical attainment threshold for admission to Oxford.

Oxford does not have the benefit of an Offa marker against which to pitch its ACORN target for 2019-20.

Comparing Oxford and Cambridge

Graph 1, below, illustrates progress against each university’s principal measure of fair access, as well as the trend implied by its targets.

.

Oxbridge graph 1

Graph 1: Oxford and Cambridge: Progress against principal fair access target and projected outcomes for future years

The graph shows inconsistent progress to 2012/13. Oxford’s trend is broadly positive, but Cambridge has not yet caught up with where it was in 2008/09. The trajectory implied by Oxford’s targets is more ambitious than Cambridge’s.

Graph 2, below, provides further analysis of Oxford’s outcomes, based on data provided in the most recent 2015-16 access agreement. Unfortunately Cambridge is less transparent in this respect.

Graph 2 shows the same pattern of progress against the ACORN target as in Graph 1, except that the 2013 figure is an actual outcome (6.8%) rather than a target (7.5%).

It also shows for each year the percentage of all applicants from ACORN 4/5 postcodes who applied successfully. These compare with a success rate for all applicants of around 20%, giving a gap of three or four percentage points to make up. Progress on this measure has also fluctuated, falling back significantly in 2010 and not yet returning to the mark achieved in 2009.

Preliminary data for 2014 suggests a significant improvement, however. The agreement says that 320 conditional offers have been made, giving an estimated figure for acceptances of 275 (my estimate, not Oxford’s) and a corresponding success rate of 19.2%. If confirmed, this will be a significant step forward.

.

Oxbridge graph 2

Graph 2:  The percentage of all successful applicants drawn from ACORN 4/5 postcodes and the percentage of all applicants from ACORN 4/5 postcodes who are successful, 2008-2013

.

Graph 3, below, is derived from the data underpinning the DfE’s experimental KS5 destination statistics for 2010 to 2011, 2011 to 2012 and 2012 to 2013. It provides, for each year, the percentage of admissions to Oxbridge, Russell Group, Top Third and all HEIs accounted for by FSM students.

.

Additional Oxbridge graph

Graph 3: Percentage of admissions to Oxbridge, RG, Top third and all HEIs accounted for by FSM students, 2010/11 to 2012/13 (from DfE destination statistics, underlying data)

.

The Oxbridge data, especially, must be treated with a degree of caution, since all figures are derived from separate totals for state-funded schools and colleges, each rounded to the nearest 10. Consequently, changes from year to year may be inflated or deflated by the generous rounding.
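To gauge how much uncertainty that rounding introduces, here is an illustrative sketch (assuming conventional round-half-up to the nearest 10), applied to the reported 2011/12 intake of 40 from state-funded schools and 10 from colleges:

```python
# Each component is rounded to the nearest 10, so a reported value r
# corresponds to a true integer count anywhere in [r - 5, r + 4]
# (assuming round-half-up). Illustrated with the reported 2011/12
# Oxbridge FSM intake: 40 from schools, 10 from colleges.

def true_range(reported):
    """Interval of integer counts that round (to the nearest 10) to `reported`."""
    return reported - 5, reported + 4

schools_lo, schools_hi = true_range(40)    # 35..44
colleges_lo, colleges_hi = true_range(10)  # 5..14

total_lo = schools_lo + colleges_lo  # 40
total_hi = schools_hi + colleges_hi  # 58

print(f"Reported total: 50; true total could lie anywhere from {total_lo} to {total_hi}")
```

A reported total of 50 is therefore consistent with any true figure between 40 and 58, which is why year-on-year changes in these numbers must be read cautiously.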

Nevertheless, one can see that FSM admission to Oxbridge continues to lag well behind the rates for admission to selective higher education more generally. Although one might argue that Oxbridge is improving at a faster rate, it is doing so from a significantly lower base and, in the most recent year (2012/13), the improvement in all other respects is not mirrored in the Oxbridge figures.

Although the rounded number of FSM admissions to Oxbridge in 2012/13 remained unchanged from 2011/12 (at 50), the number of non-FSM admissions increased by 190, dragging down the percentage.

To summarise:

  • There is an unhelpful two-year lag in outcomes data and limited commonality in the basis of the measures used to set targets, making comparison much more difficult than it needs to be.
  • Neither university routinely releases details of the number of FSM or ‘ever 6’ FSM students within its intake, but DfE destinations data, also affected by a two-year lag, shows that FSM admissions to Oxbridge are significantly lower than to selective HE more generally. The actual number of FSM students admitted has been more or less stalled at 50 or fewer for a decade.
  • Fair access to Oxbridge is improving slightly, but not consistently. Cambridge has not yet caught up with where it was in 2008/09. Oxford’s progress is more secure than Cambridge’s, and Oxford’s target is more challenging.

Access Agreements

Access agreements are approved annually by Offa, which issues annual guidance to inform the review process.

It looks particularly at the nature of the access measures adopted, the resources allocated and whether targets and milestones are suitably challenging.

Offa archives old access agreements on its website as well as universities’ self-assessments. The latter should:

  • ‘assess their progress against each target they set themselves in their agreements
  • provide data showing their progress against targets for each academic year since 2006-07 and
  • provide a commentary setting their access work in context, highlighting any particular challenges they have faced, and, if they have not made progress as wished, explaining the reasons for this.’

The archive includes:

  • Access agreements for Oxford and Cambridge for 2006-07 through to 2015-16 and
  • Self-assessments for Oxford and Cambridge for 2010-11 through to 2012-13

Self-assessments for 2013-14 were due during January 2015 but have not yet been published. In previous years they have not appeared until July.

Access agreements for 2016-17 are due for submission during April 2015. They too are unlikely to appear before July.

Analysis of how access agreements have changed over time provides a valuable insight into the evolution of institutional policies, including the extent to which these have been modified in line with Offa’s guidance.

Comparison between Oxford and Cambridge’s access agreements also helps to draw out key differences between their respective access policies, as well as comparative strengths and weaknesses and areas in which they might potentially learn from each other.

The sections below explore the chronological development of each university’s access agreement under four headings:

  • Budget: The total budget devoted to activity within scope of the agreement, and the balance between funding for bursaries and outreach respectively
  • Bursaries: The bursaries provided to students from the most disadvantaged backgrounds
  • Outreach: The range of activities undertaken 
  • Targets: The targets and milestones set and progress against those not already discussed above.

I have also included a section of Commentary, intended to capture observations that throw additional light on the institution’s approach and attitude to access.

It is important to note that the two universities now adopt a somewhat different approach to the nature of access agreements.

The agreements for 2006-07 were nine (Oxford) and eight (Cambridge) pages in length. Cambridge’s 2015-16 agreement is slightly longer, at 11 pages, but Oxford’s is 48 pages long.

In recent years, Oxford’s agreement has consistently been much more detailed and more informative. This distinction will be apparent from the analysis below.

Moreover, Cambridge’s agreement was unchanged from 2006-07 to 2009-10, whereas Oxford’s changed somewhat in this period. Both universities submitted single agreements for 2010-11 and 2011-12, but both have changed their agreements – at least to some degree – each year since then.

.

Budget (£m pa)

Costs are not always as clearly expressed as one would wish, nor are they always fully comparable. This is despite the fact that Offa now produces a template for the purpose.

There is very limited information in Cambridge’s most recent agreement, whereas Oxford supplies extensive detail, including (at Offa’s behest) what is and is not ‘Offa-countable’:

‘When calculating your progression spend, please note that OFFA’s remit only extends to students and courses that are fee-regulated. This means that only measures targeted at undergraduate students (or postgraduate ITT students) from under-represented and disadvantaged groups should be included in your OFFA-countable spend. For example, you should not include spend on financial support for postgraduate students in your OFFA-countable expenditure, although you may include this in your total expenditure on progression.’ (Offa, 2015-16 Resource Plan)

The tables below represent my best effort at harvesting comparable figures. The first table summarises Cambridge’s budget, the second Oxford’s.

Year    | Bursaries | Outreach | Total   | Notes
--------|-----------|----------|---------|------
2006-10 | £7.0m     | £1.15m   | £8.15m  | Bursary cost in steady state. £0.425m of outreach budget from AimHigher and Hefce funds.
2010-12 | £7.5m     | £1.15m   | £8.65m  | Bursary cost in steady state. £0.45m of outreach budget from AimHigher and Hefce funds.
2012-13 | £8.3m     | £4.2m    | £12.5m  | Bursary cost in steady state, including £1.2m steady state assumption for NSP. Total outreach cost includes £2.7m current expenditure plus £1.5m from fee income.
2013-14 | £8.3m     | £4.2m    | £12.5m  | As above.
2014-15 | £8.0m     | £4.66m   | £12.66m | Bursary cost in steady state, including £0.9m for NSP. Total outreach cost includes £2.7m current expenditure plus £1.96m fee income (of which £0.258m is redirected from NSP).
2015-16 | £6.9m     | £3.0m    | £9.9m   | Bursary cost in steady state. Total outreach cost includes unspecified fee income.

Table 1: Summary of costs in Cambridge’s access agreements, 2006-2016

.

Year    | Bursaries         | Outreach        | Total   | Notes
--------|-------------------|-----------------|---------|------
2006-07 | £6.8m             | £1.35m          | £8.15m  | Bursary cost in steady state. An additional £3m is provided through college support.
2007-08 | £6.8m             | £1.35m          | £8.15m  | As above.
2008-09 | £6.3m             | £1.075m         | £7.375m | Bursary cost in steady state.
2009-10 | £6.4m             | £0.968m         | £7.368m | Bursary cost in steady state.
2010-11 | £6.4m             | £0.968m         | £7.368m | Bursary cost in steady state.
2011-12 | £6.6m             | £1.415m         | £8.015m |
2012-13 | £8.8m             | £2.6m           | £11.65m | Bursary total includes £2.2m for tuition fee waivers. Plus an additional £0.25m for retention, support and employability. Includes NSP allocation of £0.4m.
2013-14 | £9.4m (£9.4m)     | £4.52m (£2.44m) | £13.92m | Bursary total includes £2.9m for tuition fee waivers. Plus an additional £0.41m for retention, support and employability. Includes NSP allocation of £0.79m.
2014-15 | £11.32m (£11.05m) | £5.23m (£2.92m) | £16.55m | Bursary includes £4.06m for tuition fee waivers. Plus an additional £0.54m for retention, support and employability. Includes NSP allocation of £0.34m.
2015-16 | £10.89m (£10.6m)  | £5.67m (£3.24m) | £16.56m | Bursary includes £3.63m for tuition fee waivers. Plus an additional £0.71m for retention, support and employability. Of the total, only £13.81m is ‘Offa-countable’.

Figures in brackets are ‘Offa-countable’.

Table 2: Summary of costs in Oxford’s access agreements, 2006-2016

These suggest that:

  • Total combined expenditure in 2006-07 was £16.3m; by 2015-16 this had increased to £23.74m (excluding Oxford’s non-Offa-countable expenditure), an increase of around 46%.
  • Whereas in 2006-07 both universities were spending exactly the same, by 2015-16 total expenditure at Cambridge had increased by some 21%, while total Offa-countable expenditure at Oxford had increased by about 70%.
  • In 2006-07, the percentage of total funding spent on bursaries was 86% at Cambridge and 83% at Oxford. By 2015-16, the comparable percentages are 70% and 77%. Hence Cambridge has reduced the proportion spent on bursaries more substantially than Oxford, but both universities continue to direct their funding predominantly towards bursaries.
  • In 2006-07, expenditure on bursaries by each university was very similar. Although the total devoted to bursaries by Cambridge increased slightly in the intervening years, by 2015-16 it was almost the same as in 2006-07. However, expenditure on bursaries at Oxford is some 56% above what it was in 2006-07.
  • Since 2006-07, both Oxford and Cambridge have more than doubled their expenditure on outreach. Taken together, the two universities expect to spend some £6.24m on outreach in 2015-16. Cambridge’s ratio of bursary to outreach spend is approaching 2:1, whereas Oxford’s is more than 3:1.
  • Although the sums they now spend on outreach (Offa-countable in Oxford’s case) are relatively similar, Cambridge spends 30% of its total expenditure on outreach while Oxford spends 23%. However, Cambridge spends significantly less than it did at its peak in 2014-15, while Oxford’s expenditure has increased steadily since 2010-11.
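The headline percentages in the bullets above can be reproduced directly from the tables. This short sketch spot-checks them, taking Oxford’s 2015-16 figures as the Offa-countable ones (bursaries £10.6m, outreach £3.24m, total £13.81m):

```python
# Spot-check the budget percentages quoted above, using the first and
# last rows of the access agreement tables (figures in £m).

camb_2006 = {"bursaries": 7.0, "outreach": 1.15, "total": 8.15}
camb_2015 = {"bursaries": 6.9, "outreach": 3.0, "total": 9.9}
oxf_2006 = {"bursaries": 6.8, "outreach": 1.35, "total": 8.15}
oxf_2015 = {"bursaries": 10.6, "outreach": 3.24, "total": 13.81}

def share(budget, key):
    """Proportion of total expenditure devoted to one budget line."""
    return budget[key] / budget["total"]

print(f"Cambridge bursary share: {share(camb_2006, 'bursaries'):.0%} -> {share(camb_2015, 'bursaries'):.0%}")
print(f"Oxford bursary share:    {share(oxf_2006, 'bursaries'):.0%} -> {share(oxf_2015, 'bursaries'):.0%}")
print(f"Cambridge total growth:  {camb_2015['total'] / camb_2006['total'] - 1:.0%}")
print(f"Oxford total growth:     {oxf_2015['total'] / oxf_2006['total'] - 1:.0%}")
```

The bursary shares come out at 86% → 70% for Cambridge and 83% → 77% for Oxford, matching the figures quoted above.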

Bursaries

Bursary arrangements have shifted subtly, especially as NSP fee waivers have arrived and then disappeared. The details below relate only to the most generous bursary rates for students with the lowest residual household incomes.

Cambridge’s access agreements suggest that:

  • For 2006-10 Cambridge’s bursary offer for students eligible for a full maintenance grant – with a residual household income of £16,000 or below – is £3,000 per year. It estimates that some 10% of its full fee-paying undergraduates – around 955 students – will qualify.
  • For 2010-12 the maximum bursary is £3,400 for all students qualifying for a full maintenance grant – now equivalent to a residual household income of £25,000 or below – and about 1,100 students (13% of Cambridge’s UK undergraduates) will qualify.
  • For 2012-13 the maximum bursary is £3,500 for those with a full maintenance grant. There is an additional fee waiver of £6,000 in the first year of study for such students who are also from ‘particularly disadvantaged backgrounds’ including those formerly in receipt of FSM. (The University points out that these are the Government’s criteria).
  • For 2013-14 the same arrangements apply.
  • For 2014-15 the same arrangements apply, except that recipients can no longer allocate part of their bursary towards an additional fee waiver.
  • For 2015-16 only the bursary of £3,500 remains in place for those with a full maintenance grant.

Oxford’s access agreements reveal that:

  • In 2006-07, students whose residual household income is below £17,500 receive a bursary of £3,000 per year, plus an additional £1,000 in the first year of the course. About 1,200 students are expected to benefit.
  • From 2007-08, these rates increase to £3,070 and £1,025 extra in the first year.
  • From 2008-09, new entrants with a residual household income below £25,000 receive a bursary of £3,150, but all those with an income below £18,000 will receive an extra £850 in the first year of their course.
  • In 2009-10, these rates increase to £3,225 and £875 respectively. This is unchanged for 2010-11.
  • In 2012-13, students with a residual household income below £16,000 a year will receive a bursary of £3,300 per year, plus a tuition fee waiver of £5,500 in the first year of the course and £3,000 in subsequent years.
  • In 2013-14, these arrangements are unchanged.
  • In 2014-15, the bursary rate remains at £3,300, but the fee waiver is reduced to £3,000 a year.
  • In 2015-16, the bursary rate increases substantially to £4,500 per year. A more select group of Moritz-Heyman scholars (with residual income below £16,000 but also ‘flagged on a number of contextual data disadvantage indicators’) also receive an annual tuition fee waiver of £3,000.

In more recent agreements, Cambridge’s maximum rate of bursary is available for all students below a residual income of £25,000, whereas at Oxford it is confined to students with a residual income of less than £16,000.

Hence Cambridge is comparatively more generous to students with a residual income above £16,000 but below £25,000.

Until 2015-16, the maximum bursary rates were broadly similar, but Oxford’s has now risen significantly, to £1,000 more than Cambridge’s. Moreover, a fee waiver remains in place for the most disadvantaged students.

Hence Oxford is now more generous to students with a residual income below £16,000. Oxford argues:

‘The University will be monitoring the level of students from households with income of less than £16,000. It is considered that these are the most financially disadvantaged in society, and it is below this threshold that some qualify for receipt of free schools meals, and consideration for the proposed pupil premium. The University does not consider that identifying simply those students who have actually been in receipt of free school meals provides a suitably robust indicator of disadvantage as they are not available in every school or college with post-16 provision, nor does every eligible student choose to receive them.’

The 2014-15 agreement states that 30% of 2012 entrants in receipt of the full bursary – and so with a household income of £16,000 or less – were educated in the independent sector. These students would of course be ineligible for FSM and pupil premium.

The 2015-16 agreement adds that roughly 10% of Home/EU full time undergraduates would qualify for such a bursary. This is supported by the University’s published admissions statistics for 2013, which give the percentage as 9.9% and the number of students as 297.

In 2013, we know that 2,510 admissions were from England, so we can estimate the number of English full bursary holders at approximately 250, of whom some 175 were educated in the maintained sector.

But DfE’s destination indicators suggest that only some 25 of these were FSM-eligible.

And other DfE research suggests that only some 14% of students entitled to FSM are not claiming (though that rises to 22% for 15 year-olds).

Taking the latter figure, one might conclude that roughly 30 of the 175 were FSM eligible or non-claimants, so what of the remaining 145 (some 83%)?

It seems likely that they were drawn below the £16,000 residual household income threshold by some combination of:

  • Allowances for additional dependent children (£1,130 per dependent child)
  • Allowances for AVCs and other pension contributions
  • Other allowable expenses.

Interestingly Oxford’s 2013 admissions data shows that the proportion of its intake with incomes between £16,000 and £25,000 was roughly half that of the group with incomes below £16,000.
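The chain of estimates above can be reproduced with simple arithmetic. The sketch below uses only the figures quoted in this post (the 9.9% bursary share, the 30% independent-sector share, the 25 FSM claimants from DfE destination indicators and the 14% non-claim rate), so it inherits all of their uncertainty; the rounding conventions are my own.

```python
# A sketch of the post's estimate; all inputs are figures quoted in the text.
english_admissions = 2510      # 2013 entrants from England
full_bursary_share = 0.099     # 9.9% of Home/EU undergraduates held a full bursary
independent_share = 0.30       # 30% of full-bursary holders were independently educated
fsm_claimants = 25             # FSM-eligible students, per DfE destination indicators
non_claim_rate = 0.14          # share of FSM-entitled students who do not claim

full_bursary_english = english_admissions * full_bursary_share   # ~248, "approximately 250"
maintained = full_bursary_english * (1 - independent_share)      # ~174, "some 175"
fsm_entitled = fsm_claimants / (1 - non_claim_rate)              # ~29, the post's "roughly 30"
unexplained = maintained - fsm_entitled                          # ~145

print(round(unexplained), round(unexplained / maintained * 100))  # → 145 83
```

Run as written, this reproduces the post’s headline figures: roughly 145 full-bursary holders, some 83% of the maintained-sector group, are left unexplained by FSM eligibility alone.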

. 

Outreach

Cambridge

For 2006-2012, Cambridge divides its outreach provision into three categories:

  • Activity to encourage applications from under-represented groups to Cambridge. This is targeted at students in the first generation in their families to attend HE; those who attend schools or colleges with low or below average GCSE and A level performance; and those attending schools or colleges with little recent history of sending students to Cambridge. Three sub-categories are identified: information events for teachers and parents, residential Easter and summer schools and a miscellany of visits to Cambridge, visits to schools, masterclasses, workshops, study days etc.
  • Collaborative activities with other HE partners to raise aspirations and encourage participation. This includes regional Aimhigher projects and gifted and talented events provided through NAGTY.
  • General aspiration-raising activities for the HE sector as a whole. These are predominantly subject-based and online activities.

For 2012-16, Cambridge continues to describe its provision under the first and third of these categories, adding that both involve collaborative work. It also identifies a wider range of target groups:

‘These include children in care; students eligible for free school meals [NB]; Black, Asian and minority ethnicity students; mature learners; students educated in further education colleges; and bright students in schools and colleges which have not historically sent students to the University of Cambridge.’

‘Or previously eligible’ is added to FSM eligibility in later iterations.

The description of provision is short, mentioning a national programme of visits and events provided by colleges through an Area Links Scheme plus centrally provided summer schools and taster events.

Five priorities are identified:

  • Increasing the number of places available on events with demonstrable impact, particularly summer schools, taster days and events for teachers.
  • Preserving the legacy of local Aimhigher work.
  • Providing a sustained programme of advice and activities for younger students in local secondary schools.
  • Developing initiatives to encourage state school students to choose appropriate subject combinations and apply to selective universities and
  • Working closely with Oxford.

A sixth priority is added in 2013-14 – ensuring PGCE intakes reflect the population from which Cambridge recruits and building networks of graduate teachers to support wider outreach activity.

In 2014-15 these priorities are unchanged, except that the second and third are conflated into one. There is also an added reference to the long-term nature of some of this work:

‘A number of our initiatives engage with younger age groups and consist of a series of sustained engagements over a number of years. For example, our work in Cambridgeshire and with looked-after children involves secondary school students of all ages, whilst our core programme for black, Asian and minority ethnicity students is delivered to each cohort over a three year period.’

.

Oxford

Oxford’s outreach activity is harder to synthesise, because the agreements vary more often and some of the more recent ones are much more detailed.

In its 2006-07 Agreement, Oxford establishes a distinction between activities designed to encourage applications to the University and more general aspiration-raising activities.

However, these are not separately identified in the list it provides, which includes:

  • Hosting Aspiration Days for students from Years 9-11 drawn from ‘Oxford’s specific “target areas”’
  • An HEFCE specialist summer school for 150 Year 11 students from under-represented groups
  • Local Aimhigher provision
  • A programme of some 500 annual outreach visits targeting schools and colleges with little history of sending students to Oxford or into HE more generally
  • A Year 12 Sutton Trust Summer School for 250 students from non-traditional backgrounds
  • A programme of regional events to encourage applications from non-traditional backgrounds
  • A programme of events for teachers from schools with little history of sending students to Oxford, supporting some 100 teachers a year
  • Support for student-led programmes including the Oxford Access Scheme (for students from inner city schools) and a Target Schools Scheme run by the Student Union
  • A Further Education Access Initiative reaching 100 colleges a year and
  • Subject-specific enrichment activities.

In the following year, the items on the list change slightly. The University is said to be undertaking a thorough audit of these activities.

By 2008-09, Oxford describes the objective of its access work as increasing representation from: state school students, students from lower socio-economic groups, students from BME groups and care leavers.

It is focused on two areas: increasing the number of high quality applications from target groups and ensuring fair admissions processes. It undertakes wider aspirations-raising work on top of this.

The list of central access initiatives annexed to the agreement is missing.

For 2009-10 and 2010-11, the agreement refers to ‘detailed operational plans’ being developed to achieve its objectives.

By 2011-12, Oxford has added a third area of focus to the two immediately above: ensuring that teachers and advisers are able to support intending applicants.

Detailed operational plans are still under development. However, the subsequent agreements introduce several key elements:

  • UNIQ residential summer schools for Year 12 students. Participants are selected on the basis of GCSE A* performance compared with their average school attainment, ACORN postcode, school’s history of sending pupils to Oxford and any care history. A personal statement is also required. There were 380 participants in 2009, rising to 500 in 2010. Capacity is projected to increase to 650 in 2011, 700 in 2012, 850 in 2013 and 1,000 in 2014.
  • By 2012-13, two other ‘flagship programmes’ are identified: a programme of seven regional one-day teacher conferences and a link programme connecting every local authority with a named college. Participants in the teacher conferences are drawn from schools and colleges with low numbers of students achieving high grades or limited success in achieving offers. Oxford’s target is a 15% success rate for applications from these teachers’ schools.
  • In 2013-14, there is the first reference to a Pathways Programme – longitudinal provision for students across Years 10-13 in schools with little history of engagement with Oxford. By 2014-15 this has expanded to accommodate 500 Year 12 students attending study days and 1,800 Year 10 students attending a taster day. In the 2015-16 agreement there is reference to 3,000 participants.
  • The 2012-13 agreement also outlines a system of access flags attached to certain student applicants, denoting educational and social disadvantage. Some 500 applicants were flagged in 2009/10, 630 in 2010/11 and 928 in 2011/12. The intention is that flagged candidates will achieve the same success rate in receiving offers as all applicants from the same sector. (The sectors specified are comprehensive, grammar, FEC, 6FC and independent). A flag for students from low participation neighbourhoods is incorporated from 2011-12 and one for students from schools and colleges with historically low progression to Oxford is introduced in 2012-13. The 2014-15 agreement notes that the proportion of flagged students achieving an offer and subsequently admitted has risen from 15.6% in 2010-11 to 17.2% in 2011-12. The gap between the success rate of flagged applicants and all UK-domiciled applicants has also fallen from 6.4% to 5.6%. In the 2015-16 agreement, the offer rate for flagged candidates is reported as being 19.1% in 2012-13 and 21.9% in 2013-14. However, there is no comparison with the sector-specific data for all applicants.

The 2012-13 agreement is the first to mention the preparation of an Oxford Common Framework for Access but this is not ready until the publication of the 2014-15 agreement.

In that agreement, Oxford describes a four-fold approach it has developed for targeting different types of schools:

  • The large proportion of schools producing few students with the attainment necessary to apply to Oxford – highly tailored individual activities such as UNIQ, school-cluster visits and the student union’s student shadowing scheme.
  • Schools with little history of sending students to Oxford or students who have been relatively unsuccessful – application and interview preparation workshops and awareness-raising events.
  • Schools where there are many high-attaining students but little history of sending students to Oxford – increase understanding of the application process and break down myths.
  • Schools with significant numbers of successful applicants – maintain a working relationship.

Targets

.

Cambridge

Cambridge begins by adopting selected HESA benchmarks, even though these have:

‘severe limitations in a Cambridge context, in that they take insufficient account of the University’s entry requirements, both in terms of subject combinations and of levels of qualification. We hope in due course to develop our own internally derived milestones or, alternatively, consider the applicability of any milestones which OFFA might develop.’

Three targets are adopted:

  • Increasing the proportion of UK undergraduates from state schools or colleges to between 60% and 63%, compared with a HESA benchmark for 2001-02 of 65%.
  • Increasing the proportion of students admitted whose parental occupation falls within NS-SEC 4-7 to 13-14%, compared with a HESA benchmark for 2001-02 of 13%.
  • Increasing the proportion of students from low participation neighbourhoods to approximately 8-9% compared with the HESA 2001-02 benchmark of 7%.

For 2010-12, the third of these targets is lowered to 5-6% because HESA has changed the basis of its calculation, reducing Cambridge’s benchmark by 33%.

By 2012-13, the first of these targets is described as the University’s ‘principal objective’, so it is deemed more important than improving fair access for disadvantaged students. This statement is subsequently removed, however.

The third objective is again recalibrated downwards, this time to 4%, because:

‘Currently HESA performance indicators and other national datasets relating to socio-economic background do not take adequate account of the entry requirements of individual institutions. Whilst they take some account of attainment, they do not do so in sufficient detail for highly selective institutions such as Cambridge where the average candidate admitted has 2.5 A* grades with specific subject entry requirements. For the present we have adjusted our HESA low participation neighbourhood benchmark in line with the results of our research in relation to state school entry and will use this as our five-year target….We will seek data through HESA or otherwise to amend or update our target in relation to socio-economic background in a revised access agreement next year.’

A paper is available explaining the recalibration (applying a scaling factor of 0.88).

Two new targets are also introduced: a retention benchmark and a process target relating to the minimum number of summer school places. There will be a minimum of 600 places a year for the next five years.

The substantive details are unchanged in all subsequent agreements.

Oxford

In its 2006-07 access agreement, Oxford discusses setting a performance indicator for recruitment from the maintained sector, adding that from 2006 it will begin to collect data on recruitment from lower socio-economic groups.

In 2007-08 it notes that recruitment from SEG 4-7 ‘increased by 7% and drew the University closer to its benchmark’.

In 2008-09, Oxford is continuing to monitor participation by SEG 4-7 and planning to introduce an internally developed benchmark, adjusted to reflect the high attainment required for entry to Oxford. By 2009-10/2010-11, work is still ongoing to develop such a benchmark.

In 2011-12 it seems still not to be ready, but in 2012-13 Oxford introduces its current indicators:

  • Increase the percentage of UK undergraduates at Oxford from schools and colleges which historically have had limited progression to Oxford.
  • Increase the percentage of UK undergraduate students at Oxford from disadvantaged socio-economic backgrounds. ACORN is adopted because:

‘The University has found the ACORN information to be the most accurate source of verifiable information to highlight socioeconomic factors that may signify disadvantage, and has used it as a contextual flag in the undergraduate admissions process since 2008-9, and also as a factor when selecting participants for the UNIQ summer schools programmes.’

  • Increase the percentage of UK undergraduate students at Oxford from neighbourhoods with low participation in higher education. This utilises POLAR quintiles 1 and 2 ‘in line with HEFCE and OFFA recommendations’.
  • Meet the HEFCE benchmark on disabled students at Oxford.

It supplements these with three ‘activity targets and outcomes’:

  • 60% of those participating in the UNIQ summer schools make an application to Oxford, and 30% of those applying receive an offer of a place.
  • Improve the participation, application and success levels of schools and colleges whose teachers have attended the Regional Teacher Conferences, where those schools and colleges have either limited numbers of qualified candidates or historically limited success in securing offers.
  • Using contextual information in the admissions process to identify candidates who may be suitable to be interviewed on the basis of either time in care, or socio-economic and educational disadvantage. The expectation is that identified candidates would then achieve the same success rate in receiving offers as all applicants to Oxford from equivalent school or college sectors.

These are unchanged in subsequent agreements though, as we have seen, there is no reporting of flagged applicants’ success compared with all students in their respective sectors, only compared with all applicants.

Commentary

There are, within the series of access agreements, valuable insights into the thinking within Oxford and Cambridge about such issues. Here is an annotated selection, presented in broadly chronological order:

  • Improvement will take time: ‘Cambridge will continue to strive to encourage applications from qualified applicants from groups currently under-represented and to admit a greater proportion of them within the context of our admissions policies and without compromising entry standards. Experience has, however, demonstrated that outreach activity takes time to alter the composition of the student population.’ (Cambridge, 2006-10)
  • Partnership and collaboration is necessary: ‘In setting itself these objectives, the University recognises that the problems relating to access to higher education are complex and deep-seated, and beyond the capability of the University to solve by itself. They require the input of all parts of the organisation to address, and indeed the input of agencies external to the University. Oxford is committed to playing its part in addressing these issues…’ (Oxford, 2008-09)
  • Increases in intake are unlikely: ‘Because, in part, of the full-time, residential nature of Cambridge’s undergraduate courses, it is unlikely that the university’s undergraduate intake will significantly increase over the next five years.’ (In 2012-13, this is qualified by the addition of the phrase ‘…beyond the colleges’ capacity to admit them’, but this is dropped again the following year.) (Cambridge 2006-10 and 2012-13)
  • Access is focused on application rather than admission: ‘The selection process aims to identify the most able, by subject, from among a very highly qualified field of candidates. While the purpose of our access work is to ensure that all students who are likely to be able to meet the required standards have the opportunity to apply, our admissions procedures aim to select those candidates who best meet our published selection criteria.’ (Oxford, 2012-13)
  • The balance of expenditure in favour of bursaries is justified: ‘Whilst mindful of OFFA guidance on this subject, we do not believe that there is a sufficient body of evidence that greater benefit would be derived from different proportions of expenditure. As suggested above…we believe that our financial support has a significant bearing on retention. We have also taken full account of student feedback in the formulation of the present scheme. Students have confirmed during the current year that they do not want to see a reduction in bursary levels. It should be noted that the level of expenditure on outreach activity outlined in this agreement is supplemented from very substantial funding through other sources, and so we believe our commitment in this area to be considerable and appropriate.’ (Cambridge 2013-14)
  • This balance of expenditure in favour of bursaries is open to challenge: ‘Our package of financial support to undergraduate students, through both tuition charge waivers and maintenance bursaries, is expected to contribute in broad terms to meeting the targets and outcomes. As yet, however, the evidence for a demonstrable connection between financial support for students and improvements in access to higher education amongst under-represented groups is unclear. We will continue to review our position on the basis of further evidence and analysis.’ (Oxford, 2012-13)
  • Explanations of limited progress: ‘Progress against these targets in 2012 has proved extremely challenging, particularly against the backdrop of the new funding regime combined with a demographic decline in the number of school leavers. In relation to the three targets dealing with educational, social and economic disadvantage, Oxford has seen both a decline in applicants and a decline in the number of students that have been admitted…Oxford will continue to focus its outreach efforts and resources on recruiting and encouraging a wider range of student to apply successfully to the University.’ (Oxford 2014-15)
  • Student funding reforms have depressed performance: ‘The 2011, 2012 and 2013 entry cycles proved atypical, given the extensive changes to student funding, and this was reflected in the limited success against the targets…The provisional figures for 2014 entry, however, indicate that we have made headway across the board, particularly in regard to candidates who are from postcodes with high levels of socio-economic disadvantage using the Acorn (A Classification Of Residential Neighbourhoods) postcode classification. …Sustained long term outreach activity takes time to show in the admissions process, and the need to allow a five year period to assess progress has been reiterated by Oxford on a regular basis.’ (Oxford 2015-16)
  • Potentially negative impact of A level reform: ‘We are concerned that current proposals for A-level reform would significantly reduce student choice and flexibility; in particular, the lack of formal end of Year 12 examinations will adversely affect student confidence and the quality of the advice they receive about higher education options, and also prevent institutions such as Cambridge from accurately assessing current academic performance and trajectory. If effected these proposed reforms could have a significant bearing on our ability to make progress on access measures.’ (In 2015-16 there is also concern ‘…that proposed funding arrangements would effectively restrict students in many state schools to three A-levels, meaning that the opportunity to study extremely valuable fourth subjects such as Further Mathematics would be lost.’) (Cambridge, 2014-15 and 2015-16)
  • There is an evidence base for effective practice: ‘There is also increasing evidence that sustained work with students over a longer period of time is more effective than one-off interventions, particularly if this work is tailored to the requirements of each age group.’ and ‘Research into access activities has identified that, provided they have a sufficient depth of content, summer schools are a particularly valuable experience for students who have higher academic achievements and aspirations than others in their peer group.’ (Oxford 2014-15 and 2015-16)
  • Universities’ role in raising attainment: ‘There is a larger question about the role of universities in raising attainment rates within schools. Universities can, and Oxford does, work in partnership with schools, local authorities, and third parties to form collaborative networks that can work together to raise the attainment rates of students from the most deprived backgrounds.’ (Oxford 2014-15)

Some of these issues will be picked up again in the final section of this post.

.

Oxbridge’s Signature Access Programmes

This section reviews key programmes within each university’s access portfolio, reflecting both their long-standing commitment to residential provision and a more recent focus on sustained partnership programmes targeting secondary students from disadvantaged backgrounds.

Before engaging with these specific programmes, it is important to give a sense of the full range of activity presently under way. In Oxford’s case, the most recent 2015-16 access agreement provides the basis for this. In Cambridge’s case, I have drawn on online material and an online brochure.

Cambridge’s Access Portfolio 

Cambridge’s Outreach and Access webpages provide details of:

  • Insight, supporting students attracting the Pupil Premium from Year 9 through to Year 13 (see below)
  • Experience Cambridge, a 3-week subject-specific academic project, undertaken predominantly through the University’s VLE.
  • HE+, a pilot programme involving regional consortia of state schools and colleges working with their link Cambridge College to enable their academically able students to make competitive applications to selective universities including Cambridge.
  • HE Partnership, an aspiration-raising initiative targeting Year 9-11 students in Cambridgeshire and Peterborough schools with lower than average progression rates – and particularly students attending them with no family background of attending higher education.

A separate Raising Aspirations booklet mentions, in addition:

  • The Subject Matters, events for Year 11 students to support their A level subject choice
  • Year 12 subject masterclasses
  • A Black, Asian and minority ethnicity (BAME) outreach programme
  • Further education and mature student outreach
  • Various examples of outreach by University Departments
  • Activity under the College Area Links Scheme
  • The CUSU Shadowing Scheme
  • Open Days
  • Oxford and Cambridge student conferences
  • Participation in higher education conventions

Oxford’s Access Portfolio

Oxford’s 2015-16 access agreement describes:

  • Briefings for Teach First and PGCE students which typically attract 150 students annually.
  • An annual programme of school and college visits, which involved over 3,300 UK schools and colleges in 2012-13. These are undertaken through Link Colleges (see below).
  • Target Schools, an OUSU programme involving undergraduate visits and a student Shadowing Scheme.
  • A variety of Departmental and subject-specific outreach activities

Cambridge: Sutton Trust Summer Schools 

Sutton Trust summer schools are subject-specific residential courses for Year 12 students. They are currently provided at ten institutions including Cambridge. There are about 2,000 places nationally and Cambridge accounts for 550 of them.

Cambridge offers 26 five-day courses in July and August, hosted by six of its colleges. They are free to attend. The providers meet all costs including travel to and from the venue, food and accommodation.

Successful applicants must meet most or all of the following eligibility criteria:

  • Be in the first generation of their family to attend university (in fact this means neither parent has a first degree or equivalent)
  • Be eligible for FSM [not pupil premium] during secondary education
  • Have achieved at least 5A*/A grades at GCSE or equivalent and be taking subjects relevant to the summer schools they wish to attend
  • Attend schools/colleges with a low overall A level point score (typically below the national average) and/or low progression to HE
  • Live in neighbourhoods with low progression rates to HE and/or high rates of socio-economic deprivation.

Participants must attend a UK state-funded school or college; those educated in the independent sector are ineligible, even if they have subsequently moved into a state sixth form. Priority is given to children who are, or were formerly, looked after or in care.

Cambridge’s website says:

‘We look at a combination of the contextual priority criteria met and GCSE grades (or equivalent) in subjects relevant to the course for which you have applied. In 2014, the majority of our 550 summer school participants met two or more of these criteria.’

In answer to the question ‘Does attending a summer school increase my chances of getting a place at Cambridge?’, the University says:

‘Applications to the University are completely separate from the Summer Schools and use different criteria to those of the Summer School.  Admissions Tutors will not know whether an applicant has attended a Summer School, unless you choose to mention it in your personal statement…Equally, being unsuccessful in a summer school application does not correlate to the likelihood of being accepted to Cambridge as an undergraduate: we use very different criteria and it is in no way a statement about your academic record or potential.’

. 

Oxford: UNIQ summer schools

The UNIQ summer schools website describes a very similar animal. It is also targeted at Year 12 students in state schools and colleges. The courses are also one-week, subject-specific residential experiences undertaken during July and August. All costs are covered.

According to the access agreements, Oxford planned to increase the number of places available to 1,000 in 2014, and achievement of this outcome is confirmed in the published statistics, which add that there were 4,327 applications and that 507 ‘near miss applicants’ were invited to undertake other outreach activities.

Interestingly though, the number of places available in 2015 fell back substantially, to 850. The number of courses was 35, unchanged from 2014, suggesting a drop in the average number of students per course from 29 to 24.

Courses are categorised according to whether they are in Humanities, Medical Sciences, Mathematical, Physical and Life Sciences or Social Sciences. Sixteen of the 35 are in Humanities subjects.

The eligibility criteria are also similar to those for Sutton Trust summer schools, but those relating to disadvantage are not described with any degree of specificity. They include:

  • The number of A* GCSE grades achieved compared with the average for the applicant’s school when they took GCSEs. (Applicants are only permitted to have completed one A level.)
  • Academic attainment and history of progression to Oxford at the school or college where the applicant is taking A levels
  • ACORN postcode data
  • POLAR 3 data and
  • The quality of a personal statement

Applications from looked after children are considered ‘on an individual basis’.

A referee, normally a teacher, needs to confirm the details of their application.

Students who complete a UNIQ summer school fulfil the requirements for the ASDAN Universities Award.

The website adds that from 2015, Oxford is ‘running a virtual learning programme for selected applicants’.

The answer given to the question ‘Will attending a UNIQ summer school make it more likely that I will get a place at Oxford University?’ says:

‘Students who attend UNIQ and decide to apply to Oxford University do not receive any preferential treatment at the application stage.

Admissions tutors who make decisions about undergraduate offers select entirely on academic merit. Unless students mention on their UCAS Personal Statement that they have attended the UNIQ Summer School, admissions tutors will not know, as we do not provide them with separate information.’

Cambridge: Insight 

Insight is described in the guide for teachers as:

‘an [sic] multidisciplinary programme which aims develop [sic] and broaden students’ academic interests and tackle the barriers many students face when applying to university. We hope to achieve this through inspiring subject days, discussions with current university students and academics and sessions about university.’

Eligible students are in Year 9, attract the Pupil Premium, can travel to and from Cambridge in a day and are ‘on track to achieve Level 7 English, maths and science but [sic] the end of Key Stage 3’.

The programme is predominantly focused on six London boroughs, but applications are also invited from non-selective state schools elsewhere with ‘above average eligibility for free school meals’.

There is a series of Saturday and holiday events, including:

  • Core sessions, including an introductory event in the spring term of Year 9 and ‘Subject Matters’ – events to support A level choices – in the autumn term of Year 11.
  • Additional subject days provided throughout Years 10 and 11
  • A one-night residential at the end of Year 10 and a four-night residential at the end of Year 11 for ‘those who have shown enthusiasm and commitment to the programme’.
  • A regular email newsletter during Years 12 and 13 providing information about open days, masterclasses, residentials and competitions.

The programme is free of charge.

I could find no evaluation of the impact of this programme, which is not mentioned in Cambridge’s ‘Raising Aspirations’ brochure, even though it seems to be their only substantial long-term programme targeting disadvantaged students outside the local area.

.

Oxford: Pathways Programme 

The website describes Pathways as an initiative co-ordinated by Oxford’s colleges with support from the Sutton Trust.

‘The programme aims to provide information, advice and guidance on higher education and Oxford to academically able students, and staff members, in non-selective state schools with little history of student progression to Oxford.’

The components are:

  • Year 10 taster days which provide sessions on higher education and student finance. Applications are made by schools, which need to be in the state sector, ‘usually without sixth forms’ and with little or no history of sending students to Oxford.
  • Year 11 investigating options events, focused on the significance of GCSE results and post-16 choices. These are aimed at students who have undertaken a taster event who attend schools fitting the description above. Schools are encouraged to bring up to ten students. There are also two subject-focused days, one devoted to Medicine, the other to Humanities.
  • Year 12 study days providing a taste of subject-specific university-level study. This involves two taster sessions undertaken in small groups, two talks from admissions tutors and a college tour. There are twenty-one subjects offered. Participants are from non-selective state schools and colleges. They are normally expected to have at least 5 GCSE A* grades (7 for medicine) and be predicted to achieve at least 3 A grades at A level, or equivalent.
  • A Year 13 application information day, providing advice on personal statements, tests and interviews. These cover seven broad subject areas. Participants are again drawn from non-selective state schools and colleges.

Although not confined to students from disadvantaged backgrounds, teachers are advised that:

‘When selecting participants for the Year 12 and 13 events, we also take into account socio-economic data, such as parental HE participation and eligibility for benefits or free schools meals.’

The Sutton Trust explains that Pathways involved almost 3,000 students and 400 teachers in its first year. The Trust is funding the further development of the Year 12 and 13 components.

I could find no separate evaluation of the effectiveness of Pathways.

Strengths and weaknesses of Oxbridge provision

.

Summer schools 

Both Oxford and Cambridge place extensive reliance on the effectiveness of summer schools as an instrument for improving access, with summer school provision forming the centrepiece of their respective strategies.

The evidence base in support of this strategy appears relatively slim. Both appear to be relying principally on evaluation of the Sutton Trust’s programme.

The Sutton Trust appears to publish an annual Targeting and Progression Report, but the 2014 edition has all the institution-specific data stripped out, which is not entirely helpful.

However, it does reveal that, amongst applicants for summer schools in all ten locations, only:

  • 59.5% were from the first generation of their family with experience of HE.
  • 54.8% came from schools and colleges with below average A level point scores and/or low progression rates to HE.
  • 29.9% were in Polar 2 quintiles 1 or 2.

There is no reference to the FSM eligibility criterion, so presumably that was not in place last year.

There is limited information about the status of those accepted onto courses. Between them, the document and a parallel PowerPoint presentation tell us that:

  • The majority of attendees met two or three of the eligibility criteria
  • 77% met three of the criteria, but we don’t know which three
  • 85% met the ‘first generation’ criterion
  • 74% ‘came from schools with low attainment’
  • 49% ‘lived in areas with the lowest level of progression to university’ (presumably Polar quintiles 1 and 2).

Given the focus of this post, the last outcome is particularly disappointing, since it means that over half were not disadvantaged on the Trust’s only measure. Perhaps the additional FSM criterion has been introduced in an effort to secure a larger majority of applicants from disadvantaged backgrounds.

The presentation also reveals that the Trust specifically targeted 900 ‘hard to reach schools’ which eventually supplied 257 attendees, 88% of them meeting three or more of the eligibility criteria.

The implication must be that, if such an exercise had not taken place, the proportion of attendees from disadvantaged backgrounds would have been significantly lower.

The Report also reveals that, of the 2012 Sutton Trust summer school cohort, 58% of university applicants took up a place at a Russell Group university. A total of 125 students (10% of the cohort) accepted a place at the institution that hosted their summer school.

Oxford publishes information about its summer schools in its access agreements.

The target is for 60% of participants to apply and for 30% of applicants to receive an offer. The University also aims for summer school participants to have the same success rate in securing an offer as the average for all applicants from the state sector.

Each agreement provides detail about the number of participants who apply to Oxford, the number receiving offers and the proportion of those from ACORN groups 4 and 5.

These are summarised in Graph 4, below, which illustrates that the impact on recruitment of students from ACORN 4 and 5 postcodes is not fully commensurate with the increase in the number of participants.


Graph 4: Impact of UNIQ summer schools, 2010-2013

.

Oxford also provides details of the proportion of summer school participants from Polar quintiles 1 and 2 receiving an admission offer, for 2011 (19.5%), 2012 (15%) and 2013 (20.3%). In 2013, the comparable ‘success rate’ for all applicants to the University was 20.1%.

The evaluation evidence cited by Oxbridge is captured in a Sutton Trust Summer School Impact Report, dating from 2011. This is based on analysis of the 2008 and 2009 summer school intakes, when courses were located at Bristol, Nottingham and St Andrews, as well as at Oxford and Cambridge.

It concludes that:

  • Summer schools successfully select students who fit the eligibility criteria (though that is not entirely borne out by the more recent outcomes above).
  • Amongst the disadvantaged cohort, less disadvantaged students are more likely to take up places than their more disadvantaged peers.
  • However, attending a summer school closes the gap between the success rates – in terms of obtaining admission offers – of more and less disadvantaged students. Exactly why this happens is unclear.
  • There are significant differences between universities. Cambridge exhibits ‘relatively poor conversion of attendees into applications (not least when compared to the equivalent performance of Oxford)’.

The overall conclusion is that summer schools do have a positive impact, compared with control groups, but the study does not offer recommendations for how they might work better, or consider value for money.

The closing section notes that:

‘They achieve this by raising two of the three ‘As’ of the WP canon – student awareness and student aspirations. It may not directly enhance the third – student attainment – though summer schools can support students’ study skills – but the growing adoption of a ‘contextual data’ approach to the treatment of university admissions should be to the further benefit of the sorts of students who pass through summer schools.’

Overall then, summer schools have a positive impact, but if we are judging their efficiency as a mechanism for improving the intake of students from disadvantaged backgrounds, it is clear that there is extensive deadweight. They might be better targeted on the most disadvantaged students.

If this is true of summer schools it is almost certainly true of other elements of Oxbridge’s access programmes.

.

Other more general issues 

  • A smorgasbord of provision: It is evident that both Oxford and Cambridge are engaged in multiple overlapping initiatives designed to improve access, both to their own institutions and to selective HE more generally. At Offa’s behest, they are targeting several sub-populations. The 2016-17 guidance on completing access agreements invites them to consider a variety of under-represented groups: minority ethnic students, disabled students, care leavers and students in care, part-time students, mature students, medical students, PGCE students. There seems to be a tendency to invent a series of small targeted initiatives for each subgroup, rather than focusing principally on two or three substantial programmes that would make a real difference to core target groups. 
  • Too many priorities too vaguely expressed: Both universities identify core priorities through the targets they have selected. In Oxford’s case those involve increasing representation from: schools and colleges with limited progression to Oxford; postcodes associated with significant socio-economic disadvantage; postcodes associated with low HE participation; and disabled students. However, the first three overlap to some extent and recent access agreements do not indicate the relative priority attached to each. In Cambridge’s case, only two targets relate to admissions, one focused on increasing representation from state schools, the other from low participation postcodes. In older agreements, the former has clear priority over the latter, but it is unclear whether this remains the case. Offa’s framework requires simplification so that both universities have no option but to prioritise admissions from disadvantaged learners educated in state-funded institutions. It should be much clearer exactly which activities are dedicated to this end and what funding is allocated for this purpose.
  • A plethora of measures: The Offa system permits Oxbridge and other universities too much leeway in defining the populations whose access they seek to promote and in determining how they measure success. This makes it harder to compare universities’ records and more complex to harmonise with the measures most often applied in schools and colleges. If universities refuse to foreground eligibility for the pupil premium and for FSM, they should at the very least publish annual data about the proportion of their intake falling within these categories, and without the present two year time lag.
  • Limited transparency: There is too much variability in the degree of transparency permitted by the Offa framework. Oxford provides much more data in its access agreement than does Cambridge, but the range of data published in support of fair access is limited across the board. Within the bounds of data protection legislation, it should be possible for each university to state each year, without a two-year time lag, what proportion of its intake falls within certain specified categories, how those proportions vary between subjects and the range of attainment demonstrated in each case. The publication of such material would go a long way towards removing any sense that Oxbridge is overly defensive about these issues.
  • Limited investment in long term collaborative programmes: Summer schools are valuable but they do not impact early enough, nor do they raise attainment. The Insight and Pathways programmes demonstrate growing recognition of the potential value of establishing long-term relationships with prospective students that begin as early as primary school and certainly before the end of KS3. Such programmes require schools, colleges and universities to preserve continuity for each eligible student through to the point of university entry. Existing programmes are insufficiently intensive and reach too few students. Scalability is an obvious issue. 
  • Negligible involvement in attainment-raising work: Both Oxford and Cambridge state frequently that the principal obstacle to recruiting more disadvantaged students is the scarcity of sufficiently high attainment within the target group. Yet rarely, if ever, do they invest in long-term activities designed to raise these students’ attainment, seeming to believe that this is entirely a matter for schools and colleges. The precedent offered by university involvement in academy sponsorship and A level reform would suggest that there is no fundamental obstacle to much closer engagement in such activities.

.

Tackling the core problem

The proposed solution is a framework that supports a coherent long-term programme for all high-attaining disadvantaged students attending state-funded institutions in England, stretching from Year 7 to Year 13. These might be defined as all those eligible for pupil premium. An additional high attainment criterion, based on achievement in end of KS2 tests, could be introduced if necessary.

Such a programme could be extended to the other home countries and additional populations subject to the availability of funding.

The framework would position the school/college as the co-ordinator, facilitator and quality assurer of each eligible student’s learning experience (with handover as appropriate as and when a learner transfers school or into a post-16 setting).

It would stretch across the full range of competence required for admission to selective HE, including high attainment, personal and learning skills, strong yet realistic aspirations, cultural capital, access to tailored IAG etc.

On the demand side, the framework would be used to identify each student’s strengths and areas for development, and monitor progress against challenging but realistic personal targets.

From Years 7-9 the programme would be light-touch and open access for all eligible disadvantaged students. Emphasis would be placed on awareness-raising and the initial cultivation of relevant skills.

Entry to the programme from Year 10 would be conditional on the achievement of an appropriate high attainment threshold at the end of KS3. From this point, provision would be tailored to the individual and more intensive.

Continuation in subsequent years would be dependent on the student achieving appropriate high attainment thresholds and challenging interim targets.

Schools’ and colleges’ performance would be monitored through destinations data and Ofsted inspection.

On the supply side the framework would be used to identify, organise and catalogue all opportunities to develop the full range of competence required for admission to selective HE, whether provided by the student’s own school or college, other education providers in the school, college and HE sectors or reputable private and third sector providers.

Opportunities offered by external providers, whether at national or regional level, would be catalogued and mapped against the framework in a searchable national database. Schools and colleges would be responsible for mapping their own provision and other local provision against the framework.

Each student would have a personal budget supplied from a central fund. Personal budgets would be administered by the school/college and used to purchase suitable learning opportunities with a cost attached. The fund would be fed by an annual £50m topslice from the pupil premium. This would cover the full cost of personal budgets.

The annual budget of £50m per year might be divided between:

  • Light-touch open access activities in Years 7-9 – £10m
  • Intensive programme in Years 10-13 – £10m per year group.

The latter would be sufficient to support 5,000 eligible students to the tune of £2,000 per student per year, or 4,000 to the tune of £2,500.
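The budget arithmetic above can be sanity-checked with a quick sketch. The figures are those stated in the text; the per-student amounts of £2,000 and £2,500 are the illustrative options already given, not fixed proposals:

```python
# Proposed annual topslice from the pupil premium, as set out above.
total_budget = 50_000_000          # £50m per year
open_access_y7_9 = 10_000_000      # light-touch, open access, Years 7-9
per_year_group = 10_000_000        # intensive provision, each of Years 10-13

# The split accounts for the full topslice: £10m + (4 x £10m) = £50m.
assert open_access_y7_9 + 4 * per_year_group == total_budget

# Each £10m year-group allocation supports either:
students_at_2000 = per_year_group // 2_000   # 5,000 students at £2,000 each
students_at_2500 = per_year_group // 2_500   # 4,000 students at £2,500 each
print(students_at_2000, students_at_2500)
```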

By comparison, DfE’s destination indicators suggest that, in 2012/13, ‘top third’ universities admitted 2,650 FSM-eligible students; some 1,520 of these were admitted to Russell Group universities and, of those, just 50 were admitted to Oxbridge.

Selective universities would make a small contribution, the sum adjusted to reflect their comparative performance against fair access targets. These contributions would be used to meet the administrative costs associated with the programme. Total annual running costs have not been estimated but are unlikely to be more than £2.5m per year.

Universities might choose to invest additional funding, covered by their annual Offa access agreements, in developing free-to-access products and services that sit within the supply side of the framework. Attainment-raising activities might be a particular priority, especially for Oxbridge.

Philanthropic contributions might also be channelled towards filling gaps in the supply of products and services where, for whatever reason, the market failed to respond.

Selective universities would have access to information about the progress and performance of participating students. Students would apply for higher education via UCAS as normal, but strong performers would expect to receive unconditional offers from their preferred universities, on the strength of their achievement within the programme to date.

Participation in the programme would be a condition of funding for all selective universities. All processes and outcomes would be transparent, unless data protection legislation prevented this. The programme would be independently evaluated.

Optionally, universities might be further incentivised to make unconditional offers and provide the necessary support during undergraduate study. The Government might pay the receiving university a fee supplement, 50% above the going rate, for every student on the programme admitted unconditionally (so up to £22.5m per cohort per year assuming a supplement of £4,500 and 100% recruitment). This supplement would not be provided for conditional offers.

The Government would also claw back the full fee plus the supplement for every student on the programme – whether admitted conditionally or unconditionally – who failed to graduate with a good degree (so £40,500 per student assuming a 3-year degree and a £9,000 fee).
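The supplement and clawback figures above follow directly from the stated assumptions (a £9,000 annual fee, a 50% supplement, a three-year degree and a cohort of 5,000 with 100% unconditional recruitment):

```python
fee = 9_000                # standard annual tuition fee (£)
supplement = fee // 2      # 50% above the going rate = £4,500
cohort = 5_000             # students per year group on the programme

# Maximum annual cost of the supplement, assuming every student in the
# cohort is admitted unconditionally.
max_supplement_cost = cohort * supplement
print(max_supplement_cost)   # 22500000 -> £22.5m per cohort per year

# Clawback per student failing to graduate with a good degree:
# the full fee plus supplement across a three-year degree.
clawback = 3 * (fee + supplement)
print(clawback)              # 40500 -> £40,500 per student
```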

GP

March 2015

Why McInerney is just plain wrong

.

I should be completing my next evidence-based post but, 24 hours on from reading this evidence-light Guardian article by Laura McInerney, I am still incandescent.

.

.

I find I cannot return to normal business until I have shredded these flimsy arguments.  So this post is by way of catharsis.

McInerney’s core premiss is that political parties of all colours focus disproportionately on ‘the smartest children’ while ‘ignoring lower ability learners’.

This poisonous ideology seems particularly prevalent amongst Teach First types. I imagine they are regurgitating lessons they learned on its courses.

I have seen it promulgated by rising stars in the profession. That exchange prompted this previous post which attempted a balanced, rational analysis of our respective positions.

Ideologues cannot be persuaded by evidence, so there is no hope for McInerney and her ilk, but I hope that more open-minded readers will be swayed a little by the reasoning below.

.

What does she mean by ability?

McInerney distinguishes learners who are ‘smart’ or ‘bright’ from those who are ‘lower ability’. This betrays a curious adherence to old-fashioned notions of fixed ability, dividing children into sheep and goats.

There is no recognition of ability as a continuum, or of the capacity of learners to improve through effort, if given the right support.

The principles of personalised learning are thrown out of the window.

Education is not a matter of enabling every learner to ‘become the best that they can be’. Instead it is a zero sum game, trading off the benefits given to one fixed group – the smart kids – against those allegedly denied to another – the lower ability learners.

There is also an elementary confusion between ability and attainment.

It seems that McInerney is concerned with the latter (‘get good marks’; ‘received a high grade’) yet her terminology (‘lower-ability pupils’; ‘the smartest children’; ‘gifted and talented’) is heavily redolent of the former.

.

What does she mean by focusing on the top rather than the tail?

According to McInerney’s notions, these ‘lower ability’ kids face a sad destiny. They are ‘more likely to truant, be excluded or become unemployed’, more likely to ‘slip into unskilled jobs’ and, by implication, form part of the prison population (‘75% of prisoners are illiterate’).

If we accept that low attainers are preponderant in these categories, then it is logical to conclude that programmes focused on tackling such problems are predominantly benefiting low attainers.

So governments’ investment in action to improve behaviour and discipline, tackle truancy and offer Alternative Provision must be distributed accordingly when we are calculating the inputs on either side of this equation.

Since the bulk of those with special educational needs are also low attainers, the same logic must be applied to SEN funding.

And of course most of the £2.5bn pupil premium budget is headed in the same direction.

Set against the size of some of these budgets, Labour’s commitment to invest a paltry £15 million in supporting high attainers pales into insignificance.

There are precious few programmes that disproportionately support high attainers. One might cite BIS support for fair access and possibly DfE support for the Music and Dance Scheme. Most are ‘penny packages’ by comparison.

When the national gifted and talented programme was at its peak it also cost no more than £15m a year.

Viewed in this way, it is abundantly clear that low attainers continue to attract the lion’s share of educational funding and political attention. The distasteful medical analogy with which McInerney opens her piece is just plain wrong.

The simple reason is that substantial investment in high attainers is politically unacceptable.

Even though one could make a convincing case that the economic benefits of investing in the ‘smart fraction’ are broadly commensurate with those derived from shortening the ‘long tail’.

Of course we need to do both simultaneously. This is not a zero sum game.

.

Deficit model thinking

McInerney is engaged in deficit model thinking.

There is no substance to her suggestion that the government’s social mobility strategy is disproportionately focused on ‘making high court judges’. Take a look at the Social Mobility Indicators if you don’t believe me.

McInerney is dangerously close to suggesting that, because low attainers are predominantly disadvantaged, all disadvantaged learners are low attainers, and hence that Labour’s commitment is merely a sop for the middle classes. Nothing could be further from the truth.

But high-attaining learners from disadvantaged backgrounds will not succeed without the requisite support. They have an equal right to such support: they are not ‘the healthiest’, pushing in front of ‘the sickest’ low attainers. Equally, they should not be expected to go to the back of the queue.

There are powerful economic and equity arguments for ensuring that more learners from disadvantaged backgrounds progress to competitive universities and professional careers.

As and when more succeed, they serve as role models for younger learners, persuading them that they too can follow suit.

McInerney has made that journey personally so I find it hard to understand why she has fallen prey to anti-elitism.

Her criticism of Labour is sadly misplaced. She should be asking instead why other parties are not matching their commitment.

According to her there was a golden age under Blunkett ‘who really believed in helping all children, not mostly the smartest.’

Guess who was Secretary of State when Labour first offered support to gifted and talented learners?

He fully appreciated that the tail should not wag the dog.

[Postscript: Here is the Twitter debate that followed this post. Scroll down to the bottom and work upwards to read the discussion in broadly chronological order.]

.

 

GP

March 2015

The Policy Exchange National Scholarships Programme

.

This post is a short critical analysis of the proposal for a new National Scholarships Programme contained in the Policy Exchange Education Manifesto, published in March 2015.

.

Background

Policy Exchange describes itself as ‘the UK’s leading think tank’.

It is Right-leaning, having been established in 2002 by a group including Boles (the founding Director), Gove and Maude, all currently Conservative Ministers in the Coalition Government.

On Friday 6 March, Policy Exchange published an Education Manifesto, authored by its Education Team: Jonathan Simons, Natasha Porter and Annaliese Briggs.

The Manifesto’s Introduction says:

‘This is not a manifesto in its traditional sense. What is published here is a collection of short ideas around particular areas which are more localised than those in our main reports. It is our hope and our belief that any or all of them could be taken up by any main political party in May 2015, and they complement the broader policy recommendations we have put forward in our published reports.’

There are seven ‘ideas’, the last of which is for National Scholarships, summarised as follows:

‘Government should design a prestigious scholarship scheme to financially support the most talented undergraduates in the country – covering approximately 200 individuals a year – if they attend a UK university and remain in the UK for at least three years after graduation.’

Despite the authors named above, this has unmistakeably Odyssean fingerprints!

.

Rationale

The purpose of the Programme seems to be to ensure that the economic benefits vested in the most outstanding undergraduates are not lost to the UK through ‘brain drain’:

‘The intention would be to marry the most able students within the UK with some of the world class provision on offer at UK universities (though the scholar would have their free choice of which institution to attend). The financial package would act less as a facilitator to go to university in general but as a nudge to incentivise scholars to remain in the UK throughout university and beyond, as opposed to going abroad, which is becoming an increasingly competitive battleground. [sic]’

The paper emphasises the economic benefits of investing in a country’s very highest attainers:

‘If such highly able individuals can accrue great awards and accomplishments which benefit not just themselves but, through positive spillovers, drive increase in human capital more widely, then this will be of wider benefit.’

This idea is associated with Benbow and Lubinski, Co-Directors of the Study of Mathematically Precocious Youth (SMPY) located at Vanderbilt University in the US:

‘They argue for a national scheme to identify such individuals and nurture them, both for the individuals’ own benefits but also for the benefits of their home nations. This is because in advanced economies in particular, with a shift towards higher skilled jobs, the economic prosperity of a country depends on its human capital potential. Education today is the economy of tomorrow. If such individuals as these under discussion can generate further talent by virtue of their own accomplishments, then there is a competitive rationale for countries to identify and support these individuals.’

In fact, these arguments have a longer pedigree.

There is no explanation of how the highest attainers ‘can generate further talent by virtue of their own accomplishments’, though this might be a reference to potential future employment as university academics.

Some limited evidence is cited to support fears of a brain drain:

‘A BIS report from 2010 found that some 2.8 per cent of state sector pupils and 5.5 per cent of independent sector pupils apply to universities outside the UK – small in absolute terms but “It is particularly significant that it is the academically most gifted pupils who are the most likely to apply to foreign universities”. Longitudinal data – which unfortunately only goes to 2011 – nevertheless shows a consistent increase since 2005.

Most recently, the Institute for International Education and the US-UK Fulbright Commission releaed [sic] data in late 2014 showing that there were a record number of UK students studying in the USA, which has always been the most popular country for foreign study. 10,191 British students pursued study in the US during the 2013/14 academic year, up from around 9,500 12 months earlier and the largest year-on-year increase in more than a decade. Undergraduates accounted for 49.6 per cent of all UK students heading to the US. Some 23.9 per cent were postgraduates and the remainder were taking part in short-term exchanges or graduate work programmes.’

.

What is proposed?

The proposed Programme would award £10,000 per year for three years of undergraduate study at a UK university to ‘the top 200 scholars in the country’. The total cost of the awards would be ‘£6m a year in steady state’.
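The quoted steady-state cost is consistent with the award terms: once the scheme has run for three years, three cohorts of 200 scholars are in receipt of awards at any one time:

```python
scholars_per_cohort = 200
award_per_year = 10_000      # £10,000 per year of undergraduate study
cohorts_in_study = 3         # three cohorts overlap in steady state

steady_state_cost = scholars_per_cohort * award_per_year * cohorts_in_study
print(steady_state_cost)     # 6000000 -> matches the quoted £6m a year
```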

This would involve the Government collaborating with universities and other unspecified partners to develop a new optional test for 17-18 year-olds.

Any student resident in the UK would be eligible, so there would be no screening process.

The test would:

‘…seek to measure via a range of metrics a combination of academic ability and academic potential. The test would be calibrated to accurately identify those with ability found in approximately 1 in 10,000 individuals (or variants of this depending on how wide the entry criteria are drawn). A proportion of the top ranked scores on this test would be designated National Scholars and be eligible for a package of incentives under the National Scholarship Scheme, contingent upon enrolling as an undergraduate at a UK university.’

Anyone who received a scholarship and subsequently left the country within three years of graduating would be required to repay it.

Hence the scheme would obstruct enrolment as an undergraduate overseas and also place a significant obstacle in the path of postgraduate mobility.

Analysis

There is no problem

The idea is a solution in search of a problem.

There is no specific evidence that the 200 students with the highest ability and academic potential (however that is measured) are any more likely to study abroad.

The 2010 BIS research report quoted above notes that 76% of the students in its survey planned to return to the UK, although many wanted to work abroad before doing so.

Furthermore:

‘Significantly, the survey results point to the students with the strongest A level results being more likely to want to return to the UK at some point after their studies. International student mobility should not therefore be interpreted as a brain drain of the UK’s best and brightest young people.’

The BIS report quite rightly explores this issue in the context of international student mobility, the globalisation of higher education and the postgraduate labour market.

The threat of brain drain can be countered by the argument that the strongest UK students should be encouraged to attend the best courses at the world’s best universities (language of tuition permitting). Only by doing so will they maximise their skills and their subsequent economic value.

Meanwhile, the best overseas students should be welcomed to UK universities and encouraged to consider postgraduate study and employment here, so that the UK economy benefits from their engagement.

Poor policy design

There is insufficient information about the nature of the test.

It would not be an intelligence test, but would assess ‘academic ability and potential’.

Since it must be applicable to all students, regardless of their current subjects of study or their intended undergraduate field(s) of study, it must not rely in any way on subject content, otherwise it would be biased in favour of specialists in those fields.

It seems unlikely that such a test already exists, unless one is prepared to argue that the US SAT test fits the bill and, even if it does, the ceiling is almost certainly too low.

The footnotes acknowledge that:

‘…such a proposed test has no track record on validity and there will be a large number of students therefore caught in statistical noise just outside the cut off score.’

The development process would be lengthy and complex – and the costs correspondingly high. These development costs are not included in the £6m budget.

If the test is coachable, this opens up the possibility of a further market for the private tuition industry. Students will be diverted from their A level studies as a consequence.

The reference to ‘a range of metrics’ suggests the possibility of a complex test battery rather than a single assessment. The ongoing cost of administering the test is also excluded from the budget.

Similarly, the ongoing costs of administering the scholarship scheme, evaluating its effectiveness, monitoring the movements of alumni and pursuing repayments are also excluded.

The relationship between the scholarship and other forms of student support is not properly developed. Why not link the incentive to student loan repayments instead of introducing a separate scholarship scheme? One section of the paper suggests it could meet living costs, or be offset against tuition fees.

It acknowledges that many of the beneficiaries of such scholarships are likely to come from privileged backgrounds and be educated in the independent sector.

It seems unlikely that they would be swayed by financial inducements at this level, especially if their parents have been forking out upwards of £25,000 a year for school fees.

It is likely that those who are determined to study abroad will choose not to take the test. The benefits of £30,000 now will be more than outweighed by the additional earnings they might subsequently expect as a consequence of pursuing a better course elsewhere. This will be especially true of those from affluent backgrounds.

Finally, one doubts whether a sample as tiny as 200 students a year – no matter how talented they are – would have any substantive impact on the UK economy, even assuming that the arguments in favour of globalisation could be set aside. Such a scheme would be more effective if it had a wider reach.

Redundant lines of argument and poor research

The first part of the paper is devoted to describing the original National Scholarship Programme, a completely different animal, designed to provide financial support to enable disadvantaged students to participate in higher education. It is a red herring.

In contrast, the new proposal has nothing to do with fair access or social mobility. It is ‘targeted on talent rather than socio-economic background’.

The paper argues that there are few incentives that ‘recognise and support the most intellectually able’, continuing:

 ‘At a school level, the previous National Academy for Gifted and Talented Youth, was cancelled in 2010 and its funds used for the National Scholarship Programme! [sic]’

This is hopelessly wrong.

NAGTY’s five year contract ended in 2007. Its sponsor, Warwick University, chose not to bid for the subsequent contract, which was intended to extend support to all England’s gifted and talented learners (then numbered at approximately one million), rather than the top 5% of 11-19 year-olds who were NAGTY’s main target group.

The subsequent contract, for Young, Gifted and Talented, ended in 2010 and was not renewed, as the then Labour Government decided to devolve responsibility to schools. This funding stream was not diverted to the NSP, which was administered by HEFCE through BIS.

The paper continues:

‘In line with a general approach towards autonomy, there is also no agreed definition of able students or gifted and talented students. Anecdotally, it is often tended to be used for somewhere around the top 15% or so of the cohort in ability terms. However, this note takes a different and much narrower definition, and is concerned with what might be called the extremely able – those with ability levels found in approximately 1 in every 10,000 of the population.’

The problematic co-existence of definitional autonomy and Ofsted’s emphasis on assessing the effectiveness of all schools’ support for the most able is not discussed.

The reference to ‘somewhere around the top 15%’ is more than anecdotal – it is plucked entirely out of the air. Having introduced this topic, what justification is given for shifting the emphasis away from the top 15% of learners to the 1 in 10,000 (0.01%) deemed extremely able? The policy response to one has negligible bearing on the other.

(In fact, the footnotes reveal that a cadre of 200 scholarships would accommodate some 0.003% of the undergraduate population.)

The next section of the paper suggests that SMPY has been focused on different countries, yet SMPY participants have all been resident in the United States (though Cohort 5 covers graduate students enrolled in the top-ranked maths, science and engineering courses located there).

Benbow and Lubinski argue for a national scheme to identify and nurture such learners from the age of 13. Yet the paper switches again to discuss university scholarship schemes in the US, India, France and Russia. All three of those still extant are focused on maths, science and technology, so none is a direct parallel with what is proposed here.

A comparison is drawn with elite sports funding:

‘This approach mirrors closely the “no compromise approach” of elite sporting organisations funded by UK Sport, which requires tangible outcomes of high performance (ie realistic chances of an Olympic medal) in exchange for funding. Less successful sports, however, popular, are not entitled to the same levels of funding. The net result is that performance at the elite end of UK sport has exponentially grown – whilst alongside that, other funding helps develop grass roots sport and widening participation.’

I struggle to understand the parallels between funding for successful sports and for successful students, unless this is supposed to make the case for not linking the scholarships to socio-economic disadvantage.

The inclusion of a table of five countries’ Olympic medal tallies from 1996-2012 is, however, entirely spurious and redundant.

.

Conclusion

The end of the paper says:

‘There should also be a renewed focus on how to stretch all pupils within the state sector at whatever level, and further work on identifying potential highly able talent across the wider state education sector as Ofsted have identified – both of which will be the focus of future Policy Exchange work. But this is not the same thing, and nor should it be confused with, a scheme to reward and nurture excellence at 18 now, wherever it comes from.’

This is surely ironic, in that much of the commentary above shows how these two issues have been interleaved in the paper itself.

The fact that Policy Exchange plans fresh work on the wider question of support for the most able in schools is welcome. I look forward to being involved.

But, meanwhile, this idea should be consigned to the bin.

.

.

GP

March 2015

The most able students: Has Ofsted made progress?

.

This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.

.

Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour’s announcement of its proposed Gifted and Talented Fund states:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’

.

  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is  well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breath they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response, from Conservatives or Liberal Democrats, to the challenge laid down by Labour, which has decided that some degree of arms-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas over common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions

.

Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

They fail to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
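For anyone who wants to check the arithmetic, the inclusion-exclusion sum behind that 49% figure is simply:

```python
# Share of the Year 7 cohort falling within Ofsted's definition, from the
# 2014 KS2 figures above, combined by inclusion-exclusion.
l5_english = 29  # % with Level 5+ in KS2 reading and writing
l5_maths = 44    # % with Level 5+ in KS2 maths
l5_both = 24     # % with Level 5+ in reading, writing and maths

l5_either = l5_english + l5_maths - l5_both  # L5 in English and/or maths
print(l5_either)  # 49
```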

.

Ofsted venn Capture

.

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, ie once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

Combining these while netting off the overlap between English and maths, to avoid double-counting, gives us a total population of 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
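The same inclusion-exclusion step, applied to the Annex A headcounts, reproduces the 39% figure (the 547,025 denominator is the 2011 Performance Tables figure used above as a stand-in for the 2009 cohort):

```python
# Ofsted's 'most able' headcount in non-selective schools, 2009 KS2 cohort.
l5_both = 91_944      # L5 in both maths and English
l5_maths = 165_340    # L5 maths
l5_english = 138_789  # L5 English (reading and writing)

total = l5_maths + l5_english - l5_both   # net of the overlap
cohort = 547_025                          # stand-in for 2009 KS2 test takers

print(total)                        # 212185
print(round(100 * total / cohort))  # 39
```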

.

Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

.

Ofsted graph 1

Percentage of high attainers within each state-funded non-selective secondary school’s cohort 2014 (Performance Tables measure)

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained a L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained a L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained a L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils in each of these categories within the relevant cohort. (It is of course a different 9% in each case.)

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students, indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
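The 28,000 estimate follows from scaling up the known disadvantaged share, on the stated assumption that the 13.2% rate holds across the whole ‘most able’ population:

```python
# Estimated disadvantaged 'most able' population, 2014 GCSE cohort,
# scaling the known share of the L5 English-and-maths group.
disadvantaged_both = 12_150  # ever-6 FSM students with L5 English and maths
all_both = 91_944            # all students with L5 English and maths
most_able_total = 212_185    # Ofsted's whole 'most able' population (above)

share = disadvantaged_both / all_both   # about 0.132, i.e. 13.2%
print(round(share * most_able_total))   # approximately 28,000
```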

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so-called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

Ofsted Capture 1

Ofsted Capture 2

 .

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

Approximately one in four high attainers fails to meet these progression targets, even though they are not particularly demanding.

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.

.

The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sublevels, amply evidenced by the transition matrices:

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme, the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.

.

Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).

A table is supplied showing progression by sub-level in English and maths separately.

.

Ofsted Capture 3

. 

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

.

Ofsted chart 2

GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment

.

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.

 .

Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.

.

Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools.’

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s Evidence base – inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

.

Ofsted Capture 4

.

  • Supplementary questions asked during 130 Section 5 inspections, focused on how well the most able students were maintaining their progress in KS3, plus the level of challenge and the availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly onto the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged students was small. Consequently, leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.

.

.

Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and KS5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that work was too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.

. 

Assessment and tracking

‘Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tended to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes made as a consequence. Only in eight schools were the most able students’ views sought as a cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable the most able students to reach their potential. Weaknesses at KS3 meant there was too much ground to make up at KS4 and 5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.

 .

Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’

. 

School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.

.

Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.’

Too often poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared a factor in stifled progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

‘While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 schools asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: “There is confusion (in schools) between excellence and elitism”.’

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.

.

Ofsted and other Central Government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

.

Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, the most able students too often succeed despite their schools rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question of whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It is as if the ‘school-led, self-improving’ ideal is already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’.

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

Ofsted capture 5

Ofsted Capture 6

The recommendations Ofsted directs at itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it may yet do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that:

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 

.

.

If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is: why should schools be any more likely to respond this time round than in 2013?

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take. Until we can establish the framework necessary to secure universally high standards across all schools without resorting to national prescription, we – and Ofsted – are whistling in the wind.

GP

March 2015

Labour’s Commitment to Gifted Education: Can the Tories match it?

.

Today, Labour announced that it would support gifted and talented children.

.

This short post examines what is so far in the public domain.

Is this concerted action?

We heard on Sunday (1 March 2015) that Ofsted is bringing forward publication of its second survey report on the education of the ‘most able’.

Plans for the survey were announced in HMCI’s Annual Report, published in December 2014. I set out exactly what was proposed in this contemporaneous post.

At the end of January, HMCI Wilshaw told the Education Select Committee that the second survey report would be published in May (see page 41) but newspaper reports over the weekend said it would appear tomorrow (4 March).

Labour’s announcement is obviously timed to anticipate Ofsted’s report.

By bringing forward his report to this side of the General Election, HMCI has certainly ensured that it will exert much more leverage on political decision-making. He will want that to impact on the Conservatives as well as Labour.

.

What exactly is Labour’s commitment?

The original newspaper report is so far our only source. (I will add any further details from material that appears subsequently.)

It says that, if elected:

  • Labour would establish an independently-administered Gifted and Talented Fund, which is likely to ‘have a £15m pot initially’.
  • Schools would be able to bid for money from the Fund to ‘help their work in stretching the most able pupils’.
  • The Fund would help to establish ‘a new evidence base on how to encourage talented children’.

The current evidence base, cited in support of this decision, comprises: material from Ofsted’s first survey report (June 2013); the Social Mobility and Child Poverty Commission’s report on High-attaining children from disadvantaged backgrounds (June 2014); and PISA data (which I analysed in this post from December 2013).

.

Unanswered questions

There are many.

The use of ‘gifted and talented’ terminology may be misleading, in that the remainder of the text suggests Labour is focused on high attainers including (but not exclusively) those from disadvantaged backgrounds.

It is not clear whether the £15m funding commitment is an annual commitment or an initial investment that might or might not be topped up subsequently.

It seems to be available to both primary and secondary schools, but this is not made explicit.

It is not clear how bids for the funding would be assessed, or who would assess them.

The purpose of the funding seems primarily to support teachers and schools rather than to support high attaining learners themselves.

The relationship between the Fund and building the evidence base is not made clear. Will there be an expectation of school-based action research, for example?

There is no explicit ‘joining up’ with wider Labour action on social mobility or fair access to selective higher education (and there is an unfortunate allusion to the pupil premium which suggests it is exclusively to help lower attainers).

In a separate blog, Shadow Minister Hunt does link the Fund to these twin aims:

‘The long and the short of it is this: if we could help talented, disadvantaged children to achieve at the same trajectory as their better off peers it would almost double the number of children from poor backgrounds attending the top universities.’

but the mechanism by which this will be achieved – and the link with Offa’s regime – is left unexplained.

Then in a third statement, Hunt implies that the funding is:

‘…to support the most able pupils from low and middle income backgrounds to progress into the professions, high quality apprenticeships and the best universities’.

This suggests that the funding will not be targeted exclusively towards those from disadvantaged backgrounds, but it will be targeted at learners rather than teachers and schools.

It will be interesting to see whether the Fund is described more specifically in Labour’s Manifesto.

.

[Postscript: Labour’s Education Manifesto, published on 9 April, makes no reference to the Fund. The nearest it comes is a section on page 22:

‘Building character traits such as resilience, creativity and the ability to work well with others also relies on the good provision of extra-curricular activities. However, this varies greatly across the country with many young people, particularly in disadvantaged areas, still being denied access to the pre-and-after-school clubs, holiday and weekend activities that can help build confidence and skills and lift aspiration. Giving young people the opportunities to build their talents and stretch their abilities in a particular sport, creative activity or subject is important for ensuring we maximise the potential of every young person. Currently, these opportunities are often restricted to young people in private education or those in high performing areas.’

But no explicit commitment is attached to this statement. If this is the Fund, it seems clear that it is now focused on accessing ‘extra-curricular activities’. ] 

.

[Postscript: Labour’s full Manifesto has nothing to say on this topic. The nearest equivalent to the statement in the Education Manifesto above is a commitment to:

‘… provide children with before and after-school clubs and activities, helping to raise their aspirations and attainment. This will be underpinned by a new National Primary Childcare Service, a not for profit organisation to promote the voluntary and charitable delivery of quality extracurricular activities.’

There is also a guarantee of:

‘…a universal entitlement to a creative education so that every young person has access to cultural activity and the arts by strengthening creative education in schools and after-school clubs. Institutions that receive arts funding will be required to open up their doors to young people…’]

 

.

Is anyone on the inside track?

The word on the street is that Labour developed its policy through an internal review.

But the inclusion of a statement from Peter Lampl might suggest that they are in cahoots with the Sutton Trust, where an ex-Labour SPAD is ensconced as Director of Research and Communications.

The Trust’s Mobility Manifesto (September 2014) includes a call for:

‘…an effective national programme for highly able state school pupils, with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress.’

Unfortunately, it is also wedded to the misguided Open Access scheme, which involves denuding state-funded schools of high attainers and diverting them to independent schools instead. (For a more balanced and careful analysis see this post from April 2012)

It cannot be entirely accidental that Lampl published his latest article pushing this wheeze on the same day as Labour’s announcement.

The Education Endowment Foundation might be a potential home for the Fund – and of course the Sutton Trust has a close relationship with the EEF.

.

Pressure on the Tories?

The combined weight of Labour’s announcement and HMI’s report will put significant pressure on the Tories, especially, to follow suit.

They are already in a difficult position in this territory, having publicly wavered between selection and setting.

Back in 2007 then Opposition Leader Cameron ruled out new grammar schools and proposed universal setting as an alternative.

‘Most critics seem to accept, when pressed, that as I have said, the prospect of more grammars is not practical politics….

…When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.

With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a ‘grammar stream’ in every subject in every school.

Setting would be a focus for Ofsted and a priority for all new academies.’

More recently, he has enthused about the expansion of existing grammar schools:

 ‘”I strongly support the right of all good schools to expand. I think that’s very important and that should include grammar schools,” the prime minister said:

“Under this government grammar schools have been able to expand and that is all to the good.”‘

The as-yet-unresolved decision on the Sevenoaks satellite is keeping this a live issue as we approach the Election.

There are now media reports that, while the proposal is ready to be approved, Cameron has insisted that the decision be shelved until after the Election, in an effort to prevent it becoming a significant issue during the Tories’ campaign.

Meanwhile, in September 2014, there was a brief resurgence of the plan for compulsory setting. But this was rapidly relegated to one of a menu of options in the armoury of regional schools commissioners, who would be granted new powers to intervene in failing schools.

In March 2015, the Tory-leaning Policy Exchange think tank published its Education Manifesto, which proposed that:

‘Government should design a prestigious scholarship scheme to financially support the most talented undergraduates in the country – covering approximately 200 individuals a year – if they attend a UK university and remain in the UK for at least three years after graduation.’

This seems too small a fig-leaf to conceal the Tories’ embarrassment – and it is in any case poorly conceived – see my analysis here.

.

The Tories’ only other fallback is the claim that the Coalition Government’s more generic policies will raise standards across the board, including at the top of the attainment spectrum.  This seems increasingly threadbare, however.

With no viable plan C, they could still be squeezed between Labour’s new-found commitment to gifted education and UKIP’s espousal of grammar schools.

There has been a hint that the Tories do have something up their collective sleeve. In her speech to ASCL on 21st March, Morgan set out areas of unfinished business:

‘This is just the beginning and you know as well as I do, how much there is still left to do:

  • to close the gap between disadvantaged pupils and their peers
  • to ensure the excellence that school freedom has delivered reaches all across the system
  • to ensure that the brightest pupils are properly stretched and less able students are taught to master the basics
  • and to ensure that every school has access to truly excellent teachers

I want to reassure you about what that means in practice, it doesn’t mean 5 years of constant upheaval or constant change.

What it does mean is ensuring that the impact of those changes reaches every part of the country, every child, every family and every community.’

We wait to see whether that was empty rhetoric, or whether there is something specific in the Tory Manifesto.

.

[Postscript: The Tory Manifesto makes no reference to setting. On selection it says only:

‘We will continue to allow all good schools to expand whether they are maintained schools, academies, free schools or grammar schools.’

There is additionally a section on STEM education:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables. To help achieve this, we will train an extra 17,500 maths and physics teachers over the next five years. We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

So it seems that any commitment to support the most able is confined to STEM, non-specific, unquantified and uncosted.]

Initial reaction to Labour’s announcement?

This is the first time Labour have expressed support for high attainers since Andy Burnham was Shadow Minister.

If the sum they have announced is an annual commitment, this broadly matches the budget for the National Gifted and Talented Programme when it was at its height in the mid-2000s.

They are clearly anxious to keep this support at arm’s length from Government – they don’t want to return to a national programme.

The disadvantages of full autonomy could be avoided if bids are invited against a framework of priorities, rather than left entirely for schools to determine. Labour presumably want this funding to make a difference to the statistics they cite from the evidence base.

If the funding is for educators rather than learners, that raises the question of whether those from disadvantaged backgrounds might not also be supported through a £50m pupil premium topslice, as I have suggested elsewhere.

It would also be helpful if the funding were linked to a national effort to reach consensus on the education of high attainers, as embodied in these ten core principles.

But this is a decent start. ‘Better than a poke in the eye with a blunt stick’, as my favourite colloquialism has it.

.

[Postscript: Labour’s commitment is relatively vague but costed; the Conservatives have offered merely a general statement confined to STEM. If one were judging between the Parties on this basis, Labour would definitely have the edge.]

.

GP

March 2015

Maths Mastery: Evidence versus Spin

.

On Friday 13 February, the Education Endowment Foundation (EEF) published the long-awaited evaluation reports of two randomised control trials (RCTs) of Mathematics Mastery, an Ark-sponsored programme and recipient of one of the EEF’s first tranche of awards back in 2011.

EEF, Ark and Mathematics Mastery each published a press release to mark the occasion but, given the timing, none attracted attention from journalists and they were discussed only briefly on social media.

The main purpose of this post is to distinguish evidence from spin, to establish exactly what the evaluations tell us – and what provisos should be attached to those findings.

The post is organised into three main sections which deal respectively with:

  • Background to Mathematics Mastery
  • What the evaluation reports tell us and
  • What the press releases claim

The conclusion sets out my best effort at a balanced summary of the main findings. (There is a page jump here for those who prefer to cut to the chase.)

This post is written by a non-statistician for a lay audience. I look to specialist readers to set me straight if I have misinterpreted any statistical techniques or findings.

What was published?

On Friday 13 February the EEF published six different documents relevant to the evaluation:

  • A press release: ‘Low-cost internet-based programme found to considerably improve reading ability of year 7 pupils’.
  • A blog post: ‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’.
  • An updated Maths Mastery home page (also published as a pdf Project Summary in a slightly different format).

The last three of these were written by the Independent Evaluators – Jerrim and Vignoles (et al) – employed through the UCL Institute of Education.

The Evaluators also refer to ‘a working paper documenting results from both trials’ available in early 2015 from http://ideas.repec.org/s/qss/dqsswp.html and www.johnjerrim.com. At the time of writing this is not yet available.

Press releases were issued on the same day by:

All of the materials published to date are included in the analysis below.

Background to Maths Mastery

What is Maths Mastery?

According to the NCETM (October 2014) the mastery approach in mathematics is characterised by certain common principles:

‘Teachers reinforce an expectation that all pupils are capable of achieving high standards in mathematics.

  • The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.
  • Teaching is underpinned by methodical curriculum design and supported by carefully crafted lessons and resources to foster deep conceptual and procedural knowledge.
  • Practice and consolidation play a central role. Carefully designed variation within this builds fluency and understanding of underlying mathematical concepts in tandem.
  • Teachers use precise questioning in class to test conceptual and procedural knowledge, and assess pupils regularly to identify those requiring intervention so that all pupils keep up.

The intention of these approaches is to provide all children with full access to the curriculum, enabling them to achieve confidence and competence – ‘mastery’ – in mathematics, rather than many failing to develop the maths skills they need for the future.’

The NCETM paper itemises six key features, which I paraphrase as:

  • Curriculum design: Relatively small, sequenced steps which must each be mastered before learners move to the next stage. Fundamental skills and knowledge are secured first and these often need extensive attention.
  • Teaching resources: A ‘coherent programme of high-quality teaching materials’ supports classroom teaching. There is particular emphasis on ‘developing deep structural knowledge and the ability to make connections’. The materials may include ‘high-quality textbooks’.
  • Lesson design: Often involves input from colleagues drawing on classroom observation. Plans set out in detail ‘well-tested methods’ of teaching the topic. They include teacher explanations and questions for learners.
  • Teaching methods: Learners work on the same tasks. Concepts are often explored together. Technical proficiency and conceptual understanding are developed in parallel.
  • Pupil support and differentiation: Provided through support and intervention rather than through the topics taught, particularly at early stages. High attainers are ‘challenged through more demanding problems which deepen their knowledge of the same content’. Issues are addressed through ‘rapid intervention’ commonly undertaken the same day.
  • Productivity and practice: Fluency is developed from deep knowledge and ‘intelligent practice’. Early learning of multiplication tables is expected. The capacity to recall facts from long term memory is also important.

Its Director published a blog post (October 2014) arguing that our present approach to differentiation has ‘a very negative effect’ on mathematical attainment and that this is ‘one of the root causes’ of our performance in PISA and TIMSS.

This is because it negatively affects the ‘mindset’ of low attainers and high attainers alike. Additionally, low attainers are insufficiently challenged and get further behind because ‘they are missing out on some of the curriculum’. Meanwhile high attainers are racing ahead without developing fluency and deep understanding.

He claims that these problems can be avoided through a mastery approach:

‘Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace, allowing them all full access to the curriculum by focusing on developing deep understanding and secure fluency with facts and procedures, and providing differentiation by offering rapid support and intervention to address each individual pupil’s needs.’

But unfortunately he stops short of explaining how, for high attainers, exclusive focus on depth is preferable to a richer blend of breadth, depth and pace, combined according to each learner’s needs.

NCETM is careful not to suggest that mastery is primarily focused on improving the performance of low-attaining learners.

It has published separate guidance on High Attaining Pupils in Primary Schools (registration required), which advocates a more balanced approach, although that predates this newfound commitment to mastery.

NCETM is funded by the Department for Education. Some of the comments on the Director’s blog post complain that it is losing credibility by operating as a cheerleader for Government policy.

Ark’s involvement

Ark is an education charity and multi-academy trust with an enviable reputation.

It builds its approach on six key principles, one of which is ‘Depth before breadth’:

‘When pupils secure firm foundations in English and mathematics, they find the rest of the curriculum far easier to access. That’s why we prioritise depth in these subjects, giving pupils the best chance of academic success. To support fully our pupils’ achievement in maths, we have developed the TES Award winning Mathematics Mastery programme, a highly-effective curriculum and teaching approach inspired by pupil success in Singapore and endorsed by Ofsted. We teach Mathematics Mastery in all our primary schools and at Key Stage 3 in a selection of our secondary schools. It is also being implemented in over 170 schools beyond our network.’

Ark’s 2014 Annual Report identifies five priorities for 2014/15, one of which is:

‘…developing curricula to help ensure our pupils are well prepared as they go through school… codifying our approach to early years and, building on the success of Maths Mastery, piloting an English Mastery programme…’

Mathematics Mastery is a charity in its own right. Its website lists 15 staff, a high-powered advisory group and three partner organisations:  Ark, the EEF (presumably by virtue of the funded evaluation) and the ‘Department for Education and the Mayor of London’ (presumably by virtue of support from the London Schools Excellence Fund).

NCETM’s Director sits on Mathematics Mastery’s Advisory Board.

Ark’s Chief Executive is a member of the EEF’s Advisory Board.

Development of Ark’s Maths Mastery programme

According to this 2012 report from Reform, which features Maths Mastery as a case study, it originated in 2010:

‘The development of Mathematics Mastery stemmed from collaboration between six ARK primary academies in Greater London, and the mathematics departments in seven separate ARK secondary academies in Greater London, Portsmouth and Birmingham. Representatives from ARK visited Singapore to explore the country’s approach first-hand, and Dr Yeap Ban Har, Singapore’s leading expert in maths teaching, visited King Solomon Academy in June 2011.’

In October 2011, EEF awarded Ark a grant of £600,000 for Maths Mastery, one of its first four awards.

The EEF’s press release says:

‘The third grant will support an innovative and highly effective approach to teaching children maths called Mathematics Mastery, which originated in Singapore. The programme – run by ARK Schools, the Academies sponsor, which is also supporting the project – will receive £600,000 over the next four years to reach at least 50 disadvantaged primary and secondary schools.’

Ark’s press release adds:

‘ARK Schools has been awarded a major grant by the Education Endowment Foundation (EEF) to further develop and roll out its Mathematics Mastery programme, an innovative and highly effective approach to teaching children maths based on Singapore maths teaching. The £600,000 grant will enable ARK to launch the programme and related professional development training to improve maths teaching in at least 50 disadvantaged primary and secondary schools.

The funding will enable ARK Schools to write a UK mathematics mastery programme based on the experience of teaching the pilot programme in ARK’s academies. ARK intends to complete the development of its primary modules for use from Sept 2012 and its secondary modules for use from September 2013. In parallel ARK is developing professional training and implementation support for schools outside the ARK network.’

The project home page on EEF’s site now says the total project cost is £774,000. It may be that the balance of £174,000 is the fee paid to the independent evaluators.

This 2012 information sheet says all Ark primary schools would adopt Maths Mastery from September 2012, and that its secondary schools have also devised a KS3 programme.

It describes the launch of a Primary Pioneer Programme from September 2012 and a Secondary Pioneer Programme from September 2013. These will form the cohorts to be evaluated by the EEF.

In 2013, Ark was awarded a grant of £617,375 from the Mayor of London’s London Schools Excellence Fund for the London Primary Schools Mathematics Mastery Project.

This is to support the introduction of Mastery in 120 primary schools spread across 18 London boroughs. (Another source gives the grant as £595,000)

It will be interesting to see whether Maths Mastery (or English Mastery) features in the Excellence Fund’s latest project to increase primary attainment in literacy and numeracy. The outcomes of the EEF evaluations may be relevant to that impending decision.

Ark’s Mathematics Mastery today

The Mathematics Mastery website advertises a branded variant of the mastery model, derived from a tripartite ‘holistic vision’:

  • Deep understanding, through a curriculum that combines universal high expectations with spending more time on fewer topics and heavy emphasis on problem-solving.
  • Integrated professional development through workshops, visits, coaching and mentoring and ‘access to exclusive online teaching and learning materials, including lesson guides for each week’.
  • Teacher collaboration – primary schools are allocated a geographical cluster of 4-6 schools while secondary schools attend a ‘national collaboration event’. There is also an online dimension.

It offers primary and secondary programmes.

The primary programme has three particular features: use of objects and pictures prior to the introduction of symbols; a structured approach to the development of mathematical vocabulary; and heavy emphasis on problem-solving.

It involves one-day training sessions for school leaders, for the Maths Mastery lead and those new to teaching it, and for teachers undertaking the programme in each year group. Each school receives two support visits and attends three local cluster meetings.

Problem-solving is also one of three listed features of the secondary programme. The other two are fewer topics undertaken in greater depth, plus joint lesson planning and departmental workshops.

There are two full training days, one for the Maths Mastery lead and one for the maths department plus an evening session for senior leadership. Each school receives two support visits and attends three national collaborative meetings. They must hold an hour-long departmental workshop each week and commit to sharing resources online.

Both primary and secondary schools are encouraged to launch the programme across Year 1/7 and then roll it upwards ‘over several years’.

The website is not entirely clear but it appears that Maths Mastery itself is being rolled out a year at a time, so even the original primary early adopters will have provision only up to Year 3 and are scheduled to introduce provision for Year 4 next academic year. In the secondary sector, activity currently seems confined to KS3, and predominantly to Year 7.

The number of participating schools is increasing steadily but is still very small.

The most recent figures I could find are 192 (Maths Mastery, November 2014) or 193 – 142 primary and 51 secondary (Ark 2015).

One assumes that this total includes:

  • An original tranche of 30 primary ‘early adopters’ including 21 not managed by Ark
  • 60 or so primary and secondary ‘Pioneer Schools’ within the EEF evaluations (i.e. the schools undertaking the intervention but not those forming the control group, unless they have subsequently opted to take up the programme)
  • The 120 primary schools in the London project
  • Primary and secondary schools recruited outwith the London and EEF projects, either alongside them or subsequently.

But the organisation does not provide a detailed breakdown, or show how these different subsets overlap.

They are particularly coy about the cost. There is nothing about this on the website.

The EEF evaluation reports say that 2FE primary schools and secondary schools will pay ‘an upfront cost of £6,000 for participating in the programme’.

With the addition of staff time for training, the per pupil cost for the initial year is estimated as £127 for primary schools and £50 for secondary schools.

The primary report adds:

‘In subsequent years schools are able to opt for different pathways depending on the amount of support and training they wish to choose; they also have ongoing access to the curriculum materials for additional year groups. The per pupil cost therefore reduces considerably, to below £30 per pupil for additional year groups.’

In EEF terms this is deemed a low-cost intervention, although an outlay of this magnitude is a significant burden for primary schools, particularly when funding is under pressure, and might be expected to act as a brake on participation.

Further coyness is evident in respect of statutory assessment outcomes. Some details are provided for individual schools, but there is precious little about the whole cohort.

All I could find was this table in the Primary Yearbook 2014-15.

.

EEF maths mastery performance

It suggests somewhat better achievement at KS1 L2b and L3c than the national average, but there is no information about other levels and, of course, the sample is not representative, so the comparison is of limited value.

An absence of more sophisticated analysis – combined with the impression of limited transparency for those not yet inside the programme – is likely to act as a second brake on participation.

There is a reference to high attainers in the FAQ on the website:

‘The Mathematics Mastery curriculum emphasises stretching through depth of understanding rather than giving the top end of pupils [sic] new procedures to cover.

Problem solving is central to Mathematics Mastery. The great thing about the problems is that students can take them as far as they can, so those children who grasp the basics quickly can explore tasks further. There is also differentiation in the methods used, with top-end pupils typically moving to abstract numbers more quickly and spending less time with concrete manipulatives or bar models. There are extension ideas and support notes provided with the tasks to help you with this.

A range of schools are currently piloting the programme, which is working well in mixed-ability classes, as well as in schools that have set groups.’

The same unanswered questions arise as with the NCETM statement above. Is ‘Maths Mastery’ primarily focused on the ‘long tail’, potentially at the expense of high attainers?

The IoE evaluators think so. The primary evaluation report says that:

‘Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers.’

It would be helpful to have clarity on this point.

.

How influential is Maths Mastery?

Extremely influential.

Much educational and political capital has already been invested in Maths Mastery, hence the peculiar significance of the results contained in the evaluation reports.

The National Curriculum Expert Panel espoused mastery in its ‘Framework for the National Curriculum‘ (December 2011), while ducking the consequences for ‘stretch and challenge’ for high attainers – so creating a tension that remains unresolved to this day.

Meanwhile, the mastery approach has already influenced the new maths programme of study, as the NCETM document makes clear:

‘The 2014 national curriculum for mathematics has been designed to raise standards in maths, with the aim that the large majority of pupils will achieve mastery of the subject…

… For many schools and teachers the shift to this ‘mastery curriculum’ will be a significant one. It will require new approaches to lesson design, teaching, use of resources and support for pupils.’

Maths Mastery confirms that its Director was on the drafting team.

Mastery is also embedded in the national collaborative projects being undertaken through the Maths Hubs. Maths Mastery is one of four national partners in the Hubs initiative.

Ministers have endorsed the Ark programme in their speeches. In April 2014, Truss said:

‘The mastery model of learning places the emphasis on understanding core concepts. It’s associated with countries like Singapore, who have very high-performing pupils.

And in this country, Ark, the academy chain, took it on and developed it.

Ark run training days for maths departments and heads of maths from other schools.

They organise support visits, and share plans and ideas online with other teachers, and share their learning with a cluster of other schools.

It’s a very practical model. We know not every school will have the time or inclination to develop its very own programmes – a small rural school, say, or single-class primary schools.

But in maths mastery, a big chain like Ark took the lead, and made it straightforward for other schools to adopt their model. They maintain an online community – which is a cheap, quick way of keeping up with the best teaching approaches.

That’s the sort of innovation that’s possible.

Of course the important thing is the results. The programme is being evaluated so that when the results come out headteachers will be able to look at it and see if it represents good value.’

In June 2014 she said:

‘This idea of mastery is starting to take hold in classrooms in England. Led by evidence of what works, teachers and schools have sought out these programmes and techniques that have been pioneered in China and East Asia….

…With the Ark Schools Maths Mastery programme, more than 100 primary and secondary schools have joined forces to transform their pupils’ experiences of maths – and more are joining all the time. It’s a whole school programme focused on setting high expectations for all pupils – not believing that some just can’t do it. The programme has already achieved excellent results in other countries.’

Several reputations are being built upon Maths Mastery, many jobs depend upon it and large sums have been invested.

It has the explicit support of one of the country’s foremost academy chains and is already impacting on national curriculum and assessment policy (including the recent consultation on performance indicators for statutory teacher assessment).

Negative or neutral evaluations could have significant consequences for all the key players and are unlikely to encourage new schools to join the Programme.

There is therefore pressure in the system for positive outcomes – hence the significance of spin.

What the EEF evaluations tell us

.

Evaluation Protocols

EEF published separate Protocols for the primary and secondary evaluations in April 2013. These are broadly in line with the approach set out in the final evaluation reports, except that both refer much more explicitly to subsequent longitudinal evaluation:

‘In May/June 2017/18 children in treatment and control schools will sit key stage 2 maths exams. The IoE team will examine the long–run effectiveness of the Maths Mastery programme by investigating differences in school average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2012 and 2013)’.

‘In May/June 2018 children in treatment and control schools will sit national maths exams. The IoE team will examine the long – run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2013 and 2014) by NATCEN.’

It is not clear whether the intention is to preserve the integrity of the intervention and control groups until the former have rolled out Mastery to all year groups, or simply to evaluate the long-term effects of the initial one-year interventions, allowing intervention schools to drop Mastery and control schools to adopt it, entirely as they wish.

EEF Maths Mastery Project Homepage

The EEF’s updated Maths Mastery homepage has been revised to reflect the outcomes of the evaluations. It provides the most accessible summary of those outcomes.

It offers four key conclusions (my emphases):

  • ‘On average, pupils in schools adopting Mathematics Mastery made a small amount more progress than pupils in schools that did not. The effect detected was statistically significant, which means that it is likely that that improvement was caused by the programme.’
  • ‘It is unclear whether the programme had a different impact on pupils eligible for free school meals, or on pupils with higher or lower attainment.’
  • ‘Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider.’
  • ‘The evaluations assessed the impact of the programme in its first year of adoption. It would be worthwhile to track the medium and long-term impact of the approach.’

A table is supplied showing the effect sizes and confidence intervals for overall impact (primary and secondary together), and for the primary and secondary interventions separately.

EEF table 1 Capture

.

The support materials for the EEF’s toolkit help to explain these judgements.

About the Toolkit tells us that:

‘Average impact is estimated in terms of the additional months’ progress you might expect pupils to make as a result of an approach being used in school, taking average pupil progress over a year as a benchmark.

For example, research summarised in the Toolkit shows that improving the quality of feedback provided to pupils has an average impact of eight months. This means that pupils in a class where high quality feedback is provided will make on average eight months more progress over the course of a year compared to another class of pupils who were performing at the same level at the start of the year. At the end of the year the average pupil in a class of 25 pupils in the feedback group would now be equivalent to the 6th best pupil in the control class having made 20 months progress over the year, compared to an average of 12 months in the other class.’
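The class-rank illustration in that quotation can be reproduced in a few lines, on the usual assumption (not stated explicitly in the Toolkit extract) that eight months’ additional progress corresponds to an effect size of roughly 0.8 standard deviations and that scores are normally distributed:

```python
import math

def percentile_from_effect_size(d: float) -> float:
    """Percentile of the control-group distribution reached by the
    average treated pupil, assuming normally distributed scores."""
    return 0.5 * (1 + math.erf(d / math.sqrt(2)))

# Assumed conversion: eight months' progress ~ effect size of 0.8.
p = percentile_from_effect_size(0.8)

# In a class of 25, the average treated pupil overtakes about 79% of
# the control class - roughly the 6th best pupil, as the quote says.
rank = round((1 - p) * 25) + 1
print(round(p, 2), rank)
```

Run as written, this gives a percentile of about 0.79 and a rank of 6, matching the ‘6th best pupil in a class of 25’ in the Toolkit’s feedback example.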

Another table shows how to interpret this scale:

[Table: EEF scale converting effect sizes into months’ progress]

.

We can see from this that:

  • The overall Maths Mastery impact of +0.073 is towards the upper end of the ‘1 months progress’ category.
  • The ‘primary vs comparison’ impact of +0.10 just scrapes into the ‘2 months progress’ category.
  • The secondary vs comparison impact of +0.06 is towards the middle of the ‘1 months progress’ category.

All three are officially classed as ‘Low Effect’.

If we compare the effect size attributable to Maths Mastery with others in the Toolkit, it is evident that it ranks slightly above school uniform and slightly below learning styles.
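For readers wanting to check these placements, the banding can be sketched as a simple lookup. The thresholds below are inferred from the figures discussed in this post (one month from an effect size of 0.05, two months from 0.10, three from roughly 0.19); the EEF's published conversion table should be treated as authoritative.

```python
# Approximate lower bounds for each band, inferred from the Toolkit's
# conversion table -- treat the published table as authoritative.
BANDS = [(0.19, 3), (0.10, 2), (0.05, 1)]  # (minimum effect size, months)

def months_progress(effect_size):
    """Map an effect size to the Toolkit's months-progress band."""
    for threshold, months in BANDS:
        if effect_size >= threshold:
            return months
    return 0

print(months_progress(0.073))  # → 1: upper end of the one-month band
print(months_progress(0.10))   # → 2: just scrapes into two months
print(months_progress(0.099))  # → 1: the third decimal place pulls it back
```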

A subsequent section explains that the overall impact rating is dependent on meta-analysis (again my emphases):

‘The findings from the individual trials have been combined using an approach called “meta-analysis”. Meta-analysis can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note that care is needed in interpreting meta-analysed findings.’
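Neither the homepage nor the summary shows the pooling arithmetic, but the headline figure can be reconstructed from the two trial results quoted in the full reports (primary +0.10, CI −0.01 to +0.21; secondary +0.055, CI −0.037 to +0.147). The sketch below assumes a simple fixed-effect inverse-variance model; the evaluators do not state exactly which model they used, so this is an approximation.

```python
from math import sqrt

def se_from_ci(lower, upper, z=1.96):
    """Back out a standard error from a 95% confidence interval."""
    return (upper - lower) / (2 * z)

def fixed_effect_pool(results):
    """Inverse-variance (fixed-effect) pooling of (effect, se) pairs."""
    weights = [1 / se ** 2 for _, se in results]
    pooled = sum(w * e for (e, _), w in zip(results, weights)) / sum(weights)
    return pooled, sqrt(1 / sum(weights))

primary = (0.10, se_from_ci(-0.01, 0.21))
secondary = (0.055, se_from_ci(-0.037, 0.147))

pooled, pooled_se = fixed_effect_pool([primary, secondary])
lower = pooled - 1.96 * pooled_se

# The pooled estimate comes out close to the published +0.073, with a
# lower confidence bound only a few thousandths above zero.
print(pooled, lower)
```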

But we are not told how, in light of this, we are to exercise care in interpreting this particular finding. There are no explicit ‘health warnings’ attached to it.

The homepage does tell us that:

‘Due to the ages of pupils who participated in the individual trials, the headline findings noted here are more likely to be predictive of programme’s impact on pupils in primary school than on pupils in secondary school.’

It also offers an explanation of why the effects generated from these trials are so small compared with those for earlier studies:

‘The findings were substantially lower than the average effects seen in the existing literature on “mastery approaches”. A possible explanation for this is that many previous studies were conducted in the United States in the 1970s and 80s, so may overstate the possible impact in English schools today. An alternative explanation is that the Mathematics Mastery programme differed from some examples of mastery learning previously studied. For example, classes following the Mathematics Mastery approach did not delay starting new topics until a high level of proficiency had been achieved by all students, which was a key feature in a number of apparently effective programmes.’

 

There is clearly an issue with the 95% confidence intervals supplied in the first table above. 

The Technical Appendices to the Toolkit say:

‘For those concerned with statistical significance, it is still readily apparent in the confidence intervals surrounding an effect size. If the confidence interval includes zero, then the effect size would be considered not to have reached conventional statistical significance.’ (p6)

The table indicates that the lower confidence bound is zero or below in all three cases, suggesting that none of these findings is statistically significant.

However, the homepage claims that the overall impact of both interventions, when combined through meta-analysis, is statistically significant.

And it fails entirely to mention that the impacts of the primary and the secondary interventions, taken separately, are statistically insignificant.

The explanation of the attribution of statistical significance to the two evaluations combined is that, whereas the homepage gives confidence intervals to two decimal places, the reports calculate them to a third decimal place.

This gives a lower value of 0.004 (ie four thousandths above zero).
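The rounding point is easy to demonstrate; a minimal sketch:

```python
lower_bound = 0.004  # pooled 95% CI lower bound, to three decimal places

# On the homepage the same bound is shown to two decimal places,
# where it becomes indistinguishable from zero -- which is why the
# table appears to contradict the claim of statistical significance.
print(f"{lower_bound:.2f}")  # → 0.00 (looks non-significant)
print(f"{lower_bound:.3f}")  # → 0.004 (just significant)
print(lower_bound > 0)       # → True, but only by four thousandths
```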

This can be seen from the table annexed to the primary and secondary reports and included in the ‘Overarching Summary Report’.

[Table: effect sizes and confidence intervals to three decimal places, from the Overarching Summary Report]

.

The distinction is marginal, to say the least. Indeed, the Evaluation Reports say:

‘…the pooled effect size of 0.073 is just significantly different from zero at conventional thresholds’

Moreover, notice that the introduction of a third decimal place drags the primary effect size down to 0.099, officially consigning it to the ‘one month’s progress’ category rather than the two months quoted above.

This might appear to be dancing on the head of a statistical pin but, as we shall see later, the spin value of statistical significance is huge!

Overall there is a lack of clarity here that cannot be attributed entirely to the necessity for brevity. The attempt to conflate subtly different outcomes from the separate primary and secondary evaluations has masked these distinctions and distorted the overall assessment.

.

The full reports add some further interesting details which are summarised in the sections below.

Primary Evaluation Report 

[Table: primary evaluation summary]

Key points:

  • In both the primary and secondary reports, additional reasons are given for why the effects from these evaluations are so much smaller than those from previous studies. These include the fact that:

‘…some studies included in the mastery section of the toolkit show small or no effects, suggesting that making mastery learning work effectively in all circumstances is challenging.’

The overall conclusion is an indirect criticism of the Toolkit, noting as it does that ‘the relevance of such evidence for contemporary education policy in England…may be limited’.

  • The RCT was undertaken across two academic years. In AY2012/13, 40 schools (Cohort A) were involved, 20 randomly allocated to the intervention and 20 to the control. In AY2013/14, 50 schools (Cohort B) participated, 25 allocated to the intervention and 25 to the control. After the trial, control schools in Cohort A were free to pursue Maths Mastery. (The report does not say whether this also applied to Cohort B.) It is not clear how subsequent longitudinal evaluation will be affected by such leakage from the control group.
  • The schools participating in the trial were recruited by Ark. They had to be state-funded and not already undertaking Maths Mastery:

‘Schools were therefore purposefully selected—they cannot be considered a randomly chosen sample from a well-defined population. The majority of schools participating in the trial were from London or the South East.’

  • Unlike the secondary evaluation, the primary evaluation included no process evaluation, so it is not possible to determine the extent to which schools adhered to the prescribed programme.
  • Baseline tests were administered after allocation between intervention and control, at the beginning of each academic year. Pupils were tested again in July. Evaluators used the Number Knowledge Test (NKT) for this purpose. The report discusses reasons why this might not be an accurate predictor of subsequent maths attainment and whether it is so closely related to the intervention as to be ‘a questionable measure of the success of the trial’. The discussion suggests that there were potential advantages to both the intervention and control groups but does not say whether one outweighed the other. 
  • The results of the post-test are summarised thus:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.10 standard deviations higher on the post-test. This, however, only reached statistical significance at the 10% level (t = 1.82; p = 0.07), with the 95% confidence interval ranging from -0.01 to +0.21. Within Cohort A, children in the treatment group scored (on average) +0.09 standard deviations above those children in the control group (confidence interval -0.06 to +0.24). The analogous effect in Cohort B was +0.10 (confidence interval -0.05 to 0.26). Consequently, although the Mathematics Mastery intervention may have had a small positive effect on children’s test scores, it is not possible to rule out sampling variation as an explanation.’
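The reported interval is internally consistent: the standard error implied by the effect size and t-statistic reproduces both the confidence interval and (to a normal approximation) the quoted p-value. A quick check, my own sketch rather than the evaluators' calculation:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

effect, t = 0.10, 1.82
se = effect / t                  # implied standard error, about 0.055
lower = effect - 1.96 * se
upper = effect + 1.96 * se
p = 2 * (1 - normal_cdf(t))      # two-sided p-value, normal approximation

print(round(lower, 2), round(upper, 2))  # → -0.01 0.21
print(round(p, 2))                       # → 0.07
```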

  • The comparison of pre-test and post-test results provides hardly any evidence of differential effects for those with lower or higher prior attainment:

‘Estimates are again presented in terms of effect sizes. The interaction effect is not significantly different from zero, with the 95% confidence interval ranging from -0.01 to +0.02. Thus there is little evidence that the effect of Mathematics Mastery differs between children with different levels of prior achievement.’

The Report adds:

‘Recall that the Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers. Thus one might anticipate the intervention to be particularly effective in the bottom half of the test score distribution. There is some, but relatively little, evidence that the intervention was less effective for the bottom half of the test distribution.’

So, on this evidence, Maths Mastery is no more effective for the low achievers it is intended to help most. This is somewhat different to the suggestion on the homepage that the answer given to this question is ‘unclear’.

Several limitations are discussed, but it is important to note that they are phrased in hypothetical terms:

  • Pupils’ progress was evaluated after one academic year:

’This may be considered a relatively small ‘dose’ of the Mathematics Mastery programme’.

  • The intervention introduced a new approach to schools, so there was a learning curve which control schools did not experience:

‘With more experience teaching the programme it is possible that teachers would become more effective in implementing it.’

  • The test may favour either control schools or intervention schools.
  • Participating schools volunteered to take part, so it is not possible to say whether similar effects would be found in all schools.
  • It was not possible to control for balance – eg by ethnic background and FSM eligibility – between intervention and control. [This is now feasible so could potentially be undertaken retrospectively to check there was no imbalance.]

Under ‘Interpretation’, the report says:

‘Within the context of the wider educational literature, the effect size reported (0.10 standard deviations) would typically be considered ‘small’….

Yet, despite the modest and statistically insignificant effect, the Mathematics Mastery intervention has shown some promise.’

The phrase ‘some promise’ is justified by reference to the meta-analysis, the cost-effectiveness (a small effect size for a low cost is preferable to the same outcome for a higher cost) and the fact that the impact of the entire programme has not yet been evaluated:

‘Third, children are likely to follow the Mathematics Mastery programme for a number of years (perhaps throughout primary school), whereas this evaluation has considered the impact of just the first year of the programme. Long-run effects after sustained exposure to the programme could be significantly higher, and will be assessed in a follow-up study using Key Stage 2 data.’

This is the only reference to a follow-up study. It is less definite than the statement in the assessment protocol and there is no further explanation of how this will be managed, especially given potential ‘leakage’ from the control group.

Secondary Evaluation Report

[Table: secondary evaluation summary]

Key points:

  • 50 schools were recruited to participate in the RCT during AY2013/14, with 25 randomly allocated to the intervention and 25 to the control. All Year 7 pupils in the intervention schools experienced the programme. As in the primary trial, control schools were eligible to access the programme after the end of the trial year. Interestingly, 3 of the 25 intervention schools (12%) dropped out before the end of the year; their reasons are not recorded.
  • As in the primary trial, Ark recruited the participating schools – which had to be state-funded and new to Maths Mastery. Since schools were deliberately selected they could not be considered a random sample. The report notes:

‘Trial participants, on average, performed less well in their KS1 and KS2 examinations than the state school population as a whole. For instance, their KS1 average points scores (and KS2 maths test scores) were approximately 0.2 standard deviations (0.1 standard deviations) below the population mean. This seems to be driven, at least in part, by the fact that the trial particularly under-represented high achievers (relative to the population). For instance, just 12% of children participating in the trial were awarded Level 3 in their Key Stage 1 maths test, compared to 19% of all state school pupils in England.’

  • KS1 and KS2 results were used as the baseline. The Progress in Maths (PiM) test was used to assess pupils at the end of the year, but about 40% of its questions cover content not included in the Y7 Maths Mastery curriculum, disadvantaging intervention pupils relative to the control group. PiM also includes a calculator section, although calculators are not used in Year 7 of Maths Mastery. It was agreed that breakdowns of results would be supplied to account for this.
  • On the basis of overall test results:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.055 standard deviations higher on the PiM post-test. This did not reach statistical significance at conventional thresholds (t = 1.20; p = 0.24), with the 95% confidence interval ranging from –0.037 to +0.147. Turning to the FSM-only sample, the estimated effect size is +0.066 with the 95% confidence interval ranging from –0.037 to +0.169 (p = 0.21). Moreover, we also estimated a model including a FSM-by-intervention interaction. Results suggested there was little evidence of heterogeneous intervention effects by FSM. Consequently, although the Mathematics Mastery intervention may have had a small positive effect on overall PiM test scores, one cannot rule out the possibility that this finding is due to sampling variation.’

  • When the breakdowns were analysed:

‘As perhaps expected, the Mathematics Mastery intervention did not have any impact upon children’s performance on questions covering topics outside the Mathematics Mastery curriculum. Indeed, the estimated intervention effect is essentially zero (effect size = –0.003). In contrast, the intervention had a more pronounced effect upon material that was focused upon within the Mathematics Mastery curriculum (effect size = 0.100), just reaching statistical significance at the 5% level (t = 2.15; p = 0.04).’

  • The only analysis of the comparative performance of high and low attainers is tied to the parts of the test not requiring a calculator. It suggests a noticeably smaller effect in the top half of the attainment distribution, with no statistically significant effect above the 55th percentile. This is substantively different from the finding in the primary evaluation, and it raises the question of whether secondary Maths Mastery needs adjustment to make it more suitable for high attainers.
  • The process evaluation focused principally on five schools from the intervention group. Focus group discussions were held before the intervention and again towards its end; telephone interviews were conducted and lessons observed. The sample was selected to include schools of different sizes, varying FSM intake, and both poor and good progress in maths according to their most recent inspection reports. One of the recommendations is that:

‘The intervention should consider how it might give more advice and support with respect to differentiation.’

  • The process evaluation adds further detail about suitability for high attainers:

‘Another school [E] also commented that the materials were also not sufficiently challenging for the highest-attaining children, who were frustrated by revisiting at length the same topics they had already encountered at primary school. Although this observation was also made in other schools, it was generally felt that the children gradually began to realise that they were in fact enjoying the subject more by gaining extra understanding.’

It is not clear whether this latter comment also extends to the high attainers!

A similar set of limitations is explored, in much the same language as the primary report.

Under ‘Interpretation’ the report says:

‘Although point estimates were consistent with a small, positive gain, the study did not have sufficient statistical power to rule out chance as an explanation. Within the context of the wider educational literature, the effect size reported (less than 0.10 standard deviations) would typically be considered ‘small’…

But, as in the primary report, it detects ‘some promise’ on the same grounds. There is a similar speculative reference to longitudinal evaluation.

.

Press releases and blogs

. 

EEF press release

There is a certain irony in the fact that ‘unlucky’ Friday 13 February was the day selected by the EEF to release these rather disappointing reports.

But Friday is typically the day selected by communications people to release educational news that is most likely to generate negative media coverage – and a Friday immediately before a school holiday is a particularly favoured time to do so, presumably because fewer journalists and social media users are active.

Unfortunately, the practice is at risk of becoming self-defeating, since everyone now expects bad news on a Friday, whereas they might be rather less alert on a busier day earlier in the week.

On this occasion Thursday was an exceptionally busy day for education news, with reaction to Miliband’s speech and a raft of Coalition announcements designed to divert attention from it. With the benefit of hindsight, Thursday might have been a better choice.

The EEF’s press release dealt with evaluation reports on nine separate projects, so increasing the probability that attention would be diverted away from Maths Mastery.

It led on a different evaluation report which generated more positive findings – the EEF seems increasingly sensitive to concerns that too many of the RCTs it sponsors are showing negligible or no positive effect, presumably because the value-for-money police may be inclined to turn their beady eye upon the Foundation itself.

But perhaps it also did so because Maths Mastery’s relatively poor performance was otherwise the story most likely to attract the attention of more informed journalists and commentators.

On the other hand, Maths Mastery was given second billing:

‘Also published today are the results of Mathematics Mastery, a whole-school approach which aims to deepen pupils’ conceptual understanding of key mathematical ideas. Compared to traditional curricula, fewer topics are covered in more depth and greater emphasis is placed on problem solving and encouraging mathematical thinking. The EEF trials found that pupils following the Mathematics Mastery programme made an additional month’s progress over a period of a year.’

.

.

EEF blog post

Later on 13 February EEF released a blog post written by a senior analyst which mentions Maths Mastery in the following terms:

‘Another finding of note is the small positive impact of teaching children fewer mathematical concepts, but covering them in greater depth to ensure ‘mastery’. The EEF’s evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only be evident in years to come as children are able to draw on their secure mathematical foundations to tackle more complex problems.’

EEF is consistently reporting a small positive impact but, as we have seen, this is rather economical with the truth. It deserves some qualification.

More interestingly though, the post adds (my emphases):

‘Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and which principles should underpin approaches on encouraging children in reading for pleasure are all issues that have important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.’

It will be interesting to monitor the impact of this work on the communication of outcomes from these particular evaluations.

It will be important to ensure that synthesis and dissemination is not at the expense of accuracy, particularly when ‘high stakes’ results are involved, otherwise there is a risk that users will lose faith in the independence of EEF and its willingness to ‘speak truth unto power’.

.

Maths Mastery Press Release

By also releasing their own posts on 13 February, Mathematics Mastery and Ark made sure that they too would not be picked up by the media.

They must have concluded that, even if they placed the most positive interpretation on the outcomes, they would find it hard to create the kind of media coverage that would generate increased demand from schools.

The Mathematics Mastery release – ‘Mathematics Mastery speeds up pupils’ progress – and is value for money too’ – begins with a list of bullet points citing other evidence that the programme works, so implying that the EEF evaluations are relatively insignificant additions to this comprehensive evidence base:

  • ‘Headteachers say that the teaching of mathematics in their schools has improved
  • Headteachers are happy to recommend us to other schools
  • Numerous Ofsted inspections have praised the “new approach to mathematics” in partner schools
  • Extremely positive evaluations of our training and our school development visits
  • We have an exceptionally high retention rate – schools want to continue in the partnership
  • Great Key Stage 1 results in a large number of schools.’

Much of this is hearsay, or else vague reference to quantitative evidence that is not published openly.

The optimistic comment on the EEF evaluations is:

‘We’re pleased with the finding that, looking at both our primary and secondary programmes together, pupils in the Mathematics Mastery schools make one month’s extra progress on average compared to pupils in the other schools after a one year “dose” of the programme…

…This is a really pleasing outcome – trials of this kind are very rigorous.  Over 80 primary schools and 50 secondary schools were involved in the testing, with over 4000 pupils involved in each phase.  Studies like this often don’t show any progress at all, particularly in the early years of implementation and if, like ours, the programme is aimed at all pupils and not just particular groups.  What’s more, because of the large sample size, the difference in scores between the Mathematics Mastery and other schools is “statistically significant” which means the results are very unlikely to be due to chance.’

The section I have emboldened is in stark contrast to the EEF blog post above, which has the title:

‘Today’s findings; impact, no-impact and inconclusive – a normal distribution of findings’

And so suggests exactly the opposite.

I have already shown just how borderline the calculation of ‘statistical significance’ has been.

The release concludes:

‘Of course we’re pleased with the extra progress even after a limited time, but we’re interested in long term change and long term development and improvement.  We’re determined to work with our partner schools to show what’s possible over pupils’ whole school careers…but it’s nice to know we’ve already started to succeed!’

 .

There was a single retweet of the Tweet above, but from a particularly authoritative source (who also sits on Ark’s Advisory Group).

.

Ark Press Release

Ark’s press release – ‘Independent evaluation shows Mathematics Mastery pupils doing better than their peers’ – is even more bullish.

The opening paragraph claims that:

‘A new independent report from the independent Education Endowment Foundation (EEF) demonstrates the success of the Mathematics Mastery programme. Carried out by academics from Cambridge University and the Institute of Education, the data indicates that the programme may have the potential to halve the attainment gap with high performing countries in the far East.’

The second emboldened statement is particularly brazen since there is no evidence in either of the reports that would support such a claim. It is only true in the sense that any programme ‘may have the potential’ to achieve any particularly ambitious outcome.

Statistical significance is again celebrated, though it is important to give Ark credit for adding:

‘…but it is important to note that these individual studies did not reach the threshold for statistical significance. It is only at the combined level across 127 schools and 10,114 pupils that there are sufficient schools and statistical power to determine an effect size of 1 month overall.’

Even if this rather implies that the individual evaluations were somehow at fault for being too small and so not generating ‘sufficient statistical power’.

Then the release returns to its initial theme:

‘… According to the OECD, by age fifteen, pupils in Singapore, Japan, South Korea and China are three years ahead of pupils in England in mathematical achievement. Maths Mastery is inspired by the techniques and strategies used in these countries.

Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, this could be a sustained impact. A 2 month gain every primary year and 1 month gain every secondary year could see pupils more than one and a half years ahead by age 16 – halving the gap with higher performing jurisdictions.’

In other words, Ark extrapolates equivalent gains – eschewing all statistical hedging – for each year of study, adding them together to suggest a potential 18 month gain.

It also seems to apply the effect to all participants rather than to the average participant.
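The arithmetic behind Ark's claim is easy to reconstruct, though the year counts below are my assumption (reception to Year 6, then Year 7 to Year 11), since the release does not spell them out:

```python
# Naive extrapolation of the headline gains over a whole school career.
# Year counts are assumptions: reception to Y6, then Y7 to Y11.
primary_years, secondary_years = 7, 5
total_gain_months = primary_years * 2 + secondary_years * 1

# Summing 2 months per primary year and 1 per secondary year gives
# 19 months -- 'more than one and a half years' -- but this ignores the
# statistical insignificance of each individual result and applies an
# average effect to every pupil.
print(total_gain_months)  # → 19
```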

This must have been a step too far, even for Ark’s publicity machine.

.

[Image: original final paragraph of the Ark press release]

.

They subsequently changed the final paragraph above – which one can still find in the version within Google’s cache – to read:

‘…Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, we expect this to be a sustained impact.  A longer follow-up study will be needed to investigate this.’

Even in sacrificing the misleading quantification, they could not resist bumping up ‘this could be a sustained impact’ to ‘we expect this to be a sustained impact’.

 .

[Postscript: On 25 February, Bank of America Merrill Lynch published a press release announcing a £750,000 donation to Maths Mastery.

The final paragraph ‘About Maths Mastery’ says:

‘Mathematics Mastery is an innovative maths teaching framework, supporting schools, students and teachers to be successful at maths. There are currently 192 Mathematics Mastery partner schools across England, reaching 34,800 pupils. Over the next five years the programme aims to expand to 500 schools, and reach 300,000 pupils. Maths Mastery was recently evaluated by the independent Education Endowment Foundation and pupils were found to be up to two months ahead of their peers in just the first year of the programme. Longer term, this could see pupils more than a year and a half ahead by age 16 – halving the gap with pupils in countries such as Japan, Singapore and China.’

This exemplifies perfectly how such questionable statements are repurposed and recycled with impunity. It is high time that the EEF published a code of practice to help ensure that the outcomes of its evaluations are not misrepresented.]  

.

Conclusion

.

 .

Representing the key findings

My best effort at a balanced presentation of these findings would include the key points below. I am happy to consider amendments, additions and improvements:

  • On average, pupils in primary schools adopting Mathematics Mastery made two months more progress than pupils in primary schools that did not. (This is a borderline result, in that it is only just above the score denoting one month’s progress. It falls to one month’s progress if the effect size is calculated to three decimal places.) The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • On average, pupils in secondary schools adopting Mathematics Mastery made one month more progress than pupils in secondary schools that did not. The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • When the results of the primary and secondary evaluations are combined through meta-analysis, pupils in schools adopting Maths Mastery made one month more progress than pupils in schools that did not. The effect is classified as ‘Low’. This outcome is marginally statistically significant, provided that the 95% confidence interval is calculated to three decimal places (but it is not statistically significant if calculated to two decimal places). Care is needed in analysing meta-analysed findings because… [add explanation]. 
  • There is relatively little evidence that the primary programme is more effective for learners with lower prior attainment, but there is such evidence for the secondary programme (in respect of non-calculator questions). There is no substantive evidence that the secondary programme has a different impact on pupils eligible for free schools meals. 
  • The per-pupil cost is relatively low, but the initial outlay of £6,000 for primary schools with 2FE and above is not inconsiderable. Mathematics Mastery may represent a cost-effective change for schools to consider. 
  • The evaluations assessed the impact of the programme in its first year of adoption. It is not appropriate to draw inferences from the findings above to attribute potential value to the whole programme. EEF will be evaluating the medium and long-term impact of the approach by [outline the methodology agreed].

In the meantime, it would be helpful for Ark and Maths Mastery to be much more transparent about KS1 assessment outcomes across their partner schools and possibly publish their own analysis based on comparison between schools undertaking the programme and matched control schools with similar intakes.

And it would be helpful for all partners to explain and evidence more fully the benefits to high attainers of the Maths Mastery approach – and to consider how it might be supplemented when it does not provide the blend of challenge and support that best meets their needs.

It is disappointing that, three years on, the failure of the National Curriculum Expert Panel to reconcile their advocacy for mastery with stretch and challenge for high attainers – in defiance of their remit to consider the latter as well as the former –  is being perpetuated across the system.

NCETM might usefully revisit their guidance on high attainers in primary schools to reflect their new-found commitment to mastery, while also incorporating additional material covering the point above.

.

Postscript

A summary of this piece, published by Schools Week, prompted two comments – one from Stephen Gorard, the other from Dylan Wiliam. The Twitter embed below is the record of a subsequent debate between us and some others, about design of the Maths Mastery evaluations, what they tell us and how useful they are, especially to policy makers.

One of the tweets contains a commitment on the part of Anna Vignoles to set up a seminar to discuss these issues further.

The widget stores the tweets in reverse order (most recent first). Scroll down to the bottom to follow the discussion in chronological order.

.

.

GP

February 2015

High Attainment in the 2014 Secondary and 16-18 Performance Tables

.

This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.

Data Overload courtesy of opensourceway

It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them:

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to previous posts on:

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.

.

Headlines

At KS4:

  • High attainers constitute 32.3% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based solely on prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English; in 99 schools the same was true of maths. Only 21 schools managed 100% success in both English and maths. At the other extreme there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers does not make the expected progress in maths and one in five does not do so in English. In free schools one in every five high attainers falls short in English, as does one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.

.

Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
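The quoted 2013 methodology amounts to a simple banding rule. A minimal sketch in Python, assuming the average point score is the unweighted mean of the three test scores (the SFR does not spell this out) and using the standard KS2 scale in which level 4 corresponds to 27 points and level 5 to 33:

```python
def ks2_band(english, maths, science):
    """Classify a pupil as low, middle or high attaining from KS2 test
    point scores, following the quoted 2013 thresholds (<24 low,
    24-29.99 middle, 30+ high). The unweighted mean is an assumption."""
    aps = (english + maths + science) / 3
    if aps < 24:
        return "low"
    elif aps < 30:
        return "middle"
    return "high"

# Level 5 (33 points) in every test comfortably clears the high threshold;
# a uniform level 4 (27 points) sits in the middle band.
print(ks2_band(33, 33, 33))  # high
print(ks2_band(27, 27, 27))  # middle

# The 'all-rounder' effect: one exceptional score cannot offset two weak
# ones, so this pupil averages 25 points and is classed as middle attaining.
print(ks2_band(33, 21, 21))  # middle
```

The third call illustrates the point made above: a pupil who is exceptionally strong in one assessment but weak in the others will not qualify as a high attainer under an averaging rule of this kind.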

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations, which:
  • restrict the qualifications counted
  • prevent any qualification from counting as larger than one GCSE
  • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.

.

Secondary outcomes

. 

The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, who together account for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

.

HA sec1

*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Chart 1: Percentage of high attainers by school type, 2011-2014

. 

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high-attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers reminds us that selection by ability through 11-plus tests yields a somewhat different sample from selection exclusively on the basis of KS2 attainment.

. 

Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio colleges say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) at grades A*-C, including GCSEs in English and maths. This compares with 56.6% of all learners. Allowing, of course, for the impact of the 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered 67% or fewer high attainers achieving this outcome, whereas in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.
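As an aside, the ‘one in N’ phrasings used in this post follow directly from the headline percentages; a trivial illustration (my own, not from the source data):

```python
def one_in_n(success_rate_pct):
    """Convert a success rate (%) into the approximate 'one in N fall
    short' phrasing used in the text."""
    failure_rate = 100 - success_rate_pct
    return round(100 / failure_rate)

print(one_in_n(92.8))  # 14: one in fourteen misses 5+ A*-C incl. English and maths
print(one_in_n(93.8))  # 16: one in sixteen misses A*-C in English and maths
```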

. 

A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same that do so on the 5+ A*-C measure above. They are joined by Tong High School. 

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students.
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English and maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).

Comparing achievement of these measures by school type and admissions basis 

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

.

HA sec2 

Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

.

It shows that:

  • There is significant variation on all five measures, though differences are most pronounced for achievement of the EBacc, where there is a 20 percentage point difference between the success rates in sponsored academies (39.2%) and in converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English, as does one in six in maths. These success rates are unacceptably low.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.

.

Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

.

HA sec3

Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

.

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence there is a smaller gap, by and large, between the performance of high attainers in modern and comprehensive schools respectively than there is between high attainers in comprehensive and selective schools respectively.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in both comprehensive and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.

. 

Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return an average grade of A*-, including Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have number of entries for all qualifications and not for GCSE only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinckley.

16-18: A level outcomes

.

A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, reveals how performance on this measure has changed since 2013 for different types of institution and for schools with different admissions arrangements.

.

2013 2014
LA-maintained school 11.4 11.5
Sponsored academy 5.4 5.3
Converter academy 16.4 15.7
Free school* 11.3 16.4
Sixth form college 10.4 10
Other FE college 5.8 5.7
 
Selective school 32.4 32.3
Comprehensive school 10.7 10.5
Modern school 2.0 3.2

Table 3: Percentage of A level students achieving grades AAB or higher with at least two in facilitating subjects, by institution type and admissions basis, 2013 and 2014

The substantive change for free schools will be affected by the inclusion of UTCs and studio schools in that line in 2013 and the addition of city technology colleges and 16-19 free schools in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.

.

Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

.

HA sec4

Chart 4: A level high attainment measures by gender, 2014

.

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As results decline, the lines for boys and girls steadily diverge, because girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

.

HA sec5

Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

.

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold with the final pair.

. 

HA sec6 

Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

.

Chart 7 compares performance by admissions policy in the schools sector on the four measures. Selective schools enjoy a big advantage on all four. More than one in four selective school students achieve at least three A grades, and almost one in three achieves AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the success rates of modern schools, and selective schools roughly three times those of comprehensive schools.
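The rough threefold relationship can be illustrated with the 2014 figures from Table 3 for the AAB+ (two facilitating subjects) measure:

```python
# 2014 figures from Table 3: % achieving AAB+ with two facilitating subjects.
selective, comprehensive, modern = 32.3, 10.5, 3.2

# Ratios between adjacent admission types.
print(round(selective / comprehensive, 1))  # ~3.1x
print(round(comprehensive / modern, 1))     # ~3.3x
```

Both ratios sit close to three on this measure, consistent with the pattern described above, though the exact multiples vary across the other measures in Chart 7.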

. 

HA sec7 

Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

 .

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state-funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, by Colchester Royal Grammar School. This is almost 100 ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013.
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School. This is very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, which exceeds the 2013 low of 97.7 at Appleton Academy.
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. 
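The relationship between APS per entry and average grade can be sketched as follows. This assumes the pre-2016 QCA/performance-table point scale (A* = 300, A = 270, B = 240, C = 210, D = 180, E = 150); the scale is my assumption, not stated in the post:

```python
# Assumed pre-2016 A level point scale (not given in the post itself).
SCALE = [(300, "A*"), (270, "A"), (240, "B"), (210, "C"), (180, "D"), (150, "E")]

def average_grade(aps_per_entry):
    """Return the highest grade whose point threshold the score meets, else U."""
    for points, grade in SCALE:
        if aps_per_entry >= points:
            return grade
    return "U"

print(average_grade(211.2))  # national average per entry -> "C"
print(average_grade(271.1))  # Henrietta Barnett -> "A"
print(average_grade(108.6))  # Hartsdown -> "U"
```

On this scale the figures quoted above are mutually consistent: 211.2 points sits just above the C threshold, 271.1 just above A, and 108.6 falls below E, which matches the average U grade returned by Hartsdown.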

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those recorded in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than in both 2012 and 2013 in converter academies and sixth form colleges.

.

HA sec8

Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

. 

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.

HA sec9

Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014.

 .

Disadvantaged high attainers 

There is nothing in either of the Performance Tables or the supporting SFRs to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected, given the impact of the changes discussed above. But if the 2013 methodology is applied instead, stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened, and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.

.

HA sec10

Chart 10: FSM gaps for headline GCSE measures, 2013-2014

.

Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and that by slightly more or slightly less than one percentage point, depending on which 2014 methodology is selected.

Deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed up by extensive pupil premium funding.

.

HA sec11

Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014.

.

Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012, only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of the gaps to increase, reinforcing the point made above.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.

.

HA sec12

Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014

.

Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it raises the question of why more urgent action was, and is, discounted.
  • Pupil premium is insufficiently targeted at the students/schools that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.

.

Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. This issue is particularly acute in sponsored academies, where one in four or five high attainers is undershooting their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.

.

GP

February 2015