Will Maths Hubs Work?

.

This post takes a closer look at Maths Hubs, exploring the nature of the model, their early history and performance to date.

It reflects on their potential contribution to the education of the ‘mathematically most able’ and considers whether a similar model might support ‘most able education’.

.

.

Background

.

Origins of this post

The post was prompted by the potential connection between two separate stimuli:

  • The Conservative manifesto commitment that:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables….We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

  • My own recent post on Missing Talent (June 2015) which discussed the Sutton Trust/education datalab recommendation that:

‘Schools where highly able pupils currently underperform should be supported through the designation of another local exemplar school

Exemplar schools…should be invited to consider whether they are able to deliver a programme of extra-curricular support to raise horizons and aspirations for children living in the wider area.’

The second led to a brief Twitter discussion about parallels with an earlier initiative during which Maths Hubs were mentioned.

.

.

Links to previous posts

I touched on Maths Hubs once before, in the final section of 16-19 Maths Free Schools Revisited (October 2014) which dealt with ‘prospects for the national maths talent pipeline’.

This reviewed the panoply of bodies involved in maths education at national level and the potential advantages of investing in a network with genuinely national reach, rather than in a handful of new institutions with small localised intakes and limited capacity for outreach:

‘Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

A network-driven approach to talent development might just work…but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while a further £60m may have to be surrendered back to the Treasury.’

Two further posts are less directly relevant but ought to be mentioned in passing:

The second in particular raises questions about the suitability of NCETM’s version of mastery for our high attaining learners, arguing that essential groundwork has been neglected and that the present approach to ‘stretch and challenge’ is unnecessarily narrow and restrictive.

.

Structure of this post

The remainder of this post is divided into three principal sections:

  • Material about the introduction of Maths Hubs and a detailed exploration of the model. This takes up approximately half of the post.
  • A review of the Hubs’ work programme and the progress they have made during their first year of operation.
  • Proposals for Maths Hubs to take the lead in improving the education of mathematically able learners and for the potential introduction of ‘most able hubs’ to support high attainers more generally. I stop short of potential reform of the entire ‘national maths talent pipeline’ since that is beyond the scope of this post.

Since readers may not be equally interested in all these sections I have supplied the customary page jumps from each of the bullet points above and to the Conclusion, for those who prefer to cut to the chase.

.

The introduction of the Maths Hubs model

.

Initial vision

Maths Hubs were first announced in a DfE press release published in December 2013.

The opening paragraph describes the core purpose as improving teacher quality:

‘Education Minister Elizabeth Truss today announced £11 million for new maths hubs to drive up the quality of maths teachers – as international test results showed England’s performance had stagnated.’

The press release explains the Coalition Government’s plans to introduce a national network of some 30 ‘mathematics education strategic hubs’ (MESH) each led by a teaching school.

A variety of local strategic partners will be drawn into each hub, including teaching school alliances, other ‘school and college groupings’, university faculties, subject associations, ‘appropriate’ local employers and local representatives of national maths initiatives.

There is an expectation that all phases of education will be engaged, particularly ‘early years to post-16’.

National co-ordination will fall to the National Centre for Excellence in the Teaching of Mathematics (NCETM), currently run under contract to DfE by a consortium comprising Tribal Education, the UCL Institute of Education, Mathematics in Education and Industry (MEI) and Myscience.

(A 2014 PQ reply gives the value of this contract as £6.827m, although this probably reflects a 3-year award made in 2012. It must have been extended by a further year, but will almost certainly have to be retendered for the next spending review period, beginning in April 2016.

The £11m budget for Maths Hubs is separate and additional. It is not clear whether part of this sum has also been awarded to NCETM through a single tender. There is more information about funding mid-way through this post.)

The press release describes the Hubs as both a national and a school-led model:

‘The network will bring together the emerging national leaders of mathematics education and aim to make school-led national subject improvement a reality.’

These emerging national leaders are assumed to be located in the lead schools rather than elsewhere in the system – at NCETM or in other national organisations.

The policy design is broadly consistent with my personal preference for a ‘managed market’ approach, midway between a ‘bottom-up’ market-driven solution and a centralised and prescriptive ‘top-down’ model.

But it embodies a fundamental tension, arising from the need to reconcile the Government’s national priorities with a parallel local agenda.

If the model is to work smoothly, one set of priorities will almost certainly have to take precedence over the other (and it won’t be the local agenda).

The model is also expected to:

‘…ensure that all the support provided…is grounded in evidence about what works, both in terms of mathematics teaching and the development of teachers of mathematics.’

Each Hub will be expected to provide support for maths education across all other schools in the area, taking in the full spectrum of provision:

  • ‘recruitment of maths specialists into teaching
  • initial training of maths teachers and converting existing teachers into maths [sic]
  • co-ordinating and delivering a wide range of maths continuing professional development (CPD) and school-to-school support
  • ensuring maths leadership is developed, eg running a programme for aspiring heads of maths departments
  • helping maths enrichment programmes to reach a large number of pupils from primary school onwards’.

This is a particularly tall order, both in terms of the breadth of Hubs’ responsibilities and the sheer number of institutions which they are expected to support. It is over-ambitious given the budget allocated for the purpose and, as we shall see, was scaled back in later material.

The press release says that NCETM has already tested the model with five pathfinders.

It adds:

‘The main programme will be robustly evaluated, and if it proves successful in raising the standards of mathematics teaching it may be continued in 2016 to 2017, contingent on future spending review outcomes.’

What constitutes ‘the main programme’ is unclear, though it presumably includes the Hubs’ contribution to national projects, if not their local priorities.

Note that continuation from 2016 onwards is conditional on the outcomes of this evaluation, specifically a directly attributable and measurable improvement in maths teaching standards.

I have been unable to trace a contract for the evaluation, which would suggest that one has not been commissioned. This is rather a serious oversight.

We do not know how NCETM is monitoring the performance of the Hubs, nor do we know what evidence will inform a decision about whether to continue with the programme as a whole.

We have only the most basic details of national programmes in AY2015/16 and no information at all about the Hubs’ longer term prospects.

I asked the Maths Hubs Twitter feed about evaluation and was eventually referred to NCETM’s Comms Director.

I have not made contact because:

  • It is a point of principle that these posts rely exclusively on material already available online and so in the public domain. (This reflects a personal commitment to transparency in educational policy.)
  • The Comms Director wouldn’t have to be involved unless NCETM felt that the information was sensitive and had to be ‘managed’ in some way – and that tells me all I need to know.
  • I am not disposed to pursue NCETM for clarification since they have shown zero interest in engaging with me over previous posts, even though I have expressly invited their views.

.

Selection of the Hubs

Three months later, in March 2014, further details were published as part of the process of selecting the Hubs.

The document has two stabs at describing the aims of the project. The first emphasises local delivery:

‘The aim is to enable every school and college in England, from early years to the post-16 sector, to access locally-tailored and quality support in all areas of maths teaching and learning.’

This continues to imply full national reach, although one might argue that ‘enabling access’ is achieved by providing a Hub within reasonable distance of each institution and does not demand the active engagement of every school and college.

The second strives to balance national priorities and local delivery:

‘The aim of the national network of Maths Hubs will be to ensure that all schools have access to excellent maths support that is relevant to their specific needs and that is designed and managed locally. They will also be responsible for the coordinated implementation of national projects to stimulate improvement and innovation in maths education.’

Note that these national priorities have now become associated with innovation as well as improvement. This is ‘top-down’ rather than ‘school-led’ innovation – there is no specific push for innovative local projects.

At this stage the Hubs’ initial (national) priorities are given as:

  • Leading the Shanghai Teacher Exchange Programme
  • Supporting implementation of the new maths national curriculum from September 2014 and
  • Supporting introduction of new maths GCSEs and Core Maths qualifications in 2015.

The guidance specifies that:

‘Each Maths Hub will operate at a sub-regional or city regional level. The hubs will work with any group of schools or colleges in the area that request support, or who are referred to the hub for support.’

So responsibility for seeking assistance is placed on other schools and colleges and on third parties (perhaps Ofsted or Regional Schools Commissioners?) making referrals – Hubs will not be expected to reach out proactively to every institution in their catchment.

The competition is no longer confined to teaching schools. Any school that meets the initial eligibility criteria may submit an expression of interest. But the text is clear that only schools need apply – colleges are seemingly ineligible.

Moreover, schools must be state-funded and rated Outstanding by Ofsted for Overall Effectiveness, Pupil Achievement, Quality of Teaching and Leadership and Management.

Teaching schools are not expected to submit Ofsted inspection evidence – their designation is sufficient.

The guidance says:

‘We may choose to prioritise expression of interest applications based on school performance, geographical spread and innovative practice in maths education.’

NCETM reported subsequently that over 270 expressions of interest were received and about 80 schools were invited to submit full proposals.

The evidence used to select between these is set out in the guidance. There are four main components:

  • Attainment and progress data (primary or secondary and post-16 where relevant) including attainment data (but not progress data) for FSM pupils (as opposed to ‘ever 6 FSM’).
  • Support for improvement and professional development
  • Leadership quality and commitment
  • Record and capacity for partnership and collaboration

The full text is reproduced below.

.

[Images: the Maths Hub application criteria (captures 1–4)]

It is instructive to compare the original version with the assessment criteria set out for the limited Autumn 2015 competition (see below).

In the updated version applicants can be either colleges or schools. Applicants will be invited to presentation days during which their commitment to mastery will be tested:

‘Applicants will be asked to set out…How they will support the development of mastery approaches to teaching mathematics, learning particularly from practice in Shanghai and Singapore.’

The Maths Hub model may be locally-driven but only institutions that support the preferred approach need apply.

The criteria cover broadly the same areas but they have been beefed up significantly.

The original version indicated that full proposals would require evidence of ‘school business expertise’ and ‘informed innovation in maths education’, but these expectations are now spelled out in the criteria.

Applicants must:

‘Provide evidence of a strong track record of taking accountability for funding and contracting other schools/organisations to deliver projects, including value for money, appropriate use of public funding, and impact.’

They must also:

‘Provide two or three examples of how you have led evidence-informed innovation in maths teaching. Include details of evaluation outcomes.

Provide information about the key strategies you would expect the hub to employ to support effective innovation.

Provide evidence of how you propose to test and implement the teaching maths for mastery approach within the hub. Show how effective approaches will be embedded across all school phases.’

Note that this innovative capacity is linked explicitly with the roll-out of mastery, a national priority.

The new guide explains that action plans prepared by the successful applicants will be ‘agreed by the NCETM and submitted to the DfE for approval’. This two-stage process might suggest that NCETM’s decision-making is not fully trusted. Alternatively, it might have something to do with the funding flows.

No further information was released about issues arising during the original selection process. It seems probable that some parts of the country submitted several strong bids while others generated relatively few or none at all.

It will have been necessary to balance the comparative strength of bids against their geographical distribution, and probably to ‘adjust’ the territories of Hubs where two or more particularly strong bids were received from schools in relatively close proximity.

It is not clear whether the NCETM’s five pathfinders were automatically included.

Successful bidders were confirmed in early June 2014, so the competition took approximately three months to complete.

One contemporary TSA source says that Hubs were ‘introduced at a frantic pace’. A 2-day introductory conference took place in Manchester on 18-19 June, prior to the formal launch in London in July.

Hubs had to submit their action plans for approval by the end of the summer term and to establish links with key partners in readiness to become operational ‘from the autumn term 2014’. (The TSA source says ‘in September’).

.

The Hubs are announced

A further DfE press release issued on 1 July 2014 identified 32 Hubs. Two more were added during the autumn term, bringing the total to 34, although the FAQs on the Maths Hubs website still say that there were only 32 ‘in the first wave’.

This implies that a second ‘wave’ is (or was) anticipated.

An earlier NCETM presentation indicated that 35 hubs were planned but it took a full year for the final vacancy to be advertised.

As noted above, in July 2015, an application form and guide were issued ‘for schools and colleges that want to lead a maths hub in south-east London and Cumbria or north Lancashire.’

The guide explains:

‘There are currently 34 Maths Hubs across England with funding available for a 35th Maths Hub in the North West of England. There is a geographical gap in Cumbria and North Lancashire where previously we were unsuccessful in identifying a suitable school or college to lead a Maths Hub in this area. In addition, after establishing the Maths Hub in first year, the lead school for the London South-East Maths Hub has decided to step down from its role.’

As far as I can establish this is the first time that the original failure to recruit the final Hub in the North-West has been mentioned publicly.

No reason is given for the decision by another lead school to drop out. The school in question is Woolwich Polytechnic School.

The two new Hubs are expected to be operational by November 2015. Applications will be judged by an unidentified panel.

Had the first tranche of Hubs proved extremely successful, one assumes that a second wave would have been introduced in readiness for academic year 2015/16. Perhaps it is necessary to await the outcome of the forthcoming spending review, which would enable a second wave to be introduced from September 2016.

The embedded spreadsheet below gives details of all 34 Hubs currently operating.

.

.

Most lead institutions are schools, the majority of them secondary academies. A couple of grammar schools are involved as well as several church schools. Catholic institutions are particularly well represented.

Two of the London Hubs are led by singleton primary schools and a third by two primary schools working together. Elsewhere one Hub is based in a 14-19 tertiary college and another is led jointly by a 16-19 free school.

Some are hosted by various forms of school partnership. These include notable multi-academy trusts such as the Harris Federation, Outwood Grange Academies Trust and the Cabot Learning Federation.

The difference in capacity between a single primary school and a large MAT is enormous, but the expectations of each are identical, as are the resources made available to implement the work programme. One would expect some correlation between capacity and quality, with smaller institutions struggling to match their larger peers.

No doubt the MATs take care to ensure that all their schools are direct beneficiaries of their Hubs – and the initiative gives them an opportunity to exert influence beyond their own members, potentially even to scout possible additions to the fold.

Fewer than half of the lead schools satisfy the initial eligibility requirements for ‘outstanding’ inspection reports (and sub-grades). In most cases this is because they are academies and have not yet been inspected in that guise.

One lead school – Bishop Challoner Catholic College – received ‘Good’ ratings from its most recent inspection in 2012. Another – Sir Isaac Newton Sixth Form – has been rated ‘Good’ since becoming a lead school.

We do not know why these institutions were included in the original shortlist but, perhaps fortunately, there was no public backlash from better qualified competitors upset at being overlooked.

This map (taken from a presentation available online) shows the geographical distribution of the original 32 Hubs. It is a more accurate representation than the regional map on the Maths Hub website.

Even with the addition of the two latecomers in November 2014 – one in Kent/Medway, the other in Leicestershire – it is evident that some parts of the country are much better served than others.

There is an obvious gap along the East Coast, stretching from the Wash up to Teesside, and another in the far North-West that the new competition is belatedly intended to fill. The huge South-West area is also relatively poorly served.

.

Maths Hubs locations map. 

If the Hubs were evenly distributed to reflect the incidence of schools and colleges nationally, each would serve a constituency of about 100 state-funded secondary schools and 500 state-funded primary schools, so 600 primary and secondary schools in total, not to mention 10 or so post-16 institutions.

Although there is little evidence on which to base a judgement, it seems unlikely that any of the Hubs will have achieved anything approaching this kind of reach within their first year of operation. One wonders whether it is feasible even in the longer term.

But the relatively uneven geographical distribution of the Hubs suggests that the size of their constituencies will vary.

Since schools and colleges are expected to approach their Hubs – and are free to align with any Hub – the level of demand will also vary.

It would be helpful to see some basic statistics comparing the size and reach of different Hubs, setting out how many institutions they have already engaged actively in their work programmes and what proportion are not yet engaged.

It seems likely that several more hubs will be needed to achieve truly national reach. A ratio of around 300 schools per hub might be more feasible, but that would require twice as many hubs. The limited supply of high quality candidates may act as an additional brake on expansion, on top of the availability of funding.
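
As a rough illustration of this arithmetic, here is a short sketch using the per-hub figures above; the national totals of roughly 3,400 state-funded secondary and 17,000 state-funded primary schools are approximations implied by those figures rather than official counts.

```python
# Rough check of the per-hub constituency figures discussed above.
# Assumed national totals (approximate, implied by the 100-secondary /
# 500-primary per-hub figures rather than taken from official DfE counts).
SECONDARY_SCHOOLS = 3_400
PRIMARY_SCHOOLS = 17_000

def schools_per_hub(hubs: int):
    """Schools each hub would serve if the load were spread evenly."""
    secondary = SECONDARY_SCHOOLS / hubs
    primary = PRIMARY_SCHOOLS / hubs
    return secondary, primary, secondary + primary

print(schools_per_hub(34))  # (100.0, 500.0, 600.0) – the current network
print(schools_per_hub(68))  # (50.0, 250.0, 300.0) – a 300-school ratio needs twice as many hubs
```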

.

Hub structure

A presentation given on 27 June 2014 by John Westwell – NCETM’s ‘Director of Strategy Maths Hubs’ – explains Hub structure through this diagram:

.

[Diagram: the NCETM Maths Hubs model]

There is a distinction – though perhaps not very clearly expressed – between the roles of:

  • Strategic partners supporting the lead school with strategic leadership and 
  • Operational partners providing ‘further local leadership and specialist expertise to support [the] whole area’.

It seems that the former are directly involved in planning and evaluating the work programme while the latter are restricted to supporting delivery.

The spreadsheet shows that one of the Hubs – Salop and Herefordshire – fails to mention any strategic partners while another – Jurassic – refers to most of its partners in general terms (eg ‘primary schools, secondary schools’).

The remainder identify between four and 16 strategic partners each. Great North and Bucks, Berks and Oxon are at the lower end of the spectrum. Archimedes NE and Matrix Essex and Herts are at the upper end.

One assumes that it can be a disadvantage to have either too few or too many strategic partners: the former generates too little capacity; the latter risks too many cooks.

All but five Hubs have at least one higher education partner but of course there is no information about the level and intensity of their involvement, which is likely to vary considerably.

Eighteen mention the Further Mathematics Support Programme (FMSP), but only five include the Core Maths Support Programme (CMSP). Six list MEI as a strategic partner and, curiously, three nominate NCETM. It is unclear whether these enjoy a different relationship with the national co-ordinating body as a consequence.

To date, only the London Central and West Hub is allied with Mathematics Mastery, the Ark-sponsored programme.

However, NCETM says:

‘…a growing number of schools around the country are following teaching programmes from Mathematics Mastery an organisation (separate from the NCETM) whose work, as the name suggests, is wholly devoted to this style of learning and teaching. Mathematics Mastery is, in some geographical areas, developing partnership working arrangements with the Maths Hubs programme.’

Mathematics Mastery also describes itself as ‘a national partner of Maths Hubs’.

.

Work Groups

Hubs plan on the basis of a standard unit of delivery described as a ‘work group’.

Each work group is characterised by:

  • a clear rationale for its existence and activity
  • well defined intended outcomes
  • local leadership supported by expert partners
  • a mixture of different activities over time
  • value for money and
  • systematic evidence collection.

The process is supported by something called the NCETM ‘Work Group Quality Framework’, which I have been unable to trace. This too should be published.

The most recent description of the Hubs’ role is provided by the Maths Hubs Website, which did not appear until November 2014.

The description of ‘What Maths Hubs Are Doing’ reinforces the distinction between:

  • ‘National Collaborative Projects, where all hubs work in a common way to address a programme priority area and
  • Local projects, where hubs work independently on locally tailored projects to address the programme priorities.’

The earlier material includes a third variant:

  • Local priorities funded by other means

But these are not mentioned on the website and it is not clear whether they count as part of the Hubs’ official activity programme.

The spreadsheet shows that the number of work groups operated by each Hub varies considerably.

Four of them – North West One, White Rose, South Yorkshire and London South East – fail to identify any work groups at all.

In the case of White Rose there are links to courses and a conference, but the others include only a generic description of their work programme.

Two further Hubs – Enigma and Cambridge – refer readers to their websites, neither of which contain substantive detail about the Work Groups they have established (though Enigma lists a range of maths CPD opportunities and courses).

Otherwise the number of work groups varies between two (East Midlands South) and 11 (Surrey Plus). Fifteen of the Hubs have six or fewer work groups while nine have eight or more.

This suggests that some Hubs are far more productive and efficient than others, although the number of work groups is not always a reliable indicator, since some Hubs appear to categorise one-off events as work groups, while others reserve the term for longer-term projects.

Maybe the Quality Framework needs attention, or perhaps some Hubs are not following it properly.

.

The network defined

To coincide with the launch NCETM published its own information page on Maths Hubs, now available only via archive.

This describes in more detail how the Hubs will be expected to function as a network:

‘…the Maths Hubs will also work together in a national network co-ordinated by the NCETM. The network will ensure that effective practice from within particular hubs is shared widely. It will also provide a setting for Maths Hubs and the NCETM to collaboratively develop new forms of support as needed.

The national network will also come together, once a term, in a regular Maths Hubs Forum, where there will be opportunity to evaluate progress, plan for the future, and to engage with other national voices in maths education, such as the Joint Mathematical Council, the Advisory Committee on Mathematics Education (ACME), the DfE, and Ofsted.’ This is shown in the diagram below:

.

[Diagram: the NCETM national network]

Whether this is genuinely ‘school-led system-wide improvement’ is open to question, relying as it does on central co-ordination and a funding stream provided by central government. It is more accurately a hybrid model that aims to pursue national and local priorities simultaneously.

Essentially Hubs have a tripartite responsibility:

  • To develop and co-ordinate practice within their own Hub.
  • To collaborate effectively with other Hubs.
  • Collectively to contribute to the national leadership of maths education.

The sheer complexity of this role – and the level of expectation placed on the Hubs – should not be under-estimated.

The archived NCETM page identifies three core tasks for the Hubs as they operate locally:

  • Identify needs and agree priorities for support in their area. This could involve pro-active surveying of schools; responding to requests and referrals; and considering the implications of national evidence.
  • Co-ordinate a range of high quality specialist mathematics support to address the needs. This could include communicating existing support and extending its reach; commissioning external organisations to provide bespoke support; developing and enabling new forms of support and collaboration.
  • Critically evaluate the quality and impact of the support provided. This could include gathering immediate, medium-term and long-term feedback from participants engaging with support; and more detailed evaluative research used to test innovations.’

We have no information about the extent and quality of cross-fertilisation between Hubs. This seems to depend mainly on the termly attendance of the leads at the Forum meetings, supported through social media interaction via Twitter. There is also some evidence of regional collaboration, though this seems much better developed in some regions than others.

The July 2015 newsletter on the Maths Hub Website says:

‘An added feature of the second year of the Maths Hubs programme will be more collaboration between Maths Hubs, typically bringing a small group of hubs together to pool experience, maybe in the development of a new project, or in the wider implementation of something that’s already worked well in a single hub.’

This may suggest that the collaborative dimension has been rather underplayed during the first year of operation. If it is to be expanded it may well demand additional teacher time and funding.

In the Westwell presentation the model is described as a ‘fully meshed network’ (as opposed to a hub and spoke model) in which ‘all the nodes are hubs’.

Unusually – and in contrast to the DfE press releases – there is explicit recognition that the Hubs’ core purpose is to improve pupil outcomes:

‘Resolute focus on pupils’ maths outcomes:

  • improved levels of achievement
  • increased levels of participation
  • improved attitudes to learning
  • closing the gaps between groups’

They also support school/college improvement:

‘Determined support for all schools/colleges to improve:

  • the teaching of mathematics
  • the leadership of mathematics
  • the school’s mathematics curriculum ‘

Any evaluation would need to assess the impact of each Hub against each of these seven measures. Once again, the level of expectation is self-evident.

. 

Termly Forums and Hub leads

Very little information is made available about the proceedings of the termly Maths Hub Forum, where the 34 Hub leads convene with national partners.

The Maths Hubs website says:

‘At the national level, the Maths Hubs programme, led by the NCETM, is developing partnership working arrangements with organisations that can support across the Maths Hubs network. At the moment, these include:

Other partnership arrangements will be developed in due course.’

There is no further information about these national partnership agreements, especially the benefits accruing to each partner as a consequence.

We know that one Forum took place in October 2014, another in February 2015. We do not know the full list of national partners on the invitation list.

There should be another Forum before the end of summer term 2015, unless the London Maths Hub Conference was intended to serve as a replacement.

The guide to the competition for two new Hubs mentions that the Autumn 2015 Forum will take place in York on 4/5 November.

The July Bespoke newsletter says:

‘…the 34 Maths Hub Leads, who meet termly, will continue to pool their thoughts and experiences, developing a growing and influential voice for mathematics education at a national level.’ 

It is hard to understand how the Forum can become ‘an influential voice’ without a significantly higher profile and much greater transparency over proceedings.

The Maths Hubs website should have a discrete section for the termly forums which contains all key documents and presentations.

In March 2015, NCETM’s Westwell published a post on the NCTL Blog claiming early signs of success for the Hubs:

‘Even though we are less than 2 terms into embedding a new, collaborative way of working, we are seeing encouraging signs that leadership in mathematics education can be shared and spread within geographical areas.’

He continues:

‘Our vision is of a national, collective group of leaders exerting new, subject-specific influence across school phases and across geographical boundaries.

The essential professional characteristics of this group are that they know, from first-hand experience:

  • how maths is best taught, and learnt
  • how good maths teachers are nurtured
  • how high-quality ongoing professional development can help good teachers become excellent ones

They have shown the capacity to lead others in all of these areas.’

And he adds:

‘The maths hub leads also come together in a regular national forum, which allows them to exchange practice but also provides a platform for them to enter into dialogue with policy makers and key national bodies. Over time, we expect that maths hub leads will come to be recognised nationally as leaders of mathematics education.’

This highlights the critical importance of the Maths Hub leads to the success of the model. One assumes that the post-holders are typically serving maths teachers who undertake this role alongside their classroom and middle management responsibilities.

It seems highly likely that most Hub leads will not remain in post for more than two or three years. All will be developing highly transferable skills. Many will rightly see the role as a stepping stone to senior leadership roles.

Unless NCETM can offer strong incentives for Hub leads to remain in post, it will find turnover a persistent problem.

.

Funding

There is no information about funding on the Maths Hubs Website and details are extremely hard to find, apart from the total budget of £11m, which covers the cost of Hubs up to the end of FY2015-16.

Each Hub receives core operational funding as well as ‘funding on a project basis for local and national initiatives’.

I found an example of an action plan online. The notes provide some details of the annual budget for last financial year:

‘For the financial year 2014/15, each hub will receive £36,000 to cover the structural costs of the hub including the cost of: the Maths Lead time (expected minimum 1 day/week) and Hub Administrator time (expected minimum 1.5 days/week); the time provided by the Senior Lead Support and the strategic leadership group; identifying and developing operational partner capacity; engaging schools/colleges and identifying their support needs. It is possible to transfer some of the £36,000 to support hub initiated activities.

For the financial year 2014/15, Maths Hubs will receive £40,000 to support hub-initiated activity. As explained at the forum we are using the term “Work Groups” to cover all hub-initiated activity…The cost of the exchange element with the Shanghai teachers will be paid from central national project funds and is outside of the £40,000 budget.’

Another source (a presentation given at the launch of the Norfolk and Suffolk Hub) suggests that in 2014-15 Hubs also received a further £20,000 for national projects.

Hence the maximum budget per Hub in FY2014/15 was £96,000. Assuming all 34 received that sum, the total cost was £3.264m (34 x £96K).

We do not know how much more was set aside for central costs, although DfE’s Supplementary Estimates for 2014-15 hint that the total budget might have been £3.7m, which would suggest a balance of £0.436m was spent on central administration.
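
For clarity, the arithmetic behind these figures can be reconstructed as follows; it uses only the sums quoted above, and treats the £3.7m reading of the Supplementary Estimates as indicative rather than confirmed.

```python
# FY2014-15 funding per hub, pieced together from the sources quoted above (all figures in £).
CORE_OPERATIONAL = 36_000   # structural costs: Maths Lead and administrator time, etc.
HUB_INITIATED = 40_000      # hub-initiated 'Work Group' activity
NATIONAL_SHARE = 20_000     # contribution to national projects (Norfolk and Suffolk source)
HUBS = 34

per_hub = CORE_OPERATIONAL + HUB_INITIATED + NATIONAL_SHARE   # 96,000
total_to_hubs = per_hub * HUBS                                # 3,264,000

# If the £3.7m hinted at in the Supplementary Estimates is the whole FY2014-15 budget,
# the balance left for central administration would be roughly:
central_balance = 3_700_000 - total_to_hubs                   # 436,000
print(per_hub, total_to_hubs, central_balance)
```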

The NCETM website presently lists a Director and no fewer than six Assistant Directors responsible for Maths Hubs – roughly one director for every five hubs. On the face of it, this does not fit the image of a school-led network. Indeed it suggests that the Hubs require intensive central support.

I could find nothing at all about the size of the budget for 2015-16. The Norfolk and Suffolk launch presentation indicates that Hubs will enjoy additional funding for both running costs and projects but does not quantify this statement. Another source suggests that the time allocation for Hub leads will be increased to 0.5FTE.

There is no information about funding levels in the guide to the autumn 2015 competition, although it suggests that the money will come in two separate streams:

‘Each Maths Hub will receive direct funding for structural operational purposes and funding on a project basis for local and national projects.’

It may be that the operational funding is paid via NCTL and the project funding via NCETM.

One assumes that operational funding will need to be uprated by at least 33% for 2015-16 since it will cover a full financial year rather than July to March inclusive (9 months only).

If the funding for local and national projects is increased by the same amount, that would bring the sum per Hub in FY2015-16 to approximately £128,000 and the total budget to something like £5m.
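
Again, a sketch of that projection, scaling the nine months of 2014-15 funding up to a full financial year; these are my own rough estimates rather than published figures.

```python
# FY2015-16 projection: scale nine months' funding (July 2014 to March 2015)
# up to a full financial year. All figures are the estimates set out above.
PER_HUB_2014_15 = 96_000
CENTRAL_2014_15 = 436_000   # the central balance estimated above
HUBS = 34
UPLIFT = 12 / 9             # twelve months rather than nine (roughly +33%)

per_hub_2015_16 = PER_HUB_2014_15 * UPLIFT               # ~128,000 per hub
hubs_total = per_hub_2015_16 * HUBS                      # ~4.35m across 34 hubs
overall_total = hubs_total + CENTRAL_2014_15 * UPLIFT    # ~4.9m, i.e. 'something like £5m'
print(round(per_hub_2015_16), round(hubs_total), round(overall_total))
```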

It would be helpful to have rather more transparency about Hub budgets and the total sum available to support them in each financial year.

If the NCETM operation needs retendering for FY2016-17 onwards, one assumes that national co-ordination of the Hubs will form part of the specification. One might expect to see a tender early next academic year.

.

Hubs’ Current Activity

Developing role 

The press release marking the launch was strongly focused on Hubs’ role in leading what was then called the Shanghai Teacher Exchange Programme:

‘A national network of maths hubs that will seek to match the standards achieved in top-performing east Asian countries – including Japan, Singapore and China – was launched today by Education Minister Elizabeth Truss…

These ‘pace-setters’ will implement the Asian-style mastery approach to maths which has achieved world-leading success….Hubs will develop this programme with academics from Shanghai Normal University and England’s National Centre for Excellence in the Teaching of Maths (NCETM)….

… The Shanghai Teacher Exchange programme will see up to 60 English-speaking maths teachers from China embedded in the 30 maths hubs, starting this autumn term.

The Chinese teachers will run master classes for local schools and provide subject-specific on-the-job teacher training.

Two leading English maths teachers from each of the 30 maths hubs will work in schools in China for at least a month, to learn their world-class teaching approaches. The teachers will then put into practice in England what they have learnt and spread this widely to their peers.’

It also mentioned that the Hubs would be supporting the Your Life campaign to inspire young people, especially girls, to study maths and physics.

‘The campaign, led by businesses, aims to increase the number of students taking maths and physics A level by 50% over the next 3 years.’

Moreover:

‘They will also work with new maths and physics chairs, PhD graduates being recruited to become teachers to take their expertise into the classroom and transform the way the maths and physics are taught.’

The Website describes three National Collaborative Projects in slightly different terms:

  • England-China is the new title for the Shanghai Teacher Exchange. Primary sector exchanges took place in 2014/15 and secondary exchanges are scheduled for 2015/16.

The aim of the project is described thus:

‘The aim, as far as the English schools are concerned, is to learn lessons from how maths is taught in Shanghai, with particular focus on the mastery approach, and then research and develop ways in which similar teaching approaches can be used in English classrooms

…The long-term aim of the project is for the participating English schools first to develop a secure mastery approach to maths teaching themselves, and then to spread it around partner schools.’

  • Textbooks and Professional Development involves two primary schools from each Maths Hub trialling adapted versions of Singapore textbooks with their Year 1 classes.

Each school has chosen one of two mastery-focused textbooks: ‘Inspire Maths’ and ‘Maths – No Problem’. Teachers have five days’ workshop support.

  • Post-16 Participation is intended to increase participation rates in A level maths and further maths courses as well as Core Maths and other Level 3 qualifications. Some hubs are particularly focused on girls’ participation.

The initial phase of the project involves identifying schools and colleges that are successful in this respect, itemising the successful strategies they have deployed and exploring how those might be implemented in schools and colleges that have been rather less successful.

.

Progress to date on National Collaborative Projects 

Coverage of the National Projects on the Hubs website is heavily biased towards the England-China project, telling us comparatively little about the other national priorities.

A group of 71 primary teachers visited Shanghai in September 2014. Return visits from 59 Shanghai teachers took place in two waves, in November 2014 and February/March 2015. 

A list of 47 participating schools is supplied, including the hubs to which they belong.

There is also a Mid-Exchange Report published in November 2014, a press release from February 2015 marking the arrival of the second wave and the first edition of Bespoke, a Maths Hub newsletter dating from April 2015, which is exclusively focused on mastery.

The latter describes the exchanges as:

‘…the start of a long-term research project, across all of the Maths Hubs, to investigate ways in which mastery approaches can be introduced to maths lessons, to the way teachers design lessons, and to how schools organise time-tables, and the deployment of teachers and teaching assistants.’

These descriptions suggest something rather different to the slavish replication of Shanghai-style mastery, anticipating a ‘secure mastery approach’ that might nevertheless have some distinctive English features.

But NCETM has already set out in some detail the principles and key features of the model they would like to see introduced, so rather less is expected of the Hubs than one might anticipate. They are essentially a testbed and a mechanism for the roll-out of a national strategy.

The website also indicates that, before the end of summer term 2015:

‘…the NCETM, working through the Maths Hubs will publish support materials for assessment of the depth of pupils’ knowledge within the context of a mastery curriculum.’

NCETM describes the materials as a collaborative venture involving several partners:

‘Recording progress without levels requires recording evidence of depth of understanding of curriculum content, rather than merely showing pupils can ‘get the answers right’.

The NCETM, working with other maths experts and primary maths specialists from the Maths Hubs, is currently producing guidance on how to do this for the primary maths National Curriculum. For each curriculum statement, the guidance will show how to identify when a pupil has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

This is not yet published and, if NCETM is sensible, it will wait to see the outcomes of the parallel Commission on Assessment Without Levels.

The Bespoke newsletter mentions in passing that further research is needed into the application of mastery teaching in mixed age classes, but no further details are forthcoming.

Information about the planned secondary exchange is also rather thin on the ground.

NCETM said in June that the programme would focus on teaching at the KS2/3 transition.

The second edition of Bespoke, published in July 2015 adds:

‘Primary schools that hosted Shanghai teachers in 2014/15 will continue to develop and embed teaching for mastery approaches, and, in addition, two teachers from secondary schools in each Maths Hub will visit Shanghai in September, with their counterparts returning to work in Key Stage 3 classrooms in November 2015.’

The same is true of the Textbooks project, which was announced in a ministerial speech given in November 2014. Very little detail has been added since.

The July edition of Bespoke says that the project:

‘…will be expanded, to take in more schools and more classes, including Year 2 pupils’

while another section offers only the briefest commentary on progress in the first year (repeated twice, as the extract below shows):

.

[Image: extract from the July 2015 edition of Bespoke]

Coverage of the Post-16 Participation project is similarly sparse, though this may be because the lead lies with the Further Mathematics Support Programme and Core Maths Support Programme.

July’s Bespoke says of Year 2:

‘Work to help schools and colleges increase the numbers of Year 12 and Year 13 students taking A level maths, and, among them, more girls, will continue. Approaches that bore fruit in some hubs this year will be implemented in other areas.’

The sketchiness of this material causes one to suspect that – leaving aside the Shanghai exchanges – progress on these national projects has been less than spectacular during the first year of the Hubs’ existence.

Even with the England-China project there is no published specification for the long-term research project that is to follow on from the exchanges.

Those working outside the Hubs need more information to understand and appreciate what value the Hubs are adding.

.

New National Collaborative Projects

The July edition of Bespoke confirms two further National Projects.

One is snappily called ‘Developing 140 new Primary Mathematics Teaching for Mastery specialists’:

‘Closely linked to other work on mastery, this project will involve the training of four teachers in each Maths Hub area to become experts in teaching for mastery in their own classrooms, and in supporting the similar development of teachers in partner schools.’

This project appeared to be a national programme-in-waiting when it was first announced in April 2015.

A subsequent NCETM press release confirmed that there were over 600 applicants for the available places.

The further details provided by NCETM reveal that participants will pursue a two-year course. Year One combines three two-day residential events with the leadership of teacher research groups, both in the teacher’s own school and for groups of teachers in neighbouring schools.  Year Two is devoted exclusively to these external teacher research groups.

The material explains that a research group is:

‘…a professional development activity attended by a group of teachers, with a specific focus on the design, delivery and learning within a jointly evaluated mathematics lesson.’

A FAQ document explains that a typical research group meeting is a half-day session with discussion taking place before and after a lesson observation.

The four external group meetings in Year One will together constitute a pilot exercise. In Year Two participants will lead up to five such groups, each meeting on six occasions. Groups will typically comprise five pairs of teachers drawn from five different schools.

Release time is 12 days in Year One and up to 30 days in Year Two (assuming the participant leads the maximum five research groups).

Training and supply costs are covered in Year One, but in Year Two they are to be met by charging the other participants in the research groups – a first indication that Hubs will be expected to generate their own income stream from the services they provide. (NCETM will provide ‘guidance’ on fee levels.)

Participants are expected to develop:

  • ‘Understanding of the principles of mastery within the context of teaching mathematics.
  • Deep subject knowledge of primary mathematics to support teaching for mastery.
  • The development of effective teaching techniques to support pupils in developing mastery of mathematics.
  • The ability to assess pupils for mastery.
  • The ability to support other teachers, and lead teacher research groups.’

The intention is that teachers completing the course will roll out further phases of professional development and:

‘Over time, this will spread the understanding of, and expertise in, teaching maths for mastery widely across the primary school system.’
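
The potential scale of that roll-out can be sketched from the figures quoted above. This is an upper bound only, assuming every specialist leads the maximum five research groups and that no school participates more than once.

```python
# Rough upper bound on the reach of the 140 mastery specialists in Year Two,
# using only the figures quoted above.
SPECIALISTS = 140          # four teachers per Maths Hub area
GROUPS_PER_LEAD = 5        # up to five research groups each in Year Two
SCHOOLS_PER_GROUP = 5      # five pairs of teachers from five different schools
TEACHERS_PER_GROUP = 10    # i.e. five pairs

schools_reached = SPECIALISTS * GROUPS_PER_LEAD * SCHOOLS_PER_GROUP     # 3,500
teachers_reached = SPECIALISTS * GROUPS_PER_LEAD * TEACHERS_PER_GROUP   # 7,000
print(schools_reached, teachers_reached)
# Set against roughly 17,000 state-funded primary schools, that is at most
# about a fifth of the sector in the first full cycle.
```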

The second new national project is called ‘Mathematical Reasoning’. Bespoke is typically uninformative:

‘A new project will start in September 2015, to trial ways of developing mathematical reasoning skills in Key Stage 3 pupils.’

This may or may not be related to an NCETM Multiplicative Reasoning Professional Development Programme which took place in 2013/14 with the assistance of the Hubs.

This:

‘focused on developing teachers’ understanding and capacity to teach topics that involved multiplicative reasoning to Key Stage 3 (KS3) pupils. Multiplicative reasoning refers to the mathematical understanding and capability to solve problems arising from proportional situations often involving an understanding and application of fractions as well as decimals, percentages, ratios and proportions.’

Some 60 teachers from 30 schools were organised into three regional professional development networks, each with a professional development lead and support from university researchers. Project materials were created by a central curriculum development team. The regional networks were hosted by Maths Hubs, presumably in their pilot phase.

In June 2015 DfE published a project Evaluation featuring a Randomised Controlled Trial (RCT). Unfortunately, this did not reveal any significant impact on pupil attainment:

‘During the timescale of the trial (13 October 2014 to May 2015) the programme did not have any statistically significant impacts on general mathematical attainment as measured by PiM tests or on items on the tests specifically associated with multiplicative reasoning’.

One of the Report’s recommendations is:

‘For the NCETM to make available MRP materials and approaches to teaching MR through the Maths Hub network’

Another:

‘That the NCETM seeks further opportunities to engage curriculum developers with Maths Hubs and other NCETM activities and potentially to develop future curriculum design projects that address the needs of teachers, schools and pupils’.

With five national collaborative projects rather than three, the work programme in each Hub during Academic Year 2015/16 will be more heavily biased towards the Government’s agenda, unless there is also additional funding to increase the number of local projects. There is no hint in the latest Bespoke newsletter that this is the case.

.

Local projects 

Unfortunately, Hub-specific pages on the Maths Hubs Website do not distinguish national from local projects.

A regional breakdown offers some insight into the typical distribution between the two and the range of issues being addressed.

The embedded spreadsheet provides further details, including links to additional information on each work group where the Hubs have made this available.

  • South West: The four Hubs between them identify 27 work groups. Each Hub has a work group for each of the three initial national collaborative projects. Relatively unusual topics include maths challenge and innovation days and improving primary maths enrichment experiences. The Jurassic Hub includes amongst its list of generic focus areas ‘developing access for the most able’, but there is no associated work group.
  • West Midlands: Two of the three hubs have six work groups and the third has seven. Here there is rather less adherence to the national priorities with only the North Midlands and Peaks Hub noticeably engaged with the mastery agenda. One work group is addressing ‘strategies for preventing (closing) the gap’ in maths. It is disturbing that this is unique across the entire programme – no other region appears concerned enough to make this a priority, nor is it a national project in its own right.
  • North West: Of the three Hubs, one has provided no details of its work groups, one lists six and the other nine. Perhaps the most interesting is North West Two’s Maths App Competition. This involves Y5 and 6 pupils creating ‘a maths-based app for a particular area of weakness that they have identified’.
  • North East: The two North East Hubs have nine and eight work groups respectively. Both address all three initial national priorities. In one the remaining groups are designed to cover the primary, secondary and post-16 sectors respectively. In the other there is a very strong mastery bias with two further work groups devoted to it.
  • Yorkshire and Humberside: Only two of the four Hubs provide details of their work groups in the standard format. One offers eight, the other four. The less ambitious Yorkshire and the Humber Hub does not include any of the three national priorities but addresses some topics not found elsewhere, including Same Day Intervention and Differentiation. In contrast, Yorkshire Ridings covers all three national priorities and runs a local project offering £500 bursaries for small-scale action research projects.
  • East Midlands: Two of the Hubs identify six work groups but the third – one of the two late additions – has only two, neither of them focused on the national priorities. Elsewhere, only East Midlands East has a work group built around the Shanghai exchanges. Otherwise, network focused work groups – whether for primary specialists, subject leaders or SLEs – are dominant.
  • East: Two of the four Hubs provide links to their own websites, which are not particularly informative. The others name nine and five work groups respectively. The former – Matrix Essex and Herts – includes all three initial national priorities, but the latter – Norfolk and Suffolk – includes only increasing post-16 participation. Matrix has a local project to enhance the subject knowledge of teaching assistants. 
  • South East: The five Hubs vary considerably in the number of work groups they operate, ranging between three and 11. Bucks, Berks and Oxon is the least prolific, naming only the three national priorities. At the other extreme, Surrey Plus is the most active of all 34 Hubs, though several of its groups appear to relate to courses, conferences and other one-off meetings. One is providing ‘inspiration days for KS2, KS3 and KS4 students in schools looking to improve attitudes towards maths’. 
  • London: Of the six London Hubs, one has provided no information about its work groups. Two of the remaining five have only three work groups. Of these, London Central and NW lists the three national priorities. The other – London Central and West – mentions the two mastery-related national programmes and then (intriguingly) a third project called ‘Project 4’! London Thames includes a Student Commission Project:

‘Students will become researchers over two days and will explore the difference between depth and acceleration in terms of students’ perceptions of progress. There will be support from an expert researcher to support them in bringing together their findings. They will present their findings at the Specialist Schools and Academy’s Trust (SSAT) Conference and other forums where they can share their experience.’

Unfortunately, the presentation given at this event suggests the students were unable to produce a balanced treatment, carefully weighing up the advantages and disadvantages of each approach and considering how they might be combined to good effect. Naturally they came up with the ‘right’ answer for NCETM!

The variation in the productivity of Hubs is something of a surprise. So are the different levels of commitment they display towards the NCETM’s mastery-focused agenda.

Does NCETM push the laggards to work harder and conform to its priorities, or does it continue to permit this level of variance, even though it will inevitably compromise the overall efficiency of the Maths Hub programme?

.

Supporting the Most Able

.

Through the Maths Hubs 

In 2013, NCETM published guidance on High Attaining Pupils in Primary Schools (one has to register with NCETM to access these materials).

This is strongly influenced by ACME’s Report ‘Raising the bar: developing able young mathematicians’ (December 2012) which defines its target group as:

‘…those students aged 5-16 who have the potential to successfully study mathematics at A level or equivalent’.

ACME bases its report on three principles:

  • ‘Potential heavy users of mathematics should experience a deep, rich, rigorous and challenging mathematics education, rather than being accelerated through the school curriculum.
  • Accountability measures should allow, support and reward an approach focused on depth of learning, rather than rewarding early progression to the next Key Stage.
  • Investment in a substantial fraction of 5-16 year olds with the potential to excel in mathematics, rather than focussing attention on the top 1% (or so), is needed to increase the number of 16+ students choosing to study mathematics-based subjects or careers.’

ACME in turn cites Mathematical Association advice from the previous year on provision for the most able in secondary schools.

It is fascinating – though beyond the scope of this post – to trace through these publications and subsequent NCETM policy the evolution of an increasingly narrow and impoverished concept of top-end differentiation.

The line taken in NCETM’s 2013 guidance is still relatively balanced:

‘It’s probably not helpful to think in terms of either enrichment or acceleration, but to consider the balance between these two approaches. Approaches may vary depending on the age of children, or the mathematics topics, while there may be extra-curricular opportunities to meet the needs of high attaining children in other ways. In addition to considerations of which approach supports the best learning, there are practical issues to consider.’

This is a far cry from the more extreme position now being articulated by NCETM, as discussed in my earlier post ‘A digression on breadth, depth, pace and mastery’.

There is in my view a pressing need to rediscover a richer and more sophisticated vision of ‘stretch and challenge’ for high attaining learners in maths and, by doing so, to help to achieve the Conservative manifesto commitment above. This need not be inconsistent with an Anglicised mastery model; indeed, it ought to strengthen it significantly.

One obvious strategy is to introduce a new National Collaborative Project, ensuring that all 34 Hubs are engaged in developing this vision and building national consensus around it.

Here are some suggested design parameters:

  • Focus explicitly on improving attainment and progress, reducing underachievement by high attaining learners and closing gaps between disadvantaged high attainers and their peers.
  • Develop interventions targeted directly at learners, as well as professional development, whole school improvement and capacity building to strengthen school-led collaborative support.
  • Emphasise cross-phase provision encompassing primary, secondary and post-16, devoting particular attention to primary/secondary and secondary/post-16 transition.
  • Develop and disseminate effective practice in meeting the needs of the most able within and alongside the new national curriculum, including differentiated support for those capable of achieving at or beyond the equivalent of KS2 Level 6 in scaled score terms and at or beyond Grade 9 GCSE at KS4.
  • Develop, test and disseminate effective practice in meeting the needs of the most able through a mastery-driven approach, exemplifying how breadth, depth and pace can be combined in different proportions to reflect high attainers’ varying needs and circumstances.

.

Through ‘Most Able Hubs’

Compared with Maths Hubs, the Sutton Trust’s recommendation – that designated schools should support those that are underperforming with the most able and consider providing a localised extra-curricular enrichment programme – is markedly unambitious.

And of course the Maths Hubs cannot be expected to help achieve Conservative ambitions for the other elements of STEM (let alone STEAM).

Why not introduce a parallel network of Most Able Hubs (MAHs)? These would follow the same design parameters as those above, except that the last would embrace a whole school/college and whole curriculum perspective.

But, in the light of the analysis above, I propose some subtle changes to the model.

  • Number of hubs

Thirty-four is not enough for genuine national reach. But the supply of potential hubs is constrained by the budget and the number of lead institutions capable of meeting the prescribed quality criteria.

Assuming that the initial budget is limited, one might design a long-term programme that introduces the network in two or even three phases. The first tranche would help to build capacity, improving the capability of those intending to follow in their footsteps.

The ideal long-term outcome would be to introduce approximately 100 MAHs, at least 10 per region and sufficient for each to support some 200 primary and secondary schools (170 primary plus 30 secondary) and all the post-16 institutions in the locality.

That might be achieved in two phases of 50 hubs apiece or three of 33-34 hubs apiece.
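
The coverage arithmetic is easily checked. Here is a minimal sketch in Python; the national school counts are my own rough approximations for state-funded schools in England in 2015, not figures taken from this post.

```python
# Rough coverage check for a 100-hub network (illustrative only).
# The national school counts are my own approximations for state-funded
# schools in England in 2015, not figures taken from this post.

NUM_HUBS = 100
PRIMARY_PER_HUB = 170
SECONDARY_PER_HUB = 30

primary_capacity = NUM_HUBS * PRIMARY_PER_HUB      # 17,000 primary schools
secondary_capacity = NUM_HUBS * SECONDARY_PER_HUB  # 3,000 secondary schools

# Approximate national totals: ~16,800 state-funded primaries and ~3,400
# secondaries, so 100 hubs supporting ~200 schools each is roughly the scale
# required for genuine national reach.
print(f"Capacity: {primary_capacity} primary, {secondary_capacity} secondary")
```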

  • Quality threshold

In the first instance, MAHs would be selected on the basis of Ofsted evaluation – Outstanding overall and for the same sub-categories as Maths Hubs – and high-attaining pupil performance data, relating to attainment, progress and destinations. This should demonstrate a strong record of success with disadvantaged high attainers.

One of the inaugural national collaborative projects (see below) would be to develop and trial a succinct Quality Measure and efficient peer assessment process, suitable for all potential lead institutions regardless of phase or status.

This would be used to accredit all new MAHs, but also to re-accredit existing MAHs every three years. Those failing to meet the requisite standard would be supported to improve.

  • Three tiers and specialism

MAHs would operate at local and national level but would also collaborate regionally. They might take it in turns to undertake regional co-ordination.

Each would pursue a mix of national, regional and local priorities. The regional and local priorities would not replicate national priorities but MAHs would otherwise have free rein in determining them, subject to the approval of action plans (see below).

Each MAH would also be invited to develop a broader specialism which it would pursue in national and regional settings. MAHs from different regions with the same specialism would form a collaborative. The selected specialism might be expected to inform to some extent the choice of local priorities.

  • Strategic partnerships

Each MAH would develop a variety of local strategic partnerships, drawing in other local school and college networks, including TSAs, MATs, local authority networks, maths and music hubs; local universities, their faculties and schools of education; nearby independent schools;  local commercial and third sector providers; and local businesses with an interest in the supply of highly skilled labour. Some partners might prefer to engage at a regional level.

SLEs with a ‘most able’ specialism would be involved as a matter of course and would be expected to play a leading role.

National bodies would serve as national strategic partners, sitting on a National Advisory Group and contributing to the termly national forum.

Participating national bodies would include: central government and its agencies; national organisations, whether third sector or commercial, supporting the most able; and other relevant national education organisations, including subject associations and representative bodies.

Termly forums would be used to monitor progress, resolve issues and plan collaborative ventures. All non-sensitive proceedings would be published online. Indeed a single website would publish as much detail as possible about the MAHs: transparency would be the watchword.

  • Work Groups

Each MAH would agree an annual action plan applying the work group methodology to its national, regional and local priorities. Each priority would entail a substantive work programme requiring significant co-ordinated activity over at least two terms.

An additional work group would capture any smaller-scale local activities (and MAHs might be permitted to use a maximum of 10% of their programme budget for this purpose).

MAHs’ progress against their action plans – including top level output and outcome targets – would be assessed annually and the results used to inform the re-accreditation process.

The programme as a whole would be independently evaluated and adjusted if necessary to reflect the findings from formative evaluation.

  • Staffing and funding

MAHs would operate with the same combination of co-ordinator, SLT sponsor and administrator roles, but with the flexibility to distribute these roles between individuals as appropriate. Hubs would be encouraged to make the lead role a full-time appointment.

Co-ordinators would constitute a ‘network within a network’, meeting at termly forums and supporting each other through an online community (including weekly Twitter chats) and a shared resource base.

Co-ordinators would be responsible for devising and running their own induction and professional development programme and ensuring that new appointees complete it satisfactorily. Additional funding would be available for this purpose. The programme would be accredited at Masters level.

Assuming a full year budget of £160K per MAH (£60K for structural costs; £100K for work groups), plus 10% for central administration, the total steady-state cost of a 100-MAH network would be £17.6m per year, not much more than the £15m that Labour committed during the General Election campaign. If the programme were phased in over three years, the annual cost would be significantly lower during that period.
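
For transparency, the arithmetic behind that estimate is set out below as a minimal sketch in Python. All of the figures are the working assumptions stated above, not official costings.

```python
# Illustrative arithmetic only, using the assumptions stated above;
# none of these figures is an official costing.

PER_HUB = 60_000 + 100_000   # structural costs plus work group programme, per MAH
ADMIN_RATE = 0.10            # central administration, as a share of hub costs

def annual_cost(num_hubs):
    """Total annual cost in pounds for a network of the given size."""
    return num_hubs * PER_HUB * (1 + ADMIN_RATE)

print(f"Steady state (100 MAHs): £{annual_cost(100) / 1e6:.1f}m")   # £17.6m

# Phasing in three roughly equal tranches keeps early-year costs well below that.
for year, hubs in enumerate([34, 67, 100], start=1):
    print(f"Year {year} ({hubs} hubs): £{annual_cost(hubs) / 1e6:.1f}m")
```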

MAHs might be encouraged to generate income to offset against their structural costs. The co-ordinators’ salary and on-costs might be the first priority. In time Hubs might be expected to meet these entirely from income generated, so reducing the overall cost by almost a third.

In an ideal world, MAHs would also support a parallel programme providing long-term intensive support to disadvantaged high attainers funded through a £50m pupil premium topslice.

The overall cost is significant, but bears comparison with the substantial sums invested in some selective 16-19 free schools, or the £50m recently set aside for School Cadet Forces. Maybe funding for MAHs should also be drawn from the fines levied on the banks!

MAHs would support learners from YR-Y13 and have a genuinely national reach, while free schools can only ever impact significantly on a very limited annual intake plus those fortunate enough to benefit from any localised outreach activity. In short MAHs offer better value for money.

.

Conclusion

The principal findings from this review are that:

  • Maths Hubs offer a potentially workable model for system-wide improvement in the quality of maths education which could help to secure higher standards, stronger attainment and progress. But expectations of the Hubs are set too high given the limited resource available. It is doubtful whether the present infrastructure is strong enough to support the Government’s ambition to make England the best place in the world to study maths (in effect by 2020).
  • Given the dearth of information it is very difficult to offer a reliable assessment of the progress made by Maths Hubs in their first year of operation. The network has managed to establish itself from scratch within a relatively short time and with limited resources, but progress appears inconsistent, with some Hubs taking on and achieving much more than others. Two of the first three national collaborative projects still seem embryonic and the England-China project seems to be making steady rather than spectacular progress.
  • There are some tensions and weaknesses inherent in the model. In particular it relies on the successful reconciliation of potentially competing national and local priorities. There is evidence to suggest that national priorities are dominating at present. The model also depends critically on the capability of a small group of part-time co-ordinators. Several are likely to have limited experience and support, as well as insufficiently generous time allocations. Many will inevitably progress to school leadership positions so turnover will be a problem. An independent evaluation with a formative aspect would have been helpful in refining the model, ironing out the shortcomings and minimising the tensions. The apparent failure to commission an evaluation could become increasingly problematic as the expectations placed on the Hubs are steadily ratcheted upwards.
  • The supply of information is strictly rationed; the profile of Maths Hubs is far too low. Because the quality and quantity of information is so limited, those not working inside the network will infer that there is something to hide. Institutions that have not so far engaged with the Hubs will be less inclined to do so. If external communication is wanting, that may suggest that intra-Hub communication is equally shaky. Effective communication is critical to the success of such networks and ought to be given much higher priority. The Maths Hub website ought to be a ‘one stop shop’ for all stakeholders’ information needs, but it is infrequently updated and poorly stocked. Transparency should be the default position.
  • If the Government is to ‘create more opportunities to stretch the most able’ while ensuring that all high attainers ‘are pushed to achieve their potential’, then Maths Hubs will need to be at the forefront of a collective national improvement effort. NCETM should be making the case for an additional national collaborative project with this purpose. More attention must be given to shaping how the evolving English model of maths mastery provides stretch and challenge to high attainers, otherwise there is a real risk that mastery will perpetuate underachievement, so undermining the Government’s ambitions. In PISA 2012, 3.1% of English participants achieved Level 6 compared with 30.8% of those from Shanghai, while the comparative percentages for Levels 5 and 6 were 12.4% and 55.4% respectively. NCETM should specify now what they would consider acceptable outcomes for England in PISA 2015 and 2018 respectively.
  • Maths Hubs cannot extend their remit into the wider realm of STEM (or potentially STEAM if arts are permitted to feature). But, as Ofsted has shown, there are widespread shortcomings in the quality of ‘most able education’ more generally, not least for those from disadvantaged backgrounds. I have already made the case for a targeted support programme to support disadvantaged high attainers from Year 7 upwards, funded primarily through an annual pupil premium topslice. But the parallel business of school and college improvement might be spearheaded by a national network of Most Able Hubs with a whole school/college remit. I have offered some suggestions for how the Maths Hubs precedent might be improved upon. The annual cost would be similar to the £15m committed by Labour pre-election.

If such a network were introduced from next academic year then, by 2020, the next set of election manifestos might reasonably aim to make Britain the best place in the world for high attaining learners, especially high attaining learners from disadvantaged backgrounds.

And, with a generation of sustained effort across three or four successive governments and universal commitment in every educational setting, we might just make it….

What do you think the chances are of that happening?

Me too.

.

GP

July 2015

Missing Talent

.

This post reviews the Sutton Trust’s Research Brief ‘Missing Talent’, setting it in the context of the Trust’s own priorities and the small canon of research on excellence gaps in the English education system.

It is structured as follows:

  • Background on what has been published and my own involvement in researching and debating these issues.
  • Analysis of the data-driven substance of the Research Brief.
  • Analysis of the recommendations in the Research Brief and their fit with previous recommendations contained in the Sutton Trust’s Mobility Manifesto (September 2014).
  • Commentary on the quality of the Research Brief, prospects for the adoption of these recommendations and comparison with my own preferred way forward.

.

Background

‘Missing Talent’ was prepared for The Sutton Trust by education datalab, an offshoot of the Fischer Family Trust (FFT).

The project was announced by education datalab in March (my emphases):

‘This is a short piece of research to explore differences in the secondary school experiences of highly able children from deprived backgrounds, compared to others. Its purpose is to identify whether and why some of the top 10% highest attaining children at the end of primary school do not achieve their full potential at age 16…

…For this group of highly able children we will:

  • describe the range of different GCSE outcomes they achieve
  • show their distribution across local authorities and different types of schools
  • explore whether there is any evidence that different types of high attaining children need to be differentially catered for within our education system

We hope our research will be able to suggest what number and range of qualifications schools should plan to offer students in this group. We may be able to identify parts of the country or particular types of schools where these students are not currently reaching their potential. We will be able to show whether highly able children from particular backgrounds are not currently reaching their full potential, with tentative suggestions as to whether school or home support are mostly contributing to this underperformance.’

On 2 June 2015, The Sutton Trust published:

  • An Overview summarising the key findings and recommendations
  • A Press Release ‘Over a third of clever but poor boys significantly underachieve at GCSE’ and
  • A guest blog post – Advancing the able – authored by Rebecca Allen, education datalab director. This also appears on the education datalab site.

The post is mostly about the wider issue of the priority attached to support for high attainers. It contains a gratifying reference to ‘brilliant blogger Gifted Phoenix’, but readers can rest assured that I haven’t pulled any punches here as a consequence!

The press release provided the substance of the ensuing media coverage, including pieces by the BBC, Guardian, Mail, Schools Week and TES.

There was limited commentary on social media since release of the Research Brief coincided with publication of the legislation underpinning the new Conservative Government’s drive for academisation. I commented on Twitter.

.

.

Just prior to publication and at extremely short notice I was asked by Schools Week for a comment that foregrounded references to the pupil premium.

This was in their coverage:

“I wholeheartedly support any action to reinforce effective practice in using pupil premium to support ‘the most able disadvantaged’.

“Ofsted is already taking action, but this should also be embedded in pupil premium reviews and become a higher priority for the Education Endowment Foundation.

“Given their close relationship, I hope the Sutton Trust will pursue that course. They might also publicly oppose Teach First proposals for redistributing pupil premium away from high and middle attainers and engage more directly with those of us who are pursuing similar priorities.”

For those who are unaware, I have been campaigning against Teach First’s policy position on the pupil premium, scrutinised in this recent post: Fisking Teach First’s defence of its pupil premium policy (April 2015). This is also mentioned in the Allen blog post.

I have also written extensively about excellence gaps, provisionally defined as:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

This appears in a two-part review of the evidence base published in September 2014.

I have drawn briefly on that material in the commentary towards the end of this post.

.

Research Brief findings

.

Main findings, definitions and terminology

The Research Brief reports its key findings thus:

  • 15% of highly able pupils who score in the top 10% nationally at age 11 fail to achieve in the top 25% at GCSE 
  • Boys, and particularly pupil premium eligible boys, are most likely to be in this missing talent group 
  • Highly able pupil premium pupils achieve half a grade less than other highly able pupils, on average, with a very long tail to underachievement 
  • Highly able pupil premium pupils are less likely to be taking GCSEs in history, geography, triple sciences or a language

These are repeated verbatim in the Trust’s overview of research, but are treated slightly differently in the press release, which foregrounds the performance of boys from disadvantaged backgrounds:

‘Over a third (36%) of bright but disadvantaged boys seriously underachieve at age 16, new Sutton Trust research reveals today. Clever but poor girls are slightly less likely to underperform, with just under a quarter (24%) getting disappointing GCSE results. These figures compare with 16% of boys and 9% of girls from better off homes who similarly fall behind by age 16.’

The opening paragraph of the Brief describes ‘highly able’ learners as those achieving within the top decile in KS2 tests. This is a measure of prior attainment, not a measure of ability, and it would have been better if the document had referred to high attainers throughout.

There is also a curious and cryptic reference to this terminology:

‘…following Sutton Trust’s previously used notion of those ‘capable of excellence in school subjects’’

which is not further explained (though ‘capable’ implies a measure of ability rather than attainment).

The analysis is based on the 2014 GCSE cohort and is derived from ‘their mark on each KS2 test paper they sat in 2009’. It therefore depends on high average performance across statutory tests of English, maths and (presumably) science.

The single measure of GCSE performance is achievement on the Attainment 8 measure, as defined in 2014. This has not been made available through the 2014 Secondary Performance Tables.

Essentially Attainment 8 comprises English and maths (both double-weighted), any three EBacc subjects and three other approved qualifications (the Brief says they must be GCSEs).

The measure of ‘missing talent’ is derived from the relationship between these two performance measures. It comprises those who fall within the top decile at KS2 but outside the top quartile nationally (ie the top 25%) at KS4.

There is no explanation or justification for the selection of these two measures, why they are pitched differently and why the difference between them has been set at 15 percentage points.

The text explains that some 7,000 learners qualify as ‘missing talent’, about 15% of all highly able learners (so the total of all highly able learners must approach 47,000).
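
To make the classification rule and that back-calculation explicit, here is a minimal sketch in Python. The helper names and the percentile convention are mine, not education datalab’s.

```python
# Sketch of the Brief's classification, as I read it. Percentile ranks are
# assumed to run from 0 (lowest) to 100 (highest); the helper names are mine,
# not education datalab's.

def is_highly_able(ks2_percentile):
    """Top decile of average marks across the 2009 KS2 test papers."""
    return ks2_percentile >= 90

def is_missing_talent(ks2_percentile, attainment8_percentile):
    """Highly able at KS2 but outside the top quartile on Attainment 8 in 2014."""
    return is_highly_able(ks2_percentile) and attainment8_percentile < 75

print(is_missing_talent(93, 60))   # True: top decile at KS2, outside top 25% at KS4

# Back-calculating the size of the highly able cohort from the figures quoted:
missing_talent = 7_000
share_of_highly_able = 0.15
print(round(missing_talent / share_of_highly_able))   # ~46,700 highly able learners
```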

The analysis is based on certain presumptions about consistency of progress between key stages. The brief says, rather dismissively:

‘Progress through school is not always smooth and predictable. Of course some children do well at primary school but are overtaken by peers who thrive at secondary school.’

It does not mention education datalab’s own analysis which shows that only 45% of learners make the expected linear progress between KS2 and KS3 and just 33% do so between KS3 and KS4. It would have been interesting and useful to have seen material about inconsistency of progress amongst this cohort.

Presumably the selection of top decile at KS2 but top quartile at KS4 is intended in part to compensate for this effect.

The main body of the Research Brief provides analysis of four topics:

  • The characteristics of the ‘missing talent’ subset – covering gender, ethnic background and socio-economic disadvantage.
  • Performance on the Attainment 8 measure of ‘missing talent’ from disadvantaged backgrounds compared with their more advantaged peers.
  • Take up of EBacc subjects by this population, including triple science.
  • The geographical distribution of ‘missing talent’ between local authorities and schools.

The sections below deal with each of these in turn.

.

The characteristics of ‘missing talent’

The ‘missing talent’ population comprises some 7,000 learners, so about 1 in 7 of all highly able learners according to the definition deployed.

We are not provided with any substantive information about the characteristics of the total highly able cohort, so are unable to quantify the differences between the composition of that and the ‘missing talent’ subset.

However we are told that the ‘missing talent’ group:

  • Is slightly more likely to be White British, Black Caribbean, Pakistani or Bangladeshi and somewhat less likely to be Chinese, Indian or African.
  • Includes 1,557 learners (943 boys and 614 girls) who are disadvantaged. The measure of disadvantage is ‘ever 6 FSM’, the basis for the receipt of pupil premium on grounds of deprivation. This is approximately 22% of the ‘missing talent’ group.
  • Includes 24% of the ‘ever 6 FSM’ girls within the highly able cohort compared with 9% of others; and includes 36% of ‘ever 6 FSM’ boys within the highly able cohort compared with 16% of others.

Hence: ‘ever 6 FSM’ learners of both genders are more likely to be part of ‘missing talent’; boys are more likely than girls to be included, regardless of socio-economic status; and ‘ever 6 FSM’ boys are significantly more likely to be included than ‘ever 6 FSM’ girls.

.

[Chart reproduced from the Research Brief]

.

The fact that 36% of ‘ever 6 FSM’ boys fall within the ‘missing talent’ group is described as ‘staggering’.

By marrying the numbers given with the percentages in the charts above, it seems that some 5,180 of the total highly able population are disadvantaged – roughly 11% – so both disadvantaged boys and girls are heavily over-represented in the ‘missing talent’ subset (some 30% of the total disadvantaged population are ‘missing talent’) and significantly under-represented in the total ‘highly able’ cohort.
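
The 5,180 figure can be reconstructed from the numbers quoted above. A minimal sketch of that back-calculation (illustrative arithmetic only):

```python
# Reconstructing the disadvantaged share of the highly able cohort from the
# figures quoted in the Brief (illustrative arithmetic only).

mt_boys_fsm6, mt_girls_fsm6 = 943, 614      # disadvantaged pupils within 'missing talent'
capture_boys, capture_girls = 0.36, 0.24    # share of highly able FSM6 pupils captured

fsm6_boys = mt_boys_fsm6 / capture_boys     # ~2,620 highly able FSM6 boys
fsm6_girls = mt_girls_fsm6 / capture_girls  # ~2,560 highly able FSM6 girls
fsm6_total = fsm6_boys + fsm6_girls         # ~5,180

highly_able_total = 47_000                  # implied by the 7,000 / 15% figures above

print(f"{fsm6_total:.0f} disadvantaged, or {fsm6_total / highly_able_total:.0%} of the highly able cohort")
print(f"{(mt_boys_fsm6 + mt_girls_fsm6) / fsm6_total:.0%} of them fall within 'missing talent'")
```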

By comparison, the 2014 Secondary Performance Tables show that 26.9% of the overall 2014 GCSE cohort in state-funded schools are disadvantaged (though this includes children in care).

There is no analysis to show whether there is a particular problem with white working class boys (or any other sub-groups for that matter) although that might be expected.

.

Attainment 8 performance

Attainment 8 is described as ‘the Government’s preferred measure’, although the Conservative manifesto proposals for a ‘compulsory EBacc’ will almost certainly change its nature significantly, even if Attainment 8 is not supplanted by the EBacc altogether.

The document supplies a table showing the average grade (points equivalents) for different percentiles of the ‘highly able FSM6’, ‘highly able not FSM6’ and ‘not highly able’ populations.

.

[Table reproduced from the Research Brief: average Attainment 8 grade (points equivalents) by percentile]

.

Median (50th percentile) performance for ‘highly able FSM6’ is 6.7, compared with 7.2 for ‘highly able not FSM6’ and 5.0 for ‘not highly able’.

The commentary translates this:

‘…they [‘highly able FSM 6’] score 4As and 4Bs when their equally able classmates from better off backgrounds get straight As’.

By analogy, the ‘not highly able’ group are achieving straight Cs.
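
As a sanity check on those translations, recall that the 2014 Attainment 8 points scale ran from G = 1 to A* = 8, so C = 5, B = 6 and A = 7. A minimal sketch, ignoring the double weighting of English and maths for simplicity:

```python
# Checking the grade translations against the 2014 Attainment 8 points scale
# (G=1 ... C=5, B=6, A=7, A*=8). The double weighting of English and maths is
# ignored here, so these averages are only approximate.

POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def average_points(grades):
    return sum(POINTS[g] for g in grades) / len(grades)

print(average_points(["A"] * 4 + ["B"] * 4))   # 6.5 - close to the FSM6 median of 6.7
print(average_points(["A"] * 8))               # 7.0 - 'straight As'
print(average_points(["C"] * 8))               # 5.0 - the 'not highly able' median
```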

However, there is also a ‘long tail of underachievement’ amongst the highly able disadvantaged:

‘One in ten of the poor but clever pupils are barely achieving C grades (or doing much worse) and at this end of the distribution they are lagging their non-FSM6 peers by almost a whole GCSE grade per subject.’

The latter is actually only true at the 95th percentile.

By comparison, at that point in the distribution, the ‘not highly able’ population are achieving 8 F grades.

So there is a clear excellence gap between the Attainment 8 performance of the highly able and the highly able disadvantaged, though the difference only becomes severe at the extreme of the distribution – the reference to a ‘long tail’ is perhaps a little overdone.

.

Take-up of EBacc subjects

A second table shows the distribution of grades for ‘highly able FSM6’ and ‘highly able not FSM6’ across the five EBacc components: English, maths, sciences, humanities and languages.

.

[Table reproduced from the Research Brief: grade distributions across the five EBacc components]

This is not discussed extensively in the text, but it reveals some interesting comparisons. For example, the percentage point excellence gaps between the two populations at GCSE grades A*/A are: maths 17 points; English 16 points; sciences 22 points; humanities 21 points; and languages 18 points.

At the other extreme 23% of ‘highly able FSM6’ are Ungraded in languages, as are 16% in humanities. This is particularly worrying if true, but Ungraded almost certainly includes those not entered for an appropriate examination.

The commentary says that ‘almost a quarter will not be taking a language at GCSE’, which might suggest that U is a misnomer. It is not clear whether the U category includes both non-takers and ungraded results, however.

The Government’s plans for ‘compulsory EBacc’ seem likely to force all learners to take a language and history or geography in future.

They will be less likely to make triple science compulsory for high attainers, though this is deemed significant in the document:

‘Just 53% of the highly able FSM6 pupils take triple sciences, compared to 69% of those not in the FSM6 category. This may be through choice or because they are in one of the 20% of schools that does not offer the curriculum. Here again the differences are stark: 20% of highly able FSM6 pupils are in a school not offering triple sciences, compared to just 12% of the highly able not-FSM6 pupils.’

The EBacc does not itself require triple sciences. The implications for teacher supply and recruitment of extending them into the schools that do not currently offer them are not discussed.

.

Geographical distribution of ‘missing talent’

At local authority level the Brief provides a list of 20 areas with relatively high ‘missing talent’ and 20 areas at the other extreme.

The bulk of the former are described as areas where secondary pupil performance is low across the attainment spectrum, but four – Coventry, Lambeth, Leicester and Tower Hamlets – are good overall, so the underachievement of high attainers is apparently exceptional.

Some are described as having comparatively low populations of highly able learners but, as the text implies, that should not be an excuse for underachievement amongst this cohort.

It is not clear whether there is differential performance in respect of disadvantaged learners within the ‘missing talent’ group (though the sample sizes may have been too low to establish this).

It is, however, immediately noticeable that the list of areas with high ‘missing talent’ includes many of the most disadvantaged authorities, while the list with low levels of missing talent is much more ‘leafy’.

Most of the former are located in the Midlands or the North. Almost all were Excellence in Cities areas.

The ‘low missing talent’ list also includes 11 London boroughs, but there are only three on the ‘high missing talent’ list.

The Brief argues that schools with low levels of ‘missing talent’ might support others to improve. It proposes additional selection criteria including:

  • ‘A reasonable number of highly able pupils’ – the rather arbitrary cut-off specified is 7% of cohort. It is not clear whether this is the total cohort or only the GCSE cohort. If the latter, it is more than likely to vary from year to year.
  • ‘Relatively low levels of missing talent’ – fewer than 10% ‘significantly underperform’. It is not clear but one assumes that the sole measure is that described above (ie not within the top 25% on the Attainment 8 measure).
  • ‘A socially mixed intake’ with over 10% of FSM6 learners (this is very low indeed compared with the average for the 2014 GCSE cohort in state-funded schools of 26.9%. It suggests that most of the schools will have relatively advantaged intakes.)
  • Triple science must be offered and the schools must have ‘a positive Progress 8 score overall’ (presumably so that they perform reasonably well across the attainment spectrum).

There is no requirement for the school to have achieved a particular Ofsted rating at its most recent inspection.

We are told that there are some 300 schools meeting this description, but no details are given about their distribution between authorities and regions, beyond the fact that:

‘In half of the 20 local authorities with the highest levels of missing talent there is no exemplar school and so a different policy approach may have to be taken.’

This final section of the document becomes a little discursive, stating that:

‘Any new initiatives to support highly able children at risk of falling behind must recognise the successes and failures of past ‘Gifted and Talented’ initiatives, particularly those of the Blair and Brown governments.’

And

‘We believe that any programme of support – whether through the curriculum or through enrichment – must support schools and children in their localities.’

No effort is made to identify these successes and failures, or to provide evidence to substantiate the belief in localised support (or to explain exactly what that means).

.

Recommendations

.

In the Research Brief

The Research Brief itself consists largely of data analysis, but proffers a brief summary of key findings and a set of policy recommendations.

It is not clear whether these emanate from the authors of the research or have been superimposed by the Trust, but the content distinctly suggests the latter.

There are four recommendations (my emphases):

  • ‘The Government should implement the recommendations of Sutton Trust’s Mobility Manifesto to develop an effective national programme for highly able state school pupils, with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress.
  • All schools must be made accountable for the progress of their most able pupils. These pupils should have access to triple sciences and must study a broad traditional curriculum, including a language and humanity, that widens their future educational opportunities. The Government should report the (3-year average) Progress 8 figures for highly able pupils in performance tables. Schools where highly able pupils currently underperform should be supported through the designation of another local exemplar school. In the small number of areas where there is no exemplary good practice, a one-off centralised support mechanism needs to be set-up.
  • Exemplar schools already successfully catering for highly able pupils that are located in areas of high missing talent should be invited to consider whether they are able to deliver a programme of extra-curricular support to raise horizons and aspirations for children living in the wider area.
  • Highly able pupils who receive Pupil Premium funding are at high risk of underperforming at age 16. Schools should be encouraged to use the Pupil Premium funding for these pupils to improve the support they are able to give them.’

These are also repeated unchanged in the research overview, but are summarised and rephrased slightly in the press release.

Instead of demanding ‘an effective national programme…with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress’ this calls on the Government to:

‘…establish a new highly able fund to test the most effective ways of improving the progress and attainment of highly able students in comprehensive schools and to show that the needs of highly able students, especially those from low and middle income backgrounds, are placed high on the national policy agenda.’

This is heavily redolent of Labour’s pre-election commitment to introduce a Gifted and Talented Fund which would establish a new evidence base and help schools’ ‘work in stretching the most able pupils’.

My own analysis of Labour’s commitment (March 2015) drew attention to similarities between this and The Sutton Trust’s own Mobility Manifesto (September 2014).

.

In the Mobility Manifesto

The Manifesto is mentioned in the footnotes to the press release. It offers three recommendations pertaining to highly able learners:

  • ‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.
  • Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.
  • Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

The press release also mentions the Trust’s Sutton Scholars Scheme, a pilot programme undertaken with partner universities that supports highly able learners from low and middle income backgrounds during KS3.

In 2013 there was an initial pilot with 100 pupils involving UCL. In 2014 this was extended to 400 pupils and four partner universities: UCL, Cambridge, Nottingham and Warwick.

The press release says it currently reaches 500 pupils but still involving just four universities, so this is presumably the size of the 2015 cohort.

The programmes at each institution are subtly different but all involve a mix of out-of-school activities. In most cases they appear to be rebadging elements of the universities’ existing outreach programmes; there is nothing startlingly innovative or radical about them.

.

Commentary

.

Quality of the Research Brief

The document is compressed into three sides of A4 so, inevitably, much valuable information is missing. Education datalab should consider making available a separate annex containing all the underlying data that can be released without infringing data protection rules.

The Brief does not address all the elements set out in the original project description. It does not show the distribution of high attainers by type of school, or discuss the impact on underperformance of home and school respectively, nor does it:

‘…explore whether there is any evidence that different types of high attaining children need to be differentially catered for within our education system’.

It seems that the project has been scaled back compared with these original intentions, whether for lack of useful data or some other reason.

When it comes to the findings that are included:

  • The general conclusions about underachievement, particularly amongst high attainers from disadvantaged backgrounds, add something to our understanding of achievement patterns and the nature of excellence gaps. But the treatment also raises several questions that remain unanswered. The discussion needs reconciling with education datalab’s own findings about the limited incidence of linear progress. Further analysis of the performance of high-attaining disadvantaged boys may be a particular priority.
  • The findings on the take-up of EBacc subjects are relatively unsurprising and second order by comparison. They ought really to have been set in the context of the new Government’s commitment to a ‘compulsory EBacc’ (see below).
  • The information about the distribution of ‘missing talent’ is compromised by the very limited analysis, especially of the distribution between schools. The criteria used to identify a subset of 300 exemplar schools do not bear close scrutiny.

There is no cross-referencing to the existing evidence base on excellence gaps, especially the material relating to whether disadvantaged high attainers remain so in ‘The Characteristics of High Attainers’ (DfES 2007), ‘Performing against the odds: developmental trajectories of children in the EPPSE 3-16 study’ (Siraj-Blatchford et al, 2011) and ‘Progress made by high-attaining children from disadvantaged backgrounds’ (Crawford et al 2014).

.

Prospects for the adoption of these recommendations

The recommendation that schools should be more strongly encouraged to use the pupil premium to benefit these learners – and to do so effectively – is important, but the text should explain how this can be achieved.

Ofsted has already made the case for action, concluding in March 2015 that two-thirds of non-selective secondary schools are not yet using pupil premium effectively to support disadvantaged high attainers.

Ofsted is committed to ensuring that school inspections focus sharply on the progress of disadvantaged high attainers and that future thematic surveys investigate the effective use of pupil premium to support them.

It is also preparing a ‘most able’ evaluation toolkit that will address this issue. This might provide a basis for further guidance and professional development, as long as the material is high quality and sufficiently detailed.

Effective provision for high attainers should be a higher priority for the pupil premium champion and, as I have already suggested, should feature prominently and explicitly in the guidance supporting pupil premium reviews.

Above all, the EEF should be supporting research on this topic as part of a wider initiative to help schools close excellence gaps.

All parties, including the Government, should make clear their opposition to the policy of Teach First and its Fair Education Alliance to double-weight pupil premium for low attainers at the expense of high and middle attaining recipients.

If at all possible, Teach First should be persuaded to withdraw this misguided policy.

It seems highly probable that the Trust’s recommendation for access to ‘a broad traditional curriculum’ will be secured in part through the new Government’s commitment to make EBacc subjects compulsory.

This is likely to be justified on grounds of social justice, derived from the conviction that taking these subjects supports progression to post-16 education, employment and higher education.

But that notion is contested. When the Education Select Committee considered this issue they concluded (my emphasis):

‘We support the Government’s desire to have greater equality of opportunity for all students, and to improve the attainment of those eligible for free school meals. The evidence is unclear as to whether entering more disadvantaged students for EBac subjects would necessarily make a significant contribution to this aim. Concentrating on the subjects most valued for progression to higher education could mean schools improve the attainment and prospects of their lowest-performing students, who are disproportionately the poorest as well. However, other evidence suggests that the EBac might lead to a greater focus on those students on the borderline of achieving it, and therefore have a negative impact on the most vulnerable or disadvantaged young people, who could receive less attention as a result. At the same time, we believe that the EBac’s level of prescription does not adequately reflect the differences of interest or ability between individual young people, and risks the very shoe-horning of pupils into inappropriate courses about which one education minister has expressed concerns. Given these concerns, it is essential that the Government confirms how it will monitor the attainment of children on free school meals in the EBac.’

This policy will not secure universal access to triple science, though it seems likely that the Government will continue to support that in parallel.

In the final days of the Coalition government, a parliamentary answer said that:

‘Out of 3,910 mainstream secondary schools in England with at least one pupil at the end of key stage four, 2,736 schools entered at least one pupil for triple science GCSEs in 2013/14. This figure does not include schools which offered triple science GCSEs, but did not enter any pupils for these qualifications in 2013/14. It also excludes those schools with no pupils entered for triple science GCSEs but where pupils have been entered for all three of GCSE science, GCSE further science and GCSE further additional science, which together cover the same content as GCSE triple science.

The Government is providing £2.6 million in funding for the Triple Science Support Programme over the period 2014-16. This will give state funded schools with low take up of triple science practical support and guidance on providing triple science at GCSE. The support comprises professional development for teachers, setting up networks of schools to share good practice and advice on how to overcome barriers to offering triple science such as timetabling and lack of specialist teachers.’

The Conservative manifesto said:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables…We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

Continued emphasis on triple science seems highly likely, although this will contribute to wider pressures on teacher supply and recruitment.

The recommendation for an additional accountability measure is sound. There is after all a high attainer measure within the primary headline package, though it has not yet been defined beyond:

‘x% of pupils achieve a very high score in their age 11 assessments’.

In its response to consultation on secondary accountability arrangements, the previous government argued that high attainment would feature in the now defunct Data Portal intended to support the performance tables.

It will be important to ensure consistency between primary and secondary measures. The primary measure seems to be based on attainment rather than progress. The Sutton Trust seems convinced that the secondary equivalent should be a progress measure (Progress 8) but does not offer any justification for this.

It is also critical that the selected measures are reported separately for disadvantaged and all other learners, so that the size of the excellence gap is explicit.

.

Prospects for a new national programme

When it comes to the recommendation for a new national programme, the Trust needs to be clearer and more explicit about the fundamental design features.

The recommendations in the Mobility Manifesto and this latest publication are not fully consistent. No effort is made to cost these proposals, to identify the budgets that will support them, or to make connections with the Government’s wider education policy.

Piecing the two sets of recommendations together, it appears that:

  • The programme would cater exclusively for the top decile of high attainers in the state-funded secondary sector. Post-16 institutions and selective schools may or may not be included.
  • Participation would be determined entirely on the basis of KS2 test outcomes, but it is not clear whether learners would remain within the programme regardless of subsequent progress.
  • The programme would comprise two parallel arms – one providing support directly for learners, the other improving the quality of provision for them within their schools and colleges.
  • The support for learners is not defined, but would presumably draw on existing Trust programmes. It would include ‘extra-curricular support to raise horizons and aspirations’.
  • It is not entirely clear whether this support would be available exclusively to those from disadvantaged backgrounds (though we know it would be ‘accessible to every state-funded secondary school serving areas of disadvantage’).
  • The support for schools and colleges will develop and test effective practice in teaching these learners and in tracking and maximising their attainment and progress. It will provide associated professional development. It is not clear whether this will extend into other dimensions of effective whole school provision.
  • Delivery will be via some combination of a network of universities, a cadre of exemplar schools and other partners with expertise. The interaction between these different providers is not discussed.
  • The exemplar schools will be designated as such and will support other schools in their locality where high attainers under-achieve. They should also be ‘invited to consider’ delivering a programme of extra-curricular support for learners in their area.
  • There will also be an unspecified ‘one-off centralised support mechanism’ for areas with no exemplary schools. What this means is a mystery.
  • Costs will be met from a new ring-fenced ‘highly able fund’ the size of which is not quantified.

The relationship between this programme and the Trust’s proposed ‘Open Access Scheme’ – which would place high attaining students in independent schools – is not discussed. (I will not repeat again my arguments against this Scheme.)

The realistic prospect of securing a sufficiently large ring-fenced pot must be negligible in the present funding environment. Labour’s pre-election commitment to find some £15m (annually?) for this purpose is unlikely to be matched by the Conservatives.

Any support for improving the quality of provision in schools is likely to be found within existing budgets, including those supporting research, professional development, teaching schools, their alliances and their designated Specialist Leaders of Education.

STEM-related initiatives are particularly relevant given the Manifesto reference. One would hope for a systematic and co-ordinated approach rather than the piecemeal introduction of new projects.

I have elsewhere suggested a set of priorities including:

  • Guidance and associated professional development on effective whole school provision derived from a set of core principles, including the adoption of flexible, radical and innovative grouping arrangements.
  • Developing a coherent strategy for strengthening the STEM talent pipeline which harnesses the existing infrastructure and makes high quality support accessible to all learners regardless of the schools and colleges they attend.
  • Establishing centres of excellence and a stronger cadre of expert teachers, but also fostering system-wide partnership and collaboration by including the range of expertise available outside schools.

If funding is to go towards improving provision for learners, the only viable option is to use pupil premium, with the consequence that support will be targeted principally, if not exclusively, at disadvantaged high attainers.

I have elsewhere suggested a programme designed to support all such learners aged 11-18 located in state-funded schools and colleges. There is both wider reach and less deadweight if support is targeted at all eligible learners, rather than at schools ‘serving areas of disadvantage’.

It is critical to include the post-16 sector, given the significant proportion of disadvantaged high attainers who transfer post-GCSE.

This would be funded principally by a £50m topslice from the pupil premium budget (matching the topslice taken to support Y6/7 summer schools), though higher education outreach budgets would also contribute and there would be scope to attract additional philanthropic support.

The over-riding priority is to bring much-needed coherence to what is currently a fragmented market, enabling:

  • Learners to undertake a long-term support programme, tailored to their needs and drawing on the vast range of services offered by a variety of different providers, including universities, commercial and third sector organisations (such as the Trust itself).
  • These providers to position and market their services within a single online national prospectus, enabling them to identify gaps on the supply side and take action to fill them.
  • A single, unified, system-wide effort, harmonising the ‘pull’ from higher education fair access strategies and the ‘push’ from schools’ and colleges’ work to close excellence gaps.

I don’t yet recognise this coherence in the Trust’s preferred model.

.

GP

June 2015

The problem of reverse excellence gaps

This post compares the performance of primary schools that record significant proportions of disadvantaged high attainers.

It explores the nature of excellence gaps, which I have previously defined as:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

It draws particular attention to the incidence at school level of sizeable reverse excellence gaps where disadvantaged learners out-perform their more advantaged peers.

According to my theoretical model reverse gaps threaten equilibrium and should be corrected without depressing the achievement of disadvantaged high attainers.

In this post:

  • The measure of disadvantage is eligibility for the pupil premium – those eligible for free school meals at any time in the last six years (‘ever 6 FSM’) and children in care.
  • The measure of high attainment is Level 5 or above in KS2 reading, writing and maths combined.

.

National figures

The 2014 Primary School Performance Tables show that 24% of the cohort attending state-funded primary schools achieved KS2 Level 5 or above in reading, writing and maths combined. In 2013, the comparable figure was 21% and in 2012 it was 20%.

In 2014 some 650 primary schools managed a success rate of 50% or higher for the entire cohort, up from 425 in 2013 and 380 in 2012.

The comparable national percentages for disadvantaged learners are 9% in 2012, 10% in 2013 and 12% in 2014. For all other (ie non-disadvantaged) learners they are 24%, 26% and 29% respectively.

In 2014, there were 97 state-funded schools where 50% or more of disadvantaged learners achieved this benchmark, compared with only 38 in 2013 and 42 in 2012. This group of schools provides the sample for this analysis.

Chart 1 below illustrates the national excellence gaps over time while Chart 2 compares the proportion of schools achieving 50% or higher on this measure with all learners and disadvantaged learners respectively.

.


Chart 1: Percentage of disadvantaged and other learners achieving L5+ in KS2 reading, writing and maths, 2012-14

Chart 1 shows that all rates are improving, but the rate of improvement is slower for disadvantaged learners. So the socio-economic achievement gap at L5+ in reading, writing and maths combined has grown from 15 percentage points in 2012, to 16 in 2013 and then to 17 in 2014.
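
Expressed in terms of my working definition, the gap is simply the ‘other’ percentage minus the disadvantaged percentage (a reverse excellence gap of the kind discussed above would yield a negative value). A minimal check using the national figures above:

```python
# Excellence gap at L5+ in reading, writing and maths combined: the percentage
# for other (non-disadvantaged) pupils minus the percentage for disadvantaged
# pupils. A negative result would indicate a reverse excellence gap.

l5_other = {2012: 24, 2013: 26, 2014: 29}
l5_disadvantaged = {2012: 9, 2013: 10, 2014: 12}

for year in sorted(l5_other):
    gap = l5_other[year] - l5_disadvantaged[year]
    print(f"{year}: {gap} percentage points")   # 15, 16, 17 - a steadily widening gap
```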


Chart 2: Number of schools where 50% of all/disadvantaged learners achieved L5+ in KS2 reading, writing and maths, 2012-14

Chart 2 shows steady improvement in the number of schools achieving outstandingly on this measure for all learners and disadvantaged learners alike (though there was a slight blip in 2013 in respect of the latter).

Since 2012, the proportion of schools achieving this benchmark with disadvantaged learners has increased more substantially than the proportion doing so with all learners. At first sight this is a positive trend.

However Chart 1 suggests that, even with the pupil premium, the national excellence gap between higher-attaining advantaged and disadvantaged learners is increasing steadily. This is a negative trend.

It might suggest either that high-attaining disadvantaged learners are not benefiting sufficiently from the premium, or that interventions targeted towards them are ineffective in closing gaps. Or perhaps both of these factors are in play.

 

Schools achieving high success rates with disadvantaged learners

The 97 schools achieving a success rate of 50% or more with their disadvantaged high attainers are geographically dispersed across all regions, although a very high proportion (40%) is located in London and over half are in London and the South-East.

.


Chart 3: Distribution of schools in sample by region

 .

Nineteen London boroughs are represented but eight of the 97 schools are located in a single borough – Greenwich – with a further five in Kensington and Chelsea. The reasons for this clustering are unclear, though it would suggest a degree of common practice.

Almost half of the sample consists of church schools, fairly equally divided between Church of England and Roman Catholic institutions. Seven of the 97 are academy converters, six are voluntary controlled, 42 are voluntary aided and the remainder are community schools.

Other variables include:

  • The average size of the KS2 cohort eligible for assessment is about 40 learners, with a range from 14 to 134.
  • The percentage of high attainers varies from 7% to 64%, compared with an average of 25% for all state-funded schools. More than one quarter of these schools record 40% or more high attainers.
  • The percentage of middle attainers ranges between 38% and 78%, compared with an average of 58% for state funded schools.
  • The percentage of low attainers lies between 0% and 38%, compared with the national average for state-funded schools of 18%. Only 15 of the sample record a percentage higher than this national average.
  • The percentage of disadvantaged learners ranges from 4% to 77%, compared with the national average for state-funded schools of 31%. Roughly one in five of the sample has 50% or more, while almost two in five have 20% or less.
  • The number of disadvantaged pupils in the cohort is between 6 and 48. (Schools with five or fewer in the cohort have their results suppressed.) In only 22 of the sample is the number of disadvantaged pupils higher than 10.
  • In 12 of the schools there are no EAL pupils in the cohort but a further 11 are at 60% or higher, compared with an average for state-funded schools of 18%.

Overall there is significant variation between these schools.

.

School-level performance

The vast majority of the schools in the sample are strong performers overall on the L5 reading, writing and maths measure. All but five lie above the 2014 national average of 24% for state-funded schools and almost half are at 50% or higher.

The average point score ranges from 27.9 to 34.7, compared with the state-funded average of 28.7. All but 15 of the sample record an APS of 30 or higher. The average level per pupil is 4B in one case only and 4A in fourteen more; otherwise it is 5C or higher.

Many of these schools are also strong performers in KS2 L6 tests, though these results are not disaggregated for advantaged and disadvantaged learners.

More than four out of five are above the average 9% success rate for L6 maths in state-funded primary schools and almost two out of five are at 20% or higher.

As for L6 grammar, punctuation and spelling (GPS), some two-thirds are above the success rate of 4% for all state-funded primary schools and almost two out of five are at 10% or higher.

When it comes to the core measure used in this analysis, those at the top of the range appear at first sight to have performed outstandingly in 2014.

Four schools come in at over 80%, though none has a disadvantaged cohort larger than eight pupils. These are:

Not far behind them is Tollgate Primary School, Newham (71%) but Tollgate also has a cohort of 34 disadvantaged learners, almost three times the size of any of its nearest rivals.

What stands out from the data above all else is the fact that very few schools show any capacity to replicate this level of performance over two or three years in succession.

In some cases results for earlier years are suppressed because five or fewer disadvantaged pupils constituted the cohort. Leaving those aside, just 6 schools in the sample managed a success rate of 50% or higher in 2013 as well (so for two successive years) and no school managed it for three years in a row.

The schools closest to achieving this are:

  • Tollgate Primary School, Newham (71% in 2014, 50% in 2013 and 40% in 2012)

Only 9 of the sample achieved a success rate of 30% or higher for three years in a row.

The size and direction of excellence gaps

Another conspicuous finding is that several of these schools display sizeable reverse excellence gaps, where the performance of disadvantaged learners far exceeds that of their more advantaged peers.

Their success rates for all other pupils at L5 in reading, writing and maths combined vary enormously, ranging from 10% to 91%. Nineteen of the sample (20%) are at or below the national average rate for state-funded schools.

But in a clear majority of the sample the success rate for all other pupils is lower than it is for disadvantaged pupils.

The biggest reverse excellence gap is recorded by St John’s Church of England Primary School in Cheltenham, Gloucestershire, where the success rate for disadvantaged learners is 67%, compared with 19% for other learners, giving a huge disparity of 48 percentage points!

Several other schools record reverse gaps of 30 points or more, many of them church schools. This raises the intriguing possibility that the ethos and approach in such schools may be relatively more conducive to disadvantaged high attainers, although small numbers are undoubtedly a factor in some schools.

The ‘cliff-edge’ nature of the distinction between disadvantaged and other learners may also be a factor.

If a school has a relatively high proportion of comparatively disadvantaged learners who are nevertheless ineligible for the pupil premium, those learners may depress the results for the 'other' group, especially if their particular needs are not being addressed.

At the other extreme, several schools perform creditably with their disadvantaged learners while also demonstrating large standard excellence gaps.

Some of the worst offenders are the schools celebrated above for their relative consistency over a three-year period:

  • Fox Primary School has a 2014 excellence gap of 34 points (57% disadvantaged versus 91% advantaged)
  • Nelson Mandela School a similar gap of 28 points (54% disadvantaged versus 82% advantaged).

Only Tollgate School bucks this trend with a standard excellence gap of just two percentage points.

The chart below illustrates the variance in excellence gaps across the sample. Sizeable reverse gaps clearly predominate.

 .


Chart 4: Incidence of reverse and normal excellence gaps in the sample

Out of the entire sample, only 17 schools returned success rates for disadvantaged and other learners that were within five percentage points of each other. Less than one-third of the sample falls within a variance of plus or minus 10 percentage points.

These extreme variations may in some cases be associated with big disparities in the sizes of the two groups: if disadvantaged high attainers are in single figures, differences can hinge on the performance of just one or two learners. But this does not apply in all cases. As noted above, the underperformance of relatively disadvantaged learners may also be a factor in the reverse gaps scenario.

Ofsted inspection reports

I was curious to see whether schools with sizeable excellence gaps – whether normal or reverse – had received comment on this point from Ofsted.

Of the schools within the sample, just one – Shrewsbury Cathedral Catholic Primary School – has been rated inadequate in its last inspection report. The inspection was undertaken in July 2014, so will not have reflected a huge reverse excellence gap of 38 percentage points in the 2014 KS2 assessments.

The underachievement of the most able is identified as a contributory factor in the special measures judgement but the report comments thus on the achievement of disadvantaged learners:

‘Although in Year 6, pupils eligible for additional government funding (the pupil premium) reach similar levels to their classmates in reading, writing and mathematics, eligible pupils attain lower standards than those reached by their classmates, in Years 2, 3 and 4. The gap between the attainment of eligible and non-eligible pupils in these year groups is widening in reading, writing and mathematics. In mathematics, in Year 3, eligible pupils are over a year behind their classmates.’

Two further schools in the sample were judged by Ofsted to require improvement, both in 2013 – St Matthew’s in Surbiton and St Stephen’s in Godstone, Surrey. All others that have been inspected were deemed outstanding or good.

At St Matthew’s inspectors commented on the achievement of disadvantaged learners:

‘Weaknesses in the attainment of Year 6 pupils supported by pupil premium funding were identified in 2012 and the school took action to reduce the gap in attainment between this group of pupils and their peers. This gap reduced in 2013 so that they were just over one term behind the others in English and mathematics, but there is still a substantial gap for similar pupils in Year 2, with almost a year’s gap evident in 2013. Support is now in place to tackle this.’

In 2014, the KS2 cohort at St Matthew’s achieved a 53% success rate on L5 reading, writing and maths, with disadvantaged learners at 50%, not too far behind.

At St Stephen’s inspectors said of disadvantaged learners:

‘The school successfully closes the gap between the attainment of pupils who benefit from the pupil premium and others. Indeed, in national tests at the end of Year 6 in 2012, the very small number of eligible pupils was attaining about a term ahead of their classmates in English and mathematics. Focused support is being given to eligible pupils in the current year to help all fulfil their potential.’

A more recent report in 2015 notes:

‘The school is successfully closing the gaps between disadvantaged pupils and others. In 2014, at the end of Key Stage 2, disadvantaged pupils outperformed other pupils nationally and in the school by about three terms in mathematics. They also outperformed other pupils nationally by about two terms nationally and in the school in reading and writing. Disadvantaged pupils across the school typically make faster progress than other pupils in reading, writing and mathematics.’

It is not clear whether inspectors regard this as a positive outcome.

Unfortunately, Tollgate, Nelson Mandela and Fox – all three outstanding – have not been inspected since 2008/2009. One wonders whether the significant excellence gaps at the latter two might affect their overall inspection grades.

.

Pupil Premium allocations 

I was equally curious to see what the websites for these three schools recorded about their use of the pupil premium.

Schools are required to publish details of how they spend the pupil premium and the effect this has on the attainment of the learners who attract it.

Ofsted has recently reported that only about one-third of non-selective secondary schools make appropriate use of the pupil premium to support their disadvantaged most able learners – and there is little reason to suppose that most primary schools are any more successful in this respect.

But are these three schools any different?

  • Fox Primary School has pupil premium income of £54.7K in 2014-15. It explains in its statement:

‘Beyond all of this, Fox directs a comparatively large proportion of budget to staffing to ensure small group teaching can target pupils of all attainment to attain and achieve higher than national expectations. Disadvantaged pupils who are attaining above the expected level are also benefitting from small group learning, including core subject lessons with class sizes up to 20. The impact of this approach can be seen in the APS and value added scores of disadvantaged pupils for the last 2 years at both KS1 and KS2. The improved staffing ratios are not included in pupil premium spend.’

  • Nelson Mandela School has so far not uploaded details for 2014-15. In 2013-14 it received pupil premium of £205.2K. The statement contains no explicit reference to high-attaining disadvantaged learners.
  • Tollgate Primary School received pupil premium of £302.2K in 2014-15. Its report covers this and the previous year. In 2013-14 there are entries for:

‘Aim Higher, challenging more able FSM pupils’ (Y6)

In 2014-15 funding is allocated to pay for five intervention teachers, whose role is described as:

‘Small group teaching for higher ability. Intervention programmes for FSM’.

.

Conclusion

The national excellence gap between disadvantaged and other learners achieving KS2 L5 in all of reading, writing and maths is growing, despite the pupil premium. The reasons for this require investigation and resolution.

Ofsted’s commitment to give the issue additional scrutiny will be helpful but may not be sufficient to turn this situation around. Other options should be considered.

The evidence suggests that schools’ capacity to sustain Level 5+ performance across reading, writing and maths for relatively large proportions of their disadvantaged learners is limited. High levels of performance are rarely maintained for two or three years in succession.

Where high success rates are achieved, more often than not this results in a significant reverse excellence gap.

Such reverse gaps may be affected by the small number of disadvantaged learners within some schools’ cohorts but there may also be evidence to suggest that several schools are succeeding with their disadvantaged high achievers at the expense of those from relatively more advantaged backgrounds.

Further investigation is necessary to establish the association between this trend and a ‘cliff-edge’ definition of disadvantage.

Such an outcome is not optimal or desirable and should be addressed quickly, though without depressing the performance of disadvantaged high achievers.

A handful of strong performers, including the majority of those that are relatively more consistent year-on-year, do well despite continuing to demonstrate sizeable standard excellence gaps.

Here the advantaged do outstandingly well and the disadvantaged do significantly worse, but still significantly better than in many other schools.

This outcome is not optimal either.

There are very few schools that perform consistently highly on this measure, for advantaged and disadvantaged high attainers alike.

Newham’s Tollgate Primary School is perhaps the nearest to exemplary practice. It receives significant pupil premium income and, in 2014-15, has invested in five intervention staff whose role is partially to provide small group teaching that benefits high attainers from disadvantaged backgrounds.

Fox Primary School has also acted to reduce group sizes, but it remains to be seen whether this will help to eliminate the large standard excellence gap apparent in 2014.

This is a model that others might replicate, provided their pupil premium income is substantial enough to underwrite the cost, but the necessary conditions for success are not yet clear and further research is necessary to establish and disseminate them.

Alternative approaches will be necessary for schools with small numbers of disadvantaged learners and a correspondingly small pupil premium budget.

The Education Endowment Foundation (EEF) is the obvious source of funding. It should be much more explicitly focused on excellence gaps than it has been to date.

GP

May 2015

Proposals for a 2015 Schools White Paper: Most able

. 

This post sets out for consideration some ideas to inform a new 'most able learners' policy for inclusion in a forthcoming schools white paper.

Background

Now that we have a majority Conservative Government, attention is switching to the shape of its education policy agenda for the next five years.

Parliament will be recalled on 18 May and the new Government’s legislative agenda will be set out on 27 May in the Queen’s Speech.

During the Election campaign, Prime Minister Cameron announced plans for a Schools Bill within the first 100 days of the new Parliament.

That deadline expires on 26 August, during the long summer holiday, so one would expect the Bill to be published before term ends on 22 July or, failing that, in early September.

Cameron said the Bill would contain:

‘…more radical measures to ensure young people leave education with the skills they need. It will include new powers to force coasting schools, as well as failing schools, to accept new leadership, continuing the remarkable success story of Britain’s academy schools.’

DfE civil servants will already have established which Conservative Manifesto pledges require primary legislation, but Ministerial clarification will be required and there may be some as yet undeclared priorities to add to the list.

Some likely contenders include:

  • Resits of KS2 tests in Year 7 and making the EBacc compulsory in secondary schools.
  • Any school considered by Ofsted to Require Improvement will be handed over to ‘the best headteachers – backed by expert sponsors or high-performing neighbouring schools – unless it can demonstrate that it has a plan to improve rapidly’.
  • Permission for ‘all good schools to expand, whether they are maintained schools, academies, free schools or grammar schools’.
  • The establishment of an independent College of Teaching.

It is customary for new governments to publish a white paper covering the areas in which they intend to legislate, so we might expect either a Schools or Education White Paper by the end of the summer term.

Between School Selection

The prospects for renewed emphasis on selection are already being discussed. I gave a detailed account of the pre-Election scenario in ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014).

Key factors include:

  • The postponed decision on whether to approve a grammar school annexe in Sevenoaks and the precedent that would set elsewhere.
  • The existing scope for grammar schools – whether academies or LA-maintained – to increase their planned admission numbers (PAN), typically by adding additional forms of entry (FE).
  • The campaign by the centre-right Tory group Conservative Voice to change the law to permit the establishment of new grammar schools, supported by Messrs Brady, Davis and Fox, together with early indications of greater influence for Tory backbenchers through the 1922 Committee, which Brady chairs.
  • Coded expressions of support from both Home Secretary May and newly-appointed Cabinet member Johnson, both considered future contenders for the Tory party leadership.

It will be important to establish a clear demarcation line in government policy.

.

Within School Selection

Back in 2007, when in Opposition, David Cameron signalled a shift of emphasis, away from grammar schools and towards setting:

‘When I say I oppose nationwide selection by 11 between schools, that does not mean I oppose selection by academic ability altogether.

Quite the reverse. I am passionate about the importance of setting by ability within schools, so that we stretch the brightest kids and help those in danger of being left behind.’

'With a Conservative Government this would be a motor of aspiration for the brightest kids from the poorest homes – effectively a 'grammar stream' in every subject in every school.'

In September 2014, there were indications of a revival of this strategy, though it was rapidly relegated to plans under which Regional Schools Commissioners, newly empowered to intervene in any school rated inadequate by Ofsted, could consider enforced setting as one of a 'menu of options'.

I discussed the evolution of this position in ‘The Politics of Setting’ (November 2014).

In the event, this additional role for Commissioners did not feature in the Conservative Manifesto, so we do not know whether enforced setting will be added to their armoury. This requires clarification in the White Paper.

Ofsted’s evidence

Shortly before election campaigning began, Ofsted published its second survey report on the education of the most able in non-selective secondary schools, which I reviewed in ‘The most able students: Has Ofsted made progress?’ (March 2015).

The Key Findings highlight a litany of shortcomings. The first three alone say:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities.’
  • ‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged.’

In relation to this third point, Ofsted found that no more than a third of schools were using pupil premium funding effectively to target the needs of such pupils.

The Report committed Ofsted to focusing within inspections on the progress of the most able disadvantaged, the quality of the curriculum and information, advice and guidance. We wait to see how this will be reflected in the updated School Inspection Handbooks scheduled for publication later this term.

Meanwhile, Ofsted is also preparing a ‘most able evaluation toolkit for schools’ as part of its wider efforts to influence school improvement. The toolkit should feature in the White Paper and there is scope to consider building additional support around it.

.

Excellence Gaps and Pupil Premium

The Conservative Manifesto gave a clear commitment:

‘We will continue to provide the pupil premium, protected at current rates, so that schools receive additional money for those from the poorest backgrounds.’

It added:

‘And we will make schools funding fairer. We have already increased funding for the 69 least well-funded local authorities in the country, and will make this the baseline for their funding in the next Parliament.’

Teach First leads a group of educational organisations lobbying for pupil premium to be reallocated in such a way that those with lower prior attainment attract double the rate awarded to those whose prior attainment is at or above expectations.

I have been campaigning against this proposal, principally on the grounds that:

  • It robs Peter to pay Paul, inflicting collateral damage on the majority of eligible learners, including the ‘most able disadvantaged’, the majority of whom are already poorly served, as Ofsted has established.
  • Closing gaps between disadvantaged learners and their peers should continue to take priority over closing attainment gaps between low and high attainers. The core purpose of pupil premium should be tackling underachievement – rather than low achievement – amongst disadvantaged learners.
  • Any increase in funding weighted towards low prior attainment should be secured through reform of the school funding formula and involve careful consideration of the overlaps between deprivation, low attainment and additional needs, including SEN.

My own efforts to increase the priority attached to the most able disadvantaged include presenting the evidence base for excellence gaps which I define as:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

There is increasing focus on excellence gaps in this country and they should be more fully reflected in Government policy as enshrined in the White Paper. Further assurances should be given over pupil premium rates and eligibility for them.

.

Other Manifesto commitments

The Conservative Manifesto includes – in a section headed ‘We will lead the world in maths and science’ – a generic commitment:

‘We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

It is unclear whether this relates exclusively to maths and science. It might hint at the revival of a flagship policy of the last government to establish a cadre of up to a dozen selective 16-19 maths free schools, a policy which ultimately produced just two of them.

As recommended towards the end of my latest post on these institutions, there is plenty of scope to rationalise and reform the STEM talent pipeline so that it operates more efficiently and benefits students regardless of the schools and colleges they attend.

Those finalising the Tory Manifesto may have had in mind a rival Labour commitment – which didn’t make it into their manifesto – to establish a Gifted and Talented Fund. The purpose and application of this Fund, discussed here, were never clarified.

The Conservatives were wise not to take on board a poorly-conceived Policy Exchange proposal to introduce a National Scholarships Scheme. The idea behind this is to support the most talented undergraduates on condition that they attend a UK university and remain in the UK for three years after graduating. It has no merit whatsoever.

The way forward

Rather than adopt a piecemeal approach, or risk being tripped up by the febrile politics of selection, the new Government should actively consider the inclusion in its schools white paper of a holistic policy to support our high-attaining learners.

This would broaden the agenda and allow the Government to take credit for a more sophisticated, multi-stranded approach.

The policy should embrace primary, secondary and post-16 education, placing particular emphasis on reducing excellence gaps and improving access to our most selective universities.

.

.

Key elements of the policy should include:

  • Holding the line on grammar school expansion established in the Manifesto: expansion is permitted, through satellite schools where legally permissible, but new selective institutions are confined to 16+.
  • Incentivising and encouraging all existing grammar schools to give priority in their admission arrangements to learners eligible for the pupil premium – and supporting their wider efforts to work with primary schools to increase their intake of disadvantaged learners.
  • Sponsoring guidance and associated professional development for schools and colleges on effective institution-wide provision for their most able learners, developed from a set of core principles and designed to re-establish national consensus in this field. This should also feature Ofsted’s evaluation toolkit.
  • Sponsoring guidance for schools and colleges on the introduction of more flexible, radical and innovative grouping arrangements, extending beyond the confines of setting and streaming.
  • Developing a coherent strategy for strengthening the STEM talent pipeline which harnesses the existing infrastructure and makes high quality support accessible to all learners regardless of the schools and colleges they attend.
  • Top-slicing £50m from the pupil premium budget to underwrite a coherent market-driven programme supporting high-attaining disadvantaged students to progress to selective universities. This would integrate the 'push' from schools and colleges with the 'pull' from higher education, achieving efficiencies on both sides.
  • Incentivising schools to give higher priority to disadvantaged high attainers by protecting their pupil premium entitlement and sharpening accountability arrangements, including Ofsted inspection but also the publication of key indicators in Performance Tables under the new assessment regime.
  • Building system-wide capacity, by establishing centres of excellence and a stronger cadre of expert teachers, but also by fostering collaboration and partnership between schools, colleges and all other sources of relevant expertise.

.

GP

May 2015

Response to Russell Hobby’s post of 8 May 2015

.

Thank you for taking the time and trouble to provide a considered response to my posts campaigning against the Fair Education Alliance position on the pupil premium: this one launching the campaign and this demolition of Teach First's official policy statement of 29 April.

By responding in this fashion you set a fine example to the other organisations I am challenging to justify their support for this policy.

As things stand, just one other organisation – the Future Leaders Trust – has bothered to make its views known (and duly distanced itself from this policy).

The remainder are unwilling to break ranks. I am not sure whether to charge them with cowardice or complacency. I hope they will now follow your lead.

You have explained that NAHT has not yet formally adopted your recommendation that it support Teach First’s position, so your post constitutes ‘an interim position in lieu of a vote or resolution’. I have offered to meet you to discuss this, to clarify any outstanding issues and – hopefully – to persuade you to revise that recommendation.

Three factual clarifications to begin with:

  • NAHT is listed as a member of the Fair Education Alliance – whose Report Card 2014 is unclear over whether the proposed pupil premium reallocation applies equally to primary and secondary schools – and a supporter of the Read On. Get On campaign, whose publication specifically urges its application in the primary sector (and implies that it is following the Report Card in this respect).
  • There are no proposals in the Report Card for reform of the schools funding formula, whether to increase the weighting for deprivation or for low prior attainment. Teach First’s policy statement mentions a national funding formula but offers no specific proposals for reforming it. I note that NAHT is itself calling for a fair national funding formula.
  • The implication of Teach First’s policy statement is that disadvantaged learners with low prior attainment would attract a pupil premium rate double that available to all other disadvantaged learners, middle as well as high attainers. There is no proposal to change the FSM-driven definition of disadvantage that currently underpins the pupil premium and no definition of what constitutes low prior attainment. I note that you recently floated the idea of replacing ‘ever-6 FSM’ eligibility for pupil premium with ‘a measure of the prior attainment of pupils’.

These are my responses to the substantive points of your argument:

  • It is true that other eligible disadvantaged learners would continue to attract pupil premium funding – at half the rate available for eligible disadvantaged low attainers. This implies that their needs are deemed much less significant, and/or that those needs are significantly easier and cheaper to address. The Report Card makes clear that ‘the change of funding model would increase school accountability for ‘catching up’ pupils’ (p27). All schools would be expected to prioritise ‘catch up’ for disadvantaged low attainers over all other provision for disadvantaged learners. As ASCL has pointed out, this cuts directly across heads’ and governors’ autonomy in deciding how best to allocate pupil premium funding. Hence, in this context, NAHT is arguing for such autonomy to be curtailed. I trust you will concede this?
  • There are presently differential rates of pupil premium for primary and secondary learners. The differential in favour of primary schools was justified by the previous Government, not on equity grounds, but as helping schools to meet higher expectations of ‘secondary readiness’ associated with the new assessment and accountability regime. But the new regime also shifts schools away from a binary approach to a model in which improvements at any point along the scale of prior attainment are equally valued. Double weighting of pupil premium for low attainers points in precisely the opposite direction.
  • You posit an alternative position on equity that:

‘consists in ensuring first that all students achieve a certain level of competence and that therefore more should be invested in those furthest from that threshold… One rationale for this position would be that once individuals have passed a certain threshold they have a capacity for self-improvement whereby they can extend their own education and create opportunities. Below this threshold, such self-determination is significantly harder. Thus, if you had to choose only one option it could be more socially valuable to lift a student to this threshold than to continue to stretch a student already beyond the threshold.’

You explain this as a trade-off imposed as a consequence of scarce resources. Such a position may be ideologically driven, irrational and evidence-free, or supported by an evidence base. The former is not susceptible to counter-argument. The latter can be challenged through an alternative evidence base setting out the equivalent social and economic value of closing excellence gaps. I have presented that evidence base at length and will not revisit it here. But, in determining its final position I trust that NAHT will give full and careful consideration to both sets of evidence, rather than relying exclusively on material that supports your argument. I would welcome your assurances on this point.

  • My broader evidence-driven judgement is that, allowing for scarce resources, the most effective education systems (and the best schools) typically strive to keep excellence and equity in equilibrium. If one is allowed to predominate, the overall quality of education suffers. If a school (or a headteachers’ association or any other organisation targeted by this campaign) holds a particular view on this issue, in which equity is permitted to trump excellence, it seems reasonable to expect it to state explicitly the consequences of that decision – and to hold itself accountable to its stakeholders for those. In the case of a school I would expect this to be made explicit in the vision/mission statements intended for parents and staff alike – and in the documentation supplied to Ofsted prior to inspection. Otherwise there is every risk of hypocrisy. In short, a headteacher who takes this position cannot with integrity run a school that pretends the opposite. If it adopts this policy, I look forward to NAHT advising its members accordingly.
  • You suggest that the distinction between pupil premium and school funding formula is a second order issue. I do not agree. If there is a case for higher weighting for low prior attainment – to reflect the additional costs associated with tackling it – that should be reflected in the core budget through the funding formula, alongside the weightings for pupil deprivation and high needs, typically but not exclusively associated with SEN. The formula should properly recognise the overlap between these factors. I would welcome NAHT’s considered analysis of the totality of funding available to support (disadvantaged) low attainers through all funding streams, since treating pupil premium in isolation is misleading and inappropriate.
  • Pupil premium is different because it is supposed to benefit directly the learners who attract it. Indeed, the latest edition of the Governors’ handbook goes as far as to state that:

‘The pupil premium is a separate funding stream to be used solely for the educational benefit of children eligible and registered for free school meals at any time during the last six years, or those who have been in continuous public care for six months’ (page 109)

While this does not amount to a personal budget, the direct link between the funding and eligible learners means that the reallocation proposed will almost certainly have a direct impact on support for those whose entitlement is reduced, especially if backed up as proposed by accountability pressures. This overrides any consideration of individual needs and circumstances and applies regardless of the total pupil premium funding received by a school. I invite NAHT to consider carefully whether this is in the best interests of the schools its members lead.

  • You accept I have a point about ‘the level of detail in the calculations’. There is no detail whatsoever. This means that the organisations, including NAHT, who support Teach First’s position have effectively signed a ‘blank cheque’. I would hazard a guess that the full consequences of the redistribution, including the risks, have not been thought through. They certainly haven’t been presented. That is not what one would expect of a leading educational organisation, especially one that receives a substantial proportion of its funding from the taxpayer. I recommend that, before taking its decision, NAHT obtains and publishes detailed draft proposals and a full risk analysis.
  • You also acknowledge the potentially negative impact on Goal 5. This is especially true of the part relating to progression to selective universities. It suggests that neither Teach First nor the Alliance have properly considered the interaction between their different goals. To suggest, as the Teach First policy statement does, that the appropriate interventions necessary to support Goal 5 are straightforward and inexpensive betrays a certain naivety but also an ignorance of the National strategy for access and student success. I urge that NAHT considers carefully how it will support Goal 5 and whether there is not a risk – even a likelihood – that the proposed reductions in pupil premium would undermine that support.

As you know, both ASCL and the NGA now oppose Teach First’s position, as does John Dunford, the pupil premium champion. The Conservative Manifesto pledges that it will ‘continue to provide the pupil premium, protected at current rates’. NAHT should reassess its own position in the light of this information.

Ofsted has announced that it will ensure inspections continue to focus sharply on the progress of able disadvantaged students, given its finding that only one-third of non-selective secondary schools are using pupil premium effectively to support them.

I have seen no evidence to suggest that primary schools are any more effective in this respect. Regardless of the arguments above, NAHT should obtain this evidence and reflect carefully upon its implications. 

In conclusion, I once more urge NAHT to withdraw its support for Teach First’s policy, as advanced by the FEA and Read On. Get On.

I also invite you to consider what more NAHT itself could do to ensure that its members are providing the best possible education for their most able learners, especially those eligible for the pupil premium.

.

GP

May 2015

Fisking Teach First’s defence of its pupil premium policy

.

This post scrutinises the arguments advanced by Teach First in defence of reallocating Pupil Premium away from disadvantaged learners with middle or high prior attainment.

Background

On 29 April, Teach First responded formally to my campaign against their proposal that the Pupil Premium budget should be redistributed so that learners with low prior attainment attract double the amount available for those with middle and high prior attainment.

The original proposal was included in the Fair Education Alliance Report Card (December 2014) and repeated in a primary sector context in The Power of Reading (April 2015) published on behalf of the Read On Get On campaign.

I set out formidable arguments against this proposal in an earlier post: ‘Protecting Pupil Premium for High Attainers’ (April 2015).

It invited all the organisations listed as members of the Fair Education Alliance or supporters of Read On Get On to justify their backing for the proposal or else distance themselves from it.

To date I have pursued twelve of these organisations for a reply. Eleven have failed to respond.

The twelfth, The Future Leaders Trust, provided a statement:

‘…we agree that mid- and high-attainers from poor backgrounds should not be deprived of the support that they need to succeed. FSM children who achieve Level 5 in Reading, Writing and Maths at age 11 are still significantly less likely to go on to A-levels and university than their more affluent peers….rather than trying to redistribute the existing pie, we should be campaigning for a bigger one’.

I take that to mean that they do not fully support the proposal.

Brian Lightman of ASCL sent me a response.

.

He wrote:

‘ASCL is not a member of the Fair Education Alliance at this stage although we do agree with many aspects for what they are doing and are in discussion with them about what we might support and how.

However with regards to this specific point our position is similar to the one that NGA expressed. We would not be in agreement with allocating PP on the basis of prior attainment.  FSM is a proxy measure which is used to identify the overall level of disadvantage in a school and therefore pupil premium allocations

We strongly believe that decisions about how to use the PP in schools should be decisions made by school leaders who are fully  accountable for the impact of their decisions.’

Russell Hobby of NAHT – which is a member of the Alliance – committed to a response but has so far failed to produce it. (Quelle surprise, since NAHT has form in that department.)

.

The National Governors’ Association (NGA) has already confirmed its opposition to Teach First’s position.

.

Teach First’s argument is also opposed by John Dunford, the Pupil Premium Champion, and by the Sutton Trust.

.

The Teach First response is headed ‘Our policy position on the pupil premium’. It begins:

‘Recently, we’ve had a few questions on our policy position on the Pupil Premium, which we endorsed in the Fair Education Alliance Report Card 2015.’

This helpfully confirms that the proposal set out in the Report Card is official Teach First policy.

It is rather less helpful in failing to acknowledge the source of those questions and failing to link to the counter-arguments set out in my post.

This means that those who want to make up their own minds whether they support Teach First’s position have only one side of the argument available to them. I would have expected more generosity of spirit from an organisation as powerful as Teach First, especially when taking on a singleton blogger like me.

The remainder of this post fisks the Teach First policy position statement.

It strives wherever possible to supplement rather than repeat the substantive arguments advanced in my earlier post, so those who want to consider the case in the round do need to revisit that in addition to the material below.

Recommendation

The statement begins by reiterating the original recommendation, to:

‘Target pupil premium by attainment as well as disadvantage measures. This could be achieved through halving current funding per pupil for FSM Ever 6 [a deprivation measure which includes pupils who have ever been a Looked After Child or eligible for Free School Meals in the previous six years]. Half of this funding could then be re-allocated to pupils eligible for FSM Ever 6 who have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend. The change of funding model would increase school accountability for ‘catching up’ pupils.’

The full implications of what is now declared as official Teach First policy are extremely unclear, because there is no modelling, in the Report Card or elsewhere, of the redistribution or its likely effects.

Indeed, when I challenged Teach First over one aspect of modelling, it admitted that none had been undertaken.

In the absence of any clarification of how the redistribution would work, this is my best guess at what the recommendation means.

One begins with an assumption that one-third of pupil premium beneficiaries are low attainers, while two-thirds are middle and high attainers. (In 2014, 67% of disadvantaged learners achieved KS2 L4 and above in reading, writing and maths, meaning 33% did not.)

Given a total pupil premium budget of £2.5bn per year, assuming equal shares, the low attainers get £833.33m and the middle and high attainers together get £1.67bn.

One removes half of the funding from the high and middle attainers together – so £833.33m in total, leaving an equivalent sum behind.

The sum removed is added to the low attainers’ budget giving them a total of £1.67bn, meaning they have double the amount available for the other two groups combined.

But this outcome would mean one group, half the size of the other, would also have double the funding, hence each low attainer within that group would have four times the funding allocated to each middle and high attainer.

To make the equation work, one has to divide the sum initially removed from the high and middle attainers into two, allocating £416.67m into each pot.

Then there is £1.25bn for the low attainers and an identical £1.25bn for the middle and high attainers, but there are twice as many of the latter, so each of them gets half the sum available to each low attainer.

Confused yet?
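To make the arithmetic above explicit, here is a minimal sketch. The £2.5bn budget and the one-third/two-thirds split are the working assumptions set out above, not figures confirmed by Teach First or the Alliance:

# Illustrative only: the beneficiary split and the total budget are assumptions.
budget = 2.5e9                              # total annual pupil premium (assumed)
low_share, other_share = 1 / 3, 2 / 3       # low vs middle/high attainers (assumed)

def per_pupil_ratio(low_pot, other_pot):
    """Ratio of the low attainers' per-pupil rate to the middle/high attainers' rate."""
    return (low_pot / low_share) / (other_pot / other_share)

# Today: equal per-pupil rates, so the pots are proportional to pupil numbers.
low_pot, other_pot = budget * low_share, budget * other_share   # ~£0.83bn and ~£1.67bn

# Read literally -- moving half of the middle/high pot across -- the recommendation
# yields a 4:1 per-pupil ratio rather than the intended 2:1.
print(round(per_pupil_ratio(low_pot + other_pot / 2, other_pot / 2), 2))    # 4.0

# Moving only a quarter of that pot (about £417m) leaves £1.25bn on each side
# and delivers the intended double weighting.
transfer = other_pot / 4
print(round(per_pupil_ratio(low_pot + transfer, other_pot - transfer), 2))  # 2.0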

In any case, all of this is guesswork because Teach First has not yet:

  • Confirmed whether this proposal applies to both primary and secondary schools. Since it is referenced in a primary context by the Read On. Get On. report, and since this statement mentions the primary sector in passing, one assumes that it applies equally in both.
  • Defined what constitutes low prior attainment. At secondary level, for example, is it below the scaled score equivalent of Level 4b in reading, writing and maths combined? Or does it count each assessment separately? Or is it achievement below Level 4, either individually or combined? What is the equivalent measure at primary level? Your guess is as good as mine.

It really behoves Teach First to be clearer on these issues than it has been to date.

However the recommendation above states clearly that learners attracting the pupil premium with low prior attainment would have ‘double weighting’, implying that those with middle and high prior attainment would find their allocations single weighted, so pitched at half this value.

So, in the absence of any further elucidation, I assume that each low attaining pupil premium beneficiary would in future receive twice as much as each middle and high attaining beneficiary.

It would be good to know the size of the premium Teach First expects to be available to each category.

One possible outcome, using the very approximate ratio above, might be:

  • Low attainer primary pupil premium of £1,950 and high/middle attainer primary pupil premium of £975, compared with the present rate of £1,300.
  • Low attainer secondary pupil premium of £1,402 and high/middle attainer pupil premium of £701, compared with the present rate of £935.

Low attainers would get an additional 50% top-up while all middle/high attainers would get 75% of what they do now.
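On the same one-third/two-thirds assumption, these rates would be budget-neutral: a 50% top-up for one beneficiary in three exactly offsets a 25% cut for the other two. A minimal sketch using the rates quoted above:

# Illustrative: assumes one-third of beneficiaries are low attainers and uses the
# 2014-15 deprivation premium rates quoted above (£1,300 primary, £935 secondary).
current_rates = {"primary": 1300, "secondary": 935}

for phase, rate in current_rates.items():
    low_rate = rate * 1.5        # 50% top-up for low attainers
    other_rate = rate * 0.75     # middle/high attainers keep 75% of today's rate
    assert low_rate == 2 * other_rate              # the intended double weighting
    average = (low_rate + 2 * other_rate) / 3      # spend per beneficiary, 1:2 split
    print(f"{phase}: low £{low_rate:,.2f}, middle/high £{other_rate:,.2f}, "
          f"average £{average:,.2f} vs current £{rate:,}")
# primary: low £1,950.00, middle/high £975.00, average £1,300.00 vs current £1,300
# secondary: low £1,402.50, middle/high £701.25, average £935.00 vs current £935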

Until we know the size of the uprating and the numbers used in the calculation, we cannot quantify the redistributive impact, so Teach First has asked its supporters to sign a blank cheque (and they have done so, apparently without too much scrutiny).

.

Pupil premium as it operates now

. 

The positive

The policy statement says:

Teach First is fully supportive of the Pupil Premium. It has been an incredibly important tool that helps to achieve our vision that no child’s success is limited by their socio-economic background.  We will continue to advocate for it, and for it to be protected and enhanced. The introduction of the Pupil Premium has increased accountability for the progress of the country’s poorest children and since this was introduced, an increase in attainment has been seen in those areas where they are the minority, though they still significantly underperform their wealthier peers. We hope and expect the full impact of the Pupil Premium will become apparent as the funding beds in and those pupils who have benefitted from it complete their full school journey.’

The commitment to continued advocacy for the pupil premium to be protected and enhanced rings rather hollow, given that perhaps two-thirds of beneficiaries would have their allocations reduced to half the value of the premium provided for their low attaining peers.

One assumes that ‘protected’ means Pupil Premium should continue to be ring-fenced outside the school funding formula.

‘Enhanced’ is potentially meaningless. It stands proxy for ‘increased’ but, given the wider pressures on the national schools budget, there is little prospect of increasing the total pupil premium budget by the sum necessary to uprate low attainers’ allocations while leaving others unchanged. 

This is apparently what the Future Leaders Trust would like to see, but it simply isn’t realistic.

The claim of an increase in attainment since the premium was introduced is unsupported by evidence. What level of attainment? What measure of attainment? What size of area? How do we know the improvement is attributable to the Pupil Premium, as opposed to other factors?

In particular, does this apply to middle and high attainers? If so, what evidence is there to suggest that significantly reducing the sum available to support them will not detract from this progress?

.

The negative

The statement continues:

Schools are held accountable to Ofsted for their spending of the Pupil Premium – demonstrating how it has contributed to improved attainment of eligible pupils. There has not yet been a systematic review of how schools are spending the Pupil Premium, however there is some evidence from Ofsted that Pupil Premium is not always being used as effectively as it could be – in some instances plugging gaps in school budgets which have faced cuts – and that it is not always meeting the needs of those who are falling furthest behind (e.g. Chapter 6 in The Tail).’

This betrays selective use of the evidence base.

Where the funding is being used to plug gaps in the school budget (something that Teach First is also advocating at the macro level – see below), surely middle and high attainers will be suffering just as much as low attainers, quite possibly more.

In ‘The pupil premium: How schools are spending the funding’ (February 2013), Ofsted reported:

‘Where schools spent the Pupil Premium funding successfully to improve achievement, they shared many of the following characteristics. They:

  • carefully ring-fenced the funding so that they always spent it on the target group of pupils
  • never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels.’

Conversely:

‘Where schools were less successful in spending the funding, they tended to have at least some of the following characteristics. They…

  • focused on pupils attaining the nationally expected level at the end of the key stage (Level 4, five A* to C grades at GCSE) but did not go beyond these expectations, so some more able eligible pupils underachieved…’

In ‘The most able students: An update on progress’ (March 2015), Ofsted said:

Our report in 2013 found few instances of the pupil premium being used effectively to support the disadvantaged most able pupils. In the schools visited for this survey, about a third were using the pupil premium funding effectively to target the needs of these pupils.

Ofsted concludes:

‘… more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

Most of the evidence I have seen on this issue suggests that the lowest attainers are more likely than higher attainers to have their needs addressed appropriately through the pupil premium.

The case for reallocation via both pupil premium and the NFF

My previous post argues that, to the extent that reallocation is needed, it should be undertaken solely through the national funding formula (NFF) since using pupil premium creates too much ‘collateral damage’ – in the shape of lower allocations for middle and high attainers.

Teach First asserts:

 ‘We believe that low prior attainment is a compounding disadvantage and should be recognised in the National Funding Formula but that there would also be value in making extra funding to low attainers explicit through shifting the emphasis onto this group in the Pupil Premium.

The re-allocation within Pupil Premium funding would incentivise schools to make more progress with their most needy low income pupils: it would focus the accountability – as well as the financial support – directly on that group of pupils most in need of intervention.’

The case for recognition in the NFF is surely built on the costs involved in raising the attainment of low attainers, whether advantaged or disadvantaged.

If Teach First want to make extra funding for low attainers more explicit, that might be achieved by introducing an additional and entirely separate low attainers’ premium which recognises the needs of advantaged and disadvantaged low attainers alike.

But it would be complex for schools to administer two overlapping ring-fenced budgets. It would be more straightforward to undertake the redistribution entirely through the NFF.

Accountability is achieved fundamentally through Ofsted inspection and School Performance Tables. If Teach First believe that schools need to be made more accountable for improving the performance of disadvantaged low attainers – and they cite no evidence to show that this is necessary – those are the obvious routes.

.

Grounds for justifying the policy

I had asked Teach First to explain whether it justified the proposal on the grounds that it would divert extra funding to ‘catch-up’ or that it would redistribute wider deprivation funding between schools.

The policy statement makes clear that both are in play, but one takes precedence over the other:

  • First and foremost, Teach First apparently believes that: those with low prior attainment have greater needs; that the potential benefits of investment in low attainers are more significant; and that effective interventions for them are comparatively more expensive than those for disadvantaged middle and high attainers.
  • Secondly, this is assumed to be an effective method of redistributing funding away from a few (number unquantified) schools that have built up substantial funding surpluses through the combined effects of the current NFF and pupil premium, towards some (number again unquantified) which receive rather less support.

Each segment of this argument is tackled below.

.

The impact of low attainment

The statement says:

We believe this is important because intervention at the lower ends of the prior attainment distribution could have significant impact on later attainment.  The FEA report card showed that those who fall behind early are not likely to catch up – last year only 7% of pupils achieving below expected levels aged 11 went on to get 5 ‘good’ GCSEs aged 16. And we charted how this ‘class ceiling’ can systemically hold some pupils back – having a knock-on effect on their wellbeing, employment and access to higher education.

There is similar evidence in respect of disadvantaged high attainers, where the comparator group are those with equivalent prior attainment from more advantaged backgrounds.

In ‘Closing England’s Excellence Gaps: Part 2’ (September 2014) I set out all the research evidence I could find on the subsequent progress made by high attainers, including:

  • The chances of FSM-eligible KS2 high attainers still being so at the end of KS4 are 45% lower than for other high attainers with similar prior attainment and characteristics (DfES 2007)
  • 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points. (Ofsted 2013)
  • Children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’ (Vignoles 2014)

Teach First continues:

Recent analysis of pupils’ progress has shown that – although the majority of pupils do not have linear trajectories – pupils with high prior attainment are much more likely to stay on a linear trajectory than those with low prior attainment… However, low prior-attainers at Primary and Secondary have much more varied trajectories – indicating that rapid progress is possible, despite the fact that it is often not the case – and that focus on this group could be fruitful.’

I am not quite sure what this contributes to the argument. The analysis relates to progress subsequent to KS1 attainment. As the paper notes:

For children achieving a Level 1C, B or A at this stage, their development is so unpredictable that most will either outperform or underperform any Key Stage Two target that might be set.’

Moreover, the percentages are low at all levels – for example, only 12% of pupils with L3C at KS1 make linear progress at all key stages.

And of course they apply to all learners and not to disadvantaged learners, so we cannot see how much variation there is as a consequence of disadvantage.

The same is true of the primary and secondary transition matrices which, amongst other things, show that in 2014:

  • Of those with KS2 L5A in English or maths, only half (48% in maths; 51% in English) achieved a GCSE A* grade.
  • Of those with KS2 L5C in English or maths, one in five makes only a single level of progress by the end of KS4 in English, while the same is true of almost a third of students in maths.

Perhaps more to the point, excellence gaps are wide and growing. The graph below compares the percentage point gaps between disadvantaged and all other learners at KS2 L4 and above and L5 and above in 2013 and 2014 respectively.

.

[Graph: percentage point gaps between disadvantaged and other learners at KS2 L4+ and L5+, 2013 and 2014]

.

In 2014 the L5 gaps are larger across the board, with particularly large differences in maths and reading. In the latter, the gap at L5+ is more than twice as large as it is at L4+.

I note in passing that the Teach First model would presumably involve any disadvantaged low attainer who subsequently achieved or exceeded the expected level of performance moving from the higher level of pupil premium to the lower; otherwise the system would be inequitable. This would be complex and costly to administer.

Finally in this section, Teach First argues:

‘As well as huge personal cost, there is huge national cost to this underachievement – consultancy BCG estimated that boosting the attainment of this group could raise GDP by up to £56bn a year by 2050 (BCG, 2013)’

This is a secondary reference to a finding quoted in ‘The Tail’, which appears to be a sacred script for Teach First and the probable source of their false ideological position.

The actual wording in Marshall’s book is:

‘In a comparable study for the UK, the consulting firm BCG found that matching Finnish levels of social mobility (in terms of raising the educational outcomes of poor children) would add £6bn a year to GDP by 2030 and £56bn a year by 2050. Bringing below-average students in the UK to the national average would add £14bn a year to GDP by 2030 and £140bn by 2050.’

It doesn’t inspire confidence that Teach First has misquoted this statement in the Report Card as well as in its policy statement.

The original source is the Sutton Trust’s ‘Mobility Manifesto’ (2010). The calculations are based on PISA 2006 average scores in maths and science, using a methodology derived from Hanushek. I shall leave it to others to comment on the reliability of the findings.

The first calculation estimated the benefits of matching the distribution of scores across the UK (so not just England) with those of Finland; the second, the benefits of raising attainment across all socio-economic groups (based on parents’ education) to the UK average (excepting the higher-than-average value already recorded by the highest socio-economic group).

This is of course an entirely hypothetical model which attempts to quantify the impact of education on economic growth.

I will only note that, in ‘The High Cost of Low Educational Performance’ (2010) Hanushek also calculates the not inconsiderable benefits of improving average PISA maths and science performance by 25 points, so impacting across the attainment spectrum.

I reviewed the parallel literature on the economic benefits of investment at the top end in ‘The Economics of Gifted Education Revisited’ (March 2013).

In light of that, it seems to me there is a reasonable case for arguing that investment at the top end would yield commensurate benefits.

Hanushek himself recognises that:

Importantly, the relative size of the effects of performance at the bottom and at the top of the distribution depends on the specification, and further research is needed to yield more detailed predictions. Even so, the evidence strongly suggests that both dimensions of educational performance count for the growth potential of an economy.’

. 

The impact on Goal 5

My original post pointed out that the Fair Education Alliance was also pursuing another goal to:

Narrow the gap in university graduation, including from the 25% most selective universities

The Fair Education Alliance is committed to closing the graduation gap between young people from low income backgrounds and those from high income backgrounds. Our goal is for at least 5,000 more pupils from low income backgrounds to graduate each year, with 1,600 of these young people graduating from the most selective universities.’

I argued that reducing pupil premium for middle and high attainers would make this much harder to achieve, especially the final commitment on graduation from the most selective universities, because it would reduce the chances of such learners achieving the grades necessary for admission to those universities.

Teach First’s policy statement says:

We see this recommendation as focusing on a different part of ‘the gap’ from Impact Goal 5 (the gap in university access) recommendations – this policy is about raising the attainment at KS2 and KS4 (our Impact Goals One and Two) for some of the nation’s most vulnerable children.’

This is risible, I’m afraid, since a corollary of rationing pupil premium in this fashion is that exactly those disadvantaged learners most likely to proceed to selective universities will lose funding, while those least likely to do so will gain.

The reference to ‘vulnerable children’ introduces a whole new dimension, only for it to disappear just as rapidly. If we are talking about funding for additional needs, perhaps SEN or behavioural, then a range of further considerations (and funding streams) applies.

Teach First continues:

‘We know that the kind of intensive interventions needed to raise attainment can be expensive and that working to change a pupil’s trajectory is likely to be harder than to ‘keep pupils on track’.  We also know that there are an array of inexpensive projects working with schools who can boost the non-cognitive and academic skills of those pupils already on positive trajectories – such as debatemate, The Brilliant Club and our own Futures programme. Hence our recommendation that Pupil Premium funding is redistributed to give greater weighting to low prior attainment and the more expensive interventions required there to change a child’s life.’

Hang on, weren’t we told earlier that the majority of students don’t have linear trajectories?

I would like to see evidence that it is necessarily harder to move, for example, a secure L3 to a L4 than it is to move a secure L5 to a L6. My experience suggests that interventions to raise disadvantaged attainment at the top end may need to be just as intensive as those lower down, especially when the focus is admission to selective universities.

On top of pupil premium, there is additional investment in catch-up, including over £50m a year (£500 per pupil) for the Catch-up Premium and the £50m annual topslice from the pupil premium budget for end of KS2 summer schools, also heavily focused on catch-up.

I have called for a similar £50m topslice to support intensive provision for disadvantaged high attainers seeking admission to selective universities.

In their parallel response to that post, Teach First says:

‘The single biggest factor linked to HE access is prior attainment. The Russell Group highlight that, of 15-year-olds on Free School Meals in 2007, only 0.3% achieved 3As or equivalent in their A-levels two years later – a huge barrier for progression to the most selective universities.’

In this response, however, it all seems much more straightforward. There ‘are [sic] an array of inexpensive projects’ that can sort this out. (Do English teachers now consider an array to be plural?)

Unfortunately it’s not that simple. I believe debatemate and The Brilliant Club are both Teach First spin-offs (run by alumni). While debatemate is a member of the Fair Education Alliance, The Brilliant Club is not. While debatemate is focused on developing speaking and listening and critical thinking skills, The Brilliant Club is dedicated principally to placing PhD students in schools.

No doubt both are valuable niche programmes and there are dozens more like them, offered by commercial, third sector or university providers. Some are free, some relatively cheap, others more expensive.

The problem is that disadvantaged students aiming for selective universities need a coherent, long-term support programme that addresses their particular strengths and weaknesses. This is increasingly recognised in the national strategy for access.

They also need support from their schools to secure that provision, drawing on a range of different providers to supply the elements they must combine to generate a holistic programme. That’s precisely what my proposed £50m pupil premium topslice would achieve.

It would support a personal budget of £2,000 a year (almost exactly the same as the illustrative higher rate pupil premium for low attainers above) for some 5,000 high attaining pupil premium eligible learners.

It would be designed to increase significantly the number of such students progressing to high tariff universities, including Russell Group institutions and especially Oxbridge.
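Purely to show how the sums could fit together, here is a minimal sketch of the arithmetic. The £2,000 budget and the 5,000 learners per year group are the figures proposed above; the assumption that the topslice funds five year groups concurrently (say a rolling Year 9 to Year 13 programme) is mine alone, not something stated in this post.

```python
# Illustrative arithmetic only. The multi-year assumption below is mine, not the post's.

personal_budget = 2_000        # £ per learner per year, as proposed above
cohort_size = 5_000            # high attaining pupil premium eligible learners per year group
year_groups_supported = 5      # assumption: a rolling programme spanning five year groups

annual_cost = personal_budget * cohort_size * year_groups_supported
print(f"£{annual_cost:,}")     # £50,000,000 - the size of the proposed topslice
```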

No sign of Teach First support for this of course.

.

Redistribution of funding

Reverting to its secondary reason for reallocating pupil premium, Teach First argues:

‘A secondary effect of this Pupil Premium change is that it might better recognise the compound disadvantage of growing up in a low income home in an area with a history of educational under-performance.

The Free School Meals (FSM) measure of disadvantage in the UK is not fully progressive or entirely comprehensive. For example, the binary FSM/non-FSM to dictate funding does not allow for recognition of  low-income families who just miss the eligibility criteria for Free School Meals; the national funding formula does not currently compensate for geographical isolation and high transport costs which can compound low incomes in parts of the country. Consequently – due to the combination of a high intake of pupils attracting the Premium and a currently unequal national school funding formula – there are a small number of very successful schools building up surpluses. Meanwhile some schools with arguably greater need, where pupils suffer different socioeconomic disadvantages that affect their attainment, are receiving comparatively little extra funding. This hampers their ability to deal with the challenges that their students face and to prevent those vulnerable pupils from falling behind their peers.  Those areas struggling to raise the attainment of their deprived pupils would most benefit from this double-weighting for their pupils who have fallen behind.’

My previous post argued strongly that any redistribution of this nature should be undertaken through the NFF and not the pupil premium.

Teach First is perfectly at liberty to lobby for changes to the Formula that would achieve its desired outcomes, though it seems that only ‘a small number’ of schools have built up surpluses.

There is no reason in principle why the NFF should not take account of aspects of disadvantage not caught by ‘ever 6 FSM’ (or indeed the other routes to pupil premium), or reflect sparsity factors.

Pupil premium reallocation might be a ‘quick fix’ for this problem but, as noted above, the collateral damage is too great. It drives a coach and horses through the principle that every ‘ever 6 FSM’ learner attracts the same rate of support. As such, it is not to be tolerated.

.

Conclusion

This policy position is fundamentally inequitable, predicated as it is on the mistaken ideological assumption that a low attainer’s needs must necessarily outweigh and be prioritised over those of a high attainer with the same level of disadvantage.

Teach First will surely nail their colours to this mast and sail away into the sunset. In doing so, they confirm the existence of the bias I already suspected.

But, in the words of the Report Card itself, we need ‘a fair education for all’ supported by the ‘sound moral argument for giving every child an equal chance to succeed‘. Success should not mean all learners achieving the same outcomes. The success of one group should not be at the expense of another.

Nothing in Teach First’s new line of argument has persuaded me that high attainers’ chances of success will be protected if their pupil premium is reduced in this way. The same goes for the ‘squeezed middle’.

At bottom, this is nothing more than robbing Peter to pay Paul.

So I call again on the members of the Fair Education Alliance and supporters of Read On Get On to justify their commitment to this ill-conceived and ill-formed idea.

Or else make clear that they no longer support it.

.

.

GP

April 2015

Oxford Access Lecture

This is a brief post-event report on the presentation I gave at Brasenose College, Oxford on 28 April 2015.

I had been invited to give an Access Lecture to an audience of university admissions and outreach staff and other interested parties.

The groundwork for my presentation is set out in an earlier post – How strong is Oxbridge access? (March 2015) – which provides a full analysis of the access agreements and outreach provision undertaken at each university.

This post provides the powerpoint that accompanied my presentation and the record to date of the Twitter discussion about it, under the hashtag #oxgap.

I have extended an open invitation to participants to continue the discussion further through this medium, should they wish. If there is further discussion I will upload it here.

I would like to place on record my gratitude to everyone at Oxford, for taking the trouble to invite me in the first place, for extending such a warm welcome and for interacting so positively and constructively with the arguments I put to them.

I was hugely reassured by their openness and willingness to engage with objective and evidence-based criticism, which can only augur well as they continue their efforts to improve access to Oxford for students from disadvantaged backgrounds.

.

Powerpoint

My presentation is embedded below.

.

Twitter discussion to date

Here is the discussion to date under the #oxgap hashtag. The most recent tweets are at the top.

.

.

In recent months protecting the equal rights of disadvantaged learners to access the educational support they need, regardless of prior attainment, has been an increasingly uphill battle.

Many organisations have been arguing for pupil premium to be redistributed, so it is doubled for low attainers and halved for middle and high attainers. I continue to press them to justify this idea, so far to little avail.

Elsewhere, influential journalists and social media commentators have begun to suggest that there is an imbalance in favour of higher attainers that should be rectified. I have done my best to challenge that ideology.

It has not escaped me that such views seem particularly prevalent in the generation after mine. I find this particularly dispiriting, having devoted considerable effort to persuading my own generation of the equal rights argument.

It was delightful to spend a little time amongst people of all generations equally committed to improving the lot of disadvantaged high attainers. I wish them every success.

.

GP

April 2015

A Digression on Breadth, Depth, Pace and Mastery

.


For a more recent post on these issues, go here

This post explores the emerging picture of mastery-based differentiation for high attainers and compares it with a model we used in the National G&T Programme, back in the day.

It is a rare venture into pedagogical territory by a non-practitioner, so may not bear close scrutiny from the practitioner’s perspective. But it seeks to pose intelligent questions from a theoretical position and so promote further debate.

.

Breadth, depth and pace

. 

Quality standards

In the original National Quality Standards in Gifted and Talented Education (2005) one aspect of exemplary ‘Effective Provision in the Classroom’ was:

‘Teaching and learning are suitably challenging and varied, incorporating the breadth, depth and pace required to progress high achievement. Pupils routinely work independently and self-reliantly.’

In the 2010 version it was still in place:

‘Lessons consistently challenge and inspire pupils, incorporating the breadth, depth and pace required to support exceptional rates of progress. Pupils routinely work creatively, independently and self-reliantly.’

These broad standards were further developed in the associated Classroom Quality Standards (2007) which offered a more sophisticated model of effective practice.

The original quality standards were developed by small expert working groups, reporting to wider advisory groups and were carefully trialled in primary and secondary classrooms.

They were designed not to be prescriptive but, rather, to provide a flexible framework within which schools could develop and refine their own preferred practice.

Defining the terms

What did we mean by breadth, depth and pace?

  • Breadth (sometimes called enrichment) gives learners access to additional material beyond the standard programme of study. They might explore additional dimensions of the same topic, or an entirely new topic. They might need to make cross-curricular connections, and/or to apply their knowledge and skills in an unfamiliar context.
  • Depth (sometimes called extension) involves delving further into the same topic, or considering it from a different perspective. It might foreground problem solving. Learners might need to acquire new knowledge and skills and may anticipate material that typically occurs later in the programme of study.
  • Pace (sometimes called acceleration) takes two different forms. It may be acceleration of the learner, for example advancing an individual to a higher year group in a subject where they are particularly strong. More often, it is acceleration of the learning, enabling learners to move through the programme of study at a relatively faster pace than some or all of their peers. Acceleration of learning can take place at a ‘micro’ level in differentiated lesson planning, or in a ‘macro’ sense, typically through setting. Both versions of acceleration will cause the learner to complete the programme of study sooner and they may be entered early for an associated test or examination.

It should be readily apparent that these concepts are not distinct but overlapping.  There might be an element of faster pace in extension, or increased depth in acceleration for example. A single learning opportunity may include two, or possibly all three. It is not always straightforward to disentangle them completely.

Applying these terms

From the learner’s perspective, one of these three elements can be dominant, with the preferred strategy determined by that learner’s attainment, progress and wider needs.

  • Enrichment might be dominant if the learner is an all-rounder, relatively strong in this subject but with equal or even greater strength elsewhere.
  • Extension might be dominant if the learner shows particular aptitude or interest in specific aspects of the programme of study.
  • Acceleration might be dominant if the learner is exceptionally strong in this subject, or has independently acquired and introduced knowledge or skills that are not normally encountered until later in this or a subsequent key stage.

Equally though, the richest learning experience is likely to involve a blend of all three elements in different combinations: restricting advanced learners to one or two of them might not always be in their best interests. Moreover, some high attainers will thrive with a comparatively ‘balanced scorecard’.

The intensity or degree of enrichment, extension or acceleration will also vary according to the learners’ needs. Even in a top set, decisions about how broadly to explore, how deeply to probe or how far and how fast to press forward must reflect learners’ starting points and the progress achieved to date.

Acceleration of the learner may be appropriate if he or she is exceptionally advanced.  Social and emotional maturity will need to be taken into account, but all learners are different – this should not be used as a blanket excuse for failing to apply the approach.

There must be evidence that the learner is in full command of the programme of study to date and that restricting his or her pace is having a detrimental effect. A pedagogical preference for moving the class along at the same pace should never over-ride the learner’s needs.

Both variants of acceleration demand careful long-term planning, so the learner can continue on a fast track where appropriate, or step off without loss of esteem. It will be frustrating for a high attainer expected to ‘mark time’ when continuity is lost. This may be particularly problematic on transfer and transition between settings.

Careful monitoring is also required, to ensure that the learner continues to benefit, is comfortable and remains on target to achieve the highest grades. No good purpose is served by ‘hothousing’.

Mastery and depth

The Expert Panel

The recent evolution of a mastery approach can be tracked back to the Report of the Expert Panel for the National Curriculum Review (December 2011).

‘Amongst the international systems which we have examined, there are several that appear to focus on fewer things in greater depth in primary education, and pay particular attention to all pupils having an adequate understanding of these key elements prior to moving to the next body of content – they are ‘ready to progress’…

… it is important to understand that this model applies principally to primary education. Many of the systems in which this model is used progressively change in secondary education to more selective and differentiated routes. Spread of attainment then appears to increase in many of these systems, but still with higher overall standards than we currently achieve in England…

There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others

These views cohere with our notion of a revised model that focuses on inclusion, mastery and progress. However, more work needs to be done around these issues, both with respect to children with learning difficulties and those regarded as high attainers.’

For reasons best known to itself, the Panel never undertook that further work in relation to high attainers, or at least it was never published. This has created a gap in the essential groundwork necessary for the adoption of a mastery-driven approach.

 .

National curriculum

Aspects of this thinking became embodied in the national curriculum, but there are some important checks and balances.

The inclusion statement requires differentiation for high attainers:

‘Teachers should set high expectations for every pupil. They should plan stretching work for pupils whose attainment is significantly above the expected standard.’

The primary programmes of study for all the core subjects remind everyone that:

Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage, if appropriate.’

But, in mathematics, both the primary and secondary PoS say:

‘The expectation is that the majority of pupils will move through the programmes of study at broadly the same pace. However, decisions about when to progress should always be based on the security of pupils’ understanding and their readiness to progress to the next stage. Pupils who grasp concepts rapidly should be challenged through being offered rich and sophisticated problems before any acceleration through new content. Those who are not sufficiently fluent with earlier material should consolidate their understanding, including through additional practice, before moving on.’

These three statements are carefully worded and, in circumstances where all apply, they need to be properly reconciled.

.

NCETM champions the maths mastery movement

The National Centre for Excellence in the Teaching of Mathematics (NCETM), a Government-funded entity responsible for raising levels of achievement in maths, has emerged as a cheerleader for and champion of a maths mastery approach.

It has published a paper ‘Mastery approaches to mathematics and the new national curriculum’ (October 2014).

Its Director, Charlie Stripp, has also written two blog posts on the topic, in October 2014 and April 2015, both of which are quoted below.

The October 2014 paper argues (my emphasis):

‘Though there are many differences between the education systems of England and those of east and south-east Asia, we can learn from the ‘mastery’ approach to teaching commonly followed in these countries. Certain principles and features characterise this approach…

… The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.’

It continues:

‘Taking a mastery approach, differentiation occurs in the support and intervention provided to different pupils, not in the topics taught, particularly at earlier stages. There is no differentiation in content taught, but the questioning and scaffolding individual pupils receive in class as they work through problems will differ, with higher attainers challenged through more demanding problems which deepen their knowledge of the same content.’

In his October 2014 post, Stripp opines:

‘Put crudely, standard approaches to differentiation commonly used in our primary school maths lessons involve some children being identified as ‘mathematically weak’ and being taught a reduced curriculum with ‘easier’ work to do, whilst others are identified as ‘mathematically able’ and given extension tasks….

…For the children identified as ‘mathematically able’:

  1. Extension work, unless very skilfully managed, can encourage the idea that success in maths is like a race, with a constant need to rush ahead, or it can involve unfocused investigative work that contributes little to pupils’ understanding. This means extension work can often result in superficial learning. Secure progress in learning maths is based on developing procedural fluency and a deep understanding of concepts in parallel, enabling connections to be made between mathematical ideas. Without deep learning that develops both of these aspects, progress cannot be sustained.
  2. Being identified as ‘able’ can limit pupils’ future progress by making them unwilling to tackle maths they find demanding because they don’t want to challenge their perception of themselves as being ‘clever’ and therefore finding maths easy….

…I do think much of what I’m saying here also applies at secondary level.

Countries at the top of the table for attainment in mathematics education employ a mastery approach to teaching mathematics. Teachers in these countries do not differentiate their maths teaching by restricting the mathematics that ‘weaker’ children experience, whilst encouraging ‘able’ children to ‘get ahead’ through extension tasks… Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace…’

The April 2015 post continues in a similar vein, commenting directly on the references in the PoS quoted above (my emphases):

‘The sentence: ‘Pupils who grasp concepts rapidly should be challenged through rich and sophisticated problems before any acceleration through new content’, directly discourages acceleration through content, instead requiring challenge through ‘rich and sophisticated (which I interpret as mathematically deeper) problems’. Engaging with ‘rich and sophisticated problems’ involves reasoning mathematically and applying maths to solve problems, addressing all three curriculum aims. All pupils should encounter such problems; different pupils engage with problems at different depths, but all pupils benefit

…Meeting the needs of all pupils without differentiation of lesson content requires ensuring that both (i) when a pupil is slow to grasp an aspect of the curriculum, he or she is supported to master it and (ii) all pupils should be challenged to understand more deeply…

The success of teaching for mastery in the Far East (and in the schools employing such teaching here in England) suggests that all pupils benefit more from deeper understanding than from acceleration to new material. Deeper understanding can be achieved for all pupils by questioning that asks them to articulate HOW and WHY different mathematical techniques work, and to make deep mathematical connections. These questions can be accessed by pupils at different depths and we have seen the Shanghai teachers, and many English primary teachers who are adopting a teaching for mastery approach, use them very skilfully to really challenge even the highest attaining pupils.’

The NCETM is producing guidance on assessment without levels, showing how to establish when a learner

‘…has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

.

Commentary

NCETM wants to establish a distinction between depth achieved via problem-solving (good) and depth achieved via extension tasks (bad).

There is some unhelpful terminological confusion in the assumption that extension tasks necessarily require learners to anticipate material not yet covered by the majority of the class.

Leaving that aside, notice how the relatively balanced wording in the programme of study is gradually adjusted until the balance has disappeared.

The PoS says ‘the majority of pupils will move through…at broadly the same pace’ and that they ‘should be challenged through being offered rich and sophisticated problems before any acceleration through new content’.

This is first translated into

‘…the large majority of pupils progress through the curriculum content at the same pace’ (NCETM paper) then it becomes

‘…expose almost all of the children to the same curriculum content at the same pace’ (Stripp’s initial post) and finally emerges as

‘Meeting the needs of all pupils without differentiation of lesson content’ and

‘…all pupils benefit more from deeper understanding than from acceleration to new material.’ (Stripp’s second post).

Even a non-mathematician will tell you that the difference between a majority (just over 50%) and all (100%) can be close to 50 percentage points.

Such a minority could very comfortably include all children achieving L3 equivalent at KS1 or L5 equivalent at KS2, or all those deemed high attainers in the Primary and Secondary Performance Tables.

The NCETM pretends that this minority does not exist.

It does not consider the scope for acceleration towards new content subsequent to the delivery of ‘rich and sophisticated problems’.

Instead it argues that the statement in the PoS ‘directly discourages acceleration through content’ when it does no such thing.

This is propaganda, but why is NCETM advancing it?

One possibility, not fully developed in these commentaries, is that teachers find it easier to work in this way. To be successful, ‘extension work’ demands exceptionally skilful management.

On the other hand, Stripp celebrates the fact that Shanghai teachers:

…were very skilled at questioning and challenging children to engage more deeply with maths within the context of whole class teaching.’

It is a moot point whether such questioning, combined with the capacity to develop ‘rich and sophisticated problems’, is any more straightforward for teachers to master than the capacity to devise suitable extension tasks, especially when one approach is relatively more familiar than the other.

Meanwhile, every effort is made to associate maths mastery with other predilections and prejudices entertained by educational professionals:

  • The claim that it will have a positive impact on teacher workload, though no evidence – real or imagined – is cited to support this belief.
  • The belief that all children can be successful at maths (though with no acknowledgement that some will always be comparatively more successful than others) and an associated commitment to ‘mindset’, encouraging learners to associate success with effort and hard work rather than underlying aptitude.
  • The longstanding opposition of many in the maths education community to any form of acceleration, fuelled by alarming histories of failed prodigies at one extreme and poorly targeted early entry policies at the other. (I well remember discussing this with them as far back as the nineties.)
  • The still contested benefits of life without levels.

On this latter point, the guidance NCETM is developing appears to assume that ‘exceeding national expectations’ in maths must necessarily involve ‘working deeper’.

I have repeatedly argued that, for high attainers, such measures should acknowledge the potential contributions of breadth, depth and pace.

Indeed, following a meeting and email exchanges last December, NAHT said it wanted to employ me to help develop such guidance, as part of its bigger assessment package.

(Then nothing more – no explanation, no apology, zilch. Shame on you, Mr Hobby. That’s no way to run an organisation.)

.

Conclusion

Compared with the richness of the tripartite G&T model, the emphasis placed exclusively on depth in the NCETM mastery narrative seems relatively one-dimensional and impoverished.

There is no great evidence in this NCETM material of a willingness to develop an alternative understanding of ‘stretch and challenge’ for high attainers.  Vague terms like  ‘intelligent practice’, ‘deep thinking’ and ‘deep learning’ are bandied about like magical incantations, but what do they really mean?

NCETM needs to revisit the relevant statement in the programme of study and strip away (pun intended) the ‘Chinese whispers’ (pun once more intended) in which they have cocooned it.

Teachers climbing aboard the maths mastery bandwagon need meaningful, free-to-access guidance that helps them to construct suitably demanding and sophisticated problems and to deploy advanced questioning techniques that get the best out of their high attainers.

I do not dismiss the possibility that high attainers can thrive under a mastery model that foregrounds depth over breadth and pace, but it is a mistake to neglect breadth and pace entirely.

Shanghai might be an exception, but most of the other East Asian cradles of mastery also run parallel gifted education programmes in which accelerated maths is typically predominant. I’ve reviewed several on this Blog.

For a more recent treatment of these issues see my September 2015 post here.

.

GP

April 2015

Protecting pupil premium for high attainers

.

This post continues the campaign I have been waging against the Fair Education Alliance, a Teach First-inspired ‘coalition for change in education’, over a proposal in its Report Card 2014 to halve the pupil premium for disadvantaged learners with high prior attainment.

I am:

  • Inviting Fair Education Alliance members (and Read On. Get On. partners) to defend the proposal or else distance themselves from it and
  • Calling on both campaigns to withdraw it.

.

Background

The Fair Education Alliance was launched by Teach First in June 2014. It aims to:

‘…significantly narrow the achievement gap between young people from our poorest communities and their wealthier peers by 2022’.

There are 27 members in all (see below).

The Alliance plans to monitor progress annually against five Fair Education Impact Goals through an annual Report Card.

The first Report Card, published in December 2014, explains that the Alliance was formed:

‘…in response to the growing demand for a national debate on why thousands of children do not get a fair education’.

The Impact Goals are described thus:

  • ‘Narrow the gap in literacy and numeracy at primary school

The Fair Education Alliance is committed to closing the attainment gap between primary schools serving lower income pupils and those educating higher income pupils. Our goal is for this gap to be narrowed by 90 % by 2022.

  • Narrow the gap in GCSE attainment at secondary school

The Fair Education Alliance is committed to closing the attainment gap between secondary schools serving lower income pupils and those educating higher income pupils. Our goal is to close 44 % of this gap by 2022.

  • Ensure young people develop key strengths, including resilience and wellbeing, to support high aspirations

The Fair Education Alliance is committed to ensuring young people develop non-cognitive skills, including the positive wellbeing and resilience they need to succeed in life. The Alliance will be working with other organisations to develop measurement tools which will allow the development of these key skills to be captured.

  • Narrow the gap in the proportion of young people taking part in further education or employment-based training after finishing their GCSEs.

The Fair Education Alliance wants to see an increase in the number of young people from low-income communities who stay in further education or employment-based training once they have completed Key Stage 4. Our goal is for 90% of young people from schools serving low income communities to be in post-16 education or employment-based training by 2022.

  • Narrow the gap in university graduation, including from the 25% most selective universities

The Fair Education Alliance is committed to closing the graduation gap between young people from low income backgrounds and those from high income backgrounds. Our goal is for at least 5,000 more pupils from low income backgrounds to graduate each year, with 1,600 of these young people graduating from the most selective universities.’

The problematic proposal relates to Impact Goal 2, focused on the GCSE attainment gap in secondary schools.

The gap in question is between:

  • Schools serving low income communities: ‘State schools where 50 % or more of the pupils attending come from the most deprived 30 % of families according to the Income Deprivation Affecting Children Index (IDACI)’ and
  • Schools serving high income communities: ‘State schools where 50 % or more of the pupils attending come from the least deprived 30 % of families according to IDACI’.
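Expressed as a simple rule, the classification works roughly as sketched below. This is a hedged illustration only, assuming we already know each school’s IDACI shares; the function name and structure are mine, not the Alliance’s own method.

```python
# A minimal sketch of the school-level classification quoted above.

def classify_school(share_most_deprived_30: float, share_least_deprived_30: float) -> str:
    """Return which side of the FEA gap measure a school falls on, if either."""
    if share_most_deprived_30 >= 0.5:
        return "serves a low income community"
    if share_least_deprived_30 >= 0.5:
        return "serves a high income community"
    return "in neither group"  # schools with mixed intakes drop out of the gap measure

print(classify_school(0.62, 0.10))   # serves a low income community
print(classify_school(0.20, 0.55))   # serves a high income community
print(classify_school(0.35, 0.30))   # in neither group
```

One consequence worth noting is that schools with mixed intakes fall into neither camp, so the Goal 2 gap is measured only between the two ends of the distribution.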

The Report Card explains that the Alliance is focused on gaps between schools rather than gaps between pupils:

‘…to better capture data that includes those pupils whose families are on a low income but are just above the income threshold for free school meals (the poverty measure in schooling). This measurement also helps monitor the impact of the Alliance’s efforts towards meeting the goals as many members work with and through schools to tackle educational inequality, rather than with individual pupils.’

Under Goal 2, the gap the Alliance wishes to close relates to:

‘Average point score…across eight GCSE subjects, with extra weighting for English and maths’

The measure excludes equivalent qualifications. The baseline gap – derived from 2012/13 data –

‘…is currently 101.7 average points – the difference between 8 C grades and 8 A grades.’

The Report Card says this gap has narrowed by 10.5% since 2010/11, but warns that new accountability measures could work in the opposite direction.

The problematic recommendation

The Report Card discusses the distribution of funding to support deprivation, arguing that:

  • Some aspects of disadvantage ‘are given less recognition in the current funding system. For instance FSM Ever 6 does not include low income families who just miss the eligibility criteria for free school meals; and the national funding formula is not able to compensate for geographical isolation and high transport costs which can compound low incomes in parts of the country.’
  • ‘Consequently – due to the combination of a high intake of pupils attracting the premium and a currently unequal national school funding formula – there are a small number of very successful schools building up large surpluses. Meanwhile some schools with arguably greater need, where pupils suffer different socioeconomic disadvantages that affect their attainment, are receiving comparatively little extra funding. This hampers their ability to deal with the challenges that their students face and to prevent those vulnerable pupils from falling behind their peers.’

To rectify this problem, the Report Card recommends a significant policy adjustment:

Target pupil premium by attainment as well as disadvantage measures: This could be achieved through halving current funding per pupil for FSM Ever 6. Half of this funding could then be re-allocated to pupils eligible for FSM Ever 6 who have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend. The change of funding model would increase school accountability for ‘catching up’ pupils.

The proposal is advanced in a section about secondary schools; it is unclear whether it is intended to apply equally to primary schools.

Quite what constitutes low prior attainment is never made entirely clear either. One assumes that, for secondary students, it is anything below the scaled score equivalent of KS2 L4b in English (reading and writing), maths or both.

This does of course mean that learners attracting the pupil premium who achieve the requisite scores will be as much short-changed as those who exceed them. Low attainers must take precedence over middle attainers as well as high attainers.
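To see what the halving and re-allocation would mean per pupil, here is a minimal sketch of the arithmetic as I read the recommendation, taking the current secondary rate of £935 per ‘ever 6 FSM’ pupil and assuming, purely for illustration, that 35% of eligible pupils count as low attainers. Both the mechanism as coded and the 35% share are my own reading, not figures from the Report Card.

```python
# A minimal sketch of the recommendation as I read it: halve the per-pupil rate for every
# 'ever 6 FSM' pupil, then share the freed-up half of the pot equally among eligible pupils
# with low prior attainment. The 35% low-attainer share is an illustrative assumption.

def reallocated_rates(premium: float, low_attainer_share: float):
    """Return (rate for middle/high attainers, rate for low attainers), per eligible pupil."""
    reduced = premium / 2                    # every eligible pupil keeps half the current rate
    extra = reduced / low_attainer_share     # the other half of the pot, spread across low attainers
    return reduced, reduced + extra

middle_high, low = reallocated_rates(premium=935, low_attainer_share=0.35)
print(f"Middle and high attainers: £{middle_high:,.2f}")   # £467.50
print(f"Low attainers:             £{low:,.2f}")           # roughly £1,803
```

Whatever share is assumed, every disadvantaged middle and high attainer loses half their premium, which is the nub of what follows.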

I am minded to extend my campaign to encompass the ‘squeezed middle’, but perhaps I should let someone else bear that standard.

.

Why this is objectionable

I oppose this proposal because:

  • The pupil premium is described as ‘additional funding for publicly funded schools in England to raise the attainment of disadvantaged pupils and close the gap between them and their peers’. Although not a personal funding entitlement – the funding can be aggregated and deployed as schools see fit – schools are held accountable for the impact of the pupil premium on the attainment and progress of the pupils that attract it. There is presently no distinction according to the attainment of these students, but the change proposed by the Alliance would shift the accountability focus to prioritise the achievement and progress of disadvantaged low attainers over disadvantaged middle and high attainers.
  • The pupil premium should not be treated as part of the overall school budget. As Ofsted said in its first report on the premium (September 2012):

‘School leaders, including governing bodies, should ensure that Pupil Premium funding is not simply absorbed into mainstream budgets, but instead is carefully targeted at the designated children. They should be able to identify clearly how the money is being spent.’

Since the premium follows the pupil, schools with large numbers of eligible pupils should not have any part of this funding clawed back, nor should those with relatively few eligible pupils have it supplemented.

  • If there are problems with the distribution of deprivation funding, this should be addressed through the school funding formula. It is wrong to suggest that a national funding formula would be incapable of compensating for associated sparsity factors. It is for those devising such a formula to determine whether to compensate for pupils not eligible for the premium and factors such as geographical isolation and high transport costs. The Alliance is perfectly entitled to lobby for this. But, in the absence of such a formula, the premium should not be rationed or redistributed to compensate.

  • Ofsted has already criticised how little of the pupil premium reaches the disadvantaged most able:

‘Our report in 2013 found few instances of the pupil premium being used effectively to support the disadvantaged most able pupils. In the schools visited for this survey, about a third were using the pupil premium funding effectively to target the needs of these pupils.’

  • Any decision to double weight pupil premium for disadvantaged learners with low prior attainment would be likely to penalise disadvantaged high attainers. Although schools could theoretically decide to aggregate the funding and spend it differently, the clear intention is that the accountability framework would incentivise correspondingly stronger improvement by low attainers relative to middle and higher attainers. It is hard to understand how this, combined with the redistribution of funding, would help schools to support the latter and so meet Ofsted’s expectations
  • There are strong equity arguments against such a redistribution: disadvantaged learners should not be penalised on the basis of their prior attainment. That is  not ‘A fair education for all’, nor is it consistent with the ‘sound moral argument for giving every child an equal chance to succeed‘ mentioned in the Executive Summary of the Report Card. There is a fundamental distinction between reflecting the additional costs attributable to supporting all low attainers in the funding formula and redistributing allocations associated with individual disadvantaged learners for the same purpose.
  • The Report Card itself recognises the significance of disadvantaged high attainers:

‘As the Level 5 attainment gap highlights, there is not only a need to catch up those ‘slipping behind’ but also an imperative to ‘stretch the top’ when looking at pupils from low income communities. Some schools do well by this measure: sharing best practice in making better than expected levels of progress and stretching the highest attainers is crucial for ensuring all schools can replicate the successes some have already developed.’

How this can be squared with the proposed redistribution of pupil premium is not addressed. 

  • Such a policy would make the Alliance’s own goal of narrowing the gap in university graduation from the 25% most selective universities much harder to achieve, since it would reduce the likelihood of disadvantaged learners reaching the level of attainment necessary to secure admission.
  • There is already additional funding, outside the school funding settlement, dedicated to ‘catch-up’ for those with low prior attainment. Well over £50m per year is allocated to the ‘catch-up premium’ providing £500 per pupil who did not achieve at least KS2 L4 in reading and/or maths. This may be used for individual or small group tuition, summer schools or resources and materials. A further £50m has also been top-sliced from the pupil premium to provide an annual summer schools programme for those at the end of KS2. A core purpose is ‘to help disadvantaged pupils who are behind in key areas such as literacy and numeracy to catch up with their peers’. There is no corresponding funding for disadvantaged high attainers.
  • For FY2015/16, the Government adjusted the funding formula to allocate an additional £390m to schools in the least fairly funded authorities. This involved setting a minimum funding level for five pupil characteristics, one being ‘pupils from deprived backgrounds’, another ‘pupils with low attainment before starting at their primary or secondary school’. The values for the latter are £660 for primary schools and £940 for secondary schools. This establishes a precedent for reflecting the needs of low attaining learners in further progress towards a national funding formula.

.

The campaign to date

I had an inconclusive discussion with Teach First officials on the day the Report Card was published.

.

Subsequently I pressed the Fair Education Alliance spokesperson at Teach First on some specific questions.

.

I received two undertakings to respond online but nothing has materialised. Finally, on 17 April I requested a response within 24 hours.

.

Nothing doing.

Meanwhile though, Sam Freedman published a piece that appeared to accept that such imbalances should be rectified through the schools funding formula:

‘The distribution, in turn, will depend on whether the next Government maintains the pupil premium at the same level – which has shifted funds towards poorer parts of the country – and whether they introduce a “National Funding Formula” (NFF).

At the moment there are significant and historic differences between funding in different parts of the country. Inner London for instance is overfunded, and many schools have significant surpluses, whereas other parts of the country, often more rural, have much tighter margins. The current Government have taken steps to remedy this but plan to go further if they win the election by introducing a NFF. Doing this would help alleviate the worst effects of the cuts for schools that are currently underfunded.’

Freedman himself retweeted this comment.

We had a further conversation on 20 April after this post had been published.

.

.

Another influential member of the Twitterati also appeared to be influenced – if not yet fully converted – by my line of argument:

Positive though some of these indications are, there are grounds to fear that at least some Alliance Members remain wedded to the redistribution of pupil premium.

The idea recently reappeared in a publication underpinning the Read On Get On campaign, supported by a variety of organisations including Teach First and some of the Fair Education Alliance.

The report in question – The Power of Reading (April 2015) – mentions that:

‘The Read On. Get On. campaign is working closely with the Fair Education Alliance and the National Literacy Forum to achieve our core goals, and this report reflects and builds on their recommendations.’

One of its ‘recommendations to the new Government’ is ‘Ensure stronger support for disadvantaged children who are falling behind’.

‘In what is likely to be a tight public spending round, our priority for further investment is to improve the quality of early education for the poorest children, as set out above. However, there are options for reforming existing pupil premium spending for primary school children so that it focuses resources and accountability on children from disadvantaged backgrounds who are falling behind…

….One option proposed by the Fair Education Alliance is to refocus the existing pupil premium on children who are eligible for free school meals and who start primary school behind. This would use existing funding and accountability mechanisms for the pupil premium to focus attention on children who need the most urgent help to progress, including in reading. It would make primary schools more accountable for how they support disadvantaged children who are falling behind. The primary pupil premium will be worth £1,300 per pupil in 2015–16 and is paid straight to schools for any child registered as eligible for free school meals at any point in the last six years. The FEA proposes halving the existing premium, and redistributing the other half to children who meet the existing eligibility criteria and have low prior attainment. New baseline tests for children at the start of the reception year, to be introduced in September 2016, could be used as the basis for measuring the prior attainment of children starting primary school.’

Interestingly, this appears to confirm that the Fair Education Alliance supports a redistribution of pupil premium in the primary sector as well as the secondary, something I could not find expressed on the face of the Report Card.

I reacted angrily

.

The campaign continued

It won’t be long now before I leave the education world behind for ever, but I have decided to devote spare moments to the pursuit on social media of the organisations that form the Fair Education Alliance and/or support Read On. Get On.

I am asking each organisation to:

  • Justify their support for the policy that has been advanced or 
  • Formally distance themselves from it

I also extend an invitation to both campaigns to formally withdraw their proposals.

I shall publish the outcomes here.

The organisations involved are listed below. If any of them would care to cut to the chase, they are most welcome to use the comments facility on this blog or tweet me @GiftedPhoenix

Since my experience to date has been of surprising coyness when organisations are challenged over their ill-conceived policy ideas, I am imposing a ‘three strikes’ rule.

Any organisation that fails to respond having been challenged three times will be awarded a badge of shame and consigned to the Scrapheap.

Let’s see who’s in there by the end of term.

.

[Postscript 2 (May 10 2015): Teach First published a defence of its policy on 29 April. On 30 April I published a further post fisking this statement to reveal the weaknesses and gaps in their argument.

Of the organisations that are members of the Alliance and/or support Read On. Get On, only Future Leaders and NAHT have responded to my request for clarification.

Future Leaders have distanced themselves from the offending proposal (see their comment on this blog). NAHT has published a response from Russell Hobby to which I have replied. We meet shortly to discuss the matter.

Importantly though, the National Governors’ Association (NGA) has also confirmed its opposition

.

.

And so has ASCL. General Secretary Brian Lightman sent me this statement:

‘ASCL is not a member of the Fair Education Alliance at this stage although we do agree with many aspects for what they are doing and are in discussion with them about what we might support and how. 

However with regards to this specific point our position is similar to the one that NGA expressed. We would not be in agreement with allocating PP on the basis of prior attainment.  FSM is a proxy measure which is used to identify the overall level of disadvantage in a school and therefore pupil premium allocations

We strongly believe that decisions about how to use the PP in schools should be decisions made by school leaders who are fully  accountable for the impact of their decisions.’]

.

GP

April 2015

.


Fair Education Alliance

.

I have published a comment from Future Leaders in which they accept that:

‘…mid- and high-attainers from poor backgrounds should not be deprived of the support that they need to succeed’.

Thanks to them for their prompt and clear response.

.

Read On. Get On.

.

The Scrapheap

.

National Literacy Trust (12/5/15)

Achievement for All (9/6/15)

Teaching Leaders (9/6/15)

.

Has Ofsted improved inspection of the most able?

.

This post examines the quality of Ofsted reporting on how well secondary schools educate their most able learners.

The analysis is based on a sample of 87 Section 5 inspection reports published during March 2015.

I have compared the results with those obtained from a parallel exercise undertaken a year ago and published in How well is Ofsted reporting on the most able? (May 2014).

This new post considers how inspectors’ assessments have changed in the light of their increased experience, additional guidance and – most recently – the publication of Ofsted’s survey report: The most able students: An update on progress since June 2013.

This appeared on 4 March 2015, at the beginning of my survey period, although it was heralded in HMCI’s Annual Report and the various supporting materials published alongside it in December 2014. One might therefore expect it to have had an immediate effect on inspection practice.

Those seeking further details of either of these publications are cordially invited to consult the earlier posts I dedicated to them.

The organisation of this post is straightforward.

The first section considers how Ofsted expects its inspectors to report on provision for the most able, as required by the current Inspection Handbook and associated guidance. It also explores how those expectations were intended to change in the light of the Update on Progress.

Subsequent sections set out the findings from my own survey:

  • The nature of the 2015 sample – and how this differs from the 2014 sample
  • Coverage in Key Findings and Areas for Improvement
  • Coverage in the main body of reports, especially under Quality of Teaching and Achievement of Pupils, the sections that most commonly feature material about the most able

The final section follows last year’s practice in offering a set of key findings and areas for improvement for consideration by Ofsted.

I have supplied page jumps to each section from the descriptions above.

How inspectors should address the most able

.

Definition and distribution

Ofsted nowhere explains how inspectors are to define the most able. It is not clear whether inspectors permit schools to supply their own definitions or apply the distinctions adopted in Ofsted’s survey reports. This is not entirely helpful to schools.

In the original survey – The most able students: Are they doing as well as they should in our non-selective secondary schools? (June 2013) – Ofsted described the most able as:

‘…the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

The measure of potential is not defined, but an example is given: EAL students who are new to the country and so might not (yet) have achieved Level 5.

In the new survey prior attainment at KS2 remains the indicator, but the reference to potential is dropped:

‘…students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2’

The size of this group varies at national level according to the year group.

If we take learners in Year 7 who completed KS2 in 2014, the data shows that 24% achieved KS2 Level 5 in both English (reading and writing) and maths. A further 5% secured L5 in English (reading and writing) only, while another 20% reached L5 in maths only.

So 49% of the present Year 7 are deemed high attainers.

.

Venn diagram: KS2 Level 5 attainment in English (reading and writing) and mathematics, 2014 Year 7 cohort

But this proportion falls to about 40% amongst those who completed KS4 in 2014 and so typically undertook KS2 assessment five years earlier in 2009.

Ofsted’s measure is different to the definition adopted in the Secondary Performance Tables which, although also based on prior attainment at KS2, depends on an APS of 30 or higher in KS2 tests in the core subjects.

Only ‘all-rounders’ count according to this definition, while Ofsted includes those who are relatively strong in either maths or English but who might be weak in the other subject. Neither approach considers achievement beyond the core subjects.

According to the Performance Tables definition, amongst the cohort completing KS4 in 2014, only 32.3% of those in state-funded schools were deemed high attainers, some eight percentage points lower than Ofsted’s figure.
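
To make the contrast concrete, here is a minimal sketch in Python. The KS2 point values (Level 3 = 21 points, Level 4 = 27, Level 5 = 33, Level 6 = 39) follow the usual six-points-per-level convention, but that mapping, the function names and the worked example are my own illustration, not anything specified by Ofsted or in the Performance Tables documentation.

# Illustrative sketch only: the two 'high attainer' definitions discussed above.
# The KS2 point scores per level are an assumed convention, not an Ofsted rule.
KS2_POINTS = {3: 21, 4: 27, 5: 33, 6: 39}

def most_able_ofsted(english_level, maths_level):
    # Ofsted (2015 survey): Level 5 or above in English (reading and writing) and/or maths
    return english_level >= 5 or maths_level >= 5

def high_attainer_performance_tables(english_level, maths_level, science_level):
    # Performance Tables: KS2 average point score of 30 or higher across the core subjects
    aps = sum(KS2_POINTS[level] for level in (english_level, maths_level, science_level)) / 3
    return aps >= 30

# A pupil with L5 in maths but L4 in English and science counts for Ofsted
# but not for the Performance Tables measure (APS = 29).
print(most_able_ofsted(4, 5))                     # True
print(high_attainer_performance_tables(4, 5, 4))  # False

# The Year 7 figures quoted above are disjoint categories, so the Ofsted
# cohort is simply 24% + 5% + 20% = 49%.

The sketch simply shows why a pupil who is relatively strong in one core subject but weaker in the others can fall on different sides of the two definitions; it is not a reconstruction of either organisation’s actual calculation.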

The sheer size of Ofsted’s most able cohort will be surprising to some, who might naturally assume a higher hurdle and a correspondingly smaller group. The span of attainment it covers is huge, from a single L5C (possibly paired with an L3) to three L6s.

But the generosity of Ofsted’s assumptions does mean that every year group in every school should contain at least a handful of high attainers, regardless of the characteristics of its intake.

Unfortunately, Ofsted’s survey report does not say exactly how many schools have negligible numbers of high attainers, telling us only how many non-selective schools had at least one pupil in their 2014 GCSE cohort with the requisite prior attainment in English, in maths and in both English and maths.

In each case, some 2,850 secondary schools had at least one student within scope. This means that, for each category, roughly 9% of schools had no qualifying students, but we have no way of establishing how many had none in all three categories.

Using the rival Performance Table definition, only some 92 state-funded non-selective secondary schools reported a 2014 GCSE cohort with 10% or fewer high attainers. The lowest recorded percentage is 3% and, of those with 5% or fewer, the number of high attaining students ranges from 1 to 9.

Because Ofsted’s definition is more liberal, one might reasonably assume that every secondary school has at least one high-attaining student per year group, though there will be a handful of schools with very few indeed.

At the other extreme, according to the Performance Tables definition, over 100 state-funded non-selective schools can boast a 2014 GCSE population where high attainers are in the majority – and the highest recorded percentage for a state-funded comprehensive is 86%. Using Ofsted’s measure, the number of schools in this position will be substantially higher.

For the analysis below, I have linked the number of high attainers (according to the Performance Tables) in a school’s 2014 GCSE cohort with the outcomes of inspection, so as to explore whether there is a relationship between these two variables.
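
For readers who want to replicate this kind of linkage, a minimal sketch follows, assuming two CSV extracts: one derived from the Performance Tables giving a high attainer percentage per school, and one listing the sampled inspection reports with their overall grades. The file names and column headings are invented for illustration; they are not the actual DfE or Ofsted field names.

import pandas as pd

# Hypothetical extracts: file names and column headings are illustrative only.
performance = pd.read_csv("performance_tables_2014.csv")  # columns: URN, pct_high_attainers
inspections = pd.read_csv("march_2015_sample.csv")        # columns: URN, overall_grade

# Link the two datasets on the school's unique reference number (URN)
linked = inspections.merge(performance, on="URN", how="left")

# Average, range and count of high attainer percentages by overall inspection grade
summary = (linked.groupby("overall_grade")["pct_high_attainers"]
                 .agg(["mean", "min", "max", "count"]))
print(summary)

On the figures reported later in this post, a summary of this kind would show averages of roughly 32-33% for schools rated outstanding or good against around 23% for those requiring improvement or judged inadequate.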

Framework and Handbook

The current Framework for School Inspection (December 2014) makes no reference to the most able.

Inspectors must consider:

‘…the extent to which the education provided by the school meets the needs of the range of pupils at the school, and in particular the needs of disabled pupils and those who have special educational needs.’

One of the principles of school inspection is that it will:

‘focus on pupils’ and parents’ needs by…evaluating the extent to which schools provide an inclusive environment that meets the needs of all pupils, irrespective of age, disability, gender, race, religion or belief, or sexual orientation’.

Neither ability nor attainment is mentioned. This may or may not change when the Common Inspection Framework is published.

The most recent version of the School Inspection Handbook (December 2014) has much more to say on the issue. All relevant references in the main text and in the grade descriptors are set out in the Annex at the end of this post.

Key points include:

  • Ofsted uses inconsistent terminology (‘most able’, ‘more able’, ‘highest attainers’) without distinguishing between these terms.
  • Most of the references to the most able occur in lists of different groups of learners, another of which is typically ‘disadvantaged pupils’. This gives the mistaken impression that the two groups are distinct – that there is no such thing as a most able disadvantaged learner.
  • The Common Inspection Framework will be supported by separate inspection handbooks for each sector. The consultation response does not mention any revisions relating to the most able; neither does the March 2015 survey report say that revisions will be introduced in these handbooks to reflect its findings and recommendations (but see below). 

.

Guidance

Since the first survey report was published in 2013, several pieces of guidance have been issued to inspectors.

  • In Schools and Inspection (October 2013), inspectors’ attention is drawn to key revisions to the section 5 inspection framework:

‘In judging the quality of teaching…Inspectors will evaluate how teaching meets the needs of, and provides appropriate challenge to, the most able pupils. Underachievement of the most able pupils can trigger the judgements of inadequate achievement and inadequate teaching.’

In relation to report writing:

‘Inspectors are also reminded that they should include a short statement in the report on how well the most able pupils are learning and making progress and the outcomes for these pupils.’

  • In Schools and Inspection (March 2014) several amendments are noted to Section 5 inspection and report writing guidance from January of that year, including:

‘Most Able – Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

Moreover, for secondary schools:

‘There must be a comment on early entry for GCSE examinations. Where the school has an early entry policy, inspectors must be clear on whether early entry is limiting the potential of the most able pupils. Where early entry is not used, inspectors must comment briefly to that effect.’

  • In School Inspection Update (December 2014) Ofsted’s National Director, Schools reminds inspectors, following the first of a series of half-termly reviews of ‘the impact of policy on school inspection practice’, to:

‘…place greater emphasis, in line with the handbook changes from September, on the following areas in section 5 inspection reports…The provision and outcomes for different groups of children, notably the most-able pupils and the disadvantaged (as referred to in the handbook in paragraphs 40, 129, 137, 147, 155, 180, 186, 194, 195, 196, 207, 208, 210 and 212).’

HMCI’s Annual Report

The 2014 Annual Report said (my emphasis):

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential.’

HMCI’s Commentary on the Report added for good measure:

‘In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections.’

So we are to expect a combination of broader focus, closer scrutiny and sharper recommendations.

The Annual Report relates to AY2013/14 and was published at the end of the first term of AY2014/15 and the end of calendar year 2014, so one assumes that references to the ‘coming year’ and ‘the year ahead’ are to calendar year 2015.

We should be able to see the impact of this ramping up in the sample I have selected, but some further change is also likely.

March 2015 survey report

One of the key findings from the March 2015 survey was (my emphasis):

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

Ofsted directed three recommendations at itself which do not altogether reflect this (my emboldening):

‘Ofsted should:

  • Make sure that inspections continue to focus sharply on the progress made by students who are able and disadvantaged.
  • Report more robustly about how well schools promote the needs of the most able through the quality of their curriculum and the information, advice and guidance they offer to the most able students.
  • Ensure thematic surveys investigate, where appropriate, how well the most able are supported through, for example, schools’ use of the pupil premium and the curriculum provided.’

The first of these recommendations implies that inspections already focus sufficiently on the progress of able and disadvantaged learners – an assumption that we shall test in the analysis below – and, by extension, that no further change is necessary.

The third alludes to the most able disadvantaged but relates solely to thematic surveys, not to Section 5 inspection reports.

The second may imply that further emphasis will be placed on inspecting the appropriateness of the curriculum and IAG. Both of these topics seem likely to feature more strongly in a generic sense in the new Framework and Handbooks. One assumes that this will be extended to the most able, amongst other groups.

Though the survey report makes no mention of it, we do know that Ofsted is preparing an evaluation toolkit. This emerged in a speech given by its Schools Director almost immediately after publication:

‘In this region specifically, inspectors have met with headteachers to address the poor achievement of the brightest disadvantaged children.

And inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals.’

It is not clear from this whether the toolkit will be confined only to the most able disadvantaged or will have wider coverage.

Moreover, this statement raises the prospect that the toolkit might be similar in style to The Pupil Premium: Analysis and challenge tools for schools (January 2013). This is more akin to an old spanner than a Swiss army penknife. Anything of this nature would be rather less helpful than the term ‘toolkit’ implies.

At his request, I emailed Ofsted’s Director, Schools with questions on 21 March 2015, requesting further details of the toolkit. At the time of writing I have yet to receive a reply.

.

The sample

I have selected a sample on almost identical lines to the one used in my 2014 analysis, one year on. It comprises the 87 Section 5 inspection reports on secondary schools (excluding middle schools deemed secondary) published by Ofsted during March 2015.

The bulk of the inspections were undertaken in February 2015, though a few took place in late January or early March.

Chart 1 gives the regional breakdown of the schools in the sample. All nine regions are represented, though there are only five schools from the North East, while Yorkshire and Humberside boasts 15. There are between seven and 11 schools in each of the other regions. In total 59 local authorities are represented.

In regional terms, this sample is more evenly balanced than the 2014 equivalent and the total number of authorities is two higher.

 .


Chart 1: Schools within the sample by region

Chart 2 shows how different statuses of school are represented within the sample.

All are non-selective. Fifty-three schools (61%) are academies, divided almost equally between the sponsored and converter varieties.

Community and foundation schools together form a third group of equivalent size, while the seven remaining schools have voluntary status, just one of them voluntary controlled. There are no free schools.

.


Chart 2: Schools within the sample by status

.

All but three of the schools are mixed – and those three are boys’ schools.

As for age range, there is one 13-18 and one 14-18 school. Otherwise there are 32 11-16 institutions (37% of the sample) while the remaining 53 (61%) are 11-18 or 11-19 institutions.

Chart 3 shows the variation in numbers on roll. The smallest school – a new 11-18 secondary school – has just 125 pupils; the largest has 2,083. The average is 912.

Fifty-two schools (60%) have between 600 and 1,200 pupils, and twenty-three (26%) between 800 and 1,000.

.


Chart 3: Schools within the sample by NOR

. 

Chart 4 shows the overall inspection grade of schools within the sample. A total of 19 schools (22%) are rated inadequate, seven of them attracting special measures. Only nine (10%) are outstanding, while 27 (31%) are good and 32 (37%) require improvement.

This is very similar to the distribution in the 2014 sample, except that there are slightly more inadequate schools and slightly fewer requiring improvement.

.


Chart 4: Schools within the sample by overall inspection grade

Unlike the 2014 analysis, I have also explored the distribution of all grades within reports. The results are set out in Chart 5.

Schools in the sample are relatively more secure on Leadership and management (55% outstanding or good) and Behaviour and safety of pupils (60% outstanding or good) than they are on Quality of teaching (43% outstanding or good) and Achievement of pupils (41% outstanding or good).

.


Chart 5: Schools within the sample by inspection sub-grades

A further new feature this year is a comparison with the number and percentage of high attainers.

Amongst the sample, the number of high attainers in the 2014 GCSE cohort varied from three to 196 and the percentage from 3% to 52%. (Two schools did not have a GCSE cohort in 2014.)

These distributions are shown in scatter Charts 6 and 7, below.

Chart 6 (number) shows one major outlier at the top of the distribution. Almost two thirds of the sample (64%) record numbers between 20 and 60. The average number is 41.

.


Chart 6: Schools within the sample by number of high attainers (Secondary Performance Tables measure)

. 

Chart 7 again has a single outlier, this time at the bottom of the distribution. The average is 32%, slightly less than the 32.3% reported for all state-funded schools in the Performance Tables.

Two in five of the sample register a high attainer percentage of between 20% and 30%, while three in five register between 20% and 40%.

But almost a third have a high attainer population of 20% or lower.

.


Chart 7: Schools within the sample by percentage of high attainers (Secondary Performance Tables measure)

Out of curiosity, I compared the overall inspection grade with the percentage of high attainers.

  • Amongst the nine outstanding schools, the percentage of high attainers ranged from 22% to 47%, averaging 33% (there was also one without a high attainer percentage).
  • Amongst the 27 good schools, the percentage of high attainers was between 13% and 52% (plus one without a high attainer percentage) and averaged 32%.
  • Amongst the 32 schools requiring improvement, the percentage of high attainers varied between 3% and 40% and averaged 23%.
  • Amongst the 19 inadequate schools, the percentage of high attainers lay between 10% and 38% and also averaged 23%.

This may suggest a tendency for outstanding/good schools to have a somewhat larger proportion of high attainers than schools judged to be requiring improvement or inadequate.

Key findings and areas for improvement

.

Distribution of comments

Thirty-nine of the reports in the sample (45%) address the most able in the Summary of key findings, while 33 (38%) do so in the section about what the school needs to do to improve further.

In 24 cases (28%) there were entries in both these sections, so 48 of the 87 reports (39 plus 33, less the 24 counted twice) mention the most able in at least one of them. That leaves 39 reports (45%) with no reference to the most able in either section.

In 2014, 34% of reports in the sample addressed the issue in both the main findings and recommendations, while 52% mentioned it in neither of these sections.

These modest percentage point changes are not strongly indicative of a strengthened commitment to this issue.

In the 2015 sample it was rather more likely for a reference to appear in the key findings for community schools (53%) and foundation schools (50%) than it was for converter academies (44%), sponsored academies (42%) or voluntary schools (29%).

Chart 8 shows the distribution of comments in these sections according to the overall inspection grade. In numerical terms, schools rated as requiring improvement overall are most likely to attract comments in both Key findings and Areas for improvement related to the most able.

.


Chart 8: Most able mentioned in key findings and areas for improvement by overall inspection grade (percentages)

.

But, when expressed as percentages of the total number of schools in the sample attracting each grade, it becomes apparent that the lower the grade, the more likely a school is to receive such a comment.

Of the 39 reports making reference in the key findings, 10 comments were positive, 28 were negative and one managed to be both positive and negative simultaneously:

‘While the most-able students achieve well, they are capable of even greater success, notably in mathematics.’ (Harewood College, Bournemouth)

.

Positive key findings

Five of the ten exclusively positive comments were directed at community schools.

The percentage of high attainers in the 2014 GCSE cohorts at the schools attracting positive comments varied from 13% to 52% and included three of the five schools with the highest percentages in the sample.

Interestingly, only two of the schools with positive comments received an overall outstanding grade, while three required improvement.

Examples of positive comments, which were often generic, include:

  • ‘The most able students achieve very well, and the proportion of GCSE A* and A grades is significantly above average across the curriculum.’ (Durham Johnston Comprehensive School, Durham)
  • ‘The most able students do well because they are given work that challenges them to achieve their potential’. (The Elton High School Specialist Arts College, Bury)
  • ‘Most able students make good progress in most lessons because of well-planned activities to extend their learning’. (Endon High School, Staffordshire)
  • ‘Teachers encourage the most able students to explore work in depth and to master skills at a high level’. (St Richard Reynolds Catholic High School, Richmond-upon-Thames).

Negative key findings

The distribution of the 28 negative comments in Key findings according to overall inspection grade was: Outstanding, none; Good, five (19%); Requires improvement, twelve (38%); Inadequate, eleven (58%).

This suggests a relatively strong correlation between the quality of provision for the most able and the overall quality of the school.

The proportion of high attainers in the 2014 GCSE cohorts of the schools attracting negative comments varied between 3% and 42%. All but three are below the national average for state-funded schools on this measure and half reported 20% or fewer high attainers.

This broadly supports the hypothesis that quality is less strong in schools where the proportion of high attainers is comparatively low.

Examples of typical negative comments:

  • ‘The most able students are not given work that is hard enough’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Too many students, particularly the most able, do not make the progress of which they are capable’ (New Line Learning Academy, Kent)
  • ‘Students, particularly the more able, make slower progress in some lessons where they are not sufficiently challenged. This can lead to some off task behaviour which is not always dealt with by staff’ (The Ferrers School, Northamptonshire)
  • ‘Teachers do not always make sufficient use of assessment information to plan work that fully stretches or challenges all groups of students, particularly the most able’ (Noel-Baker School, Derby).

The menu of shortcomings identified is limited, consisting of seven items: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information.

Of these, the most common comprise a familiar litany. They are (in descending order):

  • Insufficiently challenging work
  • Insufficient progress
  • Underachievement
  • Low expectations.

Inspectors often point out inconsistent practice, though in the worst instances these shortcomings are dominant or even school-wide.

.

No key findings

Chart 9 shows the distribution of reports with no comments about the most able in Key findings and Areas for improvement according to overall inspection grade. When expressed as percentages, these again show that schools rated as outstanding are most likely to escape such comments, while inadequate schools are most likely to be in the firing line.

.


Chart 9: Most able not mentioned in key findings and areas for improvement by inspection grade (percentages)

This pattern replicates the findings from 2014. Orders of magnitude are also broadly comparable. There is no substantive evidence of a major increase in emphasis from inspectors.

It seems particularly surprising that, in over half of schools requiring improvement and a third or more of inadequate schools, issues with educating the most able are still not significant enough to feature in these sections of inspection reports.

.

Areas for improvement

By definition, recommendations for improvement are always associated with identified shortcomings.

The correspondence between key findings and areas for improvement is inconsistent. In six cases there were Key findings relating to the most able but no specifically associated area for improvement. Conversely, nine reports identified areas for improvement that were not picked up in the key findings.

Areas for improvement are almost always formulaic and expressed as lists: the school should improve x through y and z.

When it comes to the most able, the area for improvement is almost invariably teaching quality, though sometimes this is indicated as the route to higher achievement while on other occasions teaching quality and raising achievement are perceived as parallel priorities.

Just one report in the sample mentioned the quality of leadership and management:

‘Ensure that leadership and management take the necessary steps to secure a significant rise in students’ achievement at the end of Year 11 through…ensuring that work set for the most able is always sufficiently challenging’ (New Line Learning Academy, Kent).

This is despite the fact that leadership was specifically mentioned as a focus in HMCI’s Annual Report.

The actions needed to bring about improvement reflect the issues mentioned in the analysis of key findings above. The most common involve applying assessment information to planning and teaching:

  • ‘Raise students’ achievement and the quality of teaching further by ensuring that:…all staff develop their use of class data to plan learning so that students, including the most able, meet their challenging targets’ (Oasis Academy Isle of Sheppey, Kent)
  • ‘Ensure the quality of teaching is always good or better, in order to raise attainment and increase rates of progress, especially in English and mathematics, by:…ensuring teachers use all the information available to them to plan lessons that challenge students, including the most able’ (Oasis Academy Lister Park, Bradford)
  • ‘Embed and sustain improvements in achievement overall and in English in particular so that teaching is consistently good and outstanding by: making best use of assessment information to set work that is appropriately challenging, including for the least and most able students’ (Pleckgate High School Mathematics and Computing College, Blackburn with Darwen)

Other typical actions involve setting more challenging tasks, raising the level of questioning, providing accurate feedback, improving lesson planning and maintaining consistently high expectations.

.

Coverage in the main body of reports

.

Leadership and management

Given the reference to this in HMCI’s Annual Report, one might have expected a new and significant emphasis within this section of the reports in the sample.

In fact, the most able were only mentioned in this section in 13 reports (15% of the total). Hardly any of these comments identified shortcomings. The only examples I could find were:

  • ‘The most-able students are not challenged sufficiently in all subjects to achieve the higher standards of which they are capable’ (Birkbeck School and Community Arts College, Lincolnshire)
  • ‘Action to improve the quality of teaching is not focused closely enough on the strengths and weaknesses of the school and, as a result, leaders have not done enough to secure good teaching of students and groups of students, including…the most able’ (Ashington High School Sports College, Northumberland)

Inspectors are much more likely to accentuate the positive:

  • ‘The school has been awarded the Challenge Award more than once. This is given for excellent education for a school’s most-able, gifted and talented students and for challenge across all abilities. Representatives from all departments attend meetings and come up with imaginative ways to deepen these students’ understanding.’ (Cheam High School, Sutton)
  • ‘Leaders and governors are committed to ensuring equality of opportunity for all students and are making effective use of student achievement data to target students who may need additional support or intervention. Leaders have identified the need to improve the achievement of…the most-able in some subjects and have put in place strategies to do so’ (Castle Hall academy Trust, Kirklees)
  • ‘Measures being taken to improve the achievement of the most able are effective. Tracking of progress is robust and two coordinators have been appointed to help raise achievement and aspirations. Students say improvements in teaching have been made, and the work of current students shows that their attainment and progress is on track to reach higher standards.’ (The Byrchall High School, Wigan).

Not one report mentioned the role of governors in securing effective provision for the most able. 

Given how often school leadership escapes censure for issues identified elsewhere in reports, this outcome could be interpreted as somewhat complacent. 

HMCI is quite correct to insist that provision for the most able is a whole school issue and, as such, a school’s senior leadership team should be held to account for such shortcomings.

Behaviour and safety

The impact of under-challenging work on pupils’ behaviour is hardly ever identified as a problem.

One example has been identified in the analysis of Key findings above. Only one other report mentions the most able in this section, and the comment is about the role of the school council rather than behaviour per se:

‘The academy council is a vibrant organisation and is one of many examples where students are encouraged to take an active role in the life of the academy. Sixth form students are trained to act as mentors to younger students. This was seen being effectively employed to…challenge the most able students in Year 9’ (St Thomas More High School, Southend)

A handful of reports make some reference under ‘Quality of teaching’, but one might reasonably conclude that neither bullying of the most able nor disruptive behaviour from bored high attainers is particularly widespread.

Quality of teaching

Statements about the most able are much more likely to appear in this section of reports. Altogether 59 of the sample (68%) made some reference.

Chart 10 shows the correlation between the incidence of comments and the sub-grade awarded by inspectors to this aspect of provision. It demonstrates that, while differences are relatively small, schools deemed outstanding are rather more likely to attract such comment.

But only one of the comments on outstanding provision is negative and that did not mention the most able specifically:

‘Also, in a small minority of lessons, activities do not always deepen students’ knowledge and understanding to achieve the very highest grades at GCSE and A level.’ (Central Foundation Boys’ School, Islington)

.


Chart 10: Incidence of comments under quality of teaching by grade awarded for quality of teaching

.

Comments are much more likely to be negative in schools where the quality of teaching is judged to be good (41%), requiring improvement (59%) and inadequate (58%).

Even so, a few schools in the lower two categories receive surprisingly positive endorsements:

  • ‘On the other hand, the most able students and the younger students in school consistently make good use of the feedback. They say they greatly value teachers’ advice….The teaching of the most able students is strong and often very strong. As a result, these students make good progress and, at times, achieve very well.’ (RI – The Elton High School Specialist Arts College, Bury)
  • ‘Teaching in mathematics is more variable, but in some classes, good and outstanding teaching is resulting in students’ rapid progress. This is most marked in the higher sets where the most able students are being stretched and challenged and are on track to reach the highest grades at GCSE…. In general, the teaching of the most able students….is good.’ (RI – New Charter Academy, Tameside)
  • ‘At its most effective, teaching is well organised to support the achievement of the most able, whose progress is better than other students. This is seen in some of the current English and science work.’ (I – Ely College, Cambridgeshire).

Negative comments on the quality of teaching supply a familiar list of shortcomings.

Some of the most perceptive are rather more specific. Examples include:

  • ‘While the best teaching allows all students to make progress, sometimes discussions that arise naturally in learning, particularly with more able students, are cut short. As a result, students do not have the best opportunity to explore ideas fully and guide their own progress.’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Teachers’ planning increasingly takes account of current information about students’ progress. However, some teachers assume that because the students are organised into ability sets, they do not need to match their teaching to individual and groups of students’ current progress. This has an inhibiting effect on the progress of the more able students in some groups.’ (Chulmleigh Community College, Devon)
  • ‘In too many lessons, particularly boys’ classes, teachers do not use questioning effectively to check students’ learning or promote their thinking. Teachers accept responses that are too short for them to assess students’ understanding. Neither do they adjust their teaching to revisit aspects not fully grasped or move swiftly to provide greater stretch and new learning for all, including the most able.’ (The Crest Academies, Brent)
  • ‘In some lessons, students, including the most able, are happy to sit and wait for the teacher to help them, rather than work things out for themselves’ (Willenhall E-ACT Academy, Walsall).

Were one compiling a list of what to do to impress inspectors, it would include the following items:

  • Plan lessons meticulously with the needs of the most able in mind
  • Use assessment information to inform planning of work for the most able 
  • Differentiate work (and homework) to match most able learners’ needs and starting points 
  • Deploy targeted questioning, as well as opportunities to develop deeper thinking and produce more detailed pieces of work 
  • Give the most able the flexibility to pursue complex tasks and do not force them to participate in unnecessary revision and reinforcement 
  • Do not use setting as an excuse for neglecting differentiation 
  • Ensure that work for the most able is suitably challenging 
  • Ensure that subject knowledge is sufficiently secure for this purpose 
  • Maintain the highest expectations of what the most able students can achieve 
  • Support the most able to achieve more highly but do not allow them to become over-reliant on support 
  • Deploy teaching assistants to support the most able 
  • Respond to restlessness and low level disruption from the most able when insufficiently challenged.

While many of the reports implicitly acknowledge that the most able learners will have different subject-specific strengths and weaknesses, the implications of this are barely discussed.

Moreover, while a few reports attempt a terminological distinction between ‘more able’ and ‘most able’, the vast majority seem to assume that, in terms of prior attainment, the most able are a homogeneous group, whereas – given Ofsted’s preferred approach – there is enormous variation.

Achievement of pupils 

This is the one area of reports where reference to the most able is now apparently compulsory – or almost compulsory.

Just one report in the sample has nothing to say about the achievement of the most able in this section: that on Ashby School in Leicestershire.

Some of the comments are relatively long and detailed, but others are far more cursory and the coverage varies considerably.

Taking as an example the subset of schools awarded an outstanding sub-grade for the achievement of pupils, we can illustrate the different types of response:

  • Generic: ‘The school’s most able students make rapid progress and attain excellent results. This provides them with an excellent foundation to continue to achieve well in their future studies.’ (Kelvin Hall School, Hull)
  • Generic, progress-focused: ‘The most-able students make rapid progress and the way they are taught helps them to probe topics in greater depth or to master skills at a high level.’ (St Richard Reynolds Catholic High School, Richmond-upon-Thames)
  • Achievement-focused, core subjects: ‘Higher attaining students achieve exceptionally well as a result of the support and challenge which they receive in class. The proportion of students achieving the higher A* to A grade was similar to national averages in English but significantly above in mathematics.’
  • Specific, achievement- and progress-focused: ‘Although the most able students make exceptional progress in the large majority of subjects, a few do not reach the very highest GCSE grades of which they are capable. In 2014, in English language, mathematics and science, a third of all students gained A and A* GCSE grades. Performance in the arts is a real strength. For example, almost two thirds of students in drama and almost half of all music students achieved A and A* grades. However, the proportions of A and A* grades were slightly below the national figures in English literature, geography and some of the subjects with smaller numbers of students.’ (Central Foundation Boys’ School, Islington)

If we look instead at the schools with a sub-grade of inadequate, the comments are typically more focused on progress, but limited progress is invariably described as ‘inadequate’, ‘requiring improvement’, ‘weak’, ‘not good’, ‘not fast enough’. It is never quantified.

On the relatively few occasions when achievement is discussed, the measure is typically GCSE A*/A grades, most often in the core subjects.

It is evident from cross-referencing the Achievement of pupils sub-grade against the percentage of high attainers in the 2014 GCSE cohort that there is a similar correlation to that with the overall inspection grade:

  • In schools judged outstanding on this measure, the high attainer population ranges from 22% to 47% (average 33%)
  • In schools judged good, the range is from 13% to 52% (average 32%)
  • In schools requiring improvement it is between 3% and 40% (average 23%)
  • In schools rated inadequate it varies from 10% to 32% (average 22%)

.

Sixth Form Provision 

Coverage of the most able in sections dedicated to the sixth form is also extremely variable. Relatively few reports deploy the term itself when referring to 16-19 year-old students.

Sometimes there is discussion of progression to higher education and sometimes not. Where this does exist there is little agreement on the appropriate measure of selectivity in higher education:

  • ‘Students are aspiring to study at the top universities in Britain. This is a realistic prospect and illustrates the work the school has done in raising their aspirations.’ (Welling School, Bexley)
  • ‘The academy carefully tracks the destination of leavers with most students proceeding to university and one third of students gaining entry to a Russell Group university’ (Ashcroft Technology Academy, Wandsworth)
  • ‘Provision for the most able students is good, and an increasing proportion of students are moving on to the highly regarded ‘Russell group’ or Oxbridge universities. A high proportion of last year’s students have taken up a place at university and almost all gained a place at their first choice’ (Ashby School, Leicestershire)
  • ‘Large numbers of sixth form students progress to well-regarded universities’ (St Bartholomew’s School, West Berkshire)
  • ‘Students receive good support in crafting applications to universities which most likely match their attainment; this includes students who aspire to Oxford or Cambridge’ (Anthony Gell School, Derbyshire).

Most able and disadvantaged

Given the commitment in the 2015 survey report to ‘continue to focus sharply on the progress made by students who are able and disadvantaged’, I made a point of reviewing the coverage of this issue across all sections of the sample reports.

Suffice it to say that only one report discussed provision for the most able disadvantaged students, in these terms:

‘Pupil premium funding is being used successfully to close the wide achievement gaps apparent at the previous inspection….This funding is also being effectively used to extend the range of experiences for those disadvantaged students who are most able. An example of this is their participation in a residential writing weekend.’ (St Hild’s C of E VA School, Hartlepool)

Take a bow, Lead Inspector Petts!

A handful of other reports made more general statements to the effect that disadvantaged students perform equivalently to their non-disadvantaged peers, most often with reference to the sixth form:

  • ‘The few disadvantaged students in the sixth form make the same progress as other students, although overall, they attain less well than others due to their lower starting points’ (Sir Thomas Wharton Community College, Doncaster)
  • ‘There is no difference between the rates of progress made by disadvantaged students and their peers’ (Sarum Academy, Wiltshire)
  • ‘In many cases the progress of disadvantaged students is outstripping that of others. Disadvantaged students in the current Year 11 are on course to do every bit as well as other students.’ (East Point Academy, Suffolk).

On two occasions, the point was missed entirely:

  • ‘The attainment of disadvantaged students in 2014 was lower than that of other students because of their lower starting points. In English, they were half a grade behind other students in the school and nationally. In mathematics, they were a grade behind other students in the school and almost a grade behind students nationally. The wider gap in mathematics is due to the high attainment of those students in the academy who are not from disadvantaged backgrounds.’ (Chulmleigh Community College, Devon)
  • ‘Disadvantaged students make good progress from their starting points in relation to other students nationally. These students attained approximately two-thirds of a GCSE grade less than non-disadvantaged students nationally in English and in mathematics. This gap is larger in school because of the exceptionally high standards attained by a large proportion of the most able students…’ (Durham Johnston Comprehensive School, Durham)

If Ofsted believes that inspectors are already focusing sharply on this issue then, on this evidence, it is sadly misinformed.

Key Findings and areas for improvement

.

Key findings: Guidance

  • Ofsted inspectors have no reliable definition of ‘most able’ and no guidance on the appropriateness of definitions adopted by the schools they visit. The approach taken in the 2015 survey report is different to that adopted in the initial 2013 survey and is now exclusively focused on prior attainment. It is also significantly different to the high attainer measure in the Secondary Performance Tables.
  • Using Ofsted’s approach, the national population of most able in Year 7 approaches 50% of all learners; in Year 11 it is some 40% of all learners. The latter is some eight percentage points lower than the cohort derived from the Performance Tables measure.
  • The downside of such a large cohort is that it masks the huge attainment differences within it, from a single L5C (and possibly an L3 in either maths or English) to a clutch of L6s. Inspectors might be encouraged to regard this as a homogeneous group.
  • The upside is that there should be a most able presence in every year group of every school. In some comprehensive schools, high attainers will be a substantial majority in every year group; in others there will be no more than a handful.
  • Ofsted has not released data showing the incidence of high attainers in each school according to its measure (or the Performance Tables measure for that matter). This does not feature in Ofsted’s Data Dashboard.
  • Guidance in the current School Inspection Handbook is not entirely helpful. There is not space in a Section 5 inspection report to respond to all the separate references (see the Annex for the full list). The terminology is confused (‘most able’, ‘more able’, ‘high attainers’). Too often the Handbook mentions several different groups alongside the most able, one of which is disadvantaged pupils. This perpetuates the false assumption that there are no most able disadvantaged learners. We do not yet know whether there will be wholesale revision when new Handbooks are introduced to reflect the Common Inspection Framework.
  • At least four pieces of subsidiary guidance have been issued to inspectors since October 2013. But there has been nothing to reflect the commitments in HMCI’s Annual Report (including a stronger focus on school leadership of this issue) or the March 2015 survey report. This material requires enhancement and consolidation.
  • The March 2015 survey report apparently commits to more intensive scrutiny of curricular and IAG provision in Section 5 inspections, as well as a ‘continued focus’ on able and disadvantaged students (see below). The evaluation toolkit announced subsequently would be helpful to inspectors as well as schools, but its structure and content have not yet been revealed.

Key findings: Survey

  • The sample for my survey is broadly representative of regions, school status and variations in NOR. In terms of overall inspection grades, 10% are outstanding, 31% good, 37% require improvement and 22% are inadequate. In terms of sub-grades, they are notably weaker on Quality of teaching and Achievement of pupils, the two sections that most typically feature material about the most able.
  • There is huge variation within the sample by percentage of high attainers (2014 GCSE population according to the Secondary Performance Tables measure). The range is from 3% to 52%. The average is 32%, very slightly under the 32.3% average for all state-funded schools. Comparing overall inspection grade with percentage of high attainers suggests a marked difference between those rated outstanding/good (average 32/33%) and those rated as requiring improvement/inadequate (average 23%).
  • 45% of the reports in the sample addressed the most able under Key findings; 38% did so under Areas for improvement and 28% made reference in both sections. However, 45% made no reference in either of these sections. In 2014, 34% mentioned the most able in both main findings and recommendations, while 52% mentioned it in neither. On this measure, inspectors’ focus on the most able has not increased substantively since last year.
  • Community and foundation schools were rather more likely to attract such comments than either converter or sponsored academies. Voluntary schools were least likely to attract them. The lower the overall inspection grade, the more likely a school is to receive such comments.
  • In Key findings, negative comments outnumbered positive comments by a ratio of 3:1. Schools with high percentages of high attainers were well represented amongst those receiving positive comments.
  • Unsurprisingly, schools rated inadequate overall were much more likely to attract negative comments. A correlation between overall quality and quality of provision for the most able was somewhat more apparent than in 2014. There was also some evidence to suggest a correlation between negative comments and a low proportion of high attainers.
  • On the other hand, over half of schools with an overall requiring improvement grade and a third with an overall inspection grade of inadequate did not attract comments about the most able under Key findings. This is not indicative of greater emphasis.
  • The menu of shortcomings is confined to seven principal faults: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information. In most cases practice is inconsistent but occasionally problems are school-wide.
  • Areas for improvement are almost always expressed in formulaic fashion. Those relating to the most able focus almost invariably on the Quality of teaching. The improvement most commonly urged is more thorough application of assessment information to planning and teaching.
  • Only 15% of reports mention the most able under Leadership and management and, of those, only two are negative comments. The role of governors was not raised once. Too often the school leadership escapes censure for shortcomings identified elsewhere in the report. This is not consistent with indications of new-found emphasis in this territory.
  • The most able are hardly ever mentioned in the Behaviour and safety section of reports. It would seem that bullying is invisible and low level disruption by bored high attainers rare.
  • Conversely, 68% of reports referenced the most able under Quality of teaching. Although negative comments are much more likely in schools judged as inadequate or requiring improvement in this area, a few appear to be succeeding with their most able against the odds. The main text identifies a list of twelve good practice points gleaned from the sample.
  • Only one report fails to mention the most able under Achievement of pupils, but the quality and coverage varies enormously. Some comments are entirely generic; some focus on achievement, others on progress and some on both. Few venture beyond the core subjects. There is very little quantification, especially of insufficient progress (and especially compared with equivalent discussion of progress by disadvantaged learners).
  • Relatively few reports deploy the term ‘most able’ when discussing sixth form provision. Progression to higher education is sometimes mentioned and sometimes not. There is no consensus on how to refer to selective higher education.
  • Only one report in this sample mentions disadvantaged most able students. Two reports betray the tendency to assume that these two groups are mutually exclusive but, worse still, the sin of omission is almost universal. This provides no support whatsoever for Ofsted’s claim that inspectors already address the issue.

Areas for improvement

Ofsted has made only limited improvements since the previous inspection in May 2014 and its more recent commitments are not yet reflected in Section 5 inspection practice.

In order to pass muster it should:

  • Appoint a lead inspector for the most able who will assume responsibility across Ofsted, including communication and consultation with third parties.
  • Consolidate and clarify material about the most able in the new Inspection Handbooks and supporting guidance for inspectors.
  • Prepare and publish a high quality evaluation toolkit, to support schools and inspectors alike. This should address definitional and terminological issues as well as supplying benchmarking data for achievement and progress. It might also set out the core principles underpinning effective practice.
  • Include within the toolkit a self-assessment and evaluation framework based on the quality standards. This should model Ofsted’s understanding of whole school provision for the most able that aligns with outstanding, good and requiring improvement grades, so that schools can understand the progression between these points.
  • Incorporate data about the incidence of the most able and their performance in the Data Dashboard.
  • Extend all elements of this work programme to the primary and post-16 sectors.
  • Undertake this work programme in consultation with external practitioners and experts in the field, completing it as soon as possible and by December 2015 at the latest.

 .

Verdict: (Still) Requires Improvement.

GP

April 2015

.

.

Annex: Coverage in the School Inspection Handbook (December 2014)

Main Text

Inspectors should:

  • Gather evidence about how well the most able are ‘learning, gaining knowledge and understanding, and making progress’ (para 40)
  • Take account of the most able when considering performance data (para 59)
  • Take advantage of opportunities to gather evidence from them (para 68)
  • Consider the effectiveness of pupil grouping, for example ‘where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched…’ (para 153)
  • Explore ‘how well the school works with families to support them in overcoming the cultural obstacles that often stand in the way of the most able pupils from deprived backgrounds attending university’ (para 154)
  • Consider whether ‘teachers set homework in line with the school’s policy and that challenges all pupils, especially the most able’ (para 180)
  • Consider ‘whether work in Key Stage 3 is demanding enough, especially for the most able when too often undemanding work is repeated unnecessarily’ (para 180)
  • Consider whether ‘teaching helps to develop a culture and ethos of scholastic excellence, where the highest achievement in academic work is recognised, especially in supporting the achievement of the most able’ (para 180)
  • When judging achievement, have regard for ‘the progress that the most able are making towards attaining the highest grades’ and ‘pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should’. They must ‘summarise the achievements of the most able pupils in a separate paragraph of the inspection report’ (paras 185-7)
  • Consider ‘how the school uses assessment information to identify pupils who…need additional support to reach their full potential, including the most able.’ (para 193)
  • Consider how well ‘assessment, including test results, targets, performance descriptors or expected standards are used to ensure that…more able pupils do work that deepens their knowledge and understanding’ and ‘pupils’ strengths and misconceptions are identified and acted on by teachers during lessons and more widely to… deepen the knowledge and understanding of the most able’ (para 194)
  • Take account of ‘the learning and progress across year groups of different groups of pupils currently on the roll of the school, including…the most able’. Evidence gathered should include ‘the school’s own records of pupils’ progress, including… the most able pupils such as those who joined secondary schools having attained highly in Key Stage 2’ (para 195)
  • Take account of ‘pupils’ progress in the last three years, where such data exist and are applicable, including that of…the most able’ (para 195)
  • ‘When inspecting and reporting on students’ achievement in the sixth form, inspectors must take into account all other guidance on judging the achievement, behaviour and development of students, including specific groups such as…the most able ‘ (para 210)
  • Talk to sixth form students to discover ‘how well individual study programmes meet their expectations, needs and future plans, including for…the most able’ (para 212)

However, the terminology is not always consistent. In assessing the overall effectiveness of a school, inspectors must judge its response to ‘the achievement of…the highest and lowest attainers’ (para 129).

Grade descriptors

Outstanding

  • Overall effectiveness:

‘The school’s practice consistently reflects the highest expectations of staff and the highest aspirations for pupils, including the most able…’

  • Quality of teaching:

‘Much teaching over time in all key stages and most subjects is outstanding and never less than consistently good. As a result, almost all pupils currently on roll in the school, including…the most able, are making sustained progress that leads to outstanding achievement.’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is consistently good or better.’

  • Effectiveness of sixth form provision:

‘All groups of pupils make outstanding progress, including…the most able’

Good

  • Overall effectiveness:

‘The school takes effective action to enable most pupils, including the most able…’

  • Quality of teaching:

‘Teaching over time in most subjects, including English and mathematics, is consistently good. As a result, most pupils and groups of pupils on roll in the school, including…the most able, make good progress and achieve well over time.’

‘Effective teaching strategies, including setting appropriate homework and well-targeted support and intervention, are matched closely to most pupils’ needs, including those most and least able, so that pupils learn well in lessons’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is generally good.’

  • Effectiveness of sixth form provision:

‘As a result of teaching that is consistently good over time, students make good progress, including…the most able’

Inadequate

  • Quality of teaching:

‘As a result of weak teaching over time, pupils or particular groups of pupils, including…the most able, are making inadequate progress.’

  • Achievement of pupils:

‘Groups of pupils, particularly disabled pupils and/or those who have special educational needs and/or disadvantaged pupils and/or the most able, are underachieving’

  • Effectiveness of sixth form provision:

‘Students or specific groups such as… the most able do not achieve as well as they can. Low attainment of any group shows little sign of rising.’