Will Maths Hubs Work?

.

This post takes a closer look at Maths Hubs, exploring the nature of the model, their early history and performance to date.

It reflects on their potential contribution to the education of the ‘mathematically most able’ and considers whether a similar model might support ‘most able education’.

.

.

Background

.

Origins of this post

The post was prompted by the potential connection between two separate stimuli:

  • The commitment in the Conservative manifesto that:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables….We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

  • My own recent post on Missing Talent (June 2015) which discussed the Sutton Trust/education datalab recommendation that:

‘Schools where highly able pupils currently underperform should be supported through the designation of another local exemplar school

Exemplar schools…should be invited to consider whether they are able to deliver a programme of extra-curricular support to raise horizons and aspirations for children living in the wider area.’

The second prompted a brief Twitter discussion about parallels with an earlier initiative, in the course of which Maths Hubs were mentioned.

.

.

Links to previous posts

I touched on Maths Hubs once before, in the final section of 16-19 Maths Free Schools Revisited (October 2014) which dealt with ‘prospects for the national maths talent pipeline’.

This reviewed the panoply of bodies involved in maths education at national level and the potential advantages of investing in a network with genuinely national reach, rather than in a handful of new institutions with small localised intakes and limited capacity for outreach:

‘Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

A network-driven approach to talent development might just work…but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while a further £60m may have to be surrendered back to the Treasury.’

Two further posts are less directly relevant but ought to be mentioned in passing:

The second in particular raises questions about the suitability of NCETM’s version of mastery for our high attaining learners, arguing that essential groundwork has been neglected and that the present approach to ‘stretch and challenge’ is unnecessarily narrow and restrictive.

.

Structure of this post

The remainder of this post is divided into three principal sections:

  • Material about the introduction of Maths Hubs and a detailed exploration of the model. This takes up approximately half of the post.
  • A review of the Hubs’ work programme and the progress they have made during their first year of operation.
  • Proposals for Maths Hubs to take the lead in improving the education of mathematically able learners and for the potential introduction of ‘most able hubs’ to support high attainers more generally. I stop short of potential reform of the entire ‘national maths talent pipeline’ since that is beyond the scope of this post.

Since readers may not be equally interested in all these sections I have supplied the customary page jumps from each of the bullet points above and to the Conclusion, for those who prefer to cut to the chase.

.

The introduction of the Maths Hubs model

.

Initial vision

Maths Hubs were first announced in a DfE press release published in December 2013.

The opening paragraph describes the core purpose as improving teacher quality:

‘Education Minister Elizabeth Truss today announced £11 million for new maths hubs to drive up the quality of maths teachers – as international test results showed England’s performance had stagnated.’

The press release explains the Coalition Government’s plans to introduce a national network of some 30 ‘mathematics education strategic hubs’ (MESH) each led by a teaching school.

A variety of local strategic partners will be drawn into each hub, including teaching school alliances, other ‘school and college groupings’, university faculties, subject associations, ‘appropriate’ local employers and local representatives of national maths initiatives.

There is an expectation that all phases of education will be engaged, particularly ‘early years to post-16’.

National co-ordination will fall to the National Centre for Excellence in the Teaching of Mathematics (NCETM), currently run under contract to DfE by a consortium comprising Tribal Education, the UCL Institute of Education, Mathematics in Education and Industry (MEI) and Myscience.

(A 2014 PQ reply gives the value of this contract as £6.827m, although this probably reflects a 3-year award made in 2012. It must have been extended by a further year, but will almost certainly have to be retendered for the next spending review period, beginning in April 2016.

The £11m budget for Maths Hubs is separate and additional. It is not clear whether part of this sum has also been awarded to NCETM through a single tender. There is more information about funding mid-way through this post.)

The press release describes the Hubs as both a national and a school-led model:

‘The network will bring together the emerging national leaders of mathematics education and aim to make school-led national subject improvement a reality.’

These emerging national leaders are assumed to be located in the lead schools and not elsewhere in the system, at NCETM or in other national organisations.

The policy design is broadly consistent with my personal preference for a ‘managed market’ approach, midway between a ‘bottom-up’ market-driven solution and a centralised and prescriptive ‘top-down’ model.

But it embodies a fundamental tension, arising from the need to reconcile the Government’s national priorities with a parallel local agenda.

If the model is to work smoothly, one set of priorities will almost certainly have to take precedence over the other (and it won’t be the local agenda).

The model is also expected to:

‘…ensure that all the support provided…is grounded in evidence about what works, both in terms of mathematics teaching and the development of teachers of mathematics.’

Each Hub will be expected to provide support for maths education across all other schools in the area, taking in the full spectrum of provision:

  • ‘recruitment of maths specialists into teaching
  • initial training of maths teachers and converting existing teachers into maths [sic]
  • co-ordinating and delivering a wide range of maths continuing professional development (CPD) and school-to-school support
  • ensuring maths leadership is developed, eg running a programme for aspiring heads of maths departments
  • helping maths enrichment programmes to reach a large number of pupils from primary school onwards’.

This is a particularly tall order, both in terms of the breadth of Hubs’ responsibilities and the sheer number of institutions which they are expected to support. It is over-ambitious given the budget allocated for the purpose and, as we shall see, was scaled back in later material.

The press release says that NCETM has already tested the model with five pathfinders.

It adds:

‘The main programme will be robustly evaluated, and if it proves successful in raising the standards of mathematics teaching it may be continued in 2016 to 2017, contingent on future spending review outcomes.’

What constitutes ‘the main programme’ is unclear, though it presumably includes the Hubs’ contribution to national projects, if not their local priorities.

Note that continuation from 2016 onwards is conditional on the outcomes of this evaluation, specifically a directly attributable and measurable improvement in maths teaching standards.

I have been unable to trace a contract for the evaluation, which would suggest that one has not been commissioned. This is rather a serious oversight.

We do not know how NCETM is monitoring the performance of the Hubs, nor do we know what evidence will inform a decision about whether to continue with the programme as a whole.

We have only the most basic details of national programmes in AY2015/16 and no information at all about the Hubs’ longer term prospects.

I asked the Maths Hubs Twitter feed about evaluation and was eventually referred to NCETM’s Comms Director.

I have not made contact because:

  • It is a point of principle that these posts rely exclusively on material already available online and so in the public domain. (This reflects a personal commitment to transparency in educational policy.)
  • The Comms Director wouldn’t have to be involved unless NCETM felt that the information was sensitive and had to be ‘managed’ in some way – and that tells me all I need to know.
  • I am not disposed to pursue NCETM for clarification since they have shown zero interest in engaging with me over previous posts, even though I have expressly invited their views.

.

Selection of the Hubs

Three months later, in March 2014, further details were published as part of the process of selecting the Hubs.

The document has two stabs at describing the aims of the project. The first emphasises local delivery:

‘The aim is to enable every school and college in England, from early years to the post-16 sector, to access locally-tailored and quality support in all areas of maths teaching and learning.’

This continues to imply full national reach, although one might argue that ‘enabling access’ is achieved by providing a Hub within reasonable distance of each institution and does not demand the active engagement of every school and college.

The second strives to balance national priorities and local delivery:

‘The aim of the national network of Maths Hubs will be to ensure that all schools have access to excellent maths support that is relevant to their specific needs and that is designed and managed locally. They will also be responsible for the coordinated implementation of national projects to stimulate improvement and innovation in maths education.’

Note that these national priorities have now become associated with innovation as well as improvement. This is ‘top-down’ rather than ‘school-led’ innovation – there is no specific push for innovative local projects.

At this stage the Hubs’ initial (national) priorities are given as:

  • Leading the Shanghai Teacher Exchange Programme
  • Supporting implementation of the new maths national curriculum from September 2014 and
  • Supporting introduction of new maths GCSEs and Core Maths qualifications in 2015.

The guidance specifies that:

‘Each Maths Hub will operate at a sub-regional or city regional level. The hubs will work with any group of schools or colleges in the area that request support, or who are referred to the hub for support.’

So responsibility for seeking assistance is placed on other schools and colleges and on third parties (perhaps Ofsted or Regional Schools Commissioners?) making referrals – Hubs will not be expected to reach out proactively to every institution in their catchment.

The competition is no longer confined to teaching schools. Any school that meets the initial eligibility criteria may submit an expression of interest. But the text is clear that only schools need apply – colleges are seemingly ineligible.

Moreover, schools must be state-funded and rated Outstanding by Ofsted for Overall Effectiveness, Pupil Achievement, Quality of Teaching and Leadership and Management.

Teaching schools are not expected to submit Ofsted inspection evidence – their designation is sufficient.

The guidance says:

‘We may choose to prioritise expression of interest applications based on school performance, geographical spread and innovative practice in maths education.’

NCETM reported subsequently that over 270 expressions of interest were received and about 80 schools were invited to submit full proposals.

The evidence used to select between these is set out in the guidance. There are four main components:

  • Attainment and progress data (primary or secondary and post-16 where relevant) including attainment data (but not progress data) for FSM pupils (as opposed to ‘ever 6 FSM’).
  • Support for improvement and professional development
  • Leadership quality and commitment
  • Record and capacity for partnership and collaboration

The full text is reproduced below.

.

Application criteria (captures 1 to 4).

It is instructive to compare the original version with the assessment criteria set out for the limited Autumn 2015 competition (see below).

In the updated version applicants can be either colleges or schools. Applicants will be invited to presentation days during which their commitment to mastery will be tested:

‘Applicants will be asked to set out…How they will support the development of mastery approaches to teaching mathematics, learning particularly from practice in Shanghai and Singapore.’

The Maths Hub model may be locally-driven but only institutions that support the preferred approach need apply.

The criteria cover broadly the same areas but they have been beefed up significantly.

The original version indicated that full proposals would require evidence of ‘school business expertise’ and ‘informed innovation in maths education’, but these expectations are now spelled out in the criteria.

Applicants must:

‘Provide evidence of a strong track record of taking accountability for funding and contracting other schools/organisations to deliver projects, including value for money, appropriate use of public funding, and impact.’

They must also:

‘Provide two or three examples of how you have led evidence-informed innovation in maths teaching. Include details of evaluation outcomes.

Provide information about the key strategies you would expect the hub to employ to support effective innovation.

Provide evidence of how you propose to test and implement the teaching maths for mastery approach within the hub. Show how effective approaches will be embedded across all school phases.’

Note that this innovative capacity is linked explicitly with the roll-out of mastery, a national priority.

The new guide explains that action plans prepared by the successful applicants will be ‘agreed by the NCETM and submitted to the DfE for approval’. This two-stage process might suggest that NCETM’s decision-making is not fully trusted. Alternatively, it might have something to do with the funding flows.

No further information was released about issues arising during the original selection process. It seems probable that some parts of the country submitted several strong bids while others generated relatively few or none at all.

It will have been necessary to balance the comparative strength of bids against their geographical distribution, and probably to ‘adjust’ the territories of Hubs where two or more particularly strong bids were received from schools in relatively close proximity.

It is not clear whether the NCETM’s five pathfinders were automatically included.

Successful bidders were confirmed in early June 2014, so the competition took approximately three months to complete.

One contemporary TSA source says that Hubs were ‘introduced at a frantic pace’. A 2-day introductory conference took place in Manchester on 18-19 June, prior to the formal launch in London in July.

Hubs had to submit their action plans for approval by the end of the summer term and to establish links with key partners in readiness to become operational ‘from the autumn term 2014’. (The TSA source says ‘in September’).

.

The Hubs are announced

A further DfE press release issued on 1 July 2014 identified 32 Hubs. Two more were added during the autumn term, bringing the total to 34, although the FAQs on the Maths Hubs website still say that there were only 32 ‘in the first wave’.

This implies that a second ‘wave’ is (or was) anticipated.

An earlier NCETM presentation indicated that 35 hubs were planned but it took a full year for the final vacancy to be advertised.

As noted above, in July 2015, an application form and guide were issued ‘for schools and colleges that want to lead a maths hub in south-east London and Cumbria or north Lancashire.’

The guide explains:

‘There are currently 34 Maths Hubs across England with funding available for a 35th Maths Hub in the North West of England. There is a geographical gap in Cumbria and North Lancashire where previously we were unsuccessful in identifying a suitable school or college to lead a Maths Hub in this area. In addition, after establishing the Maths Hub in first year, the lead school for the London South-East Maths Hub has decided to step down from its role.’

As far as I can establish this is the first time that the original failure to recruit the final Hub in the North-West has been mentioned publicly.

No reason is given for the decision by another lead school to drop out. The school in question is Woolwich Polytechnic School.

The two new Hubs are expected to be operational by November 2015. Applications will be judged by an unidentified panel.

Had the first tranche of Hubs proved extremely successful, one assumes that the second wave would have been introduced in readiness for academic year 2015/16, but perhaps it is necessary to await the outcome of the forthcoming spending review, enabling the second wave to be introduced from September 2016.

The embedded spreadsheet below gives details of all 34 Hubs currently operating.

.

.

Most lead institutions are schools, the majority of them secondary academies. A couple of grammar schools are involved as well as several church schools. Catholic institutions are particularly well represented.

Two of the London Hubs are led by singleton primary schools and a third by two primary schools working together. Elsewhere one Hub is based in a 14-19 tertiary college and another is led by a 16-19 free school.

Some are hosted by various forms of school partnership. These include notable multi-academy trusts including the Harris Federation, Outwood Grange Academies Trust and Cabot Learning Federation.

The difference in capacity between a single primary school and a large MAT is enormous, but the expectations of each are identical, as are the resources made available to implement the work programme. One would expect there to be some correlation between capacity and quality with smaller institutions struggling to match their larger peers.

No doubt the MATs take care to ensure that all their schools are direct beneficiaries of their Hubs – and the initiative gives them an opportunity to exert influence beyond their own members, potentially even to scout possible additions to the fold.

Fewer than half of the lead schools satisfy the initial eligibility requirements for ‘outstanding’ inspection reports (and sub-grades). In most cases this is because they are academies and have not yet been inspected in that guise.

One lead school – Bishop Challoner Catholic College – received ‘Good’ ratings from its most recent inspection in 2012. Another – Sir Isaac Newton Sixth Form – has been rated ‘Good’ since becoming a lead school.

We do not know why these institutions were included in the original shortlist but, perhaps fortunately, there was no public backlash from better qualified competitors upset at being overlooked.

This map (taken from a presentation available online) shows the geographical distribution of the original 32 Hubs. It is a more accurate representation than the regional map on the Maths Hub website.

Even with the addition of the two latecomers in November 2014 – one in Kent/Medway, the other in Leicestershire – it is evident that some parts of the country are much better served than others.

There is an obvious gap along the East Coast, stretching from the Wash up to Teesside, and another in the far North-West that the new competition is belatedly intended to fill. The huge South-West area is also relatively poorly served.

.

Maths Hubs locations map. 

If the Hubs were evenly distributed to reflect the incidence of schools and colleges nationally, each would serve a constituency of about 100 state-funded secondary schools and 500 state-funded primary schools, so 600 primary and secondary schools in total, not to mention 10 or so post-16 institutions.

Although there is little evidence on which to base a judgement, it seems unlikely that any of the Hubs will have achieved anything approaching this kind of reach within their first year of operation. One wonders whether it is feasible even in the longer term.

But the relatively uneven geographical distribution of the Hubs suggests that the size of their constituencies will vary.

Since schools and colleges are expected to approach their Hubs – and are free to align with any Hub – the level of demand will also vary.

It would be helpful to see some basic statistics comparing the size and reach of different Hubs, setting out how many institutions they have already engaged actively in their work programmes and what proportion are not yet engaged.

It seems likely that several more hubs will be needed to achieve truly national reach. It might be more feasible with a ratio of 300 schools per hub, but that would require twice as many hubs. The limited supply of high quality candidates may act as an additional brake on expansion, on top of the availability of funding.
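To put these numbers in context, here is a back-of-envelope sketch (in Python, using only the approximate figures quoted above, not official school counts) of what full national reach would entail:

```python
# Rough check of the hub-to-school ratios discussed above.
TOTAL_HUBS = 34
SECONDARY_PER_HUB = 100   # approx. state-funded secondary schools per hub
PRIMARY_PER_HUB = 500     # approx. state-funded primary schools per hub

total_schools = TOTAL_HUBS * (SECONDARY_PER_HUB + PRIMARY_PER_HUB)
print(f"Implied national constituency: {total_schools:,} schools")  # 20,400

# A more manageable ratio of 300 schools per hub would require:
print(f"Hubs needed at 1:300 -> {total_schools // 300}")  # 68
```

On these figures, halving each Hub’s constituency means roughly doubling the number of Hubs, hence the doubts above about funding and the supply of high quality candidates.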

.

Hub structure

A presentation given on 27 June 2014 by John Westwell – NCETM’s ‘Director of Strategy Maths Hubs’ – explains Hub structure through this diagram

.

NCETM Maths Hubs model diagram.

There is a distinction – though perhaps not very clearly expressed – between the roles of:

  • Strategic partners supporting the lead school with strategic leadership and 
  • Operational partners providing ‘further local leadership and specialist expertise to support [the] whole area’.

It seems that the former are directly involved in planning and evaluating the work programme while the latter are restricted to supporting delivery.

The spreadsheet shows that one of the Hubs – Salop and Herefordshire – fails to mention any strategic partners while another – Jurassic – refers to most of its partners in general terms (eg ‘primary schools, secondary schools’).

The remainder identify between four and 16 strategic partners each. Great North and Bucks, Berks and Oxon are at the lower end of the spectrum. Archimedes NE and Matrix Essex and Herts are at the upper end.

One assumes that it can be a disadvantage either to have too few or too many strategic partners, the former generating too little capacity; the latter too many cooks.

All but five Hubs have at least one higher education partner but of course there is no information about the level and intensity of their involvement, which is likely to vary considerably.

Eighteen mention the Further Mathematics Support Programme (FMSP), but only five include the Core Maths Support Programme (CMSP). Six list MEI as a strategic partner and, curiously, three nominate NCETM. It is unclear whether these enjoy a different relationship with the national co-ordinating body as a consequence.

To date, only the London Central and West Hub is allied with Mathematics Mastery, the Ark-sponsored programme.

However, NCETM says:

‘…a growing number of schools around the country are following teaching programmes from Mathematics Mastery, an organisation (separate from the NCETM) whose work, as the name suggests, is wholly devoted to this style of learning and teaching. Mathematics Mastery is, in some geographical areas, developing partnership working arrangements with the Maths Hubs programme.’

Mathematics Mastery also describes itself as ‘a national partner of Maths Hubs’.

.

Work Groups

Hubs plan on the basis of a standard unit of delivery described as a ‘work group’.

Each work group is characterised by:

  • a clear rationale for its existence and activity
  • well defined intended outcomes
  • local leadership supported by expert partners
  • a mixture of different activities over time
  • value for money and
  • systematic evidence collection.

The process is supported by something called the NCETM ‘Work Group Quality Framework’ which I have been unable to trace. This should also be published.

The most recent description of the Hubs’ role is provided by the Maths Hubs website, which did not appear until November 2014.

The description of ‘What Maths Hubs Are Doing’ reinforces the distinction between:

  • ‘National Collaborative Projects, where all hubs work in a common way to address a programme priority area and
  • Local projects, where hubs work independently on locally tailored projects to address the programme priorities.’

The earlier material includes a third variant:

  • Local priorities funded by other means

But these are not mentioned on the website and it is not clear whether they count as part of the Hubs’ official activity programme.

The spreadsheet shows that the number of work groups operated by each Hub varies considerably.

Four of them – North West One, White Rose, South Yorkshire and London South East – fail to identify any work groups at all.

In the case of White Rose there are links to courses and a conference, but the others include only a generic description of their work programme.

Two further Hubs – Enigma and Cambridge – refer readers to their websites, neither of which contains substantive detail about the Work Groups they have established (though Enigma lists a range of maths CPD opportunities and courses).

Otherwise the number of work groups varies between two (East Midlands South) and 11 (Surrey Plus). Fifteen of the Hubs have six or fewer work groups while nine have eight or more.

This suggests that some Hubs are far more productive and efficient than others, although the number of work groups is not always a reliable indicator, since some Hubs appear to categorise one-off events as work groups, while others reserve the term for longer-term projects.

Maybe the Quality Framework needs attention, or perhaps some Hubs are not following it properly.

.

The network defined

To coincide with the launch NCETM published its own information page on Maths Hubs, now available only via archive.

This describes in more detail how the Hubs will be expected to function as a network:

‘…the Maths Hubs will also work together in a national network co-ordinated by the NCETM. The network will ensure that effective practice from within particular hubs is shared widely. It will also provide a setting for Maths Hubs and the NCETM to collaboratively develop new forms of support as needed.

The national network will also come together, once a term, in a regular Maths Hubs Forum, where there will be opportunity to evaluate progress, plan for the future, and to engage with other national voices in maths education, such as the Joint Mathematical Council, the Advisory Committee on Mathematics Education (ACME), the DfE, and Ofsted.’

This arrangement is shown in the diagram below:

.

NCETM national network diagram.

Whether this is genuinely ‘school-led system-wide improvement’ is open to question, relying as it does on central co-ordination and a funding stream provided by central government. It is more accurately a hybrid model that aims to pursue national and local priorities simultaneously.

Essentially Hubs have a tripartite responsibility:

  • To develop and co-ordinate practice within their own Hub.
  • To collaborate effectively with other Hubs.
  • Collectively to contribute to the national leadership of maths education.

The sheer complexity of this role – and the level of expectation placed on the Hubs – should not be under-estimated.

The archived NCETM page identifies three core tasks for the Hubs as they operate locally:

  • ‘Identify needs and agree priorities for support in their area. This could involve pro-active surveying of schools; responding to requests and referrals; and considering the implications of national evidence.
  • Co-ordinate a range of high quality specialist mathematics support to address the needs. This could include communicating existing support and extending its reach; commissioning external organisations to provide bespoke support; developing and enabling new forms of support and collaboration.
  • Critically evaluate the quality and impact of the support provided. This could include gathering immediate, medium-term and long-term feedback from participants engaging with support; and more detailed evaluative research used to test innovations.’

We have no information about the extent and quality of cross-fertilisation between Hubs. This seems to depend mainly on the termly attendance of the leads at the Forum meetings, supported through social media interaction via Twitter. There is also some evidence of regional collaboration, though this seems much better developed in some regions than others.

The July 2015 newsletter on the Maths Hub Website says:

‘An added feature of the second year of the Maths Hubs programme will be more collaboration between Maths Hubs, typically bringing a small group of hubs together to pool experience, maybe in the development of a new project, or in the wider implementation of something that’s already worked well in a single hub.’

This may suggest that the collaborative dimension has been rather underplayed during the first year of operation. If it is to be expanded it may well demand additional teacher time and funding.

In the Westwell presentation the model is described as a ‘fully meshed network’ (as opposed to a hub and spoke model) in which ‘all the nodes are hubs’.

Unusually – and in contrast to the DfE press releases – there is explicit recognition that the Hubs’ core purpose is to improve pupil outcomes:

‘Resolute focus on pupils’ maths outcomes:

  • improved levels of achievement
  • increased levels of participation
  • improved attitudes to learning
  • closing the gaps between groups’

They also support school/college improvement:

‘Determined support for all schools/colleges to improve:

  • the teaching of mathematics
  • the leadership of mathematics
  • the school’s mathematics curriculum’

Any evaluation would need to assess the impact of each Hub against each of these seven measures. Once again, the level of expectation is self-evident.

. 

Termly Forums and Hub leads

Very little information is made available about the proceedings of the termly Maths Hub Forum, where the 34 Hub leads convene with national partners.

The Maths Hubs website says:

‘At the national level, the Maths Hubs programme, led by the NCETM, is developing partnership working arrangements with organisations that can support across the Maths Hubs network. At the moment, these include:

Other partnership arrangements will be developed in due course.’

There is no further information about these national partnership agreements, in particular the benefits accruing to each partner as a consequence.

We know that one Forum took place in October 2014, another in February 2015. We do not know the full list of national partners on the invitation list.

There should be another Forum before the end of summer term 2015, unless the London Maths Hub Conference was intended to serve as a replacement.

The guide to the competition for two new Hubs mentions that the Autumn 2015 Forum will take place in York on 4/5 November.

The July Bespoke newsletter says:

‘…the 34 Maths Hub Leads, who meet termly, will continue to pool their thoughts and experiences, developing a growing and influential voice for mathematics education at a national level.’ 

It is hard to understand how the Forum can become ‘an influential voice’ without a significantly higher profile and much greater transparency over proceedings.

The Maths Hubs website should have a discrete section for the termly forums which contains all key documents and presentations.

In March 2015, NCETM’s Westwell published a post on the NCTL Blog claiming early signs of success for the Hubs:

‘Even though we are less than 2 terms into embedding a new, collaborative way of working, we are seeing encouraging signs that leadership in mathematics education can be shared and spread within geographical areas.’

He continues:

‘Our vision is of a national, collective group of leaders exerting new, subject-specific influence across school phases and across geographical boundaries.

The essential professional characteristics of this group are that they know, from first-hand experience:

  • how maths is best taught, and learnt
  • how good maths teachers are nurtured
  • how high-quality ongoing professional development can help good teachers become excellent ones

They have shown the capacity to lead others in all of these areas.’

And he adds:

‘The maths hub leads also come together in a regular national forum, which allows them to exchange practice but also provides a platform for them to enter into dialogue with policy makers and key national bodies. Over time, we expect that maths hub leads will come to be recognised nationally as leaders of mathematics education.’

This highlights the critical importance of the Maths Hub leads to the success of the model. One assumes that the post-holders are typically serving maths teachers who undertake this role alongside their classroom and middle management responsibilities.

It seems highly likely that most Hub leads will not remain in post for more than two or three years. All will be developing highly transferable skills. Many will rightly see the role as a stepping stone to senior leadership roles.

Unless they can offer strong incentives to Hub leads to remain in post, NCETM will find turnover a persistent problem.

.

Funding

There is no information about funding on the Maths Hubs Website and details are extremely hard to find, apart from the total budget of £11m, which covers the cost of Hubs up to the end of FY2015-16.

Each Hub receives core operational funding as well as ‘funding on a project basis for local and national initiatives’.

I found an example of an action plan online. The notes provide some details of the annual budget for last financial year:

‘For the financial year 2014/15, each hub will receive £36,000 to cover the structural costs of the hub including the cost of: the Maths Lead time (expected minimum 1 day/week) and Hub Administrator time (expected minimum 1.5 days/week); the time provided by the Senior Lead Support and the strategic leadership group; identifying and developing operational partner capacity; engaging schools/colleges and identifying their support needs. It is possible to transfer some of the £36,000 to support hub initiated activities.

For the financial year 2014/15, Maths Hubs will receive £40,000 to support hub-initiated activity. As explained at the forum we are using the term “Work Groups” to cover all hub-initiated activity…The cost of the exchange element with the Shanghai teachers will be paid from central national project funds and is outside of the £40,000 budget.’

Another source (a presentation given at the launch of the Norfolk and Suffolk Hub) suggests that in 2014-15 Hubs also received a further £20,000 for national projects.

Hence the maximum budget per Hub in FY2014/15 was £96,000. Assuming all 34 received that sum the total cost was £3.264m (34 x £96K).

We do not know how much more was set aside for central costs, although DfE’s Supplementary Estimates for 2014-15 hint that the total budget might have been £3.7m, which would suggest a balance of £0.436m was spent on central administration.

The NCETM website presently lists a Director and no fewer than six Assistant Directors responsible for Maths Hubs – roughly one director for every five hubs. On the face of it, this does not fit the image of a school-led network. Indeed it suggests that the Hubs require intensive central support.

I could find nothing at all about the size of the budget for 2015-16. The Norfolk and Suffolk launch presentation indicates that Hubs will enjoy additional funding for both running costs and projects but does not quantify this statement. Another source suggests that the time allocation for Hub leads will be increased to 0.5FTE.

There is no information about funding levels in the guide to the autumn 2015 competition, although it suggests that the money will come in two separate streams:

‘Each Maths Hub will receive direct funding for structural operational purposes and funding on a project basis for local and national projects.’

It may be that the operational funding is paid via NCTL and the project funding via NCETM.

One assumes that operational funding will need to be uprated by at least 33% for 2015-16 since it will cover a full financial year rather than July to March inclusive (9 months only).

If the funding for local and national projects is increased by the same amount, that would bring the sum per Hub in FY2015-16 to approximately £128,000 and the total budget to something like £5m.
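The arithmetic behind these estimates can be set out explicitly. The sketch below uses only the figures quoted in this section; the FY2015-16 numbers are my projections rather than published allocations:

```python
# Reconstruction of the funding estimates discussed above.
HUBS = 34

# FY2014-15: the Hubs operated from July to March (nine months)
core, local_projects, national_projects = 36_000, 40_000, 20_000
per_hub_2014 = core + local_projects + national_projects      # £96,000
total_2014 = HUBS * per_hub_2014                              # £3,264,000
print(f"FY2014-15: £{per_hub_2014:,} per hub; £{total_2014:,} in total")

# If the Supplementary Estimates figure of £3.7m is accurate, the balance
# available for central administration would be:
print(f"Implied central costs: £{3_700_000 - total_2014:,}")  # £436,000

# FY2015-16 projection: uprate by a third to cover a full twelve months
per_hub_2015 = per_hub_2014 * 12 // 9                         # £128,000
total_2015 = HUBS * per_hub_2015                              # £4,352,000
print(f"FY2015-16 (projected): £{per_hub_2015:,} per hub; £{total_2015:,} before central costs")
```

Adding central costs on a similar scale to 2014-15 would take the projected total to something approaching £5m, as suggested above.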

It would be helpful to have rather more transparency about Hub budgets and the total sum available to support them in each financial year.

If the NCETM operation needs retendering for FY2016-17 onwards, one assumes that national co-ordination of the Hubs will form part of the specification. One might expect to see a tender early next academic year.

.

Hubs’ Current Activity

Developing role 

The press release marking the launch was strongly focused on Hubs’ role in leading what was then called the Shanghai Teacher Exchange Programme:

‘A national network of maths hubs that will seek to match the standards achieved in top-performing east Asian countries – including Japan, Singapore and China – was launched today by Education Minister Elizabeth Truss…

These ‘pace-setters’ will implement the Asian-style mastery approach to maths which has achieved world-leading success….Hubs will develop this programme with academics from Shanghai Normal University and England’s National Centre for Excellence in the Teaching of Maths (NCETM)….

… The Shanghai Teacher Exchange programme will see up to 60 English-speaking maths teachers from China embedded in the 30 maths hubs, starting this autumn term.

The Chinese teachers will run master classes for local schools and provide subject-specific on-the-job teacher training.

Two leading English maths teachers from each of the 30 maths hubs will work in schools in China for at least a month, to learn their world-class teaching approaches. The teachers will then put into practice in England what they have learnt and spread this widely to their peers.’

It also mentioned that the Hubs would be supporting the Your Life campaign to inspire young people, especially girls, to study maths and physics.

‘The campaign, led by businesses, aims to increase the number of students taking maths and physics A level by 50% over the next 3 years.’

Moreover:

‘They will also work with new maths and physics chairs, PhD graduates being recruited to become teachers to take their expertise into the classroom and transform the way the maths and physics are taught.’

The Website describes three National Collaborative Projects in slightly different terms:

  • England-China is the new title for the Shanghai Teacher Exchange. Primary sector exchanges took place in 2014/15 and secondary exchanges are scheduled for 2015/16.

The aim of the project is described thus:

‘The aim, as far as the English schools are concerned, is to learn lessons from how maths is taught in Shanghai, with particular focus on the mastery approach, and then research and develop ways in which similar teaching approaches can be used in English classrooms

…The long-term aim of the project is for the participating English schools first to develop a secure mastery approach to maths teaching themselves, and then to spread it around partner schools.’

  • Textbooks and Professional Development involves two primary schools from each Maths Hub trialling adapted versions of Singapore textbooks with their Year 1 classes.

Each school has chosen one of two mastery-focused textbooks: ‘Inspire Maths’ and ‘Maths – No Problem’. Teachers have five days’ workshop support.

  • Post-16 Participation is intended to increase participation rates in A level maths and further maths courses as well as Core Maths and other Level 3 qualifications. Some hubs are particularly focused on girls’ participation.

The initial phase of the project involves identifying schools and colleges that are successful in this respect, itemising the successful strategies they have deployed and exploring how those might be implemented in schools and colleges that have been rather less successful.

.

Progress to date on National Collaborative Projects 

Coverage of the National Projects on the Hubs website is heavily biased towards the England-China project, telling us comparatively little about the other national priorities.

A group of 71 primary teachers visited Shanghai in September 2014. Return visits from 59 Shanghai teachers took place in two waves, in November 2014 and February/March 2015. 

A list of 47 participating schools is supplied including the hubs to which they belong.

There is also a Mid-Exchange Report published in November 2014, a press release from February 2015 marking the arrival of the second wave and the first edition of Bespoke, a Maths Hub newsletter dating from April 2015, which is exclusively focused on mastery.

The latter describes the exchanges as:

‘…the start of a long-term research project, across all of the Maths Hubs, to investigate ways in which mastery approaches can be introduced to maths lessons, to the way teachers design lessons, and to how schools organise time-tables, and the deployment of teachers and teaching assistants.’

These descriptions suggest something rather different to the slavish replication of Shanghai-style mastery, anticipating a ‘secure mastery approach’ that might nevertheless have some distinctive English features.

But NCETM has already set out in some detail the principles and key features of the model they would like to see introduced, so rather less is expected of the Hubs than one might anticipate. They are essentially a testbed and a mechanism for the roll-out of a national strategy.

The website also indicates that, before the end of summer term 2015:

‘…the NCETM, working through the Maths Hubs will publish support materials for assessment of the depth of pupils’ knowledge within the context of a mastery curriculum.’

NCETM describes the materials as a collaborative venture involving several partners:

‘Recording progress without levels requires recording evidence of depth of understanding of curriculum content, rather than merely showing pupils can ‘get the answers right’.

The NCETM, working with other maths experts and primary maths specialists from the Maths Hubs, is currently producing guidance on how to do this for the primary maths National Curriculum. For each curriculum statement, the guidance will show how to identify when a pupil has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

This is not yet published and, if NCETM is sensible, it will wait to see the outcomes of the parallel Commission on Assessment Without Levels.

The Bespoke newsletter mentions in passing that further research is needed into the application of mastery teaching in mixed age classes, but no further details are forthcoming.

Information about the planned secondary exchange is also rather thin on the ground.

NCETM said in June that the programme would focus on teaching at the KS2/3 transition.

The second edition of Bespoke, published in July 2015 adds:

‘Primary schools that hosted Shanghai teachers in 2014/15 will continue to develop and embed teaching for mastery approaches, and, in addition, two teachers from secondary schools in each Maths Hub will visit Shanghai in September, with their counterparts returning to work in Key Stage 3 classrooms in November 2015.’

The same is true of the Textbooks project, which was announced in a ministerial speech given in November 2014. Very little detail has been added since.

The July edition of Bespoke says that the project:

‘…will be expanded, to take in more schools and more classes, including Year 2 pupils’

while another section offers the briefest of commentaries on progress in the first year – twice over:

.

Extract from the July edition of Bespoke.

Coverage of the Post-16 Participation project is similarly sparse, though this may be because the lead lies with the Further Mathematics Support Programme and Core Maths Support Programme.

July’s Bespoke says of Year 2:

‘Work to help schools and colleges increase the numbers of Year 12 and Year 13 students taking A level maths, and, among them, more girls, will continue. Approaches that bore fruit in some hubs this year will be implemented in other areas.’

The sketchiness of this material causes one to suspect that – leaving aside the Shanghai exchanges – progress on these national projects has been less than spectacular during the first year of the Hubs’ existence.

Even with the England-China project there is no published specification for the long-term research project that is to follow on from the exchanges.

Those working outside the Hubs need more information to understand and appreciate what value the Hubs are adding.

.

New National Collaborative Projects

The July edition of Bespoke confirms two further National Projects.

One is snappily called ‘Developing 140 new Primary Mathematics Teaching for Mastery specialists’:

‘Closely linked to other work on mastery, this project will involve the training of four teachers in each Maths Hub area to become experts in teaching for mastery in their own classrooms, and in supporting the similar development of teachers in partner schools.’

This project appeared to be a national programme-in-waiting when it was first announced in April 2015.

A subsequent NCETM press release confirmed that there were over 600 applicants for the available places.

The further details provided by NCETM reveal that participants will pursue a two-year course. Year One combines three two-day residential events with the leadership of teacher research groups, both in the teacher’s own school and for groups of teachers in neighbouring schools.  Year Two is devoted exclusively to these external teacher research groups.

The material explains that a research group is:

‘…a professional development activity attended by a group of teachers, with a specific focus on the design, delivery and learning within a jointly evaluated mathematics lesson.’

An FAQ document explains that a typical research group meeting is a half-day session with discussion taking place before and after a lesson observation.

The four external group meetings in Year One will together constitute a pilot exercise. In Year Two participants will lead up to five such groups, each meeting on six occasions. Groups will typically comprise five pairs of teachers drawn from five different schools.

Release time is 12 days in Year One and up to 30 days in Year Two (assuming the participant leads the maximum five research groups).

Training and supply costs are covered in Year One but in Year Two they are to be met by charging the other participants in the research groups – a first indication that Hubs will be expected to generate their own income stream from the services they provide. (NCETM will provide ‘guidance’ on fee levels.)

Participants are expected to develop:

  • ‘Understanding of the principles of mastery within the context of teaching mathematics.
  • Deep subject knowledge of primary mathematics to support teaching for mastery.
  • The development of effective teaching techniques to support pupils in developing mastery of mathematics.
  • The ability to assess pupils for mastery.
  • The ability to support other teachers, and lead teacher research groups.’

The intention is that teachers completing the course will roll out further phases of professional development and:

‘Over time, this will spread the understanding of, and expertise in, teaching maths for mastery widely across the primary school system.’
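Taken at face value, the figures above permit a rough upper bound on the programme’s reach. The sketch below combines them under the generous assumptions that every specialist leads the maximum five groups and that no school appears in more than one group:

```python
# Ceiling on the reach of the 140 mastery specialists, using the figures
# given above. An upper bound, not a forecast.
specialists = 140
max_groups_each = 5        # Year Two maximum per specialist
schools_per_group = 5      # five pairs of teachers from five schools

max_schools = specialists * max_groups_each * schools_per_group
print(f"Maximum primary schools reached in Year Two: {max_schools:,}")  # 3,500

# Against roughly 17,000 state-funded primaries (34 hubs x 500, the ratio
# used earlier in this post), that is about a fifth of the sector:
print(f"Share of the primary sector: {max_schools / (34 * 500):.0%}")   # 21%
```

Even on these assumptions, a single two-year cycle touches only about a fifth of primary schools, so the promised system-wide spread depends heavily on the further phases of professional development mentioned above.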

The second new national project is called ‘Mathematical Reasoning’. Bespoke is typically uninformative:

‘A new project will start in September 2015, to trial ways of developing mathematical reasoning skills in Key Stage 3 pupils.’

This may or may not be related to an NCETM Multiplicative Reasoning Professional Development Programme which took place in 2013/14 with the assistance of the Hubs.

This:

‘focused on developing teachers’ understanding and capacity to teach topics that involved multiplicative reasoning to Key Stage 3 (KS3) pupils. Multiplicative reasoning refers to the mathematical understanding and capability to solve problems arising from proportional situations often involving an understanding and application of fractions as well as decimals, percentages, ratios and proportions.’

Some 60 teachers from 30 schools were organised into three regional professional development networks, each with a professional development lead and support from university researchers. Project materials were created by a central curriculum development team. The regional networks were hosted by Maths Hubs, presumably in their pilot phase.

In June 2015 DfE published an evaluation of the project featuring a Randomised Control Trial (RCT). Unfortunately, this did not reveal any significant impact on pupil attainment:

‘During the timescale of the trial (13 October 2014 to May 2015) the programme did not have any statistically significant impacts on general mathematical attainment as measured by PiM tests or on items on the tests specifically associated with multiplicative reasoning’.

One of the Report’s recommendations is:

‘For the NCETM to make available MRP materials and approaches to teaching MR through the Maths Hub network’

Another:

‘That the NCETM seeks further opportunities to engage curriculum developers with Maths Hubs and other NCETM activities and potentially to develop future curriculum design projects that address the needs of teachers, schools and pupils’.

With five national collaborative projects rather than three, the work programme in each Hub during Academic Year 2015/16 will be more heavily biased towards the Government’s agenda, unless there is also additional funding to increase the number of local projects. There is no hint in the latest Bespoke newsletter that this is the case.

.

Local projects 

Unfortunately, Hub-specific pages on the Maths Hubs Website do not distinguish national from local projects.

A regional breakdown offers some insight into the typical distribution between the two and the range of issues being addressed.

The embedded spreadsheet provides further details, including links to additional information on each work group where the Hubs have made this available.

  • South West: The four Hubs between them identify 27 work groups. Each Hub has a work group for each of the three initial national collaborative projects. Relatively unusual topics include maths challenge and innovation days and improving primary maths enrichment experiences. The Jurassic Hub includes amongst its list of generic focus areas ‘developing access for the most able’, but there is no associated work group.
  • West Midlands: Two of the three hubs have six work groups and the third has seven. Here there is rather less adherence to the national priorities with only the North Midlands and Peaks Hub noticeably engaged with the mastery agenda. One work group is addressing ‘strategies for preventing (closing) the gap’ in maths. It is disturbing that this is unique across the entire programme – no other region appears concerned enough to make this a priority, nor is it a national project in its own right.
  • North West: Of the three Hubs, one has provided no details of its work groups, one lists six and the other nine. Perhaps the most interesting is North West Two’s Maths App Competition. This involves Y5 and 6 pupils creating ‘a maths-based app for a particular area of weakness that they have identified’.
  • North East: The two North East Hubs have nine and eight work groups respectively. Both address all three initial national priorities. In one the remaining groups are designed to cover the primary, secondary and post-16 sectors respectively. In the other there is a very strong mastery bias with two further work groups devoted to it.
  • Yorkshire and Humberside: Only two of the four Hubs provide details of their work groups in the standard format. One offers eight, the other four. The less ambitious Yorkshire and the Humber Hub does not include any of the three national priorities but addresses some topics not found elsewhere, including Same Day Intervention and Differentiation. In contrast, Yorkshire Ridings covers all three national priorities and runs a local project offering £500 bursaries for small-scale action research projects.
  • East Midlands: Two of the Hubs identify six work groups but the third – one of the two late additions – has only two, neither of them focused on the national priorities. Elsewhere, only East Midlands East has a work group built around the Shanghai exchanges. Otherwise, network focused work groups – whether for primary specialists, subject leaders or SLEs – are dominant.
  • East: Two of the four Hubs provide links to their own websites, which are not particularly informative. The others name nine and five work groups respectively. The former – Matrix Essex and Herts – includes all three initial national priorities, but the latter – Norfolk and Suffolk – includes only increasing post-16 participation. Matrix has a local project to enhance the subject knowledge of teaching assistants. 
  • South East: The five Hubs vary considerably in the number of work groups they operate, ranging between three and 11. Bucks, Berks and Oxon is the least prolific, naming only the three national priorities. At the other extreme, Surrey Plus is the most active of all 34 Hubs, though several of its groups appear to relate to courses, conferences and other one-off meetings. One is providing ‘inspiration days for KS2, KS3 and KS4 students in schools looking to improve attitudes towards maths’. 
  • London: Of the six London Hubs, one has provided no information about its work groups. Two of the remaining five have only three work groups. Of these, London Central and NW lists the three national priorities. The other – London Central and West – mentions the two mastery-related national programmes and then (intriguingly) a third project called ‘Project 4’! London Thames includes a Student Commission Project:

‘Students will become researchers over two days and will explore the difference between depth and acceleration in terms of students’ perceptions of progress. There will be support from an expert researcher to support them in bringing together their findings. They will present their findings at the Specialist Schools and Academies Trust (SSAT) Conference and other forums where they can share their experience.’

Unfortunately, the presentation given at this event suggests the students were unable to produce a balanced treatment, carefully weighing up the advantages and disadvantages of each approach and considering how they might be combined to good effect. Naturally they came up with the ‘right’ answer for NCETM!

The variation in the productivity of Hubs is something of a surprise. So are the different levels of commitment they display towards the NCETM’s mastery-focused agenda.

Does NCETM push the laggards to work harder and conform to its priorities, or does it continue to permit this level of variance, even though it will inevitably compromise the overall efficiency of the Maths Hub programme?

.

Supporting the Most Able

.

Through the Maths Hubs 

In 2013, NCETM published guidance on High Attaining Pupils in Primary Schools (one has to register with NCETM to access these materials).

This is strongly influenced by ACME’s Report ‘Raising the bar: developing able young mathematicians’ (December 2012) which defines its target group as:

‘…those students aged 5-16 who have the potential to successfully study mathematics at A level or equivalent’.

ACME bases its report on three principles:

  • ‘Potential heavy users of mathematics should experience a deep, rich, rigorous and challenging mathematics education, rather than being accelerated through the school curriculum.
  • Accountability measures should allow, support and reward an approach focused on depth of learning, rather than rewarding early progression to the next Key Stage.
  • Investment in a substantial fraction of 5-16 year olds with the potential to excel in mathematics, rather than focussing attention on the top 1% (or so), is needed to increase the number of 16+ students choosing to study mathematics-based subjects or careers.’

ACME in turn cites Mathematical Association advice from the previous year on provision for the most able in secondary schools.

It is fascinating – though beyond the scope of this post – to trace through these publications and subsequent NCETM policy the evolution of an increasingly narrow and impoverished concept of top-end differentiation.

The line taken in NCETM’s 2013 guidance is still relatively balanced:

‘It’s probably not helpful to think in terms of either enrichment or acceleration, but to consider the balance between these two approaches. Approaches may vary depending on the age of children, or the mathematics topics, while there may be extra-curricular opportunities to meet the needs of high attaining children in other ways. In addition to considerations of which approach supports the best learning, there are practical issues to consider.’

This is a far cry from the more extreme position now being articulated by NCETM, as discussed in my earlier post ‘A digression on breadth, depth, pace and mastery’.

There is in my view a pressing need to rediscover a richer and more sophisticated vision of ‘stretch and challenge’ for high attaining learners in maths and, by doing so, to help to achieve the Conservative manifesto commitment above. This need not be inconsistent with an Anglicised mastery model; indeed, it ought to strengthen it significantly.

One obvious strategy is to introduce a new National Collaborative Project, ensuring that all 34 Hubs are engaged in developing this vision and building national consensus around it.

Here are some suggested design parameters:

  • Focus explicitly on improving attainment and progress, reducing underachievement by high attaining learners and closing gaps between disadvantaged high attainers and their peers.
  • Develop interventions targeted directly at learners, as well as professional development, whole school improvement and capacity building to strengthen school-led collaborative support.
  • Emphasise cross-phase provision encompassing primary, secondary and post-16, devoting particular attention to primary/secondary and secondary/post-16 transition.
  • Develop and disseminate effective practice in meeting the needs of the most able within and alongside the new national curriculum, including differentiated support for those capable of achieving at or beyond the equivalent of KS2 Level 6 in scaled score terms, and at or beyond Grade 9 at GCSE.
  • Develop, test and disseminate effective practice in meeting the needs of the most able through a mastery-driven approach, exemplifying how breadth, depth and pace can be combined in different proportions to reflect high attainers’ varying needs and circumstances.

.

Through ‘Most Able Hubs’

Compared with Maths Hubs, the Sutton Trust’s recommendation – that designated schools should support those that are underperforming with the most able and consider providing a localised extra-curricular enrichment programme – is markedly unambitious.

And of course the Maths Hubs cannot be expected to help achieve Conservative ambitions for the other elements of STEM (let alone STEAM).

Why not introduce a parallel network of Most Able Hubs (MAHs)? These would follow the same design parameters as those above, except that the last would embrace a whole school/college and whole curriculum perspective.

But, in the light of the analysis above, I propose some subtle changes to the model.

  • Number of hubs

Thirty-four is not enough for genuine national reach. But the supply of potential hubs is constrained by the budget and the number of lead institutions capable of meeting the prescribed quality criteria.

Assuming that the initial budget is limited, one might design a long-term programme that introduces the network in two or even three phases. The first tranche would help to build capacity, improving the capability of those intending to follow in their footsteps.

The ideal long-term outcome would be to introduce approximately 100 MAHs, at least 10 per region and sufficient for each to support some 200 primary and secondary schools (170 primary plus 30 secondary) and all the post-16 institutions in the locality.

That might be achieved in two phases of 50 hubs apiece or three of 33-34 hubs apiece.
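
As a rough sanity check on that reach, the sketch below multiplies out the per-hub allocations. The national school counts in the comments are approximate figures of my own for the period, not drawn from this post:

hubs = 100
primary_per_hub = 170    # primary schools supported by each MAH
secondary_per_hub = 30   # secondary schools supported by each MAH

print(hubs * primary_per_hub)    # 17,000 primary schools covered
print(hubs * secondary_per_hub)  # 3,000 secondary schools covered
# For comparison, England had roughly 16,800 state-funded primary and
# 3,400 state-funded secondary schools at the time, so 100 hubs would
# come close to genuine national coverage.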

  • Quality threshold

In the first instance, MAHs would be selected on the basis of Ofsted evaluation – Outstanding overall and for the same sub-categories as Maths Hubs – and high-attaining pupil performance data, relating to attainment, progress and destinations. This should demonstrate a strong record of success with disadvantaged high attainers.

One of the inaugural national collaborative projects (see below) would be to develop and trial a succinct Quality Measure and efficient peer assessment process, suitable for all potential lead institutions regardless of phase or status.

This would be used to accredit all new MAHs, but also to re-accredit existing MAHs every three years. Those failing to meet the requisite standard would be supported to improve.

  • Three tiers and specialism

MAHs would operate at local and national level but would also collaborate regionally. They might take it in turns to undertake regional co-ordination.

Each would pursue a mix of national, regional and local priorities. The regional and local priorities would not replicate national priorities but MAHs would otherwise have free rein in determining them, subject to the approval of action plans (see below).

Each MAH would also be invited to develop a broader specialism which it would pursue in national and regional settings. MAHs from different regions with the same specialism would form a collaborative. The selected specialism might be expected to inform to some extent the choice of local priorities.

  • Strategic partnerships

Each MAH would develop a variety of local strategic partnerships, drawing in other local school and college networks, including TSAs, MATs, local authority networks, maths and music hubs; local universities, their faculties and schools of education; nearby independent schools; local commercial and third sector providers; and local businesses with an interest in the supply of highly skilled labour. Some partners might prefer to engage at a regional level.

SLEs with a ‘most able’ specialism would be involved as a matter of course and would be expected to play a leading role.

National bodies would serve as national strategic partners, sitting on a National Advisory Group and contributing to the termly national forum.

Participating national bodies would include: central government and its agencies; national organisations, whether third sector or commercial, supporting the most able; and other relevant national education organisations, including subject associations and representative bodies.

Termly forums would be used to monitor progress, resolve issues and plan collaborative ventures. All non-sensitive proceedings would be published online. Indeed a single website would publish as much detail as possible about the MAHs: transparency would be the watchword.

  • Work Groups

Each MAH would agree an annual action plan applying the work group methodology to its national, regional and local priorities. Each priority would entail a substantive work programme requiring significant co-ordinated activity over at least two terms.

An additional work group would capture any smaller-scale local activities (and MAHs might be permitted to use a maximum of 10% of their programme budget for this purpose).

MAHs’ progress against their action plans – including top level output and outcome targets – would be assessed annually and the results used to inform the re-accreditation process.

The programme as a whole would be independently evaluated and adjusted if necessary to reflect the findings from formative evaluation.

  • Staffing and funding

MAHs would operate with the same combination of co-ordinator, SLT sponsor and administrator roles, but with the flexibility to distribute these roles between individuals as appropriate. Hubs would be encouraged to make the lead role a full-time appointment.

Co-ordinators would constitute a ‘network within a network’, meeting at termly forums and supporting each other through an online community (including weekly Twitter chats) and a shared resource base.

Co-ordinators would be responsible for devising and running their own induction and professional development programme and ensuring that new appointees complete it satisfactorily. Additional funding would be available for this purpose. The programme would be accredited at Masters level.

Assuming a full year budget of £160K per MAH (£60K for structural costs; £100K for work groups), plus 10% for central administration, the total steady-state cost of a 100-MAH network would be £17.6m per year, not much more than the £15m that Labour committed during the General Election campaign. If the programme were phased in over three years, the annual cost would be significantly lower during that period.
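
For transparency, here is the arithmetic behind that steady-state figure – a back-of-envelope sketch using only the assumed unit costs above:

hubs = 100
structural = 60_000     # per hub: co-ordinator, SLT sponsor, administrator
work_groups = 100_000   # per hub: annual work group programme
admin_rate = 0.10       # central administration overhead

per_hub = structural + work_groups   # £160K per MAH
network = hubs * per_hub             # £16.0m
total = network * (1 + admin_rate)   # £17.6m
print(f"£{total / 1e6:.1f}m per year")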

MAHs might be encouraged to generate income to offset against their structural costs. The co-ordinators’ salaries and on-costs might be the first priority. In time, Hubs might be expected to meet these entirely from income generated, so reducing the overall cost by around a third.

In an ideal world, MAHs would also support a parallel programme providing long-term intensive support to disadvantaged high attainers, funded through a £50m pupil premium topslice.

The overall cost is significant, but bears comparison with the substantial sums invested in some selective 16-19 free schools, or the £50m recently set aside for School Cadet Forces. Maybe funding for MAHs should also be drawn from the fines levied on the banks!

MAHs would support learners from YR to Y13 and have a genuinely national reach, while free schools can only ever impact significantly on a very limited annual intake, plus those fortunate enough to benefit from any localised outreach activity. In short, MAHs offer better value for money.

.

Conclusion

The principal findings from this review are that:

  • Maths Hubs offer a potentially workable model for system-wide improvement in the quality of maths education which could help to secure higher standards, stronger attainment and progress. But expectations of the Hubs are set too high given the limited resource available. It is doubtful whether the present infrastructure is strong enough to support the Government’s ambition to make England the best place in the world to study maths (in effect by 2020).
  • Given the dearth of information it is very difficult to offer a reliable assessment of the progress made by Maths Hubs in their first year of operation. The network has managed to establish itself from scratch within a relatively short time and with limited resources, but progress appears inconsistent, with some Hubs taking on and achieving much more than others. Two of the first three national collaborative projects still seem embryonic and the England-China project seems to be making steady rather than spectacular progress.
  • There are some tensions and weaknesses inherent in the model. In particular it relies on the successful reconciliation of potentially competing national and local priorities. There is evidence to suggest that national priorities are dominating at present. The model also depends critically on the capability of a small group of part-time co-ordinators. Several are likely to have limited experience and support, as well as insufficiently generous time allocations. Many will inevitably progress to school leadership positions so turnover will be a problem. An independent evaluation with a formative aspect would have been helpful in refining the model, ironing out the shortcomings and minimising the tensions. The apparent failure to commission an evaluation could become increasingly problematic as the expectations placed on the Hubs are steadily ratcheted upwards.
  • The supply of information is strictly rationed; the profile of Maths Hubs is far too low. Because the quality and quantity of information is so limited, those not working inside the network will infer that there is something to hide. Institutions that have not so far engaged with the Hubs will be less inclined to do so. If external communication is wanting, that may suggest that intra-Hub communication is equally shaky. Effective communication is critical to the success of such networks and ought to be given much higher priority. The Maths Hub website ought to be a ‘one stop shop’ for all stakeholders’ information needs, but it is infrequently updated and poorly stocked. Transparency should be the default position.
  • If the Government is to ‘create more opportunities to stretch the most able’ while ensuring that all high attainers ‘are pushed to achieve their potential’, then Maths Hubs will need to be at the forefront of a collective national improvement effort. NCETM should be making the case for an additional national collaborative project with this purpose. More attention must be given to shaping how the evolving English model of maths mastery provides stretch and challenge to high attainers, otherwise there is a real risk that mastery will perpetuate underachievement, so undermining the Government’s ambitions. In PISA 2012, 3.1% of English participants achieved Level 6 compared with 30.8% of those from Shanghai, while the comparative percentages for Levels 5 and 6 were 12.4% and 55.4% respectively. NCETM should specify now what they would consider acceptable outcomes for England in PISA 2015 and 2018 respectively.
  • Maths Hubs cannot extend their remit into the wider realm of STEM (or potentially STEAM if arts are permitted to feature). But, as Ofsted has shown, there are widespread shortcomings in the quality of ‘most able education’ more generally, not least for those from disadvantaged backgrounds. I have already made the case for a targeted support programme to support disadvantaged high attainers from Year 7 upwards, funded primarily through an annual pupil premium topslice. But the parallel business of school and college improvement might be spearheaded by a national network of Most Able Hubs with a whole school/college remit. I have offered some suggestions for how the Maths Hubs precedent might be improved upon. The annual cost would be similar to the £15m committed by Labour pre-election.

If such a network were introduced from next academic year then, by 2020, the next set of election manifestos might reasonably aim to make Britain the best place in the world for high attaining learners, especially high attaining learners from disadvantaged backgrounds.

And, with a generation of sustained effort across three or four successive governments and universal commitment in every educational setting, we might just make it….

What do you think the chances are of that happening?

Me too.

.

GP

July 2015

Missing Talent

.

This post reviews the Sutton Trust’s Research Brief ‘Missing Talent’, setting it in the context of the Trust’s own priorities and the small canon of research on excellence gaps in the English education system.

It is structured as follows:

  • Background on what has been published and my own involvement in researching and debating these issues.
  • Analysis of the data-driven substance of the Research Brief.
  • Analysis of the recommendations in the Research Brief and their fit with previous recommendations contained in the Sutton Trust’s Mobility Manifesto (September 2014).
  • Commentary on the quality of the Research Brief, prospects for the adoption of these recommendations and comparison with my own preferred way forward.

.

Background

‘Missing Talent’ was prepared for The Sutton Trust by education datalab, an offshoot of the Fischer Family Trust (FFT).

The project was announced by education datalab in March (my emphases):

‘This is a short piece of research to explore differences in the secondary school experiences of highly able children from deprived backgrounds, compared to others. Its purpose is to identify whether and why some of the top 10% highest attaining children at the end of primary school do not achieve their full potential at age 16…

…For this group of highly able children we will:

  • describe the range of different GCSE outcomes they achieve
  • show their distribution across local authorities and different types of schools
  • explore whether there is any evidence that different types of high attaining children need to be differentially catered for within our education system

We hope our research will be able to suggest what number and range of qualifications schools should plan to offer students in this group. We may be able to identify parts of the country or particular types of schools where these students are not currently reaching their potential. We will be able to show whether highly able children from particular backgrounds are not currently reaching their full potential, with tentative suggestions as to whether school or home support are mostly contributing to this underperformance.’

On 2 June 2015, The Sutton Trust published:

  • An Overview summarising the key findings and recommendations
  • A Press Release ‘Over a third of clever but poor boys significantly underachieve at GCSE’ and
  • A guest blog post – Advancing the able – authored by Rebecca Allen, education datalab director. This also appears on the education datalab site.

The post is mostly about the wider issue of the priority attached to support for high attainers. It contains a gratifying reference to ‘brilliant blogger Gifted Phoenix’, but readers can rest assured that I haven’t pulled any punches here as a consequence!

The press release provided the substance of the ensuing media coverage, including pieces by the BBC, Guardian, Mail, Schools Week and TES.

There was limited commentary on social media, since release of the Research Brief coincided with publication of the legislation underpinning the new Conservative Government’s drive for academisation. I commented briefly on Twitter.

.

.

Just prior to publication and at extremely short notice I was asked by Schools Week for a comment that foregrounded references to the pupil premium.

This was in their coverage:

“I wholeheartedly support any action to reinforce effective practice in using pupil premium to support ‘the most able disadvantaged’.

“Ofsted is already taking action, but this should also be embedded in pupil premium reviews and become a higher priority for the Education Endowment Foundation.

“Given their close relationship, I hope the Sutton Trust will pursue that course. They might also publicly oppose Teach First proposals for redistributing pupil premium away from high and middle attainers and engage more directly with those of us who are pursuing similar priorities.”

For those who are unaware, I have been campaigning against Teach First’s policy position on the pupil premium, scrutinised in this recent post: Fisking Teach First’s defence of its pupil premium policy (April 2015). This is also mentioned in the Allen blog post.

I have also written extensively about excellence gaps, provisionally defined as:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

This appears in a two-part review of the evidence base published in September 2014.

I have drawn briefly on that material in the commentary towards the end of this post.

.

Research Brief findings

.

Main findings, definitions and terminology

The Research Brief reports its key findings thus:

  • 15% of highly able pupils who score in the top 10% nationally at age 11 fail to achieve in the top 25% at GCSE 
  • Boys, and particularly pupil premium eligible boys, are most likely to be in this missing talent group 
  • Highly able pupil premium pupils achieve half a grade less than other highly able pupils, on average, with a very long tail to underachievement 
  • Highly able pupil premium pupils are less likely to be taking GCSEs in history, geography, triple sciences or a language

These are repeated verbatim in the Trust’s overview of research, but are treated slightly differently in the press release, which foregrounds the performance of boys from disadvantaged backgrounds:

‘Over a third (36%) of bright but disadvantaged boys seriously underachieve at age 16, new Sutton Trust research reveals today. Clever but poor girls are slightly less likely to underperform, with just under a quarter (24%) getting disappointing GCSE results. These figures compare with 16% of boys and 9% of girls from better off homes who similarly fall behind by age 16.’

The opening paragraph of the Brief describes ‘highly able’ learners as those achieving within the top decile in KS2 tests. This is a measure of prior attainment, not a measure of ability, and it would have been better if the document had referred to high attainers throughout.

There is also a curious and cryptic reference to this terminology:

‘…following Sutton Trust’s previously used notion of those ‘capable of excellence in school subjects’’

which is not further explained (though ‘capable’ implies a measure of ability rather than attainment).

The analysis is based on the 2014 GCSE cohort and is derived from ‘their mark on each KS2 test paper they sat in 2009’. It therefore depends on high average performance across statutory tests of English, maths and (presumably) science.

The single measure of GCSE performance is achievement on the Attainment 8 measure, as defined in 2014. This has not been made available through the 2014 Secondary Performance Tables.

Essentially, Attainment 8 comprises English and maths (both double-weighted), any three EBacc subjects and three other approved qualifications (the Brief says they must be GCSEs).

The measure of ‘missing talent’ is derived from the relationship between these two performance measures. It comprises those who fall within the top decile at KS2 but outside the top quartile nationally (ie the top 25%) at KS4.

There is no explanation or justification for the selection of these two measures, why they are pitched differently and why the difference between them has been set at 15 percentage points.

The text explains that some 7,000 learners qualify as ‘missing talent’, about 15% of all highly able learners (so the total of all highly able learners must approach 47,000).
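
That back-calculation is easily verified – a trivial check using only the Brief’s own figures:

missing_talent = 7_000   # learners in the 'missing talent' group
share = 0.15             # the group is about 15% of all highly able learners

print(round(missing_talent / share))  # ~46,667, i.e. approaching 47,000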

The analysis is based on certain presumptions about consistency of progress between key stages. The brief says, rather dismissively:

‘Progress through school is not always smooth and predictable. Of course some children do well at primary school but are overtaken by peers who thrive at secondary school.’

It does not mention education datalab’s own analysis which shows that only 45% of learners make the expected linear progress between KS2 and KS3 and just 33% do so between KS3 and KS4. It would have been interesting and useful to have seen material about inconsistency of progress amongst this cohort.

Presumably the selection of top decile at KS2 but top quartile at KS4 is intended in part to compensate for this effect.

The main body of the Research Brief provides analysis of four topics:

  • The characteristics of the ‘missing talent’ subset – covering gender, ethnic background and socio-economic disadvantage.
  • Performance on the Attainment 8 measure for highly able learners from disadvantaged backgrounds, compared with their more advantaged peers.
  • Take up of EBacc subjects by this population, including triple science.
  • The geographical distribution of ‘missing talent’ between local authorities and schools.

The sections below deal with each of these in turn.

.

The characteristics of ‘missing talent’

The ‘missing talent’ population comprises some 7,000 learners, so about 1 in 7 of all highly able learners according to the definition deployed.

We are not provided with any substantive information about the characteristics of the total highly able cohort, so are unable to quantify the differences between the composition of that and the ‘missing talent’ subset.

However we are told that the ‘missing talent’ group:

  • Is slightly more likely to be White British, Black Caribbean, Pakistani or Bangladeshi and somewhat less likely to be Chinese, Indian or African.
  • Includes 1,557 learners (943 boys and 614 girls) who are disadvantaged. The measure of disadvantage is ‘ever 6 FSM’, the basis for receipt of the pupil premium on grounds of deprivation. This is approximately 22% of the ‘missing talent’ group.
  • Includes 24% of the ‘ever 6 FSM’ girls within the highly able cohort, compared with 9% of other girls; and 36% of ‘ever 6 FSM’ boys, compared with 16% of other boys.

Hence: ‘ever 6 FSM’ learners of both genders are more likely to be part of ‘missing talent’; boys are more likely than girls to be included, regardless of socio-economic status; and ‘ever 6 FSM’ boys are significantly more likely to be included than ‘ever 6 FSM’ girls.

.

[Chart from the Research Brief omitted]

.

The fact that 36% of ‘ever 6 FSM’ boys fall within the ‘missing talent’ group is described as ‘staggering’.

By marrying the numbers given with the percentages in the charts above, it seems that some 5,180 of the total highly able population are disadvantaged – roughly 11%. Both disadvantaged boys and girls are therefore heavily over-represented in the ‘missing talent’ subset (some 30% of the total disadvantaged population are ‘missing talent’) and significantly under-represented in the total ‘highly able’ cohort.
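
For those who want to see the working, here is one way of reconstructing that estimate from the figures quoted above – a sketch only, since the Brief does not show this calculation itself:

fsm6_boys_mt = 943    # disadvantaged boys in 'missing talent'
fsm6_girls_mt = 614   # disadvantaged girls in 'missing talent'
boys_rate = 0.36      # share of highly able FSM6 boys who are 'missing talent'
girls_rate = 0.24     # share of highly able FSM6 girls who are 'missing talent'

fsm6_boys = fsm6_boys_mt / boys_rate     # ~2,619 highly able FSM6 boys
fsm6_girls = fsm6_girls_mt / girls_rate  # ~2,558 highly able FSM6 girls
fsm6_total = fsm6_boys + fsm6_girls      # ~5,180 in all

print(fsm6_total / 47_000)  # ~0.11: roughly 11% of the highly able cohort
print(1_557 / fsm6_total)   # ~0.30: some 30% of them are 'missing talent'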

By comparison, the 2014 Secondary Performance Tables show that 26.9% of the overall 2014 GCSE cohort in state-funded schools are disadvantaged (though this includes children in care).

There is no analysis to show whether there is a particular problem with white working class boys (or any other sub-groups for that matter) although that might be expected.

.

Attainment 8 performance

Attainment 8 is described as ‘the Government’s preferred measure’, although proposals in the Conservative manifesto for a ‘compulsory EBacc’ will almost certainly change its nature significantly, even if Attainment 8 is not supplanted by the EBacc altogether.

The document supplies a table showing the average grade (points equivalents) for different percentiles of the ‘highly able FSM6’, ‘highly able not FSM6’ and ‘not highly able’ populations.

.

[Table from the Research Brief omitted]

.

Median (50th percentile) performance for ‘highly able FSM6’ is 6.7, compared with 7.2 for ‘highly able not FSM6’ and 5.0 for ‘not highly able’.

The commentary translates this:

‘…they [‘highly able FSM 6’] score 4As and 4Bs when their equally able classmates from better off backgrounds get straight As’.

By analogy, the ‘not highly able’ group are achieving straight Cs.

However, there is also a ‘long tail of underachievement’ amongst the highly able disadvantaged:

‘One in ten of the poor but clever pupils are barely achieving C grades (or doing much worse) and at this end of the distribution they are lagging their non-FSM6 peers by almost a whole GCSE grade per subject.’

The latter is actually only true at the 95th percentile.

By comparison, at that point in the distribution, the ‘not highly able’ population are achieving 8 F grades.

So there is a clear excellence gap between the Attainment 8 performance of the highly able and the highly able disadvantaged, though the difference only becomes severe at the extreme of the distribution – the reference to a ‘long tail’ is perhaps a little overdone.

.

Take-up of EBacc subjects

A second table shows the distribution of grades for ‘highly able FSM6’ and ‘highly able not FSM6’ across the five EBacc components: English, maths, sciences, humanities and languages.

.

[Table from the Research Brief omitted]

This is not discussed extensively in the text, but it reveals some interesting comparisons. For example, the percentage point excellence gaps between the two populations at GCSE grades A*/A are: maths 17 points; English 16 points; sciences 22 points; humanities 21 points; and languages 18 points.

At the other extreme 23% of ‘highly able FSM6’ are Ungraded in languages, as are 16% in humanities. This is particularly worrying if true, but Ungraded almost certainly includes those not entered for an appropriate examination.

The commentary says that ‘almost a quarter will not be taking a language at GCSE’, which might suggest that U is a misnomer. It is not clear whether the U category includes both non-takers and ungraded results, however.

The Government’s plans for ‘compulsory EBacc’ seem likely to force all learners to take a language and history or geography in future.

They will be less likely to make triple science compulsory for high attainers, though this is deemed significant in the document:

‘Just 53% of the highly able FSM6 pupils take triple sciences, compared to 69% of those not in the FSM6 category. This may be through choice or because they are in one of the 20% of schools that does not offer the curriculum. Here again the differences are stark: 20% of highly able FSM6 pupils are in a school not offering triple sciences, compared to just 12% of the highly able not-FSM6 pupils.’

The EBacc does not itself require triple sciences. The implications for teacher supply and recruitment of extending them into the schools that do not currently offer them are not discussed.

.

Geographical distribution of ‘missing talent’

At local authority level the Brief provides a list of 20 areas with relatively high ‘missing talent’ and 20 areas at the other extreme.

The bulk of the former are described as areas where secondary pupil performance is low across the attainment spectrum, but four – Coventry, Lambeth, Leicester and Tower Hamlets – are good overall, so the underachievement of high attainers is apparently exceptional.

Some are described as having comparatively low populations of highly able learners but, as the text implies, that should not be an excuse for underachievement amongst this cohort.

It is not clear whether there is differential performance in respect of disadvantaged learners within the ‘missing talent’ group (though the sample sizes may have been too low to establish this).

It is, however, immediately noticeable that the list of areas with high ‘missing talent’ includes many of the most disadvantaged authorities, while the list with low levels of missing talent is much more ‘leafy’.

Most of the former are located in the Midlands or the North. Almost all were Excellence in Cities areas.

The ‘low missing talent’ list also includes 11 London boroughs, but there are only three on the ‘high missing talent’ list.

The Brief argues that schools with low levels of ‘missing talent’ might support others to improve. It proposes additional selection criteria including:

  • ‘A reasonable number of highly able pupils’ – the rather arbitrary cut-off specified is 7% of cohort. It is not clear whether this is the total cohort or only the GCSE cohort. If the latter, it is more than likely to vary from year to year.
  • ‘Relatively low levels of missing talent’ – fewer than 10% ‘significantly underperform’. It is not clear but one assumes that the sole measure is that described above (ie not within the top 25% on the Attainment 8 measure).
  • ‘A socially mixed intake’ with over 10% of FSM6 learners (this is very low indeed compared with the average for the 2014 GCSE cohort in state-funded schools of 26.9%. It suggests that most of the schools will have relatively advantaged intakes.)
  • Triple science must be offered and the schools must have ‘a positive Progress 8 score overall’ (presumably so that they perform reasonably well across the attainment spectrum).

There is no requirement for the school to have achieved a particular Ofsted rating at its most recent inspection.

We are told that there are some 300 schools meeting this description, but no details are given about their distribution between authorities and regions, beyond the fact that:

‘In half of the 20 local authorities with the highest levels of missing talent there is no exemplar school and so a different policy approach may have to be taken.’

This final section of the document becomes a little discursive, stating that:

‘Any new initiatives to support highly able children at risk of falling behind must recognise the successes and failures of past ‘Gifted and Talented’ initiatives, particularly those of the Blair and Brown governments.’

And

‘We believe that any programme of support – whether through the curriculum or through enrichment – must support schools and children in their localities.’

No effort is made to identify these successes and failures, or to provide evidence to substantiate the belief in localised support (or to explain exactly what that means).

.

Recommendations

.

In the Research Brief

The Research Brief itself consists largely of data analysis, but proffers a brief summary of key findings and a set of policy recommendations.

It is not clear whether these emanate from the authors of the research or have been superimposed by the Trust, but the content distinctly suggests the latter.

There are four recommendations (my emphases):

  • ‘The Government should implement the recommendations of Sutton Trust’s Mobility Manifesto to develop an effective national programme for highly able state school pupils, with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress.
  • All schools must be made accountable for the progress of their most able pupils. These pupils should have access to triple sciences and must study a broad traditional curriculum, including a language and humanity, that widens their future educational opportunities. The Government should report the (3-year average) Progress 8 figures for highly able pupils in performance tables. Schools where highly able pupils currently underperform should be supported through the designation of another local exemplar school. In the small number of areas where there is no exemplary good practice, a one-off centralised support mechanism needs to be set-up.
  • Exemplar schools already successfully catering for highly able pupils that are located in areas of high missing talent should be invited to consider whether they are able to deliver a programme of extra-curricular support to raise horizons and aspirations for children living in the wider area.
  • Highly able pupils who receive Pupil Premium funding are at high risk of underperforming at age 16. Schools should be encouraged to use the Pupil Premium funding for these pupils to improve the support they are able to give them.’

These are also repeated unchanged in the research overview, but are summarised and rephrased slightly in the press release.

Instead of demanding ‘an effective national programme…with ring-fenced funding to support evidence-based activities and tracking of pupils’ progress’ this calls on the Government to:

‘…establish a new highly able fund to test the most effective ways of improving the progress and attainment of highly able students in comprehensive schools and to show that the needs of highly able students, especially those from low and middle income backgrounds, are placed high on the national policy agenda.’

This is heavily redolent of Labour’s pre-election commitment to introduce a Gifted and Talented Fund which would establish a new evidence base and help schools’ ‘work in stretching the most able pupils’.

My own analysis of Labour’s commitment (March 2015) drew attention to similarities between this and The Sutton Trust’s own Mobility Manifesto (September 2014).

.

In the Mobility Manifesto

The Manifesto is mentioned in the footnotes to the press release. It offers three recommendations pertaining to highly able learners:

  • ‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.
  • Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.
  • Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

The press release also mentions the Trust’s Sutton Scholars Scheme, a pilot programme undertaken with partner universities that supports highly able learners from low and middle income backgrounds during KS3.

In 2013 there was an initial pilot with 100 pupils involving UCL. In 2014 this was extended to 400 pupils and four partner universities: UCL, Cambridge, Nottingham and Warwick.

The press release says it currently reaches 500 pupils but still involving just four universities, so this is presumably the size of the 2015 cohort.

The programmes at each institution are subtly different but all involve a mix of out-of-school activities. In most cases they appear to be rebadging elements of the universities’ existing outreach programmes; there is nothing startlingly innovative or radical about them.

.

Commentary

.

Quality of the Research Brief

The document is compressed into three sides of A4 so, inevitably, much valuable information is missing. Education datalab should consider making available a separate annex containing all the underlying data that can be released without infringing data protection rules.

The Brief does not address all the elements set out in the original project description. It does not show the distribution of high attainers by type of school, or discuss the respective contributions of home and school to underperformance, nor does it:

‘…explore whether there is any evidence that different types of high attaining children need to be differentially catered for within our education system’.

It seems that the project has been scaled back compared with these original intentions, whether for lack of useful data or some other reason.

When it comes to the findings that are included:

  • The general conclusions about underachievement, particularly amongst high attainers from disadvantaged backgrounds, add something to our understanding of achievement patterns and the nature of excellence gaps. But the treatment also raises several questions that remain unanswered. The discussion needs reconciling with education datalab’s own findings about the limited incidence of linear progress. Further analysis of the performance of high-attaining disadvantaged boys may be a particular priority.
  • The findings on the take-up of EBacc subjects are relatively unsurprising and second order by comparison. They ought really to have been set in the context of the new Government’s commitment to a ‘compulsory EBacc’ (see below).
  • The information about the distribution of ‘missing talent’ is compromised by the very limited analysis, especially of the distribution between schools. The criteria used to identify a subset of 300 exemplar schools do not bear close scrutiny.

There is no cross-referencing to the existing evidence base on excellence gaps, especially the material relating to whether disadvantaged high attainers remain so in ‘The Characteristics of High Attainers’ (DfES 2007), ‘Performing against the odds: developmental trajectories of children in the EPPSE 3-16 study’ (Siraj-Blatchford et al, 2011) and ‘Progress made by high-attaining children from disadvantaged backgrounds’ (Crawford et al 2014).

.

Prospects for the adoption of these recommendations

The recommendation that schools should be more strongly encouraged to use the pupil premium to benefit these learners – and to do so effectively – is important, but the text should explain how this can be achieved.

Ofsted has already made the case for action, concluding in March 2015 that two-thirds of non-selective secondary schools are not yet using pupil premium effectively to support disadvantaged high attainers.

Ofsted is committed to ensuring that school inspections focus sharply on the progress of disadvantaged high attainers and that future thematic surveys investigate the effective use of pupil premium to support them.

It is also preparing a ‘most able’ evaluation toolkit that will address this issue. This might provide a basis for further guidance and professional development, as long as the material is high quality and sufficiently detailed.

Effective provision for high attainers should be a higher priority for the pupil premium champion and, as I have already suggested, should feature prominently and explicitly in the guidance supporting pupil premium reviews.

Above all, the EEF should be supporting research on this topic as part of a wider initiative to help schools close excellence gaps.

All parties, including the Government, should make clear their opposition to the policy of Teach First and its Fair Education Alliance to double-weight pupil premium for low attainers at the expense of high and middle attaining recipients.

If at all possible, Teach First should be persuaded to withdraw this misguided policy.

It seems highly probable that the Trust’s recommendation for access to ‘a broad traditional curriculum’ will be secured in part through the new Government’s commitment to make EBacc subjects compulsory.

This is likely to be justified on grounds of social justice, derived from the conviction that taking these subjects supports progression to post-16 education, employment and higher education.

But that notion is contested. When the Education Select Committee considered this issue they concluded (my emphasis):

‘We support the Government’s desire to have greater equality of opportunity for all students, and to improve the attainment of those eligible for free school meals. The evidence is unclear as to whether entering more disadvantaged students for EBac subjects would necessarily make a significant contribution to this aim. Concentrating on the subjects most valued for progression to higher education could mean schools improve the attainment and prospects of their lowest-performing students, who are disproportionately the poorest as well. However, other evidence suggests that the EBac might lead to a greater focus on those students on the borderline of achieving it, and therefore have a negative impact on the most vulnerable or disadvantaged young people, who could receive less attention as a result. At the same time, we believe that the EBac’s level of prescription does not adequately reflect the differences of interest or ability between individual young people, and risks the very shoe-horning of pupils into inappropriate courses about which one education minister has expressed concerns. Given these concerns, it is essential that the Government confirms how it will monitor the attainment of children on free school meals in the EBac.’

This policy will not secure universal access to triple science, though it seems likely that the Government will continue to support that in parallel.

In the final days of the Coalition government, a parliamentary answer said that:

‘Out of 3,910 mainstream secondary schools in England with at least one pupil at the end of key stage four, 2,736 schools entered at least one pupil for triple science GCSEs in 2013/14. This figure does not include schools which offered triple science GCSEs, but did not enter any pupils for these qualifications in 2013/14. It also excludes those schools with no pupils entered for triple science GCSEs but where pupils have been entered for all three of GCSE science, GCSE further science and GCSE further additional science, which together cover the same content as GCSE triple science.

The Government is providing £2.6 million in funding for the Triple Science Support Programme over the period 2014-16. This will give state funded schools with low take up of triple science practical support and guidance on providing triple science at GCSE. The support comprises professional development for teachers, setting up networks of schools to share good practice and advice on how to overcome barriers to offering triple science such as timetabling and lack of specialist teachers.’

The Conservative manifesto said:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables…We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

Continued emphasis on triple science seems highly likely, although this will contribute to wider pressures on teacher supply and recruitment.

The recommendation for an additional accountability measure is sound. There is, after all, a high attainer measure within the primary headline package, though it has not yet been defined beyond:

‘x% of pupils achieve a very high score in their age 11 assessments’.

In its response to consultation on secondary accountability arrangements, the previous government argued that high attainment would feature in the now defunct Data Portal intended to support the performance tables.

It will be important to ensure consistency between primary and secondary measures. The primary measure seems to be based on attainment rather than progress. The Sutton Trust seems convinced that the secondary equivalent should be a progress measure (Progress 8) but does not offer any justification for this.

It is also critical that the selected measures are reported separately for disadvantaged and all other learners, so that the size of the excellence gap is explicit.

.

Prospects for a new national programme

When it comes to the recommendation for a new national programme, the Trust needs to be clearer and more explicit about the fundamental design features.

The recommendations in the Mobility Manifesto and this latest publication are not fully consistent. No effort is made to cost these proposals, to identify the budgets that will support them, or to make connections with the Government’s wider education policy.

Piecing the two sets of recommendations together, it appears that:

  • The programme would cater exclusively for the top decile of high attainers in the state-funded secondary sector. Post-16 institutions and selective schools may or may not be included.
  • Participation would be determined entirely on the basis of KS2 test outcomes, but it is not clear whether learners would remain within the programme regardless of subsequent progress.
  • The programme would comprise two parallel arms – one providing support directly for learners, the other improving the quality of provision for them within their schools and colleges.
  • The support for learners is not defined, but would presumably draw on existing Trust programmes. It would include ‘extra-curricular support to raise horizons and aspirations’.
  • It is not entirely clear whether this support would be available exclusively to those from disadvantaged backgrounds (though we know it would be ‘accessible to every state-funded secondary school serving areas of disadvantage’).
  • The support for schools and colleges will develop and test effective practice in teaching these learners, in tracking and maximising their attainment and progress. It will provide associated professional development. It is not clear whether this will extend into other dimensions of effective whole school provision.
  • Delivery will be via some combination of a network of universities, a cadre of exemplar schools and other partners with expertise. The interaction between these different providers is not discussed.
  • The exemplar schools will be designated as such and will support other schools in their locality where high attainers under-achieve. They should also be ‘invited to consider’ delivering a programme of extra-curricular support for learners in their area.
  • There will also be an unspecified ‘one-off centralised support mechanism’ for areas with no exemplary schools. What this means is a mystery.
  • Costs will be met from a new ring-fenced ‘highly able fund’, the size of which is not quantified.

The relationship between this programme and the Trust’s proposed ‘Open Access Scheme’ – which would place high attaining students in independent schools – is not discussed. (I will not repeat again my arguments against this Scheme.)

The realistic prospect of securing a sufficiently large ring-fenced pot must be negligible in the present funding environment. Labour’s pre-election commitment to find some £15m (annually?) for this purpose is unlikely to be matched by the Conservatives.

Any support for improving the quality of provision in schools is likely to be found within existing budgets, including those supporting research, professional development, teaching schools, their alliances and their designated Specialist Leaders of Education.

STEM-related initiatives are particularly relevant given the Manifesto reference. One would hope for a systematic and co-ordinated approach rather than the piecemeal introduction of new projects.

I have elsewhere suggested a set of priorities including:

  • Guidance and associated professional development on effective whole school provision derived from a set of core principles, including the adoption of flexible, radical and innovative grouping arrangements.
  • Developing a coherent strategy for strengthening the STEM talent pipeline which harnesses the existing infrastructure and makes high quality support accessible to all learners regardless of the schools and colleges they attend.
  • Establishing centres of excellence and a stronger cadre of expert teachers, but also fostering system-wide partnership and collaboration by including the range of expertise available outside schools.

If funding is to go towards improving provision for learners, the only viable option is to use pupil premium, with the consequence that support will be targeted principally, if not exclusively, at disadvantaged high attainers.

I have elsewhere suggested a programme designed to support all such learners aged 11-18 located in state-funded schools and colleges. There is both wider reach and less deadweight if support is targeted at all eligible learners, rather than at schools ‘serving areas of disadvantage’.

It is critical to include the post-16 sector, given the significant proportion of disadvantaged high attainers who transfer post-GCSE.

This would be funded principally by a £50m topslice from the pupil premium budget (matching the topslice taken to support Y6/7 summer schools), though higher education outreach budgets would also contribute and there would be scope to attract additional philanthropic support.

The over-riding priority is to bring much-needed coherence to what is currently a fragmented market, enabling:

  • Learners to undertake a long-term support programme, tailored to their needs and drawing on the vast range of services offered by a variety of different providers, including universities, commercial and third sector organisations (such as the Trust itself).
  • These providers to position and market their services within a single online national prospectus, enabling them to identify gaps on the supply side and take action to fill them.
  • A single, unified, system-wide effort, harmonising the ‘pull’ from higher education fair access strategies and the ‘push’ from schools’ and colleges’ work to close excellence gaps.

I don’t yet recognise this coherence in the Trust’s preferred model.

.

GP

June 2015

A Digression on Breadth, Depth, Pace and Mastery

.


For a more recent post on these issues, go here

This post explores the emerging picture of mastery-based differentiation for high attainers and compares it with a model we used in the National G&T Programme, back in the day.

It is a rare venture into pedagogical territory by a non-practitioner, so may not bear close scrutiny from the practitioner’s perspective. But it seeks to pose intelligent questions from a theoretical position and so promote further debate.

.

Breadth, depth and pace

. 

Quality standards

In the original National Quality Standards in Gifted and Talented Education (2005) one aspect of exemplary ‘Effective Provision in the Classroom’ was:

‘Teaching and learning are suitably challenging and varied, incorporating the breadth, depth and pace required to progress high achievement. Pupils routinely work independently and self-reliantly.’

In the 2010 version it was still in place:

‘Lessons consistently challenge and inspire pupils, incorporating the breadth, depth and pace required to support exceptional rates of progress. Pupils routinely work creatively, independently and self-reliantly.’

These broad standards were further developed in the associated Classroom Quality Standards (2007) which offered a more sophisticated model of effective practice.

The original quality standards were developed by small expert working groups, reporting to wider advisory groups, and were carefully trialled in primary and secondary classrooms.

They were designed not to be prescriptive but, rather, to provide a flexible framework within which schools could develop and refine their own preferred practice.

Defining the terms

What did we mean by breadth, depth and pace?

  • Breadth (sometimes called enrichment) gives learners access to additional material beyond the standard programme of study. They might explore additional dimensions of the same topic, or an entirely new topic. They might need to make cross-curricular connections, and/or to apply their knowledge and skills in an unfamiliar context.
  • Depth (sometimes called extension) involves delving further into the same topic, or considering it from a different perspective. It might foreground problem solving. Learners might need to acquire new knowledge and skills and may anticipate material that typically occurs later in the programme of study.
  • Pace (sometimes called acceleration) takes two different forms. It may be acceleration of the learner, for example advancing an individual to a higher year group in a subject where they are particularly strong. More often, it is acceleration of the learning, enabling learners to move through the programme of study at a relatively faster pace than some or all of their peers. Acceleration of learning can take place at a ‘micro’ level in differentiated lesson planning, or in a ‘macro’ sense, typically through setting. Both versions of acceleration will cause the learner to complete the programme of study sooner and they may be entered early for an associated test or examination.

It should be readily apparent that these concepts are not distinct but overlapping. There might be an element of faster pace in extension, for example, or increased depth in acceleration. A single learning opportunity may include two, or possibly all three. It is not always straightforward to disentangle them completely.

Applying these terms

From the learner’s perspective, one of these three elements can be dominant, with the preferred strategy determined by that learner’s attainment, progress and wider needs.

  • Enrichment might be dominant if the learner is an all-rounder, relatively strong in this subject but with equal or even greater strength elsewhere.
  • Extension might be dominant if the learner shows particular aptitude or interest in specific aspects of the programme of study.
  • Acceleration might be dominant if the learner is exceptionally strong in this subject, or has independently acquired and introduced knowledge or skills that are not normally encountered until later in this or a subsequent key stage.

Equally though, the richest learning experience is likely to involve a blend of all three elements in different combinations: restricting advanced learners to one or two of them might not always be in their best interests. Moreover, some high attainers will thrive with a comparatively ‘balanced scorecard’.

The intensity or degree of enrichment, extension or acceleration will also vary according to the learners’ needs. Even in a top set, decisions about how broadly to explore, how deeply to probe or how far and how fast to press forward must reflect their starting points and the progress achieved to date.

Acceleration of the learner may be appropriate if he or she is exceptionally advanced.  Social and emotional maturity will need to be taken into account, but all learners are different – this should not be used as a blanket excuse for failing to apply the approach.

There must be evidence that the learner is in full command of the programme of study to date and that restricting his or her pace is having a detrimental effect. A pedagogical preference for moving the class along at the same pace should never override the learner’s needs.

Both variants of acceleration demand careful long-term planning, so the learner can continue on a fast track where appropriate, or step off without loss of esteem. It will be frustrating for a high attainer expected to ‘mark time’ when continuity is lost. This may be particularly problematic on transfer and transition between settings.

Careful monitoring is also required, to ensure that the learner continues to benefit, is comfortable and remains on target to achieve the highest grades. No good purpose is served by ‘hothousing’.

Mastery and depth

The Expert Panel

The recent evolution of a mastery approach can be traced back to the Report of the Expert Panel for the National Curriculum Review (December 2011).

‘Amongst the international systems which we have examined, there are several that appear to focus on fewer things in greater depth in primary education, and pay particular attention to all pupils having an adequate understanding of these key elements prior to moving to the next body of content – they are ‘ready to progress’…

… it is important to understand that this model applies principally to primary education. Many of the systems in which this model is used progressively change in secondary education to more selective and differentiated routes. Spread of attainment then appears to increase in many of these systems, but still with higher overall standards than we currently achieve in England…

There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others

These views cohere with our notion of a revised model that focuses on inclusion, mastery and progress. However, more work needs to be done around these issues, both with respect to children with learning difficulties and those regarded as high attainers.’

For reasons best known to itself, the Panel never undertook that further work in relation to high attainers, or at least it was never published. This has created a gap in the essential groundwork necessary for the adoption of a mastery-driven approach.

 .

National curriculum

Aspects of this thinking became embodied in the national curriculum, but there are some important checks and balances.

The inclusion statement requires differentiation for high attainers:

‘Teachers should set high expectations for every pupil. They should plan stretching work for pupils whose attainment is significantly above the expected standard.’

The primary programmes of study for all the core subjects remind everyone that:

‘Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage, if appropriate.’

But, in mathematics, both the primary and secondary PoS say:

‘The expectation is that the majority of pupils will move through the programmes of study at broadly the same pace. However, decisions about when to progress should always be based on the security of pupils’ understanding and their readiness to progress to the next stage. Pupils who grasp concepts rapidly should be challenged through being offered rich and sophisticated problems before any acceleration through new content. Those who are not sufficiently fluent with earlier material should consolidate their understanding, including through additional practice, before moving on.’

These three statements are carefully worded and, in circumstances where all apply, they need to be properly reconciled.

.

NCETM champions the maths mastery movement

The National Centre for Excellence in the Teaching of Mathematics (NCETM), a Government-funded entity responsible for raising levels of achievement in maths, has emerged as a cheerleader for and champion of a maths mastery approach.

It has published a paper ‘Mastery approaches to mathematics and the new national curriculum’ (October 2014).

Its Director, Charlie Stripp, has also written two blog posts on the topic:

The October 2014 paper argues (my emphasis):

‘Though there are many differences between the education systems of England and those of east and south-east Asia, we can learn from the ‘mastery’ approach to teaching commonly followed in these countries. Certain principles and features characterise this approach…

… The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.’

It continues:

‘Taking a mastery approach, differentiation occurs in the support and intervention provided to different pupils, not in the topics taught, particularly at earlier stages. There is no differentiation in content taught, but the questioning and scaffolding individual pupils receive in class as they work through problems will differ, with higher attainers challenged through more demanding problems which deepen their knowledge of the same content.’

In his October 2014 post, Stripp opines:

‘Put crudely, standard approaches to differentiation commonly used in our primary school maths lessons involve some children being identified as ‘mathematically weak’ and being taught a reduced curriculum with ‘easier’ work to do, whilst others are identified as ‘mathematically able’ and given extension tasks….

…For the children identified as ‘mathematically able’:

  1. Extension work, unless very skilfully managed, can encourage the idea that success in maths is like a race, with a constant need to rush ahead, or it can involve unfocused investigative work that contributes little to pupils’ understanding. This means extension work can often result in superficial learning. Secure progress in learning maths is based on developing procedural fluency and a deep understanding of concepts in parallel, enabling connections to be made between mathematical ideas. Without deep learning that develops both of these aspects, progress cannot be sustained.
  2. Being identified as ‘able’ can limit pupils’ future progress by making them unwilling to tackle maths they find demanding because they don’t want to challenge their perception of themselves as being ‘clever’ and therefore finding maths easy….

…I do think much of what I’m saying here also applies at secondary level.

Countries at the top of the table for attainment in mathematics education employ a mastery approach to teaching mathematics. Teachers in these countries do not differentiate their maths teaching by restricting the mathematics that ‘weaker’ children experience, whilst encouraging ‘able’ children to ‘get ahead’ through extension tasks… Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace…’

The April 2015 post continues in a similar vein, commenting directly on the references in the PoS quoted above (my emphases):

‘The sentence: ‘Pupils who grasp concepts rapidly should be challenged through rich and sophisticated problems before any acceleration through new content’, directly discourages acceleration through content, instead requiring challenge through ‘rich and sophisticated (which I interpret as mathematically deeper) problems’. Engaging with ‘rich and sophisticated problems’ involves reasoning mathematically and applying maths to solve problems, addressing all three curriculum aims. All pupils should encounter such problems; different pupils engage with problems at different depths, but all pupils benefit

…Meeting the needs of all pupils without differentiation of lesson content requires ensuring that both (i) when a pupil is slow to grasp an aspect of the curriculum, he or she is supported to master it and (ii) all pupils should be challenged to understand more deeply…

The success of teaching for mastery in the Far East (and in the schools employing such teaching here in England) suggests that all pupils benefit more from deeper understanding than from acceleration to new material. Deeper understanding can be achieved for all pupils by questioning that asks them to articulate HOW and WHY different mathematical techniques work, and to make deep mathematical connections. These questions can be accessed by pupils at different depths and we have seen the Shanghai teachers, and many English primary teachers who are adopting a teaching for mastery approach, use them very skilfully to really challenge even the highest attaining pupils.’

The NCETM is producing guidance on assessment without levels, showing how to establish when a learner

‘…has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

.

Commentary

NCETM wants to establish a distinction between depth via problem-solving (good) and depth via extension tasks (bad).

There is some unhelpful terminological confusion in the assumption that extension tasks necessarily require learners to anticipate material not yet covered by the majority of the class.

Leaving that aside, notice how the relatively balanced wording in the programme of study is gradually adjusted until the balance has disappeared.

The PoS says ‘the majority of pupils will move through…at broadly the same pace’ and that they ‘should be challenged through being offered rich and sophisticated problems before any acceleration through new content’.

This is first translated into

‘…the large majority of pupils progress through the curriculum content at the same pace’ (NCETM paper) then it becomes

‘…expose almost all of the children to the same curriculum content at the same pace’ (Stripp’s initial post) and finally emerges as

‘Meeting the needs of all pupils without differentiation of lesson content’ and

‘…all pupils benefit more from deeper understanding than from acceleration to new material.’ (Stripp’s second post).

Any non-mathematician will tell you that the difference between the majority (over 50%) and all (100%) may be close to 50%.

Such a minority could very comfortably include all children achieving L3 equivalent at KS1 or L5 equivalent at KS2, or all those deemed high attainers in the Primary and Secondary Performance Tables.

The NCETM pretends that this minority does not exist.

It does not consider the scope for acceleration towards new content subsequent to the delivery of ‘rich and sophisticated problems’.

Instead it argues that the statement in the PoS ‘directly discourages acceleration through content’ when it does no such thing.

This is propaganda, but why is NCETM advancing it?

One possibility, not fully developed in these commentaries, is the notion that teachers find it easier to work in this way. In order to be successful, ‘extension work’ demands exceptionally skilful management.

On the other hand, Stripp celebrates the fact that Shanghai teachers:

‘…were very skilled at questioning and challenging children to engage more deeply with maths within the context of whole class teaching.’

It is a moot point whether such questioning, combined with the capacity to develop ‘rich and sophisticated problems’, is any more straightforward for teachers to master than the capacity to devise suitable extension tasks, especially when one approach is relatively more familiar than the other.

Meanwhile, every effort is made to associate maths mastery with other predilections and prejudices entertained by educational professionals:

  • The claim that it will have a positive impact on teacher workload, though no evidence – real or imagined – is cited to support this belief.
  • The belief that all children can be successful at maths (though with no acknowledgement that some will always be comparatively more successful than others) and an associated commitment to ‘mindset’, encouraging learners to associate success with effort and hard work rather than underlying aptitude.
  • The longstanding opposition of many in the maths education community to any form of acceleration, fuelled by alarming histories of failed prodigies at one extreme and poorly targeted early entry policies at the other. (I well remember discussing this with them as far back as the nineties.)
  • The still contested benefits of life without levels.

On this latter point, the guidance NCETM is developing appears to assume that ‘exceeding national expectations’ in maths must necessarily involve ‘working deeper’.

I have repeatedly argued that, for high attainers, such measures should acknowledge the potential contributions of breadth, depth and pace.

Indeed, following a meeting and email exchanges last December, NAHT said it wanted to employ me to help develop such guidance, as part of its bigger assessment package.

(Then nothing more – no explanation, no apology, zilch. Shame on you, Mr Hobby. That’s no way to run an organisation.)

.

Conclusion

Compared with the richness of the tripartite G&T model, the emphasis placed exclusively on depth in the NCETM mastery narrative seems relatively one-dimensional and impoverished.

There is no great evidence in this NCETM material of a willingness to develop an alternative understanding of ‘stretch and challenge’ for high attainers.  Vague terms like  ‘intelligent practice’, ‘deep thinking’ and ‘deep learning’ are bandied about like magical incantations, but what do they really mean?

NCETM needs to revisit the relevant statement in the programme of study and strip away (pun intended) the ‘Chinese whispers’ (pun once more intended) in which they have cocooned it.

Teachers following the maths mastery bandwagon need meaningful free-to-access guidance that helps them to construct suitably demanding and sophisticated problems and to deploy advanced questioning techniques that get the best out of their high attainers.

I do not dismiss the possibility that high attainers can thrive under a mastery model that foregrounds depth over breadth and pace, but it is a mistake to neglect breadth and pace entirely.

Shanghai might be an exception, but most of the other East Asian cradles of mastery also run parallel gifted education programmes in which accelerated maths is typically predominant. I’ve reviewed several on this Blog.

For a more recent treatment of these issues see my September 2015 post here.

.

GP

April 2015

Has Ofsted improved inspection of the most able?

.

This post examines the quality of Ofsted reporting on how well secondary schools educate their most able learners.

The analysis is based on a sample of 87 Section 5 inspection reports published during March 2015.

I have compared the results with those obtained from a parallel exercise undertaken a year ago and published in How well is Ofsted reporting on the most able? (May 2014).

This new post considers how inspectors’ assessments have changed in the light of their increased experience, additional guidance and – most recently – the publication of Ofsted’s survey report: The most able students: An update on progress since June 2013.

This appeared on 4 March 2015, at the beginning of my survey period, although it was heralded in HMCI’s Annual Report and the various supporting materials published alongside it in December 2014. One might therefore expect it to have had an immediate effect on inspection practice.

Those seeking further details of either of these publications are cordially invited to consult the earlier posts I dedicated to them:

The organisation of this post is straightforward.

The first section considers how Ofsted expects its inspectors to report on provision for the most able, as required by the current Inspection Handbook and associated guidance. It also explores how those expectations were intended to change in the light of the Update on Progress.

Subsequent sections set out the findings from my own survey:

  • The nature of the 2015 sample – and how this differs from the 2014 sample
  • Coverage in Key Findings and Areas for Improvement
  • Coverage in the main body of reports, especially under Quality of Teaching and Achievement of Pupils, the sections that most commonly feature material about the most able

The final section follows last year’s practice in offering a set of key findings and areas for improvement for consideration by Ofsted.

I have supplied page jumps to each section from the descriptions above.

How inspectors should address the most able

.

Definition and distribution

Ofsted nowhere explains how inspectors are to define the most able. It is not clear whether inspectors permit schools to supply their own definitions, or else apply the distinctions adopted in Ofsted’s survey reports. This is not entirely helpful to schools.

In the original survey – The most able students: Are they doing as well as they should in our non-selective secondary schools? (June 2013) – Ofsted described the most able as:

‘…the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

The measure of potential is not defined, but an example is given, of EAL students who are new to the country and so might not (yet) have achieved Level 5.

In the new survey prior attainment at KS2 remains the indicator, but the reference to potential is dropped:

‘…students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2’

The size of this group varies at national level according to the year group.

If we take learners in Year 7 who completed KS2 in 2014, the data shows that 24% achieved KS2 Level 5 in both English (reading and writing) and maths. A further 5% secured L5 in English (reading and writing) only, while another 20% reached L5 in maths only.

So 49% of the present Year 7 are deemed high attainers.
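For anyone wanting to check the arithmetic, a minimal sketch: the three Venn segments are mutually exclusive, so Ofsted’s cohort is a straight sum of the percentages quoted above.

```python
# KS2 Level 5 attainment among the 2014 Year 7 cohort, as percentages.
# The three segments are disjoint, so the union is a simple sum.
both_english_and_maths = 24  # L5 in English (reading and writing) AND maths
english_only = 5             # L5 in English (reading and writing) only
maths_only = 20              # L5 in maths only

most_able_share = both_english_and_maths + english_only + maths_only
print(f"{most_able_share}% of Year 7 fall within Ofsted's definition")  # 49%
```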

.

[Venn diagram: proportions of the 2014 Year 7 cohort attaining KS2 Level 5 in English (reading and writing) and/or mathematics]

But this proportion falls to about 40% amongst those who completed KS4 in 2014 and so typically undertook KS2 assessment five years earlier in 2009.

Ofsted’s measure is different to the definition adopted in the Secondary Performance Tables which, although also based on prior attainment at KS2, depends on an APS of 30 or higher in KS2 tests in the core subjects.

Only ‘all-rounders’ count according to this definition, while Ofsted includes those who are relatively strong in either maths or English but who might be weak in the other subject. Neither approach considers achievement beyond the core subjects.

According to the Performance Tables definition, amongst the cohort completing KS4 in 2014, only 32.3% of those in state-funded schools were deemed high attainers, some eight percentage points lower than Ofsted’s figure.

The sheer size of Ofsted’s most able cohort will be surprising to some, who might naturally assume a higher hurdle and a correspondingly smaller group. The span of attainment it covers is huge, from one L5C (possibly paired with an L3) to three L6s.

But the generosity of Ofsted’s assumptions does mean that every year group in every school should contain at least a handful of high attainers, regardless of the characteristics of its intake.

Unfortunately, Ofsted’s survey report does not say exactly how many schools have negligible numbers of high attainers, telling us only how many non-selective schools had at least one pupil in their 2014 GCSE cohort with the requisite prior attainment in English, in maths and in both English and maths.

In each case some 2,850 secondary schools had at least one student within scope. This means that, for each category taken separately, some 9% of schools had no students in it, but we have no way of establishing how many had no students in all three categories.
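We can at least put bounds on that unknown figure. A minimal sketch, assuming each category’s ‘no students’ share is the roughly 9% implied above (these are the standard Fréchet bounds on the intersection of three overlapping sets):

```python
# Approximate share of schools with no students in scope, per category
p_none = {"English": 0.09, "maths": 0.09, "both": 0.09}

# The share with no students in ALL three categories cannot exceed the
# smallest single share, nor fall below sum(shares) - (k - 1) for k sets.
lower = max(0.0, sum(p_none.values()) - (len(p_none) - 1))
upper = min(p_none.values())
print(f"between {lower:.0%} and {upper:.0%}")  # between 0% and 9%
```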

Using the rival Performance Table definition, only some 92 state-funded non-selective secondary schools reported a 2014 GCSE cohort with 10% or fewer high attainers. The lowest recorded percentage is 3% and, of those with 5% or fewer, the number of high attaining students ranges from 1 to 9.

Because Ofsted’s definition is more liberal, one might reasonably assume that every secondary school has at least one high-attaining student per year group, though there will be a handful of schools with very few indeed.

At the other extreme, according to the Performance Tables definition, over 100 state-funded non-selective schools can boast a 2014 GCSE population where high attainers are in the majority – and the highest recorded percentage for a state-funded comprehensive is 86%. Using Ofsted’s measure, the number of schools in this position will be substantially higher.

For the analysis below, I have linked the number of high attainers (according to the Performance Tables) in a school’s 2014 GCSE cohort with the outcomes of inspection, so as to explore whether there is a relationship between these two variables.
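In practice that linkage is a simple join on each school’s unique reference number. A minimal sketch of the approach using pandas, with hypothetical column names (the real data would come from the published Performance Tables and from the inspection sample described later):

```python
import pandas as pd

# Hypothetical extracts: the Performance Tables supply each school's high
# attainer percentage; the inspection sample supplies the overall grade.
perf = pd.DataFrame({"urn": [1001, 1002, 1003],
                     "pct_high_attainers": [33, 23, 47]})
inspections = pd.DataFrame({"urn": [1001, 1002, 1003],
                            "overall_grade": ["Good", "Requires improvement",
                                              "Outstanding"]})

# Join the two sources on the unique reference number, then average the
# high attainer percentage by inspection grade
linked = perf.merge(inspections, on="urn")
print(linked.groupby("overall_grade")["pct_high_attainers"].mean())
```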

Framework and Handbook

The current Framework for School Inspection (December 2014) makes no reference to the most able.

Inspectors must consider:

‘…the extent to which the education provided by the school meets the needs of the range of pupils at the school, and in particular the needs of disabled pupils and those who have special educational needs.’

One of the principles of school inspection is that it will:

‘focus on pupils’ and parents’ needs by…evaluating the extent to which schools provide an inclusive environment that meets the needs of all pupils, irrespective of age, disability, gender, race, religion or belief, or sexual orientation’.

Neither ability nor attainment is mentioned. This may or may not change when the Common Inspection Framework is published.

The most recent version of the School Inspection Handbook (December 2014) has much more to say on the issue. All relevant references in the main text and in the grade descriptors are set out in the Annex at the end of this post.

Key points include:

  • Ofsted uses inconsistent terminology (‘most able’, ‘more able’, ‘highest attainers’) without distinguishing between these terms.
  • Most of the references to the most able occur in lists of different groups of learners, another of which is typically ‘disadvantaged pupils’. This gives the mistaken impression that the two groups are distinct – that there is no such thing as a most able disadvantaged learner.
  • The Common Inspection Framework will be supported by separate inspection handbooks for each sector. The consultation response does not mention any revisions relating to the most able; neither does the March 2015 survey report say that revisions will be introduced in these handbooks to reflect its findings and recommendations (but see below). 

.

Guidance

Since the first survey report was published in 2013, several pieces of guidance have been issued to inspectors.

  • In Schools and Inspection (October 2013), inspectors’ attention is drawn to key revisions to the section 5 inspection framework:

‘In judging the quality of teaching…Inspectors will evaluate how teaching meets the needs of, and provides appropriate challenge to, the most able pupils. Underachievement of the most able pupils can trigger the judgements of inadequate achievement and inadequate teaching.’

In relation to report writing:

‘Inspectors are also reminded that they should include a short statement in the report on how well the most able pupils are learning and making progress and the outcomes for these pupils.’

  • In Schools and Inspection (March 2014) several amendments are noted to Section 5 inspection and report writing guidance from January of that year, including:

‘Most Able – Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

Moreover, for secondary schools:

‘There must be a comment on early entry for GCSE examinations. Where the school has an early entry policy, inspectors must be clear on whether early entry is limiting the potential of the most able pupils. Where early entry is not used, inspectors must comment briefly to that effect.’

  • In School Inspection Update (December 2014) Ofsted’s National Director, Schools reminds inspectors, following the first of a series of half-termly reviews of ‘the impact of policy on school inspection practice’, to:

‘…place greater emphasis, in line with the handbook changes from September, on the following areas in section 5 inspection reports…The provision and outcomes for different groups of children, notably the most-able pupils and the disadvantaged (as referred to in the handbook in paragraphs 40, 129, 137, 147, 155, 180, 186, 194, 195, 196, 207, 208, 210 and 212).’

HMCI’s Annual Report

The 2014 Annual Report said (my emphasis):

‘Ofsted will continue to press schools to stretch their most able pupils. Over the coming year, inspectors will be looking at this more broadly, taking into account the leadership shown in this area by schools. We will also further sharpen our recommendations so that schools have a better understanding of how they can help their most able pupils to reach their potential.’

HMCI’s Commentary on the Report  added for good measure:

‘In the year ahead, Ofsted will look even more closely at the performance of the brightest pupils in routine school inspections.’

So we are to expect a combination of broader focus, closer scrutiny and sharper recommendations.

The Annual Report relates to AY2013/14 and was published at the end of the first term of AY2014/15 and the end of calendar year 2014, so one assumes that references to the ‘coming year’ and ‘the year ahead’ are to calendar year 2015.

We should be able to see the impact of this ramping up in the sample I have selected, but some further change is also likely.

March 2015 survey report

One of the key findings from the March 2015 survey was (my emphasis):

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

Ofsted directed three recommendations at itself which do not altogether reflect this (my emboldening):

‘Ofsted should:

  • Make sure that inspections continue to focus sharply on the progress made by students who are able and disadvantaged.
  • Report more robustly about how well schools promote the needs of the most able through the quality of their curriculum and the information, advice and guidance they offer to the most able students.
  • Ensure thematic surveys investigate, where appropriate, how well the most able are supported through, for example, schools’ use of the pupil premium and the curriculum provided.’

The first of these recommendations implies that inspections already focus sufficiently on the progress of able and disadvantaged learners, and therefore that no further change is necessary – an assumption tested in the analysis below.

The third alludes to the most able disadvantaged but relates solely to thematic surveys, not to Section 5 inspection reports.

The second may imply that further emphasis will be placed on inspecting the appropriateness of the curriculum and IAG. Both of these topics seem likely to feature more strongly in a generic sense in the new Framework and Handbooks. One assumes that this will be extended to the most able, amongst other groups.

Though not mentioned in the survey report, we do know that Ofsted is preparing an evaluation toolkit. This was mentioned in a speech given by its Schools Director almost immediately after publication:

‘In this region specifically, inspectors have met with headteachers to address the poor achievement of the brightest disadvantaged children.

And inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals.’

It is not clear from this whether the toolkit will be confined only to the most able disadvantaged or will have wider coverage.

Moreover, this statement raises the prospect that the toolkit might be similar in style to The Pupil Premium: Analysis and challenge tools for schools (January 2013). This is more akin to an old spanner than a Swiss army penknife. Anything of this nature would be rather less helpful than the term ‘toolkit’ implies.

At his request, I emailed Ofsted’s Director, Schools with questions on 21 March 2015. I requested further details of the toolkit. At the time of writing I have still to receive a reply.

.

The sample

I have selected an almost identical sample to that used in my 2014 analysis, one year on. It includes the 87 Section 5 inspection reports on secondary schools (excluding middle schools deemed secondary) that were published by Ofsted in the month of March 2015.

The bulk of the inspections were undertaken in February 2015, though a few took place in late January or early March.

Chart 1 gives the regional breakdown of the schools in the sample. All nine regions are represented, though there are only five schools from the North East, while Yorkshire and Humberside boasts 15. There are between seven and 11 schools in each of the other regions. In total 59 local authorities are represented.

In regional terms, this sample is more evenly balanced than the 2014 equivalent and the total number of authorities is two higher.

 .


Chart 1: Schools within the sample by region

Chart 2 shows how different statuses of school are represented within the sample.

All are non-selective. Fifty-three schools (61%) are academies, divided almost equally between the sponsored and converter varieties.

Community and foundation schools together form a third group of equivalent size, while the seven remaining schools have voluntary status, just one of them voluntary controlled. There are no free schools.

.


Chart 2: Schools within the sample by status

.

All but three of the schools are mixed – and those three are boys’ schools.

As for age range, there is one 13-18 and one 14-18 school. Otherwise there are 32 11-16 institutions (37% of the sample) while the remaining 53 (61%) are 11-18 or 11-19 institutions.

Chart 3 shows the variation in numbers on roll. The smallest school – a new 11-18 secondary school – has just 125 pupils; the largest has 2,083. The average is 912.

Fifty-two schools (60%) have between 600 and 1,200 pupils, and twenty-three (26%) between 800 and 1,000.

.


Chart 3: Schools within the sample by NOR

. 

Chart 4 shows the overall inspection grade of schools within the sample. A total of 19 schools (22%) are rated inadequate, seven of them attracting special measures. Only nine (10%) are outstanding, while 27 (31%) are good and 32 (37%) require improvement.

This is very similar to the distribution in the 2014 sample, except that there are slightly more inadequate schools and slightly fewer requiring improvement.

.


Chart 4: Schools within the sample by overall inspection grade

Unlike the 2014 analysis, I have also explored the distribution of all grades within reports. The results are set out in Chart 5.

Schools in the sample are relatively more secure on Leadership and management (55% outstanding or good) and Behaviour and safety of pupils (60% outstanding or good) than they are on Quality of teaching (43% outstanding or good) and Achievement of pupils (41% outstanding or good).

.


Chart 5: Schools within the sample by inspection sub-grades

Another new addition this year is comparison with the number and percentage of high attainers.

Amongst the sample, the number of high attainers in the 2014 GCSE cohort varied from three to 196 and the percentage from 3% to 52%. (Two schools did not have a GCSE cohort in 2014.)

These distributions are shown on the scatter charts 6 and 7, below.

Chart 6 (number) shows one major outlier at the top of the distribution. Almost two thirds of the sample (64%) record numbers between 20 and 60. The average number is 41.

.


Chart 6: Schools within the sample by number of high attainers (Secondary Performance Tables measure)

. 

Chart 7 again has a single outlier, this time at the bottom of the distribution. The average is 32%, slightly less than the 32.3% reported for all state-funded schools in the Performance Tables.

Two in five of the sample register a high attainer percentage of between 20% and 30%, while three in five register between 20% and 40%.

But almost a third have a high attainer population of 20% or lower.

.


Chart 7: Schools within the sample by percentage of high attainers (Secondary Performance Tables measure)

Out of curiosity, I compared the overall inspection grade with the percentage of high attainers.

  • Amongst the nine outstanding schools, the percentage of high attainers ranged from 22% to 47%, averaging 33% (there was also one without a high attainer percentage).
  • Amongst the 27 good schools, the percentage of high attainers was between 13% and 52% (plus one without a high attainer percentage) and averaged 32%.
  • Amongst the 32 schools requiring improvement, the percentage of high attainers varied between 3% and 40% and averaged 23%.
  • Amongst the 19 inadequate schools, the percentage of high attainers lay between 10% and 38% and also averaged 23%.

This may suggest a tendency for outstanding/good schools to have a somewhat larger proportion of high attainers than schools judged to be requiring improvement or inadequate.

Key findings and areas for improvement

.

Distribution of comments

Thirty-nine of the reports in the sample (45%) address the most able in the Summary of key findings, while 33 (38%) do so in the section about what the school needs to do to improve further.

In 24 cases (28%) there were entries in both these sections, but in 39 of the reports (45%) there was no reference to the most able in either section.
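Those four counts hang together, as a quick inclusion-exclusion check confirms:

```python
total = 87
key_findings = 39   # most able addressed in the Summary of key findings
improvement = 33    # addressed in 'what the school needs to do to improve'
both = 24           # addressed in both sections

either = key_findings + improvement - both  # 48 reports mention it somewhere
neither = total - either                    # 39 reports mention it nowhere
print(either, neither, f"{neither / total:.0%}")  # 48 39 45%
```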

In 2014, 34% of reports in the sample addressed the issue in both the main findings and recommendations and 52% mentioned it in neither of these sections.

These percentage point changes are not strongly indicative of an extended commitment to this issue.

In the 2015 sample it was rather more likely for a reference to appear in the key findings for community schools (53%) and foundation schools (50%) than it was for converter academies (44%), sponsored academies (42%) or voluntary schools (29%).

Chart 8 shows the distribution of comments in these sections according to the overall inspection grade. In numerical terms, schools rated as requiring improvement overall are most likely to attract comments in both Key findings and Areas for improvement related to the most able.

.


Chart 8: Most able mentioned in key findings and areas for improvement by overall inspection grade (percentages)

.

But, when expressed as percentages of the total number of schools in the sample attracting these grades, it becomes apparent that the lower the grade, the more likely a school is to attract such a comment.

Of the 39 reports making reference in the key findings, 10 comments were positive, 28 were negative and one managed to be both positive and negative simultaneously:

‘While the most-able students achieve well, they are capable of even greater success, notably in mathematics.’ (Harewood College, Bournemouth)

.

Positive key findings

Five of the ten exclusively positive comments were directed at community schools.

The percentage of high attainers in the 2014 GCSE cohorts at the schools attracting positive comments varied from 13% to 52% and included three of the five schools with the highest percentages in the sample.

Interestingly, only two of the schools with positive comments received an overall outstanding grade, while three required improvement.

Examples of positive comments, which were often generic, include:

  • ‘The most able students achieve very well, and the proportion of GCSE A* and A grades is significantly above average across the curriculum.’ (Durham Johnston Comprehensive School, Durham)
  • ‘The most able students do well because they are given work that challenges them to achieve their potential’. (The Elton High School Specialist Arts College, Bury)
  • ‘Most able students make good progress in most lessons because of well-planned activities to extend their learning’. (Endon High School, Staffordshire)
  • ‘Teachers encourage the most able students to explore work in depth and to master skills at a high level’. (St Richard Reynolds Catholic High School, Richmond-upon-Thames).

Negative key findings

The distribution of the 28 negative comments in Key findings according to overall inspection grade was: Outstanding: nil; Good: five (19%); Requires improvement: twelve (38%); Inadequate: eleven (58%).

This suggests a relatively strong correlation between the quality of provision for the most able and the overall quality of the school.
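The quoted percentages are simply each grade’s negative comments divided by the number of schools awarded that grade, as this quick sketch of the sample counts shows:

```python
# (negative key findings, schools at that overall grade) from the sample
by_grade = {"Outstanding": (0, 9), "Good": (5, 27),
            "Requires improvement": (12, 32), "Inadequate": (11, 19)}

for grade, (negatives, schools) in by_grade.items():
    print(f"{grade}: {negatives}/{schools} = {negatives / schools:.0%}")
# Outstanding 0%, Good 19%, Requires improvement 38%, Inadequate 58%
```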

The proportion of high attainers in the 2014 GCSE cohorts of the schools attracting negative comments varied between 3% and 42%. All but three are below the national average for state-funded schools on this measure and half reported 20% or fewer high attainers.

This broadly supports the hypothesis that quality is less strong in schools where the proportion of high attainers is comparatively low.

Examples of typical negative comments:

  • ‘The most able students are not given work that is hard enough’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Too many students, particularly the most able, do not make the progress of which they are capable’ (New Line Learning Academy, Kent)
  • ‘Students, particularly the more able, make slower progress in some lessons where they are not sufficiently challenged. This can lead to some off task behaviour which is not always dealt with by staff’ (The Ferrers School, Northamptonshire)
  • ‘Teachers do not always make sufficient use of assessment information to plan work that fully stretches or challenges all groups of students, particularly the most able’ (Noel-Baker School, Derby).

The menu of shortcomings identified is limited, consisting of seven items: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information.

Of these, the most common comprise a familiar litany. They are (in descending order): 

  • Insufficiently challenging work 
  • Insufficient progress 
  • Underachievement and 
  • Low expectations.

Inspectors often point out inconsistent practice, though in the worst instances these shortcomings are dominant or even school-wide.

.

No key findings

Chart 9 shows the distribution of reports with no comments about the most able in Key findings and Areas for improvement according to overall inspection grade. When expressed as percentages, these again show that schools rated as outstanding are most likely to escape such comments, while inadequate schools are most likely to be in the firing line.

.


Chart 9: Most able not mentioned in key findings and areas for improvement by inspection grade (percentages)

This pattern replicates the findings from 2014. Orders of magnitude are also broadly comparable.  There is no substantive evidence of a major increase in emphasis from inspectors.

It seems particularly surprising that, in over half of schools requiring improvement and a third or more of inadequate schools, issues with educating the most able are still not significant enough to feature in these sections of inspection reports.

.

Areas for improvement

By definition, recommendations for improvement are always associated with identified shortcomings.

The correlation between key findings and areas for improvement is inconsistent. In six cases there were Key findings relating to the most able, but no area for improvement specifically associated with them. Conversely, nine reports had identified areas for improvement that were not picked up in the key findings.

Areas for improvement are almost always formulaic and expressed as lists: the school should improve x through y and z.

When it comes to the most able, the area for improvement is almost invariably teaching quality, though sometimes this is indicated as the route to higher achievement while on other occasions teaching quality and raising achievement are perceived as parallel priorities.

Just one report in the sample mentioned the quality of leadership and management:

‘Ensure that leadership and management take the necessary steps to secure a significant rise in students’ achievement at the end of Year 11 through…ensuring that work set for the most able is always sufficiently challenging’ (New Line Learning Academy, Kent).

This is despite the fact that leadership was specifically mentioned as a focus in HMCI’s Annual Report.

The actions needed to bring about improvement reflect the issues mentioned in the analysis of key findings above. The most common involve applying assessment information to planning and teaching:

  • ‘Raise students’ achievement and the quality of teaching further by ensuring that:…all staff develop their use of class data to plan learning so that students, including the most able, meet their challenging targets’ (Oasis Academy Isle of Sheppey, Kent)
  • ‘Ensure the quality of teaching is always good or better, in order to raise attainment and increase rates of progress, especially in English and mathematics, by:…ensuring teachers use all the information available to them to plan lessons that challenge students, including the most able’ (Oasis Academy Lister Park, Bradford)
  • ‘Embed and sustain improvements in achievement overall and in English in particular so that teaching is consistently good and outstanding by: making best use of assessment information to set work that is appropriately challenging, including for the least and most able students’ (Pleckgate High School Mathematics and Computing College, Blackburn with Darwen)

Other typical actions involve setting more challenging tasks, raising the level of questioning, providing accurate feedback, improving lesson planning and maintaining consistently high expectations.

.

Coverage in the main body of reports

.

Leadership and management

Given the reference to this in HMCI’s Annual Report, one might have expected a new and significant emphasis within this section of the reports in the sample.

In fact, the most able were only mentioned in this section in 13 reports (15% of the total). Hardly any of these comments identified shortcomings. The only examples I could find were:

  • ‘The most-able students are not challenged sufficiently in all subjects to achieve the higher standards of which they are capable’ (Birkbeck School and Community Arts College, Lincolnshire)
  • ‘Action to improve the quality of teaching is not focused closely enough on the strengths and weaknesses of the school and, as a result, leaders have not done enough to secure good teaching of students and groups of students, including…the most able’ (Ashington High School Sports College, Northumberland)

Inspectors are much more likely to accentuate the positive:

  • ‘The school has been awarded the Challenge Award more than once. This is given for excellent education for a school’s most-able, gifted and talented students and for challenge across all abilities. Representatives from all departments attend meetings and come up with imaginative ways to deepen these students’ understanding.’ (Cheam High School, Sutton)
  • ‘Leaders and governors are committed to ensuring equality of opportunity for all students and are making effective use of student achievement data to target students who may need additional support or intervention. Leaders have identified the need to improve the achievement of…the most-able in some subjects and have put in place strategies to do so’ (Castle Hall academy Trust, Kirklees)
  • ‘Measures being taken to improve the achievement of the most able are effective. Tracking of progress is robust and two coordinators have been appointed to help raise achievement and aspirations. Students say improvements in teaching have been made, and the work of current students shows that their attainment and progress is on track to reach higher standards.’ (The Byrchall High School, Wigan).

Not one report mentioned the role of governors in securing effective provision for the most able. 

Given how often school leadership escapes censure for issues identified elsewhere in reports, this outcome could be interpreted as somewhat complacent. 

HMCI is quite correct to insist that provision for the most able is a whole school issue and, as such, a school’s senior leadership team should be held to account for such shortcomings.

Behaviour and safety

The impact of under-challenging work on pupils’ behaviour is hardly ever identified as a problem.

One example has been identified in the analysis of Key findings above. Only one other report mentions the most able in this section, and the comment is about the role of the school council rather than behaviour per se:

‘The academy council is a vibrant organisation and is one of many examples where students are encouraged to take an active role in the life of the academy. Sixth form students are trained to act as mentors to younger students. This was seen being effectively employed to…challenge the most able students in Year 9’ (St Thomas More High School, Southend)

A handful of reports make some reference under ‘Quality of teaching’ but one might reasonably conclude that neither  bullying of the most able nor disruptive behaviour from bored high attainers is particularly widespread.

Quality of teaching

Statements about the most able are much more likely to appear in this section of reports. Altogether 59 of the sample (68%) made some reference.

Chart 10 shows the correlation between the incidence of comments and the sub-grade awarded by inspectors to this aspect of provision. It demonstrates that, while differences are relatively small, schools deemed outstanding are rather more likely to attract such comment.

But only one of the comments on outstanding provision is negative and that did not mention the most able specifically:

‘Also, in a small minority of lessons, activities do not always deepen students’ knowledge and understanding to achieve the very highest grades at GCSE and A level.’ (Central Foundation Boys’ School, Islington)

.


Chart 10: Incidence of comments under quality of teaching by grade awarded for quality of teaching

.

Comments are much more likely to be negative in schools where the quality of teaching is judged to be good (41%), requiring improvement (59%) and inadequate (58%).

Even so, a few schools in the lower two categories receive surprisingly positive endorsements:

  • ‘On the other hand, the most able students and the younger students in school consistently make good use of the feedback. They say they greatly value teachers’ advice….The teaching of the most able students is strong and often very strong. As a result, these students make good progress and, at times, achieve very well.’ (RI – The Elton High School Specialist Arts College, Bury)
  • ‘Teaching in mathematics is more variable, but in some classes, good and outstanding teaching is resulting in students’ rapid progress. This is most marked in the higher sets where the most able students are being stretched and challenged and are on track to reach the highest grades at GCSE…. In general, the teaching of the most able students….is good.’ (RI – New Charter Academy, Tameside)
  • ‘At its most effective, teaching is well organised to support the achievement of the most able, whose progress is better than other students. This is seen in some of the current English and science work.’ (I – Ely College, Cambridgeshire).

Negative comments on the quality of teaching supply a familiar list of shortcomings.

Some of the most perceptive are rather more specific. Examples include:

  • ‘While the best teaching allows all students to make progress, sometimes discussions that arise naturally in learning, particularly with more able students, are cut short. As a result, students do not have the best opportunity to explore ideas fully and guide their own progress.’ (Dyson Perrins C of E Sports College, Worcestershire)
  • ‘Teachers’ planning increasingly takes account of current information about students’ progress. However, some teachers assume that because the students are organised into ability sets, they do not need to match their teaching to individual and groups of students’ current progress. This has an inhibiting effect on the progress of the more able students in some groups.’ (Chulmleigh Community College, Devon)
  • ‘In too many lessons, particularly boys’ classes, teachers do not use questioning effectively to check students’ learning or promote their thinking. Teachers accept responses that are too short for them to assess students’ understanding. Neither do they adjust their teaching to revisit aspects not fully grasped or move swiftly to provide greater stretch and new learning for all, including the most able.’ (The Crest Academies, Brent)
  • ‘In some lessons, students, including the most able, are happy to sit and wait for the teacher to help them, rather than work things out for themselves’ (Willenhall E-ACT Academy, Walsall).

Were one compiling a list of what to do to impress inspectors, it would include the following items:

  • Plan lessons meticulously with the needs of the most able in mind 
  • Use assessment information to inform planning of work for the most able 
  • Differentiate work (and homework) to match most able learners’ needs and starting points 
  • Deploy targeted questioning, as well as opportunities to develop deeper thinking and produce more detailed pieces of work 
  • Give the most able the flexibility to pursue complex tasks and do not force them to participate in unnecessary revision and reinforcement 
  • Do not use setting as an excuse for neglecting differentiation 
  • Ensure that work for the most able is suitably challenging 
  • Ensure that subject knowledge is sufficiently secure for this purpose 
  • Maintain the highest expectations of what the most able students can achieve 
  • Support the most able to achieve more highly but do not allow them to become over-reliant on support 
  • Deploy teaching assistants to support the most able 
  • Respond to restlessness and low level disruption from the most able when insufficiently challenged.

While many of the reports implicitly acknowledge that the most able learners will have different subject-specific strengths and weaknesses, the implications of this are barely discussed.

Moreover, while a few reports attempt a terminological distinction between ‘more able’ and ‘most able’, the vast majority seem to assume that, in terms of prior attainment, the most able are a homogeneous group, whereas – given Ofsted’s preferred approach – there is enormous variation.

Achievement of pupils 

This is the one area of reports where reference to the most able is now apparently compulsory – or almost compulsory.

Just one report in the sample has nothing to say about the achievement of the most able in this section: that on Ashby School in Leicestershire.

Some of the comments are relatively long and detailed, but others are far more cursory and the coverage varies considerably.

Using as an example the subset of schools awarded a sub-grade of outstanding for the achievement of pupils, we can exemplify different types of response:

  • Generic: ‘The school’s most able students make rapid progress and attain excellent results. This provides them with an excellent foundation to continue to achieve well in their future studies.’ (Kelvin Hall School, Hull)
  • Generic, progress-focused: ‘The most-able students make rapid progress and the way they are taught helps them to probe topics in greater depth or to master skills at a high level.’ (St Richard Reynolds Catholic High School, Richmond-upon-Thames)
  • Achievement-focused, core subjects: ‘Higher attaining students achieve exceptionally well as a result of the support and challenge which they receive in class. The proportion of students achieving the higher A* to A grade was similar to national averages in English but significantly above in mathematics.’
  • Specific, achievement- and progress-focused: ‘Although the most able students make exceptional progress in the large majority of subjects, a few do not reach the very highest GCSE grades of which they are capable. In 2014, in English language, mathematics and science, a third of all students gained A and A* GCSE grades. Performance in the arts is a real strength. For example, almost two thirds of students in drama and almost half of all music students achieved A and A* grades. However, the proportions of A and A* grades were slightly below the national figures in English literature, geography and some of the subjects with smaller numbers of students’ (Central Foundation Boys’ School, Islington)

If we look instead at the schools with a sub-grade of inadequate, the comments are typically more focused on progress, but limited progress is invariably described as ‘inadequate’, ‘requiring improvement’, ‘weak’, ‘not good’, ‘not fast enough’. It is never quantified.

On the relatively few occasions when achievement is discussed, the measure is typically GCSE A*/A grades, most often in the core subjects.

It is evident from cross-referencing the Achievement of pupils sub-grade against the percentage of high attainers in the 2014 GCSE cohort that there is a similar correlation to that with the overall inspection grade:

  • In schools judged outstanding on this measure, the high attainer population ranges from 22% to 47% (average 33%)
  • In schools judged good, the range is from 13% to 52% (average 32%)
  • In schools requiring improvement it is between 3% and 40% (average 23%)
  • In schools rated inadequate it varies from 10% to 32% (average 22%)

.

Sixth Form Provision 

Coverage of the most able in sections dedicated to the sixth form is also extremely variable. Relatively few reports deploy the term itself when referring to 16-19 year-old students.

Sometimes there is discussion of progression to higher education and sometimes not. Where this does exist there is little agreement on the appropriate measure of selectivity in higher education:

  • ‘Students are aspiring to study at the top universities in Britain. This is a realistic prospect and illustrates the work the school has done in raising their aspirations.’ (Welling School, Bexley)
  • ‘The academy carefully tracks the destination of leavers with most students proceeding to university and one third of students gaining entry to a Russell Group university’ (Ashcroft Technology Academy, Wandsworth)
  • ‘Provision for the most able students is good, and an increasing proportion of students are moving on to the highly regarded ‘Russell group’ or Oxbridge universities. A high proportion of last year’s students have taken up a place at university and almost all gained a place at their first choice’ (Ashby School, Leicestershire)
  • ‘Large numbers of sixth form students progress to well-regarded universities’ (St Bartholomew’s School, West Berkshire)
  • ‘Students receive good support in crafting applications to universities which most likely match their attainment; this includes students who aspire to Oxford or Cambridge’ (Anthony Gell School, Derbyshire).

Most able and disadvantaged

Given the commitment in the 2015 survey report to ‘continue to focus sharply on the progress made by students who are able and disadvantaged’, I made a point of reviewing the coverage of this issue across all sections of the sample reports.

Suffice to say that only one report discussed provision for the most able disadvantaged students, in these terms:

‘Pupil premium funding is being used successfully to close the wide achievement gaps apparent at the previous inspection….This funding is also being effectively used to extend the range of experiences for those disadvantaged students who are most able. An example of this is their participation in a residential writing weekend.’ (St Hild’s C of E VA School, Hartlepool)

Take a bow, Lead Inspector Petts!

A handful of other reports made more general statements to the effect that disadvantaged students perform equivalently to their non-disadvantaged peers, most often with reference to the sixth form:

  • ‘The few disadvantaged students in the sixth form make the same progress as other students, although overall, they attain less well than others due to their lower starting points’ (Sir Thomas Wharton Community College, Doncaster)
  • ‘There is no difference between the rates of progress made by disadvantaged students and their peers’ (Sarum Academy, Wiltshire)
  • ‘In many cases the progress of disadvantaged students is outstripping that of others. Disadvantaged students in the current Year 11 are on course to do every bit as well as other students.’ (East Point Academy, Suffolk).

On two occasions, the point was missed entirely:

  • ‘The attainment of disadvantaged students in 2014 was lower than that of other students because of their lower starting points. In English, they were half a grade behind other students in the school and nationally. In mathematics, they were a grade behind other students in the school and almost a grade behind students nationally. The wider gap in mathematics is due to the high attainment of those students in the academy who are not from disadvantaged backgrounds.’ (Chulmleigh Community College, Devon)
  • ‘Disadvantaged students make good progress from their starting points in relation to other students nationally. These students attained approximately two-thirds of a GCSE grade less than non-disadvantaged students nationally in English and in mathematics. This gap is larger in school because of the exceptionally high standards attained by a large proportion of the most able students…’ (Durham Johnston Comprehensive School, Durham)

If Ofsted believes that inspectors are already focusing sharply on this issue then, on this evidence, they are sadly misinformed.

Key findings and areas for improvement

.

Key findings: Guidance

  • Ofsted inspectors have no reliable definition of ‘most able’ and no guidance on the appropriateness of definitions adopted by the schools they visit. The approach taken in the 2015 survey report is different to that adopted in the initial 2013 survey and is now exclusively focused on prior attainment. It is also significantly different to the high attainer measure in the Secondary Performance Tables.
  • Using Ofsted’s approach, the national population of most able in Year 7 approaches 50% of all learners; in Year 11 it is some 40% of all learners. The latter is some eight percentage points higher than the cohort derived from the Performance Tables measure.
  • The downside of such a large cohort is that it masks the huge attainment differences within it, from a single L5C (and possibly a L3 in either maths or English) to a clutch of L6s. Inspectors might be encouraged to regard this as a homogeneous group.
  • The upside is that there should be a most able presence in every year group of every school. In some comprehensive schools, high attainers will be a substantial majority in every year group; in others there will be no more than a handful.
  • Ofsted has not released data showing the incidence of high attainers in each school according to its measure (or the Performance Tables measure for that matter). Neither features in Ofsted’s Data Dashboard.
  • Guidance in the current School Inspection Handbook is not entirely helpful. There is not space in a Section 5 inspection report to respond to all the separate references (see Annex for the full list). The terminology is confused (‘most able’, ‘more able’, ‘high attainers’). Too often the Handbook mentions several different groups alongside the most able, one of which is disadvantaged pupils. This perpetuates the false assumption that there are no most able disadvantaged learners. We do not yet know whether there will be wholesale revision when new Handbooks are introduced to reflect the Common Inspection Framework.
  • At least four pieces of subsidiary guidance have been issued to inspectors since October 2013. But there has been nothing to reflect the commitments in HMCI’s Annual Report (including a stronger focus on school leadership of this issue) or the March 2015 Survey report. This material requires enhancement and consolidation.
  • The March 2015 Report apparently commits to more intensive scrutiny of curricular and IAG provision in Section 5 inspections, as well as ‘continued focus’ on able and disadvantaged students (see below). The subsequent commitment to an evaluation toolkit should help inspectors as well as schools, but the toolkit’s structure and content have not yet been revealed.

Key findings: Survey

  • The sample for my survey is broadly representative of regions, school status and variations in NOR (number on roll). In terms of overall inspection grades, 10% are outstanding, 31% good, 37% require improvement and 22% are inadequate. In terms of sub-grades, they are notably weaker on Quality of teaching and Achievement of pupils, the two sections that most typically feature material about the most able.
  • There is huge variation within the sample by percentage of high attainers (2014 GCSE population according to the Secondary Performance Tables measure). The range is from 3% to 52%. The average is 32%, very slightly under the 32.3% average for all state-funded schools. Comparing overall inspection grade with percentage of high attainers suggests a marked difference between those rated outstanding/good (average 32/33%) and those rated as requiring improvement/inadequate (average 23%).
  • 45% of the reports in the sample addressed the most able under Key findings; 38% did so under Areas for improvement and 28% made reference in both sections. However, 45% made no reference in either of these sections. In 2014, 34% mentioned the most able in both main findings and recommendations, while 52% mentioned it in neither. On this measure, inspectors’ focus on the most able has not increased substantively since last year.
  • Community and foundation schools were rather more likely to attract such comments than either converter or sponsored academies. Voluntary schools were least likely to attract them. The lower the overall inspection grade, the more likely a school is to receive such comments.
  • In Key findings, negative comments outnumbered positive comments by a ratio of 3:1. Schools with high percentages of high attainers were well represented amongst those receiving positive comments.
  • Unsurprisingly, schools rated inadequate overall were much more likely to attract negative comments. A correlation between overall quality and quality of provision for the most able was somewhat more apparent than in 2014. There was also some evidence to suggest a correlation between negative comments and a low proportion of high attainers.
  • On the other hand, over half of schools with an overall requiring improvement grade and a third with an overall inspection grade of inadequate did not attract comments about the most able under Key findings. This is not indicative of greater emphasis.
  • The menu of shortcomings is confined to seven principal faults: underachievement (especially too few high GCSE grades), insufficient progress, low expectations, insufficiently challenging work, poor teaching quality, poor planning and poor use of assessment information. In most cases practice is inconsistent but occasionally problems are school-wide.
  • Areas for improvement are almost always expressed in formulaic fashion. Those relating to the most able focus almost invariably on the Quality of teaching. The improvement most commonly urged is more thorough application of assessment information to planning and teaching.
  • Only 15% of reports mention the most able under Leadership and management and, of those, only two are negative comments. The role of governors was not raised once. Too often the school leadership escapes censure for shortcomings identified elsewhere in the report. This is not consistent with indications of new-found emphasis in this territory.
  • The most able are hardly ever mentioned in the Behaviour and safety section of reports. It would seem that bullying is invisible and low level disruption by bored high attainers rare.
  • Conversely, 68% of reports referenced the most able under Quality of teaching. Although negative comments are much more likely in schools judged as inadequate or requiring improvement in this area, a few appear to be succeeding with their most able against the odds. The main text identifies a list of twelve good practice points gleaned from the sample.
  • Only one report fails to mention the most able under Achievement of pupils, but the quality and coverage varies enormously. Some comments are entirely generic; some focus on achievement, others on progress and some on both. Few venture beyond the core subjects. There is very little quantification, especially of insufficient progress (and especially compared with equivalent discussion of progress by disadvantaged learners).
  • Relatively few reports deploy the term ‘most able’ when discussing sixth form provision. Progression to higher education is sometimes mentioned and sometimes not. There is no consensus on how to refer to selective higher education.
  • Only one report in this sample mentions disadvantaged most able students. Two reports betray the tendency to assume that these two groups are mutually exclusive but, worse still, the sin of omission is almost universal. This provides no support whatsoever for Ofsted’s claim that inspectors already address the issue.

Areas for improvement

Ofsted has made only limited improvements since the previous inspection in May 2014 and its more recent commitments are not yet reflected in Section 5 inspection practice.

In order to pass muster it should:

  • Appoint a lead inspector for the most able who will assume responsibility across Ofsted, including communication and consultation with third parties.
  • Consolidate and clarify material about the most able in the new Inspection Handbooks and supporting guidance for inspectors.
  • Prepare and publish a high quality evaluation toolkit, to support schools and inspectors alike. This should address definitional and terminological issues as well as supplying benchmarking data for achievement and progress. It might also set out the core principles underpinning effective practice.
  • Include within the toolkit a self-assessment and evaluation framework based on the quality standards. This should model Ofsted’s understanding of whole school provision for the most able that aligns with outstanding, good and requiring improvement grades, so that schools can understand the progression between these points.
  • Incorporate data about the incidence of the most able and their performance in the Data Dashboard.
  • Extend all elements of this work programme to the primary and post-16 sectors.
  • Undertake this work programme in consultation with external practitioners and experts in the field, completing it as soon as possible and by December 2015 at the latest.

 .

Verdict: (Still) Requires Improvement.

GP

April 2015


.

Annex: Coverage in the School Inspection Handbook (December 2014)

Main Text

Inspectors should:

  • Gather evidence about how well the most able pupils are ‘learning, gaining knowledge and understanding, and making progress’ (para 40)
  • Take account of them when considering performance data (para 59)
  • Take advantage of opportunities to gather evidence from them (para 68)
  • Consider the effectiveness of pupil grouping, for example ‘where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched…’ (para 153)
  • Explore ‘how well the school works with families to support them in overcoming the cultural obstacles that often stand in the way of the most able pupils from deprived backgrounds attending university’ (para 154)
  • Consider whether ‘teachers set homework in line with the school’s policy and that challenges all pupils, especially the most able’ (para 180)
  • Consider ‘whether work in Key Stage 3 is demanding enough, especially for the most able when too often undemanding work is repeated unnecessarily’ (para 180)
  • Consider whether ‘teaching helps to develop a culture and ethos of scholastic excellence, where the highest achievement in academic work is recognised, especially in supporting the achievement of the most able’ (para 180)
  • When judging achievement, have regard for ‘the progress that the most able are making towards attaining the highest grades’ and ‘pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should’. They must ‘summarise the achievements of the most able pupils in a separate paragraph of the inspection report’ (paras 185-7)
  • Consider ‘how the school uses assessment information to identify pupils who…need additional support to reach their full potential, including the most able.’ (para 193)
  • Consider how well ‘assessment, including test results, targets, performance descriptors or expected standards are used to ensure that…more able pupils do work that deepens their knowledge and understanding’ and ‘pupils’ strengths and misconceptions are identified and acted on by teachers during lessons and more widely to… deepen the knowledge and understanding of the most able’ (para 194)
  • Take account of ‘the learning and progress across year groups of different groups of pupils currently on the roll of the school, including…the most able’. Evidence gathered should include ‘the school’s own records of pupils’ progress, including… the most able pupils such as those who joined secondary schools having attained highly in Key Stage 2’ (para 195)
  • Take account of ‘pupils’ progress in the last three years, where such data exist and are applicable, including that of…the most able’ (para 195)
  • ‘When inspecting and reporting on students’ achievement in the sixth form, inspectors must take into account all other guidance on judging the achievement, behaviour and development of students, including specific groups such as…the most able ‘ (para 210)
  • Talk to sixth form students to discover ‘how well individual study programmes meet their expectations, needs and future plans, including for…the most able’ (para 212)

However, the terminology is not always consistent: in assessing the overall effectiveness of a school, inspectors must judge its response to ‘the achievement of…the highest and lowest attainers’ (para 129).

Grade descriptors

Outstanding

  • Overall effectiveness:

‘The school’s practice consistently reflects the highest expectations of staff and the highest aspirations for pupils, including the most able…’

  • Quality of teaching:

‘Much teaching over time in all key stages and most subjects is outstanding and never less than consistently good. As a result, almost all pupils currently on roll in the school, including…the most able, are making sustained progress that leads to outstanding achievement.’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is consistently good or better.’

  • Effectiveness of sixth form provision:

‘All groups of pupils make outstanding progress, including…the most able’

Good

  • Overall effectiveness:

‘The school takes effective action to enable most pupils, including the most able…’

  • Quality of teaching:

‘Teaching over time in most subjects, including English and mathematics, is consistently good. As a result, most pupils and groups of pupils on roll in the school, including…the most able, make good progress and achieve well over time.’

‘Effective teaching strategies, including setting appropriate homework and well-targeted support and intervention, are matched closely to most pupils’ needs, including those most and least able, so that pupils learn well in lessons’

  • Achievement of pupils:

‘The learning of groups of pupils, particularly… the most able, is generally good.’

  • Effectiveness of sixth form provision:

‘As a result of teaching that is consistently good over time, students make good progress, including…the most able’

Inadequate

  • Quality of teaching:

‘As a result of weak teaching over time, pupils or particular groups of pupils, including…the most able, are making inadequate progress.’

  • Achievement of pupils:

‘Groups of pupils, particularly disabled pupils and/or those who have special educational needs and/or disadvantaged pupils and/or the most able, are underachieving’

  • Effectiveness of sixth form provision:

‘Students or specific groups such as… the most able do not achieve as well as they can. Low attainment of any group shows little sign of rising.’

The most able students: Has Ofsted made progress?

.

This post considers Ofsted’s survey report ‘The most able students: An update on progress since June 2013’ published on 4 March 2015.

It is organised into the following sections:

  • The fit with earlier analysis
  • Reaction to the Report
  • Definitions and the consequent size of Ofsted’s ‘most able’ population
  • Evidence base – performance data and associated key findings
  • Evidence base – inspection and survey evidence and associated key findings
  • Ofsted’s recommendations and overall assessment
  • Prospects for success

How this fits with earlier work

The new Report assesses progress since Ofsted’s previous foray into this territory some 21 months ago: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

The autopsy I performed on the original report was severely critical.

It concluded:

‘My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.’

In May 2014, almost exactly mid-way between that Report and this, I published an analysis of the quality of Ofsted reporting on support for the most able in a sample of Section 5 secondary school inspection reports.

This uncovered a patchy picture which I characterised as ‘requiring improvement’.

It noted the scant attention given by inspectors to high-attaining disadvantaged learners and called for Ofsted to publish guidance to clarify, for inspectors and schools alike, what they mean by the most able and their expectations of what support schools should provide.

In December 2014, I published ‘HMCI ups the ante on the most able’ which drew attention to commitments in HMCI’s Annual Report for 2013/14 and the supporting documentation released alongside it.

I concluded that post with a series of ten recommendations for further action by Ofsted and other central government bodies that would radically improve the chances of achieving system-wide improvement in this territory.

The new Report was immediately preceded by a Labour commitment to introduce a £15m Gifted and Talented Fund if successful in the forthcoming General Election.

This short commentary discusses that and sets out the wider political context into which Ofsted’s new offering will fall.

.

Reactions to Ofsted’s Report

Before considering the Report’s content, it may be helpful to complete this context-setting by charting immediate reactions to it.

  • DfE’s ‘line to take’, as quoted by the Mail, is:

‘We know that the best schools do stretch their pupils. They are the ones with a no-excuses culture that inspires every student to do their best.

Our plan for education is designed to shine a bright light on schools which are coasting, or letting the best and brightest fall by the wayside.

That is why we are replacing the discredited system which rewarded schools where the largest numbers of pupils scraped a C grade at GCSE.

Instead we are moving to a new system which encourages high-achievers to get the highest grades possible while also recognising schools which push those who find exams harder.’

  • Labour’s contrasting line, restating its commitment to a Gifted and Talented Fund, is:

‘David Cameron’s government has no strategy for supporting schools to nurture their most able pupils. International research shows we perform badly in helping the most gifted pupils. We’re going to do something about that. Labour will establish a Gifted and Talented Fund to equip schools with the most effective strategies for stretching their most able pupils.’

  • ASCL complains that the Report ‘fails to recognise that school leaders have done an extraordinary job in difficult circumstances in raising standards and delivering a good education for all children’. It is also annoyed because Ofsted’s press release:

‘…should have focused on the significant amount of good practice identified in the report rather than leading with comments that some schools are not doing enough to ensure the most able children fulfil their potential.’

.

  • NAHT makes a similarly generic point about volatility and change:

‘The secondary sector has been subject to massive structural change over the past few years. It’s neither sensible nor accurate to accuse secondary schools of failure. The system itself is getting in the way of success…

…Not all of these changes are bad. The concern is that the scale and pace of them will make it very hard indeed to know what will happen and how the changes will interact….

…The obvious answer is quite simple: slow down and plan the changes better; schedule them far enough ahead to give schools time to react….

But the profession also needs to ask what it can do. One answer is not to react so quickly to changes in league table calculations – to continue to do what is right…’

There was no official reaction from ATL, NASUWT or NUT.

Turning to the specialist organisations:

‘If the failure reported by Ofsted was about any other issue there would be a national outcry.

This cannot be an issue laid at the door of schools alone, with so many teachers working hard, and with no budget, to support these children.

But in some schools there is no focus on supporting high potential learners, little training for teachers to cope with their educational needs, and a naive belief that these children will succeed ‘no matter what’.

Ofsted has shown that this approach is nothing short of a disaster; a patchwork of different kinds of provision, a lack of ambitious expectations and a postcode lottery for parents.

We need a framework in place which clearly recognises best practice in schools, along with a greater understanding of how to support these children with high learning potential before it is too late.’

‘NACE concurs with both the findings and the need for urgent action to be taken to remove the barriers to high achievement for ALL pupils in primary and secondary schools…

… the organisation is  well aware that nationally there is a long way to go before all able children are achieving in line with their abilities.’

‘Today’s report demonstrates an urgent need for more dedicated provision for the highly able in state schools. Ofsted is right to describe the situation as ‘especially disappointing’; too many of our brightest students are being let down…

…We need to establish an effective national programme to support our highly able children particularly those from low and middle income backgrounds so that they have the stretch and breadth they need to access the best universities and the best careers.’

Summing up, the Government remains convinced that its existing generic reforms will generate the desired improvements.

There is so far no response from Conservatives or Liberal Democrats to the challenge laid down by Labour, which has decided that some degree of arm’s-length intervention from the centre is justified.

The headteacher organisations are defensive because they see themselves as the fall guys, as the centre increasingly devolves responsibility through a ‘school-driven self-improving’ system that cannot yet support its own weight (and might never be able to do so, given the resource implications of building sufficient capacity).

But they cannot get beyond these generic complaints to address the specific issues that Ofsted presents. They are in denial.

The silence of the mainstream teachers’ associations is sufficient comment on the significance they attach to this issue.

The specialist lobby calls explicitly for a national framework, or even the resurrection of a national programme. All are pushing their own separate agendas over common purpose and collaborative action.

Taken together, this does not bode well for Ofsted’s chances of achieving significant traction.

Ofsted’s definitions

.

Who are the most able?

Ofsted is focused exclusively on non-selective secondary schools, and primarily on KS3, though most of the data it publishes relates to KS4 outcomes.

My analysis of the June 2013 report took umbrage at Ofsted’s previous definition of the most able:

‘For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.’

On this occasion, the definition is similarly based on prior attainment at KS2, but the unquantified proportion of learners with ‘the potential to attain Level 5 or above’ is removed, meaning that Ofsted is now focused exclusively on high attainers:

‘For this report, ‘most able’ refers to students starting secondary school in Year 7 having attained Level 5 or above in English (reading and writing) and/or mathematics at the end of Key Stage 2.’

This reinforces the unsuitability of the term ‘most able’, on the grounds that attainment, not ability, is the true focus.

Ofsted adds for good measure:

‘There is currently no national definition for most able’

They fail to point out that the Performance Tables include a subtly different definition of high attainers, essentially requiring an APS of 30 points or higher across Key Stage 2 tests in the core subjects.

The 2014 Secondary Performance Tables show that this high attainer population constitutes 32.3% of the 2014 GCSE cohort in state-funded schools.

The associated SFR indicates that high attainers account for 30.9% of the cohort in comprehensive schools (compared with 88.8% in selective schools).

But Ofsted’s definition is wider still. The SFR published alongside the 2014 Primary Performance Tables reveals that, in 2014:

  • 29% of pupils achieved Level 5 or above in KS2 reading and writing
  • 44% of pupils achieved Level 5 or above in KS2 maths and
  • 24% of pupils achieved Level 5 or above in KS2 reading, writing and maths.

If this information is fed into a Venn diagram, it becomes evident that, this academic year, the ‘most able’ constitute 49% of the Year 7 cohort.

That’s right – almost exactly half of this year’s Year 7s fall within Ofsted’s definition.
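
The arithmetic is a straightforward inclusion-exclusion sum (my reconstruction from the three percentages above, assuming they all describe the same 2014 KS2 cohort):

L5 English (reading and writing) or L5 maths = 29% + 44% − 24% = 49%

The 24% with Level 5 in both subjects would otherwise be counted twice.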

.

[Venn diagram: pupils attaining KS2 Level 5 in reading and writing, in maths, and in both, 2014]

.

The population is not quite so large if we focus instead on KS2 data from 2009, when the 2014 GCSE cohort typically took their KS2 tests, but even that gives a combined total of 39%.

We can conclude that Ofsted’s ‘most able’ population is approximately 40% of the KS4 cohort and approaching 50% of the KS3 cohort.

This again calls into question Ofsted’s terminology, since the ‘most’ in ‘most able’ gives the impression that they are focused on a much smaller population at the top of the attainment distribution.

We can check the KS4 figure against numerical data provided in the Report, to demonstrate that it applies equally to non-selective schools, ie once selective schools have been removed from the equation.

The charts in Annex A of the Report give the total number of pupils in non-selective schools with L5 outcomes from their KS2 assessments five years before they take GCSEs:

  • L5 maths and English = 91,944
  • L5 maths = 165,340
  • L5 English (reading and writing) = 138,789

Subtracting the overlap from the sum of the two single-subject figures, to avoid double-counting, gives a total population of 212,185 in 2009.

I could not find a reliable figure for the number of KS2 test takers in 2009 in state-funded primary schools, but the equivalent in the 2011 Primary Performance Tables is 547,025.

Using that, one can calculate that those within Ofsted’s definition constitute some 39% of the 2014 GCSE cohort in non-selective secondary schools. The calculations above suggest that the KS3 cohort will be some ten percentage points larger.
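
The working, for transparency (a sketch; as noted, the 547,025 denominator is a 2011 proxy for the unavailable 2009 figure):

165,340 + 138,789 − 91,944 = 212,185
212,185 ÷ 547,025 ≈ 38.8%, or some 39%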

.

Distribution between schools

Of course the distribution of these students between schools will vary considerably.

The 2014 Secondary Performance Tables illustrate this graphically through their alternative ‘high attainers’ measure. The cohort information provides the percentage of high attainers in the GCSE cohort in each school.

The highest recorded percentage in a state-funded comprehensive school is 86%, whereas 92 state-funded schools record 10% or fewer high attainers and just over 650 have 20% or fewer in their GCSE cohort.

At the other extreme, 21 non-selective state-funded schools are at 61% or higher, 102 at 51% or higher and 461 at 41% or higher.

However, the substantial majority – about 1,740 state-funded, non-selective schools – fall between 21% and 40%.

The distribution is shown in the graph below.

.

[Graph: percentage of high attainers within each state-funded non-selective secondary school’s cohort, 2014 (Performance Tables measure)]

Ofsted approaches the issue differently, by looking at the incidence of pupils with KS2 L5 in English, maths and both English and maths.

Their tables (again in Annex A of the Report) show that, within the 2014 GCSE cohort there were:

  • 2,869 non-selective schools where at least one pupil previously attained a L5 in KS2 English
  • 2,875 non-selective schools where at least one pupil previously attained a L5 in KS2 maths and
  • 2,859 non-selective schools where at least one pupil previously attained L5 in KS2 English and maths.

According to the cohort data in the 2014 Secondary Performance Tables, this suggests that roughly 9% of state-funded non-selective secondary schools had no pupils in each of these categories within the relevant cohort. (It is of course a different 9% in each case.)
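
To illustrate how a figure of roughly 9% arises (a sketch: the denominator of some 3,150 state-funded non-selective schools is my assumption, since the passage above does not state the total), for English:

1 − (2,869 ÷ 3,150) ≈ 0.09

The equivalent sums for maths and for the combined measure give much the same result.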

Ofsted’s analysis shows that the lowest decile of schools in the distribution of students with L5 in English will have up to 14 of them.

Similarly the lowest decile for L5 in maths will have up to 18 pupils, and the lowest decile for L5 in maths and English combined will have up to 10 pupils.

Assuming a top set typically contains at least 26 pupils, 50% of state-funded, non-selective schools with at least one pupil with L5 English have insufficient students for one full set. The comparable percentage for maths is 30%.

But Ofsted gives no hint of what might constitute a critical mass of high attainers, appearing to suggest that it is simply a case of ‘the more the better’.

Moreover, it seems likely that Ofsted might simply be identifying the incidence of disadvantage through the proxy of high attainers.

This is certainly true at the extremes of the distribution based on the Performance Tables measure.

  • Amongst the 92 schools with 10% or fewer high attainers, 53 (58%) have a cohort containing 41% or more disadvantaged students.
  • By comparison, amongst the 102 schools with 51% or more high attainers, not one school has such a high proportion of disadvantaged students, indeed, 57% have 10% or fewer.

Disadvantage

When Ofsted discusses the most able from disadvantaged backgrounds, its definition of disadvantage is confined to ‘Ever-6 FSM’.

The Report does not provide breakdowns showing the size of this disadvantaged population in state-funded non-selective schools with L5 English or L5 maths.

It does tell us that 12,150 disadvantaged students in the 2014 GCSE cohort had achieved KS2 L5 in both English and maths.  They form about 13.2% of the total cohort achieving this outcome.

If we assume that the same percentage applies to the total populations achieving L5 English only and L5 maths only, this suggests the total size of Ofsted’s disadvantaged most able population within the 2014 GCSE cohort in state-funded, non-selective schools is almost exactly 28,000 students.
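
The scaling works as follows (a sketch, resting on the stated assumption that the 13.2% rate carries across the whole Level 5 population):

12,150 ÷ 91,944 ≈ 13.2%
0.132 × 212,185 ≈ 28,000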

Strangely, the Report does not analyse the distribution of disadvantaged high attainers, as opposed to high attainers more generally, even though the text mentions this as an issue in passing.

One would expect that the so-called ‘minority effect’ might be even more pronounced in schools where there are very few disadvantaged high attainers.

Ofsted’s evidence base: Performance data

The Executive Summary argues that analysis of national performance data reveals:

‘…three key areas of underperformance for the most able students. These are the difference in outcomes between:

  • schools where most able students make up a very small proportion of the school’s population and those schools where proportions are higher
  • the disadvantaged most able students and their better off peers
  • the most able girls and the most able boys.

If the performance of the most able students is to be maximised, these differences need to be overcome.’

As noted above, Ofsted does not separately consider schools where the incidence of disadvantaged most able students is low, nor does it look at the interaction between these three categories.

It considers all three areas of underperformance through the single prism of prior attainment in KS2 tests of English and maths.

The Report also comments on a fourth dimension: the progression of disadvantaged students to competitive universities. Once again this is related to KS2 performance.

There are three data-related Key Findings:

  • ‘National data show that too many of the most able students are still being let down and are failing to reach their full potential. Most able students’ achievement appears to suffer even more when they are from disadvantaged backgrounds or when they attend a school where the proportion of previously high-attaining students is small.’
  • ‘Nationally, too many of our most able students fail to achieve the grades they need to get into top universities. There are still schools where not a single most able student achieves the A-level grades commonly preferred by top universities.’
  • ‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

The following sections look at achievement compared with prior attainment, followed by each of the four dimensions highlighted above.

GCSE attainment compared with KS2 prior attainment

Ofsted’s approach is modelled on the transition matrices, as applied to non-selective schools, comparing KS2 test performance in 2009 with subsequent GCSE performance in 2014.

Students with KS2 L5 are expected to make at least three levels of progress, to GCSE Grade B or higher, but this is relatively undemanding for high attainers, who should ideally be aiming for A/A* grades.

Ofsted presents two charts which illustrate the relatively small proportions who are successful in these terms – and the comparatively large proportions who undershoot even a grade B.

[Two charts: progression from KS2 Level 5 to GCSE grades in English and in mathematics, non-selective schools]

 .

  • In English, 39% manage A*/A grades while 77% achieve at least a Grade B, meaning that 23% achieve C or below.
  • In maths, 42% achieve A*/A grades, 76% at least a B and so 24% achieve C or lower.
  • In English and maths combined, 32% achieve A*/A grades in both subjects, 73% manage at least 2 B grades, while 27% fall below this.

On each of these measures, approximately one in four high attainers misses the progression target, even though the targets are not particularly demanding.

The Report notes that, in selective schools, the proportion of Level 5 students not achieving at least a Grade B is much lower, at 8% in English and 6% in maths.

Even allowing for the unreliability of these ‘levels of progress’ assumptions, the comparison between selective and non-selective schools is telling.

.

The size of a school’s most able population

The Report sets out evidence to support the contention that ‘the most able do best when there are more of them in a school’ (or, more accurately, in their year group).

It provides three graphs – for English, for maths and for maths and English combined – which divide non-selective schools with at least one L5 student into deciles according to the size of that L5 population.

These show consistent increases in the proportion of students achieving GCSE Grade B and above and Grades A*/A, with the lowest percentages for the lowest deciles and vice versa.

Comparing the bottom (fewest L5) and top (most L5) deciles:

  • In English 27% of the lowest decile achieved A*/A and 67% at least a B, whereas in the highest decile 48% achieved A*/A and 83% at least B.
  • In maths 28% of the bottom decile recorded A*/A while 65% managed at least a B, whereas in the top decile 54% achieved A*/A and 83% at least a B.
  • In maths and English combined, the lowest decile schools returned 17% A*/A grades and 58% at B or above, while in the highest decile the percentages were 42% and 81% respectively.

Selective schools record higher percentages than the highest decile on all three measures.

There is a single reference to the impact of sublevels, amply evidenced by the transition matrices.

‘For example, in schools where the lowest proportions of most able students had previously gained Level 5A in mathematics, 63% made more than expected progress. In contrast, in schools where the highest proportion of most able students who had previously attained Level 5A in mathematics, 86% made more than expected progress.’

Ofsted does not draw any inferences from this finding.

As hinted above, one might want to test the hypothesis that there may be an association with setting – in that schools with sufficient Level 5 students to constitute a top set might be relatively more successful.

Pursued to its logical extreme the finding would suggest that Level 5 students will be most successful where they are all taught together.

Interestingly, my own analysis of schools with small high attainer populations (10% or less of the cohort), derived from the 2014 Secondary Performance Tables, shows just how much variation there can be in the performance of these small groups when it comes to the standard measures:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • Expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%.

This is partly a function of the small sample sizes. One suspects that Ofsted’s deciles smooth over similar variations.

But the most obvious point is that already emphasised in the previous section – the distribution of high attainers seems in large part a proxy for the level of advantage in a school.

Viewed from this perspective, Ofsted’s data on the variation in performance by distribution of high attaining students seems unsurprising.

.

Excellence gaps

Ofsted cites an ‘ever 6’ gap of 13 percentage points at GCSE grade B and above in English (66% compared with 79%) and of 17 percentage points in maths (61% compared with 78%).

Reverting again to progression from KS2, the gap between L5 ‘ever 6 FSM’ and other students going on to achieve A*/A grades in both English and maths is also given as 17 percentage points (20% versus 37%). At Grade B and above the gap is 16 points (59% compared with 75%).

A table is supplied showing progression by sub-level in English and maths separately.

.

[Table: progression to GCSE grades A*-B and A*/A by KS2 sub-level, ‘ever 6 FSM’ versus other students]

. 

A footnote explains that the ‘ever 6 FSM’ population with L5a in English was small, consisting of just 136 students.

I have transferred these excellence gaps to the graph below, to illustrate the relationship more clearly.

.

[Chart: GCSE attainment gaps between advantaged and disadvantaged learners by KS2 prior attainment]

.

It shows that, for grades A*-B, the size of the gap reduces the higher the KS2 sub-level, but the reverse is true at grades A*/A, at least as far as the distinction between 5c and 5b/a is concerned. The gaps remain similar or identical for progression from the higher two sub-levels.

This might suggest that schools are too little focused on pushing high-attaining disadvantaged learners beyond grade B.

 .

Gender

There is a short section on gender differences which points out that, for students with KS2 L5:

  • In English there was a 10 percentage point gap in favour of girls at Grade B and above and an 11 point gap in favour of girls at A*/A.
  • In maths there was a five percentage point gap at both Grade B and above and Grade A*/A.

But the interrelationship with excellence gaps and the size of the high attainer population is not explored.

.

Progression to competitive higher education

The Executive Summary mentions one outcome from the 2012/13 destinations data – that only 5% of disadvantaged students completing KS5 in 2012 progressed to ‘the top universities’. (The main text also compares the progression rates for state-funded and independent schools).

It acknowledges some improvement compared with previous years, but notes the disparity with progression rates for students from comparatively advantaged backgrounds.

A subsequent footnote reveals that Ofsted is referring throughout to progression to Russell Group universities.

The Executive Summary also highlights regional differences:

‘For example, even within a high-achieving region like London, disadvantaged students in Brent are almost four times as likely to attend a prestigious university as those in Croydon.’

The main text adds:

‘For example, of the 500 or so disadvantaged students in Kent, only 2% go on to attend a top university. In Manchester, this rises to 9%. Disadvantaged students in Barnet are almost four times as likely as their peers in Kent to attend a prestigious university.’

Annex A provides only one statistic concerning progression from KS2 to KS5:

‘One half of students achieving Level 5 in English and mathematics at Key Stage 2 failed to achieve any A or A* grades at A level in non-selective schools’

There is no attempt to relate this data to the other variables discussed above.

Ofsted’s evidence base: Inspection and survey evidence

The qualitative evidence in Ofsted’s report is derived from:

  • A survey of 40 non-selective secondary schools and 10 primary schools. All the secondary schools had at least 15% of students ‘considered to be high attaining at the end of Key Stage 2’ (as opposed to meeting Ofsted’s definition), as well as 10% or more considered to be low-attaining. The sample varied according to size, type and urban or rural location. Fifteen of the 40 were included in the survey underpinning the original 2013 report. Nine of the 10 primary schools were feeders for the secondaries in the sample. In the secondary schools, inspectors held discussions with senior leaders, as well as those responsible for transition and IAG (so not apparently those with lead responsibility for high attainers). They also interviewed students in KS3 and KS5 and looked at samples of students’ work.

The six survey questions are shown below.

.

[Image: Ofsted’s six survey questions]

.

  • Supplementary questions asked during 130 Section 5 inspections, focused on how well the most able students are maintaining their progress in KS3, plus challenge and availability of suitable IAG for those in Year 11.
  • An online survey of 600 Year 8 and Year 11 students from 17 unidentified secondary schools, plus telephone interviews with five Russell Group admissions tutors.

The Report divides its qualitative evidence into seven sections that map broadly on to the six survey questions.

The summary below is organised thematically, pulling together material from the key findings and supporting commentary. Relevant key findings are emboldened. Some of these have relevance to sections other than that in which they are located.

The length of each section is a good guide to the distribution and relative weight of Ofsted’s qualitative evidence.

Most able disadvantaged

‘Schools visited were rarely meeting the distinct needs of students who are most able and disadvantaged. Not enough was being done to widen the experience of these students and develop their broader knowledge or social and cultural awareness early on in Key Stage 3. The gap at Key Stage 4 between the progress made by the most able disadvantaged students and their better off peers is still too large and is not closing quickly enough.’

The 2013 Report found few instances of pupil premium being used effectively to support the most able disadvantaged. This time round, about a third of survey schools were doing so. Six schools used the premium effectively to raise attainment.

Funding was more often used for enrichment activities but these were much less common in KS3, where not enough was being done to broaden students’ experience or develop social and cultural awareness.

In less successful schools, funding was not targeted ‘with the most able students in mind’, nor was its impact evaluated with sufficient precision.

In most survey schools, the proportion of most able disadvantaged was small. Consequently leaders did not always consider them.

In the few examples of effective practice, schools provided personalised support plans.

.

Leadership

Ofsted complains of complacency. Leaders are satisfied with their most able students making the expected progress – their expectations are not high enough.

School leaders in survey schools:

‘…did not see the need to do anything differently for the most able as a specific group.’

One head commented that specific support would be ‘a bit elitist’.

In almost half of survey schools, heads were not prioritising the needs of their most able students at a sufficiently early stage.

Just 44 of the 130 schools asked supplementary questions had a senior leader with designated responsibility for the most able. Of these, only 16 also had a designated governor.

The Report comments:

‘This suggests that the performance of the most able students was not a high priority…’

Curriculum

‘Too often, the curriculum did not ensure that work was hard enough for the most able students in Key Stage 3. Inspectors found that there were too many times when students repeated learning they had already mastered or did work that was too easy, particularly in foundation subjects.’

Although leaders have generally made positive curriculum changes at KS4 and 5, issues remain at KS3. The general consensus amongst students in over half the survey schools was that work is too easy.

Students identified maths and English as more challenging than other subjects in about a third of survey schools.

In the 130 schools asked supplementary questions, leaders rarely prioritised the needs of the most able at KS3. Only seven offered a curriculum designed for different abilities.

In the most effective survey schools the KS3 curriculum was carefully structured:

‘…leaders knew that, for the most able, knowledge and understanding of content was vitally important alongside the development of resilience and knowing how to conduct their own research.’

By comparison, the KS4 curriculum was tailored in almost half of survey schools. All the schools introduced enrichment and extra-curricular opportunities, though few were effectively evaluated.

. 

Assessment and tracking

Assessment, performance tracking and target setting for the most able students in Key Stage 4 were generally good, but were not effective enough in Key Stage 3. The schools visited routinely tracked the progress of their older most able students, but this remained weak for younger students. Often, targets set for the most able students were too low, which reflected the low ambitions for these students. Targets did not consistently reflect how quickly the most able students can make progress.’

Heads and assessment leaders considered tracking the progress of the most able sufficient to address their performance, but only rarely was this information used to improve curriculum and teaching strategies.

Monitoring and evaluation tends to be focused on KS4. There were some improvements in tracking at KS4 and KS5, but this had caused many schools to lose focus on tracking from the start of KS3.

KS3 students in most survey schools said their views were sought, but could not always point to changes as a consequence. Only in eight schools were able students’ views sought as a cohort.

Year 8 respondents to the online survey typically said schools could do more to develop their interests.

At KS3, half the survey schools did not track progress in all subjects. Where tracking was comprehensive, progress was inconsistent, especially in foundation subjects.

Assessment and tracking ‘generally lacked urgency and rigour’. This, when combined with ineffective use of KS2 assessments:

‘… has led to an indifferent start to secondary school for many of the most able students in these schools.’

KS2 tests were almost always used to set targets but five schools distrusted these results. Baseline testing was widely used, but only about a quarter of the sample used it effectively to spot gaps in learning or under-achievement.

Twenty-six of the 40 survey schools set targets ‘at just above national expectations’. For many students these were insufficiently demanding.

Expectations were insufficiently high to enable them to reach their potential. Weaknesses at KS3 meant there was too much to catch up at KS4 and 5.

In the better examples:

‘…leaders looked critically at national expectations and made shrewd adjustments so that the most able were aiming for the gold standard of A and A* at GCSE and A levels rather than grade B. They ensured that teachers were clear about expectations and students knew exactly what was expected of them. Leaders in these schools tracked the progress of their most able students closely. Teachers were quickly aware of any dips in performance and alert to opportunities to stretch them.’

The expectations built into levels-based national curriculum assessment imposed ‘a glass ceiling’. It is hoped that reforms such as Progress 8 will help raise schools’ aspirations.

 .

Quality of teaching

‘In some schools, teaching for the most able lacked sufficient challenge in Key Stage 3. Teachers did not have high enough expectations and so students made an indifferent start to their secondary education. The quality of students’ work across different subjects was patchy, particularly in foundation subjects. The homework given to the most able was variable in how well it stretched them and school leaders did not routinely check its effectiveness.’

The most common methods of introducing ‘stretch’ reported by teachers and students were extension work, challenge questions and differentiated tasks.

But in only eight of the survey schools did teachers have specific training in applying these techniques to the most able.

As in 2013, teaching at KS3 was insufficiently focused on the most able. The quality of work and tasks set was patchy, especially in foundation subjects. In two-thirds of survey schools work was insufficiently challenging in foundation subjects; in just under half, work was insufficiently challenging in maths and English.

Students experienced a range of teaching quality, even in the same school. Most said there were lessons that did not challenge them. Older students were more content with the quality of stretch and challenge.

In only about one fifth of survey schools was homework adapted to the needs of the most able. Extension tasks were increasingly common.

The same was true of half of the 130 schools asked supplementary questions.  Only 14 had a policy of setting more challenging homework for the most able.

Most schools placed students in maths and science sets fairly early in Year 7, but did so less frequently in English.

In many cases, older students were taught successfully in mixed ability classes, often because there were too few students to make sets viable:

‘The fact that these schools were delivering mixed ability classes successfully suggests that the organisation of classes by ability is not the only factor affecting the quality of teaching. Other factors, such as teachers not teaching their main subject or sharing classes or leaders focusing the skills of their best teachers disproportionately on the upper key stages, are also influential.’

. 

School culture and ethos

‘Leaders had not embedded an ethos in which academic excellence was championed with sufficient urgency. Students’ learning in Key Stage 3 in the schools visited was too frequently disrupted by low-level disruption, particularly in mixed-ability classes. Teachers had not had enough effective training in using strategies to accelerate the progress of their most able students.’

Where leadership was effective, leaders placed strong emphasis on creating the right ethos. School leaders had not prioritised embedding a positive ethos at KS3 in 22 of the survey schools.

In half of the survey schools, the most able students said their learning was affected by low-level disruption, though teachers in three-quarters of schools maintained this was rare. Senior leaders also had a more positive view than students.

In 16 of the schools, students thought behaviour was less good in mixed ability classes and staff tended to agree.

.

Transition

‘Inspectors found that the secondary schools visited were not using transition information from primary schools effectively to get the most able off to a flying start in Key Stage 3. Leaders rarely put in place bespoke arrangements for the most able students. In just under half of the schools visited, transition arrangements were not good enough. Some leaders and teachers expressed doubt about the accuracy of Key Stage 2 results. The information that schools gathered was more sophisticated, but, in too many cases, teachers did not use it well enough to make sure students were doing work with the right level of difficulty.’

Too often poor transition arrangements meant students were treading water in KS3. The absence of leadership accountability for transition appeared a factor in stifled progress at KS4 and beyond.

Transfer arrangements with primary schools were not well developed in 16 of the survey schools. Compared with 2013, schools were more likely to find out about pupils’ strengths and weaknesses, but the information was rarely used well.

Secondary schools had more frequent and extended contact with primary schools through subject specialists to identify the most able, but these links were not always used effectively. Only one school had a specific curriculum pathway for such students.

Leaders in four of the ten primary schools surveyed doubted whether secondary schools used transition information effectively.

However, transition worked well in half of the secondary schools.  Six planned the Year 7 curriculum jointly with primary teachers. Leaders had the highest expectations of their staff to ensure that the most able were working at the appropriate level of challenge.

Transition appeared more effective where schools had fewer feeder primaries. About one third of the sample had more than 30 feeder schools, which posed more difficulties, but four of these schools had effective arrangements.

Progression to HE

‘Information, advice and guidance to students about accessing the most appropriate courses and universities were not good enough. There were worrying occasions when schools did too little to encourage the most able students to apply to prestigious universities. The quality of support was too dependent on the skills of individual staff in the schools visited.

While leaders made stronger links with universities to provide disadvantaged students in Key Stages 4 and 5 with a wider range of experiences, they were not evaluating the impact sharply enough. As a result, there was often no way to measure how effectively these links were supporting students in preparing successful applications to the most appropriate courses.’

Support and guidance about university applications is ‘still fragile’ and ‘remains particularly weak’.

Students, especially those from disadvantaged backgrounds, were not getting the IAG they need. Ten survey schools gave no specific support to first generation university attendees or those eligible for the pupil premium.

Forty-nine of the 130 schools asked additional questions did not prioritise the needs of such students. However, personalised mentoring was reported in 16 schools.

In four survey schools students were not encouraged to apply to the top universities.

‘The remnants of misplaced ideas about elitism appear to be stubbornly resistant to change in a very small number of schools. One admissions tutor commented: “There is confusion (in schools) between excellence and elitism”.’

Only a third of survey schools employed dedicated staff to support university applications. Much of the good practice was heavily reliant on the skills of a few individuals. HE admissions staff agreed.

In 13 of the schools visited, students had a limited understanding of the range of opportunities available to them.

Survey schools had a sound understanding of subject requirements for different degree courses. Only about one-quarter engaged early with parents.

.

Ofsted and other Central Government action

‘Ofsted has sharpened its focus on the progress and quality of teaching of the most able students. We routinely comment on the achievement of the most able students in our inspection reports. However, more needs to be done to develop a clearer picture of how well schools use pupil premium funding for their most able students who are disadvantaged and the quality of information, advice and guidance provided for them. Ofsted needs to sharpen its practice in this area.’

‘The Department for Education has developed useful data about students’ destinations when they leave Key Stage 4. However, information about students’ destinations when they leave Key Stage 5 is not as comprehensive and so is less useful.’

.

Ofsted’s recommendations and conclusions

This is a somewhat better Report than its June 2013 predecessor, although it continues to fall into several of the same statistical and presentational traps.

It too is a curate’s egg.

For any student of effective provision for the most able, the broad assessment in the previous section is profoundly unsurprising, but its endorsement by Ofsted gives it added power and significance.

We should be grateful that HMCI has chosen to champion this issue when so many others are content to ignore it.

The overall message can best be summarised by juxtaposing two short statements from the Report, one expressed positively, another negatively:

  • In over half of survey schools, the most able KS3 students were progressing as well as, or better than, others. 
  • The needs of the most able were not being met effectively in the majority of survey schools.

Reading between the lines, the most able students are too often succeeding despite their schools rather than because of them.

What is rather more surprising – and potentially self-defeating – is Ofsted’s insistence on laying the problem almost entirely at the door of schools, and especially of headteachers.

There is most definitely a degree of complacency amongst school leaders about this issue, and Ofsted is quite right to point that out.

The determination of NAHT and ASCL to take offence at the criticism being directed towards headteachers, to use volatility and change as an excuse and to urge greater focus on the pockets of good practice is sufficient evidence of this.

But there is little by way of counterbalance. Too little attention is paid to the question of whether the centre is providing the right support – and the right level of support – to facilitate system-wide improvement. It is as if the ‘school-led, self-improving’ ideal were already firmly in place.

Then again, any commitment on the part of the headteachers’ associations to tackling the root causes of the problem is sadly lacking. Meanwhile, the teachers’ associations ignored the Report completely.

Ofsted criticises this complacency and expresses concern that most of its survey schools:

‘…have been slow in taking forward Ofsted’s previous recommendations, particularly at KS3’

There is a call for renewed effort:

‘Urgent action is now required. Leaders must grasp the nettle and radically transform transition from primary school and the delivery of the Key Stage 3 curriculum. Schools must also revolutionise the quality of information, advice and guidance for their most able students.’

Ofsted’s recommendations for action are set out below. Seven are directed at school leaders, three at Ofsted and one at DfE.

Ofsted capture 5

Ofsted Capture 6

Those Ofsted directs at itself are helpful in some respects.

For example, there is implicit acknowledgement that, until now, inspectors have been insufficiently focused on the most able from disadvantaged backgrounds.

Ofsted stops short of meeting my call for it to produce guidance to help schools and inspectors to understand Ofsted’s expectations.

But it may yet do so. Shortly after publication of the Report, its Director for Schools made a speech confirming that:

‘… inspectors are developing a most able evaluation toolkit for schools, aligned to that which is in place for free school meals’. 

.

.

If Ofsted is prepared to consult experts and practitioners on the content of that toolkit, rather than producing it behind closed doors, it is more likely to be successful.

There are obvious definitional issues stemming from the fact that, according to Ofsted’s current approach, the ‘most able’ population constitutes 40-50% of all learners.

While this helps to ensure relevance to every school, no matter how depressed the attainment of its intake, it also highlights the need for further differentiation of this huge population.

Some of Ofsted’s statistical indicators and benchmarking tools will need sharpening, not least to avoid the pitfalls associated with the inverse relationship between the proportion of high attainers and the proportion of disadvantaged learners.

They might usefully focus explicitly on the distribution and incidence of the disadvantaged most able.

Prospects for success

But the obvious question is this: why should schools be any more likely to respond this time round than in 2013?

Will the references in the Ofsted inspection handbook plus reformed assessment arrangements be sufficient to change schools’ behaviour?

Ofsted is not about to place explicit requirements on the face of the inspection framework.

We are invited to believe that Progress 8 in particular will encourage secondary schools to give due attention to the needs of high attainers.

Yet there is no commitment to the publication of a high attainers’ performance measure (comparable to the equivalent primary measure) or the gap on that measure between those from advantaged and disadvantaged backgrounds.

Data about the performance of secondary high attainers was to have been made available through the now-abandoned Data Portal – and there has been no information about what, if anything, will take its place.

And many believe that the necessary change cannot be achieved by tinkering with the accountability framework.

The specialist organisations are united in one respect: they all believe that schools – and learners themselves – need more direct support if we are to spread current pockets of effective practice throughout the system.

But different bodies have very different views about what form that support should take. Until we can establish the framework necessary to secure universally high standards across all schools without resorting to national prescription, we – and Ofsted – are whistling in the wind.

GP

March 2015

High Attainment in the 2014 Secondary and 16-18 Performance Tables

.

This is my annual analysis of high attainment and high attainers’ performance in the Secondary School and College Performance Tables.

Data Overload courtesy of opensourceway

It draws on the 2014 Secondary and 16-18 Tables, as well as three statistical releases published alongside them: SFR02/2015, SFR03/2015 and SFR06/2015.

It also reports trends since 2012 and 2013, while acknowledging the comparability issues at secondary level this year.

This is a companion piece to previous posts on:

The post opens with the headlines from the subsequent analysis. These are followed by a discussion of definitions and comparability issues.

Two substantive sections deal respectively with secondary and post-16 measures. The post-16 analysis focuses exclusively on A level results. There is a brief postscript on the performance of disadvantaged high attainers.

As ever I apologise in advance for any transcription errors and invite readers to notify me of any they spot, so that I can make the necessary corrections.

.

Headlines

At KS4:

  • High attainers constitute 32.3% of the cohort attending state-funded schools, but this masks some variation by school type. The percentage attending converter academies (38.4%) has fallen by nine percentage points since 2011 but remains almost double the percentage attending sponsored academies (21.2%).
  • Female high attainers (33.7%) continue to outnumber males (32.1%). The percentage of high-attaining males has fallen very slightly since 2013 while the proportion of high-attaining females has slightly increased.
  • 88.8% of the GCSE cohort attending selective schools are high attainers, virtually unchanged from 2013. The percentages in comprehensive schools (30.9%) and modern schools (21.0%) are also little changed.
  • These figures mask significant variation between schools. Ten grammar schools have a GCSE cohort consisting entirely of high attainers but, at the other extreme, one has only 52%.
  • Some comprehensive schools have more high attainers than some grammars: the highest percentage recorded in 2014 by a comprehensive is 86%. Modern schools are also extremely variable, with high attainer populations ranging from 4% to 45%. Schools with small populations of high attainers report very different success rates for them on the headline measures.
  • The fact that 11.2% of the selective school cohort are middle attainers reminds us that 11+ selection is not based on prior attainment. Middle attainers in selective schools perform significantly better than those in comprehensive schools, but worse than high attainers in comprehensives.
  • 92.8% of high attainers in state-funded schools achieved 5 or more GCSEs at grades A*-C (or equivalent) including GCSEs in English and maths. While the success rate for all learners is down by four percentage points compared with 2013, the decline is less pronounced for high attainers (1.9 points).
  • In 340 schools 100% of high attainers achieved this measure, down from 530 in 2013. Fifty-seven schools record 67% or less compared with only 14 in 2013. Four of the 57 had a better success rate for middle attainers than for high attainers.
  • 93.8% of high attainers in state-funded schools achieved GCSE grades A*-C in English and maths. The success rate for high attainers has fallen less than the rate for the cohort as a whole (1.3 points against 2.4 points). Some 470 schools achieved 100% success amongst their high attainers on this measure, down 140 compared with 2013. Thirty-eight schools were at 67% or lower compared with only 12 in 2013. Five of these boast a higher success rate for their middle attainers than their high attainers (and four are the same that do so on the 5+ A*-C including English and maths measure).
  • 68.8% of high attainers were entered for the EBacc and 55% achieved it. The entry rate is up 3.8 percentage points and the success rate up 2.9 points compared with 2013. Sixty-seven schools entered 100% of their high attainers, but only five schools managed 100% success. Thirty-seven schools entered no high attainers at all and 53 had no successful high attainers.
  • 85.6% of high attainers made at least the expected progress in English and 84.7% did so in maths. Both are down on 2013 but much more so in maths (3.1 percentage points) than in English (0.6 points).
  • In 108 schools every high attainer made the requisite progress in English; in 99 schools the same was true of maths. Only 21 schools managed 100% success in both English and maths. At the other extreme there were seven schools in which 50% or fewer made expected progress in both English and maths. Several schools recording 50% or below in either English or maths did significantly better with their middle attainers.
  • In sponsored academies one in four high attainers does not make the expected progress in maths and one in five does not do so in English. In free schools one in every five high attainers falls short in English, as does one in six in maths.

At KS5:

  • 11.9% of students at state-funded schools and colleges achieved AAB grades at A level or higher, with at least two in facilitating subjects. This is a slight fall compared with the 12.1% that did so in 2013. The best-performing state institution had a success rate of 83%.
  • 14.1% of A levels taken in selective schools in 2014 were graded A* and 41.1% were graded A* or A. In selective schools 26.1% of the cohort achieved AAA or higher and 32.3% achieved AAB or higher with at least two in facilitating subjects.
  • Across all schools, independent as well as state-funded, the proportion of students achieving three or more A level grades at A*/A is falling and the gap between the success rates of boys and girls is increasing.
  • Boys are more successful than girls on three of the four high attainment measures, the only exception being the least demanding (AAB or higher in any subjects).
  • The highest recorded A level point score per A level student in a state-funded institution in 2014 is 1430.1, compared with an average of 772.7. The lowest is 288.4. The highest APS per A level entry is 271.1 compared with an average of 211.2. The lowest recorded is 108.6.

Disadvantaged high attainers:

  • On the majority of the KS4 headline measures gaps between FSM and non-FSM performance are increasing, even when the 2013 methodology is applied to control for the impact of the reforms affecting comparability. Very limited improvement has been made against any of the five headline measures between 2011 and 2014. It seems that the pupil premium has had little impact to date on either attainment or progress. Although no separate information is forthcoming about the performance of disadvantaged high attainers, it is highly likely that excellence gaps are equally unaffected.

.

Definitions and comparability issues 

Definitions

The Secondary and 16-18 Tables take very different approaches, since the former deals exclusively with high attainers while the latter concentrates exclusively on high attainment.

The Secondary Tables define high attainers according to their prior attainment on end of KS2 tests. Most learners in the 2014 GCSE cohort will have taken these five years previously, in 2009.

The new supporting documentation describes the distinction between high, middle and low attainers thus:

  • low attaining = those below level 4 in the key stage 2 tests
  • middle attaining = those at level 4 in the key stage 2 tests
  • high attaining = those above level 4 in the key stage 2 tests.

Last year the equivalent statement added:

‘To establish a pupil’s KS2 attainment level, we calculated the pupil’s average point score in national curriculum tests for English, maths and science and classified those with a point score of less than 24 as low; those between 24 and 29.99 as middle, and those with 30 or more as high attaining.’

This is now missing, but the methodology is presumably unchanged.

It means that high attainers will tend to be ‘all-rounders’, whose performance is at least middling in each assessment. Those who are exceptionally high achievers in one area but poor in others are unlikely to qualify.
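
For readers who want the arithmetic spelled out, the quoted thresholds reduce to a very short calculation. The sketch below is purely illustrative – the function and the whole-level point scores in the comments are my additions – but it applies the published cut-offs directly:

```python
def classify_attainer(english, maths, science):
    """Classify a pupil from KS2 test point scores, using the thresholds quoted
    in the 2013 documentation: below 24 = low; 24-29.99 = middle; 30+ = high."""
    aps = (english + maths + science) / 3
    if aps < 24:
        return "low attaining"
    if aps < 30:
        return "middle attaining"
    return "high attaining"

# Whole-level point scores: level 3 = 21, level 4 = 27, level 5 = 33.
print(classify_attainer(27, 27, 27))  # level 4 across the board -> middle attaining
print(classify_attainer(33, 33, 27))  # two level 5s and a level 4 -> high attaining
```

This is also why an exceptional score in one test cannot compensate for weak scores in the other two: it is the average, not the peak, that determines the label.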

There is nothing in the Secondary Tables or the supporting SFRs about high attainment, such as measures of GCSE achievement at grades A*/A.

By contrast, the 16-18 Tables do not distinguish high attainers, but do deploy a high attainment measure:

‘The percentage of A level students achieving grades AAB or higher in at least two facilitating subjects’

Facilitating subjects include:

‘biology, chemistry, physics, mathematics, further mathematics, geography, history, English literature, modern and classical languages.’

The supporting documentation says:

‘Students who already have a good idea of what they want to study at university should check the usual entry requirements for their chosen course and ensure that their choices at advanced level include any required subjects. Students who are less sure will want to keep their options open while they decide what to do. These students might want to consider choosing at least two facilitating subjects because they are most commonly required for entry to degree courses at Russell Group universities. The study of A levels in particular subjects does not, of course, guarantee anyone a place. Entry to university is competitive and achieving good grades is also important.’
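
To make the measure concrete, here is one possible reading expressed as code. This is a sketch only – the Tables’ precise counting rules (resits, equivalences, students with more than three entries) are not spelled out above, and the subject list and function are my own illustration:

```python
FACILITATING = {"biology", "chemistry", "physics", "mathematics",
                "further mathematics", "geography", "history",
                "english literature"}  # plus modern and classical languages

GRADES = ["A*", "A", "B", "C", "D", "E"]  # best to worst

def meets_aab_measure(results):
    """results: dict of subject -> A level grade. Returns True if the student's
    best three grades are at least AAB and at least two of those subjects are
    facilitating - one plausible reading of the published definition."""
    best3 = sorted(results.items(), key=lambda kv: GRADES.index(kv[1]))[:3]
    if len(best3) < 3:
        return False
    achieved = sorted(GRADES.index(grade) for _, grade in best3)
    required = sorted(GRADES.index(g) for g in ("A", "A", "B"))
    grades_ok = all(a <= r for a, r in zip(achieved, required))
    facilitating_ok = sum(subject in FACILITATING for subject, _ in best3) >= 2
    return grades_ok and facilitating_ok

print(meets_aab_measure({"mathematics": "A", "physics": "A", "art": "B"}))  # True
print(meets_aab_measure({"mathematics": "A", "art": "A", "music": "B"}))    # False
```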

The 2013 Tables also included percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, but this has now been dropped.

The Statement of Intent for the 2014 Tables explains:

‘As announced in the government’s response to the consultation on 16-19 accountability earlier this year, we intend to maintain the AAB measure in performance tables as a standard of academic rigour. However, to address the concerns raised in the 16-19 accountability consultation, we will only require two of the subjects to be in facilitating subjects. Therefore, the indicator based on three facilitating subjects will no longer be reported in the performance tables.’

Both these measures appear in SFR03/15, alongside two others:

  • Percentage of students achieving 3 A*-A grades or better at A level or applied single/double award A level.
  • Percentage of students achieving grades AAB or better at A level or applied single/double award A level.

Comparability Issues 

When it comes to analysis of the Secondary Tables, comparisons with previous years are compromised by changes to the way in which performance is measured.

Both SFRs carry an initial warning:

‘Two major reforms have been implemented which affect the calculation of key stage 4 (KS4) performance measures data in 2014:

  1. Professor Alison Wolf’s Review of Vocational Education recommendations, which:
  • restrict the qualifications counted
  • prevent any qualification from counting as larger than one GCSE
  • cap the number of non-GCSEs included in performance measures at two per pupil
  2. An early entry policy to only count a pupil’s first attempt at a qualification.’

SFR02/15 explains that some data has been presented ‘on two alternative bases’:

  • Using the 2014 methodology with the changes above applied and
  • Using a proxy 2013 methodology where the effect of these two changes has been removed.

It points out that more minor changes have not been accounted for, including the removal of unregulated IGCSEs, the application of discounting across different qualification types, the shift to linear GCSE formats and the removal of the speaking and listening component from English.

Moreover, the proxy measure does not:

‘…isolate the impact of changes in school behaviour due to policy changes. For example, we can count best entry results rather than first entry results but some schools will have adjusted their behaviours according to the policy changes and stopped entering pupils in the same patterns as they would have done before the policy was introduced.’

Nevertheless, the proxy is the best available guide to what outcomes would have been had the two reforms above not been introduced. Unfortunately, it has been applied rather sparingly.
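
The practical difference between the two bases is easiest to see with the early entry rule. The sketch below is a simplified illustration rather than the DfE’s actual calculation – the data format and function are invented for the purpose – but it shows how the same pupil can be scored on a first-entry (2014 methodology) or best-entry (proxy 2013) basis:

```python
GRADE_ORDER = ["A*", "A", "B", "C", "D", "E", "F", "G", "U"]  # best to worst

def counted_grade(attempts, first_entry_only):
    """attempts: list of (date, grade) tuples for one pupil in one subject."""
    attempts = sorted(attempts)  # chronological order ("YYYY-MM" sorts correctly)
    if first_entry_only:
        return attempts[0][1]    # 2014 rule: the first attempt is the one counted
    # proxy 2013 basis: the best grade achieved across all attempts counts
    return min((grade for _, grade in attempts), key=GRADE_ORDER.index)

# A pupil entered early, who then improved at the summer sitting:
attempts = [("2013-11", "D"), ("2014-06", "B")]
print(counted_grade(attempts, first_entry_only=True))   # D - 2014 methodology
print(counted_grade(attempts, first_entry_only=False))  # B - proxy 2013 basis
```

As the SFR caveat above notes, even this proxy cannot recover what schools would have done had the incentive to enter early never changed.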

Rather than ignore trends completely, this post includes information about changes in high attainers’ GCSE performance compared with previous years, not least so readers can see the impact of the changes that have been introduced.

It is important that we do not allow the impact of these changes to be used as a smokescreen masking negligible improvement or even declines in national performance on key measures.

But we cannot escape the fact that the 2014 figures are not fully comparable with those for previous years. Several of the tables in SFR06/2015 carry a warning in red to this effect (but not those in SFR 02/2015).

A few less substantive changes also impact slightly on the comparability of A level results: the withdrawal of January examinations and ‘automatic add back’ of students whose results were deferred from the previous year because they had not completed their 16-18 study programme.

.

Secondary outcomes

. 

The High Attainer Population 

The Secondary Performance Tables show that there were 172,115 high attainers from state-funded schools within the relevant cohort in 2014, who together account for 32.3% of the entire state-funded school cohort.

This is some 2% fewer than the 175,797 recorded in 2013, which constituted 32.4% of that year’s cohort.

SFR02/2015 provides information about the incidence of high, middle and low attainers by school type and gender.

Chart 1, below, compares the proportion of high attainers by type of school, showing changes since 2011.

The high attainer population across all state-funded mainstream schools has remained relatively stable over the period and currently stands at 32.9%. The corresponding percentage in LA-maintained mainstream schools is slightly lower: the difference is exactly two percentage points in 2014.

High attainers constitute only around one-fifth of the student population of sponsored academies, but close to double that in converter academies. The former percentage is relatively stable but the latter has fallen by some nine percentage points since 2011, presumably as the size of this sector has increased.

The percentage of high attainers in free schools is similar to that in converter academies but has fluctuated over the three years for which data is available. The comparison between 2014 and previous years will have been affected by the inclusion of UTCs and studio schools prior to 2014.

.

HA sec1

*Pre-2014 includes UTCs and studio schools; 2014 includes free schools only

Chart 1: Percentage of high attainers by school type, 2011-2014

. 

Table 1 shows that, in each year since 2011, there has been a slightly higher percentage of female high attainers than male, the gap varying between 0.4 percentage points (2012) and 1.8 percentage points (2011).

The percentage of high-attaining boys in 2014 is the lowest it has been over this period, while the percentage of high attaining girls is slightly higher than it was in 2013 but has not returned to 2011 levels.

Year Boys Girls
2014 32.1 33.7
2013 32.3 33.3
2012 33.4 33.8
2011 32.6 34.4

Table 1: Percentage of high attainers by gender, all state-funded mainstream schools 2011-14

Table 2 shows that the percentage of high attainers in selective schools is almost unchanged from 2013, at just under 89%. This compares with almost 31% in comprehensive schools, unchanged from 2013, and 21% in modern schools, the highest it has been over this period.

The 11.2% of learners in selective schools who are middle attainers reminds us that selection by ability through 11-plus tests produces a somewhat different sample from selection exclusively on the basis of KS2 attainment.

. 

Year Selective Comprehensive Modern
2014 88.8 30.9 21.0
2013 88.9 30.9 20.5
2012 89.8 31.7 20.9
2011 90.3 31.6 20.4

Table 2: Percentage of high attainers by admissions practice, 2011-14

The SFR shows that these middle attainers in selective schools are less successful than their high attaining peers, and slightly less successful than high attainers in comprehensives, but they are considerably more successful than middle attaining learners in comprehensive schools.

For example, in 2014 the 5+ A*-C grades including English and maths measure is achieved by:

  • 97.8% of high attainers in selective schools
  • 92.2% of high attainers in comprehensive schools
  • 88.1% of middle attainers in selective schools and
  • 50.8% of middle attainers in comprehensive schools.

A previous post ‘The Politics of Selection: Grammar schools and disadvantage’ (November 2014) explored how some grammar schools are significantly more selective than others – as measured by the percentage of high attainers within their GCSE cohorts – and the fact that some comprehensives are more selective than some grammar schools.

This is again borne out by the 2014 Performance Tables, which show that 10 selective schools have a cohort consisting entirely of high attainers, the same as in 2013. Eighty-nine selective schools have a high attainer population of 90% or more.

However, five are at 70% or below, with the lowest – Dover Grammar School for Boys – registering only 52% high attainers.

By comparison, comprehensives such as King’s Priory School, North Shields and Dame Alice Owen’s School, Potters Bar record 86% and 77% high attainers respectively. 

There is also huge variation in modern schools, from Coombe Girls’ in Kingston, at 45%, just seven percentage points shy of the lowest recorded in a selective school, to The Ellington and Hereson School, Ramsgate, at just 4%.

Two studio colleges say they have no high attainers at all, while 96 schools have 10% or fewer. A significant proportion of these are academies located in rural and coastal areas.

Even though results are suppressed where there are too few high attainers, it is evident that these small cohorts perform very differently in different schools.

Amongst those with a high attainer population of 10% or fewer, the proportion achieving:

  • 5+ A*-C grades including English and maths varies from 44% to 100%
  • EBacc ranges from 0% to 89%
  • expected progress in English varies between 22% and 100% and expected progress in maths between 27% and 100%. 

5+ GCSEs (or equivalent) at A*-C including GCSEs in English and maths 

The Tables show that:

  • 92.8% of high attainers in state-funded schools achieved five or more GCSEs (or equivalent) including GCSEs in English and maths. This compares with 56.6% of all learners. Allowing of course for the impact of 2014 reforms, the latter is a full four percentage points down on the 2013 outcome. By comparison, the outcome for high attainers is down 1.9 percentage points, slightly less than half the overall decline. Roughly one in every fourteen high attainers fails to achieve this benchmark.
  • 340 schools achieve 100% on this measure, significantly fewer than the 530 that did so in 2013 and the 480 managing this in 2012. In 2013, 14 schools registered a success rate of 67% or less for their high attainers, whereas in 2014 this number has increased substantially, to 57 schools. Five schools record 0%, including selective Bourne Grammar School, Lincolnshire, hopefully because of their choice of IGCSEs. Six more are at 25% or lower.

. 

A*-C grades in GCSE English and maths 

The Tables reveal that:

  • 93.8% of high attainers in state-funded schools achieved A*-C grades in GCSE English and maths, compared with 58.9% of all pupils. The latter percentage is down by 2.4 percentage points but the former has fallen by only 1.3 percentage points. Roughly one in 16 high attainers fails to achieve this measure.
  • In 2014 the number of schools with 100% of high attainers achieving this measure has fallen to some 470, 140 fewer than in 2013 and 60 fewer than in 2012. There were 38 schools recording 67% or lower, a significant increase compared with 12 in 2013 and 18 in 2012. Of these, four are listed at 0% (Bourne Grammar is at 1%) and five more are at 25% or lower.
  • Amongst the 38 schools recording 67% or lower, five return a higher success rate for their middle attainers than for their high attainers. Four of these are the same that do so on the 5+ A*-C measure above. They are joined by Tong High School. 

Entry to and achievement of the EBacc 

The Tables indicate that:

  • 68.8% of high attainers in state-funded schools were entered for all EBacc subjects and 55.0% achieved the EBacc. The entry rate is up by 3.8 percentage points compared with 2013, and the success rate is up by 2.9 percentage points. By comparison, 31.5% of middle attainers were entered (up 3.7 points) and 12.7% passed (up 0.9 points). Between 2012 and 2013 the entry rate for high attainers increased by 19 percentage points, so the rate of improvement has slowed significantly. Given the impending introduction of the Attainment 8 measure, commitment to the EBacc is presumably waning.
  • Thirty-seven schools entered no high attainers for the EBacc, compared with 55 in 2013 and 186 in 2012. Only 53 schools had no high attainers achieving the EBacc, compared with 79 in 2013 and 235 in 2012. Of these 53, 11 recorded a positive success rate for their middle attainers, though the difference was relatively small in all cases.

At least 3 Levels of Progress in English and maths

The Tables show that:

  • Across all state-funded schools 85.6% of high attainers made at least the expected progress in English while 84.7% did so in maths. The corresponding figures for middle attainers are 70.2% in English and 65.3% in maths. Compared with 2013, the percentages for high attainers are down 0.6 percentage points in English and down 3.1 percentage points in maths, presumably because the first entry only rule has had more impact in the latter. Even allowing for the depressing effect of the changes outlined above, it is unacceptable that more than one in every seven high attainers fails to make the requisite progress in each of these core subjects, especially when the progress expected is relatively undemanding for such students (see the sketch after this list).
  • There were 108 schools in which every high attainer made at least the expected progress in English, exactly the same as in 2013. There were 99 schools which achieved the same outcome in maths, down significantly from 120 in 2013. In 2013 there were 36 schools which managed this in both English and maths, but only 21 did so in 2014.
  • At the other extreme, four schools recorded no high attainers making the expected progress in English, presumably because of their choice of IGCSE. Sixty-five schools were at or below 50% on this measure. In maths 67 schools were at or below 50%, but the lowest recorded outcome was 16%, at Oasis Academy, Hextable.
  • Half of the schools achieving 50% or less with their high attainers in English or maths also returned better results with middle attainers. Particularly glaring differentials in English include Red House Academy (50% middle attainers and 22% high attainers) and Wingfield Academy (73% middle attainers; 36% high attainers). In maths the worst examples are Oasis Academy Hextable (55% middle attainers and 16% high attainers), Sir John Hunt Community Sports College (45% middle attainers and 17% high attainers) and Roseberry College and Sixth Form (now closed) (49% middle attainers and 21% high attainers).
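
As a reminder of what the measure demands, ‘expected progress’ converts a pupil’s KS2 level into a minimum GCSE grade three levels higher. The mapping below is my recollection of the DfE definition, so treat it as indicative rather than authoritative:

```python
# Three levels of progress: KS2 level -> minimum GCSE grade counted as expected
EXPECTED_GCSE_GRADE = {3: "D", 4: "C", 5: "B"}

# A high attainer (above level 4 at KS2, typically level 5) therefore only needs
# a grade B - well below the A*/A outcomes such students frequently achieve.
print(EXPECTED_GCSE_GRADE[5])  # B
```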

Comparing achievement of these measures by school type

SFR02/2015 compares the performance of high attainers in different types of school on each of the five measures discussed above. This data is presented in Chart 2 below.

.

HA sec2 

Chart 2: Comparison of high attainers’ GCSE performance by type of school, 2014

.

It shows that:

  • There is significant variation on all five measures, though the differences are most pronounced for achievement of the EBacc, where there is a 20 percentage point difference between the success rates in sponsored academies (39.2%) and in converter academies (59.9%).
  • Converter academies are the strongest performers across the board, while sponsored academies are consistently the weakest. LA-maintained mainstream schools out-perform free schools on four of the five measures, the only exception being expected progress in maths.
  • Free schools and converter academies achieve stronger performance on progress in maths than on progress in English, but the reverse is true in sponsored academies and LA-maintained schools.
  • Sponsored academies and free schools are both registering relatively poor performance on the EBacc measure and the two progress measures.
  • One in four high attainers in sponsored academies fails to make the requisite progress in maths, while one in five fails to do so in English. Moreover, one in five high attainers in free schools fails to make the expected progress in English, as does one in six in maths. These rates are unacceptably low.

Comparisons with 2013 outcomes show a general decline, with the exception of EBacc achievement.

This is particularly pronounced in sponsored academies, where there have been falls of 5.2 percentage points on 5+ A*-Cs including English and maths, 5.7 points on A*-C in English and maths and 4.7 points on expected progress in maths. However, expected progress in English has held up well by comparison, with a fall of just 0.6 percentage points.

Progress in maths has declined more than progress in English across the board. In converter academies progress in maths is down 3.1 points, while progress in English is down 1.1 points. In LA-maintained schools, the corresponding falls are 3.4 and 0.4 points respectively.

EBacc achievement is up by 4.5 percentage points in sponsored academies, 3.1 points in LA-maintained schools and 1.8 points in converter academies.

.

Comparing achievement of these measures by school admissions basis 

SFR02/2015 compares the performance of high attainers in selective, comprehensive and modern schools on these five measures. Chart 3 illustrates these comparisons.

.

HA sec3

Chart 3: Comparison of high attainers’ GCSE performance by school admissions basis, 2014

.

It is evident that:

  • High attainers in selective schools outperform those in comprehensive schools on all five measures. The biggest difference is in relation to EBacc achievement (21.6 percentage points). There is a 12.8 point advantage in relation to expected progress in maths and an 8.7 point advantage on expected progress in English.
  • Similarly, high attainers in comprehensive schools outperform those in modern schools. They enjoy a 14.7 percentage point advantage in relation to achievement of the EBacc, but, otherwise, the differences are between 1.6 and 3.5 percentage points.
  • Hence, by and large, the gap between the performance of high attainers in modern and comprehensive schools is smaller than the gap between high attainers in comprehensive and selective schools.
  • Only selective schools are more successful in achieving expected progress in maths than they are in English. It is a cause for some concern that, even in selective schools, 6.5% of pupils are failing to make at least three levels of progress in English.

Compared with 2013, results have typically improved in selective schools but worsened in comprehensive and modern schools. For example:

  • Achievement of the 5+ GCSE measure is up 0.5 percentage points in selective schools but down 2.3 points in comprehensives and modern schools.
  • In selective schools, the success rate for expected progress in English is up 0.5 points and in maths it is up 0.4 points. However, in comprehensive schools progress in English and maths are both down, by 0.7 points and 3.5 points respectively. In modern schools, progress in English is up 0.3 percentage points while progress in maths is down 4.1 percentage points.

When it comes to EBacc achievement, the success rate is unchanged in selective schools, up 3.1 points in comprehensives and up 5 points in modern schools.

. 

Other measures

The Secondary Performance Tables also provide information about the performance of high attainers on several other measures, including:

  • Average Points Score (APS): Annex B of the Statement of Intent says that, as in 2013, the Tables will include APS (best 8) for ‘all qualifications’ and ‘GCSEs only’. At the time of writing, only the former appears in the 2014 Tables. For high attainers, the APS (best 8) all qualifications across all state-funded schools is 386.2, which compares unfavourably with 396.1 in 2013. Four selective schools managed to exceed 450 points: Pate’s Grammar School (455.1); The Tiffin Girls’ School (452.1); Reading School (451.4); and Colyton Grammar School (450.6). The best result in 2013 was 459.5, again at Colyton Grammar School. At the other end of the table, only one school returns a score of under 250 for their high attainers, Pent Valley Technology College (248.1). The lowest recorded score in 2013 was significantly higher at 277.3.
  • Value Added (best 8) prior attainment: The VA score for all state-funded schools in 2014 is 1000.3, compared with 1001.5 in 2013. Five schools returned a result over 1050, whereas four did so in 2013. The 2014 leaders are: Tauheedul Islam Girls School (1070.7); Yesodey Hatorah Senior Girls School (1057.8); The City Academy Hackney (1051.4); The Skinner’s School (1051.2); and Hasmonean High School (1050.9). At the other extreme, 12 schools were at 900 or below, compared with just three in 2013. The lowest performer on this measure is Hull Studio School (851.2). 
  • Average grade: As in the case of APS, the average grade per pupil per GCSE has not yet materialised. The average grade per pupil per qualification is supplied. Five selective schools return an average grade of A*-: Henrietta Barnett, Pate’s, Reading School, Tiffin Girls and Tonbridge Grammar. Only Henrietta Barnett and Pate’s managed this in 2013.
  • Number of exam entries: Yet again we only have number of entries for all qualifications and not for GCSE only. The average number of entries per high attainer across state-funded schools is 10.4, compared with 12.1 in 2013. This 1.7 reduction is smaller than for middle attainers (down 2.5 from 11.4 to 8.9) and low attainers (down 3.7 from 10.1 to 6.4). The highest number of entries per high attainer was 14.2 at Gable Hall School and the lowest was 5.9 at The Midland Studio College Hinkley.

16-18: A level outcomes

.

A level grades AAB or higher in at least two facilitating subjects 

The 16-18 Tables show that 11.9% of students in state-funded schools and colleges achieved AAB+ with at least two in facilitating subjects. This is slightly lower than the 12.1% recorded in 2013.

The best-performing state-funded institution is a further education college, Cambridge Regional College, which records 83%. The only other state-funded institution above 80% is The Henrietta Barnett School. At the other end of the spectrum, some 443 institutions are at 0%.

Table 3, derived from SFR03/2015, shows how performance on this measure has changed since 2013 for different types of institution and, for schools, by admissions arrangements.

.

2013 2014
LA-maintained school 11.4 11.5
Sponsored academy 5.4 5.3
Converter academy 16.4 15.7
Free school* 11.3 16.4
Sixth form college 10.4 10.0
Other FE college 5.8 5.7

Selective school 32.4 32.3
Comprehensive school 10.7 10.5
Modern school 2.0 3.2

Table 3: Percentage of A level students achieving AAB+ with at least two in facilitating subjects, by institution type and admissions basis, 2013 and 2014

.

The substantial change for free schools is partly an artefact of category changes: UTCs and studio schools were included in that line in 2013, while city technology colleges and 16-19 free schools were added in 2014.

Otherwise the general trend is slightly downwards but LA-maintained schools have improved very slightly and modern schools have improved significantly.

.

Other measures of high A level attainment

SFR03/15 provides outcomes for three other measures of high A level attainment:

  • 3 A*/A grades or better at A level, or applied single/double award A level
  • Grades AAB or better at A level, or applied single/double award A level
  • Grades AAB or better at A level all of which are in facilitating subjects.

Chart 4, below, compares performance across all state-funded schools and colleges on all four measures, showing results separately for boys and girls.

Boys are in the ascendancy on three of the four measures, the one exception being AAB grades or higher in any subjects. The gaps are more substantial where facilitating subjects are involved.

.

HA sec4

Chart 4: A level high attainment measures by gender, 2014

.

The SFR provides a time series for the achievement of the 3+ A*/A measure, for all schools – including independent schools – and colleges. The 2014 success rate is 12.0%, down 0.5 percentage points compared with 2013.

The trend over time is shown in Chart 5 below. This shows how results for boys and girls alike are slowly declining, having reached their peak in 2010/11. Boys established a clear lead from that year onwards.

As they decline, the lines for boys and girls are steadily diverging since girls’ results are falling more rapidly. The gap between boys and girls in 2014 is 1.3 percentage points.

.

HA sec5

Chart 5: Achievement of 3+ A*/A grades in independent and state-funded schools and in colleges, 2006-2014

.

Chart 6 compares performance on the four different measures by institutional type. It shows a similar pattern across the piece.

Success rates tend to be highest in either converter academies or free schools, while sponsored academies and other FE institutions tend to bring up the rear. LA-maintained schools and sixth form colleges lie midway between.

Converter academies outscore free schools when facilitating subjects do not enter the equation, but the reverse is true when they do. There is a similar relationship between sixth form colleges and LA-maintained schools, but it does not quite hold with the final pair.

. 

HA sec6 

Chart 6: Proportion of students achieving different A level high attainment measures by type of institution, 2014

.

Chart 7 compares performance by admissions policy in the schools sector on the four measures. Selective schools enjoy a big advantage on all four: more than one in four selective school students achieve at least three A grades, and almost one in three achieves AAB+ with at least two in facilitating subjects.

There is a broadly similar relationship across all the measures, in that comprehensive schools record roughly three times the rates achieved in modern schools and selective schools manage roughly three times the success rates in comprehensive schools. 

. 

HA sec7 

Chart 7: Proportion of students achieving different A level high attainment measures by admissions basis in schools, 2014

 .

Other Performance Table measures 

Some of the other measures in the 16-18 Tables are relevant to high attainment:

  • Average Point Score per A level student: The APS per student across all state-funded schools and colleges is 772.7, down slightly on the 782.3 recorded last year. The highest recorded APS in 2014 is 1430.1, at Colchester Royal Grammar School. This is almost 100 points ahead of the next best school, Colyton Grammar, but well short of the highest score in 2013, which was 1650. The lowest APS for a state-funded school in 2014 is 288.4 at Hartsdown Academy, which also returned the lowest score in 2013.
  • Average Point Score per A level entry: The APS per A level entry for all state-funded institutions is 211.2, almost identical to the 211.3 recorded in 2013. The highest score attributable to a state-funded institution is 271.1 at The Henrietta Barnett School, very slightly lower than the 271.4 achieved by Queen Elizabeth’s Barnet in 2013. The lowest is 108.6, again at Hartsdown Academy, though this exceeds the 2013 low of 97.7 at Appleton Academy.
  • Average grade per A level entry: The average grade across state-funded schools and colleges is C. The highest average grade returned in the state-funded sector is A, at The Henrietta Barnett School, Pate’s Grammar School, Queen Elizabeth’s Barnet and Tiffin Girls School. In 2013 only the two Barnet schools achieved the same outcome. At the other extreme, an average U grade is returned by Hartsdown Academy, Irlam and Cadishead College and Swadelands School. (The sketch below shows how these point scores map onto grades.)
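
For context, the per-entry scores map onto grades via the point scale used in the Tables, where (as I recall it – verify against the supporting documentation) each A level grade is worth 30 points more than the one below, from E at 150 up to A* at 300. The sketch below shows why an APS per entry of 211.2 corresponds to an average grade of C, and 271.1 to an A:

```python
# Assumed point scores per A level entry: 30-point steps from E=150 to A*=300
POINTS = {"A*": 300, "A": 270, "B": 240, "C": 210, "D": 180, "E": 150}

def nearest_grade(aps_per_entry):
    """Return the grade whose tariff is closest to a given APS per entry."""
    return min(POINTS, key=lambda grade: abs(POINTS[grade] - aps_per_entry))

print(nearest_grade(211.2))  # C - the average across state-funded institutions
print(nearest_grade(271.1))  # A - The Henrietta Barnett School
```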

SFR06/2015 also supplies the percentage of A* and A*/A grades by type of institution and schools’ admissions arrangements. The former is shown in Chart 8 and the latter in Chart 9 below.

The free school comparisons are affected by the changes to this category described above.

Elsewhere the pattern is rather inconsistent. Success rates at A* exceed those set in 2012 and 2013 in LA-maintained schools, sponsored academies, sixth form colleges and other FE institutions. Meanwhile, A*/A grades combined are lower than both 2012 and 2013 in converter academies and sixth form colleges.

.

HA sec8

Chart 8: A level A* and A*/A performance by institutional type, 2012 to 2014

. 

Chart 9 shows A* performance exceeding the success rates for 2012 and 2013 in all three sectors.

When both grades are included, success rates in selective schools have returned almost to 2012 levels following a dip in 2013, while there has been little change across the three years in comprehensive schools and a clear improvement in modern schools, which also experienced a dip last year.

HA sec9

Chart 9: A level A* and A*/A performance in schools by admissions basis, 2012 to 2014.

 .

Disadvantaged high attainers 

There is nothing in either of the Performance Tables or the supporting SFRs to enable us to detect changes in the performance of disadvantaged high attainers relative to their more advantaged peers.

I dedicated a previous post to the very few published statistics available to quantify the size of these excellence gaps and establish if they are closing, stable or widening.

There is continuing uncertainty whether this will be addressed under the new assessment and accountability arrangements to be introduced from 2016.

Although results for all high attainers appear to be holding up better than those for middle and lower attainers, the evidence suggests that FSM and disadvantaged gaps at lower attainment levels are proving stubbornly resistant to closure.

Data from SFR06/2015 is presented in Charts 10-12 below.

Chart 10 shows that, when the 2014 methodology is applied, three of the gaps on the five headline measures increased in 2014 compared with 2013.

That might have been expected given the impact of the changes discussed above but, if the 2013 methodology is applied, so stripping out much (but not all) of the impact of these reforms, four of the five headline gaps worsened and the original three are even wider.

This seems to support the hypothesis that the reforms themselves are not driving this negative trend, although Teach First has suggested otherwise.

.

HA sec10

Chart 10: FSM gaps for headline GCSE measures, 2013-2014

.

Chart 11 shows how FSM gaps have changed on each of these five measures since 2011. Both sets of 2014 figures are included.

Compared with 2011, there has been improvement on two of the five measures, while two or three have deteriorated, depending on which methodology is applied for 2014.

Since 2012, only one measure has improved (expected progress in English), and then only by around one percentage point, depending on which 2014 methodology is selected.

The deteriorations have been small, however, suggesting that FSM gaps have been relatively stable over this period, despite their closure being a top priority for the Government, backed by extensive pupil premium funding.

.

HA sec11

Chart 11: FSM/other gaps for headline GCSE measures, 2011 to 2014.

.

Chart 12 shows a slightly more positive pattern for the gaps between disadvantaged learners (essentially ‘ever 6 FSM’ and looked after children) and their peers.

There have been improvements on four of the five headline measures since 2011. But since 2012 only one or two of the measures have improved, depending on which 2014 methodology is selected. Compared with 2013, either three or four of the 2014 headline measures are down.

The application of the 2013 methodology in 2014, rather than the 2014 methodology, causes all five of the gaps to increase, reinforcing the point made above.

It is unlikely that this pattern will be any different at higher attainment levels, but evidence to prove or disprove this remains disturbingly elusive.

.

HA sec12

Chart 12: Disadvantaged/other gaps for headline GCSE measures, 2011 to 2014

.

Taken together, this evidence does not provide a ringing endorsement of the Government’s strategy for closing these gaps.

There are various reasons why this might be the case:

  • It is too soon to see a significant effect from the pupil premium or other Government reforms: This is the most likely defensive line, although it begs the question why more urgent action was/is discounted.
  • Pupil premium is insufficiently targeted at the students/school that need it most: This is presumably what underlies the Fair Education Alliance’s misguided recommendation that pupil premium funding should be diverted away from high attaining disadvantaged learners towards their lower attaining peers.
  • Schools enjoy too much flexibility over how they use the pupil premium and too many are using it unwisely: This might point towards more rigorous evaluation, tighter accountability mechanisms and stronger guidance.
  • Pupil premium funding is too low to make a real difference: This might be advanced by institutions concerned at the impact of cuts elsewhere in their budgets.
  • Money isn’t the answer: This might suggest that the pupil premium concept is fundamentally misguided and that the system as a whole needs to take a different or more holistic approach.

I have proposed a more targeted method of tackling secondary excellence gaps and simultaneously strengthening fair access, where funding topsliced from the pupil premium is fed into personal budgets for disadvantaged high attainers.

These would meet the cost of coherent, long-term personalised support programmes, co-ordinated by their schools and colleges, which would access suitable services from a ‘managed market’ of suppliers.

.

Conclusion

This analysis suggests that high attainers, particularly those in selective schools, have been relatively less affected by the reforms that have depressed GCSE results in 2014.

While we should be thankful for small mercies, three issues are of particular concern:

  • There is a stubborn and serious problem with the achievement of expected progress in both English and maths. It cannot be acceptable that approximately one in seven high attainers fails to make three levels of progress in each core subject when this is a relatively undemanding expectation for those with high prior attainment. This issue is particularly acute in sponsored academies where one in four or five high attainers are undershooting their progress targets.
  • Underachievement amongst high attainers is prevalent in far too many state-funded schools and colleges. At KS4 there are huge variations in the performance of high-attaining students depending on which schools they attend. A handful of schools achieve better outcomes with their middle attainers than with their high attainers. This ought to be a strong signal, to the schools as well as to Ofsted, that something serious is amiss.
  • Progress in closing KS4 FSM gaps continues to be elusive, despite this being a national priority, backed up by a pupil premium budget of £2.5bn a year. In the absence of data about the performance of disadvantaged high attainers, we can only assume that this is equally true of excellence gaps.

.

GP

February 2015

A Primary Assessment Progress Report

.

This post tracks progress towards the introduction of the primary assessment and accountability reforms introduced by England’s Coalition Government.

It reviews developments since the Government’s consultation response was published, as well as the further action required to ensure full and timely implementation.

It considers the possibility of delay as a consequence of the May 2015 General Election and the potential impact of a new government with a different political complexion.

An introductory section outlines the timeline for reform. This is followed by seven thematic sections dealing with:

There are page jumps from each of the bullets above, should readers wish to refer to these specific sections.

Each section summarises briefly the changes and commitments set out in the consultation response (and in the original consultation document where these appear not to have been superseded).

Each then reviews in more detail the progress made to date, itemising the tasks that remain outstanding.

I have included deadlines for all outstanding tasks. Where these are unknown I have made a ‘best guess’ (indicated by a question mark after the date).

I have done my best to steer a consistent path through the variety of material associated with these reforms, pointing out apparent conflicts between sources wherever these exist.

A final section considers progress across the reform programme as a whole – and how much remains to be done.

It discusses the likely impact of Election Purdah and the prospects for changes in direction consequent upon the outcome of the Election.

I have devoted previous posts to ‘Analysis of the Primary Assessment and Accountability Consultation Document’ (July 2013) and to the response in ‘Unpacking the Primary Assessment and Accountability Reforms’ (April 2014) so there is inevitably some repetition here, for which I apologise.

This is a long and complex post, even by my standards. I have tried to construct the big picture from a variety of different sources, to itemise all the jigsaw pieces already in place and all those that are still missing.

If you spot any errors or omissions, do let me know and I will do my best to correct them.

.

[Postscript: Please note that I have added several further postscripts to this document since the original date of publication. If you are revisiting, do pause at the new emboldened paragraphs below.]

Timeline for Reform

The consultation document ‘Primary assessment and accountability under the new national curriculum’ was published on 7 July 2013.

It contained a commitment to publish a response in ‘autumn 2013’, but ‘Reforming assessment and accountability for primary schools’ did not appear until March 2014.

The implementation timetable has to be inferred from a variety of sources but seems to be as shown in the table below. (I have set aside interim milestones until the thematic sections below.)

Month/year Action
Sept 2014 Schools no longer expected to use levels for non-statutory assessment.
May 2015 End of KS1 and KS2 national curriculum tests and statutory teacher assessment reported through levels for the final time.
Summer term 2015 Final 2016 KS1 and KS2 test frameworks, sample materials and mark schemes published; guidance published on reporting of test results.
Sept 2015 Schools can use approved reception baseline assessments (or a KS1 baseline).
Sept/Autumn term 2015 New performance descriptors for statutory teacher assessment published.
Dec 2015 Primary Performance Tables use levels for the final time.
May 2016 New KS1 and KS2 tests introduced, reported through new attainment and progress measures.
June 2016 Statutory teacher assessment reported through new performance descriptors.
Sept 2016 Reception baseline assessment becomes the only baseline option for all-through primaries; schools must publish new headline measures on their websites; new floor standards come into effect (with the progress element still derived from the KS1 baseline).
Dec 2016 New attainment and performance measures published in Primary Performance Tables.

The General Election takes place on 7 May 2015, but pre-Election Purdah will commence on 30 March, almost exactly a year on from publication of the consultation response.

At the time of writing, some 40 weeks have elapsed since the response was published – and there are some 10 weeks before Purdah descends.

Assuming that the next Government is formed within a week of the Election (which might be optimistic), there is a second working period of roughly 10 weeks between that and the end of the AY 2014/15 summer term.

The convention is that schools are notified of all significant assessment and accountability reforms a full academic year before implementation, allowing them sufficient time to plan.

A full year’s lead time is no longer sacrosanct (and has already been set aside in some instances below) but any shorter notification period may have significant implications for teacher workload – something that the Government is committed to tackling.

.

[Postscript: On 6 February the Government published its response to the Workload Challenge, which contained a commitment to introduce, from ‘Spring 2015’, a:

‘DfE Protocol setting out minimum lead-in times for significant curriculum, qualifications and accountability changes…’

Elsewhere the text says that the minimum lead time will be a year, thus reinforcing the convention described above.

The term ‘significant’ allows some wriggle room, but one might reasonably expect it to be applied to some of the outstanding actions below.

The Protocol was published on 23 March. The first numbered paragraph implicitly defines a significant change as one having ‘a significant workload impact on schools’, though what constitutes significance (and who determines it) is left unanswered.

There is provision for override ‘in cases where change is urgently required’ but criteria for introducing an override are not supplied.]

.

.

[Postscript: We now know that a minimum lead time will not be applied to the introduction of new performance descriptors for statutory teacher assessment (see below). The original timescale fell short of a year’s notice and has not been adjusted in the light of consultation.]

.

Announcements made during the long summer holiday are much disliked by schools, so the end of summer term 2015 becomes the de facto target for any reforms requiring implementation from September 2016.

One might therefore conclude that:

  • We are about two-thirds of the way through the main implementation period.
  • There is a period of some 100 working days in which to complete the reforms expected to be notified to schools before the end of the AY2014/15 summer term. This is divided into two windows of some 50 working days on either side of Purdah (a rough check on this arithmetic is sketched after this list).
  • There is some scope to extend more deadlines into the summer break and autumn 2015, but the costs of doing so – including loss of professional goodwill – might outweigh the benefits.
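For readers who want to verify that arithmetic, here is a quick back-of-envelope check in Python. The precise dates are my own assumptions inferred from the discussion above, and numpy’s busday_count excludes weekends but not bank holidays, so the true totals will be slightly lower:

```python
import numpy as np

# Window 1: from roughly the time of writing (assumed 19 January 2015)
# to the start of Purdah on 30 March 2015.
window_1 = np.busday_count('2015-01-19', '2015-03-30')

# Window 2: from a new government formed about a week after the Election
# (assumed 14 May 2015) to an assumed end of summer term (22 July 2015).
window_2 = np.busday_count('2015-05-14', '2015-07-22')

print(window_1, window_2, window_1 + window_2)  # 50 49 99 -- roughly 100 working days
```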

Purdah will act as a brake on progress across the piece. It will delay announcements that might otherwise have been made in April and early May, such as those related to new tests scheduled for May 2016.

The implications of Purdah are discussed further in the final section of this post.

.

Reception Baseline Assessment

Consultation response

A new Reception Baseline will be introduced from September 2015. This will be undertaken by children within their first few weeks of school (so not necessarily during the first half of the autumn term).

Teachers will be able to select from a range of assessments ‘but most are likely to be administered by the reception teaching staff’.  Assessments will be ‘short’ and ‘sit within teachers’ broader assessments of children’s development’.

They will be:

‘…strong predictors of key stage 1 and key stage 2 attainment whilst reflecting the age and abilities of children in reception’

Schools that use an approved baseline assessment ‘in September 2015’ (and presumably later during the 2015/16 academic year) will have their progress measured in 2022 against that or a KS1 baseline, whichever gives the best result.

However, only the reception baseline will be available from September 2016 and, from this point, the Early Years Foundation Stage (EYFS) profile will no longer be compulsory.

The reception baseline will not be compulsory either, since:

‘Schools that choose not to use an approved baseline assessment from 2016 will be judged on an attainment floor standard alone.’

But, since the attainment floor standard is so demanding (see below), this apparent choice may prove illusory for most schools.

Further work includes:

  • Engaging experts to develop criteria for the baselines.
  • A study in autumn 2014 of schools that already use such assessments, to inform decisions on moderation and the reporting of results to parents.
  • Communicating those decisions about moderation and reporting results – to Ofsted as well as to parents – ensuring they are ‘contextualised by teachers’ broader assessments’.
  • Publishing a list of assessments that meet the prescribed criteria.

.

Developments to date

Baseline criteria were published by the STA in May 2014.

The purpose of the assessments is described thus:

‘…to support the accountability framework and help assess school effectiveness by providing a score for each child at the start of reception which reflects their attainment against a pre-determined content domain and which will be used as the basis for an accountability measure of the relative progress of a cohort of children through primary school.’

This emphasis on the relevance of the baseline to floor targets is in marked contrast with the emphasis on reporting progress to parents in the original consultation document.

Towards the end of the document there is a request for ‘supporting information in addition to the criteria’:

‘What guidance will suppliers provide to schools in order to enable them to interpret the results and report them to parents in a contextualised way, for example alongside teacher observation?’

This seems to refer to the immediate reporting of baseline outcomes rather than of subsequent progress measures. Suitability for this purpose does not appear within the criteria themselves.

Interestingly, the criteria specify that the content domain:

‘…must demonstrate a clear progression towards the key stage 1 national curriculum in English and mathematics’,

but there is no reference to progression to KS2, and nothing about assessments being ‘strong predictors’ of future attainment, whether at KS1 or KS2.

Have expectations been lowered, perhaps because of concerns about the predictive validity of the assessments currently available?

A research study was commissioned in June 2014 (so earlier than anticipated) with broader parameters than originally envisaged.

The Government awarded a 9-month contract to NFER worth £49.7K, to undertake surveys of teachers’, school leaders’ and parents’ views on baseline assessment.

The documentation reveals that CEM is also involved in a parallel quantitative study which will ‘simulate an accountability environment’ for a group of schools, to judge changes in their behaviour.

Both of these organisations are also in the running for concession contracts to deliver the assessments from September 2015 (see below).

The aims of the project are to identify:

  • The impact of the introduction of baseline assessments in an accountability context.
  • Challenges to the smooth introduction of baseline assessments as a means to constructing an accountability measure.
  • Potential needs for monitoring and moderation approaches.
  • What reporting mechanisms and formats stakeholders find most useful.

Objectives are set out for an accountability strand and a reporting strand respectively. The former refer explicitly to identification of ‘gaming’ and the exploration of ‘perverse incentives’.

It is not entirely clear from the latter whether researchers are focused solely on initial contextualised reporting of reception baseline outcomes, or are also exploring the subsequent reporting of progress.

The full objectives are reproduced below.

.

[Image: the research objectives for the reception baseline study]

.

The final ‘publishable’ report is to be delivered by March 2015. It will be touch and go whether this can be released before Purdah descends. Confirmation of policy decisions based on the research will likely be delayed until after the Election.

.

The process has begun to identify and publish a list of assessments that meet the criteria.

A tender appeared on Contracts Finder in September 2014 and has been updated several times subsequently, the most recent version appearing in early December.

The purpose is to award several concession contracts, giving holders the right to compete with each other to deliver baseline assessments.

Contracts were scheduled to be awarded on 26 January 2015, but there was no announcement. Each will last 19 months (to August 2016), with an option to extend for a further year. The total value of the contracts, including extensions, is calculated at £4.2m.

There is no limit to the number of concessions to be awarded, but providers must meet specified (and complex) school recruitment and delivery targets which essentially translate into a 10% sample of all eligible schools.

Under-recruiting providers can be included if fewer than four meet the 10% target, as long as they have recruited at least 1,000 eligible schools.

Moreover:

‘The minimum volume requirement may be waived if the number of schools choosing to administer the reception baseline is fewer than 8,887 [50% of the total number of schools with a reception class].’

Hence the number of suppliers in the market is likely to be limited to 10 or so: there will be some choice, but not too much.
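Pulling these rules together, the sketch below sets out my reading of the qualification logic. The eligible-school total is inferred from the statement that 8,887 represents 50% of schools with a reception class; none of the names or structures come from the tender documentation itself:

```python
TOTAL_ELIGIBLE = 2 * 8887                  # 8,887 is stated to be 50% of eligible schools
RECRUITMENT_TARGET = TOTAL_ELIGIBLE // 10  # the 10% sample: roughly 1,777 schools

def supplier_qualifies(schools_recruited: int, suppliers_at_target: int) -> bool:
    """Return True if a provider can hold a concession contract (my reading)."""
    if schools_recruited >= RECRUITMENT_TARGET:
        return True
    # Under-recruiters are admitted only if fewer than four providers hit
    # the 10% target, and then only with at least 1,000 eligible schools.
    return suppliers_at_target < 4 and schools_recruited >= 1000

# The 10% target is also where the 'ten or so' ceiling on the market comes from:
# TOTAL_ELIGIBLE // RECRUITMENT_TARGET == 10
```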

My online researches unearthed four obvious candidates:

And suggestions that this might constitute the entire field.

.

.

The initial deadline for recruiting the target number of schools is 30 April 2015, slap-bang in the middle of Purdah. This may prove problematic.

.

[Postscript: The award of six concession contracts was quietly confirmed on Wednesday 4 February, via new guidance on DfE’s website. The two contractors missing from the list above are Early Excellence and Hodder Education.

The guidance confirms that schools must sign up with their preferred supplier. They can do so after the initial deadline of 30 April but, on 3 June, schools will be told if they have chosen a provider that has been suspended for failing to recruit sufficient schools.  They will then need to choose an alternative provider.

It adds that, in AY2015/16, LA-maintained schools, academies and free schools will be reimbursed for the ‘basic cost’ of approved reception baselines. Thereafter, school budgets will include the necessary funding.

In the event, the Government has barely contributed to publicity for the assessment, leaving it to suppliers to make the running. The initial low-key approach (including links to the contractors’ home pages rather than to details of their baseline offers) has been maintained.

The only addition to the guidance has been the inclusion, from 20 March, of the criteria used to evaluate the original bids. This seems unlikely to help schools select their preferred solution since, by definition, all the successful bids must have satisfied these criteria!

Purdah will now prevent any further Government publicity.]

.

It seems likely that the decision to allow a range of baseline assessments – as opposed to a single national measure – will create significant comparability issues.

One of the ‘clarification questions’ posed by potential suppliers is:

‘We can find no reference to providing a comparability score between provider assessments. Therefore, can we assume that each battery of assessments will be independent, stand-alone and with no need to cross reference to other suppliers?’

The answer given is:

‘The assumption is correct at this stage. However, STA will be conducting a comparability study with successful suppliers in September 2015 to determine whether concordance tables can be constructed between assessments.’

This implies that progress measures will need to be calculated separately for users of each baseline assessment – and that these will be comparable only through additional ‘concordance tables’, should these prove feasible.
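Concordance tables of this kind are typically constructed by some form of equipercentile linking: a score on one assessment is matched to the score on another occupying the same percentile rank. The STA study may well use a more sophisticated method, but a minimal sketch of the basic idea (all names and data structures are mine) would look like this:

```python
import numpy as np

def concordance_table(scores_a, scores_b):
    """Map each observed score on baseline assessment A to the score on
    assessment B with the same percentile rank in its own cohort --
    equipercentile linking in its simplest form."""
    scores_a = np.asarray(scores_a)
    table = {}
    for s in np.unique(scores_a):
        pct_rank = (scores_a <= s).mean() * 100                    # percentile rank of s in cohort A
        table[int(s)] = float(np.percentile(scores_b, pct_rank))   # equivalent score in B
    return table
```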

There are associated administrative and workload issues for schools, particularly those with high mobility rates, which may find themselves needing to engage with several different baseline assessment products.

One answer to a supplier’s question reveals that:

‘As currently, children will be included in performance measures for the school in which they take their final assessment (i.e. key stage 2 tests) regardless of which school they were at for the input measure (i.e. reception baseline on key stage 1). We are currently reviewing how long a child needs to have attended a school in order for their progress outcome to be included in the measure.’

The issue of comparability also raises questions about how progress measures will be aggregated for floor target purposes. Will targets based on several different baseline assessments be comparable with those based on only one? Will schools with high mobility rates be disadvantaged?

Schools will pay for the assessments. The supporting documentation says that:

‘The amount of funding that schools will be provided with is still to be determined. This will not be determined until after bids have been submitted to avoid accusations of price fixing.’

One of the answers to a clarification question says:

‘The funding will be available to schools from October 2015 to cover the reception baseline for the academic year 2015/16.’

Another says this funding is unlikely to be ringfenced.

There is some confusion over the payment mechanism. One answer says:

‘…the mechanism for this is still to be determined. In the longer term, money will be provided to schools through the Dedicated Schools Grant (DSG) to purchase the reception baseline. However, the Department is still considering options for the first year and may pay suppliers directly depending on the amount of data provided.’

But yet another is confident that:

‘Suppliers will be paid directly by schools. The Department will reimburse schools separately.’

The documentation also reveals that there has as yet been no decision on how to measure progress between the baseline and the end of KS2:

‘The Department is still considering how to measure this and is keen for suppliers to provide their thoughts.’

The ‘Statement of requirements’ once again foregrounds the use of the baseline for floor targets rather than the reporting of individual learners’ progress:

‘On 27 March 2014, the Department for Education (DfE) announced plans to introduce a new floor standard from September 2016. This will be based on the progress made by pupils from reception to the end of primary school.  The DfE will use a new Reception Baseline Assessment to capture the starting point from which the progress that schools make with their pupils will be measured.  The content of the Reception Baseline will reflect the knowledge and understanding of children at the start of reception, and will be clearly linked to the learning and development requirements of the Early Years Foundation Stage and key stage 1 national curriculum in English and mathematics.  The Reception Baseline will be administered within the first half term of a pupil’s entry to a reception class.’

In relation to reporting to parents, one of the answers to suppliers’ questions states:

‘Some parents will be aware of the reception baseline from the national media coverage of the policy announcement. We anticipate that awareness of the reception baseline will develop over time. As with other assessments carried out by a school, we would expect schools to share information with parents if asked, though there will be no requirement to report the outcome of the reception baseline to parents.’

So it appears that, regardless of the outcomes of the research above, initial short term reporting of reception baseline outcomes will be optional.

.

[Postscript: This position is still more vigorously stated in a letter dated November 2014 from Ministers to a primary group formed by two maths associations. It says (my emphasis):

‘Let me be clear that we do not intend the baseline assessment to be used to monitor the progress of individual children. You rightly point out that any assessment that was designed to be reliable at individual child level would need to take into account the different ages at which children start reception and be sufficiently detailed to account for the variation in performance one expects from young children day-to-day. Rather, the baseline assessment is about capturing the starting point for the cohort which can then be used to assess the progress of that cohort at the end of primary school,’

This distinction has not been made sufficiently explicit in material published elsewhere.]

.

The overall picture is of a process in which procurement is running in parallel with research and development work intended to help resolve several significant and outstanding issues. This is a consequence of the September 2015 deadline for introduction, which seems increasingly problematic.

Particularly so given that many professionals are yet to be convinced of the case for reception baseline assessment, expressing reservations on several fundamental grounds, extending well beyond the issues highlighted above.

A January 2015 Report from the Centre Forum – Progress matters in Primary too – defends the plan against its detractors, citing six key points of concern. Some of the counter-arguments summarised below are rather more convincing than others:

  • Validity: The contention that reception level assessments are accurate predictors of attainment at the end of KS2 is justified by reference to CEM’s PIPS assessment, which was judged in 2001 to give a correlation of 0.7. But of course KS2 tests were very different in those days.
  • Reliability: The notion that attainment can be reliably determined in reception is again justified with reference to PIPS data from 2001 (showing a 0.98 correlation on retesting). The authors argue that the potentially negative effects of test conditions on young children and the risks of bias should be ‘mitigated’ (but not eliminated) through the development and selection process.
  • Contextualisation: The risk of over-simplification through reporting a single numerical score, independent of factors such as age, needs to be set against the arguments in favour of a relatively simple and transparent methodology. Schools are free to add such context when communicating with parents.
  • Labelling: The argument that baseline outcomes will tend to undermine universally high expectations is countered by the view that assessment may actually challenge labelling attributable to other causes, and can in any case be managed in reporting to parents by providing additional contextual information.
  • Pupil mobility: Concern that the assessment will be unfair on schools with high levels of mobility is met by reference to planned guidance on ‘how long a pupil needs to have attended a school in order to be included in the progress measure’. However, the broader problems associated with a choice of assessments are acknowledged.
  • Gaming: The risk that schools will artificially depress baseline outcomes will be managed through effective moderation and monitoring.

The overall conclusion is that:

‘…the legitimate concerns raised by stakeholders around the reliability and fairness of a baseline assessment do not present fundamental impediments to implementing the progress measure. Overall, a well-designed assessment and appropriate moderation could address these concerns to the extent that a baseline assessment could provide a reasonable basis for constructing a progress measure.

That said, the Department for Education and baseline assessment providers need to address, and, where indicated, mitigate the concerns. However, in principle, there is nothing to prevent a well-designed baseline test being used to create a progress-based accountability measure.’

The report adds:

‘However, this argument still needs to be won and teachers’ concerns assuaged….

.. Since the majority of schools will be reliant on the progress measure under the new system, they need to be better informed about the validity, reliability and purpose of the baseline assessment. To win the support of school leaders and teachers, the Department for Education must release clear, defensible evidence that the baseline assessment is indeed valid, fair and reliable.’

.

[Postscript: On 25 March the STA tendered for a supplier to ‘determine appropriate models for assuring the national data from the reception baseline’. The notice continues:

‘Once models have been determined, STA will agree up to three approaches to be implemented by the supplier in small scale pilots during September/October 2015. The supplier will also be responsible for evaluating the approaches using evidence from the pilots with the aim of recommending an approach to be implemented from September 2016.’

The need for quality assurance is compounded by the fact that there are six different assessment models. The documentation makes clear that monitoring, moderation and other quality assurance methods will be considered.

The contract runs from 1 July 2015 to 31 January 2016 with the possibility of extension for a further 12 months. It will be let by 19 June.]

 .

Outstanding tasks

  • Publish list of contracts for approved baseline assessments (26 January 2015) COMPLETED
  • Explain funding arrangements for baseline assessments and how FY2015-16 funding will be distributed (January 2015?) COMPLETED
  • Publish research on baseline assessment (March/April 2015) 
  • Confirm monitoring and moderation arrangements (March/April 2015?) 
  • Deadline for contractors recruiting schools for initial baseline assessments (30 April 2015) 
  • Publish guidance on the reporting of baseline assessment results (May 2015?) 
  • Award quality assurance tender (by 19 June 2015)
  • Undertake comparability study with successful suppliers to determine whether concordance tables can be constructed (Autumn 2015) 
  • Determine funding required for AY2015/16 assessment and distribute to schools (or suppliers?) (October 2015?)
  • Pilot quality assurance models (October 2015)

KS1 and KS2 tests

.

Consultation response

The new tests will comprise:

  • At KS1 – externally set and internally marked tests of maths and reading, and an externally set test of grammar, punctuation and spelling (GPS). It is unclear from the text whether the GPS test will be externally marked.
  • At KS2 – externally set and externally marked tests of maths, reading and GPS, plus a sampling test in science.

Outcomes of both KS1 and KS2 tests (other than the science sampling test) will be expressed as scaled scores. A footnote makes it clear that, in both cases, a score of ‘100 will represent the new expected standard for that stage’

The consultation document says of the scaled scores:

‘Because it is not possible to create tests of precisely the same difficulty every year, the number of marks needed to meet the secondary readiness standard will fluctuate slightly from one year to another. To ensure that results are comparable over time, we propose to convert raw test marks into a scaled score, where the secondary readiness standard will remain the same from year to year. Scaled scores are used in all international surveys and ensure that test outcomes are comparable over time.’

It adds that the Standards and Testing Agency (STA) will develop the scale.

Otherwise very little detail is provided about next steps. The consultation response is silent on the issue. The original consultation document says only that:

‘The Standards and Testing Agency will develop new national curriculum tests, to reflect the new national curriculum programmes of study.’

Adding, in relation to the science sampling test:

‘We will continue with national sample tests in science, designed to monitor national standards over time. A nationally-representative sample of pupils will sit a range of tests, designed to produce detailed information on the cohort’s performance across the whole science curriculum. The design of the tests will mean that results cannot be used to hold individual schools or pupils accountable.’

.

Developments to date

On 31 March 2014, the STA published draft test frameworks for the seven KS1 and KS2 tests to be introduced from 2016:

  • KS1 GPS: a short written task (20 mins); short answer questions (20 mins) and a spelling task (15 mins)
  • KS1 reading: two reading tests, one with texts and questions together, the other with a separate answer booklet (2 x 20 mins)
  • KS1 maths: an arithmetic test (15 mins) and a test of fluency, problem-solving and reasoning (35 mins)
  • KS2 GPS: a grammar and punctuation test (45 mins) and a spelling task (15 mins)
  • KS2 reading: a single test (60 mins)
  • KS2 maths: an arithmetic test (30 mins) and two tests of fluency, problem-solving and reasoning (2 x 40 mins)
  • KS2 science (sampling): tests in physics, chemistry and biology contexts (3 x 25 mins).

Each test will be designed for the full range of prior attainment and questions will typically be posed in order of difficulty.

Each framework explains that all eligible children at state-funded schools will be required to take the tests, but some learners will be exempt.

For further details of which learners will be exempted, readers are referred to the current Assessment and Reporting Arrangements (ARA) booklets.

According to these, the KS1 tests should be taken by all learners working at level 1 or above and the KS2 tests by all learners working at level 3 and above. Teacher assessment data must be submitted for pupils working below the level of the tests.

But of course levels will no longer exist – and there is as yet no equivalent expressed in scaled scores – so the draft frameworks do not clearly define the lower bound of the range of prior attainment the tests are intended to accommodate.

It will not be straightforward to design workable tests for such broad spans of prior attainment.

Each framework has a common section on the derivation of scaled scores:

‘The raw score on the test…will be converted into a scaled score. Translating raw scores into scaled scores ensures performance can be reported on a consistent scale for all children. Scaled scores retain the same meaning from one year to the next. Therefore, a particular scaled score reflects the same level of attainment in one year as in the previous year, having been adjusted for any differences in difficulty of the test.

Additionally, each child will receive an overall result indicating whether or not he or she has achieved the required standard on the test. A standard-setting exercise will be conducted on the first live test in 2016 in order to determine the scaled score needed for a child to be considered to have met the standard. This process will be facilitated by the performance descriptor… which defines the performance level required to meet the standard. In subsequent years, the standard will be maintained using appropriate statistical methods to translate raw scores on a new test into scaled scores with an additional judgemental exercise at the expected standard. The scaled score required to achieve the expected level on the test will always remain the same.

The exact scale for the scaled scores will be determined following further analysis of trialling data. This will include a full review of the reporting of confidence intervals for scaled scores.’
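To make the mechanism concrete: each year’s raw cut-score can move while the scaled threshold stays anchored at 100. The real conversion will be derived from statistical equating of trialling data, but a toy piecewise-linear version illustrates why comparability is preserved. All the numbers below are invented, since the exact scale has yet to be determined:

```python
def to_scaled(raw, cut_raw, max_raw, floor=80, expected=100, ceiling=120):
    """Illustrative raw-to-scaled conversion: anchor this year's raw
    cut-score (cut_raw) to the fixed expected standard of 100 and
    interpolate linearly on either side."""
    if raw <= cut_raw:
        return floor + (raw / cut_raw) * (expected - floor)
    return expected + ((raw - cut_raw) / (max_raw - cut_raw)) * (ceiling - expected)

# The same raw mark converts differently on papers of different difficulty:
print(round(to_scaled(60, cut_raw=55, max_raw=110), 1))  # 101.8 -- meets the standard
print(round(to_scaled(60, cut_raw=65, max_raw=110), 1))  # 98.5  -- falls short
```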

In July 2014 STA also published sample questions, mark schemes and associated commentaries for each test.

.

Outstanding tasks

I have been unable to trace any details of the timetable for test development and trialling.

As far as I can establish, STA has not published an equivalent to QCDA’s ‘Test development, level setting and maintaining standards’ (March 2010) which describes in some detail the different stages of the test development process.

This old QCA web-page describes a 22-month cycle, from the initial stages of test development to the administration of the tests.

This aligns reasonably well with the 25-month period between publication of the draft test frameworks on 31 March 2014 and the administration of the tests in early May 2016.

Applying the same timetable to the 2016 tests – using publication of the draft frameworks as the starting point – suggests that:

  • The first pre-test should have been completed by November 2014
  • The second pre-test should take place by February 2015 
  • Mark schemes and tests should be finalised by July 2015

STA commits to publishing the final test frameworks and a full set of sample tests and mark schemes for each of the national curriculum tests at key stages 1 and 2 ‘during the 2015 summer term’.

Given Purdah, these seem most likely to appear towards the end of the summer term rather than a full year ahead of the tests.

In relation to the test frameworks, STA says:

‘We may make small changes as a result of this work; however, we do not expect the main elements of the frameworks to change.’

They will also produce, to the same deadline, guidance on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

So we have three further outstanding tasks:

  • Publishing the final test frameworks (summer term 2015) 
  • Finalising the scale to be used for the tests (summer term 2015) 
  • Publishing guidance explaining the use and reporting of scaled scores (summer term 2015)

.

[Postscript: Since publishing this post, I have found on Contracts Finder various STA contracts, as follows:

How these square with the timetable above is, as yet, unclear. If there is a possibility that final test frameworks cannot be finalised until Autumn 2015, the Workload Challenge Protocol may well bite here too.]

.

Statutory teacher assessment

.

Consultation response

The response confirms statutory teacher assessment of:

  • KS1 maths, reading, writing, speaking and listening and science
  • KS2 maths, reading, writing and science.

There are to be performance descriptors for each statutory teacher assessment:

  • a single descriptor for KS1 science and KS2 science, reading and maths
  • several descriptors for KS1 maths, reading, writing and speaking and listening, and also for KS2 writing.

There is a commitment to improve KS1 moderation, given concerns expressed by Ofsted and the NAHT Commission.

In respect of low attaining pupils the response says:

‘All pupils who are not able to access the relevant end of key stage test will continue to have their attainment assessed by teachers. We will retain P-scales for reporting teachers’ judgements. The content of the P-scales will remain unchanged. Where pupils are working above the P-scales but below the level of the test, we will provide further information to enable teachers to assess attainment at the end of the relevant key stage in the context of the new national curriculum.’

And there is to be further consideration of whether to move to external moderation of P-scale teacher assessment.

So, to summarise, the further work involves:

  • Developing new performance descriptors – to be drafted by an expert group. According to the response, the KS1 descriptors would be introduced in ‘autumn 2014’. No date is given for the KS2 descriptors.
  • Improving moderation of KS1 teacher assessment, working closely with schools and Ofsted.
  • Providing guidance to support teacher assessment of those working above the P-scales but below the level of the tests.
  • Deciding whether to move to external moderation of P-scale teacher assessment.

.

Developments to date

Updated statutory guidance on the P-Scale attainment targets for pupils with SEN was released in July 2014, but neither it nor the existing guidance on when to use the P-Scales relates them to the new scaled scores, or discusses the issue of moderation.

.

In September 2014, a guidance note ‘National curriculum and assessment from September 2014: Information for schools’ revised the timeline for the development of performance descriptors:

‘New performance descriptors will be published (in draft) in autumn 2014 which will inform statutory teacher assessment at the end of key stage 1 and 2 in summer 2016. Final versions will be published by September 2015.’

.

A consultation document on performance descriptors: ‘Performance descriptors for use in key stage 1 and 2 statutory teacher assessment for 2015 to 2016’ was published on 23 October 2014.

The descriptors were:

‘… drafted with experts, including teachers, representatives from Local Authorities, curriculum and subject experts. Also Ofsted and Ofqual have observed and supported the drafting process’

A November 2014 FoI response revealed the names of the experts involved and brief biographies were provided in the media.

A further FoI has been submitted requesting details of their remit but, at the time of writing, this has not been answered.

.

[Postscript: The FoI response setting out the remit was published on 5 February.]

.

The consultation document revealed for the first time the complex structure of the performance descriptor framework.

It prescribes four descriptors for KS1 reading, writing and maths but five for KS2 writing.

The singleton descriptors reflect ‘working at the national standard’.

Where four descriptors are required these are termed (from the top down): ‘mastery’, ‘national’, ‘working towards national’ and ‘below national’ standard.

In the case of KS2 writing ‘above national standard’ is sandwiched between ‘mastery’ and ‘national’.

.

[Images: the performance descriptor frameworks for KS1 and KS2]

The document explains how these different levels cross-reference to the assessment of learners exempted from the tests.

In the case of assessments with only a single descriptor, it becomes clear that a further distinction is needed:

‘In subjects with only one performance descriptor, all pupils not assessed against the P-scales will be marked in the same way – meeting, or not meeting, the ‘national standard’.’

So ‘not meeting the national standard’ should also be included in the table above. The relation between ‘not meeting’ and ‘below’ national standard is not explained.

But still further complexity is added since:

‘There will be some pupils who are not assessed against the P-scales (because they are working above P8 or because they do not have special educational needs), but who have not yet achieved the contents of the ‘below national standard’ performance descriptor (in subjects with several descriptors). In such cases, pupils will be given a code (which will be determined) to ensure that their attainment is still captured.’

This produces a hierarchy as follows (from the bottom up):

  • P Scales
  • In cases of assessments with several descriptors, an attainment code yet to be determined
  • In the case of assessments with a single descriptor, an undeclared ‘not meeting the national standard’ descriptor
  • The single descriptor or four/five descriptors listed above.

However, the document says:

‘The performance descriptors do not include any aspects of performance from the programme of study for the following key stage. Any pupils considered to have attained the ‘Mastery standard’ are expected to explore the curriculum in greater depth and build on the breadth of their knowledge and skills within that key stage.’

This places an inappropriate brake on the progress of the highest attainers because the assessment ceiling is pitched too low to accommodate them.

The document acknowledges that some high attainers will be performing above the level of the highest descriptors but, regardless of whether they move into the programme for the next key stage, provides no mechanism to record their performance.

This raises the further question whether the mastery standard is pitched at the equivalent of level 6, or below it. It will be interesting to see whether this is addressed in the consultation response.

The consultation document says that the draft descriptors will be trialled during summer term 2015 in a representative sample of schools.

These trials and the consultation feedback will together inform the development of the final descriptors, but also:

  • ‘statutory arrangements for teacher assessment using the performance descriptors;
  • final guidance for schools (and those responsible for external moderation arrangements) on how the performance descriptors should be used;
  • an updated national model for the external moderation of teacher assessment; and
  • nationally developed exemplification of the work of pupils for each performance descriptor at the end of each key stage.’

Published comments on the draft descriptors have been almost entirely negative, which might suggest that the response could be delayed. The consultation document said it should appear ‘around 26 February 2015’.

According to the document, the final descriptors will be published either ‘in September 2015’ or ‘in the autumn term 2015’, depending whether you rely on the section headed ‘Purpose’ or the one called ‘Next Steps’. The first option would allow them to appear as late as December 2015.

A recent newspaper report suggested that the negative reception had resulted in an ‘amber/red’ assessment of primary assessment reform as a whole. The leaked commentary said that any decision to review the approach would increase the risk that the descriptors could not be finalised ‘by September as planned’.

However, the story concludes:

‘The DfE says: “We do not comment on leaks,” but there are indications from the department that the guidance will be finalised by September. Perhaps ministers chose, in the end, not to “review their approach”, despite the concerns.’

Hence it would appear that delay until after the beginning of AY2015/16 will not be countenanced.

Note that the descriptors are for use in academic year 2015/16, so even publication in September is problematic, since teachers will begin the year not knowing which descriptors to apply.

The consultation document refers only to descriptors for AY2015/16, which might imply that they will be further refined for subsequent years. Essentially therefore, the arrangements proposed here would be an imperfect interim solution.

.

[Postscript: On 26 February 2015 the Consultation Response was published – so on the date committed to in the consultation document.

As expected, it revealed significant opposition to the original proposals:

  • 74% of respondents were concerned about nomenclature
  • 76% considered that the descriptors were not spaced effectively across the range of pupils’ performance
  • 69% of respondents considered them not clear or easy to understand

The response acknowledges that the issues raised:

‘….amount to a request for greater simplicity, clarity and consistency to support teachers in applying performance descriptors and to help parents understand their meaning.’

But goes on to allege that: 

‘…there are some stakeholders who valued the levels system and would like performance descriptors to function in a similar way across the key stages, which is not their intention.’

Even so, although the Descriptors are not intended to inform formative assessment, respondents have raised concerns that they could be applied in this manner.

There is also the issue of comparability between formative and summative assessment measures, but this is not addressed.

The response does not entirely acknowledge that opposition to the original proposals is sending it back to the drawing board but:

‘As a result of some of the conflicting responses to the consultation, we will work with relevant experts to determine the most appropriate course of action to address the concerns raised and will inform schools of the agreed approach according to the timetable set out in the consultation document – i.e. by September 2015.’

The new assessment commission (see below) will have an as yet undefined role in this process:

‘In the meantime, and to help with this [ie determining the most appropriate course of action] the Government is establishing a Commission on Assessment Without Levels….’

Unfortunately, this role has not been clarified in the Commission’s Statement of Intended Outputs.

There is no reference to the trials in schools, which may or may not continue. A DfE Memorandum to the Education Select Committee on its 2014-15 Supplementary Estimates reveals that £0.3m has been reallocated to pay for them, but this is no guarantee that they will take place.

Implementation will not be delayed by a year, despite the commitment to allow a full year’s notice for significant reforms announced in the response to the Workload Challenge.

This part of the timetable is now seriously concertina’d and there must be serious doubt whether the timescale is feasible, especially if proper trialling is to be accommodated.]

.

Outstanding tasks 

  • Publish response to performance descriptors consultation document (26 February 2015) COMPLETED
  • Trial (revised?) draft performance descriptors (summer term 2015) 
  • Publish adjusted descriptors, revised in the light of consultation with experts and input from the commission (summer term 2015)
  • Experts and commission on assessment produce response to concerns raised and inform schools of outcomes (September 2015)
  • Confirm statutory arrangements for use of the performance descriptors (September/autumn term 2015) 
  • Publish final performance descriptors for AY2015/16 (September/autumn term 2015) 
  • Publish final guidance on the use of performance descriptors (September/autumn term 2015) 
  • Publish exemplification of each performance descriptor at each key stage (September/autumn term 2015)
  • Publish an updated model for the external moderation of teacher assessment (September/autumn term 2015?) 
  • Confirm plans for the moderation of KS1 teacher assessment and use of the P-scales (September/autumn term 2015?) 
  • Publish guidance on assessment of those working above the P-scales but below the level of the tests (September/autumn term 2015?) 
  • Decide whether performance descriptors require adjustment for AY2016/17 onwards (summer term 2016)

.

Schools’ internal assessment and tracking systems

.

Consultation response

The consultation document outlined some of the Government’s justification for the removal of national curriculum levels. The statement that:

‘Schools will be able to focus their teaching, assessment and reporting not on a set of opaque level descriptions, but on the essential knowledge that all pupils should learn’

may be somewhat called into question by the preceding discussion of performance descriptors.

The consultation document continues:

‘There will be a clear separation between ongoing, formative assessment (wholly owned by schools) and the statutory summative assessment which the government will prescribe to provide robust external accountability and national benchmarking. Ofsted will expect to see evidence of pupils’ progress, with inspections informed by the school’s chosen pupil tracking data.’

A subsequent section adds:

‘We will not prescribe a national system for schools’ ongoing assessment….

…. We expect schools to have a curriculum and assessment framework that meets a set of core principles…

 … Although schools will be free to devise their own curriculum and assessment system, we will provide examples of good practice which schools may wish to follow. We will work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’

The consultation response does not cover this familiar territory again, saying only:

‘Since we launched the consultation, we have had conversations with our expert group on assessment about how to support schools to make best use of the new assessment freedoms. We have launched an Assessment Innovation Fund to enable assessment methods developed by schools and expert organisations to be scaled up into easy-to-use packages for other schools to use.’

Further work is therefore confined to the promulgation of core principles, the application of the Assessment Innovation Fund and possibly further work to ‘signpost schools to a range of potential approaches’.

.

Developments to date

The Assessment Innovation Fund was first announced in December 2013.

A factsheet released at that time explains that many schools are developing new curriculum and assessment systems and that the Fund is intended to enable schools to share these.

Funding of up to £10K per school is made available to help up to 10 schools to prepare simple, easy-to-use packages that can be made freely available to other schools.

They must commit to:

‘…make their approach available on an open licence basis. This means that anyone who wishes to use the package (and any trade-marked name) must be granted a non-revocable, perpetual, royalty-free licence to do so with the right to sub-licence. The intellectual property rights to the system will remain with the school/group which devised it.’

Successful applicants were to be confirmed ‘in the week commencing 21 April 2014’.

In the event, nine successful applications were announced on 1 May, although one subsequently withdrew, apparently over the licensing terms.

The packages developed with this funding are stored – in a rather user-unfriendly fashion – on this TES Community Blog, along with other material supportive of the decision to dispense with levels.

Much other useful material has been published online which has not been collected into this repository and it is not clear to what extent it will develop beyond its present limits, since the most recent addition was in early November 2014.

A recent survey by Capita Sims (itself a provider of assessment support) conducted between June and September 2014, suggested that:

  • 25% of primary and secondary schools were unprepared for the replacement of levels and 53% had not yet finalised their plans.
  • 28% were planning to keep the existing system of levels, 21% intended to introduce a new system and 28% had not yet made a decision.
  • 50% of those introducing an alternative expected to do so by September 2015, while 23% intended to do so by September 2016.
  • Schools’ biggest concern (53% of respondents) is measuring progress and setting targets for learners.

Although the survey is four months old and has clear limitations (there were only 126 respondents) this would suggest further support may be necessary, ideally targeted towards the least confident schools.

.

In April 2014 the Government published a set of Assessment Principles, building on earlier material in the primary consultation document. These had been developed by an ‘independent expert panel’.

It is not entirely clear whether the principles apply solely to primary schools and to schools’ own assessment processes (as opposed to statutory assessment).

The introductory statement says:

‘The principles are designed to help all schools as they implement arrangements for assessing pupils’ progress against their school curriculum; Government will not impose a single system for ongoing assessment.

Schools will be expected to demonstrate (with evidence) their assessment of pupils’ progress, to keep parents informed, to enable governors to make judgements about the school’s effectiveness, and to inform Ofsted inspections.’

This might suggest they are not intended to cover statutory assessment and testing but are relevant to secondary schools.

There are nine principles in all, divided into three groups:

.

[Image: the nine assessment principles]

.

The last of these seems particularly demanding.

 .

In July 2014, Ofsted published guidance in the form of a ‘Note for inspectors: use of assessment information during inspections in 2014/15’. This says that:

‘In 2014/15, most schools, academies and free schools will have historic performance data expressed in national curriculum levels, except for those pupils in Year 1. Inspectors may find that schools are tracking attainment and progress using a mixture of measures for some, or all, year groups and subjects.

As now, inspectors will use a range of evidence to make judgements, including by looking at test results, pupils’ work and pupils’ own perceptions of their learning. Inspectors will not expect to see a particular assessment system in place and will recognise that schools are still working towards full implementation of their preferred approach.’

It goes on to itemise the ways in which inspectors will check that these systems are effective, without judging the systems themselves, but by gathering evidence of effective implementation through leadership and management, the accuracy of assessment, effectiveness in securing progress and quality of reporting to parents.

. 

In September 2014, NCTL published a research report ‘Beyond Levels: alternative assessment approaches developed by teaching schools’.

The report summarises the outcomes of small-scale research conducted in 34 teaching school alliances. It offers six rather prolix recommendations for schools and DfE to consider, which can be summarised as follows:

  • A culture shift is necessary in recognition of the new opportunities provided by the new national curriculum and the removal of levels.
  • Schools need access to conferences and seminars to help develop their assessment expertise.
  • Schools would benefit from access to peer reviewed commercial tracking systems relating to the new national curriculum. Clarification is needed about what data will be collected centrally.
  • Teaching school alliances and schools need financial support to further develop assessment practice, especially practical classroom tools, which should be made freely available online.
  • Financial support is needed for teachers to undertake postgraduate research and courses in this field.
  • It is essential to develop professional knowledge about emerging effective assessment practice.

I can find no government response to these recommendations and so have not addressed them in the list of outstanding tasks below.

.

[Postscript: On 25 February 2015, the Government announced the establishment of a ‘Commission on Assessment Without Levels’:

‘To help schools as they develop effective and valuable assessment schemes, and to help us to identify model approaches we are today announcing the formation of a commission on assessment without levels. This commission will continue the evidence-based approach to assessment which we have put in place, and will support primary and secondary schools with the transition to assessment without levels, identifying and sharing good practice in assessment.’

This appears to suggest belated recognition that the steps outlined above have provided schools with insufficient support for the transition to levels-free internal assessment. It is also a response to the possibility that Labour might revisit the decision to remove them (see below).

The Consultation Response on Performance Descriptors released on 26 February (see above) says that the Commission will help to determine the most appropriate response to concerns raised about the Descriptors, while also suggesting that this task will not be devolved exclusively to them.

It adds that the Commission will:

‘…collate, quality assure, publish and share best practice in assessment with schools across the country…and will help to foster innovation and success in assessment practice more widely.’

The membership of the Commission was announced on 9 March.

.

.

The Commission met on 10 March and 23 March 2015 and will meet four more times – in April, May, June and July.

Its Terms of Reference have been published. The Statement of Intended Outputs mentioned in the consultation response on Performance Descriptors appeared without any publicity on 27 March.

It seemed that the Commission, together with the further consultation of experts, supplied a convenient mechanism for ‘parking’ some difficult issues until the other side of the Election.

However, neither the terms of reference nor the statement of outputs mentions the Performance Descriptors, so the Commission’s role in relation to them remains shrouded in mystery.

.

.

The authors of the Statement of Outputs feel it necessary to mention in passing that it:

‘…supports the decision to remove levels, but appreciates that the reasons for removing levels are not widely understood’.

It sets out a 10-point list of outputs comprising:

  • Another statement of the purposes of assessment and another set of principles to support schools in developing effective assessment systems, presumably different to those published by the previous expert group in April 2014. (It will be interesting to compare the two sets of principles, to establish whether Government policy on what constitutes effective assessment has changed over the last 12 months. It will also be worthwhile monitoring the gap between the principles and the views of Alison Peacock, one of the Commission’s members. She also sat on the expert panel that developed the original principles, some of which seem rather at odds with her own practice and preferences. Meanwhile, another member – Sam Freedman – has stated…)

.

.

  • An explanation of ‘how assessment without levels can better serve the needs of pupils and teachers’.
  • Guidance to ‘help schools create assessment policies which reflect the principles of effective assessment without levels’.
  • Clear information about ‘the legal and regulatory assessment requirements’, intended to clarify what they are now, how they will change and when. (The fact that the Commission concludes that such information is not already available is a searing indictment of the Government’s communications efforts to date.)
  • Clarification with Ofsted of ‘the role that assessment without levels will play in the inspection process’ so schools can demonstrate effectiveness without adding to teacher workload. (So again they must believe that Ofsted has not sufficiently clarified this already.)
  • Dissemination of good practice, obtained through engagement with ‘a wide group of stakeholders including schools, local authorities, teachers and teaching unions’. (This is tacit admission that the strategy described above is not working.)
  • Advice to the Government on how ITT and CPD can support assessment without levels and guidance to schools on the use of CPD for this purpose. (There is no reference to the resource implications of introducing additional training and development.)
  • Advice to the Government on ensuring ‘appropriate provision is made for pupils with SEN in the development of assessment policy’. (Their judgement that this is not yet accounted for is a worrying indictment of Government policy to date. They see this as not simply a lapse of communication but a lacuna in the policy-making process.)
  • ‘Careful consideration’ of commitments to tackling teacher workload – which they expect to alleviate by providing information, advice and support. (There is no hint that the introduction of Performance Descriptors will be delayed in line with the Workload Challenge.)
  • A final report before the end of the summer term, though it may publish some outputs sooner. (It will not be able to do so until the outcome of the Election is decided.)

Although there is some implicit criticism of Government policy and communications to date, the failure to make any reference to the Performance Descriptors is unlikely to instil confidence in the capacity of the Commission to provide the necessary challenge to the original proposals, or support to the profession in identifying a workable alternative.]

.

Outstanding tasks

  • Further dissemination of good practice through the existing mechanisms (ongoing) 
  • Further ‘work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’ (ongoing)
  • Additional work (via the commission) to ‘collate, quality assure, publish and share’ best practice (Report by July 2015 with other outputs possible from May 2015)

Reporting to parents

.

Consultation response

The consultation document envisaged three outcomes for each test:

  • A scaled score
  • The learner’s position in the national cohort, expressed as a decile
  • The rate of progress from a baseline, derived by comparing a learner’s scaled score with that of other learners with the same level of prior attainment.

Deciles did not survive the consultation.

The consultation response confirms that, for each test, parents will receive:

  • Their own child’s scaled score; and
  • The average scaled score for the school, ‘the local area’ (presumably the geographical area covered by the authority in which the school is situated) and the country as a whole.

They must also receive information about progress, but the response only discusses how this might be published on school websites and for the purposes of the floor targets (see sections below), rather than how it should be reported directly to parents.

We have already addressed the available information about the calculation of the scaled scores.

The original consultation document also outlined the broad methodology underpinning the progress measures:

‘In order to report pupils’ progress through the primary curriculum, the scaled score for each pupil at key stage 2 would be compared to the scores of other pupils with the same prior attainment. This will identify whether an individual made more or less progress than pupils with similar prior attainment…

…. Using this approach, a school might report pupils’ national curriculum test results to parents as follows:

In the end of key stage 2 reading test, Sally received a scaled score of 126 (the secondary ready standard is 100), placing her in the top 10% of pupils nationally. The average scaled score for pupils with the same prior attainment was 114, so she has made more progress in reading than pupils with a similar starting-point.’

.

Developments to date

On this web page, first published in April 2014, STA commits to publishing guidance during summer term 2015 on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

In September 2014, a further guidance note ‘National curriculum and assessment from September 2014: Information for schools’ shed a little further light on the calculation of the progress measures:

'Pupil progress will be determined in relation to the average progress made by pupils with the same baseline (i.e. the same KS1 average point score). For example, if a pupil had an APS of 19 at KS1, we will calculate the average scaled score in the KS2 tests for all pupils with an APS of 19 and see whether the pupil in question achieved a higher or lower scaled score than that average. The exact methodology of how this will be reported is still to be determined.'
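The mechanics of this comparison are simple enough to sketch. The Python below is purely illustrative – the pupil data and field layout are my own assumptions, since the exact methodology is still to be determined:

```python
from collections import defaultdict

# Hypothetical records: (pupil_id, KS1 average point score, KS2 scaled score).
# All names and values are invented for illustration.
pupils = [
    ("A", 19, 108),
    ("B", 19, 101),
    ("C", 19, 97),
    ("D", 15, 103),
    ("E", 15, 95),
]

# Group KS2 scaled scores by the KS1 APS baseline.
by_baseline = defaultdict(list)
for _, aps, score in pupils:
    by_baseline[aps].append(score)

# Average KS2 scaled score for each prior-attainment group.
group_mean = {aps: sum(s) / len(s) for aps, s in by_baseline.items()}

# A pupil's progress is their score relative to their group's average.
for pupil_id, aps, score in pupils:
    diff = score - group_mean[aps]
    verdict = "more" if diff > 0 else "less"
    print(f"Pupil {pupil_id}: scored {score} against a group average "
          f"of {group_mean[aps]:.1f} – {verdict} progress than pupils "
          f"with similar prior attainment")
```

In a real implementation the group averages would of course be computed over the national cohort, not a single school.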

It is hard to get a clear sense of the full range of assessment information that parents will receive.

I have been unable to find any comprehensive description, which would suggest that this is being held back until the methodology for calculating the various measures is finalised.

The various sections above suggest that they will receive details of:

  • Reception baseline assessment outcomes.
  • Attainment in end of KS1 and end of KS2 tests, now expressed as scaled scores (or via teacher assessment, code or P-scales if working below the level of the tests). This will be supplemented by a series of average scaled scores for each test.
  • Progress between the baseline assessment (reception baseline from 2022; KS1 baseline beforehand) and end of KS2 tests, relative to learners with similar prior attainment at the baseline.
  • Attainment in statutory teacher assessments, normally expressed through performance descriptors, but with different arrangements for low attainers.
  • Attainment and progress between reception baseline, KS1 and KS2 tests, provided through schools’ own internal assessment and tracking systems.

We have seen that reporting mechanisms for the first and fourth are not yet finalised.

The fifth is now for schools to determine, taking account of Ofsted’s guidance and, if they wish, the Assessment Principles.

The scales necessary to report the second are not yet published, and these also form the basis of the remaining progress measures.

Parents will be receiving this information in a variety of different formats: scaled scores, average scaled scores, baseline scores, performance descriptors, progress scores and internal tracking measures.

Moreover, the performance descriptor scales will vary according to the assessment and internal tracking will vary from school to school.

This is certainly much more complex than the current unified system of reporting based on levels. Parents will require extensive support to understand what they are receiving.

Outstanding tasks

Previous sections have already referenced expected guidance on reporting baseline assessments, scaled scores and the use of performance descriptors (which presumably includes parental reporting).

One assumes that there will also need to be unified guidance on all aspects of reporting to parents, intended for parental consumption.

So, avoiding duplication of previous sections, the remaining outstanding tasks are to:

  • Finalise the methodology for reporting on pupil progress (summer term 2015) 
  • Provide comprehensive guidance to parents on all aspects of reporting (summer term 2015?)

Publication of outcomes

.

Consultation response

This section covers publication of material for public consumption, within and alongside the Primary School Performance Tables and on schools’ websites.

The initial consultation document has much to say about the first of these, while the consultation response barely mentions the Tables, focusing almost exclusively on school websites.

The original document suggests that the Performance Tables will include a variety of measures, including:

  • The percentage of pupils meeting the secondary readiness standard
  • The average scaled score
  • Where the school’s pupils fit in the national cohort
  • Pupils’ rate of progress
  • How many of the school’s pupils are among the highest-attaining nationally, through a measure showing the percentage of pupils attaining a high scaled score in each subject.
  • Teacher assessment outcomes in English, maths and science
  • Comparisons of each school’s performance with that of schools with similar intake
  • Data about the progress of those with very low prior attainment.

All the headline measures will be published separately for pupils in receipt of the pupil premium.

All measures will be published as three year rolling averages in addition to annual results.

There is also a commitment to publish a wide range of test and teacher assessment data, relating to both attainment and progress, through a Data Portal:

‘The department is currently procuring a new data portal or “data warehouse” to store the school performance data that we hold and provide access to it in the most flexible way. This will allow schools, governors and parents to find and analyse the data about schools in which they are most interested, for example focusing on the progress of low attainers in mathematics in different schools or the attainment of certain pupil groups.’

The consultation response acknowledges as a guiding principle:

‘…a broad range of information should be published to help parents and the wider public know how well schools are performing.’

The accountability system will:

‘…require schools to publish information on their websites so that parents can understand both the progress pupils make and the standards they achieve.’

Data on low attainers’ attainment and progress will not be published since the diversity of this group demands extensive contextual information.

But when it comes to Performance Tables, the consultation response says only:

‘As now, performance tables will present a wide range of information about primary school performance.’

By implication, they will include progress measures since the text adds:

'In 2022 performance tables, we will judge schools on whichever is better: their progress from the reception baseline to key stage 2; or their progress from key stage 1 to key stage 2.'

However, schools will be required to publish a suite of indicators in standard format on their websites, including:

  • The average progress made by pupils in reading, writing and maths
  • The percentage of pupils achieving the expected standard at the end of KS2 in reading, writing and maths
  • The average score of pupils in their end of KS2 assessments and
  • The ‘percentage of pupils who achieve a high score in all areas’ at the end of KS2.

The precise form of the last of these indicators is not explained. This is not quite the same as the ‘measure showing the percentage of pupils attaining a high scaled score in each subject’ mentioned in the original consultation document.

Does ‘all areas’ mean reading, writing and maths? Must learners achieve a minimum score in each assessment, or a single aggregate score above a certain threshold?

In addition:

‘So that parents can make comparisons between schools, we would like to show each school’s position in the country on these measures and present these results in a manner that is clear for all audiences to understand. We will discuss how best to do so with stakeholders, to ensure that the presentation of the data is clear, fair and statistically robust.’

.

Developments to date

In June 2014, a consultation document was issued ‘Accountability: publishing headline performance measures on school and college websites’. This was accompanied by a press release.

The consultation document explains the intended relationship between the Performance Tables, Data Portal and material published on schools’ websites:

‘Performance tables will continue to provide information about individual schools and colleges and be the central source of school and college performance information.’

Moreover:

‘Future changes to the website, through the school and college performance data portal, will improve accessibility to a wide range of information, including the headline performance measures. It will enable interested parents, students, schools, colleges and researchers to interrogate educational data held by the Department for Education to best meet their requirements.’

But:

'Nevertheless, the first place many parents and students look for information about a school or college is the institution's own website.'

Schools are already required to publish such information, but there is inconsistency in where and how it is presented. The document expresses the intention that consistent information should be placed ‘on the front page of every school and college website’.

The content proposed for primary schools' websites covers the four headline measures set out in the consultation response.

A footnote says:

‘These measures will apply to all-through primary, junior and middle schools. Variants of these measures will apply for infant and first schools.’

But the variants are not set out.

There is no reference to the plan to show ‘each school’s position in the country on these measures’ as mentioned in the consultation response.

The consultation proposes a standard visual presentation which, for primary schools, looks like this:

.

school websites Capture

.

The response to this consultation ‘Publishing performance measures on school and college websites’ appeared in December 2014 (the consultation document had said ‘Autumn 2014’).

The summary of responses says:

‘The majority of respondents to the consultation welcomed the proposals to present headline performance measures in a standard format. There was also strong backing for the proposed visual presentation of data to aid understanding of performance. However, many respondents suggested that without some sense of scale or spread to provide some context to the visual presentation, the data could be misleading. Others said that the language used alongside the charts should be clearer…’

'…Whilst most respondents favoured a data application tool that would remove the burden of annually updating performance data on school and college websites, they also highlighted the difficulties of developing a data application that would be compatible with a wide range of school and college websites.'

It is clear that some respondents had questioned why school websites should not simply carry a link on their homepage to the School Performance Tables.

In the light of this reaction, further research will be undertaken to:

  • develop a clear and simple visual representation of the data, but with added contextual information.
  • establish how performance tables data can be presented ‘in a way that reaches more parents’.

The timeline suggests that this will result in ‘proposals for redevelopment of performance tables’ by May 2015, so we can no longer assume that the Tables will cover the list of material suggested in the original consultation document.

The timeline indicates that if initial user research concludes that a data application is required, that will be developed and tested between June and October 2015, for roll out between September 2016 and January 2017.

Schools will be informed by autumn 2015 whether they should carry a link to the Tables, download a data application or pursue a third option.

Nevertheless:

‘All schools and colleges, including academies, free schools and university technical colleges, will be required to publish the new headline performance measures in a consistent, standard format on their websites from 2016.’

So, if an application is not introduced, it seems that schools will still have to publish the measures on their websites: they will not be able to rely solely on a link to the Performance Tables.

Middle schools will only be required to publish the primary measures. No mention is made of infant or first schools.

.

There is no further reference to the data portal, since this project was quietly shelved in September 2014, following unexplained delays in delivery.

.

.

There has been no subsequent explanation of the implications of this decision. Will the material intended for inclusion in the Portal be included in the Performance Tables, or published by another route, or will it no longer be published?

.

Finally, some limited information has emerged about accountability arrangements for infant schools.

This appears on a web page – New accountability arrangements for infant schools from 2016 – published in June 2014.

It explains that the reception baseline will permit the measurement of progress alongside attainment. The progress of infant school pupils will be published for the first time in the 2019 Performance Tables.

This might mean a further addition to the list of information reported to parents set out in the previous section.

There is also a passing reference to moderation:

‘To help increase confidence and consistency in our moderation of infant schools, we will be increasing the proportion of schools where KS1 assessments are moderated externally. From summer 2015, half of all infant schools will have their KS1 assessments externally moderated.’

But no further information is forthcoming about the nature of other headline measures and how they will be reported.

.

Outstanding tasks

  • Complete user research and publish proposals for redevelopment of Performance Tables (May 2015) 
  • Confirm what data will be published in the 2016 Performance Tables (summer term 2015?)
  • Confirm how material originally intended for inclusion in Data Portal will be published (summer term 2015?)
  • Confirm the format and publication route for data showing each school’s position in the country on the headline measures (summer term 2015?) 
  • Confirm headline performance measures for infant and first schools (summer term 2015?) 
  • If necessary, further develop and test a prototype data application for schools’ websites (October 2015) 
  • Inform schools whether a data application will be introduced (autumn 2015) 
  • Amend School Information Regulations to require publication of headline measures in standard format (April 2016) 
  • If proceeding, complete development and testing of a data application (May 2016) 
  • If proceeding, complete roll out of data application (February 2017)

.

Floor standards

.

Consultation response

Minimum expectations of schools will continue to be embodied in floor standards. Schools falling below the floor will attract ‘additional scrutiny through inspection’ and ‘intervention may be required’.

Although the new standard:

‘holds schools to account both on the progress they make and on how well their pupils achieve.’

in practice schools need only satisfy one or the other.

An all-through primary school will be above the floor standards if:

  • Pupils make sufficient progress between the reception baseline and the end of KS2 in all of reading, writing and maths or
  • 85% or more of pupils meet the new expected standard at the end of KS2 (similar to Level 4B under the current system).

A junior or middle school will be above the floor standard if:

  • Pupils make sufficient progress at key stage 2 from their starting point at key stage 1; or
  • 85% or more of pupils meet the new expected standard at the end of key stage 2.

At this stage arrangements for measuring the progress of pupils in infant or first schools are still to be considered.

Since the reception baseline will be introduced in 2015, progress in all-through primary schools will continue to be measured from the end of KS1 until 2022.

This should mean that, prior to 2022, the standard would be achieved by ensuring that the progress made by pupils in a school – in reading, writing and maths – equals or exceeds the national average progress made by pupils with similar prior attainment at the end of KS1.

Exactly how individual progress will be aggregated to create a whole school measure is not yet clear. The original consultation document holds out the possibility that slightly below average progress will be acceptable:

‘…we expect the value-added score required to be above the floor to be between 98.5 and 99 (a value-added score of 100 represents average progress).’

The consultation response says the amount of progress required will be determined in 2016:

‘The proposed progress measure will be based on value-added in each of reading, writing and mathematics. Each pupil’s scaled scores in each area at key stage 2 will be compared with the scores of pupils who had the same results in their assessments at key stage 1.

For a school to be above the progress floor, pupils will have to make sufficient progress in all of reading, writing and mathematics. For 2016, we will set the precise extent of progress required once key stage 2 tests have been sat for the first time. Once pupils take a reception baseline, progress will continue to be measured using a similar value added methodology.’
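Pulling these strands together, the floor test amounts to a simple disjunction: sufficient value-added progress in all of reading, writing and maths, or 85% of pupils at the expected standard. The sketch below is my own illustration, not confirmed methodology: it assumes a school-level value-added score computed as 100 plus the mean gap between pupils' actual scaled scores and the national average for their prior-attainment group, and borrows the 98.5 figure from the consultation document's illustrative range:

```python
def school_value_added(actual_scores, expected_scores):
    """Hypothetical school VA score, where 100 = average progress.

    actual_scores and expected_scores are parallel lists: each pupil's
    KS2 scaled score and the national average scaled score for pupils
    with the same prior attainment.
    """
    gaps = [a - e for a, e in zip(actual_scores, expected_scores)]
    return 100 + sum(gaps) / len(gaps)

def above_floor(va_by_subject, pct_expected_standard,
                va_floor=98.5, attainment_floor=85.0):
    """Floor test: sufficient progress in ALL of reading, writing and
    maths, OR 85%+ of pupils meeting the expected standard.

    va_floor uses the consultation document's illustrative range
    (98.5 to 99); the real threshold will not be set until 2016.
    """
    sufficient_progress = all(va >= va_floor
                              for va in va_by_subject.values())
    return sufficient_progress or pct_expected_standard >= attainment_floor

# A school with weak attainment but adequate progress stays above the floor.
va = {"reading": 99.1, "writing": 98.7, "maths": 99.4}
print(above_floor(va, pct_expected_standard=62.0))  # True
```

Which test binds in practice depends on where the thresholds are finally set – the CentreForum modelling discussed below suggests it will almost always be the progress test.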

In 2022 schools will be assessed against either the reception or KS1 baseline, whichever gives the best result. From 2023 only the reception baseline will be in play.

The attainment standard will be based on achievement of ‘a scaled score of 100 or more’ in each of the reading and maths tests and achievement, via teacher assessment, of the new expected standard in writing (presumably the middle of the five described above).

The attainment standard is significantly more demanding, in that the present requirement is for 65% of learners to meet the expected standard – and the standard itself will now be pitched higher, at the equivalent of Level 4B.

The original consultation document says:

‘Our modelling suggests that a progress measure set at this level, combined with the 85% threshold attainment measure, would result in a similar number of schools falling below the floor as at present. Over time we will consider whether schools should make at least average progress as part of floor standards.’

The consultation response does not confirm this judgement.

.

Developments

The only significant development since the publication of the consultation response is the detail provided on the June 2014 webpage New accountability arrangements for infant schools from 2016.

In addition to the points in the previous section, this also confirms that:

‘…there will not be a floor standard for infant schools’

But this statement has been called into question, since the table from the performance descriptors consultation, reproduced above, appears to suggest that KS1 teacher assessments in reading, writing and maths do contribute to a floor standard – whether for infant or all-through primary schools is unclear.

.

The aforementioned CentreForum report 'Progress matters in Primary too' (January 2015) also appears to call into question the results of the modelling reported in the initial consultation document.

It says:

‘…the likelihood is that, based on current performance, progress will be the measure used for the vast majority of schools, at least in the short to medium term. Even those schools which achieve the attainment floor target will only do so by ensuring at least average progress is made by their pupils. As a result, progress will in practice be the dominant accountability metric.’

It undertakes modelling based on 2013 attainment data – ie simulating the effect of the new standards had they been in place in 2013, using selected learning areas within the EYFSP as a proxy for the reception baseline – which suggests that just 10% of schools in 2013 would have met the new attainment floor.

It concludes that:

‘For the vast majority of schools, progress will be their only option for avoiding intervention when the reforms come into effect.’

Unfortunately though, it does not provide an estimate of the proportion of schools likely to achieve the progress floor standard, with either the current KS1 baseline or its proxy for a reception baseline.

Outstanding tasks

  • Confirm the detailed methodology for deriving both the attainment and progress elements of the floor standards, in relation to both the new reception baseline and the interim KS1 baseline (summer 2015?)
  • Set the amount of progress required to achieve the progress element of the floor standards (summer 2016)
  • (In the consultation document) Consider whether schools should make at least average progress as part of floor standards and ‘move to three year rolling averages for floor standard measures’ (long term)

.

Overall progress, Purdah and General Election outcomes

Progress to date and actions outstanding

The lists of outstanding actions above record some 40 tasks necessary to the successful implementation of the primary assessment and accountability reforms.

If the ‘advance notice’ conventions are observed, roughly half of these require completion by the end of the summer term in July 2015, within the two windows of 50 working days on either side of Purdah.

These conventions have already been set aside in some cases, most obviously in respect of reception baseline assessment and the performance descriptors for statutory teacher assessment.

Unsurprisingly, the commentary above suggests that these two strands of the reform programme are the most complex and potentially the most problematic.

The sheer number of outstanding tasks and the limited time in which to complete them could pose problems.

It is important to remember that there are similar reforms in the secondary and post-16 sectors that need to be managed in parallel.

The leaked amber/red rating was attributed solely to the negative reaction to the draft performance descriptors, but it could also reflect a wider concern that all the necessary steps may not be completed in time to give schools the optimal period for planning and preparation.

Schools may be able to cope with shorter notice in a few instances, where the stakes are relatively low, but if too substantial a proportion of the overall reform programme is delayed into next academic year, they will find the cumulative impact much harder to manage.

In a worst case scenario, implementation of some elements might need to be delayed by a year, although the corollary would be an extended transition period for schools that would be less than ideal. It may also be difficult to disentangle the different strands given the degree of interdependency between them.

Given the proximity of a General Election, it may not be politic to confirm such delays before Purdah intervenes: the path of least resistance is probably to postpone any difficult decisions for consideration by the incoming government.

.

The implications of Purdah

As noted above, if the General Election result is clear-cut, Purdah will last some five-and-a-half weeks and will occur at a critical point in the implementation timetable.

The impact of Purdah should not be under-estimated.

From the point at which Parliament is dissolved on Monday 30 March, the Government must abstain from major policy decisions and announcements.

The Election is typically announced a few days before the dissolution of Parliament, and this 'wash up' period between announcement and dissolution is used to complete essential unfinished business.

The Cabinet Office issues guidance on conduct during Purdah shortly before it begins.

The 2015 guidance has not yet been issued, so the 2010 guidance is the best source of information about what to expect.

.

[Postscript: 2015 Guidance was posted on 30 March 2015 and is substantively the same as the 2010 edition.]

.

Key points include:

  • ‘Decisions on matters of policy on which a new Government might be expected to want the opportunity to take a different view from the present Government should be postponed until after the Election, provided that such postponement would not be detrimental to the national interest or wasteful of public money.’
  • ‘Officials should not… be asked to devise new policies or arguments…’
  • ‘Departmental communications staff may…properly continue to discharge during the Election period their normal function only to the extent of providing factual explanation of current Government policy, statements and decisions.’
  • ‘There would normally be no objection to issuing routine factual publications, for example, health and safety advice but these will have to be decided on a case by case basis taking account of the subject matter and the intended audience.’
  • ‘Regular statistical releases and research reports (e.g. press notices, bulletins, publications or electronic releases) will continue to be issued and published on dates which have been pre-announced. Ad hoc statistical releases or research reports should be released only where a precise release date has been published prior to the Election period. Where a pre-announcement has specified that the information would be released during a specified period (e.g. a week, or longer time period), but did not specify a precise day, releases should not be published within the Election period.’
  • ‘Research: Fieldwork involving interviews with the public or sections of it will be postponed or abandoned although regular, continuous and on-going statistical surveys may continue.’
  • ‘Official websites…the release of new online services and publication of reworked content should not occur until after the General Election… Content may be updated for factual accuracy but no substantial revisions should be made and distributed.’
  • The general principles and conventions set out in this guidance apply to NDPBs and similar public bodies.

Assuming similar provisions in 2015, most if not all of the assessment and accountability work programme would grind to a halt.

To take an example, it is conceivable that those awarded baseline assessment contracts would be able to recruit schools after 30 March, but they would receive little or no help from the DfE during the Purdah period. Given that the recruitment deadline is 30 April, this may be expected to depress recruitment significantly.

.

The impact of different General Election outcomes

Forming a Government in the case of a Hung Parliament may also take some time, further delaying the process.

The six days taken in 2010 may not be a guide to what will happen in 2015.

The Cabinet Manual (2011) says:

‘Where an election does not result in an overall majority for a single party, the incumbent government remains in office unless and until the Prime Minister tenders his or her resignation and the Government’s resignation to the Sovereign. An incumbent government is entitled to wait until the new Parliament has met to see if it can command the confidence of the House of Commons, but is expected to resign if it becomes clear that it is unlikely to be able to command that confidence and there is a clear alternative…

…The nature of the government formed will be dependent on discussions between political parties and any resulting agreement. Where there is no overall majority, there are essentially three broad types of government that could be formed:

  • single-party, minority government, where the party may (although not necessarily) be supported by a series of ad hoc agreements based on common interests;
  • formal inter-party agreement, for example the Liberal–Labour pact from 1977 to 1978; or
  • formal coalition government, which generally consists of ministers from more than one political party, and typically commands a majority in the House of Commons’.

If one or more of the parties forming the next government has a different policy on assessment and accountability, this could result in pressure to amend or withdraw parts of the reform programme.

If a single party is involved, pre-Election contact with civil servants may have clarified its intentions, enabling work to resume as soon as the new government is in place but, if more than one party is involved, it may take longer to agree the preferred way forward.

Under a worst case scenario, planners might need to allow for Purdah and post-Election negotiations to consume eight weeks or longer.

The impact of the Election on the shape and scope of the primary assessment and accountability reforms will also depend on which party or parties enter government.

If the same Coalition partners are returned, one might expect uninterrupted implementation, unless the minority Lib Dems seek to negotiate different arrangements, which seems unlikely.

But if a different party or a differently constituted Coalition forms the Government, one might expect decisions to abandon or delay some aspects of the programme.

If Labour forms the Government, or is the major party in a Coalition, some unravelling will be necessary.

They are broadly committed to the status quo:

‘Yet when it comes to many of the technical day-to-day aspects of school leadership – child protection, curriculum reform, assessment and accountability – we believe that a period of stability could prove beneficial for raising pupil achievement. This may not be an exciting rallying cry, but it is crucial that the incoming government takes account of the classroom realities.’

Hunt has also declared:

'Do not mistake me: I am a zealot for minimum standards, rigorous assessment and intelligent accountability.

But if we choose to focus upon exam results and league tables to the detriment of everything else, then we are simply not preparing our young people for the demands of the 21st century.’

And, thus far, Labour has made few specific commitments in this territory.

  • They support reception baseline assessment but whether that extends to sustaining a market of providers is unknown. Might they be inclined to replace this with a single national assessment?
  • There is very little about floor targets – a Labour invention – although the Blunkett Review appears to suggest that Directors of School Standards will enjoy some discretion in respect of their enforcement.

Reading between the lines, it seems likely that they would delay some of the strands described above – and potentially simplify others.

.

Conclusion

The primary assessment reform programme is both extensive and highly complex, comprising several strands and many interdependencies.

Progress to date can best be described as halting.

There are still many steps to be taken and difficult issues to resolve, about half of which should be completed by the end of this academic year. Pre-Election Purdah will cut significantly into the time available.

More announcements may be delayed into the summer holidays or the following autumn term, but this reduces the planning and preparation time available to schools and has potentially significant workload implications.

Alternatively, implementation of some elements or strands may be delayed by a year, but this extends the transition period between old and new arrangements. Any such rationalisation seems likely to be delayed until after the Election and decisions will be influenced by its outcome.

.

[Postscript: The commitment in the Government’s Workload Challenge response to a one-year lead time, now encapsulated in the Protocol published on 23 March, has not resulted in any specific commitments to delay ahead of the descent of Purdah.

At the onset of Purdah on 30 March some 18 actions appear to be outstanding and requiring completion by the end of the summer term. This will be a tall order for a new Government, especially one of a different complexion.]

.

If Labour is the dominant party, they may be more inclined to simplify some strands, especially baseline assessment and statutory teacher assessment, while also providing much more intensive support for schools wrestling with the removal of levels.

Given the evidence set out above, ‘amber/red’ seems an appropriate rating for the programme as a whole.

It seems increasingly likely that some significant adjustments will be essential, regardless of the Election outcome.

.

GP

January 2015

Addressed to Teach First and its Fair Education Alliance

.

This short opinion piece was originally commissioned by the TES in November.

My draft reached them on 24 November; they offered some edits on 17 December.

Betweentimes the Fair Education Alliance Report Card made its appearance on 9 December.

Then Christmas intervened.

On 5 January I offered the TES a revised version, which they said would be published on 27 February. It never appeared.

This Tweet

.

.

prompted an undertaking that it would appear on 27 March. I’ll believe that when I see it.

But there’s no reason why you should wait any longer. This version is more comprehensive anyway, in that it includes several relevant Twitter comments and additional explanatory material.

I very much hope that Teach First and members of the Fair Education Alliance will read it and reflect seriously on the proposal it makes.

As the final sequence of Tweets below shows, Teach First committed to an online response on 14 February. Still waiting…

.

.

.

How worried are you that so few students on free school meals make it to Oxbridge?

Many different reasons are offered by those who argue that such concern may be misplaced:

  • FSM is a poor proxy for disadvantage; any number of alternatives is preferable;
  • We shouldn’t single out Oxbridge when so many other selective universities have similarly poor records;
  • We obsess about Oxbridge when we should be focused on progression to higher education as a whole;
  • We should worry instead about progression to the most selective courses, which aren’t necessarily at the most selective universities;
  • Oxbridge suits a particular kind of student; we shouldn’t force square pegs into round holes;
  • We shouldn’t get involved in social engineering.

Several of these points are well made. But they can be deployed as a smokescreen, obscuring the uncomfortable fact that, despite our collective best efforts, there has been negligible progress against the FSM measure for a decade or more.

Answers to Parliamentary Questions supplied by BIS say that the total fluctuated between 40 and 45 in the six years from 2005/06 to 2010/11.

The Department for Education’s experimental destination measures statistics suggested that the 2010/11 intake was 30, rising to 50 in 2011/12, of which 40 were from state-funded schools and 10 from state-funded colleges. But these numbers are rounded to the nearest 10.

By comparison, the total number of students recorded as progressing to Oxbridge from state-funded schools and colleges in 2011/12 is 2,420.

This data underpins the adjustment of DfE's 'FSM to Oxbridge' impact indicator, from 0.1% to 0.2%. It will be interesting to see whether there is stronger progress in the 2012/13 destination measures, due later this month.

.

[Postscript: The 2012/13 Destinations Data was published on 26 January 2015. The number of FSM learners progressing to Oxbridge is shown only in the underlying data (Table NA 12).

This tells us that the numbers are unchanged: 40 from state-funded schools; 10 from state-funded colleges, with both totals again rounded to the nearest 10.

So any improvement in 2011/12 has stalled in 2012/13, or is too small to register given the rounding (and the rounding might even mask a deterioration).

.

.

The non-FSM totals progressing to Oxbridge in 2012/13 are 2,080 from state-funded schools and 480 from state-funded colleges, giving a total of 2,560. This is an increase of some 6% compared with 2011/12.

Subject to the vagaries of rounding, this suggests that the ratio of non-FSM to FSM learners progressing from state-funded institutions deteriorated in 2012/13 compared with 2011/12.]
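To spell out the arithmetic behind that last point (using the rounded figures throughout, so the result is indicative only): in 2011/12 the state-funded total of 2,420 less the 50 FSM learners gives roughly 2,370 non-FSM learners, a ratio of about 47:1; in 2012/13 the comparable figures are 2,560 and 50, a ratio of about 51:1.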

.

The routine explanation is that too few FSM-eligible students achieve the top grades necessary for admission to Oxbridge. But answers to Parliamentary Questions reveal that, between 2006 and 2011, the number achieving three or more A-levels at grade A or above increased by some 45 per cent, reaching 546 in 2011.

Judged on this measure, our national commitment to social mobility and fair access is not cutting the mustard. Substantial expenditure – by the taxpayer, by universities and the third sector – is making too little difference too slowly. Transparency is limited because the figures are hostages to fortune.

So what could be done about this? Perhaps the answer lies with Teach First and the Fair Education Alliance.

Towards the end of last year Teach First celebrated a decade of impact. It published a report and three pupil case studies, one of which featured a girl who was first in her school to study at Oxford.

I tweeted

.

.

Teach First has a specific interest in this area, beyond its teacher training remit. It runs a scheme, Teach First Futures, for students who are 'currently under-represented in universities, including those whose parents did not go to university and those who have claimed free school meals'.

Participants benefit from a Teach First mentor throughout the sixth form, access to a 4-day Easter school at Cambridge, university day trips, skills workshops and careers sessions. Those applying to Oxbridge receive unspecified additional support.

.

.

Information about the number of participants is not always consistent, but various Teach First sources suggest there were some 250 in 2009, rising to 700 in 2013. This year the target is 900. Perhaps some 2,500 have taken part to date.

Teach First’s impact report  says that 30 per cent of those who had been through the programme in 2013 secured places at Russell Group universities and that 60 per cent of participants interviewed at Oxbridge received an offer.

I searched for details of how many – FSM or otherwise – had actually been admitted to Oxbridge. Apart from one solitary case study, all I could find was a report that mentioned four Oxbridge offers in 2010.

.

.

.

.

.

Through the Fair Education Alliance, Teach First and its partners are committed to five impact goals, one of which is to:

‘Narrow the gap in university graduation, including from the 25% most selective universities, by 8%’*

Last month the Alliance published a Report Card which argued that:

‘The current amount of pupil premium allocated per disadvantaged pupil should be halved, and the remaining funds redistributed to those pupils who are disadvantaged and have low prior attainment. This would give double-weighting to those low income pupils most in need of intervention without raising overall pupil premium spend.’

It is hard to understand how this would improve the probability of achieving the impact goal above, even though the gaps the Alliance wishes to close are between schools serving high and low income communities.

.

.

.

Perhaps it should also contemplate an expanded Alliance Futures Scheme, targeting simultaneously this goal and the Government’s ‘FSM to Oxbridge’ indicator, so killing two birds with one stone.

A really worthwhile Scheme would need to be ambitious, imposing much-needed coherence without resorting to prescription.

Why not consider:

  • A national framework for the supply side, in which all providers – universities included – position their various services.
  • Commitment on the part of all secondary schools and colleges to a coherent long-term support programme for FSM students, with open access at KS3 but continuing participation in KS4 and KS5 subject to successful progress.
  • Schools and colleges responsible for identifying participants’ learning and development needs and addressing those through a blend of internal provision and appropriate services drawn from the national framework.
  • A personal budget for each participant, funded through an annual £50m topslice from the Pupil Premium (there is a precedent) plus a matching sum from universities’ outreach budgets. Those with the weakest fair access records would contribute most. Philanthropic donations would be welcome.
  • The taxpayer’s contribution to all university funding streams made conditional on them meeting challenging but realistic fair access and FSM graduation targets – and publishing full annual data in a standard format.

.

.

*In the Report Card, this impact goal is differently expressed, as narrowing the gap in university graduation, so that at least 5,000 more students from low income backgrounds graduate each year, 1,600 of them from the most selective universities. This is to be achieved by 2022.

‘Low income backgrounds’ means schools where 50% or more pupils come from the most deprived 30% of families according to IDACI.

The gap to be narrowed is between these and pupils from ‘high income backgrounds’, defined as schools where 50% or more pupils come from the least deprived 30% of families according to IDACI.

‘The most selective universities’ means those in the Sutton Trust 30 (the top 25% of universities with the highest required UCAS scores).

The proposed increases in graduation rates from low income backgrounds do not of themselves constitute a narrowing gap, since there is no information about the corresponding changes in graduation rates from high income backgrounds.

This unique approach to closing gaps adds yet another methodology to the already long list applied to fair access. It risks adding further density to the smokescreen described at the start of this post.

.

.

.

GP

January 2015

2014 Primary and Secondary Transition Matrices: High Attainers’ Performance

.

This is my annual breakdown of what the Transition Matrices tell us about the national performance of high attainers.

Data Overload courtesy of opensourceway

It complements my reviews of High Attainment in the 2014 Primary Performance Tables (December 2014) and of High Attainment in the 2014 Secondary and Post-16 Performance Tables (forthcoming, in February 2015).

The analysis is based on:

  • The 2014 Static national transition matrices for reading, writing and mathematics – Key Stage 1 to Key Stage 2 (October 2014) and
  • The 2014 Static Key Stage 2 to 4 national transition matrices unamended – English and maths (December 2014).

There is also some reference to SFR41/2014: Provisional GCSE and equivalent results in England, 2013 to 2014.

The post begins with some important explanatory notes, before examining the primary and then the secondary matrices. There is a commentary on each matrix, followed by a summary of the key challenges for each sector.

.

Explanatory notes

The static transition matrices take into account results from maintained mainstream and maintained and non-maintained special schools. 

The tables reproduced below use colour coding:

  • purple = more than expected progress
  • dark green = expected progress
  • light green = less than expected progress and
  • grey = those excluded from the calculation.

I will assume that readers are familiar with expectations of progress under the current system of national curriculum levels.

I have written before about the assumptions underpinning this approach and some of the issues it raises.

(See in particular the sections called:

 ‘How much progress does the accountability regime expect from high attainers?’ and

‘Should we expect more progress from high attainers?’)

I have not reprised that discussion here.

The figures within the tables are percentages – X indicates data that has been suppressed (where the cohort comprises only one or two learners). Because of rounding, lines do not always add up to 100%.

In the case of the primary matrices, the commentary below concentrates on the progress made by learners who achieved level 3 or level 4 at KS1. In the case of the secondary matrices, it focuses on those who achieved sub-levels 5A, 5B or 5C at KS2.

Although the primary matrices include progression from KS1 level 4, the secondary matrices do not include progression from KS2 level 6 since the present level 6 tests were introduced only in 2012. Those completing GCSEs in 2014 will typically have undertaken KS2 assessment five years earlier.

The analysis includes comparison with the matrices for 2012 and 2013 respectively.
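For readers unfamiliar with how these matrices are constructed, the sketch below shows the essential computation for the KS1 to KS2 case: row percentages by prior attainment, with each cell banded against the expected two levels of progress (three levels in the KS2 to KS4 case). The cohort data is invented for illustration:

```python
from collections import Counter

# Hypothetical (KS1 level, KS2 level) pairs for a small cohort; the
# real matrices are computed from national pupil-level data.
cohort = [(3, 5), (3, 5), (3, 4), (3, 6), (2, 4), (2, 5), (2, 3)]

EXPECTED_LOP = 2  # KS1 to KS2 expectation; KS2 to KS4 would be 3

counts = Counter(cohort)
for ks1 in sorted({k1 for k1, _ in cohort}):
    row_total = sum(n for (k1, _), n in counts.items() if k1 == ks1)
    for (k1, ks2), n in sorted(counts.items()):
        if k1 != ks1:
            continue
        pct = 100 * n / row_total   # row percentages, as in the tables
        lop = ks2 - ks1             # levels of progress made
        band = ("more than expected" if lop > EXPECTED_LOP
                else "expected" if lop == EXPECTED_LOP
                else "less than expected")
        print(f"KS1 L{ks1} -> KS2 L{ks2}: {pct:.0f}% ({band})")
```

Sub-levels (2C/2B/2A, 5C/5B/5A) complicate the real tables but not the underlying logic.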

.

The impact of policy change on the secondary matrices

This comparison is straightforward for the primary sector (KS1 to KS2) but is problematic when it comes to the secondary matrices (KS2 to KS4).

As SFR41/2014 makes clear, the combined impact of:

  • vocational education reforms (restricting eligible qualifications and significantly reducing the weighting of some of them) and 
  • early entry policy (recording in performance measures only the first result achieved, rather than the outcome of any retakes)

has depressed overall KS4 results.

The impact of these factors on progress is not discussed within the text, although one of the tables gives overall percentages for those making the expected progress under the old and new methodologies respectively.

It does so for two separate groups of institutions, neither of which is perfectly comparable with the transition matrices because of the treatment of special schools:

  • State funded mainstream schools (excluding state-funded special schools and non-maintained special schools) and
  • State-funded schools (excluding non-maintained special schools).

However, the difference is likely to be marginal.

There is certainly very little difference between the two sets of figures for the categories above, though the percentages are very slightly larger for the first.

They show:

  • A variation of 2.3 percentage points in English (72.1% making at least the expected progress under the new methodology compared with 74.4% under the old) and
  • A variation of 2.4 percentage points in maths (66.4% making at least the expected progress compared with 68.8%).

There is no such distinction in the static transition matrices, nor does the SFR provide any information about the impact of these policy changes for different levels of prior attainment.

It seems a reasonable starting hypothesis that the impact will be much reduced at higher levels of prior attainment, because comparatively fewer students will be pursuing vocational qualifications.

One might also expect comparatively fewer high attainers to require English and/or maths retakes, even when the consequences of early entry are factored in, but that is rather more provisional.

It may be that the differential impact of these reforms on progression from different levels of prior attainment will be discussed in the statistical releases to be published alongside the Secondary Performance Tables. In that case I will update this treatment.

For the time being, my best counsel is:

  • To be aware that these policy changes have almost certainly had some impact on the progress of secondary high attainers, but 
  • Not to fall into the trap of assuming that they must explain all – or even a substantial proportion – of any downward trends (or absence of upward trends for that matter).

There will be more to say about this in the light of the analysis below.

Is this data still meaningful?

As we all know, the measurement of progression through national curriculum levels will shortly be replaced by a new system.

There is a temptation to regard the methodology underpinning the transition matrices as outmoded and irrelevant.

For the time being though, the transition matrices remain significant to schools (and to Ofsted) and there is an audience for analysis based on them.

Moreover, it is important that we make our best efforts to track annual changes under the present system, right up to the point of changeover.

We should also be thinking now about how to match progression outcomes under the new model with those available under the current system, so as to secure an uninterrupted perspective of trends over time.

Otherwise our conclusions about the longer-term impact of educational policies to raise standards and close gaps will be sadly compromised.

.

2014 Primary Transition Matrices

.

Reading

.

TM reading KS12 Capture

.

Commentary:

  • It appears that relatively few KS1 learners with L4 reading achieved the minimum expected 2 levels of progress by securing L6 at the end of KS2. It is not possible for these learners to make more than the expected progress. The vast majority (92%) recorded a single level of progress, to KS2 L5. This contrasts with 2013, when 12% of KS1 L4 learners did manage to progress to KS2 L6, while only 88% were at KS2 L5. Caution is necessary since the sample of KS1 L4 readers is so small. (The X suggests the total cohort could be as few as 25 pupils.)
  • The table shows that 1% of learners achieving KS1 L3 reading made 3 levels of progress to KS2 L6, exactly the same proportion as in 2012 and 2013. But we know that L6 reading test entries were up 36% compared with 2013: one might reasonably have expected some increase in this percentage as a consequence. The absence of improvement may be attributable to the collapse in success rates on the 2014 L6 reading test.
  • 90% of learners achieving KS1 L3 made the expected 2 or more levels of progress to KS2 L5 or above, 89% making 2 levels of progress to L5. The comparable figures for those making 2 LoP in 2013 and 2012 were 85% and 89% respectively.
  • In 2014 only 10% of those achieving KS1 L3 made a single level of progress to KS2 L4, compared with 13% in 2013 and 10% in 2012. 
  • So, when it comes to L3 prior attainers, the 2013 dip has been overcome, but there has been no improvement beyond the 2012 outcomes. Chart 1 makes this pattern more obvious, illustrating clearly that there has been relatively little improvement across the board.

.

TM chart 1

Chart 1: Percentage of learners with KS1 L3 reading making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is significantly lower than the proportions with KS1 L2A, L2B or L2 overall who do so. This pattern is unchanged from 2012 and 2013.
  • The proportion exceeding 2 LoP is also far higher for every other level of KS1 prior achievement, also unchanged from 2012 and 2013.
  • Whereas the gap between KS1 L2 and L3 making more than 2 LoP was 36 percentage points in 2013, by 2014 it had increased substantially to 43 percentage points (44% versus 1%). This may again be partly attributable to the decline in L6 reading results.

.

Writing

.

TM writing KS12 Capture

Commentary:

  • 55% of learners with L4 in KS1 writing made the expected 2 levels of progress to KS2 L6, while only 32% made a single level of progress to KS2 L5. This throws into sharper relief the comparable results for L4 readers. 
  • On the other hand, the 2013 tables recorded 61% of L4 writers making the expected progress, six percentage points higher than the 2014 success rate, so there has been a decline in success rates in both reading and writing for this small cohort. The reason for this is unknown, but it may simply be a consequence of the small sample.
  • Of those achieving KS1 L3, 12% made 3 LoP to KS2 L6, up from 6% in 2012 and 9% in 2013. The comparison with reading is again marked. A further 2% of learners with KS1 L2A made 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 writing made the expected 2 or more levels of progress, up from 89% in 2013. Some 79% made 2 LoP to L5, compared with 80% in 2013 and 79% in 2012, so there has been relatively little change.
  • However, in 2014 9% made only a single level of progress to KS2 L4. This is an improvement on 2013, when 11% did so and continues an improving trend from 2012 when 15% fell into this category, although the rate of improvement has slowed somewhat. 
  • These positive trends are illustrated in Chart 2 below, which shows reductions in the proportion achieving a single LoP broadly matched by corresponding improvements in the proportion achieving 3 LoP.

TM chart 2 

Chart 2: Percentage of learners with KS1 L3 writing making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 making the expected progress is again lower than the proportions with KS1 L2A, L2B or L2 overall doing so. It is even lower than the proportion of those with KS1 L1 achieving this outcome. This is unchanged from 2013.
  • The proportion exceeding 2 LoP is far higher for every other level of KS1 achievement excepting L2C, again unchanged from 2013.
  • The percentage point gap between those with KS1 L2 overall and KS1 L3 making more than 2 LoP was 20 points in 2013 and remains unchanged at 20 points in 2014. Once again there is a marked contrast with reading. 

.

Maths

.

TM maths KS12 Capture

.

Commentary:

  • 95% of those achieving L4 maths at KS1 made the expected 2 levels of progress to KS2 L6. These learners are unable to make more than expected progress. Only 5% made a single level of progress to KS2 L5. 
  • There is a marked improvement since 2013, when 89% made the expected progress and 11% fell short. This is significantly better than KS1 L4 progression in writing and hugely better than KS1 L4 progression in reading.
  • 35% of learners with KS1 L3 maths also made 3 levels of progress to KS2 L6. This percentage is up from 26% in 2013 and 14% in 2012, indicating a continuing trend of strong improvement. In addition, 6% of those with L2A and 1% of those at L2B managed 4 levels of progress to KS2 L6.
  • 91% of learners with KS1 L3 made the expected progress (up one percentage point compared with 2013). Of these, 56% made 2 LoP to KS2 L5. However, 9% made only a single level of progress to KS2 L4 (down a single percentage point compared with 2013).
  • Chart 3 illustrates these positive trends. It contrasts with the corresponding chart for writing above, in that the proportion of KS1 L3 learners making a single LoP is falling much more slowly than the proportion making 3 LoP is rising.

.

TM chart 3

Chart 3: Percentage of learners with KS1 L3 maths making 1, 2 and 3 Levels of progress, 2012 to 2014

.

  • The proportion of learners with KS1 L3 in maths who achieved the expected progress is identical to the proportion achieving L2 overall that do so, at 91%. However, these rates are lower than for learners with KS1 2B and especially 2A.
  • The proportion exceeding 2 LoP is also identical for those with KS1 L3 and L2 overall (whereas in 2013 there was a seven percentage point gap in favour of those with KS1 L2). The proportion of those with KS1 L2A exceeding 2 LoP remains significantly higher, but the gap has narrowed by six percentage points compared with 2013.

.

Key Challenges: Progress of High Attainers between KS1 and KS2

The overall picture from the primary transition matrices is one of comparatively strong progress in maths, positive progress in writing and a much more mixed picture in reading. But in none of these areas is the story unremittingly positive.

Priorities should include:

  • Improving progression from KS1 L4 to KS2 L6, so that the profile for writing becomes more similar to the profile for maths and, in particular, so that the profile for reading much more closely resembles the profile for writing. No matter how small the cohort, it cannot be acceptable that 92% of KS1 L4 readers make only a single level of progress.
  • Reducing to negligible the proportion of KS1 L3 learners making a single level of progress to KS2 L4. Approximately 1 in 10 learners continue to do so in all three assessments, although there has been some evidence of improvement since 2012, particularly in writing. Other than in maths, the proportion of KS1 L3 learners making a single LoP is significantly higher than the proportion of KS1 L2 learners doing so. 
  • Continuing to improve the proportion of KS1 L3 learners making 3 LoP in each of the three assessments, maintaining the strong rate of improvement in maths, increasing the rate of improvement in writing and moving beyond stagnation at 1% in reading. 
  • Eliminating the percentage point gaps between those with KS1 L2A making at least the expected progress and those with KS1 L3 doing so (5 percentage points in maths and 9 percentage points in each of reading and writing). At the very least, those at KS1 L3 should be matching those at KS1 L2B, but there are presently gaps between them of 2 percentage points in maths, 5 percentage points in reading and 6 percentage points in writing.
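For readers who want to replicate this levels-of-progress arithmetic against their own data, a minimal sketch follows. It assumes the standard convention used in the primary transition matrices – progress is counted in whole national curriculum levels, sublevels such as 2A/2B/2C do not affect the calculation, and expected progress from KS1 to KS2 is two levels. The function name and structure are illustrative rather than drawn from any official tool.

```python
def primary_lop(ks1_level: int, ks2_level: int) -> dict:
    """Levels of progress between KS1 and KS2, counted in whole levels.

    Assumes the convention used in the transition matrices: expected
    progress is 2 levels (e.g. KS1 L3 -> KS2 L5) and sublevels are ignored.
    """
    lop = ks2_level - ks1_level
    return {
        "levels_of_progress": lop,
        "expected": lop >= 2,   # made at least the expected progress
        "exceeded": lop >= 3,   # e.g. KS1 L3 -> KS2 L6
    }

# A KS1 L3 learner reaching KS2 L5 has made exactly the expected 2 LoP;
# reaching L6 would be 3 LoP. KS1 L4 learners cannot exceed expected
# progress because L6 is the ceiling at KS2.
print(primary_lop(3, 5))  # {'levels_of_progress': 2, 'expected': True, 'exceeded': False}
print(primary_lop(3, 6))  # {'levels_of_progress': 3, 'expected': True, 'exceeded': True}
```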

.

Secondary Transition Matrices

.

English

.

[Transition matrix: KS2 to KS4 English]

.

Commentary:

  • 98% of learners achieving L5A English at KS2 made at least 3 levels of progress to GCSE grade B or above in 2014. The same is true of 93% of those with KS2 L5B and 75% of those with KS2 L5C. All three figures have improved by one percentage point compared with 2013. The comparable figures in 2012 were 98%, 92% and 70% respectively.
  • 88% of learners with KS2 L5A made at least four levels of progress from KS2 to KS4, securing a GCSE grade of A* or A, as did 67% of those with L5B and 34% of those with L5C. The comparable figures in 2013 were 89%, 66% and 33% respectively, while in 2012 they were 87%, 64% and 29% respectively.
  • 51% of learners with KS2 L5A made 5 levels of progress by achieving an A* grade at GCSE, compared with 25% of those with L5B, 7% of those with L5C and 1% of those with L4A. Compared with 2013, the L5A success rate is down by two percentage points, while the L5B and L5C rates have improved, the former by two points.
  • These cumulative totals suggest relatively little change in 2014 compared with 2013, with the possible exception of the two-percentage-point swings in the proportions of students making 5 LoP.
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the same as the cumulative totals quoted above; a short worked example below shows how the two relate.) This again shows relatively small changes in 2014, compared with 2013, and no obvious pattern.

.


Chart 4: Percentage of learners with KS2 L5A, L5B and L5C in English achieving 3, 4 and 5 levels of progress, 2012-2014

.

  • 1% of learners with KS2 L5A made only 2 levels of progress to GCSE grade C, as did 6% of those with L5B and 20% of those with L5C. These percentages are again little changed compared with 2013, following a much more significant improvement between 2012 and 2013.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 87% and 48% respectively – are significantly higher than the corresponding percentages for those with KS2 L5C. These gaps have also changed very little compared with 2013.
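Since the commentary moves between cumulative totals ('at least n LoP') and the exact proportions plotted in the charts, a quick sketch may help to show how one is derived from the other. It uses the 2014 English figures for KS2 L5A quoted above, and assumes 5 LoP is the top band:

```python
# Cumulative shares for KS2 L5A in English, 2014 (quoted above):
# at least 3 LoP = 98%, at least 4 LoP = 88%, 5 LoP = 51%.
cumulative = {3: 98, 4: 88, 5: 51}

# The exact share making n LoP is the cumulative share at n minus the
# cumulative share at n + 1 (5 LoP is its own band at the top).
exact = {n: cumulative[n] - cumulative.get(n + 1, 0) for n in cumulative}

print(exact)  # {3: 10, 4: 37, 5: 51} - the kind of figures Chart 4 plots
```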

.

Maths

.

[Transition matrix: KS2 to KS4 maths]

.

Commentary:

  • 96% of learners with L5A at KS2 achieved the expected progress between KS2 and KS4 in 2014, as did 86% of those with KS2 L5B and 65% of those with KS2 L5C. The comparable percentages in 2013 were 97%, 88% and 70%, while in 2012 they were 96%, 86% and 67%. This means there have been declines compared with 2013 for L5A (one percentage point), L5B (two percentage points) and L5C (five percentage points).
  • 80% of learners with KS2 L5A made 4 or more levels of progress between KS2 and KS4, so achieving a GCSE grade A* or A. The same was true of 54% of those with L5B and 26% of those with L5C. In 2013, these percentages were 85%, 59% and 31% respectively, while in 2012 they were 84%, 57% and 30% respectively. So all the 2014 figures – for L5A, L5B and L5C alike – are five percentage points down compared with 2013.
  • In 2014, 48% of learners with KS2 L5A made 5 levels of progress by achieving a GCSE A* grade, compared with 20% of those with L5B, 5% of those with L5C and 1% of those with L4A. All three percentages for those with KS2 L5 are down compared with 2013 – by 3 percentage points in the case of those with L5A, 2 points for those with L5B and 1 point for those with L5C.
  • It is evident that maths progression is rather more volatile than English, and some of the downward swings are more pronounced.
  • The chart below compares the proportion of students with KS2 L5A, 5B and 5C respectively making exactly 3, 4 and 5 LoP. (NB: these are not the cumulative totals quoted above.) The only discernible pattern is that any improvement is confined to those making 3 LoP.

.


Chart 5: Percentage of learners with KS2 L5A, L5B and L5C in Maths achieving 3, 4 and 5 levels of progress, 2012-2014

  • 4% of those with KS2 L5A made only 2 LoP to GCSE grade C, as did 13% of those with L5B and 31% of those with L5C. All three percentages have worsened compared with 2013, by 1, 2 and 4 percentage points respectively.
  • The percentages of learners with KS2 L4A who achieve at least 3 and at least 4 levels of progress – at 85% and 37% respectively – are significantly higher than the corresponding percentages for those with L5C, just as they are in English. And, as is the case with English, the percentage point gaps have changed little compared with 2013.

.

Key Challenges: Progress of High Attainers Between KS2 and KS4

The overall picture for high attainers from the secondary transition matrices is of relatively little change in English and of rather more significant decline in maths, though not by any means across the board.

It may be that the impact of the 2014 policy changes on high attainers has been relatively more pronounced in maths than in English – and perhaps more pronounced in maths than might have been expected.

If this is the case, one suspects that the decision to restrict reported outcomes to first exam entries is the most likely culprit.

On the other hand, it might be that relatively strong improvement in English progression has been cancelled out by these policy changes, though the figures provided in the SFR for expected progress regardless of prior attainment make this seem unlikely.

Leaving causation aside, the most significant challenges for the secondary sector are to:

  • Significantly improve the progression rates for learners with KS2 L5A to A*. It should be a default expectation that they achieve five levels of progress, yet only 48% do so in maths and 51% in English – and these percentages are down 3 and 2 percentage points respectively compared with 2013.
  • Similarly, significantly improve the progression rates for learners with KS2 L5B to grade A. It should be a default expectation that they achieve at least 4 LoP, yet only 67% do so in English and 54% in maths – up one point since 2013 in English but down 5 points in maths.
  • Reduce and ideally eliminate the rump of high attainers who make only 2 LoP, so falling short of expected progress. This group is especially large for those with KS2 L5C – 20% in English and, still worse, 31% in maths – but there is also a problem for those with L5B in maths, 13% of whom fall into this category. The proportion making only 2 LoP from L5C in maths has risen by 4 percentage points since 2013, while there has also been a 2 point rise for those with L5B. (Thankfully the L5C rate in English has improved by 2 points, but there is a long way still to go.)
  • Significantly close the progression performance gaps between learners with KS2 L5C and KS2 L4A, in both English and maths. In English there is currently a 12 percentage point gap for those making expected progress and a 14-point gap for those exceeding it. In maths, these gaps are 20 and 11 percentage points respectively. The problem in maths seems particularly pronounced. These gaps have changed little since 2013.
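At secondary level the same arithmetic applies, except that expected progress is three levels and GCSE grades must first be placed on the levels ladder. The sketch below assumes the conventional one-grade-per-level equivalence, anchored so that KS2 L4 plus 3 LoP gives grade C and KS2 L5 plus 3 LoP gives grade B; the function name and mapping are illustrative, not taken from any official tool.

```python
# Assumed grade-to-level equivalence: one GCSE grade per level, anchored
# so that KS2 L4 + 3 LoP = grade C and KS2 L5 + 3 LoP = grade B.
GCSE_LEVELS = {"G": 3, "F": 4, "E": 5, "D": 6, "C": 7, "B": 8, "A": 9, "A*": 10}

def secondary_lop(ks2_level: int, gcse_grade: str) -> dict:
    """Levels of progress between KS2 and KS4 (GCSE).

    Expected progress is 3 levels; KS2 sublevels (5A/5B/5C) are ignored,
    so a L5C learner needs grade B for expected progress just as a L5A
    learner does.
    """
    lop = GCSE_LEVELS[gcse_grade] - ks2_level
    return {"levels_of_progress": lop, "expected": lop >= 3}

# For a KS2 L5 learner: B = 3 LoP (expected), A = 4 LoP, A* = 5 LoP,
# while grade C represents only 2 LoP - the 'rump' discussed above.
print(secondary_lop(5, "B"))   # {'levels_of_progress': 3, 'expected': True}
print(secondary_lop(5, "A*"))  # {'levels_of_progress': 5, 'expected': True}
print(secondary_lop(5, "C"))   # {'levels_of_progress': 2, 'expected': False}
```

This asymmetry – a L5C learner must reach grade B for expected progress while a L4A learner needs only a C – may partly explain why the L4A progression rates quoted above outstrip those for L5C.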

.

Conclusion

This analysis of high attainers’ progression suggests a very mixed picture, across the primary and secondary sectors and between English and maths. There is some limited scope for congratulation, but too many persistent issues remain.

The commentary has identified four key challenges for each sector, which can be synthesised under two broad headings:

  • Raising expectations beyond the minimum expected progress – and significantly reducing our tolerance of underachievement amongst this cohort. 
  • Ensuring that those at the lower end of the high attaining spectrum sustain their initial momentum, at least matching the rather stronger progress of those with slightly lower prior attainment.

The secondary picture has become confused this year by the impact of policy changes.

We do not know to what extent these explain any downward trends – or depress any upward trends – for those with high prior attainment, though one may tentatively hypothesise that any impact has been rather more significant in maths than in English.

It would be quite improper to assume that the changes in high attainers’ progression rates compared with 2013 are entirely attributable to the impact of these policy adjustments.

It would be more accurate to say that they mask any broader trends in the data, making those more difficult to isolate.

We should not allow this methodological difficulty – or the impending replacement of the present levels-based system – to divert us from continuing efforts to improve the progression of high attainers.

For Ofsted is intensifying its scrutiny of how schools support the most able – and they will expect nothing less.

.

GP

January 2015

Gifted Phoenix 2014 Review and Retrospective

.

I am rounding out this year’s blogging with my customary backwards look at the various posts I published during 2014.

This is partly an exercise in self-congratulation but also flags up to readers any potentially useful posts they might have missed.

.


Norwegian Panorama by Gifted Phoenix

.

This is my 32nd post of the year, three fewer than the 35 I published in 2013. Even so, total blog views have increased by 20% compared with 2013.

Almost exactly half of these views originate in the UK. Other countries generating a large number of views include the United States, Singapore, India, Australia, Hong Kong, Saudi Arabia, Germany, Canada and South Korea. The site has been visited this year by readers located in 157 different countries.

My most popular post during 2014 was Gifted Education in Singapore: Part 2, which was published back in May 2012. This continues to attract interest in Singapore!

The most popular post written during 2014 was The 2013 Transition Matrices and High Attainers’ Performance (January).

Other 2014 posts that attracted a large readership were:

This illustrates just how strongly the accountability regime features in the priorities of English educators.

I have continued to weight my coverage towards domestic topics: approximately 75% of my posts this year have been about the English education system. I have not ventured beyond these shores since September.

The first section below reviews the minority of posts with a global perspective; the second covers the English material. A brief conclusion offers my take on future prospects.

.

Global Gifted Education

I began the year by updating my Blogroll, with the help of responses to Gifted Education Activity in the Blogosphere and on Twitter.

This post announced the creation of a Twitter list containing all the feeds I can find that mention gifted education (or a similar term, whether in English or another language) in their profile.

I have continued to update the list, which presently includes 1,312 feeds and has 22 subscribers. If you want to be included – or have additions to suggest – please don’t hesitate to tweet me.

While we’re on the subject, I should take this opportunity to thank my 5,960 Twitter followers, an increase of some 28% compared with this time last year.

In February I published A Brief Discussion about Gifted Labelling and its Permanency. This recorded a debate I had on Twitter about whether the ‘gifted label’ might be used more as a temporary marker than a permanent sorting device.

March saw the appearance of How Well Does Gifted Education Use Social Media?

This proposed some quality criteria for social media usage and blogs/websites that operate within the field of gifted education.

It also reviewed the social media activity of six key players (WCGTC, ECHA, NAGC, SENG, NACE and Potential Plus UK) as well as wider activity within the blogosphere, on five leading social media platforms and utilising four popular content creation tools.

Some of the websites mentioned above have been recast since the post was published and are now much improved (though I claim no direct influence).

Also in March I published What Has Become of the European Talent Network? Part One and Part Two.

These posts were scheduled just ahead of a conference organised by the Hungarian sponsors of the network. I did not attend, fearing that the proceedings would have limited impact on the future direction of this once promising initiative. I used the posts to set out my reservations, which include a failure to engage with constructive criticism.

Part One scrutinises the Hungarian talent development model on which the European Network is based. Part Two describes the halting progress the Network has made to date. It identifies several deficiencies that need to be addressed if the Network is to have a significant and lasting impact on pan-European support for talent development and gifted education.

During April I produced PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance.

This analyses the performance of high achievers from a selection of 11 jurisdictions – either world leaders or prominent English-speaking nations – on the PISA 2012 Creative Problem Solving assessment.

It is a companion piece to a 2013 post which undertook a similar analysis of the PISA 2012 assessments in Reading, Maths and Science.

In May I contributed to the Hoagies’ Bloghop for that month.

Air on the ‘G’ String: Hoagies’ Bloghop, May 2014 was my input to discussion about the efficacy of ‘the G word’ (gifted). I deliberately produced a provocative and thought-provoking piece which stirred typically intense reactions in several quarters.

Finally, September saw the production of Beware the ‘short head’: PISA’s Resilient Students’ Measure.

This takes a closer look at the relatively little-known PISA ‘resilient students’ measure – focused on high achievers from disadvantaged socio-economic backgrounds – and how well different jurisdictions perform against it.

The title reflects the post’s conclusion that, like many other countries, England:

‘…should be worrying as much about our ‘short head’ as our ‘long tail’’.

And so I pass seamlessly on to the series of domestic posts I published during 2014…

.

English Education Policy

My substantive post in January was High Attainment in the 2013 Secondary and 16-18 Performance Tables, an analysis of the data contained in last year’s Tables and the related statistical publications.

Also in January I produced a much briefer commentary on The 2013 Transition Matrices and High Attainers’ Performance.

The purpose of these annual posts (and the primary equivalent which appears each December) is to synthesise data about the performance of high attainers and high attainment at national level, so that schools can more easily benchmark their own performance.

In February I wrote What Becomes of Schools that Fail their High Attainers?*

It examines the subsequent history of schools that recorded particularly poor results with high attainers in the Secondary Performance Tables. (The asterisk references a footnote apologising ‘for this rather tabloid title’.)

By March I was focused on Challenging NAHT’s Commission on Assessment, subjecting the Commission’s Report to a suitably forensic examination and offering a parallel series of recommendations derived from it.

My April Fool’s joke this year was Plans for a National Centre for Education Research into Free Schools (CERFS). This has not materialised but, had our previous Secretary of State for Education not been reshuffled, I’m sure it would have been only a matter of time!

Also in April I was Unpacking the Primary Assessment and Accountability Reforms, exposing some of the issues and uncertainties embodied in the government’s response to consultation on its proposals.

Some of the issues I highlighted eight months ago are now being more widely discussed – not least the nature of the performance descriptors, as set out in the recent consultation exercise dedicated to those.

But the reform process is slow. Many other issues remain unresolved and it seems increasingly likely that some of the more problematic will be delayed deliberately until after the General Election.

May was particularly productive, witnessing four posts, three of them substantial:

  • How well is Ofsted reporting on the most able? explores how Ofsted inspectors are interpreting the references to the attainment and progress of the most able added to the Inspection Handbook late last year. The sample comprises the 87 secondary inspection reports that were published in March 2014. My overall assessment? Requires Improvement.

.

.

  • A Closer Look at Level 6 is a ‘data-driven analysis of Level 6 performance’. As well as providing a baseline against which to assess future Level 6 achievement, this also identifies several gaps in the published data and raises as yet unanswered questions about the nature of the new tests to be introduced from 2016.
  • One For The Echo Chamber was prompted by The Echo Chamber reblogging service – whose founder objected that my posts are too long – and by the ensuing Twitter debate. Throughout the year the vast majority of my posts have been unapologetically detailed and thorough. They are intended as reference material, to be quarried and revisited, rather than the disposable vignettes that so many seem to prefer. To this day they get reblogged on The Echo Chamber only when a sympathetic moderator is undertaking the task.
  • ‘Poor but Bright’ v ‘Poor but Dim’ arose from another debate on Twitter, sparked by a blog post which argued that the latter are a higher educational priority than the former. I argued that both deserved equal priority, since it is inequitable to discriminate between disadvantaged learners on the basis of prior attainment and the economic arguments cut both ways. This issue continues to bubble like a subterranean stream, only to resurface from time to time, most recently when the Fair Education Alliance proposed that the value of pupil premium allocations attached to disadvantaged high attainers should be halved.

In June I asked Why Can’t We Have National Consensus on Educating High Attainers? and proposed a set of core principles that might form the basis for such consensus.

These were positively received. Unfortunately though, the necessary debate has not yet taken place.

.

.

The principles should be valuable to schools considering how best to respond to Ofsted’s increased scrutiny of their provision for the most able. Any institution considering how best to revitalise its provision might discuss how the principles should be interpreted to suit their particular needs and circumstances.

July saw the publication of Digging Beneath the Destination Measures, which explored the higher education destinations statistics published the previous month.

It highlighted the relatively limited progress made towards improving the progression of young people from disadvantaged backgrounds to selective universities.

There were no posts in August, half of which was spent in Norway, taking the photographs that have graced some of my subsequent publications.

In September I produced What Happened to the Level 6 Reading Results? – an investigation into the mysterious collapse of L6 reading test results in 2014.

Test entries increased significantly. So did the success rates on the other level 6 tests (in maths and in grammar, punctuation and spelling (GPS)).  Even teacher assessment of L6 reading showed a marked upward trend.

Despite all this, the number of pupils successful on the L6 reading test fell from 2,062 in 2013 to 851 (provisional). The final statistics – released only this month – show a marginal improvement to 935, but the outcome is still extremely disappointing. No convincing explanation has been offered and the impact on 2015 entries is unlikely to be positive.

That same month I published Closing England’s Excellence Gaps: Part One and Part Two.

These present the evidence base relating to high attainment gaps between disadvantaged and other learners, to distinguish what we know from what remains unclear and so to provide a baseline for further research.

The key finding is that the evidence base is both sketchy and fragmented. We should understand much more than we do about the size and incidence of excellence gaps. We should be strengthening the evidence base as part of a determined strategy to close the gaps.

.

.

In October, 16-19 Maths Free Schools Revisited marked a third visit to the 16-19 maths free schools programme, concentrating on progress since my previous post in March 2013, especially at the two schools which have opened to date.

I subsequently revised the post to reflect an extended series of tweeted comments from Dominic Cummings, who was a prime mover behind the programme. The second version is called 16-19 Maths Free Schools Revisited: Odyssean Edition.

The two small institutions at KCL and Exeter University (both very similar to each other) constitute a rather limited outcome for a project that was intended to generate a dozen innovative university-sponsored establishments. There is reportedly a third school in the pipeline but, as 2014 closes, details have yet to be announced.

Excellence Gaps Quality Standard: Version One is an initial draft of a standard encapsulating effective whole school practice in supporting disadvantaged high attainers. It updates and adapts the former IQS for gifted and talented education.

This first iteration needs to be trialled thoroughly, developed and refined but, even as it stands, it offers another useful starting point for schools reviewing the effectiveness of their own provision.

The baseline standard captures the essential ‘non-negotiables’ intended to be applicable to all settings. The exemplary standard is pitched high and should challenge even the most accomplished of schools and colleges.

All comments and drafting suggestions are welcome.

.

.

In November I published twin studies of The Politics of Setting and The Politics of Selection: Grammar Schools and Disadvantage.

These issues have become linked since Prime Minister Cameron has regularly proposed an extension of the former as a response to calls on the right wing of his party for an extension of the latter.

This was almost certainly the source of autumn media rumours that a strategy, originating in Downing Street, would be launched to incentivise and extend setting.

Newly installed Secretary of State Morgan presumably insisted that existing government policy (which leaves these matters entirely to schools) should remain undisturbed. However, the idea might conceivably be resuscitated for the Tory election manifesto.

Now that UKIP has confirmed its own pro-selection policy there is pressure on the Conservative party to resolve its internal tensions on the issue and identify a viable alternative position. But the pro-grammar lobby is unlikely to accept increased setting as a consolation prize…

.

.

Earlier in December I added a companion piece to ‘The Politics of Selection’.

How Well Do Grammar Schools Perform With Disadvantaged Students? reveals that the remaining 163 grammar schools have very different records in this respect. The poor performance of a handful is a cause for concern.

I also published High Attainment in the 2014 Primary School Performance Tables – another exercise in benchmarking, this time for primary schools interested in how well they support high attainers and high attainment.

This shows that HMCI’s recent distinction between positive support for the most able in the primary sector and a much weaker record in secondary schools is not entirely accurate. There are conspicuous weaknesses in the primary sector too.

Meanwhile, Chinese learners continue to perform extraordinarily well on the Level 6 maths test, achieving an amazing 35% success rate, up six percentage points since 2013. This domestic equivalent of the Shanghai phenomenon bears closer investigation.

My penultimate post of the year, HMCI Ups the Ante on the Most Able, collates all the references to the most able in HMCI’s 2014 Annual Report and its supporting documentation.

It sets out Ofsted’s plans for the increased scrutiny of schools and for additional survey reports that reflect this scrutiny.

It asks whether Ofsted’s renewed emphasis will be sufficient to rectify the shortcomings they themselves identify and – assuming it will not – outlines an additional ten-step plan to secure system-wide improvement.

Conclusion

So what are the prospects for 2015 and beyond?

My 2013 Retrospective was decidedly negative about the future of global gifted education:

‘The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.’

Despite evidence of a few ‘green shoots’ during 2014, my overall sense of pessimism remains.

Meanwhile, future prospects for high attainers in England hang in the balance.

Several of the Coalition Government’s education reforms have been designed to shift schools’ focus away from borderline learners, so that every learner improves, including those at the top of the attainment distribution.

On the other hand, Ofsted’s judgement that a third of secondary inspections this year

‘…pinpointed specific problems with teaching the most able’

would suggest that schools’ everyday practice falls some way short of this ideal.

HMCI’s commitment to champion the interests of the most able is decidedly positive but, as suggested above, it might not be enough to secure the necessary system-wide improvement.

Ofsted is itself under pressure and faces an uncertain future, regardless of the election outcome. HMCI’s championing might not survive the arrival of a successor.

It seems increasingly unlikely that any political party’s election manifesto will have anything significant to say about this topic, unless the enthusiasm for selection in some quarters can be harnessed and redirected towards the much more pertinent question of how best to meet the needs of all high attainers in all schools and colleges, especially those from disadvantaged backgrounds.

But the entire political future is shrouded in uncertainty. Let’s wait and see how things are shaping up on the other side of the election.

From a personal perspective I am closing in on five continuous years of edutweeting and edublogging.

I once expected to extract from this commitment benefits commensurate with the time and energy invested. But that is no longer the case, if indeed it ever was.

I plan to call time at the end of this academic year.

 .

GP

December 2014