Will Maths Hubs Work?


This post takes a closer look at Maths Hubs, exploring the nature of the model, their early history and their performance to date.

It reflects on their potential contribution to the education of the ‘mathematically most able’ and considers whether a similar model might support ‘most able education’.





Origins of this post

The post was prompted by the potential connection between two separate stimuli:

‘We aim to make Britain the best place in the world to study maths, science and engineering, measured by improved performance in the PISA league tables….We will make sure that all students are pushed to achieve their potential and create more opportunities to stretch the most able.’

  • My own recent post on Missing Talent (June 2015) which discussed the Sutton Trust/education datalab recommendation that:

‘Schools where highly able pupils currently underperform should be supported through the designation of another local exemplar school

Exemplar schools…should be invited to consider whether they are able to deliver a programme of extra-curricular support to raise horizons and aspirations for children living in the wider area.’

The second led to a brief Twitter discussion about parallels with an earlier initiative during which Maths Hubs were mentioned.



Links to previous posts

I touched on Maths Hubs once before, in the final section of 16-19 Maths Free Schools Revisited (October 2014), which dealt with ‘prospects for the national maths talent pipeline’.

This reviewed the panoply of bodies involved in maths education at national level and the potential advantages of investing in a network with genuinely national reach, rather than in a handful of new institutions with small localised intakes and limited capacity for outreach:

‘Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

A network-driven approach to talent development might just work…but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while a further £60m may have to be surrendered back to the Treasury.’

Two further posts are less directly relevant but ought to be mentioned in passing:

The second in particular raises questions about the suitability of NCETM’s version of mastery for our high attaining learners, arguing that essential groundwork has been neglected and that the present approach to ‘stretch and challenge’ is unnecessarily narrow and restrictive.


Structure of this post

The remainder of this post is divided into three principal sections:

  • Material about the introduction of Maths Hubs and a detailed exploration of the model. This takes up approximately half of the post.
  • A review of the Hubs’ work programme and the progress they have made during their first year of operation.
  • Proposals for Maths Hubs to take the lead in improving the education of mathematically able learners and for the potential introduction of ‘most able hubs’ to support high attainers more generally. I stop short of potential reform of the entire ‘national maths talent pipeline’ since that is beyond the scope of this post.

Since readers may not be equally interested in all these sections I have supplied the customary page jumps from each of the bullet points above and to the Conclusion, for those who prefer to cut to the chase.


The introduction of the Maths Hubs model


Initial vision

Maths Hubs were first announced in a DfE press release published in December 2013.

The opening paragraph describes the core purpose as improving teacher quality:

‘Education Minister Elizabeth Truss today announced £11 million for new maths hubs to drive up the quality of maths teachers – as international test results showed England’s performance had stagnated.’

The press release explains the Coalition Government’s plans to introduce a national network of some 30 ‘mathematics education strategic hubs’ (MESH) each led by a teaching school.

A variety of local strategic partners will be drawn into each hub, including teaching school alliances, other ‘school and college groupings’, university faculties, subject associations, ‘appropriate’ local employers and local representatives of national maths initiatives.

There is an expectation that all phases of education will be engaged, particularly ‘early years to post-16’.

National co-ordination will fall to the National Centre for Excellence in the Teaching of Mathematics (NCETM), currently run under contract to DfE by a consortium comprising Tribal Education, the UCL Institute of Education, Mathematics in Education and Industry (MEI) and Myscience.

(A 2014 PQ reply gives the value of this contract as £6.827m, although this probably reflects a 3-year award made in 2012. It must have been extended by a further year, but will almost certainly have to be retendered for the next spending review period, beginning in April 2016.

The £11m budget for Maths Hubs is separate and additional. It is not clear whether part of this sum has also been awarded to NCETM through a single tender. There is more information about funding mid-way through this post.)

The press release describes the Hubs as both a national and a school-led model:

‘The network will bring together the emerging national leaders of mathematics education and aim to make school-led national subject improvement a reality.’

These emerging national leaders are assumed to be located in the lead schools and not elsewhere in the system, at NCETM or in other national organisations.

The policy design is broadly consistent with my personal preference for a ‘managed market’ approach, midway between a ‘bottom-up’ market-driven solution and a centralised and prescriptive ‘top-down’ model

But it embodies a fundamental tension, arising from the need to reconcile the Government’s national priorities with a parallel local agenda.

In order to work smoothly, one set of priorities will almost certainly take precedence over the other (and it won’t be the local agenda).

The model is also expected to:

‘…ensure that all the support provided…is grounded in evidence about what works, both in terms of mathematics teaching and the development of teachers of mathematics.’

Each Hub will be expected to provide support for maths education across all other schools in the area, taking in the full spectrum of provision:

  • recruitment of maths specialists into teaching
  • initial training of maths teachers and converting existing teachers into maths [sic]
  • co-ordinating and delivering a wide range of maths continuing professional development (CPD) and school-to-school support
  • ensuring maths leadership is developed, eg running a programme for aspiring heads of maths departments
  • helping maths enrichment programmes to reach a large number of pupils from primary school onwards’.

This is a particularly tall order, both in terms of the breadth of Hubs’ responsibilities and the sheer number of institutions which they are expected to support. It is over-ambitious given the budget allocated for the purpose and, as we shall see, was scaled back in later material.

The press release says that NCETM has already tested the model with five pathfinders.

It adds:

‘The main programme will be robustly evaluated, and if it proves successful in raising the standards of mathematics teaching it may be continued in 2016 to 2017, contingent on future spending review outcomes.’

What constitutes ‘the main programme’ is unclear, though it presumably includes the Hubs’ contribution to national projects, if not their local priorities.

Note that continuation from 2016 onwards is conditional on the outcomes of this evaluation, specifically a directly attributable and measurable improvement in maths teaching standards.

I have been unable to trace a contract for the evaluation, which would suggest that one has not been commissioned. This is rather a serious oversight.

We do not know how NCETM is monitoring the performance of the Hubs, nor do we know what evidence will inform a decision about whether to continue with the programme as a whole.

We have only the most basic details of national programmes in AY2015/16 and no information at all about the Hubs’ longer term prospects.

I asked the Maths Hubs Twitter feed about evaluation and was eventually referred to NCETM’s Comms Director.

I have not made contact because:

  • It is a point of principle that these posts rely exclusively on material already available online and so in the public domain. (This reflects a personal commitment to transparency in educational policy.)
  • The Comms Director wouldn’t have to be involved unless NCETM felt that the information was sensitive and had to be ‘managed’ in some way – and that tells me all I need to know.
  • I am not disposed to pursue NCETM for clarification since they have shown zero interest in engaging with me over previous posts, even though I have expressly invited their views.


Selection of the Hubs

Three months later, in March 2014, further details were published as part of the process of selecting the Hubs.

The document has two stabs at describing the aims of the project. The first emphasises local delivery:

‘The aim is to enable every school and college in England, from early years to the post-16 sector, to access locally-tailored and quality support in all areas of maths teaching and learning.’

This continues to imply full national reach, although one might argue that ‘enabling access’ is achieved by providing a Hub within reasonable distance of each institution and does not demand the active engagement of every school and college.

The second strives to balance national priorities and local delivery:

‘The aim of the national network of Maths Hubs will be to ensure that all schools have access to excellent maths support that is relevant to their specific needs and that is designed and managed locally. They will also be responsible for the coordinated implementation of national projects to stimulate improvement and innovation in maths education.’

Note that these national priorities have now become associated with innovation as well as improvement. This is ‘top-down’ rather than ‘school-led’ innovation – there is no specific push for innovative local projects.

At this stage the Hubs’ initial (national) priorities are given as:

  • Leading the Shanghai Teacher Exchange Programme
  • Supporting implementation of the new maths national curriculum from September 2014 and
  • Supporting introduction of new maths GCSEs and Core Maths qualifications in 2015.

The guidance specifies that:

‘Each Maths Hub will operate at a sub-regional or city regional level. The hubs will work with any group of schools or colleges in the area that request support, or who are referred to the hub for support.’

So responsibility for seeking assistance is placed on other schools and colleges and on third parties (perhaps Ofsted or Regional School Commissioners?) making referrals – Hubs will not be expected to reach out proactively to every institution in their catchment.

The competition is no longer confined to teaching schools. Any school that meets the initial eligibility criteria may submit an expression of interest. But the text is clear that only schools need apply – colleges are seemingly ineligible.

Moreover, schools must be state-funded and rated Outstanding by Ofsted for Overall Effectiveness, Pupil Achievement, Quality of Teaching and Leadership and Management.

Teaching schools are not expected to submit Ofsted inspection evidence – their designation is sufficient.

The guidance says:

‘We may choose to prioritise expression of interest applications based on school performance, geographical spread and innovative practice in maths education.’

NCETM reported subsequently that over 270 expressions of interest were received and about 80 schools were invited to submit full proposals.

The evidence used to select between these is set out in the guidance. There are four main components:

  • Attainment and progress data (primary or secondary, and post-16 where relevant), including attainment data (but not progress data) for FSM pupils (as opposed to ‘ever 6 FSM’).
  • Support for improvement and professional development
  • Leadership quality and commitment
  • Record and capacity for partnership and collaboration

The full text is reproduced below.


[Criteria from the application guidance: Captures 1-4.]

It is instructive to compare the original version with the assessment criteria set out for the limited Autumn 2015 competition (see below).

In the updated version applicants can be either colleges or schools. Applicants will be invited to presentation days during which their commitment to mastery will be tested:

‘Applicants will be asked to set out…How they will support the development of mastery approaches to teaching mathematics, learning particularly from practice in Shanghai and Singapore.’

The Maths Hub model may be locally-driven but only institutions that support the preferred approach need apply.

The criteria cover broadly the same areas but they have been beefed up significantly.

The original version indicated that full proposals would require evidence of ‘school business expertise’ and ‘informed innovation in maths education’, but these expectations are now spelled out in the criteria.

Applicants must:

‘Provide evidence of a strong track record of taking accountability for funding and contracting other schools/organisations to deliver projects, including value for money, appropriate use of public funding, and impact.’

They must also:

‘Provide two or three examples of how you have led evidence-informed innovation in maths teaching. Include details of evaluation outcomes.

Provide information about the key strategies you would expect the hub to employ to support effective innovation.

Provide evidence of how you propose to test and implement the teaching maths for mastery approach within the hub. Show how effective approaches will be embedded across all school phases.’

Note that this innovative capacity is linked explicitly with the roll-out of mastery, a national priority.

The new guide explains that action plans prepared by the successful applicants will be ‘agreed by the NCETM and submitted to the DfE for approval’. This two-stage process might suggest that NCETM’s decision-making is not fully trusted. Alternatively, it might have something to do with the funding flows.

No further information was released about issues arising during the original selection process. It seems probable that some parts of the country submitted several strong bids while others generated relatively few or none at all.

It will have been necessary to balance the comparative strength of bids against their geographical distribution, and probably to ‘adjust’ the territories of Hubs where two or more particularly strong bids were received from schools in relatively close proximity.

It is not clear whether the NCETM’s five pathfinders were automatically included.

Successful bidders were confirmed in early June 2014, so the competition took approximately three months to complete.

One contemporary TSA source says that Hubs were ‘introduced at a frantic pace’. A 2-day introductory conference took place in Manchester on 18-19 June, prior to the formal launch in London in July.

Hubs had to submit their action plans for approval by the end of the summer term and to establish links with key partners in readiness to become operational ‘from the autumn term 2014’. (The TSA source says ‘in September’).


The Hubs are announced

A further DfE press release issued on 1 July 2014 identified 32 Hubs. Two more were added during the autumn term, bringing the total to 34, although the FAQs on the Maths Hubs website still say that there were only 32 ‘in the first wave’.

This implies that a second ‘wave’ is (or was) anticipated.

An earlier NCETM presentation indicated that 35 hubs were planned but it took a full year for the final vacancy to be advertised.

As noted above, in July 2015, an application form and guide were issued ‘for schools and colleges that want to lead a maths hub in south-east London and Cumbria or north Lancashire.’

The guide explains:

‘There are currently 34 Maths Hubs across England with funding available for a 35th Maths Hub in the North West of England. There is a geographical gap in Cumbria and North Lancashire where previously we were unsuccessful in identifying a suitable school or college to lead a Maths Hub in this area. In addition, after establishing the Maths Hub in first year, the lead school for the London South-East Maths Hub has decided to step down from its role.’

As far as I can establish this is the first time that the original failure to recruit the final Hub in the North-West has been mentioned publicly.

No reason is given for the decision by another lead school to drop out. The school in question is Woolwich Polytechnic School.

The two new Hubs are expected to be operational by November 2015. Applications will be judged by an unidentified panel.

Had the first tranche of Hubs proved extremely successful, one assumes that the second wave would have been introduced in readiness for academic year 2015/16. Perhaps it must instead await the outcome of the forthcoming spending review, enabling the second wave to be introduced from September 2016.

The embedded spreadsheet below gives details of all 34 Hubs currently operating.



Most lead institutions are schools, the majority of them secondary academies. A couple of grammar schools are involved as well as several church schools. Catholic institutions are particularly well represented.

Two of the London Hubs are led by singleton primary schools and a third by two primary schools working together. Elsewhere one Hub is based in a 14-19 tertiary college and another is led jointly by a 16-19 free school.

Some are hosted by various forms of school partnership. These include notable multi-academy trusts including the Harris Federation, Outwood Grange Academies Trust and Cabot Learning Federation.

The difference in capacity between a single primary school and a large MAT is enormous, but the expectations of each are identical, as are the resources made available to implement the work programme. One would expect some correlation between capacity and quality, with smaller institutions struggling to match their larger peers.

No doubt the MATs take care to ensure that all their schools are direct beneficiaries of their Hubs – and the initiative gives them an opportunity to exert influence beyond their own members, potentially even to scout possible additions to the fold.

Fewer than half of the lead schools satisfy the initial eligibility requirements for ‘outstanding’ inspection reports (and sub-grades). In most cases this is because they are academies and have not yet been inspected in that guise.

One lead school – Bishop Challoner Catholic College – received ‘Good’ ratings from its most recent inspection in 2012. Another – Sir Isaac Newton Sixth Form – has been rated ‘Good’ since becoming a lead school.

We do not know why these institutions were included in the original shortlist but, perhaps fortunately, there was no public backlash from better qualified competitors upset at being overlooked.

This map (taken from a presentation available online) shows the geographical distribution of the original 32 Hubs. It is a more accurate representation than the regional map on the Maths Hub website.

Even with the addition of the two latecomers in November 2014 – one in Kent/Medway, the other in Leicestershire – it is evident that some parts of the country are much better served than others.

There is an obvious gap along the East Coast, stretching from the Wash up to Teesside, and another in the far North-West that the new competition is belatedly intended to fill. The huge South-West area is also relatively poorly served.


[Maths Hubs locations map.]

If the Hubs were evenly distributed to reflect the incidence of schools and colleges nationally, each would serve a constituency of about 100 state-funded secondary schools and 500 state-funded primary schools, so 600 primary and secondary schools in total, not to mention 10 or so post-16 institutions.

Although there is little evidence on which to base a judgement, it seems unlikely that any of the Hubs will have achieved anything approaching this kind of reach within their first year of operation. One wonders whether it is feasible even in the longer term.

But the relatively uneven geographical distribution of the Hubs suggests that the size of their constituencies will vary.

Since schools and colleges are expected to approach their Hubs – and are free to align with any Hub – the level of demand will also vary.

It would be helpful to see some basic statistics comparing the size and reach of different Hubs, setting out how many institutions they have already engaged actively in their work programmes and what proportion are not yet engaged.

It seems likely that several more hubs will be needed to achieve truly national reach. It might be more feasible with a ratio of 300 schools per hub, but that would require twice as many hubs. The limited supply of high quality candidates may act as an additional brake on expansion, on top of the availability of funding.


Hub structure

A presentation given on 27 June 2014 by John Westwell – NCETM’s ‘Director of Strategy Maths Hubs’ – explains Hub structure through this diagram


[NCETM Maths Hubs model diagram.]

There is a distinction – though perhaps not very clearly expressed – between the roles of:

  • Strategic partners supporting the lead school with strategic leadership and 
  • Operational partners providing ‘further local leadership and specialist expertise to support [the] whole area’.

It seems that the former are directly involved in planning and evaluating the work programme while the latter are restricted to supporting delivery.

The spreadsheet shows that one of the Hubs – Salop and Herefordshire – fails to mention any strategic partners while another – Jurassic – refers to most of its partners in general terms (eg ‘primary schools, secondary schools’).

The remainder identify between four and 16 strategic partners each. Great North and Bucks, Berks and Oxon are at the lower end of the spectrum. Archimedes NE and Matrix Essex and Herts are at the upper end.

One assumes that it can be a disadvantage either to have too few or too many strategic partners, the former generating too little capacity; the latter too many cooks.

All but five Hubs have at least one higher education partner but of course there is no information about the level and intensity of their involvement, which is likely to vary considerably.

Eighteen mention the FMSP (Further Mathematics Support Programme), but only five include the CMSP (Core Maths Support Programme). Six list MEI as a strategic partner and, curiously, three nominate NCETM itself. It is unclear whether these three enjoy a different relationship with the national co-ordinating body as a consequence.

To date, only the London Central and West Hub is allied with Mathematics Mastery, the Ark-sponsored programme.

However, NCETM says:

‘…a growing number of schools around the country are following teaching programmes from Mathematics Mastery an organisation (separate from the NCETM) whose work, as the name suggests, is wholly devoted to this style of learning and teaching. Mathematics Mastery is, in some geographical areas, developing partnership working arrangements with the Maths Hubs programme.’

Mathematics Mastery also describes itself as ‘a national partner of Maths Hubs’.


Work Groups

Hubs plan on the basis of a standard unit of delivery described as a ‘work group’.

Each work group is characterised by:

  • a clear rationale for its existence and activity
  • well defined intended outcomes
  • local leadership supported by expert partners
  • a mixture of different activities over time
  • value for money and
  • systematic evidence collection.

The process is supported by something called the NCETM ‘Work Group Quality Framework’, which I have been unable to trace. This, too, should be published.

The most recent description of the Hubs’ role is provided by the Maths Hubs website, which did not appear until November 2014.

The description of ‘What Maths Hubs Are Doing’ reinforces the distinction between:

  • ‘National Collaborative Projects, where all hubs work in a common way to address a programme priority area and
  • Local projects, where hubs work independently on locally tailored projects to address the programme priorities.’

The earlier material includes a third variant:

  • Local priorities funded by other means

But these are not mentioned on the website and it is not clear whether they count as part of the Hubs’ official activity programme.

The spreadsheet shows that the number of work groups operated by each Hub varies considerably.

Four of them – North West One, White Rose, South Yorkshire and London South East – fail to identify any work groups at all.

In the case of White Rose there are links to courses and a conference, but the others include only a generic description of their work programme.

Two further Hubs – Enigma and Cambridge – refer readers to their websites, neither of which contains substantive detail about the Work Groups they have established (though Enigma lists a range of maths CPD opportunities and courses).

Otherwise the number of work groups varies between two (East Midlands South) and 11 (Surrey Plus). Fifteen of the Hubs have six or fewer work groups while nine have eight or more.

This suggests that some Hubs are far more productive and efficient than others, although the number of work groups is not always a reliable indicator, since some Hubs appear to categorise one-off events as work groups, while others use it to describe only longer term projects.

Maybe the Quality Framework needs attention, or perhaps some Hubs are not following it properly.


The network defined

To coincide with the launch NCETM published its own information page on Maths Hubs, now available only via archive.

This describes in more detail how the Hubs will be expected to function as a network:

‘…the Maths Hubs will also work together in a national network co-ordinated by the NCETM. The network will ensure that effective practice from within particular hubs is shared widely. It will also provide a setting for Maths Hubs and the NCETM to collaboratively develop new forms of support as needed.

The national network will also come together, once a term, in a regular Maths Hubs Forum, where there will be opportunity to evaluate progress, plan for the future, and to engage with other national voices in maths education, such as the Joint Mathematical Council, the Advisory Committee on Mathematics Education (ACME), the DfE, and Ofsted.’

As shown in the diagram below:


[NCETM national network diagram.]

Whether this is genuinely ‘school-led system-wide improvement’ is open to question, relying as it does on central co-ordination and a funding stream provided by central government. It is more accurately a hybrid model that aims to pursue national and local priorities simultaneously.

Essentially Hubs have a tripartite responsibility:

  • To develop and co-ordinate practice within their own Hub.
  • To collaborate effectively with other Hubs.
  • Collectively to contribute to the national leadership of maths education.

The sheer complexity of this role – and the level of expectation placed on the Hubs – should not be underestimated.

The archived NCETM page identifies three core tasks for the Hubs as they operate locally:

  • ‘Identify needs and agree priorities for support in their area. This could involve pro-active surveying of schools; responding to requests and referrals; and considering the implications of national evidence.
  • Co-ordinate a range of high quality specialist mathematics support to address the needs. This could include communicating existing support and extending its reach; commissioning external organisations to provide bespoke support; developing and enabling new forms of support and collaboration.
  • Critically evaluate the quality and impact of the support provided. This could include gathering immediate, medium-term and long-term feedback from participants engaging with support; and more detailed evaluative research used to test innovations.’

We have no information about the extent and quality of cross-fertilisation between Hubs. This seems to depend mainly on the termly attendance of the leads at the Forum meetings, supported through social media interaction via Twitter. There is also some evidence of regional collaboration, though this seems much better developed in some regions than others.

The July 2015 newsletter on the Maths Hub Website says:

‘An added feature of the second year of the Maths Hubs programme will be more collaboration between Maths Hubs, typically bringing a small group of hubs together to pool experience, maybe in the development of a new project, or in the wider implementation of something that’s already worked well in a single hub.’

This may suggest that the collaborative dimension has been rather underplayed during the first year of operation. If it is to be expanded it may well demand additional teacher time and funding.

In the Westwell presentation the model is described as a ‘fully meshed network’ (as opposed to a hub and spoke model) in which ‘all the nodes are hubs’.

Unusually – and in contrast to the DfE press releases – there is explicit recognition that the Hubs’ core purpose is to improve pupil outcomes:

‘Resolute focus on pupils’ maths outcomes:

  • improved levels of achievement
  • increased levels of participation
  • improved attitudes to learning
  • closing the gaps between groups’

They also support school/college improvement:

‘Determined support for all schools/colleges to improve:

  • the teaching of mathematics
  • the leadership of mathematics
  • the school’s mathematics curriculum’

Any evaluation would need to assess the impact of each Hub against each of these seven measures. Once again, the level of expectation is self-evident.


Termly Forums and Hub leads

Very little information is made available about the proceedings of the termly Maths Hub Forum, where the 34 Hub leads convene with national partners.

The Maths Hubs website says:

‘At the national level, the Maths Hubs programme, led by the NCETM, is developing partnership working arrangements with organisations that can support across the Maths Hubs network. At the moment, these include:

Other partnership arrangements will be developed in due course.’

There is no further information about these national partnership agreements, especially the benefits accruing to each partner as a consequence.

We know that one Forum took place in October 2014, another in February 2015. We do not know the full list of national partners on the invitation list.

There should be another Forum before the end of summer term 2015, unless the London Maths Hub Conference was intended to serve as a replacement.

The guide to the competition for two new Hubs mentions that the Autumn 2015 Forum will take place in York on 4/5 November.

The July Bespoke newsletter says:

‘…the 34 Maths Hub Leads, who meet termly, will continue to pool their thoughts and experiences, developing a growing and influential voice for mathematics education at a national level.’ 

It is hard to understand how the Forum can become ‘an influential voice’ without a significantly higher profile and much greater transparency over proceedings.

The Maths Hubs website should have a discrete section for the termly forums which contains all key documents and presentations.

In March 2015, NCETM’s Westwell published a post on the NCTL Blog claiming early signs of success for the Hubs:

‘Even though we are less than 2 terms into embedding a new, collaborative way of working, we are seeing encouraging signs that leadership in mathematics education can be shared and spread within geographical areas.’

He continues:

‘Our vision is of a national, collective group of leaders exerting new, subject-specific influence across school phases and across geographical boundaries.

The essential professional characteristics of this group are that they know, from first-hand experience:

  • how maths is best taught, and learnt
  • how good maths teachers are nurtured
  • how high-quality ongoing professional development can help good teachers become excellent ones

They have shown the capacity to lead others in all of these areas.’

And he adds:

‘The maths hub leads also come together in a regular national forum, which allows them to exchange practice but also provides a platform for them to enter into dialogue with policy makers and key national bodies. Over time, we expect that maths hub leads will come to be recognised nationally as leaders of mathematics education.’

This highlights the critical importance of the Maths Hub leads to the success of the model. One assumes that the post-holders are typically serving maths teachers who undertake this role alongside their classroom and middle management responsibilities.

It seems highly likely that most Hub leads will not remain in post for more than two or three years. All will be developing highly transferable skills. Many will rightly see the role as a stepping stone to senior leadership roles.

Unless it can offer strong incentives for Hub leads to remain in post, NCETM will find turnover a persistent problem.



Funding

There is no information about funding on the Maths Hubs Website and details are extremely hard to find, apart from the total budget of £11m, which covers the cost of Hubs up to the end of FY2015-16.

Each Hub receives core operational funding as well as ‘funding on a project basis for local and national initiatives’.

I found an example of an action plan online. The notes provide some details of the annual budget for last financial year:

‘For the financial year 2014/15, each hub will receive £36,000 to cover the structural costs of the hub including the cost of: the Maths Lead time (expected minimum 1 day/week) and Hub Administrator time (expected minimum 1.5 days/week); the time provided by the Senior Lead Support and the strategic leadership group; identifying and developing operational partner capacity; engaging schools/colleges and identifying their support needs. It is possible to transfer some of the £36,000 to support hub initiated activities.

For the financial year 2014/15, Maths Hubs will receive £40,000 to support hub-initiated activity. As explained at the forum we are using the term “Work Groups” to cover all hub-initiated activity…The cost of the exchange element with the Shanghai teachers will be paid from central national project funds and is outside of the £40,000 budget.’

Another source (a presentation given at the launch of the Norfolk and Suffolk Hub) suggests that in 2014-15 Hubs also received a further £20,000 for national projects.

Hence the maximum budget per Hub in FY2014/15 was £96,000. Assuming all 34 received that sum the total cost was £3.264m (34 x £96K).

We do not know how much more was set aside for central costs, although DfE’s Supplementary Estimates for 2014-15 hint that the total budget might have been £3.7m, which would suggest a balance of £0.436m was spent on central administration.

The NCETM website presently lists a Director and no fewer than six Assistant Directors responsible for Maths Hubs, a ratio of roughly one director for every five of the 34 Hubs. On the face of it, this does not fit the image of a school-led network. Indeed it suggests that the Hubs require intensive central support.

I could find nothing at all about the size of the budget for 2015-16. The Norfolk and Suffolk launch presentation indicates that Hubs will enjoy additional funding for both running costs and projects but does not quantify this statement. Another source suggests that the time allocation for Hub leads will be increased to 0.5FTE.

There is no information about funding levels in the guide to the autumn 2015 competition, although it suggests that the money will come in two separate streams:

‘Each Maths Hub will receive direct funding for structural operational purposes and funding on a project basis for local and national projects.’

It may be that the operational funding is paid via NCTL and the project funding via NCETM.

One assumes that operational funding will need to be uprated by at least 33% for 2015-16 since it will cover a full financial year rather than July to March inclusive (9 months only).

If the funding for local and national projects is increased by the same amount, that would bring the sum per Hub in FY2015-16 to approximately £128,000 and the total budget to something like £5m.
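The arithmetic behind these estimates can be sketched as follows. The £20,000 national project figure and the 12/9 uprating factor are the assumptions discussed above, not confirmed figures:

```python
# Back-of-envelope reconstruction of the Hub budget estimates above.
# FY2014-15 per-hub funding streams (the £20K national project figure
# comes from the Norfolk and Suffolk launch presentation).
core = 36_000      # structural/operational costs
local = 40_000     # hub-initiated "Work Group" activity
national = 20_000  # national collaborative projects

per_hub_2014 = core + local + national   # £96,000
hubs = 34
total_2014 = hubs * per_hub_2014         # £3,264,000

# FY2014-15 covered only July to March (9 months), so a full financial
# year implies uprating by a factor of 12/9, i.e. roughly 33%.
per_hub_2015 = per_hub_2014 * 12 // 9    # £128,000
total_2015 = hubs * per_hub_2015         # £4,352,000, before central costs

print(f"FY2014-15: £{per_hub_2014:,} per hub, £{total_2014:,} total")
print(f"FY2015-16: £{per_hub_2015:,} per hub, £{total_2015:,} total")
```

On these assumptions the FY2015-16 total comes to about £4.35m, consistent with a budget of ‘something like £5m’ once central administration costs in the region of £0.4m-£0.5m are added.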

It would be helpful to have rather more transparency about Hub budgets and the total sum available to support them in each financial year.

If the NCETM operation needs retendering for FY2016-17 onwards, one assumes that national co-ordination of the Hubs will form part of the specification. One might expect to see a tender early next academic year.


Hubs’ Current Activity

Developing role 

The press release marking the launch was strongly focused on Hubs’ role in leading what was then called the Shanghai Teacher Exchange Programme:

‘A national network of maths hubs that will seek to match the standards achieved in top-performing east Asian countries – including Japan, Singapore and China – was launched today by Education Minister Elizabeth Truss…

These ‘pace-setters’ will implement the Asian-style mastery approach to maths which has achieved world-leading success….Hubs will develop this programme with academics from Shanghai Normal University and England’s National Centre for Excellence in the Teaching of Maths (NCETM)….

… The Shanghai Teacher Exchange programme will see up to 60 English-speaking maths teachers from China embedded in the 30 maths hubs, starting this autumn term.

The Chinese teachers will run master classes for local schools and provide subject-specific on-the-job teacher training.

Two leading English maths teachers from each of the 30 maths hubs will work in schools in China for at least a month, to learn their world-class teaching approaches. The teachers will then put into practice in England what they have learnt and spread this widely to their peers.’

It also mentioned that the Hubs would be supporting the Your Life campaign to inspire young people, especially girls, to study maths and physics.

‘The campaign, led by businesses, aims to increase the number of students taking maths and physics A level by 50% over the next 3 years.’


It added:

‘They will also work with new maths and physics chairs, PhD graduates being recruited to become teachers to take their expertise into the classroom and transform the way the maths and physics are taught.’

The Website describes three National Collaborative Projects in slightly different terms:

  • England-China is the new title for the Shanghai Teacher Exchange. Primary sector exchanges took place in 2014/15 and secondary exchanges are scheduled for 2015/16.

The aim of the project is described thus:

‘The aim, as far as the English schools are concerned, is to learn lessons from how maths is taught in Shanghai, with particular focus on the mastery approach, and then research and develop ways in which similar teaching approaches can be used in English classrooms

…The long-term aim of the project is for the participating English schools first to develop a secure mastery approach to maths teaching themselves, and then to spread it around partner schools.’

  • Textbooks and Professional Development involves two primary schools from each Maths Hub trialling adapted versions of Singapore textbooks with their Year 1 classes.

Each school has chosen one of two mastery-focused textbooks: ‘Inspire Maths’ and ‘Maths – No Problem’. Teachers have five days’ workshop support.

  • Post-16 Participation is intended to increase participation rates in A level maths and further maths courses as well as Core Maths and other Level 3 qualifications. Some hubs are particularly focused on girls’ participation.

The initial phase of the project involves identifying schools and colleges that are successful in this respect, itemising the successful strategies they have deployed and exploring how those might be implemented in schools and colleges that have been rather less successful.


Progress to date on National Collaborative Projects 

Coverage of the National Projects on the Hubs website is heavily biased towards the England-China project, telling us comparatively little about the other national priorities.

A group of 71 primary teachers visited Shanghai in September 2014. Return visits from 59 Shanghai teachers took place in two waves, in November 2014 and February/March 2015. 

A list of 47 participating schools is supplied, including the Hubs to which they belong.

There is also a Mid-Exchange Report published in November 2014, a press release from February 2015 marking the arrival of the second wave and the first edition of Bespoke, a Maths Hub newsletter dating from April 2015, which is exclusively focused on mastery.

The latter describes the exchanges as:

‘…the start of a long-term research project, across all of the Maths Hubs, to investigate ways in which mastery approaches can be introduced to maths lessons, to the way teachers design lessons, and to how schools organise time-tables, and the deployment of teachers and teaching assistants.’

These descriptions suggest something rather different to the slavish replication of Shanghai-style mastery, anticipating a ‘secure mastery approach’ that might nevertheless have some distinctive English features.

But NCETM has already set out in some detail the principles and key features of the model they would like to see introduced, so rather less is expected of the Hubs than one might anticipate. They are essentially a testbed and a mechanism for the roll-out of a national strategy.

The website also indicates that, before the end of summer term 2015:

‘…the NCETM, working through the Maths Hubs will publish support materials for assessment of the depth of pupils’ knowledge within the context of a mastery curriculum.’

NCETM describes the materials as a collaborative venture involving several partners:

‘Recording progress without levels requires recording evidence of depth of understanding of curriculum content, rather than merely showing pupils can ‘get the answers right’.

The NCETM, working with other maths experts and primary maths specialists from the Maths Hubs, is currently producing guidance on how to do this for the primary maths National Curriculum. For each curriculum statement, the guidance will show how to identify when a pupil has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’

This is not yet published and, if NCETM is sensible, it will wait to see the outcomes of the parallel Commission on Assessment Without Levels.

The Bespoke newsletter mentions in passing that further research is needed into the application of mastery teaching in mixed age classes, but no further details are forthcoming.

Information about the planned secondary exchange is also rather thin on the ground.

NCETM said in June that the programme would focus on teaching at the KS2/3 transition.

The second edition of Bespoke, published in July 2015 adds:

‘Primary schools that hosted Shanghai teachers in 2014/15 will continue to develop and embed teaching for mastery approaches, and, in addition, two teachers from secondary schools in each Maths Hub will visit Shanghai in September, with their counterparts returning to work in Key Stage 3 classrooms in November 2015.’

The same is true of the Textbooks project, which was announced in a ministerial speech given in November 2014. Very little detail has been added since.

The July edition of Bespoke says that the project:

‘…will be expanded, to take in more schools and more classes, including Year 2 pupils’

while another section offers only the briefest of commentaries on first-year progress, repeated twice:


[Screenshot: extract from Bespoke, July 2015]

Coverage of the Post-16 Participation project is similarly sparse, though this may be because the lead lies with the Further Mathematics Support Programme and Core Maths Support Programme.

July’s Bespoke says of Year 2:

‘Work to help schools and colleges increase the numbers of Year 12 and Year 13 students taking A level maths, and, among them, more girls, will continue. Approaches that bore fruit in some hubs this year will be implemented in other areas.’

The sketchiness of this material causes one to suspect that – leaving aside the Shanghai exchanges – progress on these national projects has been less than spectacular during the first year of the Hubs’ existence.

Even with the England-China project there is no published specification for the long-term research project that is to follow on from the exchanges.

Those working outside the Hubs need more information to understand and appreciate what value the Hubs are adding.


New National Collaborative Projects

The July edition of Bespoke confirms two further National Projects.

One is snappily called ‘Developing 140 new Primary Mathematics Teaching for Mastery specialists’:

‘Closely linked to other work on mastery, this project will involve the training of four teachers in each Maths Hub area to become experts in teaching for mastery in their own classrooms, and in supporting the similar development of teachers in partner schools.’

This project appeared to be a national programme in waiting when it was first announced in April 2015.

A subsequent NCETM press release confirmed that there were over 600 applicants for the available places.

The further details provided by NCETM reveal that participants will pursue a two-year course. Year One combines three two-day residential events with the leadership of teacher research groups, both in the teacher’s own school and for groups of teachers in neighbouring schools.  Year Two is devoted exclusively to these external teacher research groups.

The material explains that a research group is:

‘…a professional development activity attended by a group of teachers, with a specific focus on the design, delivery and learning within a jointly evaluated mathematics lesson.’

An FAQ document explains that a typical research group meeting is a half-day session with discussion taking place before and after a lesson observation.

The four external group meetings in Year One will together constitute a pilot exercise. In Year Two participants will lead up to five such groups, each meeting on six occasions. Groups will typically comprise five pairs of teachers drawn from five different schools.

Release time is 12 days in Year One and up to 30 days in Year Two (assuming the participant leads the maximum five research groups).

Training and supply costs are covered in Year One, but in Year Two they are to be met by charging the other participants in the research groups: a first indication that Hubs will be expected to generate their own income stream from the services they provide. (NCETM will provide ‘guidance’ on fee levels.)

Participants are expected to develop:

  • ‘Understanding of the principles of mastery within the context of teaching mathematics.
  • Deep subject knowledge of primary mathematics to support teaching for mastery.
  • The development of effective teaching techniques to support pupils in developing mastery of mathematics.
  • The ability to assess pupils for mastery.
  • The ability to support other teachers, and lead teacher research groups.’

The intention is that teachers completing the course will roll out further phases of professional development and:

‘Over time, this will spread the understanding of, and expertise in, teaching maths for mastery widely across the primary school system.’

The second new national project is called ‘Mathematical Reasoning’. Bespoke is typically uninformative:

‘A new project will start in September 2015, to trial ways of developing mathematical reasoning skills in Key Stage 3 pupils.’

This may or may not be related to an NCETM Multiplicative Reasoning Professional Development Programme, which took place in 2013/14 with the assistance of the Hubs. The programme:

‘focused on developing teachers’ understanding and capacity to teach topics that involved multiplicative reasoning to Key Stage 3 (KS3) pupils. Multiplicative reasoning refers to the mathematical understanding and capability to solve problems arising from proportional situations often involving an understanding and application of fractions as well as decimals, percentages, ratios and proportions.’

Some 60 teachers from 30 schools were organised into three regional professional development networks, each with a professional development lead and support from university researchers. Project materials were created by a central curriculum development team. The regional networks were hosted by Maths Hubs, presumably in their pilot phase.

In June 2015 DfE published a project Evaluation featuring a Randomised Control Trial (RCT). Unfortunately, this did not reveal any significant impact on pupil attainment:

‘During the timescale of the trial (13 October 2014 to May 2015) the programme did not have any statistically significant impacts on general mathematical attainment as measured by PiM tests or on items on the tests specifically associated with multiplicative reasoning’.

One of the Report’s recommendations is:

‘For the NCETM to make available MRP materials and approaches to teaching MR through the Maths Hub network’


Another is:

‘That the NCETM seeks further opportunities to engage curriculum developers with Maths Hubs and other NCETM activities and potentially to develop future curriculum design projects that address the needs of teachers, schools and pupils’.

With five national collaborative projects rather than three, the work programme in each Hub during Academic Year 2015/16 will be more heavily biased towards the Government’s agenda, unless there is also additional funding to increase the number of local projects. There is no hint in the latest Bespoke newsletter that this is the case.


Local projects 

Unfortunately, Hub-specific pages on the Maths Hubs Website do not distinguish national from local projects.

A regional breakdown offers some insight into the typical distribution between the two and the range of issues being addressed.

The embedded spreadsheet provides further details, including links to additional information on each work group where the Hubs have made this available.

  • South West: The four Hubs between them identify 27 work groups. Each Hub has a work group for each of the three initial national collaborative projects. Relatively unusual topics include maths challenge and innovation days and improving primary maths enrichment experiences. The Jurassic Hub includes amongst its list of generic focus areas ‘developing access for the most able’, but there is no associated work group.
  • West Midlands: Two of the three hubs have six work groups and the third has seven. Here there is rather less adherence to the national priorities with only the North Midlands and Peaks Hub noticeably engaged with the mastery agenda. One work group is addressing ‘strategies for preventing (closing) the gap’ in maths. It is disturbing that this is unique across the entire programme – no other region appears concerned enough to make this a priority, nor is it a national project in its own right.
  • North West: Of the three Hubs, one has provided no details of its work groups, one lists six and the other nine. Perhaps the most interesting is North West Two’s Maths App Competition. This involves Y5 and 6 pupils creating ‘a maths-based app for a particular area of weakness that they have identified’.
  • North East: The two North East Hubs have nine and eight work groups respectively. Both address all three initial national priorities. In one the remaining groups are designed to cover the primary, secondary and post-16 sectors respectively. In the other there is a very strong mastery bias with two further work groups devoted to it.
  • Yorkshire and Humberside: Only two of the four Hubs provide details of their work groups in the standard format. One offers eight, the other four. The less ambitious Yorkshire and the Humber Hub does not include any of the three national priorities but addresses some topics not found elsewhere including Same Day Intervention and Differentiation. In contrast, Yorkshire Ridings covers all three national priorities and a local project offering £500 bursaries for small-scale action research projects.
  • East Midlands: Two of the Hubs identify six work groups but the third – one of the two late additions – has only two, neither of them focused on the national priorities. Elsewhere, only East Midlands East has a work group built around the Shanghai exchanges. Otherwise, network focused work groups – whether for primary specialists, subject leaders or SLEs – are dominant.
  • East: Two of the four Hubs provide links to their own websites, which are not particularly informative. The others name nine and five work groups respectively. The former – Matrix Essex and Herts – includes all three initial national priorities, but the latter – Norfolk and Suffolk – includes only increasing post-16 participation. Matrix has a local project to enhance the subject knowledge of teaching assistants. 
  • South East: The five Hubs vary considerably in the number of work groups they operate, ranging between three and 11. Bucks, Berks and Oxon is the least prolific, naming only the three national priorities. At the other extreme, Surrey Plus is the most active of all 34 Hubs, though several of its groups appear to relate to courses, conferences and other one-off meetings. One is providing ‘inspiration days for KS2, KS3 and KS4 students in schools looking to improve attitudes towards maths’. 
  • London: Of the six London Hubs, one has provided no information about its work groups. Two of the remaining five have only three work groups. Of these, London Central and NW lists the three national priorities. The other – London Central and West – mentions the two mastery-related national programmes and then (intriguingly) a third project called ‘Project 4’! London Thames includes a Student Commission Project:

‘Students will become researchers over two days and will explore the difference between depth and acceleration in terms of students’ perceptions of progress. There will be support from an expert researcher to support them in bringing together their findings. They will present their findings at the Specialist Schools and Academy’s Trust (SSAT) Conference and other forums where they can share their experience.’

Unfortunately, the presentation given at this event suggests the students were unable to produce a balanced treatment, carefully weighing up the advantages and disadvantages of each approach and considering how they might be combined to good effect. Naturally they came up with the ‘right’ answer for NCETM!

The variation in the productivity of Hubs is something of a surprise. So are the different levels of commitment they display towards the NCETM’s mastery-focused agenda.

Does NCETM push the laggards to work harder and conform to its priorities, or does it continue to permit this level of variance, even though it will inevitably compromise the overall efficiency of the Maths Hub programme?


Supporting the Most Able


Through the Maths Hubs 

In 2013, NCETM published guidance on High Attaining Pupils in Primary Schools (one has to register with NCETM to access these materials).

This is strongly influenced by ACME’s Report ‘Raising the bar: developing able young mathematicians’ (December 2012) which defines its target group as:

‘…those students aged 5-16 who have the potential to successfully study mathematics at A level or equivalent’.

ACME bases its report on three principles:

  • ‘Potential heavy users of mathematics should experience a deep, rich, rigorous and challenging mathematics education, rather than being accelerated through the school curriculum.
  • Accountability measures should allow, support and reward an approach focused on depth of learning, rather than rewarding early progression to the next Key Stage.
  • Investment in a substantial fraction of 5-16 year olds with the potential to excel in mathematics, rather than focussing attention on the top 1% (or so), is needed to increase the number of 16+ students choosing to study mathematics-based subjects or careers.’

ACME in turn cites Mathematical Association advice from the previous year on provision for the most able in secondary schools.

It is fascinating – though beyond the scope of this post – to trace through these publications and subsequent NCETM policy the evolution of an increasingly narrow and impoverished concept of top-end differentiation.

The line taken in NCETM’s 2013 guidance is still relatively balanced:

‘It’s probably not helpful to think in terms of either enrichment or acceleration, but to consider the balance between these two approaches. Approaches may vary depending on the age of children, or the mathematics topics, while there may be extra-curricular opportunities to meet the needs of high attaining children in other ways. In addition to considerations of which approach supports the best learning, there are practical issues to consider.’

This is a far cry from the more extreme position now being articulated by NCETM, as discussed in my earlier post ‘A digression on breadth, depth, pace and mastery’.

There is in my view a pressing need to rediscover a richer and more sophisticated vision of ‘stretch and challenge’ for high attaining learners in maths and, by doing so, to help to achieve the Conservative manifesto commitment above. This need not be inconsistent with an Anglicised mastery model, indeed it ought to strengthen it significantly.

One obvious strategy is to introduce a new National Collaborative Project, ensuring that all 34 Hubs are engaged in developing this vision and building national consensus around it.

Here are some suggested design parameters:

  • Focus explicitly on improving attainment and progress, reducing underachievement by high attaining learners and closing gaps between disadvantaged high attainers and their peers.
  • Develop interventions targeted directly at learners, as well as professional development, whole school improvement and capacity building to strengthen school-led collaborative support.
  • Emphasise cross-phase provision encompassing primary, secondary and post-16, devoting particular attention to primary/secondary and secondary/post-16 transition.
  • Develop and disseminate effective practice in meeting the needs of the most able within and alongside the new national curriculum, including differentiated support for those capable of achieving at or beyond the equivalent of KS2 Level 6 in scaled score terms, and at or beyond Grade 9 GCSE at KS4.
  • Develop, test and disseminate effective practice in meeting the needs of the most able through a mastery-driven approach, exemplifying how breadth, depth and pace can be combined in different proportions to reflect high attainers’ varying needs and circumstances.


Through ‘Most Able Hubs’

Compared with Maths Hubs, the Sutton Trust’s recommendation – that designated schools should support those that are underperforming with the most able and consider providing a localised extra-curricular enrichment programme – is markedly unambitious.

And of course the Maths Hubs cannot be expected to help achieve Conservative ambitions for the other elements of STEM (let alone STEAM).

Why not introduce a parallel network of Most Able Hubs (MAHs)? These would follow the same design parameters as those above, except that the last would embrace a whole school/college and whole curriculum perspective.

But, in the light of the analysis above, I propose some subtle changes to the model.

  • Number of hubs

Thirty-four is not enough for genuine national reach. But the supply of potential hubs is constrained by the budget and the number of lead institutions capable of meeting the prescribed quality criteria.

Assuming that the initial budget is limited, one might design a long-term programme that introduces the network in two or even three phases. The first tranche would help to build capacity, improving the capability of those intending to follow in their footsteps.

The ideal long-term outcome would be to introduce approximately 100 MAHs, at least 10 per region and sufficient for each to support some 200 primary and secondary schools (170 primary plus 30 secondary) and all the post-16 institutions in the locality.

That might be achieved in two phases of 50 hubs apiece or three of 33-34 hubs apiece.
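As a rough sanity check on these coverage figures (the school counts in the comments are approximate 2015 values, not drawn from the programme documentation):

```python
# Rough coverage check for a 100-hub Most Able Hubs network.
hubs = 100
primaries_per_hub = 170
secondaries_per_hub = 30

primaries_covered = hubs * primaries_per_hub      # 17,000
secondaries_covered = hubs * secondaries_per_hub  # 3,000

# England had roughly 16,800 state-funded primary and 3,400 state-funded
# secondary schools in 2015, so 100 hubs of this size would give
# something close to full national coverage, with at least 10 hubs
# in each of the nine English regions.
print(primaries_covered, secondaries_covered)
```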

  • Quality threshold

In the first instance, MAHs would be selected on the basis of Ofsted evaluation – Outstanding overall and for the same sub-categories as Maths Hubs – and high-attaining pupil performance data, relating to attainment, progress and destinations. This should demonstrate a strong record of success with disadvantaged high attainers.

One of the inaugural national collaborative projects (see below) would be to develop and trial a succinct Quality Measure and efficient peer assessment process, suitable for all potential lead institutions regardless of phase or status.

This would be used to accredit all new MAHs, but also to re-accredit existing MAHs every three years. Those failing to meet the requisite standard would be supported to improve.

  • Three tiers and specialism

MAHs would operate at local and national level but would also collaborate regionally. They might take it in turns to undertake regional co-ordination.

Each would pursue a mix of national, regional and local priorities. The regional and local priorities would not replicate national priorities but MAHs would otherwise have free rein in determining them, subject to the approval of action plans (see below).

Each MAH would also be invited to develop a broader specialism which it would pursue in national and regional settings. MAHs from different regions with the same specialism would form a collaborative. The selected specialism might be expected to inform to some extent the choice of local priorities.

  • Strategic partnerships

Each MAH would develop a variety of local strategic partnerships, drawing in other local school and college networks, including TSAs, MATs, local authority networks, maths and music hubs; local universities, their faculties and schools of education; nearby independent schools;  local commercial and third sector providers; and local businesses with an interest in the supply of highly skilled labour. Some partners might prefer to engage at a regional level.

SLEs with a ‘most able’ specialism would be involved as a matter of course and would be expected to play a leading role.

National bodies would serve as national strategic partners, sitting on a National Advisory Group and contributing to the termly national forum.

Participating national bodies would include: central government and its agencies; national organisations, whether third sector or commercial, supporting the most able; and other relevant national education organisations, including subject associations and representative bodies.

Termly forums would be used to monitor progress, resolve issues and plan collaborative ventures. All non-sensitive proceedings would be published online. Indeed a single website would publish as much detail as possible about the MAHs: transparency would be the watchword.

  • Work Groups

Each MAH would agree an annual action plan applying the work group methodology to its national, regional and local priorities. Each priority would entail a substantive work programme requiring significant co-ordinated activity over at least two terms.

An additional work group would capture any smaller-scale local activities (and MAHs might be permitted to use a maximum of 10% of their programme budget for this purpose).

MAHs’ progress against their action plans – including top level output and outcome targets – would be assessed annually and the results used to inform the re-accreditation process.

The programme as a whole would be independently evaluated and adjusted if necessary to reflect the findings from formative evaluation.

  • Staffing and funding

MAHs would operate with the same combination of co-ordinator, SLT sponsor and administrator roles, but with the flexibility to distribute these roles between individuals as appropriate. Hubs would be encouraged to make the lead role a full-time appointment.

Co-ordinators would constitute a ‘network within a network’, meeting at termly forums and supporting each other through an online community (including weekly Twitter chats) and a shared resource base.

Co-ordinators would be responsible for devising and running their own induction and professional development programme and ensuring that new appointees complete it satisfactorily. Additional funding would be available for this purpose. The programme would be accredited at Masters level.

Assuming a full year budget of £160K per MAH (£60K for structural costs; £100K for work groups), plus 10% for central administration, the total steady-state cost of a 100-MAH network would be £17.6m per year, not much more than the £15m that Labour committed during the General Election campaign. If the programme were phased in over three years, the annual cost would be significantly lower during that period.

MAHs might be encouraged to generate income to offset against their structural costs. The co-ordinators’ salary and on-costs might be the first priority. In time Hubs might be expected to meet these entirely from income generated, so reducing the overall cost by almost a third.
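The funding arithmetic above can be sketched as a quick check. The figures are the post's own assumptions rather than official budget lines:

```python
# Steady-state cost of a hypothetical 100-hub Most Able Hub (MAH) network,
# using the per-hub figures assumed in the text.
structural_cost = 60_000        # per MAH per year (GBP)
work_group_cost = 100_000       # per MAH per year (GBP)
num_hubs = 100
admin_rate = 0.10               # central administration, added on top

hub_cost = structural_cost + work_group_cost          # 160K per MAH
total_cost = hub_cost * num_hubs * (1 + admin_rate)   # 17.6m steady state

# If hubs eventually met their structural costs from generated income,
# the saving would be roughly the structural element across the network:
saving = structural_cost * num_hubs                   # 6.0m, about a third
reduced_cost = total_cost - saving                    # 11.6m

print(f"£{total_cost:,.0f} steady state; £{reduced_cost:,.0f} with income offset")
```

On these assumptions the income offset removes roughly a third of the total, consistent with the claim above.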

In an ideal world, MAHs would also support a parallel programme providing long-term intensive support to disadvantaged high attainers funded through a £50m pupil premium topslice.

The overall cost is significant, but bears comparison with the substantial sums invested in some selective 16-19 free schools, or the £50m recently set aside for School Cadet Forces. Maybe funding for MAHs should also be drawn from the fines levied on the banks!

MAHs would support learners from YR-Y13 and have a genuinely national reach, while free schools can only ever impact significantly on a very limited annual intake plus those fortunate enough to benefit from any localised outreach activity. In short MAHs offer better value for money.



The principal findings from this review are that:

  • Maths Hubs offer a potentially workable model for system-wide improvement in the quality of maths education which could help to secure higher standards, stronger attainment and progress. But expectations of the Hubs are set too high given the limited resource available. It is doubtful whether the present infrastructure is strong enough to support the Government’s ambition to make England the best place in the world to study maths (in effect by 2020).
  • Given the dearth of information it is very difficult to offer a reliable assessment of the progress made by Maths Hubs in their first year of operation. The network has managed to establish itself from scratch within a relatively short time and with limited resources, but progress appears inconsistent, with some Hubs taking on and achieving much more than others. Two of the first three national collaborative projects still seem embryonic and the England-China project seems to be making steady rather than spectacular progress.
  • There are some tensions and weaknesses inherent in the model. In particular it relies on the successful reconciliation of potentially competing national and local priorities. There is evidence to suggest that national priorities are dominating at present. The model also depends critically on the capability of a small group of part-time co-ordinators. Several are likely to have limited experience and support, as well as insufficiently generous time allocations. Many will inevitably progress to school leadership positions so turnover will be a problem. An independent evaluation with a formative aspect would have been helpful in refining the model, ironing out the shortcomings and minimising the tensions. The apparent failure to commission an evaluation could become increasingly problematic as the expectations placed on the Hubs are steadily ratcheted upwards.
  • The supply of information is strictly rationed; the profile of Maths Hubs is far too low. Because the quality and quantity of information is so limited, those not working inside the network will infer that there is something to hide. Institutions that have not so far engaged with the Hubs will be less inclined to do so. If external communication is wanting, that may suggest that intra-Hub communication is equally shaky. Effective communication is critical to the success of such networks and ought to be given much higher priority. The Maths Hub website ought to be a ‘one stop shop’ for all stakeholders’ information needs, but it is infrequently updated and poorly stocked. Transparency should be the default position.
  • If the Government is to ‘create more opportunities to stretch the most able’ while ensuring that all high attainers ‘are pushed to achieve their potential’, then Maths Hubs will need to be at the forefront of a collective national improvement effort. NCETM should be making the case for an additional national collaborative project with this purpose. More attention must be given to shaping how the evolving English model of maths mastery provides stretch and challenge to high attainers, otherwise there is a real risk that mastery will perpetuate underachievement, so undermining the Government’s ambitions. In PISA 2012, 3.1% of English participants achieved Level 6 compared with 30.8% of those from Shanghai, while the comparative percentages for Levels 5 and 6 were 12.4% and 55.4% respectively. NCETM should specify now what they would consider acceptable outcomes for England in PISA 2015 and 2018 respectively.
  • Maths Hubs cannot extend their remit into the wider realm of STEM (or potentially STEAM if arts are permitted to feature). But, as Ofsted has shown, there are widespread shortcomings in the quality of ‘most able education’ more generally, not least for those from disadvantaged backgrounds. I have already made the case for a targeted programme supporting disadvantaged high attainers from Year 7 upwards, funded primarily through an annual pupil premium topslice. But the parallel business of school and college improvement might be spearheaded by a national network of Most Able Hubs with a whole school/college remit. I have offered some suggestions for how the Maths Hubs precedent might be improved upon. The annual cost would be similar to the £15m committed by Labour pre-election.

If such a network were introduced from next academic year then, by 2020, the next set of election manifestos might reasonably aim to make Britain the best place in the world for high attaining learners, especially high attaining learners from disadvantaged backgrounds.

And, with a generation of sustained effort across three or four successive governments and universal commitment in every educational setting, we might just make it….

What do you think the chances are of that happening?

Me too.



July 2015

A Digression on Breadth, Depth, Pace and Mastery



For a more recent post on these issues, go here

This post explores the emerging picture of mastery-based differentiation for high attainers and compares it with a model we used in the National G&T Programme, back in the day.

It is a rare venture into pedagogical territory by a non-practitioner, so may not bear close scrutiny from the practitioner’s perspective. But it seeks to pose intelligent questions from a theoretical position and so promote further debate.


Breadth, depth and pace


Quality standards

In the original National Quality Standards in Gifted and Talented Education (2005) one aspect of exemplary ‘Effective Provision in the Classroom’ was:

‘Teaching and learning are suitably challenging and varied, incorporating the breadth, depth and pace required to progress high achievement. Pupils routinely work independently and self-reliantly.’

In the 2010 version it was still in place:

‘Lessons consistently challenge and inspire pupils, incorporating the breadth, depth and pace required to support exceptional rates of progress. Pupils routinely work creatively, independently and self-reliantly.’

These broad standards were further developed in the associated Classroom Quality Standards (2007) which offered a more sophisticated model of effective practice.

The original quality standards were developed by small expert working groups, reporting to wider advisory groups and were carefully trialled in primary and secondary classrooms.

They were designed not to be prescriptive but, rather, to provide a flexible framework within which schools could develop and refine their own preferred practice.

Defining the terms

What did we mean by breadth, depth and pace?

  • Breadth (sometimes called enrichment) gives learners access to additional material beyond the standard programme of study. They might explore additional dimensions of the same topic, or an entirely new topic. They might need to make cross-curricular connections, and/or to apply their knowledge and skills in an unfamiliar context.
  • Depth (sometimes called extension) involves delving further into the same topic, or considering it from a different perspective. It might foreground problem solving. Learners might need to acquire new knowledge and skills and may anticipate material that typically occurs later in the programme of study.
  • Pace (sometimes called acceleration) takes two different forms. It may be acceleration of the learner, for example advancing an individual to a higher year group in a subject where they are particularly strong. More often, it is acceleration of the learning, enabling learners to move through the programme of study at a relatively faster pace than some or all of their peers. Acceleration of learning can take place at a ‘micro’ level in differentiated lesson planning, or in a ‘macro’ sense, typically through setting. Both versions of acceleration will cause the learner to complete the programme of study sooner and they may be entered early for an associated test or examination.

It should be readily apparent that these concepts are not distinct but overlapping. There might, for example, be an element of faster pace in extension, or of increased depth in acceleration. A single learning opportunity may include two, or possibly all three. It is not always straightforward to disentangle them completely.

Applying these terms

From the learner’s perspective, one of these three elements can be dominant, with the preferred strategy determined by that learner’s attainment, progress and wider needs.

  • Enrichment might be dominant if the learner is an all-rounder, relatively strong in this subject but with equal or even greater strength elsewhere.
  • Extension might be dominant if the learner shows particular aptitude or interest in specific aspects of the programme of study.
  • Acceleration might be dominant if the learner is exceptionally strong in this subject, or has independently acquired and introduced knowledge or skills that are not normally encountered until later in this or a subsequent key stage.

Equally though, the richest learning experience is likely to involve a blend of all three elements in different combinations: restricting advanced learners to one or two of them might not always be in their best interests. Moreover, some high attainers will thrive with a comparatively ‘balanced scorecard’.

The intensity or degree of enrichment, extension or acceleration will also vary according to the learners’ needs. Even in a top set, decisions about how broadly to explore, how deeply to probe or how far and how fast to press forward must reflect their starting point and the progress achieved to date.

Acceleration of the learner may be appropriate if he or she is exceptionally advanced.  Social and emotional maturity will need to be taken into account, but all learners are different – this should not be used as a blanket excuse for failing to apply the approach.

There must be evidence that the learner is in full command of the programme of study to date and that restricting his or her pace is having a detrimental effect. A pedagogical preference for moving the class along at the same pace should never over-ride the learner’s needs.

Both variants of acceleration demand careful long-term planning, so the learner can continue on a fast track where appropriate, or step off without loss of esteem. It will be frustrating for a high attainer expected to ‘mark time’ when continuity is lost. This may be particularly problematic on transfer and transition between settings.

Careful monitoring is also required, to ensure that the learner continues to benefit, is comfortable and remains on target to achieve the highest grades. No good purpose is served by ‘hothousing’.

Mastery and depth

The Expert Panel

The recent evolution of a mastery approach can be traced back to the Report of the Expert Panel for the National Curriculum Review (December 2011).

‘Amongst the international systems which we have examined, there are several that appear to focus on fewer things in greater depth in primary education, and pay particular attention to all pupils having an adequate understanding of these key elements prior to moving to the next body of content – they are ‘ready to progress’…

… it is important to understand that this model applies principally to primary education. Many of the systems in which this model is used progressively change in secondary education to more selective and differentiated routes. Spread of attainment then appears to increase in many of these systems, but still with higher overall standards than we currently achieve in England…

There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others

These views cohere with our notion of a revised model that focuses on inclusion, mastery and progress. However, more work needs to be done around these issues, both with respect to children with learning difficulties and those regarded as high attainers.’

For reasons best known to itself, the Panel never undertook that further work in relation to high attainers, or at least it was never published. This has created a gap in the essential groundwork necessary for the adoption of a mastery-driven approach.


National curriculum

Aspects of this thinking became embodied in the national curriculum, but there are some important checks and balances.

The inclusion statement requires differentiation for high attainers:

‘Teachers should set high expectations for every pupil. They should plan stretching work for pupils whose attainment is significantly above the expected standard.’

The primary programmes of study for all the core subjects remind everyone that:

‘Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage, if appropriate.’

But, in mathematics, both the primary and secondary PoS say:

‘The expectation is that the majority of pupils will move through the programmes of study at broadly the same pace. However, decisions about when to progress should always be based on the security of pupils’ understanding and their readiness to progress to the next stage. Pupils who grasp concepts rapidly should be challenged through being offered rich and sophisticated problems before any acceleration through new content. Those who are not sufficiently fluent with earlier material should consolidate their understanding, including through additional practice, before moving on.’

These three statements are carefully worded and, in circumstances where all apply, they need to be properly reconciled.


NCETM champions the maths mastery movement

The National Centre for Excellence in the Teaching of Mathematics (NCETM), a Government-funded entity responsible for raising levels of achievement in maths, has emerged as a cheerleader for and champion of a maths mastery approach.

It has published a paper ‘Mastery approaches to mathematics and the new national curriculum’ (October 2014).

Its Director, Charlie Stripp, has also written two blog posts on the topic, in October 2014 and April 2015.

The October 2014 paper argues (my emphasis):

‘Though there are many differences between the education systems of England and those of east and south-east Asia, we can learn from the ‘mastery’ approach to teaching commonly followed in these countries. Certain principles and features characterise this approach…

… The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.’

It continues:

‘Taking a mastery approach, differentiation occurs in the support and intervention provided to different pupils, not in the topics taught, particularly at earlier stages. There is no differentiation in content taught, but the questioning and scaffolding individual pupils receive in class as they work through problems will differ, with higher attainers challenged through more demanding problems which deepen their knowledge of the same content.’

In his October 2014 post, Stripp opines:

‘Put crudely, standard approaches to differentiation commonly used in our primary school maths lessons involve some children being identified as ‘mathematically weak’ and being taught a reduced curriculum with ‘easier’ work to do, whilst others are identified as ‘mathematically able’ and given extension tasks….

…For the children identified as ‘mathematically able’:

  1. Extension work, unless very skilfully managed, can encourage the idea that success in maths is like a race, with a constant need to rush ahead, or it can involve unfocused investigative work that contributes little to pupils’ understanding. This means extension work can often result in superficial learning. Secure progress in learning maths is based on developing procedural fluency and a deep understanding of concepts in parallel, enabling connections to be made between mathematical ideas. Without deep learning that develops both of these aspects, progress cannot be sustained.
  2. Being identified as ‘able’ can limit pupils’ future progress by making them unwilling to tackle maths they find demanding because they don’t want to challenge their perception of themselves as being ‘clever’ and therefore finding maths easy….

…I do think much of what I’m saying here also applies at secondary level.

Countries at the top of the table for attainment in mathematics education employ a mastery approach to teaching mathematics. Teachers in these countries do not differentiate their maths teaching by restricting the mathematics that ‘weaker’ children experience, whilst encouraging ‘able’ children to ‘get ahead’ through extension tasks… Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace…’

The April 2015 post continues in a similar vein, commenting directly on the references in the PoS quoted above (my emphases):

‘The sentence: ‘Pupils who grasp concepts rapidly should be challenged through rich and sophisticated problems before any acceleration through new content’, directly discourages acceleration through content, instead requiring challenge through ‘rich and sophisticated (which I interpret as mathematically deeper) problems’. Engaging with ‘rich and sophisticated problems’ involves reasoning mathematically and applying maths to solve problems, addressing all three curriculum aims. All pupils should encounter such problems; different pupils engage with problems at different depths, but all pupils benefit

…Meeting the needs of all pupils without differentiation of lesson content requires ensuring that both (i) when a pupil is slow to grasp an aspect of the curriculum, he or she is supported to master it and (ii) all pupils should be challenged to understand more deeply…

The success of teaching for mastery in the Far East (and in the schools employing such teaching here in England) suggests that all pupils benefit more from deeper understanding than from acceleration to new material. Deeper understanding can be achieved for all pupils by questioning that asks them to articulate HOW and WHY different mathematical techniques work, and to make deep mathematical connections. These questions can be accessed by pupils at different depths and we have seen the Shanghai teachers, and many English primary teachers who are adopting a teaching for mastery approach, use them very skilfully to really challenge even the highest attaining pupils.’

The NCETM is producing guidance on assessment without levels, showing how to establish when a learner

‘…has ‘mastered’ the curriculum content (meaning he or she is meeting national expectations and so ready to progress) and when a pupil is ‘working deeper’ (meaning he or she is exceeding national expectations in terms of depth of understanding).’



NCETM wants to establish a distinction between depth via problem-solving (good) and depth via extension tasks (bad)

There is some unhelpful terminological confusion in the assumption that extension tasks necessarily require learners to anticipate material not yet covered by the majority of the class.

Leaving that aside, notice how the relatively balanced wording in the programme of study is gradually adjusted until the balance has disappeared.

The PoS says ‘the majority of pupils will move through…at broadly the same pace’ and that they ‘should be challenged through being offered rich and sophisticated problems before any acceleration through new content’.

This is first translated into

‘…the large majority of pupils progress through the curriculum content at the same pace’ (NCETM paper) then it becomes

‘…expose almost all of the children to the same curriculum content at the same pace’ (Stripp’s initial post) and finally emerges as

‘Meeting the needs of all pupils without differentiation of lesson content’ and

‘…all pupils benefit more from deeper understanding than from acceleration to new material.’ (Stripp’s second post).

Any non-mathematician will tell you that the difference between the majority (over 50%) and all (100%) may be close to 50%.

Such a minority could very comfortably include all children achieving L3 equivalent at KS1 or L5 equivalent at KS2, or all those deemed high attainers in the Primary and Secondary Performance Tables.

The NCETM pretends that this minority does not exist.

It does not consider the scope for acceleration towards new content subsequent to the delivery of ‘rich and sophisticated problems’.

Instead it argues that the statement in the PoS ‘directly discourages acceleration through content’ when it does no such thing.

This is propaganda, but why is NCETM advancing it?

One possibility, not fully developed in these commentaries, is the notion that teachers find it easier to work in this way. In order to be successful, ‘extension work’ demands exceptionally skilful management.

On the other hand, Stripp celebrates the fact that Shanghai teachers:

‘…were very skilled at questioning and challenging children to engage more deeply with maths within the context of whole class teaching.’

It is a moot point whether such questioning, combined with the capacity to develop ‘rich and sophisticated problems’, is any more straightforward for teachers to master than the capacity to devise suitable extension tasks, especially when one approach is relatively more familiar than the other.

Meanwhile, every effort is made to associate maths mastery with other predilections and prejudices entertained by educational professionals:

  • The claim that it will have a positive impact on teacher workload, though no evidence – real or imagined – is cited to support this belief.
  • The belief that all children can be successful at maths (though with no acknowledgement that some will always be comparatively more successful than others) and an associated commitment to ‘mindset’, encouraging learners to associate success with effort and hard work rather than underlying aptitude.
  • The longstanding opposition of many in the maths education community to any form of acceleration, fuelled by alarming histories of failed prodigies at one extreme and poorly targeted early entry policies at the other. (I well remember discussing this with them as far back as the nineties.)
  • The still contested benefits of life without levels.

On this latter point, the guidance NCETM is developing appears to assume that ‘exceeding national expectations’ in maths must necessarily involve ‘working deeper’.

I have repeatedly argued that, for high attainers, such measures should acknowledge the potential contributions of breadth, depth and pace.

Indeed, following a meeting and email exchanges last December, NAHT said it wanted to employ me to help develop such guidance, as part of its bigger assessment package.

(Then nothing more – no explanation, no apology, zilch. Shame on you, Mr Hobby. That’s no way to run an organisation.)



Compared with the richness of the tripartite G&T model, the emphasis placed exclusively on depth in the NCETM mastery narrative seems relatively one-dimensional and impoverished.

There is no great evidence in this NCETM material of a willingness to develop an alternative understanding of ‘stretch and challenge’ for high attainers. Vague terms like ‘intelligent practice’, ‘deep thinking’ and ‘deep learning’ are bandied about like magical incantations, but what do they really mean?

NCETM needs to revisit the relevant statement in the programme of study and strip away (pun intended) the ‘Chinese whispers’ (pun once more intended) in which they have cocooned it.

Teachers jumping on the maths mastery bandwagon need meaningful free-to-access guidance that helps them construct suitably demanding and sophisticated problems and deploy advanced questioning techniques that get the best out of their high attainers.

I do not dismiss the possibility that high attainers can thrive under a mastery model that foregrounds depth over breadth and pace, but it is a mistake to neglect breadth and pace entirely.

Shanghai might be an exception, but most of the other East Asian cradles of mastery also run parallel gifted education programmes in which accelerated maths is typically predominant. I’ve reviewed several on this Blog.

For a more recent treatment of these issues see my September 2015 post here.



April 2015

Maths Mastery: Evidence versus Spin


On Friday 13 February, the Education Endowment Foundation (EEF) published the long-awaited evaluation reports of two randomised control trials (RCTs) of Mathematics Mastery, an Ark-sponsored programme and recipient of one of the EEF’s first tranche of awards back in 2011.

EEF, Ark and Mathematics Mastery each published a press release to mark the occasion but, given the timing, none attracted attention from journalists and they were discussed only briefly on social media.

The main purpose of this post is to distinguish evidence from spin, to establish exactly what the evaluations tell us – and what provisos should be attached to those findings.

The post is organised into three main sections which deal respectively with:

  • Background to Mathematics Mastery
  • What the evaluation reports tell us and
  • What the press releases claim

The conclusion sets out my best effort at a balanced summary of the main findings. (There is a page jump here for those who prefer to cut to the chase.)

This post is written by a non-statistician for a lay audience. I look to specialist readers to set me straight if I have misinterpreted any statistical techniques or findings.

What was published?

On Friday 13 February the EEF published six different documents relevant to the evaluation:

  • A press release: ‘Low-cost internet-based programme found to considerably improve reading ability of year 7 pupils’.
  • A blog post: ‘Today’s findings: impact, no impact and inconclusive – a normal distribution of findings’.
  • An updated Maths Mastery home page (also published as a pdf Project Summary in a slightly different format).

The last three of these were written by the Independent Evaluators – Jerrim and Vignoles (et al) – employed through the UCL Institute of Education.

The Evaluators also refer to ‘a working paper documenting results from both trials’ available in early 2015 from http://ideas.repec.org/s/qss/dqsswp.html and www.johnjerrim.com. At the time of writing this is not yet available.

Press releases were issued on the same day by EEF, Ark and Mathematics Mastery.

All of the materials published to date are included in the analysis below.

Background to Maths Mastery

What is Maths Mastery?

According to the NCETM (October 2014) the mastery approach in mathematics is characterised by certain common principles:

‘Teachers reinforce an expectation that all pupils are capable of achieving high standards in mathematics.

  • The large majority of pupils progress through the curriculum content at the same pace. Differentiation is achieved by emphasising deep knowledge and through individual support and intervention.
  • Teaching is underpinned by methodical curriculum design and supported by carefully crafted lessons and resources to foster deep conceptual and procedural knowledge.
  • Practice and consolidation play a central role. Carefully designed variation within this builds fluency and understanding of underlying mathematical concepts in tandem.
  • Teachers use precise questioning in class to test conceptual and procedural knowledge, and assess pupils regularly to identify those requiring intervention so that all pupils keep up.

The intention of these approaches is to provide all children with full access to the curriculum, enabling them to achieve confidence and competence – ‘mastery’ – in mathematics, rather than many failing to develop the maths skills they need for the future.’

The NCETM paper itemises six key features, which I paraphrase as:

  • Curriculum design: Relatively small, sequenced steps which must each be mastered before learners move to the next stage. Fundamental skills and knowledge are secured first and these often need extensive attention.
  • Teaching resources: A ‘coherent programme of high-quality teaching materials’ supports classroom teaching. There is particular emphasis on ‘developing deep structural knowledge and the ability to make connections’. The materials may include ‘high-quality textbooks’.
  • Lesson design: Often involves input from colleagues drawing on classroom observation. Plans set out in detail ‘well-tested methods’ of teaching the topic. They include teacher explanations and questions for learners.
  • Teaching methods: Learners work on the same tasks. Concepts are often explored together. Technical proficiency and conceptual understanding are developed in parallel.
  • Pupil support and differentiation: Is provided through support and intervention rather than through the topics taught, particularly at early stages. High attainers are ‘challenged through more demanding problems which deepen their knowledge of the same content’. Issues are addressed through ‘rapid intervention’ commonly undertaken the same day.
  • Productivity and practice: Fluency is developed from deep knowledge and ‘intelligent practice’. Early learning of multiplication tables is expected. The capacity to recall facts from long term memory is also important.

Its Director published a blog post (October 2014) arguing that our present approach to differentiation has ‘a very negative effect’ on mathematical attainment and that this is ‘one of the root causes’ of our performance in PISA and TIMSS.

This is because it negatively affects the ‘mindset’ of low attainers and high attainers alike. Additionally, low attainers are insufficiently challenged and get further behind because ‘they are missing out on some of the curriculum’. Meanwhile high attainers are racing ahead without developing fluency and deep understanding.

He claims that these problems can be avoided through a mastery approach:

‘Instead, countries employing a mastery approach expose almost all of the children to the same curriculum content at the same pace, allowing them all full access to the curriculum by focusing on developing deep understanding and secure fluency with facts and procedures, and providing differentiation by offering rapid support and intervention to address each individual pupil’s needs.’

But unfortunately he stops short of explaining how, for high attainers, exclusive focus on depth is preferable to a richer blend of breadth, depth and pace, combined according to each learner’s needs.

NCETM is careful not to suggest that mastery is primarily focused on improving the performance of low-attaining learners.

It has published separate guidance on High Attaining Pupils in Primary Schools (registration required), which advocates a more balanced approach, although that predates this newfound commitment to mastery.

NCETM is funded by the Department for Education. Some of the comments on the Director’s blog post complain that it is losing credibility by operating as a cheerleader for Government policy.

Ark’s involvement

Ark is an education charity and multi-academy trust with an enviable reputation.

It builds its approach on six key principles, one of which is ‘Depth before breadth’:

‘When pupils secure firm foundations in English and mathematics, they find the rest of the curriculum far easier to access. That’s why we prioritise depth in these subjects, giving pupils the best chance of academic success. To support fully our pupils’ achievement in maths, we have developed the TES Award winning Mathematics Mastery programme, a highly-effective curriculum and teaching approach inspired by pupil success in Singapore and endorsed by Ofsted. We teach Mathematics Mastery in all our primary schools and at Key Stage 3 in a selection of our secondary schools. It is also being implemented in over 170 schools beyond our network.’

Ark’s 2014 Annual Report identifies five priorities for 2014/15, one of which is:

‘…developing curricula to help ensure our pupils are well prepared as they go through school… codifying our approach to early years and, building on the success of Maths Mastery, piloting an English Mastery programme…’

Mathematics Mastery is a charity in its own right. Its website lists 15 staff, a high-powered advisory group and three partner organisations:  Ark, the EEF (presumably by virtue of the funded evaluation) and the ‘Department for Education and the Mayor of London’ (presumably by virtue of support from the London Schools Excellence Fund).

NCETM’s Director sits on Mathematics Mastery’s Advisory Board.

Ark’s Chief Executive is a member of the EEF’s Advisory Board.

Development of Ark’s Maths Mastery programme

According to this 2012 report from Reform, which features Maths Mastery as a case study, it originated in 2010:

‘The development of Mathematics Mastery stemmed from collaboration between six ARK primary academies in Greater London, and the mathematics departments in seven separate ARK secondary academies in Greater London, Portsmouth and Birmingham. Representatives from ARK visited Singapore to explore the country’s approach first-hand, and Dr Yeap Ban Har, Singapore’s leading expert in maths teaching, visited King Solomon Academy in June 2011.’

In October 2011, EEF awarded Ark a grant of £600,000 for Maths Mastery, one of its first four awards.

The EEF’s press release says:

‘The third grant will support an innovative and highly effective approach to teaching children maths called Mathematics Mastery, which originated in Singapore. The programme – run by ARK Schools, the Academies sponsor, which is also supporting the project – will receive £600,000 over the next four years to reach at least 50 disadvantaged primary and secondary schools.’

Ark’s press release adds:

‘ARK Schools has been awarded a major grant by the Education Endowment Foundation (EEF) to further develop and roll out its Mathematics Mastery programme, an innovative and highly effective approach to teaching children maths based on Singapore maths teaching. The £600,000 grant will enable ARK to launch the programme and related professional development training to improve maths teaching in at least 50 disadvantaged primary and secondary schools.

The funding will enable ARK Schools to write a UK mathematics mastery programme based on the experience of teaching the pilot programme in ARK’s academies. ARK intends to complete the development of its primary modules for use from Sept 2012 and its secondary modules for use from September 2013. In parallel ARK is developing professional training and implementation support for schools outside the ARK network.’

The project home page on EEF’s site now says the total project cost is £774,000. It may be that the balance of £174,000 is the fee paid to the independent evaluators.

This 2012 information sheet says all Ark primary schools would adopt Maths Mastery from September 2012, and that its secondary schools have also devised a KS3 programme.

It describes the launch of a Primary Pioneer Programme from September 2012 and a Secondary Pioneer Programme from September 2013. These will form the cohorts to be evaluated by the EEF.

In 2013, Ark was awarded a grant of £617,375 from the Mayor of London’s London Schools Excellence Fund for the London Primary Schools Mathematics Mastery Project.

This supports the introduction of Mastery in 120 primary schools spread across 18 London boroughs. (Another source gives the grant as £595,000.)

It will be interesting to see whether Maths Mastery (or English Mastery) features in the Excellence Fund’s latest project to increase primary attainment in literacy and numeracy. The outcomes of the EEF evaluations may be relevant to that impending decision.

Ark’s Mathematics Mastery today

The Mathematics Mastery website advertises a branded variant of the mastery model, derived from a tripartite ‘holistic vision’:

  • Deep understanding, through a curriculum that combines universal high expectations with spending more time on fewer topics and heavy emphasis on problem-solving.
  • Integrated professional development through workshops, visits, coaching and mentoring and ‘access to exclusive online teaching and learning materials, including lesson guides for each week’.
  • Teacher collaboration – primary schools are allocated a geographical cluster of 4-6 schools while secondary schools attend a ‘national collaboration event’. There is also an online dimension.

It offers primary and secondary programmes.

The primary programme has three particular features: use of objects and pictures prior to the introduction of symbols; a structured approach to the development of mathematical vocabulary; and heavy emphasis on problem-solving.

It involves one-day training sessions for school leaders, for the Maths Mastery lead and those new to teaching it, and for teachers undertaking the programme in each year group. Each school receives two support visits and attends three local cluster meetings.

Problem-solving is also one of three listed features of the secondary programme. The other two are fewer topics undertaken in greater depth, plus joint lesson planning and departmental workshops.

There are two full training days, one for the Maths Mastery lead and one for the maths department plus an evening session for senior leadership. Each school receives two support visits and attends three national collaborative meetings. They must hold an hour-long departmental workshop each week and commit to sharing resources online.

Both primary and secondary schools are encouraged to launch the programme across Year 1/7 and then roll it upwards ‘over several years’.

The website is not entirely clear but it appears that Maths Mastery itself is being rolled out a year at a time, so even the original primary early adopters will have provision only up to Year 3 and are scheduled to introduce provision for Year 4 next academic year. In the secondary sector, activity currently seems confined to KS3, and predominantly to Year 7.

The number of participating schools is increasing steadily but is still very small.

The most recent figures I could find are 192 (Maths Mastery, November 2014) or 193 – 142 primary and 51 secondary (Ark 2015).

One assumes that this total includes:

  • An original tranche of 30 primary ‘early adopters’ including 21 not managed by Ark
  • 60 or so primary and secondary ‘Pioneer Schools’ within the EEF evaluations (ie the schools undertaking the intervention but not those forming the control group, unless they have subsequently opted to take up the programme)
  • The 120 primary schools in the London project
  • Primary and secondary schools recruited outwith the London and EEF projects, either alongside them or subsequently.

But the organisation does not provide a detailed breakdown, or show how these different subsets overlap.

The organisation is particularly coy about cost: there is nothing about this on the website.

The EEF evaluation reports say that 2FE primary schools and secondary schools will pay ‘an upfront cost of £6,000 for participating in the programme’.

With the addition of staff time for training, the per pupil cost for the initial year is estimated as £127 for primary schools and £50 for secondary schools.

The primary report adds:

‘In subsequent years schools are able to opt for different pathways depending on the amount of support and training they wish to choose; they also have ongoing access to the curriculum materials for additional year groups. The per pupil cost therefore reduces considerably, to below £30 per pupil for additional year groups.’

In EEF terms this is deemed a low cost intervention, although an outlay of such magnitude is a significant burden for primary schools, particularly when funding is under pressure, and might be expected to act as a brake on participation.

Further coyness is evident in respect of statutory assessment outcomes. Some details are provided for individual schools, but there is precious little about the whole cohort.

All I could find was this table in the Primary Yearbook 2014-15.


EEF maths mastery performance

It suggests somewhat better achievement at KS1 L2b and L3c than the national average, but there is no information about other levels and, of course, the sample is not representative, so the comparison is of limited value.

An absence of more sophisticated analysis – combined with the impression of limited transparency for those not yet inside the programme – is likely to act as a second brake on participation.

There is a reference to high attainers in the FAQ on the website:

‘The Mathematics Mastery curriculum emphasises stretching through depth of understanding rather than giving the top end of pupils [sic] new procedures to cover.

Problem solving is central to Mathematics Mastery. The great thing about the problems is that students can take them as far as they can, so those children who grasp the basics quickly can explore tasks further. There is also differentiation in the methods used, with top-end pupils typically moving to abstract numbers more quickly and spending less time with concrete manipulatives or bar models. There are extension ideas and support notes provided with the tasks to help you with this.

A range of schools are currently piloting the programme, which is working well in mixed-ability classes, as well as in schools that have set groups.’

The same unanswered questions arise as with the NCETM statement above. Is ‘Maths Mastery’ primarily focused on the ‘long tail’, potentially at the expense of high attainers?

The IoE evaluators think so. The primary evaluation report says that:

‘Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers.’

It would be helpful to have clarity on this point.


How influential is Maths Mastery?

Extremely influential.

Much educational and political capital has already been invested in Maths Mastery, hence the peculiar significance of the results contained in the evaluation reports.

The National Curriculum Expert Panel espoused mastery in its ‘Framework for the National Curriculum‘ (December 2011), while ducking the consequences for ‘stretch and challenge’ for high attainers – so creating a tension that remains unresolved to this day.

Meanwhile, the mastery approach has already influenced the new maths programme of study, as the NCETM document makes clear:

‘The 2014 national curriculum for mathematics has been designed to raise standards in maths, with the aim that the large majority of pupils will achieve mastery of the subject…

… For many schools and teachers the shift to this ‘mastery curriculum’ will be a significant one. It will require new approaches to lesson design, teaching, use of resources and support for pupils.’

Maths Mastery confirms that its Director was on the drafting team.

Mastery is also embedded in the national collaborative projects being undertaken through the Maths Hubs. Maths Mastery is one of four national partners in the Hubs initiative.

Ministers have endorsed the Ark programme in their speeches. In April 2014, Truss said:

‘The mastery model of learning places the emphasis on understanding core concepts. It’s associated with countries like Singapore, who have very high-performing pupils.

And in this country, Ark, the academy chain, took it on and developed it.

Ark run training days for maths departments and heads of maths from other schools.

They organise support visits, and share plans and ideas online with other teachers, and share their learning with a cluster of other schools.

It’s a very practical model. We know not every school will have the time or inclination to develop its very own programmes – a small rural school, say, or single-class primary schools.

But in maths mastery, a big chain like Ark took the lead, and made it straightforward for other schools to adopt their model. They maintain an online community – which is a cheap, quick way of keeping up with the best teaching approaches.

That’s the sort of innovation that’s possible.

Of course the important thing is the results. The programme is being evaluated so that when the results come out headteachers will be able to look at it and see if it represents good value.’

In June 2014 she said:

‘This idea of mastery is starting to take hold in classrooms in England. Led by evidence of what works, teachers and schools have sought out these programmes and techniques that have been pioneered in China and East Asia….

…With the Ark Schools Maths Mastery programme, more than 100 primary and secondary schools have joined forces to transform their pupils’ experiences of maths – and more are joining all the time. It’s a whole school programme focused on setting high expectations for all pupils – not believing that some just can’t do it. The programme has already achieved excellent results in other countries.’

Several reputations are being built upon Maths Mastery, many jobs depend upon it and large sums have been invested.

It has the explicit support of one of the country’s foremost academy chains and is already impacting on national curriculum and assessment policy (including the recent consultation on performance indicators for statutory teacher assessment).

Negative or neutral evaluations could have significant consequences for all the key players and are unlikely to encourage new schools to join the Programme.

Hence there is pressure in the system for positive outcomes – hence the significance of spin.

What the EEF evaluations tell us


Evaluation Protocols

EEF published separate Protocols for the primary and secondary evaluations in April 2013. These are broadly in line with the approach set out in the final evaluation reports, except that both refer much more explicitly to subsequent longitudinal evaluation:

‘In May/June 2017/18 children in treatment and control schools will sit key stage 2 maths exams. The IoE team will examine the long–run effectiveness of the Maths Mastery programme by investigating differences in school average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2012 and 2013)’.

‘In May/June 2018 children in treatment and control schools will sit national maths exams. The IoE team will examine the long-run effectiveness of the Maths Mastery programme by investigating differences in average maths test scores between treatment and control group. This information will be taken from the National Pupil Database, which the EEF will link to children’s Maths Mastery test scores (collected in 2013 and 2014) by NATCEN.’

It is not clear whether the intention is to preserve the integrity of the intervention and control groups until the former have rolled out Mastery to all year groups, or simply to evaluate the long-term effects of the initial one-year interventions, allowing intervention schools to drop Mastery and control schools to adopt it, entirely as they wish.

EEF Maths Mastery Project Homepage

The EEF’s updated Maths Mastery homepage has been revised to reflect the outcomes of the evaluations. It provides the most accessible summary of those outcomes.

It offers four key conclusions (my emphases):

  • ‘On average, pupils in schools adopting Mathematics Mastery made a small amount more progress than pupils in schools that did not. The effect detected was statistically significant, which means that it is likely that that improvement was caused by the programme.’
  • ‘It is unclear whether the programme had a different impact on pupils eligible for free school meals, or on pupils with higher or lower attainment.’
  • ‘Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider.’
  • ‘The evaluations assessed the impact of the programme in its first year of adoption. It would be worthwhile to track the medium and long-term impact of the approach.’

A table is supplied showing the effect sizes and confidence intervals for overall impact (primary and secondary together), and for the primary and secondary interventions separately.

EEF table 1 Capture


The support materials for the EEF’s toolkit help to explain these judgements.

About the Toolkit tells us that:

‘Average impact is estimated in terms of the additional months’ progress you might expect pupils to make as a result of an approach being used in school, taking average pupil progress over a year as a benchmark.

For example, research summarised in the Toolkit shows that improving the quality of feedback provided to pupils has an average impact of eight months. This means that pupils in a class where high quality feedback is provided will make on average eight months more progress over the course of a year compared to another class of pupils who were performing at the same level at the start of the year. At the end of the year the average pupil in a class of 25 pupils in the feedback group would now be equivalent to the 6th best pupil in the control class having made 20 months progress over the year, compared to an average of 12 months in the other class.’

There is another table showing us how to interpret this scale

EEF table 2 Capture


We can see from this that:

  • The overall Maths Mastery impact of +0.073 is towards the upper end of the ‘one month’s progress’ category.
  • The ‘primary vs comparison’ impact of +0.10 just scrapes into the ‘two months’ progress’ category.
  • The ‘secondary vs comparison’ impact of +0.06 is towards the middle of the ‘one month’s progress’ category.

All three are officially classed as ‘Low Effect’.
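The conversion from effect size to months of progress can be sketched in code. The band boundaries below are assumptions inferred from the Toolkit table reproduced above; the EEF's published scale is definitive.

```python
# Illustrative mapping from a standardised effect size to the EEF
# 'months progress' scale. Band boundaries are ASSUMED from the
# Toolkit table above, not taken from the EEF's own code.
EEF_BANDS = [
    (0.00, 0.05, 0),  # negligible effect -> 0 months
    (0.05, 0.10, 1),  # -> one month's progress
    (0.10, 0.19, 2),  # -> two months' progress
    (0.19, 0.27, 3),
    (0.27, 0.36, 4),
]

def months_progress(effect_size: float) -> int:
    """Return the EEF 'months progress' band for an effect size."""
    for low, high, months in EEF_BANDS:
        if low <= effect_size < high:
            return months
    raise ValueError("effect size outside the illustrated bands")

print(months_progress(0.073))  # overall   -> 1
print(months_progress(0.10))   # primary   -> 2 (only just)
print(months_progress(0.06))   # secondary -> 1
```

Note how sensitive the boundary is: the primary estimate of +0.10 sits exactly on the threshold between one and two months.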

If we compare the effect size attributable to Maths Mastery with others in the Toolkit, it is evident that it ranks slightly above school uniform and slightly below learning styles.

A subsequent section explains that the overall impact rating is dependent on meta-analysis (again my emphases):

‘The findings from the individual trials have been combined using an approach called “meta-analysis”. Meta-analysis can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note that care is needed in interpreting meta-analysed findings.’

But we are not told how, in light of this, we are to exercise care in interpreting this particular finding. There are no explicit ‘health warnings’ attached to it.
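The mechanics, at least, are easy to illustrate. In a fixed-effect (‘inverse variance’) meta-analysis each trial is weighted by the precision of its estimate, so the pooled confidence interval is narrower than either trial’s own. The inputs below are illustrative: the primary standard error is backed out of the report’s quoted 95% CI (-0.01 to +0.21), and the secondary standard error is an assumed value of similar size; neither is the evaluators’ exact figure.

```python
import math

def pool_fixed_effect(estimates, standard_errors):
    """Fixed-effect meta-analysis: weight each trial by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se, (pooled - 1.96 * pooled_se,
                               pooled + 1.96 * pooled_se)

# Illustrative inputs (NOT the evaluators' exact figures).
primary_es = 0.10
primary_se = (0.21 - (-0.01)) / (2 * 1.96)  # backed out of the quoted CI
secondary_es, secondary_se = 0.06, 0.045    # assumed

pooled, pooled_se, (lower, upper) = pool_fixed_effect(
    [primary_es, secondary_es], [primary_se, secondary_se])
```

This shows how a pooled lower bound a whisker above zero (0.004 in the reports) can emerge from two trials that are individually non-significant: the pooled estimate lands between the two trial estimates, with a tighter interval than either.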

The homepage does tell us that:

‘Due to the ages of pupils who participated in the individual trials, the headline findings noted here are more likely to be predictive of the programme’s impact on pupils in primary school than on pupils in secondary school.’

It also offers an explanation of why the effects generated from these trials are so small compared with those for earlier studies:

‘The findings were substantially lower than the average effects seen in the existing literature on “mastery approaches”. A possible explanation for this is that many previous studies were conducted in the United States in the 1970s and 80s, so may overstate the possible impact in English schools today. An alternative explanation is that the Mathematics Mastery programme differed from some examples of mastery learning previously studied. For example, classes following the Mathematics Mastery approach did not delay starting new topics until a high level of proficiency had been achieved by all students, which was a key feature in a number of apparently effective programmes.’


There is clearly an issue with the 95% confidence intervals supplied in the first table above. 

The Technical Appendices to the Toolkit say:

‘For those concerned with statistical significance, it is still readily apparent in the confidence intervals surrounding an effect size. If the confidence interval includes zero, then the effect size would be considered not to have reached conventional statistical significance.’ (p6)

The table indicates that the lower confidence interval is zero or lower in all three cases, which would mean that none of these findings reaches conventional statistical significance.

However, the homepage claims that the overall impact of both interventions, when combined through meta-analysis, is statistically significant.

And it fails entirely to mention that the impacts of the primary and the secondary interventions, taken separately, are statistically insignificant.

The attribution of statistical significance to the two evaluations combined rests on precision: whereas the homepage gives confidence intervals to two decimal places, the reports calculate them to three.

This gives a lower value of 0.004 (ie four thousandths above zero).

This can be seen from the table annexed to the primary and secondary reports and included in the ‘Overarching Summary Report’.

EEF maths mastery 3 decimal places Capture


The distinction is marginal, to say the least. Indeed, the Evaluation Reports say:

‘…the pooled effect size of 0.073 is just significantly different from zero at conventional thresholds’

Moreover, notice that the introduction of a third decimal place drags the primary effect size down to 0.099, officially consigning it to the ‘one month’s progress’ category rather than the two months quoted above.
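That sensitivity to decimal places can be shown directly. The two figures below are the ones quoted in the reports’ three-decimal-place table.

```python
# Both the significance claim and the 'months progress' band turn on
# the third decimal place (figures from the evaluation reports' table).
pooled_lower_ci = 0.004  # pooled lower 95% bound
primary_effect = 0.099   # primary effect size

# To two decimal places the lower bound reads 0.00 -- indistinguishable
# from an interval that touches zero, i.e. from non-significance.
assert round(pooled_lower_ci, 2) == 0.0

# Rounded to two decimal places the primary effect reads 0.10, crossing
# the EEF boundary into 'two months progress'; at three it stays below.
assert round(primary_effect, 2) == 0.10
assert primary_effect < 0.10
```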

This might appear to be dancing on the head of a statistical pin but, as we shall see later, the spin value of statistical significance is huge!

Overall there is a lack of clarity here that cannot be attributed entirely to the necessity for brevity. The attempt to conflate subtly different outcomes from the separate primary and secondary evaluations has masked these distinctions and distorted the overall assessment.


The full reports add some further interesting details which are summarised in the sections below.

Primary Evaluation Report 

EEF maths mastery table 4

Key points:

  • In both the primary and secondary reports, additional reasons are given for why the effects from these evaluations are so much smaller than those from previous studies. These include the fact that:

‘…some studies included in the mastery section of the toolkit show small or no effects, suggesting that making mastery learning work effectively in all circumstances is challenging.’

The overall conclusion is an indirect criticism of the Toolkit, noting as it does that ‘the relevance of such evidence for contemporary education policy in England…may be limited’.

  • The RCT was undertaken across two academic years. In AY2012/13, 40 schools (Cohort A) were involved: 20 were randomly allocated to the intervention and 20 to the control. In AY2013/14, 50 schools (Cohort B) participated: 25 were allocated to the intervention and 25 to the control. After the trial, control schools in Cohort A were free to pursue Maths Mastery. (The report does not mention whether this also applied to Cohort B.) It is not clear how subsequent longitudinal evaluation will be affected by such leakage from the control group.
  • The schools participating in the trial were recruited by Ark. They had to be state-funded and not already undertaking Maths Mastery:

‘Schools were therefore purposefully selected—they cannot be considered a randomly chosen sample from a well-defined population. The majority of schools participating in the trial were from London or the South East.’

  • Unlike in the secondary evaluation, no process evaluation was conducted, so it is not possible to determine the extent to which schools adhered to the prescribed programme.
  • Baseline tests were administered after allocation between intervention and control, at the beginning of each academic year. Pupils were tested again in July. Evaluators used the Number Knowledge Test (NKT) for this purpose. The report discusses reasons why this might not be an accurate predictor of subsequent maths attainment and whether it is so closely related to the intervention as to be ‘a questionable measure of the success of the trial’. The discussion suggests that there were potential advantages to both the intervention and control groups but does not say whether one outweighed the other. 
  • The results of the post-test are summarised thus:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.10 standard deviations higher on the post-test. This, however, only reached statistical significance at the 10% level (t = 1.82; p = 0.07), with the 95% confidence interval ranging from -0.01 to +0.21. Within Cohort A, children in the treatment group scored (on average) +0.09 standard deviations above those children in the control group (confidence interval -0.06 to +0.24). The analogous effect in Cohort B was +0.10 (confidence interval -0.05 to 0.26). Consequently, although the Mathematics Mastery intervention may have had a small positive effect on children’s test scores, it is not possible to rule out sampling variation as an explanation.’

  • The comparison of pre-test and post-test results provides little evidence of differential effects for those with lower or higher prior attainment:

‘Estimates are again presented in terms of effect sizes. The interaction effect is not significantly different from zero, with the 95% confidence interval ranging from -0.01 to +0.02. Thus there is little evidence that the effect of Mathematics Mastery differs between children with different levels of prior achievement.’

The Report adds:

‘Recall that the Mathematics Mastery intervention is particularly concerned with the ‘mastery’ of basic skills, and raising the attainment of low achievers. Thus one might anticipate the intervention to be particularly effective in the bottom half of the test score distribution. There is some, but relatively little, evidence that the intervention was less effective for the bottom half of the test distribution.’

So, on this evidence, Maths Mastery is no more effective for the low achievers it is intended to help most. This is somewhat different from the suggestion on the homepage that the answer to this question is ‘unclear’.
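As a sanity check, the post-test result quoted above (+0.10 standard deviations, t = 1.82, p = 0.07, 95% CI -0.01 to +0.21) hangs together arithmetically. The sketch below uses a simple normal approximation to the test statistic; the evaluators’ actual model (with covariates and clustering) will differ in detail.

```python
import math

effect_size = 0.10
t_stat = 1.82
se = effect_size / t_stat  # implied standard error, ~0.055

# 95% confidence interval from the implied standard error.
lower = effect_size - 1.96 * se
upper = effect_size + 1.96 * se

def normal_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Two-sided p-value for the quoted test statistic.
p_two_sided = 2.0 * (1.0 - normal_cdf(t_stat))

print(round(lower, 2), round(upper, 2))  # -0.01 0.21
print(round(p_two_sided, 2))             # 0.07
```

The recovered interval and p-value match the report's quoted figures to two decimal places.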

Several limitations are discussed, but it is important to note that they are phrased in hypothetical terms:

  • Pupils’ progress was evaluated after one academic year:

’This may be considered a relatively small ‘dose’ of the Mathematics Mastery programme’.

  • The intervention introduced a new approach to schools, so there was a learning curve which control schools did not experience:

‘With more experience teaching the programme it is possible that teachers would become more effective in implementing it.’

  • The test may favour either control schools or intervention schools.
  • Participating schools volunteered to take part, so it is not possible to say whether similar effects would be found in all schools.
  • It was not possible to control for balance – eg by ethnic background and FSM eligibility – between intervention and control. [This is now feasible so could potentially be undertaken retrospectively to check there was no imbalance.]

Under ‘Interpretation’, the report says:

‘Within the context of the wider educational literature, the effect size reported (0.10 standard deviations) would typically be considered ‘small’….

Yet, despite the modest and statistically insignificant effect, the Mathematics Mastery intervention has shown some promise.’

The phrase ‘some promise’ is justified by reference to the meta-analysis, the cost effectiveness (a small effect size for a low cost is preferable to the same outcome for a higher cost) and the fact that the impact of the entire programme has not yet been evaluated:

‘Third, children are likely to follow the Mathematics Mastery programme for a number of years (perhaps throughout primary school), whereas this evaluation has considered the impact of just the first year of the programme. Long-run effects after sustained exposure to the programme could be significantly higher, and will be assessed in a follow-up study using Key Stage 2 data.’

This is the only reference to a follow-up study. It is less definite than the statement in the assessment protocol and there is no further explanation of how this will be managed, especially given potential ‘leakage’ from the control group.

Secondary Evaluation Report

[Image: EEF Maths Mastery evaluation – Table 5]

Key points:

  • 50 schools were recruited to participate in the RCT during AY2013/14, with 25 randomly allocated to the intervention group and 25 to control. All Year 7 pupils within the intervention schools experienced the intervention. As in the primary trial, control schools were eligible to access the programme after the end of the trial year. Interestingly, 3 of the 25 intervention schools (12%) dropped out before the end of the year – their reasons are not recorded.
  • As in the primary trial, Ark recruited the participating schools – which had to be state-funded and new to Maths Mastery. Since schools were deliberately selected they could not be considered a random sample. The report notes:

‘Trial participants, on average, performed less well in their KS1 and KS2 examinations than the state school population as a whole. For instance, their KS1 average points scores (and KS2 maths test scores) were approximately 0.2 standard deviations (0.1 standard deviations) below the population mean. This seems to be driven, at least in part, by the fact that the trial particularly under-represented high achievers (relative to the population). For instance, just 12% of children participating in the trial were awarded Level 3 in their Key Stage 1 maths test, compared to 19% of all state school pupils in England.’

  • KS1 and KS2 test results were used as the baseline. The Progress in Maths (PiM) test was used to assess pupils at the end of the year. But about 40% of the questions cover content not included in the Y7 maths mastery curriculum, which disadvantaged intervention pupils relative to the control group. PiM also includes a calculator section although calculators are not used in Year 7 of Maths Mastery. It was agreed that breakdowns of results would be supplied to account for this.
  • On the basis of overall test results:

‘Children who received the Mathematics Mastery intervention scored, on average, +0.055 standard deviations higher on the PiM post-test. This did not reach statistical significance at conventional thresholds (t = 1.20; p = 0.24), with the 95% confidence interval ranging from –0.037 to +0.147. Turning to the FSM-only sample, the estimated effect size is +0.066 with the 95% confidence interval ranging from –0.037 to +0.169 (p = 0.21). Moreover, we also estimated a model including a FSM-by intervention interaction. Results suggested there was little evidence of heterogeneous intervention effects by FSM. Consequently, although the Mathematics Mastery intervention may have had a small positive effect on overall PiM test scores, one cannot rule out the possibility that this finding is due to sampling variation.’
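The reported figures hang together arithmetically. As a minimal sketch – assuming a normal approximation, and back-calculating the standard error from the reported confidence interval rather than taking it from the report’s own model – the test statistic and p-value can be recovered as follows:

```python
# Hedged reconstruction of the secondary-trial headline statistics.
# Effect size and CI are as reported; the standard error and p-value
# are back-calculated under a normal approximation, so they will not
# match the report's own model exactly.
from math import erf, sqrt

effect = 0.055                     # reported effect size (SDs)
ci_low, ci_high = -0.037, 0.147    # reported 95% confidence interval

se = (ci_high - ci_low) / (2 * 1.96)         # implied standard error
z = effect / se                              # implied test statistic
p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))   # two-sided p-value

print(round(se, 3), round(z, 2), round(p, 2))  # 0.047 1.17 0.24
```

The recovered statistic (≈1.17) is close to the reported t = 1.20, and the interval spanning zero is precisely why the result misses conventional significance at the 5% level.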

  • When the breakdowns were analysed:

‘As perhaps expected, the Mathematics Mastery intervention did not have any impact upon children’s performance on questions covering topics outside the Mathematics Mastery curriculum. Indeed, the estimated intervention effect is essentially zero (effect size = –0.003). In contrast, the intervention had a more pronounced effect upon material that was focused upon within the Mathematics Mastery curriculum (effect size = 0.100), just reaching statistical significance at the 5% level (t = 2.15; p = 0.04).’

  • The only analysis of the comparative performance of high and low attainers is tied to the parts of the test not requiring use of a calculator. It suggests a noticeably smaller effect in the top half of the attainment distribution, with no statistical significance above the 55th percentile. This is substantively different from the finding in the primary evaluation, and it raises the question of whether secondary Maths Mastery needs adjustment to make it more suitable for high attainers.
  • A process evaluation focused principally on 5 schools from the intervention group. Focus group discussions were held before the intervention and again towards the end. Telephone interviews were conducted and lessons observed. The sample was selected to include schools of different sizes, with different FSM intakes, and with both poor and good progress in maths according to their most recent inspection reports. One of the recommendations is that:

‘The intervention should consider how it might give more advice and support with respect to differentiation.’

  • The process evaluation adds further detail about suitability for high attainers:

‘Another school [E] also commented that the materials were also not sufficiently challenging for the highest-attaining children, who were frustrated by revisiting at length the same topics they had already encountered at primary school. Although this observation was also made in other schools, it was generally felt that the children gradually began to realise that they were in fact enjoying the subject more by gaining extra understanding.’

It is not clear whether this latter comment also extends to the high attainers!

A similar set of limitations is explored, in language close to that used in the primary report.

Under ‘Interpretation’ the report says:

‘Although point estimates were consistent with a small, positive gain, the study did not have sufficient statistical power to rule out chance as an explanation. Within the context of the wider educational literature, the effect size reported (less than 0.10 standard deviations) would typically be considered ‘small’…

But, as in the primary report, it detects ‘some promise’ on the same grounds. There is a similar speculative reference to longitudinal evaluation.


Press releases and blogs


EEF press release

There is a certain irony in the fact that ‘unlucky’ Friday 13 February was the day selected by the EEF to release these rather disappointing reports.

But Friday is typically the day selected by communications people to release educational news that is most likely to generate negative media coverage – and a Friday immediately before a school holiday is a particularly favoured time to do so, presumably because fewer journalists and social media users are active.

Unfortunately, the practice is at risk of becoming self-defeating, since everyone now expects bad news on a Friday, whereas they might be rather less alert on a busier day earlier in the week.

On this occasion Thursday was an exceptionally busy day for education news, with reaction to Miliband’s speech and a raft of Coalition announcements designed to divert attention from it. With the benefit of hindsight, Thursday might have been a better choice.

The EEF’s press release dealt with evaluation reports on nine separate projects, so increasing the probability that attention would be diverted away from Maths Mastery.

It led on a different evaluation report which generated more positive findings – the EEF seems increasingly sensitive to concerns that too many of the RCTs it sponsors are showing negligible or no positive effect, presumably because the value-for-money police may be inclined to turn their beady eye upon the Foundation itself.

But perhaps it also did so because Maths Mastery’s relatively poor performance was otherwise the story most likely to attract the attention of more informed journalists and commentators.

On the other hand, Maths Mastery was given second billing:

‘Also published today are the results of Mathematics Mastery, a whole-school approach which aims to deepen pupils’ conceptual understanding of key mathematical ideas. Compared to traditional curricula, fewer topics are covered in more depth and greater emphasis is placed on problem solving and encouraging mathematical thinking. The EEF trials found that pupils following the Mathematics Mastery programme made an additional month’s progress over a period of a year.’



EEF blog post

Later on 13 February EEF released a blog post written by a senior analyst which mentions Maths Mastery in the following terms:

‘Another finding of note is the small positive impact of teaching children fewer mathematical concepts, but covering them in greater depth to ensure ‘mastery’. The EEF’s evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only be evident in years to come as children are able to draw on their secure mathematical foundations to tackle more complex problems.’

EEF is consistently reporting a small positive impact but, as we have seen, this is rather economical with the truth. It deserves some qualification.

More interestingly though, the post adds (my emphases):

‘Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and which principles should underpin approaches on encouraging children in reading for pleasure are all issues that have important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.’

It will be interesting to monitor the impact of this work on the communication of outcomes from these particular evaluations.

It will be important to ensure that synthesis and dissemination is not at the expense of accuracy, particularly when ‘high stakes’ results are involved, otherwise there is a risk that users will lose faith in the independence of EEF and its willingness to ‘speak truth unto power’.


Maths Mastery Press Release

By also releasing their own posts on 13 February, Mathematics Mastery and Ark made sure that they too would not be picked up by the media.

They must have concluded that, even if they placed the most positive interpretation on the outcomes, they would find it hard to create the kind of media coverage that would generate increased demand from schools.

The Mathematics Mastery release – ‘Mathematics Mastery speeds up pupils’ progress – and is value for money too’ – begins with a list of bullet points citing other evidence that the programme works, so implying that the EEF evaluations are relatively insignificant additions to this comprehensive evidence base:

  • ‘Headteachers say that the teaching of mathematics in their schools has improved
  • Headteachers are happy to recommend us to other schools
  • Numerous Ofsted inspections have praised the “new approach to mathematics” in partner schools
  • Extremely positive evaluations of our training and our school development visits
  • We have an exceptionally high retention rate – schools want to continue in the partnership
  • Great Key Stage 1 results in a large number of schools.’

Much of this is hearsay, or else vague reference to quantitative evidence that is not published openly.

The optimistic comment on the EEF evaluations is:

‘We’re pleased with the finding that, looking at both our primary and secondary programmes together, pupils in the Mathematics Mastery schools make one month’s extra progress on average compared to pupils in the other schools after a one year “dose” of the programme…

…This is a really pleasing outcome – trials of this kind are very rigorous.  Over 80 primary schools and 50 secondary schools were involved in the testing, with over 4000 pupils involved in each phase.  Studies like this often don’t show any progress at all, particularly in the early years of implementation and if, like ours, the programme is aimed at all pupils and not just particular groups.  What’s more, because of the large sample size, the difference in scores between the Mathematics Mastery and other schools is “statistically significant” which means the results are very unlikely to be due to chance.’

The section I have emboldened – the claim that the difference in scores is ‘statistically significant’ and therefore ‘very unlikely to be due to chance’ – is in stark contrast to the EEF blog post above, which has the title:

‘Today’s findings; impact, no-impact and inconclusive – a normal distribution of findings’

And so suggests exactly the opposite.

I have already shown just how borderline the calculation of ‘statistical significance’ has been.

The release concludes:

‘Of course we’re pleased with the extra progress even after a limited time, but we’re interested in long term change and long term development and improvement.  We’re determined to work with our partner schools to show what’s possible over pupils’ whole school careers…but it’s nice to know we’ve already started to succeed!’


There was a single retweet of the Tweet above, but from a particularly authoritative source (who also sits on Ark’s Advisory Group).


Ark Press Release

Ark’s press release – ‘Independent evaluation shows Mathematics Mastery pupils doing better than their peers’ – is even more bullish.

The opening paragraph claims that:

‘A new independent report from the independent Education Endowment Foundation (EEF) demonstrates the success of the Mathematics Mastery programme. Carried out by academics from Cambridge University and the Institute of Education, the data indicates that the programme may have the potential to halve the attainment gap with high performing countries in the far East.’

The second emboldened statement – the claim about halving the attainment gap – is particularly brazen, since there is no evidence in either of the reports that would support it. It is only true in the sense that any programme ‘may have the potential’ to achieve any particularly ambitious outcome.

Statistical significance is again celebrated, though it is important to give Ark credit for adding:

‘…but it is important to note that these individual studies did not reach the threshold for statistical significance. It is only at the combined level across 127 schools and 10,114 pupils that there are sufficient schools and statistical power to determine an effect size of 1 month overall.’

Even if this rather implies that the individual evaluations were somehow at fault for being too small and so not generating ‘sufficient statistical power’.
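The mechanics of the combination Ark describes can be sketched as a fixed-effect inverse-variance meta-analysis. Here the secondary standard error (~0.047) is implied by the reported confidence interval, but the primary standard error (0.055) is an assumed value, so the numbers are illustrative only:

```python
# Hedged sketch of fixed-effect inverse-variance pooling of the two
# trial estimates. The primary SE (0.055) is an ASSUMED value for
# illustration; the secondary SE (~0.047) is implied by the reported
# CI. The report's own meta-analysis will differ in detail.
from math import sqrt

effects = [0.100, 0.055]   # primary, secondary effect sizes (SDs)
ses     = [0.055, 0.047]   # primary SE assumed; secondary SE implied

weights = [1 / se ** 2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = sqrt(1 / sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

print(round(pooled, 3), round(ci[0], 3), round(ci[1], 3))
```

Pooling shrinks the standard error below either trial’s own, which is how a marginally significant combined effect can emerge from two individually non-significant trials – and why the lower bound of the combined interval ends up only just above zero.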

Then the release returns to its initial theme:

‘… According to the OECD, by age fifteen, pupils in Singapore, Japan, South Korea and China are three years ahead of pupils in England in mathematical achievement. Maths Mastery is inspired by the techniques and strategies used in these countries.

Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, this could be a sustained impact. A 2 month gain every primary year and 1 month gain every secondary year could see pupils more than one and a half years ahead by age 16 – halving the gap with higher performing jurisdictions.’

In other words, Ark extrapolates equivalent gains – eschewing all statistical hedging – for each year of study, adding them together to suggest a potential 18 month gain.
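Ark’s arithmetic can be reconstructed, though the release shows no working; the sketch below rests on my own assumptions of seven primary years (Reception to Y6) and five secondary years to age 16:

```python
# Hedged reconstruction of Ark's extrapolation (my reading -- the
# release shows no working). ASSUMES the 2-month gain repeats in all
# 7 primary years (Reception to Y6) and the 1-month gain in the
# 5 secondary years to age 16 (Y7 to Y11).
primary_months = 2 * 7     # 14 months
secondary_months = 1 * 5   # 5 months
total_months = primary_months + secondary_months

print(total_months, round(total_months / 12, 2))  # 19 1.58
```

Only under these compounding assumptions does ‘more than one and a half years’ fall out; the trials themselves measured a single year’s dose.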

It also seems to apply the effect to all participants rather than to the average participant.

This must have been a step too far, even for Ark’s publicity machine.


[Image: capture of the original final paragraph of the Ark press release]


They subsequently changed the final paragraph above – which one can still find in the version within Google’s cache – to read:

‘…Because Maths Mastery is a whole school programme of training and a cumulative curriculum, rather than a catch-up intervention, we expect this to be a sustained impact.  A longer follow-up study will be needed to investigate this.’

Even in sacrificing the misleading quantification, they could not resist bumping up ‘this could be a sustained impact’ to ‘we expect this to be a sustained impact’.


[Postscript: On 25 February, Bank of America Merrill Lynch published a press release announcing a £750,000 donation to Maths Mastery.

The final paragraph ‘About Maths Mastery’ says:

‘Mathematics Mastery is an innovative maths teaching framework, supporting schools, students and teachers to be successful at maths. There are currently 192 Mathematics Mastery partner schools across England, reaching 34,800 pupils. Over the next five years the programme aims to expand to 500 schools, and reach 300,000 pupils. Maths Mastery was recently evaluated by the independent Education Endowment Foundation and pupils were found to be up to two months ahead of their peers in just the first year of the programme. Longer term, this could see pupils more than a year and a half ahead by age 16 – halving the gap with pupils in countries such as Japan, Singapore and China.’

This exemplifies perfectly how such questionable statements are repurposed and recycled with impunity. It is high time that the EEF published a code of practice to help ensure that the outcomes of its evaluations are not misrepresented.]  





Representing the key findings

My best effort at a balanced presentation of these findings would include the key points below. I am happy to consider amendments, additions and improvements:

  • On average, pupils in primary schools adopting Mathematics Mastery made two months more progress than pupils in primary schools that did not. (This is a borderline result, in that it is only just above the score denoting one month’s progress. It falls to one month’s progress if the effect size is calculated to three decimal places.) The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • On average, pupils in secondary schools adopting Mathematics Mastery made one month more progress than pupils in secondary schools that did not. The effect is classified as ‘Low’ and this outcome is not statistically significant. 
  • When the results of the primary and secondary evaluations are combined through meta-analysis, pupils in schools adopting Maths Mastery made one month more progress than pupils in schools that did not. The effect is classified as ‘Low’. This outcome is marginally statistically significant, provided that the 95% confidence interval is calculated to three decimal places (but it is not statistically significant if calculated to two decimal places). Care is needed in analysing meta-analysed findings because… [add explanation]. 
  • There is relatively little evidence that the primary programme is more effective for learners with lower prior attainment, but there is such evidence for the secondary programme (in respect of non-calculator questions). There is no substantive evidence that the secondary programme has a different impact on pupils eligible for free school meals. 
  • The per-pupil cost is relatively low, but the initial outlay of £6,000 for primary schools with 2FE and above is not inconsiderable. Mathematics Mastery may represent a cost-effective change for schools to consider. 
  • The evaluations assessed the impact of the programme in its first year of adoption. It is not appropriate to draw inferences from the findings above to attribute potential value to the whole programme. EEF will be evaluating the medium and long-term impact of the approach by [outline the methodology agreed].
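The sensitivity to decimal places in the meta-analysed result can be made concrete. The lower confidence bound below (0.004) is a hypothetical figure chosen for illustration, not one taken from the report:

```python
# Illustrative only: ci_low = 0.004 is a HYPOTHETICAL lower CI bound,
# chosen to show how rounding can flip apparent significance.
ci_low = 0.004

significant_3dp = round(ci_low, 3) > 0   # 0.004 -> excludes zero
significant_2dp = round(ci_low, 2) > 0   # 0.00  -> appears to touch zero

print(significant_3dp, significant_2dp)  # True False
```

A bound this close to zero is better reported with its exact value than with a binary ‘significant/not significant’ label.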

In the meantime, it would be helpful for Ark and Maths Mastery to be much more transparent about KS1 assessment outcomes across their partner schools and possibly publish their own analysis based on comparison between schools undertaking the programme and matched control schools with similar intakes.

And it would be helpful for all partners to explain and evidence more fully the benefits to high attainers of the Maths Mastery approach – and to consider how it might be supplemented when it does not provide the blend of challenge and support that best meets their needs.

It is disappointing that, three years on, the failure of the National Curriculum Expert Panel to reconcile their advocacy for mastery with stretch and challenge for high attainers – in defiance of their remit to consider the latter as well as the former –  is being perpetuated across the system.

NCETM might usefully revisit their guidance on high attainers in primary schools to reflect their new-found commitment to mastery, while also incorporating additional material covering the point above.



A summary of this piece, published by Schools Week, prompted two comments – one from Stephen Gorard, the other from Dylan Wiliam. The Twitter embed below is the record of a subsequent debate between us and some others, about design of the Maths Mastery evaluations, what they tell us and how useful they are, especially to policy makers.

One of the tweets contains a commitment on the part of Anna Vignoles to set up a seminar to discuss these issues further.

The widget stores the tweets in reverse order (most recent first). Scroll down to the bottom to follow the discussion in chronological order.




February 2015