The Politics of Setting

I had been intending never to revisit the difficult topic of setting, secure in the knowledge that I could not improve on my earlier treatment of the pros and cons.

Irrelevant picture of Norway by Gifted Phoenix

But recent developments have caused me to reconsider and to address the issue from a different perspective.

My previous post attempted an objective and balanced statement of the educational arguments for and against, drawing on the research evidence and taking account of all learners, regardless of their attainment.

This one explores how setting – just one option within the far wider range of so-called ‘ability grouping’ strategies – has been reflected in government policy and party political policy documents since 1997, culminating in the position we have reached as the parties begin to firm up their 2015 General Election manifestos.

The post begins with brief notes on terminology and incidence.

The substantive text is organised into four sections:

  • How Labour government positions on setting developed and fluctuated between 1997 and 2010.
  • How the Conservative party turned to setting while in opposition.
  • How Coalition Government policy on setting has shifted since May 2010.
  • How the Education Endowment Foundation’s evidence base treats setting, and what this means for disadvantaged learners.

It concludes with a summary of the position we have reached as we approach the next Election, together with some ideas for how we might move forwards more constructively.

In case you prefer to read selectively, I have included links to the relevant section from each of the bullet points above.

 

Terminology

I take setting to mean grouping learners in a discrete class, ostensibly selected with reference to prior attainment in a specific subject.

It is distinct from streaming, where the selection – which may well be generic and ability-based – applies to teaching across a range of different subjects. The learners in a higher stream may not be higher attainers in each of these subjects.

One sometimes also encounters references to banding, which is broadly synonymous with streaming, except that streaming tends to refer to a single class, while bands may include more than one class. Banding may therefore be a less differentiated form of streaming.

Both setting and streaming are within-school selection strategies, which may be adopted by selective or comprehensive schools. They may be perceived as viable alternatives to between-school selection, which is no longer regarded as politically feasible by Labour, the Conservatives or the Liberal Democrats.

There is, however, continuing pressure from the right of the Conservative party and recently from UKIP for the restoration of grammar schools. The Coalition government has opened up the prospect of satellite establishments and overseen the introduction of several selective post-16 institutions. This might be viewed as the thin end of the wedge.

It has not always been possible to isolate the approach to setting, since there is often a tendency to brigade it with streaming and/or a wider range of grouping strategies, occasionally including various approaches to within-class grouping.

Sometimes these distinctions are clear and sometimes they are less so. To take a prominent example, the relevant entry in the Education Endowment Foundation’s Teaching and Learning Toolkit is not a model of clarity.

Called ‘Setting or Streaming’, it discusses effectiveness initially in terms of ‘ability grouping’ (first paragraph).

Clarity is not improved by the inclusion of the American terms for streaming (tracking) and setting (regrouping).

Nor is it clear whether ‘ability grouping’ is intended as a synonym for ‘setting or streaming’ or whether it has a broader scope.

The second paragraph reverts to ‘setting or streaming’ before discussing a wider range of interventions targeted at gifted and talented learners including several accelerative measures. One of these – promotion – is not necessarily a grouping strategy, at least as I understand the term.

The next three paragraphs relate to low attainers. The third focuses on ‘ability grouping’, although there is one reference to ‘setting or streaming’; the fourth discusses both ‘setting’ and ‘ability grouping’, while the fifth mentions only ‘ability grouping’.

This terminological imprecision is confusing and unhelpful, especially when it appears in a text that purports to present the available research evidence clearly and unambiguously.

 

How prevalent is setting?

There are few recent and reliable statistics available on the incidence of setting.

Statistics deposited in the Commons Library in March 2012 (Dep 2012-0434) provide Ofsted data on the percentage of lessons observed in secondary schools that were either setted or streamed/banded for every year from 1996/97 to 2002/03, excluding PE.

In 2002/03, 40% of all secondary lessons observed were setted and 4% were streamed or banded.

From 2003/04 to 2010/11, the table provides percentages of lessons observed that were setted, streamed or banded, for ‘lower’, ‘average’ and ‘upper ability’ learners respectively.

In 2010/11, the average percentages across all year groups were 12% for average ability, 16% for lower ability and 17% for upper ability.

The reply to a PQ from July 2011 provides 2009/10 data, for maths, English and science in primary and secondary schools respectively. The percentages relate to ‘classes setted, streamed or banded by ability where pupils are placed within an ability range within the school’.

The average figures across all year groups are set out below. For primary schools I have included Year 6 percentages in brackets:

  • Maths primary 19% (34%)
  • English primary 11% (19%)
  • Science primary 2% (3%)
  • Maths secondary 71%
  • English secondary 58%
  • Science secondary 62%

A 2014 study of primary practice found that:

Approximately 17% of the pupils studied, who were born in 2000-2001, were in ability streams. Some 8% of the total group were in the top stream, 5% in the middle and 4% in the bottom stream.

Last year Ofsted estimated that, excluding PE, some 45% of secondary lessons were set or streamed. The TES story containing these figures notes:

‘The Department for Education was unable to produce statistics on how many students are set or streamed. Ofsted produced limited data based on lessons it had inspected… but stressed that “there is no way of using this data to draw out national conclusions in any way”….

…In comments accompanying Ofsted’s figures, Sir Michael noted that, since 2005, its inspections have not involved observing all teachers in a school. Lessons that were seen were not “necessarily representative” of the school or system as a whole, he said.

….”It is not possible to deduce from inspection data the proportions of pupils nationally who are taught in setted/streamed classes or in mixed-ability groups,” the chief inspector said.’

We can only conclude that a significant proportion of secondary students and older primary learners is setted, and that this practice is most prevalent in the core subjects. It is unclear whether these percentages are now increasing, stable or declining.

It would be highly desirable to obtain more accurate figures through the School Census, if only to track the influence of the presentation of the evidence base in the Toolkit.

 

Part 1: The evolution of Labour government policy from 1997 to 2010

 

First Labour Government

In 1997 the incoming Labour Government published its White Paper ‘Excellence in Schools’. The chapter on ‘Modernising the comprehensive principle’ said:

Mixed ability grouping… requires excellent teaching and in some schools has worked well. But in too many cases it has failed both to stretch the brightest and to respond to the needs of those who have fallen behind. Setting, particularly in science, maths and languages, is proving effective in many schools. We do not believe that any single model of grouping pupils should be imposed on secondary schools, but unless a school can demonstrate that it is getting better than expected results through a different approach, we do make the presumption that setting should be the norm in secondary schools. In some cases, it is worth considering in primary schools. Schools should make clear in reports to parents the use they are making of different grouping approaches. OFSTED inspections will also report on this.

The clear implication is that, where the quality of teaching is not excellent, setting is likely to prove relatively more effective than ‘mixed ability grouping’, particularly in science, maths and languages.

Setting will not be made compulsory in secondary schools, but there is a presumption that it should be ‘the norm’, presumably in all subjects but certainly in science, maths and languages, unless schools can show ‘better than expected results’ through a different approach. In primary schools, setting should be considered in some unspecified cases.

Ofsted will check what schools are doing (and presumably validate or otherwise any claim of ‘better than expected results’, although the precise meaning of this term is not explained).

The text also says that the Department will publish guidance and exemplification of best practice, taken from this country and abroad ‘in organising classes to meet the different abilities of pupils’. There is a list of strategies in which it has particular interest, including:

  • ‘target-grouping, where pupils are grouped by ability for part of the week and groups are altered in line with regular assessment;
  • fast-tracking, where pupils are encouraged to learn and take qualifications ahead of their age cohort.’

Early in 1999, Ofsted published a survey on setting in primary schools. I cannot source the text online, but contemporary reviews, such as this from the LGA, show that it was strongly supportive of the practice:

‘Setting, rather than streaming, in primary schools provides a powerful lever for raising standards, so long as it is carefully implemented and properly managed, say Her Majesty’s Inspectors from OFSTED.

A new survey of the practice of setting – grouping children by ability for specific subjects – uses evidence from OFSTED inspection data, from a questionnaire and from focused inspections by HMI. It endorses the government’s view that setting is well worth considering.

‘Where teachers understand its potential and modify their teaching techniques accordingly, setting can be a very successful way of organising teaching groups,’ HMI say in the report Setting in Primary Schools, published today by OFSTED.

They point out that setting does not, by itself, guarantee success in raising standards nor can it compensate for poor teaching. However, evidence from school inspections suggests that the quality of teaching in setted lessons in the three core subjects is slightly better than in lessons with the full ability range.’

This introduces two important themes – that the efficacy of setting is dependent on:

  • it being implemented and managed effectively and
  • the appropriate adaptation of teaching techniques.

A DfEE research report from September 2000, ‘Innovative Grouping Practices in Secondary Schools’, summarises the advantages and disadvantages of ability grouping more generally, but consists mainly of extended case studies of contemporary innovative practice.

The introduction sets the context thus:

The challenge now is to find ways of grouping pupils and developing pedagogy that capitalises on the advantages and minimises the disadvantages outlined above. In other words, how can schools develop grouping plans to achieve the best attainment outcomes for pupils while minimising any negative impact?

This rather more pragmatic approach reappears in subsequent guidance documents, but was set aside when government policy was articulated.

 

Second Labour Government

The 2001 Green Paper ‘Schools Building on Success’ reverts to a bullish reference to setting in the section on KS3:

‘We want to see further increases in the extent of setting within subjects including express sets to enable those who are capable of doing so to advance beyond the levels set for their age and to take Key Stage 3 tests early.’

But this does not survive into ‘Schools Achieving Success’, the White Paper published the same year, which makes no reference to setting specifically or to ‘ability grouping’ more generally.

A roughly contemporary PQ reply also hedges its bets:

‘The Government supports a flexible approach to pupil grouping, including setting by ability where appropriate’.

The sentence is vacuous because deliberately imprecise. Essentially it expresses the government’s preference for schools to decide their own approaches.

It seems that there is growing indecision over which line to take. Should the government opt for consistent and wholehearted endorsement, full devolution of responsibility to schools, or a middle path that focuses on developing and disseminating effective practice to meet the needs of different settings?

This is of course redolent of wider contemporary debate about the role of the government in determining education policy and practice.

Setting is not mentioned in the ‘Five Year Strategy for Children and Learners’ which appeared in 2004.

 

Third Labour Government

Setting makes a significant reappearance in the October 2005 White Paper ‘Higher Standards, Better Schools For All’:

‘Grouping students can help to build motivation, social skills and independence; and most importantly can raise standards because pupils are better engaged in their own learning. We have encouraged schools to use setting since 1997. Putting children in different ability groups within a class is commonplace in primary schools. Ofsted reports show that the proportion of Key Stage 3 lessons which are set has risen since 1997 to over a third now, with greater rises in English and maths. The significant majority of English, science and modern foreign language lessons in secondary schools, and about nine in ten maths lessons are already organised by setting.

It will continue to be for schools to decide how and when to group and set by ability. But we will encourage more schools to adopt such grouping and help them to learn from the innovative practices that some schools are already employing without lowering expectations for pupils in lower ability groups or limiting choices in the curriculum. We will publish, in the New Year, independent research into current best practice.’

The first emboldened point implies a consistency that is not fully reflected in the narrative above, in that the encouragement for setting seems to have waned somewhat between 2001 and 2004.

The second emboldened section makes it clear that schools remain free to determine their own approaches. The presumption in favour of setting has gone by the wayside and the government will focus instead on encouragement through the continuing promotion of innovation and best practice.

Shortly afterwards, the research report ‘The Effects of Pupil Grouping: Literature Review’ appeared.

Back in 2010 I summarised its key findings thus:

  • ‘No single form of grouping benefits all pupils and there is little attainment advantage associated with setting – ie no significant difference between setting and mixed ability classes in overall attainment outcomes across all pupils.
  • ‘At the extremes of attainment’ low-achieving pupils show more progress in mixed ability classes and high-achieving pupils show more progress in sets.
  • Lower sets tend to contain a disproportionate number of boys, pupils from those ethnic groups that tend to underachieve and pupils with SEN.
  • There are aspirational and behavioural disadvantages to setting, predominantly amongst lower attainers, and there is a correlation between disaffection and setting, particular for pupils in the lowest sets.
  • Higher sets are more likely to have experienced and highly-qualified teachers whereas lower sets experience more changes of teacher and are less likely to be taught by a specialist in the subject.’

A contemporaneous TES story argues that the report undermines the government’s position by offering too little support for setting:

‘Setting pupils by ability, one of the most widely-trailed parts of last week’s white paper, has few benefits, a study funded by the Department for Education and Skills has concluded.

There is no evidence that streamed or set classes produce, on average, higher performance than mixed-ability classes, said the report. It also found that setting pupils is already widespread, particularly in maths….

It says the debate between setting and mixed-ability teaching has become polarised and does not reflect what happens in schools where a wide range of ways of grouping pupils is used….

…The review concluded: “There are no significant differences between setting and mixed-ability teaching in overall attainment … but … low-achieving pupils show more progress in mixed-ability classes and high-achieving pupils show more progress in set classes.’

This provides the spur for a renewed effort to push beyond the polarised debate and to refocus on helping to develop solutions to fit particular needs and circumstances.

In 2006, DfES published ‘Pupil Grouping Strategies and Practices at Key Stage 2 and 3: Case Studies of 24 Schools in England’, a companion piece to the 2005 study.

The impact of grouping on pupil attainment was summarised thus:

  • ‘Schools identified that the use of setting enabled them to tailor teaching for different ability pupils in order to impact on their understanding and achievement. However, the research did not find evidence to corroborate these expected achievement gains.
  • In secondary schools that adopted mixed ability or part mixed ability grouping approaches, the rationale given by teachers and senior managers tended not to make reference to attainment but rather to focus on the benefits in terms of social awareness and inclusivity. 
  • In primary schools, which used mixed ability as the predominant organisational grouping, pupils were often seated around tables on the basis of ability and it was not possible to differentiate attainment outcomes that related directly to setting or mixed ability from these observations.’

So advocates of secondary setting could not demonstrate stronger attainment overall, while advocates of secondary mixed ability teaching were not primarily concerned with the impact on attainment.

In the primary sector it was not possible to distinguish a differential impact on outcomes from either option.

In September of the same year, the National Strategies produced ‘Grouping Pupils for Success’, useful guidance for schools deciding on the most appropriate grouping strategies.

The introduction says that it:

‘…moves on from the old ‘for and against’ debates about grouping to a more sophisticated understanding of what it means to group pupils for success.’

Suggestions relating specifically to setting include:

  • ‘Make a careful match of individual teacher strengths with the nature of sets, for example placing a teacher experienced in challenging low attainers with the lowest set or band, to lift attainment.
  • Avoid ‘teaching to the middle’ in mixed-ability classes.
  • Monitor pupils’ learning to ensure that pupils have opportunities to demonstrate higher attainment, for example in tiered papers in the National Curriculum tests, and that access to the curriculum and resources are not limited by assumptions about ability level.
  • Ensure that teaching in top sets creates a learning atmosphere in which it is acceptable to make mistakes, to ask for clarification or repetition.
  • Develop inclusive teaching approaches, for example through differentiated questioning or the use of within-class groupings.’

It summarises the research on setting and mixed ability grouping respectively in the two tables reproduced below.

 

[Table: summary of the research evidence on setting]

[Table: summary of the research evidence on mixed ability grouping]

 

‘Effective Teaching and Learning for Pupils in Low Attaining Groups’ (2007) takes the same line as the previous studies in arguing that:

‘…the polarisation of the grouping debate does not reflect the available evidence….Rather than pointing towards the overwhelming superiority of one form of grouping over another, it suggests that different forms of grouping are effective for different ‘types’ of pupils, in relation to different kinds of outcomes.’

But it continues:

‘The decision, therefore, about whether to group by attainment, either has to be seen as a matter of principle, where empirical evidence is of limited relevance, or else has to be regarded as one that is complex and may even be too close to call.’

Nevertheless, the authors contribute some further empirical evidence to the debate, notably concerning the characteristics of pupils in low attaining sets:

  • ‘The analysis of data on pupils’ allocation to groups confirms prior attainment as the main, albeit a relatively poor predictor of set placement, for example, with over half the pupils with low prior attainment in English ending up in middle or high sets. Although prior attainment remains statistically significant, setting decisions are clearly not made on this basis alone.’
  • ‘Social class is a significant predictor of set placement. Pupils from higher socio-economic status (SES) backgrounds are more likely to be assigned to higher sets and less likely to be assigned to lower sets.
  • Special Educational Need (SEN) is a significant predictor of set placement (after controlling for social class and prior attainment), with these pupils concentrated in the low attainment sets. Less than 10% of pupils in the highest sets have SEN. This suggests that SEN and low attainment are seen as closely related or overlapping and that set placement may also be confounded by the effect of behaviour.
  • Ethnicity was a weaker significant predictor of set placement, (after controlling for social class and prior attainment), with pupils of Bangladeshi origin being slightly less likely to be selected for the higher sets.
  • Gender was not a significant predictor of set placement (after controlling for social class and prior attainment), except in Key Stage 2 literacy where, against recent trends, females were more likely to be placed in a low set. Overall, males are slightly overrepresented in the low sets and under-represented in the middle sets but this difference was not statistically significant.
  • Other factors including teacher assessments, teacher judgements and pupil characteristics such as behaviour are likely to influence set placement. Some schools allocated pupils with behavioural difficulties to high sets irrespective of prior attainment because they believed that the classroom context provided in these groups would promote positive behaviour. Other schools allocated these pupils to lower sets because they were smaller and provided higher staff ratios.’

Also in 2007, ‘The Children’s Plan’ included a section entitled ‘Good classroom practices – better use of grouping and setting’.

Essentially this replicates the approach taken in the 2005 White Paper, though the drafting is far more convoluted and so far less clear:

‘Improved understanding of each child’s progress should also lead to more effective use of group teaching. Since 1997 we have been encouraging schools to use ‘setting’ (teaching groups of pupils by ability in a particular subject rather than across a range of subjects) and other forms of pupil grouping, and we continue to encourage these practices.

Using setting and groups to teach children of similar abilities and interests can bring real educational benefits. But where it is poorly implemented, for example through ‘streaming’ (where pupils are grouped across a range of subjects based on general rather than subject-specific assessment) it can be socially divisive and detrimental to all but the highest achieving pupils. Grouping can also be used more effectively in the classroom – in particular, through proven approaches to in-class grouping by need, and guided group work when the teacher coaches a small group to apply immediately what they have been learning in the main part of the lesson. We will promote this best practice as standard practice.’

Under this new formulation, there is recognition that there can be effective practice in setting and mixed ability grouping alike.

The final sentence potentially embodies a slightly different approach, by introducing the notion of promoting ‘standard practice’, but no further details are provided about what exactly this will entail.

Then Labour appears to lose interest in setting. A 2008 publication from DCSF ‘Personalised Learning: A Practical Guide’ includes a chapter on ‘Pupil Grouping’ but it says almost nothing about setting. It is as if the authors are keen to move on from what has become a rather sterile debate.

A PQ from March 2009 uses the Children’s Plan formulation:

‘Analysis of research suggests that no single model of pupil grouping will be of benefit to all pupils all of the time. For example, there is some evidence that being taught in a mixed ability class can be beneficial for low attainers, but that ability-based classes can be beneficial for high attainers.

We promote setting — the grouping of pupils according to their ability in a particular subject — as an effective way of ensuring that individual pupils are receiving personalised help appropriate to where they are in their learning. Similarly, we promote effective pupil grouping practices, and guided work, as tools for delivering the most appropriate curriculum to each individual in mixed ability classes.

We do not promote streaming—where pupils are assigned to classes on the basis of an overall assessment of their general ability and pupils remain in their streamed classes across the majority of subjects—as it assumes that children will have the same level of ability in all subjects.’

But, three months later, the 2009 White Paper ‘Your child, your schools, our future’ has nothing to say on the subject, and references to setting are conspicuously absent from the Pupil and Parent Guarantees. Ministers have apparently decided that schools are best left to their own devices.

Setting did not appear in Labour’s 2010 Election Manifesto either.

 

Part 2: The Conservatives in opposition

In January 2006, just at the time when Government research reports were discussing the polarised nature of debate on setting and advocating a more nuanced approach, David Cameron and David Willetts (then Tory education spokesman) both made statements that explicitly supported setting.

One report has Cameron saying:

‘I want no child held back, so my priority is not selection by ability between schools but setting by ability within schools, because every parent knows that a high quality education means engaging children at the right level.’

Another attributes to him the statement:

‘I want the Conservative Party to help me campaign in setting by each subject in every school so that we actually do what I think is common sense which is to help stretch the brightest pupils and help those who are in danger of falling behind…There’s a real case for more selection within schools rather than selection between schools.’

“The government is getting into a mess over the issues of selection and admissions.”

It seems that Cameron has identified support for setting as a means of distancing himself from calls within his party for the introduction of more selective schools.

Willetts said:

‘What I shall be looking for in the months ahead is how best to spread setting, and I would not rule out using central government more in this area…The evidence that setting works is powerful indeed, and yet you still have more than half of lessons not taught in sets, where you can target your teaching methods to children with a particular level of skill.’

Another report has a slightly different version:

We are not saying that an edict will go out from the Department for Education that schools are instructed to set in all circumstances but the empirical evidence is that it works.

I would not rule out ministers getting involved in the way schools organise setting, but our instincts are to cut back rather than add to central bureaucracy and direction.

One can see writ large the tension between the mutually exclusive desires for prescription and autonomy. Willetts is leaving open the possibility of central direction of some sort.

In the event, it was decided that Ofsted would be the enforcer. The November 2007 Conservative Green Paper ‘Raising the Bar, Closing the Gap’ made this clear:

‘While every pupil must be given the opportunity of a good education, we also recognise that each pupil should be given the opportunity to learn in accordance with their particular aptitude and ability, so that the brightest pupils continue to be stretched at the same time as pupils who might be struggling are given extra support.

We believe that setting by ability is the only solution to achieving this ambition. Labour’s 1997 manifesto acknowledged the importance of setting and implied that the amount of setting in schools would be increased significantly. This has not taken place.

… We believe that school children learn more effectively when taught with children of a similar ability. We also believe setting contributes to better behaviour. We will therefore alter guidance to Ofsted to ensure that schools – particularly those not performing at high levels – set all academic subjects by ability.’

Contemporary press reports remind us that Cameron had originally spoken of ‘a grammar school stream’ in every school, but streaming was set aside in favour of setting:

‘The Tories will now make clear that streaming need only apply in smaller schools with limited timetabling.’

Cameron has decided that setting is ‘the only solution’ and the inspection regime will impose this on all schools (the document is not clear whether primary schools are included). There is no explicit exemption for those ‘performing at high levels’ although they will be a lower priority.

This new position is a restatement of the Labour position of 1997.

Hansard shows that new opposition spokesman Michael Gove maintained an interest in the issue until at least summer 2009.

In May 2008 he requests the latest data on the extent of setting and DCSF’s guidance on the issue. Ofsted answers the first point while the reply directs him to the materials referenced above.

In July 2009 he again asks for updated data on the incidence of setting.

But the enforcement of setting through Ofsted was clearly set aside when the time came to consider the content of the 2010 Tory Election Manifesto.

For the time being at least, it seemed that the alternative attractions of full autonomy for schools had triumphed.

 

Part 3: The evolution of Coalition policy on setting

 

2010 to 2014

Indeed, the 2010 Schools White Paper made a virtue of schools’ autonomy in such matters.

‘We will expect schools to set their own improvement priorities. As long as schools provide a good education, we will not mandate specific approaches.

We…believe that it is often effective to incentivise improvement and innovative ideas, rather than to mandate a uniform approach.’

But it takes some time for any evidence of this approach to emerge in relation to setting.

A PQ from July 2011 asking for data and current guidance to schools elicits the statement that, while there is no guidance,

‘Case studies showing the effective use of setting in schools will be made available on the department’s website shortly.’

The task was not a high priority. Eight months later, in March 2012, the answer to the next PQ on the topic confirms that the material has now been published.

These case studies were not transferred to gov.uk, but have been archived and are still available.

The covering article, dated 26 April 2012, reads:

‘Setting and other forms of pupil grouping are ways of tailoring teaching and learning for mixed-ability classes which can help raise standards. When setting is done well it can be an effective way to personalise teaching and learning to the differing needs of groups of pupils.’

There are five case studies in all, two of secondary and three of primary schools. Each is no more than a page in length. Compared with some of the guidance produced by Labour administrations they are of relatively limited value.

But the issue was soon stirred up by the intervention of HMCI Wilshaw.

In September 2012, he referred to the issue obliquely, but in such a way that his comments could be interpreted to fit the very different political perspectives of the newspapers that carried them.

I can find no official Ofsted record of what he said.

One report offers this version:

‘Heads have got to make up their mind. If they want mixed-ability, then they have got to make sure there’s differentiated teaching. And we will be very critical when we inspect schools, particularly in the secondary sector, if we see mixed-ability without mixed-ability teaching.’

He added: ‘If you have got a youngster with low basic skills sitting alongside a youngster with Oxbridge potential, then it is really important that that’s taken into account.’

Another provides a slightly different interpretation:

‘”Where there are mixed-ability classes, unless there is differentiated teaching… it doesn’t work,” he said, adding that effective differentiated teaching was “hugely difficult” to achieve.

He said mixed-ability classes could be an “article of faith” for schools who were not concerned enough about good practice and were doing something “more akin to social engineering”. In those cases Ofsted inspections would be “very critical”.’

A third suggests that Wilshaw expressly put some distance between his remarks and the setting controversy:

‘“This is not a judgment on mixed ability as opposed to setting or streaming, it is saying where there are mixed ability classes unless there is differentiated teaching to groups of school children in the class, unless there are individual programmes of work, it doesn’t work,” he said.

“It is absolutely critical that if you have a youngster with low grades at school who struggles with literacy and numeracy sitting alongside a youngster with Oxbridge potential then it is really important that is taken into account and they are taught by people who are experienced in good teaching of mixed ability classes.”’

Conversely, a fourth was much more bullish:

‘Inspectors will now be critical of schools that do not differentiate between high and low achievers.

This could lead to schools falling into the new category of ‘requires improvement’ (which replaces the old ‘satisfactory’ description), or even being labelled ‘inadequate’…

Ofsted cannot force schools to adopt setting – grouping pupils according to their academic ability in single subjects – or streaming, where ability groups cover most or all subjects.

However, Sir Michael’s intervention is likely to make headteachers rethink their practice of mixed ability classes for fear of being marked down in future inspections.

‘It’s a combination of low expectations of what these youngsters can achieve, that their progress is not sufficiently tracked, and what I would call and have done ever since I have been a teacher the curse of mixed-ability classes without mixed-ability teaching,’ he said.

The former head said mixed-ability classes did not work ‘unless there is differentiated teaching to groups of schoolchildren in the class’ and ‘individual programmes of work’….

…Many schools had recognised this and ‘moved towards setting arrangements’, he said.’

It seems as though everyone heard what they wanted to hear.

Wilshaw’s fundamental point seems to echo the 1997 White Paper and some of Labour’s guidance material reviewed above.

His principal argument is that mixed ability settings require mixed ability teaching, and that effective mixed ability teaching is a difficult skill to master. He implies, but does not state explicitly, that teaching a narrower range of ability is comparatively easier.

He suggests that Ofsted will look askance at schools that adopt mixed ability teaching on ideological grounds, that cannot justify it in terms of their learners’ achievement, or whose quality of teaching is insufficient to support it.

Seven months later, in April 2013, a peculiar story appeared in the TES called ‘Conservatives abandon pledge to enforce ability grouping’:

‘The practice of grouping classes by ability has long had strong backing from the top. Ofsted, the education secretary, the prime minister and their Labour predecessors have all encouraged schools to use setting in more lessons.

But, despite their rhetoric, Conservative ministers have quietly dropped a pledge to enforce setting by ability…

… Last September, Ofsted chief inspector Sir Michael Wilshaw appeared to support the call, warning that some students were being held back by “the curse of mixed-ability classes without mixed-ability teaching”, adding that such teaching was “hugely difficult” to achieve.

But the government has now said that it does not advocate setting. “It is for schools to decide how best to organise teaching – including whether to group and set pupils by ability – as they know exactly what their students need,” a spokesman said.

And Ofsted says it “doesn’t have a view on whether setting or streaming is a good idea or not”. A spokeswoman for the inspectorate also revealed that Conservative ministers had not asked Ofsted to enforce setting.’

This is odd, since any Conservative adherence to the enforcement of setting would date back to 2007. I can find no more recent commitment than that. So why overtly drop a policy that no-one could reasonably have assumed the Conservatives still advocated?

It is as if ministers are determined to re-impose the position on autonomy reached back in 2010, which has been compromised by Wilshaw’s insistence on linking schools’ decisions to Ofsted’s assessment of their performance.

Given more recent history, it is also conceivable that ministers were on the receiving end of pressure from the Prime Minister’s Office to adopt a more interventionist approach. Perhaps this was their way of distancing education ministers from such pressure.

But Ofsted’s alleged neutrality on the question of setting was soon called into doubt again when, in June 2013, it published ‘The Most Able Students’.

This developed HMCI Wilshaw’s theme:

‘In around a third of the schools visited, students were taught mainly in mixed ability groups throughout Key Stage 3. Where setting by ability occurred at an early stage, this was usually only for mathematics. Sets were introduced at various times for English and science, but often only in the later stages of Key Stage 3.

For most other subjects, mixed ability classes were retained throughout Key Stage 3. In the very best schools, this did not appear to have a detrimental impact on students’ progress because the teaching was carefully planned and well matched to the most able students’ needs. In the less effective schools, the work was pitched at the level of the average-attaining students. It was not challenging enough for the most able and their progress was insufficient…

…It was evident in some of the schools visited that school leaders had responded to recent research findings about mixed ability teaching, particularly in Key Stage 3. Eight of the schools had moved recently towards grouping by ability, particularly in English, mathematics and science. Some other school leaders recognised that their earlier grouping arrangements had not always promoted the best outcomes for the most able students. They indicated that they were moving away from mixed ability teaching to setting, streaming or banding in most subjects. Schools’ data shown to inspectors during the visits indicated that these moves were beginning to have a positive impact on outcomes for the most able students.’ 

Although Ofsted may not have an official view on the desirability of setting, it is abundantly clear that schools are encouraged to consider it where the quality of mixed ability teaching provided is not sufficiently strong to secure commensurate outcomes for able learners.

The current iteration of the inspection handbook says:

‘Inspectors should consider how effectively pupils are grouped within lessons and across year groups. For example:

  • where pupils are taught in mixed ability groups/classes, inspectors will consider whether the most able are stretched and the least able are supported sufficiently to reach their full potential.
  • where pupils are taught in sets, inspectors will consider how leaders ensure that pupils in lower sets are not disadvantaged or that teachers take into account that pupils within a set may still have very different needs.’

The associated grade descriptors for an inadequate school mention:

‘The organisation of the curriculum and classes is resulting in some pupils achieving less well than they should.’

This carefully balanced approach makes it clear that inspectors will consider equally seriously the efficacy of sets and mixed ability groups.

Schools are likely to be pulled up if their mixed ability settings are insufficiently challenging for high attainers, but will also be challenged if their sets are holding back low attainers or if teaching within sets is insufficiently differentiated.

 

Developments in Autumn 2014

This careful balance was once more disturbed.

On 3 September, the Guardian reported that:

‘Compulsory setting according to ability in England’s secondary schools is to be proposed by the education secretary, Nicky Morgan, in her first big initiative since she took the role in July. She is due to make the announcement as early as today.’

This shock introduction was immediately undermined by a subsequent paragraph indicating that setting would not be compulsory after all, though it would be incentivised through the inspection regime:

‘It is expected that Morgan will ask the education watchdog, Ofsted, to implement and enforce the measure, probably by making it a condition of receiving an outstanding rating’.

So the strategy would be to prevent schools from receiving the top inspection rating if they had not adopted setting.

The piece also stated unequivocally that the policy had been ‘cleared with Downing Street’, although ‘The Department for Education gave no comment after being contacted.’

This implied that the source of the story was the Prime Minister’s Office.

Just six hours later, the same paper carried a second report featuring comments made by Morgan in a Commons debate that same afternoon:

‘Richard Fuller: The Secretary of State has faced a number of confusing interventions from Opposition Members, one of which repeated something that was said in The Guardian today, which was that she was about to announce a policy of compulsory setting. Will she take this opportunity to say whether she is going to do that?

Nicky Morgan: Let me confirm for the benefit of the House that there is absolutely no truth in those rumours. There are some people outside this House who have a rather unhealthy interest in speculating about what I am or am not about to announce. They would be better served if they spent less time on Twitter and talking to journalists, and more time reflecting on the importance of the policies and reforms that have already been implemented by this Government.’ (Hansard, 3 Sep 2014, Col 357)

So as not to appear entirely wrong-footed, the Guardian cited Dominic Cummings in support of its original story:

‘Gove’s former special adviser Dominic Cummings said he had been told Cameron wanted to back compulsory setting.

He added on Twitter: “I was told by No 10 and two others in Whitehall a version v close to the Guardian story. Some had warned internally it was mad.” He also suggested there was a launch plan prepared inside No 10.’

Cummings’ Twitter feed on the day in question is instructive:

[Embedded tweets from Dominic Cummings]

A BBC report included comment from the Lib Dems, confirming that they would not support a Coalition policy along these lines:

‘A senior Liberal Democrat source also distanced the party from any such proposal.

“This has not been agreed by the Liberal Democrats and is not government policy. We do not think it would be appropriate to tie schools’ hands in this way.”’

And Labour in opposition took a similar line:

‘Labour’s shadow education secretary Tristram Hunt had called on the education secretary to reject political involvement in such school decisions.

“I believe that excellent heads and great teachers know better than Westminster politicians how to deliver the best schooling for all pupils.

“We thought there was political consensus on the importance of school autonomy.”’

The Cummings version lends support to the idea that some sort of enforcement of setting remains under consideration for inclusion in the Conservative Election Manifesto for 2015.

It might once again help to pacify those in the Party who seek a renewed commitment to selective education. Conservative MPs will be acutely aware of UKIP’s declared policy:

‘Existing schools will be allowed to apply to become grammar schools and select according to ability and aptitude. Selection ages will be flexible and determined by the school in consultation with the local authority.’

I could find no explicit statement to the effect that a commitment to introduce setting would definitely not be in the 2015 Manifesto. The final paragraph of a related TES story claimed this was the case, but this is not supported elsewhere.

While there was no reference to setting in Morgan’s speech to the Conservative Party Conference, the idea has subsequently reappeared in a different guise.

On 12 October the Conservative party let it be known that their Manifesto would include plans to enable Regional Schools Commissioners to intervene directly in the operation of any school rated inadequate by Ofsted, whether or not an academy.

The briefing made an explicit link with setting:

‘A Conservative spokesperson said the new powers would be developed in “consultation with Ofsted and the Education Endowment Foundation”, but a “menu of options” might include forcing schools to put children into classes based on ability, or ‘sets’ as they are also known.’ (Academies Week).

So, rather than making setting a condition of an ‘outstanding’ Ofsted rating, this possible new approach is to empower RSCs to impose setting on inadequate schools.

Whether the inclusion of setting in the menu of options would survive the consultation process is open to question – and presumably RSCs would also be reluctant to impose it without hard evidence that it would radically improve the performance of an inadequate school. Such evidence would be hard to find.

Perhaps this is a method of parking the issue: giving No 10 the impression that enforcement of setting is part of the agenda when in fact it is not.

Meanwhile, the DfE has restated its existing commitment to giving schools autonomy in this matter. On 30 October, a Conservative MP tabled a PQ:

‘Andrew Rosindell (Romford):

To ask the Secretary of State for Education, what steps her Department is taking to ensure that children at secondary school are being efficiently grouped according to their academic ability.

Answered by: Mr David Laws

The Department for Education believes that individual schools are best placed to determine whether and how to group children by academic ability. There are many different models of pupil grouping, and schools themselves are best able to respond to their individual circumstances to meet the needs and capabilities of their pupils.’

Note that the reply refers to the DfE’s belief rather than the Government’s position.

This suggests that we may not have heard the last of the matter, especially if setting remains part of the Prime Minister’s strategy for buying off the siren voices calling for renewed commitment to grammar schools.

 

Part 4: The Education Endowment Foundation’s Evidence Base

The Education Endowment Foundation (EEF) exists to improve the achievement of disadvantaged learners. The website says:

‘We aim to raise the attainment of children facing disadvantage by:

  • Identifying and funding promising educational innovations that address the needs of disadvantaged children in primary and secondary schools in England;
  • Evaluating these innovations to extend and secure the evidence on what works and can be made to work at scale;
  • Encouraging schools, government, charities, and others to apply evidence and adopt innovations found to be effective.’

But, confusingly, it has also been designated jointly with the Sutton Trust as a What Works Centre for improving educational outcomes for all school age children:

‘The What Works centres will summarise and share research with local decision-makers, helping them to invest in services that deliver the best outcomes for citizens and value-for-money for taxpayers.

In the EEF’s case, decision-makers include teachers and school-leaders, parents and governors, researchers and policy-makers. They are the primary audience for our Teaching and Learning Toolkit, an accessible summary of educational research which provides guidance for teachers and schools on how to use their resources to improve the attainment of disadvantaged pupils.’

See the logical disconnect? The principal tool used by the EEF/Sutton Trust to inform decision makers about what works well with all learners has been designed to inform decisions about what works well with disadvantaged learners.

This is particularly problematic when it comes to setting.

 

The Teaching and Learning Toolkit

The EEF’s website describes the Toolkit as follows:

‘The Sutton Trust-EEF Teaching and Learning Toolkit is an accessible summary of educational research which provides guidance for teachers and schools on how to use their resources to improve the attainment of disadvantaged pupils.

The Toolkit currently covers 34 topics, each summarised in terms of their average impact on attainment, the strength of the evidence supporting them and their cost.’

One of the 34 topics is ‘Setting or streaming’. This pairing is potentially problematic since the subsequent commentary does not consistently distinguish the impact of one from the other.

I have already described above how the guidance switches between setting, streaming, ability grouping and wider gifted and talented provision.

When it comes to quantification, the Toolkit arrives at an average impact measure of -1 month – ie in terms of average pupil progress over a year, the impact of ‘setting or streaming’ on disadvantaged learners is negative.

The description of the Toolkit notes:

‘Most approaches included in the Toolkit tend to have very similar average impacts on pupils with different characteristics. However, where the research summarised suggests that an approach has a different average impact on the learning of pupils from disadvantaged backgrounds compared to the learning of their peers, the Toolkit’s ‘headline’ average impact figure refers to the former.’

The section describing the impact of ‘setting or streaming’ begins:

‘Overall, ability grouping appears to benefit higher attaining pupils and be detrimental to the learning of mid-range and lower attaining learners. On average, ability grouping does not appear to be an effective strategy for raising the attainment of disadvantaged pupils, who are more likely to be assigned to lower groups.’

It continues:

‘On average, studies show that higher attaining learners make between one and two additional months progress when set or streamed compared to when taught in mixed ability groups.’

No reference is made to the plight of disadvantaged high attainers, who might be expected to benefit commensurately.

The impact of setting and streaming remains undifferentiated.

The next section of the commentary considers a wider range of grouping interventions targeted on gifted and talented learners. This does not seem directly relevant to the narrower case of ‘setting or streaming’.

The final section of the commentary is concerned with low attainers (and so, by implication, includes the majority of disadvantaged learners).

It says:

‘Low attaining learners fall behind by one or two months a year, on average, when compared with the progress of similar students in classes without ability grouping. It appears likely that routine setting or streaming arrangements undermine low attainers’ confidence and discourage the belief that attainment can be improved through effort. Research also suggests that ability grouping can have a longer term negative effect on the attitudes and engagement of low attaining pupils. It should be noted that there are some exceptions to this average, where ability grouping has benefitted all learners. Further study could be undertaken to understand what happened differently in these examples.

Evidence suggests that the impact of setting is more detrimental to low attaining pupils in mathematics who do better in mixed attainment groups, and that ability grouping particularly affects upper primary and lower secondary education. The effects appear to be less clear-cut in other subjects, though negative effects are reported for low attaining pupils across the curriculum.

Though the average impact of ability grouping on low attaining pupils is negative, evidence suggests that certain types of ability grouping are more effective than others. Some studies have shown that reducing the size of the lowest attaining groups and assigning high-performing teachers to these groups can be effective, as can providing additional targeted catch up support.’

So the text suggests that high attaining learners make between one and two months more progress in sets/streams, while low attaining learners fall behind by the same amount. There is, therefore, between two and four months’ difference in the impact on high and low attainers respectively.

But this commentary:

  • Does not provide sufficiently precise information to distinguish the impact of setting alone from that of ‘setting or streaming’ combined.
  • Neglects the interests of high-attaining disadvantaged learners, who are assumed to be an insignificant minority.
  • Is fundamentally unclear, a particularly heinous crime considering the purpose of the Toolkit.

 

The KCL Study

One of the projects funded by the EEF is examining Best Practice in Grouping Students. The four-year project began in 2014 and continues through to spring 2018. It is co-ordinated by a team based at King’s College London and is receiving a total of £1.184m. The evaluation has been assigned to the NFER.

The project summary on EEF’s website distinguishes two parallel strands:

  • A randomised control trial of an intervention ‘which trains schools in a best practice approach to setting’, focused on English and maths in Years 7 and 8. The trial begins in September 2015, but results are not expected until spring 2018. It will be conducted in a sample of 120 schools, randomly allocated either to receive the intervention or to form part of the control group.
  • A pilot of an intervention ‘to introduce mixed ability teaching to secondary schools’ and ‘examine whether it is possible to overcome the common barriers to mixed ability teaching’. The intervention will be developed initially with three schools but the pilot will subsequently be extended to ten. Results are due in spring 2017.

One of the descriptions on the King’s College site suggests that the focus is explicitly placed on lower sets and low attainers:

‘The project addresses the needs of pupils in low ‘ability’ sets and streams, wherein research has identified socially disadvantaged pupils are strongly over-represented.

The project draws on substantial existing research evidence (concerning the educational outcomes for young people in low sets and streams, and the related poor practice often associated with such groupings), as illustrated in the Education Endowment Foundation/Sutton Trust Toolkit and elsewhere. The evidence from the literature concerning existing bad practice and detrimental outcomes associated with low ‘ability’ groups establishes areas for potential improvement, which will be applied via the interventions.’

It adds that the trial will measure the impact on pupil attainment, noting that the developmental phase:

‘Will also allow us to research why schools and policy-makers appear so wedded to ‘ability’ grouping, and what might persuade the adoption of an evidence-based approach.’

A second description confirms the focus of the setting strand on lower sets:

‘One, on Best Practice in Setting, seeks to remedy the detrimental elements identified by research to be associated with low sets.’

This also mentions that a pilot study – it is not clear whether this is of one strand or both – is being undertaken in six schools in the current academic year. It states that the full study will involve around 100 schools (rather than 120) and will be completed in 2017 (rather than spring 2018).

The exclusive emphasis on low sets is directly contradicted in a TES story about the project:

‘The Education Endowment Foundation, which commissioned the study, said the research aimed to address poor practices in both low and high sets.’

Is there a difference of opinion between KCL and the EEF? It would be helpful to know the truth, since there is otherwise strong reason to believe that the needs of high-attaining disadvantaged learners will be neglected.

NFER’s description of its evaluation currently says that the protocols for both strands are not yet agreed. Hopefully these will make clear whether the operation of higher sets – and the impact on disadvantaged learners within them – is also part of the agenda.

 

What Makes Great Teaching?

On 31 October 2014, the Sutton Trust published ‘What Makes Great Teaching?’

One assumes that it has done so in its role as partner with the EEF in the What Works Centre, rather than as a charity supporting social mobility.

The press release dictated much of the media coverage. It is provocatively headed:

‘Many popular teaching practices are ineffective, warns new Sutton Trust report’

It begins:

‘Lavish praise for students is among seven popular teaching practices not supported by evidence, according to a new Sutton Trust report which reviews over 200 pieces of research on how to develop great teachers.

What Makes Great Teaching, by Professor Rob Coe and colleagues at Durham University, warns that many common practices can be harmful to learning and have no grounding in research. Examples include using praise lavishly, allowing learners to discover key ideas by themselves, grouping students by ability and presenting information to students based on their “preferred learning style”.’

Later on the press release lists seven ‘examples of strategies unsupported by evidence’. Third in the list is:

  • ‘Grouping students by ability. Evidence on the effects of grouping by ability, either by allocating students to different classes, or to within-class groups, suggests that it makes very little difference to learning outcomes. It can result in teachers failing to accommodate different needs within an ability group and over-playing differences between groups, going too fast with the high-ability groups and too slow with the low.’

The Report itself does not include this list in its executive summary. It appears in a section called ‘Examples of Ineffective Practices’. But the text repeats more or less verbatim the claim in the press release:

‘The following are examples of practices whose use is not supported by the research evidence…

Group learners by ability

Evidence on the effects of grouping by ability, either by allocating students to different classes, or to within-class groups, suggests that it makes very little difference to learning outcomes (Higgins et al, 2014). Although ability grouping can in theory allow teachers to target a narrower range of pace and content of lessons, it can also create an exaggerated sense of within-group homogeneity and between-group heterogeneity in the teacher’s mind (Stipek, 2010). This can result in teachers failing to make necessary accommodations for the range of different needs within a supposedly homogeneous ‘ability’ group, and over-doing their accommodations for different groups, going too fast with the high-ability groups and too slow with the low.’

The first reference – to grouping by ability making ‘very little difference to learning outcomes’ – is to the Toolkit, though the report’s bibliography attributes it to 2013 not 2014. The second reference – ‘Stipek 2010’ – inexplicably appears in the bibliography under D rather than S.

As far as I can see, this is a reference to an article – an excerpt from a 2002 book called Motivation to Learn: Integrating Theory and Practice – that cites a series of other studies dating between 1976 and 1998.

Is the opening sentence an accurate description of what the Toolkit says?

As we have seen, the Toolkit considers ‘setting or streaming’ – though it also mentions a range of other strategies targeted at gifted and talented students – but it doesn’t discuss substantive evidence relating to within-class groups.

The only reference to them comes at the end of the Toolkit entry, in the section ‘What should I consider’. It says:

‘Flexible within-class grouping is preferable to tracking or streaming for low-attaining pupils’.

But that doesn’t support the statement above. (Nor, for that matter, is it supported by the evidence in the earlier parts of the Toolkit text.)

The differential impact of setting on high and low attainers is not mentioned.

How might this statement be improved to reflect the evidence? It might say:

  • When discussing evidence on the effectiveness of ability grouping, it is important to distinguish the impact of setting, streaming and within class ability grouping respectively.
  • It is also important to distinguish the differential impacts on high attainers and low attainers respectively. Great care should be taken to clarify whether the discussion relates to all learners or only to disadvantaged learners. The subset of low attainers ought not to be treated as synonymous with the subset of disadvantaged learners.
  • The evidence suggests the overall impact of setting or streaming – ie one or the other – on low attainers is negative (one to two months) whereas the impact on high attainers is positive (one to two months). There is therefore a difference of up to four months’ progress between high and low attainers respectively.
  • There is less evidence on the differential impact of setting and streaming respectively. What we do know is x.
  • The impact of setting varies according to prior attainment of the learners, the subject of study and how well it is implemented. The available evidence suggests that setting is most likely to be successful under the following conditions….and, conversely, is least likely to be successful when….

 

Where we are now – and future prospects

 

The Evidence Base

The arguments about the advantages and disadvantages of setting have long been polarised – and there is little evidence to suggest that this will change as we head into 2015.

The EEF/Sutton Trust nexus purports to stand for evidence-based pedagogy, but both arms of the partnership are insufficiently precise in how they present this evidence.

Because they take short cuts, it is too easy to interpret their coverage as overwhelmingly negative towards setting. A more careful and nuanced presentation would highlight the different contexts where setting might be more and less likely to operate effectively.

As things stand, the standard bearers for evidence-based practice seem more inclined to perpetuate the polarisation of views, rather than promoting a more sophisticated understanding of the issue.

This may be a deliberate reaction to the unevidenced espousal of setting by politicians, or it may just be insufficient attention to detail.

Substantive amendment of the Toolkit entry – along the lines set out above – is devoutly to be wished for.

And it should be accompanied by a commitment to produce and update accurate data about the incidence of setting by sector, type of school, subject and pupils’ prior attainment. The Schools Census is the perfect vehicle.

One hopes that the results from the KCL study will be more carefully presented, but the absence of evaluation protocols and the disagreements over the focus of the study are a cause for concern. The protocols should be finalised and published forthwith.

The KCL study is unlikely to reveal that best practice in setting has a substantial impact on improvements in the performance of disadvantaged learners, even the high attainers.

But everything is relative: hardly any of the studies of other interventions so far completed by the EEF have identified a significant positive effect.

I would be satisfied with confirmation of a limited but positive impact on the performance of disadvantaged high attainers, combined with recognition that any negative impact on disadvantaged low attainers can potentially be eliminated through effective practice.

Some recommendations for the implementation of hybrid approaches – perhaps combining a single top set with several parallel mixed ability groups – wouldn’t go amiss.

Any evidence that does emerge from the KCL study – positive or negative – will not appear until well after the 2015 Election.

For the future, we keenly anticipate the pronouncements on setting that will emerge from a College of Teaching and/or from the Liberal Democrats’ independent Education Standards Authority. There is no reason to believe that they will be any more inclined to withstand the ideological pressures than their predecessors.

 

The Policies

Labour and Liberal Democrat politicians seem wedded to the notion of full autonomy for schools, though their parallel enthusiasm for the new entities mentioned above might tend to undermine this position.

It is not clear whether schools would be protected as much from the setting-related pronouncements of a College of Teaching as they would from the predilections of a government minister.

As for the Conservatives, they seem caught on the horns of a dilemma. Do they too opt for autonomy and the virtues of the market, or do they get on the front foot and develop a more robust alternative to UKIP’s espousal of selection?

They could commit to more selective schools, if only of the post-16 variety. They might even push back the concept to 14+, perhaps enabling the strongest sixth form colleges to accept 14-16 year-olds just as FE colleges can.

They might develop a new cross-school support system for high attainers, especially disadvantaged high attainers. They need look no further than posts elsewhere on this blog for ideas as to how it might be constructed.

They should avoid at all costs the Sutton Trust open access wheeze, which directs a substantial taxpayer subsidy towards independent schools while denuding the state sector of high attaining learners.

Or they might continue to refine the idea of a grammar stream in every school.

The research evidence against streaming seems to me more damning than the evidence against setting, though this is often obscured by the tendency to roll them up together.

That said, several comprehensive schools operating in direct competition with grammar schools seem to have introduced a grammar stream. It would be possible to promote this practice in this subset of comprehensive schools – whether through Ofsted or otherwise – and to develop practical guidance on effective practice.

Setting might well remain the path of least resistance, although compulsory setting across the board would be too much of a straitjacket, restricting the flexibility of those schools that perform outstandingly with mixed ability teaching.

So some sort of selective imposition is necessary. The Ofsted inspection regime is the only effective lever remaining in the hands of central government. The Inspection Handbook might be altered to reinstate a presumption of the kind advanced by Labour in 1997 – and this might be weighted towards schools in either the higher or the lower performance categories. In either case the presumption might be confined to the core subjects.

But even this could only be carried forward in the teeth of opposition from the education profession, so would have the potential to reduce still further the quantum of Tory teacher votes.

The more recently suggested fall-back – adding setting to a menu of possible school improvement interventions managed through the Regional Schools Commissioners – is weak by comparison. So weak that it is tantamount to kicking the whole idea into the long grass.

There are precious few alternatives. Perhaps the only other realistic proposition is to revert to presentation of the evidence base, but to develop this into substantive guidance that schools are expected to consider before determining their approach to ability grouping – much more substantive than the half-hearted case studies published in 2012.

If it were my call, I would construct a ‘flexible framework’ quality standard that defines the broad parameters of effective practice while permitting schools significant flexibility over how they interpret those parameters in practice.

This would align with and reflect the available research evidence on the most effective approaches to setting, including advice on when setting is most likely to work and when it might be preferable to select an alternative strategy.

I would incorporate the standard into supplementary guidance for Ofsted inspectors, to ensure that inspection judgements on setting are fully consistent with it.

And I would strongly encourage schools to use it within their own improvement planning processes – and through peer-to-peer assessments undertaken by teaching schools, national leaders of education and other elements of the emerging self-improving schools system.

I would combine this with a national support programme for disadvantaged high attainers in Years 7-13, generously funded through matched topslices from the Pupil Premium and HE fair access budgets.

 

The Politics

With May and Johnson already on manoeuvres, not to mention continued pressure from the Brady/Davis camp, Cameron may be even more inclined to press ahead, even in the teeth of opposition from some DfE ministers.

In June 2013, polling suggested (p8) that 43% of all voters – and 66% of Tory voters – agreed that:

‘The Government should encourage more schools to select by academic ability and build more grammar schools’.

It seems highly likely that, without any viable alternative, the Tories will haemorrhage right wing votes to UKIP over this issue. But there is a trade-off with teacher votes. Are they capable of defining an acceptable middle way, or are they doomed to fall between two stools?

They might at any rate consider some of the ideas set out above.

 

GP

November 2014

 

Excellence Gaps Quality Standard: Version 1

 

This post is the first stage of a potential development project.

It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.

It aims to integrate two separate educational objectives:

  • Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
  • Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.

High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.

I have adopted new design parameters for this fresh venture into quality standards:

  • The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
  • The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe or they would otherwise be inadequate. I have erred on the side of minimalism for this first effort.
  • The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
  • The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.

The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.

There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others, given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)

The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.

The terminology will not necessarily be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. It is normal for primary schools to find some of the language more appropriate to secondary schools.

It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.

It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.
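To illustrate the ‘best fit’ idea, here is a minimal sketch of how a setting might record a judgement against each of the twelve elements and summarise the resulting profile. The element names follow the standard; the example ratings, and the idea of summarising them in this way, are mine rather than part of the standard itself.

    LEVELS = ["Emerging", "Improving", "Exemplary"]

    ELEMENTS = [
        "Performance", "Policy/strategy", "Classroom T&L",
        "Out of class learning", "Assessment/tracking",
        "Curriculum/organisation", "Ethos/pastoral",
        "Transition/progression", "Leadership, staffing, CPD",
        "Parents", "Resources", "Partnership/collaboration",
    ]

    def best_fit_profile(ratings):
        """Summarise how many elements sit at each level - a profile, not a score."""
        assert set(ratings) == set(ELEMENTS), "all twelve elements must be rated"
        return {level: sum(1 for r in ratings.values() if r == level) for level in LEVELS}

    # Invented example: a setting that is broadly 'Improving' but weaker on the
    # more heavily weighted performance element
    example = {element: "Improving" for element in ELEMENTS}
    example["Performance"] = "Emerging"
    example["Ethos/pastoral"] = "Exemplary"

    print(best_fit_profile(example))  # {'Emerging': 1, 'Improving': 10, 'Exemplary': 1}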

Some of the terminology will be wanting, some important references will have been omitted while others will be over-egged. That is the nature of ‘aunt sallys’.

Feel free to propose amendments using the comments facility below.

The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.

 

 

Emerging (RI): The setting meets essential minimum criteria.
Improving (G): In best fit terms the setting has progressed beyond entry level but is not yet exemplary.
Exemplary (O): The setting is a model for others to follow.

Performance

Emerging (RI): Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.

Improving (G): Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.

Exemplary (O): Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally. Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.

 

 

 

For the remaining eleven elements, two levels only are defined: Emerging (RI) – the setting meets essential minimum criteria – and Exemplary (O) – the setting is a model for others to follow.
Policy/strategy

Emerging (RI): There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored.

Exemplary (O): There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike. SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected. The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.

Classroom T&L

Emerging (RI): Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.

Exemplary (O): The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice. All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance. All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.

Out of class learning

Emerging (RI): A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.

Exemplary (O): A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate. All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled. Staff ensure that: learners’ needs are regularly assessed; they access and complete opportunities that match their needs; participation and performance are monitored and compiled in a learning record.

Assessment/tracking

Emerging (RI): Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.

Exemplary (O): Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance. Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals. Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness. All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work. Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.

Curriculum/organisation

Emerging (RI): The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.

Exemplary (O): The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this. Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs. Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility. Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.

Ethos/pastoral

Emerging (RI): The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.

Exemplary (O): The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally. Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem; hothousing is shunned. High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportional response to efforts to undermine this culture. Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed. The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.

Transition/progression

Emerging (RI): The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.

Exemplary (O): Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect. Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings. Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs. Destinations data is collected, published and used to inform monitoring.

Leadership, staffing, CPD

Emerging (RI): A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard). Professional development needs associated with closing excellence gaps are identified and addressed.

Exemplary (O): The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps. A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard). Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor. There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part. The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.

Parents

Emerging (RI): Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.

Exemplary (O): Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary. Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning. This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.

Resources

Emerging (RI): Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard). Where available, Pupil Premium is used effectively to support disadvantaged high achievers.

Exemplary (O): Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard). The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting. The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs. Settings should evidence their commitment to these principles in published material (especially information required to be published about the use of Pupil Premium).

Partnership/collaboration

Emerging (RI): The setting takes an active role in collaborative activity to close excellence gaps.

Exemplary (O): Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality. The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally. The setting uses collaboration strategically to build its own capacity and improve its expertise.

 


 

 

 

 

Those who are not familiar with the quality standards approach may wish to know more.

Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.

The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever-more elaborate performance tables and an inspection regime that depends on fallible human inspectors and documentation that regulates towards convergence when it should be enabling diversity, albeit within defined parameters.

I see more value in supporting institutions through best-fit guidance of this kind.

My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.

I have written about the application of quality standards to gifted education, and their benefits, on several occasions.

Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.

Rather they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.

Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.

 

GP

October 2014

16-19 Maths Free Schools Revisited: Odyssean Edition

This is the second edition of a post that marks the opening of two university-sponsored 16-19 maths free schools by taking a fresh look at the wider programme that spawned them.

Courtesy of Andrew J Hanson Indiana University

I have revised the text to reflect substantive comments provided by Dominic Cummings through the odysseanproject Twitter feed. Cummings was political adviser to former Secretary of State for Education Michael Gove until January 2014. He was instigator and champion of the maths free schools programme.

I feel obliged to point out that the inclusion of these comments does not constitute his endorsement or approval of the text. I have reserved the right to part company with him on matters of interpretation (rather than matters of fact) and have signalled where instances occur.

The post scrutinises developments since the publication of ‘A Progress Report on 16-19 Maths Free Schools’ (March 2013), building on the foundations within ‘The Introduction in England of Selective 16-19 Maths Free Schools’ (November 2011).

The broad structure of the post is as follows:

  • A description of the genesis of the programme and a summary of developments up to March 2013.
  • The subsequent history of the programme, from March 2013 to the present day. This reviews efforts to recruit more university sponsors into the programme – and to resist the publication of information showing which had submitted expressions of interest and, subsequently, formal proposals.
  • An assessment of the prospects for the programme at this point and for wider efforts to expand and remodel England’s national maths talent pipeline.

Since many readers will be interested in some of these sections but not others, I have included direct links to the main text from the first word of each bullet point above.

 

Genesis and early developments

Capital investment to support the programme was confirmed in the 2011 Autumn Statement, which referred to:

‘…an extra £600 million to fund 100 additional Free Schools by the end of this parliament. This will include new specialist maths Free Schools for 16-18 year olds, supported by strong university maths departments and academics’.

This followed an orchestrated sequence of stories fed to the media immediately prior to the Statement.

One source reported a plan to establish 12 such schools in major cities by the end of the Parliament (Spring 2015) ‘before the model is expanded nationwide’. These would:

‘…act as a model for similar institutions specialising in other subjects’.

Another confirmed the number of institutions, adding that there would be ‘…a special application process outside the regular free school application process…’

A third added that the project was viewed as an important part of the Government’s strategy for economic growth, suggesting that some of the schools:

‘…would offer pure maths, while others would combine the subject with physics, chemistry or computer sciences.’

Assuming provision for 12 schools at £6m a time, the Treasury had provided a capital budget of £72m available until 2015. It remains unclear whether this sum was ringfenced for university-sponsored maths schools or could be diverted into the wider free schools programme.

We now know that Cummings was behind the maths free schools project. But these original press briefings originated from the Treasury, showing that the Treasury was indeed committed to a 12-school programme within the lifetime of the Parliament.

 

 

The most recent edition of Cummings’ essay ‘Some thoughts on education and political priorities’ (2013) sets out the rationale for the programme:

‘We know that at the top end of the ability range, specialist schools, such as the famous Russian ‘Kolmogorov schools’…show that it is possible to educate the most able and interested pupils to an extremely high level…We should give this ~2% a specialist education as per Eton or Kolmogorov, including deep problem-solving skills in maths and physics.

The first English specialist maths schools, run by King’s College and Exeter University, have been approved by the Department for Education and will open in 2014. All of the pupils will be prepared for the maths ‘STEP’ paper that Cambridge requires for entry (or Oxford’s equivalent) – an exam that sets challenging problems involving unfamiliar ways of considering familiar material, rather than the formulaic multi-step questions of A Level.’

Back in February 2012, TES reported that:

‘The DfE has hosted a consultation meeting on the new free schools with interested parties from the mathematical community in order to outline its plans.’

‘TES understands that officials within the Department for Education are now keen to establish the schools on the model of Kolmogorov, a boarding school that selects the brightest mathematicians in Russia.’

Andrey Kolmogorov courtesy of Svjo

 

In fact, the meeting discussed a variety of international models and, on 20 February, Education Minister Nick Gibb answered a PQ thus:

‘Alex Cunningham: To ask the Secretary of State for Education when he expects the first free school specialising in mathematics for 16 to 18 year-olds to open; how many 16 to 18 year-olds he expects to enrol in free schools specialising in mathematics by 2015; with which universities he has discussed these free schools; and what guidance he plans to provide to people who wish to apply to open such a school.

Mr Gibb: We are developing proposals on how specialist maths schools for 16 to 18-year-olds might operate and will announce further details in due course. We are keen to engage with all those who have an interest to explore possible models and innovative ideas.’ (Col. 723W).

However, no proposals were published.

The minutes from King’s College London (KCL) Council meeting of 26 June 2012 reveal that:

‘Following approval by the Principal’s Central Team, the College was pursuing discussions with the Department for Education about sponsoring one of 12 specialist Maths schools for 16-18 year olds to be established with the support of university Mathematics departments. The initiative was intended to address national deficiencies in the subject and to promote a flow of highly talented students into university. In discussion, members noted that while the financial and reputational risks and the costs in management time needed to be carefully analysed, the project supported the College’s commitment to widening participation and had the potential to enhance the strengths of the Mathematics Department and the Department of Education and Professional Services, as well as addressing a national problem. The Council approved the College’s continued engagement with this initiative.’

By December 2012 KCL had announced that it would establish a maths free school, with both its maths and education departments involved. The school was scheduled to open in September 2014.

KCL confirmed that it had received from DfE a development grant plus a parallel outreach grant to support a programme for mathematically talented 14-16 year-olds, some of whom might subsequently attend the school.

The minutes of the University of Exeter Council meeting of 13 December 2012 record that:

‘As Council were aware, Exeter was going to be a partner in an exciting regional development to set up one of the first two Maths specialist schools with Exeter College. The other school would be led by King’s College London. This would cater for talented Maths students as a Free School with intake from four counties (Devon, Cornwall, Somerset and Dorset) with a planned total number of students of 120 by September 2017. The bid was submitted to the Department of Education on 11th December and the outcome would be announced in early January, with the school opening in 2014. It would be taught by Exeter College teachers with contributions from staff in pure and applied Maths in the College of Engineering, Mathematics and Physical Sciences (CEMPS), input from the Graduate School of Education and from CEMPS students as mentors and ambassadors. It was hoped that at least some of these talented students would choose to progress to the University. Council would be kept informed of the progress of the bid.’

In January 2013 a DfE press release announced approval of this second school. It would indeed have capacity for 120 students, with Monday-Thursday boarding provision for 20% (24 students), enabling it to recruit from across the four counties named above, so acting as a ‘regional centre of excellence’.

This project had also received a development grant – which we know was up to £300K – had agreement in principle to an outreach grant and also expected to open in September 2014.

There is also reference to plans for Met Office involvement with the School.

The press release repeats that:

‘The ultimate aim is to create a network of schools that operate across England which identify and nurture mathematical and scientific talent.’

A page added to DfE’s website in March 2013 invites further expressions of interest to open maths free schools in September 2014 and beyond.

Parallel Q and A, which has now been removed, made clear that development grants would not be available to new applicants:

‘Is there financial support available to develop our plans?

Not at the beginning. Once we have approved a proposal, we do offer some support to cover the costs of project management, and recruiting some staff before the school opens, in the same way we would for any Free School.’

This has subsequently been reversed (see below).

 

Progress since March 2013

 

The Hard Sell

While KCL and Exeter developed their plans, strenuous efforts were made to encourage other universities to participate in the programme.

A TES piece from May 2013, profiling the newly-appointed head of the KCL school, includes a quote from Alison Wolf – the prominent chair of the project group at KCL:

‘“The Brit School is a really good comparison,” she says. “When we were working on the new school and thinking about what to do, we’d look at their website.

“Maths is very glamorous if you’re a young mathematician, which is why they’ll do well when they are around other people who adore maths.”

The story adds that 16 schools are now planned rather than the original 12, but no source is attributed to this statement. Cummings says it is a mistake.

 

 

It seems that the wider strategy at this stage was to convince other potential university sponsors that maths schools were an opportunity not to be missed, to imply that there was already substantial interest from prominent competitors, so encouraging them to climb on board for fear of missing the boat.

 

Playing the Fair Access Card

But there was soon an apparent change of tack. In June 2013, the Guardian reported that education minister Liz Truss had written to the heads of university maths departments to encourage bids.

‘As an incentive to open the new schools, universities will be allowed to fund them using budgets otherwise reserved for improving access to higher education for under-represented and disadvantaged groups….

Les Ebdon, director of Offa, said: “I’d be happy to see more university-led maths free schools because of the role they can play in helping able students from disadvantaged backgrounds access higher education.

“It is for individual universities and colleges to decide whether or not this is something they want to do, but Offa is supportive of anything that is targeted at under-represented groups and helps them to fulfil their potential.”

…According to Truss’s letter, Ebdon confirmed it would be “perfectly legitimate to allocate funding ringfenced for improving access for under-represented groups towards the establishment of such schools,” counting the spending as “widening access”.’

My initial post had pointed to the potential significance of this coupling of excellence and equity as early as November 2011:

‘It is not clear whether a fundamental purpose of these institutions is to support the Government’s drive towards greater social mobility through fair access to competitive universities. However, one might reasonably suggest it would be an oversight not to deploy them…encouraging institutions to give priority during the admissions process would be the likely solution.’

What appeared to be Ministers’ rather belated conversion to the merits of alignment with social mobility and fair access might have been interpreted as opportunism rather than a sincere effort to join together two parallel strands of Government policy, especially since it had not been identified as a central feature in either KCL’s or Exeter’s plans.

But Cummings reveals that such alignment was intended from the outset.

 

 

I can find nothing on Offa’s website confirming the statement that funding ringfenced for fair access might be allocated by universities to the development of maths free schools. There is no contemporary press notice and nothing in subsequent guidance on the content of access agreements. This raises the question of whether Ebdon’s comments constitute official Offa advice.

I asked Cummings why it took so long to get the line from Ebdon and why that line wasn’t encapsulated in Offa guidance.

 

 

The Cummings view of the dysfunctionality of central government is well-known, but to have to wait nineteen months for a brief statement on a high-priority programme – with inevitably long lead times yet time-limited to the duration of the Parliament – must have been deeply frustrating.

It would seem that Offa had to be persuaded away from sympathy with the negative views Cummings attributes to so many vice chancellors – and that this required a personal meeting at ministerial level.

But this was a priority programme with strong ministerial backing.

 

 

One must draw one’s own private conclusions about the motivations and commitment of the key protagonists – I will not apportion blame.

The text of Truss’s letter is preserved online and the identical text appears within it:

‘I want to encourage other universities to consider whether they could run similar schools: selective, innovative and stretching our brightest and best young mathematicians. It is a logical extension of the role that dozens of universities have already played in sponsoring academies.

I also wanted to highlight to your colleagues that Professor Les Ebdon, Director of the Office for Fair Access, is enthusiastic about the role university led Maths Free Schools can have in encouraging more young people to go on to study maths at university, and to reap the benefits that brings. Professor Ebdon has also confirmed to me that he considers the sponsorship and development of Maths Free Schools as contributing to higher education ‘widening access’ activity, and that it would be perfectly legitimate to allocate funding ring-fenced for improving access for underrepresented groups towards the establishment of such schools.

Unlike our usual practice for Free Schools, there is no competitive application process for Maths Free Schools. Instead we ask interested universities to submit a short proposal setting out the key features of the school. These proposals need not be long: King’s and Exeter both submitted initial proposals that were around 12 pages…

[There follows a list of bullet points describing the content of these initial proposals, none of which address the admission of students from disadvantaged backgrounds.]

….Both King’s College and the University of Exeter had a number of detailed discussions with colleagues in the Department to develop and refine their proposals and we are always happy to work with universities to help them focus their plans before submitting a formal proposal. If we approve a proposal, we do then offer financial support to cover the costs of project management, and of recruiting some staff before the school opens, in the same [way] we would for any free school.’

(By way of an aside, note that the final emboldened sentence in the quotation above corrects the statement in the Q and A mentioned above. It seems that maths free schools are now treated comparably with all other free school projects in this respect, even though the application process remains different.

The latest version of free school pre-opening guidance gives the sum available in Project Development Grant for 16-19 free schools as £0.25m.)

Going back to Offa, there are no conditions imposed by Ebdon in respect of admissions to the schools, which seems a little over-relaxed, given that they might well attract a predominantly advantaged intake. I wonder whether Ebdon was content to offer personal support but refused to provide official Offa endorsement.

 

 

In July 2013 the BBC reported a speech by Truss at the 2013 ACME Conference. Oddly, the speech is not preserved on the gov.uk site. According to the BBC:

“We want this movement to spread still further,” she told delegates.

“So we’re allowing universities to apply to sponsor new maths free schools through a fast-track, simplified procedure, without having to go through the normal competitive application process.

“These schools will not only improve standards in maths teaching, but will equip talented young people from low-income backgrounds with the skills they need to study maths at university.”

Mrs Truss said the Office for Fair Access had confirmed that, when universities contributed to the sponsorship or development of maths free schools, this would be considered as one of their activities to widen access to under-represented groups – and therefore as part of their access agreement.

“I hope that this is the start of a new network of world-class free schools, under the aegis of top universities, helping to prepare talented 16- to 19-year-olds from any and every background for the demands of university study.”

Note that Ebdon’s endorsement is now Offa’s.

Cummings’ essay remarks in a footnote:

‘Other maths departments were enthusiastic about the idea but Vice Chancellor offices were hostile because of the political fear of accusations of ‘elitism’. Hopefully the recent support of Les Ebdon for the idea will change this.’

A year on, we have no evidence that it has done so. Cummings comments.

 

 

What that ‘not none’ amounts to – beyond references (reproduced later in this post) in KCL’s and Exeter’s access agreements – remains to be established for, as we shall see, it does not feature prominently in the priorities of either of their schools.

 

The Soft Sell

By the beginning of the following academic year, a more subtle strategy was adopted. The two schools-in-development launched a maths competition for teams from London and the South-West with prizes awarded by education ministers.

 

 

A November 2013 DfE press release marks the ceremony. Michael Gove is quoted:

‘We need specialist maths free schools like King’s College London (KCL) Maths School and Exeter Mathematics School. They will develop the talents of exceptional young mathematicians and ensure they can compete in the global race.’

The release continues:

‘The KCL and Exeter schools are the first to take advantage of a development grant made available by the Department for Education for the creation of university-led specialist maths free schools.’

The notes include a link to the 1 March webpage mentioned above for ‘Universities interested in developing their own maths free school’.

 

Publicity avoided

We now know that a Freedom of Information request had been submitted to DfE in October 2013, asking how many expressions of interest and firm proposals had been received, which institutions had submitted these and which proposals had been approved and rejected.

The source is an ICO Decision Notice published on 12 June 2014.

The request was initially rejected and this decision was upheld in January 2014 following an internal review. A complaint was immediately lodged with the Information Commissioner’s Office.

The Decision Notice records the Commissioner’s decision that public interest outweighs the case for withholding the information. Accordingly he directs that it should be released to the complainant within 35 calendar days of the date of the Notice (ie by 17 July 2014).

The Notice contains some interesting snippets:

  • ‘It has been the DfE’s experience that interested Heads of Maths have contacted it for further information before seeking to discuss the idea with their Vice Chancellor.’ There is no process for accepting formal expressions of interest.
  • ‘There are…no fixed criteria against which all proposals are assessed.’
  • ‘The DfE confirmed that the application is and has always been the first formal stage of the maths free schools process and it has already stated publicly that it has received three applications from King’s College London, Exeter University and the University of Central Lancashire.’
  • ‘It [ie DfE] confirmed that funding arrangements were only confirmed for the development of maths free schools in February 2014 and many policy decisions on this issue have been shaped by the specifics of the two schools that are due to open soon. It expects the policy to develop even further as more maths free schools are approved.’
  • ‘The DfE explained that universities are extremely risk adverse when it comes to protecting their reputation and so do not want to be publically named until they have submitted an application. As such, if they are named at an earlier point it may make them pull out altogether and may make universities unwilling to approach the DfE with ideas.’
  • ‘Similarly, the DfE argued that if it were to release the reasons why one of the applications was rejected it would be likely to deter future interest as the university would not want the public criticism of its ideas. Given that the policy is driven by university interest, if all potential groups are deterred the policy will fail and students will not be able to enjoy the potential benefits.’

The Commissioner gave these arguments short shrift, pointing out the benefits of transparency for policy development and the encouragement of more successful applications.

The text does not say so explicitly, but one can imagine the Commissioner thinking ‘given the low level of interest stimulated to date, you might at least try a more open strategy – what have you got to lose?’

It does seem unlikely that university heads of maths departments would submit speculative expressions of interest without internal clearance. Their approaches were presumably of the informal ‘sounding out’ variety. They would understand the shaky internal politics of failing to consult the corporate centre – not to mention their education faculties.

The lack of specific and transparent assessment criteria does appear to have backfired. What guarantees might universities otherwise receive that their proposals would be judged objectively?

One can imagine the questions:

  • Is the scheme open to all universities, Russell Group or otherwise?
  • If not, what criteria must the host university satisfy?
  • What counts as a ‘strong mathematics department?’
  • Can projects be led by university departments of education, or only undertaken jointly (as at KCL)?

Without explicit and consistent answers one can readily understand why many universities would be disinclined to pursue the idea.

Cummings disagrees strongly with this suggestion.

 

 

But I am still unconvinced. Personal experience of working with sceptical vice chancellors and their offices leads me to believe that some distinct parameters would have been better than none, provided that they were flexible parameters, in all the areas where ministers were genuinely flexible.

Some flagging up of ministerial preferences might also have been helpful, provided it was also made clear that ministers could be persuaded away from them by a strong enough bid with a different complexion.

Since ministers set so much store by the fair access dimension, and were acutely aware of the need to face down universities’ concerns about elitism, some explicit statement of the importance they attached to this dimension would not have gone amiss.

And the reference to bespoke solutions rings rather hollow when – as we shall see – the proposals from KCL and Exeter were so strikingly similar.

I suspect this difference of opinion boils down to ideology – our very different ideas about bureaucracy and how best to harness innovation. The point is moot in any case.

 

The reference to belated confirmation of funding arrangements – as recently as February 2014 – is intriguing. It cannot apply to capital funding, unless that was vired in extremis. I wondered whether it might relate to the parallel recurrent funding pot or simply the availability of project development grants.

The latter seems unlikely given the statement in the letter to HoDOMS, dated some eight months previously.

One suspects that there might have been internal difficulties in ringfencing sufficient recurrent funding to honour proposals as and when they were received. Some prospective bidders might have baulked on being told that their budget could not be confirmed until a later date.

But the eventual resolution of this issue a little over a year before the end of the spending round would be unlikely to have a significant impact on the number of successful bids, especially if unspent capital funding has to be surrendered by Spring 2015.

Cummings throws some light on this issue.

 

 

It sounds as though there were internal pressures to integrate maths free schools into the 16-19 free schools programme, where levels of bureaucracy might have caused further delay.

But these comments tend to play down the budgetary issue flagged up to the ICO. Although it might have been strictly correct that ‘funding arrangements were only confirmed for the development of maths free schools in February 2014’, the associated suggestion that this had been a significant factor holding up the approval of further projects seems rather more suspect.

 

Recent developments

In July 2014 the TES revealed that it had been the source of this FoI request.

 

 

But the story itself reveals little new, other than that:

‘Five further expressions of interest have been made but not yet yielded an application’

The sources of these EoIs are not listed, even though they must have been divulged to the paper by this point.

David Reynolds opines that:

‘Having a small number of schools doesn’t matter if we can get the knowledge from them around the system. So we need them to be excellent schools and we need to somehow get that knowledge around.’

A DfE statement concludes:

‘We continue to welcome applications and expressions of interest from universities and the first maths free schools, set up by two leading universities, will be opening in September.’

So we know there have been eight expressions of interest, three of them converted into firm proposals.

The receipt of the third proposal, from the University of Central Lancashire (UCLan), is said to have been made public, but I can find no record of it in the lists of Wave 1 to 7 free school applications so far released, or anywhere else for that matter. (KCL and Exeter were both included in Wave 3.)

There is a reference in UCLan’s 2013-14 access agreement dated 31 May 2012:

‘The University is currently consulting on the formation of a Maths Free School which would be run alongside its new Engineering Innovation Centre at the Preston Campus.’

Nothing is said about the plans in the access agreements for 2014-15 and 2015-16.

There is one further reference on the New Schools Network site to a:

‘Consultant engaged to carry out a feasibility study re a Maths Free School on behalf of the University of Central Lancashire (UCLan)’.

One assumes that this must be out-of-date, unless UCLan is considering a second bid.

Otherwise, a simple process of elimination tells us that UCLan’s proposal must have been rejected. The reason for this is now presumably known to TES, as are the sources of the five expressions of interest that were not converted into proposals. Why have they not published this information?

Perhaps they are waiting for DfE to place these details on its website but, at the time of writing – almost three months after the Decision Notice was issued – nothing has been uploaded.

Meanwhile, there are no further maths free school proposals in the most recent Wave 7 information relating to applications received by 9 May 2014.

The deadline for Wave 8 is imminent. That may well be the last wave on this side of the Election.

Cummings reveals that there is a fourth proposal in the pipeline which is not yet ready to be made public.

 

 

One assumes a September 2015 start and we must wait to see whether it catches Wave 8.

We discussed the relationship of this proposal to the evidence submitted to the ICO. We do not know whether it features among the five expressions of interest but it might be supernumerary. Cummings is at pains to justify a cautious approach to FoI requests.

 

 

He is content to release details only at the point where development funding is committed.

So, assuming DfE is pursuing the same strategy, one can reasonably conclude that development funding has not yet been agreed for this fourth proposal. Although it has progressed beyond the status of an expression of interest, it is not yet an approved application.

Almost nine months have passed since Cummings left the Department, yet negotiations have not reached the point where development funding is confirmed. This must be a complex and sensitive negotiation indeed! Perhaps there is a Big Fish on the end of this particular hook…or perhaps the host university has cold feet. We must wait and see.

A further feature published by the TES in October 2014 throws no fresh light on these matters, though it carries a quote from new Secretary of State Nicky Morgan, interviewed at the KCL School launch:

‘I think that some [universities] are clearly waiting to see how the King’s and Exeter schools go. Clearly there is a huge amount of effort required, but I think King’s will be enormously successful, and I am hoping they will be leading by example.’

That sounds suspiciously like a tacit admission that there will be no new proposals before a General Election.

Another opinion, diametrically opposed to David Reynolds’ view, is contributed by the head of the school of education at Nottingham University, who is also Deputy Chair of ACME:

‘“I’m very supportive of more people doing more maths, but even if you have 12 schools, you are really scratching the surface,” said Andrew Noyes, head of the school of education at Nottingham University and a former maths teacher.

“These kinds of policy experiments are very nice and they’re beneficial for a certain number of young people, but they’re relatively cheap compared with providing high-quality maths education at every stage in every school.”’

So what are the prospects for the success of the KCL and Exeter Schools? The next section reviews the evidence so far in the public domain.

 

The KCL and Exeter Free Schools

 

KCL School

The KCL School opened in September 2014 with 68 students, against a planned admissions number of 60. The most recent TES article says that there were 130 applicants and nearly all of those successful were drawn from state schools.

However, another reliable source – a member of the governing body – says that only 85% (ie 58) are from maintained schools, so the independent sector is actually over-represented.

He adds that:

‘Many are from families where neither parent has attended university’

but that is not necessarily an indicator of disadvantage.

We also know that some 43% (29 students) were female, which is a laudable outcome.
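
As a quick sanity check on these proportions, the back-of-envelope arithmetic runs as follows. This is only a sketch in Python: the 68, 85% and 29 figures are those reported above, while the roughly 7% national independent-sector share is my own approximation, not drawn from the sources.

    # Back-of-envelope check on the reported KCL intake figures.
    cohort = 68                                # students admitted in September 2014
    maintained = round(0.85 * cohort)          # the governor's 85% figure -> 58
    independent = cohort - maintained          # -> 10 students
    girls = 29                                 # reported number of female students
    print(round(100 * independent / cohort))   # 15 (%), roughly double the ~7% of
                                               # pupils educated independently nationwide
    print(round(100 * girls / cohort))         # 43 (%), the 'some 43%' quoted above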

The School is located in Lambeth Walk, some distance from KCL’s main campuses. The capital cost of refurbishing the School was seemingly £5m. It occupies two buildings and the main building is shared with a doctor’s surgery.

My March 2013 post summarised KCL’s plans, as revealed by material on the University’s site at that time, supplemented by the content of an information pack for potential heads which is no longer available online.

I have reproduced the main points below, to provide a baseline against which to judge the finished article.

  • The full roll will be 120, with an annual admission number of 60. Potential applicants must have at least 5 GCSE grades A*-C including A*/A in both maths and physics or maths and dual award science.
  • Other admissions criteria will probably include a school reference, ‘our judgement about how much difference attending the school will make to your future based on a number of factors, including the results from an interview’ and the results of a test of aptitude for problem-solving and mathematical thinking.
  • The headteacher information pack adds that ‘the school will also be committed to recruiting a significant proportion of students from socially disadvantaged backgrounds, and to an outreach programme… to further this objective.’
  • All students will take Maths, Further Maths and Physics A levels. They will be expected to take STEP papers and may take a further AS level (an FAQ suggests this will be an Extended Project). Every student will have a maths mentor, either an undergraduate or ‘a junior member of the maths department’.
  • They will also ‘continue with a broad general curriculum, including other sciences, social science, humanities and languages, and have opportunities for sport and the visual and performing arts.’ Some of this provision will be ‘delivered through existing King’s facilities’. The provisional timetable assumes a 40-hour working week, including independent study.
  • The University maths department ‘will be closely involved in curriculum development’ and academics will have ‘regular timetabled contact’, potentially via masterclasses.
  • There will be strong emphasis on collaboration with partner schools. In the longer term, the school ‘intends to seek independent funding for a larger CPD programme associated with the school’s curriculum and pedagogy, and to offer it to a wide range of  schools and students, using school premises out of hours’.

At the time of writing, the KCL Maths School website does not have a working link to the admissions policy, although it can be found online.

As expected, 60 students will be admitted in September 2015. Minimum requirements are now:

‘A or A* in GCSE Mathematics or in iGCSE Mathematics

Either an A or A* in GCSE Physics or iGCSE Physics, or an AA, A*A or A*A* in GCSE Science and GCSE Additional Science, or an A or A* in all three Physics modules contained within the GCSE Science, Additional Science and Further Additional Science qualifications; and

A*-C grade in 5 other GCSEs or other qualifications that count towards the Key Stage 4 performance tables compiled by the Department of Education, normally including English language.’

So the minimum requirement has been stiffened to at least seven GCSEs, or equivalent, including A*/A grades in maths and physics and at least a C in English language.

The application process does indeed include a reference, an aptitude test and an interview.

The test is based on KS3 national curriculum material up to Level 8, containing ‘routine and less familiar problems’. Some specimen questions are supplied.

The latest TES story says there are two interviews but this is wrong – there is one interview but two interview scores.

Cummings queries this point.

 

 

I can no longer check the original admissions policy to establish whether there was exceptionally provision for two interviews for admission in 2014, but all the other material I have seen – including the admissions policy for 2015 – refers to a single interview.

One of the two scores is ‘to assess to what extent the school is likely to add value in terms of making a difference to [candidates’] future careers’ but there is no explicit reference to priority for disadvantaged students anywhere in the admissions policy.

Indeed, the section headed Equality and Diversity says:

‘All places at King’s College London Mathematics School are offered on the basis of academic ability and aptitude.’

This does not amount to a commitment to recruit ‘a significant proportion of students from socially disadvantaged backgrounds’, as stated in the headteacher information pack.

A deputy headteacher information pack published in November 2013 had already rowed back from this, simply stating that:

‘Students will be recruited from a wide variety of backgrounds.’

The reasons for such backtracking remain unclear. Perhaps it was only ever insurance against accusations of elitism that never actually materialised.

The website confirms that all students take A levels in maths, further maths and physics, together with an AS EPQ. But now they can also take an optional AS level in computing in Year 12 and may convert it to an A level in Year 13. They will also take either the AEA or STEP papers.

The description of additional curricular provision is somewhat vague. Students will have a series of lessons and educational visits. Each fortnight a KCL lecturer will introduce a new theme, to be explored through ‘mini research projects’. Students will also learn a modern language, though to what level is unclear.

A mentor will be assigned to support work for the EPQ. There will also be a maths mentor – always an undergraduate, never ‘a junior member of the maths department’ – available for one meeting a week.

Tuesday afternoons seem to be set aside for sport and exercise. Visual and performing arts will be explored through extra-curricular activity, though this is currently aspirational rather than real:

‘…the school hopes to have sufficient interest to form a student choir, orchestra and dramatic society.’

The length of the school day is six hours and 55 minutes, with five hours of lessons (though the FAQ implies that students will not have a full timetable).

The present staff complement is 10, six of whom seem to be teaching staff. The head was formerly Head of Maths at Highgate School.

Outreach continues for students in Years 10 and 11. There is also a CPD programme for those new to teaching further maths. This is funded by a £75,000 grant from the Mayor’s London Schools Excellence Fund and supports 30 teachers from six schools spread across five boroughs.

KCL’s Access Agreement for 2015/16 says:

‘King’s College London Mathematics School aims to increase substantially the number of young people with the right levels of mathematical attainment to study STEM subjects at top-rated universities. It also aims to improve access to high quality mathematical education at sixth form level and is targeting individuals from schools where such provision is not easily available (in particular, 11-16 schools and schools where further mathematics is not offered as part of the curriculum at A-level). The school has implemented an extensive outreach programme for pupils at KS4, aged 14-16, whereby pupils come to King’s College London for two hours per fortnight over a two-year period. Through this programme, the school will provide students with limited access [sic] to high quality sixth form provision the understanding and skills they need to prepare for A-levels in Maths and Further Maths should they decide to study them, and also to support applications to the maths school should they wish to make them.

The school has also just launched a programme of continuing professional development for maths teachers in London schools. The programme will run for two consecutive years, and will enable high-quality teaching of Further Maths for those new to teaching this A-level. One of the key aims of this programme is to improve take up and retention rates in A-level Further Maths, with a view to increasing numbers of well-trained applicants to STEM subjects at university.’

Exeter

The Exeter School also opened in September 2014, with 34 students, against a planned admission number of 30. Disappointingly, only seven are girls. Eleven (32%) are boarders. We do not know the number of applicants.

The School is located in Rougemont House, a Grade II listed building close to the University and College. The cost of refurbishment is as yet unknown.

Relatively few details of Exeter’s plans were available at the time I wrote my previous post. The January 2013 DfE press release revealed that:

  • As we have seen, the roll would be 120 students, 60 per year group, with boarding places available for 20% of them.
  • All students would take maths A level and the STEP paper and all would have 1:1 maths mentoring.
  • University academics would provide an ‘enrichment and critical thinking programme’.
  • The Met Office would be involved.

The 2014 admissions policy dates from September 2013. It indicates that the School will admit 30 students in September 2014, 50 in September 2015 and 60 in September 2016. It will not reach full capacity until September 2017.

Minimum entry requirements are:

  • A* in GCSE Mathematics
  • A or A* in double sciences or single science Physics (in 2015 computer science is also acceptable as an alternative)
  • At least 6 GCSEs at C grade or above, normally to include English Language at a grade B.

So Exeter is more demanding than KCL in respect of the grades required for both GCSE maths and English language, but the minimum number of GCSEs required is one fewer.

The policy says that the School will aim for allocated places to reflect the incidence of potential students across Devon (47%) and in the other three counties served by the school (Cornwall 23%, Somerset 23%, Dorset 6%) but they will not be selected on this basis. There is nothing in the admissions criteria to secure this outcome, so the purpose of this paragraph is unclear.

The selection process involves a written application, a reference, an interview and ‘a mathematics-based entry exam’, subsequently called an aptitude test. This is described in identical terms to the test used by KCL – indeed the specimen questions are identical.

The oversubscription criteria involve giving priority to ‘interview answers and the candidates’ potential to thrive and succeed on the course’.

Under ‘Equality and Diversity’ the document says:

‘EMS is committed to widening participation and broadening access to high quality mathematics education. As such, we will target our recruitment in areas which have high levels of deprivation and in schools for which provision is currently limited, such as those without 6th forms.

EMS will encourage applications from female students through targeted marketing and recruitment. However, there will be no positive discrimination for girls in the admissions criteria.’

The first statement is largely meaningless since neither residence in a deprived area nor attendance at a school without a sixth form is mentioned explicitly in the admissions criteria.

The second statement is reflected in the fact that only 20% of the inaugural cohort is female.

The document notes that boarding will be available for learners living more than an hour distant. The proportion of boarders in the first cohort is significantly higher than expected.

It adds that boarding fees will be payable (and published on the School’s website) but it is expected they ‘will be subsidised by a government grant and a private investor’. There will also be a limited number of means-tested full bursaries, the criteria for which will also be published.

At the time of writing neither fees nor subsidies nor bursary criteria are published on the open pages of the website. The policy also mentions a subsidised transport scheme but provides no details. This is unhelpful to prospective candidates.

Students take A levels in maths and further maths, plus an A level in either physics or computer science. They are also prepared for STEP papers. All students pursue one further AS level at Exeter College, selecting from a choice of over 30 subjects, with the option to complete the A level in Year 13. Amongst the 30 are several non-traditional options such as fashion and design, media studies and world development. The School is clearly not wedded to facilitating subjects!

In maths students will:

‘…collaborate with those in other mathematics schools and meet, converse and work with staff and students from Exeter University’s mathematics department. They will have access to mathematical mentors from the University who will provide 1:1 and small group support for individual development and project work.’

Maths mentors will be 3rd or 4th year undergraduates and sessions will take place fortnightly.

All students will have a pastoral tutor who will ‘deliver a curriculum designed to meet the students’ development needs’. Some extra-curricular options may also be available:

‘Several clubs and societies will exist within EMS, these will be established as a result of students’ own interests. In addition, Exeter College’s specialist facilities, learning centres and other services will be accessible to them. Students will join their friends and other students from the College for sporting and enrichment activities including, for example, structured voluntary work, theatre productions and the Duke of Edinburgh’s Award Scheme.’

I could find no reference to a University-provided enrichment and critical thinking programme or to Met Office involvement.

The Head of Exeter School was formerly a maths teacher and maths AST at Torquay Boys’ Grammar School. Other staff responsibilities are not enumerated, but the Contacts page mentions only one teacher apart from the Head.

Another section of the site says the School will be advertising for a Deputy and ‘teachers of Mathematics, Computer Science and Physics (p/t)’. So the original intention to deploy Exeter College staff seems to have been set aside. Advertisements have been placed for several posts including a Pastoral Leader and an Outreach and Admissions Officer.

An outreach programme is being launched and business links will be established, but there are no details as yet. There are links to a KS4/5 maths teachers’ network sponsored by the Further Maths Support Programme.

Exeter’s 2015/16 Access Agreement says:

‘The University and the College are already joint sponsors of the innovative new Exeter Maths School and are developing a strategic approach to outreach that supports both curriculum enhancement in local schools and progression for the students enrolled in the school. Together with the South Devon UTC, these two new education providers offer opportunities for innovative collaborative approaches to outreach in the region.’

This sounds very much a work in progress.

 

 

Comparing the two schools

My 2013 post observed:

‘From the information so far published, the Exeter project seems very close conceptually to the one at King’s, indeed almost a clone. It would have been good to have seen evidence of a fundamentally different approach.’

If anything, the two projects have grown even more similar as they have matured. To the extent that these are pilot institutions testing out a diversity of models, this is not entirely helpful.

Both schools are very small and KCL in particular offers a very restricted range of post-16 qualifications. There is a downside to post-16 education on this model – otherwise we wouldn’t be exercised about the negative effects of small sixth forms – though both projects make some effort to broaden their students’ experience and, as we have seen, Exeter includes some shared provision with Exeter College.

The admissions requirements and processes are almost identical. It is important to recognise that neither institution is highly selective, especially in terms of overall GCSE performance and, in this respect, the comparisons with Kolmogorov and other institutions elsewhere in the world are rather misleading.

This is not the top 2% that Cummings cited as the beneficiaries in his essay. Even in terms of mathematical ability, the intake to these schools will be relatively broad.

The expectation that all will take STEP papers may be realistic but, despite the use of an aptitude test, any expectation of universal success is surely over-optimistic.

Cambridge says STEP papers are ‘aimed at the top 5% or so of all A-level mathematics candidates’. Fewer than 1,500 students took the most popular Paper 1 in 2013 and, in 2014, over 20% of participants received an Unclassified grade.

Cummings queries my conclusions here,

and I have to admit that these are inferred from the evidence set out above. But, on the basis of that evidence, I would be surprised indeed if STEP results for these two schools exceed the national profile in 2016.

Cummings notes that approximately one third of those entered for STEP attend independent schools, meaning that roughly 1,000 of the 2013 cohort were in maintained institutions. There may be some marginal increase in state-funded STEP entry through these two schools, but the impact of MEI support elsewhere is likely to be more significant.
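
Set out explicitly, the sums behind that estimate are simple. A sketch only, using Cummings’ one-third share and the approximate 2013 entry quoted above:

    # Rough split of the 2013 STEP Paper 1 entry between sectors.
    entrants = 1500                     # 'fewer than 1,500' sat Paper 1 in 2013
    independent = entrants // 3         # roughly one third from independent schools
    maintained = entrants - independent
    print(maintained)                   # 1000 -> the 'roughly 1,000' maintained figure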

 

The priority attached to excellence is less pronounced than expected. But this is not matched (and justified) by a correspondingly stronger emphasis on equity.

Neither school gives priority within its admissions or oversubscription criteria to students from disadvantaged backgrounds. A major opportunity has been lost as a consequence.

Cummings responds.

There are questions to be asked here about just how tightly the universities were held to the specifications they agreed.

There is nothing about the admission of disadvantaged students in the KCL funding agreement (I can’t find Exeter’s). It would be interesting to know what exactly they set down in their proposals, as approved.

One suspects that some effort has been made to prioritise admissions from state schools, especially state schools without sixth forms, but all this is swept up into the interview scores: there is nothing explicit and binding. The fact that 15% of the KCL intake has come from the independent sector shows that this is insufficient.

Comparison with the admissions policy for the Harris Westminster Sixth Form is instructive:

‘Applicants who have achieved the qualifying score will then be awarded points as follows:

  • One point for the applicant’s home address…if it is in an area of high deprivation, based on an independently published assessment of levels of deprivation of postcodes;
  • One point if they qualify for, or have previously qualified for, Free School Meals.

If Year 12 is oversubscribed then, after the admission of pupils with Special Educational Needs where the Harris Westminster Sixth Form is named on the statement, the criteria will be applied in the order in which they are set out below to those who have achieved a qualifying score:

a. Looked after and former looked after young people;

b. Applicants who have 2 points in accordance with the paragraph above;

c. Applicants who have 1 point in accordance with the paragraph above…’
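
To make the mechanics concrete, here is a minimal sketch of how such a points-based policy orders qualifying applicants. The applicant records and field names below are hypothetical illustrations, not Harris Westminster’s actual data or process:

    # Sketch of the oversubscription ordering quoted above.
    def points(applicant):
        p = 0
        if applicant.get('deprived_postcode'):   # independently assessed deprivation
            p += 1
        if applicant.get('free_school_meals'):   # current or previous FSM eligibility
            p += 1
        return p

    def priority_key(applicant):
        # Lower tuples sort first: looked after children, then 2 points, then 1, then 0.
        return (0 if applicant.get('looked_after') else 1, -points(applicant))

    qualifiers = [
        {'name': 'A', 'deprived_postcode': True, 'free_school_meals': True},
        {'name': 'B', 'looked_after': True},
        {'name': 'C'},
    ]
    for a in sorted(qualifiers, key=priority_key):
        print(a['name'])                          # B, then A, then C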

The funding allocations for academic year 2014/15 show that both maths free schools have been awarded zero free meals funding, suggesting that no pupils eligible for free school meals in Year 11 have been admitted.

So there is too little emphasis on excellence and equity alike. These institutions exemplify a compromise position which, while tenable, will reduce their overall impact on the system.

The only substantive difference between the two schools is that one is located in London and the other in a much more sparsely populated and geographically dispersed region. These latter conditions necessitate a boarding option for some students. The costs associated with boarding are not transparent, but one suspects that they will also serve as a brake on the recruitment of disadvantaged students.

Exeter has no real competitors in its region, other than existing sixth forms and post-16 institutions, but KCL faces stiff competition from the likes of the London Academy of Excellence and the Harris Westminster Sixth Form, both of which are much more substantial institutions offering a wider range of qualifications and, quite possibly, a richer learning experience.

Both Schools are designed to suit students who wish to specialise early and who are content with only limited opportunities to work outside that specialisation. That subgroup does not necessarily include the strongest mathematicians.

It might have been a different story if the Schools could have guaranteed progression into the most selective higher education courses, but this they cannot offer. There is no guaranteed progression even to the host universities (whose mathematics departments are not the strongest – one obvious reason why they were attracted to hosting maths schools in the first place).

Exeter and Kings no doubt expect that their Schools will help them to compete more effectively for prospective students – both through direct recruitment and, more indirectly, by raising their profile in the maths education sector – but they will not state this overtly, preferring to emphasise their contribution to improving standards system-wide.

There is no reference to independent evaluation, so one assumes that success indicators will focus on recruitment, a strong showing in the Performance Tables and especially Ofsted inspection outcomes.

A level performance must be consistently high and HE destinations must be commensurate. Because recruitment of disadvantaged students has not been a priority, fair access measures are largely irrelevant.

Other indicators should reflect the Schools’ contribution to strengthening the maths talent pipeline and maths education more generally, particularly by offering leadership at regional and national levels.

At this early stage, my judgement is that the KCL project seems rather better placed than Exeter to achieve success. It has hit the ground running while Exeter has some rapid catching up to do. One is good; the other requires improvement.

 

Future Prospects

 

Prospects for the maths school programme

With just seven months before Election Purdah, there is no prospect whatsoever that the programme will reach its target of 12 schools. Indeed it seems highly unlikely that any further projects can be brought to fruition before the end of the spending round, with the possible exception of the mysterious ’4th proposal’.

One assumes that the Regional Schools Commissioners are now responsible for stimulating and supporting new maths school projects – though this has not been made explicit – but they already have their hands full with many other more pressing priorities.

If Labour were to win the Election it seems unlikely that they would want to extend the programme beyond the schools already established.

Even under the Conservatives it would be extremely vulnerable given its poor track record, the very tight budgetary constraints in the next spending round (especially if schools funding is no longer ringfenced) and the fact that its original champions are no longer in place at DfE.

Cummings suggests that a further five schools might be a reasonable objective for the next Parliament, but only if the commitment within DfE is sustained.

Even that unlikely prospect would result in a network of only eight schools by 2020, four short of the original target that was to have been delivered five years earlier.

With the benefit of hindsight one might have taken a different approach to programme design and targeting. Paradoxically, the centre has appeared overly prescriptive – favouring a ‘Kolmogorov-lite’ model, ideally hosted by a Russell Group institution – but also too vague – omitting to clarify its expectations in a specification with explicit ‘non-negotiables’.

Universities were hesitant to come forward. Some will have had other fish to fry, some may have had reservations arising from fear of elitism, but more still are likely to have been unclear about the Government’s agenda and how best to satisfy it.

The belated decision to flag up the potential contribution to fair access was shutting the stable door after the horse had bolted. Other universities will have noted that neither KCL nor Exeter paid more than lip service in this direction.

Cummings rejects this analysis. For him the resistance from vice chancellors had a straightforward explanation.

According to his narrative, many university mathematicians were on the side of the angels, understanding the advantage to their departments of securing a bigger flow of undergraduates, far better prepared for university study.

But they were thwarted by the corporate centres in their institutions, the vice chancellors hamstrung by their fear of potential reputational damage, invariably associated with the charge of elitism.

Yet I have seen negligible evidence of media criticism of KCL and Exeter on these grounds, or any others for that matter.

The only occasion on which I have seen the term ‘elitism’ wielded is by the massed ranks of the Devon Branch of the NUT. Neither KCL nor Exeter has had to play the trump card of priority for disadvantaged students – indeed I have shown above how they have apparently rowed back from earlier commitments on this front.

We shall probably never know the truth since there are no records of these discussions – and I very much doubt whether any vice chancellors will read this and decide to put the record straight.

My own personal experience has been that, by and large, universities are reluctant to serve as instruments to further government education policy. Their knee-jerk reaction is more of the ‘not invented here’ variety and, even if they are given carte blanche, they remain highly suspicious of government motives. Fundamentally, there is an absence of trust.

An internal champion, such as Alison Wolf at KCL, can help to break this down.

 

There were also policy design issues. Because they were awarded a substantial capital budget – and were wedded to the value of free schools – ministers were driven to focus on creating new stand-alone institutions that might ultimately form a network, rather than on building the network itself.

Creating a set of maths hubs would have been the most sensible place to start, enabling new maths schools to take on the role of hubs when they were ready to do so. But the maths hubs were a later invention and, to date at least, there have been no efforts to ‘retro-fit’ the maths schools into the network, meaning that these parallel policy strands are not yet integrated.

 

Prospects for the national maths talent pipeline

England is far from having a coherent national strategy to improve maths education or, as one element within that, a convincing plan to strengthen the maths talent pipeline.

Maths education enjoys a surfeit of players with overlapping remits.

A host of national organisations is involved, including the Joint Mathematical Council (JMC), an umbrella body, the Advisory Committee on Mathematics Education (ACME), the United Kingdom Mathematics Trust (UKMT) and the School Mathematics Project (SMP).

This leaves to one side the maths-related element of broader programmes to support between-school collaboration, recruit teachers and develop new-style qualifications. There is a parallel set of equally complex relationships in science education.

Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline.

Ministers have been energetic in generating a series of stand-alone initiatives. The overarching vision has been sketched out in a series of set-piece speeches, but there is no plan showing how the different elements knit together to create a whole greater than the sum of its parts.

This probably has something to do with an ideological distaste for national strategies of any kind.

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

Cummings has me down as a hopeless idealist,

and who am I to contest his more recent and much more wide-ranging experience? I will only say that I can still recollect the conditions under which many such obstacles can be overcome.

 

courtesy of Jim2K

 

Last words

A network-driven approach to talent development might just work – I suggested as much at the end of my previous post – but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while as much as £60m may have to be surrendered back to the Treasury.
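
For clarity, the underlying sums are set out below. This is a sketch only: the £6m-per-school cost is the working assumption adopted elsewhere in this post (consistent with the £5m reported for the KCL refurbishment), not a confirmed figure.

    # Capital arithmetic behind the £72m / £12m / £60m figures (all in £m).
    per_school = 6                # assumed capital cost per school
    budget = 12 * per_school      # 72 - the pot announced for 12 schools
    spent = 2 * per_school        # up to 12 - KCL and Exeter together
    print(budget - spent)         # 60 - potentially surrendered to the Treasury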

We are better off than we would have been without the KCL and Exeter Schools, but two schools – or perhaps three – is a drop in the ocean. Even 12 schools of this size would have been hard-pressed to drive improvement across the system.

The failure to capitalise on the potential of these projects to support progression by genuinely disadvantaged students is disappointing and deserves to be revisited.

This might have been a once-in-a-generation chance to mend the maths talent pipeline. I do hope we haven’t blown it.

 

GP

October 2014

16-19 Maths Free Schools Revisited

This post marks the opening of two university-sponsored 16-19 maths free schools with a fresh look at the wider programme that spawned them.

It scrutinises developments since the publication of ‘A Progress Report on 16-19 Maths Free Schools’ (March 2013), building on the foundations within ‘The Introduction in England of Selective 16-19 Maths Free Schools’ (November 2011).

courtesy of Jim2K

The broad structure of the post is as follows:

  • A description of the genesis of the programme and a summary of developments up to March 2013.
  • The subsequent history of the programme, from March 2013 to the present day. This reviews efforts to recruit more university sponsors into the programme – and to resist the publication of information showing which had submitted expressions of interest and, subsequently, formal proposals.
  • An assessment of the prospects for the programme at this point and for wider efforts to expand and remodel England’s national maths talent pipeline.

Since many readers will be interested in some of these sections but not others, I have included direct links to the main text from the first word of each bullet point above.

 

Genesis and early developments

Capital investment to support the programme was confirmed in the 2011 Autumn Statement, which referred to:

‘…an extra £600 million to fund 100 additional Free Schools by the end of this parliament. This will include new specialist maths Free Schools for 16-18 year olds, supported by strong university maths departments and academics’.

This followed an orchestrated sequence of stories fed to the media immediately prior to the Statement.

One source reported a plan to establish 12 such schools in major cities by the end of the Parliament (Spring 2015) ‘before the model is expanded nationwide’. These would:

‘…act as a model for similar institutions specialising in other subjects’.

Another confirmed the number of institutions, adding that there would be ‘…a special application process outside the regular free school application process…’

A third added that the project was viewed as an important part of the Government’s strategy for economic growth, suggesting that some of the schools:

‘…would offer pure maths, while others would combine the subject with physics, chemistry or computer sciences.’

Assuming provision for 12 schools at £6m apiece, the Treasury had provided a capital budget of £72m, available until 2015. It remains unclear whether this sum was ringfenced for university-sponsored maths schools or could be diverted into the wider free schools programme.

We now know that former political adviser Dominic Cummings was a prime instigator of the maths free schools project – and presumably behind the press briefings outlined above.

The most recent edition of his essay ‘Some thoughts on education and political priorities’ (2013) says:

‘We know that at the top end of the ability range, specialist schools, such as the famous Russian ‘Kolmogorov schools’…show that it is possible to educate the most able and interested pupils to an extremely high level…We should give this ~2% a specialist education as per Eton or Kolmogorov, including deep problem-solving skills in maths and physics.

The first English specialist maths schools, run by King’s College and Exeter University, have been approved by the Department for Education and will open in 2014. All of the pupils will be prepared for the maths ‘STEP’ paper that Cambridge requires for entry (or Oxford’s equivalent) – an exam that sets challenging problems involving unfamiliar ways of considering familiar material, rather than the formulaic multi-step questions of A Level.’

Back in February 2012, TES reported that:

‘The DfE has hosted a consultation meeting on the new free schools with interested parties from the mathematical community in order to outline its plans.’

‘TES understands that officials within the Department for Education are now keen to establish the schools on the model of Kolmogorov, a boarding school that selects the brightest mathematicians in Russia.’

In fact, the meeting discussed a variety of international models and, on 20 February, Education Minister Nick Gibb answered a PQ thus:

‘Alex Cunningham: To ask the Secretary of State for Education when he expects the first free school specialising in mathematics for 16 to 18 year-olds to open; how many 16 to 18 year-olds he expects to enrol in free schools specialising in mathematics by 2015; with which universities he has discussed these free schools; and what guidance he plans to provide to people who wish to apply to open such a school.

Mr Gibb: We are developing proposals on how specialist maths schools for 16 to 18-year-olds might operate and will announce further details in due course. We are keen to engage with all those who have an interest to explore possible models and innovative ideas.’ (Col. 723W).

However, no proposals were published.

The minutes from King’s College London (KCL) Council meeting of 26 June 2012 reveal that:

‘Following approval by the Principal’s Central Team, the College was pursuing discussions with the Department for Education about sponsoring one of 12 specialist Maths schools for 16-18 year olds to be established with the support of university Mathematics departments. The initiative was intended to address national deficiencies in the subject and to promote a flow of highly talented students into university. In discussion, members noted that while the financial and reputational risks and the costs in management time needed to be carefully analysed, the project supported the College’s commitment to widening participation and had the potential to enhance the strengths of the Mathematics Department and the Department of Education and Professional Services, as well as addressing a national problem. The Council approved the College’s continued engagement with this initiative.’

By December 2012 KCL had announced that it would establish a maths free school, with both its maths and education departments involved. The school was scheduled to open in September 2014.

KCL confirmed that it had received from DfE a development grant plus a parallel outreach grant to support a programme for mathematically talented 14-16 year-olds, some of whom might subsequently attend the school.

The minutes of the University of Exeter Council meeting of 13 December 2012 record that:

‘As Council were aware, Exeter was going to be a partner in an exciting regional development to set up one of the first two Maths specialist schools with Exeter College. The other school would be led by King’s College London. This would cater for talented Maths students as a Free School with intake from four counties (Devon, Cornwall, Somerset and Dorset) with a planned total number of students of 120 by September 2017. The bid was submitted to the Department of Education on 11th December and the outcome would be announced in early January, with the school opening in 2014. It would be taught by Exeter College teachers with contributions from staff in pure and applied Maths in the College of Engineering, Mathematics and Physical Sciences (CEMPS), input from the Graduate School of Education and from CEMPS students as mentors and ambassadors. It was hoped that at least some of these talented students would choose to progress to the University. Council would be kept informed of the progress of the bid.’

In January 2013 a DfE press release announced approval of this second school. It would indeed have capacity for 120 students, with Monday-Thursday boarding provision for 20% (24 students), enabling it to recruit from across the four counties named above, so acting as a ‘regional centre of excellence’.

This project had also received a development grant – which we know was up to £300K – had agreement in principle to an outreach grant and also expected to open in September 2014.

There is also reference to plans for Met Office involvement with the School.

The press release repeats that:

‘The ultimate aim is to create a network of schools that operate across England which identify and nurture mathematical and scientific talent.’

A page added to DfE’s website in March 2013 invites further expressions of interest to open maths free schools in September 2014 and beyond.

A parallel Q and A, which has now been removed, made clear that development grants would not be available to new applicants:

‘Is there financial support available to develop our plans?

Not at the beginning. Once we have approved a proposal, we do offer some support to cover the costs of project management, and recruiting some staff before the school opens, in the same way we would for any Free School.’

This has subsequently been reversed (see below).

 

Progress since March 2013

 

The Hard Sell

While KCL and Exeter developed their plans, strenuous efforts were made to encourage other universities to participate in the programme.

A TES piece from May 2013, profiling the newly-appointed head of the KCL school, includes a quote from Alison Wolf – the prominent chair of the project group at KCL:

‘“The Brit School is a really good comparison,” she says. “When we were working on the new school and thinking about what to do, we’d look at their website.

“Maths is very glamorous if you’re a young mathematician, which is why they’ll do well when they are around other people who adore maths.”

The story adds that 16 schools are now planned rather than the original 12, but no source is attributed to this statement.

It seems that the wider strategy at this stage was to convince other potential university sponsors that maths schools were an opportunity not to be missed, to imply that there was already substantial interest from prominent competitors, so encouraging them to climb on board for fear of missing the boat.

 

Playing the Fair Access Card

But there was soon a change of tack. In June 2013, the Guardian reported that education minister Liz Truss had written to the heads of university maths departments to encourage bids.

‘As an incentive to open the new schools, universities will be allowed to fund them using budgets otherwise reserved for improving access to higher education for under-represented and disadvantaged groups….

Les Ebdon, director of Offa, said: “I’d be happy to see more university-led maths free schools because of the role they can play in helping able students from disadvantaged backgrounds access higher education.

“It is for individual universities and colleges to decide whether or not this is something they want to do, but Offa is supportive of anything that is targeted at under-represented groups and helps them to fulfil their potential.”

…According to Truss’s letter, Ebdon confirmed it would be “perfectly legitimate to allocate funding ringfenced for improving access for under-represented groups towards the establishment of such schools,” counting the spending as “widening access”.’

My initial post had pointed to the potential significance of this coupling of excellence and equity as early as November 2011:

‘It is not clear whether a fundamental purpose of these institutions is to support the Government’s drive towards greater social mobility through fair access to competitive universities. However, one might reasonably suggest it would be an oversight not to deploy them…encouraging institutions to give priority during the admissions process would be the likely solution.’

But Ministers’ rather belated conversion to the merits of alignment with social mobility and fair access might have been interpreted as opportunism rather than a sincere effort to join together two parallel strands of Government policy, especially since it had not been identified as a central feature in either KCL’s or Exeter’s plans.

I can find nothing on Offa’s website confirming the statement that funding ringfenced for fair access might be allocated by universities to the development of maths free schools. There is no contemporary press notice and nothing in subsequent guidance on the content of access agreements. This raises the question of whether Ebdon’s comments constitute official Offa advice.

However the text of the letter is preserved online and the identical text appears within it:

‘I want to encourage other universities to consider whether they could run similar schools: selective, innovative and stretching our brightest and best young mathematicians. It is a logical extension of the role that dozens of universities have already played in sponsoring academies.

I also wanted to highlight to your colleagues that Professor Les Ebdon, Director of the Office for Fair Access, is enthusiastic about the role university led Maths Free Schools can have in encouraging more young people to go on to study maths at university, and to reap the benefits that brings. Professor Ebdon has also confirmed to me that he considers the sponsorship and development of Maths Free Schools as contributing to higher education ‘widening access’ activity, and that it would be perfectly legitimate to allocate funding ring-fenced for improving access for underrepresented groups towards the establishment of such schools.

Unlike our usual practice for Free Schools, there is no competitive application process for Maths Free Schools. Instead we ask interested universities to submit a short proposal setting out the key features of the school. These proposals need not be long: King’s and Exeter both submitted initial proposals that were around 12 pages…

[There follows a list of bullet points describing the content of these initial proposals, none of which address the admission of students from disadvantaged backgrounds.]

….Both King’s College and the University of Exeter had a number of detailed discussions with colleagues in the Department to develop and refine their proposals and we are always happy to work with universities to help them focus their plans before submitting a formal proposal. If we approve a proposal, we do then offer financial support to cover the costs of project management, and of recruiting some staff before the school opens, in the same we would for any free school.’

(By way of an aside, note that the final sentence of the quotation above corrects the statement in the Q and A mentioned earlier. It seems that maths free schools are now treated comparably with all other free school projects in this respect, even though the application process remains different.

The latest version of free school pre-opening guidance gives the sum available in Project Development Grant for 16-19 free schools as £0.25m.)

Going back to Offa, there are no conditions imposed by Ebdon in respect of admissions to the schools, which seems a little over-relaxed, given that they might well attract a predominantly advantaged intake. I wonder whether Ebdon was content to offer personal support but refused to provide official Offa endorsement.

 

 

In July 2013 the BBC reported a speech by Truss at the 2013 ACME Conference. Oddly, the speech is not preserved on the gov.uk site. According to the BBC:

“We want this movement to spread still further,” she told delegates.

“So we’re allowing universities to apply to sponsor new maths free schools through a fast-track, simplified procedure, without having to go through the normal competitive application process.

“These schools will not only improve standards in maths teaching, but will equip talented young people from low-income backgrounds with the skills they need to study maths at university.”

Mrs Truss said the Office for Fair Access had confirmed that, when universities contributed to the sponsorship or development of maths free schools, this would be considered as one of their activities to widen access to under-represented groups – and therefore as part of their access agreement.

“I hope that this is the start of a new network of world-class free schools, under the aegis of top universities, helping to prepare talented 16- to 19-year-olds from any and every background for the demands of university study.”

Note that Ebdon’s endorsement is now Offa’s.

Cummings’ essay remarks in a footnote:

‘Other maths departments were enthusiastic about the idea but Vice Chancellor offices were hostile because of the political fear of accusations of ‘elitism’. Hopefully the recent support of Les Ebdon for the idea will change this.’

A year on, we have no evidence that it has done so.

 

The Soft Sell

By the beginning of the following academic year, a more subtle strategy was adopted. The two schools-in-development launched a maths competition for teams from London and the South-West with prizes awarded by education ministers.

 

 

A November 2013 DfE press release marks the ceremony. Michael Gove is quoted:

‘We need specialist maths free schools like King’s College London (KCL) Maths School and Exeter Mathematics School. They will develop the talents of exceptional young mathematicians and ensure they can compete in the global race.’

The release continues:

‘The KCL and Exeter schools are the first to take advantage of a development grant made available by the Department for Education for the creation of university-led specialist maths free schools.’

The notes include a link to the 1 March webpage mentioned above for ‘Universities interested in developing their own maths free school’.

 

Publicity avoided

We now know that a Freedom of Information request had been submitted to DfE in October 2013, asking how many expressions of interest and firm proposals had been received, which institutions had submitted these and which proposals had been approved and rejected.

The source is an ICO Decision Notice published on 12 June 2014.

The request was initially rejected and this decision was upheld in January 2014 following an internal review. A complaint was immediately lodged with the Information Commissioner’s Office.

The Decision Notice records the Commissioner’s decision that public interest outweighs the case for withholding the information. Accordingly he directs that it should be released to the complainant within 35 calendar days of the date of the Notice (ie by 17 July 2014).

The Notice contains some interesting snippets:

  • ‘It has been the DfE’s experience that interested Heads of Maths have contacted it for further information before seeking to discuss the idea with their Vice Chancellor.’ There is no process for accepting formal expressions of interest.
  • ‘There are…no fixed criteria against which all proposals are assessed.’
  • ‘The DfE confirmed that the application is and has always been the first formal stage of the maths free schools process and it has already stated publicly that it has received three applications from King’s College London, Exeter University and the University of Central Lancashire.’
  • ‘It [ie DfE] confirmed that funding arrangements were only confirmed for the development of maths free schools in February 2014 and many policy decisions on this issue have been shaped by the specifics of the two schools that are due to open soon. It expects the policy to develop even further as more maths free schools are approved.’
  • ‘The DfE explained that universities are extremely risk adverse when it comes to protecting their reputation and so do not want to be publically named until they have submitted an application. As such, if they are named at an earlier point it may make them pull out altogether and may make universities unwilling to approach the DfE with ideas.’
  • ‘Similarly, the DfE argued that if it were to release the reasons why one of the applications was rejected it would be likely to deter future interest as the university would not want the public criticism of its ideas. Given that the policy is driven by university interest, if all potential groups are deterred the policy will fail and students will not be able to enjoy the potential benefits.’

The Commissioner gave these arguments short shrift, pointing out the benefits of transparency for policy development and the encouragement of more successful applications.

The text does not say so explicitly, but one can imagine the Commissioner thinking ‘given the low level of interest stimulated to date, you might at least try a more open strategy – what have you got to lose?’

It does seem unlikely that university heads of maths departments would submit speculative expressions of interest without internal clearance. Their approaches were presumably of the informal ‘sounding out’ variety. They would understand the shaky internal politics of failing to consult the corporate centre – not to mention their education faculties.

The lack of specific and transparent assessment criteria does appear to have backfired. What guarantees might universities otherwise receive that their proposals would be judged objectively?

One can imagine the questions:

  • Is the scheme open to all universities, Russell Group or otherwise?
  • If not, what criteria must the host university satisfy?
  • What counts as a ‘strong mathematics department’?
  • Can projects be led by university departments of education, or only undertaken jointly (as at KCL)?

Without explicit and consistent answers one can readily understand why many universities would be disinclined to pursue the idea.

The reference to belated confirmation of funding arrangements – as recently as February 2014 – is intriguing. It cannot apply to capital funding, unless that was vired in extremis. Perhaps it relates to the parallel recurrent funding pot or simply the availability of project development grants.

The latter seems unlikely given the statement in the letter to HoDOMS, dated some eight months previously.

One suspects that there might have been internal difficulties in ringfencing sufficient recurrent funding to honour proposals as and when they were received. Some prospective bidders might have baulked at being told that their budget could not be confirmed until a later date.

But the eventual resolution of this issue a little over a year before the end of the spending round would be unlikely to have a significant impact on the number of successful bids, especially if unspent capital funding has to be surrendered by Spring 2015.

 

Recent developments

In July 2014 the TES revealed that it had been the source of this FoI request.

 

 

But the story reveals little new, other than that:

‘Five further expressions of interest have been made but not yet yielded an application’

The sources are not revealed.

David Reynolds opines that:

‘Having a small number of schools doesn’t matter if we can get the knowledge from them around the system. So we need them to be excellent schools and we need to somehow get that knowledge around.’

A DfE statement concludes:

‘We continue to welcome applications and expressions of interest from universities and the first maths free schools, set up by two leading universities, will be opening in September.’

So we know there have been eight expressions of interest, three of them converted into firm proposals.

The receipt of the third proposal, from the University of Central Lancashire (UCLan), is said to have been made public, but I can find no record of it in the lists of Wave 1 to 7 free school applications so far published.

There is a reference in UCLan’s 2013-14 access agreement, dated 31 May 2012:

‘The University is currently consulting on the formation of a Maths Free School which would be run alongside its new Engineering Innovation Centre at the Preston Campus.’

Nothing is said about the plans in the access agreements for 2014-15 and 2015-16.

There is one further reference on the New Schools Network site to a:

‘Consultant engaged to carry out a feasibility study re a Maths Free School on behalf of the University of Central Lancashire (UCLan)’.

One assumes that this must be out-of-date, unless UCLan is considering a second bid.

Otherwise, a simple process of elimination tells us that UCLan’s proposal must have been rejected. The reason for this is now presumably known to TES, as are the sources of the five expressions of interest that were not converted into proposals. Why have they not published this information?

Perhaps they are waiting for DfE to place these details on its website but, at the time of writing – almost three months after the Decision Notice was issued – the information has not been uploaded.

Meanwhile, there are no further maths free school proposals in the most recent Wave 7 information relating to applications received by 9 May 2014.

The deadline for Wave 8 is imminent. That may well be the last on this side of the Election.

A further feature published by the TES in October 2014 throws no fresh light on these matters, though it carries a quote by new Secretary of State Nicky Morgan, interviewed at the KCL School launch:

‘I think that some [universities] are clearly waiting to see how the King’s and Exeter schools go. Clearly there is a huge amount of effort required, but I think King’s will be enormously successful, and I am hoping they will be leading by example.’

That sounds suspiciously like a tacit admission that there will be no new proposals before a General Election.

Another opinion, diametrically opposed to David Reynolds’ view, is contributed by the head of the school of education at Nottingham University, who is also Deputy Chair of ACME:

‘“I’m very supportive of more people doing more maths, but even if you have 12 schools, you are really scratching the surface,” said Andrew Noyes, head of the school of education at Nottingham University and a former maths teacher.

“These kinds of policy experiments are very nice and they’re beneficial for a certain number of young people, but they’re relatively cheap compared with providing high-quality maths education at every stage in every school.”’

So what are the prospects for the success of the KCL and Exeter Schools? The next section reviews the evidence so far in the public domain.

 

The KCL and Exeter Free Schools

 

KCL School

The KCL School opened in September 2014 with 68 students, against a planned admissions number of 60. The most recent TES article says that there were 130 applicants and nearly all of those successful were drawn from state schools.

However, another reliable source – a member of the governing body – says that only 85% (ie 58) are from maintained schools, so the independent sector is actually over-represented.

He adds that:

‘Many are from families where neither parent has attended university’

but that is not necessarily an indicator of disadvantage.

We also know that some 43% (29 students) were female, which is a laudable outcome.

The School is located in Lambeth Walk, some distance from KCL’s main campuses. The capital cost of refurbishing the School was seemingly £5m. It occupies two buildings and the main building is shared with a doctor’s surgery.

My March 2013 post summarised KCL’s plans, as revealed by material on the University’s site at that time, supplemented by the content of an information pack for potential heads which is no longer available online.

I have reproduced the main points below, to provide a baseline against which to judge the finished article.

  • The full roll will be 120, with an annual admission number of 60. Potential applicants must have at least 5 GCSE grades A*-C including A*/A in both maths and physics or maths and dual award science.
  • Other admissions criteria will probably include a school reference, ‘our judgement about how much difference attending the school will make to your future based on a number of factors, including the results from an interview’ and the results of a test of aptitude for problem-solving and mathematical thinking.
  • The headteacher information pack adds that ‘the school will also be committed to recruiting a significant proportion of students from socially disadvantaged backgrounds, and to an outreach programme… to further this objective.’
  • All students will take Maths, Further Maths and Physics A levels. They will be expected to take STEP papers and may take a further AS level (an FAQ suggests this will be an Extended Project). Every student will have a maths mentor, either an undergraduate or ‘a junior member of the maths department’.
  • They will also ‘continue with a broad general curriculum, including other sciences, social science, humanities and languages, and have opportunities for sport and the visual and performing arts.’ Some of this provision will be ‘delivered through existing King’s facilities’. The provisional timetable assumes a 40-hour working week, including independent study.
  • The University maths department ‘will be closely involved in curriculum development’ and academics will have ‘regular timetabled contact’, potentially via masterclasses.
  • There will be strong emphasis on collaboration with partner schools. In the longer term, the school ‘intends to seek independent funding for a larger CPD programme associated with the school’s curriculum and pedagogy, and to offer it to a wide range of schools and students, using school premises out of hours’.

At the time of writing, the KCL Maths School website does not have a working link to the admissions policy, although it can be found online.

As expected, 60 students will be admitted in September 2015. Minimum requirements are now:

‘A or A* in GCSE Mathematics or in iGCSE Mathematics

Either an A or A* in GCSE Physics or iGCSE Physics, or an AA, A*A or A*A* in GCSE Science and GCSE Additional Science, or an A or A* in all three Physics modules contained within the GCSE Science, Additional Science and Further Additional Science qualifications; and

A*-C grade in 5 other GCSEs or other qualifications that count towards the Key Stage 4 performance tables compiled by the Department of Education, normally including English language.’

So the minimum requirement has been stiffened to at least seven GCSEs, or equivalent, including A*/A grades in maths and physics and at least a C in English language.

The application process does indeed include a reference, an aptitude test and an interview.

The test is based on KS3 national curriculum material up to Level 8, containing ‘routine and less familiar problems’. Some specimen questions are supplied.

The latest TES story says there are two interviews but this is wrong – there is one interview but two interview scores. One of the two scores is ‘to assess to what extent the school is likely to add value in terms of making a difference to [candidates’] future careers’ but there is no explicit reference to priority for disadvantaged students anywhere in the admissions policy.

Indeed, the section headed Equality and Diversity says:

‘All places at King’s College London Mathematics School are offered on the basis of academic ability and aptitude.’

This does not amount to a commitment to recruit ‘a significant proportion of students from socially disadvantaged backgrounds’, as stated in the headteacher information pack.

The website confirms that all students take A levels in maths, further maths and physics, together with an AS EPQ. But now they can also take an optional AS level in computing in Year 12 and may convert it to an A level in Year 13. They will also take either the AEA or STEP papers.

The description of additional curricular provision is somewhat vague. Students will have a series of lessons and educational visits. Each fortnight a KCL lecturer will introduce a new theme, to be explored through ‘mini research projects’. Students will also learn a modern language but to what level is unclear.

A mentor will be assigned to support work for the EPQ. There will also be a maths mentor – always an undergraduate, never ‘a junior member of the maths department’ – available for one meeting a week.

Tuesday afternoons seem to be set aside for sport and exercise. Visual and performing arts will be explored through extra-curricular activity, though this is currently aspirational rather than real:

‘…the school hopes to have sufficient interest to form a student choir, orchestra and dramatic society.’

The length of the school day is six hours and 55 minutes, with five hours of lessons (though the FAQ implies that students will not have a full timetable).

The present staff complement is 10, six of whom seem to be teaching staff. The head was formerly Head of Maths at Highgate School.

Outreach continues for students in Years 10 and 11. There is also a CPD programme for those new to teaching further maths. This is funded by a £75,000 grant from the Mayor’s London Schools Excellence Fund and supports 30 teachers from six schools spread across five boroughs.

KCL’s Access Agreement for 2015/16 says:

‘King’s College London Mathematics School aims to increase substantially the number of young people with the right levels of mathematical attainment to study STEM subjects at top-rated universities. It also aims to improve access to high quality mathematical education at sixth form level and is targeting individuals from schools where such provision is not easily available (in particular, 11-16 schools and schools where further mathematics is not offered as part of the curriculum at A-level). The school has implemented an extensive outreach programme for pupils at KS4, aged 14-16, whereby pupils come to King’s College London for two hours per fortnight over a two-year period. Through this programme, the school will provide students with limited access [sic] to high quality sixth form provision the understanding and skills they need to prepare for A-levels in Maths and Further Maths should they decide to study them, and also to support applications to the maths school should they wish to make them.

The school has also just launched a programme of continuing professional development for maths teachers in London schools. The programme will run for two consecutive years, and will enable high-quality teaching of Further Maths for those new to teaching this A-level. One of the key aims of this programme is to improve take up and retention rates in A-level Further Maths, with a view to increasing numbers of well-trained applicants to STEM subjects at university.’

Exeter

The Exeter School also opened in September 2014, with 34 students against a planned admission number of 30. Disappointingly, only seven are girls. Eleven (32%) are boarders. We do not know the number of applicants.

The School is located in Rougemont House, a Grade 2 listed building close to the University and College. The cost of refurbishment is as yet unknown.

Fewer details of Exeter’s plans were available at the time I wrote my previous post. The January 2013 announcement revealed that:

  • As we have seen, the roll would be 120 students, 60 per year group, with boarding places available for 20% of them.
  • All students would take maths A level and the STEP paper and all would have 1:1 maths mentoring.
  • University academics would provide an ‘enrichment and critical thinking programme’.
  • The Met Office would be involved.

The 2014 admissions policy dates from September 2013. It indicates that the School will admit 30 students in September 2014, 50 in September 2015 and 60 in September 2016. It will not reach full capacity until September 2017.

Minimum entry requirements are:

  • A* in GCSE Mathematics
  • A or A* in double sciences or single science Physics (in 2015 computer science is also acceptable as an alternative)
  • At least 6 GCSEs at C grade or above, normally to include English Language at a grade B.

So Exeter is more demanding than KCL in respect of the grades required for both GCSE maths and English language, but the minimum number of GCSEs required is one fewer.

The policy says that the School will aim for allocated places to reflect the incidence of potential students across Devon (47%) and in the other three counties served by the school (Cornwall 23%, Somerset 23%, Dorset 6%) but they will not be selected on this basis. There is nothing in the admissions criteria to secure this outcome, so the purpose of this paragraph is unclear.

The selection process involves a written application, a reference, an interview and ‘a mathematics-based entry exam’, subsequently called an aptitude test. This is described in identical terms to the test used by KCL – indeed the specimen questions are identical.

The oversubscription criteria involve giving priority to ‘interview answers and the candidates’ potential to thrive and succeed on the course’.

Under ‘Equality and Diversity’ the document says:

‘EMS is committed to widening participation and broadening access to high quality mathematics education. As such, we will target our recruitment in areas which have high levels of deprivation and in schools for which provision is currently limited, such as those without 6th forms.

EMS will encourage applications from female students through targeted marketing and recruitment. However, there will be no positive discrimination for girls in the admissions criteria.’

The first statement is largely meaningless since neither residence in a deprived area nor attendance at a school without a sixth form is mentioned explicitly in the admissions criteria.

The second statement is reflected in the fact that only 20% of the inaugural cohort is female.

The document notes that boarding will be available for learners living more than an hour distant. The proportion of boarders in the first cohort is significantly higher than expected.

It adds that boarding fees will be payable (and published on the School’s website) but it is expected they ‘will be subsidised by a government grant and a private investor’. There will also be a limited number of means-tested full bursaries, the criteria for which will also be published.

At the time of writing, neither fees nor subsidies nor bursary criteria are published on the open pages of the website. The policy also mentions a subsidised transport scheme but provides no details. This is unhelpful to prospective candidates.

Students take A levels in maths and further maths, plus an A level in either physics or computer science. They are also prepared for STEP papers. All students pursue one further AS level at Exeter College, selecting from a choice of over 30 subjects, with the option to complete the A level in Year 13. Amongst these are several non-traditional options such as fashion and design, media studies and world development. The School is clearly not wedded to facilitating subjects!

In maths students will:

‘…collaborate with those in other mathematics schools and meet, converse and work with staff and students from Exeter University’s mathematics department. They will have access to mathematical mentors from the University who will provide 1:1 and small group support for individual development and project work.’

Maths mentors will be 3rd or 4th year undergraduates and sessions will take place fortnightly.

All students will have a pastoral tutor who will ‘deliver a curriculum designed to meet the students’ development needs’. Some extra-curricular options may also be available:

‘Several clubs and societies will exist within EMS, these will be established as a result of students’ own interests. In addition, Exeter College’s specialist facilities, learning centres and other services will be accessible to them. Students will join their friends and other students from the College for sporting and enrichment activities including, for example, structured voluntary work, theatre productions and the Duke of Edinburgh’s Award Scheme.’

I could find no reference to a University-provided enrichment and critical thinking programme or to Met Office involvement.

The Head of Exeter School was formerly a maths teacher and maths AST at Torquay Boys’ Grammar School. Other staff responsibilities are not enumerated, but the Contacts page mentions only one teacher apart from the Head.

Another section of the site says the School will be advertising for a Deputy and ‘teachers of Mathematics, Computer Science and Physics (p/t)’. Advertisements have been placed for several posts including a Pastoral Leader and an Outreach and Admissions Officer.

An outreach programme is being launched and business links will be established, but there are no details as yet. There are links to a KS4/5 maths teachers’ network sponsored by the Further Maths Support Programme.

Exeter’s 2015/16 Access Agreement says:

‘The University and the College are already joint sponsors of the innovative new Exeter Maths School and are developing a strategic approach to outreach that supports both curriculum enhancement in local schools and progression for the students enrolled in the school. Together with the South Devon UTC, these two new education providers offer opportunities for innovative collaborative approaches to outreach in the region.’

This sounds very much like a work in progress.

 

Comparing the two schools

My 2013 post observed:

‘From the information so far published, the Exeter project seems very close conceptually to the one at King’s, indeed almost a clone. It would have been good to have seen evidence of a fundamentally different approach.’

If anything, the two projects have grown even more similar as they have matured. To the extent that these are pilot institutions testing out a diversity of models, this is not entirely helpful.

Both Schools are very small and KCL in particular offers a very restricted range of post-16 qualifications. There is a downside to post-16 education on this model – otherwise we wouldn’t be exercised about the negative effects of small sixth forms – though both projects make some effort to broaden their students’ experience and, as we have seen, Exeter includes some shared provision with Exeter College.

The admissions requirements and processes are almost identical. It is important to recognise that neither institution is highly selective, especially in terms of overall GCSE performance and, in this respect, the comparisons with Kolmogorov and other institutions elsewhere in the world are rather misleading.

This is not the top 2% that Cummings cited as the beneficiaries in his essay. Even in terms of mathematical ability, the intake to these schools will be relatively broad.

The expectation that all will take STEP papers may be realistic but, despite the use of an aptitude test, any expectation of universal success is surely over-optimistic.

Cambridge says STEP papers are ‘aimed at the top 5% or so of all A-level mathematics candidates’. Fewer than 1,500 students took the most popular Paper 1 in 2013 and, in 2014, over 20% of participants received an Unclassified grade.

Cummings notes that approximately one third of those entered for STEP attend independent schools, meaning that roughly 1,000 of the 2013 cohort were in maintained institutions. There may be some marginal increase in state-funded STEP entry through these two schools, but the impact of MEI support elsewhere is likely to be more significant.

The priority attached to excellence is less pronounced than expected. But this is not matched (and justified) by a correspondingly stronger emphasis on equity.

Neither school gives priority within its admissions or oversubscription criteria to students from disadvantaged backgrounds. A major opportunity has been lost as a consequence.

So there is insufficient emphasis on excellence and equity alike. These institutions exemplify a compromise position which, while tenable, will reduce their overall impact on the system.

The only substantive difference between the two schools is that one is located in London and the other in a much more sparsely populated and geographically dispersed region. These latter conditions necessitate a boarding option for some students. The costs associated with boarding are not transparent, but one suspects that they will also serve as a brake on the recruitment of disadvantaged students.

Exeter has no real competitors in its region, other than existing sixth forms and post-16 institutions, but KCL faces stiff competition from the likes of the London Academy of Excellence and the Harris Westminster Sixth Form, both of which are much more substantial institutions offering a wider range of qualifications and, quite possibly, a richer learning experience.

Both Schools are designed to suit students who wish to specialise early and who are content with only limited opportunities to work outside that specialisation. That subgroup does not necessarily include the strongest mathematicians.

It might have been a different story if the Schools could have guaranteed progression into the most selective higher education courses, but this they cannot offer. There is no guaranteed progression even to the host universities (whose mathematics departments are not the strongest – one obvious reason why they were attracted to hosting maths schools in the first place).

Exeter and King’s no doubt expect that their Schools will help them to compete more effectively for prospective students – both through direct recruitment and, more indirectly, by raising their profile in the maths education sector – but they will not state this overtly, preferring to emphasise their contribution to improving standards system-wide.

There is no reference to independent evaluation, so one assumes that success indicators will focus on recruitment, a strong showing in the Performance Tables and especially Ofsted inspection outcomes.

A level performance must be consistently high and HE destinations must be commensurate. Because recruitment of disadvantaged students has not been a priority, fair access measures are largely irrelevant.

Other indicators should reflect the Schools’ contribution to strengthening the maths talent pipeline and maths education more generally, particularly by offering leadership at regional and national levels.

At this early stage, my judgement is that the KCL project seems rather better placed than Exeter to achieve success. It has hit the ground running while Exeter has some rapid catching up to do. One is good; the other requires improvement.

 

Future Prospects

 

Prospects for the maths school programme

With just seven months before Election Purdah, there is no prospect whatsoever that the programme will reach its target of 12 schools. Indeed it seems highly unlikely that any further projects can be brought to fruition before the end of the spending round.

One assumes that the Regional Schools Commissioners are now responsible for stimulating and supporting new maths school projects – though this has not been made explicit – but they already have their hands full with many other more pressing priorities.

If Labour were to win the Election it seems unlikely that they would want to extend the programme beyond the two schools already established.

Even under the Conservatives it would be extremely vulnerable given its poor track record, the very tight budgetary constraints in the next spending round (especially if schools funding is no longer ringfenced) and the fact that its original champions are no longer in place at DfE.

With the benefit of hindsight, one might have taken a different approach to programme design and targeting. Paradoxically, the centre has appeared overly prescriptive – favouring a ‘Kolmogorov-lite’ model, ideally hosted by a Russell Group institution – but also too vague, omitting to clarify its expectations in a specification with explicit ‘non-negotiables’.

Universities were hesitant to come forward. Some will have had other fish to fry, some may have had reservations arising from fear of elitism, but more still are likely to have been unclear about the Government’s agenda and how best to satisfy it.

The belated decision to flag up the potential contribution to fair access was locking the stable door after the horse had bolted. Other universities will have noted that neither KCL nor Exeter paid more than lip service in this direction.

Because they were awarded a substantial capital budget – and were wedded to the value of free schools – ministers were driven to focus on creating new stand-alone institutions that might ultimately form a network, rather than on building the network itself.

Creating a set of maths hubs would have been the most sensible place to start, enabling new maths schools to take on the role of hubs when they were ready to do so. But the maths hubs were a later invention and, to date at least, there have been no efforts to ‘retro-fit’ the maths schools into the network, meaning that these parallel policy strands are not yet integrated.

 

Prospects for the national maths talent pipeline

England is far from having a coherent national strategy to improve maths education or, as one element within that, a convincing plan to strengthen the maths talent pipeline.

Maths education enjoys a surfeit of players with overlapping remits. National organisations include the Joint Mathematical Council (JMC), an umbrella body, the Advisory Committee on Mathematics Education (ACME), the United Kingdom Mathematics Trust (UKMT) and the School Mathematics Project (SMP), amongst a host of others.

This leaves to one side the maths-related element of broader programmes to support between-school collaboration, recruit teachers and develop new-style qualifications. There is a parallel set of equally complex relationships in science education.

Not to put too fine a point on it, there are too many cooks. No single body is in charge; none has lead responsibility for developing the talent pipeline.

Ministers have been energetic in generating a series of stand-alone initiatives. The overarching vision has been sketched out in a series of set-piece speeches, but there is no plan showing how the different elements knit together to create a whole greater than the sum of its parts.

This probably has something to do with an ideological distaste for national strategies of any kind.

The recent introduction of maths hubs might have been intended to bring some much-needed clarity to a complex set of relationships at local, regional and national levels. But the hubs seem to be adding to the complexity by running even more new projects, starting with a Shanghai Teacher Exchange Programme.

A network-driven approach to talent development might just work – I suggested as much at the end of my previous post – but it must be designed to deliver a set of shared strategic objectives. Someone authoritative needs to hold the ring.

What a pity there wasn’t a mechanism to vire the £72m capital budget for 12 free schools into a pot devoted to this end. For, as things stand, it seems that up to £12m will have been spent on two institutions with a combined annual cohort of 120 students, while a further £60m may have to be surrendered back to the Treasury.

We are better off than we would have been without the KCL and Exeter Schools, but two schools is a drop in the ocean. Even 12 schools of this size would have been hard-pressed to drive improvement across the system.

This might have been a once-in-a-generation chance to mend the maths talent pipeline. I hope we haven’t blown it.

 

GP

October 2014

Beware the ‘short head’: PISA’s Resilient Students’ Measure

 

This post takes a closer look at the PISA concept of ‘resilient students’ – essentially a measure of disadvantaged high attainment amongst 15 year-olds – and how this varies from country to country.

The measure was addressed briefly in my recent review of the evidence base for excellence gaps in England, but there was not space on that occasion to provide a thoroughgoing review.

The post is organised as follows:

  • A definition of the measure and explanation of how it has changed since the concept was first introduced.
  • A summary of key findings, including selected international comparisons, and of trends over recent PISA cycles.
  • A brief review of OECD and related research material about the characteristics of resilient learners.

I have not provided background about the nature of PISA assessments, but this can be found in previous posts about the mainstream PISA 2012 results and PISA 2012 Problem Solving.

 

Defining the resilient student

In 2011, the OECD published ‘Against the Odds: Disadvantaged students who succeed in school’, which introduced the notion of PISA as a study of resilience. It uses PISA 2006 data throughout and foregrounds science, as did the entire PISA 2006 cycle.

There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.

The international benchmark relates to the top third of PISA performers (ie above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.

Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.

This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.

The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.

The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).

‘Against the Odds’ describes it thus:

‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’

Further details are provided in the original PISA 2006 Report (p333).
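The mechanics of the benchmark are easier to see in code. Below is a minimal sketch, assuming a flat list of student records with hypothetical field names (country, escs, score); real PISA analyses also apply sampling weights and regression-adjust performance for socio-economic background, both of which are omitted here. Swapping the fractions to quarters gives the later definition described below.

```python
# Minimal sketch of the original 'Against the Odds' international benchmark:
# resilient = bottom-third ESCS within the student's own country AND
# top-third performance across all countries. Field names are hypothetical.

def percentile_cutoff(values, fraction):
    """Value at the given fraction of the sorted distribution."""
    ordered = sorted(values)
    return ordered[int(fraction * (len(ordered) - 1))]

def classify_resilient(students):
    # International performance threshold: top third across all students.
    perf_cut = percentile_cutoff([s["score"] for s in students], 2 / 3)
    # National disadvantage threshold: bottom third of ESCS, per country.
    escs_cut = {
        country: percentile_cutoff(
            [s["escs"] for s in students if s["country"] == country], 1 / 3)
        for country in {s["country"] for s in students}
    }
    for s in students:
        s["resilient"] = (s["escs"] <= escs_cut[s["country"]]
                          and s["score"] >= perf_cut)
    return students
```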

Rather confusingly, the parameters of the international benchmark were subsequently changed.

PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes Volume II describes the new methodology in this fashion:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.

The new methodology is seemingly retained in PISA 2012 Results: Excellence through Equity: Giving every student the chance to succeed – Volume II:

‘A student is classed as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’

However, multiplication by four is dispensed with.

This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable after some straightforward multiplication. However, the 2006 results foreground science, while in 2009 the focus is reading – and shifts on to maths in 2012.
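To see what that multiplication involves: since disadvantaged students are defined as a fixed slice of the ESCS distribution, a resilient share quoted against all students converts to a share amongst disadvantaged students by dividing by the size of the slice. A minimal illustration, with purely hypothetical figures:

```python
# Convert a resilient share expressed against all students into a share
# amongst disadvantaged students. With a bottom-quarter definition of
# disadvantage the divisor is 0.25 (ie multiply by 4); with the older
# bottom-third definition it is 1/3 (ie multiply by 3).

def share_among_disadvantaged(share_among_all, disadvantaged_fraction):
    return share_among_all / disadvantaged_fraction

# Hypothetical figures, for illustration only:
print(share_among_disadvantaged(0.08, 0.25))   # 8% of all -> 32% of the bottom quarter
print(share_among_disadvantaged(0.13, 1 / 3))  # 13% of all -> 39% of the bottom third
```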

Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.

 

PISA 2006 results

The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.

 

[Chart: resilient students and disadvantaged low achievers in science, PISA 2006, all jurisdictions]

Conversely, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion to recover the share amongst disadvantaged students (since the indicator is based on ‘top third attainment, bottom third disadvantage’).

I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.

I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.

Chart 1 shows these results.

 

Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)

 

All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.

Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.

The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.

Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.

The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.

The first chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.

In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.

 

‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.

 


Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006

 

Amongst the jurisdictions for which we have data there is a relatively similar pattern, with between 47% and 56% of students resilient in all three subjects.

In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.

The document summarises the outcomes thus:

‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’

 

PISA 2009 Results

The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above make them incompatible with those for 2006.

The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.

[Chart: resilience in reading, PISA 2009, all jurisdictions]

 


Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)

 

The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.

The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.

There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.

The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in the performance of our sample.

 


Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009

At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).

At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far behind they are at both the top and the bottom of the attainment spectrum.

 

PISA 2012 Results

In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.

 

[Chart: resilience in maths, PISA 2012, all jurisdictions]

Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)

 

Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.

Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.

Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.

Within the OECD average, boys have a four percentage point lead on girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy. This is most marked in Poland. In the UK the difference is just two percentage points.

The comparison with disadvantaged low achievers is illustrated in Chart 6.

 


Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012

 

Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample where disadvantaged low achievers outnumber resilient students. The reverse is true in the UK, but only just!

 

Another OECD publication, ‘Strengthening Resilience through Education: PISA Results – background document’, contains a graph showing the variance in jurisdictions’ mathematical performance by deciles of socio-economic disadvantage. This is reproduced below.

 

[Chart: variance in mathematics performance by decile of socio-economic disadvantage]

The text adds:

‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’

One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, certainly as far as disadvantaged learners are concerned.

 

The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.

 

[Chart: maths-based resilience compared across PISA 2003 to PISA 2012]

Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.

The UK is not included in this analysis because of issues with its PISA 2003 results.

Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.

It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.

The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.

 

[Chart: resilience in problem solving, PISA 2012, all jurisdictions]

 

Resilience in the home countries

By way of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, comparing them with the top performer in my sample for each cycle and the appropriate OECD average. Problem-solving is omitted.

Only in science (using the ‘top third attainer, bottom third disadvantage’ formula) does the UK exceed the OECD average figure, and then only slightly.

In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.

It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.

 


Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average

 

Unfortunately NFER does not pick up the concept of resilience in its analysis of England’s PISA 2012 results.

The only comparative analysis across the Home Countries that I can find is contained in a report prepared for the Northern Ireland Ministry of Education by NFER called ‘PISA 2009: Modelling achievement and resilience in Northern Ireland’ (March 2012).

This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.

The results show that 41% of English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.

Whether the same relationship holds true in maths and science using the ‘top quartile, bottom quartile’ methodology is unknown. One suspects, though, that each of the UK figures given above will also apply to England.

 

The characteristics of resilient learners

‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:

  • Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (eg number of books in the home, parental level of education) rather than parental occupation and income.
  • In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
  • Students with an immigrant background – either born outside the country of residence or with parents who were born outside the country – tend to be under-represented amongst resilient students.
  • Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
  • Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.

Volume III of the PISA 2012 Report: ‘Ready to Learn: Students’ engagement, drive and self-beliefs’ offers a further gloss on these characteristics from a mathematical perspective:

‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…

….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.

Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…

….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’

Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:

The aforementioned NFER study uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland, using PISA 2009 data.

It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; classic literature amongst home possessions 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0-10 books in the home 0.35.

On the positive impact of selection the report observes:

‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’
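Odds ratios like those above are easy to misread as probability multipliers. They scale the odds of resilience, p/(1 − p), not the probability itself. A minimal sketch, using an entirely hypothetical baseline probability (the NFER report does not publish one):

```python
# Apply an odds ratio to a baseline probability of resilience.
# The baseline figure below is hypothetical, purely for illustration.

def apply_odds_ratio(baseline_prob, odds_ratio):
    odds = baseline_prob / (1 - baseline_prob) * odds_ratio
    return odds / (1 + odds)

baseline = 0.20  # hypothetical baseline chance of being resilient
print(apply_odds_ratio(baseline, 7.44))  # grammar school: ~0.65
print(apply_odds_ratio(baseline, 0.35))  # 0-10 books in the home: ~0.08
```

On these illustrative numbers, the grammar school odds ratio of 7.44 would lift a 20% baseline chance of resilience to around 65%, while the 0-10 books ratio of 0.35 would cut it to around 8%.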

Another study – ‘Predicting academic resilience with mathematics learning and demographic variables’ (Cheung et al 2014) – concludes that, amongst East Asian jurisdictions such as Hong Kong, Japan and South Korea, resilience is associated with avoidance of ‘redoublement’ (grade repetition) and with having attended kindergarten for more than a year.

Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.

Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.

In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.

Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.

Resilience varies in different subjects according to year of education.

[Chart: resilience patterns in Turkey by subject and year of education]

There are also significant regional differences.

Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.

An Italian study – ‘A way to resilience: How can Italian disadvantaged students and schools close the achievement gap?’ (Agasisti and Longobardi, undated) uses PISA 2009 data to examine the characteristics of resilient students attending schools with high levels of disadvantage.

This confirms some of the findings above in respect of student characteristics, finding a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.

School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.

A second Italian study – ‘Does public spending improve educational resilience? A longitudinal analysis of OECD PISA data’ (Agasisti et al 2014) finds a positive correlation between the proportion of a country’s public expenditure devoted to education and the proportion of resilient students.

Finally, this commentary from Marc Tucker in the US links its relatively low incidence of resilient students to national views about the nature of ability:

‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability.  This leads to much lower expectations for students who come from low-income families…

My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…

… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’

 

Conclusion

What are we to make of all this?

It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.

The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.

We are not alone in facing this difficulty, although it is significantly more pronounced than in several of our most prominent PISA competitors.

We should be worrying as much about our ‘short head’ as our ‘long tail’.

 

GP

September 2014

 

 

 

 

 

 

What Happened to the Level 6 Reading Results?

 

Provisional 2014 key stage 2 results were published on 28 August.

This brief supplementary post considers the Level 6 test results – in reading, in maths and in grammar, punctuation and spelling (GPS) – and how they compare with Level 6 outcomes in 2012 and 2013.

An earlier post, A Closer Look at Level 6, published in May 2014, provides a fuller analysis of these earlier results.

Those not familiar with the 2014 L6 test materials can consult the papers, mark schemes and level thresholds published online.

 

Number of Entries

Entry numbers for the 2014 Level 6 tests were published in the media in May 2014. Chart 1 below shows the number of entries for each test since 2012 (2013 in the case of GPS). These figures are for all schools, independent as well as state-funded.

 

L6 Sept chart 1

Chart 1: Entry rates for Level 6 tests 2012 to 2014 – all schools

 

In 2014, reading entries were up 36%, GPS entries up 52% and maths entries up 36%. There is as yet no indication of a backlash from the decision to withdraw Level 6 tests after 2015, though this may have an impact next year.

The postscript to A Closer Look estimated that, if entries continue to increase at current rates, 2015 might bring something approaching 120,000 entries in reading, 130,000 in GPS and 140,000 in maths.
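The arithmetic behind such projections is straightforward. Here is a minimal sketch in Python, assuming entries grow again in 2015 at the 2013-14 rates quoted above; the 2014 entry counts are illustrative placeholders only, not the actual SFR figures:

```python
# Project 2015 entries on the assumption that each test's entries grow again
# at the same rate as between 2013 and 2014.
growth_2014 = {'reading': 0.36, 'gps': 0.52, 'maths': 0.36}  # increases quoted above

def project_2015(entries_2014, growth):
    return {test: round(n * (1 + growth[test])) for test, n in entries_2014.items()}

entries_2014 = {'reading': 88_000, 'gps': 86_000, 'maths': 103_000}  # hypothetical
print(project_2015(entries_2014, growth_2014))
# -> {'reading': 119680, 'gps': 130720, 'maths': 140080}, i.e. close to the
#    120,000 / 130,000 / 140,000 estimates above
```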

Chart 2 shows the percentage of all eligible learners entered for Level 6 tests, again for all schools. Nationally, between one in six and one in five eligible learners are now entered for Level 6 tests. Entry rates for reading and maths have almost doubled since 2012.

 

L6 Sept chart 2

Chart 2: Percentage of eligible learners entered for Level 6 tests 2012 to 2014, all schools

 

Success Rates

The headline percentages in the statistical first release (SFR) show:

  • 0% achieving L6 reading (unchanged from 2013)
  • 4% achieving L6 GPS (up from 2% in 2013) and
  • 9% achieving L6 maths (up from 7% in 2013).

Local authority and regional percentages are also supplied.

  • Only in Richmond did the L6 pass rate in reading register above 0% (at 1%), so every region rounds to 0%.
  • For GPS the highest percentages are 14% in Richmond, 10% in Kensington and Chelsea and Kingston, 9% in Sutton and 8% in Barnet, Harrow and Trafford. Regional rates vary between 2% in Yorkshire and Humberside and 6% in Outer London.
  • In maths, Richmond recorded 22%, Kingston 19%, Trafford, Harrow and Sutton were at 18% and Kensington and Chelsea at 17%. Regional rates range from 7% in Yorkshire and Humberside and the East Midlands to 13% in Outer London.

Further insight into the national figures can be obtained by analysing the raw numbers supplied in the SFR.

Chart 3 shows how many of those entered for each test were successful in each year. Here there is something of a surprise.

 

L6 Sept chart 3

Chart 3: Percentage of learners entered achieving Level 6, 2012 to 2014, all schools

 

Nearly half of all entrants are now successful in L6 maths, though the improvement in the success rate has slowed markedly compared with the nine percentage point jump in 2013.

In GPS, the success rate has improved by nine percentage points between 2013 and 2014 and almost one in four entrants is now successful. Hence the GPS success rate is roughly half that for maths. This may be attributable in part to its shorter history, although the 2014 success rate is significantly below the rate for maths in 2013.

But in reading an already very low success rate has declined markedly, following a solid improvement in 2013 from a very low base in 2012. The 2014 success rate is now less than half what it was in 2012. Fewer than one in a hundred of those entered have passed this test.

Chart 4 shows how many learners were successful in the L6 reading test in 2014 compared with previous years, giving results for boys and girls separately.

 

L6 Sept chart 4

Chart 4: Percentage of learners entered achieving Level 6 in reading, 2012 to 2014, by gender

 

The total number of successful learners in 2014 is over 5% lower than in 2012, when the reading test was introduced, and down 62% on the number succeeding in 2013.

Girls appear to have suffered disproportionately from the decline in 2014 success rates. Their success rate is down 63%, while the decline for boys is slightly smaller, at 61%. The success rate for boys remains above its 2012 level but, for girls, it is about 12% down on 2012.

In 2012, only 22% of successful candidates were boys. This rose to 26% in 2013 and has again increased slightly, to 28%, in 2014. The gap between girls’ and boys’ performance remains substantially bigger than the corresponding gaps for GPS and maths.

Charts 5 and 6 give the comparable figures for GPS and maths respectively.

In GPS, the total number of successful entries has increased by almost 140% compared with 2013. Girls form a slightly lower proportion of this group than in 2013, their share falling from 62% to 60%. Boys are therefore beginning to close what remains a substantial performance gap.

 

L6 Sept chart 5

Chart 5: Percentage of learners entered achieving Level 6 in GPS, 2012 to 2014, by gender

 

In maths, the total number of successful entries is up by about 40% on 2013 and demonstrates rapid improvement over the three year period.

Compared with 2013, the success rate for girls has increased by 43%, whereas the corresponding increase for boys is closer to 41%. Boys formed 65% of the successful cohort in 2012, 61% in 2013 and 60% in 2014, so girls’ progress in narrowing this substantial performance gap is slowing.

 

L6 Sept chart 6

Chart 6: Percentage of learners entered achieving Level 6 in maths, 2012 to 2014, by gender

 

Progress

The SFR also provides a table, this time for state-funded schools only, showing the KS1 outcomes of those successful in achieving Level 6. (For maths and reading, this data includes those with a non-numerical grade in the test who have been awarded L6 via teacher assessment. The data for writing is derived solely from teacher assessment.)

Not surprisingly, over 94% of those achieving Level 6 in reading had achieved Level 3 in KS1, but 4.8% were at L2A and a single learner was recorded at Level 1. The proportion with KS1 Level 3 in 2013 was higher, at almost 96%.

In maths, however, only some 78% of those achieving Level 6 were at Level 3 in KS1. A further 18% were at 2A and almost 3% were at 2B. A further 165 learners were recorded at 2C or Level 1. In 2013, over 82% had KS1 L3 while almost 15% had 2A.

It seems, therefore, that KS1 performance was a slightly weaker indicator of KS2 Level 6 success in 2014 than in the previous year – a trend apparent in both reading and maths – and KS1 performance remains a significantly weaker indicator in maths than in reading.

 

Why did the L6 reading results decline so drastically?

Given that the number of entries for the Level 6 reading test increased dramatically, the declining pass rate suggests either a problematic test or that schools entered a higher proportion of learners who had relatively little chance of success. A third possibility is that the test was deliberately made more difficult.

The level threshold for the 2014 Level 6 reading test was 24 marks, compared with 22 marks in 2013, but there are supposed to be sophisticated procedures in place to ensure that standards are maintained. We should be able to discount the third possibility.

The second possibility is also unlikely to be significant, since schools are strongly advised only to enter learners who are already demonstrating attainment beyond KS2 Level 5. There is no benefit to learners or schools from entering pupils for tests that they are almost certain to fail.

The existing pass rate was very low, but it was on an upward trajectory. Increasing familiarity with the test ought to have improved schools’ capacity to enter the right learners and to prepare them to pass it.

That leaves only the first possibility – something must have been wrong with the test.

Press coverage from May 2014, immediately after the test was administered, explained that learners and invigilators had been given contradictory instructions about the length of time available for answering questions.

The paper gave learners one hour for completion, while invigilators were told pupils had 10 minutes’ reading time followed by 50 minutes in which to answer the questions. Schools interpreted this contradiction differently and several reported disruption to the examination as a consequence.

The NAHT was reported to have written to the Standards and Testing Agency:

‘…asking for a swift review into this error and to seek assurance that no child will be disadvantaged after having possibly been given incorrect advice on how to manage their time and answers’.

The STA statement says:

‘We apologise for this error. All children had the same amount of time to complete the test and were able to consult the reading booklet at any time. We expect it will have taken pupils around 10 minutes to read the booklet, so this discrepancy should not have led to any significant advantage for those pupils where reading time was not correctly allotted.’

NAHT has now posted the reply it received from STA on 16 May. It says:

‘Ofqual, our regulator, is aware of the error and of the information set out below and will, of course, have to independently assure itself that the test remains valid. We would not expect this to occur until marking and level setting processes are complete, in line with their normal timescales.’

It then sets out the reasons why it believes the test remains valid. These suggest the advantage to the learners following the incorrect instructions was minimal since:

  • few would need less than 10 minutes’ reading time;
  • pre-testing showed 90% of learners completed the test within 50 minutes;
  • in 2013 only 3.5% of learners were within 1 or 2 marks of the threshold;
  • a comparability study of a change to the timing of the Levels 3-5 test showed little difference in item difficulty.

NAHT says it will now review the test results in the light of this response.

 

Who is responsible?

According to its most recent business plan, STA:

‘is responsible for setting and maintaining test standards’ (p3)

but it publishes little or nothing about the process involved, or how it handles representations such as that from NAHT.

Meanwhile, Ofqual says its role is:

‘to make sure the assessments are valid and fit for purpose, that the assessments are fair and manageable, that the standards are properly set and maintained and the results are used appropriately.

We have two specific objectives as set out by law:

  • to promote assessment arrangements which are valid, reliable and comparable
  • to promote public confidence in the arrangements.

We keep national assessments under review at all times. If we think at any point there might be a significant problem with the system, then we notify the Secretary of State for Education.’

Ofqual’s Chair has confirmed via Twitter that Ofqual was:

‘made aware at the time, considered the issues and observed level setting’.

Ofqual was content that the level-setting was properly undertaken.

 

I asked whether, in the light of that, Ofqual saw a role for itself in investigating the atypical results. I envisaged that this might take place under the Regulatory Framework for National Curriculum Assessments (2011).

This commits Ofqual to publishing annually its ‘programme for reviewing National Assessment arrangements’ (p14) as well as ‘an annual report on the outcomes of the review programme’ (p18).

However, the most recent of these relates to 2011/12 and appeared in November of that year.

 

I infer from this that we may see some reaction from Ofqual, if and when it finally produces an annual report on National Curriculum Assessments in 2014, but that’s not going to appear before 2015 at the earliest.

I can’t help but feel that this is not quite satisfactory – that atypical test performance of this magnitude ought to trigger an automatic and transparent review, even if the overall number of learners affected is comparatively small.

If I were part of the system I would want to understand promptly exactly what happened, for fear that it might happen again.

If you are in any doubt quite how out of kilter the reading test outcomes were, consider the parallel results for Level 6 teacher assessment.

In 2013, 5,698 learners were assessed at Level 6 in reading through teacher assessment – almost exactly two-and-a-half times as many as achieved Level 6 in the test.

In 2014, a whopping 17,582 learners were assessed at Level 6 through teacher assessment, around 20 times as many as secured a Level 6 in the reading test.

If the ratio between test and teacher assessment results in 2014 had been the same as it was in 2013, the number successful on the test would have been over 7,000, eight-fold higher than the reported 851.
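For the sceptical, that calculation can be reproduced from figures quoted in this post alone (the 2013 test figure is inferred from the ‘two-and-a-half times’ ratio above, rather than taken directly from the SFR):

```python
# Back-of-envelope check of the claim above.
ta_2014 = 17_582    # L6 reading awarded via teacher assessment, 2014
test_2014 = 851     # L6 reading test passes, 2014
ratio_2013 = 2.5    # 2013 TA awards were ~2.5 times the 2013 test passes

expected_test_2014 = ta_2014 / ratio_2013        # if the 2013 ratio had held
print(round(expected_test_2014))                 # -> 7033, i.e. 'over 7,000'
print(round(expected_test_2014 / test_2014, 1))  # -> 8.3, i.e. roughly eight-fold
```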

I rest my case.

 

The new regime

In February 2013, a DfE-commissioned report Investigation of Key Stage 2 Level 6 Tests recommended that:

‘There is a need to review whether the L6 test in Reading is the most appropriate test to use to discriminate between the highest ability pupils and others given:

a) that only around 0.3 per cent of the pupils that achieved at least a level 5 went on to achieve a level 6 in Reading compared to 9 per cent for Mathematics

b) there was a particular lack of guidance and school expertise in this area

c) pupil maturity was seen to be an issue

d) the cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits.’

This has been overtaken by the decision to withdraw all three Level 6 tests and to rely on single tests of reading, GPS and maths for all learners when the new assessment regime is introduced from 2016.

Draft test frameworks were published in March 2014, supplemented in July by sample questions, mark schemes and commentary.

Given the imminent introduction of this new regime, together with schools’ experience in 2014, it seems increasingly unlikely that 2015 Level 6 test entries in reading will approach the 120,000 figure suggested by the trend.

Perhaps more importantly, schools and assessment experts alike seem remarkably sanguine about the prospect of single tests for pupils demonstrating the full range of prior attainment, apart from those assessed via the P-Scales. (The draft test frameworks are worryingly vague about whether those operating at the equivalent of Levels 1 and 2 will be included.)

I could wish to be equally sanguine, on behalf of all those learners capable of achieving at least the equivalent of Level 6 after 2015. But, as things stand, the evidence to support that position is seemingly non-existent.

In October 2013, Ofqual commented that:

‘There are also some significant technical challenges in designing assessments which can discriminate effectively and consistently across the attainment range so they can be reported at this level of precision.’

A year on, we still have no inkling whether those challenges have been overcome.

 

GP

September 2014

 


Why Can’t We Have National Consensus on Educating High Attainers?

 

This post proposes a statement of core principles to provoke debate and ultimately build consensus about the education of high attaining learners.

It incorporates an Aunt Sally – admittedly imperfect, provocative and prolix – to illustrate the concept and stimulate initial thinking about what such a statement might contain.

The principles are designed to underpin effective provision. They are intended to apply at every level of the education system (whether national, regional or local) and to every learning setting and age group, from entry to Reception to admission to higher education (or equivalent) and all points in between.

Alongside the draft core principles – which should have more or less global application – I offer a complementary set of ‘reform principles’ which are specific to the English context and describe how our national education reform programme might be harnessed and applied more consistently to support high attainers.

This is expressed in system-wide terms, but could be translated fairly straightforwardly into something more meaningful for schools and colleges.

 

Justification

As education reforms continue to be developed and implemented at a rapid pace, it is essential that they fit together coherently. The various reforms must interlock smoothly, so that the whole is greater than the sum of its parts.

Coherence must be achieved across three dimensions:

  • Horizontally, across the span of education policy.
  • Vertically, across the age range, taking in the primary, secondary and tertiary sectors.
  • Laterally, for each and every learning setting to which it applies.

There is a risk that such co-ordination becomes more approximate as capacity is stretched by the sheer weight of reform, especially if the central resource traditionally devoted to this task is contracting simultaneously.

In an increasingly bottom-up system, some of the responsibility for ensuring the ‘fit’ across the span of education reforms can be devolved from the centre, initially to a range of intermediary bodies and ultimately to learning settings themselves.

Regardless of where the responsibility lies, there can be a tendency to cut corners, by making these judgements with reference to some notional average learner. But this ignores the needs and circumstances of atypical constituencies including high attainers.

High attainers may even find themselves at the bottom of the pecking order amongst these atypical constituencies, typically as a consequence of the misguided view that they are more or less self-sufficient educationally speaking.

A framework of sorts is necessary to support this process, to protect against the risk that high attainers may otherwise be short-changed and also to ensure flexibility of provision within broad but common parameters.

The Government has recently set a precedent by publishing a set of Assessment Principles ‘to underpin effective assessment systems within schools’.

This post applies that precedent to support the education of high attainers, providing a flexible framework, capable of adoption (with adaptation where necessary) by all the different bodies and settings engaged in this process.

 

The English policy context

I have sought to incorporate in the second set of ‘reform’ principles the full range of areas explored by this blog, which began life at roughly the same time as the present Government began its education reform programme.

They are designed to capture the reform agenda now, as we draw to the close of the 2013/14 academic year. They highlight aspects of reform that are likely to be dominant over the next three academic years, subject of course to any adjustments to the reform programme in the light of the 2015 General Election.

These include:

  • Introduction of a new national curriculum incorporating both greater challenge and greater flexibility, together with full exemption for academies.
  • Introduction of new assessment arrangements, including internal assessment in schools following the withdrawal of national curriculum levels and external assessment arrangements, particularly at the end of KS2.
  • Introduction of revised GCSE and A level qualifications, including a new recalibrated grading system for GCSE.
  • Radical changes to the accountability system, including the reporting of learners’ achievement and the inspection of provision in different learning settings. 
  • Ensuring that the Pupil Premium drives accelerated progress in closing attainment gaps between disadvantaged and advantaged learners.
  • Ensuring accelerated progress against updated social mobility indicators, including improvements in fair access to selective universities.
  • Strengthening system-wide collaboration, ensuring that new types of institution play a significant role in this process, developing subject-specific support networks (especially in STEM) and building the capacity and reach of teaching school alliances.

 

Process

The Aunt Sally might be used as a starting point by a small group charged with generating a viable draft set of principles, either stand-alone or supported by any additional scaffolding deemed necessary.

The preparation of the draft core principles would itself be a consensus-establishing exercise, helping to distinguish areas of agreement and critical sticking points requiring negotiation to resolve.

This draft might be issued for consultation for a fixed period. Responses would be sought directly from a range of key national organisations, all of which would subsequently be invited to endorse formally the final version, revised in the light of consultation.

This stage might entail some further extended negotiation, but the process itself would help to raise the profile of the issue.

Out in the wider system, educators might be encouraged to interact with the final version of the principles, to discuss and record how they might be adjusted or qualified to fit their own particular settings.

There might be an online repository and forum (using a free online platform) enabling educators to discuss their response to the principles, suggest localised adjustments and variants to fit their unique contexts, provide exemplification and share supporting resources, materials and links.

Some of the key national organisations might be encouraged to develop programmes and resources within their own purlieux which would link explicitly with the core principles.

Costs would be limited to the human resource necessary to co-ordinate the initial task and subsequently curate the online repository.

 

Provisos

The focus on high attainment (as a subset of high achievement) has been selected in preference to any categorisation of high ability, talent or giftedness because there are fewer definitional difficulties, the terminology is less problematic and there should be a correspondingly stronger chance of reaching consensus.

I have not at this stage included a definition of high attainers. Potentially one could adopt the definition used in the Primary and Secondary Performance Tables, or an alternative derived from Ofsted’s ‘most able’ concept.

The PISA high achievement benchmarks could be incorporated, so permitting England to compare its progress with other countries.

But, since we are working towards new attainment measures at the end of KS2 and KS4 alike, it may be more appropriate to develop a working definition based on what we know of those measures, adapting the definition as necessary once the measures are themselves more fully defined.

In the two sections following I have set out the two parts of my Aunt Sally:

  • A set of ten core principles, designed to embody a shared philosophy underpinning the education of high attainers and
  • A parallel set of ten reform principles, designed to show how England’s education reform agenda might be adapted and applied to support the education of high attainers.

As noted above, I have cast the latter in system-wide terms, hopefully as a precursor to developing a version that will apply (with some customisation) to every learning setting. I have chosen deliberately to set out the big picture from which these smaller versions might be derived.

My Aunt Sally is imbued with a personal belief in the middle way between a bottom-up, school-driven and market-based system on one hand and a rigid, top-down and centrally prescribed system on the other. The disadvantages of the latter still live in the memory, while those of the former are writ large in the current crisis.

Some of this flavour will be obvious below, especially in the last two reform principles, which embody what I call ‘flexible framework thinking’. You will need to make some allowances if you are of a different persuasion.

I have also been deliberately a little contentious in places, so as to stimulate reaction in readers. The final version will need to be more felicitously worded, but it should still be sharp enough to have real meaning and impact.

For there is no point in generating an anodyne ‘motherhood and apple pie’ statement that has no prospect of shifting opinion and behaviour in the direction required.

Finally, the current text is too long-winded, but I judged it necessary to include some broader context and signposting for those coming to this afresh. I am hopeful that, when this is shorn away, the slimmed-down version will be closer to its fighting weight.

 

Ten Core Principles

This section sets out ten essential principles that all parts of the education system should follow in providing for high achievers.

 

  1. Raising achievement – within the education system as a whole and for each and every learner – is one of the principal aims of education. It does not conflict with other aims, or with our duty to promote learners’ personal and social development, or their health, welfare and well-being.

 

  2. Securing high achievement – increasing the proportion of high achievers and raising the achievement of existing high achievers – is integral to this aim.

 

  3. Both existing and potential high achievers have a right, equal to that of all other learners, to the blend of challenge and support they need to improve further – to become the best that they can be. No learner should be discriminated against educationally on the basis of their prior achievement, whether high or low or somewhere in between.

 

  4. We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.

 

  5. Securing high attainment is integral to securing high achievement. The route to high attainment may involve any or all of greater breadth, increased depth and a faster pace of learning. These elements should be prioritised and combined appropriately to meet each learner’s needs; a one-size-fits-all solution should not be imposed, nor should any of these elements be ruled out automatically.

 

  6. There must be no artificial ceilings or boundaries restricting high attainment, whether imposed by chronological age or by the expertise available in the principal learning setting; equally, there must be no ‘hot-housing’, resulting from an imbalance between challenge and support and an associated failure to respond with sensitivity to the learner’s wider needs.

 

  7. High attainers are an extremely diverse and disparate population. Some are much higher attainers than others. Some may be ‘all-rounders’ while others have particular strengths and areas for development. All need the right blend of challenge and support to improve alike in areas of strength and any areas of comparative weakness.

 

  8. Amongst the high-attaining population there is significant over-representation of some learner characteristics. But there is also significant diversity, resulting from the interaction between gender, special needs, ethnic and socio-economic background (and several other characteristics besides). This diversity can and should increase as excellence gaps are closed.

 

  9. Educators must guard against the false assumption that high attainment is a corollary of advantage. Equally, they must accept that, while effective education can make a significant difference, external factors beyond their control will also impact upon high attainment. The debate about the relative strength of genetic and environmental influences is irrelevant, except insofar as it obstructs universally high expectations and instilling a positive ‘growth mindset’ in all learners.

 

  10. High attainers cannot meet their own educational needs without the support of educators. Nor is it true that they have no such needs by virtue of their prior attainment. Investment in their continued improvement is valuable to them as individuals, but also to the country as a whole, economically, socially and culturally.

 

Ten Reform Principles

This section describes how different elements of educational reform might be harnessed to ensure a coherent, consistent and mutually supportive strategy for increasing high attainment.

The elements below are described in national system-wide terms, as they apply to the primary and secondary school sectors, but each should be capable of adjustment so it is directly relevant at any level of the system and to every learning setting.

 

  1. Revised national curriculum arrangements offer greater flexibility to design school curricula to meet high attainers’ needs. ‘Top down’ curriculum design, embodying the highest expectations of all learners, is preferable to a ‘deficit model’ approach derived from lowest common denominator thresholds. Exemplary models should be developed and disseminated to support schools in developing their own.

 

  2. The assessment system must enable high attainers to show what they know, understand and can do. Their needs should not be overlooked in the pursuit of universally applicable assessment processes. Formative assessment must provide accurate, constructive feedback and sustain high expectations, regardless of the starting point. Internal and external assessment alike must be free of undesirable ceiling effects.

 

  3. Regardless of their school, all high attainers should have access to opportunities to demonstrate excellence through national assessments and public examinations, including Level 6 assessment (while it exists) and early entry (where it is in their best interests). Progression across transition points – eg primary to secondary – should not require unnecessary repetition and reinforcement. It should be pre-planned, monitored and kept under review.

 

  4. High attainment measures should feature prominently when results are reported, especially in national School and College Performance Tables, but also on school websites and in the national data portal. Reporting should reveal clearly the extent of excellence gaps between the performance of advantaged and disadvantaged high attainers respectively.

 

  5. Ofsted’s inspection framework now focuses on the attainment and progress of ‘the most able’ in every school. Inspectors should adopt a consistent approach to judging all settings’ provision for high attainers, including explicit focus on disadvantaged high attainers. Inspectors and settings alike would benefit from succinct guidance on effective practice.

 

  6. The impact of the Pupil Premium on closing excellence gaps should be monitored closely. Effective practice should be captured and shared. The Education Endowment Foundation should ensure that impact on excellence gaps is mainstreamed within all its funded programmes and should also stimulate and support programmes dedicated to closing excellence gaps.

 

  7. The closing of excellence gaps should improve progression for disadvantaged high attainers, including to selective secondary, tertiary and higher education. Destination indicators should enable comparison of institutional success in this regard. Disadvantaged high attainers need access to tailored IAG to support fair access at every level. Targeted outreach to support effective transition is also essential at each transition point (typically 11, 16 and 18). Universities should be involved from KS2 onwards. The relevant social mobility measures should align with Pupil Premium ‘eligibility’. Concerted corrective action is required to improve progress whenever and wherever it stalls.

 

  8. System-wide collaboration is required to drive improvement. It must include all geographical areas, educational sectors and institutional types, including independent and selective schools. All silos – whether associated with localities, academy chains, teaching school alliances, subject specialism or any other subset of provision – must be broken down. This requires joint action by educational settings, voluntary sector organisations and private sector providers alike. Organisations active in the field must stop protecting their fiefdoms and work together for the common good.

 

  9. To minimise fragmentation and patchiness of provision, high attaining learners should have guaranteed access to a menu of opportunities organised within a coherent but flexible framework. Their schools, as lead providers, should facilitate and co-ordinate on their behalf. A similar approach is required to support educators with relevant school improvement, initial training, professional development and research. To support this parallel framework, both theoretical and practical knowledge of the ‘pedagogy of high attainment’ should be collected, organised and shared.

 

  10. All providers should be invited to position their services within these frameworks, using intelligence about the balance between demand and supply to inform the development of new products and services. Responsibility for overseeing the frameworks and for monitoring and reporting progress should be allocated to an independent entity within this national community. As far as possible this should be a self-funding and self-sustaining system.

 

Next Steps

I have already had some welcome interest in developing a set of core principles to support the education of high attaining learners.

This may be a vehicle to stimulate a series of useful partnerships, but it would be premature to publicise these preliminary discussions for fear that they do not reach fruition.

This post is intended to stimulate others to consider the potential benefits of such an approach – and I am at your service should you wish to discuss the idea further.

But even if I have only caused you to reflect more deeply on your own contribution to the education of high attainers, the effort will have been worthwhile.

GP

May 2014

 

 

‘Poor but Bright’ v ‘Poor but Dim’

 

This post explores whether, in supporting learners from disadvantaged backgrounds, educators should prioritise low attainers over high attainers, or give them equal priority.

 


Introduction

Last week I took umbrage at a blog post and found myself engaged in a Twitter discussion with the author, one Mr Thomas.

 

 

Put crudely, the discussion hinged on the question whether the educational needs of ‘poor but dim’ learners should take precedence over those of the ‘poor but bright’. (This is Mr Thomas’s shorthand, not mine.)

He argued that the ‘poor but dim’ are the higher priority; I countered that all poor learners should have equal priority, regardless of their ability and prior attainment.

We began to explore the issue:

  • as a matter of educational policy and principle
  • with reference to inputs – the allocation of financial and human resources between these competing priorities and
  • in terms of outcomes – the comparative benefits to the economy and to society from investment at the top or the bottom of the attainment spectrum.

This post presents the discussion, adding more flesh and gloss from the Gifted Phoenix perspective.

It might or might not stimulate some interest in how this slightly different take on a rather hoary old chestnut plays out in England’s current educational landscape.

But I am particularly interested in how gifted advocates in different countries respond to these arguments. What is the consensus, if any, on the core issue?

Depending on the answer to this first question, how should gifted advocates frame the argument for educationalists and the wider public?

To help answer the first question I have included a poll at the end of the post.

Do please respond to that – and feel free to discuss the second question in the comments section below.

The structure of the post is fairly complex, comprising:

  • A (hopefully objective) summary of Mr Thomas’s original post.
  • An embedded version of the substance of our Twitter conversation. I have removed some Tweets – mostly those from third parties – and reordered a little to make this more accessible. I don’t believe I’ve done any significant damage to either case.
  • Some definition of terms, because there is otherwise much cause for confusion as we push further into the debate.
  • A digressionary exploration of the evidence base, dealing with attainment data and budget allocations respectively. The former exposes what little we are told about how socio-economic gaps vary across the attainment spectrum; the latter is relevant to the discussion of inputs. Those pressed for time may wish to proceed directly to…
  • …A summing up, which expands in turn the key points we exchanged on the point of principle, on inputs and on outcomes respectively.

I have reserved until close to the end a few personal observations about the encounter and how it made me feel.

And I conclude with the customary brief summary of key points and the aforementioned poll.

It is an ambitious piece and I am in two minds as to whether it hangs together properly, but you are ultimately the judges of that.

 

What Mr Thomas Blogged

The post was called ‘The Romance of the Poor but Bright’ and the substance of the argument (incorporating several key quotations) ran like this:

  • The ‘effort and resources, of schools but particularly of business and charitable enterprise, are directed disproportionately at those who are already high achieving – the poor but bright’.
  • Moreover ‘huge effort is expended on access to the top universities, with great sums being spent to make marginal improvements to a small set of students at the top of the disadvantaged spectrum. They cite the gap in entry, often to Oxbridge, as a significant problem that blights our society.’
  • This however is ‘the pretty face of the problem. The far uglier face is the gap in life outcomes for those who take least well to education.’
  • ‘Popular discourse is easily caught up in the romance of the poor but bright’ but ‘we end up ignoring the more pressing problem – of students for whom our efforts will determine whether they ever get a job or contribute to society’. For ‘when did you last hear someone advocate for the poor but dim?’
  • ‘The gap most damaging to society is in life outcomes for the children who perform least well at school.’ Three areas should be prioritised to improve their educational outcomes:

o   Improving alternative provision (AP) which ‘operates as a shadow school system, largely unknown and wholly unappreciated’ - ‘developing a national network of high–quality alternative provision…must be a priority if we are to close the gap at the bottom’.

o   Improving ‘consistency in SEN support’ because ‘schools are often ill equipped to cope with these, and often manage only because of the extraordinary effort of dedicated staff’. There is ‘inconsistency in funding and support between local authorities’.

o   Introducing clearer assessment of basic skills, ‘so that a student could not appear to be performing well unless they have mastered the basics’.

  • While ‘any student failing to meet their potential is a dreadful thing’, the educational successes of ‘students with incredibly challenging behaviour’ and ‘complex special needs…have the power to change the British economy, far more so than those of their brighter peers.’

A footnote adds ‘I do not believe in either bright or dim, only differences in epigenetic coding or accumulated lifetime practice, but that is a discussion for another day.’

Indeed it is.

 

Our ensuing Twitter discussion

The substance of our Twitter discussion is captured in the embedded version immediately below. (Scroll down to the bottom for the beginning and work your way back to the top.)

 

 

Defining Terms

 

Poor, Bright and Dim

I take poor to mean socio-economic disadvantage, as opposed to any disadvantage attributable to the behaviours, difficulties, needs, impairments or disabilities associated with AP and/or SEN.

I recognise of course that such a distinction is more theoretical than practical, because, when learners experience multiple causes of disadvantage, the educational response must be holistic rather than disaggregated.

Nevertheless, the meaning of ‘poor’ is clear – that term cannot be stretched to include these additional dimensions of disadvantage.

The available performance data foregrounds two measures of socio-economic disadvantage: current eligibility for and take up of free school meals (FSM) and qualification for the deprivation element of the Pupil Premium, determined by FSM eligibility at some point within the last 6 years (known as ‘ever-6’).
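To make the distinction concrete, here is a minimal illustrative sketch (my own formulation, not any official algorithm), treating the two measures as simple predicates over a pupil’s year-by-year FSM eligibility history:

```python
# Each history is a list of booleans, one per year, True = FSM-eligible that year.

def fsm_now(history):
    """Current FSM eligibility - the narrower measure."""
    return history[-1]

def ever6(history):
    """FSM-eligible at any point in the last six years ('ever-6') - the basis
    of the deprivation element of the Pupil Premium."""
    return any(history[-6:])

pupil = [False, True, False, False, False, False, False]  # hypothetical pupil
print(fsm_now(pupil), ever6(pupil))  # -> False True
```

A pupil can therefore count as disadvantaged on the ever-6 measure while falling outside the current FSM measure, which is why the two sets of gap statistics are not interchangeable.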

Both are used in this post. Distinctions are typically between disadvantaged and non-disadvantaged learners, though some of the supporting data compares outcomes for disadvantaged learners with outcomes for all learners, advantaged and disadvantaged alike.

The gaps that need closing are therefore:

  • between ‘poor and bright’ and other ‘bright’ learners (The Excellence Gap) and 
  • between ‘poor and dim’ and other ‘dim’ learners. I will christen this The Foundation Gap.

The core question is whether The Foundation Gap takes precedence over The Excellence Gap or vice versa, or whether they should have equal billing.

This involves immediate and overt recognition that classification as AP and/or SEN is not synonymous with the epithet ‘poor’, because there are many comparatively advantaged learners within these populations.

But such a distinction is not properly established in Mr Thomas’s blog, which applies the epithet ‘poor’ but then treats the AP and SEN populations as homogeneous and somehow associated with it.

 

By ‘dim’ I take Mr Thomas to mean the lowest segment of the attainment distribution – one of his tweets specifically mentions ‘the bottom 20%’. The AP and/or SEN populations are likely to be disproportionately represented within these two deciles, but they are not synonymous with it either.

This distinction will not be lost on gifted advocates who are only too familiar with the very limited attention paid to twice exceptional learners.

Those from poor backgrounds within the AP and/or SEN populations are even more likely to be disproportionately represented in ‘the bottom 20%’ than their more advantaged peers, but even they will not constitute the entirety of ‘the bottom 20%’. A Venn diagram would likely show significant overlap, but that is all.

Hence disadvantaged AP/SEN are almost certainly a relatively poor proxy for the ‘poor but dim’.

That said I could find no data that quantifies these relationships.

The School Performance Tables distinguish a ‘low attainer’ cohort. (In the Secondary Tables the definition is determined by prior KS2 attainment and in the Primary Tables by prior KS1 attainment.)

These populations comprise some 15.7% of the total population in the Secondary Tables and about 18.0% in the Primary Tables. But neither set of Tables applies the distinction in their reporting of the attainment of those from disadvantaged backgrounds.

 

It follows from the definition of ‘dim’ that, by ‘bright’, Mr Thomas probably intends the two corresponding deciles at the top of the attainment distribution (even though he seems most exercised about the subset with the capacity to progress to competitive universities, particularly Oxford and Cambridge. This is a far more select group of exceptionally high attainers – and an even smaller group of exceptionally high attainers from disadvantaged backgrounds.)

A few AP and/or SEN students will likely fall within this wider group, fewer still within the subset of exceptionally high attainers. AP and/or SEN students from disadvantaged backgrounds will be fewer again, if indeed there are any at all.

The same issues with data apply. The School Performance Tables distinguish ‘high attainers’, who constitute over 32% of the secondary cohort and 25% of the primary cohort. As with low attainers, we cannot isolate the performance of those from disadvantaged backgrounds.

We are forced to rely on what limited data is made publicly available to distinguish the performance of disadvantaged low and high attainers.

At the top of the distribution there is a trickle of evidence about performance on specific high attainment measures and access to the most competitive universities. Still greater transparency is fervently to be desired.

At the bottom, I can find very little relevant data at all – we are driven inexorably towards analyses of the SEN population, because that is the only dataset differentiated by disadvantage, even though we have acknowledged that such a proxy is highly misleading. (Equivalent AP attainment data seems conspicuous by its absence.)

 

AP and SEN

Before exploring these datasets I ought to provide some description of the different programmes and support under discussion here, if only for the benefit of readers who are unfamiliar with the English education system.

 

Alternative Provision (AP) is intended to meet the needs of a variety of vulnerable learners:

‘They include pupils who have been excluded or who cannot attend mainstream school for other reasons: for example, children with behaviour issues, those who have short- or long-term illness, school phobics, teenage mothers, pregnant teenagers, or pupils without a school place.’

AP is provided in a variety of settings where learners engage in timetabled education activities away from their school and school staff.

Providers include further education colleges, charities, businesses, independent schools and the public sector. Pupil Referral Units (PRUs) are perhaps the best-known settings – there are some 400 nationally.

A review of AP was undertaken by Taylor in 2012 and the Government subsequently embarked on a substantive improvement programme. This rather gives the lie to Mr Thomas’s contention that AP is ‘largely unknown and wholly unappreciated’.

Taylor complains of a lack of reliable data about the number of learners in AP but notes that the DfE’s 2011 AP census recorded 14,050 pupils in PRUs and a further 23,020 in other settings on a mixture of full-time and part-time placements. This suggests a total of slightly over 37,000 learners, though the FTE figure is unknown.

He states that AP learners are:

‘…twice as likely as the average pupil to qualify for free school meals’

A supporting Equality Impact Assessment qualifies this somewhat:

‘In Jan 2011, 34.6% of pupils in PRUs and 13.8%* of pupils in other AP, were eligible for and claiming free school meals, compared with 14.6% of pupils in secondary schools. [*Note: in some AP settings, free school meals would not be available, so that figure is under-stated, but we cannot say by how much.]’

If the PRU population is typical of the wider AP population, approximately one third qualify under this FSM measure of disadvantage, meaning that the substantial majority are not ‘poor’ according to our definition above.

Taylor confirms that overall GCSE performance in AP is extremely low, pointing out that in 2011 just 1.4% achieved five or more GCSE grades A*-C including [GCSEs in] maths and English, compared to 53.4% of pupils in all schools.

By 2012/13 the comparable percentages were 1.7% and 61.7% respectively (the latter for all state-funded schools), suggesting an increasing gap in overall performance. This is a cause for concern but not directly relevant to the issue under consideration.

The huge disparity is at least partly explained by the facts that many AP students take alternative qualifications and that the national curriculum does not apply to PRUs.

Data is available showing the full range of qualifications pursued. Taylor recommended that all students in AP should continue to receive ‘appropriate and challenging English and Maths teaching’.

Interestingly, he also pointed out that:

‘In some PRUs and AP there is no provision for more able pupils who end up leaving without the GCSE grades they are capable of earning.’

However, he fails to offer a specific recommendation to address this point.

 

Special Educational Needs (SEN) are needs or disabilities that affect children’s ability to learn. These may include behavioural and social difficulties, learning difficulties or physical impairments.

This area has also been subject to a major Government reform programme now being implemented.

There is significant overlap between AP and SEN, with Taylor’s review of the former noting that the population in PRUs is 79% SEN.

We know from the 2013 SEN statistics that 12.6% of all pupils on roll at PRUs had SEN statements and 68.9% had SEN without statements. But these populations represent only a tiny proportion of the total SEN population in schools.

SEN learners also have higher than typical eligibility for FSM. In January 2013, 30.1% of pupils with SEN (across all categories and across primary, secondary and special schools) were FSM-eligible, roughly twice the rate for all pupils. However, this means that almost seven in ten are not caught by the definition of ‘poor’ provided above.

In 2012/13 23.4% of all SEN learners achieved five or more GCSEs at A*-C or equivalent, including GCSEs in English and maths, compared with 70.4% of those having no identified SEN – another significant overall gap, but not directly relevant to our comparison of the ‘poor but bright’ and the ‘poor but dim’.

 

Data on socio-economic attainment gaps across the attainment spectrum

Those interested in how socio-economic attainment gaps vary at different attainment levels cannot fail to be struck by how little material of this kind is published, particularly in the secondary sector, when such gaps tend to increase in size.

One cannot entirely escape the conviction that this reticence deliberately masks some inconvenient truths.

  • The ideal would be to have the established high/middle/low attainer distinctions mapped directly onto performance by advantaged/disadvantaged learners in the Performance Tables but, as we have indicated, this material is conspicuous by its absence. Perhaps it will appear in the Data Portal now under development.
  • Our next best option is to examine socio-economic attainment gaps on specific attainment measures that will serve as decent proxies for high/middle/low attainment. We can do this to some extent but the focus is disproportionately on the primary sector because the Secondary Tables do not include proper high attainment measures (such as measures based exclusively on GCSE performance at grades A*/A). Maybe the Portal will come to the rescue here as well. We can however supply some basic Oxbridge fair access data.
  • The least preferable option is to deploy our admittedly poor proxies for low attainers – SEN and AP. But there isn’t much information from this source either.

The analysis below looks consecutively at data for the primary and secondary sectors.

 

Primary

We know, from the 2013 Primary School Performance Tables, that the percentages of disadvantaged and other learners achieving different KS2 levels in reading, writing and maths combined, in 2013 and 2012 respectively, were as follows:

 

Table 1: Percentage of disadvantaged and all other learners achieving each national curriculum level at KS2 in 2013 in reading, writing and maths combined

            L3 or below   |  L4 or above   |  L4B or above  |  L5 or above
            Dis  Oth  Gap |  Dis  Oth  Gap |  Dis  Oth  Gap |  Dis  Oth  Gap
2013        13   5    +8  |  63   81   -18 |  49   69   -20 |  10   26   -16
2012        x    x    x   |  61   80   -19 |  x    x    x   |  9    24   -15

 

This tells us relatively little, apart from the fact that disadvantaged learners are heavily over-represented at L3 and below and heavily under-represented at L5 and above.

The L5 gap is somewhat lower than the gaps at L4 and 4B respectively, but not markedly so. However, the L5 gap has widened slightly since 2012 while the reverse is true at L4.

This next table synthesises data from SFR51/13: ‘National curriculum assessments at key stage 2: 2012 to 2013’. It also shows gaps for disadvantage, as opposed to FSM gaps.

 

Table 2: Percentage of disadvantaged and all other learners achieving each national curriculum level, including differentiation by gender, in each 2013 end of KS2 test

                 L3         |     L4         |     L4B        |     L5         |    L6
             D   O   Gap    |  D   O   Gap   |  D   O   Gap   |  D   O   Gap   |  D  O   Gap
Reading All  12  6   +6     |  48  38  +10   |  63  80  -17   |  30  50  -20   |  0  1   -1
        B    13  7   +6     |  47  40  +7    |  59  77  -18   |  27  47  -20   |  0  0   0
        G    11  5   +6     |  48  37  +11   |  67  83  -16   |  33  54  -21   |  0  1   -1
GPS     All  28  17  +11    |  28  25  +3    |  52  70  -18   |  33  51  -18   |  1  2   -1
        B    30  20  +10    |  27  27  0     |  45  65  -20   |  28  46  -18   |  0  2   -2
        G    24  13  +11    |  28  24  +4    |  58  76  -18   |  39  57  -18   |  1  3   -2
Maths   All  16  9   +7     |  50  41  +9    |  62  78  -16   |  24  39  -15   |  2  8   -6
        B    15  8   +7     |  48  39  +9    |  63  79  -16   |  26  39  -13   |  3  10  -7
        G    17  9   +8     |  52  44  +8    |  61  78  -17   |  23  38  -15   |  2  7   -5

 

This tells a relatively consistent story across each test and for boys as well as girls.

We can see that, at Level 4 and below, learners from disadvantaged backgrounds are clearly over-represented, perhaps with the exception of L4 GPS. But at L4B and above they are very much under-represented.

Moreover, with the exception of L6 where low percentages across the board mask the true size of the gaps, disadvantaged learners tend to be significantly more under-represented at L4B and above than they are over-represented at L4 and below.

A different way of looking at this data is to compare the percentages of advantaged and disadvantaged learners respectively at L4 and L5 in each assessment (the short check after the bullets below reproduces these differences from Table 2).

  • Reading: Amongst disadvantaged learners the proportion at L5 is 18 percentage points lower than the proportion at L4, but amongst advantaged learners the proportion at L5 is 12 percentage points higher than at L4.
  • GPS: Amongst disadvantaged learners the proportion at L5 is 5 percentage points higher than the proportion at L4, but amongst advantaged learners the proportion at L5 is 26 percentage points higher than at L4.
  • Maths: Amongst disadvantaged learners the proportion at L5 is 26 percentage points lower than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is only 2 percentage points lower than at L4.
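A minimal sketch, taking the figures straight from the ‘All’ rows of Table 2 above:

```python
# L5 minus L4 proportions (percentage points), from the 'All' rows of Table 2.
table2_all = {
    'Reading': {'dis': {'L4': 48, 'L5': 30}, 'oth': {'L4': 38, 'L5': 50}},
    'GPS':     {'dis': {'L4': 28, 'L5': 33}, 'oth': {'L4': 25, 'L5': 51}},
    'Maths':   {'dis': {'L4': 50, 'L5': 24}, 'oth': {'L4': 41, 'L5': 39}},
}
for test, groups in table2_all.items():
    print(test, {g: v['L5'] - v['L4'] for g, v in groups.items()})
# -> Reading {'dis': -18, 'oth': 12}
#    GPS {'dis': 5, 'oth': 26}
#    Maths {'dis': -26, 'oth': -2}
```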

If we compare 2013 gaps with 2012 (with teacher assessment of writing included in place of the GPS test introduced in 2013), we can see there has been relatively little change across the board, with the exception of L5 maths, which has been affected by the increasing success of advantaged learners at L6.

 

Table 3: Percentage of disadvantaged and all other learners achieving national curriculum levels 3-6 in each of reading, writing and maths in 2012 and 2013 respectively

                 L3         |     L4         |     L5         |    L6
             D   O   Gap    |  D   O   Gap   |  D   O   Gap   |  D  O  Gap
Reading 2012 11  6   +5     |  46  36  +10   |  33  54  -21   |  0  0  0
        2013 12  6   +6     |  48  38  +10   |  30  50  -20   |  0  1  -1
Writing 2012 22  11  +11    |  55  52  +3    |  15  32  -17   |  0  1  -1
        2013 19  10  +9     |  56  52  +4    |  17  34  -17   |  1  2  -1
Maths   2012 17  9   +8     |  50  41  +9    |  23  43  -20   |  1  4  -3
        2013 16  9   +7     |  50  41  +9    |  24  39  -15   |  2  8  -6

 

To summarise, as far as KS2 performance is concerned, there are significant imbalances at both the top and the bottom of the attainment distribution and these gaps have not changed significantly since 2012. There is some evidence to suggest that gaps at the top are larger than those at the bottom.

 

Secondary

Unfortunately there is a dearth of comparable data at secondary level, principally because of the absence of published measures of high attainment.

SFR05/2014 provides us with FSM gaps (as opposed to disadvantaged gaps) for a series of GCSE measures, none of which serve our purpose particularly well:

  • 5+ A*-C GCSE grades: gap = 16.0%
  • 5+ A*-C grades including English and maths GCSEs: gap = 26.7%
  • 5+ A*-G grades: gap = 7.6%
  • 5+ A*-G grades including English and maths GCSEs: gap = 9.9%
  • A*-C grades in English and maths GCSEs: gap = 26.6%
  • Achieving the English Baccalaureate: gap = 16.4%

Perhaps all we can deduce is that the gaps vary considerably in size, but tend to be smaller for the relatively less demanding and larger for the relatively more demanding measures.

For specific high attainment measures we are forced to rely principally on data snippets released in answer to occasional Parliamentary Questions.

For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8%. By 2009 the comparable percentages were 1.7% and 9.0% respectively, giving an increased gap of 7.3% (Col 568W)
  • In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)

 

Table 4: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

           FSM   All pupils   Gap
Maths      3.7   15.6         11.9
Eng lit    4.1   20.0         15.9
Eng lang   3.5   16.4         12.9
Physics    2.2   49.0         46.8
Chemistry  2.5   48.4         45.9
Biology    2.5   46.8         44.3
French     3.5   22.9         19.4
German     2.8   23.2         20.4

 

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10% (Col 488W)

There is much variation in the subject-specific outcomes at A*/A described above. But, when it comes to the overall 5+ GCSEs high attainment measure based on grades A*/A, the gap is much smaller than on the corresponding standard measure based on grades A*-C.

Further material of broadly the same vintage is available in a 2007 DfE statistical publication: ‘The Characteristics of High Attainers’.

There is a complex pattern in evidence here which is very hard to explain with the limited data available. More time series data of this nature – illustrating Excellence and Foundation Gaps alike – should be published annually so that we have a more complete and much more readily accessible dataset.

I could find no information at all about the comparative performance of disadvantaged learners in AP settings compared with those not from disadvantaged backgrounds.

Data is published showing the FSM gap for SEN learners on all the basic GCSE measures listed above. I have retained the generic FSM gaps in brackets for the sake of comparison:

  • 5+ A*-C GCSE grades: gap = 12.5% (16.0%)
  • 5+ A*-C grades including English and maths GCSEs: gap = 12.1% (26.7%)
  • 5+ A*-G grades: gap = 10.4% (7.6%)
  • 5+ A*-G grades including English and maths GCSEs: gap = 13.2% (9.9%)
  • A*-C grades in English and maths GCSEs: gap = 12.3% (26.6%)
  • Achieving the English Baccalaureate: gap = 3.5% (16.4%)

One can see that the FSM gaps for the more demanding measures are generally lower for SEN learners than they are for all learners. This may be interesting but, for the reasons given above, it is not a reliable proxy for the FSM gap amongst ‘dim’ learners.

 

When it comes to fair access to Oxbridge, I provided a close analysis of much relevant data in this post from November 2013.

The chart below shows the number of 15 year-olds eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19. The figures are rounded to the nearest five.

 

Chart 1: FSM-eligible learners admitted to Oxford and Cambridge 2005/06 to 2010/11

Oxbridge chart

 

In sum, there has been no change in these numbers over the last six years for which data has been published. So, while there has been consistently significant expenditure on access agreements and multiple smaller mentoring programmes, it has had negligible impact on this measure at least.

My previous post set out a proposal for what to do about this sorry state of affairs.

 

Budgets

For the purposes of this discussion we need ideally to identify and compare total national budgets for the ‘poor but bright’ and the ‘poor but dim’. But that is simply not possible. 

Many funding streams cannot be disaggregated in this manner. As we have seen, some – including the AP and SEN budgets – may be aligned erroneously with the second of these groups, although they also support learners who are neither ‘poor’ nor ‘dim’ and have a broader purpose than raising attainment. 

There may be some debate, too, about which funding streams should be weighed in the balance.

On the ‘bright but poor’ side, do we include funding for grammar schools, even though the percentage of disadvantaged learners attending many of them is virtually negligible (despite recent suggestions that some are now prepared to do something about this)? Should the Music and Dance Scheme (MDS) be within scope of this calculation?

The best I can offer is a commentary that gives a broad sense of orders of magnitude, illustrating in very approximate terms how the scales tilt towards the ‘poor but dim’ rather than the ‘poor but bright’, while weaving in a few relevant asides about some of the funding streams in question.

 

Pupil Premium and the EEF

I begin with the Pupil Premium – providing schools with additional funding to raise the attainment of disadvantaged learners.

The Premium is not attached to the learners who qualify for it, so schools are free to aggregate the funding and use it as they see fit. They are held accountable for these decisions through Ofsted inspection and the gap-narrowing measures in the Performance Tables.

Mr Thomas suggests in our Twitter discussion that AP students are not significant beneficiaries of such support, although provision in PRUs features prominently in the published evaluation of the Premium. It is for local authorities to determine how Pupil Premium funding is allocated in AP settings.

One might also make a case that ‘bright but poor’ learners are not a priority either, despite suggestions from the Pupil Premium Champion to the contrary.

 

 

As we have seen, the Performance Tables are not sharply enough focused on the excellence gaps at the top of the distribution and I have shown elsewhere that Ofsted’s increased focus on the most able does not yet extend to the impact on those attracting the Pupil Premium, even though there was a commitment that it would do so. 

If there is Pupil Premium funding heading towards high attainers from disadvantaged backgrounds, the limited data to which we have access does not yet suggest a significant impact on the size of Excellence Gaps. 

The ‘poor but bright’ are not a priority for the Education Endowment Foundation (EEF) either.

This 2011 paper explains that it is prioritising the performance of disadvantaged learners in schools below the floor targets. At one point it says:

‘Looking at the full range of GCSE results (as opposed to just the proportions who achieve the expected standards) shows that the challenge facing the EEF is complex – it is not simply a question of taking pupils from D to C (the expected level of attainment). Improving results across the spectrum of attainment will mean helping talented pupils to achieve top grades, while at the same time raising standards amongst pupils currently struggling to pass.’

But this is just after it has shown that the percentages of disadvantaged high attainers in its target schools are significantly lower than elsewhere. Other things being equal, the ‘poor but dim’ will be the prime beneficiaries.

It may now be time for the EEF to expand its focus to all schools. A diagram from this paper – reproduced below – demonstrates that, in 2010, the attainment gap between FSM and non-FSM was significantly larger in schools above the floor than in those below the floor that the EEF is prioritising. This is true in both the primary and secondary sectors.

It would be interesting to see whether this is still the case.

 

EEF Capture

 

AP and SEN

Given the disaggregation problems discussed above, this section is intended simply to give some basic sense of orders of magnitude – lending at least some evidence to counter Mr Thomas’ assertion that the ‘effort and resources, of schools… are directed disproportionately at those who are already high achieving – the poor but bright’.

It is surprisingly hard to get a grip on the overall national budget for AP. A PQ from early 2011 (Col 75W) supplies a net current expenditure figure for all English local authorities of £530m.

Taylor’s Review fails to offer a comparable figure but my rough estimates based on the per pupil costs he supplies suggest a revenue budget of at least £400m. (Taylor suggests average per pupil costs of £9,500 per year for full-time AP, although PRU places are said to cost between £12,000 and £18,000 per annum.)

I found online a consultation document from Kent – England’s largest local authority – putting its revenue costs at over £11m in FY2014-15. Some 454 pupils attended Kent’s AP/PRU provision in 2012-13.

There must also be a significant capital budget. There are around 400 PRUs, not to mention a growing cadre of specialist AP academies and free schools. The total capital cost of the first AP free school – Derby Pride Academy – was £2.147m for a 50-place setting.

In FY2011-12, total annual national expenditure on SEN was £5.77 billion (Col 391W). There will have been some cost-cutting as a consequence of the latest reforms, but the order of magnitude is clear.

The latest version of the SEN Code of Practice outlines the panoply of support available, including the requirement that each school designate a teacher responsible for co-ordinating SEN provision (the SENCO).

In short, the national budget for AP is sizeable and the national budget for SEN is huge. Per capita expenditure is correspondingly high. If we could isolate the proportion of these budgets allocated to raising the attainment of the ‘poor but dim’, the total would be substantial.

 

Fair Access, especially to Oxbridge, and some related observations

Mr Thomas refers specifically to funding to support fair access to universities – especially Oxbridge – for those from disadvantaged backgrounds. This is another area in which it is hard to get a grasp on total expenditure, not least because of the many small-scale mentoring projects that exist.

Mr Thomas is quite correct to remark on the sheer number of these, although they are relatively small beer in budgetary terms. (One suspects that they would be much more efficient and effective if they could be linked together within some sort of overarching framework.)

The Office for Fair Access (OFFA) estimates University access agreement expenditure on outreach in 2014-15 at £111.9m and this has to be factored in, as does DfE’s own small contribution – the Future Scholar Awards.

Were any expenditure in this territory to be criticised, it would surely be the development and capital costs for new selective 16-19 academies and free schools that specifically give priority to disadvantaged students.

The sums are large, perhaps not outstandingly so compared with national expenditure on SEN for example, but they will almost certainly benefit only a tiny localised proportion of the ‘bright but poor’ population.

There are several such projects around the country. Some of the most prominent are located in London.

The London Academy of Excellence (capacity 420) is fairly typical. It cost an initial £4.7m to establish, plus a lease requiring a further £400K annually.

But this is dwarfed by the projected costs of the Harris Westminster Sixth Form, scheduled to open in September 2014. Housed in a former government building, the capital cost is reputed to be £45m for a 500-place institution.

There were reportedly disagreements within Government:

‘It is understood that the £45m cost was subject to a “significant difference of opinion” within the DfE where critics say that by concentrating large resources on the brightest children at a time when budgets are constrained means other children might miss out…

But a spokeswoman for the DfE robustly defended the plans tonight. “This is an inspirational collaboration between the country’s top academy chain and one of the best private schools in the country,” she said. “It will give hundreds of children from low income families across London the kind of top quality sixth-form previously reserved for the better off.”’

Here we have in microcosm the debate to which this post is dedicated.

One blogger – a London College Principal – pointed out that the real issue was not whether the brightest should benefit over others, but how few of the ‘poor but bright’ would do so:

‘£45m could have a transformative effect on thousands of 16-19 year olds across London… £45m could have funded at least 50 extra places in each college for over 10 years, helped build excellent new facilities for all students and created a city-wide network to support gifted and talented students in sixth forms across the capital working with our partner universities and employers.’

 

Summing Up

There are three main elements to the discussion: the point of principle, the inputs and the impact. The following sections deal with each of these in turn.

 

Principle

Put bluntly, should ‘poor but dim’ kids have higher priority for educators than ‘poor but bright’ kids (Mr Thomas’ position) or should all poor kids have equal priority and an equal right to the support they need to achieve their best (the Gifted Phoenix position)?

For Mr Thomas, it seems this priority is determined by whether – and how far – the learner is behind undefined ‘basic levels of attainment’ and/or mastery of ‘the basics’ (presumably literacy and numeracy).

Those below the basic attainment threshold have higher priority than those above it. He does not say so, but this logic suggests that those furthest below the threshold are the highest priority and those furthest above are the lowest.

So, pursued to its logical conclusion, this would mean that the highest attainers would get next to no support while a human vegetable would be the highest priority of all.

However, since Mr Thomas’ focus is on marginal benefit, it may be that those nearest the threshold would be first in the queue for scarce resources, because they would require the least effort and resources to lift above it.

This philosophy drives the emphasis on achievement of national benchmarks and predominant focus on borderline candidates that, until recently, dominated our assessment and accountability system.

For Gifted Phoenix, every socio-economically disadvantaged learner has an equal claim to the support they need to improve their attainment, by virtue of that disadvantage.

There is no question of elevating some ahead of others in the pecking order because they are further behind on key educational measures since, in effect, that would penalise some disadvantaged learners on the grounds of their ability or, more accurately, their prior attainment.

This philosophy underpins the notion of personalised education and is driving the recent and welcome reforms of the assessment and accountability system, designed to ensure that schools are judged by how well they improve the attainment of all learners, rather than predominantly on the basis of the proportion achieving the standard national benchmarks.

I suggested that, in deriding ‘the romance of the poor but bright’, Mr Thomas ran the risk of falling into ‘the slough of anti-elitism’. He rejected that suggestion, while continuing to emphasise the need to ‘concentrate more’ on ‘those at risk of never being able to engage with society’.

I have made the assumption that Mr Thomas is interested primarily in KS2 and GCSE or equivalent qualifications at KS4, given his references to KS2 L4, basic skills and ‘paper qualifications needed to enter meaningful employment’.

But his additional references to ‘real qualifications’ (as opposed to paper ones) and engaging with society could well imply a wider range of personal, social and work-related skills for employability and adult life.

My preference for equal priority would apply regardless: there is no guarantee that high attainers from disadvantaged backgrounds will necessarily possess these vital skills.

But, as indicated in the definition above, there is an important distinction to be maintained between:

  • educational support to raise the attainment, learning and employability skills of socio-economically disadvantaged learners and prepare them for adult life and
  • support to manage a range of difficulties – whether behavioural problems, disability, physical or mental impairment – that impact on the broader life chances of the individuals concerned.

Such a distinction may well be masked in the everyday business of providing effective holistic support for learners facing such difficulties, but this debate requires it to be made and sustained given Mr Thomas’s definition of the problem in terms of the comparative treatment of the ‘poor but bright’ and the ‘poor but dim’.

Having made this distinction, it is not clear whether he himself sustains it consistently through to the end of his post. In the final paragraphs the term ‘poor but dim’ begins to morph into a broader notion encompassing all AP and SEN learners regardless of their socio-economic status.

Additional dimensions of disadvantage are potentially being brought into play. This is inconsistent and radically changes the nature of the argument.

 

Inputs

By inputs I mean the resources – financial and human – made available to support the education of ‘dim’ and ‘bright’ disadvantaged learners respectively.

Mr Thomas also shifts his ground as far as inputs are concerned.

His post opens with a statement that ‘the effort and resources’ of schools, charities and businesses are ‘directed disproportionately’ at the poor but bright – and he exemplifies this with reference to fair access to competitive universities, particularly Oxbridge.

When I point out the significant investment in AP compared with fair access, he changes tack – ‘I’m measuring outcomes not just inputs’.

Then later he says ‘But what some need is just more expensive’, to which I respond that ‘the bottom end already has the lion’s share of funding’.

At this point we have both fallen into the trap of treating the entirety of the AP and SEN budgets as focused on the ‘poor but dim’.

We are failing to recognise that they are poor proxies: the majority of AP and SEN learners are not ‘poor’; many are not ‘dim’; these budgets are focused on a wider range of needs; and there is significant additional expenditure directed at ‘poor but dim’ learners elsewhere in the wider education budget.

Despite Mr Thomas’s opening claim, it should be reasonably evident from the preceding commentary that my ‘lion’s share’ point is factually correct. His suggestion that AP is ‘largely unknown and wholly unappreciated’ flies in the face of the Taylor Review and the Government’s subsequent work programme.

SEN may depend heavily on the ‘extraordinary effort of dedicated staff’, but at least there are such dedicated staff! There may be inconsistencies in local authority funding and support for SEN, but the global investment is colossal by comparison with the funding on the other side of the balance.

Gifted Phoenix’s position acknowledges that inputs are heavily loaded in favour of the SEN and AP budgets. This is as it should be since, as Mr Thomas rightly notes, many of the additional services these learners need are more expensive to provide. These services are not simply dedicated to raising their attainment, but also to tackling more substantive problems associated with their status.

Whether the balance of expenditure on the ‘bright’ and ‘dim’ respectively is optimal is a somewhat different matter. Contrary to Mr Thomas’s position, gifted advocates are often convinced that too much largesse is focused on the latter at the expense of the former.

Turning to advocacy, Mr Thomas says ‘we end up ignoring the more pressing problem’ of the poor but dim. He argues in the Twitter discussion that too few people are advocating for these learners, adding that they are failed ‘because it’s not popular to talk about them’.

I could not resist countering that advocacy for gifted learners is equally unpopular, indeed ‘the word is literally taboo in many settings’. I cannot help thinking – from his footnote reference to ‘epigenetic coding’ – that Mr Thomas is amongst those who are distinctly uncomfortable with the term.

Where advocacy does survive it is focused exclusively on progression to competitive universities and, to some extent, high attainment as a route towards that outcome. The narrative has shifted away from concepts of high ability or giftedness, because of the very limited consensus about that condition (even amongst gifted advocates) and even considerable doubt in some quarters whether it exists at all.

 

Outcomes

Mr Thomas maintains in his post that the successes of his preferred target group ‘have the power to change the British economy, far more so than those of their brighter peers’. This is because ‘the gap most damaging to society is in life outcomes for the children that perform least well at school’.

As noted above, it is important to remember that we are discussing here the addition of educational and economic value by tackling underachievement amongst learners from disadvantaged backgrounds, rather than amongst all the children that perform least well.

We are also leaving to one side the addition of value through any wider engagement by health and social services to improve life chances.

It is quite reasonable to advance the argument that improving the outcomes of ‘the bottom 20%’ (the Tail) will have ‘a huge socio-economic impact’ and ‘make the biggest marginal difference to society’.

But one could equally make the case that society would derive similar or even higher returns from a decision to concentrate disproportionately on the highest attainers (the Smart Fraction).

Or, as Gifted Phoenix would prefer, one could reasonably propose that the optimal returns should be achieved by means of a balanced approach that raises both the floor and the ceiling, avoiding any arbitrary distinctions on the basis of prior attainment.

From the Gifted Phoenix perspective, one should balance the advantages of removing the drag on productivity of an educational underclass against those of developing the high-level human capital needed to drive economic growth and improve our chances of success in what Coalition ministers call the ‘global race’.

According to this perspective, by eliminating excellence gaps between disadvantaged and advantaged high attainers we will secure a stream of benefits broadly commensurate with that at the bottom end.

These will include substantial spillover benefits, achieved as a result of broadening the pool of successful leaders in political, social, educational and artistic fields, not to mention significant improvements in social mobility.

It is even possible to argue that, by creating a larger pool of more highly educated parents, we can also achieve a significant positive impact on the achievement of subsequent generations, thus significantly reducing the size of the tail.

And in the present generation we will create many more role models: young people from disadvantaged backgrounds who become educationally successful and who can influence the aspirations of younger disadvantaged learners.

This avoids the risk that low expectations will be reinforced and perpetuated through a ‘deficit model’ approach that places excessive emphasis on removing the drag from the tail by producing a larger number of ‘useful members of society’.

This line of argument is integral to the Gifted Phoenix Manifesto.

It seems to me entirely conceivable that economists might produce calculations to justify any of these different paths.

But it would be highly inequitable to put all our eggs in the ‘poor but bright’ basket, because that penalises some disadvantaged learners for their failure to achieve high attainment thresholds.

And it would be equally inequitable to focus exclusively on the ‘poor but dim’, because that penalises some disadvantaged learners for their success in becoming high attainers.

The more equitable solution must be to opt for a ‘balanced scorecard’ approach that generates a proportion of the top end benefits and a proportion of the bottom end benefits simultaneously.

There is a risk that this reduces the total flow of benefits, compared with one or other of the inequitable solutions, but there is a trade-off here between efficiency and a socially desirable outcome that balances the competing interests of the two groups.

 

The personal dimension

After we had finished our Twitter exchanges, I thought to research Mr Thomas online. Turns out he’s quite the Big-Cheese-in-Embryo. Provided he escapes the lure of filthy lucre, he’ll be a mover and shaker in education within the next decade.

I couldn’t help noticing his own educational experience – public school, a First in PPE from Oxford, leading light in the Oxford Union – then graduation from Teach First alongside internships with Deutsche Bank and McKinsey.

Now he’s serving his educational apprenticeship as joint curriculum lead for maths at a prominent London Academy. He’s also a trustee of ‘a university mentoring project for highly able 11-14 year old pupils from West London state schools’.

Lucky I didn’t check earlier. Such a glowing CV might have been enough to cow this grammar school Oxbridge reject, even if I did begin this line of work several years before he was born. Not that I have a chip on my shoulder…

The experience set me wondering about the dominant ideology amongst the Teach First cadre, and how it is tempered by extended exposure to teaching in a challenging environment.

There’s more than a hint of idealism about someone from this privileged background espousing the educational philosophy that Mr Thomas professes. But didn’t he wonder where all the disadvantaged people were during his own educational experience, and doesn’t he want to change that too?

His interest in mentoring highly able pupils would suggest that he does, but also seems directly to contradict the position he’s reached here. It would be a pity if the ‘poor but bright’ could not continue to rely on his support, equal in quantity and quality to the support he offers the ‘poor but dim’.

For he could make a huge difference at both ends of the attainment spectrum – and, with his undeniable talents, he should certainly be able to do so.

 

Conclusion

We are entertaining three possible answers to the question whether in principle to prioritise the needs of the ‘poor but bright’ or the ‘poor but dim’:

  • Concentrate principally – perhaps even exclusively – on closing the Excellence Gaps at the top
  • Concentrate principally – perhaps even exclusively – on closing the Foundation Gaps at the bottom
  • Concentrate equally across the attainment spectrum, at the top and bottom and all points in between.

Speaking as an advocate for those at the top, I favour the third option.

It seems to me incontrovertible – though hard to quantify – that, in the English education system, the lion’s share of resources go towards closing the Foundation Gaps.

That is perhaps as it should be, although one could wish that the financial scales were not tipped so excessively in their direction, for ‘poor but bright’ learners do in my view have an equal right to challenge and support, and should not be penalised for their high attainment.

Our current efforts to understand the relative size of the Foundation and Excellence Gaps and how these are changing over time are seriously compromised by the limited data in the public domain.

There is a powerful economic case to be made for prioritising the Foundation Gaps as part of a deliberate strategy for shortening the tail – but an equally powerful case can be constructed for prioritising the Excellence Gaps, as part of a deliberate strategy for increasing the smart fraction.

Neither of these options is optimal from an equity perspective however. The stream of benefits might be compromised somewhat by not focusing exclusively on one or the other, but a balanced approach should otherwise be in our collective best interests.

You may or may not agree. Here is a poll so you can register your vote. Please use the comments facility to share your wider views on this post.

 

 

 

 

Epilogue

 

We must beware the romance of the poor but bright,

But equally beware

The romance of rescuing the helpless

Poor from their sorry plight.

We must ensure the Tail

Does not wag the disadvantaged dog!

 

GP

May 2014

How well is Ofsted reporting on the most able?

 

 

This post considers how Ofsted’s new emphasis on the attainment and progress of the most able learners is reflected in school inspection reports.

My analysis is based on the 87 Section 5 secondary school inspection reports published in the month of March 2014.

I shall not repeat here previous coverage of how Ofsted’s emphasis on the most able has been framed. Interested readers may wish to refer to previous posts for details:

The more specific purpose of the post is to explore how consistently Ofsted inspectors are applying their guidance and, in particular, whether there is substance for some of the concerns I expressed in these earlier posts, drawn together in the next section.

The remainder of the post provides an analysis of the sample and a qualitative review of the material about the most able (and analogous terms) included in the sample of 87 inspection reports.

It concludes with a summary of the key points, a set of associated recommendations and an overall inspection grade for inspectors’ performance to date. Here is a link to this final section for those who prefer to skip the substance of the post.

 

Background

Before embarking on the real substance of this argument I need to restate briefly some of the key issues raised in those earlier posts:

  • Ofsted’s definition of ‘the most able’ in its 2013 survey report is idiosyncratically broad, including around half of all learners on the basis of their KS2 outcomes.
  • The evidence base for this survey report included material suggesting that the most able students are supported well or better in only 20% of lessons – and are not making the progress of which they are capable in about 40% of schools.
  • The survey report’s recommendations included three commitments on Ofsted’s part. It would:

 o   ‘focus more closely in its inspections on the teaching and progress of the most able students, the curriculum available to them, and the information, advice and guidance provided to the most able students’;

o   ‘consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds’ and

o   ‘report its inspection findings about this group of students more clearly in school inspection, sixth form and college reports.’

  • Subsequently the school inspection guidance was revised somewhat haphazardly, resulting in the parallel use of several undefined terms (‘able pupils’, ‘most able’, ‘high attaining’, ‘highest attaining’), the underplaying of the attainment and progress of the most able learners attracting the Pupil Premium and very limited reference to appropriate curriculum and IAG.
  • Within the inspection guidance, emphasis was placed primarily on learning and progress. I edited together the two relevant sets of level descriptors in the guidance to provide this summary for the four different inspection categories:

In outstanding schools the most able pupils’ learning is consistently good or better and they are making rapid and sustained progress.

In good schools the most able pupils’ learning is generally good, they make good progress and achieve well over time.

In schools requiring improvement the teaching of the most able pupils and their achievement are not good.

In inadequate schools the most able pupils are underachieving and making inadequate progress.

  • No published advice has been made available to inspectors on the interpretation of these amendments to the inspection guidance. In October 2013 I wrote:

‘Unfortunately, there is a real risk that the questionable clarity of the Handbook and Subsidiary Guidance will result in some inconsistency in the application of the Framework, even though the fundamental purpose of such material is surely to achieve the opposite.’

  • Analysis of a very small sample of reports for schools reporting poor results for high attainers in the school performance tables suggested inconsistency both before and after the amendments were introduced into the guidance. I commented:

‘One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.’

The material below considers the impact of these revisions on a more substantial sample of reports and whether this justifies some of the concerns expressed above.

It is important to add that, in January 2014, Ofsted revised its guidance document ‘Writing the report for school inspections’ to include the statement that:

‘Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’ (p8)

This serves to reinforce the changes to the inspection guidance and clearly indicates that coverage of this issue – at least in these terms – is a non-negotiable: we should expect to see appropriate reference in every single section 5 report.

 

The Sample

The sample comprises 87 secondary schools whose Section 5 inspection reports were published by Ofsted in the month of March 2014.

The inspections were conducted between 26 November 2013 and 11 March 2014, so the inspectors will have had time to become familiar with the revised guidance.

However, up to 20 of the inspections took place before Ofsted felt it necessary to emphasise that coverage of the progress and teaching of the most able is compulsory.

The sample happens to include several institutions inspected as part of wider-ranging reviews of schools in Birmingham and schools operated by the E-ACT academy chain. It also incorporates several middle-deemed secondary schools.

Chart 1 shows the regional breakdown of the sample, adopting the regions Ofsted uses to categorise reports, as opposed to its own regional structure (ie with the North East identified separately from Yorkshire and Humberside).

It contains a disproportionately large number of schools from the West Midlands, while the South-West is significantly under-represented. All the remaining regions supply between 5 and 13 schools. A total of 57 local authority areas are represented.

 

Chart 1: Schools within the sample by region

Ofsted chart 1

 

Chart 2 shows the different statuses of schools within the sample. Over 40% are community schools, while almost 30% are sponsored academies. There are no academy converters but sponsored academies, free schools and studio schools together account for some 37% of the sample.

 

Chart 2: Schools within the sample by status

Ofsted chart 2

 

The vast majority of schools in the sample are 11-16 or 11-18 institutions, but four are all-through schools, five provide for learners aged 13 or 14 upwards and 10 are middle schools. There are four single sex schools.

Chart 3 shows the variation in school size. Some of the studio schools, free schools and middle schools are very small by secondary standards, while the largest secondary school in the sample has some 1,600 pupils. A significant proportion of schools have between 600 and 1,000 pupils.

 

Chart 3: Schools within the sample by number on roll

Ofsted chart 3

The distribution of overall inspection grades between the sample schools is illustrated by Chart 4 below. Eight of the sample were rated outstanding, 28 good, 35 requiring improvement and 16 inadequate.

Of those rated inadequate, 12 were subject to special measures and four had serious weaknesses.

 

Chart 4: Schools within the sample by overall inspection grade

 Ofsted chart 4

The eight schools rated outstanding include:

  • A mixed 11-18 sponsored academy
  • A mixed 14-19 studio school
  • A mixed 11-18 free school
  • A mixed 11-16 VA comprehensive
  • A girls’ 11-18 VA comprehensive
  • A boys’ 11-18 VA selective school
  • A girls’ 11-18 community comprehensive and
  • A mixed 11-18 community comprehensive

The sixteen schools rated inadequate include:

  • Eight mixed 11-18 sponsored academies
  • Two mixed 11-16 sponsored academies
  • A mixed all-through sponsored academy
  • A mixed 11-16 free school
  • Two mixed 11-16 community comprehensives
  • A mixed 11-18 community comprehensive and
  • A mixed 13-19 community comprehensive

 

Coverage of the most able in main findings and recommendations

 

Terminology 

Where they were mentioned, such learners were most often described as ‘most able’, but a wide range of other terminology is deployed, including ‘most-able’, ‘the more able’, ‘more-able’, ‘higher attaining’, ‘high-ability’, ‘higher-ability’ and ‘able students’.

The idiosyncratic adoption of redundant hyphenation is an unresolved mystery.

It is not unusual for two or more of these terms to be used in the same report. In the absence of a glossary, this makes some reports rather less straightforward to interpret accurately.

It is also more difficult to compare and contrast reports. Helpful services like Watchsted’s word search facility become less useful.
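
To illustrate the difficulty, here is a hedged sketch of how one might normalise the variants before searching report text. The variant list is taken from the paragraph above; the pattern and function are my own illustration, not Watchsted’s actual implementation:

```python
import re

# Variants encountered in the sample, mapped to a single search pattern.
# The list is illustrative rather than exhaustive.
VARIANTS = [
    r"most[- ]able", r"more[- ]able", r"high(?:er|est)?[- ]attaining",
    r"high(?:er)?[- ]ability", r"able (?:pupils|students)",
]
PATTERN = re.compile("|".join(VARIANTS), flags=re.IGNORECASE)

def count_most_able_mentions(report_text: str) -> int:
    """Count references to the most able, whichever synonym a report uses."""
    return len(PATTERN.findall(report_text))
```

A single canonical term in the inspection guidance would make this kind of workaround unnecessary.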

 

Incidence of commentary in the main findings and recommendations

Thirty of the 87 inspection reports (34%) addressed the school’s most able learners explicitly (using that or a similar term) in both the main findings and the recommendations.

The analysis showed that 28% of reports on academies (including studios and free schools) met this criterion, whereas 38% of reports on non-academy schools did so.

Chart 5 shows how the incidence of reference in both main findings and recommendations varies according to the overall inspection grade awarded.

One can see that this level of attention is most prevalent in schools requiring improvement, followed by those with inadequate grades. It was less common in schools rated good and less common still in outstanding schools. The gap between these two categories is perhaps smaller than expected.

The slight lead for schools requiring improvement over inadequate schools may be attributable to a view that more of the latter face more pressing priorities, or it may have something to do with the varying proportions of high attainers in such schools, or both of these factors could be in play, amongst others.

 

Chart 5: Most able covered in both main findings and recommendations by overall inspection rating (percentage)

Ofsted chart 5

A further eleven reports (13%) addressed the most able learners in the recommendations but not the main findings.

Only one report managed to feature the most able in the main findings but not in the recommendations and this was because the former recorded that ‘the most able students do well’.

Consequently, a total of 45 reports (52%) did not mention the most able in either the main findings or the recommendations.

This applied to some 56% of reports on academies (including free schools and studio schools) and 49% of reports on other state-funded schools.

So, according to these proxy measures, the most able in academies appear to receive comparatively less attention from inspectors than those in non-academy schools. It is not clear why. (The samples are almost certainly too small to support reliable comparison of academies and non-academies with different inspection ratings.)
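
For transparency, the tally behind these percentages amounts to no more than the sketch below. The data structure is hypothetical – in practice each report was read and coded by hand:

```python
from collections import Counter

# One entry per report, recording whether the most able featured in the
# main findings and/or the recommendations (illustrative data only).
reports = [
    {"in_findings": True, "in_recommendations": True},
    # ... 86 further entries, one per remaining report in the sample
]

def classify(report: dict) -> str:
    f = report["in_findings"]
    r = report["in_recommendations"]
    if f and r:
        return "both"
    if r:
        return "recommendations only"
    if f:
        return "findings only"
    return "neither"

tallies = Counter(classify(rep) for rep in reports)
for category, n in tallies.items():
    print(f"{category}: {n} ({n / len(reports):.0%})")
```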

Chart 6 below shows the inspection ratings for this subset of reports.

 

Chart 6: Most able covered in neither main findings nor recommendations by overall inspection rating (percentage)

Ofsted chart 6

Here is further evidence that the significant majority of outstanding schools are regarded as having no significant problems in respect of provision for the most able.

On the other hand, this is far from being universally true, since it is an issue for one in four of them. This ratio of 3:1 does not lend complete support to the oft-encountered truism that outstanding schools invariably provide outstandingly for the most able – and vice versa.

At the other end of the spectrum, and perhaps even more surprisingly, over 30% of inadequate schools are assumed not to have issues significant enough to warrant reference in these sections. Sometimes this may be because they are equally poor at providing for all their learners, so the most able are not separately singled out.

Chart 7 below shows differences by school size, giving the percentage of reports mentioning the most able in both main findings and recommendations and in neither.

It divides schools into three categories: small (24 schools with a NOR of 599 or lower), medium (35 schools with a NOR of 600-999) and large (28 schools with a NOR of 1,000 or higher).

 

Chart 7: Reports mentioning the most able in main findings and recommendations by school size 

 Ofsted chart 7

It is evident that ‘neither’ exceeds ‘both’ for all three categories. Small and large schools record very similar profiles.

But there is a much more significant difference for medium-sized schools. They demonstrate a much smaller percentage of ‘both’ reports and comfortably the largest percentage of ‘neither’ reports.

This pattern – suggesting that inspectors are markedly less likely to emphasise provision for the most able in medium-sized schools – is worthy of further investigation.

It would be particularly interesting to explore further the relationship between school size, the proportion of high attainers in a school and their achievement.

 

Typical references in the main findings and recommendations

I could detect no obvious and consistent variations in these references by school status or size, but it was possible to detect a noticeably different emphasis between schools rated outstanding and those rated inadequate.

Where the most able featured in reports on outstanding schools, these included recommendations such as:

‘Further increase the proportion of outstanding teaching in order to raise attainment even higher, especially for the most able students.’ (11-16 VA comprehensive).

‘Ensure an even higher proportion of students, including the most able, make outstanding progress across all subjects’ (11-18 sponsored academy).

These statements suggest that such schools have made good progress in eradicating underachievement amongst the most able but still have further room for improvement.

But where the most able featured in recommendations for inadequate schools, they were typically of this nature:

‘Improve teaching so that it is consistently good or better across all subjects, but especially in mathematics, by: raising teachers’ expectations of the quality and amount of work students of all abilities can do, especially the most and least able.’  (11-16 sponsored academy).

‘Improve the quality of teaching in order to speed up the progress students make by setting tasks that are at the right level to get the best out of students, especially the most able.’ (11-18 sponsored academy).

‘Rapidly improve the quality of teaching, especially in mathematics, by ensuring that teachers: have much higher expectations of what students can achieve, especially the most able…’ (11-16 community school).

These make clear that poor and inconsistent teaching quality is causing significant underachievement at the top end (and ‘especially’ suggests that this top end underachievement is particularly pronounced compared with other sections of the attainment spectrum in such schools).

Recommendations for schools requiring improvement are akin to those for inadequate schools but typically more specific, pinpointing particular dimensions of good quality teaching that are absent, so limiting effective provision for the most able. It is as if these schools have some of the pieces in place but not yet the whole jigsaw.

By comparison, recommendations for good schools can seem rather more impressionistic and/or formulaic, focusing more generally on ‘increasing the proportion of outstanding teaching’. In such cases the assessment is less about missing elements and more about the consistent application of all of them across the school.

One gets the distinct impression that inspectors have a clearer grasp of the ‘fit’ between provision for the most able and the other three inspection outcomes, at least as far as the distinction between ‘good’ and ‘outstanding’ is concerned.

But it would be misleading to suggest that these lines of demarcation are invariably clear. The boundary between ‘good’ and ‘requires improvement’ seems comparatively distinct, but there was more evidence of overlap at the intersections between the other grades.

 

Coverage of the most able in the main body of reports 

References to the most able rarely turn up in the sections dealing with behaviour and safety and leadership and management. I counted no examples of the former and no more than one or two of the latter.

I could find no examples where information, advice and guidance available to the most able are separately and explicitly discussed and little specific reference to the appropriateness of the curriculum for the most able. Both are less prominent than the recommendations in the June 2013 survey report led us to expect.

Within this sample, the vast majority of reports include some description of the attainment and/or progress of the most able in the section about pupils’ achievement, while roughly half pick up the issue in relation to the quality of teaching.

The extent of the coverage of most able learners varied enormously. Some devoted a single sentence to the topic while others referred to it separately in main findings, recommendations, pupils’ achievement and quality of teaching. In a handful of cases reports seemed to give disproportionate attention to the topic.

 

Attainment and progress

Analyses of attainment and progress are sometimes entirely generic, as in:

‘The most able students make good progress’ (inadequate 11-18 community school).

‘The school has correctly identified a small number of the most able who could make even more progress’ (outstanding 11-16 RC VA school).

‘The most able students do not always secure the highest grades’ (11-16 community school requiring improvement).

‘The most able students make largely expected rates of progress. Not enough yet go on to attain the highest GCSE grades in all subjects.’ (Good 11-18 sponsored academy).

Sometimes such statements can be damning:

‘The most-able students in the academy are underachieving in almost every subject. This is even the case in most of those subjects where other students are doing well. It is an academy-wide issue.’ (Inadequate 11-18 sponsored academy).

These do not in my view constitute reporting ‘in detail on the progress of the most able pupils’ and so probably fall foul of Ofsted’s guidance to inspectors on writing reports.

More specific comments on attainment typically refer explicitly to the achievement of A*/A grades at GCSE and ideally to specific subjects, for example:

‘In 2013, standards in science, design and technology, religious studies, French and Spanish were also below average. Very few students achieved the highest A* and A grades.’ (Inadequate 11-18 sponsored academy)

‘Higher-ability students do particularly well in a range of subjects, including mathematics, religious education, drama, art and graphics. They do as well as other students nationally in history and geography.’ (13-18 community school  requiring improvement)

More specific comments on progress include:

‘The progress of the most able students in English is significantly better than that in other schools nationally, and above national figures in mathematics. However, the progress of this group is less secure in science and humanities.’  (Outstanding 11-18 sponsored academy)

‘In 2013, when compared to similar students nationally, more-able students made less progress than less-able students in English. In mathematics, where progress is less than in English, students of all abilities made similar progress.’ (11-18 sponsored academy requiring improvement).

Statements about progress rarely extend beyond English and maths (the first example above is exceptional) but, when attainment is the focus, some reports take a narrow view based exclusively on the core subjects, while others are far wider-ranging.

Despite the reference in Ofsted’s survey report, and subsequently the revised subsidiary guidance, to coverage of high attaining learners in receipt of the Pupil Premium, this is hardly ever addressed.

I could find only two examples amongst the 87 reports:

‘The gap between the achievement in English and mathematics of students for whom the school receives additional pupil premium funding and that of their classmates widened in 2013… During the inspection, it was clear that the performance of this group is a focus in all lessons and those of highest ability were observed to be achieving equally as well as their peers.’ (11-16 foundation school requiring improvement)

‘Students eligible for the pupil premium make less progress than others do and are consequently behind their peers by approximately one GCSE grade in English and mathematics. These gaps reduced from 2012 to 2013, although narrowing of the gaps in progress has not been consistent over time. More-able students in this group make relatively less progress.’ (11-16 sponsored academy requiring improvement)

More often than not it seems that the most able and those in receipt of the Pupil Premium are assumed to be mutually exclusive groups.

 

Quality of teaching 

There was little variation in the issues raised under teaching quality. Most inspectors select two or three options from a standard menu:

‘Where teaching is best, teachers provide suitably challenging materials and through highly effective questioning enable the most able students to be appropriately challenged and stretched…. Where teaching is less effective, teachers are not planning work at the right level of difficulty. Some work is too easy for the more able students in the class. (Good 11-16 community school)

 ‘In teaching observed during the inspection, the pace of learning for the most able students was too slow because the activities they were given were too easy. Although planning identified different activities for the most able students, this was often vague and not reflected in practice.  Work lacks challenge for the most able students.’ (Inadequate 11-16 community school)

‘In lessons where teaching requires improvement, teachers do not plan work at the right level to ensure that students of differing abilities build on what they already know. As a result, there is a lack of challenge in these lessons, particularly for the more able students, and the pace of learning is slow. In these lessons teachers do not have high enough expectations of what students can achieve.’ (11-18 community school requiring improvement)

‘Tasks set by teachers are sometimes too easy and repetitive for pupils, particularly the most able. In mathematics, pupils are sometimes not moved on quickly enough to new and more challenging tasks when they have mastered their current work.’ (9-13 community middle school requiring improvement)

‘Targets which are set for students are not demanding enough, and this particularly affects the progress of the most able because teachers across the year groups and subjects do not always set them work which is challenging. As a result, the most able students are not stretched in lessons and do not achieve as well as they should.’ (11-16 sponsored academy rated inadequate)

All the familiar themes are present – assessment informing planning, careful differentiation, pace and challenge, appropriate questioning, the application of subject knowledge, the quality of homework, high expectations and extending effective practice between subject departments.

 

Negligible coverage of the most able

Only one of the 87 reports failed to make any mention of the most able whatsoever. This is the report on North Birmingham Academy, an 11-19 mixed school requiring improvement.

This clearly does not meet the injunction to:

‘…report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough’.

It ought not to have passed through Ofsted’s quality assurance processes unscathed. The inspection was conducted in February 2014, after this guidance was issued, so there is no excuse.

Several other inspections make only cursory references to the most able in the main body of the report, for example:

‘Where teaching is not so good, it was often because teachers failed to check students’ understanding or else to anticipate when to intervene to support students’ learning, especially higher attaining students in the class.’ (Good 11-18 VA comprehensive).

‘… the teachers’ judgements matched those of the examiners for a small group of more-able students who entered early for GCSE in November 2013.’ (Inadequate 11-18 sponsored academy).

‘More-able students are increasingly well catered for as part of the academy’s focus on raising levels of challenge.’ (Good 11-18 sponsored academy).

‘The most able students do not always pursue their work to the best of their capability.’ (11-16 free school requiring improvement).

These would also fall well short of the report writing guidance. At least 6% of my sample falls into this category.

Some reports note explicitly that the most able learners are not making sufficient progress, but fail to capture this in the main findings or recommendations, for example:

‘The achievement of more able students is uneven across subjects. More able students said to inspectors that they did not feel they were challenged or stretched in many of their lessons. Inspectors agreed with this view through evidence gathered in lesson observations…lessons do not fully challenge all students, especially the more able, to achieve the grades of which they are capable.’ (11-19 sponsored academy requiring improvement).

‘The 2013 results of more-able students show they made slower progress than is typical nationally, especially in mathematics.  Progress is improving this year, but they are still not always sufficiently challenged in lessons.’ (11-18 VC CofE school requiring improvement).

‘There is only a small proportion of more-able students in the academy. In 2013 they made less progress in English and mathematics than similar students nationally. Across all of their subjects, teaching is not sufficiently challenging for more-able students and they leave the academy with standards below where they should be.’ (Inadequate 11-18 sponsored academy).

‘The proportion of students achieving grades A* and A was well below average, demonstrating that the achievement of the most able also requires improvement.’  (11-18 sponsored academy requiring improvement).

Something approaching 10% of the sample fell into this category. It was not always clear why this issue was not deemed significant enough to feature amongst schools’ priorities for improvement. This state of affairs was more typical of schools requiring improvement than inadequate schools, so one could not so readily argue that the schools concerned were overwhelmed with the need to rectify more basic shortcomings.

That said, the example from an inadequate academy above may be significant. It is almost as if the small number of more able students is the reason why this shortcoming is not taken more seriously.

Inspectors must carry in their heads a somewhat subjective hierarchy of issues that schools are expected to tackle. Some inspectors appear to feature the most able at a relatively high position in this hierarchy; others push it further down the list. Some appear more flexible in the application of this hierarchy to different settings than others.

 

Formulaic and idiosyncratic references 

There is clear evidence of formulaic responses, especially in the recommendations for how schools can improve their practice.

Many reports adopt the strategy of recommending a series of actions featuring the most able, either in the target group:

‘Improve the quality of teaching to at least good so that students, including the most able, achieve higher standards, by ensuring that: [followed by a list of actions] (9-13 community middle school requiring improvement)

Or in the list of actions:

‘Improve the quality of teaching in order to raise the achievement of students by ensuring that teachers:…use assessment information to plan their work so that all groups of students, including those supported by the pupil premium and the most-able students, make good progress.’ (11-16 community school requiring improvement)

It was rare indeed to come across a report that referred explicitly to interesting or different practice in the school, or approached the topic in a more individualistic manner, but here are a few examples:

‘More-able pupils are catered for well and make good progress. Pupils enjoy the regular, extra challenges set for them in many lessons and, where this happens, it enhances their progress. They enjoy that extra element which often tests them and gets them thinking about their work in more depth. Most pupils are keen to explore problems which will take them to the next level or extend their skills.’  (Good 9-13 community middle school)

‘Although the vast majority of groups of students make excellent progress, the school has correctly identified a small number of the most able who could make even more progress. It has already started an impressive programme of support targeting the 50 most able students called ‘Students Targeted A grade Results’ (STAR). This programme offers individualised mentoring using high-quality teachers to give direct intervention and support. This is coupled with the involvement of local universities. The school believes this will give further aspiration to these students to do their very best and attend prestigious universities.’  (Outstanding 11-16 VA school)

I particularly liked:

‘Policies to promote equality of opportunity are ineffective because of the underachievement of several groups of students, including those eligible for the pupil premium and the more-able students.’ (Inadequate 11-18 academy) 

 

Conclusion

 

Main Findings

The principal findings from this survey, admittedly based on a rather small and not entirely representative sample, are that:

  • Inspectors are terminologically challenged in addressing this issue, because there are too many synonyms or near-synonyms in use.
  • Approximately one-third of inspection reports address provision for the most able in both main findings and recommendations. This is less common in academies than in community, controlled and aided schools. It is most prevalent in schools with an overall ‘requires improvement’ rating, followed by those rated inadequate. It is least prevalent in outstanding schools, although one in four outstanding schools is dealt with in this way.
  • Slightly over half of inspection reports address provision for the most able in neither the main findings nor the recommendations. This is relatively more common in the academies sector and in outstanding schools. It is least prevalent in schools rated inadequate, though almost one-third of inadequate schools fall into this category. Sometimes this is the case even though provision for the most able is identified as a significant issue in the main body of the report.
  • There is an unexplained tendency for reports on medium-sized schools to be significantly less likely to feature the most able in both main findings and recommendations and significantly more likely to feature it in neither. This warrants further investigation.
  • Overall coverage of the topic varies excessively between reports. One ignored it entirely, while several provided only cursory coverage and a few covered it to excess. The scope and quality of the coverage does not necessarily correlate with the significance of the issue for the school.
  • Coverage of the attainment and progress of the most able learners is variable. Some reports offer only generic descriptions of attainment and progress combined, some are focused exclusively on attainment in the core subjects while others take a wider curricular perspective. Outside the middle school sector, desirable attainment outcomes for the most able are almost invariably defined exclusively in terms of A* and A grade GCSEs.
  • Hardly any reports consider the attainment and/or progress of the most able learners in receipt of the Pupil Premium.
  • None of these reports makes specific and explicit reference to IAG for the most able. It is rarely stated whether the school’s curriculum satisfies the needs of the most able.
  • Too many reports adopt formulaic approaches, especially in the recommendations they offer the school. Too few include reference to interesting or different practice.

In my judgement, too much current inspection reporting falls short of the commitments contained in the original Ofsted survey report and of the more recent requirement to:

‘always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

 

Recommendations

  • Ofsted should publish a glossary defining clearly all the terms for the most able that it employs, so that both inspectors and schools understand exactly what is intended when a particular term is deployed and which learners should be in scope when the most able are discussed.
  • Ofsted should co-ordinate the development of supplementary guidance clarifying their expectations of schools in respect of provision for the most able. This should set out in more detail what expectations would apply for such provision to be rated outstanding, good, requiring improvement and inadequate respectively. This should include the most able in receipt of the Pupil Premium, the suitability of the curriculum and the provision of IAG.
  • Ofsted should provide supplementary guidance for inspectors outlining and exemplifying the full range of evidence they might interrogate concerning the attainment and progress of the most able learners, including those in receipt of the Pupil Premium.
  • This guidance should specify the essential minimum coverage expected in reports and the ‘triggers’ that would warrant it being referenced in the main findings and/or recommendations for action.
  • This guidance should discourage inspectors from adopting formulaic descriptors and recommendations and specifically encourage them to identify unusual or innovative examples of effective practice.
  • The school inspection handbook and subsidiary guidance should be amended to reflect the supplementary guidance.
  • The School Data Dashboard should be expanded to include key data highlighting the attainment and progress of the most able.
  • These actions should also be undertaken for inspection of the primary and 16-19 sectors respectively.

 

Overall assessment: Requires Improvement.

 

GP

May 2014


PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance

This post compares the performance of high achievers from selected jurisdictions on the PISA 2012 creative problem solving test.

It draws principally on the material in the OECD Report ‘PISA 2012 Results: Creative Problem Solving’ published on 1 April 2014.

Pisa ps cover Capture

The sample of jurisdictions includes England, other English-speaking countries (Australia, Canada, Ireland and the USA) and those that typically top the PISA rankings (Finland, Hong Kong, South Korea, Shanghai, Singapore and Taiwan).

With the exception of New Zealand, which did not take part in the problem solving assessment, this is deliberately identical to the sample I selected for a parallel post reviewing comparable results in the PISA 2012 assessments of reading, mathematics and science: ‘PISA 2012: International Comparisons of High Achievers’ Performance’ (December 2013).

These eleven jurisdictions account for nine of the top twelve performers ranked by mean overall performance in the problem solving assessment. (The USA and Ireland lie outside the top twelve, while Japan, Macao and Estonia are the three jurisdictions that are in the top twelve but outside my sample.)

The post is divided into seven sections:

  • Background to the problem solving assessment: How PISA defines problem solving competence; how it defines performance at each of the six levels of proficiency; how it defines high achievement; the nature of the assessment and who undertook it.
  • Average performance, the performance of high achievers and the performance of low achievers (proficiency level 1) on the problem solving assessment. This comparison includes my own sample and all the other jurisdictions that score above the OECD average on the first of these measures.
  • Gender and socio-economic differences amongst high achievers on the problem solving assessment  in my sample of eleven jurisdictions.
  • The relative strengths and weaknesses of jurisdictions in this sample on different aspects of the problem solving assessment. (This treatment is generic rather than specific to high achievers.)
  • What proportion of high achievers on the problem-solving assessment in my sample of jurisdictions are also high achievers in reading, maths and science respectively.
  • What proportion of students in my sample of jurisdictions achieves highly in one or more of the four PISA 2012 assessments – and against the ‘all-rounder’ measure, which is based on high achievement in all of reading, maths and science (but not problem solving).
  • Implications for education policy makers seeking to improve problem solving performance in each of the sample jurisdictions.

Background to the Problem Solving Assessment


Definition of problem solving

PISA’s definition of problem-solving competence is:

‘…an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.’

The commentary on this definition points out that:

  • Problem solving requires identification of the problem(s) to be solved, planning and applying a solution, and monitoring and evaluating progress.
  • A problem is ‘a situation in which the goal cannot be achieved by merely applying learned procedures’, so the problems encountered must be non-routine for 15 year-olds, although ‘knowledge of general strategies’ may be useful in solving them.
  • Motivational and affective factors are also in play.

The Report is rather coy about the role of creativity in problem solving, and hence the justification for the inclusion of this term in its title.

Perhaps the nearest it gets to an exposition is when commenting on the implications of its findings:

‘In some countries and economies, such as Finland, Shanghai-China and Sweden, students master the skills needed to solve static, analytical problems similar to those that textbooks and exam sheets typically contain as well or better than 15-year-olds, on average, across OECD countries. But the same 15-year-olds are less successful when not all information that is needed to solve the problem is disclosed, and the information provided must be completed by interacting with the problem situation. A specific difficulty with items that require students to be open to novelty, tolerate doubt and uncertainty, and dare to use intuitions (“hunches and feelings”) to initiate a solution suggests that opportunities to develop and exercise these traits, which are related to curiosity, perseverance and creativity, need to be prioritised.’


Assessment framework

PISA’s framework for assessing problem solving competence is set out in the following diagram:

 

PISA problem solving framework Capture

 

In solving a particular problem it may not be necessary to apply all these steps, or to apply them in this order.

Proficiency levels

The proficiency scale was designed to have a mean score across OECD countries of 500. The six levels of proficiency applied in the assessment each have their own profile.

The lowest level of proficiency, level 1, is described thus:

‘At Level 1, students can explore a problem scenario only in a limited way, but tend to do so only when they have encountered very similar situations before. Based on their observations of familiar scenarios, these students are able only to partially describe the behaviour of a simple, everyday device. In general, students at Level 1 can solve straightforward problems provided there is a simple condition to be satisfied and there are only one or two steps to be performed to reach the goal. Level 1 students tend not to be able to plan ahead or set sub-goals.’

This level equates to a range of scores from 358 to 423. Across the OECD sample, 91.8% of participants can perform tasks at this level (i.e. at level 1 or above).

By comparison, level 5 proficiency is described in this manner:

‘At Level 5, students can systematically explore a complex problem scenario to gain an understanding of how relevant information is structured. When faced with unfamiliar, moderately complex devices, such as vending machines or home appliances, they respond quickly to feedback in order to control the device. In order to reach a solution, Level 5 problem solvers think ahead to find the best strategy that addresses all the given constraints. They can immediately adjust their plans or backtrack when they detect unexpected difficulties or when they make mistakes that take them off course.’

The associated range of scores is from 618 to 683, and 11.4% of all OECD students achieve at this level or above.

Finally, level 6 proficiency is described in this way:

‘At Level 6, students can develop complete, coherent mental models of diverse problem scenarios, enabling them to solve complex problems efficiently. They can explore a scenario in a highly strategic manner to understand all information pertaining to the problem. The information may be presented in different formats, requiring interpretation and integration of related parts. When confronted with very complex devices, such as home appliances that work in an unusual or unexpected manner, they quickly learn how to control the devices to achieve a goal in an optimal way. Level 6 problem solvers can set up general hypotheses about a system and thoroughly test them. They can follow a premise through to a logical conclusion or recognise when there is not enough information available to reach one. In order to reach a solution, these highly proficient problem solvers can create complex, flexible, multi-step plans that they continually monitor during execution. Where necessary, they modify their strategies, taking all constraints into account, both explicit and implicit.’

The range of level 6 scores is from 683 points upwards and 2.5% of all OECD participants score at this level.

PISA defines high achieving students as those securing proficiency level 5 or higher, so proficiency levels 5 and 6 together. The bulk of the analysis it supplies relates to this cohort, while relatively little attention is paid to the more exclusive group achieving proficiency level 6, even though almost 10% of students in Singapore reach this standard in problem solving.
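Since the extracts above quote cut-off scores only for levels 1, 5 and 6, any banding of scores has to group levels 2-4 together. Here is a minimal sketch of that mapping in Python – my own illustration, not OECD code:

```python
def proficiency_band(score: float) -> str:
    """Map a PISA 2012 problem-solving score to the bands quoted above."""
    # The extracts above quote cut-offs only for levels 1, 5 and 6,
    # so levels 2-4 are grouped into a single middle band here.
    if score < 358:
        return "below level 1"
    if score < 423:
        return "level 1 (358-423)"
    if score < 618:
        return "levels 2-4 (cut-offs not quoted here)"
    if score < 683:
        return "level 5 (618-683)"   # 'high achievers' start here
    return "level 6 (683 and above)"

print(proficiency_band(500))  # the OECD mean falls in the middle band
print(proficiency_band(683))  # level 6 runs from 683 points upwards
```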


The sample

Sixty-five jurisdictions took part in PISA 2012, including all 34 OECD countries and 31 partners. But only 44 jurisdictions took part in the problem solving assessment, including 28 OECD countries and 16 partners. As noted above, that included all my original sample of twelve jurisdictions, with the exception of New Zealand.

I could find no stated reason why New Zealand chose not to take part. Press reports initially suggested that England would do likewise, but it was subsequently reported that this decision had been reversed.

The assessment was computer-based and comprised 16 units divided into 42 items. The units were organised into four clusters, each designed to take 20 minutes to complete. Participants completed one or two clusters, depending on whether they were also undertaking computer-based assessments of reading and maths.

In each jurisdiction a random sample of those who took part in the paper-based maths assessment was selected to undertake the problem solving assessment. About 85,000 students took part in all. The unweighted sample sizes in my selected jurisdictions are set out in Table 1 below, together with the total population of 15 year-olds in each jurisdiction.

 

Table 1: Sample sizes undertaking PISA 2012 problem solving assessment in selected jurisdictions

Country Unweighted Sample Total 15 year-olds
Australia 5,612 291,976
Canada 4,601 417,873
Finland 3,531 62,523
Hong Kong 1,325 84,200
Ireland 1,190 59,296
Shanghai 1,203 108,056
Singapore 1,394 53,637
South Korea 1,336 687,104
Taiwan 1,484 328,356
UK (England) 1,458 738,066
USA 1,273 3,985,714

Those taking the assessment were aged between 15 years and three months and 16 years and two months at the time of the assessment. All were enrolled at school and had completed at least six years of formal schooling.
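One point worth drawing out of Table 1 is how much the implied sampling fractions differ, which is why the unweighted counts should not be compared directly across jurisdictions. A quick illustrative calculation of my own for three rows:

```python
# Sampling fractions implied by Table 1
# (unweighted sample / total population of 15 year-olds).
rows = {
    "Finland":      (3_531, 62_523),
    "England (UK)": (1_458, 738_066),
    "USA":          (1_273, 3_985_714),
}
for name, (sample, cohort) in rows.items():
    print(f"{name}: {100 * sample / cohort:.2f}% of the cohort")
# Finland ~5.65%, England ~0.20%, USA ~0.03%
```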

Average performance compared with the performance of high and low achievers

The overall table of mean scores on the problem solving assessment is shown below.

PISA problem solving raw scores Capture


There are some familiar names at the top of the table, especially Singapore and South Korea, the two countries that comfortably lead the rankings. Japan is some ten points behind in third place but it in turn has a lead of twelve points over a cluster of four other Asian competitors: Macao, Hong Kong, Shanghai and Taiwan.

A slightly different picture emerges if we compare average performance with the proportion of learners who achieve the bottom proficiency level and the top two proficiency levels. Table 2 below compares these groups.

This table includes all the jurisdictions that exceeded the OECD average score. I have marked out in bold the countries in my sample of eleven, which includes Ireland, the only one of them that did not exceed the OECD average.

Table 2: PISA Problem Solving 2012: Comparing Average Performance with Performance at Key Proficiency Levels

 

Jurisdiction Mean score Level 1 (%) Level 5 (%) Level 6 (%) Levels 5+6 (%)
Singapore 562 6.0 19.7 9.6 29.3
South Korea 561 4.8 20.0 7.6 27.6
Japan 552 5.3 16.9 5.3 22.2
Macao 540 6.0 13.8 2.8 16.6
Hong Kong 540 7.1 14.2 5.1 19.3
Shanghai 536 7.5 14.1 4.1 18.2
Taiwan 534 8.2 14.6 3.8 18.4
Canada 526 9.6 12.4 5.1 17.5
Australia 523 10.5 12.3 4.4 16.7
Finland 523 9.9 11.4 3.6 15.0
England (UK) 517 10.8 10.9 3.3 14.2
Estonia 515 11.1 9.5 2.2 11.7
France 511 9.8 9.9 2.1 12.0
Netherlands 511 11.2 10.9 2.7 13.6
Italy 510 11.2 8.9 1.8 10.7
Czech Republic 509 11.9 9.5 2.4 11.9
Germany 509 11.8 10.1 2.7 12.8
USA 508 12.5 8.9 2.7 11.6
Belgium 508 11.6 11.4 3.0 14.4
Austria 506 11.9 9.0 2.0 11.0
Norway 503 13.2 9.7 3.4 13.1
Ireland 498 13.3 7.3 2.1 9.4
OECD Ave. 500 13.2 8.9 2.5 11.4


The jurisdictions at the top of the table also have a familiar profile, with a small ‘tail’ of low performance combined with high levels of performance at the top end.

Nine of the top ten have fewer than 10% of learners at proficiency level 1, though only South Korea pushes below 5%.

Five of the top ten have 5% or more of their learners at proficiency level 6, but only Singapore and South Korea have a higher percentage at level 6 than level 1 (with Japan managing the same percentage at both levels).

The top three performers – Singapore, South Korea and Japan – are the only three jurisdictions that have over 20% of their learners at proficiency levels 5 and 6 together.

South Korea slightly outscores Singapore at level 5 (20.0% against 19.7%). Japan is in third place, followed by Taiwan, Hong Kong and Shanghai.

But at level 6, Singapore has a clear lead, followed by South Korea, Japan, Hong Kong and Canada respectively.

England’s overall place in the table is relatively consistent on each of these measures, but the gaps between England and the top performers vary considerably.

The best have fewer than half England’s proportion of learners at proficiency level 1, almost twice as many learners at proficiency level 5 and more than twice as many at proficiency levels 5 and 6 together. But at proficiency level 6 they have almost three times as many learners as England.
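Those multiples are my own arithmetic on the Table 2 figures, setting the strongest performer on each measure (South Korea at levels 1 and 5; Singapore at level 6 and at levels 5 and 6 together) against England:

```python
# Ratios behind the comparison above, from Table 2 (percentages of learners).
england = {"L1": 10.8, "L5": 10.9, "L5+6": 14.2, "L6": 3.3}
best    = {"L1": 4.8,  "L5": 20.0, "L5+6": 29.3, "L6": 9.6}

for measure in england:
    print(measure, round(best[measure] / england[measure], 2))
# L1 0.44 (fewer than half), L5 1.83, L5+6 2.06, L6 2.91
```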

Chart 1 below compares performance on these four measures across my sample of eleven jurisdictions.

All but Ireland are below the OECD average for the percentage of learners at proficiency level 1, though the USA only narrowly so. The USA and Ireland are atypical in having a bigger tail (proficiency level 1) than their cadres of high achievers (levels 5 and 6 together).

At level 5 all but Ireland and the USA are above the OECD average (the USA exactly matches it), but the USA edges above the OECD average at level 6.

There is a fairly strong correlation between the proportions of learners achieving the highest proficiency thresholds and average performance in each jurisdiction. However, Canada stands out by having an atypically high proportion of students at level 6.
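That association can be sanity-checked directly from Table 2. A minimal sketch using the eleven sample rows and the Python standard library (statistics.correlation needs Python 3.10 or later):

```python
from statistics import correlation

# Mean problem-solving score and % at levels 5+6 for the eleven sample
# jurisdictions, in the order they appear in Table 2.
mean_score = [562, 561, 540, 536, 534, 526, 523, 523, 517, 508, 498]
top_share  = [29.3, 27.6, 19.3, 18.2, 18.4, 17.5, 16.7, 15.0, 14.2, 11.6, 9.4]

print(round(correlation(mean_score, top_share), 2))  # close to 1: a strong association
```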


Chart 1: PISA 2012 Problem-solving: Comparing performance at specified proficiency levels

Problem solving chart 1


PISA’s Report discusses the variation in problem-solving performance within different jurisdictions. However, it does so without reference to the proficiency levels, so we do not know to what extent these findings apply equally to high achievers.

Amongst those above the OECD average, those with least variation are Macao, Japan, Estonia, Shanghai, Taiwan, Korea, Hong Kong, USA, Finland, Ireland, Austria, Singapore and the Czech Republic respectively.

Perhaps surprisingly, the degree of variation in Finland is identical to that in the USA and Ireland, while Estonia has less variation than many of the Asian jurisdictions. Singapore, while top of the performance table, is only just above the OECD average in terms of variation.

The jurisdictions with more variation than the OECD average – listed in order of increasing variation – include England, Australia and Canada, though all three are relatively close to the OECD average. So these three countries and Singapore are all relatively close together on this measure.

Gender and socio-economic differences amongst high achievers


Gender differences

On average across OECD jurisdictions, boys score seven points higher than girls on the problem solving assessment. There is also more variation amongst boys than girls.

Across the OECD participants, 3.1% of boys achieved proficiency level 6 but only 1.8% of girls did so. This imbalance was repeated at proficiency level 5, achieved by 10% of boys and 7.7% of girls.

The table and chart below show the variations within my sample of eleven jurisdictions. The performance of boys exceeds that of girls in all cases, except in Finland at proficiency level 5, and in that instance the gap in favour of girls is relatively small (0.4 percentage points).


Table 3: PISA Problem-solving: Gender variation at top proficiency levels

Jurisdiction Level 5 (%) Level 6 (%) Levels 5+6 (%)
  Boys Girls Diff Boys Girls Diff Boys Girls Diff
Singapore 20.4 19.0 +1.4 12.0 7.1 +4.9 32.4 26.1 +6.3
South Korea 21.5 18.3 +3.2 9.4 5.5 +3.9 30.9 23.8 +7.1
Hong Kong 15.7 12.4 +3.3 6.1 3.9 +2.2 21.8 16.3 +5.5
Shanghai 17.0 11.4 +5.6 5.7 2.6 +3.1 22.7 14.0 +8.7
Taiwan 17.3 12.0 +5.3 5.0 2.5 +2.5 22.3 14.5 +7.8
Canada 13.1 11.8 +1.3 5.9 4.3 +1.6 19.0 16.1 +2.9
Australia 12.6 12.0 +0.6 5.1 3.7 +1.4 17.7 15.7 +2.0
Finland 11.2 11.6 -0.4 4.1 3.0 +1.1 15.3 14.6 +0.7
England (UK) 12.1 9.9 +2.2 3.6 3.0 +0.6 15.7 12.9 +2.8
USA 9.8 7.9 +1.9 3.2 2.3 +0.9 13.0 10.2 +2.8
Ireland 8.0 6.6 +1.4 3.0 1.1 +1.9 11.0 7.7 +3.3
OECD Average 10.0 7.7 +2.3 3.1 1.8 +1.3 13.1 9.5 +3.6

There is no consistent pattern in whether boys are more heavily over-represented at proficiency level 5 than proficiency level 6, or vice versa.

There is a bigger difference at level 6 than at level 5 in Singapore, South Korea, Canada, Australia, Finland and Ireland, but the reverse is true in the five remaining jurisdictions.

At level 5, boys are in the greatest ascendancy in Shanghai and Taiwan while, at level 6, this is true of Singapore and South Korea.

When proficiency levels 5 and 6 are combined, all five of the ‘Asian Tigers’ show a difference in favour of males of 5.5 percentage points or more, significantly in advance of the six ‘Western’ countries in the sample and significantly ahead of the OECD average.

Amongst the six ‘Western’ representatives, boys have the biggest advantage at proficiency level 5 in England, while at level 6 boys in Ireland have the biggest advantage.

Within this group of jurisdictions, the gap between boys and girls at level 6 is comfortably the smallest in England. But, at proficiency levels 5 and 6 together, Finland has the smallest gap.


Chart 2: PISA Problem-solving: Gender variation at top proficiency levels

Problem solving chart 2

The Report includes a generic analysis of gender differences in performance for boys and girls with similar levels of performance in reading, maths and science.

It concludes that girls perform above their expected level in both England and Australia (though the difference is statistically significant only in the latter).

The Report comments:

‘It is not clear whether one should expect there to be a gender gap in problem solving. On the one hand, the questions posed in the PISA problem-solving assessment were not grounded in content knowledge, so boys’ or girls’ advantage in having mastered a particular subject area should not have influenced results. On the other hand… performance in problem solving is more closely related to performance in mathematics than to performance in reading. One could therefore expect the gender difference in performance to be closer to that observed in mathematics – a modest advantage for boys, in most countries – than to that observed in reading – a large advantage for girls.’


Socio-economic differences

The Report considers variations in performance against PISA’s Index of Economic, Social and Cultural status (IESC), finding them weaker overall than for reading, maths and science.

It calculates that the overall percentage variation in performance attributable to these factors is about 10.6% (compared with 14.9% in maths, 14.0% in science and 13.2% in reading).

Amongst the eleven jurisdictions in my sample, the weakest correlations were found in Canada (4%), followed by Hong Kong (4.9%), South Korea (5.4%), Finland (6.5%), England (7.8%), Australia (8.5%), Taiwan (9.4%), the USA (10.1%) and Ireland (10.2%) in that order. All those jurisdictions had correlations below the OECD average.

Perhaps surprisingly, there were above average correlations in Shanghai (14.1%) and, to a lesser extent (and less surprisingly) in Singapore (11.1%).

The report suggests that students with parents working in semi-skilled and elementary occupations tend to perform above their expected level in problem-solving in Taiwan, England, Canada, the USA, Finland and Australia (in that order – with Australia closest to the OECD average).

The jurisdictions where these students tend to underperform their expected level are – in order of severity – Ireland, Shanghai, Singapore, Hong Kong and South Korea.

A parallel presentation accompanying the Report provides some additional data about the performance in different countries of what the OECD calls ‘resilient’ students – those in the bottom quartile of the IESC but in the top quartile by performance, after accounting for socio-economic status.

It supplies the graph below, which shows all the Asian countries in my sample clustered at the top, but also with significant gaps between them. Canada is the highest-performing of the remainder in my sample, followed by Finland, Australia, England and the USA respectively. Ireland is some way below the OECD average.


PISA problem resolving resilience Capture


Unfortunately, I can find no analysis of how performance varies according to socio-economic variables at each proficiency level. It would be useful to see which jurisdictions have the smallest ‘excellence gaps’ at levels 5 and 6 respectively.


How different jurisdictions perform on different aspects of problem-solving

The Report’s analysis of comparative strengths and weaknesses in different elements of problem-solving does not take account of variations at different proficiency levels.

It explains that aspects of the assessment were found easier by students in different jurisdictions, employing a four-part distinction between:

‘Exploring and understanding. The objective is to build mental representations of each of the pieces of information presented in the problem. This involves:

  • exploring the problem situation: observing it, interacting with it, searching for information and finding limitations or obstacles; and
  • understanding given information and, in interactive problems, information discovered while interacting with the problem situation; and demonstrating understanding of relevant concepts.

Representing and formulating. The objective is to build a coherent mental representation of the problem situation (i.e. a situation model or a problem model). To do this, relevant information must be selected, mentally organised and integrated with relevant prior knowledge. This may involve:

  • representing the problem by constructing tabular, graphic, symbolic or verbal representations, and shifting between representational formats; and
  • formulating hypotheses by identifying the relevant factors in the problem and their inter-relationships; and organising and critically evaluating information.

Planning and executing. The objective is to use one’s knowledge about the problem situation to devise a plan and execute it. Tasks where “planning and executing” is the main cognitive demand do not require any substantial prior understanding or representation of the problem situation, either because the situation is straightforward or because these aspects were previously solved. “Planning and executing” includes:

  • planning, which consists of goal setting, including clarifying the overall goal, and setting subgoals, where necessary; and devising a plan or strategy to reach the goal state, including the steps to be undertaken; and
  • executing, which consists of carrying out a plan.

Monitoring and reflecting. The objective is to regulate the distinct processes involved in problem solving, and to critically evaluate the solution, the information provided with the problem, or the strategy adopted. This includes:

  • monitoring progress towards the goal at each stage, including checking intermediate and final results, detecting unexpected events, and taking remedial action when required; and
  • reflecting on solutions from different perspectives, critically evaluating assumptions and alternative solutions, identifying the need for additional information or clarification and communicating progress in a suitable manner.’

Amongst my sample of eleven jurisdictions:

  • ‘Exploring and understanding’ items were found easier by students in Singapore, Hong Kong, South Korea, Australia, Taiwan and Finland. 
  • ‘Representing and formulating’ items were found easier in Taiwan, Shanghai, South Korea, Singapore, Hong Kong, Canada and Australia. 
  • ‘Planning and executing’ items were found easier in Finland only. 
  • ‘Monitoring and reflecting’ items were found easier in Ireland, Singapore, the USA and England.

The Report concludes:

‘This analysis shows that, in general, what differentiates high-performing systems, and particularly East Asian education systems, such as those in Hong Kong-China, Japan, Korea [South Korea], Macao-China, Shanghai -China, Singapore and Chinese Taipei [Taiwan], from lower-performing ones, is their students’ high level of proficiency on “exploring and understanding” and “representing and formulating” tasks.’

It also distinguishes those jurisdictions that perform best on interactive problems, requiring students to discover some of the information required to solve the problem, rather than being presented with all the necessary information. This seems to be the nearest equivalent to a measure of creativity in problem solving.

Comparative strengths and weaknesses in respect of interactive tasks are captured in the following diagram.


PISA problem solving strengths in different countries


One can see that several of my sample – Ireland, the USA, Canada, Australia, South Korea and Singapore – are placed in the top right-hand quarter of the diagram, indicating stronger than expected performance on both interactive and knowledge acquisition tasks.

England is stronger than expected on the former but not on the latter.

Jurisdictions that are weaker than expected only on interactive tasks include Hong Kong, Taiwan and Shanghai, while Finland is weaker than expected on both.

We have no information about whether these distinctions were maintained at different proficiency levels.


Comparing jurisdictions’ performance at higher proficiency levels

Table 4 and Charts 3 and 4 below show variations in the performance of countries in my sample across the four different assessments at level 6, the highest proficiency level.

The charts in particular emphasise how far ahead the Asian Tigers are in maths at this level, compared with the cross-jurisdictional variation in the other three assessments.

In all five cases, each ‘Asian Tiger’s’ level 6 performance in maths also vastly exceeds its level 6 performance in the other three assessments. The proportion of students achieving level 6 proficiency in problem solving lags far behind, even though there is a fairly strong correlation between these two assessments (see below).

In contrast, all the ‘Western’ jurisdictions in the sample – with the sole exception of Ireland – achieve a higher percentage at proficiency level 6 in problem solving than they do in maths, although the difference is always less than a full percentage point. (Even in Ireland the difference is only 0.1 of a percentage point in favour of maths.)

Shanghai is the only jurisdiction in the sample which has more students achieving proficiency level 6 in science than in problem solving. It also has the narrowest gap between level 6 performance in problem solving and in reading.

Meanwhile, England, the USA, Finland and Australia all have broadly similar profiles across the four assessments, with the largest percentage of level 6 performers in problem solving, followed by maths, science and reading respectively.

The proximity of the lines marking level 6 performance in reading and science is also particularly evident in the second chart below.


Table 4: Percentage achieving proficiency Level 6 in each domain

  PS L6  Ma L6 Sci L6 Re L6
Singapore 9.6 19.0 5.8 5.0
South Korea 7.6 12.1 1.1 1.6
Hong Kong 5.1 12.3 1.8 1.9
Shanghai 4.1 30.8 4.2 3.8
Taiwan 3.8 18.0 0.6 1.4
Canada 5.1 4.3 1.8 2.1
Australia 4.4 4.3 2.6 1.9
Finland 3.6 3.5 3.2 2.2
England (UK) 3.3 3.1 1.9 1.3
USA 2.7 2.2 1.1 1.0
Ireland 2.1 2.2 1.5 1.3
OECD Average 2.5 3.3 1.2 1.1

 Charts 3 and 4: Percentage achieving proficiency level 6 in each domain

Problem solving chart 3

Problem solving chart 4

The pattern is materially different at proficiency level 5 and above, as the table and chart below illustrate. These also include the proportion of all-rounders, who achieved proficiency level 5 or above in each of maths, science and reading (the all-rounder measure does not extend to problem solving).

The lead enjoyed by the ‘Asian Tigers’ in maths is somewhat less pronounced. The gap between performance within these jurisdictions on the different assessments also tends to be less marked, although maths accounts for comfortably the largest proportion of level 5+ performance in all five cases.

Conversely, level 5+ performance on the different assessments is typically much closer in the ‘Western’ countries. Problem solving leads the way in Australia, Canada, England and the USA, but in Finland science is in the ascendant and reading is strongest in Ireland.

Some jurisdictions have a far ‘spikier’ profile than others. Ireland is closest to achieving equilibrium across all four assessments. Australia and England share very similar profiles, though Australia outscores England in each assessment.

The second chart in particular shows how Shanghai’s ‘spike’ applies in all the other three assessments but not in problem solving.

Table 5: Percentage achieving Proficiency level 5 and above in each domain

  PS L5+  Ma L5+ Sci L5+ Re L5+ Ma + Sci + Re L5+
Singapore 29.3 40.0 22.7 21.2 16.4
South Korea 27.6 30.9 11.7 14.2 8.1
Hong Kong 19.3 33.4 16.7 16.8 10.9
Shanghai 18.2 55.4 27.2 25.1 19.6
Taiwan 18.4 37.2 8.4 11.8 6.1
Canada 17.5 16.4 11.3 12.9 6.5
Australia 16.7 14.8 13.5 11.7 7.6
Finland 15.0 15.2 17.1 13.5 7.4
England (UK) 14.2 12.4 11.7 9.1 5.7* all UK
USA 11.6 9.0 7.4 7.9 4.7
Ireland 9.4 10.7 10.8 11.4 5.7
OECD Average 11.4 12.6 8.4 8.4 4.4


Charts 5 and 6: Percentage Achieving Proficiency Level 5 and above in each domain

Problem solving chart 5

Problem solving chart 6

How high-achieving problem solvers perform in other assessments


Correlations between performance in different assessments

The Report provides an analysis of the proportion of students achieving proficiency levels 5 and 6 on problem solving who also achieved that outcome on one of the other three assessments: reading, maths and science.

It argues that problem solving is a distinct and separate domain. However:

‘On average, about 68% of the problem-solving score reflects skills that are also measured in one of the three regular assessment domains. The remaining 32% reflects skills that are uniquely captured by the assessment of problem solving. Of the 68% of variation that problem-solving performance shares with other domains, the overwhelming part is shared with all three regular assessment domains (62% of the total variation); about 5% is uniquely shared between problem solving and mathematics only; and about 1% of the variation in problem solving performance hinges on skills that are specifically measured in the assessments of reading or science.’
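Restated as simple arithmetic, my reading of the quoted decomposition is:

```python
# The quoted decomposition of variation in problem-solving performance.
shared_with_all_three = 62  # % shared with maths, reading and science together
uniquely_with_maths   = 5   # % uniquely shared with mathematics
uniquely_read_or_sci  = 1   # % uniquely shared with reading or science (roughly)
shared_total = shared_with_all_three + uniquely_with_maths + uniquely_read_or_sci  # = 68
unique_to_problem_solving = 100 - shared_total                                     # = 32
```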

It discusses the correlation between these different assessments:

‘A key distinction between the PISA 2012 assessment of problem solving and the regular assessments of mathematics, reading and science is that the problem-solving assessment does not measure domain-specific knowledge; rather, it focuses as much as possible on the cognitive processes fundamental to problem solving. However, these processes can also be used and taught in the other subjects assessed. For this reason, problem-solving tasks are also included among the test units for mathematics, reading and science, where their solution requires expert knowledge specific to these domains, in addition to general problem-solving skills.

It is therefore expected that student performance in problem solving is positively correlated with student performance in mathematics, reading and science. This correlation hinges mostly on generic skills, and should thus be about the same magnitude as between any two regular assessment subjects.’

These overall correlations are set out in the table below, which shows that maths has a higher correlation with problem solving than either science or reading, but that this correlation is lower than those between the three subject-related assessments.

The correlation between maths and science (0.90) is comfortably the strongest (despite the relationship between reading and science at the top end of the distribution noted above).

PISA problem solving correlations capture

Correlations are broadly similar across jurisdictions, but the Report notes that the association is comparatively weak in some of these, including Hong Kong. Students here are more likely to perform poorly on problem solving and well on other assessments, or vice versa.

There is also broad consistency at different performance levels, but the Report identifies those jurisdictions where students with the same level of performance exceed expectations in relation to problem-solving performance. These include South Korea, the USA, England, Australia, Singapore and – to a lesser extent – Canada.

Those with lower than expected performance include Shanghai, Ireland, Hong Kong, Taiwan and Finland.

The Report notes:

‘In Shanghai-China, 86% of students perform below the expected level in problem solving, given their performance in mathematics, reading and science. Students in these countries/economies struggle to use all the skills that they demonstrate in the other domains when asked to perform problem-solving tasks.’

However, there is variation according to students’ maths proficiency:

  • Jurisdictions whose high scores on problem solving are mainly attributable to strong performers in maths include Australia, England and the USA. 
  • Jurisdictions whose high scores on problem solving are more attributable to weaker performers in maths include Ireland. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among strong performers in maths include South Korea. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among weak performers in maths include Hong Kong and Taiwan. 
  • Jurisdictions whose weakness in problem solving is fairly consistent regardless of performance in maths include Shanghai and Singapore.

The Report adds:

‘In Italy, Japan and Korea, the good performance in problem solving is, to a large extent, due to the fact that lower performing students score beyond expectations in the problem-solving assessment….This may indicate that some of these students perform below their potential in mathematics; it may also indicate, more positively, that students at the bottom of the class who struggle with some subjects in school are remarkably resilient when it comes to confronting real-life challenges in non-curricular contexts…

In contrast, in Australia, England (United Kingdom) and the United States, the best students in mathematics also have excellent problem-solving skills. These countries’ good performance in problem solving is mainly due to strong performers in mathematics. This may suggest that in these countries, high performers in mathematics have access to – and take advantage of – the kinds of learning opportunities that are also useful for improving their problem-solving skills.’

What proportion of high performers in problem solving are also high performers in one of the other assessments?

The percentages of high achieving students (proficiency level 5 and above) in my sample of eleven jurisdictions who perform equally highly in each of the three domain-specific assessments are shown in Table 6 and Chart 7 below.

These show that Shanghai leads the way in each case, with 98.0% of all students who achieve proficiency level 5+ in problem solving also achieving the same outcome in maths. For science and reading the comparable figures are 75.1% and 71.7% respectively.

Taiwan is the nearest competitor in respect of problem solving plus maths, Finland in the case of problem solving plus science and Ireland in the case of problem solving plus reading.

South Korea, Taiwan and Canada are atypical of the rest in recording a higher proportion of problem solving plus reading at this level than problem solving plus science.

Singapore, Shanghai and Ireland are the only three jurisdictions that score above 50% on all three of these combinations. However, the only jurisdictions that exceed the OECD averages in all three cases are Singapore, Hong Kong, Shanghai and Finland.

Table 6: PISA problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

  PS + Ma PS + Sci PS + Re
Singapore 84.1 57.0 50.2
South Korea 73.5 34.1 40.3
Hong Kong 79.8 49.4 48.9
Shanghai 98.0 75.1 71.7
Taiwan 93.0 35.3 43.7
Canada 57.7 43.9 44.5
Australia 61.3 54.9 47.1
Finland 66.1 65.4 49.5
England (UK) 59.0 52.8 41.7
USA 54.6 46.9 45.1
Ireland 59.0 57.2 52.0
OECD Average 63.5 45.7 41.0
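It is worth remembering that Table 6 shows conditional percentages – shares of those already at level 5+ in problem solving. Multiplying by the overall problem-solving shares in Table 5 recovers the proportion of all students at level 5+ in both domains; a small worked example of my own for the Singapore row:

```python
# Table 6 figures are conditional on being L5+ in problem solving.
ps_l5_plus     = 29.3  # % of all Singapore students at L5+ in problem solving (Table 5)
maths_given_ps = 84.1  # % of those also at L5+ in maths (Table 6)
both = ps_l5_plus * maths_given_ps / 100
print(f"~{both:.1f}% of Singapore students are L5+ in both")  # ~24.6%
# cf. 25.0% for 'PS + at least one other' in Table 7, of which maths is
# much the largest component.
```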

Chart 7: PISA Problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

Problem solving chart 7

What proportion of students achieve highly in one or more assessments?

Table 7 and Chart 8 below show how many students in each of my sample achieved proficiency level 5 or higher in problem solving only, in problem solving and one or more other assessments, in one or more assessments but not problem solving, and in at least one assessment (i.e. the total of the three preceding columns).

I have also repeated in the final column the percentage achieving this proficiency level in each of maths, science and reading. (PISA has not released information about the proportion of students who achieved this feat across all four assessments.)
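Because the first three columns partition the ‘at least one’ group, each row should sum to the fourth column; a one-line consistency check of my own against the Singapore row of Table 7 below:

```python
# The first three columns of Table 7 partition 'L5+ in at least one'.
ps_only, ps_plus_others, others_not_ps = 4.3, 25.0, 16.5  # Singapore row
assert abs((ps_only + ps_plus_others + others_not_ps) - 45.8) < 0.05
```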

These reveal that the percentages of students who achieve proficiency level 5+ only in problem solving are very small, ranging from 0.3% in Shanghai to 6.7% in South Korea.

Conversely, the percentages of students achieving proficiency level 5+ in any one of the other assessments but not in problem solving are typically significantly higher, ranging from 4.5% in the USA to 38.1% in Shanghai.

There is quite a bit of variation in terms of whether jurisdictions score more highly on ‘problem solving and at least one other’ (second column) or on ‘at least one other excluding problem solving’ (third column).

More importantly, the fourth column shows that the jurisdiction with the most students achieving proficiency level 5 or higher in at least one assessment is clearly Shanghai, followed by Singapore, Hong Kong, South Korea and Taiwan in that order.

The proportion of students achieving this outcome in Shanghai is close to three times the OECD average, more than twice the rate achieved in any of the ‘Western’ countries and some three and a half times the rate achieved in the USA.

The same is true of the proportion of students achieving this level in the three domain-specific assessments.

On this measure, South Korea and Taiwan fall significantly behind their Asian competitors, and the latter is overtaken by Australia, Finland and Canada.


Table 7: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

  PS only % PS + 1 or more % 1+ but not PS % L5+ in at least one % L5+ in Ma + Sci + Re %
Singapore 4.3 25.0 16.5 45.8 16.4
South Korea 6.7 20.9 11.3 38.9 8.1
Hong Kong 3.4 15.9 20.5 39.8 10.9
Shanghai 0.3 17.9 38.1 56.3 19.6
Taiwan 1.2 17.1 20.4 38.7 6.1
Canada 5.5 12.0 9.9 27.4 6.5
Australia 4.7 12.0 7.7 24.4 7.6
Finland 3.0 12.0 11.9 26.9 7.4
England (UK) 4.4 9.8 6.8 21.0 5.7* all UK
USA 4.1 7.5 4.5 16.1 4.7
Ireland 2.6 6.8 10.1 19.5 5.7
OECD Average 3.1 8.2 8.5 19.8 4.4

Chart 8: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

Problem solving chart 8

The Report comments:

‘The proportion of students who reach the highest levels of proficiency in at least one domain (problem solving, mathematics, reading or science) can be considered a measure of the breadth of a country’s/economy’s pool of top performers. By this measure, the largest pool of top performers is found in Shanghai-China, where more than half of all students (56%) perform at the highest levels in at least one domain, followed by Singapore (46%), Hong Kong-China (40%), Korea and Chinese Taipei (39%)…Only one OECD country, Korea, is found among the five countries/economies with the largest proportion of top performers. On average across OECD countries, 20% of students are top performers in at least one assessment domain.

The proportion of students performing at the top in problem solving and in either mathematics, reading or science, too can be considered a measure of the depth of this pool. These are top performers who combine the mastery of a specific domain of knowledge with the ability to apply their unique skills flexibly, in a variety of contexts. By this measure, the deepest pools of top performers can be found in Singapore (25% of students), Korea (21%), Shanghai-China (18%) and Chinese Taipei (17%). On average across OECD countries, only 8% of students are top performers in both a core subject and in problem solving.’

There is no explanation of why proficiency level 5 should be equated by PISA with the breadth of a jurisdiction’s ‘pool of top performers’. The distinction between proficiency levels 5 and 6 in this respect requires further discussion.

In addition to updated ‘all-rounder’ data showing what proportion of students achieved this outcome across all four assessments, it would be really interesting to see the proportion of students achieving at proficiency level 6 across different combinations of these four assessments – and to see what proportion of students achieving that outcome in different jurisdictions are direct beneficiaries of targeted support, such as a gifted education programme.

In the light of this analysis, what are jurisdictions’ priorities for improving problem solving performance?

Leaving aside strengths and weaknesses in different elements of problem solving discussed above, this analysis suggests that the eleven jurisdictions in my sample should address the following priorities:

Singapore has a clear lead at proficiency level 6, but falls behind South Korea at level 5 (though Singapore re-establishes its ascendancy when levels 5 and 6 are considered together). It also has more level 1 performers than South Korea. It should perhaps focus on reducing the size of this tail and pushing through more of its mid-range performers to level 5. There is a pronounced imbalance in favour of boys at level 6, so enabling more girls to achieve the highest level of performance is a clear priority. There may also be a case for prioritising the children of semi-skilled workers.

South Korea needs to focus on getting a larger proportion of its level 5 performers to level 6. This effort should be focused disproportionately on girls, who are significantly under-represented at both levels 5 and 6. South Korea has a very small tail to worry about – and may even be getting close to minimising this. It needs to concentrate on improving the problem solving skills of its stronger performers in maths.

Hong Kong has a slightly bigger tail than Singapore’s but is significantly behind at both proficiency levels 5 and 6. In the case of level 6 it is equalled by Canada. Hong Kong needs to focus simultaneously on reducing the tail and lifting performance across the top end, where girls and weaker performers in maths are a clear priority.

Shanghai has a similar profile to Hong Kong’s in all respects, though with somewhat fewer level 6 performers. It also needs to focus effort simultaneously at the top and the bottom of the distribution. Amongst this sample, Shanghai has the worst under-representation of girls at level 5 and levels 5 and 6 together, so addressing that imbalance is an obvious priority. It also demonstrated the largest variation in performance against PISA’s IESC index, which suggests that it should target young people from disadvantaged backgrounds, as well as the children of semi-skilled workers.

Taiwan is rather similar to Hong Kong and Shanghai, but its tail is slightly bigger and its level 6 cadre slightly smaller, while it does somewhat better at level 5. It may need to focus more at the very bottom, but also at the very top. Taiwan also has a problem with high-performing girls, second only to Shanghai as far as level 5 and levels 5 and 6 together are concerned. However, like Shanghai, it does comparatively better than the other ‘Asian Tigers’ in terms of girls at level 6. It also needs to consider the problem solving performance of its weaker performers in maths.

Canada is the closest western competitor to the ‘Asian Tigers’ in terms of the proportions of students at levels 1 and 5 – and it already outscores Shanghai and Taiwan at level 6. It needs to continue cutting down the tail without compromising achievement at the top end. Canada also has small but significant gender imbalances in favour of boys at the top end.

Australia by comparison is significantly worse than Canada at level 1, broadly comparable at level 5 and somewhat worse at level 6. It too needs to improve scores at the very bottom and the very top. Australia’s gender imbalance is more pronounced at level 6 than level 5.

Finland has the same mean score as Australia but a smaller tail (though not quite as small as Canada’s). It needs to improve across the piece but might benefit from concentrating rather more heavily at the top end. Finland has a slight gender imbalance in favour of girls at level 5, but boys are more in the ascendancy at level 6 than in either England or the USA. As in Australia, this latter point needs addressing.

England has a profile similar to Australia’s, but less effective at all three selected proficiency levels. It is further behind at the top than at the bottom of the distribution, but needs to work hard at both ends to catch up the strongest western performers and maintain its advantage over the USA and Ireland. Gender imbalances are small but nonetheless significant.

USA has a comparatively long tail of low achievement at proficiency level 1 and, with the exception of Ireland, the fewest high achievers. This profile is very close to the OECD average. As in England, the relatively small size of gender imbalances in favour of boys does not mean that these can be ignored.

Ireland has the longest tail of low achievement and the smallest proportion of students at proficiency levels 5 and 6, separately and combined. It needs to improve at both ends of the achievement distribution. Ireland has a larger preponderance of boys at level 6 than its Western competitors and this needs addressing. The limited socio-economic evidence suggests that Ireland should also be targeting the offspring of parents with semi-skilled and elementary occupations.

So there is further scope for improvement in all eleven jurisdictions. Meanwhile the OECD could usefully provide a more in-depth analysis of high achievers on its assessments that features:

  • Proficiency level 6 performance across the board.
  • Socio-economic disparities in performance at proficiency levels 5 and 6.
  • ‘All-rounder’ achievement at these levels across all four assessments and
  • Correlations between success at these levels and specific educational provision for high achievers including gifted education programmes.


GP

April 2014