Closing England’s Excellence Gaps: Part 2

This is the second part of an extended post considering what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.


Mind the Gap by Clicsouris

Part one provided an England-specific definition, articulated a provisional theoretical model for addressing excellence gaps and set out the published data about the size of excellence gaps at Key Stages 2, 4 and 5 respectively.

Part two continues the review of the evidence base for excellence gaps, covering whether high attainers remain so, international comparisons data and related research, and excellence gaps analysis from the USA.

It also describes those elements of present government policy that impact directly on excellence gaps and offers some recommendations for strengthening our national emphasis on this important issue.

 

Whether disadvantaged high achievers remain so

 

The Characteristics of High Attainers

The Characteristics of high attainers (DfES 2007) includes investigation of:

  • whether pupils in the top 10% at KS4 in 2006 were also high attainers at KS3 in 2004 and KS2 in 2001, by matching back to their fine-grade point scores; and
  • chances of being a KS4 high attainer given a range of pupil characteristics at KS2 and KS3.

On the first point it finds that 4% of all pupils remain in the top 10% throughout, while 83% of pupils are never in the top 10% group.

Some 63% of those who were high attainers at the end of KS2 are still high attainers at the end of KS3, while 72% of KS3 high attainers are still in that group at the end of KS4. Approximately half of high attainers at KS2 are high attainers at KS4.

The calculation is not repeated for advantaged and disadvantaged high attainers respectively, but this shows that – while there is relatively little movement between the high attaining population and other learners (with only 17% of the overall population falling within scope at any point) – there is a sizeable ‘drop out’ amongst high attainers at each key stage.

Turning to the second point, logistic regression is used to calculate the odds of being a KS4 high attainer given different levels of prior attainment and a range of pupil characteristics. Results are controlled for prior attainment and other characteristics, so as to isolate the impact of each individual characteristic.

The study finds that pupils with a KS2 average points score (APS) above 33 are more likely than not to be high attainers at KS4, and this probability increases as their KS2 APS increases. For those with an APS of 36, the odds are 23.73, equivalent to roughly a 96% (about 24 in 25) chance of being a KS4 high attainer.

For FSM-eligible learners though, the odds ratio is 0.55, meaning that the odds of being a KS4 high attainer are 45% lower amongst FSM-eligible pupils than amongst their non-FSM counterparts with similar prior attainment and characteristics.

The full set of findings for individual characteristics is reproduced below.

[Chart: odds of being a KS4 high attainer, by individual pupil characteristic]

 

An appendix supplies the exact odds ratios for each characteristic, and the text points out that these can be multiplied to calculate odds ratios for different combinations of characteristics.

The odds for different prior attainment levels and other characteristics combined with FSM eligibility are not worked through, but could easily be calculated. It would be extremely worthwhile to repeat this analysis using more recent data to see whether the results would be replicated for those completing KS4 in 2014.
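The report itself does not work through these combinations, but the arithmetic is straightforward. The sketch below is my own illustration, using only the two figures quoted above (baseline odds of 23.73 for a KS2 APS of 36, and the 0.55 FSM odds ratio); it shows how an odds ratio scales the baseline odds, and how the result converts back into a probability:

```python
# Illustrative sketch (my own arithmetic, not from the DfES report):
# combining a published odds ratio with baseline odds, then converting
# the combined odds back into a probability.

def odds_to_probability(odds):
    """Convert odds (p / (1 - p)) back to a probability p."""
    return odds / (1 + odds)

baseline_odds = 23.73   # odds for a pupil with a KS2 APS of 36 (from the report)
fsm_odds_ratio = 0.55   # multiplicative effect of FSM eligibility (from the report)

combined_odds = baseline_odds * fsm_odds_ratio

print(round(odds_to_probability(baseline_odds), 2))   # 0.96 – the '24 in 25' chance
print(round(odds_to_probability(combined_odds), 2))   # 0.93
```

Note that the 0.55 ratio scales the odds rather than the probability, which is why the combined chance for an otherwise-identical FSM-eligible pupil falls only modestly, from about 96% to about 93%.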

 

Sutton Trust

In 2008, the Sutton Trust published ‘Wasted talent? Attrition rates of high achieving pupils between school and university’ which examines the attrition rates for FSM-eligible learners among the top 20% of performers at KS2, KS3 and KS4.

A footnote says that this calculation was ‘on the basis of their English and maths scores at age 11, and at later stages of schooling’, which is somewhat unclear. A single, unidentified cohort is tracked across key stages.

The report identifies ‘extremely high rates of “leakage”’ amongst the least privileged pupils. The key finding is that two-thirds of disadvantaged top performers at KS2 are not amongst the top performers at KS4, whereas the equivalent proportion for advantaged top performers is 42%.

 

Ofsted

Ofsted’s The most able students: Are they doing as well as they should in our non-selective secondary schools? (2013) defines this population rather convolutedly as those:

‘…starting secondary school in Year 7 attaining level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and or mathematics at the end of Key Stage 2.’ (Footnote p6-7)

There is relatively little data in the report about the performance of high-attaining disadvantaged learners, other than the statement that only 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points.

I have been unable to find national transition matrices for advantaged and disadvantaged learners, which would enable us to compare the proportions of advantaged and disadvantaged pupils making and exceeding the expected progress between key stages.

 

Regression to the mean and efforts to circumvent it

Much prominence has been given to Feinstein’s 2003 finding that, whereas high-scoring children from advantaged and disadvantaged backgrounds (defined by parental occupation) perform at a broadly similar level when tested at 22 months, the disadvantaged group are subsequently overtaken by relatively low-scoring children from advantaged backgrounds during the primary school years.

The diagram that summarises this relationship has been reproduced widely and much used as the centrepiece of arguments justifying efforts to improve social mobility.

[Chart: Feinstein’s attainment trajectories of high- and low-scoring children, by socio-economic background]

But Feinstein’s findings were subsequently challenged on methodological grounds associated with the effects of regression to the mean.

Jerrim and Vignoles (2011) concluded:

‘There is currently an overwhelming view amongst academics and policymakers that highly able children from poor homes get overtaken by their affluent (but less able) peers before the end of primary school. Although this empirical finding is treated as a stylised fact, the methodology used to reach this conclusion is seriously flawed. After attempting to correct for the aforementioned statistical problem, we find little evidence that this is actually the case. Hence we strongly recommend that any future work on high ability–disadvantaged groups takes the problem of regression to the mean fully into account.’

On the other hand, Whitty and Anders comment:

‘Although some doubt has been raised regarding this analysis on account of the potential for regression to the mean to exaggerate the phenomenon (Jerrim and Vignoles, 2011), it is highly unlikely that this would overturn the core finding that high SES, lower ability children catch up with their low-SES, higher-ability peers.’

Their point is borne out by Progress made by high-attaining children from disadvantaged backgrounds (June 2014), which suggests that Vignoles, as part of the writing team, has somewhat changed her mind since 2011.

This research adopts a methodological route to minimise the impact of regression to the mean. This involves assigning learners to achievement groups using a different test to those used to follow their attainment trajectories and focusing principally on those trajectories from KS2 onwards.

The high attaining group is defined as those achieving Level 3 or above in KS1 writing, a threshold met by 12.6% of the sample. (For comparison, the same calculations are undertaken based on achieving L3 or above in KS1 maths.) These pupils are ranked and assigned a percentile on the basis of their performance on the remaining KS1 tests and at each subsequent key stage.

The chart summarising the outcomes in the period from KS1 to KS4 is reproduced below, showing the different trajectories of the ‘most deprived’ and ‘least deprived’. These are upper and lower quintile groups of state school students, derived from FSM eligibility together with area-based measures of disadvantage and census-based measures of socio-economic status.

 

[Chart: percentile trajectories from KS1 to KS4 for the most and least deprived quintiles]

The trajectories do not alter significantly beyond KS4.

The study concludes:

‘…children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’

Hence:

‘The period between Key Stage 2 and Key Stage 4 appears to be a crucial time to ensure that higher-achieving pupils from poor backgrounds remain on a high achievement trajectory.’

In short, a Feinstein-like relationship is established but it operates at a somewhat later stage in the educational process.

 

International comparisons studies

 

PISA: Resilience

OECD PISA studies have recently begun to report on the performance of what they call ‘resilient’ learners.

Against the Odds: Disadvantaged Students Who Succeed in Schools (OECD, 2011) describes this population as those who fall within the bottom third of their country’s distribution by socio-economic background, but who achieve within the top third on PISA assessments across participating countries.

This publication uses PISA 2006 science results as the basis of its calculations. The relative position of different countries is shown in the chart reproduced below. Hong Kong tops the league at 24.8%, the UK is at 13.5%, slightly above the OECD average of 13%, while the USA is languishing at 9.9%.

[Chart: proportion of resilient students by country, PISA 2006 science]

The findings were discussed further in PISA in Focus 5 (OECD 2011), where PISA 2009 data is used to make the calculation. The methodology is also significantly adjusted so that it includes a substantially smaller population:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

According to this measure, the UK is at 24% and the US has leapfrogged it at 28%. Both are below the OECD average of 31%, while Shanghai and Hong Kong stand at over 70%.

The Report on PISA 2012 (OECD 2013) retains the more demanding definition of resilience, but dispenses with multiplication by 4, so these results need to be so multiplied to be comparable with those for 2009.

This time round, Shanghai is at 19.2% (76.8%) and Hong Kong at 18.1% (72.4%). The OECD average is 6.4% (25.6%), the UK at 5.8% (23.2%) and the US at 5.2% (20.8%).

So the UK has lost a little ground compared with 2009, but is much closer to the OECD average and has overtaken the US, which has fallen back by some seven percentage points.
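Because disadvantaged students are defined as the bottom quarter of the ESCS index, the published 2012 shares of all students convert to shares of disadvantaged students by simple multiplication by 4. A minimal sketch of that rescaling, using the figures quoted above:

```python
# Illustrative sketch: rescaling the PISA 2012 resilience figures (published
# as shares of ALL students) into shares of DISADVANTAGED students, for
# comparison with the PISA 2009 presentation. Figures are those in the text.

share_of_all_students = {
    'Shanghai': 19.2,
    'Hong Kong': 18.1,
    'OECD average': 6.4,
    'UK': 5.8,
    'US': 5.2,
}

# Disadvantaged pupils are the bottom quarter of the ESCS index, so
# multiplying a share of all students by 4 gives the share of disadvantaged
# students who are resilient.
share_of_disadvantaged = {k: round(v * 4, 1) for k, v in share_of_all_students.items()}

print(share_of_disadvantaged['UK'])  # 23.2
```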

I could find no commentary on these changes.

NFER has undertaken some work on resilience in Northern Ireland, using PISA 2009 reading results (and the original ‘one third’ methodology) as a base. This includes odds ratios for different characteristics of being resilient. This could be replicated for England using PISA 2012 data and the latest definition of resilience.

 

Research on socio-economic gradients

The Socio-Economic Gradient in Teenagers’ Reading Skills: How Does England Compare with Other Countries? (Jerrim 2012) compares the performance of students within the highest and lowest quintiles of the ISEI Index of Occupational Status on the PISA 2009 reading tests.

It quantifies the proportion of these two populations within each decile of achievement, so generating a gradient, before reviewing how this gradient has changed between PISA 2000 and PISA 2009, comparing outcomes for England, Australia, Canada, Finland, Germany and the US.

Jerrim summarises his findings thus:

‘The difference between advantaged and disadvantaged children’s PISA 2009 reading test scores in England is similar (on average) to that in most other developed countries (including Australia, Germany and, to some extent, the US). This is in contrast to previous studies from the 1990s, which suggested that there was a particularly large socio-economic gap in English pupils’ academic achievement.

Yet the association between family background and high achievement seems to be stronger in England than elsewhere.

There is some evidence that the socio-economic achievement gradient has been reduced in England over the last decade, although not amongst the most able pupils from advantaged and disadvantaged homes.’

Jerrim finds that the link in England between family background and high achievement is stronger than in most other OECD countries, whereas this is not the case at the other end of the distribution.

He hypothesises that this might be attributable to the recent policy focus on reducing the ‘long tail’ while:

‘much less attention seems to be paid to helping disadvantaged children who are already doing reasonably well to push on and reach the top grades’.

He dismisses the notion that the difference is associated with the fact that disadvantaged children are concentrated in lower-performing schools, since it persists even when controls for school effects are introduced.

In considering why PISA scores show the achievement gap in reading has reduced between 2000 and 2009 at the lower end of the attainment distribution but not at the top, he cites two possibilities: that Government policy has been disproportionately successful at the lower end; and that there has been a more substantial decline in achievement amongst learners from advantaged backgrounds than amongst their disadvantaged peers. He is unable to rule out the latter possibility.

He also notes in passing that PISA scores in maths do not generate the same pattern.

These arguments are further developed in ‘The Reading Gap: The socio-economic gap in children’s reading skills: A cross-national comparison using PISA 2009’ (Jerrim, 2013) which applies the same methodology.

This finds that high-achieving (top decile of the test distribution) boys from the most advantaged quintile in England are two years and seven months ahead of high-achieving boys from the most disadvantaged quintile, while the comparable gap for girls is slightly lower, at two years and four months.

The chart reproduced below illustrates international comparisons for boys. It shows that only Scotland has a larger high achievement gap than England. (The black lines indicate 99% confidence intervals; he attributes the uncertainty to ‘sampling variation’.)

Gaps in countries at the bottom of the table are approximately half the size of those in England and Scotland.

[Chart: socio-economic gap amongst high-achieving boys in reading, by country]

 

One of the report’s recommendations is that:

‘The coalition government has demonstrated its commitment to disadvantaged pupils by establishing the Education Endowment Foundation… A key part of this Foundation’s future work should be to ensure highly able children from disadvantaged backgrounds succeed in school and have the opportunity to enter top universities and professional jobs. The government should provide additional resources to the foundation to trial interventions that specifically target already high achieving children from disadvantaged homes. These should be evaluated using robust evaluation methodologies (e.g. randomised control trials) so that policymakers develop a better understanding of what schemes really have the potential to work.’

The study is published by the Sutton Trust whose Chairman – Sir Peter Lampl – is also chairman of the EEF.

In ‘Family background and access to high ‘status’ universities’ (2013) Jerrim provides a different chart showing estimates by country of disadvantaged high achieving learners. The measure of achievement is PISA Level 5 in reading and the measure of disadvantage remains quintiles derived from the ISEI index.

[Chart: estimated proportion of disadvantaged high achievers by country, PISA Level 5 reading]

The underlying figures are not supplied.

Also in 2013, in ‘The mathematical skills of school children: how does England compare to the high-performing East Asian jurisdictions?’ Jerrim and Choi construct a similar gradient for maths, drawing on a mix of PISA and TIMSS assessments conducted between 2003 and 2009, so enabling them to consider variation according to the age at which assessment takes place.

The international tests selected are TIMSS 2003, 4th grade; TIMSS 2007, 8th grade and PISA 2009. The differences between what these tests measure are described as ‘slight’. The analysis of achievement relies on deciles of the achievement distribution.

Thirteen comparator countries are included: six wealthy western economies, three ‘middle income’ western economies and four Asian Tigers (Hong Kong, Japan, Singapore and Taiwan).

This study applies as the best available proxy for socio-economic status the number of books in the family home, comparing the most advantaged (over 200 books) with the least (under 25 books). It acknowledges the limitations of this proxy, which Jerrim discusses elsewhere.

The evidence suggests that:

‘between primary school and the end of secondary school, the gap between the lowest achieving children in England and the lowest achieving children in East Asian countries is reduced’

but remains significant.

Conversely, results for the top 10% of the distribution:

‘suggest that the gap between the highest achieving children in England and the highest achieving children in East Asia increases between the end of primary school and the end of secondary school’.

The latter outcome is illustrated in the chart reproduced below.

[Chart: gap between the highest achievers in England and East Asia, by age of assessment]

 

The authors do not consider variation by socio-economic background amongst the high-achieving cohort, presumably because the data still does not support the pattern they previously identified for reading.

 

US studies

In 2007 the Jack Kent Cooke Foundation published ‘Achievement Trap: How America is Failing Millions of High-Achieving Students from Low Income Backgrounds’ (Wyner, Bridgeland and Diiulio). The text was subsequently revised in 2009.

This focuses exclusively on gaps attributable to socio-economic status, by comparing the performance of those in the top and bottom halves of the family income distribution in the US, as adjusted for family size.

The achievement measure is top quartile performance on nationally normalised exams administered within two longitudinal studies: The National Education Longitudinal Study (NELS) and the Baccalaureate and Beyond Longitudinal Study (B&B).

The study reports that relatively few lower income students remain high achievers throughout their time in elementary and high school:

  • 56% remain high achievers in reading by Grade 5, compared with 69% of higher income students.
  • 25% fall out of the high achiever cohort in high school, compared with 16% of higher income students.
  • Higher income learners who are not high achievers in Grade 1 are more than twice as likely to be high achievers by Grade 5. The same is true between Grades 8 and 12.

2007 also saw the publication of ‘Overlooked Gems: A national perspective on low income promising learners’ (Van Tassel-Baska and Stambaugh). This is a compilation of the proceedings of a 2006 conference which does not attempt a single definition of the target group, but draws on a variety of different research studies and programmes, each with different starting points.

An influential 2009 McKinsey study, ‘The Economic Impact of the Achievement Gap in America’s Schools’, acknowledges the existence of what it calls a ‘top gap’. The term is used with reference to:

  • the number of top performers and the level of top performance in the US compared with other countries and
  • the gap in the US between the proportion of Black/Latino students and the proportion of all students achieving top levels of performance.

The authors discuss the colossal economic costs of achievement gaps more generally, but fail to extend this analysis to the ‘top gap’ specifically.

In 2010 ‘Mind the Other Gap: The Growing Excellence Gap in K-12 Education’ (Plucker, Burroughs and Song) was published – and seems to have been the first study to use this term.

The authors define such gaps straightforwardly as:

‘Differences between subgroups of students performing at the highest levels of achievement’

The measures of high achievement deployed are the advanced standards on US NAEP maths and reading tests, at Grades 4 and 8 respectively.

The study identifies gaps based on four sets of learner characteristics:

  • Socio-economic status (eligible or not for free or reduced price lunch).
  • Ethnic background (White versus Black and/or Hispanic).
  • English language proficiency (what we in England would call EAL, compared with non-EAL).
  • Gender (girls versus boys).

Each characteristic is dealt with in isolation, so there is no discussion of the gaps between – for example – disadvantaged Black/Hispanic and disadvantaged White boys.

In relation to socio-economic achievement gaps, Plucker et al find that:

  • In Grade 4 maths, from 1996 to 2007, the proportion of advantaged learners achieving the advanced level increased by 5.6 percentage points, while the proportion of disadvantaged learners doing so increased by 1.2 percentage points. In Grade 8 maths, these percentage point changes were 5.7 and 0.8 percentage points respectively. Allowing for changes in the size of the advantaged and disadvantaged cohorts, excellence gaps are estimated to have widened by 4.1 percentage points in Grade 4 (to 7.3%) and 4.9 percentage points in Grade 8 (to 8.2%).
  • In Grade 4 reading, from 1998 to 2007, the proportion of advantaged learners achieving the advanced level increased by 1.2 percentage points, while the proportion of disadvantaged students doing so increased by 0.8 percentage points. In Grade 8 reading, these percentage point changes were almost negligible for both groups. The Grade 4 excellence gap is estimated to have increased slightly, by 0.4 percentage points (to 9.4%) whereas Grade 8 gaps have increased minimally by 0.2 percentage points (to 3.1%).

They observe that the sizes of excellence gaps are, at best, only moderately correlated with the sizes of gaps at lower levels of achievement.

There is a weak relationship between gaps at basic and advanced level – indeed ‘smaller achievement gaps among minimally competent students is related to larger gaps among advanced students’ – but there is some inter-relationship between those at proficient and advanced level.

They conclude that, whereas No Child Left Behind (NCLB) helped to narrow achievement gaps, this does not extend to high achievers.

There is no substantive evidence that the NCLB focus on lower achievers has increased the excellence gap, although the majority of states surveyed by the NAGC felt that NCLB had diverted attention and resource away from gifted education.

In 2011, ‘Do High Fliers Maintain their Altitude?’ (Xiang et al) provided a US analysis of whether individual students remain high achievers throughout their school careers.

They do not report outcomes for disadvantaged high achievers, but do consider briefly those attending schools with high and low proportions respectively of students eligible for free and reduced price lunches.

For this section of the report, high achievement is defined as ‘those whose math or reading scores placed them within the top ten per cent of their individual grades and schools’. Learners were tracked from Grades 3 to 5 and Grades 6 to 8.

It is described as exploratory, because the sample was not representative.

However:

‘High-achieving students attending high-poverty schools made about the same amount of academic growth over time as their high-achieving peers in low-poverty schools…It appears that the relationship between a school’s poverty rate and the growth of its highest-achieving students is weak. In other words, attending a low-poverty school adds little to the average high achiever’s prospects for growth.’

The wider study was criticised in a review by the NEPC, in part on the grounds that the results may have been distorted by regression to the mean, a shortcoming only briefly discussed in an appendix.

The following year saw the publication of Unlocking Emergent Talent: Supporting High Achievement of Low-Income, High-Ability Students (Olszewski-Kubilius and Clarenbach, 2012).

This is the report of a national summit on the issue convened in that year by the NAGC.

It follows Plucker (one of the summit participants) in taking as its starting point the achievement of the advanced level on selected NAEP assessments by learners eligible for free and reduced price lunches.

But it also reports some additional outcomes for Grade 12 and for assessments of civics and writing:

  • ‘Since 1998, 1% or fewer of 4th-, 8th-, and 12th-grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.
  • Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’

The bulk of the report is devoted to identifying barriers to progress and offering recommendations for improving policy, practice and research. I provided an extended analysis in this post from May 2013.

Finally, ‘Talent on the Sidelines: Excellence Gaps and America’s Persistent Talent Underclass’ (Plucker, Hardesty and Burroughs 2013) is a follow-up to ‘Mind the Other Gap’.

It updates the findings in that report, as set out above:

  • In Grade 4 maths, from 1996 to 2011, the proportion of advantaged students scoring at the advanced level increased by 8.3 percentage points, while the proportion of disadvantaged learners doing so increased by 1.5 percentage points. At Grade 8, the comparable changes were 8.5 percentage points and 1.5 percentage points respectively. Excellence gaps have increased by 6.8 percentage points at Grade 4 (to 9.6%) and by 7 percentage points at Grade 8 (to 10.3%).
  • In Grade 4 reading, from 1998 to 2011, the proportion of advantaged students scoring at the advanced level increased by 2.6 percentage points, compared with an increase of 0.9 percentage points amongst disadvantaged learners. Grade 8 saw equivalent increases of 1.8 and 0.9 percentage points respectively. Excellence gaps are estimated to have increased at Grade 4 by 1.7 percentage points (to 10.7%) and marginally increased at Grade 8 by 0.9 percentage points (to 4.2%).
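As a quick arithmetic check (my own calculation, not taken from the report), the growth in each gap over the period equals the advantaged group’s percentage-point gain minus the disadvantaged group’s gain:

```python
# Illustrative check of the 2013 'Talent on the Sidelines' figures quoted
# above: gap growth = advantaged gain - disadvantaged gain (percentage points).

changes = {  # (advantaged gain, disadvantaged gain), 1996/1998 to 2011
    'Grade 4 maths':   (8.3, 1.5),
    'Grade 8 maths':   (8.5, 1.5),
    'Grade 4 reading': (2.6, 0.9),
    'Grade 8 reading': (1.8, 0.9),
}

for subject, (adv, dis) in changes.items():
    growth = round(adv - dis, 1)
    print(f'{subject}: gap grew by {growth} percentage points')
    # e.g. 'Grade 4 maths: gap grew by 6.8 percentage points'
```

This simple subtraction reproduces the published growth figures for the 2013 update; the 2010 ‘Mind the Other Gap’ estimates additionally allowed for changes in the sizes of the advantaged and disadvantaged cohorts, which is why they do not equal the raw differences.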

In short, many excellence gaps remain large and most continue to grow. The report’s recommendations are substantively the same as those put forward in 2010.

 

How Government education policy impacts on excellence gaps

Although many aspects of Government education policy may be expected to have some longer-term impact on raising the achievement of all learners, advantaged and disadvantaged alike, relatively few interventions are focused exclusively and directly on closing attainment gaps between advantaged and disadvantaged learners – and so have the potential to make a significant difference to excellence gaps.

The most significant of these include:

 

The Pupil Premium:

In November 2010, the IPPR voiced concerns that the benefits of the pupil premium might not reach all those learners who attract it.

Accordingly, they recommended that the pupil premium should be allocated directly to those learners through an individual Pupil Premium Entitlement, which might be used to support a menu of approved activities, including ‘one-to-one teaching to stretch the most able low income pupils’.

The recommendation has not been repeated and the present Government shows no sign of restricting schools’ freedom over how they use the premium.

However, the Blunkett Labour Policy Review ‘Putting students and parents first’ recommends that Labour in government should:

‘Assess the level and use of the Pupil Premium to ensure value for money, and that it is targeted to enhance the life chances of children facing the biggest challenges, whether from special needs or from the nature of the background and societal impact they have experienced.’

In February 2013 Ofsted reported that schools spending the pupil premium successfully to improve achievement:

‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.

Conversely, where schools were less successful in spending the funding, they:

‘focused on pupils attaining the nationally expected level at the end of the key stage…but did not go beyond these expectations, so some more able eligible pupils underachieved.’

In July 2013, DfE’s Evaluation of Pupil Premium reported that, when deciding which disadvantaged pupils to target for support, the most common criterion was ‘low attainment’, applied in 91% of primary schools and 88% of secondary schools.

In June 2013, in ‘The Most Able Students’, Ofsted reported that:

‘Pupil Premium funding was used in only a few instances to support the most able students who were known to be eligible for free school meals. The funding was generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds.’

Accordingly, it gave a commitment that:

‘Ofsted will… consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds.’

However, this was not translated into the school inspection guidance.

The latest edition of the School Inspection Handbook says only:

‘Inspectors should pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should. For example, does a large enough proportion of those pupils who had the highest attainment at the end of Key Stage 2 in English and mathematics achieve A*/A GCSE grades in these subjects by the age of 16?

Inspectors should summarise the achievements of the most able pupils in a separate paragraph of the inspection report.’

There is no reference to the most able in parallel references to the pupil premium.

There has, however, been some progress in giving learners eligible for the pupil premium priority in admission to selective schools.

In May 2014, the TES reported that:

‘Thirty [grammar] schools have been given permission by the Department for Education to change their admissions policies already. The vast majority of these will introduce the changes for children starting school in September 2015…A small number – five or six – have already introduced the reform.’

The National Grammar Schools Association confirmed that:

‘A significant number of schools (38) have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’

In July 2014, the Government launched a consultation on the School Admissions Code which proposes extending to all state-funded schools the option to give priority in their admission arrangements to learners eligible for the pupil premium. This was previously open to academies and free schools via their funding agreements.

 

The Education Endowment Foundation (EEF)

The EEF describes itself as:

‘An independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.’

The 2010 press release announcing its formation emphasised its role in raising standards in underperforming schools. This was reinforced by the Chairman in a TES article from June 2011:

‘So the target group for EEF-funded projects in its first couple of years are pupils eligible for free school meals in primary and secondary schools underneath the Government’s floor standards at key stages 2 and 4. That’s roughly 1,500 schools up and down the country. Projects can benefit other schools and pupils, as long as there is a significant focus on this core target group of the most needy young people in the most challenging schools.’

I have been unable to trace any formal departure from this position, though it no longer appears in this form in the Foundation’s guidance. The Funding FAQs say only:

‘In the case of projects involving the whole school, rather than targeted interventions, we would expect applicants to be willing to work with schools where the proportion of FSM-eligible pupils is well above the national average and/or with schools where FSM-eligible pupils are under-performing academically.’

I can find no EEF-funded projects that are exclusively or primarily focused on high-attaining disadvantaged learners, though a handful of its reports do refer to the impact on this group.

 

Changes to School Accountability Measures

As we have seen in Part one, the School Performance Tables currently provide very limited information about the performance of disadvantaged high achievers.

The July 2013 consultation document on primary assessment and accountability reform included a commitment to publish a series of headline measures in the tables including:

‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score in each subject.’

Moreover, it added:

‘We will publish all the headline measures to show the attainment and progress of pupils for whom the school is in receipt of the pupil premium.’

Putting two and two together, this should mean that, from 2016, we will be able to see the percentage of pupil premium-eligible students achieving a high scaled score, though we do not yet know what ‘high scaled score’ means, nor do we know whether the data will be for English and maths separately or combined.

The October 2013 response to the secondary assessment and accountability consultation document fails to say explicitly whether excellence gap measures will be published in School Performance Tables.

It mentions that:

‘Schools will now be held to account for (a) the attainment of their disadvantaged pupils, (b) the progress made by their disadvantaged pupils, and (c) the in-school gap in attainment between disadvantaged pupils and their peers.’

Meanwhile the planned data portal will contain:

‘the percentage of pupils achieving the top grades in GCSEs’

but the interaction between these two elements, if any, remains unclear.

The March 2014 response to the consultation on post-16 accountability and assessment says:

‘We intend to develop measures covering all five headline indicators for students in 16-19 education who were in receipt of pupil premium funding in year 11.’

The post-16 headline measures include a new progress measure and an attainment measure showing the average points score across all level 3 qualifications.

It is expected that a destination measure will also be provided, as long as the methodology can be made sufficiently robust. The response says:

‘A more detailed breakdown of destinations data, such as entry to particular groups of universities, will continue to be published below the headline. This will include data at local authority level, so that destinations for students in the same area can be compared.’

and this should continue to distinguish the destinations of disadvantaged students.

Additional A level attainment measures – the average grade across the best three A levels and the achievement of AAB grades with at least two in facilitating subjects – seem unlikely to be differentiated according to disadvantage.

There remains a possibility that much more excellence gap data, for primary, secondary and post-16, will be made available through the planned school portal, but no specification has been made public at the time of writing.

 

The Social Mobility and Child Poverty Commission (SMCPC)

The Commission was established with the expectation that it would ‘hold the Government’s feet to the fire’ to encourage progress on these two topics.

It publishes annual ‘state of the nation’ reports that are laid before Parliament and also undertakes ‘social mobility advocacy’.

The first annual report – already referenced in Part one – was published in November 2013. The second is due in October 2014.

The Chairman of the Commission was less than complimentary about the quality of the Government’s response to its first report, which made no reference to its comments about attainment gaps at higher grades. It remains to be seen whether the second will be taken any more seriously.

The Commission has already shown significant interest in disadvantaged high achievers – in June 2014 it published the study ‘Progress made by high-attaining children from disadvantaged backgrounds’ referenced above – so there is every chance that the topic will feature again in the 2014 annual report.

The Commission is of course strongly interested in the social mobility indicators and progress made against them, so may also include recommendations for how they might be adjusted to reflect changes to the schools accountability regime set out above.

 

Recommended reforms to close excellence gaps

Several proposals emerge from the commentary on current Government policy above:

  • It would be helpful to have further evaluation of the pupil premium to check whether high-achieving disadvantaged learners are receiving commensurate support. Schools need further guidance on ways in which they can use the premium to support high achievers. This should also be a focus for the pupil premium Champion and in pupil premium reviews.
  • Ofsted’s school inspection handbook requires revision to fulfil its commitment to focus on the most able in receipt of the premium. Inspectors also need guidance (published so schools can see it) to ensure common expectations are applied across institutions. These provisions should be extended to the post-16 inspection regime.
  • All selective secondary schools should be invited to prioritise pupil premium recipients in their admissions criteria, with the Government reserving the right to impose this on schools that do not comply voluntarily.
  • The Education Endowment Foundation should undertake targeted studies of interventions to close excellence gaps, but should also ensure that the impact on excellence gaps is mainstreamed in all the studies they fund. (This should be straightforward since their Chairman has already called for action on this front.)
  • The Government should consider the case for the inclusion of data on excellence gaps in all the headline measures in the primary, secondary and post-16 performance tables. Failing that, such data (percentages and numbers) should be readily accessible from the new data portal as soon as feasible, together with historical data of the same nature. It should also publish annually a statistical analysis of all excellence gaps and the progress made towards closing them. As much progress as possible should be made before the new assessment and accountability regime is introduced. At least one excellence gap measure should be incorporated into revised DfE impact indicators and the social mobility indicators.
  • The Social Mobility and Child Poverty Commission (SMCPC) should routinely consider the progress made in closing excellence gaps within its annual report – and the Government should commit to consider seriously any recommendations they offer to improve such progress.

This leaves the question whether there should be a national programme dedicated to closing excellence gaps, and so improving fair access to competitive universities. (It makes excellent sense to combine these twin objectives and to draw on the resources available to support the latter.)

Much of the research above – whether it originates in the US or UK – argues for dedicated state/national programmes to tackle excellence gaps.

More recently, the Sutton Trust has published a Social Mobility Manifesto for 2015 which recommends that the next government should:

‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.

Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.

Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’

This is not as clear as it might be about the balance between support for the most able and the most able disadvantaged respectively.

I have written extensively about what shape such a programme should have, most recently in the final section of ‘Digging Beneath the Destination Measures’ (July 2014).

The core would be:

‘A light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously…

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs….

…Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.’

 

Close

We know far less than we should about the size of excellence gaps in England – and whether or not progress is being made in closing them.

I hope that this post makes some small contribution towards rectifying matters, even though the key finding is that the picture is fragmented and extremely sketchy.

Rudimentary as it is, this survey should provide a baseline of sorts, enabling us to judge more easily what additional information is required and how we might begin to frame effective practice, whether at institutional or national level.

 

GP

September 2014

Closing England’s Excellence Gaps: Part One

This post examines what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.

Mind the Gap by Clicsouris


It assesses the capacity of current national education policy to close these gaps and recommends further action to improve the prospects of doing so rapidly and efficiently.

Because the post is extremely long I have divided it into two parts.

Part one comprises:

  • A working definition for the English context, explanation of the significance of excellence gaps, description of how this post relates to earlier material and provisional development of the theoretical model articulated in those earlier posts.
  • A summary of the headline data on socio-economic attainment gaps in England, followed by a review of published data relevant to excellence gaps at primary, secondary and post-16 levels.

Part two contains:

  • A distillation of research evidence, including material on whether disadvantaged high attainers remain so, international comparisons studies and research derived from them, and literature covering excellence gaps in the USA.
  • A brief review of how present Government policy might be expected to impact directly on excellence gaps, especially via the Pupil Premium, school accountability measures, the Education Endowment Foundation (EEF) and the Social Mobility and Child Poverty Commission (SMCPC). I have left to one side the wider set of reforms that might have an indirect and/or longer-term impact.
  • Some recommendations for strengthening our collective capacity to quantify, address and ultimately close excellence gaps.

The post is intended to synthesise, supplement and update earlier material, so providing a baseline for further analysis – and ultimately consideration of further national policy intervention, whether under the present Government or a subsequent administration.

It does not discuss the economic and social origins of educational disadvantage, or the merits of wider policy to eliminate poverty and strengthen social mobility.

It starts from the premiss that, while education reform cannot eliminate the effects of disadvantage, it can make a significant, positive contribution by improving significantly the life chances of disadvantaged learners.

It does not debate the fundamental principle that, when prioritising educational support to improve the life chances of learners from disadvantaged backgrounds, governments should not discriminate on the basis of ability or prior attainment.

It assumes that optimal policies will deliver improvement for all disadvantaged learners, regardless of their starting point. It suggests, however, that intervention strategies should aim for equilibrium, prioritising gaps that are furthest away from it and taking account of several different variables in the process.

 

A working definition for the English context

The literature in Part two reveals that there is no accepted universal definition of excellence gaps, so I have developed my own England-specific working definition for the purposes of this post.

An excellence gap is:

‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’

This demands further clarification of what typically constitutes a disadvantaged learner and a threshold of high achievement.

In the English context, the measures of disadvantage with the most currency are FSM eligibility (eligible for and receiving free school meals) and eligibility for the deprivation element of the pupil premium (eligible for and receiving FSM at some point in the preceding six years – often called ‘ever 6’).

Throughout this post, for the sake of clarity, I have given priority to the former over the latter, except where the former is not available.

The foregrounded characteristic is socio-economic disadvantage, but this does not preclude analysis of the differential achievement of sub-groups defined according to secondary characteristics including gender, ethnic background and learning English as an additional language (EAL) – as well as multiple combinations of these.

Some research is focused on ‘socio-economic gradients’, which show how gaps vary at different points of the achievement distribution on a given assessment.

The appropriate thresholds of high achievement are most likely to be measured through national assessments of pupil attainment, notably end of KS2 tests (typically Year 6, age 11), GCSE and equivalent examinations (typically Year 11, age 16) and A level and equivalent examinations (typically Year 13, age 18).

Alternative thresholds of high achievement may be derived from international assessments, such as PISA, TIMSS or PIRLS.

Occasionally – and especially in the case of these international studies – an achievement threshold is statistically derived, in the form of a percentile range of performance, rather than with reference to a particular grade, level or score. I have not allowed for this within the working definition.

Progress measures typically relate to the distance travelled between: baseline assessment (currently at the end of KS1 – Year 2, age 7 – but scheduled to move to Year R, age 4) and end of KS2 tests; or between KS2 tests and the end of KS4 (GCSE); or between GCSE and the end of KS5 (Level 3/A level).

Some studies extend the concept of progress between two thresholds to a longitudinal approach that traces how disadvantaged learners who achieve a particular threshold perform throughout their school careers – do they sustain early success, or fall away, and what proportion are ‘late bloomers’?
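In computational terms, the working definition reduces to a simple difference of proportions. The sketch below illustrates this; the function name and all cohort figures are invented for illustration, not real data:

```python
def excellence_gap(disadvantaged_high, disadvantaged_total,
                   other_high, other_total):
    """Excellence gap in percentage points: the share of 'all other'
    learners reaching a high-achievement threshold minus the share of
    disadvantaged learners reaching it."""
    pct_disadvantaged = 100 * disadvantaged_high / disadvantaged_total
    pct_other = 100 * other_high / other_total
    return pct_other - pct_disadvantaged

# Illustrative (invented) cohort: 900 of 15,000 FSM-eligible pupils and
# 9,000 of 85,000 other pupils reach a given high-attainment threshold.
gap = excellence_gap(900, 15_000, 9_000, 85_000)
print(round(gap, 1))  # 4.6 percentage points
```

The same calculation applies unchanged to progress measures: substitute the numbers securing the requisite progress between two thresholds for the numbers attaining a single threshold.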

 

Why are excellence gaps important?

Excellence gaps are important for two different sets of reasons: those applying to all achievement gaps and those which apply more specifically or substantively to excellence gaps.

Under the first heading:

  • The goal of education should be to provide all learners, including disadvantaged learners, with the opportunity to maximise their educational potential, so eliminating ‘the soft bigotry of low expectations’.
  • Schools should be ‘engines of social mobility’, helping disadvantaged learners to overcome their backgrounds and compete equally with their more advantaged peers.
  • International comparisons studies reveal that the most successful education systems can and do raise attainment for all and close socio-economic achievement gaps simultaneously.
  • There is a strong economic case for reducing – and ideally eradicating – underachievement attributable to disadvantage.

Under the second heading:

  • An exclusive or predominant focus on gaps at the lower end of the attainment distribution is fundamentally inequitable and tends to reinforce the ‘soft bigotry of low expectations’.
  • Disadvantaged learners benefit from successful role models – predecessors or peers from a similar background who have achieved highly and are reaping the benefits.
  • An economic imperative to increase the supply of highly-skilled labour will place greater emphasis on the top end of the achievement distribution. Some argue that there is a ‘smart fraction’ tying national economic growth to a country’s stock of high achievers. There may be additional spin-off benefits from increasing the supply of scientists, writers, artists, or even politicians!
  • The most highly educated disadvantaged learners are least likely to confer disadvantage on their children, so improving the proportion of such learners may tend to improve inter-generational social mobility.

Excellence gaps are rarely identified as such – the term is not yet in common usage in UK education, though it has greater currency in the US. Regardless of terminology, they rarely receive attention, either as part of a wider set of achievement gaps, or separately in their own right.

 

Relationship with earlier posts

Since this blog was founded in April 2010 I have written extensively about excellence gaps and how to address them.

The most pertinent of my previous posts are:

I have also written about excellence gaps in New Zealand – Part 1 and Part 2 (June 2012) – but do not draw on that material here.

Gifted education (or apply your alternative term) is amongst those education policy areas most strongly influenced by political and ideological views on the preferred balance between excellence and equity. This is particularly true of decisions about how best to address excellence gaps.

The excellence-equity trade-off was identified in my first post (May 2010) as one of three fundamental polarities that determine the nature of gifted education and provide the basis for most discussion about what form it should take.

The Gifted Phoenix Manifesto for Gifted Education (March 2013) highlighted their significance thus:

‘Gifted education is about balancing excellence and equity. That means raising standards for all while also raising standards faster for those from disadvantaged backgrounds.

Through combined support for excellence and equity we can significantly increase our national stock of high level human capital and so improve economic growth…

…Excellence in gifted education is about maximising the proportion of high achievers reaching advanced international benchmarks (eg PISA, TIMSS and PIRLS) so increasing the ‘smart fraction’ which contributes to economic growth.

Equity in gifted education is about narrowing (and ideally eliminating) the excellence gap between high achievers from advantaged and disadvantaged backgrounds (which may be attributable in part to causes other than poverty). This also increases the proportion of high achievers, so building the ‘smart fraction’ and contributing to economic growth.’

More recently, one of the 10 draft core principles I set out in ‘Why Can’t We Have National Consensus on Educating High Attainers?’ (June 2014) said:

‘We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.’

 

The theoretical model provisionally developed

Using my working definition as a starting point, this section describes a theoretical model showing how excellence and equity are brought to bear when considering excellence gaps – and then how best to address them.

This should be applicable at any level, from a single school to a national education system and all points in between.

The model depends on securing the optimal balance between excellence and equity where:

  • Excellence is focused on increasing the proportion of all learners who achieve highly and, where necessary, increasing the pitch of high achievement thresholds to remove unhelpful ceiling effects. The thresholds in question may be nationally or internationally determined and are most likely to register high attainment through a formal assessment process. (This may be extended so there is complementary emphasis on increasing the proportion of high-achieving learners who make sufficiently strong progress between two different age- or stage-related thresholds.)
  • Equity is focused on increasing the proportion of high-achieving disadvantaged learners (and/or the proportion of disadvantaged learners making sufficiently strong progress) at a comparatively faster rate, so they form a progressively larger proportion of the overall high-achieving population, up to the point of equilibrium, where advantaged and disadvantaged learners are equally likely to achieve the relevant thresholds (and/or progress measure). This must be secured without deliberately repressing improvement amongst advantaged learners – ie by introducing policies designed explicitly to limit their achievement and/or progress relative to disadvantaged learners – but a decision to do nothing or to redistribute resources in favour of disadvantage is entirely permissible.

The optimal policy response will depend on the starting position and the progress achieved over time.

If excellence gaps are widening, the model suggests that interventions and resources should be concentrated in favour of equity. Policies should be reviewed and adjusted, or strengthened where necessary, to meet the desired objectives.

If excellence gaps are widening rapidly, this reallocation and adjustment process will be relatively more substantial (and probably more urgent) than if they are widening more slowly.

Slowly widening gaps will demand more reallocation and adjustment than a situation where gaps are stubbornly resistant to improvement, or else closing too slowly. But even in the latter case there should be some reallocation and adjustment until equilibrium is achieved.

When excellence gaps are already closing rapidly – and there are no overt policies in place to deliberately repress improvement amongst high-achieving advantaged learners – it may be that unintended pressures in the system are inadvertently bringing this about. In that case, policy and resources should be adjusted to correct these pressures and so restore the correct twin-speed improvement.

The aim is to achieve and sustain equilibrium, even beyond the point when excellence gaps are eliminated, so that they are not permitted to reappear.

If ‘reverse gaps’ begin to materialise, where disadvantaged learners consistently outperform their more advantaged peers, this also threatens equilibrium and would suggest a proportionate redistribution of effort towards excellence.

Such scenarios are most likely to occur in settings with a large proportion of learners who, while not disadvantaged according to the ‘cliff edge’ definition required to make the distinction, are still relatively disadvantaged.

Close attention must therefore be paid to the distribution of achievement across the full spectrum of disadvantage, to ensure that success at the extreme of the distribution does not mask significant underachievement elsewhere.

One should be able to determine a more precise policy response by considering a restricted set of variables. These include:

  • The size of the gaps at the start of the process and, associated with this, the time limit allowed for equilibrium to be reached. Clearly larger gaps are more likely to take longer to close. Policy makers may conclude that steady improvement over several years is more manageable for the system than a rapid sprint towards equilibrium. On the other hand, there may be benefits associated with pace and momentum.
  • The rate at which overall high achievement is improving. If this is relatively fast, the rate of improvement amongst advantaged high achievers will be correspondingly strong, so the rate for disadvantaged high achievers must be stronger still.
  • The variance between excellence gaps at different ages/stages. If the gaps are larger at particular stages of education, the pursuit of equilibrium suggests that disproportionate attention be given to those stages, so that gaps are closed consistently. If excellence gaps are small for relatively young learners and increase with age, priority should be given to the latter, but there may be other factors in play, such as evidence that closing relatively small gaps at an early stage will have a more substantial ‘knock-on’ effect later on.
  • The level at which high achievement thresholds are pitched. Obviously this will influence the size of the gaps that need to be closed. But, other things being equal, enabling a higher proportion of learners to achieve a relatively high threshold will demand more intensive support. On the other hand, relatively fewer learners – whether advantaged or disadvantaged – are likely to be successful. Does one need to move a few learners a big distance or a larger proportion a smaller one?
  • Whether or not gaps at lower achievement thresholds are smaller and/or closing at a faster rate. If so, there is a strong case for securing parity of progress at higher and lower thresholds alike. On the other hand, if excellence gaps are closing more quickly, it may be appropriate to reallocate resources away from them and towards lower levels of achievement.
  • The relative size of the overall disadvantaged population, the associated economic gap between advantage and disadvantage and (as suggested above) the distribution in relation to the cut-off. If the definition of disadvantage is pitched relatively low (ie somewhat disadvantaged), the disadvantaged population will be correspondingly large, but the economic gap between advantage and disadvantage will be relatively small. If the definition is pitched relatively high (ie very disadvantaged) the reverse will be true, giving a comparatively small disadvantaged population but a larger gap between advantage and disadvantage.
  • The proportion of the disadvantaged population that is realistically within reach of the specified high achievement benchmarks. This variable is a matter of educational philosophy. There is merit in an inclusive approach – indeed it seems preferable to overestimate this proportion than the reverse. Extreme care should be taken not to discourage late developers or close off opportunities on the basis of comparatively low current attainment, so reinforcing existing gaps through unhelpfully low expectations. On the other hand, supporting unrealistically high expectations may be equally damaging and ultimately waste scarce resources. There may be more evidence to support such distinctions with older learners than with their younger peers. 
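The model’s core logic – concentrate effort on equity while gaps widen, sustain the reallocation as equilibrium approaches, and rebalance towards excellence if a reverse gap opens – can be caricatured as a simple decision rule. This is purely an illustrative sketch of the logic set out above (the function and its thresholds are my own invention), not a proposed policy instrument:

```python
def policy_emphasis(gap_now, gap_then):
    """Crude decision rule from the model: a positive gap favours
    equity, with urgency depending on the direction of travel; a
    negative ('reverse') gap suggests rebalancing towards excellence."""
    if gap_now < 0:
        return "rebalance towards excellence (reverse gap)"
    trend = gap_now - gap_then  # positive = widening
    if trend > 0:
        return "concentrate resources on equity (gap widening)"
    if gap_now > 0:
        return "maintain reallocation towards equity until equilibrium"
    return "sustain equilibrium"

# E.g. a gap that has edged up from 26.2 to 26.5 percentage points:
print(policy_emphasis(26.5, 26.2))
```

A fuller version would weight the response by the variables listed above – the size of the gap, the pace of overall improvement and the stage of education – rather than by direction of travel alone.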

 

How big are England’s headline attainment gaps and how fast are they closing?

Closing socio-economic achievement gaps has been central to English educational policy for the last two decades, including under the current Coalition Government and its Labour predecessor.

It will remain an important priority for the next Government, regardless of the outcome of the 2015 General Election.

The present Government cites ‘Raising the achievement of disadvantaged children’ as one of ten schools policies it is pursuing.

The policy description describes the issue thus:

‘Children from disadvantaged backgrounds are far less likely to get good GCSE results. Attainment statistics published in January 2014 show that in 2013 37.9% of pupils who qualified for free school meals got 5 GCSEs, including English and mathematics at A* to C, compared with 64.6% of pupils who do not qualify.

We believe it is unacceptable for children’s success to be determined by their social circumstances. We intend to raise levels of achievement for all disadvantaged pupils and to close the gap between disadvantaged children and their peers.’

The DfE’s input and impact indicators – showing progress against the priorities set out in its business plan – do not feature the measure mentioned in the policy description (which is actually five or more GCSEs at Grades A*-C or equivalents, including GCSEs in English and maths).

The gap on this measure was 27.7% in 2009, improving to 26.7% in 2013, so there has been a small 1.0 percentage point improvement over five years, spanning the last half of the previous Government’s term in office and the first half of this Government’s term.

Instead the impact indicators include three narrower measures focused on closing the attainment gap between free school meal pupils and their peers, at 11, 16 and 19 respectively:

  • Impact Indicator 7 compares the percentages of FSM-eligible and all other pupils achieving Level 4 or above in KS2 assessments of reading, writing and maths. The 2013 gap is 18.7 percentage points, down 0.4 points from 19.1 in 2012.
  • Impact Indicator 8 compares the percentages of FSM-eligible and all other pupils achieving A*-C grades in GCSE maths and English. The 2013 gap is 26.5 percentage points, up 0.3 points from 26.2 in 2012.
  • Impact Indicator 9 compares the percentages of learners who were FSM-eligible at age 15 and all other learners attaining a Level 3 qualification by the end of the academic year in which they turn 19. The 2013 gap is 24.3 percentage points, up 0.1 points from 24.2 in 2012.

These small changes, not always pointing in the right direction, reflect the longer term narrative, as is evident from the Government’s Social Mobility Indicators which also incorporate these three measures.

  • In 2005-06 the KS2 L4 maths and English gap was 25.0%, so there has been a fairly substantial 6.3 percentage point reduction over seven years, but only about one quarter of the gap has been closed.
  • In 2007-08 the KS4 GCSE maths and English gap was 28.0%, so there has been a minimal 1.5 percentage point reduction over six years, equivalent to annual national progress of 0.25 percentage points per year. At that rate it will take another century to complete the process.
  • In 2004-05 the Level 3 qualification gap was 26.4%, so there has been a very similar 2.1 percentage point reduction over 8 years.
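The pace implied by these figures can be checked with a quick linear extrapolation. This is a sketch only: it takes the KS2 figures quoted above and assumes the rate of progress since 2005-06 simply continues unchanged.

```python
# Linear extrapolation of the KS2 (L4 English and maths) FSM gap, using the
# figures quoted above: 25.0 points in 2005-06, 18.7 points seven years later.
start_gap, end_gap = 25.0, 18.7   # percentage points
years_elapsed = 7

annual_reduction = (start_gap - end_gap) / years_elapsed   # ~0.9 points per year
years_to_close = end_gap / annual_reduction                # ~21 further years

print(f"{annual_reduction:.2f} points per year; ~{years_to_close:.0f} years to close")
```

Even on the fastest-closing of the three measures, then, straight-line progress implies roughly another two decades.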

The DfE impact indicators also include a set of three destination measures that track the percentage of FSM learners progressing to Oxford and Cambridge, any Russell Group university and any university.

There is a significant time lag with all of these – the most recent available data relates to 2011/2012 – and only two years of data have been collected.

All show an upward trend. Oxbridge is up from 0.1% to 0.2%, Russell Group up from 3% to 4% and any university up from 45% to 47% – actually a 2.5 percentage point improvement.

The Oxbridge numbers are so small that a percentage measure is a rather misleading indicator of marginal improvement from a desperately low base.

It is important to note that forthcoming changes to the assessment regime will impose a different set of headline indicators at ages 11 and 16 that will not be comparable with these.

From 2014, significant methodological adjustments are being introduced to the School Performance Tables, sharply restricting the range of qualifications counted as equivalent to GCSEs. In addition, only a pupil’s first entry in each subject will count for Performance Table purposes, applying to English Baccalaureate subjects in 2014 and to all subjects in 2015.

Both these factors will tend to depress overall results and may be expected to widen attainment gaps on the headline KS4 measure as well as the oft-cited 5+ GCSEs measure.

From 2016 new baseline assessments, the introduction of scaled scores at the end of KS2 and a new GCSE grading system will add a further layer of change.

As a consequence there will be substantial revisions to the headline measures in Primary, Secondary and Post-16 Performance Tables. The latter will include destination measures, provided they can be made methodologically sound.

At the time of writing, the Government has made negligible reference to the impact of these reforms on national measures of progress, including its own Impact Indicators and the parallel Social Mobility indicators, though the latter are reportedly under review.

 

Published data on English excellence gaps

The following sections summarise what data I can find in the public domain about excellence gaps at primary (KS2), secondary (KS4) and post-16 (KS5) respectively.

I have cited the most recent data derivable from Government statistical releases and performance tables, supplemented by other interesting findings gleaned from research and commentary.

 

Primary (KS2) 

The most recent national data is contained in SFR51/2013: National Curriculum Assessments at Key Stage 2: 2012 to 2013. This provides limited information about the differential performance of learners eligible for and receiving FSM (which I have referred to as ‘FSM’), and for those known to be eligible for FSM at any point from Years 1 to 6 (known as ‘ever 6’ and describing those in receipt of the Pupil Premium on grounds of deprivation).

There is also additional information in the 2013 Primary School Performance Tables, where the term ‘disadvantaged’ is used to describe ‘ever 6’ learners and ‘children looked after’.

There is comparatively little variation between these different sets of figures at national level. In the analysis below (and in the subsequent section on KS4) I have used FSM data wherever possible, but have substituted ‘disadvantaged’ data where FSM is not available.  All figures apply to state-funded schools only.

I have used Level 5 and above as the best available proxy for high attainment. Some Level 6 data is available, but in percentages only, and these are all so small that comparisons are misleading.

The Performance Tables distinguish a subset of high attainers, on the basis of prior attainment (at KS1 for KS2 and at KS2 for KS4) but no information is provided about the differential performance of advantaged and disadvantaged high attainers.

In 2013:

  • 21% of all pupils achieved Level 5 or above in reading, writing and maths combined, but only 10% of FSM pupils did so, compared with 26% of others, giving an attainment gap of 16%. The comparable gap at Level 4B (in reading and maths and L4 in writing) was 18%. At Level 4 (across the board) it was 20%. In this case, the gaps are slightly larger at lower attainment levels but, whereas the L4 gap has narrowed by 1% since 2012, the L5 gap has widened by 1%.
  • In reading, 44% of all pupils achieved Level 5 and above, but only 27% of FSM pupils did so, compared with 48% of others, giving an attainment gap of 21%. The comparable gap at Level 4 and above was eight percentage points lower at 13%.
  • In writing (teacher assessment), 31% of all pupils achieved level 5 and above, but only 15% of FSM pupils did so, compared with 34% of others, giving an attainment gap of 19%. The comparable gap at Level 4 and above was three percentage points lower at 16%.
  • In grammar, punctuation and spelling (GPS), 47% of all pupils achieved Level 5 and above, but only 31% of FSM pupils did so, compared with 51% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above was two percentage points lower at 18%.
  • In maths, 41% of pupils in state-funded schools achieved Level 5 and above, up 2% on 2012. But only 24% of FSM pupils achieved this compared with 44% of others, giving an attainment gap of 20%. The comparable gap at level 4 and above is 13%.

Chart 1 shows these outcomes graphically. In four cases out of five, the gap at the higher attainment level is greater, substantially so in reading and maths. All the Level 5 gaps fall between 16% and 21%.

 

Ex gap table 1

Chart 1: Percentage point gaps between FSM and all other pupils’ attainment at KS2 L4 and above and KS2 L5 and above, 2013 
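The pattern in Chart 1 can be summarised in a few lines, using the gap figures quoted in the bullets above (a sketch only, with the gaps in percentage points as stated):

```python
# FSM attainment gaps in percentage points, from the 2013 KS2 bullets above.
gaps_l5 = {"combined": 16, "reading": 21, "writing": 19, "GPS": 20, "maths": 20}
gaps_l4 = {"combined": 20, "reading": 13, "writing": 16, "GPS": 18, "maths": 13}

# Measures where the gap is wider at the higher attainment level,
# with the difference between the two gaps.
wider_at_l5 = {m: gaps_l5[m] - gaps_l4[m] for m in gaps_l5 if gaps_l5[m] > gaps_l4[m]}
print(wider_at_l5)   # → {'reading': 8, 'writing': 3, 'GPS': 2, 'maths': 7}
```

Only the combined reading, writing and maths measure bucks the trend of wider gaps at Level 5.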

 

It is difficult to trace reliably the progress made in reducing these gaps in English, since the measures have changed frequently. There has been more stability in maths, however, and the data reveals that – whereas the FSM gap at Level 4 and above has reduced by 5 percentage points since 2008 (from 18 points to 13 points) – the FSM gap at Level 5 and above has remained between 19 and 20 points throughout. Hence the difference between the L4+ and L5+ gaps in maths has widened over the last five years.

There is relatively little published about KS2 excellence gaps elsewhere, though one older Government publication, a DfES Statistical Bulletin: The characteristics of high attainers (2007) offers a small insight.

It defines KS2 high attainers as the top 10%, on the basis of finely grained average points scores across English, maths and science, so a more selective but wider-ranging definition than any of the descriptors of Level 5 performance above.

According to this measure, some 2.7% of FSM-eligible pupils were high attainers in 2006, compared with 11.6% of non-FSM pupils, giving a gap of 8.9 percentage points.

The Bulletin supplies further analysis of this population of high attainers, summarised in the table reproduced below.

 

EX Gap Capture 1 

  

Secondary (KS4) 

While Government statistical releases provide at least limited data about FSM performance at high levels in end of KS2 assessments, this is entirely absent from KS4 data, because there is no information about the achievement of GCSE grades above C, whether for single subjects or combinations.

The most recent publication: SFR05/2014: GCSE and equivalent attainment by pupil characteristics, offers a multitude of measures based on Grades G and above or C and above, many of which are set out in Chart 2, which illustrates the FSM gap on each, organised in order from the smallest gap to the biggest.

(The gap cited here for A*-C grades in English and maths GCSEs is very slightly different to the figure in the impact indicator.)

 

Ex gap table 2

Chart 2: Percentage point gaps between FSM and all other pupils’ attainment on different KS4 measures, 2013

 

In its State of the Nation Report 2013, the Social Mobility and Child Poverty Commission included a table comparing regional performance on a significantly more demanding ‘8+ GCSEs excluding equivalents and including English and maths’ measure. This uses ‘ever 6’ rather than FSM as the indicator of disadvantage.

The relevant table is reproduced below. It shows regional gaps of between 20 and 26 percentage points on the tougher measure, so a similar order of magnitude to the national indicators at the top end of Chart 2.

 

ExGap 2 Capture

 

Comparing the two measures, one can see that:

  • The percentages of ‘ever 6’ learners achieving the more demanding measure are very much lower than the comparable percentages achieving the 5+ GCSEs measure, but the same is also true of their more advantaged peers.
  • Consequently, in every region but London and the West Midlands, the attainment gap is actually larger for the less demanding measure.
  • In London, the gaps are much closer, at 19.1 percentage points on the 5+ measure and 20.9 percentage points on the 8+ measure. In the West Midlands, the gap on the 8+ measure is larger by five percentage points. In all other cases, the difference is at least six percentage points in the other direction.

We do not really understand the reasons why London and the West Midlands are atypical in this respect.

The Characteristics of High Attainers (2007) provides a comparable analysis for KS4 to that already referenced at KS2. In this case, the top 10% of high attainers is derived on the basis of capped GCSE scores.

This gives a gap of 8.8 percentage points between the proportion of non-FSM (11.2%) and FSM (2.4%) students within the defined population, very similar to the parallel calculation at KS2.

Other variables within this population are set out in the table reproduced below.

 

ExGap Capture 3

Finally, miscellaneous data has also appeared from time to time in the answers to Parliamentary Questions. For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8 percentage points. By 2009 the comparable percentages were 1.7% and 9.0% respectively, resulting in an increased gap of 7.3 percentage points (Col 568W)
  • In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)
Subject      FSM    All pupils   Gap
Maths        3.7    15.6         11.9
Eng lit      4.1    20.0         15.9
Eng lang     3.5    16.4         12.9
Physics      2.2    49.0         46.8
Chemistry    2.5    48.4         45.9
Biology      2.5    46.8         44.3
French       3.5    22.9         19.4
German       2.8    23.2         20.4
Table 1: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)

 

Post-16 (KS5)

The most recent post-16 attainment data is provided in SFR10/2014: Level 2 and 3 attainment by young people aged 19 in 2013 and SFR02/14: A level and other level 3 results: academic year 2012 to 2013.

The latter contains a variety of high attainment measures – 3+ A*/A grades;  AAB grades or better; AAB grades or better with at least two in facilitating subjects;  AAB grades or better, all in facilitating subjects – yet none of them distinguish success rates for advantaged and disadvantaged learners.

The former does include a table providing a time series of gaps for achievement of Level 3 at age 19 through 2 A levels or the International Baccalaureate. The measure of disadvantage is FSM-eligibility in Year 11. The gap was 22.0 percentage points in 2013, virtually unchanged from 22.7 percentage points in 2005.

In (How) did New Labour narrow the achievement and participation gap (Whitty and Anders, 2014) the authors reproduce a chart from a DfE roundtable event held in March 2013 (on page 44).

This is designed to show how FSM gaps vary across key stages and also provides ‘odds ratios’ – the relative chances of FSM and other pupils achieving each measure. It relies on 2012 outcomes.

The quality of the reproduction is poor, but it seems to suggest that, using the AAB+ in at least two facilitating subjects measure, there is a five percentage point gap between FSM students and others (3% versus 8%), while the odds ratio shows that non-FSM students are 2.9 times more likely than FSM students to achieve this outcome.
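The 2.9 figure can be sanity-checked from the rounded percentages quoted in the text (3% and 8%). This sketch will only approximate the reported ratio, since the chart’s underlying figures are presumably unrounded:

```python
# AAB+ (at least two facilitating subjects) achievement rates from the chart,
# rounded to 3% for FSM students and 8% for others.
p_other, p_fsm = 0.08, 0.03

def odds(p):
    # Odds of achieving the outcome, given the probability of achieving it.
    return p / (1 - p)

odds_ratio = odds(p_other) / odds(p_fsm)   # ~2.8 with these rounded inputs
risk_ratio = p_other / p_fsm               # ~2.7: the simple 'times more likely'

print(round(odds_ratio, 2), round(risk_ratio, 2))   # → 2.81 2.67
```

Either calculation lands close to, but slightly below, the 2.9 shown on the chart, which is consistent with rounding in the quoted percentages.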

Once again, occasional replies to Parliamentary Questions provide some supplementary information:

  • In 2007, 189 FSM-eligible students (3.7%) in maintained mainstream schools (so excluding sixth form colleges and FE colleges) achieved 3 A grades at A level. This compared with 13,467 other students (9.5%), giving a gap of 5.8 percentage points (Parliamentary Question, 26 November 2008, Hansard (Col 1859W))
  • In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. This compares with 14,431 (10.5%) of those not eligible for FSM, giving a gap of 7.0 percentage points. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are counted. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
  • Of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, 546 (4.1%) achieved 3 or more GCE A levels at A*-A compared with 22,353 other pupils (10.6%) so giving a gap of 6.5 percentage points. These figures include students in both the schools and FE sectors. (Parliamentary Question, 9 July 2012, Hansard (Col 35W)) 

 

Additional evidence of Key Stage excellence gaps from a sample born in 1991

Progress made by high-achieving children from disadvantaged backgrounds (Crawford, Macmillan and Vignoles, 2014) provides useful data on the size of excellence gaps at different key stages, as well as analysis of whether disadvantaged high achievers remain so through their school careers.

The latter appears in Part two, but the first set of findings provides a useful supplement to the broad picture set out above.

This study is based on a sample of learners born in 1991/1992, so they would presumably have taken end of KS2 tests in 2002, GCSEs in 2007 and A levels in 2009. It includes all children who attended a state primary school, including those who subsequently attended an independent secondary school.

It utilises a variety of measures of disadvantage, including whether learners were always FSM-eligible (in Years 7-11), or ‘ever FSM’ during that period. This summary focuses on the distinction between ‘always FSM’ and ‘never FSM’.

It selects a basket of high attainment measures spread across the key stages, including:

  • At KS1, achieving Level 3 or above in reading and maths.
  • At KS2, achieving Level 5 or above in English and maths.
  • At KS4, achieving six or more GCSEs at grades A*-C in EBacc subjects (as well as five or more).
  • At KS5, achieving two or more (and three or more) A levels at grades A-B in any subjects.
  • Also at KS5, achieving two or more (and three or more) A levels at grades A-B in facilitating subjects.

The choice of measures at KS2 and KS5 is reasonable, reflecting the data available at the time. For example, one assumes that A* grades at A level do not feature in the KS5 measures because they were not introduced until 2010.

At KS4, the selection is rather more puzzling and idiosyncratic. It would have been preferable to have included at least one measure based on performance across a range of GCSEs at grades A*-B or A*/A.

The authors justify their decision on the basis that ‘there is no consensus on what is considered high attainment’, even though most commentators would expect this to reflect higher grade performance, while few are likely to define it solely in terms of breadth of study across a prescribed set of ‘mainstream’ subjects.

Outcomes for ‘always FSM’ and ‘never FSM’ on the eight measures listed above are presented in Chart 3.

Ex gap Table 3

Chart 3: Achievement of ‘always FSM’ and ‘never FSM’ on a basket of high attainment measures for pupils born in 1991/92

 

This reveals gaps of 12 to 13 percentage points at Key Stages 1 and 2, somewhat smaller than several of those described above.

It is particularly notable that the 2013 gap for KS2 L5 reading, writing and maths is 16 percentage points, whereas the almost comparable 2002 (?) gap for KS2 English and maths amongst this sample is 13.5%. Even allowing for comparability issues, there may be tentative evidence here to suggest widening excellence gaps at KS2 over the last decade.

The KS4 gaps are significantly larger than those existing at KS1/2, at 27 and 18 percentage points respectively. But comparison with the previous evidence reinforces the point that the size of the gaps in this sample is attributable to subject mix: this must be the case since the grade expectation is no higher than C.

The data for A*/A performance on five or more GCSEs set out above, which does not insist on coverage of EBacc subjects other than English and maths, suggests a gap of around seven percentage points. But it also demonstrates big gaps – again at A*/A – for achievement in single subjects, especially the separate sciences.

The KS5 gaps on this sample range from 2.5 to 13 percentage points. We cited data above suggesting a five percentage point gap in 2012 for AAB+, at least two in facilitating subjects. These findings do not seem wildly out of kilter with that, or with the evidence of gaps of around six to seven percentage points for AAA grades or higher.

 

Overall pattern 

The published data provides a beguiling glimpse of the size of excellence gaps and how they compare with FSM gaps on the key national benchmarks.

But discerning the pattern is like trying to understand the picture on a jigsaw when the majority of pieces are missing.

The received wisdom is captured in the observation by Whitty and Anders that:

‘Even though the attainment gap in schools has narrowed overall, it is largest for the elite measures’

and the SMCPC’s comment that:

‘…the system is better at lifting children eligible for FSM above a basic competence level (getting 5A*–C) than getting them above a tougher level of attainment likely to secure access to top universities.’

This seems broadly true, but the detailed picture is rather more complicated.

  • At KS2 there are gaps at L5 and above of around 16-20 percentage points, the majority higher than the comparable gaps at L4. But the gaps for core subjects combined are smaller than for each assessment. There is tentative evidence that the former may be widening.
  • At KS4 there are very significant differences between results in individual subjects. When it comes to multi-subject indicators, differences in the choice of subject mix – as well as choice of grade – make it extremely difficult to draw even the most tentative conclusions about the size of excellence gaps and how they relate to benchmark-related gaps at KS4 and excellence gaps at KS2.
  • At KS5, the limited evidence suggests that A level excellence gaps at the highest grades are broadly similar to those at GCSE A*/A. If anything, gaps seem to narrow slightly compared with KS4. But the confusion over KS4 measures makes this impossible to verify.

We desperately need access to a more complete dataset so we can understand these relationships more clearly.

This is the end of Part one. In Part two, we move on to consider evidence about whether high attainers remain so, before examining international comparisons data and related research, followed by excellence gaps analysis from the USA.

Part two concludes with a short review of how present government policy impacts on excellence gaps and some recommendations for strengthening the present arrangements.

 

GP

September 2014

What Happened to the Level 6 Reading Results?

 

Provisional 2014 key stage 2 results were published on 28 August.

500px-Japanese_Urban_Expwy_Sign_Number_6.svg

This brief supplementary post considers the Level 6 test results – in reading, in maths and in grammar, punctuation and spelling (GPS) – and how they compare with Level 6 outcomes in 2012 and 2013.

An earlier post, A Closer Look at Level 6, published in May 2014, provides a fuller analysis of these earlier results.

Those not familiar with the 2014 L6 test materials can consult the papers, mark schemes and level thresholds at these links:

 

Number of Entries

Entry levels for the 2014 Level 6 tests were published in the media in May 2014. Chart 1 below shows the number of entries for each test since 2012 (2013 in the case of GPS). These figures are for all schools, independent as well as state-funded.

 

L6 Sept chart 1

Chart 1: Entry rates for Level 6 tests 2012 to 2014 – all schools

 

In 2014, reading entries were up 36%, GPS entries up 52% and maths entries up 36%. There is as yet no indication of a backlash from the decision to withdraw Level 6 tests after 2015, though this may have an impact next year.

The postscript to A Closer Look estimated that, if entries continue to increase at current rates, we might expect something approaching 120,000 in reading, 130,000 in GPS and 140,000 in maths.

Chart 2 shows the percentage of all eligible learners entered for Level 6 tests, again for all schools. Nationally, between one in six and one in five eligible learners are now entered for Level 6 tests. Entry rates for reading and maths have almost doubled since 2012.

 

L6 Sept chart 2

Chart 2: Percentage of eligible learners entered for Level 6 tests 2012 to 2014, all schools

 

Success Rates

The headline percentages in the SFR show:

  • 0% achieving L6 reading (unchanged from 2013)
  • 4% achieving L6 GPS (up from 2% in 2013) and
  • 9% achieving L6 maths (up from 7% in 2013).

Local authority and regional percentages are also supplied.

  • Only in Richmond did the L6 pass rate in reading register above 0% (at 1%). Hence all regions are at 0%.
  • For GPS the highest percentages are 14% in Richmond, 10% in Kensington and Chelsea and Kingston, 9% in Sutton and 8% in Barnet, Harrow and Trafford. Regional rates vary between 2% in Yorkshire and Humberside and 6% in Outer London.
  • In maths, Richmond recorded 22%, Kingston 19%, Trafford, Harrow and Sutton were at 18% and Kensington and Chelsea at 17%. Regional rates range from 7% in Yorkshire and Humberside and the East Midlands to 13% in Outer London.

Further insight into the national figures can be obtained by analysing the raw numbers supplied in the SFR.

Chart 3 shows how many of those entered for each test were successful in each year. Here there is something of a surprise.

 

L6 Sept chart 3

Chart 3: Percentage of learners entered achieving Level 6, 2012 to 2014, all schools

 

Nearly half of all entrants are now successful in L6 maths, though the improvement in the success rate has slowed markedly compared with the nine percentage point jump in 2013.

In GPS, the success rate has improved by nine percentage points between 2013 and 2014 and almost one in four entrants is now successful. Hence the GPS success rate is roughly half that for maths. This may be attributable in part to its shorter history, although the 2014 success rate is significantly below the rate for maths in 2013.

But in reading an already very low success rate has declined markedly, following a solid improvement in 2013 from a very low base in 2012. The 2014 success rate is now less than half what it was in 2012. Fewer than one in a hundred of those entered have passed this test.

Chart 4 shows how many learners were successful in the L6 reading test in 2014 compared with previous years, giving results for boys and girls separately.

 

L6 Sept chart 4

Chart 4: Percentage of learners entered achieving Level 6 in reading, 2012 to 2014, by gender

 

The total number of successful learners in 2014 is over 5% lower than in 2012, when the reading test was introduced, and down 62% on the number achieved in 2013.

Girls appear to have suffered disproportionately from the decline in 2014 success rates. While the success rate for girls is down 63%, the decline for boys is slightly less, at 61%. The success rate for boys remains above where it was in 2012 but, for girls, it is about 12% down on where it was in 2012.

In 2012, only 22% of successful candidates were boys. This rose to 26% in 2013 and has again increased slightly, to 28% in 2014. The gap between girls’ and boys’ performance remains substantially bigger than those for GPS and maths.

Charts 5 and 6 give the comparable figures for GPS and maths respectively.

In GPS, the total number of successful entries has increased by almost 140% compared with 2013. Girls form a slightly lower proportion of this group than in 2013, their share falling from 62% to 60%. Boys are therefore beginning to close what remains a substantial performance gap.

 

L6 Sept chart 5

Chart 5: Percentage of learners entered achieving Level 6 in GPS, 2012 to 2014, by gender

 

In maths, the total number of successful entries is up by about 40% on 2013 and demonstrates rapid improvement over the three year period.

Compared with 2013, the success rate for girls has increased by 43%, whereas the corresponding increase for boys is closer to 41%. Boys formed 65% of the successful cohort in 2012, 61% in 2013 and 60% in 2014, so girls’ progress in narrowing this substantial performance gap is slowing.

 

L6 Sept chart 6

Chart 6: Percentage of learners entered achieving Level 6 in maths, 2012 to 2014, by gender
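Pulling together the 2014 gender shares quoted above (72% girls among successful reading candidates, 60% girls in GPS, 60% boys in maths), a quick sketch confirms that the reading share gap is more than double those in GPS and maths. Illustrative only, since the shares are rounded:

```python
# Girls' share of successful 2014 L6 candidates, from the rounded figures above.
girls_share = {"reading": 0.72, "GPS": 0.60, "maths": 0.40}

# Gap between girls' and boys' shares of the successful cohort,
# in percentage points (|girls - boys| = |2 * girls - 1|).
share_gap = {test: round(abs(2 * share - 1) * 100)
             for test, share in girls_share.items()}
print(share_gap)   # → {'reading': 44, 'GPS': 20, 'maths': 20}
```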

 

Progress

The SFR also provides a table, this time for state-funded schools only, showing the KS1 outcomes of those successful in achieving Level 6. (For maths and reading, this data includes those with a non-numerical grade in the test who have been awarded L6 via teacher assessment. The data for writing is derived solely from teacher assessment.)

Not surprisingly, over 94% of those achieving Level 6 in reading had achieved Level 3 in KS1, but 4.8% were at L2A and a single learner was recorded at Level 1. The proportion with KS1 Level 3 in 2013 was higher, at almost 96%.

In maths, however, only some 78% of those achieving Level 6 were at Level 3 in KS1. A further 18% were at 2A and almost 3% were at 2B. A further 165 were recorded as 2C or 1. In 2013, over 82% had KS1 L3 while almost 15% had 2A.

It seems, therefore, that KS1 performance was a slightly weaker indicator of KS2 level 6 success in 2014 than in the previous year, but this trend was apparent in both reading and maths – and KS1 performance remains a significantly weaker indicator in maths than it is in reading.

 

Why did the L6 reading results decline so drastically?

Given that the number of entries for the Level 6 reading test increased dramatically, the declining pass rate suggests either a problematic test or that schools entered a higher proportion of learners who had relatively little chance of success. A third possibility is that the test was deliberately made more difficult.

The level threshold for the 2014 Level 6 reading test was 24 marks, compared with 22 marks in 2013, but there are supposed to be sophisticated procedures in place to ensure that standards are maintained. We should be able to discount the third cause.

The second cause is also unlikely to be significant, since schools are strongly advised only to enter learners who are already demonstrating attainment beyond KS2 Level 5. There is no benefit to learners or schools from entering pupils for tests that they are almost certain to fail.

The existing pass rate was very low, but it was on an upward trajectory. Increasing familiarity with the test ought to have improved schools’ capacity to enter the right learners and to prepare them to pass it.

That leaves only the first possibility – something must have been wrong with the test.

Press coverage from May 2014, immediately after the test was administered, explained that it contained different rules for learners and invigilators about the length of time available for answering questions.

The paper gave learners one hour for completion, while invigilators were told pupils had 10 minutes’ reading time followed by 50 minutes in which to answer the questions. Schools interpreted this contradiction differently and several reported disruption to the examination as a consequence.

The NAHT was reported to have written to the Standards and Testing Agency:

‘…asking for a swift review into this error and to seek assurance that no child will be disadvantaged after having possibly been given incorrect advice on how to manage their time and answers’.

The STA statement says:

‘We apologise for this error. All children had the same amount of time to complete the test and were able to consult the reading booklet at any time. We expect it will have taken pupils around 10 minutes to read the booklet, so this discrepancy should not have led to any significant advantage for those pupils where reading time was not correctly allotted.’

NAHT has now posted the reply it received from STA on 16 May. It says:

‘Ofqual, our regulator, is aware of the error and of the information set out below and will, of course, have to independently assure itself that the test remains valid. We would not expect this to occur until marking and level setting processes are complete, in line with their normal timescales.’

It then sets out the reasons why it believes the test remains valid. These suggest the advantage to the learners following the incorrect instructions was minimal since:

  • few would need less than 10 minutes’ reading time;
  • pre-testing showed 90% of learners completed the test within 50 minutes;
  • in 2013 only 3.5% of learners were within 1 or 2 marks of the threshold;
  • a comparative study that changed the timing of the Levels 3-5 test found it made little difference to item difficulty.

NAHT says it will now review the test results in the light of this response.

 

 

Who is responsible?

According to its most recent business plan, STA:

‘is responsible for setting and maintaining test standards’ (p3)

but it publishes little or nothing about the process involved, or how it handles representations such as that from NAHT.

Meanwhile, Ofqual says its role is:

‘to make sure the assessments are valid and fit for purpose, that the assessments are fair and manageable, that the standards are properly set and maintained and the results are used appropriately.

We have two specific objectives as set out by law:

  • to promote assessment arrangements which are valid, reliable and comparable
  • to promote public confidence in the arrangements.

We keep national assessments under review at all times. If we think at any point there might be a significant problem with the system, then we notify the Secretary of State for Education.’

Ofqual’s Chair has confirmed via Twitter that Ofqual was:

‘made aware at the time, considered the issues and observed level setting’.

Ofqual was content that the level-setting was properly undertaken.

 

 

I asked whether, in the light of that, Ofqual saw a role for itself in investigating the atypical results. I envisaged that this might take place under the Regulatory Framework for National Curriculum Assessments (2011).

This commits Ofqual to publishing annually its ‘programme for reviewing National Assessment arrangements’ (p14) as well as ‘an annual report on the outcomes of the review programme’ (p18).

However, the most recent of these relates to 2011/12 and appeared in November of that year.

 

I infer from this that we may see some reaction from Ofqual, if and when it finally produces an annual report on National Curriculum Assessments in 2014, but that’s not going to appear before 2015 at the earliest.

I can’t help but feel that this is not quite satisfactory – that atypical test performance of this magnitude ought to trigger an automatic and transparent review, even if the overall number of learners affected is comparatively small.

If I were part of the system I would want to understand promptly exactly what happened, for fear that it might happen again.

If you are in any doubt quite how out of kilter the reading test outcomes were, consider the parallel results for Level 6 teacher assessment.

In 2013, 5,698 learners were assessed at Level 6 in reading through teacher assessment – almost exactly two-and-a-half times as many as achieved Level 6 in the test.

In 2014, a whopping 17,582 learners were assessed at Level 6 through teacher assessment, around 20 times as many as secured a Level 6 in the reading test.

If the ratio between test and teacher assessment results in 2014 had been the same as it was in 2013, the number successful on the test would have been over 7,000 – more than eight times the reported 851.
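The back-of-envelope arithmetic is easy to check. A quick sketch (note that the 2013 test figure of roughly 2,280 is inferred here from the stated two-and-a-half-times ratio, not quoted directly from the published data):

```python
# Figures from the post; the 2013 test total (~2,280) is inferred from
# the stated 2.5:1 teacher assessment to test ratio.
ta_2013, test_2013 = 5_698, 2_280
ta_2014, test_2014 = 17_582, 851

ratio_2013 = ta_2013 / test_2013             # ~2.5 TA passes per test pass
expected_test_2014 = ta_2014 / ratio_2013    # what 2014 implies on the 2013 ratio

print(round(expected_test_2014))                  # just over 7,000
print(round(expected_test_2014 / test_2014, 1))   # more than 8x the reported 851
```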

I rest my case.

 

The new regime

In February 2013, a DfE-commissioned report Investigation of Key Stage 2 Level 6 Tests recommended that:

‘There is a need to review whether the L6 test in Reading is the most appropriate test to use to discriminate between the highest ability pupils and others given:

a) that only around 0.3 per cent of the pupils that achieved at least a level 5 went on to achieve a level 6 in Reading compared to 9 per cent for Mathematics

b) there was a particular lack of guidance and school expertise in this area

c) pupil maturity was seen to be an issue

d) the cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits.’

This has been overtaken by the decision to withdraw all three Level 6 tests and to rely on single tests of reading, grammar, punctuation and spelling (GPS) and maths for all learners when the new assessment regime is introduced from 2016.

Draft test frameworks were published in March 2014, supplemented in July by sample questions, mark schemes and commentary.

Given the imminent introduction of this new regime, together with schools’ experience in 2014, it seems increasingly unlikely that 2015 Level 6 test entries in reading will approach the 120,000 figure suggested by the trend.

Perhaps more importantly, schools and assessment experts alike seem remarkably sanguine about the prospect of single tests for pupils demonstrating the full range of prior attainment, apart from those assessed via the P-Scales. (The draft test frameworks are worryingly vague about whether those operating at the equivalent of Levels 1 and 2 will be included.)

I could wish to be equally sanguine, on behalf of all those learners capable of achieving at least the equivalent of Level 6 after 2015. But, as things stand, the evidence to support that position is seemingly non-existent.

In October 2013, Ofqual commented that:

‘There are also some significant technical challenges in designing assessments which can discriminate effectively and consistently across the attainment range so they can be reported at this level of precision.’

A year on, we still have no inkling whether those challenges have been overcome.

 

GP

September 2014

 

Digging Beneath the Destination Measures

 

This post takes as its starting point the higher education destination data published by the Department for Education (DfE) in June 2014.

It explores:

  • The gaps between progression rates for students from disadvantaged backgrounds (defined in terms of eligibility for free school meals) and those of their more advantaged peers.
  • How these rates vary according to whether the students come from schools or colleges and the selectivity of the higher education to which they progress.
  • Regional differences, with a particular focus on Inner and Outer London.

Although these are officially classified as experimental statistics, they supply a valuable alternative perspective on national progress towards fair access for disadvantaged learners to selective universities.

Securing such progress is integral to the Government’s education and social mobility strategy, since it is embedded in DfE’s Impact Indicators, BIS Performance Indicators and the Social Mobility Indicators; the DfE indicators depend directly on these destination measures.

The final section discusses the optimal policy response to the position revealed by this analysis. It:

  • Discusses the limitations of a free market solution combined with institutional autonomy, structural reform – especially the introduction of specialist post-16 providers – and the expected incorporation of these measures into the post-16 accountability framework.
  • Sets out the advantages of introducing a framework to support the market on both the demand and supply sides. This would secure a coherent and consistent menu of opportunities that might be targeted directly at the learners most likely to benefit. This might be undertaken at national or at regional level, including in London.
  • Suggests that – given the abundant evidence of stalled progress – the latter approach is most likely to bring about more immediate, significant and sustained improvement without excessive deadweight cost.

I am publishing this on the eve of The Brilliant Club’s Inaugural Conference, which asks the question

‘How can universities and schools help pupils from low participation backgrounds secure places and succeed at highly competitive universities?’

The organisers and participants are cordially invited to admit this second personal contribution to this debate, for I have already written extensively about the particular problem of fair access to Oxbridge for disadvantaged learners.

That post exposed some rather shaky statistical interpretation by the universities concerned and proposed a series of policy steps to address the worryingly low progression rates to these two universities. I will refer to it occasionally below, keeping repetition to a minimum. I commend it to you as a companion piece to this.

 

The Destination Data

DfE published SFR 19/2014: ‘Destinations of key stage 4 and key stage 5 pupils: 2011 to 2012’ on 26 June 2014.

These are described as ‘experimental statistics…as data are still being evaluated and remain subject to further testing in terms of their reliability and ability to meet customer needs’.

Nevertheless, subject to possible further refinement, DfE plans to incorporate KS5 destination measures into the new post-16 accountability arrangements to be introduced from 2016.  They are set to become increasingly significant for school sixth forms and post-16 providers alike.

The measures are based on student activity in the year immediately following the completion of A level or other Level 3 qualifications.

Students are included if:

  • They are aged 16, 17 or 18 and entered for at least one A level or other L3 qualification. (Those entered for AS level only are therefore excluded.)
  • They ‘show sustained participation…in all of the first two terms of the year after…’ ie from October 2011 to March 2012. (Dropouts are excluded but there is provision to pick up students transferring from one provider to another.)

The time lag is caused by the need to match data from the national pupil database (NPD) and the Higher Education Statistics Agency (HESA). The most recent matchable dataset combines the HESA data for academic year 2011/12 with the KS5 performance data for academic year 2010/11.

The 2011/12 destination data includes partial coverage of independent schools for the first time, alongside state-funded schools and colleges, but my analysis is confined to state-funded institutions.

The measure of disadvantage is eligibility for free school meals (FSM). Students are considered disadvantaged if they were eligible for and receiving free school meals at any point in Year 11, immediately prior to KS5. This post typically uses ‘FSM’ or ‘FSM-eligible’ to describe this group.

FSM is a narrower definition of disadvantage than the Pupil Premium, which is based on FSM eligibility at any point in the preceding six years. These two definitions continue to have most currency in the schools sector, but are frequently disregarded in the higher education sector where several alternatives are deployed.

All measures of disadvantage have upsides and downsides and, having explored this issue extensively in my previous post about Oxbridge, I do not propose to cover the same ground here.

I will only repeat the contention that, far too often, those facing criticism for their failure to improve fair access will criticise in turn the measures adopted, so producing a smokescreen to deflect attention from that failure.

The analysis that follows draws principally on tables included in the underlying data published alongside the SFR. The presentation of the data in these tables – used in all the published material – is important to bear in mind.

All totals are rounded to the nearest ten, while any single figure less than 6 is suppressed and replaced with ‘x’.

Hence a total of ‘10’ is an approximation which might represent any figure between 6 and 14.

It follows that a calculation involving several totals may be even more approximate. To take an important example, the sum of five totals, each given as ‘10’, may represent anything between 30 and 70.

This degree of imprecision is less than helpful when smaller cohorts – such as FSM-eligible students progressing to the most competitive universities – are under discussion.
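The compounding effect can be sketched in a few lines, assuming the convention described above (nearest-ten rounding, with true values below 6 suppressed):

```python
def bounds(shown):
    """Possible range of a true count published as `shown`, given
    nearest-ten rounding and suppression of true values below 6."""
    return max(6, shown - 4), shown + 4

print(bounds(10))   # (6, 14): a published '10' spans 6 to 14

# Summing five published '10's compounds the imprecision:
lows, highs = zip(*(bounds(10) for _ in range(5)))
print(sum(lows), sum(highs))   # 30 70
```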

A more detailed and sophisticated explanation of the methodology supporting the measures can be found in the Technical Note published alongside the SFR.

 

Nature of the Total Population

Table 1, below, shows how the national population is distributed between state-funded schools and colleges – and between FSM and non-FSM students from each of those settings.

 

Table 1: Distribution of national KS5 population and numbers progressing to a sustained education destination 2011/12

                                  State-funded schools         State-funded colleges        Total
                                  FSM     Non-FSM  Total       FSM     Non-FSM  Total       FSM     Non-FSM  Total
No of students                    11,100  153,480  164,580     17,680  153,230  170,910     28,770  306,720  335,490
Sustained education destination   8,020   114,470  122,490     10,430  90,830   101,260     18,450  205,300  223,760

 

Key points include:

  • Of the total KS5 student population of 335,490, only some 8.6% are FSM-eligible. Hence the analysis below is derived from a sample of 28,770 students.
  • Some 49% of this population attend mainstream state-funded schools compared with 51% at state-funded colleges. Total numbers are therefore distributed fairly evenly between the two sectors.
  • The FSM-eligible population attending schools is 6.7% of the total population attending schools and over 38% of the total FSM population. The former percentage is significantly lower than the proportion of FSM-eligible students aged 11-15 in the national secondary school population, which stood at around 16% in 2012.
  • The FSM-eligible population attending colleges is 10.3% of the total population attending colleges and over 61% of the total FSM population.

Hence the overall population is spread fairly evenly between schools and colleges, but a significant majority of the FSM-eligible population is located in the latter.

Furthermore:

  • The proportion of KS5 students progressing to a sustained education destination (as opposed to not progressing to any destination, or progressing to employment or training) is almost 67%, but amongst FSM-eligible learners this falls slightly, to 64%.
  • Amongst those attending schools, the proportion of FSM-eligible students progressing to a sustained education destination is approximately 72%; amongst those attending colleges it is much lower – some 59%.

The analysis below uses the total population as a base, rather than the proportion that progresses to a sustained educational destination.
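These rates can be recomputed directly from the Table 1 totals (the component figures are themselves rounded to the nearest ten, so results differ fractionally from the published totals):

```python
# Table 1 figures (rounded to the nearest ten in the source).
students  = {"schools_fsm": 11_100, "schools_non": 153_480,
             "colleges_fsm": 17_680, "colleges_non": 153_230}
sustained = {"schools_fsm": 8_020,  "schools_non": 114_470,
             "colleges_fsm": 10_430, "colleges_non": 90_830}

overall = sum(sustained.values()) / sum(students.values())           # ~67%
fsm = ((sustained["schools_fsm"] + sustained["colleges_fsm"])
       / (students["schools_fsm"] + students["colleges_fsm"]))       # ~64%
fsm_schools  = sustained["schools_fsm"]  / students["schools_fsm"]   # ~72%
fsm_colleges = sustained["colleges_fsm"] / students["colleges_fsm"]  # ~59%
```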

The incidence of FSM-eligible students also varies considerably by region. Chart 1 below shows the percentage of FSM and other students in each region’s overall KS5 cohort.

 

Chart 1: Percentages of FSM and non-FSM in KS5 cohort by region 2010/11

Destinations chart 1

 

The percentage of FSM-eligible students ranges from as low as 4.3% in the South East up to 30.3% in Inner London – a vast differential.

Inner London has comfortably more than twice the incidence of FSM students in Outer London, the next highest, and some seven times the rate in the South East.

The sizes of these cohorts are also extremely variable. There are over 4,000 students in the FSM populations for each of Inner and Outer London, compared with as few as 1,400 in the South West region. Taken together, Inner and Outer London account for slightly over 30% of the total English FSM-eligible population.

However, the total KS5 population is far bigger in the South East (58,260) than in any other region, while Inner London (14,030) is the smallest population. The South East alone accounts for over 17% of the total KS5 cohort.

These variations – particularly the high incidence of FSM students within a relatively small overall KS5 population in Inner London – are bound to have a profound effect on progression to higher education.

The concentration in Inner London is such that it will almost certainly be a relatively easy task to prioritise FSM students’ needs and also achieve economies of scale through provision across multiple schools.

There will be heavy concentrations of FSM-eligible students in many secondary schools, as well as in post-16 provision in both schools and colleges. Significantly fewer institutions – secondary or post-16 – will have negligible FSM-eligible populations.

There will be a similar effect in Outer London, though patchier and not so profound.

 

Progression to a UK Higher Education Institution

Table 2: National breakdown of numbers progressing to a UK Higher education institution, 2011/12

                      State-funded schools         State-funded colleges        Total
                      FSM     Non-FSM  Total       FSM     Non-FSM  Total       FSM     Non-FSM  Total
No of students        11,100  153,480  164,580     17,680  153,230  170,910     28,770  306,720  335,490
UK HEI destination    6,250   95,880   102,130     7,290   67,130   74,420      13,540  163,010  176,550

 

Table 2, above, shows that:

  • The overall proportion progressing to a UK higher education institution is almost 53%, but this falls to 47% for FSM-eligible students.
  • The proportion of FSM students attending schools that progresses to a UK HEI is 56% whereas the comparable proportion for those attending FE colleges is 41% – a significant difference of 15 percentage points.
  • The number of FSM students progressing from colleges (7,290) remains larger than that progressing from schools (6,250).
  • There is a six percentage point variation between the progression rates for FSM and non-FSM students attending schools (56% versus 62%). In colleges the variation is only three percentage points (41% versus 44%).

Chart 2, below, shows the percentage of the KS5 FSM cohort in each region progressing to a UK higher education institution, compared with the percentage of the KS5 non-FSM cohort doing so.

The overall progression rate for FSM-eligible students is very nearly twice as high in each of Inner and Outer London as it is in the South West, the lowest performing region.

Incredibly, in Inner London, the progression rate for FSM-eligible students slightly exceeds the rate for non-FSM students – and these two rates are also very close in Outer London.

 

Chart 2: Percentages of FSM and non-FSM progressing to UK HE by region 2011/12

Destinations chart 2

 

There is relatively little disparity between the regional progression rates for non-FSM students – only 16 percentage points variation between the highest and lowest performing regions (63% in Outer London versus 47% in the South West), compared with a 30 percentage point variation for FSM students (63% in Inner London versus 33% in South West England).

Outside London, the regions with the smallest variation between progression rates for FSM and non-FSM respectively are the West Midlands (nine percentage points) and Yorkshire and Humberside (eleven percentage points). The largest variation is in the North East (seventeen percentage points).

It is worth labouring the point by noting that FSM-eligible students located in London are almost twice as likely to progress to some form of UK higher education as those in the South West and the South East, and more likely to progress than non-FSM students in every other region, with the sole exception of Outer London.

London is clearly an outstanding success in these terms, so bearing out all the recent publicity given to London’s relative success in securing high levels of attainment while simultaneously closing FSM gaps.

Some regions clearly need to work much harder than others to close this widening participation gap.

 

Progression to Selective UK Higher Education

But does this marked disparity between London and other English regions extend to progression to selective universities?

The destinations data incorporates several different measures of selectivity, each a subset of its predecessor:

  • Top third: the top 33% of HEIs, as measured by their mean UCAS tariff score, based on the best three A level grades of students admitted (other qualifications are excluded). The subset of institutions within this group changes annually, although 88% of those represented in 2011/12 had been included for six consecutive years, from 2006/07 onwards. (The technical note includes a full list at Annex 1.)
  • Russell Group: institutions belonging to the self-selecting Russell Group, all of which are represented within the top third.
  • Oxbridge: comprising Oxford and Cambridge, two particularly prominent members of the Russell Group which, rightly or wrongly, are perceived to be the pinnacle of selectivity in UK higher education (an assumption discussed in my Oxbridge post).

The last two of these feature in DfE’s Impact Indicators, alongside the percentage of FSM-eligible learners progressing to any university. The first is utilised in the Social Mobility Indicators (number 13), but to compare progression from state and independent institutions respectively.

The sections that follow look at each of these in order of selectivity, beginning with a national level comparison between progression rates for schools and colleges and proceeding to examine regional disparities for schools and colleges together.

 

Progression to the Top Third

Table 3 compares numbers of FSM-eligible and non-FSM learners progressing to top third institutions from state-funded schools and colleges respectively.

 

Table 3: National numbers progressing to UK HEIs and ‘Top Third’ HEIs in 2011/12

                        State-funded schools         State-funded colleges        Total
                        FSM     Non-FSM  Total       FSM     Non-FSM  Total       FSM     Non-FSM  Total
No of students          11,100  153,480  164,580     17,680  153,230  170,910     28,770  306,720  335,490
UK HEI destination      6,250   95,880   102,130     7,290   67,130   74,420      13,540  163,010  176,550
Top third destination   1,300   35,410   36,710      920     15,000   15,920      2,210   50,410   52,620

 

The numbers reveal that:

  • The overall progression rate for KS5 students to top third institutions is 15.7%, but this masks a difference of almost nine percentage points between non-FSM students (16.4%) and their FSM peers (7.7%). Hence non-FSM students are more than twice as likely to gain a place at a top third institution.
  • School-based students are much more likely to reach top third institutions than those at colleges (22.3% versus 9.3%). The same is true amongst the FSM population – the FSM-eligible progression rate from schools is 11.7%, compared with just 5.2% from colleges. This is a substantially larger differential than applies in respect of all UK higher education.
  • Whereas the raw number of FSM learners progressing to any UK HE destination is higher in colleges, the reverse is true when it comes to the top third.
  • Overall, almost 30% of KS5 students progressing to a UK HE institution make it to one in the top third. But whereas roughly one in three (31%) of non-FSM students do so, only one in six (16.3%) of FSM students manage this.
  • When it comes to FSM students from schools and colleges respectively, approximately one in five (20.8%) of FSM students from schools who progress to a UK HE institution make it to a top third institution, whereas this is true of around one in eight of those from colleges (12.6%).

In sum, there are very significant gaps at national level between FSM-eligible progression rates to all UK higher education on one hand and top third institutions on the other. There are equally significant gaps in the FSM progression rates to top third institutions from schools and colleges respectively.

Chart 3, below, compares FSM and non-FSM progressions to top third higher education institutions in different regions.

 

Chart 3: Percentages of FSM and non-FSM students in the overall KS5 cohort who progressed to ‘top third’ HEIs in 2011/12

Destinations chart 3

One can see that:

  • The highest rate for non-FSM students is 24% in Outer London. Inner London ranks only fourth on this measure, having dropped behind the Eastern and South Eastern regions. It is only one percentage point above the national average.
  • The highest rate for FSM-eligible students is 12%, again in Outer London, with Inner London just behind at 11%. These are significantly higher than the next highest rates (7%) in the West Midlands and the South East.
  • The non-FSM rates exceed the FSM rates in every region. In the East and South West, the non-FSM rate is three times higher than the FSM rate and, even in Inner London, the gap is six percentage points in favour of non-FSM.

The huge differences between regional success rates for progression to all UK higher education and top third institutions respectively are illustrated by Chart 4.

 

Chart 4: Comparison of regional progression to all UK HE and ‘top third institutions, comparing FSM and non-FSM, 2011/12

Destinations chart 4

It is immediately clear that the top third progression rates are invariably much lower than for progression to all UK higher education institutions, for both FSM-eligible and non-FSM students.

  • The gap at national level between non-FSM students progressing to all institutions and top third institutions is 37 percentage points (53% versus 16%). The comparable gap for FSM students is 39 percentage points (47% versus 8%). So whereas almost half of FSM students progress to any UK higher education institution, fewer than one in ten progress to ‘top third’ institutions.
  • Whereas Inner London recorded 63% of FSM students progressing to all institutions and Outer London wasn’t far behind at 62%, their comparable percentages for FSM progression to ‘top third’ institutions are 11% and 12% respectively. Both these gaps – standing at 50 percentage points or so – are huge, and significantly larger than the national average of 39 percentage points. The smallest gap between these two progression rates for FSM students is 27 percentage points in the South East. So the gap in London is almost twice the size of the gap in the South East. Moreover, the gap between these two rates is larger for non-FSM than FSM students in every region outside London, where the reverse is true.
  • On the other hand, whereas nationally there is a ratio of around 6:1 between FSM progression rates to UK higher education and top third institutions respectively, this falls to around 5:1 in both Inner and Outer London. Conversely it reaches 9:1 in the North East.

Overall, it is clear that London leads the way on both measures of FSM progression. But the huge lead London has established in terms of progression to all UK higher education only serves to emphasise its rather more limited progress against the more demanding benchmark. That said, London is still achieving close to twice the rate of the next best region on the more demanding measure.

 

Russell Group

We might expect a broadly similar pattern in respect of progression rates to Russell Group universities, but it should also be instructive to compare performance on these two selective measures, even though cohorts are now small enough for the impact of rounding to be felt.

 

Table 4: National numbers progressing to all UK HE institutions and Russell Group Universities in 2011/12

                           State-funded schools         State-funded colleges        Total
                           FSM     Non-FSM  Total       FSM     Non-FSM  Total       FSM     Non-FSM  Total
No of students             11,100  153,480  164,580     17,680  153,230  170,910     28,770  306,720  335,490
UK HEI destination         6,250   95,880   102,130     7,290   67,130   74,420      13,540  163,010  176,550
Top third destination      1,300   35,410   36,710      920     15,000   15,920      2,210   50,410   52,620
Russell Group destination  740     24,180   24,920      510     9,790    10,300      1,240   33,970   35,220

 

Table 4 reveals that:

  • The overall national progression rate for KS5 students to Russell Group universities is 10.5%, compared with 15.7% for the top third. There is again a marked difference between the non-FSM rate (11.1%, compared with 16.4% for the top third) and the FSM rate (4.3%, compared with 7.7% for the top third). Whereas one in every nine non-FSM students progress to a Russell Group university, the corresponding odds for FSM are closer to one in 23. The ratio between FSM and non-FSM progression rates is larger at this higher level of selectivity.
  • The progression rate for all school-based students to Russell Group universities is 15.1% (compared with 22.3% for the top third), whereas the progression rate from colleges is much lower, at 6% (compared with 9.3% for the top third).
  • On the schools side, the FSM-eligible progression rate stands at 6.7% (against 11.7% for the top third), while in colleges it is as low as 2.9% (compared with 5.2% for the top third). The non-FSM rates are 15.8% for schools and 6.4% for colleges, so a higher proportion of FSM-eligible students from schools are successful than non-FSM students from colleges.
  • Almost 20% of all students who progress to a UK higher education institution go to a Russell Group university (compared with 30% going to a top third institution) but, for FSM-eligible learners, this falls to 9.2% (compared with 16.3% going to the top third). Whereas the FSM success rate for the top third was slightly more than half the non-FSM success rate, it is slightly less than half the non-FSM rate for Russell Group progression. The comparable percentages for schools and colleges are 11.8% and 7% respectively.
  • Overall, 66.9% of students reaching a ‘top third’ university are attending a Russell Group institution. But this overall ‘top third/RG conversion rate’ for FSM-eligible students is only 56.1%, almost eleven percentage points lower than the rate for all students. (There is only a small difference between schools and colleges in this respect.) Hence the chances of FSM-eligible students attending Russell Group institutions within the ‘top third’ are significantly lower than those of their more advantaged peers.
  • It is also instructive to compare the different size of these cohorts. The overall non-FSM cohort progressing to Russell Group universities is 27 times the size of the FSM cohort doing so. Put another way, the overall FSM cohort is just 3.5% of the total population progressing to Russell Group institutions. (Interestingly, this falls to 3% for those attending schools whereas the comparable percentage for those attending colleges is higher at 5%.) The total number of FSM-eligible students going on to all Russell Group institutions is about half the number of non-FSM students progressing to Oxbridge alone.
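The conversion-rate arithmetic in those final bullets can be reproduced from the national totals in Tables 3 and 4:

```python
# National totals from Tables 3 and 4.
top_third = {"fsm": 2_210, "non_fsm": 50_410, "all": 52_620}
russell   = {"fsm": 1_240, "non_fsm": 33_970, "all": 35_220}

conv_all = russell["all"] / top_third["all"]   # ~66.9% of top third entrants are at RG
conv_fsm = russell["fsm"] / top_third["fsm"]   # ~56.1% for FSM-eligible students
gap_pp = (conv_all - conv_fsm) * 100           # ~11 percentage points
```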

Chart 5, immediately below, provides a region-by-region comparison of FSM-eligible and non-FSM progression rates to Russell Group universities.

 

Chart 5: Percentage of KS5 cohort – FSM and non-FSM – progressing to Russell Group universities by region, 2011/12

Destinations chart 5

 

This shows that:

  • Outer London is leading the way in terms of progression by FSM-eligible and non-FSM students alike. On the non-FSM side it is comfortably ahead of the North West, followed by the rest of the pack. Inner London brings up the rear, a full five percentage points behind the outer boroughs.
  • When it comes to FSM-eligible students there is little to choose between the regions, since they are all clustered between 3% and 6%. But it is much harder to establish real distinctions when percentages are so low. Inner London seems to be in the middle of the pack for FSM progression, suggesting it is performing respectably but not outstandingly on this measure.
  • The numbers – see Table 5 below – indicate that Outer London contributes one in five of the FSM cohort progressing to Russell Group institutions, while Inner and Outer London together account for more than a third. (This is an important fact to bear in mind when contemplating the case for a separate London-wide strategy to improve FSM progression rates.) Numbers contributed by the North East, East Midlands and South West regions are markedly low by comparison.

 

Table 5: Percentage of FSM-eligible students progressing to Russell Group universities from each region 2011/12

                                         NE    NW    YH    EM    WM    EE    IL    OL    SE    SW    Eng
Numbers progressing to RG universities   50    240   110   50    160   60    190   260   80    50    1,240
%age of total                            4%    19%   9%    4%    13%   5%    15%   21%   6%    4%    100%

 

Oxbridge

Table 6 below shows national progression rates to Oxbridge by sector, differentiating FSM-eligible and non-FSM. It reveals that:

  • The overall progression rate for all students to Oxbridge is 0.72%, so roughly one in every 140 KS5 students goes to Oxbridge. If we focus only on those progressing to UK higher education, the odds shorten to around one in every 70. Of those progressing to Russell Group universities, 6.9% are headed to Oxbridge, equivalent to almost one in every 15.
  • But, when it comes to FSM students, these rates are much, much lower. Of those progressing to Russell Group institutions, only one in 25 are destined for Oxbridge. Roughly one in every 270 FSM students progressing to UK higher education will attend these two universities.

 

Table 6: National numbers progressing to all UK HE institutions, top third, Russell Group and Oxbridge 2011/12

                           State-funded schools        State-funded colleges       Total
                           FSM     Non-FSM   Total     FSM     Non-FSM   Total     FSM     Non-FSM   Total
No of students             11,100  153,480   164,580   17,680  153,230   170,910   28,770  306,720   335,490
UK HEI destination         6,250   95,880    102,130   7,290   67,130    74,420    13,540  163,010   176,550
Top third destination      1,300   35,410    36,710    920     15,000    15,920    2,210   50,410    52,620
Russell Group destination  740     24,180    24,920    510     9,790     10,300    1,240   33,970    35,220
Oxbridge destination       40      1,850     1,890     10      520       530       50      2,370     2,420

 

  • If Oxbridge were to accept the same proportion of FSM students that attend Russell Group universities, they would together take in some 85 students rather than the 50 recorded here.
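The figure of 85 follows from applying the FSM share of Russell Group entrants (from Table 6) to the total Oxbridge intake. A quick check of the arithmetic:

```python
# From Table 6: FSM and total Russell Group entrants, and total Oxbridge entrants
fsm_rg, total_rg = 1_240, 35_220
total_oxbridge   = 2_420

fsm_share = fsm_rg / total_rg              # ~3.5% of RG entrants are FSM-eligible
print(round(fsm_share * total_oxbridge))   # 85
```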

But for all we know they may already be doing so, since we are at the very limits of these statistics' usefulness.

The totals in the data above are rounded to the nearest 10, so the number of FSM students progressing to Oxbridge could be as low as 40 (35 from schools + 5 from colleges) or as high as 58 (44 from schools + 14 from colleges).
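These bounds follow mechanically from the rounding convention (assuming conventional round-half-up to the nearest 10). A small sketch; the helper name `bounds` is mine, purely for illustration:

```python
def bounds(rounded, step=10):
    """Possible true range for a non-negative count rounded to the nearest `step`."""
    lo = max(rounded - step // 2, 0)   # e.g. a published 40 could be as low as 35
    hi = rounded + step // 2 - 1       # ...or as high as 44 (45 would round up to 50)
    return lo, hi

schools  = bounds(40)   # (35, 44)
colleges = bounds(10)   # (5, 14)
total = (schools[0] + colleges[0], schools[1] + colleges[1])
print(total)  # (40, 58)
```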

This degree of possible variance rather calls into question the wisdom of using this data to support a national impact indicator.

It also reinforces the case for Oxford and Cambridge to publish accurate annual data on the actual numbers of formerly FSM-eligible students they admit, ensuring that they define that term in exactly the same manner as these destination measures.

A figure at the lower end of this distribution would be broadly consistent with other data and suggest continuing long-term failure to shift this figure upwards.

BIS has provided figures over the years in answer to various Parliamentary Questions. These are derived by matching the NPD, HESA Student Record and Individual Learners Record (ILR). They are rounded to the nearest five, rather than the nearest ten, and together supply annual outcomes from 2005/06 to 2010/11.

 

Table 7: FSM-eligible progression to Oxbridge 2005-2011, sourced from BIS replies to PQs

           2005/06   2006/07   2007/08   2008/09   2009/10   2010/11
Oxford     25        20        20        25        15        15
Cambridge  20        25        20        20        25        25
TOTAL      45        45        40        45        40        40

 

My educated guess is that this number remained at or below 45 in 2011/12 and is unlikely to rise significantly for the foreseeable future.

But we should not be satisfied even if it doubles between 2010/11 and 2015/16, reaching 80-90 over that five year period. The desperately low base should not be used to justify such poverty of ambition.

I note in passing that the approach to rounding in the regional destination data is markedly unhelpful. Remember that all figures in the data are rounded to the nearest 10 and x indicates a number between 1 and 5. Table 8 shows the possible impact on figures for FSM progression to Oxbridge by region.

 

Table 8: Potential variance in numbers of FSM-eligible students progressing to Oxbridge by region 2011/12

Region   Given   Min   Max   Mean
NE       x       1     5     3
NW       10      6     14    10
YH       x       1     5     3
EM       x       1     5     3
WM       10      6     14    10
EE       10      6     14    10
IL       10      6     14    10
OL       10      6     14    10
SE       x       1     5     3
SW       x       1     5     3
Eng      50      35    95    65

 

The obvious point is that the given total of 50 students could stand proxy for any figure between 35 and 95 (though one assumes that the real total must lie between 40 and 58, as indicated by the national figures in Table 6).
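The England-level range is simply the sum of the per-region bounds. A short check, using the region codes from the tables above ('x' stands for a suppressed count between 1 and 5, and a published 10 for anything from 6 to 14):

```python
# Published rounded counts of FSM-eligible students progressing to Oxbridge, by region
regions = {"NE": "x", "NW": 10, "YH": "x", "EM": "x", "WM": 10,
           "EE": 10, "IL": 10, "OL": 10, "SE": "x", "SW": "x"}

def region_bounds(v):
    # 'x' is a suppressed 1-5; a rounded 10 could be anywhere from 6 to 14
    return (1, 5) if v == "x" else (v - 4, v + 4)

lo = sum(region_bounds(v)[0] for v in regions.values())
hi = sum(region_bounds(v)[1] for v in regions.values())
print(lo, hi)  # 35 95
```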

 

Putting it all Together

What are the headlines from the preceding analysis, as far as the progression of FSM-eligible students is concerned?

  • The destinations data generates a national population of almost 29,000 FSM-eligible students who constitute 8.6% of the total cohort. Over 60% of these are located in colleges, the remainder in schools. These national figures mask substantial regional variations: the FSM-eligible population ranges from 4.3% of the total (South East) to 30.3% (Inner London). The size of these regional FSM cohorts is also extremely variable. Inner and Outer London combined account for over 30% of the national FSM-eligible population.
  • At national level, some 64% of FSM-eligible KS5 learners progress to a sustained educational destination (as opposed to no sustained destination or else employment/training) but this rate is 72% amongst those who attended schools compared with 59% amongst those who attended colleges.
  • Over half (53%) of all KS5 students progress to a UK higher education institution, but the progression rate for FSM-eligible students is six percentage points lower at 47%.
  • About one in six of all KS5 students progress to a ‘top third’ institution, but only about one in 13 FSM-eligible students do so. About one in ten of all KS5 students attend Russell Group institutions, but this falls to one in 23 for FSM-eligible students.
  • There are significant differences between progression rates from schools and colleges respectively. From schools, the FSM-eligible progression rate to all UK higher education is 56%, to top third institutions it is 11.7% and to Russell Group Institutions it is 6.7%. The comparable percentages for colleges are consistently lower at 41%, 5.2% and 2.9% respectively. Whereas the number progressing to UK higher education is higher in colleges, the majority of those progressing to top third institutions are from schools. Almost 60% of those progressing to Russell Group universities are located in schools.
  • In regional terms, the FSM progression rate to all UK higher education ranges from 33.6% in the South West to 63.1% in Inner London, a huge 30 percentage point variation. Outer London is only one point behind at 61.9%. Exceptionally, the FSM progression rate in Inner London exceeds the non-FSM progression rate. Elsewhere, the non-FSM rate exceeds the FSM rate by between nine and 17 percentage points.
  • FSM progression rates to top third institutions are much lower, ranging from 4.4% (North East) to 12.4% (Outer London), which outscores Inner London at 10.6%. Both are well ahead of the national average at 7.7%. The non-FSM progression rates significantly exceed the FSM-eligible rates in every region. The gap is smallest in Inner London at 6.6 percentage points.
  • The gaps in London between FSM-eligible progression rates to all UK HE and the top third institutions reach 50 percentage points, significantly higher than the 39 percentage point national average. The smallest gap is 27 percentage points in the South East. Although London is leading the way on both these measures, its conspicuous success on the less demanding measure throws into sharper relief the limited progress made against the other.
  • A similar pattern is revealed when it comes to Russell Group universities, though the differences are more severe. The FSM progression rate ranges from 2.9% in Eastern England to 5.9% in Outer London, with Inner London only very slightly above the national average at 4.5%. Inner London also falls behind the North West on this measure. There are again significant differences between the rates for FSM and non-FSM. This gap is smallest in Inner London at 4.8 percentage points.
  • Chart 6, below, compares FSM and non-FSM progression rates by region to all UK higher education, the top third and Russell Group institutions respectively. The data is shown rounded to a single decimal place. This shows that the gaps between Russell Group and top third progression rates for FSM students are far bigger in London than anywhere else – 6.1 percentage points in Inner London and 6.5 percentage points in Outer London compared with a national average of 3.4 percentage points. FSM progression to Russell Group universities seems to be the point at which the celebrated London effect has stalled.
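The sector-level rates quoted in these bullets can be reproduced from Table 6. A brief check; the dictionary keys are my own shorthand:

```python
# FSM-eligible cohorts and destinations by sector, from Table 6
schools  = {"cohort": 11_100, "he": 6_250, "top_third": 1_300, "rg": 740}
colleges = {"cohort": 17_680, "he": 7_290, "top_third": 920,   "rg": 510}

for name, s in (("schools", schools), ("colleges", colleges)):
    # progression rates (%) to all UK HE, top third and Russell Group
    rates = [round(100 * s[k] / s["cohort"], 1) for k in ("he", "top_third", "rg")]
    print(name, rates)
# schools [56.3, 11.7, 6.7]
# colleges [41.2, 5.2, 2.9]
```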

 

Chart 6: FSM and non-FSM progression by region to all UK HE, ‘top third’ and Russell Group institutions 2011/12

Destinations chart 6

 

  • As far as FSM progression to Oxbridge is concerned, the data is too limited and approximate to tell us anything substantial, other than to confirm that national FSM progression rates are scandalously low. There might have been a slight improvement – we can’t tell for certain – but from a horrifically low base. Five regions sent a maximum of 5 FSM-eligible learners to Oxbridge in 2012 while the other five each managed between 6 and 14.

 

What limits FSM progression to selective higher education?

Selective universities frequently argue that the main obstacle preventing the admission of more disadvantaged students is that far too few of them achieve the highest attainment levels necessary to secure admission.

Much is made in particular of the comparatively low number of FSM-eligible students achieving AAA+ grades at A level – though a PQ reply confirmed (Col 35W) that 546 students achieved this in 2011 and, as we have seen, the data above shows 1,240 FSM students progressing to Russell Group universities in 2012, so well over 50% had lower grades than this. Some courses require slightly lower grades and contextualised admissions practice is almost certainly more widespread than many are prepared to admit.

Unfortunately though, there is very little published data defining excellence gaps – the difference in performance at high attainment levels between advantaged and disadvantaged students – so it is much more difficult than it should be to find hard evidence of this relationship and how it varies by region.

There seems to be broad consensus in the research literature that, although attainment is not the only contributory factor, it is the most significant cause of under-representation, not least because the effect is much more limited when controls for high attainment are introduced.

But it is also recognised that a variety of other factors are in play, including:

  • Personal, peer and community aspirations
  • Motivation and resilience
  • Acquisition of social and cultural capital
  • Subject choice (often discussed in terms of ‘facilitating subjects’)
  • Access to and quality of information, advice and guidance
  • Aversion to student debt
  • Whether educators demonstrate consistently high expectations and are favourably disposed towards the most selective universities.

Of course it is overly simplistic to regard such factors as distinct from high attainment, since several of them contribute indirectly towards it.

It is also important to bear in mind that the most demanding and highest tariff courses in particular disciplines are not necessarily located at the most prestigious universities, so – even allowing for screening effects – schools and colleges may be acting in many students’ best interests by pointing them in other directions.

And it is open to question whether disadvantaged students should be persuaded to attend higher education institutions that do not suit them personally, even if the future flow of economic benefits suggests this is the most rational decision. There is a trade-off between present happiness and future income and potential students – as adults – should arguably be able to exercise some freedom of choice. There is also the risk of drop-out to consider.

These factors will impact on different students with different intensities in different combinations and in very different ways: there can be no ‘one size fits all’ solution.

All this aside, it seems that – for the disadvantaged student cohort as a whole – the cumulative impact of such factors is much less significant than the impact of attainment.

So it would be a reasonable hypothesis that regions whose FSM (and non-FSM) students are under-represented at Russell Group universities demonstrate relatively lower levels of high attainment at GCSE and A level.

Could this help to explain why Inner London, so successful in terms of progression to UK higher education institutions, is far less so where Russell Group universities are concerned? The remainder of this section struggles to test this hypothesis with the very limited data available.

Taking A level first, Chart 7, below, compares top grade A level performance in 2013, the most recent year for which this data is available, while Chart 8 compares achievement of AAB A level grades or higher in 2011 and 2012 with FSM-eligible and non-FSM progression rates in 2012 drawn from the destinations data. (Note that the 2011 data does not supply separate AAB+ outcomes for Inner and Outer London).

 

Chart 7: Top A level performance by region 2012/13

Destinations chart 7

Chart 8: Regional achievement of AAB+ grades at A level in 2011 and 2012 compared with 2012 FSM and non-FSM progression rates to RG universities

Destinations chart 8

 

Chart 7 shows that Inner London returns the lowest rates of top-grade A level attainment, while Outer London is at the top of the range. This suggests that top grade A level attainment is depressed in Inner London, which might well be attributable to the exceptionally high incidence of relatively lower attaining FSM-eligible students.

Chart 8 again shows Outer London performing strongly – on both top grade A level attainment and Russell Group progression, while Inner London is lagging behind.

A straightforward bilateral comparison between Inner and Outer London suggests a clear correlation between these two variables, although correlation does not amount to causation.

Moreover, the picture becomes somewhat more complex when other regions are factored in. Outer London has similar top grade A level attainment to the South East, but performs significantly better on Russell Group progression, even with a significantly higher proportion of FSM students.

Meanwhile Inner London, clearly the laggard in terms of top grade A level performance, is also the backmarker for non-FSM Russell Group progression. However, it still seems to perform comparatively well in terms of FSM progression, especially when compared with the South East.

This could be explained by the fact that relatively more FSM students in Inner London achieve the highest grades, or perhaps they are disproportionately the beneficiaries of contextualised admissions practice. Other factors could also be in play, not least the geographical proximity of several Russell Group institutions.

There is some evidence – published by the Social Mobility and Child Poverty Commission (SMCPC) and recently taken up in CfBT’s research on the ‘London effect’ – that disadvantaged students across London as a whole are relatively strong performers in higher grade GCSEs.

The SMCPC’s 2013 State of the Nation report (page 191) drew attention to overall London success on an 8+ A*-B GCSE including English and Maths (and excluding equivalents) measure – albeit distinguishing between those attracting Pupil Premium funding and their peers.

 

destinations capture 1

 

This table was converted into a chart in a recent CfBT research report on London.

 

destinations capture 2

 

Unfortunately, we cannot see the data for Inner and Outer London separately, so the ‘London effect’ may be disproportionately attributable to the Outer boroughs.

So where does this leave us?

The balance of probabilities suggests that the incidence of high attainment at GCSE and post-16 will impact strongly on progression to selective higher education and so provide the root cause for regional differences in progression rates.

Regions wishing to improve their performance need to look first at increasing high attainment, taking full account of disparities between the performance of FSM and non-FSM students.

There is some evidence to suggest that the celebrated ‘London effect’ has not translated into achievement of the highest attainment levels at A level in Inner London, especially compared with Outer London. This is impacting negatively on progression rates for FSM students but, ironically, progression rates for non-FSM students seem to be taking a bigger hit, perhaps because they do not benefit so significantly from contextualised admissions.

Any London-wide regional strategy to improve progression to the most selective universities would need to focus strongly on closing the gaps between FSM and non-FSM progression rates in Inner and Outer London respectively.

 

The policy response to poor FSM progression

The current policy response is multi-faceted but focused primarily on system-wide improvement, rather than organising and targeting support directly at the students most likely to benefit.

This is partly a function of a market-driven political philosophy, fundamental aversion to centrally organised programmes and commitment to a distributed model in which institutions enjoy substantial autonomy, subject to a strong accountability regime which focuses primarily (but not exclusively) on outcomes, including via the introduction of the destination measures discussed in this post.

By strengthening the system as a whole, it is anticipated that standards will rise across the board. A more rigorous national curriculum and more demanding qualifications will raise performance thresholds, ensuring that all learners are better prepared for progression, regardless of their destination. Some examinations are being revised to remove ceilings on the performance of the highest attainers.

Reporting of performance is adjusted to ensure that schools focus on improving attainment and progress of all learners, regardless of their starting point. Inspection includes checks that high attainers are not underachieving.

A series of interventions has been introduced to strengthen attainment and progression in maths and across other STEM subjects.

There have been efforts to strengthen the role of the Office for Fair Access (OFFA) and to introduce a co-ordinated ‘National Strategy for Access and Student Success’ involving collaboration between OFFA and HEFCE. Meanwhile HE student number controls have been relaxed enabling institutions to expand their intakes of suitably qualified students.

Some degree of localised intervention is taking place through the free schools programme as a first tranche of selective 16-19 institutions has been established, often with an explicit mission to increase the flow of disadvantaged students to selective higher education.

Financial support has been targeted towards disadvantaged learners through the Pupil Premium, ensuring that schools receive extra funding for each disadvantaged learner they admit up to and including Year 11. Academies – including many selective schools – are permitted to prioritise admission of these learners when oversubscribed.

There are issues with aspects of this agenda, for example:

  • The introduction of universal end of KS2 tests may reduce their capacity to differentiate the performance of the highest attainers, so recently enhanced through the adoption of Level 6 tests. There is an associated risk that schools’ internal assessment systems will impose artificially low ceilings restricting high attainers’ progress.
  • Ofsted’s welcome focus on the most able in schools gives insufficient emphasis to those attracting the Pupil Premium and is not backed up by explicit guidance. Nor does it apply to the separate inspection of post-16 settings, undertaken under a different inspection framework.
  • OFFA and HEFCE cannot readily alter the behaviour of independent higher education institutions that make too little progress with fair access, or which improve too slowly. There are too few carrots and sticks and widespread resistance to the imposition of robust targets, even though the SMCPC has called for this repeatedly. Efforts at strengthening institutional collaboration are equally constrained.
  • As yet there are too few selective 16-19 institutions to make a real difference. They are too little focused on supporting improvements in neighbouring institutions and, even within their own intakes, do not always give sufficient priority to the most disadvantaged students.
  • The Pupil Premium stops at age 16 and schools are largely free to use it as they wish – there is no guarantee that each learner attracting the Premium will benefit commensurately and some risk that high attainers are amongst the most vulnerable in this respect.
  • One wonders whether the destination indicators, when introduced into the accountability regime in 2016, will be influential enough to change institutional behaviour. The simultaneous deployment of several different measures of selectivity may dilute their impact. On the other hand, a single measure would be too blunt an instrument.

But these are second order issues. Overall, the current education reform programme can be expected to bring about some improvement in FSM progression rates to selective higher education.

However:

  • It will take a comparatively long time.
  • There is significant deadweight.
  • Fault lines between higher education and schools policy remain problematic.
  • Nothing holds these disparate policy elements together to ensure that ‘the whole is greater than the sum of the parts’.

 

Solving the Policy Design Problem

Given the context of wider government policy, what additional policy dimension should be introduced to secure significant improvements in progression rates for disadvantaged learners to selective higher education?

The missing component – which might be introduced nationally or piloted at regional level and subsequently rolled out – is a light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously.

This is by no means equivalent to a rigid, centralised top-down programme, but it does recognise that, left to its own devices, the free market will not create the conditions necessary for success. Some limited intervention is essential.

The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education:

  • On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
  • On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.

With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs.

The supply side would use market intelligence to adjust the range of programmes and services to meet need from different constituencies and localities, acting swiftly to fill gaps in the market and eradicate over-supply.  Programmes and services attracting insufficient demand would close down, while popular programmes and services would expand to meet demand. Small providers with many competitors would discuss the benefits of collaboration to achieve economies of scale, so bringing down costs and increasing demand.

Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:

  • Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
  • A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
  • Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.

This would be a more than adequate replacement for Aim Higher funding, the loss of which is still felt keenly according to this recent DfE research report.

A solution of this kind would be largely self-regulating, requiring only minimal co-ordination and a small administrative budget. It would have several conspicuous advantages in terms of securing much greater coherence and consistency:

  • Across the age range, securing continuity and progression for all participating learners throughout secondary education up to the point of entry into higher education.
  • Between educational settings, especially at the key transition point between secondary and tertiary education at age 16, when half or more students might be expected to move to a different setting.
  • Regardless of geographical location, so that students are less disadvantaged by virtue of where they live, able to draw on high quality blended and online provision in locations where face-to-face provision is unviable.
  • Incorporating the contribution of national, regional and local centres of excellence – including for example new selective 16-19 institutions such as the London Academy for Excellence and the Harris Westminster Sixth Form – providing them with a platform to share and spread excellent practice and supply outreach of their own.
  • Providing a nexus for cross-sectoral partnership and collaboration, including collaborative efforts in the higher education sector recently launched by OFFA and HEFCE.
  • Supplying a context in which selective higher education institutions can be more transparent about their contextual admission offers and other fair access policies, enabling students to make proper comparisons when selecting their preferred institutions.
  • Accommodating and complementing the reform package I have already proposed to improve fair access to Oxbridge.

The first of these dimensions is particularly important given recent research, published by the Social Mobility and Child Poverty Commission which finds that:

‘Of 7,853 children from the most deprived homes who achieve level 5 in English and maths at age 11 (8.5%…), only 906 (11.5%) make it to an elite university. If they had the same trajectory as a child from one of the least deprived families, then 3,066 of these children would be likely to go to an elite university (39.0%) – suggesting that 2,160 children are falling behind.’

The report concludes:

‘Poorer students have lower average achievement at each stage of their education and even those who start strongly with higher achievement at Key Stages 1 and 2 are more likely to fall off their high achievement trajectory than their wealthier peers. The achievement of students from poorer backgrounds is particularly likely to fall away between Key Stage 2 and Key Stage 4, making secondary school a potentially important area of intervention for policymakers interested in increasing participation at high-status universities amongst young people from more deprived backgrounds.’

Such an approach would be relatively inexpensive and fully scalable (I have not properly costed it, but a £50m topslice from the annual £2.5bn national Pupil Premium budget – for which there is precedent – would be more than enough to meet the full burden on the taxpayer.)

A regional pilot – perhaps in London, or perhaps elsewhere – would accommodate an EEF-funded randomised control trial, though this would need to be extended if incorporating a cohort undertaking the full cycle from Year 7 upwards.

The full benefits would not be realised until this first seven year cycle was completed, but one would anticipate significant positive impacts on attainment much sooner than that and, if students were allowed to participate from Year 10, or possibly even later, the impact on progression to selective universities would be felt within the lifetime of the next government.

 

Conclusion

There are strong equity and social mobility arguments for improving significantly the attainment of disadvantaged students and increasing their rates of progression to selective universities. This is also a sound investment in human capital, improving our national standing in the ‘global race’.

These progression rates have been stalled for a generation. Recent attempts to claim ‘green shoots of recovery’ relate only to the least selective top third measure. Even if they are realised, they are unlikely to wash through to Russell Group and Oxbridge admissions where the under-representation of FSM students is marked and, some would argue, a national scandal.

The publication of Destination Measures provides a valuable addition to our evidence base, though we know far too little about excellence gaps – between the performance of advantaged and disadvantaged learners on high attainment measures – so cannot readily explore the impact of these on progression rates.

Current education policy will likely bring about improvements, but only very slowly. Progression rates to the most selective institutions will be the hardest and slowest to shift. There are ongoing risks associated with cross-policy coherence and the fault lines between education policy for schools and higher education respectively (with the post-16 sector caught somewhere in between).

An additional policy strand is needed to secure vertical, horizontal and lateral coherence and deliver a whole greater than the sum of its parts. Potential design principles for this strand are set out above.  Substantial benefits would be realised during the lifetime of the next government.

Perfect Manifesto material!

 

GP

July 2014

 

Why Can’t We Have National Consensus on Educating High Attainers?

 

 

This post proposes a statement of core principles to provoke debate and ultimately build consensus about the education of high attaining learners.

It incorporates an Aunt Sally – admittedly imperfect, provocative and prolix – to illustrate the concept and stimulate initial thinking about what such a statement might contain.

The principles are designed to underpin effective provision. They are intended to apply at every level of the education system (whether national, regional or local) and to every learning setting and age group, from entry to Reception to admission to higher education (or equivalent) and all points in between.

Alongside the draft core principles – which should have more or less global application – I offer a complementary set of ‘reform principles’ which are specific to the English context and describe how our national education reform programme might be harnessed and applied more consistently to support high attainers.

This is expressed in system-wide terms, but could be translated fairly straightforwardly into something more meaningful for schools and colleges.

 

Justification

As education reforms continue to be developed and implemented at a rapid pace, it is essential that they fit together coherently. The various reforms must operate together smoothly, like interlocking cogs in a well-oiled machine, such that the whole is greater than the sum of the parts.

Coherence must be achieved across three dimensions:

  • Horizontally, across the span of education policy.
  • Vertically, across the age range, taking in the primary, secondary and tertiary sectors.
  • Laterally, for each and every learning setting to which it applies.

There is a risk that such co-ordination becomes more approximate as capacity is stretched by the sheer weight of reform, especially if the central resource traditionally devoted to this task is contracting simultaneously.

In an increasingly bottom-up system, some of the responsibility for ensuring the ‘fit’ across the span of education reforms can be devolved from the centre, initially to a range of intermediary bodies and ultimately to learning settings themselves.

Regardless of where the responsibility lies, there can be a tendency to cut corners, by making these judgements with reference to some notional average learner. But this ignores the needs and circumstances of atypical constituencies including high attainers.

High attainers may even find themselves at the bottom of the pecking order amongst these atypical constituencies, typically as a consequence of the misguided view that they are more or less self-sufficient educationally speaking.

A framework of sorts is necessary to support this process, to protect against the risk that high attainers may otherwise be short-changed and also to ensure flexibility of provision within broad but common parameters.

The Government has recently set a precedent by publishing a set of Assessment Principles ‘to underpin effective assessment systems within schools’.

This post applies that precedent to support the education of high attainers, providing a flexible framework, capable of adoption (with adaptation where necessary) by all the different bodies and settings engaged in this process.

 

The English policy context

I have sought to incorporate in the second set of ‘reform’ principles the full range of areas explored by this blog, which began life at roughly the same time as the present Government began its education reform programme.

They are designed to capture the reform agenda now, as we draw to the close of the 2013/14 academic year. They highlight aspects of reform that are likely to be dominant over the next three academic years, subject of course to any adjustments to the reform programme in the light of the 2015 General Election.

These include:

  • Introduction of a new national curriculum incorporating both greater challenge and greater flexibility, together with full exemption for academies.
  • Introduction of new assessment arrangements, including internal assessment in schools following the withdrawal of national curriculum levels and external assessment arrangements, particularly at the end of KS2.
  • Introduction of revised GCSE and A level qualifications, including a new recalibrated grading system for GCSE.
  • Radical changes to the accountability system, including the reporting of learners’ achievement and the inspection of provision in different learning settings. 
  • Ensuring that the Pupil Premium drives accelerated progress in closing attainment gaps between disadvantaged and advantaged learners.
  • Ensuring accelerated progress against updated social mobility indicators, including improvements in fair access to selective universities.
  • Strengthening system-wide collaboration, ensuring that new types of institution play a significant role in this process, developing subject-specific support networks (especially in STEM) and building the capacity and reach of teaching school alliances.

 

Process

The Aunt Sally might be used as a starting point by a small group charged with generating a viable draft set of principles, either stand-alone or supported by any additional scaffolding deemed necessary.

The preparation of the draft core principles would itself be a consensus-establishing exercise, helping to distinguish areas of agreement and critical sticking points requiring negotiation to resolve.

This draft might be issued for consultation for a fixed period. Responses would be sought directly from a range of key national organisations, all of which would subsequently be invited to endorse formally the final version, revised in the light of consultation.

This stage might entail some further extended negotiation, but the process itself would help to raise the profile of the issue.

Out in the wider system, educators might be encouraged to interact with the final version of the principles, to discuss and record how they might be adjusted or qualified to fit their own particular settings.

There might be an online repository and forum (using a free online platform) enabling educators to discuss their response to the principles, suggest localised adjustments and variants to fit their unique contexts, provide exemplification and share supporting resources, materials and links.

Some of the key national organisations might be encouraged to develop programmes and resources within their own purlieux which would link explicitly with the core principles.

Costs would be limited to the human resource necessary to co-ordinate the initial task and subsequently curate the online repository.

 

Provisos

The focus on high attainment (as a subset of high achievement) has been selected in preference to any categorisation of high ability, talent or giftedness because there are fewer definitional difficulties, the terminology is less problematic and there should be a correspondingly stronger chance of reaching consensus.

I have not at this stage included a definition of high attainers. Potentially one could adopt the definition used in the Primary and Secondary Performance Tables, or an alternative derived from Ofsted’s ‘most able’ concept.

The PISA high achievement benchmarks could be incorporated, so permitting England to compare its progress with other countries.

But, since we are working towards new attainment measures at the end of KS2 and KS4 alike, it may be more appropriate to develop a working definition based on what we know of those measures, adapting the definition as necessary once the measures are themselves more fully defined.

In the two sections following I have set out the two parts of my Aunt Sally:

  • A set of ten core principles, designed to embody a shared philosophy underpinning the education of high attainers; and
  • A parallel set of ten reform principles, designed to show how England’s education reform agenda might be adapted and applied to support the education of high attainers.

As noted above, I have cast the latter in system-wide terms, hopefully as a precursor to developing a version that will apply (with some customisation) to every learning setting. I have chosen deliberately to set out the big picture from which these smaller versions might be derived.

My Aunt Sally is imbued with a personal belief in the middle way between a bottom-up, school-driven and market-based system on one hand and a rigid, top-down and centrally prescribed system on the other. The disadvantages of the latter still live in the memory, while those of the former are writ large in the current crisis.

Some of this flavour will be obvious below, especially in the last two reform principles, which embody what I call ‘flexible framework thinking’. You will need to make some allowances if you are of a different persuasion.

I have also been deliberately a little contentious in places, so as to stimulate reaction in readers. The final version will need to be more felicitously worded, but it should still be sharp enough to have real meaning and impact.

For there is no point in generating an anodyne ‘motherhood and apple pie’ statement that has no prospect of shifting opinion and behaviour in the direction required.

Finally, the current text is too long-winded, but I judged it necessary to include some broader context and signposting for those coming to this afresh. I am hopeful that, when this is shorn away, the slimmed-down version will be closer to its fighting weight.

 

Ten Core Principles

This section sets out ten essential principles that all parts of the education system should follow in providing for high achievers.

 

  1. Raising achievement – within the education system as a whole and for each and every learner – is one of the principal aims of education. It does not conflict with other aims, or with our duty to promote learners’ personal and social development, or their health, welfare and well-being.

 

  2. Securing high achievement – increasing the proportion of high achievers and raising the achievement of existing high achievers – is integral to this aim.

 

  3. Both existing and potential high achievers have a right, equal to that of all other learners, to the blend of challenge and support they need to improve further – to become the best that they can be. No learner should be discriminated against educationally on the basis of their prior achievement, whether high or low or somewhere in between.

 

  4. We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.

 

  5. Securing high attainment is integral to securing high achievement. The route to high attainment may involve any or all of greater breadth, increased depth and a faster pace of learning. These elements should be prioritised and combined appropriately to meet each learner’s needs; a one-size-fits-all solution should not be imposed, nor should any of these elements be ruled out automatically.

 

  6. There must be no artificial ceilings or boundaries restricting high attainment, whether imposed by chronological age or by the expertise available in the principal learning setting; equally, there must be no ‘hot-housing’, resulting from an imbalance between challenge and support and an associated failure to respond with sensitivity to the learner’s wider needs.

 

  7. High attainers are an extremely diverse and disparate population. Some are much higher attainers than others. Some may be ‘all-rounders’ while others have particular strengths and areas for development. All need the right blend of challenge and support to improve alike in areas of strength and any areas of comparative weakness.

 

  8. Amongst the high-attaining population there is significant over-representation of some learner characteristics. But there is also significant diversity, resulting from the interaction between gender, special needs, ethnic and socio-economic background (and several other characteristics besides). This diversity can and should increase as excellence gaps are closed.

 

  9. Educators must guard against the false assumption that high attainment is a corollary of advantage. Equally, they must accept that, while effective education can make a significant difference, external factors beyond their control will also impact upon high attainment. The debate about the relative strength of genetic and environmental influences is irrelevant, except insofar as it obstructs universally high expectations and the instilling of a positive ‘growth mindset’ in all learners.

 

  10. High attainers cannot meet their own educational needs without the support of educators. Nor is it true that they have no such needs by virtue of their prior attainment. Investment in their continued improvement is valuable to them as individuals, but also to the country as a whole, economically, socially and culturally.

 

Ten Reform Principles

This section describes how different elements of educational reform might be harnessed to ensure a coherent, consistent and mutually supportive strategy for increasing high attainment.

The elements below are described in national system-wide terms, as they apply to the primary and secondary school sectors, but each should be capable of adjustment so it is directly relevant at any level of the system and to every learning setting.

 

  1. Revised national curriculum arrangements offer greater flexibility to design school curricula to meet high attainers’ needs. ‘Top down’ curriculum design, embodying the highest expectations of all learners, is preferable to a ‘deficit model’ approach derived from lowest common denominator thresholds. Exemplary models should be developed and disseminated to support schools in developing their own.

 

  2. The assessment system must enable high attainers to show what they know, understand and can do. Their needs should not be overlooked in the pursuit of universally applicable assessment processes. Formative assessment must provide accurate, constructive feedback and sustain high expectations, regardless of the starting point. Internal and external assessment alike must be free of undesirable ceiling effects.

 

  3. Regardless of their school, all high attainers should have access to opportunities to demonstrate excellence through national assessments and public examinations, including Level 6 assessment (while it exists) and early entry (where it is in their best interests). Progression across transition points – eg primary to secondary – should not require unnecessary repetition and reinforcement. It should be pre-planned, monitored and kept under review.

 

  4. High attainment measures should feature prominently when results are reported, especially in national School and College Performance Tables, but also on school websites and in the national data portal. Reporting should reveal clearly the extent of excellence gaps between the performance of advantaged and disadvantaged high attainers respectively.

 

  5. Ofsted’s inspection framework now focuses on the attainment and progress of ‘the most able’ in every school. Inspectors should adopt a consistent approach to judging all settings’ provision for high attainers, including explicit focus on disadvantaged high attainers. Inspectors and settings alike would benefit from succinct guidance on effective practice.

 

  6. The impact of the Pupil Premium on closing excellence gaps should be monitored closely. Effective practice should be captured and shared. The Education Endowment Foundation should ensure that impact on excellence gaps is mainstreamed within all its funded programmes and should also stimulate and support programmes dedicated to closing excellence gaps.

 

  7. The closing of excellence gaps should improve progression for disadvantaged high attainers, including to selective secondary, tertiary and higher education. Destination indicators should enable comparison of institutional success in this regard. Disadvantaged high attainers need access to tailored IAG to support fair access at every level. Targeted outreach to support effective transition is also essential at each transition point (typically 11, 16 and 18). Universities should be involved from KS2 onwards. The relevant social mobility measures should align with Pupil Premium ‘eligibility’. Concerted corrective action is required to improve progress whenever and wherever it stalls.

 

  8. System-wide collaboration is required to drive improvement. It must include all geographical areas, educational sectors and institutional types, including independent and selective schools. All silos – whether associated with localities, academy chains, teaching school alliances, subject specialism or any other subset of provision – must be broken down. This requires joint action by educational settings, voluntary sector organisations and private sector providers alike. Organisations active in the field must stop protecting their fiefdoms and work together for the common good.

 

  9. To minimise fragmentation and patchiness of provision, high attaining learners should have guaranteed access to a menu of opportunities organised within a coherent but flexible framework. Their schools, as lead providers, should facilitate and co-ordinate on their behalf. A similar approach is required to support educators with relevant school improvement, initial training, professional development and research. To support this parallel framework, both theoretical and practical knowledge of the ‘pedagogy of high attainment’ should be collected, organised and shared.

 

  10. All providers should be invited to position their services within these frameworks, using intelligence about the balance between demand and supply to inform the development of new products and services. Responsibility for overseeing the frameworks and for monitoring and reporting progress should be allocated to an independent entity within this national community. As far as possible this should be a self-funding and self-sustaining system.

 

Next Steps

I have already had some welcome interest in developing a set of core principles to support the education of high attaining learners.

This may be a vehicle to stimulate a series of useful partnerships, but it would be premature to publicise these preliminary discussions in case they do not reach fruition.

This post is intended to stimulate others to consider the potential benefits of such an approach – and I am at your service should you wish to discuss the idea further.

But if I have only caused you to reflect more deeply about your personal contribution to the education of high attainers, even then this effort has been worthwhile.

GP

May 2014

 

 

‘Poor but Bright’ v ‘Poor but Dim’

 

This post explores whether, in supporting learners from disadvantaged backgrounds, educators should prioritise low attainers over high attainers, or give them equal priority.

 

 

 

Introduction

Last week I took umbrage at a blog post and found myself engaged in a Twitter discussion with the author, one Mr Thomas.

 

 

Put crudely, the discussion hinged on the question whether the educational needs of ‘poor but dim’ learners should take precedence over those of the ‘poor but bright’. (This is Mr Thomas’s shorthand, not mine.)

He argued that the ‘poor but dim’ are the higher priority; I countered that all poor learners should have equal priority, regardless of their ability and prior attainment.

We began to explore the issue:

  • as a matter of educational policy and principle;
  • with reference to inputs – the allocation of financial and human resources between these competing priorities; and
  • in terms of outcomes – the comparative benefits to the economy and to society from investment at the top or the bottom of the attainment spectrum.

This post presents the discussion, adding more flesh and gloss from the Gifted Phoenix perspective.

It might or might not stimulate some interest in how this slightly different take on a rather hoary old chestnut plays out in England’s current educational landscape.

But I am particularly interested in how gifted advocates in different countries respond to these arguments. What is the consensus, if any, on the core issue?

Depending on the answer to this first question, how should gifted advocates frame the argument for educationalists and the wider public?

To help answer the first question I have included a poll at the end of the post.

Do please respond to that – and feel free to discuss the second question in the comments section below.

The structure of the post is fairly complex, comprising:

  • A (hopefully objective) summary of Mr Thomas’s original post.
  • An embedded version of the substance of our Twitter conversation. I have removed some Tweets – mostly those from third parties – and reordered a little to make this more accessible. I don’t believe I’ve done any significant damage to either case.
  • Some definition of terms, because there is otherwise much cause for confusion as we push further into the debate.
  • A digressive exploration of the evidence base, dealing with attainment data and budget allocations respectively. The former exposes what little we are told about how socio-economic gaps vary across the attainment spectrum; the latter is relevant to the discussion of inputs. Those pressed for time may wish to proceed directly to…
  • …A summing up, which expands in turn the key points we exchanged on the point of principle, on inputs and on outcomes respectively.

I have reserved until close to the end a few personal observations about the encounter and how it made me feel.

And I conclude with the customary brief summary of key points and the aforementioned poll.

It is an ambitious piece and I am in two minds as to whether it hangs together properly, but you are ultimately the judges of that.

 

What Mr Thomas Blogged

The post was called ‘The Romance of the Poor but Bright’ and the substance of the argument (incorporating several key quotations) ran like this:

  • The ‘effort and resources, of schools but particularly of business and charitable enterprise, are directed disproportionately at those who are already high achieving – the poor but bright’.
  • Moreover ‘huge effort is expended on access to the top universities, with great sums being spent to make marginal improvements to a small set of students at the top of the disadvantaged spectrum. They cite the gap in entry, often to Oxbridge, as a significant problem that blights our society.’
  • This however is ‘the pretty face of the problem. The far uglier face is the gap in life outcomes for those who take least well to education.’
  • ‘Popular discourse is easily caught up in the romance of the poor but bright’ but ‘we end up ignoring the more pressing problem – of students for whom our efforts will determine whether they ever get a job or contribute to society’. For ‘when did you last hear someone advocate for the poor but dim?’
  • ‘The gap most damaging to society is in life outcomes for the children who perform least well at school.’ Three areas should be prioritised to improve their educational outcomes:

    • Improving alternative provision (AP), which ‘operates as a shadow school system, largely unknown and wholly unappreciated’ – ‘developing a national network of high-quality alternative provision…must be a priority if we are to close the gap at the bottom’.

    • Improving ‘consistency in SEN support’, because ‘schools are often ill equipped to cope with these, and often manage only because of the extraordinary effort of dedicated staff’. There is ‘inconsistency in funding and support between local authorities’.

    • Introducing clearer assessment of basic skills, ‘so that a student could not appear to be performing well unless they have mastered the basics’.

  • While ‘any student failing to meet their potential is a dreadful thing’, the educational successes of ‘students with incredibly challenging behaviour’ and ‘complex special needs…have the power to change the British economy, far more so than those of their brighter peers.’

A footnote adds ‘I do not believe in either bright or dim, only differences in epigenetic coding or accumulated lifetime practice, but that is a discussion for another day.’

Indeed it is.

 

Our ensuing Twitter discussion

The substance of our Twitter discussion is captured in the embedded version immediately below. (Scroll down to the bottom for the beginning and work your way back to the top.)

 

 

Defining Terms

 

Poor, Bright and Dim

I take poor to mean socio-economic disadvantage, as opposed to any disadvantage attributable to the behaviours, difficulties, needs, impairments or disabilities associated with AP and/or SEN.

I recognise of course that such a distinction is more theoretical than practical, because, when learners experience multiple causes of disadvantage, the educational response must be holistic rather than disaggregated.

Nevertheless, the meaning of ‘poor’ is clear – that term cannot be stretched to include these additional dimensions of disadvantage.

The available performance data foregrounds two measures of socio-economic disadvantage: current eligibility for and take-up of free school meals (FSM) and qualification for the deprivation element of the Pupil Premium, determined by FSM eligibility at some point within the last 6 years (known as ‘ever-6’).

Both are used in this post. Distinctions are typically between disadvantaged and non-disadvantaged learners, though some of the supporting data compares outcomes for disadvantaged learners with outcomes for all learners, advantaged and disadvantaged alike.

The gaps that need closing are therefore:

  • between ‘poor and bright’ and other ‘bright’ learners (The Excellence Gap); and
  • between ‘poor and dim’ and other ‘dim’ learners. I will christen this The Foundation Gap.

The core question is whether The Foundation Gap takes precedence over The Excellence Gap or vice versa, or whether they should have equal billing.

This involves immediate and overt recognition that classification as AP and/or SEN is not synonymous with the epithet ‘poor’, because there are many comparatively advantaged learners within these populations.

But such a distinction is not properly established in Mr Thomas’s blog, which applies the epithet ‘poor’ but then treats the AP and SEN populations as homogeneous and somehow associated with it.

 

By ‘dim’ I take Mr Thomas to mean the lowest segment of the attainment distribution – one of his tweets specifically mentions ‘the bottom 20%’. The AP and/or SEN populations are likely to be disproportionately represented within these two deciles, but they are not synonymous with that group either.

This distinction will not be lost on gifted advocates who are only too familiar with the very limited attention paid to twice exceptional learners.

Those from poor backgrounds within the AP and/or SEN populations are even more likely to be disproportionately represented in ‘the bottom 20%’ than their more advantaged peers, but even they will not constitute the entirety of ‘the bottom 20%’. A Venn diagram would likely show significant overlap, but that is all.

Hence disadvantaged AP/SEN are almost certainly a relatively poor proxy for the ‘poor but dim’.

That said I could find no data that quantifies these relationships.

The School Performance Tables distinguish a ‘low attainer’ cohort. (In the Secondary Tables the definition is determined by prior KS2 attainment and in the Primary Tables by prior KS1 attainment.)

These populations comprise some 15.7% of the total population in the Secondary Tables and about 18.0% in the Primary Tables. But neither set of Tables applies the distinction in its reporting of the attainment of those from disadvantaged backgrounds.

 

It follows from the definition of ‘dim’ that, by ‘bright’, Mr Thomas probably intends the two corresponding deciles at the top of the attainment distribution (even though he seems most exercised about the subset with the capacity to progress to competitive universities, particularly Oxford and Cambridge. This is a far more select group of exceptionally high attainers – and an even smaller group of exceptionally high attainers from disadvantaged backgrounds.)

A few AP and/or SEN students will likely fall within this wider group, fewer still within the subset of exceptionally high attainers. AP and/or SEN students from disadvantaged backgrounds will be fewer again, if indeed there are any at all.

The same issues with data apply. The School Performance Tables distinguish ‘high attainers’, who constitute over 32% of the secondary cohort and 25% of the primary cohort. As with low attainers, we cannot isolate the performance of those from disadvantaged backgrounds.

We are forced to rely on what limited data is made publicly available to distinguish the performance of disadvantaged low and high attainers.

At the top of the distribution there is a trickle of evidence about performance on specific high attainment measures and access to the most competitive universities. Still greater transparency is fervently to be desired.

At the bottom, I can find very little relevant data at all – we are driven inexorably towards analyses of the SEN population, because that is the only dataset differentiated by disadvantage, even though we have acknowledged that such a proxy is highly misleading. (Equivalent AP attainment data seems conspicuous by its absence.)

 

AP and SEN

Before exploring these datasets I ought to provide some description of the different programmes and support under discussion here, if only for the benefit of readers who are unfamiliar with the English education system.

 

Alternative Provision (AP) is intended to meet the needs of a variety of vulnerable learners:

‘They include pupils who have been excluded or who cannot attend mainstream school for other reasons: for example, children with behaviour issues, those who have short- or long-term illness, school phobics, teenage mothers, pregnant teenagers, or pupils without a school place.’

AP is provided in a variety of settings where learners engage in timetabled education activities away from their school and school staff.

Providers include further education colleges, charities, businesses, independent schools and the public sector. Pupil Referral Units (PRUs) are perhaps the best-known settings – there are some 400 nationally.

A review of AP was undertaken by Taylor in 2012 and the Government subsequently embarked on a substantive improvement programme. This rather gives the lie to Mr Thomas’s contention that AP is ‘largely unknown and wholly unappreciated’.

Taylor complains of a lack of reliable data about the number of learners in AP but notes that the DfE’s 2011 AP census recorded 14,050 pupils in PRUs and a further 23,020 in other settings on a mixture of full-time and part-time placements. This suggests a total of slightly over 37,000 learners, though the FTE figure is unknown.

He states that AP learners are:

‘…twice as likely as the average pupil to qualify for free school meals’

A supporting Equality Impact Assessment qualifies this somewhat:

‘In Jan 2011, 34.6% of pupils in PRUs and 13.8%* of pupils in other AP, were eligible for and claiming free school meals, compared with 14.6% of pupils in secondary schools. [*Note: in some AP settings, free school meals would not be available, so that figure is under-stated, but we cannot say by how much.]’

If the PRU population is typical of the wider AP population, approximately one third qualify under this FSM measure of disadvantage, meaning that the substantial majority are not ‘poor’ according to our definition above.
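The arithmetic behind this paragraph can be cross-checked in a few lines (a minimal sketch in Python; the figures are those quoted above, and generalising the PRU rate to all AP settings is the same assumption the text makes):

```python
# Quick cross-check of the FSM proportions quoted above.
# Assumption (shared with the text): the PRU rate generalises to all AP settings.
pru_fsm_rate = 0.346        # FSM-eligible and claiming, PRUs, Jan 2011
secondary_fsm_rate = 0.146  # FSM-eligible and claiming, secondary schools, Jan 2011

# Taylor's 'twice as likely' claim: over-representation of FSM pupils in PRUs
over_representation = pru_fsm_rate / secondary_fsm_rate
print(f"FSM pupils are {over_representation:.1f}x over-represented in PRUs")  # 2.4x

# If roughly one third of AP pupils are FSM-eligible, about two thirds are not 'poor'
share_not_poor = 1 - pru_fsm_rate
print(f"Share of AP pupils not counted as 'poor': {share_not_poor:.0%}")  # 65%
```

The ratio of roughly 2.4 is consistent with Taylor’s statement that AP learners are ‘twice as likely as the average pupil to qualify for free school meals’.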

Taylor confirms that overall GCSE performance in AP is extremely low, pointing out that in 2011 just 1.4% achieved five or more GCSE grades A*-C including [GCSEs in] maths and English, compared to 53.4% of pupils in all schools.

By 2012/13 the comparable percentages were 1.7% and 61.7% respectively (the latter for all state-funded schools), suggesting an increasing gap in overall performance. This is a cause for concern but not directly relevant to the issue under consideration.

The huge disparity is at least partly explained by the facts that many AP students take alternative qualifications and that the national curriculum does not apply to PRUs.

Data is available showing the full range of qualifications pursued. Taylor recommended that all students in AP should continue to receive ‘appropriate and challenging English and Maths teaching’.

Interestingly, he also pointed out that:

‘In some PRUs and AP there is no provision for more able pupils who end up leaving without the GCSE grades they are capable of earning.’

However, he fails to offer a specific recommendation to address this point.

 

Special Educational Needs (SEN) are needs or disabilities that affect children’s ability to learn. These may include behavioural and social difficulties, learning difficulties or physical impairments.

This area has also been subject to a major Government reform programme now being implemented.

There is significant overlap between AP and SEN, with Taylor’s review of the former noting that the population in PRUs is 79% SEN.

We know from the 2013 SEN statistics that 12.6% of all pupils on roll at PRUs had SEN statements and 68.9% had SEN without statements. But these populations represent only a tiny proportion of the total SEN population in schools.

SEN learners also have higher than typical eligibility for FSM. In January 2013, 30.1% of pupils across all SEN categories in primary, secondary and special schools were FSM-eligible, roughly twice the rate for all pupils. However, this means that almost seven in ten are not caught by the definition of ‘poor’ provided above.

In 2012/13 23.4% of all SEN learners achieved five or more GCSEs at A*-C or equivalent, including GCSEs in English and maths, compared with 70.4% of those having no identified SEN – another significant overall gap, but not directly relevant to our comparison of the ‘poor but bright’ and the ‘poor but dim’.

 

Data on socio-economic attainment gaps across the attainment spectrum

Those interested in how socio-economic attainment gaps vary at different attainment levels cannot fail to be struck by how little material of this kind is published, particularly in the secondary sector, where such gaps tend to widen.

One cannot entirely escape the conviction that this reticence deliberately masks some inconvenient truths.

  • The ideal would be to have the established high/middle/low attainer distinctions mapped directly onto performance by advantaged/disadvantaged learners in the Performance Tables but, as we have indicated, this material is conspicuous by its absence. Perhaps it will appear in the Data Portal now under development.
  • Our next best option is to examine socio-economic attainment gaps on specific attainment measures that will serve as decent proxies for high/middle/low attainment. We can do this to some extent but the focus is disproportionately on the primary sector because the Secondary Tables do not include proper high attainment measures (such as measures based exclusively on GCSE performance at grades A*/A). Maybe the Portal will come to the rescue here as well. We can however supply some basic Oxbridge fair access data.
  • The least preferable option is to deploy our admittedly poor proxies for low attainers – SEN and AP. But there isn’t much information from this source either.

The analysis below looks consecutively at data for the primary and secondary sectors.

 

Primary

We know, from the 2013 Primary School Performance Tables, that the percentages of disadvantaged and other learners achieving different KS2 levels in reading, writing and maths combined, in 2013 and 2012 respectively, were as follows:

 

Table 1: Percentage of disadvantaged and all other learners achieving each national curriculum level at KS2 in 2013 in reading, writing and maths combined

        L3 or below     L4 or above     L4B or above    L5 or above
        Dis Oth Gap     Dis Oth Gap     Dis Oth Gap     Dis Oth Gap
2013     13   5  +8      63  81 -18      49  69 -20      10  26 -16
2012      x   x   x      61  80 -19       x   x   x       9  24 -15

 

This tells us relatively little, apart from the fact that disadvantaged learners are heavily over-represented at L3 and below and heavily under-represented at L5 and above.

The L5 gap is somewhat lower than the gaps at L4 and 4B respectively, but not markedly so. However, the L5 gap has widened slightly since 2012 while the reverse is true at L4.

This next table synthesises data from SFR51/13: ‘National curriculum assessments at key stage 2: 2012 to 2013’. It also shows gaps for disadvantage, as opposed to FSM gaps.

 

Table 2: Percentage of disadvantaged and all other learners achieving each national curriculum level, including differentiation by gender, in each 2013 end of KS2 test

               L3          L4          L4B         L5          L6
             D  O Gap    D  O Gap    D  O Gap    D  O Gap    D  O Gap
Reading All 12  6  +6   48 38 +10   63 80 -17   30 50 -20    0  1  -1
        B   13  7  +6   47 40  +7   59 77 -18   27 47 -20    0  0   0
        G   11  5  +6   48 37 +11   67 83 -16   33 54 -21    0  1  -1
GPS     All 28 17 +11   28 25  +3   52 70 -18   33 51 -18    1  2  -1
        B   30 20 +10   27 27   0   45 65 -20   28 46 -18    0  2  -2
        G   24 13 +11   28 24  +4   58 76 -18   39 57 -18    1  3  -2
Maths   All 16  9  +7   50 41  +9   62 78 -16   24 39 -15    2  8  -6
        B   15  8  +7   48 39  +9   63 79 -16   26 39 -13    3 10  -7
        G   17  9  +8   52 44  +8   61 78 -17   23 38 -15    2  7  -5

 

This tells a relatively consistent story across each test and for boys as well as girls.

We can see that, at Level 4 and below, learners from disadvantaged backgrounds are clearly over-represented, perhaps with the exception of L4 GPS. But at L4B and above they are markedly under-represented.

Moreover, with the exception of L6 where low percentages across the board mask the true size of the gaps, disadvantaged learners tend to be significantly more under-represented at L4B and above than they are over-represented at L4 and below.

A different way of looking at this data is to compare the percentages of advantaged and disadvantaged learners respectively at L4 and L5 in each assessment.

  • Reading: Amongst disadvantaged learners the proportion at L5 is 18 percentage points lower than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is 12 percentage points higher than at L4.
  • GPS: Amongst disadvantaged learners the proportion at L5 is 5 percentage points higher than the proportion at L4, whereas amongst advantaged learners it is 26 percentage points higher than at L4.
  • Maths: Amongst disadvantaged learners the proportion at L5 is 26 percentage points lower than the proportion at L4, whereas amongst advantaged learners it is only 2 percentage points lower than at L4.
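These within-group comparisons are simple subtractions on the Table 2 figures. A minimal sketch (percentages transcribed from the ‘All’ rows of Table 2 above) reproduces them:

```python
# L4 and L5 percentages for disadvantaged (D) and other (O) learners,
# transcribed from the "All" rows of Table 2.
table2 = {
    "Reading": {"D": (48, 30), "O": (38, 50)},  # (L4 %, L5 %)
    "GPS":     {"D": (28, 33), "O": (25, 51)},
    "Maths":   {"D": (50, 24), "O": (41, 39)},
}

# For each subject and group, the L5 proportion minus the L4 proportion:
# negative means fewer pupils at L5 than at L4 within that group.
diffs = {
    subject: {group: l5 - l4 for group, (l4, l5) in groups.items()}
    for subject, groups in table2.items()
}

for subject, d in diffs.items():
    print(f"{subject}: disadvantaged {d['D']:+d} pp, other {d['O']:+d} pp")
```

The output matches the bulleted comparisons: -18 against +12 for reading, +5 against +26 for GPS, and -26 against -2 for maths.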

If we look at 2013 gaps compared with 2012 (with teacher assessment of writing included in place of the GPS test introduced in 2013) we can see there has been relatively little change across the board, with the exception of L5 maths, which has been affected by the increasing success of advantaged learners at L6.

 

Table 3: Percentage of disadvantaged and all other learners achieving national curriculum levels 3-6 in each of reading, writing and maths in 2012 and 2013 respectively

                  L3          L4          L5          L6
                D  O Gap    D  O Gap    D  O Gap    D  O Gap
Reading 2012   11  6  +5   46 36 +10   33 54 -21    0  0   0
        2013   12  6  +6   48 38 +10   30 50 -20    0  1  -1
Writing 2012   22 11 +11   55 52  +3   15 32 -17    0  1  -1
        2013   19 10  +9   56 52  +4   17 34 -17    1  2  -1
Maths   2012   17  9  +8   50 41  +9   23 43 -20    1  4  -3
        2013   16  9  +7   50 41  +9   24 39 -15    2  8  -6

 

To summarise, as far as KS2 performance is concerned, there are significant imbalances at both the top and the bottom of the attainment distribution and these gaps have not changed significantly since 2012. There is some evidence to suggest that gaps at the top are larger than those at the bottom.

 

Secondary

Unfortunately there is a dearth of comparable data at secondary level, principally because of the absence of published measures of high attainment.

SFR05/2014 provides us with FSM gaps (as opposed to disadvantaged gaps) for a series of GCSE measures, none of which serve our purpose particularly well:

  • 5+ A*-C GCSE grades: gap = 16.0%
  • 5+ A*-C grades including English and maths GCSEs: gap = 26.7%
  • 5+ A*-G grades: gap = 7.6%
  • 5+ A*-G grades including English and maths GCSEs: gap = 9.9%
  • A*-C grades in English and maths GCSEs: gap = 26.6%
  • Achieving the English Baccalaureate: gap = 16.4%

Perhaps all we can deduce is that the gaps vary considerably in size, but tend to be smaller for the relatively less demanding and larger for the relatively more demanding measures.

For specific high attainment measures we are forced to rely principally on data snippets released in answer to occasional Parliamentary Questions.

For example:

  • In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8%. By 2009 the comparable percentages were 1.7% and 9.0% respectively, giving an increased gap of 7.3% (Col 568W)
  • In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)

 

Table 4: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007

            FSM    All pupils   Gap
Maths       3.7      15.6      11.9
Eng lit     4.1      20.0      15.9
Eng lang    3.5      16.4      12.9
Physics     2.2      49.0      46.8
Chemistry   2.5      48.4      45.9
Biology     2.5      46.8      44.3
French      3.5      22.9      19.4
German      2.8      23.2      20.4

 

  • In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10% (Col 488W)
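The gap figures quoted in these answers follow directly from the underlying percentages. As a quick arithmetic check on the 2008 maths figures (a sketch using only the numbers quoted above):

```python
# GCSE maths 2008, maintained schools (figures quoted in Col 488W).
pct_a_star = {"FSM": 1, "All": 4}   # % achieving A*
pct_a      = {"FSM": 3, "All": 10}  # % achieving A

# Combined A*/A rate for each group, and the gap between them.
combined = {g: pct_a_star[g] + pct_a[g] for g in pct_a_star}
gap = combined["All"] - combined["FSM"]

print(combined)           # combined A*/A percentages per group
print(f"A*/A gap: {gap}")
```

This gives 4% for FSM-eligible pupils against 14% for all pupils, reproducing the A*/A gap of 10 percentage points cited in the answer.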

There is much variation in the subject-specific outcomes at A*/A described above. But, when it comes to the overall 5+ GCSEs high attainment measure based on grades A*/A, the gap is much smaller than on the corresponding standard measure based on grades A*-C.

Further material of broadly the same vintage is available in the 2007 DfES statistical publication ‘The Characteristics of High Attainers’.

There is a complex pattern in evidence here which is very hard to explain with the limited data available. More time series data of this nature – illustrating Excellence and Foundation Gaps alike – should be published annually so that we have a more complete and much more readily accessible dataset.

I could find no information at all about the comparative performance of disadvantaged learners in AP settings compared with those not from disadvantaged backgrounds.

Data is published showing the FSM gap for SEN learners on all the basic GCSE measures listed above. I have retained the generic FSM gaps in brackets for the sake of comparison:

  • 5+ A*-C GCSE grades: gap = 12.5% (16.0%)
  • 5+ A*-C grades including English and maths GCSEs: gap = 12.1% (26.7%)
  • 5+ A*-G grades: gap = 10.4% (7.6%)
  • 5+ A*-G grades including English and maths GCSEs: gap = 13.2% (9.9%)
  • A*-C grades in English and maths GCSEs: gap = 12.3% (26.6%)
  • Achieving the English Baccalaureate: gap = 3.5% (16.4%)

One can see that the FSM gaps on the more demanding measures are generally smaller for SEN learners than they are for all learners. This may be interesting but, for the reasons given above, it is not a reliable proxy for the FSM gap amongst ‘dim’ learners.

 

When it comes to fair access to Oxbridge, I provided a close analysis of much relevant data in this post from November 2013.

The chart below shows the number of 15 year-olds eligible for and claiming FSM at age 15 who progressed to Oxford or Cambridge by age 19. The figures are rounded to the nearest five.

 

Chart 1: FSM-eligible learners admitted to Oxford and Cambridge 2005/06 to 2010/11


 

In sum, there has been no change in these numbers over the last six years for which data has been published. So while there may have been consistently significant expenditure on access agreements and multiple smaller mentoring programmes, it has had negligible impact on this measure at least.

My previous post set out a proposal for what to do about this sorry state of affairs.

 

Budgets

For the purposes of this discussion we need ideally to identify and compare total national budgets for the ‘poor but bright’ and the ‘poor but dim’. But that is simply not possible. 

Many funding streams cannot be disaggregated in this manner. As we have seen, some – including the AP and SEN budgets – may be aligned erroneously with the second of these groups, although they also support learners who are neither ‘poor’ nor ‘dim’ and have a broader purpose than raising attainment. 

There may be some debate, too, about which funding streams should be weighed in the balance.

On the ‘bright but poor’ side, do we include funding for grammar schools, even though the percentage of disadvantaged learners attending many of them is virtually negligible (despite recent suggestions that some are now prepared to do something about this)? Should the Music and Dance Scheme (MDS) be within scope of this calculation?

The best I can offer is a commentary that gives a broad sense of orders of magnitude, to illustrate in very approximate terms how the scales tend to tilt more towards the ‘poor but dim’ rather than the ‘poor but bright’, but also to weave in a few relevant asides about some of the funding streams in question.

 

Pupil Premium and the EEF

I begin with the Pupil Premium – providing schools with additional funding to raise the attainment of disadvantaged learners.

The Premium is not attached to the learners who qualify for it, so schools are free to aggregate the funding and use it as they see fit. They are held accountable for these decisions through Ofsted inspection and the gap-narrowing measures in the Performance Tables.

Mr Thomas suggests in our Twitter discussion that AP students are not significant beneficiaries of such support, although provision in PRUs features prominently in the published evaluation of the Premium. It is for local authorities to determine how Pupil Premium funding is allocated in AP settings.

One might also make a case that ‘bright but poor’ learners are not a priority either, despite suggestions from the Pupil Premium Champion to the contrary.

 

 

As we have seen, the Performance Tables are not sharply enough focused on the excellence gaps at the top of the distribution and I have shown elsewhere that Ofsted’s increased focus on the most able does not yet extend to the impact on those attracting the Pupil Premium, even though there was a commitment that it would do so. 

If there is Pupil Premium funding heading towards high attainers from disadvantaged backgrounds, the limited data to which we have access does not yet suggest a significant impact on the size of Excellence Gaps. 

The ‘poor but bright’ are not a priority for the Education Endowment Foundation (EEF) either.

This 2011 paper explains that it is prioritising the performance of disadvantaged learners in schools below the floor targets. At one point it says:

‘Looking at the full range of GCSE results (as opposed to just the proportions who achieve the expected standards) shows that the challenge facing the EEF is complex – it is not simply a question of taking pupils from D to C (the expected level of attainment). Improving results across the spectrum of attainment will mean helping talented pupils to achieve top grades, while at the same time raising standards amongst pupils currently struggling to pass.’

But this is just after it has shown that the percentages of disadvantaged high attainers in its target schools are significantly lower than elsewhere. Other things being equal, the ‘poor but dim’ will be the prime beneficiaries.

It may now be time for the EEF to expand its focus to all schools. A diagram from this paper – reproduced below – demonstrates that, in 2010, the attainment gap between FSM and non-FSM was significantly larger in schools above the floor than in those below the floor that the EEF is prioritising. This is true in both the primary and secondary sectors.

It would be interesting to see whether this is still the case.

 

[Diagram from the EEF paper: FSM/non-FSM attainment gaps in schools above and below the floor, primary and secondary]

 

AP and SEN

Given the disaggregation problems discussed above, this section is intended simply to give some basic sense of orders of magnitude – lending at least some evidence to counter Mr Thomas’ assertion that the ‘effort and resources, of schools… are directed disproportionately at those who are already high achieving – the poor but bright’.

It is surprisingly hard to get a grip on the overall national budget for AP. A PQ from early 2011 (Col 75W) supplies a net current expenditure figure for all English local authorities of £530m.

Taylor’s Review fails to offer a comparable figure but my rough estimates based on the per pupil costs he supplies suggest a revenue budget of at least £400m. (Taylor suggests average per pupil costs of £9,500 per year for full-time AP, although PRU places are said to cost between £12,000 and £18,000 per annum.)

I found online a consultation document from Kent – England’s largest local authority – stating its revenue costs at over £11m in FY2014-15. Approximately 454 pupils attended Kent’s AP/PRU provision in 2012-13.

There must also be a significant capital budget. There are around 400 PRUs, not to mention a growing cadre of specialist AP academies and free schools. The total capital cost of the first AP free school – Derby Pride Academy – was £2.147m for a 50 place setting.

In FY2011-12, total annual national expenditure on SEN was £5.77 billion (Col 391W). There will have been some cost-cutting as a consequence of the latest reforms, but the order of magnitude is clear.

The latest version of the SEN Code of Practice outlines the panoply of support available, including the compulsory requirement that each school has a designated teacher to be responsible for co-ordinating SEN provision (the SENCO).

In short, the national budget for AP is sizeable and the national budget for SEN is huge. Per capita expenditure is correspondingly high. If we could isolate the proportion of these budgets allocated to raising the attainment of the ‘poor but dim’, the total would be substantial.

 

Fair Access, especially to Oxbridge, and some related observations

Mr Thomas refers specifically to funding to support fair access to universities – especially Oxbridge – for those from disadvantaged backgrounds. This is another area in which it is hard to get a grasp on total expenditure, not least because of the many small-scale mentoring projects that exist.

Mr Thomas is quite correct to remark on the sheer number of these, although they are relatively small beer in budgetary terms. (One suspects that they would be much more efficient and effective if they could be linked together within some sort of overarching framework.)

The Office for Fair Access (OFFA) estimates University access agreement expenditure on outreach in 2014-15 at £111.9m and this has to be factored in, as does DfE’s own small contribution – the Future Scholar Awards.

Were any expenditure in this territory to be criticised, it would surely be the development and capital costs for new selective 16-19 academies and free schools that specifically give priority to disadvantaged students.

The sums are large, perhaps not outstandingly so compared with national expenditure on SEN for example, but they will almost certainly benefit only a tiny localised proportion of the ‘bright but poor’ population.

There are several such projects around the country. Some of the most prominent are located in London.

The London Academy of Excellence (capacity 420) is fairly typical. It cost an initial £4.7m to establish, plus a lease costing a further £400K annually.

But this is dwarfed by the projected costs of the Harris Westminster Sixth Form, scheduled to open in September 2014. Housed in a former government building, the capital cost is reputed to be £45m for a 500-place institution.

There were reportedly disagreements within Government:

‘It is understood that the £45m cost was subject to a “significant difference of opinion” within the DfE where critics say that by concentrating large resources on the brightest children at a time when budgets are constrained means other children might miss out…

But a spokeswoman for the DfE robustly defended the plans tonight. “This is an inspirational collaboration between the country’s top academy chain and one of the best private schools in the country,” she said. “It will give hundreds of children from low income families across London the kind of top quality sixth-form previously reserved for the better off.”’

Here we have in microcosm the debate to which this post is dedicated.

One blogger – a London College Principal – pointed out that the real issue was not whether the brightest should benefit over others, but how few of the ‘poor but bright’ would do so:

‘£45m could have a transformative effect on thousands of 16-19 year olds across London… £45m could have funded at least 50 extra places in each college for over 10 years, helped build excellent new facilities for all students and created a city-wide network to support gifted and talented students in sixth forms across the capital working with our partner universities and employers.’

 

Summing Up

There are three main elements to the discussion: the point of principle, the inputs and the impact. The following sections deal with each of these in turn.

 

Principle

Put bluntly, should ‘poor but dim’ kids have higher priority for educators than ‘poor but bright’ kids (Mr Thomas’ position) or should all poor kids have equal priority and an equal right to the support they need to achieve their best (the Gifted Phoenix position)?

For Mr Thomas, it seems this priority is determined by whether – and how far – the learner is behind undefined ‘basic levels of attainment’ and/or mastery of ‘the basics’ (presumably literacy and numeracy).

Those below the basic attainment threshold have higher priority than those above it. He does not say so but this logic suggests that those furthest below the threshold are the highest priority and those furthest above are the lowest.

So, pursued to its logical conclusion, this would mean that the highest attainers would get next to no support while a human vegetable would be the highest priority of all.

However, since Mr Thomas’ focus is on marginal benefit, it may be that those nearest the threshold would be first in the queue for scarce resources, because they would require the least effort and resources to lift above it.

This philosophy drives the emphasis on achievement of national benchmarks and predominant focus on borderline candidates that, until recently, dominated our assessment and accountability system.

For Gifted Phoenix, every socio-economically disadvantaged learner has equal priority to the support they need to improve their attainment, by virtue of that disadvantage.

There is no question of elevating some ahead of others in the pecking order because they are further behind on key educational measures since, in effect, that is penalising some disadvantaged learners on the grounds of their ability or, more accurately, their prior attainment.

This philosophy underpins the notion of personalised education and is driving the recent and welcome reforms of the assessment and accountability system, designed to ensure that schools are judged by how well they improve the attainment of all learners, rather than predominantly on the basis of the proportion achieving the standard national benchmarks.

I suggested that, in deriding ‘the romance of the poor but bright’, Mr Thomas ran the risk of falling into ‘the slough of anti-elitism’. He rejected that suggestion, while continuing to emphasise the need to ‘concentrate more’ on ‘those at risk of never being able to engage with society’.

I have made the assumption that Mr Thomas is interested primarily in KS2 and in GCSE or equivalent qualifications at KS4, given his references to KS2 L4, basic skills and ‘paper qualifications needed to enter meaningful employment’.

But his additional references to ‘real qualifications’ (as opposed to paper ones) and engaging with society could well imply a wider range of personal, social and work-related skills for employability and adult life.

My preference for equal priority would apply regardless: there is no guarantee that high attainers from disadvantaged backgrounds will necessarily possess these vital skills.

But, as indicated in the definition above, there is an important distinction to be maintained between:

  • educational support to raise the attainment, learning and employability skills of socio-economically disadvantaged learners and prepare them for adult life and
  • support to manage a range of difficulties – whether behavioural problems, disability, physical or mental impairment – that impact on the broader life chances of the individuals concerned.

Such a distinction may well be masked in the everyday business of providing effective holistic support for learners facing such difficulties, but this debate requires it to be made and sustained given Mr Thomas’s definition of the problem in terms of the comparative treatment of the ‘poor but bright’ and the ‘poor but dim’.

Having made this distinction, it is not clear whether he himself sustains it consistently through to the end of his post. In the final paragraphs the term ‘poor but dim’ begins to morph into a broader notion encompassing all AP and SEN learners regardless of their socio-economic status.

Additional dimensions of disadvantage are potentially being brought into play. This is inconsistent and radically changes the nature of the argument.

 

Inputs

By inputs I mean the resources – financial and human – made available to support the education of ‘dim’ and ‘bright’ disadvantaged learners respectively.

Mr Thomas also shifts his ground as far as inputs are concerned.

His post opens with a statement that ‘the effort and resources’ of schools, charities and businesses are ‘directed disproportionately’ at the poor but bright – and he exemplifies this with reference to fair access to competitive universities, particularly Oxbridge.

When I point out the significant investment in AP compared with fair access, he changes tack – ‘I’m measuring outcomes not just inputs’.

Then later he says ‘But what some need is just more expensive’, to which I respond that ‘the bottom end already has the lion’s share of funding’.

At this point we have both fallen into the trap of treating the entirety of the AP and SEN budgets as focused on the ‘poor but dim’.

We are failing to recognise that they are poor proxies because the majority of AP and SEN learners are not ‘poor’, many are not ‘dim’, these budgets are focused on a wider range of needs and there is significant additional expenditure directed at ‘poor but dim’ learners elsewhere in the wider education budget.

Despite Mr Thomas’s opening claim, it should be reasonably evident from the preceding commentary that my ‘lion’s share’ point is factually correct. His suggestion that AP is ‘largely unknown and wholly unappreciated’ flies in the face of the Taylor Review and the Government’s subsequent work programme.

SEN may depend heavily on the ‘extraordinary effort of dedicated staff’, but at least there are such dedicated staff! There may be inconsistencies in local authority funding and support for SEN, but the global investment is colossal by comparison with the funding dedicated on the other side of the balance.

Gifted Phoenix’s position acknowledges that inputs are heavily loaded in favour of the SEN and AP budgets. This is as it should be since, as Thomas rightly notes, many of the additional services they need are frequently more expensive to provide. These services are not simply dedicated to raising their attainment, but also to tackling more substantive problems associated with their status.

Whether the balance of expenditure on the ‘bright’ and ‘dim’ respectively is optimal is a somewhat different matter. Contrary to Mr Thomas’s position, gifted advocates are often convinced that too much largesse is focused on the latter at the expense of the former.

Turning to advocacy, Mr Thomas says ‘we end up ignoring the more pressing problem’ of the poor but dim. He argues in the Twitter discussion that too few people are advocating for these learners, adding that they are failed ‘because it’s not popular to talk about them’.

I could not resist countering that advocacy for gifted learners is equally unpopular, indeed ‘the word is literally taboo in many settings’. I cannot help thinking – from his footnote reference to ‘epigenetic coding’ – that Mr Thomas is amongst those who are distinctly uncomfortable with the term.

Where advocacy does survive it is focused exclusively on progression to competitive universities and, to some extent, high attainment as a route towards that outcome. The narrative has shifted away from concepts of high ability or giftedness, because of the very limited consensus about that condition (even amongst gifted advocates) and even considerable doubt in some quarters whether it exists at all.

 

Outcomes

Mr Thomas maintains in his post that the successes of his preferred target group ‘have the power to change the British economy, far more so than those of their brighter peers’. This is because ‘the gap most damaging to society is in life outcomes for the children that perform least well at school’.

As noted above, it is important to remember that we are discussing here the addition of educational and economic value by tackling underachievement amongst learners from disadvantaged backgrounds, rather than amongst all the children that perform least well.

We are also leaving to one side the addition of value through any wider engagement by health and social services to improve life chances.

It is quite reasonable to advance the argument that improving the outcomes of ‘the bottom 20%’ (the Tail) will have ‘a huge socio-economic impact’ and ‘make the biggest marginal difference to society’.

But one could equally make the case that society would derive similar or even higher returns from a decision to concentrate disproportionately on the highest attainers (the Smart Fraction).

Or, as Gifted Phoenix would prefer, one could reasonably propose that the optimal returns should be achieved by means of a balanced approach that raises both the floor and the ceiling, avoiding any arbitrary distinctions on the basis of prior attainment.

From the Gifted Phoenix perspective, one should balance the advantages of removing the drag on productivity of an educational underclass against those of developing the high-level human capital needed to drive economic growth and improve our chances of success in what Coalition ministers call the ‘global race’.

According to this perspective, by eliminating excellence gaps between disadvantaged and advantaged high attainers we will secure a stream of benefits broadly commensurate to that at the bottom end.

These will include substantial spillover benefits, achieved as a result of broadening the pool of successful leaders in political, social, educational and artistic fields, not to mention significant improvements in social mobility.

It is even possible to argue that, by creating a larger pool of more highly educated parents, we can also achieve a significant positive impact on the achievement of subsequent generations, thus significantly reducing the size of the tail.

And in the present generation we will create many more role models: young people from disadvantaged backgrounds who become educationally successful and who can influence the aspirations of younger disadvantaged learners.

This avoids the risk that low expectations will be reinforced and perpetuated through a ‘deficit model’ approach that places excessive emphasis on removing the drag from the tail by producing a larger number of ‘useful members of society’.

This line of argument is integral to the Gifted Phoenix Manifesto.

It seems to me entirely conceivable that economists might produce calculations to justify any of these different paths.

But it would be highly inequitable to put all our eggs in the ‘poor but bright’ basket, because that penalises some disadvantaged learners for their failure to achieve high attainment thresholds.

And it would be equally inequitable to focus exclusively on the ‘poor but dim’, because that penalises some disadvantaged learners for their success in becoming high attainers.

The more equitable solution must be to opt for a ‘balanced scorecard’ approach that generates a proportion of the top end benefits and a proportion of the bottom end benefits simultaneously.

There is a risk that this reduces the total flow of benefits, compared with one or other of the inequitable solutions, but there is a trade-off here between efficiency and a socially desirable outcome that balances the competing interests of the two groups.

 

The personal dimension

After we had finished our Twitter exchanges, I thought to research Mr Thomas online. Turns out he’s quite the Big-Cheese-in-Embryo. Provided he escapes the lure of filthy lucre, he’ll be a mover and shaker in education within the next decade.

I couldn’t help noticing his own educational experience – public school, a First in PPE from Oxford, leading light in the Oxford Union – then graduation from Teach First alongside internships with Deutsche Bank and McKinsey.

Now he’s serving his educational apprenticeship as joint curriculum lead for maths at a prominent London Academy. He’s also a trustee of ‘a university mentoring project for highly able 11-14 year old pupils from West London state schools’.

Lucky I didn’t check earlier. Such a glowing CV might have been enough to cow this grammar school Oxbridge reject, even if I did begin this line of work several years before he was born. Not that I have a chip on my shoulder…

The experience set me wondering about the dominant ideology amongst the Teach First cadre, and how it is tempered by extended exposure to teaching in a challenging environment.

There’s more than a hint of idealism about someone from this privileged background espousing the educational philosophy that Mr Thomas professes. But didn’t he wonder where all the disadvantaged people were during his own educational experience, and doesn’t he want to change that too?

His interest in mentoring highly able pupils would suggest that he does, but also seems directly to contradict the position he’s reached here. It would be a pity if the ‘poor but bright’ could not continue to rely on his support, equal in quantity and quality to the support he offers the ‘poor but dim’.

For he could make a huge difference at both ends of the attainment spectrum – and, with his undeniable talents, he should certainly be able to do so.

 

Conclusion

We are entertaining three possible answers to the question whether in principle to prioritise the needs of the ‘poor but bright’ or the ‘poor but dim’:

  • Concentrate principally – perhaps even exclusively – on closing the Excellence Gaps at the top
  • Concentrate principally – perhaps even exclusively – on closing the Foundation Gaps at the bottom
  • Concentrate equally across the attainment spectrum, at the top and bottom and all points in between.

Speaking as an advocate for those at the top, I favour the third option.

It seems to me incontrovertible – though hard to quantify – that, in the English education system, the lion’s share of resources go towards closing the Foundation Gaps.

That is perhaps as it should be, although one could wish that the financial scales were not tipped so excessively in their direction, for ‘poor but bright’ learners do in my view have an equal right to challenge and support, and should not be penalised for their high attainment.

Our current efforts to understand the relative size of the Foundation and Excellence Gaps and how these are changing over time are seriously compromised by the limited data in the public domain.

There is a powerful economic case to be made for prioritising the Foundation Gaps as part of a deliberate strategy for shortening the tail – but an equally powerful case can be constructed for prioritising the Excellence Gaps, as part of a deliberate strategy for increasing the smart fraction.

Neither of these options is optimal from an equity perspective however. The stream of benefits might be compromised somewhat by not focusing exclusively on one or the other, but a balanced approach should otherwise be in our collective best interests.

You may or may not agree. Here is a poll so you can register your vote. Please use the comments facility to share your wider views on this post.

 

 

 

 

Epilogue

 

We must beware the romance of the poor but bright,

But equally beware

The romance of rescuing the helpless

Poor from their sorry plight.

We must ensure the Tail

Does not wag the disadvantaged dog!

 

GP

May 2014

One for The Echo Chamber

 

We must get better at educating clever kids

GP

May 2014

A Closer Look at Level 6

 

This post provides a data-driven analysis of Level 6 (L6) performance at Key Stage 2, so as to:

  • Marshall the published information and provide a commentary that properly reflects this bigger picture;
  • Establish which data is not yet published but ought to be in the public domain;
  • Provide a baseline against which to measure L6 performance in the 2014 SATs; and
  • Initiate discussion about the likely impact of new tests for the full attainment span on the assessment and performance of the highest attainers, both before and after those tests are introduced in 2016.

Following an initial section highlighting key performance data across the three L6 tests – reading; grammar, punctuation and spelling (GPS); and maths – the post undertakes a more detailed examination of L6 achievement in English, maths and science, taking in both teacher assessment and test outcomes.

It  concludes with a summary of key findings reflecting the four purposes above.

Those who prefer not to read the substantive text can jump straight to the summary from here.

I apologise in advance for any transcription errors and statistical shortcomings in the analysis below.

 

Background

 

Relationship with previous posts

This discussion picks up themes explored in several previous posts.

In May 2013 I reviewed an Investigation of Level 6 Key Stage 2 Tests, commissioned by the Department for Education and published in February that year.

My overall assessment of that report?

‘A curate’s egg really. Positive and useful in a small way, not least in reminding us that primary-secondary transition for gifted learners remains problematic, but also a missed opportunity to flag up some other critical issues – and of course heavily overshadowed by the primary assessment consultation on the immediate horizon.’

The performance of the highest primary attainers also featured strongly in an analysis of the outcomes of NAHT’s Commission on Assessment (February 2014) and this parallel piece on the response to the consultation on primary assessment and accountability (April 2014).

The former offered the Commission two particularly pertinent recommendations, namely that it should:

‘shift from its narrow and ‘mildly accelerative’ view of high attainment to accommodate a richer concept that combines enrichment (breadth), extension (depth) and acceleration (faster pace) according to learners’ individual needs.’

Additionally it should:

‘incorporate a fourth ‘far exceeded’ assessment judgement, since the ‘exceeded’ judgement covers too wide a span of attainment.’

 

 

The latter discussed plans to discontinue L6 tests by introducing from 2016 single tests for the full attainment span at the end of KS2, from the top of the P-scales to a level the initial consultation document described as ‘at least of the standard of’ the current L6.

It opined:

‘The task of designing an effective test for all levels of prior attainment at the end of key stage 2 is…fraught with difficulty…I have grave difficulty in understanding how such assessments can be optimal for high attainers and fear that this is bad assessment practice.’

Aspects of L6 performance also featured in a relatively brief review of High Attainment in 2013 Primary School Performance Tables (December 2013). This post expands significantly on the relevant data included in that one.

The new material is drawn from three principal sources:

 

The recent history of L6 tests

Level 6 tests have a rather complex history. The footnotes to SFR 51/2013 simplify this considerably, noting that:

  • L6 tests were initially available from 1995 to 2002
  • In 2010 there was a L6 test for mathematics only
  • Since 2012 there have been tests of reading and mathematics
  • The GPS test was introduced in 2013.

In fact, the 2010 maths test was the culmination of an earlier QCDA pilot of single level tests. In that year the results from the pilot were reported as statutory National Curriculum test results in pilot schools.

In 2011 optional L6 tests were piloted in reading, writing and maths. These were not externally marked and the results were not published.

The June 2011 Bew Report came out in favour:

‘We believe that the Government should continue to provide level 6 National Curriculum Tests for schools to use on an optional basis, whose results should be reported to parents and secondary schools.’

Externally marked L6 tests were offered in reading and maths in 2012, alongside L6 teacher assessment in writing. The GPS test was added to the portfolio in the following year.

In 2012, ministers were talking up the tests, describing them as:

‘…a central element in the Coalition’s drive to ensure that high ability children reach their potential. Nick Gibb, the schools minister, said: “Every child should be given the opportunity to achieve to the best of their abilities.

“These tests will ensure that the brightest pupils are stretched and standards are raised for all.”’

In 2012 the Primary Performance Tables used L6 results only in the calculation of ‘level 5+’, APS, value-added and progress measures, but this was not the case in 2013.

The Statement of Intent on the Tables said:

‘…the percentage of the number of children at the end of Key Stage 2 achieving level 6 in a school will also be shown in performance tables. The Department will not publish any information at school level about the numbers of children entered for the level 6 tests, or the percentage achieving level 6 of those entered for level 6.’

The nature of the tests is unchanged for 2014: they took place on 12, 13 and 15 May respectively. This post is timed to coincide with their administration.

The KS2 ARA booklet  continues to explain that:

‘Children entered for level 6 tests are required to take the levels 3-5 tests. Headteachers should consider a child’s expected attainment before registering them for the level 6 tests as they should be demonstrating attainment above level 5. Schools may register children for the level 6 tests and subsequently withdraw them.

The child must achieve a level 5 in the levels 3-5 test and pass the corresponding level 6 test in the same year in order to be awarded an overall level 6 result. If the child does not pass the level 6 test they will be awarded the level achieved in the levels 3-5 test.’

 

Anticipated future developments

At the time of writing the Government has not published a Statement of Intent explaining whether there will be any change in the reporting of L6 results in the December 2014 Primary School Performance Tables.

An accompanying Data Warehouse (aka Portal) is also under development and early iterations are expected to appear before the next set of Tables. The Portal will make available a wider range of performance data, some of it addressing high attainment.

The discussion in this post of material not yet in the public domain is designed in part as a marker to influence consideration of material for inclusion in the Portal.

As noted above, the Government has published its response to the consultation on primary assessment and accountability arrangements, confirming that new single assessments for the full attainment span will be introduced in 2016.

At the time of writing, there is no published information about the number of entries for the 2014 tests. (In 2013 these details were released in the reply to a Parliamentary Question.)

Entries had to be confirmed by March 2014, so it may be that the decision to replace the L6 tests, not confirmed until that same month, has not impacted negatively on demand. The effect on 2015 entries remains to be seen, but there is a real risk that these will be significantly depressed.

L6 tests are scheduled to be taken for the final time in May 2015. The reading and maths tests will have been in place for four consecutive years; the GPS test for three.

Under the new arrangements there will continue to be tests in reading, GPS and maths – plus a sampling test in science – as well as teacher assessment in reading, writing, maths and science.

KS2 test outcomes (but not teacher assessment) will be reported by means of a scaled score for each test, alongside three average scaled scores, for the school, the local area and nationally.

The original consultation document proposed that each scaled score would be built around a ‘secondary readiness standard’ loosely aligned with the current L4B, but converted into a score of 100.

The test development frameworks mention that:

‘at the extremes of the scaled score distribution, as is standard practice, the scores will be truncated such that above and below a certain point, all children will be awarded the same scaled score in order to minimise the effect for children at the ends of the distribution where the test is not measuring optimally.’

A full set of sample materials including tests and mark schemes for every test will be published by September 2015, the beginning of the academic year in which the new tests are first deployed.

The consultation document said these single tests would:

‘include challenging material (at least of the standard of the current level 6 test) which all pupils will have the opportunity to answer, without the need for a separate test’.

The development frameworks published on 31 March made it clear that the new tests should:

‘provide a suitable challenge for all children and give every child the opportunity to achieve as high a standard…as possible.’

Additionally:

‘In order to improve general accessibility for all children, where possible, questions will be placed in order of difficulty.’

These various and potentially conflicting statements informed the opinion I have already repeated.

The question then arises whether the Government’s U turn on separate tests for the highest attainers is in the latter’s best interests. There cannot be a continuation of L6 tests per se, because the system of levels that underpins them will no longer exist, but separate tests could in principle continue.

Even if the new universal tests provide equally valid and reliable judgements of their attainment – which is currently open to question – one might reasonably argue that the U turn itself may undermine continuity of provision and continued improvement in schools’ practice.

The fact that this practice needs substantive improvement is evidenced by Ofsted’s recent decision to strengthen the attention given to the attainment and progress of what they call ‘the most able’ in all school inspection reports.

 

L6 tests: Key Performance Data

 

Entry and success rates

As noted above, the information in the public domain about entry rates to L6 tests is incomplete.

The 2013 Investigation provides the number of pupils entered for each test in 2012. We do not have comparable data for 2013, but a PQ reply does supply the number of pupils registered for the tests in both 2012 and 2013. This can be supplemented by material in the 2013 SFR and the corresponding 2012 publication.

The available data is synthesised in this table showing for each year – and where available – the number registered for each test, the number entered, the total number of pupils achieving L6 and, of those, the number attending state-funded schools.

 

                       2012                                2013
          Reg      Ent      Pass     Pass SF    Reg      Ent    Pass     Pass SF
Reading   47,148   46,810   942      x          73,118   x      2,262    2,137
GPS       x        x        x        x          61,883   x      8,606    x
Maths     55,809   55,212   18,953   x          80,925   x      35,137   33,202

 

One can see that there are relatively small differences between the numbers of pupils registered and the number entered, so the former is a decent enough proxy for the latter. I shall use the former in the calculations immediately below.

It is also evident that the proportion of successful learners attending independent schools is small but not negligible. But, given the incomplete data set for state-funded schools, I shall use the pass rate for all schools in the following calculations.

In sum then, in 2012, the pass rates per registered entry were:

  • Reading – 2.0%
  • Maths – 34.0%

And in 2013 they were:

  • Reading – 3.1%
  • GPS – 13.9%
  • Maths – 43.4%
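
These pass rates are simple ratios of passes to registered entries. As a rough arithmetic check, here is a minimal sketch using only the published figures from the table above (registration is used as a proxy for entry, as noted):

```python
# L6 pass rate per registered entry: passes / registrations.
# Figures are the published numbers from the table above; registrations
# stand in for entry counts, which are incomplete.
figures = {
    ("reading", 2012): (942, 47_148),
    ("maths",   2012): (18_953, 55_809),
    ("reading", 2013): (2_262, 73_118),
    ("GPS",     2013): (8_606, 61_883),
    ("maths",   2013): (35_137, 80_925),
}

for (test, year), (passes, registered) in figures.items():
    rate = 100 * passes / registered
    print(f"{year} {test}: {rate:.1f}%")
```

Running this reproduces the percentages quoted in the bullets: 2.0% and 34.0% in 2012; 3.1%, 13.9% and 43.4% in 2013.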

The pass rates in 2013 have improved significantly in both reading and maths, the former from a very low base. However, the proportion of learners successful in the L6 reading test remains extremely small.

The 2013 Investigation asserted, on the basis of the 2012 results, that:

‘The cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits’

However it did not publish any information about that cost.

It went on to suggest that there is a case for reviewing whether the L6 test is the most appropriate means to  ‘identify a range of higher performing pupils, for example the top 10%’. The Government chose not to act on this suggestion.

 

Gender, ethnic background and disadvantage

The 2013 results demonstrate some very significant gender disparities, as revealed in Chart 1 below.

Girls account for 62% of successful pupils in GPS and a whopping 74% in reading, while boys account for 61% of successful pupils in maths. These imbalances raise important questions about whether gender differences in high attainment are really this pronounced, or whether there is significant underachievement amongst the under-represented gender in each case.

 

Chart 1: Number of pupils successful in 2013 L6 tests by gender

L6 chart 1

 

There are equally significant disparities in performance by ethnic background. Chart 2 below illustrates how the performance of three selected ethnic minority groups – white, Asian and Chinese – varies by test and gender.

It shows that pupils from Chinese backgrounds have a marked ascendancy in all three tests, while Asian pupils are ahead of white pupils in GPS and maths but not reading. Within all three ethnic groups, girls lead in reading and GPS while boys lead in maths. Chinese girls comfortably out-perform white and Asian boys.

Chinese pupils are way ahead in maths, with 29% overall achieving L6 and an astonishing 35% of Chinese boys achieving this outcome.

The reasons for this vast disparity are not explained and raise equally awkward questions about the distribution of high attainment and the incidence of underachievement.

 

Chart 2: Percentages of pupils successful in 2013 L6 tests by gender and selected ethnic background

L6 chart2

 

There are also significant excellence gaps on each of the tests, though these are hard to visualise when working solely with percentages (pupil numbers have not been published).

The percentage variations are shown in the table below. This sets out the FSM gap and the disadvantaged gap, the latter being based on the ever-6 FSM measure that underpins the Pupil Premium.

These figures suggest that, while learners eligible for the Pupil Premium are demonstrating success on the maths test (and, for girls at least, on the GPS test too), they are less than a third as likely to be successful as those from advantaged backgrounds. The impact of the Pupil Premium is therefore limited.

The gap between the two groups reaches as high as seven percentage points for boys in maths. Although this is low by comparison with the corresponding gap at Level 4, it is nonetheless significant. There is more about excellence gaps in maths below.

 

           Reading       GPS          Maths
           G      B      G      B     G      B
FSM        0      0      1      0     2      3
Non-FSM    1      0      2      1     6      9
Gap        1      0      1      1     4      6

Dis        0      0      1      0     2      3
Non-Dis    1      0      3      2     7      10
Gap        1      0      2      2     5      7
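
The gap rows are simple percentage-point differences, and the ‘less than a third as likely’ claim follows from the same rounded figures. A minimal sketch using the boys’ maths percentages from the table above:

```python
# Excellence gap for boys in L6 maths, from the published (rounded)
# percentages: 3% of disadvantaged boys vs 10% of non-disadvantaged boys.
dis, non_dis = 3, 10

gap_points = non_dis - dis   # gap expressed in percentage points
ratio = non_dis / dis        # relative likelihood of achieving L6

print(f"gap: {gap_points} percentage points")   # gap: 7 percentage points
print(f"ratio: {ratio:.1f}x")                   # ratio: 3.3x
```

Because the published percentages are rounded to whole numbers, the ratio is indicative only.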

 

 

Schools achieving L6 success

Finally in this opening section, a comparison of schools achieving L6 success in the 2013 Primary School Performance Tables reveals different patterns for each test.

The table below shows how many schools secured different percentages of pupils at L6. The number of schools achieving 11-20% at L6 in the GPS test is over twelve times the number that achieved that outcome in reading. But over eight times more schools secured this outcome in maths than managed it in GPS.

No schools made it beyond 20% at L6 in reading and none pushed beyond 40% at L6 in GPS, but the outliers in maths managed well over 60% and even 70% returns.

 

          11-20%  21-30%  31-40%  41-50%  51-60%  61-70%  71-80%  Total
Reading   24      -       -       -       -       -       -       24
GPS       298     22      2       -       -       -       -       322
Maths     2,521   531     106     25      0       1       2       3,186

 

There is also some evidence of schools being successful in more than one test.

Amongst the small sample of 28 schools that secured 41% or more L6s in maths, two also featured amongst the top 24 performers in reading and five amongst the top 24 performers in GPS.

The school with arguably the best record across all three tests is Christ Church Primary School in Hampstead, which secured 13% in reading, 21% in GPS and 46% in maths, from a KS2 cohort of 24. The FSM/Pupil Premium rates at the school are low but, nevertheless, this is an outstanding result.

The following sections look more closely at L6 test and teacher assessment results in each subject. Each section consists of a series of bullet points highlighting significant findings.

 

English

 

Reading Test

The evidence on performance on the L6 reading test is compromised to some extent by the tiny proportions of pupils that achieve it. However:

  • 9,605 schools registered pupils for the 2013 L6 reading test, up 48% from 6,469 in 2012, and the number of pupils registered increased from 47,148 in 2012 to 73,118 in 2013, an increase of 55%.
  • Of the 539,473 learners who undertook the 2013 KS2 reading tests, only 2,262 (about 0.42%) achieved L6. This figure includes some in independent schools; the comparable figure for state-funded schools only is 2,137, so 5.5% of L6s were secured in the independent sector.
  • Of this first total – ie including pupils from independent schools – 1,670 were girls (0.63% of all girls who undertook the KS2 reading tests) and 592 were boys (0.21% of all boys who undertook the KS2 reading tests).
  • These are significant improvements on the comparable 2012 figures which showed about 900 learners achieving L6, including 700 girls and 200 boys. (The figures were rounded in the SFR but the 2013 evaluation confirmed the actual number as 942). The overall percentage achieving L6 therefore increased by about 140% in 2013, compared with 2012. If we assume registration for L6 tests as a proxy for entry, this suggests that just over 3% of entrants passed in 2013.
  • In state-funded schools only, the percentage of learners from a Chinese background entered for KS2 reading tests who achieved L6 reaches 2%, compared with 1% for those of mixed background and 0% for learners from white, Asian and black backgrounds.
  • Amongst the defined sub-groups, learners of Irish, any other white, white and Asian and any other Asian backgrounds also make it to 1%. All the remainder are at 0%.
  • The same is true of EAL learners and native English speakers, FSM-eligible and disadvantaged learners, making worthwhile comparisons almost impossible.
  • The 2013 transition matrices show that 12% of learners who had achieved L4 at the end of KS1 went on to achieve L6, while 1% of those who had achieved L3 did so. Hence the vast majority of those at L4 in KS1 did not make two levels of progress.
  • Progression data in the SFR shows that, of the 2,137 learners achieving L6 in state funded schools, 2,047 were at L3 or above at KS1, 77 were at L2A, 10 were at L2B and 3 were at L2C. Of the total population at KS1 L3 or above, 1.8% progressed to L6.
  • Regional and local authority breakdowns are given only as percentages, of limited value for comparative purposes because they are so small. Only London and the South East record 1% at L6 overall, with all the remaining regions at 0%. Only one local authority – Richmond upon Thames – reaches 2%.
  • However 1% of girls reach L6 in all regions apart from Yorkshire and Humberside and a few more authorities record 2% of girls at L6: Camden, Hammersmith and Fulham, Kensington and Chelsea, Kingston, Richmond and Solihull.
  • The 2013 Primary School Performance Tables show that some 12,700 schools recorded no learners achieving L6.
  • At the other end of the spectrum, 36 schools recorded 10% or more of their KS2 cohort achieving L6. Four of these recorded 15% or higher:

Iford and Kingston C of E Primary School, East Sussex (19%; cohort of 21).

Emmanuel C of E Primary School, Camden (17%; cohort of 12).

Goosnargh Whitechapel Primary School, Lancashire (17%; cohort of 6).

High Beech  C of E VC Primary School, Essex (15%; cohort of 13).

 

Reading TA

There is relatively little data about teacher assessment outcomes.

  • The total number of pupils in all schools achieving L6 in reading TA in 2013 is 15,864 from a cohort of 539,729 (2.94%). This is over seven times as many as achieved L6 in the comparable test (whereas in maths the figures are very similar). It would be useful to know how many pupils achieved L6 in TA, were entered for the test and did not succeed.
  • The number of successful girls is 10,166 (3.85% of females assessed) and the number of boys achieving L6 is 5,698 (2.06% of males assessed). Hence the gap between girls and boys is far narrower on TA than it is on the corresponding test.
  • Within the 2013 Performance Tables, eight schools recorded 50% or more of their pupils at L6, the top performer being Peppard Church of England Primary School, Oxfordshire, which reached 83% (five from a cohort of six).

 

Writing (including GPS)

 

GPS Test

The L6 Grammar, Punctuation and Spelling (GPS) test was newly introduced in 2013. This is what we know from the published data:

  • The number of schools that registered for the test was 7,870, almost 2,000 fewer than registered for the reading test. The number of pupil registrations was 61,883, over 12,000 fewer than for reading.
  • The total number of successful learners is 8,606, from a total of 539,438 learners assessed at KS2, including those in independent schools taking the tests, giving an actual percentage of 1.6%. As far as I can establish, a comparable figure for state-funded schools is not available.
  • As with reading, there are significant differences between boys and girls. There were 5,373 successful girls (2.04% of girls entered for KS2 GPS tests) and 3,233 successful boys (1.17% of boys entered for KS2 GPS). This imbalance in favour of girls is significant, but not nearly as pronounced as in the reading test.
  • The proportion of pupil registrations for the L6 GPS test resulting in L6 success is around one in seven (13.9%), well over four times as high as for reading.
  • The ethnic breakdown in state-funded schools shows that Chinese learners are again in the ascendancy. Overall, 7% of pupils from a Chinese background achieved L6, compared with 1% white, 2% mixed, 2% Asian and 1% black.
  • Chart 3 below shows how L6 achievement in GPS varies between ethnic sub-groups. Indian pupils reach 4% while white and Asian pupils score 3%, as do pupils from any other Asian background.

 

Chart 3: 2013 GPS L6 performance by ethnic sub-groups

L6 chart 3

 

  • When gender differences are taken into account, Chinese girls are at 8% (compared with boys at 7%), ahead of Indian girls at 5% (boys 3%), white and Asian girls at 4% (boys 3%) and any other Asian girls also at 4% (boys 3%). The ascendancy of Chinese girls over boys from any other ethnic background is particularly noteworthy and replicates the situation in maths (see below).
  • Interestingly, EAL learners and learners with English as a native language both record 2% at L6. Although these figures are rounded, it suggests that exceptional performance in this aspect of English does not correlate with being a native speaker.
  • FSM-eligible learners register 0%, compared with 2% for those not eligible. However, disadvantaged learners are at 1% and non-disadvantaged 2% (Disadvantaged boys are at 0% and non-disadvantaged girls at 3%). Without knowing the numbers involved we can draw few reliable conclusions from this data.
  • Chart 4 below illustrates the regional breakdown for boys, girls and both genders. At regional level, London reaches 3% success overall, with both the South East and Eastern regions at 2% and all other regions at 1%. Girls record 2% in every region apart from the North West and Yorkshire and Humberside. Only in London do boys reach 2%.

 

Chart 4: 2013 L6 GPS outcomes by gender and region

L6 chart 4

 

  • At local authority level the highest scoring are Richmond (7%); the Isles of Scilly (6%); Kingston and Sutton (5%); and Harrow, Hillingdon and Wokingham (4%).
  • The School Performance Tables reveal that some 10,200 schools posted no L6 results while, at the other extreme, 34 schools recorded 20% or more of their KS2 cohort at L6 and 463 schools managed 10% or above. The best records were achieved by:

St Joseph’s Catholic Primary School, Southwark (38%; cohort of 24).

The Vineyard School, Richmond  (38%; cohort of 56).

Cartmel C of E Primary School (29%; cohort of 7) and

Greystoke School (29%; cohort of 7).

 

Writing TA

When it comes to teacher assessment:

  • 8,410 learners from both state and independent schools out of a total of 539,732 assessed (1.56%) were judged to be at L6 in writing. The total figure for state-funded schools is 7,877 pupils. This is very close to the number successful in the L6 GPS test, even though the focus is somewhat different.
  • Of these, 5,549 are girls (2.1% of the total cohort) and 2,861 boys (1.04% of the total cohort). Hence the imbalance in favour of girls is more pronounced in writing TA than in the GPS test, whereas the reverse is true for reading. 
  • About 5% of learners from Chinese backgrounds achieve L6, as do 3% of white and Asian pupils and 3% of Irish pupils.
  • The 2013 transition matrices record progression in writing TA, rather than in the GPS test. They show that 61% of those assessed at L4 at KS1 go on to achieve L6, so only six out of ten are making the expected minimum two levels of progress. On the other hand, some 9% of those with KS1 L3 go on to achieve L6, as do 2% of those at L2A.
  • The SFR provides further progression data – again based on the TA outcomes – for state-funded schools only. It shows us that one pupil working towards L1 at KS1 went on to achieve L6 at KS2, as did 11 at L1, 54 at L2C, 393 at L2B, 1,724 at L2A and 5,694 at L3 or above. Hence some pupils are making five or more levels of progress.
  • The regional breakdown – this time including independent schools – gives the East Midlands, West Midlands, London and the South West at 2%, with all the rest at 1%. At local authority level, the best performers are: City of London at 10%; Greenwich, Kensington and Chelsea and Richmond at 5% and Windsor and Maidenhead at 4%.

 

English TA

There is additionally a little information about pupils achieving L6 across the subject:

  • The SFR confirms that 8,087 pupils (1.5%) were assessed at L6 in English, including 5,244 girls (1.99% of all girls entered) and 2,843 boys (1.03% of all boys entered). These figures are for all schools, including independent schools.
  • There is a regional breakdown showing the East and West Midlands, London and the South West at 2%, with all the remainder at 1%. Amongst local authorities, the strongest performers are City of London (10%); and Bristol, Greenwich, Hackney, Richmond, Windsor and Maidenhead (4%). The exceptional performance of Bristol, Greenwich and Hackney is noteworthy.
  • In the Performance Tables, 27 schools record 30% or more pupils at L6 across English, the top performer again being Newton Farm, at 60%.

 

Maths

L6 performance in maths is more common than in other tests and subjects and the higher percentages generated typically result in more meaningful comparisons.

  • The number of school registrations for L6 maths in 2013 was 11,369, up almost 40% from 8,130 in 2012. The number of pupil registrations was 80,925, up some 45% from 55,809 in 2012.
  • The number of successful pupils – in both independent and state schools – was 35,137 (6.51% of all entrants). The gender imbalance in reading and GPS is reversed, with 21,388 boys at this level (7.75% of males entered for the overall KS2 test) compared with 13,749 girls (5.22% of females entered for the test). The SFR gives a total for state-funded schools of 33,202 pupils, so some 5.5% of Level 6s were achieved in independent schools.
  • Compared with 2012, the number of successful pupils has increased from 18,953 – a rise of 85%: not as steep as the increase for reading, but very substantial nevertheless.
  • The number of successful girls has risen by some 108% from 6,600 (rounded) and the number of successful boys by about 72%, from 12,400 (rounded), so the improvement in girls’ success is markedly larger than the corresponding improvement for boys.  
  • Assuming L6 test registration as a proxy for entry, the success rate in 2013 is around 43.4%, massively better than for reading (3%) and GPS (13.9%). The corresponding success rate in 2012 was around 34%. (Slightly different results would be obtained if one used actual entry rates and passes for state schools only, but we do not have these figures for both years.)
  • The breakdown in state-funded schools for the main ethnic groups by gender is illustrated by Chart 5 below. This shows how performance by boys and girls varies according to whether they are white (W), mixed (M), Asian (A), black (B) or Chinese (C). It also compares the outcomes in 2012 and 2013. The superior performance of Chinese learners is evident, with Chinese boys reaching a staggering 35% success rate in 2013. As things stand, Chinese boys are almost nine times more likely to achieve L6 than black girls.
  • Chart 5 also shows that none of the gender or ethnic patterns has changed between 2012 and 2013, but some groups are making faster progress, albeit from a low base. This is especially true of white girls, black boys and, to a slightly lesser extent, Asian girls.
  • Chinese girls and boys have improved at roughly the same rate and black boys have progressed faster than black girls but, in the remaining three groups, girls are improving at a faster rate than boys.
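For readers who want to check the arithmetic, the success rate and year-on-year rise quoted in the bullets above can be reproduced from the cited figures (a minimal sketch; the helper function is mine, and registrations stand in for entries, as in the text):

```python
# Reproducing the L6 maths figures quoted above.
# Registrations are used as a proxy for entries, as in the text.

def pct(part, whole):
    """Percentage of `part` in `whole`, to one decimal place."""
    return round(100 * part / whole, 1)

registrations_2013 = 80_925   # pupil registrations for the L6 maths test
passes_2013 = 35_137          # successful pupils, all schools
passes_2012 = 18_953

success_rate_2013 = pct(passes_2013, registrations_2013)      # 43.4
rise_in_passes = pct(passes_2013 - passes_2012, passes_2012)  # 85.4
```

The same two-line calculation gives the reading (3%) and GPS (13.9%) success rates quoted earlier when their registration and pass figures are substituted.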

 

Chart 5: L6 Maths test by main ethnic groups and gender

L6 chart 5

 

  • Amongst sub-groups, not included in Chart 5, the highest performing are: any other Asian background 15%, Indian 14%, white and Asian 11% and Irish 10%. Figures for Gypsy/Roma and any other white background are suppressed, while travellers of Irish heritage are at 0%, black Caribbean at 2% and any other black background at 3%. In these latter cases, the differential with Chinese performance is huge.
  • EAL learners record a 7% success rate, compared with 6% for native English speakers, an improvement on the level pegging recorded for GPS. This gap widens to two points for boys (9% versus 7% in favour of EAL), whereas for girls it is one point (6% versus 5%, again in favour of EAL). The advantage enjoyed by EAL learners was also evident in 2012.
  • The table below shows the position for FSM and disadvantaged learners by gender, and how this has changed since 2012.

 

              FSM   Non-FSM   Gap     Dis   Non-dis   Gap
Boys   2012    1%        5%    4%      1%        6%    5%
       2013    3%        9%    6%      3%       10%    7%
Girls  2012    1%        3%    2%      1%        3%    2%
       2013    2%        6%    4%      2%        7%    5%
All    2012    1%        4%    3%      1%        4%    3%
       2013    2%        7%    5%      2%        8%    6%

 

  • This shows that the gap between FSM and non-FSM pupils, and between disadvantaged and non-disadvantaged pupils, grew between 2012 and 2013 – for boys, for girls and for the groups as a whole. All the gaps widened by two or three percentage points, with the larger increases affecting the disadvantaged/non-disadvantaged gap, both for girls and for boys and girls together.
  • The gaps are all between 2% and 7%, so not large compared with those lower down the attainment spectrum, but the fact that they are widening is a significant cause for concern, suggesting that Pupil Premium funding is not having an impact at L6 in maths.
  • The Transition Matrices show that 89% of learners assessed at L4 in KS1 went on to achieve L6, while 26% of those with L3 at KS1 did so, as did 4% of those with L2A and 1% of those with L2B. Hence a noticeable minority is making four levels of progress.
  • The progression data in the SFR, relating to state-funded schools, show that one pupil made it from W at KS1 to L6, while 8 had L1, 82 had 2C, 751 had 2B, 4,983 had 2A and 27,377 had L3. Once again, a small minority of learners is making four or five levels of progress.
  • At regional level, the breakdown is: NE 6%, NW 6%, Y+H 5%, EM 6%, WM 6%, E 6%, London 9%, SE 7% and SW 6%. So London has a clear lead in respect of the proportion of its learners achieving L6.
  • The local authorities leading the rankings are: City of London 24%, Richmond 19%, Isles of Scilly 17%, Harrow and Kingston 15%, Trafford and Sutton 14%. No real surprises there!
  • The Performance Tables show 33 schools achieved 40% or higher on this measure. Eight schools were at 50% or above. The best performing schools were:

St Oswald’s C of E Aided Primary School, Cheshire West and Chester (75%; cohort 8)

St Joseph’s Roman Catholic Primary School, Hurst Green, Lancashire (71%; cohort 7)

Haselor School, Warwickshire (67%; cohort 6).

  • Some of the schools achieving 50% were significantly larger, notably Bowdon C of E Primary School, Trafford, which had a KS2 cohort of 60.

 

Maths TA

The data available on maths TA is more limited:

  • Including pupils at independent schools, a total of 33,668 pupils were assessed at L6 in maths (6.24% of all KS2 candidates). This included 20,336 boys (7.37% of all male KS2 candidates) and 13,332 girls (5.06% of all female candidates). The number achieving L6 maths TA is slightly lower than the corresponding number achieving L6 in the test.
  • The regional breakdown was as follows: NE 5%; NW 5%; Y+H 5%; EM 5%, WM 6%; E 6%, London 8%; SE 7%, SW 6%, so London’s ascendancy is not as significant as in the test. 
  • The strongest local authority performers are: City of London 24%; Harrow and Richmond 15%; Sutton 14%; Trafford 13%; Solihull and Bromley 12%.
  • In the Performance Tables, 63 schools recorded 40% or higher on this measure, 15 of them at 50% or higher. The top performer was St Oswald’s C of E Aided Primary School (see above) with 88%.

 

Science

Science data is confined to teacher assessment outcomes.

  • A total of just 1,633 pupils achieved L6 in 2013, equivalent to 0.3% of the KS2 science cohort. Of these, 1,029 were boys (0.37%) and 604 were girls (0.23%), suggesting a gender imbalance broadly similar to that in maths.
  • No region recorded a success rate of 1%, and only a handful of local authorities did so.
  • In the Performance Tables, 31 schools managed 20% or higher and seven schools were above 30%. The best performing were:

Newton Farm (see above) (50%; cohort 30)

Hunsdon Junior Mixed and Infant School, Hertfordshire (40%; cohort 10)

Etchingham Church of England Primary School, East Sussex (38%; cohort 16)

St Benedict’s Roman Catholic Primary School Ampleforth, North Yorkshire (36%; cohort 14).

 

Conclusions

 

Key findings from this data analysis

I will not repeat all of the significant points highlighted above, but these seem particularly worthy of attention and further analysis:

  • The huge variation in success rates for the three L6 tests. The proportion of learners achieving L6 in the reading test is improving at a faster rate than in maths, but from a very low base. It remains unacceptably low, is significantly out of kilter with the TA results for L6 reading and – unless there has been a major improvement in 2014 – is likely to stay depressed for the limited remaining lifetime of the test.
  • In the tests, 74% of those successful in reading are girls, 62% of those successful in GPS are girls and 61% of those successful in maths are boys. In reading there are also interesting disparities between gender distribution at L6 in the test and in teacher assessment. Can these differences be attributed solely to gender distinctions or is there significant gender-related underachievement at the top of the attainment distribution? If so, how can this be addressed? 
  • There are also big variations in performance by ethnic background. Chinese learners in particular are hugely successful, especially in maths. In 2013, Chinese girls significantly outscored boys from all other backgrounds, while an astonishing 35% of Chinese boys achieved L6. This raises important questions about the distribution of high attainment, the incidence of underachievement and how the interaction between gender and ethnic background impacts on these.
  • There are almost certainly significant excellence gaps in performance on all three tests (ie between advantaged and disadvantaged learners), though in reading and GPS these are masked by the absence of numerical data. In maths we can see that the gaps are not as large as those lower down the attainment spectrum, but they widened significantly in 2013 compared with 2012. This suggests that the impact of the Pupil Premium on the performance of the highest attainers from disadvantaged backgrounds is extremely limited.  What can and should be done to address this issue?
  • EAL learners perform as well as their counterparts in the GPS test and even better in maths. This raises interesting questions about the relationship between language acquisition and mathematical performance and, even more intriguingly, the relationship between language acquisition and skill in manipulating language in its written form. Further analysis of why EAL learners are so successful may provide helpful clues that would improve L6 teaching for all learners.
  • Schools are recording very different success rates in each of the tests. Some schools that secure very high L6 success rates in one test fail to do so in the others, but a handful of schools are strong performers across all three tests. We should know more than we do about the characteristics and practices of these highly successful schools.

 

Significant gaps in the data

A data portal to underpin the School Performance Tables is under construction. There have been indications that it will contain material about high attainers’ performance. While levels continue to be used in the Tables, this should include comprehensive coverage of L6 performance, as well as addressing the achievement of high attainers as they are defined for Performance Table purposes (a much broader subset of learners).

Subject to the need to suppress small numbers for data protection purposes, the portal might reasonably include, in addition to the data currently available:

  • For each test and TA, numbers of registrations, entries and successful pupils from FSM and disadvantaged backgrounds respectively, including analysis by gender and ethnic background, both separately and combined. All the data below should also be available for these subsets of the population.
  • Registrations and entries for each L6 test, for every year in which the tests have been administered, showing separately rates for state-funded and all schools and rates for different types of state-funded school.
  • Cross-referencing of L6 test and TA performance, to show how many learners are successful in one, the other and both – as well as how many learners achieve L6 on more than one test and/or TA and different combinations of assessments.
  • Numbers of pupils successful in each test and TA by region and LA, as well as regional breakdowns of the data above and below.
  • Trends in this data across all the years in which the tests and TA have been administered.
  • The annual cost of developing and administering each of the L6 tests so we can make a judgement about value for money.

It would also be helpful to produce case studies of schools that are especially successful in maximising L6 performance, especially for under-represented groups.

 

The impact of the new tests pre- and post-2016

We do not yet know whether the announcement that L6 tests will disappear after 2015 has depressed registration, entry and success rates in 2014. This is more likely in 2015, since the 2014 registration deadline and the response to the primary assessment and accountability consultation were broadly co-terminous.

All the signs are that the accountability regime will continue to focus some attention on the performance of high attainers:

  • Ofsted is placing renewed emphasis on the attainment and progress of the ‘most able’ in school inspection, though they have a broad conceptualisation of that term and may not necessarily highlight L6 achievement.
  • From 2016, schools will be required to publish ‘the percentage of pupils who achieve a high score in all areas at the end of key stage 2.’ But we do not know whether this means publishing separately the percentage of pupils achieving high scores in each area, or only the percentage of pupils achieving high scores across all areas. Nor do we know what will count as a high score for these purposes.
  • There were commitments in the original primary assessment and accountability consultation document to inclusion of measures in the Primary Performance Tables setting out:

‘How many of a school’s pupils are amongst the highest attaining nationally, by showing the percentage of pupils achieving a high scaled score in each subject.’

but these were not repeated in the consultation response.

In short, there are several unanswered questions and some cause to doubt the extent to which Level 6-equivalent performance will continue to be a priority. The removal of L6 tests could therefore reduce significantly the attention primary schools give to their highest attainers.

Moreover, questions remain over the suitability of the new tests for these highest attainers. These may possibly be overcome but there is considerable cause for concern.

It is quite conceivable that the test developers will not be able to accommodate effective assessment of L6 performance within single tests as planned.

If that is the case, the Government faces a choice between perpetuating separate tests, or the effective relegation of the assessment of the highest attainers to teacher assessment alone.

Such a decision would almost certainly need to be taken on this side of a General Election. But of course it need not be binding on the successor administration. Labour has made no commitments about support for high attainers, which suggests they will not be a priority for them should they form the next Government.

The recently published Assessment Principles are intended to underpin effective assessment systems within schools. They state that such systems:

‘Differentiate attainment between pupils of different abilities, giving early recognition of pupils who are falling behind and those who are excelling.’

This lends welcome support to the recommendations I offered to NAHT’s Commission on Assessment.

But the national system for assessment and accountability has an equally strong responsibility to differentiate throughout the attainment spectrum and to recognise the achievement of those who excel.

As things stand, there must be some doubt whether it will do so.

 

Postscript

On 19 May 2014 two newspapers helpfully provided the entry figures for the 2014 L6 tests. These are included in the chart below.

 

L6 postscript chart

 

It is clear that entries to all three tests held up well in 2014 and, as predicted, numbers have not yet been depressed as a consequence of the decision to drop L6 tests after 2015.

The corresponding figures for the numbers of schools entering learners for each test have not been released, so we do not know to what extent the increase is driven by new schools signing up, as opposed to schools with previous entries increasing the numbers they enter.

This additional information makes it easier to project approximate trends into 2015, so we shall be able to tell next year whether the change of assessment policy will cause entry rates to tail off.

  • Entries for the L6 reading test were 49% up in 2013 and 36% up in 2014. Assuming the rate of increase in 2015 falls to 23% (ie again 13% down on the previous year), there would be some 117,000 entries in 2015.
  • Entries for the L6 maths test were 41% up in 2013 and 36% up in 2014. Assuming the rate of increase in 2015 falls to 31% (ie again 5% down on the previous year), there would be around 139,000 entries in 2015.
  • GPS is more problematic because we have only two years on which to base the trend. If we assume that the rate of increase in entries will fall somewhere between the rate for maths and the rate for reading in 2014 (their second year of operation) there would be somewhere between 126,000 and 133,000 entries in 2015 – so approximately 130,000 entries.
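The projection method in the bullets above can be sketched as follows. The 2014 entry counts appear only in the chart, so the base figure used here (95,000 for reading) is an illustrative round number consistent with the ~117,000 projection, not a figure taken from the source:

```python
# Sketch of the entry projection used above: assume the 2015 growth
# rate falls by the same number of percentage points as it did between
# 2013 and 2014, then apply it to the 2014 entry count.
# NB: 95,000 is an illustrative base for reading, not a source figure.

def project_2015(entries_2014, growth_2013_pct, growth_2014_pct):
    """Project 2015 entries by extrapolating the growth-rate slowdown."""
    slowdown = growth_2013_pct - growth_2014_pct
    growth_2015_pct = growth_2014_pct - slowdown
    return entries_2014 * (1 + growth_2015_pct / 100), growth_2015_pct

# Reading: up 49% in 2013, 36% in 2014, so assume +23% in 2015.
projected_reading, assumed_rate = project_2015(95_000, 49, 36)
# assumed_rate is 23; projected_reading is ~116,850, close to the
# ~117,000 quoted above.
```

Substituting the maths figures (41% and 36% growth, hence an assumed 31% in 2015) gives the ~139,000 projection in the same way.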

It is almost certainly a projection too far to estimate the 2014 pass rates on the basis of the 2014 entry rates, so I will resist the temptation. Nevertheless, we ought to expect continued improvement at broadly commensurate rates.

The press stories include a Government ‘line to take’ on the L6 tests.

In the Telegraph, this is:

‘[We] want to see every school stretching all their pupils and these figures show that primary schools have embraced the opportunity to stretch their brightest 11-year-olds.’

‘This is part of a package of measures – along with toughening up existing primary school tests, raising the bar and introducing higher floor standards – that will raise standards and help ensure all children arrive at secondary school ready to thrive.’

In the Mail it is:

‘We brought back these tests because we wanted to give teachers the chance to set high aspirations for pupils in literacy and numeracy.’

‘We want to see every school stretching all their pupils. These figures show that primary schools have embraced the opportunity to stretch their brightest 11-year-olds by teaching them more demanding new material, in line with the new curriculum, and by entering them for the Level 6 test.’

There is additionally confirmation in the Telegraph article that ‘challenging material currently seen in the level 6 exams would be incorporated into all SATs tests’ when the new universal assessments are introduced, but nothing about the test development difficulties that this presents.

But each piece attributed this welcome statement to Mr Gove:

‘It is plain wrong to set a ceiling on the talents of the very brightest pupils and let them drift in class.’

‘Letting teachers offer level 6 tests means that the most talented children will be fully stretched and start secondary school razor sharp.’

Can we read into that a commitment to ensure that the new system – including curriculum, assessment, qualifications, accountability and (critically) Pupil Premium support for the disadvantaged – is designed in a joined up fashion to meet the needs of ‘the very brightest pupils’?

I wonder if Mr Hunt feels able to follow suit.

 

 

GP

May 2014

 

 

 

 

 

 

 

 

Air on the ‘G’ String: Hoagies’ Bloghop, May 2014

 

As I see it, there are three sets of issues with the ‘G’ word:

  • Terminological – the term carries with it associations that make some advocates uncomfortable and predispose others to resist such advocacy.
  • Definitional – there are many different ways to define the term and the subset of the population to which it can be applied; there is much disagreement about this, even amongst advocates.
  • Labelling – the application of the term to individuals can have unintended negative consequences, for them and for others.

 

Terminological issues

We need shared terminology to communicate effectively about this topic. A huge range of alternatives is available: able, more able, highly able, most able, talented, asynchronous, high potential, high learning potential… and so on.

These terms – the ‘g’ word in particular – are often qualified by an adjective – profoundly, highly, exceptionally – which adds a further layer of complexity. Then there is the vexed question of dual and multiple exceptionality…

Those of us who are native English speakers conveniently forget that there are also numerous terms available in other languages: surdoué, Hochbegabung, hochbegabte, altas capacidades, superdotados, altas habilidades, evnerik and many, many more!

Each of these terms has its own good and bad points, its positive and negative associations.

The ‘g’ word has a long history, is part of the lingua franca and is still most widely used. But its long ascendancy has garnered a richer mix of associations than some of the alternatives.

The negative associations can be unhelpful to those seeking to persuade others to respond positively and effectively to the needs of these children and young people. Some advocates feel uncomfortable using the term and this hampers effective communication, both within the community and outside it.

Some react negatively to its exclusive, elitist connotations; on the other hand, it can be used in a positive way to boost confidence and self-esteem.

But, ultimately, the term we use is less significant than the way in which we define it. There may be some vague generic distaste for the ‘g’ word, but logic should dictate that most reactions will depend predominantly on the meaning that is applied to the term.

 

Definitional issues

My very first blog post drew attention to the very different ways in which this topic is approached around the world. I identified three key polarities:

  • Nature versus nurture – the perceived predominance of inherited disposition over effort and practice, or vice versa.
  • Excellence versus equity – whether priority is given to raising absolute standards and meritocracy or narrowing excellence gaps and social mobility.
  • Special needs versus personalisation – whether the condition or state defined by the term should be addressed educationally as a special need, or through mainstream provision via differentiation and tailored support.

These definitional positions may be associated with the perceived pitch or incidence of the ‘g’ condition. When those at the extreme of the distribution are under discussion, or the condition is perceived to be extremely rare, a nature-excellence-special needs perspective is more likely to predominate. A broader conceptualisation pushes one towards the nurture-equity-personalisation nexus.

Those with a more inclusive notion of ‘g’-ness – who do not distinguish between ‘bright’ and ‘g’, include all high attainers amongst the latter and are focused on the belief that ‘g’-ness is evenly distributed in the population by gender, ethnic and socio-economic background – are much more likely to hold the latter perspective, or at least tend towards it.

There are also differences according to whether the focus is the condition itself – ‘g’-ness – or schooling for the learners to whom the term is applied – ‘g’ education. In the first case, nature, excellence and special needs tend to predominate; in the second the reverse is true. This can compromise interaction between parents and educators.

In my experience, if the ‘g’ word is qualified by a careful definition that takes account of these three polarities, a mature discussion about needs and how best to meet them is much more likely to occur.

In the absence of a shared definition, the associations of the term will likely predominate unchecked. Effective communication will be impossible; common ground cannot be established; the needs that the advocate is pressing will remain unfulfilled. That is in no-one’s best interests, least of all those who are ‘g’.

 

Labelling Issues 

When the ‘g’ word is applied to an individual, it is likely to influence how that individual perceives himself and how others perceive him.

Labelling is normally regarded as negative, because it implies a fixed and immutable state and may subject the bearers of the label to impossibly high expectations, whether of behaviour or achievement, that they cannot always fulfil.

Those who do not carry the label may see themselves as second class citizens, become demotivated and much less likely to succeed.

But, as noted above, it is also possible to use the ‘g’ label to confer much-needed status and attention on those who do not possess the former or receive enough of the latter. This can boost confidence and self-esteem, making the owners of the label more likely to conform to the expectations that it carries.

This is particularly valuable for those who strive to promote equity and narrow excellence gaps between those from advantaged and disadvantaged backgrounds.

Moreover, much depends on whether the label is permanently applied or confers a temporary status.

I recently published a Twitter conversation explaining how the ‘g’ label can be used as a marker to identify those learners who for the time being need additional learning support to maximise their already high achievement.

This approach reflects the fact that children and young people do not develop through a consistent linear process, but experience periods of rapid development and comparative stasis.

The timing and duration of these periods will vary so, at any one time in any group of such individuals, some will be progressing rapidly and others will not. Over the longer term some will prove precocious; others late developers.

This is not to deny that a few learners at the extreme of the distribution will retain the marker throughout their education, because they are consistently far ahead of their peers and so need permanent additional support to maximise their achievement.

But, critically, the label is earned through evidence of high achievement rather than through a test of intelligence or cognitive ability that might have been administered once only and in the distant past. ‘G’-ness depends on educational success. It also forces educators to address underachievement at the top of the attainment spectrum.

If a label is more typically used as a temporary marker it must be deployed sensitively, in a way that is clearly understood by learners and their parents. They must appreciate that the removal of the marker is not a punishment or downgrading that leads to loss of self-esteem.

Because the ‘g’ label typically denotes a non-permanent state that defines need rather than expectation, most if not all of the negative connotations can be avoided.

Nevertheless, this may be anathema to those with a nature-excellence-special needs perspective!

 

Conclusion 

I have avoided using the ‘g’ word within this post, partly to see if it could be done and partly out of respect for those of you who dislike it so much.

But I have also advanced some provocative arguments using terminology that some of you will find equally disturbing. That is deliberate and designed to make you think!

The ‘g’ word has substantial downside, but this can be minimised through careful definition and the application of the label as a non-permanent marker.

It may be that the residual negative associations are such that an alternative is still preferable. The question then arises whether there is a better term with the same currency and none of the negative connotations.

As noted above there are many contenders – not all of them part of the English language – but none stands head-and-shoulders above its competitors.

And of course it is simply impossible to ban a word. Indeed, any attempt to do so would provoke many of us – me included – to use the ‘g’ word even more frequently and with much stronger conviction.

 

 

Hoagies bloghop

 

This blog is part of the Hoagies’ Gifted Education Page inaugural Blog Hop on The “G” Word (“Gifted”).  To read more blogs in this hop, visit this Blog Hop at www.hoagiesgifted.org/blog_hop_the_g_word.htm

 

 

GP

May 2014

 

 

 

 

 

photo credit: neurollero (http://www.flickr.com/photos/neurollero/17873944/) via photopin (http://photopin.com); licence: CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)

How well is Ofsted reporting on the most able?

 

 

This post considers how Ofsted’s new emphasis on the attainment and progress of the most able learners is reflected in school inspection reports.

My analysis is based on the 87 Section 5 secondary school inspection reports published in the month of March 2014.

I shall not repeat here previous coverage of how Ofsted’s emphasis on the most able has been framed. Interested readers may wish to refer to previous posts for details:

The more specific purpose of the post is to explore how consistently Ofsted inspectors are applying their guidance and, in particular, whether there is substance for some of the concerns I expressed in these earlier posts, drawn together in the next section.

The remainder of the post provides an analysis of the sample and a qualitative review of the material about the most able (and analogous terms) included in the sample of 87 inspection reports.

It concludes with a summary of the key points, a set of associated recommendations and an overall inspection grade for inspectors’ performance to date. Here is a link to this final section for those who prefer to skip the substance of the post.

 

Background

Before embarking on the real substance of this argument I need to restate briefly some of the key issues raised in those earlier posts:

  • Ofsted’s definition of ‘the most able’ in its 2013 survey report is idiosyncratically broad, including around half of all learners on the basis of their KS2 outcomes.
  • The evidence base for this survey report included material suggesting that the most able students are supported well or better in only 20% of lessons – and are not making the progress of which they are capable in about 40% of schools.
  • The survey report’s recommendations included three commitments on Ofsted’s part. It would:

 o   ‘focus more closely in its inspections on the teaching and progress of the most able students, the curriculum available to them, and the information, advice and guidance provided to the most able students’;

o   ‘consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds’ and

o   ‘report its inspection findings about this group of students more clearly in school inspection, sixth form and college reports.’

  • Subsequently the school inspection guidance was revised somewhat haphazardly, resulting in the parallel use of several undefined terms (‘able pupils’, ‘most able’, ‘high attaining’, ‘highest attaining’), the underplaying of the attainment and progress of the most able learners attracting the Pupil Premium and very limited reference to appropriate curriculum and IAG.
  • Within the inspection guidance, emphasis was placed primarily on learning and progress. I edited together the two relevant sets of level descriptors in the guidance to provide this summary for the four different inspection categories:

In outstanding schools the most able pupils’ learning is consistently good or better and they are making rapid and sustained progress.

In good schools the most able pupils’ learning is generally good, they make good progress and achieve well over time.

In schools requiring improvement the teaching of the most able pupils and their achievement are not good.

In inadequate schools the most able pupils are underachieving and making inadequate progress.

  • No published advice has been made available to inspectors on the interpretation of these amendments to the inspection guidance. In October 2013 I wrote:

‘Unfortunately, there is a real risk that the questionable clarity of the Handbook and Subsidiary Guidance will result in some inconsistency in the application of the Framework, even though the fundamental purpose of such material is surely to achieve the opposite.’

  • Analysis of a very small sample of reports for schools reporting poor results for high attainers in the school performance tables suggested inconsistency both before and after the amendments were introduced into the guidance. I commented:

‘One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.’

The material below considers the impact of these revisions on a more substantial sample of reports and whether this justifies some of the concerns expressed above.

It is important to add that, in January 2014, Ofsted revised its guidance document ‘Writing the report for school inspections’ to include the statement that:

‘Inspectors must always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’ (p8)

This serves to reinforce the changes to the inspection guidance and clearly indicates that coverage of this issue – at least in these terms – is a non-negotiable: we should expect to see appropriate reference in every single section 5 report.

 

The Sample

The sample comprises 87 secondary schools whose Section 5 inspection reports were published by Ofsted in March 2014.

The inspections were conducted between 26 November 2013 and 11 March 2014, so the inspectors will have had time to become familiar with the revised guidance.

However, up to 20 of the inspections took place before Ofsted felt it necessary to emphasise that coverage of the progress and teaching of the most able is compulsory.

The sample happens to include several institutions inspected as part of wider-ranging reviews of schools in Birmingham and schools operated by the E-ACT academy chain. It also incorporates several middle-deemed secondary schools.

Chart 1 shows the regional breakdown of the sample, adopting the regions Ofsted uses to categorise reports rather than its own regional structure (i.e. with the North East identified separately from Yorkshire and Humberside).

It contains a disproportionately large number of schools from the West Midlands while the South-West is significantly under-represented. All the remaining regions supply between 5 and 13 schools. A total of 57 local authority areas are represented.

 

Chart 1: Schools within the sample by region


 

Chart 2 shows the different statuses of schools within the sample. Over 40% are community schools, while almost 30% are sponsored academies. There are no converter academies, but sponsored academies, free schools and studio schools together account for some 37% of the sample.

 

Chart 2: Schools within the sample by status


 

The vast majority of schools in the sample are 11-16 or 11-18 institutions, but four are all-through schools, five provide for learners aged 13 or 14 upwards and ten are middle schools. There are four single sex schools.

Chart 3 shows the variation in school size. Some of the studio schools, free schools and middle schools are very small by secondary standards, while the largest secondary school in the sample has some 1,600 pupils. A significant proportion of schools have between 600 and 1,000 pupils.

 

Chart 3: Schools within the sample by number on roll


The distribution of overall inspection grades between the sample schools is illustrated by Chart 4 below. Eight of the sample were rated outstanding, 28 good, 35 requiring improvement and 16 inadequate.

Of those rated inadequate, 12 were subject to special measures and four had serious weaknesses.

 

Chart 4: Schools within the sample by overall inspection grade


The eight schools rated outstanding are:

  • A mixed 11-18 sponsored academy
  • A mixed 14-19 studio school
  • A mixed 11-18 free school
  • A mixed 11-16 VA comprehensive
  • A girls’ 11-18 VA comprehensive
  • A boys’ 11-18 VA selective school
  • A girls’ 11-18 community comprehensive and
  • A mixed 11-18 community comprehensive

The sixteen schools rated inadequate are:

  • Eight mixed 11-18 sponsored academies
  • Two mixed 11-16 sponsored academies
  • A mixed all-through sponsored academy
  • A mixed 11-16 free school
  • Two mixed 11-16 community comprehensives
  • A mixed 11-18 community comprehensive and
  • A mixed 13-19 community comprehensive

 

Coverage of the most able in main findings and recommendations

 

Terminology 

Where such learners were mentioned, they were most often described as the ‘most able’, but a wide range of other terminology is deployed, including ‘most-able’, ‘the more able’, ‘more-able’, ‘higher attaining’, ‘high-ability’, ‘higher-ability’ and ‘able students’.

The idiosyncratic adoption of redundant hyphenation is an unresolved mystery.

It is not unusual for two or more of these terms to be used in the same report. In the absence of any glossary, this makes some reports rather less straightforward to interpret accurately.

It is also more difficult to compare and contrast reports. Helpful services like Watchsted’s word search facility become less useful.

 

Incidence of commentary in the main findings and recommendations

Thirty of the 87 inspection reports (34%) explicitly addressed the school’s most able learners (or used a similar term) in both the sections setting out the main findings and the recommendations.

The analysis showed that 28% of reports on academies (including studios and free schools) met this criterion, whereas 38% of reports on non-academy schools did so.

Chart 5 shows how the incidence of reference in both main findings and recommendations varies according to the overall inspection grade awarded.

One can see that this level of attention is most prevalent in schools requiring improvement, followed by those with inadequate grades. It was less common in schools rated good and less common still in outstanding schools. The gap between these two categories is perhaps smaller than expected.

The slight lead for schools requiring improvement over inadequate schools may be attributable to a view that the latter face more pressing priorities, or it may have something to do with the varying proportions of high attainers in such schools; both of these factors could be in play, amongst others.

 

Chart 5: Most able covered in both main findings and recommendations by overall inspection rating (percentage)


A further eleven reports (13%) addressed the most able learners in the recommendations but not the main findings.

Only one report managed to feature the most able in the main findings but not in the recommendations and this was because the former recorded that ‘the most able students do well’.

Consequently, a total of 45 reports (52%) did not mention the most able in either the main findings or the recommendations.

This applied to some 56% of reports on academies (including free schools and studio schools) and 49% of reports on other state-funded schools.

So, according to these proxy measures, the most able in academies appear to receive comparatively less attention from inspectors than those in non-academy schools. It is not clear why. (The samples are almost certainly too small to support reliable comparison of academies and non-academies with different inspection ratings.)
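The arithmetic behind these proportions can be checked with a quick tally (a minimal sketch in Python using the counts reported above; the category labels are my own):

```python
# Cross-tabulation of the 87 reports by whether the most able are mentioned
# in the main findings and/or the recommendations (counts as reported above).
counts = {
    "both": 30,                 # main findings and recommendations
    "recommendations_only": 11, # recommendations but not main findings
    "findings_only": 1,         # main findings but not recommendations
    "neither": 45,              # no mention in either section
}

total = sum(counts.values())
percentages = {k: round(100 * v / total) for k, v in counts.items()}

print(total)        # 87 reports in the sample
print(percentages)  # {'both': 34, 'recommendations_only': 13, 'findings_only': 1, 'neither': 52}
```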

Chart 6 below shows the inspection ratings for this subset of reports.

 

Chart 6: Most able covered in neither main findings nor recommendations by overall inspection rating (percentage)


Here is further evidence that the significant majority of outstanding schools are regarded as having no significant problems in respect of provision for the most able.

On the other hand, this is far from being universally true, since it is an issue for one in four of them. This ratio of 3:1 does not lend complete support to the oft-repeated claim that outstanding schools invariably provide outstandingly for the most able – and vice versa.

At the other end of the spectrum, and perhaps even more surprisingly, over 30% of inadequate schools are assumed not to have issues significant enough to warrant reference in these sections. Sometimes this may be because they are equally poor at providing for all their learners, so the most able are not separately singled out.

Chart 7 below shows differences by school size, giving the percentage of reports mentioning the most able in both main findings and recommendations and in neither.

It divides schools into three categories: small (24 schools with a NOR of 599 or lower), medium (35 schools with a NOR of 600-999) and large (28 schools with a NOR of 1,000 or higher).

 

Chart 7: Reports mentioning the most able in main findings and recommendations by school size 


It is evident that ‘neither’ exceeds ‘both’ in all three categories. Small and large schools record very similar profiles.

But there is a much more significant difference for medium-sized schools, which show a much smaller percentage of ‘both’ reports and comfortably the largest percentage of ‘neither’ reports.

This pattern – suggesting that inspectors are markedly less likely to emphasise provision for the most able in medium-sized schools – is worthy of further investigation.

It would be particularly interesting to explore further the relationship between school size, the proportion of high attainers in a school and their achievement.

 

Typical references in the main findings and recommendations

I could detect no obvious and consistent variation in these references by school status or size, but there was a noticeably different emphasis between schools rated outstanding and those rated inadequate.

Where the most able featured in reports on outstanding schools, these included recommendations such as:

‘Further increase the proportion of outstanding teaching in order to raise attainment even higher, especially for the most able students.’ (11-16 VA comprehensive).

‘Ensure an even higher proportion of students, including the most able, make outstanding progress across all subjects’ (11-18 sponsored academy).

These statements suggest that such schools have made good progress in eradicating underachievement amongst the most able but still have further room for improvement.

But where the most able featured in recommendations for inadequate schools, they were typically of this nature:

‘Improve teaching so that it is consistently good or better across all subjects, but especially in mathematics, by: raising teachers’ expectations of the quality and amount of work students of all abilities can do, especially the most and least able.’  (11-16 sponsored academy).

‘Improve the quality of teaching in order to speed up the progress students make by setting tasks that are at the right level to get the best out of students, especially the most able.’ (11-18 sponsored academy).

‘Rapidly improve the quality of teaching, especially in mathematics, by ensuring that teachers: have much higher expectations of what students can achieve, especially the most able…’ (11-16 community school).

These make clear that poor and inconsistent teaching quality is causing significant underachievement at the top end (and ‘especially’ suggests that this top end underachievement is particularly pronounced compared with other sections of the attainment spectrum in such schools).

Recommendations for schools requiring improvement are akin to those for inadequate schools but typically more specific, pinpointing particular dimensions of good quality teaching that are absent, so limiting effective provision for the most able. It is as if these schools have some of the pieces in place but not yet the whole jigsaw.

By comparison, recommendations for good schools can seem rather more impressionistic and/or formulaic, focusing more generally on ‘increasing the proportion of outstanding teaching’. In such cases the assessment is less about missing elements and more about the consistent application of all of them across the school.

One gets the distinct impression that inspectors have a clearer grasp of the ‘fit’ between provision for the most able and the other three inspection outcomes, at least as far as the distinction between ‘good’ and ‘outstanding’ is concerned.

But it would be misleading to suggest that these lines of demarcation are invariably clear. The boundary between ‘good’ and ‘requires improvement’ seems comparatively distinct, but there was more evidence of overlap at the intersections between the other grades.

 

Coverage of the most able in the main body of reports 

References to the most able rarely turn up in the sections dealing with behaviour and safety and leadership and management. I counted no examples of the former and no more than one or two of the latter.

I could find no examples where information, advice and guidance available to the most able are separately and explicitly discussed and little specific reference to the appropriateness of the curriculum for the most able. Both are less prominent than the recommendations in the June 2013 survey report led us to expect.

Within this sample, the vast majority of reports include some description of the attainment and/or progress of the most able in the section about pupils’ achievement, while roughly half pick up the issue in relation to the quality of teaching.

The extent of the coverage of most able learners varied enormously. Some devoted a single sentence to the topic while others referred to it separately in main findings, recommendations, pupils’ achievement and quality of teaching. In a handful of cases reports seemed to give disproportionate attention to the topic.

 

Attainment and progress

Analyses of attainment and progress are sometimes entirely generic, as in:

‘The most able students make good progress’ (inadequate 11-18 community school).

‘The school has correctly identified a small number of the most able who could make even more progress’ (outstanding 11-16 RC VA school).

‘The most able students do not always secure the highest grades’ (11-16 community school requiring improvement).

‘The most able students make largely expected rates of progress. Not enough yet go on to attain the highest GCSE grades in all subjects.’ (Good 11-18 sponsored academy).

Sometimes such statements can be damning:

‘The most-able students in the academy are underachieving in almost every subject. This is even the case in most of those subjects where other students are doing well. It is an academy-wide issue.’ (Inadequate 11-18 sponsored academy).

These do not in my view constitute reporting ‘in detail on the progress of the most able pupils’ and so probably fall foul of Ofsted’s guidance to inspectors on writing reports.

More specific comments on attainment typically refer explicitly to the achievement of A*/A grades at GCSE and ideally to specific subjects, for example:

‘In 2013, standards in science, design and technology, religious studies, French and Spanish were also below average. Very few students achieved the highest A* and A grades.’ (Inadequate 11-18 sponsored academy)

‘Higher-ability students do particularly well in a range of subjects, including mathematics, religious education, drama, art and graphics. They do as well as other students nationally in history and geography.’ (13-18 community school  requiring improvement)

More specific comments on progress include:

‘The progress of the most able students in English is significantly better than that in other schools nationally, and above national figures in mathematics. However, the progress of this group is less secure in science and humanities.’  (Outstanding 11-18 sponsored academy)

‘In 2013, when compared to similar students nationally, more-able students made less progress than less-able students in English. In mathematics, where progress is less than in English, students of all abilities made similar progress.’ (11-18 sponsored academy requiring improvement).

Statements about progress rarely extend beyond English and maths (the first example above is exceptional) but, when attainment is the focus, some reports take a narrow view based exclusively on the core subjects, while others are far wider-ranging.

Despite the reference in Ofsted’s survey report, and subsequently the revised subsidiary guidance, to coverage of high attaining learners in receipt of the Pupil Premium, this is hardly ever addressed.

I could find only two examples amongst the 87 reports:

‘The gap between the achievement in English and mathematics of students for whom the school receives additional pupil premium funding and that of their classmates widened in 2013… During the inspection, it was clear that the performance of this group is a focus in all lessons and those of highest ability were observed to be achieving equally as well as their peers.’ (11-16 foundation school requiring improvement)

‘Students eligible for the pupil premium make less progress than others do and are consequently behind their peers by approximately one GCSE grade in English and mathematics. These gaps reduced from 2012 to 2013, although narrowing of the gaps in progress has not been consistent over time. More-able students in this group make relatively less progress.’ (11-16 sponsored academy requiring improvement)

More often than not it seems that the most able and those in receipt of the Pupil Premium are assumed to be mutually exclusive groups.

 

Quality of teaching 

There was little variation in the issues raised under teaching quality. Most inspectors select two or three options from a standard menu:

‘Where teaching is best, teachers provide suitably challenging materials and through highly effective questioning enable the most able students to be appropriately challenged and stretched…. Where teaching is less effective, teachers are not planning work at the right level of difficulty. Some work is too easy for the more able students in the class.’ (Good 11-16 community school)

 ‘In teaching observed during the inspection, the pace of learning for the most able students was too slow because the activities they were given were too easy. Although planning identified different activities for the most able students, this was often vague and not reflected in practice.  Work lacks challenge for the most able students.’ (Inadequate 11-16 community school)

‘In lessons where teaching requires improvement, teachers do not plan work at the right level to ensure that students of differing abilities build on what they already know. As a result, there is a lack of challenge in these lessons, particularly for the more able students, and the pace of learning is slow. In these lessons teachers do not have high enough expectations of what students can achieve.’ (11-18 community school requiring improvement)

‘Tasks set by teachers are sometimes too easy and repetitive for pupils, particularly the most able. In mathematics, pupils are sometimes not moved on quickly enough to new and more challenging tasks when they have mastered their current work.’ (9-13 community middle school requiring improvement)

‘Targets which are set for students are not demanding enough, and this particularly affects the progress of the most able because teachers across the year groups and subjects do not always set them work which is challenging. As a result, the most able students are not stretched in lessons and do not achieve as well as they should.’ (11-16 sponsored academy rated inadequate)

All the familiar themes are present – assessment informing planning, careful differentiation, pace and challenge, appropriate questioning, the application of subject knowledge, the quality of homework, high expectations and extending effective practice between subject departments.

 

Negligible coverage of the most able

Only one of the 87 reports failed to make any mention of the most able whatsoever. This is the report on North Birmingham Academy, an 11-19 mixed school requiring improvement.

This clearly does not meet the injunction to:

‘…report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough’.

It ought not to have passed through Ofsted’s quality assurance processes unscathed. The inspection was conducted in February 2014, after this guidance was issued, so there is no excuse.

Several other inspections make only cursory references to the most able in the main body of the report, for example:

‘Where teaching is not so good, it was often because teachers failed to check students’ understanding or else to anticipate when to intervene to support students’ learning, especially higher attaining students in the class.’ (Good 11-18 VA comprehensive).

‘… the teachers’ judgements matched those of the examiners for a small group of more-able students who entered early for GCSE in November 2013.’ (Inadequate 11-18 sponsored academy).

‘More-able students are increasingly well catered for as part of the academy’s focus on raising levels of challenge.’ (Good 11-18 sponsored academy).

‘The most able students do not always pursue their work to the best of their capability.’ (11-16 free school requiring improvement).

These would also fall well short of the report writing guidance. At least 6% of my sample falls into this category.

Some reports note explicitly that the most able learners are not making sufficient progress, but fail to capture this in the main findings or recommendations, for example:

‘The achievement of more able students is uneven across subjects. More able students said to inspectors that they did not feel they were challenged or stretched in many of their lessons. Inspectors agreed with this view through evidence gathered in lesson observations…lessons do not fully challenge all students, especially the more able, to achieve the grades of which they are capable.’ (11-19 sponsored academy requiring improvement).

‘The 2013 results of more-able students show they made slower progress than is typical nationally, especially in mathematics.  Progress is improving this year, but they are still not always sufficiently challenged in lessons.’ (11-18 VC CofE school requiring improvement).

‘There is only a small proportion of more-able students in the academy. In 2013 they made less progress in English and mathematics than similar students nationally. Across all of their subjects, teaching is not sufficiently challenging for more-able students and they leave the academy with standards below where they should be.’ (Inadequate 11-18 sponsored academy).

‘The proportion of students achieving grades A* and A was well below average, demonstrating that the achievement of the most able also requires improvement.’  (11-18 sponsored academy requiring improvement).

Something approaching 10% of the sample fell into this category. It was not always clear why this issue was not deemed significant enough to feature amongst schools’ priorities for improvement. This state of affairs was more typical of schools requiring improvement than inadequate schools, so one could not so readily argue that the schools concerned were overwhelmed with the need to rectify more basic shortcomings.

That said, the example from an inadequate academy above may be significant. It is almost as if the small number of more able students is the reason why this shortcoming is not taken more seriously.

Inspectors must carry in their heads a somewhat subjective hierarchy of issues that schools are expected to tackle. Some inspectors appear to feature the most able at a relatively high position in this hierarchy; others push it further down the list. Some appear more flexible in the application of this hierarchy to different settings than others.

 

Formulaic and idiosyncratic references 

There is clear evidence of formulaic responses, especially in the recommendations for how schools can improve their practice.

Many reports adopt the strategy of recommending a series of actions featuring the most able, either in the target group:

‘Improve the quality of teaching to at least good so that students, including the most able, achieve higher standards, by ensuring that: [followed by a list of actions]’ (9-13 community middle school requiring improvement)

Or in the list of actions:

‘Improve the quality of teaching in order to raise the achievement of students by ensuring that teachers:…use assessment information to plan their work so that all groups of students, including those supported by the pupil premium and the most-able students, make good progress.’ (11-16 community school requiring improvement)

It was rare indeed to come across a report that referred explicitly to interesting or different practice in the school, or approached the topic in a more individualistic manner, but here are a few examples:

‘More-able pupils are catered for well and make good progress. Pupils enjoy the regular, extra challenges set for them in many lessons and, where this happens, it enhances their progress. They enjoy that extra element which often tests them and gets them thinking about their work in more depth. Most pupils are keen to explore problems which will take them to the next level or extend their skills.’  (Good 9-13 community middle school)

‘Although the vast majority of groups of students make excellent progress, the school has correctly identified a small number of the most able who could make even more progress. It has already started an impressive programme of support targeting the 50 most able students called ‘Students Targeted A grade Results’ (STAR). This programme offers individualised mentoring using high-quality teachers to give direct intervention and support. This is coupled with the involvement of local universities. The school believes this will give further aspiration to these students to do their very best and attend prestigious universities.’  (Outstanding 11-16 VA school)

I particularly liked:

‘Policies to promote equality of opportunity are ineffective because of the underachievement of several groups of students, including those eligible for the pupil premium and the more-able students.’ (Inadequate 11-18 academy) 

 

Conclusion

 

Main Findings

The principal findings from this survey, admittedly based on a rather small and not entirely representative sample, are that:

  • Inspectors are terminologically challenged in addressing this issue, because there are too many synonyms or near-synonyms in use.
  • Approximately one-third of inspection reports address provision for the most able in both main findings and recommendations. This is less common in academies than in community, controlled and aided schools. It is most prevalent in schools with an overall ‘requires improvement’ rating, followed by those rated inadequate. It is least prevalent in outstanding schools, although one in four outstanding schools is dealt with in this way.
  • Slightly over half of inspection reports address provision for the most able in neither the main findings nor the recommendations. This is relatively more common in the academies sector and in outstanding schools. It is least prevalent in schools rated inadequate, though almost one-third of inadequate schools fall into this category. Sometimes this is the case even though provision for the most able is identified as a significant issue in the main body of the report.
  • There is an unexplained tendency for reports on medium-sized schools to be significantly less likely to feature the most able in both main findings and recommendations and significantly more likely to feature it in neither. This warrants further investigation.
  • Overall coverage of the topic varies excessively between reports. One ignored it entirely, while several provided only cursory coverage and a few covered it to excess. The scope and quality of the coverage does not necessarily correlate with the significance of the issue for the school.
  • Coverage of the attainment and progress of the most able learners is variable. Some reports offer only generic descriptions of attainment and progress combined, some are focused exclusively on attainment in the core subjects while others take a wider curricular perspective. Outside the middle school sector, desirable attainment outcomes for the most able are almost invariably defined exclusively in terms of A* and A grade GCSEs.
  • Hardly any reports consider the attainment and/or progress of the most able learners in receipt of the Pupil Premium.
  • None of these reports make specific and explicit reference to IAG for the most able. It is rarely stated whether the school’s curriculum satisfies the needs of the most able.
  • Too many reports adopt formulaic approaches, especially in the recommendations they offer the school. Too few include reference to interesting or different practice.

In my judgement, too much current inspection reporting falls short of the commitments contained in the original Ofsted survey report and of the more recent requirement to:

‘always report in detail on the progress of the most able pupils and how effectively teaching engages them with work that is challenging enough.’

 

Recommendations

  • Ofsted should publish a glossary defining clearly all the terms for the most able that it employs, so that both inspectors and schools understand exactly what is intended when a particular term is deployed and which learners should be in scope when the most able are discussed.
  • Ofsted should co-ordinate the development of supplementary guidance clarifying its expectations of schools in respect of provision for the most able. This should set out in more detail the expectations that would apply for such provision to be rated outstanding, good, requiring improvement and inadequate respectively, covering the most able in receipt of the Pupil Premium, the suitability of the curriculum and the provision of IAG.
  • Ofsted should provide supplementary guidance for inspectors outlining and exemplifying the full range of evidence they might interrogate concerning the attainment and progress of the most able learners, including those in receipt of the Pupil Premium.
  • This guidance should specify the essential minimum coverage expected in reports and the ‘triggers’ that would warrant it being referenced in the main findings and/or recommendations for action.
  • This guidance should discourage inspectors from adopting formulaic descriptors and recommendations and specifically encourage them to identify unusual or innovative examples of effective practice.
  • The school inspection handbook and subsidiary guidance should be amended to reflect the supplementary guidance.
  • The School Data Dashboard should be expanded to include key data highlighting the attainment and progress of the most able.
  • These actions should also be undertaken for inspection of the primary and 16-19 sectors respectively.

 

Overall assessment: Requires Improvement.

 

GP

May 2014