I promised that I would post a final entry on this blog to explain why Gifted Phoenix is returning to the flames. This is published exactly one week prior to regeneration.
Let me apologise at the outset for the plethora of first-person pronouns in this post.
I tend not to blow my own trumpet because I find such behaviour in others so blatant and cringeworthy.
It’s as if they were carrying a huge banner saying ‘JUST LOOK HOW IMPORTANT I THINK I AM, YET HOW GLARINGLY INSECURE’.
This is not a typical Gifted Phoenix post – less reasoned analysis and more personal rant – but I hope it will help to explain why I feel as I do, and help to provide some sense of closure.
What I did
Gifted Phoenix is my social media alias. He made his appearance in 2010 at the same time as the real me became an independent education consultant.
He divided his time between:
Twitter, where his profile described him as an ‘education policy analyst specialising in global gifted and talented education and a balanced critique of wider education reform here in England’. He has posted over 30,000 tweets and acquired almost 6,700 followers.
This eponymous blog, specialising in ‘global gifted education and English education policy analysis’, which contains 217 posts. Many are substantial pieces of work, often of 10,000 words or more.
I intended that Gifted Phoenix would be useful to me as well as to others.
I planned to use Twitter as a virtual filing cabinet, logging all relevant publications and news commentary for my own purposes while simultaneously publishing each record openly. My feed is a sizeable searchable database, extremely useful for researching blog posts and responding to queries.
I decided to blog differently too. I wanted to develop a research-heavy oeuvre, akin to academic journal papers, but without the tediously formulaic structure and style. I sought to produce detailed posts written in short, pithy journalistic paragraphs. I tried to be authoritative, reliable, balanced and evidence-based.
The early work adopts a determinedly global perspective and focuses on gifted education. The later posts deal almost exclusively with education in England, especially the implications for high attainers.
Why I’m stopping
As a gifted person I can confirm that gifted people are often rather stupid.
Rather stupidly I had assumed an unwritten contract with you, Dear Reader.
The deal was this:
I would provide an entirely free service via social media to anyone who chose to access it and, in return, the beneficiaries (that’s you), having assured themselves of my expertise, would reciprocate with a regular supply of well-paid consultancy, conference and written work.
But instead I found my side of the contract becoming steadily more onerous while the supply of paid work – never more than a trickle at the best of times – has dried up almost entirely. Most of my working week has been dedicated to supplying freebies for your delectation.
Perhaps you never appreciated that there was an unwritten contract, or never fully understood it….
Perhaps you did understand, at least implicitly, but played me for a fool by taking the free stuff while mocking my naivete…
Or perhaps I should have sold myself and my wares with rather more chutzpah (though that’s a big ask for a borderline Aspie).
So the bottom line is financial. I’m working too hard for too little gain.
I have some other frustrations.
There has always been an undercurrent associated with the fact that I am not and have never been a serving teacher. There’s a constituency on Twitter utterly convinced that teachers have the monopoly on educational wisdom.
I rarely touch on pedagogy but in all other respects I’d back myself against most teachers and school leaders in my areas of specialism. I might not have the experience of either, or an academic background, but I am still one of the foremost experts in my field (mind that banner!)
I fear the ‘school-led, self-improving system’ is a beguiling delusion. Presumably it has diverted much of the available consultancy work towards Teaching Schools and their ilk. I’m sure the best of them provide a quality of service I could never aspire to. But the worst are endlessly recycling their own inadequacies. Neither is predisposed to work collaboratively with the likes of me.
Then there are the other educational cliques, perpetually engaged in mutual backslapping. Much more of the available work is distributed by them, between themselves. There is no space for an interloper, especially one who is constitutionally incapable of suffering fools gladly.
In this respect the English education scene is only marginally better than the academic gifted education community. I gave up on them a few years back, entirely disillusioned with the pantheon of minor deities.
They’re incurably Americo-centric. Each of them peddles their own belief system, constantly failing to grasp the bigger picture. They refuse to make common cause with each other, or with anyone else.
They bolster themselves up with vacuous doctorates and chairs and pour scorn on those who recognise their pointlessness. Almost everything they publish festers behind a paywall, but most of it is irrelevant anyway. With a few honourable exceptions they are a complete waste of space.
I’m also fed up with several of the prominent educational organisations here in England. It pays to keep on their right side, better still to butter them up, but I don’t do flattery.
Too many use their privileged positions to advance silly ideas.
Some employ far too many young whippersnappers, still wet behind the ears, yet puffed up by the arrogance of youth.
Several react negatively to intelligent, constructive criticism. Rather than addressing it openly, they default automatically to an Ostrich Position, pretending it doesn’t exist. They exemplify the ‘not invented here’ mindset.
And I mustn’t forget the feckless few who make verbal commitments to commission work, only to renege or turn strangely silent.
I have a blacklist of organisations and individuals that have shafted me in this fashion, or in others, all of them blissfully untroubled by stirrings of guilt. None has the faintest notion that an apology is due.
I was planning to publish that list right here, but I guess discretion is the better part of valour. A plague on all their houses. I anticipate keenly the biter bit.
On top of all this I am intensely frustrated at my apparent inability to convince others through logic and rational argument that high attaining learners, especially those from disadvantaged backgrounds, have an equal right to educational challenge and support.
The opposite prejudice – which so many of us fought so hard to counter during the Blair years – now seems to be regaining ascendancy. That is profoundly dispiriting.
Finally there are some personal reasons. I must try to be a better house husband and family man. I have numerous other interests that I’d like to pursue before I’m too doddery. I might even find a way to generate a more reliable income stream.
End of rant.
I have been remiss in getting to this point without extending heartfelt thanks and appreciation to many supporters, too numerous to mention. Without you Gifted Phoenix wouldn’t have lasted this long. Both he and I are really most grateful.
Ironically enough, since I made my decision I’ve been offered some consultancy by a Particularly Prestigious Client. I won’t count chickens as there’s no contract yet. Their name would have looked great on my LinkedIn profile, had I been touting for more business.
This work I will accept, although I have dropped all other commitments.
The Other Half says she’ll never forgive me if I turn down fresh offers that play to my strengths, though I must admit I’m sorely tempted. I don’t want you to think I’m writing this to shame you into charity.
I’ve decided to protect my Twitter feed, which is supposed to mean you can’t access it any more (though Google might still find a way). I don’t think you’ll miss it after a little while.
As for my blog, you’ll find it here over the summer. Thereafter, I’m not sure. There might be an e-book or two if I can summon the will, but nothing with a hard cover. I might just throw the switch. But, once again, I’m sure Google will come to your aid if you’re desperate. I’ll make sure the key documents go to a good home.
I won’t be responding to comments on this post, here or on Twitter, so you can say goodbye or vituperate with impunity, whatever you will.
For the future all I ask is that, if ever you mention my work in yours, you cite it and me properly.
I live in hope that much of the passive resistance I’ve encountered is down to your problems with the messenger. Someone else whose jib has the right cut will make the identical points and the Powers That Be will be readier to listen.
One last request, if I may… When you see or hear my arguments recycled and are feeling brave enough, be good enough to drop in a casual:
‘I fear you’ve been reading Gifted Phoenix’
Maybe from time to time I’ll search on that phrase and appreciate that he did make some difference after all.
I am rounding out this year’s blogging with my customary backwards look at the various posts I published during 2014.
This is partly an exercise in self-congratulation but also flags up to readers any potentially useful posts they might have missed.
Norwegian Panorama by Gifted Phoenix
This is my 32nd post of the year, three fewer than the 35 I published in 2013. Even so, total blog views have increased by 20% compared with 2013.
Almost exactly half of these views originate in the UK. Other countries generating a large number of views include the United States, Singapore, India, Australia, Hong Kong, Saudi Arabia, Germany, Canada and South Korea. The site has been visited this year by readers located in 157 different countries.
This illustrates just how strongly the accountability regime features in the priorities of English educators.
I have continued to concentrate on domestic topics: approximately 75% of my posts this year have been about the English education system. I have not ventured beyond these shores since September.
The first section below reviews the minority of posts with a global perspective; the second covers the English material. A brief conclusion offers my take on future prospects.
This proposed some quality criteria for social media usage and blogs/websites that operate within the field of gifted education.
It also reviewed the social media activity of six key players (WCGTC, ECHA, NAGC, SENG, NACE and Potential Plus UK) as well as wider activity within the blogosphere, on five leading social media platforms and utilising four popular content creation tools.
Some of the websites mentioned above have been recast since the post was published and are now much improved (though I claim no direct influence).
These posts were scheduled just ahead of a conference organised by the Hungarian sponsors of the network. I did not attend, fearing that the proceedings would have limited impact on the future direction of this once promising initiative. I used the posts to set out my reservations, which include a failure to engage with constructive criticism.
Part One scrutinises the Hungarian talent development model on which the European Network is based. Part Two describes the halting progress made by the Network to date. It identifies several deficiencies that need to be addressed if the Network is to have a significant and lasting impact on pan-European support for talent development and gifted education.
This analyses the performance of high achievers from a selection of 11 jurisdictions – either world leaders or prominent English-speaking nations – on the PISA 2012 Creative Problem Solving assessment.
It is a companion piece to a 2013 post which undertook a similar analysis of the PISA 2012 assessments in Reading, Maths and Science.
In May I contributed to the Hoagies’ Bloghop for that month.
Air on the ‘G’ String: Hoagies’ Bloghop, May 2014 was my input to discussion about the efficacy of ‘the G word’ (gifted). I deliberately produced a provocative and thought-provoking piece which stirred typically intense reactions in several quarters.
This takes a closer look at the relatively little-known PISA ‘resilient students’ measure – focused on high achievers from disadvantaged socio-economic backgrounds – and how well different jurisdictions perform against it.
The title reflects the post’s conclusion that, like many other countries, England:
‘…should be worrying as much about our ‘short head’ as our ‘long tail’’.
And so I pass seamlessly on to the series of domestic posts I published during 2014…
The purpose of these annual posts (and the primary equivalent which appears each December) is to synthesise data about the performance of high attainers and high attainment at national level, so that schools can more easily benchmark their own performance.
It examines the subsequent history of schools that recorded particularly poor results with high attainers in the Secondary Performance Tables. (The asterisk references a footnote apologising ‘for this rather tabloid title’.)
Some of the issues I highlighted eight months ago are now being more widely discussed – not least the nature of the performance descriptors, as set out in the recent consultation exercise dedicated to those.
But the reform process is slow. Many other issues remain unresolved and it seems increasingly likely that some of the more problematic will be delayed deliberately until after the General Election.
May was particularly productive, witnessing four posts, three of them substantial:
How well is Ofsted reporting on the most able? explores how Ofsted inspectors are interpreting the references to the attainment and progress of the most able added to the Inspection Handbook late last year. The sample comprises the 87 secondary inspection reports that were published in March 2014. My overall assessment? Requires Improvement.
A Closer Look at Level 6 is a ‘data-driven analysis of Level 6 performance’. As well as providing a baseline against which to assess future Level 6 achievement, this also identifies several gaps in the published data and raises as yet unanswered questions about the nature of the new tests to be introduced from 2016.
One For The Echo Chamber was prompted by The Echo Chamber reblogging service, whose founder objected that my posts are too long, together with the ensuing Twitter debate. Throughout the year the vast majority of my posts have been unapologetically detailed and thorough. They are intended as reference material, to be quarried and revisited, rather than the disposable vignettes that so many seem to prefer. To this day they get reblogged on The Echo Chamber only when a sympathetic moderator is undertaking the task.
‘Poor but Bright’ v ‘Poor but Dim’ arose from another debate on Twitter, sparked by a blog post which argued that the latter are a higher educational priority than the former. I argued that both deserved equal priority, since it is inequitable to discriminate between disadvantaged learners on the basis of prior attainment and the economic arguments cut both ways. This issue continues to bubble like a subterranean stream, only to resurface from time to time, most recently when the Fair Education Alliance proposed that the value of pupil premium allocations attached to disadvantaged high attainers should be halved.
The principles should be valuable to schools considering how best to respond to Ofsted’s increased scrutiny of their provision for the most able. Any institution seeking to revitalise its provision might discuss how the principles should be interpreted to suit its particular needs and circumstances.
Test entries increased significantly. So did the success rates on the other level 6 tests (in maths and in grammar, punctuation and spelling (GPS)). Even teacher assessment of L6 reading showed a marked upward trend.
Despite all this, the number of pupils successful on the L6 reading test fell from 2,062 in 2013 to 851 (provisional). The final statistics – released only this month – show a marginal improvement to 935, but the outcome is still extremely disappointing. No convincing explanation has been offered and the impact on 2015 entries is unlikely to be positive.
These present the evidence base relating to high attainment gaps between disadvantaged and other learners, to distinguish what we know from what remains unclear and so to provide a baseline for further research.
The key finding is that the evidence base is both sketchy and fragmented. We should understand much more than we do about the size and incidence of excellence gaps. We should be strengthening the evidence base as part of a determined strategy to close the gaps.
@GiftedPhoenix very useful summary – the importance of both high achievement and subject choice at GCSE needs more investigation.
In October 16-19 Maths Free Schools Revisited marked a third visit to the 16-19 maths free schools programme, concentrating on progress since my previous post in March 2013, especially at the two schools which have opened to date.
The two small institutions at KCL and Exeter University (very similar to each other) constitute a rather limited outcome for a project that was intended to generate a dozen innovative university-sponsored establishments. There is reportedly a third school in the pipeline but, as 2014 closes, details have yet to be announced.
Excellence Gaps Quality Standard: Version One is an initial draft of a standard encapsulating effective whole school practice in supporting disadvantaged high attainers. It updates and adapts the former IQS for gifted and talented education.
This first iteration needs to be trialled thoroughly, developed and refined but, even as it stands, it offers another useful starting point for schools reviewing the effectiveness of their own provision.
The baseline standard captures the essential ‘non-negotiables’ intended to be applicable to all settings. The exemplary standard is pitched high and should challenge even the most accomplished of schools and colleges.
All comments and drafting suggestions are welcome.
These issues have become linked since Prime Minister Cameron has regularly proposed an extension of the former as a response to calls on the right wing of his party for an extension of the latter.
This was almost certainly the source of autumn media rumours that a strategy, originating in Downing Street, would be launched to incentivise and extend setting.
Newly installed Secretary of State Morgan presumably insisted that existing government policy (which leaves these matters entirely to schools) should remain undisturbed. However, the idea might conceivably be resuscitated for the Tory election manifesto.
Now that UKIP has confirmed its own pro-selection policy there is pressure on the Conservative party to resolve its internal tensions on the issue and identify a viable alternative position. But the pro-grammar lobby is unlikely to accept increased setting as a consolation prize…
This shows that HMCI’s recent distinction between positive support for the most able in the primary sector and a much weaker record in secondary schools is not entirely accurate. There are conspicuous weaknesses in the primary sector too.
Meanwhile, Chinese learners continue to perform extraordinarily well on the Level 6 maths test, achieving an amazing 35% success rate, up six percentage points since 2013. This domestic equivalent of the Shanghai phenomenon bears closer investigation.
My penultimate post of the year HMCI Ups the Ante on the Most Able collates all the references to the most able in HMCI’s 2014 Annual Report and its supporting documentation.
It sets out Ofsted’s plans for the increased scrutiny of schools and for additional survey reports that reflect this scrutiny.
It asks whether Ofsted’s renewed emphasis will be sufficient to rectify the shortcomings they themselves identify and – assuming it will not – outlines an additional ten-step plan to secure system-wide improvement.
‘The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.’
Despite evidence of a few ‘green shoots’ during 2014, my overall sense of pessimism remains.
Meanwhile, future prospects for high attainers in England hang in the balance.
Several of the Coalition Government’s education reforms have been designed to shift schools’ focus away from borderline learners, so that every learner improves, including those at the top of the attainment distribution.
On the other hand, Ofsted’s judgement that a third of secondary inspections this year
‘…pinpointed specific problems with teaching the most able’
would suggest that schools’ everyday practice falls some way short of this ideal.
HMCI’s commitment to champion the interests of the most able is decidedly positive but, as suggested above, it might not be enough to secure the necessary system-wide improvement.
Ofsted is itself under pressure and faces an uncertain future, regardless of the election outcome. HMCI’s championing might not survive the arrival of a successor.
It seems increasingly unlikely that any political party’s election manifesto will have anything significant to say about this topic, unless the enthusiasm for selection in some quarters can be harnessed and redirected towards the much more pertinent question of how best to meet the needs of all high attainers in all schools and colleges, especially those from disadvantaged backgrounds.
But the entire political future is shrouded in uncertainty. Let’s wait and see how things are shaping up on the other side of the election.
From a personal perspective I am closing in on five continuous years of edutweeting and edublogging.
I once expected to extract from this commitment benefits commensurate with the time and energy invested. But that is no longer the case, if indeed it ever was.
I plan to call time at the end of this academic year.
This post is the first stage of a potential development project.
It is my initial ‘aunt sally’ for a new best fit quality standard, intended to support schools and colleges to close performance gaps between high-achieving disadvantaged learners and their more advantaged peers.
It aims to integrate two separate educational objectives:
Improving the achievement of disadvantaged learners, specifically those eligible for Pupil Premium support; and
Improving the achievement of high attainers, by increasing the proportion that achieve highly and the levels at which they achieve.
High achievement embraces both high attainment and strong progress, but these terms are not defined or quantified on the face of the standard, so that it is applicable in primary, secondary and post-16 settings and under both the current and future assessment regimes.
I have adopted new design parameters for this fresh venture into quality standards:
The standard consists of twelve elements placed in what seems a logical order, but they are not grouped into categories. All settings should consider all twelve elements. Eleven are equally weighted, but the first ‘performance’ element is potentially more significant.
The baseline standard is called ‘Emerging’ and is broadly aligned with Ofsted’s ‘Requires Improvement’. I want it to capture only the essential ‘non-negotiables’ that all settings must observe or they would otherwise be inadequate. I have erred on the side of minimalism for this first effort.
The standard marking progress beyond the baseline is called ‘Improving’ and is (very) broadly aligned with Ofsted’s ‘Good’. I have separately defined only the learner performance expected, on the assumption that in other respects the standard marks a continuum. Settings will position themselves according to how far they exceed the baseline and to what extent they fall short of excellence.
The excellence standard is called ‘Exemplary’ and is broadly aligned with Ofsted’s ‘Outstanding’. I have deliberately tried to pitch this as highly as possible, so that it provides challenge for even the strongest settings. Here I have erred on the side of specificity.
The trick with quality standards is to find the right balance between over-prescription and vacuous ‘motherhood and apple pie’ statements.
There may be some variation in this respect between elements of the standard: the section on teaching and learning always seems to be more accommodating of diversity than others given the very different conceptions of what constitutes effective practice. (But I am also cautious of trespassing into territory that, as a non-practitioner, I may not fully understand.)
The standard uses terminology peculiar to English settings but the broad thrust should be applicable in other countries with only limited adaptation.
The terminology won’t necessarily be appropriate in all respects to all settings, but it should have sufficient currency and sharpness to support meaningful interaction between them, including cross-phase interaction. It is normal for primary schools to find some of the language more appropriate to secondary schools.
It is important to emphasise the ‘best fit’ nature of such standards. Following discussion informed by interaction with the framework, settings will reach a reasoned and balanced judgement of their own performance across the twelve elements.
It is not necessary for all statements in all elements to be observed to the letter. If a setting finds all or part of a statement beyond the pale, it should establish why that is and, wherever possible, devise an alternative formulation to fit its context. But it should strive wherever possible to work within the framework, taking full advantage of the flexibility it permits.
Some of the terminology will be wanting, some important references will have been omitted, while others will be over-egged. That is the nature of ‘aunt sallies’.
Feel free to propose amendments using the comments facility below.
The quality standard is immediately below. To improve readability, I have not reproduced the middle column where it is empty. Those who prefer to see the full layout can access it via this PDF.
Emerging: The setting meets essential minimum criteria.
Improving: In best fit terms the setting has progressed beyond entry level but is not yet exemplary.
Exemplary: The setting is a model for others to follow.
Attainment and progress of disadvantaged high achievers typically matches that of similar learners nationally, or is rapidly approaching this. Attainment and progress of advantaged and disadvantaged high achievers in the setting are both improving.
Attainment and progress of disadvantaged high achievers consistently matches and sometimes exceeds that of similar learners nationally. Attainment and progress are improving steadily for advantaged and disadvantaged high achievers in the setting and performance gaps between them are closing.
Attainment and progress of disadvantaged high achievers significantly and consistently exceeds that of similar learners nationally.
Attainment and progress matches but does not exceed that of advantaged learners within the setting, or is rapidly approaching this, and both attainment and progress are improving steadily, for advantaged and disadvantaged high achievers alike.
There is a published policy to close excellence gaps, supported by improvement planning. Progress is carefully monitored.
There is a comprehensive yet clear and succinct policy to close excellence gaps that is published and easily accessible. It is familiar to and understood by staff, parents and learners alike.
SMART action to close excellence gaps features prominently in improvement plans; targets are clear; resources and responsibilities are allocated; progress is monitored and action adjusted accordingly. Learners’ and parents’ feedback is routinely collected.
The setting invests in evidence-based research and fosters innovation to improve its own performance and contribute to system-wide improvement.
Classroom practice consistently addresses the needs of disadvantaged high achievers, so improving their learning and performance.
The relationship between teaching quality and closing excellence gaps is invariably reflected in classroom preparation and practice.
All teaching staff and paraprofessionals can explain how their practice addresses the needs of disadvantaged high achievers, and how this has improved their learning and performance.
All staff are encouraged to research, develop, deploy, evaluate and disseminate more effective strategies in a spirit of continuous improvement.
Out of class learning
A menu of appropriate opportunities is accessible to all disadvantaged high achievers and there is a systematic process to match opportunities to needs.
A full menu of appropriate opportunities – including independent online learning, coaching and mentoring as well as face-to-face activities – is continually updated. All disadvantaged high achievers are supported to participate.
All provision is integrated alongside classroom learning into a coherent, targeted educational programme. The pitch is appropriate, duplication is avoided and gaps are filled.
Staff ensure that: learners’ needs are regularly assessed; they access and complete opportunities that match their needs; participation and performance are monitored and compiled in a learning record.
Systems for assessing, reporting and tracking attainment and progress provide disadvantaged high achievers, parents and staff with the information they need to improve performance.
Systems for assessing, tracking and reporting attainment and progress embody stretch, challenge and the highest expectations. They identify untapped potential in disadvantaged learners. They do not impose artificially restrictive ceilings on performance.
Learners (and their parents) know exactly how well they are performing, what they need to improve and how they should set about it. Assessment also reflects progress towards wider goals.
Frequent reports are issued and explained, enabling learners (and their parents) to understand exactly how their performance has changed over time and how it compares with their peers, identifying areas of relative strength and weakness.
All relevant staff have real-time access to the assessment records of disadvantaged high attainers and use these to inform their work.
Data informs institution-wide strategies to improve attainment and progress. Analysis includes comparison with similar settings.
The needs and circumstances of disadvantaged high achievers explicitly inform the curriculum and curriculum development, as well as the selection of appropriate organisational strategies – eg sets and/or mixed ability classes.
The curriculum is tailored to the needs of disadvantaged high achievers. Curriculum flexibility is utilised to this end. Curriculum development and planning take full account of this.
Rather than a ‘one size fits all’ approach, enrichment (breadth), extension (depth) and acceleration (pace) are combined appropriately to meet different learners’ needs.
Personal, social and learning skills development and the cultivation of social and cultural capital reflect the priority attached to closing excellence gaps and the contribution this can make to improving social mobility.
Organisational strategies – eg the choice of sets or mixed ability classes – are informed by reliable evidence of their likely impact on excellence gaps.
The ethos is positive and supportive of disadvantaged high achievers. Excellence is valued by staff and learners alike. Bullying that undermines this is eradicated.
The ethos embodies the highest expectations of learners, and of staff in respect of learners. Every learner counts equally.
Excellence is actively pursued and celebrated; competition is encouraged but not at the expense of motivation and self-esteem; hothousing is shunned.
High achievement is the norm and this is reflected in organisational culture; there is zero tolerance of associated bullying and a swift and proportional response to efforts to undermine this culture.
Strong but realistic aspirations are fostered. Role models are utilised. Social and emotional needs associated with excellence gaps are promptly and thoroughly addressed.
The impact of disadvantage is monitored carefully. Wherever possible, obstacles to achievement are removed.
The performance, needs and circumstances of disadvantaged high achievers are routinely addressed in transition between settings and in the provision of information, advice and guidance.
Where possible, admissions arrangements prioritise learners from disadvantaged backgrounds – and high achievers are treated equally in this respect.
Receiving settings routinely collect information about the performance, needs and circumstances of disadvantaged high achievers. They routinely share such information when learners transfer to other settings.
Information, advice and guidance is tailored, balanced and thorough. It supports progression to settings that are consistent with the highest expectations and high aspirations while also meeting learners’ needs.
Destinations data is collected, published and used to inform monitoring.
Leadership, staffing, CPD
A named member of staff is responsible – with senior leadership support – for co-ordinating and monitoring activity across the setting (and improvement against this standard). Professional development needs associated with closing excellence gaps are identified and addressed.
The senior leadership team has an identified lead and champion for disadvantaged high achievers and the closing of excellence gaps.
A named member of staff is responsible for co-ordinating and monitoring activity across the setting (and improvement against this standard).
Closing excellence gaps is accepted as a collective responsibility of the whole staff and governing body. There is a named lead governor.
There is a regular audit of professional development needs associated with closing excellence gaps across the whole staff and governing body. A full menu of appropriate opportunities is continually updated and those with needs are supported to take part.
The critical significance of teaching quality in closing excellence gaps is instilled in all staff, accepted and understood.
Parents and guardians understand how excellence gaps are tackled and are encouraged to support this process.
Wherever possible, parents and guardians are actively engaged as partners in the process of closing excellence gaps. The setting may need to act as a surrogate. Other agencies are engaged as necessary.
Staff, parents and learners review progress together regularly. The division of responsibility is clear. Where necessary, the setting provides support through outreach and family learning.
This standard is used as the basis of a guarantee to parents and learners of the support that the school will provide, in return for parental engagement and learner commitment.
Sufficient resources – staffing and funding – are allocated to improvement planning (and to the achievement of this standard). Where available, Pupil Premium is used effectively to support disadvantaged high achievers.
Sufficient resources – staffing and funding – are allocated to relevant actions in the improvement plan (and to the achievement of this standard).
The proportion of Pupil Premium (and/or alternative funding sources) allocated to closing excellence gaps is commensurate with their incidence in the setting.
The allocation of Pupil Premium (or equivalent resources) is not differentiated on the basis of prior achievement: high achievers are deemed to have equal needs.
Settings should evidence their commitment to these principles in published material (especially information required to be published about the use of Pupil Premium).
The setting takes an active role in collaborative activity to close excellence gaps.
Excellence gaps are addressed and progress is monitored in partnership with all relevant ‘feeder’ and ‘feeding’ settings in the locality.
The setting leads improvement across other settings within its networks, utilising the internal expertise it has developed to support others locally, regionally and nationally.
The setting uses collaboration strategically to build its own capacity and improve its expertise.
Those who are not familiar with the quality standards approach may wish to know more.
Regular readers will know that I advocate what I call ‘flexible framework thinking’, a middle way between the equally unhelpful extremes of top-down prescription (one-size-fits-all) and full institutional autonomy (a thousand flowers blooming). Neither secures consistently high quality provision across all settings.
The autonomy paradigm is currently in the ascendant. We attempt to control quality through ever-more elaborate performance tables and an inspection regime that depends on fallible human inspectors and documentation that regulates towards convergence when it should be enabling diversity, albeit within defined parameters.
I see more value in supporting institutions through best-fit guidance of this kind.
My preferred model is a quality standard, flexible enough to be relevant to thousands of different settings, yet specific enough to provide meaningful guidance on effective practice and improvement priorities, regardless of the starting point.
I have written about the application of quality standards to gifted education and their benefits on several occasions:
Quality standards are emphatically not ‘tick box’ exercises and should never be deployed as such.
Rather they are non-prescriptive instruments for settings to use in self-evaluation, for reviewing their current performance and for planning their improvement priorities. They support professional development and lend themselves to collaborative peer assessment.
Quality standards can be used to marshal and organise resources and online support. They can provide the essential spine around which to build guidance documents and they provide a useful instrument for research and evaluation purposes.
There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.
The international benchmark relates to the top third of PISA performers (ie above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.
Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.
This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.
The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.
The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).
‘Against the Odds’ describes it thus:
‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’
‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’
No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.
‘A student is classed as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’
However, multiplication by four is dispensed with.
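The arithmetic behind these scalings is easy to reproduce. The sketch below is a toy illustration with invented student records – it classifies on raw percentiles and does not replicate PISA’s adjustment for socio-economic background – but it shows why the share among all students and the share among disadvantaged students differ by a simple factor (four for quarters, three for thirds, when disadvantaged learners are exactly that fraction of the cohort).

```python
from typing import List, Tuple

def resilient_share(students: List[Tuple[float, float]],
                    cut: float) -> Tuple[float, float]:
    """Return (share among all students, share among disadvantaged).

    Each student is a (performance_percentile, escs_percentile) pair.
    'cut' defines the bands: 0.25 for the quarter-based measure,
    1/3 for the third-based one.
    """
    disadvantaged = [s for s in students if s[1] <= cut]
    # Resilient = disadvantaged AND in the top band of performance
    resilient = [s for s in disadvantaged if s[0] >= 1 - cut]
    share_all = len(resilient) / len(students)
    share_disadv = len(resilient) / len(disadvantaged)
    return share_all, share_disadv

# Invented toy cohort: (performance percentile, ESCS percentile)
cohort = [(0.9, 0.1), (0.8, 0.2), (0.5, 0.1), (0.2, 0.15), (0.95, 0.9),
          (0.4, 0.5), (0.7, 0.05), (0.3, 0.8)]

share_all, share_disadv = resilient_share(cohort, cut=0.25)
```

With this toy cohort the resilient group is 25% of all students but 40% of disadvantaged students; the simple ‘multiply by four’ conversion holds exactly only when disadvantaged learners make up exactly a quarter of the cohort.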
This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable with some straightforward multiplication. However the 2006 results foreground science, while in 2009 the focus is reading – and shifts on to maths in 2012.
Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.
PISA 2006 results
The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.
By contrast, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion (since the indicator is based on ‘top third attainment, bottom third disadvantage’).
I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.
I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.
Chart 1 shows these results.
Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)
All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.
Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.
The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.
Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.
The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.
The first chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.
In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.
‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.
Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006
Amongst the jurisdictions for which we have data there is a relatively similar pattern, with between 47% and 56% of students resilient in all three subjects.
In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.
The document summarises the outcomes thus:
‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’
PISA 2009 Results
The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above make them incompatible with those for 2006.
The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.
Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)
The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.
The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.
There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.
The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in the performance of our sample.
Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009
At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).
At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far behind they are at both the top and the bottom of the attainment spectrum.
PISA 2012 Results
In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.
Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)
Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.
Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.
Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.
Within the OECD average, boys have a four percentage point lead over girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy, most markedly so in Poland. In the UK the difference is just two percentage points.
The comparison with disadvantaged low achievers is illustrated in Chart 6.
Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012
Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample with a longer ‘tail’ of low achievers. The reverse is true in the UK, but only just!
‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’
One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, certainly as far as disadvantaged learners are concerned.
The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.
Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.
The UK is not included in this analysis because of issues with its PISA 2003 results.
Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.
It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.
The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.
Resilience in the home countries
By way of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, comparing them with the top performer in my sample for each cycle and with the appropriate OECD average. Problem-solving is omitted.
Only in science (using the ‘top third attainer, bottom third disadvantage’ formula) does the UK exceed the OECD average figure and then only slightly.
In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.
It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.
Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average
This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.
The results show that 41% of English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.
Whether the same relationship holds true in maths and science using the ‘top quartile, bottom quartile’ methodology is unknown. One suspects though that each of the UK figures given above will also apply to England.
The characteristics of resilient learners
‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:
Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (eg. number of books in the home, parental level of education) rather than parental occupation and income.
In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
Students with an immigrant background – either born outside the country of residence or with parents who were born outside it – tend to be under-represented amongst resilient students.
Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.
‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…
….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.
Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…
….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’
Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:
The aforementioned NFER study for Northern Ireland uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland using PISA 2009 data.
It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; classic literature among home possessions 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0-10 books in the home 0.35.
On the positive impact of selection the report observes:
‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’
Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.
Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.
In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.
Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.
Resilience varies in different subjects according to year of education.
There are also significant regional differences.
Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.
This confirms some of the findings above in respect of student characteristics, identifying a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.
School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.
Finally, this commentary from Marc Tucker in the US links its relatively low incidence of resilient students to national views about the nature of ability:
‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability. This leads to much lower expectations for students who come from low-income families…
My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…
… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’
What are we to make of all this?
It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.
The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.
We are not alone in facing this difficulty, although it is significantly more pronounced than in several of our most prominent PISA competitors.
We should be worrying as much about our ‘short head’ as our ‘long tail’.
This is the second part of an extended post considering what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.
Mind the Gap by Clicsouris
Part one provided an England-specific definition, articulated a provisional theoretical model for addressing excellence gaps and set out the published data about the size of excellence gaps at Key Stages 2, 4 and 5, respectively.
Part two continues to review the evidence base for excellence gaps, covering the question whether high attainers remain so, international comparisons data and related research and excellence gaps analysis from the USA.
It also describes those elements of present government policy that impact directly on excellence gaps and offers some recommendations for strengthening our national emphasis on this important issue.
whether pupils in the top 10% at KS4 in 2006 were also high attainers at KS3 in 2004 and KS2 in 2001, by matching back to their fine grade points scores; and
chances of being a KS4 high attainer given a range of pupil characteristics at KS2 and KS3.
On the first point it finds that 4% of all pupils remain in the top 10% throughout, while 83% of pupils are never in the top 10% group.
Some 63% of those who were high attainers at the end of KS2 are still high attainers at the end of KS3, while 72% of KS3 high attainers are still in that group at the end of KS4. Approximately half of high attainers at KS2 are high attainers at KS4.
The calculation is not repeated for advantaged and disadvantaged high attainers respectively, but this shows that – while there is relatively little movement between the high attaining population and other learners (with only 17% of the overall population falling within scope at any point) – there is a sizeable ‘drop out’ amongst high attainers at each key stage.
Turning to the second point, logistic regression is used to calculate the odds of being a KS4 high attainer given different levels of prior attainment and a range of pupil characteristics. Results are controlled to isolate the impact of individual characteristics and for attainment.
The study finds that pupils with a KS2 average points score (APS) above 33 are more likely than not to be high attainers at KS4, and this probability increases as their KS2 APS increases. For those with an APS of 36, the odds are 23.73, equivalent to roughly a 96% (24 in 25) chance of being a KS4 high attainer.
For FSM-eligible learners though, the odds are 0.55, meaning that the chances of being a KS4 high attainer are 45% lower amongst FSM-eligible pupils, compared to their non-FSM counterparts with similar prior attainment and characteristics.
The full set of findings for individual characteristics is reproduced below.
An appendix supplies the exact ratios for each characteristic and the text points out that these can be multiplied to calculate odds ratios for different combinations:
The odds for different prior attainment levels and other characteristics combined with FSM eligibility are not worked through, but could easily be calculated. It would be extremely worthwhile to repeat this analysis using more recent data to see whether the results would be replicated for those completing KS4 in 2014.
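The combination the appendix describes can be sketched in a few lines. Only the two figures quoted above (odds of 23.73 for an APS of 36, an odds ratio of 0.55 for FSM eligibility) are taken from the study; the conversion back to a probability uses the standard identity p = odds / (1 + odds).

```python
def combine_odds(baseline_odds: float, *odds_ratios: float) -> float:
    """Multiply a baseline odds by one or more odds ratios."""
    result = baseline_odds
    for ratio in odds_ratios:
        result *= ratio
    return result

def odds_to_probability(odds: float) -> float:
    """Convert odds (p / (1 - p)) back to a probability."""
    return odds / (1 + odds)

aps36_odds = 23.73   # odds of being a KS4 high attainer with KS2 APS of 36
fsm_ratio = 0.55     # odds ratio associated with FSM eligibility

p_non_fsm = odds_to_probability(aps36_odds)
p_fsm = odds_to_probability(combine_odds(aps36_odds, fsm_ratio))
```

Multiplying the two gives combined odds of about 13, i.e. a roughly 93% chance – still high, but visibly below the 96% chance for otherwise similar non-FSM pupils.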
A footnote says that this calculation was ‘on the basis of their English and maths scores at age 11, and at later stages of schooling’, which is somewhat unclear. A single, unidentified cohort is tracked across key stages.
The report suggests ‘extremely high rates of ‘leakage’ amongst the least privileged pupils’. The key finding is that two-thirds of disadvantaged top performers at KS2 are not amongst the top performers at KS4, whereas only 42% of advantaged top performers fall away in this manner.
Twenty learners were identified who were at the end of KS3 or at KS4 and who had achieved well above predicted levels in English and maths at the end of KS2. Achievement was predicted for the full sample of 2,800 children within the EPPSE study via multi-level modelling, generating:
‘…residual scores for each individual child, indicating the differences between predicted and attained achievement at age 11, while controlling for certain child characteristics (i.e., age, gender, birth weight, and the presence of developmental problems) and family characteristics (i.e., mothers’ education, fathers’ education, socio-economic status [SES] and family income). ‘
The 20 identified as succeeding against the odds had KS2 residual scores for both English and maths within the highest 20% of the sample. ‘Development trajectories’ were created for the group using a range of assessments conducted at age 3, 4, 5, 7, 11 and 14.
The highest job level held in the family when the children were aged 3-4 was manual, semi-skilled or unskilled, or the parent(s) had never worked.
The 20 were randomly selected within each gender – eight boys and 12 girls – while ensuring representation of ‘the bigger minority ethnic groups’. The group included nine students characterised as White UK, five Black Caribbean, two Black African and one each of Indian (Sikh), Pakistani, Mixed Heritage and Indian (Hindu).
Interviews were conducted with the children, their parents and the teacher at their present secondary school whom the learners felt ‘knew them best’. Teacher interviews were secured for 11 of the 20.
Comparison of development trajectories showed significant gaps between this ‘low SES high attainment’ group and a comparative sample of ‘low SES, predicted attainment’ students. They were ahead from the outset and pulled further away.
They also exceeded a comparator group of high SES learners performing at predicted levels from entry to primary education until KS2. Even at KS3, 16 of the 20 were still performing above the mean of the high SES sample.
These profiles – illustrated in the two charts below – were very similar for English and maths. In each case, Group 1 comprises the ‘low SES, high attainment’ students, while Group 4 comprises the ‘high SES, predicted attainment’ students.
Interviews identified five factors that helped to explain this success:
The child’s perceived cognitive ability, strong motivation for school and learning and their hobbies and interests. Most parents and children regarded cognitive ability as ‘inherent to the child’, but they had experienced many opportunities to develop their abilities and received support in developing a ‘positive self-image’. Parenting ‘reflected a belief in the parent’s efficacy to positively influence the child’s learning’. Children also demonstrated ability to self-regulate and positive attitudes to homework. They had a positive attitude to learning and made frequent use of books and computers for this purpose. They used school and learning as distractions from wider family problems. Many were driven to learn, to succeed educationally and achieve future aspirations.
Home context – effective practical and emotional support with school and learning. Families undertook a wide range of learning activities, especially in the early years. These were perceived as enjoyable but also valuable preparation for subsequent schooling. During the primary years, almost all families actively stimulated their children to read. In the secondary years, many parents felt their efforts to regulate their children’s activities and set boundaries were significant. Parents also provided practical support with school and learning, taking an active interest and interacting with their child’s school. Their parenting style is described as ‘authoritative: warm, firm and accepting of their needs for psychological autonomy but demanding’. They set clear standards and boundaries for behaviour while granting extra autonomy as their children matured. They set high expectations and felt strongly responsible for their child’s education and attitude to learning. They believed in their capacity to influence their children positively. Some were motivated by the educational difficulties they had experienced.
(Pre-)School environment – teachers who are sensitive and responsive to the child’s needs and use ‘an authoritative approach to teaching and interactive teaching strategies’; and, additionally, supportive school policies. Parents had a positive perception of the value of pre-school education, though the value of highly effective pre-school provision was not clear-cut with this sample. Moreover, ‘very few clear patterns of association could be discerned between primary school effectiveness and development of rankings on trajectories’. That said, both parents and children recognised that their schools had helped them address learning and behavioural difficulties. Success was attributed to the quality of teachers. ‘They thought that good quality teaching meant that teachers were able to explain things clearly, were enthusiastic about the subject they taught, were approachable when things were difficult to understand, were generally friendly, had control over the class and clearly communicated their expectations and boundaries.’
Peers providing practical, emotional and motivational support. Friends were especially valuable in helping them to respond to difficulties, and in helping with class work, homework and revision. Such support was often mutual: undertaking the role of teacher helped to build understanding and develop self-esteem. Friends also provided role models and competitors.
Similar support provided by the extended family and wider social, cultural and religious communities. Parents encouraged their children to take part in extra-curricular activities and were often aware of their educational benefits. Family networks often provided additional learning experiences, particularly for Caribbean and some Asian families.
‘…starting secondary school in Year 7 attaining level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and or mathematics at the end of Key Stage 2.’ (Footnote p6-7)
There is relatively little data in the report about the performance of high-attaining disadvantaged learners, other than the statement that only 58% of FSM students within the ‘most able’ population in KS2 and attending non-selective secondary schools go on to achieve A*-B GCSE grades in English and maths, compared with 75% of non-FSM pupils, giving a gap of 17 percentage points.
I have been unable to find national transition matrices for advantaged and disadvantaged learners, which would enable us to compare the proportions of advantaged and disadvantaged pupils making and exceeding the expected progress between key stages.
Regression to the mean and efforts to circumvent it
Much prominence has been given to Feinstein’s 2003 finding that, whereas high-scoring children from advantaged and disadvantaged backgrounds (defined by parental occupation) perform at a broadly similar level when tested at 22 months, the disadvantaged group are subsequently overtaken by relatively low-scoring children from advantaged backgrounds during the primary school years.
The diagram that summarises this relationship has been reproduced widely and much used as the centrepiece of arguments justifying efforts to improve social mobility.
But Feinstein’s findings were subsequently challenged on methodological grounds associated with the effects of regression to the mean.
‘There is currently an overwhelming view amongst academics and policymakers that highly able children from poor homes get overtaken by their affluent (but less able) peers before the end of primary school. Although this empirical finding is treated as a stylised fact, the methodology used to reach this conclusion is seriously flawed. After attempting to correct for the aforementioned statistical problem, we find little evidence that this is actually the case. Hence we strongly recommend that any future work on high ability–disadvantaged groups takes the problem of regression to the mean fully into account.’
‘Although some doubt has been raised regarding this analysis on account of the potential for regression to the mean to exaggerate the phenomenon (Jerrim and Vignoles, 2011), it is highly unlikely that this would overturn the core finding that high SES, lower ability children catch up with their low-SES, higher-ability peers.’
This research adopts a methodological route to minimise the impact of regression to the mean: learners are assigned to achievement groups using a different test from those used to follow their attainment trajectories, and the analysis focuses principally on those trajectories from KS2 onwards.
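The effect itself is easy to demonstrate with a toy simulation (my own illustrative sketch, not the study’s method): fix each pupil’s underlying ability, add independent noise to two tests, select the top group on the first test, and watch that group’s average fall on the second.

```python
import random

random.seed(42)

N = 10_000
ability = [random.gauss(0, 1) for _ in range(N)]   # fixed 'true' ability
test1 = [a + random.gauss(0, 1) for a in ability]  # noisy measurement 1
test2 = [a + random.gauss(0, 1) for a in ability]  # noisy measurement 2

# Select the top decile on test 1...
cutoff = sorted(test1)[int(0.9 * N)]
top = [i for i in range(N) if test1[i] >= cutoff]

# ...and compare the group's mean score on each test
mean1 = sum(test1[i] for i in top) / len(top)
mean2 = sum(test2[i] for i in top) / len(top)
print(round(mean1, 2), round(mean2, 2))   # the second mean is markedly lower
```

Selecting on one test and tracking on another – as the study does – avoids conflating this purely statistical fall with any genuine change in relative attainment.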
The high-attaining group is defined as those achieving Level 3 or above in KS1 writing, a threshold met by 12.6% of the sample. (For comparison, the same calculations are undertaken based on achieving Level 3 or above in KS1 maths.) These pupils are ranked and assigned a percentile on the basis of their performance on the remaining KS1 tests and at each subsequent key stage.
The chart summarising the outcomes in the period from KS1 to KS4 is reproduced below, showing the different trajectories of the ‘most deprived’ and ‘least deprived’. These are the upper and lower quintile groups of state school students, derived from FSM eligibility together with a set of area-based measures of disadvantage and census-derived measures of socio-economic status.
The trajectories do not alter significantly beyond KS4.
The study concludes:
‘…children from poorer backgrounds who are high attaining at age 7 are more likely to fall off a high attainment trajectory than children from richer backgrounds. We find that high-achieving children from the most deprived families perform worse than lower-achieving students from the least deprived families by Key Stage 4. Conversely, lower-achieving affluent children catch up with higher-achieving deprived children between Key Stage 2 and Key Stage 4.’
‘The period between Key Stage 2 and Key Stage 4 appears to be a crucial time to ensure that higher-achieving pupils from poor backgrounds remain on a high achievement trajectory.’
In short, a Feinstein-like relationship is established but it operates at a somewhat later stage in the educational process.
International comparisons studies
OECD PISA studies have recently begun to report on the performance of what they call ‘resilient’ learners.
This publication uses PISA 2006 science results as the basis of its calculations. The relative position of different countries is shown in the chart reproduced below. Hong Kong tops the league at 24.8%, the UK is at 13.5%, slightly above the OECD average of 13%, while the USA is languishing at 9.9%.
The findings were discussed further in PISA in Focus 5 (OECD 2011), where PISA 2009 data is used to make the calculation. The methodology is also significantly adjusted, so that it includes a substantially smaller population:
‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’
According to this measure, the UK stands at 24% and the US has leapfrogged it at 28%. Both are below the OECD average of 31%, while Shanghai and Hong Kong stand at over 70%.
The Report on PISA 2012 (OECD 2013) retains the more demanding definition of resilience, but dispenses with the multiplication by 4, so the 2012 results must be multiplied by 4 to be comparable with those for 2009.
This time round, Shanghai is at 19.2% (76.8%) and Hong Kong at 18.1% (72.4%). The OECD average is 6.4% (25.6%), the UK at 5.8% (23.2%) and the US at 5.2% (20.8%).
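The rescaling is simple enough to verify directly (a sketch using the 2012 percentages quoted above):

```python
# Resilient students as a share of ALL students, PISA 2012 (%), per the report
share_all = {'Shanghai': 19.2, 'Hong Kong': 18.1, 'OECD average': 6.4,
             'UK': 5.8, 'US': 5.2}

# Disadvantaged pupils are defined as the bottom quarter of the ESCS index,
# so multiplying by 4 rescales to the share among disadvantaged students only.
share_disadvantaged = {k: round(v * 4, 1) for k, v in share_all.items()}
print(share_disadvantaged)
# {'Shanghai': 76.8, 'Hong Kong': 72.4, 'OECD average': 25.6, 'UK': 23.2, 'US': 20.8}
```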
So the UK has lost a little ground compared with 2009, but is much closer to the OECD average and has overtaken the US, which has fallen back by some seven percentage points.
I could find no commentary on these changes.
NFER has undertaken some work on resilience in Northern Ireland, using PISA 2009 reading results (and the original ‘one third’ methodology) as a base. This includes odds ratios for different characteristics of being resilient. This could be replicated for England using PISA 2012 data and the latest definition of resilience.
It quantifies the proportion of these two populations within each decile of achievement, so generating a gradient, before reviewing how this gradient has changed between PISA 2000 and PISA 2009, comparing outcomes for England, Australia, Canada, Finland, Germany and the US.
Jerrim summarises his findings thus:
‘The difference between advantaged and disadvantaged children’s PISA 2009 reading test scores in England is similar (on average) to that in most other developed countries (including Australia, Germany and, to some extent, the US). This is in contrast to previous studies from the 1990s, which suggested that there was a particularly large socio-economic gap in English pupils’ academic achievement.
Yet the association between family background and high achievement seems to be stronger in England than elsewhere.
There is some evidence that the socio-economic achievement gradient has been reduced in England over the last decade, although not amongst the most able pupils from advantaged and disadvantaged homes.’
Jerrim finds that the link in England between family background and high achievement is stronger than in most other OECD countries, whereas this is not the case at the other end of the distribution.
He hypothesises that this might be attributable to the recent policy focus on reducing the ‘long tail’ while:
‘much less attention seems to be paid to helping disadvantaged children who are already doing reasonably well to push on and reach the top grades’.
He dismisses the notion that the difference is associated with the fact that disadvantaged children are concentrated in lower-performing schools, since it persists even when controls for school effects are introduced.
In considering why PISA scores show the achievement gap in reading has reduced between 2000 and 2009 at the lower end of the attainment distribution but not at the top, he cites two possibilities: that Government policy has been disproportionately successful at the lower end; and that there has been a more substantial decline in achievement amongst learners from advantaged backgrounds than amongst their disadvantaged peers. He is unable to rule out the latter possibility.
He also notes in passing that PISA scores in maths do not generate the same pattern.
This finds that high-achieving (top decile of the test distribution) boys from the most advantaged quintile in England are two years and seven months ahead of high-achieving boys from the most disadvantaged quintile, while the comparable gap for girls is slightly lower, at two years and four months.
The chart reproduced below illustrates international comparisons for boys. It shows that only Scotland has a larger high achievement gap than England. (The black lines indicate 99% confidence intervals – he attributes the uncertainty to ‘sampling variation’.)
Gaps in countries at the bottom of the table are approximately half the size of those in England and Scotland.
One of the report’s recommendations is that:
‘The coalition government has demonstrated its commitment to disadvantaged pupils by establishing the Education Endowment Foundation… A key part of this Foundation’s future work should be to ensure highly able children from disadvantaged backgrounds succeed in school and have the opportunity to enter top universities and professional jobs. The government should provide additional resources to the foundation to trial interventions that specifically target already high achieving children from disadvantaged homes. These should be evaluated using robust evaluation methodologies (e.g. randomised control trials) so that policymakers develop a better understanding of what schemes really have the potential to work.’
The study is published by the Sutton Trust whose Chairman – Sir Peter Lampl – is also chairman of the EEF.
In ‘Family background and access to high ‘status’ universities’ (2013) Jerrim provides a different chart showing estimates by country of disadvantaged high achieving learners. The measure of achievement is PISA Level 5 in reading and the measure of disadvantage remains quintiles derived from the ISEI index.
The international tests selected are TIMSS 2003, 4th grade; TIMSS 2007, 8th grade and PISA 2009. The differences between what these tests measure are described as ‘slight’. The analysis of achievement relies on deciles of the achievement distribution.
Thirteen comparator countries are included, comprising six wealthy western economies, three ‘middle income’ western economies and four Asian Tigers (Hong Kong, Japan, Singapore and Taiwan).
This study applies as the best available proxy for socio-economic status the number of books in the family home, comparing the most advantaged (over 200 books) with the least (under 25 books). It acknowledges the limitations of this proxy, which Jerrim discusses elsewhere.
The evidence suggests that:
‘between primary school and the end of secondary school, the gap between the lowest achieving children in England and the lowest achieving children in East Asian countries is reduced’
but remains significant.
Conversely, results for the top 10% of the distribution:
‘suggest that the gap between the highest achieving children in England and the highest achieving children in East Asia increases between the end of primary school and the end of secondary school’.
The latter outcome is illustrated in the chart reproduced below.
The authors do not consider variation by socio-economic background amongst the high-achieving cohort, presumably because the data still does not support the pattern they previously identified for reading.
This focuses exclusively on gaps attributable to socio-economic status, by comparing the performance of those in the top and bottom halves of the family income distribution in the US, as adjusted for family size.
The achievement measure is top quartile performance on nationally normalised exams administered within two longitudinal studies: The National Education Longitudinal Study (NELS) and the Baccalaureate and Beyond Longitudinal Study (B&B).
The study reports that relatively few lower income students remain high achievers throughout their time in elementary and high school:
56% remain high achievers in reading by Grade 5, compared with 69% of higher income students.
25% fall out of the high achiever cohort in high school, compared with 16% of higher income students.
Higher income learners who are not high achievers in Grade 1 are more than twice as likely to be high achievers by Grade 5. The same is true between Grades 8 and 12.
2007 also saw the publication of ‘Overlooked Gems: A national perspective on low income promising learners’ (Van Tassel-Baska and Stambaugh). This is a compilation of the proceedings of a 2006 conference which does not attempt a single definition of the target group, but draws on a variety of different research studies and programmes, each with different starting points.
‘Differences between subgroups of students performing at the highest levels of achievement’
The measures of high achievement deployed are the advanced standards on US NAEP maths and reading tests, at Grades 4 and 8 respectively.
The study identifies gaps based on four sets of learner characteristics:
Socio-economic status (eligible or not for free or reduced price lunch).
Ethnic background (White versus Black and/or Hispanic).
English language proficiency (what we in England would call EAL, compared with non-EAL).
Gender (girls versus boys).
Each characteristic is dealt with in isolation, so there is no discussion of the gaps between – for example – disadvantaged Black/Hispanic and disadvantaged White boys.
In relation to socio-economic achievement gaps, Plucker et al find that:
In Grade 4 maths, from 1996 to 2007, the proportion of advantaged learners achieving the advanced level increased by 5.6 percentage points, while the proportion of disadvantaged learners doing so increased by 1.2 percentage points. In Grade 8 maths, these percentage point changes were 5.7 and 0.8 percentage points respectively. Allowing for changes in the size of the advantaged and disadvantaged cohorts, excellence gaps are estimated to have widened by 4.1 percentage points in Grade 4 (to 7.3%) and 4.9 percentage points in Grade 8 (to 8.2%).
In Grade 4 reading, from 1998 to 2007, the proportion of advantaged learners achieving the advanced level increased by 1.2 percentage points, while the proportion of disadvantaged students doing so increased by 0.8 percentage points. In Grade 8 reading, these percentage point changes were almost negligible for both groups. The Grade 4 excellence gap is estimated to have increased slightly, by 0.4 percentage points (to 9.4%) whereas Grade 8 gaps have increased minimally by 0.2 percentage points (to 3.1%).
They observe that the size of excellence gaps is, at best, only moderately correlated with the size of gaps at lower levels of achievement.
There is a weak relationship between gaps at basic and advanced level – indeed ‘smaller achievement gaps among minimally competent students is related to larger gaps among advanced students’ – but there is some inter-relationship between those at proficient and advanced level.
They conclude that, whereas No Child Left Behind (NCLB) helped to narrow achievement gaps, this does not extend to high achievers.
There is no substantive evidence that the NCLB focus on lower achievers has increased the excellence gap, although the majority of states surveyed by the NAGC felt that NCLB had diverted attention and resource away from gifted education.
They do not report outcomes for disadvantaged high achievers, but do consider briefly those attending schools with high and low proportions respectively of students eligible for free and reduced price lunches.
For this section of the report, high achievement is defined as ‘those whose math or reading scores placed them within the top ten per cent of their individual grades and schools’. Learners were tracked from Grades 3 to 5 and Grades 6 to 8.
It is described as exploratory, because the sample was not representative.
‘High-achieving students attending high-poverty schools made about the same amount of academic growth over time as their high-achieving peers in low-poverty schools…It appears that the relationship between a school’s poverty rate and the growth of its highest-achieving students is weak. In other words, attending a low-poverty school adds little to the average high achiever’s prospects for growth.’
The wider study was criticised in a review by the NEPC, in part on the grounds that the results may have been distorted by regression to the mean, a shortcoming only briefly discussed in an appendix.
This is the report of a national summit on the issue convened in that year by the NAGC.
It follows Plucker (one of the summit participants) in using as its starting point the achievement of the advanced level on selected NAEP assessments by learners eligible for free and reduced price lunches.
But it also reports some additional outcomes for Grade 12 and for assessments of civics and writing:
‘Since 1998, 1% or fewer of 4th-, 8th-, and 12th-grade free or reduced lunch students, compared to between 5% and 6% of non-eligible students scored at the advanced level on the NAEP civics exam.
Since 1998, 1% or fewer of free and reduced lunch program-eligible students scored at the advanced level on the eighth-grade NAEP writing exam while the percentage of non-eligible students who achieved advanced scores increased from 1% to 3%.’
The bulk of the report is devoted to identifying barriers to progress and offering recommendations for improving policy, practice and research. I provided an extended analysis in this post from May 2013.
It updates the findings in that report, as set out above:
In Grade 4 maths, from 1996 to 2011, the proportion of advantaged students scoring at the advanced level increased by 8.3 percentage points, while the proportion of disadvantaged learners doing so increased by 1.5 percentage points. At Grade 8, the comparable changes were 8.5 percentage points and 1.5 percentage points respectively. Excellence gaps have increased by 6.8 percentage points at Grade 4 (to 9.6%) and by 7 percentage points at Grade 8 (to 10.3%).
In Grade 4 reading, from 1998 to 2011, the proportion of advantaged students scoring at the advanced level increased by 2.6 percentage points, compared with an increase of 0.9 percentage points amongst disadvantaged learners. Grade 8 saw equivalent increases of 1.8 and 0.9 percentage points respectively. Excellence gaps are estimated to have increased at Grade 4 by 1.7 percentage points (to 10.7%) and marginally increased at Grade 8 by 0.9 percentage points (to 4.2%).
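The reported changes in the 2013 update can be recovered by simple subtraction of the percentage-point increases quoted above (unlike the 2010 figures, which were additionally adjusted for changes in cohort size):

```python
# (advantaged increase, disadvantaged increase) in percentage points, 1996/98-2011
increases = {
    'Grade 4 maths':   (8.3, 1.5),
    'Grade 8 maths':   (8.5, 1.5),
    'Grade 4 reading': (2.6, 0.9),
    'Grade 8 reading': (1.8, 0.9),
}
for subject, (adv, dis) in increases.items():
    # change in excellence gap = advantaged increase - disadvantaged increase
    print(subject, round(adv - dis, 1))   # 6.8, 7.0, 1.7 and 0.9 respectively
```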
In short, many excellence gaps remain large and most continue to grow. The report’s recommendations are substantively the same as those put forward in 2010.
How Government education policy impacts on excellence gaps
Although many aspects of Government education policy may be expected to have some longer-term impact on raising the achievement of all learners, advantaged and disadvantaged alike, relatively few interventions are focused exclusively and directly on closing attainment gaps between advantaged and disadvantaged learners – and so have the potential to make a significant difference to excellence gaps.
The most significant of these include:
The Pupil Premium:
In November 2010, the IPPR voiced concerns that the benefits of the pupil premium might not reach all those learners who attract it.
Accordingly, they recommended that the pupil premium should be allocated directly to those learners through an individual Pupil Premium Entitlement, which might be used to support a menu of approved activities, including ‘one-to-one teaching to stretch the most able low income pupils’.
The recommendation has not been repeated and the present Government shows no sign of restricting schools’ freedom to use the premium in this manner.
‘Assess the level and use of the Pupil Premium to ensure value for money, and that it is targeted to enhance the life chances of children facing the biggest challenges, whether from special needs or from the nature of the background and societal impact they have experienced.’
In February 2013 Ofsted reported that schools spending the pupil premium successfully to improve achievement:
‘Never confused eligibility for the Pupil Premium with low ability, and focused on supporting their disadvantaged pupils to achieve the highest levels’.
Conversely, where schools were less successful in spending the funding, they:
‘focused on pupils attaining the nationally expected level at the end of the key stage…but did not go beyond these expectations, so some more able eligible pupils underachieved.’
In July 2013, DfE’s Evaluation of Pupil Premium reported that, when schools decided which disadvantaged pupils to target for support, the top criterion was ‘low attainment’, applied in 91% of primary schools and 88% of secondary schools.
In June 2013, in ‘The Most Able Students’, Ofsted reported that:
‘Pupil Premium funding was used in only a few instances to support the most able students who were known to be eligible for free school meals. The funding was generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds.’
Accordingly, it gave a commitment that:
‘Ofsted will… consider in more detail during inspection how well the pupil premium is used to support the most able students from disadvantaged backgrounds.’
However, this commitment was not fully translated into the school inspection guidance, which says only that:
‘Inspectors should pay particular attention to whether more able pupils in general and the most able pupils in particular are achieving as well as they should. For example, does a large enough proportion of those pupils who had the highest attainment at the end of Key Stage 2 in English and mathematics achieve A*/A GCSE grades in these subjects by the age of 16?
Inspectors should summarise the achievements of the most able pupils in a separate paragraph of the inspection report.’
There is no reference to the most able in parallel references to the pupil premium.
There has, however, been some progress in giving learners eligible for the pupil premium priority in admission to selective schools.
‘Thirty [grammar] schools have been given permission by the Department for Education to change their admissions policies already. The vast majority of these will introduce the changes for children starting school in September 2015…A small number – five or six – have already introduced the reform.’
The National Grammar Schools Association confirmed that:
‘A significant number of schools [38] have either adopted an FSM priority or consulted about doing so in the last admissions round. A further 59 are considering doing so in the next admissions round.’
In July 2014, the Government launched a consultation on the School Admissions Code which proposes extending to all state-funded schools the option to give priority in their admission arrangements to learners eligible for the pupil premium. This was previously open to academies and free schools via their funding agreements.
The Education Endowment Foundation (EEF)
The EEF describes itself as:
‘An independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.’
The 2010 press release announcing its formation emphasised its role in raising standards in underperforming schools. This was reinforced by the Chairman in a TES article from June 2011:
‘So the target group for EEF-funded projects in its first couple of years are pupils eligible for free school meals in primary and secondary schools underneath the Government’s floor standards at key stages 2 and 4. That’s roughly 1,500 schools up and down the country. Projects can benefit other schools and pupils, as long as there is a significant focus on this core target group of the most needy young people in the most challenging schools.’
I have been unable to trace any formal departure from this position, though it no longer appears in this form in the Foundation’s guidance. The Funding FAQs say only:
‘In the case of projects involving the whole school, rather than targeted interventions, we would expect applicants to be willing to work with schools where the proportion of FSM-eligible pupils is well above the national average and/or with schools where FSM-eligible pupils are under-performing academically.’
I can find no EEF-funded projects that are exclusively or primarily focused on high-attaining disadvantaged learners, though a handful of its reports do refer to the impact on this group.
Changes to School Accountability Measures
As we have seen in Part one, the School Performance Tables currently provide very limited information about the performance of disadvantaged high achievers.
The July 2013 consultation document on primary assessment and accountability reform included a commitment to publish a series of headline measures in the tables including:
‘How many of the school’s pupils are among the highest-attaining nationally, by…showing the percentage of pupils attaining a high scaled score in each subject.’
Moreover, it added:
‘We will publish all the headline measures to show the attainment and progress of pupils for whom the school is in receipt of the pupil premium.’
Putting two and two together, this should mean that, from 2016, we will be able to see the percentage of pupil premium-eligible students achieving a high scaled score, though we do not yet know what ‘high scaled score’ means, nor do we know whether the data will be for English and maths separately or combined.
The October 2013 response to the secondary assessment and accountability consultation document fails to say explicitly whether excellence gap measures will be published in School Performance Tables.
It mentions that:
‘Schools will now be held to account for (a) the attainment of their disadvantaged pupils, (b) the progress made by their disadvantaged pupils, and (c) the in-school gap in attainment between disadvantaged pupils and their peers.’
Meanwhile a planned data portal will contain:
‘the percentage of pupils achieving the top grades in GCSEs’
but the interaction between these two elements, if any, remains unclear.
The March 2014 response to the consultation on post-16 accountability and assessment says:
‘We intend to develop measures covering all five headline indicators for students in 16-19 education who were in receipt of pupil premium funding in year 11.’
The post-16 headline measures include a new progress measure and an attainment measure showing the average points score across all level 3 qualifications.
It is expected that a destination measure will also be provided, as long as the methodology can be made sufficiently robust. The response says:
‘A more detailed breakdown of destinations data, such as entry to particular groups of universities, will continue to be published below the headline. This will include data at local authority level, so that destinations for students in the same area can be compared.’
and this should continue to distinguish the destinations of disadvantaged students.
Additional A level attainment measures – the average grade across the best three A levels and the achievement of AAB grades with at least two in facilitating subjects – seem unlikely to be differentiated according to disadvantage.
There remains a possibility that much more excellence gap data, for primary, secondary and post-16, will be made available through the planned school portal, but no specification had been made public at the time of writing.
More worryingly, recent news reports have suggested that the IT project developing the portal and the ‘data warehouse’ behind it has been abandoned. The statements refer to continuing to deliver ‘the school performance tables and associated services’, but there is no clarification of whether this latter phrase includes the portal. Given the absence of an official statement, one suspects the worst.
DfE has cancelled RM's £24m School Performance Data Programme contract http://t.co/42bh7oGkXv So no Data Portal behind Performance Tables?
The Social Mobility and Child Poverty Commission (SMCPC)
The Commission was established with the expectation that it would ‘hold the Government’s feet to the fire’ to encourage progress on these two topics.
It publishes annual ‘state of the nation’ reports that are laid before Parliament and also undertakes ‘social mobility advocacy’.
The first annual report – already referenced in Part one – was published in November 2013. The second is due in October 2014.
The Chairman of the Commission was less than complimentary about the quality of the Government’s response to its first report, which made no reference to its comments about attainment gaps at higher grades. It remains to be seen whether the second will be taken any more seriously.
The Commission has already shown significant interest in disadvantaged high achievers – in June 2014 it published the study ‘Progress made by high-attaining children from disadvantaged backgrounds’ referenced above – so there is every chance that the topic will feature again in the 2014 annual report.
The Commission is of course strongly interested in the social mobility indicators and progress made against them, so may also include recommendations for how they might be adjusted to reflect changes to the schools accountability regime set out above.
Recommended reforms to close excellence gaps
Several proposals emerge from the commentary on current Government policy above:
It would be helpful to have further evaluation of the pupil premium to check whether high-achieving disadvantaged learners are receiving commensurate support. Schools need further guidance on ways in which they can use the premium to support high achievers. This should also be a focus for the Pupil Premium Champion and in pupil premium reviews.
Ofsted’s school inspection handbook requires revision to fulfil its commitment to focus on the most able in receipt of the premium. Inspectors also need guidance (published so schools can see it) to ensure common expectations are applied across institutions. These provisions should be extended to the post-16 inspection regime.
All selective secondary schools should be invited to prioritise pupil premium recipients in their admissions criteria, with the Government reserving the right to impose this on schools that do not comply voluntarily.
The Education Endowment Foundation should undertake targeted studies of interventions to close excellence gaps, but should also ensure that the impact on excellence gaps is mainstreamed in all the studies they fund. (This should be straightforward since their Chairman has already called for action on this front.)
The Government should consider the case for the inclusion of data on excellence gaps in all the headline measures in the primary, secondary and post-16 performance tables. Failing that, such data (percentages and numbers) should be readily accessible from a new data portal as soon as feasible, together with historical data of the same nature. (If the full-scale portal is no longer deliverable, a suitable alternative openly accessible database should be provided.) It should also publish annually a statistical analysis of all excellence gaps and the progress made towards closing them. As much progress as possible should be made before the new assessment and accountability regime is introduced. At least one excellence gap measure should be incorporated into revised DfE impact indicators and the social mobility indicators.
The Social Mobility and Child Poverty Commission (SMCPC) should routinely consider the progress made in closing excellence gaps within its annual report – and the Government should commit to consider seriously any recommendations they offer to improve such progress.
This leaves the question of whether there should be a national programme dedicated to closing excellence gaps, and so improving fair access to competitive universities. (It makes excellent sense to combine these twin objectives and to draw on the resources available to support the latter.)
Much of the research above – whether it originates in the US or UK – argues for dedicated state/national programmes to tackle excellence gaps.
More recently, the Sutton Trust has published a Social Mobility Manifesto for 2015 which recommends that the next government should:
‘Reintroduce ring-fenced government funding to support the most able learners (roughly the top ten per cent) in maintained schools and academies from key stage three upwards. This funding could go further if schools were required to provide some level of match funding.
Develop an evidence base of effective approaches for highly able pupils and ensure training and development for teachers on how to challenge their most able pupils most effectively.
Make a concerted effort to lever in additional support from universities and other partners with expertise in catering for the brightest pupils, including through creating a national programme for highly able learners, delivered through a network of universities and accessible to every state-funded secondary school serving areas of disadvantage.’
This is not as clear as it might be about the balance between support for the most able and the most able disadvantaged respectively.
‘A light touch framework that will supply the essential minimum scaffolding necessary to support effective market operation on the demand and supply sides simultaneously…
The centrepiece of the framework would be a structured typology or curriculum comprising the full range of knowledge, skills and understanding required by disadvantaged students to equip them for progression to selective higher education
On the demand side this would enable educational settings to adopt a consistent approach to needs identification across the 11-19 age range. Provision from 11-14 might be open to any disadvantaged learner wishing to access it, but provision from 14 onwards would depend on continued success against challenging attainment targets.
On the supply side this would enable the full range of providers – including students’ own educational settings – to adopt a consistent approach to defining which knowledge, skills and understanding their various programmes and services are designed to impart. They would be able to qualify their definitions according to the age, characteristics, selectivity of intended destination and/or geographical location of the students they serve.
With advice from their educational settings, students would periodically identify their learning needs, reviewing the progress they had made towards personal targets and adjusting their priorities accordingly. They would select the programmes and services best matched to their needs….
…Each learner within the programme would have a personal budget dedicated to purchasing programmes and services with a cost attached. This would be fed from several sources including:
Their annual Pupil Premium allocation (currently £935 per year) up to Year 11.
A national fund fed by selective higher education institutions. This would collect a fixed minimum topslice from each institution’s outreach budget, supplemented by an annual levy on those failing to meet demanding new fair access targets. (Institutions would also be incentivised to offer programmes and services with no cost attached.)
Philanthropic support, bursaries, scholarships, sponsorships and in-kind support sourced from business, charities, higher education, independent schools and parents. Economic conditions permitting, the Government might offer to match any income generated from these sources.’
We know far less than we should about the size of excellence gaps in England – and whether or not progress is being made in closing them.
I hope that this post makes some small contribution towards rectifying matters, even though the key finding is that the picture is fragmented and extremely sketchy.
Rudimentary as it is, this survey should provide a baseline of sorts, enabling us to judge more easily what additional information is required and how we might begin to frame effective practice, whether at institutional or national level.
This post examines what we know – and do not know – about high attainment gaps between learners from advantaged and disadvantaged backgrounds in England.
Mind the Gap by Clicsouris
It assesses the capacity of current national education policy to close these gaps and recommends further action to improve the prospects of doing so rapidly and efficiently.
Because the post is extremely long I have divided it into two parts.
Part one comprises:
A working definition for the English context, explanation of the significance of excellence gaps, description of how this post relates to earlier material and provisional development of the theoretical model articulated in those earlier posts.
A summary of the headline data on socio-economic attainment gaps in England, followed by a review of published data relevant to excellence gaps at primary, secondary and post-16 levels.
A distillation of research evidence, including material on whether disadvantaged high attainers remain so, international comparisons studies and research derived from them, and literature covering excellence gaps in the USA.
A brief review of how present Government policy might be expected to impact directly on excellence gaps, especially via the Pupil Premium, school accountability measures, the Education Endowment Foundation (EEF) and the Social Mobility and Child Poverty Commission (SMCPC). I have left to one side the wider set of reforms that might have an indirect and/or longer-term impact.
Some recommendations for strengthening our collective capacity to quantify, address and ultimately close excellence gaps.
The post is intended to synthesise, supplement and update earlier material, so providing a baseline for further analysis – and ultimately consideration of further national policy intervention, whether under the present Government or a subsequent administration.
It does not discuss the economic and social origins of educational disadvantage, or the merits of wider policy to eliminate poverty and strengthen social mobility.
It starts from the premiss that, while education reform cannot eliminate the effects of disadvantage, it can make a significant, positive contribution by improving significantly the life chances of disadvantaged learners.
It does not debate the fundamental principle that, when prioritising educational support to improve the life chances of learners from disadvantaged backgrounds, governments should not discriminate on the basis of ability or prior attainment.
It assumes that optimal policies will deliver improvement for all disadvantaged learners, regardless of their starting point. It suggests, however, that intervention strategies should aim for equilibrium, prioritising gaps that are furthest away from it and taking account of several different variables in the process.
A working definition for the English context
The literature in Part two reveals that there is no accepted universal definition of excellence gaps, so I have developed my own England-specific working definition for the purposes of this post.
An excellence gap is:
‘The difference between the percentage of disadvantaged learners who reach a specified age- or stage-related threshold of high achievement – or who secure the requisite progress between two such thresholds – and the percentage of all other eligible learners that do so.’
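In arithmetical terms the definition reduces to a difference of two percentages. A minimal sketch, using illustrative figures rather than real data (the function and variable names are my own):

```python
def excellence_gap(pct_other_high, pct_disadvantaged_high):
    """Excellence gap in percentage points: the shortfall of disadvantaged
    learners reaching a high-achievement threshold, relative to all other
    eligible learners reaching the same threshold."""
    return pct_other_high - pct_disadvantaged_high

# Illustrative figures only (not real data): if 10% of other learners and
# 4% of disadvantaged learners reach the threshold, the excellence gap is
# 6 percentage points.
print(excellence_gap(10.0, 4.0))  # 6.0
```

The same subtraction applies whether the threshold is an attainment measure or a progress measure between two thresholds.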
This demands further clarification of what typically constitutes a disadvantaged learner and a threshold of high achievement.
In the English context, the measures of disadvantage with the most currency are FSM eligibility (eligible for and receiving free school meals) and eligibility for the deprivation element of the pupil premium (eligible for and receiving FSM at some point in the preceding six years – often called ‘ever 6’).
Throughout this post, for the sake of clarity, I have given priority to the former over the latter, except where the former is not available.
The foregrounded characteristic is socio-economic disadvantage, but this does not preclude analysis of the differential achievement of sub-groups defined according to secondary characteristics including gender, ethnic background and learning English as an additional language (EAL) – as well as multiple combinations of these.
Some research is focused on ‘socio-economic gradients’, which show how gaps vary at different points of the achievement distribution on a given assessment.
The appropriate thresholds of high achievement are most likely to be measured through national assessments of pupil attainment, notably end of KS2 tests (typically Year 6, age 11), GCSE and equivalent examinations (typically Year 11, age 16) and A level and equivalent examinations (typically Year 13, age 18).
Alternative thresholds of high achievement may be derived from international assessments, such as PISA, TIMSS or PIRLS.
Occasionally – and especially in the case of these international studies – an achievement threshold is statistically derived, in the form of a percentile range of performance, rather than with reference to a particular grade, level or score. I have not allowed for this within the working definition.
Progress measures typically relate to the distance travelled between: baseline assessment (currently at the end of KS1 – Year 2, age 7 – but scheduled to move to Year R, age 4) and end of KS2 tests; or between KS2 tests and the end of KS4 (GCSE); or between GCSE and the end of KS5 (Level 3/A level).
Some studies extend the concept of progress between two thresholds to a longitudinal approach that traces how disadvantaged learners who achieve a particular threshold perform throughout their school careers – do they sustain early success, or fall away, and what proportion are ‘late bloomers’?
Why are excellence gaps important?
Excellence gaps are important for two different sets of reasons: those applying to all achievement gaps and those which apply more specifically or substantively to excellence gaps.
Under the first heading:
The goal of education should be to provide all learners, including disadvantaged learners, with the opportunity to maximise their educational potential, so eliminating ‘the soft bigotry of low expectations’.
Schools should be ‘engines of social mobility’, helping disadvantaged learners to overcome their backgrounds and compete equally with their more advantaged peers.
International comparisons studies reveal that the most successful education systems can and do raise attainment for all and close socio-economic achievement gaps simultaneously.
There is a strong economic case for reducing – and ideally eradicating – underachievement attributable to disadvantage.
Under the second heading:
An exclusive or predominant focus on gaps at the lower end of the attainment distribution is fundamentally inequitable and tends to reinforce the ‘soft bigotry of low expectations’.
Disadvantaged learners benefit from successful role models – predecessors or peers from a similar background who have achieved highly and are reaping the benefits.
An economic imperative to increase the supply of highly-skilled labour will place greater emphasis on the top end of the achievement distribution. Some argue that there is a ‘smart fraction’ tying national economic growth to a country’s stock of high achievers. There may be additional spin-off benefits from increasing the supply of scientists, writers, artists, or even politicians!
The most highly educated disadvantaged learners are least likely to confer disadvantage on their children, so improving the proportion of such learners may tend to improve inter-generational social mobility.
Excellence gaps are rarely identified as such – the term is not yet in common usage in UK education, though it has greater currency in the US. Regardless of terminology, they rarely receive attention, either as part of a wider set of achievement gaps, or separately in their own right.
Relationship with earlier posts
Since this blog was founded in April 2010 I have written extensively about excellence gaps and how to address them.
I have also written about excellence gaps in New Zealand – Part 1 and Part 2 (June 2012) – but do not draw on that material here.
Gifted education (or apply your alternative term) is amongst those education policy areas most strongly influenced by political and ideological views on the preferred balance between excellence and equity. This is particularly true of decisions about how best to address excellence gaps.
The excellence-equity trade-off was identified in my first post (May 2010) as one of three fundamental polarities that determine the nature of gifted education and provide the basis for most discussion about what form it should take.
‘Gifted education is about balancing excellence and equity. That means raising standards for all while also raising standards faster for those from disadvantaged backgrounds.
Through combined support for excellence and equity we can significantly increase our national stock of high level human capital and so improve economic growth…
…Excellence in gifted education is about maximising the proportion of high achievers reaching advanced international benchmarks (eg PISA, TIMSS and PIRLS) so increasing the ‘smart fraction’ which contributes to economic growth.
Equity in gifted education is about narrowing (and ideally eliminating) the excellence gap between high achievers from advantaged and disadvantaged backgrounds (which may be attributable in part to causes other than poverty). This also increases the proportion of high achievers, so building the ‘smart fraction’ and contributing to economic growth.’
‘We must pursue simultaneously the twin priorities of raising standards and closing gaps. We must give higher priority to all disadvantaged learners, regardless of their prior achievement. Standards should continue to rise amongst all high achievers, but they should rise faster amongst disadvantaged high achievers. This makes a valuable contribution to social mobility.’
This model provisionally developed
Using my working definition as a starting point, this section describes a theoretical model showing how excellence and equity are brought to bear when considering excellence gaps – and then how best to address them.
This should be applicable at any level, from a single school to a national education system and all points in between.
The model depends on securing the optimal balance between excellence and equity where:
Excellence is focused on increasing the proportion of all learners who achieve highly and, where necessary, increasing the pitch of high achievement thresholds to remove unhelpful ceiling effects. The thresholds in question may be nationally or internationally determined and are most likely to register high attainment through a formal assessment process. (This may be extended so there is complementary emphasis on increasing the proportion of high-achieving learners who make sufficiently strong progress between two different age- or stage-related thresholds.)
Equity is focused on increasing the proportion of high-achieving disadvantaged learners (and/or the proportion of disadvantaged learners making sufficiently strong progress) at a comparatively faster rate, so that they form a progressively larger proportion of the overall high-achieving population, up to the point of equilibrium, where advantaged and disadvantaged learners are equally likely to achieve the relevant thresholds (and/or progress measure). This must be secured without deliberately repressing improvement amongst advantaged learners – ie by introducing policies designed explicitly to limit their achievement and/or progress relative to disadvantaged learners – though a decision to do nothing additional for them, or to redistribute resources in favour of disadvantage, is entirely permissible.
The optimal policy response will depend on the starting position and the progress achieved over time.
If excellence gaps are widening, the model suggests that interventions and resources should be concentrated in favour of equity. Policies should be reviewed and adjusted, or strengthened where necessary, to meet the desired objectives.
If excellence gaps are widening rapidly, this reallocation and adjustment process will be relatively more substantial (and probably more urgent) than if they are widening more slowly.
Slowly widening gaps will demand more reallocation and adjustment than a situation where gaps are stubbornly resistant to improvement, or else closing too slowly. But even in the latter case there should be some reallocation and adjustment until equilibrium is achieved.
When excellence gaps are already closing rapidly – and there are no overt policies in place to deliberately repress improvement amongst high-achieving advantaged learners – it may be that unintended pressures in the system are inadvertently bringing this about. In that case, policy and resources should be adjusted to correct these pressures and so restore the correct twin-speed improvement.
The aim is to achieve and sustain equilibrium, even beyond the point when excellence gaps are eliminated, so that they are not permitted to reappear.
If ‘reverse gaps’ begin to materialise, where disadvantaged learners consistently outperform their more advantaged peers, this also threatens equilibrium and would suggest a proportionate redistribution of effort towards excellence.
Such scenarios are most likely to occur in settings where a large proportion of learners, while not disadvantaged according to the ‘cliff edge’ definition required to make the distinction, are still relatively disadvantaged.
Close attention must therefore be paid to the distribution of achievement across the full spectrum of disadvantage, to ensure that success at the extreme of the distribution does not mask significant underachievement elsewhere.
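The decision logic set out in the preceding paragraphs can be caricatured as a toy rule of thumb. This is my own paraphrase of the model, not a formal specification: the threshold separating ‘rapid’ from ‘slow’ change is invented purely for illustration.

```python
def policy_response(annual_gap_change):
    """Toy paraphrase of the equilibrium model. Input is the annual change
    in the excellence gap in percentage points (positive = widening);
    output suggests where policy effort and resources should shift.
    The 1.0-point threshold for 'rapid' change is an invented example."""
    if annual_gap_change > 1.0:
        return "widening rapidly: substantial, urgent reallocation towards equity"
    if annual_gap_change > 0:
        return "widening slowly: significant reallocation towards equity"
    if annual_gap_change == 0:
        return "static: some reallocation towards equity still needed"
    if annual_gap_change > -1.0:
        return "closing slowly: modest reallocation towards equity"
    return "closing rapidly: check for unintended repression of advantaged learners"

print(policy_response(0.3))
```

Even in caricature, the asymmetry of the model is visible: every state short of equilibrium calls for at least some reallocation towards equity.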
One should be able to determine a more precise policy response by considering a restricted set of variables. These include:
The size of the gaps at the start of the process and, associated with this, the time limit allowed for equilibrium to be reached. Clearly larger gaps are more likely to take longer to close. Policy makers may conclude that steady improvement over several years is more manageable for the system than a rapid sprint towards equilibrium. On the other hand, there may be benefits associated with pace and momentum.
The rate at which overall high achievement is improving. If this is relatively fast, the rate of improvement amongst advantaged high achievers will be correspondingly strong, so the rate for disadvantaged high achievers must be stronger still.
The variance between excellence gaps at different ages/stages. If the gaps are larger at particular stages of education, the pursuit of equilibrium suggests that disproportionate attention be given to those stages, so that gaps are closed consistently. If excellence gaps are small for relatively young learners and increase with age, priority should be given to the latter, but there may be other factors in play, such as evidence that closing relatively small gaps at an early stage will have a more substantial ‘knock-on’ effect later on.
The level at which high achievement thresholds are pitched. Obviously this will influence the size of the gaps that need to be closed. But, other things being equal, enabling a higher proportion of learners to achieve a relatively high threshold will demand more intensive support. On the other hand, relatively fewer learners – whether advantaged or disadvantaged – are likely to be successful. Does one need to move a few learners a big distance or a larger proportion a smaller one?
Whether or not gaps at lower achievement thresholds are smaller and/or closing at a faster rate. If so, there is a strong case for securing parity of progress at higher and lower thresholds alike. On the other hand, if excellence gaps are closing more quickly, it may be appropriate to reallocate resources away from them and towards lower levels of achievement.
The relative size of the overall disadvantaged population, the associated economic gap between advantage and disadvantage and (as suggested above) the distribution in relation to the cut-off. If the definition of disadvantage is pitched relatively low (ie somewhat disadvantaged), the disadvantaged population will be correspondingly large, but the economic gap between advantage and disadvantage will be relatively small. If the definition is pitched relatively high (ie very disadvantaged) the reverse will be true, giving a comparatively small disadvantaged population but a larger gap between advantage and disadvantage.
The proportion of the disadvantaged population that is realistically within reach of the specified high achievement benchmarks. This variable is a matter of educational philosophy. There is merit in an inclusive approach – indeed it seems preferable to overestimate this proportion than the reverse. Extreme care should be taken not to discourage late developers or close off opportunities on the basis of comparatively low current attainment, so reinforcing existing gaps through unhelpfully low expectations. On the other hand, supporting unrealistically high expectations may be equally damaging and ultimately waste scarce resources. There may be more evidence to support such distinctions with older learners than with their younger peers.
How big are England’s headline attainment gaps and how fast are they closing?
Closing socio-economic achievement gaps has been central to English educational policy for the last two decades, including under the current Coalition Government and its Labour predecessor.
It will remain an important priority for the next Government, regardless of the outcome of the 2015 General Election.
The present Government cites ‘Raising the achievement of disadvantaged children’ as one of ten schools policies it is pursuing.
‘Children from disadvantaged backgrounds are far less likely to get good GCSE results. Attainment statistics published in January 2014 show that in 2013 37.9% of pupils who qualified for free school meals got 5 GCSEs, including English and mathematics at A* to C, compared with 64.6% of pupils who do not qualify.
We believe it is unacceptable for children’s success to be determined by their social circumstances. We intend to raise levels of achievement for all disadvantaged pupils and to close the gap between disadvantaged children and their peers.’
The DfE’s input and impact indicators – showing progress against the priorities set out in its business plan – do not feature the measure mentioned in the policy description (which is actually five or more GCSEs at Grades A*-C or equivalents, including GCSEs in English and maths).
The gap on this measure was 27.7% in 2009, improving to 26.7% in 2013, so there has been a small 1.0 percentage point improvement over five years, spanning the last half of the previous Government’s term in office and the first half of this Government’s term.
Instead the impact indicators include three narrower measures focused on closing the attainment gap between free school meal pupils and their peers, at 11, 16 and 19 respectively:
Impact Indicator 7 compares the percentages of FSM-eligible and all other pupils achieving level 4 or above in KS2 assessment of reading, writing and maths. The 2013 gap is 18.7%, down 0.4 percentage points from 19.1% in 2012.
Impact Indicator 8 compares the percentages of FSM-eligible and all other pupils achieving A*-C grades in GCSE maths and English. The 2013 gap is 26.5%, up 0.3 percentage points from 26.2% in 2012.
Impact Indicator 9 compares the percentages of learners who were FSM-eligible at age 15 and all other learners who attain a level 3 qualification by the end of the academic year in which they are 19. The 2013 gap is 24.3%, up 0.1 percentage points from 24.2% in 2012.
These small changes, not always pointing in the right direction, reflect the longer term narrative, as is evident from the Government’s Social Mobility Indicators which also incorporate these three measures.
In 2005-06 the KS2 L4 maths and English gap was 25.0%, so there has been a fairly substantial 6.3 percentage point reduction over seven years, but only about one quarter of the gap has been closed.
In 2007-08 the KS4 GCSE maths and English gap was 28.0%, so there has been a minimal 1.5 percentage point reduction over six years, equivalent to annual national progress of 0.25 percentage points per year. At that rate it will take another century to complete the process.
In 2004-05 the Level 3 qualification gap was 26.4%, so there has been a very similar 2.1 percentage point reduction over 8 years.
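The ‘another century’ calculation above is a crude linear extrapolation: divide the remaining gap by the historical annual rate of closure. A sketch, assuming a constant rate of closure (which real trends will certainly not obey):

```python
def years_to_close(start_gap, end_gap, years_elapsed):
    """Years still needed to close the remaining gap, assuming the
    historical annual rate of closure continues unchanged."""
    annual_rate = (start_gap - end_gap) / years_elapsed
    return end_gap / annual_rate

# KS4 GCSE English and maths gap: 28.0% in 2007-08, 26.5% six years
# later, i.e. closure of 0.25 percentage points per year.
print(years_to_close(28.0, 26.5, 6))  # 106.0 - roughly another century
```

The same arithmetic applied to the KS2 figures (25.0% falling to 18.7% over seven years) gives a much shorter, though still sobering, horizon.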
There is a significant time lag with all three higher education destination indicators – the most recent available data relates to 2011/2012 – and only two years of data have been collected.
All show an upward trend. Oxbridge is up from 0.1% to 0.2%, Russell Group up from 3% to 4% and any university up from 45% to 47% – actually a 2.5 percentage point improvement.
The Oxbridge numbers are so small that a percentage measure is a rather misleading indicator of marginal improvement from a desperately low base.
It is important to note that forthcoming changes to the assessment regime will impose a different set of headline indicators at ages 11 and 16 that will not be comparable with these.
From 2014, methodological adjustments to the School Performance Tables will significantly restrict the range of qualifications counted as equivalent to GCSEs. Only the first entry in each subject will count for Performance Table purposes: this applies to English Baccalaureate subjects in 2014 and to all subjects from 2015.
Both these factors will tend to depress overall results and may be expected to widen attainment gaps on the headline KS4 measure as well as the oft-cited 5+ GCSEs measure.
From 2016 new baseline assessments, the introduction of scaled scores at the end of KS2 and a new GCSE grading system will add a further layer of change.
As a consequence there will be substantial revisions to the headline measures in Primary, Secondary and Post-16 Performance Tables. The latter will include destination measures, provided they can be made methodologically sound.
At the time of writing, the Government has made negligible reference to the impact of these reforms on national measures of progress, including its own Impact Indicators and the parallel Social Mobility indicators, though the latter are reportedly under review.
Published data on English excellence gaps
The following sections summarise what data I can find in the public domain about excellence gaps at primary (KS2), secondary (KS4) and post-16 (KS5) respectively.
I have cited the most recent data derivable from Government statistical releases and performance tables, supplemented by other interesting findings gleaned from research and commentary.
The most recent national data is contained in SFR51/2013: National Curriculum Assessments at Key Stage 2: 2012 to 2013. This provides limited information about the differential performance of learners eligible for and receiving FSM (which I have referred to as ‘FSM’), and for those known to be eligible for FSM at any point from Years 1 to 6 (known as ‘ever 6’ and describing those in receipt of the Pupil Premium on grounds of deprivation).
There is also additional information in the 2013 Primary School Performance Tables, where the term ‘disadvantaged’ is used to describe ‘ever 6’ learners and ‘children looked after’.
There is comparatively little variation between these different sets of figures at national level. In the analysis below (and in the subsequent section on KS4) I have used FSM data wherever possible, but have substituted ‘disadvantaged’ data where FSM is not available. All figures apply to state-funded schools only.
I have used Level 5 and above as the best available proxy for high attainment. Some Level 6 data is available, but in percentages only, and these are all so small that comparisons are misleading.
The Performance Tables distinguish a subset of high attainers, on the basis of prior attainment (at KS1 for KS2 and at KS2 for KS4) but no information is provided about the differential performance of advantaged and disadvantaged high attainers.
21% of all pupils achieved Level 5 or above in reading, writing and maths combined, but only 10% of FSM pupils did so, compared with 26% of others, giving an attainment gap of 16%. The comparable gap at Level 4B (in reading and maths and L4 in writing) was 18%. At Level 4 (across the board) it was 20%. In this case, the gaps are slightly larger at lower attainment levels but, whereas the L4 gap has narrowed by 1% since 2012, the L5 gap has widened by 1%.
In reading, 44% of all pupils achieved Level 5 and above, but only 27% of FSM pupils did so, compared with 48% of others, giving an attainment gap of 21%. The comparable gap at Level 4 and above was eight percentage points lower at 13%.
In writing (teacher assessment), 31% of all pupils achieved level 5 and above, but only 15% of FSM pupils did so, compared with 34% of others, giving an attainment gap of 19%. The comparable gap at Level 4 and above was three percentage points lower at 16%.
In grammar, punctuation and spelling (GPS), 47% of all pupils achieved Level 5 and above, but only 31% of FSM pupils did so, compared with 51% of others, giving an attainment gap of 20%. The comparable gap at Level 4 and above was two percentage points lower at 18%.
In maths, 41% of pupils in state-funded schools achieved Level 5 and above, up 2% on 2012. But only 24% of FSM pupils achieved this compared with 44% of others, giving an attainment gap of 20%. The comparable gap at level 4 and above is 13%.
Chart 1 shows these outcomes graphically. In four cases out of five, the gap at the higher attainment level is greater, substantially so in reading and maths. All the Level 5 gaps fall between 16% and 21%.
Chart 1: Percentage point gaps between FSM and all other pupils’ attainment at KS2 L4 and above and KS2 L5 and above, 2013
It is difficult to trace reliably the progress made in reducing these gaps in English, since the measures have changed frequently. There has been more stability in maths, however, and the data reveals that – whereas the FSM gap at Level 4 and above has reduced by 5 percentage points since 2008 (from 18 points to 13 points) – the FSM gap at Level 5 and above has remained between 19 and 20 points throughout. Hence the gap between L4+ and L5+ on this measure has increased in the last five years.
The statistical bulletin The Characteristics of High Attainers (2007) defines KS2 high attainers as the top 10%, on the basis of finely grained average point scores across English, maths and science – a more selective but wider-ranging definition than any of the descriptors of Level 5 performance above.
According to this measure, some 2.7% of FSM-eligible pupils were high attainers in 2006, compared with 11.6% of non-FSM pupils, giving a gap of 8.9 percentage points.
The Bulletin supplies further analysis of this population of high attainers, summarised in the table reproduced below.
While Government statistical releases provide at least limited data about FSM performance at high levels in end of KS2 assessments, this is entirely absent from KS4 data, because there is no information about the achievement of GCSE grades above C, whether for single subjects or combinations.
(The gap cited here for A*-C grades in English and maths GCSEs is very slightly different to the figure in the impact indicator.)
Chart 2: Percentage point gaps between FSM and all other pupils’ attainment on different KS4 measures, 2013
In its State of the Nation Report 2013, the Social Mobility and Child Poverty Commission included a table comparing regional performance on a significantly more demanding ‘8+ GCSEs excluding equivalents and including English and maths’ measure. This uses ‘ever 6’ rather than FSM as the indicator of disadvantage.
The relevant table is reproduced below. It shows regional gaps of between 20 and 26 percentage points on the tougher measure, so a similar order of magnitude to the national indicators at the top end of Chart 2.
Comparing the two measures, one can see that:
The percentages of ‘ever 6’ learners achieving the more demanding measure are very much lower than the comparable percentages achieving the 5+ GCSEs measure, but the same is also true of their more advantaged peers.
Consequently, in every region but London and the West Midlands, the attainment gap is actually larger for the less demanding measure.
In London, the gaps are much closer, at 19.1 percentage points on the 5+ measure and 20.9 percentage points on the 8+ measure. In the West Midlands, the gap on the 8+ measure is larger by five percentage points. In all other cases, the difference is at least six percentage points in the other direction.
We do not really understand the reasons why London and the West Midlands are atypical in this respect.
The Characteristics of High Attainers (2007) provides a comparable analysis for KS4 to that already referenced at KS2. In this case, the top 10% of high attainers is derived on the basis of capped GCSE scores.
This gives a gap of 8.8 percentage points between the proportion of non-FSM (11.2%) and FSM (2.4%) students within the defined population, very similar to the parallel calculation at KS2.
Other variables within this population are set out in the table reproduced below.
Finally, miscellaneous data has also appeared from time to time in the answers to Parliamentary Questions. For example:
In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8 percentage points. By 2009 the comparable percentages were 1.7% and 9.0% respectively, resulting in an increased gap of 7.3 percentage points (Col 568W)
In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)
Table 1: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007
In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10 percentage points (Col 488W)
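The combined A*/A gap in that last answer is simply the sum of the two grade bands; a minimal sketch of the arithmetic:

```python
# Sketch of how the combined A*/A gap in 2008 GCSE maths (quoted
# above) is built up from the separate grade bands.
fsm = {"A*": 1, "A": 3}          # % of FSM-eligible pupils, maintained schools
all_pupils = {"A*": 4, "A": 10}  # % of all pupils, maintained schools

fsm_a_star_a = sum(fsm.values())         # 4%
all_a_star_a = sum(all_pupils.values())  # 14%
gap = all_a_star_a - fsm_a_star_a        # 10 percentage points

print(gap)  # 10
```

Note that this particular answer compares FSM-eligible pupils with all pupils (FSM included), so the true FSM/other gap will be slightly larger.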
The latter contains a variety of high attainment measures – 3+ A*/A grades; AAB grades or better; AAB grades or better with at least two in facilitating subjects; AAB grades or better, all in facilitating subjects – yet none of them distinguish success rates for advantaged and disadvantaged learners.
The former does include a table which provides a time series of gaps for achievement of Level 3 at age 19 through 2 A levels or the International Baccalaureate. The measure of disadvantage is FSM-eligibility in Year 11. The gap was 22.0 percentage points in 2013, virtually unchanged from 22.7 percentage points in 2005.
This is designed to show how FSM gaps vary across key stages and also provides ‘odds ratios’ – the relative chances of FSM and other pupils achieving each measure. It relies on 2012 outcomes.
The quality of the reproduction is poor, but it seems to suggest that, using the AAB+ in at least two facilitating subjects measure, there is a five percentage point gap between FSM students and others (3% versus 8%), while the odds ratio shows that non-FSM students are 2.9 times more likely than FSM students to achieve this outcome.
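For readers unfamiliar with odds ratios, here is a minimal sketch of the calculation, using the rounded percentages quoted above:

```python
# Minimal sketch of the odds ratio calculation, using the rounded
# percentages quoted above (8% of non-FSM vs 3% of FSM students).
def odds(p):
    """Convert a proportion into odds: p / (1 - p)."""
    return p / (1 - p)

p_non_fsm = 0.08
p_fsm = 0.03

odds_ratio = odds(p_non_fsm) / odds(p_fsm)  # ~2.81 on these rounded inputs
rate_ratio = p_non_fsm / p_fsm              # ~2.67: the simple 'times more likely'

print(round(odds_ratio, 2))  # 2.81
print(round(rate_ratio, 2))  # 2.67
```

An odds ratio is not quite the same thing as ‘x times more likely’ (a risk ratio), and the small discrepancy with the published 2.9 presumably reflects unrounded underlying data.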
Once again, occasional replies to Parliamentary Questions provide some supplementary information:
In 2007, 189 FSM-eligible students (3.7%) in maintained mainstream schools (so excluding sixth form colleges and FE colleges) achieved 3 A grades at A level. This compared with 13,467 other students (9.5%), giving a gap of 5.8 percentage points. (Source: Parliamentary Question, 26 November 2008, Hansard, Col 1859W)
In 2008, 160 students (3.5%) eligible for FSM achieved that outcome. This compares with 14,431 (10.5%) of those not eligible for FSM, giving a gap of 7.0 percentage points. The figures relate to 16-18 year-olds, in maintained schools only, who were eligible for FSM at age 16. They do not include students in FE sector colleges (including sixth form colleges) who were previously eligible for FSM. Only students who entered at least one A level, applied A level or double award qualification are counted. (Parliamentary Question, 6 April 2010, Hansard (Col 1346W))
Of pupils entering at least one A level in 2010/11 and eligible for FSM at the end of Year 11, 546 (4.1%) achieved 3 or more GCE A levels at A*-A compared with 22,353 other pupils (10.6%) so giving a gap of 6.5 percentage points. These figures include students in both the schools and FE sectors. (Parliamentary Question, 9 July 2012, Hansard (Col 35W))
In September 2014, a DfE response to a Freedom of Information request provided some additional data about FSM gaps at A level over the period from 2009 to 2013. This is set out in the table below, which records the gaps between FSM and all other pupils, presumably for all schools and colleges, whether or not state-funded.
Apart from the atypical result for the top indicator in 2010, all these percentages fall in the range 6.0% to 10%, so are in line with the sources above.
The table covers three measures: 3+ grades at A*/A or applied single/double award; AAB+ grades in facilitating subjects; and AAB+ grades, at least 2 in facilitating subjects.
Additional evidence of Key Stage excellence gaps from a sample born in 1991
The latter appears in Part two, but the first set of findings provides a useful supplement to the broad picture set out above.
This study is based on a sample of learners born in 1991/1992, so they would presumably have taken end of KS2 tests in 2002, GCSEs in 2007 and A levels in 2009. It includes all children who attended a state primary school, including those who subsequently attended an independent secondary school.
It utilises a variety of measures of disadvantage, including whether learners were always FSM-eligible (in Years 7-11), or ‘ever FSM’ during that period. This summary focuses on the distinction between ‘always FSM’ and ‘never FSM’.
It selects a basket of high attainment measures spread across the key stages, including:
At KS1, achieving Level 3 or above in reading and maths.
At KS2, achieving Level 5 or above in English and maths.
At KS4, achieving six or more GCSEs at grades A*-C in EBacc subjects (as well as five or more).
At KS5, achieving two or more (and three or more) A levels at grades A-B in any subjects.
Also at KS5, achieving two or more (and three or more) A levels at grades A-B in facilitating subjects.
The choice of measures at KS2 and KS5 is reasonable, reflecting the data available at the time. For example, one assumes that A* grades at A level do not feature in the KS5 measures because they were not introduced until 2010.
At KS4, the selection is rather more puzzling and idiosyncratic. It would have been preferable to have included at least one measure based on performance across a range of GCSEs at grades A*-B or A*/A.
The authors justify their decision on the basis that ‘there is no consensus on what is considered high attainment’, even though most commentators would expect this to reflect higher grade performance, while few are likely to define it solely in terms of breadth of study across a prescribed set of ‘mainstream’ subjects.
Outcomes for ‘always FSM’ and ‘never FSM’ on the eight measures listed above are presented in Chart 3.
Chart 3: Achievement of ‘always FSM’ and ‘never FSM’ on a basket of high attainment measures for pupils born in 1991/92
This reveals gaps of 12 to 13 percentage points at Key Stages 1 and 2, somewhat smaller than several of those described above.
It is particularly notable that the 2013 gap for KS2 L5 reading, writing and maths is 16 percentage points, whereas the almost comparable 2002 (?) gap for KS2 English and maths amongst this sample is 13.5%. Even allowing for comparability issues, there may be tentative evidence here to suggest widening excellence gaps at KS2 over the last decade.
The KS4 gaps are significantly larger than those existing at KS1/2, at 27 and 18 percentage points respectively. But comparison with the previous evidence reinforces the point that the size of the gaps in this sample is attributable to subject mix: this must be the case since the grade expectation is no higher than C.
The data for A*/A performance on five or more GCSEs set out above, which does not insist on coverage of EBacc subjects other than English and maths, suggests a gap of around seven percentage points. But it also demonstrates big gaps – again at A*/A – for achievement in single subjects, especially the separate sciences.
The KS5 gaps on this sample range from 2.5 to 13 percentage points. We cited data above suggesting a five percentage point gap in 2012 for AAB+, at least two in facilitating subjects. These findings do not seem wildly out of kilter with that, or with the evidence of gaps of around six to seven percentage points for AAA grades or higher.
The published data provides a beguiling glimpse of the size of excellence gaps and how they compare with FSM gaps on the key national benchmarks.
But discerning the pattern is like trying to understand the picture on a jigsaw when the majority of pieces are missing.
The received wisdom is captured in the observation by Whitty and Anders that:
‘Even though the attainment gap in schools has narrowed overall, it is largest for the elite measures’
and the SMCPC’s comment that:
‘…the system is better at lifting children eligible for FSM above a basic competence level (getting 5A*–C) than getting them above a tougher level of attainment likely to secure access to top universities.’
This seems broadly true, but the detailed picture is rather more complicated.
At KS2 there are gaps at L5 and above of around 16-20 percentage points, the majority higher than the comparable gaps at L4. But the gaps for core subjects combined are smaller than for each assessment. There is tentative evidence that the former may be widening.
At KS4 there are very significant differences between results in individual subjects. When it comes to multi-subject indicators, differences in the choice of subject mix – as well as choice of grade – make it extremely difficult to draw even the most tentative conclusions about the size of excellence gaps and how they relate to benchmark-related gaps at KS4 and excellence gaps at KS2.
At KS5, the limited evidence suggests that A level excellence gaps at the highest grades are broadly similar to those at GCSE A*/A. If anything, gaps seem to narrow slightly compared with KS4. But the confusion over KS4 measures makes this impossible to verify.
We desperately need access to a more complete dataset so we can understand these relationships more clearly.
This is the end of Part one. In Part two, we move on to consider evidence about whether high attainers remain so, before examining international comparisons data and related research, followed by excellence gaps analysis from the USA.
Part two concludes with a short review of how present government policy impacts on excellence gaps and some recommendations for strengthening the present arrangements.
Put crudely, the discussion hinged on whether the educational needs of ‘poor but dim’ learners should take precedence over those of the ‘poor but bright’. (This is Mr Thomas’s shorthand, not mine.)
He argued that the ‘poor but dim’ are the higher priority; I countered that all poor learners should have equal priority, regardless of their ability and prior attainment.
We began to explore the issue:
as a matter of educational policy and principle
with reference to inputs – the allocation of financial and human resources between these competing priorities and
in terms of outcomes – the comparative benefits to the economy and to society from investment at the top or the bottom of the attainment spectrum.
This post presents the discussion, adding more flesh and gloss from the Gifted Phoenix perspective.
It might or might not stimulate some interest in how this slightly different take on a rather hoary old chestnut plays out in England’s current educational landscape.
But I am particularly interested in how gifted advocates in different countries respond to these arguments. What is the consensus, if any, on the core issue?
Depending on the answer to this first question, how should gifted advocates frame the argument for educationalists and the wider public?
To help answer the first question I have included a poll at the end of the post.
Do please respond to that – and feel free to discuss the second question in the comments section below.
The structure of the post is fairly complex, comprising:
A (hopefully objective) summary of Mr Thomas’s original post.
An embedded version of the substance of our Twitter conversation. I have removed some Tweets – mostly those from third parties – and reordered a little to make this more accessible. I don’t believe I’ve done any significant damage to either case.
Some definition of terms, because there is otherwise much cause for confusion as we push further into the debate.
A digressionary exploration of the evidence base, dealing with attainment data and budget allocations respectively. The former exposes what little we are told about how socio-economic gaps vary across the attainment spectrum; the latter is relevant to the discussion of inputs. Those pressed for time may wish to proceed directly to…
…A summing up, which expands in turn the key points we exchanged on the point of principle, on inputs and on outcomes respectively.
I have reserved until close to the end a few personal observations about the encounter and how it made me feel.
And I conclude with the customary brief summary of key points and the aforementioned poll.
It is an ambitious piece and I am in two minds as to whether it hangs together properly, but you are ultimately the judges of that.
What Mr Thomas Blogged
The post was called ‘The Romance of the Poor but Bright’ and the substance of the argument (incorporating several key quotations) ran like this:
The ‘effort and resources, of schools but particularly of business and charitable enterprise, are directed disproportionately at those who are already high achieving – the poor but bright’.
Moreover ‘huge effort is expended on access to the top universities, with great sums being spent to make marginal improvements to a small set of students at the top of the disadvantaged spectrum. They cite the gap in entry, often to Oxbridge, as a significant problem that blights our society.’
This however is ‘the pretty face of the problem. The far uglier face is the gap in life outcomes for those who take least well to education.’
‘Popular discourse is easily caught up in the romance of the poor but bright’ but ‘we end up ignoring the more pressing problem – of students for whom our efforts will determine whether they ever get a job or contribute to society’. For ‘when did you last hear someone advocate for the poor but dim?’
‘The gap most damaging to society is in life outcomes for the children who perform least well at school.’ Three areas should be prioritised to improve their educational outcomes:
o Improving alternative provision (AP) which ‘operates as a shadow school system, largely unknown and wholly unappreciated’ – ‘developing a national network of high–quality alternative provision…must be a priority if we are to close the gap at the bottom’.
o Improving ‘consistency in SEN support’ because ‘schools are often ill equipped to cope with these, and often manage only because of the extraordinary effort of dedicated staff’. There is ‘inconsistency in funding and support between local authorities’.
o Introducing clearer assessment of basic skills, ‘so that a student could not appear to be performing well unless they have mastered the basics’.
While ‘any student failing to meet their potential is a dreadful thing’, the educational successes of ‘students with incredibly challenging behaviour’ and ‘complex special needs…have the power to change the British economy, far more so than those of their brighter peers.’
A footnote adds ‘I do not believe in either bright or dim, only differences in epigenetic coding or accumulated lifetime practice, but that is a discussion for another day.’
Indeed it is.
Our ensuing Twitter discussion
The substance of our Twitter discussion is captured in the embedded version immediately below. (Scroll down to the bottom for the beginning and work your way back to the top.)
I take poor to mean socio-economic disadvantage, as opposed to any disadvantage attributable to the behaviours, difficulties, needs, impairments or disabilities associated with AP and/or SEN.
I recognise of course that such a distinction is more theoretical than practical, because, when learners experience multiple causes of disadvantage, the educational response must be holistic rather than disaggregated.
Nevertheless, the meaning of ‘poor’ is clear – that term cannot be stretched to include these additional dimensions of disadvantage.
The available performance data foregrounds two measures of socio-economic disadvantage: current eligibility for and take up of free school meals (FSM) and qualification for the deprivation element of the Pupil Premium, determined by FSM eligibility at some point within the last 6 years (known as ‘ever-6’).
Both are used in this post. Distinctions are typically drawn between disadvantaged and non-disadvantaged learners, though some of the supporting data compares outcomes for disadvantaged learners with outcomes for all learners, advantaged and disadvantaged alike.
The gaps that need closing are therefore:
between ‘poor and bright’ and other ‘bright’ learners (The Excellence Gap) and
between ‘poor and dim’ and other ‘dim’ learners. I will christen this The Foundation Gap.
The core question is whether The Foundation Gap takes precedence over The Excellence Gap or vice versa, or whether they should have equal billing.
This involves immediate and overt recognition that classification as AP and/or SEN is not synonymous with the epithet ‘poor’, because there are many comparatively advantaged learners within these populations.
But such a distinction is not properly established in Mr Thomas’ blog, which applies the epithet ‘poor’ but then treats the AP and SEN populations as homogenous and somehow associated with it.
By ‘dim’ I take Mr Thomas to mean the lowest segment of the attainment distribution – one of his tweets specifically mentions ‘the bottom 20%’. The AP and/or SEN populations are likely to be disproportionately represented within these bottom two deciles, but they are not synonymous with them either.
This distinction will not be lost on gifted advocates who are only too familiar with the very limited attention paid to twice exceptional learners.
Those from poor backgrounds within the AP and/or SEN populations are even more likely to be disproportionately represented in ‘the bottom 20%’ than their more advantaged peers, but even they will not constitute the entirety of ‘the bottom 20%’. A Venn diagram would likely show significant overlap, but that is all.
Hence disadvantaged AP/SEN are almost certainly a relatively poor proxy for the ‘poor but dim’.
That said I could find no data that quantifies these relationships.
The School Performance Tables distinguish a ‘low attainer’ cohort. (In the Secondary Tables the definition is determined by prior KS2 attainment and in the Primary Tables by prior KS1 attainment.)
These populations comprise some 15.7% of the total population in the Secondary Tables and about 18.0% in the Primary Tables. But neither set of Tables applies the distinction in their reporting of the attainment of those from disadvantaged backgrounds.
It follows from the definition of ‘dim’ that, by ‘bright’, Mr Thomas probably intends the two corresponding deciles at the top of the attainment distribution (even though he seems most exercised about the subset with the capacity to progress to competitive universities, particularly Oxford and Cambridge. This is a far more select group of exceptionally high attainers – and an even smaller group of exceptionally high attainers from disadvantaged backgrounds.)
A few AP and/or SEN students will likely fall within this wider group, fewer still within the subset of exceptionally high attainers. AP and/or SEN students from disadvantaged backgrounds will be fewer again, if indeed there are any at all.
The same issues with data apply. The School Performance Tables distinguish ‘high attainers’, who constitute over 32% of the secondary cohort and 25% of the primary cohort. As with low attainers, we cannot isolate the performance of those from disadvantaged backgrounds.
We are forced to rely on what limited data is made publicly available to distinguish the performance of disadvantaged low and high attainers.
At the top of the distribution there is a trickle of evidence about performance on specific high attainment measures and access to the most competitive universities. Still greater transparency is fervently to be desired.
At the bottom, I can find very little relevant data at all – we are driven inexorably towards analyses of the SEN population, because that is the only dataset differentiated by disadvantage, even though we have acknowledged that such a proxy is highly misleading. (Equivalent AP attainment data seems conspicuous by its absence.)
AP and SEN
Before exploring these datasets I ought to provide some description of the different programmes and support under discussion here, if only for the benefit of readers who are unfamiliar with the English education system.
‘They include pupils who have been excluded or who cannot attend mainstream school for other reasons: for example, children with behaviour issues, those who have short- or long-term illness, school phobics, teenage mothers, pregnant teenagers, or pupils without a school place.’
AP is provided in a variety of settings where learners engage in timetabled education activities away from their school and school staff.
Providers include further education colleges, charities, businesses, independent schools and the public sector. Pupil Referral Units (PRUs) are perhaps the best-known settings – there are some 400 nationally.
Taylor complains of a lack of reliable data about the number of learners in AP but notes that the DfE’s 2011 AP census recorded 14,050 pupils in PRUs and a further 23,020 in other settings on a mixture of full-time and part-time placements. This suggests a total of slightly over 37,000 learners, though the FTE figure is unknown.
He states that AP learners are:
‘…twice as likely as the average pupil to qualify for free school meals’
‘In Jan 2011, 34.6% of pupils in PRUs and 13.8%* of pupils in other AP, were eligible for and claiming free school meals, compared with 14.6% of pupils in secondary schools. [*Note: in some AP settings, free school meals would not be available, so that figure is under-stated, but we cannot say by how much.]’
If the PRU population is typical of the wider AP population, approximately one third qualify under this FSM measure of disadvantage, meaning that the substantial majority are not ‘poor’ according to our definition above.
Taylor confirms that overall GCSE performance in AP is extremely low, pointing out that in 2011 just 1.4% achieved five or more GCSE grades A*-C including [GCSEs in] maths and English, compared to 53.4% of pupils in all schools.
By 2012/13 the comparable percentages were 1.7% and 61.7% respectively (the latter for all state-funded schools), suggesting an increasing gap in overall performance. This is a cause for concern but not directly relevant to the issue under consideration.
The huge disparity is at least partly explained by the facts that many AP students take alternative qualifications and that the national curriculum does not apply to PRUs.
Data is available showing the full range of qualifications pursued. Taylor recommended that all students in AP should continue to receive ‘appropriate and challenging English and Maths teaching’.
Interestingly, he also pointed out that:
‘In some PRUs and AP there is no provision for more able pupils who end up leaving without the GCSE grades they are capable of earning.’
However, he fails to offer a specific recommendation to address this point.
Special Educational Needs (SEN) are needs or disabilities that affect children’s ability to learn. These may include behavioural and social difficulties, learning difficulties or physical impairments.
There is significant overlap between AP and SEN, with Taylor’s review of the former noting that the population in PRUs is 79% SEN.
We know from the 2013 SEN statistics that 12.6% of all pupils on roll at PRUs had SEN statements and 68.9% had SEN without statements. But these populations represent only a tiny proportion of the total SEN population in schools.
SEN learners also have higher than typical eligibility for FSM. In January 2013, 30.1% of all SEN categories across all primary, secondary and special schools were FSM-eligible, roughly twice the rate for all pupils. However, this means that almost seven in ten are not caught by the definition of ‘poor’ provided above.
In 2012/13 23.4% of all SEN learners achieved five or more GCSEs at A*-C or equivalent, including GCSEs in English and maths, compared with 70.4% of those having no identified SEN – another significant overall gap, but not directly relevant to our comparison of the ‘poor but bright’ and the ‘poor but dim’.
Data on socio-economic attainment gaps across the attainment spectrum
Those interested in how socio-economic attainment gaps vary at different attainment levels cannot fail to be struck by how little material of this kind is published, particularly for the secondary sector, where such gaps tend to increase in size.
One cannot entirely escape the conviction that this reticence deliberately masks some inconvenient truths.
The ideal would be to have the established high/middle/low attainer distinctions mapped directly onto performance by advantaged/disadvantaged learners in the Performance Tables but, as we have indicated, this material is conspicuous by its absence. Perhaps it will appear in the Data Portal now under development.
Our next best option is to examine socio-economic attainment gaps on specific attainment measures that will serve as decent proxies for high/middle/low attainment. We can do this to some extent but the focus is disproportionately on the primary sector because the Secondary Tables do not include proper high attainment measures (such as measures based exclusively on GCSE performance at grades A*/A). Maybe the Portal will come to the rescue here as well. We can however supply some basic Oxbridge fair access data.
The least preferable option is to deploy our admittedly poor proxies for low attainers – SEN and AP. But there isn’t much information from this source either.
The analysis below looks consecutively at data for the primary and secondary sectors.
We know, from the 2013 Primary School Performance Tables, that the percentages of disadvantaged and other learners achieving different KS2 levels in reading, writing and maths combined, in 2013 and 2012 respectively, were as follows:
Table 1: Percentage of disadvantaged and all other learners achieving each national curriculum level at KS2 in 2013 in reading, writing and maths combined
L3 or below | L4 or above | L4B or above | L5 or above
This tells us relatively little, apart from the fact that disadvantaged learners are heavily over-represented at L3 and below and heavily under-represented at L5 and above.
The L5 gap is somewhat lower than the gaps at L4 and 4B respectively, but not markedly so. However, the L5 gap has widened slightly since 2012 while the reverse is true at L4.
This next table synthesises data from SFR51/13: ‘National curriculum assessments at key stage 2: 2012 to 2013’. It also shows gaps for disadvantage, as opposed to FSM gaps.
Table 2: Percentage of disadvantaged and all other learners achieving each national curriculum level, including differentiation by gender, in each 2013 end of KS2 test
This tells a relatively consistent story across each test and for boys as well as girls.
We can see that, at Level 4 and below, learners from disadvantaged backgrounds are in the clear majority, perhaps with the exception of L4 GPS. But at L4B and above they are very much in the minority.
Moreover, with the exception of L6 where low percentages across the board mask the true size of the gaps, disadvantaged learners tend to be significantly more under-represented at L4B and above than they are over-represented at L4 and below.
A different way of looking at this data is to compare the percentages of advantaged and disadvantaged learners respectively at L4 and L5 in each assessment.
Reading: Amongst disadvantaged learners the proportion at L5 is 18 percentage points lower than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is 12 percentage points higher than at L4.
GPS: Amongst disadvantaged learners the proportion at L5 is 5 percentage points higher than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is 26 percentage points higher than at L4.
Maths: Amongst disadvantaged learners the proportion at L5 is 26 percentage points lower than the proportion at L4, whereas amongst advantaged learners the proportion at L5 is only 2 percentage points lower than at L4.
If we look at 2013 gaps compared with 2012 (with teacher assessment of writing included in place of the GPS test introduced in 2013) we can see there has been relatively little change across the board, with the exception of L5 maths, which has been affected by the increasing success of advantaged learners at L6.
Table 3: Percentage of disadvantaged and all other learners achieving national curriculum levels 3-6 in each of reading, writing and maths in 2012 and 2013 respectively
To summarise, as far as KS2 performance is concerned, there are significant imbalances at both the top and the bottom of the attainment distribution and these gaps have not changed significantly since 2012. There is some evidence to suggest that gaps at the top are larger than those at the bottom.
Unfortunately there is a dearth of comparable data at secondary level, principally because of the absence of published measures of high attainment.
SFR05/2014 provides us with FSM gaps (as opposed to disadvantaged gaps) for a series of GCSE measures, none of which serve our purpose particularly well:
5+ A*-C GCSE grades: gap = 16.0%
5+ A*-C grades including English and maths GCSEs: gap = 26.7%
5+ A*-G grades: gap = 7.6%
5+ A*-G grades including English and maths GCSEs: gap = 9.9%
A*-C grades in English and maths GCSEs: gap = 26.6%
Achieving the English Baccalaureate: gap = 16.4%
Perhaps all we can deduce is that the gaps vary considerably in size, but tend to be smaller for the relatively less demanding and larger for the relatively more demanding measures.
For specific high attainment measures we are forced to rely principally on data snippets released in answer to occasional Parliamentary Questions.
In 2003, 1.0% of FSM-eligible learners achieved five or more GCSEs at A*/A including English and maths but excluding equivalents, compared with 6.8% of those not eligible, giving a gap of 5.8%. By 2009 the comparable percentages were 1.7% and 9.0% respectively, giving an increased gap of 7.3% (Col 568W)
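These gap figures are simple percentage-point differences. A quick sketch (Python, using the percentages quoted in the Parliamentary Question above) shows the arithmetic:

```python
# Percentages achieving 5+ GCSEs at A*/A including English and maths
# (excluding equivalents), as quoted in the PQ cited above (Col 568W).
results = {
    2003: {"fsm": 1.0, "non_fsm": 6.8},
    2009: {"fsm": 1.7, "non_fsm": 9.0},
}

for year, r in sorted(results.items()):
    # The "gap" is simply the non-FSM percentage minus the FSM percentage.
    gap = round(r["non_fsm"] - r["fsm"], 1)
    print(f"{year}: gap = {gap} percentage points")
```

This reproduces the gaps of 5.8 points for 2003 and 7.3 points for 2009 cited above.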
In 2006/07, the percentage of FSM-eligible pupils securing A*/A grades at GCSE in different subjects, compared with the percentage of all pupils in maintained schools doing so were as shown in the table below (Col 808W)
Table 4: Percentage of FSM-eligible and all pupils achieving GCSE A*/A grades in different GCSE subjects in 2007
In 2008, 1% of FSM-eligible learners in maintained schools achieved A* in GCSE maths compared with 4% of all pupils in maintained schools. The comparable percentages for Grade A were 3% and 10% respectively, giving an A*/A gap of 10% (Col 488W)
There is much variation in the subject-specific outcomes at A*/A described above. But, when it comes to the overall 5+ GCSEs high attainment measure based on grades A*/A, the gap is much smaller than on the corresponding standard measure based on grades A*-C.
There is a complex pattern in evidence here which is very hard to explain with the limited data available. More time series data of this nature – illustrating Excellence and Foundation Gaps alike – should be published annually so that we have a more complete and much more readily accessible dataset.
I could find no information at all about the comparative performance of disadvantaged learners in AP settings compared with those not from disadvantaged backgrounds.
Data is published showing the FSM gap for SEN learners on all the basic GCSE measures listed above. I have retained the generic FSM gaps in brackets for the sake of comparison:
5+ A*-C GCSE grades: gap = 12.5% (16.0%)
5+ A*-C grades including English and maths GCSEs: gap = 12.1% (26.7%)
5+ A*-G grades: gap = 10.4% (7.6%)
5+ A*-G grades including English and maths GCSEs: gap = 13.2% (9.9%)
A*-C grades in English and maths GCSEs: gap = 12.3% (26.6%)
Achieving the English Baccalaureate: gap = 3.5% (16.4%)
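Lining the two lists up makes the pattern easier to see. This small sketch (Python) simply pairs the SEN-specific gaps with the all-learner gaps quoted above and flags which is narrower on each measure:

```python
# FSM attainment gaps in percentage points: (SEN learners, all learners),
# using the figures listed above.
gaps = {
    "5+ A*-C GCSE grades":             (12.5, 16.0),
    "5+ A*-C incl. English and maths": (12.1, 26.7),
    "5+ A*-G grades":                  (10.4, 7.6),
    "5+ A*-G incl. English and maths": (13.2, 9.9),
    "A*-C in English and maths":       (12.3, 26.6),
    "English Baccalaureate":           (3.5, 16.4),
}

for measure, (sen, overall) in gaps.items():
    verdict = "narrower for SEN" if sen < overall else "wider for SEN"
    print(f"{measure}: {sen} vs {overall} ({verdict})")
```

On four of the six measures – including all the more demanding ones involving English and maths at A*-C, and the EBacc – the SEN gap is the narrower of the two; only on the two 5+ A*-G measures is it wider.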
One can see that the FSM gaps for the more demanding measures are generally lower for SEN learners than they are for all learners. This may be interesting but, for the reasons given above, SEN is not a reliable proxy for the FSM gap amongst ‘dim’ learners.
The chart below shows the number of 15 year-olds eligible for and claiming FSM who progressed to Oxford or Cambridge by age 19. The figures are rounded to the nearest five.
Chart 1: FSM-eligible learners admitted to Oxford and Cambridge 2005/06 to 2010/11
In sum, there has been no change in these numbers over the last six years for which data has been published. So while there may have been consistently significant expenditure on access agreements and multiple smaller mentoring programmes, it has had negligible impact on this measure at least.
My previous post set out a proposal for what to do about this sorry state of affairs.
For the purposes of this discussion we need ideally to identify and compare total national budgets for the ‘poor but bright’ and the ‘poor but dim’. But that is simply not possible.
Many funding streams cannot be disaggregated in this manner. As we have seen, some – including the AP and SEN budgets – may be aligned erroneously with the second of these groups, although they also support learners who are neither ‘poor’ nor ‘dim’ and have a broader purpose than raising attainment.
There may be some debate, too, about which funding streams should be weighed in the balance.
On the ‘bright but poor’ side, do we include funding for grammar schools, even though the percentage of disadvantaged learners attending many of them is virtually negligible (despite recent suggestions that some are now prepared to do something about this)? Should the Music and Dance Scheme (MDS) be within scope of this calculation?
The best I can offer is a commentary that gives a broad sense of orders of magnitude, to illustrate in very approximate terms how the scales tend to tilt more towards the ‘poor but dim’ rather than the ‘poor but bright’, but also to weave in a few relevant asides about some of the funding streams in question.
Pupil Premium and the EEF
I begin with the Pupil Premium – providing schools with additional funding to raise the attainment of disadvantaged learners.
The Premium is not attached to the learners who qualify for it, so schools are free to aggregate the funding and use it as they see fit. They are held accountable for these decisions through Ofsted inspection and the gap-narrowing measures in the Performance Tables.
Mr Thomas suggests in our Twitter discussion that AP students are not significant beneficiaries of such support, although provision in PRUs features prominently in the published evaluation of the Premium. It is for local authorities to determine how Pupil Premium funding is allocated in AP settings.
One might also make a case that ‘bright but poor’ learners are not a priority either, despite suggestions from the Pupil Premium Champion to the contrary.
As we have seen, the Performance Tables are not sharply enough focused on the excellence gaps at the top of the distribution and I have shown elsewhere that Ofsted’s increased focus on the most able does not yet extend to the impact on those attracting the Pupil Premium, even though there was a commitment that it would do so.
If there is Pupil Premium funding heading towards high attainers from disadvantaged backgrounds, the limited data to which we have access does not yet suggest a significant impact on the size of Excellence Gaps.
This 2011 paper explains that the EEF is prioritising the performance of disadvantaged learners in schools below the floor targets. At one point it says:
‘Looking at the full range of GCSE results (as opposed to just the proportions who achieve the expected standards) shows that the challenge facing the EEF is complex – it is not simply a question of taking pupils from D to C (the expected level of attainment). Improving results across the spectrum of attainment will mean helping talented pupils to achieve top grades, while at the same time raising standards amongst pupils currently struggling to pass.’
But this is just after it has shown that the percentages of disadvantaged high attainers in its target schools are significantly lower than elsewhere. Other things being equal, the ‘poor but dim’ will be the prime beneficiaries.
It may now be time for the EEF to expand its focus to all schools. A diagram from this paper – reproduced below – demonstrates that, in 2010, the attainment gap between FSM and non-FSM was significantly larger in schools above the floor than in those below the floor that the EEF is prioritising. This is true in both the primary and secondary sectors.
It would be interesting to see whether this is still the case.
AP and SEN
Given the disaggregation problems discussed above, this section is intended simply to give some basic sense of orders of magnitude – lending at least some evidence to counter Mr Thomas’ assertion that the ‘effort and resources, of schools… are directed disproportionately at those who are already high achieving – the poor but bright’.
It is surprisingly hard to get a grip on the overall national budget for AP. A PQ from early 2011 (Col 75W) supplies a net current expenditure figure for all English local authorities of £530m.
Taylor’s Review fails to offer a comparable figure but my rough estimates, based on the per pupil costs he supplies, suggest a revenue budget of at least £400m. (Taylor suggests average per pupil costs of £9,500 per year for full-time AP, although PRU places are said to cost between £12,000 and £18,000 per annum.)
I found online a consultation document from Kent – England’s largest local authority – stating its revenue costs at over £11m in FY2014-15. Approximately 454 pupils attended Kent’s AP/PRU provision in 2012-13.
There must also be a significant capital budget. There are around 400 PRUs, not to mention a growing cadre of specialist AP academies and free schools. The total capital cost of the first AP free school – Derby Pride Academy – was £2.147m for a 50 place setting.
In FY2011-12, total annual national expenditure on SEN was £5.77 billion (Col 391W). There will have been some cost-cutting as a consequence of the latest reforms, but the order of magnitude is clear.
The latest version of the SEN Code of Practice outlines the panoply of support available, including the compulsory requirement that each school has a designated teacher to be responsible for co-ordinating SEN provision (the SENCO).
In short, the national budget for AP is sizeable and the national budget for SEN is huge. Per capita expenditure is correspondingly high. If we could isolate the proportion of these budgets allocated to raising the attainment of the ‘poor but dim’, the total would be substantial.
Fair Access, especially to Oxbridge, and some related observations
Mr Thomas refers specifically to funding to support fair access to universities – especially Oxbridge – for those from disadvantaged backgrounds. This is another area in which it is hard to get a grasp on total expenditure, not least because of the many small-scale mentoring projects that exist.
Mr Thomas is quite correct to remark on the sheer number of these, although they are relatively small beer in budgetary terms. (One suspects that they would be much more efficient and effective if they could be linked together within some sort of overarching framework.)
The Office for Fair Access (OFFA) estimates University access agreement expenditure on outreach in 2014-15 at £111.9m and this has to be factored in, as does DfE’s own small contribution – the Future Scholar Awards.
Were any expenditure in this territory to be criticised, it would surely be the development and capital costs for new selective 16-19 academies and free schools that specifically give priority to disadvantaged students.
The sums are large, perhaps not outstandingly so compared with national expenditure on SEN for example, but they will almost certainly benefit only a tiny localised proportion of the ‘bright but poor’ population.
There are several such projects around the country. Some of the most prominent are located in London.
The London Academy of Excellence (capacity 420) is fairly typical. It cost an initial £4.7m to establish, plus a lease costing a further £400K annually.
There were reportedly disagreements within Government:
‘It is understood that the £45m cost was subject to a “significant difference of opinion” within the DfE where critics say that by concentrating large resources on the brightest children at a time when budgets are constrained means other children might miss out…
But a spokeswoman for the DfE robustly defended the plans tonight. “This is an inspirational collaboration between the country’s top academy chain and one of the best private schools in the country,” she said. “It will give hundreds of children from low income families across London the kind of top quality sixth-form previously reserved for the better off.”’
Here we have in microcosm the debate to which this post is dedicated.
One blogger – a London College Principal – pointed out that the real issue was not whether the brightest should benefit over others, but how few of the ‘poor but bright’ would do so:
‘£45m could have a transformative effect on thousands of 16-19 year olds across London… £45m could have funded at least 50 extra places in each college for over 10 years, helped build excellent new facilities for all students and created a city-wide network to support gifted and talented students in sixth forms across the capital working with our partner universities and employers.’
There are three main elements to the discussion: the point of principle, the inputs and the impact. The following sections deal with each of these in turn.
Put bluntly, should ‘poor but dim’ kids have higher priority for educators than ‘poor but bright’ kids (Mr Thomas’ position) or should all poor kids have equal priority and an equal right to the support they need to achieve their best (the Gifted Phoenix position)?
For Mr Thomas, it seems this priority is determined by whether – and how far – the learner is behind undefined ‘basic levels of attainment’ and/or mastery of ‘the basics’ (presumably literacy and numeracy).
Those below the basic attainment threshold have higher priority than those above it. He does not say so but this logic suggests that those furthest below the threshold are the highest priority and those furthest above are the lowest.
So, pursued to its logical conclusion, this would mean that the highest attainers would get next to no support while a human vegetable would be the highest priority of all.
However, since Mr Thomas’ focus is on marginal benefit, it may be that those nearest the threshold would be first in the queue for scarce resources, because they would require the least effort and resources to lift above it.
This philosophy drives the emphasis on achievement of national benchmarks and predominant focus on borderline candidates that, until recently, dominated our assessment and accountability system.
For Gifted Phoenix, every socio-economically disadvantaged learner has equal priority to the support they need to improve their attainment, by virtue of that disadvantage.
There is no question of elevating some ahead of others in the pecking order because they are further behind on key educational measures since, in effect, that is penalising some disadvantaged learners on the grounds of their ability or, more accurately, their prior attainment.
This philosophy underpins the notion of personalised education and is driving the recent and welcome reforms of the assessment and accountability system, designed to ensure that schools are judged by how well they improve the attainment of all learners, rather than predominantly on the basis of the proportion achieving the standard national benchmarks.
I suggested that, in deriding ‘the romance of the poor but bright’, Mr Thomas ran the risk of falling into ‘the slough of anti-elitism’. He rejected that suggestion, while continuing to emphasise the need to ‘concentrate more’ on ‘those at risk of never being able to engage with society’.
I have made the assumption that Thomas is interested primarily in KS2 and GCSE or equivalent qualifications at KS4 given his references to KS2 L4, basic skills and ‘paper qualifications needed to enter meaningful employment’.
But his additional references to ‘real qualifications’ (as opposed to paper ones) and engaging with society could well imply a wider range of personal, social and work-related skills for employability and adult life.
My preference for equal priority would apply regardless: there is no guarantee that high attainers from disadvantaged backgrounds will necessarily possess these vital skills.
But, as indicated in the definition above, there is an important distinction to be maintained between:
educational support to raise the attainment, learning and employability skills of socio-economically disadvantaged learners and prepare them for adult life and
support to manage a range of difficulties – whether behavioural problems, disability, physical or mental impairment – that impact on the broader life chances of the individuals concerned.
Such a distinction may well be masked in the everyday business of providing effective holistic support for learners facing such difficulties, but this debate requires it to be made and sustained given Mr Thomas’s definition of the problem in terms of the comparative treatment of the ‘poor but bright’ and the ‘poor but dim’.
Having made this distinction, it is not clear whether he himself sustains it consistently through to the end of his post. In the final paragraphs the term ‘poor but dim’ begins to morph into a broader notion encompassing all AP and SEN learners regardless of their socio-economic status.
Additional dimensions of disadvantage are potentially being brought into play. This is inconsistent and radically changes the nature of the argument.
By inputs I mean the resources – financial and human – made available to support the education of ‘dim’ and ‘bright’ disadvantaged learners respectively.
Mr Thomas also shifts his ground as far as inputs are concerned.
His post opens with a statement that ‘the effort and resources’ of schools, charities and businesses are ‘directed disproportionately’ at the poor but bright – and he exemplifies this with reference to fair access to competitive universities, particularly Oxbridge.
When I point out the significant investment in AP compared with fair access, he changes tack – ‘I’m measuring outcomes not just inputs’.
Then later he says ‘But what some need is just more expensive’, to which I respond that ‘the bottom end already has the lion’s share of funding’.
At this point we have both fallen into the trap of treating the entirety of the AP and SEN budgets as focused on the ‘poor but dim’.
We are failing to recognise that they are poor proxies because the majority of AP and SEN learners are not ‘poor’, many are not ‘dim’, these budgets are focused on a wider range of needs and there is significant additional expenditure directed at ‘poor but dim’ learners elsewhere in the wider education budget.
Despite Mr Thomas’s opening claim, it should be reasonably evident from the preceding commentary that my ‘lion’s share’ point is factually correct. His suggestion that AP is ‘largely unknown and wholly unappreciated’ flies in the face of the Taylor Review and the Government’s subsequent work programme.
SEN may depend heavily on the ‘extraordinary effort of dedicated staff’, but at least there are such dedicated staff! There may be inconsistencies in local authority funding and support for SEN, but the global investment is colossal by comparison with the funding dedicated on the other side of the balance.
Gifted Phoenix’s position acknowledges that inputs are heavily loaded in favour of the SEN and AP budgets. This is as it should be since, as Thomas rightly notes, many of the additional services they need are frequently more expensive to provide. These services are not simply dedicated to raising their attainment, but also to tackling more substantive problems associated with their status.
Whether the balance of expenditure on the ‘bright’ and ‘dim’ respectively is optimal is a somewhat different matter. Contrary to Mr Thomas’s position, gifted advocates are often convinced that too much largesse is focused on the latter at the expense of the former.
Turning to advocacy, Mr Thomas says ‘we end up ignoring the more pressing problem’ of the poor but dim. He argues in the Twitter discussion that too few people are advocating for these learners, adding that they are failed ‘because it’s not popular to talk about them’.
I could not resist countering that advocacy for gifted learners is equally unpopular, indeed ‘the word is literally taboo in many settings’. I cannot help thinking – from his footnote reference to ‘epigenetic coding’ – that Mr Thomas is amongst those who are distinctly uncomfortable with the term.
Where advocacy does survive it is focused exclusively on progression to competitive universities and, to some extent, high attainment as a route towards that outcome. The narrative has shifted away from concepts of high ability or giftedness, because of the very limited consensus about that condition (even amongst gifted advocates) and even considerable doubt in some quarters whether it exists at all.
Mr Thomas maintains in his post that the successes of his preferred target group ‘have the power to change the British economy, far more so than those of their brighter peers’. This is because ‘the gap most damaging to society is in life outcomes for the children that perform least well at school’.
As noted above, it is important to remember that we are discussing here the addition of educational and economic value by tackling underachievement amongst learners from disadvantaged backgrounds, rather than amongst all the children that perform least well.
We are also leaving to one side the addition of value through any wider engagement by health and social services to improve life chances.
It is quite reasonable to advance the argument that improving the outcomes of ‘the bottom 20%’ (the Tail) will have ‘a huge socio-economic impact’ and ‘make the biggest marginal difference to society’.
But one could equally make the case that society would derive similar or even higher returns from a decision to concentrate disproportionately on the highest attainers (the Smart Fraction).
Or, as Gifted Phoenix would prefer, one could reasonably propose that the optimal returns should be achieved by means of a balanced approach that raises both the floor and the ceiling, avoiding any arbitrary distinctions on the basis of prior attainment.
From the Gifted Phoenix perspective, one should balance the advantages of removing the drag on productivity of an educational underclass against those of developing the high-level human capital needed to drive economic growth and improve our chances of success in what Coalition ministers call the ‘global race’.
According to this perspective, by eliminating excellence gaps between disadvantaged and advantaged high attainers we will secure a stream of benefits broadly commensurate to that at the bottom end.
These will include substantial spillover benefits, achieved as a result of broadening the pool of successful leaders in political, social, educational and artistic fields, not to mention significant improvements in social mobility.
It is even possible to argue that, by creating a larger pool of more highly educated parents, we can also achieve a significant positive impact on the achievement of subsequent generations, thus significantly reducing the size of the tail.
And in the present generation we will create many more role models: young people from disadvantaged backgrounds who become educationally successful and who can influence the aspirations of younger disadvantaged learners.
This avoids the risk that low expectations will be reinforced and perpetuated through a ‘deficit model’ approach that places excessive emphasis on removing the drag from the tail by producing a larger number of ‘useful members of society’.
It seems to me entirely conceivable that economists might produce calculations to justify any of these different paths.
But it would be highly inequitable to put all our eggs in the ‘poor but bright’ basket, because that penalises some disadvantaged learners for their failure to achieve high attainment thresholds.
And it would be equally inequitable to focus exclusively on the ‘poor but dim’, because that penalises some disadvantaged learners for their success in becoming high attainers.
The more equitable solution must be to opt for a ‘balanced scorecard’ approach that generates a proportion of the top end benefits and a proportion of the bottom end benefits simultaneously.
There is a risk that this reduces the total flow of benefits, compared with one or other of the inequitable solutions, but there is a trade-off here between efficiency and a socially desirable outcome that balances the competing interests of the two groups.
The personal dimension
After we had finished our Twitter exchanges, I thought to research Mr Thomas online. Turns out he’s quite the Big-Cheese-in-Embryo. Provided he escapes the lure of filthy lucre, he’ll be a mover and shaker in education within the next decade.
I couldn’t help noticing his own educational experience – public school, a First in PPE from Oxford, leading light in the Oxford Union – then graduation from Teach First alongside internships with Deutsche Bank and McKinsey.
Now he’s serving his educational apprenticeship as joint curriculum lead for maths at a prominent London Academy. He’s also a trustee of ‘a university mentoring project for highly able 11-14 year old pupils from West London state schools’.
Lucky I didn’t check earlier. Such a glowing CV might have been enough to cow this grammar school Oxbridge reject, even if I did begin this line of work several years before he was born. Not that I have a chip on my shoulder…
The experience set me wondering about the dominant ideology amongst the Teach First cadre, and how it is tempered by extended exposure to teaching in a challenging environment.
There’s more than a hint of idealism about someone from this privileged background espousing the educational philosophy that Mr Thomas professes. But didn’t he wonder where all the disadvantaged people were during his own educational experience, and doesn’t he want to change that too?
His interest in mentoring highly able pupils would suggest that he does, but also seems directly to contradict the position he’s reached here. It would be a pity if the ‘poor but bright’ could not continue to rely on his support, equal in quantity and quality to the support he offers the ‘poor but dim’.
For he could make a huge difference at both ends of the attainment spectrum – and, with his undeniable talents, he should certainly be able to do so.
We are entertaining three possible answers to the question whether in principle to prioritise the needs of the ‘poor but bright’ or the ‘poor but dim’:
Concentrate principally – perhaps even exclusively – on closing the Excellence Gaps at the top
Concentrate principally – perhaps even exclusively – on closing the Foundation Gaps at the bottom
Concentrate equally across the attainment spectrum, at the top and bottom and all points in between.
Speaking as an advocate for those at the top, I favour the third option.
It seems to me incontrovertible – though hard to quantify – that, in the English education system, the lion’s share of resources go towards closing the Foundation Gaps.
That is perhaps as it should be, although one could wish that the financial scales were not tipped quite so excessively in that direction, for ‘poor but bright’ learners do in my view have an equal right to challenge and support, and should not be penalised for their high attainment.
Our current efforts to understand the relative size of the Foundation and Excellence Gaps and how these are changing over time are seriously compromised by the limited data in the public domain.
There is a powerful economic case to be made for prioritising the Foundation Gaps as part of a deliberate strategy for shortening the tail – but an equally powerful case can be constructed for prioritising the Excellence Gaps, as part of a deliberate strategy for increasing the smart fraction.
Neither of these options is optimal from an equity perspective, however. The stream of benefits might be compromised somewhat by not focusing exclusively on one or the other, but a balanced approach should otherwise be in our collective best interests.
You may or may not agree. Here is a poll so you can register your vote. Please use the comments facility to share your wider views on this post.
We must beware the romance of the poor but bright.
The sample of jurisdictions includes England, other English-speaking countries (Australia, Canada, Ireland and the USA) and those that typically top the PISA rankings (Finland, Hong Kong, South Korea, Shanghai, Singapore and Taiwan).
With the exception of New Zealand, which did not take part in the problem solving assessment, this is deliberately identical to the sample I selected for a parallel post reviewing comparable results in the PISA 2012 assessments of reading, mathematics and science: ‘PISA 2012: International Comparisons of High Achievers’ Performance’ (December 2013).
These eleven jurisdictions account for nine of the top twelve performers ranked by mean overall performance in the problem solving assessment. (The USA and Ireland lie outside the top twelve, while Japan, Macao and Estonia are the three jurisdictions that are in the top twelve but outside my sample.)
The post is divided into seven sections:
Background to the problem solving assessment: How PISA defines problem solving competence; how it defines performance at each of the six levels of proficiency; how it defines high achievement; the nature of the assessment and who undertook it.
Average performance, the performance of high achievers and the performance of low achievers (proficiency level 1) on the problem solving assessment. This comparison includes my own sample and all the other jurisdictions that score above the OECD average on the first of these measures.
Gender and socio-economic differences amongst high achievers on the problem solving assessment in my sample of eleven jurisdictions.
The relative strengths and weaknesses of jurisdictions in this sample on different aspects of the problem solving assessment. (This treatment is generic rather than specific to high achievers.)
What proportion of high achievers on the problem-solving assessment in my sample of jurisdictions are also high achievers in reading, maths and science respectively.
What proportion of students in my sample of jurisdictions achieves highly in one or more of the four PISA 2012 assessments – and against the ‘all-rounder’ measure, which is based on high achievement in all of reading, maths and science (but not problem solving).
Implications for education policy makers seeking to improve problem solving performance in each of the sample jurisdictions.
Background to the Problem Solving Assessment
Definition of problem solving
PISA’s definition of problem-solving competence is:
‘…an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.’
The commentary on this definition points out that:
Problem solving requires identification of the problem(s) to be solved, planning and applying a solution, and monitoring and evaluating progress.
A problem is ‘a situation in which the goal cannot be achieved by merely applying learned procedures’, so the problems encountered must be non-routine for 15 year-olds, although ‘knowledge of general strategies’ may be useful in solving them.
Motivational and affective factors are also in play.
The Report is rather coy about the role of creativity in problem solving, and hence the justification for the inclusion of this term in its title.
Perhaps the nearest it gets to an exposition is when commenting on the implications of its findings:
‘In some countries and economies, such as Finland, Shanghai-China and Sweden, students master the skills needed to solve static, analytical problems similar to those that textbooks and exam sheets typically contain as well or better than 15-year-olds, on average, across OECD countries. But the same 15-year-olds are less successful when not all information that is needed to solve the problem is disclosed, and the information provided must be completed by interacting with the problem situation. A specific difficulty with items that require students to be open to novelty, tolerate doubt and uncertainty, and dare to use intuitions (“hunches and feelings”) to initiate a solution suggests that opportunities to develop and exercise these traits, which are related to curiosity, perseverance and creativity, need to be prioritised.’
PISA’s framework for assessing problem solving competence is set out in the following diagram.
In solving a particular problem it may not be necessary to apply all these steps, or to apply them in this order.
The proficiency scale was designed to have a mean score across OECD countries of 500. The six levels of proficiency applied in the assessment each have their own profile.
The lowest, level 1 proficiency is described thus:
‘At Level 1, students can explore a problem scenario only in a limited way, but tend to do so only when they have encountered very similar situations before. Based on their observations of familiar scenarios, these students are able only to partially describe the behaviour of a simple, everyday device. In general, students at Level 1 can solve straightforward problems provided there is a simple condition to be satisfied and there are only one or two steps to be performed to reach the goal. Level 1 students tend not to be able to plan ahead or set sub-goals.’
This level equates to a range of scores from 358 to 423. Across the OECD sample, 91.8% of participants are able to perform tasks at this level.
By comparison, level 5 proficiency is described in this manner:
‘At Level 5, students can systematically explore a complex problem scenario to gain an understanding of how relevant information is structured. When faced with unfamiliar, moderately complex devices, such as vending machines or home appliances, they respond quickly to feedback in order to control the device. In order to reach a solution, Level 5 problem solvers think ahead to find the best strategy that addresses all the given constraints. They can immediately adjust their plans or backtrack when they detect unexpected difficulties or when they make mistakes that take them off course.’
The associated range of scores is from 618 to 683 and 11.4% of all OECD students achieve at this level.
Finally, level 6 proficiency is described in this way:
‘At Level 6, students can develop complete, coherent mental models of diverse problem scenarios, enabling them to solve complex problems efficiently. They can explore a scenario in a highly strategic manner to understand all information pertaining to the problem. The information may be presented in different formats, requiring interpretation and integration of related parts. When confronted with very complex devices, such as home appliances that work in an unusual or unexpected manner, they quickly learn how to control the devices to achieve a goal in an optimal way. Level 6 problem solvers can set up general hypotheses about a system and thoroughly test them. They can follow a premise through to a logical conclusion or recognise when there is not enough information available to reach one. In order to reach a solution, these highly proficient problem solvers can create complex, flexible, multi-step plans that they continually monitor during execution. Where necessary, they modify their strategies, taking all constraints into account, both explicit and implicit.’
The range of level 6 scores is from 683 points upwards and 2.5% of all OECD participants score at this level.
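The cut-scores quoted above (358, 423, 618 and 683) are enough to sketch how an individual score maps onto the bands discussed in this post. The function below is purely illustrative: the boundaries for levels 2 to 4 are not quoted here, so those levels are grouped together.

```python
# Illustrative only: map a PISA 2012 problem-solving score onto the
# proficiency bands whose boundaries are quoted in the text above.
# Levels 2-4 are grouped because their cut-scores are not given here.
def proficiency_band(score: float) -> str:
    if score < 358:
        return "below level 1"
    if score < 423:
        return "level 1"     # 358 to 423
    if score < 618:
        return "levels 2-4"  # boundaries not quoted in this post
    if score < 683:
        return "level 5"     # 618 to 683
    return "level 6"         # 683 upwards
```

So a student scoring 650 would sit at level 5, and one scoring 700 at level 6, the band PISA treats as its highest.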
PISA defines high achieving students as those securing proficiency level 5 or higher, so proficiency levels 5 and 6 together. The bulk of the analysis it supplies relates to this cohort, while relatively little attention is paid to the more exclusive group achieving proficiency level 6, even though almost 10% of students in Singapore reach this standard in problem solving.
Sixty-five jurisdictions took part in PISA 2012, including all 34 OECD countries and 31 partners. But only 44 jurisdictions took part in the problem solving assessment, including 28 OECD countries and 16 partners. As noted above, that included all my original sample of twelve jurisdictions, with the exception of New Zealand.
I could find no stated reason why New Zealand chose not to take part. Press reports initially suggested that England would do likewise, but it was subsequently reported that this decision had been reversed.
The assessment was computer-based and comprised 16 units divided into 42 items. The units were organised into four clusters, each designed to take 20 minutes to complete. Participants completed one or two clusters, depending on whether they were also undertaking computer-based assessments of reading and maths.
In each jurisdiction a random sample of those who took part in the paper-based maths assessment was selected to undertake the problem solving assessment. About 85,000 students took part in all. The unweighted sample sizes in my selected jurisdictions are set out in Table 1 below, together with the total population of 15 year-olds in each jurisdiction.
Table 1: Sample sizes undertaking PISA 2012 problem solving assessment in selected jurisdictions
Total 15 year-olds
Those taking the assessment were aged between 15 years and three months and 16 years and two months at the time of the assessment. All were enrolled at school and had completed at least six years of formal schooling.
Average performance compared with the performance of high and low achievers
The overall table of mean scores on the problem solving assessment is shown below.
There are some familiar names at the top of the table, especially Singapore and South Korea, the two countries that comfortably lead the rankings. Japan is some ten points behind in third place but it in turn has a lead of twelve points over a cluster of four other Asian competitors: Macao, Hong Kong, Shanghai and Taiwan.
A slightly different picture emerges if we compare average performance with the proportion of learners who achieve the bottom proficiency level and the top two proficiency levels. Table 2 below compares these groups.
This table includes all the jurisdictions that exceeded the OECD average score. I have marked in bold the countries in my sample of eleven, which includes Ireland, the only one of them that did not exceed the OECD average.
Table 2: PISA Problem Solving 2012: Comparing Average Performance with Performance at Key Proficiency Levels
Level 1 (%)
Level 5 (%)
Level 6 (%)
Levels 5+6 (%)
The jurisdictions at the top of the table also have a familiar profile, with a small ‘tail’ of low performance combined with high levels of performance at the top end.
Nine of the top ten have fewer than 10% of learners at proficiency level 1, though only South Korea pushes below 5%.
Five of the top ten have 5% or more of their learners at proficiency level 6, but only Singapore and South Korea have a higher percentage at level 6 than level 1 (with Japan managing the same percentage at both levels).
The top three performers – Singapore, South Korea and Japan – are the only three jurisdictions that have over 20% of their learners at proficiency levels 5 and 6 together.
South Korea slightly outscores Singapore at level 5 (20.0% against 19.7%). Japan is in third place, followed by Taiwan, Hong Kong and Shanghai.
But at level 6, Singapore has a clear lead, followed by South Korea, Japan, Hong Kong and Canada respectively.
England’s overall place in the table is relatively consistent on each of these measures, but the gaps between England and the top performers vary considerably.
The best have fewer than half England’s proportion of learners at proficiency level 1, almost twice as many learners at proficiency level 5 and more than twice as many at proficiency levels 5 and 6 together. But at proficiency level 6 they have almost three times as many learners as England.
Chart 1 below compares performance on these four measures across my sample of eleven jurisdictions.
All but Ireland are comfortably below the OECD average for the percentage of learners at proficiency level 1. The USA and Ireland are atypical in having a bigger tail (proficiency level 1) than their cadres of high achievers (levels 5 and 6 together).
At level 5 all but Ireland and the USA are above the OECD average, but the USA leapfrogs the OECD average at level 6.
There is a fairly strong correlation between the proportions of learners achieving the highest proficiency thresholds and average performance in each jurisdiction. However, Canada stands out by having an atypically high proportion of students at level 6.
PISA’s Report discusses the variation in problem-solving performance within different jurisdictions. However it does so without reference to the proficiency levels, so we do not know to what extent these findings apply equally to high achievers.
Amongst those above the OECD average, those with least variation are Macao, Japan, Estonia, Shanghai, Taiwan, Korea, Hong Kong, USA, Finland, Ireland, Austria, Singapore and the Czech Republic respectively.
Perhaps surprisingly, the degree of variation in Finland is identical to that in the USA and Ireland, while Estonia has less variation than many of the Asian jurisdictions. Singapore, while top of the performance table, is only just above the OECD average in terms of variation.
The countries below the OECD average on this measure – listed in order of increasing variation – include England, Australia and Canada, though all three are relatively close to the OECD average. So these three countries and Singapore are all relatively close together.
Gender and socio-economic differences amongst high achievers
On average across OECD jurisdictions, boys score seven points higher than girls on the problem solving assessment. There is also more variation amongst boys than girls.
Across the OECD participants, 3.1% of boys achieved proficiency level 6 but only 1.8% of girls did so. This imbalance was repeated at proficiency level 5, achieved by 10% of boys and 7.7% of girls.
The table and chart below show the variations within my sample of eleven countries. The performance of boys exceeds that of girls in all cases, except in Finland at proficiency level 5, and in that instance the gap in favour of girls is relatively small (0.4%).
Table 3: PISA Problem-solving: Gender variation at top proficiency levels
Level 5 (%)
Level 6 (%)
Levels 5+6 (%)
There is no consistent pattern in whether boys are more heavily over-represented at proficiency level 5 than proficiency level 6, or vice versa.
There is a bigger difference at level 6 than at level 5 in Singapore, South Korea, Canada, Australia, Finland and Ireland, but the reverse is true in the five remaining jurisdictions.
At level 5, boys are in the greatest ascendancy in Shanghai and Taiwan while, at level 6, this is true of Singapore and South Korea.
When proficiency levels 5 and 6 are combined, all five of the Asian tigers show a difference in favour of males of 5.5% or higher, significantly in advance of the six ‘Western’ countries in the sample and significantly ahead of the OECD average.
Amongst the six ‘Western’ representatives, boys have the biggest advantage at proficiency level 5 in England, while at level 6 boys in Ireland have the biggest advantage.
Within this group of jurisdictions, the gap between boys and girls at level 6 is comfortably the smallest in England. But, in terms of performance at proficiency levels 5 and 6 together, Finland is ahead.
Chart 2: PISA Problem-solving: Gender variation at top proficiency levels
The Report includes a generic analysis of gender differences in performance for boys and girls with similar levels of performance in reading, maths and science.
It concludes that girls perform above their expected level in both England and Australia (though the difference is statistically significant only in the latter).
The Report comments:
‘It is not clear whether one should expect there to be a gender gap in problem solving. On the one hand, the questions posed in the PISA problem-solving assessment were not grounded in content knowledge, so boys’ or girls’ advantage in having mastered a particular subject area should not have influenced results. On the other hand… performance in problem solving is more closely related to performance in mathematics than to performance in reading. One could therefore expect the gender difference in performance to be closer to that observed in mathematics – a modest advantage for boys, in most countries – than to that observed in reading – a large advantage for girls.’
The Report considers variations in performance against PISA’s index of economic, social and cultural status (ESCS), finding them weaker overall than for reading, maths and science.
It calculates that the overall percentage variation in performance attributable to these factors is about 10.6% (compared with 14.9% in maths, 14.0% in science and 13.2% in reading).
Amongst the eleven jurisdictions in my sample, the weakest correlations were found in Canada (4%), followed by Hong Kong (4.9%), South Korea (5.4%), Finland (6.5%), England (7.8%), Australia (8.5%), Taiwan (9.4%), the USA (10.1%) and Ireland (10.2%) in that order. All those jurisdictions had correlations below the OECD average.
Perhaps surprisingly, there were above average correlations in Shanghai (14.1%) and, to a lesser extent (and less surprisingly) in Singapore (11.1%).
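The ‘percentage of variation in performance attributable’ figures quoted above are, in effect, R² values: the squared correlation between student scores and the socio-economic index. A minimal sketch of that calculation, using invented data (the index values and scores below are not from PISA):

```python
# Sketch: the "% of variation explained" figures quoted above are R^2
# values - the squared correlation between scores and the index.
# All data here are invented for illustration only.
def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov * cov / (var_x * var_y)

escs = [-1.2, -0.5, 0.0, 0.4, 1.1, 1.6]   # socio-economic index (invented)
scores = [455, 480, 500, 515, 540, 560]   # problem-solving scores (invented)
share = r_squared(escs, scores)           # proportion of variance explained
```

A `share` of 0.106 would correspond to the 10.6% overall figure the Report quotes for problem solving.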
The report suggests that students with parents working in semi-skilled and elementary occupations tend to perform above their expected level in problem-solving in Taiwan, England, Canada, the USA, Finland and Australia (in that order – with Australia closest to the OECD average).
The jurisdictions where these students tend to underperform their expected level are – in order of severity – Ireland, Shanghai, Singapore, Hong Kong and South Korea.
A parallel presentation accompanying the Report provides some additional data about the performance in different countries of what the OECD calls ‘resilient’ students – those in the bottom quartile of ESCS but in the top quartile by performance, after accounting for socio-economic status.
It supplies the graph below, which shows all the Asian countries in my sample clustered at the top, but also with significant gaps between them. Canada is the highest-performing of the remainder in my sample, followed by Finland, Australia, England and the USA respectively. Ireland is some way below the OECD average.
Unfortunately, I can find no analysis of how performance varies according to socio-economic variables at each proficiency level. It would be useful to see which jurisdictions have the smallest ‘excellence gaps’ at levels 5 and 6 respectively.
How different jurisdictions perform on different aspects of problem-solving
The Report’s analysis of comparative strengths and weaknesses in different elements of problem-solving does not take account of variations at different proficiency levels.
It explains that aspects of the assessment were found easier by students in different jurisdictions, employing a four-part distinction between:
‘Exploring and understanding. The objective is to build mental representations of each of the pieces of information presented in the problem. This involves:
exploring the problem situation: observing it, interacting with it, searching for information and finding limitations or obstacles; and
understanding given information and, in interactive problems, information discovered while interacting with the problem situation; and demonstrating understanding of relevant concepts.
Representing and formulating. The objective is to build a coherent mental representation of the problem situation (i.e. a situation model or a problem model). To do this, relevant information must be selected, mentally organised and integrated with relevant prior knowledge. This may involve:
representing the problem by constructing tabular, graphic, symbolic or verbal representations, and shifting between representational formats; and
formulating hypotheses by identifying the relevant factors in the problem and their inter-relationships; and organising and critically evaluating information.
Planning and executing. The objective is to use one’s knowledge about the problem situation to devise a plan and execute it. Tasks where “planning and executing” is the main cognitive demand do not require any substantial prior understanding or representation of the problem situation, either because the situation is straightforward or because these aspects were previously solved. “Planning and executing” includes:
planning, which consists of goal setting, including clarifying the overall goal, and setting subgoals, where necessary; and devising a plan or strategy to reach the goal state, including the steps to be undertaken; and
executing, which consists of carrying out a plan.
Monitoring and reflecting. The objective is to regulate the distinct processes involved in problem solving, and to critically evaluate the solution, the information provided with the problem, or the strategy adopted. This includes:
monitoring progress towards the goal at each stage, including checking intermediate and final results, detecting unexpected events, and taking remedial action when required; and
reflecting on solutions from different perspectives, critically evaluating assumptions and alternative solutions, identifying the need for additional information or clarification and communicating progress in a suitable manner.’
Amongst my sample of eleven jurisdictions:
‘Exploring and understanding’ items were found easier by students in Singapore, Hong Kong, South Korea, Australia, Taiwan and Finland.
‘Representing and formulating’ items were found easier in Taiwan, Shanghai, South Korea, Singapore, Hong Kong, Canada and Australia.
‘Planning and executing’ items were found easier in Finland only.
‘Monitoring and reflecting’ items were found easier in Ireland, Singapore, the USA and England.
The Report concludes:
‘This analysis shows that, in general, what differentiates high-performing systems, and particularly East Asian education systems, such as those in Hong Kong-China, Japan, Korea [South Korea], Macao-China, Shanghai-China, Singapore and Chinese Taipei [Taiwan], from lower-performing ones, is their students’ high level of proficiency on “exploring and understanding” and “representing and formulating” tasks.’
It also distinguishes those jurisdictions that perform best on interactive problems, requiring students to discover some of the information required to solve the problem, rather than being presented with all the necessary information. This seems to be the nearest equivalent to a measure of creativity in problem solving.
Comparative strengths and weaknesses in respect of interactive tasks are captured in the following diagram.
One can see that several of my sample – Ireland, the USA, Canada, Australia, South Korea and Singapore – are placed in the top right-hand quarter of the diagram, indicating stronger than expected performance on both interactive and knowledge acquisition tasks.
England is stronger than expected on the former but not on the latter.
Jurisdictions that are weaker than expected on interactive tasks only include Hong Kong, Taiwan and Shanghai, while Finland is weaker than expected on both.
We have no information about whether these distinctions were maintained at different proficiency levels.
Comparing jurisdictions’ performance at higher proficiency levels
Table 4 and Charts 3 and 4 below show variations in the performance of countries in my sample across the four different assessments at level 6, the highest proficiency level.
The charts in particular emphasise how far ahead the Asian Tigers are in maths at this level, compared with the cross-jurisdictional variation in the other three assessments.
In all five cases, each ‘Asian Tiger’s’ level 6 performance in maths also vastly exceeds its level 6 performance in the other three assessments. The proportion of students achieving level 6 proficiency in problem solving lags far behind, even though there is a fairly strong correlation between these two assessments (see below).
In contrast, all the ‘Western’ jurisdictions in the sample – with the sole exception of Ireland – achieve a higher percentage at proficiency level 6 in problem solving than they do in maths, although the difference is always less than a full percentage point. (Even in Ireland the difference is only 0.1 of a percentage point in favour of maths.)
Shanghai is the only jurisdiction in the sample which has more students achieving proficiency level 6 in science than in problem solving. It also has the narrowest gap between level 6 performance in problem solving and in reading.
Meanwhile, England, the USA, Finland and Australia all have broadly similar profiles across the four assessments, with the largest percentage of level 6 performers in problem solving, followed by maths, science and reading respectively.
The proximity of the lines marking level 6 performance in reading and science is also particularly evident in the second chart below.
Table 4: Percentage achieving proficiency Level 6 in each domain
Charts 3 and 4: Percentage achieving proficiency level 6 in each domain
The pattern is materially different at proficiency levels 5 and above, as the table and chart below illustrate. These also include the proportion of all-rounders, who achieved proficiency level 5 or above in each of maths, science and reading (but not in problem-solving).
The lead enjoyed by the ‘Asian Tigers’ in maths is somewhat less pronounced. The gap between performance within these jurisdictions on the different assessments also tends to be less marked, although maths accounts for comfortably the largest proportion of level 5+ performance in all five cases.
Conversely, level 5+ performance on the different assessments is typically much closer in the ‘Western’ countries. Problem solving leads the way in Australia, Canada, England and the USA, but in Finland science is in the ascendant and reading is strongest in Ireland.
Some jurisdictions have a far ‘spikier’ profile than others. Ireland is closest to achieving equilibrium across all four assessments. Australia and England share very similar profiles, though Australia outscores England in each assessment.
The second chart in particular shows how Shanghai’s ‘spike’ applies in all the other three assessments but not in problem solving.
Table 5: Percentage achieving Proficiency level 5 and above in each domain
Ma + Sci + Re L5+
5.7* all UK
Charts 5 and 6: Percentage Achieving Proficiency Level 5 and above in each domain
How high-achieving problem solvers perform in other assessments
Correlations between performance in different assessments
The Report provides an analysis of the proportion of students achieving proficiency levels 5 and 6 on problem solving who also achieved that outcome on one of the other three assessments: reading, maths and science.
It argues that problem solving is a distinct and separate domain. However:
‘On average, about 68% of the problem-solving score reflects skills that are also measured in one of the three regular assessment domains. The remaining 32% reflects skills that are uniquely captured by the assessment of problem solving. Of the 68% of variation that problem-solving performance shares with other domains, the overwhelming part is shared with all three regular assessment domains (62% of the total variation); about 5% is uniquely shared between problem solving and mathematics only; and about 1% of the variation in problem solving performance hinges on skills that are specifically measured in the assessments of reading or science.’
It discusses the correlation between these different assessments:
‘A key distinction between the PISA 2012 assessment of problem solving and the regular assessments of mathematics, reading and science is that the problem-solving assessment does not measure domain-specific knowledge; rather, it focuses as much as possible on the cognitive processes fundamental to problem solving. However, these processes can also be used and taught in the other subjects assessed. For this reason, problem-solving tasks are also included among the test units for mathematics, reading and science, where their solution requires expert knowledge specific to these domains, in addition to general problem-solving skills.
It is therefore expected that student performance in problem solving is positively correlated with student performance in mathematics, reading and science. This correlation hinges mostly on generic skills, and should thus be about the same magnitude as between any two regular assessment subjects.’
These overall correlations are set out in the table below, which shows that maths has a higher correlation with problem solving than either science or reading, but that this correlation is lower than those between the three subject-related assessments.
The correlation between maths and science (0.90) is comfortably the strongest (despite the relationship between reading and science at the top end of the distribution noted above).
Correlations are broadly similar across jurisdictions, but the Report notes that the association is comparatively weak in some of these, including Hong Kong. Students here are more likely to perform poorly on problem solving and well on other assessments, or vice versa.
There is also broad consistency at different performance levels, but the Report identifies those jurisdictions where students with the same level of performance exceed expectations in relation to problem-solving performance. These include South Korea, the USA, England, Australia, Singapore and – to a lesser extent – Canada.
Those with lower than expected performance include Shanghai, Ireland, Hong Kong, Taiwan and Finland.
The Report notes:
‘In Shanghai-China, 86% of students perform below the expected level in problem solving, given their performance in mathematics, reading and science. Students in these countries/economies struggle to use all the skills that they demonstrate in the other domains when asked to perform problem-solving tasks.’
However, there is variation according to students’ maths proficiency:
Jurisdictions whose high scores on problem solving are mainly attributable to strong performers in maths include Australia, England and the USA.
Jurisdictions whose high scores on problem solving are more attributable to weaker performers in maths include Ireland.
Jurisdictions whose lower scores in problem solving are more attributable to weakness among strong performers in maths include Korea.
Jurisdictions whose lower scores in problem solving are more attributable to weakness among weak performers in maths include Hong Kong and Taiwan.
Jurisdictions whose weakness in problem solving is fairly consistent regardless of performance in maths include Shanghai and Singapore.
The Report adds:
‘In Italy, Japan and Korea, the good performance in problem solving is, to a large extent, due to the fact that lower performing students score beyond expectations in the problem-solving assessment….This may indicate that some of these students perform below their potential in mathematics; it may also indicate, more positively, that students at the bottom of the class who struggle with some subjects in school are remarkably resilient when it comes to confronting real-life challenges in non-curricular contexts…
In contrast, in Australia, England (United Kingdom) and the United States, the best students in mathematics also have excellent problem-solving skills. These countries’ good performance in problem solving is mainly due to strong performers in mathematics. This may suggest that in these countries, high performers in mathematics have access to – and take advantage of – the kinds of learning opportunities that are also useful for improving their problem-solving skills.’
What proportion of high performers in problem solving are also high performers in one of the other assessments?
The percentages of high achieving students (proficiency level 5 and above) in my sample of eleven jurisdictions who perform equally highly in each of the three domain-specific assessments are shown in Table 6 and Chart 7 below.
These show that Shanghai leads the way in each case, with 98.0% of all students who achieve proficiency level 5+ in problem solving also achieving the same outcome in maths. For science and reading the comparable figures are 75.1% and 71.7% respectively.
Taiwan is the nearest competitor in respect of problem solving plus maths, Finland in the case of problem solving plus science and Ireland in the case of problem solving plus reading.
South Korea, Taiwan and Canada are atypical of the rest in recording a higher proportion of problem solving plus reading at this level than problem solving plus science.
Singapore, Shanghai and Ireland are the only three jurisdictions that score above 50% on all three of these combinations. However, the only jurisdictions that exceed the OECD averages in all three cases are Singapore, Hong Kong, Shanghai and Finland.
Table 6: PISA problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments
(Columns: PS + Ma; PS + Sci; PS + Re)
Chart 7: PISA Problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments
What proportion of students achieve highly in one or more assessments?
Table 7 and Chart 8 below show how many students in each jurisdiction in my sample achieved proficiency level 5 or higher: in problem solving only; in problem solving and one or more other assessments; in one or more other assessments but not problem solving; and in at least one assessment (i.e. the total of the three preceding columns).
I have also repeated in the final column the percentage achieving this proficiency level in each of maths, science and reading. (PISA has not released information about the proportion of students who achieved this feat across all four assessments.)
These reveal that the percentages of students who achieve proficiency level 5+ only in problem solving are very small, ranging from 0.3% in Shanghai to 6.7% in South Korea.
Conversely, the percentages of students achieving proficiency level 5+ in any one of the other assessments but not in problem solving are typically significantly higher, ranging from 4.5% in the USA to 38.1% in Shanghai.
There is quite a bit of variation in whether jurisdictions score more highly on ‘problem solving and at least one other’ (second column) or ‘at least one other excluding problem solving’ (third column).
More importantly, the fourth column shows that the jurisdiction with the most students achieving proficiency level 5 or higher in at least one assessment is clearly Shanghai, followed by Singapore, Hong Kong, South Korea and Taiwan in that order.
The proportion of students achieving this outcome in Shanghai is close to three times the OECD average, comfortably more than twice the rate achieved in any of the ‘Western’ countries and three times the rate achieved in the USA.
The same is true of the proportion of students achieving this level in the three domain-specific assessments.
On this measure, South Korea and Taiwan fall significantly behind their Asian competitors, and the latter is overtaken by Australia, Finland and Canada.
Table 7: Percentage achieving proficiency level 5+ in different combinations of PISA assessments
(Columns: PS + 1 or more %; 1+ but not PS %; L5+ in at least one %; L5+ in Ma + Sci + Re %. The starred figure, 5.7, relates to all UK.)
Chart 8: Percentage achieving proficiency level 5+ in different combinations of PISA assessments
The Report comments:
‘The proportion of students who reach the highest levels of proficiency in at least one domain (problem solving, mathematics, reading or science) can be considered a measure of the breadth of a country’s/economy’s pool of top performers. By this measure, the largest pool of top performers is found in Shanghai-China, where more than half of all students (56%) perform at the highest levels in at least one domain, followed by Singapore (46%), Hong Kong-China (40%), Korea and Chinese Taipei (39%)…Only one OECD country, Korea, is found among the five countries/economies with the largest proportion of top performers. On average across OECD countries, 20% of students are top performers in at least one assessment domain.
‘The proportion of students performing at the top in problem solving and in either mathematics, reading or science, too, can be considered a measure of the depth of this pool. These are top performers who combine the mastery of a specific domain of knowledge with the ability to apply their unique skills flexibly, in a variety of contexts. By this measure, the deepest pools of top performers can be found in Singapore (25% of students), Korea (21%), Shanghai-China (18%) and Chinese Taipei (17%). On average across OECD countries, only 8% of students are top performers in both a core subject and in problem solving.’
There is no explanation of why proficiency level 5 should be equated by PISA with the breadth of a jurisdiction’s ‘pool of top performers’. The distinction between proficiency levels 5 and 6 in this respect requires further discussion.
In addition to updated ‘all-rounder’ data showing what proportion of students achieved this outcome across all four assessments, it would be really interesting to see the proportion of students achieving at proficiency level 6 across different combinations of these four assessments – and to see what proportion of students achieving that outcome in different jurisdictions are direct beneficiaries of targeted support, such as a gifted education programme.
In the light of this analysis, what are jurisdictions’ priorities for improving problem solving performance?
Leaving aside strengths and weaknesses in different elements of problem solving discussed above, this analysis suggests that the eleven jurisdictions in my sample should address the following priorities:
Singapore has a clear lead at proficiency level 6, but falls behind South Korea at level 5 (though Singapore re-establishes its ascendancy when levels 5 and 6 are considered together). It also has more level 1 performers than South Korea. It should perhaps focus on reducing the size of this tail and pushing through more of its mid-range performers to level 5. There is a pronounced imbalance in favour of boys at level 6, so enabling more girls to achieve the highest level of performance is a clear priority. There may also be a case for prioritising the children of semi-skilled workers.
South Korea needs to focus on getting a larger proportion of its level 5 performers to level 6. This effort should be focused disproportionately on girls, who are significantly under-represented at both levels 5 and 6. South Korea has a very small tail to worry about – and may even be getting close to minimising this. It needs to concentrate on improving the problem solving skills of its stronger performers in maths.
Hong Kong has a slightly bigger tail than Singapore’s but is significantly behind at both proficiency levels 5 and 6. In the case of level 6 it is equalled by Canada. Hong Kong needs to focus simultaneously on reducing the tail and lifting performance across the top end, where girls and weaker performers in maths are a clear priority.
Shanghai has a similar profile to Hong Kong’s in all respects, though with somewhat fewer level 6 performers. It also needs to focus effort simultaneously at the top and the bottom of the distribution. Amongst this sample, Shanghai has the worst under-representation of girls at level 5 and at levels 5 and 6 together, so addressing that imbalance is an obvious priority. It also demonstrated the largest variation in performance against PISA’s ESCS index, which suggests that it should target young people from disadvantaged backgrounds, as well as the children of semi-skilled workers.
Taiwan is rather similar to Hong Kong and Shanghai, but its tail is slightly bigger and its level 6 cadre slightly smaller, while it does somewhat better at level 5. It may need to focus more at the very bottom, but also at the very top. Taiwan also has a problem with high-performing girls, second only to Shanghai as far as level 5 and levels 5 and 6 together are concerned. However, like Shanghai, it does comparatively better than the other ‘Asian Tigers’ in terms of girls at level 6. It also needs to consider the problem solving performance of its weaker performers in maths.
Canada is the closest western competitor to the ‘Asian Tigers’ in terms of the proportions of students at levels 1 and 5 – and it already outscores Shanghai and Taiwan at level 6. It needs to continue cutting down the tail without compromising achievement at the top end. Canada also has small but significant gender imbalances in favour of boys at the top end.
Australia by comparison is significantly worse than Canada at level 1, broadly comparable at level 5 and somewhat worse at level 6. It too needs to improve scores at the very bottom and the very top. Australia’s gender imbalance is more pronounced at level 6 than level 5.
Finland has the same mean score as Australia’s but a smaller tail (though not quite as small as Canada’s). It needs to improve across the piece but might benefit from concentrating rather more heavily at the top end. Finland has a slight gender imbalance in favour of girls at level 5, but boys are more in the ascendancy at level 6 than in either England or the USA. As in Australia, this latter point needs addressing.
England has a profile similar to Australia’s, but is less effective at all three selected proficiency levels. It is further behind at the top than at the bottom of the distribution, but needs to work hard at both ends to catch up with the strongest western performers and maintain its advantage over the USA and Ireland. Gender imbalances are small but nonetheless significant.
The USA has a comparatively long tail of low achievement at proficiency level 1 and, with the exception of Ireland, the fewest high achievers. This profile is very close to the OECD average. As in England, the relatively small size of gender imbalances in favour of boys does not mean that these can be ignored.
Ireland has the longest tail of low achievement and the smallest proportion of students at proficiency levels 5, 6 and 5 and 6 combined. It needs to raise the bar at both ends of the achievement distribution. Ireland has a larger preponderance of boys at level 6 than its Western competitors and this needs addressing. The limited socio-economic evidence suggests that Ireland should also be targeting the offspring of parents with semi-skilled and elementary occupations.
So there is further scope for improvement in all eleven jurisdictions. Meanwhile the OECD could usefully provide a more in-depth analysis of high achievers on its assessments that features:
Proficiency level 6 performance across the board.
Socio-economic disparities in performance at proficiency levels 5 and 6.
‘All-rounder’ achievement at these levels across all four assessments and
Correlations between success at these levels and specific educational provision for high achievers including gifted education programmes.
Across the Blogosphere and five of the most influential English-language social media platforms – Facebook, Google+, LinkedIn, Twitter and You Tube – and
Utilising four content curation tools particularly favoured by gifted educators, namely Paper.li, Pinterest, Scoop.it and Storify.
Considers the gap between current practice and the proposed quality criteria – and whether there has been an improvement in the application of social media across the five dimensions of gifted education identified in my previous post.
I should declare at the outset that I am a Trustee of Potential Plus UK and have been working with them to improve their online and social media presence. This post lies outside that project, but some of the underlying research is the same.
I have been this way before
This is my second excursion into this territory.
In September 2012 I published a two-part response to the question ‘Can Social Media Help Overcome the Problems We Face in Gifted Education?’
Part One outlined an analytical framework based on five dimensions of gifted education. Each dimension is stereotypically associated with a particular stakeholder group though, in reality, each group operates across more than one area. The dimensions (with their associated stakeholder groups in brackets) are: advocacy (parents); learning (learners); policy-making (policy makers); professional development (educators); and research (academics).
Part Two used this framework to review the challenges faced by gifted education, to what extent these were being addressed through social media and how social media could be applied more effectively to tackle them. It also outlined the limitations of a social media-driven approach and highlighted some barriers to progress.
The conclusions I reached might be summarised as follows:
Many of the problems associated with gifted education are longstanding and significant, but not insurmountable. Social media will not eradicate these problems but can make a valuable contribution towards that end by virtue of their unrivalled capacity to ‘only connect’.
Gifted education needs to adapt if it is to thrive in a globalised environment with an increasingly significant online dimension driven by a proliferation of social media. The transition from early adoption to mainstream practice has not yet been effected, but rapid acceleration is necessary otherwise gifted education will be left behind.
Gifted education is potentially well-placed to pioneer new developments in social media but there is limited awareness of this opportunity, or the benefits it could bring.
The post was intended to inform discussion at a Symposium at the ECHA Conference in Munster, Germany in September 2012. I published the participants’ presentations and a report on proceedings (which is embedded within a review of the Conference as a whole).
I have not previously attempted to pin down what constitutes a high quality website or blog and effective social media usage, not least because so many have gone before me.
But, on reviewing their efforts, I could find none that embodied every dimension I considered important, while several appeared unduly restrictive.
It seems virtually impossible to reconcile these two conflicting pressures – defining quality briefly yet without compromising flexibility. Any effort to pin down quality risks reductionism, while also fettering innovation and obstructing the pioneering spirit.
I am a strong advocate of quality standards in gifted education but, in this context, it seemed beyond my capacity to find or generate the ideal ‘flexible framework’, offering clear guidance without compromising innovation and capacity to respond to widely varying needs and circumstances.
But the project for Potential Plus UK required us to consult stakeholders on their understanding of quality provision, so that we could reconcile any difference between their perceptions and our own.
And, in order to consult effectively, we needed to make a decent stab at the task ourselves.
So I prepared some draft success criteria, drawing on previous efforts I could find online as well as my own experience over the last four years.
I have reproduced the draft criteria below, with slight amendment to make them more universally applicable. The first set – for a blog or website – are generic, while those relating to wider online and social media presence are made specific to gifted education.
Draft Quality Criteria for a Blog or Website
1. The site is inviting to regular and new readers alike; its purpose is up front and explicit; as much content as possible is accessible to all.
2. Readers are encouraged to interact with the content through a variety of routes – and to contribute their own (moderated) content.
3. The structure is logical and as simple as possible, supported by clear signposting and search.
4. The design is contemporary, visually attractive but not obtrusive, incorporating consistent branding and a complementary colour scheme. There is no external advertising.
5. The layout makes generous and judicious use of space and images – and employs other media where appropriate.
6. Text is presented in small blocks and large fonts to ensure readability on both tablet and PC.
7. Content is substantial, diverse and includes material relevant to all the site’s key audiences.
8. New content is added weekly; older material is frequently archived (but remains accessible).
9. The site links consistently to – and is linked to consistently by – all other online and social media outlets maintained by the authors.
10. Readers can access site content by multiple routes, including other social media, RSS and email.
Draft quality criteria for wider online/social media activity
1. A body’s online and social media presence should be integral to its wider communications strategy which should, in turn, support its purpose, objectives and priorities.
2. It should:
a. Support existing users – whether they are learners, parents/carers, educators, policy-makers or academics – and help to attract new users;
b. Raise the entity’s profile and build its reputation – both nationally and internationally – as a first-rate provider in one or more of the five areas of gifted education;
c. Raise the profile of gifted education as an issue and support campaigning for stronger provision;
d. Help to generate income to support the pursuit of these objectives and the body’s continued existence.
3. It should aim to:
a. Provide a consistently higher quality and more compelling service than its main competitors, generating maximum benefit for minimum cost.
b. Use social media to strengthen interaction with and between users and provide more effective ‘bottom-up’ collaborative support.
c. Balance diversity and reach against manageability and effectiveness, prioritising media favoured by users but resisting pressure to diversify without justification and resource.
d. Keep the body’s online presence coherent and uncomplicated, with clear and consistent signposting so users can navigate quickly and easily between different online locations.
e. Integrate all elements of the body’s online presence, ensuring they are mutually supportive.
4. It should monitor carefully the preferences of users, as well as the development of online and social media services, adjusting the approach only when there is a proven business case for doing so.
Perth Pelicans by Gifted Phoenix
Applying the Criteria
These draft criteria reflect the compromise I outlined above. They are not the final word. I hope that you will help us to refine them as part of the consultation process now underway and I cannot emphasise too much that they are intended as guidelines, to be applied with some discretion.
I continue to maintain my inalienable right – as well as yours – to break any rules imposed by self-appointed arbiters of quality.
To give an example, readers will know that I am particularly exercised by any suggestion that good blog posts are, by definition, brief!
I also maintain your inalienable right to impose your own personal tastes and preferences alongside (or in place of) these criteria. But you might prefer to do so having reflected on the criteria – and having dismissed them for logical reasons.
There are also some fairly obvious limitations to these criteria.
For example, bloggers like me who use hosted platforms are constrained to some extent by the restrictions imposed by the host, as well as by our preparedness to pay for premium features.
Moreover, the elements of effective online and social media practice have been developed with a not-for-profit charity in mind and some in particular may not apply – or may not apply so rigorously – to other kinds of organisations, or to individuals engaged in similar activity.
In short, these are not templates to be followed slavishly, but rather a basis for reviewing existing provision and prompting discussion about how it might be further improved.
It would be presumptuous of me to attempt a rigorous scrutiny of the six key players mentioned above – or of any of the host of smaller players, including the 36 active gifted education blogs now listed on my blogroll – against each of the criteria.
I will confine myself instead to reporting factually all that I can find in the public domain about the activity of the six bodies, comparing and contrasting their approaches with broad reference to the criteria and arriving at an overall impressionistic judgement.
As for the blogs, I will be even more tactful, pointing out that my own quick and dirty self-review of this one – allocating a score out of ten for each of the ten items in the first set of criteria – generated a not very impressive 62%.
Of course I am biased. I still think my blog is better than yours, but now I have some useful pointers to how I might make it even better!
Comparing six major players
I wanted to compare the social media profile of the most prominent international organisations, the most active national organisations based in the US (which remains the dominant country in gifted education and in supporting gifted education online) and the two major national organisations in the UK.
I could have widened my reach to include many similar organisations around the world, but that would have made this post less accessible. It also struck me that I could evidence my key messages through analysis of this small sample alone – and that my conclusions would apply equally to others in the field, wherever they are located.
My analysis focuses on these organisations’:
Principal websites, including any information they contain about their wider online and social media activity;
Profile across the five selected social media platforms and use of blogs plus the four featured curational tools.
I have confined myself to universally accessible material, since several of these organisations have additional material available only to their memberships.
I have included only what I understand to be official channels, tied explicitly to the main organisation. I have included accounts that are linked to franchised operations – typically conferences – but have excluded personal accounts that belong to individual employees or trustees of the organisations in question.
Table 1 below shows which of the six organisations are using which social media. The table includes hyperlinks to the principal accounts and I have also repeated these in the commentary that follows.
Table 1: The social media used by the sample of six organisations
The table gives no information about the level or quality of activity on each account – that will be addressed in the commentary below – but it gives a broadly reliable indication of which organisations are comparatively active in social media and which are less so.
The analysis shows that Facebook and Twitter are somewhat more popular platforms than Google+, LinkedIn and You Tube, while Pinterest leads the way amongst the curational tools. This distribution of activity is broadly representative of the wider gifted education community.
The next section takes a closer look at this wider activity on each of the ten platforms and tools.
Comparing gifted-related activity on the ten selected platforms and tools
As far as I can establish, none of the six organisations currently maintains a blog. SENG does have what it describes as a Library of Articles, which is a blog to all intents and purposes – and Potential Plus UK is currently planning a blog.
Earlier this year I noticed that my blogroll was extremely out of date and that several of the blogs it contained were no longer active. I reviewed all the blogs I could find in the field and sought recommendations from others.
I imposed a rule to distinguish live blogs from those that are dead or dormant – they had to have published three or more relevant posts in the previous six months.
I also applied a slightly more subjective rule, in an effort to sift out those that had little relevance to anyone beyond the author (being cathartic diaries of sorts) and those that are entirely devoted to servicing a small local advocacy group.
I ended up with a long shortlist of 36 blogs, which now constitutes the revised blogroll in the right hand column. Most are written in English but I have also included a couple of particularly active blogs in other languages.
The overall number of active blogs is broadly comparable with what I remember in 2010 when I first began, but the number of posts has probably fallen.
I don’t know to what extent this reflects changes in the overall number of active blogs and posts, either generically or in the field of education. In England there has been a marked renaissance in edublogging over the last twelve months, yet only three bloggers venture regularly into the territory of gifted education.
Alongside Twitter, Facebook has the most active gifted education community.
There are dozens of Facebook Groups focused on giftedness and high ability. At the time of writing, the largest and most active are:
There is a Gifted Phoenix page, which is rigged up to my Twitter account so that all my tweets are relayed there. Only those carrying a relevant hashtag – #gtchat or #gtvoice – relate to gifted education.
To date there is comparatively little activity on Google+, though many have established an initial foothold there.
Part of the problem is lack of familiarity with the platform, but another obstacle is the limited capacity to connect other parts of one’s social media footprint with one’s Google+ presence.
There is only one Google+ Community to speak of: ‘Gifted and Talented’ currently with 134 members.
A search reveals a large number of people and pages ostensibly relevant to gifted education, but few are useful and many are dormant.
My own Google+ page is dormant. It should now be possible to have WordPress.com blogposts appear automatically on a Google+ page, but the service seems unreliable. There is no capacity to link Twitter and Google+ in this fashion. I am waiting on Google to improve the connectivity of their service.
LinkedIn is also comparatively little used by the gifted education community. There are several groups:
But none is particularly active, despite the rather impressive numbers above. Similarly, a handful of organisations have company pages on LinkedIn, but only one or two are active.
The search purports to include a staggering 98,360 people who mention ‘gifted’ in their profiles, but basic account holders can only see 100 results at a time.
My own LinkedIn page is registered under my real name rather than my social media pseudonym and is focused principally on my consultancy activity. I often forget it exists.
By comparison, Twitter is much more lively.
My brief January post mentioned my Twitter list containing every user I could find who mentions gifted education (or a similar term, whether in English or a selection of other languages) in their profile.
The list currently contains 1,263 feeds. You are welcome to subscribe to it. If you want to see it in action first, it is embedded in the right-hand column of this Blog, just beneath the blogroll.
The majority of the gifted-related activity on Twitter takes place under the #gtchat hashtag, which tends to be busier than even the most popular Facebook pages.
This hashtag also accommodates an hour-long real-time chat every Friday (at around midnight UK time) and, at least once a month, on Sundays at a time more conducive to European participants.
Other hashtags carrying information about gifted education include: #gtvoice (UK-relevant), #gtie (Ireland-relevant), #hoogbegaafd (Dutch-speaking); #altascapacidades (Spanish-speaking), #nagc and #gifteded.
Chats also take place on the #gtie and #nagc hashtags, though the latter may now be discontinued.
Several feeds provide gifted-relevant news and updates from around the world. Amongst the most followed are:
The most viewed video is called ‘Top 10 Myths in Gifted Education’, a dramatised presentation which was uploaded in March 2010 by the Gifted and Talented Association of Montgomery County. This has had almost 70,000 views.
Gifted Phoenix does not have a You Tube presence.
Paper.li describes itself as ‘a content curation service’ which ‘enables people to publish newspapers based on topics they like and treat their readers to fresh news, daily.’
It enables curators to draw on material from Facebook, Twitter, Google+, embeddable You Tube videos and websites via RSS feeds.
In September 2013 it reported 3.7m users each month.
I found six gifted-relevant ‘papers’ with over 1,000 subscriptions:
There is, as yet, no Gifted Phoenix presence on paper.li, though I have been minded for some months to give it a try.
Pinterest is built around a pinboard concept. Pins are illustrated bookmarks designating something found online or already on Pinterest, while Boards are used to organise a collection of pins. Users can follow each other and others’ boards.
Pinterest is said to have 70 million users, of whom 80% are female.
A search on ‘gifted education’ reveals hundreds of boards dedicated to the topic, but unfortunately there is no obvious way to rank them by number of followers or number of pins.
Since advanced search capability is conspicuous by its absence, the user apparently has little choice but to sift laboriously through each board. I have not undertaken this task so I can bring you no useful information about the most used and most popular boards.
Judging by the names attached to these boards, they are owned almost exclusively by women. It is interesting to hypothesise about what causes this gender imbalance – and whether Pinterest is actively pursuing female users at the expense of males.
There are, however, some organisations in the field making active use of Pinterest. A search of ‘pinners’ suggests that amongst the most popular are:
IAGC Gifted which has 26 boards, 734 pins and 400 followers.
Gifted Phoenix is male and does not have a presence on Pinterest…yet!
Scoop.it stores material on a page that sits somewhere between a paper.li-style newspaper and a Pinterest-style board. It is reported to have almost seven million unique visitors each month.
‘Scoopable’ material is drawn together via URLs, a programmable ‘suggestions engine’ and other social media, including all the ‘big four’. However, the free version permits a user to link only two social media accounts, significantly restricting Scoop.it’s curational capacity.
Scoop.it also has limited search engine capability. It is straightforward to conduct an elementary search like this one on ‘gifted’ which reveals 107 users.
There is no quick way of finding those pages that are most used or most followed, but one can hover over the search results for topics to find out which have most views:
Storify is a slightly different animal to the other three tools. It describes itself as:
‘the leading social storytelling platform, enabling users to easily collect tweets, photos, videos and media from across the web to create stories that can be embedded on any website. With Storify, anyone can curate stories from the social web to embed on their own site and share on the Storify platform.’
Estimates of user numbers vary but typically range from 850,000 to one million.
Storify is a flexible tool whose free service permits one to collect material already located on the platform and from a range of other sources including Twitter, Facebook, YouTube, Flickr, Instagram, Google search, Tumblr – or via RSS or URL.
The downside is that there is no way to search within Storify for stories or users, so I cannot report on the level of activity or recommend users worth following.
However, a Google search reveals that users of Storify include:
These tiny numbers show that Storify has not really taken off as a curatorial platform in its own right, though it is an excellent supporting tool, particularly for recording transcripts of Twitter chats.
So, having reviewed wider gifted education-related activity on these ten social media platforms and tools, it is time to revisit the online and social media profile of the six selected organisations.
The WCGTC website was revised in 2012 and has a clear and contemporary design.
The Council’s Mission Statement has a strong networking feel to it and elsewhere the website emphasises the networking benefits associated with membership:
‘…But while we’re known for our biennial conference the spirit of sharing actually goes on year round among our membership.
By joining the World Council you can become part of this vital network and have access to hundreds of other peers while learning about the latest developments in the field of gifted children.’
The home page includes direct links to the organisation’s Facebook Page and Twitter feed. There is also an RSS feed symbol but it is not active.
Both Twitter and Facebook are of course available to members and non-members alike.
At the time of writing, the Facebook page has 1,616 ‘likes’ and is relatively current, with five posts in the last month, though there is relatively little comment on these.
The Twitter feed typically manages a daily Tweet. Hashtags are rarely, if ever, employed. At the time of writing the feed has 1,076 followers.
Almost all the Tweets are links to a daily paper.li production ‘WCGTC Daily’ which was first published in late July 2013, just before the last biennial conference. This has 376 subscribers at the present time, although the gifted education coverage is selective and limited.
As noted above, the World Council website provides links to two of its six strands of social media activity, but not the remaining four. It is not yet serving as an effective hub for the full range of this activity.
Some of the strands link together well – eg Twitter to paper.li – but there is considerable scope to improve the incidence and frequency of cross-referencing.
Of the six organisations in this sample, ECHA is comfortably the least active in social media with only a Facebook page available to supplement its website.
The site itself is rather old-fashioned and could do with a refresh. It includes a section ‘Introducing ECHA’ which emphasises the organisation’s networking role:
‘The major goal of ECHA is to act as a communications network to promote the exchange of information among people interested in high ability – educators, researchers, psychologists, parents and the highly able themselves. As the ECHA network grows, provision for highly able people improves and these improvements are beneficial to all members of society.’
There is no reference on the website to the Facebook group, which is closed, though membership is not confined solely to ECHA members. There are currently 191 members. The group is fairly active, but does not rival the far larger groups listed above.
There’s not much evidence of cross-reference between the Facebook group and the website, but that may be because the website is infrequently updated.
As with the World Council, ECHA conferences have their own social media profile.
At the 2012 Conference in Munster this was left largely to the delegates. Several of us live-tweeted the event.
I blogged about the Conference and my part in it, providing links to transcripts of the Twitter record. The post concluded with a series of learning points for this year’s ECHA Conference in Slovenia.
The Conference website explains that the theme of the 2014 event is ‘Rethinking Giftedness: Giftedness in the Digital Age’.
Six months ahead of the event, there is a Twitter feed with 29 followers that has been dormant for three months at the time of writing and a LinkedIn group with 47 members that has been quiet for five months.
A Forum was also established which has not been used for over a year. There is no information on the website about how the event will be supported by social media.
I sincerely hope that my low expectations will not be fulfilled!
SENG is far more active across social media. Its website carries a 2012 copyright notice and has a more contemporary feel than many of the others in this sample.
The bottom of the home page extends an invitation to ‘connect with the SENG community’ and carries links to Facebook, Twitter and LinkedIn (though not to Google+ or YouTube).
In addition, each page carries a set of buttons to support the sharing of this information across a wide range of social media.
The organisation’s Strategic Plan 2012-2017 makes only fleeting reference to social media, in relation to creating a ‘SENG Liaison Facebook page’ to support inter-state and international support.
It does, however, devote one of its nine goals to the further development of its webinar programme (each webinar costs $40 to attend, and non-participants can purchase a recording for $40).
SENG offers online parent support groups but does not state which platform is used to host these. It has a Technology/Social Media Committee but its proceedings are not openly available.
Reference has already been made above to the principal Facebook Page which is popular, featuring posts on most days and a fair amount of interaction from readers.
The parallel group for SENG Liaisons is also in place, but is closed to outsiders, which rather seems to defeat the object.
The SENG Twitter feed is relatively well followed and active on most days. The LinkedIn page is somewhat less active but can boast 142 followers while Google+ is clearly a new addition to the fold.
The YouTube channel does, however, have 257 subscribers and carries 16 videos, most of them featuring presentations by James Webb. Rather strangely, these don’t seem to feature in the media library carried by the website.
SENG is largely a voluntary organisation with little staff resource, but it is successfully using social media to extend its footprint and global influence. There is, however, scope to improve coherence and co-ordination.
National Association for Gifted Children
The NAGC’s website is also in need of a refresh. Its copyright notice dates from 2008, which was probably when it was designed.
There are no links to social media on the home page. ‘NAGC at a glance’ carries a direct link to the Facebook group and a Twitter logo without a link, while the page listing NAGC staff has working links to both Facebook and Twitter.
In the past, NAGC has been more active in this field.
This post was filled by July 2013. The postholder seems to have been concentrating primarily on editing the magazine edition of Parenting High Potential, which is confined to members only (but also has a Facebook presence – see below).
NAGC’s website carries a document called ‘NAGC leadership initiatives 2013-14’ which suggests further developments in the next few months.
The initiatives include:
‘Leverage content to intentionally connect NAGC resources, products and programs to targeted audiences through an organization-wide social media strategy.’
‘Implement a new website and membership database that integrates with social media and provides a state-of-the-art user interface.’
One might expect NAGC to build on its current social media profile which features:
A Facebook Group which currently has 2,420 members and is reasonably active, though not markedly so. Relatively few posts generate significant comments.
There is additional activity associated with the Annual NAGC Convention. There was extensive live tweeting from the 2013 Convention under the rival hashtags #NAGC2013 and #NAGC13. #NAGC14 looks the favourite for this year’s Convention, which has also established a Facebook presence.
NAGC also has its own networks. The website lists 15 of these but hardly any of their pages give details of their social media activity. A cursory review reveals that:
Overall, NAGC has a fairly impressive array of social media activity but demonstrates relatively little evidence of strategic coherence and co-ordination. This may be expected to improve in the next six months, however.
NACE is not quite the poorest performer in our sample but, like ECHA, it has so far made relatively little progress towards effective engagement with social media.
Its website dates from 2010 but looks older. Prominent links to Twitter and Facebook appear on the front page as well as – joy of joys – an RSS feed.
However, the Facebook link is not to a NACE-specific page or group and the RSS feed doesn’t work.
There are references on the website to the networking benefits of NACE membership, but not to any role for the organisation in wider networking activity via social media. Current efforts seem focused primarily on advertising NACE and its services to prospective members and purchasers.
The Twitter feed has a respectable 1,426 followers but Tweets tend to appear in blocks of three or four spaced a few days apart. Quality and relevance are variable.
Whereas the old Facebook page had reached 1,344 likes, the new one is currently at roughly half that level – 683 – but the level of activity is reasonably impressive.
There is a third Facebook page dedicated to the organisation’s ‘It’s Alright to Be Bright’ campaign, which is not quite dormant.
All website pages carry buttons supporting information-sharing via a wide range of social media outlets. But there is little reference in the website content to its wider social media activity.
The Twitter feed is fairly lively, boasting 1,093 followers. It currently has some 400 fewer followers than NACE but has published about 700 more Tweets. Both are publishing at about the same rate. Quality and relevance are similarly variable.
The LinkedIn page is little more than a marker and does not list the products offered.
The Google+ presence uses the former NAGC Britain name and is also no more than a marker.
But the level of activity on Pinterest is more significant. There are 14 boards containing a total of 271 pins between them and attracting 26 followers. This material has been uploaded during 2014.
I shall begin by reflecting on Gifted Phoenix’s profile across the ten elements included in this analysis:
He has what he believes is a reasonable blog.
He is one of the leading authorities on gifted education on Twitter (if not the leading authority).
His Facebook profile consists almost exclusively of ‘repeats’ from his Twitter feed.
His LinkedIn page reflects a different identity and is not connected properly to the rest of his profile.
His Google+ presence is embryonic.
He has used Scoop.it and Storify to some extent, but not Paper.li or Pinterest.
GP currently has a rather small social media footprint, since he is concentrating on doing only two things – blogging and microblogging – effectively.
He might be advised to extend his sphere of influence by distributing the limited available human resource more equitably across the range of available media.
On the other hand he is an individual with no organisational objectives to satisfy. Fundamentally he can follow his own preferences and inclinations.
Maybe he should experiment with this post, publishing it as widely as possible and monitoring the impact via his blog analytics…
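Monitoring that impact need not be laborious. A rough sketch of the idea, assuming a hypothetical analytics export in CSV form with ‘referrer’ and ‘views’ columns (the format and figures below are invented):

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical analytics export: one row per referral, giving the
# referring platform and a view count. Data below is illustrative only.
SAMPLE_EXPORT = """referrer,views
twitter.com,120
facebook.com,45
twitter.com,60
linkedin.com,15
"""

def views_by_referrer(csv_text):
    """Total up blog views per referring platform."""
    totals = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        totals[row["referrer"]] += int(row["views"])
    return totals

if __name__ == "__main__":
    # Print platforms from most to least traffic delivered.
    for platform, views in views_by_referrer(SAMPLE_EXPORT).most_common():
        print(platform, views)
```

Tallying referrals per platform in this way would show which of the media experimented with actually delivered readers.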
The Six Organisations
There is a strong correlation between the size of each organisation’s social media footprint and the effectiveness with which it uses social media.
There are no obvious examples – in this sample at least – of organisations that have a small footprint because of a deliberate choice to specialise in a narrow range of media.
If we were to rank the six in order of effectiveness, the World Council, NAGC and SENG would be vying for top place, while ECHA and NACE would be competing for bottom place and Potential Plus UK would be somewhere in the middle.
But none of the six organisations would achieve more than a moderate assessment against the two sets of quality criteria. All of them have huge scope for improvement.
Their priorities will vary, according to what is set out in their underlying social media strategies. (If they have no social media strategy, the obvious priority is to develop one, or to revise it if it is outdated.)
The Overall Picture across the Five Aspects of Gifted Education
This analysis has been based on the activities of a small sample of six generalist organisations in the gifted education field, as well as wider activity involving a cross-section of tools and platforms.
It has not considered providers who specialise in one of the five aspects – advocacy, learning, professional development, policy-making and research – or the use being made of specialist social media, such as MOOCs and research tools.
So the judgements that follow are necessarily approximate. But nothing I have seen across the wider spectrum of social media over the past 18 months would seriously call into question the conclusions reached below.
Advocacy via social media is slightly stronger than it was in 2012, but there is still much insularity and too little progress has been made towards a joined-up global movement. The international organisations remain fundamentally inward-looking and have been unable to offer the leadership and sense of direction required. The grip of the old guard has been loosened and some of the cliquey atmosphere has dissipated, but academic research remains the dominant culture.
Learning via social media remains limited. There are still several niche providers but none has broken through in a global sense. The scope for fruitful partnership between gifted education interests and one or more of the emerging MOOC powerhouses remains unfulfilled. The potential for social media to support coherent and targeted blended learning solutions – and to support collaborative learning amongst gifted learners worldwide – is still largely unexploited.
Professional development via social media has been developed at a comparatively modest level by several providers, but the prevailing tendency seems to be to regard this as a ‘cash cow’ generating income to support other activities. There has been negligible progress towards securing the benefits that would accrue from systematic international collaboration.
Policy-making via social media is still the poor relation. The significance of policy-making (and of policy makers) within gifted education is little appreciated and little understood. What engagement there is seems focused disproportionately on lobbying politicians, rather than on developing at working level practical solutions to the policy problems that so many countries face in common.
Research via social media is negligible. The vast majority of academic researchers in the field are still caught in a 20th-century paradigm built around publication in paywalled journals and a perpetual round of face-to-face conferences. I have not seen any significant examples of collaboration between researchers. A few make a real effort to convey key research findings through social media but most do not. Some of NAGC’s networks are beginning to make progress, and the 2013 World Conference went further than any of its predecessors in sharing proceedings with those who could not attend. Now the pressure is on the EU Talent Conference in Budapest and ECHA 2014 in Slovenia to push beyond this new standard.
Overall progress has been limited and rather disappointing. The three conclusions I drew in 2012 remain valid.
In September 2012 I concluded that ‘rapid acceleration is necessary otherwise gifted education will be left behind’. Eighteen months on, there are some indications of slowly gathering speed, but the gap between practice in gifted education and leading practice has widened meanwhile – and the chances of closing it seem increasingly remote.
Back in 2010 and 2011 several of my posts had an optimistic ring. It seemed then that there was an opportunity to ‘only connect’ globally, but also at European level via the EU Talent Centre and in the UK via GT Voice. But both those initiatives are faltering.
My 2012 post also finished on an optimistic note:
‘Moreover, social media can make a substantial and lasting contribution to the scope, value and quality of gifted education, to the benefit of all stakeholders, but ultimately for the collective good of gifted learners.
No, ‘can’ is too cautious, non-assertive, unambitious. Let’s go for WILL instead!’
Now in 2014 I am resigned to the fact that there will be no great leap forward. The very best we can hope for is disjointed incremental improvement achieved through competition rather than collaboration.
I will be doing my best for Potential Plus UK. Now what about you?
We discussed the issue of labelling gifted learners and the idea that such labels may not be permanent sifting devices, but temporary markers attached to such learners only while they need additional challenge and support.
This is not to deny that some gifted learners may warrant a permanent marker, but it does imply that many – probably most – will move in and out of scope as they develop in non-linear fashion and differentially to their peers.
Of course much depends on one’s understanding of giftedness and gifted education, a topic I have addressed frequently, starting with my inaugural post in May 2010.
Three-and-a-half years on, it seems to me that the default position has shifted somewhat further towards the Nurture, Equity and Personalisation polarities.
But the notion of giftedness as dynamic in both directions – with learners shifting in and out of scope as they develop – may be an exception to that broader direction of travel.
Of course there’s been heavy emphasis on movement into scope (the broader notion of giftedness as learned behaviour and achievable through effort) but very little attention given to progress in the opposite direction.
It is easy to understand how this would be a red rag to several bulls in the gifted education field, while outward movement raises difficult questions for everybody – whether or not they are advocates for gifted education – about communication and the management of self-esteem.
But reform and provocation are often stalwart bedfellows. Feel free to vent your spleen in the comments section below.