This post examines what PISA 2012 can tell us about the comparative performance of high achievers in England, other English-speaking countries and those that top the PISA rankings.
It draws on a similar range of evidence to that deployed in my post on the PISA 2009 results (December 2010).
A more recent piece, ‘The Performance of Gifted High Achievers in TIMSS, PIRLS and PISA’ (January 2013) is also relevant.
The post reviews:
- How the PISA 2012 Assessment Framework defines reading, mathematical and scientific literacy, and how it defines high achievement in each of the three core domains.
- How average (headline) performance on the three core measures has changed in each jurisdiction compared with PISA 2006 and PISA 2009.
- By comparison, how high achievers’ performance – and the balance between high and low achievers’ performance – has changed in each jurisdiction over the same period.
- How jurisdictions compare on the ‘all-rounder’ measure, derived from achievement of a high performance threshold on all three assessments.
The twelve jurisdictions included in the main analysis are: Australia, Canada, England, Finland, Hong Kong (China), Ireland, New Zealand, Shanghai (China), Singapore, South Korea, Taiwan and the USA.
The post also compares the performance of the five home countries against the high achievement thresholds. I have foregrounded this analysis, which appears immediately below, save only for the headline (but potentially misleading) ‘top 10’ high achiever rankings for 2012.
World Leaders against PISA’s High Achievement Benchmarks
The top 10 performers in PISA 2012 against the high achievement benchmarks (Level 5 and above), in reading, maths and science respectively, are set out in Table 1 below.
The 2009 rankings are shown in brackets and the 2012 overall average rankings in bold, square brackets. I have also included England’s rankings.
| Rank | Reading | Maths | Science |
|---|---|---|---|
| 1 | Shanghai (1) | Shanghai (1) | Shanghai (1) |
| 2 | Singapore (3) | Singapore (2) | Singapore (2) |
| 3 | Japan (5) | Taiwan (4) | Japan (5) |
| 4 | Hong Kong (9) | Hong Kong (3) | Finland (3) |
| 5 | S Korea (6) | S Korea (5) | Hong Kong (6) |
| 6 | N Zealand (2) | Liechtenstein (13) | Australia (7) |
| 7 | Finland (4) | Macao (15) | N Zealand (4) |
| 8 | Canada (7=) | Japan (8) | Estonia (17) |
| 9 | France (13) | Switzerland (6) | Germany (8) |
| 10 | Belgium (10) | Belgium (9) | Netherlands (9) |
| England | 19th (19) | 24th (32) | 11th (12) |
On the basis of these crude rankings alone, it is evident that Shanghai has maintained its ascendancy across all three domains.
Singapore has reinforced its runner-up position by overtaking New Zealand in reading. Hong Kong and Japan also make it into the top ten in all three domains.
Notable improvements in the rankings have been made by:
- Japan, Hong Kong and France in reading
- Liechtenstein and Macao in maths
- Japan and Estonia in science
Jurisdictions falling down the rankings include:
- Australia, New Zealand and Finland in reading
- Finland and Switzerland in maths
- Canada and New Zealand in science.
Those whose high achiever rankings significantly exceed their average rankings include:
- New Zealand, France and Belgium in reading
- Belgium in maths
- Australia, New Zealand, Germany and the Netherlands in science
The only one of the top ten jurisdictions exhibiting the reverse pattern with any degree of significance is Hong Kong, in science.
On this evidence, England has maintained its relatively strong showing in science and a mid-table position in reading, but it has slipped several places in maths.
Comparing England’s rankings for high achievers with its rankings for average performance:
- Reading 19th versus 23rd
- Maths 24th versus 25th
- Science 11th versus 18th
This suggests that England is substantively stronger at the top end of the achievement spectrum in science, slightly stronger in reading and almost identical in maths. (The analysis below explores whether this is borne out by the proportions of learners achieving the relevant PISA thresholds.)
Overall, these rankings suggest that England is a respectable performer at the top end, but nothing to write home about. It is not deteriorating, relatively speaking – with the possible exception of mathematics – but it is not improving significantly either. The imbalance is not atypical and it requires attention, but only as part of a determined effort to build performance at both ends.
Comparing the Home Countries’ Performance
Table 2 below shows how each home country has performed at Level 5 and above in each of the three core PISA assessments since 2006.
[Table 2: percentages at Level 5 and above in 2012, 2009 and 2006 for each home country.]
In 2012, England is ahead of the other home countries in all three domains. Northern Ireland is runner-up in reading and science, Scotland in maths. Wales is a long way behind the other four in all three assessments.
Only England tops the OECD average in reading. All the home countries fall below the OECD average in maths, though all but Wales are above it in science.
Compared with 2006, England’s performance has changed little in reading, increased somewhat in maths (having fallen back in the interim) and fallen quite significantly in science.
In comparison, Northern Ireland is on a downward trend in all three domains, as is Scotland (though it produced small improvements in maths and reading in 2009). Wales has fallen back significantly in science, though somewhat less so in reading and maths.
It seems that none of the home countries is particularly outstanding when it comes to the performance of their high achievers, but England is the strongest of the four, while Wales is clearly the weakest.
A slightly different perspective can be gained by comparing high and low performance in 2012.
Table 3 below shows that the proportion of low achievers is comfortably larger than the proportion of high achievers. This is true of all the home countries and all subjects, though the difference is less pronounced in science across the board and also in Scotland. Conversely, the imbalance is much more significant in Wales.
The ‘tail’ in reading is significantly higher than the OECD average in all four countries but – with the exception of Wales – somewhat lower in science.
In maths, the ‘tail’ is higher than the OECD average in Wales and Northern Ireland, but below average in England and Scotland.
The average figures suggest that, across the OECD as a whole, the top and bottom are broadly balanced in reading, there is a small imbalance in science towards the bottom end and a more significant imbalance in maths, again towards the bottom end.
By comparison, the home countries have a major issue at the bottom in reading, but are less significantly out of line in maths and science.
Overall, there is some evidence here of a longish tail of low achievement, but with considerable variation according to country and domain.
The bottom line is that all of the home countries have significant issues to address at both the top and the bottom of the achievement distribution. Any suggestion that they need to concentrate exclusively on low achievers is not supported by this evidence.
Background to PISA
What is PISA?
The Programme for International Student Assessment (PISA) is a triennial OECD survey of the performance of 15 year-old students which typically covers maths, science and reading. Science was the main focus in 2006, reading in 2009 and maths in 2012.
PISA 2012 also included a computer-based assessment of problem-solving and a financial literacy assessment. However, some jurisdictions did not participate in the problem-solving exercise owing to ‘technical issues’ and financial literacy was undertaken by some countries only, as an optional extra.
Fifty-eight jurisdictions took part in PISA 2006 and 74 in PISA 2009 (65 undertook the assessment in 2009 and a further nine did so in 2010).
A total of 65 jurisdictions took part in PISA 2012.
According to the OECD’s own FAQ:
- PISA tests reading, mathematical and scientific literacy ‘in terms of general competencies, that is, how well students can apply the knowledge and skills they have learned at school to real-life challenges. PISA does not test how well a student has mastered a school’s specific curriculum.’
- Student performance in each field is comparable between assessments – one cannot reasonably argue therefore that a drop in performance is attributable to a more difficult assessment.
- Each participating jurisdiction receives an overall score in each subject area – the average of all its students’ scores. The average score among OECD countries is set at 500 points (with a standard deviation of 100 points).
- Participating jurisdictions are ranked in each subject area according to their mean scores, but:
‘[It] is not possible to assign a single exact rank in each subject to each country…because PISA tests only a sample of students from each country and this result is then adjusted to reflect the whole population of 15-year-old students in that country. The scores thus reflect a small measure of statistical uncertainty and it is therefore only possible to report the range of positions (upper rank and lower rank) within which a country can be placed.’
Outside the confines of reports by the OECD and its national contractors, this is honoured more in the breach than the observance.
- Scores are derived from scales applied to each subject area. Each scale is divided into levels, Level 1 being the lowest and Level 6 typically the highest.
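The scaling point above can be sketched numerically. The snippet below linearly rescales a set of invented raw scores so that the mean is 500 and the standard deviation 100, mirroring the PISA reporting scale; note that this is an illustrative assumption only, since PISA actually derives scores from item response theory models rather than a simple linear transformation.

```python
# Illustrative sketch only: PISA derives scores from item-response-theory
# models, but the reporting scale is fixed so that the OECD mean is 500
# and the standard deviation 100. The raw scores below are invented.
def to_pisa_scale(raw_scores):
    n = len(raw_scores)
    mean = sum(raw_scores) / n
    sd = (sum((x - mean) ** 2 for x in raw_scores) / n) ** 0.5
    return [500 + 100 * (x - mean) / sd for x in raw_scores]

scaled = to_pisa_scale([12, 15, 18, 21, 24])
print([round(s) for s in scaled])  # mean 500, SD 100 by construction
```

The point of fixing the scale this way is that a jurisdiction’s mean score can be read directly as a distance, in tenths of a standard deviation, from the OECD baseline.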
Further background detail on the 2012 assessments is set out in the ‘PISA 2012 Assessment and Analytical Framework’ (2013).
This explains that the framework for assessing maths was completely revised ahead of the 2012 cycle and ‘introduces three new mathematical processes that form the basis of developments in the reporting of PISA mathematics outcomes’, whereas those for science and reading were unchanged (the science framework was revised when it was the main focus in 2006 and ditto for reading in 2009).
The Framework clarifies the competency-based approach summarised in the FAQ:
‘PISA focuses on competencies that 15-year-old students will need in the future and seeks to assess what they can do with what they have learnt – reflecting the ability of students to continue learning throughout their lives by applying what they learn in school to non-school environments, evaluating their choices and making decisions. The assessment is informed, but not constrained, by the common denominator of national curricula. Thus, while it does assess students’ knowledge, PISA also examines their ability to reflect, and to apply their knowledge and experience to real-life issues in a reflective way. For example, in order to understand and evaluate scientific advice on food safety, an adult would need not only to know some basic facts about the composition of nutrients, but also to be able to apply that information.’
It explains that between 4,500 and 10,000 students drawn from 150 schools are typically tested in each jurisdiction.
Initial reports suggested that England would not take part in the 2012 assessments of problem-solving and financial literacy, but it subsequently emerged that this decision had been reversed in respect of problem-solving.
Setting PISA Outcomes in Context
There are plenty of reasons why one should not place excessive weight on PISA outcomes:
- The headline rankings carry a significant health warning, which remains important, even though it is commonly ignored.
- There is a long history of criticism that PISA assessments are ‘fundamentally flawed’; moreover, the OECD itself notes that:
‘As the PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, no trend comparisons are possible for these years.’ (p.1)
Hence, for the UK at least, reliable comparisons with pre-2006 results are off the table.
- The data may not bear the weight of the policy conclusions derived from it:
‘The pressure from policymakers for advice based on PISA interacts with this unhealthy mix of policy and technical people. The technical experts make sure that the appropriate caveats are noted, but the warnings are all too often ignored by the needs of the policy arm of PISA. As a result, PISA reports often list the known problems with the data, but then the policy advice flows as though those problems didn’t exist. Consequently, some have argued that PISA has become a vehicle for policy advocacy in which advice is built on flimsy data and flawed analysis.’
- PISA is not the only game in town. TIMSS and PIRLS are equally significant, though relatively more focused on content knowledge, whereas PISA is primarily concerned with the application of skills in real life scenarios.
- There are big political risks associated with worshipping at the PISA altar for, if the next set of outcomes is disappointing, the only possible escape route is to blame the previous administration, a strategy that wears increasingly thin with the electorate the longer the current administration has been in power.
It would be quite wrong to dismiss PISA results out of hand, however. They are a significant indicator of the comparative performance of national (and regional) education systems. But they are solely an indicator, rather than a statement of fact.
What is assessed – and what constitutes high achievement – in each domain
The Assessment and Analytical Framework provides definitions of each domain and level descriptors for each level within the assessments.
The PISA 2012 mathematics framework defines mathematical literacy as:
‘An individual’s capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals to recognise the role that mathematics plays in the world and to make the well-founded judgments and decisions needed by constructive, engaged and reflective citizens.’
Three aspects of maths are identified:
- Mathematical processes and the fundamental capabilities underlying them. Three processes are itemised: formulating situations mathematically; employing mathematical concepts, facts, procedures and reasoning; and interpreting, applying and evaluating mathematical outcomes. The capabilities are: communication; mathematizing (transforming a real life problem to a mathematical form); representation; reasoning and argument; devising problem-solving strategies; using symbolic, formal and technical language and operations; and using mathematical tools.
- Content knowledge, comprising four elements: change and relationships; space and shape; quantity; and uncertainty and data.
- The contexts in which mathematical challenges are presented: personal; occupational; societal and scientific.
Six levels are identified within the PISA 2012 mathematics scale. The top two are described thus:
- ‘At Level 6 students can conceptualise, generalise and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply their insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments and the appropriateness of these to the original situations.’
- ‘At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare and evaluate appropriate problem-solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.’
Reading literacy is defined as:
‘An individual’s capacity to understand, use, reflect on and engage with written texts, in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society.’
The assessment ‘is built on three major task characteristics’:
- Situation – the context or purpose for which reading takes place, which may be personal (practical and intellectual interests), public (activities and concerns of society), educational (for learning purposes) or occupational (accomplishment of a task).
- Text – the range of material that is read, which may be print or digital. In the case of digital text, the environment may be authored (the reader is receptive), message based, or mixed. In the case of both print and digital text, the format may be continuous (sentences and paragraphs), non-continuous (eg graphs, lists), mixed or multiple, while the text type may be description, narration, exposition, argumentation, instruction or transaction.
- Aspect – how readers engage with the text, which includes accessing and retrieving; integrating and interpreting; and reflecting and evaluating.
Separate proficiency scales are provided for print and digital reading respectively. Both describe achievement in terms of the task rather than the student.
The print reading scale has six levels (Level One is subdivided into two). The top levels are described as follows:
- Level 6: Tasks at this level typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.
- Level 5: Tasks at this level that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information, inferring which information in the text is relevant. Reflective tasks require critical evaluation or hypothesis, drawing on specialised knowledge. Both interpretative and reflective tasks require a full and detailed understanding of a text whose content or form is unfamiliar. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations.
For digital reading there are only four levels, categorised as 2-5. Level 5 is described thus:
‘Tasks at this level typically require the reader to locate, analyse and critically evaluate information, related to an unfamiliar context, in the presence of ambiguity. They require generating criteria to evaluate the text. Tasks may require navigation across multiple sites without explicit direction, and detailed interrogation of texts in a variety of formats.’
Scientific literacy is defined as:
‘An individual’s scientific knowledge and use of that knowledge to identify questions, to acquire new knowledge, to explain scientific phenomena, and to draw evidence-based conclusions about science-related issues, understanding of the characteristic features of science as a form of human knowledge and enquiry, awareness of how science and technology shape our material, intellectual, and cultural environments, and willingness to engage in science-related issues, and with the ideas of science, as a reflective citizen.’
The domain consists of four interrelated aspects:
- Context – life situations involving science and technology. Contexts are personal, social or global and may relate to health, natural resources, environment, hazard or the frontiers of science and technology.
- Knowledge – knowledge of the natural world (covering physical systems, living systems, earth and space systems and technology systems) and knowledge about science itself (scientific enquiry and scientific explanations).
- Competencies, of which three are identified: identify scientific issues, explain phenomena scientifically and use scientific evidence.
- Attitudes, including an interest in science, support for scientific enquiry and a motivation to act responsibly towards the natural world.
A 6-level proficiency scale is defined with the top levels explained as follows:
- At Level 6, students can consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.
- At Level 5, students can identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.
Changes in Average Performance in Reading, Maths and Science
The OECD published PISA outcomes for maths, science and reading on 3 December 2013.
Similarly, the PISA National Report on England, published simultaneously, covers the three core assessments.
This section looks briefly at the headline average scores and rankings across the selected sample of twelve jurisdictions, principally to enable comparisons to be drawn with the subsequent analysis of high achievers’ performance.
I apologise in advance for any transcription errors. Please let me know if you spot any and I will correct the tables accordingly.
Table 4 below gives the headline average numerical scores and ranks in reading from PISA 2006, 2009 and 2012 respectively.
Shanghai has retained the ascendancy it established in 2009, adding a further 14 points to its average 2009 score. Whereas it was only 17 points beyond its nearest competitor in 2009, that lead has now been extended to 25 points.
South Korea’s performance has fallen slightly and it has been leapfrogged in the rankings by Hong Kong (up 12 points), Singapore (up 16 points), and Japan (not included in the table).
Two countries making even more significant improvements are Taiwan (up 28 points) and Ireland (up 27 points). Conversely, the performance of Finland (down 12 points) and New Zealand (down 9 points) has noticeably declined. Finland’s performance has been declining since 2006.
Results remain broadly unchanged in Australia, Canada, England, South Korea and the USA. South Korea has been unable to make up the ground it lost in 2009.
Ireland’s huge improvement from a very similar starting point in 2009 throws England’s lack of progress into sharper relief, although Ireland is largely recovering the ground it lost in 2009, having performed relatively well in 2006.
England, like the US, continues to perform slightly above the OECD average, but has fallen further behind the Asian Tigers. The gap with the world’s leader is now 70 points (up from 60 in 2006).
Table 5 below sets out scores and rankings in maths since PISA 2006.
The overall picture is rather similar to that for reading.
Shanghai (up 13 points) and Singapore (up 11 points) continue to stretch away at the head of the field. Taiwan (up 17 points) has also made significant improvement and is now close behind Hong Kong.
There has been relatively more modest improvement in Hong Kong and South Korea (which has been overtaken by Taiwan).
Elsewhere, Ireland has again made significant headway and is back to the level it achieved in 2006. But Finland’s score has plummeted 22 points. New Zealand is not far behind (down 19). There have also been significant falls in the performance of Australia (down 10), Canada (down 9) and the US (down 6).
The US is now trailing 13 points below the OECD average, having failed to sustain the substantial improvement it made in 2009.
In England meanwhile, results are largely unchanged, though now just above the OECD average rather than just below it.
The gap between England and world leader Shanghai has reached 118 points, compared with a gap in 2006 between England and world leader Taiwan of 54 points. The gap between England and its main Commonwealth competitors has narrowed, but only as a consequence of the significant declines in the latter.
Table 6 below provides the same data in respect of science.
Shanghai is again out in front, having repeated the clean sweep it achieved in 2009.
However, it has managed only a 5-point improvement, while Taiwan has improved by 13 points and Singapore by 9 points. Hong Kong has moved up by 6 points and Taiwan by 3 points, but South Korea’s score is unchanged from 2009.
New Zealand has dropped by 16 points and Finland by 9 points compared with 2009. There have been comparatively smaller declines in Australia and Canada, while Ireland has once again improved dramatically, by 14 points, and – in this case – the improvement is not simply clawing back ground lost in 2009.
England remains comfortably above the OECD average, but has made negligible improvement since 2006. US performance has dropped back below the OECD average as it has lost some of the ground it made up in 2009.
The gap between England and the world leaders is comparable with that in reading and significantly lower than in maths. The gap is now 64 points, compared with just 47 points in 2006.
Overall, the Asian Tigers have consolidated their positions by maintaining improvement in all three domains, though South Korea appears to be struggling to maintain the success of earlier years.
Finland and New Zealand are in worrying decline while Ireland is making rapid progress in the opposite direction.
The US results are stagnant, remaining comparatively poor, particularly in maths.
England has broadly maintained its existing performance profile, neither improving nor declining significantly. But, it is conspicuously losing ground on the world leaders, especially in maths. Other than in science it is close to the OECD average.
There is nothing here to give comfort to either the previous Government or the present incumbents. There might be some limited relief – even a degree of schadenfreude – in the fact that several better-placed nations are falling back more severely. But of course one cannot win the ‘global race’ by simply standing still.
Changes in High Achievers’ Performance
So much for the average headline figures.
The remainder of this post is focused on high achievement data. The ensuing sections once more examine reading, maths and science in that order, followed by a section on all-rounders.
Table 7 shows how the percentage achieving the higher levels in reading has changed since PISA 2006, providing separate columns for Level 6 and for Levels 5 and 6 combined (there was no Level 6 in 2006).
[Table 7 columns: Level 6 and Levels 5 and 6 for 2012 and 2009; Level 5 for 2006.]
This reveals that:
- In 2012, Singapore has a clear lead on its competitors at Level 6, but it is overtaken by Shanghai at Level 5 and above. New Zealand also remains comparatively strong at Level 6, but falls back significantly when Levels 5 and 6 are combined.
- The other Asian Tigers do not perform outstandingly well at Level 6: Hong Kong, South Korea and Taiwan are all below 2.0%, behind Canada and Finland. However, all but Taiwan outscore their competitors when Levels 5 and 6 are combined.
- Hong Kong, Shanghai, Singapore and Taiwan are all making fairly strong progress over time. Patterns are rather less discernible for other countries, though there is a downward trend in the US.
- In Finland, New Zealand and Canada – countries that seem to be falling back overall – the percentage of Level 6 readers continues to improve. This might suggest that the proportion of the highest performers in reading is not significantly affected when national performance begins to slide.
- When judged against these world leaders, England’s comparative performance is brought into much clearer perspective. At Level 6 it is not far behind Taiwan, South Korea and even Hong Kong. But, at Level 5 and above, the gap is somewhat more pronounced. England is improving, but very slowly.
- The comparison with Taiwan is particularly stark. In 2006, England had roughly twice as many students performing at Level 5. By 2009 Taiwan had caught up some of this ground and, by 2012, it had overtaken.
Table 8 compares changes since PISA 2006 in national performance at Level 5 and above with changes at Level 1 and below.
This is intended to reveal the balance between top and bottom – and whether this sample of world-leading and other English-speaking jurisdictions is making consistent progress at either end of the spectrum.
[Table 8 columns: Country; Levels 5 and above (Level 5 only in 2006); Level 1 (or equivalent) and below.]
We can see that:
- The countries with the highest proportion of students at Level 5 and above tend to have the lowest proportion at Level 1 and below. In Shanghai in 2012, there is a 22 percentage point gap between these two populations and fewer than 3 in every hundred fall into the lower attaining group.
- Singapore is much closer to Shanghai at the top end than it is at the bottom. But even Shanghai seems to be making faster progress at the top than at the bottom, which might suggest that it is approaching the point at which the proportion of low achievers cannot be further reduced.
- Compared with Hong Kong and South Korea, Singapore has a higher proportion of both high achievers and low achievers.
- Whereas Taiwan had three times as many low achievers as high achievers in 2006, by 2012 the proportions were broadly similar, but progress at the top end is much faster than at the bottom.
- The decline in Finland has less to do with performance at the top end (which has fallen by three percentage points) than with performance at the bottom (which has increased by more than six percentage points).
- Canada has consistently maintained a higher percentage of high achievers than low achievers, but the reverse is true in Australia. In New Zealand the percentage at the top is declining and the percentage at the bottom is increasing. The gap between the two has narrowed slightly in England, but not significantly so.
- To catch up with Shanghai, England has to close a gap of some 16 percentage points at the top end, compared with one of around 14 percentage points at the bottom.
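The top/bottom balance behind these bullet points is straightforward percentage-point arithmetic. The sketch below uses approximate figures implied by the text for 2012 reading (Shanghai at roughly 25% Level 5+ and 3% at Level 1 and below; England at roughly 9% and 17%); treat these exact numbers as illustrative assumptions rather than the published values.

```python
# Approximate 2012 reading figures implied by the text (assumptions):
# Shanghai ~25% at Level 5+ and ~3% at Level 1 and below;
# England ~9% and ~17% respectively.
levels = {
    "Shanghai": {"level5_plus": 25.0, "level1_below": 3.0},
    "England":  {"level5_plus": 9.0,  "level1_below": 17.0},
}

def balance(pcts):
    """Positive = more high achievers than low achievers, in percentage points."""
    return pcts["level5_plus"] - pcts["level1_below"]

for name, pcts in levels.items():
    print(name, balance(pcts))  # Shanghai 22.0, England -8.0

# England's catch-up gaps against Shanghai, at each end of the distribution:
top_gap = levels["Shanghai"]["level5_plus"] - levels["England"]["level5_plus"]
bottom_gap = levels["England"]["level1_below"] - levels["Shanghai"]["level1_below"]
print(top_gap, bottom_gap)  # 16.0 at the top, 14.0 at the bottom
```

A positive balance (Shanghai) marks a top-heavy distribution; a negative one (England) marks the ‘long tail’ discussed above.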
The PISA National Report on England offers some additional analysis, noting that 18 jurisdictions had a higher proportion of pupils than England at Level 5 or above in 2012, including all those that outperformed England overall (with the exception of Estonia and Macao), and also France and Norway.
The National Report relies more heavily on comparing the performance of learners at the 5th and 95th percentiles in each country, arguing that:
‘This is a better measure for comparing countries than using the lowest and highest scoring pupils, as such a comparison may be affected by a small number of pupils in a country with unusually high or low scores.’
This is true in the sense that a minimum sample of 4,500 PISA participants would result in fewer than 100 at Level 6 in many jurisdictions.
On the other hand, the National Report fails to point out that analysis on this basis is not particularly informative about comparative achievement of the criterion-referenced standards denoted by the PISA thresholds.
It says rather more about the spread of performance in each country and rather less about direct international comparisons.
Key points include:
- In England the score of learners at the 5th percentile was 328, compared with 652 at the 95th percentile. This difference of 324 points is slightly larger than the OECD average difference of 310 points. More than two-thirds of OECD countries had a smaller difference between these percentiles.
- Compared with PISA 2009, the score of high achievers at the 95th percentile increased by six points to 652, while the score of low achievers at the 5th percentile fell by six points to 328. The resulting attainment gap (324 points) is wider than in 2009 (312 points) but narrower than in 2006 (337 points). Thirteen OECD countries reported a wider spread of attainment than England.
- Of countries outperforming England, only Japan (325 points), Singapore (329 points), Belgium (339 points) and New Zealand (347 points) demonstrated a similar or wider spread of attainment. Shanghai had the lowest difference (259 points), followed by Estonia (263).
- The strongest performing jurisdictions at the 95th percentile were Singapore (698), Shanghai (690) and Japan (689), compared with 652 for England.
- Amongst jurisdictions ranked higher than England, only the Netherlands, Liechtenstein, Estonia and Macao secured a lower score at the 95th percentile. Only Belgium reported a lower score at the 5th percentile.
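The National Report's case for percentile-based comparisons can be illustrated with a short sketch. This uses synthetic scores only (not real PISA data), with 4,500 echoing the minimum sample size mentioned above: the 5th-95th percentile spread barely moves when one extreme score is added, whereas the min-max range jumps immediately.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, illustrative scores only -- NOT real PISA data.
# 4,500 echoes the minimum PISA sample size mentioned in the text.
scores = rng.normal(500, 100, size=4500)

p5, p95 = np.percentile(scores, [5, 95])
print(f"5th-95th percentile spread: {p95 - p5:.0f} points")

# Add one unusually high-scoring pupil: the percentile spread is
# almost unchanged, but the min-max range widens immediately.
with_outlier = np.append(scores, scores.max() + 150)
q5, q95 = np.percentile(with_outlier, [5, 95])
print(f"Spread after outlier:       {q95 - q5:.0f} points")
print(f"Range widened by:           {with_outlier.max() - scores.max():.0f} points")
```

The percentile spread is stable to within a fraction of a point, while the range widens by the full 150 points, which is exactly the distortion the National Report is guarding against.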
Turning to maths, Table 9 illustrates changes in the pattern of high achievement since 2006, again showing the percentages performing at Level 6 and at Level 5 and above respectively.
|Level 6||Levels 5+6||Level 6||Levels 5+6||Level 6||Levels 5+6|
The variations between countries tend to be far more pronounced than in reading:
- There is a huge 28 percentage point spread in performance at Level 6 within this sample – from 2% to 30% – compared with a three percentage point spread in reading. The spread at Level 5 and above is also significantly larger – 46 percentage points compared with 17 percentage points in reading.
- Shanghai has an 11 percentage point lead over its nearest competitor at Level 6 and an even larger 15 percentage point lead for Level 5 and above. Moreover it has improved significantly on both counts since 2009. Well over half its sample is now performing at Level 5 or above and almost a third are at Level 6.
- Singapore and Taiwan are the next best performers and are relatively close together. Both are improving but, following a small dip in 2009, Taiwan is improving at a faster rate – faster even than Shanghai.
- Hong Kong and South Korea also have similar 2012 profiles, as they did back in 2006. South Korea also lost ground in 2009, but is now improving at a faster rate than Hong Kong.
- Finland appears to be experiencing quite significant decline: the proportion of Level 6 performers in 2012 is not far short of half what it was in 2006 and performance above Level 5 has fallen by more than nine percentage points. This is a somewhat different pattern to reading, in that the top performers are also suffering from the overall decline.
- Australia, Canada and New Zealand have maintained broadly the same performance over time, though all are showing a slight falling off at Level 5 and above, and in New Zealand this also applies at Level 6.
- After a serious slump in 2009, Ireland has now overtaken its 2006 position. Meanwhile, the US has been making some progress at Level 6 but is less convincing at Level 5 and above.
- Once again, this comparison does not particularly flatter England. It is not too far behind the Commonwealth countries and declining Finland at Level 6 but the gap is slightly larger at Level 5 and above. That said, England has consistently performed below the OECD average and remains in that position.
- There are, however, some grounds for domestic celebration, in that England has improved by 2.5 percentage points at Level 5 and above, and by 1.4 percentage points at Level 6. This rate of improvement bears comparison with Hong Kong's, albeit from a much lower base, and suggests a narrowing gap between England and its Commonwealth counterparts.
Table 10 gives the comparison with achievement at the bottom end of the distribution, setting out the percentages performing at different levels.
|Country||Levels 5 and 6||Level 1 and below|
Key points include:
- The same pattern is discernible amongst the strongest performers as was evident with reading: those with the highest percentages at the top end tend to have the lowest percentages at the bottom. If anything this distinction is even more pronounced. Shanghai records a 52 percentage point gap between its highest and lowest performers and the latter group is only slightly larger than the comparable group in the reading assessment.
- Amongst the Asian Tigers, the ratio between top and bottom is at least 3:1 in favour of the top. For most of the other countries in the sample there is no more than a 7 percentage point gap between top and bottom, but this stretches to 9 points in England and 13 points in the USA. Needless to say, low achievers are in the majority in both cases.
- Although the percentages for top and bottom in Australia are broadly comparable, it has shifted since 2006 from a position where the top end was in the majority by 3 percentage points to almost a mirror image of that pattern. In New Zealand, the lower achievers have increased by almost 9 percentage points, almost double the rate of decline at the top end, as their ‘long tail’ grows significantly longer.
- Apart from Shanghai, only Singapore, Hong Kong and South Korea have fewer than 10% in the lower performing category. Despite its reputation as a meritocratic environment, Singapore gets much closer to Shanghai at the bottom of the distribution than it does at the top. The same is true of Hong Kong and South Korea.
- It is also noticeable that none of the Tigers is making extraordinary progress at the bottom end. Hong Kong has reduced this population by one percentage point since 2003, Singapore by 1.5 points since 2006 and Shanghai by only 0.9 points since 2006, while the percentage has increased in South Korea and Taiwan. Improvement has been significantly stronger at the top of the distribution. Again, this might suggest that the Tigers are closing in on the point where they cannot improve further at the bottom end.
- In Finland, the percentage achieving the higher levels has fallen by over 9 percentage points since 2006, while the increase at the lower levels is over 6 percentage points. This compares with a 3 point fall at the top and a 6 point rise at the bottom in reading. The slump amongst Finland’s high achievers is clearly more pronounced in maths.
- England’s 9.3 percentage point gap between the top and bottom groups in 2012 is slightly larger than the 8.7 point gap in 2006. It has a whopping 43 percentage point gap to make up on Shanghai at the top end, and an 18 point gap at the bottom. England is just on the right side of the OECD average at the bottom and just on the wrong side at the top.
The National Report notes that all jurisdictions ahead of England in the rankings had a higher percentage of learners at Level 5 or above.
As for the percentiles:
- The difference between the 5th percentile (335 points) and the 95th percentile (652 points) was 316 points in England, only slightly larger than the OECD average difference of 301 points.
- Ten countries had a greater difference than this, five of them amongst those with the highest overall mean scores. The others were Israel, Belgium, Slovakia, New Zealand and France.
- Whereas the difference between the lowest and highest percentiles has increased very slightly across all OECD countries, this is more pronounced in England, increasing from 285 points in 2009 to 316 points in 2012. This is attributable to decreasing scores at the 5th percentile (350 in 2006, 349 in 2009 and 335 in 2012) compared with changes at the 95th percentile (643 in 2006, 634 in 2009 and 652 in 2012).
Table 11 compares the performance of this sample of PISA participants at the higher levels in the science assessment on the last three occasions.
|Level 6||Levels 5+6||Level 6||Levels 5+6||Level 6||Levels 5+6|
In science, the pattern of high achievement has more in common with reading than maths. It shows that:
- There is again a relatively narrow spread of performance between this sample of jurisdictions – approaching five percentage points at Level 6 and 20 percentage points at Level 5 and above.
- As in reading, Singapore outscores Shanghai at Level 6, but is outperformed by Shanghai at Level 5 and above. Both are showing steady improvement, but Singapore’s improvement at Level 6 is more pronounced than Shanghai’s.
- Finland remains the third best performer, although the proportion of learners achieving at both Level 6 and Level 5 plus has been declining slightly since 2006.
- Another similarity with reading is that Australia, Finland and New Zealand all perform significantly better at Level 6 than Hong Kong, South Korea and Taiwan. Hong Kong alone performs equally well at Level 5 and above. None of these three Asian Tigers has made significant progress since 2006.
- In Australia, Canada, New Zealand and the US there has also been relatively little progress over time – indeed some evidence to suggest a slight decline. Conversely, Ireland seems to be moving forward again after a slight dip at Level 5 and above in 2009.
- England was a strong performer in 2006, broadly comparable with many of its competitors, but it fell back significantly in 2009 and has made no progress since then. The proportions are holding up, but there is no substantive improvement, unlike in maths and (to a lesser extent) reading. However, England continues to perform somewhat above the OECD average. There is an interesting parallel with Taiwan, although that country dipped even further than England in 2009.
Table 12 provides the comparison with the proportions achieving the lower thresholds.
|Country||Levels 5 and 6||Levels 1 and below|
- Amongst the top performers the familiar pattern reappears. In 2012 Shanghai has 27% in the top categories against 2.7% in the bottom categories. This is very similar to reading (25.1% against 2.9%). At the bottom end, Shanghai’s nearest competitors are Hong Kong and South Korea, while Singapore and Taiwan are each approaching 10% at these levels. This is another similarity with reading (whereas, in maths, Singapore is more competitive at the lower end).
- Since 2009, Shanghai has managed only a comparatively modest 0.5 percentage point reduction in the proportion of its students at the bottom end, compared with an increase of almost 3 percentage points at the top end. This may lend further support to the hypothesis that it is approaching the point at which further bottom end improvement is impossible.
- No country has made consistently strong progress at the bottom end, though Ireland has made a significant improvement since 2009. There has been steady if unspectacular improvement in Hong Kong, Taiwan and Singapore. South Korea, having achieved a major improvement in 2009, has found itself unable to continue this positive trend.
- Finland’s negative trend is consistent since 2006 at both ends of the achievement spectrum, though the decline is not nearly as pronounced as in maths. In science Finland is maintaining a ratio of 2:1 in favour of the performers at the top end, while percentages at top and bottom are now much closer together in both reading and maths.
- There are broadly similar negative trends at top and bottom alike in the Commonwealth countries of Australia, Canada and New Zealand, although they have fallen back in fits and starts. In New Zealand the balance between top and bottom has shifted from being 4 percentage points in favour of the top end in 2006, to 3 percentage points in favour of the bottom end by 2012.
- A similar gap in favour of lower achievers also exists in England and is unchanged from 2009. Compared with the US (whose top-bottom balance is a virtual mirror image of Finland’s, Singapore’s or South Korea’s), England is in a reasonable position, rather similar to New Zealand’s now that that country has fallen back.
- England has a 1.5 percentage point gap to make up on Shanghai at the top end of the distribution, compared with a 12.2 percentage point gap at the bottom.
The PISA 2012 National Study reports that only the handful of jurisdictions shown in Table 11 above has a larger percentage of learners achieving Level 6. Conversely, England has a relatively large number of low achievers compared with these jurisdictions.
Rather tenuously, it argues on this basis that:
‘Raising the attainment of lower achievers would be an important step towards improving England’s performance and narrowing the gap between highest and lowest performers.’
When it comes to comparison of the 5th and 95th percentiles:
- The score at the 5th percentile (343) and at the 95th percentile (674) gives a difference of 331 points, larger than the OECD average of 304 points. Only eight jurisdictions had a wider distribution, among them Israel, New Zealand, Luxembourg, Slovakia, Belgium, Singapore and Bulgaria.
- The OECD average difference between the 5th and 95th percentiles has reduced slightly (from 311 in 2006 to 304 in 2012) and there has also been relatively little change in England.
Volume 1 of the OECD’s ‘PISA 2012 Results’ document provides additional data about all-round top performers achieving Level 5 or above in each of the three domains.
The diagram shows that 4.4% of learners across OECD countries achieve this feat.
This is up 0.3 percentage points on the PISA 2009 figure revealed in this PISA in Focus publication.
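As a rough sketch of how the ‘all-rounder’ criterion works, the fragment below applies it to a handful of hypothetical student records. Real PISA analysis works with plausible values and survey weights, which are omitted here; the records and the helper function are purely illustrative.

```python
# Hypothetical student-level records: proficiency level reached in each
# domain. These are made-up values used only to illustrate the definition
# of the 'all-rounder' measure -- NOT real PISA data.
students = [
    {"reading": 5, "maths": 6, "science": 5},
    {"reading": 4, "maths": 5, "science": 6},
    {"reading": 6, "maths": 5, "science": 5},
    {"reading": 3, "maths": 2, "science": 4},
]

def is_all_rounder(record, threshold=5):
    """True if the student reaches Level 5 or above in all three domains."""
    return all(level >= threshold for level in record.values())

# Share of the sample meeting the threshold in reading, maths AND science.
share = sum(is_all_rounder(s) for s in students) / len(students)
print(f"{share:.1%}")  # -> 50.0%
```

Note that the second student, despite two Level 5+ results, does not count: the measure demands the threshold in all three domains simultaneously, which is why all-rounder percentages sit well below single-domain high-achiever percentages.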
Performance on this measure in 2012, compared with 2009, amongst the sample of twelve jurisdictions is shown in the following Table 13. (NB that the UK figure is for the UK combined, not just England).
In terms of percentage increases, the fastest progress on this measure is being made by Hong Kong, Ireland, Shanghai, Singapore and Taiwan. Shanghai has improved a full five percentage points and one in five of its students now achieve this benchmark.
The UK is making decent progress, particularly compared with Australia, Canada, Finland, New Zealand and the US, which are all moving in the opposite direction.
The Report notes:
‘Among countries with similar mean scores in PISA, there are remarkable differences in the percentage of top-performing students. For example, Denmark has a mean score of 500 points in mathematics in PISA 2012 and 10% of students perform at high proficiency levels in mathematics, which is less than the average of around 13%. New Zealand has a similar mean mathematics score of 500 points, but 15% of its students attain the highest levels of proficiency, which is above the average…these results could signal the absence of a highly educated talent pool for the future.
Having a large proportion of top performers in one subject is no guarantee of having a large proportion of top performers in the others. For example, Switzerland has one of the 10 largest shares of top performers in mathematics, but only a slightly-above-average share of top performers in reading and science.
Across the three subjects and across all countries, girls are as likely to be top performers as boys. On average across OECD countries, 4.6% of girls and 4.3% of boys are top performers in all three subjects…To increase the share of top-performing students, countries and economies need to look at the barriers posed by social background…the relationship between performance and students’… and schools’ organisation, resources and learning environment.’ (p65)
Priorities for Different Countries
On the basis of this evidence, it is possible to draw up a profile of the performance of different countries across the three assessments at these higher levels, and so make a judgement about the prospects in each of ‘a highly educated talent pool for the future’. The twelve jurisdictions in our sample might be advised as follows:
- Shanghai should be focused on establishing ascendancy at Level 6 in reading and science, particularly if there is substance to the suspicion that scope for improvement at the bottom of the spectrum is now rather limited. Certainly it is likely to be easier to effect further improvement at the very top.
- Singapore has some ground to catch up with Shanghai at Level 6 in maths. It has narrowed that gap by three percentage points since 2009, but there is still some way to go. Otherwise it should concentrate on strengthening its position above Level 5, where Shanghai is also conspicuously stronger.
- Hong Kong needs to focus on Level 6 in reading and science, but perhaps also in maths where it has been extensively outpaced by Taiwan since 2009. At levels 5 and above it faces strong pressure to maintain proximity with Shanghai and Singapore, as well as marking the charge made by Taiwan in reading and maths. Progress in science is relatively slow.
- South Korea should also pay attention to Level 6 in reading and science. It is improving faster than Hong Kong at Level 6 in maths but is also losing ground on Taiwan. And although South Korea now seems back on track at Level 5 and above in maths, progress remains comparatively slow in reading and science, so both Levels 5 and 6 need attention.
- Taiwan has shown strong improvement in reading and maths since 2009, but is deteriorating in science at both Levels 5 and 6. It still has much ground to pick up at Level 6 in reading. Its profile is not wildly out of kilter with Hong Kong’s and South Korea’s.
- Finland is bucking a downward trend at Level 6 in reading and slipping only slightly in science, so the more noticeable decline is in maths. However, the ground lost is proportionately greater at Level 5 and above, once again more prominently in maths. As Finland fights to stem a decline at the lower achievement levels, it must take care not to neglect those at the top.
- Australia seems to be slipping back at both Levels 5 and 6 across all three assessments, while also struggling at the bottom end. There are no particularly glaring weaknesses, but it needs to raise its game across the board.
- Canada is just about holding its own at Level 6, but performance is sliding back at Level 5 and above across all three domains. This coincides with relatively little improvement and some falling back at the lower end of the achievement distribution. It faces a similar challenge to Finland’s although not so pronounced.
- New Zealand can point to few bright points in an otherwise gloomy picture, one of which is that Level 6 performance is holding up in reading. Elsewhere, there is little to celebrate in terms of high achievers’ performance. New Zealand is another country that, in tackling more serious problems with the ‘long tail’, should not take its eye off the ball at the top.
- The US is also doing comparatively well in reading at Level 6, but is otherwise either treading water or slipping back a little. Both Level 6 and Level 5 and above need attention. The gap between it and the world’s leading countries continues to increase, suggesting that it faces future ‘talent pool’ issues unless it can turn round its performance.
- Ireland is a good news story, at the top end as much as the bottom. It has caught up lost ground and is beginning to push beyond where it was in 2006. Given Ireland’s proximity, the home countries might want to understand more clearly why their nearest neighbour is improving at a significantly faster rate. That said, Ireland has significant room for improvement at both Level 6 and Level 5 and above.
- England’s performance at Level 6 and Level 5 and above has held up surprisingly well compared with 2009, especially in maths. When the comparison is solely historical, there might appear to be no real issue. But many other countries are improving at a much stronger rate and so England (as well as the other home countries) risks being left behind in the ‘global race’ declared by its Prime Minister. The world leaders now manage three times as many Level 6 performers in science, four times as many in reading and ten times as many in maths. It must withstand the siren voices urging it to focus disproportionately at the bottom end.
Addressing These Priorities
It is far more straightforward to pinpoint these different profiles and priorities than to recommend convincingly how they should be addressed.
The present UK Government believes firmly that its existing policy direction will deliver the improvements that will significantly strengthen its international competitiveness, as judged by PISA outcomes. It argues that it has learned these lessons from careful study of the world’s leading performers and is applying them carefully and rigorously, with due attention to national needs and circumstances.
But – the argument continues – it is too soon to see the benefits of its reforms in PISA 2012, such is the extended lag time involved in improving the educational outcomes of 15 year-olds. According to this logic, the next Government will reap the significant benefits of the present Government’s reform programme, as revealed by PISA 2015.
Recent history suggests that this prediction must be grounded more in hope than expectation, not least because establishing causation between indirect policy interventions and improved test performance must surely be the weakest link in the PISA methodology.
But, playing devil’s advocate for a moment, we might reasonably conclude that any bright spots in England’s performance are attributable to interventions that the previous Government got right between five and ten years ago. It would not be unreasonable to suggest that the respectable progress made at the top PISA benchmarks is at least partly attributable to the national investment in gifted education during that period.
We might extend this argument by suggesting a similar relationship between progress in several of the Asian Tigers at these higher levels and their parallel investment in gifted education. Previous posts have drawn attention to the major programmes that continue to thrive in Hong Kong, Singapore, South Korea and Taiwan.
Shanghai might have reached the point where success in mainstream education renders investment in gifted education unnecessary. On the other hand, such a programme might help it to push forward at the top in reading and science – perhaps the only conspicuous chink in its armour. There are lessons to be learned from Singapore. (Gifted education is by no means dormant on the Chinese Mainland and there are influential voices pressing the national government to introduce more substantive reforms.)
Countries like Finland might also give serious consideration to more substantive investment in gifted education geared to strengthening high attainment in these core domains. There is increasing evidence that the Finns need to rethink their approach.
The relationship between international comparisons studies like PISA and national investment in gifted education remains poorly researched and poorly understood, particularly how national programmes can most effectively be aligned with and support such assessments.
The global gifted education community might derive some much-needed purpose and direction by establishing an international study group to investigate this issue, providing concrete advice and support to governments with an interest.