PISA 2009: International Comparisons of Gifted High Achievers’ Performance

This post is an initial review of what PISA 2009 tells us about the performance of gifted high achievers in England and other English-speaking countries compared with the countries at the top of the PISA 2009 rankings.

It concentrates on what we can deduce from the figures rather than causation: that will be addressed in subsequent posts. It examines:

  • average performance by country, including changes between PISA 2006 and PISA 2009
  • the performance of high achievers, comparing the relative results of different countries in 2009 and how those have changed since PISA 2006
  • relative differences between the performance of high achievers and average performance by country – expressed in terms of rankings – and how those have altered between 2006 and 2009.

The twelve countries and regions included in the analysis are the highest performers – Hong Kong (China), Korea, Taiwan, Finland and, for 2009 only, Shanghai (China) and Singapore – plus Australia, Canada, Ireland, New Zealand, the UK and the USA.

I should state at the outset that I am not a statistician: this is a lay analysis and I apologise in advance for any transcription errors. Nevertheless, I hope it reveals some significant findings, including points which have received scant attention in the wider media coverage of the PISA results.

Background to PISA

The Programme for International Student Assessment (PISA) is a triennial OECD survey of the performance of 15-year-old students in science, mathematics and reading. Science was the main focus in 2006; reading is the main focus in 2009.

Fifty-seven countries took part in PISA 2006; a total of sixty-seven countries have taken part in PISA 2009. The effect of this increase in numbers on rankings should be borne in mind, especially the inclusion of very high-performing areas, notably Shanghai and Singapore.

It is also worth noting at the outset that PISA rankings do not reflect the overall numbers of students achieving specific levels: a small country that has a high percentage of its students achieving a high achievement level outscores a bigger country with a lower percentage of high achievers, even though the overall number of high achievers in the bigger country is greater.
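The point can be illustrated with a small worked example (the cohort sizes and percentages here are invented purely for illustration):

```python
# Illustrative only: invented cohort sizes and high-achiever rates.
# PISA ranks countries by the percentage of students reaching a level,
# not by the absolute number of students who reach it.
small = {"cohort": 60_000, "high_achiever_pct": 12.0}
large = {"cohort": 600_000, "high_achiever_pct": 9.0}

small_n = small["cohort"] * small["high_achiever_pct"] / 100   # 7,200 students
large_n = large["cohort"] * large["high_achiever_pct"] / 100   # 54,000 students

# The small country outranks the large one (12% > 9%) even though
# the large country has far more high achievers in absolute terms.
print(small_n, large_n)
```

So a ranking tells us about the relative density of high achievers, not about the size of each country's high-achieving population.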

PISA assesses reading, scientific and mathematical literacy. It is important to have a clear understanding of exactly what is being assessed, not least so we can understand to what extent this differs from the nature of our own national assessments.

If a country’s national assessments are congruent with PISA then it is likely to perform much better in PISA than a similar country whose domestic assessments focus on quite different priorities.

According to the PISA 2009 Assessment Framework:

Reading literacy…is defined in terms of students’ ability to understand, use and reflect on written text to achieve their purposes…the capacity not just to understand a text but to reflect on it, drawing on one’s own thoughts and experiences. In PISA, reading literacy is assessed in relation to the:

Text format…continuous texts or prose organised in sentences and paragraphs…non-continuous texts that present information in other ways, such as in lists, forms, graphs, or diagrams… a range of prose forms, such as narration, exposition and argumentation…both print and electronic texts…these distinctions are based on the principle that individuals will encounter a range of written material in their civic and work-related adult life (e.g. application forms, advertisements) and that it is not sufficient to be able to read a limited number of types of text typically encountered in school.

Reading processes (aspects): Students are not assessed on the most basic reading skills, as it is assumed that most 15-year-old students will have acquired these. Rather, they are expected to demonstrate their proficiency in accessing and retrieving information, forming a broad general understanding of the text, interpreting it, reflecting on its contents and reflecting on its form and features.

Situations: These are defined by the use for which the text was constructed. For example, a novel, personal letter or biography is written for people’s personal use; official documents or announcements for public use; a manual or report for occupational use; and a textbook or worksheet for educational use. Since some groups may perform better in one reading situation than in another, it is desirable to include a range of types of reading in the assessment items.

Mathematical literacy… is concerned with the ability of students to analyse, reason, and communicate ideas effectively as they pose, formulate, solve, and interpret solutions to mathematical problems in a variety of situations. The PISA mathematics assessment has, so far, been designed in relation to the:

Mathematical content: This is defined mainly in terms of four overarching ideas (quantity, space and shape, change and relationships, and uncertainty) and only secondarily in relation to curricular strands (such as numbers, algebra and geometry).

Mathematical processes: These are defined by individual mathematical competencies. These include the use of mathematical language, modelling and problem-solving skills…

Situations: These are defined in terms of the ones in which mathematics is used, based on their distance from the students. The framework identifies five situations: personal, educational, occupational, public and scientific.

However, a major revision of the PISA mathematics framework is currently underway in preparation for the PISA 2012 assessment.

Scientific literacy… is defined as the ability to use scientific knowledge and processes not only to understand the natural world but to participate in decisions that affect it. The PISA science assessment is designed in relation to:

Scientific knowledge or concepts: These constitute the links that aid understanding of related phenomena. In PISA, while the concepts are the familiar ones relating to physics, chemistry, biological sciences and earth and space sciences, they are applied to the content of the items and not just recalled.

Scientific processes: These are centred on the ability to acquire, interpret and act upon evidence. Three such processes present in PISA relate to: 1) describing, explaining and predicting scientific phenomena, 2) understanding scientific investigation, and 3) interpreting scientific evidence and conclusions.

Situations or contexts: These concern the application of scientific knowledge and the use of scientific processes applied. The framework identifies three main areas: science in life and health, science in Earth and environment, and science in technology.

Defining high achievers in PISA

PISA performance scales are designed so that the average student score in OECD countries is 500 or thereabouts. Student performance is divided into 6 proficiency levels (only 5 for reading in PISA 2006), defined in terms of the competences demonstrated by students achieving that level.


In PISA 2006 reading, the highest proficiency level, level 5, was achieved by 8.6% of OECD students, with a lower score limit of 625.6. In PISA 2009 a level 6 was introduced (lower score limit of 698.3), which was achieved by 0.8% of OECD students. Levels 5 and 6 combined (lower score limit of 625.6) were achieved by 7.6% of OECD students. This analysis therefore assumes that levels 5 and 6 together in 2009 can be compared with level 5 in 2006.

We can conclude that overall higher-level performance in reading across OECD countries fell by 1.0 percentage point between 2006 and 2009. This may well be attributable to changes in the level of demand in the assessment framework rather than an overall dip in performance.

According to the PISA 2009 Assessment Framework (or the PISA Results book Volume I in the case of reading) tasks at level 6:

‘typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.’

And tasks at level 5:

‘that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information, inferring which information in the text is relevant. Reflective tasks require critical evaluation or hypothesis, drawing on specialised knowledge. Both interpretative and reflective tasks require a full and detailed understanding of a text whose content or form is unfamiliar. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations.’


In PISA 2006, science level 6 was achieved by 1.3% of OECD students and required a lower score limit of 707.9. Level 5 and above was achieved by 9.0%, with a lower score limit of 633.3.

In 2009, these figures were: level 6 achieved by 1.1% of OECD students with a lower score limit of 707.9; level 5 and above achieved by 8.5% of OECD students with a lower score limit of 633.3.

The science framework does not seem to have changed significantly between the two assessments, so we can provisionally identify a small overall dip in higher level performance between 2006 and 2009.

Level 6 students can:

‘consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.’

At Level 5, students can:

‘identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.’


In PISA 2006 mathematics, level 6 was achieved by 3.3% of OECD students with a lower score limit of 669.3 and level 5 and above by 13.3% of OECD students with a lower score of 607.

In PISA 2009, level 6 was achieved by 3.1% of OECD students with a lower score limit of 669.3 and level 5 and above by 12.7% of OECD students with a lower score of 607.

As with science, the framework does not appear significantly changed and so we can provisionally identify a small drop overall in the proportion of OECD students achieving these higher levels.
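Taken together, the OECD-average figures quoted above give the following percentage-point changes between 2006 and 2009 for levels 5 and above; this is a quick sketch using the numbers as transcribed in this post:

```python
# OECD-average percentages achieving level 5 and above, as quoted above.
# (For reading, 2006 is level 5 only; 2009 is levels 5 and 6 combined.)
oecd_high = {
    "reading": {"2006": 8.6, "2009": 7.6},
    "science": {"2006": 9.0, "2009": 8.5},
    "maths":   {"2006": 13.3, "2009": 12.7},
}

for subject, pct in oecd_high.items():
    change = round(pct["2009"] - pct["2006"], 1)
    print(f"{subject}: {change:+.1f} percentage points")
```

The small negative change in each subject is the overall dip in high-achiever performance discussed above.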

The PISA 2009 rubric says:

‘At Level 6 students can conceptualise, generalise, and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.

‘At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare, and evaluate appropriate problem solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations, and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.’

Comparing PISA 2006 and 2009 results by country for all participants

Table 1 below compares average scores by country in PISA 2006 and PISA 2009. These are essentially the headline figures which attract most media attention and they are included here primarily for the purposes of comparison.

Country    Reading 2009    Reading 2006    Maths 2009      Maths 2006      Science 2009    Science 2006
           score   rank    score   rank    score   rank    score   rank    score   rank    score   rank
Aus        515     9       513     7       514     15      520     13      527     10      527     8
Can        524     6       527     4       527     10      527     7       529     8       534     3
Fin        536     3       547     2       541     6       548     2       554     2       563     1
HK         533     4       536     3       555     3       547     3       549     3       542     2
Ire        496     21      517     6       487     32      501     22      508     20      508     20
Korea      539     2       556     1       546     4       547     4       538     6       522     11
NZ         521     7       521     5       519     13      522     11      532     7       530     7
Shang      556     1       N/A     N/A     600     1       N/A     N/A     575     1       N/A     N/A
Sing       526     5       N/A     N/A     562     2       N/A     N/A     542     4       N/A     N/A
Taiwan     495     23      496     16      543     5       549     1       520     12      532     4
UK         494     25      495     17      492     27      495     24      514     16      515     14
US         500     17      N/A     N/A     487     31      474     35      502     23      489     29
Average    493             495             496             497             501             498

However, it is worth drawing attention to some key points arising from the table:

  • As indicated above, there have been small falls in overall OECD performance in reading and maths between 2006 and 2009, and a small increase in science. The change in reading in particular may be attributable more to a tightening of the assessment framework than to a genuine decline in performance
  • In reading, the average score has increased slightly in Australia, remained unchanged in New Zealand, and fallen slightly in Canada, Hong Kong, Taiwan and the UK. Given the relatively tougher assessment framework and the associated overall dip in cross-OECD performance, these countries have arguably done well to maintain their scores
  • However, there have been more significant falls in reading performance in Finland, Ireland and Korea – all three strong performers in PISA 2006. Only Ireland has experienced a significant drop in ranking as a consequence, but these results should be a matter of concern in all three countries, perhaps suggesting they need to focus more on aspects of reading newly introduced into the 2009 assessment framework
  • In maths, the average score has increased significantly in Hong Kong and the US, remained largely unchanged in Canada, New Zealand and the UK, and fallen significantly in Australia, Finland, Ireland and Taiwan. Only Ireland has experienced a significant drop in its ranking
  • Nevertheless, Australia, Finland and Taiwan should be concerned about the dip in their maths performance of 6-7 points in each case. This cannot be attributed to other countries leapfrogging them in the table
  • In science, Hong Kong, Korea and the US have all made significant improvements since 2006, while performance is largely unchanged in Australia, Ireland, New Zealand and the UK and has declined significantly in Canada, Finland and Taiwan. The latter three should be concerned
  • In all three areas, loss of rank combined with fairly static performance is attributable to other countries improving at a faster rate and is a matter of relative competition. It is not possible to depress the performance of a competitor, so these countries must concentrate on improving their own performance. That said, they should take some comfort from their capacity to sustain their 2006 performance when their competitors are clearly not doing so
  • On the basis of this evidence, the countries with the biggest overall headaches are Canada, Finland and especially Taiwan, all three lauded to some degree as PISA world leaders.
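The score changes discussed above can be recomputed directly from Table 1. A sketch for three of the countries (the figures are transcribed from the table; a negative delta is a fall from 2006 to 2009):

```python
# Average reading/maths/science scores for selected countries, from Table 1.
scores = {
    #           reading      maths        science
    #           2009  2006   2009  2006   2009  2006
    "Finland": (536,  547,   541,  548,   554,  563),
    "Ireland": (496,  517,   487,  501,   508,  508),
    "UK":      (494,  495,   492,  495,   514,  515),
}

for country, (r09, r06, m09, m06, s09, s06) in scores.items():
    deltas = {"reading": r09 - r06, "maths": m09 - m06, "science": s09 - s06}
    print(country, deltas)
```

Finland's fall of 9-11 points in every domain and Ireland's 21-point fall in reading stand out, while the UK's scores are essentially flat.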

Comparing Percentages of High Achievers in PISA 2009 and PISA 2006

Table 2 compares the percentages of students in each of our 12 countries who achieved the higher levels in reading, maths and science in 2006 and 2009 respectively.

Country    Reading 2009      Reading 2006   Maths 2009        Maths 2006        Science 2009      Science 2006
           L6      L5+6      L5             L6      L5+6      L6      L5+6      L6      L5+6      L6      L5+6
Aus        2.1     12.8      10.6           4.5     16.4      4.3     16.4      3.1     14.6      2.8     14.6
Can        1.8     12.8      14.5           4.4     18.3      4.4     18        1.6     12.1      2.4     14.4
Fin        1.6     14.5      16.7           4.9     21.6      6.3     24.4      3.3     18.7      3.9     20.9
HK         1.2     12.4      12.8           10.8    30.7      9       27.7      2       16.2      2.1     15.9
Ire        0.7     7         11.7           0.9     6.7       1.6     10.2      1.2     8.7       1.1     9.4
Korea      1       12.9      21.7           7.8     25.5      9.1     27.1      1.1     11.6      1.1     10.3
NZ         2.9     15.8      15.9           5.3     18.9      5.7     18.9      3.6     17.6      4       17.6
Shang      2.4     19.4      N/A            26.6    50.7      N/A     N/A       3.9     24.3      N/A     N/A
Sing       2.6     15.7      N/A            15.6    35.6      N/A     N/A       4.6     19.9      N/A     N/A
Taiwan     0.4     5.2       4.7            11.3    28.5      11.8    31.9      0.8     8.8       1.7     14.6
UK         1       8         9              1.8     9.9       2.5     11.2      1.9     11.4      2.9     13.7
US         1.5     9.9       N/A            1.9     9.9       1.3     7.7       1.3     9.2       1.5     9.1
Average    1       7         8.6            3.1     12.7      3.3     13.4      1.1     8.5       1.3     8.8


  • In reading, the 2006 leaders amongst our subset of countries were Korea, Finland and New Zealand respectively whereas, in 2009, the leaders were Shanghai, New Zealand and Singapore (Shanghai and Singapore did not take part in the 2006 assessment).
  • All except Taiwan and Ireland exceeded the OECD average, although the percentage of the highest level 6 achievers in 2009 was lower than the OECD average in Taiwan and Ireland and equivalent to it in the UK and Korea. These four countries arguably need to concentrate more on the very top of their achievement range.
  • The percentage achieving levels 5/6 increased over the three-year period in Australia and Taiwan, remained largely unchanged in Hong Kong and New Zealand, fell slightly in the UK and fell substantially in Canada, Finland, Ireland and Korea. The decline in Korea is particularly startling.


  • In maths, the 2006 leaders in our subset were Taiwan, Korea and Hong Kong respectively at level 6, and Taiwan, Hong Kong and Korea respectively at levels 5/6. In 2009, the leaders are Shanghai, Singapore and Taiwan respectively at level 6 and Shanghai, Singapore and Hong Kong respectively at levels 5/6.
  • In 2006, the UK, US and Ireland were below the OECD average for level 6 performance and the remaining seven participating countries were above it. This continued to be the case in 2009 though, whereas the US was moving in the right direction, level 6 performance declined in the UK and Ireland, identifying this as an aspect potentially requiring attention in both countries.
  • In 2006, the same three countries were below the OECD average for level 5/6 performance and this continued to be the case in 2009. As with level 6, the US has improved its performance, drawing level with the UK, but the UK’s performance has declined somewhat and Ireland’s has declined significantly. This suggests that higher achievers also need more attention in both countries.
  • Between 2006 and 2009, other countries improving their performance included Australia and Hong Kong (level 6) and Canada and Hong Kong (levels 5 and 6), though only Hong Kong managed significant improvement. Performance was relatively unchanged in Canada (level 6) and New Zealand (levels 5 and 6). There was a decline in Finland, Korea, New Zealand and Taiwan at level 6, most noticeably in Finland and Korea, and in Finland, Korea and Taiwan at levels 5 and 6 together.
  • If we compare rates of change for level 6 and levels 5/6 respectively, we see that countries doing relatively better with their highest achievers (level 6) include Australia, New Zealand, Taiwan, the UK and the US, while countries doing relatively better with their higher achievers (levels 5 and 6) include Canada, Finland and Korea.


  • In science, the 2006 leaders in terms of level 6 performance were New Zealand, Finland and the UK. Finland, New Zealand and Hong Kong led the field for level 5 and 6 performance. In 2009, Singapore, Shanghai and New Zealand respectively were the leaders in level 6 performance and Shanghai, Singapore and Finland respectively for levels 5 and 6 together.
  • In 2006, Ireland and Korea were below the OECD average for level 6 performance but all countries were above the average for levels 5 and 6 combined. In 2009, Taiwan had fallen below the OECD average for level 6, Korea matched it and Ireland had exceeded it; all countries were still above the OECD average for levels 5 and 6 together. This suggests that Ireland and Korea deserve credit for the progress made with their highest achievers in science.
  • Australia was the only other country to improve its level 6 performance in science during this period, while Hong Kong, Korea and the US (very slightly) improved their performance for levels 5 and 6 together.
  • There were declines in performance at level 6 for Canada, Finland, Hong Kong, New Zealand, Taiwan, the UK and the US, while Korea flatlined. The worst declines were in Canada, Taiwan and the UK.
  • In terms of levels 5 and 6 combined, improvement was made in the period by Hong Kong, Korea and the US (very slightly in the case of the US). There were declines in performance in Canada, Finland, Ireland, Taiwan and the UK, the fall in Taiwan being particularly marked.
  • Examining the rate of change for level 6 compared with levels 5 and 6, it is hard to detect a clear pattern, but Australia and Ireland seem to be doing relatively better with level 6 while, conversely, Canada, Korea, the UK and the US seem to be doing relatively worse.
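One way to make these "rate of change" comparisons more systematic is to compute the 2006-to-2009 change at each level directly from Table 2. A sketch for three countries in science (figures transcribed from the table):

```python
# Percentages achieving level 6 and levels 5+6 in science, from Table 2.
# Tuple order: (L6 2009, L5+6 2009, L6 2006, L5+6 2006).
science = {
    "Finland": (3.3, 18.7, 3.9, 20.9),
    "Korea":   (1.1, 11.6, 1.1, 10.3),
    "UK":      (1.9, 11.4, 2.9, 13.7),
}

for country, (l6_09, l56_09, l6_06, l56_06) in science.items():
    # Positive = improvement, negative = decline, in percentage points.
    print(country,
          f"level 6: {l6_09 - l6_06:+.1f}",
          f"levels 5+6: {l56_09 - l56_06:+.1f}")
```

Korea's flat level 6 alongside its improved levels 5+6, and the UK's decline at both levels, match the pattern described in the bullets above.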

As we have noted above, the percentage of students achieving the higher levels across OECD countries fell slightly in all three areas between 2006 and 2009. Insofar as this is not attributable to changes to the assessment frameworks, we might reasonably note that the OECD’s effort in producing PISA has not of itself resulted in improved performance for high achievers across OECD countries over this three-year period.

Comparing ranks for high achievers versus all achievers, 2006 and 2009

The third and final table compares rank positions for high achievers and all achievers in 2006 and 2009 respectively.

This comparison could also be undertaken on the basis of percentages achieving the different levels and/or the average scores achieved, but the rankings are more readily available and are a reasonable guide to changes in the relative performance of countries, if not absolute changes.

Country    Reading 2009         Reading 2006   Maths 2009           Maths 2006           Science 2009         Science 2006
           L6    L5+6   All     L5     All     L6    L5+6   All     L6    L5+6   All     L6    L5+6   All     L6    L5+6   All
Aust       4     7      9       9      7       13    16     15      14    14     13      5     7      10      4     5      8
Can        6     7      6       4      4       14    12     10      13    12     7       10    9      8       6     7      3
Fin        7     4      3       2      2       11    7      6       6     4      2       4     3      2       2     1      1
HK         10    9      4       5      3       4     3      3       3     2      3       7     6      3       9     4      2
Ire        17    22     21      6      6       40    36     32      32    28     22      15    19     20      18    19     20
Korea      12    6      2       1      1       5     5      4       2     3      4       18    18     10      18    16     11
NZ         1     2      7       3      5       9     11     13      9     8      11      3     4      7       1     2      7
Shang      3     1      1       N/A    N/A     1     1      1       N/A   N/A    N/A     2     1      1       N/A   N/A    N/A
Sing       2     3      5       N/A    N/A     2     2      2       N/A   N/A    N/A     1     2      4       N/A   N/A    N/A
Taiw       27    29     23      29     16      3     4      5       1     1      1       23    18     12      12    5      4
UK         12    18     25      9      27      30    30     27      24    23     22      8     11     16      3     8      14
US         8     11     17      N/A    N/A     29    30     31      33    29     35      14    17     23      14    9      29

  • For reading, Taiwan and the UK were relatively unusual in 2006 because of the dissonance between their rankings – the UK because it did so much better for its higher achievers; Taiwan because it did so much worse. By 2009, Hong Kong and Korea were beginning to follow the same trend as Taiwan, while Australia, New Zealand and the US had joined the UK. These latter four countries might therefore be expected to concentrate disproportionately on their lower achievers in future.
  • For maths, there are some clear disparities between relative national ranks in 2006. In the case of Canada and Ireland, the rank is significantly lower for level 5 and above than it is for all students. By 2009, however, these gaps have almost invariably narrowed, perhaps suggesting a degree of responsiveness to PISA 2006, although in some cases it appears that slippage down the overall rankings has influenced matters. Certainly there is no real evidence here that high achievers in maths are particularly neglected compared with their peers, or vice versa.
  • For science, it is clear that in 2006, Australia, New Zealand, the UK and the US were stronger performers, relatively speaking, with their higher achievers – and this is particularly pronounced in the last three of these countries. By 2009, these distinctions continue but are becoming relatively less clear-cut, following a similar pattern to maths.
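The "dissonance" described above can be expressed as a simple rank gap: the high-achiever rank minus the all-students rank, where a negative gap means a country ranks better (a lower number) for its high achievers than overall. A sketch using three of the 2006 reading figures from Table 3:

```python
# 2006 reading rankings from Table 3: (level 5 rank, all-students rank).
reading_2006 = {
    "UK":     (9, 27),
    "Taiwan": (29, 16),
    "NZ":     (3, 5),
}

for country, (high_rank, all_rank) in reading_2006.items():
    gap = high_rank - all_rank
    # Negative: relatively stronger with high achievers; positive: weaker.
    print(f"{country}: rank gap {gap:+d}")
```

The UK's large negative gap and Taiwan's large positive gap are the two outliers noted in the reading bullet above.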


The analysis above provides some detailed pointers for future support for high achievers, but what overall assessment can we offer for each of our English-speaking countries?


Australia

In PISA 2006, Australia achieved high overall rankings in reading (7) and science (8) and a relatively good ranking in maths (13). It fell two places in the rankings in all three areas in PISA 2009, although its average score increased slightly in reading, was unchanged in science and fell significantly in maths.

In 2006, its ranking for high achievers (levels 5 and 6) was slightly higher than its overall ranking in science, but not in reading or maths. By 2009, this remained true of science and had become true of reading as well.

The percentage of higher achievers (levels 5 and 6) in reading has increased significantly between 2006 and 2009, but the equivalent percentages in maths and science remain largely unchanged, except for small improvements for the highest achievers (level 6).

Moving forward, the priorities for Australia are likely to be improvement in maths across the board and probably for relatively low achievers in reading and science.


Canada

PISA 2006 showed Canada achieving very highly overall in reading (4) and science (3) and highly in maths (7). In 2009 it fell two places in reading (6), three places in maths (10) and five places in science (8), although average scores remained unchanged in maths and fell somewhat in science and reading.

Its 2006 rankings for high achievers were significantly lower than its overall ranking in maths and science but identical in reading. In 2009, there was still little difference in the relative rankings for science and reading and now little difference in maths either, although the change in maths is attributable to a fall in overall ranking rather than an improvement for high achievers.

The percentage of higher achievers has declined in reading and in science between 2006 and 2009 but has increased slightly in maths.

Canada has a relatively ‘balanced scorecard’ and will likely continue to focus on improving its results in all three areas and all achievement levels, though maths may be a relatively higher priority.


Ireland

Ireland’s overall rankings from PISA 2006 were high for reading (6) and mid-table for maths (22) and science (20). In PISA 2009 its ranking for science remained unchanged (20) but its rankings fell very significantly in maths (32) and especially reading (21). Average scores also fell significantly in maths and reading and were unchanged in science.

The 2006 rankings for higher achievers showed very little difference to overall rankings in science and reading, but somewhat lower relative rankings for high achievers in maths. The position is similar in 2009, and there is marked slippage down the rankings in reading – and to a lesser extent maths – for higher achievers as well as for all achievers.

The percentage of higher achievers has fallen significantly in maths and reading and slightly in science.

For the future, Ireland will need to reverse its downward trend in maths and reading while not neglecting improvements in science. It needs to focus on all levels of achievement, including its higher achievers.

New Zealand

In PISA 2006, New Zealand achieved a very high overall ranking in reading (5), a high ranking in science (7) and a relatively high ranking in maths (11). In PISA 2009, it slipped 2 places in reading and maths but retained its position in science. Average scores were unchanged in reading, fell slightly in maths and increased slightly in science.

Rankings for higher achievers in 2006 were significantly higher than overall rankings in science and slightly higher in reading and maths. By 2009 the difference between science rankings had closed somewhat, but this is attributable to slippage in the higher achieving rankings. In maths the position is broadly unchanged, but in reading the relatively higher ranking of the higher achievers is now more pronounced.

In terms of the percentages achieving higher levels, there has been relatively little change in reading, maths or science.

New Zealand is another country with a relatively ‘balanced scorecard’, but its higher achievers seem to be doing relatively well and it may wish to concentrate more on the lower end of the achievement spectrum.


UK

The UK achieved good to mid-table rankings in PISA 2006 for science (14), reading (17) and maths (24). In PISA 2009 it fell slightly in science (16) and maths (27) and significantly in reading (25). Average scores fell slightly in all three areas.

In 2006, rankings for higher achievers were significantly higher than overall rankings in science and reading, but very similar in maths. This continues to be the case in 2009 with the decline shared across achievement levels.

The percentage achieving higher levels has fallen significantly between 2006 and 2009 in science and maths, and fallen slightly in reading.

The UK has to improve in all three areas, but particularly maths and reading. High achievers must be a priority in maths especially, but effort is required across all levels of achievement to ensure that lower achievers do not improve at the expense of their higher-achieving peers.


US

The PISA 2006 overall rankings for the US were low to mid-table in science (29) and maths (35). No result was declared for reading because of problems with the administration of the assessment. The PISA 2009 outcomes show that the US has improved its ranking by six places in science (23) and four places in maths (31) while it achieved a ranking of 17 in reading. Average scores increased significantly in both maths and science.

2006 rankings for higher achievers were much higher than the overall ranking in science and slightly higher in maths. By 2009, the gap had narrowed in science and maths. In reading higher achievers are ranked significantly higher than the overall ranking.

The percentage achieving higher levels is little changed in science between 2006 and 2009 but there is a significant improvement in maths.

The US is moving in broadly the right direction but has to continue to improve in all three areas, especially maths. This evidence suggests that the focus should be predominantly on lower achievers – except in maths where there is a problem across the board – but, as with the UK, care is needed to ensure that higher achievers are not neglected as a result.

The UK and the US are therefore in very similar positions, but whereas the UK needs to arrest a downward trajectory, the US is already moving in the right direction.

There is an agenda for improvement in all these countries, should they choose – as the UK has done – to align their priorities firmly with those assessed by PISA and other international comparison studies.

And this analysis has also shown that there is clear room for improvement in the performance of other world leaders, such as Finland, Hong Kong and Korea: we should take with a big pinch of salt the news headlines that say we need only emulate them to be successful.


December 2010